Abstract: In this paper, a data miner based on learning automata, called LA-miner, is proposed. The LA-miner extracts classification rules from data sets automatically. The proposed algorithm builds on function optimization using learning automata. Experimental results on three benchmarks indicate that the performance of the proposed LA-miner is comparable with (and sometimes better than) that of Ant-miner (a data mining algorithm based on ant colony optimization) and CN2 (a well-known classification rule induction algorithm).
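A variable-structure learning automaton of the kind LA-miner builds on updates its action probabilities from environment feedback. The sketch below shows a standard linear reward-penalty (L_R-P) update; the learning rates and the two-action setup are illustrative assumptions, not parameters taken from the paper.

```python
def update(probs, chosen, reward, a=0.1, b=0.1):
    """One L_R-P step: reinforce the chosen action on reward,
    penalize it (and redistribute mass) otherwise.
    probs is a list of action probabilities summing to 1."""
    new = list(probs)
    r = len(probs)
    if reward:
        for j in range(r):
            if j == chosen:
                new[j] = probs[j] + a * (1 - probs[j])
            else:
                new[j] = (1 - a) * probs[j]
    else:
        for j in range(r):
            if j == chosen:
                new[j] = (1 - b) * probs[j]
            else:
                new[j] = b / (r - 1) + (1 - b) * probs[j]
    return new
```

In either branch the probabilities still sum to one, so the automaton remains a valid probability vector after every interaction with the environment.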
Abstract: Statistical selection procedures are used to select the
best simulated system from a finite set of alternatives. In this paper,
we present a procedure that can be used to select the best system
when the number of alternatives is large. The proposed procedure combines Ranking and Selection with Ordinal Optimization. To improve the performance of Ordinal Optimization, the Optimal Computing Budget Allocation (OCBA) technique is used to determine the best simulation lengths for all simulated systems and to reduce the total computation time. We also examine the effect of increasing the number of simulation samples on the combined procedure. The numerical results clearly show this effect on the proposed combined selection procedure.
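The OCBA rule the abstract refers to splits a simulation budget so that close, noisy competitors of the current best get more replications. The sketch below implements the standard asymptotic allocation ratios (Chen et al.); the larger-mean-is-better convention and the toy inputs are illustrative assumptions, and tied means are assumed away.

```python
import math

def ocba_allocation(means, stds, budget):
    """Approximate OCBA budget split: a non-best design i gets
    replications proportional to (std_i / delta_i)^2, where delta_i
    is its mean gap to the observed best b; the best design gets
    std_b * sqrt(sum_i (N_i / std_i)^2). Assumes distinct means."""
    b = max(range(len(means)), key=lambda i: means[i])
    ratio = [0.0] * len(means)
    for i, (m, s) in enumerate(zip(means, stds)):
        if i != b:
            delta = means[b] - m
            ratio[i] = (s / delta) ** 2
    ratio[b] = stds[b] * math.sqrt(sum((r / stds[i]) ** 2
                                       for i, r in enumerate(ratio) if i != b))
    total = sum(ratio)
    return [budget * r / total for r in ratio]
```

Designs whose sample means sit closest to the best receive the largest share, which is what cuts total computation time relative to equal allocation.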
Abstract: A method is presented for the construction of arbitrary
even-input sorting networks exhibiting better properties than the
networks created using a conventional technique of the same type.
The method was discovered by means of a genetic algorithm combined with an application-specific development process. As with human inventions in theoretical computer science, the evolved invention was analyzed: its generality was proven, and its area and time complexities were determined.
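Candidate sorting networks produced by such evolutionary searches are usually validated with the zero-one principle: a comparator network on n wires sorts every input if and only if it sorts all 2^n binary sequences. A minimal checker, shown here on the classic 5-comparator network for 4 inputs (an illustrative example, not one of the evolved networks):

```python
from itertools import product

def sorts_all(network, n):
    """Zero-one principle check: an n-input comparator network sorts
    every input iff it sorts all 2**n sequences of 0s and 1s."""
    for bits in product((0, 1), repeat=n):
        v = list(bits)
        for i, j in network:      # comparator: min to wire i, max to wire j
            if v[i] > v[j]:
                v[i], v[j] = v[j], v[i]
        if v != sorted(bits):
            return False
    return True

# classic 5-comparator network for 4 inputs
net4 = [(0, 1), (2, 3), (0, 2), (1, 3), (1, 2)]
```

The exponential check is feasible for the small and medium widths typically explored by genetic algorithms.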
Abstract: Enterprise Wide Information Systems (EWIS)
implementation involves the entire business and will require changes
throughout the firm. Because of the scope, complexity, and continuous nature of ERP, the project-based approach to managing the implementation process has resulted in failure rates of between 60% and 80%. In recent years ERP systems have received much attention.
The organizational relevance and risk of ERP projects make it
important for organizations to focus on ways to make ERP
implementation successful. Once these systems are in place,
however, their performance depends on the identified macro
variables viz. 'Business Process', 'Decision Making' and 'Individual
/ Group working'. A questionnaire was designed and administered, and the responses from 92 organizations were compiled. The relationship of these variables with EWIS performance is analyzed using inferential statistical measures. The study helps in understanding the performance of the model presented, and it suggests ways of avoiding implementation failures, thereby providing the necessary competitive edge. Whenever a discrepancy is identified during performance appraisal, care has to be taken to draft the necessary preventive measures. If all these measures are taken care of, EWIS performance will deliver the expected results.
Abstract: A study of an electromagnetic flow meter is presented in this paper. A comparison has been made between analytical results and numerical results obtained by FEM analysis (QuickField 5.6) for determining the polarization voltage through the circular cross section of the polarization transducer. Excitation and geometric parameters that increase its effectiveness have been examined, the aim being to obtain a maximal output signal. The investigations include different variants of the magnetic flux density distribution around the tube: a homogeneous field of magnitude Bm, a linear distribution with maximal value Bm, and a trapezium distribution conserving the same excitation magnetic energy as the homogeneous field.
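As background for the polarization-voltage computation above: for an idealized electromagnetic flowmeter with a homogeneous transverse field of magnitude B_m, tube inner diameter D, and mean axial velocity v-bar, Faraday's law gives the electrode voltage in the commonly used textbook form (stated here for orientation; this is not the paper's FEM model):

```latex
U \approx k \, B_m \, D \, \bar{v},
```

where k is a dimensionless sensitivity factor fixed by the actual field distribution (k = 1 in the ideal homogeneous case); the homogeneous, linear, and trapezium excitation variants above correspond to different effective values of k.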
Abstract: The adhesion strength of the exterior or interior coating of steel pipes is very important. Increasing the adhesion of the coating to the surface can increase the coating lifetime and the safety factor of the transmission pipeline while decreasing the corrosion rate and costs. Steel pipe surfaces are prepared mechanically before coating by shot and grit blasting.
The effective parameters of that process include the abrasive particle size, distance to the surface, abrasive flow rate, abrasive physical properties, particle shape, abrasive selection, the machine type and its power, the standard of surface cleanness, roughness, blasting time, and ambient humidity. This research aimed to find conditions that improve surface preparation as well as the adhesion strength and corrosion resistance of the coating. Accordingly, this paper studies the effect of varying the abrasive flow rate, the abrasive particle size, and the blasting time on steel surface roughness, including over-blasting, using a centrifugal blasting machine. A number of steel samples were prepared (according to API 5L X52) and coated with epoxy powder, and the adhesion strength of the coatings was compared using the Pull-Off test. The results show that increasing the abrasive particle size and flow rate can increase the steel surface roughness and coating adhesion strength, but increasing the blasting time can over-blast the surface, raising its temperature and hardness and, in turn, decreasing the steel surface roughness and coating adhesion strength.
Abstract: This paper presents a VLSI design approach for high-speed, real-time 2-D Discrete Wavelet Transform computation. The proposed architecture, based on a new and fast convolution approach, reduces the hardware complexity in addition to reducing the critical path to the multiplier delay. Furthermore, an advanced two-dimensional (2-D) discrete wavelet transform (DWT) implementation, with an efficient memory area, is designed to produce one output in every clock cycle; as a result, very high speed is attained. The system is verified, using JPEG2000 filter coefficients, on a Xilinx Virtex-II Field Programmable Gate Array (FPGA) device without accessing any external memory. The resulting computing rate is up to 270 Msamples/s, and the (9,7) 2-D wavelet filter uses only 18 kb of memory (16 kb of first-in-first-out memory) for a 256×256 image. In this way, the developed design requires reduced memory and provides very high-speed processing as well as high PSNR quality.
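The paper's architecture evaluates the JPEG2000 (9,7) filters; as a much simpler stand-in, a one-level separable 2-D Haar transform below illustrates the rows-then-columns structure that such one-output-per-cycle designs pipeline. The Haar filter choice and the averaging normalization are illustrative assumptions, not the implemented filter bank.

```python
def haar_1d(row):
    """One Haar level: pairwise averages (low-pass half) followed
    by pairwise differences (high-pass half)."""
    avg = [(row[i] + row[i + 1]) / 2 for i in range(0, len(row), 2)]
    dif = [(row[i] - row[i + 1]) / 2 for i in range(0, len(row), 2)]
    return avg + dif

def haar_2d(img):
    """Separable 2-D DWT level: transform every row, then every column."""
    rows = [haar_1d(r) for r in img]
    cols = [haar_1d(list(c)) for c in zip(*rows)]
    return [list(r) for r in zip(*cols)]
```

On a constant image all energy lands in the top-left LL quadrant and every detail coefficient is zero, a quick sanity check for any separable DWT implementation.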
Abstract: The extract of milk thistle contains a mix of flavonolignans termed silymarin. To analyze the influence of growth regulators, genotype, explant, and subculture on the accumulation of flavonolignans, a study was carried out using two genotypes (the Budakalszi and Noor Abad Moghan cultivars), cotyledon and hypocotyl explants, and solid MS media supplemented with different combinations of two growth regulators: kinetin (0.1, 1 mg/l) and 2,4-D (1, 2 mg/l). Seeds of the plant were germinated in MS media without growth regulators in a growth chamber at 26°C in darkness. For callus induction, the culture media were supplemented with different concentrations of 2,4-D and kinetin. Calli obtained from the explants were sub-cultured four times into fresh media of the first experiment, and flavonoids were extracted from the calli in the four subcultures. The flavonoid components were determined by high-performance liquid chromatography (HPLC) and separated into taxifolin, silydianin+silychristin, silybin A+B, and isosilybin A+B. Results showed that silybin A+B accumulation increased with callus age, while isosilybin A+B content decreased. The highest accumulation of taxifolin was observed in the first calli. Calli produced from cotyledon explants of the Budakalszi cultivar were superior for silybin A+B, whereas calli from hypocotyl explants produced higher amounts of taxifolin and silydianin+silychristin. The best cultivar for silymarin production in this study was Budakalszi. High amounts of silybin A+B and taxifolin were obtained from hypocotyl explants.
Abstract: This study deals with the phenomena of reflection and transmission (refraction) of quasi-shear-vertical (qSV) waves incident at a plane interface between two semi-infinite piezoelectric elastic media under the influence of initial stresses. The relations governing the reflection and transmission coefficients of the reflected and transmitted waves for various suitable boundary conditions are derived. We show analytically that the reflection and transmission coefficients of the (qP) and (qSV) waves depend upon the angle of incidence, the electric potential parameters, the material constants of the medium, as well as the initial stresses present in the media. Numerical calculations of the reflection and transmission amplitude ratios for different values of initial stress have been carried out by computer for different example materials, and the results are given in the form of graphs. Finally, some particular cases are considered.
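For orientation, the angles of the reflected and transmitted qP and qSV waves in such interface problems are tied to the angle of incidence by a generalized Snell's law (a standard relation stated as background, not the paper's derivation), with v denoting the relevant phase velocities in media 1 and 2:

```latex
\frac{\sin\theta_{0}}{v^{(1)}_{qSV}}
 = \frac{\sin\theta^{r}_{qP}}{v^{(1)}_{qP}}
 = \frac{\sin\theta^{r}_{qSV}}{v^{(1)}_{qSV}}
 = \frac{\sin\theta^{t}_{qP}}{v^{(2)}_{qP}}
 = \frac{\sin\theta^{t}_{qSV}}{v^{(2)}_{qSV}}.
```

The initial stresses and piezoelectric coupling enter through these phase velocities, which is one route by which they influence the amplitude ratios.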
Abstract: In this paper, we propose an enhanced equalization technique for multi-carrier code division multiple access (MC-CDMA). The method is based on controlling the Equal Gain Combining (EGC) technique: we introduce a new level changer to the EGC equalizer in order to adapt the equalization parameters to the channel coefficients. The optimal equalization level is first determined by channel training. The new approach drastically reduces the multiuser interference caused by interferers without increasing the noise power. To assess the performance of the proposed equalizer, theoretical analysis and numerical results are given.
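Plain EGC, the baseline the proposed level changer modifies, co-phases each subcarrier with unit gain before combining. The sketch below shows only that baseline; the channel gains and phases are toy values, and the paper's level-changer extension is not reproduced.

```python
import cmath

def egc_combine(received, channel):
    """Equal Gain Combining: rotate each subcarrier by the negative
    of its channel phase (unit gain per branch), then average."""
    total = 0j
    for r, h in zip(received, channel):
        total += r * cmath.exp(-1j * cmath.phase(h))
    return total / len(received)

# noiseless toy check: one BPSK symbol spread over 4 subcarriers
gains = [1.0, 0.4, 0.9, 0.6]
phases = [0.3, -1.0, 2.0, 0.7]
h = [g * cmath.exp(1j * p) for g, p in zip(gains, phases)]
s = -1.0                      # transmitted BPSK symbol
r = [s * hk for hk in h]      # faded subcarriers, no noise
est = egc_combine(r, h)
```

With no noise, `est.real` equals the symbol scaled by the mean channel gain, so a simple sign decision recovers the BPSK bit.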
Abstract: A method for solving linear and non-linear Goursat problems is given using the two-dimensional differential transform method. The approximate solution of such problems is calculated in
the form of a series with easily computable terms and also the exact
solutions can be achieved by the known forms of the series solutions.
The method can easily be applied to many linear and non-linear
problems and is capable of reducing the size of computational work.
Several examples are given to demonstrate the reliability and the
performance of the presented method.
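For reference, the two-dimensional differential transform used by the method maps a sufficiently smooth function w(x, y) to spectrum coefficients, and the double series inverse reconstructs it:

```latex
W(k,h) = \frac{1}{k!\,h!}
\left[\frac{\partial^{\,k+h} w(x,y)}{\partial x^{k}\,\partial y^{h}}\right]_{(0,0)},
\qquad
w(x,y) = \sum_{k=0}^{\infty}\sum_{h=0}^{\infty} W(k,h)\, x^{k} y^{h}.
```

These are the standard defining relations of the 2-D differential transform; truncating the double series gives the easily computable approximate solutions mentioned above, and recognizing a known series gives the exact solution.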
Abstract: Two paradigms have been proposed to provide QoS for Internet applications: Integrated Services (IntServ) and Differentiated Services (DiffServ). IntServ is not appropriate for a large network like the Internet because it is very complex. Therefore, to reduce the complexity of QoS management, DiffServ was introduced to provide QoS within a domain using flow aggregation and per-class service. In these networks the QoS between classes is constant, which allows low priority traffic to be affected by high priority traffic; this is not suitable. In this paper, we propose a fuzzy controller that reduces the effect of higher priority classes on lower priority ones. Our simulations show that our approach effectively reduces the latency dependency of the low priority class on the higher priority ones.
Abstract: Data mining is an extraordinarily demanding field concerned with the extraction of implicit knowledge and relationships that are not explicitly stored in databases. A wide variety of data mining methods have been introduced (classification, characterization, generalization, ...), and each of these methods includes more than one algorithm. A data mining system involves different user categories, which means that the user's behavior must be a component of the system. The problem at this level is to know which algorithm of which method to employ for an exploratory end, which one for a decisional end, and how they can collaborate and communicate. The agent paradigm presents a new way of conceiving and realizing data mining systems. The purpose is to combine different data mining algorithms to prepare elements for decision-makers, benefiting from the possibilities offered by multi-agent systems. In this paper an agent framework for data mining is introduced, and its overall architecture and functionality are presented. The validation is made on spatial data, and the principal results are presented.
Abstract: This paper presents a novel method for data hiding that uses neighborhood pixel information to calculate the number of bits that can be used for substitution, and a modified Least Significant Bit (LSB) technique for data embedding. The modified solution is independent of the nature of the data to be hidden and gives correct results along with unnoticeable image degradation. To find the number of bits that can be used for data hiding, the technique uses the green component of the image, as it is less sensitive to the human eye; thus it is very hard for the human eye to detect whether the image carries hidden data. For further security, the application also encrypts the data using a custom-designed algorithm before embedding the bits into the image. The overall process consists of three main modules, namely embedding, encryption, and extraction.
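As a minimal illustration of green-channel LSB substitution (fixed at one bit per pixel here; the paper's neighborhood-based variable capacity and its custom encryption step are not reproduced):

```python
def embed(pixels, bits):
    """Hide one payload bit in the green-channel LSB of each pixel.
    pixels: list of (r, g, b) tuples; bits: list of 0/1 values."""
    out = [(r, (g & ~1) | bit, b)
           for (r, g, b), bit in zip(pixels, bits)]
    out.extend(pixels[len(bits):])   # pixels beyond the payload stay untouched
    return out

def extract(pixels, n):
    """Read the first n payload bits back from green-channel LSBs."""
    return [g & 1 for _, g, _ in pixels[:n]]

pixels = [(10, 20, 30), (40, 55, 60), (70, 80, 90)]
payload = [1, 0]
stego = embed(pixels, payload)
```

Extraction reads the same LSBs back, and each green value changes by at most 1, which is what keeps the degradation visually unnoticeable.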
Abstract: Given the density of modern chips, designers are trying to put as many computational and storage facilities as possible on single chips. Along with the complexity of computational and storage circuits, designing, testing, and debugging become more and more complex and expensive. Hardware designs are therefore built using a very high speed hardware description language, which is more efficient and cost effective. This paper focuses on the implementation of a 32-bit ALU design in the Verilog hardware description language. The adder and subtractor operate correctly on both unsigned and positive numbers. In an ALU, addition takes most of the time if it uses a ripple-carry adder. The general strategy for designing fast adders is to reduce the time required to form carry signals; adders that use this principle are called carry look-ahead adders. The carry look-ahead adder is designed as a combination of 4-bit adders. The syntax of Verilog HDL is similar to the C programming language. This paper proposes a unified approach to ALU design in which both simulation and formal verification can co-exist.
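The carry look-ahead principle can be sketched in software: per-bit generate and propagate signals feed the recurrence c[i+1] = g[i] | (p[i] & c[i]), which the hardware evaluates in parallel across a 4-bit group. The model below evaluates the same recurrence serially just to check the logic; it is a Python reference model, not the paper's Verilog.

```python
def cla_add32(a, b):
    """32-bit add via the carry look-ahead recurrence:
    g[i] = a[i] & b[i] (generate), p[i] = a[i] ^ b[i] (propagate),
    sum[i] = p[i] ^ c[i], c[i+1] = g[i] | (p[i] & c[i])."""
    carry = 0
    result = 0
    for i in range(32):
        ai = (a >> i) & 1
        bi = (b >> i) & 1
        g = ai & bi                 # this bit creates a carry
        p = ai ^ bi                 # this bit passes a carry along
        result |= (p ^ carry) << i  # sum bit
        carry = g | (p & carry)     # look-ahead recurrence
    return result                   # carry out of bit 31 is discarded
```

Because the recurrence is exactly the carry chain of binary addition, the result always matches `(a + b) mod 2**32`.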
Abstract: A high-performance thin layer chromatography (HPTLC) system for the separation of vitamins B2 and B12 has been developed. The separation was achieved successfully using a solvent system of methanol, water, and ammonia (7:3:1, v/v) as the mobile phase on HPTLC plates impregnated with boric acid. The effect of other mobile phases on the separation of the vitamins was also examined. The method is based on the different behavior of the investigated compounds on TLC plates impregnated with different amounts of boric acid. The Rf values of vitamins B2 and B12 are compared on non-impregnated silica gel HPTLC plates and on plates impregnated with boric acid. The effect of boric acid, both in the mobile phase and on the HPTLC plates, on the Rf values of the vitamins has also been studied.
Abstract: In an open real-time system environment, the coexistence of different kinds of real-time and non-real-time applications confronts the system scheduling mechanism with new requirements and challenges. A two-level scheduling scheme for open real-time systems is reviewed, and we point out that when hard and soft real-time applications are scheduled indistinguishably as the same type of real-time application, Quality of Service (QoS) cannot be guaranteed. The scheme has two flaws. First, it cannot differentiate the scheduling priorities of hard and soft real-time applications; that is, it neglects the characteristic differences between them, so it does not suit more complex real-time environments. Second, the worst-case execution time of soft real-time applications cannot be predicted exactly, so it is not worthwhile to spend much effort ensuring that no soft real-time application misses its deadline, and doing so may waste resources. To solve this problem, a novel two-level real-time scheduling mechanism (including a scheduling profile and a scheduling algorithm) which adds a process for handling soft real-time applications is proposed. Finally, we verify the real-time scheduling mechanism both theoretically and experimentally. The results indicate that our scheduling mechanism achieves the following objectives: (1) it reflects the difference in priority when scheduling hard and soft real-time applications; (2) it ensures the schedulability of hard real-time applications, that is, their deadline miss rate is 0; (3) the overall deadline miss rate of soft real-time applications can be kept below 1; (4) although no deadline is set for non-real-time applications, the scheduling algorithm used by the server can avoid the "starvation" of jobs and increase QoS. In this way, our scheduling mechanism is more compatible with different types of applications and can be applied more widely.
Abstract: Model Predictive Control has previously been applied
to supply chain problems with promising results; however, hitherto
proposed systems possessed no information on future demand. A
forecasting methodology will surely promote the efficiency of
control actions by providing insight into the future. A complete supply
chain management framework that is based on Model Predictive
Control (MPC) and Time Series Forecasting will be presented in this
paper. The proposed framework will be tested on industrial data in
order to assess the efficiency of the method and the impact of
forecast accuracy on overall control performance of the supply chain.
To this end, forecasting methodologies with different characteristics
will be implemented on test data to generate forecasts that will serve
as input to the Model Predictive Control module.
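As a minimal stand-in for the framework's forecast-to-controller coupling, the sketch below feeds a simple exponential smoothing forecast into an order-up-to rule. A real MPC module would instead optimize orders over a receding horizon, and the smoothing constant and safety stock here are illustrative assumptions.

```python
def ses_forecast(history, alpha=0.3):
    """Simple exponential smoothing: the level tracks demand, and the
    one-step-ahead forecast is the current level."""
    level = history[0]
    for d in history[1:]:
        level = alpha * d + (1 - alpha) * level
    return level

def order_up_to(inventory, pipeline, forecast, safety=10.0):
    """Order enough to raise the inventory position (on-hand plus
    in-transit) to the forecast demand plus a safety stock."""
    target = forecast + safety
    return max(0.0, target - inventory - pipeline)
```

The forecast enters the control decision directly through the target level, which is the same interface role the Time Series Forecasting module plays for the MPC module in the proposed framework.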
Abstract: The appearance management behavior of tanning by gay men is examined through the lens of Impression Formation. The study proposes that body image, self-esteem, and internalized homophobia are connected and affect the motives for engaging in sun, salon, and cosmetic tanning. The motives examined were: to look masculine, to look attractive to (potential) partners, to look attractive in general, to socialize, to meet a peer standard, and for personal satisfaction. Using regression analysis to examine data from 103 gay men who engage in at least one method of tanning, results reveal that components of body image and internalized homophobia, but not self-esteem, are linked to various motives and methods of tanning. These findings support and extend the literature on Impression Formation Theory and provide practitioners in the health and health-related fields new avenues to pursue when dealing with diseases related to tanning.
Abstract: In designing river intakes and diversion structures, it is paramount that the sediments entering the intake are minimized or, if possible, completely separated. Due to high water velocity, sediments can significantly damage hydraulic structures, especially when mechanical equipment like pumps and turbines is used. This subsequently results in wasted water and electricity and further costs. Therefore, it is prudent to investigate and analyze the performance of lateral intakes affected by sediment control structures. Laboratory experiments, despite their vast potential and benefits, can face certain limitations and challenges, including limitations in equipment and facilities, space constraints, equipment errors such as lack of adequate precision or mal-operation, and, finally, human error. Research has shown that in order to achieve the ultimate goal of intake structure design, which is to design long-lasting and proficient structures, the best combination of sediment control structures (such as sills and submerged vanes) along with the parameters that increase their performance (such as diversion angle and location) should be determined. Cost, difficulty of execution, and environmental impacts should also be included in evaluating the optimal design. This solution can then be applied to similar problems in the future. Subsequently, the model used to arrive at the optimal design requires a high level of accuracy and precision in order to avoid improper design and execution of projects. The process of creating and executing the design should be as comprehensive and applicable as possible. Therefore, it is important that the influential parameters and vital criteria are fully understood and applied at all stages of choosing the optimal design. In this article, the parameters influencing optimal intake performance, the advantages and disadvantages, and the efficiency of a given design are studied.
Then, a multi-criterion decision matrix is utilized to choose the optimal model that can be used to determine the proper parameters in constructing the intake.
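A multi-criteria decision matrix of the kind mentioned above can be reduced to a weighted-sum ranking once each criterion is normalized. The criteria names and weights below are illustrative assumptions, not values from the study.

```python
def weighted_score(matrix, weights):
    """Weighted-sum multi-criteria ranking: each row is an alternative
    scored on criteria already normalized to [0, 1] (higher is better).
    Returns the index of the best alternative and all scores."""
    scores = [sum(w * v for w, v in zip(weights, row)) for row in matrix]
    return max(range(len(scores)), key=lambda i: scores[i]), scores

# rows: two hypothetical intake designs;
# cols: sediment exclusion, cost, environmental impact (all normalized)
m = [[0.9, 0.4, 0.6],
     [0.6, 0.8, 0.7]]
best, scores = weighted_score(m, [0.5, 0.3, 0.2])
```

Because cost and environmental impact are folded in through their weights, the same matrix expresses the trade-off the article describes between structural performance and execution constraints.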