Abstract: Multiprocessor task scheduling is an NP-hard problem, and the Genetic Algorithm (GA) has proved to be an excellent technique for finding an optimal solution. In the past, several GA-based methods have been proposed for this problem, but all of them consider a single criterion. In the present work, minimization of a bi-criteria multiprocessor task scheduling objective, the weighted sum of makespan and total completion time, is considered. The efficiency and effectiveness of a genetic algorithm depend on its parameters, such as the crossover operator, mutation operator, crossover probability, and selection function. The effects of the GA parameters on the bi-criteria fitness function, and the subsequent setting of those parameters, have been studied using the central composite design (CCD) approach of response surface methodology (RSM) from Design of Experiments. Experiments were performed at different levels of the GA parameters, and analysis of variance was carried out to identify the parameters significant for minimizing makespan and total completion time simultaneously.
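As a minimal sketch of the bi-criteria objective, the snippet below evaluates a schedule by the weighted sum of makespan and total completion time inside a toy GA loop; the task times, weights, encoding, and operators are illustrative assumptions, not the paper's actual setup.

```python
import random

def evaluate(schedule, times, m, w1=0.5, w2=0.5):
    """Weighted bi-criteria fitness: w1*makespan + w2*total completion time.
    schedule[i] is the processor (0..m-1) running task i; tasks on the same
    processor execute in index order.  Encoding and weights are assumptions."""
    finish = [0.0] * m                       # running finish time per processor
    total_completion = 0.0
    for task, proc in enumerate(schedule):
        finish[proc] += times[task]          # completion time of this task
        total_completion += finish[proc]
    return w1 * max(finish) + w2 * total_completion

def crossover(a, b):
    """One-point crossover of two parent schedules."""
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(s, m, rate=0.1):
    """Reassign each task to a random processor with probability 'rate'."""
    return [random.randrange(m) if random.random() < rate else p for p in s]

random.seed(0)
m, times = 3, [4, 2, 7, 3, 5, 1, 6]
pop = [[random.randrange(m) for _ in times] for _ in range(30)]
for _ in range(100):                         # truncation-selection GA loop
    pop.sort(key=lambda s: evaluate(s, times, m))
    elite = pop[:10]
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)), m)
                   for _ in range(20)]
print(evaluate(pop[0], times, m))            # fitness of best schedule found
```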
Abstract: Moisture is an important consideration in many aspects, ranging from irrigation, soil chemistry, golf courses, corrosion and erosion, road conditions, weather prediction, and livestock feed moisture levels to water seepage. Vegetation and crops always depend more on the moisture available at the root level than on precipitation occurrence. In this paper, the design of an instrument is discussed which reports the variation in the moisture content of soil. This is done by measuring the amount of water in the soil through the variation in its capacitance, using a capacitive sensor. The greatest advantage of the soil moisture sensor is reduced water consumption. The sensor can also be used to set lower and upper thresholds to maintain optimum soil moisture saturation and minimize wilting; it contributes to deeper plant root growth, reduced soil runoff/leaching, and less favorable conditions for insects and fungal diseases. The capacitance method is preferred because it provides the absolute amount of water content and can measure water content at any depth.
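A minimal sketch of how such a capacitance reading could be turned into a moisture value and a threshold decision; the linear two-point calibration and the pF values are hypothetical, not the instrument's actual characteristics.

```python
def moisture_from_capacitance(c_meas, c_dry, c_wet):
    """Map a capacitance reading to volumetric water content (%) by linear
    interpolation between two calibration points.  The linear mapping is an
    illustrative assumption; real probes are often calibrated against a
    dielectric mixing model instead."""
    frac = (c_meas - c_dry) / (c_wet - c_dry)
    return 100.0 * min(max(frac, 0.0), 1.0)      # clamp to 0..100 %

def irrigation_needed(vwc, lower=25.0, upper=40.0, currently_on=False):
    """Hysteresis control between the lower and upper moisture thresholds."""
    if vwc < lower:
        return True
    if vwc > upper:
        return False
    return currently_on

# Hypothetical calibration: 120 pF in oven-dry soil, 410 pF at saturation.
vwc = moisture_from_capacitance(265.0, 120.0, 410.0)   # -> 50.0 %
print(vwc, irrigation_needed(vwc))
```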
Abstract: The first generation of Mobile Agent based Intrusion Detection Systems had just two components, namely data collection and a single centralized analyzer. The disadvantage of this type of intrusion detection is that if the connection to the analyzer fails, the entire system becomes useless. In this work, we propose a novel hybrid model for a Mobile Agent based Distributed Intrusion Detection System to overcome this problem. The proposed model has new features such as robustness, the capability of detecting intrusions against the IDS itself, and the capability of updating itself to detect new patterns of intrusion. In addition, our proposed model is also capable of tackling some of the weaknesses of centralized Intrusion Detection System models.
Abstract: In this paper, the requirement for coke quality prediction, its role in blast furnaces, and the model output are explained. Applying an Artificial Neural Network (ANN) trained with the back-propagation (BP) algorithm, a prediction model has been developed to predict CSR (coke strength after reaction). Important blast furnace functions such as permeability, heat exchange, melting, and reducing capacity are closely connected to coke quality. Coke quality in turn depends on coal characterization and coke-making process parameters. The developed ANN model is a useful tool for process experts to adjust the control parameters in case of coke quality deviations. The model also makes it possible to predict CSR for new coal blends which are yet to be used in the coke plant. Input data to the model were structured into three modules covering the past two years, and the incremental models thus developed assist in identifying the group causing the deviation in CSR.
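A minimal sketch of such a BP-trained regression model; the real inputs (coal characteristics and process parameters) and the network architecture are not given in the abstract, so synthetic data and a small scikit-learn net stand in.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

# Synthetic placeholders: five hypothetical inputs (e.g. ash, volatile
# matter, moisture, coking time, temperature) and a fabricated CSR target.
rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 5))
csr = 55 + 10 * X[:, 0] - 8 * X[:, 1] + rng.normal(0, 1, 200)

scaler = StandardScaler().fit(X)
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000,
                     random_state=0).fit(scaler.transform(X), csr)
print(model.predict(scaler.transform(X[:3])))   # predicted CSR values
```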
Abstract: This paper is a survey of current component-based software technologies and a description of the promotion and inhibition factors in CBSE. The features that software components inherit are also discussed, and quality assurance issues in component-based software are addressed. The research on the quality model of component-based systems starts with a study of what components are, CBSE, its development life cycle, and the pros and cons of CBSE. Various attributes are studied and compared in view of existing quality models for general systems and for CBS. When describing the quality of a software component, an apt set of quality attributes for the description of the system (or components) should be selected. Finally, the research issues that can be extended are tabulated.
Abstract: Random Forests are a powerful classification technique, consisting of a collection of decision trees. One useful feature of Random Forests is the ability to determine the importance of each variable in predicting the outcome. This is done by permuting each variable and computing the change in prediction accuracy before and after the permutation. This variable importance calculation is similar to a one-factor-at-a-time experiment and is therefore inefficient. In this paper, we use a regular fractional factorial design to determine which variables to permute. Based on the results of the trials in the experiment, we calculate the individual importance of the variables with improved precision over the standard method. The method is illustrated with a study of student attrition at Monash University.
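For reference, the standard one-factor-at-a-time permutation importance that the paper improves on can be computed as below (scikit-learn, with synthetic data standing in for the attrition study; the fractional factorial variant permutes several variables per trial and separates their effects afterwards).

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Fit a forest, then measure the accuracy drop when each variable is
# permuted on held-out data -- the baseline importance calculation.
X, y = make_classification(n_samples=500, n_features=8, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(Xtr, ytr)
result = permutation_importance(rf, Xte, yte, n_repeats=10, random_state=0)
print(result.importances_mean)   # mean accuracy drop per permuted variable
```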
Abstract: Natural resources management, including water resources, requires reliable estimation of time-variant environmental parameters. Small improvements in the estimation of environmental parameters can have great effects on management decisions. Noise reduction using wavelet techniques is an effective approach for preprocessing practical data sets. The predictability enhancement of river flow time series is assessed using fractal approaches before and after applying wavelet-based preprocessing. Time series correlation and persistency, the minimum length sufficient for training the predicting model, and the maximum valid length of predictions were also investigated through a fractal assessment.
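A minimal sketch of the wavelet preprocessing step, assuming a db4 wavelet and the universal soft threshold; the paper's actual wavelet choice and threshold rule are not stated in the abstract.

```python
import numpy as np
import pywt

def wavelet_denoise(x, wavelet="db4", level=3):
    """Soft-threshold the detail coefficients with the universal threshold,
    estimating the noise level from the finest detail band."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # robust noise estimate
    thresh = sigma * np.sqrt(2 * np.log(len(x)))
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(x)]

# Synthetic "river flow" series standing in for the real data.
flow = (np.sin(np.linspace(0, 20, 512))
        + 0.3 * np.random.default_rng(0).normal(size=512))
print(np.std(flow - wavelet_denoise(flow)))   # magnitude of removed noise
```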
Abstract: Web usage mining algorithms have been widely utilized for modeling user web navigation behavior. In this study we advance a model for mining users' navigation patterns. The model builds a user model based on the expectation-maximization (EM) algorithm. The EM algorithm is used in statistics for finding maximum likelihood estimates of parameters in probabilistic models, where the model depends on unobserved latent variables. The experimental results show that as the number of clusters decreases, the log-likelihood converges toward lower values, and that the probability of the largest cluster decreases as the number of clusters increases in each treatment.
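A minimal sketch of the reported experiment using scikit-learn's EM-based Gaussian mixture: fit models with a varying number of clusters and inspect the total log-likelihood and the weight of the largest cluster. Synthetic session features stand in for the real navigation data.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Fabricated 3-D "session feature" vectors drawn from three groups.
rng = np.random.default_rng(0)
sessions = np.vstack([rng.normal(c, 0.5, size=(100, 3)) for c in (0, 3, 6)])

for k in (2, 3, 5, 8):
    gm = GaussianMixture(n_components=k, random_state=0).fit(sessions)
    total_ll = gm.score(sessions) * len(sessions)   # total log-likelihood
    print(k, total_ll, gm.weights_.max())           # largest-cluster weight
```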
Abstract: This paper proposes a new decision-making approach based on quantitative possibilistic influence diagrams, which are an extension of standard influence diagrams to the possibilistic framework. We treat in particular the case where several expert opinions relative to value nodes are available. An initial expert assigns confidence degrees to the other experts and fixes a similarity threshold that the provided possibility distributions should respect. To illustrate our approach, an evaluation algorithm for these multi-source possibilistic influence diagrams is also proposed.
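As a hedged illustration only: the classical possibilistic discounting rule and one plausible similarity measure, standing in for the paper's exact operators, which the abstract does not specify.

```python
import numpy as np

def discount(pi, confidence):
    """Discount an expert's possibility distribution by the confidence degree
    assigned by the initial expert: pi'(w) = max(pi(w), 1 - c).  This is the
    classical possibilistic discounting rule, used here as an assumption."""
    return np.maximum(pi, 1.0 - confidence)

def similarity(p, q):
    """One plausible similarity between possibility distributions:
    1 minus the maximal pointwise difference."""
    return 1.0 - np.max(np.abs(p - q))

p1 = np.array([1.0, 0.7, 0.2])   # initial expert, states of a value node
p2 = np.array([0.9, 1.0, 0.3])   # second expert's opinion
p2d = discount(p2, confidence=0.8)
print(p2d, similarity(p1, p2d) >= 0.6)   # keep expert 2 if threshold is met
```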
Abstract: Detection of incipient abnormal events is important for improving the safety and reliability of machine operations and for reducing losses caused by failures. Improper set-up or alignment of parts often leads to severe problems in many machines. The construction of prediction models for faulty conditions is essential in deciding when to perform machine maintenance. This paper presents a multivariate calibration monitoring approach based on the statistical analysis of machine measurement data. The calibration model is used to predict two faulty conditions from historical reference data. The approach utilizes genetic algorithm (GA) based variable selection, and we evaluate the predictive performance of several prediction methods using real data. The results show that the calibration model based on supervised probabilistic principal component analysis (SPPCA) yielded the best performance in this work. By adopting a proper variable selection scheme in calibration models, the prediction performance can be improved by excluding non-informative variables from the model-building step.
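A minimal sketch of GA-based variable selection, assuming binary masks, truncation selection, and bit-flip mutation, with a plain linear model standing in for the paper's SPPCA-based calibration to keep the example self-contained.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# Synthetic data: 12 candidate variables, only a few informative.
rng = np.random.default_rng(0)
X, y = make_regression(n_samples=150, n_features=12, n_informative=4,
                       noise=5.0, random_state=0)

def fitness(mask):
    """Cross-validated R^2 of a model restricted to the masked variables."""
    if not mask.any():
        return -np.inf
    return cross_val_score(LinearRegression(), X[:, mask], y, cv=3).mean()

pop = rng.integers(0, 2, size=(20, X.shape[1])).astype(bool)
for _ in range(30):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[-10:]]            # truncation selection
    children = parents[rng.integers(0, 10, 10)].copy()
    flips = rng.random(children.shape) < 0.1           # bit-flip mutation
    children ^= flips
    pop = np.vstack([parents, children])
print(pop[np.argmax([fitness(m) for m in pop])].astype(int))  # selected mask
```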
Abstract: In this paper, we are interested in the attitude control of a satellite actuated by reaction wheels, using state feedback. First, we develop a method allowing us to put the control and its integral in state-feedback form. Then, using the Gronwall-Bellman theorem, we establish sufficient conditions under which the nonlinear system modeling the satellite is stabilizable and observable by state feedback.
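One possible reading of the state-feedback form including the control integral, written as a hedged sketch; the paper's exact construction is not given in the abstract.

```latex
% Illustrative augmented form (an assumption, not the paper's construction):
% stacking the state x with the control integral x_I yields one
% state-feedback law covering both the control and its integral.
\dot{x} = f(x) + B\,u, \qquad \dot{x}_I = u, \qquad
z = \begin{pmatrix} x \\ x_I \end{pmatrix}, \qquad
u = -K z = -K_1 x - K_2 x_I .
```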
Abstract: Modeling the seismic behavior of the Panel Zone (PZ), because of its role in the overall ductility and lateral stiffness of steel moment frames, has been considered a challenge for years. There are some studies regarding the effects of different doubler plate thicknesses and geometric properties of the PZ on its seismic behavior. However, there is not much investigation of the effect of the number of provided continuity plates in the presence of one triangular haunch, two triangular haunches, or a rectangular haunch (T-shape haunches) for exterior columns. In this research, detailed finite element models of 12 tested connections of the SAC joint venture were first created and analyzed; the backbone curves of the cyclic behavior obtained from these models, together with other FE models of similar tests, were then used for neural network training. The seismic behavior in these data is categorized according to the continuity plate arrangements and the type of haunch. PZs with one-sided haunches show little plastic rotation. As the number of continuity plates increases due to the presence of two triangular haunches (four continuity plates), there is no plastic rotation; in other words, the PZ behaves in its elastic range. In the case of a rectangular haunch, the PZ shows more plastic rotation in comparison with a one-sided triangular haunch and especially with double-sided triangular haunches. Moreover, the models presented for triangular one-sided and double-sided haunches and rectangular haunches as a result of this study seem to provide a proper estimation of PZ seismic behavior.
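A minimal sketch of the neural network training step only; the feature coding (haunch type, continuity plates, backbone-curve samples) is an assumption, and the random placeholder data do not reproduce the study's FE results.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Features might encode haunch type, number of continuity plates, and
# sampled backbone-curve points; the target is a response quantity such
# as PZ plastic rotation.  All values below are random placeholders.
rng = np.random.default_rng(0)
X = np.column_stack([rng.integers(0, 3, 60),        # haunch type (coded)
                     rng.integers(0, 5, 60),        # continuity plates
                     rng.uniform(size=(60, 4))])    # backbone-curve samples
y = rng.uniform(0.0, 0.03, 60)                      # plastic rotation (rad)
net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=3000,
                   random_state=0).fit(X, y)
print(net.predict(X[:2]))                           # illustrative predictions
```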
Abstract: Single-modality biometric recognition is often unable to meet the high performance requirements of increasingly widespread applications. Multimodal biometric identification represents an emerging trend. This paper investigates a novel algorithm based on the fusion of fingerprint and finger-vein biometrics. For both biometric traits, we employ the Monogenic Local Binary Pattern (MonoLBP). This operator integrates the original LBP (Local Binary Pattern) with two other rotation-invariant measures: local phase and local surface type. Experimental results confirm that the proposed weighted-sum fusion achieves excellent identification performance compared with unimodal biometric systems. The AUC of the proposed approach combining the two modalities approaches unity (0.93).
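A minimal sketch of weighted-sum score fusion after min-max normalization; the scores and the weight are hypothetical placeholders, and the MonoLBP matching itself is not reproduced here.

```python
import numpy as np

def minmax_norm(scores):
    """Min-max normalize matcher scores to [0, 1] before fusion."""
    s = np.asarray(scores, dtype=float)
    return (s - s.min()) / (s.max() - s.min())

def weighted_sum_fusion(fp_scores, fv_scores, w=0.5):
    """Weighted-sum fusion of fingerprint and finger-vein match scores.
    The weight is a placeholder, not the paper's tuned value."""
    return w * minmax_norm(fp_scores) + (1 - w) * minmax_norm(fv_scores)

fp = [0.82, 0.40, 0.65, 0.30]   # hypothetical fingerprint match scores
fv = [0.75, 0.35, 0.50, 0.45]   # hypothetical finger-vein match scores
print(weighted_sum_fusion(fp, fv, w=0.6))   # fused scores for ranking
```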
Abstract: This paper presents a performance comparison of three estimation techniques used for peak load forecasting in power systems: genetic algorithms (GA), least error squares (LS), and least absolute value filtering (LAVF). The problem is formulated as an estimation problem, and different forecasting models are considered. Actual recorded data are used to perform the study, and the performance of the three optimal estimation techniques is examined. Advantages of each algorithm are reported and discussed.
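A minimal sketch of the LS technique on one hypothetical peak-load model (linear trend plus annual sinusoid over weekly data); the paper's actual forecasting models and recorded data are not given in the abstract.

```python
import numpy as np

# Fabricated two years of weekly peak loads with trend and seasonality.
rng = np.random.default_rng(0)
t = np.arange(104, dtype=float)
load = (900 + 2.0 * t + 50 * np.sin(2 * np.pi * t / 52)
        + rng.normal(0, 10, t.size))

def design(tt):
    """Regressor matrix: intercept, trend, annual sine/cosine pair."""
    return np.column_stack([np.ones_like(tt), tt,
                            np.sin(2 * np.pi * tt / 52),
                            np.cos(2 * np.pi * tt / 52)])

theta, *_ = np.linalg.lstsq(design(t), load, rcond=None)  # LS estimate
print(theta)                                   # estimated coefficients
print(design(np.array([104.0])) @ theta)       # next-week peak forecast
```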
Abstract: This paper examines several mathematical methods for modeling the hourly price forward curve (HPFC); the model is constructed by numerous regression methods, such as polynomial regression, radial basis function neural networks, and a Fourier series. The goodness of fit of the models is examined by means of statistical and graphical tools. The criterion for choosing the model is minimization of the Root Mean Squared Error (RMSE); using the correlation analysis approach for the regression analysis, the optimal model, robust against model misspecification, is identified. A supervised learning technique is employed to determine the optimal parameters corresponding to each measure of overall loss. Using all of the numerical methods mentioned previously, the explicit expressions for the optimal model are derived and the optimal designs implemented.
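A minimal sketch comparing two of the named fits by RMSE on synthetic hourly prices; the radial basis function network fit is omitted for brevity, and the degree and harmonic count are assumptions.

```python
import numpy as np

# Fabricated two weeks of hourly prices with a daily cycle.
rng = np.random.default_rng(0)
h = np.arange(24 * 14, dtype=float)
price = 40 + 8 * np.sin(2 * np.pi * h / 24) + rng.normal(0, 2, h.size)

def rmse(y, yhat):
    return np.sqrt(np.mean((y - yhat) ** 2))

poly = np.polyval(np.polyfit(h, price, deg=5), h)        # polynomial fit

# Truncated Fourier series: intercept plus three daily harmonics.
F = np.column_stack([np.ones_like(h)] +
                    [f(2 * np.pi * k * h / 24) for k in (1, 2, 3)
                     for f in (np.sin, np.cos)])
fourier = F @ np.linalg.lstsq(F, price, rcond=None)[0]

print(rmse(price, poly), rmse(price, fourier))   # pick the smaller RMSE
```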
Abstract: Solidification cracking and hydrogen cracking are among the defects generated in the fusion welding of ultrahigh carbon steels. Friction stir welding (FSW) of such steels, being a solid-state technique, has been demonstrated to alleviate the problems encountered in traditional welding. FSW involves several process parameters that must be carefully defined prior to processing. These parameters include, but are not restricted to, tool feed, tool RPM, tool geometry, and tool tilt angle. They are key to avoiding wormholes and voids behind the tool and to achieving a defect-free weld. More importantly, these parameters directly affect the microstructure of the weld and hence its final mechanical properties. To this end, a 3D finite element (FE) thermo-mechanical model was developed using DEFORM 3D to simulate FSW of carbon steel. At points of interest in the joint, the history of critical state variables such as temperature, stresses, and strain rates is tracked. Typical results include the ability to simulate the different weld zones. The simulation predictions were successfully compared to experimental FSW tests. It is believed that such a numerical model can be used to optimize FSW processing parameters in favor of a desirable defect-free weld with better mechanical properties.
Abstract: In a recent major industry-supported research and development study, a novel framework was developed and applied for the assessment of reliability and quality performance levels in real-life power systems of practical large-scale size. The new assessment methodology is based on three metaphors (dimensions) representing the relationship between available generation capacities and required demand levels. The paper shares the results of the successfully completed study and describes the implementation of the new methodology on practical zones in the Saudi electricity system.
Abstract: The saturated hydraulic conductivity of soil is an important property in processes involving water and solute flow in soils. It is difficult to measure and can be highly variable, requiring a large number of replicate samples. In this study, 60 sets of soil samples were collected in the Saqhez region of Kurdistan province, Iran. Statistics such as the correlation coefficient (R), Root Mean Square Error (RMSE), Mean Bias Error (MBE), and Mean Absolute Error (MAE) were used to evaluate multiple linear regression models as the number of data sets varied. The models were evaluated when only the percentages of sand, silt, and clay content (SSC) were used as inputs, and when SSC plus bulk density, Bd, (SSC+Bd) were used as inputs. For the 50-sample dataset, the R, RMSE, MBE, and MAE values of the relationship obtained from multiple linear regression were 0.925, 15.29, -1.03, and 12.51 for the SSC method, and 0.927, 15.28, -1.11, and 12.92 for the SSC+Bd method, respectively. For the 10-sample dataset, the corresponding values were 0.725, 19.62, -9.87, and 18.91 for the SSC method, and 0.618, 24.69, -17.37, and 22.16 for the SSC+Bd method, which shows that as the number of data sets increases, the precision of the estimated saturated hydraulic conductivity increases.
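A minimal sketch of the evaluation pipeline, computing R, RMSE, MBE, and MAE for an SSC-style multiple linear regression; synthetic sand/silt/clay percentages stand in for the field samples.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Fabricated soil textures (rows sum to 100 %) and a fake conductivity.
rng = np.random.default_rng(0)
ssc = rng.dirichlet([2, 2, 2], size=60) * 100        # sand, silt, clay (%)
ks = 5 + 0.4 * ssc[:, 0] - 0.2 * ssc[:, 2] + rng.normal(0, 3, 60)

pred = LinearRegression().fit(ssc, ks).predict(ssc)
err = pred - ks
R = np.corrcoef(ks, pred)[0, 1]                      # correlation coefficient
RMSE = np.sqrt(np.mean(err ** 2))
MBE = np.mean(err)                                   # mean bias error
MAE = np.mean(np.abs(err))
print(R, RMSE, MBE, MAE)
```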
Abstract: The triumph of inductive neuro-stimulation since its rediscovery in the 1980s has been quite spectacular. In many fields, ranging from clinical applications to basic research, this technique is absolutely indispensable. Nevertheless, the basic knowledge about the processes underlying the stimulation effect is still very rough and rarely refined in a quantitative way. This is not only an inexcusable blank spot in biophysics and for stimulation prediction, but also a fundamental hindrance to technological progress. The already very sophisticated devices have reached a stage where further optimization requires better strategies than those provided by simple linear membrane models of the integrate-and-fire style. Addressing this problem for the first time, we suggest in the following text a way to perform virtual quantitative analysis of a stimulation system. Concomitantly, this ansatz seems to provide a route towards better understanding by using nonlinear signal processing and treating the nerve as a filter that is adapted for neuronal magnetic stimulation. The model is compact and easy to adjust, and the whole setup behaved very robustly during all performed tests. As an example, a recent innovative stimulator design known as cTMS is analyzed and dimensioned with this approach. The results show hitherto unforeseen potentials.
Abstract: Deep cold rolling (DCR) and low plasticity burnishing (LPB) are cold working processes which readily produce a smooth and work-hardened surface by plastic deformation of surface irregularities. The present study focuses on the surface roughness and surface hardness of AISI 4140 work material, using a fractional factorial design of experiments. The surface integrity aspects of the work material were assessed in order to identify the predominant factors among the selected parameters; these were then ranked in order of significance, and the factor levels were set for minimizing surface roughness and/or maximizing surface hardness. In the present work, the influence of the main process parameters (force, feed rate, number of tool passes/overruns, initial roughness of the workpiece, ball material, ball diameter, and lubricant used) on the surface roughness and hardness of AISI 4140 steel was studied for both the LPB and DCR processes, and the results are compared. It was observed that the LPB process improved surface hardness by 167%, while the DCR process improved it by 442%. It was also found that the force, ball diameter, number of tool passes, and initial roughness of the workpiece are the most pronounced parameters, having a significant effect on the workpiece surface during deep cold rolling and low plasticity burnishing.
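A minimal sketch of the screening approach: a half-fraction 2^(4-1) two-level design over four of the study's factors (the choice of four factors and the placeholder response are assumptions), with main effects computed as the mean response difference between the high and low levels.

```python
import itertools
import numpy as np

# Half-fraction design: factors A, B, C enumerated fully, the fourth
# column generated from the defining relation D = ABC, so only 8 of the
# 16 full-factorial runs are needed (e.g. force, feed, passes, diameter).
base = np.array(list(itertools.product([-1, 1], repeat=3)))  # A, B, C
D = base[:, 0] * base[:, 1] * base[:, 2]                     # D = ABC
design = np.column_stack([base, D])
print(design)                     # 8 runs x 4 coded factor levels

# Main effect of each factor from a placeholder hardness response.
response = np.random.default_rng(0).normal(50, 5, 8)
effects = [response[design[:, j] == 1].mean()
           - response[design[:, j] == -1].mean() for j in range(4)]
print(effects)                    # rank factors by |effect| for significance
```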