Abstract: Quality control charts are very effective in detecting out-of-control signals, but when a control chart signals an out-of-control condition of the process mean, searching for a special cause in the vicinity of the signal time does not always lead to prompt identification of the source(s) of the out-of-control condition, because the change point in the process parameter(s) usually differs from the signal time. It is very important for the manufacturer to determine at what point in the past, and in which parameters, the change that caused the signal occurred. Early warning of process change would expedite the search for the special causes and enhance quality at lower cost. In this paper the quality variables under investigation are assumed to follow a multivariate normal distribution with known mean vector and variance-covariance matrix; the process means after a one-step change are assumed to remain at the new level until the special cause is identified and removed; and only one variable is assumed to shift at a time. This research applies an artificial neural network (ANN) to identify the time the change occurred and the parameter that caused the change or shift. The performance of the approach was assessed through a computer simulation experiment. The results show that the neural network performs effectively, and equally well, across the whole range of shift magnitudes considered.
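The gap between the change point and the signal time can be illustrated with a short simulation. This is a hedged sketch, not the paper's ANN method: it runs a Hotelling T^2 chart on bivariate normal data with an assumed one-step 3-sigma shift in a single variable; the change point, shift size, and control limit below are illustrative choices.

```python
import numpy as np

# Bivariate in-control process with known mean and covariance; a one-step
# mean shift in the first variable only occurs at the (unknown to the chart)
# change point tau. The chart signals at the first T^2 exceedance, which
# generally lags tau -- motivating change-point estimation.
rng = np.random.default_rng(0)
p, tau, n = 2, 50, 200                  # dimension, true change point, series length
mean, cov = np.zeros(p), np.eye(p)
ucl = -2.0 * np.log(0.0027)             # chi-square(2) upper limit for alpha = 0.0027

x = rng.multivariate_normal(mean, cov, n)
x[tau:, 0] += 3.0                       # one-step 3-sigma shift, first variable only

inv = np.linalg.inv(cov)
t2 = np.einsum("ij,jk,ik->i", x - mean, inv, x - mean)  # T^2 per observation
signal = int(np.argmax(t2 > ucl)) if np.any(t2 > ucl) else None
```

In the paper, an ANN trained on such simulated series estimates the change point and the shifted variable directly, rather than relying on the (delayed) signal time.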
Abstract: This paper is concerned with delay-distribution-dependent stability criteria for bidirectional associative memory (BAM) neural networks with time-varying delays. Based on a Lyapunov-Krasovskii functional and a stochastic analysis approach, a delay-probability-distribution-dependent sufficient condition is derived under which the considered BAM neural networks are globally asymptotically stable in the mean square. The criteria are formulated in terms of a set of linear matrix inequalities (LMIs), which can be checked efficiently with standard numerical packages. Finally, a numerical example and its simulation are given to demonstrate the usefulness and effectiveness of the proposed results.
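As an illustrative sketch only (not the paper's LMI conditions, which concern delayed BAM networks), the simplest related numerical check is a Lyapunov stability certificate for a linear system x' = A x: solve A^T P + P A = -Q for P via the Kronecker (vectorization) identity, and verify P is symmetric positive definite. The matrix A below is an assumed example.

```python
import numpy as np

A = np.array([[-2.0, 0.5],
              [0.3, -1.5]])             # assumed stable test matrix
Q = np.eye(2)

# vec_c(A^T P + P A) = (kron(I, A^T) + kron(A^T, I)) vec_c(P), with
# column-stacking vec; solve the linear system for vec_c(P).
M = np.kron(np.eye(2), A.T) + np.kron(A.T, np.eye(2))
P = np.linalg.solve(M, -Q.flatten(order="F")).reshape(2, 2, order="F")

# A symmetric positive definite P certifies global asymptotic stability.
eigs = np.linalg.eigvalsh((P + P.T) / 2.0)
stable = bool(np.all(eigs > 0))
```

LMI feasibility problems of the kind derived in the paper are checked the same way in spirit, but with semidefinite-programming solvers rather than a direct linear solve.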
Abstract: Clean air in a subway station is important to passengers. Platform Screen Doors (PSDs) can improve indoor air quality in the subway station; however, the air quality in the subway tunnel is degraded, as the tunnel has high CO2 concentrations and high indoor particulate matter (PM) levels. The Indoor Air Quality (IAQ) level in the subway environment degrades as the frequency of train operation and the number of trains increase. The ventilation systems of the subway tunnel therefore need improvements to provide better air quality. Numerical analyses can be effective tools to analyze the performance of subway twin-track tunnel ventilation systems. An existing twin-track tunnel in the metropolitan Seoul subway system is chosen for the numerical simulations. The ANSYS CFX software is used for unsteady computations of the airflow inside the twin-track tunnel as trains move. The airflow inside the tunnel is simulated both when one train runs and when two trains run simultaneously in the tunnel. The piston effect inside the tunnel is analyzed for the case in which all shafts function as natural ventilation shafts. The air supplied through the shafts mixes with the polluted air in the tunnel, and the polluted air is exhausted through the mechanical ventilation shafts. The supplied and discharged air volumes are balanced when only one train runs in the twin-track tunnel. The pollutant level in the tunnel is high when two trains run simultaneously in opposite directions and all shafts function as natural shafts, i.e., when there is no electrical power supply to the shafts. The polluted air remaining inside the tunnel enters the station platform when the doors are opened.
Abstract: A complex valued neural network is a neural network
which consists of complex valued input and/or weights and/or thresholds
and/or activation functions. Complex-valued neural networks
have been widening the scope of applications not only in electronics
and informatics, but also in social systems. One of the most important
applications of the complex valued neural network is in signal
processing. In neural networks, the generalized mean neuron model
(GMN) is often discussed and studied. The GMN includes a new
aggregation function based on the concept of generalized mean of all
the inputs to the neuron. This paper aims to present exhaustive results
of using Generalized Mean Neuron model in a complex-valued neural
network model that uses the back-propagation algorithm (called "Complex-BP") for learning. Our experimental results demonstrate the effectiveness of a generalized mean neuron model in the complex plane for signal processing over a real-valued neural network. We report observations on the effects of the learning rates, the ranges of the randomly selected initial weights, the error functions used, and the number of iterations required for convergence of the error on the generalized mean neural network model. Some inherent properties of this complex back-propagation algorithm are also studied and discussed.
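The aggregation at the heart of the GMN can be sketched in a few lines, assuming the common definition of a weighted generalized (power) mean, M_r(x; w) = (sum_i w_i x_i^r)^(1/r); the function name and test values below are illustrative, not the paper's notation.

```python
import numpy as np

def generalized_mean(x, w, r):
    """Aggregate positive inputs x with weights w and exponent r."""
    x, w = np.asarray(x, float), np.asarray(w, float)
    return float(np.sum(w * x**r) ** (1.0 / r))

x = [1.0, 2.0, 4.0]
w = [1 / 3, 1 / 3, 1 / 3]
m1 = generalized_mean(x, w, 1.0)   # r = 1: ordinary arithmetic mean
m2 = generalized_mean(x, w, 2.0)   # r = 2: quadratic mean (>= m1)
```

In the Complex-BP setting the same aggregation is applied with complex-valued inputs and weights, so the exponentiation is carried out in complex arithmetic.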
Abstract: Localized surface plasmon resonance (LSPR) is the
coherent oscillation of conductive electrons confined in noble
metallic nanoparticles excited by electromagnetic radiation, and
nanosphere lithography (NSL) is one of the cost-effective methods to
fabricate metal nanostructures for LSPR. NSL can be categorized
into two major groups: dispersed NSL and closely packed NSL. In
recent years, gold nanocrescents and gold nanoholes with vertical
sidewalls fabricated by dispersed NSL, and silver nanotriangles and
gold nanocaps on silica nanospheres fabricated by closely packed NSL,
have been reported for LSPR biosensing. This paper introduces
several novel gold nanostructures fabricated by NSL in LSPR
applications, including 3D nanostructures obtained by evaporating
gold obliquely on dispersed nanospheres, nanoholes with slant
sidewalls, and patchy nanoparticles on closely packed nanospheres,
all of which render satisfactory sensitivity for LSPR sensing. Since
the LSPR spectrum is very sensitive to the shape of the metal
nanostructures, formulas are derived and software is developed for calculating the profiles of the metal nanostructures obtainable by NSL for different nanosphere masks and fabrication conditions. The simulated profiles coincide well with the profiles of the fabricated gold nanostructures observed under the scanning electron microscope (SEM) and atomic force microscope (AFM), which confirms that the software is a useful tool for the process design of different LSPR nanostructures.
Abstract: Grid networks provide the ability to perform high-throughput computing by taking advantage of many networked computers' resources to solve large-scale computation problems. As the popularity of Grid networks has increased, there is a need to distribute the load efficiently among the resources accessible on the network. In this paper, we present a stochastic network system that provides a distributed load-balancing scheme by generating almost regular networks. This network system is self-organized and depends only on local information for load distribution and resource discovery. The in-degree of each node reflects its free resources, and the job assignment and resource discovery processes required for load balancing are accomplished by fitted random sampling. Simulation results show that the generated network system provides an effective, scalable, and reliable load-balancing scheme for the distributed resources accessible on Grid networks.
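The paper's "fitted random sampling" is not specified here; the generic idea it builds on can be sketched as sampling a target node with probability proportional to its free resources (mirrored by in-degree in the network). The node names and resource counts below are illustrative assumptions.

```python
import random

free_resources = {"n1": 8, "n2": 2, "n3": 0, "n4": 6}  # illustrative values

def pick_node(resources, rng=random):
    """Pick a node with probability proportional to its free resources."""
    nodes = [n for n, r in resources.items() if r > 0]  # saturated nodes excluded
    weights = [resources[n] for n in nodes]
    return rng.choices(nodes, weights=weights, k=1)[0]

random.seed(1)
choice = pick_node(free_resources)
```

Because the selection needs only the locally known resource counts, it fits the self-organized, local-information-only scheme the abstract describes.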
Abstract: Independent component analysis (ICA) in the
frequency domain is used for solving the problem of blind source
separation (BSS). However, this method has some problems. For example, a general ICA algorithm cannot determine the permutation of the separated signals, which is important in frequency-domain ICA. In this paper, we propose an approach to solving this permutation problem. The idea is to combine two conventional approaches effectively. The proposed approach improves the signal separation performance by exploiting the features of the conventional approaches. Simulation results using artificial data are presented.
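One of the conventional approaches to the permutation problem can be sketched as follows: at each frequency bin, choose the source ordering whose amplitude envelopes correlate best with a reference bin. This is a generic illustration (two sources, synthetic envelopes), not the paper's combined method.

```python
import numpy as np

def align_permutation(ref_env, env):
    """Return env with its two rows permuted to best correlate with ref_env."""
    def corr(a, b):
        a, b = a - a.mean(), b - b.mean()
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    keep = corr(ref_env[0], env[0]) + corr(ref_env[1], env[1])
    swap = corr(ref_env[0], env[1]) + corr(ref_env[1], env[0])
    return env if keep >= swap else env[::-1]

t = np.linspace(0, 1, 200)
ref = np.vstack([np.abs(np.sin(8 * np.pi * t)),   # envelope of source 1
                 np.abs(np.cos(3 * np.pi * t))])  # envelope of source 2
swapped = ref[::-1].copy()        # neighbouring bin with the sources permuted
aligned = align_permutation(ref, swapped)
```

Envelope correlation works because a source's amplitude modulation is similar across nearby frequency bins, so the correct pairing maximizes the correlation sum.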
Abstract: The quest for alternate fuels for a CI engine has become all the more imperative considering its importance in the economy of a nation and from the standpoint of preserving the environment. Reported in this paper are the combustion performance and P-θ characteristics of a CI engine operating on B20 biodiesel fuel derived from Jatropha oil. It is observed that the twin effect of advancing the injection timing and increasing the injector opening pressure (IOP) up to 220 bar has resulted in minimum brake specific energy consumption and higher peak pressure. It is also observed that the crank angle of occurrence of peak pressure progresses towards top dead center (TDC) as the timing is advanced and the IOP is increased.
Abstract: The last decade has shown that the object-oriented concept by itself is not powerful enough to cope with the rapidly changing requirements of ongoing applications. Component-based
systems achieve flexibility by clearly separating the stable parts of
systems (i.e. the components) from the specification of their
composition. In order to realize the reuse of components effectively
in CBSD, it is required to measure the reusability of components.
However, due to the black-box nature of components, where the source code of the components is not available, it is difficult to use conventional metrics in Component-based Development, as these
metrics require analysis of source code. In this paper, we survey a few existing component-based reusability metrics. These metrics give a broader view of a component's understandability, adaptability, and portability. We also describe an analysis, in terms of quality factors related to reusability, embodied in an approach that aids significantly in assessing existing components for reusability.
Abstract: In this paper, to optimize the "Characteristic Straight Line Method" used in soil displacement analysis, a "best estimate" of the geodetic leveling observations has been achieved by taking into account the concept of height systems. This concept, and consequently the concept of "height", is discussed in detail. In landslide dynamic analysis, the soil is considered as a mosaic of rigid blocks. The soil displacement has been monitored and analyzed using the "Characteristic Straight Line Method", whose characteristic components have been defined and constructed from a "best estimate" of the topometric observations. In the measurement of elevation differences, we have used the most modern leveling equipment available, and observational procedures have been designed to provide the most effective method of data acquisition. In addition, systematic errors which cannot be sufficiently controlled by instrumentation or observational techniques are minimized by applying appropriate corrections to the observed data: the level collimation correction minimizes the error caused by non-horizontality of the leveling instrument's line of sight for unequal sight lengths; the refraction correction is modeled to minimize the refraction error caused by temperature (density) variation of air strata; the rod temperature correction accounts for variation in the length of the leveling rod's Invar/LO-VAR® strip resulting from temperature changes; the rod scale correction ensures a uniform scale conforming to the international length standard; and the concept of height systems is introduced, under which all types of height (orthometric, dynamic, normal, gravity correction, and equipotential surface) have been investigated. The "Characteristic Straight Line Method" is slightly more convenient than the "Characteristic Circle Method": it permits the evaluation of displacements of very small, even infinitesimal, magnitude.
The inclination of the landslide is given by the inverse of the distance from reference point O to the "Characteristic Straight Line"; its direction is given by the bearing of the normal directed from point O to the Characteristic Straight Line (Fig. 6). A "best estimate" of the topometric observations was used to measure the elevations of carefully selected points before and after the deformation. Gross errors have been eliminated by statistical analyses and by comparing the heights within local neighborhoods. The results of a test in an area where very interesting land surface deformation occurs are reported. Monitoring with different options and a qualitative comparison of results based on a sufficient number of check points are presented.
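One of the corrections listed in the abstract above, the rod temperature correction, can be sketched with the standard linear thermal-expansion model: the corrected height difference is dH_obs * (1 + alpha * (T - T0)). The expansion coefficient and readings below are assumed values for illustration, not from the survey described in the paper.

```python
def rod_temperature_correction(dh_obs, alpha, t_obs, t_ref):
    """Correct an observed height difference for thermal change in rod length.

    dh_obs: observed height difference (m); alpha: linear expansion
    coefficient of the rod strip (1/degC); t_obs: rod temperature during
    observation (degC); t_ref: rod calibration temperature (degC).
    """
    return dh_obs * (1.0 + alpha * (t_obs - t_ref))

alpha_invar = 1.0e-6          # 1/degC, typical order of magnitude for an Invar strip
dh = rod_temperature_correction(2.45678, alpha_invar, 28.0, 20.0)
```

The other corrections (collimation, refraction, rod scale) follow the same pattern of a small, modeled adjustment applied to each observed difference before adjustment.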
Abstract: The purpose of this paper is to perform a multidisciplinary design and analysis (MDA) of honeycomb panels used in satellite structural design. All the analysis is based on clamped-free boundary conditions. In the present work, detailed finite element models for honeycomb panels are developed and analysed. Experimental tests were carried out on a honeycomb specimen, with the goal of comparing the results with the earlier modal analysis performed by the finite element method and with the existing equivalent approaches. The results show good agreement between the finite element analysis, the equivalent model, and the tests: the difference in the first two frequencies is less than 4%, and less than 10% for the third frequency. The results of the equivalent model presented in this analysis are thus obtained with good accuracy. Moreover, the investigations carried out in this research address the honeycomb plate modal analysis under several aspects, including variation of the structural geometry, by studying the influence of the dimensional parameters on the modal frequency, and variation of the core and skin materials of the honeycomb. The various results obtained in this paper are promising and show that the geometric parameters and the type of material affect the value of the honeycomb plate modal frequency.
Abstract: Automatic segmentation of skin lesions is the first step
towards development of a computer-aided diagnosis of melanoma.
Although numerous segmentation methods have been developed,
few studies have focused on determining the most discriminative
and effective color space for melanoma application. This paper
proposes a novel automatic segmentation algorithm using color space
analysis and clustering-based histogram thresholding, which is able to
determine the optimal color channel for segmentation of skin lesions.
To demonstrate the validity of the algorithm, it is tested on a set of 30
high resolution dermoscopy images and a comprehensive evaluation
of the results is provided, in which borders manually drawn by four dermatologists are compared to the automated borders detected by the
proposed algorithm. The evaluation is carried out by applying three
previously used metrics of accuracy, sensitivity, and specificity and
a new metric of similarity. Through ROC analysis and ranking of the metrics, it is shown that the best results are obtained with the X and XoYoR color channels, which yield an accuracy of approximately 97%. The proposed method is also compared with two state-of-the-art skin lesion segmentation methods, demonstrating the effectiveness and superiority of the proposed segmentation method.
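The general technique named in the abstract, clustering-based histogram thresholding, can be sketched as a two-cluster k-means on the pixel intensities of the chosen color channel, with the threshold taken as the midpoint between the converged cluster centers. This is a generic illustration on toy data, not the paper's exact algorithm.

```python
import numpy as np

def kmeans_threshold(channel, iters=50):
    """Two-cluster k-means on intensities; returns the midpoint threshold."""
    x = np.asarray(channel, float).ravel()
    c_lo, c_hi = x.min(), x.max()            # initialize centers at the extremes
    for _ in range(iters):
        mask = np.abs(x - c_lo) <= np.abs(x - c_hi)  # assign to nearest center
        c_lo, c_hi = x[mask].mean(), x[~mask].mean() # update centers
    return (c_lo + c_hi) / 2.0

# Bimodal toy "channel": dark lesion pixels around 40, skin pixels around 200.
channel = np.array([38, 40, 42, 44, 36, 198, 200, 202, 204, 196], float)
thr = kmeans_threshold(channel)
lesion_mask = channel < thr
```

In practice the clustering is run on the histogram of the optimal color channel selected by the algorithm, and the resulting binary mask is post-processed into the lesion border.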
Abstract: This paper presents a useful sub-pixel image
registration method using line segments and a sub-pixel edge detector.
In this approach, straight line segments are first extracted from gray
images at the pixel level before applying the sub-pixel edge detector.
Next, all sub-pixel line edges are mapped onto the orientation-distance
parameter space to solve for line correspondence between images.
Finally, the registration parameters with sub-pixel accuracy are
analytically solved via two linear least-square problems. The present
approach can be applied to various fields where fast registration with
sub-pixel accuracy is required. To illustrate, the present approach is
applied to the inspection of printed circuits on a flat panel. A numerical example shows that the present approach is effective and accurate
when target images contain a sufficient number of line segments,
which is true in many industrial problems.
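The two linear least-squares steps described above can be sketched under the standard line parameterization n·x = rho with n = (cos t, sin t): for a rigid motion x' = R(phi) x + T, a line (t, rho) maps to (t + phi, rho + cos(t') Tx + sin(t') Ty). The data below are synthetic and the small-angle averaging for phi is an assumed simplification of the paper's first least-squares step.

```python
import numpy as np

def register_from_lines(lines, lines_prime):
    """Recover (phi, T) from matched line parameters (t, rho) -> (t', rho')."""
    t, rho = lines[:, 0], lines[:, 1]
    tp, rhop = lines_prime[:, 0], lines_prime[:, 1]
    phi = float(np.mean(tp - t))                   # step 1: rotation angle
    A = np.column_stack([np.cos(tp), np.sin(tp)])  # step 2: translation
    T, *_ = np.linalg.lstsq(A, rhop - rho, rcond=None)
    return phi, T

# Three synthetic line correspondences with known motion.
phi_true, T_true = 0.05, np.array([0.3, -0.2])
t = np.array([0.1, 0.9, 2.0])
rho = np.array([1.0, 2.5, 0.7])
tp = t + phi_true
rhop = rho + np.cos(tp) * T_true[0] + np.sin(tp) * T_true[1]
phi_est, T_est = register_from_lines(np.column_stack([t, rho]),
                                     np.column_stack([tp, rhop]))
```

With sub-pixel line edges, the (t, rho) parameters carry sub-pixel accuracy, so the recovered registration parameters inherit it.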
Abstract: In this paper, a model-based methodology for predicting the tool forces in oblique machining is introduced by adopting the orthogonal cutting technique. The analytical calculation applied is based mostly on the Devries model, with some parts of the methodology adopted from the Armarego-Brown model. Model validation is performed by comparing experimental data with the prediction results for machining a titanium alloy (Ti-6Al-4V) from a micro-cutting tool perspective. Good agreement with the experiments is observed. A detailed friction formulation that affects the tool forces has also been examined, with reasonable results obtained.
Abstract: Multiprocessor task scheduling is an NP-hard problem, and the Genetic Algorithm (GA) has proven to be an excellent technique for finding an optimal solution. In the past, several GA-based methods have been proposed for this problem, but all of them consider a single criterion. In the present work, minimization of a bi-criteria multiprocessor task scheduling objective is considered, namely a weighted sum of makespan and total completion time. The efficiency and effectiveness of a genetic algorithm can be improved by optimizing its different parameters, such as the crossover and mutation operators, crossover probability, and selection function. The effects of the GA parameters on minimization of the bi-criteria fitness function, and the subsequent setting of the parameters, have been studied using the central composite design (CCD) approach of response surface methodology (RSM) from Design of Experiments. The experiments have been performed with different levels of the GA parameters, and analysis of variance has been performed to identify the parameters significant for minimizing makespan and total completion time simultaneously.
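The bi-criteria fitness the GA minimizes can be sketched directly: a schedule assigns each task to a processor, makespan is the largest processor load, and total completion time is the sum of the tasks' completion times. The task times, assignment, and weight below are illustrative assumptions, not the paper's instances.

```python
def fitness(times, assignment, n_procs, w=0.5):
    """Weighted sum of makespan and total completion time for a schedule."""
    loads = [0.0] * n_procs
    total_completion = 0.0
    for t, p in zip(times, assignment):
        loads[p] += t                  # tasks on a processor run back to back
        total_completion += loads[p]   # completion time of this task
    makespan = max(loads)
    return w * makespan + (1.0 - w) * total_completion

times = [3.0, 2.0, 4.0, 1.0]
assignment = [0, 1, 0, 1]              # task index -> processor index
f = fitness(times, assignment, n_procs=2, w=0.5)
```

A GA chromosome encodes the assignment (and task order); this function is evaluated for each chromosome, and the CCD/RSM study tunes the GA parameters that drive its minimization.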
Abstract: Post-growth annealing of a solution-grown ZnO nanowire array is performed under controlled oxygen ambience. The
role of annealing on surface defects, and their consequences for the dark/photo-conductivity and photosensitivity of the nanowire array, is investigated. Surface defect properties are explored using various
measurement tools such as contact angle, photoluminescence, Raman
spectroscopy and XPS measurements. The contact angle of the NW
films reduces due to oxygen annealing and nanowire film surface
changes from hydrophobic (96°) to hydrophilic (16°). Raman and
XPS spectroscopy reveal that oxygen annealing improves the crystal
quality of the nanowire films. The defect band emission intensity
(relative to band edge emission, ID/IUV) reduces from 1.3 to 0.2 after
annealing at 600 °C under a 10 SCCM flow of oxygen. An order-of-magnitude enhancement in dark conductivity is observed in the O2-annealed samples, while the photoconductivity is found to be slightly reduced owing to the lower concentration of surface-related oxygen defects.
Abstract: Compacted clay liners (CCLs) are the main materials
used in waste disposal landfills due to their low permeability. In this
study, the effect of inorganic salt solutions used as permeant fluid on the shear resistance of clays was experimentally investigated. For this
purpose, NaCl inorganic salt solution at concentrations of 2, 5, 10%
and deionized water were used. Laboratory direct shear and Vane
shear tests were conducted on three compacted clays with low,
medium, and high plasticity. Results indicated that the solution type and its concentration affect the shear properties of the mixtures. In light of this study, the magnitude of the influence of these inorganic salts at various concentrations on the different clays was determined, and the most suitable compacted clay was identified by comparing plasticities.
Abstract: A novel sponge submerged membrane bioreactor
(SSMBR) was developed to effectively remove organics and
nutrients from wastewater. Sponge is introduced within the SSMBR
as a medium for the attached growth of biomass. This paper evaluates
the effects of new and acclimatized sponges for dissolved organic
carbon (DOC) removal from wastewater at different mixed liquor suspended solids (MLSS) concentrations of the sludge. It was
observed in a series of experimental studies that the acclimatized
sponge performed better than the new sponge whilst the optimum
DOC removal could be achieved at 10g/L of MLSS with the
acclimatized sponge. Moreover, the paper analyses the relationships
between the MLSS_sponge/MLSS_sludge ratio and the DOC removal efficiency
of SSMBR. The results showed a non-linear relationship between the
biomass parameters of the sponge and the sludge, and the DOC
removal efficiency of SSMBR. A second-order polynomial function
could reasonably represent these relationships.
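The second-order polynomial relationship mentioned above can be sketched with `numpy.polyfit`: fit DOC removal efficiency against the biomass ratio and use the resulting quadratic for prediction. The data points below are synthetic illustrations, not the paper's measurements.

```python
import numpy as np

# Synthetic quadratic trend standing in for measured (ratio, removal %) pairs.
ratio = np.array([0.2, 0.4, 0.6, 0.8, 1.0])
removal = 60 + 55 * ratio - 30 * ratio**2

coeffs = np.polyfit(ratio, removal, deg=2)   # [a, b, c] for a*x^2 + b*x + c
predict = np.poly1d(coeffs)                  # callable quadratic model
```

With real measurements the fit is a least-squares approximation rather than exact, and its goodness of fit (e.g. R^2) indicates how well the second-order function represents the relationship.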
Abstract: In research on natural ventilation and on passive cooling with forced convection, it is essential to know how heat flows in a solid object, the pattern of temperature distribution on its surfaces, and, eventually, how air flows through and convects heat from the surfaces of the steel under the roof. This paper presents results from a computational fluid dynamics (CFD) program comparing natural ventilation and forced convection within a roof attic that receives direct solar radiation. The CFD model of the air flow inside the roof attic has been set up for two cases: in the first case, analysed under natural ventilation, the roof attic is a closed space, while in the second case, analysed under forced convection, the roof attic is an open space. Both cases provide predictions of quantities such as the temperature, pressure, and mass flow rate distributions within the roof attic. The comparison shows that this CFD program is an effective model for predicting the air temperature and heat transfer coefficient distributions within the roof attic. The results show that forced convection can help reduce heat transfer through the roof attic, and that the area around the steel core has a lower inner-zone temperature than in the natural ventilation case. The temperature difference at the steel core of the roof attic between the two cases was 10-15 K.
Abstract: Natural resources management, including water resources, requires reliable estimation of time-variant environmental parameters. Small improvements in the estimation of environmental parameters would have large effects on management decisions. Noise reduction using wavelet techniques is an effective approach for preprocessing practical data sets. The predictability enhancement of river flow time series is assessed using fractal approaches before and after applying wavelet-based preprocessing. The correlation and persistency of the time series, the minimum length sufficient for training the predicting model, and the maximum valid length of predictions were also investigated through a fractal assessment.
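Wavelet-based noise reduction of the kind used for preprocessing can be sketched with a single level of the Haar transform and soft thresholding of the detail coefficients. The signal, noise level, and threshold below are illustrative assumptions; practical work typically uses deeper decompositions and data-driven thresholds.

```python
import numpy as np

def haar_denoise(x, thresh):
    """One-level Haar transform, soft-threshold details, inverse (len(x) even)."""
    x = np.asarray(x, float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # Haar approximation coefficients
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # Haar detail coefficients
    detail = np.sign(detail) * np.maximum(np.abs(detail) - thresh, 0.0)
    out = np.empty_like(x)
    out[0::2] = (approx + detail) / np.sqrt(2.0)  # inverse Haar transform
    out[1::2] = (approx - detail) / np.sqrt(2.0)
    return out

rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 256)
clean = np.sin(t)                                  # stand-in for a smooth flow signal
noisy = clean + 0.3 * rng.standard_normal(t.size)
denoised = haar_denoise(noisy, thresh=0.5)
```

The denoised series is then what the fractal predictability assessment (correlation, persistency, training length) would be run on, alongside the raw series for comparison.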