Abstract: The study of defects generated on manufactured parts
shows how difficult it is both to keep parts in position during
the machining process and to estimate these defects during
pre-process planning. This work contributes to the development of
3D models for the optimization of manufacturing tolerances. An
experimental study allows the part-positioning defects to be
measured in order to determine ε and to choose an optimal setup
of the part. A 3D tolerancing approach based on the small
displacements method allows the manufacturing errors to be
determined upstream. A tool developed for this purpose
automatically generates the tolerance intervals along the three axes.
Abstract: In this paper we present a new approach to detecting
flaws in T.O.F.D (Time Of Flight Diffraction) ultrasonic images
based on texture features. Texture is one of the most important
features used in recognizing patterns in an image. The paper
describes texture features based on 2D Gabor functions, i.e.,
Gaussian shaped band-pass filters, with dyadic treatment of the radial
spatial frequency range and multiple orientations, which represent an
appropriate choice for tasks requiring simultaneous measurement in
both space and frequency domains. The most relevant features are
used as input to a fuzzy c-means clustering classifier. Only two
classes are considered: 'defect' and 'no defect'. The proposed
approach is tested on T.O.F.D images acquired both in the
laboratory and in the industrial field.
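The fuzzy c-means classifier mentioned above can be sketched in a few lines of pure Python; this is an illustrative 1-D sketch with assumed defaults (two clusters, fuzzifier m = 2), not the implementation used in the paper:

```python
def fuzzy_c_means(data, c=2, m=2.0, n_iter=50):
    """Minimal 1-D fuzzy c-means. Returns cluster centres and the
    membership matrix u (u[i][k] = degree to which point k belongs
    to cluster i). m > 1 is the fuzzifier."""
    srt = sorted(data)
    # Deterministic initialisation: spread centres over the data range.
    centres = [srt[i * (len(srt) - 1) // (c - 1)] for i in range(c)]
    u = [[0.0] * len(data) for _ in range(c)]
    for _ in range(n_iter):
        # Membership update from distances to the current centres.
        for k, x in enumerate(data):
            d = [abs(x - ci) + 1e-12 for ci in centres]
            for i in range(c):
                u[i][k] = 1.0 / sum((d[i] / dj) ** (2.0 / (m - 1.0)) for dj in d)
        # Centre update: membership-weighted means.
        for i in range(c):
            w = [uik ** m for uik in u[i]]
            centres[i] = sum(wk * x for wk, x in zip(w, data)) / sum(w)
    return centres, u
```

In the paper's setting the data points would be Gabor-feature vectors rather than scalars, and the two clusters correspond to 'defect' and 'no defect'.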
Abstract: The objective of global optimization is to find the
globally best solution of a model. Nonlinear models are ubiquitous
in many applications and their solution often requires a global
search approach; i.e. for a function f from a set A ⊂ Rn to
the real numbers, an element x0 ∈ A is sought such that
∀ x ∈ A : f(x0) ≤ f(x). Depending on the field of application,
the question of whether a found solution x0 is not only a local
minimum but a global one is very important.
This article presents a probabilistic approach to determine the
probability of a solution being a global minimum. The approach is
independent of the used global search method and only requires a
bounded, convex parameter domain A as well as a Lipschitz-continuous
function f whose Lipschitz constant need not be known.
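As a point of comparison for global search (not the paper's probabilistic criterion), a minimal uniform random-search sketch over a bounded box illustrates the problem statement; the objective function and sample count below are assumptions:

```python
import random

def random_search(f, bounds, n_samples=20000, seed=0):
    """Draw uniform samples over the box `bounds` and keep the best
    point. A baseline global-search strategy: on a bounded domain
    with Lipschitz-continuous f, enough samples approach the global
    minimum without knowing the Lipschitz constant."""
    rng = random.Random(seed)
    best_x, best_val = None, float("inf")
    for _ in range(n_samples):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        val = f(x)
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val
```

For a multimodal function such as f(x) = (x² − 1)² + 0.3x, which has a local minimum near x = +1 but its global minimum near x = −1, the search lands in the global basin.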
Abstract: Real-time embedded systems should benefit from
component-based software engineering to handle complexity and
deal with dependability. In these systems, applications should not
only be logically correct but also behave within time windows.
However, among current component-based software engineering
approaches, few component models handle timing properties in a
manner that allows efficient analysis and checking at the
architectural level. In this paper, we present a meta-model for
component-based software description that integrates timing
issues. To achieve a complete functional model of software
components, our meta-model focuses on four functional aspects:
interface, static behavior, dynamic behavior, and interaction
protocol. With each aspect we have explicitly associated a time
model. Such a time model can be used to check a component's
design against certain properties and to compute the timing
properties of component assemblies.
Abstract: In this paper, the problem of estimating the optimal
radio capacity of a single-cell spread spectrum (SS) multiple-input
multiple-output (MIMO) system operating in a Rayleigh fading environment
is examined. The optimisation between the radio capacity
and the theoretically achievable average channel capacity (in the
sense of information theory) per user of a MIMO single-cell SS system
operating in a Rayleigh fading environment is presented. Then,
the spectral efficiency is estimated in terms of the achievable average
channel capacity per user, during the operation over a broadcast
time-varying link, and leads to a simple, novel closed-form expression
for the optimal radio capacity value based on the maximization
of the achieved spectral efficiency. Numerical results are presented to
illustrate the proposed analysis.
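For a feel of the ergodic-capacity quantity discussed above, a single-antenna sketch (not the paper's MIMO closed form) estimates E[log2(1 + SNR·|h|²)] for a Rayleigh fading link by Monte Carlo; the trial count and SNR are illustrative:

```python
import math
import random

def avg_rayleigh_capacity(snr_db, n_trials=50000, seed=1):
    """Monte Carlo estimate of the ergodic capacity
    E[log2(1 + SNR * |h|^2)] of a single-antenna Rayleigh fading
    link with h ~ CN(0, 1)."""
    rng = random.Random(seed)
    snr = 10.0 ** (snr_db / 10.0)
    total = 0.0
    for _ in range(n_trials):
        # |h|^2 for h ~ CN(0, 1) is exponentially distributed, unit mean.
        h2 = rng.gauss(0, math.sqrt(0.5)) ** 2 + rng.gauss(0, math.sqrt(0.5)) ** 2
        total += math.log2(1.0 + snr * h2)
    return total / n_trials
```

By Jensen's inequality the fading average is below the AWGN value log2(1 + SNR) at the same average SNR.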
Abstract: This paper presents an approach based on a supervised
feed-forward neural network, namely a multilayer perceptron (MLP),
and the finite element method (FEM) to solve the inverse problem
of parameter identification. The
approach is used to identify unknown parameters of ferromagnetic
materials. The methodology used in this study consists of simulating
a large number of parameter combinations for a material under test
using the FEM. Both variations in relative
magnetic permeability and electrical conductivity of the material
under test are considered. Then, the obtained results are used to
generate a set of vectors for the training of MLP neural network.
Finally, the obtained neural network is used to evaluate a group of
new materials, simulated by the FEM, but not belonging to the
original dataset. Noise added to the probe measurements is used
to enhance the robustness of the method. The results obtained
demonstrate the efficiency of the proposed approach and encourage
future work on this subject.
Abstract: In this paper, we propose a new image segmentation approach for colour textured images. The proposed method consists of two stages. In the first stage, textural features based on the gray-level co-occurrence matrix (GLCM) are computed for regions of interest (ROI) considered for each class; the ROI acts as ground truth for the classes. The Ohta model (I1, I2, I3) is the colour model used for segmentation. The statistical mean feature of the I2 component at a certain inter-pixel distance (IPD) was taken as the optimized textural feature for further segmentation. In the second stage, the feature matrix obtained is assumed to be a degraded version of the image labels, and a Markov Random Field (MRF) is used to model the unknown image labels. The labels are estimated under the maximum a posteriori (MAP) criterion using the ICM algorithm. The performance of the proposed approach is compared with that of existing schemes: JSEG, and another scheme that uses GLCM and MRF in the RGB colour space. The proposed method is found to outperform the existing ones in terms of segmentation accuracy with an acceptable rate of convergence. The results are validated with synthetic and real textured images.
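The co-occurrence counting underlying the GLCM features can be sketched as follows; the displacement (dx, dy) and the number of grey levels are illustrative parameters, not the paper's full feature pipeline:

```python
def glcm(image, dx=1, dy=0, levels=4):
    """Grey-level co-occurrence matrix of a 2-D image given as a list
    of lists of integer grey levels in [0, levels). Entry [i][j]
    counts pixel pairs (p, q) with grey levels i, j, where q is
    displaced from p by (dx, dy)."""
    mat = [[0] * levels for _ in range(levels)]
    rows, cols = len(image), len(image[0])
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dy, c + dx
            if 0 <= r2 < rows and 0 <= c2 < cols:
                mat[image[r][c]][image[r2][c2]] += 1
    return mat
```

Statistics such as the mean used in the paper are then computed from the normalized matrix; different inter-pixel distances correspond to different (dx, dy) displacements.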
Abstract: Advances in computing applications in recent years
have prompted the demand for more flexible scheduling models that
support QoS. Moreover, in practical applications, partly violated
temporal constraints can be tolerated if the violations follow a
certain distribution, so the traditional Liu and Layland model
needs to be extended to these circumstances. Two such extensions
are the (m, k)-firm model and the Window-Constrained model. This
paper studies weakly hard real-time constraints and their
combination to support QoS. The fact that a practical application
can tolerate some violations of its temporal constraints under a
certain distribution is employed to support adaptive QoS on an
open real-time system. The experimental results show that these
approaches are effective compared to traditional scheduling
algorithms.
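An (m, k)-firm constraint of the kind described above can be checked with a simple sliding-window test; this sketch assumes the deadline outcomes of consecutive job instances are recorded as booleans:

```python
def satisfies_mk_firm(outcomes, m, k):
    """Check the (m, k)-firm guarantee: in every window of k
    consecutive job instances, at least m must meet their deadlines.
    `outcomes` is a list of booleans (True = deadline met)."""
    for start in range(len(outcomes) - k + 1):
        if sum(outcomes[start:start + k]) < m:
            return False
    return True
```

The Window-Constrained model is checked analogously, but over fixed, non-overlapping windows rather than every sliding window.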
Abstract: The aim of the study was to investigate whether
adopting ISO 14001 certification promotes product ecodesign
measures in manufacturing companies in the Republic of Slovenia.
Companies devoted most of their product-development attention to
reducing waste and energy during the manufacturing process and to
reducing material consumption per unit of product. Among the
different ecodesign criteria, reduction of material consumption
per unit of product was reported as the most important one. Less
attention is paid to end-of-life issues such as recycling or
packaging. Most manufacturing enterprises considered the ISO
14001 standard a very useful, or at least useful, tool helping
them to establish and accelerate product ecodesign activities.
The two most frequently cited ecodesign drivers are increased
competitive advantage and legal requirements, and the two most
important barriers are high development costs and insufficient
market demand.
Abstract: Moulded parts contribute to more than 70% of
components in products. However, plastic injection moulding in
particular suffers from common defects such as warpage, shrinkage,
sink marks, and weld lines. In this paper, Taguchi experimental
design methods are applied to reduce the warpage defect of a thin
Acrylonitrile Butadiene Styrene (ABS) plate and are demonstrated
at two levels, namely Taguchi orthogonal arrays and the Analysis
of Variance (ANOVA). Eight trials were run, from which the optimal
parameters minimizing the warpage defect in the factorial
experiment were obtained. The results of the ANOVA analysis,
compared with those derived from MINITAB, identify the most
significant factors that may cause warpage in the injection
moulding process. Moreover, the ANOVA approach is more accurate
than alternatives such as the S/N ratio, and by accounting for
factor interactions it can achieve better outcomes.
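The 'smaller-the-better' Taguchi signal-to-noise ratio commonly used for a defect such as warpage can be computed as follows (a standard textbook formula, not necessarily the exact statistic used in the paper):

```python
import math

def sn_smaller_the_better(responses):
    """Taguchi signal-to-noise ratio for a 'smaller-the-better'
    response such as warpage: S/N = -10 * log10(mean(y_i^2)).
    Larger S/N values indicate smaller (better) responses."""
    return -10.0 * math.log10(sum(y * y for y in responses) / len(responses))
```

Each of the eight trials yields one S/N value; the factor levels that maximize the mean S/N are the Taguchi-optimal settings.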
Abstract: We introduce an effective approach for automatic offline authentication of handwritten samples where the forgeries are skillfully done, i.e., the genuine and forged samples look almost alike. Subtle details of the temporal information used in online verification are not available offline and are also hard to recover robustly. Thus spatial dynamic information such as the pen-tip pressure characteristics is considered, emphasizing the extraction of low-density pixels. These points result from the ballistic rhythm of a genuine signature, which a forgery, however skillful, always lacks. Ten effective features, including these low-density points and the density ratio, are proposed to distinguish between a genuine and a forged sample. An adaptive decision criterion is also derived for better verification judgements.
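The density-ratio idea can be illustrated with a toy feature extractor; the grey-level thresholds below are hypothetical, not the paper's calibrated values:

```python
def low_density_ratio(image, threshold=64):
    """Fraction of stroke pixels whose grey level is below `threshold`
    (hypothetical cut-off). In a greyscale signature scan, faint
    (low-density) pixels come from fast, ballistic pen strokes that a
    careful forger tends not to reproduce. Background is assumed to
    be pure white (255)."""
    stroke = [p for row in image for p in row if p < 255]
    if not stroke:
        return 0.0
    return sum(1 for p in stroke if p < threshold) / len(stroke)
```

In a real verifier this ratio would be one of several features fed to the adaptive decision criterion.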
Abstract: Multicast network technology has pervaded our lives and
underlies improvements in many of the routing devices we use.
Multicast data delivery supports many user applications, such as
high-speed voice and high-speed data services, an area presently
dominated by conventional networking, cable systems, and digital
subscriber line (DSL) technologies, and it offers advantages over
other routing techniques. Most multicast applications require QoS
(Quality of Service) guarantees. We address bandwidth-delay
constrained optimization using a multi-objective model and a
routing approach based on a genetic algorithm (GA) that optimizes
multiple QoS parameters simultaneously. The proposed approach
produces non-dominated routes with the high efficiency of the GA;
its improvement and optimization quality have been verified. We
have also correlated the results of the multicast GA with
broadband wireless to minimize the delay in the path.
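The non-dominated (Pareto) filtering at the heart of such multi-objective routing can be sketched as follows, assuming each candidate route is summarized by a (delay, cost) pair to be minimized:

```python
def non_dominated(routes):
    """Filter candidate routes to the Pareto set when minimising both
    objectives. Each route is a (delay, cost) tuple; route a
    dominates b when it is no worse in both objectives and strictly
    better in at least one."""
    def dominates(a, b):
        return a[0] <= b[0] and a[1] <= b[1] and a != b
    return [r for r in routes if not any(dominates(o, r) for o in routes)]
```

In a GA-based router, this filter would be applied to each generation's population so that only non-dominated routes survive to reproduce.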
Abstract: The customary practice of identifying industrial sickness relies on a set of traditional techniques based on manual monitoring and the compilation of financial records. This makes the process tedious, time-consuming and often susceptible to manipulation. Therefore, readily available tools are required that can deal with the uncertain situations arising out of industrial sickness. This is all the more significant for a country like India, where the fruits of development are rarely equally distributed. In this paper, we propose an approach based on an Artificial Neural Network (ANN) to deal with industrial sickness, with specific focus on a few such units taken from Assam, a less developed north-east (NE) Indian state. The proposed system provides decisions regarding industrial sickness using eight different parameters which are directly related to the stages of sickness of such units. The mechanism primarily uses certain signals and symptoms of industrial health to decide upon the state of a unit. Specifically, we formulate an ANN-based block with data obtained from a few selected units of Assam so that the required decisions related to industrial health can be taken. The system thus formulated could become an important part of planning and development. It can also contribute towards the computerization of decision support systems related to industrial health and help in better management.
Abstract: Because of the importance of energy, the optimization
of power generation systems is necessary. Gas turbine cycles are
well suited to fast power generation, but their efficiency is
relatively low. To achieve higher efficiencies, several measures
are preferred, such as recovering heat from the exhaust gases in
a regenerator, using an intercooler in a multistage compressor,
and injecting steam into the combustion chamber. However,
thermodynamic optimization of the gas turbine cycle, even with
the above components, is still necessary. In this article, multi-objective
genetic algorithms are employed for Pareto approach optimization of
Regenerative-Intercooling Gas Turbine (RIGT) cycle. In
multi-objective optimization, a number of conflicting objective
functions are to be optimized simultaneously. The important
objective functions considered for optimization are the entropy
generation of the RIGT cycle (Ns), derived using exergy analysis
and the Gouy-Stodola theorem, the thermal efficiency, and the net
output power of the RIGT cycle. These objectives usually conflict with each
other. The design variables consist of thermodynamic parameters
such as compressor pressure ratio (Rp), excess air in combustion
(EA), turbine inlet temperature (TIT) and inlet air temperature (T0).
In the first stage, single-objective optimization is
investigated; the method of the Non-dominated Sorting Genetic
Algorithm (NSGA-II) is then used for multi-objective optimization.
Optimization procedures are performed for two and three objective
functions, and the results are compared for the RIGT cycle. To
investigate the optimal thermodynamic behavior of two objectives,
different sets, each including two of the output objectives, are
considered individually, and a Pareto front is depicted for each
set. The decision variables selected from these Pareto fronts
yield the best possible combinations of the corresponding
objective functions. No point on a Pareto front is superior to
another point on the front, but every front point is superior to
any other point. In the case of three-objective optimization the
results are given in tables.
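As an illustration of the kind of thermodynamic relation being optimized (a textbook ideal-regenerator formula, not one of the paper's actual objective functions), the ideal-gas thermal efficiency of a Brayton cycle with a perfect regenerator depends on Rp, TIT and T0 as follows:

```python
def regenerative_brayton_efficiency(t_in, tit, rp, gamma=1.4):
    """Ideal-gas thermal efficiency of a Brayton cycle with a perfect
    regenerator: eta = 1 - (T_in / TIT) * rp**((gamma - 1) / gamma),
    where T_in is the compressor inlet temperature, TIT the turbine
    inlet temperature, and rp the compressor pressure ratio."""
    return 1.0 - (t_in / tit) * rp ** ((gamma - 1.0) / gamma)
```

Unlike the simple Brayton cycle, the regenerative efficiency falls as the pressure ratio rises, which is exactly the kind of trade-off that makes the multi-objective treatment in the paper necessary.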
Abstract: This paper presents a critical study of the
application of neural networks to the ion-exchange process. Ion
exchange is a complex non-linear process in which many factors
influence the ion-uptake mechanisms from the pregnant solution,
followed by an elution step. Published data present empirical
isotherm equations with definite shortcomings that result in
unreliable predictions. Although the neural-network simulation
technique has a number of disadvantages, including its "black
box" nature and a limited ability to explicitly identify possible
causal relationships, it has the advantage of implicitly handling
complex nonlinear relationships between dependent and independent
variables. In the present paper, a neural-network model based on
the Levenberg-Marquardt back-propagation algorithm was developed
using a three-layer architecture with a tangent-sigmoid transfer
function (tansig) in an 11-neuron hidden layer and a linear
transfer function (purelin) in the output layer. The
above-mentioned approach has been used
to test the effectiveness in simulating ion exchange processes. The
modeling results showed that there is an excellent agreement between
the experimental data and the predicted values of copper ions
removed from aqueous solutions.
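The described tansig/purelin forward pass can be sketched directly; the weights below are placeholders, not the trained Levenberg-Marquardt parameters:

```python
import math

def mlp_forward(x, w_hidden, b_hidden, w_out, b_out):
    """Forward pass of a three-layer MLP of the kind described above:
    a tansig (tanh) hidden layer followed by a purelin (identity)
    output layer. Weights are given row-per-neuron."""
    hidden = [math.tanh(sum(wij * xi for wij, xi in zip(row, x)) + bj)
              for row, bj in zip(w_hidden, b_hidden)]
    return [sum(wij * hi for wij, hi in zip(row, hidden)) + bj
            for row, bj in zip(w_out, b_out)]
```

In the paper's setup the input vector would hold process conditions and the output the predicted copper-ion removal; Levenberg-Marquardt training only changes how the weights are found, not this forward computation.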
Abstract: This article is an extension and a practical
application of Wheeler's NEBIC (Net-Enabled Business Innovation
Cycle) theory. NEBIC theory is a new approach in IS research and
can be used in dynamic environments related to new technology.
Firms can follow market changes rapidly with the support of IT
resources. Flexible firms adapt their market strategies and respond
more quickly to customers' changing behaviors. When every leading
firm in an industry has access to the same IT resources, the way
these IT resources are managed determines the competitive
advantages or disadvantages of a firm. From the Dynamic
Capabilities Perspective and from the newly introduced NEBIC
theory by Wheeler, we know that IT resources alone cannot deliver
customer value, but a good configuration of those resources can,
by choosing the right emerging technology and grasping economic
opportunities through business innovation and growth. We found
evidence in the literature that SOA (Service Oriented
Architecture) is a promising emerging technology that can deliver
the desired economic opportunity through modularity, flexibility
and loose coupling. SOA can also help firms connect in networks,
which can open a new window of opportunity for collaboration in
innovation and the right kind of outsourcing.
Abstract: Pipeline exploration is one of many bio-mimetic robot
applications. The robot may work in common buildings, for example
between ceilings and in ducts, in addition to complicated and massive
pipeline systems of large industrial plants. The bio-mimetic robot finds
any troubled area or malfunction and then reports its data. Importantly,
it can not only prepare for but also react to any abnormal routes in the
pipeline. The pipeline monitoring tasks require special types of mobile
robots. For an effective movement along a pipeline, the movement of
the robot will be similar to that of insects or crawling animals. During
its movement along the pipelines, a pipeline monitoring robot has an
important task of finding the shapes of the approaching path on the
pipes. In this paper we propose an effective solution to the pipeline
pattern recognition, based on the fuzzy classification rules for the
measured IR distance data.
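A fuzzy classification rule of the kind described can be sketched with triangular membership functions; the shape labels and breakpoints below are hypothetical, not the paper's calibrated rule base:

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership: 0 outside (a, c), rising to 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify(distance, rules):
    """Pick the pipe-shape label whose fuzzy set gives the highest
    membership for a measured IR distance. `rules` maps each label to
    the (a, b, c) breakpoints of its triangular set."""
    return max(rules, key=lambda label: triangular(distance, *rules[label]))
```

A real rule base would combine several IR sensors, but the max-membership decision shown here is the core of the fuzzy classification.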
Abstract: We present in this paper a new approach to specific JPEG steganalysis and propose studying the statistics of the compressed DCT coefficients. Traditionally, steganographic algorithms try to preserve the statistics of the DCT and of the spatial domain, but they cannot preserve both and also control the alteration of the compressed data. We have noticed a deviation of the entropy of the compressed data after a first embedding. This deviation is greater when the image is a cover medium than when it is a stego image. To observe this deviation, we pointed out new statistical features and combined them with the Multiple Embedding Method. This approach is motivated by the Avalanche Criterion of the JPEG lossless compression step. This criterion makes it possible to design detectors whose detection rates are independent of the payload. Finally, we designed a Fisher-discriminant-based classifier for the well-known steganographic algorithms Outguess, F5, and Hide and Seek. The experimental results we obtained show the efficiency of our classifier for these algorithms. Moreover, it is also designed to work with low embedding rates (< 10^-5) and, according to the avalanche criterion of the RLE and Huffman compression steps, its efficiency is independent of the quantity of hidden information.
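The entropy deviation observed above can be illustrated with a simple Shannon-entropy computation over compressed bytes (a sketch of the statistic, not the paper's full feature set):

```python
import math
from collections import Counter

def byte_entropy(data):
    """Shannon entropy in bits per byte of a byte string. Steganalysis
    features of this kind track how a first embedding shifts the
    statistics of the compressed data."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())
```

Comparing this value before and after a controlled re-embedding is the spirit of the Multiple Embedding Method: the deviation behaves differently for cover and stego images.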
Abstract: We propose three new, computationally less demanding algorithms based on minimizing the autocorrelation of transmitted symbols and on the SLM approach. In the first algorithm, the autocorrelation of the complex data sequence is minimized to a value of 1, which results in a reduction of the PAPR. The second algorithm generates multiple random sequences from the sequence produced by the first algorithm, all with the same autocorrelation value, i.e. 1; of these, the sequence with the minimum PAPR is transmitted. The third algorithm is an extension of the second and requires minimal side information to be transmitted: multiple sequences are generated by modifying a fixed number of complex numbers in an OFDM data sequence using only one factor. The multiple sequences represent the same data sequence, and the one giving the minimum PAPR is transmitted. Simulation results for a 256-subcarrier OFDM system show that significant reduction in PAPR is achieved using the proposed algorithms.
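The PAPR quantity that the three algorithms reduce can be computed from a frequency-domain OFDM symbol via a direct IDFT; this sketch uses a small symbol length for clarity (a real 256-subcarrier system would use an FFT):

```python
import cmath
import math

def papr_db(symbols):
    """Peak-to-average power ratio (in dB) of the time-domain OFDM
    signal obtained by an IDFT of the given frequency-domain symbols."""
    n = len(symbols)
    time = [sum(symbols[k] * cmath.exp(2j * math.pi * k * t / n)
                for k in range(n)) / n
            for t in range(n)]
    powers = [abs(x) ** 2 for x in time]
    return 10.0 * math.log10(max(powers) / (sum(powers) / n))
```

An all-ones symbol vector is the worst case: the IDFT collapses to a single impulse, giving PAPR = N (6.02 dB for N = 4), which is what autocorrelation-shaping and SLM candidate selection are designed to avoid.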
Abstract: Current trends in manufacturing are characterized by
production broadening, innovation cycle shortening, and the products
having new shapes, materials and functions. A production strategy
focused on time requires a change from the traditional functional
production structure to flexible manufacturing cells and lines.
Production by automated manufacturing systems (AMS) has been one
of the most important manufacturing philosophies of recent years.
The main goal of the project we are involved in is to build a
laboratory housing a flexible manufacturing system consisting of
at least two production machines with NC control (milling
machines, lathe). These machines will be linked to a transport
system and served by industrial robots. Within this flexible
manufacturing system, a quality-control station consisting of a
camera system and a rack warehouse will also be located. The
design, analysis and improvement of this manufacturing system,
with a special focus on the communication among devices,
constitute the main aims of this paper. The key determining
factors for the manufacturing system design are: the product, the
production volume, the used machines, the disposable manpower, the
disposable infrastructure and the legislative frame for the specific
cases.