Abstract: High Resolution NMR Spectroscopy offers unique screening capabilities for food quality and safety by combining non-targeted and targeted screening in one analysis.
The objective is to demonstrate that, owing to its extreme reproducibility, NMR can detect the smallest changes in the concentrations of many components in a mixture; such changes are best monitored by statistical evaluation, yet the same analysis also delivers reliable quantification results.
The methodology typically uses a 400 MHz high-resolution instrument under full automation with minimized sample preparation.
For example, one fruit juice analysis in push-button operation takes at most 15 minutes and delivers a multitude of results, which are automatically summarized in a PDF report.
The method has been proven on fruit juices, where previously unknown frauds could be detected. In addition, conventional targeted parameters are obtained in the same analysis. This technology has the advantage that NMR is completely quantitative and concentration calibration only has to be done once for all compounds. Since NMR is so reproducible, it is also transferable between different instruments (of the same field strength) and laboratories. Based on strict SOPs, statistical models developed once can be used on multiple instruments, and strategies for compound identification and quantification are likewise applicable across labs.
Abstract: Computer-aided design, analysis, control, visualization and simulation of robotized workcells is currently of great interest. Computer Aided Robot Control (CARC) is a subsystem of the CIM system, comprising the computer-aided systems for all activities connected with the visualization and operation of robotized workcells. There are three basic approaches: current CAD/CAM/CAE systems for design and 3D visualization, special PC-based control and simulation systems, and Augmented Reality Aided Manufacturing (ARAM) systems. This paper describes an example, realized by the authors, of an Open Source software application that can be utilized in the planning of robotized workcells, their visualization, and off-line programming of the automated processes.
Abstract: Real-time non-invasive Brain Computer Interfaces play a significant role in restoring or maintaining quality of life for medically challenged people. This manuscript provides a comprehensive review of emerging research in the field of cognitive/affective computing in the context of human neural responses. The perspectives of different emotion assessment modalities, such as facial expressions, speech, text, gestures, and human physiological responses, are also discussed. Focus is placed on exploring the ability of EEG (Electroencephalogram) signals to portray thoughts, feelings, and unspoken words. An automated workflow-based protocol is proposed to design an EEG-based real-time Brain Computer Interface system for the analysis and classification of human emotions elicited by external audio/visual stimuli. The front-end hardware includes a cost-effective and portable Emotiv EEG Neuroheadset unit, a personal computer and a set of external stimulators. Primary signal analysis and processing of real-time acquired EEG shall be performed using the MATLAB-based advanced brain mapping toolboxes EEGLab/BCILab. This shall be followed by the development of a MATLAB-based self-defined algorithm to capture and characterize temporal and spectral variations in EEG under emotional stimulation. The extracted hybrid feature set shall be used to classify emotional states using artificial intelligence tools such as Artificial Neural Networks. The final system would result in an inexpensive, portable and more intuitive real-time Brain Computer Interface to control prosthetic devices by translating different brain states into operative control signals.
Abstract: This paper describes an automated implementable
system for impulsive signal detection and recognition. The system
uses a Digital Signal Processing device for the detection and
identification process. The system analyses the signals in real
time in order to produce a specific output if needed. Detection is
achieved by normalizing the inputs and comparing the read signals
to a dynamic threshold, thus avoiding detections caused by loud or
fluctuating surrounding noise. Identification is done through
neural network algorithms. As a setup, our system can receive
signals to “learn” certain patterns. Through “learning”, the
system can recognize signals faster, extending flexibility to new
patterns similar to those known. Sound is captured through a
simple jack input, which could be replaced by an enhanced
recording device such as a wide-area recorder. Furthermore, a
communication module can be added to the apparatus to send alerts
to another interface if needed.
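The normalization-plus-dynamic-threshold detection described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the exponential moving-average noise estimate and the `alpha`, `factor` and `warmup` parameters are all assumptions made for the example.

```python
# Hypothetical sketch of the detection stage: inputs are normalized and
# compared against a dynamic threshold that tracks the ambient noise
# floor, so loud but slowly varying background noise does not trigger
# detections. alpha, factor and warmup are illustrative parameters.

def detect_impulses(samples, alpha=0.99, factor=4.0, warmup=50):
    """Return indices of samples that exceed the adaptive threshold."""
    peak = max((abs(s) for s in samples), default=0.0) or 1.0
    noise_level = 0.0
    hits = []
    for i, s in enumerate(samples):
        mag = abs(s) / peak                     # normalized magnitude
        if i >= warmup and mag > factor * noise_level:
            hits.append(i)                      # impulsive event
        else:
            # exponential moving average tracks the ambient noise floor
            noise_level = alpha * noise_level + (1 - alpha) * mag
    return hits

# a quiet background with one loud click at index 100
events = detect_impulses([0.01] * 100 + [1.0] + [0.01] * 100)
# events == [100]
```

The warm-up period lets the noise estimate settle before any detection is reported, which is one simple way to avoid false alarms at start-up.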
Abstract: Industrial automation is dependent upon pneumatic control systems. Industrial units are now controlled with digital control systems to handle process variables such as temperature, pressure, flow rate and composition.
This research work presents an evaluation of the response-time fluctuations for the proportional (P), proportional-integral (PI) and proportional-integral-derivative (PID) modes of automated chemical process control. The controller output is measured for different values of gain with respect to time in the three modes. In P-mode, the controller output changes negligibly for different values of gain. When the controller output of PI-mode is examined at constant gain, it can be seen that decreasing the integral time makes the controller output fluctuate more. The PID-mode results are more interesting in that changing the rate minute also makes the controller output fluctuate with respect to time. The controller outputs for the integral and derivative modes are observed to have smaller steady-state error, minimum offset and larger response time in controlling the process variable. The only tuning parameter in P-mode is the steady-state gain, which gives greater errors in the controller output. The integral mode showed controller outputs with intermediate responses for the integral gain (ki). Increasing the rate minute increased the derivative gain (kd), which produced controlled oscillations and smaller overshoot in PID mode.
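As a minimal illustration of the three control modes compared above, a discrete PID controller can be written so that P, PI and PID are obtained by zeroing the unused gains. The gains, time step and process values below are made up for demonstration and are not taken from the study.

```python
# Illustrative discrete PID controller for the three modes compared
# above (P, PI, PID). All gains and values are invented for the sketch.

class PID:
    def __init__(self, kp, ki=0.0, kd=0.0, dt=0.1):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt                  # PI/PID term
        derivative = (error - self.prev_error) / self.dt  # PID term
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)

# P-mode alone (ki = kd = 0) leaves a steady-state offset; the integral
# term removes the offset, and the derivative term damps oscillations.
controller = PID(kp=2.0, ki=0.5, kd=0.1)
output = controller.update(setpoint=50.0, measurement=45.0)
```

With `ki = kd = 0` the same class reproduces pure proportional control, which is one way to compare the three modes on the same process model.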
Abstract: The conventional assessment of human semen is a
highly subjective assessment, with considerable intra- and interlaboratory
variability. Computer-Assisted Sperm Analysis (CASA)
systems provide a rapid and automated assessment of the sperm
characteristics, together with improved standardization and quality
control. However, the outcome of CASA systems is sensitive to the
method of experimentation. While conventional CASA systems use
digital microscopes with phase-contrast accessories, producing
higher contrast images, we have used raw semen samples (no
staining materials) and a regular light microscope, with a digital
camera directly attached to its eyepiece, to ensure cost benefits and
simple assembly of the system. However, since the accurate finding
of sperms in the semen image is the first step in the examination and
analysis of the semen, any error in this step can affect the outcome of
the analysis. This article introduces and explains an algorithm for
finding sperms in low contrast images: First, an image enhancement
algorithm is applied to remove extra particles from the image. Then,
the foreground particles (including sperms and round cells) are
segmented from the background. Finally, based on certain features
and criteria, sperms are separated from other cells.
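The segmentation and classification steps described above can be sketched roughly as follows, assuming a fixed binary threshold for foreground segmentation and a bounding-box elongation criterion to separate sperms from round cells; both are illustrative stand-ins for the paper's actual features and criteria.

```python
# Hypothetical sketch of steps 2 and 3 of the pipeline: threshold-based
# foreground segmentation into connected components, then a simple
# shape feature to separate elongated sperm from round cells. The
# threshold and elongation criterion are assumptions for this example.
from collections import deque

def foreground_components(image, threshold):
    """Segment foreground pixels (above threshold) into 4-connected
    components; `image` is a list of rows of gray values."""
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    comps = []
    for y in range(h):
        for x in range(w):
            if image[y][x] > threshold and not seen[y][x]:
                comp, queue = [], deque([(y, x)])
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    comp.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx),
                                   (cy, cx-1), (cy, cx+1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and image[ny][nx] > threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                comps.append(comp)
    return comps

def is_sperm(comp, min_elongation=2.0):
    """Illustrative criterion: a sperm (head plus tail) appears
    elongated, while round cells have an aspect ratio near 1."""
    ys = [p[0] for p in comp]
    xs = [p[1] for p in comp]
    height = max(ys) - min(ys) + 1
    width = max(xs) - min(xs) + 1
    return max(height, width) / min(height, width) >= min_elongation
```

In practice the paper's enhancement step would run before this, and the real classifier would use several features rather than a single aspect ratio.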
Abstract: In this paper, several improvements are proposed to
previous work of automated classification of alcoholics and nonalcoholics.
In the previous paper, a multilayer-perceptron neural
network classifying energy of gamma band Visual Evoked Potential
(VEP) signals gave the best classification performance using 800
VEP signals from 10 alcoholics and 10 non-alcoholics. Here, the
dataset is extended to include 3560 VEP signals from 102 subjects:
62 alcoholics and 40 non-alcoholics. Three modifications are
introduced to improve the classification performance: i) increasing
the gamma band spectral range by increasing the pass-band width of
the used filter ii) the use of Multiple Signal Classification algorithm
to obtain the power of the dominant frequency in gamma band VEP
signals as features and iii) the use of the simple but effective
k-nearest neighbour classifier. To validate that these modifications
do give improved performance, a 10-fold cross validation
classification (CVC) scheme is used. Repeat experiments of the
previously used methodology for the extended dataset are performed
here and improvement from 94.49% to 98.71% in maximum
averaged CVC accuracy is obtained using the modifications. These
latest results show that VEP-based classification of alcoholics is
worth exploring further for system development.
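The k-nearest neighbour classifier and k-fold cross-validation scheme named above can be sketched as follows. The Euclidean distance, majority vote and contiguous folds are generic textbook choices made for the example, not necessarily the paper's exact setup, which uses MUSIC-derived gamma-band power features.

```python
# Generic sketch of k-NN with k-fold cross-validation, as named in the
# abstract. Features, distance and fold layout are illustrative.

def knn_predict(train, labels, x, k=3):
    """Majority vote among the k training points closest to x."""
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(t, x)), lbl)
        for t, lbl in zip(train, labels)
    )
    votes = [lbl for _, lbl in dists[:k]]
    return max(set(votes), key=votes.count)

def cross_validate(data, labels, k=3, folds=10):
    """Average accuracy over `folds` contiguous train/test splits."""
    n = len(data)
    accs = []
    for f in range(folds):
        test_idx = set(range(f * n // folds, (f + 1) * n // folds))
        train = [d for i, d in enumerate(data) if i not in test_idx]
        tlabs = [l for i, l in enumerate(labels) if i not in test_idx]
        correct = sum(
            knn_predict(train, tlabs, data[i], k) == labels[i]
            for i in test_idx
        )
        accs.append(correct / max(len(test_idx), 1))
    return sum(accs) / folds
```

On well-separated toy data this scheme reaches 100% cross-validated accuracy; the paper's reported 98.71% is over real VEP features.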
Abstract: Business Process Reengineering (BPR) is an essential tool before an information system project implementation. Enterprise Resource Planning (ERP) projects definitely require the standardization and fixation of business processes from customer order to shipment. ERP implementations are therefore well proven to be coupled with BPR, although the extent and timing of BPR with respect to the ERP implementation differ. This study aims at analyzing the effects of BPR on ERP implementation success. Based on two Turkish ERP implementations in the pharmaceutical sector, a comparative study is performed. One of the ERP implementations took place after a BPR implementation, whereas the other was without a prior BPR application. Both implementations were realized with the same consultant team, the case with the prior BPR implementation going live first. The results of the case study reveal that if business processes are not optimized and improved before an ERP implementation, the live ERP system will face disharmony between the actual processes and the processes automated by ERP. This suggests a definite precedence relationship between BPR and ERP applications.
Abstract: Fishing has always been an essential component of
the Polynesians' life. Fishhooks, mostly made of pearl shell and
found during archaeological excavations, are the most numerous
artifacts related to this activity. Thanks to them, we try to
reconstruct the ancient techniques of resource exploitation, inside
the lagoons and offshore. They can also be used as chronological and
cultural indicators. The shapes and dimensions of these artifacts
allow comparisons and classifications used in both a functional
approach and a chrono-cultural perspective. Hence it is very
important for ethno-archaeologists to have reliable methods and
standardized measurements of these artifacts. Such a reliable,
objective and standardized method has been previously proposed. But
this method cannot be carried out manually because of the
considerable time required to measure each fishhook and the quantity
of fishhooks to measure (many hundreds). We propose in this paper a
detailed acquisition protocol for fishhooks and an automation of
every step of this method. We also provide some experimental results
obtained on fishhooks coming from three archaeological excavation sites.
Abstract: In this paper, a new methodology to detect the optic disc (OD) automatically in retinal images from patients at risk of Diabetic Retinopathy (DR) and Macular Edema (ME) is presented. The detection procedure comprises two independent methodologies. On one hand, a location methodology obtains a pixel that belongs to the OD using image contrast analysis and structure filtering techniques; on the other hand, a boundary segmentation methodology estimates a circular approximation of the OD boundary by applying mathematical morphology, edge detection techniques and the Circular Hough Transform. The methodologies were tested on a set of 1200 images composed of 229 retinographies from patients affected by DR with risk of ME, 431 with DR and no risk of ME, and 540 images of healthy retinas. The location methodology obtained a 98.83% success rate, whereas the OD boundary segmentation methodology obtained a good circular OD boundary approximation in 94.58% of cases. The average computational time measured over the total set was 1.67 seconds for OD location and 5.78 seconds for OD boundary segmentation.
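As an illustration of the Circular Hough Transform step used above for the OD boundary, a minimal voting scheme can look like this. The coarse angular sampling and integer accumulator are simplifications for the sketch; the paper's pipeline also applies mathematical morphology and edge detection beforehand.

```python
# Minimal sketch of a Circular Hough Transform: each edge pixel votes
# for the centres of circles of each candidate radius that could pass
# through it; the most-voted (cx, cy, r) cell wins. Sampling step and
# integer binning are illustrative simplifications.
import math
from collections import Counter

def circular_hough(edge_points, radii, angle_step=10):
    """Return the (cx, cy, r) accumulator cell with the most votes."""
    votes = Counter()
    for (x, y) in edge_points:
        for r in radii:
            for theta in range(0, 360, angle_step):
                a = round(x - r * math.cos(math.radians(theta)))
                b = round(y - r * math.sin(math.radians(theta)))
                votes[(a, b, r)] += 1
    return votes.most_common(1)[0][0]
```

A real implementation would typically use a gradient direction at each edge pixel to restrict the voting and a finer accumulator, but the voting principle is the same.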
Abstract: Fine alignment of main ship power plant mechanisms
and shaft lines provides long-term and failure-free performance of
the propulsion system, while fast and high-quality installation of
mechanisms and shaft lines decreases overall labor intensity. For
checking allowed shaft line stress and setting its alignment, it is
required to perform calculations considering various stages of the
life cycle. In 2012 JSC SSTC developed the special software complex
“Shaftline” for shaft line alignment calculation, having its own I/O
interface and a display of the shaft line 3D model. Alignment of a
shaft line as per bearing loads is a rather labor-intensive procedure.
In order to decrease its duration, JSC SSTC developed an automated
alignment system for ship power plant mechanisms. The system operation
principle is based on automatic simulation of design loads on bearings.
Initial data for shaft line alignment can be exported to the automated
alignment system from the “Shaftline” software complex.
Abstract: Automated production lines with so-called 'hard structures' are widely used in manufacturing. Designers segment these lines into sections by placing a buffer between series of machine tools to increase productivity. In real production conditions, the capacity of a buffer system is limited, and a real production line can compensate for only some part of the productivity losses of an automated line. The productivity of such production lines cannot be readily determined. This paper presents a mathematical approach to solving the structure of section-based automated production lines by the criterion of maximum productivity.
Abstract: This paper describes the development of a fully
automated measurement software for antenna radiation pattern
measurements in a Compact Antenna Test Range (CATR). The
CATR has a frequency range from 2-40 GHz and the measurement
hardware includes a Network Analyzer for transmitting and
receiving the microwave signal and a Positioner controller to control
the motion of the Styrofoam column. The measurement process
includes Calibration of CATR with a Standard Gain Horn (SGH)
antenna followed by Gain versus angle measurement of the Antenna
under test (AUT). The software is designed to control a variety of
microwave transmitters/receivers and two-axis Positioner controllers
through the standard General Purpose Interface Bus (GPIB).
Addition of new Network Analyzers is supported through a slight
modification of the hardware control module. Time-domain gating is
implemented to remove unwanted signals and obtain the isolated
response of the AUT. The gated response of the AUT is compared with
the calibration data in the frequency domain to obtain the desired
results. The data acquisition and processing is implemented in
Agilent VEE and Matlab. A variety of experimental measurements
with SGH antennas were performed to validate the accuracy of
software. A comparison of results with existing commercial
software packages is presented, and the measured results are found to
agree within 0.2 dBm.
Abstract: The paper describes a knowledge based system for
analysis of microscopic wear particles. Wear particles contained in
lubricating oil carry important information concerning machine
condition, in particular the state of wear. Experts (Tribologists) in the
field extract this information to monitor the operation of the machine
and ensure safety, efficiency, quality, productivity, and economy of
operation. This procedure is not always objective and it can also be
expensive. The aim is to classify these particles according to their
morphological attributes of size, shape, edge detail, thickness ratio,
color, and texture, and, using this classification, to predict
wear failure modes in engines and other machinery. The attribute
knowledge links human expertise to the devised Knowledge Based
Wear Particle Analysis System (KBWPAS). The system provides an
automated and systematic approach to wear particle identification
which is linked directly to wear processes and modes that occur in
machinery. This brings consistency in wear judgment prediction
which leads to standardization and also less dependence on
Tribologists.
Abstract: A lot of research has been done in the past decade in the field of audio content analysis for extracting various kinds of information from audio signals. One significant piece of information is the "perceived mood" or the "emotions" related to a music or audio clip. This information is extremely useful in applications like creating or adapting a play-list based on the mood of the listener, and it could also help in better classification of a music database. In this paper we present a method to classify music based not just on the meta-data of the audio clip but also on the "mood" factor, to help improve music classification. We propose an automated and efficient way of classifying music samples based on mood detection from the audio data, focusing in particular on Indian Bollywood music. The proposed method addresses the following problem: genre information (usually part of the audio meta-data) alone does not enable good music classification. For example, the acoustic version of the song "Nothing Else Matters" by Metallica can be classified as melodic music, so a person in a relaxing or chill-out mood might want to listen to it. But more often than not this track is associated with the metal / heavy rock genre, and a listener who classifies his play-list for his current mood based on genre information alone will miss out on this track. Methods currently exist to detect mood in Western or similar kinds of music; our paper addresses the issue for Indian Bollywood music from an Indian cultural context.
Abstract: Since software testing has become an important part of
software development for improving the quality of software, many
automation tools have been created to help test software
functionality. There are a few usability issues with these tools: the
result log generated by a tool contains useless information, so the
tester cannot use the result log to communicate efficiently, or the
result log needs a specific application to open. This paper introduces
a new method, SBTAR, which improves the usability of automated test
tools with respect to the result log. The practice uses the capability
of a tool named IBM Rational Robot to create a customized function;
this function generates a new result-log format which contains useful
information and is faster and easier to understand than the original
result log generated by the tool. This result log also increases
flexibility by being readable with Microsoft Word or WordPad.
Abstract: Quality control in ceramic tile manufacturing is hard and labor-intensive, and it is performed in a harsh industrial environment with noise, extreme temperature and humidity. It can be divided into color analysis, dimension verification, and surface defect detection, the last of which is the main purpose of our work. Defect detection is still based on the judgment of human operators, while most of the other manufacturing activities are automated. Our work therefore enhances quality control by integrating a visual inspection stage, using image processing and morphological operation techniques, before the packing operation to improve the homogeneity of the batches received by final users.
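A morphological operation of the kind mentioned above can be illustrated with a binary opening (erosion followed by dilation), which removes specks smaller than the structuring element so that larger surface defects stand out. The 3x3 square structuring element and the binary image representation are assumptions for this sketch, not the paper's actual parameters.

```python
# Toy sketch of a morphological opening on a binary image (lists of
# 0/1 rows) with a 3x3 square structuring element. Illustrative only.

def erode(img):
    """A pixel survives only if its full 3x3 neighbourhood is set."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = int(all(
                img[y + dy][x + dx]
                for dy in (-1, 0, 1) for dx in (-1, 0, 1)
            ))
    return out

def dilate(img):
    """A pixel is set if any pixel in its 3x3 neighbourhood is set."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            out[y][x] = int(any(
                img[y + dy][x + dx]
                for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                if 0 <= y + dy < h and 0 <= x + dx < w
            ))
    return out

def opening(img):
    """Erosion then dilation: suppresses blobs smaller than 3x3."""
    return dilate(erode(img))
```

In a tile-inspection pipeline such an opening is one way to clean noise from a thresholded image so that the remaining blobs can be flagged as defects.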
Abstract: Nowadays the market for industrial companies is becoming more and more globalized and highly competitive, forcing them to shorten the development time of manufacturing systems in order to reduce the time to market. To achieve this target, the hierarchical systems used in previous manufacturing systems are not sufficient because they cannot deal effectively with unexpected situations. To achieve flexibility in manufacturing systems, the concept of an Autonomous Decentralized Flexible Manufacturing System (AD-FMS) is useful. In this paper, we introduce a hypothetical-reasoning-based algorithm called the Algorithm for Future Anticipative Reasoning (AFAR), which is able to decide on a conceivable next action of an Automated Guided Vehicle (AGV) that works autonomously in the AD-FMS.
Abstract: This paper presents an automated inspection algorithm
for a thick plate. Thick plates typically have various types of surface
defects, such as scabs, scratches, and roller marks. These defects have
individual characteristics including brightness and shape. Therefore, it
is not simple to detect all the defects. In order to solve these problems
and to detect defects more effectively, we propose a dual light
switching lighting method and a defect detection algorithm based on
Gabor filters.
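A Gabor filter of the kind proposed above can be sketched as follows: a Gaussian envelope modulated by a sinusoid, tuned to an orientation and wavelength, whose convolution response is large for defects (e.g. scratches) aligned with the filter. The kernel parameters here are illustrative and not those of the paper.

```python
# Sketch of a Gabor filter bank element: a Gaussian envelope times a
# cosine carrier, rotated to a target orientation. Parameter values
# are illustrative assumptions, not the paper's tuning.
import math

def gabor_kernel(size, theta, wavelength, sigma):
    """Real part of a Gabor kernel of shape (size, size); size odd."""
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            # rotate coordinates to the filter orientation
            xr = x * math.cos(theta) + y * math.sin(theta)
            yr = -x * math.sin(theta) + y * math.cos(theta)
            envelope = math.exp(-(xr * xr + yr * yr) / (2 * sigma * sigma))
            carrier = math.cos(2 * math.pi * xr / wavelength)
            row.append(envelope * carrier)
        kernel.append(row)
    return kernel

def convolve_at(image, kernel, cy, cx):
    """Filter response at one pixel (zero padding at image borders)."""
    half = len(kernel) // 2
    total = 0.0
    for ky in range(len(kernel)):
        for kx in range(len(kernel)):
            y, x = cy + ky - half, cx + kx - half
            if 0 <= y < len(image) and 0 <= x < len(image[0]):
                total += image[y][x] * kernel[ky][kx]
    return total
```

A defect-detection stage would evaluate a bank of such kernels at several orientations and threshold the maximum response per pixel.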
Abstract: Modeling of complex dynamic systems, for which it is
very complicated to establish mathematical models, requires new and
modern methodologies that exploit existing expert knowledge, human
experience and historical data. Fuzzy cognitive maps are very
suitable, simple, and powerful tools for the simulation and analysis
of these kinds of dynamic systems. However, human experts are
subjective and can handle only relatively simple fuzzy cognitive
maps; therefore, there is a need to develop new approaches for the
automated generation of fuzzy cognitive maps using historical data.
In this study, a new learning algorithm, called Big Bang-Big
Crunch, is proposed for the first time in the literature for the
automated generation of fuzzy cognitive maps from data. Two
real-world examples, namely a process control system and a radiation
therapy process, and one synthetic model are used to emphasize the
effectiveness and usefulness of the proposed methodology.
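The Big Bang-Big Crunch learning loop can be sketched as follows: each "Big Bang" scatters candidate solutions around the current centre with a shrinking spread, and each "Big Crunch" contracts them to a fitness-weighted centre of mass. The quadratic toy objective and all parameter values below stand in for the paper's FCM error measure and are assumptions of this sketch.

```python
# Illustrative Big Bang-Big Crunch optimisation loop. In the paper the
# decision variables would be FCM weights and the fitness an FCM error
# measure; here a toy quadratic objective is used instead.
import random

def big_bang_big_crunch(fitness, dim, pop=30, iters=60, space=1.0, seed=0):
    """Minimise `fitness` over roughly [-space, space]^dim."""
    rng = random.Random(seed)
    centre = [rng.uniform(-space, space) for _ in range(dim)]
    best, best_f = centre[:], fitness(centre)
    for k in range(1, iters + 1):
        # Big Bang: scatter candidates; the spread shrinks as 1/k
        candidates = [[c + rng.gauss(0, space / k) for c in centre]
                      for _ in range(pop)]
        # Big Crunch: contract to a fitness-weighted centre of mass
        weights = []
        for cand in candidates:
            f = fitness(cand)
            if f < best_f:
                best, best_f = cand[:], f       # keep the best-so-far
            weights.append(1.0 / (f + 1e-12))   # smaller f, larger weight
        total = sum(weights)
        centre = [sum(w * c[d] for w, c in zip(weights, candidates)) / total
                  for d in range(dim)]
    return best

# toy objective with its minimum at (0.3, -0.2)
best = big_bang_big_crunch(
    lambda v: (v[0] - 0.3) ** 2 + (v[1] + 0.2) ** 2, dim=2)
```

For FCM learning, `dim` would be the number of map weights and the fitness would measure how well the simulated concept values reproduce the historical data.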