Abstract: This paper describes the results of an extensive study
and comparison of popular hash functions SHA-1, SHA-256,
RIPEMD-160 and RIPEMD-320 with JERIM-320, a 320-bit hash
function. The compression functions of hash functions like SHA-1
and SHA-256 are designed using serial successive iteration, whereas
those of RIPEMD-160 and RIPEMD-320 are designed using two
parallel lines of message processing. JERIM-320 uses four parallel
lines of message processing, resulting in a higher level of security
than the other hash functions at comparable speed and memory
requirements. These methods have been evaluated both through
practical implementation and through step-computation methods.
JERIM-320 proves to be secure and ensures the integrity of
messages to a higher degree. The focus of this work is to establish
JERIM-320 as an alternative to present-day hash functions for
fast-growing Internet applications.
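As an illustration of the four-parallel-line structure, the sketch below is purely hypothetical: the abstract does not specify JERIM-320's round functions, so each "lane" is stood in for by a standard hashlib primitive and the four 80-bit lane slices are concatenated into a 320-bit digest.

```python
import hashlib

# Hypothetical stand-ins for the four parallel message-processing lines;
# JERIM-320's real lane functions are not given in the abstract.
LANES = [hashlib.sha1, hashlib.sha256, hashlib.md5, hashlib.sha512]

def four_lane_digest(message: bytes) -> bytes:
    """Process the message through four parallel lanes and combine."""
    out = b""
    for i, lane in enumerate(LANES):
        # Each lane sees a distinct lane tag, much as the parallel lines
        # of RIPEMD-style designs use distinct constants per line.
        out += lane(bytes([i]) + message).digest()[:10]  # 80 bits per lane
    return out  # 4 lanes * 80 bits = 320 bits
```

Concatenating independent lane outputs is only one way to combine parallel lines; real designs mix the lane states together at the end of each block.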
Abstract: Autofluorescence (AF) bronchoscopy is an
established method to detect dysplasia and carcinoma in situ (CIS).
For this reason the “Sotiria” Hospital uses the Karl Storz D-light
system. However, in early tumor stages the visualization is not
obvious. With the help of a PC, we analyzed the captured color
images by developing dedicated tools in Matlab®. We used statistical
methods based on texture analysis, signal processing methods based
on Gabor models, and conversion algorithms between device-dependent
color spaces. We believe these tools reduce the error made by
naked-eye inspection and improve the quality of patients' lives.
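A 2-D Gabor kernel of the kind used in such texture analysis can be generated directly from its closed form; the parameter values below are illustrative, not those used in the study.

```python
import math

def gabor_kernel(size, sigma, theta, lam, psi=0.0, gamma=1.0):
    """Real part of a 2-D Gabor filter: a Gaussian envelope modulating
    a cosine carrier. Returns a (2*size+1) x (2*size+1) list of lists.
    theta: orientation, lam: wavelength, psi: phase, gamma: aspect ratio."""
    kernel = []
    for y in range(-size, size + 1):
        row = []
        for x in range(-size, size + 1):
            # Rotate coordinates into the filter's orientation.
            xp = x * math.cos(theta) + y * math.sin(theta)
            yp = -x * math.sin(theta) + y * math.cos(theta)
            g = math.exp(-(xp ** 2 + (gamma * yp) ** 2) / (2 * sigma ** 2))
            row.append(g * math.cos(2 * math.pi * xp / lam + psi))
        kernel.append(row)
    return kernel

k = gabor_kernel(size=7, sigma=3.0, theta=0.0, lam=8.0)
```

With psi = 0 the envelope and carrier both peak at the centre, so the central coefficient is 1.0; convolving an image with a bank of such kernels at several orientations gives the texture features.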
Abstract: The use of buffer thresholds, blocking and adequate
service strategies are well-known techniques for computer networks
traffic congestion control. This motivates the study of series queues
with blocking, feedback (service under Head of Line (HoL) priority
discipline) and finite capacity buffers with thresholds. In this paper,
the external traffic is modelled using the Poisson process and the
service times have been modelled using the exponential distribution.
We consider a three-station network with two finite buffers, for
which a set of thresholds (tm1 and tm2) is defined. This computer
network behaves as follows. A task, which finishes its service at
station B, gets sent back to station A for re-processing with
probability o. When the number of tasks in the second buffer exceeds
the threshold tm2 and the number of tasks in the first buffer is less
than tm1, the fed-back task is served under the HoL priority
discipline. In the opposite case, a “no two priority services in
succession” procedure (preventing a possible overflow in the first
buffer) is applied to fed-back tasks. Using an open Markovian queuing
schema with blocking, priority feedback service and thresholds, a
closed-form, cost-effective analytical solution is obtained. The model
of servers linked in series is very accurate. It is derived directly
from a two-dimensional state graph and a set of steady-state
equations, followed by calculations of the main measures of
effectiveness. Consequently, efficient expressions with low
computational cost are determined. Based on numerical experiments and
the collected results we conclude that the proposed model with
blocking, feedback and thresholds can provide accurate performance
estimates of networks linked in series.
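The effect of the feedback loop alone can be illustrated with a short Monte Carlo sketch (the thresholds and HoL discipline are deliberately not modelled here, and sigma stands for the feedback probability denoted o in the abstract): each task traverses the A-to-B series a geometric number of times, with mean 1 / (1 - sigma).

```python
import random

random.seed(42)

def passes_through_pipeline(sigma: float, trials: int = 100_000) -> float:
    """Monte Carlo estimate of the mean number of times a task traverses
    the A -> B series before leaving, given feedback probability sigma."""
    total = 0
    for _ in range(trials):
        passes = 1
        while random.random() < sigma:  # task fed back to A for re-processing
            passes += 1
        total += passes
    return total / trials

sigma = 0.25                  # illustrative value, not taken from the paper
est = passes_through_pipeline(sigma)
# The pass count is geometric, so the exact mean is 1 / (1 - sigma).
```

This mean multiplies the effective load on both stations, which is why the thresholds are needed to keep the priority feedback from overflowing the first buffer.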
Abstract: Cassava bagasse is one of the major biomass wastes from the starch processing industry in Thailand, containing a high starch content of about 60%. The objective of this study was to investigate the optimal conditions for hydrothermally pretreating cassava bagasse with or without acid addition. The reducing sugar yield of the pretreated samples was measured either directly or after enzymatic hydrolysis (alpha-amylase). In enzymatic hydrolysis, the highest reducing sugar content was obtained under hydrothermal conditions at 125 °C for 30 min. The results show that pretreating cassava bagasse increased the efficiency of enzymatic hydrolysis. For acid hydrolysis, pretreating cassava bagasse with sulfuric acid at 120 °C for 60 min gave the maximum reducing sugar yield. In this study, sulfuric acid had a greater capacity for hydrolyzing cassava bagasse than phosphoric acid. In comparison, dilute acid hydrolysis provided a higher yield of reducing sugar than enzymatic hydrolysis combined with hydrothermal pretreatment. However, enzymatic hydrolysis combined with hydrothermal pretreatment remains an alternative for enhancing the efficiency of reducing-sugar production from cassava bagasse.
Abstract: Frequent machine breakdowns, low plant availability and increased overtime are a great threat to a manufacturing plant, as they increase the operating costs of an industry. The main aim of this study was to improve Overall Equipment Effectiveness (OEE) at a manufacturing company through the implementation of innovative maintenance strategies. A case study approach was used. The paper focuses on improving maintenance in a manufacturing set-up using an innovative maintenance regime mix to improve overall equipment effectiveness. Interviews, reviews of documentation and historical records, and direct and participatory observation were used as data collection methods during the research. Usually, production is based on the total kilowatts of motors produced per day. The target at 91% availability is 75 kilowatts a day. Reduced demand and a lack of raw materials, particularly imported items, are adversely affecting the manufacturing operations. The company had to reset its target from the usual figure of 250 kilowatts per day to a mere 75 per day due to the lower availability of machines as a result of breakdowns as well as the lack of raw materials. Price reductions and uncertainties as well as general machine breakdowns further lowered production. Some recommendations were given. For instance, employee empowerment in the company will enhance the responsibility and authority to improve and totally eliminate the six big losses. If the maintenance department is to realise its proper function in a progressive, innovative industrial society, then its personnel must be continuously trained to meet current needs as well as future requirements. To make the maintenance planning system effective, it is essential to keep track of all corrective maintenance jobs and preventive maintenance inspections. For large processing plants these cannot be handled manually. It was therefore recommended that the company implement a Computerised Maintenance Management System (CMMS).
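The OEE figure that the study targets follows the standard identity OEE = Availability × Performance × Quality. In the sketch below, the 91% availability comes from the abstract, while the performance and quality figures are hypothetical, since the study does not report them.

```python
def oee(availability: float, performance: float, quality: float) -> float:
    """Overall Equipment Effectiveness: the product of the three factors
    eroded by the 'six big losses' (downtime, speed and quality losses)."""
    for f in (availability, performance, quality):
        if not 0.0 <= f <= 1.0:
            raise ValueError("factors must be fractions in [0, 1]")
    return availability * performance * quality

# 91% availability from the abstract; performance and quality are
# hypothetical illustration values.
print(round(oee(0.91, 0.80, 0.95), 3))  # 0.692
```

Because the three factors multiply, a plant can score poorly overall even when each individual factor looks acceptable, which is why all six losses must be attacked together.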
Abstract: Automatic determination of blood in less bright or
noisy capsule endoscopic images is difficult due to low S/N ratio.
In particular, analysis of these images may be inaccurate due to the
influence of external disturbances. Therefore, we proposed detection
methods that are not dependent only on color bands. In locating
bleeding regions, the identification of object outlines in the frame and
features of their local colors were taken into consideration. The results
showed that the capability of detecting bleeding was much improved.
Abstract: Analysis of mammographic images and data to
facilitate modelling or computer-aided diagnostic (CAD) software development is best done using a common database that can handle various mammographic image file
formats and relate these to other patient information.
This would optimize the use of the data, as both primary
reporting and enhanced information extraction for research could be performed on a single dataset. One desired
improvement is the integration of DICOM file header information into the database, as an efficient and reliable source of supplementary patient information intrinsically
available in the images.
The purpose of this paper was to design a suitable database to link and integrate different types of image files and gather common information that can be further used for research
purposes. An interface was developed for accessing, adding,
updating, modifying and extracting data from the common
database, enhancing the future possible application of the data in CAD processing.
Technically, future developments envisaged include the creation of an advanced search function to select image files
based on descriptor combinations. Results can be further used for specific CAD processing and other research. A
user-friendly configuration utility for importing the required fields from the DICOM files must also be designed.
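As a rough illustration of linking header fields to a common patient-indexed store, the sketch below uses a plain dict as the "database" and a handful of standard DICOM attribute names (PatientID, Modality, StudyDate); the paper's actual schema and interface are not reproduced.

```python
def merge_image_record(db: dict, header: dict, file_path: str) -> None:
    """Index an image file under its patient, keeping the header fields
    useful for later search and CAD processing."""
    pid = header["PatientID"]
    patient = db.setdefault(pid, {"images": []})
    patient["images"].append({
        "path": file_path,
        "modality": header.get("Modality"),
        "study_date": header.get("StudyDate"),
    })

# Hypothetical headers and file names, for illustration only.
db = {}
merge_image_record(db, {"PatientID": "P001", "Modality": "MG",
                        "StudyDate": "20240101"}, "img_0001.dcm")
merge_image_record(db, {"PatientID": "P001", "Modality": "MG",
                        "StudyDate": "20240102"}, "img_0002.dcm")
```

Keying the store by patient is what lets a single dataset serve both primary reporting and research queries across heterogeneous image formats.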
Abstract: Noise level has critical effects on the diagnostic
performance of signal-averaged electrocardiogram (SAECG), because
the true starting and end points of QRS complex would be masked by
the residual noise and are sensitive to the noise level. Several
studies and commercial machines have used a fixed number of heart
beats (typically between 200 and 600) or a predefined noise level
(typically between 0.3 and 1.0 μV) in each of the X, Y and Z leads to
perform SAECG analysis. However, different criteria or methods used
to perform SAECG cause discrepancies in the noise levels among study
subjects. According to the recommendations of the 1991
ESC, AHA and ACC Task Force Consensus Document for the use of
SAECG, the determinations of onset and offset are related closely to
the mean and standard deviation of the noise sample. Hence, this
study performed SAECG using consistent root-mean-square (RMS) noise
levels among study subjects and analyzed the effects of noise level
on SAECG. It also evaluated the differences between normal subjects
and chronic renal failure (CRF) patients in the time-domain SAECG
parameters.
The study subjects were composed of 50 normal Taiwanese and 20
CRF patients. During the signal-averaged processing, different RMS
noise levels were adjusted to evaluate their effects on three time
domain parameters: (1) filtered total QRS duration (fQRSD), (2) RMS
voltage of the last 40 ms of the QRS (RMS40), and (3) the duration of
low-amplitude signals below 40 μV (LAS40). The study results
demonstrated that the reduction of RMS noise level can increase
fQRSD and LAS40 and decrease the RMS40, and can further increase
the differences of fQRSD and RMS40 between normal subjects and
CRF patients. The SAECG may also become abnormal due to the
reduction of RMS noise level. In conclusion, it is essential to establish
diagnostic criteria of SAECG using consistent RMS noise levels for
the reduction of the noise level effects.
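The role of consistent RMS noise levels can be shown numerically: averaging N beats reduces uncorrelated noise by roughly the square root of N, so stopping at a fixed beat count rather than at a fixed residual RMS leaves different noise levels across subjects. A minimal sketch with synthetic noise:

```python
import math
import random

def rms(samples):
    """Root-mean-square value of a signal segment (the noise estimate)."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

random.seed(0)
n_beats, n_samples = 400, 256
# Synthetic "beats" containing only uncorrelated unit-variance noise.
beats = [[random.gauss(0.0, 1.0) for _ in range(n_samples)]
         for _ in range(n_beats)]
# Signal averaging: sample-wise mean across beats.
averaged = [sum(b[i] for b in beats) / n_beats for i in range(n_samples)]

# Averaging N beats shrinks uncorrelated noise by roughly sqrt(N).
ratio = rms(beats[0]) / rms(averaged)
```

Averaging until a target RMS (rather than a target beat count) is what keeps the residual noise, and hence the QRS onset/offset estimates, comparable between subjects.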
Abstract: The processing of the electrocardiogram (ECG) signal consists essentially of detecting the characteristic points of the
signal, which are an important tool in the diagnosis of heart diseases. The most important of these is the detection of R waves. In
this paper, we present various mathematical tools used for filtering the ECG, using digital filtering and Discrete Wavelet Transform
(DWT) filtering. In addition, this paper includes two main R-peak detection methods that apply a windowing process: the first method
is based on derivative calculations, the second is a time-frequency method based on the Dyadic Wavelet Transform (DyWT).
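The abstract does not detail the derivative-based method, so the sketch below only illustrates the general idea: squaring the first difference emphasises the steep QRS slopes, and a sliding window keeps prominent local maxima as R-peak candidates.

```python
def detect_r_peaks(signal, window=10, threshold=0.5):
    """Generic derivative-based R-peak detection sketch: square the
    first difference, then keep local maxima that exceed a fraction
    of the global maximum within a sliding window."""
    deriv2 = [0.0] + [(signal[i] - signal[i - 1]) ** 2
                      for i in range(1, len(signal))]
    peak_val = max(deriv2)
    peaks = []
    for i in range(window, len(signal) - window):
        seg = deriv2[i - window:i + window + 1]
        if deriv2[i] == max(seg) and deriv2[i] > threshold * peak_val:
            peaks.append(i)
    return peaks

# Synthetic trace: flat baseline with two sharp "R waves".
sig = [0.0] * 100
sig[30], sig[31] = 1.0, 0.4
sig[70], sig[71] = 1.0, 0.4
peaks = detect_r_peaks(sig)
```

On real ECG the signal would first pass through the band-pass or DWT filtering described above, so that baseline wander and T waves do not produce competing derivative peaks.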
Abstract: The influence of extrusion parameters on the surface
quality and properties of AA6061 + x vol.% SiC (x = 0, 2.5, 5, 7.5, 10)
composites is discussed in this paper. The average sizes of the
AA6061 and SiC particles were 10.6 μm and 0.42 μm, respectively.
Two series of composites (I: compacts preheated at the extrusion
temperature for 0.5 h and water-cooled directly after the process;
II: compacts preheated for 3 h and not cooled) were consolidated
via powder metallurgy processing and extruded by the KoBo method.
High density values were achieved for both series of composites.
Better surface quality was observed for the series II composites.
Moreover, these composites showed lower (compared with series I)
but more uniform strength properties over the cross-section of
the bar. Microstructure and Young's modulus investigations were
also made.
Abstract: The multi-agent system for processing bio-signals
helps medical practitioners follow a standard examination
procedure stored on a web server. Web servers supporting any
standard search engine follow all possible combinations of the
search keywords entered by the user. As a result, a huge number
of web pages are shown in the web browser. The system also helps
the medical practitioner interact with an expert in the field of
his need in order to make a proper judgment in the diagnosis phase
[3]. The web server uses a plug-in to establish and maintain
sessions, enabling the medical practitioner to make a fast
analysis. Through the web server client, the user can retrieve
data related to their search. A DB agent and EEG/ECG/EMG agents
assist the user with the difficult aspects of updating medical
information on the web server.
Abstract: The standard investigational method for obstructive
sleep apnea syndrome (OSAS) diagnosis is polysomnography (PSG),
which consists of a simultaneous, usually overnight recording of
multiple electro-physiological signals related to sleep and
wakefulness. This is an expensive, encumbering and not a readily
repeated protocol, and therefore there is need for simpler and easily
implemented screening and detection techniques. Identification of
apnea/hypopnea events in the screening recordings is the key factor
for the diagnosis of OSAS. The analysis of a single-lead
electrocardiographic (ECG) signal alone for OSAS diagnosis, which may
be done with portable devices at the patient's home, has been a
challenge of recent years. A novel artificial neural network (ANN) based
approach for feature extraction and automatic identification of
respiratory events in ECG signals is presented in this paper. A
nonlinear principal component analysis (NLPCA) method was
considered for feature extraction and support vector machine for
classification/recognition. An alternative representation of the
respiratory events by means of Kohonen type neural network is
discussed. Our prospective study was based on OSAS patients of the
Clinical Hospital of Pneumology from Iaşi, Romania, males and
females, as well as on investigated non-OSAS human subjects. Our
computational analysis includes a learning phase based on cross-signal
PSG annotation.
Abstract: Nano-sized zirconium dioxide in the monoclinic phase (m-ZrO2) has been synthesized in pure form through co-precipitation processing at different calcination temperatures and has been characterized by several techniques such as XRD, FT-IR, UV-Vis spectroscopy and SEM. The dielectric and capacitance values of the pelletized samples have been examined at room temperature as functions of frequency. The higher dielectric constant value of the sample having the larger grain size proves the strong influence of grain size on the dielectric constant.
Abstract: In hydrocyclones, the particle separation efficiency is
limited by the suspended fine particles, which are discharged with the
coarse product in the underflow. It is well known that injecting water
in the conical part of the cyclone reduces the fine particle fraction in
the underflow. This paper presents a mathematical model that
simulates the water injection in the conical component. The model
accounts for the fluid flow and the particle motion. Particle
interaction, due to hindered settling caused by increased density and
viscosity of the suspension, and fine particle entrainment by settling
coarse particles are included in the model. Water injection in the
conical part of the hydrocyclone is performed to reduce fine particle
discharge in the underflow. The model demonstrates the impact of
the injection rate, injection velocity, and injection location on the
shape of the partition curve. The simulations are compared with
experimental data from a 50-mm cyclone.
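The abstract does not give the paper's hindered-settling law; one common choice, sketched below as an illustration, is the Richardson-Zaki correction, in which the terminal velocity of an isolated particle is reduced by a power of the local voidage.

```python
def hindered_settling_velocity(u_t: float, phi: float, n: float = 4.65) -> float:
    """Richardson-Zaki correction for hindered settling: the isolated-
    particle terminal velocity u_t is multiplied by (1 - phi)**n, where
    phi is the local solids volume fraction. n = 4.65 applies in the
    Stokes (low particle Reynolds number) regime."""
    if not 0.0 <= phi < 1.0:
        raise ValueError("solids volume fraction must be in [0, 1)")
    return u_t * (1.0 - phi) ** n

# A 20% solids suspension slows settling to about 35% of u_t.
print(round(hindered_settling_velocity(1.0, 0.20), 3))  # 0.354
```

The strong sensitivity to phi is why injecting water into the conical part, which dilutes the suspension near the apex, changes the fine-particle split reported in the partition curve.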
Abstract: As the textile industry is the second largest industry
in Egypt, and as small and medium-sized enterprises (SMEs) make up
a great portion of it, it is essential to apply the concept of
Cleaner Production to reduce pollution. To achieve this goal, a
case study concerned with eco-friendly stone-washing of jeans
garments was investigated. A raw
material-substitution option was adopted whereby the toxic
potassium permanganate and sodium sulfide were replaced by the
environmentally compatible hydrogen peroxide and glucose
respectively where the concentrations of both replaced chemicals
together with the operating time were optimized. In addition, a
process-rationalization option involving four additional processes
was investigated. By means of criteria such as product quality,
effluent analysis, mass and heat balance; and cost analysis with the
aid of a statistical model, a process optimization treatment revealed
that the superior process optima were 50%, 0.15% and 50min for
H2O2 concentration, glucose concentration and time, respectively.
With these values the superior process ought to reduce the annual
cost by about EGP 105 relative to the currently used conventional
method.
Abstract: Discourse pronominal anaphora resolution must be part of any efficient information processing system, since the reference of a pronoun depends on an antecedent located in the discourse. Contrary to knowledge-poor approaches, this paper shows that syntax-semantic relations are basic to pronominal anaphora resolution. The identification of quantified expressions to which pronouns can be anaphorically related provides further evidence that pronominal anaphora is based on domains of interpretation where asymmetric agreement holds.
Abstract: In text categorization problem the most used method
for documents representation is based on words frequency vectors
called VSM (Vector Space Model). This representation is based only
on the words in documents and thus loses any “word context”
information found in the document. In this article we make a
comparison between the classical method of document representation
and a method called Suffix Tree Document Model (STDM) that is
based on representing documents in the Suffix Tree format. For the
STDM model we propose a new approach for document representation
and a new formula for computing the similarity between two
documents. Specifically, we propose to build the suffix tree for
only two documents at a time. This approach is faster, has lower
memory consumption, and uses the entire document representation
without needing node-disposal methods. The similarity formula
proposed for this method substantially improves clustering quality.
The representation method was validated using Hierarchical
Agglomerative Clustering (HAC). In this context we also examine the
influence of stemming in the document preprocessing step and
highlight the difference between similarity and dissimilarity
measures for finding “closer” documents.
Abstract: Drilling is the most common machining operation, and it accounts for the highest machining cost in many manufacturing activities, including automotive engine production. The outcome of this operation depends upon many factors, including the use of proper cutting tool geometry, cutting tool material and the type of coating used to improve hardness and resistance to wear, as well as the cutting parameters. With a large array of tool geometries, materials and coatings available, it has become a challenging task to select the best tool and cutting parameters that would result in the lowest machining cost or highest profit rate. This paper describes an algorithm developed to help achieve good performance in drilling operations by automatically determining proper cutting tools and cutting parameters. It also helps determine machining sequences resulting in the minimum number of tool changes, which would eventually reduce machining time and cost where multiple tools are used.
Abstract: This paper describes an effective solution to the task
of a remote monitoring of super-extended objects (oil and gas
pipeline, railways, national frontier). The suggested solution is based
on the principle of simultaneous monitoring of seismoacoustic and
optical/infrared physical fields. The principle of simultaneous
monitoring of these fields is not new, but in contrast to the known
solutions the suggested approach makes it possible to monitor
super-extended objects at very limited operational cost. So-called
C-OTDR (Coherent Optical Time Domain Reflectometer) systems are used
to monitor the seismoacoustic field. Far-CCTV systems are used to
monitor the optical/infrared field. Simultaneous processing of the
data provided by both systems allows effective detection and
classification of target activities appearing in the vicinity of the
monitored objects. The results of practical usage have shown the
high effectiveness of the suggested approach.
Abstract: Motor imagery classification provides an important basis for designing brain-machine interfaces (BMIs). A BMI captures and decodes brain EEG signals and transforms human thought into actions. The ability of an individual to control his EEG through imaginary mental tasks enables him to control devices through the BMI. This paper presents a method to design a four-state BMI using EEG signals recorded from the C3 and C4 locations. Principal features extracted through principal component analysis of the segmented EEG are analyzed using two novel classification algorithms based on the Elman recurrent neural network and the functional link neural network. The performance of both classifiers is evaluated using a particle swarm optimization (PSO) training algorithm; the results are also compared with the conventional back-propagation (BP) training algorithm. EEG motor imagery recorded from two subjects is used in the offline analysis. From the overall classification performance it is observed that the BP algorithm has a higher average classification rate of 93.5%, while the PSO algorithm has better training time and maximum classification. The proposed method promises to provide a useful alternative general procedure for motor imagery classification.
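The principal-feature extraction step can be illustrated with a minimal, dependency-free sketch: power iteration on the sample covariance matrix recovers the first principal component (full PCA over segmented EEG and the neural-network classifiers are beyond this sketch).

```python
def principal_component(data, iters=200):
    """First principal component via power iteration on the sample
    covariance matrix: a minimal stand-in for full PCA."""
    dim = len(data[0])
    # Centre the data on its mean.
    mean = [sum(row[j] for row in data) / len(data) for j in range(dim)]
    centered = [[row[j] - mean[j] for j in range(dim)] for row in data]
    # Sample covariance matrix.
    cov = [[sum(r[i] * r[j] for r in centered) / (len(data) - 1)
            for j in range(dim)] for i in range(dim)]
    # Power iteration: repeated multiplication converges to the
    # eigenvector of the largest eigenvalue.
    v = [1.0] * dim
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(dim)) for i in range(dim)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Synthetic points spread mainly along the x-axis: the first PC is
# approximately (±1, 0).
pts = [[x, 0.1 * ((-1) ** i)] for i, x in enumerate(range(-5, 6))]
v = principal_component(pts)
```

Projecting each EEG segment onto the top few such components yields the low-dimensional "principal features" that the classifiers then operate on.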