Abstract: The modified Claus process is the major technology
for the recovery of elemental sulfur from hydrogen sulfide. The
chemical reactions that can occur in the reaction furnace are
numerous, and many byproducts such as carbon disulfide and
carbonyl sulfide are produced. These compounds can often contribute
from 20 to 50% of the pollutants and should therefore be hydrolyzed
in the catalytic converter. The inlet temperature of the first catalytic
reactor should be maintained above 250 °C to hydrolyze COS
and CS2. In this paper, various reheating configurations for the
first converter of a sulfur recovery unit are investigated. As a
result, the performance of each method is presented for a typical
Claus unit. The results show that the hot gas method performs
better than the other methods.
Abstract: This paper proposes a method for speckle reduction in
medical ultrasound imaging while preserving the edges with the
added advantages of adaptive noise filtering and speed. A nonlinear
image diffusion method is proposed that incorporates a local image
parameter, namely scatterer density, in addition to the gradient, to
weight the diffusion process. The method was tested for
the isotropic case with a contrast-detail phantom and a variety of
clinical ultrasound images, and then compared to linear and some
other diffusion enhancement methods. Different diffusion parameters
were tested and tuned to best reduce speckle noise and preserve
edges. The method showed superior performance measured both
quantitatively and qualitatively when incorporating scatterer density
into the diffusivity function. The proposed filter can be used as a
preprocessing step for ultrasound image enhancement before
applying automatic segmentation, automatic volumetric calculations,
or 3D ultrasound volume rendering.
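The weighting idea can be illustrated with a minimal Perona-Malik
style sketch, assuming a hypothetical per-pixel weight map standing
in for the paper's scatterer-density term (the paper's exact
diffusivity function is not specified here; periodic borders are
used for brevity):

```python
import numpy as np

def nonlinear_diffusion(img, weight, n_iter=20, kappa=0.1, dt=0.2):
    """Perona-Malik style diffusion. `weight` is a hypothetical
    per-pixel map (e.g. derived from local scatterer density) that
    scales the diffusivity, so homogeneous speckle regions smooth
    more than edges."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # differences toward the four neighbors (periodic borders)
        dn = np.roll(u, 1, axis=0) - u
        ds = np.roll(u, -1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        # edge-stopping diffusivity, scaled by the local weight map
        g = lambda d: weight * np.exp(-(d / kappa) ** 2)
        u += dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u
```

With a uniform weight map this reduces to classical Perona-Malik
diffusion; a density-derived map would let speckle regions diffuse
harder than structural edges.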
Abstract: Computer networks are an essential part of computer-based
information systems. The performance of these networks has a
great influence on the whole information system. Measuring the
usability criteria and customer satisfaction on small computer
networks is very important. In this article, an effective approach for
measuring the usability of business network in an information system
is introduced. The usability process for networking provides us with a
flexible and a cost-effective way to assess the usability of a network
and its products. In addition, the proposed approach can be used to
certify network product usability late in the development cycle.
Furthermore, it can be used to help in developing usable interfaces
very early in the cycle and to give a way to measure, track, and
improve usability. Moreover, a new approach for fast information
processing over computer networks is presented. All the data are
collected together in one long vector and then tested as a single input
pattern. The proposed fast time delay neural networks (FTDNNs) use
cross correlation in the frequency domain between the tested data and
the input weights of neural networks. It is proved mathematically and
practically that the number of computation steps required for the
presented time delay neural networks is less than that needed by
conventional time delay neural networks (CTDNNs). Simulation
results using MATLAB confirm the theoretical computations.
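The claimed reduction in computation steps rests on the standard
cross-correlation theorem: a circular cross correlation that costs
O(n²) as sliding dot products costs O(n log n) via the FFT. A minimal
sketch of the equivalence (not the paper's full FTDNN, whose weights
and data layout are not given here):

```python
import numpy as np

def xcorr_direct(x, w):
    """Circular cross correlation by sliding dot products, O(n^2)."""
    n = len(x)
    return np.array([sum(x[(i + k) % n] * w[i] for i in range(n))
                     for k in range(n)])

def xcorr_fft(x, w):
    """Same circular cross correlation via the FFT, O(n log n):
    corr(x, w) = IFFT(FFT(x) * conj(FFT(w)))."""
    return np.real(np.fft.ifft(np.fft.fft(x) * np.conj(np.fft.fft(w))))
```

Both functions return identical results; only the operation count
differs, which is the source of the FTDNN speed-up over CTDNNs.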
Abstract: In this study, the effects of biogas fuels on the performance of an annular micro gas turbine (MGT) were assessed experimentally and numerically. In the experiments, the proposed MGT system was operated successfully under each test condition; the minimum methane fraction of the biogas fuel was roughly 50% CH4 with 50% CO2. The power output was around 170 W at 85,000 RPM when 90% CH4 with 10% CO2 was used, and 70 W at 65,000 RPM when 70% CH4 with 30% CO2 was used. When a critical limit of 60% CH4 was reached, the power output was extremely low. Furthermore, the theoretical Brayton cycle efficiency and electric efficiency of the MGT were calculated as 23% and 10%, respectively. Following the experiments, the measured data helped us identify the parameters of the dynamic model in the numerical simulation. Additionally, a numerical analysis of the re-designed combustion chamber showed that the performance of the MGT could be improved by raising the turbine inlet temperature. This study presents a novel distributed power supply system that can utilize renewable biogas. The completed micro biogas power supply system is small, low cost, easy to maintain, and suited to household use.
Abstract: The field of biomedical materials plays a critical role
in manufacturing a variety of artificial biological replacements in
the modern world. Recently, titanium (Ti) materials have been used
as biomaterials because of their superior corrosion resistance,
tremendous specific strength, freedom from allergic problems, and
excellent biocompatibility compared to other
competing biomaterials such as stainless steel, Co-Cr alloys,
ceramics, polymers, and composite materials. However, despite
these excellent properties, implantable Ti materials have
poor shear strength and wear resistance, which limit their
applications as biomaterials. Even though the wear properties of Ti
alloys have shown some improvement, the crucial effectiveness of
biomedical Ti alloys as wear components requires a comprehensive
deep understanding of the wear reasons, mechanisms, and techniques
that can be used to improve wear behavior. This review examines
current information on the effect of thermal and thermomechanical
processing of implantable Ti materials on the long-term prosthetic
requirements related to wear behavior. This paper focuses
mainly on the evolution, evaluation and development of effective
microstructural features that can improve the wear properties of
bio-grade Ti materials using thermal and thermomechanical treatments.
Abstract: A prototype model of an emulsion separator was
designed and manufactured. Generally, it is a cylinder filled with
different fractal modules. The emulsion was fed into the reactor by a
peristaltic pump through an inlet placed at the boundary between the
two phases. For the hydrodynamic design and sizing of the reactor,
the assumptions of filtration theory were used and methods to
describe the separation process were developed. Based on this
methodology, and using numerical methods and Autodesk software,
the process was simulated in different operating modes. The basic
hydrodynamic characteristics - speed and performance - for different
types of fractal systems were defined, along with decisions to
optimize the design of the reactor.
Abstract: Accurately predicting non-peak traffic is crucial to
daily traffic forecasting for all models. In this paper, least squares
support vector machines (LS-SVMs) are investigated to solve this
practical problem. This is the first time the approach has been applied
and its forecast performance analyzed in this domain. For comparison
purposes, two parametric and two non-parametric techniques are selected
because of their proven effectiveness in past research. Having good
generalization ability and guaranteeing global minima, LS-SVMs
perform better than the others. The improvement they provide in
stability and robustness reveals that the approach is practically
promising.
Abstract: Computing the facility location problem for every
location in a country simultaneously is not easy. This paper
describes solving the problem using cluster computing: a parallel
algorithm is designed using local search with a single-swap method
in order to solve the problem on clusters. The parallel
implementation is done using portable parallel programming with the
Message Passing Interface (MPI) on Microsoft Windows Compute
Cluster. The paper presents the algorithm, which uses local search
with the single-swap method, and its MPI implementation on a cluster
for determining which facilities to open. If large datasets are
considered, the process of calculating a reasonable cost for a facility
becomes time consuming. The results show that parallel computation of
the facility location problem on a cluster speeds up and scales well as
problem size increases.
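The single-swap local search itself can be sketched serially (the
paper distributes the swap evaluations with MPI; that layer, and the
p-median-style cost model used here, are assumptions):

```python
import numpy as np
from itertools import product

def total_cost(dist, open_facilities):
    # each client is served by its nearest open facility
    return dist[:, open_facilities].min(axis=1).sum()

def local_search_single_swap(dist, p, seed=0):
    """p-median local search: start from a random open set, then
    repeatedly try swapping one open facility for one closed one,
    accepting any swap that lowers the total assignment cost, until
    no improving swap exists."""
    rng = np.random.default_rng(seed)
    n = dist.shape[1]
    current = list(rng.choice(n, size=p, replace=False))
    best = total_cost(dist, current)
    improved = True
    while improved:
        improved = False
        for i, j in product(range(p), range(n)):
            if j in current:
                continue
            cand = current.copy()
            cand[i] = j  # single swap: open j instead of current[i]
            c = total_cost(dist, cand)
            if c < best - 1e-12:
                current, best, improved = cand, c, True
                break
    return sorted(current), best
```

A cluster version would partition the candidate swaps across MPI
ranks and reduce the best improvement each round.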
Abstract: Cognitive science appeared about 40 years ago,
following the challenge of artificial intelligence, as a common
territory for several scientific disciplines such as IT, mathematics,
psychology, neurology, philosophy, sociology, and linguistics. The
newborn science was justified by the complexity of the problems
related to human knowledge on the one hand, and on the other by the
fact that none of the above-mentioned sciences could explain the
mental phenomena alone. Based on the data supplied by the
experimental sciences such as psychology and neurology, models of
the operation of the human mind are built in cognitive science. These
models are implemented in computer programs and/or electronic
circuits (specific to artificial intelligence) – cognitive systems –
whose competences and performances are compared to human
ones, leading to the reinterpretation of psychological and
neurological data and to the construction of new models. In these
processes, while psychology provides the experimental basis,
philosophy and mathematics provide the level of abstraction utterly
necessary for the interplay of the mentioned sciences.
The general problematic of the cognitive approach yields two
important types of approach: the computational one, starting from
the idea that mental phenomena can be reduced to calculus
operations on 1s and 0s, and the connectionist one, which considers
the products of thinking to be the result of the interaction
between all the component (included) systems. In psychology,
measurements in the computational register use classical
questionnaires and psychometric tests, generally based on calculus
methods. Considering both sides that represent cognitive science, we
can notice a gap in the possibilities of measuring psychological
products from the connectionist perspective, which requires a
unitary understanding of the quality–quantity whole. In such an
approach, measurement by calculus proves inefficient. Our research,
carried out for more than 20 years, leads to the conclusion that
measurement by forms properly fits the laws and principles of
connectionism.
Abstract: This work has been carried out in order to provide an understanding of the physical behavior of the flow, and of the variation of pressure and temperature, in a vortex tube. A computational fluid dynamics model is used to predict the flow fields and the associated temperature separation within a Ranque–Hilsch vortex tube. The CFD model is a steady axisymmetric model (with swirl) that utilizes the standard k-ε turbulence model. Second-order numerical schemes were used to carry out all the computations. A vortex tube with a circumferential inlet stream, an axial (cold) outlet stream, and a circumferential (hot) outlet stream was considered. Performance curves (temperature separation versus cold outlet mass fraction) were obtained for a specific vortex tube with a given inlet mass flow rate. Simulations have been carried out for varying amounts of cold outlet mass flow. The model results show good agreement with experimental data.
Abstract: Instead of traditional (nominal) classification we investigate
the subject of ordinal classification or ranking. An enhanced
method based on an ensemble of Support Vector Machines (SVMs)
is proposed. Each binary classifier is trained with specific weights
for each object in the training data set. Experiments on benchmark
datasets and synthetic data indicate that the performance of our
approach is comparable to state-of-the-art kernel methods for
ordinal regression. The ensemble method, which is straightforward
to implement, provides a very good sensitivity-specificity trade-off
for the highest and lowest rank.
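The decomposition behind such an ensemble (Frank-Hall style: one
binary "rank > k" classifier per threshold, whose positive answers are
counted) can be sketched with simple nearest-centroid classifiers
standing in for the weighted SVMs the paper actually uses:

```python
import numpy as np

def fit_ordinal_ensemble(X, y, ranks):
    """For each rank threshold k, train a binary classifier for the
    question "is y > k?". Each binary model here is a nearest-centroid
    rule, a deliberately simple stand-in for a weighted SVM."""
    models = []
    for k in ranks[:-1]:
        labels = (y > k).astype(int)
        c0 = X[labels == 0].mean(axis=0)
        c1 = X[labels == 1].mean(axis=0)
        models.append((c0, c1))
    return models

def predict_ordinal(models, X, ranks):
    # count how many "y > k" questions are answered yes;
    # that count indexes directly into the ordered ranks
    votes = np.zeros(len(X), dtype=int)
    for c0, c1 in models:
        d0 = np.linalg.norm(X - c0, axis=1)
        d1 = np.linalg.norm(X - c1, axis=1)
        votes += (d1 < d0).astype(int)
    return np.array(ranks)[votes]
```

Replacing the centroid rule with per-object-weighted SVMs recovers
the structure of the proposed ensemble.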
Abstract: Ant Colony Algorithms have been applied to difficult
combinatorial optimization problems such as the travelling salesman
problem and the quadratic assignment problem. In this paper, grid-based
and random-based ant colony algorithms are proposed for
automatic 3D hose routing and their pros and cons are discussed. The
algorithm uses the tessellated format for the obstacles and the
generated hoses in order to detect collisions. The representation of
obstacles and hoses in the tessellated format greatly helps the
algorithm towards handling free-form objects and speeds up
computation. The performance of the algorithm has been tested on a
number of 3D models.
Abstract: Automatic methods of detecting changes through
satellite imaging are the object of growing interest, especially
because of numerous applications linked to analysis of the Earth’s
surface or the environment (monitoring vegetation, updating maps,
risk management, etc.). This work implemented spatial analysis
techniques by using images with different spatial and spectral
resolutions on different dates. The work was based on the principle
of control charts in order to set the upper and lower limits beyond
which a change would be noted. Later, the a contrario approach was
used. This was done by testing different thresholds for which the
difference calculated between two pixels was significant. Finally,
labeled images were considered, giving a particularly low difference
which meant that the number of “false changes” could be estimated
according to a given limit.
Abstract: This paper presents a simulation and experimental
study aimed at investigating the effectiveness of an adaptive artificial
neural network stabilizer in enhancing the damping torque of a
synchronous generator. For this purpose, a power system comprising
a synchronous generator feeding a large power system through a
short tie line is considered. The proposed adaptive neuro-control
system consists of two multi-layered feed forward neural networks,
which work as a plant model identifier and a controller. It generates
supplementary control signals to be utilized by conventional
controllers. The details of the interfacing circuits, sensors and
transducers, which have been designed and built for use in tests, are
presented. The synchronous generator is tested to investigate the
effect of tuning a Power System Stabilizer (PSS) on its dynamic
stability. The obtained simulation and experimental results verify the
basic theoretical concepts.
Abstract: Partial combustion of biomass in a gasifier generates producer gas that can be used for heating purposes and as a supplementary or sole fuel in internal combustion engines. In this study, virgin biomass obtained from hingan shell is used as the feedstock for a gasifier to generate producer gas. The gasifier-engine system is operated on diesel and on esters of hingan vegetable oil in liquid fuel mode, and then on a combination of liquid fuel and producer gas in dual fuel mode. The performance and emission characteristics of the CI engine are analyzed by running the engine in liquid fuel mode and in dual fuel mode at different load conditions, with respect to maximum diesel savings in the dual fuel mode. It was observed that specific energy consumption in the dual fuel mode of operation is on the higher side at all load conditions. The brake thermal efficiency of the engine using diesel or hingan oil methyl ester (HOME) is higher than that of dual fuel mode operation. A diesel replacement to the tune of 60% in dual fuel mode is possible with the use of hingan shell producer gas. The emission parameters such as CO, HC, NOx, CO2 and smoke are higher in dual fuel mode of operation as compared to liquid fuel mode.
Abstract: In this paper, we propose a geometric modeling of
illumination on a patterned image containing etched transistors. This
image is captured by a commercial camera during the inspection of
a TFT-LCD panel. Defect inspection is an important process in the
production of LCD panels, but the regional difference in brightness,
which has a negative effect on the inspection, is caused by the uneven
illumination environment. In order to solve this problem, we present
a geometric modeling of illumination consisting of an interpolation
using the least squares method and 3D modeling using a Bezier surface.
Our computational time, using the sampling method, is shorter
than that of previous methods. Moreover, it can be further used to correct
brightness in every patterned image.
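The surface part of such a model can be illustrated with a minimal
Bezier-surface evaluator (the control-point layout, the surface
degree, and the least-squares fitting of the control grid to sampled
brightness are assumptions left out here):

```python
import numpy as np
from math import comb

def bernstein(n, i, t):
    """Bernstein basis polynomial B_{i,n}(t)."""
    return comb(n, i) * t**i * (1 - t)**(n - i)

def bezier_surface(ctrl, u, v):
    """Evaluate a Bezier surface defined by an (n+1) x (m+1) grid of
    scalar control values (e.g. brightness) at (u, v) in [0, 1]^2."""
    n, m = ctrl.shape[0] - 1, ctrl.shape[1] - 1
    bu = np.array([bernstein(n, i, u) for i in range(n + 1)])
    bv = np.array([bernstein(m, j, v) for j in range(m + 1)])
    return float(bu @ ctrl @ bv)
```

A fitted background surface of this form could then be divided out
of the captured image to flatten the regional brightness; fitting
`ctrl` by least squares on sampled pixels is the step the paper's
sampling method accelerates.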
Abstract: Designing, implementing, and debugging concurrency
control algorithms in a real system is a complex, tedious, and
error-prone process. Further, understanding concurrency control
algorithms and distributed computations is itself a difficult task.
Visualization can help with both of these problems. Thus, we have
developed an exploratory environment in which people can prototype
and test various versions of concurrency control algorithms, study
and debug distributed computations, and view performance statistics
of distributed systems. In this paper, we describe the exploratory
environment and show how it can be used to explore concurrency
control algorithms for the interactive steering of distributed
computations.
Abstract: In the semiconductor manufacturing process, large
amounts of data are collected from various sensors of multiple
facilities. The collected data from sensors have several different characteristics
due to variables such as types of products, former processes
and recipes. In general, Statistical Quality Control (SQC) methods
assume the normality of the data to detect out-of-control states of
processes. Although the collected data have different characteristics,
using the data directly as inputs to SQC will increase data
variation, require wide control limits, and decrease the ability to
detect out-of-control states. Therefore, it is necessary to separate
similar data groups from the mixed data for more accurate process
control. In this paper, we propose a regression tree using a split
algorithm based on the Pearson distribution to handle non-normal
distributions in a parametric method.
The regression tree finds similar properties of data from different
variables. The experiments using real semiconductor manufacturing
process data show improved fault detection performance.
Abstract: The aim of this article is to assess the existing
business models used by the banks operating in the CEE countries in
the period from 2006 to 2011.
In order to obtain research results, the authors performed
qualitative analysis of the scientific literature on bank business
models, which have been grouped into clusters that consist of such
components as: 1) capital and reserves; 2) assets; 3) deposits, and 4)
loans.
In turn, bank business models have been developed based on
the types of core activities of the banks, and have been divided into
four groups: Wholesale, Investment, Retail and Universal Banks.
Descriptive statistics have been used to analyse the models,
determining mean, minimal and maximal values of constituent
cluster components, as well as standard deviation. The analysis of
the data is based on such bank variable indices as Return on Assets
(ROA) and Return on Equity (ROE).
Abstract: This research studied a comparison of inspectors' performance between regular and complex visual inspection tasks. The visual task was simulated on a DVD read-control circuit, and the inspection task was performed using a computer. Subjects were 10 randomly selected undergraduates, tested for 20/20 vision. The subjects were then divided into two groups, five for the regular inspection (control group) and five for the complex inspection (treatment group) task. Results showed that the performance of regular and complex inspectors differed significantly at the 0.05 level. Inspectors performing regular inspection detected a high percentage of defects in a time equal to that of complex inspection. This indicates that inspector performance was affected by the visual inspection task.