Abstract: This work deals with modeling and simulation of SO2 removal in a ceramic membrane by means of FEM. A mass transfer model was developed to predict the performance of SO2 absorption in a chemical solvent. The model was based on solving the conservation equations for the gas component in the membrane. Computational fluid dynamics (CFD) techniques for mass and momentum were used to solve the model equations. The simulations aimed to obtain the distribution of gas concentration in the absorption process. The effect of the operating parameters on the efficiency of the ceramic membrane was evaluated. The modeling findings showed that the gas phase velocity has a significant effect on gas removal, whereas the liquid phase does not affect SO2 removal significantly. It is also indicated that the main mass transfer resistance lies in the membrane and the gas phase because of the high tortuosity of the ceramic membrane.
Abstract: A data warehouse (DW) is a system whose value lies in supporting decision-making through querying. Queries to a DW are critical with regard to their complexity and length. They often access millions of tuples and involve joins between relations and aggregations. Materialized views can provide better performance for DW queries. However, these views carry a maintenance cost, so materializing all views is not possible. An important challenge of the DW environment is materialized view selection, because we have to realize the trade-off between query performance and view maintenance cost. Therefore, in this paper we introduce a new approach to this challenge based on Two-Phase Optimization (2PO), a combination of Simulated Annealing (SA) and Iterative Improvement (II), with the use of a Multiple View Processing Plan (MVPP). Our experiments show that our method provides a further improvement in terms of query processing cost and view maintenance cost.
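As a hypothetical illustration of the two-phase idea (not the paper's actual cost model or MVPP structure; the view costs below are invented for the sketch), Iterative Improvement first descends to a local optimum and Simulated Annealing then refines it from a low starting temperature:

```python
import math
import random

# Invented toy cost model: materializing view i saves SAVINGS[i] in query
# cost but adds MAINTENANCE[i] in upkeep; we search over subsets of views.
SAVINGS = [40, 25, 60, 15, 30]
MAINTENANCE = [20, 30, 35, 5, 25]
BASE_QUERY_COST = 200

def total_cost(selected):
    """Combined query processing + view maintenance cost for a view subset."""
    saved = sum(SAVINGS[i] for i in selected)
    maint = sum(MAINTENANCE[i] for i in selected)
    return BASE_QUERY_COST - saved + maint

def iterative_improvement(n, restarts=10):
    """Phase 1 (II): repeated greedy local search from random starting subsets."""
    best = frozenset()
    for _ in range(restarts):
        state = frozenset(i for i in range(n) if random.random() < 0.5)
        improved = True
        while improved:
            improved = False
            for i in range(n):
                cand = state ^ {i}          # toggle one view in/out
                if total_cost(cand) < total_cost(state):
                    state, improved = cand, True
        if total_cost(state) < total_cost(best):
            best = state
    return best

def two_phase_optimization(n, temp=5.0, cooling=0.9, steps=200):
    """Phase 2 (SA): anneal from II's local optimum with a low start temperature."""
    state = best = iterative_improvement(n)
    for _ in range(steps):
        cand = state ^ {random.randrange(n)}
        delta = total_cost(cand) - total_cost(state)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            state = cand                    # accept downhill, sometimes uphill
        if total_cost(state) < total_cost(best):
            best = state
        temp *= cooling
    return best, total_cost(best)
```

Because II already supplies a good starting point, the SA phase can begin at a much lower temperature than standalone SA, which is the main saving the two-phase scheme offers.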
Abstract: We have defined two suites of metrics, which cover
static and dynamic aspects of component assembly. The static
metrics measure complexity and criticality of component assembly,
wherein complexity is measured using Component Packing Density
and Component Interaction Density metrics. Further, four criticality
conditions namely, Link, Bridge, Inheritance and Size criticalities
have been identified and quantified. The complexity and criticality
metrics are combined to form a Triangular Metric, which can be used
to classify the type and nature of applications. Dynamic metrics are
collected during the runtime of a complete application. Dynamic
metrics are useful for identifying super-components and for evaluating
the degree of utilisation of various components. In this paper both
static and dynamic metrics are evaluated using Weyuker's set of
properties. The results show that the metrics provide a valid means of
measuring issues in component assembly. We relate our metrics suite to
McCall's Quality Model and illustrate its impact on product
quality and on the management of component-based product
development.
Abstract: Computer languages are usually lumped together
into broad "paradigms", leaving us in want of a finer classification
of kinds of language. Theories distinguishing between "genuine
differences" in languages have been called for, and we propose that
such differences can be observed through a notion of expressive mode.
We outline this concept, propose how it could be operationalized and
indicate a possible context for the development of a corresponding
theory. Finally we consider a possible application in connection
with evaluation of language revision. We illustrate this with a case,
investigating possible revisions of the relational algebra in order to
overcome weaknesses of the division operator in connection with
universal queries.
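For reference, relational division is the operator that answers universal ("for all") queries; a minimal sketch over sets of tuples, illustrative only and not the revision the abstract investigates:

```python
from collections import defaultdict

def divide(r, s):
    """Relational division r / s: r is a set of (x, y) pairs and s a set of
    y values; the result is every x that is related to ALL y in s, i.e. the
    answer to a universal query such as "students who took every core course"."""
    required = set(s)
    ys_by_x = defaultdict(set)
    for x, y in r:
        ys_by_x[x].add(y)
    return {x for x, ys in ys_by_x.items() if required <= ys}
```

The operator's awkwardness in practice (SQL, for instance, has no direct division operator and typically emulates it with doubly negated subqueries) is the kind of weakness that proposals for revising the relational algebra address.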
Abstract: Irradiated material is a typical example of a complex
system with nonlinear coupling between its elements. During
irradiation, radiation damage develops, and this development
exhibits bifurcations and qualitatively different kinds of behavior.
The accumulation of primary defects in irradiated crystals is
considered within the framework of the nonlinear evolution of a
complex system. A thermo-concentration nonlinear feedback is
identified as the mechanism of self-oscillation development.
It is shown that there are two ways of the defect density evolution
under stationary irradiation. The first is the accumulation of defects;
defect density monotonically grows and tends to its stationary state
for some system parameters. The other way, which takes place for
suitable parameters, is the development of self-oscillations of the
defect density.
The stationary state, its stability and type are found. The
bifurcation values of parameters (environment temperature, defect
generation rate, etc.) are obtained. The frequency of the
self-oscillations and the conditions of their development are found
and estimated. It is shown that the defect density, heat fluxes and temperature
during self-oscillations can reach much higher values than the
expected steady-state values. It can lead to a change of typical
operation and an accident, e.g. for nuclear equipment.
Abstract: In this paper we have proposed a methodology to
develop an amperometric biosensor for the analysis of glucose
concentration using a simple microcontroller based data acquisition
system. The work involves the development of a Detachable
Membrane Unit (an enzyme-based biomembrane) with immobilized
glucose oxidase on the membrane and interfacing the same to the
signal conditioning system. The current generated by the biosensor
for different glucose concentrations was signal conditioned, then
acquired and computed by a simple AT89C51-microcontroller. The
optimum operating parameters for the best performance were found
and reported. The detailed performance evaluation of the biosensor
has been carried out. The proposed microcontroller-based biosensor
system has a sensitivity of 0.04 V/g/dl, with a resolution of
50 mg/dl. It exhibited very good inter-day stability, observed over
30 days. Compared to a reference method such as HPLC, the
accuracy of the proposed biosensor system is well within ±1.5%.
The system can be used for real-time analysis of glucose
concentration in fields such as food and fermentation, and clinical
(in-vitro) applications.
Abstract: Water hyacinth has been used in aquatic systems for
wastewater purification for many years worldwide. The role of the water
hyacinth (Eichhornia crassipes) species in polishing nitrate and
phosphorus concentration from municipal wastewater treatment plant
effluent by phytoremediation method was evaluated. The objective
of this project is to determine the removal efficiency of water
hyacinth in polishing nitrate and phosphorus, as well as chemical
oxygen demand (COD) and ammonia. Water hyacinth is considered
the most efficient aquatic plant for removing a vast range of
pollutants such as organic matter, nutrients and heavy metals. The
water hyacinth, also referred to as a macrophyte, was cultivated in
the treatment house in a reactor tank of approximately 90 (L) x 40 (W)
x 25 (H) in dimension, built with three compartments. Three water
hyacinths were placed in each compartment, and water samples from
each compartment were collected every two days. Plant
observation was conducted by weight measurement, plant uptake and
new young shoot development. Water hyacinth effectively removed
approximately 49% of COD, 81% of ammonia, 67% of phosphorus
and 92% of nitrate. It also showed a significant growth rate, starting
from day 6 at 0.33 shoots/day and developing up to 0.38
shoots/day by the end of day 24. From the studies conducted, it was
shown that water hyacinth is capable of polishing the effluent of
municipal wastewater containing undesirable concentrations of nitrate
and phosphorus.
Abstract: Partial discharge (PD) detection is an important
method to evaluate the insulation condition of metal-clad apparatus.
Non-intrusive sensors, which are easy to install and do not
interrupt operation, are preferred in onsite PD detection.
However, such detection often lacks accuracy due to the interferences
in PD signals. In this paper a novel PD extraction method that uses
frequency analysis and entropy-based time-frequency (TF) analysis is
introduced. The repetitive pulses from the converter are first removed via frequency
analysis. Then, the relative entropy and relative peak-frequency of
each pulse (i.e. time-indexed vector TF spectrum) are calculated and
all pulses with similar parameters are grouped. According to the
characteristics of non-intrusive sensor and the frequency distribution
of PDs, the pulses of PD and interferences are separated. Finally the
PD signal and interferences are recovered via inverse TF transform.
The de-noised result of noisy PD data demonstrates that the
combination of frequency and time-frequency techniques can
discriminate PDs from interferences with various frequency
distributions.
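The entropy step can be illustrated with a toy spectral-entropy computation (a generic sketch; the paper's exact TF spectrum and relative-entropy definitions are not reproduced here). Broadband PD-like pulses spread energy across frequency bins and score high entropy, while narrowband interference concentrates energy and scores low, which is what makes entropy usable as a grouping parameter:

```python
import cmath
import math

def dft_magnitudes(x):
    """Naive DFT magnitude spectrum (adequate for short pulse records)."""
    N = len(x)
    return [abs(sum(x[n] * cmath.exp(-2j * math.pi * k * n / N)
                    for n in range(N)))
            for k in range(N // 2)]

def spectral_entropy(x):
    """Shannon entropy of the normalized magnitude spectrum of one pulse.
    Flat (broadband) spectra give high entropy; single-tone spectra give ~0."""
    mags = dft_magnitudes(x)
    total = sum(mags)
    p = [m / total for m in mags if m > 0]
    return -sum(pi * math.log(pi) for pi in p)
```

Pulses whose entropy (and peak frequency) fall close together would then be grouped, mirroring the clustering step the abstract describes.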
Abstract: When cars are released from the factory, strut noises are very small and therefore difficult to perceive. As use time and travel distance increase, however, strut noises grow large enough to cause users considerable unease. The noises generated in the field include engine noises and flow noises, and it is therefore difficult to clearly discern the noises generated by the struts. This study developed a test method which can reproduce field strut noises in the lab. Using the newly developed noise evaluation test, this study analyzed the effects that insulator performance degradation and failure can have on car noises. The study also confirmed that an insulator durability test based on simple back-and-forth motion cannot completely reproduce the part failures seen in the field. Based on this, the study further confirmed that field noises can be reproduced through a durability test that accounts for heat aging.
Abstract: A new digital watermarking technique for images that
are sensitive to blocking artifacts is presented. Experimental results
show that the proposed MDCT based approach produces highly
imperceptible watermarked images and is robust to attacks such as
compression, noise, filtering and geometric transformations. The
proposed MDCT watermarking technique is applied to fingerprints
for ensuring security. The face image and demographic text data of
an individual are used as multiple watermarks. An Automated
Fingerprint Identification System (AFIS) was used to quantitatively
evaluate the matching performance of the
MDCT-based watermarked fingerprint. The high fingerprint
matching scores show that the MDCT approach is resilient to
blocking artifacts. The quality of the extracted face and extracted text
images was computed using two human visual system metrics and
the results show that the image quality was high.
Abstract: In digital signal processing it is important to
approximate multi-dimensional data by a method called rank
reduction, in which the rank of multi-dimensional data is reduced
from higher to lower. For 2-dimensional data, singular value
decomposition (SVD) is one of the best-known rank reduction
techniques. In addition, an outer product expansion derived from
SVD was proposed and implemented for multi-dimensional data, and
has been widely applied to image processing and pattern recognition.
However, the multi-dimensional outer product expansion has high
computational complexity and lacks orthogonality between the
expansion terms. We have therefore proposed an alternative method,
the Third-order Orthogonal Tensor Product Expansion (3-OTPE).
3-OTPE uses the power method instead of a nonlinear optimization
method in order to reduce computing time. Around the same time,
the group of De Lathauwer proposed the Higher-Order SVD (HOSVD),
which is likewise developed as an extension of SVD for
multi-dimensional data. 3-OTPE and HOSVD are similar in their rank
reduction of multi-dimensional data. Applying the two methods to the
same data, some results are the same while others differ slightly.
In this paper, we compare 3-OTPE with HOSVD in terms of calculation
accuracy and computing time, and clarify the difference between
these two methods.
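The power-method idea underlying 3-OTPE can be sketched in the matrix (2-D) case, where it recovers the dominant singular triplet and deflates it to build a rank-reduced approximation. This is a generic illustration only, not the 3-OTPE algorithm itself, which operates on third-order tensors:

```python
import math
import random

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def transpose(M):
    return [list(row) for row in zip(*M)]

def norm(v):
    return math.sqrt(sum(x * x for x in v))

def dominant_singular_triplet(A, iters=200):
    """Power method on A^T A: returns (sigma_1, u_1, v_1), the largest
    singular value and its singular vectors."""
    At = transpose(A)
    v = [random.random() for _ in A[0]]
    for _ in range(iters):
        w = matvec(At, matvec(A, v))   # one step of the power iteration
        v = [x / norm(w) for x in w]
    Av = matvec(A, v)
    sigma = norm(Av)
    u = [x / sigma for x in Av]
    return sigma, u, v

def deflate(A, sigma, u, v):
    """Subtract sigma * u v^T, the best rank-1 term; repeating this builds
    an outer product expansion term by term."""
    return [[A[i][j] - sigma * u[i] * v[j] for j in range(len(v))]
            for i in range(len(u))]
```

Repeated extract-and-deflate cycles of this kind are what replace the nonlinear optimization step and account for the reduced computing time claimed for 3-OTPE.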
Abstract: The development of shape and size of a crack in a
pressure vessel under uniaxial and biaxial loadings is important in
fitness-for-service evaluations such as leak-before-break. In this
work finite element modelling was used to evaluate the mean stress
and the J-integral around the front of a surface-breaking crack. A
procedure on the basis of ductile tearing resistance curves of high and
low constrained fracture mechanics geometries was developed to
estimate the amount of ductile crack extension for surface-breaking
cracks and to show the evolution of the initial crack shape. The
results showed non-uniform constraint levels and crack driving forces
around the crack front at large deformation levels. It was also shown
that initially semi-elliptical surface cracks under biaxial load
developed higher constraint levels around the crack front than in
uniaxial tension. However, similar crack shapes were observed, with
greater extension associated with cracks under biaxial loading.
Abstract: We demonstrate single-photon interference over 10 km using a plug-and-play system for quantum key distribution. The quality of the interferometer is measured by the interferometer
visibility. The signal is encoded using phase coding, and the visibility value is based on the interference effect, which results in a photon count. The setup gives full control of the polarization inside
the interferometer. The quality measurement of the interferometer is based on the number of counts per second, and the system produces 94% visibility in one of the detectors.
Abstract: The third phase of the web, the Semantic Web, requires many web pages that are annotated with metadata. Thus, a crucial question is where to acquire this metadata. In this paper we propose our approach, a semi-automatic method to annotate the text of documents and web pages, which employs a quite comprehensive knowledge base to categorize instances with regard to an ontology. The approach is evaluated against manual annotations and against one of the most popular annotation tools, which works in the same way as our tool. The approach, an annotation tool for the Semantic Web, is implemented in the .NET framework and uses WordNet as its knowledge base.
Abstract: The complexity of today's software systems makes
collaborative development necessary to accomplish tasks.
Frameworks are necessary to allow developers to perform their tasks
independently yet collaboratively. Similarity detection is one of the
major issues to consider when developing such frameworks. It allows
developers to mine existing repositories when developing their own
views of a software artifact, and it is necessary for identifying the
correspondences between the views to allow merging them and
checking their consistency. Due to the importance of the
requirements specification stage in software development, this paper
proposes a framework for collaborative development of Object-
Oriented formal specifications along with a similarity detection
approach to support the creation, merging and consistency checking
of specifications. The paper also explores the impact of using
additional concepts on improving the matching results. Finally, the
proposed approach is empirically evaluated.
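A similarity detector of the general kind described can be sketched with a Jaccard measure over identifier tokens. All helper names and the matching scheme below are invented for illustration; the paper's actual approach and its additional concepts are not reproduced:

```python
import re

def tokens(name):
    """Split a CamelCase identifier into lowercase word tokens,
    e.g. "BankAccount" -> {"bank", "account"}."""
    return {t.lower() for t in re.findall(r"[A-Z][a-z]*|[a-z]+", name)}

def similarity(n1, n2):
    """Jaccard similarity between the token sets of two element names."""
    a, b = tokens(n1), tokens(n2)
    return len(a & b) / len(a | b) if a | b else 1.0

def correspondences(view1, view2, threshold=0.5):
    """Greedy name-based matching of specification elements across two
    developer views; matched pairs are candidates for merging and
    consistency checking."""
    pairs = []
    for a in view1:
        best = max(view2, key=lambda b: similarity(a, b))
        if similarity(a, best) >= threshold:
            pairs.append((a, best))
    return pairs
```

In practice a matcher would combine several such signals (attributes, operations, structural context) rather than names alone, which is the kind of "additional concept" the abstract says improves matching results.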
Abstract: We present a simplified equalization technique for a
π/4 differential quadrature phase shift keying (π/4-DQPSK) modulated
signal in a multipath fading environment. The proposed equalizer is
realized as a fractionally spaced adaptive decision feedback equalizer
(FS-ADFE), employing exponential step-size least mean square
(LMS) algorithm as the adaptation technique. The main advantage of
the scheme stems from the usage of exponential step-size LMS algorithm
in the equalizer, which achieves similar convergence behavior
as that of a recursive least squares (RLS) algorithm with significantly
reduced computational complexity. To investigate the finite-precision
performance of the proposed equalizer along with the π/4-DQPSK
modem, the entire system is evaluated on a 16-bit fixed point digital
signal processor (DSP) environment. The proposed scheme is found
to be attractive even for those cases where equalization is to be
performed within a restricted number of training samples.
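The step-size schedule at the heart of the scheme can be sketched generically. The following is an illustrative LMS system-identification setup with invented parameters, not the FS-ADFE itself; it shows only how an exponentially decaying step size gives fast initial adaptation with low steady-state misadjustment:

```python
import random

def lms_exponential(x, d, num_taps=3, mu0=0.3, decay=0.999, mu_min=0.01):
    """LMS adaptation with exponentially decaying step size
    mu_n = max(mu0 * decay**n, mu_min): a large early step converges
    quickly (RLS-like behavior), the floored small step keeps the
    steady-state error low, all at LMS-level complexity."""
    w = [0.0] * num_taps
    mu = mu0
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]        # newest sample first
        y = sum(wi * ui for wi, ui in zip(w, u))   # filter output
        e = d[n] - y                               # error vs desired signal
        w = [wi + mu * e * ui for wi, ui in zip(w, u)]
        mu = max(mu * decay, mu_min)
    return w
```

In the paper's setting the same update would run inside a fractionally spaced decision feedback structure on π/4-DQPSK samples, with the decision device supplying the desired signal after training.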
Abstract: The weight constrained shortest path problem
(WCSPP) is one of the best-known basic problems in
combinatorial optimization. Because of its importance in many areas
of application, such as computer science, engineering and operations
research, many researchers have studied the WCSPP extensively.
This paper mainly concentrates on reducing the total search space
for solving the WCSPP using an existing Genetic Algorithm (GA). For
this purpose, some controlled schemes of genetic operators are
adopted on a list chromosome representation. This approach gives a
near-optimum solution in fewer elapsed generations than the
classical GA technique. From further analysis of the matter, a new
generalized schema theorem is also developed from the philosophy
of Holland's theorem.
Abstract: This paper outlines the development of a learning retrieval agent. The task of this agent is to extract knowledge from the Active Semantic Network with respect to user requests. Based on a reinforcement learning approach, the agent learns to interpret the user's intention. In particular, the learning algorithm focuses on the retrieval of complex long-distance relations. Increasing its learnt knowledge with every request-result-evaluation sequence, the agent enhances its capability in finding the intended information.
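The request-result-evaluation loop maps naturally onto a tabular value update. A generic Q-learning step is sketched below; the state and action names are invented and the paper's actual learning algorithm is not reproduced:

```python
def q_update(q, state, action, reward, next_state, next_actions,
             alpha=0.5, gamma=0.9):
    """One tabular Q-learning step: move Q(s, a) toward
    reward + gamma * max_a' Q(s', a'). The reward plays the role of the
    user's evaluation of a retrieved result."""
    best_next = max((q.get((next_state, a), 0.0) for a in next_actions),
                    default=0.0)
    key = (state, action)
    q[key] = q.get(key, 0.0) + alpha * (reward + gamma * best_next
                                        - q.get(key, 0.0))
    return q[key]
```

Repeating such updates over many request-result-evaluation sequences is what lets an agent gradually prefer traversal actions that reach the intended, possibly long-distance, relations.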
Abstract: In this paper, investigation of subsynchronous
resonance (SSR) characteristics of a hybrid series compensated
system and the design of a voltage controller for a three-level 24-pulse
Voltage Source Converter based Static Synchronous Series
Compensator (SSSC) is presented. The hybrid compensation consists of
a series fixed capacitor and an SSSC, which is an active series FACTS
controller. The design of voltage controller for SSSC is based on
damping torque analysis, and Genetic Algorithm (GA) is adopted for
tuning the controller parameters. The SSR characteristics of the SSSC
with constant reactive voltage control mode have been investigated.
The results show that the constant reactive voltage control of SSSC
has the effect of reducing the electrical resonance frequency, which
detunes the SSR. The analysis of SSR with the SSSC is carried out based
on frequency domain method, eigenvalue analysis and transient
simulation. While the eigenvalue and damping torque analysis are
based on D-Q model of SSSC, the transient simulation considers both
D-Q and detailed three phase nonlinear system model using
switching functions.
Abstract: Convergence of power series solutions for a class of
non-linear Abel type equations, including an equation that arises
in nonlinear cooling of semi-infinite rods, is very slow inside their
small radius of convergence. Beyond that radius the corresponding power
series are wildly divergent. Implementation of a nonlinear sequence
transformation allows effortless evaluation of these power series on
very large intervals.
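As an illustration of how a nonlinear sequence transformation tames a slowly convergent series, here is the classical Shanks transformation applied to the alternating series for ln 2. This is a standard textbook example; the paper's specific Abel-type equations and its choice of transformation are not reproduced:

```python
import math

def shanks(seq):
    """One Shanks transformation
    S(A_n) = (A_{n+1} A_{n-1} - A_n^2) / (A_{n+1} - 2 A_n + A_{n-1})
    applied along a sequence of partial sums."""
    out = []
    for i in range(1, len(seq) - 1):
        denom = seq[i + 1] - 2 * seq[i] + seq[i - 1]
        out.append((seq[i + 1] * seq[i - 1] - seq[i] ** 2) / denom)
    return out

def accelerate(partial_sums, rounds=3):
    """Iterate the transformation a few times and return the last value."""
    s = list(partial_sums)
    for _ in range(rounds):
        if len(s) < 3:
            break
        s = shanks(s)
    return s[-1]
```

Twelve raw partial sums of the alternating harmonic series are still about 0.04 away from ln 2, while three iterated Shanks passes over the same twelve terms agree with ln 2 to better than five decimal places, the same effortless-extension effect the abstract describes.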