Abstract: This paper proposes a smart design strategy for a sequential detector to reliably detect the primary user's signal, especially in fast fading environments. We study the computation of the log-likelihood ratio for coping with fast-changing received-signal and noise sample variances, which are treated as random variables. First, we analyze the detectability of the conventional generalized log-likelihood ratio (GLLR) scheme under the fast-changing statistics of unknown parameters caused by fast fading. Second, we propose a sensing algorithm that performs the sequential probability ratio test robustly and efficiently when the channel statistics are unknown. Finally, the proposed scheme is compared to the conventional method through simulation results, with respect to the average number of samples required to reach a detection decision.
Abstract: It has been shown that the solution of the water shortage problem in Central Asia is closely connected with the inclusion of atmospheric water vapour in the system of water resources management. Some methods of extracting water from the atmosphere are discussed.
Abstract: Color image quantization (CQ) is an important problem in computer graphics and image processing. The aim of quantization is to reduce the number of colors in an image with minimum distortion. Clustering is a widely used technique for color quantization: all colors in an image are grouped into a small number of clusters. In this paper, we propose a new hybrid approach for color quantization using the firefly algorithm (FA) and the K-means algorithm. The firefly algorithm is a swarm-based algorithm that can be used for solving optimization problems. The proposed method can overcome the drawbacks of both algorithms, such as the local-optima convergence problem of K-means and the premature convergence of the firefly algorithm. Experiments on three commonly used images and the comparison results show that the proposed algorithm surpasses both the baseline K-means clustering and the original firefly algorithm.
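As a generic illustration of the clustering view of color quantization (a minimal sketch of plain K-means only, not the authors' FA-K-means hybrid; names are illustrative), a palette of k colors can be computed as follows:

```python
import random

def kmeans_quantize(pixels, k, iters=10):
    # pixels: list of (r, g, b) tuples; returns a palette of k centroid colors.
    centroids = random.sample(pixels, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in pixels:
            # Assign each pixel to its nearest centroid (squared Euclidean distance).
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])))
            clusters[i].append(p)
        for i, cl in enumerate(clusters):
            if cl:  # recompute each centroid as the mean color of its cluster
                centroids[i] = tuple(sum(ch) / len(cl) for ch in zip(*cl))
    return centroids
```

Each pixel is then rendered with its nearest palette color, which is exactly the "minimum distortion" objective the clustering minimizes.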
Abstract: Transesterification of candlenut (Aleurites moluccana) oil with methanol using potassium hydroxide as catalyst was studied. The objective of the present investigation was to produce methyl ester for use as biodiesel. The operating variables employed were the methanol-to-oil molar ratio (3:1 – 9:1), catalyst concentration (0.50 – 1.5%), and temperature (303 – 343 K). An oil volume of 150 mL and a reaction time of 75 min were fixed as common parameters in all experiments. The concentration of methyl ester was evaluated by a mass balance on the free glycerol formed, which was analyzed using periodic acid. The optimal triglyceride conversion was attained using a methanol-to-oil ratio of 6:1 and a potassium hydroxide concentration of 1%, at room temperature. The methyl ester formed was characterized by its density, viscosity, and cloud and pour points. The biodiesel had properties similar to those of diesel oil, except for its viscosity, which was higher.
Abstract: The purposes of this research are to study and develop an algorithm for Thai spoonerism using a semi-automatic computer program: at the data-input stage, syllables are already separated, and at the spoonerism stage the developed algorithm is applied. The algorithm establishes rules and mechanisms for Thai spoonerism of bi-syllabic words by analyzing the elements of the syllables, namely the cluster consonant, vowel, intonation mark, and final consonant. The study found that bi-syllabic Thai spoonerism has one spoonerism mechanism, namely the transposition of the vowel, intonation mark, and final consonant of the two syllables, while keeping the initial consonant and the consonant cluster (if any).
From this study, the derived rules and mechanisms were implemented as Thai spoonerism software in PHP. A performance test of the software found that the program performs bi-syllabic Thai spoonerism correctly for 99% of the words used in the test; the 1% of faults arose because a word obtained by spoonerism may not be spelled in conformity with Thai grammar, and a Thai spoonerism may have more than one valid answer.
Abstract: In this article, the accumulated results on the effects and duration of manufacture and production projects at the university and research level are examined, together with a definition of the usefulness of the project-management process, in order to arrive at a suitable pattern for the "time and action" stages. Studies show that many of the problems confronting researchers in these projects are connected with the lack of: 1) autonomous timing for gathering the educational material, 2) autonomous timing for planning and design, presented before construction, and 3) autonomous timing for manufacture and presentation of a sample of the output. The results of this study indicate that every manufacture and production project should be divided into three smaller autonomous projects, each with its own kind, budget, and expenditure, and suggest the shape and order of the stages for managing these kinds of projects. In this case study, real results are compared with theoretical results.
Abstract: Pretreatment of lignocellulosic biomass materials from
poplar, acacia, oak, and fir with different ionic liquids (ILs)
containing 1-alkyl-3-methyl-imidazolium cations and various anions
has been carried out. The dissolved cellulose from biomass was
precipitated by adding anti-solvents into the solution and vigorous
stirring. Commercial cellulases Celluclast 1.5L and Accelerase 1000
have been used for hydrolysis of untreated and pretreated
lignocellulosic biomass. Among the tested ILs, [Emim]COOCH3 showed the best efficiency, resulting in the highest amount of liberated reducing sugars. Combined glycerol-ionic liquid pretreatment and combined dilute acid-ionic liquid pretreatment of lignocellulosic biomass were also evaluated and compared with glycerol pretreatment, ionic liquid pretreatment, and dilute acid pretreatment alone.
Abstract: We propose the use of magneto-optic Kerr effect (MOKE) to realize single-qubit quantum gates. We consider longitudinal and polar MOKE in reflection geometry in which the magnetic field is parallel to both the plane of incidence and surface of the film. MOKE couples incident TE and TM polarized photons and the Hamiltonian that represents this interaction is isomorphic to that of a canonical two-level quantum system. By varying the phase and amplitude of the magnetic field, we can realize Hadamard, NOT, and arbitrary phase-shift single-qubit quantum gates. The principal advantage is operation with magnetically non-transparent materials.
Abstract: Vehicle suspension design must satisfy several conflicting criteria. Among these is ride comfort, which is attained by minimizing the acceleration transmitted to the sprung mass via the suspension spring and damper. Good handling is also a desirable property; it requires a stiff suspension and is therefore in conflict with good ride. Another desirable feature of a suspension is the minimization of its maximum travel. This travel, called the suspension working space in the vehicle dynamics literature, is also a design constraint, and a larger working space favors good ride. In this research, a full-car model with 8 degrees of freedom has been developed, and the three above-mentioned criteria, namely ride, handling, and working space, have been adopted as objective functions. The Multi-Objective Programming (MOP) discipline has been used to find the Pareto front, and some reasoning is used to choose a design point among these non-dominated points of the Pareto front.
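The Pareto-front step can be illustrated with a generic non-dominated filter (a minimal sketch assuming minimization of all objectives; function names are illustrative, not the authors' implementation):

```python
def dominates(a, b):
    # a dominates b (minimization) if a is no worse in every objective
    # and strictly better in at least one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    # Keep only the points that no other point dominates.
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]
```

Here each point would be a tuple of the three objective values (ride, handling, working space); the design point is then chosen among the surviving non-dominated tuples.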
Abstract: The traditional Failure Mode and Effects Analysis
(FMEA) uses Risk Priority Number (RPN) to evaluate the risk level
of a component or process. The RPN index is determined by
calculating the product of severity, occurrence and detection indexes.
The most critically debated disadvantage of this approach is that
various sets of these three indexes may produce an identical value of
RPN. This research paper seeks to address the drawbacks in
traditional FMEA and to propose a new approach to overcome these
shortcomings. The Risk Priority Code (RPC) is used to prioritize
failure modes, when two or more failure modes have the same RPN.
A new method is proposed to prioritize failure modes, when there is a
disagreement in ranking scale for severity, occurrence and detection.
An Analysis of Variance (ANOVA) is used to compare means of
RPN values. SPSS (Statistical Package for the Social Sciences)
statistical analysis package is used to analyze the data. The results
presented are based on two case studies. It is found that the proposed
methodology resolves the limitations of the traditional FMEA
approach.
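The identical-RPN drawback described above is easy to demonstrate in a short sketch (the function name and example ratings are illustrative):

```python
def rpn(severity, occurrence, detection):
    # Traditional FMEA Risk Priority Number: the product of the
    # severity, occurrence, and detection indexes (each rated 1-10).
    return severity * occurrence * detection

# Two very different failure modes can share the same RPN:
high_severity = rpn(8, 3, 1)  # severe, occasional, easily detected
low_severity = rpn(1, 3, 8)   # minor, occasional, hard to detect
assert high_severity == low_severity == 24
```

Since multiplication is commutative, any permutation of the three indexes yields the same RPN even though the underlying risks differ, which is exactly the ambiguity the RPC-based prioritization targets.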
Abstract: Resins are used in nuclear power plants for water ultrapurification. Two approaches are considered in this work: column experiments and simulations. A software package called OPTIPUR was developed, tested, and used. The approach simulates one-dimensional reactive transport in a porous medium, with convective-dispersive transport between particles and diffusive transport within the boundary layer around the particles. The transfer limitation in the boundary layer is characterized by the mass transfer coefficient (MTC). The influences on the MTC were measured experimentally: variation of the inlet concentration does not influence the MTC, whereas the Darcy velocity does. This is consistent with results obtained using the correlation of Dwivedi & Upadhyay. With the MTC, and knowing the number of exchange sites and the relative affinity, OPTIPUR can simulate the column outlet concentration versus time. The duration of use of the resins can then be predicted under conditions of a binary exchange.
Abstract: In this paper, we present a preconditioned AOR-type iterative method for solving linear systems Ax = b, where A is a Z-matrix. We also give some comparison theorems showing that the rate of convergence of the preconditioned AOR-type iterative method is faster than that of the AOR-type iterative method.
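For context, the classical AOR (accelerated overrelaxation) iteration referred to above, with the standard splitting A = D − L − U into diagonal, strictly lower, and strictly upper parts, acceleration parameter r, and relaxation parameter ω, can be written as:

```latex
x^{(k+1)} = (D - rL)^{-1}\left[(1-\omega)D + (\omega - r)L + \omega U\right] x^{(k)} + \omega (D - rL)^{-1} b
```

Setting r = 0 recovers the JOR (extrapolated Jacobi) method and r = ω recovers SOR; a preconditioned variant applies the same iteration to the equivalent system PAx = Pb for a suitable preconditioner P.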
Abstract: Since the one-to-one word translator does not have the
facility to translate pragmatic aspects of Javanese, the parallel text
alignment model described uses a phrase pair combination. The
algorithm aligns the parallel text automatically from the beginning to
the end of each sentence. Even though the results of the phrase pair combination outperform the previous algorithm, it is still inefficient: recording all possible combinations consumes more database space and is time-consuming. The original algorithm is modified by applying the edit distance coefficient to improve data-storage efficiency. As a result, data-storage consumption is reduced by 90%, as is the learning period (42 s).
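The abstract does not specify which edit distance underlies the coefficient; a common choice is the Levenshtein distance, sketched here as an assumption (with the coefficient taken as the distance normalized by the longer string's length):

```python
def levenshtein(a: str, b: str) -> int:
    # Dynamic programming over prefix lengths: prev[j] holds the
    # distance between the current prefix of a and b[:j].
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[len(b)]

def edit_distance_coefficient(a: str, b: str) -> float:
    # Normalized similarity in [0, 1]; 1.0 means identical strings.
    if not a and not b:
        return 1.0
    return 1.0 - levenshtein(a, b) / max(len(a), len(b))
```

A threshold on such a coefficient lets the aligner keep only near-matching phrase pairs instead of recording every possible combination, which is one way the described storage saving could arise.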
Abstract: The fatty acid composition of the lipid fractions of 16 microalgae strains isolated from different basins of Kazakhstan and characterized by stable, active growth in the laboratory was analyzed. Three species of green microalgae (Oocystis rhomboideus, Chlorococcum infusionum, Dictyochlorella globosa) and three species of diatoms (Synedra sp., Nitzschia sp., Pleurosigma attenuatum) are characterized by a high lipid content and are promising for further study as a source of polyunsaturated fatty acids.
Abstract: Complex networks have been intensively studied across
many fields, especially in Internet technology, biological engineering,
and nonlinear science. Software is built up out of many interacting
components at various levels of granularity, such as functions, classes,
and packages, representing another important class of complex networks.
Software can thus also be studied using complex network theory. Over the last decade, many papers on interdisciplinary research between software engineering and complex networks have been published. This research provides a different dimension to our understanding of software and is also very useful for the design and development of software systems. This paper explores how to use complex network theory to analyze software structure and briefly reviews the main advances in the corresponding areas.
Abstract: The existing literature on design reasoning seems to give one-sided accounts of expert design behaviour based on internal processing. In the same way, ecological theories seem to focus one-sidedly on external elements, resulting in the lack of a unifying design cognition theory. Although current extended design cognition studies acknowledge the intellectual interaction between internal and external resources, there still seems to be insufficient understanding of the complexities involved in such interactive processes. This paper therefore proposes a novel multi-directional model for design researchers to map the complex and dynamic conduct-controlling behaviour, in which both the computational and ecological perspectives are integrated in a vertical manner. A clear distinction between the identified intentional and emerging physical drivers, and the relationships between them during the early phases of experts' design process, is demonstrated by presenting a case study in which the model was employed.
Abstract: The recycling of concrete, bricks and masonry rubble
as concrete aggregates is an important way to contribute to a
sustainable material flow. However, there are still various
uncertainties limiting the widespread use of Recycled Concrete
Aggregates (RCA). The fluctuations in the composition of grade
recycled aggregates and their influence on the properties of fresh and
hardened concrete are of particular concern regarding the use of
RCA. Most of the problems occurring when recycled concrete aggregates are used arise from their higher porosity and hence higher water absorption, lower mechanical strength, and residual impurities on the surface of the RCA that form a weaker bond between cement paste and aggregate. The reuse of RCA is therefore still limited. An efficient polymer-based treatment is proposed in order to make the reuse of RCA easier. Silicon-based polymer treatments of RCA were carried out and compared. This kind of treatment can improve the properties of RCA; in particular, the rate of water absorption of treated RCA is significantly reduced.
Abstract: The major problem that wireless communication
systems undergo is multipath fading caused by scattering of the
transmitted signal. However, we can treat multipath propagation as
multiple channels between the transmitter and receiver to improve
the signal-to-scattering-noise ratio. While using Single Input
Multiple Output (SIMO) systems, the diversity receivers extract
multiple signal branches or copies of the same signal received from
different channels and apply gain combining schemes such as Root
Mean Square Gain Combining (RMSGC). RMSGC asymptotically
yields an identical performance to that of the theoretically optimal
Maximum Ratio Combining (MRC) for values of mean Signal-to-
Noise-Ratio (SNR) above a certain threshold value without the need
for SNR estimation. This paper introduces an improvement to RMSGC by addressing two different issues: we find that post-detection processing and de-noising of the received signals improve the performance of RMSGC and lower the threshold SNR.
Abstract: Building intelligent traffic guide systems has been an
interesting subject recently. A good system should be able to observe
all important visual information to be able to analyze the context of
the scene. To do so, signs in general, and traffic signs in particular,
are usually taken into account as they contain rich information to
these systems. Therefore, many researchers have put effort into the field of sign recognition. Sign localization, or sign detection, is the most important step in the sign recognition process. This step filters out non-informative areas in the scene and locates candidates for the later steps. In this paper, we apply a new approach to detecting sign locations using a new color invariant model. Experiments are carried out on different datasets introduced in other works, whose authors noted the difficulty of detecting signs under unfavorable imaging conditions. Our method is simple and fast, and most importantly it gives a high detection rate in locating signs.
Abstract: Electrical Discharge Machine (EDM) is especially
used for the manufacturing of 3-D complex geometry and hard
material parts that are extremely difficult-to-machine by conventional
machining processes. In this paper, the authors review the research work carried out on the development of die-sinking EDM over the past decades for the improvement of machining characteristics such as material removal rate, surface roughness, and tool wear ratio. In this review, the various techniques reported by EDM researchers for improving the machining characteristics are categorized as process parameter optimization, multi-spark techniques, powder-mixed EDM, servo control systems, and pulse discrimination. Finally, a flexible machine controller is suggested for die-sinking EDM to enhance the machining characteristics and to achieve a high level of automation. Die-sinking EDM can thus be integrated into a Computer Integrated Manufacturing environment, meeting a need of agile manufacturing systems.