Abstract: Nowadays, the hard disk is one of the most popular storage components. In the hard disk industry, a hard disk drive must pass through various complex manufacturing processes and test systems, and failures occur at each step. To reduce the waste caused by these failures, their root causes must be found. Conventional data analysis methods are not effective enough to analyze such large volumes of data. In this paper, we propose the Hough method for straight-line detection, which helps to detect the straight-line defect patterns that occur in hard disk drives. The proposed method helps to increase both the speed and the accuracy of failure analysis.
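A minimal sketch of the standard Hough voting scheme for straight-line detection; the abstract names the method but not its parameters, so the accumulator resolution below is an illustrative assumption.

    import numpy as np

    def hough_lines(points, shape, n_theta=180):
        """Accumulate votes in (rho, theta) space for a set of defect points."""
        h, w = shape
        diag = int(np.ceil(np.hypot(h, w)))
        thetas = np.deg2rad(np.arange(n_theta))           # 0..179 degrees
        acc = np.zeros((2 * diag + 1, n_theta), dtype=np.int32)
        cos_t, sin_t = np.cos(thetas), np.sin(thetas)
        for y, x in points:                               # each defect location votes
            rhos = np.round(x * cos_t + y * sin_t).astype(int) + diag
            acc[rhos, np.arange(n_theta)] += 1
        return acc, thetas, diag

    # peaks in `acc` above a vote threshold correspond to straight defect lines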
Abstract: In this study, a dual-stage membrane bioreactor (MBR) system was conceptualized for the treatment of cyanide and heavy metals in electroplating wastewater. The design consisted of a primary treatment stage to reduce the impact of fluctuations and a secondary treatment stage to remove the residual cyanide and heavy metal contaminants from the wastewater under alkaline pH conditions. The primary treatment stage contained hydrolyzed Citrus sinensis (C. sinensis) pomace, and the secondary treatment stage contained active Aspergillus awamori (A. awamori) biomass supplemented solely with C. sinensis pomace extract from the hydrolysis process. Average degradation efficiencies for total cyanide (T-CN) and sorption efficiencies for nickel (Ni), zinc (Zn) and copper (Cu) of 76.37%, 95.37%, 93.26% and 94.76% after the first treatment stage, and of 99.55%, 99.91%, 99.92% and 99.92% after the second treatment stage, were observed. Furthermore, degradation of the cyanide conversion by-products formate (CHOO-) and ammonium (NH4+) was 99.81% and 99.75%, respectively, after the second treatment stage. After the first, second and third regeneration cycles of the C. sinensis pomace in the first treatment stage, the Ni, Zn and Cu removal achieved was 99.13%, 99.12% and 99.04% (first regeneration cycle), 98.94%, 98.92% and 98.41% (second regeneration cycle), and 98.46%, 98.44% and 97.91% (third regeneration cycle), respectively. The standard deviation of all measured parameters in the system was relatively insignificant, which indicated the reproducibility of the remediation efficiency of this continuous system.
Abstract: Orthogonal Frequency Division Multiplexing (OFDM) is an efficient data transmission method for high-speed communication systems. However, the main drawback of OFDM systems is that they suffer from a high Peak-to-Average Power Ratio (PAPR), which causes inefficient use of the high power amplifier and can limit transmission efficiency. An OFDM signal consists of a large number of independent subcarriers, and as a result its amplitude can exhibit high peak values. In this paper, we propose an effective reduction scheme that combines the DCT and SLM techniques: the DCT is applied first, followed by SLM using the Riemann matrix to obtain the phase sequences for the SLM technique. The simulation results show that the PAPR can be greatly reduced by applying the proposed scheme. Whereas plain OFDM exhibited high PAPR values of about 10.4 dB, our proposed method achieved a PAPR reduction of about 4.7 dB with low computational complexity. The approach also avoids randomness in the phase sequence selection, which makes it simpler to decode at the receiver. As an added benefit, the matrices can be generated at the receiver end to recover the data signal, so it is not required to transmit side information (SI).
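A minimal sketch of the SLM candidate-selection stage under stated assumptions: the Riemann matrix is built with the common convention (entry i-1 where i divides j, otherwise -1, for indices 2..n+1), and the DCT is applied as a precoding step; the authors' exact construction may differ.

    import numpy as np
    from scipy.fft import dct

    def riemann_matrix(n):
        i = np.arange(2, n + 2)[:, None]
        j = np.arange(2, n + 2)[None, :]
        return np.where(j % i == 0, i - 1, -1).astype(float)

    def papr_db(x):
        p = np.abs(x) ** 2
        return 10 * np.log10(p.max() / p.mean())

    def dct_slm(symbols):
        """Pick the lowest-PAPR candidate among phase-rotated, DCT-precoded symbols."""
        n = len(symbols)
        precoded = dct(symbols.real, norm='ortho') + 1j * dct(symbols.imag, norm='ortho')
        phases = np.exp(1j * np.pi * riemann_matrix(n))   # one phase sequence per row
        signals = np.fft.ifft(phases * precoded, axis=1)  # one OFDM candidate per row
        best = np.argmin([papr_db(s) for s in signals])
        return signals[best], best                        # deterministic, so no SI needed

    qpsk = (np.random.choice([-1, 1], 64) + 1j * np.random.choice([-1, 1], 64)) / np.sqrt(2)
    print("PAPR: %.2f dB" % papr_db(dct_slm(qpsk)[0]))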
Abstract: In this paper, we propose a method to extract road signs. First, the grabbed image is converted into the HSV color space to detect the road signs. Second, morphological operations are used to reduce noise. Finally, the road sign is extracted using its geometric properties. The feature extraction of the road sign is done using color information. The proposed method has been tested in real situations. The experimental results show that the proposed method can extract road sign features effectively.
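A minimal OpenCV sketch of the pipeline the abstract describes (HSV thresholding, morphological noise reduction, geometric filtering); the red hue ranges, kernel size, and aspect-ratio bounds are illustrative assumptions, not the paper's values.

    import cv2
    import numpy as np

    def extract_sign_candidates(bgr):
        hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
        # red hue wraps around 0/180 in OpenCV's H channel
        mask = cv2.inRange(hsv, (0, 100, 80), (10, 255, 255)) | \
               cv2.inRange(hsv, (170, 100, 80), (180, 255, 255))
        kernel = np.ones((5, 5), np.uint8)
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # remove speckle noise
        mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)  # fill small holes
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        signs = []
        for c in contours:
            x, y, w, h = cv2.boundingRect(c)
            if cv2.contourArea(c) > 400 and 0.7 < w / h < 1.4:  # geometric property check
                signs.append((x, y, w, h))
        return signs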
Abstract: In order to optimize annual IT spending and to reduce the complexity of the overall system architecture, SOA trials have been started. It is common knowledge that designing an SOA system requires a top-down approach, but in reality silo systems are being built, so these companies cannot reuse the newly designed services and cannot enjoy SOA's economic benefits. To prevent this situation, we designed a generic SOA development process referred to as the architecture of "mass customization."
To define the generic detailed development processes, we carried out a case study on an imaginary company. Through the case study, we were able to define practical development processes and found that they could vastly reduce the cost of development updates.
Abstract: How to effectively allocate system resources to process client requests at gateway servers is a challenging problem. In this paper, we propose an improved scheme for the autonomous performance of gateway servers under highly dynamic traffic loads. We devise a methodology that calculates queue length and waiting time from gateway server information in order to reduce response time variance in the presence of bursty traffic. The foremost consideration is performance, because gateway servers must offer cost-effective and highly available services over long periods, so they have to be scaled to meet the expected load. Performance measurements can serve as the basis for performance modeling and prediction: with the help of performance models, performance metrics (such as buffer size and waiting time) can be determined during the development process. This paper describes the queue models that can be applied to estimate the queue length and, from it, the final value of the memory size. Both simulation and experimental studies using synthesized workloads, together with an analysis of real-world gateway servers, demonstrate the effectiveness of the proposed system.
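The abstract does not name the specific queue model, so as one common example, here is a sketch of the steady-state M/M/1 formulas that relate arrival and service rates to queue length (for buffer estimation) and waiting time:

    # Steady-state M/M/1 queue metrics (an illustrative choice of model).
    def mm1_metrics(arrival_rate, service_rate):
        rho = arrival_rate / service_rate      # server utilization
        assert rho < 1, "queue is unstable"
        L = rho / (1 - rho)                    # mean number in system
        W = L / arrival_rate                   # mean time in system (Little's law)
        Lq = rho ** 2 / (1 - rho)              # mean queue length (waiting only)
        Wq = Lq / arrival_rate                 # mean waiting time
        return L, W, Lq, Wq

    # e.g. 80 req/s arriving at a gateway that serves 100 req/s
    print(mm1_metrics(80.0, 100.0))            # -> (4.0, 0.05, 3.2, 0.04)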
Abstract: As global industry develops rapidly, energy demand rises with it. A great deal of energy is consumed in production processes, mainly to generate heat. Of the total energy consumption, 40% of the heat is used for process heat, mechanical work, chemical energy and electricity, while the remaining 50% is released into the environment, causing energy waste and environmental pollution. There are many ways to recover waste heat in a factory. An Organic Rankine Cycle (ORC) system can produce electricity and reduce energy costs by recovering low-temperature waste heat in the factory; moreover, the ORC is the technology with the highest power-generating efficiency in low-temperature heat recycling. However, most factory executives still hesitate because of the high implementation cost of an ORC system, even though a great deal of heat is wasted. Therefore, this study constructs a nonlinear mathematical model of waste heat recovery equipment configuration to maximize profits. A particle swarm optimization algorithm is developed to generate the optimal facility installation plan for the ORC system.
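A minimal sketch of the particle swarm optimization loop used for such a configuration search; the single-variable profit model below is a hypothetical stand-in for the paper's nonlinear model.

    import numpy as np

    def pso(objective, lb, ub, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
        """Minimal particle swarm optimizer (maximization) over box bounds."""
        rng = np.random.default_rng(0)
        x = rng.uniform(lb, ub, (n_particles, len(lb)))   # positions
        v = np.zeros_like(x)                              # velocities
        pbest, pbest_val = x.copy(), np.array([objective(p) for p in x])
        g = pbest[pbest_val.argmax()].copy()              # global best
        for _ in range(iters):
            r1, r2 = rng.random((2, *x.shape))
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
            x = np.clip(x + v, lb, ub)
            vals = np.array([objective(p) for p in x])
            better = vals > pbest_val
            pbest[better], pbest_val[better] = x[better], vals[better]
            g = pbest[pbest_val.argmax()].copy()
        return g, pbest_val.max()

    # hypothetical profit: revenue from recovered kW minus installed-capacity cost
    profit = lambda z: 120 * min(z[0], 400) - 0.08 * z[0] ** 1.2
    print(pso(profit, lb=np.array([0.0]), ub=np.array([1000.0])))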
Abstract: This paper proposes a modified Elastic Strip method for a mobile robot to avoid obstacles in real time in an uncertain environment. The method deals with the problem of driving the robot from an initial position to a target position based on an elastic force and a potential field force. To avoid obstacles, the robot has to modify its trajectory based on the signals received from the sensor system at each sampling time. It was evident that, by combining the modified Elastic Strip with a pseudomedian filter to process the nonlinear sensor data, the uncertainties in the data received from the sensor system can be reduced. Simulations and experiments with these methods were carried out.
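A minimal sketch of a one-dimensional pseudomedian filter of the kind used to clean the sensor readings; the window length is an illustrative choice. The pseudomedian averages the max-of-mins and min-of-maxes over all sub-windows, a cheap approximation to the median.

    import numpy as np

    def pseudomedian_filter(x, window=5):
        m = (window + 1) // 2                  # sub-window length
        pad = window // 2
        xp = np.pad(x, pad, mode='edge')
        out = np.empty_like(x, dtype=float)
        for i in range(len(x)):
            subs = np.lib.stride_tricks.sliding_window_view(xp[i:i + window], m)
            out[i] = 0.5 * (subs.min(axis=1).max() + subs.max(axis=1).min())
        return out

    noisy = np.array([1.0, 1.1, 9.0, 1.2, 1.0, 1.1, 1.3])  # spike from a bad reading
    print(pseudomedian_filter(noisy))          # spike attenuated; repeat to suppress further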
Abstract: Testing accounts for a major share of the technical effort in the software development process: typically, it consumes more than 50 percent of the total cost of developing a piece of software. The selection of software tests is a very important activity within this process to ensure that the software reliability requirements are met. Generally, tests are run to achieve maximum coverage of the software code, and very little attention is given to the achieved reliability of the software. Using an existing methodology, this paper describes how to use Bayesian Belief Networks (BBNs) to select unit tests based on their contribution to the reliability of the module under consideration. In particular, the work examines how the approach can enhance test-first development by assessing the quality of test suites resulting from this development methodology and providing insight into additional tests that can significantly improve the achieved reliability. In this way the method can produce an optimal selection of inputs, and the order in which the tests are executed, to maximize the software reliability. To illustrate this approach, a belief network is constructed for a modern software system, incorporating expert opinion, expressed through probabilities, on the relative quality of the elements of the software and the potential effectiveness of the software tests. The steps involved in constructing the Bayesian network are explained, as is a method to account for the test suite resulting from test-driven development.
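A minimal sketch of the Bayesian update at the heart of such a network; the prior and the per-test pass probabilities below are invented for illustration, whereas the paper elicits them from expert opinion.

    def posterior_reliability(prior, tests):
        """Update belief that the module is correct after each observed test pass."""
        p_correct = prior
        for p_pass_if_correct, p_pass_if_faulty in tests:
            num = p_pass_if_correct * p_correct
            den = num + p_pass_if_faulty * (1 - p_correct)
            p_correct = num / den              # Bayes' rule per passing test
        return p_correct

    # (P(pass | correct), P(pass | faulty)) for three unit tests; a test that a
    # faulty module rarely passes contributes the most to assessed reliability
    tests = [(0.99, 0.60), (0.99, 0.40), (0.99, 0.70)]
    print(posterior_reliability(0.50, tests))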
Abstract: Environmental statistics reveal that acid rain pollution in South Korea is a serious issue, yet public awareness of it is low. Even after a gradual decrease in pollutant emissions in Korea, the acidity has not been reduced. The atmosphere has no boundaries, and the influence of neighboring countries such as China is apparent. Governmental efforts among China, Japan and Korea have been made on this issue; however, not much progress has been observed. Along with the governmental activities, therefore, active monitoring of the pollution among the countries and the promotion of environmental awareness at the civil level, especially in middle and high schools, are highly recommended. It is this young generation, not the current one, that will inherit the damaged country.
Abstract: Currently, web usage generates a huge amount of data from user activity. In general, a proxy server is a system that supports users' web access, and its management can be measured by hit rates. This research tries to improve the hit rates of a proxy system by applying data mining techniques. The data sets are collected from proxy servers in the university, and the relationships among several features are investigated. The resulting model is used to predict future website accesses. The association rule technique is applied to obtain the relations among Date, Time, Main Group web, Sub Group web, and Domain name for the created model. The results showed that this technique can predict web content for the next day; moreover, the prediction of future website accesses increased from 38.15% to 85.57%.
This model can predict web page accesses, which in turn tends to increase the efficiency of proxy servers. In addition, the performance of Internet access will be improved, helping to reduce network traffic.
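A minimal sketch of support/confidence rule mining over toy proxy-log transactions; the log fields and thresholds below are illustrative, standing in for the Date, Time, group, and domain features the paper uses.

    from itertools import combinations
    from collections import Counter

    logs = [                                   # toy transactions from a proxy log
        {"Mon", "morning", "news", "cnn.com"},
        {"Mon", "morning", "news", "bbc.com"},
        {"Tue", "morning", "news", "cnn.com"},
        {"Mon", "evening", "video", "youtube.com"},
    ]

    def rules(transactions, min_support=0.5, min_confidence=0.8):
        """Mine 1 -> 1 association rules by exhaustive support counting."""
        n = len(transactions)
        singles = Counter(i for t in transactions for i in t)
        pairs = Counter(frozenset(p) for t in transactions
                        for p in combinations(sorted(t), 2))
        out = []
        for pair, c in pairs.items():
            if c / n < min_support:
                continue
            a, b = tuple(pair)
            for x, y in ((a, b), (b, a)):
                if c / singles[x] >= min_confidence:
                    # antecedent, consequent, support, confidence
                    out.append((x, y, c / n, c / singles[x]))
        return out

    print(rules(logs))   # e.g. ('morning', 'news', 0.75, 1.0)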
Abstract: Pavement constructions on soft and expansive soils are not durable and are unable to sustain heavy traffic loading. As a result, pavement failures and settlement problems occur very often, even under light traffic loading, due to cyclic and rolling effects. Geotechnical engineers have delved deeply into this matter and adopted various methods to improve the engineering characteristics of soft fine-grained soils and expansive soils. The problematic soils are either replaced with better-quality material or treated by chemical stabilization with various binding materials. Increasing strength and durability is also part of the sustainability drive to reduce the environmental footprint of the built environment through the efficient use of resources and recycled waste materials. This paper presents a series of laboratory tests and evaluates the effect of cement and fly ash on the strength and drainage characteristics of soil in Miri. The tests were performed at different percentages of cement and fly ash by dry weight of soil. Additional tests were also performed on soils treated with combinations of fly ash with cement and lime. The results of this study indicate an increase in unconfined compressive strength and a decrease in hydraulic conductivity of the treated soil.
Abstract: Cerium-doped lanthanum bromide LaBr3:Ce(5%) crystals are considered to be among the most advanced scintillator materials used in PET scanning, combining a high light yield, fast decay time and excellent energy resolution. Apart from the correct choice of scintillator, it is also important to optimise the detector geometry, not least in terms of source-to-detector distance, in order to obtain reliable measurements and efficiency. In this study a commercially available 25 mm x 25 mm BrilLanCe™ 380 LaBr3:Ce(5%) detector was characterised in terms of its efficiency at varying source-to-detector distances. Gamma-ray spectra of 22Na, 60Co, and 137Cs were separately acquired at distances of 5, 10, 15, and 20 cm. As a result of the change in the solid angle subtended by the detector, the geometric efficiency decreased with increasing distance. High efficiencies at short distances can cause pulse pile-up, where subsequent photons are detected before previously detected events have decayed. To reduce this systematic error, the source-to-detector distance should balance efficiency against pulse pile-up suppression, as otherwise pile-up corrections would be necessary at short distances. In addition to the experimental measurements, Monte Carlo simulations have been carried out for the same setup, allowing a comparison of results. The advantages and disadvantages of each approach are highlighted.
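A short sketch of the on-axis point-source approximation that links the subtended solid angle to geometric efficiency; the 12.5 mm crystal radius is taken from the quoted 25 mm diameter, and the point-source assumption is ours.

    import numpy as np

    def geometric_efficiency(distance_cm, radius_cm=1.25):
        """Fraction of 4*pi subtended by a disc face at an on-axis point source."""
        omega = 2 * np.pi * (1 - distance_cm / np.hypot(distance_cm, radius_cm))
        return omega / (4 * np.pi)

    for d in (5, 10, 15, 20):
        print(f"{d} cm: {geometric_efficiency(d):.5f}")
    # falls roughly as 1/d^2 at large d, matching the observed efficiency trend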
Abstract: This paper presents a heuristic to solve the large-size 0-1 multi-constrained knapsack problem (01MKP), which is NP-hard. Many researchers have used heuristic operators to identify the redundant constraints of a linear programming problem before applying the regular procedure to solve it. We use the intercept matrix to identify the zero-valued variables of the 01MKP, which are known as redundant variables. In this heuristic, first the dominance property of the intercept matrix of the constraints is exploited to reduce the search space for optimal or near-optimal solutions of the 01MKP; second, we improve the solution by using the pseudo-utility ratio based on the surrogate constraint of the 01MKP. The heuristic is tested on benchmark problems of sizes up to 2500 taken from the literature, and the results are compared with the optimum solutions. The space and computational complexity of solving the 01MKP using this approach are also presented. The encouraging results, especially for relatively large test problems, indicate that this heuristic can successfully be used to find good solutions for highly constrained NP-hard problems.
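A minimal sketch of the pseudo-utility step only: items are ranked by profit over surrogate-weighted resource use and packed greedily. The choice of inverse capacities as surrogate multipliers is an illustrative assumption; the paper's intercept-matrix dominance step is not reproduced here.

    import numpy as np

    def greedy_01mkp(profit, weights, capacity):
        m, n = weights.shape                   # m constraints, n items
        mult = 1.0 / capacity                  # assumed surrogate multipliers
        utility = profit / (mult @ weights)    # pseudo-utility ratio per item
        x, used = np.zeros(n, dtype=int), np.zeros(m)
        for j in np.argsort(-utility):         # best ratio first
            if np.all(used + weights[:, j] <= capacity):
                x[j] = 1
                used += weights[:, j]
        return x, profit @ x

    profit = np.array([10.0, 13.0, 7.0, 8.0])
    weights = np.array([[2.0, 4.0, 1.0, 3.0],  # constraint 1
                        [3.0, 2.0, 2.0, 1.0]]) # constraint 2
    capacity = np.array([6.0, 5.0])
    print(greedy_01mkp(profit, weights, capacity))   # -> picks items 1 and 2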
Abstract: An efficient architecture for a low-jitter All-Digital Phase-Locked Loop (ADPLL) suitable for high-speed SoC applications is presented in this paper. The ADPLL is designed using standard cells and described in a Hardware Description Language (HDL). Implemented in a 90 nm CMOS process, the ADPLL can operate from 10 to 200 MHz and achieves worst-case frequency acquisition in 14 reference clock cycles. The simulation results show that the PLL has a cycle-to-cycle jitter of 164 ps and a period jitter of 100 ps at 100 MHz. Since the digitally controlled oscillator (DCO) achieves both high resolution and a wide frequency range, it can meet the demands of system-level integration. The proposed ADPLL can easily be ported to different processes in a short time, reducing the design time and design complexity of the ADPLL and making it very suitable for System-on-Chip (SoC) applications.
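The abstract does not state the acquisition algorithm, but a 14-cycle worst case is consistent with a binary search over a 14-bit DCO control word (one reference cycle per bit); a behavioral sketch of that assumed scheme:

    def acquire(target_count, dco_count, bits=14):
        """Successive approximation of the DCO control word, MSB first.
        dco_count(code) returns DCO cycles counted in one reference period."""
        code = 0
        for b in reversed(range(bits)):        # one reference cycle per bit
            trial = code | (1 << b)
            if dco_count(trial) <= target_count:
                code = trial                   # keep bit while not overshooting
        return code

    dco = lambda code: 10 + code // 100        # toy monotonic DCO model
    print(acquire(target_count=120, dco_count=dco))   # locks within 14 cycles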
Abstract: Image searching has always been a problem, especially when the images are not properly managed or are distributed over different locations. Currently, different techniques are used for image search. At one extreme, many features of the image are captured and stored to get better results, but storing and managing such features is itself a time-consuming job. At the other extreme, if fewer features are stored, the accuracy rate is not satisfactory. The same image stored with different visual properties can further reduce the accuracy rate. In this paper we present a new concept: using polynomials of the sorted histogram of the image. This approach needs less overhead and can cope with differences in the visual features of an image.
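A minimal sketch of a sorted-histogram polynomial signature; the polynomial degree and the distance measure are illustrative assumptions, since the abstract does not specify them.

    import numpy as np

    def histogram_signature(gray, degree=6):
        """Fit a low-degree polynomial to the sorted grey-level histogram;
        sorting discards bin order, giving some tolerance to visual changes."""
        hist, _ = np.histogram(gray, bins=256, range=(0, 256))
        hist = np.sort(hist) / hist.sum()
        x = np.linspace(0.0, 1.0, 256)
        return np.polyfit(x, hist, degree)     # signature = a few coefficients

    def distance(sig_a, sig_b):
        return np.linalg.norm(sig_a - sig_b)   # smaller means more similar

    img = np.random.randint(0, 256, (64, 64))  # stand-in for a real image
    print(histogram_signature(img))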
Abstract: In this paper, we propose a Perceptually Optimized Embedded ZeroTree Image Coder (POEZIC) that applies a perceptual weighting to the wavelet transform coefficients before they control the SPIHT encoding algorithm, in order to reach a targeted bit rate with improved perceptual quality with respect to the coding quality obtained using the SPIHT algorithm alone. The paper also introduces a new objective quality metric based on a psychovisual model integrating the properties of the human visual system (HVS), which plays an important role in our POEZIC quality assessment. The POEZIC coder is based on a vision model that incorporates various masking effects of HVS perception. Thus, the coder weights the wavelet coefficients according to that model and attempts to increase the perceptual quality for a given bit rate and observation distance. The perceptual weights for all wavelet subbands are computed based on (1) luminance masking and contrast masking, (2) the contrast sensitivity function (CSF), which provides the perceptual decomposition weighting, and (3) the wavelet error sensitivity (WES), used to reduce the perceptual quantization errors. The new perceptually optimized codec has the same complexity as the original SPIHT technique; however, the experimental results show that our coder demonstrates very good performance in terms of quality measurement.
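A minimal sketch of per-subband perceptual weighting applied before zerotree coding; the wavelet, decomposition depth, and per-level weights below are invented stand-ins for the paper's CSF/WES-derived values.

    import numpy as np
    import pywt

    def weight_subbands(image, wavelet="bior4.4", levels=3, level_weight=None):
        """Scale each detail subband by a per-level weight before coding."""
        level_weight = level_weight or {1: 0.5, 2: 0.8, 3: 1.0}  # assumed weights
        coeffs = pywt.wavedec2(image, wavelet, level=levels)
        weighted = [coeffs[0]]                 # approximation band left unscaled
        for k, details in enumerate(coeffs[1:]):
            w = level_weight[levels - k]       # coeffs[1] is the coarsest level
            weighted.append(tuple(w * d for d in details))
        return weighted

    img = np.random.rand(64, 64)
    subbands = weight_subbands(img)            # feed these to the SPIHT coder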
Abstract: A large quantity of the world's oil reserves exists in carbonate reservoirs. Carbonate reservoirs are very sensitive to chemical enhanced oil recovery processes because they contain large amounts of calcite, dolomite and calcium sulfate minerals. These minerals cause major obstacles during alkali-surfactant-polymer (ASP) flooding: alkali reacts with them and forms undesired precipitates, which plug effective pore openings, reduce permeability and cause scale at the wellbore. In this paper, a new chemical combination consisting of acrylic acid and alkali was used to minimize the precipitation problem during ASP flooding. A series of fluid-fluid compatibility tests was performed using acrylic acid and different concentrations of alkali. Two types of alkali, namely sodium carbonate and sodium metaborate, were screened. The combination of acrylic acid and sodium carbonate was not effective in preventing calcium and magnesium precipitation; however, acrylic acid and sodium metaborate showed promising results, keeping all solutions free of any precipitation. A ratio of acrylic acid to sodium metaborate of 0.7:1.0 was found to be optimum for achieving a compatible solution for 30 days at 80°C.
Abstract: In this paper, we investigate the strategic stochastic air traffic flow management problem, which seeks to balance airspace capacity and demand under weather disruptions. The goal is to reduce the need for myopic tactical decisions that do not account for probabilistic knowledge about the near-future states of the National Airspace System (NAS). We present and discuss a scenario-based modeling approach based on a time-space stochastic process to depict weather disruption occurrences in the NAS. A solution framework is also proposed, along with a distributed implementation aimed at overcoming scalability problems. Issues related to this implementation are also discussed.
Abstract: EGOTHOR is a search engine that indexes the Web and allows us to search Web documents. Its hit list contains the URL and title of each hit, together with a snippet that briefly shows a match. The snippet can almost always be assembled by an algorithm that has full knowledge of the original document (usually an HTML page). This implies that the search engine is required to store the full text of the documents as part of the index.
Such a requirement leads us to pick an appropriate compression algorithm to reduce the space demand. One solution would be to use common compression methods, for instance gzip or bzip2, but it might be preferable to develop a new method that takes advantage of the document structure, or rather, the textual character of the documents.
There already exist special text compression algorithms and methods for the compression of XML documents. The aim of this paper is an integration of the two approaches to achieve an optimal compression ratio.
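A minimal sketch of the gzip/bzip2 baseline the abstract mentions, measuring the compression ratio a structure-aware method would have to beat; the sample document is synthetic.

    import bz2, gzip

    def ratios(html_bytes):
        """Compressed-to-original size ratio for two common general-purpose codecs."""
        return {
            "gzip": len(gzip.compress(html_bytes)) / len(html_bytes),
            "bzip2": len(bz2.compress(html_bytes)) / len(html_bytes),
        }

    doc = b"<html><body>" + b"<p>search engines index the web</p>" * 200 + b"</body></html>"
    print(ratios(doc))   # a structure-aware method should push these lower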