Abstract: This paper presents a new pilot-aided channel estimation algorithm for OFDM systems in the new generation of high-data-rate communication systems. In orthogonal frequency division multiplexing (OFDM) systems over fast-varying fading channels, channel estimation and tracking are generally carried out by transmitting known pilot symbols at given positions of the frequency-time grid. We propose an improved algorithm that, for a specific distribution of pilot signals in the OFDM frequency-time grid, computes the mean and variance of adjacent pilot signals and then recovers all unknown channel coefficients from the resulting mean and variance equations. Simulation results show that the performance of the OFDM system improves as the channel length increases, since the accuracy of the channel estimated by this low-complexity algorithm increases. Moreover, the number of pilot signals that must be inserted into the OFDM signal is reduced, which increases the throughput of the OFDM system compared with other pilot distributions such as comb-type and block-type channel estimation.
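The abstract does not specify the mean-and-variance recovery step in enough detail to reproduce it; as a point of reference, the comb-type baseline it compares against can be sketched as least-squares estimation at the pilot subcarriers followed by interpolation. All parameters below (subcarrier count, pilot spacing, channel profile, noise level) are invented for illustration:

```python
import numpy as np

# Illustrative sketch of the comb-type baseline mentioned in the abstract:
# least-squares (LS) channel estimation at pilot subcarriers, then linear
# interpolation across the grid. Pilot spacing, channel, and noise level
# are invented assumptions, not values from the paper.
rng = np.random.default_rng(0)
N = 64                                   # number of subcarriers (assumed)
pilot_idx = np.arange(0, N, 8)           # comb-type pilots on every 8th carrier
pilots = np.ones(pilot_idx.size, dtype=complex)   # known pilot symbols

# Synthetic frequency response and received symbols
h_true = np.exp(-0.05 * np.arange(N)) * np.exp(1j * 0.1 * np.arange(N))
noise = 0.01 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
y = h_true * 1.0 + noise                 # transmitted symbols taken as 1

# LS estimate at pilot positions: H_p = Y_p / X_p
H_p = y[pilot_idx] / pilots

# Linear interpolation of real and imaginary parts to all subcarriers
k = np.arange(N)
H_hat = np.interp(k, pilot_idx, H_p.real) + 1j * np.interp(k, pilot_idx, H_p.imag)

mse = float(np.mean(np.abs(H_hat - h_true) ** 2))
print(mse < 0.01)
```

The proposed scheme aims to reach comparable accuracy with fewer pilots than this kind of comb-type interpolation.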
Abstract: We review a knowledge extractor model for
constructing 3G killer applications. The success of 3G is essential
for the government, as it has become part of the national
telecommunications strategy. 3G wireless technologies may reach
larger areas and increase the country's ICT penetration. In order to
understand future customers' needs, the operators require the proper
information (knowledge) lying within them. Our work approaches future
customers as a complex system in which complex knowledge may expose
regular behavior. The hidden information about future 3G customers is
revealed by using fractal-based questionnaires. Afterward, further
statistical analysis is used to match the results with the operator's
strategic plan. The development of 3G applications also considers their
saturation time and further improvement of the applications.
Abstract: Uncertainties in a serial production line affect its
production throughput. These uncertainties cannot be prevented in a
real production line; however, the uncertain conditions can be
controlled by a robust prediction model. Thus, a hybrid model
combining an autoregressive integrated moving average (ARIMA) model
and multiple polynomial regression is proposed to model the nonlinear
relationship between production uncertainties and throughput. The
uncertainties considered in this study are demand, break-time,
scrap, and lead-time. The nonlinear relationship between production
uncertainties and throughput is examined in the form of quadratic
and cubic regression models, whose adjusted R-squared values are
98.3% and 98.2%, respectively. We optimize the multiple quadratic
regression (MQR) by considering the time-series trend of the
uncertainties using an ARIMA model. Finally, the hybrid ARIMA-MQR
model is formulated, with a better adjusted R-squared of 98.9%.
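As a sketch of the regression half of the hybrid (the ARIMA stage is omitted), the quadratic fit and the adjusted R-squared statistic quoted above can be computed as follows. The single uncertainty variable and the synthetic data are assumptions for illustration, not the study's data:

```python
import numpy as np

# Minimal sketch (not the authors' model): a single-variable quadratic
# regression of throughput on one uncertainty, with the adjusted R-squared
# statistic reported in the abstract. The data below are synthetic.
rng = np.random.default_rng(1)
demand = rng.uniform(50, 150, size=40)
throughput = 2.0 + 0.8 * demand - 0.002 * demand**2 + rng.normal(0, 1.0, 40)

# Design matrix for a quadratic model: [1, x, x^2]
X = np.column_stack([np.ones_like(demand), demand, demand**2])
beta, *_ = np.linalg.lstsq(X, throughput, rcond=None)
pred = X @ beta

# Adjusted R-squared penalizes the extra polynomial terms
ss_res = float(np.sum((throughput - pred) ** 2))
ss_tot = float(np.sum((throughput - throughput.mean()) ** 2))
n, p = X.shape
r2 = 1.0 - ss_res / ss_tot
adj_r2 = 1.0 - (1.0 - r2) * (n - 1) / (n - p)
print(0.9 < adj_r2 <= 1.0)
```

The full MQR model simply extends the design matrix with the remaining uncertainties and their quadratic terms.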
Abstract: In this study, aeroelastic response and performance
analyses have been conducted for a 5MW-Class composite wind
turbine blade model. An advanced coupled numerical method based on
computational fluid dynamics (CFD) and computational flexible
multi-body dynamics (CFMBD) has been developed in order to
investigate aeroelastic responses and performance characteristics of
the rotating composite blade. Reynolds-Averaged Navier-Stokes
(RANS) equations with k-ω SST turbulence model were solved for
unsteady flow problems on the rotating turbine blade model. Also,
structural analyses considering rotating effect have been conducted
using the general nonlinear finite element method. A fully implicit
time marching scheme based on the Newmark direct integration
method is applied to solve the coupled aeroelastic governing equations
of the 3D turbine blade for fluid-structure interaction (FSI) problems.
Detailed dynamic responses and instantaneous velocity contours on the
blade surfaces, considering flow-separation effects, are
presented to show the multi-physical phenomena of the huge rotating
wind-turbine blade model.
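The Newmark direct integration scheme referred to above advances the displacement $u$ and velocity $\dot{u}$ with the standard textbook update formulas (a sketch; the paper's specific choice of the parameters $\beta$ and $\gamma$ is not stated in the abstract):

```latex
u_{n+1} = u_n + \Delta t\,\dot{u}_n
        + \Delta t^{2}\left[\left(\tfrac{1}{2}-\beta\right)\ddot{u}_n + \beta\,\ddot{u}_{n+1}\right],
\qquad
\dot{u}_{n+1} = \dot{u}_n + \Delta t\left[(1-\gamma)\,\ddot{u}_n + \gamma\,\ddot{u}_{n+1}\right]
```

The fully implicit character of the scheme comes from the $\ddot{u}_{n+1}$ terms; the common average-acceleration choice $\beta = 1/4$, $\gamma = 1/2$ is unconditionally stable for linear problems.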
Abstract: This paper presents two new efficient algorithms
for contour approximation. The proposed algorithms are compared
with the Ramer (good quality), Triangle (faster) and Trapezoid (fastest)
methods, which are briefly described in this work. The Cartesian
coordinates of an input contour are processed in such a manner that
the contour is finally represented by a set of selected vertices of its
edge. The main idea of the analyzed procedures for contour
compression is presented. For comparison, the mean square error and
signal-to-noise ratio criteria are used. The computational time of the
analyzed methods is estimated based on the number of numerical
operations. Experimental results are reported in terms of image
quality, compression ratio, and speed. The main advantage of the
analyzed algorithms is the small number of arithmetic operations
compared to the existing algorithms.
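The Ramer method used above as the quality reference recursively keeps the vertex farthest from the chord joining a segment's endpoints. A compact sketch (the tolerance and test contour are illustrative assumptions):

```python
import numpy as np

# Compact sketch of the Ramer (Douglas-Peucker) contour approximation used
# as the quality reference; eps and the test contour are invented examples.
def ramer(points, eps):
    """Keep the vertex farthest from the start-end chord if it exceeds eps."""
    points = np.asarray(points, dtype=float)
    start, end = points[0], points[-1]
    chord = end - start
    norm = float(np.hypot(chord[0], chord[1])) or 1.0
    diff = points - start
    # Perpendicular distance of each point to the chord (2-D cross product)
    d = np.abs(chord[0] * diff[:, 1] - chord[1] * diff[:, 0]) / norm
    i = int(np.argmax(d))
    if d[i] > eps:
        left = ramer(points[: i + 1], eps)
        right = ramer(points[i:], eps)
        return np.vstack([left[:-1], right])  # drop the duplicated split point
    return np.vstack([start, end])

# A noisy "L"-shaped polyline collapses to its three corner vertices
contour = [(0, 0), (1, 0.05), (2, -0.04), (3, 0), (3, 1), (3.05, 2), (3, 3)]
approx = ramer(contour, eps=0.2)
print(len(approx))  # 3 selected vertices
```

The faster Triangle and Trapezoid methods replace the perpendicular-distance test with cheaper area-based criteria, which is where the savings in arithmetic operations come from.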
Abstract: This paper focuses on the robust design and optimization
of industrial production wastes. Past literature was reviewed to case-study
Clamason Industries Limited (CIL), a leading ladder-tops
manufacturer. A painstaking study of the firm's practices on the shop
floor revealed that over-production, waiting time, excess inventory,
and defects are the major wastes impeding its progress and
profitability. Design-Expert 8 software was used to apply Taguchi
robust design and response surface methodology in order to model,
analyse and optimise the cost of wastes in CIL. Waiting time and
over-production rank first and second in contributing to the costs of
wastes in CIL. For minimal wastes cost, the control factors of
over-production, waiting time, defects and excess inventory must be set at
0.30, 390.70, 4 and 55.70, respectively, for CIL. The optimal value of
the cost of wastes for the months studied was 22.3679. Finally, it is
recommended that, to enhance profitability and customer satisfaction,
the company adopt Shigeo Shingo's Single Minute Exchange of Dies
(SMED), which will immediately tackle the waste of waiting by
drastically reducing setup times.
Abstract: This paper proposes a set of quasi-static mathematical
models of the magnetic fields caused by the high-voltage conductors of
a distribution transformer, using a set of second-order partial
differential equations. Modifications for complex magnetic field
analysis and time-harmonic simulation are also utilized. In this
research, transformers were studied under both balanced and unbalanced
loading conditions. Computer-based simulation utilizing the
three-dimensional finite element method (3-D FEM) is exploited as a tool
for visualizing the magnetic field distribution throughout a distribution
transformer. The finite element method (FEM) is one of the popular
numerical methods able to handle problem complexity in
various forms. At present, the FEM is widely applied in most
engineering fields. Even for problems of magnetic field distribution,
the FEM is able to estimate solutions of Maxwell's equations
governing power transmission systems. The computer simulation
based on the FEM has been developed in the MATLAB
programming environment.
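The second-order PDE is not written out in the abstract; in the standard magnetic vector potential formulation, the quasi-static and time-harmonic forms are (a sketch, with $\mathbf{A}$ the magnetic vector potential, $\mu$ the permeability, $\sigma$ the conductivity, $\mathbf{J}_s$ the source current density and $\omega$ the angular frequency):

```latex
\nabla\times\left(\frac{1}{\mu}\,\nabla\times\mathbf{A}\right) + \sigma\,\frac{\partial\mathbf{A}}{\partial t} = \mathbf{J}_s,
\qquad
\nabla\times\left(\frac{1}{\mu}\,\nabla\times\mathbf{A}\right) + j\omega\sigma\,\mathbf{A} = \mathbf{J}_s
```

The time-harmonic form on the right is what the complex-field, time-harmonic simulations mentioned above discretize with the 3-D FEM.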
Abstract: Anaerobic treatment has many advantages over other
biological methods, particularly when used to treat complex
wastewater such as petroleum refinery wastewater. In this study, two
up-flow anaerobic sludge blanket (UASB) reactors were operated
in parallel to treat six volumetric organic loads (0.58, 1.21, 0.89,
2.34, 1.47 and 4.14 kg COD/m3·d) in order to evaluate the chemical
oxygen demand (COD) removal efficiency. The reactors continuously
adapted to the changing operating conditions with an increase, or only
a slight decrease, in removal efficiency until the last load, which was
more than twice the preceding load; at this load the reactor was
stressed and the removal efficiency decreased to 75%, with an effluent
concentration of 1746 mg COD/L. Other parameters were also monitored,
such as pH, alkalinity, volatile fatty acids and gas production rate.
The UASB reactor was suitable for treating petroleum refinery
wastewater, and the highest COD removal rate was 83% at 1.21 kg
COD/m3·d, with a COD concentration of about 356 mg/L in the effluent.
Abstract: The Amarindra-vinitchai-mahaisuraya Bhiman throne hall
is one of the most significant throne halls in the Grand Palace of the
Ratthanakosin city, situated in Bangkok, Thailand. It belongs to the first
group of throne halls, built to serve as a place for meetings and for
performing state affairs and royal duties, a role it retains to the
present time. The structure and pattern of the architectural design,
including the decoration and interior design of the throne hall,
clearly exhibit and convey the status of the king in the context of
Thai society in the early period of the Ratthanakosin era. According
to the tradition of ruling the kingdom in absolute monarchy, which had
been in place since the Ayutthaya era (A.D. 1350-1767), the king was
deemed a Deva Raja, the highest power and authority over the kingdom
and the greatest emperor of the universe (Chakkravatin). The
architectural design adopted the concept of the “Prasada”, or Viman,
which served as the dwelling place of the gods and was presented in
the form of Thai traditional architecture. The interior design of the
throne hall was conceived to represent heaven and the centre of the
universe, in line with the cosmological beliefs of ancient people
described in the scripture Tribhumikatha (Tri Bhumi), written by Phra
Maha Thamma Raja (Phraya Lithai) of the Sukhothai era (A.D. 1347-1368).
According to this belief, the throne hall was designed to represent
Mount Meru, the centre of the universe. At the top of Mount Meru is
situated the Viman, the dwelling place of Indra, the king of the gods,
in accordance with the idea of the Deva Raja (the god-king avatar). At
the same time, Indra was also regarded as the king of the universe.
Abstract: This paper presents the transient population dynamics of phase singularities in the 2D Beeler-Reuter model. Two stochastic modelling approaches are examined: (i) the master equation approach with transition rates λ(n, t) = λ(t)n and μ(n, t) = μ(t)n, and (ii) the nonlinear Langevin equation approach with multiplicative noise. The exact general solution of the master equation with arbitrary time-dependent transition rates is given. The exact solution of the mean-field equation for the nonlinear Langevin equation is also given. It is demonstrated that the transient population dynamics is successfully described by a generalized logistic equation with a fractional higher-order nonlinear term. The necessity of introducing a time-dependent transition rate in the master equation approach, in order to incorporate the effect of nonlinearity, is also demonstrated.
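The generalized logistic equation with a fractional higher-order nonlinear term referred to above is of the form (a sketch in the abstract's notation; the exact exponent ν and coefficients are not given there):

```latex
\frac{d\langle n\rangle}{dt} \;=\; \lambda(t)\,\langle n\rangle \;-\; \mu(t)\,\langle n\rangle^{\nu},
\qquad \nu > 1 \ \text{(possibly fractional)}
```

With ν = 2 this reduces to the ordinary logistic equation; the fractional ν is what distinguishes the transient population dynamics of the phase singularities.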
Abstract: This paper proposes an innovative methodology for
Acceptance Sampling by Variables, which is a particular category of
Statistical Quality Control dealing with the assurance of product
quality. Our contribution lies in the exploitation of machine learning
techniques to address the complexity and remedy the drawbacks of
existing approaches. More specifically, the proposed methodology
exploits Artificial Neural Networks (ANNs) to aid decision making
about the acceptance or rejection of an inspected sample. For any
type of inspection, ANNs are trained by data from corresponding
tables of a standard's sampling plan schemes. Once trained, ANNs
can give closed-form solutions for any acceptance quality level and
sample size, thus leading to an automation of the reading of the
sampling plan tables, without any need of compromise with the
values of the specific standard chosen each time. The proposed
methodology provides enough flexibility to quality control engineers
during the inspection of their samples, allowing the consideration of
specific needs, while it also reduces the time and the cost required for
these inspections. Its applicability and advantages are demonstrated
through two numerical examples.
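The spirit of the approach, learning a closed-form accept/reject decision from tabulated plans, can be illustrated with a deliberately tiny network. Everything below (the acceptability constant k, the training data, the one-neuron architecture) is an invented stand-in for a standard's sampling-plan tables, not the authors' networks:

```python
import numpy as np

# Illustrative sketch only: a minimal one-neuron "network" trained by
# gradient descent to reproduce an invented variables-plan rule of the form
# "accept if the standardized sample statistic exceeds an acceptability
# constant k". The constant k and the data are hypothetical stand-ins.
rng = np.random.default_rng(2)
k = 1.5                                   # assumed acceptability constant
x = rng.uniform(-3.0, 3.0, size=500)      # standardized sample statistic
y = (x >= k).astype(float)                # 1 = accept, 0 = reject

w, b = 0.0, 0.0
for _ in range(3000):                     # gradient descent on logistic loss
    z = np.clip(w * x + b, -30.0, 30.0)   # clip to avoid overflow in exp
    p = 1.0 / (1.0 + np.exp(-z))
    w -= 0.5 * np.mean((p - y) * x)
    b -= 0.5 * np.mean(p - y)

pred = (1.0 / (1.0 + np.exp(-np.clip(w * x + b, -30.0, 30.0)))) > 0.5
accuracy = float(np.mean(pred == (y == 1)))
print(accuracy > 0.95)
```

The methodology described in the abstract trains full multi-layer ANNs on the actual table data, so that the learned decision surface interpolates between tabulated acceptance quality levels and sample sizes.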
Abstract: Lignocellulosic materials are new targeted source to
produce second-generation biofuels such as biobutanol. However, this
process is significantly hindered by the native structure of the
biomass. Therefore, a pretreatment process is essential prior to
enzymatic hydrolysis. The goals of pretreatment are to remove
hemicelluloses and lignin, increase biomass porosity, and increase
enzyme accessibility. The main goal of this research is to study the important
variables such as pretreatment temperature and time, which can give
the highest total sugar yield in pretreatment step by using dilute
phosphoric acid. After pretreatment, the highest total sugar yield of
13.61 g/L was obtained under an optimal condition at 140°C for 10
min of pretreatment time by using 1.75% (w/w) H3PO4 and at 15:1
liquid to solid ratio. The total sugar yield of two-stage process
(pretreatment+enzymatic hydrolysis) of 27.38 g/L was obtained.
Abstract: In previous multi-solid models, the φ approach is
used for the calculation of fugacity in the liquid phase. For the first
time, in the proposed multi-solid thermodynamic model, the γ approach
has been used for the calculation of fugacity in the liquid mixture.
Several activity coefficient models have therefore been studied, and
the results show that the predictive Wilson model is more appropriate
than the others. The results demonstrate that the γ approach using the
predictive Wilson model agrees better with the experimental data than
the previous multi-solid models. This method also generates a new
approach for performing stability analysis in phase equilibrium
calculations. Meanwhile, the run time of the γ approach is less than
that of the previous methods using the φ approach. The results of the
new model show an average absolute deviation (AAD) of 0.75% from the
experimental data, which is clearly lower than the error of the
previous multi-solid models.
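For reference, the two liquid-phase fugacity formulations contrasted above are, in standard thermodynamic notation (a sketch; $x_i$ is the mole fraction, $\hat{\varphi}_i$ the fugacity coefficient from an equation of state, $\gamma_i$ the activity coefficient, $f_i^{\circ}$ the standard-state fugacity, $P$ the pressure):

```latex
\varphi\text{-approach:}\quad \hat{f}_i^{\,L} = x_i\,\hat{\varphi}_i^{\,L}\,P,
\qquad
\gamma\text{-approach:}\quad \hat{f}_i^{\,L} = x_i\,\gamma_i\,f_i^{\circ}
```

The γ approach replaces the equation-of-state fugacity coefficient with an activity coefficient model, here the predictive Wilson model, which is what shortens the run time.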
Abstract: Resource-constrained project scheduling is an NP-hard
optimisation problem. There are many different heuristic
strategies for shifting activities in time when resource requirements
exceed the available amounts. These strategies are frequently based
on priorities of activities. In this paper, we assume that a suitable
heuristic has been chosen to decide which activities should be
performed immediately and which should be postponed and
investigate the resource-constrained project scheduling problem
(RCPSP) from the implementation point of view. We propose an
efficient routine that, instead of shifting the activities, extends their
duration. It makes it possible to break down their duration into active
and sleeping subintervals. Then we can apply the classical Critical
Path Method that needs only polynomial running time. This
algorithm can simply be adapted for multiproject scheduling with
limited resources.
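The polynomial-time Critical Path Method step that the routine above reduces to can be sketched as a forward pass over a precedence network. The activity network below is an invented example; in the proposed routine, the durations would already include the "sleeping" extensions:

```python
# Minimal sketch of the classical Critical Path Method forward pass the
# abstract builds on; the activity network below is an invented example.
durations = {"A": 3, "B": 2, "C": 4, "D": 2}
preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

# Forward pass in topological order: earliest start of an activity is the
# maximum earliest finish over its predecessors.
order = ["A", "B", "C", "D"]             # assumed topologically sorted
early_finish = {}
for act in order:
    start = max((early_finish[p] for p in preds[act]), default=0)
    early_finish[act] = start + durations[act]

makespan = max(early_finish.values())
print(makespan)  # A(3) -> C(4) -> D(2) gives 9
```

Because each activity and precedence arc is visited once, the pass runs in linear time in the network size, which is why the overall routine stays polynomial.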
Abstract: The need to increase the efficiency of health care
systems is becoming an obligation, and one area of improvement
is the discharge process. The objective of this work is to reduce the
discharge time for insured patients to less than 50 minutes by using
the Six Sigma approach. This improvement will also lead to an increase
in customer satisfaction, an increase in the number of admissions and
room turnover, and an increase in hospital profitability. Three
departments were considered in this study: female, male, and
paediatrics. The Six Sigma approach, coupled with simulation, was
applied to reduce the patient discharge time for these three
departments at the hospital. Upon applying the resulting
recommendations, 60%, 80%, and 22% of insured female, male, and
paediatric patients, respectively, will have a discharge time of less
than the upper specification limit of 50 minutes.
Abstract: Speckle results when coherent radiation is reflected
from a rough surface. Characterizing the speckle strongly depends on
the measurement conditions and experimental setup. In this paper, we
report experimental results produced with different parameters of the
setup. We investigated the factors that affect the speckle contrast,
such as the F-number, gamma value and exposure time of the camera,
rather than geometric factors such as the distance from the projector
lens to the screen, the viewing distance, etc. The measurement results
show that the speckle contrast decreases with decreasing F-number and
with increasing gamma value, and is only slightly affected by the
exposure time and the gain value of the camera.
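The quantity behind these measurements is the speckle contrast, defined as the ratio of the intensity standard deviation to the mean intensity, C = σ_I / ⟨I⟩. A minimal sketch computes it on a synthetic, fully developed speckle pattern (exponentially distributed intensity), for which theory gives C close to 1; the sample size is an arbitrary choice:

```python
import numpy as np

# Speckle contrast C = sigma_I / <I>, evaluated on a synthetic fully
# developed speckle pattern: intensity is exponentially distributed,
# for which C is theoretically 1. Sample size is an arbitrary assumption.
rng = np.random.default_rng(3)
intensity = rng.exponential(scale=1.0, size=200_000)
C = float(intensity.std() / intensity.mean())
print(0.95 < C < 1.05)
```

The setup parameters studied in the paper (F-number, gamma, exposure) change the measured C by averaging or remapping the intensity statistics before this ratio is taken.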
Abstract: This paper proposes a novel game-theoretical
technique to address the problem of data object replication in
large-scale distributed computing systems. The proposed technique draws
inspiration from computational economic theory and employs the
extended Vickrey auction. Specifically, players in a non-cooperative
environment compete for server-side scarce memory space to
replicate data objects so as to minimize the total network object
transfer cost, while maintaining object concurrency. Optimization of
such a cost in turn leads to load balancing, fault-tolerance and
reduced user access time. The method is experimentally evaluated
against four well-known techniques from the literature: branch and
bound, greedy, bin-packing and genetic algorithms. The experimental
results reveal that the proposed approach outperforms the four
techniques in both the execution time and solution quality.
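The core mechanism the extended auction builds on is the classic Vickrey (sealed-bid, second-price) auction: each player bids its valuation, the highest bidder wins, and the winner pays the second-highest bid, which makes truthful bidding a dominant strategy. A hypothetical sketch with invented server bids (the paper's extension to memory allocation with concurrency is not reproduced here):

```python
# Hypothetical sketch of the second-price (Vickrey) rule underlying the
# proposed replication technique: servers bid their benefit from hosting
# an object; the winner pays the runner-up's bid. Bids are invented values.
def vickrey_winner(bids):
    """Return (winner, price): highest bidder wins, pays second-highest bid."""
    ranked = sorted(bids, key=bids.get, reverse=True)
    winner = ranked[0]
    price = bids[ranked[1]]
    return winner, price

bids = {"server_A": 14, "server_B": 9, "server_C": 11}
winner, price = vickrey_winner(bids)
print(winner, price)  # server_A wins and pays 11 (server_C's bid)
```

Because the payment does not depend on the winner's own bid, each server's best strategy is to report its true benefit, which is what lets the non-cooperative allocation approach the cost-minimizing placement.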
Abstract: The catalytic dehydroxylation of glycerol to propylene
glycol was investigated over a Cu-ZnO/Al2O3 catalyst prepared by the
incipient wetness impregnation (IWI) method, using feedstocks of
different purity: refined glycerol and technical-grade glycerol. The
main purpose is to investigate the effects of the feed impurities that
cause catalyst deactivation. The prepared catalyst was tested for its
catalytic activity and selectivity in a continuous-flow fixed-bed
reactor at 523 K, 500 psig, an H2/feed molar ratio of 4 and a WHSV of
3 h-1. The results showed that the conversions of refined glycerol and
technical-grade glycerol at a time on stream of 6 hours were 99% and
71%, and the selectivities to propylene glycol were 87% and 56%,
respectively. The ICP-OES and TPO results indicated that the cause of
catalyst deactivation was the amount of impurities in the feedstock:
the higher the amount of impurities (especially Na and K), the lower
the catalytic activity.
Abstract: CONWIP (constant work-in-process), as a pull
production system, has been widely studied by researchers to date.
The CONWIP pull production system is an alternative to pure push
and pure pull production systems. It lowers and controls inventory
levels, which improves throughput, reduces production lead
time, and improves delivery reliability and work utilization. In this
article, a CONWIP pull production system was simulated, together with
a push planning system. To compare these systems via a production
planning system (PPS) game, the parameters of each production planning
system were adjusted. The main target was to reduce the total WIP to a
minimum while maintaining throughput and delivery reliability. Data
were recorded and evaluated. A future state was designed for the real
production of plastic components, together with the setup of the two
indicators for the CONWIP pull production system, which can greatly
help the company to be more competitive on the market.
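The CONWIP control rule being simulated can be shown in a toy form: a new job is released to the line only when a finished job leaves, so work-in-process never exceeds the card count. The card count, horizon and completion probability below are invented example values, not the study's simulation parameters:

```python
import random

# Toy sketch of the CONWIP release rule: WIP is capped by a fixed number of
# cards; a new job enters only when a card is freed by a completed job.
# Card count, horizon, and completion probability are invented assumptions.
random.seed(5)
cards = 4                    # CONWIP card count = WIP cap
wip = released = completed = 0

for minute in range(1000):
    # Release: backlog assumed infinite, so release whenever a card is free
    while wip < cards:
        wip += 1
        released += 1
    # Completion: each minute one job finishes with probability 0.3
    if wip > 0 and random.random() < 0.3:
        wip -= 1
        completed += 1

print(wip <= cards, released - completed == wip)
```

A pure push system would release jobs regardless of `wip`, letting inventory grow; the cap is what controls WIP while throughput is governed by the completion rate.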
Abstract: This paper analyzes the patterns of the Monte Carlo
data for a large number of variables and minterms, in order to
characterize the circuit path length behavior. We propose models
that are determined by a training process on shortest path lengths
derived from a wide range of binary decision diagram (BDD)
simulations. The model was created using a feed-forward
neural network (NN) modeling methodology. Experimental results
for ISCAS benchmark circuits show an RMS error of 0.102 for the
shortest path length complexity estimation predicted by the NN
model (NNM). Use of such a model can help reduce the time
complexity of very-large-scale integration (VLSI) circuits and
related computer-aided design (CAD) tools that use BDDs.
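A structural sketch of an NNM-style regressor and the RMS-error metric quoted above: inputs (number of variables, number of minterms) pass through one hidden layer to a scalar path-length estimate. The weights, inputs and reference lengths below are random placeholders, not the trained model:

```python
import numpy as np

# Structural sketch of a feed-forward NN regressor like the NNM described:
# (variables, minterms) -> one hidden tanh layer -> scalar path-length
# estimate. Weights, inputs, and targets are invented placeholders, and the
# RMS-error computation mirrors the evaluation metric in the abstract.
rng = np.random.default_rng(4)
W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)   # 2 inputs -> 8 hidden units
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)   # 8 hidden -> 1 output

def nn_predict(x):
    h = np.tanh(x @ W1 + b1)        # hidden layer activation
    return (h @ W2 + b2).ravel()    # linear output: estimated path length

x = np.array([[10.0, 32.0], [16.0, 128.0]])     # (variables, minterms), invented
pred = nn_predict(x)

target = np.array([5.0, 9.0])                   # invented reference lengths
rmse = float(np.sqrt(np.mean((pred - target) ** 2)))
print(pred.shape, rmse >= 0.0)
```

Training (not shown) would fit the weights to the Monte Carlo shortest-path data; the abstract reports an RMS error of 0.102 for the trained model on the ISCAS benchmarks.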