Abstract: Mining sequential patterns in large databases has become
an important data mining task with broad applications; it describes
potential sequenced relationships among items in a database. Many
algorithms have been introduced for this task. Conventional
algorithms can find the exact optimal sequential pattern rules, but
they take a long time, particularly when applied to large databases.
Recently, evolutionary algorithms such as Particle Swarm Optimization
and the Genetic Algorithm have been proposed and applied to this
problem. This paper introduces a new hybrid evolutionary algorithm
that combines the Genetic Algorithm (GA) with Particle Swarm
Optimization (PSO) to mine sequential patterns, in order to improve
the convergence speed of evolutionary algorithms. This algorithm is
referred to as SP-GAPSO.
Abstract: Recently, fast neural networks for object/face
detection were presented in [1-3]. The speed up factor of these
networks relies on performing cross correlation in the frequency
domain between the input image and the weights of the hidden
layer. However, the equations given in [1-3] for conventional and fast
neural networks are not valid, for several reasons presented here. In
this paper, correct equations for cross correlation in the spatial and
frequency domains are presented. Furthermore, correct formulas for
the number of computation steps required by conventional and fast
neural networks given in [1-3] are introduced. A new formula for
the speed-up ratio is established. Also, corrections for the equations
of fast multi-scale object/face detection are given. Moreover,
commutative cross correlation is achieved. Simulation results show
that sub-image detection based on cross correlation in the frequency
domain is faster than classical neural networks.
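The speed-up rests on computing cross correlation through the FFT rather than by sliding the weights spatially. A minimal NumPy sketch of the two equivalent computations (function names and array sizes are our own, not from [1-3]):

```python
import numpy as np

def cross_correlate_fft(image, kernel):
    """Valid-mode 2-D cross correlation computed in the frequency domain.

    Cross correlation (unlike convolution) requires conjugating the kernel
    spectrum: corr(x, w) = IFFT( FFT(x) * conj(FFT(w)) ).
    """
    H, W = image.shape
    h, w = kernel.shape
    F_img = np.fft.rfft2(image)
    F_ker = np.fft.rfft2(kernel, s=image.shape)  # zero-pad kernel to image size
    full = np.fft.irfft2(F_img * np.conj(F_ker), s=image.shape)
    return full[:H - h + 1, :W - w + 1]  # keep fully overlapping positions

def cross_correlate_direct(image, kernel):
    """Same operation in the spatial domain, for verification."""
    H, W = image.shape
    h, w = kernel.shape
    out = np.empty((H - h + 1, W - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + h, j:j + w] * kernel)
    return out
```

The frequency-domain version costs O(N^2 log N) for an N x N image versus O(N^2 h w) for the direct sliding-window form, which is the source of the claimed speed-up for large inputs.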
Abstract: A new code synchronization algorithm is proposed in
this paper for the secondary cell-search stage in wideband CDMA
systems. Rather than using the Cyclically Permutable (CP) code in the
Secondary Synchronization Channel (S-SCH) to simultaneously
determine the frame boundary and scrambling code group, the new
synchronization algorithm implements the same function with lower
system complexity and a shorter Mean Acquisition Time (MAT). The
Secondary Synchronization Code (SSC) is redesigned by splitting into
two sub-sequences. We treat the information of scrambling code group
as data bits and use simple time diversity BCH coding for further
reliability. This avoids the involved, time-costly Reed-Solomon (RS)
code computations and comparisons. Analysis and simulation results
show that the Synchronization Error Rate (SER) yielded by the new
algorithm in Rayleigh fading channels is close to that of the
conventional algorithm in the standard. This new synchronization
algorithm reduces system complexities, shortens the average
cell-search time and can be implemented in the slot-based cell-search
pipeline. By exploiting antenna diversity and pipelining the
correlation processes, the new algorithm can also be applied flexibly
in multiple-antenna systems.
Abstract: Carbon fibers are fabricated from different materials,
such as special polyacrylonitrile (PAN) fibers, rayon fibers and pitch.
Among these three groups of materials, PAN fibers are the most
widely used precursor for the manufacture of carbon fibers. The
process of fabricating carbon fibers from special PAN fibers includes
two steps: oxidative stabilization at low temperature and
carbonization at high temperatures in an inert atmosphere. Due to the
high price of raw materials (special PAN fibers), carbon fibers are
still expensive.
In the present work, the main goal is to make carbon fibers from
low-price commercial PAN fibers with modified chemical compositions.
The results show that, provided the stabilization process is carried
out to completion, it is possible to produce carbon fibers with the
desired tensile strength from this type of PAN fiber. To this end,
the thermal characteristics of commercial PAN fibers were
investigated, and based on the results obtained, the conditions for
complete stabilization were achieved by modifying the temperature and
time variables of the conventional stabilization procedure.
Abstract: This paper presents a new method for the design of a power
system stabilizer (PSS) based on sliding mode control (SMC)
technique. The control objective is to enhance stability and improve
the dynamic response of a multi-machine power system. To test the
effectiveness of the proposed scheme, simulations are carried out to
analyze the small-signal stability characteristics of the system
about the steady-state operating condition following a change in the
reference mechanical torque and under parameter uncertainties. For
comparison, a conventional PSS (lead-lag compensation type) is also
simulated. The focus is on control performance, and the proposed
scheme is shown to achieve a shorter reaching time and lower spikes.
Abstract: Reinforced concrete stair slabs with mid landings i.e.
Dog-legged shaped are conventionally designed as per specifications
of standard codes of practices which guide about the effective span
according to the varying support conditions. Presently, the behavior
of such slabs has been investigated using Finite Element method. A
single flight stair slab with landings on both sides and supported at
ends on wall, and a multi flight stair slab with landings and six
different support arrangements have been analyzed. The results
obtained for stresses, strains and deflections are used to describe the
behavior of such stair slabs, including locations of critical moments
and deflections. The critical moments obtained by the F.E. analysis
have also been compared with those obtained from conventional
analysis. The analytical results show that the moments are also
critical near the kinks, i.e. the junctions of the mid-landing and
the inclined waist slab. This change in the behavior of the
dog-legged stair slab may be due to the continuity of the material in
the transverse direction in the two landings adjoining the waist
slab, which provides additional stiffness. This change in behavior is
generally not accounted for in the conventional method of design.
Abstract: Although achieving zero-defect software release is
practically impossible, software industries should take maximum
care to detect defects/bugs well ahead of time, allowing only a bare
minimum to creep into the released version. This clearly indicates
the important role time plays in bug detection. In addition,
software quality is a major factor in the software engineering
process. Moreover, early detection can be achieved only through
static code analysis as opposed to conventional testing.
BugCatcher.Net is a static analysis tool, which detects bugs in .NET®
languages through MSIL (Microsoft Intermediate Language)
inspection. The tool utilizes a Parser based on Finite State Automata
to carry out bug detection. After being detected, bugs need to be
corrected immediately. BugCatcher.Net facilitates correction, by
proposing a corrective solution for reported warnings/bugs to end
users with minimum side effects. Moreover, the tool is also capable
of analyzing the bug trend of a program under inspection.
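The finite-state-automaton approach to bug detection can be illustrated with a toy scanner. The states, token names, and the use-after-dispose rule below are hypothetical simplifications of ours, not BugCatcher.Net's actual automaton or MSIL token set:

```python
# Toy FSA that flags a "use after dispose" pattern in a simplified,
# MSIL-like token stream. (state, token) pairs map to the next state;
# any pair not listed leaves the state unchanged.
TRANSITIONS = {
    ("start", "NEW"): "alive",
    ("alive", "DISPOSE"): "disposed",
    ("disposed", "USE"): "bug",
    ("disposed", "NEW"): "alive",
}

def scan(tokens):
    """Run the FSA over a token list; return indices of flagged bugs."""
    state, warnings = "start", []
    for idx, tok in enumerate(tokens):
        state = TRANSITIONS.get((state, tok), state)
        if state == "bug":
            warnings.append(idx)   # report use-after-dispose at this token
            state = "disposed"     # keep scanning for further misuse
    return warnings

# scan(["NEW", "USE", "DISPOSE", "USE"]) flags index 3.
```

A real MSIL inspector would track each object reference separately and use a much larger token alphabet, but the per-pattern automaton structure is the same.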
Abstract: Designing modern machine tools is a complex task. A
simulation tool to aid the design work, a virtual machine, has
therefore been developed in earlier work. The virtual machine
considers the interaction between the mechanics of the machine
(including structural flexibility) and the control system. This paper
exemplifies the usefulness of the virtual machine as a tool for product
development. An optimisation study is conducted aiming at
improving the existing design of a machine tool regarding weight and
manufacturing accuracy at maintained manufacturing speed. The
problem can be categorised as constrained multidisciplinary multiobjective
multivariable optimisation. Parameters of the control and
geometric quantities of the machine are used as design variables. This
results in a mix of continuous and discrete variables and an
optimisation approach using a genetic algorithm is therefore
deployed. The accuracy objective is evaluated according to
international standards. The complete systems model shows nondeterministic
behaviour. A strategy to handle this based on statistical
analysis is suggested. The weight of the main moving parts is reduced
by more than 30 per cent and the manufacturing accuracy is improved
by more than 60 per cent compared to the original design, with no
reduction in manufacturing speed. It is also shown that interaction
effects exist between the mechanics and the control, i.e. this
improvement would most likely not have been possible with a
conventional sequential design approach within the same time, cost
and general resource frame. This indicates the potential of the virtual
machine concept for contributing to improved efficiency of both
complex products and the development process for such products.
Companies incorporating such advanced simulation tools in their
product development could thus improve their own competitiveness as
well as contribute to improved resource efficiency of society at large.
Abstract: The impact of fixed speed squirrel cage type as well as
variable speed doubly fed induction generators (DFIG) on dynamic
performance of a multimachine power system has been investigated.
Detailed models of the various components have been presented and
the integration of asynchronous and synchronous generators has been
carried out through a rotor angle based transform. Simulation studies
carried out considering the conventional dynamic model of squirrel
cage asynchronous generators show that such integration could
transiently degrade AC system performance. This article
proposes a frequency or power controller which can effectively
control the transients and quickly restore normal operation of the
fixed-speed induction generator. Comparison of simulation results
between classical cage and doubly fed induction generators indicates
that the doubly fed induction machine is better suited to a
multimachine AC system. A frequency controller installed in the DFIG
system can also improve its transient profile.
Abstract: Real options theory suggests that managerial flexibility embedded within irreversible investments can account for a significant portion of value in project valuation. Although this argument has been the dominant focus of capital investment theory for decades, recent survey literature in capital budgeting indicates that corporate practitioners still do not explicitly apply real options in investment decisions. In this paper, we explore how real options decision criteria can be transformed into equivalent capital budgeting criteria under uncertainty, assuming that the underlying stochastic process follows a geometric Brownian motion (GBM), a mixed diffusion-jump (MX), or a mean-reverting (MR) process. These equivalent valuation techniques can be readily decomposed into conventional investment rules and "option impacts", the latter of which describe the effects on optimal investment rules once the option value is considered. Based on numerical analysis and Monte Carlo simulation, three major findings are derived. First, real options can be successfully integrated into the mindset of conventional capital budgeting. Second, the inclusion of option impacts tends to delay investment; this delay effect is most significant under a GBM process and least significant under an MR process. Third, it is optimal to adopt the new capital budgeting criteria in investment decision-making, and adopting a suboptimal investment rule without considering real options could lead to a substantial loss in value.
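The delay effect under a GBM process can be illustrated with a small Monte Carlo sketch. All parameter values (project value, investment cost, drift, volatility, horizon) are hypothetical choices of ours, not the paper's calibration:

```python
import math
import random

def option_to_wait(V0=100.0, I=100.0, mu=0.02, sigma=0.3, r=0.05,
                   T=1.0, n_paths=20000, seed=1):
    """Monte Carlo sketch: compare investing now with waiting one period.

    Project value V follows a GBM, V_T = V0 * exp((mu - sigma^2/2) T
    + sigma * sqrt(T) * Z). Investing now pays max(V0 - I, 0) immediately;
    waiting pays the discounted expectation of max(V_T - I, 0).
    """
    rng = random.Random(seed)
    invest_now = max(V0 - I, 0.0)
    payoff_sum = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        V_T = V0 * math.exp((mu - 0.5 * sigma ** 2) * T
                            + sigma * math.sqrt(T) * z)
        payoff_sum += max(V_T - I, 0.0)
    wait_value = math.exp(-r * T) * payoff_sum / n_paths
    return invest_now, wait_value

now, wait = option_to_wait()
# At the money (V0 == I), investing now is worth nothing while waiting
# retains positive value, so the option impact delays investment.
```

This reproduces the qualitative point only: the conventional NPV rule (invest when V0 > I) ignores the value of waiting, which the option impact adds back.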
Abstract: This paper presents an environmental and techno-economic
evaluation of light duty vehicles in Iran. A comprehensive
well-to-wheel (WTW) analysis is applied to compare different
automotive fuel chains, conventional internal combustion engines and
innovative vehicle powertrains. The study examines the
competitiveness of 15 various pathways in terms of energy
efficiencies, GHG emissions, and levelized cost of different energy
carriers. The results indicate that electric vehicles including battery
electric vehicles (BEV), fuel cell vehicles (FCV) and plug-in hybrid
electric vehicles (PHEV) increase the WTW energy efficiency by
54%, 51% and 46%, respectively, compared to common internal
combustion engines powered by gasoline. On the other hand,
greenhouse gas (GHG) emissions per kilometer of FCVs and BEVs
would be 48% lower than those of gasoline engines. It is concluded
that BEV has the lowest total cost of energy consumption and
external cost of emission, followed by internal combustion engines
(ICE) fueled by CNG. Conventional internal combustion engines
fueled by gasoline, on the other hand, would have the highest costs.
Abstract: Soil erosion is among the most serious problems faced at
the global and local levels, so planning soil conservation measures
has become a prominent agenda item for water basin managers. To plan
soil conservation measures, information on soil erosion is essential.
The Universal Soil Loss Equation (USLE), the Revised Universal Soil
Loss Equation 1 (RUSLE1 or RUSLE), the Modified Universal Soil Loss
Equation (MUSLE), RUSLE 1.06, RUSLE1.06c and RUSLE2 are the most
widely used conventional erosion estimation methods. The essential
drawback of the USLE and RUSLE1 equations is that they are based on
average annual values of their parameters, so their applicability at
small temporal scales is questionable. These equations also do not
estimate runoff-generated soil erosion, so their applicability to
estimating runoff-generated erosion is questionable. The data used in
formulating the USLE and RUSLE1 equations were plot data, so their
application at greater spatial scales requires scale-correction
factors. On the other hand, MUSLE is unsuitable for predicting the
sediment yield of small and large events. Although the newer revised
forms of the USLE, such as RUSLE 1.06, RUSLE1.06c and RUSLE2, are
land-use independent and have addressed almost all the drawbacks of
the earlier versions (USLE and RUSLE1), they are based on regional
data from specific areas, so their applicability to other areas with
different climate, soil and land use is questionable. These
conventional equations apply to sheet and rill erosion and cannot
predict gully erosion or the spatial pattern of rills. Research has
therefore focused on the development of non-conventional methods of
soil erosion estimation. When these non-conventional methods are
combined with GIS and RS, they give the spatial distribution of soil
erosion. In the present paper, a review of the literature on
non-conventional methods of soil erosion estimation supported by GIS
and RS is presented.
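For reference, the USLE discussed above computes average annual soil loss as a product of empirical factors. The sketch below only restates that standard formula; the parameter values in the example are hypothetical:

```python
def usle_soil_loss(R, K, LS, C, P):
    """Average annual soil loss A = R * K * LS * C * P (USLE).

    R  - rainfall-runoff erosivity factor
    K  - soil erodibility factor
    LS - slope length and steepness factor
    C  - cover-management factor
    P  - support-practice factor
    """
    return R * K * LS * C * P

# Hypothetical parameter values, for illustration only.
A = usle_soil_loss(R=350.0, K=0.30, LS=1.2, C=0.2, P=0.5)
```

Because every factor is an average annual, plot-scale value, the product inherits the temporal- and spatial-scale limitations noted above.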
Abstract: This paper presents a tested research concept that
implements a complex evolutionary algorithm, genetic algorithm
(GA), in a multi-microcontroller environment. A Parallel Distributed
Genetic Algorithm (PDGA) is employed in an adaptive beamforming
technique to reduce the power usage of an adaptive antenna at a
WCDMA base station. An adaptive antenna has a dynamic beam that
requires an advanced beamforming algorithm, such as a genetic
algorithm, which demands heavy computation and memory space. Microcontrollers are
low resource platforms that are normally not associated with GAs,
which are typically resource intensive. The aim of this project was to
design a cooperative multiprocessor system by expanding the role of
small scale PIC microcontrollers to optimize WCDMA base station
transmitter power. Implementation results have shown that the PDGA
multi-microcontroller system returned optimal transmitted power
compared to a conventional GA.
Abstract: In recent years, medical specialties have become divided into a great many detailed areas. However, it is not rare for a doctor to see a patient whose condition lies outside his/her specialty; he/she must nevertheless make an assessment and a medical treatment plan for this patient. According to our investigation, conventional approaches such as image diagnosis cooperation are insufficient. This paper proposes an 'Assessment / Medical Treatment Plan Consulting System'. We have implemented a pilot system based on our proposal, and its effectiveness is clarified by an evaluation.
Abstract: In recent years, copulas have become very popular in
financial research and actuarial science, as they are more flexible
in modelling the co-movements and relationships of risk factors than
Pearson's conventional linear correlation coefficient.
However, a precise estimation of the copula parameters is vital in
order to correctly capture the (possibly nonlinear) dependence structure
and joint tail events. In this study, we employ two optimization
heuristics, namely Differential Evolution and Threshold Accepting, to
tackle the parameter estimation of multivariate t-distribution models
in the EML approach. Since the evolutionary optimizer does not rely
on gradient search, the EML approach can be applied to estimation of
more complicated copula models such as high-dimensional copulas.
Our experimental study shows that the proposed method provides
more robust and more accurate estimates as compared to the IFM
approach.
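A minimal Differential Evolution (DE/rand/1/bin) minimizer of the kind used for such gradient-free estimation can be sketched as follows. The toy quadratic objective stands in for a copula negative log-likelihood, and all parameter settings (population size, F, CR, bounds) are illustrative, not those of the study:

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=100, seed=0):
    """Minimal DE/rand/1/bin minimizer; needs no gradients of f."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Pick three distinct donors, none equal to the target i.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)  # force at least one mutated component
            trial = []
            for d in range(dim):
                if rng.random() < CR or d == j_rand:
                    v = pop[a][d] + F * (pop[b][d] - pop[c][d])
                    lo, hi = bounds[d]
                    v = min(max(v, lo), hi)  # clip to the search box
                else:
                    v = pop[i][d]
                trial.append(v)
            f_trial = f(trial)
            if f_trial <= fit[i]:  # greedy selection
                pop[i], fit[i] = trial, f_trial
    best = min(range(pop_size), key=lambda i: fit[i])
    return pop[best], fit[best]

# Toy example: recover the minimizer of a smooth objective without gradients.
x_hat, _ = differential_evolution(
    lambda x: (x[0] - 1.5) ** 2 + (x[1] + 0.5) ** 2,
    bounds=[(-5, 5), (-5, 5)])
```

Because the update uses only function values, the same loop applies unchanged when f is a high-dimensional copula likelihood with no tractable gradient.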
Abstract: In this paper, we present a technical and an economic
assessment of several sources of renewable energy in Saudi Arabia;
mainly solar, wind, hydro and biomass. We analyze the
environmental and climatic conditions in relation to these sources
and give an overview of some of the existing clean energy
technologies. Using standardized cost and efficiency data, we carry
out a cost benefit analysis to understand the economic factors
influencing the sustainability of energy production from renewable
sources in light of the energy cost and demand in the Saudi market.
Finally, we take a look at the Saudi petroleum industry and the
existing sources of conventional energy and assess the potential of
building a successful market for renewable energy under the
constraints imposed by the flow of subsidized cheap oil. We show
that while some renewable energy resources are well suited for
distributed or grid connected generation in the kingdom, their
viability is greatly undercut by the well-developed and
well-capitalized oil industry.
Abstract: This paper examines the influence of matching
students' learning preferences with the teaching methodology
adopted, on their academic performance in an accounting course in
two types of learning environment in one university in Lebanon:
classes with PowerPoint (PPT) vs. conventional classes. Learning
preferences were either for PPT or for Conventional methodology. A
statistically significant increase in academic achievement is found in
the conventionally instructed group as compared to the group taught
with PPT. This low effectiveness of PPT might be attributed to the
learning preferences of Lebanese students. In the PPT group, better
academic performance was found among students with
learning/teaching match as compared with students with
learning/teaching mismatch. Since the majority of students display a
preference for the conventional methodology, the result might
suggest that Lebanese students' performance is not optimized by PPT
in the accounting classrooms, not because of PPT itself, but because
it does not match the Lebanese students' learning preferences in such
a quantitative course.
Abstract: Image convolution similar to the receptive fields
found in mammalian visual pathways has long been used in
conventional image processing in the form of Gabor masks.
However, no VLSI implementation of parallel, multi-layered pulsed
processing has been brought forward which would emulate this
property. We present a technical realization of such a pulsed image
processing scheme. The discussed IC also serves as a general testbed
for VLSI-based pulsed information processing, which is of interest
especially with regard to the robustness of representing an analog
signal in the phase or duration of a pulsed, quasi-digital signal, as
well as the possibility of direct digital manipulation of such an
analog signal. The network connectivity and processing properties
are reconfigurable so as to allow adaptation to various processing
tasks.
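A Gabor mask of the kind referred to above, and its dense application to an image, can be sketched as follows. The sizes and parameters are illustrative, and this software model says nothing about the pulsed VLSI realization itself:

```python
import numpy as np

def gabor_mask(size=9, sigma=2.0, theta=0.0, wavelength=4.0):
    """Real-valued Gabor mask: a Gaussian-windowed cosine grating."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)    # rotate coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr ** 2 + yr ** 2) / (2 * sigma ** 2))
    carrier = np.cos(2 * np.pi * xr / wavelength)
    return envelope * carrier

def apply_mask(image, mask):
    """Dense 2-D correlation of the image with the mask (valid positions)."""
    H, W = image.shape
    h, w = mask.shape
    out = np.empty((H - h + 1, W - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + h, j:j + w] * mask)
    return out
```

Sweeping theta produces a bank of orientation-selective masks, the digital analogue of the receptive-field responses the chip emulates.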
Abstract: Independent component analysis (ICA) in the
frequency domain is used for solving the problem of blind source
separation (BSS). However, this method has some problems. For
example, a general ICA algorithm cannot determine the permutation
of the signals, which is important in frequency-domain ICA. In this
paper, we propose an approach to the solution for a permutation
problem. The idea is to effectively combine two conventional
approaches. This approach improves the signal separation
performance by exploiting features of the conventional approaches.
We present simulation results using artificial data.
Abstract: The last decade has shown that the object-oriented
concept by itself is not powerful enough to cope with the rapidly
changing requirements of ongoing applications. Component-based
systems achieve flexibility by clearly separating the stable parts of
systems (i.e. the components) from the specification of their
composition. In order to realize the reuse of components effectively
in CBSD, it is required to measure the reusability of components.
However, due to the black-box nature of components, where the
source code is not available, it is difficult to use conventional
metrics in component-based development, as these metrics require
analysis of source code. In this paper, we survey a few existing
component-based reusability metrics. These metrics give a broader
view of a component's understandability, adaptability, and
portability. The paper also describes an analysis, in terms of the
quality factors related to reusability, contained in an approach that
aids significantly in assessing existing components for reusability.