Abstract: Decisions are made regularly during projects and in daily life. Some decisions are critical and have a direct impact on project or human success. Formal evaluation is therefore required, especially for crucial decisions, to arrive at the optimal solution among the available alternatives. According to microeconomic theory, all of a person's decisions can be modeled as indifference curves. The proposed approach supports formal analysis and decision-making by constructing an indifference curve model from experts' previous decision criteria. This knowledge, embedded in the system, can be reused and can help naïve users select alternative solutions to similar problems. Moreover, the method is flexible enough to cope with an unlimited number of factors influencing the decision. Preliminary experimental results show that the selected alternatives accurately match the experts' decisions.
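As a concrete illustration of ranking alternatives with an indifference-curve model, the sketch below scores hypothetical alternatives with a Cobb-Douglas utility, one standard family whose level sets form indifference curves; the alternative names, factor scores, and weights are all illustrative assumptions, not the paper's actual model.

```python
import math

def cobb_douglas(values, weights):
    """Cobb-Douglas utility: one common family whose level sets are indifference curves."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return math.prod(v ** w for v, w in zip(values, weights))

def rank_alternatives(alternatives, weights):
    """Rank alternatives (factor-score tuples) by utility, best first."""
    scored = [(name, cobb_douglas(vals, weights)) for name, vals in alternatives.items()]
    return sorted(scored, key=lambda t: t[1], reverse=True)

# Hypothetical alternatives scored on two factors (e.g. cost-efficiency, reliability).
alts = {"A": (0.9, 0.4), "B": (0.6, 0.7), "C": (0.3, 0.9)}
ranking = rank_alternatives(alts, weights=(0.5, 0.5))
```

With equal weights, alternative B wins because its factor scores are balanced, which is exactly the trade-off behaviour an indifference curve encodes.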
Abstract: In text categorization, the most widely used method for document representation is the word-frequency vector, known as the Vector Space Model (VSM). This representation is based only on the words in a document and therefore loses any "word context" information found in the document. In this article we compare the classical method of document representation with the Suffix Tree Document Model (STDM), which represents documents in suffix tree form. For the STDM we propose a new approach to document representation and a new formula for computing the similarity between two documents: the suffix tree is built for only two documents at a time. This approach is faster, has lower memory consumption, and uses the entire document representation without resorting to node-disposal methods. The proposed similarity formula substantially improves clustering quality. The representation method was validated using Hierarchical Agglomerative Clustering (HAC). In this context we also examine the influence of stemming in the document preprocessing step and highlight the difference between similarity and dissimilarity measures for finding "closer" documents.
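Since the paper's similarity formula is not reproduced in the abstract, the sketch below uses shared character n-grams as a simplified stand-in for the substring matches that a generalized suffix tree over a document pair would enumerate:

```python
def ngrams(text, n=3):
    """All character n-grams of the text, collapsed to a set for overlap counting."""
    return {text[i:i + n] for i in range(len(text) - n + 1)}

def dice_similarity(doc_a, doc_b, n=3):
    """Dice coefficient over shared character n-grams: a lightweight stand-in for
    counting the shared substrings of two documents, which a generalized suffix
    tree built over just that pair would enumerate exactly."""
    a, b = ngrams(doc_a, n), ngrams(doc_b, n)
    if not a or not b:
        return 0.0
    return 2 * len(a & b) / (len(a) + len(b))

s = dice_similarity("suffix tree document model", "suffix tree model")
```

Building the structure pairwise, as the paper proposes, keeps memory proportional to the two documents being compared rather than the whole corpus.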
Abstract: In this paper we present an autoregressive model with neural network modeling, trained with the standard error backpropagation algorithm, to predict the gross domestic product (GDP) growth rate of four countries. Specifically, we propose a kind of weighted regression, usable for econometric purposes, in which the initial inputs are multiplied by the neural network's final optimum input-to-hidden-layer weights obtained after the training process. The forecasts are compared with those of the ordinary autoregressive model, and we conclude that the proposed regression's forecasts significantly outperform those of the autoregressive model in the out-of-sample period. The idea behind this approach is to propose a parametric regression with weighted variables in order to test the statistical significance and magnitude of the estimated autoregressive coefficients while simultaneously producing the forecasts.
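A minimal sketch of the weighted-regression idea, assuming the input-to-hidden weight magnitudes are already available (here supplied by the caller rather than produced by an actual trained network):

```python
import numpy as np

def weighted_ar_fit(series, p, input_weights):
    """Fit an AR(p) model after scaling each lagged regressor by a given weight.
    `input_weights` stands in for trained input-to-hidden weight magnitudes."""
    y = np.asarray(series, dtype=float)
    n = len(y)
    # Column j holds lag j+1: row i (time t = p+i) contains y[t-1-j].
    X = np.column_stack([y[p - 1 - j:n - 1 - j] for j in range(p)])
    Xw = X * np.asarray(input_weights, dtype=float)   # weighted regressors
    Xw = np.column_stack([np.ones(n - p), Xw])        # intercept column
    coef, *_ = np.linalg.lstsq(Xw, y[p:], rcond=None)
    return coef

def forecast_next(series, p, input_weights, coef):
    """One-step-ahead forecast from the fitted weighted AR(p) model."""
    lags = np.asarray(series[-p:], dtype=float)[::-1] * np.asarray(input_weights)
    return coef[0] + lags @ coef[1:]

# Synthetic AR(2) series (illustrative data, not GDP growth rates).
rng = np.random.default_rng(0)
y = [1.0, 1.2]
for _ in range(300):
    y.append(0.5 * y[-1] + 0.3 * y[-2] + 0.1 * rng.standard_normal())

coef = weighted_ar_fit(y, p=2, input_weights=[1.0, 1.0])
yhat = forecast_next(y, p=2, input_weights=[1.0, 1.0], coef=coef)
```

With unit weights this reduces to ordinary AR(2) least squares; non-unit weights rescale the lags before estimation, which is the paper's core idea.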
Abstract: This study proposes a multi-response surface optimization problem (MRSOP) for determining the proper choices in a process parameter design (PPD) decision problem in the noisy environment of a grease-position process in the electronics industry. The proposed model attempts to maximize dual process responses based on the mean of parts between failures on the left and right processes. The conventional modified simplex method and its hybridization with a stochastic operator from the hunting search algorithm are applied to determine the proper levels of the controllable design parameters affecting the quality performance. A numerical example demonstrates the feasibility of applying the proposed model to the PPD problem via the two iterative methods, and their advantages are discussed. Numerical results demonstrate that the hybridization is superior to the conventional method: in this study, the mean of parts between failures on the left and right lines improves by approximately 39.51%. All experimental data presented in this research have been normalized to disguise actual performance measures, as the raw data are considered confidential.
Abstract: A study was conducted to determine the concentrations of copper, cadmium, lead, and zinc in Cabomba furcata, which is found in abundance in Lake Chini. This aquatic plant was collected randomly within the lake for heavy metal determination. Water quality was measured in situ for temperature, pH, conductivity, and dissolved oxygen using a YSI model 556 portable multi-sensor probe. The C. furcata was digested using the wet digestion method, and heavy metal concentrations were analysed using a Perkin Elmer 4100B Atomic Absorption Spectrometer (AAS, flame method). The water quality results classify Lake Chini between class II and class III under the Malaysian Water Quality Standard. According to this standard, Lake Chini has moderate quality, which is normal for a natural lake. Heavy metal concentrations in C. furcata were low and below the critical toxic values for aquatic plants. A one-way ANOVA test indicated that the heavy metal concentrations in C. furcata differed significantly among sampling locations. The water quality and heavy metal concentrations indicate that Lake Chini does not receive anthropogenic loads from nearby activities.
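The one-way ANOVA test used in the study can be sketched directly from its definition (the concentration values below are illustrative, not the study's data):

```python
def one_way_anova_F(groups):
    """F statistic for a one-way ANOVA across k groups of measurements:
    between-group mean square divided by within-group mean square."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    means = [sum(g) / len(g) for g in groups]
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical Cu concentrations (mg/kg dry weight) at three sampling locations.
F = one_way_anova_F([[2.1, 2.3, 2.2], [3.0, 3.2, 3.1], [2.6, 2.4, 2.5]])
```

A large F relative to the F(k-1, n-k) critical value indicates, as in the study, that concentrations differ significantly among locations.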
Abstract: In this paper, the application of multiple Elman neural networks to time series data regression problems is studied. An ensemble of Elman networks is formed by boosting to enhance the performance of the individual networks. A modified version of the AdaBoost algorithm is employed to integrate the predictions from the multiple networks. Two benchmark time series data sets, i.e., the Sunspot and Box-Jenkins gas furnace problems, are used to assess the effectiveness of the proposed system. The simulation results reveal that an ensemble of boosted Elman networks achieves a higher degree of generalization and better performance than the individual networks. The results are compared with those from other learning systems, and the implications of the performance are discussed.
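The prediction-integration step can be illustrated with the standard AdaBoost.R2 weighted-median rule (the paper's modified AdaBoost is not specified in the abstract, so this shows only the conventional combination rule, with the per-model weights given rather than learned):

```python
import numpy as np

def weighted_median_combine(predictions, model_weights):
    """AdaBoost.R2-style combination: for each sample, sort the model outputs
    and take the first one at which the cumulative model weight reaches half
    of the total weight."""
    P = np.asarray(predictions, dtype=float)   # shape (n_models, n_samples)
    w = np.asarray(model_weights, dtype=float)
    out = []
    for col in P.T:                            # one sample at a time
        order = np.argsort(col)
        cum = np.cumsum(w[order])
        idx = np.searchsorted(cum, 0.5 * w.sum())
        out.append(col[order][idx])
    return np.array(out)

# Three hypothetical network outputs for two samples, equally weighted.
preds = [[1.0, 10.0], [2.0, 11.0], [9.0, 12.0]]
combined = weighted_median_combine(preds, [1.0, 1.0, 1.0])
```

Unlike a plain average, the weighted median is robust to a single badly-trained network in the ensemble, which is one reason boosting ensembles generalize well.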
Abstract: Because of its excellent properties, the SPIHT algorithm, which is based on traditional wavelet transform theory, has attracted considerable attention, but it also has shortcomings. Combining recent progress in the wavelet domain with human visual characteristics, we propose an improved SPIHT algorithm based on human visual characteristics, derived from an analysis of the original SPIHT algorithm. Experiments indicate that coding speed and quality are well enhanced compared with the original SPIHT algorithm, and the quality at transmission cut-off is also improved.
Abstract: In this paper a bank of velocity filters is devised to isolate a moving object with a specific velocity in a sequence of frames. The approach is a 3-D FFT based experimental procedure that does not rely on theoretical results about velocity filters; instead, the filters are built from the spectral signature of each separate moving object. Experimentation reveals the capability of the constructed filter bank to separate moving objects with respect to both the magnitude and the direction of their velocity.
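The spectral-signature property that such filters exploit can be verified with a small synthetic experiment (reduced here to one spatial dimension for brevity; this illustrates the underlying spatio-temporal FFT idea, not the paper's filter-bank construction):

```python
import numpy as np

T, N, v = 8, 16, 2                       # frames, frame width, pixels per frame
frames = np.zeros((T, N))
for t in range(T):
    frames[t, (v * t) % N] = 1.0         # 1-pixel object at constant velocity

# Spatio-temporal FFT: a constant-velocity object concentrates its energy on the
# plane  k_t + v * (T / N) * k_x == 0 (mod T);  here that is (k_t + k_x) % 8 == 0.
spectrum = np.abs(np.fft.fft2(frames))
peaks = np.argwhere(spectrum > 0.5 * spectrum.max())
```

A velocity filter is then simply a frequency-domain mask that keeps the plane belonging to one object's velocity and suppresses the rest.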
Abstract: In this paper, the existence of multiple positive solutions for a class of third-order three-point discrete boundary value problems is studied by applying an algebraic topology method.
Abstract: In this paper we present a new approach to image segmentation. The fact that a single segmentation result does not generally allow a higher-level process to take into account all the elements included in the image has motivated the treatment of image segmentation as a multiobjective optimization problem. The proposed algorithm adopts a split/merge strategy that uses the result of the k-means algorithm as input to a quantum evolutionary algorithm, which establishes a set of non-dominated solutions. The evaluation is made simultaneously according to two distinct criteria: intra-region homogeneity and inter-region heterogeneity. Experiments on natural images demonstrate the efficiency and usefulness of the new approach.
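The first stage of such a pipeline, an initial k-means partition that the split/merge and evolutionary stages would then refine, can be sketched as follows (grayscale intensities only, and in this toy version k must not exceed the number of distinct intensities):

```python
import numpy as np

def kmeans_labels(pixels, k, iters=20, seed=0):
    """Plain k-means on pixel intensities: the initial segmentation that a
    split/merge or evolutionary stage would refine."""
    rng = np.random.default_rng(seed)
    x = np.asarray(pixels, dtype=float).ravel()
    # Seed centers from distinct intensity values to avoid duplicate centers.
    centers = rng.choice(np.unique(x), size=k, replace=False)
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = x[labels == j].mean()
    return labels.reshape(np.asarray(pixels).shape), centers

# Tiny illustrative "image" with two intensity populations.
img = np.array([[0.1, 0.1, 0.9], [0.1, 0.9, 0.9]])
labels, centers = kmeans_labels(img, k=2)
```

The resulting label map is what a multiobjective stage would then score for intra-region homogeneity and inter-region heterogeneity.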
Abstract: As the number of mobile service subscribers increases, mobile content services are becoming more varied. Mobile content development therefore requires not only content design but also guidelines specific to mobile devices. When mobile content is developed, it is important to overcome the limits and restrictions of the mobile platform: small browsers and screen sizes, limited download sizes, and awkward navigation. Guidelines for each type of mobile content are therefore presented, aimed at user usability, ease of development, and consistency of rules. This paper proposes a methodology consisting of these per-content mobile guidelines, and a mobile web site is developed according to the proposed guidelines.
Abstract: Cellular communication is widely used all over the world, and the number of handset users keeps increasing, driven by the marketing sector. The important aspect addressed in this paper is the security of cellular communication: it is important to provide users with a secure channel for communication. A brief description of the GSM cellular network architecture is provided. The limitations of cellular networks, their security issues, and the different types of attacks are discussed, and some new security mechanisms proposed by researchers are reviewed. Overall, this paper clarifies the security systems and services of GSM cellular communication. Three Malaysian communication companies are used as case studies.
Abstract: Motor imagery classification provides an important basis for designing Brain Machine Interfaces (BMI). A BMI captures and decodes brain EEG signals and transforms human thought into actions. The ability of an individual to control his EEG through imaginary mental tasks enables him to control devices through the BMI. This paper presents a method to design a four-state BMI using EEG signals recorded from the C3 and C4 locations. Principal features extracted through principal component analysis of the segmented EEG are analyzed using two novel classification algorithms based on the Elman recurrent neural network and the functional link neural network. The performance of both classifiers is evaluated using a particle swarm optimization (PSO) training algorithm; the results are also compared with the conventional backpropagation (BP) training algorithm. EEG motor imagery recorded from two subjects is used in the offline analysis. The overall classification performance shows that the BP algorithm yields a higher average classification accuracy of 93.5%, while the PSO algorithm achieves better training time and maximum classification accuracy. The proposed method promises to provide a useful alternative general procedure for motor imagery classification.
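The principal-feature extraction step can be sketched with a plain SVD-based PCA (the segment sizes and data below are illustrative placeholders, not recorded EEG):

```python
import numpy as np

def principal_features(segments, n_components):
    """Project each segment (one row per window) onto the top principal
    components, computed from an SVD of the mean-centred segment matrix."""
    X = np.asarray(segments, dtype=float)
    Xc = X - X.mean(axis=0)                      # centre each sample dimension
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T              # scores on top components

# Hypothetical segments: 4 windows of 6 samples, dominated by one pattern.
rng = np.random.default_rng(1)
base = rng.standard_normal(6)
segs = np.array([a * base for a in (1.0, 2.0, 3.0, 4.0)])
segs = segs + 0.01 * rng.standard_normal((4, 6))
feats = principal_features(segs, n_components=2)
```

The low-dimensional scores, rather than the raw segments, are what a downstream neural classifier would be trained on.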
Abstract: An edge-based local search algorithm, called ELS, is proposed for the maximum clique problem (MCP), a well-known combinatorial optimization problem. ELS is a two-phase local search method that effectively finds near-optimal solutions for the MCP. A parameter 'support' of vertices defined in ELS greatly reduces the number of random selections among vertices, as well as the number of iterations and the running time. Computational results on the BHOSLIB and DIMACS benchmark graphs indicate that ELS achieves state-of-the-art performance for the maximum clique problem with reasonable average running times.
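Since the abstract does not reproduce ELS's exact 'support' definition or its two-phase search, the sketch below shows only the general flavour: a greedy clique construction in which a vertex's support is taken, as an assumption, to be its degree inside the current candidate set:

```python
def greedy_clique(adj):
    """Greedy clique construction: repeatedly add the candidate vertex with the
    highest support, taken here as its degree within the current candidate set.
    (ELS's actual support definition and two-phase local search are richer.)"""
    candidates = set(adj)
    clique = []
    while candidates:
        v = max(candidates, key=lambda u: len(adj[u] & candidates))
        clique.append(v)
        candidates &= adj[v]          # only neighbours of v can extend the clique
    return clique

# Toy graph: vertices {0, 1, 2} form a triangle; vertex 3 hangs off vertex 0.
adj = {0: {1, 2, 3}, 1: {0, 2}, 2: {0, 1}, 3: {0}}
clique = greedy_clique(adj)
```

Preferring high-support vertices steers the construction toward dense regions, which is the intuition behind reducing random selections in the search.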
Abstract: Convex hull algorithms have been extensively studied in the literature, principally because of their wide range of applications in different areas. This article presents an efficient algorithm to construct an approximate convex hull of a set of n points in the plane in O(n + k) time, where k is the approximation error control parameter. The proposed algorithm is suitable for applications that prefer to trade some accuracy for reduced computation time, such as animation and interaction in computer graphics, where rapid and real-time rendering is indispensable.
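A sketch in the spirit of strip-based approximate hulls (the Bentley-Faust-Preparata idea; the paper's own algorithm may differ): split the x-range into k strips, keep each strip's vertical extremes in one O(n) pass, then run an exact hull on the O(k) survivors.

```python
def approx_hull(points, k):
    """Approximate convex hull: one O(n) pass keeps each of k x-strips' lowest
    and highest point (plus the exact x-extremes), then an exact hull is run on
    the small surviving subset. Error is bounded by the strip width."""
    xs = [p[0] for p in points]
    xmin, xmax = min(xs), max(xs)
    width = (xmax - xmin) / k or 1.0
    lo, hi = {}, {}
    for p in points:
        s = min(int((p[0] - xmin) / width), k - 1)
        if s not in lo or p[1] < lo[s][1]:
            lo[s] = p
        if s not in hi or p[1] > hi[s][1]:
            hi[s] = p
    subset = set(lo.values()) | set(hi.values())
    subset |= {min(points), max(points)}      # keep the x-extremes exactly
    return monotone_chain(sorted(subset))

def monotone_chain(pts):
    """Andrew's monotone chain exact hull, applied to the reduced point set."""
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    def half(seq):
        h = []
        for p in seq:
            while len(h) >= 2 and cross(h[-2], h[-1], p) <= 0:
                h.pop()
            h.append(p)
        return h
    if len(pts) <= 2:
        return pts
    lower, upper = half(pts), half(pts[::-1])
    return lower[:-1] + upper[:-1]

hull = approx_hull([(0, 0), (4, 0), (4, 4), (0, 4), (2, 2)], k=4)
```

The strip pass is linear in n and the exact hull touches at most 2k + 2 points, matching the O(n + k)-style trade-off the abstract describes.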
Abstract: Artificial atoms are a growing field of interest due to their physical and optoelectronic applications. The absorption spectra of the proposed artificial atom in the presence of a terahertz field are investigated theoretically. We use the non-perturbative Floquet theory and the finite difference method to study the electronic structure of the artificial atom. The effect of a static electric field on the energy levels of the artificial atom is studied, and the effect of the orientation of the static electric field on the energy levels and dipole matrix elements is also highlighted.
Abstract: This paper presents a novel CMOS four-transistor SRAM cell for very high density and low power embedded SRAM applications as well as for stand-alone SRAM applications. The cell retains its data through leakage current and positive feedback, without a refresh cycle. The new cell is 20% smaller than a conventional six-transistor cell using the same design rules. The proposed cell uses two word-lines and one pair of bit-lines; read operations are performed from one side of the cell and write operations from the other, and the swing voltage on the word-lines is reduced, so dynamic power during read/write operations is reduced. The fabrication process is fully compatible with high-performance CMOS logic technologies, because there is no need to integrate a poly-Si resistor or a TFT load. HSPICE simulation in a standard 0.25 μm CMOS technology confirms all results reported in this paper.
Abstract: System testing exercises the entire system against the Functional Requirement Specification and/or the System Requirement Specification. Moreover, it is an investigatory testing phase, where the focus is to adopt an almost destructive attitude and test not only the design but also the behavior and even the believed expectations of the customer. It is also intended to test up to and beyond the bounds defined in the software/hardware requirements specifications. In Motorola®, automated testing is one of the testing methodologies used by GSG-iSGT (Global Software Group - iDEN™ Subscriber Group-Test) to increase testing volume and productivity and to reduce the test cycle time in iDEN™ phone testing, producing more robust products before release to the market. In this paper, iHopper is proposed as a tool to perform stress tests on iDEN™ phones. We discuss the value that automation has brought to iDEN™ phone testing, such as improved software quality, together with some metrics. We also look at the advantages of the proposed system and discuss future work.
Abstract: Optimal load shedding (LS) design as an emergency plan is one of the main control challenges posed by emerging new uncertainties and the numerous distributed generators, including renewable energy sources, in modern power systems. This paper presents an overview of the key issues and new challenges in optimal LS synthesis concerning the integration of wind turbine units into power systems. Following a brief survey of existing LS methods, the impact of the power fluctuations produced by wind power on system frequency and voltage performance is presented. Most LS schemes proposed so far use a voltage or frequency parameter via under-voltage or under-frequency LS schemes. Here, the necessity of considering both voltage and frequency indices to achieve a more effective and comprehensive LS strategy is emphasized, and it is clarified that this issue becomes more pronounced in the presence of wind turbines.
Abstract: The normalized difference vegetation index (NDVI) and the normalized difference moisture index (NDMI) derived from the moderate resolution imaging spectroradiometer (MODIS) have been widely used to identify spatial information on drought conditions. The relationship between NDVI and NDMI was analyzed using Pearson correlation analysis and showed a strong positive relationship. The drought indices detected drought conditions and identified the spatial extent of drought. A comparison between a normal year and a drought year demonstrates that amplitude analysis, which considers both vegetation and moisture conditions, is an effective method for identifying drought conditions. We propose that amplitude analysis is useful for quick spatial assessment of drought information at a regional scale.
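The Pearson correlation analysis can be sketched directly from its definition (the per-pixel index values below are illustrative, not MODIS-derived data):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length value series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical co-located NDVI and NDMI pixel values.
ndvi = [0.62, 0.55, 0.40, 0.33, 0.71]
ndmi = [0.30, 0.25, 0.12, 0.08, 0.38]
r = pearson_r(ndvi, ndmi)
```

An r close to +1 over many pixels is what the abstract's "strong positive relationship" between the two indices amounts to.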