Abstract: In Orthogonal Frequency Division Multiplexing (OFDM) systems, the Peak-to-Average Power Ratio (PAR) is high. Signal clipping is a useful and simple method to reduce the PAR; however, it introduces additional noise that degrades system performance. We propose an oversampling scheme that processes the received signal with a Finite Impulse Response (FIR) filter to reduce the clipping noise. The filter coefficients are obtained from the correlation function of the received signal and the oversampled information at the receiver. The performance of the proposed technique is evaluated over a frequency-selective channel. Results show that the proposed scheme can mitigate the clipping noise significantly for OFDM systems and that, to maintain the system's capacity, the clipping ratio should be larger than 2.5.
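As a rough illustration of the quantities involved, the sketch below (our own toy example, not the paper's system) builds an oversampled multicarrier signal, measures its PAR in dB, and clips the envelope at a chosen clipping ratio. The signal model and all parameters are assumptions made for illustration.

```python
import math, random, cmath

random.seed(0)

# Toy oversampled multicarrier signal: 64 QPSK subcarriers, 4x oversampling
# (illustrative only; a real OFDM system would use an IFFT pipeline).
N = 64
L = 4
n_samples = N * L
qpsk = [cmath.exp(1j * math.pi / 4 * (2 * random.randrange(4) + 1)) for _ in range(N)]
signal = []
for n in range(n_samples):
    s = sum(qpsk[k] * cmath.exp(2j * math.pi * k * n / n_samples) for k in range(N))
    signal.append(s / math.sqrt(N))

def par_db(x):
    """Peak-to-average power ratio in dB."""
    powers = [abs(s) ** 2 for s in x]
    return 10 * math.log10(max(powers) / (sum(powers) / len(powers)))

def clip(x, cr):
    """Clip the envelope at cr times the RMS amplitude, keeping the phase."""
    rms = math.sqrt(sum(abs(s) ** 2 for s in x) / len(x))
    a_max = cr * rms
    return [s if abs(s) <= a_max else a_max * s / abs(s) for s in x]

clipped = clip(signal, 2.5)  # clipping ratio 2.5, the threshold the abstract suggests
print(par_db(signal), par_db(clipped))
```

Clipping caps the envelope, so the PAR of the clipped signal never exceeds that of the original; the cost is the nonlinear clipping noise that the paper's FIR scheme targets.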
Abstract: This paper proposes a method for improving classification
efficiency in a classification model. The model is used
in a risk search system and extracts specific labels from articles
posted on bulletin board sites. The system can analyze the important
discussions composed of these articles. The improvement method
introduces ensemble learning, which uses multiple classification
models, and it introduces expressions related to the specific labels
into the generation of word vectors. The paper applies the improvement
method to articles collected from three bulletin board sites selected
by users and verifies its effectiveness.
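A minimal sketch of the ensemble idea (the classifiers, label names, and rules below are hypothetical stand-ins, not the paper's models): several base classifiers each label an article, and the ensemble combines their outputs by majority vote.

```python
from collections import Counter

# Toy ensemble: each base classifier labels an article (here, a bag of
# words); the ensemble combines their outputs by majority vote.
def clf_keyword(article):
    return "risk" if "lawsuit" in article else "other"

def clf_length(article):
    return "risk" if len(article) > 3 else "other"

def clf_pair(article):
    return "risk" if {"defect", "recall"} & set(article) else "other"

def ensemble_predict(article, classifiers):
    votes = Counter(c(article) for c in classifiers)
    return votes.most_common(1)[0][0]

models = [clf_keyword, clf_length, clf_pair]
print(ensemble_predict(["product", "recall", "lawsuit", "news"], models))  # "risk"
```

The vote makes the combined decision more robust than any single model, which is the usual motivation for ensembles in this setting.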
Abstract: The objective of this research is to calculate the
optimal inventory lot sizing for each supplier and minimize the total
inventory cost, which includes the joint purchase cost of the products,
the transaction cost for the suppliers, and the holding cost for remaining
inventory. Genetic algorithms (GAs) are applied to multi-product,
multi-period inventory lot-sizing problems with supplier
selection under storage space constraints: a maximum storage space
for the decision maker in each period is considered. The decision maker
needs to determine what products to order, in what quantities, from
which suppliers, and in which periods. It is assumed that the demand for the
multiple products is known over the planning horizon. The problem is
formulated as a mixed integer program and is solved with the
GAs. Detailed computational results are presented.
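To make the GA formulation concrete, here is a heavily simplified single-product sketch (all cost figures, demands, and GA settings are hypothetical, and the paper's multi-product model is not reproduced): a chromosome is an ordering plan per period, infeasible plans are penalized, and selection, crossover, and mutation evolve the population.

```python
import random

random.seed(1)

# Toy single-product lot-sizing instance (all numbers hypothetical):
# per-period demand, per-supplier unit price and fixed transaction cost,
# a unit holding cost, and a cap on end-of-period storage.
demand = [40, 60, 50]
price = {0: 2.0, 1: 2.3}
transaction = {0: 30.0, 1: 10.0}
holding = 0.5
capacity = 80

def cost(plan):
    """plan[t] = (supplier, quantity) ordered in period t; infeasible
    plans (unmet demand or storage overflow) get a large penalty."""
    total, stock = 0.0, 0
    for t, (s, q) in enumerate(plan):
        stock += q
        if q > 0:
            total += transaction[s] + price[s] * q
        if stock < demand[t]:
            return 1e9          # demand not met
        stock -= demand[t]
        if stock > capacity:
            return 1e9          # storage space exceeded
        total += holding * stock
    return total

def random_plan():
    return [(random.randrange(2), random.randrange(101)) for _ in demand]

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(plan):
    plan = plan.copy()
    t = random.randrange(len(plan))
    plan[t] = (random.randrange(2), random.randrange(101))
    return plan

pop = [random_plan() for _ in range(60)]
for _ in range(200):
    pop.sort(key=cost)
    elite = pop[:20]                  # selection: keep the cheapest plans
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(40)]
best = min(pop, key=cost)
print(best, cost(best))
```

The penalty-based fitness is one common way to handle the storage-space constraint inside a GA; the paper's mixed-integer formulation encodes the same feasibility conditions exactly.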
Abstract: Quality Function Deployment (QFD) is an elaborate, multi-step planning method for delivering commodities, services, and processes to customers, both external and internal to an organization. It is a way to translate between the diverse customer languages expressing demands (Voice of the Customer) and the organization's languages expressing results that satisfy those demands. The approach is to establish one or more matrices that inter-relate producer and consumer reciprocal expectations; because of its visual appearance, this matrix is called the "House of Quality" (HOQ). In this paper, we cast the HOQ as a multi-attribute decision making (MADM) problem and rank the technical specifications through a proposed MADM method. We then compute the satisfaction degree of the customer requirements, handling vagueness and uncertainty in the decision making with fuzzy set theory. The approach employs a supervised neural network (perceptron) to solve the MADM problem.
Abstract: A gene, the principal unit of inheritance, is an ordered
sequence of nucleotides. The genes of eukaryotic organisms include
alternating segments of exons and introns. A region of
Deoxyribonucleic acid (DNA) within a gene containing instructions
for coding a protein is called an exon, while the non-coding
regions, called introns, are parts of the DNA that regulate gene
expression and are removed from the messenger Ribonucleic acid (RNA)
in a splicing process. This paper proposes to determine splice
junctions, the exon-intron boundaries, by analyzing DNA
sequences. A splice junction can be either exon-intron (EI) or intron-exon
(IE). Because of the popularity and suitability of
artificial neural networks (ANNs) in genetic fields, various ANN
models are applied in this research. Multi-layer Perceptron (MLP),
Radial Basis Function (RBF) and Generalized Regression Neural
Network (GRNN) models are used to analyze and detect the splice junctions
of gene sequences. 10-fold cross validation is used to estimate
the accuracy of the networks, and their real performance
is assessed by Receiver Operating Characteristic (ROC)
analysis.
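The two evaluation tools the abstract names, 10-fold cross validation and ROC analysis, can be sketched generically (synthetic scores stand in for the ANN outputs on encoded DNA windows; the classifier below is a toy threshold rule, not MLP/RBF/GRNN):

```python
import random

random.seed(2)

# Synthetic two-class scores standing in for network outputs on encoded
# DNA windows (purely illustrative): class 1 values tend to be higher.
data = [(random.gauss(0.0, 1.0), 0) for _ in range(100)] + \
       [(random.gauss(1.5, 1.0), 1) for _ in range(100)]
random.shuffle(data)

def kfold(items, k=10):
    """Yield (train, test) splits for k-fold cross validation."""
    for i in range(k):
        yield [x for j, x in enumerate(items) if j % k != i], items[i::k]

def auc(scored):
    """Area under the ROC curve: probability that a random positive
    scores above a random negative (ties count half)."""
    pos = [s for s, y in scored if y == 1]
    neg = [s for s, y in scored if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

accs, aucs = [], []
for train, test in kfold(data):
    # "Training" a toy threshold classifier: midpoint of the class means.
    m0 = sum(s for s, y in train if y == 0) / sum(1 for _, y in train if y == 0)
    m1 = sum(s for s, y in train if y == 1) / sum(1 for _, y in train if y == 1)
    thr = (m0 + m1) / 2
    accs.append(sum((s > thr) == y for s, y in test) / len(test))
    if {y for _, y in test} == {0, 1}:   # AUC needs both classes present
        aucs.append(auc(test))
print(sum(accs) / len(accs), sum(aucs) / len(aucs))
```

Averaging accuracy over the 10 held-out folds gives the cross-validated estimate, while the AUC summarizes the ROC curve independently of any single decision threshold.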
Abstract: Carbon nanotubes (CNTs) with their high mechanical,
electrical, thermal and chemical properties are regarded as promising
materials for many different potential applications. Having unique
properties they can be used in a wide range of fields such as
electronic devices, electrodes, drug delivery systems, hydrogen
storage, textiles, etc. Catalytic chemical vapor deposition (CCVD) is a
common method for CNT production, especially for mass production.
Catalysts impregnated on a suitable substrate are important for
production with the chemical vapor deposition (CVD) method; an iron
catalyst on an MgO substrate is one of the most common catalyst-substrate
combinations used for CNT synthesis. In this study, CNTs were produced by
CCVD of acetylene (C2H2) on a magnesium oxide (MgO) powder
substrate impregnated with iron nitrate (Fe(NO3)3·9H2O) solution.
Multiwall and single-wall CNTs were produced at synthesis temperatures
of 500 and 800 °C, respectively. Iron (Fe) catalysts were prepared with Fe:MgO ratios
of 1:100, 5:100 and 10:100, and the synthesis duration was 30 or
60 minutes for each temperature and catalyst percentage. The
synthesized materials were characterized by thermal gravimetric
analysis (TGA), transmission electron microscopy (TEM) and Raman
spectroscopy.
Abstract: The three-time-scale plant model of a wind power
generator, including a wind turbine, a flexible vertical shaft, a Variable
Inertia Flywheel (VIF) module, an Active Magnetic Bearing (AMB)
unit and the applied wind sequence, is constructed. To keep
the wind power generator operating when the spindle speed
exceeds its rated value, the VIF is equipped so that the spindle speed
can be appropriately slowed down whenever a stronger wind field is
exerted. To prevent potential damage from collision of the shaft
against conventional bearings, the AMB unit is proposed to regulate
the shaft position deviation. By a singular perturbation order-reduction
technique, a lower-order plant model can be established for the
synthesis of the feedback controller. Two major system parameter
uncertainties, an additive uncertainty and a multiplicative uncertainty,
are constituted by the wind turbine and the VIF, respectively. A
Frequency-Shaping Sliding Mode Control (FSSMC) loop is proposed
to account for these uncertainties and suppress the unmodeled
higher-order plant dynamics. Finally, the efficacy of the FSSMC is
verified by intensive computer and experimental simulations of the
regulation of the shaft position deviation and the counter-balancing of
unpredictable wind disturbances.
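The two uncertainty classes named above have a standard robust-control form; as a hedged sketch in our own notation (the paper's exact weighting functions are not given here), with nominal reduced-order plant \(P_0(s)\):

```latex
P(s) = P_0(s) + \Delta_a(s)
\qquad \text{(additive uncertainty, from the wind turbine)},
```

```latex
P(s) = P_0(s)\,\bigl(1 + \Delta_m(s)\bigr)
\qquad \text{(multiplicative uncertainty, from the VIF)},
```

and the FSSMC loop is designed so that stability and performance hold for all admissible \(\Delta_a\) and \(\Delta_m\).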
Abstract: The aim of this study is to evaluate the antinociceptive
and anti-inflammatory activity of Geum kokanicum. After
determination of the LD50 of the total extract, different doses of extract were
chosen for intraperitoneal injection. In the inflammation test, male NMRI
mice were divided into 6 groups: control (normal saline), positive
control (dexamethasone 15 mg/kg), and total extract (0.025, 0.05,
0.1, and 0.2 g/kg). Inflammation was produced by xylene-induced
edema. To evaluate the antinociceptive effect of the total
extract, the formalin test was used. Mice were divided into 6 groups:
control, positive control (morphine 10 mg/kg), and 4 groups which
received the total extract. They then received formalin, and the animals
were observed for their reaction to pain. Data were analyzed using
one-way ANOVA followed by the Tukey-Kramer multiple comparison
test. The LD50 was 1 g/kg. The data indicated that the 0.5, 0.1 and 0.2 g/kg
doses of the total extract have notable antinociceptive and anti-inflammatory
effects in comparison with the control (P
Abstract: This paper presents an interval-based multi-attribute
decision making (MADM) approach in support of the decision
process with imprecise information. The proposed decision
methodology is based on the model of linear additive utility function
but extends the problem formulation with the measure of composite
utility variance. A sample study concerning the evaluation of
electric generation expansion strategies is provided showing how the
imprecise data may affect the choice toward the best solution and
how a set of alternatives, acceptable to the decision maker (DM),
may be identified with certain confidence.
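One simple way to obtain a composite utility mean and variance from interval-valued data, sketched below under an assumption of independent uniform distributions over the intervals (our illustrative assumption, not necessarily the paper's model), with hypothetical strategies and weights:

```python
# Composite utility of an alternative under a linear additive model when
# each attribute utility is only known as an interval [a, b]. Assuming
# independent uniform distributions: mean (a+b)/2, variance (b-a)^2/12.
def composite_utility(weights, intervals):
    mean = sum(w * (a + b) / 2 for w, (a, b) in zip(weights, intervals))
    var = sum(w ** 2 * (b - a) ** 2 / 12 for w, (a, b) in zip(weights, intervals))
    return mean, var

# Two hypothetical generation-expansion strategies over three attributes.
w = [0.5, 0.3, 0.2]
strategy_a = [(0.6, 0.8), (0.4, 0.9), (0.7, 0.7)]
strategy_b = [(0.5, 0.9), (0.6, 0.7), (0.2, 0.8)]
print(composite_utility(w, strategy_a))
print(composite_utility(w, strategy_b))
```

Comparing alternatives by mean alone can mislead when intervals are wide; the variance term lets the decision maker attach a confidence to the ranking, which is the abstract's point.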
Abstract: In this article we discuss the improvement
of the multiclass classification problem using a multilayer
perceptron. The considered approach consists in breaking down the
n-class problem into two-class subproblems. The training of each
two-class subproblem is done independently; in the test phase,
the vector to be classified is confronted with all the two-class
models, and the elected class is the strongest one, which loses
no competition against the other classes. Recognition rates
obtained with the multiclass approach by two-class decomposition
are clearly better than those obtained by the simple multiclass
approach.
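The pairwise (one-vs-one) decomposition can be sketched as follows; toy one-dimensional threshold "models" stand in for the paper's two-class perceptrons, and all data are hypothetical:

```python
from itertools import combinations

# One-vs-one decomposition: one two-class model per pair of classes; the
# elected class is the one winning the most pairwise contests.
def train_pair(samples, ca, cb):
    """Return a tiny two-class decider: midpoint threshold between the
    means of the two classes (stand-in for a two-class perceptron)."""
    ma = sum(x for x, y in samples if y == ca) / sum(1 for _, y in samples if y == ca)
    mb = sum(x for x, y in samples if y == cb) / sum(1 for _, y in samples if y == cb)
    thr = (ma + mb) / 2
    lo, hi = (ca, cb) if ma < mb else (cb, ca)
    return lambda x: lo if x < thr else hi

samples = [(0.1, "A"), (0.2, "A"), (1.0, "B"), (1.1, "B"), (2.0, "C"), (2.1, "C")]
classes = ["A", "B", "C"]
pairs = {(a, b): train_pair(samples, a, b) for a, b in combinations(classes, 2)}

def predict(x):
    votes = {c: 0 for c in classes}
    for model in pairs.values():
        votes[model(x)] += 1          # confront x with every two-class model
    return max(votes, key=votes.get)  # the class losing no contest wins all votes

print(predict(0.15), predict(1.05), predict(2.2))  # A B C
```

An n-class problem yields n(n-1)/2 such two-class models, each trained independently, which is what makes the decomposition attractive.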
Abstract: In this work we adopt a combination of Laplace
transform and the decomposition method to find numerical solutions
of a system of multi-pantograph equations. The procedure leads to a
rapid convergence of the series to the exact solution after computing a
few terms. The effectiveness of the method is demonstrated in some
examples by obtaining the exact solution and in others by computing
the absolute error which decreases as the number of terms of the series
increases.
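For context, one common form of the multi-pantograph equation and the Laplace-decomposition recursion is sketched below (standard notation, assumed here; the paper's exact formulation may differ):

```latex
u'(t) = a\,u(t) + \sum_{j=1}^{\ell} b_j\,u(q_j t),
\qquad u(0) = u_0, \quad 0 < q_j < 1 .
```

Taking Laplace transforms, using \(\mathcal{L}[u(q t)](s) = q^{-1}\,U(s/q)\), and writing \(U = \sum_{n \ge 0} U_n\) gives the recursion

```latex
U_0(s) = \frac{u_0}{s}, \qquad
U_{n+1}(s) = \frac{1}{s}\Bigl( a\,U_n(s)
  + \sum_{j=1}^{\ell} \frac{b_j}{q_j}\, U_n\!\left(\frac{s}{q_j}\right) \Bigr),
```

whose inverse transforms are summed term by term, so that a few terms already approximate the exact solution when the series converges rapidly.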
Abstract: Multimedia information availability has increased
dramatically with the advent of video broadcasting on handheld
devices. But with this availability come problems of maintaining the
security of information that is displayed in public. ISMA Encryption
and Authentication (ISMACryp) is one of the chosen technologies for
service protection in DVB-H (Digital Video Broadcasting -
Handheld), the TV system for portable handheld devices.
ISMACryp encrypts H.264/AVC (Advanced Video Coding) content
while leaving all structural data as it is. Two modes of ISMACryp are
available: the CTR (Counter) mode and the CBC (Cipher
Block Chaining) mode. Both modes of ISMACryp are based on the 128-bit
AES algorithm. The AES algorithm is complex and requires
a long execution time, which is not suitable for real-time
applications such as live TV. The proposed system aims to gain a deep
understanding of video data security in multimedia technologies and
to provide security for real-time video applications using selective
encryption for H.264/AVC. Five levels of security are proposed in this
paper based on the content of the NAL unit in the Baseline Constrained profile
of H.264/AVC. The selective encryption at the different levels covers
the intra-prediction modes, residue data, inter-prediction
modes or motion vectors only. The experimental results presented in this
paper show that the fifth level, which is ISMACryp, provides the highest
level of security at the cost of more encryption time, while the first level
provides a lower level of security, encrypting only motion vectors with a lower
execution time, without compromising the compression or the quality of the
visual content. The encryption scheme combines with the compression process
at low cost and keeps the file format unchanged, with some direct
operations supported. Simulations were carried out in Matlab.
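The counter-mode principle behind ISMACryp's CTR option can be illustrated with a toy keystream cipher. To be clear about the assumptions: this is NOT AES and not the ISMACryp specification; a keyed hash stands in for the block cipher purely to show why XOR-based counter mode leaves the file format untouched, since only the selected payload bytes change and the same operation decrypts.

```python
import hashlib

# Toy counter (CTR) mode built from a keyed hash as the keystream
# generator. Illustrative only: real ISMACryp uses 128-bit AES.
def ctr_keystream(key, nonce, length):
    out = b""
    counter = 0
    while len(out) < length:
        block = hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        out += block
        counter += 1
    return out[:length]

def ctr_crypt(key, nonce, payload):
    ks = ctr_keystream(key, nonce, len(payload))
    return bytes(a ^ b for a, b in zip(payload, ks))  # XOR: encrypts and decrypts

key, nonce = b"demo-key", b"demo-nonce"
payload = b"residue data of one NAL unit"   # hypothetical selected portion
ct = ctr_crypt(key, nonce, payload)
assert ctr_crypt(key, nonce, ct) == payload  # same operation round-trips
print(ct.hex())
```

Because the ciphertext has exactly the length of the plaintext, selective encryption of intra-prediction modes, residue data, or motion vectors can be done in place, which is why the container format and compression are unaffected.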
Abstract: The healthcare environment is generally perceived as
being information rich yet knowledge poor. However, there is a lack
of effective analysis tools to discover hidden relationships and trends
in data. In fact, valuable knowledge can be discovered from
application of data mining techniques in healthcare systems. In this
study, a methodology is presented for the extraction of significant
patterns from Coronary Heart Disease data warehouses for heart
attack prediction, a condition which unfortunately continues to be a leading cause
of mortality worldwide. For this purpose,
we propose to enumerate dynamically the optimal subsets of the
reduced features of high interest by using the rough sets technique
combined with dynamic programming, and then to
validate the classification using Random Forest (RF) decision trees to
identify the risky heart disease cases. This work is based on a large
amount of data collected from several clinical institutions, based on
the medical profiles of patients. Moreover, the experts' knowledge in
this field has been taken into consideration in order to define the
disease, its risk factors, and to establish significant knowledge
relationships among the medical factors. A computer-aided system is
developed for this purpose based on a population of 525 adults. The
performance of the proposed model is analyzed and evaluated against
a set of benchmark techniques applied to this classification problem.
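The rough-sets feature-reduction step rests on a simple consistency test, sketched below on hypothetical patient records (attribute names and values are invented for illustration): a feature subset is a reduct candidate only if objects indiscernible on that subset never carry different decision labels.

```python
# Minimal rough-set consistency check: does the feature subset preserve
# the classification (no two indiscernible rows with different labels)?
def consistent(rows, features, decision):
    seen = {}
    for row in rows:
        key = tuple(row[f] for f in features)   # indiscernibility class
        if seen.setdefault(key, row[decision]) != row[decision]:
            return False                        # conflicting labels
    return True

patients = [   # hypothetical records, not the study's data
    {"chest_pain": 1, "smoker": 1, "age_band": "old",   "risk": "high"},
    {"chest_pain": 1, "smoker": 0, "age_band": "old",   "risk": "high"},
    {"chest_pain": 0, "smoker": 1, "age_band": "young", "risk": "low"},
    {"chest_pain": 0, "smoker": 0, "age_band": "old",   "risk": "low"},
    {"chest_pain": 1, "smoker": 1, "age_band": "young", "risk": "high"},
]
print(consistent(patients, ["chest_pain"], "risk"))   # True: preserves labels
print(consistent(patients, ["smoker"], "risk"))       # False: conflicts
```

Enumerating such subsets efficiently is where the paper's dynamic programming comes in; the surviving reduced features then feed the Random Forest classifier.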
Abstract: The Wavelet-Galerkin finite element method for
solving the one-dimensional heat equation is presented in this work.
Two types of basis functions which are the Lagrange and multi-level
wavelet bases are employed to derive the full form of matrix system.
We consider both linear and quadratic bases in the Galerkin method.
The time derivative is approximated by a polynomial time basis, which
makes it easy to extend the order of approximation in time. Our
numerical results show that the rates of convergence for the linear
Lagrange and the linear wavelet bases are the same, of order 2,
while the rates of convergence for the quadratic Lagrange and the
quadratic wavelet bases are approximately of order 4. They also reveal
that the wavelet basis provides an easy way to improve the
numerical resolution, simply by increasing the desired number of
levels in the multilevel construction process.
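For reference, the semi-discrete Galerkin system underlying this setup can be sketched as follows (standard formulation, assuming homogeneous Dirichlet boundary conditions on the unit interval):

```latex
u_t = \alpha\, u_{xx}, \qquad u(0,t) = u(1,t) = 0, \qquad
u(x,t) \approx \sum_{j} c_j(t)\,\phi_j(x),
```

and testing against each basis function \(\phi_i\), after integration by parts, yields

```latex
M\,\dot{c} + \alpha\,K\,c = 0, \qquad
M_{ij} = \int_0^1 \phi_i\,\phi_j\,dx, \qquad
K_{ij} = \int_0^1 \phi_i'\,\phi_j'\,dx,
```

where the \(\phi_j\) are the Lagrange or wavelet basis functions and the time derivative \(\dot{c}\) is in turn expanded in the polynomial time basis.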
Abstract: In this paper, the hardware implementation of the
RSA public-key cryptographic algorithm is presented. The RSA
cryptographic algorithm depends on the computation of repeated
modular exponentiations.
The Montgomery algorithm is used and modified to reduce
hardware resources and to achieve a reasonable operating speed on an
FPGA. An efficient architecture for modular multiplication based on
an array multiplier is proposed, and an RSA
cryptosystem based on the Montgomery algorithm has been implemented. The results
show that the proposed architecture achieves small area and
reasonable speed.
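Montgomery multiplication replaces the costly division in modular reduction with shifts and additions, which is what makes it attractive for an array-multiplier architecture. A software sketch of the arithmetic (illustrative; the paper's radix and word sizes are hardware choices not reproduced here, and the toy modulus is hypothetical):

```python
# Software sketch of Montgomery modular multiplication and the
# square-and-multiply exponentiation at the core of RSA.
def montgomery_params(n, k):
    """R = 2**k with R > n and n odd; n_prime = -n^{-1} mod R."""
    r = 1 << k
    n_prime = (-pow(n, -1, r)) % r
    return r, n_prime

def redc(t, n, k, n_prime):
    """Montgomery reduction: returns t * R^{-1} mod n (needs t < n*R)."""
    r_mask = (1 << k) - 1
    m = ((t & r_mask) * n_prime) & r_mask
    u = (t + m * n) >> k            # division by R is just a shift
    return u - n if u >= n else u

def mont_mul(a_bar, b_bar, n, k, n_prime):
    return redc(a_bar * b_bar, n, k, n_prime)

def mont_exp(base, exp, n, k):
    """Modular exponentiation entirely in the Montgomery domain."""
    r, n_prime = montgomery_params(n, k)
    base_bar = (base * r) % n       # enter Montgomery form
    acc = r % n                     # Montgomery form of 1
    for bit in bin(exp)[2:]:        # square-and-multiply, MSB first
        acc = mont_mul(acc, acc, n, k, n_prime)
        if bit == "1":
            acc = mont_mul(acc, base_bar, n, k, n_prime)
    return redc(acc, n, k, n_prime)  # leave Montgomery form

n = 0xE3F9_92C5_52C1_1F4B           # odd toy modulus
print(mont_exp(0x1234_5678, 65537, n, n.bit_length()))
```

In hardware, `redc` maps naturally onto the array multiplier because the reduction step uses only multiplications, additions, and a fixed shift, never a trial division by the modulus.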
Abstract: The paper presents an approach for handling uncertain
information in deductive databases using multivalued logics. Uncertainty
means that database facts may be assigned logical values other
than the conventional ones, true and false. The logical values represent
various degrees of truth, which may be combined and propagated
by applying the database rules. A corresponding multivalued database
semantics is defined. We show that it extends successful conventional
semantics, such as the well-founded semantics, and has polynomial-time
data complexity.
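A minimal degrees-of-truth sketch of how such values might combine and propagate (our own illustration with min for conjunction and max for disjunction, iterated to a fixpoint; the paper's exact multivalued semantics is not reproduced, and the facts are hypothetical):

```python
# Facts carry truth degrees in [0, 1]; a rule's head receives the minimum
# of its body values, and repeated derivations keep the maximum,
# iterated to a fixpoint.
facts = {"reliable_source(a)": 0.9, "reported(a, event)": 0.6}
rules = [  # (head, body)
    ("happened(event)", ["reliable_source(a)", "reported(a, event)"]),
    ("newsworthy(event)", ["happened(event)"]),
]

changed = True
while changed:                       # naive fixpoint iteration
    changed = False
    for head, body in rules:
        v = min(facts.get(b, 0.0) for b in body)
        if v > facts.get(head, 0.0):
            facts[head] = v
            changed = True
print(facts["happened(event)"], facts["newsworthy(event)"])  # 0.6 0.6
```

The iteration terminates because truth values only increase and are drawn from a finite set, which is also the intuition behind the polynomial-time data complexity claimed in the abstract.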
Abstract: Decision fusion is one of the hot research topics in
the classification area, which aims to achieve the best possible
performance for the task at hand. In this paper, we
investigate the usefulness of this concept to improve change
detection accuracy in remote sensing. Thereby, outputs of
two fuzzy change detectors based respectively on
simultaneous and comparative analysis of multitemporal
data are fused by using fuzzy integral operators. This
method fuses the objective evidences produced by the
change detectors with respect to fuzzy measures that express
the difference of performance between them. The proposed
fusion framework is evaluated in comparison with some
ordinary fuzzy aggregation operators. Experiments carried
out on two SPOT images showed that the fuzzy integral performed
best: it improves the change detection
accuracy while attempting to equalize the accuracy rates of
the change and no-change classes.
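One fuzzy integral operator commonly used for this kind of fusion is the Choquet integral, sketched below; the measure values are hypothetical weights expressing a performance difference between the two detectors, in the spirit of the abstract.

```python
# Choquet fuzzy integral fusing the "change" evidence of two detectors
# with respect to a fuzzy measure g (g of the full source set is 1).
def choquet(evidence, measure):
    """evidence: {source: h in [0,1]}; measure: {frozenset of sources: g}."""
    items = sorted(evidence.items(), key=lambda kv: kv[1], reverse=True)
    heights = [h for _, h in items] + [0.0]
    total, subset = 0.0, frozenset()
    for i, (src, _) in enumerate(items):
        subset = subset | {src}                       # top-i sources
        total += (heights[i] - heights[i + 1]) * measure[subset]
    return total

measure = {   # hypothetical fuzzy measure: detector performance weights
    frozenset({"simultaneous"}): 0.7,
    frozenset({"comparative"}): 0.5,
    frozenset({"simultaneous", "comparative"}): 1.0,
}
pixel_evidence = {"simultaneous": 0.8, "comparative": 0.4}
print(choquet(pixel_evidence, measure))  # 0.68
```

Unlike a plain weighted average, the fuzzy measure can reward agreement between the detectors (here the pair is worth more than the sum of the singletons would suggest), which is what lets the fusion exploit their performance difference.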
Abstract: It is impossible to think about democracy without elections. The litmus test of any electoral process in any country is the possibility of a one-time minority becoming a majority at another time, with a peaceful transition of power. In many countries in Sub-Saharan Africa, though multi-party elections appeared to be competitive, they failed the acid test of democracy: peaceful regime change in a free and fair election. Failure to solve electoral disputes can lead to bloody electoral conflicts, as witnessed in many emerging democracies in Africa. The aim of this paper is to investigate electoral conflicts in Africa since the end of the Cold War, using the 2005 post-election violence in Ethiopia as a case study. In Ethiopia, the coming to power of the EPRDF in 1991 marked the fall of the Derg dictatorial military government and the beginning of a multi-party democracy. The country held multi-party parliamentary elections in 1995, 2000, and 2005, in which the ruling EPRDF party "won" the elections through violence, involving intimidation, manipulation, detention of political opponents, torture, and political assassinations. The 2005 electoral violence was the worst in the country's political history, leading to the deaths of 193 protestors and the imprisonment of more than 40,000 people. It is found that the major causes of the 2005 Ethiopian electoral violence were the defeat of the ruling party in the election and its attempt to reverse the poll results by force; the Opposition's lack of decisive leadership; the absence of independent courts and an independent electoral management body; and the ruling party's direct control over the army and police.
Abstract: This research aims to develop and evaluate a training
course to promote the learning activities of 2nd-year Faculty of Education
students at Suan Sunandha Rajabhat University using multiple
intelligences theory. The process is divided into two phases. Phase 1 was the
development of the training course to promote learning activities,
consisting of the principles and objectives of the course, its structure, training
duration, content, training materials, training activities, training
media, monitoring, measurement, and evaluation of the quality of the
course. Phase 2, the evaluation of the efficiency of the training course, used
the improved curriculum with an experimental group of 152 students
drawn randomly from the 2nd-year Faculty of Education students of
Suan Sunandha Rajabhat University. The experimental design was a
randomized control-group pre-test post-test design, and the data were analyzed
by t-test with SPSS for Windows. The research has shown
that: 1) the ability of teaching and learning according to the theory of
multiple intelligences after training is significantly higher than before training
at the .01 level of statistical significance, and 2) the satisfaction of the students
with the training course was overall at the highest level.
Abstract: In this paper, the implementation of a rule-based
intuitive reasoner is presented. The implementation included two
parts: the rule induction module and the intuitive reasoner. A large
weather database was acquired as the data source. Twelve weather
variables from those data were chosen as the “target variables"
whose values were predicted by the intuitive reasoner. A "complex"
situation was simulated by making only subsets of the data available
to the rule induction module. As a result, the rules induced were
based on incomplete information with variable levels of certainty.
The certainty level was modeled by a metric called "Strength of
Belief", which was assigned to each rule or datum as ancillary
information about the confidence in its accuracy. Two techniques
were employed to induce rules from the data subsets: decision tree
and multi-polynomial regression, respectively for the discrete and the
continuous type of target variables. The intuitive reasoner was tested
for its ability to use the induced rules to predict the classes of the
discrete target variables and the values of the continuous target
variables. The intuitive reasoner implemented two types of
reasoning, fast and broad, where, by analogy to human thought, the
former corresponds to fast decision making and the latter to deeper
contemplation. For reference, a weather data analysis approach
which had been applied on similar tasks was adopted to analyze the
complete database and create predictive models for the same 12
target variables. The values predicted by the intuitive reasoner and
the reference approach were compared with actual data. The intuitive
reasoner reached near-100% accuracy for two continuous target
variables. For the discrete target variables, the intuitive reasoner
predicted at least 70% as accurately as the reference reasoner. Since
the intuitive reasoner operated on rules derived from only about 10%
of the total data, it demonstrated the potential advantages in dealing
with sparse data sets as compared with conventional methods.
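How a Strength-of-Belief value might accompany a prediction can be sketched minimally (our own illustration of the metric's role, not the paper's exact formula; the rule and numbers are hypothetical):

```python
# A prediction's Strength of Belief is taken here as the product of the
# rule's own belief and the weakest belief among the data it consumes,
# so confidence can only degrade as uncertain inputs accumulate.
def apply_rule(rule_belief, datum_beliefs, prediction):
    strength = rule_belief * min(datum_beliefs)
    return prediction, strength

# Hypothetical induced rule: "humidity high and pressure falling -> rain",
# induced from a 10% data subset, hence a modest belief of 0.7.
pred, sob = apply_rule(0.7, [0.9, 0.8], "rain")
print(pred, round(sob, 3))  # rain 0.56
```

Carrying such a value with every rule and datum lets the reasoner rank competing conclusions drawn from incomplete information, which is the role the abstract assigns to Strength of Belief.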