Abstract: This paper presents a formalisation of the existing code mutation techniques (polymorphism and metamorphism) by means of formal grammars. While very few theoretical results are known about the detection complexity of viral mutation techniques, we address this critical issue exhaustively by considering the Chomsky classification of formal grammars. This enables us to determine which families of code mutation techniques are likely to be detected and which, on the contrary, are bound to remain undetected. As an illustration we then present, on a formal basis, a proof-of-concept metamorphic mutation engine denoted PB MOT, whose detection has been proven to be undecidable.
Abstract: In this paper, we propose texture-feature-based language identification using wavelet-domain BDIP (block difference of inverse probabilities) and BVLC (block variance of local correlation coefficients) features and an FFT (fast Fourier transform) feature. In the proposed method, wavelet subbands are first obtained by wavelet transform of a test image and denoised by Donoho's soft-thresholding. The BDIP and BVLC operators are next applied to the wavelet subbands. FFT blocks are also obtained by 2D (two-dimensional) FFT from the blocks into which the test image is partitioned. Some significant FFT coefficients in each block are selected and a magnitude operator is applied to them. Moments for each subband of BDIP and BVLC and for each magnitude of the significant FFT coefficients are then computed and fused into a feature vector. In classification, a stabilized Bayesian classifier, which adopts variance thresholding, searches for the training feature vector most similar to the test feature vector. Experimental results show that the proposed method with the three operations yields excellent language identification even with a rather low feature dimension.
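The denoising and BDIP steps above can be sketched as follows. This is a minimal illustration, assuming the usual definitions of Donoho soft-thresholding and of the BDIP operator over non-overlapping blocks; the block size `m=2` and the choice of moments (mean, standard deviation) are illustrative, not necessarily the paper's settings:

```python
import numpy as np

def soft_threshold(coeffs, t):
    # Donoho soft-thresholding: shrink coefficients toward zero by t.
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)

def bdip(block):
    # Block Difference of Inverse Probabilities for one m x m block:
    # BDIP = m^2 - sum(I) / max(I).
    mx = block.max()
    if mx == 0:
        return 0.0
    return block.size - block.sum() / mx

def block_features(subband, m=2):
    # Apply BDIP over non-overlapping m x m blocks and summarize the
    # responses with their first two moments, as in the fusion step.
    h, w = subband.shape
    vals = np.array([bdip(subband[i:i + m, j:j + m])
                     for i in range(0, h - m + 1, m)
                     for j in range(0, w - m + 1, m)])
    return vals.mean(), vals.std()
```

A BVLC feature would be computed analogously per block (maximum minus minimum local correlation over four orientations) before the moment fusion.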
Abstract: This paper presents a novel methodology for Maximum Power Point Tracking (MPPT) of a grid-connected 20 kW Photovoltaic (PV) system using a neuro-fuzzy network. The proposed method predicts the reference PV voltage guaranteeing optimal power transfer between the PV generator and the main utility grid. The neuro-fuzzy network is composed of a fuzzy rule-based classifier and three Radial Basis Function Neural Networks (RBFNN). The inputs of the network (irradiance and temperature) are classified before they are fed into the appropriate RBFNN for either the training or the estimation process, while the output is the reference voltage. The main advantage of the proposed methodology, compared to a conventional single neural network-based approach, is its distinct generalization ability with regard to the nonlinear and dynamic behavior of a PV generator. In fact, the neuro-fuzzy network is a neural-network-based multi-model machine learning scheme that defines a set of local models emulating the complex and non-linear behavior of a PV generator under a wide range of operating conditions. Simulation results under several rapid irradiance variations show that the proposed MPPT method achieves higher efficiency than a conventional single neural network.
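The classify-then-estimate structure described above can be sketched as follows. The irradiance thresholds, the number of centers, and all network parameters are illustrative assumptions, not the paper's trained values:

```python
import numpy as np

def rbf_predict(x, centers, widths, weights):
    # One Radial Basis Function network: the output is a weighted sum
    # of Gaussian activations; centers/widths/weights come from training.
    phi = np.exp(-np.sum((centers - x) ** 2, axis=1) / (2 * widths ** 2))
    return float(phi @ weights)

def reference_voltage(irradiance, temperature, networks):
    # Hypothetical rule-based routing: classify the operating point by
    # irradiance level, then let the matching local RBFNN estimate the
    # reference PV voltage (the paper's classifier is fuzzy, not crisp).
    if irradiance < 300:
        net = networks["low"]
    elif irradiance < 700:
        net = networks["mid"]
    else:
        net = networks["high"]
    x = np.array([irradiance, temperature])
    return rbf_predict(x, *net)
```

Each local model only ever sees inputs from its own operating region, which is what gives the multi-model scheme its generalization advantage over a single network trained on all regimes at once.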
Abstract: Nowadays, the rapid development of multimedia and the internet allows wide distribution of digital media data. It has become much easier to edit, modify and duplicate digital information. Digital documents are also easy to copy and distribute, and therefore face many threats. With the large flood of information and the development of digital formats, this is a major security and privacy issue, and it has become necessary to find appropriate protection because of the significance, accuracy and sensitivity of the information. Protection systems can be classified more specifically into information hiding, information encryption, and combinations of hiding and encryption that increase information security. The strength of information hiding as a science lies in the non-existence of standard algorithms for hiding secret messages; there is also randomness in hiding methods, such as combining several media (covers) with different methods to pass a secret message, and there are no formal methods for discovering the hidden data. For these reasons, the task of this research is difficult. In this paper, a new information hiding system is presented. The proposed system aims to hide information (a data file) in an executable file (EXE) and to detect the hidden file; we describe the implementation of a steganography system which embeds information in executable (EXE) files. The system tries to make the size of the cover file independent of the size of the hidden information, and to keep the result undetectable by anti-virus software. The system includes two main functions. The first is hiding the information in a Portable Executable file (EXE) through four processes (specify the cover file, specify the information file, encrypt the information, and hide the information). The second is extracting the hidden information through three processes (specify the stego file, extract the information, and decrypt the information). The system has achieved its main goals: the size of the cover file and the size of the information are independent, and the resulting file does not conflict with anti-virus software.
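The hide/extract pair of functions can be sketched as below. This is a simplified stand-in, not the paper's method: the marker, the XOR cipher, and the overlay-append embedding are all assumptions made for illustration (a PE loader ignores bytes appended after the file's mapped sections, so the cover still runs; the paper's encryption and embedding steps are not specified in the abstract):

```python
MARKER = b"\x00STEGO\x00"  # hypothetical delimiter, not from the paper

def xor_crypt(data, key):
    # Simple symmetric XOR cipher standing in for the (unspecified)
    # encryption/decryption step; applying it twice restores the data.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def hide(cover_path, info, key, out_path):
    # Write a copy of the cover file with the encrypted payload
    # appended as overlay data after a marker.
    with open(cover_path, "rb") as f:
        cover = f.read()
    with open(out_path, "wb") as f:
        f.write(cover + MARKER + xor_crypt(info, key))

def extract(stego_path, key):
    # Locate the marker from the end of the file and decrypt the
    # payload that follows it.
    with open(stego_path, "rb") as f:
        blob = f.read()
    return xor_crypt(blob.rsplit(MARKER, 1)[1], key)
```

Because the payload lives entirely in the overlay, its size places no constraint on the cover file, which mirrors the size-independence goal stated above.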
Abstract: In this paper, a class of predator-prey-chain models with harvesting terms is studied. By using Mawhin's continuation theorem of coincidence degree theory and some inequality techniques, sufficient conditions are established for the existence of eight positive periodic solutions. Finally, an example is presented to illustrate the feasibility and effectiveness of the results.
Abstract: Sunflower stalks were analysed for chemical composition: pentosan 15.84%, holocellulose 70.69%, alpha-cellulose 45.74%, glucose 27.10% and xylose 7.69%, based on the dry weight of 100 g of raw material. The optimum condition for steam-explosion pretreatment was as follows. Sunflower stalks were cut into small pieces and soaked in 0.02 M H2SO4 overnight. After that, they were steam-exploded at 207 °C and 21 kg/cm2 for 3 minutes to fractionate cellulose, hemicellulose and lignin. The resulting hydrolysate, containing hemicellulose, and the cellulose pulp contained xylose at 2.53% and 7.00%, respectively. The pulp was further subjected to enzymatic saccharification at 50 °C in pH 4.8 citrate buffer with pulp/buffer 6% (w/w) and Celluclast 1.5L/pulp 2.67% (w/w) to obtain glucose with a maximum yield of 11.97%. Fixed-bed fermentation under optimum conditions using conventional yeast mixtures to produce bioethanol gave a maximum ethanol yield of 0.028 g/100 g sunflower stalk.
Abstract: This paper describes a new approach to classification using genetic programming. The proposed technique consists of genetically coevolving a population of non-linear transformations of the input data to be classified, mapping them to a new space of reduced dimension in order to obtain maximum inter-class discrimination. The classification of new samples is then performed on the transformed data and thus becomes much easier. Contrary to existing GP classification techniques, the proposed one uses a dynamic partition of the transformed data into separate intervals; the quality of a given interval partition is handled by the fitness criterion, which rewards maximum class discrimination. Experiments were first performed using Fisher's Iris dataset, and then the KDD-99 Cup dataset was used to study the intrusion detection and classification problem. The obtained results demonstrate that the proposed genetic approach outperforms the existing GP classification methods [1], [2] and [3], and gives very acceptable results compared to other existing techniques proposed in [4], [5], [6], [7] and [8].
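A discrimination-driven fitness of the kind described above can be sketched for a 1-D transformed space. The Fisher-style scatter ratio used here is an assumption standing in for the paper's exact criterion:

```python
from statistics import mean

def fitness(transformed, labels):
    # Fitness sketch for a GP-evolved transformation: ratio of
    # between-class scatter to within-class scatter of the transformed
    # samples -- higher means the classes separate into cleaner
    # intervals, which is what the evolved partition exploits.
    grand = mean(transformed)
    groups = {}
    for x, y in zip(transformed, labels):
        groups.setdefault(y, []).append(x)
    between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups.values())
    within = sum(sum((x - mean(g)) ** 2 for x in g) for g in groups.values())
    return between / within if within else float("inf")
```

During evolution, each candidate transformation would be scored this way on the training set, so selection pressure pushes the population toward mappings whose class regions barely overlap.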
Abstract: This study analyzes the effect of discretization on the classification of datasets that include continuous-valued features. Six UCI datasets containing continuous-valued features are discretized with an entropy-based discretization method. The performance difference between the datasets with the original features and with the discretized features is compared using the k-nearest neighbors, Naive Bayes, C4.5 and CN2 data mining classification algorithms. As a result, the classification accuracies on the six datasets are improved on average by 1.71% to 12.31%.
Abstract: Microbubbles driven by ultrasound have been used to increase the efficacy of targeted drug delivery, because the microstreaming induced by cavitating bubbles affects drug perfusion into the target cells and tissues. In order to clarify the physical effects of microstreaming on drug perfusion into tissues, a preliminary experimental study of perfusion enhancement by a stably oscillating microbubble was performed. Microstreaming was induced by a bubble oscillating at 15 kHz, and the perfusion of dye into an agar phantom was measured optically by histology of the phantom. The surface color intensity and the penetration length of dye in the agar phantom were increased by more than 70% and 30%, respectively, due to the microstreaming induced by the oscillating bubble. The mass of dye perfused into a tissue phantom over 30 s was increased by about 80% in the phantom with an oscillating bubble. This preliminary experiment shows that the physical effects of steady streaming by an oscillating bubble can enhance drug perfusion into tissues while minimizing biological effects.
Abstract: This paper focuses on the use of project work as a pretext for applying the conventions of writing, or the correctness of mechanics, usage, and sentence formation, in a content-based class in a Rajabhat University. Its aim was to explore to what extent the student teachers' academic achievement in the basic writing features met the 70% attainment target after the use of project work. The organization of work around an agreed theme, in which the students reproduce language provided by texts and instructors, is expected to enhance students' correct writing conventions. The sample of the study comprised 38 fourth-year English major students. The data were collected by means of an achievement test and student writing work. The scores in the summative achievement test were analyzed by mean score, standard deviation, and percentage. It was found that the student teachers achieved more in practicing mechanics and usage, and less in sentence formation. The students benefited from the exposure to texts while conducting the project; however, their automaticity in how and when to form phrases and clauses into simple and complex sentences had room for improvement.
Abstract: The modified Claus process is commonly used in oil
refining and gas processing to recover sulfur and destroy
contaminants formed in upstream processing. A Claus furnace feed
containing a relatively low concentration of H2S may be incapable of
producing a stable flame. Also, incomplete combustion of
hydrocarbons in the feed can lead to deterioration of the catalyst in
the reactors due to soot or carbon deposition. Therefore, special
consideration is necessary to achieve the appropriate overall sulfur
recovery. In this paper, some configurations available to treat lean
acid gas streams are described and the most appropriate ones are
studied to overcome low H2S concentration problems. As a result,
overall sulfur recovery is investigated for feed preheating and hot gas
configurations.
Abstract: This paper presents the optimal controller design of the generator control unit in an aircraft power system. The adaptive tabu search technique is applied to tune the controller parameters until the best terminal output voltage of the generator is achieved. The output response of the system with the controllers designed by the proposed technique is compared with that of the conventional method. Transient simulations using a commercial software package show that the controllers designed by the adaptive tabu search algorithm provide better output performance than those from the classical method. The proposed design technique is very flexible and useful for electrical aircraft engineers.
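The tuning loop described above can be sketched as a plain tabu search over the controller parameters; the adaptive variant additionally adjusts the search radius and back-tracks, which is omitted here. The neighbourhood size, step, and tabu-list length are illustrative assumptions, and in the real design `cost` would wrap a transient simulation scoring the terminal voltage response:

```python
import random

def tabu_search(cost, init, step=0.1, iters=200, tabu_len=10, seed=0):
    # Minimal tabu search: sample random neighbours of the current
    # point, forbid recently visited points (the tabu list), move to
    # the best admissible neighbour, and remember the best ever seen.
    rng = random.Random(seed)
    current = list(init)
    best, best_cost = list(current), cost(current)
    tabu = []
    for _ in range(iters):
        neighbours = [[p + rng.uniform(-step, step) for p in current]
                      for _ in range(8)]
        key = lambda n: tuple(round(v, 3) for v in n)
        neighbours = [n for n in neighbours if key(n) not in tabu]
        if not neighbours:
            continue
        current = min(neighbours, key=cost)  # may move uphill: escapes local minima
        tabu.append(key(current))
        if len(tabu) > tabu_len:
            tabu.pop(0)
        c = cost(current)
        if c < best_cost:
            best, best_cost = list(current), c
    return best, best_cost
```

Accepting the best neighbour even when it is worse than the current point, while the tabu list blocks immediate returns, is what lets the search climb out of local minima that defeat classical gradient-style tuning.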
Abstract: Electronic commerce is growing rapidly, with online sales already heading for hundreds of billions of dollars per year. Due to the huge amount of money transferred every day, an increased security level is required. In this work we present the architecture of an intelligent speaker verification system, which is able to accurately verify the registered users of an e-commerce service using only their voices as input. According to the proposed architecture, a transaction-based e-commerce application should be complemented by a biometric server where each customer's unique set of speech models (voiceprint) is stored. The verification procedure asks the user to pronounce a personalized sequence of digits; the speech is captured and the voice features are extracted at the client side, then sent to the biometric server. The biometric server uses pattern recognition to decide whether the received features match the stored voiceprint of the customer the user claims to be, and grants verification accordingly. The proposed architecture can provide e-commerce applications with a higher degree of certainty regarding the identity of a customer, and prevent impostors from executing fraudulent transactions.
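The server-side decision step can be sketched as below. This is a deliberately simplified stand-in: a real voiceprint is a set of statistical speech models rather than a single vector, and the similarity measure and threshold here are assumptions for illustration:

```python
import numpy as np

def cosine_score(features, voiceprint):
    # Score the claimed identity by cosine similarity between the
    # feature vector received from the client and the enrolled print.
    num = float(np.dot(features, voiceprint))
    den = float(np.linalg.norm(features) * np.linalg.norm(voiceprint))
    return num / den if den else 0.0

def verify(features, voiceprint, threshold=0.85):
    # Grant verification only when the match score clears a tuned
    # threshold; the threshold trades false accepts for false rejects.
    return cosine_score(features, voiceprint) >= threshold
```

The threshold is the system's security knob: raising it makes impostor acceptance rarer at the cost of occasionally rejecting the genuine customer.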
Abstract: While in practice negotiation is always a mix of cooperation and competition, these two elements correspond to different approaches to the relationship and also to different orientations in terms of the strategy, techniques, tactics and arguments employed by the negotiators, with related effects ultimately leading to different outcomes. The levels of honesty, trust and therefore cooperation are influenced not only by the uncertainty of the situation, the objectives, stakes or power, but also by the orientation given from the very beginning of the relationship. When negotiation is reduced to a confrontation of power, participants rely on coercive measures, using different kinds of threats, making false promises and bluffing in order to establish a more acceptable balance of power. Most negotiators have a tendency to complain about the unethical aspects of the tactics used by their counterparts while, at the same time, being mostly unaware of the sources of influence of their own vision and practices. In this article, our intention is to clarify these sources and to try to understand what can lead negotiators to unethical practices.
Abstract: The paper presents a multimodal approach to biometric authentication based on multiple classifiers. The proposed solution uses a post-classification biometric fusion method in which the outputs of the biometric data classifiers are combined in order to improve the overall biometric system performance by decreasing the classification error rates. The paper also shows that the biometric recognition task can be improved by careful feature selection, since not all of the feature vector components support the accuracy improvement.
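Post-classification fusion of the kind described above can be sketched as a weighted combination of the individual classifiers' match scores. The weighted-sum rule is one common choice assumed here for illustration, not necessarily the paper's combiner:

```python
def fuse_scores(scores, weights):
    # Post-classification (score-level) fusion: combine per-classifier
    # match scores into one decision score via a normalized weighted
    # sum. Weights would be tuned on validation data so the fused
    # error rate falls below that of any single classifier.
    assert len(scores) == len(weights) and sum(weights) > 0
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)
```

A final accept/reject decision then thresholds the fused score, exactly as it would a single classifier's output.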
Abstract: High-power lasers produce an intense burst of Bremsstrahlung radiation which has potential applications in broadband x-ray radiography. Since the radiation is produced through the interaction of accelerated electrons with the remaining laser target, these bursts are extremely short, in the region of a few ps. As a result, the laser-produced x-rays are capable of imaging complex dynamic objects with zero motion blur.
Abstract: This study discusses the effect of uncertainty on the production levels of a petrochemical complex. Uncertainty, or variation in some model parameters such as prices and the supply and demand of materials, can affect the optimality or the efficiency of any chemical process. For a petrochemical complex with many plants, there are many sources of uncertainty and frequent variations which require attention. Many optimization approaches have been proposed in the literature to incorporate uncertainty within the model in order to obtain a robust solution. In this work, a stability analysis approach is applied to a deterministic LP model of a petrochemical complex consisting of ten plants to investigate the effect of such variations on the obtained optimal production levels. The proposed approach can determine the allowable variation ranges of some parameters, mainly objective or RHS coefficients, before the system loses its optimality. Parameters with a relatively narrow range of variation, i.e. narrow stability limits, are classified as sensitive parameters or constraints that need accurate estimates or intensive monitoring. These stability limits offer easy-to-use information to the decision maker and help in understanding the interaction between model parameters and in deciding when the system needs to be re-optimized. The study shows that the maximum production of ethylene and the prices of intermediate products are the most sensitive factors affecting the stability of the optimum solution.
Abstract: This paper discusses a qualitative simulator, QRiOM, that uses the Qualitative Reasoning (QR) technique and a process-based ontology to model, simulate and explain the behaviour of selected organic reactions. Learning organic reactions requires the application of domain knowledge at an intuitive level, which is difficult to program using a traditional approach. The main objective of QRiOM is to help learners gain a better understanding of fundamental organic reaction concepts, and to improve their conceptual comprehension of the subject by analyzing the multiple forms of explanation generated by the software. This paper focuses on the generation of explanations based on causal theories to explicate various phenomena in chemistry. QRiOM has been tested on three classes of problems related to organic chemistry, with encouraging results. The paper also presents the results of a preliminary evaluation of QRiOM that reveal its explanation capability and usefulness.
Abstract: This paper explores the use of project work in content-based instruction in a Rajabhat University, a teacher college where student teachers are trained to perform teaching roles mainly at the basic education level. Its aim is to link theory to practice, and to help language teachers maximize the full potential of project work for genuine communication and give real meaning to the writing activity. Two research questions are formulated to guide this study: a) What is the academic achievement of the students' writing skill against the 70% attainment target after the use of project work to enhance the skill? and b) To what degree do the students' writing skills develop during the course of the project? The sample of the study comprised 38 fourth-year English major students. The data were collected by means of an achievement test, student writing work, and a project diary. The scores in the summative achievement test were analyzed by mean score, standard deviation, and t-test. The project diary serves as the students' record of the language acquired during the project. The list of structures and vocabulary noted in the diary shows the students' ability to attend to, recognize, and focus on meaningful patterns of language forms.
Abstract: Bionanotechnology deals with nanoscopic interactions between nanostructured materials and biological systems. Polymer nanocomposites with optimized biological activity have attracted great attention. Nanoclay is considered a reinforcing nanofiller in the manufacture of high-performance nanocomposites. In the current study, organomodified nanoclay with negatively charged silicate layers was incorporated into biomedical-grade silicone rubber. The nanoparticle loading was tailored to enhance cell behavior. The addition of nanoparticles led to improved mechanical properties of the substrate, with enhanced strength and stiffness, while no toxic effects were observed. Results indicated improved viability and proliferation of cells with the addition of nanofillers. The improved mechanical properties of the matrix result in a proper cell response through adjustment and arrangement of cytoskeletal fibers. The results can be applied in tissue engineering where enhanced substrates are required to improve cell behavior for in vivo applications.