Abstract: This paper surveys the optimization of routing in ad-hoc
networks and proposes a new method for reducing the complexity
of routing algorithms. By maintaining a binary matrix at each
node in the network and updating it once a route has been established,
nodes avoid repeating the routing protocol for every data transfer.
The suggested algorithm can reduce the complexity of routing to a
minimum.
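The caching idea can be sketched as follows; this is an illustrative reconstruction, not the paper's exact algorithm, and the class and method names (Node, route, _discover) are assumptions:

```python
# Hypothetical sketch of the route-caching idea: each node keeps a binary
# matrix whose entry [i][j] becomes 1 once a route from i to j is known,
# so later transfers skip the routing protocol entirely.

class Node:
    def __init__(self, node_id, n_nodes):
        self.node_id = node_id
        self.known = [[0] * n_nodes for _ in range(n_nodes)]  # binary route matrix
        self.routes = {}          # (src, dst) -> list of hops
        self.discovery_runs = 0   # how many times the protocol actually ran

    def _discover(self, src, dst):
        # Placeholder for a real discovery protocol (e.g. flooding, AODV).
        self.discovery_runs += 1
        return [src, dst]

    def route(self, src, dst):
        if self.known[src][dst]:
            return self.routes[(src, dst)]  # cache hit: no protocol rerun
        path = self._discover(src, dst)
        self.known[src][dst] = 1
        self.routes[(src, dst)] = path
        return path
```

With this structure, the discovery cost is paid once per (source, destination) pair rather than once per data transfer.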
Abstract: In the present work, an avionics system for flight data
collection on a Raptor 30 V2 is developed. Both on-ground and onboard avionics systems are built for data acquisition during testing of a small-scale Unmanned Aerial Vehicle
(UAV) helicopter. The onboard avionics record the helicopter state
outputs, namely accelerations, angular rates and Euler angles, in real time, while the on-ground avionics system records the inputs given to
the radio-controlled helicopter through a transmitter, also in real time. The avionics systems are designed and developed taking into consideration
low weight, small size, vibration resistance, low power consumption, and easy interfacing. To mitigate the medium-frequency vibrations
imparted to the UAV helicopter during flight, a damper is designed
and its performance is evaluated. A number of flight tests are carried
out, the data obtained are analyzed for accuracy and repeatability, and conclusions are drawn.
Abstract: This paper provides a flexible way of controlling the
Variable Bit Rate (VBR) of compressed digital video, applicable to
the new H.264 video compression standard. The entire video
sequence is assessed in advance and the quantisation level is then set
such that the bit rate (and thus the frame rate) remains within
predetermined limits compatible with the bandwidth of the
transmission system and the capabilities of the remote end, while at
the same time providing constant quality similar to VBR encoding.
A process for avoiding buffer starvation by selectively eliminating
frames from the encoded output at times when the frame rate is slow
(large number of bits per frame) is also described. Finally, the
problem of buffer overflow is solved by selectively eliminating
frames from the input received by the decoder. The decoder detects
the omission of frames and resynchronizes the transmission by
monitoring time stamps and repeating frames if necessary.
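The frame-elimination step can be illustrated with a simple leaky-bucket model; this is a sketch under assumed semantics, not the paper's exact mechanism, and the function name and parameters are illustrative:

```python
def select_frames(frame_bits, channel_bits_per_frame, buffer_size):
    """Leaky-bucket sketch: transmit a frame only if the decoder buffer
    (refilled at a constant channel rate) can supply its bits; otherwise
    drop it so the buffer never starves. Returns kept frame indices."""
    buffer_level = buffer_size  # start with a full decoder buffer
    kept = []
    for i, bits in enumerate(frame_bits):
        if bits <= buffer_level:
            kept.append(i)
            buffer_level -= bits
        # else: frame dropped; the decoder repeats the previous frame
        buffer_level = min(buffer_size, buffer_level + channel_bits_per_frame)
    return kept
```

For example, with a 1000-bit buffer refilled at 200 bits per frame interval, a run of three 600-bit frames forces the third to be dropped, exactly the situation where the decoder would detect the gap via time stamps and repeat a frame.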
Abstract: In this paper, biodiesel production from fish oil
catalyzed by a naturally immobilized lipase, Carica papaya lipase,
was studied. Refined fish oil, extracted from the discarded parts of
fish, was used as the starting material for biodiesel production. The
effects of the oil:methanol molar ratio, lipase dosage, initial water
activity of the lipase, temperature and solvent were investigated. It was
found that Carica papaya lipase was suitable for the methanolysis of fish
oil to produce methyl ester. The maximum yield of methyl ester
reached up to 83% under the optimal reaction conditions: an
oil:methanol molar ratio of 1:4, 20% (based on oil) of lipase, initial
water activity of the lipase at 0.23 and 20% (based on oil) of tert-butanol
at 40 °C after 18 h of reaction time. There was negligible loss in
lipase activity even after repeated use for 30 cycles.
Abstract: This paper proposes a re-modification of the minimum
moment approach to resource leveling, itself a modification of the traditional method by
Harris. The method is based on the critical path method. The new approach differs from the earlier methods in the
criterion used to select the activity to be shifted when leveling the resource histogram. In the traditional method, an improvement factor
is computed first to select the activity for each possible day of shifting. In
the modified method, the maximum value of the product of resource rate
and free float is found first, and the improvement factor is then
calculated for the activity to be shifted. In the proposed
method, the activity selected first for shifting is the one with the largest resource rate. The process is repeated for all the
remaining activities with possible shifts to obtain the updated histogram.
The proposed method significantly reduces the number of iterations
and is easier for manual computation.
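The proposed selection rule can be sketched in code; this is a minimal illustration under assumed data structures (each activity as a (start, duration, rate, free_float) tuple), not the paper's full procedure:

```python
def histogram(activities, horizon):
    """Daily resource totals; each activity is (start, duration, rate, free_float)."""
    h = [0] * horizon
    for start, dur, rate, _ in activities:
        for day in range(start, start + dur):
            h[day] += rate
    return h

def moment(h):
    """Minimum-moment criterion: sum of squared daily resource totals."""
    return sum(v * v for v in h)

def level(activities, horizon):
    """Pick activities largest resource rate first, then shift each within
    its free float to the position minimizing the histogram moment."""
    acts = sorted(activities, key=lambda a: -a[2])  # largest rate first
    for i, (start, dur, rate, ff) in enumerate(acts):
        best_s, best_m = start, None
        for s in range(start, start + ff + 1):  # shift within free float
            acts[i] = (s, dur, rate, ff)
            m = moment(histogram(acts, horizon))
            if best_m is None or m < best_m:
                best_s, best_m = s, m
        acts[i] = (best_s, dur, rate, ff)
    return acts
```

Because the selection order is fixed by the resource rate, each activity's improvement factor is evaluated only over its own free-float window, which is where the reduction in iterations comes from.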
Abstract: This study examined the effects of neuromuscular
training (NT) on limits of stability (LOS) in females.
Twenty female amateur basketball players were assigned to an NT
experimental group or a control group on a volunteer basis. All players
underwent regular basketball practice, 90 minutes, 3 times per week
for 6 weeks, while the NT experimental group underwent extra NT with
plyometric and core training, 50 minutes, 3 times per week for 6 weeks
during this period. Limits of stability (LOS) were evaluated with the
Biodex Balance System. A one-factor ANCOVA was used to examine
the differences between groups after training. The significance level
was set at p
Abstract: In this paper, a neural tree (NT) classifier with a
simple perceptron at each node is considered. A new concept for
building a balanced tree is applied in the learning algorithm of the
tree. At each node, if the perceptron classification is inaccurate and
unbalanced, the perceptron is replaced by a new one. This separates
the training set in such a way that an almost equal number of patterns
falls into each class. Moreover, each perceptron is trained only
on the classes present at its node, ignoring the other
classes. Splitting nodes are introduced into the neural tree architecture
to divide the training set when the current perceptron node repeats
the classification of its parent node. A new error function based
on the depth of the tree is introduced to reduce the computational
time for training a perceptron. Experiments are performed to
check the efficiency, and encouraging results are obtained in terms of
accuracy and computational cost.
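The per-node mechanics can be illustrated with a simplified sketch; the training rule, split test and balance threshold below are assumptions for illustration, not the paper's exact formulation:

```python
# Illustrative sketch: a perceptron node splits the training set into two
# children; if the split is too unbalanced, the node would be replaced by
# a retrained perceptron (the balance check is shown here).

def train_perceptron(X, y, epochs=50, lr=0.1):
    """Classic perceptron rule on labels in {-1, +1}."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = 1 if sum(wj * xj for wj, xj in zip(w, xi)) + b > 0 else -1
            if pred != yi:
                w = [wj + lr * yi * xj for wj, xj in zip(w, xi)]
                b += lr * yi
    return w, b

def split(X, w, b):
    """Route each pattern to the left or right child by the perceptron sign."""
    left = [x for x in X if sum(wj * xj for wj, xj in zip(w, x)) + b <= 0]
    right = [x for x in X if sum(wj * xj for wj, xj in zip(w, x)) + b > 0]
    return left, right

def is_balanced(left, right, tol=0.25):
    """Assumed balance criterion: neither child holds < tol of the patterns."""
    n = len(left) + len(right)
    return min(len(left), len(right)) >= tol * n
```

A node whose split fails is_balanced would trigger the replacement step described above, keeping the tree close to balanced depth.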
Abstract: In this work, I present a review of Sparse Distributed
Memory for Small Cues (SDMSCue), a variant of Sparse Distributed
Memory (SDM) that is capable of handling small cues. I then conduct
and report cognitive experiments on SDMSCue to test its
cognitive soundness compared to SDM. Small cues are input
cues that are presented to memory for reading associations but have
many missing parts or fields. The original SDM fails on
such cues; SDMSCue overcomes this
pitfall. The main idea in SDMSCue is the repeated projection of the
semantic space onto smaller subspaces that are selected based on the
input cue's length and pattern. This process allows Read/Write
operations using an input cue that is missing a large portion.
SDMSCue is augmented with genetic algorithms for
memory allocation and initialization. I claim that SDM functionality
is a subset of SDMSCue functionality.
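The subspace-projection idea can be sketched as follows; the representation (missing fields as None, Hamming distance matching) is a simplifying assumption, far removed from a full SDM implementation:

```python
# Minimal sketch: a small cue with missing fields (None) is matched against
# stored address/data pairs using only its known positions, i.e. the space
# is projected onto the subspace selected by the cue's length and pattern.

def project_match(memory, cue, threshold):
    """Return data items whose addresses match the cue on its known
    positions within the given Hamming-distance threshold."""
    known = [i for i, v in enumerate(cue) if v is not None]
    hits = []
    for addr, data in memory:
        dist = sum(1 for i in known if addr[i] != cue[i])  # subspace Hamming
        if dist <= threshold:
            hits.append(data)
    return hits
```

A cue missing half its fields still selects the right association, which is precisely the failure mode of plain SDM that SDMSCue addresses.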
Abstract: Electronics products that achieve high levels of integrated communications, computing and entertainment, with multimedia features in small, stylish and robust new form factors, are winning in the marketplace. Because of the high costs an industry may incur, and because high yield is directly proportional to high profit, IC (Integrated Circuit) manufacturers struggle to maximize yield; yet today's customers demand miniaturization, low cost, high performance and excellent reliability, making yield maximization a never-ending pursuit of an enhanced assembly process. With factors such as minimum tolerances and tighter parameter variations, a systematic approach is needed to predict the assembly process. To evaluate the quality of upcoming circuits, yield models are used that not only predict manufacturing costs but also provide vital information to ease the correction process when yields fall below expectations. For an IC manufacturer to obtain higher assembly yields, all factors such as boards, placement, components, the materials the components are made of, and processes must be taken into consideration. Effective placement yield depends heavily on machine accuracy and on the vision system, which must be able to recognize the features on the board and the component in order to place the device accurately on the pads and bumps of the PCB. There are currently two methods for accurate positioning: using the edge of the package, and using solder ball locations, also called footprints. The only assumption a yield model makes is that all boards and devices are completely functional. This paper focuses on the Monte Carlo method, a class of computational algorithms that depends on repeated random sampling to compute results. This method is used to recreate, in simulation, the placement and assembly processes within a production line.
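A Monte Carlo placement-yield estimate can be sketched in a few lines; the Gaussian error model and all parameter values below are assumptions for illustration, not the paper's data:

```python
import random

def placement_yield(n_trials, sigma, tolerance, seed=0):
    """Monte Carlo sketch: draw random placement offsets from a normal
    distribution modelling machine accuracy, and estimate yield as the
    fraction of placements landing within the pad tolerance."""
    rng = random.Random(seed)  # seeded for reproducible estimates
    ok = 0
    for _ in range(n_trials):
        dx = rng.gauss(0.0, sigma)
        dy = rng.gauss(0.0, sigma)
        if (dx * dx + dy * dy) ** 0.5 <= tolerance:
            ok += 1
    return ok / n_trials
```

Repeated random sampling of this kind is what lets the model predict how yield responds to tighter tolerances or less accurate placement before any boards are built.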
Abstract: The present paper concerns the influence of fiber
packing on the transverse plastic properties of metal matrix
composites. A micromechanical modeling procedure is used to
predict the effective mechanical properties of composite materials at
large tensile and compressive deformations. The microstructure is
represented by a repeating unit cell (RUC). Two fiber arrays are
considered: ideal square fiber packing and random fiber
packing defined by a random sequential algorithm. The
micromechanical modeling procedure is implemented for a
graphite/aluminum metal matrix composite in which the
reinforcement behaves as an elastic, isotropic solid and the matrix is
modeled as an isotropic elastic-plastic solid following the von Mises
criterion with isotropic hardening and the Ramberg-Osgood
relationship between equivalent true stress and logarithmic strain.
The deformation is increased to a considerable value to evaluate both
the elastic and plastic behavior of metal matrix composites. The yield
strength and true elastic-plastic stress are determined for
graphite/aluminum composites.
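For reference, one common parameterization of the Ramberg-Osgood relation between strain and stress is shown below; the paper does not state its exact form, so the symbols here (reference stress sigma_0, yield offset alpha, hardening exponent n) are an assumption:

```latex
\varepsilon \;=\; \frac{\sigma}{E} \;+\; \alpha\,\frac{\sigma}{E}\left(\frac{\sigma}{\sigma_0}\right)^{n-1}
```

The first term is the elastic strain and the second the plastic contribution, which dominates once the equivalent true stress exceeds the reference stress.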
Abstract: The objective of this study is to evaluate the threshold
stress of a clay-with-sand subgrade soil. Threshold stress can be
defined as the stress level above which cyclic loading leads to
excessive deformation and eventual failure. Determining the thickness
of highway formations using the threshold stress
approach gives a more realistic assessment of the soil behaviour because
the soil is subjected to repeated loading from moving vehicles. Threshold
stress can be evaluated by the plastic strain criterion, which is based on
the accumulated plastic strain behaviour during cyclic loading [1].
Several all-round pressure conditions of the subgrade soil, namely
zero confinement, low all-round pressure and high all-round pressure,
are investigated, and the threshold stresses for the various soil conditions are
determined. The threshold stresses of the soil are 60%, 31% and 38.6% for the
unconfined partially saturated sample, the low-effective-stress saturated
sample and the high-effective-stress saturated sample, respectively.
Abstract: Rice husk is a lignocellulosic source that can be
converted to ethanol. Three hundred grams of rice husk was mixed
with 1 L of 0.18 N sulfuric acid solution and then heated in an
autoclave. The reaction was intended to run at constant temperature
(isothermally), but the reaction had already begun before that
temperature was reached. The first liquid sample was taken at a temperature of 140 °C
and sampling was repeated at 5-minute intervals, so the data obtained cover both the
non-isothermal and isothermal regions. It was observed that sugar
degradation has significant effects on the ethanol production. The
kinetic constants can be expressed by the Arrhenius equation, with
frequency factors for hydrolysis and sugar degradation of 1.58 × 10^5
min^-1 and 2.29 × 10^8 L/(mol·min), respectively, while the activation
energies are 64,350 J/mol and 76,571 J/mol. The highest ethanol
concentration from fermentation is 1.13% v/v, attained at 220 °C.
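Using the reported Arrhenius parameters, the hydrolysis rate constant at the first sampling temperature can be computed directly; the function below is a straightforward evaluation of k = A·exp(-Ea/RT), not code from the paper:

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)

def arrhenius(A, Ea, T_celsius):
    """Arrhenius rate constant k = A * exp(-Ea / (R * T))."""
    T = T_celsius + 273.15  # convert to kelvin
    return A * math.exp(-Ea / (R * T))

# Hydrolysis at 140 °C with the reported A = 1.58e5 min^-1, Ea = 64,350 J/mol
k_hydrolysis = arrhenius(1.58e5, 64350, 140)  # units: min^-1
```

This gives a rate constant on the order of 10^-3 min^-1 at 140 °C, consistent with hydrolysis proceeding on the reported minutes-to-hours timescale.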
Abstract: This paper presents a heuristic approach to solving the Generalized Assignment Problem (GAP), which is NP-hard. Many researchers have developed algorithms for identifying redundant constraints and variables in linear programming models; some of these algorithms use the intercept matrix of the constraints to identify redundant constraints and variables before the solution process starts. Here, a new heuristic based on the dominance property of the intercept matrix is proposed to find optimal or near-optimal solutions of the GAP. In this heuristic, redundant variables of the GAP are identified by applying the dominance property of the intercept matrix repeatedly. The heuristic is tested on 90 benchmark problems of sizes up to 4000, taken from the OR-Library, and the results are compared with optimum solutions. The computational complexity of solving GAP with this approach is proved to be O(mn²). The performance of the heuristic is also compared with the best state-of-the-art heuristic algorithms with respect to solution quality. The encouraging results, especially for relatively large test problems, indicate that this heuristic approach can successfully be used to find good solutions for highly constrained NP-hard problems.
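For readers unfamiliar with GAP, a baseline greedy assignment makes the problem structure concrete; note this is NOT the paper's intercept-matrix heuristic, just a minimal illustration of the cost/resource/capacity setting it operates in:

```python
def greedy_gap(cost, resource, capacity):
    """Greedy GAP baseline: assign each job to the cheapest agent that
    still has enough capacity. cost[i][j] and resource[i][j] give the
    cost and resource use of job j on agent i."""
    m, n = len(cost), len(cost[0])
    remaining = list(capacity)
    assignment = [None] * n
    total = 0
    for j in range(n):
        best = None
        for i in range(m):
            if resource[i][j] <= remaining[i]:
                if best is None or cost[i][j] < cost[best][j]:
                    best = i
        if best is None:
            return None  # this greedy order ran out of capacity
        assignment[j] = best
        remaining[best] -= resource[best][j]
        total += cost[best][j]
    return assignment, total
```

The paper's heuristic improves on this kind of baseline by repeatedly pruning dominated (redundant) variables via the intercept matrix before committing to assignments.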
Abstract: The standard investigational method for diagnosing
obstructive sleep apnea syndrome (OSAS) is polysomnography (PSG),
which consists of a simultaneous, usually overnight, recording of
multiple electro-physiological signals related to sleep and
wakefulness. This is an expensive, encumbering and not readily
repeated protocol, so there is a need for simpler and easily
implemented screening and detection techniques. Identification of
apnea/hypopnea events in the screening recordings is the key factor
in the diagnosis of OSAS. The analysis of a single-lead
electrocardiographic (ECG) signal for OSAS diagnosis, which may
be done with portable devices at the patient's home, has been the
challenge of recent years. A novel artificial neural network (ANN) based
approach for feature extraction and automatic identification of
respiratory events in ECG signals is presented in this paper. A
nonlinear principal component analysis (NLPCA) method was
considered for feature extraction and a support vector machine for
classification/recognition. An alternative representation of the
respiratory events by means of a Kohonen-type neural network is
discussed. Our prospective study was based on OSAS patients of the
Clinical Hospital of Pneumology from Iaşi, Romania, males and
females, as well as on non-OSAS human subjects. Our
computational analysis includes a learning phase based on cross-signal
PSG annotation.
Abstract: A parallel block method based on Backward
Differentiation Formulas (BDF) is developed for the parallel solution
of stiff Ordinary Differential Equations (ODEs). Most common
methods for solving stiff systems of ODEs are based on implicit
formulae and are solved using Newton iteration, which requires the
repeated solution of systems of linear equations with coefficient
matrix I − hβJ, where J is the Jacobian matrix of the problem. In
this paper, the matrix operations are parallelized in order to reduce
the cost of the iterations. Numerical results are given comparing the
speedup and efficiency of the parallel algorithm with those of the
sequential algorithm.
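The Newton iteration inside an implicit step can be sketched for the simplest BDF member; the scalar backward-Euler version below (β = 1) only illustrates where the I − hβJ system arises, and is not the paper's parallel block method:

```python
def bdf1_newton_step(f, dfdy, y_prev, h, tol=1e-12, max_iter=50):
    """One backward-Euler (BDF1) step via Newton iteration on the scalar
    residual g(y) = y - y_prev - h*f(y); each iteration 'solves' the
    scalar analogue of the linear system with matrix I - h*beta*J."""
    y = y_prev  # initial guess: previous solution value
    for _ in range(max_iter):
        g = y - y_prev - h * f(y)       # implicit-formula residual
        gprime = 1.0 - h * dfdy(y)      # scalar version of I - h*beta*J
        dy = -g / gprime
        y += dy
        if abs(dy) < tol:
            break
    return y
```

In the system case each Newton iteration requires a full linear solve with I − hβJ, which is exactly the matrix operation the paper parallelizes.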
Abstract: The sensitivity of orifice plate metering to disturbed
flow (either asymmetric or swirling) is a subject of great concern to
flow meter users and manufacturers. The distortions caused by pipe
fittings and pipe installations upstream of the orifice plate are major
sources of this type of non-standard flows. These distortions can alter
the accuracy of metering to an unacceptable degree. In this work, a
multi-scale object known as metal foam has been used to generate a
predetermined turbulent flow upstream of the orifice plate. The
experimental results showed that the combination of an orifice plate
and metal foam flow conditioner is broadly insensitive to upstream
disturbances. This metal foam demonstrated a good performance in
terms of removing swirl and producing a repeatable flow profile
within a short distance downstream of the device. The results of
using a combination of a metal foam flow conditioner and orifice
plate under non-standard flow conditions, including swirling and
asymmetric flow, show that this package can preserve metering
accuracy to the level required by the standards.
Abstract: We analyze hand dexterity in Parkinson's disease (PD) patients and control subjects using a natural manual transport task (moving an object from one place to another). Eight PD patients and ten control subjects performed the task repeatedly at maximum speed, both in OFF and ON medication states. The movement parameters and the grip and load forces were recorded by a single optoelectronic camera and by force transducers built into a specially designed object. Using the force and velocity signals, ten successive phases of the transport movement were defined and their durations measured. An outline of 3D optical measurement is presented as a way to obtain a more precise movement trajectory.
Abstract: This paper addresses one important aspect of
combustion system analysis: spray evaporation and
dispersion modeling. In this study we consider an empty
cylinder, which serves as a simulator for a ramjet engine and is
studied under cold flow. Four nozzles, located at the entrance of
the cylinder, perform the injection. The air flow enters the cylinder
from one side while injection takes place. By varying the injection
velocity and the entrance air flow velocity, we study
droplet sizing and the efficient mass fraction of fuel vapor near
and at the exit area; we define the mass of fuel vapor within
the flammability limits as the efficient mass fraction. We then
decrease the initial temperature of the fuel droplets and repeat
the investigation. The calculations were performed with a modified
version of KIVA-3V.
Abstract: An academic research information service is a must for surveying previous studies in the research and development process. OntoFrame is an academic research information service built on a Semantic Web framework, unlike simple keyword-based services such as CiteSeer and Google Scholar. The first purpose of this study is to reveal user behavior in literature surveys, the objects for which academic research information services are used, and user needs. The second is to apply the lessons learned from the results to OntoFrame.
Abstract: In this paper, a hardware implementation of the
RSA public-key cryptographic algorithm is presented. The RSA
cryptographic algorithm depends on the computation of repeated
modular exponentiations.
The Montgomery algorithm is used and modified to reduce
hardware resources and to achieve a reasonable operating speed on an
FPGA. An efficient architecture for modular multiplication based on
the array multiplier is proposed. We have implemented an RSA
cryptosystem based on the Montgomery algorithm. As a result, it is
shown that the proposed architecture achieves small area and
reasonable speed.
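The arithmetic the hardware implements can be illustrated in software; this Python sketch of Montgomery multiplication (REDC) and square-and-multiply exponentiation only shows the algorithm, not the paper's array-multiplier architecture:

```python
def montgomery_setup(n, r_bits):
    """Precompute n' with n * n' == -1 (mod r) for odd modulus n, r = 2^r_bits."""
    r = 1 << r_bits
    n_inv = pow(-n, -1, r)  # modular inverse (Python 3.8+)
    return n_inv

def mont_mul(a, b, n, r_bits, n_inv):
    """Montgomery product a*b*R^-1 mod n via REDC: one multiply, one shift,
    no trial division -- the reason it maps well to hardware."""
    r = 1 << r_bits
    t = a * b
    m = (t * n_inv) % r
    u = (t + m * n) >> r_bits   # exactly divisible by r by construction
    return u - n if u >= n else u

def mont_exp(base, exp, n, r_bits):
    """Left-to-right square-and-multiply entirely in the Montgomery domain."""
    n_inv = montgomery_setup(n, r_bits)
    base_m = (base << r_bits) % n     # map base into Montgomery domain
    result = (1 << r_bits) % n        # 1 in the Montgomery domain
    for bit in bin(exp)[2:]:
        result = mont_mul(result, result, n, r_bits, n_inv)
        if bit == '1':
            result = mont_mul(result, base_m, n, r_bits, n_inv)
    return mont_mul(result, 1, n, r_bits, n_inv)  # map back out
```

Because each mont_mul replaces a modular reduction with a shift, the repeated squarings and multiplications of RSA exponentiation become a chain of multiply-accumulate steps, which is what the proposed array-multiplier architecture exploits.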