Abstract: It is known that the heart interacts with and adapts to
its venous and arterial loading conditions. Various experimental
studies and modeling approaches have been developed to investigate
the underlying mechanisms. This paper presents a model of the left
ventricle derived from nonlinear stress-length myocardial
characteristics integrated over a truncated ellipsoidal geometry,
together with a second-order dynamic mechanism for the
excitation-contraction coupling system. The results presented here describe the
effects of the viscoelastic damping element of the electromechanical
coupling system on the hemodynamic response. Different heart rates
are considered to study the pacing effects on the performance of the
left ventricle against constant preload and afterload conditions under
various damping conditions. The results indicate that the pacing
process of the left ventricle has to take into account, among other
things, the viscoelastic damping conditions of the myofilament
excitation-contraction process.
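The damping effect discussed in this abstract can be illustrated with a minimal second-order sketch (not the paper's actual model): a damped stage m*a'' + c*a' + k*a = s(t) driven by a short excitation pulse, where c plays the role of the viscoelastic damping coefficient. All parameter values below are hypothetical and chosen only to show the qualitative effect.

```python
# Illustrative second-order excitation-contraction stage; hypothetical
# parameters, simple explicit Euler integration.

def simulate_activation(c, m=1.0, k=25.0, t_end=2.0, dt=1e-4):
    """Peak activation of m*a'' + c*a' + k*a = s(t) for a 0.3 s pulse."""
    a, v = 0.0, 0.0          # activation and its rate of change
    peak = 0.0
    for i in range(int(t_end / dt)):
        t = i * dt
        s = 1.0 if t < 0.3 else 0.0       # square excitation pulse
        acc = (s - c * v - k * a) / m
        v += acc * dt
        a += v * dt
        peak = max(peak, a)
    return peak

peak_light = simulate_activation(c=1.0)    # lightly damped coupling
peak_heavy = simulate_activation(c=20.0)   # heavily damped coupling
```

Heavier viscoelastic damping attenuates the transmitted activation peak, which is the qualitative mechanism behind the hemodynamic differences the abstract reports.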
Abstract: Building intelligent traffic guidance systems has recently
attracted considerable interest. A good system should be able to
observe all important visual information in order to analyze the
context of the scene. To do so, signs in general, and traffic signs in
particular, are usually taken into account, as they carry rich
information for these systems. Many researchers have therefore devoted
effort to the field of sign recognition. Sign localization, or sign
detection, is the most important step in the sign recognition process:
it filters out non-informative areas of the scene and locates
candidate regions for the later steps. In this paper, we apply a new
approach to detecting sign locations using a new color invariant
model. Experiments are carried out on different datasets introduced in
other works, whose authors noted the difficulty of detecting signs
under unfavorable imaging conditions. Our method is simple and fast,
and, most importantly, it achieves a high detection rate in locating
signs.
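As a rough illustration of color-based sign candidate detection (the abstract does not specify the paper's color invariant model), the sketch below classifies red-sign candidate pixels in normalized rg chromaticity space, a common illumination-invariant representation; the thresholds are illustrative assumptions.

```python
# Normalized rg chromaticity is invariant to uniform intensity scaling,
# which is one simple way to detect sign colors under varying lighting.
# Thresholds below are hypothetical, not the paper's model.

def is_red_sign_pixel(rgb, r_min=0.5, g_max=0.3):
    """Classify a pixel as a red-sign candidate in normalized rgb space."""
    R, G, B = rgb
    s = R + G + B
    if s == 0:
        return False
    r, g = R / s, G / s    # chromaticity: intensity cancels out
    return r >= r_min and g <= g_max

# The same chromaticity under bright and dark illumination gives the
# same decision, illustrating the illumination invariance:
bright = (200, 40, 40)
dark = (50, 10, 10)
```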
Abstract: Machining through turning was carried out on a lathe
to study the chip formation of multiphase ferrite
(F-B-M) microalloyed steel. A Taguchi orthogonal array was employed
to plan the machining runs. Continuous and discontinuous chips were
formed for different cutting parameters such as speed, feed and depth
of cut. Optical and scanning electron microscopy were employed to
identify the chip morphology.
Abstract: This research aims to examine the key success factors
for the diffusion of mobile entertainment services in Malaysia. The
drivers and barriers observed in this research include perceived
benefit; concerns pertaining to pricing, product and technological
standardization, privacy and security; as well as influences from
peers and community. An analysis of a Malaysian survey of 384
respondents aged 18 to 25 shows that subscribers placed
greater importance on perceived benefit of mobile entertainment
services compared to other factors. Results of the survey also show
that there are strong positive correlations between all the factors,
with pricing issue–perceived benefit showing the strongest
relationship. This paper aims to provide an extensive study on the
drivers and barriers that could be used to derive an architecture for
entertainment service provision to serve as a guide for telcos to
outline suitable approaches in order to encourage mass market
adoption of mobile entertainment services in Malaysia.
Abstract: One important objective in Precision Agriculture is to minimize the volume of herbicides applied to the fields through the use of site-specific weed management systems. In order to reach this goal, two major factors need to be considered: 1) the similar spectral signature, shape and texture of weeds and crops; 2) the irregular distribution of the weeds within the crop field. This paper outlines an automatic computer vision system for the detection and differential spraying of Avena sterilis, a noxious weed growing in cereal crops. The proposed system involves two processes: image segmentation and decision making. Image segmentation combines basic suitable image processing techniques in order to extract cells from the image as the low-level units. Each cell is described by two area-based attributes measuring the relations between the crops and the weeds. From these attributes, a hybrid decision making approach determines whether a cell should be sprayed or not. The hybrid approach uses the Support Vector Machines and Fuzzy k-Means methods, combined through fuzzy aggregation theory. This constitutes the main contribution of this paper. The method's performance is compared against other available strategies.
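A minimal sketch of the hybrid decision step described above: each cell carries a "spray" membership degree from an SVM-style classifier and one from fuzzy k-means, and the two are combined by a fuzzy aggregation operator. The compensatory operator and the threshold below are illustrative assumptions, not necessarily the paper's exact choices.

```python
# Zimmermann-style compensatory aggregation: a value between the
# t-norm (intersection) and the t-conorm (union) of the memberships.

def fuzzy_aggregate(mu_svm, mu_fkm, gamma=0.5):
    """Combine two membership degrees; gamma trades off AND vs OR."""
    t_norm = mu_svm * mu_fkm                        # fuzzy intersection
    t_conorm = mu_svm + mu_fkm - mu_svm * mu_fkm    # fuzzy union
    return (t_norm ** (1 - gamma)) * (t_conorm ** gamma)

def spray_decision(mu_svm, mu_fkm, threshold=0.5):
    """Spray the cell when the aggregated membership is high enough."""
    return fuzzy_aggregate(mu_svm, mu_fkm) >= threshold
```

When both classifiers agree that a cell is weed-infested the aggregated degree stays high, while disagreement pulls it toward the middle, which is the compensatory behavior fuzzy aggregation provides.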
Abstract: With increasing circuit complexity and the demand for
portable devices, power consumption is one of the most important
design parameters today. Full adders are a basic building block of
many circuits, so reducing power consumption in full adders
is very important in low-power design. One of the most
power-consuming modules in a full adder is the XOR/XNOR circuit. This paper
presents two new full adders based on two new logic approaches. The
proposed logic approaches use one XOR or XNOR gate to implement
a full adder cell, thereby decreasing delay and power. Using these
two approaches, with XOR and XNOR gates, two new full
adders are implemented in this paper. Simulations are carried
out by HSPICE in 0.18μm bulk technology with 1.8V supply voltage.
The results show that the proposed ten-transistor full adder consumes
12% less power and is 5% faster than the MB12T full adder. The
proposed 9T design is more area-efficient and is 24% better than a
similar 10T full adder in terms of power consumption. The main
drawback of the proposed circuits is the output threshold loss
problem.
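The logic the proposed cells implement at transistor level can be stated behaviorally: the sum is the XOR of the three inputs, and the carry-out reuses the intermediate XOR (propagate) result, which is why a single XOR or XNOR gate is central to the cell.

```python
# Behavioral model of a full adder built around one XOR stage.

def full_adder(a, b, cin):
    p = a ^ b                     # propagate signal from the XOR stage
    s = p ^ cin                   # sum output
    cout = (a & b) | (p & cin)    # carry-out reusing the XOR result
    return s, cout

# Exhaustive truth table of the cell:
table = [(a, b, c, *full_adder(a, b, c))
         for a in (0, 1) for b in (0, 1) for c in (0, 1)]
```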
Abstract: School brawls have claimed the lives of students in
Jakarta. Past studies of school brawls have investigated their causes
using group-level approaches, such as the cognitive dissonance created
by provocation and resentment among students at the schools. This
research focuses on individual factors as causes of school brawls,
namely the characteristics of children with ADHD, lack of
self-regulation control, and level of depression. The results show
that these individual factors have, in fact, little influence on the
development of conduct disorder: the students have good
self-regulation control, insignificant ADHD characteristics, and
moderate levels of depression. It is concluded that group factors are
more significant than individual factors in causing school brawls.
Abstract: This paper presents the simulation of a fragmentation
warhead using the hydrocode Autodyn. The goal of this research is to
determine the lethal range of such a warhead. This study investigates
the lethal range of warheads with and without steel balls as
preformed fragments. The results from the FE simulation, i.e. initial
velocities and ejected spray angles of fragments, are further processed
using an analytical approach so as to determine a fragment hit density
and probability of kill of a modelled warhead. Simulating the large
number of preformed fragments inside a warhead requires expensive
computational resources. Therefore, this study models the problem
with an alternative approach, considering a mass of preformed
fragments equivalent to the mass of the warhead casing. This approach
yields approximately 7% and 20% differences in fragment velocities
relative to the analytical results for one and two
layers of preformed fragments, respectively. The lethal ranges of the
simulated warheads are 42.6 m and 56.5 m for warheads with one and
two layers of preformed fragments, respectively, compared to 13.85
m for a warhead without preformed fragment. These lethal ranges are
based on the requirement of fragment hit density. The lethal ranges
which are based on the probability of kill are 27.5 m, 61 m and 70 m
for warheads with no preformed fragment, one and two layers of
preformed fragments, respectively.
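The hit-density criterion mentioned above can be sketched with a simplified uniform-spray assumption (the paper uses the actual ejected spray angles from the FE simulation): N fragments spread over a sphere give an areal density of N / (4*pi*R^2), and the lethal range is where that density falls to the required value. The fragment count and required density below are hypothetical.

```python
import math

# Simplified hit-density model under a uniform spherical spray
# assumption; numbers are illustrative, not the paper's data.

def hit_density(n_fragments, r):
    """Fragments per unit area at range r (uniform spherical spray)."""
    return n_fragments / (4.0 * math.pi * r ** 2)

def lethal_range(n_fragments, required_density):
    """Range at which the hit density drops to the required density."""
    return math.sqrt(n_fragments / (4.0 * math.pi * required_density))

r = lethal_range(5000, required_density=1.0)   # e.g. 1 hit per m^2
```

Under this model the lethal range grows with the square root of the fragment count, which is consistent with the abstract's observation that adding preformed fragment layers extends the lethal range.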
Abstract: The mixed oxide (MOX) nuclear fuel of U and Pu contains several percent of fission products and minor actinides, such as neptunium, americium and curium. It is important to determine accurately the decay heat from curium isotopes, as they contribute significantly to the decay heat of MOX fuel. This heat generation can cause samples to melt very quickly if excessive quantities of curium are present. In the present paper, we introduce a new approach that can predict the decay heat from curium isotopes. This work is part of a project funded by King Abdulaziz City for Science and Technology (KACST) under the Long-Term Comprehensive National Plan for Science, Technology and Innovation, and takes place at King Abdulaziz University (KAU), Saudi Arabia. The approach is based on the numerical solution of coupled linear differential equations that describe the decays and buildups of many nuclides, in order to calculate the decay heat produced after shutdown. Results show the consistency and reliability of the applied approach.
Abstract: For over a decade, the Pulse Coupled Neural Network
(PCNN) based algorithms have been successfully used in image
interpretation applications including image segmentation. There are
several versions of the PCNN based image segmentation methods,
and the segmentation accuracy of all of them is very sensitive to the
values of the network parameters. Most methods treat PCNN
parameters such as the linking coefficient and the primary firing
threshold as global parameters, and determine them by trial and error. The
automatic determination of appropriate values for the linking
coefficient and the primary firing threshold is a challenging problem and deserves
further research. This paper presents a method for obtaining global as
well as local values for the linking coefficient and the primary firing
threshold for neurons directly from the image statistics. Extensive
simulation results show that the proposed approach achieves
excellent segmentation accuracy comparable to the best accuracy
obtainable by trial-and-error for a variety of images.
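The two parameters the paper derives from image statistics enter the PCNN firing rule as follows: the internal activity is U = F * (1 + beta * L), where beta is the linking coefficient, and a neuron fires when U exceeds its threshold theta, which then decays. The single-pass sketch below uses a 1-D row of pixels and illustrative parameter values only.

```python
import math

# One simplified PCNN iteration over a 1-D row of pixels.
# beta (linking coefficient) and theta (primary firing threshold) are
# the parameters the paper determines from image statistics; values
# here are illustrative.

def pcnn_step(stimulus, fired_prev, beta, theta, v_theta=20.0, alpha=0.5):
    """Return (firing pattern, updated thresholds) for one iteration."""
    n = len(stimulus)
    fired, theta_new = [], []
    for i in range(n):
        # linking input from immediate neighbours' previous firing
        linking = sum(fired_prev[j] for j in (i - 1, i + 1) if 0 <= j < n)
        u = stimulus[i] * (1.0 + beta * linking)   # internal activity
        y = 1 if u > theta[i] else 0               # fire above threshold
        fired.append(y)
        # threshold decays, and jumps by v_theta after a firing
        theta_new.append(theta[i] * math.exp(-alpha) + v_theta * y)
    return fired, theta_new

row = [0.9, 0.85, 0.8, 0.1, 0.15]     # bright region vs dark region
theta0 = [0.5] * len(row)
y1, theta1 = pcnn_step(row, [0] * len(row), beta=0.3, theta=theta0)
```

With a well-chosen primary threshold, the bright region fires as one segment on the first pass while the dark region stays silent, which is the segmentation behavior whose accuracy depends so sensitively on beta and theta.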
Abstract: Since the conception of JML, many tools, applications and implementations have been developed. In this context, users and developers who want to use JML may feel surrounded by this multitude of tools and applications. Looking for a common infrastructure and an independent language to provide a bridge between these tools and JML, we developed an approach for embedding contracts in XML for Java: XJML. This approach offers the ability to separate preconditions, postconditions and class invariants using JML and XML, so we built a front-end that can perform Runtime Assertion Checking, Extended Static Checking and Full Static Program Verification. Moreover, the front-end's capabilities can be extended and easily implemented thanks to XML. We believe that XJML is an easy way to begin building a graphical user interface, thereby delivering a friendly, IDE-independent environment to the developer community that wants to work with JML.
Abstract: In 1990 [1] the subband DFT (SB-DFT) technique was proposed. This technique uses Hadamard filters in the decomposition step to split the input sequence into low-pass and high-pass sequences. In the next step, either two DFTs are computed on both bands to obtain the full-band DFT, or one DFT on one of the two bands to obtain an approximate DFT. A combination network with correction factors is then applied after the DFTs. Another approach was proposed in 1997 [2] that uses a special discrete wavelet transform (DWT) to compute the discrete Fourier transform (DFT). In the first step of the algorithm, the input sequence is decomposed, in a manner similar to the SB-DFT, into two sequences using wavelet decomposition with Haar filters. The second step is to perform DFTs on both bands to obtain the full-band DFT, or to obtain a fast approximate DFT by pruning at both the input and output sides. In this paper, the wavelet-based DFT (W-DFT) with Haar filters is shown to be equivalent to the SB-DFT with Hadamard filters; the only difference is a constant factor in the combination network. This result is important because it completes the analysis of the W-DFT: all results concerning the accuracy and approximation errors of the SB-DFT become applicable. An application example in spectral analysis is given for both the SB-DFT and the W-DFT (with different filters). The adaptive capability of the SB-DFT is incorporated into the W-DFT algorithm to select the band with the most energy as the band to be computed. Finally, the W-DFT is extended to the two-dimensional case, with an application in image transformation using two different types of wavelet filters.
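The combination network can be verified numerically: with the sum and difference (Hadamard) bands a[n] = x[2n] + x[2n+1] and b[n] = x[2n] - x[2n+1], the full-band DFT follows from the two half-band DFTs as X[k] = ((1 + W^k) A[k] + (1 - W^k) B[k]) / 2 with W = exp(-j*2*pi/N); the Haar (W-DFT) version differs only by a 1/sqrt(2) normalization constant. A small sketch:

```python
import cmath

# Direct O(N^2) DFT, used here only as a reference.
def dft(x):
    n = len(x)
    return [sum(x[m] * cmath.exp(-2j * cmath.pi * k * m / n)
                for m in range(n))
            for k in range(n)]

def subband_dft(x):
    """Full-band DFT of x (even length N) via Hadamard subbands."""
    n = len(x)
    a = [x[2 * m] + x[2 * m + 1] for m in range(n // 2)]   # low band
    b = [x[2 * m] - x[2 * m + 1] for m in range(n // 2)]   # high band
    A, B = dft(a), dft(b)                  # two half-band DFTs
    out = []
    for k in range(n):
        w = cmath.exp(-2j * cmath.pi * k / n)
        ak, bk = A[k % (n // 2)], B[k % (n // 2)]   # period N/2
        out.append(0.5 * ((1 + w) * ak + (1 - w) * bk))
    return out

x = [1.0, 2.0, 0.5, -1.0, 3.0, 0.0, -2.0, 1.5]
```

Dropping one of the two bands before the combination step gives exactly the approximate DFT mentioned in the abstract.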
Abstract: One of the main research directions in the CAD/CAM
machining area is the reduction of machining time.
Feedrate scheduling is one of the advanced techniques that
allows the uncut chip area, and consequently the main cutting force,
to be kept constant. There are two main ways to optimize the
feedrate. The first consists of cutting force monitoring, which
requires complex equipment for force measurement, after which the
feedrate is set according to the cutting force variation. The second
way is to optimize the feedrate by keeping the material removal rate
constant with respect to the cutting conditions.
In this paper a new approach is proposed, using an extended
database that replaces the system model.
The feedrate schedule is determined based on the identification
of the reconfigurable machine tool, with the feed value determined
from the uncut chip section area, the contact length between tool
and blank, and the geometrical roughness.
The first stage consists of monitoring the blank and the tool to
determine their actual profiles. The next stage is the determination
of the programmed tool path that allows the piece target profile to
be obtained.
The graphic representation environment models the tool and blank
regions and then positions the tool model with respect to the blank
model according to the programmed tool path. For each of these
positions, the geometrical roughness value, the uncut chip area and
the tool-blank contact length are calculated. Each of these
parameters is compared with its admissible value, and the feed value
is established according to the result.
This approach has the following advantages: in the case of complex
cutting processes, the prediction of the cutting force is possible;
the real cutting profile, which has deviations from the theoretical
profile, is considered; the blank-tool contact length can be limited;
and the programmed tool path can be corrected so that the target
profile is obtained.
Applying this method yields data sets that allow the feedrate to be
scheduled so that the uncut chip area, and as a result the cutting
force, is constant, which allows the machine tool to be used more
efficiently and the machining time to be reduced.
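The per-position feed selection rule described above can be sketched as follows: the computed uncut chip area, tool-blank contact length and geometrical roughness are compared against their admissible values, and the feed is scaled by the most restrictive ratio. This is a simplified linear-scaling sketch with hypothetical limit values, not the paper's database-driven procedure.

```python
# Simplified feed-selection rule: scale the nominal feed by the most
# restrictive (parameter / admissible value) ratio. All limits are
# hypothetical.

def select_feed(feed_nominal, chip_area, contact_len, roughness,
                chip_area_adm, contact_len_adm, roughness_adm):
    """Reduce the feed when any parameter exceeds its admissible value."""
    scale = min(1.0,
                chip_area_adm / chip_area,
                contact_len_adm / contact_len,
                roughness_adm / roughness)
    return feed_nominal * scale

# If the uncut chip area exceeds its admissible value, the feed drops:
f = select_feed(0.2, chip_area=1.5, contact_len=4.0, roughness=2.0,
                chip_area_adm=1.0, contact_len_adm=5.0, roughness_adm=3.2)
```

Repeating this decision at every programmed tool position yields the feed schedule that holds the uncut chip area, and hence the cutting force, approximately constant.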
Abstract: This paper proposes a new technique for improving
the efficiency of software testing, which is based on a conventional
attempt to reduce test cases that have to be tested for any given
software. The approach exploits the advantage of regression testing,
where fewer test cases lessen the time consumed by testing
as a whole. The technique also offers a means to perform test case
generation automatically. Compared to one of the techniques in the
literature where the tester has no option but to perform the test case
generation manually, the proposed technique provides a better
option. For test case reduction, the technique uses simple
algebraic conditions to assign fixed values to variables (maximum,
minimum and constant variables). In this way, the variable values
are limited to a definite range, resulting in a smaller number
of possible test cases to process. The technique can also be used in
program loops and arrays.
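The reduction idea above can be sketched as follows: instead of exercising every value in a variable's range, each variable is fixed to its boundary values (minimum and maximum) or its constant value, so the suite shrinks to the cross product of a few representatives per variable. The variable ranges below are hypothetical.

```python
from itertools import product

# Boundary-value style reduction: two representatives per ranged
# variable, one per constant variable. Ranges are illustrative.

def reduced_test_cases(variables):
    """variables: dict name -> (min, max) tuple or a single constant."""
    domains = []
    for spec in variables.values():
        if isinstance(spec, tuple):
            lo, hi = spec
            domains.append((lo, hi))    # keep only the boundary values
        else:
            domains.append((spec,))     # constant variable: one value
    return list(product(*domains))

cases = reduced_test_cases({"x": (0, 100), "y": (-5, 5), "mode": 3})
```

Here the full integer cross product of the two ranges would contain 101 * 11 = 1111 cases, while the reduced suite contains only the four boundary combinations.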
Abstract: Scheduling for the flexible job shop is very important
in both fields of production management and combinatorial
optimization. However, it is quite difficult to achieve an optimal
solution to this problem with traditional optimization approaches
owing to the high computational complexity. Combining several
optimization criteria induces additional complexity and new
problems. In this paper, a Pareto approach to solving the
multi-objective flexible job shop scheduling problem is proposed. The
objectives considered are to minimize the overall completion time
(makespan) and total weighted tardiness (TWT). An effective
simulated annealing algorithm based on the proposed approach is
presented to solve the multi-objective flexible job shop scheduling
problem. An external memory of non-dominated solutions is
considered to save and update the non-dominated solutions during
the solution process. Numerical examples are used to evaluate and
study the performance of the proposed algorithm. The proposed
algorithm can be applied easily in real factory conditions and for
large size problems. It should thus be useful to both practitioners and
researchers.
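The external archive mechanism described above can be sketched on a toy two-objective problem (minimize f1 and f2); the actual paper applies it within flexible job shop scheduling, so the objective function, move operator and cooling schedule below are illustrative stand-ins.

```python
import math
import random

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_archive(archive, objs):
    """Keep the archive as a set of mutually non-dominated points."""
    if objs in archive or any(dominates(kept, objs) for kept in archive):
        return archive
    archive = [kept for kept in archive if not dominates(objs, kept)]
    archive.append(objs)
    return archive

def evaluate(x):
    return (x ** 2, (x - 2) ** 2)    # conflicting toy objectives

random.seed(0)
x, temp = 5.0, 10.0
archive = [evaluate(x)]
for _ in range(2000):
    cand = x + random.uniform(-0.5, 0.5)              # neighbourhood move
    delta = sum(evaluate(cand)) - sum(evaluate(x))    # scalarized acceptance
    if delta < 0 or random.random() < math.exp(-delta / temp):
        x = cand
    archive = update_archive(archive, evaluate(x))    # save non-dominated
    temp *= 0.995                                     # cooling schedule
```

Every visited solution updates the archive, so the non-dominated front found during the annealing walk is preserved even when the search later moves away from it.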
Abstract: The focus of this research is the mission of the Russian Orthodox Church in Kazakhstan in the nineteenth century and the Soviet period. National customs and traditions were closely connected with religious practices, outlooks and attitudes. This connection has shaped Kazakh historians' assessments of the Christianization of the local population. Some of them are inclined to consider the small number of baptized Kazakhs as evidence that the Russian Orthodox Church did not achieve its mission. The number of historians who hold this view has grown over the last century; however, our calculations indicate that the number of Kazakhs who became Orthodox Christians is much higher than other historians believe. These converts can be divided into three groups: some remained Christian until their deaths, others held two faiths, and the third group hid their true religion, having returned to their former belief. Defining the exact number of baptized Kazakhs therefore represents a challenge. The available data do not create a clear picture of the level of Christianization, as constant and accurate records were not kept, and the data appearing in the reports of spiritual attendants and civil authorities are not always reliable. The purpose of this article is to illuminate and analyze the missionary activity of the Russian Orthodox Church in Kazakhstan.
Abstract: Debate over the use of particular methods in interlanguage pragmatics has increased recently. Researchers have argued the advantages and disadvantages of each method, whether natural or elicited. Findings of different studies indicate that the use of a single method may not provide enough data to answer all of a study's questions. The current study investigated the validity of using a multimethod approach in interlanguage pragmatics to understand the development of requests in Arabic as a second language (Arabic L2). To this end, the study adopted two methods belonging to two types of data sources: institutional discourse (natural data) and role play (elicited data). Participants were 117 learners of Arabic L2 at the university level, representing four levels (beginner, low-intermediate, high-intermediate, and advanced). Results showed that using two or more methods in interlanguage pragmatics affects the size and nature of the data.
Abstract: In this paper we investigate the influence of external
noise on the inference of network structures. The purpose of our
simulations is to gain insights into the experimental design of
microarray experiments to infer, e.g., transcription regulatory
networks. Here, external noise means that the dynamics of the system
under investigation, e.g., the temporal changes of mRNA
concentration, are affected by measurement errors. In addition to
external noise, another problem occurs in the context of microarray
experiments. In practice, it is not possible to monitor the mRNA
concentration over an arbitrarily long time period, as demanded by the
statistical methods used to learn the underlying network structure. For
this reason, we use only short time series to make our simulations
more biologically plausible.
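The simulation setting can be illustrated with a minimal sketch (not the paper's actual network model): a single "gene" with dynamics x(t+1) = a*x(t) plus a small intrinsic disturbance, observed through external measurement noise. The autoregressive coefficient recovered by least squares from a short time series degrades as the external noise grows. All parameter values are illustrative.

```python
import random

# One-gene toy model: least-squares recovery of the dynamics
# coefficient from a short, noisily observed time series.

def estimate_coefficient(a=0.8, noise_sd=0.0, steps=30, seed=1):
    rng = random.Random(seed)
    x, traj = 1.0, []
    for _ in range(steps):
        traj.append(x)
        x = a * x + 0.05 * rng.gauss(0, 1)       # intrinsic dynamics noise
    obs = [v + rng.gauss(0, noise_sd) for v in traj]   # external noise
    num = sum(obs[t] * obs[t + 1] for t in range(steps - 1))
    den = sum(obs[t] ** 2 for t in range(steps - 1))
    return num / den                             # least-squares estimate

def mean_abs_error(noise_sd, a=0.8, n_runs=10):
    """Average estimation error over several short noisy time series."""
    return sum(abs(estimate_coefficient(a, noise_sd, seed=s) - a)
               for s in range(n_runs)) / n_runs

err_clean = mean_abs_error(0.0)
err_noisy = mean_abs_error(0.5)
```

The measurement noise biases the recovered coefficient toward zero (attenuation), and with only 30 time points the short series cannot average this error away, which is the effect the simulations study.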
Abstract: One approach to assessing the neural networks underlying cognitive processes is to study electroencephalography (EEG). It is relevant for detecting various mental states and characterizing the physiological changes that help to discriminate between two situations. To this end, an EEG (amplitude, synchrony) classification procedure is described and validated. The two situations are "eyes closed" and "eyes opened", chosen in order to study the "alpha blocking response" phenomenon in the occipital area. The correct classification rate between the two situations is 92.1% (SD = 3.5%). The spatial distribution of part of the amplitude features that help discriminate between the two situations is located in the occipital regions, which validates the localization method. Moreover, amplitude features in frontal areas, "short-distance" synchrony within frontal areas, and "long-distance" synchrony between frontal and occipital areas also help to discriminate between the two situations. This procedure will be used for mental fatigue detection.
Abstract: Virtually all existing networked system management
tools use a Manager/Agent paradigm. That is, distributed agents are
deployed on managed devices to collect local information and report
it back to some management unit. Even those that use standard
protocols such as SNMP fall into this model. Using standard protocols
has the advantage of interoperability among devices from different
vendors. However, it may not be able to provide customized
information that is of interest to satisfy specific management needs.
In this dissertation work, different approaches are used to
collect information regarding the devices attached to a Local Area
Network. An SNMP-aware application is being developed that will
manage the discovery procedure and will be used as a data collector.