Abstract: This research work is aimed at speech recognition using scaly neural networks. A small vocabulary of 11 words was established first; these words are "word, file, open, print, exit, edit, cut, copy, paste, doc1, doc2". The chosen words are associated with executing computer functions such as opening a file, printing a text document, cutting, copying, pasting, editing, and exiting. Each word is introduced to the computer and then subjected to a feature extraction process using LPC (linear prediction coefficients). These features are used as input to an artificial neural network in speaker-dependent mode. Half of the recorded words are used for training the artificial neural network and the other half for testing the system; the recognized words are then used for information retrieval.
The system consists of three parts: speech processing and feature extraction, training and testing using neural networks, and information retrieval.
The retrieval process proved to be 79.5-88% successful, which is quite acceptable considering variations in the surroundings, the state of the speaker, and the microphone type.
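As a rough illustration of such a pipeline (not the authors' system), the sketch below extracts LPC features from recorded words and trains a small feed-forward network in speaker-dependent mode. It assumes librosa for LPC extraction, scikit-learn's MLP as the network, and a hypothetical file layout recordings/<word>_<take>.wav; all of these are assumptions, not details from the abstract.

```python
import numpy as np
import librosa
from sklearn.neural_network import MLPClassifier

VOCAB = ["word", "file", "open", "print", "exit", "edit",
         "cut", "copy", "paste", "doc1", "doc2"]

def lpc_features(wav_path, order=12, sr=8000):
    """Load one spoken word and return its LPC coefficient vector."""
    y, _ = librosa.load(wav_path, sr=sr)
    y, _ = librosa.effects.trim(y)       # strip leading/trailing silence
    a = librosa.lpc(y, order=order)      # returns [1, -a1, ..., -a_order]
    return a[1:]                         # drop the leading 1

# Hypothetical file layout: recordings/<word>_<take>.wav, 10 takes per word.
X, t = [], []
for w in VOCAB:
    for take in range(10):
        X.append(lpc_features(f"recordings/{w}_{take}.wav"))
        t.append(w)
X, t = np.array(X), np.array(t)

# Half of the recordings train the network, the other half test it.
train = np.arange(len(X)) % 2 == 0
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
clf.fit(X[train], t[train])
print("test accuracy:", clf.score(X[~train], t[~train]))
```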
Abstract: In ad hoc networks, the main issue in protocol design is quality of service, whereas in wireless sensor networks the main constraint is the limited energy of the sensors. In fact, protocols that minimize the power consumption of sensors receive the most attention in wireless sensor networks. One approach to reducing energy consumption in wireless sensor networks is to reduce the number of packets transmitted in the network. Data aggregation, a technique that combines related data and prevents the transmission of redundant packets, can be effective in reducing the number of transmitted packets. Given that processing information consumes less power than transmitting it, data aggregation is of great importance, and this technique is therefore used in many protocols [5]. One data aggregation technique is to use a data aggregation tree, but finding an optimal data aggregation tree for collecting data in networks with one sink is an NP-hard problem. In the data aggregation technique, related information packets are combined in intermediate nodes into a single packet. The number of packets transmitted in the network is thus reduced, less energy is consumed, and the network lifetime is ultimately improved. Heuristic methods are used to solve this NP-hard problem; one such optimization method is Simulated Annealing. In this article, we propose a new method for building the data collection tree in wireless sensor networks using the Simulated Annealing algorithm, and we evaluate its efficiency against a Genetic Algorithm.
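A minimal sketch of the idea, under illustrative assumptions (the paper's actual move operator, cooling schedule, and energy model are not specified in the abstract): nodes hold a parent pointer toward the sink, the cost is a simple energy proxy (sum of squared link distances), and annealing reattaches random nodes while avoiding cycles.

```python
import math, random

random.seed(1)
N = 30
pos = [(0.5, 0.5)] + [(random.random(), random.random()) for _ in range(N - 1)]

def dist2(i, j):
    (x1, y1), (x2, y2) = pos[i], pos[j]
    return (x1 - x2) ** 2 + (y1 - y2) ** 2

def cost(parent):
    """Energy proxy: transmission cost grows with squared link distance."""
    return sum(dist2(i, parent[i]) for i in range(1, N))

def subtree(parent, r):
    """Nodes in the subtree rooted at r (reattaching inside it makes a cycle)."""
    out, changed = {r}, True
    while changed:
        changed = False
        for i in range(1, N):
            if parent[i] in out and i not in out:
                out.add(i); changed = True
    return out

parent = [0] * N                     # start: every node sends straight to the sink
best, best_cost = parent[:], cost(parent)
T = 1.0
for step in range(20000):
    i = random.randrange(1, N)
    forbidden = subtree(parent, i)
    q = random.choice([v for v in range(N) if v not in forbidden])
    delta = dist2(i, q) - dist2(i, parent[i])
    if delta < 0 or random.random() < math.exp(-delta / T):
        parent[i] = q                # accept the move (always if it improves)
        c = cost(parent)
        if c < best_cost:
            best, best_cost = parent[:], c
    T *= 0.9997                      # geometric cooling
print("best energy proxy:", round(best_cost, 4))
```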
Abstract: Researchers have been applying artificial and computational intelligence (AI/CI) methods to computer games. In this research field, further studies are required to compare AI/CI methods with respect to each game application. In this paper, we report our experimental results on the comparison of three evolutionary algorithms, namely evolution strategy (ES), genetic algorithm (GA), and their hybrid, applied to evolving controller agents for the CIG 2007 Simulated Car Racing competition. Our experimental results show that premature convergence of solutions was observed in the case of ES, and that GA outperformed ES in the last half of the generations. Moreover, a hybrid that uses GA first and ES next evolved the best solution among all solutions generated. This result demonstrates the ability of GA to globally search promising areas in the early stage and the ability of ES to locally search the focused area (fine-tuning solutions).
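The GA-then-ES structure can be illustrated on a toy benchmark (Rastrigin) rather than the car-racing controller itself; the operators, population sizes, and step-size schedule below are illustrative assumptions, not the paper's settings.

```python
import math, random
random.seed(0)
DIM = 10

def fitness(x):
    """Rastrigin function; lower is better."""
    return 10 * DIM + sum(v * v - 10 * math.cos(2 * math.pi * v) for v in x)

def ga_phase(pop, gens=100):
    """Global exploration: tournament selection, one-point crossover, mutation."""
    for _ in range(gens):
        nxt = []
        for _ in range(len(pop)):
            a = min(random.sample(pop, 3), key=fitness)
            b = min(random.sample(pop, 3), key=fitness)
            cut = random.randrange(DIM)
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:
                child[random.randrange(DIM)] += random.gauss(0, 1.0)
            nxt.append(child)
        pop = nxt
    return pop

def es_phase(pop, mu=10, lam=40, gens=100, sigma=0.3):
    """Local fine-tuning: (mu + lambda) ES with a decaying step size."""
    parents = sorted(pop, key=fitness)[:mu]
    for _ in range(gens):
        kids = [[v + random.gauss(0, sigma) for v in random.choice(parents)]
                for _ in range(lam)]
        parents = sorted(parents + kids, key=fitness)[:mu]
        sigma *= 0.99
    return parents[0]

pop = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(40)]
best = es_phase(ga_phase(pop))          # GA first, ES next
print("hybrid best fitness:", round(fitness(best), 3))
```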
Abstract: Despite the widespread application of finite mixture models in segmentation, finite mixture model selection is still an important issue. In fact, the selection of an adequate number of segments is a key issue in deriving latent segment structures, and it is desirable that the selection criteria used for this purpose are effective. In order to select among several information criteria which may support the selection of the correct number of segments, we conduct a simulation study. In particular, this study is intended to determine which information criteria are more appropriate for mixture model selection when considering data sets with only categorical segmentation base variables. The generation of mixtures of multinomial data supports the proposed analysis. As a result, we establish a relationship between the level of measurement of the segmentation variables and the performance of eleven information criteria. The criterion AIC3 shows the best performance (it indicates the correct number of segments of the simulated structure most often) for mixtures of multinomial segmentation base variables.
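For context, the criteria compared in such studies penalize the maximized log-likelihood logL by the number of free parameters k; AIC3 uses a penalty of 3 per parameter (AIC3 = -2 logL + 3k). A minimal sketch of the selection rule, with hypothetical (logL, k) values standing in for EM fits of multinomial mixtures:

```python
import math

def aic(logL, k):      return -2 * logL + 2 * k
def aic3(logL, k):     return -2 * logL + 3 * k
def bic(logL, k, n):   return -2 * logL + k * math.log(n)

# Hypothetical fits with 1..5 segments: (logL, k) would come from EM runs.
fits = {1: (-5200.0, 9), 2: (-4950.0, 19), 3: (-4890.0, 29),
        4: (-4885.0, 39), 5: (-4883.0, 49)}
n = 1000  # sample size (assumed)

print("AIC  selects", min(fits, key=lambda s: aic(*fits[s])), "segments")
print("AIC3 selects", min(fits, key=lambda s: aic3(*fits[s])), "segments")
print("BIC  selects", min(fits, key=lambda s: bic(*fits[s], n)), "segments")
```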
Abstract: This paper presents a region-based segmentation method for ultrasound images using local statistics. In this segmentation approach, the homogeneous regions depend on the image granularity features, where the structures of interest, with dimensions comparable to the speckle size, are to be extracted. The method uses a look-up table comprising the local statistics of every pixel, which consist of the homogeneity and similarity bounds according to the kernel size. The shape and size of the growing regions depend on the entries of this look-up table. The algorithm is implemented using a connected seeded region growing procedure in which each pixel is taken as a seed point. Region merging after region growing also suppresses high-frequency artifacts. The updated merged regions produce the output in the form of a segmented image. The algorithm produces results that are less sensitive to pixel location and also allows segmentation of accurate homogeneous regions.
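A minimal sketch of the connected seeded region growing step on a synthetic speckled image; for brevity, the per-pixel look-up table of homogeneity/similarity bounds described above is simplified here to a single intensity tolerance (an assumption, not the paper's scheme).

```python
import numpy as np
from collections import deque

def region_grow(img, seed, tol=10.0):
    """Grow a 4-connected region from seed while pixels stay within tol of the region mean."""
    h, w = img.shape
    grown = np.zeros((h, w), dtype=bool)
    grown[seed] = True
    q = deque([seed])
    mean, count = float(img[seed]), 1
    while q:
        y, x = q.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and not grown[ny, nx]:
                if abs(float(img[ny, nx]) - mean) <= tol:   # similarity bound
                    grown[ny, nx] = True
                    count += 1
                    mean += (float(img[ny, nx]) - mean) / count  # running mean
                    q.append((ny, nx))
    return grown

# Usage on a synthetic "speckled" image containing a brighter square region:
rng = np.random.default_rng(0)
img = rng.normal(100, 5, (64, 64))
img[20:40, 20:40] += 40
mask = region_grow(img, seed=(30, 30), tol=15.0)
print("pixels in grown region:", int(mask.sum()))
```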
Abstract: The transient burning behavior of a solid fuel under oxidizer gas flow is numerically investigated. This is done by analyzing the regression rate response to sudden and oscillatory variations imposed on the inflow properties. The conjugate problem is considered by simultaneously solving the flow and solid-phase governing equations to compute the fuel regression rate. The advection upstream splitting method is used as the flow computational scheme within a finite volume method. The ignition phase is fully simulated to obtain the exact initial condition for the response analysis. The results show that transient burning effects, which lead to combustion instabilities and intermittent extinctions, can be observed in solid fuels just as in solid propellants.
Abstract: This paper presents harmonic elimination for hybrid multilevel inverters (HMI), which can increase the number of output voltage levels. Total Harmonic Distortion (THD) is one of the most important performance requirements. Because the HMI has many output levels, the set of nonlinear equations for eliminating undesired individual harmonics and minimizing THD contains numerous unknown variables. The optimized harmonic stepped waveform (OHSW) is the conventional method for solving the switching angles, but it becomes increasingly complicated to solve as levels are added. Artificial intelligence techniques are therefore considered to solve this problem. This paper presents the Particle Swarm Optimization (PSO) technique for solving the switching angles to obtain minimum THD and eliminate undesired individual harmonics in a 15-level hybrid multilevel inverter. Consequently, the problem has many variables and numerous harmonics can be eliminated. Both advantages, the high number of inverter levels and Particle Swarm Optimization (PSO), are exploited as powerful tools for harmonic elimination.
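A minimal PSO sketch for the switching-angle problem of a stepped multilevel waveform: for s angles with equal DC sources, the n-th odd harmonic is proportional to (1/n) * sum(cos(n*theta_i)), and the objective penalizes fundamental tracking error plus selected low-order harmonics. The swarm size, PSO weights, target modulation, and the 7-angle case are illustrative assumptions, not the paper's parameters.

```python
import math, random
random.seed(0)
S = 7                  # number of switching angles (assumed)
M = 0.8 * S            # desired fundamental amplitude, per-unit of 4/pi (assumed)

def harmonic(theta, n):
    return sum(math.cos(n * t) for t in theta) / n

def objective(theta):
    f = (harmonic(theta, 1) - M) ** 2                          # track fundamental
    f += sum(harmonic(theta, n) ** 2 for n in (5, 7, 11, 13))  # kill low harmonics
    return f

# Standard global-best PSO over angles in (0, pi/2).
P, W, C1, C2 = 40, 0.7, 1.5, 1.5
xs = [[random.uniform(0.01, math.pi / 2 - 0.01) for _ in range(S)] for _ in range(P)]
vs = [[0.0] * S for _ in range(P)]
pb = [x[:] for x in xs]                 # personal bests
gb = min(pb, key=objective)             # global best
for _ in range(500):
    for i in range(P):
        for d in range(S):
            vs[i][d] = (W * vs[i][d]
                        + C1 * random.random() * (pb[i][d] - xs[i][d])
                        + C2 * random.random() * (gb[d] - xs[i][d]))
            xs[i][d] = min(max(xs[i][d] + vs[i][d], 0.01), math.pi / 2 - 0.01)
        if objective(xs[i]) < objective(pb[i]):
            pb[i] = xs[i][:]
    gb = min(pb, key=objective)
print("angles (deg):", [round(math.degrees(a), 2) for a in sorted(gb)])
```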
Abstract: Urban disaster risks and vulnerabilities are great problems for Turkey. The annual loss of life and property through disasters in the world's major metropolitan areas is increasing. Urban concentrations of the poor and less informed in environmentally fragile locations suffer the impact of disasters disproportionately. Gecekondu (squatter) developments compound the inherent risks associated with high-density environments, inappropriate technologies, and inadequate infrastructure. Moreover, Turkey has many geological disadvantages, such as sitting on top of active tectonic plate boundaries and having avalanche-, flood-, landslide- and drought-prone areas. Since this natural setting is unavoidable, the only way to survive in such a harsh geography is to be aware of the importance of these natural events and to take political and physical measures. The main aim of this research is to bring up the magnitude of the natural hazard risks in the Izmir built-up zone, which are not being taken into consideration adequately. Because the dimensions of the peril are not taken seriously enough, the natural hazard risks, although commonly well known, are not considered important or are forgotten after some time passes. Within this research, the magnitude of the natural hazard risks for Izmir is presented on the basis of concrete, local studies of Izmir's risky areas.
Abstract: Knowledge development in companies relies on knowledge-intensive business processes, which are characterized by high complexity in their execution, weak structuring, communication-oriented tasks, high decision autonomy, and often the need for creativity and innovation. A foundation of knowledge development is provided, based on a new conception of knowledge and knowledge dynamics. This conception consists of a three-dimensional model of knowledge with types, kinds and qualities. Built on this knowledge conception, knowledge dynamics is modeled with the help of general knowledge conversions between knowledge assets. Here knowledge dynamics is understood to cover the acquisition, conversion, transfer, development and usage of knowledge. Through this conception we gain a sound basis for knowledge management and development in an enterprise. In particular, the type dimension of knowledge, which categorizes it according to its internality and externality with respect to the human being, is crucial for enterprise knowledge management and development, because knowledge should be made available by converting it to more external types.
Built on this conception, a modeling approach for knowledge-intensive business processes is introduced, be they human-driven, e-driven or task-driven processes. As an example of this approach, a model of the creative activity for the renewal planning of a product is given.
Abstract: The aim of this study was to compare the solubility of selected volatile organic compounds (VOCs) in water and silicon oil using the simple static headspace method. The experimental design allowed equilibrium to be achieved within 30-60 minutes. Infinite dilution activity coefficients and Henry's law constants for various organics representing esters, ketones, alkanes, aromatics, cycloalkanes and amines were measured at 303 K. The measurements were reproducible, with a relative standard deviation and coefficient of variation of 1.3x10^-3 and 1.3, respectively. The activity coefficients determined statically using shaker flasks were reasonably comparable to those obtained using the gas-liquid chromatographic technique and those predicted using group contribution methods, mainly UNIFAC. Silicon oil, chemically known as polydimethylsiloxane, was found to be a better absorbent for VOCs than water, which quickly becomes saturated. For example, the infinite dilution mole-fraction-based activity coefficients of hexane are 0.503 in silicon oil and 277,000 in water, respectively; silicon oil thus gives a superiority factor of 550,696. Henry's law constants and activity coefficients at infinite dilution play a significant role in the design of scrubbers for the abatement of volatile organic compounds from contaminated air streams. This paper presents the phase equilibrium of volatile organic compounds in very dilute aqueous and polymeric solutions, indicating the movement and fate of chemicals in air and solvent. The successful comparison of the results obtained here with those obtained using other methods by the same authors and in the literature indicates that the results obtained here are reliable.
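The numbers in the abstract connect directly: with mole-fraction-based infinite dilution activity coefficients, the Henry's law constant is H = gamma_inf * P_sat, and the superiority factor of silicon oil over water is the ratio of the two activity coefficients. A small worked sketch, where the hexane vapor pressure at 303 K is an approximate literature value used only to illustrate the calculation:

```python
# Activity coefficients for hexane, taken from the abstract.
gamma_oil, gamma_water = 0.503, 277_000

# Superiority factor of silicon oil over water = ratio of activity coefficients.
print("superiority factor:", round(gamma_water / gamma_oil))   # ~550,696

# Henry's law constant H = gamma_inf * P_sat (mole-fraction based).
P_sat_hexane = 24.95   # kPa at 303 K (approximate literature value, assumed)
for name, g in [("silicon oil", gamma_oil), ("water", gamma_water)]:
    print(f"H(hexane in {name}) = {g * P_sat_hexane:,.1f} kPa")
```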
Abstract: Despite the recent surge of research in the control of worm propagation, there is currently no effective defense system against such cyber attacks. We first design a distributed detection
architecture called Detection via Distributed Blackholes (DDBH).
Our novel detection mechanism could be implemented via virtual
honeypots or honeynets. Simulation results show that a worm can be
detected with virtual honeypots on only 3% of the nodes. Moreover,
the worm is detected when less than 1.5% of the nodes are infected.
We then develop two control strategies: (1) optimal dynamic traffic-blocking, for which we determine the condition that guarantees the minimum number of removed nodes when the worm is contained, and (2) predictive dynamic traffic-blocking, a realistic deployment of the optimal strategy on scale-free graphs. The predictive dynamic
traffic-blocking, coupled with the DDBH, ensures that more than
40% of the network is unaffected by the propagation at the time
when the worm is contained.
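A minimal sketch of the detection setting: a worm spreads over a scale-free graph while a random 3% of nodes act as (virtual) honeypots, and we record what fraction of nodes is infected when the worm first contacts a honeypot. The graph size, infection model, and per-contact probability are illustrative assumptions, not the paper's simulation parameters.

```python
import random
import networkx as nx

random.seed(0)
G = nx.barabasi_albert_graph(10_000, 2)          # scale-free topology
honeypots = set(random.sample(list(G.nodes), int(0.03 * G.number_of_nodes())))

infected = {random.choice(list(G.nodes))}        # patient zero
frontier = set(infected)
while frontier:
    new = set()
    for u in frontier:
        for v in G.neighbors(u):
            if v not in infected and random.random() < 0.5:  # per-contact prob.
                new.add(v)
    if new & honeypots:                          # a honeypot was contacted
        frac = len(infected | new) / G.number_of_nodes()
        print(f"worm detected with {frac:.2%} of nodes infected")
        break
    infected |= new
    frontier = new
```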
Abstract: Concrete strength evaluated from compression tests
on cores is affected by several factors causing differences from the
in-situ strength at the location from which the core specimen was
extracted. Among the factors, there is the damage possibly occurring
during the drilling phase that generally leads to underestimate the
actual in-situ strength. In order to quantify this effect, in this study
two wide datasets have been examined, including: (i) about 500 core
specimens extracted from Reinforced Concrete existing structures,
and (ii) about 600 cube specimens taken during the construction of
new structures in the framework of routine acceptance control. The
two experimental datasets have been compared in terms of
compression strength and specific weight values, accounting for the main factors affecting the concrete properties, that is, type and amount of cement, aggregates' grading, type and maximum size of aggregates, water/cement ratio, placing and curing modality, and concrete age. The
results show that the magnitude of the strength reduction due to
drilling damage is strongly affected by the actual properties of
concrete, being inversely proportional to its strength. Therefore, the
application of a single value of the correction coefficient, as generally
suggested in the technical literature and in structural codes, appears
inappropriate. A set of values of the drilling damage coefficient is
suggested as a function of the strength obtained from compressive
tests on cores.
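The conclusion implies a strength-dependent correction rather than a single coefficient. A hypothetical sketch of applying such a tabulated drilling-damage coefficient follows; the breakpoints and values are invented placeholders, not the paper's calibrated numbers.

```python
def drilling_damage_coeff(f_core_mpa):
    """Return a correction coefficient C such that f_is = C * f_core.
    Placeholder table: lower core strength -> larger drilling damage."""
    table = [(10, 1.20), (20, 1.12), (30, 1.07), (45, 1.03)]  # invented values
    for limit, c in table:
        if f_core_mpa <= limit:
            return c
    return 1.00   # high-strength concrete: negligible drilling damage

for f in (12, 25, 50):
    print(f"core {f} MPa -> in-situ estimate {f * drilling_damage_coeff(f):.1f} MPa")
```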
Abstract: Nigerian bread is baked with vitamin A-fortified wheat flour. The study aimed at determining its contribution to preschoolers' vitamin A nutriture. A cross-sectional/experimental study was carried out in four poor urban Local Government Areas (LGAs) of Metropolitan Lagos, Nigeria. A pretested food frequency questionnaire was administered to randomly selected mothers of 1600 preschoolers (24-59 months). The retinyl palmitate content of fourteen bread samples randomly collected from bakeries in all LGAs was analyzed at 0 and 5 days at 25°C using High Performance Liquid Chromatography. Data analysis was done at p
Abstract: The highly nonlinear characteristics of drying processes have prompted researchers to seek new nonlinear control solutions. However, the relation between implementation complexity, on-line processing complexity, reliability of the control structure and controller performance is not well established. The present paper proposes high-performance nonlinear fuzzy controllers for real-time operation of a drying machine, developed with a consistent match between those issues. A PCI-6025E data acquisition device from National Instruments® was used, and the control system was fully designed in the MATLAB®/SIMULINK language. The drying parameters, namely relative humidity and temperature, were controlled through MIMO Hybrid Bang-bang+PI (BPI) and four-dimensional Fuzzy Logic (FLC) real-time controllers to perform drying tests on biological materials. The performance of the drying strategies was compared through several criteria, which are reported without controller retuning. The performance analysis showed much better performance of the FLC than of the BPI controller. The absolute errors were lower than 8.85% for the Fuzzy Logic Controller, about three times lower than the experimental results with BPI control.
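To make the fuzzy-control idea concrete, here is a minimal Mamdani-style step for one drying loop (temperature error to heater command). The membership functions and rule table are illustrative and far simpler than the paper's four-dimensional FLC.

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def flc_step(err):
    """Fuzzify the temperature error (degC), fire three rules, defuzzify."""
    neg  = tri(err, -10, -5, 0)
    zero = tri(err, -5, 0, 5)
    pos  = tri(err, 0, 5, 10)
    # Rules: negative error -> low heat, zero -> hold, positive -> high heat.
    num = neg * 0.2 + zero * 0.5 + pos * 0.9
    den = neg + zero + pos
    return num / den if den else 0.5        # weighted-average defuzzification

for e in (-6.0, 0.0, 4.0):
    print(f"error {e:+.1f} degC -> heater duty {flc_step(e):.2f}")
```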
Abstract: Intermetallic Ni3Al-based alloys belong to a group of advanced materials characterized by good chemical and physical properties (such as structural stability and corrosion resistance) which offer advanced technological applications. The paper presents a study of the catalytic properties of Ni3Al foils (thickness approximately 50 µm) in methanol and hexane decomposition. The examined material possesses a microcrystalline structure without any additional catalysts on the surface. The better catalytic activity of Ni3Al foils with respect to quartz plates in both methanol and hexane decomposition was confirmed. On thin Ni3Al foils the methanol conversion reaches approximately 100% above 480 °C, while the hexane conversion reaches approximately 100% (98.5%) at 500 °C. The deposit formed during methanol decomposition is built up of carbon nanofibers decorated with metal-like nanoparticles.
Abstract: One major issue that is regularly cited as a block to the widespread use of online assessments in eLearning is that of the authentication of the student and the level of confidence that an assessor can have that the assessment was actually completed by that student. Currently, this issue is either ignored, in which case confidence in the assessment and any ensuing qualification is damaged, or else assessments are conducted at central, controlled locations at specified times, losing the benefits of the distributed nature of the learning programme. Particularly as we move towards constructivist models of learning, with intentions towards achieving heutagogic learning environments, the benefits of a properly managed online assessment system are clear. Here we discuss some of the approaches that could be adopted to address these issues, looking at the use of existing security and biometric techniques, combined with some novel behavioural elements. These approaches offer the opportunity to validate the student on accessing an assessment, on submission, and also during the actual production of the assessment. These techniques are currently under development in the DECADE project, and future work will evaluate and report their use.
Abstract: In this study, a synthetic pathway was created by assembling genes from Clostridium butyricum and Escherichia coli in different combinations. Among the genes were dhaB1 and dhaB2 from C. butyricum VPI1718, coding for glycerol dehydratase (GDHt) and its activator (GDHtAc), respectively, involved in the conversion of glycerol to 3-hydroxypropionaldehyde (3-HPA). The yqhD gene from E. coli BL21 was also included, which codes for an NADPH-dependent 1,3-propanediol oxidoreductase isoenzyme (PDORI) reducing 3-HPA to 1,3-propanediol (1,3-PD). Molecular modeling analysis indicated that the conformation of the fusion protein of YQHD and DHAB1 was favorable for direct molecular channeling of the intermediate 3-HPA. According to the simulation results, the yqhD and dhaB1 genes were assembled upstream of dhaB2 to express a fusion protein, yielding the recombinant strain E. coli BL21 (DE3)//pET22b+::yqhD-dhaB1_dhaB2 (strain BP41Y3). Strain BP41Y3 gave a 10-fold higher 1,3-PD concentration than E. coli BL21 (DE3)//pET22b+::yqhD-dhaB1_dhaB2 (strain BP31Y2), which expresses the recombinant enzymes simultaneously but in a non-fusion mode. This is the first report using a gene fusion approach to enhance the biological conversion of glycerol to the value-added compound 1,3-PD.
Abstract: Grid computing is a group of clusters connected over high-speed networks that involves coordinating and sharing computational power, data storage and network resources operating across dynamic and geographically dispersed locations. Resource management and job scheduling are critical tasks in grid computing, and resource selection becomes challenging due to the heterogeneity and dynamic availability of resources. Job scheduling is an NP-complete problem, and different heuristics may be used to reach an optimal or near-optimal solution. This paper proposes a model for resource and job scheduling in a dynamic grid environment. The main focus is to maximize resource utilization and minimize the processing time of jobs. The grid resource selection strategy is based on a Max Heap Tree (MHT), which best suits large-scale applications; the root node of the MHT is selected for job submission. A job grouping concept is used to maximize resource utilization for the scheduling of jobs in grid computing. The proposed resource selection model and job grouping concept are used to enhance the scalability, robustness, efficiency and load-balancing ability of the grid.
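A minimal sketch of the two ideas: resources kept in a max-heap keyed by available capacity (jobs go to the root), and small jobs grouped until they fill a granularity threshold. The capacity units, job lengths, and grouping threshold are illustrative assumptions.

```python
import heapq

# (capacity in MIPS, name); heapq is a min-heap, so the key is negated
# to obtain max-heap behavior (largest capacity at the root).
resources = [(-1400, "R1"), (-900, "R2"), (-2100, "R3"), (-600, "R4")]
heapq.heapify(resources)

def group_jobs(jobs_mi, granularity):
    """Pack job lengths (million instructions) into groups <= granularity."""
    groups, current = [], []
    for j in jobs_mi:
        if current and sum(current) + j > granularity:
            groups.append(current)
            current = []
        current.append(j)
    if current:
        groups.append(current)
    return groups

jobs = [120, 300, 80, 450, 200, 90, 150, 60]
neg_cap, root = heapq.heappop(resources)     # root of the max heap
for g in group_jobs(jobs, granularity=-neg_cap // 3):
    print(f"submit group {g} (total {sum(g)} MI) to {root}")
```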
Abstract: We present a method for fast volume rendering using graphics hardware (GPU). To our knowledge, it is the first implementation of the Shear-Warp algorithm on the GPU. Our GPU-based method provides real-time frame rates and outperforms the CPU-based implementation. When the number of slices is not sufficient, we add in-between slices computed by interpolation, which improves the quality of the rendered images. We have also implemented the ray marching algorithm on the GPU. The results generated by the three algorithms (CPU-based and GPU-based Shear-Warp, GPU-based Ray Marching) for two test models show that the ray marching algorithm outperforms the shear-warp methods in terms of speedup and image quality.
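For readers unfamiliar with Shear-Warp, here is a minimal CPU sketch of the idea the GPU method builds on: shear the volume slices according to the view direction, composite them front-to-back into an intermediate image, then warp to the final image. Nearest-neighbor shifting, a toy opacity transfer function, and an omitted (identity) warp keep it short; none of this reflects the paper's GPU implementation.

```python
import numpy as np

def shear_warp(vol, sx, sy):
    """vol: (Z, Y, X) densities in [0, 1]; (sx, sy): shear per slice."""
    Z, Y, X = vol.shape
    pad = int(max(abs(sx), abs(sy)) * Z) + 1
    img = np.zeros((Y + 2 * pad, X + 2 * pad))
    alpha = np.zeros_like(img)
    for z in range(Z):                       # front-to-back compositing
        dy, dx = pad + int(sy * z), pad + int(sx * z)
        sl = vol[z]
        a = sl * 0.1                         # toy opacity transfer function
        view = np.s_[dy:dy + Y, dx:dx + X]
        img[view] += (1 - alpha[view]) * a * sl
        alpha[view] += (1 - alpha[view]) * a
    return img                               # final 2D warp omitted (identity)

vol = np.zeros((32, 32, 32))
vol[8:24, 8:24, 8:24] = 1.0                  # a solid cube as the test volume
out = shear_warp(vol, sx=0.5, sy=0.25)
print("rendered image shape:", out.shape, "max:", out.max().round(3))
```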
Abstract: The sub-prime mortgage crisis that began in the US is regarded as the most serious economic crisis since the Great Depression of the early 20th century. In particular, hidden problems in the efficient operation of businesses were disclosed all at once, and many financial institutions went bankrupt and filed for court receivership. The collapse of the physical market led to the bankruptcy of manufacturing and construction businesses. This study analyzes the dynamic efficiency of construction businesses during the five years around the global financial crisis. By discovering the trend and stability of the efficiency of a construction business, this study's objective is to improve the management efficiency of construction businesses in the ever-changing construction market. Variables were selected by analyzing corporate information on the top 20 construction businesses in Korea, which were analyzed for static efficiency in 2008 and dynamic efficiency between 2006 and 2010. Unlike other studies, this study succeeded in deducing the efficiency trend and stability of a construction business over five years by using the DEA/Window model. Using the analysis results, efficient and inefficient companies could be identified. In addition, the relative efficiency among DMUs was measured by comparing the relationship between the input and output variables of the construction businesses. This study can be used as a reference to improve management efficiency for companies with low efficiency, based on the efficiency analysis of construction businesses.
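A minimal sketch of the input-oriented CCR DEA model underlying such efficiency studies: for each DMU, minimize theta subject to a nonnegative combination of all DMUs dominating its inputs and outputs. The toy inputs (e.g., assets, employees) and output (revenue) are invented placeholders, and the DEA/Window variant would repeat this over moving time windows.

```python
import numpy as np
from scipy.optimize import linprog

X = np.array([[120, 80], [150, 110], [90, 60], [200, 150]], float)  # inputs
Y = np.array([[300], [330], [250], [380]], float)                   # outputs
n, m, s = X.shape[0], X.shape[1], Y.shape[1]

for o in range(n):
    # Decision variables: [theta, lambda_1 .. lambda_n]; minimize theta.
    c = np.r_[1.0, np.zeros(n)]
    A_ub, b_ub = [], []
    for i in range(m):   # sum_j lambda_j * x_ij <= theta * x_io
        A_ub.append(np.r_[-X[o, i], X[:, i]])
        b_ub.append(0.0)
    for r in range(s):   # sum_j lambda_j * y_rj >= y_ro
        A_ub.append(np.r_[0.0, -Y[:, r]])
        b_ub.append(-Y[o, r])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (1 + n))
    print(f"DMU {o + 1}: efficiency = {res.x[0]:.3f}")
```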