Abstract: Semantic query optimization consists of restricting the search space in order to reduce the set of objects of interest for a query. This paper presents an indexing method based on UB-trees and a static analysis of the constraints associated with the views of the database and with any constraints expressed on attributes. The result of the static analysis is a partitioning of the object space into disjoint blocks. Through Space Filling Curve (SFC) techniques, each fragment (block) of the partition is assigned a unique identifier, enabling efficient indexing of the fragments by UB-trees. The search space corresponding to a range query is thus restricted to a subset of the blocks of the partition. This approach has been developed in the context of a KB-DBMS, but it can be applied to any relational system.
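The SFC-based identifier assignment described above can be illustrated with a Z-order (Morton) encoding, the space filling curve commonly used with UB-trees. This is a minimal generic sketch, not the paper's implementation; the function name and the fixed bit width are assumptions.

```python
def z_address(coords, bits=8):
    """Interleave the bits of each coordinate (Z-order / Morton code).

    UB-trees index multidimensional data by such one-dimensional
    addresses, so every block of a partition gets a unique integer id.
    """
    addr = 0
    for bit in range(bits):            # most significant bit first
        for c in coords:
            addr = (addr << 1) | ((c >> (bits - 1 - bit)) & 1)
    return addr

# Points that are close in space receive close Z-addresses, so a range
# query maps to a small set of contiguous address intervals.
cells = [z_address((x, y), bits=2) for y in range(2) for x in range(2)]
```

Because nearby cells share address prefixes, restricting a range query to a subset of blocks becomes a scan over a few address intervals of the one-dimensional index.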
Abstract: A combined three-microphone voice activity detector (VAD) and noise-canceling system is studied to enhance speech recognition in an automobile environment. A previous experiment clearly showed the ability of the composite system to cancel a single noise source outside of a defined zone. This paper investigates the performance of the composite system when there are frequently moving noise sources (noise sources coming from different locations but not always present at the same time), e.g., speech from another passenger or from a radio while the desired speech is present. To work in such an environment, the three-microphone VAD detects voice within a "VAD valid zone", while the three-microphone noise canceller uses a "noise canceller valid zone" defined in free space around the user's head. A desired voice should therefore lie in the intersection of the noise canceller valid zone and the VAD valid zone, and all noise outside this intersection is suppressed. Experiments are shown for a real environment, i.e., all results were recorded in a car with omni-directional electret condenser microphones.
Abstract: In situ observation of the absorption spectral change of the heptyl viologen cation radical (HV+.) was performed by slab optical waveguide (SOWG) spectroscopy utilizing indium-tin-oxide (ITO) electrodes. In synchronization with electrochemical techniques, we observed the adsorption process of HV+. on the ITO electrode. In this study, we carried out ITO-SOWG observations using KBr aqueous solutions containing different concentrations of HV to investigate the concentration-dependent spectral change. A few specific absorption bands, which indicated that HV+. existed as both monomer and dimer on the ITO electrode surface with a monolayer or a few layers deposited, were observed in the UV-visible region. The change in the peak position of the absorption spectra of the adsorbed HV+. species was correlated with the concentration of HV as well as with the electrode potential.
Abstract: Virtualization-based server consolidation has been
proven to be an ideal technique to solve the server sprawl problem by
consolidating multiple virtualized servers onto a few physical servers
leading to improved resource utilization and return on investment. In
this paper, we solve this problem by using existing servers, which are
heterogeneous and diversely preferred by IT managers. Five practical
consolidation rules are introduced, and a decision model is proposed to
optimally allocate source services to physical target servers while
maximizing the average resource utilization and preference value. Our model can be regarded as a multi-objective multi-dimensional bin-packing (MOMDBP) problem with constraints, which is strongly NP-hard. An improved grouping genetic algorithm (GGA) is introduced for the problem. Extensive simulations were performed, and the results are reported.
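The paper's improved grouping genetic algorithm (GGA) is beyond the scope of an abstract, but the underlying multi-dimensional packing constraint can be sketched with a simple first-fit decreasing heuristic. The sketch below illustrates the problem formulation only, not the GGA itself; the two-resource model and all names are assumptions.

```python
def first_fit_decreasing(services, capacity):
    """Place services (given as (cpu, mem) demand tuples) onto as few
    servers as the heuristic finds, never exceeding the per-server
    capacity in any dimension. A grouping genetic algorithm searches
    over such groupings instead of building one greedy packing."""
    servers, loads = [], []
    for svc in sorted(services, key=lambda s: -max(s)):
        for i, load in enumerate(loads):
            if all(l + d <= c for l, d, c in zip(load, svc, capacity)):
                servers[i].append(svc)
                loads[i] = tuple(l + d for l, d in zip(load, svc))
                break
        else:                      # no existing server fits: open a new one
            servers.append([svc])
            loads.append(tuple(svc))
    return servers

# Demands expressed in percent of one physical server's (CPU, memory).
placement = first_fit_decreasing(
    [(50, 20), (40, 70), (30, 30), (60, 10)], capacity=(100, 100))
```

A GGA improves on such a heuristic by recombining whole server groups and by scoring each packing against multiple objectives (utilization and preference) rather than bin count alone.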
Abstract: Nowadays, fluidized beds play an important role in industry. The design of this kind of reactor requires knowledge of the interfacial area between the two phases, from which the solid holdup in the bed can be calculated. Consequently, measuring the gas-solid interfacial area in the bed experimentally is highly significant. Interfacial area measurement in gas fluidized beds has been studied before, but the light transmission technique has rarely been used. Therefore, in the current research, the possibility of using this technique and its accuracy are investigated. For the measurements, a fluidized bed was designed and potential problems were averted as far as possible. Using fine solid particles of equal shape and diameter and an installed optical system, the absorption of light during fluidization was measured. The results indicate that this method, whose validity has been proven in gas-liquid systems, is less applicable to gas-solid systems for several reasons; one important reason could be the non-uniformity of such systems.
Abstract: Traditional optical networks are gradually evolving towards intelligent optical networks due to the need for faster bandwidth provisioning, protection and restoration of the network, which can be accomplished with devices like optical switches, add-drop multiplexers and cross connects. Since dense wavelength division multiplexing forms the physical layer for intelligent optical networking, the role of high-speed all-optical switches is important. This paper analyzes such an ultra-high-speed polymer electro-optic switch. The performance of the 2x2 optical waveguide switch with rectangular, triangular and trapezoidal grating profiles is analyzed over various device parameters. The simulation results show that the trapezoidal grating is the optimal structure, with a coupling length of 81 μm and a switching voltage of 11 V at the operating wavelength of 1550 nm. The switching time of the proposed switch is 0.47 ps. This makes the proposed switch an important element in the intelligent optical network.
Abstract: The menace of counterfeit pharmaceuticals/drugs has become a major threat to consumers, healthcare providers, drug manufacturers and governments. It is a source of public health concern in both developed and developing nations. Several solutions for detecting and authenticating counterfeit drugs have been adopted by different nations of the world. In this article, a dialogue-system-based drug counterfeit detection system was developed, and the results on user satisfaction and acceptability of the system are presented. The results show that users were satisfied with the system and that it was widely accepted as a means of fighting counterfeit drugs.
Abstract: A synchronous network-on-chip using wormhole packet switching
and supporting guaranteed-completion best-effort with low-priority (LP)
and high-priority (HP) wormhole packet delivery service is presented in
this paper. Both the proposed LP and HP message services deliver a good quality of service in terms of lossless packet completion and in-order message data delivery. However, the LP message service does not guarantee a minimal completion bound. The HP packets will use 100% of the bandwidth of their reserved links if they are injected from the source node at the maximum injection rate. Hence, the HP service is suitable for small messages (less than a hundred bytes); otherwise, other HP and LP messages that also require these links will experience relatively high latency depending on the size of the HP message. The LP packets are routed using a minimal adaptive routing algorithm, while the HP packets are routed using a non-minimal adaptive routing algorithm. An additional 3-bit field identifying the packet type is therefore introduced in the packet headers to classify packets and determine the type of service committed to each. Our NoC prototypes have also been synthesized using a 180-nm CMOS standard-cell technology to evaluate the cost of implementing the combination of both services.
Abstract: Cloud Computing has recently emerged as a
compelling paradigm for managing and delivering services over the
internet. The rise of Cloud Computing is rapidly changing the landscape of information technology, and ultimately turning the long-held promise of utility computing into a reality. As the Cloud Computing paradigm progresses rapidly, concepts and terminologies are becoming imprecise and ambiguous, and different technologies are interfering with one another. Thus, it becomes crucial to clarify the key concepts and definitions. In this paper, we present the anatomy of Cloud Computing, covering its essential concepts, prominent characteristics, effects, architectural design and key technologies. We differentiate various service and deployment models. Significant challenges and risks also need to be tackled in order to guarantee the long-term success of Cloud Computing. The
aim of this paper is to provide a better understanding of the anatomy
of Cloud Computing and pave the way for further research in this
area.
Abstract: Software-as-a-Service (SaaS) is a form of cloud
computing that relieves the user of the burden of hardware and
software installation and management. SaaS can be used at the course
level to enhance curricula and student experience. When cloud
computing and SaaS are included in educational literature, the focus
is typically on implementing administrative functions. Yet, SaaS can
make more immediate and substantial contributions to the technical
course content in educational offerings. This paper explores cloud
computing and SaaS, provides examples, reports on experiences
using SaaS to offer specialized software in courses, and analyzes the
advantages and disadvantages of using SaaS at the course level. The
paper contributes to the literature in higher education by analyzing
the major technical concepts, potential, and constraints for using
SaaS to deliver specialized software at the course level. Further, it may enable more educators and students to benefit from this emerging technology.
Abstract: Microarrays have become effective, broadly used tools in biological and medical research, addressing a wide range of problems including the classification of disease subtypes and tumors. Many statistical methods are available for organizing these complex data into meaningful information, and one of the main goals in analyzing gene expression data is the detection of samples or genes with similar expression patterns. In this paper, we describe and compare the performance of several clustering methods under different data preprocessing strategies, including normalization and noise removal. We also evaluate each of these clustering methods with validation measures on both simulated data and real gene expression data. We find that clustering methods commonly used in microarray data analysis are affected by the normalization and by the degree of noise in the datasets.
Abstract: In this paper, a two-dimensional (2D) numerical model for tidal current simulation in the Persian Gulf is presented. The model is based on the depth-averaged shallow water equations, which assume a hydrostatic pressure distribution. The continuity equation and two momentum equations, including the effects of bed friction, the Coriolis force and wind stress, are solved. To integrate the 2D equations, the Alternating Direction Implicit (ADI) technique is used. The equations are discretized with a finite volume method applied on a rectangular mesh. To validate the model, a dam-break case study with an analytical solution is selected and the comparison is carried out. The capability of the model to simulate tidal currents in a real field is then demonstrated by modeling the current behavior in the Persian Gulf. The tidal currents in the study area are driven by the tidal fluctuations in the Strait of Hormuz; therefore, the water surface oscillation data at Hengam Island in the Strait of Hormuz are used as the model input. The model is checked against measured water surface elevations at Assaluyeh port. The acceptable agreement between the computed and measured results demonstrates the model's ability to simulate marine hydrodynamics.
Abstract: As part of the evaluation system for R&D programs, the Korean government has applied feasibility analysis since 2008. Various professionals have put forth great effort to keep up with the high degree of freedom of R&D programs and to contribute to the evolution of the feasibility analysis. We analyze diverse R&D programs from various viewpoints, such as technology, policy, and economics, integrate the separate analyses, and finally arrive at a definite result: whether a program is feasible or unfeasible. This paper describes the concept and method of the feasibility analysis as a decision-making tool. The analysis unit and the content of each criterion, which are key elements in a comprehensive decision-making structure, are examined.
Abstract: Data mining is the extraction of knowledge from the large sets of data generated as a result of various data processing activities. Frequent pattern mining is a very important task in data mining. Previous approaches to generating frequent itemsets generally adopt candidate generation and pruning techniques to achieve the desired objective. This paper shows how different approaches achieve the objective of frequent pattern mining, along with the complexities involved in performing the task. The paper also examines a hardware approach based on cache coherence for improving the efficiency of the above process. Data mining is helpful in building support systems for management, bioinformatics, biotechnology, medical science, statistics, mathematics, banking, networking and other computing applications. This paper proposes the use of both the upward and the downward closure property for the extraction of frequent itemsets, which reduces the total number of scans required for the generation of candidate sets.
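The downward closure property referred to above is the basis of the classic Apriori algorithm: every subset of a frequent itemset must itself be frequent, so candidate itemsets can be pruned before their supports are counted. The sketch below shows only this standard pruning, not the paper's combined upward/downward scheme or its cache-coherence hardware approach.

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Frequent-itemset mining with downward closure pruning: k-itemset
    candidates are built only from frequent (k-1)-itemsets, and any
    candidate with an infrequent (k-1)-subset is discarded unscanned."""
    transactions = [frozenset(t) for t in transactions]
    items = {i for t in transactions for i in t}

    def support(itemset):
        return sum(itemset <= t for t in transactions)

    frequent = {frozenset([i]) for i in items
                if support(frozenset([i])) >= min_support}
    result = set(frequent)
    k = 2
    while frequent:
        # join step: combine frequent (k-1)-itemsets into k-candidates
        candidates = {a | b for a in frequent for b in frequent
                      if len(a | b) == k}
        # prune step: downward closure on every (k-1)-subset
        candidates = {c for c in candidates
                      if all(frozenset(s) in frequent
                             for s in combinations(c, k - 1))}
        frequent = {c for c in candidates if support(c) >= min_support}
        result |= frequent
        k += 1
    return result

freq = apriori([{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"b", "c"}],
               min_support=2)
```

The prune step is where the scan savings come from: a candidate rejected by closure is never counted against the transaction database at all.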
Abstract: The microarray technique allows simultaneous measurement of the expression levels of thousands of mRNAs. By mining these data, one can identify the dynamics of gene expression time series. Using principal component analysis (PCA), we uncover the circadian rhythmic patterns underlying the gene expression profiles of the cyanobacterium Synechocystis. We applied PCA to reduce the dimensionality of the data set. Examination of the components also provides insight into the underlying factors measured in the experiments. Our results suggest that all rhythmic content of the data can be reduced to three main components.
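As an illustration of the dimensionality reduction step, the leading principal component of a data set can be obtained by power iteration on the sample covariance matrix. This is a generic PCA sketch on assumed toy data, not the authors' analysis of the Synechocystis profiles.

```python
def first_principal_component(data, iters=200):
    """Leading eigenvector of the sample covariance matrix, found by
    power iteration -- the direction of maximal variance (PC1)."""
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    centered = [[row[j] - means[j] for j in range(d)] for row in data]
    cov = [[sum(r[a] * r[b] for r in centered) / (n - 1)
            for b in range(d)] for a in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Two perfectly correlated toy "expression profiles": PC1 points along (1, 2),
# so the two-dimensional samples reduce to a single component.
pc1 = first_principal_component([(1.0, 2.0), (2.0, 4.0),
                                 (3.0, 6.0), (4.0, 8.0)])
```

In the microarray setting, the same idea applied to gene-by-timepoint matrices lets a few components carry the rhythmic content of thousands of profiles.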
Abstract: One of the major problems in programming a cruise circuit is deciding which destinations to include and which to leave out. Thus a decision problem emerges that might be solved using a linear and goal programming approach. The problem becomes more complex if several boats in the fleet must be scheduled within a limited timetable, trying to best match their capacity to a seasonal demand while also attempting to minimize the operation costs. Moreover, the company's scheduler should consider the passengers' time as a limited asset and would like to maximize its usage. The aim of this work is to design a model, using linear and goal programming techniques, with which the cruise company's decision maker can achieve an optimal circuit design within the fleet schedule.
Abstract: Medical applications are among the most impactful
areas of microrobotics. The ultimate goal of medical microrobots is
to reach currently inaccessible areas of the human body and carry out
a host of complex operations such as minimally invasive surgery
(MIS), highly localized drug delivery, and screening for diseases at
their very early stages. Miniature, safe and efficient propulsion
systems hold the key to maturing this technology but they pose
significant challenges. A new type of propulsion developed recently uses a multi-flagella architecture inspired by the motility mechanism of prokaryotic microorganisms. However, efficient methods for designing this type of propulsion system are lacking. To overcome this gap, a numerical strategy for designing multi-flagella propulsion systems is proposed. The strategy is based on the implementation of regularized Stokeslet and rotlet theory, resistive force theory (RFT), and a new "locally corrected velocity" approach. The effects of the shape parameters and the angular velocity of each flagellum on the overall flow field and on the net forces and moments on the robot are considered. A multi-layer perceptron artificial neural network is then designed and employed to adjust the angular velocities of the motors for propulsion control. The proposed method was applied successfully to a sample configuration, and useful demonstrative results were obtained.
Abstract: Probability-based identity disclosure risk measurement may give the same overall risk for different anonymization strategies applied to the same dataset. Some entities in the anonymized dataset may have higher identification risks than others. Individuals are more concerned about risks higher than the average and are more interested in knowing whether they might be exposed to such higher risk. A notion of overall risk in the above measurement method does not indicate whether some of the involved entities have a higher identity disclosure risk than the others. In this paper, we introduce an identity disclosure risk measurement method that not only conveys the overall risk but also indicates whether some of the members have a higher risk than the others. The proposed method quantifies the overall risk based on the individual risk values, the percentage of records whose risk value is higher than the average, and how much larger the higher risk values are compared to the average. We analyze the disclosure risks for different disclosure control techniques applied to original microdata and present the results.
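The abstract names three ingredients of the proposed measure: the individual risk values, the fraction of records above the average risk, and how much larger those higher risks are. The paper's exact aggregation formula is not stated in the abstract, so the sketch below only computes those three ingredients; all names are assumptions.

```python
def risk_profile(risks):
    """Summarize per-record identity disclosure risks.

    Returns the mean risk, the fraction of records whose risk exceeds
    the mean, and the ratio of the mean of those higher risks to the
    overall mean. These are the inputs the proposed overall measure
    combines; the combination itself is left unspecified here.
    """
    mean = sum(risks) / len(risks)
    higher = [r for r in risks if r > mean]
    frac_higher = len(higher) / len(risks)
    excess = (sum(higher) / len(higher) / mean) if higher else 1.0
    return {"mean": mean, "frac_higher": frac_higher,
            "excess_ratio": excess}

# Four low-risk records and one outlier: same mean as a uniform 0.2
# dataset, but the profile exposes the concentrated higher risk.
profile = risk_profile([0.1, 0.1, 0.1, 0.1, 0.6])
```

A mean-only measure would score this dataset identically to one where every record carries risk 0.2; the extra two quantities are what distinguish them.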
Abstract: The performance of small and medium enterprises has stagnated in the last two decades, mainly due to the emergence of HIV/AIDS. The disease has had a detrimental effect on the general economy of the country, leading to morbidity and mortality in the Kenyan workforce in their prime age. The present study sought to establish the economic impact of HIV/AIDS on micro-enterprise development in the Obunga slum of Kisumu, in terms of production loss and increasing labor-related costs, and to establish possible strategies to address the impact of HIV/AIDS on micro-enterprises. The study was necessitated by the observation that most micro-enterprises in the slum face severe economic and social crises due to the impact of HIV/AIDS; they become depleted and close down within a short time due to the death of skilled and experienced workers. The study was carried out between June 2008 and June 2009 in the Obunga slum. The data were subjected to computer-aided statistical analysis, including descriptive statistics, chi-squared and ANOVA techniques. Chi-squared analysis of the micro-enterprise owners' opinions on the impact of HIV/AIDS on micro-enterprise depletion, compared to other diseases, indicated high levels of negative effects of the disease at significance levels of P
Abstract: This paper proposes an easy-to-use instruction hiding
method to protect software from malicious reverse engineering
attacks. Given a source program (the original) to be protected, the proposed method (1) takes a modified version of it (the fake) as input, (2) analyzes the differences in assembly code instructions between the original and the fake, and (3) introduces self-modification routines so that fake instructions are turned back into the correct (i.e., original) instructions before they are executed and revert to fake ones after they are executed. The proposed method can add a certain amount of security to a program, since the fake instructions in the resultant program confuse attackers, and significant effort is required to discover and remove all the fake instructions and self-modification routines. The method is also easy to use (requiring little effort), because all a user of the proposed method has to do is prepare a fake source code by modifying the original source code.
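The instruction-differencing step (step 2) can be sketched abstractly as follows. This toy operates on lists of instruction strings rather than on real compiled binaries, and every name in it is a hypothetical illustration, not the paper's tool.

```python
def instruction_patches(original, fake):
    """Compare two instruction listings and collect the positions where
    they differ. Each patch records what a self-modification routine
    would need: the address, the fake instruction stored in the binary,
    and the original instruction to restore before execution."""
    assert len(original) == len(fake), "listings must align one-to-one"
    return [(addr, f, o)
            for addr, (o, f) in enumerate(zip(original, fake))
            if o != f]

# Hypothetical three-instruction program and its faked counterpart.
original = ["mov eax, 1", "add eax, 2", "ret"]
fake     = ["mov eax, 9", "add eax, 2", "ret"]
patches = instruction_patches(original, fake)
```

At run time, a self-modification routine driven by such a patch table would write the original instruction over the fake one just before the address is reached, and restore the fake afterwards.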