Abstract: In this paper, optimization of routing in ad-hoc networks is surveyed and a new method for reducing the complexity of routing algorithms is suggested. Maintaining a binary matrix at each node in the network, updated once a route has been established, lets nodes avoid re-running the routing protocol for every data transfer. The suggested algorithm can reduce the complexity of routing to a minimum.
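The binary-matrix idea can be sketched as follows; the Node class, the dummy discovery routine, and the matrix layout are illustrative assumptions, not the paper's exact protocol:

```python
# Hypothetical sketch: a node records, in a 0/1 matrix, the node pairs for
# which a route has already been discovered, so the routing protocol is
# skipped on repeated transfers to the same destination.
class Node:
    def __init__(self, node_id, num_nodes):
        self.node_id = node_id
        # known[s][d] == 1 once a route from s to d has been discovered
        self.known = [[0] * num_nodes for _ in range(num_nodes)]
        self.discovery_runs = 0

    def run_discovery(self, src, dst):
        self.discovery_runs += 1     # stand-in for a real route request flood
        return [src, dst]            # dummy route

    def send(self, src, dst):
        if not self.known[src][dst]:
            self.run_discovery(src, dst)
            self.known[src][dst] = 1  # update the matrix once routing is done

node = Node(0, 4)
node.send(0, 3)            # first transfer triggers discovery
node.send(0, 3)            # repeat transfer reuses the cached entry
print(node.discovery_runs)  # 1
```

The second transfer costs only a matrix lookup, which is the source of the claimed complexity reduction.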
Abstract: Simulating occlusal function during laboratory materials testing is essential for predicting long-term performance before clinical use. The aim of this study was to assess the influence of chamfer preparation depth on the failure risk of heat-pressed ceramic crowns with and without a zirconia framework by means of finite element analysis. 3D models of a maxillary central incisor, prepared for full ceramic crowns with different depths of the chamfer margin (between 0.8 and 1.2 mm) and 6-degree tapered walls, together with the overlying crowns, were generated using literature data (Fig. 1, 2). The crowns were designed with and without a zirconia framework with a thickness of 0.4 mm. For all preparations and crowns, stresses in the pressed ceramic crown, zirconia framework, pressed ceramic veneer, and dentin were evaluated separately. The highest stresses were registered in the dentin. For the studied cases, the depth of the preparations had no significant influence on the stress values in the teeth and pressed ceramics; it influenced only the stresses in the zirconia framework. The zirconia framework decreases the stress values in the veneer.
Abstract: The Internet is largely composed of textual content, and a huge volume of digital content is circulated over it daily. The ease of information sharing and reproduction has made it difficult to preserve authors' copyright. Digital watermarking emerged after 1993 as a solution to the problem of copyright protection for plain text. In this paper, we propose a zero text watermarking algorithm based on the occurrence frequency of non-vowel ASCII characters and words for copyright protection of plain text. The embedding algorithm uses the frequency of non-vowel ASCII characters and words to generate a specialized author key. The extraction algorithm uses this key to extract the watermark and hence identify the original copyright owner. Experimental results illustrate the effectiveness of the proposed algorithm on text subjected to meaning-preserving attacks performed by five independent attackers.
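A minimal sketch of how such an author key might be derived; the key format below (top non-vowel letter frequencies plus word count) is an assumption for illustration, not the authors' exact scheme:

```python
from collections import Counter

# Illustrative key generation: count non-vowel ASCII letters and words,
# then serialize the most frequent letters with the word count.
VOWELS = set("aeiouAEIOU")

def author_key(text):
    consonants = [c.lower() for c in text
                  if c.isascii() and c.isalpha() and c not in VOWELS]
    freq = Counter(consonants)
    words = len(text.split())
    top = "".join(f"{c}{n}" for c, n in freq.most_common(3))
    return f"{top}-w{words}"

# extraction would recompute the key from a suspect copy and compare it
original = "the quick brown fox jumps over the lazy dog"
print(author_key(original))  # t2h2r2-w9
```

Because the key is derived from the text rather than embedded in it, nothing in the document itself changes, which is what makes the scheme "zero" watermarking.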
Abstract: This paper presents a study of the impact of reference node locations on the accuracy of indoor positioning systems. In particular, we analyze the localization accuracy of RSSI database mapping techniques deployed on IEEE 802.15.4 wireless networks. The results show that the locations of the reference nodes used in the positioning system affect the signal propagation characteristics in the service area. This in turn affects the accuracy of the wireless indoor positioning system. We found that suitable placement of reference nodes can reduce the positioning error by up to 35%.
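RSSI database mapping ("fingerprinting") can be sketched as a nearest-neighbor search in signal space; the dBm values and positions below are invented for illustration:

```python
import math

# Offline phase: RSSI vectors from the reference nodes are recorded at
# known positions. Online phase: return the stored position whose vector
# is closest (Euclidean distance) to the measured one.
database = {
    (0.0, 0.0): [-40, -70, -80],   # position -> RSSI from 3 reference nodes
    (5.0, 0.0): [-70, -42, -75],
    (0.0, 5.0): [-72, -74, -41],
}

def locate(rssi):
    return min(database, key=lambda pos: math.dist(database[pos], rssi))

print(locate([-68, -45, -77]))  # (5.0, 0.0)
```

Moving a reference node changes every stored vector, which is why the paper finds that node placement directly affects localization error.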
Abstract: The aim of this paper is to discuss a low-cost methodology that can predict traffic flow conflicts and quantitatively rank crash expectancies (based on relative probability) for various traffic facilities. The paper focuses on the application of statistical distributions to model traffic flow and of Monte Carlo techniques to simulate traffic, and discusses how to create a tool to predict the probability of a traffic crash. A low-cost data collection methodology is discussed for the prevailing heterogeneous traffic flow, and a GIS platform is proposed to thematically represent the simulated traffic flow and the probability of a crash. Furthermore, the dynamism of the model is discussed with reference to its adaptability, adequacy, economy, and efficiency to ensure adoption.
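The combination of a statistical flow distribution with Monte Carlo sampling can be sketched as follows; exponential headways (Poisson arrivals), the flow rates, and the 2-second safe-gap threshold are all assumptions for illustration:

```python
import random

# Draw vehicle time headways from an exponential distribution and count
# how often a headway falls below a safe threshold, giving a relative
# conflict probability for the facility.
def conflict_probability(flow_veh_per_s, safe_gap_s, trials=100_000, seed=1):
    rng = random.Random(seed)
    conflicts = sum(1 for _ in range(trials)
                    if rng.expovariate(flow_veh_per_s) < safe_gap_s)
    return conflicts / trials

low = conflict_probability(0.1, 2.0)    # light traffic
high = conflict_probability(0.5, 2.0)   # heavy traffic
```

It is the ranking (high > low), rather than the absolute numbers, that such a tool would represent thematically on the GIS platform.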
Abstract: In this paper we aim to ascertain the state of the art in multifingered end-effectors, also known as robotic hands or dexterous robot hands, and to propose an experimental setup for an innovative task-based design approach involving cutting-edge motion capture technologies. After an initial description of the capabilities and complexity of the human hand when grasping objects, intended to point out the importance of replicating it, we analyze the mechanical and kinematic structure of some important hands developed around the world over the last three decades and review the actuation and sensing technologies used. Finally, we describe a new design philosophy, proposing an experimental setup for its first stage that uses recent developments in human body motion capture systems and might lead to lighter and ever more dexterous robotic hands.
Abstract: Apart from basic wages or salary, employee benefits, and intangible elements, one part of an employee's total reward is so-called contingent (variable) pay. Contingent pay is tied to the performance, contribution, competency, or skills of individual employees, to team or company-wide performance, or to a combination of these. The main aim of this article is to define contingent pay on the basis of available information, to describe the reasons for its implementation and the arguments for and against this type of remuneration, and also to report on its extent and level of utilization by organizations in the Czech Republic operating in the field of environmental protection, as well as their practical experience with this type of remuneration.
Abstract: In this research, a latent class vector model for pairwise data is formulated. Compared to the basic vector model, this model yields consistent estimates of the parameters, since the number of parameters to be estimated does not increase with the number of subjects. The results of the analysis reveal that the model is stable and can classify each subject into one of the latent classes representing the typical scales used by those subjects.
Abstract: In this paper, a novel algorithm is proposed to improve the accuracy of finger vein recognition. The performances of Principal Component Analysis (PCA), Kernel Principal Component Analysis (KPCA), and Kernel Entropy Component Analysis (KECA) within this algorithm are validated and compared with each other in order to determine which is the most appropriate for finger vein recognition.
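As a baseline for the three methods compared, the PCA step can be sketched in pure Python as extraction of the leading eigenvector by power iteration; real finger vein pipelines would use high-dimensional image features, whereas the 2-D toy data here is illustrative:

```python
import random

# Project feature vectors onto the leading eigenvector of their covariance
# matrix, found by power iteration (the core of the PCA baseline).
def first_principal_component(data, iters=200):
    n, d = len(data), len(data[0])
    mean = [sum(row[j] for row in data) / n for j in range(d)]
    centered = [[row[j] - mean[j] for j in range(d)] for row in data]
    cov = [[sum(r[i] * r[j] for r in centered) / (n - 1) for j in range(d)]
           for i in range(d)]
    v = [1.0] * d
    for _ in range(iters):                  # power iteration
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# points spread mostly along the x-axis, so the component should be ~(+-1, 0)
pts = [(x, 0.1 * random.Random(x).uniform(-1, 1)) for x in range(10)]
pc = first_principal_component(pts)
```

KPCA and KECA replace the covariance matrix with a kernel (and, for KECA, entropy-ranked) eigendecomposition, which is where the three methods diverge.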
Abstract: The accuracy of stability and control derivatives of a light aircraft estimated from flight test data was evaluated. The light aircraft, named ChangGong-91, is the first aircraft certified by the Korean government. The output error method, a maximum likelihood estimation technique that considers measurement noise only, was used to analyze the measured aircraft responses. Multi-step control inputs were applied in order to excite the short-period mode for the longitudinal motion and the Dutch-roll mode for the lateral-directional motion. The estimated stability/control derivatives of ChangGong-91 were analyzed for the assessment of handling qualities by comparing them with those of similar aircraft. The accuracy of the flight derivative estimates derived from the flight test measurements was examined in terms of engineering judgment, scatter, and the Cramer-Rao bound, and turned out to be satisfactory with minor defects.
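The output-error principle can be illustrated on a drastically simplified model: with measurement noise only, the maximum likelihood estimate of a single "derivative" a in y = a*u is ordinary least squares, and its Cramer-Rao bound follows from the Fisher information. The flight mechanics model in the paper is far richer; the data below is synthetic:

```python
import random

rng = random.Random(0)
a_true, sigma = 2.5, 0.1
u = [i / 10 for i in range(1, 51)]                    # control input samples
y = [a_true * ui + rng.gauss(0, sigma) for ui in u]   # noisy measured response

suu = sum(ui * ui for ui in u)
a_hat = sum(ui * yi for ui, yi in zip(u, y)) / suu    # ML / least-squares estimate
cr_bound = sigma ** 2 / suu                           # Cramer-Rao variance bound
```

Comparing the scatter of repeated estimates against cr_bound is, in miniature, the accuracy check the abstract describes.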
Abstract: The question of interethnic and interreligious conflicts
in ex-Yugoslavia receives much attention within the framework of
the international context created after 1991 because of the impact of
these conflicts on the security and stability of the Balkans and of Europe.
This paper focuses on the rationales leading to the declaration of
independence by Kosovo according to ethnic and religious criteria
and analyzes why these same rationales were not applied in Bosnia
and Herzegovina. The approach undertaken aims at comparatively
examining the cases of Kosovo, and Bosnia and Herzegovina. At the
same time, it aims at understanding the political decision making of
the international community in the case of Kosovo. Specifically, was
this a good political decision for the security and stability of the Balkans, of Europe, or even for global security and stability?
This research starts with an overview on the European security
framework post 1991, paying particular attention to Kosovo and
Bosnia and Herzegovina. It then presents the theoretical and
methodological framework and compares the representative cases.
Using a constructivist framework and the comparative methodology, it
arrives at the results of the study. An important issue of the paper is
the thesis that this event modifies the principles of international law
and creates dangerous precedents for regional stability in the
Balkans.
Abstract: The expression and secretion of inflammation markers are disturbed in obesity. Interleukin-6 reduces body fat mass. The common G-174C polymorphism in the promoter of the IL-6 gene has been reported to affect transcriptional regulation. The objective was to investigate the association of the common G-174C polymorphism with obesity in an Iranian population. The present study is a cross-sectional association study that included 242 individuals (110 men and 132 women). Serum IL-6 levels, C-reactive protein, fasting blood glucose, and blood lipid profile were measured; BMI and WHR were calculated. Genotyping was carried out by PCR and RFLP. The frequencies of the G and C alleles were 64.5% and 35.5%, respectively. The G-174C polymorphism was not associated with BMI and WHR. However, in obese individuals, fasting blood glucose was significantly higher in carriers of the C allele than in non-carriers. The IL-6 G-174C polymorphism is not a risk factor for obesity in the Iranian population.
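The reported allele frequencies follow from simple gene counting: each homozygote carries two copies of an allele and each heterozygote one. The genotype counts below are hypothetical (one set consistent with 242 subjects and the reported frequencies), not the study's actual data:

```python
# Gene-counting arithmetic behind allele frequencies for a biallelic marker.
def allele_frequencies(gg, gc, cc):
    total = 2 * (gg + gc + cc)       # two alleles per individual
    g = (2 * gg + gc) / total        # G copies: 2 per GG, 1 per GC
    return g, 1 - g

g_freq, c_freq = allele_frequencies(gg=101, gc=110, cc=31)
print(round(g_freq, 3), round(c_freq, 3))  # 0.645 0.355
```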
Abstract: Through the 1980s, management accounting researchers described the increasing irrelevance of traditional control and performance measurement systems. The Balanced Scorecard (BSC) is a critical business tool for many organizations. It is a performance measurement system that translates mission and strategy into objectives. The strategy map approach is a development of the BSC in which certain necessary causal relations must be established. To recognize these relations, experts usually rely on experience. It is also possible to use regression for the same purpose. Structural Equation Modeling (SEM), one of the most powerful methods of multivariate data analysis, obtains more appropriate results than traditional methods such as regression. In the present paper, we propose SEM for the first time to identify the relations between objectives in the strategy map, together with a test to measure the importance of those relations. In SEM, factor analysis and hypothesis testing are carried out in the same analysis. SEM is known to be better than other techniques at supporting analysis and reporting. Our approach provides a framework that permits experts to design the strategy map by applying a comprehensive and scientific method together with their experience. This scheme is therefore a more reliable method than previously established ones.
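The regression baseline that SEM improves upon can be sketched as estimating the strength of one causal arrow between two objectives via a standardized regression coefficient; the synthetic scores below stand in for measured objective indicators:

```python
import random

# Standardized slope between two objective scores: for a single predictor
# this equals the Pearson correlation, a crude stand-in for a path weight.
def standardized_slope(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

rng = random.Random(42)
learning = [rng.uniform(0, 10) for _ in range(200)]        # "learning" objective
internal = [0.8 * v + rng.gauss(0, 1) for v in learning]   # driven by learning
beta = standardized_slope(learning, internal)
```

SEM goes further by estimating all paths and latent factors jointly, with hypothesis tests on each path, which is the advantage the abstract claims over this pairwise baseline.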
Abstract: As a data structure for string processing problems, the suffix array is widely known and extensively studied. But if the string access pattern follows the "90/10" rule, the suffix array cannot take advantage of the fact that we often search for something we have just found. Although the splay tree is an efficient data structure for small documents when the access pattern follows the "90/10" rule, it requires many structures and an excessive amount of pointer manipulation to process and search large documents efficiently. In this paper, we propose a new and conceptually powerful data structure for string search, called the splay suffix array (SSA). This data structure combines the features of splay trees and suffix arrays into a new approach that is suitable for implementation on both conventional and clustered computers.
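A toy sketch of the two ingredients SSA combines: a plain suffix array searched by binary search, plus a query cache standing in for the splay tree's move-to-front behavior under a "90/10" access pattern. The real SSA structure is more involved; names here are illustrative:

```python
def build_suffix_array(s):
    # O(n^2 log n) toy construction; production code would use SA-IS or similar
    return sorted(range(len(s)), key=lambda i: s[i:])

def search(s, sa, pattern, cache):
    if pattern in cache:                  # just-found patterns answer at once
        return cache[pattern]
    lo, hi = 0, len(sa)
    while lo < hi:                        # find leftmost suffix >= pattern
        mid = (lo + hi) // 2
        if s[sa[mid]:] < pattern:
            lo = mid + 1
        else:
            hi = mid
    hit = lo < len(sa) and s[sa[lo]:].startswith(pattern)
    cache[pattern] = sa[lo] if hit else -1
    return cache[pattern]

text = "banana"
sa = build_suffix_array(text)            # [5, 3, 1, 0, 4, 2]
cache = {}
print(search(text, sa, "ana", cache))    # 3 (match at index 3; now cached)
```

Under the 90/10 rule most queries hit the cache and skip the binary search entirely, which is the effect the splay component of SSA is designed to capture.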
Abstract: A fast and efficient model of application development called user interface oriented application development (UIOAD) is proposed. This approach introduces a convenient way for users to develop platform-independent client-server applications.
Abstract: The increasing popularity of wireless technologies and mobile computing devices has enabled new application and research areas. One of these new areas is pervasive systems in urban environments, because urban environments are characterized by a high concentration of these technologies and devices. In this paper we show the process of designing a pervasive system for urban environments, using a local zoo in Cali, Colombia as a use case. Based on an ethnographic study, we present the design of a pervasive system for urban computing, based on a service-oriented architecture, for the controlled environment of the Cali Zoo. In this paper, the reader will find a methodological approach for the design of similar systems, using data collection methods, conceptual frameworks for urban environments, and considerations for the analysis and design of service-oriented systems.
Abstract: The effect of blade tip geometry on the high-speed leakage flow in a high pressure gas turbine is studied experimentally and computationally. For this purpose two simplified models were constructed: one models a flat blade tip and the other a cavity blade tip. Experimental results were obtained in a transonic wind tunnel to show the static pressure distribution along the tip wall and to provide flow visualization. RANS computations were carried out to provide further insight into the mean flow behavior and to calculate the discharge coefficient, a measure of the flow leaking over the tip. It is shown that in both tip geometries the flow separates over the tip to form a separation bubble. The bubble is taller for the cavity tip, while a complete shock wave system of oblique waves ending with a normal wave can be seen for the flat tip. The discharge coefficient for the flat tip shows less dependence on the pressure ratio across the blade tip than that of the cavity tip. However, the discharge coefficient for the cavity tip is lower than that of the flat tip, showing a better ability to reduce the leakage flow and thus increase turbine efficiency.
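The discharge coefficient compares the measured leakage mass flow with the ideal (isentropic) mass flow through the gap at the same pressure ratio; the relation below is the standard compressible nozzle formula, and the numerical values are illustrative, not the paper's data:

```python
import math

# Ideal (isentropic) mass flux through a gap, standard nozzle relation.
def ideal_mass_flux(p0, t0, p_exit, gamma=1.4, r_gas=287.0):
    # clamp the pressure ratio at the choking value
    pr = max(p_exit / p0, (2 / (gamma + 1)) ** (gamma / (gamma - 1)))
    term = (2 * gamma / (r_gas * t0 * (gamma - 1))) * (
        pr ** (2 / gamma) - pr ** ((gamma + 1) / gamma))
    return p0 * math.sqrt(term)          # kg/(s*m^2)

def discharge_coefficient(measured_flux, p0, t0, p_exit):
    return measured_flux / ideal_mass_flux(p0, t0, p_exit)

# illustrative: 180 kg/(s*m^2) measured at a 101.3 kPa -> 70 kPa expansion
cd = discharge_coefficient(180.0, 101325.0, 300.0, 70000.0)
```

A lower cd, as found for the cavity tip, means the real gap passes less flow than the ideal nozzle would.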
Abstract: The objective of this paper is to design a pattern classification model based on the back-propagation (BP) algorithm for a decision support system. The standard BP model fully connects every node in each layer, from the input layer to the output layer. It therefore requires a great deal of computing time and many training iterations to reach good performance and an acceptable error rate when generating patterns or training the network.
However, the proposed model uses exclusive connections between the hidden layer nodes and the output nodes. Its advantages are fewer iterations and better performance compared with the standard back-propagation model. We simulated several classification datasets under different network settings (e.g., number of hidden layers and nodes, number of classes, and number of iterations). In our simulations, we found that most cases were handled better by the exclusive-connection BP network model than by standard BP. We expect that this algorithm can be applied to user face identification, data analysis, and mapping between environmental data and information.
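A hedged sketch of the exclusive-connection idea: standard back-propagation with a binary mask on the hidden-to-output weights. An all-ones mask reproduces the fully connected standard model, while the exclusive mask below gives each output node its own disjoint slice of hidden nodes, so fewer weights are stored and updated per iteration. Layer sizes, data, and rates are toy values, not the paper's configuration:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train(samples, n_hidden, n_out, mask, lr=0.5, epochs=500, seed=0):
    rng = random.Random(seed)
    n_in = len(samples[0][0])
    w1 = [[rng.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_hidden)]
    # masked hidden-to-output weights: zero where the mask forbids a link
    w2 = [[rng.uniform(-1, 1) * mask[k][j] for j in range(n_hidden)]
          for k in range(n_out)]
    for _ in range(epochs):
        for x, t in samples:
            h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in w1]
            o = [sigmoid(sum(w * hj for w, hj in zip(w2[k], h)))
                 for k in range(n_out)]
            d_o = [(t[k] - o[k]) * o[k] * (1 - o[k]) for k in range(n_out)]
            d_h = [h[j] * (1 - h[j]) *
                   sum(d_o[k] * w2[k][j] for k in range(n_out))
                   for j in range(n_hidden)]
            for k in range(n_out):
                for j in range(n_hidden):
                    if mask[k][j]:           # only owned links are updated
                        w2[k][j] += lr * d_o[k] * h[j]
            for j in range(n_hidden):
                for i in range(n_in):
                    w1[j][i] += lr * d_h[j] * x[i]
    return w1, w2

# toy two-class task; each output owns two of the four hidden nodes
data = [([0, 0], [1, 0]), ([0, 1], [1, 0]),
        ([1, 0], [0, 1]), ([1, 1], [0, 1])]
exclusive_mask = [[1, 1, 0, 0], [0, 0, 1, 1]]
w1, w2 = train(data, n_hidden=4, n_out=2, mask=exclusive_mask)
```

With 4 hidden and 2 output nodes, the mask cuts the hidden-to-output weight updates from 8 to 4 per sample, which is the source of the claimed iteration savings.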
Abstract: A number of competing methodologies have been developed
to identify genes and classify DNA sequences into coding
and non-coding sequences. This classification process is fundamental
in gene finding and gene annotation tools and is one of the most
challenging tasks in bioinformatics and computational biology. An
information theory measure based on mutual information has shown
good accuracy in classifying DNA sequences into coding and non-coding. In this paper we describe a species-independent iterative
approach that distinguishes coding from non-coding sequences using
the mutual information measure (MIM). A set of sixty prokaryotes is
used to extract universal training data. To facilitate comparisons with
the published results of other researchers, a test set of 51 bacterial
and archaeal genomes was used to evaluate MIM. These results
demonstrate that MIM produces superior results while remaining
species independent.
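One form such a mutual information measure can take is I between a base and its codon position: coding sequences show 3-periodic base usage, so their score exceeds that of sequences without codon structure. The sequences and any decision threshold below are illustrative assumptions, not the paper's trained MIM:

```python
import math
from collections import Counter

def codon_position_mi(seq):
    # joint counts of (base, position mod 3), plus the two marginals
    pairs = Counter((b, i % 3) for i, b in enumerate(seq))
    n = sum(pairs.values())
    base = Counter(b for b, _ in pairs.elements())
    pos = Counter(p for _, p in pairs.elements())
    mi = 0.0
    for (b, p), c in pairs.items():
        pxy = c / n
        mi += pxy * math.log2(pxy / ((base[b] / n) * (pos[p] / n)))
    return mi

coding_like = "ATGGCAGCAGCTGCGGCAGCT" * 10   # strong period-3 pattern
random_like = "ACGT" * 51                    # bases uniform over positions
```

Classification then reduces to thresholding this score, with the threshold learned from the prokaryotic training set.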
Abstract: This paper discusses the effects of fracture parameters such as depth, length, width, angle, and number of fractures on the conductance properties of laterite, using the DUK-2B digital electrical measurement system combined with a method of simulating the fractures. The experimental results show that changes in the fracture parameters affect the conductance properties of laterite. The conductivity of laterite clearly declines as the depth, length, width, angle, or quantity of fractures gradually increases. When the depth of a fracture exceeds half the thickness of the soil body, the conductivity of laterite shows a markedly non-linear diminishing pattern and the amplitude of the decrease tends to grow. The length of a fracture has less effect on the conductivity than its depth. When the width of a fracture reaches certain fixed values, the conductivity becomes less sensitive to further changes in width, and the conductivity of laterite remains at a stable level. When the angle of a fracture is less than 45°, the decrease in conductivity becomes more pronounced as the angle increases; but when the angle exceeds 45°, the change in conductivity is relatively gentle with increasing angle. An increasing number of fractures causes the other fracture parameters to have a greater impact on the change in conductivity. With moisture content and temperature unchanged, the depth and angle of fractures are the major factors affecting the conductivity of laterite soil, while quantity, length, and width are minor influencing factors. The sensitivity of the fracture parameters affecting the conductivity of laterite soil is: depth > angle > quantity > length > width.