Abstract: We have previously introduced an ultrasonic imaging
approach that combines harmonic-sensitive pulse sequences with a
post-beamforming quadratic kernel derived from a second-order
Volterra filter (SOVF). This approach is designed to produce images
with high sensitivity to nonlinear oscillations from microbubble
ultrasound contrast agents (UCA) while maintaining high levels of
noise rejection. In this paper, we present a two-step algorithm for computing the
coefficients of the quadratic kernel that reduces the tissue component
introduced by motion, maximizes noise rejection, and increases
specificity while optimizing sensitivity to the UCA.
In the first step, quadratic kernels derived from individual
singular modes of the PI data matrix are compared in terms of their
ability to maximize the contrast-to-tissue ratio (CTR). In the second
step, quadratic kernels resulting in the highest CTR values are
convolved. The imaging results indicate that a signal processing
approach to this clinical challenge is feasible.
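The two-step selection described above can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the CTR score of each candidate kernel is assumed to be supplied externally, and the function names are hypothetical.

```python
import numpy as np

def singular_mode_kernels(pi_data):
    """Build one rank-1 quadratic-kernel candidate per singular mode
    of the pulse-inversion (PI) data matrix (step 1, sketched)."""
    u, s, vt = np.linalg.svd(np.asarray(pi_data, dtype=float),
                             full_matrices=False)
    return [s[i] * np.outer(u[:, i], vt[i, :]) for i in range(len(s))]

def conv2_full(a, b):
    """Full 2-D convolution, used to combine the selected kernels."""
    out = np.zeros((a.shape[0] + b.shape[0] - 1,
                    a.shape[1] + b.shape[1] - 1))
    for i in range(a.shape[0]):
        for j in range(a.shape[1]):
            out[i:i + b.shape[0], j:j + b.shape[1]] += a[i, j] * b
    return out

def combine_best(kernels, ctr_scores, top_k=2):
    """Convolve the kernels with the highest CTR scores (step 2)."""
    order = np.argsort(ctr_scores)[::-1][:top_k]
    combined = kernels[order[0]]
    for idx in order[1:]:
        combined = conv2_full(combined, kernels[idx])
    return combined
```

Because the singular modes sum back to the original matrix, discarding low-CTR modes is what removes the motion-introduced tissue component in this sketch.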
Abstract: This paper describes the Multilingual Virtual Simulated Patient framework, created to train the social skills and test the knowledge of primary health care medical students. The framework generates conversational agents that perform in several languages as virtual simulated patients, helping to improve the communication and diagnosis skills of the students and complementing their training process.
Abstract: A simple mobile engine-driven pneumatic paddy
collector made of locally available materials using local
manufacturing technology was designed, fabricated, and tested for
collecting and bagging of paddy dried on concrete pavement. The
pneumatic paddy collector had the following major components:
radial flat bladed type centrifugal fan, power transmission system,
bagging area, frame, and the conveyance system. Results showed
significant differences in collecting capacity, noise level, and fuel
consumption when the rotational speed of the air mover shaft was varied.
Other parameters such as collecting efficiency, air velocity,
augmented cracked grain percentage, and germination rate were not
significantly affected by varying rotational speed of the air mover
shaft. The pneumatic paddy collector had a collecting efficiency of
99.33 % with a collecting capacity of 2685.00 kg/h at maximum
rotational speed of centrifugal fan shaft of about 4200 rpm. The
machine entailed an investment cost of P 62,829.25. The break-even
weight of paddy was 510,606.75 kg/yr at a collecting cost of 0.11
P/kg of paddy. Utilizing the machine for 400 hours per year
generated an income of P 23,887.73. The projected time needed to
recover the cost of the machine, based on the 2685 kg/h collecting
capacity, was 2.63 years.
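The cost-recovery figure quoted above follows directly from the investment cost and the annual income; a quick check:

```python
# Quick check of the cost-recovery period quoted above (amounts in pesos, P).
investment = 62_829.25     # investment cost of the machine
annual_income = 23_887.73  # income from 400 hours of use per year
payback_years = investment / annual_income
print(round(payback_years, 2))  # → 2.63, matching the reported figure
```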
Abstract: IETF defines mobility support in IPv6, i.e. MIPv6, to
allow nodes to remain reachable while moving around in the IPv6
internet. When a node moves and visits a foreign network, it is still
reachable through the indirect packet forwarding from its home
network. This triangular routing feature provides node mobility but
increases the communication latency between nodes. This deficiency
can be overcome by using a Binding Update (BU) scheme, which lets
nodes keep up-to-date IP addresses and communicate with each other
through direct IP routing. To further protect the security of BU, a
Return Routability (RR) procedure was developed. However, the RR
procedure has been found to be vulnerable to many attacks. In this
paper, we propose a lightweight RR procedure based on
geometric computing. In consideration of the inherent limitation of
computing resources in mobile nodes, the proposed scheme is
designed to minimize the cost of computations and to eliminate the
overhead of state maintenance during binding updates. Compared with
other CGA-based BU schemes, ours is more efficient and
does not require nonce tables in the nodes.
Abstract: Finding effective ways of improving university quality assurance also requires retraining of staff. This article illustrates an Online Programme of Excellence Model (OPEM), based on the European quality assurance model, for improving participants' formative programme standards. The results of applying this OPEM indicate the necessity of quality policies that support the evaluators' competencies to improve formative programmes. The study concludes by outlining how faculty and agency staff can use OPEM for the internal and external quality assurance of formative programmes.
Abstract: The main objective of this paper is to analyse the influence of the preparation and control of orders on performance. The activities explored in this research are procurement, production, and distribution. The changes in performance were obtained through improvement of the supply chain. Considering all of the company's activities, it is shown that it is possible to increase efficiency and provide services adequately, placing products in the market efficiently. To that end, the importance of the supply chain was explored, with emphasis on the practical environment and the quantification of the results obtained.
Abstract: There are several approaches for handling multiclass classification. Aside from one-against-one (OAO) and one-against-all (OAA), hierarchical classification is also commonly used. A binary classification tree is a hierarchical classification structure that breaks down a k-class problem into binary sub-problems, each solved by a binary classifier. In each node, a set of classes is divided into two subsets. A good class partition should group similar classes together. Many algorithms measure similarity in terms of the distance between class centroids: classes are grouped together by a clustering algorithm when the distances between their centroids are small. In this paper, we present a binary classification tree with tuned observation-based clustering (BCT-TOB) that finds a class partition by performing clustering on observations instead of class centroids. A merging step is introduced to merge any insignificant class split. Experiments show that the performance of BCT-TOB is comparable to that of other algorithms.
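The observation-based class partition at a single tree node can be sketched as below. The minimal 2-means routine and its deterministic initialization are simple stand-ins for the paper's tuned clustering, and the function names are hypothetical.

```python
import numpy as np

def two_means(X, iters=20):
    """Minimal 2-means clustering (a stand-in for the tuned
    observation-based clustering; deterministic initialization)."""
    X = np.asarray(X, dtype=float)
    # initialize with the extreme points along the first coordinate
    centers = np.stack([X[X[:, 0].argmin()], X[X[:, 0].argmax()]])
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for k in range(2):
            if (labels == k).any():
                centers[k] = X[labels == k].mean(axis=0)
    return labels

def split_classes(X, y):
    """Partition the set of classes into two subsets by clustering
    observations (not class centroids): each class is assigned to
    the cluster that holds most of its samples."""
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    labels = two_means(X)
    left, right = set(), set()
    for c in np.unique(y):
        side = np.bincount(labels[y == c], minlength=2).argmax()
        (left if side == 0 else right).add(c)
    return left, right
```

Recursing on each subset, with one binary classifier trained per node, yields the binary classification tree; a merging step would then undo any split that separates too few observations.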
Abstract: Information sharing and exchange, rather than
information processing, is what characterizes information
technology in the 21st century. Ontologies, as a shared common
understanding, are gaining increasing attention, as they appear to be
the most promising solution for enabling information sharing both at
a semantic level and in a machine-processable way. Domain-ontology-based
modeling has been exploited to provide shareability and information
exchange among the diversified, heterogeneous applications of enterprises.
Contextual ontologies are “an explicit specification of
contextual conceptualization”. That is, an ontology is
characterized by concepts that have multiple representations
and may exist in several contexts. Hence, contextual
ontologies are a set of concepts and relationships that are
seen from different perspectives. Contextualization allows
ontologies to be partitioned according to their contexts.
The need for contextual ontologies in enterprise modeling
has become crucial due to the nature of today's competitive
market. Information resources in enterprises are distributed and
diversified, and need to be shared and communicated
locally through the intranet and globally through the internet.
This paper discusses the roles that ontologies play in
enterprise modeling, and how ontologies assist in building a
conceptual model in order to provide communicative and
interoperable information systems. The issue of enterprise
modeling based on contextual domain ontology is also
investigated, and a framework is proposed for an enterprise
model that consists of various applications.
Abstract: This paper presents an experiment to estimate the
influence of cutting conditions on microstructure changes when
machining austenitic 304 stainless steel, especially for wear inserts.
The wear inserts were prefabricated with a width of 0.5 mm. The forces,
temperature distribution, residual stress (RS), and microstructure
changes were measured by a force dynamometer, an infrared thermal
camera, X-ray diffraction (XRD), and SEM, respectively. The results
showed that different combinations of machining conditions have a
significant influence on the microstructure changes of the machined
surface. In addition, ANOVA and AOM were used to distinguish the
respective influences of cutting speed, feed rate, and insert wear.
Abstract: The extraction of meaningful information from images
could be an alternative method for time series analysis. In this paper,
we propose a graphical analysis of time series grouped into a table
with an adjusted colour scale for the numerical values. The advantages
of this method are also discussed. The proposed method is easy to
understand and is flexible enough to implement the standard methods of
pattern recognition and verification, especially for noisy
environmental data.
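The colour-scaled table can be produced by normalizing each value and quantizing it into a small number of colour levels; a minimal sketch (the paper's exact colour mapping is not specified, so the level count here is an assumption):

```python
import numpy as np

def colour_levels(series, n_levels=5):
    """Map each value of a time series to a discrete colour level,
    giving one row of the colour-scaled table described above."""
    s = np.asarray(series, dtype=float)
    lo, hi = s.min(), s.max()
    # scale to [0, 1]; a constant series maps to the lowest level
    norm = (s - lo) / (hi - lo) if hi > lo else np.zeros_like(s)
    # quantize into n_levels bins, clipping the maximum into the top bin
    return np.minimum((norm * n_levels).astype(int), n_levels - 1)
```

Stacking one such row per series yields a table whose colour patterns can be inspected visually or fed to standard pattern-recognition methods in place of the raw curves.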
Abstract: Heat-inducible gene expression vectors are useful for hyperthermia-induced cancer gene therapy, because the combination
of hyperthermia and gene therapy can considerably improve the therapeutic effects. In the present study, we developed an enhanced
heat-inducible transgene expression system in which a heat-shock
protein (HSP) promoter and tetracycline-responsive transactivator
were combined. When the transactivator plasmid containing the
tetracycline-responsive transactivator gene was co-transfected with
the reporter gene expression plasmid, a high level of heat-induced gene expression was observed compared with that using the HSP
promoter without the transactivator. In vitro evaluation of the
therapeutic effect using HeLa cells showed that heat-induced therapeutic gene expression caused cell death in a high percentage of
these cells, indicating that this strategy is promising for cancer gene therapy.
Abstract: The proposed system identifies the species of a wood
using the textural features present in its bark. Each species of wood
has its own unique patterns in its bark, which enables the proposed
system to identify it accurately. Automatic wood recognition systems
have not yet been well established, mainly due to a lack of research in
this area and the difficulty of obtaining wood databases. In our work, a
wood recognition system has been designed based on pre-processing
techniques, feature extraction, and correlation of the features of
wood species for their classification. Texture classification is a
problem that has been studied and tested using different methods
because of its value in various pattern recognition problems, such as
wood recognition and rock classification. The most popular technique
used for textural classification is the Gray-Level Co-occurrence Matrix
(GLCM). The features extracted from the enhanced images using the
GLCM are then correlated, which determines the classification of the
various wood species. The results obtained show a high recognition
accuracy, demonstrating that the techniques used are suitable for
commercial implementation.
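The GLCM step can be illustrated with a minimal implementation for one displacement vector, together with the classic contrast feature. This is a sketch only, since the paper does not list its displacements or the features it extracts:

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=8):
    """Grey-level co-occurrence matrix for displacement (dx, dy):
    m[i, j] counts pixel pairs with grey levels i and j."""
    img = np.asarray(img)
    h, w = img.shape
    m = np.zeros((levels, levels))
    for i in range(max(0, -dy), min(h, h - dy)):
        for j in range(max(0, -dx), min(w, w - dx)):
            m[img[i, j], img[i + dy, j + dx]] += 1
    return m / m.sum()  # normalise to co-occurrence probabilities

def contrast(p):
    """GLCM contrast feature: sum over (i, j) of p[i, j] * (i - j)**2."""
    i, j = np.indices(p.shape)
    return float((p * (i - j) ** 2).sum())
```

In practice several displacements and features (contrast, energy, homogeneity, correlation) are concatenated into one feature vector per bark image before the correlation-based classification.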
Abstract: This study was carried out in Ankara, the capital city of Turkey, in order to determine how people living in the slums of Ankara benefit from educational equality. Within the scope of the research, interviews were conducted with 64 families whose children attend primary schools in these districts, and the data of the study were collected by the researcher. The results of the research demonstrate that children educated in the slums of Ankara cannot experience educational equality and justice. The results also show that the opportunities of the schools in the slums of Ankara are very limited, so the individuals in these districts cannot benefit equally from education. The families are aware of the problem they face. Keywords: Discrimination, inequality, primary education, slums of Turkey.
Abstract: High-quality requirements analysis is one of the most
crucial activities for ensuring the success of a software project, so
requirements verification for software systems is becoming increasingly
important in Requirements Engineering (RE) and is one of the most
helpful strategies for improving software quality.
Related work shows that requirements elicitation and analysis can be
facilitated by ontological approaches and semantic web technologies.
In this paper, we propose a hybrid method that aims to verify
requirements with structural and formal semantics in order to detect
interactions. The proposed method is twofold: one part models
requirements with the semantic web language OWL to construct a
semantic context; the other is a set of interaction detection rules
derived from scenario-based analysis and represented in the
Semantic Web Rule Language (SWRL). SWRL-based rules work
with rule engines such as Jess to reason over the semantic context of
the requirements and thus detect interactions. The benefits of the
proposed method lie in three aspects: it (i) provides systematic steps
for modeling requirements with an ontological approach, (ii) offers
a synergy of requirements elicitation and domain engineering for
knowledge sharing, and (iii) supplies rules that can systematically
assist in requirements interaction detection.
Abstract: The effect of wheat flour extraction rates on flour
composition, farinographic characteristics and the quality of
sourdough naans was investigated. The results indicated that by
increasing the extraction rate, the amount of protein, fiber, fat and
ash increased, whereas moisture content decreased. Farinographic
characteristics such as water absorption and dough development time
increased with an increase in flour extraction rate, but dough
stability and tolerance indices were reduced as the flour extraction
rate increased. Titratable acidity for both the sourdough and the
sourdough naans also increased with flour extraction rate. The
study showed that the overall quality of the sourdough naans was
affected by both the flour extraction rate and the starter culture used. Sensory
analysis of sourdough naans revealed that desirable extraction rate
for sourdough naan was 76%.
Abstract: In this paper, a delayed physiological control system is investigated. The sufficient conditions for stability of positive equilibrium and existence of local Hopf bifurcation are derived. Furthermore, global existence of periodic solutions is established by using the global Hopf bifurcation theory. Finally, numerical examples are given to support the theoretical analysis.
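As an illustration, a delayed physiological control system of the kind analysed above can be simulated with a fixed-step Euler scheme and a history buffer. The Mackey-Glass equation and the parameter values below are a representative example, not necessarily the model studied in the paper:

```python
import numpy as np

def simulate_delayed_system(beta=0.2, gamma=0.1, n=10, tau=17.0,
                            x0=1.2, dt=0.1, steps=5000):
    """Euler integration of the Mackey-Glass delay equation
    x'(t) = beta * x(t - tau) / (1 + x(t - tau)**n) - gamma * x(t),
    using a constant history x(t) = x0 on [-tau, 0]."""
    lag = int(round(tau / dt))       # delay expressed in time steps
    x = np.full(steps + lag, x0)     # buffer holds history plus solution
    for t in range(lag, steps + lag - 1):
        xd = x[t - lag]              # delayed state x(t - tau)
        x[t + 1] = x[t] + dt * (beta * xd / (1.0 + xd ** n) - gamma * x[t])
    return x[lag:]
```

For these classic parameter values the solution settles onto sustained oscillations, the kind of periodic behaviour whose onset a Hopf-bifurcation analysis characterizes as the delay crosses a critical value.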
Abstract: The object of this research is the design and
evaluation of an immersive Virtual Learning Environment (VLE) for
deaf children. Recently, we developed a prototype immersive
VR game to teach sign language mathematics to deaf students in
grades K-4 [1], [2]. In this paper we describe a significant extension of the
prototype application. The extension includes: (1) user-centered
design and implementation of two additional interactive
environments (a clock store and a bakery), and (2) user-centered
evaluation including development of user tasks, expert panel-based
evaluation, and formative evaluation. This paper is one of the few to
focus on the importance of user-centered, iterative design in VR
application development, and to describe a structured evaluation
method.
Abstract: In this paper, a new formulation for acoustics coupled with linear elasticity is presented. The primary objective of the work is to develop a three-dimensional hp-adaptive finite element method code intended for modeling the acoustics of the human head. The code will have numerous applications, e.g. in designing hearing protection devices for individuals working in high-noise environments. The presented work is at a preliminary stage. The variational formulation has been implemented and tested on a sequence of meshes with concentric multi-layer spheres, with material data representing the tissue (the brain), the skull, and the air. Thus, an efficient solver for coupled elasticity/acoustics problems has been developed and tested on high-contrast material data representing the human head.
Abstract: Due to the complex network architecture, the multihop
nature of a mobile ad hoc network poses additional problems for its
users. As the traffic load at each node increases, the additional
contention due to the traffic pattern can cause nodes close to the
destination to starve nodes farther from it, and the capacity of the
network becomes unable to satisfy the total user demand, which
results in an unfairness problem. In this
paper, we propose an algorithm to compute the optimal
MAC-layer bandwidth assigned to each flow in the network. The
bottleneck link's contention area determines the fair time share that
is necessary to calculate the maximum allowed transmission rate used
by each flow. To completely utilize the network resources, we
compute two optimal rates, namely the maximum fair share and the
minimum fair share. The maximum fair share is used to
limit the input rate of flows that cross the bottleneck link's
contention area but are not allocated the optimal transmission rate,
after which the next-highest fair share is calculated. Through
simulation results, we show that the proposed
protocol achieves improved fair share and throughput with reduced
delay.
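The fair-share rates described above are in the spirit of classic max-min fair (water-filling) allocation; below is a simplified single-resource sketch, whereas the actual protocol works per contention area and per flow, which this example does not model:

```python
def max_min_fair(capacity, demands):
    """Water-filling max-min fair allocation of one shared capacity:
    repeatedly offer every unsatisfied flow an equal share, and return
    the surplus of flows that need less than their share to the pool."""
    alloc, remaining, cap = {}, dict(demands), float(capacity)
    while remaining:
        share = cap / len(remaining)
        sated = [f for f, d in remaining.items() if d <= share]
        if not sated:
            # no flow can be fully satisfied: split what is left equally
            for f in remaining:
                alloc[f] = share
            break
        for f in sated:
            alloc[f] = remaining.pop(f)  # satisfy the small demands fully
            cap -= alloc[f]
    return alloc
```

For example, with capacity 10 and demands {a: 2, b: 8, c: 10}, flow a receives its full 2 while b and c split the remainder at 4 each.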
Abstract: Recent studies in the area of supply chain networks
(SCN) have focused on disruption issues in distribution systems.
This paper extends the previous literature by providing a new
bi-objective model for minimizing the cost of designing a three-echelon
SCN across normal and failure scenarios, considering multiple
capacity options for manufacturers and distribution centers. Moreover,
in order to solve the problem with the LINGO software, the
model is reformulated through a branch of the LP-metric method
called the Min-Max approach.
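For reference, the Min-Max branch of the LP-metric method scalarizes a bi-objective problem along these standard lines (the weights w_i and ideal values f_i^* below are the usual ingredients of the method, not values taken from this paper):

```latex
\min_{x \in X} \; \max_{i \in \{1, 2\}} \; w_i \, \frac{f_i(x) - f_i^{*}}{f_i^{*}},
\qquad f_i^{*} = \min_{x \in X} f_i(x)
```

This is the LP-metric with p = infinity: each objective's relative deviation from its individually optimal value is weighted, and the worst deviation is minimized, which yields a single-objective reformulation solvable in LINGO.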