Abstract: The rapid growth of information volume on the Internet creates a growing need for new (semi-)automatic methods for retrieving documents and ranking them according to their relevance to a user query. In this paper, after a brief review of ranking models, a new ontology-based approach for ranking HTML documents is proposed and evaluated under various circumstances. Our approach combines conceptual, statistical and linguistic methods. This combination preserves the precision of ranking without losing speed. Our approach exploits natural language processing techniques to extract phrases from the documents and the query and to stem words. An ontology-based conceptual method is then used to annotate documents and expand the query. To expand a query, the spread activation algorithm is improved so that the expansion can be done flexibly and along various dimensions. The annotated documents and the expanded query are processed to compute the relevance degree using statistical methods. The outstanding features of our approach are (1) combining conceptual, statistical and linguistic features of documents, (2) expanding the query with its related concepts before comparing it to documents, (3) extracting and using both words and phrases to compute the relevance degree, (4) improving the spread activation algorithm to perform the expansion based on a weighted combination of different conceptual relationships, and (5) allowing variable document vector dimensions. A ranking system called ORank has been developed to implement and test the proposed model. The test results are included at the end of the paper.
Abstract: In a particular case of behavioural model reduction by ANNs, a shortening of the validity domain has been found. In mechanics, as in other domains, the notion of validity domain allows the engineer to choose a valid model for a particular analysis or simulation. In a study of the mechanical behaviour of a cantilever beam (using linear and non-linear models), Multi-Layer Perceptron (MLP) Backpropagation (BP) networks have been applied as a model reduction technique. This reduced model is constructed to be more efficient than the non-reduced model. Within a less extended domain, the ANN reduced model correctly estimates the non-linear response at a lower computational cost. It has been found that the neural network model is not able to approximate the linear behaviour, while it approximates the non-linear behaviour very well. The details of the case are provided with an example of cantilever beam behaviour modelling.
Abstract: A new concept for long-term reagent storage in Lab-on-a-Chip (LoC) devices is described. Here we present a polymer multilayer stack with integrated stick packs for the long-term storage of several liquid reagents, which are necessary for many diagnostic applications. Stick packs are widely used in the packaging industry for storing solids and liquids over long periods. The storage concept fulfills two main requirements: first, long-term storage of reagents in stick packs without significant losses or interaction with the surroundings; second, on-demand release of the liquids, realized by pushing a membrane against the stick pack through pneumatic pressure. This concept enables long-term on-chip storage of liquid reagents at room temperature and allows easy implementation in different LoC devices.
Abstract: The purpose of this research was to determine the role of the 49 kDa immunogenic protein from V. alginolyticus, which is capable of initiating the expression of MHC class II molecules in receptors of Cromileptes altivelis. The method used was in vivo experimental research, testing the 49 kDa immunogenic protein from V. alginolyticus on Cromileptes altivelis (250-300 grams) with three boosters, injecting the immunogenic protein intramuscularly. The response of the expressed MHC molecules was demonstrated using immunocytochemistry and SEM. The results indicated that the 49 kDa V. alginolyticus adhesin, which has an immunogenic character, could trigger the expression of MHC class II on the grouper's receptors, as proven by staining with immunocytochemistry and SEM labeled with an anti-MHC antibody (anti-mouse). This visible expression is based on the binding between antigen epitopes and the anti-MHC antibody in the receptor. Using immunocytochemistry, the intracellular MHC response to in vivo induction with the immunogenic adhesin from V. alginolyticus was demonstrated.
Abstract: Motivated by the impact of maps in enhancing the perception of quality of life in a region, this work examines the use of spatial analytical techniques to explore the role of space in shaping human development patterns in the Assiut governorate. Variations in the human development index (HDI) across the governorate's villages, districts and cities are mapped using geographic information systems (GIS). Global and local spatial autocorrelation measures are employed to assess the levels of spatial dependency in the data and to map clusters of human development. The results show prominent disparities in HDI between regions of Assiut. Strong patterns of spatial association were found, proving the presence of clusters in the distribution of HDI. Finally, the study identifies several "hot spots" in the governorate as areas for further investigation into the attributes of such levels of human development. This is very important for accomplishing the development plan for the poorest regions currently adopted in Egypt.
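The global spatial autocorrelation measure referred to above is typically Moran's I; a minimal sketch of its computation on an invented four-region HDI vector (not Assiut data) follows:

```python
# Illustrative computation of global Moran's I for spatial autocorrelation.
# The HDI values and the binary contiguity matrix are made up for demonstration.

def morans_i(values, weights):
    """I = (n / W) * sum_ij w_ij (x_i - m)(x_j - m) / sum_i (x_i - m)^2."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    w_sum = sum(sum(row) for row in weights)                      # W
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / w_sum) * (num / den)

# Four regions along a line; regions sharing an edge are neighbours.
hdi = [0.55, 0.58, 0.71, 0.74]
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
i_stat = morans_i(hdi, w)  # positive: similar HDI values cluster together
```

A clearly positive I, as here, is the signature of the clustering the abstract reports; values near zero would indicate spatial randomness.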
Abstract: The purposes of this paper are to (1) promote excellence in computer science by suggesting a cohesive, innovative approach to fill well-documented deficiencies in current computer science education, (2) justify (using the authors' and others' anecdotal evidence from both the classroom and the real world) why this approach holds great potential to successfully eliminate the deficiencies, and (3) invite other professionals to join the authors in proof-of-concept research. The authors' experiences, though anecdotal, strongly suggest that a new approach involving visual modeling technologies should allow computer science programs to retain a greater percentage of prospective and declared majors as students become more engaged learners, more successful problem-solvers, and better prepared as programmers. In addition, the graduates of such computer science programs will make greater contributions to the profession as skilled problem-solvers. Instead of wearily re-memorizing code as they move to the next course, students will have the problem-solving skills to think and work in more sophisticated and creative ways.
Abstract: The notion of the Next Generation Network (NGN) is based on the network convergence concept, which refers to the integration of services (such as IT and communication services) over the IP layer. As the most popular implementation of Service Oriented Architecture (SOA), Web Services technology is known to be the basis for service integration. In this paper, we present a platform to deliver communication services as web services. We also implement a sample service to show the simplicity of composing web and communication services using this platform. A Service Logic Execution Environment (SLEE) is used to implement the communication services. The proposed architecture is in agreement with Service Oriented Architecture (SOA) and can also be integrated with an Enterprise Service Bus to form a basis for an NGN Service Delivery Platform (SDP).
Abstract: Artifact rejection plays a key role in many signal processing applications. Artifacts are disturbances that can occur during signal acquisition and that can alter the analysis of the signals themselves. Our aim is to remove artifacts automatically, in particular from electroencephalographic (EEG) recordings. A technique for automatic artifact rejection, based on Independent Component Analysis (ICA) for artifact extraction and on higher-order statistics such as kurtosis and Shannon's entropy, was proposed some years ago in the literature. In this paper we enhance this technique by proposing a new method based on Renyi's entropy. The performance of our method was tested and compared with that of the method in the literature, and the former proved to outperform the latter.
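The Renyi entropy criterion can be sketched as below on a histogram estimate of a component's distribution; the ICA decomposition itself is assumed to have been performed elsewhere, and the sample data are invented:

```python
# Minimal sketch of a Renyi entropy criterion on one independent component.
# Artifact-like components tend to have peaky distributions, hence low entropy.
import math

def renyi_entropy(samples, alpha=2, bins=10):
    """Renyi entropy H_a = log(sum_k p_k^a) / (1 - a) from a histogram."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / bins or 1.0          # guard against constant signals
    counts = [0] * bins
    for s in samples:
        k = min(int((s - lo) / width), bins - 1)
        counts[k] += 1
    probs = [c / len(samples) for c in counts if c]
    return math.log(sum(p ** alpha for p in probs)) / (1 - alpha)

# A peaky (artifact-like) component vs. a spread-out one.
peaky = [0.0] * 95 + [1.0] * 5
spread = [i / 99 for i in range(100)]
```

Thresholding components by such an entropy score (low score suggests an artifact) is one way the selection step described above can be realized.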
Abstract: HSDPA is a new feature introduced in the Release 5 specifications of the 3GPP WCDMA/UTRA standard to realize higher data rates together with lower round-trip times. Moreover, the HSDPA concept offers an outstanding improvement in packet throughput and also significantly reduces the packet call transfer delay compared to the Release 99 DSCH. To date, the HSDPA system has used turbo coding, which is the best coding technique for approaching the Shannon limit. However, the main drawbacks of turbo coding are its high decoding complexity and high latency, which make it unsuitable for some applications such as satellite communications, since the transmission distance itself introduces latency due to the limited speed of light. Hence, in this paper it is proposed to use LDPC coding in place of turbo coding for the HSDPA system, which decreases the latency and decoding complexity. LDPC coding does, however, increase the encoding complexity. Although the complexity of the transmitter at the Node B increases, the end user benefits in terms of receiver complexity and bit-error rate. In this paper the LDPC encoder is implemented using a sparse parity-check matrix H to generate a codeword, and the belief propagation algorithm is used for LDPC decoding. Simulation results show that in LDPC coding the BER drops sharply as the number of iterations increases with a small increase in Eb/No, which is not possible in turbo coding. The same BER was also achieved using fewer iterations, so the latency and receiver complexity are decreased with LDPC coding. HSDPA increases the downlink data rate within a cell to a theoretical maximum of 14 Mbps, with 2 Mbps on the uplink. The changes that HSDPA enables include better-quality, more reliable and more robust data services. In other words, while realistic data rates are only a few Mbps, the actual quality and number of users supported will improve significantly.
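The sparse parity-check view underlying the LDPC encoder and decoder can be illustrated with a toy matrix H: a word is a valid codeword exactly when every parity check is satisfied. The tiny matrix below is purely illustrative; HSDPA-scale codes use matrices with thousands of columns:

```python
# Toy sketch of the sparse parity-check matrix view of LDPC codes.
# A codeword c is valid iff H.c = 0 (mod 2); belief propagation iterates
# over exactly these checks until the syndrome is all-zero.

H = [  # 3 parity checks over 6 bits (each row lists the bits it constrains)
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 1, 0, 0, 1],
]

def syndrome(codeword):
    """Parity-check syndrome: all-zero means every check is satisfied."""
    return [sum(h * c for h, c in zip(row, codeword)) % 2 for row in H]

valid = [1, 1, 0, 0, 1, 1]
corrupted = [1, 0, 0, 0, 1, 1]  # the same word with one bit flipped
```

A non-zero syndrome is what triggers further decoding iterations; the reported BER drop with iteration count reflects belief propagation progressively driving the syndrome to zero.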
Abstract: With the development of the Internet, E-commerce is growing at an exponential rate, and many online stores have been built to sell goods online. A major factor influencing the successful adoption of E-commerce is consumers' trust. For new or unknown Internet businesses, consumers' lack of trust has been cited as a major barrier to proliferation. As web sites provide the key interface for consumer use of E-commerce, we investigate the design of web sites to build trust in E-commerce from a design science approach. A conceptual model is proposed in this paper to describe the ontology of online transactions and human-computer interaction. Based on this conceptual model, we provide a personalized webpage design approach using a Bayesian network learning method. Experimental evaluations are designed to show the effectiveness of web personalization in improving consumers' trust in new or unknown online stores.
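A minimal sketch of the learning idea follows, assuming session logs of (consumer segment, design element, trusted-or-not). It shows only the frequency-counting core of Bayesian network parameter estimation, with invented segment and element names, not the paper's full model:

```python
# Sketch: estimate P(trusted | segment, element) by frequency counts, then
# personalize by picking the element with the highest trust probability.
# Segments, elements and session data are invented for illustration.
from collections import defaultdict

def learn_cpt(sessions):
    """Conditional probability table from (segment, element, trusted) triples."""
    counts = defaultdict(lambda: [0, 0])  # (segment, element) -> [trusted, total]
    for segment, element, trusted in sessions:
        entry = counts[(segment, element)]
        entry[0] += trusted
        entry[1] += 1
    return {key: t / n for key, (t, n) in counts.items()}

def best_element(cpt, segment):
    """Personalization step: choose the element most likely to build trust."""
    options = {e: p for (s, e), p in cpt.items() if s == segment}
    return max(options, key=options.get)

sessions = [
    ("new_user", "security_badge", 1),
    ("new_user", "security_badge", 1),
    ("new_user", "flashy_banner", 0),
    ("returning", "flashy_banner", 1),
]
cpt = learn_cpt(sessions)
```

A full Bayesian network would additionally model dependencies between design elements; the table above corresponds to its simplest, fully observed special case.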
Abstract: The purpose of this study was to develop a "teachers' self-efficacy scale for high school physical education teachers (TSES-HSPET)" in Taiwan. The scale is based on the self-efficacy theory of Bandura [1], [2]. This study used exploratory and confirmatory factor analyses to test reliability and validity. The participants were high school physical education teachers in Taiwan. Both stratified random sampling and cluster sampling were used to sample participants for the study. In the first stage, 350 teachers were sampled and 234 valid scales (133 male, 101 female) were returned. In the second stage, 350 teachers were sampled and 257 valid scales (143 male, 110 female, 4 gender not indicated) were returned. Exploratory factor analysis was used in the first stage, and the construct accounted for 60.77% of the total variance. The Cronbach's alpha coefficient of internal consistency was 0.91 for the sum scale, and 0.84 and 0.90 for the subscales. In the second stage, confirmatory factor analysis was used to test construct validity. The results showed that the fit indices could be accepted (χ2 (75) = 167.94, p
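The internal-consistency figures reported above follow the standard Cronbach's alpha formula; the tiny response matrix below is invented solely to demonstrate the computation:

```python
# Sketch of Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / var(totals)).
# The three-item, four-respondent data set is made up for illustration.

def cronbach_alpha(items):
    """items: list of per-item response lists (rows = items, columns = people)."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum(var(it) for it in items) / var(totals))

responses = [
    [4, 3, 3, 1],
    [4, 4, 3, 2],
    [5, 4, 4, 1],
]
alpha = cronbach_alpha(responses)
```

Values above roughly 0.8, like the 0.84-0.91 range reported in the abstract, are conventionally read as good internal consistency.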
Abstract: In this paper a back-propagation artificial neural network (BPANN) with the Levenberg-Marquardt algorithm is employed to predict the limiting drawing ratio (LDR) of the deep drawing process. To prepare a training set for the BPANN, a number of finite element simulations were carried out. Die and punch radius, die arc radius, friction coefficient, sheet thickness, yield strength of the sheet and strain hardening exponent were used as the input data, with the LDR as the specified output used in training the neural network. Given these parameters, the program is able to estimate the LDR for any new condition. Comparing the FEM and BPANN results, an acceptable correlation was found.
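The inference step of such a trained network, with six process parameters in and one LDR estimate out through a single tanh hidden layer, can be sketched as follows; the weights are arbitrary placeholders, not the trained values from the paper:

```python
# Sketch of a BPANN regressor's forward pass: one tanh hidden layer, linear
# output. The weights and the input vector are invented placeholders.
import math

def mlp_forward(x, w_hidden, b_hidden, w_out, b_out):
    """Hidden layer with tanh activation, then a linear output neuron."""
    hidden = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(w_hidden, b_hidden)]
    return sum(w * h for w, h in zip(w_out, hidden)) + b_out

# Six inputs (die/punch radius, die arc radius, friction coefficient,
# thickness, yield strength, hardening exponent), assumed pre-scaled to [0, 1].
x = [0.5, 0.3, 0.2, 0.1, 0.7, 0.4]
w_hidden = [[0.1] * 6, [-0.2] * 6]
b_hidden = [0.0, 0.1]
ldr = mlp_forward(x, w_hidden, b_hidden, w_out=[1.0, 1.0], b_out=2.0)
```

Training with Levenberg-Marquardt only changes how these weights are fitted to the FEM data; the inference computation above is the same either way.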
Abstract: With today's fast lifestyles and busy schedules, nuclear families are becoming popular, and the elderly members of these families are often neglected. This has led to the growing popularity of community living for the aged. The elders reside at a centre controlled by a manager, who takes responsibility for the functioning of the centre. This includes taking care of the 'residents' as well as managing the daily chores of the centre, which he accomplishes with the help of a number of staff members and volunteers. Often the manager is not an employee but a volunteer; in such cases especially, time is an important constraint. A system that provides an easy and efficient way of managing the workings of an old age home in detail will prove to be of great benefit. We have developed a PC-based organizer used to monitor the various activities of an old age home. It is an effective and easy-to-use system that enables the manager to keep track of all the residents, their accounts, staff members, volunteers, the centre's logistic requirements, etc. It is thus a comprehensive 'Organizer' for old age homes.
Abstract: In this paper we present an adaptive method for image compression based on the complexity level of the image. The basic compressor/decompressor structure of this method is a multilayer perceptron artificial neural network. In the adaptive approach, different back-propagation artificial neural networks are used as compressor and decompressor. This is done by dividing the image into blocks, computing the complexity of each block, and then selecting one network for each block according to its complexity value. Three complexity measures, called Entropy, Activity and Pattern-based, are used to determine the level of complexity in image blocks, and their ability to estimate complexity is evaluated and compared. In training and evaluation, each image block is assigned to a network based on its complexity value. Best-SNR is an alternative way of selecting the compressor network for image blocks in the evaluation phase: it chooses the trained network that yields the best SNR when compressing the input image block. In our evaluations, the best results are obtained when overlapping of blocks is allowed and the compressor networks are chosen based on Best-SNR. In this case, the results demonstrate the superiority of this method compared with previous similar works and the JPEG standard coding.
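The Entropy complexity measure named above can be sketched on grayscale blocks represented as flat lists of pixel values; the block contents are invented for illustration:

```python
# Sketch of the Entropy complexity measure for image blocks: Shannon entropy
# of the pixel-value histogram. Higher entropy = more complex block, which
# would be routed to a different compressor network.
import math

def block_entropy(block):
    """Shannon entropy (bits) of the pixel-value distribution in a block."""
    counts = {}
    for p in block:
        counts[p] = counts.get(p, 0) + 1
    n = len(block)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

flat_block = [128] * 16        # uniform patch: zero entropy, trivially simple
busy_block = list(range(16))   # sixteen distinct values: maximal entropy
```

Routing blocks by such a score is what lets a simple network handle flat regions while a larger one handles detailed regions.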
Abstract: Multifunctional structures are a potentially disruptive technology that allows for significant mass savings on spacecraft. The specific concept addressed herein is that of a multifunctional power structure. In this paper, a parametric optimisation of the design of such a structure using commercially available battery cells is presented. Using numerical modelling, it was found that several trade-offs exist around the conflict between the capacity of the panel and its mechanical properties. It was found that there is no universal optimal location for the cells. Placing them close to the mechanical interfaces increases loading in the mechanically weak cells, whereas placing them at the centre of the panel increases the stress in the panel and reduces the stiffness of the structure.
Abstract: A new conceptual architecture for low-level neural pattern recognition is presented. The key ideas are that the brain implements support vector machines and that support vectors are represented as memory patterns in competitive queuing memories. A binary classifier is built from two competitive queuing memories holding positive- and negative-valence training examples, respectively. The support vector machine classification function is calculated in synchronized evaluation cycles. The kernel is computed by bisymmetric feed-forward networks fed by sensory input and by competitive queuing memories traversing the complete sequence of support vectors. Temporal summation generates the output classification. It is speculated that the perception apparatus in the brain reuses structures that have evolved to enable the fluent execution of prepared action sequences, so that pattern recognition is built on internalized motor programmes.
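The classification function evaluated over the stored support vectors is the standard SVM decision rule; the sketch below uses an RBF kernel and invented support vectors as illustrative assumptions, not the paper's data:

```python
# Sketch of the SVM decision rule accumulated support vector by support
# vector, mirroring a traversal of the stored sequence. Kernel choice,
# support vectors and coefficients are invented for illustration.
import math

def rbf_kernel(a, b, gamma=1.0):
    """Gaussian RBF kernel between two feature vectors."""
    return math.exp(-gamma * sum((x - y) ** 2 for x, y in zip(a, b)))

def svm_classify(x, support_vectors, labels, alphas, bias=0.0):
    """f(x) = sign(sum_i alpha_i * y_i * K(s_i, x) + b)."""
    score = sum(a * y * rbf_kernel(s, x)
                for s, y, a in zip(support_vectors, labels, alphas))
    return 1 if score + bias >= 0 else -1

# Two positive-valence and two negative-valence support vectors.
svs    = [(0.0, 0.0), (0.1, 0.1), (1.0, 1.0), (0.9, 1.1)]
labels = [1, 1, -1, -1]
alphas = [1.0, 1.0, 1.0, 1.0]
```

The per-vector terms of the sum are exactly what the proposed architecture would accumulate across its synchronized evaluation cycles.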
Abstract: The transformation of vocal characteristics aims at modifying voice either so that the intelligibility of aphonic voice is increased or so that the voice of one speaker (the source speaker) is perceived as if another speaker (the target speaker) had uttered it. In this paper, the current state of the art in voice characteristics transformation is reviewed. Special emphasis is placed on voice transformation methodology, and issues in improving the intelligibility and naturalness of the transformed speech are discussed. In particular, it is suggested to use the modulation theory of speech as a basis for research on high-quality voice transformation. This approach allows one to separate the linguistic, expressive, organic and perspectival information of speech, based on an analysis of how they are fused when speech is produced. This theory therefore provides the fundamentals not only for manipulating non-linguistic, extra-/paralinguistic and intra-linguistic variables for voice transformation, but also for paving the way to easily transposing existing voice transformation methods to emotion-related voice quality transformation and speaking style transformation. From the perspectives of human speech production and perception, the popular voice transformation techniques are described and classified according to their underlying principles, drawn from the speech production mechanism, the perception mechanism, or both. In addition, the advantages and limitations of voice transformation techniques and the experimental manipulation of vocal cues are discussed through examples from past and present research. Finally, a conclusion and a road map towards more natural voice transformation algorithms are given.
Abstract: Delivering course material via a virtual environment benefits today's students because it offers the interactivity, real-time interaction and social presence that students of all ages have come to expect in our gaming-rich community. It is essential that the Net Generation, also known as Generation Why, be exposed to learning communities that use interactivity to form social and educational connections. As student and professor become interconnected through collaboration and interaction in a virtual learning space, relationships develop and students begin to take on individual identities. With this in mind, this research project was developed to investigate the effect of virtual environments on student satisfaction and the effectiveness of course delivery. Furthermore, the project was designed to integrate interactive (real-time) classes conducted in the Virtual Reality (VR) environment while also creating archived VR sessions for students to use in retaining and reviewing course content.
Abstract: Consumers' motivations for sharing viral advertisements, and the impact of these advertisements on brand perceptions, are examined in this study. Three fundamental questions are addressed: individuals' motivations for watching and sharing advertisements, the criteria for liking a viral advertisement, and the impact of individuals' attitudes towards a viral advertisement on brand perception. The study will be carried out using a viral advertisement that ran in Turkey. The sample consists of individuals who experienced the advertisement; data will be collected via an online survey and analyzed using the SPSS statistical package.
Traditional views of advertising have recently been changing, and new advertising approaches with significant impacts on consumers have been debated. Viral advertising is a modern approach that offers brands significant advantages over traditional advertising channels such as television, radio and magazines. Viral advertising, also known as electronic word-of-mouth (eWOM), consists of the free spread of convincing messages sent by brands through interpersonal communication. Compared to traditional advertising, it takes a more provocative thematic approach. The foundation of this approach is to create advertisements that consumers consider worth sharing with others. In this light, it can even be said that viral advertising is media engineering. Content worth sharing turns people into volunteer spokespeople for a brand and strengthens the emotional bonds between brand and consumer. Especially in sectors and countries with limitations on traditional advertising channels, viral advertising creates vital advantages.
Abstract: Ontology matching is a task needed in various applications, for example for comparison or merging purposes. Many algorithms solving the matching problem can be found in the literature, but most of them do not consider instances at all. Mappings are determined by calculating the string similarity of labels, by recognizing linguistic word relations (synonyms, subsumptions, etc.) or by analyzing the (graph) structure. Given that instances are often modeled within the ontology, and that the set of instances describes the meaning of the concepts better than their meta-information, instances should definitely be incorporated into the matching process. In this paper several novel instance-based matching algorithms are presented which enhance the quality of matching results obtained with common concept-based methods. Different kinds of formalisms are used to classify concepts on account of their instances and finally to compare the concepts directly.

Keywords: Instances, Ontology Matching, Semantic Web
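One simple instance-based matcher in the spirit described above compares two concepts by the Jaccard overlap of their instance sets; the ontologies and threshold below are illustrative assumptions, not the paper's actual formalisms:

```python
# Sketch of instance-based ontology matching: propose a mapping between two
# concepts when their instance sets overlap sufficiently (Jaccard similarity).
# The ontologies, instances and threshold are invented for illustration.

def jaccard(a, b):
    """|A intersect B| / |A union B| over two instance sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def match_concepts(onto1, onto2, threshold=0.5):
    """Return concept pairs whose instance overlap reaches the threshold."""
    return [(c1, c2)
            for c1, i1 in onto1.items()
            for c2, i2 in onto2.items()
            if jaccard(i1, i2) >= threshold]

onto_a = {"Author": {"tolkien", "austen", "orwell"}}
onto_b = {"Writer": {"tolkien", "austen", "dickens"}, "City": {"london"}}
mappings = match_concepts(onto_a, onto_b)
```

Such instance-level scores can then be combined with label- or structure-based scores, which is the kind of enhancement over concept-based methods the abstract describes.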