Abstract: SIP (Session Initiation Protocol), with its simple and
efficient text-based, HTTP-like call control messaging, has recently
been widely adopted in VoIP networks. For authentication and
authorization purposes there are many approaches and considerations
for securing SIP against forgery and for protecting the integrity of
SIP messages. Elliptic Curve Cryptography (ECC), meanwhile, offers
significant advantages over other Public Key Cryptography (PKC)
systems, such as smaller key sizes and faster computations, which
make data transmission more secure and efficient. In this work a new
approach is proposed for secure SIP authentication using a public
key exchange mechanism based on ECC. By adopting the elliptic-based
key exchange mechanism, the total execution time and memory
requirements of the proposed scheme are improved in comparison
with non-elliptic approaches.
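An elliptic-curve key exchange of the kind described can be sketched over a toy curve (illustrative only: the curve, generator, and private scalars below are textbook-sized assumptions, and a real SIP deployment would use a standardized curve with a vetted library):

```python
# Toy ECDH over y^2 = x^3 + 2x + 2 mod 17, generator G = (5, 1) of order 19.
# This only illustrates why ECC keys can stay small; it is not hardened code.

P, A = 17, 2          # field prime and curve coefficient a
G = (5, 1)            # generator point

def ec_add(p1, p2):
    """Affine point addition; None represents the point at infinity."""
    if p1 is None:
        return p2
    if p2 is None:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None
    if p1 == p2:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P) % P
    x3 = (lam * lam - x1 - x2) % P
    return (x3, (lam * (x1 - x3) - y1) % P)

def ec_mul(k, point):
    """Double-and-add scalar multiplication."""
    result = None
    while k:
        if k & 1:
            result = ec_add(result, point)
        point = ec_add(point, point)
        k >>= 1
    return result

# Each party picks a private scalar and publishes its public point.
a_priv, b_priv = 7, 11
a_pub, b_pub = ec_mul(a_priv, G), ec_mul(b_priv, G)
shared_a = ec_mul(a_priv, b_pub)   # a * (b * G)
shared_b = ec_mul(b_priv, a_pub)   # b * (a * G)
assert shared_a == shared_b        # both sides derive the same secret
```

The shared point would then seed a session key for authenticating SIP messages; the small scalar arithmetic above is what keeps ECC key sizes and execution times low relative to RSA-style PKC.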
Abstract: Wireless Sensor Networks (WSNs) can be used to monitor
physical phenomena in areas that are nearly inaccessible to humans.
The limited power supply is the major constraint of WSNs, owing to
the use of non-rechargeable batteries in sensor nodes, and much
research is devoted to reducing the energy consumption of sensor
nodes. An energy map can be combined with clustering, data
dissemination and routing techniques to reduce the power consumption
of a WSN; it can also be used to identify which part of the network
is going to fail in the near future. In this paper, the energy map
is constructed using a prediction-based approach, with the adaptive
alpha GM(1,1) model as the prediction model. GM(1,1) is used
worldwide in many applications for predicting future values of a
time series from a few past values, owing to its high computational
efficiency and accuracy.
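The fixed-alpha GM(1,1) prediction step can be sketched as follows (a minimal pure-Python sketch; the adaptive-alpha variant used in the paper, which tunes alpha per series, is not reproduced, and the example series is hypothetical):

```python
# Minimal GM(1,1) grey prediction with a fixed background coefficient alpha.
import math

def gm11_predict(x0, steps=1, alpha=0.5):
    """Fit GM(1,1) to series x0 and predict the next `steps` values."""
    n = len(x0)
    # 1-AGO: accumulated generating operation
    x1 = [sum(x0[:i + 1]) for i in range(n)]
    # background values z[k] = alpha*x1[k] + (1-alpha)*x1[k-1]
    z = [alpha * x1[k] + (1 - alpha) * x1[k - 1] for k in range(1, n)]
    y = x0[1:]
    # least-squares fit of y = -a*z + b (normal equations, solved directly)
    m = len(z)
    sz, sy = sum(z), sum(y)
    szz = sum(v * v for v in z)
    szy = sum(zv * yv for zv, yv in zip(z, y))
    slope = (m * szy - sz * sy) / (m * szz - sz * sz)
    a, b = -slope, (sy - slope * sz) / m
    # whitened-equation response plus inverse AGO gives the forecasts
    preds = []
    for k in range(n, n + steps):
        x1_next = (x0[0] - b / a) * math.exp(-a * k) + b / a
        x1_prev = (x0[0] - b / a) * math.exp(-a * (k - 1)) + b / a
        preds.append(x1_next - x1_prev)
    return preds

# hypothetical, roughly exponential "node energy drain" style series
series = [2.0, 2.2, 2.42, 2.662]
nxt = gm11_predict(series)[0]
```

On near-exponential data like this, the one-step forecast lands very close to the geometric continuation (about 2.93 here), which is why GM(1,1) is attractive for small-sample energy prediction on resource-limited nodes.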
Abstract: Low-density parity-check (LDPC) codes have been shown to deliver capacity-approaching performance; however, problematic graphical structures (e.g. trapping sets) in the Tanner graph of some LDPC codes can cause high error floors in bit-error-ratio (BER) performance under the conventional sum-product algorithm (SPA). This paper presents a serial concatenation scheme to avoid the trapping sets and to lower the error floors of the LDPC code. The outer code in the proposed concatenation is the LDPC code, and the inner code is a high-rate array code. The approach applies an iterative hybrid process between BCJR decoding for the array code and the SPA for the LDPC code, together with bit-pinning and bit-flipping techniques. The (2640, 1320) Margulis code has been used for the simulation, and it is shown that the proposed concatenation and decoding scheme can considerably improve the error floor performance with minimal rate loss.
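The bit-flipping ingredient of such hybrid decoders can be sketched in isolation (a hedged illustration on a tiny Hamming(7,4) code, not the paper's full BCJR/SPA hybrid on the Margulis code):

```python
# Hard-decision (Gallager-style) bit flipping on a small parity-check matrix.

H = [
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def syndrome(H, word):
    """Parity-check results: zero vector means a valid codeword."""
    return [sum(h * w for h, w in zip(row, word)) % 2 for row in H]

def bit_flip_decode(H, word, max_iters=10):
    word = list(word)
    for _ in range(max_iters):
        s = syndrome(H, word)
        if not any(s):
            break  # all parity checks satisfied
        # count unsatisfied checks touching each bit
        fails = [sum(s[i] for i, row in enumerate(H) if row[j])
                 for j in range(len(word))]
        worst = max(fails)
        # flip every bit involved in the maximum number of failed checks
        word = [w ^ (f == worst) for w, f in zip(word, fails)]
    return word

received = [0, 0, 1, 0, 0, 0, 0]   # all-zero codeword with one bit error
decoded = bit_flip_decode(H, received)
```

In the concatenated scheme, flips like these (guided by the inner code's BCJR soft output and pinned bits) are what dislodge the SPA from trapping-set fixed points.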
Abstract: In this paper we propose a novel approach for ascertaining human identity based on the fusion of profile face and gait biometric cues. The identification approach, based on feature learning in a PCA-LDA subspace and classification using multivariate Bayesian classifiers, allows a significant improvement in recognition accuracy for low-resolution surveillance video scenarios. The experimental evaluation of the proposed identification scheme on a publicly available database [2] showed that the fusion of face and gait cues in a joint PCA-LDA space is a powerful method for capturing the inherent multimodality of walking gait patterns while at the same time discriminating person identity.
Abstract: Data Envelopment Analysis (DEA) is one of the most
widely used techniques for evaluating the relative efficiency of a set
of homogeneous decision making units. Traditionally, it assumes that
input and output variables are known in advance, ignoring the critical
issue of data uncertainty. In this paper, we deal with the problem
of efficiency evaluation under uncertain conditions by adopting the
general framework of stochastic programming. We assume that
output parameters are represented by discretely distributed random
variables, and we propose two different models defined according to a
risk-neutral and a risk-averse perspective. The models have been
validated on a real case study concerning the evaluation of the
technical efficiency of a sample of individual firms operating in
the Italian leather manufacturing industry. Our findings show the
validity of the proposed approach as an ex-ante evaluation technique,
providing the decision maker with useful insights depending on
his or her degree of risk aversion.
Abstract: Studies have shown that the SnAgCu solder family is widely used as a replacement for conventional Sn-Pb solders. An attractive approach is to introduce alloying additives (rare earth elements (RE), Zn, Co, Fe, Ni, Sb) into the SnAgCu solder, which helps in refining the microstructure and improving the mechanical and wetting properties of the solder. The present work focuses on the effect of 0.5% Ce and Fe additions to Sn-3.0Ag-0.5Cu solder, in an attempt to reduce intermetallic compound (IMC) growth, on the reflow properties of the solder on Cu and Ni(P) surface finishes, as well as on the effects of thermal aging on IMC formation on the different surface finishes. Excessive intermetallic compound growth may affect the interface and the solder joint because of the brittle nature of the intermetallic compounds. Thus, by introducing alloying elements, the IMC layer thickness can be decreased, resulting in better solder joint reliability.
Abstract: This article first summarizes the reasons why current approaches supporting Open Learning and Distance Education need to be complemented by tools permitting lecturers, researchers and students to cooperatively organize the semantic content of learning-related materials (courses, discussions, etc.) into a fine-grained shared semantic network. This first part of the article also briefly describes the approach adopted to permit such collaborative work. Then, examples of such semantic networks are presented. Finally, an evaluation of the approach by students is provided and analyzed.
Abstract: The paper focuses on the area of context modeling with respect to the specification of context-aware systems supporting ubiquitous applications. The proposed approach, followed within the SIMPLICITY IST project, uses a high-level system ontology to derive context models for system components, which are subsequently mapped to the system's physical entities. For the definition of user- and device-related context models in particular, the paper suggests a standards-based process consisting of an analysis phase using the Common Information Model (CIM) methodology, followed by an implementation phase that defines 3GPP-based components. The benefits of this approach are further depicted by preliminary examples of XML grammars defining profiles, components and component instances, coupled with descriptions of the respective ubiquitous applications.
Abstract: In illumination-variant face recognition, existing
methods that extract the face albedo as a light-normalized image may
lose extensive facial detail, since the lighting template is
discarded. To address this, a novel approach for realistic facial
texture reconstruction that combines the original image and the
albedo image is proposed. First, light subspaces of different
identities are established from the given reference face images;
then, by projecting the original and albedo images into each light
subspace respectively, texture reference images with the
corresponding lighting are reconstructed and two texture subspaces
are formed. From the projections in the texture subspaces, facial
texture under normal lighting can be synthesized. Because the
original image is combined with the face albedo, facial details are
preserved. In addition, image partitioning is applied to improve the
synthesis performance. Experiments on the Yale B and CMU PIE
databases demonstrate that this algorithm outperforms the others
both in image representation and in face recognition.
Abstract: Phylogenies, the evolutionary histories of groups of
species, are among the most widely used tools throughout the life
sciences, as well as objects of research within systematics and
evolutionary biology. Most phylogenetic analyses reconstruct trees
representing the evolutionary histories of groups of organisms.
However, horizontal gene transfer in bacteria and hybridization in
plants lead to reticulate evolution, and tree-construction methods
fail to capture the resulting reticulate networks. In this paper a
model is employed to reconstruct a phylogenetic network for the
honey bee, representing reticulate evolution in the honey bee. The
maximum parsimony approach has been used to obtain this reticulate
network.
Abstract: User-Centered Design (UCD), Usability Engineering (UE) and Participatory Design (PD) are the common Human-Computer Interaction (HCI) approaches practiced in the software development process, focusing on issues concerning user involvement; they overlook, however, the organizational perspective of HCI integration within the software development organization. The Management Information Systems (MIS) perspective of HCI takes a managerial and organizational context to view the effectiveness of integrating HCI in the software development process. Human-Centered Design (HCD), which encompasses all of the human aspects, including aesthetics and ergonomics, is claimed to provide a better way of strengthening the HCI approaches and thereby the software development process. To determine the effectiveness of HCD in the software development process, this paper presents the findings of a content analysis of HCI approaches, viewing those approaches as a technology that integrates user requirements from top management down to the other stakeholders in the software development process. The findings show that HCD is a technology that emphasizes humans, tools and knowledge in strengthening the HCI approaches, and thereby the software development process, in the quest to produce sustainable, usable and useful software products.
Abstract: In many applications, it is a priori known that the
target function should satisfy certain constraints imposed by, for
example, economic theory or a human-decision maker. Here we
consider partially monotone problems, where the target variable
depends monotonically on some of the predictor variables but not all.
We propose an approach to build partially monotone models based
on the convolution of monotone neural networks and kernel
functions. The results from simulations and a real case study on
house pricing show that our approach has significantly better
performance than partially monotone linear models. Furthermore, the
incorporation of partial monotonicity constraints not only leads to
models that are in accordance with the decision maker's expertise,
but also reduces considerably the model variance in comparison to
standard neural networks with weight decay.
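One way the partial-monotonicity constraint can be enforced is sketched below (an illustrative construction of my own, not the authors' convolution of monotone networks and kernels: weights on the monotone input and on the output layer are kept non-negative, so the output cannot decrease when that input increases):

```python
# A tiny two-layer network that is monotone in x_mono but free in x_free.
import math
import random

random.seed(0)

def partially_monotone_net(x_mono, x_free, w_mono, w_free, v):
    """Hidden tanh units; w_mono and v are constrained non-negative,
    which makes the output non-decreasing in x_mono."""
    hidden = [math.tanh(wm * x_mono + wf * x_free)
              for wm, wf in zip(w_mono, w_free)]
    return sum(vi * h for vi, h in zip(v, hidden))

# random non-negative weights on the monotone path, unconstrained elsewhere
w_mono = [random.random() for _ in range(5)]           # >= 0
w_free = [random.uniform(-1.0, 1.0) for _ in range(5)]
v = [random.random() for _ in range(5)]                # >= 0

# empirical check: output is non-decreasing in the monotone input
xs = [i / 10 for i in range(-20, 21)]
outs = [partially_monotone_net(x, 0.3, w_mono, w_free, v) for x in xs]
assert all(a <= b + 1e-12 for a, b in zip(outs, outs[1:]))
```

Because tanh is non-decreasing and all weights on the monotone path are non-negative, monotonicity holds by construction for any training outcome, which is the property that keeps such models in accordance with the decision maker's expertise.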
Abstract: Food consumption surveys have shown that potato consumption in Latvia is among the highest in Europe. Hence acrylamide (AA) intake from fried potatoes in the population might be high as well. The aim of the research was to determine the acrylamide content of, and to estimate the intake of acrylamide from, roasted potatoes bred and cultivated in Latvia. Five common Latvian potato varieties were selected: Lenora, Brasla, Imanta, Zile, and Madara. A two-year study was conducted over two periods: just after harvesting and after six months of storage. Time and temperature (210 ± 5°C) were recorded during frying. AA was extracted from the potatoes by solid phase extraction and the AA content was determined by LC-MS/MS. The estimated intake of acrylamide ranges from 0.012 to 0.496 μg kg-1 BW per day.
Abstract: Various models have been derived by studying a large number of completed software projects from various organizations and applications, to explore how project size maps into project effort. Still, there is a need to improve the prediction accuracy of these models. Since a neuro-fuzzy system can approximate non-linear functions with greater precision, it is used here as a soft computing approach to generate a model by formulating the relationship based on its training. In this paper, the neuro-fuzzy technique is used for software estimation modeling on NASA software project data, and the performance of the developed models is compared with the Halstead, Walston-Felix, Bailey-Basili and Doty models mentioned in the literature.
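The four benchmark models named above are commonly stated in the effort-estimation literature as simple power laws (effort in person-months, size in KLOC); the constants below are the usually quoted ones and vary slightly between sources, so treat them as indicative, and the 20 KLOC project is hypothetical:

```python
# Classic size-to-effort benchmark models (effort in person-months, size in KLOC).

def halstead(kloc):
    return 0.7 * kloc ** 1.50

def walston_felix(kloc):
    return 5.2 * kloc ** 0.91

def bailey_basili(kloc):
    return 5.5 + 0.73 * kloc ** 1.16

def doty(kloc):
    # variant commonly quoted for projects above 9 KLOC
    return 5.288 * kloc ** 1.047

size = 20.0  # a hypothetical 20 KLOC project
estimates = {f.__name__: round(f(size), 1)
             for f in (halstead, walston_felix, bailey_basili, doty)}
```

A neuro-fuzzy model is fitted to the project data instead of fixing such constants a priori, which is where the accuracy gain over these closed-form baselines comes from.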
Abstract: Nowadays, rapid technological change forces companies
to develop innovative products in an increasingly competitive
environment. Shortening the time of new product development is
therefore very important. This design problem often lacks an exact
formulation and depends highly upon the designers' past experience.
For these reasons, a Case-Based Reasoning (CBR) system to assist in
new product development is proposed in this work. When a case is
recovered from the case base, the system takes into account not only
the attribute's specific value and its importance, but also whether
the attribute has a positive influence on the product development,
so that the manufacturing time can be improved. This information is
introduced as a new concept called "adaptability". An application of
this method to the design of a new hearing instrument illustrates
the proposed approach.
Abstract: This paper presents a solution for the behavioural animation of autonomous virtual agent navigation in virtual environments. We focus on using the Dempster-Shafer theory of evidence to develop a visual sensor for the virtual agent. The role of the visual sensor is to capture information about the virtual environment and to identify which parts of an obstacle can be seen from the position of the virtual agent. This information is required for the virtual agent to coordinate navigation in the virtual environment. The virtual agent uses a fuzzy controller as its navigation system and a fuzzy α-level method for action selection. The results clearly demonstrate that the path produced is reasonably smooth, despite some sharp turns, and does not divert far from the potential shortest path. This indicates the benefit of our method, which produces more reliable and accurate paths during the navigation task.
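The evidence-fusion core of Dempster-Shafer theory, Dempster's rule of combination, can be sketched as follows (the frame of discernment and the mass values here are hypothetical, not taken from the paper's sensor model):

```python
# Dempster's rule of combination for two basic probability assignments
# over a small frame of discernment, with masses keyed by frozensets.
from itertools import product

def combine(m1, m2):
    """Fuse two mass functions; conflicting mass is renormalized away."""
    combined, conflict = {}, 0.0
    for (s1, v1), (s2, v2) in product(m1.items(), m2.items()):
        inter = s1 & s2
        if inter:
            combined[inter] = combined.get(inter, 0.0) + v1 * v2
        else:
            conflict += v1 * v2  # mass assigned to the empty set
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

F, O = frozenset({"free"}), frozenset({"obstacle"})
theta = F | O  # the full frame represents ignorance

m_sensor = {O: 0.6, theta: 0.4}          # one hypothetical sensor reading
m_prior = {O: 0.5, F: 0.2, theta: 0.3}   # hypothetical prior evidence
fused = combine(m_sensor, m_prior)
```

After fusion the belief committed to "obstacle" rises above either source alone, which is the kind of aggregated evidence a fuzzy navigation controller can then act on.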
Abstract: Optical flow has been a research topic of interest for many
years. It has, until recently, been largely inapplicable to real-time
applications due to its computationally expensive nature. This paper
presents a new reliable flow technique which is combined with a
motion detection algorithm, operating on stationary-camera image
streams, to allow flow-based analyses of moving entities, such as
rigidity, in real time. Combining the optical flow analysis with the
motion detection technique greatly reduces the expensive computation
of flow vectors compared with standard approaches, rendering the
method applicable to real-time implementation. The paper also
describes the hardware implementation of a proposed pipelined
system to estimate the flow vectors from image sequences in real
time. The design can process 768 x 576 images at a very high frame
rate, reaching 156 fps, on a single low-cost FPGA chip, which is
adequate for most real-time vision applications.
Abstract: In this paper, we propose an adaptation of the Patricia tree for sparse datasets to generate non-redundant association rules. Using this adaptation, we can generate frequent closed itemsets, which are more compact than the frequent itemsets used in the Apriori approach. The adaptation has been evaluated on a set of benchmark datasets.
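The compactness claim can be illustrated with a brute-force sketch (a toy transaction set of my own; the Patricia-tree machinery itself is not reproduced): an itemset is *closed* when no strict superset has the same support, and the closed sets alone determine the support of every frequent itemset.

```python
# Brute-force frequent vs. frequent-closed itemsets on a toy dataset.
from itertools import combinations

transactions = [
    {"a", "b", "c"},
    {"a", "b"},
    {"a", "c"},
    {"a", "b", "c"},
]
min_support = 2

def support(itemset):
    """Number of transactions containing the itemset."""
    return sum(itemset <= t for t in transactions)

items = sorted(set().union(*transactions))
frequent = [frozenset(c)
            for r in range(1, len(items) + 1)
            for c in combinations(items, r)
            if support(set(c)) >= min_support]

# closed = no strict superset with the same support
closed = [s for s in frequent
          if not any(s < t and support(t) == support(s) for t in frequent)]
```

Here 7 frequent itemsets collapse to 4 closed ones with no loss of support information, which is exactly the redundancy the rule-generation step avoids.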
Abstract: In this paper a new approach to face recognition is presented that achieves a double dimension reduction, making the system computationally efficient while giving better recognition results. In pattern recognition techniques, the discriminative information of an image increases with resolution up to a certain extent; consequently, face recognition results improve with increasing face image resolution and level off beyond a certain resolution level. In the proposed model of face recognition, an image decimation algorithm is first applied to the face image for dimension reduction down to the resolution level that provides the best recognition results. Owing to its computational speed and feature extraction potential, the Discrete Cosine Transform (DCT) is then applied to the face image, and a subset of DCT coefficients from low to mid frequencies that represents the face adequately and provides the best recognition results is retained. A trade-off between the decimation factor, the number of DCT coefficients retained and the recognition rate is obtained with minimum computation. Preprocessing of the image is carried out to increase its robustness against variations in pose and illumination level. The new model has been tested on different databases, including the ORL database, the Yale database and a color database, and has performed much better than other techniques. The significance of the model is twofold: (1) dimension reduction to an effective and suitable face image resolution, and (2) retention of the appropriate DCT coefficients to achieve the best recognition results under varying image poses, intensity and illumination levels.
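The coefficient-retention step can be sketched on a toy 4x4 block (illustrative only: the block values are arbitrary, and the actual decimation level and coefficient count are tuned experimentally, not fixed as here):

```python
# Naive orthonormal 2-D DCT-II and low-to-mid frequency coefficient retention.
import math

def dct2(block):
    """Direct (O(n^4)) 2-D DCT-II of a square block."""
    n = len(block)

    def alpha(k):
        return math.sqrt(1 / n) if k == 0 else math.sqrt(2 / n)

    out = [[0.0] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            s = sum(block[i][j]
                    * math.cos((2 * i + 1) * u * math.pi / (2 * n))
                    * math.cos((2 * j + 1) * v * math.pi / (2 * n))
                    for i in range(n) for j in range(n))
            out[u][v] = alpha(u) * alpha(v) * s
    return out

block = [[52, 55, 61, 66],
         [70, 61, 64, 73],
         [63, 59, 55, 90],
         [67, 61, 68, 104]]
coeffs = dct2(block)

# keep the low-to-mid frequencies (here: coefficients with u + v <= 2)
features = [coeffs[u][v] for u in range(4) for v in range(4) if u + v <= 2]
```

Keeping only the low-frequency corner of the coefficient matrix is what yields the second dimension reduction: 6 numbers stand in for the 16-pixel block while retaining most of its energy.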
Abstract: Bond graph models of an electrical transformer including
nonlinear saturation are presented. The transformer is modelled
using electrical and magnetic circuits. These models determine the
relation between the self and mutual inductances, and the leakage
and magnetizing inductances, of two-winding power transformers
using the properties of a bond graph. The equivalence between
electrical and magnetic variables is given. The modelling and
analysis methodology can be extended to three-phase power
transformers.
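For reference, the relations such models recover can be written in their standard two-winding form (stated here as textbook transformer theory, not taken from the paper): with turns N1, N2 and a common core reluctance R,

```latex
L_{11} = L_{\ell 1} + \frac{N_1^2}{\mathcal{R}}, \qquad
L_{22} = L_{\ell 2} + \frac{N_2^2}{\mathcal{R}}, \qquad
M = \frac{N_1 N_2}{\mathcal{R}},
```

so the magnetizing inductance referred to the primary is $L_{m1} = N_1^2/\mathcal{R}$ and the mutual inductance satisfies $M = (N_2/N_1)\,L_{m1}$, with $L_{\ell 1}$, $L_{\ell 2}$ the leakage inductances.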