Abstract: Automatic reusability appraisal is helpful in
evaluating the quality of developed or developing reusable software
components and in identifying reusable components from
existing legacy systems, which can save the cost of developing
software from scratch. However, the issue of how to identify reusable
components from existing systems has remained relatively
unexplored. In this research work, structural attributes of software
components are explored using software metrics, and the quality of the
software is inferred by different neural network based approaches,
taking the metric values as input. The calculated reusability value
makes it possible to identify good-quality code automatically. It is found
that the determined reusability value is close to the manual analysis
traditionally performed by programmers or repository managers. Thus, the
developed system can be used to enhance the productivity and
quality of software development.
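The pipeline described above (metric values in, a reusability score out) can be sketched as a single-hidden-layer perceptron forward pass. This is a minimal illustration, not the paper's trained model: the weights, biases, and the two-metric input are placeholder assumptions.

```python
import math

def mlp_reusability(metrics, w_hidden, b_hidden, w_out, b_out):
    """One-hidden-layer perceptron forward pass mapping a vector of
    software metric values (e.g. coupling, cohesion) to a reusability
    score in (0, 1). All weights here are illustrative placeholders,
    not trained values from the paper."""
    sigmoid = lambda z: 1.0 / (1.0 + math.exp(-z))
    hidden = [sigmoid(sum(w * m for w, m in zip(row, metrics)) + b)
              for row, b in zip(w_hidden, b_hidden)]
    return sigmoid(sum(w * h for w, h in zip(w_out, hidden)) + b_out)
```

In the actual system the weights would be learned from metric vectors labeled with manually assessed reusability values.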
Abstract: In this work we present an efficient approach for face
recognition in the infrared spectrum. In the proposed approach,
physiological features are extracted from thermal images in order to
build a unique thermal faceprint. Then, a distance transform is used
to obtain an invariant representation for face recognition. The obtained
physiological features are related to the distribution of blood vessels
under the face skin. This blood network is unique to each individual
and can be used in infrared face recognition. The obtained results are
promising and show the effectiveness of the proposed scheme.
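The distance-transform step can be illustrated on a binary mask of the segmented vessel network. The BFS-based grid transform below is a minimal, hedged stand-in (using 4-connected step counts) for whichever transform the paper actually applies:

```python
from collections import deque

def distance_transform(mask):
    """For each cell of a binary grid, the number of 4-connected steps
    to the nearest foreground cell (e.g. a segmented blood-vessel pixel).
    A simple multi-source BFS sketch, not the paper's implementation."""
    rows, cols = len(mask), len(mask[0])
    dist = [[None] * cols for _ in range(rows)]
    queue = deque()
    for r in range(rows):
        for c in range(cols):
            if mask[r][c]:
                dist[r][c] = 0
                queue.append((r, c))
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and dist[nr][nc] is None:
                dist[nr][nc] = dist[r][c] + 1
                queue.append((nr, nc))
    return dist
```

The resulting distance map varies smoothly away from the vessel network, which is what makes it a more pose-tolerant representation than the raw mask.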
Abstract: A method of gait identification based on the nearest neighbor classification technique, with motion similarity assessed by dynamic time warping (DTW), is proposed. Model-based kinematic motion data, represented by joint rotations encoded as Euler angles and unit quaternions, are used. Different pose distance functions in the Euler angle and quaternion spaces are considered. To evaluate the individual features of successive joint movements during the gait cycle, joint selection is carried out. To examine the proposed approach, a database containing 353 gaits of 25 humans, collected in a motion capture laboratory, is used. The obtained results are promising: classification that takes all joints into consideration achieves accuracy over 91%, while analysis of hip joint movements alone correctly identifies gaits with almost 80% precision.
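The core of the approach above, DTW similarity plus 1-nearest-neighbor classification, can be sketched on 1-D joint-angle sequences. The absolute difference below is a placeholder for the paper's Euler-angle or quaternion pose distance functions:

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D sequences of
    joint-angle values; abs() stands in for the pose distance function."""
    n, m = len(a), len(b)
    inf = float("inf")
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest warping path: insertion, deletion, or match
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

def nearest_neighbor_gait(query, gallery):
    """gallery: list of (subject_label, angle_sequence) pairs; returns
    the label of the DTW-nearest gallery sequence (1-NN classification)."""
    return min(gallery, key=lambda item: dtw_distance(query, item[1]))[0]
```

Warping lets two gait cycles of slightly different speed align before their poses are compared, which is why DTW is preferred over a rigid frame-by-frame distance here.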
Abstract: With the surge of stream processing applications, novel
techniques are required for the generation and analysis of association
rules in streams. Traditional rule mining solutions cannot handle
streams because they generally require multiple passes over the data
and cannot guarantee results within a predictably small time. Though
researchers have been proposing algorithms for the generation of rules
from streams, there has not been much focus on their analysis.
We propose association rule profiling, a user-centric process for
analyzing association rules and attaching suitable profiles to them
depending on their changing frequency behavior over a previous
snapshot of time in a data stream.
Association rule profiles provide insights into the changing nature
of associations and can be used to characterize the associations. We
discuss the importance of characteristics such as the predictability of
linkages present in the data and propose a metric to quantify it. We
also show how association rule profiles can aid in the generation of
user-specific, more understandable and actionable rules.
The framework is implemented as SUPAR: System for User-centric
Profiling of Association Rules in streaming data. The
proposed system offers the following capabilities:
i) Continuous monitoring of frequency of streaming item-sets
and detection of significant changes therein for association rule
profiling.
ii) Computation of metrics for quantifying predictability of
associations present in the data.
iii) User-centric control of the characterization process: user
can control the framework through a) constraint specification and b)
non-interesting rule elimination.
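Capability (i) can be sketched as a sliding-window support monitor that flags significant frequency changes. This is a hedged illustration only; the window size, the change threshold, and the simple support-difference test are hypothetical parameters, not SUPAR's actual mechanism:

```python
from collections import deque

class FrequencyMonitor:
    """Sliding-window support monitor for one streaming item-set.
    Window size and change threshold are illustrative parameters."""

    def __init__(self, window_size, change_threshold):
        self.window = deque(maxlen=window_size)
        self.threshold = change_threshold
        self.prev_support = None

    def update(self, transaction, itemset):
        """Feed one transaction (a set of items); return the item-set's
        current window support and whether it changed significantly."""
        self.window.append(itemset <= transaction)  # containment test
        support = sum(self.window) / len(self.window)
        changed = (self.prev_support is not None
                   and abs(support - self.prev_support) > self.threshold)
        self.prev_support = support
        return support, changed
```

A profile for the rule would then be assigned from the history of `changed` flags (e.g. stable, drifting, or erratic).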
Abstract: Cooperative diversity (CD) has been adopted in many communication systems because it improves the performance of wireless communication systems with the help of relays that emulate multiple-antenna terminals. This work provides a performance analysis of multiuser diversity (MUD) in two-hop cooperative multi-relay wireless networks (TCMRNs). We analytically derive closed-form expressions for the two most commonly used performance metrics, namely the outage probability and the symbol error probability (SEP), for the fixed decode-and-forward (FDF) protocol with MUD.
Abstract: Prior research has shown that unimodal biometric
systems suffer from several limitations, such as noisy data, intra-class
variations, restricted degrees of freedom, non-universality, spoof
attacks, and unacceptable error rates. For a biometric system to be
more secure and to provide high accuracy, more than one
form of biometrics is required. Hence the need arises for multimodal
biometrics using combinations of different biometric modalities. This
paper introduces a multimodal biometric system (MMBS) based on
fusion of whole dorsal hand geometry and fingerprints that acquires
right and left (Rt/Lt) near-infra-red (NIR) dorsal hand geometry (HG)
shape and (Rt/Lt) index and ring fingerprints (FP). A database of 100
volunteers was acquired using the designed prototype. The acquired
images were of sufficient quality for feature and pattern extraction
across all modalities. HG features based on the hand-shape
anatomical landmarks were extracted. Robust and fast algorithms for
FP minutia points feature extraction and matching were used. Feature
vectors that belong to similar biometric traits were fused using
feature fusion methodologies. Scores obtained from different
biometric trait matchers were fused using the Min-Max
transformation-based score fusion technique. Final normalized scores
were merged using the sum of scores method to obtain a single
decision about the personal identity based on multiple independent
sources. The high individuality of the fused traits and the user
acceptability of the designed system, along with its high experimental
performance on biometric measures, show that this MMBS can be
considered for medium-to-high security biometric identification purposes.
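The score-fusion stage described above, Min-Max normalization of each matcher's scores followed by the sum-of-scores rule, can be sketched directly. The matcher score lists below are toy values for illustration:

```python
def min_max_normalize(scores):
    """Min-Max transformation: map raw matcher scores to [0, 1]."""
    lo, hi = min(scores), max(scores)
    if hi == lo:
        return [0.0 for _ in scores]
    return [(s - lo) / (hi - lo) for s in scores]

def fuse_sum_of_scores(matcher_scores):
    """matcher_scores: one score list per matcher (HG, FP, ...), each
    scoring the same candidates. Each matcher is normalized
    independently, then scores are summed per candidate."""
    normalized = [min_max_normalize(s) for s in matcher_scores]
    return [sum(col) for col in zip(*normalized)]
```

Normalizing per matcher first is what lets scores on incommensurable scales (hand-geometry distances vs. minutiae match counts) be summed meaningfully.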
Abstract: P2P networks are highly dynamic structures, since
their nodes (peer users) keep joining and leaving continuously. In this
paper, we study the effects of network change rates on query routing
efficiency. First we describe some background and an abstract system
model. The chosen routing technique makes use of cached metadata
from previous answer messages and also employs a mechanism for
broken path detection and metadata maintenance. Several metrics are
used to show that the protocol behaves quite well even with a high rate
of node departures, but above a certain threshold it breaks
down and exhibits considerable efficiency degradation.
Abstract: The success of an electronic system in a System-on-Chip is highly dependent on the efficiency of its interconnection network, which is constructed from routers and channels (the routers move data across the channels between nodes). Since neither classical bus-based nor point-to-point architectures can provide scalable solutions and satisfy the tight power and performance requirements of future applications, the Network-on-Chip (NoC) approach has recently been proposed as a promising solution. Indeed, in contrast to the traditional solutions, the NoC approach can provide large bandwidth with moderate area overhead. The selected topology of the component interconnects plays a prime role in the performance of a NoC architecture, as do the routing and switching techniques used. In this paper, we present two generic NoC architectures that can be customized to the specific communication needs of an application in order to reduce the area with minimal degradation of system latency. An experimental study is performed to compare these structures with basic NoC topologies represented by the 2D mesh, Butterfly-Fat Tree (BFT) and SPIN. It is shown that the Cluster Mesh (CMesh) and MinRoot schemes achieve significant improvements in network latency and energy consumption with only negligible area overhead and complexity over existing architectures. In fact, compared with the basic NoC topologies, CMesh and MinRoot provide substantial savings in area as well, because they require fewer routers. The simulation results show that CMesh and MinRoot networks outperform MESH, BFT and SPIN in the main performance metrics.
Abstract: Fault-proneness of a software module is the
probability that the module contains faults. A correlation exists
between the fault-proneness of the software and the measurable
attributes of the code (i.e. the static metrics) and of the testing (i.e.
the dynamic metrics). Early detection of fault-prone software
components enables verification experts to concentrate their time and
resources on the problem areas of the software system under
development. This paper introduces genetic algorithm based
software fault prediction models with object-oriented metrics. The
contribution of this paper is the use of metric values from the JEdit
open-source software to generate rules for classifying software
modules into faulty and non-faulty categories, followed by
empirical validation. The results show that the genetic algorithm
approach can be used for finding fault proneness in object-oriented
software components.
Abstract: Most fingerprint recognition techniques are based on minutiae matching and have been well studied. However, this technology still suffers from problems associated with the handling of poor quality impressions. One problem besetting fingerprint matching is distortion. Distortion changes both geometric position and orientation, and leads to difficulties in establishing a match among multiple impressions acquired from the same fingertip. Marking all the minutiae accurately, as well as rejecting false minutiae, is another issue still under research. Our work combines many methods to build a minutia extractor and a minutia matcher. The combination of multiple methods comes from a wide investigation of research papers. Some novel changes are also used in this work, such as segmentation using morphological operations, improved thinning, false minutiae removal methods, minutia marking with special consideration of triple-branch counting, minutia unification by decomposing a branch into three terminations, and matching in a unified x-y coordinate system after a two-step transformation.
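Minutia marking on a thinned ridge image is conventionally done with the crossing-number test, which is presumably what branch counting builds on here (the abstract does not spell out its marking rule, so this is a standard-technique sketch rather than the paper's exact method):

```python
def crossing_number(window):
    """Crossing number of a 3x3 binary window centered on a ridge pixel:
    half the number of 0/1 transitions along the 8 neighbors in cyclic
    order. CN == 1 marks a ridge termination, CN == 3 a bifurcation."""
    # neighbors in clockwise order starting from the top-left corner
    p = [window[0][0], window[0][1], window[0][2], window[1][2],
         window[2][2], window[2][1], window[2][0], window[1][0]]
    return sum(abs(p[i] - p[(i + 1) % 8]) for i in range(8)) // 2
```

Spurious minutiae (CN values produced by thinning artifacts, short spurs, or breaks) are what the false-minutiae removal step then filters out.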
Abstract: Dynamic bandwidth allocation in EPONs can be
generally separated into inter-ONU scheduling and intra-ONU scheduling. In our previous work, the active intra-ONU scheduling
(AS) utilizes multiple queue reports (QRs) in each report message to cooperate with the inter-ONU scheduling and makes the granted
bandwidth fully utilized without leaving unused slot remainder (USR).
This scheme successfully solves the USR problem originating from the
inseparability of Ethernet frames. However, without a proper setting of the
threshold value in AS, the number of QRs constrained by the IEEE
802.3ah standard is not enough, especially in unbalanced traffic
environments. This limitation may be solved by enlarging the threshold
value. However, a large threshold implies a large gap between adjacent QRs, resulting in a large difference between the ideal granted bandwidth and the actual granted bandwidth. In this paper, we integrate
AS with a cooperative prediction mechanism and distribute multiple
QRs to reduce the penalty brought by the prediction error.
Furthermore, to improve QoS and reduce the usage of queue reports,
the highest priority (EF) traffic arriving during the waiting time is
granted automatically by the OLT and is not included in the requested
bandwidth of the ONU. The simulation results show that the proposed
scheme performs better in terms of bandwidth utilization and average
delay for different classes of packets.
Abstract: This paper proposes a bi-objective model for the
facility location problem under a congestion system. The idea of the
model is motivated by applications such as locating servers in bank
automated teller machines (ATMs), communication networks, and so
on. This model can be specifically considered for situations in which
fixed service facilities are congested by stochastic demand within
queueing framework. We formulate this model with two perspectives
simultaneously: (i) customers and (ii) service provider. The
objectives of the model are to minimize (i) the total expected
travelling and waiting time and (ii) the average facility idle-time.
This model represents a mixed-integer nonlinear programming
problem which belongs to the class of NP-hard problems. To solve
the model, two metaheuristic algorithms, the non-dominated
sorting genetic algorithm (NSGA-II) and the non-dominated
ranking genetic algorithm (NRGA), are proposed. In addition, to
evaluate the performance of the two algorithms, numerical
examples are generated and analyzed using several metrics to determine
which algorithm performs better.
Abstract: A higher-order spline interpolated contour, obtained
by up-sampling homogeneously distributed coordinates, for
segmentation of the kidney region in different classes of ultrasound
kidney images has been developed and is presented in this paper. The
performance of the proposed method is measured and compared with
modified snake model contour, Markov random field contour and
expert-outlined contour. The method is validated against the
expert-outlined contour using the maximum coordinate
distance, Hausdorff distance and mean radial distance
metrics. The results obtained reveal that the proposed scheme provides
an optimum contour that agrees well with the expert-outlined contour.
Moreover, this technique helps to preserve the pixels of interest,
which specifically define the functional characteristics of the kidney. This
opens up various possibilities for implementing a computer-aided
diagnosis system exclusively for US kidney images.
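Two of the validation metrics named above can be sketched directly on contours given as point lists; the toy contours in the usage are illustrative, not data from the paper:

```python
import math

def hausdorff_distance(contour_a, contour_b):
    """Symmetric Hausdorff distance between two contours given as
    lists of (x, y) points: the worst-case nearest-point distance."""
    def directed(ps, qs):
        return max(min(math.dist(p, q) for q in qs) for p in ps)
    return max(directed(contour_a, contour_b), directed(contour_b, contour_a))

def mean_radial_distance(contour, reference):
    """Mean nearest-point distance from each contour point to the
    reference (e.g. expert-outlined) contour."""
    return sum(min(math.dist(p, q) for q in reference)
               for p in contour) / len(contour)
```

The Hausdorff distance penalizes the single worst local deviation, while the mean radial distance summarizes average agreement, so reporting both gives complementary views of contour quality.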
Abstract: Nowadays the population increasingly makes use of
Information Technology (IT). In recent years, the Portuguese
government has increased its focus on using IT to improve
people's lives and has developed a set of measures to enable the
modernization of the Public Administration, thereby reducing the gap
between the Public Administration and citizens. Thus the Portuguese
Government launched the Simplex Program. However, these
SIMPLEX eGov measures, which have been implemented over the
years, present a serious challenge: how to forecast their impact on the
existing Information Systems Architecture (ISA). This research
therefore addresses the problem of automating the evaluation of the
actual impact of implementing eGov simplification and
modernization measures on the Information Systems Architecture. To
realize the evaluation, we propose a Framework supported
by key concepts such as Quality Factors, ISA modeling, a
Multicriteria Approach, Polarity Profiles and Quality Metrics.
Abstract: Software project effort estimation is frequently seen
as complex and expensive for individual software engineers.
Software production is in a crisis. It suffers from excessive costs.
Software production is often out of control. It has been suggested that
software production is out of control because we do not measure.
You cannot control what you cannot measure. During the last decade, a
number of studies on cost estimation have been conducted. The
metric-set selection has a vital role in software cost estimation
studies, but its importance has been ignored, especially in neural network
based studies. In this study we have explored the reasons for those
disappointing results and implemented different neural network
models using augmented new metrics. The results obtained are
compared with previous studies using traditional metrics. To be able
to make comparisons, two types of data have been used. The first
part of the data is taken from the Constructive Cost Model
(COCOMO'81) which is commonly used in previous studies and the
second part is collected according to new metrics in a leading
international company in Turkey. The accuracy of the selected
metrics and the data samples are verified using statistical techniques.
The model presented here is based on Multi-Layer Perceptron
(MLP). Another difficulty associated with the cost estimation studies
is the fact that the data collection requires time and care. To make a
more thorough use of the samples collected, the k-fold cross-validation
method is also implemented. It is concluded that, as long as an
accurate and quantifiable set of metrics is defined and measured
correctly, neural networks can be applied to software cost estimation
studies with success.
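The k-fold protocol mentioned above can be sketched independently of the network itself. The mean-predictor baseline and the MMRE accuracy measure below are illustrative assumptions (the abstract does not name its error measure); the trained MLP would slot in as the `fit` callable:

```python
def k_fold_splits(n_samples, k):
    """Yield (train_indices, test_indices) pairs for k-fold cross
    validation; fold sizes differ by at most one sample."""
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0)
                  for i in range(k)]
    indices = list(range(n_samples))
    start = 0
    for size in fold_sizes:
        test = indices[start:start + size]
        train = indices[:start] + indices[start + size:]
        yield train, test
        start += size

def cross_validated_mmre(x, y, fit, k=3):
    """Mean Magnitude of Relative Error averaged over k folds.
    `fit(train_x, train_y)` returns a predict(sample) callable;
    the MLP model of the paper would go here."""
    errors = []
    for train, test in k_fold_splits(len(x), k):
        predict = fit([x[i] for i in train], [y[i] for i in train])
        for i in test:
            errors.append(abs(y[i] - predict(x[i])) / y[i])
    return sum(errors) / len(errors)
```

Rotating every sample through the test role is what makes the small, hard-to-collect effort datasets go further, which is exactly the motivation the abstract gives.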
Abstract: In order to guarantee secure communication for wireless sensor networks (WSNs), many user authentication schemes have successfully drawn researchers' attention and been studied widely. In 2012, He et al. proposed a robust biometric-based user authentication scheme for WSNs. However, this paper demonstrates that He et al.'s scheme has some drawbacks: a poor reparability problem, a user impersonation attack, and a sensor node impersonation attack.
Abstract: In this paper we compare four content-based objective metrics with the results of subjective tests on 80 video sequences. We also include two objective metrics, VQM and SSIM, in our comparison to serve as “reference” objective metrics because their pros and cons have already been published. Each video sequence was preprocessed by the region recognition algorithm and then the particular objective video quality metrics were calculated, i.e. mutual information, angular distance, moment of angle and the normalized cross-correlation measure. The Pearson coefficient was calculated to express each metric's relationship to the accuracy of the model, and the Spearman rank-order correlation coefficient to represent its relationship to monotonicity. The results show that the model with mutual information as the objective metric provides the best results and is suitable for evaluating the quality of video sequences.
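The two evaluation coefficients used above can be sketched in a few lines; the simple rank assignment below ignores ties, which is adequate for illustration but not for real tied data:

```python
def pearson(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def spearman(x, y):
    """Spearman rank-order correlation: Pearson applied to ranks.
    No tie handling in this sketch."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank
        return r
    return pearson(ranks(x), ranks(y))
```

Pearson measures how linearly the objective scores track the subjective ones (accuracy), while Spearman only asks whether their ordering agrees (monotonicity), which is why both are reported.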
Abstract: Today, building automation is advancing from simple
monitoring and control tasks for lighting and heating towards more
and more complex applications that require a dynamic perception
and interpretation of the different scenes occurring in a building. Current
approaches cannot handle these emerging demands. In this
article, a bionically inspired approach for multimodal, dynamic scene
perception and interpretation is presented, which is based on neuroscientific
and neuro-psychological research findings about the perceptual
system of the human brain. This approach is based on data from diverse
sensory modalities processed in a so-called neuro-symbolic
network. With its parallel structure, and with basic elements that
serve as information processing and storage units at the same time, it
provides a very efficient method for scene perception, overcoming the
problems and bottlenecks of classical dynamic scene interpretation
systems.
Abstract: Prediction of fault-prone modules provides one way to
support software quality engineering. Clustering is used to determine
the intrinsic grouping in a set of unlabeled data. Among various
clustering techniques available in the literature, the K-Means clustering
approach is the most widely used. This paper introduces a K-Means
clustering based approach for finding the fault proneness of
object-oriented systems. The contribution of this paper is the use of
metric values from the JEdit open-source software to generate rules
for categorizing software modules as faulty or non-faulty, followed by
empirical validation. The results are measured in terms of prediction
accuracy, probability of detection, and probability of false alarm.
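The three evaluation measures named above follow directly from the confusion counts of the faulty/non-faulty labeling; the toy label lists in the test are illustrative:

```python
def prediction_metrics(actual, predicted):
    """actual/predicted: lists of 1 (faulty) / 0 (non-faulty) labels.
    Returns (accuracy, probability of detection, probability of
    false alarm)."""
    tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)
    tn = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 0)
    fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)
    fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)
    accuracy = (tp + tn) / len(actual)
    pd = tp / (tp + fn) if tp + fn else 0.0  # detection rate (recall)
    pf = fp / (fp + tn) if fp + tn else 0.0  # false alarm rate
    return accuracy, pd, pf
```

Reporting PD and PF alongside accuracy matters because fault data is typically imbalanced: a classifier predicting "non-faulty" everywhere can score high accuracy while detecting nothing.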
Abstract: Image retrieval is a topic of currently high scientific interest. The important steps in an image retrieval system are the extraction of discriminative features and a feasible similarity metric for retrieving the database images that are similar in content to the search image. Gabor filtering is a widely adopted technique for feature extraction from texture images. The recently proposed sparsity-promoting l1-norm minimization technique finds the sparsest solution of an under-determined system of linear equations. In the present paper, the l1-norm minimization technique is used as a similarity metric in image retrieval. It is demonstrated through simulation results that the l1-norm minimization technique provides a promising alternative to existing similarity metrics. In particular, the cases where the l1-norm minimization technique works better than the Euclidean distance metric are singled out.
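The underlying computation, finding the minimum-l1-norm solution of an under-determined system Ax = b, can be recast as a linear program. Below is a hedged sketch using `scipy.optimize.linprog`; how the paper builds A and b from Gabor features is not specified, so the small system in the test is purely illustrative:

```python
import numpy as np
from scipy.optimize import linprog

def l1_min_solution(A, b):
    """min ||x||_1 subject to Ax = b (basis pursuit), solved as an LP
    over variables z = [x, t]: minimize sum(t) with -t <= x <= t."""
    m, n = A.shape
    c = np.concatenate([np.zeros(n), np.ones(n)])   # objective: sum of t
    I = np.eye(n)
    A_ub = np.block([[I, -I], [-I, -I]])            # x - t <= 0, -x - t <= 0
    b_ub = np.zeros(2 * n)
    A_eq = np.hstack([A, np.zeros((m, n))])         # Ax = b, t unconstrained here
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b,
                  bounds=[(None, None)] * n + [(0, None)] * n)
    return res.x[:n]
```

The resulting l1 norm (or the sparsity pattern of x) would then serve as the similarity score; unlike the Euclidean metric, it rewards search images that are explained by very few database features.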