Abstract: Ant colony optimization (ACO) is an algorithmic framework inspired by the foraging behavior of ant colonies. ACO algorithms use chemical communication, represented by pheromone trails, to construct good solutions. Real ants, however, interact through several communication channels. This paper therefore introduces acoustic communication between ants while they forage. This mechanism enables fine, local exploration of the search space and allows the best-found solution to be improved.
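The pheromone mechanism named above can be sketched in generic form. The update rule below (evaporation plus deposit) is a standard ACO ingredient, not the paper's specific algorithm; the parameter names `rho` (evaporation rate) and `Q` (deposit constant) are assumptions for illustration:

```python
# Minimal sketch of a generic ACO pheromone update (illustrative only;
# `rho` and `Q` are assumed names, not taken from the paper).
def update_pheromone(tau, tours, costs, rho=0.1, Q=1.0):
    """Evaporate all trails, then deposit pheromone along each ant's tour.

    tau: symmetric pheromone matrix (list of lists)
    tours: each tour is a list of city indices, treated as a cycle
    costs: tour lengths, one per ant
    """
    n = len(tau)
    # Evaporation: tau <- (1 - rho) * tau on every edge.
    for i in range(n):
        for j in range(n):
            tau[i][j] *= (1.0 - rho)
    # Deposit: each ant adds Q / cost on the edges of its tour.
    for tour, cost in zip(tours, costs):
        for a, b in zip(tour, tour[1:] + tour[:1]):
            tau[a][b] += Q / cost
            tau[b][a] += Q / cost
    return tau
```

Shorter tours deposit more pheromone per edge, which biases later ants toward good edges while evaporation keeps old trails from dominating.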
Abstract: In this paper we propose a novel Run Time Interface (RTI) technique to provide an efficient environment for MPI jobs on the heterogeneous architecture of PARAM Padma. It suggests an innovative, unified framework for the job management interface system in parallel and distributed computing. The approach employs a proxy scheme. The implementation shows that the proposed RTI is highly scalable and stable. Moreover, the RTI provides storage access for MPI jobs on various operating system platforms and improves data access performance through the high-performance C-DAC Parallel File System (C-PFS). The performance of the RTI is evaluated using standard HPC benchmark suites, and the simulation results show that the proposed RTI performs well on large-scale supercomputing systems.
Abstract: The purpose of this work is the development of an automatic classification system that could assist radiologists in the investigation of breast cancer. The software has been designed in the framework of the MAGIC-5 collaboration. In the automatic classification system, suspicious regions with a high probability of containing a lesion are extracted from the image as regions of interest (ROIs). Each ROI is characterized by features based on morphological lesion differences. Several classifiers, namely a feed-forward neural network, a k-nearest-neighbours classifier and a support vector machine, are used to distinguish pathological records from healthy ones. The results, in terms of sensitivity (percentage of pathological ROIs correctly classified) and specificity (percentage of non-pathological ROIs correctly classified), are presented through the Receiver Operating Characteristic (ROC) curve. In particular, the best performance, an area under the ROC curve of 88% ± 1, is obtained with the feed-forward neural network.
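The two quoted metrics and the ROC summary can be computed from confusion counts; the sketch below is a generic illustration of these standard definitions (the trapezoidal AUC assumes ROC points sorted by false-positive rate), not the paper's evaluation code:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

def auc(points):
    """Trapezoidal area under a ROC curve given (FPR, TPR) points
    sorted by increasing FPR, starting at (0,0) and ending at (1,1)."""
    area = 0.0
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        area += (x1 - x0) * (y0 + y1) / 2.0
    return area
```

For example, 88 of 100 pathological ROIs and 90 of 100 healthy ROIs correctly classified give a sensitivity of 0.88 and a specificity of 0.90.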
Abstract: A wireless sensor network is a multi-hop, self-configuring wireless network consisting of sensor nodes. The deployment of wireless sensor networks in many application areas, e.g., aggregation services, requires self-organization of the network nodes into clusters. An efficient way to enhance the lifetime of the system is to partition the network into distinct clusters, each with a high-energy node as cluster head. Various node clustering techniques have appeared in the literature; they fall roughly into two families: those based on the construction of a dominating set and those based solely on energy considerations. Energy-optimized cluster formation for a set of randomly scattered wireless sensors is presented. Sensors within a cluster are expected to communicate with the cluster head only. The energy constraints and limited computing resources of the sensor nodes present the major challenges in gathering the data. In this paper we propose a framework to study how partially correlated data affect the performance of clustering algorithms. The total energy consumption and network lifetime can be analyzed by combining random geometry techniques and rate-distortion theory. We also present the relation between compression distortion and data correlation.
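A minimal sketch of the energy-based clustering idea described above: pick the k highest-energy nodes as cluster heads and attach every other sensor to its nearest head. This is a generic illustration under assumed data structures, not the paper's optimization:

```python
import math

# Illustrative sketch (not the paper's algorithm): choose the k
# highest-energy nodes as cluster heads, then attach every other
# sensor to its nearest head.
def form_clusters(nodes, k):
    """nodes: list of dicts with 'pos' (x, y) and 'energy' keys.
    Returns a dict mapping each head (by id) to its member list."""
    heads = sorted(nodes, key=lambda n: n["energy"], reverse=True)[:k]
    clusters = {id(h): [h] for h in heads}
    for node in nodes:
        if node in heads:
            continue
        nearest = min(heads, key=lambda h: math.dist(h["pos"], node["pos"]))
        clusters[id(nearest)].append(node)
    return clusters
```

Since members talk only to their head, head selection by residual energy spreads the relaying burden onto the nodes best able to carry it.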
Abstract: In today's economy, plant engineering faces many challenges. For instance, intensifying competition in this business is leading to cost competition and the need for a shorter time-to-market. To remain competitive, companies need to make their businesses more profitable by implementing improvement programs such as standardization projects, but for various reasons they have difficulty tapping their full economic potential. One such reason is the non-holistic planning and implementation of standardization projects. This paper describes a new conceptual framework, the layer model. The model combines and expands existing proven approaches in order to improve the design, implementation and management of standardization projects. Based on a holistic approach, it helps to systematically analyze the effects of standardization projects on different business layers and enables companies to better seize the opportunities offered by standardization.
Abstract: In the framework of image compression by wavelet transforms, we propose a perceptual method that incorporates Human Visual System (HVS) characteristics in the quantization stage. Indeed, the human eye does not have equal sensitivity across the frequency bandwidth. The clarity of the reconstructed images can therefore be improved by weighting the quantization according to the Contrast Sensitivity Function (CSF), which minimizes visual artifacts at low bit rates. To evaluate our method, we use the Peak Signal-to-Noise Ratio (PSNR) and a new evaluation criterion that takes visual factors into account. The experimental results show that our technique improves image quality at the same compression ratio.
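The PSNR metric cited above has a standard definition; the sketch below computes it for images given as flat pixel lists (the paper's own evaluation pipeline is of course more elaborate):

```python
import math

def psnr(original, reconstructed, peak=255.0):
    """Peak Signal-to-Noise Ratio in dB between two equal-size images
    given as flat lists of pixel values (peak = maximum pixel value)."""
    mse = sum((a - b) ** 2 for a, b in zip(original, reconstructed)) / len(original)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(peak * peak / mse)
```

PSNR is purely pixel-wise, which is exactly why the abstract pairs it with a perceptual criterion: two reconstructions with equal PSNR can look very different to the eye.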
Abstract: Enterprise Architecture (EA) is a framework for the description, coordination and alignment of all activities across an organization in order to achieve strategic goals using ICT enablers. A number of EA-compatible frameworks have been developed. In this paper we focus mainly on the Federal Enterprise Architecture Framework (FEAF), since it offers a rich set of reference models; among these models we are interested in its business reference model (BRM). The test process is an important part of an EA project that is somewhat overlooked. This lack of attention may cause drawbacks or even the failure of an enterprise architecture project. To address this issue, we use the International Software Testing Qualifications Board (ISTQB) framework and standard test suites to present a method for improving the EA testing process. The main challenge is how to relate the concepts of EA to those of ISTQB. In this paper, we propose a method for integrating these concepts.
Abstract: Fingerprint-based identification is one of the best-known biometric systems in the area of pattern recognition and has long been studied for its important role in forensic science, where it can help the government criminal justice community. In this paper, we propose a framework for identifying individuals by means of fingerprints. Unlike most conventional fingerprint identification frameworks, the extracted geometrical element features (GEFs) go through a discretization process. The intention of discretization in this study is to obtain individually unique features that reflect inter-individual variation and thus discriminate one person from another. Previously, discretization has proved particularly effective for identification from English handwriting, with an accuracy of 99.9%, and for discriminating twins' handwriting, with an accuracy of 98%. Owing to its high discriminative power, this method is adopted into the framework as an independent step to assess the accuracy of fingerprint identification. The experimental results show that the identification accuracy of the proposed system using discretization is 100% for FVC2000, 93% for FVC2002 and 89.7% for FVC2004, which is much better than the conventional, existing fingerprint identification system (72% for FVC2000, 26% for FVC2002 and 32.8% for FVC2004). The results indicate that the discretization approach effectively boosts classification performance and therefore appears suitable for other biometric features besides handwriting and fingerprints.
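Discretization in general maps continuous feature values to a small set of bin indices. The sketch below shows the simplest equal-width variant as an illustration of the idea; the paper's actual discretization scheme for GEFs may well differ:

```python
# Hedged sketch of equal-width discretization (illustrative only;
# the paper's discretization of GEFs may use a different scheme).
def discretize(values, n_bins, lo=None, hi=None):
    """Map each continuous value to a bin index in [0, n_bins - 1]."""
    lo = min(values) if lo is None else lo
    hi = max(values) if hi is None else hi
    width = (hi - lo) / n_bins
    bins = []
    for v in values:
        if width == 0:
            bins.append(0)  # degenerate range: everything in bin 0
        else:
            # Clamp so the maximum value lands in the last bin.
            bins.append(min(int((v - lo) / width), n_bins - 1))
    return bins
```

Binning trades fine-grained precision for stability: small measurement noise that stays within a bin no longer changes the feature vector, which is what makes discretized features discriminative across individuals.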
Abstract: A website plays a significant role in the success of an e-business. It is the main starting point of any organization or corporation for its customers, so it is important to customize and design it according to the visitors' preferences. Websites are also a place to introduce the services of an organization and to highlight new services to visitors and audiences. In this paper, we apply web usage mining techniques, a new field of research in data mining and knowledge discovery, to an Iranian government website. Using the results, a framework for web content layout is proposed. An agent is designed to dynamically update and improve the locations and layout of web links. We then explain how it is used to enable top managers of the organization to directly influence the arrangement of web contents, and to enhance the customization of website navigation according to online users' behavior.
Abstract: With the popularity of multi-core and many-core architectures, there is a great demand for software frameworks that can support parallel programming methodologies. In this paper we introduce JConqurr, an Eclipse toolkit that is easy to use and provides robust support for flexible parallel programming. JConqurr is a multi-core and many-core programming toolkit for Java capable of supporting common parallel programming patterns, including task, data, divide-and-conquer and pipeline parallelism. The toolkit uses annotations and a directive mechanism to convert sequential code into parallel code. In addition, we propose a novel mechanism to achieve parallelism using graphics processing units (GPUs). Experiments with common parallelizable algorithms have shown that our toolkit can be used easily and efficiently to convert sequential code to parallel code, and that significant performance gains can be achieved.
Abstract: A data warehouse is a dedicated database used for querying and reporting. Queries in this environment show special characteristics, such as multidimensionality and aggregation. Exploiting the nature of these queries, in this paper we propose a query-driven design framework. The proposed framework is general and allows a designer to generate a schema based on a set of queries.
Abstract: A registration framework for image-guided robotic surgery is proposed for three emergency neurosurgical procedures, namely Intracranial Pressure (ICP) monitoring, External Ventricular Drainage (EVD) and evacuation of a Chronic Subdural Haematoma (CSDH). The registration paradigm uses CT and white light as modalities. This paper presents two simulation studies as a preliminary evaluation of the registration protocol: (1) the loci of the Target Registration Error (TRE) in the patient's axial, coronal and sagittal views were simulated based on a Fiducial Localisation Error (FLE) of 5 mm, and (2) the actual framework was simulated using projected views from a surface-rendered CT model to represent white-light images of the patient. Craniofacial features were employed as the registration basis to map the CT space onto the simulated intraoperative space. Photogrammetry experiments on an artificial skull were also performed to benchmark the results obtained from the second simulation. The results of both simulations show that the proposed protocol can provide 5 mm accuracy for these neurosurgical procedures.
Abstract: An application framework provides a reusable design and implementation for a family of software systems. Application developers extend the framework to build their particular applications using hooks, the places identified to show how to use and customize the framework. Hooks define Framework Interface Classes (FICs) and their possible specifications, which helps in building reusable test cases for the implementations of these classes. In applications developed using gray-box frameworks, FICs inherit framework classes or use them without inheritance. In this paper, a test-case generation technique is extended to build test cases for FICs written for gray-box frameworks. A tool has been developed to automate the proposed technique.
Abstract: The growing outsourcing of logistics services, resulting from the ongoing drive in firms for cost reduction and increased efficiency, means that it is becoming more and more important for the companies doing the outsourcing to carry out a proper evaluation.
The multiple definitions and measures of logistics service performance found in research on the topic create a certain degree of confusion and do not clear the way towards the proper measurement of that performance. Do a model and a specific set of indicators exist that can be considered appropriate for measuring the performance of outsourced logistics services in industrial environments? Are these indicators in keeping with the objectives pursued by outsourcing? We aim to answer these and other research questions in a study we have initiated within the framework of the international High Performance Manufacturing (HPM) project, of which this paper forms part.
As the first stage of this research, this paper reviews articles on the topic published in the last 15 years, with the aim of detecting the models most used for this measurement and determining which performance indicators are proposed as part of those models and which are most used. First steps are also taken towards determining whether these indicators, financial and operational, cover the aims pursued when outsourcing logistics services.
The findings show a wide variety of both models and indicators in use. This would seem to confirm the need to continue our research in order to propose a model and a set of indicators for measuring the performance of outsourced logistics services in industrial environments.
Abstract: Encryption protects communication partners from disclosure of their secret messages but cannot prevent traffic analysis and the leakage of information about "who communicates with whom". In the presence of collaborating adversaries, this linkability of actions can endanger anonymity. However, reliably providing anonymity is crucial in many applications. Especially in context-aware mobile business, where mobile users equipped with PDAs request and receive services from service providers, providing anonymous communication is mission-critical and challenging at the same time. Firstly, the limited performance of mobile devices does not allow for heavy use of the expensive public-key operations commonly used in anonymity protocols. Moreover, security demands depend on the application (e.g., mobile dating vs. a pizza delivery service), and different users (e.g., a celebrity vs. an ordinary person) may even require different security levels for the same application. Considering both the hardware limitations of mobile devices and the differing sensitivities of users, we propose an anonymity framework that is dynamically configurable according to user and application preferences. Our framework is based on Chaum's mix-net. We explain the proposed framework, its configuration parameters for dynamic behavior, and the algorithm to enforce dynamic anonymity.
Abstract: This research studies the value-creation process of producing monks' bowls, a traditional Thai handicraft that is facing problems in adapting to a changing society. It also aims to identify problems and obstacles to value creation. The research is based on a case study of manufacturers of monks' bowls from Ban Baat Village, Bangkok. The conceptual framework is based on the value-chain model, which is used to analyze the process. The research methodology is qualitative. The research found that the value-creation process for monks' bowls consists of eight activities that add value to the products and, in return, increase profits for the producers. Five major problems and obstacles were found. The research suggests that these problems and obstacles limit the manufacturers' potential for creating higher-valued products and lead to business stagnation. They should be addressed and solved through collaboration among the government, the private sector and the manufacturers.
Abstract: This paper describes a new method for affine parameter estimation between image sequences. Usually, parameter estimation is performed by least squares with a quadratic cost. However, this technique is sensitive to the presence of outliers, so parameter estimation techniques for image processing applications must be robust enough to withstand the influence of outliers. Robust estimation functions requiring non-quadratic and possibly non-convex potentials, adopted from the statistics literature, have therefore been used for this purpose. To address the optimization of the error function so as to find a globally optimal solution, the minimization can begin with a convex estimator at the coarser level and gradually introduce non-convexity, i.e., move from soft to hard redescending non-convex estimators as the iteration reaches the finer levels of the multiresolution pyramid. The performance of the proposed method is compared with the results obtained individually using two different estimators.
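The soft-to-hard progression described above can be illustrated with iteratively reweighted least squares on a toy location-estimation problem. This is a generic sketch of the graduated idea, not the paper's method; the Huber and Tukey tuning constants are conventional values, assumed here:

```python
# Illustrative sketch of coarse-to-fine robust estimation via IRLS.
# Huber is convex ("soft": downweights outliers but never rejects),
# Tukey's biweight is non-convex redescending ("hard": rejects them).
def huber_weight(r, k=1.345):
    """Huber weight: 1 inside [-k, k], then decays as k/|r|."""
    return 1.0 if abs(r) <= k else k / abs(r)

def tukey_weight(r, c=4.685):
    """Tukey biweight: zero weight for residuals beyond c."""
    if abs(r) >= c:
        return 0.0
    t = 1.0 - (r / c) ** 2
    return t * t

def robust_mean(data, weight_fns=(huber_weight, tukey_weight), iters=10):
    """Estimate a location parameter: run IRLS first with the convex
    Huber weight, then refine with the redescending Tukey weight."""
    s = sorted(data)
    mu = s[len(s) // 2]  # median as a robust starting point
    for weight in weight_fns:
        for _ in range(iters):
            w = [weight(x - mu) for x in data]
            total = sum(w)
            if total == 0:
                break
            mu = sum(wi * xi for wi, xi in zip(w, data)) / total
    return mu
```

Starting with the convex estimator keeps the early iterations from locking onto a bad local minimum; the redescending estimator then removes the residual pull of gross outliers.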
Abstract: In this paper, a joint source-channel coding (JSCC) scheme for time-varying channels is presented. The proposed scheme uses a hierarchical framework for both the source encoder and transmission via QAM modulation. Hierarchical joint source-channel codes with hierarchical QAM constellations are designed to track the channel variations, which yields higher throughput by adapting certain receiver parameters to the channel variation. We consider the problem of still-image transmission over time-varying channels with channel state information (CSI) available (1) at the receiver only and (2) at both transmitter and receiver. We describe an algorithm that optimizes hierarchical source codebooks by minimizing the distortion due to source quantization and channel impairments. Simulation results on image transmission show that the proposed hierarchical system outperforms conventional schemes based on a single modulator and channel-optimized source coding.
Abstract: This article presents a computationally tractable probabilistic model for the relation between the complex wavelet coefficients of two images of the same scene. The two images are acquired at distinct moments in time, from distinct viewpoints, or by distinct sensors. By means of the introduced probabilistic model, we argue that the similarity between the two images is controlled not by the values of the wavelet coefficients, which can be altered by many factors, but by the nature of the wavelet coefficients, which we model with the help of hidden state variables. We integrate this probabilistic framework into the construction of a new image registration algorithm. The algorithm has sub-pixel accuracy and is robust to noise and to other variations such as local illumination changes. We present the performance of our algorithm on various image types.
Abstract: An appropriate project delivery system (PDS) is crucial to the success of a construction project. Case-based reasoning (CBR) is a useful support for PDS selection. However, the traditional CBR approach represents cases as attribute-value vectors without taking relations among attributes into consideration, and cannot calculate similarity when the structures of cases are not strictly the same. This paper therefore solves the problem by adopting the relational case-based reasoning (RCBR) approach for PDS selection, considering both structural similarity and feature similarity. To develop the feature terms of construction projects, the criteria and factors governing the PDS selection process are first identified; feature terms for the construction projects are then developed. Finally, the mechanism of similarity calculation and a case study show how RCBR works for PDS selection. The adoption of RCBR in PDS selection expands the scope of application of the traditional CBR method and improves the accuracy of the PDS selection system.
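The two similarity components named above must ultimately be combined into one retrieval score. A common way to do this is a convex combination; the sketch below illustrates that generic idea only, and the weight `alpha` is an assumption, not a value from the paper:

```python
# Generic sketch of combining two similarity components into one score;
# `alpha` is an assumed mixing weight, not taken from the paper.
def combined_similarity(structural_sim, feature_sim, alpha=0.5):
    """Overall similarity in [0, 1] as a convex combination of the
    structural similarity and the feature similarity (both in [0, 1])."""
    return alpha * structural_sim + (1.0 - alpha) * feature_sim
```

With `alpha` near 1 retrieval favors cases whose relational structure matches the query project; with `alpha` near 0 it favors matching attribute values.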