Abstract: The need to merge software artifacts seems inherent
to modern software development. Distribution of development over
several teams and breaking tasks into smaller, more manageable
pieces are effective means of dealing with this kind of complexity. In
each case, the separately developed artifacts need to be assembled as
efficiently as possible into a consistent whole in which the parts still
function as described. In addition, the earlier changes are introduced
into the life cycle, the easier they are for designers to manage.
Interaction-based specifications such as UML sequence diagrams
have been found effective in this regard. As a result, sequence
diagrams can be used not only for capturing system behaviors but
also for merging changes in order to create a new version. The
objective of this paper is to suggest a new approach to the problem of
software merging at the level of sequence diagrams, using dependence
analysis: a concept that formally captures all mappings and differences
between the elements of sequence diagrams and serves as the key to
creating a new version of a sequence diagram.
Abstract: Clustering is the process of grouping objects and data
into clusters so that data objects from the same cluster are similar to
each other. Clustering is one of the main areas of data mining, and
clustering algorithms can be classified into partitioning, hierarchical,
density-based, and grid-based approaches. In this paper, we survey
and review four major hierarchical clustering algorithms: CURE,
ROCK, CHAMELEON, and BIRCH. The resulting state of the art of
these algorithms should help in eliminating their current problems as
well as in deriving more robust and scalable clustering algorithms.
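For orientation, the following is a minimal sketch of the basic agglomerative scheme that the four surveyed algorithms refine, using SciPy; the toy data, Ward linkage, and cluster count are illustrative assumptions rather than details from the surveyed papers.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Two well-separated Gaussian blobs as toy data.
points = np.vstack([rng.normal(0, 0.3, (20, 2)),
                    rng.normal(3, 0.3, (20, 2))])

Z = linkage(points, method='ward')               # bottom-up merge tree (dendrogram)
labels = fcluster(Z, t=2, criterion='maxclust')  # cut the tree into 2 clusters
print(labels)
```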
Abstract: Recently, numerous documents including large
volumes of unstructured data and text have been created because of the
rapid increase in the use of social media and the Internet. Usually,
these documents are categorized for the convenience of users.
However, the accuracy of manual categorization is not guaranteed, and
such categorization requires a large amount of time and incurs huge costs.
Many studies on automatic categorization have been conducted to help
mitigate the limitations of manual categorization. Unfortunately, most
of these methods cannot be applied to categorize complex documents
with multiple topics because they work on the assumption that
individual documents can be categorized into single categories only.
Therefore, to overcome this limitation, some studies have attempted to
categorize each document into multiple categories. However, the
learning process employed in these studies involves training using a
multi-categorized document set. These methods therefore cannot be
applied to the multi-categorization of most documents unless
multi-categorized training sets, built using traditional
multi-categorization algorithms, are provided. To overcome this
limitation, in this study, we
review our novel methodology for extending the category of a
single-categorized document to multiple categories, and then
introduce a survey-based verification scenario for estimating the
accuracy of our automatic categorization methodology.
Abstract: The modelling of the processes of building an information
system to support multimodal freight transportation is discussed,
based on modern CASE technologies. The functional efficiencies of
ports in the eastern part of the Black Sea are analyzed, taking into
account their ecological, seasonal, and resource-usage parameters. By
resources, we mean the capacities of berths, cranes, and automotive
transport, as well as work crews and neighbouring airports. To design
the database of the computer support system for the managerial
(logistics) function, the use of an Object-Role Modeling (ORM) tool
(NORMA, Natural ORM Architecture) is proposed, after which an
Entity-Relationship Model (ERM) is generated in an automated
process. The software is developed on the basis of process-oriented
and service-oriented architectures in the Visual Studio .NET environment.
Abstract: In this paper, we present an optimization technique, or
learning algorithm, for a hybrid architecture combining two of the
most popular sequence recognition models: Recurrent Neural
Networks (RNNs) and Hidden Markov Models (HMMs). To improve
sequence/pattern recognition and classification performance through
a hybrid neural-symbolic approach, a gradient descent learning
algorithm is developed that uses Real-Time Recurrent Learning
(RTRL) for recurrent neural networks to process the knowledge
represented in trained Hidden Markov Models. The developed hybrid
algorithm is implemented with automata theory as a sample test bed,
and its performance is demonstrated and evaluated on learning
deterministic finite state automata.
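As background for the RTRL component, the sketch below trains a small NumPy recurrent network with Real-Time Recurrent Learning on a toy deterministic finite automaton (accept binary strings containing an even number of 1s). The network size, learning rate, and task are assumptions for illustration, not the paper's hybrid RNN-HMM setup.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 8, 1                        # hidden units, input size (toy scale)
W = rng.normal(0, 0.3, (n, n))     # recurrent weights
U = rng.normal(0, 0.3, (n, m))     # input weights
b = np.zeros(n)
v = rng.normal(0, 0.3, n)          # readout weights
c, lr = 0.0, 0.2
I = np.eye(n)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(4000):
    xs = rng.integers(0, 2, rng.integers(2, 9))   # random binary string
    target = 1.0 - (xs.sum() % 2)                 # accept iff even number of 1s
    h = np.zeros(n)
    # RTRL sensitivities P[k, ...] = d h_k / d theta, carried forward in time.
    PW = np.zeros((n, n, n)); PU = np.zeros((n, n, m)); Pb = np.zeros((n, n))
    for x in xs:
        xv = np.array([float(x)])
        h_new = np.tanh(W @ h + U @ xv + b)
        d = 1.0 - h_new ** 2
        PW = d[:, None, None] * (np.einsum('kl,lij->kij', W, PW)
                                 + np.einsum('ki,j->kij', I, h))
        PU = d[:, None, None] * (np.einsum('kl,lij->kij', W, PU)
                                 + np.einsum('ki,j->kij', I, xv))
        Pb = d[:, None] * (W @ Pb + I)
        h = h_new
    y = sigmoid(v @ h + c)                        # acceptance probability
    g = 2.0 * (y - target) * y * (1.0 - y)        # dLoss/dz at the readout
    W -= lr * g * np.einsum('k,kij->ij', v, PW)
    U -= lr * g * np.einsum('k,kij->ij', v, PU)
    b -= lr * g * (v @ Pb)
    v -= lr * g * h
    c -= lr * g

# Evaluate on fresh random strings. Convergence is not guaranteed at this
# toy scale; the sketch illustrates the update rule, not the paper's results.
correct = 0
for _ in range(200):
    xs = rng.integers(0, 2, rng.integers(2, 9))
    h = np.zeros(n)
    for x in xs:
        h = np.tanh(W @ h + U @ np.array([float(x)]) + b)
    correct += int((sigmoid(v @ h + c) > 0.5) == (xs.sum() % 2 == 0))
print("accuracy:", correct / 200)
```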
Abstract: In the context of handwriting recognition, we propose an
offline system for the recognition of the handwritten Arabic names of
the Algerian departments. The study is based mainly on the
evaluation of the performance of a neural network trained with the
gradient backpropagation algorithm. The parameters used to form the
input vector of the neural network are extracted from binary images
of the handwritten word by several methods: the distribution
parameters, the centered moments of the different projections of the
different segments, the centered moments of the word image coded
according to the Freeman directions, and the Barr features applied to
the binary image of the word and to its different segments. The
classification is achieved by a multilayer perceptron. A detailed
experiment is carried out, and satisfactory recognition results are
reported.
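To illustrate one of the feature families above, the sketch below computes the centered moments of a binary word image; preprocessing, segmentation, and Freeman coding are omitted, and the stand-in image is an assumption.

```python
import numpy as np

def centered_moments(img, max_order=3):
    # Centered (translation-invariant) moments mu_pq of a binary image.
    ys, xs = np.nonzero(img)
    x_bar, y_bar = xs.mean(), ys.mean()
    return {(p, q): ((xs - x_bar) ** p * (ys - y_bar) ** q).sum()
            for p in range(max_order + 1)
            for q in range(max_order + 1) if 2 <= p + q <= max_order}

word = np.zeros((32, 96), dtype=np.uint8)
word[12:20, 10:80] = 1                      # stand-in for a handwritten word
print(centered_moments(word)[(2, 0)])       # horizontal spread of the stroke
```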
Abstract: This paper evaluates accrual-based scheduling for the
cloud in single- and multi-resource systems. Numerous organizations
benefit from cloud computing by hosting their applications in it. The
cloud model provides the needed access to computing with potentially
unlimited resources. Scheduling is the mapping of tasks to resources
according to some optimality principle. A scheduler assigns tasks to
virtual machines in accordance with adaptable timing, in sequence,
under transaction logic constraints. A good scheduling algorithm
improves CPU utilization, turnaround time, and throughput. In this
paper, three real-time cloud service scheduling algorithms for single
and multiple resources are investigated. Experimental results show
that the resource-matching algorithm performs better for both single-
and multi-resource scheduling than the benefit-first scheduling,
migration, and checkpoint algorithms.
Abstract: Throughout history, people have made estimates and
inferences about the future by using their past experiences.
Developing information technologies and improvements in database
management systems make it possible to extract useful information
from the knowledge at hand for strategic decisions, and different
methods have been developed for this purpose. Data mining by
association rule learning is one such method. The Apriori algorithm,
one of the well-known association rule learning algorithms, is not
commonly used on spatio-temporal data sets. However, it is possible
to embed time and space features into the data sets and make the
Apriori algorithm a suitable data mining technique for learning
spatio-temporal association rules. Lake Van, the largest lake of
Turkey, is a closed basin. This feature causes the volume of the lake
to increase or decrease as the amount of water it holds changes. In
this study, the evaporation, humidity, lake altitude, rainfall, and
temperature parameters recorded in the Lake Van region over the
years are used by the Apriori algorithm, and a spatio-temporal data
mining application is developed to identify overflows and newly
formed soil regions (underflows) occurring in the coastal parts of
Lake Van. Identifying the possible reasons for overflows and
underflows may be used to alert experts to take precautions and make
the necessary investments.
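A minimal sketch of the frequent-itemset core of Apriori over discretized spatio-temporal records follows; the items, records, and support threshold are made-up illustrations, not the study's Lake Van data.

```python
# Toy spatio-temporal transactions: each record is a set of discretized
# items (attribute=level tags plus season and shore-zone tags).
records = [
    {"rain=high", "evap=low", "season=spring", "zone=north", "level=rise"},
    {"rain=high", "evap=low", "season=spring", "zone=east", "level=rise"},
    {"rain=low", "evap=high", "season=summer", "zone=north", "level=fall"},
    {"rain=high", "evap=low", "season=spring", "zone=north", "level=rise"},
    {"rain=low", "evap=high", "season=summer", "zone=east", "level=fall"},
]

def apriori(records, min_support=0.4):
    n = len(records)
    support = lambda iset: sum(iset <= r for r in records) / n
    level = {frozenset([i]) for r in records for i in r}
    level = {s for s in level if support(s) >= min_support}
    frequent = set(level)
    while level:
        # Candidate generation: join frequent k-itemsets into (k+1)-itemsets.
        cands = {a | b for a in level for b in level if len(a | b) == len(a) + 1}
        level = {c for c in cands if support(c) >= min_support}
        frequent |= level
    return frequent

for iset in sorted(apriori(records), key=len):
    print(sorted(iset))
```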
Abstract: Software testing has become a mandatory process for
assuring software product quality. Hence, test management is needed
to manage the test activities conducted in the software test life cycle.
This paper discusses the challenges faced in the software test life
cycle and how the test processes and test activities, mainly test case
creation, test execution, and test reporting, are managed and
automated using several test automation tools, namely Jira, Robot
Framework, and Jenkins.
Abstract: Maintaining the factory-default battery endurance over
time while supporting a huge number of running applications on
energy-restricted mobile devices has created a new challenge for
mobile application developers. While trying to meet customers’
unlimited expectations, developers are barely aware of how efficiently
the application itself uses energy. Thus, developers need a set of valid
energy consumption indicators to assist them in developing
energy-saving applications. In this paper, we present a few software
product metrics that can be used as indicators to measure the energy
consumption of Android-based mobile applications at the early
design stage. In particular, Trepn Profiler (a power profiling tool for
Qualcomm processors) was used to collect data on mobile application
power consumption, which was then analyzed against the 23 software
metrics in this preliminary study. The results show that McCabe
cyclomatic complexity, number of parameters, nested block depth,
number of methods, weighted methods per class, number of classes,
total lines of code, and method lines of code have a direct relationship
with the power consumption of mobile applications.
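As an illustration of the kind of metric-versus-power analysis described, the sketch below computes a Pearson correlation between one static metric and measured power; the numbers are invented, whereas the study used Trepn Profiler measurements.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-app values: a static code metric vs. measured average power.
cyclomatic = np.array([3, 7, 4, 12, 9, 15, 6, 11])
power_mw = np.array([210, 340, 250, 480, 400, 530, 300, 450])

r, p = pearsonr(cyclomatic, power_mw)
print(f"Pearson r = {r:.2f} (p = {p:.3f})")
```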
Abstract: To reduce cost, digital cameras use a single image sensor
to capture color images. A Color Filter Array (CFA) permits only one
of the three primary (red, green, blue) colors to be sensed at each
pixel, and the two missing components are interpolated through a
method named demosaicking. The captured data is interpolated into a
full color image and then compressed in applications. Color
interpolation before compression leads to data redundancy. This
paper proposes a new Vector Quantization (VQ) technique that
constructs a VQ codebook with the Differential Evolution (DE)
algorithm. The new technique is compared to the conventional
Linde-Buzo-Gray (LBG) method.
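For reference, a minimal sketch of the conventional LBG baseline follows (the proposed method replaces this iterative refinement with Differential Evolution); the block dimensions, codebook size, and data are illustrative assumptions.

```python
import numpy as np

def lbg(vectors, codebook_size, iters=20):
    rng = np.random.default_rng(0)
    codebook = vectors[rng.choice(len(vectors), codebook_size, replace=False)]
    for _ in range(iters):
        # Nearest-codeword assignment, then centroid update (k-means style).
        d = ((vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
        nearest = d.argmin(1)
        for k in range(codebook_size):
            members = vectors[nearest == k]
            if len(members):
                codebook[k] = members.mean(0)
    return codebook

# Toy usage: quantize random 4x4 image blocks (flattened to 16-dim vectors).
blocks = np.random.default_rng(1).random((500, 16))
cb = lbg(blocks, codebook_size=16)
print(cb.shape)   # (16, 16): 16 codewords of dimension 16
```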
Abstract: In this paper, we describe an application for face
recognition. Many studies have used local descriptors to characterize
a face, but the performance of these local descriptors remains lower
than that of global descriptors (which work on the entire image). The
application of local descriptors (cutting the image into blocks) should
retain the advantages of both global and local methods in the Discrete
Cosine Transform (DCT) domain. This system uses neural network
techniques. The latter method provides a good compromise between
the two approaches in terms of simplicity of calculation and
classification performance. Finally, we compare our results with those
obtained with other conventional local and global approaches.
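A minimal sketch of block-wise DCT feature extraction of the kind described (cutting the image into blocks and keeping a few low-frequency coefficients per block) is shown below; the block size, coefficient count, and scan order are illustrative assumptions.

```python
import numpy as np
from scipy.fft import dctn

def block_dct_features(img, block=8, keep=6):
    h, w = img.shape
    feats = []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            coeffs = dctn(img[y:y + block, x:x + block], norm='ortho')
            # Keep the first few coefficients in row-major order
            # (a full zigzag scan is the usual choice).
            feats.extend(coeffs.ravel()[:keep])
    return np.array(feats)

face = np.random.default_rng(0).random((64, 64))   # stand-in for a face image
print(block_dct_features(face).shape)              # (64 blocks * 6 coeffs,)
```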
Abstract: Detecting changes in multiple images of the same
scene has recently seen increased interest due to the many
contemporary applications including smart security systems, smart
homes, remote sensing, surveillance, medical diagnosis, weather
forecasting, speed and distance measurement, post-disaster forensics
and much more. These applications differ in the scale, nature, and
speed of change. This paper presents an application of image
processing techniques to implement a real-time change detection
system. Change is identified by comparing the RGB representations of
two consecutive frames captured in real time. The detection threshold
can be controlled to account for various luminance levels. The
comparison result is passed through a filter before decision making to
reduce false positives, especially at lower luminance conditions. The
system is implemented with a MATLAB Graphical User Interface
with several controls to manage its operation and performance.
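The described system is built in MATLAB; the NumPy sketch below illustrates the same core idea, thresholding the RGB difference of consecutive frames and filtering the mask before the decision. The threshold and filter size are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import median_filter

def detect_change(prev_frame, curr_frame, threshold=30, filt=5):
    # Per-pixel RGB distance between consecutive frames.
    diff = np.abs(curr_frame.astype(int) - prev_frame.astype(int)).sum(axis=2)
    mask = diff > threshold                                # raw change mask
    return median_filter(mask.astype(np.uint8), size=filt) > 0  # de-noised mask

rng = np.random.default_rng(0)
f1 = rng.integers(0, 256, (120, 160, 3), dtype=np.uint8)
f2 = f1.copy()
f2[40:60, 50:80] = 255                                     # synthetic moving patch
print(detect_change(f1, f2).sum(), "changed pixels")
```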
Abstract: The Obturator Foramen is a specific structure in pelvic
bone images, and its recognition is a new concept in medical image
processing. Moreover, the segmentation of bone structures such as
the Obturator Foramen plays an essential role in clinical research in
orthopedics. In this paper, we present a novel method to analyze the
similarity between the substructures of the imaged region and a
hand-drawn template as a preprocessing step for the computation of
pelvic bone rotation on hip radiographs. The method consists of the
integrated use of marker-controlled Watershed segmentation and the
Zernike moment feature descriptor, and it is used to detect the
Obturator Foramen accurately. Marker-controlled Watershed
segmentation is applied to separate the Obturator Foramen from the
background effectively. Then, the Zernike moment feature descriptor
is used to match the binary template image against the segmented
binary image for the final extraction of the Obturator Foramina.
Finally, the pelvic bone rotation rate for each hip radiograph is
calculated automatically to select and eliminate hip radiographs for
further studies that depend on pelvic bone angle measurements. The
proposed method is tested on 100 randomly selected hip radiographs.
The experimental results demonstrate that the proposed method is
able to segment the Obturator Foramen with 96% accuracy.
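A minimal sketch of the template-matching step follows, comparing Zernike moments of a binary template and a segmented binary region with the mahotas library (the watershed stage is omitted); the radius, degree, and stand-in images are assumptions.

```python
import numpy as np
import mahotas

def zernike_distance(template, region, radius=32, degree=8):
    zt = mahotas.features.zernike_moments(template, radius, degree=degree)
    zr = mahotas.features.zernike_moments(region, radius, degree=degree)
    return np.linalg.norm(zt - zr)   # smaller distance = more similar shapes

rng = np.random.default_rng(0)
template = (rng.random((64, 64)) > 0.5).astype(np.uint8)  # stand-in template
region = (rng.random((64, 64)) > 0.5).astype(np.uint8)    # stand-in segment
print(zernike_distance(template, region))
```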
Abstract: Underwater acoustic networks have attracted great
attention in the last few years because of their numerous applications.
A high data rate can be achieved by efficiently modeling the physical
layer in the network protocol stack. In the acoustic medium, the
propagation speed of acoustic waves depends on many parameters
such as temperature, salinity, density, and depth. Acoustic
propagation speed cannot be modeled using standard empirical
formulas such as the Urick and Thorp descriptions. In this paper, we
have modeled the acoustic channel using real-time temperature,
salinity, and speed data for the Bay of Bengal (Indian coastal region).
We have modeled the acoustic channel by using the Mackenzie speed
equation and real-time data obtained from the National Institute of
Oceanography and Technology. It is found that the acoustic
propagation speed varies between 1503 m/s and 1544 m/s as
temperature and depth vary. The simulation results show that
temperature, salinity, and depth play a major role in acoustic
propagation, and the data rate increases with appropriate data sets
substituted into the simulated model.
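For concreteness, the Mackenzie (1981) nine-term sound speed equation applied here can be written as follows; the sample inputs are illustrative, not the measured Bay of Bengal data.

```python
def mackenzie_speed(T, S, D):
    # T: temperature (deg C), S: salinity (parts per thousand), D: depth (m).
    return (1448.96 + 4.591 * T - 5.304e-2 * T**2 + 2.374e-4 * T**3
            + 1.340 * (S - 35) + 1.630e-2 * D + 1.675e-7 * D**2
            - 1.025e-2 * T * (S - 35) - 7.139e-13 * T * D**3)

print(mackenzie_speed(T=28.0, S=33.0, D=50.0))   # ~1540 m/s in warm shallow water
```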
Abstract: In this paper, genetic-algorithm-based test data
compression is targeted at improving the compression ratio and
reducing the computation time. The genetic algorithm is based on
extended pattern run-length coding. The test set contains a large
number of X values that can be effectively exploited to improve test
data compression. In this coding method, a reference pattern is set
and its compatibility is checked. For this process, a genetic algorithm
is proposed to reduce the computation time of the encoding
algorithm. This coding technique encodes 2^n compatible patterns or
inversely compatible patterns into a single test data segment or
multiple test data segments. The experimental results show that the
compression ratio is improved and the computation time is reduced.
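A minimal sketch of the compatibility test that extended pattern run-length coding relies on is given below: two test cubes are compatible if they agree wherever neither holds an X (don't-care), and inversely compatible if they disagree there. The example cubes are made up.

```python
def compatible(a, b, invert=False):
    for x, y in zip(a, b):
        if x == 'X' or y == 'X':
            continue                      # a don't-care matches anything
        if (x != y) != invert:            # mismatch (or match, when inverted)
            return False
    return True

print(compatible("01XX10", "0101X0"))               # True: X fills the gaps
print(compatible("01XX10", "10XX01", invert=True))  # True: bitwise inverse
```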
Abstract: Brain-Computer Interfaces (BCIs) measure brain signal
activity, intentionally or unintentionally induced by users, and
provide a communication channel that does not depend on the brain’s
normal output pathway of peripheral nerves and muscles. Feature
Selection (FS) is a global optimization machine learning problem
that reduces the number of features and removes irrelevant and noisy
data while achieving acceptable recognition accuracy. It is a vital step
affecting the performance of a pattern recognition system. This study
presents a new Binary Particle Swarm Optimization (BPSO) based
feature selection algorithm. A Multi-Layer Perceptron Neural
Network (MLPNN) classifier with the backpropagation and
Levenberg-Marquardt training algorithms classifies the selected
features.
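A minimal sketch of BPSO-based feature selection follows. Each particle is a bit mask over features, and velocities pass through a sigmoid to give bit probabilities; the fitness function here is a simple correlation-based stand-in, whereas the study scores subsets with the MLPNN classifier.

```python
import numpy as np

rng = np.random.default_rng(0)
n_feats, n_parts, iters = 12, 10, 50
X = rng.normal(size=(100, n_feats))
y = X[:, 0] + X[:, 3] + rng.normal(size=100)   # only features 0 and 3 matter

def fitness(mask):
    if not mask.any():
        return -np.inf
    # Stand-in score: mean |correlation| with the target minus a size penalty.
    corr = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in np.where(mask)[0]]
    return np.mean(corr) - 0.01 * mask.sum()

pos = rng.integers(0, 2, (n_parts, n_feats)).astype(bool)
vel = np.zeros((n_parts, n_feats))
pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_f.argmax()].copy()

for _ in range(iters):
    r1, r2 = rng.random(vel.shape), rng.random(vel.shape)
    vel = (0.7 * vel
           + 1.5 * r1 * (pbest.astype(int) - pos.astype(int))
           + 1.5 * r2 * (gbest.astype(int) - pos.astype(int)))
    pos = rng.random(vel.shape) < 1.0 / (1.0 + np.exp(-vel))  # sigmoid sampling
    f = np.array([fitness(p) for p in pos])
    better = f > pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[pbest_f.argmax()].copy()

print("selected features:", np.where(gbest)[0])
```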
Abstract: Polymorphism is one of the main pillars of the
object-oriented paradigm. It induces hidden forms of class
dependencies which may impact software quality, resulting in a
higher cost factor for comprehending, debugging, testing, and
maintaining the software. In this paper, a new cognitive complexity
metric called Cognitive Weighted Polymorphism Factor (CWPF) is
proposed. Apart from the software's structural complexity, it includes
the cognitive complexity on the basis of type. The cognitive weights
are calibrated based on 27 empirical studies with 120 persons. A case
study and experimentation with the new software metric show
positive results. Further, a comparative study is made, and the
correlation test proves that the CWPF complexity metric is a better,
more comprehensive, and more realistic indicator of software
complexity than Abreu's Polymorphism Factor (PF) complexity metric.
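For reference, a sketch of Abreu's Polymorphism Factor, the baseline CWPF is compared against, is given below, assuming the standard MOOD definition (overriding methods divided by the maximum possible number of overrides); the example class data is invented.

```python
def polymorphism_factor(classes):
    # classes: list of (methods_overriding, methods_new, descendant_count).
    overrides = sum(mo for mo, _, _ in classes)
    possible = sum(mn * dc for _, mn, dc in classes)
    return overrides / possible if possible else 0.0

system = [
    (0, 4, 2),   # base class: 4 new methods, 2 descendants
    (2, 1, 0),   # subclass overriding 2 inherited methods
    (3, 0, 0),   # subclass overriding 3 inherited methods
]
print(f"PF = {polymorphism_factor(system):.2f}")   # 5 / 8 = 0.62
```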
Abstract: This paper analyzes the ranking of the University of
Malaysia Terengganu (UMT) website in the World Wide Web. Only a
few studies have compared the rankings of universities' websites, so
this research will be able to determine whether the existing UMT
website is serving its purpose, which is to introduce UMT to the
world. The ranking is based on hub and authority values, which are in
accordance with the structure of the website. These values are
computed using two web-searching algorithms, HITS and SALSA.
Three other universities' websites are used as benchmarks: UM,
Harvard, and Stanford. The result clearly shows that more work has
to be done on the existing UMT website, where important pages,
according to the benchmarks, do not exist among UMT's pages. The
ranking of the UMT website will act as a guideline for the web
developer to develop a more efficient website.
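A minimal sketch of the HITS hub/authority iteration used for the ranking follows (SALSA applies a similar iteration to a normalized link matrix); the tiny link graph is made up for illustration.

```python
import numpy as np

# Adjacency matrix A: A[i, j] = 1 if page i links to page j.
A = np.array([[0, 1, 1, 0],
              [0, 0, 1, 0],
              [1, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)

hubs = np.ones(4)
for _ in range(50):                      # power iteration until convergence
    auths = A.T @ hubs                   # good authorities are linked by good hubs
    auths /= np.linalg.norm(auths)
    hubs = A @ auths                     # good hubs link to good authorities
    hubs /= np.linalg.norm(hubs)

print("authority scores:", auths.round(3))
print("hub scores:      ", hubs.round(3))
```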
Abstract: As computing technology advances, smartphone
applications can assist student learning in a pervasive way. For
example, the idea of using the mobile apps PA Common Trees, Pests,
and Pathogens in the field as reference tools allows middle school
students to learn about trees and associated pests/pathogens without
bringing a textbook. While working on the development of these three heterogeneous mobile
apps, we ran into numerous challenges. Both the traditional waterfall
model and the more modern agile methodologies failed in practice.
The waterfall model emphasizes the planning of the duration for each
phase. When the duration of each phase is not consistent with the
availability of developers, the waterfall model cannot be employed.
When applying Agile Methodologies, we cannot maintain the high
frequency of the iterative development review process, known as
‘sprints’. In this paper, we discuss the challenges and solutions. We
propose a hybrid model known as the Relay Race Methodology to
reflect the concept of racing and relaying during the process of
software development in practice. Based on the development project,
we observe that the modeling of the relay race transition between any
two phases is manifested naturally. Thus, we claim that the RRM
model can provide a de facto rather than a de jure basis for the core
concept in the software development model. In this paper, the background of the project is introduced first.
Then, the challenges are pointed out followed by our solutions.
Finally, the lessons learned and future work are presented.