Abstract: This paper introduces two new algorithms for fuzzy clustering of relational data: FCLARANS, a fuzzy relative of the CLARANS algorithm, and FCMRANS, a fuzzy c-medoids algorithm based on randomized search. Unlike the existing fuzzy c-medoids algorithm (FCMdd), in which the within-cluster dissimilarity of each cluster is minimized at each iteration by recomputing new medoids given the current memberships, FCLARANS minimizes the same objective function by changing the current medoids in such a way that the sum of the within-cluster dissimilarities is minimized. Computing new medoids may be affected by noise, because outliers may enter the computation of the medoids, whereas the choice of medoids in FCLARANS is dictated by the location of a predominant fraction of points inside a cluster and is therefore less sensitive to the presence of outliers. In FCMRANS, the step of computing new medoids in FCMdd is modified to be based on randomized search. Furthermore, a new initialization procedure is developed that adds randomness to the initialization procedure used with FCMdd. Both FCLARANS and FCMRANS are compared with the robust and linearized version of fuzzy c-medoids (RFCMdd). Experimental results with different samples of the Reuters-21578 and 20 Newsgroups (20NG) collections and with generated noisy datasets show that FCLARANS is more robust than both RFCMdd and FCMRANS. Finally, both FCMRANS and FCLARANS are more efficient than RFCMdd, and their outputs are almost the same as that of RFCMdd in terms of classification rate.
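The objective function shared by FCMdd and FCLARANS, and the medoid-swap move that distinguishes FCLARANS, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the membership update follows the standard fuzzy c-medoids formula, and `fclarans_step` performs one randomized swap of the kind described above (function names and the fuzzifier value are assumptions).

```python
import random

import numpy as np

def fuzzy_memberships(D, medoids, m=2.0):
    """Fuzzy membership u[i, j] of object j in the cluster of medoid i,
    computed from the relational dissimilarity matrix D (FCMdd-style)."""
    d = D[medoids, :].astype(float)          # c x n dissimilarities to medoids
    d = np.maximum(d, 1e-12)                 # avoid division by zero at medoids
    inv = d ** (-1.0 / (m - 1.0))
    return inv / inv.sum(axis=0, keepdims=True)

def objective(D, medoids, m=2.0):
    """Sum of fuzzy within-cluster dissimilarities minimized by both FCMdd
    and FCLARANS."""
    u = fuzzy_memberships(D, medoids, m)
    return float((u ** m * D[medoids, :]).sum())

def fclarans_step(D, medoids, m=2.0, rng=random):
    """One randomized-search move: swap a random medoid for a random
    non-medoid object, keeping the swap only if the objective decreases."""
    medoids = list(medoids)
    i = rng.randrange(len(medoids))
    candidate = rng.choice([j for j in range(D.shape[0]) if j not in medoids])
    trial = medoids[:]
    trial[i] = candidate
    return trial if objective(D, trial, m) < objective(D, medoids, m) else medoids
```

Iterating `fclarans_step` yields a monotonically non-increasing objective, since only improving swaps are accepted.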
Abstract: This paper proposes a specialized Web robot that automatically collects objectionable Web content for use in an objectionable Web content classification system, which creates the URL database (DB) of objectionable Web content. The robot aims at shortening the update period of the DB, increasing the number of URLs in the DB, and enhancing the accuracy of the information in the DB.
Abstract: Centrally controlled authentication and authorization services can provide an enterprise with increased security, more flexible access-control solutions, and increased user trust. By using redirections, users of all Web-based applications within an organization are authenticated at a single, well-known, secure Web site over a secure communication protocol. Users are first authenticated at the central server using their domain-wide credentials before being redirected to a particular Web-based application. The central authentication server then provides the specific application with the pertinent authorization particulars and credentials of the authenticated user. Trust between the client and the server hosts is established by a secure session-key exchange. Case studies are provided to demonstrate the usefulness and flexibility of the proposed solution.
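The redirection flow above can be illustrated with a toy signed-ticket exchange. This is a hypothetical sketch, not the paper's protocol: the central server signs the authenticated user's particulars with a key shared with the target application, with HMAC standing in for whatever credential scheme the established session keys actually protect.

```python
import hashlib
import hmac
import time

# Hypothetical key agreed between the central server and one application
# during the secure session-key exchange.
SECRET = b"shared-session-key"

def issue_ticket(user, app, now=None):
    """Central server: after authenticating the user with domain-wide
    credentials, issue a signed ticket for the target application."""
    ts = str(int(time.time() if now is None else now))
    payload = f"{user}|{app}|{ts}"
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return payload + "|" + sig

def verify_ticket(ticket, app, max_age=300, now=None):
    """Application: check signature, intended audience, and freshness;
    return the user name on success, None otherwise."""
    try:
        user, aud, ts, sig = ticket.split("|")
    except ValueError:
        return None
    payload = f"{user}|{aud}|{ts}"
    good = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, good):
        return None
    age = (time.time() if now is None else now) - int(ts)
    if aud != app or not (0 <= age <= max_age):
        return None
    return user
```

The audience field prevents a ticket issued for one Web application from being replayed against another.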
Abstract: Control chart pattern recognition is one of the most important tools for identifying the process state in statistical process control. An abnormal process state can be detected by recognizing the unnatural patterns that arise from assignable causes. In this study, a wavelet-based neural network approach is proposed for the recognition of control chart patterns with various characteristics. The proposed control chart pattern recognizer comprises three stages. First, multi-resolution wavelet analysis is used to generate time-shape and time-frequency coefficients that carry detailed information about the patterns. Second, distance-based features are extracted by a bi-directional Kohonen network to produce reduced and robust information. Third, a back-propagation network classifier is trained on these features. The accuracy of the proposed method is demonstrated by a performance evaluation with numerical results.
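The first stage can be sketched minimally, assuming a Haar basis (the abstract does not name the wavelet family): one transform level splits the signal into pairwise averages (coarse shape) and differences (local fluctuation), and stacking several levels yields the multi-resolution coefficients fed to the later stages.

```python
import numpy as np

def haar_dwt(signal):
    """One level of the Haar wavelet transform: pairwise averages
    (approximation, time-shape) and differences (detail, time-frequency)."""
    x = np.asarray(signal, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d

def multiresolution_features(signal, levels=3):
    """Concatenate detail coefficients over several levels plus the final
    approximation -- a simple stand-in for the multi-resolution coefficients
    passed on to feature extraction."""
    a, feats = np.asarray(signal, dtype=float), []
    for _ in range(levels):
        a, d = haar_dwt(a)
        feats.append(d)
    feats.append(a)
    return np.concatenate(feats)
```

Because the Haar transform is orthonormal, the coefficient vector preserves the signal's energy, so no pattern information is lost before classification.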
Abstract: Biometric techniques are gaining importance for personal authentication and identification compared to traditional authentication methods. Biometric templates are vulnerable to a variety of attacks due to their inherent nature. When a person's biometric is compromised, his identity is lost: in contrast to a password, a biometric is not revocable. Therefore, securing the stored biometric template is crucial. Crypto-biometric systems are authentication systems that blend the ideas of cryptography and biometrics. The fuzzy vault is a proven crypto-biometric construct used to secure biometric templates. However, the fuzzy vault suffers from certain limitations, such as non-revocability and cross-matching, and its security is affected by the non-uniform nature of biometric data. When hardened with a password, the fuzzy vault overcomes these limitations; the password provides an additional layer of security and enhances user privacy. The retina has certain advantages over other biometric traits: retinal scans are used in high-end security applications such as access control to areas or rooms in military installations, power plants, and other high-risk security areas. This work applies the idea of the fuzzy vault to retinal biometric templates. Since multimodal biometric systems perform better than single-modal ones, the proposed multimodal biometric fuzzy vault combines feature points from the retina and the fingerprint. The combined vault is hardened with a user password to achieve a high level of security, which is measured using min-entropy. The proposed password-hardened multi-biometric fuzzy vault is robust against stored-template attacks.
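The core fuzzy-vault construction referenced above can be sketched in toy form over a small prime field (the password-hardening transform and real retina/fingerprint encodings are omitted; the field size, feature values, and function names are illustrative): the secret is the coefficient vector of a polynomial, genuine feature points lie on the polynomial, and random chaff points hide them.

```python
import random

P = 257  # small prime field for the toy example

def poly_eval(coeffs, x):
    """Evaluate the secret polynomial at x over GF(P) (Horner's rule)."""
    y = 0
    for c in reversed(coeffs):
        y = (y * x + c) % P
    return y

def poly_mul(a, b):
    """Multiply two coefficient lists over GF(P)."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] = (out[i + j] + ai * bj) % P
    return out

def lock(secret_coeffs, features, n_chaff=20, rng=random):
    """Bind the secret polynomial to biometric feature points and hide the
    genuine points among chaff points that do not lie on the polynomial."""
    vault = [(x, poly_eval(secret_coeffs, x)) for x in features]
    used = set(features)
    while len(vault) < len(features) + n_chaff:
        x = rng.randrange(P)
        if x in used:
            continue
        y = rng.randrange(P)
        if y != poly_eval(secret_coeffs, x):
            used.add(x)
            vault.append((x, y))
    rng.shuffle(vault)
    return vault

def unlock(vault, query_features, degree):
    """Recover the polynomial by Lagrange interpolation over the vault
    points whose abscissae match the query features."""
    pts = [(x, y) for x, y in vault if x in set(query_features)][: degree + 1]
    if len(pts) < degree + 1:
        return None
    coeffs = [0] * (degree + 1)
    for i, (xi, yi) in enumerate(pts):
        num, den = [1], 1
        for j, (xj, _) in enumerate(pts):
            if i == j:
                continue
            num = poly_mul(num, [-xj % P, 1])
            den = den * ((xi - xj) % P) % P
        scale = yi * pow(den, P - 2, P) % P
        for k, c in enumerate(num):
            coeffs[k] = (coeffs[k] + scale * c) % P
    return coeffs
```

A query whose features overlap the genuine points in at least degree+1 positions recovers the secret exactly; an impostor query matches too few points and fails.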
Abstract: This paper presents a novel two-phase hybrid optimization algorithm with hybrid genetic operators to solve the optimal control problem of a single-stage hybrid manufacturing system. The proposed hybrid real-coded genetic algorithm (HRCGA) is developed in such a way that a simple real-coded GA acts as a base-level search, which quickly directs the search towards the optimal region, and a local search method is then employed for fine tuning. The hybrid genetic operators involved in the proposed algorithm improve both the quality of the solution and the convergence speed. Phase 1 uses a conventional real-coded genetic algorithm (RCGA), while phase 2 employs optimization by direct search with systematic reduction of the size of the search region. A typical numerical example of an optimal control problem, with the number of jobs varying from 10 to 50, is included to illustrate the efficacy of the proposed algorithm. Several statistical analyses are performed to compare the proposed algorithm with the conventional RCGA and PSO techniques; a hypothesis t-test and an analysis of variance (ANOVA) test are also carried out to validate its effectiveness. The results clearly demonstrate that the proposed algorithm not only improves solution quality but also converges to the optimal value faster. It outperforms both the conventional real-coded GA (RCGA) and the efficient particle swarm optimization (PSO) algorithm in the quality of the optimal solution and in convergence to the actual optimum.
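A toy two-phase search in the spirit described above can be sketched as follows, assuming blend crossover with Gaussian mutation for phase 1 and a shrinking coordinate search for phase 2 (the paper's actual operators and direct-search method are not specified in the abstract):

```python
import random

def hybrid_rcga(f, dim, bounds, pop=30, gens=40, rng=random):
    """Two-phase search: phase 1 runs a real-coded GA to locate the
    promising region; phase 2 refines the best point by direct search with
    systematic reduction of the step (search-region) size."""
    lo, hi = bounds
    P = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    for _ in range(gens):                       # phase 1: RCGA
        P.sort(key=f)
        elite = P[: pop // 2]
        kids = []
        while len(kids) < pop - len(elite):
            a, b = rng.sample(elite, 2)
            child = [(ai + bi) / 2 + rng.gauss(0, 0.1 * (hi - lo))
                     for ai, bi in zip(a, b)]   # blend crossover + mutation
            kids.append([min(max(c, lo), hi) for c in child])
        P = elite + kids
    best = min(P, key=f)
    step = 0.1 * (hi - lo)
    while step > 1e-6:                          # phase 2: shrinking local search
        improved = False
        for i in range(dim):
            for delta in (step, -step):
                trial = best[:]
                trial[i] = min(max(trial[i] + delta, lo), hi)
                if f(trial) < f(best):
                    best, improved = trial, True
        if not improved:
            step /= 2.0
    return best, f(best)
```

The GA alone typically stalls near the optimum; the phase-2 refinement is what drives the residual error down by orders of magnitude.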
Abstract: All practical real-time scheduling algorithms for multiprocessor systems present a trade-off between computational complexity and performance. In real-time systems, tasks have to be performed both correctly and on time, and finding a minimal schedule in multiprocessor systems with real-time constraints is known to be NP-hard. Although optimal algorithms exist for uniprocessor systems, they fail when applied to multiprocessor systems, and the practical scheduling algorithms for real-time systems do not have deterministic response times, even though deterministic timing behavior is an important parameter for system robustness analysis. The intrinsic uncertainty in dynamic real-time systems further increases the difficulty of the scheduling problem. To alleviate these difficulties, we propose a fuzzy scheduling approach that arranges real-time periodic and non-periodic tasks in multiprocessor systems. Static and dynamic optimal scheduling algorithms fail under non-critical overload; in contrast, our approach balances the task loads of the processors successfully while considering starvation prevention and fairness, so that higher-priority tasks have a higher running probability. A simulation is conducted to evaluate the performance of the proposed approach. Experimental results show that the proposed fuzzy scheduler creates feasible schedules for homogeneous and heterogeneous tasks and that its handling of task priorities yields higher system utilization and shorter deadline miss times. According to the results, it performs very close to the optimal schedule of uniprocessor systems.
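As a hypothetical illustration of the fuzzy component (the abstract does not give the actual rule base or membership functions), a tiny Mamdani-style inference might map deadline nearness and processor load to a running probability:

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_priority(deadline_nearness, cpu_load):
    """Toy inference: nearer deadlines and lighter processor load raise a
    task's running probability (weighted-average defuzzification). The
    rules and membership shapes here are illustrative assumptions."""
    near = tri(deadline_nearness, 0.4, 1.0, 1.6)
    rules = [
        (min(near, 1.0 - cpu_load), 0.9),        # near deadline, light load -> high
        (min(near, cpu_load), 0.6),              # near deadline, heavy load -> medium
        (1.0 - min(deadline_nearness, 1.0), 0.2) # far deadline -> low
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```

Such a soft priority lets the dispatcher favor urgent tasks without starving the rest, which is the fairness property the abstract describes.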
Abstract: We present a new method to reconstruct a temporally coherent 3D animation from single- or multi-view RGB-D video data using unbiased feature point sampling. Given RGB-D video data in the form of a 3D point cloud sequence, our method first extracts feature points using both color and depth information. In the subsequent steps, these feature points are used to match two 3D point clouds in consecutive frames independently of their resolution. Our new motion-vector-based dynamic alignment method then fully reconstructs a spatio-temporally coherent 3D animation. We perform extensive quantitative validation using novel error functions to analyze the results, and show that despite the limiting factors of temporal and spatial noise associated with RGB-D data, it is possible to exploit temporal coherence to faithfully reconstruct a temporally coherent 3D animation from RGB-D video data.
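A minimal sketch of the frame-to-frame matching step, assuming point clouds stored as rows of [x, y, z, r, g, b] (the paper's feature detector is not specified; only mutual nearest-neighbour matching on a combined position-color distance is shown here):

```python
import numpy as np

def match_features(cloud_a, cloud_b, color_weight=0.5):
    """Match feature points of two consecutive frames by a combined
    position+color distance, keeping only mutual nearest neighbours so the
    matching does not depend on the two clouds having equal resolution."""
    fa = np.hstack([cloud_a[:, :3], color_weight * cloud_a[:, 3:]])
    fb = np.hstack([cloud_b[:, :3], color_weight * cloud_b[:, 3:]])
    d = ((fa[:, None, :] - fb[None, :, :]) ** 2).sum(-1)
    ab = d.argmin(axis=1)          # best match in B for each point of A
    ba = d.argmin(axis=0)          # best match in A for each point of B
    return [(i, j) for i, j in enumerate(ab) if ba[j] == i]
```

The resulting index pairs define the motion vectors that drive the subsequent dynamic alignment.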
Abstract: Computations with precision higher than the IEEE 754 double-precision standard (about 16 significant digits) have recently become necessary. Although software routines for high-precision computation are available in Fortran and C, users are required to install such routines on their own computers and to have detailed knowledge about them. We have constructed a user-friendly online system for octuple-precision computation. In our Web system, users with no knowledge of high-precision computation can easily perform octuple-precision computations by choosing mathematical functions with input argument(s), by writing simple mathematical expression(s), or by uploading C program(s). In this paper we enhance the Web system by adding the facility of uploading Fortran programs, which have been widely used in scientific computing. To this end we construct converter routines in two stages.
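For reference, octuple precision (IEEE binary256) corresponds to roughly 71 significant decimal digits. Python's standard decimal module can emulate arithmetic at that precision, which is a convenient way to sketch the kind of computation such a system performs (the system itself runs Fortran/C routines, not Python):

```python
from decimal import Decimal, getcontext

# Octuple precision (IEEE binary256) carries roughly 71 significant decimal
# digits; emulate it with arbitrary-precision decimal arithmetic.
getcontext().prec = 71

def octuple_sqrt(n):
    """Square root of n to ~71 significant digits via Newton's iteration."""
    x = Decimal(n)
    guess = Decimal(1)
    for _ in range(10):                  # quadratic convergence: ten steps
        guess = (guess + x / guess) / 2  # comfortably exceed 71 digits
    return guess
```

At double precision the same computation would be correct to only about 16 digits; the extra digits are exactly what the octuple-precision service provides.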
Abstract: The conjugate gradient optimization algorithm, usually used for nonlinear least squares, is presented and combined with a modified back-propagation algorithm, yielding a new fast training algorithm for multilayer perceptrons (MLP), called CGFR/AG. The approach presented in this paper consists of three steps: (1) modification of the standard back-propagation algorithm by introducing a gain-variation term for the activation function; (2) calculation of the gradient descent of the error with respect to the weights and gain values; and (3) determination of the new search direction by exploiting the information calculated by gradient descent in step (2) as well as the previous search direction. The proposed method improves the training efficiency of the back-propagation algorithm by adaptively modifying the initial search direction. Its performance is demonstrated by comparison with the conjugate gradient algorithm from a neural network toolbox on chosen benchmarks. The results show that the number of iterations required by the proposed method to converge is less than 20% of that required by the standard conjugate gradient and neural network toolbox algorithms.
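Step (3) above is the classic conjugate-gradient direction update: the new search direction mixes the current negative gradient with the previous direction. For the quadratic case (solving Ax = b, i.e. minimizing 0.5*x'Ax - b'x) it can be sketched as follows; the paper's CGFR/AG variant additionally involves the adaptive gain term, which is omitted here.

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=100):
    """Linear conjugate gradient: each new search direction d is the current
    residual (the negative gradient of 0.5*x'Ax - b'x) plus a multiple of
    the previous direction -- exactly the combination described in step (3)."""
    x = np.zeros_like(b) if x0 is None else x0.astype(float)
    r = b - A @ x            # residual = negative gradient
    d = r.copy()             # first direction: steepest descent
    for _ in range(max_iter):
        rr = r @ r
        if np.sqrt(rr) < tol:
            break
        Ad = A @ d
        alpha = rr / (d @ Ad)        # exact line search along d
        x = x + alpha * d
        r = r - alpha * Ad
        beta = (r @ r) / rr          # Fletcher-Reeves coefficient
        d = r + beta * d             # mix new gradient with old direction
    return x
```

For an n-dimensional symmetric positive-definite system, this converges in at most n iterations, which is the source of the speed-up over plain gradient descent.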
Abstract: Evolutionary algorithms are population-based, stochastic search techniques, widely used as efficient global optimizers. However, many real-life optimization problems require finding optimal solutions to complex, high-dimensional, multimodal problems involving computationally very expensive fitness function evaluations. Using evolutionary algorithms in such problem domains is thus practically prohibitive. An attractive alternative is to build meta-models, i.e., approximations of the actual fitness functions, which are orders of magnitude cheaper to evaluate than the actual function. Many regression and interpolation tools are available for building such meta-models. This paper briefly discusses the architectures and use of such meta-modeling tools in an evolutionary optimization context. We further present two evolutionary algorithm frameworks that use meta-models for fitness function evaluation. The first framework, the Dynamic Approximate Fitness based Hybrid EA (DAFHEA) model [14], reduces computation time by controlled use of meta-models (in this case an approximate model generated by support vector machine regression) to partially replace the actual function evaluation with approximate function evaluation. However, the underlying assumption in DAFHEA is that the training samples for the meta-model are generated from a single uniform model, which does not account for uncertain scenarios involving noisy fitness functions. The second model, DAFHEA-II, an enhanced version of the original DAFHEA framework, incorporates a multiple-model based learning approach for the support vector machine approximator to handle noisy functions [15]. Empirical results obtained by evaluating the frameworks on several benchmark functions demonstrate their efficiency.
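The controlled-use idea behind such frameworks can be sketched generically. The sketch below substitutes a radial-basis-function interpolant for the SVM regression meta-model to stay dependency-free and uses a toy evolutionary loop; the population sizes, budget rule, and names are illustrative, not the published frameworks.

```python
import numpy as np

def rbf_surrogate(X, y, gamma=1.0):
    """Fit a radial-basis-function interpolant to the evaluated points -- a
    dependency-free stand-in for the SVM regression meta-model."""
    K = np.exp(-gamma * ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    w = np.linalg.solve(K + 1e-6 * np.eye(len(X)), y)
    def predict(q):
        k = np.exp(-gamma * ((X - q) ** 2).sum(-1))
        return float(k @ w)
    return predict

def surrogate_ea(expensive_f, dim, pop=20, gens=30, true_evals_per_gen=2, seed=0):
    """Toy evolutionary loop: rank offspring with the cheap surrogate and
    spend the expensive evaluation budget only on the most promising few."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-2, 2, size=(pop, dim))
    y = np.array([expensive_f(x) for x in X])
    for _ in range(gens):
        model = rbf_surrogate(X, y)
        parents = X[rng.integers(0, len(X), size=pop)]
        kids = parents + rng.normal(0.0, 0.2, size=(pop, dim))
        ranked = sorted(kids, key=model)          # cheap surrogate ranking
        for x in ranked[:true_evals_per_gen]:     # expensive truth only here
            X = np.vstack([X, x])
            y = np.append(y, expensive_f(x))
    i = int(np.argmin(y))
    return X[i], float(y[i])
```

Here each generation costs only a handful of true evaluations instead of a full population's worth, which is the source of the claimed savings.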
Abstract: Camera calibration is an indispensable step in augmented reality and image-guided applications where quantitative information must be derived from the images. Usually, a camera calibration is obtained by taking images of a special calibration object and extracting the image coordinates of the projected calibration marks, enabling the calculation of the projection from 3D world coordinates to 2D image coordinates. Such a procedure exhibits typical steps, including feature point localization in the acquired images, camera model fitting, correction of the distortion introduced by the optics, and finally an optimization of the model's parameters. In this paper we propose to extend this list with a further step: identifying the optimal subset of images yielding the smallest overall calibration error. For this, we present a Monte Carlo based algorithm, along with a deterministic extension, that automatically determines the images yielding an optimal calibration. Finally, we present results proving that the calibration can be significantly improved by automated image selection.
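The proposed selection step can be sketched independently of the calibration pipeline itself: given a black-box function that calibrates from a subset of images and returns the overall error, a Monte Carlo search samples subsets and a deterministic greedy swap pass refines the best one (the error function used below is a placeholder, not a real calibration).

```python
import random

def select_images(n_images, calib_error, subset_size, trials=200, rng=random):
    """Monte Carlo search for the image subset with the smallest overall
    calibration error. `calib_error` maps a tuple of image indices to the
    reprojection error of the calibration computed from those images."""
    best, best_err = None, float("inf")
    for _ in range(trials):
        subset = tuple(sorted(rng.sample(range(n_images), subset_size)))
        err = calib_error(subset)
        if err < best_err:
            best, best_err = subset, err
    return best, best_err

def refine(best, n_images, calib_error):
    """Deterministic extension: greedily swap one image at a time for an
    unused one while the calibration error keeps decreasing."""
    best = list(best)
    improved = True
    while improved:
        improved = False
        for i in range(len(best)):
            for cand in set(range(n_images)) - set(best):
                trial = sorted(best[:i] + [cand] + best[i + 1:])
                if calib_error(tuple(trial)) < calib_error(tuple(best)):
                    best, improved = trial, True
                    break
            if improved:
                break
    return tuple(best), calib_error(tuple(best))
```

The Monte Carlo stage avoids enumerating all C(n, k) subsets; the greedy refinement then guarantees a local optimum under single-image swaps.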
Abstract: This article outlines the conceptualization and implementation of an intelligent system capable of extracting knowledge from databases. The use of hybridized features of both Rough and Fuzzy Set theory renders the developed system flexible in dealing with discrete as well as continuous datasets. A raw dataset provided to the system is initially transformed into a computer-legible format, followed by pruning of the dataset. The refined dataset is then processed through various Rough Set operators, which enable the discovery of parameter relationships and interdependencies. The discovered knowledge is automatically transformed into a rule base expressed in fuzzy terms. Two exemplary cancer repository datasets (for breast and lung cancer) have been used to test the proposed framework.
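The Rough Set operators mentioned above rest on lower and upper approximations of a concept with respect to the indiscernibility classes induced by the chosen attributes. A minimal sketch (the system's actual operators, discretization, and fuzzy rule generation go beyond an abstract-level example):

```python
from collections import defaultdict

def approximations(objects, attrs, target):
    """Lower and upper rough-set approximations of a target set of objects.
    `objects` maps object id -> attribute dict; indiscernibility classes
    group objects with identical values on `attrs`."""
    classes = defaultdict(set)
    for oid, row in objects.items():
        classes[tuple(row[a] for a in attrs)].add(oid)
    lower, upper = set(), set()
    for eq in classes.values():
        if eq <= target:
            lower |= eq      # certainly in the concept
        if eq & target:
            upper |= eq      # possibly in the concept
    return lower, upper
```

Objects in the lower approximation generate certain rules, while the boundary (upper minus lower) is where fuzzy, uncertain rules arise, matching the hybrid Rough-Fuzzy design described above.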
Abstract: More and more home videos are being generated with the ever-growing popularity of digital cameras and camcorders. For many home videos, a photo rendering, whether capturing a moment or a scene within the video, provides a complementary representation to the video. In this paper, a video motion mining framework for creative rendering is presented. The user's capture intent is derived by analyzing video motions, and respective metadata is generated for each capture type. The metadata can be used in a number of applications, such as creating video thumbnails, generating panorama posters, and producing slideshows of the video.
Abstract: PPX (Pretty Printer for XML) is a query language that offers a concise way to format XML data into HTML. In this paper, we propose a simple formatting specification that combines automatic layout operators and variables in the layout expression of the GENERATE clause of PPX. This method can automatically format irregular XML data contained in part of an XML document, using layout decision rules that refer to the DTD. In our experiments, a quick comparison shows that PPX requires far less description than XSLT or XQuery programs performing the same tasks.
Abstract: This paper proposes an improved integer frequency offset (IFO) estimation scheme using the P1 symbol for the orthogonal frequency division multiplexing (OFDM) based second-generation terrestrial digital video broadcasting (DVB-T2) system. The proposed IFO estimator is a low-complexity blind estimation scheme implemented with complex additions only. We also propose an active carrier (AC) selection scheme to prevent performance degradation in blind IFO estimation. Simulation results show that, under the AWGN and TU6 channels, the proposed method has lower complexity than the conventional method while achieving almost the same performance.
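A generic blind IFO search (not the paper's exact P1-symbol metric; the carrier positions and sizes below are illustrative) exploits the fact that an integer offset of g bins circularly shifts the subcarrier mapping, so the estimate is the shift that best aligns received energy with the known active-carrier pattern:

```python
import numpy as np

def estimate_ifo(rx_freq, active_carriers, search_range=10):
    """Blind integer frequency offset estimate: try each candidate shift g
    and pick the one that places the most received energy on the known
    active-carrier positions."""
    n = len(rx_freq)
    power = np.abs(rx_freq) ** 2
    best_g, best_metric = 0, -1.0
    for g in range(-search_range, search_range + 1):
        metric = power[(np.asarray(active_carriers) + g) % n].sum()
        if metric > best_metric:
            best_g, best_metric = g, metric
    return best_g
```

Restricting the metric to a well-chosen set of active carriers is what an AC selection scheme tunes: poorly chosen carriers make several shifts score similarly and degrade the estimate.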
Abstract: Text-based games are expected to be low-resource applications that deliver good performance compared to graphics-intensive games. Nowadays, however, some online text-based games do not offer performance acceptable to users. Therefore, an online text-based game called Star_Quest has been developed in order to analyze its behavior under different performance measurements. Performance metrics such as throughput, scalability, response time, and page loading time are captured to characterize the performance of the game. The techniques used to perform the load testing are also disclosed to exhibit the viability of our work. A comparative assessment between the results obtained and the accepted performance levels is conducted to determine the performance level of the game. The study reveals that the developed game manages to meet all the performance objectives set forth.
Abstract: It is known that incorporating prior knowledge into support vector regression (SVR) can help to improve approximation performance. Most research is concerned with incorporating knowledge in the form of numerical relationships; little work, however, has been done to incorporate prior knowledge of the structural relationships among the variables (referred to as Structural Prior Knowledge, SPK). This paper explores the incorporation of SPK into SVR by constructing appropriate admissible support vector kernels (SV kernels) based on the properties of reproducing kernels (R.K.). Three levels of SPK specification are studied, with the corresponding sub-levels of prior knowledge that can be considered by the method: Hierarchical SPK (HSPK); Interactional SPK (ISPK), consisting of independence, global interaction, and local interaction; and Functional SPK (FSPK), composed of exterior-FSPK and interior-FSPK. A convenient tool for describing SPK, the Description Matrix of SPK, is introduced. Subsequently, a new SVR, the Motivated Support Vector Regression (MSVR), whose structure is motivated in part by SPK, is proposed. Synthetic examples show that it is possible to incorporate a wide variety of SPK, and that doing so helps to improve the approximation performance in complex cases. The benefits of MSVR are finally shown on a real-life military application, air-to-ground battle simulation, which demonstrates the great potential of MSVR for complex military applications.
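As a generic illustration of building admissible SV kernels from structural knowledge (the paper's Description Matrix and specific constructions are richer than this), the closure properties of reproducing kernels let one encode independence between variable groups by summing per-group kernels and full interaction by multiplying them:

```python
import numpy as np

def rbf(x, z, gamma=1.0):
    """Gaussian RBF kernel on the selected coordinates."""
    x, z = np.asarray(x, float), np.asarray(z, float)
    return float(np.exp(-gamma * ((x - z) ** 2).sum()))

def structural_kernel(groups, mode="independent"):
    """Compose per-group RBF kernels over the given index groups.
    'independent' sums them (no cross-group interaction); 'interacting'
    multiplies them (full interaction). Sums and products of valid kernels
    are again valid reproducing kernels."""
    def k(x, z):
        parts = [rbf(np.take(x, g), np.take(z, g)) for g in groups]
        return sum(parts) if mode == "independent" else float(np.prod(parts))
    return k
```

Plugging such a composed kernel into a standard SVR restricts the hypothesis space to functions respecting the declared structure, which is the mechanism the abstract describes.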
Abstract: The number of frameworks conceived for e-learning is constantly increasing. Unfortunately, the creators of learning materials and the educational institutions engaged in e-learning adopt a "proprietary" approach, in which the developed products (courses, activities, exercises, etc.) can be exploited only in the framework where they were conceived; using them in other learning environments requires a costly adaptation in terms of time and effort. Each framework proposes courses whose organization, contents, modes of interaction, and presentation are identical for all learners; unfortunately, learners are heterogeneous and are not interested in the same information, but only in services or documents adapted to their needs. The current tendency for e-learning frameworks is interoperability of learning materials. Several standards exist (DCMI (Dublin Core Metadata Initiative) [2], LOM (Learning Objects Metadata) [1], SCORM (Shareable Content Object Reference Model) [6][7][8], ARIADNE (Alliance of Remote Instructional Authoring and Distribution Networks for Europe) [9], CANCORE (Canadian Core Learning Resource Metadata Application Profiles) [3]), and they all converge on the idea of learning objects. They are also concerned with adapting the learning materials to learners' profiles. This article proposes an approach for composing courses adapted to the various profiles (knowledge, preferences, objectives) of learners, based on two ontologies (the domain to teach and an educational ontology) and on learning objects.
Abstract: Many difficulties are faced in the process of learning computer programming. This paper proposes a system framework intended to reduce cognitive load in learning programming. The first section focuses on the process of learning and the shortcomings of current approaches to learning programming. The proposed prototype is then presented along with its justification. In the proposed prototype, the concept map is used as a visualization metaphor; concept maps are similar to the mental schemas in long-term memory and hence can reduce cognitive load effectively. In addition, other methods, such as the part-code method, are also proposed in this framework to reduce cognitive load.