Abstract: In this paper, we propose a sensorless backstepping control of an induction motor (IM) fed by a three-level neutral-point-clamped (NPC) inverter. First, the backstepping approach is designed to steer the flux and speed variables to their references and to compensate for the uncertainties. Lyapunov theory is used to demonstrate that the tracking of the dynamic trajectories is asymptotically stable. Second, we estimate the rotor flux and speed using an adaptive Luenberger observer (ALO). Simulation results are provided to illustrate the performance of the proposed approach at high and low speeds and under load torque disturbance.
Abstract: Cloud computing technology is very useful in present-day life; it uses the Internet and central remote servers to provide and maintain data as well as applications. Such applications can in turn be used by end users via cloud communications without any installation. Moreover, end users' data files can be accessed and manipulated from any other computer using Internet services. Despite the flexibility of data and application access and usage that cloud computing environments provide, many questions remain on how to obtain a trusted environment that protects data and applications in clouds from hackers and intruders. This paper surveys the key generation and management mechanisms and encryption/decryption algorithms used in cloud computing environments, and proposes a new security architecture for the cloud computing environment that addresses the various security gaps as much as possible. A new cryptographic environment that applies quantum mechanics in order to obtain more trusted cloud communications with less computation is also given.
Abstract: A mobile ad hoc network (MANET) is a self-configuring network without any centralized control. The topology of this network is not always defined. The main objective of this paper is to introduce the fundamental concepts of MANETs to researchers and practitioners involved in the modeling and simulation of MANETs. The paper begins with an overview of mobile ad hoc networks, then proceeds to an overview of the routing protocols used in MANETs, their properties, and simulation methods. A brief tabular comparison of the routing protocols is also given, considering different routing protocol parameters. The paper introduces a new routing scheme developed using evolutionary algorithms (EA) and the analytic hierarchy process (AHP), which is used to obtain the optimized output of a MANET. A cryptographic technique, the Caesar cipher, is also employed to make the optimized route secure.
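The Caesar cipher mentioned above is a simple shift substitution cipher. As a minimal sketch (the route string and shift key here are illustrative assumptions, not values from the paper):

```python
# Minimal Caesar cipher sketch: each letter of a (hypothetical) route
# description is shifted by a fixed key before transmission and shifted
# back on receipt. Non-letter characters pass through unchanged.

def caesar_encrypt(text: str, key: int) -> str:
    """Shift every ASCII letter by `key` positions, wrapping at 'z'/'Z'."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + key) % 26 + base))
        else:
            out.append(ch)  # digits, separators, etc. are left as-is
    return ''.join(out)

def caesar_decrypt(text: str, key: int) -> str:
    # Decryption is encryption with the negated key.
    return caesar_encrypt(text, -key)

# Example: encrypting a node list for a hypothetical optimized route.
route = "A-B-C-D"
cipher = caesar_encrypt(route, 3)
print(cipher)                     # → D-E-F-G
print(caesar_decrypt(cipher, 3))  # → A-B-C-D
```

Note that the Caesar cipher offers only obfuscation, not strong security, since there are only 25 non-trivial keys to try.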
Abstract: Improvements in the data fusion and data analysis phase of research are imperative due to the exponential growth of sensed data. Currently, there are developments in the Semantic Sensor Web community to explore efficient methods for reuse, correlation and integration of web-based data sets and live data streams. This paper describes the integration of remotely sensed data with web-available static data for use in observational hypothesis testing and the analysis phase of research. The Semantic Reef system combines semantic technologies (e.g., well-defined ontologies and logic systems) with scientific workflows to enable hypothesis-based research. A framework is presented for how the data fusion concepts from the Semantic Reef architecture map to the Smart Environment Monitoring and Analysis Technologies (SEMAT) intelligent sensor network initiative. The data collected via SEMAT and the knowledge inferred by the Semantic Reef system are ingested into the Tropical Data Hub for data discovery, reuse, curation and publication.
Abstract: The problem of optimal planning of multiple sources of distributed generation (DG) in distribution networks is treated in this paper using an improved Ant Colony Optimization (ACO) algorithm. The objective of this problem is to determine the optimal DG sizes and locations in order to minimize the network real power losses. Considering the multiple sources of DG, both size and location are simultaneously optimized in a single run of the proposed ACO algorithm. The various practical constraints of the problem are taken into consideration in the problem formulation and the algorithm implementation. A radial power flow algorithm for distribution networks is adopted and applied to satisfy these constraints. To validate the proposed technique and demonstrate its effectiveness, the well-known 69-bus feeder standard test system is employed.
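As a rough illustration of how each ant can encode size and location together in one solution, consider this toy sketch. The bus set, discrete size levels, and the stand-in loss function are all made up for illustration; they merely take the place of the paper's radial power flow on the 69-bus feeder:

```python
# Toy ACO sketch (illustrative only): each ant builds a plan of two DG units,
# where each unit is a (bus, size) pair chosen jointly under pheromone
# guidance, so location and size are optimized in a single run.
import random

random.seed(0)
BUSES, SIZES = range(5), [0.5, 1.0, 1.5]   # hypothetical buses / MW levels

def losses(plan):
    # Stand-in for real power losses: minimal when 1.0 MW sits at bus 1 or 3.
    return sum((size - 1.0) ** 2 + min(abs(bus - 1), abs(bus - 3)) * 0.1
               for bus, size in plan)

# One pheromone value per joint (bus, size) choice.
tau = {(b, s): 1.0 for b in BUSES for s in SIZES}

def build_plan():
    choices = list(tau)
    weights = [tau[c] for c in choices]
    return [random.choices(choices, weights)[0] for _ in range(2)]

best, best_loss = None, float("inf")
for _ in range(200):                 # ant iterations
    plan = build_plan()
    loss = losses(plan)
    if loss < best_loss:
        best, best_loss = plan, loss
    for c in tau:                    # pheromone evaporation
        tau[c] *= 0.98
    for c in plan:                   # reinforce proportionally to quality
        tau[c] += 1.0 / (1.0 + loss)

print(best, best_loss)
```

A real implementation would replace `losses` with a radial power flow evaluation and reject plans violating the network constraints.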
Abstract: Hierarchical classification is a problem with applications in many areas, such as protein function prediction, where the data are hierarchically structured. It is therefore necessary to develop algorithms able to induce hierarchical classification models. This paper presents experiments using an algorithm for hierarchical classification called Multi-label Hierarchical Classification using a Competitive Neural Network (MHC-CNN). It was tested on ten datasets from the Gene Ontology (GO) Cellular Component domain. The results are compared with those of Clus-HMC and Clus-HSC using the hF-measure.
Abstract: As the mobile Internet has become widespread in recent years, communication based on mobile networks is increasing. As a result, security threats have been posed by the abnormal traffic of mobile networks; however, mobile security has been handled with a focus on threats posed by mobile malicious code, and research on security threats to the mobile network itself has not attracted much attention. In mobile networks, the IP address of a data packet is a very important factor for billing purposes. If a mobile terminal uses an incorrect IP address that either does not exist or could be assigned to another mobile terminal, the billing policy will cause problems. We monitored and analyzed 3G mobile data network traffic for a period of time and found some abnormal IP packets. In this paper, we analyze the causes of abnormal IP packets on 3G mobile data networks. We also propose an algorithm, based on an IP address table containing the addresses currently in use within the mobile data network, to detect abnormal IP packets.
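The table-based detection idea can be sketched very simply: keep the set of currently assigned addresses up to date and flag any packet whose source is absent from it. The addresses and callback names below are hypothetical, not from the paper:

```python
# Hypothetical sketch of detection via an IP address table: the network
# maintains a table of addresses currently assigned to mobile terminals;
# a packet whose source address is not in the table is flagged as abnormal.

assigned_ips = set()          # addresses currently in use in the network

def on_address_assigned(ip: str) -> None:
    assigned_ips.add(ip)

def on_address_released(ip: str) -> None:
    assigned_ips.discard(ip)

def is_abnormal_packet(src_ip: str) -> bool:
    """A packet is abnormal if its source IP is not currently assigned."""
    return src_ip not in assigned_ips

# Example: 10.0.0.1 is assigned, so a packet from 10.0.0.9 is abnormal.
on_address_assigned("10.0.0.1")
print(is_abnormal_packet("10.0.0.1"))  # → False
print(is_abnormal_packet("10.0.0.9"))  # → True
```

In practice the table must be synchronized with the network's address assignment and release events, which is the hard part in a live 3G network.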
Abstract: Sudoku is a kind of logic puzzle. Each puzzle consists of a board, a 9×9 grid of cells divided into nine 3×3 subblocks, and a set of numbers from 1 to 9. The aim of the puzzle is to fill in every cell of the board with a number from 1 to 9 such that every row, every column, and every subblock contains each number exactly once. Sudoku puzzles belong to the class of NP-complete combinatorial problems. They can be solved using a variety of techniques/algorithms, such as genetic algorithms, heuristics, integer programming, and so on. In this paper, we propose a new approach for solving Sudoku by modelling the puzzles as blocks-world problems. In a blocks-world problem, there are a number of boxes on a table with a particular order or arrangement. The objective of this problem is to change this arrangement into a target arrangement with the help of two types of robots. In this paper, we present three models for Sudoku. We modelled Sudoku as parameterized multi-agent systems. A parameterized multi-agent system is a multi-agent system which consists of several uniform/similar agents, where the number of agents in the system is stated as the parameter of the system. We use the Temporal Logic of Actions (TLA) to formalize our models.
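The constraints stated above (each number exactly once per row, column, and 3×3 subblock) can be checked mechanically. This verifier is our own illustrative sketch, not one of the paper's three TLA models:

```python
# Sketch of a Sudoku solution checker: a 9x9 board is valid iff every row,
# every column, and every 3x3 subblock contains each of 1..9 exactly once.

def is_valid_solution(board):
    """board: 9x9 list of lists of ints 1..9; True iff fully solved."""
    full = set(range(1, 10))
    rows = all(set(row) == full for row in board)
    cols = all({board[r][c] for r in range(9)} == full for c in range(9))
    blocks = all(
        {board[br + r][bc + c] for r in range(3) for c in range(3)} == full
        for br in (0, 3, 6) for bc in (0, 3, 6)
    )
    return rows and cols and blocks

# A classic cyclic-shift construction yields one valid solution.
board = [[(r * 3 + r // 3 + c) % 9 + 1 for c in range(9)] for r in range(9)]
print(is_valid_solution(board))  # → True
```

A solver then only needs to search for an assignment of the empty cells under which this predicate holds.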
Abstract: Similar-document search and document management have an important place in text mining. One of the most important parts of similar-document research is the process of classifying or clustering the documents. In this study, a similar-document search approach that handles the case of documents belonging to multiple categories (the multiple-categories problem) has been carried out. The proposed method, based on Fuzzy Similarity Classification (FSC), has been compared with the Rocchio algorithm and the naive Bayes method, which are widely used in text mining. Empirical results show that the proposed method is quite successful and can be applied effectively. For the second stage, a multiple-categories vector method based on information about how frequently categories are seen together has been used. Empirical results show that performance nearly doubles when the proposed method is compared with the classical approach.
Abstract: In face recognition, feature extraction techniques attempt to find an appropriate representation of the data. However, when the feature dimension is larger than the sample size, performance degrades. Hence, we propose a method called Normalization Discriminant Independent Component Analysis (NDICA). The input data are regularized to obtain the most reliable features and then processed using Independent Component Analysis (ICA). The proposed method is evaluated on three face databases: Olivetti Research Ltd (ORL), Face Recognition Technology (FERET), and Face Recognition Grand Challenge (FRGC). NDICA shows its effectiveness compared with other unsupervised and supervised techniques.
Abstract: The occurrence of missing values in databases is a serious problem for data mining tasks, responsible for degrading data quality and the accuracy of analyses. In this context, the area lacks standardization of experiments for treating missing values, which makes evaluation across different studies difficult because common parameters are not used. This paper proposes a testbed intended to facilitate the implementation of experiments and to provide unbiased parameters, using available datasets and suitable performance metrics, in order to optimize the evaluation and comparison of state-of-the-art missing-value treatments.
Abstract: This paper introduces a framework based on the collaboration of multi-agent systems and hyper-heuristics to find a solution to the real single-machine production problem. Many techniques have been used to solve this problem, each with its own advantages and disadvantages. Through the collaboration of a multi-agent system and hyper-heuristics, a better solution can be obtained. The hyper-heuristics approach operates on a search space of heuristics rather than directly on a search space of solutions. The proposed framework consists of several agents, i.e., a problem agent, a trainer agent, algorithm agents (GPHH, GAHH, and SAHH), an optimizer agent, and a solver agent. The low-level heuristics used in this paper are MRT, SPT, LPT, EDD, LDD, and MON.
Abstract: Today, computer systems are more and more complex and face growing security risks. Security managers need effective security risk assessment methodologies that model the increasing complexity of current computer systems well while keeping the complexity of the assessment procedure low. This paper provides a brief analysis of common security risk assessment methodologies, leading to the selection of a methodology that fulfills these requirements. A detailed analysis of the most effective methodology is then carried out, with numerical examples demonstrating how easy it is to use.
Abstract: Histogram equalization is often used in image enhancement, but it can also be used in auto exposure. However, conventional histogram equalization does not work well when many pixels are concentrated in a narrow luminance range. This paper proposes an auto exposure method based on 2-way histogram equalization. Two cumulative distribution functions are used, where one is accumulated from dark to bright and the other from bright to dark. The proposed auto exposure method is also designed and implemented for image signal processors with full-HD images.
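The two cumulative distributions can be sketched over a 256-bin luminance histogram. This is only an illustration of the two CDFs, not the paper's full auto exposure pipeline; the sample pixel data are made up:

```python
# Sketch of the 2-way cumulative distributions: one accumulated from the
# darkest bin upward (fraction of pixels <= a luminance) and one from the
# brightest bin downward (fraction of pixels >= a luminance).

def histogram(pixels, bins=256):
    h = [0] * bins
    for p in pixels:
        h[p] += 1
    return h

def cdf_dark_to_bright(hist):
    total = sum(hist)
    acc, out = 0, []
    for count in hist:
        acc += count
        out.append(acc / total)
    return out

def cdf_bright_to_dark(hist):
    # Same accumulation, but starting from the brightest bin.
    return list(reversed(cdf_dark_to_bright(list(reversed(hist)))))

# Example: pixels concentrated in a narrow luminance range.
pixels = [100] * 90 + [200] * 10
h = histogram(pixels)
fwd = cdf_dark_to_bright(h)   # fwd[v] = fraction of pixels <= v
bwd = cdf_bright_to_dark(h)   # bwd[v] = fraction of pixels >= v
print(fwd[100], bwd[100])     # → 0.9 1.0
```

An auto exposure controller can then derive an exposure adjustment from where the two curves cross chosen thresholds, rather than from the forward CDF alone.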
Abstract: This research paper presents a framework for building a malware dataset. Many researchers spend a long time cleaning a dataset of noise or transforming it into a format that can be used straight away for testing. Therefore, this research proposes a framework to help researchers speed up the malware dataset cleaning processes, after which the dataset can be used for testing. It is believed that efficient malware dataset cleaning processes can improve the quality of the data and thus help improve the accuracy and efficiency of the subsequent analysis. Apart from that, an in-depth understanding of the malware taxonomy is also important prior to and during the dataset cleaning processes. A new Trojan classification has been proposed to complement this framework. The experiment was conducted in a controlled lab environment using the VxHeavens dataset. The framework is built on the integration of static and dynamic analyses, an incident response method, and knowledge discovery in databases (KDD) processes. It can be used as a basic guideline for malware researchers building malware datasets.
Abstract: Support Vector Domain Description (SVDD) is one of the best-known one-class support vector learning methods, in which one uses balls defined in the feature space to distinguish a set of normal data from all other possible abnormal objects. As with all kernel-based learning algorithms, its performance depends heavily on the proper choice of the kernel parameter. This paper proposes a new approach to selecting the kernel parameter based on maximizing the distance between the gravity centers of the normal and abnormal classes while minimizing the variance within each class. The performance of the proposed algorithm is evaluated on several benchmarks. The experimental results demonstrate the feasibility and effectiveness of the presented method.
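The criterion can be evaluated from kernel values alone: the squared distance between the feature-space gravity centers of classes A and B is mean k(A,A) + mean k(B,B) − 2·mean k(A,B), and for a kernel with k(x,x) = 1 (such as the RBF kernel) the within-class variance is 1 − mean k within the class. The RBF choice, toy data, and candidate grid below are illustrative assumptions, not the paper's setup:

```python
# Sketch of kernel-parameter selection for an RBF kernel: score a candidate
# width sigma by the between-centers distance minus the within-class
# variances, all computed from pairwise kernel evaluations.
import math

def rbf(x, y, sigma):
    return math.exp(-sum((a - b) ** 2 for a, b in zip(x, y)) / (2 * sigma ** 2))

def mean_kernel(X, Y, sigma):
    return sum(rbf(x, y, sigma) for x in X for y in Y) / (len(X) * len(Y))

def score(normal, abnormal, sigma):
    kaa = mean_kernel(normal, normal, sigma)
    kbb = mean_kernel(abnormal, abnormal, sigma)
    kab = mean_kernel(normal, abnormal, sigma)
    center_dist2 = kaa + kbb - 2 * kab     # ||m_normal - m_abnormal||^2
    within = (1.0 - kaa) + (1.0 - kbb)     # k(x,x) = 1 for the RBF kernel
    return center_dist2 - within

def select_sigma(normal, abnormal, candidates):
    # Pick the candidate width that maximizes the separation score.
    return max(candidates, key=lambda s: score(normal, abnormal, s))

# Toy example: two tight, well-separated 2-D clusters.
normal = [(0, 0), (0.1, 0), (0, 0.1)]
abnormal = [(3, 3), (3.1, 3)]
print(select_sigma(normal, abnormal, [0.5, 1, 2]))  # → 1
```

Too small a width collapses all kernel values between distinct points toward 0 (inflating within-class variance), while too large a width blurs the two classes together, so an intermediate width maximizes the score.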
Abstract: Interactive public displays provide an innovative medium for enhanced communication between people and information. However, digital public displays are subject to a few constraints, such as content presentation. Content presentation needs to be developed to be more interesting, so that it attracts people's attention and motivates them to interact with the display. In this paper, we propose an idea for implementing content with interaction elements for a vision-based digital public display. Vision-based techniques are applied as a sensor to detect passers-by, and themed content is suggested to attract their attention and encourage them to interact with the announcement content. Virtual objects, gesture detection, and projection installation are applied to attract the attention of passers-by. A preliminary study showed positive feedback on the interactive content design for the public display. This new trend would be a valuable innovation, as the delivery of announcement content and information communication through this medium proves to be more engaging.
Abstract: This paper discusses the design of a knowledge integration of clinical information extracted from distributed medical ontologies in order to improve a machine-learning-based multi-label coding assignment system. The proposed approach is implemented using a machine learning decision tree technique on university hospital data for patients with Coronary Heart Disease (CHD). The preliminary results obtained show the satisfactory finding that the use of medical ontologies improves overall system performance.