Towards the Development of a Solution for Business Process-Oriented Data Analysis

This paper proposes a modeling methodology for the development of data analysis solutions. The author introduces an approach to address data warehousing issues at the enterprise level. The methodology covers the requirements elicitation and analysis stage as well as the initial design of the data warehouse. The paper reviews an extended business process model that satisfies the needs of data warehouse development. The author considers the use of business process models necessary, as they reflect both enterprise information systems and business functions, which are important for data analysis. The described approach divides development into three steps with different levels of model elaboration, and it makes it possible to gather requirements and present them to business users in an accessible manner.

Using Automated Database Reverse Engineering for Database Integration

An important problem in today's organizations is the existence of non-integrated information systems, inconsistency, and a lack of suitable correlation between legacy and modern systems. One main solution is to transfer the local databases into a global one. In this regard, we need to extract the data structures from the legacy systems and integrate them with the new technology systems. In legacy systems, huge amounts of data are stored in legacy databases. They require particular attention since they need more effort to be normalized, reformatted, and moved to modern database environments. Designing the new integrated (global) database architecture and applying reverse engineering require data normalization. This paper proposes the use of database reverse engineering to integrate legacy and modern databases in organizations. The suggested approach consists of methods and techniques for generating the data transformation rules needed for data structure normalization.
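
As a minimal illustration of the reverse-engineering step (the paper's own rule-generation techniques are not detailed in the abstract), the following Python sketch recovers table structure from a SQLite legacy store and emits simple column-renaming transformation rules. The file name, naming map, and rule format are hypothetical:

```python
import sqlite3

def extract_schema(db_path):
    """Recover table and column structure from a legacy database file."""
    conn = sqlite3.connect(db_path)
    schema = {}
    for (table,) in conn.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table'"):
        # PRAGMA table_info rows are (cid, name, type, notnull, default, pk).
        cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
        schema[table] = [(c[1], c[2]) for c in cols]
    conn.close()
    return schema

def generate_rename_rules(schema, naming_map):
    """Emit rules mapping legacy column names to normalized target names."""
    rules = []
    for table, cols in schema.items():
        for name, ctype in cols:
            target = naming_map.get(name, name.lower())
            rules.append((table, name, target, ctype))
    return rules

# Hypothetical usage: 'CUSTNO' is an assumed legacy column name.
rules = generate_rename_rules(extract_schema("legacy.db"),
                              {"CUSTNO": "customer_id"})
```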

Real-Time Hand Tracking and Gesture Recognition System Using Neural Networks

This paper introduces a hand gesture recognition system that recognizes gestures in real time in unconstrained environments. Efforts should be made to adapt computers to our natural means of communication: speech and body language. A simple and fast algorithm using orientation histograms is developed; it recognizes a subset of MAL static hand gestures. The pattern recognition system uses a transform that converts an image into a feature vector, which is then compared with the feature vectors of a training set of gestures. The final system is a perceptron implementation in MATLAB. The paper includes experiments on 33 hand postures and discusses the results. The experiments show that the system achieves an average recognition rate of 90% and is suitable for real-time applications.
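
A sketch of the orientation-histogram transform described above, written in Python rather than the paper's MATLAB, and with a nearest-neighbour matcher standing in for the perceptron classifier:

```python
import numpy as np

def orientation_histogram(image, bins=36):
    """Convert a grayscale image into an orientation-histogram feature vector."""
    gy, gx = np.gradient(image.astype(float))
    magnitude = np.hypot(gx, gy)
    angle = np.arctan2(gy, gx)                 # edge orientation in [-pi, pi]
    hist, _ = np.histogram(angle, bins=bins, range=(-np.pi, np.pi),
                           weights=magnitude)  # weight votes by edge strength
    return hist / (hist.sum() + 1e-9)          # normalize the feature vector

def nearest_gesture(feature, templates):
    """Match a feature vector against stored (label, vector) templates."""
    return min(templates, key=lambda t: np.linalg.norm(feature - t[1]))[0]

# Hypothetical usage with random arrays standing in for camera frames:
templates = [("fist", orientation_histogram(np.random.rand(64, 64)))]
label = nearest_gesture(orientation_histogram(np.random.rand(64, 64)), templates)
```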

Deniable Authentication Protocol Resisting Man-in-the-Middle Attack

Deniable authentication is a protocol that not only enables a receiver to identify the source of a received message but also prevents a third party from identifying that source. The protocol proposed in this paper makes use of bilinear pairings over elliptic curves, as well as the Diffie-Hellman key exchange protocol. Besides the security properties shared with previous authentication protocols, the proposed protocol provides the same level of security with smaller public key sizes.
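
The pairing-based construction is not detailed in the abstract, but the Diffie-Hellman exchange it builds on is standard. A toy sketch of plain Diffie-Hellman over integers modulo a prime (illustrative parameters only, not the elliptic-curve groups used in the paper):

```python
import secrets

# Toy public parameters; real deployments use much larger groups or
# elliptic curves, as in the paper's pairing-based setting.
p = 0xFFFFFFFFFFFFFFC5   # the largest 64-bit prime, for illustration only
g = 5

a = secrets.randbelow(p - 2) + 1   # Alice's secret exponent
b = secrets.randbelow(p - 2) + 1   # Bob's secret exponent
A = pow(g, a, p)                   # Alice publishes g^a mod p
B = pow(g, b, p)                   # Bob publishes g^b mod p

# Both sides derive the same shared secret g^(ab) mod p.
assert pow(B, a, p) == pow(A, b, p)
```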

Improvement of a Label Extraction Method for a Risk Search System

This paper proposes a method for improving classification efficiency in a classification model. The model is used in a risk search system and extracts specific labels from articles posted on bulletin board sites, allowing the system to analyze the important discussions composed of those articles. The improvement introduces ensemble learning methods that combine multiple classification models, and it incorporates expressions related to the specific labels into the generation of word vectors. The paper applies the method to articles collected from three bulletin board sites selected by users and verifies its effectiveness.
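
As a sketch of the ensemble idea (the abstract does not name the base classifiers, so the models and toy data below are assumptions), scikit-learn's VotingClassifier can combine several models over word vectors:

```python
from sklearn.ensemble import VotingClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Hypothetical training data: bulletin board articles with risk labels.
texts = ["server outage reported", "great weather today",
         "data breach rumour", "lunch menu posted"]
labels = ["risk", "other", "risk", "other"]

ensemble = make_pipeline(
    TfidfVectorizer(),   # word vectors; label-related expressions could be
                         # injected here as additional features
    VotingClassifier([
        ("nb", MultinomialNB()),
        ("lr", LogisticRegression(max_iter=1000)),
        ("svm", LinearSVC()),
    ], voting="hard"),   # majority vote across the three models
)
ensemble.fit(texts, labels)
print(ensemble.predict(["possible information leak"]))
```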

Collaboration of Multi-Agent and Hyper-Heuristics Systems for Production Scheduling Problem

This paper introduces a framework based on the collaboration of multi-agent systems and hyper-heuristics to solve a real single-machine production scheduling problem. Many techniques have been used to solve this problem, each with its own advantages and disadvantages. By combining a multi-agent system with hyper-heuristics, a solution closer to the optimum can be obtained. The hyper-heuristics approach operates on a search space of heuristics rather than directly on a search space of solutions. The proposed framework consists of several agents, i.e. a problem agent, a trainer agent, algorithm agents (GPHH, GAHH, and SAHH), an optimizer agent, and a solver agent. The low-level heuristics used in this paper are MRT, SPT, LPT, EDD, LDD, and MON.
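
A minimal sketch of three of the low-level heuristics named above (SPT, LPT, EDD) as dispatching rules on a single machine, with hypothetical job data; choosing among such rules is the kind of decision the hyper-heuristic layer makes:

```python
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    processing_time: int
    due_date: int

# Each low-level heuristic is simply a different sort key.
RULES = {
    "SPT": lambda j: j.processing_time,    # shortest processing time first
    "LPT": lambda j: -j.processing_time,   # longest processing time first
    "EDD": lambda j: j.due_date,           # earliest due date first
}

def schedule(jobs, rule):
    return sorted(jobs, key=RULES[rule])

def total_tardiness(sequence):
    """One objective a hyper-heuristic might minimize when picking rules."""
    t = tardiness = 0
    for job in sequence:
        t += job.processing_time
        tardiness += max(0, t - job.due_date)
    return tardiness

jobs = [Job("A", 4, 6), Job("B", 2, 4), Job("C", 6, 12)]
best_rule = min(RULES, key=lambda r: total_tardiness(schedule(jobs, r)))
```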

Regular Data Broadcasting Plan with Grouping in Wireless Mobile Environment

The broadcast problem, including the design of the broadcast plan, is considered. The data are inserted and numbered in a predefined order into relations of customized size. The server's ability to create a full, regular broadcast plan (RBP) with single and multiple channels after some data transformations is examined. The Regular Geometric Algorithm (RGA) prepares an RBP and enables users to catch their items while avoiding energy waste on their devices. Moreover, the Grouping Dimensioning Algorithm (GDA), based on integrated relations, can guarantee the discrimination of services with a minimum number of channels. This last property, together with self-monitoring and self-organizing, can be offered by servers today, providing channel availability and lower energy consumption through a smaller number of channels. Simulation results are provided.

A Visual Educational Modeling Language to Help Teachers in Learning Scenario Design

The success of an e-learning system is highly dependent on the quality of its educational content and on how effective, complete, and simple the design tool can be for teachers. Educational modeling languages (EMLs) have been proposed as design languages intended for teachers to model diverse teaching-learning experiences, independently of the pedagogical approach and in different contexts. However, most existing EMLs are criticized for being too abstract and too complex to be understood and manipulated by teachers. In this paper, we present a visual EML that simplifies the process of designing learning scenarios for teachers with no programming background. Based on the conceptual framework of activity theory, our visual EML focuses on using domain-specific modeling techniques to provide a pedagogical level of abstraction in the design process.

An Advanced Stereo Vision Based Obstacle Detection with a Robust Shadow Removal Technique

This paper presents a robust method to detect obstacles in stereo images using a shadow removal technique and color information. Stereo vision based obstacle detection aims to detect obstacles and compute their depth using stereo matching and a disparity map. The proposed method is divided into three phases: the first detects obstacles and removes shadows, the second performs matching, and the last computes depth. In the first phase, we propose a robust method for detecting obstacles in stereo images using a shadow removal technique based on color information in HSI space. For matching, we use a Normalized Cross-Correlation (NCC) function with a 5 × 5 window: an empty matching table τ is prepared, and disparity components are grown by drawing a seed s from a seed set S, computed using a Canny edge detector, and adding it to τ. In this way we achieve higher performance than previous works [2,17]. The resulting fast stereo matching algorithm visits only a small fraction of the disparity space to find a semi-dense disparity map, growing from a small set of correspondence seeds. Obstacles identified in phase one that appear in the disparity map of phase two enter the third phase, depth computation. Finally, experimental results are presented to show the effectiveness of the proposed method.
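
A minimal sketch of the NCC window matching used in the second phase, with the 5 × 5 window from the abstract; the seed-growing bookkeeping (table τ and seed set S from the Canny detector) is omitted, and the synthetic images are stand-ins:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    return (a * b).sum() / (np.sqrt((a * a).sum() * (b * b).sum()) + 1e-9)

def best_disparity(left, right, row, col, max_disp, half=2):
    """Search along the scanline with a 5 x 5 window (half = 2)."""
    ref = left[row - half:row + half + 1, col - half:col + half + 1]
    scores = []
    for d in range(max_disp + 1):
        c = col - d
        if c - half < 0:
            break
        cand = right[row - half:row + half + 1, c - half:c + half + 1]
        scores.append((ncc(ref, cand), d))
    return max(scores)[1]   # disparity with the highest correlation

left = np.random.rand(48, 48)
right = np.roll(left, -3, axis=1)   # synthetic disparity of 3 pixels
print(best_disparity(left, right, row=24, col=30, max_disp=8))
```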

C@sa: Intelligent Home Control and Simulation

In this paper, we present C@sa, a multiagent system aimed at modeling, controlling and simulating the behavior of an intelligent house. The developed system aims at providing architects, designers and psychologists with a simulation and control tool for understanding the impact of embedded and pervasive technology on people's daily lives. In this vision, the house is seen as an environment made up of independent, distributed devices, controlled by agents that interact to support the user's goals and tasks.

String Matching using Inverted Lists

This paper proposes a new solution to the string matching problem. The solution constructs an inverted list representing the string pattern to be searched for, and then uses a new algorithm to process an input string in a single pass. The preprocessing phase takes O(m) time and O(1) space, where m is the length of the pattern. The searching phase takes O(m + α) time in the average case, O(n/m) in the best case, and O(n) in the worst case, where α is the number of comparisons leading to a mismatch and n is the length of the input text.
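
The paper's single-pass algorithm is not given in the abstract; the following sketch only illustrates the inverted-list idea, indexing pattern characters by position and letting each text character vote for the alignments it is consistent with:

```python
from collections import defaultdict

def build_inverted_list(pattern):
    """Map each character of the pattern to the positions where it occurs."""
    positions = defaultdict(list)
    for i, ch in enumerate(pattern):
        positions[ch].append(i)
    return positions

def search(text, pattern):
    """Single left-to-right pass: each text character votes, via the
    inverted list, for the pattern alignments it is consistent with."""
    inv = build_inverted_list(pattern)
    votes = defaultdict(int)
    matches = []
    for j, ch in enumerate(text):
        for i in inv.get(ch, ()):
            start = j - i
            if start >= 0:
                votes[start] += 1
                if votes[start] == len(pattern):
                    matches.append(start)
    return matches

print(search("abracadabra", "abra"))  # [0, 7]
```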

CScheme in Traditional Concurrency Problems

CScheme, a concurrent programming paradigm based on the scheme concept, enables concurrency schemes to be constructed from smaller synchronization units through a GUI-based composer and later reused on other concurrency problems of a similar nature. This paradigm is particularly important in the multi-core environments prevalent today. In this paper, we demonstrate techniques to separate concurrency from functional code using the CScheme paradigm. We then illustrate how the CScheme methodology can be used to solve some of the traditional concurrency problems (the critical section problem and the readers-writers problem) using synchronization schemes such as the Single Threaded Execution Scheme and the Readers Writers Scheme.
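
As a reference point for the readers-writers problem mentioned above (this is not CScheme itself, which composes such schemes through its GUI-based composer), a minimal Python sketch of the classic reader-preference scheme:

```python
import threading

class ReadersWriterLock:
    """Classic readers-writers scheme: many readers may hold the lock
    concurrently, while a writer gets exclusive access (reader preference)."""

    def __init__(self):
        self._lock = threading.Lock()
        self._no_readers = threading.Condition(self._lock)
        self._readers = 0

    def acquire_read(self):
        with self._lock:
            self._readers += 1

    def release_read(self):
        with self._lock:
            self._readers -= 1
            if self._readers == 0:
                self._no_readers.notify_all()   # wake any waiting writer

    def acquire_write(self):
        self._lock.acquire()                    # block out new readers
        while self._readers > 0:                # wait for readers to drain
            self._no_readers.wait()

    def release_write(self):
        self._lock.release()

rw = ReadersWriterLock()
shared = []

def reader():
    rw.acquire_read()
    try:
        _ = list(shared)        # concurrent reads are safe
    finally:
        rw.release_read()

def writer():
    rw.acquire_write()
    try:
        shared.append(1)        # exclusive write
    finally:
        rw.release_write()

threads = [threading.Thread(target=f) for f in (reader, writer, reader)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```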

Low Cost Chip Set Selection Algorithm for Multi-way Partitioning of Digital System

This paper considers the problem of finding a low-cost chip set for a minimum-cost partitioning of large logic circuits. Chip sets are selected from a given library, in which each chip has a different price, area, and I/O pin count. We propose a low-cost chip set selection algorithm. The inputs to the algorithm are a netlist and the chip information in the library; the output is a list of chip sets that satisfy the area and maximum partitioning number constraints, sorted from minimum to maximum cost. We used MCNC benchmark circuits for the experiments, and the results show that all of the chip sets found satisfy the multiple partitioning constraints.
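
The selection procedure itself is not given in the abstract; purely as an illustration of the input/output relationship it describes, here is a toy enumeration in which each chip set uses a single chip type and pin constraints are ignored (the library entries and constraints are hypothetical):

```python
def feasible_chip_sets(library, circuit_area, max_parts):
    """For each chip type, find the smallest count that covers the circuit
    area within the partitioning limit; return the sets sorted by cost."""
    options = []
    for chip in library:
        for n in range(1, max_parts + 1):
            if n * chip["area"] >= circuit_area:
                options.append((n * chip["price"], chip["name"], n))
                break
    return sorted(options)   # cheapest chip set first

# Hypothetical library entries (price, area, pin count) and constraints:
library = [
    {"name": "X1", "price": 4, "area": 500, "pins": 120},
    {"name": "X2", "price": 7, "area": 900, "pins": 200},
]
print(feasible_chip_sets(library, circuit_area=1600, max_parts=4))
```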

A Robust LS-SVM Regression

In Least Squares SVM (LS-SVM), the nonlinear solution is obtained by first mapping the input vector to a high-dimensional kernel space in a nonlinear fashion, where the solution is calculated from a set of linear equations. In comparison to the original SVM, which involves a quadratic programming task, LS-SVM simplifies the required computation, but unfortunately the sparseness of the standard SVM is lost. Another problem is that LS-SVM is only optimal if the training samples are corrupted by Gaussian noise. In this paper a geometric view of the kernel space is introduced, which enables us to develop a new formulation that achieves a sparse and robust estimate.
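
For reference, the standard LS-SVM regression estimate (the baseline the paper makes sparse and robust, not the paper's new formulation) reduces to a single linear system; a minimal numpy sketch with an RBF kernel:

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    """Solve the LS-SVM system [[0, 1^T], [1, K + I/gamma]] [b; a] = [0; y]."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]   # bias b and dual weights alpha (generally all
                             # nonzero, the lost sparseness noted above)

def lssvm_predict(X_train, b, alpha, X_new, sigma=1.0):
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

X = np.linspace(0, 1, 30)[:, None]
y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * np.random.randn(30)
b, alpha = lssvm_fit(X, y)
fitted = lssvm_predict(X, b, alpha, X)
```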

A Practical Approach for Testing the Process Quality

Process capability index Cpk is the most widely used index in making managerial decisions, since it provides bounds on the process yield for normally distributed processes. However, existing methods for assessing process performance, which are constructed by statistical inference, may unfortunately lead to misleading results, because uncertainties exist in most real-world applications. Thus, this study adopts fuzzy inference to deal with the testing of Cpk. A brief score is obtained for assessing a supplier's process instead of a severe pass/fail evaluation.
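
For reference, the crisp definition of Cpk that the fuzzy test softens, as a small Python sketch; the specification limits and measurements below are hypothetical:

```python
import statistics

def cpk(samples, lsl, usl):
    """Crisp process capability index:
    Cpk = min((USL - mu) / (3*sigma), (mu - LSL) / (3*sigma))."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    return min((usl - mu) / (3 * sigma), (mu - lsl) / (3 * sigma))

# Hypothetical measurements against specification limits [9.8, 10.2]:
data = [10.01, 9.98, 10.03, 9.97, 10.00, 10.02, 9.99]
print(round(cpk(data, lsl=9.8, usl=10.2), 2))
```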

A Hybrid Method for Eyes Detection in Facial Images

This paper proposes a hybrid method for eye localization in facial images. The novelty is in combining techniques that utilise colour, edge and illumination cues to improve accuracy. The method is based on the observation that eye regions have dark colour, a high density of edges and low illumination compared to other parts of the face. The first step of the method is to extract connected regions from facial images using colour, edge density and illumination cues separately. Some of the regions are then removed by applying rules based on the general geometry and shape of eyes. The remaining connected regions obtained through the three cues are then combined in a systematic way to enhance the identification of candidate eye regions. The geometry and shape based rules are then applied again to further remove false eye regions. The proposed method was tested on images from the PICS facial image database and achieved 93.7% and 87% accuracy for initial blob extraction and final eye detection, respectively.
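
As a sketch of one of the three cues (edge density) computed block-wise over an image; the block size, thresholds, and stand-in image are assumptions, and the colour and illumination cues plus the geometric rules are omitted:

```python
import numpy as np

def edge_density_map(gray, block=8, thresh=0.15):
    """Boolean map of image blocks with high edge density, a rough
    stand-in for candidate eye regions from the edge cue alone."""
    gy, gx = np.gradient(gray.astype(float))
    mag = np.hypot(gx, gy)
    edges = mag > 0.1 * mag.max()                  # crude edge pixels
    h, w = edges.shape
    h, w = h - h % block, w - w % block            # crop to whole blocks
    blocks = edges[:h, :w].reshape(h // block, block, w // block, block)
    density = blocks.mean(axis=(1, 3))             # fraction of edge pixels
    return density > thresh

face = np.random.rand(64, 64)   # stand-in for a grayscale face image
candidates = edge_density_map(face)
```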

Design and Implementation of a Hybrid Fuzzy Controller for a High-Performance Induction Motor

This paper proposes an effective hybrid control approach combining fuzzy logic and conventional control techniques for controlling the speed of an induction motor assumed to operate in a high-performance drives environment. Introducing fuzzy logic into the control system helps to achieve a good dynamic response, disturbance rejection and low sensitivity to parameter variations and external influences. Some fundamentals of fuzzy logic control are illustrated first. The developed control algorithm is robust, efficient and simple, and it assures precise trajectory tracking with the prescribed dynamics. Experimental results have shown excellent tracking performance of the proposed control system and have convincingly demonstrated the validity and usefulness of the hybrid fuzzy controller in high-performance drives with parameter and load uncertainties. Satisfactory performance was observed for most reference tracks.
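
The paper's controller design is not reproduced in the abstract; as a generic illustration of the fuzzy-logic component, here is a tiny Mamdani-style controller mapping normalized speed error and its change to a torque-command correction (the membership functions and rule table are assumptions):

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return max(min((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

def fuzzy_speed_controller(error, d_error):
    """Tiny Mamdani-style controller: normalized speed error and its change
    (both in [-1, 1]) map to a torque-command correction in [-1, 1]."""
    sets = {"N": (-1.0, -1.0, 0.0),   # negative
            "Z": (-1.0, 0.0, 1.0),    # zero
            "P": (0.0, 1.0, 1.0)}     # positive
    out_center = {"N": -1.0, "Z": 0.0, "P": 1.0}
    # Rule table: (error set, change-of-error set) -> output set.
    rules = {("N", "N"): "N", ("N", "Z"): "N", ("N", "P"): "Z",
             ("Z", "N"): "N", ("Z", "Z"): "Z", ("Z", "P"): "P",
             ("P", "N"): "Z", ("P", "Z"): "P", ("P", "P"): "P"}
    num = den = 0.0
    for (e_set, de_set), o_set in rules.items():
        w = min(tri(error, *sets[e_set]), tri(d_error, *sets[de_set]))
        num += w * out_center[o_set]   # weighted-centroid defuzzification
        den += w
    return num / (den + 1e-12)

print(fuzzy_speed_controller(error=0.4, d_error=-0.1))
```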

System Identification Based on Stepwise Regression for Dynamic Market Representation

A system for market identification (SMI) is presented. The resulting representations are multivariable dynamic demand models. The specifics of the market are analyzed, and appropriate models and identification techniques are chosen. Multivariate static and dynamic models are used to represent market behavior. The steps of the first stage of SMI, data preprocessing, are outlined. The second stage, model estimation, is then considered in more detail. Stepwise linear regression (SWR) is used to determine the significant cross-effects and the orders of the model polynomials, and the estimates of the model parameters are obtained by a numerically stable estimator. Real market data is used to analyze the performance of SMI. The main conclusion concerns the applicability of multivariate dynamic models for the representation of market systems.
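
A minimal sketch of forward stepwise selection of regressors, one simple form of the SWR step described above (the actual estimator and demand-model structure are not specified in the abstract):

```python
import numpy as np

def forward_stepwise(X, y, max_terms=3):
    """Greedily add the regressor that most reduces the residual sum of
    squares; a simple form of stepwise linear regression (SWR)."""
    selected = []
    for _ in range(max_terms):
        best = None
        for j in range(X.shape[1]):
            if j in selected:
                continue
            cols = X[:, selected + [j]]
            # Least-squares fit of the candidate regressor set.
            beta, *_ = np.linalg.lstsq(cols, y, rcond=None)
            rss = float(((y - cols @ beta) ** 2).sum())
            if best is None or rss < best[0]:
                best = (rss, j)
        selected.append(best[1])
    return selected

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 6))
y = 2 * X[:, 1] - X[:, 4] + 0.1 * rng.normal(size=100)
print(forward_stepwise(X, y))  # columns 1 and 4 should be picked first
```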

Performance Analysis of Chrominance Red and Chrominance Blue in JPEG

While compressing text files is useful, compressing still image files is almost a necessity. A typical image takes up much more storage than a typical text message, and without compression images would be extremely clumsy to store and distribute. The amount of information required to store pictures on modern computers is quite large in relation to the bandwidth commonly available to transmit them over the Internet. Image compression addresses the problem of reducing the amount of data required to represent a digital image. The performance of any image compression method can be evaluated by measuring the root mean square error and the peak signal-to-noise ratio. The method analyzed in this paper is based on lossy JPEG image compression, the most popular compression technique for color images. JPEG compression is able to greatly reduce file size with minimal image degradation by throwing away the least "important" information. In JPEG, both chroma components are downsampled simultaneously; in this paper we compare the results when the compression is done by downsampling a single chroma component. We demonstrate that a higher compression ratio is achieved when chrominance blue is downsampled than when chrominance red is downsampled, whereas the peak signal-to-noise ratio is higher when chrominance red is downsampled. In particular, we use hats.jpg as a demonstration of JPEG compression using a low-pass filter and show that the image is compressed with barely any visual difference under either method.
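
A minimal sketch of the evaluation metrics (RMSE and PSNR) together with 2 × 2 chroma subsampling; the random array stands in for a Cb or Cr channel, and the full JPEG pipeline is not reproduced:

```python
import numpy as np

def rmse(original, compressed):
    diff = original.astype(float) - compressed.astype(float)
    return np.sqrt(np.mean(diff ** 2))

def psnr(original, compressed, peak=255.0):
    """Peak signal-to-noise ratio in dB for 8-bit images."""
    e = rmse(original, compressed)
    return float("inf") if e == 0 else 20 * np.log10(peak / e)

def downsample_channel(channel):
    """2x2 chroma subsampling: average each 2x2 block (as in 4:2:0)."""
    h, w = channel.shape
    c = channel[:h - h % 2, :w - w % 2].astype(float)
    return (c[0::2, 0::2] + c[0::2, 1::2] +
            c[1::2, 0::2] + c[1::2, 1::2]) / 4.0

img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)   # stand-in channel
recon = np.kron(downsample_channel(img), np.ones((2, 2)))   # upsample back
print(round(psnr(img, recon), 2), "dB")
```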

Learning to Order Terms: Supervised Interestingness Measures in Terminology Extraction

Term extraction, a key data preparation step in text mining, extracts terms, i.e. relevant collocations of words, attached to specific concepts (e.g. genetic-algorithms and decision-trees are terms associated with the concept "Machine Learning"). In this paper, the task of extracting interesting collocations is achieved through a supervised learning algorithm, exploiting a few collocations manually labelled as interesting or not interesting. From these examples, the ROGER algorithm learns a numerical function inducing a ranking on the collocations. This ranking is optimized using genetic algorithms, maximizing the trade-off between the false positive and true positive rates (the Area Under the ROC Curve, AUC). The approach uses a particular representation for the word collocations, namely the vector of values of the standard statistical interestingness measures attached to each collocation. As this representation is general (across corpora and natural languages), generality tests were performed by applying the ranking function learned from an English corpus in biology to a French corpus of curricula vitae, and vice versa, showing good robustness of the approach compared to the state-of-the-art Support Vector Machine (SVM).
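
The AUC criterion ROGER maximizes equals the Wilcoxon-Mann-Whitney statistic: the probability that a randomly drawn interesting collocation is ranked above a non-interesting one. A minimal sketch with hypothetical ranking scores:

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Wilcoxon-Mann-Whitney statistic:
    the fraction of (positive, negative) pairs ranked correctly."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical ranking-function outputs for labelled collocations:
interesting = [0.9, 0.8, 0.55]
not_interesting = [0.6, 0.3, 0.2, 0.1]
print(auc(interesting, not_interesting))  # 11/12, about 0.917
```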