A Performance Evaluation of Oscillation Based Test in Continuous Time Filters

This work evaluates the ability of Oscillation-Based Test (OBT) to detect parametric faults in continuous-time filters. To this end, we adopt two filters with quite different topologies as case studies, together with a previously reported statistical fault model. In addition, we explore the behavior of the test schemes when a particular test condition is changed. The new data reported here, obtained from a fault simulation process, reveal a performance degradation of OBT that was not observed in previous work using single-deviation faults, even under the changed test condition.
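
The abstract does not include code; purely as an illustration of a multiple-deviation fault simulation loop, the following Python sketch deviates all components of a hypothetical RC oscillator at once and counts the deviations that push the oscillation frequency outside an assumed tolerance band. Component values, deviation spread, and tolerance are illustrative, not taken from the paper.

```python
# Illustrative only: Monte Carlo injection of multiple-deviation parametric
# faults into a hypothetical RC oscillator; not the paper's actual test bench.
import math
import random

NOMINAL = {"R1": 10e3, "R2": 10e3, "C1": 10e-9, "C2": 10e-9}  # assumed values

def osc_frequency(p):
    # Oscillation frequency of an idealized two-integrator loop (assumption).
    return 1.0 / (2 * math.pi * math.sqrt(p["R1"] * p["R2"] * p["C1"] * p["C2"]))

def simulate(n_faults=10_000, sigma=0.15, tol=0.05, seed=0):
    rng = random.Random(seed)
    f_nom = osc_frequency(NOMINAL)
    detected = 0
    for _ in range(n_faults):
        # Multiple-deviation fault: every component deviates at once.
        faulty = {k: v * math.exp(rng.gauss(0.0, sigma)) for k, v in NOMINAL.items()}
        if abs(osc_frequency(faulty) - f_nom) / f_nom > tol:  # outside tolerance band
            detected += 1
    return detected / n_faults

print(f"estimated fault coverage: {simulate():.3f}")
```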

Fuzzy Sequential Algorithm for Discrimination and Decision Maker in Sporting Events

Event discrimination and decision making in sport are the subject of many studies in computer vision and artificial intelligence. A large volume of research has been conducted on automatic semantic event detection and summarization of sports videos. The results of this research make a significant contribution both to television broadcasters and to football teams, since the outcome of a sporting event can have economic consequences. In this paper, we propose a novel fuzzy sequential technique for discriminating events and identifying the technico-tactical actions occurring during a game. Neither a purely fuzzy system nor a purely sequential one can answer this question on its own: a fuzzy process alone does not respect the chronological order of the various events, while a sequential process alone lacks flexibility with respect to the parameters used in this study. The proposed technique therefore assigns a membership degree to each parameter on the one hand and respects the sequencing of events in each frame on the other. It describes special events such as dribbling, heading, short sprints, rapid acceleration or deceleration, turning, jumping, kicking, ball possession, and tackling, according to the velocity vectors of the two players and the direction of the ball.
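
As a rough illustration of how fuzzy membership degrees and chronological sequencing can be combined (the membership shapes, thresholds, and event sequence below are assumptions, not the paper's), consider the following Python sketch:

```python
# Illustrative sketch only: fuzzy membership of velocity-based parameters
# combined with a check on the chronological order of events per frame.

def tri(x, a, b, c):
    """Triangular membership function."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def frame_memberships(player_speed, ball_speed, angle_to_ball):
    # Hypothetical parameters derived from the velocity vectors of the two
    # players and the ball direction.
    return {
        "sprint":  tri(player_speed, 5.0, 8.0, 12.0),        # m/s, assumed ranges
        "dribble": min(tri(player_speed, 1.0, 3.0, 6.0),
                       tri(ball_speed, 1.0, 3.0, 6.0)),
        "kick":    min(tri(ball_speed, 8.0, 15.0, 30.0),
                       tri(angle_to_ball, 0.0, 10.0, 45.0)),
    }

def detect_sequence(frames, sequence=("dribble", "sprint", "kick"), cut=0.5):
    """Return True if the events appear in the required chronological order."""
    idx = 0
    for f in frames:
        if idx < len(sequence) and frame_memberships(**f)[sequence[idx]] > cut:
            idx += 1
    return idx == len(sequence)

frames = [dict(player_speed=2.5, ball_speed=2.8, angle_to_ball=5.0),
          dict(player_speed=9.0, ball_speed=9.5, angle_to_ball=8.0),
          dict(player_speed=6.0, ball_speed=18.0, angle_to_ball=6.0)]
print(detect_sequence(frames))   # True for this synthetic trace
```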

Development and Performance Analysis of Multifunctional City Smart Card System

In recent years, several smart card solutions for city transportation services, with different technical infrastructures and business models, have emerged, triggering new business and technical opportunities. In order to create a unified system, we present a novel, promising system, the Multifunctional City Smart Card System, intended for use in any city, which provides transportation and loyalty services based on the MasterCard M/Chip Advance standards. The proposed system provides a unified solution for the transportation services of large cities worldwide, aiming to answer all transportation needs of citizens. In this paper, the development of the Multifunctional City Smart Card System and its system requirements are briefly described. Moreover, performance analysis results are presented for the M/Chip Advance compatible validators, which are the system's most important component.

Alive Cemeteries with Augmented Reality and Semantic Web Technologies

Due to the proliferation of smartphones in everyday use, several different outdoor navigation systems have become available. Since these smartphones are able to connect to the Internet, users can also obtain location-based information during navigation. With the information obtained in this way, users can interactively get to know the specifics of a particular area (for instance, an ancient cultural area, a statue park, or a cemetery). In this paper, we present an Augmented Reality system which uses Semantic Web technologies and is based on the interaction between the user and the smartphone. The system allows navigating through a specific area and provides information and details about the sights in an interactive manner.
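
As a minimal sketch of the Semantic Web side only (the AR and navigation components are not shown, and the vocabulary below is a hypothetical example rather than the system's actual ontology), location-based details about a sight could be stored and queried with rdflib as follows:

```python
# Minimal sketch of the Semantic Web side, using rdflib with a hypothetical
# cemetery vocabulary; the AR/navigation part of the system is not shown.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/cemetery#")   # assumed namespace
g = Graph()
grave = EX["grave_42"]
g.add((grave, RDF.type, EX.Grave))
g.add((grave, EX.personName, Literal("Jane Doe")))
g.add((grave, EX.lat, Literal(47.4979)))
g.add((grave, EX.lon, Literal(19.0402)))

# Query the details to overlay when the user points the phone at a sight.
q = """
PREFIX ex: <http://example.org/cemetery#>
SELECT ?name ?lat ?lon WHERE {
    ?g a ex:Grave ; ex:personName ?name ; ex:lat ?lat ; ex:lon ?lon .
}
"""
for name, lat, lon in g.query(q):
    print(f"{name}: ({lat}, {lon})")
```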

An Expert System Designed to Be Used with MOEAs for Efficient Portfolio Selection

This study presents an Expert System specially designed to be used with Multiobjective Evolutionary Algorithms (MOEAs) for the solution of the portfolio selection problem. The proposed hybrid system is validated using data sets from the Hang Seng 31 in Hong Kong, the DAX 100 in Germany, and the FTSE 100 in the UK. The performance of the proposed system is assessed in comparison with the Non-dominated Sorting Genetic Algorithm II (NSGA-II). The evaluation is based on performance metrics that capture both the proximity of the solutions to the Pareto front and their dispersion along it. The results show that the proposed hybrid system is efficient for the solution of this kind of problem.
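
For illustration, the following Python sketch shows generic versions of the two kinds of performance metrics mentioned above: a Pareto-dominance filter (proximity) and a simple spacing-style dispersion measure. These are generic textbook formulations, not necessarily the exact metrics used in the paper.

```python
# Illustrative sketch: Pareto dominance and a spacing-style dispersion metric
# for a bi-objective portfolio front (risk to minimize, return to maximize).
import math

def dominates(a, b):
    """a = (risk, ret); a dominates b if no worse in both and better in one."""
    no_worse = a[0] <= b[0] and a[1] >= b[1]
    better = a[0] < b[0] or a[1] > b[1]
    return no_worse and better

def nondominated(points):
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

def spacing(front):
    """Lower values indicate a more evenly spread front."""
    d = [min(abs(a[0] - b[0]) + abs(a[1] - b[1]) for b in front if b != a) for a in front]
    mean = sum(d) / len(d)
    return math.sqrt(sum((x - mean) ** 2 for x in d) / len(d))

pts = [(0.10, 0.05), (0.12, 0.07), (0.15, 0.06), (0.20, 0.09)]
front = nondominated(pts)
print(front, spacing(front))
```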

Incorporating Multiple Supervised Learning Algorithms for Effective Intrusion Detection

As the Internet continues to expand its usage with an enormous number of applications, cyber-threats have increased significantly. Thus, accurate and timely detection of malicious traffic is a critical security concern in today's Internet. One approach to intrusion detection is to use Machine Learning (ML) techniques. Several methods based on ML algorithms have been introduced over the past years, but they are largely limited in terms of detection accuracy and/or the time and space complexity required to run them. In this work, we present a novel method for intrusion detection that incorporates a set of supervised learning algorithms. The proposed technique provides high accuracy and outperforms existing techniques that utilize only a single learning method. In addition, our technique relies on partial flow information (rather than full information) for detection, and thus it is light-weight and suitable for online operation with the property of early identification. Using the publicly available mid-Atlantic CCDC intrusion dataset, we show that our technique yields a detection rate of over 99% with a very low false alarm rate (0.4%).
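
As a hedged sketch of the general idea, not the paper's exact method, the following Python example combines several scikit-learn classifiers by soft voting on synthetic data standing in for partial-flow features:

```python
# Minimal sketch of combining several supervised learners for flow
# classification, using scikit-learn and synthetic data as a stand-in for
# the CCDC flow features; the paper's exact learners and features may differ.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Pretend each row holds partial-flow features (e.g., the first packets of a flow).
X, y = make_classification(n_samples=4000, n_features=12, weights=[0.9, 0.1],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

ensemble = VotingClassifier(
    estimators=[("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
                ("lr", LogisticRegression(max_iter=1000)),
                ("knn", KNeighborsClassifier(n_neighbors=5))],
    voting="soft")                     # combine predicted class probabilities
ensemble.fit(X_tr, y_tr)
print("accuracy:", ensemble.score(X_te, y_te))
```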

Fault Tolerance in Distributed Database Systems

Early networked systems assumed that connections were reliable and treated a lost connection as a faulty operation. Transient connections, however, are typical of mobile devices. The application areas of data sharing systems such as these lead to the conclusion that network connections may not always be reliable and that conventional approaches can be improved. The Nigerian commercial banking industry is a critical system whose operation is increasingly dependent on information technology (IT) driven information systems. The proposed solution to this problem makes use of a hierarchically clustered network structure, selected to reflect (as much as possible) the typical organizational structure of Nigerian commercial banks. Representative transactions, such as data updates and the replication of the results of such updates, were used to simulate the proposed model and show its applicability.
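
Purely as a toy illustration of replicating an update through a hierarchically clustered structure with unreliable links (the cluster names, quorum rule, and failure model are assumptions, not the paper's model), consider:

```python
# Toy simulation sketch (not the paper's model): replicating an account
# update through clusters of bank-branch replicas, tolerating replicas
# whose links are currently down.
import random

class Node:
    def __init__(self, name, up_probability=0.9):
        self.name, self.up_probability = name, up_probability
        self.data = {}

    def apply(self, key, value):
        if random.random() > self.up_probability:
            return False                      # transient link/node failure
        self.data[key] = value
        return True

def replicate(update, clusters, quorum=2):
    """Apply an update cluster by cluster; report clusters where fewer than
    `quorum` replicas acknowledged it (a simple assumed consistency rule)."""
    key, value = update
    ok = True
    for head, members in clusters.items():
        acks = sum(node.apply(key, value) for node in members)
        if acks < quorum:
            ok = False
            print(f"cluster {head}: only {acks} acks, queued for retry")
    return ok

random.seed(1)
clusters = {"Lagos": [Node(f"lagos-{i}") for i in range(3)],
            "Abuja": [Node(f"abuja-{i}") for i in range(3)]}
print("committed" if replicate(("acct:1001", 250_000), clusters) else "partially applied")
```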

MATLAB-Based Graphical User Interface (GUI) for Data Mining as a Tool for Environment Management

The application of data mining to environmental monitoring has become crucial for a number of tasks related to emergency management. Over recent years, many tools have been developed for decision support systems (DSS) in emergency management. In this article a graphical user interface (GUI) for an environmental monitoring system is presented. This interface supports (i) data collection and observation and (ii) data extraction for data mining. This tool may serve as the basis for future development along the lines of the open source software paradigm.

Multiple Targets Classification and Fuzzy Logic Decision Fusion in Wireless Sensor Networks

This paper proposes a hierarchical hidden Markov model (HHMM) to model the detection of M vehicles in a wireless sensor network (WSN). The HHMM contains an extra level of hidden Markov model that models the temporal transitions of each state of the first HMM. By modeling the temporal transitions, only those hypotheses with nonzero transition probabilities need to be tested. Thus, this method efficiently reduces the computational load, which is preferable in WSN applications. This paper integrates several techniques to optimize the detection performance. The output of the states of the first HMM is modeled as a Gaussian Mixture Model (GMM), where the number of states and the number of Gaussians are determined experimentally, while the other parameters are estimated using Expectation Maximization (EM). The HHMM is used to model the sequence of local decisions, which are based on multiple hypothesis testing with a maximum likelihood approach. The states in the HHMM represent various combinations of vehicles of different types. Owing to the statistical advantages of multisensor data fusion, we propose a heuristic based on fuzzy weighted majority voting to enhance the cooperative classification of moving vehicles within a region monitored by a wireless sensor network. A fuzzy inference system weighs each local decision based on the signal-to-noise ratio of the acoustic signal used for target detection and the signal-to-noise ratio of the radio signal used for sensor communication. The spatial correlation among the observations of neighboring sensor nodes is efficiently utilized, as well as the temporal correlation. Simulation results demonstrate the efficiency of this scheme.
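
The fusion step can be sketched as follows; the membership functions, SNR ranges, and the fuzzy AND used here are illustrative assumptions, not the paper's exact fuzzy inference system.

```python
# Illustrative sketch of fuzzy weighted majority voting for decision fusion:
# each sensor's local class decision is weighted by memberships derived from
# its acoustic SNR (detection quality) and radio SNR (link quality).
from collections import defaultdict

def high_snr(snr_db, lo=0.0, hi=20.0):
    """Piecewise-linear membership in the fuzzy set 'high SNR' (assumed range)."""
    return min(1.0, max(0.0, (snr_db - lo) / (hi - lo)))

def fuse(local_decisions):
    """local_decisions: list of (class_label, acoustic_snr_db, radio_snr_db)."""
    votes = defaultdict(float)
    for label, ac_snr, radio_snr in local_decisions:
        weight = min(high_snr(ac_snr), high_snr(radio_snr))   # fuzzy AND (min)
        votes[label] += weight
    return max(votes, key=votes.get)

decisions = [("heavy_vehicle", 18.0, 15.0),
             ("light_vehicle",  4.0, 19.0),
             ("heavy_vehicle", 12.0, 10.0)]
print(fuse(decisions))     # -> heavy_vehicle for this synthetic example
```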

Empirical Analysis of the Reusability of Object-Oriented Program Code in Open-Source Software

Measuring the reusability of Object-Oriented (OO) program code is important to ensure a successful and timely adaptation and integration of the reused code in new software projects. It has become even more relevant with the availability of huge numbers of open-source projects. Reuse saves cost, increases the speed of development, and improves software reliability. Measuring this reusability is not a straightforward process, due to the variety of metrics and qualities linked to software reuse and the lack of comprehensive empirical studies to support the proposed metrics or models. In this paper, a conceptual model is proposed to measure the reusability of OO program code. A comprehensive set of metrics is used to compute the most significant factors of reusability, and an empirical investigation is conducted to measure the reusability of the classes of randomly selected open-source Java projects. Additionally, the impact of using inner and anonymous classes on the reusability of their enclosing classes is assessed. The results obtained are thoroughly analyzed to identify the factors behind the lack of reusability in open-source OO program code and the impact of nesting on it.

TTCN-3 Based Conformance Testing of a Node Monitoring Protocol for MANETs

Since a node monitoring protocol, which is part of network management, operates in a distributed manner, conformance testing of such protocols is more tedious than testing a peer-to-peer protocol. Various works have been carried out to provide methodologies for conformance testing of distributed protocols. In this paper, we present a formal approach for conformance testing of a Node Monitoring Protocol for MANETs that uses both static and mobile agents. First, we use SDL to obtain MSCs, which represent the scenario descriptions as sequence diagrams and in turn yield test sequences and test cases. Then, Testing and Test Control Notation Version 3 (TTCN-3) is used to execute the test cases against the generated test sequences to determine the conformance of the protocol to the given specification. This approach enables effective conformance testing of distributed protocols for networks with varying node density and complex behavior. Experimental results for the protocol scenario demonstrate the effectiveness of the method.

Improvement of Data Transfer over Simple Object Access Protocol (SOAP)

This paper presents an algorithm designed to improve the transfer of data over the Simple Object Access Protocol (SOAP). The aim of this work is to establish whether using SOAP to exchange XML messages offers any added advantage. The results show that XML messages sent without SOAP take longer and consume more memory, especially with binary data.
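
As an illustration of the kind of comparison described, the following standard-library sketch builds the same binary payload as a bare XML message and as a SOAP 1.1 envelope and compares message sizes only; the paper's timing and memory measurements are not reproduced here.

```python
# Rough illustration (standard library only): the same binary payload sent
# as a bare XML message and wrapped in a SOAP 1.1 envelope.
import base64
import os

payload = os.urandom(50_000)                       # stand-in binary data
b64 = base64.b64encode(payload).decode("ascii")    # XML cannot carry raw bytes

plain_xml = f"<data>{b64}</data>"
soap_xml = (
    '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
    "<soap:Body>"
    f"<transfer><data>{b64}</data></transfer>"
    "</soap:Body></soap:Envelope>"
)

print("raw payload bytes :", len(payload))
print("plain XML bytes   :", len(plain_xml.encode()))
print("SOAP message bytes:", len(soap_xml.encode()))
```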

Face Recognition Based On Vector Quantization Using Fuzzy Neuro Clustering

A face recognition system is a computer application for automatically identifying or verifying a person from a digital image or a video frame. Many algorithms have been proposed for face recognition. Vector Quantization (VQ) based face recognition is a novel approach to this problem. Here, a new codebook generation method for VQ-based face recognition using Integrated Adaptive Fuzzy Clustering (IAFC) is proposed. IAFC is a fuzzy neural network that incorporates a fuzzy learning rule into a competitive neural network. The performance of the proposed algorithm is demonstrated using the publicly available AT&T, Yale, and Indian Face databases, as well as a small face database, the DCSKU database, created in our lab. On all the databases, the proposed approach achieves a higher recognition rate than most of the existing methods. In terms of Equal Error Rate (EER), the proposed codebook is also better than the existing methods.
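
Conceptually, VQ-based matching assigns a probe image to the subject whose codebook quantizes its blocks with the lowest total distortion. The NumPy sketch below illustrates only this matching step with synthetic codebooks; the IAFC codebook training itself is not reproduced.

```python
# Conceptual sketch of VQ-based face matching with NumPy: each subject has a
# codebook of block vectors, and a probe image is assigned to the subject
# whose codebook gives the lowest total quantization distortion.
import numpy as np

def image_blocks(img, size=4):
    h, w = img.shape
    blocks = [img[i:i+size, j:j+size].ravel()
              for i in range(0, h - size + 1, size)
              for j in range(0, w - size + 1, size)]
    return np.array(blocks, dtype=float)

def distortion(blocks, codebook):
    # Distance from every block to its nearest codeword, summed.
    d = np.linalg.norm(blocks[:, None, :] - codebook[None, :, :], axis=2)
    return d.min(axis=1).sum()

def identify(probe, codebooks):
    blocks = image_blocks(probe)
    return min(codebooks, key=lambda s: distortion(blocks, codebooks[s]))

rng = np.random.default_rng(0)
codebooks = {"subject_A": rng.normal(0.2, 0.05, (32, 16)),
             "subject_B": rng.normal(0.8, 0.05, (32, 16))}
probe = rng.normal(0.8, 0.05, (16, 16))            # synthetic 16x16 "face"
print(identify(probe, codebooks))                  # -> subject_B
```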

An Anonymity-Based Secure On-Demand Routing for Mobile Ad Hoc Networks

Privacy and security have emerged as important research issues in Mobile Ad Hoc Networks (MANETs) due to their unique nature, such as scarcity of resources and absence of a centralized authority. A number of protocols have been proposed to provide privacy and security for data communication in an adverse environment, but those protocols are compromised in many ways by attackers. The concepts of anonymity (in terms of unlinkability and unobservability) and pseudonymity are introduced in this paper to ensure privacy and security. We propose a Secure Onion Throat (SOT) protocol to provide complete anonymity in an adverse environment. The SOT protocol is designed based on the combination of group signatures and onion routing with ID-based encryption for route discovery. The security analysis demonstrates the resilience of the SOT protocol against all categories of attacks, and the simulation results confirm the necessity and importance of the proposed protocol in achieving such anonymity.
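
The onion-layering idea (each hop peels one encryption layer) can be sketched as follows, using symmetric Fernet keys from the cryptography package as a stand-in; the SOT protocol itself relies on group signatures and ID-based encryption, which are not reproduced here.

```python
# Onion-layering idea only: symmetric Fernet keys stand in for the SOT
# protocol's group-signature and ID-based encryption machinery.
from cryptography.fernet import Fernet

route = ["relay_a", "relay_b", "relay_c"]            # hypothetical route
keys = {hop: Fernet(Fernet.generate_key()) for hop in route}

def wrap(message: bytes, route):
    """Encrypt in reverse route order so each hop can peel one layer."""
    onion = message
    for hop in reversed(route):
        onion = keys[hop].encrypt(onion)
    return onion

def unwrap(onion: bytes, route):
    for hop in route:                                # each hop peels its layer
        onion = keys[hop].decrypt(onion)
    return onion

packet = wrap(b"route discovery payload", route)
print(unwrap(packet, route))                         # b'route discovery payload'
```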

A Model for Test Case Selection in the Software-Development Life Cycle

Software maintenance is one of the essential processes of the Software-Development Life Cycle. The main purposes of maintaining software are the correction of errors, the revision of code, the prevention of future errors, and improvements in performance and capacity. When a modification is applied, the software has to be retested to increase the level of assurance that it still satisfies its requirements. Accordingly, test cases must be selected to exercise both the revised modules and the whole software. One approach to this problem is regression test selection, such as retest-all selection, random/ad-hoc selection, and safe regression test selection. In particular, traditional techniques rely on a mapping between the test cases in a test suite and the lines of code each of them executes. However, lines of code are not the only requirement that can affect the size of the test suite; the number of functions and the number of faulty versions matter as well. Therefore, a model for test case selection is developed that covers these three requirements with an integrated technique, producing a smaller set of test cases compared with traditional regression selection techniques.
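
As an illustration of selecting test cases against several kinds of requirements at once, the following sketch greedily picks tests covering modified lines, functions, and faulty versions; the coverage data is made up, and the paper's integrated technique may differ.

```python
# Hedged sketch of a greedy selection over the three requirements mentioned
# in the abstract: lines of code, functions, and faulty versions covered by
# each test case. The coverage data below is made up.
def select_tests(coverage, changed):
    """coverage: test -> set of covered items (lines, functions, versions);
    changed: set of items affected by the modification."""
    remaining, selected = set(changed), []
    while remaining:
        best = max(coverage, key=lambda t: len(coverage[t] & remaining))
        gained = coverage[best] & remaining
        if not gained:
            break                          # the rest is uncoverable
        selected.append(best)
        remaining -= gained
    return selected, remaining

coverage = {
    "T1": {"line:42", "func:deposit", "v1"},
    "T2": {"line:42", "line:77", "func:withdraw"},
    "T3": {"func:withdraw", "v2"},
}
changed = {"line:42", "line:77", "func:withdraw", "v2"}
print(select_tests(coverage, changed))     # (['T2', 'T3'], set())
```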

Prediction of Research Topics Using Ensemble of Best Predictors from Similar Dataset

Prediction of future research topics using time series analysis, either statistical or machine learning based, has been conducted previously by several researchers. Several methods have been proposed to combine the forecasting results into a single forecast. These methods use a fixed combination of individual forecasts to obtain the final result. In this paper, a quite different approach is employed to select the forecasting methods: each point to be forecast is calculated using the methods that performed best on a similar validation dataset. The dataset used in the experiment is a time series derived from research reports in Garuda, an online site belonging to the Ministry of Education in Indonesia, over the past 20 years. The experimental results demonstrate that the proposed method can perform better than a fixed combination of predictors. In addition, based on the prediction results, we can forecast emerging research topics for the next few years.
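
The per-point selection idea can be sketched as follows; the series and the three simple predictors are stand-ins, not the Garuda data or the paper's method pool.

```python
# Sketch of the selection idea: for each forecast point, use the individual
# method that performed best at that horizon on a validation series taken
# from a similar dataset.
def naive(history, h):   return [history[-1]] * h
def mean(history, h):    return [sum(history) / len(history)] * h
def drift(history, h):
    slope = (history[-1] - history[0]) / (len(history) - 1)
    return [history[-1] + slope * (i + 1) for i in range(h)]

METHODS = {"naive": naive, "mean": mean, "drift": drift}

def best_per_point(similar_train, similar_valid):
    """Pick, per horizon step, the method with the lowest absolute error."""
    h = len(similar_valid)
    choices = []
    for i in range(h):
        errs = {name: abs(fn(similar_train, h)[i] - similar_valid[i])
                for name, fn in METHODS.items()}
        choices.append(min(errs, key=errs.get))
    return choices

def forecast(target_history, choices):
    h = len(choices)
    return [METHODS[name](target_history, h)[i] for i, name in enumerate(choices)]

similar_train, similar_valid = [3, 4, 6, 7, 9], [9, 10, 14]
target_history = [2, 3, 5, 6, 8]
choices = best_per_point(similar_train, similar_valid)
print(choices, forecast(target_history, choices))
```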

Multiplayer RC-Car Driving System in a Collaborative Augmented Reality Environment

We developed a prototype system for multiplayer RC-car driving in a collaborative augmented reality (AR) environment. The tele-existence environment is constructed by superimposing digital data onto images captured by a camera on an RC-car, enabling players to experience an augmented coexistence of the digital content and the real world. Marker-based tracking is used to estimate the position and orientation of the camera. Multiple RC-cars can be operated in a field where square markers are arranged. The video images captured by the camera are transmitted to a PC for visual tracking. The RC-cars are also tracked by an infrared camera attached to the ceiling, so that instability in the visual tracking is reduced. Multimedia data such as text and graphics are overlaid onto the video images in a geometrically correct manner. The prototype system allows a tele-existence sensation to be augmented in a collaborative AR environment.

A New Reliability Based Channel Allocation Model in Mobile Networks

Data transmission between mobile hosts and base stations (BSs) in mobile networks is often vulnerable to failure. Efficient link connectivity, in terms of the services of both the base stations and the communication channels of the network, is therefore required in wireless mobile networks to achieve highly reliable data transmission. In addition, it is observed that the number of blocked hosts increases when there are insufficient channels during heavy load in the network. In such a scenario, channels must be allocated so as to offer reliable communication at any given time. Therefore, a reliability-based channel allocation model with acceptable system performance is formulated in this paper as a multi-objective optimization (MOO) problem. Two conflicting parameters, the Resource Reuse Factor (RRF) and the number of blocked calls, are optimized under a reliability constraint. The MOO problem is solved using NSGA-II (Non-dominated Sorting Genetic Algorithm II). The effectiveness of the proposed model is shown with a set of experimental results.
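
As a toy illustration of why the two objectives conflict, the following sketch evaluates a reuse factor and the number of blocked calls for two hypothetical channel assignments (cell demands and channel pools are made up):

```python
# Toy evaluation of the two conflicting objectives named in the abstract:
# resource reuse (how often each channel is reused across cells) versus the
# number of blocked calls.
from collections import Counter

def evaluate(assignment, demand):
    """assignment: cell -> list of channels; demand: cell -> offered calls."""
    blocked = sum(max(0, demand[c] - len(assignment[c])) for c in demand)
    use = Counter(ch for chans in assignment.values() for ch in chans)
    reuse_factor = sum(use.values()) / len(use)      # average reuse per channel
    return reuse_factor, blocked

demand = {"cell_1": 4, "cell_2": 3, "cell_3": 5}
candidate_a = {"cell_1": [1, 2, 3], "cell_2": [1, 4, 5], "cell_3": [2, 3, 4, 5]}
candidate_b = {"cell_1": [1, 2, 3, 4], "cell_2": [5, 6, 7],
               "cell_3": [8, 9, 10, 11, 12]}

for name, cand in (("A", candidate_a), ("B", candidate_b)):
    rrf, blocked = evaluate(cand, demand)
    print(f"candidate {name}: reuse={rrf:.2f}, blocked calls={blocked}")
```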

A Trends Analysis of Image Processing in Unmanned Aerial Vehicle

This paper describes an analysis of domestic and international trends in image processing for data from UAVs (unmanned aerial vehicles) and also gives an overview of UAVs and quadcopters. Overseas examples of image processing using UAVs include image processing for counting the total number of vehicles, edge/target detection, detection and avoidance algorithms, image processing using SIFT (Scale-Invariant Feature Transform) matching, and the application of median filtering and thresholding. In Korea, many studies are underway, including the visualization of new urban buildings.

FPGA Implementation of RSA Encryption Algorithm for E-Passport Application

Securing the data stored on an e-passport is a very important issue. The RSA encryption algorithm is suitable for such an application with a low data size. In this paper, the design and implementation of a 1024-bit-key RSA encryption and decryption module on an FPGA is presented. The module is verified by comparing its results with those obtained from MATLAB. The design runs at a frequency of 36.3 MHz on a Virtex-5 Xilinx FPGA. The key size is chosen to be 1024 bits to achieve high security for the passport information. The whole design is implemented through VHDL design entry, which makes it portable and able to be targeted at any hardware platform.
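
In the spirit of the MATLAB reference check mentioned above, a software model of textbook RSA can be written in a few lines; the sketch below uses toy Mersenne primes for readability, whereas the FPGA design uses 1024-bit keys and a proper key-generation procedure.

```python
# Software reference sketch: textbook RSA encrypt/decrypt via modular
# exponentiation, with toy Mersenne primes (2^17-1 and 2^19-1) for
# illustration only; no padding, not 1024-bit.
p, q = 131_071, 524_287
n = p * q
phi = (p - 1) * (q - 1)
e = 65_537                            # common public exponent
d = pow(e, -1, phi)                   # modular inverse (Python 3.8+)

def encrypt(m: int) -> int:
    return pow(m, e, n)               # c = m^e mod n

def decrypt(c: int) -> int:
    return pow(c, d, n)               # m = c^d mod n

m = 123_456_789 % n
c = encrypt(m)
assert decrypt(c) == m
print(f"n has {n.bit_length()} bits; ciphertext = {c}")
```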