Adaptive Car Safety System

Car accidents are among the major causes of death in many countries. Many researchers have attempted to design and develop techniques to increase car safety in recent years. In spite of all these efforts, it is still challenging to design a system that adapts to the driver rather than to the characteristics of the car. In this paper, an adaptive car safety system that attempts to strike such a balance is described.

The Role of Contextual Ontologies in Enterprise Modeling

Information sharing and exchange, rather than information processing, is what characterizes information technology in the 21st century. Ontologies, as shared common understandings, are gaining increasing attention because they appear to be the most promising way to enable information sharing both at a semantic level and in a machine-processable form. Domain-ontology-based modeling has been exploited to provide shareability and information exchange among the diverse, heterogeneous applications of an enterprise. Contextual ontologies are "an explicit specification of contextual conceptualization": an ontology is characterized by concepts that have multiple representations and may exist in several contexts. Hence, contextual ontologies are a set of concepts and relationships that are seen from different perspectives. Contextualization allows ontologies to be partitioned according to their contexts. The need for contextual ontologies in enterprise modeling has become crucial because of the nature of today's competitive market. Information resources in an enterprise are distributed and diversified, and they need to be shared and communicated locally through the intranet and globally through the Internet. This paper discusses the roles that ontologies play in enterprise modeling and how they assist in building a conceptual model that provides communicative and interoperable information systems. The issue of enterprise modeling based on contextual domain ontologies is also investigated, and a framework is proposed for an enterprise model that consists of various applications.
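
To make the notion of a concept with context-dependent representations concrete, the following minimal Python sketch (illustrative only; the class name, contexts, and attribute sets are hypothetical and not taken from the paper) shows one concept as seen by two enterprise applications:

    from dataclasses import dataclass, field

    @dataclass
    class ContextualConcept:
        # One concept, several context-dependent representations: the same
        # "Customer" is seen differently by the sales and shipping applications.
        name: str
        representations: dict = field(default_factory=dict)  # context -> attribute set

    customer = ContextualConcept("Customer")
    customer.representations["sales"] = {"name", "credit_limit", "contact"}
    customer.representations["shipping"] = {"name", "delivery_address"}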

Local Image Descriptor using VQ-SIFT for Image Retrieval

In this paper, we present a local image descriptor using VQ-SIFT for more effective and efficient image retrieval. Instead of SIFT's weighted orientation histograms, we apply vector quantization (VQ) histograms as an alternative representation of SIFT features. Experimental results show that SIFT features with VQ-based local descriptors can achieve better image retrieval accuracy than the conventional algorithm while significantly reducing the computational cost.
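
As an illustration of the general idea, the sketch below vector-quantizes SIFT descriptors against a learned codebook and uses the resulting histogram as the image-level descriptor. It relies on OpenCV and scikit-learn and is only a bag-of-words-style approximation; the codebook size and the exact way the paper builds its VQ representation may differ.

    import cv2
    import numpy as np
    from sklearn.cluster import KMeans

    def build_codebook(descriptor_list, k=64, seed=0):
        # Stack SIFT descriptors from training images and learn k VQ codewords.
        all_desc = np.vstack(descriptor_list)
        return KMeans(n_clusters=k, random_state=seed, n_init=10).fit(all_desc)

    def vq_histogram(image_bgr, codebook):
        # Detect SIFT keypoints, quantize each descriptor to its nearest codeword,
        # and return a normalized histogram used as the image-level descriptor.
        gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
        sift = cv2.SIFT_create()
        _, desc = sift.detectAndCompute(gray, None)
        if desc is None:
            return np.zeros(codebook.n_clusters)
        words = codebook.predict(desc.astype(np.float64))
        hist = np.bincount(words, minlength=codebook.n_clusters).astype(float)
        return hist / (hist.sum() + 1e-12)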

Wood Species Recognition System

The proposed system identifies the species of a wood sample using the textural features present in its bark. Each wood species has its own unique bark patterns, which enables the proposed system to identify it accurately. Automatic wood recognition systems have not yet been well established, mainly because of the lack of research in this area and the difficulty of obtaining wood databases. In our work, a wood recognition system has been designed based on pre-processing techniques, feature extraction, and correlation of the features of the wood species for their classification. Texture classification has been studied and tested with many different methods because of its value in various pattern recognition problems, such as wood recognition and rock classification. The most popular technique for textural classification is the Gray-Level Co-occurrence Matrix (GLCM). The features extracted from the enhanced images using the GLCM are correlated, which determines the classification of the various wood species. The results show a high recognition accuracy, indicating that the techniques used are suitable for commercial implementation.
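
A minimal sketch of the GLCM step is shown below, assuming 8-bit grayscale bark images and the scikit-image implementation (function names as in scikit-image 0.19+); the distances, angles, properties, and the correlation-based matching rule are illustrative choices, not necessarily those of the paper.

    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    def glcm_features(gray_image, distances=(1,), angles=(0, np.pi/4, np.pi/2, 3*np.pi/4)):
        # Compute a normalized GLCM on an 8-bit image and summarize it with
        # common Haralick-style properties.
        glcm = graycomatrix(gray_image, distances=distances, angles=angles,
                            levels=256, symmetric=True, normed=True)
        props = ("contrast", "correlation", "energy", "homogeneity")
        return np.hstack([graycoprops(glcm, p).ravel() for p in props])

    def classify_by_correlation(query_features, species_templates):
        # Correlate the query's feature vector with each stored species template
        # and return the best-matching species name.
        scores = {name: np.corrcoef(query_features, tmpl)[0, 1]
                  for name, tmpl in species_templates.items()}
        return max(scores, key=scores.get)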

Detecting Interactions between Behavioral Requirements with OWL and SWRL

High-quality requirements analysis is one of the most crucial activities for ensuring the success of a software project, so requirements verification is becoming increasingly important in Requirements Engineering (RE) and is one of the most helpful strategies for improving the quality of a software system. Related work shows that requirements elicitation and analysis can be facilitated by ontological approaches and semantic web technologies. In this paper, we propose a hybrid method that aims to verify requirements with structural and formal semantics in order to detect interactions. The proposed method is twofold: one part models requirements with the semantic web language OWL to construct a semantic context; the other is a set of interaction detection rules that are derived from scenario-based analysis and represented in the Semantic Web Rule Language (SWRL). The SWRL rules work with rule engines such as Jess to reason over the semantic context of the requirements and thus detect interactions. The benefits of the proposed method lie in three aspects: it (i) provides systematic steps for modeling requirements with an ontological approach, (ii) offers a synergy of requirements elicitation and domain engineering for knowledge sharing, and (iii) provides rules that can systematically assist in requirements interaction detection.
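
The fragment below sketches the flavour of such a detection rule. It uses the owlready2 Python library rather than the Protégé/Jess tool chain mentioned in the paper, and the ontology IRI, classes, properties, and the rule itself are hypothetical examples only.

    from owlready2 import get_ontology, Thing, ObjectProperty, Imp

    onto = get_ontology("http://example.org/requirements.owl")  # hypothetical IRI

    with onto:
        class Requirement(Thing): pass
        class Resource(Thing): pass
        class uses(ObjectProperty):
            domain = [Requirement]; range = [Resource]
        class interactsWith(ObjectProperty):
            domain = [Requirement]; range = [Requirement]

        # A toy rule: two requirements that use the same resource interact.
        # (A real rule set would also enforce that ?r1 and ?r2 are distinct
        # individuals and encode the scenario-based conditions of the paper.)
        rule = Imp()
        rule.set_as_rule("Requirement(?r1), Requirement(?r2), uses(?r1, ?x), "
                         "uses(?r2, ?x) -> interactsWith(?r1, ?r2)")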

Array Data Transformation for Source Code Obfuscation

Obfuscation is a low-cost software protection methodology for hindering reverse engineering and re-engineering of applications. Source code obfuscation aims at obscuring the source code to hide the functionality of the code. This paper proposes an array data transformation that obfuscates source code which uses arrays. Applications using the proposed data structures force the programmer to obscure the logic manually. This makes the obscured code hard to reverse engineer and also protects the functionality of the code.
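
For illustration, one classic array data transformation permutes the logical index i to a physical index (a*i + b) mod n, so the stored layout no longer matches the program's logical view. The sketch below shows this in Python; the paper's concrete transformation may differ.

    from math import gcd

    class ObfuscatedArray:
        def __init__(self, n, a=7, b=3):
            # gcd(a, n) = 1 guarantees the affine index map is a bijection.
            assert gcd(a, n) == 1, "a must be coprime with n"
            self.n, self.a, self.b = n, a, b
            self._store = [0] * n

        def _phys(self, i):
            # The index logic is what a reverse engineer now has to recover.
            return (self.a * i + self.b) % self.n

        def __getitem__(self, i):
            return self._store[self._phys(i)]

        def __setitem__(self, i, value):
            self._store[self._phys(i)] = value

    # Usage: reads and writes behave like a normal array, but the layout is scrambled.
    arr = ObfuscatedArray(10)
    arr[2] = 42
    print(arr[2])  # 42, although _store holds it at position (7*2 + 3) % 10 = 7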

An Optimal Feature Subset Selection for Leaf Analysis

This paper describes an optimal approach to feature subset selection for classifying leaves, based on a Genetic Algorithm (GA) and Kernel-Based Principal Component Analysis (KPCA). Because of the high complexity of selecting the optimal features, classification has become a critical task in analysing leaf image data. Initially, shape, texture, and colour features are extracted from the leaf images. These extracted features are then optimized by running GA and KPCA separately. The approach performs an intersection operation over the subsets obtained from the optimization process. Finally, the most common matching subset is used to train a Support Vector Machine (SVM). Our experimental results show that applying GA and KPCA for feature subset selection, with an SVM as the classifier, is computationally effective and improves the accuracy of the classifier.
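
The sketch below shows the intersection-and-train step with scikit-learn, assuming the GA-selected feature indices come from an external GA run; the KPCA-based ranking used here is only a simple proxy for the paper's KPCA selection, and all names and parameters are illustrative.

    import numpy as np
    from sklearn.decomposition import KernelPCA
    from sklearn.svm import SVC

    def kpca_ranked_indices(X, n_keep, kernel="rbf"):
        # Rank original features by how strongly they correlate with the leading
        # KPCA components (a simple proxy ranking, not the paper's exact rule).
        Z = KernelPCA(n_components=n_keep, kernel=kernel).fit_transform(X)
        scores = np.abs(np.corrcoef(X.T, Z.T)[: X.shape[1], X.shape[1]:]).max(axis=1)
        return set(np.argsort(scores)[-n_keep:])

    def train_on_common_subset(X, y, ga_selected, n_keep=10):
        # Intersect the GA and KPCA subsets and train the SVM on the common features.
        # (In practice one would guard against an empty intersection.)
        common = sorted(kpca_ranked_indices(X, n_keep) & set(ga_selected))
        clf = SVC(kernel="rbf").fit(X[:, common], y)
        return clf, common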

A Modified Cross Correlation in the Frequency Domain for Fast Pattern Detection Using Neural Networks

Recently, neural networks have shown good results in detecting a given pattern in an image. In our previous papers [1-5], a fast algorithm for pattern detection using neural networks was presented. That algorithm was based on cross correlation in the frequency domain between the input image and the weights of the neural networks. Conversion of the image into a symmetric shape was required so that the fast neural networks could give the same results as conventional neural networks. Another symmetric configuration was suggested in [3,4] to improve the speed-up ratio. In this paper, our previous algorithm for fast neural networks is further developed. The frequency-domain cross correlation is modified to compensate for the symmetry condition required of the input image. Two new ideas are introduced to modify the cross correlation algorithm. Both methods accelerate the fast neural networks because there is no longer any need to convert the input image into a symmetric one. Theoretical and practical results show that both approaches provide a higher speed-up ratio than the previous algorithm.
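
The underlying speed-up comes from computing cross correlation in the frequency domain, as in the minimal NumPy sketch below (this yields circular correlation; the image would be zero-padded if linear correlation at the borders were required).

    import numpy as np

    def cross_correlate_fft(image, kernel):
        # Cross-correlation via the frequency domain:
        # corr = IFFT( FFT(image) * conj(FFT(kernel)) ),
        # with the kernel zero-padded to the image size.
        H, W = image.shape
        F_img = np.fft.fft2(image)
        F_ker = np.fft.fft2(kernel, s=(H, W))
        return np.real(np.fft.ifft2(F_img * np.conj(F_ker)))

    # Sliding a neuron's weight window over an N x N image this way costs
    # O(N^2 log N) instead of O(N^2 n^2) for direct sliding-window correlation.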

A Fault Tolerant Token-based Algorithm for Group Mutual Exclusion in Distributed Systems

The group mutual exclusion (GME) problem is a variant of the mutual exclusion problem. In the present paper, a token-based group mutual exclusion algorithm capable of handling transient faults is proposed. The algorithm uses the concept of dynamic request sets. A time-out mechanism is used to detect token loss, and a distributed scheme is used to regenerate the token. The worst-case message complexity of the algorithm is n+1. The maximum concurrency and forum-switch complexity of the algorithm are n and min(n, m), respectively, where n is the number of processes and m is the number of groups. The algorithm also satisfies another desirable property called smooth admission. The scheme can also be adapted to handle the extended group mutual exclusion problem.
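
A minimal sketch of the time-out idea is given below: each process records when it last saw the token and suspects token loss once that age exceeds a bound, at which point the distributed regeneration scheme would be started. The class and the timeout value are illustrative, not the paper's exact mechanism.

    import time

    class TokenMonitor:
        def __init__(self, timeout_s=5.0):
            self.timeout_s = timeout_s
            self.last_seen = time.monotonic()

        def token_observed(self):
            # Called whenever the token (or a message carrying it) passes through.
            self.last_seen = time.monotonic()

        def suspect_token_loss(self):
            # True once the token has not been seen for longer than the bound;
            # the process would then initiate the distributed regeneration protocol.
            return time.monotonic() - self.last_seen > self.timeout_s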

DODR: Delay On-Demand Routing

TCP (Transmission Control Protocol), originally designed for wired networks, triggers its congestion control mechanism when packet loss is detected. The implicit assumption that packet loss is mostly due to network congestion does not hold well in Mobile Ad Hoc Networks (MANETs), where there is a comparatively high likelihood of packet loss due to channel errors, node mobility, and other causes. Such non-congestion packet loss, when handled by the congestion control mechanism, causes poor TCP performance in MANETs. In this study, we continue to investigate the impact of the interaction between transport protocols and on-demand routing protocols on the performance and stability of 802.11 multihop networks. We evaluate the important wireless networking events that cause routing changes and propose a cross-layer method that delays unnecessary routing changes. The method only requires one additional sensitivity parameter α, which represents the on-demand routing protocol's reaction to link failures reported by the MAC layer. Our proposal is applicable to a plain 802.11 networking environment, and the simulation results show that this method can remarkably improve the stability and performance of TCP without any modification of the TCP or MAC protocols.
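
The abstract does not define α precisely; purely for illustration, the sketch below reads it as a threshold on consecutive MAC-layer failure reports before the routing agent invalidates a route. The class and method names are hypothetical.

    class DelayedRouteInvalidator:
        # Sketch of the cross-layer idea: when the MAC layer reports a link
        # failure, the on-demand routing agent does not drop the route
        # immediately; it waits until `alpha` consecutive failure reports
        # arrive before triggering a route change.
        def __init__(self, alpha=3):
            self.alpha = alpha
            self.failure_counts = {}

        def on_mac_link_failure(self, next_hop):
            self.failure_counts[next_hop] = self.failure_counts.get(next_hop, 0) + 1
            return self.failure_counts[next_hop] >= self.alpha  # True => invalidate route

        def on_mac_tx_success(self, next_hop):
            # A successful transmission resets the failure count for that neighbour.
            self.failure_counts[next_hop] = 0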

Sounds Alike Name Matching for Myanmar Language

Personal name matching is a core task in national citizen databases, text and web mining, information retrieval, online library systems, e-commerce, and record linkage systems. This has necessitated extensive research on name matching. Traditional name matching methods are suitable for English and other Latin-based languages. Asian languages that have no word boundaries, such as Myanmar, still require a sounds-alike matching system for Unicode-based applications. Hence, we propose a matching algorithm that produces an analogous sounds-alike (phonetic) pattern suited to Myanmar character spelling. Based on the nature of Myanmar characters, we consider word boundary fragmentation and character collation. We therefore use a pattern conversion algorithm that builds word patterns from the fragmented and collated characters. We also create Myanmar sounds-alike phonetic groups to support the phonetic matching. The experimental results show a fragmentation accuracy of 99.32% and a processing time of 1.72 ms.
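
The sketch below shows the general sounds-alike matching scheme: characters are mapped to phonetic group codes, and two names match when their group sequences agree. The group table is a hypothetical placeholder (two pairs of Myanmar consonants), not the actual phonetic grouping defined in the paper, and the fragmentation and collation steps are omitted.

    # Hypothetical phonetic groups; real groups come from the paper's analysis.
    PHONETIC_GROUP = {
        "\u1000": "G1", "\u1001": "G1",   # e.g. two consonants treated as sounds-alike
        "\u1005": "G2", "\u1006": "G2",
    }

    def phonetic_pattern(word):
        # Replace every character by its group code; unknown characters map to themselves.
        return tuple(PHONETIC_GROUP.get(ch, ch) for ch in word)

    def sounds_alike(name_a, name_b):
        return phonetic_pattern(name_a) == phonetic_pattern(name_b)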

Fuzzy Controlled Hydraulic Excavator with Model Parameter Uncertainty

A hydraulically actuated excavator, being a non-linear mobile machine, encounters many uncertainties. There are uncertainties in the hydraulic system in addition to the uncertain nature of the load. The simulation results obtained in this study show that there is a need for intelligent control of such machines and, in particular, that an interval type-2 fuzzy controller is most suitable for minimizing the position error of a typical excavator's bucket under load variations. We consider model parameter uncertainties such as hydraulic fluid leakage and friction. These uncertainties also depend on the temperature and alter the bulk modulus and viscosity of the hydraulic fluid. Such uncertainties, together with the load variations, cause chattering of the bucket position. The interval type-2 fuzzy controller effectively eliminates the chattering and manages to control the end-effector (bucket) position with a positional error on the order of a few millimeters.
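
As a rough illustration of interval type-2 membership, the sketch below bounds each membership value with lower and upper Gaussian functions and averages the resulting output interval; a real interval type-2 controller would use proper Karnik-Mendel type reduction, and all parameters and rule shapes here are made up.

    import numpy as np

    def it2_gaussian(x, mean, sigma_lower, sigma_upper):
        # Interval type-2 Gaussian membership: a lower and an upper membership
        # value bound the footprint of uncertainty (sigma_lower <= sigma_upper).
        upper = np.exp(-0.5 * ((x - mean) / sigma_upper) ** 2)
        lower = np.exp(-0.5 * ((x - mean) / sigma_lower) ** 2)
        return lower, upper

    def crisp_output(error, rules):
        # Very simplified inference: each rule is (mean, s_lo, s_up, consequent).
        # The output is the midpoint of the lower/upper weighted averages,
        # a crude stand-in for Karnik-Mendel type reduction.
        lows, ups, cons = [], [], []
        for mean, s_lo, s_up, u in rules:
            lo, up = it2_gaussian(error, mean, s_lo, s_up)
            lows.append(lo); ups.append(up); cons.append(u)
        lows, ups, cons = map(np.asarray, (lows, ups, cons))
        y_lower = (lows * cons).sum() / (lows.sum() + 1e-12)
        y_upper = (ups * cons).sum() / (ups.sum() + 1e-12)
        return 0.5 * (y_lower + y_upper)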

Design and Implementation of Cyber Video Consultation System Using Hybrid P2P

This paper describes the design and implementation of a cyber video consultation system (CVCS) using hybrid P2P for video consultation between remote sites. The proposed system is based on a combined client-server and P2P (peer-to-peer) architecture, where the client-server part is used for communication with the MCU (Multipoint Control Unit) and P2P is used for the cyber video consultation itself. The developed video consultation system decreases server traffic and cuts down network expenses, because the hybrid P2P architecture decentralizes the multimedia data to the clients. The developed system was also tested as a group-type video consultation system using its communication protocol and application software over Ethernet networks.

A Unified Framework for a Robust Conflict-Free Robot Navigation

Many environment-specific methods and systems for robot navigation exist. However, the rapid evolution of navigation technologies and system techniques creates the need for a general unified framework that is scalable, modular, and dynamic. In this paper, a unified framework for a robust conflict-free robot navigation system, usable in structured or unstructured and indoor or outdoor environments, is proposed. The fundamental design aspects and the implementation issues encountered during the development of the module are discussed. The results of deploying three major peripheral modules of the framework, namely the GSM-based communication module, the GIS module, and the GPS module, are reported in this paper.

Modelling and Analyzing a Hospital Procedure Using a Petri-Net Approach

Hierarchical high-level Petri nets (HHPNs) with time extensions are a useful tool for modelling systems in a variety of application domains, ranging from logistics to complex workflows. This paper addresses an application domain that is receiving more and more attention: the procedure that arranges the final inpatient charge in the payments office, and its management. We show that Petri-net-based analysis can reduce the delays in the procedure, so that inpatient charges can be issued more reliably and on time.
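
A minimal place/transition sketch of the modelling style is shown below; the place and transition names are illustrative, and the paper's hierarchical high-level nets additionally carry data and timing information.

    # A transition is enabled when its input places hold enough tokens;
    # firing consumes input tokens and produces output tokens.
    marking = {"charge_prepared": 1, "charge_checked": 0, "charge_paid": 0}

    transitions = {
        "check_charge": ({"charge_prepared": 1}, {"charge_checked": 1}),
        "settle_charge": ({"charge_checked": 1}, {"charge_paid": 1}),
    }

    def enabled(name):
        pre, _ = transitions[name]
        return all(marking[p] >= w for p, w in pre.items())

    def fire(name):
        pre, post = transitions[name]
        assert enabled(name), f"{name} is not enabled"
        for p, w in pre.items():
            marking[p] -= w
        for p, w in post.items():
            marking[p] += w

    fire("check_charge")
    fire("settle_charge")
    print(marking)  # {'charge_prepared': 0, 'charge_checked': 0, 'charge_paid': 1}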

Spectral Entropy Employment in Speech Enhancement based on Wavelet Packet

In this work, we are interested in developing a speech denoising tool using the discrete wavelet packet transform (DWPT). This speech denoising tool will be employed in recognition, coding, and synthesis applications. For noise reduction, instead of applying the classical thresholding technique, some wavelet packet nodes are set to zero and the others are thresholded. To estimate the non-stationary noise level, we employ the spectral entropy. To evaluate our approach, the proposed technique is compared with classical denoising methods based on thresholding and spectral subtraction. The experimental implementation uses speech signals corrupted by two kinds of noise, white noise and Volvo noise. The results of the listening tests show that the proposed technique is better than spectral subtraction. The SNR computations show the superiority of our technique over the classical thresholding method that uses the modified hard thresholding function based on the µ-law algorithm.
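
A minimal sketch of the node-zeroing and thresholding step using PyWavelets is given below. The energy-based node selection and the fixed threshold are simple placeholders; in the paper, the spectral entropy drives the estimate of the non-stationary noise level.

    import numpy as np
    import pywt

    def denoise_wpt(signal, wavelet="db4", level=4, keep_ratio=0.5, thresh=0.1):
        # Decompose a 1-D signal with a wavelet packet transform, zero the
        # lowest-energy terminal nodes, soft-threshold the rest, and reconstruct.
        wp = pywt.WaveletPacket(data=signal, wavelet=wavelet,
                                mode="symmetric", maxlevel=level)
        nodes = wp.get_level(level, order="freq")
        energies = np.array([np.sum(n.data ** 2) for n in nodes])
        cutoff = np.quantile(energies, 1.0 - keep_ratio)
        for n, e in zip(nodes, energies):
            if e < cutoff:
                n.data = np.zeros_like(n.data)                        # node set to zero
            else:
                n.data = pywt.threshold(n.data, thresh, mode="soft")  # node thresholded
        return wp.reconstruct(update=False)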

Walsh-Hadamard Transform for Facial Feature Extraction in Face Recognition

This paper proposes a new facial feature extraction approach based on the Walsh-Hadamard Transform (WHT). The approach exploits the correlation between local pixels of the face image, and its primary advantage is the simplicity of its computation. The paper compares the proposed WHT approach, which has traditionally been used in data compression, with two other well-known approaches, Principal Component Analysis (PCA) and the Discrete Cosine Transform (DCT), using the face database of the Olivetti Research Laboratory (ORL). In spite of its simple computation, the proposed WHT approach gave results very close to those obtained with PCA and DCT. This paper initiates research into the WHT and the family of frequency transforms and examines their suitability for feature extraction in face recognition applications.
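
For reference, the transform itself can be computed with the usual butterfly recursion, as in the NumPy sketch below; the block size and the number of retained coefficients are illustrative choices, not the paper's settings.

    import numpy as np

    def fwht(x):
        # Fast Walsh-Hadamard transform; len(x) must be a power of two.
        a = np.array(x, dtype=float).copy()
        n = len(a)
        h = 1
        while h < n:
            for i in range(0, n, h * 2):
                for j in range(i, i + h):
                    a[j], a[j + h] = a[j] + a[j + h], a[j] - a[j + h]
            h *= 2
        return a

    def wht_features(face_block, keep=64):
        # Flatten a power-of-two-sized face block, transform it, and keep the
        # first `keep` coefficients (natural Hadamard order) as the feature
        # vector; a sequency reordering could be applied if desired.
        coeffs = fwht(face_block.ravel())
        return coeffs[:keep]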

Unsupervised Clustering Methods for Identifying Rare Events in Anomaly Detection

Increasing the detection rate and reducing the false positive rate are important problems in Intrusion Detection Systems (IDS). Although preventative techniques such as access control and authentication attempt to keep intruders out, they can fail, and intrusion detection has been introduced as a second line of defence. Rare events are events that occur very infrequently, and detecting them is a common problem in many domains. In this paper we propose an intrusion detection method that combines rough sets and fuzzy clustering. Rough sets are used to decrease the amount of data and remove redundancy. Fuzzy c-means clustering allows objects to belong to several clusters simultaneously, with different degrees of membership. Our approach allows us to recognize not only known attacks but also suspicious activity that may be the result of a new, unknown attack. The experimental results on the Knowledge Discovery and Data Mining (KDD Cup 1999) dataset show that the method is efficient and practical for intrusion detection systems.
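
A plain fuzzy c-means sketch is given below, assuming the rough-set reduction has already produced the reduced feature matrix X; the cluster count and fuzzifier are illustrative, and the paper's exact combination with rough sets may differ.

    import numpy as np

    def fuzzy_cmeans(X, c=5, m=2.0, iters=100, tol=1e-5, seed=0):
        # Every record belongs to all c clusters with membership degrees in [0, 1]
        # that sum to one; rare/suspicious records end up with low membership in
        # every large "normal" cluster.
        rng = np.random.default_rng(seed)
        U = rng.random((len(X), c))
        U /= U.sum(axis=1, keepdims=True)
        for _ in range(iters):
            Um = U ** m
            centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
            d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
            U_new = 1.0 / (d ** (2 / (m - 1)) *
                           np.sum(d ** (-2 / (m - 1)), axis=1, keepdims=True))
            if np.abs(U_new - U).max() < tol:
                U = U_new
                break
            U = U_new
        return centers, U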

Applying Fuzzy Analytic Hierarchy Process for Evaluating Service Quality of Online Auction

This paper applies fuzzy AHP to evaluate the service quality of online auctions. Service quality is a composition of various criteria, many of which are intangible attributes that are difficult to measure. This characteristic creates obstacles for respondents when replying to the survey. To overcome this problem, we introduce fuzzy set theory into the measurement of performance and use the AHP to obtain the criteria. We found that the most important dimension of service quality is the transaction safety mechanism and the least important is the charge item. Other criteria, such as information security, accuracy, and information, are also vital.
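
The sketch below shows one common way to turn a triangular-fuzzy-number comparison matrix into crisp criterion weights (Buckley-style fuzzy geometric means with centroid defuzzification); the 3x3 matrix is a made-up illustration, not the survey data, and the paper's exact fuzzy AHP variant may differ.

    import numpy as np

    # Triangular fuzzy numbers (l, m, u) for pairwise comparisons of three criteria.
    M = [
        [(1, 1, 1),       (2, 3, 4),     (4, 5, 6)],
        [(1/4, 1/3, 1/2), (1, 1, 1),     (1, 2, 3)],
        [(1/6, 1/5, 1/4), (1/3, 1/2, 1), (1, 1, 1)],
    ]

    def fuzzy_weights(matrix):
        # Fuzzy geometric mean per criterion, normalized, then defuzzified by
        # the centroid (l + m + u) / 3 to obtain crisp weights.
        gm = [tuple(np.prod([row[j][k] for j in range(len(row))]) ** (1 / len(row))
                    for k in range(3)) for row in matrix]
        total = tuple(sum(g[k] for g in gm) for k in range(3))
        # Fuzzy division reverses the (l, m, u) order of the denominator.
        fw = [(g[0] / total[2], g[1] / total[1], g[2] / total[0]) for g in gm]
        crisp = np.array([sum(w) / 3 for w in fw])
        return crisp / crisp.sum()

    print(fuzzy_weights(M))  # larger weight => more important criterion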

An Efficient Architecture for Interleaved Modular Multiplication

Modular multiplication is the basic operation in most public-key cryptosystems, such as RSA, DSA, ECC, and DH key exchange. Unfortunately, very large operands (on the order of 1024 or 2048 bits) must be used to provide sufficient security strength. The use of such big numbers dramatically slows down the whole cipher system, especially when running on embedded processors. So far, customized hardware accelerators, developed on FPGAs or ASICs, have been the best choice for accelerating modular multiplication in embedded environments. On the other hand, many algorithms have been developed to speed up such operations; examples are the Montgomery modular multiplication and the interleaved modular multiplication algorithms. Combining customized hardware with an efficient algorithm is expected to provide a much faster cipher system. This paper introduces an enhanced architecture for computing the modular multiplication of two large numbers X and Y modulo a given modulus M. The proposed design is compared with three previous architectures based on carry-save adders and look-up tables. Look-up tables must be loaded with a set of pre-computed values. Our proposed architecture uses the same carry-save addition but replaces both the look-up tables and the pre-computations with an enhanced sign detection technique. The proposed architecture supports higher frequencies than the other architectures and also achieves a better overall absolute time for a single operation.
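
For reference, the interleaved modular multiplication algorithm that such architectures implement is sketched below in software form; a hardware version replaces the explicit comparisons and subtractions with carry-save adders and, in this paper, with sign detection instead of look-up tables.

    def interleaved_mod_mul(X, Y, M):
        # Scan X from its most significant bit, doubling the partial product and
        # conditionally adding Y, reducing mod M at every step so intermediate
        # values stay bounded. Assumes 0 <= Y < M.
        P = 0
        for bit in bin(X)[2:]:        # MSB-first bits of X
            P = 2 * P
            if bit == "1":
                P += Y
            if P >= M:
                P -= M
            if P >= M:
                P -= M                # at most two subtractions are needed
        return P

    # Self-check on moderately sized operands.
    assert interleaved_mod_mul(123456789, 987654321, 1000000007) == \
        (123456789 * 987654321) % 1000000007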