Using the XML Format as a Data Backup Model

New data backup formats keep appearing, raising concerns about their accessibility and longevity. XML is one of the most promising formats for guaranteeing the integrity of data. This article illustrates one thing that can be done with XML: building a data backup model. The main task consists in defining a Java application able to convert the contents of a database into XML format and restore them later.
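
As a hedged illustration of the kind of conversion the abstract describes, the following Java sketch dumps one table of a relational database to an XML file via JDBC. The connection URL, table name, and flat row/column mapping are assumptions for illustration, not details taken from the article.

```java
import java.io.FileWriter;
import java.sql.*;

public class TableToXml {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection URL and table name; adjust for your database.
        try (Connection con = DriverManager.getConnection("jdbc:h2:mem:demo");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT * FROM employees");
             FileWriter out = new FileWriter("backup.xml")) {
            ResultSetMetaData md = rs.getMetaData();
            out.write("<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<table name=\"employees\">\n");
            while (rs.next()) {
                out.write("  <row>\n");
                for (int i = 1; i <= md.getColumnCount(); i++) {
                    String col = md.getColumnName(i);
                    String val = rs.getString(i);
                    // Escape the minimal set of XML special characters.
                    if (val != null) {
                        val = val.replace("&", "&amp;").replace("<", "&lt;").replace(">", "&gt;");
                    }
                    out.write("    <" + col + ">" + (val == null ? "" : val) + "</" + col + ">\n");
                }
                out.write("  </row>\n");
            }
            out.write("</table>\n");
        }
    }
}
```

Restoration would walk the XML back with a SAX or DOM parser and issue INSERT statements; the sketch covers only the export direction.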

A Logic Approach to Dynamic Database Updating

We introduce a logic-based framework for database updating under constraints. In our framework, the constraints are represented as an instantiated extended logic program. When an update is performed, database consistency may be violated. We provide an approach to maintaining database consistency and study the conditions under which the maintenance process is deterministic. We show that the complexity of the computations and decision problems arising in our framework is in each case polynomial time.
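
For concreteness, a constraint of the kind the framework manipulates could be written as a ground rule of an extended logic program; the following example is hypothetical, not taken from the paper:

$$\neg\,\mathit{employed}(\mathit{john}) \leftarrow \mathit{fired}(\mathit{john}),\ \mathit{not}\ \mathit{rehired}(\mathit{john})$$

Here $\neg$ is classical negation and $\mathit{not}$ is negation as failure: an update inserting $\mathit{employed}(\mathit{john})$ while $\mathit{fired}(\mathit{john})$ holds and $\mathit{rehired}(\mathit{john})$ is absent would violate consistency and trigger the maintenance process.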

Comparative Analysis of Mobility Support in Mobile IP and SIP

With the rapid spread of portable devices, mobility in IP networks has become an increasingly important issue in recent years. The IETF standardized Mobile IP, which works at the network layer and involves tunneling of IP packets from the Home Agent (HA) to the Foreign Agent. Mobile IP suffers from several problems: triangular routing, conflicts with private addressing schemes, increased load on the HA, the need for a permanent home IP address, the overhead of tunneling itself, and so on. In this paper, we propose mobility management in the application-layer protocol SIP and present a comparative analysis between Mobile IP and SIP in the context of mobility.

A Modified Maximum Urgency First Scheduling Algorithm for Real-Time Tasks

This paper presents a modified version of the maximum urgency first scheduling algorithm. The maximum urgency first algorithm combines the advantages of fixed and dynamic scheduling to provide dynamically changing systems with flexible scheduling. This algorithm, however, has a major shortcoming in its scheduling mechanism, which may cause a critical task to fail. The modified maximum urgency first scheduling algorithm resolves this problem. In this paper, we propose two possible implementations of the algorithm, using either the earliest deadline first or the modified least laxity first algorithm to calculate the dynamic priorities. The two approaches are compared through simulation, and the earliest deadline first implementation is recommended as the preferred one. Afterwards, we compare our proposed algorithm with the maximum urgency first algorithm using simulation and present the results. Modified maximum urgency first is shown to be superior to maximum urgency first, since it usually incurs fewer task preemptions and hence less related overhead. It also leads to fewer failed non-critical tasks in overloaded situations.
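
As a hedged sketch of the dynamic-priority step the abstract describes (not the authors' actual implementation), the comparator below orders ready tasks first by criticality and then by a dynamic key, earliest deadline in this case; laxity would be the alternative key for the least-laxity variant:

```java
import java.util.Comparator;
import java.util.PriorityQueue;

public class MufSketch {
    static final class Task {
        final String name;
        final boolean critical;  // member of the critical task set
        final long deadline;     // absolute deadline
        final long remaining;    // remaining execution time
        Task(String name, boolean critical, long deadline, long remaining) {
            this.name = name; this.critical = critical;
            this.deadline = deadline; this.remaining = remaining;
        }
        // Alternative dynamic key for the least-laxity variant.
        long laxity(long now) { return deadline - now - remaining; }
    }

    public static void main(String[] args) {
        // Critical tasks always precede non-critical ones (false sorts first);
        // ties are broken by earliest deadline, the EDF dynamic priority.
        Comparator<Task> mufEdf = Comparator
                .comparing((Task t) -> !t.critical)
                .thenComparingLong(t -> t.deadline);
        PriorityQueue<Task> ready = new PriorityQueue<>(mufEdf);
        ready.add(new Task("noncritical-logger", false, 5, 1));
        ready.add(new Task("critical-control", true, 20, 4));
        ready.add(new Task("critical-sensor", true, 10, 2));
        System.out.println("dispatch: " + ready.poll().name); // critical-sensor
    }
}
```

Note how the non-critical task is dispatched last despite having the earliest deadline, which is the behavior that protects critical tasks in overload.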

Extended Deductive Databases with Uncertain Information

The paper presents an approach for handling uncertain information in deductive databases using multivalued logics. Uncertainty means that database facts may be assigned logical values other than the conventional ones, true and false. The logical values represent various degrees of truth, which may be combined and propagated by applying the database rules. A corresponding multivalued database semantics is defined. We show that it extends successful conventional semantics such as the well-founded semantics, and that it has polynomial time data complexity.
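
As one simple illustration of combining and propagating degrees of truth (an assumption for exposition; the paper's actual connectives may differ), take values in $[0,1]$ and evaluate rule bodies with

$$v(A \wedge B) = \min\big(v(A), v(B)\big), \qquad v(A \vee B) = \max\big(v(A), v(B)\big),$$

so a rule's head receives a truth degree no greater than its weakest conjunct allows.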

Formal Analysis of a Public-Key Algorithm

In this article, a formal specification and verification of the Rabin public-key scheme in a formal proof system is presented. The idea is to draw on both views of cryptographic verification: the computational approach, relying on the vocabulary of probability theory and complexity theory, and the formal approach, based on ideas and techniques from logic and programming languages. A major objective of this article is the presentation of the first computer-proved implementation of the Rabin public-key scheme in Isabelle/HOL. Moreover, we explicate a computer-proven formalization of correctness as well as a computer verification of security properties using a straightforward computational model in Isabelle/HOL. The analysis uses a given database to prove formal properties of our implemented functions with computer support. The main task in designing a practical formalization of correctness and efficient computer proofs of security properties is to cope with the complexity of cryptographic proofs. We reduce this complexity by exploring a lightweight formalization that enables both appropriate formal definitions and efficient formal proofs. Consequently, we obtain reliable proofs with a minimal error rate that augment the underlying database, which provides a formal basis for further computer proof constructions in this area.
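
The verified development itself lives in Isabelle/HOL; purely to recall the scheme being formalized, here is a minimal Java sketch of Rabin encryption and the CRT-based recovery of one of the four decryption roots. Toy parameters, no padding or redundancy, and not the paper's verified code:

```java
import java.math.BigInteger;

public class RabinSketch {
    public static void main(String[] args) {
        // Toy primes with p ≡ q ≡ 3 (mod 4), as the Rabin scheme requires.
        BigInteger p = BigInteger.valueOf(7), q = BigInteger.valueOf(11);
        BigInteger n = p.multiply(q);                        // public key n = 77
        BigInteger m = BigInteger.valueOf(20);               // message, m < n
        BigInteger c = m.modPow(BigInteger.valueOf(2), n);   // encryption: c = m^2 mod n

        // Decryption: square roots modulo p and q via the (p+1)/4 exponent
        // (valid because p, q ≡ 3 mod 4), then CRT combination; this yields
        // one of the four square roots of c modulo n.
        BigInteger mp = c.modPow(p.add(BigInteger.ONE).shiftRight(2), p);
        BigInteger mq = c.modPow(q.add(BigInteger.ONE).shiftRight(2), q);
        BigInteger root = mp.multiply(q).multiply(q.modInverse(p))
                .add(mq.multiply(p).multiply(p.modInverse(q))).mod(n);
        System.out.println("c = " + c + ", one root = " + root); // c = 15, root = 64
    }
}
```

Disambiguating among the four roots (here 13, 20, 57, 64) is exactly the kind of correctness obligation a formalization must discharge.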

A Fuzzy Approach for Delay Proportion Differentiated Service

Two paradigms have been proposed to provide QoS for Internet applications: Integrated Services (IntServ) and Differentiated Services (DiffServ). IntServ is not appropriate for a large network like the Internet because it is very complex. Therefore, to reduce the complexity of QoS management, DiffServ was introduced to provide QoS within a domain using flow aggregation and per-class service. In these networks, the QoS relation between classes is constant, and it allows low-priority traffic to be affected by high-priority traffic, which is not desirable. In this paper, we propose a fuzzy controller that reduces this inter-class dependency. Our simulations show that our approach effectively reduces the latency dependency of the low-priority class on higher-priority ones.
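
The abstract does not give the control law, but delay-proportion DiffServ work is usually framed around the proportional differentiation model; under that assumption, the target is

$$\frac{\bar{d}_i}{\bar{d}_j} = \frac{\delta_i}{\delta_j}, \qquad 1 \le i, j \le N,$$

where $\bar{d}_i$ is the average queueing delay of class $i$ and $\delta_i$ its delay differentiation parameter; the fuzzy controller would then steer the measured delay ratios toward these fixed targets.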

Power-Efficient AND-EXOR-INV Based Realization of Achilles' Heel Logic Functions

This paper deals with a power-conscious AND-EXOR-Inverter type logic implementation for a complex class of Boolean functions, namely Achilles' heel functions. Different variants of this function class, viz. positive, negative, and pure Horn, have been considered for analysis and simulation purposes. The proposed realization is compared with the decomposed implementation produced by an existing standard AND-EXOR logic minimizer; both result in Boolean networks with good testability attributes. It may be noted that an AND-OR-EXOR type logic network does not exist for the positive phase of this unique class of logic functions. Experimental results report significant savings in all power consumption components for designs based on standard cells of a 130nm UMC CMOS process. The simulations have been extended to validate the savings across all three library corners (typical, best, and worst case specifications).

2D Gabor Functions and FCMI Algorithm for Flaw Detection in Ultrasonic Images

In this paper we present a new approach to detecting flaws in T.O.F.D. (Time Of Flight Diffraction) ultrasonic images based on texture features. Texture is one of the most important features used in recognizing patterns in an image. The paper describes texture features based on 2D Gabor functions, i.e., Gaussian-shaped band-pass filters with dyadic treatment of the radial spatial frequency range and multiple orientations, which represent an appropriate choice for tasks requiring simultaneous measurement in both the space and frequency domains. The most relevant features are used as input data to a fuzzy c-means clustering classifier. Only two classes exist: 'defect' and 'no defect'. The proposed approach is tested on T.O.F.D. images acquired in the laboratory and in the industrial field.
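
For reference, one standard parameterization of the 2D Gabor function the abstract refers to (the paper's exact filter bank settings are not given here) is

$$g_{\lambda,\theta}(x,y)=\exp\!\left(-\frac{x'^2+\gamma^2 y'^2}{2\sigma^2}\right)\exp\!\left(i\,\frac{2\pi x'}{\lambda}\right),\qquad x'=x\cos\theta+y\sin\theta,\quad y'=-x\sin\theta+y\cos\theta,$$

where $\lambda$ is the wavelength of the sinusoidal carrier, $\theta$ the orientation, $\sigma$ the width of the Gaussian envelope, and $\gamma$ its aspect ratio; the dyadic treatment of the radial frequency range corresponds to sampling $\lambda$ in octaves.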

Extended Well-Founded Semantics in Bilattices

One of the most used assumptions in logic programming and deductive databases is the so-called Closed World Assumption (CWA), according to which the atoms that cannot be inferred from the program are considered to be false (i.e., a pessimistic assumption). One of the most successful semantics of conventional logic programs based on the CWA is the well-founded semantics. However, the CWA is not applicable in all circumstances in which information is handled; that is, the well-founded semantics, if conventionally defined, would behave inadequately in various cases. The solution we adopt in this paper is to extend the well-founded semantics so that it can be based on other assumptions as well. The basis of (default) negative information in the well-founded semantics is given by the so-called unfounded sets. We extend this concept by considering optimistic, pessimistic, skeptical, and paraconsistent assumptions, used to complete missing information in a program. Our semantics, called the extended well-founded semantics, also expresses imperfect information, considered to be missing/incomplete, uncertain and/or inconsistent, by using bilattices as multivalued logics. We provide a method of computing the extended well-founded semantics and show that the Kripke-Kleene semantics is captured by considering a skeptical assumption. We also show that the complexity of computing our semantics is polynomial time.
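
The smallest non-trivial example of such a bilattice is Belnap's four-valued logic $\mathcal{FOUR}$ (a standard reference point, not a structure specific to this paper), loosely written as

$$\mathcal{FOUR}=\{f,\,t,\,\bot,\,\top\},\qquad f \le_t \{\bot,\top\} \le_t t,\qquad \bot \le_k \{f,t\} \le_k \top,$$

where $\le_t$ orders values by degree of truth and $\le_k$ by degree of knowledge: $\bot$ models missing information and $\top$ inconsistent information, matching the skeptical and paraconsistent readings above.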

A Supervised Text-Independent Speaker Recognition Approach

In this paper we present a supervised text-independent speaker recognition approach. In the feature extraction stage we propose a mel-cepstral based approach. Our feature vector classification method uses a special nonlinear metric, derived from the Hausdorff distance for sets, and a minimum mean distance classifier.
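
The abstract does not spell out the derived metric, so the sketch below shows only its classical starting point, the Hausdorff distance between two finite sets of feature vectors; names and data are illustrative.

```java
public class HausdorffSketch {
    // Euclidean distance between two feature vectors of equal length.
    static double dist(double[] a, double[] b) {
        double s = 0;
        for (int i = 0; i < a.length; i++) s += (a[i] - b[i]) * (a[i] - b[i]);
        return Math.sqrt(s);
    }

    // Directed Hausdorff distance h(A,B) = max over a of min over b of dist(a,b).
    static double directed(double[][] a, double[][] b) {
        double h = 0;
        for (double[] x : a) {
            double best = Double.POSITIVE_INFINITY;
            for (double[] y : b) best = Math.min(best, dist(x, y));
            h = Math.max(h, best);
        }
        return h;
    }

    // Symmetric Hausdorff distance between two sets of feature vectors.
    static double hausdorff(double[][] a, double[][] b) {
        return Math.max(directed(a, b), directed(b, a));
    }

    public static void main(String[] args) {
        double[][] speakerA = {{1, 2}, {2, 3}};      // illustrative cepstral vectors
        double[][] speakerB = {{1, 2.5}, {4, 4}};
        System.out.println(hausdorff(speakerA, speakerB));
    }
}
```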

Feedback-Controlled Server for Scheduling Aperiodic Tasks

This paper proposes a scheduling scheme that uses feedback control to reduce the response time of aperiodic tasks with soft real-time constraints. We design an algorithm based on the proposed scheduling scheme and the Total Bandwidth Server (TBS), a conventional server technique for scheduling aperiodic tasks. We then describe the feedback controller of the algorithm and give methods for tuning the control parameters. A simulation study demonstrates that the algorithm can reduce the mean response time by up to 26% compared to TBS, in exchange for a small number of deadline misses.
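
For reference, the standard TBS deadline-assignment rule from the literature (recalled here as background, not quoted from this paper): the $k$-th aperiodic request, arriving at time $r_k$ with estimated execution time $C_k$, receives the deadline

$$d_k = \max(r_k,\ d_{k-1}) + \frac{C_k}{U_s}, \qquad d_0 = 0,$$

where $U_s$ is the utilization reserved for the server, and the request is then scheduled by EDF; the paper's feedback controller acts on top of a server of this kind.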

Harmonic Parameters with HHT and Wavelet Transform for Automatic Sleep Stage Scoring

Previously, harmonic parameters (HPs) have been selected as features extracted from EEG signals for automatic sleep scoring. In previous studies, however, only one set of HPs was used, extracted directly from the whole epoch of the EEG signal. In this study, two different transformations were applied to extract HPs from EEG signals: the Hilbert-Huang transform (HHT) and the wavelet transform (WT). EEG signals were decomposed by the two transformations, and features were extracted from the different components. Twelve parameters (four sets of HPs) were extracted, some of which are highly diverse among different stages. Afterward, the HPs from the two transformations were used to build a rough sleep stage scoring model using an SVM classifier. The performance of this model is about 78% using the features obtained by our proposed extractions. Our results suggest that these features may be useful for automatic sleep stage scoring.
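
The abstract does not define the HPs; in the EEG literature they are commonly taken to be the center frequency, the bandwidth, and the spectral value at the center frequency, computed from the power spectrum $P(f)$ of the (sub-band) signal. Under that assumption:

$$f_c=\frac{\sum_f f\,P(f)}{\sum_f P(f)},\qquad f_\sigma=\sqrt{\frac{\sum_f (f-f_c)^2\,P(f)}{\sum_f P(f)}},\qquad H_{f_c}=P(f_c),$$

giving three HPs per component, which is consistent with the four sets of HPs yielding twelve parameters mentioned above.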

Robust Face Recognition using AAM and Gabor Features

In this paper, we propose a face recognition algorithm using AAM and Gabor features. Gabor feature vectors, which are well known to be robust with respect to small variations of shape, scaling, rotation, distortion, illumination, and pose in images, are popularly employed as feature vectors in many object detection and recognition algorithms. EBGM, which is prominent among face recognition algorithms employing Gabor feature vectors, requires localization of the facial feature points at which Gabor feature vectors are extracted. However, the localization method employed in EBGM is based on Gabor jet similarity and is sensitive to initial values, and wrong localization of facial feature points degrades the face recognition rate. AAM is known to be successfully applicable to the localization of facial feature points. In this paper, we devise a facial feature point localization method that first roughly estimates the facial feature points using AAM and then refines them using Gabor jet similarity-based localization, with the rough AAM estimates as initial points; we then propose a face recognition algorithm that uses the devised localization method together with Gabor feature vectors. Experiments show that such a cascaded localization method based on both AAM and Gabor jet similarity is more robust than localization based on Gabor jet similarity alone. It is also shown that the proposed face recognition algorithm, using this devised localization method and Gabor feature vectors, performs better than conventional face recognition algorithms such as EBGM that use Gabor jet similarity-based localization and Gabor feature vectors.
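
For reference, the magnitude-based Gabor jet similarity standard in the EBGM literature (assumed here; the paper may use a phase-augmented variant) is

$$S(J,J')=\frac{\sum_j a_j\,a'_j}{\sqrt{\sum_j a_j^2\;\sum_j {a'_j}^2}},$$

where $a_j$ and $a'_j$ are the magnitudes of the complex Gabor coefficients of jets $J$ and $J'$; localization searches for the point whose jet maximizes this similarity against the model jets.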

Improving Image Quality in Remote Sensing Satellites using Channel Coding

Among the factors that characterize satellite communication channels is their high bit error rate. We present a system for still image transmission over noisy satellite channels. The system couples image compression with error control coding to improve the received image quality while maintaining the bandwidth requirements. The proposed system is tested using high-resolution satellite imagery simulated over a Rician fading channel. Evaluation results show an improvement in overall system performance, including image quality and bandwidth requirements, compared to similar systems with different coding schemes.

Multi-Scale Gabor Feature Based Eye Localization

Eye localization is necessary for face recognition and related application areas. Most eye localization algorithms reported so far still need improvement in precision and computational time before they can be applied successfully. In this paper, we propose an eye localization method based on multi-scale Gabor feature vectors that is more robust with respect to initial points. Eye localization based on Gabor feature vectors first constructs an Eye Model Bunch for each eye (left or right), consisting of n Gabor jets and the average eye coordinates obtained from n model face images, and then localizes the eyes in an incoming face image by exploiting the fact that the true eye coordinates are most likely to be very close to the position whose Gabor jet best matches a Gabor jet in the Eye Model Bunch. Similar ideas have already been proposed, for example in EBGM (Elastic Bunch Graph Matching). However, the method used in EBGM is known not to be robust with respect to initial values and may need an extensive search range to achieve the required performance, which in turn causes a much larger computational burden. In this paper, we propose a multi-scale approach with only a small increase in computational burden: one first localizes the eyes, based on Gabor feature vectors, in a coarse face image obtained by downsampling the original face image, and then localizes the eyes in the original-resolution face image using the coordinates found in the coarse image as initial points. Several experiments and comparisons with eye localization methods reported in other papers show the efficiency of our proposed method.
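
A structural sketch of the coarse-to-fine scheme follows, with the jet-matching search abstracted behind a hypothetical localizeAtScale stub; nothing below reproduces the authors' code, and the search radii are made-up values.

```java
import java.awt.Point;

public class MultiScaleSketch {
    // Hypothetical single-scale localizer: searches a window of the given
    // radius around the initial point for the position whose Gabor jet best
    // matches the Eye Model Bunch. Stubbed out here.
    static Point localizeAtScale(double[][] image, Point init, int searchRadius) {
        return init; // placeholder for the Gabor-jet similarity search
    }

    static double[][] downsampleBy2(double[][] img) {
        double[][] out = new double[img.length / 2][img[0].length / 2];
        for (int y = 0; y < out.length; y++)
            for (int x = 0; x < out[0].length; x++)
                out[y][x] = img[2 * y][2 * x]; // naive decimation for illustration
        return out;
    }

    // Coarse-to-fine: localize in the downsampled image with a wide search,
    // then refine at full resolution with a narrow window around the upscaled hit.
    static Point localizeEye(double[][] image, Point init) {
        Point coarse = localizeAtScale(downsampleBy2(image),
                new Point(init.x / 2, init.y / 2), 16);
        return localizeAtScale(image, new Point(coarse.x * 2, coarse.y * 2), 4);
    }

    public static void main(String[] args) {
        Point hit = localizeEye(new double[64][64], new Point(20, 24));
        System.out.println("eye at " + hit.x + "," + hit.y);
    }
}
```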

Palmprint-based Cancelable Biometric Authentication System

The cancelable palmprint authentication system proposed in this paper is specifically designed to overcome the limitations of contemporary biometric authentication systems. In the proposed system, geometric and pseudo-Zernike moments are employed as feature extractors to transform the palmprint image into a lower-dimensional, compact feature representation. Before moment computation, a wavelet transform is adopted to decompose the palmprint image into lower-resolution, lower-dimensional frequency subbands, which drastically reduces the computational load of the moment calculation. The generated wavelet-moment based feature representation is combined with a set of random data to generate a cancelable verification key. This private binary key can be canceled and replaced. In addition, the key possesses high tolerance to data capture offsets, with highly correlated bit strings within the intra-class population. This property allows a clear separation of the genuine and impostor populations, as well as the achievement of zero Equal Error Rate, which is hardly attainable in conventional biometric authentication systems.
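
The abstract does not detail the key-generation step; one widely used construction consistent with its description (a feature vector plus user-specific random data yielding a replaceable binary key) is random projection followed by thresholding, sketched here as an assumption:

```java
import java.util.Random;

public class CancelableKeySketch {
    // Derive a replaceable binary key from a fixed-length feature vector by
    // projecting onto random vectors generated from a user-specific seed.
    // Re-issuing the key means choosing a new seed; the old key is thereby canceled.
    static int[] deriveKey(double[] feature, long userSeed, int keyBits) {
        Random rng = new Random(userSeed);
        int[] key = new int[keyBits];
        for (int b = 0; b < keyBits; b++) {
            double dot = 0;
            for (double f : feature) dot += f * rng.nextGaussian(); // random projection
            key[b] = dot >= 0 ? 1 : 0;                              // threshold at 0
        }
        return key;
    }

    public static void main(String[] args) {
        double[] waveletMomentFeature = {0.8, -1.2, 0.3, 2.1}; // illustrative values
        int[] key = deriveKey(waveletMomentFeature, 42L, 16);
        for (int bit : key) System.out.print(bit);
        System.out.println();
    }
}
```

Because nearby feature vectors produce nearby projections, intra-class samples tend toward the same bit string, which is the offset tolerance the abstract describes.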

Spectral Entropy Employment in Speech Enhancement based on Wavelet Packet

In this work, we are interested in developing a speech denoising tool using the discrete wavelet packet transform (DWPT). This tool is intended for use in recognition, coding, and synthesis applications. For noise reduction, instead of applying the classical thresholding technique, some wavelet packet nodes are set to zero and the others are thresholded. To estimate the non-stationary noise level, we employ the spectral entropy. To evaluate our approach, we compare the proposed technique to classical denoising methods based on thresholding and spectral subtraction. The experimental implementation uses speech signals corrupted by two sorts of noise: white noise and Volvo noise. Listening tests show that our proposed technique outperforms spectral subtraction, and SNR computations show its superiority over the classical thresholding method using the modified hard thresholding function based on the µ-law algorithm.
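
For reference, the spectral entropy of a frame, written here over wavelet packet node energies $E_i$ (the exact mapping to a noise-level estimate is not given in the abstract), is

$$H=-\sum_{i=1}^{N} p_i\,\log p_i,\qquad p_i=\frac{E_i}{\sum_{j=1}^{N}E_j};$$

low entropy indicates energy concentrated in a few nodes (speech-like frames), while high entropy indicates a flat distribution (noise-like frames), which is what makes it usable for tracking a non-stationary noise level.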

Application of Computational Intelligence for Sensor Fault Detection and Isolation

The new idea of this research is the application of a new fault detection and isolation (FDI) technique for the supervision of sensor networks in transportation systems. In measurement systems, it is necessary to detect all types of faults and failures, based on a predefined algorithm. Recent improvements in artificial neural networks (ANNs) have led to their use for FDI purposes. In this paper, the application of new probabilistic neural network features for data approximation and data classification is considered for plausibility checking in temperature measurement. For this purpose, a two-phase FDI mechanism is considered for residual generation and evaluation.
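
A minimal sketch of the probabilistic neural network decision rule the abstract invokes, applied to a made-up plausibility check on temperature readings; the data, kernel width, and class labels are illustrative assumptions, not values from the paper.

```java
public class PnnSketch {
    // Parzen/Gaussian kernel density of x under one class's training samples,
    // the per-class summation layer of a probabilistic neural network.
    static double classDensity(double[][] samples, double[] x, double sigma) {
        double sum = 0;
        for (double[] s : samples) {
            double d2 = 0;
            for (int i = 0; i < x.length; i++) d2 += (x[i] - s[i]) * (x[i] - s[i]);
            sum += Math.exp(-d2 / (2 * sigma * sigma));
        }
        return sum / samples.length;
    }

    public static void main(String[] args) {
        // Illustrative training data: temperature readings labeled normal/faulty.
        double[][] normal = {{20.1}, {20.4}, {19.8}, {20.0}};
        double[][] faulty = {{35.2}, {34.8}, {36.0}};
        double[] reading = {21.0};
        double sigma = 0.8; // kernel width, tuned in practice
        String verdict = classDensity(normal, reading, sigma)
                       > classDensity(faulty, reading, sigma) ? "plausible" : "faulty";
        System.out.println("sensor reading is " + verdict);
    }
}
```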

Simulation and 40 Years of Object-Oriented Programming

2007 is a jubilee year: in 1967, the programming language SIMULA 67 was presented, containing all aspects of what was later called object-oriented programming. The present paper describes the development leading up to object-oriented programming, the role of simulation in this development, and other tools that appeared in SIMULA 67 and that are nowadays called super-object-oriented programming.