Optimized Data Fusion in an Intelligent Integrated GPS/INS System Using Genetic Algorithm

Most integrated inertial navigation system (INS) and global positioning system (GPS) implementations use the Kalman filtering technique, with its drawbacks of requiring a predefined INS error model and the observability of at least four satellites. Recently, a method using a hybrid adaptive network-based fuzzy inference system (ANFIS) has been proposed; it is trained while the GPS signal is available to map the error between the GPS and the INS, and is then used to predict the error of the INS position components during GPS signal blockage. This paper introduces a genetic optimization algorithm to update the ANFIS parameters, with the INS/GPS error function used as the objective function to be minimized. The results demonstrate the advantages of the genetically optimized ANFIS for INS/GPS integration in comparison with the conventional ANFIS, especially in the case of satellite outages. Coping with this problem plays an important role in the assessment of the fusion approach in land navigation.
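
As a hedged illustration of the optimization loop described above (a minimal sketch under our own assumptions, not the authors' implementation), the Python fragment below runs a real-coded genetic algorithm over a flat parameter vector, as one might when tuning ANFIS membership-function parameters; ins_gps_error is a hypothetical stand-in for the INS/GPS position-error objective.

    import numpy as np

    def ins_gps_error(params):
        # Hypothetical objective: error of the ANFIS-predicted INS position
        # against GPS; a real implementation would evaluate the trained ANFIS.
        return float(np.sum(params ** 2))

    def genetic_minimize(objective, dim, pop=40, gens=100, mut=0.1, seed=0):
        rng = np.random.default_rng(seed)
        P = rng.normal(size=(pop, dim))                # initial population
        for _ in range(gens):
            fit = np.array([objective(p) for p in P])  # evaluate all candidates
            elite = P[np.argsort(fit)[: pop // 2]]     # truncation selection
            pa = elite[rng.integers(len(elite), size=pop)]
            pb = elite[rng.integers(len(elite), size=pop)]
            w = rng.random((pop, 1))
            P = w * pa + (1 - w) * pb                  # arithmetic crossover
            P += mut * rng.normal(size=P.shape)        # Gaussian mutation
            P[0] = elite[0]                            # elitism: keep the best
        fit = np.array([objective(p) for p in P])
        return P[np.argmin(fit)]

    best_params = genetic_minimize(ins_gps_error, dim=12)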

Discovery of Quantified Hierarchical Production Rules from Large Set of Discovered Rules

Automated rule discovery is, owing to its wide applicability, one of the most fundamental and important methods in KDD, and it has been an active research area in the recent past. Hierarchical representation allows us to easily manage the complexity of knowledge, to view the knowledge at different levels of detail, and to focus our attention on the interesting aspects only. One such efficient and easy-to-understand system is the Hierarchical Production Rule (HPR) system. An HPR, a standard production rule augmented with generality and specificity information, has the following form: Decision If <condition> Generality <generality information> Specificity <specificity information>. HPR systems are capable of handling the taxonomical structures inherent in knowledge about the real world. This paper focuses on the issue of mining quantified rules with a crisp hierarchical structure using a Genetic Programming (GP) approach to knowledge discovery. The post-processing scheme presented in this work uses quantified production rules as the initial individuals of GP and discovers a hierarchical structure. In the proposed approach, rules are quantified using Dempster-Shafer theory. Suitable genetic operators are proposed for the suggested encoding, and an appropriate fitness function based on the Subsumption Matrix (SM) is suggested. Finally, Quantified Hierarchical Production Rules (HPRs) are generated from the discovered hierarchy, again using Dempster-Shafer theory. Experimental results are presented to demonstrate the performance of the proposed algorithm.
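
For reference, the Dempster-Shafer combination used for quantification follows Dempster's rule (a standard result, not a detail specific to this paper): masses assigned to intersecting hypotheses are multiplied, and the conflicting mass K is discounted. A minimal Python sketch:

    from itertools import product

    def dempster_combine(m1, m2):
        # m1, m2: mass functions as dicts mapping frozenset -> mass.
        # Assumes the total conflict K is strictly less than 1.
        combined, conflict = {}, 0.0
        for (b, mb), (c, mc) in product(m1.items(), m2.items()):
            a = b & c
            if a:
                combined[a] = combined.get(a, 0.0) + mb * mc
            else:
                conflict += mb * mc          # mass falling on the empty set
        return {a: v / (1.0 - conflict) for a, v in combined.items()}

    m1 = {frozenset({"fires"}): 0.7, frozenset({"fires", "idle"}): 0.3}
    m2 = {frozenset({"fires"}): 0.6, frozenset({"fires", "idle"}): 0.4}
    print(dempster_combine(m1, m2))   # {fires}: 0.88, {fires, idle}: 0.12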

Content-Based Sampling over Transactional Data Streams

This paper investigates the problem of sampling from transactional data streams. We introduce CFISDS, a content-based sampling algorithm that works on a landmark window model of data streams and preserves a more informative sample in the sample space. The algorithm, which is based on closed frequent itemset mining, first initiates a concept lattice from the initial data and then updates the lattice structure using an incremental mechanism. The incremental mechanism inserts, updates, and deletes nodes in the concept lattice in a batch manner. The presented algorithm extracts the final samples on user demand. Experimental results on synthetic and real datasets show the accuracy of CFISDS, although it is not faster than existing sampling algorithms such as Z and DSS.
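
A heavily hedged sketch of the general idea (our own simplification, not the CFISDS pseudocode): within a landmark window, transactions that cover more of the currently known closed frequent itemsets are treated as more informative and are preferentially retained in the sample.

    import random

    def content_based_sample(stream, closed_itemsets, k, seed=0):
        # Keep a k-transaction sample biased toward transactions that cover
        # more of the given closed frequent itemsets (illustrative only).
        rng = random.Random(seed)
        sample, scores = [], []
        for t in stream:
            score = 1 + sum(1 for s in closed_itemsets if s <= t)
            if len(sample) < k:
                sample.append(t); scores.append(score)
            else:
                j = min(range(k), key=lambda i: scores[i] + rng.random())
                if score > scores[j]:        # evict the least informed entry
                    sample[j], scores[j] = t, score
        return sample

    stream = [frozenset(t) for t in [{"a", "b"}, {"a", "b", "c"}, {"c"}, {"a"}] * 5]
    closed = [frozenset({"a", "b"}), frozenset({"c"})]
    print(content_based_sample(iter(stream), closed, k=3))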

Phenolic Content and Antioxidant Activity Determination in Broccoli and Lamb’s Lettuce

Broccoli has been widely recognized as a vegetable rich in multiple nutrients, with potent anti-cancer properties. Lamb’s lettuce has been used as food for many centuries, but it only recently became commercially available, and the literature concerning this vegetable is therefore scarce. The aim of this work was to evaluate the influence of the extraction conditions on the yield of phenolic compounds and the corresponding antioxidant capacity of broccoli and lamb’s lettuce. The results indicate that lamb’s lettuce, compared to broccoli, combines a large amount of total polyphenols with high antioxidant activity. It is clearly demonstrated that the extraction solvent significantly influences the antioxidant activity, and that methanol is the solvent that globally maximizes the antioxidant extraction yield. The results presented herein establish lamb’s lettuce as a very interesting source of polyphenols, and thus a potential health-promoting food.

Cyber Warriors for Cyber Security and Information Assurance: An Academic Perspective

A virtual and virtualized approach is presented for academically preparing students to engage successfully, at a strategic level, with both structured and unstructured concerns and measures in the area of cyber security and information assurance. The Master of Science in Cyber Security and Information Assurance (MSCSIA) is a professional degree for those who endeavor, through technical and managerial measures, to ensure the security, confidentiality, integrity, authenticity, control, availability and utility of the world's computing and information systems infrastructure. The National University Cyber Security and Information Assurance program is offered as a Master's degree. The MSCSIA program uniquely emphasizes hands-on academic instruction using virtual computers. This past year, 2011, the NU facility became fully operational, using a system architecture that provides a Virtual Education Laboratory (VEL) accessible to both onsite and online students. The first student cohort completed their MSCSIA training on March 2, 2012, after fulfilling 12 courses, for a total of 54 units of college credit. The rapid-pace scheduling of one course per month is immensely challenging, perpetually changing, and virtually multifaceted. This paper analyses these descriptive terms in light of the globalized penetration breaches present in today's world of cyber security. In addition, we present current NU practices to mitigate risks.

Impact Assessment using Path Models of Microentrepreneurs developed by a Business Corporation in India

For scores of years now, several microfinance organizations, non-governmental organizations and other welfare organizations, with a view to aiding the progress of communities rooted in poverty, have been focusing on creating microentrepreneurs, besides taking several other measures. In recent times, business corporations have joined forces to combat poverty by taking up microenterprise development. Hindustan Unilever Limited (HUL), the Indian subsidiary of Unilever Limited, exemplifies this through its Project Shakti. Through the project, the company creates rural women entrepreneurs by making them direct-to-home sales distributors of its products in villages that have thus far been ignored by multinational corporations. The members participating in Project Shakti are largely self-help group members. The paper focuses on assessing the impact made by the company on the members engaged in Project Shakti. The analysis uses quantitative methods to compare self-help group members engaged in Project Shakti with those not engaged in it, and path analysis to study the impact made on those members engaged in Project Shakti. Significant differences were observed in entrepreneurial development, economic empowerment and social empowerment between members associated with Project Shakti and those not associated with it. Path analysis demonstrated that involvement in Project Shakti led to entrepreneurial development, resulting in economic empowerment that in turn led to social empowerment, and that these three elements independently induced in the women a feeling of privilege at being associated with the project.
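
Schematically (our notation, not the paper's), the reported path structure corresponds to a chain of structural equations

    ED = \beta_1\,PS + \varepsilon_1, \qquad EE = \beta_2\,ED + \varepsilon_2, \qquad SE = \beta_3\,EE + \varepsilon_3

where PS denotes involvement in Project Shakti, ED entrepreneurial development, EE economic empowerment, SE social empowerment, the \beta's the estimated path coefficients, and the \varepsilon's the disturbance terms.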

Determining Cluster Boundaries Using Particle Swarm Optimization

The self-organizing map (SOM) is a well-known data reduction technique used in data mining. Data visualization can reveal structure in data sets that is otherwise hard to detect from raw data alone; however, interpretation through visual inspection is prone to errors and can be very tedious. There are several techniques for the automatic detection of clusters among the code vectors found by SOMs, but they generally do not take into account the distribution of the code vectors; this may lead to unsatisfactory clustering and poor definition of cluster boundaries, particularly where the density of data points is low. In this paper, we propose the use of a generic particle swarm optimization (PSO) algorithm for finding cluster boundaries directly from the code vectors obtained from SOMs. The PSO algorithm utilizes the U-matrix of the SOM to determine cluster boundaries. The application of our method to unlabeled call data for a mobile phone operator demonstrates its feasibility, and the results of this novel automatic method correspond well to boundary detection through visual inspection of the code vectors and to the k-means algorithm.
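
As a hedged sketch (not the authors' exact formulation), a generic global-best PSO can be driven by the U-matrix directly: each particle encodes candidate boundary positions on the map grid, and its fitness is the mean U-matrix height at those positions, which is large precisely where neighbouring code vectors are far apart. The toy U-matrix and the two-point boundary encoding below are illustrative assumptions.

    import numpy as np

    def pso(fitness, dim, n=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
        rng = np.random.default_rng(seed)
        x = rng.random((n, dim))                 # positions in [0, 1]^dim
        v = np.zeros((n, dim))
        pbest, pval = x.copy(), np.array([fitness(p) for p in x])
        g = pbest[np.argmax(pval)]               # global best (maximization)
        for _ in range(iters):
            r1, r2 = rng.random((n, dim)), rng.random((n, dim))
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
            x = np.clip(x + v, 0.0, 1.0)
            val = np.array([fitness(p) for p in x])
            better = val > pval
            pbest[better], pval[better] = x[better], val[better]
            g = pbest[np.argmax(pval)]
        return g

    # Hypothetical fitness: mean U-matrix height at a 2-point boundary
    # candidate on a toy 10x10 U-matrix (high values = cluster borders).
    U = np.random.default_rng(1).random((10, 10))
    def boundary_fitness(p):
        pts = (p.reshape(-1, 2) * 9).astype(int)   # map particle to grid cells
        return U[pts[:, 0], pts[:, 1]].mean()

    print(pso(boundary_fitness, dim=4))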

Student Satisfaction Data for Work-Based Learners

This paper describes how student satisfaction data (SSD) are measured for work-based learning (WBL) learners. These are non-traditional learners who conduct academic learning in the workplace; typically their curricula have a high degree of negotiation, and their motivations are directly related to their employers' needs as well as their own career ambitions. We argue that while increasing WBL participation and the use of SSD are both accepted as being of strategic importance to the higher education agenda, the use of WBL SSD is rarely examined. Lessons can be learned from the comparison of SSD from a range of WBL programmes, and increased visibility of this type of data will provide insight into ways to improve and develop this type of delivery. The key themes that emerged from the analysis of the interview data were: learners' profiles and needs, employer drivers, academic staff drivers, organizational approach, tools for collecting data, and visibility of findings. The paper concludes with observations on best practice in the collection, analysis and use of WBL SSD, offering recommendations for both academic managers and practitioners.

Design of Nonlinear Robust Control in a Class of Structurally Stable Functions

An approach to the design of stable control systems with ultimately wide ranges of uncertain, disturbed parameters is offered. The method relies on using nonlinear, structurally stable functions from catastrophe theory as controllers. The theoretical part presents an analysis of the designed nonlinear second-order control systems; the most important cases, integrators in series, the canonical controllable form, and Jordan forms, are considered. The analysis shows that, owing to the added controllers, the systems become stable and insensitive to any disturbance of parameters. The experimental part presents MATLAB simulations of the design of control systems for epidemic spread, aircraft angular motion, and submarine depth. The simulation results confirm the efficiency of the offered design method. Keywords: Catastrophes, robust control, simulation, uncertain parameters.
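
As an illustrative sketch under our own assumptions (the paper's exact controller is not reproduced), a structurally stable cubic from catastrophe theory, u = -k1*x - k3*x**3, together with rate damping, keeps a second-order plant stable even when its linear coefficient a is uncertain and open-loop destabilizing:

    def simulate(a_uncertain, k1=2.0, k3=1.0, d=2.0, dt=1e-3, T=10.0):
        # Plant x'' = a*x + u with cubic robust control u = -k1*x - k3*x^3 - d*x'.
        # Returns |x(T)| starting from x(0) = 1; small values mean settled.
        x, xdot = 1.0, 0.0
        for _ in range(int(T / dt)):           # explicit Euler integration
            u = -k1 * x - k3 * x ** 3 - d * xdot
            xdot += dt * (a_uncertain * x + u)
            x += dt * xdot
        return abs(x)

    # The closed loop settles near the origin over a wide range of the
    # uncertain parameter, including open-loop unstable values a > 0.
    for a in (-1.0, 0.0, 1.0, 1.5):
        print(a, simulate(a))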

Target and Kaizen Costing

Increased competition and increased design costs have made it important for firms to identify the right products and the right methods for manufacturing them. Firms should focus on customers and identify customer demands directly in order to design the right products. Several management methods and techniques currently available improve one or more functions or processes in an industry but do not take the complete product life cycle into consideration. Target costing, on the other hand, is a method and philosophy that takes financial, manufacturing and customer aspects into consideration during the design phase and helps firms make product design decisions that increase the profit and value of the company. It uses various techniques to identify customer demands, to decrease manufacturing costs, and finally to achieve strategic goals. Target costing forms an integral part of total product design and redesign based on strategic plans.
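
For illustration, the core target-costing relation (a standard textbook identity, not restated in the abstract above) works backwards from the market:

    \text{Target cost} = \text{Target selling price} - \text{Target profit}

For example, if market research fixes a competitive price of $100 per unit and the strategic plan requires a $20 profit per unit, the product must be designed to an allowable cost of $80; kaizen costing then complements this by continuously driving costs down during the production phase.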

Performance Evaluation of QoS Parameters in Cognitive Radio Using Genetic Algorithm

The efficient use of the available licensed spectrum is becoming more and more critical with the increasing demand for, and usage of, the radio spectrum. This paper shows how spectrum use and dynamic spectrum management can be managed effectively, and how spectrum allocation schemes in wireless communication systems can be implemented and used in the future; it is an attempt towards better utilization of the spectrum. The research focuses mainly on the decision-making process, under the assumption that the radio environment has already been sensed and that the QoS requirements for the application have been specified, either by the sensed radio environment or by the secondary user itself. We identify and study the characteristic parameters of cognitive radio and use a genetic algorithm for spectrum allocation. Performance evaluation is done using MATLAB toolboxes.
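
A hedged sketch of the decision-making step (the parameter names, proxies and weights are our assumptions, not the paper's): each chromosome encodes normalized transmission parameters, and the fitness is a weighted combination of QoS objectives that rewards throughput while penalizing error-prone operation and power consumption.

    import numpy as np

    rng = np.random.default_rng(0)

    def qos_fitness(ch, w=(0.4, 0.3, 0.3)):
        # ch = [power, modulation_density, bandwidth], each normalized to [0, 1].
        power, mod, bw = ch
        throughput = mod * bw             # proxy: denser modulation, more bandwidth
        ber_penalty = mod * (1 - power)   # proxy: dense modulation at low power errs
        return w[0] * throughput - w[1] * ber_penalty - w[2] * power

    def ga(pop=50, gens=200, pm=0.1):
        P = rng.random((pop, 3))
        for _ in range(gens):
            fit = np.array([qos_fitness(c) for c in P])
            parents = P[np.argsort(fit)[-pop // 2:]]       # keep the fitter half
            kids = (parents[rng.integers(len(parents), size=pop)] +
                    parents[rng.integers(len(parents), size=pop)]) / 2.0
            kids += np.where(rng.random(kids.shape) < pm,  # sparse Gaussian mutation
                             rng.normal(0.0, 0.1, kids.shape), 0.0)
            P = np.clip(kids, 0.0, 1.0)
        fit = np.array([qos_fitness(c) for c in P])
        return P[np.argmax(fit)]

    print(ga())   # best [power, modulation_density, bandwidth] found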

Effect of Flying Capacitors on Improving the 4-Level Three-Cell Inverter

With the rapid advance of technology, industrial processes are becoming increasingly demanding from the point of view of power quality and controllability. The advent of multilevel inverters responds partially to these requirements, but the new generation of multi-cell inverters attains higher performance, since it offers more voltage levels. The drawback of increasing the number of voltage levels through the number of cascaded cells is the loss of synchronisation of the series IGBTs, which limits the number of cascaded cells to four. In view of these constraints, a new topology is proposed in this paper that increases the voltage levels of the three-cell inverter from 4 to 8 with the same number of IGBTs, while using less stored energy in the flying capacitors. The details of operation and modelling of this new inverter structure are presented and then tested on a three-phase induction motor. Keywords: Flying capacitors, multi-cell inverter, PWM, switches, modelling.
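
For context, the standard flying-capacitor relations behind the level count (textbook forms, not derived in the abstract): a series multicell inverter with p cells and cell switching states S_k \in \{0, 1\} produces the phase voltage

    V_{out} = \frac{E}{p} \sum_{k=1}^{p} S_k

for a DC bus E, i.e. p + 1 evenly spaced levels, with the k-th flying capacitor nominally charged to kE/p; the proposed topology rearranges the three-cell structure so that the same number of IGBTs yields 8 levels instead of 4.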

Density Estimation using a Generalized Linear Model and a Linear Combination of Gaussians

In this paper, we present a novel approach to density estimation. The proposed approach uses the logistic regression model to obtain an initial density estimate for the given empirical density. Since the empirical data do not exactly follow the logistic regression model, there will be a deviation between the empirical density and the density estimated by the logistic regression model; this deviation may be positive and/or negative. We model this deviation with a linear combination of Gaussians (LCG) with positive and negative components, and use the expectation-maximization (EM) algorithm to estimate the LCG parameters. Experiments on real images demonstrate the accuracy of our approach.
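
A minimal sketch of the estimation machinery (assuming a simplified one-dimensional setting; fitting a sign-alternating LCG requires a modified EM, so only the standard positive-component EM updates that such schemes build on are shown):

    import numpy as np

    def em_gmm(x, k=2, iters=100, seed=0):
        # Standard EM for a 1-D Gaussian mixture, the building block of LCG fitting.
        rng = np.random.default_rng(seed)
        mu, var, pi = rng.choice(x, k), np.full(k, x.var()), np.full(k, 1.0 / k)
        for _ in range(iters):
            # E-step: responsibilities r[i, j] = P(component j | sample i).
            d = (x[:, None] - mu) ** 2
            r = pi * np.exp(-0.5 * d / var) / np.sqrt(2 * np.pi * var)
            r /= r.sum(axis=1, keepdims=True)
            # M-step: re-estimate weights, means and variances.
            nj = r.sum(axis=0)
            pi = nj / len(x)
            mu = (r * x[:, None]).sum(axis=0) / nj
            var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nj
        return pi, mu, var

    rng = np.random.default_rng(1)
    deviation = np.concatenate([rng.normal(-2, 0.5, 500), rng.normal(1, 1.0, 500)])
    print(em_gmm(deviation))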

Anomaly Detection using a Neuro-Fuzzy System

As network-based technologies become omnipresent, demands to secure networks and systems against threats increase. One of the effective ways to achieve higher security is the use of intrusion detection systems (IDS), software tools that detect anomalous activity in a computer or network. In this paper, an IDS has been developed using an improved machine-learning-based algorithm, the Locally Linear Neuro-Fuzzy (LLNF) model, for classification, although this model was originally used for system identification. A key technical challenge in IDS and LLNF learning is the curse of high dimensionality; therefore, a feature selection phase applicable to any IDS is proposed. Investigating the use of three feature selection algorithms in this model, we show that adding the feature selection phase reduces the computational complexity of our model. Feature selection algorithms require a feature goodness measure; the use of both a linear measure, the linear correlation coefficient, and a non-linear measure, mutual information, is investigated.
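
An illustrative sketch of the two feature goodness measures named above (the synthetic dataset and the number of retained features are placeholders, not the paper's setup), computed with NumPy and scikit-learn:

    import numpy as np
    from sklearn.feature_selection import mutual_info_classif

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 5))                       # 5 candidate features
    y = (X[:, 0] + 0.5 * X[:, 1] ** 2 > 0).astype(int)   # labels: attack / normal

    # Linear measure: absolute Pearson correlation of each feature with the label.
    corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])

    # Non-linear measure: mutual information between each feature and the label.
    mi = mutual_info_classif(X, y, random_state=0)

    # Rank features by each goodness measure and keep, say, the top two.
    print("by correlation:", np.argsort(corr)[::-1][:2])
    print("by mutual information:", np.argsort(mi)[::-1][:2])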

Development and Evaluation of a Dynamic Cardiac Phantom for use in Nuclear Medicine

The aim of this study was to develop a dynamic cardiac phantom for quality control in myocardial scintigraphy. The dynamic heart phantom constructed contained only the left ventricle, made of an elastic material (latex) and comprising two cavities: one internal and one external. The data showed a non-significant variation in the values of left ventricular ejection fraction (LVEF) obtained by varying the heart rate. It was also possible to evaluate the LVEF through different image acquisition matrices and to perform an intercomparison of LVEF between two different scintillation cameras. The results of the quality control tests were satisfactory, showing that they can be used as parameters in future assessments. The new dynamic heart phantom was demonstrated to be effective for use in LVEF measurements; therefore, the new heart simulator is useful for the quality control of scintigraphic cameras.
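
For reference, the quantity compared across heart rates, acquisition matrices and cameras is the standard ejection fraction (a textbook definition, not restated in the abstract):

    LVEF = \frac{EDV - ESV}{EDV} \times 100\%

where EDV and ESV are the end-diastolic and end-systolic volumes of the left ventricle.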

MMU Simulation in a Hardware Simulator Based on State Transition Models

An embedded hardware simulator is a valuable computer-aided tool for embedded application development. This paper focuses on the ARM926EJ-S MMU, builds state transition models for it, and formally verifies critical properties of the models. The state transition models include a loading-instruction model, a reading-data model, and a writing-data model. The properties of the models are described in the CTL specification language and verified in VIS. The results obtained in VIS demonstrate that the critical properties of the MMU are satisfied in the state transition models, so the models can be used to implement the MMU component in our simulator. Finally, the experimental results show that the MMU can successfully accomplish memory access requests from the CPU.
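
For illustration only (the atomic propositions below are hypothetical; the abstract does not list the actual VIS properties), typical CTL properties for such models include a liveness property stating that every memory access request is eventually completed,

    AG(mmu_request -> AF mmu_done)

and a safety property stating that a TLB hit and a page-table walk never occur in the same state, AG !(tlb_hit & table_walk).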

Adsorption of Lead from Synthetic Solution using Luffa Charcoal

This work studied the batch biosorption of Pb(II) ions from aqueous solution by Luffa charcoal. The effects of operating parameters such as adsorption contact time, initial solution pH, and initial Pb(II) concentration on the sorption of Pb(II) were investigated. The results showed that the adsorption of Pb(II) ions was initially rapid and that the equilibrium time was 10 h. The adsorption kinetics of Pb(II) ions onto Luffa charcoal were best described by the pseudo-second-order model. A pH of 5.0 was favorable for the adsorption and removal of Pb(II) ions. The Freundlich adsorption isotherm model fitted the adsorption of Pb(II) ions better than the Langmuir and Temkin isotherms. The highest monolayer adsorption capacity obtained from the Langmuir isotherm model was 51.02 mg/g. This study demonstrated that Luffa charcoal could be used for the removal of Pb(II) ions in water treatment.
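
For reference, the standard forms of the models named above (textbook expressions, not restated in the abstract) are the pseudo-second-order kinetic law, the Langmuir isotherm and the Freundlich isotherm:

    \frac{t}{q_t} = \frac{1}{k_2 q_e^2} + \frac{t}{q_e}, \qquad q_e = \frac{q_m K_L C_e}{1 + K_L C_e}, \qquad q_e = K_F C_e^{1/n}

where q_t and q_e are the amounts adsorbed at time t and at equilibrium, C_e is the equilibrium concentration, and q_m is the monolayer capacity (51.02 mg/g above).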

Design and Control of PEM Fuel Cell Diffused Aeration System using Artificial Intelligence Techniques

Fuel cells have become one of the major areas of research in academia and industry. The goal of most fish farmers is to maximize production and profits while holding labor and management efforts to a minimum. Given the risk of fish kills, disease outbreaks, and poor water quality in most pond culture operations, aeration offers the most immediate and practical solution to the water quality problems encountered at higher stocking and feeding rates. Many units of an aeration system are electrical, so a continuous, highly reliable, affordable, and environmentally friendly power source is necessary. Aerating water using PEM fuel cell power is not only a new application of renewable energy; it also provides an affordable method to promote biodiversity in stagnant ponds and lakes. This paper presents a new design and control of a PEM fuel-cell-powered diffused-air aeration system for a shrimp farm in Mersa Matruh, Egypt. Artificial intelligence (AI) control techniques are used to control the fuel cell output power by controlling the input gas flow rates, and the mathematical modeling and simulation of the PEM fuel cell are introduced. A comparative study of the performance of fuzzy logic control (FLC) and neural network control (NNC) is presented; the results show the effectiveness of NNC over FLC.
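
For context, a widely used static PEM fuel cell voltage model (a standard form; the paper's exact model is not given in the abstract) is

    V_{cell} = E_{Nernst} - V_{act} - V_{ohm} - V_{conc}

where the activation, ohmic and concentration losses are subtracted from the Nernst potential; controlling the input gas flow rates shapes the reactant partial pressures, and hence E_{Nernst} and the losses, which is the handle the FLC and NNC controllers act on.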

Balancing Strategies for Parallel Content-based Data Retrieval Algorithms in a k-tree Structured Database

The paper proposes a unified model for multimedia data retrieval which includes data representatives, content representatives, an index structure, and search algorithms. The multimedia data are defined as k-dimensional signals indexed in a multidimensional k-tree structure. The benefits of the unified k-tree model were demonstrated by running the data retrieval application on a test-bed cluster of six networked nodes. The tests were performed with two retrieval algorithms: one that allows parallel searching using a single feature, and a second that performs a weighted cascade search for multiple-feature querying. The experiments show a significant reduction in retrieval time while maintaining the quality of the results.
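
A hedged sketch of the weighted cascade idea (our own simplification, not the paper's algorithm): rank candidates by the first feature, keep the best few, then re-rank the survivors by a weighted combination of all query features.

    def weighted_cascade(items, features, weights, keep=10):
        # features: scoring functions (higher = closer to the query).
        # Stage 1 ranks by the first feature; stage 2 re-ranks the survivors
        # by the weighted sum of all features (illustrative only).
        stage1 = sorted(items, key=features[0], reverse=True)[:keep]
        def combined(it):
            return sum(w * f(it) for w, f in zip(weights, features))
        return sorted(stage1, key=combined, reverse=True)

    items = list(range(100))
    feats = [lambda x: -abs(x - 42), lambda x: -abs(x - 40)]
    print(weighted_cascade(items, feats, weights=[0.6, 0.4], keep=5))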

Join and Meet Block-Based Default Definite Decision Rule Mining from IDT and an Incremental Algorithm

Using maximal consistent blocks of the tolerance relation on the universe of an incomplete decision table, the concepts of join block and meet block are introduced and studied. Besides the tolerance class, other blocks, such as the tolerant kernel and the compatible kernel of an object, are also discussed. Upper and lower approximations based on these blocks are defined. Default definite decision rules acquired from an incomplete decision table are proposed in the paper. An incremental algorithm for updating default definite decision rules is suggested for effective mining from an incomplete decision table to which data are appended. Through an example, we demonstrate how default definite decision rules based on maximal consistent blocks, join blocks and meet blocks are acquired, and how optimization is performed with the support of the discernibility matrix and discernibility function in the incomplete decision table.
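
As a hedged sketch of the underlying machinery (these are the standard rough-set definitions; the join and meet block constructions themselves are the paper's contribution and are not reproduced), the tolerance relation on an incomplete decision table declares two objects similar when their known attribute values never disagree, with '*' denoting a missing value:

    def tolerant(x, y, missing="*"):
        # Tolerance: x and y agree on every attribute where both are known.
        return all(a == b or a == missing or b == missing for a, b in zip(x, y))

    def tolerance_class(universe, x):
        # All objects of the incomplete decision table tolerant with x.
        return [y for y in universe if tolerant(x, y)]

    # Toy incomplete decision table: rows are objects, '*' marks missing values.
    U = [("high", "*", "yes"),
         ("high", "low", "yes"),
         ("low", "low", "*"),
         ("low", "high", "no")]
    for obj in U:
        print(obj, "->", tolerance_class(U, obj))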