Navigation and Self Alignment of Inertial Systems using Nonlinear H∞ Filters

Micro-electromechanical sensors (MEMS), together with global positioning devices, play a vital role in the navigation of autonomous vehicles. These sensors are low cost and readily available, but they exhibit colored noise and unpredictable discontinuities. Conventional filters such as the Kalman filter and sigma-point filters cannot cope with non-white noise. This research applies the H∞ filter in a nonlinear framework, both with the Kalman filter and with the unscented filter, for the navigation and self-alignment of an airborne vehicle. The system is simulated under colored noise and discontinuities, and the results are compared with those of non-robust nonlinear filters. The proposed filters are found to be 40%-70% more robust against colored noise and discontinuities.
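To make the filtering idea concrete, the following is a minimal sketch of one common discrete-time linear H∞ filter recursion; the paper embeds the H∞ criterion in nonlinear (EKF/UKF) frameworks, so the matrices, the performance bound theta, and the constant-velocity example below are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def hinf_filter_step(x, P, y, F, H, Q, R, theta):
    """One step of a discrete-time H-infinity filter (common textbook form).

    x, P   : current state estimate and 'covariance-like' matrix
    y      : current measurement
    F, H   : state transition and measurement matrices
    Q, R   : process and measurement noise weighting matrices
    theta  : performance bound (theta -> 0 recovers the Kalman filter)
    """
    n = x.size
    I = np.eye(n)
    Rinv = np.linalg.inv(R)
    # Gain includes the extra -theta*P term that enforces the H-infinity bound
    M = np.linalg.inv(I - theta * P + H.T @ Rinv @ H @ P)
    K = P @ M @ H.T @ Rinv
    x_next = F @ (x + K @ (y - H @ x))
    P_next = F @ P @ M @ F.T + Q
    return x_next, P_next

# Illustrative 2-state example (constant-velocity model, position measured)
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q = 0.01 * np.eye(2)
R = np.array([[0.5]])
x, P = np.zeros(2), np.eye(2)
x, P = hinf_filter_step(x, P, np.array([0.3]), F, H, Q, R, theta=0.1)
```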

Alignment of Emission Gamma Ray Sources with NaI(Tl) Scintillation Detectors by Two Laser Beams to Pre-Operation using Alternating Minimization Technique

Accurate timing alignment and stability are important to maximize the true counts and minimize the random counts in positron emission tomography. Therefore, the signals output from the detectors must be centered with respect to the two isotopes before operation and fed into four pulse-processing units, each of which can accept up to eight inputs. The dual-source computed tomography system consists of two units on the left for the 15 detector signals of the Cs-137 isotope and two units on the right for the 15 detector signals of the Co-60 isotope. The gamma spectrum consists of either a single photopeak or multiple photopeaks. This allows energy-discrimination hardware associated with the data acquisition system to acquire photon-count data at a specific energy, even when detectors with poor energy resolution are used. It also helps to avoid counting Compton-scattered photons, especially when the source emits a single discrete gamma photopeak, as in the case of Cs-137. In this study the polyenergetic version of the alternating minimization algorithm is applied to the dual-energy gamma computed tomography problem.
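As a simple illustration of the energy-discrimination step described above, the sketch below counts events whose deposited energy falls inside photopeak windows around the Cs-137 line (662 keV) and the Co-60 lines (1173 and 1332 keV); the event list and the window widths are illustrative assumptions, not the acquisition system's actual settings.

```python
import numpy as np

# Hypothetical list of recorded event energies (keV) from one detector channel
energies_keV = np.array([640, 662, 700, 1170, 1180, 1330, 300, 662, 1332])

def window_counts(energies, center, half_width):
    """Count events whose energy lies inside a symmetric window around a photopeak."""
    return int(np.sum(np.abs(energies - center) <= half_width))

# Illustrative +/-10% windows around the known photopeak energies
cs137_counts = window_counts(energies_keV, 662.0, 66.0)
co60_counts = (window_counts(energies_keV, 1173.0, 117.0)
               + window_counts(energies_keV, 1332.0, 133.0))
print(cs137_counts, co60_counts)
```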

Performance Analysis of Expert Systems Incorporating Neural Network for Fault Detection of an Electric Motor

In this paper, an artificial neural network simulator is employed to carry out diagnosis and prognosis of an electric motor, treated as rotating machinery, within a predictive maintenance framework. Vibration data from the motor under primary failure modes, including unbalance, misalignment and bearing faults, were collected to train the neural network. Training was performed for a variety of inputs, with the motor condition used as the expert training information. The main purpose of applying the neural network as an expert system was to detect the type of failure and to apply preventive maintenance. The benefit of this study to the machinery industry is the provision of appropriate maintenance, an essential activity for keeping the production process running at all stages. Proper maintenance is pivotal to prevent possible failures in the operating system and to increase the availability and effectiveness of a system through vibration monitoring and the development of an expert system.

Maximum Common Substructure Extraction in RNA Secondary Structures Using Clique Detection Approach

The similarity comparison of RNA secondary structures is important in studying the functions of RNAs. In recent years, most existing tools have represented secondary structures with tree-based representations and calculated similarity by tree alignment distance. Unlike previous approaches, we propose a new method based on a maximum clique detection algorithm to extract the maximum common structural elements of the compared RNA secondary structures. A new graph-based similarity measurement and maximum common subgraph detection procedure for comparing RNA secondary structures is introduced. Given two RNA secondary structures, the proposed algorithm first determines the score of the structural similarity by comparing vertex labels, labelled edges and the exact degree of each vertex. It then extracts the common structural elements between the compared secondary structures based on a maximum clique formulation of the problem. This graph-based model can also work with the NC-IUB code to perform pattern-based searching. Therefore, it can be used to identify functional RNA motifs in a database or to extract common substructures between complex RNA secondary structures. We have demonstrated the performance of the proposed algorithm through experimental results. It provides a new way of comparing RNA secondary structures, and the tool is helpful to those interested in structural bioinformatics.
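A minimal sketch of the clique-based idea, assuming each secondary structure has been reduced to labelled structural elements (e.g. stems and loops) connected by adjacency edges; the modular product graph construction and the element labels below are illustrative, not the paper's exact encoding.

```python
import networkx as nx

def max_common_substructure(g1, g2):
    """Maximum common substructure of two labelled graphs via maximum clique.

    g1, g2: networkx graphs whose nodes carry a 'label' attribute.
    Returns a list of (node_in_g1, node_in_g2) pairs.
    """
    # Modular product graph: vertices are label-compatible node pairs
    prod = nx.Graph()
    for u in g1:
        for v in g2:
            if g1.nodes[u]["label"] == g2.nodes[v]["label"]:
                prod.add_node((u, v))
    # Two pairs are compatible if they preserve adjacency / non-adjacency
    pairs = list(prod.nodes)
    for (u1, v1) in pairs:
        for (u2, v2) in pairs:
            if u1 != u2 and v1 != v2 and \
               g1.has_edge(u1, u2) == g2.has_edge(v1, v2):
                prod.add_edge((u1, v1), (u2, v2))
    # A maximum clique of the product graph is a maximum common substructure
    return max(nx.find_cliques(prod), key=len) if prod else []

# Tiny illustrative example: a stem (S) adjacent to a hairpin loop (H)
g1 = nx.Graph(); g1.add_edge("s1", "h1"); nx.set_node_attributes(g1, {"s1": "S", "h1": "H"}, "label")
g2 = nx.Graph(); g2.add_edge("a", "b");   nx.set_node_attributes(g2, {"a": "S", "b": "H"}, "label")
print(max_common_substructure(g1, g2))
```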

Exploring the Combinatorics of Motif Alignments for Accurately Computing E-values from P-values

In biological and biomedical research, motif finding tools are important for locating regulatory elements in DNA sequences. Many such motif finding tools are available, and they often yield position weight matrices and significance indicators. These indicators, p-values and E-values, describe respectively the likelihood that a motif alignment is generated by the background process and the expected number of occurrences of the motif in the data set. The various tools often estimate these indicators differently, making them not directly comparable. One approach for comparing motifs from different tools is to compute the E-value as the product of the p-value and the number of possible alignments in the data set. In this paper we explore the combinatorics of the motif alignment models OOPS, ZOOPS, and ANR, and propose a generic algorithm for computing the number of possible combinations accurately. We also show that using the wrong alignment model can give E-values that diverge significantly from their true values.
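As a simplified, single-strand illustration of the counting problem (not the paper's generic algorithm), the number of candidate alignments for a motif of width w can be accumulated per model as sketched below; the ANR case is deliberately crude here, since it ignores overlap constraints between sites.

```python
def num_alignments(seq_lengths, w, model="OOPS"):
    """Count candidate motif alignments for a motif of width w (single strand).

    OOPS : exactly one site per sequence    -> prod (L - w + 1)
    ZOOPS: zero or one site per sequence    -> prod (L - w + 2)
    ANR  : any number of sites per sequence -> prod 2**(L - w + 1)
           (simplified: overlaps between sites are not excluded)
    """
    total = 1
    for L in seq_lengths:
        positions = max(L - w + 1, 0)
        if model == "OOPS":
            total *= positions
        elif model == "ZOOPS":
            total *= positions + 1
        elif model == "ANR":
            total *= 2 ** positions
        else:
            raise ValueError("unknown model")
    return total

# E-value approximated as p-value times the number of candidate alignments
seq_lengths, w, p_value = [200, 350, 180], 8, 1e-7
e_value = p_value * num_alignments(seq_lengths, w, model="ZOOPS")
print(e_value)
```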

Computational Method for Annotation of Protein Sequence According to Gene Ontology Terms

Annotation of a protein sequence is pivotal for understanding its function. The accuracy of manual annotation provided by curators is still questionable because of its lesser evidence strength, and the task remains hard and time consuming. A number of computational methods and tools have been developed to tackle this challenging task. However, they require high-cost hardware, are difficult for bioscientists to set up, or depend on time-intensive, blind sequence similarity searches such as the Basic Local Alignment Search Tool. This paper introduces a new method of assigning highly correlated Gene Ontology terms of annotated protein sequences to partially annotated or newly discovered protein sequences. The method is based entirely on Gene Ontology data and annotations. Two problems had to be solved to realise this method. The first is splitting the single monolithic Gene Ontology RDF/XML file into a set of smaller files that are easier to access and process, so that these files can be enriched with protein sequences and Inferred from Electronic Annotation evidence associations. The second is searching for a set of Gene Ontology terms semantically similar to a given query. The details of the macro and micro problems involved, their solutions and the objective of this study are described. This paper also describes protein sequence annotation and the Gene Ontology, presents the methodology of the study and the Gene Ontology based protein sequence annotation tool, namely the extended UTMGO, and introduces its basic version, a Gene Ontology browser based on semantic similarity search.
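A minimal sketch of one simple way to score semantic similarity between GO terms, using the Jaccard overlap of their ancestor sets in the is_a hierarchy; this is an illustrative measure and a toy hierarchy, not necessarily the similarity used by the extended UTMGO tool.

```python
def ancestors(term, parents):
    """Return the set of all ancestors of a GO term given a child -> parents mapping."""
    seen, stack = set(), [term]
    while stack:
        t = stack.pop()
        for p in parents.get(t, []):
            if p not in seen:
                seen.add(p)
                stack.append(p)
    return seen

def go_similarity(t1, t2, parents):
    """Jaccard similarity of the ancestor sets of two GO terms (terms include themselves)."""
    a1 = ancestors(t1, parents) | {t1}
    a2 = ancestors(t2, parents) | {t2}
    return len(a1 & a2) / len(a1 | a2)

# Hypothetical tiny is_a hierarchy (child -> list of parents)
parents = {
    "GO:0006366": ["GO:0006351"],   # transcription by RNA polymerase II is_a transcription
    "GO:0006351": ["GO:0008150"],   # transcription is_a biological_process
    "GO:0006412": ["GO:0008150"],   # translation is_a biological_process
}
print(go_similarity("GO:0006366", "GO:0006412", parents))
```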

Prediction of Coast Down Time for Mechanical Faults in Rotating Machinery Using Artificial Neural Networks

Misalignment and unbalance are major concerns in rotating machinery. When the power supply to a rotating system is cut off, the system begins to lose the momentum gained during sustained operation and finally comes to rest. The exact time period from when the power is cut off until the rotor comes to rest is called the coast down time (CDT). The CDTs for different shaft cutoff speeds were recorded at various misalignment and unbalance conditions. The CDT reduction percentages were calculated for each fault, and a specific correlation exists between the CDT reduction percentage and the severity of the fault. In this paper, a radial basis network, a newer generation of artificial neural networks, is successfully incorporated for the prediction of CDT under misalignment and unbalance conditions. The radial basis network has been found to be successful in predicting CDT for mechanical faults in rotating machinery.
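A minimal sketch of a radial basis function network regressor of the kind described, with centers picked by k-means and a linear output layer; the input features, the training values and the hyperparameters below are placeholders, not the paper's measured data.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import Ridge

class RBFNetwork:
    """Radial basis function network: Gaussian hidden layer + linear output layer."""

    def __init__(self, n_centers=10, gamma=1.0):
        self.n_centers, self.gamma = n_centers, gamma

    def _features(self, X):
        # Gaussian activation of each sample with respect to every center
        d2 = ((X[:, None, :] - self.centers_[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-self.gamma * d2)

    def fit(self, X, y):
        self.centers_ = KMeans(n_clusters=self.n_centers, n_init=10).fit(X).cluster_centers_
        self.out_ = Ridge(alpha=1e-6).fit(self._features(X), y)
        return self

    def predict(self, X):
        return self.out_.predict(self._features(X))

# Hypothetical training data: [shaft speed (rpm), fault severity index] -> CDT (s)
X = np.array([[900, 0.0], [900, 0.5], [1200, 0.0], [1200, 0.5], [1500, 0.0], [1500, 0.5]], float)
y = np.array([30.0, 26.0, 42.0, 36.0, 55.0, 47.0])
model = RBFNetwork(n_centers=3, gamma=1e-5).fit(X, y)
print(model.predict(np.array([[1200.0, 0.25]])))
```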

An Efficient Classification Method for Inverse Synthetic Aperture Radar Images

This paper proposes an efficient method to classify inverse synthetic aperture radar (ISAR) images. Because ISAR images can be translated and rotated in the two-dimensional image plane, invariance to these two factors is indispensable for successful classification. The proposed method achieves invariance to translation and rotation of ISAR images using a combination of the two-dimensional Fourier transform, polar mapping and correlation-based alignment of the image. Classification is conducted using a simple matching-score classifier. In simulations using real ISAR images of five scaled models measured in a compact range, the proposed method yields classification ratios higher than 97%.
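A minimal sketch of the invariance chain described (2-D FFT magnitude for translation invariance, polar mapping so that rotation becomes a circular shift, and a correlation-based matching score); the grid sizes, interpolation settings and random test images are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def polar_magnitude_spectrum(img, n_r=64, n_theta=128):
    """|2-D FFT| (translation invariant) resampled onto a polar grid,
    so that an image rotation becomes a circular shift along the angle axis."""
    mag = np.abs(np.fft.fftshift(np.fft.fft2(img)))
    cy, cx = (np.array(mag.shape) - 1) / 2.0
    r = np.linspace(0, min(cy, cx), n_r)
    t = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
    rows = cy + r[:, None] * np.sin(t)[None, :]
    cols = cx + r[:, None] * np.cos(t)[None, :]
    return map_coordinates(mag, [rows, cols], order=1)

def matching_score(a, b):
    """Maximum normalized circular correlation over the angle axis (rotation search)."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    scores = [np.mean(a * np.roll(b, s, axis=1)) for s in range(b.shape[1])]
    return max(scores)

# Illustrative use: compare a test image against a class template, keep the best score
rng = np.random.default_rng(0)
test, template = rng.random((128, 128)), rng.random((128, 128))
print(matching_score(polar_magnitude_spectrum(test), polar_magnitude_spectrum(template)))
```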

Protein-Protein Interaction Detection Based on Substring Sensitivity Measure

Detecting protein-protein interactions is a central problem in computational biology, and aberrant interactions have been implicated in a number of neurological disorders. As a result, the prediction of protein-protein interactions has recently received considerable attention from biologists around the globe, and computational tools that can effectively identify protein-protein interactions are much needed. In this paper, we propose a method to detect protein-protein interactions based on a substring similarity measure: two protein sequences may interact by means of the similarities of the substrings they contain. When applied to the currently available protein-protein interaction data for the yeast Saccharomyces cerevisiae, the proposed method delivered a reasonable improvement over existing ones.
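A minimal sketch of a substring-based similarity between two protein sequences, using shared k-mers as a simple stand-in for the paper's substring sensitivity measure; the value of k, the decision threshold and the toy sequences are illustrative assumptions.

```python
def kmers(seq, k):
    """Set of all length-k substrings of a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def substring_similarity(seq_a, seq_b, k=3):
    """Jaccard overlap of the k-mer sets of two protein sequences."""
    a, b = kmers(seq_a, k), kmers(seq_b, k)
    return len(a & b) / len(a | b) if a | b else 0.0

def predict_interaction(seq_a, seq_b, threshold=0.05, k=3):
    """Flag a putative interaction when substring similarity exceeds a threshold."""
    return substring_similarity(seq_a, seq_b, k) >= threshold

# Illustrative toy sequences
print(predict_interaction("MKVLAAGLLTTAA", "GKVLAAGMLTT"))
```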

Measuring the CSR Company-Stakeholder Fit

As a company's competitiveness depends more and more on its relationships with its stakeholders, the topic of company-stakeholder fit is becoming increasingly important. This fit affects the extent to which a stakeholder perceives the company's CSR commitment, values and behaviors and, therefore, the stakeholder's identification with the company and his/her loyalty to it. Consequently, it is important to measure the alignment, or the gap, between stakeholder CSR demands, values, preferences and perceptions and the company's disclosed CSR commitment, values and policies. In this paper, in order to assess the company-stakeholder fit with respect to corporate responsibility, an innovative CSR fit positioning matrix is proposed. The matrix is based on the measurement of the company's disclosed CSR commitment and the stakeholder's perceived and required commitment. It is part of a broader methodology based on Global Reporting Initiative (GRI) indicators, content analysis and stakeholder questionnaires. This methodology provides appropriate indications for helping companies achieve CSR company-stakeholder fit by leveraging both CSR commitment and communication. Moreover, it could be used by top management for comparing different companies and stakeholders, and for planning specific CSR strategies, policies and activities.

Quantitative Analysis of PCA, ICA, LDA and SVM in Face Recognition

Face recognition is a technique to automatically identify or verify individuals. It receives great attention in identification, authentication, security and many other applications. Diverse methods have been proposed for this purpose, and many comparative studies have been performed, yet researchers have not reached a unified conclusion. In this paper, we report an extensive quantitative accuracy analysis of four of the most widely used face recognition algorithms: Principal Component Analysis (PCA), Independent Component Analysis (ICA), Linear Discriminant Analysis (LDA) and Support Vector Machine (SVM), using the AT&T, Sheffield and Bangladeshi people face databases under diverse conditions such as illumination, alignment and pose variations.
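For reference, a minimal sketch of one of the compared pipelines (PCA for dimensionality reduction followed by an SVM classifier); the face databases used in the paper are not bundled here, so the arrays, image size and hyperparameters below are placeholders.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder data: 100 flattened 32x32 face images from 10 subjects
rng = np.random.default_rng(0)
X = rng.random((100, 32 * 32))
y = np.repeat(np.arange(10), 10)

# PCA -> SVM pipeline; the number of components and SVM parameters are illustrative
clf = make_pipeline(StandardScaler(),
                    PCA(n_components=40),
                    SVC(kernel="rbf", C=10.0, gamma="scale"))
print(cross_val_score(clf, X, y, cv=5).mean())
```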

Reversible, Embedded and Highly Scalable Image Compression System

In this work a new method for low-complexity image coding is presented that permits different settings and great scalability in the generation of the final bit stream. The coder is a continuous-tone still image compression system that combines lossy and lossless compression using finite-arithmetic reversible transforms: both the color-space transformation and the wavelet transformation are reversible. The transformed coefficients are coded by a system based on a subdivision into smaller components (CFDS), similar to bit-importance coding. The subcomponents thus obtained are reordered by a highly configurable alignment system that, depending on the application, makes it possible to reconfigure the elements of the image and to obtain different importance levels from which the bit stream is generated. The subcomponents of each importance level are coded using a variable-length entropy coding system (VBLm) that permits the generation of an embedded bit stream. This bit stream by itself encodes a compressed still image; however, applying a packing system to the bit stream after the VBLm allows a final, highly scalable bit stream to be built from a basic image level and one or several improvement levels.
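As an illustration of a finite-arithmetic reversible transform of the kind mentioned, the sketch below implements the integer reversible color transform (RCT) used in lossless JPEG 2000; whether the paper uses exactly this transform is not stated, so treat it as an example of losslessly invertible integer color decorrelation.

```python
import numpy as np

def rct_forward(r, g, b):
    """Reversible color transform (integer arithmetic, losslessly invertible)."""
    y = (r + 2 * g + b) >> 2          # floor((R + 2G + B) / 4)
    u = r - g
    v = b - g
    return y, u, v

def rct_inverse(y, u, v):
    g = y - ((u + v) >> 2)            # exact inverse thanks to the floor in the forward step
    r = u + g
    b = v + g
    return r, g, b

# Round-trip check on random 8-bit pixels
rng = np.random.default_rng(0)
r, g, b = (rng.integers(0, 256, 1000) for _ in range(3))
restored = rct_inverse(*rct_forward(r, g, b))
assert all(np.array_equal(x, x2) for x, x2 in zip((r, g, b), restored))
```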

Edit Distance Algorithm to Increase Storage Efficiency of Javanese Corpora

Since the one-to-one word translator does not have the facility to translate pragmatic aspects of Javanese, the parallel text alignment model described here uses phrase pair combinations. The algorithm aligns the parallel text automatically from the beginning to the end of each sentence. Even though the results of the phrase pair combination outperform the previous algorithm, it is still inefficient: recording all possible combinations consumes more space in the database and more time. The original algorithm is therefore modified by applying an edit distance coefficient to improve data-storage efficiency. As a result, data-storage consumption is reduced by 90%, along with the learning period (42 s).
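A minimal sketch of the edit (Levenshtein) distance and a normalized coefficient that could be used to decide whether a candidate phrase is similar enough to an already stored one to be skipped; the threshold and the storage policy below are illustrative assumptions, not the paper's exact rule.

```python
def edit_distance(a, b):
    """Classic Levenshtein distance via dynamic programming (two-row version)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def edit_coefficient(a, b):
    """Distance normalized to [0, 1] by the longer string's length."""
    return edit_distance(a, b) / max(len(a), len(b), 1)

def should_store(candidate, stored_phrases, threshold=0.2):
    """Skip storing a phrase that is nearly identical to one already recorded."""
    return all(edit_coefficient(candidate, s) > threshold for s in stored_phrases)

print(edit_distance("aku mangan sega", "aku mangan sego"))   # 1
print(should_store("aku mangan sego", ["aku mangan sega"]))  # False: near-duplicate
```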

Pilot Study on the Impact of VLE on Mathematical Concepts Acquisition within Secondary Education in England

The research investigates the "impact of VLE on mathematical concept acquisition of special education needs (SEN) students in the KS4 secondary education sector" in England. The overall aim of the study is to establish possible areas of difficulty in approaching above or below standard knowledge requirements for KS4 students in the acquisition and validation of basic mathematical concepts. A teaching period in which a virtual learning environment (Fronter) was used to emphasise different mathematical perceptions and symbolic representations was carried out, and a task-based survey was conducted with 20 special education needs students [14 actually took part]. The results show that students were able to process information and consider images, objects and numbers within the VLE at early stages of the acquisition process. They were also able to carry out perceptual tasks, though with a limiting process of different quotient, so they needed the teacher's guidance to connect them to symbolic representations and sometimes to coach them through. The pilot study further indicates that the VLE curriculum approaches for students were only minutely aligned with mathematics teaching, which does not emphasise the integration of the VLE into the existing curriculum and current teaching practice. There was also poor alignment of vision by management regarding the use of the VLE in realising the objectives of teaching mathematics. On the part of teacher training, not much was done to develop teachers' skills in the technical and pedagogical aspects of the VLE in use at the school. The classroom observation confirmed that teaching practice can come to rely on the VLE as an enhancer of mathematical skills, providing interaction and personalisation of learning for SEN students.

Achieving Business and IT Alignment from Organisational Learning Perspectives

Business and IT alignment has remained a top concern for business and IT executives for almost three decades. Many researchers have conducted empirical studies on the relationship between business-IT alignment and performance, yet these approaches, lacking a social perspective, have had little impact on sustaining performance and competitive advantage. In addition, the alignment literature that explores organisational learning, as represented in shared understanding, communication, cognitive maps and experiences, is limited. Hence, this paper proposes an integrated process that enables the social and intellectual dimensions through the concept of organisational learning, in particular the feedback and feedforward processes, which provide value creation across dynamic, multilevel learning. This mechanism enables ongoing effectiveness through the development of individuals, groups and organisations, which improves the quality of business and IT strategies and drives performance.

A Simplified Model for Mechanical Loads under Angular Misalignment and Unbalance

This paper presents a dynamic model for the mechanical loads of an electric drive, including angular misalignment and load unbalance. The misalignment model represents the effects of the universal joint between the motor and the mechanical load. Simulation results are presented for an induction motor driving a mechanical load with angular misalignment, for both flexible and rigid coupling. The models presented are very useful in the study of mechanical fault detection in induction motors using mechanical and electrical signals already available in a drive system, such as speed, torque and stator currents.
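For reference, the standard kinematic relation of a single Cardan (universal) joint, which a misalignment model of this kind typically builds on; here beta is the misalignment angle between the shafts and theta_1 is the input shaft rotation angle (the paper's exact model may add coupling flexibility and unbalance terms on top of this).

\[
\tan\theta_2 = \cos\beta\,\tan\theta_1, \qquad
\frac{\omega_2}{\omega_1} = \frac{\cos\beta}{1 - \sin^2\beta\,\sin^2\theta_1}
\]

The second expression shows the twice-per-revolution speed oscillation of the driven shaft that angular misalignment introduces, which is what makes it observable in the drive's speed, torque and current signals.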

Software Industrialization in Systems Integration

Today's economy is in permanent change, causing mergers, acquisitions and cooperations between enterprises. As a consequence, process adaptations and realignments result in systems integration and software development projects. Processes and procedures to execute such projects still rely on the craftsmanship of highly skilled workers. A generally accepted, industrialized production, characterized by high efficiency and quality, seems inevitable. In spite of this, current concepts of software industrialization are aimed at traditional software engineering and do not consider the characteristics of systems integration. The present work points out these particularities and discusses the applicability of existing industrial concepts in the systems integration domain. Consequently, it defines further areas of research necessary to bring the field of systems integration closer to an industrialized production, allowing higher efficiency, quality and return on investment.

Face Localization and Recognition in Varied Expressions and Illumination

In this paper, we propose a robust scheme for face alignment and recognition under various influences. For face representation, illumination and variable expressions are important factors that especially affect the accuracy of facial localization and face recognition. To overcome these problems, we propose a robust approach consisting of two phases. In the first phase, face images are preprocessed with the proposed illumination normalization method, and the locations of facial features are fitted more efficiently and quickly based on the proposed image blending; building on template matching, we further improve the active shape model (called IASM) to locate the face shape more precisely, which raises the recognition rate in the next phase. The second phase performs feature extraction using principal component analysis and face recognition using support vector machine classifiers. The results show that the proposed method achieves good facial localization and face recognition under varied illumination and local distortion.

PRO-Teaching – Sharing Ideas to Develop Capabilities

In this paper, the action-research-driven design of a context-relevant, developmental peer review of teaching model, its implementation strategy and its impact at an Australian university are presented. PRO-Teaching realizes an innovative process that triangulates contemporaneous teaching quality data from a range of stakeholders, including students, discipline academics, learning and teaching expert academics, and teacher reflection, to create reliable evidence of teaching quality. Data collected over multiple classroom observations allow objective reporting on developmental differentials in constructive alignment, peer, and student evaluations. Further innovation is realized in the application of this highly structured developmental process to provide summative evidence of sufficient validity to support claims for professional advancement and learning and teaching awards. Design decision points and contextual triggers are described within the operating domain. Academics and developers seeking to introduce structured peer review of teaching into their organization will find this paper a useful reference.

Digital Terrestrial Broadcasting Technologies and Implementation Status

Digital broadcasting has been an area of active research, development, innovation and business model development in recent years. This paper presents a survey of the characteristics of the digital terrestrial television broadcasting (DTTB) standards and of the implementation status of DTTB worldwide, showing the standards adopted. It is clear that only the developed countries, and some of the developing ones, will be able to beat the ITU deadline for analogue-to-digital broadcasting migration, because of the challenges these countries face in digitizing their terrestrial broadcasting. The challenges in keeping the DTTB migration plan on track are also discussed in this paper; they include finance, the technology gap, policy alignment with DTTB technology, etc. Reported performance comparisons for the different standards are also presented. Interestingly, the results of many comparative studies depend to a large extent on the objective behind such studies, hence counter-claims are common.