EZW Coding System with Artificial Neural Networks

Image compression plays a vital role in today's communication. Limited bandwidth slows communication, so to improve the transmission rate within the available bandwidth, image data must be compressed before transmission. There are two basic types of compression: 1) lossy and 2) lossless. Although lossy compression achieves a higher compression ratio than lossless compression, its retrieval accuracy is lower. The JPEG and JPEG 2000 image compression systems use Huffman coding; JPEG 2000 additionally uses the wavelet transform, which decomposes the image into different levels in which the coefficients of each subband are uncorrelated with those of the other subbands. Embedded Zerotree Wavelet (EZW) coding exploits the multi-resolution properties of the wavelet transform to give a computationally simple algorithm with better performance than existing wavelet coders. Other coding methods have recently been suggested to further improve compression; an ANN-based approach is one such method. Artificial neural networks have been applied to many problems in image processing and have demonstrated their superiority over classical methods when dealing with noisy or incomplete data in image compression applications. We present a performance analysis of the EZW coding system combined with the error backpropagation algorithm on different images. The implementation and analysis show approximately 30% higher accuracy in the retrieved image compared with the existing EZW coding system.
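
The abstract does not give the network configuration; as a rough illustration of training the ANN side with error backpropagation, the following is a minimal single-hidden-layer autoencoder sketch in which the 8x8 block size, 16 hidden units, sigmoid activations and learning rate are all assumptions, not the paper's setup.

    import numpy as np

    # Network shape, learning rate and activation are illustrative assumptions.
    rng = np.random.default_rng(0)
    n_in, n_hidden = 64, 16                       # 8x8 pixel blocks -> 16 hidden units
    W1 = rng.normal(0.0, 0.1, (n_hidden, n_in))   # encoder weights
    W2 = rng.normal(0.0, 0.1, (n_in, n_hidden))   # decoder weights
    lr = 0.05

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def train_step(x):
        """One error-backpropagation step on a block x of 64 pixels in [0, 1]."""
        global W1, W2
        h = sigmoid(W1 @ x)                        # compressed (encoded) representation
        y = sigmoid(W2 @ h)                        # reconstructed block
        err = y - x
        delta2 = err * y * (1.0 - y)               # output-layer error signal
        delta1 = (W2.T @ delta2) * h * (1.0 - h)   # error backpropagated to hidden layer
        W2 = W2 - lr * np.outer(delta2, h)         # gradient-descent weight updates
        W1 = W1 - lr * np.outer(delta1, x)
        return float((err ** 2).mean())            # reconstruction error for monitoring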

Adaptive Motion Estimator Based on Variable Block Size Scheme

This paper presents an adaptive motion estimator that can be dynamically reconfigured with the best algorithm according to the nature of the video, which may vary during the lifetime of a running application. The 4-Step Search (4SS) and Gradient Search (GS) algorithms are integrated in the estimator for use with rapid and slow video sequences respectively. The Full Search Block Matching (FSBM) algorithm has also been integrated for use with video sequences that are not real-time oriented. To efficiently reduce the computational cost while achieving better visual quality at low power cost, the proposed motion estimator is based on a Variable Block Size (VBS) scheme that uses only the 16x16, 16x8, 8x16 and 8x8 modes. Experimental results show that the adaptive motion estimator yields better results in terms of Peak Signal-to-Noise Ratio (PSNR), computational cost, occupied FPGA area and dissipated power relative to the most popular variable block size schemes presented in the literature.
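
For context, the matching criterion such block-matching algorithms typically minimize is the sum of absolute differences (SAD). Below is a minimal software sketch of exhaustive FSBM over a rectangular search window, assuming numpy-array frames and an illustrative search range; it is not the paper's hardware implementation.

    import numpy as np

    def fsbm(cur_block, ref_frame, top, left, search_range=7):
        """Full Search Block Matching: exhaustively test every candidate
        displacement and return the motion vector minimizing the SAD."""
        bh, bw = cur_block.shape
        best_sad, best_mv = float("inf"), (0, 0)
        for dy in range(-search_range, search_range + 1):
            for dx in range(-search_range, search_range + 1):
                y, x = top + dy, left + dx
                if y < 0 or x < 0 or y + bh > ref_frame.shape[0] \
                        or x + bw > ref_frame.shape[1]:
                    continue  # candidate block falls outside the frame
                cand = ref_frame[y:y + bh, x:x + bw].astype(np.int32)
                sad = np.abs(cand - cur_block.astype(np.int32)).sum()
                if sad < best_sad:
                    best_sad, best_mv = sad, (dy, dx)
        return best_mv, best_sad

The VBS modes simply change the shape of cur_block (16x16 down to 8x8), while 4SS and GS replace the exhaustive double loop with a small number of guided probes.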

Absorption of Volatile Organic Compounds into Polydimethylsiloxane: Phase Equilibrium Computation at Infinite Dilution

Group contribution methods such as UNIFAC are very useful to researchers and engineers involved in the synthesis, feasibility studies, design and optimization of separation processes. They can be applied successfully to predict phase equilibrium and excess properties in the development of chemical and separation processes. The main focus of this work was to investigate the possibility of absorbing selected volatile organic compounds (VOCs) into polydimethylsiloxane (PDMS) using three selected UNIFAC group contribution methods. Absorption followed by subsequent stripping is the predominant available abatement technology for removing VOCs from flue gases prior to their release into the atmosphere. The original, modified and effective UNIFAC models were used in this work. The thirteen selected VOCs considered in this research are: pentane, hexane, heptane, trimethylamine, toluene, xylene, cyclohexane, butyl acetate, diethyl acetate, chloroform, acetone, ethyl methyl ketone and isobutyl methyl ketone. The computation was done for a solute VOC concentration of 8.55x10^-8, which is well within the infinite dilution region. The results obtained in this study compare very well with those published in the literature, obtained through both measurements and predictions. The phase equilibria obtained in this study show that PDMS is a good absorbent for the removal of VOCs from contaminated air streams through physical absorption.
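
For reference, the UNIFAC variants share the same basic decomposition of the activity coefficient of solute i into a combinatorial (size and shape) contribution and a residual (group interaction) contribution, with the infinite-dilution coefficient obtained as the limiting value:

    \ln \gamma_i = \ln \gamma_i^{\mathrm{C}} + \ln \gamma_i^{\mathrm{R}},
    \qquad
    \gamma_i^{\infty} = \lim_{x_i \to 0} \gamma_i

A small value of the infinite-dilution activity coefficient for a VOC in PDMS indicates strong affinity of the solute for the polymer, i.e. favorable physical absorption.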

SUPAR: System for User-Centric Profiling of Association Rules in Streaming Data

With the surge of stream processing applications, novel techniques are required for the generation and analysis of association rules in streams. Traditional rule mining solutions cannot handle streams because they generally require multiple passes over the data and do not guarantee results in a predictably small time. Though researchers have proposed algorithms for generating rules from streams, there has not been much focus on their analysis. We propose association rule profiling, a user-centric process for analyzing association rules and attaching suitable profiles to them depending on their changing frequency behavior over a previous snapshot of time in a data stream. Association rule profiles provide insights into the changing nature of associations and can be used to characterize them. We discuss the importance of characteristics such as the predictability of linkages present in the data and propose a metric to quantify it. We also show how association rule profiles can aid in the generation of user-specific, more understandable and actionable rules. The framework is implemented as SUPAR: System for User-centric Profiling of Association Rules in streaming data. The proposed system offers the following capabilities: i) continuous monitoring of the frequency of streaming item-sets and detection of significant changes therein for association rule profiling, as sketched below; ii) computation of metrics quantifying the predictability of associations present in the data; iii) user-centric control of the characterization process: the user can control the framework through a) constraint specification and b) non-interesting rule elimination.
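
A minimal sketch of capability i), assuming a count-based sliding window and a fixed support-change threshold; both are illustrative assumptions, and the paper's actual change-detection test is not reproduced here.

    from collections import Counter, deque

    class SupportMonitor:
        """Track the support of watched item-sets over a sliding window and
        flag significant changes (candidates for re-profiling)."""

        def __init__(self, itemsets, window=1000, threshold=0.05):
            self.itemsets = [frozenset(s) for s in itemsets]
            self.window, self.threshold = window, threshold
            self.buf = deque()       # per-transaction list of matched sets
            self.counts = Counter()  # occurrences within the window
            self.baseline = {}       # support at the last profile update

        def update(self, transaction):
            present = [s for s in self.itemsets if s <= frozenset(transaction)]
            self.buf.append(present)
            self.counts.update(present)
            if len(self.buf) > self.window:           # slide the window forward
                self.counts.subtract(self.buf.popleft())
            changed = []
            for s in self.itemsets:
                support = self.counts[s] / len(self.buf)
                base = self.baseline.setdefault(s, support)
                if abs(support - base) > self.threshold:
                    changed.append((s, base, support))
                    self.baseline[s] = support        # re-anchor after flagging
            return changed

update() is called once per arriving transaction (a set of items) and returns the item-sets whose support drifted beyond the threshold since their last profile.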

Monotonicity of Dependence Concepts from Independent Random Vector into Dependent Random Vector

When the failure function is monotone, monotonic reliability methods can be used to greatly simplify and facilitate reliability computations. However, these methods often work in a transformed iso-probabilistic space. To this end, a monotonic simulator or transformation is needed so that the transformed failure function remains monotone. This note first proves that the output distribution of the failure function is invariant under the transformation. It then presents conditions under which the transformed function remains monotone in the newly obtained space; these conditions concern copulas and dependence concepts. In many engineering applications, Gaussian copulas are used to approximate the real-world copulas when the available information on the random variables is limited to the set of marginal distributions and the covariances. This note therefore focuses on the conditional monotonicity of the commonly used transformation from an independent random vector into a dependent random vector with a Gaussian copula.
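
The transformation referred to is typically of the Nataf type, which builds the dependent vector with a Gaussian copula from a vector U of independent standard normal variables:

    \mathbf{Z} = L\,\mathbf{U}, \quad L L^{\top} = R, \qquad
    X_i = F_i^{-1}\!\left(\Phi(Z_i)\right), \quad i = 1, \dots, n,

where R is the Gaussian-copula correlation matrix with Cholesky factor L, \Phi is the standard normal distribution function and F_i are the prescribed marginals. Since each map F_i^{-1} \circ \Phi is nondecreasing, the monotonicity of the composite transformation hinges on the linear step Z = LU, which is where the dependence structure enters.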

Application of Adaptive Neuro-Fuzzy Inference System in Smoothing Transition Autoregressive Models

In this paper we propose and examine an Adaptive Neuro-Fuzzy Inference System (ANFIS) for Smoothing Transition Autoregressive (STAR) modeling. Because STAR models follow a fuzzy logic approach, fuzzy rules can be incorporated in the nonlinear part, or other training or computational methods, such as the error backpropagation algorithm, can be applied instead of nonlinear least squares. Furthermore, additional fuzzy membership functions can be examined besides the logistic and exponential, such as the triangular, Gaussian and generalized bell functions, among others. We examine two macroeconomic variables of the US economy: the inflation rate and the six-month Treasury bill interest rate.
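
For reference, the logistic membership function corresponds to the standard two-regime logistic STAR (LSTAR) specification:

    y_t = \phi^{\top}\mathbf{w}_t
        + \theta^{\top}\mathbf{w}_t \, G(s_t; \gamma, c) + \varepsilon_t,
    \qquad
    G(s_t; \gamma, c) = \frac{1}{1 + e^{-\gamma (s_t - c)}},

where \mathbf{w}_t = (1, y_{t-1}, \dots, y_{t-p})^{\top} collects the autoregressive terms, s_t is the transition variable, \gamma controls the smoothness of the regime change and c is the threshold; the exponential, triangular, Gaussian or generalized bell membership functions substitute for G.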

A New Heuristic for Improving the Performance of Genetic Algorithm

Hybridisation of the genetic algorithm with heuristics has been shown to be an effective way to improve its performance. In this work, a genetic algorithm hybridised with four heuristics, including a new heuristic called neighbourhood improvement, was investigated on the classical travelling salesman problem. The experimental results showed that the proposed heuristic outperformed the other heuristics both in the quality of the results obtained and in computational time.
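
The abstract does not specify the neighbourhood improvement heuristic itself; as a generic illustration of the kind of local improvement commonly hybridised into a GA for the TSP, here is a minimal 2-opt pass (an illustrative stand-in, not the paper's heuristic).

    def two_opt(tour, dist):
        """Repeatedly reverse tour segments while doing so shortens the tour.
        tour: list of city indices; dist: 2D matrix of pairwise distances."""
        n = len(tour)
        improved = True
        while improved:
            improved = False
            for i in range(1, n - 1):
                for j in range(i + 1, n):
                    a, b = tour[i - 1], tour[i]
                    c, d = tour[j], tour[(j + 1) % n]
                    # Swap edges (a,b) and (c,d) for (a,c) and (b,d) if shorter.
                    if dist[a][b] + dist[c][d] > dist[a][c] + dist[b][d]:
                        tour[i:j + 1] = reversed(tour[i:j + 1])
                        improved = True
        return tour

In a hybrid scheme, such a pass is typically applied to each offspring after crossover and mutation, before fitness evaluation.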

Human Face Detection and Segmentation using Eigenvalues of Covariance Matrix, Hough Transform and Raster Scan Algorithms

In this paper we propose a novel method for human face segmentation that uses the elliptical structure of the human head. It makes use of the information present in the edge map of the image. The approach exploits the fact that the eigenvalues of the covariance matrix represent the elliptical structure: the large and small eigenvalues of the covariance matrix are associated with the major and minor axial lengths of an ellipse. The other elliptical parameters are used to identify the centre and orientation of the face. Since an elliptical Hough transform would require a 5D Hough space, the Circular Hough Transform (CHT) is used to evaluate the elliptical parameters. A sparse matrix technique is used to perform the CHT; since it squeezes out zero elements and stores only the small number of non-zero elements, it has the advantage of less storage space and computational time. A neighborhood suppression scheme is used to identify the valid Hough peaks. The accurate positions of the circumference pixels for occluded and distorted ellipses are identified using Bresenham's raster scan algorithm, which uses geometrical symmetry properties. This method does not require the evaluation of tangents of curvature contours, which are very sensitive to noise. The method has been evaluated on several images with different face orientations.
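
A minimal sketch of the covariance-eigenvalue step, assuming the edge points sample the ellipse boundary roughly uniformly; the CHT, peak suppression and raster-scan stages are not shown.

    import numpy as np

    def ellipse_from_edges(edge_points):
        """Estimate centre, semi-axes and orientation of an elliptical edge
        cloud from the eigen-decomposition of its covariance matrix."""
        pts = np.asarray(edge_points, dtype=float)    # shape (N, 2): (x, y)
        centre = pts.mean(axis=0)
        cov = np.cov((pts - centre).T)                # 2x2 covariance matrix
        evals, evecs = np.linalg.eigh(cov)            # eigenvalues, ascending
        # For points spread uniformly along an ellipse boundary the variance
        # along a semi-axis of length a is a^2 / 2, hence the factor of 2.
        minor, major = np.sqrt(2.0 * evals)
        angle = np.arctan2(evecs[1, 1], evecs[0, 1])  # major-axis orientation
        return centre, major, minor, angle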

Efficient Realization of an ADFE with a New Adaptive Algorithm

Decision feedback equalizers are commonly employed to reduce the error caused by intersymbol interference. Here, an adaptive decision feedback equalizer is presented with a new adaptation algorithm. The algorithm follows a block-based approach to the normalized least mean square (NLMS) algorithm with set-membership filtering and achieves significantly lower computational complexity than its conventional NLMS counterpart with set-membership filtering. The results show that the proposed algorithm yields similar bit error rate performance over a reasonable signal-to-noise ratio range in comparison with the latter.
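
For reference, the conventional per-sample set-membership NLMS update that the block-based variant economizes on looks roughly as follows; real-valued signals are assumed for simplicity, and the error bound gamma and regularizer eps are illustrative.

    import numpy as np

    def sm_nlms_step(w, x, d, gamma, eps=1e-8):
        """One set-membership NLMS update.
        w: equalizer taps; x: input (regressor) vector; d: desired symbol;
        gamma: error-magnitude bound defining the membership set."""
        e = d - w @ x                      # a priori error
        if abs(e) > gamma:                 # update only when the bound is violated
            alpha = 1.0 - gamma / abs(e)   # data-dependent step size in (0, 1)
            w = w + (alpha * e / (x @ x + eps)) * x
        return w

Because most samples satisfy the error bound and trigger no update, set-membership filtering already trims the average cost; the block-based formulation reduces it further.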

Analog Circuit Design using Genetic Algorithm: Modified

Genetic Algorithms have been used to solve a wide range of optimization problems, and some research has been conducted on applying them to analog circuit design automation. This research shows better performance owing to the nature of the Genetic Algorithm. In this paper a modified Genetic Algorithm is applied to analog circuit design automation. The modifications are made to the topology of the circuit and lead to a more computationally efficient algorithm.

Aerodynamic Stall Control of a Generic Airfoil using Synthetic Jet Actuator

The aerodynamic stall control of a baseline 13-percent-thick NASA GA(W)-2 airfoil using a synthetic jet actuator (SJA) is presented in this paper. Unsteady Reynolds-averaged Navier-Stokes equations are solved on a hybrid grid using commercial software to simulate the effects of a synthetic jet actuator located at 13% of the chord from the leading edge, at a Reynolds number Re = 2.1x10^6 and incidence angles from 16 to 22 degrees. The experimental data for the pressure distribution at Re = 3x10^6 and aerodynamic coefficients at Re = 2.1x10^6 (angle of attack varied from -16 to 22 degrees) without the SJA are compared with the computational fluid dynamics (CFD) simulation as a baseline validation. Good agreement of the CFD simulations is obtained for the aerodynamic coefficients and pressure distribution. A working SJA has been integrated with the baseline airfoil, and the initial focus is on aerodynamic stall control at angles of attack from 16 to 22 degrees. The results show a noticeable improvement in aerodynamic performance, with an increase in lift and a decrease in drag in these post-stall regimes.

Eigenvalues of Particle Bound in Single and Double Delta Function Potentials through Numerical Analysis

This study employs the fourth-order Numerov scheme to determine the eigenstates and eigenvalues of particles, electrons in particular, in single and double delta function potentials. For the single delta potential, it is found that eigenstates can only be attained for specific potential depths; the depth of the delta potential well varies depending on the delta strength. These depths are then used for each well of the double delta function potential and the eigenvalues are determined. Two bound states are found in the computation, one with a symmetric eigenstate and another that is antisymmetric.
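
For reference, the fourth-order Numerov recurrence applies to equations of the form psi''(x) = f(x) psi(x), where the time-independent Schrödinger equation gives f(x) = (2m/hbar^2)(V(x) - E), and advances the solution on a uniform grid of step h as:

    \psi_{n+1} = \frac{2\left(1 - \tfrac{5h^2}{12}\, f_n\right)\psi_n
                 - \left(1 + \tfrac{h^2}{12}\, f_{n-1}\right)\psi_{n-1}}
                 {1 + \tfrac{h^2}{12}\, f_{n+1}}

Bound-state energies are then typically located by shooting: E is varied until the integrated wavefunction decays at both boundaries.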

Approximation Algorithm for the Shortest Approximate Common Superstring Problem

The Shortest Approximate Common Superstring (SACS) problem is: given a set of strings f = {w1, w2, ..., wn}, where no wi is an approximate substring of wj, i ≠ j, find a shortest string Sa such that every string of f is an approximate substring of Sa. When the number of strings n > 2, the SACS problem becomes NP-complete. In this paper, we present a greedy approximation SACS algorithm. Our algorithm is a 1/2-approximation for the SACS problem. Its computing time complexity is O(n^2 (l^2 + log n)), where n is the number of strings and l is the length of a string. Our SACS algorithm is based on the computation of the Length of the Approximate Longest Overlap (LALO).
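
A minimal sketch of the greedy superstring strategy, using exact overlaps for brevity; the paper's algorithm scores merges by the approximate-overlap length LALO instead, which tolerates mismatches.

    def overlap(a, b):
        """Length of the longest suffix of a that equals a prefix of b."""
        for k in range(min(len(a), len(b)), 0, -1):
            if a.endswith(b[:k]):
                return k
        return 0

    def greedy_superstring(strings):
        """Repeatedly merge the pair of strings with the largest overlap."""
        strings = list(strings)
        while len(strings) > 1:
            best_k, bi, bj = -1, 0, 1
            for i in range(len(strings)):
                for j in range(len(strings)):
                    if i != j:
                        k = overlap(strings[i], strings[j])
                        if k > best_k:
                            best_k, bi, bj = k, i, j
            merged = strings[bi] + strings[bj][best_k:]
            strings = [s for t, s in enumerate(strings) if t not in (bi, bj)]
            strings.append(merged)
        return strings[0]

    # Example: greedy_superstring(["cate", "tear", "arena"]) -> "catearena"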

An Approximate Engineering Method for Aerodynamic Heating Solution around Blunt Body Nose

This paper is devoted to predicting laminar and turbulent heating rates around blunt re-entry spacecraft at hypersonic conditions. Heating calculations for a hypersonic body are normally performed for the critical part of its flight trajectory. The procedure is an inverse method: a shock wave is assumed, and the body shape that supports this shock, as well as the flowfield between the shock and the body, are calculated. For simplicity, the normal momentum equation is replaced with a second-order pressure relation; this simplification significantly reduces computation time. The geometries considered in this research are parabolas and ellipsoids, which may have conical afterbodies. Excellent agreement is observed between the results obtained in this paper and those calculated in other research. Since this method is much faster than Navier-Stokes solutions, it can be used in preliminary design and parametric studies of hypersonic vehicles.

A Computer Model of Quantum Field Theory

This paper describes a computer model of Quantum Field Theory (QFT), referred to in this paper as QTModel. After specifying the initial configuration for a QFT process (e.g. scattering), the model generates the possible applicable processes in terms of Feynman diagrams and the equations for the scattering matrix, and it evaluates probability amplitudes for the scattering matrix and cross sections. The computations of probability amplitudes are performed numerically. The equations generated by QTModel are provided for demonstration purposes only; they are not directly used as the basis for the computations of probability amplitudes. The computer model supports two modes for the computation of the probability amplitudes: (1) computation according to standard QFT, and (2) computation according to a proposed functional interpretation of quantum theory.

Graph-Based Text Similarity Measurement by Exploiting Wikipedia as Background Knowledge

Text similarity measurement is a fundamental issue in many textual applications such as document clustering, classification, summarization and question answering. However, prevailing approaches based on the Vector Space Model (VSM) suffer to some degree from the limitation of the Bag of Words (BOW) representation, which ignores the semantic relationships among words. Enriching document representations with background knowledge from Wikipedia has proven to be an effective way to address this problem, but most existing methods still cannot avoid similar flaws of BOW in the new vector space. In this paper, we propose a novel text similarity measurement that goes beyond the VSM and can find semantic affinity between documents. Specifically, it is a unified graph model that exploits Wikipedia as background knowledge and synthesizes both document representation and similarity computation. Experimental results on two different datasets show that our approach significantly improves on VSM-based methods in both text clustering and classification.
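
To make the BOW limitation concrete, here is the plain VSM cosine similarity that such approaches start from (a minimal raw term-frequency sketch, without the TF-IDF weighting usually added in practice): two documents sharing no surface words score zero even when they are semantically close, which is exactly what the Wikipedia-based graph model is meant to repair.

    import math
    from collections import Counter

    def cosine_bow(doc_a, doc_b):
        """Bag-of-words cosine similarity on raw term frequencies."""
        va, vb = Counter(doc_a.lower().split()), Counter(doc_b.lower().split())
        dot = sum(va[w] * vb[w] for w in va.keys() & vb.keys())
        norm_a = math.sqrt(sum(c * c for c in va.values()))
        norm_b = math.sqrt(sum(c * c for c in vb.values()))
        return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

    # "car" and "automobile" never match lexically:
    # cosine_bow("the car drives", "an automobile is driven") == 0.0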

An Overview of the Application of Fuzzy Inference System for the Automation of Breast Cancer Grading with Spectral Data

Breast cancer is one of the most frequently occurring cancers in women throughout the world, including the UK. The grading of this cancer plays a vital role in the prognosis of the disease. In this paper we present an overview of the use of fuzzy inference systems, an advanced computational method, as a tool for the automation of breast cancer grading. A new spectral data set obtained from Fourier Transform Infrared Spectroscopy (FTIR) of cancer patients has been used for this study. We also outline, as future work, the potential areas where fuzzy systems can be used for the automation of breast cancer grading.

Using a Rao-Blackwellised Particle Filter to Track 3D Arm Motion Based on a Hierarchical Limb Model

To improve the efficiency of human 3D tracking, we present an algorithm to track 3D arm motion. First, the Hierarchical Limb Model (HLM) is proposed based on the human 3D skeleton model. Second, via graph decomposition, the arm motion state space modeled by the HLM is decomposed into two low-dimensional subspaces: root nodes and leaf nodes. Finally, a Rao-Blackwellised particle filter is used to estimate the 3D arm motion. Experimental results show that our algorithm improves computational efficiency.

Linux Cluster Possibilities in 3D Photo-Quality Imaging and Animation

In this paper we present the PC cluster built at R.V. College of Engineering (with great help from the Department of Computer Science and Electrical Engineering). The structure of the cluster is described, and its performance is evaluated by rendering complex 3D Persistence of Vision (POV) images with the ray-tracing algorithm. We propose a novel method to render such images distributedly on a low-cost, scalable cluster.

Generating High-Accuracy Tool Path for 5-axis Flank Milling of Globoidal Spatial Cam

A new tool path planning method for 5-axis flank milling of a globoidal indexing cam is developed in this paper. The globoidal indexing cam is a practical transmission mechanism due to its high transmission speed, accuracy and dynamic performance, and machining its profile is a complex and precise task. The profile surface of the globoidal cam is generated by the conjugate contact motion of the roller. This complex profile surface is usually machined by the 5-axis point-milling method, which is time-consuming compared with flank milling. A tool path for 5-axis flank milling of the globoidal cam is therefore developed to improve cutting efficiency. The flank milling tool path is globally optimized according to the minimum zone criterion, so high accuracy is guaranteed. A computational example and a cutting simulation validate the developed method.