Tuning a Fractional Order PID Controller with Lead Compensator in Frequency Domain

A lead compensator is designed to achieve the desired gain and phase margin specifications for time-delay plants stabilized with a fractional-order PID (FO-PID) controller. First, the stability range of the controlled system is determined from the stability boundary criteria. The fractional-order controller parameters are then tuned using the stability boundary locus method in the frequency domain, and attainment of the desired gain and phase margins is verified on the Bode diagram. Numerical examples are given to illustrate the shapes of the stabilizing region and to demonstrate the design procedure.
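
As a rough illustration of the frequency-domain margin check described above, the sketch below evaluates the open-loop response of a hypothetical FO-PID controller and a first-order time-delay plant on a frequency grid and reads off the gain and phase margins numerically. All plant and controller values (K, T, L, Kp, Ki, Kd, lam, mu) are illustrative assumptions, not the paper's example.

```python
import numpy as np

# Hypothetical first-order plant with time delay: G(s) = K * exp(-L*s) / (T*s + 1)
K, T, L = 1.0, 2.0, 0.5
# Hypothetical FO-PID gains and fractional orders: C(s) = Kp + Ki/s**lam + Kd*s**mu
Kp, Ki, Kd, lam, mu = 1.2, 0.8, 0.4, 0.9, 1.1

w = np.logspace(-3, 2, 20000)          # frequency grid (rad/s)
s = 1j * w
G = K * np.exp(-L * s) / (T * s + 1)   # plant frequency response
C = Kp + Ki / s**lam + Kd * s**mu      # FO-PID frequency response
Lo = C * G                             # open-loop frequency response

mag_db = 20 * np.log10(np.abs(Lo))
phase = np.unwrap(np.angle(Lo))        # radians, unwrapped

# Phase margin: 180 deg plus the phase at the gain-crossover frequency (|Lo| = 1).
i_gc = np.argmin(np.abs(mag_db))
pm_deg = 180.0 + np.degrees(phase[i_gc])

# Gain margin: inverse magnitude (in dB) where the phase crosses -180 degrees.
cross = np.where(np.diff(np.sign(phase + np.pi)))[0]
gm_db = -mag_db[cross[0]] if cross.size else np.inf

print(f"gain crossover ~ {w[i_gc]:.3f} rad/s, phase margin ~ {pm_deg:.1f} deg")
print(f"gain margin ~ {gm_db:.1f} dB" if np.isfinite(gm_db) else "no -180 deg crossing found")
```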

Experimental Investigation of Chatter Vibrations in Facing and Turning Processes

This paper investigates the occurrence of regenerative chatter vibrations in facing and turning processes. Orthogonal turning (facing) and normal turning experiments are carried out under stable conditions as well as in the presence of controlled chatter vibrations. The effects of chatter on various sensor signals are captured and analyzed using frequency-domain methods, which successfully detect the chatter vibrations close to the dominant mode of the machine tool system.
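
A minimal sketch of the kind of frequency-domain chatter check described above, using a synthetic acceleration signal with a chatter component emerging near a dominant structural mode; the 120 Hz mode, sampling rate, and energy-ratio threshold are illustrative assumptions, not values from the experiments.

```python
import numpy as np

fs = 5000.0                      # assumed sampling rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
f_mode = 120.0                   # assumed dominant structural mode (Hz)

# Synthetic sensor signal: a spindle-related harmonic plus a chatter component
# growing near the dominant mode, buried in measurement noise.
signal = (0.5 * np.sin(2 * np.pi * 30 * t)        # spindle rotation harmonic
          + 1.5 * np.sin(2 * np.pi * 118 * t)     # chatter near the mode
          + 0.2 * np.random.randn(t.size))

spectrum = np.abs(np.fft.rfft(signal * np.hanning(t.size)))
freqs = np.fft.rfftfreq(t.size, 1 / fs)

# Chatter indicator: spectral energy in a band around the dominant mode
# relative to the total spectral energy (simple ratio threshold).
band = (freqs > f_mode - 20) & (freqs < f_mode + 20)
ratio = spectrum[band].sum() / spectrum.sum()
print(f"band energy ratio near {f_mode:.0f} Hz: {ratio:.2f}",
      "-> chatter suspected" if ratio > 0.3 else "-> stable")
```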

A New High Speed Neural Model for Fast Character Recognition Using Cross Correlation and Matrix Decomposition

Neural processors have shown good results for detecting a certain character in a given input matrix. In this paper, a new idea to speed up the operation of neural processors for character detection is presented. Such processors are designed based on cross correlation in the frequency domain between the input matrix and the weights of the neural networks. The approach reduces the number of computation steps required by these faster neural networks for the searching process. The divide-and-conquer principle is applied through image decomposition: each image is divided into small sub-images, and each sub-image is tested separately by a single faster neural processor. Further speed-up is obtained by using parallel processing techniques to test the resulting sub-images simultaneously with the same number of faster neural networks. In contrast to using faster neural processors alone, combining them with image decomposition yields a speed-up ratio that increases with the size of the input image. Moreover, the problem of local sub-image normalization in the frequency domain is solved. The effect of image normalization on the speed-up ratio of character detection is discussed. Simulation results show that local sub-image normalization through weight normalization is faster than sub-image normalization in the spatial domain. The overall speed-up ratio of the detection process is further increased because the weight normalization is done off line.
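
The core operation described above, cross correlation computed in the frequency domain, can be sketched as follows: an FFT-based correlation of an image (or sub-image) with a weight template, written with NumPy and toy sizes. This illustrates the general technique only, not the authors' specific neural processor.

```python
import numpy as np

def xcorr_fft(image, template):
    """Cross-correlate `template` with `image` via the frequency domain.

    Equivalent to sliding the template over the (circularly extended) image,
    but computed as an element-wise product of FFTs.
    """
    H, W = image.shape
    F_img = np.fft.fft2(image)
    # Conjugate of the template spectrum <=> correlation (not convolution).
    F_tpl = np.conj(np.fft.fft2(template, s=(H, W)))
    return np.real(np.fft.ifft2(F_img * F_tpl))

rng = np.random.default_rng(0)
image = rng.standard_normal((256, 256))
template = image[40:60, 100:120].copy()      # plant the pattern at (40, 100)

scores = xcorr_fft(image, template)
peak = np.unravel_index(np.argmax(scores), scores.shape)
print("correlation peak at", peak)           # expected at (40, 100)

# Decomposition idea from the abstract: split the image into sub-images and
# correlate each one independently (these calls could run in parallel).
subs = [image[r:r+128, c:c+128] for r in (0, 128) for c in (0, 128)]
sub_peaks = [np.max(xcorr_fft(s, template)) for s in subs]
print("per-sub-image peak scores:", [round(p, 1) for p in sub_peaks])
```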

Hybrid Modulation Technique for Fingerprinting

This paper addresses an efficient technique to embed and detect a digital fingerprint code. Orthogonal modulation is a straightforward and widely used approach for digital fingerprinting, but it shows several limitations in computational cost and signal efficiency. Coded modulation can overcome these limitations in theory; however, it is difficult to make it perform well in practice when host signals are not available while tracing colluders, when other kinds of attacks are applied, or when the fingerprint code becomes large. In this paper, we propose a hybrid modulation method that combines the merits of orthogonal modulation and coded modulation so that low computational cost and high signal efficiency can be achieved. To analyze the performance, we design a new fingerprint code based on GD-PBIBD theory and modulate this code into images with our method using spread-spectrum watermarking in the frequency domain. The results show that the proposed method can efficiently handle large fingerprint codes and trace colluders under averaging attacks.

Relative Radiometric Correction of Cloudy Multitemporal Satellite Imagery

Repeated observation of a given area over time yields potential for many forms of change detection analysis. These repeated observations are confounded in terms of radiometric consistency due to changes in sensor calibration over time, differences in illumination and observation angles, and variation in atmospheric effects. This paper demonstrates the applicability of an empirical relative radiometric normalization method to a set of multitemporal cloudy images acquired by the Resourcesat1 LISS III sensor. The objective of this study is to detect and remove cloud cover and to normalize the images radiometrically. Cloud detection is achieved using the Average Brightness Threshold (ABT) algorithm. The detected cloud is removed and replaced with data from other images of the same area. After cloud removal, the proposed normalization method is applied to reduce the radiometric influence caused by non-surface factors. This process identifies landscape elements whose reflectance values are nearly constant over time, i.e. the subset of non-changing pixels is identified using a frequency-based correlation technique. The quality of the radiometric normalization is statistically assessed by the R2 value and the mean square error (MSE) between each pair of analogous bands.
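
As an illustration of the relative normalization step, the sketch below fits a per-band linear (gain/offset) mapping from a subject image to a reference image using an assumed mask of pseudo-invariant pixels, and reports the R2 and MSE quality measures mentioned in the abstract. The synthetic data, the all-true mask, and the gain/offset values are illustrative assumptions; the paper's frequency-based pixel selection is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic single-band reference and subject images (e.g., two acquisition dates).
reference = rng.uniform(50, 200, size=(200, 200))
subject = 0.8 * reference + 12 + rng.normal(0, 2, size=reference.shape)

# Assumed mask of pseudo-invariant (non-changing) pixels; in practice this subset
# would come from the frequency-based correlation step of the method.
mask = np.ones(reference.shape, dtype=bool)

x = subject[mask].ravel()
y = reference[mask].ravel()

# Least-squares gain and offset so that gain * subject + offset ~ reference.
gain, offset = np.polyfit(x, y, 1)
normalized = gain * subject + offset

# Quality of the normalization: R^2 and mean square error for the band.
residuals = reference[mask] - normalized[mask]
mse = np.mean(residuals ** 2)
r2 = 1 - residuals.var() / reference[mask].var()
print(f"gain={gain:.3f}, offset={offset:.2f}, R^2={r2:.4f}, MSE={mse:.2f}")
```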

Effect of Carbon Amount of Dual-Phase Steels on Deformation Behavior Using Acoustic Emission

In this study, acoustic emission (AE) signals obtained during deformation and fracture of two types of ferrite-martensite dual-phase steel (DPS) specimens have been analyzed in the frequency domain. To this end, two low-carbon steels with different carbon contents were chosen and intercritically heat treated. The introduced method identifies the failure mechanisms in the various phases of the DPS. For this purpose, AE monitoring was used during tensile tests of several DPS specimens with various volume fractions of martensite (VM), and the AE signals were related to the failure mechanisms in these steels. Distinct signals, corresponding to two to three failure micro-mechanisms depending on the carbon content and the VM, were observed. Fast Fourier Transformation (FFT) of the signals at distinct locations revealed an excellent relationship between the peak frequencies in these areas and the failure micro-mechanisms. The results were verified by microscopic (SEM) observations.

Integrating Fast Karnaugh Map and Modular Neural Networks for Simplification and Realization of Complex Boolean Functions

In this paper, a new fast simplification method is presented. The method realizes Karnaugh maps with a large number of variables. In order to accelerate the operation of the proposed method, a new approach for fast detection of groups of ones is presented. This approach is implemented in the frequency domain: the search operation relies on performing cross correlation in the frequency domain rather than in the time domain. It is proved mathematically and practically that the number of computation steps required by the presented method is less than that needed by conventional cross correlation. Simulation results using MATLAB confirm the theoretical computations. Furthermore, a powerful solution for the realization of complex functions is given. The simplified functions are implemented using a new design of neural networks. Neural networks are used because they are fault tolerant and can therefore recognize signals even with noise or distortion, which is very useful for logic functions used in data and computer communications. Moreover, the implemented functions are realized with a minimum amount of components. This is done by using modular neural networks (MNNs) that divide the input space into several homogenous regions. The approach is applied to implement the XOR function, 16 logic functions on one bit level, and a 2-bit digital multiplier. Compared to previous non-modular designs, a clear reduction in the order of computations and hardware requirements is achieved.

Comprehensive Study on the Linear Hydrodynamic Analysis of a Truss Spar in Random Waves

Truss spars are used for oil exploitation in deep and ultra-deep water when storage of crude oil is not needed. Linear hydrodynamic analysis of a truss spar under random sea wave loads is necessary for determining its behaviour, and this understanding is important not only for the design of the mooring lines but also for optimising the truss spar design. In this paper, linear hydrodynamic analysis of a truss spar is carried out in the frequency domain. The hydrodynamic forces are calculated using the modified Morison equation and diffraction theory. The added mass and drag coefficients of the truss section, together with the normal acceleration and velocity components acting on each element, are computed by the transmission matrix method, while those of the hull section are computed by strip theory. The stiffness properties of the truss spar are separated into two components: hydrostatic stiffness and mooring line stiffness. The platform response amplitudes are then obtained by solving the equation of motion, which is non-linear due to the viscous damping term and is therefore linearised by an iteration method [1]. Finally, the RAOs and significant response amplitudes are computed and the results are compared with experimental data.
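
A minimal sketch of the modified Morison force term used in such a load model, evaluated per unit length of a slender cylindrical truss member. The member diameter, inertia and drag coefficients, and the relative velocity/acceleration inputs are illustrative assumptions, not values from this study.

```python
import numpy as np

RHO = 1025.0   # sea water density (kg/m^3)

def morison_force_per_length(D, Cm, Cd, u_rel, a_fluid):
    """Modified Morison force per unit length on a slender cylinder.

    D        : member diameter (m)
    Cm, Cd   : inertia and drag coefficients
    u_rel    : fluid velocity relative to the member, normal component (m/s)
    a_fluid  : fluid acceleration, normal component (m/s^2)
    """
    area = np.pi * D**2 / 4.0
    inertia = RHO * Cm * area * a_fluid                 # inertia (added-mass) term
    drag = 0.5 * RHO * Cd * D * u_rel * np.abs(u_rel)   # quadratic viscous drag term
    return inertia + drag

# Example: one truss member in a single regular wave component.
print(morison_force_per_length(D=1.2, Cm=2.0, Cd=0.7, u_rel=1.5, a_fluid=0.8), "N/m")
```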

Robustness of Hybrid Learning Acceleration Feedback Control Scheme in Flexible Manipulators

This paper describes a practical approach to design and develop a hybrid learning with acceleration feedback control (HLC) scheme for input tracking and end-point vibration suppression of flexible manipulator systems. Initially, a collocated proportional-derivative (PD) control scheme using hub-angle and hub-velocity feedback is developed for control of rigid-body motion of the system. This is then extended to incorporate a further hybrid control scheme of the collocated PD control and iterative learning control with acceleration feedback using genetic algorithms (GAs) to optimize the learning parameters. Experimental results of the response of the manipulator with the control schemes are presented in the time and frequency domains. The performance of the HLC is assessed in terms of input tracking, level of vibration reduction at resonance modes and robustness with various payloads.

Frequency-Domain Design of Fractional-Order FIR Differentiators

In this paper, a fractional-order FIR differentiator design method using the differential evolution (DE) algorithm is presented. In the proposed method, the FIR digital filter is designed to meet the frequency response of a desired fractional-order differentiator, and the approximation error is evaluated in the frequency domain. To verify the design performance, another design method formulated in the time domain is also provided. Simulation results demonstrate the efficiency of the proposed method.
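
A small sketch of the frequency-domain design idea using SciPy's differential evolution: the FIR coefficients are optimized so that the filter's frequency response approximates the ideal fractional differentiator (jω)^α over a passband. The filter length, fractional order α, band edges, and optimizer settings are illustrative assumptions, not the paper's design.

```python
import numpy as np
from scipy.optimize import differential_evolution

alpha = 0.5          # assumed fractional order of the differentiator
N = 11               # assumed FIR length (number of taps)
w = np.linspace(0.05 * np.pi, 0.95 * np.pi, 128)   # design band (rad/sample)

# Desired response of a fractional-order differentiator: D(w) = (j*w)^alpha.
desired = (1j * w) ** alpha

def freq_response(h, w):
    """Frequency response H(e^{jw}) = sum_n h[n] e^{-jwn} of an FIR filter."""
    n = np.arange(len(h))
    return np.exp(-1j * np.outer(w, n)) @ h

def cost(h):
    """Sum of squared complex errors between the FIR and the desired response."""
    return np.sum(np.abs(freq_response(h, w) - desired) ** 2)

result = differential_evolution(cost, bounds=[(-2, 2)] * N, seed=0,
                                maxiter=300, tol=1e-8, polish=True)
h_opt = result.x
print("final cost:", result.fun)
print("taps:", np.round(h_opt, 4))
```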

Fractal Analysis on Human Colonic Pressure Activities based on the Box-counting Method

The colonic tissue is a complicated dynamic system, and the colonic activities it generates are composed of irregular segmental waves, referred to as erratic fluctuations or spikes, which are highly irregular and exhibit a fractal substructure. Traditional time- and frequency-domain statistics such as the averaged amplitude, the motility index and the power spectrum are insufficient to describe such fluctuations. The fractal box-counting dimension is therefore proposed, and the fractal scaling behaviour of human colonic pressure activities under physiological conditions is studied. It is shown that the dimension of the resting activity is smaller than that of normal activity, whereas the clipped version, which corresponds to the activity of constipation patients, shows a higher fractal dimension. This suggests a practical application for assessing colonic motility, which is often indicated by the colonic pressure activity.
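
The box-counting estimate referred to above can be sketched for a one-dimensional pressure recording as follows, treating the signal's graph as a set of points in the unit square and counting occupied boxes at several scales. The synthetic signal and the chosen box sizes are illustrative, not data from the study.

```python
import numpy as np

def box_counting_dimension(signal, box_sizes):
    """Estimate the box-counting dimension of a 1-D signal's graph.

    The graph (t, x(t)) is normalized to the unit square, covered by grids of
    side `eps`, and the slope of log N(eps) versus log(1/eps) is returned.
    """
    t = np.linspace(0, 1, signal.size)
    x = (signal - signal.min()) / (np.ptp(signal) + 1e-12)
    counts = []
    for eps in box_sizes:
        # Index of the box that each sample of the graph falls into.
        boxes = set(zip((t // eps).astype(int), (x // eps).astype(int)))
        counts.append(len(boxes))
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(box_sizes)), np.log(counts), 1)
    return slope

rng = np.random.default_rng(2)
# Synthetic "pressure" trace: slow segmental waves plus irregular fluctuations.
t = np.linspace(0, 1, 8192)
pressure = np.sin(2 * np.pi * 3 * t) + 0.5 * np.cumsum(rng.standard_normal(t.size)) / 60

box_sizes = [1/8, 1/16, 1/32, 1/64, 1/128]
print("estimated box-counting dimension:",
      round(box_counting_dimension(pressure, box_sizes), 3))
```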

Computer Software Applicable in Rehabilitation, Cardiology and Molecular Biology

We have developed a computer program consisting of 6 subtests assessing children's hand dexterity, applicable in rehabilitation medicine. We have carried out a normative study on a representative sample of 285 children aged 7 to 15 (mean age 11.3) and proposed clinical standards for three age groups (7-9, 9-11, 12-15 years). We have shown the statistical significance of differences among the corresponding mean task completion times. We have also found a strong correlation between task completion time and the age of the subjects, and we have performed test-retest reliability checks in a sample of 84 children, giving high Pearson coefficients for the dominant and non-dominant hand in the ranges 0.74-0.97 and 0.62-0.93, respectively. A new MATLAB-based programming tool for the analysis of cardiologic RR intervals and blood pressure descriptors has also been developed. For each set of data, ten different parameters are extracted: 2 in the time domain, 4 in the frequency domain and 4 from Poincaré plot analysis. In addition, twelve different parameters of baroreflex sensitivity are calculated. All these data sets can be visualized in the time domain together with their power spectra and Poincaré plots. If available, the respiratory oscillation curves can also be plotted for comparison. Another application processes biological data obtained from BLAST analysis.
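
As a rough sketch of the kinds of descriptors mentioned above (time domain, frequency domain and Poincaré plot), the code below computes a few standard heart rate variability parameters from a synthetic RR-interval series. The parameter selection, band limits and resampling rate are common conventions assumed for illustration, not necessarily those of the described tool.

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(3)
# Synthetic RR-interval series (ms): ~70 bpm with respiratory-like modulation.
n_beats = 600
beat_idx = np.arange(n_beats)
rr = 850 + 40 * np.sin(2 * np.pi * 0.25 * beat_idx * 0.85) + 15 * rng.standard_normal(n_beats)

# --- Time domain ---
sdnn = np.std(rr, ddof=1)                        # overall variability
rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))       # beat-to-beat variability

# --- Frequency domain (resample RR onto a uniform time grid first) ---
t_beats = np.cumsum(rr) / 1000.0                 # beat times (s)
fs = 4.0                                         # resampling rate (Hz)
t_uni = np.arange(t_beats[0], t_beats[-1], 1 / fs)
rr_uni = np.interp(t_uni, t_beats, rr)
f, pxx = welch(rr_uni - rr_uni.mean(), fs=fs, nperseg=256)
df = f[1] - f[0]
lf = pxx[(f >= 0.04) & (f < 0.15)].sum() * df    # low-frequency power
hf = pxx[(f >= 0.15) & (f < 0.40)].sum() * df    # high-frequency power

# --- Poincaré plot descriptors ---
sd1 = np.std(np.diff(rr), ddof=1) / np.sqrt(2)
sd2 = np.sqrt(2 * sdnn ** 2 - sd1 ** 2)

print(f"SDNN={sdnn:.1f} ms  RMSSD={rmssd:.1f} ms  LF/HF={lf / hf:.2f}  "
      f"SD1={sd1:.1f}  SD2={sd2:.1f}")
```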

A DCT-Based Secure JPEG Image Authentication Scheme

The challenge in image authentication is that in many cases images need to be subjected to non-malicious operations such as compression, so authentication techniques need to be compression tolerant. In this paper we propose an image authentication system that is tolerant to JPEG lossy compression. A scheme for JPEG grey-scale images is proposed, based on a data embedding method that uses a secret key and a secret mapping vector in the frequency domain. An encrypted feature vector extracted from the image DCT coefficients is embedded redundantly and invisibly in the marked image. On the receiver side, the feature vector is derived again from the received image and compared against the extracted watermark to verify the image's authenticity. The proposed scheme is robust against JPEG compression up to a maximum compression of approximately 80%, but sensitive to malicious attacks such as cutting and pasting.
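
A simplified sketch of the feature-extraction side of such a scheme: a grey-scale image is split into 8x8 blocks, a 2-D DCT is applied, and a feature vector is formed from the signs of a few low-frequency coefficients selected by a key-dependent choice. The block size, coefficient pool, and keyed selection are illustrative assumptions; the encryption, mapping vector, and embedding steps of the actual scheme are not reproduced.

```python
import numpy as np
from scipy.fft import dctn

def block_dct_features(image, key, coeffs_per_block=3):
    """Extract a keyed feature vector from the 8x8 block DCT of a grey image."""
    h, w = (d - d % 8 for d in image.shape)            # crop to whole 8x8 blocks
    image = image[:h, :w].astype(float)
    rng = np.random.default_rng(key)                   # key-dependent coefficient choice
    low_freq = np.array([1, 2, 8, 9, 10, 16, 17, 18])  # top-left 3x3 region, DC excluded
    features = []
    for r in range(0, h, 8):
        for c in range(0, w, 8):
            block = dctn(image[r:r+8, c:c+8], norm='ortho')   # 2-D DCT of the block
            idx = rng.choice(low_freq, size=coeffs_per_block, replace=False)
            features.append(np.sign(block.flatten()[idx]))    # keep only coefficient signs
    return np.concatenate(features)

rng = np.random.default_rng(4)
image = rng.integers(0, 256, size=(64, 64))
fv = block_dct_features(image, key=1234)
print("feature vector length:", fv.size)

# Verification idea: recompute the features on the receiver side and compare them
# against the decrypted, extracted watermark (here trivially identical).
fv_received = block_dct_features(image, key=1234)
print("match ratio:", np.mean(fv == fv_received))
```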

Effect of Physical Contact (Hand-Holding) on Heart Rate Variability

The heart's electric field can be measured anywhere on the surface of the body (ECG). When individuals touch, one person's ECG signal can be registered in the other person's EEG and elsewhere on their body. The aim of this study was to test the hypothesis that physical contact (hand-holding) between two persons changes their heart rate variability. Subjects were sixteen healthy females (age 20-26), divided into eight sets. Each set consisted of two friends who had passed J. Sternberg's intimacy test. The ECGs of the two subjects in each set were acquired for 5 minutes before hand-holding (control condition) and for 5 minutes while they held hands (experimental condition). Heart rate variability signals were then extracted from the subjects' ECGs and analyzed in a linear feature space (time and frequency domain) and a nonlinear feature space. Considering the results, we conclude that physical contact (hand-holding of two friends) increases parasympathetic activity, as indicated by increases in SD1, SD1/SD2, HF and MF power (p

FPGA Based Parallel Architecture for the Computation of Third-Order Cross Moments

Higher-order statistics (HOS), also known as cumulants and cross moments, and their frequency-domain counterparts, known as polyspectra, have emerged as a powerful signal processing tool for the synthesis and analysis of signals and systems. Algorithms used for the computation of cross moments are computationally intensive and require high computational speed for real-time applications. For efficiency and high speed, it is often advantageous to realize computation-intensive algorithms in hardware. A promising solution that combines high flexibility with the speed of traditional hardware is the Field Programmable Gate Array (FPGA). In this paper, we present an FPGA-based parallel architecture for the computation of third-order cross moments. The proposed design is coded in Very High Speed Integrated Circuit (VHSIC) Hardware Description Language (VHDL) and functionally verified by implementing it on a Xilinx Spartan-3 XC3S2000FG900-4 FPGA. Implementation results show that the proposed design can operate at a maximum frequency of 86.618 MHz.
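
For reference, a software sketch of the third-order cross moment that such an architecture computes, c(t1, t2) = E[x(n) y(n+t1) z(n+t2)], estimated over a finite record. The signal lengths, lag range, and test signals are illustrative; this NumPy version is only a functional model, not the VHDL design.

```python
import numpy as np

def third_order_cross_moment(x, y, z, max_lag):
    """Estimate c_xyz(t1, t2) = E[x(n) * y(n + t1) * z(n + t2)] for |t1|, |t2| <= max_lag."""
    N = len(x)
    size = 2 * max_lag + 1
    C = np.zeros((size, size))
    for i, t1 in enumerate(range(-max_lag, max_lag + 1)):
        for j, t2 in enumerate(range(-max_lag, max_lag + 1)):
            # Valid sample range so that n, n + t1 and n + t2 all stay in bounds.
            n0 = max(0, -t1, -t2)
            n1 = min(N, N - t1, N - t2)
            C[i, j] = np.mean(x[n0:n1] * y[n0 + t1:n1 + t1] * z[n0 + t2:n1 + t2])
    return C

rng = np.random.default_rng(5)
noise = rng.exponential(1.0, 4100) - 1.0      # zero-mean, skewed (non-Gaussian) noise
x, y, z = noise[:-2], noise[1:-1], noise[2:]  # y and z are delayed copies of x
C = third_order_cross_moment(x, y, z, max_lag=3)
i0 = 3                                        # matrix index corresponding to lag 0
print("c(-1, -2) =", round(C[i0 - 1, i0 - 2], 3), "(near the third central moment, ~2)")
print("c( 0,  0) =", round(C[i0, i0], 3), "(near 0)")
```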

A New Implementation of PCA for Fast Face Detection

Principal Component Analysis (PCA) has many important applications, especially in pattern detection such as face detection and recognition, where for real-time applications the response time is required to be as small as possible. In this paper, a new implementation of PCA for fast face detection is presented. The new implementation is based on cross correlation in the frequency domain between the input image and the eigenvectors (weights). Simulation results show that the proposed implementation of PCA is faster than the conventional one.
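
The projection onto an eigenvector that underlies PCA-based detection is itself a correlation, so it can be evaluated for every window position at once via the FFT, which is the idea summarized above. The sketch below compares a spatial-domain projection at one location with the FFT-based correlation map for one eigenvector; the image size and the random "eigenvector" are illustrative stand-ins for trained PCA weights.

```python
import numpy as np

rng = np.random.default_rng(6)
image = rng.standard_normal((128, 128))     # stand-in for a scanned input image
eigvec = rng.standard_normal((20, 20))      # stand-in for one trained eigenvector

# Spatial-domain projection of the window at (r, c) onto the eigenvector.
r, c = 37, 81
window = image[r:r+20, c:c+20]
proj_spatial = np.sum(window * eigvec)

# Frequency domain: a single FFT-based cross correlation yields the projection
# for every window position simultaneously.
F_img = np.fft.fft2(image)
F_eig = np.conj(np.fft.fft2(eigvec, s=image.shape))
proj_map = np.real(np.fft.ifft2(F_img * F_eig))

print("spatial projection:    ", round(proj_spatial, 6))
print("frequency-domain value:", round(proj_map[r, c], 6))
```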

Surface and Guided Waves in Composites with Nematic Coatings

The theoretical prediction of acoustical polarization effects in heterogeneous composites made of thick elastic solids with thin nematic films is presented. The numerical-analytical solution of the wave propagation problem for the different wave types exhibits some new physical effects in the low-frequency domain: the appearance of a critical frequency and the existence of a narrow transition zone where the wave rapidly changes its speed. The associated wave attenuation is highly perturbed in this zone. We also show the possible appearance of critical frequencies where the attenuation changes sign. Numerical results of the parametric analysis are presented and discussed.

2D Gabor Functions and FCMI Algorithm for Flaws Detection in Ultrasonic Images

In this paper we present a new approach to detecting flaws in T.O.F.D (Time Of Flight Diffraction) ultrasonic images based on texture features. Texture is one of the most important features used in recognizing patterns in an image. The paper describes texture features based on 2D Gabor functions, i.e., Gaussian-shaped band-pass filters with dyadic treatment of the radial spatial frequency range and multiple orientations, which represent an appropriate choice for tasks requiring simultaneous measurement in both the space and frequency domains. The most relevant features are used as input data for a fuzzy c-means clustering classifier. Only two classes exist: 'defects' or 'no defects'. The proposed approach is tested on T.O.F.D images acquired in the laboratory and in the industrial field.
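
A small sketch of the texture-feature stage described above: a bank of complex 2-D Gabor (Gaussian band-pass) filters at a few dyadic frequencies and orientations is applied to an image, and the local response magnitudes form the per-pixel feature vectors that a fuzzy c-means classifier would then cluster. The filter parameters and the synthetic input are illustrative assumptions, and the clustering step itself is not reproduced.

```python
import numpy as np
from scipy.signal import fftconvolve

def gabor_kernel(freq, theta, sigma):
    """Complex 2-D Gabor kernel: Gaussian envelope times a complex sinusoid."""
    half = int(np.ceil(3 * sigma))                  # cover roughly +/- 3 sigma
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)      # rotated coordinate along theta
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return envelope * np.exp(2j * np.pi * freq * xr)

rng = np.random.default_rng(7)
image = rng.standard_normal((128, 128))             # stand-in for a T.O.F.D image

# Dyadic radial frequencies and four orientations, as in a typical Gabor bank.
freqs = [1/4, 1/8, 1/16]
thetas = [0, np.pi/4, np.pi/2, 3*np.pi/4]

features = []
for f in freqs:
    for th in thetas:
        k = gabor_kernel(f, th, sigma=0.6 / f)
        response = fftconvolve(image, k, mode='same')
        features.append(np.abs(response))           # local magnitude (energy) map

# One feature vector per pixel: input to a fuzzy c-means clustering step with
# two clusters ('defects' vs 'no defects').
feature_stack = np.stack(features, axis=-1)
print("feature stack shape:", feature_stack.shape)  # (H, W, 12)
```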

New Features for Specific JPEG Steganalysis

We present in this paper a new approach to specific JPEG steganalysis and propose studying the statistics of the compressed DCT coefficients. Traditionally, steganographic algorithms try to preserve the statistics of the DCT and of the spatial domain, but they cannot preserve both and also control the alteration of the compressed data. We have noticed a deviation of the entropy of the compressed data after a first embedding. This deviation is greater when the image is a cover medium than when the image is a stego image. To observe this deviation, we introduce new statistical features and combine them with the Multiple Embedding Method. This approach is motivated by the avalanche criterion of the JPEG lossless compression step, which makes it possible to design detectors whose detection rates are independent of the payload. Finally, we designed a Fisher discriminant based classifier for the well-known steganographic algorithms Outguess, F5 and Hide and Seek. The experimental results we obtained show the efficiency of our classifier for these algorithms. Moreover, it is also designed to work with low embedding rates (< 10^-5) and, in accordance with the avalanche criterion of the RLE and Huffman compression step, its efficiency is independent of the quantity of hidden information.

Research on the Prediction Method of Random Vibration Cumulative Fatigue Damage Life Based on Finite Element Analysis

Since most aviation products face the problem of fatigue fracture in vibration environments, this work uses the test results of a bracket, analyses the structure with ANSYS-Workbench, predicts the life of the bracket by different methods, and compares the predictions with the test results. By studying the analysis methods and organically combining simulation analysis with testing, we not only ensure the accuracy of the simulation analysis and life prediction, but also enable dynamic supervision of the product life process and promote the application of finite element simulation analysis in engineering practice.