High Accuracy ESPRIT-TLS Technique for Wind Turbine Fault Discrimination

The ESPRIT-TLS method is a good candidate for high-resolution fault detection in induction machines, offering very high accuracy in frequency and amplitude identification. However, its high computational complexity hinders its use in real-time fault diagnosis. To overcome this problem, a Fast-ESPRIT algorithm that combines IIR band-pass filtering, decimation, and the original ESPRIT-TLS method was employed to extract frequencies and their magnitudes accurately from the wind turbine stator current at a lower computational cost. The proposed algorithm addresses the wind turbine's need for online, fast, and proactive condition monitoring. This type of remote and periodic maintenance preserves an acceptable machine lifetime, minimizes downtime, and maximizes productivity. The developed technique was evaluated by computer simulation under many fault scenarios. The results demonstrate that Fast-ESPRIT offers rapid, high-resolution harmonic identification with minimal computation time and memory cost.
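
A minimal sketch of the Fast-ESPRIT pre-processing stage described above: an IIR band-pass filter isolates the band around the expected fault harmonic, and decimation shrinks the data before ESPRIT-TLS is applied. The sampling rate, band edges, test frequencies, and decimation factor below are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy import signal

fs = 10_000                                           # assumed sampling rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
x = np.sin(2*np.pi*50*t) + 0.1*np.sin(2*np.pi*75*t)   # fundamental + fault harmonic
x += 0.05 * np.random.randn(t.size)                   # measurement noise

# 1) IIR band-pass around the band of interest (here 40-100 Hz, assumed).
sos = signal.butter(4, [40, 100], btype="bandpass", fs=fs, output="sos")
xf = signal.sosfiltfilt(sos, x)

# 2) Decimation: the useful band now fits in a much lower sampling rate,
#    so the subspace estimator works on a far shorter, cheaper signal.
q = 10                                                # decimation factor (assumption)
xd = signal.decimate(xf, q)                           # anti-alias filter + downsample
fs_d = fs / q

# 3) ESPRIT-TLS (or any subspace frequency estimator) is then run on `xd`;
#    frequencies estimated at fs_d map back to the original band unchanged.
print(f"reduced from {x.size} to {xd.size} samples at {fs_d:.0f} Hz")
```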

Personal Authentication Using FDOST in Finger Knuckle-Print Biometrics

The inherent skin patterns formed at the joints on the outer surface of the finger are referred to as the finger knuckle-print. Because the finger knuckle-print is rich in texture, it can be exploited to identify a person uniquely. In a biometric system, a region of interest is extracted and passed to the feature extraction algorithm. In this paper, local and global features are extracted separately. The Fast Discrete Orthonormal Stockwell Transform (FDOST) is used to extract the local features; the global feature is obtained by letting the FDOST window size tend to infinity. The two features are fused to increase recognition accuracy: a matching distance is calculated for each feature individually, and the two distances are then merged to obtain the final matching distance. The proposed scheme gives better performance in terms of equal error rate and correct recognition rate.
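
A minimal sketch of the distance-fusion step: the local (FDOST) and global matching distances are first normalised to a common range and then combined into one final distance. The min-max normalisation, the bounds, and the equal weighting are illustrative assumptions, not the paper's exact fusion rule.

```python
import numpy as np

def min_max_norm(d, d_min, d_max):
    """Map a raw matching distance into [0, 1]."""
    return (d - d_min) / (d_max - d_min)

def fused_distance(d_local, d_global, bounds_local, bounds_global, w=0.5):
    """Combine two normalised distances; smaller fused distance = better match."""
    nl = min_max_norm(d_local, *bounds_local)
    ng = min_max_norm(d_global, *bounds_global)
    return w * nl + (1 - w) * ng

# Example: bounds would be estimated from a training set (values assumed here).
print(fused_distance(0.32, 410.0, (0.0, 1.0), (0.0, 1500.0)))
```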

GSM Based Smart Patient Monitoring System

In this paper, we propose an intelligent system for monitoring the health conditions of patients. Monitoring patients' health is a complex problem that involves different medical units and requires continuous observation, especially in rural areas, where the number of available specialized physicians is inadequate. The proposed system will improve patient care and drive costs down compared to the existing system in Jordan, and it will be a starting point for faster and better communication between the different units of the Jordanian health system. Connecting patients and their physicians beyond hospital doors, regardless of geographical area, is an important step in developing that system. Both the ability to make medical decisions and the quality of medical care are expected to improve.

Experimental Modal Analysis of Reinforced Concrete Square Slabs

The aim of this paper is to perform experimental modal analysis (EMA) of reinforced concrete (RC) square slabs. EMA is the process of determining the modal parameters (natural frequencies, damping factors, and modal vectors) of a structure from a set of frequency response functions (FRFs) by curve fitting. Although experimental modal analysis (or modal testing) has grown steadily in popularity since the advent of the digital FFT spectrum analyzer in the early 1970s, its application to all types of members and materials has not yet been well documented. Therefore, in this work, experimental tests were conducted on RC square slab specimens of dimensions 600 mm × 600 mm × 40 mm under a freely supported boundary condition. Impact testing, a fast and economical means of finding the modes of vibration of a structure, was used during the experiments, and a PicoScope 6 device and MATLAB software were used to acquire the data and to analyze and plot the FRFs. The experimental natural frequencies extracted from the measurements show good agreement with analytical predictions, demonstrating that EMA can be usefully employed to investigate the dynamic behavior of RC slabs.
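
A minimal sketch of how an FRF can be estimated from impact-test records using the standard H1 estimator, H1(f) = Sxy(f) / Sxx(f): the cross-spectrum of hammer force x(t) and response y(t) divided by the force auto-spectrum. The signals, sampling rate, and window length below are idealised assumptions; the paper acquired data with PicoScope 6 and analysed it in MATLAB.

```python
import numpy as np
from scipy import signal

fs = 5000                                   # assumed sampling rate (Hz)
t = np.arange(0, 2.0, 1 / fs)
x = np.exp(-t / 0.002)                      # idealised impact force pulse
y = np.exp(-5 * t) * np.sin(2*np.pi*120*t)  # idealised decaying response

f, Sxx = signal.csd(x, x, fs=fs, nperseg=2048)   # force auto-spectrum
_, Sxy = signal.csd(x, y, fs=fs, nperseg=2048)   # force-response cross-spectrum
H1 = Sxy / Sxx                                   # FRF; peaks mark natural frequencies

peak = f[np.argmax(np.abs(H1))]
print(f"dominant natural frequency ≈ {peak:.1f} Hz")
```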

A New Correlation between SPT and CPT for Various Soils

The Standard Penetration Test (SPT) is the most common in situ test for soil investigations, while the Cone Penetration Test (CPT) is considered one of the best investigation tools; thanks to the fast and accurate results it provides, the CPT complements the SPT in many applications such as field exploration, design parameters, and quality control assessment. Many soil index and engineering properties have been correlated to both the SPT and the CPT, and various foundation design methods have been developed based on the outcomes of these tests. It is therefore valuable to correlate the two tests to each other, so that either one can be used in the absence of the other, especially for preliminary evaluation and design purposes. The primary purpose of this study was to investigate the relationships between the SPT and CPT for different types of sandy soils in Florida. Data for this research were collected from a number of projects sponsored by the Florida Department of Transportation (FDOT); six sites served as the subjects of the SPT-CPT correlations. The correlations were established between the cone resistance (qc), the sleeve friction (fs), and the uncorrected SPT blow count (N) for various soils. A positive linear relationship was found between qc, fs, and N for the various sandy soils, with qc versus N generally showing higher correlation coefficients than fs versus N. qc/N ratios were developed for the different soil types and compared to literature values; the ratios found in this research were higher than those reported in the literature.
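
A minimal sketch of how the positive linear qc-N correlation reported above can be fitted by least squares. The (N, qc) pairs below are synthetic placeholders, not FDOT data.

```python
import numpy as np

N  = np.array([4, 8, 12, 17, 23, 30, 38])              # SPT blow counts (assumed)
qc = np.array([1.6, 3.1, 4.9, 6.8, 9.4, 12.1, 15.0])   # cone resistance (MPa, assumed)

a, b = np.polyfit(N, qc, 1)          # least-squares fit: qc ≈ a*N + b
r = np.corrcoef(N, qc)[0, 1]         # correlation coefficient

print(f"qc ≈ {a:.3f}*N + {b:.3f}, r = {r:.3f}")
print("mean qc/N ratio:", np.mean(qc / N))   # the ratio compared to literature values
```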

Analysis of Different Resins in Web-to-Flange Joints

Industrial processing gives engineered wood products features absent in solid wood: a homogeneous structure, fewer defects, improved physical and mechanical properties, resistance to bio-deterioration, and better dimensional stability, improving quality and increasing the reliability of wood structures. These features, combined with the use of fast-growing trees, make them environmentally friendly products and ensure a strong consumer market. Wood I-joists are manufactured by industrially bonding flange and web profiles, and an important aspect of their production is the adhesive joint that bonds the web to the flange: adhesives can effectively transfer and distribute stresses, thereby increasing the strength and stiffness of the composite. The objective of this study is to evaluate different resins in shear tests on bonded specimens, with the aim of identifying the most efficient resin and assessing the possibility of using national products to reduce the manufacturing cost. First, a literature review was conducted to establish the geometry and materials generally used; eight national resins were then selected and analyzed, with six specimens produced for each.

Construction and Validation of a Hybrid Lumbar Spine Model for the Fast Evaluation of Intradiscal Pressure and Mobility

A novel hybrid model of the lumbar spine, allowing fast static and dynamic simulation of disc pressure and spine mobility, is introduced in this work. Our contribution is to combine rigid bodies, deformable finite elements, articular constraints, and springs into a single model of the spine. Each vertebra is represented by a rigid body controlling a surface mesh to model contacts on the facet joints and the spinous process. The discs are modeled using a heterogeneous tetrahedral finite element model. The facet joints are represented as elastic joints with six degrees of freedom, while the ligaments are modeled using non-linear one-dimensional elastic elements. The challenge we tackle is to make these different models interact efficiently while respecting the principles of anatomy and mechanics. The mobility, intradiscal pressure, facet joint force, and instantaneous center of rotation of the lumbar spine are validated against experimental and theoretical results from the literature for flexion, extension, lateral bending, and axial rotation. Our hybrid model greatly simplifies the modeling task and dramatically accelerates the simulation of pressure within the discs, as well as the evaluation of the range of motion and the instantaneous centers of rotation, without penalizing precision. These results suggest that, for some types of biomechanical simulation, simplified models allow far easier modeling and faster simulations than the usual full-FEM approaches, without any loss of accuracy.
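
A minimal sketch of the facet-joint idealisation described above: an elastic joint with six degrees of freedom whose generalized restoring load is f = -K q for a 6-vector q of relative displacements and rotations. The diagonal stiffness values are illustrative assumptions, not the paper's calibrated parameters.

```python
import numpy as np

# q = [dx, dy, dz, rx, ry, rz]: relative translations (m) and rotations (rad)
K = np.diag([2e5, 2e5, 4e5,       # translational stiffness (N/m), assumed
             50.0, 50.0, 30.0])   # rotational stiffness (N·m/rad), assumed

def joint_load(q):
    """Generalized force/torque resisting the relative motion q."""
    return -K @ q

q = np.array([1e-4, 0.0, 2e-4, 0.01, 0.0, 0.005])
print(joint_load(q))   # [Fx, Fy, Fz, Mx, My, Mz] on the joint
```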

Verification and Validation of Simulated Process Models of KALBR-SIM Training Simulator

Verification and validation (V&V) of simulated process models is the most important phase of the simulator life cycle. Evaluating simulated process models with V&V techniques checks the closeness of each component model (in a simulated network) to the real system/process with respect to dynamic behaviour under steady-state and transient conditions. The V&V process helps qualify the process simulator for its intended purpose, whether that is comprehensive training or design verification. In general, model verification is carried out by comparing simulated component characteristics with the original requirements, to ensure that each step in the model development process fully incorporates all the design requirements. Validation testing is performed by comparing the simulated process parameters to the actual plant process parameters, either in standalone mode or in integrated mode. A full scope replica operator training simulator named KALBR-SIM (Kalpakkam Breeder Reactor Simulator) has been developed at IGCAR, Kalpakkam, India, for the Prototype Fast Breeder Reactor (PFBR); the main participants are engineers and experts from the modeling, process design, and instrumentation & control design teams. This paper discusses the V&V process in general, the evaluation procedure adopted for the PFBR operator training simulator, the methodology followed for verifying the models, and the reference documents and standards used. It details the importance of internal validation by design experts, subsequent validation by an external agency consisting of experts from various fields, model improvement by tuning based on the experts' comments, the final qualification of the simulator for its intended purpose, and the difficulties faced while coordinating the various activities.
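
A minimal sketch of the validation comparison described above: simulated process parameters are checked point by point against reference plant data within a tolerance band along a transient. The parameter, sample values, and the 2% tolerance are illustrative assumptions, not the project's acceptance criteria.

```python
import numpy as np

def validate(simulated, reference, rel_tol=0.02):
    """Return True if every simulated sample stays within the tolerance band."""
    simulated, reference = np.asarray(simulated), np.asarray(reference)
    return bool(np.all(np.abs(simulated - reference) <= rel_tol * np.abs(reference)))

sim = [500.1, 502.4, 505.0, 507.7]   # e.g. a simulated process temperature (°C), assumed
ref = [500.0, 502.0, 505.5, 508.0]   # plant/design reference values, assumed
print("model within tolerance:", validate(sim, ref))
```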

Improving Automotive Efficiency through Lean Management Tools: A Case Study

Managing and improving efficiency in the current highly competitive global automotive industry demands that companies adopt leaner and more flexible systems. During the past 20 years, the domestic automotive industry in North America has focused on establishing new management strategies to meet market demands. Lean management, also known as the Toyota Production System (TPS) or lean manufacturing, encompasses tools and techniques established to provide the best-quality product with the fastest lead time at the lowest cost. The following paper presents a study that focused on improving labor efficiency at a facility of one of the Big Three (Ford, GM, Chrysler LLC) domestic automotive manufacturers in North America. The objective of the study was to use several lean management tools to optimize the efficiency and utilization levels at the "Pre-Marriage" chassis area in a truck manufacturing and assembly facility. Using three lean tools (standardization of work, the 7 wastes, and 5S), this research was able to improve efficiency by 51%, improve utilization by 246%, and reduce operations by 14%. The return on investment calculated from these improvements was 284%.

Automatic Enhanced Update Summary Generation System for News Documents

Fast-changing knowledge systems on the Internet can be accessed more efficiently with the help of automatic document summarization and update techniques. The aim of multi-document update summary generation is to construct a summary that unfolds the mainstream of data in a collection of documents, under the hypothesis that the user has already read a set of previous documents. To draw richer semantic information from the documents, deeper linguistic or semantic analysis of the source documents is used, instead of relying only on document word frequencies to select important concepts. Producing a responsive summary requires meaning-oriented structural analysis. To address this issue, the proposed system presents a document summarization approach based on sentence annotation with aspects, prepositions, and named entities. A semantic element extraction strategy is used to select important concepts from the documents, which are then used to generate an enhanced semantic summary.
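
A minimal sketch of concept-based sentence scoring for an update summary: sentences are scored by the concepts they contain, and concepts already covered by previously read documents are down-weighted so the summary favours new information. The regex tokenizer and novelty weighting are simplifying assumptions; the actual system annotates aspects, prepositions, and named entities.

```python
import re
from collections import Counter

def concepts(text):
    """Crude concept extraction: lower-cased word counts."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def score_sentences(new_doc, old_docs, top_n=2):
    old = concepts(" ".join(old_docs))
    sentences = re.split(r"(?<=[.!?])\s+", new_doc)
    scored = []
    for s in sentences:
        # novelty weighting: concepts seen in earlier documents count less
        score = sum(n / (1 + old[w]) for w, n in concepts(s).items())
        scored.append((score, s))
    return [s for _, s in sorted(scored, reverse=True)[:top_n]]

old = ["The storm hit the coast on Monday."]
new = "The storm moved inland. Rescue teams evacuated two towns overnight."
print(score_sentences(new, old, top_n=1))   # picks the sentence with new concepts
```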

Size-Reduction Strategies for Iris Codes

Iris codes contain bits with different entropy. This work investigates different strategies to reduce the size of iris code templates, with the aim of reducing storage requirements and computational demand in the matching process. Besides simple subsampling schemes, a binary multi-resolution representation, as used in the JBIG hierarchical coding mode, is also assessed. We find that iris code template size can be reduced significantly while maintaining recognition accuracy. In addition, we propose a two-stage identification approach, using small-sized iris code templates in a pre-selection stage and full-resolution templates for final identification, which shows promising recognition behaviour.
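
A minimal sketch of the two-stage identification idea: a simply subsampled iris code is used for cheap pre-selection by fractional Hamming distance, and only the surviving candidates are re-matched at full resolution. Code length, subsampling factor, noise level, and thresholds are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
gallery = rng.integers(0, 2, size=(1000, 2048), dtype=np.uint8)  # enrolled codes
probe = gallery[42].copy()
probe[rng.integers(0, 2048, 100)] ^= 1        # probe with some noisy bits

def hd(a, b):
    """Fractional Hamming distance along the last axis."""
    return np.mean(a != b, axis=-1)

# Stage 1: pre-selection with 4x subsampled templates (every 4th bit).
small = hd(gallery[:, ::4], probe[::4])
candidates = np.flatnonzero(small < 0.25)      # threshold assumed

# Stage 2: full-resolution matching on the shortlist only.
full = hd(gallery[candidates], probe)
print("identified as:", candidates[np.argmin(full)])
```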

The Influence of Clayey Pellet Size on Adsorption Efficiency of Metal Ions Removal from Waste Printing Developer

The adsorption efficiency of fired clayey pellets of 5 and 8 mm diameter for Cu(II) and Zn(II) ion removal from a waste printing developer was studied. Batch-mode experiments were carried out to investigate the influence of contact time, adsorbent mass, and pellet size on the adsorption efficiency. Faster uptake of copper ions was obtained with the 5 mm fired clay pellets, within 30 minutes; the 8 mm pellets showed a longer equilibrium time (60 to 75 minutes) for both copper and zinc ions. The results indicate that adsorption efficiency increases with adsorbent mass, and that the maximal efficiency differs between Cu(II) and Zn(II) ions depending on pellet size. The 5 mm fired clay pellets are therefore an effective adsorbent for Cu(II) ion removal (adsorption efficiency of 63.6%), whereas the 8 mm pellets are the better alternative for Zn(II) ion removal (adsorption efficiency of 92.8%) from a waste printing developer.
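
A minimal sketch of the efficiency figure used above: removal efficiency is the relative drop from the initial to the equilibrium concentration, E = (C0 - Ce) / C0 × 100. The concentrations below are illustrative, chosen only to reproduce the reported 63.6% figure for Cu(II).

```python
def removal_efficiency(c0, ce):
    """Percentage of metal ion removed from solution."""
    return (c0 - ce) / c0 * 100.0

print(removal_efficiency(100.0, 36.4))   # -> 63.6 (% removal, assumed concentrations)
```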

Real Time Adaptive Obstacle Avoidance in Dynamic Environments with Different DS

In this paper, a real-time obstacle avoidance approach for both autonomous and non-autonomous dynamical systems (DS) is presented. In this approach, the original dynamics of the controller can be modulated, which allows a safety margin to be specified. Different common types of DS increase the robot's reactiveness in the face of uncertainty in the localization of the obstacle, especially when the robot moves very fast in changing, complex environments. The method is validated by simulation, and the influence of different autonomous and non-autonomous DS, including important characteristics such as limit cycles and unstable DS, is examined. Furthermore, the handling of different obstacle positions in a complex environment is explained. Finally, the verification of the avoidance trajectories is described through different parameters, such as the safety factor.
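
A minimal sketch of dynamic modulation for obstacle avoidance: the nominal DS f(x) is multiplied by a state-dependent matrix that squashes the flow component normal to a circular obstacle and stretches the tangential one. This follows the common modulation-matrix formulation, which is an assumption here; the paper's exact construction and safety-margin term may differ.

```python
import numpy as np

x_goal = np.array([0.0, 0.0])
x_obs, r_obs = np.array([2.0, 0.1]), 0.5    # obstacle centre and radius (assumed)

def f(x):
    """Nominal linear DS attracting the robot to the goal."""
    return -(x - x_goal)

def modulated(x):
    d = x - x_obs
    gamma = np.linalg.norm(d) / r_obs        # > 1 outside the obstacle
    n = d / np.linalg.norm(d)                # normal direction
    t = np.array([-n[1], n[0]])              # tangent direction
    E = np.column_stack([n, t])
    D = np.diag([1 - 1/gamma, 1 + 1/gamma])  # squash normal, stretch tangent
    return E @ D @ E.T @ f(x)

x = np.array([4.0, 0.05])
for _ in range(200):                         # simple Euler rollout
    x = x + 0.02 * modulated(x)
print("final position:", x)                  # heads to the goal, deflected around the obstacle
```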

Unsupervised Segmentation Technique for Acute Leukemia Cells Using Clustering Algorithms

Leukaemia is a blood cancer that contributes to the increase in the mortality rate in Malaysia each year. There are two main categories of leukaemia, acute and chronic. Acute leukaemia cells are produced and develop rapidly and uncontrollably; therefore, if acute leukaemia cells could be identified quickly and effectively, proper treatment and medicine could be delivered. Given the need for prompt and accurate diagnosis of leukaemia, the current study proposes unsupervised pixel segmentation based on clustering algorithms to obtain a fully segmented abnormal white blood cell (blast) in acute leukaemia images. To obtain the segmented blast, three clustering algorithms (k-means, fuzzy c-means, and moving k-means) are applied to the saturation component image. Median filtering and seeded region growing area extraction are then applied to smooth the region of the segmented blast and to remove large unwanted regions from the image, respectively. The three clustering algorithms are compared in order to measure the performance of each on segmenting the blast area. Based on the good sensitivity values obtained, the results indicate that the moving k-means clustering algorithm successfully produces a fully segmented blast region in acute leukaemia images, suggesting that the resulting images could help haematologists in the further analysis of acute leukaemia.
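
A minimal sketch of the segmentation pipeline: clustering on the saturation component followed by median filtering. Standard k-means stands in for the moving k-means variant (an assumption), the image is synthetic, and the seeded region growing step is omitted for brevity.

```python
import numpy as np
from scipy.ndimage import median_filter
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
sat = rng.random((64, 64)).astype(np.float32)    # stand-in saturation channel
sat[20:40, 20:40] += 1.5                          # bright "blast" region

km = KMeans(n_clusters=3, n_init=10, random_state=0)
labels = km.fit_predict(sat.reshape(-1, 1)).reshape(sat.shape)

blast_cluster = np.argmax(km.cluster_centers_.ravel())  # highest-saturation cluster
mask = labels == blast_cluster
mask = median_filter(mask, size=3)                # smooth the segmented blast
print("segmented pixels:", int(mask.sum()))
```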

Enhanced Disk-Based Databases Towards Improved Hybrid In-Memory Systems

In-memory database systems are becoming popular due to the availability and affordability of sufficiently large RAM and processors in modern high-end servers with the capacity to manage large in-memory database transactions. While fast and reliable in-memory systems are still being developed to overcome cache misses, CPU/IO bottlenecks, and distributed transaction costs, disk-based data stores still serve as the primary persistence layer. In addition, with the recent growth in multi-tenancy cloud applications and the associated security concerns, many organisations weigh the trade-offs and continue to require the fast and reliable transaction processing of disk-based database systems as an available choice. For these organisations, the only way of increasing throughput is to improve the performance of disk-based concurrency control. This warrants a hybrid database system with the ability to selectively apply enhanced disk-based data management within the context of in-memory systems, helping to improve overall throughput. The general view is that in-memory systems substantially outperform disk-based systems. We question this assumption and examine how a modified variation of access invariance, which we call enhanced memory access (EMA), can be used to allow very high levels of concurrency in the pre-fetching of data in disk-based systems. We demonstrate how this prefetching in disk-based systems can yield close to in-memory performance, which paves the way for improved hybrid database systems. This paper proposes the novel EMA technique and presents a comparative study between disk-based EMA systems and in-memory systems running on hardware configurations of equivalent power in terms of the number and speed of processors. The results of the experiments clearly substantiate that, when used in conjunction with all concurrency control mechanisms, EMA can increase the throughput of disk-based systems to levels quite close to those achieved by in-memory systems. These promising results show that enhanced disk-based systems can improve hybrid data management within the broader context of in-memory systems.
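
A minimal, highly simplified sketch of the prefetching idea behind access-invariance techniques such as EMA: a transaction's expected read set is fetched from disk concurrently into an in-memory cache before execution, so the execution phase runs at close to in-memory speed. The names, the cache, and the page-fetch stub are all illustrative assumptions, not the paper's implementation.

```python
from concurrent.futures import ThreadPoolExecutor

cache = {}

def fetch_page(page_id):
    # stub for a disk read; in a real system this is the expensive step
    return f"data-of-{page_id}"

def prefetch(read_set):
    """Overlap many disk reads so they complete before execution begins."""
    with ThreadPoolExecutor(max_workers=8) as pool:
        for page_id, data in zip(read_set, pool.map(fetch_page, read_set)):
            cache[page_id] = data

def execute(txn_read_set):
    # by the time execution starts, every access hits the cache
    return [cache[p] for p in txn_read_set]

read_set = [101, 102, 205]
prefetch(read_set)         # high-concurrency disk I/O phase
print(execute(read_set))   # near in-memory execution phase
```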

Simulation Based VLSI Implementation of Fast Efficient Lossless Image Compression System Using Adjusted Binary Code & Golomb-Rice Code

A simulation-based VLSI implementation of FELICS (Fast, Efficient, Lossless Image Compression System) is proposed to provide lossless image compression, implemented and analyzed through simulation-oriented VLSI (Very Large Scale Integration) design. The goal is to compress images without loss of quality while achieving high processing speed and minimizing area and power. The FELICS algorithm uses a simplified adjusted binary code, together with Golomb-Rice coding, for image compression; the compressed image is processed pixel by pixel and mapped to the VLSI domain. The simplified adjusted binary code reduces the number of arithmetic operations and achieves high processing speed, and a color difference preprocessing step is also proposed to improve coding efficiency with simple arithmetic operations. The VLSI-based FELICS design provides an effective hardware architecture with a regular, four-stage pipelined data flow. With two-level parallelism, consecutive pixels are classified into even and odd samples, and an individual hardware engine is dedicated to each; the method can be further enhanced by multilevel parallelism.
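
A minimal sketch of the two codes named in the title: truncated ("adjusted") binary coding for in-range values and Golomb-Rice coding for out-of-range residuals. FELICS additionally rotates the adjusted binary range so the shortest codewords sit in the middle of the range, where in-range pixels are most likely; that detail is simplified away here.

```python
def adjusted_binary(x, R):
    """Encode x in [0, R-1] with floor(log2 R) or floor(log2 R)+1 bits."""
    k = R.bit_length() - 1
    u = (1 << (k + 1)) - R            # number of short (k-bit) codewords
    if x < u:
        return format(x, f"0{k}b") if k else ""
    return format(x + u, f"0{k+1}b")  # long (k+1-bit) codeword

def golomb_rice(x, m):
    """Unary quotient + m-bit remainder (Rice code with parameter m)."""
    q, r = x >> m, x & ((1 << m) - 1)
    return "1" * q + "0" + format(r, f"0{m}b")

print([adjusted_binary(v, 5) for v in range(5)])  # ['00','01','10','110','111']
print(golomb_rice(9, 2))                           # quotient '110' + remainder '01'
```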

Investigation of Minor Actinide-Contained Thorium Fuel Impacts on CANDU-Type Reactor Neutronics Using Computational Method

Thorium fuel has recently attracted particular attention because of its proliferation resistance, its breeding capability in both fast and thermal neutron flux, and its mono-isotopic natural abundance, while long half-life alpha-emitting minor actinides remain a major nuclear waste concern. In recent years, the efficiency of minor actinide burnup in PWRs has been investigated. A minor actinide-bearing thorium fuel matrix can therefore address both proliferation resistance and nuclear waste depletion aims. In the present work, the minor actinide depletion rate in a CANDU-type nuclear core modeled with the MCNP code was investigated. The effects of the minor actinide load, mixed into the thorium fuel matrix, on the core neutronics were studied by comparing the fuel matrix with and without the minor actinide component, and the depletion rate of the minor actinides in the MA-bearing fuel was calculated for different power loads. According to the computational data, loading minor actinides into the modeled core results in more negative reactivity coefficients, and the MA-bearing fuel achieves a lower radial peaking factor. The computational results showed that 140 kg of the 464 kg initial minor actinide load was depleted during a 6-year burnup at 10 MW power.

Particle Concentration Distribution under Idling Conditions in a Residential Underground Garage

Particles exhausted from cars have adverse impacts on human health. This study developed a three-dimensional particle dispersion numerical model, including particle coagulation, to simulate the particle concentration distribution under idling conditions in a residential underground garage. The simulation results demonstrate that particles disperse much faster in the vertical direction than in the horizontal direction, and that the enhancement of vertical dispersion due to an increasing number of cars with running engines is much stronger than the enhancement in the exhaust direction. Particle dispersion from each pair of adjacent cars has little influence on the other in this study. The average particle concentration after 120 seconds of exhaust is 1.8 to 4.5 times the initial ambient particle concentration, indicating severe particle pollution in the residential underground garage.
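
A minimal sketch of the coagulation term in such a dispersion model: for a monodisperse aerosol, the number concentration decays as dN/dt = -K N², the Smoluchowski coagulation law, integrated here with a simple Euler scheme over the 120-second window. The coefficient K and initial concentration are illustrative assumptions.

```python
N = 1.0e11          # initial number concentration (particles/m^3), assumed
K = 5.0e-16         # coagulation coefficient (m^3/s), assumed
dt, t_end = 0.1, 120.0

t = 0.0
while t < t_end:
    N -= K * N * N * dt   # dN/dt = -K N^2 (particle loss by coagulation)
    t += dt
print(f"number concentration after {t_end:.0f} s: {N:.3e} /m^3")
```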

Improving the Performance of Back-Propagation Training Algorithm by Using ANN

An Artificial Neural Network (ANN) can be trained using back-propagation (BP), the most widely used algorithm for supervised learning with multi-layered feed-forward networks. Efficient learning by the BP algorithm is required for many practical applications. The BP algorithm calculates the weight changes of the network, and a common approach is to use a two-term update consisting of a learning rate (LR) and a momentum factor (MF). The major drawbacks of the two-term BP learning algorithm are local minima and slow convergence, which limit its scope for real-time applications. Recently, the addition of an extra term, called a proportional factor (PF), to the two-term BP algorithm was proposed; this third term increases the speed of the BP algorithm. However, the PF term can also degrade the convergence of the BP algorithm, so criteria for evaluating convergence are required to facilitate the application of the three-term BP algorithm. Although speed and convergence seem closely related, as described later, we summarize various improvements proposed to overcome these drawbacks and compare the convergence behaviour of the different variants of the new three-term BP algorithm.
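
A minimal sketch of the three-term weight update discussed above, Δw(t) = -LR·∂E/∂w + MF·Δw(t-1) + PF·e(t), where e(t) is the current output error, following the common three-term BP formulation. The coefficients and the single linear neuron are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
x, target = np.array([0.5, -0.2]), 1.0
w = rng.normal(size=2)

lr, mf, pf = 0.1, 0.8, 0.05           # learning rate, momentum, proportional factor (assumed)
dw_prev = np.zeros_like(w)

for _ in range(50):
    y = w @ x
    e = target - y                     # output error e(t)
    grad = -e * x                      # dE/dw for E = 0.5 * e^2
    dw = -lr * grad + mf * dw_prev + pf * e * np.ones_like(w)   # three-term update
    w, dw_prev = w + dw, dw

print("trained output:", w @ x)        # approaches the target of 1.0
```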

Performance Comparisons between PID and Adaptive PID Controllers for Travel Angle Control of a Bench-Top Helicopter

This paper provides a comparative study of the performance of standard PID and adaptive PID controllers, tested on the travel angle of a 3-Degree-of-Freedom (3-DOF) Quanser bench-top helicopter. Quanser, a well-known manufacturer of educational bench-top helicopters, has developed a Proportional-Integral-Derivative (PID) controller with a Linear Quadratic Regulator (LQR) for the travel, pitch, and yaw angles of the bench-top helicopter. The performance of this PID controller is relatively good; however, it could be further improved by combining the controller with an adaptive element. The objective of this research is to design an adaptive PID controller and then compare its performance with that of the standard PID; the controller design and tests focus on travel angle control only. The adaptive method used in this project is a self-tuning controller, whose parameters are updated online. Two adaptive algorithms, pole placement and deadbeat, were chosen as the methods for obtaining optimal controller parameters. The performance comparisons show that the adaptive (deadbeat) PID controller produces more desirable performance than the standard PID and the adaptive (pole-placement) controllers: it attains a very fast settling time (5 seconds) and a very small overshoot (5% to 7.5%) for 10° to 30° step changes of the travel angle.
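
A minimal sketch of the discrete PID law used as the baseline above, u = Kp·e + Ki·Σe·dt + Kd·Δe/dt, applied to a travel-angle error. The gains and the crude first-order plant are illustrative assumptions; in the adaptive version, these gains would be re-tuned online (e.g. by deadbeat or pole-placement design).

```python
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.i, self.prev_e = 0.0, 0.0

    def step(self, setpoint, measured):
        e = setpoint - measured
        self.i += e * self.dt                 # integral of the error
        d = (e - self.prev_e) / self.dt       # error derivative
        self.prev_e = e
        return self.kp * e + self.ki * self.i + self.kd * d

pid = PID(kp=2.0, ki=0.5, kd=0.8, dt=0.01)    # assumed gains
angle = 0.0
for _ in range(1000):                         # crude first-order travel dynamics
    u = pid.step(20.0, angle)                 # 20 degree step command
    angle += 0.01 * (u - 0.1 * angle)
print(f"travel angle after 10 s: {angle:.2f} deg")   # approaches the 20 degree setpoint
```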