Causes of Rotor Distortions and Applicable Common Straightening Methods for Turbine Rotors and Shafts

Various problems may cause distortion of the rotor and hence vibration, which is the most severe form of damage to turbine rotors. Over the years, different techniques have been developed for straightening bent rotors. The straightening method can be selected according to initial information from preliminary inspections and tests, such as nondestructive tests, chemical analysis and run-out tests, together with knowledge of the shaft material. This article covers the various causes of excessive bends and then reviews some common applicable straightening methods. Finally, hot spotting is selected for a particular bent rotor. A 325 MW steam turbine rotor is modeled and finite element analyses are performed to investigate this straightening process. Experimental results show that performing the hot spot straightening process correctly reduced the bend of the rotor significantly.

Application of Artificial Neural Network for Predicting Maintainability Using Object-Oriented Metrics

The growing importance of software quality is leading to the development of new, sophisticated techniques that can be used to construct models for predicting quality attributes. One such technique is the Artificial Neural Network (ANN). This paper examines the application of ANN to software quality prediction using Object-Oriented (OO) metrics. Quality estimation includes estimating the maintainability of software. The dependent variable in our study was maintenance effort; the independent variables were principal components of eight OO metrics. The results showed that the ANN model achieved a Mean Absolute Relative Error (MARE) of 0.265. We therefore found the ANN method useful for constructing software quality models.
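As a rough illustration of such a pipeline, the sketch below feeds principal components of the OO metrics into a small neural-network regressor and reports MARE. The number of components, the network architecture and the scikit-learn tooling are illustrative assumptions, not the configuration used in the study.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor

def train_maintainability_model(oo_metrics, maintenance_effort, n_components=4):
    """Illustrative pipeline: PCA of eight OO metrics feeding an ANN regressor.

    oo_metrics: (n_classes, 8) array; maintenance_effort: per-class effort values.
    n_components and the hidden-layer size are placeholders, not the paper's settings.
    """
    effort = np.asarray(maintenance_effort, dtype=float)
    pca = PCA(n_components=n_components)
    components = pca.fit_transform(oo_metrics)
    ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
    ann.fit(components, effort)
    predicted = ann.predict(components)
    # Mean Absolute Relative Error, the accuracy measure quoted in the abstract.
    mare = np.mean(np.abs(predicted - effort) / effort)
    return ann, pca, mare
```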

Automatic 3D Reconstruction of Coronary Artery Centerlines from Monoplane X-ray Angiogram Images

We present a new method for the fully automatic 3D reconstruction of coronary artery centerlines, using two X-ray angiogram projection images from a single rotating monoplane acquisition system. During the first stage, the input images are smoothed using curve evolution techniques. Next, a simple yet efficient multiscale method, based on the information of the Hessian matrix, is introduced for the enhancement of the vascular structure. Hysteresis thresholding using different image quantiles is used to segment the arteries. This stage is followed by a thinning procedure to extract the centerlines. The resulting skeleton image is then pruned using morphological and pattern recognition techniques to remove non-vessel-like structures. Finally, edge-based stereo correspondence is solved using a parallel evolutionary optimization method based on symbiosis. The detected 2D centerlines combined with disparity map information allow the reconstruction of the 3D vessel centerlines. The proposed method has been evaluated on patient data sets.
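As an illustration of the Hessian-based multiscale enhancement step, the sketch below computes a Frangi-style vesselness response at several scales. The scales, the beta and c parameters and the use of SciPy Gaussian derivatives are assumptions and may differ from the enhancement filter actually used in the paper.

```python
import numpy as np
from scipy import ndimage

def vesselness_2d(image, sigmas=(1, 2, 3), beta=0.5, c=15.0):
    """Multiscale Hessian-based vessel enhancement (Frangi-style sketch)."""
    response = np.zeros_like(image, dtype=float)
    for sigma in sigmas:
        # Second-order Gaussian derivatives give the scale-normalized Hessian.
        Hxx = ndimage.gaussian_filter(image, sigma, order=(0, 2)) * sigma ** 2
        Hyy = ndimage.gaussian_filter(image, sigma, order=(2, 0)) * sigma ** 2
        Hxy = ndimage.gaussian_filter(image, sigma, order=(1, 1)) * sigma ** 2
        # Eigenvalues of the 2x2 Hessian, ordered so that |l1| <= |l2|.
        tmp = np.sqrt((Hxx - Hyy) ** 2 + 4 * Hxy ** 2)
        l2 = 0.5 * (Hxx + Hyy + tmp)
        l1 = 0.5 * (Hxx + Hyy - tmp)
        swap = np.abs(l1) > np.abs(l2)
        l1[swap], l2[swap] = l2[swap], l1[swap]
        rb = np.abs(l1) / (np.abs(l2) + 1e-10)        # blobness ratio
        s = np.sqrt(l1 ** 2 + l2 ** 2)                # second-order structureness
        v = np.exp(-rb ** 2 / (2 * beta ** 2)) * (1 - np.exp(-s ** 2 / (2 * c ** 2)))
        v[l2 > 0] = 0                                  # bright vessels on dark background
        response = np.maximum(response, v)             # keep the best scale per pixel
    return response
```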

Frequency and Amplitude Measurement of a Vibrating Object in Water Using Ultrasonic Speckle Technique

The principle of frequency and amplitude measurement of a vibrating object in water using the ultrasonic speckle technique is presented in this paper. Compared with other traditional techniques, the ultrasonic speckle technique can be applied to non-contact vibration measurement of a nonmetallic object with a rough surface in water. The relationship between speckle movement and object movement was analyzed. Based on this study, an ultrasonic speckle measurement system was set up. With this system, the frequency and amplitude of an underwater vibrating cantilever beam were measured. The results show that the experimental data are in good agreement with the calibration data.

Optimization of the Characteristic Straight Line Method by a “Best Estimate” of Observed, Normal Orthometric Elevation Differences

In this paper, to optimize the “Characteristic Straight Line Method” used in soil displacement analysis, a “best estimate” of the geodetic leveling observations has been achieved by taking into account the concept of height systems. This concept, and consequently the concept of “height”, is discussed in detail. In landslide dynamic analysis, the soil is considered as a mosaic of rigid blocks. The soil displacement has been monitored and analyzed using the “Characteristic Straight Line Method”, whose characteristic components are constructed from a “best estimate” of the topometric observations. In the measurement of elevation differences, we used the most modern leveling equipment available, and observational procedures were designed to provide the most effective method of acquiring data. In addition, systematic errors which cannot be sufficiently controlled by instrumentation or observational techniques are minimized by applying appropriate corrections to the observed data: the level collimation correction minimizes the error caused by non-horizontality of the leveling instrument's line of sight for unequal sight lengths; the refraction correction is modeled to minimize the refraction error caused by temperature (density) variation of air strata; the rod temperature correction accounts for variation in the length of the leveling rod's Invar/LO-VAR® strip resulting from temperature changes; the rod scale correction ensures a uniform scale that conforms to the international length standard; and the concept of height systems is introduced, in which all types of height (orthometric, dynamic, normal, gravity correction and equipotential surface) are investigated. The “Characteristic Straight Line Method” is slightly more convenient than the “Characteristic Circle Method”: it permits the evaluation of a displacement of very small magnitude, even when the displacement is an infinitesimal quantity. The inclination of the landslide is given by the inverse of the distance from the reference point O to the “Characteristic Straight Line”, and its direction is given by the bearing of the normal directed from point O to the Characteristic Straight Line (Fig. 6). A “best estimate” of the topometric observations was used to measure the elevation of carefully selected points before and after the deformation. Gross errors have been eliminated by statistical analyses and by comparing the heights within local neighborhoods. The results of a test using an area where very interesting land surface deformation occurs are reported. Monitoring with different options and a qualitative comparison of results based on a sufficient number of check points are presented.
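To make the stated geometric relation concrete, the following sketch derives the inclination (inverse of the distance from reference point O to the characteristic straight line) and its direction (bearing of the normal from O to the line). The coordinate convention and the representation of the line by two points are illustrative assumptions.

```python
import math

def inclination_and_bearing(o, p1, p2):
    """Landslide inclination and direction from the Characteristic Straight Line.

    o is the reference point O; p1 and p2 are two points defining the
    characteristic straight line, all given as (East, North) coordinates.
    Hypothetical helper based on the abstract's statement: inclination is the
    inverse of the distance from O to the line, direction is the bearing of
    the normal from O to the line.
    """
    (x0, y0), (x1, y1), (x2, y2) = o, p1, p2
    dx, dy = x2 - x1, y2 - y1
    length = math.hypot(dx, dy)
    # Foot of the perpendicular dropped from O onto the line.
    t = ((x0 - x1) * dx + (y0 - y1) * dy) / (length ** 2)
    foot = (x1 + t * dx, y1 + t * dy)
    d = math.hypot(foot[0] - x0, foot[1] - y0)
    inclination = 1.0 / d                                  # as stated in the abstract
    # Bearing of the normal from O toward the line, clockwise from North.
    bearing = math.degrees(math.atan2(foot[0] - x0, foot[1] - y0)) % 360.0
    return inclination, bearing
```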

Identification of Wideband Sources Using Higher Order Statistics in Noisy Environment

This paper deals with the localization of wideband sources. We develop a new approach for estimating wideband source parameters. The method is based on the higher order statistics of the recorded data in order to eliminate the Gaussian components from the signals received on the various hydrophones; in fact, the noise of the sea bottom is regarded as Gaussian. Thanks to the coherent signal subspace algorithm based on the cumulant matrix of the received data instead of the cross-spectral matrix, the wideband correlated sources are accurately located in a very noisy environment. We demonstrate the performance of the proposed algorithm on real data recorded during an underwater acoustics experiment.
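As a rough sketch of the higher-order-statistics ingredient, the code below estimates a fourth-order cumulant (quadricovariance) matrix from hydrophone snapshots, the kind of matrix that can replace the cross-spectral matrix in a subspace algorithm. The estimator, indexing convention and zero-mean assumption are generic choices, not necessarily those of the paper; the direct quadruple loop is written for clarity, not speed.

```python
import numpy as np

def quadricovariance(X):
    """Sample estimate of the fourth-order cumulant matrix of hydrophone data.

    X is an (N, T) array of N zero-mean hydrophone signals over T snapshots.
    For Gaussian noise these cumulants vanish, which is why the estimate can
    suppress the Gaussian sea-bottom noise.
    """
    N, T = X.shape
    R = X @ X.conj().T / T                    # covariance  E[x x^H]
    C = X @ X.T / T                           # pseudo-covariance  E[x x^T]
    Q = np.zeros((N * N, N * N), dtype=complex)
    for i in range(N):
        for j in range(N):
            for k in range(N):
                for l in range(N):
                    m = np.mean(X[i] * X[j].conj() * X[k] * X[l].conj())
                    cum = (m
                           - R[i, j] * R[k, l]             # E[x_i x_j*] E[x_k x_l*]
                           - C[i, k] * np.conj(C[j, l])    # E[x_i x_k] E[x_j* x_l*]
                           - R[i, l] * R[k, j])            # E[x_i x_l*] E[x_j* x_k]
                    Q[i * N + k, j * N + l] = cum
    return Q
```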

Fractal-Wavelet Based Techniques for Improving Artificial Neural Network Models

Natural resources management, including water resources, requires reliable estimation of time-variant environmental parameters. Small improvements in the estimation of environmental parameters can have great effects on management decisions. Noise reduction using wavelet techniques is an effective approach for preprocessing practical data sets. The predictability enhancement of river flow time series is assessed using fractal approaches before and after applying wavelet-based preprocessing. Time series correlation and persistency, the minimum sufficient length for training the predicting model, and the maximum valid length of predictions were also investigated through a fractal assessment.
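A minimal sketch of the wavelet-based preprocessing step is given below, using PyWavelets with a universal soft threshold. The wavelet family, decomposition level and thresholding rule are assumptions, since the abstract does not specify which were used for the river-flow series.

```python
import numpy as np
import pywt

def wavelet_denoise(series, wavelet="db4", level=4):
    """Soft-threshold wavelet denoising of a river-flow series (sketch)."""
    coeffs = pywt.wavedec(series, wavelet, level=level)
    # Noise scale estimated from the finest-level detail coefficients.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(len(series)))      # universal threshold
    denoised = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[: len(series)]
```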

Increasing The Speed of Convergence of an Artificial Neural Network based ARMA Coefficients Determination Technique

In this paper, novel techniques for increasing the accuracy and speed of convergence of a Feedforward Backpropagation Artificial Neural Network (FFBPNN) with a polynomial activation function reported in the literature are presented. These techniques were subsequently used to determine the coefficients of Autoregressive Moving Average (ARMA) and Autoregressive (AR) systems. The results obtained by introducing sequential and batch methods of weight initialization, a batch method of weight and coefficient update, and adaptive momentum and learning rate techniques give more accurate results and a significant reduction in convergence time compared to the traditional backpropagation algorithm, thereby making FFBPNN an appropriate technique for online ARMA coefficient determination.
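For orientation, the sketch below shows a classical momentum weight update together with a simple "bold driver" learning-rate adaptation. The exact adaptive momentum and learning-rate rules of the paper are not given in the abstract, so these are only illustrative stand-ins.

```python
import numpy as np

def momentum_step(W, grad, velocity, lr, momentum=0.9):
    """One backpropagation weight update with momentum (illustrative only).

    W, grad and velocity are arrays of the same shape; lr is the current
    learning rate.
    """
    velocity = momentum * velocity - lr * grad
    return W + velocity, velocity

def adapt_learning_rate(lr, loss, prev_loss, grow=1.05, shrink=0.7):
    """Grow the learning rate while the loss keeps falling, shrink it on a rise."""
    return lr * grow if loss < prev_loss else lr * shrink
```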

Electric Load Forecasting Using Genetic Based Algorithm, Optimal Filter Estimator and Least Error Squares Technique: Comparative Study

This paper presents a performance comparison of three estimation techniques used for peak load forecasting in power systems. The three optimum estimation techniques are genetic algorithms (GA), least error squares (LS) and least absolute value filtering (LAVF). The problem is formulated as an estimation problem, and different forecasting models are considered. Actual recorded data are used to perform the study, and the performance of the three optimal estimation techniques is examined. The advantages of each algorithm are reported and discussed.
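As a small illustration of the least error squares approach, the sketch below fits a simple polynomial load-growth model to recorded peak loads. The model form is an assumption; the paper's actual forecasting models are not specified in the abstract.

```python
import numpy as np

def fit_peak_load_ls(years, peak_load):
    """Least-error-squares fit of an assumed trend model P(t) = a0 + a1*t + a2*t**2."""
    t = np.asarray(years, dtype=float)
    A = np.column_stack([np.ones_like(t), t, t ** 2])          # design matrix
    theta, *_ = np.linalg.lstsq(A, np.asarray(peak_load, dtype=float), rcond=None)
    return theta   # [a0, a1, a2]; forecast by evaluating the model at future t
```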

Fast Cosine Transform to Increase Speed-up and Efficiency of Karhunen-Loève Transform for Lossy Image Compression

In this work, we present a comparison between two techniques of image compression. In the first case, the image is divided into blocks which are collected according to a zig-zag scan. In the second one, we apply the Fast Cosine Transform to the image, and the transformed image is then divided into blocks which are likewise collected according to a zig-zag scan. In both cases, the Karhunen-Loève transform is then applied to these blocks. We also present three new metrics based on eigenvalues for a better comparative evaluation of the techniques. Simulations show that the combined version is the best, with lower Mean Absolute Error (MAE) and Mean Squared Error (MSE), higher Peak Signal-to-Noise Ratio (PSNR) and better image quality. Finally, the new technique was far superior to JPEG and JPEG2000.
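A minimal sketch of the combined scheme is shown below: a 2-D fast cosine transform of the image, 8x8 blocking, then a Karhunen-Loève transform (eigen-decomposition of the block covariance) over the blocks. Zig-zag collection is replaced here by simple row-major flattening, and the block size and number of retained components are illustrative, not the paper's settings.

```python
import numpy as np
from scipy.fft import dctn

def dct_klt_encode(image, block=8, keep=16):
    """Fast Cosine Transform followed by a block-wise KLT (sketch)."""
    h, w = image.shape
    h, w = h - h % block, w - w % block
    transformed = dctn(image[:h, :w].astype(float), norm="ortho")     # 2-D DCT
    # Split into block x block tiles and flatten each tile into a vector.
    blocks = (transformed.reshape(h // block, block, w // block, block)
                         .swapaxes(1, 2).reshape(-1, block * block))
    mean = blocks.mean(axis=0)
    centered = blocks - mean
    # KLT basis = eigenvectors of the block covariance matrix.
    cov = centered.T @ centered / len(centered)
    eigvals, eigvecs = np.linalg.eigh(cov)
    basis = eigvecs[:, ::-1][:, :keep]        # strongest components first
    coeffs = centered @ basis                 # compressed representation
    return coeffs, basis, mean
```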

Fabrication of Carbon Doped TiO2 Nanotubes via In-situ Anodization of Ti-foil in Acidic Medium

Highly ordered TiO2 nanotube (TNT) arrays were fabricated on a pre-treated titanium foil by anodic oxidation at a voltage of 20 V in a phosphoric acid/sodium fluoride electrolyte. Pretreatment of the titanium foil involved washing with acetone, isopropanol, ethanol and deionized water. Carbon doped TiO2 nanotubes (C-TNT) were fabricated in-situ by the same method in the presence of polyvinyl alcohol and urea as carbon sources. The effects of polyvinyl alcohol concentration and oxidation time on the composition, morphology and structure of the C-TNTs were studied by FE-SEM, EDX and XRD techniques. FE-SEM images of the nanotubes showed uniform arrays of C-TNTs. The density and microstructure of the nanotubes were greatly affected by the PVA content. The introduction of polyvinyl alcohol into the electrolyte uniformly increases the amount of carbon inside the TiO2 nanotube arrays. The influence of carbon content on the photo-current of the C-TNTs was investigated and the I-V profiles of the nanotubes were established. The preliminary results indicated that the in-situ doping technique produced nanotubes of superior quality compared to post-doping techniques.

Retina Based Mouse Control (RBMC)

The paper presents a novel idea for controlling computer mouse cursor movement with the human eyes. The working of the product is described, showing how it can help people with special needs share their knowledge with the world. A number of traditional techniques, such as head and eye movement tracking systems, exist for cursor control using image processing, in which light is the primary source. Electro-oculography (EOG) is a newer technology for sensing eye signals with which the mouse cursor can be controlled. The signals captured using sensors are first amplified, then denoised and digitized, before being transferred to a PC for software interfacing.

In Search of Robustness and Efficiency via l1- and l2-Regularized Optimization for Physiological Motion Compensation

Compensating physiological motion in the context of minimally invasive cardiac surgery has become an attractive issue, since such surgery outperforms traditional cardiac procedures and offers remarkable benefits. Owing to space restrictions, computer vision techniques have proven to be the most practical and suitable solution. However, the lack of robustness and efficiency of existing methods makes physiological motion compensation an open and challenging problem. This work focuses on increasing robustness and efficiency via exploration of the classes of l1- and l2-regularized optimization, emphasizing the use of explicit regularization. Both approaches are based on natural features of the heart and use intensity information. The results point to the l1-regularized optimization class as the best, since it offered the lowest computational cost and the smallest average error, and it proved to work even under complex deformations.
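To illustrate the contrast between the two regularization classes, the sketch below fits a linear motion model with an l1 penalty (Lasso) and an l2 penalty (Ridge). The linear parameterization, feature matrix and regularization strength are assumptions; the paper's actual motion model and solvers are not described in the abstract.

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge

def fit_motion_model(features, displacement, alpha=0.1):
    """Illustrative l1- vs l2-regularized fit of a heart-motion model (sketch).

    features: (n_samples, n_params) matrix built from intensity-based heart
    features; displacement: 1-D array of observed motion along one axis.
    """
    l1_model = Lasso(alpha=alpha).fit(features, displacement)   # sparse parameters
    l2_model = Ridge(alpha=alpha).fit(features, displacement)   # small, dense parameters
    return l1_model.coef_, l2_model.coef_
```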

Photomechanical Analysis of Wooden Testing Bodies under Flexural Loadings

The use of wood in rural construction has been widespread around the world since ancient times. However, its inclusion in structural design requires strong support from a broad knowledge of material properties. The pertinent literature reveals the application of optical methods for determining the complete displacement field on bodies exhibiting regular as well as irregular surfaces. The use of moiré techniques in experimental mechanics consists of analyzing the patterns generated on the body surface before and after deformation. The objective of this research work is to study the qualitative deformation behavior of wooden testing specimens under specific loading situations. The experimental setup follows the literature description of shadow moiré methods. The results indicate a strong influence of anisotropy on the generated displacement field. Important qualitative as well as quantitative stress and strain distributions were obtained for wooden members applicable to rural constructions.

The Effects of Multipath on OFDM Systems for Broadband Power-Line Communications: A Case of the Medium Voltage Channel

Power-line networks are widely used today for broadband data transmission. However, due to multipath within broadband power-line communication (BPLC) systems, caused by stochastic changes in the network load impedances, branches, etc., network or channel capacity performance is affected. This paper investigates the performance of typical medium voltage channels that use Orthogonal Frequency Division Multiplexing (OFDM) techniques with Quadrature Amplitude Modulation (QAM) subcarriers. It has been observed that when the load impedances differ from the line characteristic impedance, channel performance decreases. Also, as the number of branches in the link between the transmitter and receiver increases, a loss of 4 dB per branch is observed in the signal-to-noise ratio (SNR). The information presented in the paper could be useful for an appropriate design of BPLC systems.
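For readers unfamiliar with the transmission scheme, the sketch below passes 4-QAM OFDM symbols through a simple tapped multipath channel with additive noise and applies one-tap equalization. The subcarrier count, cyclic-prefix length, channel taps and noise level are placeholders, not the medium-voltage channel parameters used in the paper.

```python
import numpy as np

def ofdm_through_multipath(bits, n_sub=64, cp=16, h=(1.0, 0.4, 0.2), snr_db=30):
    """Minimal OFDM / 4-QAM link over a multipath channel (illustrative sketch).

    bits: flat 0/1 array whose length is a multiple of 2 * n_sub.
    Returns the equalized frequency-domain symbols.
    """
    # 4-QAM mapping: bit pairs -> (+-1 +- 1j) / sqrt(2)
    b = np.asarray(bits).reshape(-1, 2)
    symbols = ((2 * b[:, 0] - 1) + 1j * (2 * b[:, 1] - 1)) / np.sqrt(2)
    symbols = symbols.reshape(-1, n_sub)
    tx = np.fft.ifft(symbols, axis=1)
    tx = np.hstack([tx[:, -cp:], tx]).ravel()              # add cyclic prefix
    rx = np.convolve(tx, h)[: tx.size]                     # multipath channel
    noise_p = np.mean(np.abs(rx) ** 2) / 10 ** (snr_db / 10)
    rx = rx + np.sqrt(noise_p / 2) * (np.random.randn(rx.size) + 1j * np.random.randn(rx.size))
    rx = rx.reshape(-1, n_sub + cp)[:, cp:]                # drop cyclic prefix
    return np.fft.fft(rx, axis=1) / np.fft.fft(h, n_sub)   # one-tap equalization
```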

Implementation of a Motion Detection System

In today's competitive environment, security concerns have grown tremendously. In the modern world, possession is known to be nine-tenths of the law; hence, it is imperative to be able to safeguard one's property from worldly harms such as theft, destruction of property and people with malicious intent. Due to the advent of technology in the modern world, the methods used by thieves and robbers for stealing have been improving exponentially. Therefore, it is necessary for surveillance techniques to improve with the changing world. With the improvement in mass media and various forms of communication, it is now possible to monitor and control the environment to the advantage of the owners of the property. The latest technologies used in the fight against theft and destruction are video surveillance and monitoring. With these technologies, it is possible to monitor and capture every inch and every second of the area of interest. However, the technologies used so far are passive in nature, i.e., the monitoring systems only help in detecting the crime but do not actively participate in stopping or curbing the crime while it takes place. We have therefore developed a methodology to detect motion in a video stream environment, the idea being to ensure that monitoring systems not only detect the crime, but do so while the crime is taking place. Hence, the system detects any motion in a live streaming video, and once motion has been detected in the live stream, the software activates a warning system and captures the live streaming video.
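As a minimal sketch of motion detection on a live stream, the code below uses simple frame differencing with OpenCV. The abstract does not state which detection algorithm the system uses, so the thresholds and the warning action here are placeholders only.

```python
import cv2

def detect_motion(video_source=0, threshold=25, min_changed_pixels=5000):
    """Frame-differencing motion detector over a live video stream (sketch)."""
    cap = cv2.VideoCapture(video_source)
    ok, prev = cap.read()
    if not ok:
        return
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(gray, prev_gray)                     # change since last frame
        _, mask = cv2.threshold(diff, threshold, 255, cv2.THRESH_BINARY)
        if cv2.countNonZero(mask) > min_changed_pixels:
            print("Motion detected - trigger warning and start capturing")
        prev_gray = gray
    cap.release()
```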

Media and Information Literacy (MIL) for Thai Youths

The objectives of this study are to determine the role of media in influencing the values, attitudes and behaviors of Thai youths. Analytical qualitative research techniques were used for this purpose, with data collected through individual interviews and focus group discussions with journalists, samples of high school and university students, and parents. The results show that social media is still the most popular medium for Thai youths. It also remains in the hands of the marketing business and can motivate Thai youths to do many things. The main reasons for media exposure are to find the quality information they want quickly, to gain satisfaction, and to use social media to find excitement and to build communities. They believe that the media and information literacy skills needed are shaped by making judgments, personal integrity, family upbringing and the behavior of close friends.

Attribute Weighted Class Complexity: A New Metric for Measuring Cognitive Complexity of OO Systems

In general, class complexity is measured based on one of several factors, such as Lines of Code (LOC), Function Points (FP), Number of Methods (NOM), Number of Attributes (NOA) and so on. Several new techniques, methods and metrics based on different factors have been developed by researchers for calculating the complexity of a class in Object Oriented (OO) software. Earlier, Arockiam et al. proposed a complexity measure called Extended Weighted Class Complexity (EWCC), an extension of the Weighted Class Complexity proposed by Mishra et al. EWCC is the sum of the cognitive weights of the attributes and methods of the class and those of the derived classes. In EWCC, the cognitive weight of each attribute is taken to be 1. The main problem with the EWCC metric is that every attribute holds the same value, whereas in general the cognitive load in understanding different types of attributes is not the same. We therefore propose a new metric, Attribute Weighted Class Complexity (AWCC), in which cognitive weights are assigned to attributes based on the effort needed to understand their data types. The proposed metric has been shown, through case studies and experiments, to be a better measure of the complexity of a class with attributes.
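The sketch below illustrates how such a metric could be computed, following the EWCC structure of summing attribute and method weights plus the complexity of derived classes. The per-type cognitive weights are placeholders, not the values assigned in the paper.

```python
# Illustrative per-data-type cognitive weights; the paper's actual weights may differ.
ATTRIBUTE_TYPE_WEIGHTS = {"int": 1, "float": 1, "string": 2, "array": 3, "object": 4}

def awcc(attribute_types, method_cognitive_weights, derived_class_complexities=()):
    """Attribute Weighted Class Complexity (sketch): weighted attributes
    + method cognitive weights + complexity of derived classes."""
    attr_part = sum(ATTRIBUTE_TYPE_WEIGHTS.get(t, 1) for t in attribute_types)
    method_part = sum(method_cognitive_weights)
    return attr_part + method_part + sum(derived_class_complexities)

# Example: a class with two int attributes, one string attribute and two methods
# of cognitive weight 3 and 5 gives AWCC = (1 + 1 + 2) + (3 + 5) = 12.
```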

Developing Vision-Based Digital Public Display as an Interactive Media

Interactive public displays are an innovative medium that promotes enhanced communication between people and information. However, digital public displays are subject to a few constraints, such as content presentation: the content needs to be more interesting in order to attract people's attention and motivate them to interact with the display. In this paper, we propose an approach to implementing content with interaction elements for a vision-based digital public display. Vision-based techniques are applied as a sensor to detect passers-by, and themed content is suggested to attract their attention and encourage them to interact with the announcement content. Virtual objects, gesture detection and projection installation are applied to attract the attention of passers-by. A preliminary study showed positive feedback on the interactive content design of the public display. This new trend would be a valuable innovation, as the delivery of announcement content and information communication through this medium proves to be more engaging.

Assessment Methods for Surgical Skill

Increasingly sophisticated technologies are now able to assist surgeons in improving surgical performance through various training programs. Equally important to learning skills is the assessment method, as it determines the learning and technical proficiency of a trainee. A consistent and rigorous assessment system will ensure that trainees acquire the specified level of competency prior to certification. This paper reviews the methods currently in use for the assessment of surgical skill, together with some modern techniques that use computer-based measurements and virtual reality systems for more quantitative assessment.