Transient Thermal Modeling of an Axial Flux Permanent Magnet (AFPM) Machine Using a Hybrid Thermal Model

This paper presents the development of a hybrid thermal model for the EVO Electric AFM 140 Axial Flux Permanent Magnet (AFPM) machine as used in hybrid and electric vehicles. The adopted approach is based on a hybrid lumped-parameter and finite difference method. The proposed method divides each motor component into regular elements which are connected together in a thermal resistance network representing all the physical connections in all three dimensions. The element shape and size are chosen according to the component geometry to ensure consistency. The fluid domain is lumped into one region with averaged heat transfer parameters connecting it to the solid domain. Some model parameters are obtained from Computational Fluid Dynamics (CFD) simulation and empirical data. The hybrid thermal model is described by a set of coupled linear first-order differential equations, which is discretised and solved iteratively to obtain the temperature profile. The computational cost is low, so the model is suitable for transient temperature predictions. The maximum error in temperature prediction is 3.4%, and the mean error is consistently lower than the mean error due to uncertainty in measurements. The details of the model development, temperature predictions and suggestions for design improvements are presented in this paper.
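As a minimal illustration of how such a network model can be time-stepped, the sketch below integrates a lumped thermal resistance network with an explicit Euler scheme. The node layout, parameter values and names (G, C, P) are hypothetical placeholders, not the paper's actual model.

```python
import numpy as np

# Hypothetical 3-node network: winding -> stator iron -> cooling fluid.
# G[i, j] is the thermal conductance (W/K) between nodes i and j,
# C[i] the thermal capacitance (J/K) of node i, P[i] the heat input (W).
G = np.array([[0.0, 2.0, 0.0],
              [2.0, 0.0, 5.0],
              [0.0, 5.0, 0.0]])
G_amb = np.array([0.0, 0.0, 8.0])   # convection from the fluid node to ambient
C = np.array([150.0, 900.0, 400.0])
P = np.array([120.0, 30.0, 0.0])    # copper loss, iron loss, no loss in fluid
T_amb, T = 25.0, np.full(3, 25.0)   # ambient and initial temperatures (deg C)

dt = 0.5                            # explicit Euler time step (s)
for _ in range(int(3600 / dt)):     # one hour of simulated operation
    # Net heat flow into each node: losses + conduction + convection.
    q = P + (G * (T[None, :] - T[:, None])).sum(axis=1) + G_amb * (T_amb - T)
    T += dt * q / C
print("Temperatures after 1 h:", np.round(T, 1))
```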

Stego Machine – Video Steganography using Modified LSB Algorithm

Computer technology and the Internet have transformed data communication, opening a whole new way of implementing steganography to ensure secure data transfer. Steganography is the art of hiding information: hiding a message within a carrier file enables deniability of the existence of any message at all. This paper presents the design of a stego machine, a steganographic application that hides text data in a computer video file and retrieves the hidden information. The text file is embedded in the video file in such a way that the video does not lose its functionality, using a modified Least Significant Bit (LSB) method that applies imperceptible modifications. The proposed method strives for high security, relying on an eavesdropper's inability to detect the hidden information.
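A minimal sketch of plain LSB embedding on raw frame bytes is shown below; it illustrates the general principle only, since the paper's modified LSB algorithm and its video handling are not specified here. The names frame and message are illustrative.

```python
import numpy as np

def embed_lsb(frame: np.ndarray, message: bytes) -> np.ndarray:
    """Hide `message` in the least significant bits of a frame's pixel bytes."""
    bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
    flat = frame.flatten()                       # copy of the pixel bytes
    if bits.size > flat.size:
        raise ValueError("message too large for this frame")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits   # overwrite LSBs
    return flat.reshape(frame.shape)

def extract_lsb(frame: np.ndarray, n_bytes: int) -> bytes:
    """Recover `n_bytes` hidden bytes from the frame's LSBs."""
    bits = frame.flatten()[:n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

frame = np.random.randint(0, 256, (120, 160, 3), dtype=np.uint8)
stego = embed_lsb(frame, b"secret")
assert extract_lsb(stego, 6) == b"secret"
```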

Application of RP Technology with Polycarbonate Material for Wind Tunnel Model Fabrication

Traditionally, wind tunnel models are made of metal and are very expensive. In recent years there has been constant pressure to do more with less, and under the right test conditions a rapid prototype part can be tested in a wind tunnel. Using rapid prototype manufacturing techniques and materials in this way significantly reduces the time and cost of producing wind tunnel models. This study examined fused deposition modeling (FDM) and its ability to produce components for wind tunnel models in a timely and cost-effective manner. This paper discusses a wind tunnel model configuration constructed using FDM for transonic wind tunnel testing. A study was undertaken comparing a rapid prototype model built by FDM from polycarbonate with a standard machined steel model. Testing covered the range of Mach 0.3 to Mach 0.75 at an angle-of-attack range of -2° to +12°. Results from this study show relatively good agreement between the two models and confirm that the rapid prototyping method reduces the time and cost of producing wind tunnel models. It can be concluded from this study that wind tunnel models constructed using rapid prototyping methods and materials can be used in wind tunnel testing for initial baseline aerodynamic database development.

Improvement of Overall Equipment Effectiveness through Total Productive Maintenance

Frequent machine breakdowns, low plant availability and increased overtime are a great threat to a manufacturing plant, as they increase its operating costs. The main aim of this study was to improve Overall Equipment Effectiveness (OEE) at a manufacturing company through the implementation of innovative maintenance strategies. A case study approach was used. The paper focuses on improving maintenance in a manufacturing set-up using an innovative maintenance regime mix to improve overall equipment effectiveness. Interviews, reviews of documentation and historical records, and direct and participatory observation were used as data collection methods during the research. Production is usually measured by the total kilowatts of motors produced per day; the target at 91% availability is 75 kilowatts a day. Reduced demand and a lack of raw materials, particularly imported items, are adversely affecting the manufacturing operations. The company had to reset its target from the usual figure of 250 kilowatts per day to a mere 75 per day, owing to lower machine availability as a result of breakdowns as well as the lack of raw materials. Price reductions and uncertainties, together with general machine breakdowns, further lowered production. Several recommendations were given. For instance, employee empowerment in the company would enhance responsibility and authority to improve on, and ultimately eliminate, the six big losses. If the maintenance department is to realise its proper function in a progressive, innovative industrial society, its personnel must be continuously trained to meet current needs as well as future requirements. To make the maintenance planning system effective, it is essential to keep track of all corrective maintenance jobs and preventive maintenance inspections; for large processing plants these cannot be handled manually. It was therefore recommended that the company implement a Computerised Maintenance Management System (CMMS).
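For reference, OEE is conventionally computed as the product of the availability, performance and quality rates, each of which is eroded by two of the six big losses. A worked example with illustrative numbers (only the 91% availability comes from the study above) might look like this:

```python
# OEE = Availability x Performance x Quality (standard TPM definition).
availability = 0.91          # uptime / planned production time (from the study)
performance  = 0.80          # actual output / theoretical output (illustrative)
quality      = 0.95          # good units / total units (illustrative)

oee = availability * performance * quality
print(f"OEE = {oee:.1%}")    # OEE = 69.2%  (world-class benchmark is ~85%)
```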

Information Fusion as a Means of Forecasting Expenditures for Regenerating Complex Investment Goods

Planning capacities when regenerating complex investment goods involves particular challenges, in that the planning is subject to a large degree of uncertainty regarding load information. Using information fusion, by applying Bayesian networks, a method is developed for forecasting the anticipated expenditures (human labor, tool and machinery utilization, time, etc.) for regenerating a good. The generated forecasts then serve as a tool for planning capacities and ensure greater stability in the planning processes.

Analysis of Noise Level Effects on Signal-Averaged Electrocardiograms

Noise level has critical effects on the diagnostic performance of the signal-averaged electrocardiogram (SAECG), because the true starting and end points of the QRS complex may be masked by residual noise and are sensitive to the noise level. Several studies and commercial machines have used a fixed number of heart beats (typically between 200 and 600 beats) or a predefined noise level (typically between 0.3 and 1.0 μV) in each of the X, Y and Z leads to perform SAECG analysis. However, the different criteria or methods used to perform SAECG cause discrepancies in the noise levels among study subjects. According to the recommendations of the 1991 ESC, AHA and ACC Task Force consensus document on the use of SAECG, the determination of onset and offset is related closely to the mean and standard deviation of the noise sample. Hence, this study performs SAECG using consistent root-mean-square (RMS) noise levels among study subjects and analyzes the noise level effects on SAECG. It also evaluates the differences between normal subjects and chronic renal failure (CRF) patients in the time-domain SAECG parameters. The study subjects comprised 50 normal Taiwanese subjects and 20 CRF patients. During signal-averaged processing, the RMS noise level was varied to evaluate its effects on three time-domain parameters: (1) filtered total QRS duration (fQRSD), (2) RMS voltage of the last 40 ms of the QRS (RMS40), and (3) duration of the low-amplitude signals below 40 μV (LAS40). The results demonstrate that reducing the RMS noise level increases fQRSD and LAS40, decreases RMS40, and further increases the differences in fQRSD and RMS40 between normal subjects and CRF patients. The SAECG may also become abnormal owing to the reduction of the RMS noise level. In conclusion, it is essential to establish diagnostic criteria for SAECG using consistent RMS noise levels in order to reduce noise level effects.
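A minimal sketch of the averaging-to-a-target-noise idea is given below: aligned beats are accumulated until the RMS of a noise window in the averaged beat falls below a chosen level. The noise-window location, the synthetic data and the target value are illustrative assumptions, not the study's protocol.

```python
import numpy as np

def rms(x: np.ndarray) -> float:
    return float(np.sqrt(np.mean(x ** 2)))

def signal_average(beats: np.ndarray, noise_win: slice,
                   target_uv: float) -> tuple[np.ndarray, int]:
    """Average aligned beats until RMS noise in `noise_win` <= target_uv (uV)."""
    acc = np.zeros(beats.shape[1])
    for n, beat in enumerate(beats, start=1):
        acc += beat
        avg = acc / n
        if rms(avg[noise_win]) <= target_uv:   # consistent stopping criterion
            return avg, n
    return acc / len(beats), len(beats)

# 400 synthetic aligned beats: flat segment plus 10-uV white noise.
rng = np.random.default_rng(0)
beats = rng.normal(0.0, 10.0, size=(400, 500))
avg, used = signal_average(beats, noise_win=slice(400, 500), target_uv=0.7)
print(f"beats averaged: {used}, residual noise: {rms(avg[400:500]):.2f} uV")
```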

Improvement Approach on Rotor Time Constant Adaptation with Optimum Flux in IFOC for Induction Machines Drives

Induction machine models used for steady-state and transient analysis require machine parameters that are usually considered design parameters or data. Knowledge of the induction machine parameters is very important for Indirect Field Oriented Control (IFOC): a mismatched set of parameters will degrade the speed and torque control response. This paper presents an improved approach to rotor time constant adaptation in IFOC for Induction Machines (IM). Our approach aims to improve the estimation accuracy of the fundamental model used for flux estimation. Based on a reduced-order IM model, the rotor fluxes and the rotor time constant are estimated using only the stator currents and voltages. This reduced-order model offers many advantages for real-time parameter identification of the IM.
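As background, a common way to estimate rotor flux from measured stator quantities is the discretised "current model" in the stationary frame, in which the rotor time constant Tr appears explicitly; a mismatch between this estimate and a voltage-model estimate can then drive a Tr adaptation law. The sketch below shows only the current-model flux update with illustrative parameters; it is not the authors' reduced-order observer.

```python
import numpy as np

# Illustrative machine parameters (not from the paper).
Lm, Tr = 0.11, 0.14          # magnetizing inductance (H), rotor time constant (s)
dt = 1e-4                    # control period (s)

def current_model_step(psi_r: complex, i_s: complex, w_r: float) -> complex:
    """One Euler step of the stationary-frame rotor-flux 'current model':
       d(psi_r)/dt = (Lm/Tr)*i_s - (1/Tr - 1j*w_r)*psi_r
    with i_s the stator current space vector and w_r the electrical rotor speed."""
    dpsi = (Lm / Tr) * i_s - (1.0 / Tr - 1j * w_r) * psi_r
    return psi_r + dt * dpsi

psi = 0.0 + 0.0j
for k in range(20000):                                # 2 s of simulated operation
    t = k * dt
    i_s = 5.0 * np.exp(1j * 2 * np.pi * 50 * t)       # 5 A current vector at 50 Hz
    psi = current_model_step(psi, i_s, w_r=2 * np.pi * 47)   # ~6% slip
print(f"|psi_r| = {abs(psi):.3f} Wb")
```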

Quranic Braille System

This article is concerned with the translation of Quranic verses into Braille symbols using a Visual Basic program. The system has the ability to translate the special vibrations of the Quran; this study is limited to the (Noon + Sukoon) vibrations. It builds on an existing translation system that combines a finite state machine with left and right context matching and a set of translation rules. This allows the Arabic text to be translated into Braille symbols after the vibrations in the Quranic verses have been detected.
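To illustrate the context-matching idea, the sketch below applies translation rules of the form (left context, character, right context) -> Braille cell, falling back to a default mapping. The rules, Braille cells and character mappings are invented placeholders, not the system's actual rule base.

```python
# Each rule: (left_context, char, right_context, braille_output).
# An empty context matches anywhere. All mappings below are placeholders.
RULES = [
    ("", "\u0646", "\u0652", "\u2819"),  # noon followed by sukoon: special cell
    ("", "\u0646", "",       "\u281D"),  # plain noon
]
DEFAULT = {"\u0628": "\u2803", "\u0652": ""}   # baa -> cell, sukoon -> dropped

def to_braille(text: str) -> str:
    out = []
    for i, ch in enumerate(text):
        left, right = text[:i], text[i + 1:]
        for lc, sym, rc, cell in RULES:            # first matching rule wins
            if sym == ch and left.endswith(lc) and right.startswith(rc):
                out.append(cell)
                break
        else:
            out.append(DEFAULT.get(ch, "?"))
    return "".join(out)

print(to_braille("\u0646\u0652\u0628"))   # noon + sukoon + baa
```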

Multiclass Support Vector Machines for Environmental Sounds Classification Using log-Gabor Filters

In this paper we propose a robust environmental sound classification approach based on spectrogram features derived from log-Gabor filters. This approach includes two methods. In the first method, the spectrograms are passed through an appropriate bank of log-Gabor filters, and the outputs are averaged and undergo an optimal feature selection procedure based on a mutual information criterion. The second method uses the same steps but applied only to three patches extracted from each spectrogram. To investigate the accuracy of the proposed methods, we conduct experiments using a large database containing 10 environmental sound classes. The classification results based on multiclass Support Vector Machines show that the second method is the most efficient, with an average classification accuracy of 89.62%.
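The standard 1-D log-Gabor transfer function (Field, 1987) is G(f) = exp(-(ln(f/f0))^2 / (2 ln^2(sigma))), with f0 the center frequency and sigma the bandwidth ratio. The sketch below builds a small bank of such filters and applies them to the rows of a spectrogram; the filter count, bandwidth ratio and spectrogram shape are illustrative assumptions.

```python
import numpy as np

def log_gabor_bank(n_freqs: int, n_filters: int, sigma_ratio: float = 0.65):
    """Bank of 1-D log-Gabor filters: G(f) = exp(-ln(f/f0)^2 / (2 ln(sigma)^2))."""
    f = np.arange(1, n_freqs + 1) / n_freqs           # normalized freqs, no DC
    centers = np.geomspace(0.05, 0.45, n_filters)     # log-spaced center freqs
    bank = np.stack([np.exp(-np.log(f / f0) ** 2 /
                            (2 * np.log(sigma_ratio) ** 2)) for f0 in centers])
    return bank                                        # (n_filters, n_freqs)

# Toy spectrogram: (freq bins, time frames); real input would come from an STFT.
spec = np.abs(np.random.default_rng(1).normal(size=(128, 200)))
bank = log_gabor_bank(n_freqs=128, n_filters=12)

# Filter along the frequency axis, then average over time to get one
# feature value per filter, as in the averaging step described above.
responses = bank @ spec                   # (12, 200) filtered energies
features = responses.mean(axis=1)         # (12,) averaged feature vector
print(features.round(3))
```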

Consumer Product Demand Forecasting based on Artificial Neural Network and Support Vector Machine

The nature of consumer products makes their future demand difficult to forecast, and the accuracy of the forecasts significantly affects the overall performance of the supply chain system. In this study, two data mining methods, artificial neural networks (ANN) and support vector machines (SVM), were used to predict the demand for consumer products. The training data were the actual demand of six different products from a consumer product company in Thailand. The results indicated that SVM had a better forecast quality (in terms of MAPE) than ANN in every category of products. Moreover, another important finding was that the difference in MAPE between the two methods was considerably larger when the data were highly correlated.
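MAPE, the comparison metric used above, is the mean of the absolute percentage errors, MAPE = (100/n) * sum_t |A_t - F_t| / |A_t|, where A_t is the actual and F_t the forecast demand. A minimal implementation with made-up demand figures:

```python
import numpy as np

def mape(actual: np.ndarray, forecast: np.ndarray) -> float:
    """Mean Absolute Percentage Error, in percent (actuals must be nonzero)."""
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

actual   = np.array([120.0, 95.0, 140.0, 110.0])   # illustrative demand
forecast = np.array([112.0, 99.0, 151.0, 104.0])
print(f"MAPE = {mape(actual, forecast):.2f}%")      # MAPE = 6.05%
```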

Cross Signal Identification for PSG Applications

The standard investigational method for diagnosing obstructive sleep apnea syndrome (OSAS) is polysomnography (PSG), which consists of a simultaneous, usually overnight, recording of multiple electro-physiological signals related to sleep and wakefulness. This is an expensive, encumbering protocol that is not readily repeated, so there is a need for simpler, easily implemented screening and detection techniques. Identification of apnea/hypopnea events in the screening recordings is the key factor in the diagnosis of OSAS. The analysis of a single-lead electrocardiographic (ECG) signal for OSAS diagnosis, which may be done with portable devices at the patient's home, has been the challenge of recent years. A novel artificial neural network (ANN) based approach for feature extraction and automatic identification of respiratory events in ECG signals is presented in this paper. A nonlinear principal component analysis (NLPCA) method was considered for feature extraction, and a support vector machine for classification/recognition. An alternative representation of the respiratory events by means of a Kohonen-type neural network is discussed. Our prospective study was based on OSAS patients of the Clinical Hospital of Pneumology from Iaşi, Romania, males and females, as well as on non-OSAS investigated human subjects. Our computed analysis includes a learning phase based on cross-signal PSG annotation.

An Ontology Abstract Machine

As more people from non-technical backgrounds become directly involved in large-scale ontology development, the focal point of ontology research has shifted from the more theoretical ontology issues to problems associated with the actual use of ontologies in real-world, large-scale collaborative applications. Recently, the National Science Foundation funded a large collaborative ontology development project, for which a new formal ontology model, the Ontology Abstract Machine (OAM), was developed to satisfy some unique functional and data representation requirements. This paper introduces the OAM model and the related algorithms that enable maintenance of an ontology supporting node-based user access. The successful software implementation of the OAM model and its subsequent acceptance by a large research community demonstrate its validity and its real-world application value.

Exploiting Machine Learning Techniques for the Enhancement of Acceptance Sampling

This paper proposes an innovative methodology for Acceptance Sampling by Variables, a category of Statistical Quality Control dealing with the assurance of product quality. Our contribution lies in the exploitation of machine learning techniques to address the complexity and remedy the drawbacks of existing approaches. More specifically, the proposed methodology exploits Artificial Neural Networks (ANNs) to aid decision making about the acceptance or rejection of an inspected sample. For any type of inspection, ANNs are trained on data from the corresponding tables of a standard's sampling plan schemes. Once trained, the ANNs give closed-form solutions for any acceptance quality level and sample size, thus automating the reading of the sampling plan tables without being restricted to the discrete values tabulated in the chosen standard. The proposed methodology provides quality control engineers with considerable flexibility during the inspection of their samples, allowing specific needs to be taken into account, while it also reduces the time and cost required for these inspections. Its applicability and advantages are demonstrated through two numerical examples.
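A minimal sketch of the idea follows: a small neural network is trained to interpolate a variables sampling plan table, here a few invented (AQL, sample size) -> acceptability constant k rows rather than any actual standard's values, assuming scikit-learn is available.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Invented table rows: (AQL %, sample size) -> acceptability constant k.
# A real application would use the tables of the chosen standard instead.
X = np.array([[0.65, 5], [0.65, 10], [1.0, 5], [1.0, 10],
              [1.5, 5], [1.5, 10], [2.5, 5], [2.5, 10]], dtype=float)
y = np.array([1.65, 1.72, 1.53, 1.62, 1.40, 1.50, 1.21, 1.33])

net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=20000,
                   random_state=0).fit(X, y)

# Once trained, the network answers for intermediate quality levels and
# sample sizes that the printed table does not list.
print(net.predict([[1.2, 8]]).round(3))
```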

Multi-Line Power Flow Control using Interline Power Flow Controller (IPFC) in Power Transmission Systems

The interline power flow controller (IPFC) is one of the latest-generation flexible AC transmission systems (FACTS) controllers, used to control power flows in multiple transmission lines. This paper presents a mathematical model of the IPFC, termed the power injection model (PIM). This model is incorporated in a Newton-Raphson (NR) power flow algorithm to study power flow control in the transmission lines in which the IPFC is placed. A MATLAB program has been written to extend the conventional NR algorithm based on this model. Numerical results are obtained for a standard 2-machine, 5-bus system. The results with and without the IPFC are compared in terms of voltages and active and reactive power flows to demonstrate the performance of the IPFC model.
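The essence of a power injection model is that the series converters are replaced by equivalent power injections added to the scheduled bus powers, so the NR machinery itself is untouched. A self-contained miniature example (a 2-bus toy network with placeholder injection values, in Python rather than the paper's MATLAB) is sketched below:

```python
import numpy as np

# Minimal 2-bus example: slack bus 1 fixed at 1.0 p.u., PQ bus 2 solved by NR.
y = 1.0 / (0.02 + 0.08j)                  # line series admittance (p.u.)
Y = np.array([[y, -y], [-y, y]])          # bus admittance matrix
P_sched, Q_sched = -0.8, -0.4             # load at bus 2 (negative injection)
P_ipfc, Q_ipfc = 0.1, 0.05                # placeholder PIM injection terms

def mismatch(x):
    th2, v2 = x
    V = np.array([1.0 + 0.0j, v2 * np.exp(1j * th2)])
    S2 = V[1] * np.conj(Y[1] @ V)         # complex power injected at bus 2
    return np.array([P_sched + P_ipfc - S2.real,
                     Q_sched + Q_ipfc - S2.imag])

x = np.array([0.0, 1.0])                  # flat start: angle 0 rad, 1.0 p.u.
for _ in range(20):
    F = mismatch(x)
    if np.max(np.abs(F)) < 1e-10:
        break
    J = np.empty((2, 2))                  # numerical Jacobian of the mismatch
    for j in range(2):
        dx = np.zeros(2); dx[j] = 1e-7
        J[:, j] = (mismatch(x + dx) - F) / 1e-7
    x -= np.linalg.solve(J, F)            # NR update: x <- x - J^{-1} F

print(f"theta2 = {np.degrees(x[0]):.3f} deg, V2 = {x[1]:.4f} p.u.")
```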

A Study on the Quality of Hexapod Machine Tool's Workspace

One of the main concerns about parallel mechanisms is the presence of singular points within their workspaces. In singular positions the mechanism gains or loses one or several degrees of freedom, and it is impossible to control the mechanism there, so these positions have to be avoided. This is a vital need, especially in computer-controlled machine tools designed and manufactured on the basis of parallel mechanisms, and it has to be taken into consideration when selecting design parameters. A prerequisite is a thorough knowledge of the effect of design parameters and constraints on singularity. In this paper, a quality condition index is introduced as a criterion for evaluating the singularities of the different configurations of a hexapod mechanism obtainable with different design parameters. It is illustrated that this method can effectively be employed to obtain the optimum configuration of a hexapod mechanism with the aim of avoiding singularity within the workspace. The method is then employed to design the hexapod table of a CNC milling machine.
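Closeness to singularity is commonly scored through the condition number of the mechanism Jacobian, since ill-conditioning signals an approaching singularity; whether this matches the paper's quality condition index exactly is an assumption. The sketch below scans a grid of platform positions of a simplified Stewart platform (no platform tilt, illustrative joint radii) and reports the worst conditioning found.

```python
import numpy as np

def hexapod_jacobian(p: np.ndarray, base_r=1.0, plat_r=0.5, spread=0.3):
    """Inverse Jacobian of a simplified Stewart platform: row i = [s_i^T,
    (b_i x s_i)^T], with s_i the unit vector along leg i and b_i the platform
    joint position relative to the platform centre."""
    J = np.zeros((6, 6))
    for i in range(6):
        a_ang = 2*np.pi*i/6 + (spread if i % 2 else -spread)  # base joint angle
        b_ang = 2*np.pi*i/6                                   # platform joint angle
        a = base_r * np.array([np.cos(a_ang), np.sin(a_ang), 0.0])
        b = plat_r * np.array([np.cos(b_ang), np.sin(b_ang), 0.0])
        leg = p + b - a                       # leg vector for platform position p
        s = leg / np.linalg.norm(leg)
        J[i, :3], J[i, 3:] = s, np.cross(b, s)
    return J

# Scan the workspace and report the worst (largest) condition number.
worst = 0.0
for z in np.linspace(0.4, 1.2, 9):
    for x in np.linspace(-0.3, 0.3, 7):
        kappa = np.linalg.cond(hexapod_jacobian(np.array([x, 0.0, z])))
        worst = max(worst, kappa)
print(f"worst condition number over the scanned poses: {worst:.1f}")
```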

Motor Imagery Signal Classification for a Four State Brain Machine Interface

Motor imagery classification provides an important basis for designing Brain Machine Interfaces (BMI). A BMI captures and decodes brain EEG signals and transforms human thought into actions. The ability of an individual to control his EEG through imaginary mental tasks enables him to control devices through the BMI. This paper presents a method to design a four-state BMI using EEG signals recorded from the C3 and C4 locations. Principal features, extracted through principal component analysis of the segmented EEG, are analyzed using two novel classification algorithms based on an Elman recurrent neural network and a functional link neural network. The performance of both classifiers is evaluated using a particle swarm optimization (PSO) training algorithm; the results are also compared with the conventional back propagation (BP) training algorithm. EEG motor imagery recorded from two subjects is used in the offline analysis. From the overall classification performance it is observed that the BP algorithm has a higher average classification rate of 93.5%, while the PSO algorithm has better training time and maximum classification. The proposed methods promise to provide a useful alternative general procedure for motor imagery classification.
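A minimal sketch of the PCA feature-extraction step on segmented two-channel EEG is shown below, assuming scikit-learn. The data are synthetic, and an ordinary multilayer perceptron stands in for the Elman and functional link networks, which scikit-learn does not provide.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for segmented EEG: 200 trials, 2 channels (C3, C4)
# x 128 samples each, flattened, with 4 imagery classes.
X = rng.normal(size=(200, 2 * 128))
y = rng.integers(0, 4, size=200)
X += y[:, None] * 0.15                     # inject a weak class-dependent shift

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

pca = PCA(n_components=20).fit(X_tr)       # principal features of the segments
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000,
                    random_state=0).fit(pca.transform(X_tr), y_tr)
print(f"test accuracy: {clf.score(pca.transform(X_te), y_te):.2f}")
```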

Quantitative Analysis of PCA, ICA, LDA and SVM in Face Recognition

Face recognition is a technique to automatically identify or verify individuals. It receives great attention in identification, authentication, security and many other applications. Diverse methods have been proposed for this purpose and many comparative studies have been performed, yet researchers have not reached a unified conclusion. In this paper, we report an extensive quantitative accuracy analysis of four of the most widely used face recognition algorithms: Principal Component Analysis (PCA), Independent Component Analysis (ICA), Linear Discriminant Analysis (LDA) and Support Vector Machine (SVM), using the AT&T, Sheffield and Bangladeshi people face databases under diverse conditions such as illumination, alignment and pose variations.
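A minimal version of such a comparison, assuming scikit-learn (whose bundled Olivetti faces set is the AT&T database), might look like the sketch below; it compares only PCA+SVM, LDA and a plain SVM, omitting ICA for brevity.

```python
from sklearn.datasets import fetch_olivetti_faces
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

faces = fetch_olivetti_faces()            # the AT&T face database (downloads)
X_tr, X_te, y_tr, y_te = train_test_split(
    faces.data, faces.target, stratify=faces.target, random_state=0)

models = {
    "PCA + SVM": make_pipeline(PCA(n_components=60), SVC(kernel="linear")),
    "LDA":       LinearDiscriminantAnalysis(),
    "SVM":       SVC(kernel="linear"),
}
for name, model in models.items():
    acc = model.fit(X_tr, y_tr).score(X_te, y_te)
    print(f"{name:10s} accuracy: {acc:.3f}")
```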

Experimental and Theoretical Investigation on Notched Specimens Life Under Bending Loading

In this work, the bending fatigue life of notched specimens with various notch geometries and dimensions is investigated experimentally and with the Manson-Coffin theoretical method. In this theoretical method, the fatigue life of notched specimens is calculated using the fatigue life obtained from experiments on plain (un-notched) specimens. Three notch geometries, ∪-shaped, ∨-shaped and C-shaped notches, are considered in this investigation. The experiments are conducted on a Moore rotary bending machine. The specimens are made of a low carbon steel alloy that has wide application in industry. Stress-life curves are obtained for all notched specimens by experiment. The results indicate that the Manson-Coffin analytical method cannot adequately predict the fatigue life of the notched specimens; however, it seems that the difference between the experiments and the Manson-Coffin predictions can be compensated by a proportional factor.
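For reference, the Manson-Coffin (strain-life) relation underlying the theoretical method expresses the total strain amplitude as the sum of an elastic and a plastic term:

\[
\frac{\Delta\varepsilon}{2} = \frac{\sigma'_f}{E}\,(2N_f)^{b} + \varepsilon'_f\,(2N_f)^{c}
\]

where \(\sigma'_f\) and \(\varepsilon'_f\) are the fatigue strength and fatigue ductility coefficients, \(b\) and \(c\) the corresponding exponents, \(E\) the Young's modulus, and \(2N_f\) the number of load reversals to failure.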

Analytical Model Based Evaluation of Human Machine Interfaces Using Cognitive Modeling

Cognitive models allow some aspects of the utility and usability of human machine interfaces (HMI) to be predicted, and the interaction with these interfaces to be simulated. The prediction is based on a task analysis, which investigates what a user is required to do, in terms of actions and cognitive processes, to achieve a task; task analysis facilitates the understanding of the system's functionalities. Cognitive models belong to the analytical approaches, which do not involve users during the development process of the interface. This article presents a study of the evaluation of human machine interaction with a contextual assistant's interface using the ACT-R and GOMS cognitive models. The present work shows how these techniques may be applied in the evaluation of HMI, design and research, emphasizing firstly the task analysis and secondly the execution time of the task. In order to validate and support our results, an experimental study of user performance was conducted at the DOMUS laboratory during interaction with the contextual assistant's interface. The results show that the GOMS and ACT-R models give good and excellent predictions, respectively, of user performance at the task level as well as the object level; the simulated results are very close to those obtained in the experimental study.
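As an illustration of how a GOMS-family model yields execution-time predictions, the keystroke-level variant simply sums standard operator times along the action sequence of a task. The task below and its operator string are invented, while the operator durations are the classic Card, Moran and Newell estimates.

```python
# Classic KLM operator times (seconds), after Card, Moran & Newell.
OP_TIMES = {
    "K": 0.20,   # keystroke or button press
    "P": 1.10,   # point with mouse to a target
    "H": 0.40,   # home hands between keyboard and mouse
    "M": 1.35,   # mental act of preparation
}

def klm_time(ops: str) -> float:
    """Predicted execution time for a sequence of KLM operators."""
    return sum(OP_TIMES[op] for op in ops)

# Invented task: move hand to mouse, think, point at a menu, click,
# think, point at an item, click.
task = "HMPKMPK"
print(f"predicted execution time: {klm_time(task):.2f} s")   # 5.70 s
```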

Development of NOx Emission Model for a Tangentially Fired Acid Incinerator

This paper develops a NOx emission model for an acid gas incinerator using Nelder-Mead least squares support vector regression (LS-SVR). The Malaysian DOE is actively enforcing the Clean Air Regulation, which mandates the installation of analytical instrumentation, known as a Continuous Emission Monitoring System (CEMS), to report emission levels online to the DOE. As a hardware-based analyzer, a CEMS is expensive, maintenance-intensive and often unreliable; software-based predictive techniques are therefore often preferred and considered a feasible alternative for regulatory compliance. The LS-SVR model is built from the emissions of an acid gas incinerator operating in an LNG complex. Simulated Annealing (SA) is first used to determine the initial hyperparameters, which are then further optimized, based on the performance of the model, using the Nelder-Mead simplex algorithm. The LS-SVR model is shown to outperform a benchmark model based on backpropagation neural networks (BPNN) on both training and testing data.
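A compact sketch of the two ingredients is given below: an RBF-kernel LS-SVR solved from its linear system (Suykens' formulation) and scipy's Nelder-Mead simplex minimizing the validation error over the hyperparameters (gamma, sigma). The data are synthetic, and the SA initialization is replaced by a fixed starting point for brevity.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=80)       # synthetic "emissions"
X_tr, y_tr, X_va, y_va = X[:60], y[:60], X[60:], y[60:]

def rbf(A, B, sigma):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvr_fit_predict(log_params):
    """Suykens LS-SVR: solve [[0, 1^T], [1, K + I/gamma]] [b; a] = [0; y]."""
    gamma, sigma = np.exp(log_params)                 # keep both positive
    n = len(y_tr)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:], A[1:, 0] = 1.0, 1.0
    A[1:, 1:] = rbf(X_tr, X_tr, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate([[0.0], y_tr]))
    b, alpha = sol[0], sol[1:]
    return rbf(X_va, X_tr, sigma) @ alpha + b         # predict on validation set

def val_mse(log_params):
    return float(np.mean((lssvr_fit_predict(log_params) - y_va) ** 2))

# Nelder-Mead refines the hyperparameters from an initial guess (fixed here;
# the paper obtains its starting point via Simulated Annealing instead).
res = minimize(val_mse, x0=np.log([10.0, 1.0]), method="Nelder-Mead")
print("best (gamma, sigma):", np.exp(res.x).round(3), "val MSE:", res.fun)
```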