Optimal Current Control of Externally Excited Synchronous Machines in Automotive Traction Drive Applications

The excellent suitability of the externally excited synchronous machine (EESM) for automotive traction drive applications stems from its high efficiency over the entire operating range and the high availability of its materials. Usually, maximum efficiency is obtained by modelling each individual loss and minimizing the sum of all losses. As a result, the quality of the optimization depends strongly on the precision of the model. Moreover, it requires accurate knowledge of the saturation-dependent machine inductances. The present contribution therefore proposes a method to minimize the overall losses of a salient-pole EESM and its inverter in steady-state operation based on measurement data only. Since this method does not require any manufacturer data, it is well suited for automated measurement-data evaluation and inverter parametrization. The field-oriented control (FOC) of an EESM provides three current components, i.e. three degrees of freedom (DOF). An analytic minimization of the copper losses in the stator and the rotor (assuming constant inductances) is performed and serves as a first approximation of how to choose the optimal current reference values. After a numeric offline minimization of the overall losses based on measurement data, the results are compared to a control strategy that satisfies cos(ϕ) = 1.
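The analytic first approximation mentioned above, minimizing stator and rotor copper losses under a torque constraint with constant inductances, can be sketched numerically. All machine parameters below (resistances, inductances, pole pairs) are illustrative assumptions, not values from the paper:

```python
# Hypothetical machine parameters (constant-inductance assumption)
R_S, R_F = 0.02, 1.5                     # stator / field resistance [ohm]
L_D, L_Q, L_MF = 0.5e-3, 0.3e-3, 2.0e-3  # inductances [H]
P_PAIRS = 4                              # pole pairs

def torque(i_d, i_q, i_f):
    """Electromagnetic torque of a salient-pole EESM (constant inductances)."""
    return 1.5 * P_PAIRS * (L_MF * i_f + (L_D - L_Q) * i_d) * i_q

def copper_losses(i_d, i_q, i_f):
    return R_S * (i_d**2 + i_q**2) + R_F * i_f**2

def optimal_currents(t_ref, n=80):
    """Grid search over (i_d, i_f); i_q then follows from the torque constraint."""
    best = None
    for i_d in (k * 400 / n - 200 for k in range(n + 1)):
        for i_f in (k * 20 / n for k in range(1, n + 1)):
            denom = 1.5 * P_PAIRS * (L_MF * i_f + (L_D - L_Q) * i_d)
            if abs(denom) < 1e-9:
                continue
            i_q = t_ref / denom          # enforce torque(i_d, i_q, i_f) == t_ref
            loss = copper_losses(i_d, i_q, i_f)
            if best is None or loss < best[0]:
                best = (loss, i_d, i_q, i_f)
    return best
```

The grid search stands in for the paper's analytic solution; its output is a loss value together with the loss-minimal current triple for the requested torque.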

A New Group Key Management Protocol for Wireless Ad-Hoc Networks

Ad hoc networks are characterized by multi-hop wireless connectivity and frequently changing network topology. Forming security associations among a group of nodes in ad hoc networks is more challenging than in conventional networks due to the lack of a central authority, i.e. a fixed infrastructure. With that view in mind, group key management is an important building block of any secure group communication. The main contribution of this paper is a low-complexity key management scheme suitable for fully self-organized ad hoc networks. The protocol is also password-authenticated, making it resilient against active attacks. Unlike other existing key agreement protocols, ours makes no assumptions about the structure of the underlying wireless network, making it suitable for "truly ad hoc" networks. Finally, we analyze our protocol to show the computation and communication burden on individual nodes for key establishment.
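The paper's own protocol is not reproduced here, but a contributory group key agreement of the kind such schemes build on can be illustrated with a Burmester-Desmedt-style two-round exchange. This sketch is unauthenticated (the password-authentication layer is omitted) and the prime and generator are illustrative, not production-grade choices:

```python
import random

P = (1 << 127) - 1   # a Mersenne prime; fine for illustration only
G = 5                # hypothetical generator choice

def bd_group_key(n, seed=0):
    """Burmester-Desmedt-style group key agreement among n nodes (sketch)."""
    rng = random.Random(seed)
    x = [rng.randrange(2, P - 1) for _ in range(n)]   # each node's secret
    z = [pow(G, xi, P) for xi in x]                   # round 1: broadcast g^x_i
    # round 2: broadcast X_i = (z_{i+1} / z_{i-1})^{x_i}
    X = [pow(z[(i + 1) % n] * pow(z[(i - 1) % n], -1, P) % P, x[i], P)
         for i in range(n)]
    keys = []
    for i in range(n):
        # each node combines what it heard; all K_i are provably equal
        k = pow(z[(i - 1) % n], n * x[i], P)
        for j in range(1, n):
            k = k * pow(X[(i + j - 1) % n], n - j, P) % P
        keys.append(k)
    return keys
```

Every node performs O(n) modular exponentiations and two broadcasts, which conveys the kind of per-node computation and communication burden the paper analyzes.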

Red Diode Laser in the Treatment of Epidermal Diseases in PDT

The absorption of laser light in the skin during irradiation is a critical factor in medical treatments. Delivering the correct amount of laser light is a critical element in photodynamic therapy (PDT): too much light can damage skin tissue, while too little fails to enhance the PDT procedure. Knowledge of the skin-tone-dependent distribution of 635 nm radiation and its penetration depth in skin is therefore an important precondition for investigating the beneficial laser-induced effects of PDT in epidermal diseases such as psoriasis. The aim of this work was to estimate the optimum effect of a 635 nm diode laser on the treatment of epidermal diseases in skin of different colors, and thereby to improve the safety of laser use in PDT. The Advanced Systems Analysis Program (ASAP), a new approach to investigating PDT as a function of the optical properties of different skin colors, was used in the present work. A two-layered Realistic Skin Model (RSM), consisting of stratum corneum and epidermis, was irradiated with a red laser (635 nm, 10 mW) in radiative transfer simulations to study fluence and absorbance at different penetration depths for various human skin colors. Several skin tones (very fair, fair, light, medium and dark) were simulated. The investigation applied the principles of laser-tissue interaction when the skin is optically injected by a red laser diode. The results demonstrate that the power characteristic of a 635 nm laser diode can affect the treatment of epidermal disease in skin of various colors. The power absorption of the various human skin types was recorded and analyzed in order to find the influence of melanin on PDT treatment of epidermal disease. The two-layered RSM shows that the change in penetration depth in the epidermal layer of colored skin has a large effect on the distribution of absorbed laser light; this is due to the variation of melanin concentration for each skin color.
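The role of melanin in limiting 635 nm penetration depth can be illustrated with the standard diffusion-theory approximation (not the ASAP ray-trace used in the paper); all optical coefficients below are assumed for illustration only:

```python
import math

# Illustrative optical coefficients at 635 nm (assumed, not measured data)
MU_S_PRIME = 20.0       # reduced scattering coefficient [1/cm]
MU_A_BASE = 0.3         # baseline epidermal absorption [1/cm]
MU_A_MELANIN = 50.0     # absorption attributed to pure melanosome [1/cm]

def penetration_depth(melanin_fraction):
    """1/e penetration depth from the diffusion-theory effective attenuation."""
    mu_a = MU_A_BASE + melanin_fraction * MU_A_MELANIN
    mu_eff = math.sqrt(3.0 * mu_a * (mu_a + MU_S_PRIME))
    return 1.0 / mu_eff  # [cm]

def fluence(z_cm, phi0, melanin_fraction):
    """Exponential fluence decay with depth (1-D diffusion approximation)."""
    return phi0 * math.exp(-z_cm / penetration_depth(melanin_fraction))
```

Increasing the melanin fraction (darker skin tones) raises the absorption coefficient and shrinks the penetration depth, which is the qualitative trend the RSM simulations report.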

Applications of Genetic Programming in Data Mining

This paper details the application of a genetic programming framework for the induction of useful classification rules from a database of income statements, balance sheets, and cash flow statements for North American public companies. Potentially interesting classification rules are discovered. Anomalies in the discovery process merit further investigation of the application of genetic programming to datasets in this problem domain.

A Hybrid Particle Swarm Optimization Solution to Ramping Rate Constrained Dynamic Economic Dispatch

This paper presents the application of an enhanced Particle Swarm Optimization (EPSO) combined with Gaussian Mutation (GM) for solving the Dynamic Economic Dispatch (DED) problem considering the operating constraints of generators. The EPSO consists of the standard PSO and a modified heuristic search approach: the ability of the traditional PSO is enhanced by applying the modified heuristic search to prevent the solutions from violating the constraints. In addition, Gaussian Mutation increases the diversity of the global search and prevents the search from being trapped in suboptimal points. To illustrate its efficiency and effectiveness, the developed EPSO-GM approach is tested on 3-unit and 10-unit 24-hour systems considering the valve-point effect. From the experimental results, it can be concluded that the proposed EPSO-GM provides accurate solutions, high efficiency, and robust computation compared with the other algorithms under consideration.
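A minimal sketch of the PSO-with-Gaussian-mutation idea on a static 3-unit dispatch (quadratic costs, no valve-point effect). All cost coefficients and PSO parameters are assumed, and the constraint handling is a simple clip-and-rescale repair rather than the paper's modified heuristic search:

```python
import random

# Hypothetical 3-unit quadratic cost curves: a*P^2 + b*P + c
COST = [(0.008, 7.0, 200.0), (0.009, 6.3, 180.0), (0.007, 6.8, 140.0)]
P_MIN, P_MAX, DEMAND = 10.0, 125.0, 300.0

def total_cost(p):
    return sum(a * x * x + b * x + c for (a, b, c), x in zip(COST, p))

def repair(p):
    """Clip to limits, then rescale to meet demand (a sketch; rescaling may
    slightly re-violate the limits, unlike a full repair heuristic)."""
    p = [min(max(x, P_MIN), P_MAX) for x in p]
    s = sum(p)
    return [x * DEMAND / s for x in p]

def pso_gm(iters=200, swarm=30, sigma=5.0, seed=1):
    rng = random.Random(seed)
    pos = [repair([rng.uniform(P_MIN, P_MAX) for _ in COST]) for _ in range(swarm)]
    vel = [[0.0] * len(COST) for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=total_cost)
    for _ in range(iters):
        for i in range(swarm):
            for d in range(len(COST)):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            # Gaussian mutation keeps diversity and helps escape local optima
            if rng.random() < 0.1:
                pos[i] = [x + rng.gauss(0.0, sigma) for x in pos[i]]
            pos[i] = repair(pos[i])
            if total_cost(pos[i]) < total_cost(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = min(pbest, key=total_cost)
    return gbest
```

The dynamic problem adds a time dimension and ramping constraints on top of this static core; the mutation step is the GM ingredient the abstract credits with preserving search diversity.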

Array Signal Processing: DOA Estimation for Missing Sensors

Array signal processing involves signal enumeration and source localization. It centers on the ability to fuse temporal and spatial information, captured by sampling signals emitted from a number of sources at the sensors of an array, in order to carry out a specific estimation task: estimating source characteristics (mainly the localization of the sources) and/or array characteristics (mainly the array geometry). Beamforming is a general signal processing technique used to control the directionality of the reception or transmission of a signal; using beamforming, the majority of the received signal energy can be steered toward a chosen direction. Multiple signal classification (MUSIC) is a highly popular eigenstructure-based, high-resolution method for estimating the direction of arrival (DOA). This paper examines the effect of missing sensors on DOA estimation. The accuracy of MUSIC-based DOA estimation is degraded significantly both by missing sensors among the receiving array elements and by unequal channel gains and phase errors of the receiver.
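The MUSIC estimator itself can be sketched for a uniform linear array; deleting entries from `positions` (and the corresponding snapshot rows) reproduces the missing-sensor case. The scenario below (8 elements, one source at 20°, assumed noise level) is illustrative:

```python
import numpy as np

def steering(angles_deg, positions):
    """Steering matrix for a line array, element positions in half-wavelengths."""
    a = np.deg2rad(np.asarray(angles_deg))
    return np.exp(1j * np.pi * np.outer(positions, np.sin(a)))

def music_spectrum(snapshots, positions, n_sources, scan_deg):
    """MUSIC pseudo-spectrum from the noise subspace of the sample covariance."""
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]
    _, vecs = np.linalg.eigh(R)                   # eigenvalues ascending
    En = vecs[:, : len(positions) - n_sources]    # noise-subspace eigenvectors
    A = steering(scan_deg, positions)
    return 1.0 / np.sum(np.abs(En.conj().T @ A) ** 2, axis=0)

# Simulated data: 8-element half-wavelength ULA, one source at 20 degrees
rng = np.random.default_rng(0)
true_doa, n_snap = 20.0, 400
full = np.arange(8)
sig = np.exp(1j * 2 * np.pi * rng.random(n_snap))         # unit-modulus source
x = steering([true_doa], full) * sig + 0.05 * (
    rng.standard_normal((8, n_snap)) + 1j * rng.standard_normal((8, n_snap)))
scan = np.arange(-90.0, 90.0, 0.5)
est = scan[np.argmax(music_spectrum(x, full, 1, scan))]
```

With the full array the spectrum peaks sharply at the true angle; dropping sensors while scanning against the full-array steering model broadens and biases the peak, which is the degradation the paper quantifies.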

Automatic Visualization Pipeline Formation for Medical Datasets on Grid Computing Environment

Distance visualization of large datasets often takes the form of remote viewing and zooming of stored static images. However, the continuous growth in the size of datasets and visualization operations leads to insufficient performance on traditional desktop computers. Additionally, visualization techniques such as isosurface extraction depend on the available resources of the running machine and on the size of the datasets. The continuous demand for greater computing power and the continuous increase in dataset size thus create an urgent need for a grid computing infrastructure. However, some issues arise in current grids, such as resource availability at client machines that is not sufficient to process large datasets. On top of that, different output devices and different network bandwidths between the visualization pipeline components often produce output suitable for one machine but not for another. In this paper we investigate how grid services can be used to support remote visualization of large datasets and to remove the constraint of physical co-location of resources by applying grid computing technologies. We present our grid-enabled architecture for remote interactive visualization of large medical datasets (circa 5 million polygons) on clients with modest resources.

BER Performance of UWB Modulations through S-V Channel Model

A BER analysis of Impulse Radio Ultra Wideband (IR-UWB) pulse modulations over the S-V channel model is presented in this paper. The UWB pulse is a Gaussian monocycle modulated using Pulse Amplitude Modulation (PAM) and Pulse Position Modulation (PPM). The channel is generated from a modified S-V model. The bit-error rate (BER) is measured over several bit rates. The results show that both modulations are appropriate for LOS and NLOS channels, but PAM gives better performance in terms of bit rate and SNR. Moreover, given the data-rate requirements specified for UWB, high-bit-rate communication is feasible in the LOS channel.
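The Gaussian monocycle and the two modulations can be sketched as follows (pulse width, frame length and PPM shift are assumed values; the S-V channel and the BER measurement are not reproduced):

```python
import math

def gaussian_monocycle(t, tau=0.5e-9):
    """First derivative of a Gaussian: the classic IR-UWB monocycle pulse."""
    x = t / tau
    return -2.0 * x * math.exp(-x * x)

def modulate(bits, scheme, frame=4e-9, ppm_shift=0.5e-9, n=64):
    """Sample a pulse train: PAM flips pulse polarity, PPM shifts pulse position."""
    samples = []
    for b in bits:
        for i in range(n):
            t = i * frame / n - frame / 2
            if scheme == "PAM":
                samples.append((1 if b else -1) * gaussian_monocycle(t))
            else:  # PPM
                samples.append(gaussian_monocycle(t - (ppm_shift if b else 0.0)))
    return samples
```

Antipodal 2-PAM yields waveforms of opposite sign for the two bit values, which is why it enjoys the 3 dB distance advantage over orthogonal 2-PPM that the BER comparison reflects.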

Fatigue Failure of Structural Steel – Analysis Using Fracture Mechanics

Fatigue is the major threat to steel structures in service under fluctuating loads. With the additional effects of corrosion and the presence of weld joints, fatigue failure may become even more critical in structural steel. An apt example of such a structure is a sailing ship, which experiences a constant stress due to flotation and a pulsating bending load due to the waves. This paper describes an attempt to verify the theory of fatigue in a fracture mechanics framework through experimentation, determining the constants of the crack growth curve. For this purpose, a specimen with a known defect was prepared from shipbuilding steel and subjected to a pulsating bending load. The fatigue crack and its nature were observed in this experiment. The fracture mechanics approach to fatigue is thus applied in a simple practical experiment, and the constants of the crack growth equation are determined.
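The crack growth curve referred to above is commonly modelled by Paris' law, da/dN = C(ΔK)^m; integrating it numerically gives the number of load cycles for a crack to grow between two lengths. The constants below are illustrative placeholders, not the experimentally determined values:

```python
import math

# Paris-law constants for an illustrative structural steel (assumed values)
C, M = 1.0e-12, 3.0   # da/dN = C * (dK)^M, with dK in MPa*sqrt(m), a in m
Y = 1.12              # geometry factor for an edge crack (assumed)

def cycles_to_grow(a0, af, d_sigma, steps=10000):
    """Numerically integrate Paris' law from initial to final crack length.

    a0, af  : initial / final crack length [m]
    d_sigma : applied stress range [MPa]
    """
    n_cycles, a = 0.0, a0
    da = (af - a0) / steps
    for _ in range(steps):
        dk = Y * d_sigma * math.sqrt(math.pi * a)   # stress-intensity range
        n_cycles += da / (C * dk ** M)              # dN = da / (C * dK^M)
        a += da
    return n_cycles
```

Fitting C and m to measured crack-length-versus-cycles data is precisely the experimental task the paper describes; a larger stress range shortens the predicted life, as the test below checks.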

Accelerating Sparse Matrix Vector Multiplication on Many-Core GPUs

Many-core GPUs provide high computing ability and substantial bandwidth; however, optimizing irregular applications like SpMV on GPUs is a difficult but worthwhile task. In this paper, we propose a novel method to improve the performance of SpMV on GPUs. A new storage format called HYB-R is proposed to exploit the GPU architecture more efficiently. In the process of creating the HYB-R format, the COO portion of the matrix is recursively partitioned into an ELL portion and a COO portion, ensuring that as many non-zeros as possible reside in ELL format. How to partition the matrix is an important question for the HYB-R kernel, so we also tune the partitioning parameters for higher performance. Experimental results show that our method achieves better performance than the fastest kernel (HYB) in NVIDIA's SpMV library, with speedups of up to 17%.
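The ELL/COO split at the heart of the HYB family of formats can be sketched in a few lines: a serial Python model of the partitioning and the resulting SpMV, with the per-row ELL width `k` as the tuning parameter. The recursive re-partitioning of the COO remainder that distinguishes HYB-R is left out:

```python
def to_hyb(rows, cols, vals, n_rows, k):
    """Split a COO matrix: the first k non-zeros of each row go to ELL
    (regular, GPU-friendly layout); the overflow stays in COO."""
    ell_idx = [[] for _ in range(n_rows)]
    ell_val = [[] for _ in range(n_rows)]
    coo = []
    for r, c, v in zip(rows, cols, vals):
        if len(ell_val[r]) < k:
            ell_idx[r].append(c)
            ell_val[r].append(v)
        else:
            coo.append((r, c, v))
    return ell_idx, ell_val, coo

def hyb_spmv(ell_idx, ell_val, coo, x):
    """y = A @ x: a regular ELL pass plus a scatter pass over the COO tail."""
    y = [sum(v * x[c] for c, v in zip(ci, vi))
         for ci, vi in zip(ell_idx, ell_val)]
    for r, c, v in coo:
        y[r] += v * x[c]
    return y
```

On a GPU the ELL pass maps one thread per row with coalesced accesses, while the COO tail needs atomic or segmented reductions, which is why pushing as many non-zeros as possible into ELL (the goal of the recursive partitioning) pays off.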

Real-time Performance Study of EPA Periodic Data Transmission

EPA (Ethernet for Plant Automation) resolves the non-determinism of standard Ethernet and achieves real-time communication by means of a micro-segment topology and a deterministic scheduling mechanism. This paper studies the real-time performance of EPA periodic data transmission from theoretical and experimental perspectives. By analyzing the information transmission characteristics and the EPA deterministic scheduling mechanism, five indicators that specify the real-time performance of EPA periodic data transmission are presented and investigated: delivery time, time synchronization accuracy, data-sending time offset accuracy, utilization percentage of the configured timeslice, and non-RTE bandwidth. On this basis, the test principles and test methods for these indicators are studied, and formulas for the real-time performance of an EPA system are derived. Furthermore, an experimental platform is developed to test the indicators of EPA periodic data transmission in a micro-segment. Based on the analysis and the experiments, methods to improve the real-time performance of EPA periodic data transmission are proposed, including optimizing the network structure, studying a self-adaptive timeslice adjustment method, and providing the data-sending time offset accuracy for configuration.

Autistic Children and Different Tense Forms

Autism spectrum disorder is characterized by abnormalities in social communication, language abilities and repetitive behaviors. The present study focused on grammatical deficits in autistic children. We evaluated the impairment of the correct use of different Persian verb tenses in autistic children's speech. Two standardized language tests were administered, and the gathered data were analyzed. The main result of this study was a significant difference between the mean scores of correct responses to the present tense and to the past tense in Persian. The study demonstrated that tense is severely impaired in autistic children's speech. Our findings indicated that autistic children's production of the simple present/past tense opposition was better than their production of the future and past periphrastic forms (past perfect, present perfect, past progressive).

EEG Spikes Detection, Sorting, and Localization

This study introduces a new method for detecting, sorting, and localizing spikes from multi-unit EEG recordings. The method combines the wavelet transform, which localizes distinctive spike features, with the Super-Paramagnetic Clustering (SPC) algorithm, which allows automatic classification of the data without assumptions such as low variance or Gaussian distributions. Moreover, the method is capable of setting amplitude thresholds for spike detection. The method was evaluated on several real EEG data sets, in which the spikes were detected, clustered, and their occurrence times identified.
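The amplitude-thresholding step can be sketched with a robust, median-based noise estimate of the kind commonly used in spike detection (the wavelet feature extraction and SPC clustering stages are not reproduced; the threshold factor and refractory period are assumed):

```python
import statistics

def detect_spikes(signal, k=4.0, refractory=16):
    """Amplitude thresholding with a robust noise estimate.

    The noise standard deviation is estimated as median(|x|)/0.6745, which,
    unlike the plain standard deviation, is barely inflated by the spikes
    themselves. A refractory gap suppresses double detections of one spike.
    """
    sigma = statistics.median(abs(s) for s in signal) / 0.6745
    thr = k * sigma
    spikes, last = [], -refractory
    for i, s in enumerate(signal):
        if abs(s) > thr and i - last >= refractory:
            spikes.append(i)
            last = i
    return spikes
```

In the full pipeline, a window around each detected index would be transformed with wavelets and the resulting coefficients fed to SPC for sorting into units.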

Performance Appraisal System using Multifactorial Evaluation Model

Performance appraisal of employees is important in managing the human resources of an organization. With the change towards knowledge-based capitalism, retaining talented knowledge workers is critical. However, management classification into "outstanding", "average" and "poor" performance may not be an easy decision. Besides that, supervisors may also tend to judge the work performance of their subordinates informally and arbitrarily, especially in the absence of an appraisal system. In this paper, we propose a performance appraisal system using a multifactorial evaluation model to deal with appraisal grades, which are often expressed vaguely in linguistic terms. The proposed model evaluates staff performance based on specific performance appraisal criteria. The project was a collaboration with an Information and Communication Technology company in Malaysia, with reference to its performance appraisal process.
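A multifactorial evaluation in the fuzzy sense combines a weight vector over the appraisal criteria with a membership matrix over the linguistic grades via max-min composition, R = W ∘ M. A minimal sketch (the weights, grades and memberships below are invented for illustration):

```python
def multifactorial_eval(weights, membership):
    """Max-min fuzzy composition R = W o M over appraisal grades.

    weights    : importance of each criterion, one value per criterion
    membership : per criterion, degrees of membership in each grade
    returns    : degree of membership of the employee in each grade
    """
    n_grades = len(membership[0])
    return [max(min(w, row[j]) for w, row in zip(weights, membership))
            for j in range(n_grades)]
```

For example, with criteria weighted [0.5, 0.3, 0.2] and grades (outstanding, average, poor), a membership matrix of [[0.6, 0.3, 0.1], [0.2, 0.7, 0.1], [0.1, 0.5, 0.4]] composes to [0.5, 0.3, 0.2], so the employee is graded "outstanding" by the maximum-membership rule.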

Investigation of Interference Conditions in BFWA System Applying Adaptive TDD

In a BFWA (Broadband Fixed Wireless Access) network, the resulting SINR (Signal to Interference plus Noise Ratio) is significantly influenced by the applied duplex method. TDD (Time Division Duplex), especially adaptive TDD, has some advantages over FDD (Frequency Division Duplex), for example in spectrum efficiency and flexibility. However, these methods suffer from several new interference situations that cannot occur in an FDD system. This leads to reduced SINR in the covered area, which can cause connection outages. Therefore, countermeasure techniques against interference are necessary in TDD systems. Synchronization is one way of handling the interference. In this paper, TDD systems applying different degrees of system synchronization are compared in terms of the resulting SINR at different locations of the BFWA service area and the percentage of the area covered by the system.

Software Technology Behind Computer Accounting

The main problems of data-centric and open-source projects are the large number of developers and changes to the core framework. The Model-View-Controller (MVC) design pattern has significantly improved the development and adaptation of complex projects. Entity Framework, as the Model layer in an MVC architecture, has simplified communication with the database. How often are these new technologies used, and do they have the potential for designing a more efficient Enterprise Resource Planning (ERP) system that is better suited to accountants?

Application and Limitation of Parallel Modeling in Multidimensional Sequential Pattern

The goal of data mining algorithms is to discover useful information embedded in large databases. One of the most important data mining problems is the discovery of frequently occurring patterns in sequential data. In a multidimensional sequence, each event depends on more than one dimension. The search space is quite large, and serial algorithms do not scale to very large datasets. To address this, it is necessary to study scalable parallel implementations of sequence mining algorithms. In this paper, we present a model for multidimensional sequences and describe a parallel algorithm based on data parallelism. Simulation experiments show good load balancing and acceptable, scalable speedup across different processor counts and problem sizes, and demonstrate that our approach works efficiently in a real parallel computing environment.
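The data-parallel scheme, partitioning the sequence database, counting candidate supports locally, and merging the counts, can be sketched serially (each loop iteration stands in for one worker; events here are single items, whereas the paper's events carry multiple dimensions):

```python
def contains(seq, pattern):
    """Is `pattern` a (non-contiguous) subsequence of `seq`?"""
    it = iter(seq)
    return all(any(p == e for e in it) for p in pattern)

def count_partition(sequences, candidates):
    """Map step: local support counts for one data partition."""
    return {c: sum(contains(s, c) for s in sequences) for c in candidates}

def parallel_support(sequences, candidates, n_workers=4):
    """Data parallelism: partition the database, count locally, merge counts.

    Each partition's counting job is independent, so the loop body could run
    on separate processors; global supports are the sum of local supports.
    """
    chunk = (len(sequences) + n_workers - 1) // n_workers
    parts = [sequences[i:i + chunk] for i in range(0, len(sequences), chunk)]
    totals = {c: 0 for c in candidates}
    for part in parts:                  # each iteration = one worker's job
        for c, n in count_partition(part, candidates).items():
            totals[c] += n
    return totals
```

Because the merge is a plain sum, the scheme load-balances by giving workers near-equal partitions, which is the property the simulation experiments measure.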

Biological Effects of a Carbohydrate-Binding Protein from an Annelid, Perinereis nuntia Against Human and Phytopathogenic Microorganisms

Lectins have good scope in current clinical microbiology research. In the present study, we evaluated the antimicrobial activities of a D-galactose-binding lectin (PnL) purified from the annelid Perinereis nuntia (Polychaeta) by affinity chromatography. The molecular mass of the lectin was determined by SDS-PAGE to be 32 kDa as a single polypeptide under both reducing and non-reducing conditions. The hemagglutinating activity of PnL against trypsinized and glutaraldehyde-fixed human erythrocytes was specifically inhibited by D-Gal, GalNAc, Galβ1-4Glc and Galα1-6Glc. PnL was evaluated in in vitro antibacterial screening studies against 11 gram-positive and gram-negative microorganisms. The screening revealed that PnL exhibited significant antibacterial activity against gram-positive bacteria; Bacillus megaterium showed the highest growth inhibition by the lectin (250 μg/disc). However, PnL did not inhibit the growth of gram-negative bacteria such as Vibrio cholerae and Pseudomonas sp. PnL was also examined for in vitro antifungal activity against six fungal phytopathogens; at 100 μg/mL it inhibited the mycelial growth of Alternaria alternata by 24.4%. These results indicate that future findings on lectins obtained from annelids may be of importance to the life sciences.

Transmission Pricing based on Voltage Angle Decomposition

In this paper a new approach to transmission pricing is presented. The main idea is voltage angle allocation, i.e. determining the contribution of each contract to the voltage angle of each bus. DC power flow is used to compute a primary solution for the angle decomposition. To account for the impact of system non-linearity on the angle decomposition, the primary solution is corrected in successive iterations of a decoupled Newton-Raphson power flow. Then, the contribution of each contract to the power flow of each transmission line is computed based on the angle decomposition. The contract-related flows are used as a measure of the "extent of use" of the transmission network capacity and consequently for transmission pricing. The presented approach is applied to a 4-bus test system and the IEEE 30-bus test system.
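Because DC power flow is linear, bus angles decompose by superposition over contracts, which is the primary solution referred to above. A sketch on an invented 4-bus network (line susceptances and contracts are assumed; the Newton-Raphson correction for non-linearity is omitted):

```python
import numpy as np

# 4-bus example: (from, to, susceptance) with assumed values; bus 0 is slack
LINES = [(0, 1, 10.0), (0, 2, 8.0), (1, 2, 5.0), (2, 3, 12.0)]
N = 4

def b_matrix():
    """Reduced nodal susceptance matrix (slack row/column removed)."""
    B = np.zeros((N, N))
    for i, j, b in LINES:
        B[i, i] += b; B[j, j] += b
        B[i, j] -= b; B[j, i] -= b
    return B[1:, 1:]

def angles(injections):
    """DC power flow: solve B' * theta = P with the slack angle fixed to zero."""
    th = np.linalg.solve(b_matrix(), np.asarray(injections[1:], dtype=float))
    return np.concatenate(([0.0], th))

def line_flows(theta):
    """Per-line flows b_ij * (theta_i - theta_j) from the angle solution."""
    return [(i, j, b * (theta[i] - theta[j])) for i, j, b in LINES]

# Two bilateral contracts as separate injection vectors (generation positive)
contract_a = [1.0, 0.0, 0.0, -1.0]   # 1 p.u. delivered from bus 0 to bus 3
contract_b = [0.0, 0.5, -0.5, 0.0]   # 0.5 p.u. delivered from bus 1 to bus 2
```

Solving each contract separately and summing the angle vectors reproduces the system solution, so each contract's share of every line flow, the "extent of use" measure, follows directly from its own angle component.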

Cloud Computing Databases: Latest Trends and Architectural Concepts

Economic factors are leading to the rise of infrastructures that provide software and computing facilities as a service, known as cloud services or cloud computing. Cloud services can provide efficiencies for application providers, both by limiting up-front capital expenses and by reducing the cost of ownership over time. Such services are made available in a data center, using shared commodity hardware for computation and storage. There is a varied set of cloud services available today, including application services (salesforce.com), storage services (Amazon S3), compute services (Google App Engine, Amazon EC2) and data services (Amazon SimpleDB, Microsoft SQL Server Data Services, Google's Datastore). These services represent a variety of reformulations of data management architectures, and more are on the horizon.