Power Control in a Doubly Fed Induction Machine

This paper proposes a direct power control (DPC) scheme for a doubly fed induction machine used in variable-speed wind power generation. The scheme provides decoupled regulation of the primary-side active and reactive power and is suitable for both electric energy generation and drive applications. To control the power flowing between the stator of the DFIG and the network, a decoupled control of active and reactive power is synthesized using PI controllers. The simulation results show the feasibility and effectiveness of the suggested method.
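As a rough illustration of the loop structure described above, the following sketch runs two independent PI controllers, one per power component. The first-order plant stand-in, gains, and sampling time are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of decoupled stator active/reactive power control for a
# DFIG using two independent PI controllers. The plant model, gains, and
# sampling time below are illustrative assumptions, not paper values.

class PI:
    def __init__(self, kp, ki, dt):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def update(self, error):
        self.integral += error * self.dt
        return self.kp * error + self.ki * self.integral

dt = 1e-4                           # control period [s] (assumed)
pi_p = PI(kp=0.5, ki=50.0, dt=dt)   # active-power loop
pi_q = PI(kp=0.5, ki=50.0, dt=dt)   # reactive-power loop

p, q = 0.0, 0.0                 # measured stator powers [pu]
p_ref, q_ref = 1.0, 0.0         # references: full P, unity power factor

for _ in range(2000):
    # Each PI acts on its own power error and sets one rotor-voltage
    # component (v_qr -> P, v_dr -> Q under stator-flux orientation).
    v_qr = pi_p.update(p_ref - p)
    v_dr = pi_q.update(q_ref - q)
    # Crude decoupled first-order stand-in for the DFIG power dynamics.
    p += dt * (v_qr - p) / 0.01
    q += dt * (v_dr - q) / 0.01

print(f"P = {p:.3f} pu, Q = {q:.3f} pu")  # both approach their references
```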

Determination of Geometric Dimensions of a Double Sided Linear Switched Reluctance Motor

In this study, a double-sided linear switched reluctance motor (LSRM) drive was investigated as an alternative actuator for linear transportation applications such as elevator, hospital, and subway doors, which move linearly and require accurate position control and rapid response. A prototype sliding elevator door based on LSRMs, aimed at a home elevator, is designed. The motor has 6/4 poles, 3 phases, 8 A, 24 V, 250 W, and 250 N pull force. The air gap between the rotor and translator poles of the designed motor and the ideal inductance profile of the phase coils are obtained in compliance with the geometric dimensions. The operation and switching sections for motoring and generating are determined from the inductance profile.
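The sketch below illustrates the kind of idealized inductance profile that follows from the pole geometry: inductance rises while poles overlap (motoring section, dL/dx > 0) and falls as they separate (generating section, dL/dx < 0). Pole widths and the inductance extremes are illustrative assumptions.

```python
import numpy as np

# Idealized trapezoidal phase-inductance profile of an LSRM versus
# translator position, of the kind derived from the geometric dimensions.
# L_min/L_max and the pole dimensions are illustrative assumptions.

L_MIN, L_MAX = 2e-3, 12e-3   # unaligned / aligned inductance [H] (assumed)
W_POLE = 10e-3               # pole width [m] (assumed)
PITCH = 30e-3                # pole pitch [m] (assumed)

def inductance(x):
    """Ideal phase inductance at position x within one pole pitch."""
    x = x % PITCH
    rise = L_MAX - L_MIN
    if x < W_POLE:                 # poles overlapping: L rises
        return L_MIN + rise * x / W_POLE            # motoring (dL/dx > 0)
    elif x < 2 * W_POLE:           # poles separating: L falls
        return L_MAX - rise * (x - W_POLE) / W_POLE  # generating (dL/dx < 0)
    return L_MIN                   # fully unaligned: L flat

for x in np.linspace(0.0, PITCH, 7):
    print(f"x = {1e3*x:5.1f} mm  L = {1e3*inductance(x):6.2f} mH")
```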

Evaluation of Fuzzy ARTMAP with DBSCAN in VLSI Application

The applications of VLSI circuits in high-performance computing, telecommunications, and consumer electronics have been expanding progressively, and at a very rapid pace. This paper describes a new model for partitioning a circuit using DBSCAN and a fuzzy ARTMAP neural network. The first step is feature extraction, for which we make use of the DBSCAN algorithm. The second step is classification, performed by a fuzzy ARTMAP neural network. The performance of both approaches is compared using the MCNC standard-cell placement benchmark netlists. Analysis of the experimental results shows that the fuzzy ARTMAP with DBSCAN model achieves better performance than fuzzy ARTMAP alone in recognizing sub-circuits with the fewest interconnections between them. The recognition rate using fuzzy ARTMAP with DBSCAN is 97.7%, compared with fuzzy ARTMAP alone.
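A minimal sketch of this two-step flow follows. The per-cell feature vectors are synthetic stand-ins, and since fuzzy ARTMAP has no standard library implementation, a k-NN classifier is used here purely as a placeholder for the classification stage.

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.neighbors import KNeighborsClassifier  # stand-in for fuzzy ARTMAP

# Sketch of the two-step model: DBSCAN groups cells of a netlist using
# connectivity-derived features, then a classifier assigns new cells to
# the recognized sub-circuits. Features here are hypothetical stand-ins.

rng = np.random.default_rng(0)
# Hypothetical per-cell features, e.g. (x, y) placement + fan-out degree.
cells = np.vstack([rng.normal(c, 0.3, size=(40, 3)) for c in (0.0, 3.0, 6.0)])

# Step 1: density-based clustering extracts candidate sub-circuits.
labels = DBSCAN(eps=0.8, min_samples=5).fit_predict(cells)
print("sub-circuits:", sorted(set(labels) - {-1}),
      "| noise cells:", int((labels == -1).sum()))

# Step 2: train the classifier on the clustered cells, then use it to
# assign previously unseen cells to the recognized sub-circuits.
core = labels != -1
clf = KNeighborsClassifier(n_neighbors=3).fit(cells[core], labels[core])
new_cells = rng.normal(3.0, 0.3, size=(5, 3))
print("assigned partitions:", clf.predict(new_cells))
```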

GSM-Based Approach for Indoor Localization

The ability to estimate location accurately and reliably in indoor environments is a key issue in developing a great number of context-aware applications and Location Based Services (LBS). Today, the most viable solution for localization is the Received Signal Strength (RSS) fingerprinting approach using a wireless local area network (WLAN). This paper presents two RSS fingerprinting approaches: first we employ the widely used WLAN-based positioning as a reference system, and then we investigate the possibility of using GSM signals for positioning. To compare them, we developed a positioning system in a real-world environment, where realistic RSS measurements were collected. A Multi-Layer Perceptron (MLP) neural network was used as the approximation function that maps RSS fingerprints to locations. Experimental results indicate the advantage of the WLAN-based approach in the sense of lower localization error compared with the GSM-based approach, but GSM signal coverage by far outreaches WLAN coverage, and for some LBS requiring less precise positioning our results indicate that GSM positioning can also be a viable solution.
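The following sketch shows the core of the fingerprinting approach: an MLP learns the mapping from an RSS vector (one value per visible transmitter) to a 2-D position. The log-distance radio model below is a synthetic stand-in for the survey measurements collected in the paper.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# RSS-fingerprinting localization sketch: an MLP approximates the map
# from RSS fingerprints to positions. AP layout, path-loss model, and
# noise level are illustrative assumptions.

rng = np.random.default_rng(1)
aps = np.array([[0, 0], [10, 0], [0, 10], [10, 10]], float)  # transmitters

def fingerprint(pos):
    d = np.linalg.norm(aps - pos, axis=1) + 0.1
    return -40 - 30 * np.log10(d) + rng.normal(0, 2, len(aps))  # RSS [dBm]

train_pos = rng.uniform(0, 10, size=(500, 2))
X = np.array([fingerprint(p) for p in train_pos])

mlp = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000,
                   random_state=0).fit(X, train_pos)

test_pos = rng.uniform(0, 10, size=(100, 2))
pred = mlp.predict(np.array([fingerprint(p) for p in test_pos]))
err = np.linalg.norm(pred - test_pos, axis=1)
print(f"mean localization error: {err.mean():.2f} m")
```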

Modeling of Bio Scaffolds: Structural and Fluid Transport Characterization

Scaffolds play a key role in tissue engineering and can be produced in many different ways depending on the application and the materials used. Most researchers have used an experimental trial-and-error approach to new biomaterials, but computer simulation applied to tissue engineering can offer a more exhaustive way to test and screen biomaterials. This paper develops models of scaffolds and applies Computational Fluid Dynamics to show the value of computer simulations in determining the influence of the geometrical scaffold parameters (porosity, pore size, and shape) on scaffold permeability, velocity magnitude, pressure drop, and shear-stress distribution and level, and hence on the proper design of the scaffold geometry. This motivates more advanced studies in which the dynamic conditions of a microfluid passing through the scaffold are characterized for tissue-engineering applications and for the differentiation of tissues within scaffolds.
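As a simple worked example of the permeability characterization mentioned above, a CFD run yields a flow rate and pressure drop from which Darcy's law gives the scaffold permeability. All numbers below are illustrative assumptions, not values from the paper.

```python
# Darcy's law permeability estimate: k = Q * mu * L / (A * dP).
# All quantities below are illustrative assumptions.

Q = 1.0e-9    # volumetric flow rate through the scaffold [m^3/s]
mu = 1.0e-3   # dynamic viscosity of the culture medium [Pa*s]
L = 3.0e-3    # scaffold thickness along the flow direction [m]
A = 25.0e-6   # cross-sectional area [m^2]
dP = 12.0     # pressure drop across the scaffold [Pa]

k = Q * mu * L / (A * dP)      # Darcy permeability [m^2]
print(f"permeability k = {k:.3e} m^2")   # ~1e-11 m^2 for these inputs
```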

Thermal Treatments and Characteristics Study On Unalloyed Structural (AISI 1140) Steel

The main emphasis of metallurgists has been to process materials so as to obtain balanced mechanical properties for a given application. One of the processing routes to alter these properties is heat treatment. Nearly 90% of structural applications are related to medium-carbon unalloyed steels, which are hence regarded as structural steels. The major requirement for conventional steel is to improve workability, toughness, hardness, and grain refinement. In this view, it is proposed to study the mechanical and tribological properties of unalloyed structural (AISI 1140) steel under different thermal (heat) treatments, namely annealing, normalizing, tempering, and hardening, compared with the as-received (cold-worked) specimen. All heat treatments are carried out in atmospheric conditions. The hardening treatment improves the hardness of the material; a marginal decrease in hardness with improved ductility is observed after tempering. Annealing and normalizing improve the ductility of the specimen, with the normalized specimen showing the highest ductility. The hardened specimen shows the highest wear resistance in the initial period of sliding wear, whereas beyond 25 km of sliding distance the as-received steel outperforms the hardened specimen. Both mild and severe wear regions are observed. Microstructural analysis shows a pearlitic structure in the normalized specimen, a lath martensitic structure in the hardened specimen, and a pearlitic-ferritic structure in the annealed specimen.

Assessment of the Accuracy of Spalart-Allmaras Turbulence Model for Application in Turbulent Wall Jets

The Spalart-Allmaras turbulence model has been implemented in a numerical code to study compressible turbulent flows, in which the system of governing equations is solved with a finite volume approach on a structured grid. The AUSM+ scheme is used to calculate the inviscid fluxes. Different benchmark problems have been computed to validate the implementation, and numerical results are shown. Special attention is paid to wall-jet applications. In this study, the jet is subjected to various wall boundary conditions (adiabatic or uniform heat flux) in the forced convection regime, and both two-dimensional and axisymmetric wall jets are considered. The comparison between the numerical results and experimental data confirms the validity of this turbulence model for studying turbulent wall jets, especially in engineering applications.
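For reference, the transport equation for the working variable in the standard one-equation Spalart-Allmaras model (trip terms omitted) reads:

```latex
\frac{\partial \tilde{\nu}}{\partial t}
+ u_j \frac{\partial \tilde{\nu}}{\partial x_j}
= c_{b1}\,\tilde{S}\,\tilde{\nu}
- c_{w1} f_w \left(\frac{\tilde{\nu}}{d}\right)^{2}
+ \frac{1}{\sigma}\left[
    \frac{\partial}{\partial x_j}\!\left((\nu + \tilde{\nu})
    \frac{\partial \tilde{\nu}}{\partial x_j}\right)
  + c_{b2}\,\frac{\partial \tilde{\nu}}{\partial x_j}
           \frac{\partial \tilde{\nu}}{\partial x_j}
\right],
\qquad \nu_t = \tilde{\nu}\, f_{v1},
```

where d is the wall distance and, in the standard calibration, c_b1 = 0.1355, c_b2 = 0.622, sigma = 2/3, and c_w1 = c_b1/kappa^2 + (1 + c_b2)/sigma.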

Knowledge Discovery Techniques for Talent Forecasting in Human Resource Application

Human Resource (HR) applications can be used to provide fair and consistent decisions and to improve the effectiveness of decision-making processes. Moreover, among the challenges for HR professionals is managing an organization's talent, especially ensuring the right person is in the right job at the right time. For that reason, this article describes the potential to implement one of the talent management tasks, namely identifying existing talent by predicting their performance, as an HR application for talent management. The study suggests a potential HR system architecture for talent forecasting that exploits knowledge of past experience, an approach known as Knowledge Discovery in Databases (KDD) or data mining. The article consists of three main parts: the first gives an overview of HR applications, prediction techniques and their applications, data mining in general, and the basic concepts of talent management in HRM; the second examines the use of data mining techniques to solve one of the talent management tasks; and the third proposes the potential HR system architecture for talent forecasting.
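A toy sketch of the forecasting idea follows: a classifier is trained on past employee records and then used to predict the performance class of current talent. The features, labels, and model choice are hypothetical stand-ins, not the paper's architecture.

```python
from sklearn.tree import DecisionTreeClassifier

# Toy KDD sketch: learn from past employee records, then forecast the
# performance of current candidates. All data below is hypothetical.

# (years_of_service, training_hours, past_appraisal_score)
X_past = [
    [2, 10, 3.1], [7, 40, 4.5], [4, 25, 3.8],
    [9, 60, 4.8], [1,  5, 2.7], [5, 30, 4.1],
]
y_past = ["average", "high", "average", "high", "low", "high"]

model = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_past, y_past)

candidates = [[3, 20, 3.5], [8, 50, 4.6]]
print(model.predict(candidates))   # forecast performance class per candidate
```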

A New Block-based NLMS Algorithm and Its Realization in Block Floating Point Format

We propose a new normalized LMS (NLMS) algorithm, which gives satisfactory performance in certain applications in comparison with the conventional NLMS recursion. The new algorithm can be treated as a block-based simplification of the NLMS algorithm with a significantly reduced number of multiply-and-accumulate as well as division operations. It is also shown that such a recursion can be easily implemented in block floating point (BFP) arithmetic, handling the implementation issues efficiently. In particular, the core challenges of a BFP realization of such adaptive filters are considered in this regard. A global upper bound on the step-size control parameter of the new algorithm under BFP implementation is also proposed to jointly prevent overflow in the filtering and weight-updating operations.
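The following numpy sketch shows a generic block-based NLMS update in which a single normalization (one division) serves the whole block, cutting per-sample divisions; the paper's exact simplification and its BFP realization may differ.

```python
import numpy as np

# Block-based NLMS sketch: weights are updated once per block, and one
# shared normalization (a single division) covers the whole block. This
# illustrates the general idea only; the paper's recursion may differ.

rng = np.random.default_rng(0)
N, L, mu = 4000, 8, 0.5           # samples, filter taps, step size
w_true = rng.normal(size=L)       # unknown system to identify
x = rng.normal(size=N)
d = np.convolve(x, w_true)[:N] + 0.01 * rng.normal(size=N)

w = np.zeros(L)
B = L                             # block length (assumed equal to taps)
for start in range(L, N - B, B):
    grad = np.zeros(L)
    energy = 0.0
    for n in range(start, start + B):
        u = x[n:n - L:-1]                 # regressor [x(n) ... x(n-L+1)]
        e = d[n] - w @ u                  # a priori error
        grad += e * u
        energy += u @ u
    w += mu * grad / (energy + 1e-12)     # one division per block

print("max weight error:", np.abs(w - w_true).max())
```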

Energy Map Construction using Adaptive Alpha Grey Prediction Model in WSNs

Wireless sensor networks (WSNs) can be used to monitor physical phenomena in areas where human access is nearly impossible. The limited power supply is the major constraint of WSNs, owing to the use of non-rechargeable batteries in sensor nodes, and much research is devoted to reducing the energy consumption of sensor nodes. An energy map can be combined with clustering, data dissemination, and routing techniques to reduce the power consumption of a WSN, and it can also indicate which part of the network is going to fail in the near future. In this paper, the energy map is constructed using a prediction-based approach, with an adaptive-alpha GM(1,1) model as the predictor. GM(1,1) is used worldwide in many applications for predicting future values of a time series from a few past values, owing to its high computational efficiency and accuracy.
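A compact sketch of GM(1,1) with a tunable background-value coefficient alpha follows (alpha = 0.5 gives the classical model; the paper adapts it). The residual-energy series used to exercise it is an assumed example.

```python
import numpy as np

# GM(1,1) grey prediction with a tunable background coefficient alpha.
# Used here to forecast a node's residual-energy series (assumed data).

def gm11_predict(x0, alpha=0.5, steps=1):
    x0 = np.asarray(x0, float)
    x1 = np.cumsum(x0)                                 # accumulated series (AGO)
    z = alpha * x1[1:] + (1 - alpha) * x1[:-1]         # background values
    B = np.column_stack([-z, np.ones(len(z))])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]   # grey parameters
    k = np.arange(len(x0), len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a        # whitened solution
    x1_prev = (x0[0] - b / a) * np.exp(-a * (k - 1)) + b / a
    return x1_hat - x1_prev                            # inverse AGO -> forecasts

energy = [100.0, 96.2, 92.8, 89.1, 85.9, 82.4]         # residual energy [J]
print(gm11_predict(energy, alpha=0.5, steps=3))        # next three readings
```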

Context for Simplicity: A Basis for Context-aware Systems Based on the 3GPP Generic User Profile

The paper focuses on context modeling with respect to the specification of context-aware systems supporting ubiquitous applications. The proposed approach, followed within the SIMPLICITY IST project, uses a high-level system ontology to derive context models for system components, which are subsequently mapped to the system's physical entities. For the definition of user- and device-related context models in particular, the paper suggests a standards-based process consisting of an analysis phase using the Common Information Model (CIM) methodology, followed by an implementation phase that defines 3GPP-based components. The benefits of this approach are further illustrated by preliminary examples of XML grammars defining profiles and components, component instances, and descriptions of the respective ubiquitous applications.

Knowledge Sharing: A Survey, Assessment and Directions for Future Research: Individual Behavior Perspective

One of the most important areas of knowledge management studies is knowledge sharing. Measured in terms of the number of scientific articles and organizational applications, knowledge sharing stands as an example of success in the field. This paper reviews the related literature in the context of the underlying individual behavioral variables in order to provide a framework of directions for future research and writing.

Solving Partially Monotone Problems with Neural Networks

In many applications, it is known a priori that the target function should satisfy certain constraints imposed by, for example, economic theory or a human decision maker. Here we consider partially monotone problems, where the target variable depends monotonically on some of the predictor variables but not all. We propose an approach to building partially monotone models based on the convolution of monotone neural networks and kernel functions. Results from simulations and a real case study on house pricing show that our approach performs significantly better than partially monotone linear models. Furthermore, the incorporation of partial monotonicity constraints not only leads to models that are in accordance with the decision maker's expertise, but also considerably reduces the model variance in comparison with standard neural networks with weight decay.
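The sketch below illustrates one common way to enforce partial monotonicity: the "monotone" inputs reach the output only through weights forced non-negative (softplus reparameterization), while the remaining inputs pass through an unconstrained subnetwork. The paper's construction, convolving monotone networks with kernel functions, is more elaborate; this only shows the constraint idea.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Partially monotone network sketch: the output is non-decreasing in the
# monotone input group (non-negative weights + monotone activations) and
# unconstrained in the remaining inputs. Architecture is illustrative.

class PartiallyMonotoneNet(nn.Module):
    def __init__(self, n_mono, n_free, hidden=16):
        super().__init__()
        self.w1 = nn.Parameter(torch.randn(n_mono, hidden) * 0.1)
        self.b1 = nn.Parameter(torch.zeros(hidden))
        self.w2 = nn.Parameter(torch.randn(hidden, 1) * 0.1)
        self.free = nn.Sequential(nn.Linear(n_free, hidden), nn.Tanh(),
                                  nn.Linear(hidden, 1))

    def forward(self, x_mono, x_free):
        # softplus keeps the effective weights >= 0, so with monotone
        # activations the output is non-decreasing in x_mono.
        h = torch.tanh(x_mono @ F.softplus(self.w1) + self.b1)
        mono_out = h @ F.softplus(self.w2)
        return mono_out + self.free(x_free)

# House-pricing flavour: price must rise monotonically with living area.
net = PartiallyMonotoneNet(n_mono=1, n_free=2)
area = torch.tensor([[80.0], [120.0], [200.0]])
other = torch.randn(3, 2)
print(net(area, other).squeeze().detach())  # monotone in area by design
```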

Software Effort Estimation Using Soft Computing Techniques

Various models have been derived by studying large numbers of completed software projects from various organizations and application domains to explore how project size maps into project effort; however, there is still a need to improve the prediction accuracy of these models. Since a neuro-fuzzy system is able to approximate non-linear functions with high precision, it is used here as a soft computing approach to generate a model by formulating the size-effort relationship through training. In this paper, the neuro-fuzzy technique is used for software effort estimation modeling on NASA software project data, and the performance of the developed models is compared with the Halstead, Walston-Felix, Bailey-Basili, and Doty models from the literature.
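For context, the comparison baselines are simple size-to-effort formulas; the sketch below uses their commonly cited forms (effort E in person-months, size in KLOC), with constants as usually quoted in the effort-estimation literature.

```python
# Commonly cited baseline effort models (E in person-months, size in KLOC).
# Constants are as usually quoted in the literature.

def halstead(kloc):      return 0.7 * kloc ** 1.50
def walston_felix(kloc): return 5.2 * kloc ** 0.91
def bailey_basili(kloc): return 5.5 + 0.73 * kloc ** 1.16
def doty(kloc):          return 5.288 * kloc ** 1.047  # valid for KLOC > 9

for kloc in (10, 50):
    print(f"{kloc} KLOC:",
          f"Halstead={halstead(kloc):.1f}",
          f"Walston-Felix={walston_felix(kloc):.1f}",
          f"Bailey-Basili={bailey_basili(kloc):.1f}",
          f"Doty={doty(kloc):.1f}")
```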

Modelling Indoor Air Carbon Dioxide (CO2) Concentration using Neural Network

The use of neural networks is popular in various building applications such as prediction of heating load, ventilation rate, and indoor temperature. Notably, only a few papers deal with the prediction of indoor carbon dioxide (CO2), which is a very good indicator of indoor air quality (IAQ). In this study, a data-driven modelling method based on a multilayer perceptron network for indoor air carbon dioxide in an apartment building is developed, with temperature and humidity measurements used as input variables. The motivation for this study is that measuring carbon dioxide is expensive and CO2 sensors have high power consumption, which leads to short operating times for battery-powered sensors. The results show that predicting CO2 concentration from relative humidity and temperature measurements is difficult, so additional information is needed.
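The sketch below mirrors the modelling setup and its main finding: the synthetic data deliberately contain a large occupancy-driven component that the (T, RH) inputs cannot explain, so the fitted MLP scores poorly, mimicking why the mapping is difficult.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# MLP mapping (temperature, relative humidity) -> CO2. The data model
# below is a synthetic assumption with a hidden occupancy driver.

rng = np.random.default_rng(0)
n = 2000
T = rng.normal(21.0, 1.5, n)        # indoor temperature [degC]
RH = rng.normal(35.0, 8.0, n)       # relative humidity [%]
occupancy = rng.integers(0, 5, n)   # hidden driver, NOT a model input
co2 = (420 + 15 * (T - 21) + 2 * (RH - 35)
       + 180 * occupancy + rng.normal(0, 20, n))   # [ppm]

X = np.column_stack([T, RH])
X_tr, X_te, y_tr, y_te = train_test_split(X, co2, random_state=0)
mlp = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000,
                   random_state=0).fit(X_tr, y_tr)
# Low R^2: T and RH alone are weak predictors of CO2.
print(f"test R^2 = {r2_score(y_te, mlp.predict(X_te)):.2f}")
```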

Security Enhanced RFID Middleware System

Recently, RFID (Radio Frequency Identification) technology has attracted the world market's attention as an essential technology for ubiquitous environments. The RFID market has focused on transponder and reader development, but that concern has shifted to RFID software such as high-value e-business applications, RFID middleware, and related development tools. However, due to the high sensitivity of data and service transactions within an RFID network, security must be addressed. To guarantee trusted e-business based on RFID technology, we propose a security-enhanced RFID middleware system. Our proposal is compliant with EPCglobal ALE (Application Level Events), the standard interface between middleware and its clients. We show how to provide strengthened security and trust by protecting data transported between the middleware and its clients as well as data stored in the middleware. Moreover, we provide identification and service access control against illegal service abuse. Our system enables secure RFID middleware service and trusted e-business service.
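As a minimal sketch of one of the protections described (confidentiality of data exchanged between the ALE middleware and its client), the snippet below uses symmetric encryption; key distribution and the access-control checks are out of scope here, and the ALE report payload is an illustrative example.

```python
from cryptography.fernet import Fernet

# Sketch: symmetric encryption of event data exchanged between the ALE
# middleware and its client. Key management is assumed out of band.

key = Fernet.generate_key()          # shared between middleware and client
channel = Fernet(key)

ale_report = b"<ECReport epc='urn:epc:id:sgtin:0614141.107346.2017'/>"
ciphertext = channel.encrypt(ale_report)          # protect data in transit
print(channel.decrypt(ciphertext) == ale_report)  # client recovers the report
```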

A Reliable FPGA-based Real-time Optical-flow Estimation

Optical flow has been a research topic of interest for many years. It has, until recently, been largely inapplicable to real-time applications due to its computationally expensive nature. This paper presents a new reliable flow technique which is combined with a motion detection algorithm, applied to stationary camera image streams, to allow flow-based analyses of moving entities, such as rigidity, in real time. Combining the optical flow analysis with a motion detection technique greatly reduces the expensive computation of flow vectors compared with standard approaches, rendering the method applicable to real-time implementation. The paper also describes the hardware implementation of a proposed pipelined system that estimates flow vectors from image sequences in real time. The design can process 768 x 576 images at frame rates of up to 156 fps on a single low-cost FPGA chip, which is adequate for most real-time vision applications.
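The software sketch below illustrates the motion-gating idea: a frame-difference mask from the stationary camera selects moving pixels, and a Lucas-Kanade-style flow solve is performed only there, skipping the static background. The thresholds, window size, and flow method are illustrative assumptions, not the paper's pipeline.

```python
import numpy as np
from scipy.ndimage import uniform_filter

# Motion-gated flow sketch: frame differencing gates which pixels get a
# Lucas-Kanade-style 2x2 flow solve. Parameters are illustrative.

def gated_flow(prev, curr, diff_thresh=2.0, win=5):
    Iy, Ix = np.gradient(prev)       # gradients along rows (y), columns (x)
    It = curr - prev
    mask = np.abs(It) > diff_thresh  # motion detection gate

    # Per-pixel 2x2 normal equations, averaged over a local window.
    Ixx = uniform_filter(Ix * Ix, win); Ixy = uniform_filter(Ix * Iy, win)
    Iyy = uniform_filter(Iy * Iy, win)
    Ixt = uniform_filter(Ix * It, win); Iyt = uniform_filter(Iy * It, win)

    det = Ixx * Iyy - Ixy ** 2
    ok = mask & (det > 1e-6)         # flow solved only at gated pixels
    u = np.zeros_like(prev); v = np.zeros_like(prev)
    u[ok] = (-Iyy[ok] * Ixt[ok] + Ixy[ok] * Iyt[ok]) / det[ok]
    v[ok] = ( Ixy[ok] * Ixt[ok] - Ixx[ok] * Iyt[ok]) / det[ok]
    return u, v, ok

rng = np.random.default_rng(0)
prev = uniform_filter(rng.normal(0.0, 40.0, (64, 64)), 5) + 128.0
curr = np.roll(prev, 1, axis=1)      # scene shifted 1 px to the right
u, v, ok = gated_flow(prev, curr)
print(f"flow solved at {ok.mean():.0%} of pixels")  # rest skipped as static
```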

Cooperative Data Caching in WSN

Wireless sensor networks (WSNs) have gained tremendous attention in recent years due to their numerous applications. Because of their limited energy resources, energy-efficient operation of sensor nodes is a key issue in wireless sensor networks. Cooperative caching, which ensures sharing of data among various nodes, reduces the number of communications over the wireless channels and thus enhances the overall lifetime of a wireless sensor network. In this paper, we propose a cooperative caching scheme called ZCS (Zone Cooperation at Sensors) for wireless sensor networks. In the ZCS scheme, the one-hop neighbors of a sensor node form a cooperative cache zone and share their cached data with each other. Simulation experiments show that the ZCS caching scheme achieves significant improvements in byte hit ratio and average query latency in comparison with other caching strategies.
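A minimal sketch of the zone lookup path follows: a node serves a query from its local cache, then from the caches of its one-hop zone neighbors, and only otherwise forwards it to the data source. The class layout and data items are illustrative, not the paper's protocol messages.

```python
# ZCS lookup sketch: local cache -> one-hop zone caches -> data source.
# Class layout and data items are illustrative assumptions.

class SensorNode:
    def __init__(self, node_id):
        self.node_id = node_id
        self.cache = {}      # locally cached data items
        self.zone = []       # one-hop neighbors (cooperative cache zone)

    def get(self, key, source):
        if key in self.cache:                 # 1. local hit
            return self.cache[key], "local"
        for peer in self.zone:                # 2. zone hit: no source trip
            if key in peer.cache:
                return peer.cache[key], f"zone:{peer.node_id}"
        value = source[key]                   # 3. miss: fetch from source
        self.cache[key] = value               # cache for future zone hits
        return value, "source"

source = {"temp/area7": 23.5}
a, b = SensorNode("A"), SensorNode("B")
a.zone, b.zone = [b], [a]
print(a.get("temp/area7", source))   # (23.5, 'source'): first fetch
print(b.get("temp/area7", source))   # (23.5, 'zone:A'): served by neighbor
```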

GeoSEMA: A Modelling Platform, Emerging "GeoSpatial-based Evolutionary and Mobile Agents"

Spatial and mobile computing continue to evolve. This paper describes a smart modeling platform called "GeoSEMA", an approach for modeling multidimensional geospatial evolutionary and mobile agents. Beyond 3D and location-based issues, there are other dimensions that may characterize spatial agents, e.g. discrete versus continuous time and agent behaviors. GeoSEMA can be seen as a dedicated design pattern motivating temporal geographic-based applications; it is a firm foundation for multipurpose and multidimensional spatial-based applications. It deals with multipurpose smart objects (buildings, shapes, missiles, etc.) by simulating geospatial agents. Formally, GeoSEMA refers to geospatial, spatio-evolutive, and mobile space constituents, and a conceptual geospatial space model is given in this paper. In addition to modeling and categorizing geospatial agents, the model incorporates the concept of inter-agent event-based protocols. Finally, a rapid software-architecture prototype of the GeoSEMA platform is given; it will be implemented and validated in the next phase of our work.

Performance Analysis of List Scheduling in Heterogeneous Computing Systems

Given a parallel program to be executed on a heterogeneous computing system, the overall execution time of the program is determined by a schedule. In this paper, we analyze the worst-case performance of the list scheduling algorithm for scheduling tasks of a parallel program in a mixed-machine heterogeneous computing system such that the total execution time of the program is minimized. We prove tight lower and upper bounds for the worst-case performance ratio of the list scheduling algorithm. We also examine the average-case performance of the list scheduling algorithm. Our experimental data reveal that the average-case performance of the list scheduling algorithm is much better than the worst-case performance and is very close to optimal, except for large systems with large heterogeneity. Thus, the list scheduling algorithm is very useful in real applications.
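The greedy rule at the heart of list scheduling is easy to state: take tasks in list (priority) order and place each on the machine that finishes it earliest under machine-dependent execution times. The sketch below implements that rule for independent tasks; the paper's setting, a parallel program, additionally involves precedence constraints, and all times shown are illustrative.

```python
# List scheduling sketch on a mixed-machine heterogeneous system: each
# task in list order goes to the machine that finishes it earliest.
# Independent tasks only; precedence constraints are omitted for brevity.

def list_schedule(exec_times, order):
    """exec_times[t][m] = time of task t on machine m; order = task list."""
    m = len(exec_times[0])
    ready = [0.0] * m                 # next free time per machine
    placement = {}
    for t in order:
        finish = [ready[j] + exec_times[t][j] for j in range(m)]
        best = min(range(m), key=finish.__getitem__)
        ready[best] = finish[best]
        placement[t] = best
    return max(ready), placement      # makespan, task -> machine map

# 4 independent tasks on 2 heterogeneous machines (illustrative times).
exec_times = [[3, 9], [2, 1], [8, 4], [5, 5]]
makespan, placement = list_schedule(exec_times, order=[0, 1, 2, 3])
print(f"makespan = {makespan}, placement = {placement}")
```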