Coordinated Design of TCSC Controller and PSS Employing Particle Swarm Optimization Technique

This paper investigates the application of the Particle Swarm Optimization (PSO) technique to the coordinated design of a Power System Stabilizer (PSS) and a Thyristor Controlled Series Compensator (TCSC)-based controller to enhance power system stability. The design problem of the PSS and TCSC-based controllers is formulated as a time-domain optimization problem, and the PSO algorithm is employed to search for the optimal controller parameters. Stability performance is improved by minimizing a time-domain objective function based on the deviation of the oscillatory rotor speed of the generator. To compare the capabilities of the PSS and the TCSC-based controller, both are first designed independently and then in a coordinated manner. The proposed controllers are tested on a weakly connected power system. Eigenvalue analysis and non-linear simulation results are presented to show the effectiveness of the coordinated design approach over the individual designs. The simulation results show that the proposed controllers are effective in damping low-frequency oscillations resulting from small disturbances such as changes in mechanical power input and reference voltage setting.
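
As an illustration of the optimization step, the following is a minimal sketch of a PSO loop minimizing a generic time-domain objective; the objective function speed_deviation_objective and the parameter bounds are placeholders, not the authors' actual system model.

```python
import numpy as np

def speed_deviation_objective(params):
    """Placeholder objective: in the paper this would be the integral of the
    generator rotor-speed deviation obtained from a time-domain simulation
    of the system with the PSS/TCSC controller parameters 'params'."""
    return np.sum((params - 0.5) ** 2)  # dummy quadratic stand-in

def pso(objective, bounds, n_particles=30, n_iter=100, w=0.7, c1=1.5, c2=1.5):
    dim = len(bounds)
    lo, hi = np.array(bounds).T
    x = np.random.uniform(lo, hi, (n_particles, dim))      # particle positions
    v = np.zeros_like(x)                                    # particle velocities
    pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(n_iter):
        r1 = np.random.rand(n_particles, dim)
        r2 = np.random.rand(n_particles, dim)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# e.g. search 5 controller gains/time constants, each bounded in [0, 1]
best_params, best_cost = pso(speed_deviation_objective, bounds=[(0.0, 1.0)] * 5)
```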

Adaptive Algorithm to Predict the QoS of Web Processes and Workflows

Workflow Management Systems (WfMS) allow organizations to streamline and automate business processes and reengineer their structure. One important requirement for this type of system is the management and computation of the Quality of Service (QoS) of processes and workflows. Currently, a range of Web process and workflow languages exist, and each language can be characterized by the set of patterns it supports. Developing and implementing a suitable, generic algorithm to compute the QoS of processes that have been designed using different languages is a difficult task, because some patterns are specific to particular process languages and new patterns may be introduced in future versions of a language. In this paper, we describe an adaptive algorithm implemented to cope with these two problems. The algorithm is called adaptive since it can be dynamically changed as the patterns of a process language change.
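
A minimal sketch of the adaptive idea (not the authors' implementation): keep a registry of QoS aggregation rules keyed by workflow pattern, so support for a new pattern can be registered at run time without changing the core algorithm. The pattern names and the time/cost aggregation rules below are illustrative assumptions.

```python
# Registry mapping workflow-pattern names to QoS aggregation rules.
# Each rule reduces the QoS of a pattern's branches to a single (time, cost) pair.
qos_rules = {}

def register_pattern(name, rule):
    """Add or replace the aggregation rule for a pattern at run time."""
    qos_rules[name] = rule

# Illustrative rules for common patterns (assumed, not the paper's exact formulas).
register_pattern("sequence",  lambda branches: (sum(t for t, c in branches),
                                                sum(c for t, c in branches)))
register_pattern("and_split", lambda branches: (max(t for t, c in branches),
                                                sum(c for t, c in branches)))
register_pattern("xor_split", lambda branches: (sum(t for t, c in branches) / len(branches),
                                                sum(c for t, c in branches) / len(branches)))

def compute_qos(node):
    """node is either a leaf task ('task', time, cost)
    or a composite ('pattern-name', [child nodes...])."""
    kind = node[0]
    if kind == "task":
        return node[1], node[2]
    children = [compute_qos(child) for child in node[1]]
    return qos_rules[kind](children)

workflow = ("sequence", [("task", 2.0, 5.0),
                         ("and_split", [("task", 4.0, 1.0), ("task", 3.0, 2.0)])])
print(compute_qos(workflow))   # (6.0, 8.0)
```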

Malaysia Folk Literature in Early Childhood Education

Malay Folk Literature in early childhood education serves as an important agent in child development, involving emotional, cognitive and language aspects. To date, little research has been carried out in Malaysia, particularly on the teaching and learning aspects, nor has there been an effort to publish "big books." Hence this article discusses the stance taken by university undergraduate students, teachers and parents in evaluating Malay Folk Literature in early childhood education for use as big books. The data collated and analyzed were taken from 646 respondents comprising 347 undergraduates and 299 teachers. Results of the study indicate that Malay Folk Literature can be absorbed into teaching and learning for early childhood with a mean of 4.25, while it can be used in big books with a mean of 4.14. The highest mean value for placing a Malay Folk Literature genre in big books for early childhood education rests on exemplary stories for undergraduates, with a mean of 4.47, and on animal fables for teachers, with a mean of 4.38. The lowest mean value of 3.57 is given to lipurlara stories. The most popular Malay Folk Literature found suitable for young children is Sang Kancil and the Crocodile, followed by Bawang Putih Bawang Merah. Pak Padir, Legends of Mahsuri, Origin of Malacca, and Origin of Rainbow are among the popular stories as well. Overall, the undergraduates show a more positive attitude toward all the items compared to the teachers. The t-test analysis revealed no significant difference between the undergraduate students and teachers on any of the items for the teaching and learning of Malay Folk Literature.

Transmitter Macrodiversity in Multihopping: SFN-Based Algorithm for Improved Node Reachability and Robust Routing

A novel idea presented in this paper is to combine multihop routing with single-frequency networks (SFNs) for a broadcasting scenario. An SFN is a set of multiple nodes that transmit the same data simultaneously, resulting in transmitter macrodiversity. Two of the most important performance factors of multihop networks, node reachability and routing robustness, are analyzed. Simulation results show that our proposed SFN-D routing algorithm improves node reachability by 37 percentage points compared to non-SFN multihop routing. It shows a diversity gain of 3.7 dB, meaning that a 3.7 dB lower transmission power is required for the same reachability. Even better results are possible for larger networks. If an important node becomes inactive, the algorithm can find new routes that a non-SFN scheme would not be able to find. Thus, two of the major problems in multihopping are addressed: achieving robust routing and improving node reachability or, equivalently, reducing transmission power.
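
As a toy illustration of transmitter macrodiversity (not the SFN-D algorithm or the paper's channel model), the sketch below marks a node as reachable when the summed received power from all SFN nodes transmitting the same data, under a simple path-loss model, exceeds a detection threshold; with a single transmitter the same node may be unreachable.

```python
import numpy as np

def received_power(tx_pos, rx_pos, p_tx=1.0, alpha=3.0):
    """Simple distance^-alpha path-loss model (illustrative only)."""
    d = np.linalg.norm(np.array(tx_pos) - np.array(rx_pos))
    return p_tx / max(d, 1e-9) ** alpha

def reachable(sfn_transmitters, rx_pos, threshold):
    """A node is reachable if the combined power of all SFN transmitters
    sending the same data simultaneously exceeds the detection threshold."""
    total = sum(received_power(tx, rx_pos) for tx in sfn_transmitters)
    return total >= threshold

rx = (3.0, 0.0)
single = [(0.0, 0.0)]
sfn = [(0.0, 0.0), (2.0, 2.0), (4.0, 1.0)]
print(reachable(single, rx, threshold=0.05))  # False: one transmitter is not enough
print(reachable(sfn, rx, threshold=0.05))     # True: combined SFN power suffices
```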

Extended "2D-RIB" for Impression-Based Satisfactory Retrieval and Its Evaluation

Recently, many researchers have been attracted to retrieving from multimedia databases using impression words and their values. Ikezoe's research is one representative approach; it uses eight pairs of opposite impression words. We modified its retrieval interface and proposed '2D-RIB' in previous work. The aim of the present paper is to improve the user's satisfaction with the retrieval results in 2D-RIB. Our method is to extend 2D-RIB. One extension is to define and introduce the following two measures: 'melody goodness' and 'general acceptance'. Another extension is three types of customization menus. The results of an evaluation using a pilot system are as follows. Both measures, 'melody goodness' and 'general acceptance', can contribute to the improvement. Moreover, it is effective to introduce the customization menu which enables a retrieval person to reduce the strictness level of the retrieval condition in an impression pair according to his or her needs.
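
The strictness-level customization can be illustrated with a small sketch that filters items by how closely their impression values match the query, with an adjustable strictness per impression pair; the impression pairs and data below are invented for the example and are not from the 2D-RIB system.

```python
# Illustrative sketch of impression-based retrieval with a per-pair strictness
# level (loosely inspired by the customization idea; not the 2D-RIB system itself).
songs = {
    "song_a": {"bright-dark": 0.8, "calm-intense": 0.3},
    "song_b": {"bright-dark": 0.4, "calm-intense": 0.8},
    "song_c": {"bright-dark": 0.7, "calm-intense": 0.4},
}

def retrieve(query, strictness):
    """Return songs whose impression values lie within a tolerance of the query.
    strictness[pair] in (0, 1]: higher = narrower tolerance for that pair."""
    results = []
    for name, impressions in songs.items():
        if all(abs(impressions[pair] - value) <= (1.0 - strictness[pair])
               for pair, value in query.items()):
            results.append(name)
    return results

query = {"bright-dark": 0.75, "calm-intense": 0.35}
print(retrieve(query, {"bright-dark": 0.9, "calm-intense": 0.9}))  # strict: ['song_a', 'song_c']
print(retrieve(query, {"bright-dark": 0.5, "calm-intense": 0.5}))  # relaxed: all three songs
```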

Design, Implementation and Analysis of Composite Material Dampers for Turning Operations

This paper introduces a novel design for a boring bar with enhanced damping capability. The principle followed in the design phase was to enhance the damping capability while minimizing the loss in static stiffness through the implementation of composite material interfaces. The newly designed tool has been compared to a conventional tool. The evaluation criteria were the dynamic characteristics, frequency and damping ratio, of the machining system, as well as the surface roughness of the machined workpieces. The use of composite material in the design of the damped tool has been demonstrated to be effective. Furthermore, the autoregressive moving average (ARMA) models presented in this paper take into consideration the interaction between the elastic structure of the machine tool and the cutting process, and can therefore be used to characterize the machining system under operational conditions.
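
The ARMA identification step can be sketched with statsmodels as follows; the vibration signal and the model order are placeholders, not the paper's measurements or chosen order.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Placeholder stand-in for a measured tool-vibration signal (e.g., accelerometer
# data recorded during cutting); replace with the actual operational measurement.
rng = np.random.default_rng(0)
signal = rng.standard_normal(2000)

# ARMA(p, q) is ARIMA(p, 0, q); the order (4, 0, 3) is an illustrative choice,
# not the order used in the paper.
model = ARIMA(signal, order=(4, 0, 3)).fit()
print(model.summary())

# The estimated AR polynomial roots carry the modal information (natural
# frequencies and damping ratios) of the machining system under operation.
print(model.arroots)
```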

Alertness States Classification By SOM and LVQ Neural Networks

Several studies have been carried out, using various techniques including neural networks, to discriminate vigilance states in humans from electroencephalographic (EEG) signals, but we are still far from satisfactorily usable results. The work presented in this paper aims at improving this status with regard to two aspects. Firstly, we introduce an original procedure based on the association of two neural networks, a self-organizing map (SOM) and a learning vector quantization (LVQ) network, which automatically detects artefacted states and separates the different levels of vigilance, a significant step forward in the field. Secondly, and more importantly, our study has been oriented toward real-world use, and the resulting model can easily be implemented as a wearable device: it has restricted computational and memory requirements, and data access is very limited in time. Furthermore, ongoing work indicates that this study should shortly result in the design of a non-invasive electronic wearable device.
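
To illustrate the LVQ refinement stage, a minimal LVQ1 sketch on synthetic feature vectors is given below; the SOM stage (which would supply the initial prototypes and flag artefacted states) is assumed and not shown, and the "EEG features" are synthetic placeholders.

```python
import numpy as np

def lvq1(X, y, prototypes, proto_labels, lr=0.05, epochs=20):
    """Basic LVQ1: move the nearest prototype toward a sample of the same
    class and away from a sample of a different class."""
    P = prototypes.copy()
    for _ in range(epochs):
        for x, label in zip(X, y):
            i = np.argmin(np.linalg.norm(P - x, axis=1))   # best-matching prototype
            if proto_labels[i] == label:
                P[i] += lr * (x - P[i])
            else:
                P[i] -= lr * (x - P[i])
    return P

def classify(x, prototypes, proto_labels):
    return proto_labels[np.argmin(np.linalg.norm(prototypes - x, axis=1))]

# Synthetic 2-D "EEG feature" clouds for two vigilance levels (illustrative only).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(1, 0.3, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
prototypes = np.array([[0.2, 0.2], [0.8, 0.8]])   # e.g., taken from trained SOM units
proto_labels = np.array([0, 1])

P = lvq1(X, y, prototypes.astype(float), proto_labels)
print(classify(np.array([0.1, 0.0]), P, proto_labels))    # expected: 0
```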

Analytical and Experimental Methods of Design for Supersonic Two-Stage Ejectors

In this paper, supersonic ejectors are studied experimentally and analytically. An ejector is a device that uses the energy of one fluid to move another fluid; it works like a vacuum pump without the use of a piston, rotor or any other moving component. An ejector contains an active nozzle, a passive nozzle, a mixing chamber and a diffuser. Since viscous effects are significant and the flow in the mixing chamber is turbulent and three-dimensional, numerical methods for analyzing the flow in ejectors are time-consuming and costly. Therefore, this paper presents a simple analytical method based on the governing equations of fluid mechanics. Based on the derived analytical relations, a computer code has been prepared to analyze the flow in the different components of the ejector. An experiment has been performed in the supersonic regime 1.5
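
As an example of the kind of governing relation such a code builds on, the standard isentropic area-Mach number relation for an ideal gas can be evaluated as follows; this is an illustration of one well-known relation, not the paper's actual code.

```python
import math

def area_ratio(mach, gamma=1.4):
    """Isentropic area ratio A/A* as a function of Mach number (ideal gas)."""
    term = (2.0 / (gamma + 1.0)) * (1.0 + 0.5 * (gamma - 1.0) * mach ** 2)
    return (1.0 / mach) * term ** ((gamma + 1.0) / (2.0 * (gamma - 1.0)))

# e.g., exit-to-throat area ratio required for a Mach 1.5 active nozzle
print(round(area_ratio(1.5), 4))   # about 1.1762
```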

Forecasting Malaria Cases in Bujumbura

The focus of this work is to assess which method allows better forecasting of malaria cases in Bujumbura (Burundi) when taking into account the association between climatic factors and the disease. For the period 1996-2007, real monthly data on both malaria epidemiology and climate in Bujumbura are described and analyzed. We propose a hierarchical approach to achieve our objective. We first fit a Generalized Additive Model to the malaria cases to obtain an accurate predictor, which is then used to predict future observations. Several well-known forecasting methods are compared, leading to different results. Based on the in-sample mean absolute percentage error (MAPE), the exponential smoothing state space model with multiplicative error and multiplicative seasonality performed best.
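
As an illustration of the exponential smoothing comparison, the sketch below fits a Holt-Winters model with multiplicative seasonality using statsmodels on a synthetic monthly series and evaluates it with MAPE; this approximates, rather than reproduces, the multiplicative-error ETS model reported as best, and the data are placeholders for the Bujumbura series.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Placeholder monthly malaria-case series; replace with the 1996-2007 Bujumbura data.
rng = np.random.default_rng(0)
months = pd.date_range("1996-01", periods=144, freq="MS")
cases = pd.Series(1000 + 300 * np.sin(2 * np.pi * np.arange(144) / 12)
                  + rng.normal(0, 50, 144), index=months)

train, test = cases[:-12], cases[-12:]
fit = ExponentialSmoothing(train, trend="add", seasonal="mul",
                           seasonal_periods=12).fit()
forecast = fit.forecast(12)

mape = np.mean(np.abs((test - forecast) / test)) * 100  # mean absolute percentage error
print(f"MAPE: {mape:.2f}%")
```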

Choice of Efficient Information System with Service-Oriented Architecture using Multiple Criteria Threshold Algorithms (With Practical Example)

The author presents the results of a study conducted to identify criteria for an efficient information system (IS) with service-oriented architecture (SOA), and proposes a ranking method to evaluate SOA information systems using a set of architecture quality criteria before the systems are implemented. The method is used to compare seven SOA projects, and a ranking of the projects' SOA efficiency is provided. The choice of an SOA realization project depends on the following criteria categories: IS internal work and organization; SOA policies, guidelines and change management; processes and business services readiness; and risk management and mitigation. The last criteria category was analyzed on the basis of project statistics.
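
For illustration only, a simple threshold-based weighted ranking over criteria categories might look like the following sketch; the criteria keys, weights, thresholds and project scores are invented for the example and are not the study's actual values or method.

```python
# Illustrative threshold-based scoring of SOA projects (criteria, weights and
# thresholds are assumed for the example; they are not the paper's values).
criteria = {
    "internal_work":        {"weight": 0.3, "threshold": 3.0},
    "policies_change_mgmt": {"weight": 0.3, "threshold": 3.0},
    "service_readiness":    {"weight": 0.2, "threshold": 2.5},
    "risk_management":      {"weight": 0.2, "threshold": 2.5},
}

projects = {
    "Project A": {"internal_work": 4.2, "policies_change_mgmt": 3.8,
                  "service_readiness": 3.0, "risk_management": 2.0},
    "Project B": {"internal_work": 3.5, "policies_change_mgmt": 4.0,
                  "service_readiness": 3.5, "risk_management": 3.0},
}

def score(project_scores):
    """Weighted sum, but only criteria that clear their threshold contribute."""
    return sum(c["weight"] * project_scores[name]
               for name, c in criteria.items()
               if project_scores[name] >= c["threshold"])

ranking = sorted(projects, key=lambda p: score(projects[p]), reverse=True)
print(ranking)   # ['Project B', 'Project A']
```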

The Modified Eigenface Method using Two Thresholds

A new approach based on Turk and Pentland's eigenface method is adopted in this paper. It was found that the probability density function of the distance between the projection vector of the input face image and the average projection vector of a subject in the face database follows a Rayleigh distribution. In order to decrease the false acceptance rate and increase the recognition rate, the input face image is recognized using two thresholds: an acceptance threshold and a rejection threshold. We also find that the values of the two thresholds come closer to each other as the number of trials increases. During training, in order to reduce the number of trials, the projection vectors for each subject have been averaged. The recognition experiments using the proposed algorithm show that the recognition rate reaches 92.875%, while the average number of judgments is only 2.56.
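
A minimal numpy sketch of the two-threshold decision on top of a standard eigenface projection is given below; the threshold values and the tiny synthetic data are placeholders, not the paper's experimental setup.

```python
import numpy as np

def train_eigenfaces(images, n_components):
    """images: (num_images, num_pixels). Returns the mean face and eigenfaces."""
    mean = images.mean(axis=0)
    centered = images - mean
    # SVD of the centered data gives the principal components (eigenfaces).
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_components]

def project(image, mean, eigenfaces):
    return eigenfaces @ (image - mean)

def recognize(probe, mean, eigenfaces, class_means, accept_thr, reject_thr):
    """Two-threshold decision: accept below accept_thr, reject above reject_thr,
    otherwise report the match as undecided (e.g., ask for another trial)."""
    w = project(probe, mean, eigenfaces)
    dists = {label: np.linalg.norm(w - avg) for label, avg in class_means.items()}
    label, d = min(dists.items(), key=lambda kv: kv[1])
    if d <= accept_thr:
        return label
    if d >= reject_thr:
        return "rejected"
    return "undecided"

# Tiny synthetic example: 4 "images" of 64 pixels, two subjects.
rng = np.random.default_rng(0)
faces = rng.random((4, 64))
labels = ["s1", "s1", "s2", "s2"]
mean, eig = train_eigenfaces(faces, n_components=2)
# Average projection vector per subject (as in the training step described above).
class_means = {s: np.mean([project(f, mean, eig)
                           for f, l in zip(faces, labels) if l == s], axis=0)
               for s in set(labels)}
# Prints the accepted subject, 'undecided' or 'rejected' depending on the thresholds.
print(recognize(faces[0], mean, eig, class_means, accept_thr=0.5, reject_thr=2.0))
```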

Virtual Reality for Mutual Understanding in Landscape Planning

This paper argues that fostering mutual understanding in landscape planning is as much about the planners educating stakeholder groups as the stakeholders educating the planners. In other words, it requires an epistemological agreement on the meaning and nature of place, especially where an effort is made to go beyond the quantitative aspects, which can be achieved through the phenomenological experience of a Virtual Reality (VR) environment. This education needs to be a bi-directional process in which distance can be both temporal and spatial separation of participants, and there needs to be a common framework of understanding in which neither 'side' is disadvantaged during the process of information exchange. It follows that a medium such as VR offers an effective way of overcoming some of the shortcomings of traditional media by taking advantage of continuing advances in information and communications technology (ICT). In this paper we make particular reference to this as an extension of Geographical Information Systems (GIS). VR as a two-way communication tool offers considerable potential, particularly in the area of Public Participation GIS (PPGIS). Information-rich virtual environments that can operate over broadband networks are now possible and allow the representation of large amounts of qualitative and quantitative information 'side-by-side'. Therefore, with broadband access becoming standard for households and enterprises alike, distributed virtual reality environments have great potential to contribute to enabling stakeholder participation and mutual learning within the planning context.

Design of an SNMP Agent for OSGi Service Platforms

On one hand, SNMP (Simple Network Management Protocol) allows different enterprise elements connected through the Internet to be integrated into a standardized remote management scheme. On the other hand, as a consequence of the success of intelligent homes, these can now be connected to the Internet by means of a residential gateway according to a common standard called OSGi (Open Services Gateway initiative). Due to the specifics of OSGi Service Platforms and their dynamic nature, specific design criteria should be defined for implementing SNMP agents for OSGi in order to integrate them into SNMP remote management. Based on an analysis of the relation between the two standards (SNMP and OSGi), this paper shows how OSGi Service Platforms can be included in the SNMP management of a global enterprise, giving implementation details of an SNMP agent solution and the definition of a new MIB (Management Information Base) for managing OSGi platforms that takes into account the specifics and dynamic nature of OSGi.

Optimal All-to-All Personalized Communication in All-Port Tori

All-to-all personalized communication, also known as complete exchange, is one of the densest communication patterns in parallel computing. In this paper, we propose new indirect algorithms for complete exchange on all-port rings and tori. The new algorithms fully utilize all communication links and transmit messages along shortest paths, completely achieving the theoretical lower bounds on message transmission, which have not been achieved by other existing indirect algorithms. For a 2D r × c (r % c) all-port torus, the algorithm achieves optimal transmission cost with an O(c) message startup cost. In addition, the proposed algorithms accommodate non-power-of-two tori, where the number of nodes in each dimension need not be a power of two and the torus need not be square. Finally, the algorithms are conceptually simple and symmetrical for every message and every node, so that they can be easily implemented and achieve the optimum in practice.
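
The transmission lower bound such algorithms aim for can be illustrated by counting message-hops on a bidirectional all-port ring: every node sends a distinct unit-size message to every other node along a shortest path, and the total hop count divided by the number of directed links gives the minimum number of transmission steps. The sketch below computes this bound numerically; it illustrates where the bound comes from and is not the paper's algorithm.

```python
def ring_lower_bound(n):
    """Lower bound on transmission steps for complete exchange on an
    n-node bidirectional all-port ring with unit-size messages."""
    # Total message-hops: each node sends one message to every other node
    # along a shortest path on the ring.
    hops_per_node = sum(min(k, n - k) for k in range(1, n))
    total_hops = n * hops_per_node
    # The ring has 2n directed links, each carrying at most one message per step.
    links = 2 * n
    return -(-total_hops // links)   # ceiling division

for n in (4, 8, 16):
    print(n, ring_lower_bound(n))    # 4 -> 2, 8 -> 8, 16 -> 32
```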

A New Approach of Wireless Network Traffic on VPN

This work presents a new approach to securing a wireless network. The configuration is focused on securing and protecting wireless network traffic for a small network such as a home or dorm room. The security mechanism provides both authentication, allowing only known, authorized users access to the wireless network, and encryption, preventing anyone from reading the wireless traffic. The solution utilizes the open-source FreeS/WAN software, which implements Internet Protocol Security (IPsec). In addition to the wireless components, a wireless NIC in the PC and a wireless access point, the setup requires a machine running Linux to act as a security gateway. While the current configuration assumes that the wireless PC clients are running Linux, Windows XP/Vista/7 machines equipped with VPN software can also interface with this configuration.

Medical Image Segmentation Based On Vigorous Smoothing and Edge Detection Ideology

Medical image segmentation based on image smoothing followed by edge detection assumes a great degree of importance in the field of image processing. In this regard, this paper proposes a novel algorithm for medical image segmentation based on vigorous smoothing, after identifying the type of noise, followed by an edge detection ideology, which promises to be a boon for medical image diagnosis. The main objective of this algorithm is to take a particular medical image as input, preprocess it to remove the noise content by employing a suitable filter after identifying the type of noise, and finally carry out edge detection for image segmentation. The algorithm consists of three parts. First, the type of noise present in the medical image is identified as additive, multiplicative or impulsive by analysis of local histograms, and the image is denoised by employing a Median, Gaussian or Frost filter accordingly. Second, edge detection of the filtered medical image is carried out using the Canny edge detection technique. Third, the edge-detected medical image is segmented by the method of normalized-cut eigenvectors. The method is validated through experiments on real images. The proposed algorithm has been simulated on the MATLAB platform. The simulation results show that the proposed algorithm is very effective; it can deal with low-quality or vague images with high spatial redundancy, low contrast and considerable noise, and has potential for practical use in medical image diagnosis.
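
A minimal Python/OpenCV sketch of the filter-selection-then-Canny idea follows (the paper's implementation is in MATLAB). The noise classifier here is a stub, the input file name is hypothetical, and OpenCV has no built-in Frost filter, so a Gaussian blur stands in for the multiplicative-noise case.

```python
import cv2

def classify_noise(image):
    """Stub for the local-histogram noise classifier described in the paper;
    returns one of 'impulsive', 'additive', 'multiplicative'."""
    return "impulsive"   # placeholder decision

def denoise(image, noise_type):
    if noise_type == "impulsive":
        return cv2.medianBlur(image, 5)            # salt-and-pepper noise
    if noise_type == "additive":
        return cv2.GaussianBlur(image, (5, 5), 0)  # Gaussian noise
    # Multiplicative (speckle) noise: the paper uses a Frost filter; OpenCV has
    # no built-in Frost filter, so a Gaussian blur stands in for this sketch.
    return cv2.GaussianBlur(image, (5, 5), 0)

image = cv2.imread("scan.png", cv2.IMREAD_GRAYSCALE)  # hypothetical input file
filtered = denoise(image, classify_noise(image))
edges = cv2.Canny(filtered, threshold1=50, threshold2=150)
cv2.imwrite("edges.png", edges)
# The edge map would then feed the normalized-cut segmentation stage.
```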

Development of a Catchment Water Quality Model for Continuous Simulations of Pollutants Build-up and Wash-off

Estimation of runoff water quality parameters is required to determine appropriate water quality management options. Various models are used to estimate runoff water quality parameters; however, most provide event-based estimates for specific sites. The work presented in this paper describes the development of a model that continuously simulates the accumulation and wash-off of water quality pollutants in a catchment. The model allows estimation of pollutant build-up during dry periods and pollutant wash-off during storm events. It was developed by integrating two individual models: a rainfall-runoff model and a catchment water quality model. The rainfall-runoff model is based on the time-area runoff estimation method. The model allows users to estimate the time of concentration using a range of established methods, and to estimate the continuing runoff losses using any of the available estimation methods (i.e., constant, linearly varying or exponentially varying). Pollutant build-up in a catchment is represented by one of three pre-defined functions: power, exponential, or saturation. Similarly, pollutant wash-off is represented by one of three functions: power, rating-curve, or exponential. The developed runoff water quality model was set up to simulate the build-up and wash-off of total suspended solids (TSS), total phosphorus (TP) and total nitrogen (TN). The application of the model was demonstrated using available runoff and TSS field data from road and roof surfaces on the Gold Coast, Australia. The model provided an excellent representation of the field data, demonstrating the simplicity yet effectiveness of the proposed model.
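
The three pre-defined build-up functions and an exponential wash-off function take the standard forms sketched below; the coefficients are illustrative assumptions, not the model's calibrated values.

```python
import math

# Pollutant build-up on the surface during dry days; coefficients are
# illustrative, not calibrated values from the paper.
def buildup_power(t, a=2.0, b=0.5):                  # B = a * t^b
    return a * t ** b

def buildup_exponential(t, b_max=10.0, k=0.4):       # B = Bmax * (1 - e^(-k t))
    return b_max * (1.0 - math.exp(-k * t))

def buildup_saturation(t, b_max=10.0, k_half=3.0):   # B = Bmax * t / (Khalf + t)
    return b_max * t / (k_half + t)

# Wash-off of the accumulated load during a runoff event (exponential form;
# the power and rating-curve forms would be defined analogously).
def washoff_exponential(available, runoff_mm, k=0.2):
    """Fraction of the available load removed, increasing with runoff depth."""
    return available * (1.0 - math.exp(-k * runoff_mm))

dry_days, runoff_mm = 7, 12.0
load = buildup_exponential(dry_days)
removed = washoff_exponential(load, runoff_mm)
print(f"built-up: {load:.2f}, washed off: {removed:.2f}")
```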

Low Pressure Binder-Less Densification of Fibrous Biomass Material using a Screw Press

In this study, the theoretical relationship between pressure and density was investigated for cylindrical hollow fuel briquettes produced from a mixture of fibrous biomass material using a screw press without any chemical binder. The fuel briquettes were made of biomass and other waste material, such as spent coffee beans, mielie husks, saw dust and coal fines, under pressures of 0.878-2.2 MPa. The material was densified into briquettes with an outer diameter of 100 mm, an inner diameter of 35 mm and a length of 50 mm. It was observed that the manual screw compression action produces briquettes of relatively low density compared to those made using a hydraulic compression action. The pressure-density relationship was obtained in the form of a power law and compares well with that of other cylindrical solid briquettes made using hydraulic compression. The produced briquettes have a dry density of 989 kg/m3 and contain 26.30% fixed carbon, 39.34% volatile matter, 10.9% moisture and 10.46% ash, as per dry proximate analysis. Bomb calorimeter tests showed that the briquettes yield a gross calorific value of 18.9 MJ/kg.
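
A power-law pressure-density relation of the kind reported can be recovered from measurements with a simple log-log least-squares fit, sketched below with placeholder data points rather than the actual screw-press measurements.

```python
import numpy as np

# Placeholder (pressure [MPa], dry density [kg/m^3]) pairs; replace with the
# measured values from the screw-press experiments.
pressure = np.array([0.878, 1.2, 1.6, 2.0, 2.2])
density = np.array([820.0, 880.0, 930.0, 970.0, 989.0])

# Fit density = a * pressure^b by linear regression in log-log space.
b, log_a = np.polyfit(np.log(pressure), np.log(density), 1)
a = np.exp(log_a)
print(f"density ~ {a:.1f} * P^{b:.3f}")
```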

Space-Vector PWM Inverter Feeding a Permanent-Magnet Synchronous Motor

The paper presents a space-vector pulse width modulation (SVPWM) inverter feeding a permanent-magnet synchronous motor (PMSM). The SVPWM inverter enables the motor to be fed with a higher voltage and lower harmonic distortion than a conventional sinusoidal PWM inverter. The control strategy of the inverter is the voltage/frequency control method, which is based on the space-vector modulation technique. The proposed PMSM drive system, involving a field-oriented control scheme, not only decouples the torque and flux, which provides a faster response, but also makes the control task easy. The performance of the proposed drive is simulated, and its advantages are confirmed by the simulation results.
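
The dwell-time calculation at the core of space-vector modulation can be sketched in its textbook form as follows; the DC-link voltage, switching period and reference values are illustrative, not taken from the paper.

```python
import math

def svpwm_dwell_times(v_ref, theta, v_dc, t_s):
    """Standard SVPWM dwell times for the two active vectors and the zero vectors.
    v_ref: reference voltage magnitude, theta: reference vector angle [rad],
    v_dc: DC-link voltage, t_s: switching period."""
    sector = int(theta // (math.pi / 3)) % 6          # sector 0..5
    alpha = theta - sector * (math.pi / 3)            # angle inside the sector
    m = math.sqrt(3) * v_ref / v_dc                   # modulation index
    t1 = t_s * m * math.sin(math.pi / 3 - alpha)      # first adjacent active vector
    t2 = t_s * m * math.sin(alpha)                    # second adjacent active vector
    t0 = t_s - t1 - t2                                # zero vectors (split between V0/V7)
    return sector, t1, t2, t0

# Example: 250 V reference at 40 degrees, 560 V DC link, 100 us switching period.
print(svpwm_dwell_times(250.0, math.radians(40), 560.0, 100e-6))
```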

Moving towards Positive Security Model for Web Application Firewall

The proliferation of web applications and the pervasiveness of mobile technology make web-based attacks more attractive and easier to launch. A Web Application Firewall (WAF) is an intermediary between the web server and users that provides comprehensive protection for web applications. A conventional WAF follows a negative security model, in which the detection and prevention mechanisms are based on predefined or user-defined attack signatures and patterns. However, a WAF alone is not adequate to offer the best defence against web vulnerabilities, which are increasing in number and complexity daily. This paper presents a methodology to automatically build a positive security model which identifies and allows only legitimate web queries. The paper shows that a true positive rate of more than 90% can be achieved.
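
A toy sketch of how a positive (allowlist) profile of legitimate query parameters might be learned from training traffic and then enforced is given below; the character-class profiles and length rule are illustrative assumptions, not the paper's actual methodology.

```python
import re
from urllib.parse import urlparse, parse_qsl

# Character-class profiles from least to most permissive (illustrative).
PROFILES = [("digits", re.compile(r"^\d+$")),
            ("alnum",  re.compile(r"^[A-Za-z0-9_-]+$")),
            ("text",   re.compile(r"^[A-Za-z0-9 .,_-]+$"))]

def learn_profile(legitimate_urls):
    """Learn, per (path, parameter), the widest profile and maximum length
    observed across the legitimate training requests."""
    model = {}
    order = [p for p, _ in PROFILES]
    for url in legitimate_urls:
        parsed = urlparse(url)
        for name, value in parse_qsl(parsed.query):
            key = (parsed.path, name)
            prof = next((p for p, rx in PROFILES if rx.match(value)), None)
            if prof is None:
                continue   # skip values no profile covers (could also widen)
            seen_prof, seen_len = model.get(key, ("digits", 0))
            widest = max(prof, seen_prof, key=order.index)
            model[key] = (widest, max(seen_len, len(value)))
    return model

def is_allowed(url, model):
    parsed = urlparse(url)
    for name, value in parse_qsl(parsed.query):
        entry = model.get((parsed.path, name))
        if entry is None:
            return False                      # unknown parameter: block
        prof, max_len = entry
        rx = dict(PROFILES)[prof]
        if len(value) > 2 * max_len or not rx.match(value):
            return False                      # outside the learned profile: block
    return True

model = learn_profile(["/item?id=42", "/item?id=1378", "/search?q=blue shoes"])
print(is_allowed("/item?id=57", model))            # True
print(is_allowed("/item?id=1 OR 1=1--", model))    # False (not digits)
```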