Multi-Line Power Flow Control using Interline Power Flow Controller (IPFC) in Power Transmission Systems

The interline power flow controller (IPFC) is one of the latest-generation flexible AC transmission systems (FACTS) controllers, used to control the power flows of multiple transmission lines. This paper presents a mathematical model of the IPFC, termed the power injection model (PIM). The model is incorporated into the Newton-Raphson (NR) power flow algorithm to study power flow control in the transmission lines in which the IPFC is placed. A MATLAB program has been written to extend the conventional NR algorithm with this model. Numerical studies are carried out on a standard two-machine, five-bus system. The results with and without the IPFC are compared in terms of voltages and active and reactive power flows to demonstrate the performance of the IPFC model.
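
For illustration, a minimal Python sketch of the injection-model idea (the paper's implementation is in MATLAB): the IPFC's effect is folded into equivalent extra P/Q injections at the buses it connects, so the standard NR machinery is unchanged. For brevity all non-slack buses are treated as PQ buses and the Jacobian is formed by finite differences; bus data and tolerances are assumptions.

```python
import numpy as np

def mismatch(x, Ybus, P_spec, Q_spec, V_slack=1.0):
    n = Ybus.shape[0]
    theta = np.concatenate([[0.0], x[:n - 1]])         # slack angle fixed at 0
    V = np.concatenate([[V_slack], x[n - 1:]])
    Vc = V * np.exp(1j * theta)
    S = Vc * np.conj(Ybus @ Vc)                        # complex bus injections
    return np.concatenate([P_spec[1:] - S.real[1:], Q_spec[1:] - S.imag[1:]])

def nr_power_flow(Ybus, P_spec, Q_spec, P_ipfc, Q_ipfc, tol=1e-8):
    P, Q = P_spec + P_ipfc, Q_spec + Q_ipfc            # PIM: shift specified powers
    n = Ybus.shape[0]
    x = np.concatenate([np.zeros(n - 1), np.ones(n - 1)])   # flat start
    for _ in range(30):
        f = mismatch(x, Ybus, P, Q)
        if np.max(np.abs(f)) < tol:
            break
        J = np.empty((len(f), len(x)))                 # finite-difference Jacobian
        for k in range(len(x)):
            xp = x.copy()
            xp[k] += 1e-6
            J[:, k] = (mismatch(xp, Ybus, P, Q) - f) / 1e-6
        x -= np.linalg.solve(J, f)                     # Newton update
    return x                                           # non-slack angles, voltages
```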

Adaptive Bidirectional Flow for Image Interpolation and Enhancement

Image interpolation is a common problem in imaging applications. However, most existing interpolation algorithms suffer to some extent from visually blurred edges and jagged artifacts. This paper presents an adaptive, feature-preserving bidirectional flow process, in which an inverse diffusion is performed to sharpen edges along the directions normal to the isophote lines (edges), while a normal diffusion is applied to remove artifacts ("jaggies") along the tangent directions. In order to preserve image features such as edges, corners, and textures, the nonlinear diffusion coefficients are locally adjusted according to the directional derivatives of the image. Experimental results on synthetic and natural images demonstrate that our interpolation algorithm substantially improves the subjective quality of the interpolated images over conventional methods.
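
A toy sketch of one explicit step of such a bidirectional flow (our assumption of the scheme's general shape, not the authors' exact discretization): backward diffusion along the gradient (normal) direction sharpens edges, forward diffusion along the level-line (tangent) direction removes jaggies. The paper adapts the coefficients locally; here a and b are constants.

```python
import numpy as np

def bidirectional_step(I, a=0.1, b=0.1, eps=1e-8):
    Iy, Ix = np.gradient(I)                      # first derivatives
    Ixy, Ixx = np.gradient(Ix)
    Iyy, Iyx = np.gradient(Iy)
    g2 = Ix ** 2 + Iy ** 2 + eps
    # Second derivatives along the gradient (normal) and level-line (tangent).
    I_nn = (Ix ** 2 * Ixx + 2 * Ix * Iy * Ixy + Iy ** 2 * Iyy) / g2
    I_tt = (Iy ** 2 * Ixx - 2 * Ix * Iy * Ixy + Ix ** 2 * Iyy) / g2
    # Inverse diffusion across edges (sharpen), normal diffusion along them.
    return I - a * I_nn + b * I_tt
```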

Influence of Radio Frequency Identification Technology on Logistics, Inventory Control and Supply Chain Optimization

The main aim of supply chain management (SCM) is to produce, distribute, and deliver goods and equipment to the right location, at the right time, and in the right amount to satisfy customers, with minimal waste of time and cost. Implementing techniques that reduce project time and cost and improve productivity and performance is therefore very important. Emerging technologies such as radio frequency identification (RFID) are now making it possible to automate supply chains in real time, making them more efficient than the simple supply chains of the past at tracing and monitoring goods and products and at capturing data on the movement of goods and other events. This paper reviews the concepts, components, and characteristics of RFID technology, with a focus on warehouse and inventory management. In addition, the use of RFID to improve information management in the supply chain is discussed. Finally, the practicalities of installation and the results of this technology for warehouse and inventory management and business development are presented.

Shape Restoration of the Left Ventricle

This paper describes an automatic algorithm for restoring the shape of three-dimensional (3D) left ventricle (LV) models created from magnetic resonance imaging (MRI) data using a geometry-driven optimization approach. Our basic premise is to restore the LV shape such that the LV epicardial surface is smooth after the restoration. A geometrical measure known as the minimum principal curvature (κ2) is used to assess the smoothness of the LV. This measure is used to construct the objective function of a two-step optimization process. The objective of the optimization is to achieve a smooth epicardial shape by iterative in-plane translation of the MRI slices. Quantitatively, this amounts to minimizing the sum of the magnitudes of κ2 wherever κ2 is negative. A limited-memory quasi-Newton algorithm, L-BFGS-B, is used to solve the optimization problem. We tested our algorithm on an in vitro theoretical LV model and on 10 in vivo patient-specific models containing significant motion artifacts. The results show that our method is able to automatically restore the LV models to smoothness without altering the general shape of the model. The magnitudes of the in-plane translations are also consistent with existing registration techniques and experimental findings.
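
A sketch of the optimization step only, assuming SciPy: per-slice in-plane translations are the variables, and a simple cross-slice roughness term stands in for the paper's sum of |κ2| over negative-κ2 regions (computing the true minimum principal curvature of the mesh is omitted for brevity; the contours below are synthetic).

```python
import numpy as np
from scipy.optimize import minimize

def objective(t, contours):
    """t: flattened (n_slices, 2) translations; contours: list of (m, 2) arrays."""
    t = t.reshape(-1, 2)
    shifted = np.stack([c + ti for c, ti in zip(contours, t)])   # (n, m, 2)
    # Roughness across slices: second differences of corresponding points,
    # a crude stand-in for the negative minimum principal curvature measure.
    d2 = shifted[2:] - 2 * shifted[1:-1] + shifted[:-2]
    return np.sum(np.linalg.norm(d2, axis=-1))

n_slices, m = 10, 60
ang = np.linspace(0, 2 * np.pi, m, endpoint=False)
rng = np.random.default_rng(0)
contours = [np.c_[np.cos(ang), np.sin(ang)] * (3 + 0.1 * k)
            + rng.normal(0, 0.3, 2)                      # misaligned slices
            for k in range(n_slices)]

res = minimize(objective, np.zeros(2 * n_slices), args=(contours,),
               method="L-BFGS-B", options={"maxiter": 200})
print(res.fun, res.x.reshape(-1, 2)[:3])                 # recovered translations
```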

Individual Learning and Collaborative Knowledge Building with Shared Digital Artifacts

The development of Internet technology in recent years has led to a more active role of users in creating Web content. This has significant effects on both individual learning and collaborative knowledge building. This paper presents an integrative framework model that describes and explains learning and knowledge building with shared digital artifacts on the basis of Luhmann's systems theory and Piaget's model of equilibration. In this model, knowledge progress is based on cognitive conflicts resulting from incongruities between an individual's prior knowledge and the information contained in a digital artifact. Empirical support for the model is provided by 1) applying it descriptively to texts from Wikipedia, 2) examining knowledge-building processes using social network analysis, and 3) presenting a survey of a series of experimental laboratory studies.

Optimal Placement of FACTS Devices by Genetic Algorithm for the Increased Loadability of a Power System

This paper presents a genetic algorithm (GA) based approach to the allocation of FACTS (flexible AC transmission system) devices for improving the power transfer capacity of an interconnected power system. The approach is applied to the IEEE 30-bus system. The system is reactively loaded from the base load up to 200% of it. FACTS devices are installed at different locations in the power system, and system performance is observed with and without them. First, the locations where the FACTS devices are to be placed are determined by calculating the active and reactive power flows in the lines. The genetic algorithm is then applied to find the optimal magnitudes of the FACTS devices. The results clearly show that GA-based placement of FACTS devices is highly beneficial in terms of both performance and economy.
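
A minimal GA sketch for the sizing step (locations assumed chosen beforehand from line P/Q flows, as in the paper). The fitness below is a placeholder; in the real study it would come from a power-flow run on the reactively loaded IEEE 30-bus system, and the bounds are assumed values.

```python
import numpy as np

rng = np.random.default_rng(1)
n_dev, pop_size, gens = 4, 40, 100
lo, hi = -0.5, 0.5                       # device setting bounds (p.u., assumed)

def fitness(x):
    # Placeholder: penalize deviation from a fictitious optimum setting.
    return -np.sum((x - 0.2) ** 2)

pop = rng.uniform(lo, hi, (pop_size, n_dev))
for _ in range(gens):
    f = np.array([fitness(ind) for ind in pop])
    # Tournament selection between random pairs
    i, j = rng.integers(pop_size, size=(2, pop_size))
    parents = np.where((f[i] > f[j])[:, None], pop[i], pop[j])
    # Single-point crossover with the next parent in the pool
    cut = rng.integers(1, n_dev, pop_size)
    mask = np.arange(n_dev) < cut[:, None]
    children = np.where(mask, parents, np.roll(parents, 1, axis=0))
    # Gaussian mutation, clipped to the bounds
    children += rng.normal(0, 0.02, children.shape)
    pop = np.clip(children, lo, hi)

best = max(pop, key=fitness)
print("best device settings:", best)
```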

Kalman Filter Based Adaptive Reduction of Motion Artifact from Photoplethysmographic Signal

Artifact-free photoplethysmographic (PPG) signals are necessary for the non-invasive estimation of oxygen saturation (SpO2) in arterial blood. Patient movement corrupts the PPG with motion artifacts, resulting in large errors in the computation of SpO2. This paper presents a study on using the Kalman filter in an innovative way, by modeling both the arterial blood pressure (ABP) and the unwanted additive motion artifact, to reduce motion artifacts in corrupted PPG signals. Simulation results show acceptable performance compared with LMS and variable-step LMS, establishing the efficacy of the proposed method.
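
A hedged sketch of the idea: model both the clean pulsatile component and the additive motion artifact as states of a Kalman filter, observe their sum (the corrupted PPG), and read the clean component off the state estimate. The random-walk state models and noise levels below are assumptions, not the paper's tuned design; the separation is only as good as those models.

```python
import numpy as np

def kf_separate(ppg_corrupted, q_sig=1e-3, q_art=1e-2, r=1e-2):
    x = np.zeros(2)                      # state: [clean PPG, motion artifact]
    P = np.eye(2)
    F = np.eye(2)                        # random-walk dynamics for both states
    Q = np.diag([q_sig, q_art])          # different process noise per state
    H = np.array([[1.0, 1.0]])           # we observe clean + artifact
    clean = np.empty(len(ppg_corrupted))
    for k, z in enumerate(ppg_corrupted):
        x = F @ x                        # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + r              # innovation covariance
        K = (P @ H.T) / S                # Kalman gain
        x = x + (K * (z - H @ x)).ravel()
        P = (np.eye(2) - K @ H) @ P
        clean[k] = x[0]                  # estimated artifact-free PPG
    return clean
```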

Extended Deductive Databases with Uncertain Information

The paper presents an approach for handling uncertain information in deductive databases using multivalued logics. Uncertainty means that database facts may be assigned logical values other than the conventional ones, true and false. The logical values represent various degrees of truth, which may be combined and propagated by applying the database rules. A corresponding multivalued database semantics is defined. We show that it extends successful conventional semantics such as the well-founded semantics, and that it has polynomial-time data complexity.
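
A tiny sketch of one common choice for combining degrees of truth when propagating rules (min over a rule body, max across alternative derivations); the paper's multivalued semantics generalizes this kind of combination. The facts and rules below are made up.

```python
facts = {"supplier(a)": 1.0, "reliable(a)": 0.7, "cheap(a)": 0.4}

# rule 1: good(X) <- reliable(X), cheap(X)
# rule 2: good(X) <- supplier(X), reliable(X)
rules = [["reliable(a)", "cheap(a)"], ["supplier(a)", "reliable(a)"]]

# Body value: weakest atom in the body; head value: best derivation wins.
body_values = [min(facts[atom] for atom in body) for body in rules]
facts["good(a)"] = max(body_values)
print(facts["good(a)"])                  # -> 0.7
```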

Automatic Removal of Ocular Artifacts using JADE Algorithm and Neural Network

The electroencephalogram (EEG) is useful for clinical diagnosis and biomedical research. EEG signals often contain strong electrooculogram (EOG) artifacts produced by eye movements and eye blinks, especially in EEG recorded from frontal channels. These artifacts obscure the underlying brain activity, making its visual or automated inspection difficult. The goal of ocular artifact removal is to remove ocular artifacts from the recorded EEG while preserving the underlying background signals due to brain activity. Recently, independent component analysis (ICA) algorithms have demonstrated superior potential in obtaining the least dependent source components. In this paper, the independent components are obtained using the JADE algorithm (the best separating algorithm) and are classified as either artifact components or neural components. A neural network is used for this classification. The network requires input features that accurately represent the true character of the input signals, so that it can classify the signals based on the key characteristics that differentiate them. In this work, autoregressive (AR) coefficients are used as the input features for classification. Two neural network approaches are used to learn classification rules from the EEG data: first, a polynomial neural network (PNN) trained by the GMDH (group method of data handling) algorithm, and second, a feed-forward neural network trained by standard back-propagation. The results show that JADE-FNN performs better than JADE-PNN.
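
A sketch of the feature-extraction step only: Yule-Walker AR coefficients of an independent component, used as the classifier's input features. The model order and the stand-in signal are illustrative assumptions; JADE itself is not reproduced here.

```python
import numpy as np

def ar_coefficients(x, order=6):
    """Estimate AR coefficients of a 1D signal via the Yule-Walker equations."""
    x = x - x.mean()
    r = np.correlate(x, x, mode="full")[len(x) - 1:] / len(x)   # autocorrelation
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, r[1:order + 1])

rng = np.random.default_rng(0)
component = rng.normal(size=1000)        # stand-in for one ICA output component
features = ar_coefficients(component)    # feature vector fed to the classifier
print(features)
```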

Data Organization Before Learning Multi-Entity Bayesian Network Structures

The objective of our work is to develop a new approach to discovering knowledge from a large mass of data; the result of applying this approach will be an expert system serving as a diagnostic tool for a phenomenon related to a huge information system. We first recall the general problem of learning Bayesian network structure from data and suggest a solution for optimizing the complexity by using organizational and optimization methods on the data. We then propose a new heuristic for learning Multi-Entity Bayesian Network structures. We have applied our approach to biological facts concerning complex hereditary illnesses, for which the biological literature identifies the variables responsible for those diseases. Finally, we conclude on the limits reached by this work.

Super-Resolution Blind Reconstruction of Low-Resolution Images Using Wavelet-Based Fusion

Crucial information barely visible to the human eye is often embedded in a series of low-resolution images taken of the same scene. Super-resolution reconstruction is the process of combining several low-resolution images into a single higher-resolution image. The ideal algorithm should be fast and should add sharpness and detail, both at edges and in regions, without adding artifacts. In this paper we propose a super-resolution blind reconstruction technique for linearly degraded images. The proposed algorithm is divided into three parts: image registration, wavelet-based fusion, and image restoration. Three low-resolution images are considered, which may be sub-pixel shifted, rotated, blurred, or noisy. The sub-pixel-shifted images are registered using an affine transformation model, a wavelet-based fusion is performed, and the noise is removed using soft thresholding. Our technique reduces blocking artifacts, smooths the edges, and is also able to restore high-frequency details in an image. It is efficient and computationally fast, with a clear prospect of real-time implementation.
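
A sketch of the fusion and denoising stages only (registration omitted), assuming the PyWavelets package: fuse already-registered, same-sized images in the wavelet domain by keeping the largest-magnitude detail coefficient at each position, then soft-threshold the detail bands. The wavelet, level, and threshold are assumptions for illustration.

```python
import numpy as np
import pywt

def fuse_and_denoise(images, wavelet="db4", level=2, thr=0.05):
    decs = [pywt.wavedec2(im, wavelet, level=level) for im in images]
    fused = [np.mean([d[0] for d in decs], axis=0)]       # average approximations
    for lev in range(1, level + 1):
        bands = []
        for b in range(3):                                # H, V, D detail bands
            stack = np.stack([d[lev][b] for d in decs])
            pick = np.abs(stack).argmax(axis=0)           # max-magnitude fusion rule
            band = np.take_along_axis(stack, pick[None], axis=0)[0]
            bands.append(pywt.threshold(band, thr, mode="soft"))  # denoise
        fused.append(tuple(bands))
    return pywt.waverec2(fused, wavelet)
```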

A Local Statistics Based Region Growing Segmentation Method for Ultrasound Medical Images

This paper presents a region-based segmentation method for ultrasound images using local statistics. In this segmentation approach, the homogeneous regions depend on the granularity features of the image, where structures of interest with dimensions comparable to the speckle size are to be extracted. The method uses a look-up table comprising the local statistics of every pixel, consisting of the homogeneity and similarity bounds according to the kernel size. The shape and size of the growing regions depend on the entries of this look-up table. The algorithm is implemented using a connected seeded region growing procedure in which each pixel is taken as a seed point. Region merging after the region growing also suppresses high-frequency artifacts. The updated merged regions produce the output in the form of a segmented image. The algorithm produces results that are less sensitive to pixel location and also allow accurate segmentation of homogeneous regions.
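
A minimal sketch of connected seeded region growing with a homogeneity test: a neighbour joins the region while it stays within a similarity bound of the running region mean. The paper derives per-pixel bounds from a kernel-based look-up table; a single global tolerance is used here for brevity.

```python
import numpy as np
from collections import deque

def region_grow(img, seed, tol=10.0):
    """Grow a 4-connected region from `seed` while pixels match the region mean."""
    h, w = img.shape
    mask = np.zeros((h, w), bool)
    mask[seed] = True
    q, total, count = deque([seed]), float(img[seed]), 1
    while q:
        y, x = q.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx]:
                if abs(img[ny, nx] - total / count) <= tol:   # homogeneity test
                    mask[ny, nx] = True
                    total += float(img[ny, nx])
                    count += 1
                    q.append((ny, nx))
    return mask
```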

Optimal SSSC Placement for ATC Enhancement in Power Systems

This paper examines the optimization of the available transfer capability (ATC) of power systems using a FACTS device, the static synchronous series compensator (SSSC), equipped with energy storage devices. The placement of the SSSC and the tuning of its parameters are illustrated. Voltage magnitude constraints at network buses, line transient stability constraints, and voltage collapse constraints are considered. To support the calculations, a comprehensive program written in Delphi is provided, which can simulate and trace the parameters of an SSSC installed on a specific line. Furthermore, the program can compute the ATC, the TTC, and the maximum enhancement of both after installing the SSSC.

Improvement of Power System Oscillation Damping by FACTS Devices: A Comparison between SSSC and STATCOM

The main objective of this paper is a comparative investigation of the enhancement of power system oscillation damping via the coordinated design of a power system stabilizer (PSS), a static synchronous series compensator (SSSC), and a static synchronous compensator (STATCOM). The design problem of the FACTS-based stabilizers is formulated as a GA-based optimization problem. Eigenvalue analysis is used to study the small-signal stability of a single-machine infinite-bus (SMIB) system installed with an SSSC and a STATCOM; the generator is equipped with a PSS. The proposed stabilizers are tested on a weakly connected power system under different disturbances and loading conditions, with the aim of enhancing both rotor angle and power system stability. The eigenvalue analysis and nonlinear simulation results are presented to show the effects of these FACTS-based stabilizers, and they reveal that the SSSC is the most effective at damping power system oscillations.
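
As an illustration of the eigenvalue step, a minimal sketch on the classical linearized SMIB swing model (the paper's full model with PSS, SSSC, and STATCOM is far richer; K, D, and H below are assumed values): the damping of the electromechanical mode is read directly from the eigenvalues of the state matrix.

```python
import numpy as np

# States: rotor angle deviation and speed deviation of the classical model.
K, D, H, ws = 1.2, 1.0, 4.0, 2 * np.pi * 60    # sync. torque, damping, inertia, rad/s
A = np.array([[0.0,          ws],
              [-K / (2 * H), -D / (2 * H)]])

for lam in np.linalg.eigvals(A):
    zeta = -lam.real / abs(lam)                # damping ratio of the mode
    print(f"lambda = {lam:.3f}, damping ratio = {zeta:.4f}")
```

A stabilizer that improves damping shows up as these eigenvalues moving further into the left half-plane, which is how the SSSC and STATCOM designs are compared.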

Fuzzy Logic Control of Static Var Compensator for Power System Damping

The static VAR compensator (SVC) is a shunt-type FACTS device used in power systems primarily for voltage and reactive power control. In this paper, a fuzzy-logic-based supplementary controller for the SVC is developed, which is used to damp rotor angle oscillations and improve the transient stability of the power system. Generator speed and electrical power are chosen as the input signals to the fuzzy logic controller (FLC). The effectiveness and feasibility of the proposed control are demonstrated on a single-machine infinite-bus (SMIB) system and a multimachine system (the WSCC system), both of which show improvement over a fixed-parameter controller.
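
A toy sketch of the supplementary controller idea: fuzzify the speed and electrical-power deviations, apply a small rule table, and defuzzify to a correction signal for the SVC. The membership shapes, rule table, and output levels are assumptions, not the paper's tuned design.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12),
                                 (c - x) / (c - b + 1e-12)), 0.0)

def fuzzy_svc(d_speed, d_power):
    # Fuzzify: negative / zero / positive sets for each input.
    mu_w = [tri(d_speed, -2, -1, 0), tri(d_speed, -1, 0, 1), tri(d_speed, 0, 1, 2)]
    mu_p = [tri(d_power, -2, -1, 0), tri(d_power, -1, 0, 1), tri(d_power, 0, 1, 2)]
    out_levels = [-1.0, 0.0, 1.0]                # crisp output singletons
    num = den = 0.0
    for i in range(3):
        for j in range(3):
            w = min(mu_w[i], mu_p[j])            # rule firing strength (AND = min)
            k = max(0, min(2, int(round((i + j) / 2))))   # crude rule table
            num += w * out_levels[k]
            den += w
    return num / den if den else 0.0             # centroid-style defuzzification

print(fuzzy_svc(0.5, -0.2))   # supplementary control signal (normalized)
```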

2D-Modeling with Lego Mindstorms

This work is based on the possibility of using Lego Mindstorms robotic systems to reduce costs. Lego Mindstorms provides a wide variety of hardware components needed to simulate, program, and test robotic systems in practice. The development environment supplied with the kit was used to program the algorithm that scans the space using the ultrasonic sensor, and Matlab was used to render the values after they were measured. The algorithm created for this paper uses theoretical knowledge from the area of signal processing. The data processed by the algorithm are collected by an ultrasonic sensor that scans the 2D space in front of it. The ultrasonic sensor is placed on the moving arm of the robot, which provides the horizontal movement of the sensor; vertical movement is provided by the wheel drive. The robot follows a map in order to position the measured data correctly. Based on the findings, Lego Mindstorms can be considered a low-cost, capable kit for real-time modelling.
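
A hedged sketch of the post-processing step only: converting ultrasonic readings taken during an arm sweep into 2D points in front of the robot. The sweep angles, range values, and geometry below are made up for illustration; in the work itself the readings come from the kit's ultrasonic sensor and the rendering was done in Matlab.

```python
import math

def scan_to_points(angles_deg, distances_cm):
    """Polar (sweep angle, measured range) -> Cartesian (x, y) in cm."""
    pts = []
    for a, d in zip(angles_deg, distances_cm):
        rad = math.radians(a)
        pts.append((d * math.cos(rad), d * math.sin(rad)))
    return pts

# Example sweep from 30 to 150 degrees with fictitious range readings:
angles = range(30, 151, 10)
ranges = [60, 58, 55, 50, 42, 40, 41, 45, 52, 57, 60, 62, 64]
for p in scan_to_points(angles, ranges):
    print(f"({p[0]:6.1f}, {p[1]:6.1f})")
```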

Application of HSA and GA in Optimal Placement of FACTS Devices Considering Voltage Stability and Losses

Voltage collapse is an instability of heavily loaded electric power systems that leads to declining voltages and blackouts. Power systems are predicted to become more heavily loaded in the coming decade, as the demand for electric power rises while economic and environmental concerns limit the construction of new transmission and generation capacity. Heavily loaded power systems are closer to their stability limits, and voltage collapse blackouts will occur if suitable monitoring and control measures are not taken. FACTS devices can be used to control the transmission lines. In this paper, the harmony search algorithm (HSA) and a genetic algorithm (GA) are applied to determine the optimal locations of FACTS devices in a power system so as to improve its stability. Three types of FACTS devices (TCPAT, UPFC, and SVC) are introduced. Bus undervoltage is resolved by controlling the reactive power of the shunt compensator, and a combined series-shunt compensator is also used to control the transmission power flow and the bus voltage simultaneously. Different scenarios are considered: first, the TCPAT, UPFC, and SVC are placed individually in transmission lines and the indices are calculated; then two of the above controllers together attempt to improve the parameters; the last scenario seeks to improve the voltage stability index and the losses by deploying all three types of controller simultaneously. These scenarios are executed on a typical 34-bus test system and yield improvements in the voltage profile and reductions in power losses; they may also permit an increase in power transfer capacity, maximum loading, and voltage stability margin.
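
A minimal harmony search sketch for tuning continuous FACTS settings at candidate locations. The objective here is a stand-in for the paper's combined voltage-stability index and loss measure on the 34-bus system, and all parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
dim, hms, iters = 5, 20, 500
hmcr, par, bw = 0.9, 0.3, 0.05           # memory rate, pitch-adjust rate, bandwidth
lo, hi = -1.0, 1.0

def cost(x):                              # placeholder objective (minimize)
    return np.sum(x ** 2 - 0.5 * np.cos(4 * x))

memory = rng.uniform(lo, hi, (hms, dim))  # harmony memory
costs = np.array([cost(h) for h in memory])
for _ in range(iters):
    # Build a new harmony: per variable, draw from memory or at random.
    new = np.where(rng.random(dim) < hmcr,
                   memory[rng.integers(hms, size=dim), np.arange(dim)],
                   rng.uniform(lo, hi, dim))
    adjust = rng.random(dim) < par        # pitch adjustment
    new = np.clip(new + adjust * rng.uniform(-bw, bw, dim), lo, hi)
    worst = costs.argmax()
    if cost(new) < costs[worst]:          # replace the worst harmony
        memory[worst], costs[worst] = new, cost(new)

print("best settings:", memory[costs.argmin()])
```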

An Efficient Watermarking Method for MP3 Audio Files

In this work we present, to the best of our knowledge for the first time, an efficient digital watermarking scheme for MPEG audio layer 3 (MP3) files that operates directly in the compressed data domain, manipulating the time and subband/channel domains. In addition, it does not need the original signal to detect the watermark. Our scheme was implemented with special care for the efficient use of the two limited resources of computer systems: time and space. It offers the industrial user watermark embedding and detection in time comparable to the real playing time of the original audio file, depending on the MPEG compression, while the end user/audience perceives no artifacts or delays when hearing the watermarked audio file. Furthermore, it overcomes the vulnerability of algorithms operating in the PCM data domain to compression/recompression attacks, as it places the watermark in the scale-factor domain and not in the digitized sound data. The strength of our scheme, which allows it to be used successfully for both authentication and copyright protection, relies on the fact that ownership of the audio file is established not simply by detecting the bit pattern that comprises the watermark itself, but by showing that the legal owner knows a hard-to-compute property of the watermark.
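
A toy illustration only: embedding a bit pattern in an integer array standing in for MP3 scale factors by forcing the parity of selected entries. The real scheme works on actual compressed-domain scale factors, and, per the abstract, its security rests on a hard-to-compute property of the watermark rather than on the mere presence of this bit pattern.

```python
def embed(scalefactors, bits, positions):
    """Force the parity of selected scale-factor entries to match the bits."""
    sf = list(scalefactors)
    for bit, pos in zip(bits, positions):
        if sf[pos] % 2 != bit:
            sf[pos] += 1
    return sf

def detect(scalefactors, positions):
    """Read the embedded bits back as parities; no original file needed."""
    return [scalefactors[p] % 2 for p in positions]

sf = [17, 22, 9, 30, 14, 25, 11, 8]      # fictitious scale-factor values
marked = embed(sf, [1, 0, 1, 1], [0, 2, 4, 6])
print(detect(marked, [0, 2, 4, 6]))      # -> [1, 0, 1, 1]
```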

Combining Fuzzy Logic and Data Mining to Predict the Result of an EIA Review

The purpose of determining impact significance is to place value on impacts. Environmental impact assessment (EIA) review is a process that judges whether impact significance is acceptable, in accordance with the scientific facts regarding the environmental, ecological, and socio-economic impacts described in environmental impact statements (EIS) or environmental impact assessment reports (EIAR). The first aim of this paper is to summarize the criteria of significance evaluation from past review results and, accordingly, to use fuzzy logic to incorporate these criteria into the scientific facts. The second aim is to employ a data mining technique to construct an EIS or EIAR prediction model for review results, which can assist developers in preparing and revising better environmental management plans in advance. The validity of the previous prediction model proposed by the authors in 2009 is 92.7%; the enhanced validity achieved in this study reaches 100.0%.

Image Magnification Using Adaptive Interpolation by Pixel-Level Data-Dependent Geometrical Shapes

The world has entered the 21st century: computer graphics and digital camera technology are prevalent, and high-resolution displays and printers are available. High-resolution images are therefore needed in order to produce high-quality displayed images and high-quality prints. However, since high-resolution images are not usually available, there is a need to magnify the original images. One common difficulty in previous magnification techniques is preserving details, i.e. edges, while at the same time smoothing the data so as not to introduce spurious artefacts; a definitive solution to this is still an open issue. In this paper, an image magnification method using adaptive interpolation by pixel-level data-dependent geometrical shapes is proposed, which tries to take into account information about the edges (sharp luminance variations) and the smoothness of the image. It calculates a threshold, classifies the interpolation region into geometrical shapes, and then assigns suitable values to the undefined pixels inside the interpolation region, while preserving the sharp luminance variations and the smoothness at the same time. The results of the proposed technique have been compared qualitatively and quantitatively with five other techniques. The qualitative results show that the proposed method clearly outperforms nearest-neighbour (NN), bilinear (BL), and bicubic (BC) interpolation, and the quantitative results are competitive and consistent with NN, BL, BC, and the others.
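
A deliberately simplified sketch of the adaptive idea (not the authors' geometrical-shape classification): inside smooth regions a new pixel is the average of its known neighbours; where the neighbour spread exceeds a luminance threshold, the nearest known value is copied so the edge is not blurred. The threshold and 2x factor are assumptions, and float-valued images are assumed.

```python
import numpy as np

def magnify2x(img, thr=30.0):
    h, w = img.shape
    out = np.zeros((2 * h, 2 * w), float)
    out[::2, ::2] = img                          # known pixels on the even grid
    for y in range(2 * h):
        for x in range(2 * w):
            if y % 2 == 0 and x % 2 == 0:
                continue                         # already a known pixel
            ys = [y // 2, min(h - 1, (y + 1) // 2)]
            xs = [x // 2, min(w - 1, (x + 1) // 2)]
            vals = [float(img[a, b]) for a in ys for b in xs]
            if max(vals) - min(vals) > thr:      # sharp luminance variation
                out[y, x] = img[ys[0], xs[0]]    # copy a value from one side
            else:                                # smooth region: average
                out[y, x] = sum(vals) / len(vals)
    return out
```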