Differential Evolution Based Optimal Choice and Location of FACTS Devices in Restructured Power System

This paper deals with the optimal choice and location of FACTS devices in deregulated power systems using the Differential Evolution (DE) algorithm. The main objective is to achieve economic generation allocation and dispatch in a deregulated electricity market. Using the proposed method, the locations of the FACTS devices, their types, and their ratings are optimized simultaneously. Two kinds of FACTS devices, the TCSC and the SVC, are simulated in this study, and their investment costs are also considered. Simulation results validate the capability of the new approach in minimizing the overall system cost function, which includes the investment costs of the FACTS devices and the bid offers of the market participants. The proposed algorithm is an effective and practical method for the choice and location of suitable FACTS devices in a deregulated electricity market.
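As a rough illustration of how such a DE search can be organized, the sketch below optimizes a vector encoding of candidate device placements against a placeholder cost; the encoding, bounds, and objective are assumptions for illustration, not the paper's implementation.

```python
# Illustrative DE/rand/1/bin loop for a FACTS placement search.
# Each candidate vector encodes (location, type, rating) per device,
# relaxed to continuous values; the cost function is a toy stand-in
# for investment cost plus market bid offers.

import numpy as np

rng = np.random.default_rng(0)

def total_cost(candidate):
    """Placeholder objective: replace with device cost + bid offers."""
    return float(np.sum(candidate ** 2))

def differential_evolution(dim, pop_size=30, F=0.5, CR=0.9, iters=200,
                           lo=-1.0, hi=1.0):
    pop = rng.uniform(lo, hi, (pop_size, dim))
    costs = np.array([total_cost(x) for x in pop])
    for _ in range(iters):
        for i in range(pop_size):
            a, b, c = pop[rng.choice(pop_size, 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)    # DE/rand/1 mutation
            cross = rng.random(dim) < CR                 # binomial crossover
            cross[rng.integers(dim)] = True              # keep >= 1 mutant gene
            trial = np.where(cross, mutant, pop[i])
            trial_cost = total_cost(trial)
            if trial_cost < costs[i]:                    # greedy selection
                pop[i], costs[i] = trial, trial_cost
    best = int(np.argmin(costs))
    return pop[best], costs[best]
```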

Optimal Choice and Location of Multi-Type FACTS Devices in Deregulated Electricity Market Using Evolutionary Programming Method

This paper deals with the optimal choice and allocation of multiple FACTS devices in deregulated power systems using the Evolutionary Programming (EP) method. The objective is to achieve economic generation allocation and dispatch in a deregulated electricity market. Using the proposed method, the locations of the FACTS devices, their types, and their ratings are optimized simultaneously. Several kinds of FACTS devices are simulated in this study: the UPFC, TCSC, TCPST, and SVC. Simulation results validate the capability of this approach in minimizing the overall system cost function, which includes the investment costs of the FACTS devices and the bid offers of the market participants. The proposed algorithm is an effective and practical method for the choice and allocation of FACTS devices in a deregulated electricity market environment. The standard IEEE 14-bus test system is simulated in MATLAB to obtain the results.
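For contrast with the DE sketch above, a minimal Evolutionary Programming generation might look as follows; the Gaussian mutation scale, tournament size, and objective are illustrative assumptions, not the paper's settings.

```python
# Illustrative EP generation: each parent yields one offspring by
# Gaussian mutation; survivors are chosen by stochastic pairwise
# tournaments over the combined parent + offspring population.

import numpy as np

rng = np.random.default_rng(1)

def ep_generation(pop, cost_fn, sigma=0.1, q=10):
    offspring = pop + rng.normal(0.0, sigma, pop.shape)  # Gaussian mutation
    combined = np.vstack([pop, offspring])
    costs = np.array([cost_fn(x) for x in combined])
    # Each candidate scores a win against every random opponent it beats.
    wins = np.array([(costs[i] <= costs[rng.choice(len(combined), q)]).sum()
                     for i in range(len(combined))])
    keep = np.argsort(-wins)[:len(pop)]                  # most wins survive
    return combined[keep]
```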

Extension of the Client-Centric Approach under Small Buffer Space

Periodic broadcast is a cost-effective solution for large-scale distribution of popular videos because it guarantees a constant worst-case service latency, regardless of the number of video requests. A fundamental periodic broadcast method is the client-centric approach (CCA), which allows clients to use smaller receiving bandwidth to download broadcast data. An enhanced version, CCA++, was proposed to yield a shorter waiting time. This work further improves CCA++ by reducing client buffer requirements: the new scheme decreases the buffer requirements by as much as 52% compared to CCA++. This study also provides an analytical evaluation to demonstrate the performance advantage over related schemes.
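The constant worst-case latency claim can be made concrete with a small back-of-the-envelope calculation; the doubling segment progression below is an assumption chosen for illustration, not CCA++'s actual schedule.

```python
# Why periodic broadcast gives a request-independent worst-case wait:
# a client can start playback once the first (smallest) segment arrives,
# and that segment restarts once per its own duration.

def worst_case_wait(video_minutes, num_segments, ratio=2.0):
    """Worst-case startup delay = length of the first segment, under an
    assumed geometric segment-size progression (illustrative only)."""
    total = sum(ratio ** i for i in range(num_segments))
    return video_minutes / total

# Example: a 120-minute video cut into 7 doubling segments waits at most
# 120 / (2**7 - 1) ~= 0.94 minutes, no matter how many clients request it.
```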

Comparison of Material Constitutive Models Used in FEA of Low Volume Roads

Probabilistic models used in reliability analyses are an appropriate and progressive tool for analyzing the behavior of low volume roads. A necessary part of the probabilistic model is a deterministic model of structural behavior. The FE model of low volume roads is created in the ANSYS software. It is able to determine the state of stress and deformation at any point of the structure and thus generate the data required for the reliability analysis. The paper compares two material constitutive models used for modeling the unbound, non-homogeneous materials used in low volume roads. The first is the linear elastic model according to Hooke's law (H model); the second is the nonlinear elastic-plastic Drucker-Prager model (D-P model).
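For reference, the D-P yield surface is commonly written in terms of stress invariants; a standard form (notation may differ from the paper's) is

$$ F = \alpha I_1 + \sqrt{J_2} - k = 0, $$

where $I_1$ is the first invariant of the stress tensor, $J_2$ is the second invariant of the deviatoric stress tensor, and $\alpha$ and $k$ are material constants related to the friction angle and the cohesion.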

A General Mandatory Access Control Framework in Distributed Environments

In this paper, we propose a general mandatory access control framework for distributed systems. The framework can be applied to multiple operating systems and can handle multiple stakeholders. Despite considerable advancements in the area of mandatory access control, existing enforcement approaches are typically tied to one specific operating system. Unlike the PC market, in which Windows holds an overwhelming share, the emerging smartphone environment features a number of popular operating systems, e.g., Android, Windows Mobile, Symbian, and RIM. It should also be noted that more and more stakeholders are involved in smartphone software, such as device owners, service providers, and application providers. Our framework consists of three parts: the local decision layer, the middle layer, and the remote decision layer. The middle layer takes charge of managing security contexts, the OS API, operations, and policy combination. Thanks to the middle layer, the design of the remote decision layer does not depend on any particular operating system. We implement the framework on Windows, Linux, and other popular embedded systems.
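A minimal sketch of the three-layer decision flow, assuming a simple allow/deny interface; all names and the deny-overrides combination rule are illustrative assumptions, not the paper's API.

```python
# Hypothetical three-layer MAC decision flow. The middle layer
# normalizes OS-specific requests into a common format and combines
# verdicts, so the remote layer stays OS-agnostic by construction.

from dataclasses import dataclass

@dataclass
class AccessRequest:
    subject: str   # security context of the caller
    obj: str       # resource identifier
    action: str    # e.g. "read", "write", "execute"

class LocalDecisionLayer:
    def __init__(self, local_policy):
        self.local_policy = local_policy   # {(subject, obj, action): bool}
    def decide(self, req):
        return self.local_policy.get((req.subject, req.obj, req.action), False)

class RemoteDecisionLayer:
    """One policy callable per stakeholder (owner, service/app provider)."""
    def __init__(self, stakeholder_policies):
        self.policies = stakeholder_policies
    def decide(self, req):
        # Deny-overrides combination: every stakeholder must permit.
        return all(p(req) for p in self.policies)

class MiddleLayer:
    """Combines the local and remote verdicts for a normalized request."""
    def __init__(self, local, remote):
        self.local, self.remote = local, remote
    def check(self, req):
        return self.local.decide(req) and self.remote.decide(req)
```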

Computer Simulation of Low Volume Roads Made from Recycled Materials

Low volume roads are widely used all over the world. To improve their quality, computer simulation of their behavior is proposed. The FEM model makes it possible to determine stress and displacement conditions in the pavement and also in the individual material layers. Different variants of pavement layers, materials used, humidity, and loading conditions can be studied. Among other inputs, information about the material properties of the individual layers made from recycled materials is crucial for obtaining results that are as exact as possible. For this purpose, cyclic-load triaxial testing of the materials is a promising method. The test is able to simulate real traffic loading on particular materials, taking into account the changes in horizontal stress produced in particular layers by crossing vehicles. The test specimen can also be prepared with different amounts of water. Thus the modulus of elasticity (Young's modulus) of different materials, including recycled ones, can be measured under different horizontal and vertical stresses as well as under different humidity conditions. Using the proposed testing procedure, the modulus of elasticity of recycled materials used in a newly built low volume road is obtained under different stress and humidity conditions set to standard, dry, and fully saturated levels. The obtained values of the modulus of elasticity are used in the FEA.
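In cyclic-load triaxial testing, the stiffness recovered from each load cycle is typically reported as the resilient modulus; the standard definition is

$$ M_r = \frac{\sigma_d}{\varepsilon_r}, $$

where $\sigma_d$ is the applied cyclic deviator stress and $\varepsilon_r$ is the recoverable (resilient) axial strain.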

Dynamic Safety-Stock Calculation

In order to ensure a high service level, industrial enterprises have to maintain safety stock, which at the same time directly influences economic efficiency. This paper analyses established mathematical methods for calculating safety stock. To this end, performance, measured in stock and service level, is appraised and the limits of several methods are identified. Afterwards, a new dynamic approach is presented that provides a comprehensive method for calculating safety stock, one that also takes knowledge of future volatility into account.
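A common static baseline against which dynamic methods are judged is the textbook safety-stock formula (standard inventory theory, not this paper's proposal):

$$ SS = z_{\alpha}\,\sigma_D\,\sqrt{L}, $$

where $z_{\alpha}$ is the safety factor for the target service level $\alpha$, $\sigma_D$ is the standard deviation of demand per period, and $L$ is the replenishment lead time in periods. A dynamic approach essentially replaces the constant $\sigma_D$ with a time-varying estimate of future volatility.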

Investigation on Nanoparticle Velocity in Two-Phase Approach

A numerical investigation of the generality of the nanoparticle velocity equation was carried out based on previously published work. The three-dimensional governing equations (continuity, momentum, and energy) were solved using the finite volume method (FVM). A parametric study of thermal performance comparing pure-water cooling and nanofluid cooling was conducted for volume fractions in the range of 1% to 4% and a gamma-Al2O3 nanofluid at Reynolds numbers from 67.41 to 286.77. The nanofluid is modeled using both single-phase and two-phase approaches. Three different existing Brownian motion velocity equations are applied to compare the generality of the equation over a wide range of parametric conditions. The deviation between the Brownian motion velocities is identified as being due to the different definitions of the mean free path and the constant values used in the diffusion equation.
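Brownian motion velocity expressions of this kind are typically built on the Stokes-Einstein Brownian diffusion coefficient, a standard relation to which the diffusion equation mentioned above likely refers:

$$ D_B = \frac{k_B T}{3\pi \mu\, d_p}, $$

where $k_B$ is Boltzmann's constant, $T$ is the fluid temperature, $\mu$ is the dynamic viscosity of the base fluid, and $d_p$ is the nanoparticle diameter.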

Economic Factorial Analysis of CO2 Emissions: The Divisia Index with Interconnected Factors Approach

This paper presents a method of economic factorial analysis of CO2 emissions based on an extension of the Divisia index to interconnected factors. Unlike the Kaya identity, this approach treats the three main factors of CO2 emissions (gross domestic product, energy consumption, and population) as equally important and allows all of them to be accounted for simultaneously. The three factors are included in the analysis together with their carbon intensities, which yields a comprehensive picture of the change in CO2 emissions. A computer program in the R language, available for free download, automates the calculations. A case study of U.S. carbon dioxide emissions is used as an example.
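For comparison, the Kaya identity against which the proposed decomposition is contrasted expresses emissions as a chain of multiplicative factors:

$$ \mathrm{CO_2} = P \times \frac{GDP}{P} \times \frac{E}{GDP} \times \frac{\mathrm{CO_2}}{E}, $$

where $P$ is population, $GDP$ is gross domestic product, and $E$ is energy consumption; the Divisia-based approach instead treats $P$, $GDP$, and $E$ symmetrically.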

Derivation of Monotone Likelihood Ratio Using Two-Sided Uniformly Normal Distribution Techniques

In this paper, two-sided uniformly normal distribution techniques were used in the derivation of the monotone likelihood ratio. The approach mainly employs the parameters of the distribution for the class of all tests of size α. The derivation technique is fast, direct, and less burdensome when compared to some existing methods.
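Recall the standard definition being derived: a family of densities $\{f(x;\theta)\}$ has monotone likelihood ratio in a statistic $T(x)$ if, for every $\theta_1 < \theta_2$, the ratio

$$ \frac{f(x;\theta_2)}{f(x;\theta_1)} $$

is a nondecreasing function of $T(x)$.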

Digitization of Television Broadcasting in Nigeria: A Review

Information and Communication Technology (ICT) has opened up new and robust ways of sending and receiving information at the global level. Any type of information, including voice and video, can be sent to diverse publics, who in turn have a variety of choices. Thus, the development of any nation is tied to efficient information dissemination. In Nigeria, television broadcasting started in 1959 with the establishment of the Western Nigeria Television (WNTV) by the opposition leader, Chief Obafemi Awolowo. Later on, the government took over the station and fully controlled it. Subsequently, regional stations were opened to propagate government policies and programs. The television industry in Nigeria has continued to grow in viewership and in number of stations, with over fifty national television stations and twenty-five private ones. Existing documents on the digitization of the television broadcasting industry and related literature were used as the main sources of information. This paper analyses the efforts being made by the Nigerian government, through its ICT policy, towards the digitization of its television broadcasting in order to keep pace with the global trend. Recommendations are proffered with a view to achieving the target goal.

Backcalculation of HMA Stiffness Based on a Finite Element Model

The stiffness of Hot Mix Asphalt (HMA) in flexible pavement is largely dependent on temperature, mode of testing, and age of the pavement. Accurate measurement of HMA stiffness is thus quite challenging. This study determines HMA stiffness based on a Finite Element Model (FEM) and validates the results using field data. As a first step, the stiffnesses of the different layers of a pavement section on Interstate 40 (I-40) in New Mexico were determined by the Falling Weight Deflectometer (FWD) test; pavement temperature was not measured at that time due to the lack of a temperature probe. Secondly, an FE model was developed in ABAQUS. The stiffnesses of the base, subbase, and subgrade were taken from the FWD test output obtained in the first step. As HMA stiffness varies strongly with temperature, it was assigned by a trial-and-error approach. Thirdly, the horizontal strain and vertical stress at the bottom of the HMA, and the temperature at different depths of the pavement, were measured with installed sensors throughout the day on December 25, 2012. Fourthly, the FEM outputs were correlated with the measured stress-strain responses. After a number of trials, a relationship was developed between the trial stiffness of the HMA and the measured mid-depth HMA temperature. Finally, the obtained relationship between stiffness and temperature was verified by a further FWD test in which the pavement temperature was recorded. A promising agreement between them is observed. Therefore, it can be concluded that a linear elastic FEM can accurately predict the stiffness and the structural response of flexible pavement.
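The trial-and-error step can be pictured with the following sketch; the exponential stiffness-temperature form and the surrogate for the FE run are assumptions for illustration, not the paper's ABAQUS procedure.

```python
# Illustrative backcalculation loop: vary the trial HMA stiffness until
# the FE-predicted strain matches the sensor measurement, then regress
# stiffness on mid-depth temperature (hypothetical names throughout).

import numpy as np

def fe_strain(stiffness_mpa, load=40.0):
    """Toy surrogate for one FE run: strain falls as stiffness rises.
    In the actual study this would be an ABAQUS simulation."""
    return load / stiffness_mpa * 1e3   # microstrain, illustrative only

def backcalculate(measured_strain, candidates=np.linspace(2000, 20000, 100)):
    # Trial-and-error: keep the stiffness whose FE strain fits best.
    errors = [abs(fe_strain(E) - measured_strain) for E in candidates]
    return candidates[int(np.argmin(errors))]

def fit_stiffness_temperature(temps_c, measured_strains):
    # Assume E = a * exp(-b * T), a common HMA stiffness-temperature form.
    E = np.array([backcalculate(s) for s in measured_strains])
    slope, intercept = np.polyfit(np.asarray(temps_c, float), np.log(E), 1)
    return np.exp(intercept), -slope    # parameters (a, b)
```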

Parallel Priority Region Approach to Detect Background

Background detection is essential in video analysis, and optimization is often needed in order to achieve real-time calculation. Information gathered by dual cameras placed at the front and rear of an Autonomous Vehicle (AV) is integrated for background detection. In this paper, real-time calculation is achieved by using Priority Regions (PR) and parallel processing together: each frame is divided into regions, and each region is then processed in parallel. The PR division depends upon driver view limitations. A background detection system is built on Temporal Difference (TD) and Gaussian Filtering (GF), where the TD thresholds and the GF sigma (weight) values are chosen per region based on the PR characteristics. The experimental results were obtained on real scenes. The speed and accuracy are compared with traditional background detection techniques, and the effectiveness of PR and parallel processing is also discussed in this paper.
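A minimal sketch of per-region temporal differencing in parallel is given below; the region layout, thresholds, and sigma values are assumptions standing in for the paper's priority-region parameters.

```python
# Per-region temporal difference with Gaussian smoothing, processed in
# parallel; high-priority regions would receive lower thresholds.

from concurrent.futures import ThreadPoolExecutor
import numpy as np
from scipy.ndimage import gaussian_filter

def detect_region(prev_region, cur_region, threshold, sigma):
    # Gaussian smoothing, then temporal difference against a threshold.
    diff = np.abs(gaussian_filter(cur_region.astype(float), sigma)
                  - gaussian_filter(prev_region.astype(float), sigma))
    return diff > threshold          # True where a pixel is foreground

def detect_frame(prev_frame, cur_frame, regions, params):
    """regions: list of (row_slice, col_slice) for grayscale frames;
    params: one (threshold, sigma) pair per region."""
    with ThreadPoolExecutor() as pool:
        masks = list(pool.map(
            lambda rp: detect_region(prev_frame[rp[0]], cur_frame[rp[0]],
                                     *rp[1]),
            zip(regions, params)))
    out = np.zeros(cur_frame.shape, dtype=bool)
    for region, mask in zip(regions, masks):
        out[region] = mask
    return out
```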

The Effect of Motor Learning Based Computer-Assisted Practice for Children with Handwriting Deficit: A Comparison with the Traditional Sensorimotor Approach

The objective of this study was to test how advanced digital technology enables more effective handwriting training for children with handwriting deficits. This study incorporated graphomotor apparatuses into a computer-assisted instruction system. Experiments verifying the intervention effect were conducted as a randomized controlled trial. Forty-two children with handwriting deficits were assigned to a computer-assisted instruction group, a sensorimotor training group, or a control (no intervention) group. Handwriting performance was measured using the Elementary reading/writing test and a computerized handwriting evaluation before and after 6 weeks of intervention. Analyses of variance of the change scores were conducted to test for statistically significant differences across the three groups. A significant difference was found among the three groups, with the computer group differing significantly from the other two. Significant effects were found in near-point copy, far-point copy, the dictation test, and writing from phonetic symbols. Writing speed and mean stroke velocity in near-point, far-point, and short-paragraph copy also differed significantly among the three groups, with the computer group showing significant improvement over the other groups. For clinicians and school teachers, the results of this study provide a motor-control-based insight into the improvement of handwriting difficulties.

Multimodal Biometric Authentication Using Choquet Integral and Genetic Algorithm

The Choquet integral is a tool for information fusion that is very effective when the fuzzy measures associated with it are well chosen. In this paper, we propose a new approach for calculating the fuzzy measures associated with the Choquet integral in a context of data fusion in multimodal biometrics. The proposed approach is based on genetic algorithms. It has been validated on two databases: the first contains synthetic scores, and the second contains biometric scores relating to the face, fingerprint, and palmprint. The results achieved attest to the robustness of the proposed approach.
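For reference, the discrete Choquet integral of scores $x_1,\dots,x_n$ with respect to a fuzzy measure $\mu$ is (standard definition):

$$ C_\mu(x) = \sum_{i=1}^{n} \left( x_{(i)} - x_{(i-1)} \right) \mu(A_{(i)}), $$

where $x_{(1)} \le \dots \le x_{(n)}$ is the sorted score sequence, $x_{(0)} = 0$, and $A_{(i)}$ is the set of sources whose scores are at least $x_{(i)}$; the genetic algorithm searches over the values that $\mu$ assigns to these subsets.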

Adaptive Score Normalization: A Novel Approach for Multimodal Biometric Systems

Multimodal biometric systems integrate the data presented by multiple biometric sources, hence offering better performance than systems based on a single biometric modality. Although the coupling of biometric systems can be done at different levels, fusion at the score level is the most common, since it has proven more effective than fusion at the other levels. However, the scores from different modalities are generally heterogeneous, so a normalization step is needed to transform them into a common domain before combining them. In this paper, we study the performance of several normalization techniques with various fusion methods in a context relating to the fusion of three unimodal systems based on the face, the palmprint, and the fingerprint. We also propose a new adaptive normalization method that takes into account the distributions of client scores and impostor scores. Experiments conducted on a database of 100 people show that the performance of a multimodal system depends on the choice of the normalization method and the fusion technique. The proposed normalization method gave the best results.
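Two of the classical techniques typically entering such comparisons are min-max and z-score normalization (standard definitions; the proposed adaptive method modifies this step using the client and impostor score distributions):

$$ s' = \frac{s - \min(S)}{\max(S) - \min(S)}, \qquad s' = \frac{s - \mu_S}{\sigma_S}, $$

where $S$ is the set of scores produced by one matcher, with mean $\mu_S$ and standard deviation $\sigma_S$.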

Identification of Single Nucleotide Polymorphism in 5'-UTR of CYP11B1 Gene in Pakistani Sahiwal Cattle

A major goal in animal genetics is to understand the role of common genetic variants in disease susceptibility and production traits. Sahiwal cattle can be considered a global animal genetic resource due to their relatively high milk-producing ability, resistance to tropical diseases, and heat tolerance. The CYP11B1 gene provides instructions for making a mitochondrial enzyme called steroid 11-beta-hydroxylase, which catalyzes the conversion of 11-deoxycortisol to cortisol and 11-deoxycorticosterone to corticosterone in cattle. The bovine CYP11B1 gene is positioned on BTA14q12 and comprises eight introns and nine exons, and its protein is associated with the mitochondrial membrane. The present study aimed to identify single-nucleotide polymorphisms in the CYP11B1 gene in the Sahiwal cattle breed of Pakistan. Four polymorphic sites were identified in exon one of the CYP11B1 gene through a sequencing approach. A significant finding was the incidence of the C→T polymorphism in the 5'-UTR, causing an amino acid substitution from alanine to valine (A30V) in the Sahiwal cattle breed. This Ala/Val polymorphism may serve as a powerful genetic tool for the development of DNA markers that can be used for particular traits in different local cattle breeds.

A Deterministic Dynamic Programming Approach for Optimization Problems with a Quadratic Objective Function and Linear Constraints

This paper presents a novel deterministic dynamic programming approach for solving optimization problems with a quadratic objective function subject to linear equality and inequality constraints. The proposed method employs backward recursion, in which computation proceeds from the last stage to the first stage of a multi-stage decision problem. A generalized recursive equation which gives the exact solution of the optimization problem is derived in this paper. The method is purely analytical and avoids the need for an initial solution. The feasibility of the proposed method is demonstrated with a practical example. The numerical results show that the proposed method provides the global optimum solution with negligible computation time.
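The backward recursion underlying such methods takes the standard Bellman form (a generic statement; the paper derives a closed-form version of it for the quadratic case):

$$ f_n(s_n) = \min_{x_n} \bigl\{ c_n(s_n, x_n) + f_{n+1}\bigl(T_n(s_n, x_n)\bigr) \bigr\}, \qquad f_{N+1} \equiv 0, $$

evaluated from the last stage $N$ back to the first, where $s_n$ is the state, $x_n$ the decision, $c_n$ the stage cost, and $T_n$ the state transition.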

Investigation of Hydraulic and Thermal Performances of Fin Array at Different Shield Positions without By-Pass

In heat sinks, the flow within the core exhibits separation and hence does not lend itself to simple analytical boundary layer or duct flow analysis of the wall friction. In this paper, we present findings from an experimental and numerical study aimed at obtaining physical insight into the influence of the presence and position of a shield on the hydraulic and thermal performance of a square pin-fin heat sink without top bypass. The variations of the Nusselt number and friction factor are obtained for varied parameters, such as the Reynolds number and the shield position. The numerical code is validated by comparing the numerical results with the available experimental data, and the temperature predictions based on the model agree well with the experimental data. The results show that, in the presence of the shield, the heat transfer of the fin array is enhanced and the flow resistance is increased. The surface temperature distribution of the heat sink base is more uniform when the dimensionless shield position equals 1/3 or 2/3. A comprehensive performance evaluation approach based on the identical pumping power criterion is adopted and shows that the optimum shield position is at x/l = 0.43.
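Comprehensive evaluation under identical pumping power is commonly expressed through a thermal performance factor of the form (a standard criterion; the paper's exact definition may differ):

$$ \eta = \frac{Nu/Nu_0}{\left( f/f_0 \right)^{1/3}}, $$

where $Nu_0$ and $f_0$ are the Nusselt number and friction factor of the reference (unshielded) configuration.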

Clustering of Variables Based On a Probabilistic Approach Defined on the Hypersphere

We consider n individuals described by p standardized variables, represented by points on the surface of the unit hypersphere S^(n-1). For a given choice of n individuals, we suppose that the set of observed variables comes from a mixture of bipolar Watson distributions defined on the hypersphere. The EM and Dynamic Clusters algorithms are used to identify this mixture. We obtain parameter estimates for each Watson component and then a partition of the set of variables into homogeneous groups. Additionally, we present a factor analysis model in which the unobservable factors are precisely the maximum likelihood estimators of the Watson directional parameters, namely the first principal component of the data matrix associated with each previously identified group. This alternative model yields directly interpretable solutions (simple structure), avoiding factor rotations.
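For reference, the bipolar Watson density on the unit hypersphere has the standard form

$$ f(x;\mu,\kappa) = c(\kappa)\, \exp\!\bigl( \kappa\, (\mu^{\top} x)^{2} \bigr), \qquad \|x\| = \|\mu\| = 1, $$

with concentration $\kappa > 0$ in the bipolar case and normalizing constant $c(\kappa)$; the directional parameter $\mu$ of each mixture component plays the role of the corresponding group's first principal component.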