A New Method for Contour Approximation Using Basic Ramer Idea

This paper presents two new efficient algorithms for contour approximation. The proposed algorithm is compared with the Ramer (good quality), Triangle (faster) and Trapezoid (fastest) methods, which are briefly described. The Cartesian coordinates of an input contour are processed so that the contour is finally represented by a set of selected vertices of its edge. The paper outlines the main idea of the analyzed procedures for contour compression. For comparison, the mean square error and signal-to-noise ratio criteria are used. The computational time of the analyzed methods is estimated from the number of numerical operations. Experimental results are reported in terms of image quality, compression ratio, and speed. The main advantage of the analyzed algorithm is the small number of arithmetic operations compared to existing algorithms.
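
As a point of reference for the basic Ramer idea the paper builds on, the sketch below shows a minimal recursive Ramer-Douglas-Peucker simplification in Python. It is a generic textbook version with an assumed tolerance parameter `epsilon`, not the authors' optimized variant.

```python
import numpy as np

def rdp(points, epsilon):
    """Recursive Ramer-Douglas-Peucker polyline simplification.

    points  : (N, 2) array of contour vertices in order
    epsilon : maximum allowed perpendicular deviation
    Returns the subset of vertices kept by the approximation.
    """
    points = np.asarray(points, dtype=float)
    if len(points) < 3:
        return points

    start, end = points[0], points[-1]
    chord = end - start
    chord_len = np.hypot(*chord)
    if chord_len == 0:                       # degenerate chord: fall back to point distance
        dists = np.hypot(*(points - start).T)
    else:                                    # perpendicular distance to the chord start-end
        dists = np.abs(np.cross(chord, start - points)) / chord_len

    idx = int(np.argmax(dists))
    if dists[idx] > epsilon:
        # Split at the farthest vertex and simplify both halves.
        left = rdp(points[: idx + 1], epsilon)
        right = rdp(points[idx:], epsilon)
        return np.vstack([left[:-1], right])
    # All interior vertices are within tolerance: keep only the endpoints.
    return np.vstack([start, end])

# Example: a noisy arc is reduced to a handful of dominant vertices.
t = np.linspace(0, np.pi, 200)
contour = np.c_[np.cos(t), np.sin(t)] + 0.01 * np.random.randn(200, 2)
approx = rdp(contour, epsilon=0.05)
print(len(contour), "->", len(approx), "vertices")
```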

Noise Removal from Surface Respiratory EMG Signal

The aim of this study was to remove the two principal noise sources that disturb the surface electromyography (diaphragm) signal: the electrocardiogram (ECG) artefact and the power line interference artefact. The proposed algorithm is based on a new Least Mean Square (LMS) Widrow adaptive structure. Such structures require a reference signal that is correlated with the noise contaminating the signal. The noise references are extracted as follows: for the power line interference, a reference is constructed mathematically from two cosine functions, at 50 Hz (the fundamental) and 150 Hz (the first harmonic); for the ECG artefact, the reference is estimated with a matching pursuit technique combined with an LMS structure. Both removal procedures are achieved without the use of supplementary electrodes. These filtering techniques are validated on real records of the surface diaphragm electromyography signal. The performance of the proposed methods was compared with previously published results.
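
To make the adaptive-cancellation idea concrete, the following is a minimal sketch of a Widrow-style LMS noise canceller using a mathematically constructed 50 Hz / 150 Hz cosine reference. The sampling rate, filter order and step size are illustrative assumptions, and the matching-pursuit ECG reference is not reproduced here.

```python
import numpy as np

def lms_cancel(primary, reference, mu=0.01, order=8):
    """Basic Widrow LMS adaptive noise canceller.

    primary   : EMG channel contaminated by interference
    reference : signal correlated with the interference only
    Returns the cleaned signal (the error of the adaptive filter).
    """
    n = len(primary)
    w = np.zeros(order)
    cleaned = np.zeros(n)
    for k in range(order, n):
        x = reference[k - order + 1:k + 1][::-1]   # most recent samples first
        y = w @ x                                  # interference estimate
        e = primary[k] - y                         # EMG estimate = cancellation error
        w += 2 * mu * e * x                        # LMS weight update
        cleaned[k] = e
    return cleaned

# Hypothetical test: EMG-like noise plus 50 Hz and 150 Hz mains interference.
fs = 1000.0
t = np.arange(0, 2.0, 1 / fs)
emg = 0.2 * np.random.randn(len(t))
mains = 0.5 * np.cos(2 * np.pi * 50 * t) + 0.2 * np.cos(2 * np.pi * 150 * t)
contaminated = emg + mains

# Mathematically constructed reference: the 50 Hz fundamental plus the 150 Hz component.
reference = np.cos(2 * np.pi * 50 * t) + np.cos(2 * np.pi * 150 * t)
cleaned = lms_cancel(contaminated, reference, mu=0.005, order=16)
```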

Studies on Automatic Measurement Technology for the Surface Braided Angle of Three-Dimensional Braided Composite Material Preforms

This paper describes a new measuring algorithm for three-dimensional (3-D) braided composite materials. The braided angle is an important parameter of braided composites, and the objective of this paper is to present an automatic system for measuring it. The algorithm is implemented in VC++ 6.0 on a PC. An advanced filtering algorithm for images of 3-D braided composite material preforms has been developed. The procedure is completely automatic and relies on the gray-scale information content of the images and their local wavelet transform modulus maxima. Experimental results show that the proposed method is feasible. The algorithm was tested on both carbon-fiber and glass-fiber preforms.

Multi-Line Power Flow Control using Interline Power Flow Controller (IPFC) in Power Transmission Systems

The interline power flow controller (IPFC) is one of the latest-generation flexible AC transmission systems (FACTS) controllers, used to control power flows in multiple transmission lines. This paper presents a mathematical model of the IPFC, termed the power injection model (PIM). The model is incorporated into a Newton-Raphson (NR) power flow algorithm to study power flow control in the transmission lines in which the IPFC is placed. A MATLAB program has been written to extend the conventional NR algorithm with this model. Numerical results are obtained for a standard 2-machine, 5-bus system. The results with and without the IPFC are compared in terms of voltages and active and reactive power flows to demonstrate the performance of the IPFC model.
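
As a rough illustration of how an injection model enters a Newton-Raphson power flow, the sketch below shows a generic NR iteration with a numerical Jacobian on a toy one-unknown example; it is not the authors' MATLAB implementation or the 2-machine, 5-bus test system.

```python
import numpy as np

def newton_raphson(mismatch, x0, tol=1e-6, max_iter=20):
    """Generic Newton-Raphson iteration with a numerical Jacobian.

    mismatch : function returning the power mismatch vector F(x);
               in a power flow study x holds bus voltage angles/magnitudes,
               and an injection model such as the IPFC PIM simply adds its
               injected powers to the ordinary bus mismatch equations.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        f = mismatch(x)
        if np.max(np.abs(f)) < tol:
            break
        # Finite-difference Jacobian (a production power flow uses analytic terms).
        J = np.zeros((len(f), len(x)))
        h = 1e-6
        for j in range(len(x)):
            xp = x.copy()
            xp[j] += h
            J[:, j] = (mismatch(xp) - f) / h
        x = x - np.linalg.solve(J, f)
    return x

# Toy two-bus illustration (hypothetical values, not the 5-bus test system):
# one unknown angle, P mismatch = P_scheduled - V1*V2*B*sin(theta).
P_sched, V1, V2, B = 0.8, 1.0, 1.0, 5.0
theta = newton_raphson(lambda x: np.array([P_sched - V1 * V2 * B * np.sin(x[0])]),
                       x0=[0.0])
print("bus angle (rad):", theta[0])
```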

Flexible Heuristics for Project Scheduling with Limited Resources

Resource-constrained project scheduling is an NP-hard optimisation problem. There are many heuristic strategies for shifting activities in time when resource requirements exceed the available amounts; these strategies are frequently based on activity priorities. In this paper, we assume that a suitable heuristic has been chosen to decide which activities should be performed immediately and which should be postponed, and we investigate the resource-constrained project scheduling problem (RCPSP) from the implementation point of view. We propose an efficient routine that, instead of shifting activities, extends their duration, making it possible to break the duration down into active and sleeping subintervals. The classical Critical Path Method, which needs only polynomial running time, can then be applied. This algorithm can easily be adapted for multi-project scheduling with limited resources.
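
A minimal sketch of the classical Critical Path Method (forward and backward pass) on which the proposed routine relies is given below; the active/sleeping subinterval extension itself is not reproduced, and the example network is purely illustrative.

```python
from collections import defaultdict

def critical_path(durations, predecessors):
    """Classical Critical Path Method (forward and backward pass).

    durations    : {activity: duration}
    predecessors : {activity: list of immediate predecessors}
    Activities must be listed so that every predecessor appears before
    its successors. Returns earliest start, latest start and the makespan.
    """
    # Forward pass: earliest start/finish.
    es, ef = {}, {}
    for a in durations:
        es[a] = max((ef[p] for p in predecessors.get(a, [])), default=0)
        ef[a] = es[a] + durations[a]
    makespan = max(ef.values())

    # Backward pass: latest finish/start.
    successors = defaultdict(list)
    for a, preds in predecessors.items():
        for p in preds:
            successors[p].append(a)
    lf, ls = {}, {}
    for a in reversed(list(durations)):
        lf[a] = min((ls[s] for s in successors[a]), default=makespan)
        ls[a] = lf[a] - durations[a]
    return es, ls, makespan

# Small example network.
dur = {"A": 3, "B": 2, "C": 4, "D": 1}
pred = {"B": ["A"], "C": ["A"], "D": ["B", "C"]}
es, ls, makespan = critical_path(dur, pred)
print("makespan:", makespan)                            # 8
print("critical:", [a for a in dur if es[a] == ls[a]])  # ['A', 'C', 'D']
```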

3D Locomotion and Fractal Analysis of Goldfish for Acute Toxicity Bioassay

Biological reactions of individual test animals to a toxic substance are unique and can be used as an indication of the presence of that substance. However, distinguishing such phenomena requires a very complicated system, and analysing the data in three dimensions is even more complicated. In this paper, a system is introduced that evaluates in vivo biological responses to acute toxicity by applying fractal analysis to the stochastic, self-affine, non-stationary signal of 3D goldfish swimming. Regular digital camcorders are used by the proposed 3DCCPC algorithm to effectively capture and reconstruct the 3D movements of the fish. The Critical Exponent Method (CEM) is adopted as the fractal estimator. The hypothesis was that the swimming of goldfish exposed to an acute toxicant would show a fractal property related to the toxicant concentration. The experimental results supported the hypothesis, showing that the swimming of goldfish under different toxicant concentrations has fractal properties. They also show that the fractal dimension of the swimming is related to the pH value as FD ≈ 0.26 pH + 0.05. With the proposed system, the fish is allowed to swim freely in all directions while reacting to the toxicant. In addition, the trajectories are precisely evaluated by fractal analysis with the critical exponent method, and hence the results are obtained with a much higher degree of confidence.
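
For readers unfamiliar with fractal-dimension estimation of trajectories, the sketch below uses simple box counting on a synthetic 3D path; the paper itself uses the Critical Exponent Method, which is not reproduced here, and the data are random placeholders.

```python
import numpy as np

def box_counting_dimension(trajectory, scales):
    """Estimate the fractal dimension of a 3-D trajectory by box counting.

    trajectory : (N, 3) array of x, y, z positions
    scales     : iterable of box edge lengths
    Returns the slope of log(count) versus log(1/scale).
    """
    traj = np.asarray(trajectory, dtype=float)
    traj = traj - traj.min(axis=0)               # shift into the positive octant
    counts = []
    for s in scales:
        boxes = np.floor(traj / s).astype(int)   # box index of every sample
        counts.append(len({tuple(b) for b in boxes}))
    slope, _ = np.polyfit(np.log(1.0 / np.array(scales)), np.log(counts), 1)
    return slope

# Example with a synthetic random-walk "swimming" path.
steps = np.random.randn(5000, 3).cumsum(axis=0)
fd = box_counting_dimension(steps, scales=[1, 2, 4, 8, 16])
print("estimated fractal dimension:", round(fd, 2))
```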

Issues in Travel Demand Forecasting

Travel demand forecasting, which comprises the four travel choices of trip generation, trip distribution, modal split and traffic assignment, constitutes the core of transportation planning. In its current application, travel demand forecasting is associated with three important issues: interface inconsistencies among the four travel choices, inefficiency of commonly used solution algorithms, and undesirable multiple-path solutions. In this paper, each of the three issues is elaborated in detail. An ideal unified framework for a combined model consisting of the four travel choices and variable demand functions is also suggested. A few concluding remarks are provided at the end of the paper.

Alternative to M-Estimates in Multisensor Data Fusion

This work addresses the problem of multisensor data fusion under non-Gaussian channel noise. M-estimates are known to be a robust solution, but they trade off some accuracy. In order to improve the estimation accuracy while maintaining equivalent robustness, a two-stage robust fusion algorithm is proposed that first performs a preliminary rejection of outliers and then applies an optimal linear fusion. Numerical experiments show that the proposed algorithm is equivalent to the M-estimates in the case of uncorrelated local estimates and significantly outperforms the M-estimates when the local estimates are correlated.
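
A minimal sketch of the two-stage idea follows, assuming scalar, uncorrelated local estimates and a median/MAD rejection rule; the paper's actual rejection criterion and its treatment of correlated estimates may differ.

```python
import numpy as np

def two_stage_fusion(estimates, variances, k=3.0):
    """Two-stage robust fusion sketch: reject outliers, then fuse linearly.

    estimates : local sensor estimates of the same scalar quantity
    variances : their reported error variances
    Stage 1 rejects estimates farther than k robust deviations from the
    median; stage 2 applies the usual inverse-variance (optimal linear) fusion.
    """
    x = np.asarray(estimates, dtype=float)
    v = np.asarray(variances, dtype=float)

    # Stage 1: preliminary rejection of outliers via median / MAD.
    med = np.median(x)
    mad = 1.4826 * np.median(np.abs(x - med)) + 1e-12
    keep = np.abs(x - med) <= k * mad

    # Stage 2: optimal linear fusion of the surviving estimates.
    w = 1.0 / v[keep]
    fused = np.sum(w * x[keep]) / np.sum(w)
    fused_var = 1.0 / np.sum(w)
    return fused, fused_var

# Five sensors, one of which is corrupted by heavy-tailed channel noise.
est = [10.1, 9.9, 10.2, 10.0, 17.5]
var = [0.04, 0.05, 0.04, 0.06, 0.04]
print(two_stage_fusion(est, var))
```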

Design of DC Voltage Control for D-STATCOM

This paper presents the design of the DC voltage control of a D-STATCOM when the D-STATCOM is used for load voltage regulation. Although the DC voltage can be controlled by the active current of the D-STATCOM, the reactive current still affects the DC voltage. To eliminate this effect, a control strategy that compensates for the reactive current is proposed, and results with and without this compensation are compared. The proportional and integral gains of the PI controllers are obtained with the symmetrical optimum method and with genetic algorithms. The stability margins of these methods are obtained and discussed in detail. In addition, the performance of the DC voltage control based on the symmetrical optimum and genetic algorithm methods is compared. The effectiveness of the designed controllers was verified through computer simulation using the Power System Blockset (PSB) in SIMULINK/MATLAB. The simulation results demonstrate that the proposed DC voltage control is effective in regulating the DC voltage when the D-STATCOM is used for load voltage regulation.
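
As background, the sketch below applies the textbook symmetrical-optimum PI tuning rule to a generic integrator-plus-lag plant; the plant constants are hypothetical, this is not the paper's exact DC-link model, and the genetic-algorithm tuning is not reproduced.

```python
def symmetrical_optimum_pi(K, T_sigma, a=2.0):
    """Textbook symmetrical-optimum PI tuning for an integrator-plus-lag plant
    G(s) = K / (s * (1 + s*T_sigma)), a common approximation of a DC-link
    voltage loop. Generic sketch only, not the paper's plant model.

    Returns (Kp, Ti); a = 2 gives a phase margin of roughly 37 degrees,
    and larger a trades speed for damping.
    """
    Ti = a * a * T_sigma          # integral time placed symmetrically around crossover
    Kp = 1.0 / (a * K * T_sigma)  # proportional gain setting the crossover frequency
    return Kp, Ti

# Hypothetical plant values for illustration only.
Kp, Ti = symmetrical_optimum_pi(K=50.0, T_sigma=0.002)
Ki = Kp / Ti
print(f"Kp = {Kp:.3f}, Ti = {Ti * 1e3:.2f} ms, Ki = {Ki:.1f}")
```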

Genetic Content-Based MP3 Audio Watermarking in MDCT Domain

In this paper a novel scheme for watermarking digital audio during its compression to the MPEG-1 Layer III (MP3) format is proposed. For this purpose we slightly modify some of the selected MDCT coefficients used during the MPEG audio compression procedure. Because different MDCT coefficients can be modified, there are different choices for embedding the watermark into the audio data with respect to robustness and transparency. Our proposed method uses a genetic algorithm to select the best coefficients for embedding the watermark. This genetic selection is guided by parameters extracted from the perceptual content of the audio in order to optimize the robustness and transparency of the watermark. The watermark security is also increased by the random nature of the genetic selection. The information on the selected MDCT coefficients that carry the watermark bits is saved in a database for future extraction of the watermark. The proposed method is suitable for online MP3 stores that wish to trace illegal copies of musical works. Experimental results show that the detection ratio of the watermarks at a bitrate of 128 kbps remains above 90% while the inaudibility of the watermark is preserved.
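
To illustrate the genetic selection step, here is a toy genetic algorithm over sets of MDCT coefficient indices with a purely hypothetical fitness function; the paper's perceptual robustness and transparency criteria are only stubbed out.

```python
import random

def genetic_select(num_coeffs, num_selected, fitness,
                   generations=50, pop_size=30, mutation_rate=0.2):
    """Toy genetic algorithm choosing which MDCT coefficient indices carry
    the watermark bits.

    fitness : callable scoring a set of indices, e.g. a weighted trade-off
              between estimated robustness and audibility (the real
              perceptual criteria are those defined in the paper).
    """
    def random_individual():
        return frozenset(random.sample(range(num_coeffs), num_selected))

    def crossover(a, b):
        pool = list(a | b)                         # combine the parents' genes
        return frozenset(random.sample(pool, num_selected))

    def mutate(ind):
        ind = set(ind)
        if random.random() < mutation_rate:        # swap one index at random
            ind.remove(random.choice(sorted(ind)))
            while len(ind) < num_selected:         # add a fresh, distinct index
                ind.add(random.randrange(num_coeffs))
        return frozenset(ind)

    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]      # truncation selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=fitness)

# Hypothetical fitness favouring mid-frequency coefficients, for illustration only.
best = genetic_select(576, 16, fitness=lambda idx: -sum(abs(i - 288) for i in idx))
print(sorted(best))
```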

Replicating Data Objects in Large-scale Distributed Computing Systems using Extended Vickrey Auction

This paper proposes a novel game-theoretical technique to address the problem of data object replication in large-scale distributed computing systems. The proposed technique draws inspiration from computational economic theory and employs an extended Vickrey auction. Specifically, players in a non-cooperative environment compete for scarce server-side memory space in which to replicate data objects, so as to minimize the total network object transfer cost while maintaining object concurrency. Optimizing this cost in turn leads to load balancing, fault tolerance and reduced user access time. The method is experimentally evaluated against four well-known techniques from the literature: branch and bound, greedy, bin packing and genetic algorithms. The experimental results reveal that the proposed approach outperforms the four techniques in both execution time and solution quality.
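
As a reminder of the basic mechanism the extended auction builds on, the following sketch implements a plain sealed-bid second-price (Vickrey) auction for a single replica slot; the extension used in the paper is not reproduced, and the bid values are hypothetical.

```python
def vickrey_winner(bids):
    """Sealed-bid second-price (Vickrey) auction for one unit of a resource.

    bids : {player: bid value}, here the value a site attaches to hosting a
           replica of an object (e.g. its saving in network transfer cost).
    The highest bidder wins but pays the second-highest bid, which makes
    truthful bidding a dominant strategy.
    """
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    price = ranked[1][1] if len(ranked) > 1 else 0.0
    return winner, price

# Three servers bid their cost savings for replicating the same object.
print(vickrey_winner({"server_A": 12.0, "server_B": 9.5, "server_C": 11.0}))
# -> ('server_A', 11.0): A wins the memory slot and pays C's bid.
```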

Enhanced Spectral Envelope Coding Based On NLMS for G.729.1

In this paper, a new encoding algorithm for the spectral envelope in G.729.1, based on NLMS and intended for VoIP, is proposed. In the TDAC part of G.729.1, the spectral envelope and the MDCT coefficients extracted from the weighted CELP coding error (lower band) and from the higher-band input signal are encoded. In order to reduce the bits allocated to spectral envelope coding, a new quantization algorithm based on NLMS is proposed, and the bits saved are used to enhance sound quality. The performance of the proposed algorithm is evaluated in terms of sound quality and bit reduction rates under clean and frame-loss conditions.
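
A minimal sketch of a normalized LMS predictor, the generic building block behind such a quantization scheme, is given below; the signal, filter order and step size are illustrative assumptions, not G.729.1 bitstream parameters.

```python
import numpy as np

def nlms(x, d, order=8, mu=0.5, eps=1e-8):
    """Normalized LMS adaptive predictor.

    x  : input (reference) signal
    d  : desired signal, predicted here from the past `order` samples of x
    mu : step size, 0 < mu < 2 for convergence
    Returns the prediction error, whose residual needs fewer bits to encode.
    """
    w = np.zeros(order)
    e = np.zeros(len(d))
    for n in range(order, len(d)):
        u = x[n - order:n][::-1]                 # most recent past samples first
        y = w @ u                                # filter output (prediction)
        e[n] = d[n] - y
        w += mu * e[n] * u / (eps + u @ u)       # normalized weight update
    return e

# Illustration: predict a slowly varying envelope from its own past values.
env = np.abs(np.sin(np.linspace(0, 3 * np.pi, 400))) + 0.01 * np.random.randn(400)
residual = nlms(env, env, order=4, mu=0.5)
print("raw variance:", env.var(), "residual variance:", residual[4:].var())
```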

TFRank: An Evaluation of Users Importance with Fractal Views in Social Networks

One of the research issues in social network analysis is evaluating the position or importance of users in social networks. As information diffusion in social networks evolves, it is difficult to evaluate the importance of users with traditional approaches. In this paper, we propose an approach for evaluating user importance with fractal views in social networks. In this approach, both the global importance (fractal importance) and the local importance (topological importance) of nodes are considered. The basic idea is that the larger the product of the fractal importance and the topological importance of a node, the more important the node is. We devise an algorithm called TFRank corresponding to the proposed approach. Finally, we evaluate TFRank experimentally. The results demonstrate that TFRank has high correlations with the PageRank algorithm and a potential-ranking algorithm, which shows the effectiveness and advantages of our approach.
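
The core ranking rule can be sketched very compactly; the fractal and topological importance values below are hypothetical placeholders for the quantities defined in the paper.

```python
def tfrank(fractal_importance, topological_importance):
    """Rank nodes by the product of their global (fractal) and local
    (topological) importance, the core idea behind TFRank.

    Both arguments are dicts mapping node -> importance score; how those
    scores are computed (fractal views, local topology) is defined in the
    paper and only stubbed out here.
    """
    scores = {n: fractal_importance[n] * topological_importance[n]
              for n in fractal_importance}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical importances for four users; only the ordering matters here.
fi = {"alice": 0.9, "bob": 0.4, "carol": 0.7, "dave": 0.2}
ti = {"alice": 0.5, "bob": 0.8, "carol": 0.6, "dave": 0.9}
print(tfrank(fi, ti))
```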

Fuzzy Clustering Analysis in Real Estate Companies in China

This paper applies a fuzzy clustering algorithm to classify real estate companies in China according to general financial indexes such as income per share, share accumulation fund, net profit margin, weighted net assets yield and shareholders' equity. By constructing and normalizing the initial partition matrix, computing the fuzzy similarity matrix with the Minkowski metric and obtaining its transitive closure, a dynamic fuzzy clustering analysis of the real estate companies is performed. It shows clearly that the clustering result changes gradually as the threshold is reduced, and that the resulting clusters bear a similar relationship to the prices of those companies on the stock market. This is of great value in comparing the financial condition of real estate companies in order to identify good investment opportunities.
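
A compact sketch of the similarity-matrix / transitive-closure / lambda-cut pipeline described above, with toy data standing in for the real financial indexes:

```python
import numpy as np

def fuzzy_similarity(data, p=2):
    """Fuzzy similarity matrix from a normalized data matrix (Minkowski metric)."""
    n = len(data)
    d = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            d[i, j] = np.sum(np.abs(data[i] - data[j]) ** p) ** (1.0 / p)
    return 1.0 - d / (d.max() + 1e-12)           # map distance to similarity in [0, 1]

def transitive_closure(r):
    """Max-min transitive closure: compose R with itself until it stabilizes."""
    while True:
        # (R o R)[i, j] = max_k min(R[i, k], R[k, j])
        comp = np.max(np.minimum(r[:, :, None], r[None, :, :]), axis=1)
        if np.allclose(comp, r):
            return r
        r = comp

def lambda_cut(closure, lam):
    """Group items whose closure similarity meets the threshold lambda."""
    n = len(closure)
    clusters, assigned = [], set()
    for i in range(n):
        if i in assigned:
            continue
        members = [j for j in range(n) if closure[i, j] >= lam]
        clusters.append(members)
        assigned.update(members)
    return clusters

# Toy example: five "companies" described by three normalized financial indexes.
X = np.array([[0.90, 0.80, 0.70], [0.85, 0.82, 0.68], [0.20, 0.30, 0.10],
              [0.25, 0.28, 0.15], [0.60, 0.50, 0.55]])
closure = transitive_closure(fuzzy_similarity(X))
for lam in (0.95, 0.8, 0.5):                     # clusters merge as the threshold drops
    print(lam, lambda_cut(closure, lam))
```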

A Similarity Function for Global Quality Assessment of Retinal Vessel Segmentations

Retinal vascularity assessment plays an important role in the diagnosis of ophthalmic pathologies. The use of digital images for this purpose makes a computerized approach possible and has motivated the development of many methods for automated vascular tree segmentation. Metrics based on contingency tables for binary classification have been widely used for evaluating the performance of these algorithms, and accuracy in particular has been the most common measure of global performance. However, this metric matches human perception very poorly and has other notable deficiencies. Here, a new similarity function for measuring the quality of retinal vessel segmentations is proposed. It is based on characterizing the vascular tree as a connected structure with a measurable area and length. The tests carried out indicate that this new approach behaves better than the current one. More generally, this concept of measuring descriptive properties may be used to design functions that measure the segmentation quality of other complex structures more successfully.

Adaptive Bidirectional Flow for Image Interpolation and Enhancement

Image interpolation is a common problem in imaging applications. However, most existing interpolation algorithms suffer to some extent from blurred edges and jagged artifacts in the image. This paper presents an adaptive, feature-preserving bidirectional flow process in which an inverse diffusion is performed to sharpen edges along the directions normal to the isophote lines (edges), while a normal diffusion is applied along the tangent directions to remove artifacts ("jaggies"). In order to preserve image features such as edges, corners and textures, the nonlinear diffusion coefficients are locally adjusted according to the directional derivatives of the image. Experimental results on synthetic and natural images demonstrate that our interpolation algorithm substantially improves the subjective quality of the interpolated images over conventional interpolation methods.

A Simulation Software for DNA Computing Algorithms Implementation

The captured gel electrophoresis image represents the output of a DNA computing algorithm. Before this image is captured, DNA computing involves parallel overlap assembly (POA) and the polymerase chain reaction (PCR), which form the core of the computing algorithm. However, the design of the DNA oligonucleotides used to represent a problem is quite complicated and prone to errors. In order to reduce these errors at the design stage, before the actual in-vitro experiment is carried out, simulation software capable of simulating the POA and PCR processes has been developed. The capability of this simulation software is not limited: problems of any size and complexity can be simulated, thus saving costs caused by possible errors during the design process. Information on the DNA sequences during the computing process, as well as the computing output, can be extracted from the simulation software at the same time.

Application of Feed-Forward Neural Networks Autoregressive Models in Gross Domestic Product Prediction

In this paper we present an autoregressive model based on neural networks, trained with the standard error back-propagation algorithm, in order to predict the gross domestic product (GDP) growth rate of four countries. Specifically, we propose a kind of weighted regression, usable for econometric purposes, in which the initial inputs are multiplied by the neural network's final optimal input-to-hidden-layer weights obtained after the training process. The forecasts are compared with those of the ordinary autoregressive model, and we conclude that the proposed regression's forecasts significantly outperform those of the autoregressive model in the out-of-sample period. The idea behind this approach is to propose a parametric regression with weighted variables in order to test the statistical significance and magnitude of the estimated autoregressive coefficients and simultaneously to produce the forecasts.
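
A minimal sketch of such a neural-network autoregressive model is given below, with one hidden layer trained by plain back-propagation on a hypothetical growth-rate series; the data, lag order and network size are assumptions, and the weighted-regression step built on top of the trained weights is not reproduced.

```python
import numpy as np

def train_nn_ar(series, lags=2, hidden=4, lr=0.01, epochs=2000, seed=0):
    """Tiny feed-forward neural-network autoregressive model trained with
    standard error back-propagation (one hidden tanh layer, linear output).

    series : 1-D array of GDP growth rates
    lags   : number of autoregressive inputs y[t-lags] ... y[t-1]
    Returns a one-step-ahead prediction function.
    """
    rng = np.random.default_rng(seed)
    X = np.array([series[t - lags:t] for t in range(lags, len(series))])
    y = series[lags:]
    W1 = rng.normal(scale=0.5, size=(lags, hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=hidden)
    b2 = 0.0

    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)                 # hidden layer
        pred = h @ W2 + b2                       # linear output
        err = pred - y
        # Back-propagation of the squared-error gradient.
        gW2 = h.T @ err / len(y)
        gb2 = err.mean()
        gh = np.outer(err, W2) * (1 - h ** 2)
        gW1 = X.T @ gh / len(y)
        gb1 = gh.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2

    def predict(last_lags):
        return float(np.tanh(np.asarray(last_lags) @ W1 + b1) @ W2 + b2)
    return predict

# Hypothetical quarterly growth-rate series, purely for illustration.
gdp = np.array([2.1, 2.3, 1.9, 2.0, 2.4, 2.2, 1.8, 2.1, 2.5, 2.3, 2.0, 2.2])
forecast = train_nn_ar(gdp, lags=2)
print("next-period forecast:", round(forecast(gdp[-2:]), 2))
```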

Distribution Feeder Reconfiguration Considering Distributed Generators

Recently, distributed generation technologies have received much attention for the potential energy savings and reliability assurances that might be achieved as a result of their widespread adoption. Fueling this attention have been the possibilities of international agreements to reduce greenhouse gas emissions, electricity sector restructuring, high power-reliability requirements for certain activities, and concern about easing transmission and distribution capacity bottlenecks and congestion. It is therefore necessary to investigate the impact of these kinds of generators on distribution feeder reconfiguration. This paper presents an approach to distribution reconfiguration that considers Distributed Generators (DGs). The objective function is the sum of electrical power losses. A Tabu search optimization is used to solve the optimal operation problem. The approach is tested on a real distribution feeder.
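
A generic Tabu search skeleton of the kind used here is sketched below; the power-loss evaluation is replaced by a fake objective, since the load-flow computation and feeder data are not reproduced.

```python
def tabu_search(initial, neighbours, cost, iterations=200, tenure=7):
    """Generic tabu search skeleton for feeder reconfiguration.

    initial    : starting switch configuration (any hashable solution)
    neighbours : function returning candidate configurations (e.g. obtained
                 by opening/closing one pair of switches)
    cost       : objective to minimize, here total electrical power losses
                 as evaluated by a load-flow routine (not shown).
    """
    current = best = initial
    best_cost = cost(best)
    tabu = []                                   # short-term memory of recent configurations
    for _ in range(iterations):
        candidates = [s for s in neighbours(current) if s not in tabu]
        if not candidates:
            break
        current = min(candidates, key=cost)     # best admissible neighbour
        tabu.append(current)
        if len(tabu) > tenure:
            tabu.pop(0)
        if cost(current) < best_cost:           # track the best configuration found
            best, best_cost = current, cost(current)
    return best, best_cost

# Toy stand-in: pick which 2 of 6 tie switches stay open; "losses" are faked
# by a quadratic function, purely to exercise the search skeleton.
def fake_losses(config):
    return sum((i - 2.5) ** 2 for i in config)

def swap_neighbours(config):
    out = []
    for i in config:
        for j in range(6):
            if j not in config:
                out.append(tuple(sorted((set(config) - {i}) | {j})))
    return out

print(tabu_search((0, 5), swap_neighbours, fake_losses))
```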

Design of an Augmented Automatic Choosing Control by Lyapunov Functions Using Gradient Optimization Automatic Choosing Functions

In this paper we consider a nonlinear feedback control called augmented automatic choosing control (AACC), which uses gradient-optimization automatic choosing functions for nonlinear systems. Constant terms that arise from the sectionwise linearization of a given nonlinear system are treated as coefficients of a stable zero dynamics. The parameters included in the control are selected suboptimally by expanding a stable region in the sense of Lyapunov with the aid of a genetic algorithm. This approach is applied to a field excitation control problem of a power system to demonstrate the effectiveness of the AACC. Simulation results show that the new controller improves performance remarkably well.