Choosing Search Algorithms in Bayesian Optimization Algorithm

The Bayesian Optimization Algorithm (BOA) is an estimation of distribution algorithm. It uses techniques for modeling data with Bayesian networks to estimate the joint distribution of promising solutions. Different search algorithms can be used to obtain the structure of the Bayesian network. The key point that BOA addresses is whether the constructed Bayesian network can generate new and useful solutions (strings) that lead the algorithm in the right direction to solve the problem. This ability is undoubtedly a crucial factor in the efficiency of BOA. Various search algorithms can be used in BOA, but their performance differs, so a suitable method for characterizing these differences is needed in order to choose the better ones. In this paper, a greedy search algorithm and a stochastic search algorithm are used in BOA to solve a given optimization problem, and a method using the Kullback-Leibler (KL) divergence to quantify their difference is described.
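As a rough illustration of the comparison method, the following Python sketch computes the KL divergence between the empirical distribution of promising solutions and the distributions implied by two hypothetical network structures; all numbers are illustrative stand-ins, not the paper's data.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL divergence D(P || Q) between two discrete distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

# Distribution of promising solutions vs. the distributions implied by
# two candidate Bayesian-network structures (illustrative numbers).
p_true   = [0.40, 0.30, 0.20, 0.10]
q_greedy = [0.38, 0.29, 0.22, 0.11]   # network built by greedy search
q_stoch  = [0.25, 0.25, 0.25, 0.25]   # network built by stochastic search

print(kl_divergence(p_true, q_greedy))  # smaller value -> better model
print(kl_divergence(p_true, q_stoch))
```

A smaller divergence indicates that the network's implied distribution is closer to the distribution of the promising solutions, which is the sense in which one search algorithm can be preferred over another.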

An Optimized Design of Non-uniform Filterbank

The tree-structured approach to non-uniform filterbank (NUFB) design is normally used for perfect reconstruction (PR). PR is not always feasible due to certain limitations, i.e., constraints in selecting design parameters and design complexity, and the output is sometimes severely affected by aliasing error if the necessary and sufficient conditions for PR are not satisfied exactly. Researchers have therefore shown broad interest in near-perfect reconstruction (NPR). In the proposed work, an optimized tree-structure technique is used for the design of an NPR non-uniform filterbank. Window functions of the Blackman family are used to design the prototype FIR filter, and a single-variable linear optimization is used to minimize the amplitude distortion. The main feature of the proposed design is its simplicity together with the linear-phase property.
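A minimal sketch of the single-variable optimization step, assuming a two-channel bank built from a Blackman-windowed prototype; the tap count, search bounds, and distortion measure are illustrative choices, not the paper's exact design.

```python
import numpy as np
from scipy.signal import firwin, freqz
from scipy.optimize import minimize_scalar

# Tune the prototype's cutoff so that the two-channel analysis bank
# H0(z), H1(z) = H0(-z) has minimal amplitude distortion, i.e. the
# overall response |H0(w)|^2 + |H1(w)|^2 stays close to 1.
NUM_TAPS = 65

def amplitude_distortion(cutoff):
    h0 = firwin(NUM_TAPS, cutoff, window="blackman")  # prototype lowpass
    h1 = h0 * (-1.0) ** np.arange(NUM_TAPS)           # highpass mirror
    w, H0 = freqz(h0, worN=512)
    _, H1 = freqz(h1, worN=512)
    t = np.abs(H0) ** 2 + np.abs(H1) ** 2
    return np.max(np.abs(t - 1.0))                    # peak deviation

res = minimize_scalar(amplitude_distortion, bounds=(0.3, 0.7),
                      method="bounded")
print(f"optimal cutoff = {res.x:.4f}, distortion = {res.fun:.2e}")
```

Because the windowed prototype is symmetric, the linear-phase property is preserved regardless of the cutoff chosen by the optimizer.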

Reentry Trajectory Optimization Based on Differential Evolution

Reentry trajectory optimization is a multi-constraint optimal control problem that is hard to solve. To tackle it, we propose a new algorithm named CDEN (Constrained Differential Evolution Newton-Raphson Algorithm), based on Differential Evolution (DE) and the Newton-Raphson method. We transform the infinite-dimensional optimal control problem into a finite-dimensional parameter optimization problem by discretizing the control parameters. To simplify the problem, we bound the range of the control parameters using the process constraints, and to handle the constraints we propose a parameterless constraint-handling procedure. After a comprehensive analysis of the problem, we solve it with a new algorithm that integrates DE and Newton-Raphson. The approach is validated on the X-33 reentry vehicle; simulation results indicate that the algorithm is effective and robust.
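The sketch below shows a plain DE loop with a Deb-style parameterless feasibility rule standing in for the constraint-handling procedure; the objective and constraint are toy stand-ins for the reentry cost and process constraints.

```python
import numpy as np

rng = np.random.default_rng(0)

def objective(x):                 # toy stand-in for the reentry cost
    return float(np.sum(x ** 2))

def violation(x):                 # total constraint violation (toy constraint)
    return max(0.0, 1.0 - float(np.sum(x)))   # feasible iff sum(x) >= 1

def better(a, b):
    """Parameterless rule: feasibility first, then objective value."""
    va, vb = violation(a), violation(b)
    if va == vb == 0.0:
        return objective(a) < objective(b)
    return va < vb

def de(dim=4, pop_size=30, F=0.7, CR=0.9, gens=200, lo=-2.0, hi=2.0):
    pop = rng.uniform(lo, hi, (pop_size, dim))
    for _ in range(gens):
        for i in range(pop_size):
            idx = rng.choice([j for j in range(pop_size) if j != i],
                             3, replace=False)
            a, b, c = pop[idx]
            mutant = np.clip(a + F * (b - c), lo, hi)   # DE/rand/1 mutation
            cross = rng.random(dim) < CR                # binomial crossover
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            if better(trial, pop[i]):
                pop[i] = trial
    return min(pop, key=lambda x: (violation(x), objective(x)))

best = de()
print(best, objective(best), violation(best))
```

The selection rule needs no penalty weights, which is what makes the constraint handling "parameterless".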

Performance Analysis of Digital Signal Processors Using SMV Benchmark

Unlike general-purpose processors, digital signal processors (DSP processors) are strongly application-dependent. To meet the needs of diverse applications, a wide variety of DSP processors based on different architectures, ranging from the traditional to VLIW, have been introduced to the market over the years. The functionality, performance, and cost of these processors vary over a wide range. In order to select a processor that meets the design criteria for an application, processor performance is usually the major concern for digital signal processing (DSP) application developers. Performance data are also essential for the designers of DSP processors to improve their designs. Consequently, several DSP performance benchmarks have been proposed over the past decade or so. However, none of these benchmarks seem to have included recent new DSP applications. In this paper, we use a new benchmark that we recently developed to compare the performance of popular DSP processors from Texas Instruments and StarCore. The new benchmark is based on the Selectable Mode Vocoder (SMV), a speech-coding program from recent third-generation (3G) wireless voice applications. All benchmark kernels are compiled by the compilers of the respective DSP processors and run on their simulators. The weighted arithmetic mean of clock cycles and the arithmetic mean of code size are used to compare the performance of five DSP processors. In addition, we study how the performance of a processor is affected by code structure, processor architecture features, and compiler optimizations. The extensive experimental data gathered, analyzed, and presented in this paper should help DSP processor and compiler designers meet their specific design goals.
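The two summary statistics are straightforward to compute; a minimal sketch with hypothetical kernel names, weights, and measurements (none of these are the paper's figures):

```python
# Hypothetical kernel cycle counts and weights (not the paper's data).
cycles  = {"lsp_quant": 12000, "fft": 8000, "pitch_search": 20000}
weights = {"lsp_quant": 0.2,  "fft": 0.3, "pitch_search": 0.5}

# Weighted arithmetic mean of clock cycles across kernels.
weighted_mean_cycles = (sum(weights[k] * cycles[k] for k in cycles)
                        / sum(weights.values()))

# Plain arithmetic mean of code size (bytes per kernel, hypothetical).
code_sizes = [1024, 2048, 1536]
mean_code_size = sum(code_sizes) / len(code_sizes)

print(weighted_mean_cycles, mean_code_size)
```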

Implementation of On-Line Cutting Stock Problem on NC Machines

The applicability of high-speed cutting stock problem (CSP) solving is presented in this paper. Because orders arrive continuously through various on-line channels, a professional cutting company must, to stay competitive, focus on sustained production at high levels; in other words, operators have to keep the machines running to stay ahead of the pack. Therefore, the continuous stock cutting problem with setup is proposed, minimizing the cutting time and pattern-changing time needed to meet the on-line demand. In this paper, a novel method is proposed to solve the problem directly by using cutting patterns. A major advantage of the proposed method in serial on-line production is that the system can adjust the cutting plan according to the floating orders. Examples with multiple items are demonstrated. The results show considerable efficiency and reliability in high-speed CSP cutting.
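A minimal sketch of pattern-based planning of the kind described, where the plan is built directly from cutting patterns against a floating demand; the patterns, item names, and demands are invented for illustration.

```python
# Greedy pattern selection: at each step pick the cutting pattern that
# satisfies the most remaining demand (illustrative data only).
patterns = [            # pieces of each item type produced per stock run
    {"A": 3, "B": 0, "C": 1},
    {"A": 1, "B": 2, "C": 0},
    {"A": 0, "B": 1, "C": 2},
]
demand = {"A": 7, "B": 5, "C": 4}        # floating on-line orders

plan = []
while any(v > 0 for v in demand.values()):
    # score each pattern by how much remaining demand it actually covers
    best = max(patterns,
               key=lambda p: sum(min(p[i], demand[i]) for i in demand))
    plan.append(best)
    for item in demand:
        demand[item] = max(0, demand[item] - best[item])

print(len(plan), "stock runs:", plan)
```

Because the demand dictionary is re-read on every iteration, newly arriving orders can simply be added to it and the plan adjusts on the next pass, which mirrors the on-line behavior the abstract describes.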

Parallel Distributed Computational Microcontroller System for Adaptive Antenna Downlink Transmitter Power Optimization

This paper presents a tested research concept that implements a complex evolutionary algorithm, the genetic algorithm (GA), in a multi-microcontroller environment. A Parallel Distributed Genetic Algorithm (PDGA) is employed in an adaptive beamforming technique to reduce the power usage of an adaptive antenna at a WCDMA base station. An adaptive antenna has a dynamic beam that requires a more advanced beamforming algorithm, such as a genetic algorithm, which demands heavy computation and memory space. Microcontrollers are low-resource platforms that are normally not associated with GAs, which are typically resource-intensive. The aim of this project was to design a cooperative multiprocessor system, expanding the role of small-scale PIC microcontrollers, to optimize WCDMA base station transmitter power. Implementation results have shown that the PDGA multi-microcontroller system returned optimal transmit power compared to a conventional GA.
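A rough Python sketch of the island-model idea behind a PDGA: each "microcontroller" evolves its own sub-population and periodically migrates its best individual around a ring. The fitness function and steering vector are toy stand-ins for the WCDMA transmit-power model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy fitness: minimize total transmit power (sum of squared weights)
# while keeping the gain toward a desired user above a threshold.
steer = np.array([1.0, 0.8, 0.6, 0.4])       # hypothetical steering vector

def fitness(w):
    gain = float(np.dot(w, steer))
    penalty = max(0.0, 1.0 - gain) * 10.0    # softly enforce gain >= 1
    return float(np.sum(w ** 2)) + penalty

def evolve(pop, gens=50, mut=0.1):
    for _ in range(gens):
        pop = pop[np.argsort([fitness(w) for w in pop])]
        children = pop[: len(pop) // 2].copy()       # keep the better half
        children += rng.normal(0.0, mut, children.shape)   # mutate copies
        pop[len(pop) // 2 :] = children
    return pop

# Four "microcontrollers", each holding a small sub-population.
islands = [rng.uniform(0, 1, (8, 4)) for _ in range(4)]
for epoch in range(5):
    islands = [evolve(p) for p in islands]
    # migration: each island receives its neighbor's best (ring topology)
    bests = [min(p, key=fitness).copy() for p in islands]
    for i, p in enumerate(islands):
        p[-1] = bests[(i - 1) % 4]

best = min((min(p, key=fitness) for p in islands), key=fitness)
print(best, fitness(best))
```

Splitting the population this way keeps each node's memory footprint small, which is the property that makes resource-limited microcontrollers viable GA hosts.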

Query Optimization Techniques for XML Databases

Over the past few years, XML (eXtensible Mark-up Language) has emerged as the standard for information representation and data exchange over the Internet. This paper provides a kick-start for new researchers venturing into the field of XML databases. We survey storage representations for XML documents and review XML query processing and optimization techniques with respect to each particular storage instance. Various optimization technologies have been developed to solve query retrieval and updating problems, and in more recent years most researchers have proposed hybrid optimization techniques. Hybrid systems open the possibility of covering each technology's weaknesses with another's strengths. This paper reviews the advantages and limitations of these optimization techniques.

NSGA Based Optimal Volt / Var Control in Distribution System with Dispersed Generation

In this paper, a method based on the Non-Dominated Sorting Genetic Algorithm (NSGA) is presented for volt/var control in power distribution systems with dispersed generation (DG). A genetic algorithm approach is used due to its broad applicability, ease of use, and high accuracy, which make it well suited to volt/var control problems. A multi-objective optimization problem has been formulated for the volt/var control of the distribution system. The non-dominated sorting genetic algorithm based method proposed in this paper alleviates the problem of tuning the weighting factors required in solving multi-objective volt/var control optimization problems. Based on the simulation studies carried out on the distribution system, the proposed scheme has been found to be simple, accurate, and easy to apply to the multi-objective volt/var control optimization problem of a distribution system with dispersed generation.
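The core of any NSGA variant, and the reason it avoids weighting factors, is non-dominated sorting; a minimal sketch over hypothetical (power loss, voltage deviation) pairs:

```python
def dominates(a, b):
    """a dominates b if it is no worse in all objectives and better in one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def non_dominated_sort(points):
    """Group points into successive non-dominated fronts."""
    fronts, remaining = [], list(points)
    while remaining:
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining if q is not p)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts

# (power loss in kW, voltage deviation in p.u.) -- illustrative values
solutions = [(120, 0.04), (100, 0.06), (90, 0.08), (110, 0.05), (95, 0.09)]
for rank, front in enumerate(non_dominated_sort(solutions), start=1):
    print(f"front {rank}: {front}")
```

Ranking by front membership orders solutions without ever combining the two objectives into one weighted sum.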

EML-Estimation of Multivariate t Copulas with Heuristic Optimization

In recent years, copulas have become very popular in financial research and actuarial science, as they are more flexible in modelling the co-movements and relationships of risk factors than the conventional Pearson linear correlation coefficient. However, a precise estimation of the copula parameters is vital in order to correctly capture the (possibly nonlinear) dependence structure and joint tail events. In this study, we employ two optimization heuristics, namely Differential Evolution and Threshold Accepting, to tackle the parameter estimation of multivariate t distribution models in the EML approach. Since the evolutionary optimizer does not rely on gradient search, the EML approach can be applied to the estimation of more complicated copula models such as high-dimensional copulas. Our experimental study shows that the proposed method provides more robust and more accurate estimates than the IFM approach.
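A minimal sketch of Threshold Accepting on a two-parameter estimation problem; the stand-in objective replaces the actual negative copula log-likelihood in (rho, nu), and the threshold schedule, step size, and bounds are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

def neg_log_likelihood(theta):
    # Stand-in objective; in the EML setting this would be the negative
    # log-likelihood of the t copula in (rho, nu). Minimum at (0.5, 8).
    rho, nu = theta
    return (rho - 0.5) ** 2 + 0.01 * (nu - 8.0) ** 2

def threshold_accepting(x0, steps=2000, step_size=0.2):
    thresholds = np.linspace(1.0, 0.0, steps)   # decreasing thresholds
    x = np.array(x0, dtype=float)
    fx = neg_log_likelihood(x)
    for tau in thresholds:
        cand = x + rng.uniform(-step_size, step_size, size=x.shape)
        cand[0] = np.clip(cand[0], -0.99, 0.99)  # keep correlation valid
        cand[1] = max(cand[1], 2.1)              # keep deg. of freedom > 2
        fc = neg_log_likelihood(cand)
        if fc - fx < tau:                        # accept mild uphill moves
            x, fx = cand, fc
    return x, fx

est, val = threshold_accepting([0.0, 5.0])
print(est, val)
```

Unlike gradient methods, the heuristic only evaluates the objective, which is why the same loop extends to likelihoods of high-dimensional copulas where derivatives are impractical.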

A Parameter-Tuning Framework for Metaheuristics Based on Design of Experiments and Artificial Neural Networks

In this paper, a framework for the simplification and standardization of metaheuristic parameter-tuning, applying a four-phase methodology that utilizes Design of Experiments and Artificial Neural Networks, is presented. Metaheuristics are multipurpose problem solvers that are utilized on computational optimization problems for which no efficient problem-specific algorithm exists. Their successful application to concrete problems requires finding a good initial parameter setting, which is a tedious and time-consuming task. Recent research reveals the lack of a systematic approach to this so-called parameter-tuning process: in the majority of publications, researchers provide weak motivation for their respective choices, if any. Because initial parameter settings have a significant impact on solution quality, this course of action can lead to suboptimal experimental results, and thereby an unsound basis for drawing conclusions.
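A compressed two-phase sketch of the idea, with a small factorial design standing in for the full DoE phases and scikit-learn's MLPRegressor as the ANN surrogate; run_metaheuristic is a hypothetical stand-in for one tuning experiment.

```python
import itertools
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)

def run_metaheuristic(mutation_rate, population_size):
    """Stand-in for one tuning experiment: returns a quality score."""
    noise = rng.normal(0.0, 0.02)
    return (-(mutation_rate - 0.1) ** 2
            - ((population_size - 60) / 100) ** 2 + noise)

# Phase 1: a small full-factorial design over the two parameters.
design = list(itertools.product([0.01, 0.05, 0.1, 0.2, 0.4],
                                [20, 40, 60, 80, 100]))
X = np.array(design, dtype=float)
y = np.array([run_metaheuristic(m, p) for m, p in design])

# Phase 2: fit an ANN surrogate and query it on a fine grid.
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000,
                     random_state=0).fit(X, y)
grid = np.array(list(itertools.product(np.linspace(0.01, 0.4, 40),
                                       np.linspace(20, 100, 41))))
best = grid[np.argmax(model.predict(grid))]
print("suggested setting:", best)
```

The surrogate lets the fine grid be explored with cheap predictions instead of expensive metaheuristic runs, which is the point of coupling DoE with an ANN.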

The Optimization of an Intelligent Traffic Congestion Level Classification from Motorists' Judgments on Vehicle's Moving Patterns

We propose a technique to identify road traffic congestion levels from the velocity of mobile sensors, with high accuracy and consistency with motorists' judgments. The data collection utilized a GPS device, a webcam, and an opinion survey. Human perceptions were used to rate traffic congestion into three levels: light, heavy, and jam. The ratings and velocities were then fed into a decision tree learning model (J48). We successfully extracted vehicle movement patterns to feed into the learning model using a sliding-window technique; the parameters capturing the vehicle moving patterns and the window size were heuristically optimized. The model achieved accuracy as high as 99.68%. By implementing the model on existing traffic report systems, the reports will cover more comprehensive areas. The proposed method can be applied to any part of the world.
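A minimal sketch of the sliding-window feature extraction, with scikit-learn's CART decision tree standing in for Weka's J48; the velocities, labels, window size, and features are invented for illustration.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Velocity samples (km/h) from a mobile GPS sensor -- illustrative values.
velocity = np.array([42, 38, 35, 12, 8, 5, 4, 6, 22, 30, 33, 40], float)
labels   = ["light", "light", "jam", "jam", "jam", "jam", "heavy", "heavy"]

def window_features(series, size=5):
    """Slide a window over the series, summarizing each moving pattern."""
    feats = []
    for start in range(len(series) - size + 1):
        w = series[start : start + size]
        feats.append([w.mean(), w.std(), w.min(), w.max()])
    return np.array(feats)

X = window_features(velocity)          # 8 windows of size 5, one per label
# J48 is Weka's C4.5 implementation; a CART decision tree stands in here.
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, labels)

# Classify windows from a fresh velocity trace (also invented).
print(clf.predict(window_features(np.array([10, 9, 7, 8, 11, 10], float))))
```

Both the summary statistics and the window size are the kinds of parameters the abstract reports tuning heuristically.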

Evaluation of a PSO Approach for Optimum Design of a First-Order Controller for TCP/AQM Systems

This paper presents a Particle Swarm Optimization (PSO) method for determining the optimal parameters of a first-order controller for a TCP/AQM system. The TCP/AQM model is described by a second-order system with time delay. First, an analytical approach, based on the D-decomposition method and Kharitonov's lemma, is used to determine the stabilizing regions of a first-order controller. Second, the optimal parameters of the controller are obtained by the PSO algorithm. Finally, the proposed method is implemented in the Network Simulator NS-2 and compared with a PI controller.
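A minimal PSO sketch for the second step; the cost function is a smooth stand-in for the actual closed-loop performance index of the TCP/AQM loop, and the swarm coefficients, bounds, and optimum location are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

def cost(params):
    # Stand-in for the closed-loop performance index (e.g. an integral
    # error criterion); a smooth surrogate with optimum at (1.2, 0.4).
    kp, ki = params
    return (kp - 1.2) ** 2 + 4.0 * (ki - 0.4) ** 2

def pso(dim=2, n=20, iters=100, w=0.7, c1=1.5, c2=1.5, lo=0.0, hi=3.0):
    x = rng.uniform(lo, hi, (n, dim))            # positions
    v = np.zeros((n, dim))                       # velocities
    pbest = x.copy()
    pbest_f = np.array([cost(p) for p in x])
    g = pbest[np.argmin(pbest_f)].copy()         # global best
    for _ in range(iters):
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([cost(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, cost(g)

print(pso())
```

In the paper's setting, the search would additionally be restricted to the stabilizing region identified by the D-decomposition analysis, e.g. by clipping or penalizing particles that leave it.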

Optimum Time Coordination of Overcurrent Relays using Two Phase Simplex Method

Overcurrent (OC) relays are the major protection devices in a distribution system. The operating times of the OC relays are to be coordinated properly to avoid mal-operation of the backup relays. OC relay time coordination in ring-fed distribution networks is a highly constrained optimization problem that can be stated as a linear programming problem (LPP). The purpose is to find an optimum relay setting that minimizes the operating time of the relays while keeping them properly coordinated to avoid mal-operation. This paper presents a two-phase simplex method for optimum time coordination of OC relays. The method is based on the simplex algorithm, which is used to find the optimum solution of an LPP. The method introduces artificial variables to obtain an initial basic feasible solution (IBFS); the artificial variables are removed by the iterative process of the first phase, which minimizes an auxiliary objective function. The second phase minimizes the original objective function and gives the optimum time coordination of the OC relays.
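A toy three-relay version of the coordination LP, solved with SciPy's linprog (its solver plays the role of the two-phase simplex here); the per-relay constants, coordination margin, and TDS limits are illustrative.

```python
import numpy as np
from scipy.optimize import linprog

# Toy model: operating time of relay i is t_i = a_i * TDS_i at its fault
# current. Minimize total operating time subject to the coordination
# margin t_backup - t_primary >= CTI = 0.3 s (all constants invented).
a = np.array([2.0, 2.5, 3.0])          # seconds per unit TDS
c = a                                   # objective: sum of operating times

# Relay 2 backs up relay 1, relay 3 backs up relay 2, rewritten as <=:
#   a1*TDS1 - a2*TDS2 <= -0.3  and  a2*TDS2 - a3*TDS3 <= -0.3
A_ub = np.array([[ a[0], -a[1],  0.0 ],
                 [ 0.0,   a[1], -a[2]]])
b_ub = np.array([-0.3, -0.3])
bounds = [(0.05, 1.1)] * 3              # typical TDS limits

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print("TDS settings:", res.x, "total time:", res.fun)
```

The artificial-variable bookkeeping of the two-phase method is hidden inside the LP solver, but the formulation (objective, coordination inequalities, and setting bounds) is the same as in the paper.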

Automatic 3D Reconstruction of Coronary Artery Centerlines from Monoplane X-ray Angiogram Images

We present a new method for the fully automatic 3D reconstruction of coronary artery centerlines, using two X-ray angiogram projection images from a single rotating monoplane acquisition system. During the first stage, the input images are smoothed using curve evolution techniques. Next, a simple yet efficient multiscale method, based on the information of the Hessian matrix, is introduced for the enhancement of the vascular structure. Hysteresis thresholding, using different image quantiles, is used to segment the arteries. This stage is followed by a thinning procedure to extract the centerlines. The resulting skeleton image is then pruned using morphological and pattern recognition techniques to remove non-vessel-like structures. Finally, edge-based stereo correspondence is solved using a parallel evolutionary optimization method based on symbiosis. The detected 2D centerlines, combined with disparity map information, allow the reconstruction of the 3D vessel centerlines. The proposed method has been evaluated on patient data sets.
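A minimal sketch of the enhance-threshold-thin portion of the pipeline using scikit-image, with Frangi vesselness standing in for the paper's own Hessian-based measure; random data stands in for a smoothed angiogram frame, and the quantile levels are illustrative.

```python
import numpy as np
from skimage.filters import frangi, apply_hysteresis_threshold
from skimage.morphology import skeletonize

rng = np.random.default_rng(5)
angiogram = rng.random((128, 128))    # stand-in for a smoothed X-ray frame

# Multiscale Hessian-based vessel enhancement (Frangi vesselness stands
# in for the paper's measure), then quantile-based hysteresis thresholding.
vesselness = frangi(angiogram)
low, high = np.quantile(vesselness, [0.90, 0.98])
arteries = apply_hysteresis_threshold(vesselness, low, high)

centerlines = skeletonize(arteries)   # thinning to one-pixel centerlines
print(centerlines.sum(), "centerline pixels")
```

Hysteresis keeps weak vessel responses only when they connect to strong ones, which is why quantile pairs are a natural way to set the two thresholds per image.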

Approximate Solution of Nonlinear Fredholm Integral Equations of the First Kind via Converting to Optimization Problems

In this paper we introduce an approach, via optimization methods, to find approximate solutions for nonlinear Fredholm integral equations of the first kind. To this end, we consider two stages of approximation: first we convert the integral equation to a moment problem, and then we recast the new problem as two classes of optimization problems, unconstrained optimization problems and optimal control problems. Finally, numerical examples are presented.
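A minimal sketch of the unconstrained-optimization stage: the equation is discretized by quadrature and the squared residual is minimized. The kernel, the nonlinearity, the small regularization term, and the manufactured solution are all illustrative choices, not the paper's examples.

```python
import numpy as np
from scipy.optimize import minimize

# Discretize the nonlinear first-kind Fredholm equation
#   \int_0^1 K(s, t) * phi(t)^2 dt = f(s)
# on a quadrature grid and minimize the squared residual.
n = 15
t = np.linspace(0.0, 1.0, n)
w = np.full(n, 1.0 / n)                       # simple quadrature weights
K = np.exp(-np.abs(np.subtract.outer(t, t)))  # K(s, t) = exp(-|s - t|)

phi_true = t                                  # manufactured solution
f = K @ (w * phi_true ** 2)                   # consistent right-hand side

def residual(phi):
    r = K @ (w * phi ** 2) - f
    # First-kind equations are ill-posed; a tiny Tikhonov term stabilizes.
    return np.dot(r, r) + 1e-6 * np.dot(phi, phi)

res = minimize(residual, x0=np.full(n, 0.5), method="BFGS")
print("max error:", np.max(np.abs(res.x - phi_true)))
```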

Ablation, Mechanical and Thermal Properties of Fiber/Phenolic Matrix Composites

In this study, the ablation, mechanical, and thermal properties of rocket motor insulation made from phenolic/fiber matrix composites are investigated; the laminates are formed with different fibers, namely fiberglass and locally available synthetic fibers. The mechanical and thermal properties of the phenolic/fiber matrix composites were characterized by means of tensile strength, ablation, TGA, and DSC measurements. The design of thermal insulation involves several factors. The mechanical properties were determined according to MIL-I-24768: density > 1.3 g/cm3, tensile strength > 103 MPa, and ablation

Optimization of the Characteristic Straight Line Method by a "Best Estimate" of Observed, Normal Orthometric Elevation Differences

In this paper, to optimize the "Characteristic Straight Line Method", which is used in soil displacement analysis, a "best estimate" of the geodetic leveling observations has been achieved by taking into account the concept of height systems; this concept, and consequently the concept of "height", is discussed in detail. In landslide dynamic analysis, the soil is considered as a mosaic of rigid blocks. The soil displacement has been monitored and analyzed using the "Characteristic Straight Line Method", whose characteristic components have been constructed from a "best estimate" of the topometric observations. In the measurement of elevation differences, we have used the most modern leveling equipment available, and observational procedures have been designed to provide the most effective method of acquiring data. In addition, systematic errors which cannot be sufficiently controlled by instrumentation or observational techniques are minimized by applying appropriate corrections to the observed data: the level collimation correction minimizes the error caused by non-horizontality of the leveling instrument's line of sight for unequal sight lengths; the refraction correction is modeled to minimize the refraction error caused by temperature (density) variation of air strata; the rod temperature correction accounts for variation in the length of the leveling rod's Invar/LO-VAR® strip resulting from temperature changes; the rod scale correction ensures a uniform scale conforming to the international length standard; and the concept of height systems is introduced, under which all types of height (orthometric, dynamic, normal, gravity correction, and equipotential surface) have been investigated. The "Characteristic Straight Line Method" is slightly more convenient than the "Characteristic Circle Method": it permits the evaluation of a displacement of very small, even infinitesimal, magnitude. The inclination of the landslide is given by the inverse of the distance from the reference point O to the "Characteristic Straight Line"; its direction is given by the bearing of the normal directed from point O to the Characteristic Straight Line (Fig. 6). A "best estimate" of the topometric observations was used to measure the elevation of carefully selected points before and after the deformation, and gross errors have been eliminated by statistical analyses and by comparing heights within local neighborhoods. The results of a test over an area where very interesting land surface deformation occurs are reported, with monitoring under different options and a qualitative comparison of results based on a sufficient number of check points.
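As a small illustration of the geometric step, the following sketch recovers the inclination and its bearing from a fitted characteristic straight line ax + by + c = 0 and a reference point O; the coefficients and coordinates are invented for illustration.

```python
import numpy as np

# Fitted characteristic straight line a*x + b*y + c = 0 and reference
# point O (illustrative values only).
a, b, c = 0.6, -0.8, 12.0
Ox, Oy = 0.0, 0.0

# Inclination = inverse of the distance from O to the line.
dist = abs(a * Ox + b * Oy + c) / np.hypot(a, b)
inclination = 1.0 / dist

# Foot of the perpendicular from O onto the line gives the normal
# direction; its bearing is measured clockwise from north.
t = -(a * Ox + b * Oy + c) / (a * a + b * b)
nx, ny = Ox + t * a, Oy + t * b
bearing = np.degrees(np.arctan2(nx - Ox, ny - Oy)) % 360.0

print(f"inclination = {inclination:.5f}, bearing = {bearing:.1f} deg")
```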

Genetic Algorithm Parameters Optimization for Bi-Criteria Multiprocessor Task Scheduling Using Design of Experiments

Multiprocessor task scheduling is an NP-hard problem, and the Genetic Algorithm (GA) has been revealed as an excellent technique for finding an optimal solution. In the past, several GA-based methods have been considered for the solution of this problem, but all of them consider a single criterion. In the present work, minimization of a bi-criteria multiprocessor task scheduling objective, the weighted sum of makespan and total completion time, is considered. The efficiency and effectiveness of a genetic algorithm can be improved by optimizing its different parameters, such as the crossover and mutation operators, crossover probability, selection function, etc. The effects of GA parameters on minimization of the bi-criteria fitness function, and the subsequent setting of the parameters, have been determined by the central composite design (CCD) approach of response surface methodology (RSM) in Design of Experiments. The experiments have been performed with different levels of the GA parameters, and analysis of variance has been performed for the significant parameters for minimization of makespan and total completion time simultaneously.
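A minimal sketch of building a two-factor central composite design in coded units and decoding it to hypothetical crossover and mutation probability ranges; the ranges and number of center runs are illustrative.

```python
import numpy as np

# Two-factor CCD: 2^2 factorial points, 4 axial points, replicated centers.
alpha = np.sqrt(2.0)                      # rotatable axial distance, k = 2
factorial = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], float)
axial = np.array([[-alpha, 0], [alpha, 0], [0, -alpha], [0, alpha]])
center = np.zeros((5, 2))                 # replicated center runs
design = np.vstack([factorial, axial, center])

def decode(coded, lo, hi):
    """Map coded levels in [-alpha, alpha] to an actual parameter range."""
    return lo + (coded + alpha) * (hi - lo) / (2 * alpha)

pc = decode(design[:, 0], 0.6, 0.9)       # crossover probability range
pm = decode(design[:, 1], 0.01, 0.10)     # mutation probability range
for run, (x, y) in enumerate(zip(pc, pm), start=1):
    print(f"run {run:2d}: pc = {x:.3f}, pm = {y:.3f}")
```

Each design row prescribes one GA configuration to run; the recorded bi-criteria fitness values then feed the response surface and ANOVA analysis.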

Jobs Scheduling and Worker Assignment Problem to Minimize Makespan using Ant Colony Optimization Metaheuristic

This article proposes an Ant Colony Optimization (ACO) metaheuristic to minimize total makespan for scheduling a set of jobs and assigning workers on uniformly related parallel machines. An algorithm based on ACO has been developed and implemented in Matlab® to solve this problem. The paper explains the steps of applying the Ant Colony approach to the problem of minimizing makespan for the worker assignment and job scheduling problem in a parallel machine model, and is aimed at evaluating the strength of ACO compared to other conventional approaches. One data set containing 100 problems (12 jobs, 3 machines, and 10 workers), available on the internet, has been taken and solved with this ACO algorithm. Our ACO-based algorithm has shown drastically improved results, especially in terms of the negligible CPU effort needed to reach the optimal solution: in our case, the time taken to solve all 100 problems is less than the average time taken to solve one problem in the data set by conventional approaches such as a GA algorithm and SPT-A/LMC heuristics.
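A rough Python sketch of an ACO-style assignment loop for a toy instance of the same size (the paper's implementation is in Matlab®); the processing times, pheromone scheme, and all ACO parameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy instance: processing time of each job on each (uniform) machine.
proc = rng.uniform(1.0, 9.0, size=(12, 3))        # 12 jobs, 3 machines

def makespan(assign):
    return max(proc[assign == m, m].sum() for m in range(proc.shape[1]))

def aco(ants=20, iters=100, evap=0.1, q=1.0):
    tau = np.ones_like(proc)                      # pheromone: job -> machine
    best, best_f = None, np.inf
    for _ in range(iters):
        # assignment probabilities: pheromone biased against slow machines
        prob = tau / proc
        prob /= prob.sum(axis=1, keepdims=True)
        for _ in range(ants):
            assign = np.array([rng.choice(3, p=prob[j]) for j in range(12)])
            f = makespan(assign)
            if f < best_f:
                best, best_f = assign, f
        tau *= 1.0 - evap                          # evaporation
        tau[np.arange(12), best] += q / best_f     # reinforce best-so-far
    return best, best_f

assignment, cmax = aco()
print(assignment, cmax)
```

Worker skill levels from the full problem could be folded in by scaling the processing-time matrix per assigned worker; the construct-evaluate-reinforce loop stays the same.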

Medical Knowledge Management in Healthcare Industry

The Siemens Healthcare Sector is one of the world's largest suppliers to the healthcare industry and a trendsetter in medical imaging and therapy, laboratory diagnostics, medical information technology, and hearing aids. Siemens offers its customers products and solutions for the entire range of patient care from a single source – from prevention and early detection to diagnosis, and on to treatment and aftercare. By optimizing clinical workflows for the most common diseases, Siemens also makes healthcare faster, better, and more cost-effective. The optimization of clinical workflows requires a multidisciplinary focus and a collaborative approach involving, e.g., medical advisors, researchers, and scientists as well as healthcare economists. This new form of collaboration brings together experts with deep technical experience, physicians with specialized medical knowledge, and people with comprehensive knowledge of health economics. As Charles Darwin is often quoted as saying, "It is neither the strongest of the species that survive, nor the most intelligent, but the one most responsive to change." We believe that those who can successfully manage this change will emerge as winners, with a valuable competitive advantage. Current medical information and knowledge are among the core assets of the healthcare industry. The main issue is to connect knowledge holders and knowledge recipients from various disciplines efficiently in order to spread and distribute knowledge.