A Novel Multiresolution-Based Optimization Scheme for Robust Affine Parameter Estimation

This paper describes a new method for affine parameter estimation between image sequences. Parameter estimation is usually performed by least squares with a quadratic error function. However, this approach is sensitive to the presence of outliers. Parameter estimation techniques for image processing applications therefore need to be robust enough to withstand the influence of outliers. Robust estimation functions with non-quadratic and possibly non-convex potentials, adopted from the statistics literature, have increasingly been used for this purpose. To optimize the error function toward a globally optimal solution, the minimization begins with a convex estimator at the coarser level and gradually introduces non-convexity, moving from soft to hard redescending non-convex estimators as the iteration reaches the finer levels of the multiresolution pyramid. The results of the proposed method are compared with those obtained individually using two different estimators.
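As a rough illustration of the soft-to-hard robustification idea, the sketch below fits an affine map to synthetic point correspondences by iteratively reweighted least squares, warm-starting a hard redescending (Geman-McClure) stage from a convex (Huber) stage. The weight functions, parameter values, and data are illustrative assumptions, not the estimators or multiresolution framework used in the paper.

```python
import numpy as np

# Illustrative robust weights (assumed here, not necessarily the paper's choices):
# Huber is a convex ("soft") estimator; Geman-McClure is a hard redescending one.
def huber_weight(r, k=1.345):
    a = np.abs(r)
    return np.where(a <= k, 1.0, k / np.maximum(a, 1e-12))

def geman_mcclure_weight(r, sigma=1.0):
    return 1.0 / (1.0 + (r / sigma) ** 2) ** 2

def irls_affine(src, dst, weight_fn, n_iter=30, init=None):
    """Fit dst ~ A [x, y, 1]^T (A is 2x3) by iteratively reweighted least squares."""
    n = src.shape[0]
    X = np.hstack([src, np.ones((n, 1))])             # n x 3 design matrix
    params = np.zeros((3, 2)) if init is None else init.T.copy()
    for _ in range(n_iter):
        r = np.linalg.norm(dst - X @ params, axis=1)  # per-point residual norms
        w = weight_fn(r)
        W = np.sqrt(w)[:, None]
        params, *_ = np.linalg.lstsq(W * X, W * dst, rcond=None)
    return params.T                                   # 2 x 3 affine matrix

# Synthetic correspondences: a known affine map, small noise, 10% gross outliers.
rng = np.random.default_rng(0)
src = rng.uniform(0, 100, size=(200, 2))
A_true = np.array([[1.02, 0.05, 3.0], [-0.04, 0.98, -2.0]])
dst = np.hstack([src, np.ones((200, 1))]) @ A_true.T + rng.normal(0, 0.2, (200, 2))
dst[:20] += rng.uniform(30, 60, size=(20, 2))          # gross outliers

# Soft-to-hard schedule: a convex (Huber) stage warm-starts the non-convex
# (Geman-McClure) stage, standing in here for the coarse-to-fine pyramid levels.
A_coarse = irls_affine(src, dst, huber_weight)
A_fine = irls_affine(src, dst, geman_mcclure_weight, init=A_coarse)
print(np.round(A_fine, 3))
```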

Force on a High Voltage Capacitor with Asymmetrical Electrodes

When a high DC voltage is applied to a capacitor with strongly asymmetrical electrodes, it generates a mechanical force that acts on the whole capacitor. This phenomenon is most likely caused by the motion of ions generated around the smaller of the two electrodes and their subsequent interaction with the surrounding medium. A method to measure this force has been devised and used. A formula describing the force has also been derived. A comparison of the experimental data with the values given by the theoretical formula revealed a discrepancy above a certain value of current. This paper also gives reasons for this discrepancy.

Hidden State Probabilistic Modeling for Complex Wavelet Based Image Registration

This article presents a computationally tractable probabilistic model for the relation between the complex wavelet coefficients of two images of the same scene. The two images are acquired at distinct moments in time, from distinct viewpoints, or by distinct sensors. By means of the introduced probabilistic model, we argue that the similarity between the two images is controlled not by the values of the wavelet coefficients, which can be altered by many factors, but by the nature of the wavelet coefficients, which we model with the help of hidden state variables. We integrate this probabilistic framework into the construction of a new image registration algorithm. This algorithm has sub-pixel accuracy and is robust to noise and to other variations such as local illumination changes. We present the performance of our algorithm on various image types.
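To make the hidden-state idea concrete, the sketch below classifies each coefficient as "significant" or "insignificant" under a simple two-state Gaussian mixture and scores similarity by the agreement of those inferred states rather than by the coefficient values themselves. The mixture, its parameters, and the synthetic coefficients are assumptions for illustration only; this is not the model proposed in the paper.

```python
import numpy as np

def significance_posterior(mag, p_sig=0.2, sigma_small=0.5, sigma_large=3.0):
    """Posterior probability that a coefficient is in the 'significant' hidden state,
    under an assumed two-state zero-mean Gaussian mixture on its magnitude."""
    like_small = np.exp(-0.5 * (mag / sigma_small) ** 2) / sigma_small
    like_large = np.exp(-0.5 * (mag / sigma_large) ** 2) / sigma_large
    num = p_sig * like_large
    return num / (num + (1 - p_sig) * like_small)

def state_similarity(c1, c2):
    """Similarity based on agreement of hidden-state posteriors rather than on the
    raw coefficient values, which may differ due to illumination or sensor changes."""
    p1 = significance_posterior(np.abs(c1))
    p2 = significance_posterior(np.abs(c2))
    return np.mean(p1 * p2 + (1 - p1) * (1 - p2))

# Toy example: the second coefficient set is a rescaled, noisy copy of the first
# (e.g. a global illumination change); similarity is scored on inferred states.
rng = np.random.default_rng(0)
states = rng.random(1000) < 0.2
c1 = np.where(states, rng.normal(0, 3.0, 1000), rng.normal(0, 0.5, 1000))
c2 = 1.8 * c1 + rng.normal(0, 0.2, 1000)
print(state_similarity(c1, c2))
```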

Evaluation of Structural Behavior of Wide Sleepers on Asphalt Trackbed Due to Embedded Shear Keys

Korea Train eXpress (KTX) is now in operation, making Korea one of the countries that operate a high-speed rail system. High-speed rail has the advantage of transporting people and materials in a short time, which has led to much research in this area. For classical high-speed trackbed systems, the maintenance and serviceability of the gravel ballast system is costly. Recently, concrete trackbed structures have been introduced as a replacement for the classical trackbed system; in this case, the sleeper plays a critical role. The current study set out to develop track sleepers readily applicable to the top of an asphalt trackbed, as part of a trackbed study utilizing asphalt material. Among many possible sleeper shapes and designs, the current study proposes two types of wide sleepers according to the shear-key installation method. A structural behavior analysis and safety evaluation of each case was conducted using the Korean design standard.

Two Area Power Systems Economic Dispatch Problem Solving Considering Transmission Capacity Constraints

This paper describes an efficient and practical method for the economic dispatch problem in one- and two-area electrical power systems, considering the capacity constraint of the tie transmission line. The direct search method (DSM) is used with equality and inequality constraints on the production units and with any kind of fuel cost function. With this method, it is possible to handle several inequality constraints without difficulty, even for complex cost functions or when the derivative of the cost function is unavailable. To minimize the total number of iterations in the searching process, multi-level convergence is incorporated into the DSM. An enhanced direct search method (EDSM) for the two-area power system is then investigated. An initial calculation step size that leads to fewer iterations, and hence less calculation time, is presented. The effect of the tie-line transmission capacity between areas on the economic dispatch problem and on the total generation cost is studied; line compensation and combined active and reactive power dispatch are proposed to overcome the high generation costs of this multi-area system.
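As a small illustration of direct-search dispatch, the sketch below minimizes a single-area quadratic fuel cost with a compass-style search, treating the last unit as a slack variable for the power balance and penalizing limit violations; the shrinking step loosely mirrors the multi-level convergence idea. The cost coefficients, limits, demand, and penalty handling are illustrative assumptions, not the EDSM of the paper.

```python
import numpy as np

# Quadratic fuel-cost coefficients a + b*P + c*P^2 for three illustrative units
# (values are made up for the example, not taken from the paper).
a = np.array([100.0, 120.0, 80.0])
b = np.array([2.0, 1.8, 2.2])
c = np.array([0.010, 0.012, 0.008])
p_min = np.array([50.0, 50.0, 40.0])
p_max = np.array([300.0, 250.0, 200.0])
demand = 450.0

def dispatch_cost(P_free, penalty=1e4):
    """Cost of a candidate: the last unit absorbs the power balance (slack),
    and any limit violation is added as a penalty term."""
    P = np.append(P_free, demand - np.sum(P_free))
    fuel = np.sum(a + b * P + c * P**2)
    viol = np.sum(np.maximum(p_min - P, 0) + np.maximum(P - p_max, 0))
    return fuel + penalty * viol

def direct_search(x0, step=20.0, min_step=1e-3, shrink=0.5):
    """Compass-style direct search: probe +/- step on each free unit, keep any
    improvement, and shrink the step when no probe improves (multi-level idea)."""
    x, best = x0.copy(), dispatch_cost(x0)
    while step > min_step:
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                cand = x.copy()
                cand[i] += delta
                cost = dispatch_cost(cand)
                if cost < best - 1e-9:
                    x, best, improved = cand, cost, True
        if not improved:
            step *= shrink
    return x, best

x0 = np.full(2, demand / 3)                        # free outputs of units 1 and 2
x_opt, cost_opt = direct_search(x0)
P_opt = np.append(x_opt, demand - np.sum(x_opt))
print(np.round(P_opt, 2), round(cost_opt, 2))
```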

How Can Celebrities Be Used in Advertising to the Best Advantage?

The ever-increasing product diversity and competition on the market of goods and services has dictated the pace of growth in the number of advertisements. Despite their admittedly diminished effectiveness over recent years, advertisements remain the favored method of sales promotion. Consequently, the challenge for an advertiser is to explore every possible avenue of making an advertisement more noticeable, attractive and compelling for consumers. One way to achieve this is through celebrity endorsements. On the one hand, the use of a celebrity to endorse a product involves substantial costs; on the other hand, it does not immediately guarantee the success of an advertisement. The question of how celebrities can be used in advertising to the best advantage is therefore of utmost importance. Celebrity endorsements have become commonplace: empirical evidence indicates that approximately 20 to 25 per cent of advertisements feature a famous person as a product endorser. The popularity of celebrity endorsements demonstrates the relevance of the topic, especially in the context of the current global economic downturn, when companies are forced to save in order to survive, yet simultaneously to invest heavily in advertising and sales promotion. The issue of the effective use of celebrity endorsements also figures prominently in academic discourse. The study presented below is thus aimed at exploring which qualities (characteristics) of a celebrity endorser have an impact on the effectiveness of the advertisement in which he or she appears, and how.

Ethanol Production from Sugarcane Bagasse by Means of Enzymes Produced by Solid State Fermentation Method

Nowadays there is growing interest in biofuel production in most countries because of increasing concerns about hydrocarbon fuel shortages and global climate change, as well as the wish to strengthen the agricultural economy and meet local transportation fuel needs. Ethanol can be produced from biomass by hydrolysis and sugar fermentation processes. In this study ethanol was produced from sugarcane bagasse without using expensive commercial enzymes. Alkali pretreatment was used to prepare the biomass before enzymatic hydrolysis. A comparison between NaOH, KOH and Ca(OH)2 shows that NaOH is more effective on bagasse. The enzymes required for biomass hydrolysis were produced by solid state fermentation of sugarcane using two fungi: Trichoderma longibrachiatum and Aspergillus niger. The results show that the enzyme solution produced by A. niger performed better than that produced by T. longibrachiatum. Ethanol was produced by simultaneous saccharification and fermentation (SSF) with the crude enzyme solution from T. longibrachiatum and Saccharomyces cerevisiae yeast. To evaluate this procedure, SSF of the pretreated bagasse was also carried out using Celluclast 1.5L from Novozymes. The yields of ethanol production with the commercial enzyme and with the enzyme solution produced by T. longibrachiatum were 81% and 50%, respectively.

A Pairwise-Gaussian-Merging Approach: Towards Genome Segmentation for Copy Number Analysis

Segmentation, filtering out of measurement errors, and identification of breakpoints are integral parts of any analysis of microarray data for the detection of copy number variation (CNV). Existing algorithms designed for these tasks have had some success in the past, but they tend to be O(N²) in computation time or memory requirement, or both, and the rapid advance of microarray resolution has practically rendered such algorithms useless. Here we propose an algorithm, SAD, that is much faster and far less memory-hungry (O(N) in both computation time and memory requirement) and offers higher accuracy. The two key ingredients of SAD are the fundamental assumption in statistics that measurement errors are normally distributed and the mathematical fact that the product of two Gaussians is another Gaussian (function). We have produced a computer program for analyzing CNV based on SAD. In addition to being fast and small, it offers two important features: quantitative statistics for its predictions and, with only two user-decided parameters, ease of use. Its speed shows little dependence on the genomic profile. Running on an average modern computer, it completes CNV analyses for a 262 thousand-probe array in ~1 second and for a 1.8 million-probe array in 9 seconds.
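The core identity the abstract relies on can be shown in a few lines: the product of two Gaussian densities is, up to a normalizing constant, another Gaussian with a precision-weighted mean. The sketch below uses that identity to merge neighboring probes into segments on a toy log-ratio profile in a single O(N) pass; the merging rule, threshold, and data are illustrative assumptions and do not reproduce the SAD program itself.

```python
import numpy as np

def merge_gaussians(mu1, var1, mu2, var2):
    """Product of N(mu1, var1) and N(mu2, var2) is, up to a constant, another
    Gaussian whose mean is the precision-weighted average of the two means."""
    var = 1.0 / (1.0 / var1 + 1.0 / var2)
    mu = var * (mu1 / var1 + mu2 / var2)
    return mu, var

def segment(means, variances, z_thresh=3.0):
    """Pairwise merging of neighboring probes: a probe consistent with the current
    segment-level Gaussian is merged into it; otherwise a breakpoint is declared."""
    segs = [(means[0], variances[0], 1)]             # (mean, variance, #probes)
    for m, v in zip(means[1:], variances[1:]):
        mu, var, n = segs[-1]
        z = abs(m - mu) / np.sqrt(v + var)           # rough z-statistic
        if z < z_thresh:
            new_mu, new_var = merge_gaussians(mu, var, m, v)
            segs[-1] = (new_mu, new_var, n + 1)
        else:
            segs.append((m, v, 1))                   # breakpoint: new segment
    return segs

# Toy log-ratio profile: a single-copy gain between probes 100 and 200.
rng = np.random.default_rng(1)
signal = np.zeros(300)
signal[100:200] = 0.58                               # ~log2(3/2)
obs = signal + rng.normal(0, 0.15, size=300)
print(len(segment(obs, np.full(300, 0.15**2))))      # ideally 3 segments
```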

Cascade Kalman Filter Configuration for Low Cost IMU/GPS Integration in Car Navigation Like Robot

This paper introduces a low-cost INS/GPS algorithm for land vehicle navigation applications. The data fusion process is carried out with an extended Kalman filter in a cascade configuration. MATLAB software has been developed to perform the numerical simulations, and a loosely coupled configuration is considered. The results obtained in this work demonstrate that a low-cost INS/GPS navigation system is partially capable of meeting the performance requirements for land vehicle navigation. The relative effectiveness of the Kalman filter implementation in the integrated GPS/INS navigation algorithm is highlighted. The paper also provides experimental results from a field test carried out with a car.
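For context, the sketch below shows a minimal loosely coupled fusion loop: a linear Kalman filter propagates a position-velocity state with an INS-style acceleration input and corrects it with intermittent GPS position fixes. The state model, noise values, and update rates are illustrative assumptions; the cascade extended Kalman filter of the paper is not reproduced here.

```python
import numpy as np

dt = 0.1
# State x = [position, velocity]; INS acceleration drives the prediction,
# GPS supplies noisy position fixes for the update (loosely coupled fusion).
F = np.array([[1.0, dt], [0.0, 1.0]])               # state transition
B = np.array([0.5 * dt**2, dt])                     # acceleration input
H = np.array([[1.0, 0.0]])                          # GPS measures position only
Q = np.diag([1e-4, 1e-3])                           # process noise (illustrative)
R = np.array([[4.0]])                               # GPS position variance, m^2

def predict(x, P, accel):
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z_gps):
    y = z_gps - H @ x                                # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                   # Kalman gain
    x = x + (K @ y).flatten()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Simulate a car accelerating at 1 m/s^2; GPS fix every 10 INS steps (1 Hz).
rng = np.random.default_rng(0)
x_true = np.zeros(2)
x_est, P_est = np.zeros(2), np.eye(2)
for k in range(200):
    accel = 1.0
    x_true = F @ x_true + B * accel
    x_est, P_est = predict(x_est, P_est, accel + rng.normal(0, 0.05))
    if k % 10 == 0:
        z = np.array([x_true[0] + rng.normal(0, 2.0)])
        x_est, P_est = update(x_est, P_est, z)
print(round(x_true[0], 2), round(x_est[0], 2))
```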

An Assessment of Food Control System and Development Perspective: The Case of Myanmar

Food control measures are critical in fostering the food safety management of a nation. However, no academic study has so far been undertaken to assess the food control system of Myanmar. The objective of this research paper was to assess the food control system through an in-depth examination of five key components, using desktop analysis and a short survey of related food safety program organizations, including regulators and inspectors. The study showed that the existing food control system is conventional, mainly focusing on a primary health care approach while relying on reactive measures. The achievements of food control work have been limited to a certain extent by the insufficient technical capacity, in terms of staff training, laboratory equipment and technical assistance, associated with the various sectors. Since assessing food control measures is the first step in the integration of food safety management, this paper could assist policy makers by providing information for enhancing the safety and quality of food produced and consumed in Myanmar.

Numerical Study of Iterative Methods for the Solution of the Dirichlet-Neumann Map for Linear Elliptic PDEs on Regular Polygon Domains

A generalized Dirichlet to Neumann map is one of the main aspects characterizing a recently introduced method for analyzing linear elliptic PDEs, through which it became possible to couple known and unknown components of the solution on the boundary of the domain without solving in its interior. For its numerical solution, a well-conditioned, quadratically convergent sine-Collocation method was developed, which yields a linear system of equations whose associated coefficient matrix has point-diagonal diagonal blocks. This structural property, among others, motivated the employment of iterative methods for its solution. In this work we present a conclusive numerical study of the behavior of classical (Jacobi and Gauss-Seidel) and Krylov subspace (GMRES and Bi-CGSTAB) iterative methods when they are applied to the solution of the Dirichlet to Neumann map associated with Laplace's equation on regular polygons with the same boundary conditions on all edges.
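As a reference point for the classical iterations studied, the sketch below implements Jacobi and Gauss-Seidel on a small, artificially diagonally dominant test system; it stands in for, and does not reproduce, the collocation matrix of the Dirichlet to Neumann map.

```python
import numpy as np

def jacobi(A, b, tol=1e-10, max_iter=500):
    """Classical Jacobi iteration: x_{k+1} = D^{-1} (b - (A - D) x_k)."""
    D = np.diag(A)
    R = A - np.diag(D)
    x = np.zeros_like(b)
    for k in range(max_iter):
        x_new = (b - R @ x) / D
        if np.linalg.norm(x_new - x, ord=np.inf) < tol:
            return x_new, k + 1
        x = x_new
    return x, max_iter

def gauss_seidel(A, b, tol=1e-10, max_iter=500):
    """Gauss-Seidel: like Jacobi, but uses already-updated components immediately."""
    n = len(b)
    x = np.zeros_like(b)
    for k in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
            x[i] = (b[i] - s) / A[i, i]
        if np.linalg.norm(x - x_old, ord=np.inf) < tol:
            return x, k + 1
    return x, max_iter

# Small test system with the diagonal strengthened to guarantee convergence.
rng = np.random.default_rng(0)
A = rng.normal(size=(50, 50))
A += np.diag(2.0 * np.abs(A).sum(axis=1))
b = rng.normal(size=50)
for solver in (jacobi, gauss_seidel):
    x, iters = solver(A, b)
    print(solver.__name__, iters, np.linalg.norm(A @ x - b))
```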

Fast Calculation for Particle Interactions in SPH Simulations: Outlined Sub-domain Technique

A simple and easy algorithm is presented for the fast calculation of the kernel functions required in fluid simulations using the Smoothed Particle Hydrodynamics (SPH) method. The proposed algorithm improves on the linked-list algorithm and adopts the pair-wise interaction technique, both of which are widely used for evaluating kernel functions in SPH fluid simulations. The algorithm is easy to implement without any programming complexity. Some benchmark examples are used to show the simulation time saved by using the proposed algorithm. Parametric studies on the number of divisions of the sub-domains, the smoothing length and the total number of particles are conducted to show the effectiveness of the present technique. A compact formulation is proposed for practical usage.
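For orientation, the sketch below shows the baseline the paper builds on: a cell linked-list neighbor search that bins particles into cells of the kernel support size and visits each interacting pair once, then evaluates a cubic spline kernel for each pair. The particle data, smoothing length, and kernel choice are illustrative assumptions; the outlined sub-domain technique itself is not reproduced.

```python
import numpy as np
from collections import defaultdict

def neighbor_pairs(pos, h):
    """Cell linked-list search in 2D: bin particles into cells of size 2h (the
    kernel support radius) and test only same or adjacent cells. Each pair (i, j)
    is returned once, matching the pair-wise interaction idea."""
    cell_size = 2.0 * h
    cells = defaultdict(list)
    for i, p in enumerate(pos):
        cells[tuple((p // cell_size).astype(int))].append(i)
    pairs = []
    for (cx, cy), members in cells.items():
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                others = cells.get((cx + dx, cy + dy), [])
                for i in members:
                    for j in others:
                        if j > i and np.linalg.norm(pos[i] - pos[j]) < 2.0 * h:
                            pairs.append((i, j))
    return pairs

def cubic_spline(r, h):
    """Standard cubic spline kernel value (2D normalization)."""
    q = r / h
    sigma = 10.0 / (7.0 * np.pi * h**2)
    if q < 1.0:
        return sigma * (1 - 1.5 * q**2 + 0.75 * q**3)
    elif q < 2.0:
        return sigma * 0.25 * (2 - q) ** 3
    return 0.0

rng = np.random.default_rng(0)
pos = rng.uniform(0, 1, size=(2000, 2))
h = 0.02
pairs = neighbor_pairs(pos, h)
w = [cubic_spline(np.linalg.norm(pos[i] - pos[j]), h) for i, j in pairs]
print(len(pairs), round(sum(w), 2))
```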

Motor Skill Adaptation Depends On the Level of Learning

An experiment was conducted to examine the effect of the level of performance stabilization on human adaptability to perceptual-motor perturbation in a complex coincident timing task. Three levels of performance stabilization were established operationally: pre-stabilization, stabilization, and super-stabilization groups. Each group practiced the task until it reached its level of stabilization, in a constant sequence of movements and under a constant time constraint, before exposure to the perturbation. The results clearly showed that performance stabilization is a pre-condition for adaptation. Moreover, variability before reaching stabilization is harmful to adaptation, whereas persistent variability after stabilization is beneficial. Finally, the behavior of variability is specific to each measure.

Flour and Bread Quality of Spring Spelt

The article contains the results of an assessment of the quality of flour and bread made from the grains of spring spelt, also known as an ancient wheat. The spelt was cultivated on heavy and medium soils following the principles of organic farming. Based on laboratory studies of the flour and bread, as well as laboratory baking, the technological usefulness of the studied flour was determined. These results were compared with a standard derived from common wheat cultivated under the same conditions. The grain of spring spelt is a good raw material for manufacturing bread flour from which high-quality bakery products can be obtained, but this is strictly dependent on the variety of the ancient wheat.

Synthesis and Characterization of Plasma Polymerized Thin Films Deposited from Benzene and Hexamethyldisiloxane Using the PECVD Method

Polymer-like organic thin films were deposited on both aluminum alloy 6061 and glass substrates at room temperature by the Plasma Enhanced Chemical Vapor Deposition (PECVD) method, using benzene and hexamethyldisiloxane (HMDSO) as precursor materials. The surface and physical properties of the plasma-polymerized organic thin films were investigated at different r.f. powers. The effects of the benzene/argon ratio on the properties of plasma-polymerized benzene films were also investigated. It was found that using benzene alone results in a non-coherent, non-adherent, powdery deposited material. The chemical structure and surface properties of the as-grown plasma-polymerized thin films on glass substrates were analyzed with FTIR and contact angle measurements. FTIR spectra of the benzene-deposited films indicated that the benzene rings are preserved when the benzene ratio is increased and/or the r.f. power is decreased. FTIR spectra of the HMDSO-deposited films indicated an increase in the hydrogen concentration and a decrease in the oxygen concentration with increasing r.f. power. The contact angle (θ) of the films prepared from benzene was found to increase by about 43% as the benzene ratio increased from 10% to 20%; θ then decreased to its original value (51°) when the benzene ratio increased to 100%. The contact angle θ for both the benzene- and HMDSO-deposited films was found to increase with r.f. power, signifying that the plasma-polymerized organic films have substantially lower surface energy as the r.f. power increases. The corrosion resistance of the aluminum alloy substrate, both bare and covered with the plasma-polymerized thin films, was evaluated by potentiodynamic polarization measurements in a standard 3.5 wt.% NaCl solution at room temperature. The results indicate that the benzene- and HMDSO-deposited films are suitable for protecting the aluminum substrate against corrosion. Changes in the processing parameters appear to have a strong influence on the protective ability of the films. The surface roughness of the films deposited on the aluminum alloy substrate was investigated using scanning electron microscopy (SEM). The SEM images indicate that the surface roughness of the benzene-deposited films increases with decreasing benzene ratio, and that the surface roughness of both the benzene- and HMDSO-deposited films decreases with increasing r.f. power. These results indicate that the films produced are suitable for specific practical applications.

Damage of Tubular Equipment in Process Industry

Tubular process equipment is often damaged in industrial processes. The damage occurs both in devices working at high temperatures and in less exposed devices. In the case of sudden damage to key equipment, a shutdown of the whole production unit and the resulting significant economic losses are imminent. This paper presents solutions for several types of damaged tubular process equipment. In all cases, the causes of the damage and suggested corrective actions are discussed. A very important part is the analysis of the operational conditions, the determination of unfavourable working states that decrease the lifetime of the devices, and the suggestion of corrective actions. Numerical methods, which have recently become very popular, are used for the analysis of the equipment.

Deixis and Personalization in Ad Slogans

This study examines the use of the persuasive strategy of deixis and personalization in advertising slogans. This rhetorical/stylistic and linguistic strategy has been widely used in advertising slogans for over a century. A total of five hundred advertising slogans of multinational companies in both the product and service sectors were collected. The analysis reveals the three main components of this strategy to be deictic words, absolute uniqueness, and personal pronouns. The percentage and mean of the use of the three components are tabulated. The findings show that advertisers have used this persuasive strategy in creative ways to persuade consumers to buy their products and services.

Dissipation of Higher Mode using Numerical Integration Algorithm in Dynamic Analysis

In general dynamic analyses, the lower-mode response is of interest; however, the higher modes of spatially discretized equations generally do not represent the real behavior and do not affect the global response much. Some implicit algorithms have therefore been introduced to filter out the high-frequency modes using intentional numerical error. The objective of this study is to introduce the P-method and the PC α-method and to compare them with the dissipation method and the Newmark method through stability analysis and a numerical example. The PC α-method gives higher accuracy than the other methods because it is based on the α-method and inherits the superior properties of the implicit α-method. In finite element analysis, the PC α-method is more useful than the other methods because it is an explicit scheme that achieves second-order accuracy and numerical damping simultaneously.
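To illustrate the kind of algorithmic dissipation being discussed, the sketch below applies the Newmark scheme to an under-resolved high-frequency single-degree-of-freedom mode: with γ = 1/2 the amplitude is preserved, while γ > 1/2 (with β = (γ + 1/2)²/4) introduces numerical damping that suppresses the spurious oscillation. The parameters and the test problem are illustrative; the P-method and PC α-method of the paper are not reproduced here.

```python
import numpy as np

def newmark_sdof(m, c, k, f, dt, n_steps, beta=0.25, gamma=0.5, u0=0.0, v0=0.0):
    """Newmark time stepping for a single-degree-of-freedom system
    m*a + c*v + k*u = f(t). gamma = 0.5 gives no numerical damping;
    gamma > 0.5 with beta = (gamma + 0.5)**2 / 4 damps the high-frequency response."""
    u, v = u0, v0
    a = (f(0.0) - c * v - k * u) / m
    hist = [u]
    for n in range(1, n_steps + 1):
        t = n * dt
        # Predictors built from the known state at the previous step.
        u_pred = u + dt * v + dt**2 * (0.5 - beta) * a
        v_pred = v + dt * (1.0 - gamma) * a
        # Solve the scalar equation of motion for the new acceleration.
        a_new = (f(t) - c * v_pred - k * u_pred) / (m + gamma * dt * c + beta * dt**2 * k)
        u = u_pred + beta * dt**2 * a_new
        v = v_pred + gamma * dt * a_new
        a = a_new
        hist.append(u)
    return np.array(hist)

# Free vibration of a stiff (high-frequency) mode, deliberately under-resolved:
# natural period ~0.063 s, time step 0.05 s.
m, k, dt = 1.0, 1.0e4, 0.05
u_plain = newmark_sdof(m, 0.0, k, lambda t: 0.0, dt, 200, u0=1.0)
u_damped = newmark_sdof(m, 0.0, k, lambda t: 0.0, dt, 200, beta=0.3025, gamma=0.6, u0=1.0)
print(np.abs(u_plain[-40:]).max(), np.abs(u_damped[-40:]).max())
```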

A Bi-Objective Model for Location-Allocation Problem within Queuing Framework

This paper proposes a bi-objective model for the facility location problem under a congestion system. The model is motivated by applications such as locating servers for bank automated teller machines (ATMs), communication networks, and so on. It is specifically intended for situations in which fixed service facilities are congested by stochastic demand within a queuing framework. We formulate this model from two perspectives simultaneously: (i) the customers and (ii) the service provider. The objectives of the model are to minimize (i) the total expected travelling and waiting time and (ii) the average facility idle time. The model is a mixed-integer nonlinear programming problem belonging to the class of NP-hard problems. To solve it, two metaheuristic algorithms are proposed: the non-dominated sorting genetic algorithm (NSGA-II) and the non-dominated ranking genetic algorithm (NRGA). In addition, to evaluate the performance of the two algorithms, several numerical examples are generated and analyzed with a set of metrics to determine which algorithm works better.
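Both metaheuristics rely on ranking candidate solutions into Pareto fronts. The sketch below shows that non-dominated sorting step on randomly generated objective pairs standing in for the two objectives (expected travel-plus-waiting time and average facility idle time); it is a simple illustration, not the NSGA-II or NRGA implementation of the paper.

```python
import numpy as np

def dominates(p, q):
    """p dominates q if it is no worse in every objective and strictly better in
    at least one (both objectives are minimized)."""
    return np.all(p <= q) and np.any(p < q)

def non_dominated_sort(F):
    """Return a list of fronts (lists of indices), front 0 being the Pareto set."""
    remaining = set(range(len(F)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(F[j], F[i]) for j in remaining if j != i)]
        fronts.append(front)
        remaining -= set(front)
    return fronts

# Illustrative population: objective 1 = expected travel + waiting time,
# objective 2 = average facility idle time (values are random stand-ins).
rng = np.random.default_rng(0)
F = rng.uniform(0, 1, size=(50, 2))
fronts = non_dominated_sort(F)
print("Pareto-optimal candidates:", fronts[0])
```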

Application of Pearson Parametric Distribution Model in Fatigue Life Reliability Evaluation

The aim of this paper is to introduce a parametric distribution model into fatigue life reliability analysis dealing with variation in material properties. Service loads, in the form of a response-time history signal recorded on Belgian pavé, were replicated on a multi-axial spindle-coupled road simulator, and the stress-life method was used to estimate the fatigue life of an automotive stub axle. A PSN curve was obtained by a monotonic tension test, and a two-parameter Weibull distribution function was used to obtain the mean life of the component. A Pearson system was developed to evaluate the fatigue life reliability by considering the stress range intercept and the slope of the PSN curve as random variables. Assuming a normal distribution of fatigue strength, the fatigue life of the stub axle was found to have its highest reliability between 10,000 and 15,000 cycles. Taking into account the variation of material properties associated with the size effect and with machining and manufacturing conditions, the method described in this study can be effectively applied to determine the probability of failure of mass-produced parts.
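As a small worked example of the two-parameter Weibull step mentioned above, the sketch below computes the mean life and the reliability (survival probability) at several cycle counts. The shape and scale values are illustrative placeholders, not the parameters fitted in the paper.

```python
import math

def weibull_mean_life(shape, scale):
    """Mean life of a two-parameter Weibull distribution: scale * Gamma(1 + 1/shape)."""
    return scale * math.gamma(1.0 + 1.0 / shape)

def weibull_reliability(n_cycles, shape, scale):
    """Survival probability R(N) = exp(-(N/scale)^shape)."""
    return math.exp(-((n_cycles / scale) ** shape))

# Illustrative parameters only (not the fitted values from the paper).
shape, scale = 2.0, 20000.0                          # scale expressed in load cycles
print("mean life:", round(weibull_mean_life(shape, scale)))
for n in (10000, 15000, 20000):
    print(n, "cycles -> reliability", round(weibull_reliability(n, shape, scale), 3))
```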