Robustness of Hybrid Learning Acceleration Feedback Control Scheme in Flexible Manipulators

This paper describes a practical approach to the design and development of a hybrid learning with acceleration feedback control (HLC) scheme for input tracking and end-point vibration suppression of flexible manipulator systems. Initially, a collocated proportional-derivative (PD) control scheme using hub-angle and hub-velocity feedback is developed to control the rigid-body motion of the system. This is then extended to a hybrid scheme that combines the collocated PD control with iterative learning control and acceleration feedback, using genetic algorithms (GAs) to optimize the learning parameters. Experimental results for the response of the manipulator under the control schemes are presented in the time and frequency domains. The performance of the HLC scheme is assessed in terms of input tracking, the level of vibration reduction at the resonance modes, and robustness to various payloads.
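
For illustration, a minimal discrete-time sketch of the two control components described above; the gain names (kp, kd, gamma, beta) and sign conventions are assumptions for the sketch, not the authors' implementation, and in the paper the learning gains are tuned by a GA:

```python
def pd_control(theta_ref, theta, theta_dot, kp, kd):
    """Collocated PD law on hub angle and hub velocity (rigid-body motion)."""
    return kp * (theta_ref - theta) - kd * theta_dot

def ilc_update(u_prev, err_prev, acc_prev, gamma, beta):
    """One iterative-learning pass with acceleration feedback:
    u_{k+1}(t) = u_k(t) + gamma * e_k(t) - beta * a_k(t),
    where e_k is the trial-k tracking error and a_k the measured
    end-point acceleration; gamma and beta are the GA-tuned gains."""
    return u_prev + gamma * err_prev - beta * acc_prev
```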

Cross-Search Technique and its Visualization of Peer-to-Peer Distributed Clinical Documents

Searching through voluminous piles of clinical documents is one of the ubiquitous routines in medical practice. In this paper we introduce a distributed system for searching and exchanging clinical documents, which are distributed peer-to-peer. Relevant information is found through multiple iterations of cross-searches between the clinical text and its domain encyclopedia.

A Comparative Study of SVM Classifiers and Artificial Neural Networks for Rolling Element Bearing Fault Diagnosis Using Wavelet Transform Preprocessing

The effectiveness of artificial neural network (ANN) and support vector machine (SVM) classifiers for fault diagnosis of rolling element bearings is presented in this paper. Characteristic features of vibration signals from a rotating driveline, run both in its normal condition and with introduced faults, were used as inputs to the ANN and SVM classifiers. Simple statistical features of the time-domain vibration signal segments, such as standard deviation, skewness and kurtosis, along with the signal peaks and the peak of the power spectral density (PSD), are used as classifier inputs. The effect of preprocessing the vibration signal with the discrete wavelet transform (DWT) prior to feature extraction is also studied. The experimental results show that the SVM classifier identifies the bearing condition better than the ANN, and that preprocessing the vibration signal with the DWT enhances the effectiveness of both classifiers.
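
As a minimal sketch of this pipeline (not the authors' exact feature set or classifier settings, which the abstract does not specify), the segment features could be computed on the level-1 DWT output and fed to off-the-shelf classifiers:

```python
import numpy as np
import pywt
from scipy.stats import skew, kurtosis
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

def segment_features(segment, wavelet="db4"):
    """Statistical features of one vibration segment after single-level
    DWT preprocessing: std, skewness, kurtosis and peak value."""
    approx, _detail = pywt.dwt(segment, wavelet)
    return [np.std(approx), skew(approx), kurtosis(approx), np.max(np.abs(approx))]

# X: iterable of raw vibration segments, y: bearing-condition labels
# X_feat = np.array([segment_features(s) for s in X])
# svm, ann = SVC().fit(X_feat, y), MLPClassifier().fit(X_feat, y)
```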

Extraction of Significant Phrases from Text

Prospective readers can quickly determine whether a document is relevant to their information needs if the significant phrases (or keyphrases) of the document are provided. Although keyphrases are useful, not many documents have keyphrases assigned to them, and manually assigning keyphrases to existing documents is costly. There is therefore a need for automatic keyphrase extraction. This paper introduces a new domain-independent keyphrase extraction algorithm. The algorithm approaches keyphrase extraction as a classification task, and uses a combination of statistical and computational linguistics techniques, a new set of attributes, and a new machine learning method to distinguish keyphrases from non-keyphrases. The experiments indicate that this algorithm performs better than other keyphrase extraction tools and that it significantly outperforms Microsoft Word 2000's AutoSummarize feature. The domain independence of the algorithm has also been confirmed in our experiments.
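
To make the classification framing concrete, here is a small hedged sketch of candidate enumeration with two classic attributes (term frequency and relative position of first occurrence); the paper's own attribute set and learning method are not reproduced here:

```python
import re

def phrase_attributes(text, max_len=3):
    """Attributes for every word n-gram (n <= max_len): term frequency
    and relative position of first occurrence. A classifier trained on
    such vectors separates keyphrases from non-keyphrases."""
    words = re.findall(r"[a-z]+", text.lower())
    feats = {}
    for n in range(1, max_len + 1):
        for i in range(len(words) - n + 1):
            cand = " ".join(words[i:i + n])
            if cand not in feats:
                feats[cand] = [0, i / max(len(words), 1)]  # [tf, first-pos]
            feats[cand][0] += 1
    return feats
```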

An Interactive 3D Experience for the Creation of Personalized Styling

This research proposes an Interactive 3D Experience to enhance customer value in the fantasy era. As products reach maturity, they become more similar in the range of functions they provide. This leads to competition based on reduced retail prices and ultimately to reduced profitability. A competitive design method is therefore needed that can produce higher-value products. An Enhanced Value Experience has been identified that can assist designers in providing quality products and giving them a unique positioning. On the basis of this value opportunity, the method of Interactive 3D Experience has been formulated and applied to the domain of retail furniture. Through this, customers can create their own personalized styling via the interactive 3D platform.

Reservoir Operation by Ant Colony Optimization for Continuous Domains (ACOR): Case Study of the Dez Reservoir

A direct search approach to determining optimal reservoir operation is proposed, using ant colony optimization for continuous domains (ACOR). The model is applied to a single-reservoir system to determine the optimum releases over 42 years of monthly time steps. A disadvantage of ant-colony-based methods, and of ACOR in particular, is the large amount of computer run time they consume. In this study a highly effective procedure for decreasing the run time has been developed. The results are compared with those of a GA-based model.
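
A minimal sketch of the ACOR machinery the abstract refers to: a ranked archive of solutions acts as a set of Gaussian kernels from which new candidate release schedules are sampled. The archive size k, number of ants m, and the q and xi parameters below are generic defaults, not the study's settings, and f stands in for the (not reproduced) reservoir objective:

```python
import numpy as np

def acor(f, lb, ub, k=10, m=5, q=0.1, xi=0.85, iters=200, seed=0):
    """ACOR: keep an archive of k solutions, sample m new ones per
    iteration from rank-weighted Gaussian kernels, keep the best k."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    X = rng.uniform(lb, ub, (k, len(lb)))
    F = np.array([f(x) for x in X])
    w = np.exp(-np.arange(k) ** 2 / (2 * (q * k) ** 2))
    p = w / w.sum()                                    # rank-based weights
    for _ in range(iters):
        order = np.argsort(F)
        X, F = X[order], F[order]
        news = []
        for _ant in range(m):
            j = rng.choice(k, p=p)                     # pick a kernel
            sigma = xi * np.abs(X - X[j]).sum(axis=0) / (k - 1)
            news.append(np.clip(rng.normal(X[j], sigma), lb, ub))
        X = np.vstack([X, news])
        F = np.concatenate([F, [f(x) for x in news]])
        keep = np.argsort(F)[:k]
        X, F = X[keep], F[keep]
    return X[0], F[0]
```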

Influence of Turbulence Model, Grid Resolution and Free-Stream Turbulence Intensity on the Numerical Simulation of the Flow Field around an Inclined Flat Plate

The flow field around a flat plate of infinite span has been investigated for several values of the angle of attack. Numerical predictions have been compared with experimental measurements in order to examine the effect of the turbulence model and the grid resolution on the resulting aerodynamic forces acting on the plate. The influence of the free-stream turbulence intensity at the inlet of the computational domain has also been investigated. A full campaign of simulations has been conducted for three inclination angles (9°, 15° and 30°) in order to obtain practical guidelines for the simulation of the flow field around inclined plates and discs.

Aspects that Motivate Users of a Design Engineering Wiki to Share Their Knowledge

Industrial design engineering is an information- and knowledge-intensive job. Although Wikipedia offers a lot of this information, design engineers are better served by a wiki tailored to their job, offering information in a compact manner and functioning as a design tool. For that reason WikID has been developed. However, an active user community is essential for the viability of a wiki. The main subject of this paper is a study of the influence of the communication and the contents of WikID on users' willingness to contribute. First, the theory on a website's first impression, general usability guidelines and user motivation in online communities is studied. Using this theory, the aspects of the current site are analyzed for their suitability. These results have been verified with a questionnaire among 66 industrial design engineers (or industrial design engineering students). The main conclusion is that design engineers are enthusiastic about the existence of WikID and its knowledge structure (taxonomy), but this structure does not become clear without guidance. In other words, the knowledge structure is very helpful for inspiring and guiding design engineers through their tailored knowledge domain in WikID, but the taxonomy has to be communicated better on the main page. In addition, the main page needs to be fitted more closely to the preferences of the target group.

Bifurcation Method for Computing Positive Solutions to a Class of Semilinear Elliptic Equations and Stability Analysis of Solutions

Semilinear elliptic equations are ubiquitous in the natural sciences. Because they admit rich families of multiple solutions, they give rise to a variety of important phenomena in quantum mechanics, nonlinear optics, astrophysics and other fields. However, the nontrivial solutions of semilinear equations such as the Lane-Emden equation, the Hénon equation and the Chandrasekhar equation are hard to compute because of their lack of stability. In this paper, a bifurcation method is applied to solving semilinear elliptic equations with homogeneous Dirichlet boundary conditions on 2D domains. Using this method, nontrivial numerical solutions are computed and visualized on many different domains (such as the square, disk, annulus and dumbbell).
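
For reference, the problem class has the generic form below; the Lane-Emden and Hénon nonlinearities are written in their commonly studied forms (the paper's exact parameter choices are not specified in the abstract):

```latex
% Generic problem: find nontrivial u with
%   -\Delta u = f(x, u) \ \text{in } \Omega \subset \mathbb{R}^2, \qquad
%    u = 0 \ \text{on } \partial\Omega.
% Commonly studied instances:
\begin{aligned}
  -\Delta u &= u^{p}                 && \text{(Lane--Emden)}\\
  -\Delta u &= |x|^{2\alpha}\,u^{p}  && \text{(H\'enon)}
\end{aligned}
```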

Specialization-Based Parallel Processing without Memo-trees

The purpose of this paper is to propose a framework for constructing correct parallel processing programs based on the Equivalent Transformation Framework (ETF). In this framework, a problem's domain knowledge and a query are described as definite clauses, and computation is regarded as the transformation of these definite clauses. Their meaning is defined by a model of the set of definite clauses, and the transformation rules generated must preserve this meaning. We have previously proposed a parallel processing method based on "specialization", one of the transformation operations, which resembles substitution in logic programming. That method requires a "Memo-tree", a history of specializations, to maintain correctness. In this paper we propose a new method for specialization-based parallel processing without the Memo-tree.

Concept Abduction in Description Logics with Cardinality Restrictions

Recently, the usefulness of Concept Abduction, a novel non-monotonic inference service for Description Logics (DLs), has been argued in the context of ontology-based applications such as semantic matchmaking and resource retrieval. A tableau-based method has been proposed to realize this reasoning task in ALN, a description logic that supports simple cardinality restrictions as well as other basic constructors. However, in many ontology-based systems, representing the ontology requires more expressive formalisms for capturing domain-specific constraints, and this language is not sufficient. In order to increase the applicability of the abductive reasoning method in such contexts, we present in this paper an extension of the tableau-based algorithm for dealing with concepts represented in ALCQ, the description logic that extends ALN with full concept negation and qualified number restrictions.
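
As a small illustration of the reasoning task (the concept names below are invented for the example, not taken from the paper):

```latex
% Concept Abduction Problem: given concepts C and D with C \sqcap D
% satisfiable, find a hypothesis H such that C \sqcap H \sqsubseteq D.
% Example in ALCQ with qualified number restrictions:
C = \mathsf{PC} \sqcap (\geq 1\, \mathsf{hasStorage}.\mathsf{SSD}), \qquad
D = \mathsf{PC} \sqcap (\geq 2\, \mathsf{hasStorage}.\mathsf{SSD}),
\qquad
H = (\geq 2\, \mathsf{hasStorage}.\mathsf{SSD}),
% since C \sqcap H \sqsubseteq D, while C alone is not subsumed by D.
```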

Microarray Denoising via Smoothing of Coefficients in the Wavelet Domain

We describe a novel method for removing noise of unknown variance from microarrays in the wavelet domain. The method is based on smoothing the coefficients of the highest subbands. Specifically, we decompose the noisy microarray into wavelet subbands, apply smoothing within each highest subband, and reconstruct the microarray from the modified wavelet coefficients. This process is applied a single time, and exclusively to the first level of decomposition; i.e., in most cases a multiresolution analysis is not necessary. The denoising results compare favorably with most of the methods currently in use.
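
A minimal sketch of this single-pass scheme; the abstract does not name the smoothing operator or the wavelet, so a simple moving-average filter and a db4 wavelet are used here as stand-ins:

```python
import pywt
from scipy.ndimage import uniform_filter

def denoise_microarray(img, wavelet="db4", size=3):
    """Single-level 2D DWT, smooth each detail (highest-frequency)
    subband, then reconstruct; applied once, no multiresolution analysis."""
    cA, (cH, cV, cD) = pywt.dwt2(img, wavelet)
    cH, cV, cD = (uniform_filter(c, size) for c in (cH, cV, cD))
    return pywt.idwt2((cA, (cH, cV, cD)), wavelet)
```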

Small Signal Stability Assessment Employing PSO Based TCSC Controller with Comparison to GA Based Design

This paper aims to select the optimal location and setting parameters of a TCSC (Thyristor Controlled Series Compensator) controller using Particle Swarm Optimization (PSO) and a Genetic Algorithm (GA) to mitigate small signal oscillations in a multimachine power system. Though Power System Stabilizers (PSSs) are the prime choice for this problem, installation of a FACTS device is suggested here in order to achieve appreciable damping of system oscillations. However, the performance of any FACTS device depends strongly on its parameters and on a suitable location in the power network. In this paper, PSO- and GA-based techniques are applied separately and their performances compared on this problem. The results of the small signal stability analysis are presented using eigenvalues as well as time-domain responses in the face of two common power system disturbances, e.g., varying load and transmission line outage. It is shown that the PSO-based TCSC controller is more effective than the GA-based controller, even under critical loading conditions.
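
For orientation, a generic particle swarm minimizer of the kind used for such tuning problems; the decision-vector encoding (TCSC location and controller parameters) and the damping-based objective f are the paper's and are not reproduced here, and the inertia/acceleration constants below are common defaults:

```python
import numpy as np

def pso(f, lb, ub, n=30, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Generic PSO: track personal bests and a global best, pull each
    particle's velocity toward both, clip positions to the search box."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    x = rng.uniform(lb, ub, (n, len(lb)))
    v = np.zeros_like(x)
    pbest, pval = x.copy(), np.array([f(xi) for xi in x])
    g = pbest[pval.argmin()]
    for _ in range(iters):
        r1, r2 = rng.random((2, n, x.shape[1]))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lb, ub)
        val = np.array([f(xi) for xi in x])
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
        g = pbest[pval.argmin()]
    return g, pval.min()
```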

Mean-Square Performance of Adaptive Filter Algorithms in Nonstationary Environments

Employing a recently introduced unified adaptive filter theory, we show how the performance of a large number of important adaptive filter algorithms can be predicted within a general framework in nonstationary environments. This approach is based on energy conservation arguments and does not need to assume a Gaussian or white distribution for the regressors. The general performance analysis can be used to evaluate the mean-square performance of the Least Mean Square (LMS) algorithm, its normalized version (NLMS), the family of Affine Projection Algorithms (APA), Recursive Least Squares (RLS), the Data-Reusing LMS (DR-LMS), its normalized version (NDR-LMS), the Block Least Mean Squares (BLMS), the Block Normalized LMS (BNLMS), Transform Domain Adaptive Filters (TDAF) and Subband Adaptive Filters (SAF) in nonstationary environments. We also establish general expressions for the steady-state excess mean-square error in this environment for all of these adaptive algorithms. Finally, we demonstrate through simulations that these results are useful in predicting adaptive filter performance.
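
As a concrete member of this family, a minimal NLMS implementation (the filter length, step size and regularization below are illustrative defaults); in a nonstationary setting, the desired signal d would be generated by a time-varying system:

```python
import numpy as np

def nlms(x, d, M=8, mu=0.5, eps=1e-6):
    """Normalized LMS: w(n+1) = w(n) + mu * e(n) * u(n) / (eps + ||u(n)||^2),
    with regressor u(n) = [x(n-1), ..., x(n-M)]."""
    w = np.zeros(M)
    e = np.zeros(len(x))
    for n in range(M, len(x)):
        u = x[n - M:n][::-1]              # most recent sample first
        e[n] = d[n] - w @ u
        w += mu * e[n] * u / (eps + u @ u)
    return w, e
```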

DHT-LMS Algorithm for Sensorineural Loss Patients

Hearing impairment is the number one chronic disability, affecting many people around the world. Background noise is particularly damaging to speech intelligibility for people with hearing loss, especially for sensorineural loss patients. Several investigations of speech intelligibility have demonstrated that sensorineural loss patients need a 5-15 dB higher SNR than normal hearing subjects. This paper describes a Discrete Hartley Transform Power Normalized Least Mean Square (DHT-LMS) algorithm to improve the SNR and the convergence rate of the Least Mean Square (LMS) algorithm for sensorineural loss patients. The DHT transforms n real numbers to n real numbers, and has the convenient property of being its own inverse. It can be used effectively for noise cancellation with shorter convergence time. The simulated results show superior characteristics: the SNR improves by at least 9 dB for an input SNR of zero dB, and convergence is faster (eigenvalue ratio 12) compared to the time-domain method and DFT-LMS.
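
The self-inverse property mentioned above is easy to verify numerically; a common way to compute the DHT (shown here as an assumption, not the paper's implementation) is via the real and imaginary parts of the FFT:

```python
import numpy as np

def dht(x):
    """Discrete Hartley Transform: H(k) = sum_n x(n) * cas(2*pi*n*k/N),
    cas = cos + sin, computed as Re(FFT) - Im(FFT); real-to-real."""
    X = np.fft.fft(x)
    return X.real - X.imag

x = np.random.default_rng(0).standard_normal(16)
assert np.allclose(dht(dht(x)) / len(x), x)   # its own inverse, up to 1/N
```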

A Method for 3D Mesh Adaptation in FEA

The use of mechanical simulation (in particular finite element analysis) requires the management of assumptions in order to analyse a real, complex system. In finite element analysis (FEA), two modeling steps require assumptions before the computations can be carried out and results obtained: building the physical model and building the simulation model. The simplification assumptions made about the analysed system in these two steps can generate two kinds of errors: physical modeling errors (mathematical model, domain simplifications, material properties, boundary conditions and loads) and mesh discretization errors. This paper proposes a mesh adaptation method based on an h-adaptive scheme combined with an error estimator in order to choose the mesh of the simulation model. The method makes it possible to choose this mesh so as to control both the cost and the quality of the finite element analysis.
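
The structure of such an h-adaptive scheme can be sketched as a generic solve-estimate-mark-refine loop; the solver, estimator and refinement routines below are placeholders for a real FEA code, and the marking threshold is an illustrative choice:

```python
def h_adaptive_fea(mesh, solve, estimate_error, refine, tol, max_cycles=10):
    """Generic h-adaptive loop: solve on the current mesh, estimate the
    per-element discretization error, refine the worst elements, and
    repeat until the global error estimate falls below tol."""
    for _ in range(max_cycles):
        solution = solve(mesh)
        eta = estimate_error(mesh, solution)          # per-element indicators
        if sum(e * e for e in eta) ** 0.5 <= tol:     # global estimate
            break
        marked = [i for i, e in enumerate(eta) if e > 0.5 * max(eta)]
        mesh = refine(mesh, marked)                   # local h-refinement
    return mesh, solution
```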

Convective Heat Transfer Enhancement in an Enclosure with Fin Utilizing Nanofluids

The objective of the present work is to conduct investigations leading to a more complete explanation of single-phase natural convective heat transfer in an enclosure with a fin utilizing nanofluids. The nanofluid used, composed of aluminum oxide nanoparticles suspended in ethylene glycol, is provided at various volume fractions. The study is carried out numerically for a range of Rayleigh numbers, fin heights and aspect ratios. The flow and temperature distributions are taken to be two-dimensional. Regions with the same velocity and temperature distributions are identified as symmetric sections. One half of such a rectangular region is chosen as the computational domain, taking into account the symmetry about the fin. The transport equations are modeled by a stream function-vorticity formulation and are solved numerically by finite-difference schemes. Comparisons with previously published work are made for special cases. Results are presented in the form of streamline, vector and isotherm plots, as well as the variation of the local Nusselt number along the fin under different conditions.
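
For reference, the stream function-vorticity formulation of 2D natural convection has the generic (Boussinesq) form below, with gravity acting in the negative y-direction; the paper's nanofluid property models and nondimensionalization are not reproduced:

```latex
\nabla^2 \psi = -\omega, \qquad
u = \frac{\partial \psi}{\partial y}, \quad
v = -\frac{\partial \psi}{\partial x},
\qquad
\frac{\partial \omega}{\partial t}
 + u\,\frac{\partial \omega}{\partial x}
 + v\,\frac{\partial \omega}{\partial y}
 = \nu\,\nabla^2 \omega + g\beta\,\frac{\partial T}{\partial x},
\qquad
\frac{\partial T}{\partial t}
 + u\,\frac{\partial T}{\partial x}
 + v\,\frac{\partial T}{\partial y}
 = \alpha\,\nabla^2 T.
```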

Furthering the Effectiveness of Software Testability Measures

Software testability has been proposed to address the problems of the increasing cost of testing and of software quality. Testability measures provide a quantified way to denote the testability of software. Since the 1990s, many testability measurement models have been proposed to address this problem. By discussing the contradiction between domain testability and the domain-range ratio (DRR), a new testability measure, semantic fault distance, is proposed, and its validity is discussed.

Frequency-Domain Design of Fractional-Order FIR Differentiators

In this paper, a fractional-order FIR differentiator design method using the differential evolution (DE) algorithm is presented. In the proposed method, the FIR digital filter is designed to meet the frequency response of a desired fractional-order differentiator, which is evaluated in the frequency domain. To verify the design performance, another design method, formulated in the time domain, is also provided. Simulation results reveal the efficiency of the proposed method.
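
A minimal sketch of the frequency-domain formulation; the fractional order r, filter order N, frequency band and DE settings below are illustrative assumptions, and a linear-phase (group-delay) term is included in the target so that a real FIR filter can approximate it:

```python
import numpy as np
from scipy.optimize import differential_evolution
from scipy.signal import freqz

r, N = 0.5, 15                                   # assumed order values
wgrid = np.linspace(0.05 * np.pi, 0.95 * np.pi, 128)
H_des = (1j * wgrid) ** r * np.exp(-1j * wgrid * N / 2)  # (jw)^r with delay

def cost(h):
    """Frequency-domain squared error between the FIR response and target."""
    _, H = freqz(h, worN=wgrid)
    return np.sum(np.abs(H - H_des) ** 2)

res = differential_evolution(cost, bounds=[(-2, 2)] * (N + 1), seed=0, maxiter=300)
h_opt = res.x                                    # optimized FIR coefficients
```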

A Blind Digital Watermark in Hadamard Domain

A new blind gray-level watermarking scheme is described. In the proposed method, the host image is first divided into 4×4 non-overlapping blocks. For each block, the first two AC coefficients of its Hadamard transform are then estimated using the DC coefficients of its neighboring blocks. A gray-level watermark is then added to the estimated values. Since embedding the watermark does not change the DC coefficients, the watermark can be extracted by estimating the AC coefficients and comparing them with their actual values. Several experiments were carried out, and the results demonstrate the robustness of the proposed algorithm.
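
A hedged sketch of the transform and estimation steps; the authors' exact prediction weights are not given in the abstract, so a simple central-difference predictor over neighboring DC values stands in here:

```python
import numpy as np
from scipy.linalg import hadamard

H = hadamard(4)                      # 4x4 Hadamard matrix; H @ H = 4*I

def block_hadamard(img):
    """Forward Hadamard transform of each non-overlapping 4x4 block
    (image sides assumed divisible by 4); coefficient [0, 0] of each
    transformed block is its DC value."""
    h, w = img.shape
    return np.array([[H @ img[i:i + 4, j:j + 4] @ H
                      for j in range(0, w, 4)] for i in range(0, h, 4)])

def estimate_first_acs(dc):
    """Estimate the first two AC coefficients of every block from the DC
    values of its horizontal and vertical neighbors (stand-in weights)."""
    ac_h = (np.roll(dc, 1, axis=1) - np.roll(dc, -1, axis=1)) / 8.0
    ac_v = (np.roll(dc, 1, axis=0) - np.roll(dc, -1, axis=0)) / 8.0
    return ac_h, ac_v
```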