Robust Adaptive ELS-QR Algorithm for Linear Discrete Time Stochastic Systems Identification

This work proposes a recursive weighted ELS algorithm for system identification that applies numerically robust orthogonal Householder transformations. The proposed algorithm obtains acceptable results in a noisy environment: fast convergence and asymptotically unbiased estimates. A comparative analysis with other robust methods well known from the literature is also presented.
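
As a rough illustration of the underlying numerics (a minimal sketch, not the authors' exact ELS-QR recursion; the regressor model and the regularized start are assumptions), each new observation can be folded into a compact QR factorization via Householder-based orthogonal transformations, so the parameter estimate is updated without forming the ill-conditioned normal equations:

```python
import numpy as np

def qr_rls_update(R, z, phi, y):
    """Fold one observation (phi, y) into the triangular factor R and rhs z."""
    A = np.vstack([R, phi.reshape(1, -1)])   # stack factor with new regressor row
    b = np.append(z, y)
    Q, R_new = np.linalg.qr(A)               # Householder-based QR in NumPy
    return R_new, Q.T @ b                    # normal equations are preserved

def estimate(R, z):
    return np.linalg.solve(R, z)             # back-substitution on triangular R

# Usage: identify theta from noisy regressions y = phi @ theta + e.
rng = np.random.default_rng(0)
theta_true = np.array([1.5, -0.7])
R, z = np.eye(2) * 1e-3, np.zeros(2)         # tiny regularization to start
for _ in range(200):
    phi = rng.normal(size=2)
    y = phi @ theta_true + 0.1 * rng.normal()
    R, z = qr_rls_update(R, z, phi, y)
print(estimate(R, z))                        # approaches theta_true
```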

Face Recognition: A Literature Review

The task of face recognition has been actively researched in recent years. This paper provides an up-to-date review of major human face recognition research. We first present an overview of face recognition and its applications. Then, a literature review of the most recent face recognition techniques is presented. Descriptions and limitations of the face databases used to test the performance of these face recognition algorithms are given. A brief summary of the Face Recognition Vendor Test (FRVT) 2002, a large-scale evaluation of automatic face recognition technology, and its conclusions are also given. Finally, we give a summary of the research results.

A New Method for Contour Approximation Using the Basic Ramer Idea

This paper presents two new efficient algorithms for contour approximation. The proposed algorithm is compared with the Ramer (good quality), Triangle (faster) and Trapezoid (fastest) methods, which are briefly described. The Cartesian coordinates of an input contour are processed so that the contour is finally represented by a set of selected vertices of its edge. The paper describes the main idea of the analyzed procedures for contour compression. For comparison, the mean square error and signal-to-noise ratio criteria are used. The computational time of the analyzed methods is estimated from the number of numerical operations. Experimental results are reported in terms of image quality, compression ratio, and speed. The main advantage of the proposed algorithm is the small number of arithmetic operations compared to the existing algorithms.
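
For reference, the basic Ramer (Douglas-Peucker) idea that the paper builds on can be sketched as follows (a hedged reconstruction of the classical algorithm, not of the two new variants proposed here): recursively keep the contour point farthest from the chord between two retained vertices whenever its distance exceeds a tolerance eps.

```python
import numpy as np

def ramer(points, eps):
    points = np.asarray(points, dtype=float)
    if len(points) < 3:
        return points
    a, b = points[0], points[-1]
    chord = b - a
    norm = np.hypot(*chord)
    if norm == 0.0:                    # degenerate chord: use radial distance
        d = np.hypot(*(points - a).T)
    else:                              # perpendicular distance to the chord
        d = np.abs(chord[0] * (points[:, 1] - a[1])
                   - chord[1] * (points[:, 0] - a[0])) / norm
    i = int(np.argmax(d))
    if d[i] <= eps:
        return np.array([a, b])        # the chord approximates this run
    left = ramer(points[: i + 1], eps)     # split at the farthest vertex
    right = ramer(points[i:], eps)
    return np.vstack([left[:-1], right])

contour = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8)]
print(ramer(contour, eps=0.5))
```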

Robust Design and Optimization of Production Wastes: An Application for Industries

This paper focuses on robust design and optimization of industrial production wastes. Past literature was reviewed to case-study Clamason Industries Limited (CIL) - a leading ladder-tops manufacturer. A painstaking study of the firm's shop-floor practices revealed that over-production, waiting time, excess inventory, and defects are the major wastes impeding its progress and profitability. Design-Expert 8 software was used to apply Taguchi robust design and response surface methodology to model, analyse and optimise the cost of wastes in CIL. Waiting time and over-production rank first and second in contributing to the cost of wastes in CIL. For minimal wastes cost, the control factors of over-production, waiting time, defects and excess inventory must be set at 0.30, 390.70, 4 and 55.70 respectively for CIL. The optimal value of the cost of wastes for the months studied was 22.3679. Finally, it is recommended that, to enhance profitability and customer satisfaction, the company adopt Shigeo Shingo's Single Minute Exchange of Dies (SMED), which will immediately tackle the waste of waiting by drastically reducing setup time.
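
As a generic illustration of the response-surface step (the data, factor ranges and model below are illustrative stand-ins, not CIL's confidential measurements or the Design-Expert 8 output), a second-order model of wastes cost can be fitted by least squares and then minimized numerically:

```python
import numpy as np
from itertools import combinations_with_replacement
from scipy.optimize import minimize

def design_matrix(X):
    n, k = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(k)]
    cols += [X[:, i] * X[:, j]
             for i, j in combinations_with_replacement(range(k), 2)]
    return np.column_stack(cols)

# X: factor settings per experimental run; y: observed wastes cost.
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(30, 4))
y = 20 + 5 * (X[:, 0] - 0.3) ** 2 + 3 * (X[:, 1] - 0.6) ** 2 \
    + rng.normal(0, 0.1, 30)

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)

def predicted_cost(x):
    return float(design_matrix(x.reshape(1, -1)) @ beta)

opt = minimize(predicted_cost, x0=np.full(4, 0.5), bounds=[(0, 1)] * 4)
print(opt.x, opt.fun)    # fitted optimum of the quadratic surface
```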

Artificial Intelligence Support for Interferon Treatment Decision in Chronic Hepatitis B

Chronic hepatitis B can evolve into cirrhosis and liver cancer. Interferon is the only effective treatment for carefully selected patients, but it is very expensive. Some of the selection criteria are based on liver biopsy, an invasive, costly and painful medical procedure. Developing efficient non-invasive selection systems could therefore benefit patients and also save money. We investigated the possibility of creating intelligent systems to assist the Interferon therapeutic decision, mainly by predicting the results of the biopsy with acceptable accuracy. We applied knowledge discovery to integrated medical data - imaging, clinical, and laboratory data. The resulting intelligent systems, tested on 500 patients with chronic hepatitis B and based on C5.0 decision trees and boosting, predict the results of the liver biopsy with 100% accuracy. By also integrating the other patient-selection criteria, they offer non-invasive support for the correct Interferon therapeutic decision. To the best of our knowledge, these decision systems outperform all similar systems published in the literature and offer a realistic opportunity to replace liver biopsy in this medical context.
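
C5.0 with boosting is a proprietary system; a hedged open-source analogue of the modelling step (with synthetic stand-in features, not the study's imaging/clinical/laboratory data) uses scikit-learn's boosted decision trees:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Stand-in for 500 patients with integrated imaging/clinical/lab features.
X, y = make_classification(n_samples=500, n_features=25, n_informative=10,
                           random_state=0)
clf = AdaBoostClassifier(estimator=DecisionTreeClassifier(max_depth=3),
                         n_estimators=50, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())   # biopsy-outcome accuracy
```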

Experimental Determination of the Critical Locus of the Acetone + Chloroform Binary System

In this paper, the vapour-liquid critical locus for the binary system acetone + chloroform was determined experimentally over the whole range of composition. The critical property measurements were carried out using a dynamic-synthetic apparatus, employed in the dynamic mode. The critical points were visually determined by observing the critical opalescence and the simultaneous disappearance and reappearance of the meniscus in the middle of a high-pressure view cell that withstands operation up to 673 K and 20 MPa. The experimental critical points measured in this work were compared to those available in the literature.

MEGSOR Iterative Scheme for the Solution of 2D Elliptic PDEs

Recent findings have demonstrated that the MEG iterative scheme accelerates the convergence rate in solving systems of linear equations generated from approximation equations of boundary value problems. Based on the same scheme, the aim of this paper is to investigate the capability of a family of four-point block iterative methods with a weighted parameter ω, namely 4-Point EGSOR, 4-Point EDGSOR, and 4-Point MEGSOR, in solving two-dimensional elliptic partial differential equations using the second-order finite difference approximation. The formulation and implementation of the three four-point block iterative methods are also presented. Finally, the experimental results show that the 4-Point MEGSOR iterative scheme is superior to the existing four-point block schemes.
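
As a baseline sketch (point SOR on the 5-point stencil for the Poisson model problem; the paper's four-point block EGSOR/EDGSOR/MEGSOR variants refine this by updating groups of points at once), the role of the weighted parameter ω can be seen here:

```python
import numpy as np

def sor_poisson(f, omega=1.8, tol=1e-8, max_iter=20000):
    """Weighted point-SOR for the 5-point discretization of
    laplace(u) = f on the unit square with zero Dirichlet boundary."""
    n = f.shape[0]                     # n x n interior grid, h = 1/(n+1)
    h2 = (1.0 / (n + 1)) ** 2
    u = np.zeros((n + 2, n + 2))       # includes the boundary layer
    for _ in range(max_iter):
        diff = 0.0
        for i in range(1, n + 1):
            for j in range(1, n + 1):
                gs = 0.25 * (u[i - 1, j] + u[i + 1, j]
                             + u[i, j - 1] + u[i, j + 1]
                             - h2 * f[i - 1, j - 1])
                new = (1 - omega) * u[i, j] + omega * gs
                diff = max(diff, abs(new - u[i, j]))
                u[i, j] = new
        if diff < tol:
            break
    return u

n = 32
u = sor_poisson(-2.0 * np.ones((n, n)))   # constant source term
print(u[n // 2, n // 2])
```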

A Conceptual Framework for Supply Chain Competitiveness

The purpose of this paper is to highlight the importance of the concept of competitiveness in the supply chain and to present a conceptual framework for Supply Chain Competitiveness (SCC). The framework is based on the supply chain activities that are inputs necessary for SCC and the benefits that are its outputs. A literature review is conducted on key supply chain competitiveness issues, their determinants, and their various dimensions, followed by an exploration of SCC. Based on the insights gained, a conceptual framework for SCC is presented in terms of activities for SCC, the SCC environment, and outcomes of SCC. The information flow in the conceptual framework is bi-directional at all levels, and the activities are interrelated in a global competitive environment. The activities include those of suppliers, manufacturers and distributors, with more emphasis on manufacturers' activities. Further, the implications of various factors such as economic, politico-legal, technical, socio-cultural, competitive and demographic factors are also highlighted. The SCC framework is an attempt to cover the relatively less explored area of supply chain competitiveness. It is expected that this work will further motivate researchers, academicians and practitioners to work in this area, and it offers conceptual help in providing directions for supply chain competitiveness that lead to improvement in the supply chain and supply chain performance.

A Literature Review of Servant Leadership and Criticism of Advanced Research

Although there are many theories and discussions of leadership, the necessity of a new leadership paradigm has been emphasized. The existing leadership style of instruction and control has revealed its limitations as market competition becomes fierce and economic recession persists worldwide. Among leadership theories, servant leadership was introduced relatively recently and is in line with the environmental changes facing organizations. Servant leadership combines the two words 'servant' and 'leader' and can be defined as the role of a leader who focuses on doing voluntary work for others with altruistic ethics, makes members, customers, and local communities a priority, and commits to satisfying their needs. This leadership style received attention as a field of leadership in the late 1990s and secured its legitimacy. This study discusses existing research trends in leadership; presents the concept, behavioral characteristics, and sub-dimensions of servant leadership; compares servant leadership with existing leadership research; and assesses whether servant leadership is a useful concept for further leadership research. Finally, this study criticizes the limitations of the existing research on servant leadership.

Replicating Data Objects in Large-Scale Distributed Computing Systems Using the Extended Vickrey Auction

This paper proposes a novel game-theoretical technique to address the problem of data object replication in large-scale distributed computing systems. The proposed technique draws inspiration from computational economic theory and employs the extended Vickrey auction. Specifically, players in a non-cooperative environment compete for server-side scarce memory space to replicate data objects so as to minimize the total network object-transfer cost while maintaining object concurrency. Optimization of this cost in turn leads to load balancing, fault tolerance and reduced user access time. The method is experimentally evaluated against four well-known techniques from the literature: branch and bound, greedy, bin-packing and genetic algorithms. The experimental results reveal that the proposed approach outperforms the four techniques in both execution time and solution quality.
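
For reference, the plain Vickrey (second-price, sealed-bid) mechanism that the extended technique builds on can be sketched as follows (illustrative values; here a bid stands for a server's expected transfer-cost saving from hosting the replica). Truthful bidding is a dominant strategy because the winner pays the second-highest bid:

```python
def vickrey_auction(bids):
    """bids: server id -> sealed bid. Returns (winner, price paid)."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else 0.0
    return winner, price

# Bids as each server's expected transfer-cost saving (illustrative):
print(vickrey_auction({"s1": 12.0, "s2": 9.5, "s3": 14.0}))  # ('s3', 12.0)
```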

An Efficient Method for Solving Multipoint Equation Boundary Value Problems

In this work, we solve multipoint boundary value problems whose boundary conditions are equations, using the Newton-Broyden shooting method (NBSM). The proposed method is tested on several problems from the literature, and the results are compared with the available exact solutions. The experiments illustrate the efficiency and implementation of the method.
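
A hedged sketch of the shooting idea follows, with SciPy's Broyden solver standing in for the paper's Newton-Broyden combination (the test problem and the specific multipoint equation are chosen for easy verification, not taken from the paper): the unknown initial slope s = u'(0) is adjusted until the multipoint boundary equation is satisfied.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import broyden1

def integrate(s):
    """IVP u'' = -u, u(0) = 1, u'(0) = s; return u at the two points
    entering the multipoint boundary equation."""
    sol = solve_ivp(lambda t, y: [y[1], -y[0]], (0, np.pi / 2), [1.0, s],
                    t_eval=[np.pi / 3, np.pi / 2], rtol=1e-10, atol=1e-10)
    return sol.y[0]

target = 0.5 + np.sqrt(3) / 2 + 1     # makes u = cos t + sin t the solution

def residual(s):
    u_a, u_b = integrate(float(s))
    return u_a + u_b - target         # the multipoint boundary *equation*

s_star = broyden1(residual, 0.0, f_tol=1e-10)
print(s_star)                         # ~1.0 = u'(0) of the exact solution
```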

A Multiple-Objective Environmental Rationalization and Optimization for Material Substitution in the Production of Stone-Washed Jeans Garments

As the textile industry is the second largest industry in Egypt, and as small and medium-sized enterprises (SMEs) make up a great portion of it, it is essential to apply the concept of Cleaner Production to reduce pollution. To achieve this goal, a case study concerned with eco-friendly stone-washing of jeans garments was investigated. A raw-material substitution option was adopted whereby the toxic potassium permanganate and sodium sulfide were replaced by the environmentally compatible hydrogen peroxide and glucose, respectively, and the concentrations of both replacement chemicals, together with the operating time, were optimized. In addition, a process-rationalization option involving four additional processes was investigated. Using criteria such as product quality, effluent analysis, mass and heat balance, and cost analysis with the aid of a statistical model, a process optimization treatment revealed that the superior process optima were 50%, 0.15% and 50 min for H2O2 concentration, glucose concentration and time, respectively. With these values, the superior process should reduce the annual cost by about EGP 105 relative to the currently used conventional method.

A Study on the Quality of Hexapod Machine Tool's Workspace

One of the main concerns about parallel mechanisms is the presence of singular points within their workspaces. In singular positions the mechanism gains or loses one or several degrees of freedom, and it is impossible to control; therefore, these positions have to be avoided. This is a vital need, especially in computer-controlled machine tools designed and manufactured on the basis of parallel mechanisms, and it has to be taken into consideration when selecting design parameters. A prerequisite is a thorough knowledge of the effect of design parameters and constraints on singularity. In this paper, a quality condition index is introduced as a criterion for evaluating the singularities of different configurations of a hexapod mechanism obtainable with different design parameters. It is illustrated that this method can effectively be employed to obtain the optimum configuration of the hexapod mechanism with the aim of avoiding singularity within the workspace. The method is then employed to design the hexapod table of a CNC milling machine.
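
A hedged sketch of a singularity-proximity measure of the general kind a quality condition index formalizes (the Jacobian below is a random placeholder, not the paper's hexapod kinematic model): the inverse condition number of the mechanism Jacobian tends to zero near singular poses, so its worst case over sampled workspace poses rates a configuration.

```python
import numpy as np

def inverse_condition(J):
    """1/cond(J) in [0, 1]: zero at a singular pose, larger is better."""
    s = np.linalg.svd(J, compute_uv=False)
    return s.min() / s.max()

def workspace_quality(jacobian_at, poses):
    """Worst-case conditioning over sampled workspace poses."""
    return min(inverse_condition(jacobian_at(p)) for p in poses)

# Illustrative use with a random stand-in Jacobian (a real study would
# evaluate the hexapod's kinematic Jacobian at each pose):
rng = np.random.default_rng(2)
poses = [rng.normal(size=3) for _ in range(100)]
print(workspace_quality(lambda p: rng.normal(size=(6, 6)), poses))
```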

Integrating Security Indifference Curve to Formal Decision Evaluation

Decisions are made regularly during a project or in daily life. Some decisions are critical and have a direct impact on project or human success. Formal evaluation is thus required, especially for crucial decisions, to arrive at the optimal solution among the alternatives. According to microeconomic theory, all people's decisions can be modeled as indifference curves. The proposed approach supports formal analysis and decision-making by constructing an indifference-curve model from previous experts' decision criteria. The knowledge embedded in the system can be reused to help naïve users select an alternative solution to a similar problem. Moreover, the method is flexible enough to cope with an unlimited number of factors influencing the decision. In preliminary experiments, the selected alternatives accurately matched the expert's decisions.
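
As a toy illustration of the indifference-curve idea (the Cobb-Douglas form, factor names and weights are assumptions for illustration, not the paper's elicited expert curves): alternatives lying on a higher indifference curve, i.e. with higher utility, are preferred.

```python
import math

def cobb_douglas(levels, weights):
    """u = prod(x_i ** w_i); equal-utility contours over the decision
    factors are the indifference curves."""
    return math.prod(x ** w for x, w in zip(levels, weights))

# Factors, e.g. (security score, cost saving) on [0, 1]; the weights
# encode an expert's elicited trade-off (illustrative numbers).
weights = (0.6, 0.4)
alternatives = {"A": (0.9, 0.4), "B": (0.6, 0.8), "C": (0.5, 0.5)}
best = max(alternatives, key=lambda k: cobb_douglas(alternatives[k], weights))
print(best)
```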

2D Rigid Registration of MR Scans Using 1D Binary Projections

This paper presents the application of a signal-intensity-independent registration criterion for 2D rigid-body registration of medical images using 1D binary projections. The criterion is defined as the weighted ratio of two projections. The ratio is computed on a pixel-by-pixel basis, and weighting is performed by setting the ratios between one and zero pixels to a standard high value. The mean squared value of the weighted ratio is computed over the union of the 'one' areas of the two projections and is minimized using Chebyshev polynomial approximation with n = 5 points. The sum of the x and y projections is used for translational adjustment and a 45° projection for rotational adjustment. Twenty T1-T2 registration experiments were performed, giving mean errors of 1.19° and 1.78 pixels. The method is suitable for contour/surface matching. Further research is necessary to determine the robustness of the method with regard to threshold, shape and missing data.
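
A hedged reconstruction of the criterion described above (the exact orientation of the ratio and the value of the standard high weight are assumptions): the projections are compared through a pixelwise weighted ratio, with one-versus-zero mismatches penalized by a high value W, and the mean squared ratio over the union of the 'one' supports is the cost to minimize.

```python
import numpy as np

def weighted_ratio_cost(p, q, W=10.0):
    """Mean squared weighted ratio over the union of the 'one'
    supports of the projections p and q (minimal at a perfect match)."""
    union = (p > 0) | (q > 0)
    r = np.full(p.shape, W, dtype=float)     # one/zero mismatch weight
    both = (p > 0) & (q > 0)
    r[both] = np.maximum(p[both], q[both]) / np.minimum(p[both], q[both])
    return np.mean(r[union] ** 2)

# Row-sum projections of a binarized slice and a shifted copy:
rng = np.random.default_rng(3)
img1 = (rng.random((64, 64)) > 0.7).astype(int)
img2 = np.roll(img1, 2, axis=0)
print(weighted_ratio_cost(img1.sum(axis=1), img2.sum(axis=1)))
```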

Hybrid of Hunting Search and Modified Simplex Methods for Grease Position Parameter Design Optimisation

This study addresses a multi-response surface optimization problem (MRSOP) for determining the proper choices in a process parameter design (PPD) decision problem in the noisy environment of a grease-position process in the electronics industry. The proposed model attempts to maximize the dual process responses of the mean of parts between failure on the left and right processes. The conventional modified simplex method and its hybridization with a stochastic operator from the hunting search algorithm are applied to determine the proper levels of the controllable design parameters affecting the quality performance. A numerical example demonstrates the feasibility of applying the proposed model to the PPD problem via the two iterative methods, and their advantages are discussed. Numerical results demonstrate that the hybridization is superior to the conventional method: the mean of parts between failure on the left and right lines improves by approximately 39.51%. All experimental data presented in this research have been normalized to disguise actual performance measures, as the raw data are considered confidential.
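
For orientation, the conventional simplex-search baseline can be sketched with SciPy's Nelder-Mead on a stand-in response surface (an illustrative objective, not the confidential grease-position data); the paper's hybrid injects stochastic hunting-search moves into this kind of loop.

```python
import numpy as np
from scipy.optimize import minimize

def negative_response(x):
    """Stand-in for the dual mean-parts-between-failure response,
    recast as a minimization."""
    left = -(x[0] - 1.0) ** 2
    right = -(x[1] - 2.0) ** 2
    return -(left + right)

res = minimize(negative_response, x0=np.zeros(2), method="Nelder-Mead")
print(res.x)    # converges near the stand-in optimum [1, 2]
```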

A New Effective Local Search Heuristic for the Maximum Clique Problem

An edge-based local search algorithm, called ELS, is proposed for the maximum clique problem (MCP), a well-known combinatorial optimization problem. ELS is a two-phase local search method that effectively finds near-optimal solutions for the MCP. A 'support' parameter of vertices defined in ELS greatly reduces the number of random selections among vertices, as well as the number of iterations and the running time. Computational results on the BHOSLIB and DIMACS benchmark graphs indicate that ELS achieves state-of-the-art performance for the maximum clique problem with reasonable average running times.
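
A hedged sketch of an edge-based greedy step of the general kind ELS iterates (the selection rule below is a simple stand-in for the paper's 'support' measure): grow a clique from a seed edge by repeatedly adding the candidate vertex with the most neighbours inside the remaining candidate set.

```python
def greedy_clique(adj, u, v):
    """adj: dict vertex -> set of neighbours; (u, v) must be an edge."""
    clique = {u, v}
    cand = adj[u] & adj[v]                 # vertices adjacent to all members
    while cand:
        w = max(cand, key=lambda x: len(adj[x] & cand))
        clique.add(w)
        cand &= adj[w]
    return clique

adj = {1: {2, 3, 4}, 2: {1, 3, 4}, 3: {1, 2, 4}, 4: {1, 2, 3, 5}, 5: {4}}
print(greedy_clique(adj, 1, 2))            # {1, 2, 3, 4}
```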

Two Iterative Algorithms to Compute the Bisymmetric Solution of the Matrix Equation A1X1B1 + A2X2B2 + ... + AlXlBl = C

In this paper, two matrix iterative methods are presented to solve the matrix equation $A_1X_1B_1 + A_2X_2B_2 + \cdots + A_lX_lB_l = C$, the minimum residual problem $\min_{X_i \in BR^{n_i \times n_i}} \left\| \sum_{i=1}^{l} A_iX_iB_i - C \right\|_F$, and the matrix nearness problem $\min_{[X_1,X_2,\ldots,X_l] \in S_E} \left\| [X_1,X_2,\ldots,X_l] - [\tilde X_1,\tilde X_2,\ldots,\tilde X_l] \right\|_F$, where $BR^{n_i \times n_i}$ is the set of bisymmetric matrices and $S_E$ is the solution set of the matrix equation or of the minimum residual problem. These matrix iterative methods have a faster convergence rate and higher accuracy than earlier methods. Paige's algorithms are used as the framework for deriving them. A numerical example illustrates the efficiency of the new methods.
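
For the single-term case, a hedged projected-gradient sketch (not Paige's algorithm or the paper's methods) shows the structure of such iterations: an unconstrained gradient step for min ||AXB - C||_F is projected back onto the bisymmetric matrices (X = X^T and X = JXJ, with J the exchange matrix).

```python
import numpy as np

def project_bisymmetric(X):
    """Orthogonal projection onto {X : X = X^T and X = J X J}."""
    J = np.fliplr(np.eye(X.shape[0]))
    X = (X + X.T) / 2
    return (X + J @ X @ J) / 2

def solve_bisym(A, B, C, iters=5000):
    n = A.shape[1]
    X = np.zeros((n, n))
    mu = 1.0 / (np.linalg.norm(A, 2) ** 2 * np.linalg.norm(B, 2) ** 2)
    for _ in range(iters):             # projected gradient descent
        G = A.T @ (A @ X @ B - C) @ B.T
        X = project_bisymmetric(X - mu * G)
    return X

rng = np.random.default_rng(4)
A, B = rng.normal(size=(6, 4)), rng.normal(size=(4, 6))
X_true = project_bisymmetric(rng.normal(size=(4, 4)))
C = A @ X_true @ B                     # consistent right-hand side
X = solve_bisym(A, B, C)
print(np.linalg.norm(A @ X @ B - C))   # residual shrinks toward zero
```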

On Constructing Approximate Convex Hull

Algorithms for computing the convex hull have been extensively studied in the literature, principally because of their wide range of applications in different areas. This article presents an efficient algorithm to construct an approximate convex hull of a set of n points in the plane in O(n + k) time, where k is the approximation-error control parameter. The proposed algorithm is suitable for applications that prefer to trade some accuracy for reduced computation time, such as animation and interaction in computer graphics, where rapid, real-time rendering is indispensable.
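
A hedged sketch in the spirit of strip-based O(n + k) approximation (à la Bentley-Faust-Preparata; not necessarily the paper's exact algorithm): cut the x-range into k strips, keep only each strip's lowest and highest points plus the global x-extremes, and take the exact hull of this O(k) subset.

```python
import numpy as np
from scipy.spatial import ConvexHull

def approx_hull(points, k):
    pts = np.asarray(points, dtype=float)
    xmin, xmax = pts[:, 0].min(), pts[:, 0].max()
    strip = np.clip(((pts[:, 0] - xmin) / (xmax - xmin) * k).astype(int),
                    0, k - 1)
    keep = {pts[:, 0].argmin(), pts[:, 0].argmax()}   # global x-extremes
    for s in range(k):
        idx = np.flatnonzero(strip == s)
        if idx.size:
            keep.add(idx[pts[idx, 1].argmin()])       # lowest point in strip
            keep.add(idx[pts[idx, 1].argmax()])       # highest point in strip
    subset = pts[sorted(keep)]
    return subset[ConvexHull(subset).vertices]        # exact hull of O(k) pts

rng = np.random.default_rng(5)
cloud = rng.normal(size=(10000, 2))
print(len(approx_hull(cloud, k=32)))
```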

Exploring the Ambiguity Resolution in Spacecraft Attitude Determination Using GNSS Phase Measurement

Attitude determination (AD) of a spacecraft using phase measurements of the Global Navigation Satellite System (GNSS) is an active area of research. Various attitude determination algorithms have been developed over the years for spacecraft using different sensors, but the last two decades have witnessed a phenomenal increase in research on GPS receivers as stand-alone sensors for determining satellite attitude from the phase measurements of GNSS signals. GNSS-based attitude determination algorithms have been tested in many real missions. The problem of AD using GNSS phase measurements has two important parts: ambiguity resolution and attitude determination. Ambiguity resolution is the most widely addressed topic in the literature on AD with GNSS phase measurements, as it is essential for achieving millimeter-level accuracy. This paper broadly reviews the different techniques for resolving the integer ambiguities encountered in AD using GNSS phase measurements.
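
For orientation, the simplest strategy that the surveyed techniques improve upon can be sketched as follows (a toy linear model; practical methods such as LAMBDA decorrelate the ambiguities before an integer search): solve the float least-squares problem, then round the carrier-phase ambiguities to integers and re-estimate the real-valued baseline/attitude parameters.

```python
import numpy as np

def float_and_fixed(A, B, y):
    """y = A a + B b + noise, with a the integer ambiguities (cycles)
    and b the real-valued baseline/attitude parameters."""
    M = np.hstack([A, B])
    x, *_ = np.linalg.lstsq(M, y, rcond=None)   # float solution
    a_fixed = np.round(x[: A.shape[1]])         # naive integer fix
    b_fixed, *_ = np.linalg.lstsq(B, y - A @ a_fixed, rcond=None)
    return a_fixed, b_fixed

# Two epochs, five channels, one ambiguity per channel (toy geometry).
rng = np.random.default_rng(6)
A = np.vstack([np.eye(5), np.eye(5)])
B = rng.normal(size=(10, 2))
a_true = rng.integers(-10, 10, size=5).astype(float)
b_true = np.array([0.3, -0.1])
y = A @ a_true + B @ b_true + 1e-3 * rng.normal(size=10)
a_hat, b_hat = float_and_fixed(A, B, y)
print(a_hat - a_true, b_hat - b_true)           # near zero
```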