Intuitive Robot Control Using Surface EMG and Accelerometer Signals

This paper proposes a method for remotely controlling robots with arm gestures, using surface electromyography (EMG) and accelerometer sensors attached to the operator’s wrists. The sensors capture the signals produced by the operator’s arm gestures, and the system infers the corresponding movements and issues the matching control commands to the robot. The supported movements include moving forward and backward and turning left and right. Recognition accuracy exceeds 99%, and the robot can be controlled in real time.
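
As a minimal illustration of the signal side of such a pipeline (the abstract does not specify the paper’s features, window sizes, or classifier, so everything below is an assumption), a sliding-window extractor for two common surface-EMG features, root mean square and mean absolute value, might look like:

```python
import numpy as np

def emg_features(signal, window=200, step=100):
    """Sliding-window RMS and mean-absolute-value features.
    Window and step sizes are hypothetical, not the paper's settings."""
    feats = []
    for start in range(0, len(signal) - window + 1, step):
        w = signal[start:start + window]
        feats.append((np.sqrt(np.mean(w ** 2)), np.mean(np.abs(w))))
    return np.array(feats)

# A constant-amplitude sine: the windowed RMS is close to amplitude / sqrt(2)
t = np.linspace(0, 1, 1000)
feats = emg_features(np.sin(2 * np.pi * 50 * t))
```

Feature vectors like these would then be fed to whatever classifier maps gestures to the four robot commands.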

Extension of the Client-Centric Approach under Small Buffer Space

Periodic broadcast is a cost-effective solution for large-scale distribution of popular videos because it guarantees a constant worst-case service latency, regardless of the number of video requests. An essential periodic broadcast method is the client-centric approach (CCA), which allows clients to download broadcast data with a smaller receiving bandwidth. An enhanced version, CCA++, was later proposed to yield a shorter waiting time. This work further improves CCA++ by reducing the client buffer requirements, decreasing them by as much as 52% compared to CCA++. The study also provides an analytical evaluation to demonstrate the performance advantage over comparable schemes.

Six Sigma in Mexican Manufacturing Companies

This work presents an empirical study of Six Sigma (SS) implementation in Mexico. Its main goals are to analyze the degree of importance of the Critical Success Factors (CSFs) of SS and to examine whether these factors can be grouped in some way. A literature review and a survey were conducted to capture SS practitioners’ viewpoints on CSFs in SS implementation and their impact on performance in manufacturing companies located in Baja California, Mexico. A Principal Component Analysis showed that nine critical success factors can be grouped into three components: management vision, implementation strategy, and collaborative team. On the other hand, SS’s success is represented by cost reduction, variation reduction, worker experience and self-esteem, and quality improvement. The study concludes that CSFs change over time and that paying attention to these nine factors can increase the likelihood of SS success.

Assessing and Improving Ramp-Up Capability

In times when product life cycles are decreasing while market demands are increasing, manufacturing enterprises are confronted with more frequent and more complex ramp-ups. Ramp-up management is therefore becoming a topic enterprises will have to focus on. Since each ramp-up is unique with regard to the product, the process, the technology, the circumstances and the interaction of these four factors, knowledge of the ramp-up situation and of the enterprise’s current ramp-up capability is a fundamental requirement for subsequently improving the ramp-up capability of the production system. This article presents a methodology that can be used to define typical production ramp-up situations, to identify the current ramp-up capability of a production system and to improve it with respect to a specific situation. It also describes the functionality of a software tool developed on the basis of this methodology.

Logistic Changeability - Application of a Methodological Framework for Designing Logistic Changeability

In the past decades, the environment of production companies has shown a permanent increase in dynamics and volatility, in the form of demand fluctuations, new technologies and global crises. As a reaction to these new requirements, the changeability of production systems has attracted attention. A changeable production system can adapt to such changes quickly and with little effort. Even though the demand for changeable production has existed for some time, practical application is still insufficient. To overcome this deficit, a three-year research project was initiated at the Department of Production Systems and Logistics at the Leibniz University of Hanover, Germany. As a result of this project, different concepts have been developed to make production systems changeable. An excerpt of the results is presented in this paper: an eight-step procedure for designing the changeability of production logistics. The procedure has been applied at a German manufacturer of highly demanding weighing machines. The paper presents the developed procedure, its application in industry, and the major results of that application.

Dynamic Safety-Stock Calculation

In order to ensure a high service level, industrial enterprises have to maintain safety stock, which at the same time directly influences economic efficiency. This paper analyses established mathematical methods for calculating safety stock: their performance, measured in stock and service level, is appraised and the limits of several methods are depicted. A new dynamic approach is then presented, yielding a comprehensive method for calculating safety stock that also takes knowledge of future volatility into account.
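
For reference, the classic static formula that such methods build on combines demand and lead-time variability as SS = z · sqrt(L·σ_d² + μ_d²·σ_L²). The abstract does not specify the paper’s dynamic method, so the sketch below shows only this standard baseline with illustrative numbers:

```python
from math import sqrt
from statistics import NormalDist

def safety_stock(service_level, mu_d, sigma_d, mu_L, sigma_L):
    """Static safety stock under demand and lead-time variability:
    SS = z * sqrt(L * sigma_d^2 + mu_d^2 * sigma_L^2),
    with z the standard-normal quantile of the service level."""
    z = NormalDist().inv_cdf(service_level)
    return z * sqrt(mu_L * sigma_d ** 2 + mu_d ** 2 * sigma_L ** 2)

# 95% service level, daily demand 100 +/- 20 units, lead time 5 +/- 1 days
ss = safety_stock(0.95, mu_d=100, sigma_d=20, mu_L=5, sigma_L=1)
```

A dynamic method in the spirit of the paper would replace the constant σ values with forecasts of future volatility.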

Benefits from a SMED Application in a Punching Machine

This paper presents an application of the Single-Minute Exchange of Die (SMED) methodology to a turret punching machine in an elevator company in Portugal. The work was developed over five months within the scope of a master’s thesis in Industrial Engineering and Management. SMED, a Lean Production tool, was applied to reduce setup times in order to improve the production flexibility of the machine. The main results were a reduction of 64% in setup time (from 15.1 to 5.4 min), 50% in work-in-process (from 12.8 to 6.4 days) and 99% in the distance traveled by the operator during the internal setup period (from 136.7 to 1.7 m). These improvements correspond to gains of about €7,315.38 per year.
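
The reported percentage reductions follow directly from the before/after figures quoted above:

```python
def reduction(before, after):
    """Percentage reduction, rounded to the nearest whole percent."""
    return round(100 * (before - after) / before)

setup = reduction(15.1, 5.4)      # setup time, minutes
wip = reduction(12.8, 6.4)        # work-in-process, days
distance = reduction(136.7, 1.7)  # operator travel, metres
```

The three values reproduce the 64%, 50% and 99% reductions reported in the abstract.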

Backcalculation of HMA Stiffness Based On Finite Element Model

The stiffness of Hot Mix Asphalt (HMA) in flexible pavement depends largely on temperature, mode of testing and age of the pavement, which makes accurate measurement of HMA stiffness quite challenging. This study determines HMA stiffness with a Finite Element Model (FEM) and validates the results using field data. As a first step, the stiffnesses of the different layers of a pavement section on Interstate 40 (I-40) in New Mexico were determined by a Falling Weight Deflectometer (FWD) test; pavement temperature was not measured at that time for lack of a temperature probe. Secondly, an FE model was developed in ABAQUS. The stiffnesses of the base, subbase and subgrade were taken from the FWD test output of the first step; as HMA stiffness varies strongly with temperature, it was assigned by a trial-and-error approach. Thirdly, the horizontal strain and vertical stress at the bottom of the HMA layer, and the temperature at different depths of the pavement, were measured with installed sensors throughout the day on December 25th, 2012. Fourthly, the FEM outputs were correlated with the measured stress-strain responses, and after a number of trials a relationship was developed between the trial HMA stiffness and the measured mid-depth HMA temperature. Finally, the obtained stiffness-temperature relationship was verified by a further FWD test for which the pavement temperature was recorded, and a promising agreement was observed. It can therefore be concluded that a linear elastic FEM can accurately predict the stiffness and structural response of flexible pavement.
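
The trial-and-error assignment of HMA stiffness can be framed as a one-dimensional search: adjust the stiffness until the model’s computed strain matches the measured strain. The sketch below uses bisection with a toy forward model standing in for the ABAQUS run; the stiffness bounds, tolerance and proxy model are all illustrative assumptions, not the paper’s values:

```python
def backcalculate_stiffness(measured_strain, forward_model,
                            lo=1e3, hi=3e4, tol=1e-3, max_iter=60):
    """Bisection on HMA stiffness until the modelled strain matches the
    measured strain; assumes strain decreases monotonically with
    stiffness. `forward_model` stands in for the FE analysis here."""
    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        if forward_model(mid) > measured_strain:
            lo = mid  # modelled strain too high: the layer must be stiffer
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Toy proxy model: strain inversely proportional to stiffness (MPa)
E_hma = backcalculate_stiffness(100e-6, lambda s: 0.5 / s)
```

Repeating this match at different measured temperatures yields the stiffness-temperature pairs from which the paper’s relationship is fitted.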

Disturbances of the Normal Operation of Kosovo Power System Regarding Atmospheric Discharges

This paper discusses outages in the electric transmission network of the Kosovo Power System caused by atmospheric discharges. The frequency and location of atmospheric discharges over the territory of Kosovo are provided by the ALARM lightning location system (Automated Lightning Alert and Risk Management) and by data from the Meteorological Department of Prishtina International Airport. These data are compared with the actual outages registered in the Kosovo Power System by the Kosovo Transmission, System and Market Operator (KOSTT) during a specific time period. The lines with the worst performance with respect to atmospheric discharges are chosen for further discussion in terms of overvoltages caused by direct or indirect lightning strokes. Recommendations for protection in terms of insulation coordination and surge arresters are given at the end, supported by dynamic simulation.

Parallel Priority Region Approach to Detect Background

Background detection is essential in video analysis, and optimization is often needed to achieve real-time computation. In this work, information gathered by dual cameras placed at the front and rear of an Autonomous Vehicle (AV) is integrated for background detection. Real-time computation is achieved by combining Priority Regions (PR) and parallel processing: each frame is divided into regions, and the regions are processed in parallel. The PR division depends on the limitations of the driver’s view. The background detection system is built on Temporal Difference (TD) and Gaussian Filtering (GF), with multiple threshold and sigma (weight) values chosen according to the PR characteristics. Experimental results are obtained on real scenes. The speed and accuracy are compared with traditional background detection techniques, and the effectiveness of PR and parallel processing is also discussed.
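
A minimal sketch of the temporal-difference-plus-Gaussian-filtering core is below. It uses a single global threshold and sigma; the paper’s per-region values and parallel scheduling are omitted, and the numbers are illustrative:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def foreground_mask(prev, curr, threshold=25.0, sigma=1.0):
    """Absolute temporal difference between consecutive frames,
    Gaussian-smoothed, then thresholded into a foreground mask."""
    diff = np.abs(curr.astype(float) - prev.astype(float))
    return gaussian_filter(diff, sigma=sigma) > threshold

prev = np.zeros((8, 8))
curr = np.zeros((8, 8))
curr[3:5, 3:5] = 255  # a small moving object
mask = foreground_mask(prev, curr)
```

In the PR scheme, each region of the frame would call this routine with its own threshold and sigma, in parallel.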

Amplitude and Phase Analysis of EEG Signal by Complex Demodulation

Analysis of the amplitude and phase characteristics of the delta, theta and alpha bands at a localized time instant in EEG signals is important for characterizing information processing in the brain. In this paper, the complex demodulation method is used to analyze electroencephalographic (EEG) signals, particularly auditory evoked potential responses, with the required time resolution and designated frequency bandwidth. Complex demodulation decomposes the raw EEG signal into the three designated delta, theta and alpha bands, represented as a complex-valued signal at each sampled time instant, which enables the extraction of the amplitude envelope and phase information. On simulated test data and on real EEG signals acquired during an auditory attention task, the method extracts the phase offset, the instants of phase and frequency change, and the decomposed amplitude envelope for the delta, theta and alpha bands. The complex demodulation technique can thus be used efficiently in brain signal analysis whenever phase and amplitude information is required.
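
The core of complex demodulation can be sketched as follows: the signal is multiplied by a complex exponential at the band’s center frequency and low-pass filtered, so the magnitude and angle of the result give the amplitude envelope and phase. The sketch below (center frequency, bandwidth and filter order are illustrative choices, not the paper’s settings) demonstrates this on a synthetic 10 Hz “alpha” component:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def complex_demodulate(x, fs, f0, bw=2.0, order=4):
    """Shift the band centred at f0 to baseband, low-pass filter,
    and return the amplitude envelope and instantaneous phase."""
    t = np.arange(len(x)) / fs
    shifted = x * np.exp(-2j * np.pi * f0 * t)
    b, a = butter(order, bw / (fs / 2))  # cutoff normalized to Nyquist
    baseband = filtfilt(b, a, shifted.real) + 1j * filtfilt(b, a, shifted.imag)
    return 2 * np.abs(baseband), np.angle(baseband)

fs = 250.0
t = np.arange(0, 4, 1 / fs)
x = 3.0 * np.cos(2 * np.pi * 10 * t)  # synthetic 10 Hz "alpha" component
amp, phase = complex_demodulate(x, fs, f0=10.0)
```

Away from the edges, the recovered envelope is close to the true amplitude of 3 and the phase is close to zero, since the test signal is in phase with the demodulating carrier. Running the same routine three times with delta, theta and alpha center frequencies yields the per-band decomposition described above.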

A Deterministic Dynamic Programming Approach for Optimization Problem with Quadratic Objective Function and Linear Constraints

This paper presents a novel deterministic dynamic programming approach for solving optimization problems with a quadratic objective function and linear equality and inequality constraints. The proposed method employs backward recursion, in which computation proceeds from the last stage to the first in a multi-stage decision problem. A generalized recursive equation that gives the exact solution of the optimization problem is derived. The method is purely analytical and does not require an initial solution. Its feasibility is demonstrated with a practical example, and the numerical results show that the proposed method provides the global optimum with negligible computation time.
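
To illustrate the flavor of backward recursion on a quadratic objective with a linear constraint, consider minimizing sum_i c_i x_i^2 subject to sum_i x_i = R with x_i >= 0. The sketch below discretizes the state space, whereas the paper derives an exact analytical recursion; it is a didactic stand-in, not the paper’s method:

```python
def dp_quadratic(costs, R, step=0.01):
    """Backward-recursion DP for min sum_i costs[i] * x_i**2
    subject to sum_i x_i = R, x_i >= 0, on a discretized grid."""
    n_grid = int(round(R / step)) + 1
    # Last stage must consume whatever resource remains.
    value = [costs[-1] * (s * step) ** 2 for s in range(n_grid)]
    # Walk backward through the earlier stages.
    for c in reversed(costs[:-1]):
        value = [min(c * (x * step) ** 2 + value[s - x] for x in range(s + 1))
                 for s in range(n_grid)]
    return value[-1]

# Two stages with equal costs: the optimum splits R evenly
best = dp_quadratic([1.0, 1.0], 1.0)
```

For equal costs and R = 1 the grid recovers the analytical optimum x = (0.5, 0.5) with objective 0.5; an exact analytical recursion removes both the discretization error and the grid cost.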

Identifying Interactions in a Feeding System

In production processes, assembly holds considerable potential for increased efficiency in terms of lowering production costs. Due to the individualisation of customer requirements, product variants have increased in recent years; simultaneously, the share of automated production systems has grown. One challenge is to adapt the flexibility and adaptability of automated systems to these changes. The Institute for Production Systems and Logistics has developed an aerodynamic orientation system for feeding technology in which only four parameters must be adjusted when changing to other components. However, the time required to set these parameters is high, so one objective is to develop an optimisation algorithm for automatic parameter configuration. Knowledge of the interaction of the four parameters and of their effect on the quantities to be optimised is required in order to develop a more efficient algorithm. This article presents an analysis of the interactions between the parameters and their influence on the quality of feeding.
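
One standard way to quantify interactions between such parameters is a two-level factorial analysis. The sketch below computes main and interaction effects for two parameters; the feeding system has four, and the abstract does not state which analysis the paper uses, so this is an illustrative reduction only:

```python
def factorial_effects(y):
    """Main and interaction effects of a 2^2 full-factorial design.
    y holds the responses ordered [(-,-), (+,-), (-,+), (+,+)]."""
    ymm, ypm, ymp, ypp = y
    effect_a = ((ypm + ypp) - (ymm + ymp)) / 2   # main effect of A
    effect_b = ((ymp + ypp) - (ymm + ypm)) / 2   # main effect of B
    interaction = ((ypp - ymp) - (ypm - ymm)) / 2  # AB interaction
    return effect_a, effect_b, interaction

# Purely additive response: the interaction term vanishes
a, b, ab = factorial_effects([10.0, 14.0, 12.0, 16.0])
```

A non-zero interaction term would signal that the effect of one parameter on feeding quality depends on the level of another, which is exactly the knowledge an automatic configuration algorithm needs.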

Impact of Liquidity Crunch on Interbank Network

Most empirical studies have analyzed how liquidity risks faced by individual institutions turn into systemic risk. The recent banking crisis has highlighted the importance of grasping and controlling systemic risk, and the willingness of central banks to ease their monetary policies to save defaulting or illiquid banks. This last point suggests that banks may pay less attention to liquidity risk, which in turn can become an important new channel of loss. Financial regulation focuses on the most important and “systemic” banks in the global network. However, to quantify the expected loss associated with liquidity risk, it is worth analyzing the sensitivity of the various elements of the global bank network to this channel. A small bank is not considered potentially systemic, yet the interaction of many small banks together can become a systemic element. This paper analyzes the impact of the interaction of medium and small banks on a set of banks considered the core of the network. The proposed method uses an agent-based model in a two-class environment. For the first class, data from the actual balance sheets of 22 large, systemic banks (such as BNP Paribas or Barclays) are collected. For the second, to model a network as close as possible to the actual interbank market, 578 fictitious banks smaller than those of the first class are split into two groups of small and medium banks. All banks are active on the European interbank network and have deposit and market activity. A simulation of 12 three-month periods, representing a medium-term interval of three years, is run. In each period there is a set of behavioral actions: repayment of matured loans, liquidation of deposits, income from securities, collection of new deposits, new demands for credit, and securities sales; the last two actions are part of the refunding process developed in this paper.
To strengthen the reliability of the proposed model, the dynamics of the random parameters are managed with stochastic equations: they are rates whose variations are generated by the Vasicek model. The Central Bank is considered the lender of last resort, allowing banks to borrow at the REPO rate, and conditions under which banks are ejected from the system are introduced. A liquidity crunch due to an exogenous crisis is simulated in the first class, and the loss impact on the other bank classes is analyzed through aggregate values representing the aggregate of loans and/or borrowing between classes. It is shown that the three groups of the European interbank network do not respond in the same way, and that the intermediate banks are the most sensitive to liquidity risk.
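
The Vasicek model mentioned above generates mean-reverting rates via dr = a(b − r)dt + σ dW. A minimal Euler-discretized sketch, with parameter values that are illustrative rather than the paper’s calibration:

```python
import numpy as np

def vasicek_path(r0, a, b, sigma, dt, n, rng):
    """Euler discretization of the Vasicek SDE dr = a*(b - r)*dt + sigma*dW:
    the rate is pulled toward the long-run mean b at speed a."""
    r = np.empty(n + 1)
    r[0] = r0
    for i in range(n):
        dw = np.sqrt(dt) * rng.standard_normal()
        r[i + 1] = r[i] + a * (b - r[i]) * dt + sigma * dw
    return r

rng = np.random.default_rng(0)
# 12 quarterly steps spanning the three-year horizon; illustrative parameters
path = vasicek_path(r0=0.02, a=0.5, b=0.03, sigma=0.01, dt=0.25, n=12, rng=rng)
```

One such path per stochastic parameter drives the rate variations over the 12 simulated periods.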

A Distance Function for Data with Missing Values and Its Application

Missing values are common in real-world data. Since the performance of many data mining algorithms depends critically on being given a good metric over the input space, this paper defines a distance function for unlabeled datasets with missing values. We use the Bhattacharyya distance, which measures the similarity of two probability distributions, to define the new distance function. Under this definition, the distance between two points with no missing attribute values is simply the Mahalanobis distance; when one of the coordinates is missing, the distance is computed according to the distribution of the missing coordinate. The distance is general and can be used in any algorithm that computes distances between data points. Because its performance depends strongly on the chosen distance measure, we used the k-nearest-neighbor (kNN) classifier to evaluate how accurately the distance reflects object similarity. We experimented on standard numerical datasets from different fields in the UCI repository, simulating missing values and comparing the performance of the kNN classifier using our distance to three other basic methods. The experiments show that kNN with our distance function outperforms kNN with the other methods, while its runtime is only slightly higher.
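
One simple way to make the “distance according to the distribution of the missing coordinate” idea concrete, assuming a diagonal covariance, is to replace a missing coordinate’s contribution by its expected squared standardized deviation under the marginal. This is an illustrative instantiation only; the paper’s Bhattacharyya-based construction is more general:

```python
import numpy as np

def distance_missing(x, y, mu, var):
    """Mahalanobis-style distance with a diagonal covariance, where a
    missing coordinate (NaN) contributes the expected squared
    standardized deviation under its marginal distribution."""
    d2 = 0.0
    for xi, yi, mi, vi in zip(x, y, mu, var):
        if np.isnan(xi) and np.isnan(yi):
            d2 += 2.0                        # E[(X - Y)^2] / v, X, Y independent
        elif np.isnan(xi):
            d2 += 1.0 + (mi - yi) ** 2 / vi  # E[(X - yi)^2] / v
        elif np.isnan(yi):
            d2 += 1.0 + (mi - xi) ** 2 / vi
        else:
            d2 += (xi - yi) ** 2 / vi        # ordinary standardized term
    return np.sqrt(d2)

# Complete vectors reduce to the (diagonal) Mahalanobis distance
d_complete = distance_missing([1.0, 2.0], [3.0, 2.0], mu=[0.0, 0.0], var=[4.0, 1.0])
```

Any such function can be dropped into a kNN classifier in place of the Euclidean distance, which is how the evaluation above proceeds.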

Isolation and Classification of Red Blood Cells in Anemic Microscopic Images

Red blood cells (RBCs) are among the most commonly and intensively studied types of blood cells in cell biology. Anemia, a deficiency of RBCs, is characterized by a hemoglobin level below normal. In this study, an image-processing-based system was developed to localize and extract RBCs from microscopic images, and a machine learning approach was adopted to classify the localized anemic RBC images. Several textural and geometrical features are calculated for each extracted RBC, and the training set of features is analyzed using principal component analysis (PCA), chosen for its low computational complexity and its suitability for finding the most discriminating features, which can lead to accurate classification decisions. With the proposed method, RBCs were isolated in 4.3 seconds from an image containing 18 to 27 cells. The classifiers yielded accuracy rates of 100%, 99.99% and 96.50% for the k-nearest neighbor (k-NN) algorithm, the support vector machine (SVM) and the RBF neural network (RBFNN), respectively. Classification was also evaluated with sensitivity, specificity and kappa statistics. In conclusion, the classification results were obtained within a short time, and they improved when PCA was used.
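
The PCA-plus-classifier stage can be sketched in plain NumPy. The features below are synthetic stand-ins; the paper’s actual textural and geometrical features and its implementation choices are not specified in the abstract:

```python
import numpy as np

def pca_fit_transform(X, n_components):
    """Project feature vectors onto the top principal components."""
    Xc = X - X.mean(axis=0)
    # Right singular vectors of the centred data are the principal axes
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

def knn_predict(train_X, train_y, x, k=3):
    """Majority vote among the k nearest training samples."""
    d = np.linalg.norm(train_X - x, axis=1)
    nearest = train_y[np.argsort(d)[:k]]
    vals, counts = np.unique(nearest, return_counts=True)
    return vals[np.argmax(counts)]

# Synthetic two-class feature vectors standing in for the RBC features
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (20, 6)), rng.normal(5, 1, (20, 6))])
y = np.repeat([0, 1], 20)
Z = pca_fit_transform(X, 2)
pred = knn_predict(Z[1:], y[1:], Z[0])  # classify the held-out first sample
```

Reducing the feature vectors before classification is what keeps the k-NN step fast, which matches the low-computation rationale given for PCA above.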

Critical Analysis of the Hong Kong International Convention on Ship Recycling

In May 2009, the International Maritime Organization (IMO) adopted the Hong Kong International Convention for the Safe and Environmentally Sound Recycling of Ships to address growing concerns about the environmental, occupational health and safety risks related to ship recycling. The aim of the Hong Kong Convention is to provide a legally binding instrument ensuring that the process of ship recycling does not pose risks to human health, safety or the environment. In this paper, a critical analysis of the Hong Kong Convention is carried out to study its effectiveness in meeting its objectives. The Convention is examined in detail, including its background, main features, major stakeholders, strengths and weaknesses. Though it has several deficiencies, the Convention is a major breakthrough in not only recognizing but also dealing with the ill practices associated with ship recycling.

Legal Knowledge of Legislated Employment Rights: An Empirical Study

This article assesses the level of basic knowledge of statutory employment rights at the workplace as prescribed by the Malaysian Employment Act 1955. The statutory employment rights comprise a variety of individual rights such as the protection of wages, the statutory right to the general standard of working time, statutory rights to rest days, public holidays, annual leave and sick leave, as well as female employees’ statutory right to paid maternity leave. A field survey was carried out to collect data using self-administered questionnaires from Human Resource (HR) practitioners in small and medium-sized enterprises (SMEs). The results reveal that the level of basic knowledge of legislated employment rights varies between the different types of statutory rights, from high to low.

Implementation of Heuristics for Solving Travelling Salesman Problem Using Nearest Neighbour and Minimum Spanning Tree Algorithms

The travelling salesman problem (TSP) is a combinatorial optimization problem in which the goal is to find the shortest route that visits every city exactly once and returns to the starting city. In other words, the problem deals with finding a tour covering all cities so that the total distance and execution time are minimized. This paper adopts the nearest neighbour and minimum spanning tree algorithms to solve the well-known travelling salesman problem. The algorithms were implemented in the Java programming language and tested on three graphs forming TSP instances of 5, 10 and 229 cities. The computational results validate the performance of the proposed algorithms.
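
The paper’s implementation is in Java; a minimal Python sketch of the nearest-neighbour heuristic (start at a city, repeatedly move to the closest unvisited city, then close the tour) on a hypothetical 5-city instance:

```python
import math

def nearest_neighbour_tour(cities):
    """Start at city 0, repeatedly visit the closest unvisited city."""
    unvisited = set(range(1, len(cities)))
    tour = [0]
    while unvisited:
        cur = tour[-1]
        nxt = min(unvisited, key=lambda j: math.dist(cities[cur], cities[j]))
        unvisited.remove(nxt)
        tour.append(nxt)
    return tour

def tour_length(cities, tour):
    """Total length of the closed tour, including the return leg."""
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

# A 5-city instance on a line: the greedy tour visits them left to right
cities = [(0, 0), (1, 0), (2, 0), (3, 0), (10, 0)]
tour = nearest_neighbour_tour(cities)
```

The heuristic is O(n^2) and gives no optimality guarantee, which is why comparing it against an MST-based construction, as the paper does, is informative.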

The Use of TV and the Internet in the Social Context

This study examines the media habits of young people in Saudi Arabia, in particular their use of the Internet and television in the domestic sphere, and how use of the Internet affects other activities. To address the research questions, focus group interviews were conducted with Saudi university students. The study found that television has become a central part of social life within the household, representing a main source of family time, particularly during Ramadan, whereas the Internet is a solitary activity used in more private spaces. Furthermore, Saudi females were more likely to have their Internet access monitored and circumscribed by family members, with parents controlling the location and the amount of time spent online.