A Development of Home Service Robot using Omni-Wheeled Mobility and Task-Based Manipulation

In this paper, a smart home service robot, McBot II, which performs tasks such as mess cleanup in the house, is designed more optimally than other service robots. It is a newly developed and considerably more practical system than McBot I, which we developed two years ago. One characteristic attribute of mobile platforms equipped with a set of dependent wheels is their omni-directionality and their ability to realize complex translational and rotational trajectories for agile indoor navigation. Accurate coordination of the steering angle and spinning rate of each wheel is necessary for consistent motion. This paper develops a trajectory controller for a 3-wheel omni-directional mobile robot using a fuzzy azimuth estimator. A specialized anthropomorphic robot manipulator, which can be attached to the housemaid robot McBot II, is also developed in this paper. This built-in manipulator consists of two arms with 3 DOF (degrees of freedom) each and two hands with 3 DOF each. The robotic arm is optimally designed to satisfy both minimum mechanical size and maximum workspace. Minimum mass and length are required for the built-in cooperating-arms system, but these constraints make the workspace small. This paper proposes an optimal design method to overcome the problem by using a neck joint to move the arms horizontally forward/backward and a waist joint to move them vertically up/down. The robotic hand, which has two fingers and a thumb, is also optimally designed in a task-based concept. Finally, the good performance of the developed McBot II is confirmed through live tests of the mess-cleanup task.
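As an illustration of the omni-directional mobility the abstract refers to, the following minimal sketch (not taken from the paper) computes wheel speeds for a three-wheel omni base with fixed rollers spaced at 120°; the wheel angles, base radius R and wheel radius r are assumed values, and the paper's fuzzy azimuth estimator is not reproduced here.

```python
import math

def omni3_wheel_speeds(vx, vy, omega, R=0.2, r=0.05):
    """Inverse kinematics for a three-wheel omni base (wheels assumed at 0°, 120°, 240°).

    vx, vy -- desired body-frame velocities [m/s]
    omega  -- desired yaw rate [rad/s]
    R      -- distance from robot centre to each wheel [m] (assumed value)
    r      -- wheel radius [m] (assumed value)
    Returns the three wheel angular speeds [rad/s].
    """
    angles = [0.0, 2 * math.pi / 3, 4 * math.pi / 3]
    return [(-math.sin(a) * vx + math.cos(a) * vy + R * omega) / r for a in angles]

# Example: pure rotation in place at 0.5 rad/s
print(omni3_wheel_speeds(0.0, 0.0, 0.5))
```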

Assessing the Global Water Productivity of Some Irrigation Command Areas in Iran

The great challenge of the agricultural sector is to produce more crop from less water, which can be achieved by increasing crop water productivity. The modernization of irrigation systems offers a number of possibilities to raise the economic productivity of water and improve the virtual water status. The objective of the present study is to assess the global water productivity (GWP) within the major irrigation command areas of I.R. Iran. For this purpose, fourteen irrigation command areas located in different regions of Iran were selected. In order to calculate the global water productivity of the irrigation command areas, all data on the water delivered to the cropping pattern, cultivated area, crop water requirements, and yield production rate during 2002-2006 were gathered. In each of the command areas, it appears that the cultivated crops have a high amount of virtual water and could be replaced by crops with less virtual water. This is suggested solely on the basis of crop water consumption; when replacing crops, economic value as well as cultural and political factors must also be considered. The results indicated that the lowest GWP belongs to the Mahyar and Borkhar irrigation areas, 0.24 kg m⁻³, and the highest to the Dez irrigation area, 0.81 kg m⁻³. The findings indicate that water management in the two lowest-ranked irrigation areas is only marginally efficient. The difference in the GWP of the irrigation areas is due to variations in the cropping pattern and the amount of crop production, in addition to the factors affecting water use efficiency in the irrigation areas.
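For clarity, GWP as used here is the ratio of total crop production to total water delivered to a command area. A minimal sketch with hypothetical per-crop figures (crop names and quantities are illustrative, not values from the study):

```python
# Hypothetical per-crop records for one command area:
# production [kg] and delivered water [m^3]
crops = {
    "wheat":      {"production_kg": 1.2e8, "water_m3": 4.0e8},
    "sugar_beet": {"production_kg": 6.0e7, "water_m3": 1.5e8},
}

total_yield = sum(c["production_kg"] for c in crops.values())
total_water = sum(c["water_m3"] for c in crops.values())
gwp = total_yield / total_water          # kg of crop per cubic metre of delivered water
print(f"GWP = {gwp:.2f} kg/m3")
```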

Using Data Mining Techniques for Finding Cardiac Outlier Patients

In this paper we use data mining techniques to identify outlier patients who use large amounts of drugs over a long period of time. Any healthcare or health insurance system should deal with the quantities of drugs utilized by chronic disease patients. In the Kingdom of Bahrain, about 20% of the health budget is spent on medications. The managers of healthcare systems do not have enough information about how drugs are utilized by chronic disease patients, whether there is any misuse, and whether there are outlier patients. In this work, carried out in cooperation with the information department of the Bahrain Defence Force Hospital, we selected the data of cardiac patients for the period from 1 January 2008 to 31 December 2008 as the data for the model in this paper. We used three techniques to study drug utilization by cardiac patients: first a clustering technique, followed by a measure of clustering validity, and finally a decision tree as the classification algorithm. The clustering results divide the patients into three clusters according to drug utilization: the 1603 patients, who received 15,806 prescriptions during this period, can be partitioned into three groups, of which 23 patients (2.59%) who received 1316 prescriptions (8.32%) are classified as outliers. The classification algorithm shows that the average drug utilization, the age, and the gender of the patient can be considered the main predictive factors in the induced model.
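The abstract does not name the specific clustering algorithm used, so the following is only a minimal k-means sketch of the general idea: cluster patients on utilization features and treat the smallest cluster as the outlier group. The feature set and generated numbers are hypothetical.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical per-patient features: [age, prescriptions in 2008, avg. drug quantity per prescription]
rng = np.random.default_rng(0)
typical = rng.normal([65, 10, 2.5], [8, 3, 0.5], size=(200, 3))
heavy   = rng.normal([70, 55, 8.0], [6, 10, 1.5], size=(5, 3))
X = np.vstack([typical, heavy])

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
sizes = np.bincount(km.labels_)
outlier_cluster = sizes.argmin()                       # smallest cluster treated as outliers
print("outlier patients:", np.where(km.labels_ == outlier_cluster)[0])
```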

Improving Quality of Business Networks for Information Systems

Computer networks are an essential part of computer-based information systems, and their performance has a great influence on the whole information system. Measuring usability criteria and customer satisfaction for a small computer network is very important. In this article, an effective approach for measuring the usability of a business network in an information system is introduced. The usability process for networking provides a flexible and cost-effective way to assess the usability of a network and its products. In addition, the proposed approach can be used to certify network product usability late in the development cycle. Furthermore, it can be used to help develop usable interfaces very early in the cycle and gives a way to measure, track, and improve usability. Moreover, a new approach for fast information processing over computer networks is presented. The entire data set is collected into one long vector and then tested as a single input pattern. The proposed fast time delay neural networks (FTDNNs) use cross correlation in the frequency domain between the tested data and the input weights of the neural networks. It is proved mathematically and practically that the number of computation steps required by the presented time delay neural networks is less than that needed by conventional time delay neural networks (CTDNNs). Simulation results using MATLAB confirm the theoretical computations.
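The speed-up claimed for FTDNNs comes from replacing sliding-window dot products with a single frequency-domain cross correlation. The paper's simulations are in MATLAB; the sketch below shows the core operation in NumPy, assuming a real-valued input vector and weight vector of the same length.

```python
import numpy as np

def fft_cross_correlation(signal, weights):
    """Circular cross correlation between an input vector and a weight vector,
    computed in the frequency domain (the core idea behind FTDNNs)."""
    n = len(signal)
    S = np.fft.fft(signal, n)
    W = np.fft.fft(weights, n)
    # One multiplication per frequency bin: O(n log n) instead of O(n^2) time-domain sliding.
    return np.real(np.fft.ifft(S * np.conj(W)))
```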

Adoption of Appropriate and Cost Effective Technologies in Housing: Indian Experience

Construction cost in India is increasing at around 50 per cent above average inflation levels. Costs have registered increases of up to 15 per cent every year, primarily due to the cost of basic building materials such as steel, cement, bricks and timber and other inputs, as well as the cost of labour. As a result, the cost of construction using conventional building materials and methods is moving beyond affordable limits, particularly for low-income groups of the population as well as a large cross-section of middle-income groups. Therefore, there is a need to adopt cost-effective construction methods, either by upgrading traditional technologies using local resources or by applying modern construction materials and techniques with efficient inputs leading to economic solutions. This has become the most relevant aspect in the context of the large volume of housing to be constructed in both rural and urban areas and the limited availability of resources such as building materials and finance. This paper presents an overview of the housing status in India and the adoption of appropriate and cost-effective technologies in the country.

Aqueous Extract of Flacourtia indica Prevents Carbon Tetrachloride Induced Hepatotoxicity in Rat

Carbon tetrachloride (CCl4) is a well-known hepatotoxin, and exposure to this chemical is known to induce oxidative stress and cause liver injury through the formation of free radicals. Flacourtia indica, commonly known as 'Baichi', has been reported as an effective remedy for the treatment of a variety of diseases. The objective of this study was to investigate the hepatoprotective activity of an aqueous extract of the leaves of Flacourtia indica against CCl4-induced hepatotoxicity. Animals were pretreated with the aqueous extract of Flacourtia indica (250 and 500 mg/kg body weight) for one week and then challenged with CCl4 (1.5 ml/kg bw) in olive oil (1:1, v/v) on the 7th day. Serum marker enzymes (ALP, AST, ALT, total protein and total bilirubin) and TBARS levels (a marker of oxidative stress) were estimated in all study groups. Alterations in the levels of biochemical markers of hepatic damage such as AST, ALT, ALP, total protein, total bilirubin and lipid peroxides (TBARS) were tested in both the CCl4-treated and extract-treated groups. CCl4 elevated AST, ALT, ALP and lipid peroxides (TBARS) in the liver. Treatment with the aqueous extract of Flacourtia indica leaves (250 and 500 mg/kg) exhibited a significant protective effect by moderating the serum levels of AST, ALT, ALP, total protein and total bilirubin and the liver TBARS. These biochemical observations were supported by histopathological study of liver sections. From this preliminary study it is concluded that the aqueous extract of the leaves of Flacourtia indica protects the liver against oxidative damage and could be used as an effective protector against CCl4-induced hepatic damage. Our findings suggest that Flacourtia indica possesses good hepatoprotective activity.

Using the Polynomial Approximation Algorithm in the Algorithm 2 for Manipulator's Control in an Unknown Environment

Algorithm 2, for the movement of an n-link manipulator amidst arbitrary unknown static obstacles in the case when a sensor system supplies information about local neighborhoods of different points in the configuration space, is presented. Algorithm 2 guarantees reaching the target position in a finite number of steps. Algorithm 2 is reduced to a finite number of calls of a subroutine for planning a trajectory in the presence of known forbidden states. The polynomial approximation algorithm used as that subroutine is presented. The results of the implementation of Algorithm 2 are given.

Improvement of Durability of Wood by Maleic Anhydride

Wood, as a natural renewable material, is vulnerable to degradation by microorganisms and susceptible to dimensional change caused by water. In order to effectively improve the durability of wood, an active reagent, maleic anhydride (Man), was selected for wood modification. Man was first dissolved in a solvent and then penetrated into the porous structure of the wood under vacuum/pressure conditions. After a final catalyst-thermal treatment, the wood modification was complete. The test results indicate that acetone is a good solvent for transporting Man into the wood matrix. SEM observation showed that wood samples treated with Man kept a good cellular structure, indicating good penetration of Man into the wood cell walls. FTIR analysis suggested that Man reacted with hydroxyl groups on the wood cell walls through its anhydride ring, resulting in a reduction of the amount of hydroxyl groups and consequently good dimensional stability as well as fine decay resistance. Consequently, modification of wood with Man is an effective method for improving its durability.

Solving Facility Location Problem on Cluster Computing

Computing the facility location problem for every location in a country simultaneously is not easy. This paper describes solving the problem using cluster computing. The technique is to design a parallel algorithm based on local search with a single-swap method in order to solve the problem on clusters. The parallel implementation uses portable parallel programming, the Message Passing Interface (MPI), on a Microsoft Windows Compute Cluster. The paper presents the algorithm, which uses local search with the single-swap method, and the MPI implementation on the cluster of the system that decides which facilities to open. When large datasets are considered, the process of calculating a reasonable cost for a facility becomes time-consuming. The results show that parallel computation of the facility location problem on the cluster speeds up and scales well as the problem size increases.
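For readers unfamiliar with the single-swap heuristic, a minimal serial sketch is given below, assuming a p-facility formulation where `dist[c][f]` is the cost of serving client c from facility f (notation and cost structure are assumptions, not the paper's exact model). The paper's contribution is distributing the evaluation of candidate swaps across cluster nodes with MPI, which is not shown here.

```python
def assignment_cost(open_fac, clients, dist):
    # Cost of serving every client from its nearest open facility.
    return sum(min(dist[c][f] for f in open_fac) for c in clients)

def single_swap_local_search(facilities, clients, dist, p):
    """Local search with single swaps for a p-facility location problem (serial sketch)."""
    open_fac = set(facilities[:p])                  # arbitrary initial solution
    best = assignment_cost(open_fac, clients, dist)
    improved = True
    while improved:
        improved = False
        for f_in in list(open_fac):
            for f_out in facilities:
                if f_out in open_fac:
                    continue
                candidate = (open_fac - {f_in}) | {f_out}
                cost = assignment_cost(candidate, clients, dist)
                if cost < best:                     # accept the first improving swap
                    open_fac, best, improved = candidate, cost, True
    return open_fac, best
```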

Over-Height Vehicle Detection in Low Headroom Roads Using Digital Video Processing

In this paper we present a new method for over-height vehicle detection on low-headroom streets and highways using digital video processing. Its accuracy, its lower price compared with existing detectors such as laser radars, and its capability of providing extra information such as speed and height measurements make this method more reliable and efficient. In this algorithm, features are selected and tracked using the KLT algorithm. A blob extraction algorithm is also applied, using background estimation and subtraction. The world coordinates of the features lying inside the blobs are then estimated using a novel calibration method. Once the heights of the features are calculated, a threshold is applied to select over-height features and eliminate the others. The over-height features are segmented using association criteria and grouped using an undirected graph, and are then tracked through sequential frames. The resulting groups correspond to over-height vehicles in the scene.
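The thresholding and grouping steps can be illustrated with the short sketch below; it is not the paper's implementation. The height limit and linking distance are assumed values, and connected components of a simple proximity graph stand in for the paper's association criteria.

```python
import numpy as np

def over_height_groups(features_world, limit_m=4.2, link_dist_m=1.0):
    """Select features above a height limit and group nearby ones into candidate vehicles."""
    # features_world: iterable of (x, y, z) in metres, z = estimated height above the road
    pts = np.asarray(features_world, dtype=float)
    over = pts[pts[:, 2] > limit_m]

    # Undirected graph: connect over-height features closer than link_dist_m in the ground plane,
    # then extract connected components (union-find) as candidate vehicles.
    n = len(over)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(over[i, :2] - over[j, :2]) < link_dist_m:
                parent[find(i)] = find(j)

    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(over[i])
    return list(groups.values())
```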

Rarefactive and Compressive Solitons in Warm Dusty Plasma with Electrons and Nonthermal Ions

Dust acoustic solitary waves are studied in a warm dusty plasma containing negatively charged dust grains, nonthermal ions and Boltzmann-distributed electrons. The Sagdeev pseudopotential method is used to investigate solitary wave solutions in these plasmas. The existence of compressive and rarefactive solitons is studied.

On the Need to have an Additional Methodology for the Psychological Product Measurement and Evaluation

Cognitive science appeared about 40 years ago, in the wake of the challenge of artificial intelligence, as common territory for several scientific disciplines such as information technology, mathematics, psychology, neurology, philosophy, sociology, and linguistics. The newborn science was justified on one hand by the complexity of the problems related to human knowledge, and on the other by the fact that none of the above-mentioned sciences could explain mental phenomena alone. Based on data supplied by experimental sciences such as psychology and neurology, models of the operation of the human mind are built in cognitive science. These models are implemented in computer programs and/or electronic circuits (specific to artificial intelligence) – cognitive systems – whose competences and performances are compared to human ones, leading to the reinterpretation of psychology and neurology data and, in turn, to the construction of new models. In these processes, while psychology provides the experimental basis, philosophy and mathematics provide the level of abstraction utterly necessary for the mediation between the mentioned sciences. The ongoing general problematic of the cognitive approach offers two important types of approach: the computational one, starting from the idea that mental phenomena can be reduced to calculus operations on 1s and 0s, and the connectionist one, which considers the products of thinking to be the result of the interaction between all the component (included) systems. In the field of psychology, measurements in the computational register use classical questionnaires and psychometric tests, generally based on calculation methods. Considering things from both sides that represent cognitive science, we can notice a gap in the possibilities of measuring psychological products regarded from the connectionist perspective, which requires a unitary understanding of the quality-quantity whole. In such an approach, measurement by calculation proves to be inefficient. Our research, carried out over more than 20 years, leads to the conclusion that measurement by forms properly fits the laws and principles of connectionism.

Grid Based and Random Based Ant Colony Algorithms for Automatic Hose Routing in 3D Space

Ant colony algorithms have been applied to difficult combinatorial optimization problems such as the travelling salesman problem and the quadratic assignment problem. In this paper, grid-based and random-based ant colony algorithms are proposed for automatic 3D hose routing, and their pros and cons are discussed. The algorithm uses the tessellated format for the obstacles and the generated hoses in order to detect collisions. Representing obstacles and hoses in the tessellated format greatly helps the algorithm handle free-form objects and speeds up computation. The performance of the algorithm has been tested on a number of 3D models.

Multiscale Analysis and Change Detection Based on a Contrario Approach

Automatic methods for detecting changes from satellite imagery are the object of growing interest, especially because of the numerous applications linked to analysis of the Earth's surface or the environment (monitoring vegetation, updating maps, risk management, etc.). This work implemented spatial analysis techniques using images with different spatial and spectral resolutions acquired on different dates. The work was first based on the principle of control charts, in order to set the upper and lower limits beyond which a change would be noted. The a contrario approach was then used, by testing different thresholds at which the difference calculated between two pixels is significant. Finally, labeled images were considered, giving a particularly low difference, which meant that the number of "false changes" could be estimated according to a given limit.
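To make the a contrario idea concrete, the sketch below flags a pixel as a meaningful change only when the expected number of false alarms (NFA) over the whole image falls below a limit. It assumes a Gaussian noise model for the no-change hypothesis, which is an illustrative choice and not necessarily the background model used in this work.

```python
import numpy as np
from scipy.stats import norm

def a_contrario_changes(diff, sigma, eps=1.0):
    """Flag significant changes in a per-pixel difference image (a contrario sketch).

    diff  -- difference between the two dates, one value per pixel
    sigma -- noise standard deviation under the no-change hypothesis (assumed Gaussian H0)
    eps   -- allowed expected number of false alarms over the whole image
    """
    n_tests = diff.size
    p = 2.0 * norm.sf(np.abs(diff), scale=sigma)   # two-sided tail probability under H0
    nfa = n_tests * p                              # expected number of false alarms
    return nfa < eps                               # True where the change is non-accidental
```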

Improvement of Synchronous Machine Dynamic Characteristics via Neural Network Based Controllers

This paper presents a simulation and experimental study aimed at investigating the effectiveness of an adaptive artificial neural network stabilizer in enhancing the damping torque of a synchronous generator. For this purpose, a power system comprising a synchronous generator feeding a large power system through a short tie line is considered. The proposed adaptive neuro-control system consists of two multi-layered feed-forward neural networks, which work as a plant model identifier and a controller. It generates supplementary control signals to be utilized by conventional controllers. The details of the interfacing circuits, sensors and transducers, which were designed and built for use in the tests, are presented. The synchronous generator is tested to investigate the effect of tuning a Power System Stabilizer (PSS) on its dynamic stability. The obtained simulation and experimental results verify the basic theoretical concepts.

PCR based Detection of Food Borne Pathogens

Many high-risk pathogens that cause disease in humans are transmitted through various food items. Food-borne disease constitutes a major public health problem, and assessment of the quality and safety of foods is important for human health. Rapid and easy detection of pathogenic organisms facilitates precautionary measures to maintain healthy food. The polymerase chain reaction (PCR) is a handy tool for rapid detection of low numbers of bacteria. We have designed gene-specific primers for the most common food-borne pathogens, such as Staphylococci, Salmonella and E. coli. Bacteria were isolated from food samples from various food outlets and identified using gene-specific PCRs. We identified Staphylococci, Salmonella and E. coli O157 in various food samples using gene-specific primers by a rapid, direct PCR technique. This study helps us obtain a complete picture of the various pathogens that threaten to cause and spread food-borne diseases, and it would also enable the establishment of a routine procedure and methodology for rapid identification of food-borne bacteria using the rapid technique of direct PCR. This study will also enable us to judge the efficiency of the present food safety measures taken by food manufacturers and exporters.

Clustering Mixed Data Using Non-normal Regression Tree for Process Monitoring

In the semiconductor manufacturing process, large amounts of data are collected from the various sensors of multiple facilities. The data collected from the sensors have several different characteristics due to variables such as product types, preceding processes and recipes. In general, Statistical Quality Control (SQC) methods assume normality of the data in order to detect out-of-control states of processes. Because the collected data have different characteristics, using them directly as inputs to SQC increases the variation of the data, requires wide control limits, and decreases the ability to detect out-of-control states. Therefore, it is necessary to separate similar data groups from the mixed data for more accurate process control. In this paper, we propose a regression tree using a split algorithm based on the Pearson distribution system to handle non-normal distributions in a parametric manner. The regression tree finds similar properties of the data across different variables. Experiments using real semiconductor manufacturing process data show improved fault detection performance.

Bank Business Models and The Changes in CEE Countries

The aim of this article is to assess the existing business models used by the banks operating in the CEE countries in the period from 2006 to 2011. In order to obtain research results, the authors performed a qualitative analysis of the scientific literature on bank business models, which have been grouped into clusters consisting of the following components: 1) capital and reserves; 2) assets; 3) deposits; and 4) loans. In turn, the bank business models have been developed based on the types of core activities of the banks and have been divided into four groups: wholesale, investment, retail and universal banks. Descriptive statistics have been used to analyse the models, determining the mean, minimum and maximum values of the constituent cluster components, as well as the standard deviation. The analysis of the data is based on such bank performance indices as Return on Assets (ROA) and Return on Equity (ROE).
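The descriptive-statistics step can be reproduced with a short grouped aggregation; the sketch below is only illustrative, and the file name and column names are hypothetical placeholders for the authors' dataset.

```python
import pandas as pd

# One row per bank-year with its business-model group and balance-sheet components (hypothetical file).
banks = pd.read_csv("cee_banks_2006_2011.csv")

components = ["capital_reserves", "assets", "deposits", "loans", "ROA", "ROE"]
stats = banks.groupby("business_model")[components].agg(["mean", "min", "max", "std"])
print(stats)
```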

A Comparison Study of Inspector's Performance between Regular and Complex Tasks

This research compared inspectors' performance between regular and complex visual inspection tasks. The visual task was simulated on a DVD read-control circuit, and the inspection task was performed on a computer. The subjects were 10 randomly selected undergraduates, tested for 20/20 vision. The subjects were then divided into two groups of five: one for the regular inspection task (control group) and one for the complex inspection task (treatment group). The results showed that the performance of the regular and complex inspectors differed significantly at the 0.05 level. Inspectors performing the regular inspection detected a higher percentage of defects in the same amount of time as the complex inspection. This indicates that inspector performance was affected by the type of visual inspection task.
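The significance statement corresponds to a two-sample comparison between the two groups of five inspectors. A minimal sketch is shown below with entirely hypothetical detection percentages (the study's actual data are not given in the abstract); Welch's t-test is used here as one reasonable choice of test.

```python
from scipy import stats

# Hypothetical percentage-of-defects-detected scores for the two groups of five inspectors
regular  = [92, 88, 95, 90, 93]
complex_ = [78, 74, 80, 76, 79]

t, p = stats.ttest_ind(regular, complex_, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}")   # p < 0.05 -> difference significant at the 0.05 level
```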

A New Self-Adaptive EP Approach for ANN Weights Training

Evolutionary Programming (EP) represents a methodology within Evolutionary Algorithms (EA) in which mutation is the main reproduction operator. This paper presents a novel EP approach for Artificial Neural Network (ANN) learning. The proposed strategy consists of two components: the self-adaptive component, which contains phenotype information, and the dynamic component, which is described by the genotype. Self-adaptation is achieved by the addition of a value, called the network weight, which depends on the total number of hidden layers and the average number of neurons in the hidden layers. The dynamic component changes its value depending on the fitness of the chromosome exposed to mutation. Thus, the mutation step size is controlled by two components, encapsulated in the algorithm, which adjust it according to the characteristics of a predefined ANN architecture and the fitness of a particular chromosome. The comparative analysis of the proposed approach and classical EP (Gaussian mutation) showed that a significant acceleration of the evolution process is achieved by using both phenotype and genotype information in the mutation strategy.
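The mutation scheme can be summarized as a Gaussian perturbation whose step size combines an architecture-dependent term and a fitness-dependent term. The sketch below illustrates that structure only; the exact formulas for the network weight and the dynamic scaling are not given in the abstract, so the forms used here are assumptions.

```python
import numpy as np

def ep_mutate(weights, fitness, n_hidden_layers, avg_neurons, rng=None):
    """Gaussian EP mutation with an architecture-based and a fitness-based step-size component.

    weights         -- flat array of ANN weights (one chromosome)
    fitness         -- fitness of the parent chromosome (assumed error-like: lower is better)
    n_hidden_layers -- number of hidden layers in the predefined ANN architecture
    avg_neurons     -- average number of neurons per hidden layer
    """
    rng = rng or np.random.default_rng()
    # Self-adaptive (phenotype) component: assumed to shrink the step for larger networks.
    network_weight = 1.0 / (n_hidden_layers * avg_neurons)
    # Dynamic (genotype) component: poorer chromosomes are perturbed more strongly.
    sigma = network_weight * fitness
    return weights + rng.normal(0.0, sigma, size=weights.shape)
```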