Kinematics and Control System Design of Manipulators for a Humanoid Robot

In this work, a new approach is proposed to control the manipulators of a humanoid robot. The kinematics of the manipulators, in terms of the position, velocity, acceleration, and torque of each joint, is computed using the Denavit-Hartenberg (D-H) convention. These variables are used to design the proposed manipulator control system. To support the development of a controller, a simulation of the manipulator is designed for the humanoid robot. The simulation is developed using the Virtual Reality Toolbox and Simulink in MATLAB. The Virtual Reality Toolbox provides the interface and controls to an environment built with the Virtual Reality Modeling Language (VRML); chains of bones are used to represent the robot.
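
For readers unfamiliar with the convention, the following is a minimal numpy sketch of the standard D-H link transform and its use in forward kinematics; the two-link parameters at the end are illustrative placeholders, not the paper's humanoid arm values.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Homogeneous transform from frame i-1 to frame i for D-H parameters."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [ 0,       sa,       ca,      d],
                     [ 0,        0,        0,      1]])

def forward_kinematics(joint_angles, dh_table):
    """Chain the link transforms to get the end-effector pose."""
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_table):
        T = T @ dh_transform(theta, d, a, alpha)
    return T

# Example: a planar 2-link arm (d = 0, alpha = 0, link lengths 0.3 m and 0.25 m)
print(forward_kinematics([np.pi / 4, -np.pi / 6], [(0, 0.3, 0), (0, 0.25, 0)]))
```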

The Cognitive Neuroscience of Vigilance – A Test of Temporal Decrement in the Attention Networks Test (ANT)

The aim of this study was to test whether the Attention Networks Test (ANT) shows temporal decrements in performance. Vigilance tasks typically show such decrements, which may reflect impairments in executive control resulting from cognitive fatigue. The ANT assesses executive control as well as alerting and orienting; it was therefore hypothesized that ANT executive control would deteriorate over time. Manipulations of task condition (trial composition) and masking were included in the experimental design in an attempt to magnify performance decrements. However, the results showed no temporal decrement on the ANT. The roles of task demands, cognitive fatigue, and participant motivation in producing this result are discussed. The ANT may not be an effective tool for investigating temporal decrements in attention.

Direction to Manage OTOP Entrepreneurship Based on Local Wisdom

OTOP (One Tambon One Product) entrepreneurship, which once created a substantial source of income for local Thai communities, is now in urgent need of public-sector assistance because of an oversupply of duplicative products, an inability to adjust costs and prices, a lack of innovation, and inadequate quality control. Moreover, middlemen persistently corner the OTOP market; local producers are easy prey because they do not know how to add value, how to create and maintain their own brand, or how to create proper packaging and labeling. The suggested solutions are for local OTOP producers to adopt modern management techniques, acquire the know-how to add value to their products, and resolve other marketing problems. The objectives of this research are to study current OTOP product management and to identify management directions that enhance the effectiveness of OTOP entrepreneurship in Nonthaburi Province, Thailand. There were 113 participants in this study. The research instruments comprised two parts: first, a questionnaire on current OTOP management practices; second, focus groups conducted to capture ideas and local wisdom. Data were analyzed using frequency, percentage, mean, and standard deviation, together with a synthesis of several small-group discussions. The findings reveal that 1) business resources: product quality is rated most important and product marketing least important; 2) business management: leadership is most important and raw-material planning least important; 3) business readiness: communication is most important and packaging least important; and 4) public-sector support: government certification is most important and sourcing of raw materials least important.

Development of a Robust Supply Chain for Dynamic Operating Environment

As we move further into the twenty-first century, organisations are under increasing pressure to deliver high product variation at a reasonable cost without compromising quality. In a number of cases this takes the form of a customised, high-variety, low-volume manufacturing system that requires prudent management of resources, among a number of functions, to achieve competitive advantage. Purchasing and supply chain management is one such function and, owing to its substantial interaction with external parties, needs to be managed strategically. This requires a number of primary and supporting tools that enable appropriate decisions to be made rapidly. This capability is especially vital in a dynamic environment, as it plays a pivotal role in increasing the profit margin of the product. Managing this function can be challenging in itself, and even more so for small and medium-sized enterprises (SMEs), given the limited resources and expertise at their disposal. This paper discusses the development of tools and concepts for effectively managing the purchasing and supply chain function; the developed tools and concepts provide a cost-effective way of managing this function within SMEs. The paper further shows the use of these tools within Contechs, a manufacturer of luxury boat interiors, and the benefits achieved as a result of this implementation. Finally, a generic framework for use in such environments is presented.

Big Bang – Big Crunch Learning Method for Fuzzy Cognitive Maps

Modeling complex dynamic systems, for which mathematical models are very difficult to establish, requires new and modern methodologies that exploit existing expert knowledge, human experience, and historical data. Fuzzy cognitive maps are simple yet powerful tools for simulating and analyzing such dynamic systems. However, human experts are subjective and can handle only relatively simple fuzzy cognitive maps; there is therefore a need to develop new approaches for the automated generation of fuzzy cognitive maps from historical data. In this study, a new learning algorithm, called Big Bang-Big Crunch (BB-BC), is proposed for the first time in the literature for the automated generation of fuzzy cognitive maps from data. Two real-world examples, namely a process control system and a radiation therapy process, and one synthetic model are used to demonstrate the effectiveness and usefulness of the proposed methodology.
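
As a rough illustration of the idea, not the authors' exact formulation, the sketch below learns an FCM weight matrix by BB-BC search: a random population (Big Bang) is collapsed to a fitness-weighted center of mass (Big Crunch) and regenerated around that center with a spread shrinking as 1/k. The sigmoid state update and the error-to-target fitness are common FCM learning choices assumed here.

```python
import numpy as np

def fcm_simulate(W, a0, steps=20):
    """Iterate the FCM state with a common update rule: a <- sigmoid(a + W^T a)."""
    a = a0.copy()
    for _ in range(steps):
        a = 1.0 / (1.0 + np.exp(-(a + W.T @ a)))
    return a

def bbbc_learn(target, a0, n_nodes, pop=50, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    dim = n_nodes * n_nodes

    def cost(w):  # distance between simulated steady state and desired state
        return np.linalg.norm(fcm_simulate(w.reshape(n_nodes, n_nodes), a0) - target)

    X = rng.uniform(-1, 1, (pop, dim))                   # Big Bang: random candidates
    for k in range(1, iters + 1):
        f = np.array([cost(x) for x in X]) + 1e-12       # lower cost = fitter
        cm = (X / f[:, None]).sum(0) / (1.0 / f).sum()   # Big Crunch: center of mass
        X = cm + rng.standard_normal((pop, dim)) / k     # re-bang with shrinking spread
        X = np.clip(X, -1, 1)                            # keep causal weights in [-1, 1]
    return cm.reshape(n_nodes, n_nodes)
```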

An Overview of the Application of Fuzzy Inference System for the Automation of Breast Cancer Grading with Spectral Data

Breast cancer is one of the most frequently occurring cancers in women throughout the world, including the U.K. The grading of this cancer plays a vital role in the prognosis of the disease. In this paper we present an overview of the use of the fuzzy inference system, an advanced computational method, as a tool for automating breast cancer grading. A new spectral data set obtained by Fourier Transform Infrared Spectroscopy (FTIR) of cancer patients is used for this study. The future work outlines the areas of fuzzy systems with potential for the automation of breast cancer grading.
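
The paper does not specify its rule base; the toy Mamdani-style sketch below only illustrates the mechanics of fuzzy inference for grading, with invented membership breakpoints, rules, and a single stand-in spectral feature.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def grade(feature):
    """Tiny Mamdani-style inference: one normalized spectral feature -> grade.
    All breakpoints and rules here are invented for illustration only."""
    low  = tri(feature, 0.0, 0.2, 0.5)
    med  = tri(feature, 0.3, 0.5, 0.7)
    high = tri(feature, 0.5, 0.8, 1.0)
    g = np.linspace(1, 3, 201)                       # grade universe: 1..3
    # Rules: low -> grade 1, med -> grade 2, high -> grade 3 (clipped consequents)
    agg = np.maximum.reduce([np.minimum(low,  tri(g, 0.5, 1.0, 1.5)),
                             np.minimum(med,  tri(g, 1.5, 2.0, 2.5)),
                             np.minimum(high, tri(g, 2.5, 3.0, 3.5))])
    return (g * agg).sum() / agg.sum()               # centroid defuzzification

print(round(grade(0.62), 2))   # a mid-to-high grade for this feature value
```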

Automated Particle Picking based on Correlation Peak Shape Analysis and Iterative Classification

Cryo-electron microscopy (CEM) in combination with single particle analysis (SPA) is a widely used technique for elucidating structural details of macromolecular assemblies at close-to-atomic resolutions. However, the development of automated software for SPA processing remains vital, since thousands to millions of individual particle images need to be processed. Here, we present our workflow for automated particle picking. Our approach augments classical correlation with peak-shape analysis and iteratively separates macromolecules from background by classification. The particle selection workflow thus provides a robust means for SPA with little user interaction. The performance of the presented tools is assessed on simulated and experimental data.
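
The abstract does not detail the peak-shape criterion; the sketch below shows the general shape of such a picker, with template cross-correlation followed by a crude sharpness test standing in for the paper's peak-shape analysis.

```python
import numpy as np
from scipy.signal import fftconvolve
from scipy.ndimage import maximum_filter

def pick_particles(micrograph, template, n=100, sharpness=1.5):
    """Correlate a template over the micrograph and keep only sharp peaks.
    The sharpness test (peak vs. local mean of the correlation map) is a
    stand-in for the paper's peak-shape analysis, which is not specified."""
    t = (template - template.mean()) / template.std()
    cc = fftconvolve(micrograph, t[::-1, ::-1], mode='same')   # cross-correlation
    local_max = (cc == maximum_filter(cc, size=template.shape[0]))
    ys, xs = np.where(local_max)
    order = np.argsort(cc[ys, xs])[::-1][:n]                   # strongest first
    picks, r = [], template.shape[0] // 2
    for y, x in zip(ys[order], xs[order]):
        patch = cc[max(y - r, 0):y + r + 1, max(x - r, 0):x + r + 1]
        if cc[y, x] > sharpness * patch.mean():                # crude shape criterion
            picks.append((y, x))
    return picks
```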

Finite Element Prediction on the Machining Stability of Milling Machine with Experimental Verification

Chatter vibration has been a troublesome problem for machine tools moving toward high-precision, high-speed machining. Essentially, machining performance is determined by the dynamic characteristics of the machine tool structure and the dynamics of the cutting process, which can be characterized in terms of the stability lobe diagram. Understanding the dynamic behavior of the machine tool can therefore help to enhance cutting stability. To assess the dynamic characteristics and machining stability of a vertical milling system under the influence of a linear guide, this study developed a finite element model that integrates the modeling of the linear components with the contact stiffness at the rolling interfaces. Both the finite element simulations and the experimental measurements reveal that the preload of the linear guide greatly affects the vibration behavior and milling stability of the vertical column and spindle head system, and the predictions of machining stability agree well with the cutting tests. It is believed that the proposed model can be successfully applied to evaluate the dynamic performance of machine tool systems of various configurations.
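
The abstract gives no formulas; as a reference point (the classical single-degree-of-freedom chatter limit behind a stability lobe diagram, in the Altintas-Budak form, not the paper's finite element model), the limiting axial depth of cut is

$$ a_{\lim} = \frac{-1}{2\,K_f\,\operatorname{Re}\!\left[G(\mathrm{i}\omega_c)\right]}, \qquad \operatorname{Re}\!\left[G(\mathrm{i}\omega_c)\right] < 0, $$

where $G$ is the oriented frequency response function at the tool tip, $K_f$ the cutting force coefficient, and $\omega_c$ the chatter frequency; sweeping $\omega_c$ over the range where $\operatorname{Re}[G] < 0$ and mapping each frequency to its compatible spindle speeds traces out the lobes.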

Generating High-Accuracy Tool Path for 5-axis Flank Milling of Globoidal Spatial Cam

A new tool path planning method for 5-axis flank milling of a globoidal indexing cam is developed in this paper. The globoidal indexing cam is a practical transmission mechanism owing to its high transmission speed, accuracy, and dynamic performance, but machining its profile is a complex and precise task. The profile surface of the globoidal cam is generated by the conjugate contact motion of the roller, and this complex surface is usually machined by the 5-axis point-milling method, which is time-consuming compared with flank milling. A tool path for 5-axis flank milling of the globoidal cam is therefore developed to improve cutting efficiency. The flank milling tool path is globally optimized according to the minimum zone criterion, so high accuracy is guaranteed. A computational example and a cutting simulation validate the developed method.

Decision Rule Induction in a Learning Content Management System

A learning content management system (LCMS) is an environment for supporting web-based learning content development. The primary function of the system is to manage the learning process and to generate content customized to the unique requirements of each learner. Beyond the supporting tools offered by several vendors, we propose to enhance LCMS functionality with rule-induction ability so as to individualize the presented content. Our induction technique is based on rough set theory. The induced rules are intended as supportive knowledge for guiding content flow planning. They can also be used as decision rules to help content developers manage the content delivered to each individual learner.
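
As a minimal illustration of rough-set rule induction (with an invented toy learner table, not the paper's LCMS data), the sketch below emits a rule only for condition patterns whose objects all share one decision, i.e. those inside a lower approximation; patterns with conflicting decisions fall into the boundary region and yield no certain rule.

```python
from collections import defaultdict

# Toy learner table: condition attributes -> decision. Attributes, values,
# and decisions are invented placeholders for illustration.
rows = [({'style': 'visual', 'score': 'low'},  'remedial'),
        ({'style': 'visual', 'score': 'high'}, 'advanced'),
        ({'style': 'verbal', 'score': 'low'},  'remedial'),
        ({'style': 'verbal', 'score': 'low'},  'standard')]

def certain_rules(rows, attrs):
    """Rules from rough-set lower approximations: keep a pattern only when
    all objects indiscernible on `attrs` agree on the decision."""
    classes = defaultdict(list)
    for cond, dec in rows:
        classes[tuple(cond[a] for a in attrs)].append(dec)
    return {key: decs[0] for key, decs in classes.items() if len(set(decs)) == 1}

print(certain_rules(rows, ['style', 'score']))
# ('visual','low') and ('visual','high') give certain rules; ('verbal','low')
# has conflicting decisions, so it lies in the boundary region and is skipped.
```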

Designing a Multilingual Auction Website for Selling Agricultural Products

The study aimed to identify the logical structure of the data and the particularities of developing and testing a website designed for selling farm products through online auctions. The research is based on a short literature review in the field and exploratory trials of successful models from other industries, in order to identify the advantages of using such a tool as well as the optimal structure and functionality of an auction portal. In the last part, the study focuses on the results of testing the website with potential beneficiaries. The conclusions underline that the particularities of some agricultural products could make selling them through online auctions difficult, but the use of such a system is perceived to bring significant improvements in the supply chain. The results call for more detailed study of the importance of quality standards for agricultural products sold via online auction, of the impact that implementing an online payment system could have on trade in agricultural products, and of the problems that could arise in using the website in different countries.
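
The abstract does not publish the schema; as a hypothetical sketch of the kind of logical data structure such a portal needs, the model below captures multilingual product names, a quality grade, and reserve-price bidding. All names and fields are inventions for illustration.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Listing:
    product_id: int
    names: dict[str, str]          # language code -> localized product name
    quality_grade: str             # e.g. an agreed agricultural quality standard
    quantity_kg: float
    reserve_price: float
    closes_at: datetime
    bids: list[tuple[str, float]] = field(default_factory=list)

    def place_bid(self, bidder: str, amount: float) -> bool:
        """Accept a bid only if it beats the current top bid (or the reserve)
        and the auction is still open."""
        top = max((a for _, a in self.bids), default=self.reserve_price)
        if amount > top and datetime.now() < self.closes_at:
            self.bids.append((bidder, amount))
            return True
        return False
```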

Online Control of Knitted Fabric Quality: Loop Length Control

A circular knitting machine produces fabric with more than two knitting tools (feeders). Variation of yarn tension between the different knitting tools causes stitches of different loop lengths during the knitting process. In this research, a new intelligent method is applied to control the loop length of stitches across the tools, based on the ideal stitch shape and the measured angle of stitch direction, since unequal loop lengths deform the stitches and deviate that angle. To measure the deviation of stitch direction under varying tension, image processing was applied to pictures of different fabrics taken under constant front lighting. The measured deformation is then translated into the loop-length cam correction needed to remove it. A fuzzy control algorithm is applied to modify the loop length at each knitting tool. The method was tested on knitted fabrics of various structures and yarns. The results show that it can control loop-length variation between knitting tools, based on stitch deformation, for knitted fabrics with different structures, densities, and yarn types.
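
The paper's rule base is not given; the sketch below shows the general form of such a controller, a small Sugeno-style fuzzy map from measured stitch-angle deviation to a loop-length cam correction, with illustrative membership ranges and output degrees.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function for a scalar input."""
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def cam_correction(angle_dev_deg):
    """Sugeno-style fuzzy map: stitch-angle deviation (deg) -> cam adjustment.
    Membership ranges and output degrees are assumptions, not the paper's."""
    neg  = tri(angle_dev_deg, -10, -5, 0)     # stitch leaning one way
    zero = tri(angle_dev_deg,  -2,  0, 2)     # stitch upright
    pos  = tri(angle_dev_deg,   0,  5, 10)    # stitch leaning the other way
    w = np.array([neg, zero, pos])
    out = np.array([-3.0, 0.0, 3.0])          # cam degrees per rule (assumed)
    return float((w * out).sum() / w.sum()) if w.sum() else 0.0

print(cam_correction(4.0))  # positive deviation -> positive cam correction
```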

A New Approach for Classifying Large Number of Mixed Variables

The issue of classifying objects into one of several predefined groups when the measured variables are of mixed types has been of interest among statisticians for many years. Several methods for dealing with this situation have been introduced, including parametric, semi-parametric, and nonparametric approaches. This paper discusses the problem of classifying data when the number of measured mixed variables is larger than the sample size. A proposed idea that integrates dimensionality reduction via principal component analysis (PCA) with a discriminant function based on the location model is discussed. The study aims to offer practitioners another potential tool for classification problems in which the observed variables are mixed and high-dimensional.
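
A minimal sketch of the two-stage idea on toy data (not the paper's dataset): PCA compresses the high-dimensional continuous block, and a discriminant is then fitted within each cell defined by the binary variables, mirroring the location model's cell structure.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n, p_cont = 60, 100                      # more continuous variables than samples
X_cont = rng.standard_normal((n, p_cont))
X_bin = rng.integers(0, 2, (n, 2))       # two binary variables -> 4 cells
y = rng.integers(0, 2, n)

Z = PCA(n_components=5).fit_transform(X_cont)   # dimensionality reduction

models = {}
for cell in {tuple(r) for r in X_bin}:          # one discriminant per cell
    mask = (X_bin == cell).all(axis=1)
    if len(set(y[mask])) > 1:                   # need both classes in the cell
        models[cell] = LinearDiscriminantAnalysis().fit(Z[mask], y[mask])

def classify(z_row, b_row):
    """Route an observation to its cell's discriminant (None if cell unseen)."""
    m = models.get(tuple(b_row))
    return m.predict(z_row.reshape(1, -1))[0] if m else None
```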

Investigation of Tool Temperature and Surface Quality in Hot Machining of Hard-to-Cut Materials

Turning hard-to-cut materials with uncoated carbide cutting tools not only reduces tool life but also impairs the surface roughness of the product. In this paper, the influence of the hot machining method was studied and is presented in two cases. Case 1: workpiece surface roughness with constant cutting parameters and an initial workpiece surface temperature of 300 °C. Case 2: tool temperature variation when cutting at two speeds, 78.5 m/min and 51 m/min. The workpiece material and tool used in this study were AISI 1060 steel (45 HRC) and an uncoated carbide TNNM 120408-SP10 (Sandvik Coromant), respectively. A gas flame heating source was used to preheat the workpiece surface to 300 °C, reducing the yield stress by about 15%. The experimental results show that the method can considerably improve the surface quality of the workpiece.

High Speed Bitwise Search for Digital Forensic System

The most common forensic activity is searching a hard disk for strings of data. Nowadays, investigators and analysts increasingly encounter large, even terabyte-sized data sets when conducting digital investigations, so a sequential search can take weeks to complete. There are two primary search methods: index-based search and bitwise search. Index-based searching is very fast once the index is built, but the initial indexing takes a long time. In this paper, we discuss a high-speed bitwise search model for large-scale digital forensic investigations. We used a pattern-matching board, of the kind generally used for network security, to search for strings and complex regular expressions. Our results indicate that in many cases the pattern-matching board can substantially increase the performance of digital forensic search tools.
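
For contrast with the hardware approach, a software bitwise search is just a linear scan; the sketch below scans a raw disk image in overlapping blocks so matches straddling a block boundary are not lost. The block size and maximum match length are assumptions.

```python
import re

def bitwise_search(image_path, pattern: bytes, block=1 << 20, max_len=256):
    """Linear scan of a raw disk image, the software analogue of the bitwise
    search the paper offloads to a pattern-matching board. Blocks overlap by
    max_len - 1 bytes so boundary-straddling matches are still seen; max_len
    must bound the longest possible match (an assumption here)."""
    rx = re.compile(pattern)
    hits, pos = set(), 0                       # set dedups overlap re-finds
    with open(image_path, "rb") as f:
        while True:
            f.seek(pos)
            buf = f.read(block + max_len - 1)
            if not buf:
                break
            hits.update(pos + m.start() for m in rx.finditer(buf))
            if len(buf) < block + max_len - 1:  # reached end of image
                break
            pos += block
    return sorted(hits)
```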

Plant Varieties Selection System

Meteorological and environmental data are now widely used in applications such as plant variety selection. Selecting the variety to plant in a given area is of utmost importance for all crops, including sugarcane, which has many varieties; a variety planted without such selection may not be adapted to the climate or soil conditions of the area. Poor growth, bloom drop, poor fruit, and low prices result from varieties not recommended for the planted area. This paper presents a plant variety selection system for planted areas in Thailand, built from meteorological and environmental data using decision tree techniques. Developed as an environmental data analysis tool, the software makes such analysis easier and faster. It is a front end to WEKA that provides fundamental data mining functions such as classification, clustering, and analysis, and it supports pre-processing and decision tree output with result export. The software can then export the results to the Google Maps API to display them and plot plant icons effectively.
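
The paper drives WEKA through its own front end; the sketch below shows the same decision-tree idea with scikit-learn, using invented feature values and placeholder variety names.

```python
from sklearn.tree import DecisionTreeClassifier, export_text

# Toy training table: per-area conditions -> recommended variety.
# Feature values and variety labels are placeholders, not the paper's data.
X = [[1200, 27, 6.0],   # rainfall (mm), mean temperature (C), soil pH
     [1500, 29, 5.5],
     [ 900, 25, 6.5],
     [1600, 30, 5.0]]
y = ['variety_A', 'variety_B', 'variety_C', 'variety_B']

tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
print(export_text(tree, feature_names=['rain_mm', 'temp_c', 'soil_ph']))
print(tree.predict([[1400, 28, 5.6]]))   # recommend a variety for a new area
```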

Efficient Pipelined Hardware Implementation of RIPEMD-160 Hash Function

In this paper an efficient implementation of the RIPEMD-160 hash function is presented. Hash functions are a special family of cryptographic algorithms used in technological applications with requirements for security, confidentiality, and validity. Applications such as PKI, IPSec, DSA, and MACs incorporate hash functions and are widely used today. RIPEMD-160 emerged from the need for algorithms that are very strong against cryptanalysis. The proposed hardware implementation can be synthesized easily for a variety of FPGA and ASIC technologies. Simulation results using commercial tools verified the efficiency of the implementation in terms of performance and throughput. Special care has been taken so that the proposed implementation does not introduce extra design complexity, while functionality is kept at the required level.

High Level Synthesis of Digital Filters Based On Sub-Token Forwarding

High level synthesis (HLS) is a process that generates a register-transfer level design for a digital system from its behavioral description. There are many HLS algorithms and commercial tools. However, most of these algorithms consider the behavioral description of the system only when a single token is presented to it. This approach does not exploit extra hardware efficiently, especially in the design of digital filters, where common operations may exist between successive tokens. In this paper, we modify the behavioral description to process multiple tokens in parallel. Unlike full parallel processing, which requires full hardware replication, our approach exploits the presence of common operations between successive tokens. The performance of the proposed approach is better than that of sequential processing and approaches that of full parallel processing as the hardware resources are increased.
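
A small software model of the sub-token idea: for a symmetric FIR filter, the product h[0]*x[n] computed for one output token is exactly the value needed as h[2]*x[n] two tokens later, so one multiplier result can be forwarded instead of recomputed. The stream and tap values are illustrative, not the paper's benchmark filters.

```python
# Sub-token sharing on a symmetric FIR filter, y[n] = sum_k h[k] * x[n-k],
# with h[0] == h[2]: a cached product is forwarded to a later token.
h = [0.25, 0.5, 0.25]               # symmetric taps: h[0] == h[2]
x = [1.0, 4.0, 2.0, 3.0, 5.0, 0.5]

mults = 0
cache = {}                          # (sample index, coefficient value) -> product

def mul(i, c):
    """Multiply, reusing a previously computed identical product if present."""
    global mults
    if (i, c) not in cache:
        mults += 1
        cache[(i, c)] = c * x[i]
    return cache[(i, c)]

y = [sum(mul(n - k, h[k]) for k in range(3)) for n in range(2, len(x))]
print(y, "multiplications:", mults)  # 10 instead of 12 without forwarding
```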

Bioprocessing of Proximally Analyzed Wheat Straw for Enhanced Cellulase Production through Process Optimization with Trichoderma viride under SSF

The purpose of the present work was to study the production of cellulase from Trichoderma viride in solid state fermentation (SSF), and to optimize the process parameters, using agricultural wheat straw as substrate; fungal conversion of lignocellulosic biomass for cellulase production is in increasing demand for various biotechnological applications. Optimization of process parameters is a necessary step toward a higher product yield. Several parameters, namely pretreatment, extraction solvent, substrate concentration, initial moisture content, pH, incubation temperature, and inoculum size, were optimized for enhanced production of the third most demanded industrially important enzyme, cellulase. The maximum cellulase activity of 398.10 ± 2.43 μM/mL/min was achieved when the proximally analyzed wheat straw substrate was pretreated with 2% HCl and extracted with distilled water, at 3% substrate concentration, 40% moisture content, an optimum pH of 5.5, an incubation temperature of 45 °C, and a 10% inoculum size.

JConqurr - A Multi-Core Programming Toolkit for Java

With the popularity of multi-core and many-core architectures, there is a great need for software frameworks that can support parallel programming methodologies. In this paper we introduce JConqurr, an Eclipse toolkit that is easy to use and provides robust support for flexible parallel programming. JConqurr is a multi-core and many-core programming toolkit for Java capable of supporting common parallel programming patterns, including task, data, divide-and-conquer, and pipeline parallelism. The toolkit uses an annotation and directive mechanism to convert sequential code into parallel code. In addition, we propose a novel mechanism to achieve parallelism using graphics processing units (GPUs). Experiments with common parallelizable algorithms have shown that our toolkit can be easily and efficiently used to convert sequential code to parallel code, and significant performance gains can be achieved.
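
JConqurr itself rewrites annotated Java source; as a conceptual analogue (in Python, not JConqurr's Java, and not its API), a decorator below plays the role of a directive, farming loop iterations out to a thread pool. The decorator and all names are inventions for illustration.

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_for(fn):
    """Stand-in for a directive that marks a loop body as data-parallel."""
    def run(items, workers=4):
        with ThreadPoolExecutor(max_workers=workers) as pool:
            return list(pool.map(fn, items))   # one task per loop iteration
    return run

@parallel_for
def process(item):          # the "loop body" a directive would annotate
    return item * item

print(process(range(8)))    # [0, 1, 4, 9, 16, 25, 36, 49]
```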