Identification of Individual Objects at the Intelligent Assembly Cell

This contribution presents a complex design for the identification of individual objects in the workplace of an intelligent assembly cell. The intelligent assembly cell is situated at the Institute of Manufacturing Systems and Applied Mechanics and is used for pneumatic actuator assembly. The pneumatic actuator components are a pneumatic roller, a cover, a piston and a spring. Two alternatives for identifying the objects to be assembled are designed for the workplace of the industrial robot. The contribution evaluates the alternatives and selects the suitable one for identification, a 2D code reader. The complex design of individual object identification draws on knowledge of intelligent manufacturing systems. Intelligent assembly and manufacturing systems, as systems of a new generation, are gradually being introduced into mechanical production; they remove human operations from the production process and also shorten production times.

Modern Kazakhstan in Global World After Independence

The article deals with the problems of political and economic processes in Kazakhstan since independence in the context of globalization. It analyzes the geopolitical situation and the country's self-positioning in the world after the end of the Cold War. It examines the problems of internal economization of the Republic over the 20 years of independence. The authors argue that the reforms carried out in the economic sphere have brought ambiguous yet tangible results. Despite difficult economic and political conditions, including a world economic crisis, the country has undergone fundamental and radical transformations of its whole socio-economic system.

Use of Item Response Theory in Medical Surgical Nursing Achievement Examination

Medical Surgical Nursing is one of the major subjects in nursing. This study examined the validity and reliability of an achievement examination using Classical Test Theory (CTT) and Item Response Theory (IRT). The study addressed the following objectives: (a) to establish the validity and reliability of the achievement examination using Classical Test Theory and Item Response Theory; (b) to determine the dimensionality of the items; and (c) to compare the item difficulty and item discrimination of the Medical Surgical Nursing achievement examination under CTT and IRT. The developed instrument was administered to fourth-year nursing students (N = 136) of a private university in Manila. The findings yielded the following results: the achievement examination is reliable under both CTT and IRT, the person and item statistics from the two frameworks are very similar, and the achievement examination forms a unidimensional construct.
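As a concrete illustration of the CTT side of the comparison, the sketch below computes item difficulty (proportion correct) and item discrimination (corrected item-total correlation) from a 0/1 response matrix; the small score matrix is purely hypothetical and not the study's data.

```python
import numpy as np

def ctt_item_statistics(responses):
    """Classical Test Theory item statistics for a 0/1 response matrix.

    responses: array of shape (n_examinees, n_items), 1 = correct, 0 = incorrect.
    Returns per-item difficulty (proportion correct) and discrimination
    (point-biserial correlation between the item and the rest-of-test score).
    """
    responses = np.asarray(responses, dtype=float)
    difficulty = responses.mean(axis=0)              # p-value of each item
    n_items = responses.shape[1]
    discrimination = np.empty(n_items)
    total = responses.sum(axis=1)
    for j in range(n_items):
        rest = total - responses[:, j]               # corrected item-total score
        discrimination[j] = np.corrcoef(responses[:, j], rest)[0, 1]
    return difficulty, discrimination

# Hypothetical response matrix (5 examinees, 4 items)
scores = [[1, 0, 1, 1],
          [1, 1, 1, 0],
          [0, 0, 1, 0],
          [1, 1, 1, 1],
          [0, 0, 0, 0]]
p, d = ctt_item_statistics(scores)
print("difficulty:", p)
print("discrimination:", d)
```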

Stochastic Scheduling to Minimize Expected Lateness in Multiple Identical Machines

There are many real-world problems in which parameters such as the arrival time of new jobs, failure of resources, and completion time of jobs change continuously. This paper tackles the problem of scheduling jobs with random due dates on multiple identical machines in a stochastic environment. First, the longest processing time (LPT) rule is used to assign jobs to the different machine centers; the particular sequence in which the jobs are processed on each machine is then found using simple stochastic techniques. The performance parameter under consideration is the maximum lateness with respect to the stochastic due dates, which are independent and exponentially distributed. Finally, a relevant problem is solved using the techniques presented in the paper.
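For illustration, a minimal sketch of the two steps is given below: the LPT assignment of jobs to identical machines, and the expected lateness of a sequence when each due date is exponentially distributed (so its mean is the reciprocal of its rate). The job data are hypothetical and the scoring is a simplification of the stochastic analysis in the paper.

```python
import heapq

def lpt_assign(processing_times, n_machines):
    """Longest Processing Time rule: assign each job (longest first)
    to the machine with the smallest current load."""
    machines = [(0.0, m) for m in range(n_machines)]      # (load, machine id)
    heapq.heapify(machines)
    assignment = {m: [] for m in range(n_machines)}
    jobs = sorted(range(len(processing_times)),
                  key=lambda j: processing_times[j], reverse=True)
    for j in jobs:
        load, m = heapq.heappop(machines)
        assignment[m].append(j)
        heapq.heappush(machines, (load + processing_times[j], m))
    return assignment

def expected_lateness(sequence, processing_times, due_rates):
    """Expected lateness of each job in a given sequence on one machine,
    assuming an exponentially distributed due date with rate due_rates[j]
    (so the mean due date is 1 / due_rates[j])."""
    t, lateness = 0.0, {}
    for j in sequence:
        t += processing_times[j]                # completion time C_j
        lateness[j] = t - 1.0 / due_rates[j]    # E[L_j] = C_j - E[d_j]
    return lateness

# Hypothetical 6 jobs on 2 identical machines
p = [4, 7, 2, 5, 3, 6]
rates = [0.2, 0.1, 0.5, 0.25, 0.4, 0.15]
assign = lpt_assign(p, 2)
for m, seq in assign.items():
    print(m, seq, expected_lateness(seq, p, rates))
```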

Robust Digital Cinema Watermarking

With the advent of digital cinema and digital broadcasting, copyright protection of video data has become one of the most important issues. We present a novel method of watermarking for video image data based on hardware and discrete wavelet transform techniques and name it "traceable watermarking", because the watermarked data are constructed before the transmission process and traced after they have been received by an authorized user. In our method, we embed the watermark into the lowest part of each image frame of the decoded video using a hardware LSI. Digital cinema is an important application for traceable watermarking, since a digital cinema system makes use of watermarking technology during content encoding, encryption, transmission, decoding and all the intermediate processes performed in digital cinema systems. The watermark is embedded into randomly selected movie frames using hash functions, and the embedded watermark information can be extracted from the decoded video data without access to the original movie data. Our experimental results show that the proposed traceable watermarking method for digital cinema systems outperforms conventional watermarking techniques in terms of robustness, image quality, speed, simplicity and structural robustness.
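The abstract does not detail the hash-based frame selection, but one plausible scheme, sketched below under that assumption, hashes a shared secret key with a counter so that embedder and detector derive the same pseudo-random set of frame indices; the key and frame counts shown are hypothetical.

```python
import hashlib

def select_frames(secret_key, n_frames, n_selected):
    """Pseudo-randomly select frame indices for watermark embedding by
    hashing a secret key with a counter; the same key reproduces the
    same frame set on the detector side."""
    selected = set()
    counter = 0
    while len(selected) < n_selected:
        digest = hashlib.sha256(f"{secret_key}:{counter}".encode()).hexdigest()
        selected.add(int(digest, 16) % n_frames)
        counter += 1
    return sorted(selected)

# Hypothetical key and a two-hour movie at 24 fps
print(select_frames("movie-release-0421", n_frames=172800, n_selected=8))
```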

Small Wind Turbine Hybrid System for Remote Application: Egyptian Case Study

The objective of this research is to study the technical and economic performance of a wind/diesel/battery (W/D/B) system supplying a remote small gathering of six families, using the HOMER software package. The electrical energy caters to their basic needs, for which the daily load pattern is estimated. Net Present Cost (NPC) and Cost of Energy (COE) are used as economic criteria, while the measure of technical performance is the percentage of power shortage. Technical and economic parameters are defined to estimate the feasibility of the system under study, and optimum system configurations are estimated for two sites. The HOMER simulation results show that W/D/B systems are economical for the assumed community sites, as the price of generated electricity is about 0.308 $/kWh, without taking external benefits into consideration. W/D/B systems are more economical than W/B or diesel-only systems, as the COE is 0.86 $/kWh for W/B and 0.357 $/kWh for diesel alone.
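For reference, the two economic criteria can be computed from the total annualized system cost as sketched below: the net present cost divides by the capital recovery factor, and the cost of energy divides by the annual energy actually served. The discount rate, project lifetime and cost figures shown are illustrative assumptions, not the study's inputs.

```python
def crf(rate, years):
    """Capital recovery factor used to annualize a present cost."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def npc_and_coe(annualized_cost, energy_served_kwh, rate=0.06, years=25):
    """Net Present Cost and Cost of Energy from the total annualized cost
    ($/yr) and the annual electrical load actually served (kWh/yr)."""
    net_present_cost = annualized_cost / crf(rate, years)
    cost_of_energy = annualized_cost / energy_served_kwh
    return net_present_cost, cost_of_energy

# Hypothetical figures for a small wind/diesel/battery system
npc, coe = npc_and_coe(annualized_cost=5500.0, energy_served_kwh=18000.0)
print(f"NPC = {npc:,.0f} $, COE = {coe:.3f} $/kWh")
```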

Partial Derivatives and Optimization Problem on Time Scales

The optimization problem on time scales is studied. A time scale is a model of time, and the language of time scales seems to be an ideal tool for unifying the continuous-time and discrete-time theories. In this work we present necessary conditions for a solution of an optimization problem on time scales. To obtain this result we use properties and results of the partial diamond-alpha derivatives of continuous multivariable functions, which are also presented here.
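For readers unfamiliar with the notation, these results rest on the diamond-alpha derivative, which on a time scale is the convex combination of the delta (forward) and nabla (backward) derivatives:

```latex
% Diamond-alpha derivative on a time scale \mathbb{T}, for fixed \alpha \in [0,1]:
\[
  f^{\Diamond_\alpha}(t) \;=\; \alpha\, f^{\Delta}(t) \;+\; (1-\alpha)\, f^{\nabla}(t),
  \qquad t \in \mathbb{T}^{\kappa}_{\kappa}.
\]
% For \mathbb{T} = \mathbb{R} this reduces to the ordinary derivative f'(t);
% for \mathbb{T} = \mathbb{Z} it gives \alpha\,\Delta f(t) + (1-\alpha)\,\nabla f(t),
% so the same necessary conditions cover both the continuous and discrete cases.
```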

Basic Tendency Model in Complete Factor Synergetics of Complex Systems

The deviation between the target state variable and the actual state variable is used to form the state-tending factor of a complex system, which reflects the process by which the complex system tends toward rationalization. Based on the system of basic equations of complete factor synergetics, consisting of twenty nonlinear stochastic differential equations, two new models are set up, called the rationalizing tendency model and the non-rationalizing tendency model, respectively. We can therefore extend the theory of programming with an objective function and constraint conditions, suitable only for the realm of human activities, into a new analysis with a tendency function and constraint conditions suitable for the whole field of complex systems.

An Application of the Sinc-Collocation Method to a Three-Dimensional Oceanography Model

In this paper, we explore the applicability of the Sinc-Collocation method to a three-dimensional (3D) oceanography model. The model describes a wind-driven current with depth-dependent eddy viscosity in the complex-velocity system. In general, Sinc-based methods excel over traditional numerical methods due to their exponentially decaying errors, rapid convergence and ability to handle problems with singularities at the endpoints. Together with these advantages, the Sinc-Collocation approach we utilize exploits first-derivative interpolation, whose integration is much less sensitive to numerical errors. We present several model problems to demonstrate the accuracy, stability, and computational efficiency of the method. The approximate solutions determined by the Sinc-Collocation technique are compared to exact solutions and to those obtained by the Sinc-Galerkin approach in earlier studies. Our findings indicate that the Sinc-Collocation method outperforms other Sinc-based methods from past studies.
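For context, the collocation scheme is built on translated sinc functions on a uniform grid; a brief reminder of the basis and the cardinal expansion (in the usual notation, not necessarily the paper's) is given below.

```latex
% Translated sinc basis on a uniform grid of step h:
\[
  S(k,h)(x) = \operatorname{sinc}\!\left(\frac{x - kh}{h}\right),
  \qquad
  \operatorname{sinc}(x) =
  \begin{cases}
    \dfrac{\sin(\pi x)}{\pi x}, & x \neq 0,\\[4pt]
    1, & x = 0.
  \end{cases}
\]
% A function f is approximated by the truncated cardinal expansion
\[
  f(x) \;\approx\; \sum_{k=-N}^{N} f(kh)\, S(k,h)(x),
\]
% whose error decays exponentially with N for suitably analytic f --
% the source of the rapid convergence noted above.
```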

The Impact of High Performance Work Systems on Firm Performance in MNCs and Local Manufacturing Firms in Malaysia

Empirical studies on High Performance Work Systems (HPWSs) and their impact on firm performance have been remarkably scarce in developing countries. This paper reviews the literature on HPWS practices in different work settings in Western and Asian countries. The review of the empirical research leads to the conclusion that country differences influence Human Resource Management (HRM) practices. It is anticipated that there are similarities and differences in the extent to which Malaysian manufacturing firms implement HPWS practices, due to organizational contextual factors, and that HPWSs have a significant impact on better firm performance among both MNCs and local firms.

A Subtractive Clustering Based Approach for Early Prediction of Fault Proneness in Software Modules

In this paper, a subtractive clustering based fuzzy inference system approach is used for the early detection of faults in function-oriented software systems. The approach has been tested with real-time defect datasets of the NASA software projects PC1 and CM1. Both the code-based model and the joined model (a combination of the requirement-based and code-based metrics) of the datasets are used for training and testing the proposed approach. The performance of the models is recorded in terms of Accuracy, MAE and RMSE values, and the proposed approach performs better in the case of the joined model. The results obtained show that clustering and fuzzy logic together provide a simple yet powerful means of modeling the early detection of faults in function-oriented software systems.
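As a sketch of the clustering step, the snippet below implements a Chiu-style subtractive clustering pass, in which every data point is a candidate centre whose potential is reduced around each accepted centre; the radii, stopping rule and toy data are assumptions for illustration, not the parameters used in the study.

```python
import numpy as np

def subtractive_clustering(X, ra=0.5, rb=None, eps_stop=0.15):
    """Chiu-style subtractive clustering: each point's potential is the sum
    of Gaussian contributions from all points; each accepted centre then
    subtracts its own influence from the remaining potentials."""
    X = np.asarray(X, dtype=float)
    rb = rb or 1.5 * ra
    alpha, beta = 4.0 / ra**2, 4.0 / rb**2
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
    potential = np.exp(-alpha * d2).sum(axis=1)
    centres = []
    p_first = potential.max()
    while True:
        i = int(potential.argmax())
        if potential[i] < eps_stop * p_first:              # simplified stopping rule
            break
        centres.append(X[i])
        potential -= potential[i] * np.exp(-beta * d2[:, i])
    return np.array(centres)

# Toy example: two well-separated blobs of normalized metric values
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.2, 0.05, (30, 2)), rng.normal(0.8, 0.05, (30, 2))])
print(subtractive_clustering(X, ra=0.4))
```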

A Framework for Personalized Multi-Device Information Communicating System

Due to the mobility of users, many information systems are now developed with the capability of supporting information retrieval by both static and mobile users. Hence, the amount, content and format of the retrieved information need to be tailored to the device and the user who requested it. This paper presents a framework for the design and implementation of such a system, to be developed for communicating final-examination-related information to the academic community at one university in Malaysia. The concept of personalization is implemented in the system so that only highly relevant information is delivered to the users; the personalization is based on user profiling as well as context. The system in its final state will be accessible through cell phones as well as intranet-connected personal computers.

Aliveness Detection of Fingerprints using Multiple Static Features

Fake finger submission attacks are a major problem in fingerprint recognition systems. In this paper, we introduce an aliveness detection method based on multiple static features derived from a single fingerprint image. The static features comprise individual pore spacing, residual noise and several first-order statistics; specifically, a correlation filter is adopted to measure individual pore spacing. These multiple static features reflect the physiological and statistical characteristics of live and fake fingerprints. Classification is performed by calculating a liveness score for each feature and fusing the scores through a classifier. On our dataset, we compare nine classifiers, and the best classification rate of 85% is attained by a Reduced Multivariate Polynomial classifier. Our approach is fast and convenient for aliveness checking in field applications.

Real Time Monitoring of Long Slender Shaft by Distributed-Lumped Modeling Techniques

The aim of this paper is to determine, in real time, the stress levels at the end of a long slender shaft, such as a drilling assembly used in the oil and gas industry, using a mathematical model. The torsional deflection experienced by this type of drilling shaft (a hollow shaft about 4 km long and 20 cm in diameter with a wall thickness of 1 cm) can only be determined using a distributed modeling technique. The main objective of this project is to calculate the angular velocity and torque at the end of the shaft by the TLM method and to analyze the behavior of the system through its transient response. The obtained results are compared with those of the lumped modeling technique; the importance of these results becomes evident only after this comparison. The two models have different transient responses, and because of the length of the shaft the transient response is very important in this project.

Network Intrusion Detection Design Using Feature Selection of Soft Computing Paradigms

The network traffic data provided for the design of intrusion detection are always large, contain much ineffective information, and include only limited and ambiguous information about users' activities. We study these problems and propose a two-phase approach in our intrusion detection design. In the first phase, we develop a correlation-based feature selection algorithm to remove the worthless information from the original high-dimensional database. Next, we design an intrusion detection method to address the uncertainty caused by limited and ambiguous information. In the experiments, we use six UCI databases and the DARPA KDD99 intrusion detection data set as our evaluation tools. Empirical studies indicate that our feature selection algorithm is capable of reducing the size of the data set, and that our intrusion detection method achieves better performance than the participating intrusion detectors.
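A minimal sketch of a correlation-based feature selection pass is given below, using a standard CFS-style merit (mean feature-class correlation traded against mean feature-feature correlation) with a greedy forward search; the merit formula and the toy data are illustrative assumptions, since the abstract does not specify the exact algorithm.

```python
import numpy as np

def cfs_merit(X, y, subset):
    """CFS-style merit of a feature subset:
    merit = k * mean(|feature-class corr|) /
            sqrt(k + k*(k-1) * mean(|feature-feature corr|))."""
    k = len(subset)
    rcf = np.mean([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in subset])
    if k == 1:
        rff = 0.0
    else:
        rff = np.mean([abs(np.corrcoef(X[:, a], X[:, b])[0, 1])
                       for i, a in enumerate(subset) for b in subset[i + 1:]])
    return k * rcf / np.sqrt(k + k * (k - 1) * rff)

def forward_cfs(X, y):
    """Greedy forward search: repeatedly add the feature that most
    improves the merit, stopping when no addition helps."""
    remaining, chosen, best = list(range(X.shape[1])), [], 0.0
    while remaining:
        merit, j = max((cfs_merit(X, y, chosen + [j]), j) for j in remaining)
        if merit <= best:
            break
        chosen.append(j)
        remaining.remove(j)
        best = merit
    return chosen, best

# Toy data: only features 0 and 2 actually carry class information
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 6))
y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(float)
print(forward_cfs(X, y))
```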

Numerical Analysis of Thermal Conductivity of Non-Charring Ablation Materials (Carbon-Carbon and Graphite) Considering Chemical Reaction Effects, Mass Transfer and Surface Heat Transfer

Nowadays, there is little information concerning heat shield systems, and the available information is not reliable enough for many cases; for example, precise calculations cannot be made for various materials. In addition, full-scale tests have two disadvantages, high cost and low flexibility, and for each case a new test must be performed. Hence, a numerical modeling program that calculates the surface recession rate and the interior temperature distribution is necessary. A numerical solution of the governing equation for non-charring material ablation is presented in order to predict the recession rate and the heat response of non-charring heat shields. The governing equation is nonlinear, and the Newton-Raphson method along with the TDMA (tridiagonal matrix algorithm) is used to solve the resulting nonlinear equation system. Using the Newton-Raphson method is one of the advantages of the solution approach, because the method is simple and can be easily generalized to more difficult problems. The obtained results are compared with reliable sources in order to examine the accuracy of the compiled code.
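For reference, the inner linear solver is sketched below: at each Newton-Raphson step the linearized system is tridiagonal and can be solved with the Thomas algorithm (TDMA). The small test system is hypothetical and only checks that the solver reproduces a known tridiagonal problem.

```python
import numpy as np

def tdma(a, b, c, d):
    """Thomas algorithm for a tridiagonal system:
    a = sub-diagonal (a[0] unused), b = diagonal, c = super-diagonal
    (c[-1] unused), d = right-hand side. Returns the solution vector."""
    n = len(d)
    cp, dp = np.empty(n), np.empty(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Hypothetical heat-conduction-like test: -x[i-1] + 2*x[i] - x[i+1] = 1
n = 5
a = np.full(n, -1.0); b = np.full(n, 2.0); c = np.full(n, -1.0); d = np.ones(n)
x = tdma(a, b, c, d)
A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
print(x, np.allclose(A @ x, d))   # should print True
```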

A New Approach to Polynomial Neural Networks Based on Genetic Algorithm

Recently, a lot of attention has been devoted to advanced techniques of system modeling. The PNN (polynomial neural network) is a GMDH-type (Group Method of Data Handling) algorithm and one of the useful methods for modeling nonlinear systems, but PNN performance depends strongly on the number of input variables and the order of the polynomial, which are determined by trial and error. In this paper, we introduce the GPNN (genetic polynomial neural network) to improve the performance of the PNN. GPNN determines the number of input variables and the order of all neurons with a GA (genetic algorithm), which searches over all possible values for the number of input variables and the order of the polynomial. GPNN performance is evaluated on two nonlinear systems: a quadratic equation and the Dow Jones stock index time series are the two case studies used to obtain the GPNN performance.
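A simplified sketch of the genetic search is shown below: each chromosome encodes, per neuron, a number of inputs and a polynomial order, and a standard selection/crossover/mutation loop maximizes a fitness callback. The candidate ranges, operators and toy fitness are assumptions for illustration, since the paper's exact encoding is not given in the abstract.

```python
import random

# Candidate design choices searched by the GA for each PNN neuron
INPUT_CHOICES = [2, 3, 4]        # number of input variables per neuron
ORDER_CHOICES = [1, 2, 3]        # order of the partial polynomial

def random_chromosome(n_neurons):
    return [(random.choice(INPUT_CHOICES), random.choice(ORDER_CHOICES))
            for _ in range(n_neurons)]

def crossover(p1, p2):
    cut = random.randint(1, len(p1) - 1)
    return p1[:cut] + p2[cut:]

def mutate(chrom, rate=0.1):
    return [(random.choice(INPUT_CHOICES), random.choice(ORDER_CHOICES))
            if random.random() < rate else gene for gene in chrom]

def evolve(fitness, n_neurons=4, pop_size=20, generations=30):
    """Generic GA loop: the fitness callback would train a PNN with the
    encoded structure and return a score to maximize (e.g. negative
    validation error)."""
    pop = [random_chromosome(n_neurons) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

# Toy fitness standing in for real model error: prefer 3 inputs of order 2
best = evolve(lambda c: -sum((n - 3) ** 2 + (o - 2) ** 2 for n, o in c))
print(best)
```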

Analysis of DNA-Recognizing Enzyme Interaction using Deaminated Lesions

Deaminated lesions are produced via nitrosative oxidation of the natural nucleobases: uracil (Ura, U) from cytosine (Cyt, C), hypoxanthine (Hyp, H) from adenine (Ade, A), and xanthine (Xan, X) and oxanine (Oxa, O) from guanine (Gua, G). Such damaged nucleobases may cause mutagenic problems, so much attention and effort have been devoted to revealing their mechanisms in vivo and in vitro. In this study, we employed these deaminated lesions as useful probes for the analysis of DNA-binding and DNA-recognizing proteins or enzymes. Since the purine lesions Hyp, Oxa and Xan can be employed as analogues of guanine, their comparative use is informative for analyzing the role of Gua in the DNA sequence in DNA-protein interactions. Several DNA oligomers containing Hyp, Oxa or Xan substituted for Gua were designed to reveal the molecular interaction between DNA and protein. From this approach, we obtained useful information for understanding the molecular mechanisms of DNA-recognizing enzymes that could not have been observed using conventional DNA oligomers composed only of natural nucleobases.

Application of the Data Distribution Service for Flexible Manufacturing Automation

This paper discusses the applicability of the Data Distribution Service (DDS) for the development of automated and modular manufacturing systems which require a flexible and robust communication infrastructure. DDS is an emerging standard for data-centric publish/subscribe middleware that provides an infrastructure for platform-independent, many-to-many communication. It particularly addresses the needs of real-time systems that require deterministic data transfer and have low memory footprints and high robustness requirements. After an overview of the standard, several aspects of DDS are related to current challenges in the development of modern manufacturing systems with distributed architectures. Finally, an example application based on a modular active fixturing system is presented to illustrate the described aspects.

Comanche – A Compiler-Driven I/O Management System

Most scientific programs have large input and output data sets that require out-of-core programming or use of virtual memory management (VMM). Out-of-core programming is error-prone and tedious and is therefore generally avoided; however, in many instances VMM is not an effective approach either, because it often results in a substantial performance reduction. In contrast, compiler-driven I/O management allows a program's data sets to be retrieved in parts, called blocks or tiles. Comanche (COmpiler MANaged caCHE) is a compiler combined with a user-level runtime system that can be used to replace standard VMM for out-of-core programs. We describe Comanche and demonstrate on a number of representative problems that it substantially outperforms VMM. Significantly, our system does not require any special services from the operating system and does not require modification of the operating system kernel.
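To illustrate the blocking idea (not Comanche's actual interface), the sketch below processes a large on-disk matrix one tile of rows at a time, keeping only a bounded amount of data resident instead of letting the VMM page the whole array; the file name and shape are hypothetical.

```python
import numpy as np

def column_means_tiled(path, shape, dtype=np.float64, tile_rows=4096):
    """Process a large on-disk matrix one tile (block of rows) at a time,
    so only tile_rows * n_cols elements are resident at once instead of
    relying on the OS virtual memory manager to page the whole array."""
    n_rows, n_cols = shape
    data = np.memmap(path, dtype=dtype, mode="r", shape=shape)
    acc = np.zeros(n_cols, dtype=np.float64)
    for start in range(0, n_rows, tile_rows):
        tile = np.array(data[start:start + tile_rows])   # explicit block read
        acc += tile.sum(axis=0)
    return acc / n_rows

# Hypothetical usage: a 1,000,000 x 64 matrix stored as raw float64
# means = column_means_tiled("big_matrix.bin", shape=(1_000_000, 64))
```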