A Web-Based System for Mapping Features into ISO 14649-Compliant Machining Workingsteps

The rapid development of manufacturing and information systems has caused significant changes in manufacturing environments in recent decades. Mass production has given way to flexible manufacturing systems, whose defining characteristic is customized, or "on demand", production. In this scenario, a seamless, gap-free information flow becomes a key success factor for enterprises. In this paper we present a framework to support the mapping of features into machining workingsteps compliant with the ISO 14649 standard (known as STEP-NC). The system determines how features can be machined with the available manufacturing resources. Examples of the mapping method are presented for features such as a pocket with a general surface.
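
For illustration, a minimal sketch of what a feature-to-workingstep mapping with a resource check could look like; the feature names, operations, and tool inventory below are hypothetical assumptions for this sketch, not the paper's actual rule base or the ISO 14649 data model.

    # Hypothetical shop resources and mapping rules (illustrative only).
    AVAILABLE_TOOLS = {"endmill_10mm", "ballnose_6mm", "drill_8mm"}

    RULES = {
        # feature type -> ordered (operation, required tool) pairs
        "closed_pocket":          [("plane_roughing", "endmill_10mm"),
                                   ("side_finishing", "endmill_10mm")],
        "pocket_general_surface": [("plane_roughing", "endmill_10mm"),
                                   ("freeform_finishing", "ballnose_6mm")],
        "round_hole":             [("drilling", "drill_8mm")],
    }

    def map_feature(feature_type):
        """Return the workingstep list for a feature, or report a missing resource."""
        steps = []
        for operation, tool in RULES[feature_type]:
            if tool not in AVAILABLE_TOOLS:
                raise ValueError(f"{operation} needs unavailable tool {tool}")
            steps.append({"operation": operation, "tool": tool})
        return steps

    print(map_feature("pocket_general_surface"))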

The Influence of the Commons Structure Modification on the Allocation

Tracing methods determine the contribution that power system sources make to supplying loads. These methods can be used to assess transmission prices and also to recover fixed transmission costs. This paper presents the influence that modifying the commons structure has on the specific transfer price. The operator must apply a few basic allocation principles; most tracing methods rest on the proportional sharing principle. In this paper Kirschen's method is used. To illustrate the method, we use the 25-bus test system developed within the Electrical Power Engineering Department in Timisoara, Romania.
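
A minimal sketch of the proportional sharing principle on which such tracing methods rest, applied at a single bus: each outflow is assumed to draw from the inflows in proportion to their magnitudes. The figures are illustrative, not taken from the 25-bus test system.

    inflows  = {"line_A": 60.0, "line_B": 40.0}   # MW entering the bus
    outflows = {"line_C": 70.0, "line_D": 30.0}   # MW leaving the bus

    total_in = sum(inflows.values())

    # share[out][src] = power on outflow 'out' attributed to inflow 'src'
    share = {
        out: {src: p_out * p_in / total_in for src, p_in in inflows.items()}
        for out, p_out in outflows.items()
    }

    for out, contrib in share.items():
        print(out, contrib)   # e.g. line_C -> {'line_A': 42.0, 'line_B': 28.0}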

Theoretical Analysis of a Crossed-Electrode 2D Array for 3D Imaging

Planar systems of electrodes arranged on both sides of a dielectric piezoelectric layer are applied in numerous transducers. They are capable of electronically steering the generated wave beam in both azimuth and elevation. Beam control is achieved by addressably driving the two-dimensional transducer, supplying suitable voltages to the electrodes on opposite surfaces of the layer. In this paper a semi-analytical method for analyzing the considered transducer is proposed; it generalizes the well-known BIS-expansion method, which was earlier exploited with great success in the theory of interdigital transducers of surface acoustic waves, the theory of elastic wave scattering by cracks, and certain advanced electrostatic problems. The corresponding nontrivial electrostatic problem is formulated and solved numerically.

Determination of the Characteristics for Ferroresonance Phenomenon in Electric Power Systems

Ferroresonance is a nonlinear electrical phenomenon that frequently occurs in power systems due to transmission line faults, single- or multi-phase switching on the lines, and the use of saturable transformers. In this study, ferroresonance phenomena are investigated using a model of the 380 kV West Anatolian Electric Power Network in Turkey. The ferroresonance event is observed as a result of removing the loads at the ends of the lines. Two different cases are considered. First, switching is applied at the second second, and ferroresonance effects are observed between the second and fourth seconds in the voltage variations of phase R. The ferroresonant and non-ferroresonant parts of the overall data are then compared with each other using Fourier transform techniques to expose the ferroresonance effects.
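
A minimal sketch of the kind of Fourier comparison described above: the spectra of a pre-switching and a post-switching window of the phase-R voltage are compared. The signals here are synthetic stand-ins, not the West Anatolian network data, and the sampling rate and threshold are assumptions.

    import numpy as np

    fs = 5000.0                        # sampling rate, Hz (assumed)
    t = np.arange(0, 2.0, 1.0 / fs)    # a 2 s analysis window

    normal = np.sin(2 * np.pi * 50 * t)                  # clean 50 Hz voltage
    ferro  = normal + 0.6 * np.sin(2 * np.pi * 150 * t)  # distorted window with
                                                         # an added 3rd harmonic

    def spectrum(x):
        """One-sided amplitude spectrum."""
        X = np.abs(np.fft.rfft(x)) / len(x) * 2
        f = np.fft.rfftfreq(len(x), 1.0 / fs)
        return f, X

    for label, x in (("normal", normal), ("ferroresonant", ferro)):
        f, X = spectrum(x)
        peaks = f[X > 0.1]             # frequencies with significant content
        print(label, peaks)            # the distorted window shows extra lines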

3G WCDMA Mobile Network DoS Attack and Detection Technology

Data traffic on 3G mobile networks has recently exploded due to the large increase in the number of smartphone users. Unlike a traditional wired infrastructure, 3G mobile networks have limited wireless resources and rely on complex signaling procedures to manage them, and mobile network security technologies for handling abnormal and malicious traffic are not yet mature. Malicious or potentially malicious traffic originating from malware-infected smart devices can therefore cause serious problems for 3G mobile networks, comparable to DoS and scanning attacks in wired networks. This paper describes the DoS security threat in the 3G mobile network and proposes a detection technology.
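
The abstract does not detail the proposed detector, so as one plausible illustration of the idea, here is a sketch that flags devices whose signaling-channel setup rate far exceeds a baseline, since malware-driven traffic forces excessive radio resource setups. The window length and threshold are assumptions, not the paper's parameters.

    from collections import Counter

    WINDOW_SECONDS = 60
    MAX_SETUPS_PER_WINDOW = 120        # assumed per-device baseline

    def detect_signaling_dos(events):
        """events: iterable of (device_id, timestamp) signaling setups
        observed within one window; returns the suspicious devices."""
        counts = Counter(device for device, _ in events)
        return {dev for dev, n in counts.items() if n > MAX_SETUPS_PER_WINDOW}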

A Traffic Simulation Package Based on Travel Demand

In this paper we propose a new traffic simulation package, TDMSim, which supports both macroscopic and microscopic simulation of free-flowing and regulated traffic systems. Both simulators are based on travel demands, which specify the numbers of vehicles departing from origins to arrive at different destinations. The microscopic simulator implements the car-following model given the pre-defined routes of the vehicles, but also supports the rerouting of vehicles. We also propose a macroscopic simulator that is integrated with the microscopic simulator, allowing the simulation to scale to larger networks without sacrificing the precision achievable with the microscopic simulator. The macroscopic simulator also enables the reuse of previous simulation results when simulating traffic on the same networks at a later time. Validations have been conducted to show the correctness of both simulators.
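
A minimal sketch of one microscopic car-following update step. The Intelligent Driver Model is used here as a stand-in, since the abstract does not name TDMSim's specific model; the parameters are typical textbook values, not TDMSim's.

    import math

    V0, T, A, B, S0 = 30.0, 1.5, 1.0, 2.0, 2.0   # desired speed, headway, accel, decel, min gap

    def idm_accel(v, dv, gap):
        """IDM acceleration for a follower with speed v, approach rate dv,
        and bumper-to-bumper gap to its leader."""
        s_star = S0 + max(0.0, v * T + v * dv / (2 * math.sqrt(A * B)))
        return A * (1 - (v / V0) ** 4 - (s_star / gap) ** 2)

    # one simulation step for a follower 50 m behind a slower leader
    v, v_leader, gap, dt = 25.0, 20.0, 50.0, 0.5
    a = idm_accel(v, v - v_leader, gap)
    v = max(0.0, v + a * dt)
    print(round(a, 3), round(v, 3))   # the follower brakes toward the leader's speed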

Analysis of Long-Term File System Activities on Cluster Systems

I/O workload is a critical factor in analyzing I/O patterns and maximizing file system performance. However, measuring the I/O workload of a running distributed parallel file system is non-trivial due to collection overhead and the large volume of data. In this paper, we measure and analyze file system activities on two large-scale cluster systems with TFlops-level high-performance computing resources. By comparing file system activities in 2009 with those in 2006, we analyze how I/O workloads have changed with advances in system performance and high-speed network technology.

Performance Evaluation of Complex Valued Neural Networks Using Various Error Functions

The backpropagation algorithm generally employs a quadratic error function; in fact, most problems that involve minimization use it. With alternative error functions the performance of the optimization scheme can be improved: the new error functions help suppress the ill effects of outliers and have shown good robustness to noise. In this paper we evaluate and compare the relative performance of complex-valued neural networks using different error functions. In the first simulation, for the complex XOR gate, we observe that error functions such as the absolute and Cauchy error functions can replace the quadratic error function. In the second simulation, we observe that for some error functions the performance of the complex-valued neural network depends on the network architecture, whereas for a few other error functions the convergence speed is independent of the architecture.
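
A minimal sketch of the three named error functions applied to complex-valued outputs; only the loss definitions are shown, not the paper's full complex-valued backpropagation network, and the target/output vectors below are fabricated for illustration.

    import numpy as np

    def quadratic(err):                 # standard sum-of-squares loss
        return 0.5 * np.sum(np.abs(err) ** 2)

    def absolute(err):                  # more robust to outliers
        return np.sum(np.abs(err))

    def cauchy(err, c=1.0):             # suppresses large residuals further
        return np.sum(0.5 * c ** 2 * np.log(1 + (np.abs(err) / c) ** 2))

    target = np.array([1 + 1j, -1 - 1j])        # complex XOR-style targets
    output = np.array([0.9 + 1.2j, -0.7 - 1j])  # a hypothetical network output
    err = output - target

    for loss in (quadratic, absolute, cauchy):
        print(loss.__name__, round(float(loss(err)), 4))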

GeoSEMA: A Modelling Platform, Emerging "GeoSpatial-based Evolutionary and Mobile Agents"

Spatial and mobile computing continue to evolve. This paper describes a smart modeling platform called "GeoSEMA". The approach models multidimensional GeoSpatial Evolutionary and Mobile Agents. Beyond 3D and location-based issues, other dimensions may characterize spatial agents, e.g., discrete versus continuous time and agent behaviors. GeoSEMA is conceived as a dedicated design pattern motivating temporal geographic-based applications; it is a firm foundation for multipurpose and multidimensional spatial-based applications. It deals with multipurpose smart objects (buildings, shapes, missiles, etc.) by simulating geospatial agents. Formally, GeoSEMA refers to geospatial, spatio-evolutive, and mobile space constituents, and a conceptual geospatial space model is given in this paper. In addition to modeling and categorizing geospatial agents, the model incorporates the concept of inter-agent event-based protocols. Finally, a rapid software-architecture prototype of the GeoSEMA platform is also given; it will be implemented and validated in the next phase of our work.

Design and Analysis of Gauge R&R Studies: Making Decisions Based on ANOVA Method

In a competitive production environment, critical decisions are based on data obtained by random sampling of product units. The effectiveness of these decisions depends on data quality and reliability, which leads to the need for a reliable measurement system. The process of examining and analyzing measurement errors is accordingly known as Measurement System Analysis (MSA). The aim of this research is to establish the necessity of, and provide assurance for, extensive development in analyzing measurement systems, particularly through Gauge Repeatability and Reproducibility (GR&R) studies, to improve physical measurements. Although repeatability and reproducibility gauges are now well established in manufacturing industries, they are not applied as effectively as other measurement system analysis methods. To introduce this method and provide feedback for improving measurement systems, this survey focuses on the ANOVA method, the most widespread way of calculating repeatability and reproducibility (R&R).
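
A minimal sketch of the ANOVA-based Gauge R&R calculation: variance components for a crossed part x operator study with r replicates, using the standard expected-mean-square relations. The measurement array below is fabricated for illustration only.

    import numpy as np

    # measurements[part, operator, replicate]
    x = np.array([[[20.1, 20.2], [20.4, 20.3]],
                  [[22.0, 21.9], [22.3, 22.4]],
                  [[19.5, 19.6], [19.8, 19.7]]])
    p, o, r = x.shape

    grand = x.mean()
    part_mean = x.mean(axis=(1, 2))
    oper_mean = x.mean(axis=(0, 2))
    cell_mean = x.mean(axis=2)

    ss_part = o * r * np.sum((part_mean - grand) ** 2)
    ss_oper = p * r * np.sum((oper_mean - grand) ** 2)
    ss_int = r * np.sum((cell_mean - part_mean[:, None] - oper_mean[None, :] + grand) ** 2)
    ss_err = np.sum((x - cell_mean[:, :, None]) ** 2)

    ms_part = ss_part / (p - 1)
    ms_oper = ss_oper / (o - 1)
    ms_int = ss_int / ((p - 1) * (o - 1))
    ms_err = ss_err / (p * o * (r - 1))

    repeatability = ms_err                                  # equipment variation
    var_oper = max(0.0, (ms_oper - ms_int) / (p * r))
    var_int = max(0.0, (ms_int - ms_err) / r)
    reproducibility = var_oper + var_int                    # appraiser variation
    var_part = max(0.0, (ms_part - ms_int) / (o * r))

    grr = repeatability + reproducibility
    print("%GRR =", round(100 * (grr / (grr + var_part)) ** 0.5, 1))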

On Pattern-Based Programming towards the Discovery of Frequent Patterns

The problem of frequent pattern discovery is defined as the process of searching for patterns, such as sets of features or items, that appear frequently in data. Finding such frequent patterns has become an important data mining task because it reveals associations, correlations, and many other interesting relationships hidden in a database. Most of the proposed frequent pattern mining algorithms have been implemented in imperative programming languages; this paradigm becomes inefficient when the set of patterns is large and the frequent patterns are long. We apply a high-level declarative style of programming to the problem of frequent pattern discovery, considering two languages: Haskell and Prolog. Our intuition is that finding frequent patterns should be implementable efficiently and concisely in a declarative paradigm, since pattern matching is a fundamental feature of most functional languages and of Prolog. Our frequent pattern mining implementations in Haskell and Prolog confirm our hypothesis about the conciseness of the programs. Comparative studies of lines of code, speed, and memory usage for declarative versus imperative programming are reported in the paper.
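
The paper's implementations are in Haskell and Prolog; purely to illustrate the task itself, here is a minimal Apriori-style level-wise search written in a comprehension-heavy, declarative-flavored Python (the shared code language of these notes), not a reproduction of the paper's programs.

    def frequent_patterns(transactions, min_support):
        """Return all itemsets appearing in at least min_support transactions."""
        tx = [frozenset(t) for t in transactions]
        items = {i for t in tx for i in t}

        def support(s):
            return sum(s <= t for t in tx)   # count transactions containing s

        level = [frozenset([i]) for i in items if support(frozenset([i])) >= min_support]
        result = list(level)
        while level:
            # candidate (k+1)-itemsets from unions of frequent k-itemsets
            candidates = {a | b for a in level for b in level if len(a | b) == len(a) + 1}
            level = [c for c in candidates if support(c) >= min_support]
            result += level
        return result

    print(frequent_patterns([{"a", "b", "c"}, {"a", "b"}, {"a", "c"}], 2))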

Protocol and Method for Preventing Attacks from the Web

Computer worms, viruses, and Trojan horses, collectively called malware, have become widespread. A decade ago, such malware merely damaged computers by deleting or rewriting important files; recent malware, however, seems built to earn money. Some malware collects personal information so that attackers can obtain secrets such as online banking passwords, evidence for a scandal, or contact addresses related to the target. Moreover, the relation between money and malware has grown more complex: many kinds of malware spawn bots to obtain springboards for further attacks. Meanwhile, countermeasures available to ordinary Internet users have hit a wall. Pattern matching wastes considerable computing resources, since matching tools have to deal with the many patterns derived from malware subspecies, which virus-making tools can generate automatically; moreover, metamorphic and polymorphic malware are no longer unusual. Recently, malware-checking sites have appeared that inspect content in place of users' PCs; however, a new type of malicious site has emerged that evades inspection by these checking sites. In this paper, existing protocols and methods related to the web are reconsidered in terms of protection from current attacks, and a new protocol and method are proposed for securing the web.

Independent Spanning Trees on Systems-on-chip Hypercubes Routing

Independent spanning trees (ISTs) provide a number of advantages in data broadcasting; examples include their use in fault-tolerant network protocols for distributed computing and in improving bandwidth utilization. However, constructing multiple ISTs is considered hard for arbitrary graphs. In this paper we present an efficient algorithm to construct ISTs on hypercubes that requires minimal resources to perform.
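
To illustrate only the setting, here is a sketch that builds n spanning trees of the hypercube Q_n rooted at node 0, where tree i moves each node toward the root by clearing its set bits in cyclic order starting at dimension i. Note this simple rule does not by itself guarantee the trees are independent (internally vertex-disjoint paths to the root); that property requires the more careful construction the paper presents.

    def parent(v, i, n):
        """Parent of node v (v != 0) in tree i of an n-cube."""
        for k in range(n):
            bit = (i + k) % n
            if v >> bit & 1:
                return v ^ (1 << bit)   # clear the first set bit from dim i

    n = 3
    for i in range(n):
        edges = [(v, parent(v, i, n)) for v in range(1, 2 ** n)]
        print(f"tree {i}:", edges)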

Programmable Logic Controller for Cassava Centrifugal Machine

Chaiyaphum Starch Co. Ltd. is one of many starch manufacturers that have introduced machinery to aid manufacturing. Even though machinery has replaced many manual elements and is now a significant part of the manufacturing process, problems remain that must be solved in the current process flow to increase efficiency. The paper's aim is to increase productivity while maintaining the desired starch quality by redesigning the flipping machine's mechanical control system, which has a grossly short functional lifetime. The problems stem from the mechanical control system's bearings, which fluids and humidity can enter directly, in tandem with vibrations from the machine's own operation. The wheel used to sense starch thickness occasionally falls from its shaft due to high-speed rotation during operation, while the shaft may bend from impact when processing dried bread. Redesigning the mechanical control system has increased its efficiency, allowing quality thickness measurement while extending the functional lifetime by an additional 62 days.

Generator of Hypotheses: An Approach to Data Mining Based on Monotone Systems Theory

Generator of hypotheses is a new method for data mining. It makes it possible to classify the source data automatically and produces a particular enumeration of patterns. A pattern is an expression (in a certain language) describing facts in a subset of the facts. The goal is to describe the source data via patterns and/or IF...THEN rules. The evaluation criteria used are deterministic (not probabilistic). The search results are trees, a form that is easy to comprehend and interpret. Generator of hypotheses uses a very efficient algorithm based on the theory of monotone systems (MS), named MONSA (MONotone System Algorithm).

A Taguchi Approach to Investigate Impact of Factors for Reusability of Software Components

Quantitative investigation of how individual factors contribute to the reusability of software components can help in evaluating the quality of developed or developing reusable components and in identifying reusable components in existing legacy systems, which can save the cost of developing software from scratch. However, the relative significance of the contributing factors has remained relatively unexplored. In this paper, we use Taguchi's approach to analyze the significance of different structural attributes, or factors, in deciding the reusability level of a particular component. The results show that complexity is the most important factor in deciding the reusability of function-oriented software, while for object-oriented software, coupling and complexity together play a significant role in achieving high reusability.
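
A minimal sketch of the Taguchi analysis style referred to above: larger-is-better signal-to-noise ratios per experimental run, then average S/N per factor level to rank factor significance. The L4 orthogonal array layout is standard, but the factor assignment and reusability scores are fabricated for illustration.

    import numpy as np

    # L4 orthogonal array: rows = runs, columns = factors A, B, C (2 levels each)
    L4 = np.array([[1, 1, 1],
                   [1, 2, 2],
                   [2, 1, 2],
                   [2, 2, 1]])

    # reusability scores observed for each run (two repetitions, assumed)
    y = np.array([[62, 65], [70, 68], [55, 58], [74, 71]], dtype=float)

    # larger-is-better S/N ratio: -10 * log10(mean(1 / y^2))
    sn = -10 * np.log10(np.mean(1.0 / y ** 2, axis=1))

    for f, name in enumerate("ABC"):
        effect = [sn[L4[:, f] == level].mean() for level in (1, 2)]
        print(name, "level means:", np.round(effect, 2),
              "range:", round(abs(effect[0] - effect[1]), 2))
    # the factor with the largest range of level means matters most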

Performance Analysis of List Scheduling in Heterogeneous Computing Systems

Given a parallel program to be executed on a heterogeneous computing system, the overall execution time of the program is determined by a schedule. In this paper, we analyze the worst-case performance of the list scheduling algorithm for scheduling tasks of a parallel program in a mixed-machine heterogeneous computing system such that the total execution time of the program is minimized. We prove tight lower and upper bounds for the worst-case performance ratio of the list scheduling algorithm. We also examine the average-case performance of the list scheduling algorithm. Our experimental data reveal that the average-case performance of the list scheduling algorithm is much better than the worst-case performance and is very close to optimal, except for large systems with large heterogeneity. Thus, the list scheduling algorithm is very useful in real applications.
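
A minimal sketch of one common list scheduling variant on a mixed-machine heterogeneous system: tasks are taken in list (priority) order and each is placed on the machine that finishes it earliest. For simplicity the tasks here are independent and the execution-time matrix is illustrative; the paper's setting also covers precedence-constrained tasks of a parallel program.

    # exec_time[t][m] = running time of task t on machine m
    exec_time = [[3, 5, 9],
                 [2, 4, 8],
                 [6, 6, 7],
                 [4, 2, 3]]

    def list_schedule(exec_time, n_machines):
        ready = [0.0] * n_machines          # time each machine becomes free
        assignment = []
        for t, row in enumerate(exec_time): # tasks in list (priority) order
            m = min(range(n_machines), key=lambda j: ready[j] + row[j])
            assignment.append((t, m, ready[m], ready[m] + row[m]))
            ready[m] += row[m]
        return assignment, max(ready)       # schedule and its makespan

    schedule, makespan = list_schedule(exec_time, 3)
    print(schedule, "makespan =", makespan)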

Developing Student Teachers to Be Professional Teachers

Practicum placements are a critical factor for student teachers in Education Programs. How can student teachers become professionals? This study investigated the problems, weaknesses, and obstacles of practicum placements and developed guidelines for partnership in practicum placements. In response to this issue, a partnership concept was implemented to develop student teachers into professionals. Data were collected through questionnaires on attitudes toward the problems, weaknesses, and obstacles of practicum placements of student teachers in Rajabhat universities, supplemented by focus group interviews. The research revealed that learning management, classroom management, curriculum, assessment and evaluation, classroom action research, and teacher demeanor are the important factors affecting the professional development of Education Program student teachers. Learning management planning and classroom management, concerning instructional design, teaching techniques, instructional media, and student behavior management, are other important aspects influencing the professional development of student teachers.

An Efficient Biometric Cryptosystem using Autocorrelators

Cryptography provides a secure means of transmitting information over an insecure channel. It authenticates messages based on the key, not on the user, and it requires a lengthy key to encrypt and decrypt messages; such keys can be guessed or cracked. Moreover, maintaining and sharing lengthy random keys for enciphering and deciphering is a critical problem in cryptographic systems. A new approach is described for generating a crypto key from a person's iris pattern. In the biometric field, a template created by a biometric algorithm can be authenticated only by the same person. Among biometric templates, iris features efficiently distinguish individuals and produce fewer false positives in large populations. The distribution of iris codes exhibits low intra-class variability, which helps the cryptosystem confidently decrypt messages given an exact match of the iris pattern. In the proposed approach, iris features are extracted using multi-resolution wavelets, producing a 135-bit iris code for each subject that is used for encrypting and decrypting messages. Autocorrelators are used to recall the original messages from the partially corrupted data produced by the decryption process. The approach aims to resolve repudiation and key-management problems. Results were analyzed for both a conventional iris cryptography system (CIC) and a non-repudiation iris cryptography system (NRIC), showing that the new approach provides considerably high authentication accuracy in the enciphering and deciphering processes.
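
A minimal sketch of the iris-code comparison step: two binary codes are compared by normalized Hamming distance, and a distance below a threshold accepts the match. The 135-bit length follows the abstract, but the codes, flip rate, and threshold here are simulated stand-ins for the wavelet-extracted features and the autocorrelator recall stage, which are not reproduced.

    import random

    random.seed(7)
    N_BITS = 135
    THRESHOLD = 0.30                   # assumed decision threshold

    enrolled = [random.getrandbits(1) for _ in range(N_BITS)]

    # a fresh capture of the same iris: a few bits flip due to noise (5% assumed)
    probe = [b ^ (random.random() < 0.05) for b in enrolled]

    distance = sum(a != b for a, b in zip(enrolled, probe)) / N_BITS
    print("distance =", round(distance, 3), "match =", distance < THRESHOLD)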

Probability Distribution of Rainfall Depth at Hourly Time-Scale

Rainfall data at fine resolution, and knowledge of its characteristics, play a major role in the efficient design and operation of agricultural, telecommunication, runoff and erosion control, and water quality control systems. This paper studies the statistical distribution of hourly rainfall depth for 12 representative stations spread across Peninsular Malaysia. Hourly rainfall records spanning 10 to 22 years were collected and their statistical characteristics estimated. Three probability distributions, namely the Generalized Pareto, Exponential, and Gamma distributions, were proposed to model the hourly rainfall depth, and three goodness-of-fit tests, namely the Kolmogorov-Smirnov, Anderson-Darling, and Chi-Squared tests, were used to evaluate their fitness. Results indicate that the east coast of the Peninsula receives greater rainfall depth than the west coast; the rainfall frequency, however, is found to be irregular. The goodness-of-fit tests show that all three models fit the rainfall data at the 1% level of significance, but the Generalized Pareto fits better than the Exponential and Gamma distributions and is therefore recommended as the best fit.
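
A minimal sketch of the model-fitting and goodness-of-fit procedure described above, using scipy on synthetic positive "rainfall depths" in place of the Malaysian station records; the Kolmogorov-Smirnov test stands in for the full battery of three tests.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    depths = rng.gamma(shape=0.8, scale=5.0, size=500)   # synthetic wet-hour depths, mm

    candidates = {
        "genpareto": stats.genpareto,
        "expon": stats.expon,
        "gamma": stats.gamma,
    }

    for name, dist in candidates.items():
        params = dist.fit(depths, floc=0)                # fix location at zero
        ks_stat, p_value = stats.kstest(depths, name, args=params)
        print(f"{name:10s} KS = {ks_stat:.4f}  p = {p_value:.3f}")
    # the distribution with the smallest KS statistic fits best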