EGCL: An Extended G-Code Language with Flow Control, Functions and Mnemonic Variables

In the context of computer numerical control (CNC) and computer-aided manufacturing (CAM), programming-language capabilities such as symbolic and intuitive programming, program portability and a rich geometrical portfolio are especially important. They save time, help avoid errors during part programming and permit code re-use. Our updated literature review indicates that the current state of the art presents gaps in parametric programming, program portability and programming flexibility. In response, this article presents a compiler implementation for EGCL (Extended G-code Language), a new, enriched CNC programming language that allows the use of descriptive variable names, geometrical functions and flow-control statements (if-then-else, while). Our compiler produces low-level, generic, elementary ISO-compliant G-code, allowing flexibility in the choice of the executing CNC machine and improving portability. Our results show that readable variable names and flow-control statements permit simplified, intuitive part programming and program re-use. Future work includes allowing programmers to define their own functions in EGCL itself, in contrast to the current approach of providing them only as built-in library functions.
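
As a hedged illustration only (not the authors' actual EGCL syntax or compiler), the following Python sketch shows the kind of expansion such a compiler performs: a parametric drilling loop with a mnemonic variable and a while statement is unrolled into elementary, ISO-style G-code moves. All names, parameters and the output dialect are assumptions for illustration.

```python
# Hypothetical sketch: expanding a parametric loop (mnemonic variable +
# while statement) into elementary ISO-style G-code, in the spirit of an
# EGCL-to-G-code compiler.  Syntax and names are illustrative assumptions.

def expand_drill_row(hole_spacing_mm, hole_count, drill_depth_mm, safe_z_mm=5.0):
    """Emit elementary G-code for a row of drilled holes along the X axis."""
    lines = ["G21 ; millimetre units", "G90 ; absolute positioning"]
    hole_index = 0                      # mnemonic variable instead of #101
    while hole_index < hole_count:      # flow control resolved at compile time
        x = hole_index * hole_spacing_mm
        lines.append(f"G0 X{x:.3f} Y0.000 Z{safe_z_mm:.3f}")   # rapid to hole
        lines.append(f"G1 Z{-drill_depth_mm:.3f} F100")        # drill down
        lines.append(f"G0 Z{safe_z_mm:.3f}")                   # retract
        hole_index += 1
    lines.append("M30 ; end of program")
    return "\n".join(lines)

print(expand_drill_row(hole_spacing_mm=12.5, hole_count=4, drill_depth_mm=6.0))
```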

Parametric Analysis of Effective Factors on the Seismic Rehabilitation of Foundations by Micropile Networks

The main objective of seismic rehabilitation of foundations is to decrease the amplitude of horizontal and vertical vibrations and to remove high-frequency content under seismic loading. To this end, the advantages of micropile networks are exploited. A reduction in the vibration amplitude of a foundation can be achieved by using elements of high dynamic rigidity, such as deep foundations. In addition, the natural frequency of the pile-soil system increases as the rigidity of the system rises. Accordingly, the main strategy is to decrease the horizontal and vertical seismic vibrations of the structure. In this case, accounting for the interaction of the foundation, the piles and the improved soil is a primary concern. Therefore, in this paper, the factors affecting the seismic rehabilitation of foundations using micropile networks in sandy soils with nonlinear response are studied.

Comparison of Current Chinese and Japanese Design Specifications for Bridge Piles in Liquefied Ground

Firstly, this study briefly describes the current situation, in which a vast gap exists between the current Chinese and Japanese seismic design specifications for bridge pile foundations in liquefiable and liquefaction-induced laterally spreading ground. The Chinese and Japanese seismic design methods and technical details for bridge pile foundations in liquefying and laterally spreading ground are then described and compared systematically and comprehensively; in particular, the methods for determining the coefficient of subgrade reaction and its reduction factor, as well as the way the Japanese design specification computes the force applied to pile foundations by liquefaction-induced laterally spreading soil, are introduced. The comparison indicates that the Chinese seismic design specification for bridge pile foundations in such ground, which presents only some qualitative items, is too general and lacks both systematic structure and practical operability. Finally, shortcomings of the Chinese seismic design specification are summarized, showing that improvement and revision of the specification in this field are imperative for China; the key problems of the current Chinese specifications are generalized and corresponding improvement suggestions are proposed.

Prediction of a Human Facial Image by ANN using Image Data and its Content on Web Pages

Choosing the right metadata is critical, as good information (metadata) attached to an image makes it easier to find among a pile of other images. An image's value is enhanced not only by the quality of the attached metadata but also by the search technique. This study proposes a simple but efficient technique to predict a single human image on a website using the basic image data and the embedded metadata of the image's content appearing on web pages. The results are very encouraging, with a prediction accuracy of 95%. This technique may greatly assist librarians, researchers and many others in automatically and efficiently identifying a set of human images out of a larger set of images.

Qualitative Parametric Comparison of Load Balancing Algorithms in Parallel and Distributed Computing Environment

Decreases in hardware costs and advances in computer networking technologies have led to increased interest in the use of large-scale parallel and distributed computing systems. One of the biggest issues in such systems is the development of effective techniques/algorithms for distributing the processes/load of a parallel program across multiple hosts to achieve goals such as minimizing execution time, minimizing communication delays, maximizing resource utilization and maximizing throughput. Substantial research using queuing analysis, assuming job arrivals that follow a Poisson pattern, has shown that in a multi-host system the probability of one host being idle while another host has multiple jobs queued up can be very high. Such imbalances in system load suggest that performance can be improved either by transferring jobs from the currently heavily loaded hosts to the lightly loaded ones or by distributing the load evenly/fairly among the hosts. The algorithms known as load-balancing algorithms help achieve these goals. They fall into two basic categories: static and dynamic. Whereas static load-balancing (SLB) algorithms decide the assignment of tasks to processors at compile time, based on average estimated values of process execution times and communication delays, dynamic load-balancing (DLB) algorithms adapt to changing situations and take decisions at run time. The objective of this paper is to identify qualitative parameters for the comparison of these algorithms. In the future, this work can be extended by developing an experimental environment in which to study these load-balancing algorithms quantitatively against the comparative parameters.
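
As a minimal sketch of the static/dynamic distinction described above (not any specific algorithm from the surveyed literature), the following Python comparison assigns jobs either in a fixed round-robin pattern decided in advance or to the currently least-loaded host at run time; the job lengths and host count are invented assumptions.

```python
# Minimal sketch of static vs. dynamic load balancing on identical hosts.
# Job lengths and counts are illustrative assumptions, not benchmark data.
import random

def makespan_static(jobs, hosts):
    """Round-robin assignment fixed in advance (static / compile-time)."""
    loads = [0.0] * hosts
    for i, length in enumerate(jobs):
        loads[i % hosts] += length
    return max(loads)

def makespan_dynamic(jobs, hosts):
    """Each job goes to the currently least-loaded host (dynamic / run-time)."""
    loads = [0.0] * hosts
    for length in jobs:
        loads[loads.index(min(loads))] += length
    return max(loads)

random.seed(0)
jobs = [random.expovariate(1.0) for _ in range(200)]   # Poisson-like workload
print("static :", round(makespan_static(jobs, hosts=4), 2))
print("dynamic:", round(makespan_dynamic(jobs, hosts=4), 2))
```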

Exploration of the Communication Area of Infrared Short-Range Communication Systems for Intervehicle Communication

Infrared communication in the 780-950 nm wavelength band is very suitable for short-range point-to-point communication. It is a good choice for vehicle-to-vehicle communication in several intelligent-transportation-system (ITS) applications such as cooperative driving, collision warning, and pileup-crash prevention. In this paper, with the aid of a physical model established in our previous works, we explore the communication area of an infrared intervehicle communication system that uses typical low-cost commercial light-emitting diodes (LEDs) as the emitter and planar p-i-n photodiodes as the receiver. The radiation pattern of the emitter built from the aforementioned LEDs and the receiving pattern of the receiver are approximated by a linear combination of cosine^n functions. This approximation lets us analyze the system performance easily. Both multilane straight-road conditions and curved-road conditions with various radii of curvature are taken into account. The case of a small car communicating with a big truck, i.e., with a vertical mounting-height difference between the emitter and the receiver, is also considered. Our results show that the performance of the system meets the requirements of the aforementioned ITS applications in terms of communication area.
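
As a hedged numerical sketch of the cosine^n approximation mentioned above (the exponents, transmit power, receiver area and link-budget form below are assumptions, not the authors' fitted values), the following Python snippet evaluates a simple line-of-sight received-power model in which both the emitter radiation pattern and the receiver pattern are cosine^n lobes.

```python
# Sketch of a cos^n emitter/receiver pattern model for an infrared
# intervehicle link.  Exponents, powers and areas are illustrative assumptions.
import math

def received_power(p_tx_w, distance_m, emit_angle_rad, recv_angle_rad,
                   n_emit=10, n_recv=1, area_m2=1e-4):
    """Line-of-sight received power with cos^n emitter and receiver patterns."""
    if abs(emit_angle_rad) >= math.pi / 2 or abs(recv_angle_rad) >= math.pi / 2:
        return 0.0
    gain_tx = (n_emit + 1) / (2 * math.pi) * math.cos(emit_angle_rad) ** n_emit
    gain_rx = math.cos(recv_angle_rad) ** n_recv
    return p_tx_w * gain_tx * gain_rx * area_m2 / distance_m ** 2

for d in (10, 30, 60, 100):   # distances in metres
    print(d, "m :", received_power(0.1, d, math.radians(5), math.radians(5)))
```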

Development of a Meta Description Language for Software/Hardware Cooperative Design and Verification for Model-Checking Systems

Model-checking tools such as the Symbolic Model Verifier (SMV) and NuSMV are available for checking hardware designs. These tools can automatically check the formal validity of a design. However, NuSMV is too low-level for describing a complete hardware design. It is therefore necessary to translate the system definition, as designed in a language such as Verilog or VHDL, into a language such as NuSMV for validation. In this paper, we present a meta hardware description language, Melasy, which includes a code generator targeting existing hardware description languages (HDLs) and model-checking languages, thereby solving this problem.

Comanche – A Compiler-Driven I/O Management System

Most scientific programs have large input and output data sets that require out-of-core programming or the use of virtual memory management (VMM). Out-of-core programming is very error-prone and tedious; as a result, it is generally avoided. However, in many instances, VMM is not an effective approach because it often results in a substantial performance reduction. In contrast, compiler-driven I/O management allows a program's data sets to be retrieved in parts, called blocks or tiles. Comanche (COmpiler MANaged caCHE) is a compiler combined with a user-level runtime system that can be used to replace standard VMM for out-of-core programs. We describe Comanche and demonstrate on a number of representative problems that it substantially outperforms VMM. Significantly, our system does not require any special services from the operating system and does not require modification of the operating system kernel.
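
The following Python sketch only illustrates the block/tile idea, not Comanche's actual compiler-managed runtime: a large on-disk array is traversed tile by tile so that only the working tile needs to be resident. The file name, element type and tile size are assumptions.

```python
# Illustration of out-of-core, tile-wise access to a large on-disk array.
# File name, dtype and tile size are assumptions; Comanche itself manages
# blocks automatically via the compiler and a user-level runtime system.
import numpy as np

ROWS, COLS, TILE = 4096, 4096, 512
data = np.memmap("big_matrix.dat", dtype=np.float64, mode="w+",
                 shape=(ROWS, COLS))           # backing file on disk
data[:] = 1.0                                  # fill once for the demo

total = 0.0
for r in range(0, ROWS, TILE):                 # visit one tile at a time
    for c in range(0, COLS, TILE):
        tile = np.asarray(data[r:r + TILE, c:c + TILE])  # load this block only
        total += tile.sum()
print("sum over all tiles:", total)
```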

A Bayesian Kernel for the Prediction of Protein-Protein Interactions

Understanding protein function is a major goal in the post-genomic era. Proteins usually work in the context of other proteins and rarely function alone. It is therefore highly relevant to study a protein's interaction partners in order to understand its function. Machine learning techniques have been widely applied to predict protein-protein interactions. Kernel functions play an important role in a successful machine learning technique, and choosing an appropriate kernel function can lead to better accuracy in a binary classifier such as the support vector machine. In this paper, we describe a Bayesian kernel for the support vector machine to predict protein-protein interactions. The Bayesian kernel can improve classifier performance by incorporating the probabilistic character of the available experimental protein-protein interaction data, which were compiled from different sources. In addition, the probabilistic output from the Bayesian kernel can assist biologists in conducting further research on the most highly predicted interactions. The results show that the accuracy of the classifier is improved with the Bayesian kernel compared to the standard SVM kernels, implying that protein-protein interactions can be predicted with better accuracy using the Bayesian kernel.
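
As a hedged sketch of how a probability-weighted kernel can be plugged into an SVM (the reliability-weighting scheme, features and labels below are illustrative stand-ins, not the authors' Bayesian kernel), scikit-learn's SVC accepts a callable that returns the Gram matrix.

```python
# Sketch: an RBF kernel scaled by per-example reliability scores, passed to
# scikit-learn's SVC as a callable kernel.  The weighting is an illustrative
# stand-in for a Bayesian kernel, not the authors' formulation.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))                      # toy features per protein pair
y = (X[:, 0] + X[:, 1] > 0).astype(int)           # toy interaction labels
reliability = rng.uniform(0.5, 1.0, size=60)      # e.g. evidence confidence

def weighted_rbf(A, B, gamma=0.5):
    """RBF kernel scaled by the reliability of the two examples involved."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    wa = reliability[:A.shape[0]]                 # demo only: align by index
    wb = reliability[:B.shape[0]]
    return np.outer(wa, wb) * np.exp(-gamma * d2)

clf = SVC(kernel=weighted_rbf).fit(X, y)
print("training accuracy:", clf.score(X, y))
```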

Effects of a Methanol Fraction of the Leaves of Leonotis leonurus on the Blood Pressure and Heart Rate of Normotensive Male Wistar Rats

Leonotis leonurus, a shrub indigenous to Southern Africa, is widely used in traditional medicine to treat a variety of conditions ranging from skin diseases and cough to epileptic fits and 'heart problems'. Studies on the aqueous extract of the leaves have indicated cyclooxygenase enzyme inhibitory activity and an antihypertensive effect. Five methanol leaf extract fractions (MLEa - MLEe) of L. leonurus were tested on anaesthetized normotensive male Wistar rats (AWR) and on isolated perfused working rat hearts (IWH). Fraction MLEc (0.01 mg/kg - 0.05 mg/kg) induced significant increases in BP and HR in AWR and positive chronotropic and inotropic effects in IWH (1.0 mg/ml - 5.0 mg/ml). Pre-administration of atenolol (2.0 mg/kg) and prazosin (60 μg/kg) significantly inhibited the effects of MLEc on HR and MAP, respectively, in vivo, while atenolol (7.0 mg/ml) pre-perfusion significantly inhibited the MLEc effect in vitro. The hypertensive effect of MLEc is probably mediated via β1-agonism. The results also indicate the presence of multiple cardioactive compounds in L. leonurus.

System of Programs for Rapid Development and Execution of Palm OS Applications

We present the development of a system of programs designed for the compilation and execution of applications for handheld computers. The introduction describes the purpose of the project and its components. The next two sections present the first two components of the project (the scanner and parser generators). We then describe the Object Pascal compiler and the virtual machines for Windows and Palm OS. In conclusion, we outline the ways in which the project can be extended.

Performance Analysis of Digital Signal Processors Using SMV Benchmark

Unlike general-purpose processors, digital signal processors (DSP processors) are strongly application-dependent. To meet the needs of diverse applications, a wide variety of DSP processors based on different architectures, ranging from the traditional to VLIW, have been introduced to the market over the years. The functionality, performance, and cost of these processors vary over a wide range. To select a processor that meets the design criteria for an application, processor performance is usually the major concern for digital signal processing (DSP) application developers. Performance data are also essential for the designers of DSP processors to improve their designs. Consequently, several DSP performance benchmarks have been proposed over the past decade or so; however, none of these benchmarks seems to include recent new DSP applications. In this paper, we use a new benchmark that we recently developed to compare the performance of popular DSP processors from Texas Instruments and StarCore. The new benchmark is based on the Selectable Mode Vocoder (SMV), a speech-coding program from recent third-generation (3G) wireless voice applications. All benchmark kernels are compiled by the compilers of the respective DSP processors and run on their simulators. The weighted arithmetic mean of clock cycles and the arithmetic mean of code size are used to compare the performance of five DSP processors. In addition, we study how the performance of a processor is affected by code structure, processor architecture features and compiler optimization. The extensive experimental data gathered, analyzed, and presented in this paper should help DSP processor and compiler designers meet their specific design goals.
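
As a small sketch of the two summary metrics named above (the kernel names, weights, cycle counts and code sizes below are invented placeholders, not the paper's SMV measurements), the weighted arithmetic mean of clock cycles and the arithmetic mean of code size can be computed as follows.

```python
# Placeholder data: kernel weights, clock cycles and code sizes are invented
# for illustration; the paper's actual SMV measurements are not reproduced.
kernels = {
    # name: (weight, cycles, code_size_bytes)
    "lpc_analysis": (0.30, 120_000, 4_100),
    "pitch_search": (0.45, 310_000, 6_800),
    "codebook":     (0.25, 180_000, 5_200),
}

weighted_cycles = sum(w * cyc for w, cyc, _ in kernels.values()) \
                  / sum(w for w, _, _ in kernels.values())
mean_code_size = sum(size for _, _, size in kernels.values()) / len(kernels)

print("weighted mean cycles   :", round(weighted_cycles))
print("mean code size (bytes) :", round(mean_code_size))
```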

Energy Distribution of EEG Signals: EEG Signal Wavelet-Neural Network Classifier

In this paper, a wavelet-based neural network (WNN) classifier for recognizing EEG signals is implemented and tested on three sets of EEG signals (healthy subjects, patients with epilepsy, and patients with epileptic syndrome during a seizure). First, the Discrete Wavelet Transform (DWT) with Multi-Resolution Analysis (MRA) is applied to decompose the EEG signal into the resolution levels corresponding to its components (δ, θ, α, β and γ), and Parseval's theorem is employed to extract the percentage distribution of energy of the EEG signal at the different resolution levels. Second, a neural network (NN) classifies these extracted features to identify the EEG type according to the percentage distribution of energy. The performance of the proposed algorithm is evaluated on a total of 300 EEG signals. The results show that the proposed classifier is able to recognize and classify EEG signals efficiently.
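
A minimal sketch of the feature-extraction step (the wavelet family, decomposition level and synthetic test signal are assumptions, and the classifier is only mentioned in a comment): the DWT coefficients at each level are converted, via Parseval's theorem, into the percentage of total signal energy they carry.

```python
# Sketch: DWT-based percentage-energy features from an EEG-like signal.
# Wavelet ('db4'), level (5) and the synthetic signal are assumptions.
import numpy as np
import pywt

fs = 256                                   # assumed sampling rate, Hz
t = np.arange(0, 4, 1 / fs)
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 25 * t)

coeffs = pywt.wavedec(signal, "db4", level=5)   # [cA5, cD5, cD4, ..., cD1]
energies = np.array([np.sum(c ** 2) for c in coeffs])
percent_energy = 100.0 * energies / energies.sum()   # Parseval-based features

for name, p in zip(["A5", "D5", "D4", "D3", "D2", "D1"], percent_energy):
    print(f"{name}: {p:5.1f} % of total energy")
# These percentage-energy vectors would then be fed to a neural network
# classifier (e.g. a small MLP) to label the EEG type.
```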

Inefficiency of Data Storing in Physical Memory

Memory forensics is important in digital investigation. It is based on the data stored in physical memory, which involves memory management and processing time. However, current forensic tools do not consider efficiency in terms of storage management or processing time. This paper shows the high redundancy of data found in physical memory, which causes inefficiency in processing time and memory management. The experiment is performed using the Borland C compiler on Windows XP with 512 MB of physical memory.
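
As a hedged sketch of how such redundancy can be quantified (the dump file name and the 4 KiB block size are assumptions, and this is not the tooling used in the paper), one can hash fixed-size blocks of a memory image and count how many are duplicates.

```python
# Sketch: count duplicate fixed-size blocks in a memory dump to estimate
# redundancy.  The dump path and 4 KiB block size are assumptions.
import hashlib
from collections import Counter

BLOCK = 4096
counts = Counter()
with open("memory.dump", "rb") as dump:          # hypothetical dump file
    while True:
        block = dump.read(BLOCK)
        if not block:
            break
        counts[hashlib.sha256(block).hexdigest()] += 1

total = sum(counts.values())
duplicates = total - len(counts)                 # blocks beyond the first copy
print(f"{duplicates}/{total} blocks ({100 * duplicates / max(total, 1):.1f}%) are redundant")
```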

Evaluation of Internet Anxiety in SRBIAU Higher Education Students in the Research Process

The increasing use of the internet creates a number of problems, one of which is "internet anxiety". Internet anxiety is a type of anxiety that people may feel while surfing the internet or using it for educational purposes, blogging or accessing digital libraries. The goal of this study is to evaluate internet anxiety among management students. In this research, Ealy's internet anxiety questionnaire, consisting of positive and negative items, was completed by 310 participants. According to the findings, about 64.7% of them scored at or below the mean anxiety score (50). The distribution of internet anxiety scores was normal, and there was no meaningful difference between men's and women's anxiety levels in this sample. The results also showed no meaningful difference in internet anxiety level between different fields of study within Management. This evaluation will help managers perform a gap analysis between the existing level and the desired one. Future work would provide techniques for reducing human anxiety while using the internet via human-computer interaction techniques.

High Level Synthesis of Kahn Process Networks (KPN) for Streaming Applications

Streaming applications are usually composed of stages, running in parallel or in series, that incrementally transform a stream of input data. It is a design challenge to break such an application into distinguishable blocks and then map them onto independent hardware processing elements. This requires a generic controller that automatically maps such a stream of data onto independent processing elements without dependencies or manual intervention. In this paper, a Kahn Process Network (KPN) representation of such streaming applications is designed and developed for mapping onto an MPSoC. It is designed so that a generic C-based compiler takes the mapping specifications as input from the user, automates these design constraints, and automatically generates synthesized, optimized RTL code for the specified application.
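
A minimal Python sketch of the KPN execution model assumed here (not the paper's C-based compiler or its RTL output): processes communicate only through FIFO channels with blocking reads, so each stage can run independently.

```python
# Minimal Kahn-process-network sketch: three processes linked by FIFO
# channels with blocking reads.  Purely illustrative; the paper targets a
# C-based flow that generates RTL for an MPSoC.
import threading
import queue

def producer(out_ch, n):
    for i in range(n):
        out_ch.put(i)
    out_ch.put(None)                      # end-of-stream token

def scaler(in_ch, out_ch, factor):
    while True:
        item = in_ch.get()                # blocking read (KPN semantics)
        if item is None:
            out_ch.put(None)
            return
        out_ch.put(item * factor)

def consumer(in_ch):
    while True:
        item = in_ch.get()
        if item is None:
            return
        print("got", item)

a, b = queue.Queue(), queue.Queue()       # unbounded FIFO channels
threads = [threading.Thread(target=producer, args=(a, 5)),
           threading.Thread(target=scaler, args=(a, b, 10)),
           threading.Thread(target=consumer, args=(b,))]
for th in threads: th.start()
for th in threads: th.join()
```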

A Comparison of Experimental Data with Monte Carlo Calculations for Optimisation of the Source-to-Detector Distance in Determining the Efficiency of a LaBr3:Ce (5%) Detector

Cerium-doped lanthanum bromide LaBr3:Ce (5%) crystals are considered to be among the most advanced scintillator materials used in PET scanning, combining a high light yield, fast decay time and excellent energy resolution. Apart from the correct choice of scintillator, it is also important to optimise the detector geometry, not least in terms of the source-to-detector distance, in order to obtain reliable measurements and efficiency. In this study a commercially available 25 mm x 25 mm BrilLanCe™ 380 LaBr3:Ce (5%) detector was characterised in terms of its efficiency at varying source-to-detector distances. Gamma-ray spectra of 22Na, 60Co, and 137Cs were acquired separately at distances of 5, 10, 15, and 20 cm. As a result of the change in the solid angle subtended by the detector, the geometric efficiency decreased with increasing distance. High efficiencies at short distances can cause pulse pile-up when subsequent photons are detected before previously detected events have decayed. To reduce this systematic error, the source-to-detector distance should balance efficiency against pulse pile-up suppression, since pile-up corrections would otherwise be necessary at short distances. In addition to the experimental measurements, Monte Carlo simulations were carried out for the same setup, allowing a comparison of results. The advantages and disadvantages of each approach are highlighted.
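
The geometric part of this distance dependence can be sketched with the standard on-axis point-source approximation (a simplification; the study itself relies on measured spectra and Monte Carlo simulation): the fraction of 4π subtended by the 25 mm diameter front face is 0.5(1 - d/sqrt(d^2 + r^2)).

```python
# On-axis point-source approximation of the geometric efficiency of a
# 25 mm diameter detector face; a simplification of the measured and
# Monte Carlo treatment in the paper.
import math

r = 1.25                                  # crystal face radius, cm
for d in (5.0, 10.0, 15.0, 20.0):         # source-to-detector distance, cm
    frac = 0.5 * (1.0 - d / math.sqrt(d * d + r * r))
    print(f"d = {d:4.1f} cm  ->  geometric efficiency = {frac:.4f}")
```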

Research on Pressed Pile Test and Finite Element Analysis of Large-diameter Steel Pipe Pile of Zhanjiang Port

In order to study the pressed pile test and the ultimate bearing capacity of large-diameter steel pipe piles, pressed pile tests and numerical simulations of three large-diameter steel pipe piles at two high-piled wharves of Zhanjiang Port are analyzed in this paper. The anchored-pile method is used for the pressed pile tests, and the Q-s curves and ultimate bearing capacities are obtained. The three piles are then numerically simulated in ABAQUS, and the numerical results are compared with those of the field tests. The results show that the settlement from the numerical simulation is larger than that from the field test during loading, the difference widens with increasing load, and the final difference in settlement is 20% to 30%.

Compiler-Based Architecture for Context Aware Frameworks

Computers are being integrated into various aspects of everyday human life, in different shapes and with different abilities. This fact has intensified the requirement for software development technologies that are 1) portable, 2) adaptable, and 3) simple to develop with. This problem is also known as the Pervasive Computing Problem (PCP), which can be addressed in different ways, each with its own pros and cons; Context-Oriented Programming (COP) is one method of addressing the PCP. In this paper, a design for a COP framework, a context-aware framework, is presented that eliminates the weak points of a previous design based on interpreted languages, while bringing the power of compiled languages to the implementation of these frameworks. The key point of this improvement is the combination of COP and Dependency Injection (DI) techniques. Both the old and the new frameworks are analyzed to show their advantages and disadvantages. Finally, a simulation of both designs is presented, indicating that the practical results agree with the theoretical analysis while the new design runs almost 8 times faster.
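
As an illustrative Python sketch of combining context-oriented behaviour selection with dependency injection (the class and context names are assumptions, and the paper's framework targets compiled languages rather than Python), a service receives its context-dependent collaborator through its constructor, and the active context decides which implementation is injected.

```python
# Sketch: context-oriented behaviour selection combined with constructor
# (dependency) injection.  Names are illustrative; the paper's framework
# targets compiled languages, not Python.
class WifiSender:
    def send(self, msg): print("wifi  :", msg)

class SmsSender:
    def send(self, msg): print("sms   :", msg)

REGISTRY = {"online": WifiSender, "offline": SmsSender}   # context -> impl

class Notifier:
    def __init__(self, sender):        # dependency injected, not hard-coded
        self._sender = sender
    def notify(self, msg):
        self._sender.send(msg)

def build_notifier(context):
    """Tiny injector: picks the implementation for the active context."""
    return Notifier(REGISTRY[context]())

build_notifier("online").notify("meeting at 10")
build_notifier("offline").notify("meeting at 10")
```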

Fundamental Concepts of Theory of Constraints: An Emerging Philosophy

Dr Eliyahu Goldratt did the pioneering work in the development of the Theory of Constraints. Since then, many more researchers around the globe have worked to enhance this body of knowledge. In this paper, an attempt is made to compile the salient features of this theory from the work done by Goldratt and other researchers. The paper provides a good starting point for potential researchers interested in working on the Theory of Constraints. It will also help practising managers by clarifying their understanding of the theory and will facilitate its successful implementation in their working areas.