Dimensioning of a Subsynchronous Cascade for Speed Regulation of Two-Motor 6 kV Conveyor Drives

One way to load overdimensioned conveyors optimally is to reduce their speed (capacity), with due attention to production capabilities and demands. For conveyors driven by three-phase slip-ring induction motors, a technically sound solution for regulating conveyor (drive motor) speed is a constant-torque subsynchronous cascade with a static semiconductor converter and a transformer that returns energy to the power network. This paper describes a mathematical model for calculating the parameters of a two-motor 6 kV subsynchronous cascade. It also demonstrates that applying this cascade yields several benefits, foremost savings in electrical energy, along with improvements in other energy indices, which together reduce the cost of the complete electric motor drive.
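
The power balance behind such a cascade can be illustrated with the standard slip-energy-recovery relations: the rotor slip power s·P_airgap, which a plain slip-ring drive would burn in resistors, is instead returned to the network. The sketch below uses hypothetical power and slip figures, not values from the paper.

```python
# Illustrative slip-power split for a subsynchronous (slip-energy
# recovery) cascade.  All numbers are hypothetical examples.

def slip_power_recovery(p_airgap_kw, slip):
    """Split the air-gap power of a slip-ring induction motor into
    mechanical shaft power and slip power returned to the network."""
    p_shaft = (1.0 - slip) * p_airgap_kw   # mechanical output
    p_slip = slip * p_airgap_kw            # recovered via the cascade
    return p_shaft, p_slip

# Two 6 kV motors, each with an assumed 500 kW air-gap power, run at
# 20 % slip to reduce conveyor speed (capacity):
for motor in (1, 2):
    p_shaft, p_slip = slip_power_recovery(500.0, 0.20)
    print(f"motor {motor}: shaft {p_shaft:.0f} kW, recovered {p_slip:.0f} kW")
```

At 20 % slip, 100 kW per motor flows back to the 6 kV network instead of being dissipated, which is the source of the energy saving the paper reports.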

Density, Strength, Thermal Conductivity and Leachate Characteristics of Light-Weight Fired Clay Bricks Incorporating Cigarette Butts

Several trillion cigarettes produced worldwide annually lead to many thousands of kilograms of toxic waste. Cigarette butts (CBs) accumulate in the environment due to the poor biodegradability of their cellulose acetate filters. This paper presents some of the results from a continuing study on recycling CBs into fired clay bricks. The physico-mechanical properties of fired clay bricks manufactured with different percentages of CBs are reported and discussed. The results show that the density of the fired bricks was reduced by up to 30 %, depending on the percentage of CBs incorporated into the raw materials. Similarly, the compressive strength of the bricks tested decreased with the percentage of CBs included in the mix. The thermal conductivity performance of the bricks was improved by 51 % and 58 % for 5 % and 10 % CB content, respectively. Leaching tests were carried out to investigate the levels of possible leachates of heavy metals from the manufactured clay-CB bricks; the results revealed only trace amounts of heavy metals.

The Evaluation and Application of FMEA in Sepahan Oil Co

Failure modes and effects analysis (FMEA) is an effective technique for preventing potential problems and identifying the actions needed to remove the causes of errors. Oil producing companies play a critical role in the oil industry of Iran as a developing country, and among them Sepahan Oil Co. makes a considerable contribution. The aim of this research is to show how FMEA can be applied to improve the quality of products at Sepahan Oil Co. For this purpose, the company's four-liter production line was selected for investigation. The findings imply that the application of FMEA has reduced scrap from 50,000 ppm to 5,000 ppm and has resulted in a 0.92 percent decrease in oil waste.
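
The core FMEA prioritization step is the risk priority number, RPN = severity x occurrence x detection. The sketch below illustrates that calculation; the failure modes and ratings are invented for illustration, not data from Sepahan Oil Co.

```python
# Minimal FMEA risk-priority-number (RPN) sketch.  The failure modes
# and 1..10 ratings below are hypothetical examples.

def rpn(severity, occurrence, detection):
    """RPN = severity x occurrence x detection, each rated 1..10."""
    return severity * occurrence * detection

failure_modes = [
    ("leaking four-liter can seam", 8, 5, 4),
    ("underfilled can",             5, 6, 3),
    ("mislabeled can",              3, 4, 2),
]

# Rank failure modes so corrective actions target the highest RPN first.
ranked = sorted(failure_modes, key=lambda fm: rpn(*fm[1:]), reverse=True)
for name, s, o, d in ranked:
    print(f"{name}: RPN = {rpn(s, o, d)}")
```

Actions are then directed at the top-ranked modes, and the RPN is recomputed after improvement to verify the reduction.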

Efficient and Extensible Data Processing Framework in Ubiquitous Sensor Networks

This paper presents the design and a prototype implementation of an intelligent data processing framework for ubiquitous sensor networks. The focus is on handling the sensor data stream and on interoperability between low-level sensor data and application clients. Our framework first provides systematic middleware that mediates between the application layer and low-level sensors, filtering and integrating large volumes of sensor data to create value-added context information. An agent-based architecture is then proposed for real-time data distribution, efficiently forwarding a specific event to the appropriate application registered in the directory service via an open interface. The prototype implementation demonstrates that our framework can host sophisticated applications on a ubiquitous sensor network and can autonomously evolve into new middleware by taking advantage of promising technologies such as software agents, XML, and cloud computing.

Object Detection Based on Weighted Center-Surround Difference

Intelligent traffic surveillance is an active issue in the field of traffic data analysis, and it requires technology that detects moving objects in real time despite variations in background and natural light. In this paper, we propose a weighted center-surround difference method for object detection in outdoor environments. The proposed system detects objects using a saliency map obtained by analyzing the weight of each layer of a Gaussian pyramid. To validate the effectiveness of our system, we implemented the proposed method on a digital signal processor, the TMS320DM6437. Experimental results show that blur-like noise around objects is effectively eliminated and object detection accuracy is improved.
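
The idea of a weighted center-surround saliency map can be sketched as follows: each pyramid level yields a "surround" (a progressively blurred image), and the saliency is a weighted sum of |center - surround| differences. The pyramid depth, the box-blur stand-in for a Gaussian, and the per-level weights below are illustrative choices, not the paper's DSP configuration.

```python
# Toy weighted center-surround-difference saliency map.
import numpy as np

def box_blur(img):
    """3x3 mean filter with edge replication (stands in for a Gaussian)."""
    padded = np.pad(img, 1, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += padded[1 + dy : 1 + dy + img.shape[0],
                          1 + dx : 1 + dx + img.shape[1]]
    return out / 9.0

def saliency(img, levels=3, weights=(0.5, 0.3, 0.2)):
    """Sum weighted |center - surround| differences over pyramid levels."""
    sal = np.zeros_like(img, dtype=float)
    surround = img.astype(float)
    for level in range(levels):
        surround = box_blur(surround)           # coarser "surround"
        sal += weights[level] * np.abs(img - surround)
    return sal

frame = np.zeros((8, 8))
frame[3:5, 3:5] = 1.0                           # a bright "object"
sal = saliency(frame)
print(sal)                                      # peaks on the object
```

Pixels that differ strongly from their surroundings at several scales accumulate high saliency, which is what lets the method separate objects from a slowly varying background.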

Multi-Functional Insect Cuticles: Informative Designs for Man-Made Surfaces

Biomimicry has many potential benefits, as technologies found in nature are often superior to their man-made counterparts. As technological device components approach the micro- and nanoscale, surface properties such as adhesion and friction may need to be taken into account. Lowering surface adhesion by manipulating chemistry alone may no longer be sufficient for such components, so physical manipulation may be required. Adhesion reduction is only one of the many surface functions displayed by the micro/nano-structured cuticles of insects. Here, we present a mini review of our understanding of insect cuticle structures and the relationship between structure dimensions and the corresponding functional mechanisms. It may be possible to introduce additional properties, indeed multi-functional properties, to material surfaces based on the design of these natural surfaces.

Analytical Model Based Evaluation of Human Machine Interfaces Using Cognitive Modeling

Cognitive models allow predicting some aspects of the utility and usability of human machine interfaces (HMI) and simulating the interaction with these interfaces. Prediction is based on a task analysis, which investigates what a user is required to do, in terms of actions and cognitive processes, to achieve a task; task analysis also facilitates understanding of the system's functionalities. Cognitive models belong to the analytical approaches, which do not involve users during the development process of the interface. This article presents a study on the evaluation of human machine interaction with a contextual assistant's interface using the ACT-R and GOMS cognitive models. The present work shows how these techniques may be applied in HMI evaluation, design, and research by emphasizing first the task analysis and second the execution time of the task. To validate and support our results, an experimental study of user performance was conducted at the DOMUS laboratory during interaction with the contextual assistant's interface. Our models show that GOMS and ACT-R give good and excellent predictions, respectively, of user performance at both the task level and the object level, and the simulated results are very close to those obtained in the experimental study.
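
A concrete flavor of GOMS-style execution-time prediction is the keystroke-level model (KLM): a task is decomposed into primitive operators and the predicted time is the sum of their standard durations. The operator times below are the widely cited KLM estimates; the task sequence for the contextual assistant's interface is hypothetical.

```python
# Keystroke-level-model (KLM) sketch of GOMS-style time prediction.

KLM_OPERATORS = {
    "K": 0.28,  # keystroke or button press (average typist)
    "P": 1.10,  # point with mouse to a target
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def predict_time(sequence):
    """Sum operator times for a task encoded as e.g. 'M P K'."""
    return sum(KLM_OPERATORS[op] for op in sequence.split())

# Hypothetical task: think, point at a menu, click, think, type 4 keys.
task = "M P K M K K K K"
print(f"predicted execution time: {predict_time(task):.2f} s")
```

Such predictions are what the analytical evaluation compares against measured user performance, as done in the DOMUS experiment.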

Multi Switched Split Vector Quantization of Narrowband Speech Signals

Vector quantization is a powerful tool for speech coding applications. This paper deals with LPC coding of speech signals using a new technique called multi switched split vector quantization (MSSVQ), a hybrid of the multi-stage, switched, and split vector quantization techniques. The spectral distortion performance, computational complexity, and memory requirements of MSSVQ are compared to those of split vector quantization (SVQ), multi-stage vector quantization (MSVQ), and switched split vector quantization (SSVQ). The results show that MSSVQ has better spectral distortion performance, lower computational complexity, and lower memory requirements than all of the above product-code vector quantization techniques. Computational complexity is measured in floating point operations (flops), and memory requirements in floats.
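
The building blocks being hybridized can be illustrated with a toy switched split vector quantizer: the input vector is split into sub-vectors, a "switch" picks one of several codebook sets, and each sub-vector is quantized independently. The tiny hand-made codebooks and the switch criterion below are illustrative, not trained LPC codebooks or the paper's exact scheme.

```python
# Toy switched split vector quantization (SSVQ) encoder.
import numpy as np

def nearest(codebook, x):
    """Index of the codeword closest to x (squared Euclidean distance)."""
    return int(np.argmin(((codebook - x) ** 2).sum(axis=1)))

def ssvq_encode(vector, codebook_sets, split=2):
    parts = np.split(vector, split)
    # Switch: pick the codebook set whose total quantization error is smallest.
    best = None
    for s, cb_set in enumerate(codebook_sets):
        idx = [nearest(cb, p) for cb, p in zip(cb_set, parts)]
        err = sum(((cb[i] - p) ** 2).sum()
                  for cb, i, p in zip(cb_set, idx, parts))
        if best is None or err < best[2]:
            best = (s, idx, err)
    return best[:2]                       # (switch index, per-part indices)

cb_sets = [
    [np.array([[0., 0.], [1., 1.]]), np.array([[0., 1.], [2., 2.]])],
    [np.array([[5., 5.], [6., 6.]]), np.array([[5., 6.], [7., 7.]])],
]
print(ssvq_encode(np.array([0.9, 1.1, 2.1, 1.9]), cb_sets))
```

Splitting keeps the per-codebook search small, the switch adapts the codebooks to the input region, and the multi-stage extension (MSSVQ) would quantize the residual error with further stages.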

Coping with the Rapidity of Information Technology Changes – A Comparison Review on Current Practices

Information technology managers nowadays face tremendous pressure to plan, implement, and adopt new technology solutions due to the rapidity of technology changes. Given the lack of studies on this topic, the aim of this paper is to provide a comparative review of the tools currently used to respond to technological changes. The study is based on an extensive review of published works, the majority ranging from 2000 to the first part of 2011, gathered from journals, books, and other information sources available on the Web. The findings show that each tool has a different focus and that none provides a holistic framework covering the technical, people, process, and business environment aspects. This result gives IT managers useful information about the tools available for managing changes in technology, and it reveals a research gap: industry lacks such a holistic framework.

Development of NOx Emission Model for a Tangentially Fired Acid Incinerator

This paper develops a NOx emission model of an acid gas incinerator using Nelder-Mead least squares support vector regression (LS-SVR). The Malaysian DOE is actively enforcing the Clean Air Regulation, which mandates the installation of analytical instrumentation known as a Continuous Emission Monitoring System (CEMS) to report emission levels online to the DOE. As a hardware-based analyzer, a CEMS is expensive, maintenance intensive, and often unreliable, so software-based predictive techniques are often preferred as a feasible alternative for regulatory compliance. The LS-SVR model is built on emissions from an acid gas incinerator operating in an LNG complex. Simulated annealing (SA) is first used to determine the initial hyperparameters, which are then further optimized, based on the performance of the model, using the Nelder-Mead simplex algorithm. The LS-SVR model is shown to outperform a benchmark model based on backpropagation neural networks (BPNN) on both training and testing data.
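
The two-stage hyperparameter search can be sketched in miniature: simulated annealing proposes initial values, which a local method (Nelder-Mead in the paper) would then refine. Here a simple quadratic stand-in replaces the LS-SVR cross-validation error, and the parameter names (gamma, sigma2) and annealing schedule are assumptions.

```python
# Simulated-annealing stage of a two-stage hyperparameter search.
import math, random

def objective(gamma, sigma2):
    """Hypothetical stand-in for an LS-SVR validation-error surface."""
    return (gamma - 3.0) ** 2 + (sigma2 - 1.5) ** 2

def anneal(steps=2000, temp=1.0, cooling=0.995, seed=0):
    rng = random.Random(seed)
    x = [rng.uniform(0, 10), rng.uniform(0, 10)]
    best, best_err = list(x), objective(*x)
    for _ in range(steps):
        cand = [xi + rng.gauss(0, 0.5) for xi in x]
        d = objective(*cand) - objective(*x)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if d < 0 or rng.random() < math.exp(-d / temp):
            x = cand
            if objective(*x) < best_err:
                best, best_err = list(x), objective(*x)
        temp *= cooling
    return best, best_err

(gamma, sigma2), err = anneal()
print(f"initial guess for Nelder-Mead: gamma={gamma:.2f}, sigma2={sigma2:.2f}")
```

The global SA pass avoids poor local basins; handing its best point to Nelder-Mead then polishes the estimate cheaply, mirroring the paper's SA-then-simplex pipeline.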

A New Algorithm for Cluster Initialization

Clustering is a well-known technique in data mining, and the k-means algorithm is one of the most widely used clustering methods. The solutions it produces depend on the initialization of the cluster centers. In this article we propose a new algorithm to initialize the clusters, based on finding a set of medians extracted from the dimension with maximum variance. The algorithm has been applied to different data sets and good results were obtained.
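
One plausible reading of the described initialization: sort the points along the maximum-variance dimension, split them into k equal groups, and use each group's median as an initial center. The details of the paper's exact median extraction are assumptions here.

```python
# Sketch of median-based k-means initialization along the
# maximum-variance dimension.
import numpy as np

def init_centers(X, k):
    d = int(np.argmax(X.var(axis=0)))       # dimension with max variance
    order = np.argsort(X[:, d])             # sort points along that dimension
    centers = []
    for group in np.array_split(order, k):  # k contiguous groups
        centers.append(np.median(X[group], axis=0))
    return np.vstack(centers)

X = np.array([[0.0, 0.1], [0.2, 0.0], [5.0, 0.1],
              [5.2, 0.2], [9.9, 0.0], [10.1, 0.1]])
print(init_centers(X, 3))      # one center near each cluster along x
```

Because the centers are spread along the direction of greatest spread in the data, k-means started from them is less likely to collapse two true clusters into one than with random initialization.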

Application of Quality Index Method, Texture Measurements and Electronic Nose to Assess the Freshness of Atlantic Herring (Clupea harengus) Stored in Ice

Atlantic herring (Clupea harengus) is an important commercial fish that is increasingly in demand for human consumption. It is therefore important to find good methods for monitoring the freshness of the fish in order to keep it at the best quality for human consumption. In this study, the fish was stored in ice for up to 2 weeks. Quality changes during storage were assessed by the Quality Index Method (QIM), quantitative descriptive analysis (QDA) and the Torry scheme; by texture measurements, namely puncture tests and Texture Profile Analysis (TPA) tests on a TA.XT2i texture analyzer; and by electronic nose (e-nose) measurements using the FreshSense instrument. The storage time of herring in ice could be estimated by QIM to within ± 2 days using 5 herring per lot. No correlation was found between instrumental texture parameters and storage time, or between sensory and instrumental texture variables. E-nose measurements could be used to detect the onset of spoilage.

Implementation of Watchdog Timer for Fault Tolerant Computing on Cluster Server

In today's technology era, clusters have become a necessity for modern computing and data applications, since many applications take a long time (even days or months) to compute. Although parallelization speeds up computation, the time required for many applications can still be substantial. The reliability of the cluster therefore becomes a very important issue, and the implementation of a fault tolerance mechanism becomes essential. The difficulty of designing a fault tolerant cluster system grows with the variety of possible failures: an algorithm that handles a simple failure in the system must also tolerate more severe failures. In this paper, we implement a watchdog timer in a parallel environment to take care of failures. Implementing this simple algorithm in our project lets us handle different types of failures, and we found that the reliability of the cluster improves as a result.
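
The watchdog-timer idea can be sketched as follows: a worker "kicks" the watchdog periodically, and if no kick arrives within the timeout, the watchdog declares a failure so the job can be restarted elsewhere. The timeouts, polling interval, and failure action below are illustrative, not the paper's implementation.

```python
# Minimal watchdog-timer sketch for a cluster node.
import threading, time

class Watchdog:
    def __init__(self, timeout, on_failure):
        self.timeout = timeout
        self.on_failure = on_failure
        self._last_kick = time.monotonic()
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._watch, daemon=True)
        self._thread.start()

    def kick(self):
        """Called by the healthy worker to reset the timer."""
        self._last_kick = time.monotonic()

    def _watch(self):
        # Poll a few times per timeout period; fire once on expiry.
        while not self._stop.wait(self.timeout / 4):
            if time.monotonic() - self._last_kick > self.timeout:
                self.on_failure()
                return

    def stop(self):
        self._stop.set()

failures = []
wd = Watchdog(timeout=0.2, on_failure=lambda: failures.append("node down"))
for _ in range(3):          # healthy worker keeps kicking
    time.sleep(0.05)
    wd.kick()
time.sleep(0.5)             # worker hangs -> watchdog fires
wd.stop()
print(failures)
```

In a real cluster the `on_failure` callback would notify the master so the hung node's task can be reassigned, which is how the mechanism raises overall reliability.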

A Modified AES Based Algorithm for Image Encryption

With the fast evolution of digital data exchange, information security becomes very important in data storage and transmission. Due to the increasing use of images in industrial processes, it is essential to protect confidential image data from unauthorized access. In this paper, we analyze the Advanced Encryption Standard (AES) and add a key stream generator (A5/1, W7) to AES to improve encryption performance, mainly for images characterised by reduced entropy. Both techniques have been implemented for experimental purposes, and detailed results in terms of security analysis and implementation are given. A comparative study with traditional encryption algorithms shows the superiority of the modified algorithm.
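
The key-stream idea can be illustrated in miniature: low-entropy image data is XORed with a pseudo-random key stream before block encryption, raising its entropy. Below, a toy 16-bit LFSR stands in for A5/1 or W7, and the AES stage itself is omitted; this only sketches the pre-whitening step, not the paper's cipher.

```python
# Key-stream pre-whitening sketch (toy LFSR stands in for A5/1 / W7).

def lfsr_stream(state, nbytes):
    """Galois LFSR over 16 bits (taps 0xB400); emits one byte per 8 shifts."""
    out = []
    for _ in range(nbytes):
        byte = 0
        for _ in range(8):
            bit = state & 1
            state >>= 1
            if bit:
                state ^= 0xB400
            byte = (byte << 1) | bit
        out.append(byte)
    return bytes(out), state

def whiten(data, key_state):
    """XOR data with the key stream; applying it twice recovers the data."""
    stream, _ = lfsr_stream(key_state, len(data))
    return bytes(d ^ s for d, s in zip(data, stream))

low_entropy = bytes([0x10] * 16)            # e.g. a flat image region
masked = whiten(low_entropy, key_state=0xACE1)
assert whiten(masked, key_state=0xACE1) == low_entropy   # XOR is symmetric
print(masked.hex())
```

Because identical plaintext blocks no longer look identical after whitening, the subsequent block cipher leaks less structure on flat image regions, which is the motivation for combining a stream generator with AES.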

Fast Depth Estimation with Filters

Fast depth estimation from binocular vision is often desired for autonomous vehicles, but most algorithms cannot easily be put into practice because of their high computational cost. We present an image processing technique that can quickly estimate a depth image from binocular vision images. The depth is estimated by finding the lines that represent the best-matched areas in the disparity space image; an edge-emphasizing filter is used when detecting these lines. The final depth estimate is produced after a smoothing filter. Our method is a compromise between local methods and global optimization.
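
The underlying disparity search can be sketched with toy block matching on a single scanline: for each pixel, the best horizontal disparity minimizes a windowed matching error, and depth then follows from the standard relation depth = f·B / disparity. The window size, search range, and test scanlines are illustrative; the paper's edge-emphasizing and smoothing filters are omitted.

```python
# Toy 1-D block-matching disparity sketch for binocular depth estimation.
import numpy as np

def disparity_1d(left, right, max_disp=4, win=1):
    """Per-pixel disparity between two scanlines via windowed matching error."""
    n = len(left)
    disp = np.zeros(n, dtype=int)
    for x in range(n):
        best_d, best_err = 0, float("inf")
        for d in range(min(x, max_disp) + 1):
            lo, hi = max(d, x - win), min(n, x + win + 1)
            # Mean absolute difference over the window, shifted by d in right.
            err = np.abs(left[lo:hi] - right[lo - d:hi - d]).mean()
            if err < best_err:
                best_d, best_err = d, err
        disp[x] = best_d
    return disp

left  = np.array([0, 0, 9, 9, 0, 0, 0, 0], dtype=float)
right = np.array([9, 9, 0, 0, 0, 0, 0, 0], dtype=float)  # object shifted by 2
disp = disparity_1d(left, right)
print(disp)                        # disparity 2 at the object's pixels
```

A full method searches the disparity space image for consistent lines of such minima rather than taking each pixel's minimum independently, which is where the speed/quality compromise between local and global methods arises.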

Power System Contingency Analysis Using Multiagent Systems

Modern power systems demand fast energy management systems (EMS). Contingency analysis is among the most time-consuming functions in an EMS. To address this limitation, this paper introduces agent-based technology into contingency analysis, where the main function of the agents is to speed up performance. The negotiation process used in decision making is explained, with the objective of minimizing operating costs. The IEEE 14-bus system and its line outages have been used in the research, and simulation results are presented.

Application of Neural Network for Contingency Ranking Based on Combination of Severity Indices

In this paper, an improved technique for contingency ranking using an artificial neural network (ANN) is presented. The proposed approach applies multi-layer perceptrons, trained by backpropagation, to contingency analysis. Severity indices for dynamic stability assessment are presented; these indices are based on the concept of coherency and on three dot products of the system variables. It is well known that some indices work better than others for a particular power system. This paper, along with test results from several different systems, demonstrates that combining indices with an ANN provides better ranking than a single index. The presented results were obtained using power system simulation (PSS/E) and MATLAB 6.5 software.

Complementary Energy Path Adiabatic Logic based Full Adder Circuit

In this paper, we present the design and evaluation of a complementary energy path adiabatic logic (CEPAL) based 1-bit full adder circuit. A simulative investigation of the proposed full adder has been carried out using the VIRTUOSO SPECTRE simulator of Cadence in 0.18 μm UMC technology, and its performance has been compared with a conventional CMOS full adder circuit. The CEPAL-based full adder exhibits an energy saving of 70 % over the conventional CMOS full adder at a 100 MHz frequency and a 1.8 V operating voltage.

Increasing Convergence Rate of a Fractionally-Spaced Channel Equalizer

In this paper a technique for increasing the convergence rate of a fractionally spaced channel equalizer is proposed. Instead of symbol-spaced updating of the equalizer filter, a mechanism is devised to update the filter at a higher rate. This makes the equalizer filter converge faster and is therefore less time-consuming. The proposed technique has been simulated and tested for two-ray modeled channels with various delay spreads, including minimum-phase and non-minimum-phase channels. Simulation results suggest that the proposed technique outperforms the conventional symbol-spaced updating of the equalizer filter.
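
The conventional baseline being improved upon can be sketched: a T/2-spaced LMS equalizer whose taps are updated once per symbol on a two-ray channel. The channel coefficients, step size, and filter length below are illustrative; the paper's higher-rate update mechanism is not reproduced here.

```python
# Baseline T/2-spaced LMS equalizer with symbol-spaced tap updates.
import numpy as np

rng = np.random.default_rng(0)
symbols = rng.choice([-1.0, 1.0], size=2000)       # BPSK training sequence

# Oversample by 2 (T/2 spacing) and pass through a toy two-ray channel
# whose echo energy falls both on and between symbol instants.
x = np.zeros(2 * len(symbols))
x[::2] = symbols
channel = np.array([1.0, 0.3, 0.5, -0.2])
received = np.convolve(x, channel)[: len(x)]

ntaps, mu = 8, 0.01
w = np.zeros(ntaps)
sq_err = []
for k in range(ntaps, len(symbols)):
    n = 2 * k                                      # symbol instant, T/2 index
    u = received[n - ntaps + 1 : n + 1][::-1]      # most recent sample first
    y = w @ u                                      # output taken once per T
    e = symbols[k] - y                             # training error
    w += mu * e * u                                # symbol-spaced LMS update
    sq_err.append(e * e)

print(f"MSE first 200: {np.mean(sq_err[:200]):.3f}, "
      f"last 200: {np.mean(sq_err[-200:]):.3f}")
```

The falling mean-square error shows convergence under once-per-symbol updates; the paper's contribution is to update the taps more often than once per symbol so that this decay happens in fewer symbol periods.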

Experimental and Numerical Study of The Shock-Accelerated Elliptic Heavy Gas Cylinders

We studied the evolution of an elliptic heavy SF6 gas cylinder surrounded by air when accelerated by a planar Mach 1.25 shock. A multiple-frame dynamic imaging technique was used to obtain one image of the experimental initial conditions and five images of the time evolution of the elliptic cylinder. We compared the width and height of the circular and the two kinds of elliptic gas cylinders, and analyzed the vortex strength of the elliptic ones. The simulations agree very well with the experiments, but due to the different initial cylinder shapes, a certain difference in the initial density peak and distribution exists between the circular and elliptic gas cylinders, and the initial state of the latter is more sensitive and harder to characterize.