Effect of a Gravel Bed Flocculator on the Efficiency of a Low-Cost Water Treatment Plant

The principal objective of a water treatment plant is to produce water that satisfies a set of drinking water quality standards at a reasonable price to consumers. The gravel-bed flocculator provides a simple and inexpensive design for flocculation in small water treatment plants (less than 5000 m3/day capacity). The packed bed of gravel provides ideal conditions for the formation of compact, settleable flocs because of the continuous recontact provided by the sinuous flow of water through the interstices formed by the gravel. The field data obtained from the operation of the water supply treatment unit cover the physical, chemical and biological qualities of the raw and settled water. The experiments were carried out with the aim of assessing the efficiency of the gravel filter in removing turbidity and pathogenic bacteria from the raw water. The water treatment plant, which was constructed for the treatment of river water, was in principle a rapid sand filter. The results show that the average turbidity of the settled water was 4.83 NTU with a standard deviation of 2.893 NTU, indicating that the removal efficiency of the sedimentation tank (gravel filter) was about 67.8%. The pH values fluctuated between 7.75 and 8.15, indicating the alkaline nature of the raw water of the river Shatt Al-Hilla, as expected. The raw water pH was depressed slightly following alum coagulation, and the pH of the settled water ranged from 7.75 to a maximum of 8.05. The bacteriological tests carried out on the water samples were the total coliform test, the E. coli test, and the plate count test. In each test the procedure used was as outlined in the Standard Methods for the Examination of Water and Wastewater (APHA, AWWA, and WPCF, 1985). The gravel filter exhibited low performance in removing the bacterial load: the percentage bacterial removal was highest for the total plate count (19%) and lowest for total coliform (16.82%).
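
As a point of reference, the reported efficiency is presumably computed from the conventional ratio of raw to settled turbidity; a minimal worked form, assuming that convention, is:

```latex
% Removal efficiency from raw and settled turbidity (assumed convention)
\eta = \frac{T_{\mathrm{raw}} - T_{\mathrm{settled}}}{T_{\mathrm{raw}}} \times 100\%
% With the reported averages, eta of about 67.8% and T_settled = 4.83 NTU together
% imply a mean raw-water turbidity of roughly 4.83 / (1 - 0.678), i.e. about 15 NTU.
```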

First-Order Filter-Based Current-Mode Sinusoidal Oscillators Using Current Differencing Transconductance Amplifiers (CDTAs)

This article presents new current-mode oscillator circuits using CDTAs, designed from a block diagram. The proposed circuits consist of two CDTAs and two grounded capacitors. The condition of oscillation and the frequency of oscillation can be adjusted electronically. The circuits have high output impedance and use only grounded capacitors without any external resistor, which makes them very suitable for future development into an integrated circuit. The PSPICE simulation results correspond well with the theoretical analysis.
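
As an illustration of the electronic tunability mentioned above, and not the exact expression derived in the paper, a two-CDTA, two-grounded-capacitor oscillator of this kind typically has an oscillation frequency of the form:

```latex
% Typical oscillation frequency of a two-CDTA, two-grounded-capacitor oscillator
% (generic form assumed for illustration; not taken from the paper)
\omega_{0} = \sqrt{\frac{g_{m1}\, g_{m2}}{C_{1}\, C_{2}}}
% Each transconductance g_{mi} is set by a CDTA bias current, so the frequency
% (and, via a separate transconductance, the oscillation condition) is electronically tunable.
```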

Home-Network Security Model in Ubiquitous Environment

Social interest in and demand for the Home-Network have been increasing greatly. Although various services are being introduced to respond to such demands, they can cause serious security problems when linked to an open network such as the Internet. This paper reviews the security requirements to protect service users under the assumption that the Home-Network environment is connected to the Internet, and then proposes a security model based on those requirements. The proposed security model can satisfy most of the requirements and can further be applied dynamically to future ubiquitous Home-Networks.

A Quantitative Approach to Strategic Design of Component-Based Business Process Models

A new paradigm for software design and development models software by its business process, translates the model into a process execution language, and has it run by a supporting execution engine. This process-oriented paradigm promotes modeling of software by less technical users or business analysts, as well as rapid development. Since business process models may be shared by different organizations, and sometimes even by different business domains, it is interesting to apply a technique used in traditional software component technology to the design of reusable business processes. This paper discusses an approach that applies a technique for software component fabrication to the design of process-oriented software units, called process components. These process components result from decomposing a business process of a particular application domain into subprocesses, with the aim that the process components can be reused in different process-based software models. The approach is quantitative because the quality of a process component design is measured from the technical features of the process components. The approach is also strategic because the measured quality is evaluated against business-oriented component management goals. A software tool has been developed to measure how good a process component design is according to the required managerial goals and in comparison with other designs. We also discuss the benefits gained from reusable process components.

A Software Framework for Predicting Oil-Palm Yield from Climate Data

Intelligent systems based on machine learning techniques, such as classification and clustering, are gaining widespread popularity in real-world applications. This paper presents work on developing a software system for predicting crop yield, in particular oil-palm yield, from climate and plantation data. At the core of our system is a method for unsupervised partitioning of data to find spatio-temporal patterns in climate data using kernel methods, which offer strength in dealing with complex data. This work draws inspiration from the notion that a non-linear transformation of data into a high-dimensional feature space increases the possibility of linear separability of the patterns in the transformed space and therefore simplifies exploration of the associated structure in the data. Kernel methods implicitly perform such a non-linear mapping of the input data into a high-dimensional feature space by replacing inner products with an appropriate positive definite function. In this paper we present a robust weighted kernel k-means algorithm incorporating spatial constraints for clustering the data. The proposed algorithm can effectively handle noise, outliers and auto-correlation in the spatial data, enabling effective and efficient data analysis by exploring patterns and structures in the data, and thus can be used for predicting oil-palm yield by analyzing the various factors affecting the yield.
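
The clustering step described above can be illustrated with a minimal weighted kernel k-means sketch; the spatial-constraint term and the exact kernel used in the paper are not reproduced here, and an RBF kernel with uniform point weights is assumed purely for illustration.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """Gram matrix of the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

def weighted_kernel_kmeans(K, w, n_clusters, n_iter=50, seed=0):
    """Weighted kernel k-means on a precomputed Gram matrix K with point weights w."""
    rng = np.random.default_rng(seed)
    labels = rng.integers(n_clusters, size=K.shape[0])
    for _ in range(n_iter):
        dist = np.full((K.shape[0], n_clusters), np.inf)
        for c in range(n_clusters):
            mask = labels == c
            wc = w[mask]
            sw = wc.sum()
            if sw == 0:
                continue
            # ||phi(x_i) - m_c||^2, dropping the K_ii term common to all clusters
            second = K[:, mask] @ wc / sw
            third = wc @ K[np.ix_(mask, mask)] @ wc / sw ** 2
            dist[:, c] = -2 * second + third
        new_labels = dist.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
    return labels

# Toy usage: two well-separated blobs
X = np.vstack([np.random.randn(20, 2), np.random.randn(20, 2) + 5])
labels = weighted_kernel_kmeans(rbf_kernel(X, gamma=0.5), np.ones(len(X)), n_clusters=2)
```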

Study on Discharge Current Phenomena of Epoxy Resin Insulator Specimen

This paper presents experimental results on discharge current phenomena of an epoxy resin specimen under various humidity, temperature, pressure and pollution conditions. The leakage distance of the specimen was 3 cm, and the specimen was energized with high voltage. The polluted condition was produced with an artificial NaCl pollutant. The measured quantities were the discharge current and the applied voltage. The specimen was placed in a hermetically sealed chamber, and the current waveforms were analyzed with the FFT. The results indicated that under discharge conditions the fifth harmonic remained dominant over the third. The third harmonic tended to appear under the low-pressure, heavily polluted condition, followed by the high-humidity, heavily polluted condition. For the heavily polluted specimen, the discharge current peaks were higher and occurred more frequently; nevertheless, the specimen still exhibited a capacitive property. The influence of low pressure remained dominant in making discharge easier, and the non-linear property appeared explicitly under the low-pressure, heavily polluted condition.
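
The harmonic analysis referred to above can be reproduced with a straightforward FFT of the sampled current waveform; in the sketch below the sampling rate, fundamental frequency and synthetic waveform are assumptions made purely for illustration.

```python
import numpy as np

fs = 10_000          # sampling rate in Hz (assumed)
f0 = 50              # fundamental frequency in Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)

# Synthetic leakage current: fundamental plus 3rd and 5th harmonics (illustrative only)
i_t = (1.0 * np.sin(2 * np.pi * f0 * t)
       + 0.2 * np.sin(2 * np.pi * 3 * f0 * t)
       + 0.4 * np.sin(2 * np.pi * 5 * f0 * t))

spectrum = np.fft.rfft(i_t)
freqs = np.fft.rfftfreq(len(i_t), 1 / fs)
mag = 2 * np.abs(spectrum) / len(i_t)   # single-sided amplitude spectrum

def harmonic_amplitude(n):
    """Amplitude of the n-th harmonic, read from the nearest FFT bin."""
    return mag[np.argmin(np.abs(freqs - n * f0))]

print("3rd harmonic:", harmonic_amplitude(3))
print("5th harmonic:", harmonic_amplitude(5))
print("5th/3rd ratio:", harmonic_amplitude(5) / harmonic_amplitude(3))
```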

Re-Optimization of MVPP Using Common Subexpressions for Materialized View Selection

A data warehouse is a repository of information integrated from source data. Information stored in a data warehouse is kept in materialized form in order to provide better performance for answering queries. Deciding which views are appropriate to materialize is an important problem. To meet this requirement, constructing a search space close to optimal is necessary, as it provides effective results for selecting views to be materialized. In this paper we propose an approach to re-optimize the Multiple View Processing Plan (MVPP) by using global common subexpressions. Merged queries whose query processing cost is not close to optimal are rewritten. The experiments show that our approach helps to improve the total query processing cost of the MVPP, and that the sum of the query processing cost and the materialized view maintenance cost is also reduced after views are selected for materialization.
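
The quantity being re-optimized can be summarized with the usual MVPP cost model, in which the total cost is the frequency-weighted query processing cost plus the maintenance cost of the views chosen for materialization; the sketch below uses that standard model with purely illustrative names and numbers, not figures from the paper.

```python
def total_mvpp_cost(queries, views, materialized):
    """Frequency-weighted query cost plus maintenance cost of the materialized views.

    queries:      list of (query_frequency, processing_cost) pairs, where the
                  processing cost already reflects which views are materialized
    views:        dict view_name -> (update_frequency, maintenance_cost)
    materialized: set of view names chosen for materialization
    """
    query_cost = sum(freq * cost for freq, cost in queries)
    maintenance_cost = sum(views[v][0] * views[v][1] for v in materialized)
    return query_cost + maintenance_cost

# Illustrative numbers only: two queries sharing one candidate common subexpression
queries = [(10, 120.0), (5, 300.0)]
views = {"common_subexpr_1": (2, 80.0)}
print(total_mvpp_cost(queries, views, {"common_subexpr_1"}))
```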

Interactive Compromise Approach with Particle Swarm Optimization for Environmental/Economic Power Dispatch

In this paper, an Interactive Compromise Approach with Particle Swarm Optimization (ICA-PSO) is presented to solve the Economic Emission Dispatch (EED) problem. The cost function and the emission function are both modeled as non-smooth functions. A bi-objective problem including the minimization of both cost and emission is formulated. ICA-PSO is proposed to solve the EED problem and find a better compromise solution. The solution methodology can offer a global or near-global solution for decision-making requirements. The effectiveness and efficiency of ICA-PSO are demonstrated on a sample test system. The test results show that the proposed method provides a practical and flexible framework for power dispatch.
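
For context, the bi-objective EED problem is commonly written with a valve-point (non-smooth) fuel cost term and an exponential emission term; the expressions below are these standard textbook forms, assumed here for illustration rather than taken from the paper.

```latex
% Commonly used non-smooth EED formulation (standard form, assumed for illustration)
\min\; F(P) = \sum_{i=1}^{N} \Big[ a_i + b_i P_i + c_i P_i^{2}
      + \big| e_i \sin\!\big(f_i (P_i^{\min} - P_i)\big) \big| \Big]
\qquad
\min\; E(P) = \sum_{i=1}^{N} \Big[ \alpha_i + \beta_i P_i + \gamma_i P_i^{2}
      + \eta_i \exp(\lambda_i P_i) \Big]
% subject to the power balance and generator limits
\text{s.t.}\quad \sum_{i=1}^{N} P_i = P_D + P_L,
\qquad P_i^{\min} \le P_i \le P_i^{\max}
```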

Real Time Multi-Sensory Force Sensing Mat for Sports Biomechanics and Human Gait Analysis

This paper presents a real-time force sensing instrument designed for human gait analysis. It is capable of recording and monitoring the ground reaction forces exerted by the human foot during various activities such as walking, running and jumping in real time. Overall, the instrument mainly consists of three elements: the force sensing mat, a signal conditioning circuit and a data acquisition device. The force sensing mat contains an array of force sensing elements. To control and process the incoming signal from the force sensing mat, Force-Logger and Force-Reloader were developed using National Instruments LabVIEW. This paper describes the architecture of the force sensing mat, the signal conditioning circuit and the real-time streaming of the incoming data from the force sensing mat. Additionally, a preliminary experimental dataset is presented.

Seismic Analysis of an S-Curved Viaduct Using Stick and Finite Element Models

Stick models are widely used in preliminary studies of the behaviour of straight as well as skew bridges and viaducts subjected to earthquakes. The application of such models to highly curved bridges continues to pose challenging problems. A viaduct proposed in the foothills of the Himalayas in Northern India is chosen for the study. It has 8 simply supported spans at 30 m centre-to-centre. It is doubly curved in the horizontal plane with a 20 m radius and is inclined in the vertical plane as well. The superstructure consists of a box section. Three models have been used: a conventional stick model, an improved stick model and a 3D finite element model. The improved stick model makes use of body constraints in order to study its capabilities. The first 8 frequencies differ by about 9.71% between the latter two models; the difference increases to 80% by the 50th mode. The viaduct was subjected to all three components of the El Centro earthquake of May 1940. The numerical integration was carried out using the Hilber-Hughes-Taylor method as implemented in SAP2000. Axial forces and moments in the bridge piers as well as lateral displacements at the bearing levels are compared for the three models. The maximum differences in the axial forces, bending moments and displacements are about 25% between the improved stick model and the finite element model, whereas the maximum differences in the axial forces, moments and displacements in various sections are about 35% between the improved stick model and the equivalent straight stick model. The difference in torsional moment was as high as 75%. It is concluded that the stick model with body constraints to model the bearings and expansion joints is not desirable for very sharply S-curved viaducts even for preliminary analysis. This model can be used only to determine the first 10 frequencies and mode shapes, but not member forces. A 3D finite element analysis must be carried out for meaningful results.

CFD of Oscillating Airfoil Pitch Cycle Using the PISO Algorithm

This research paper presents a CFD analysis of an oscillating airfoil during the pitch cycle. Unsteady subsonic flow is simulated for the pitching airfoil at a Mach number of 0.283 and a Reynolds number of 3.45 million. Turbulence effects are included through the k-ω SST turbulence model. A two-dimensional unsteady compressible Navier-Stokes code with a two-equation turbulence model and PISO pressure-velocity coupling is used, together with a pressure-based implicit solver and a first-order implicit unsteady formulation. The simulated pitch cycle results are compared with the available experimental data and show good agreement with it. Aerodynamic characteristics during the pitch cycles have been studied and validated.
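
The PISO coupling named above follows a fixed predictor-corrector structure within each time step; the skeleton below only illustrates that structure, with stubbed solver routines, and is not the code used in the study.

```python
import numpy as np

def solve_momentum(u, p, dt):
    """Momentum predictor: advance the velocity field using the current pressure (stub)."""
    return u  # placeholder

def solve_pressure_correction(u_star):
    """Pressure-correction (Poisson) solve that enforces continuity (stub)."""
    return np.zeros_like(u_star)

def correct_fields(u_star, p, p_prime):
    """Apply the pressure correction to the velocity and pressure fields (stub)."""
    return u_star, p + p_prime

def piso_time_step(u, p, dt, n_correctors=2):
    """One PISO time step: a single momentum predictor followed by a fixed
    number of pressure-correction / field-correction sweeps (typically two)."""
    u_star = solve_momentum(u, p, dt)
    for _ in range(n_correctors):
        p_prime = solve_pressure_correction(u_star)
        u_star, p = correct_fields(u_star, p, p_prime)
    return u_star, p
```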

Downlink Scheduling and Radio Resource Allocation in Adaptive OFDMA Wireless Communication Systems for User-Individual QoS

In this paper, we address the problem of adaptive radio resource allocation (RRA) and packet scheduling in the downlink of a cellular OFDMA system, and propose a downlink multi-carrier proportional fair (MPF) scheduler together with its joint use with an adaptive RRA algorithm to distribute radio resources among multiple users according to their individual QoS requirements. The allocation and scheduling objective is to maximize the total throughput while at the same time maintaining fairness among users. The simulation results demonstrate that the presented methods provide users with more explicit fairness than the RRA algorithm alone, while the joint scheme achieves higher sum-rate capacity with flexible parameter settings compared with the MPF scheduler.
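
The proportional fair principle behind the MPF scheduler, in its generic per-subcarrier form rather than the exact metric of the paper, can be sketched as follows.

```python
import numpy as np

def mpf_schedule(rates, avg_thr, tc=100.0):
    """One interval of a multi-carrier proportional fair scheduler.

    rates:   (n_users, n_subcarriers) achievable instantaneous rates
    avg_thr: (n_users,) exponentially averaged past throughput per user
    tc:      averaging window length
    Returns the per-subcarrier user assignment and the updated averages.
    """
    metric = rates / avg_thr[:, None]      # PF metric r_{k,n} / R_k
    assignment = metric.argmax(axis=0)     # winning user on each subcarrier
    served = np.zeros_like(avg_thr)
    for n, k in enumerate(assignment):
        served[k] += rates[k, n]
    new_avg = (1 - 1 / tc) * avg_thr + (1 / tc) * served
    return assignment, new_avg

# Toy example: 3 users, 8 subcarriers, random channel rates (illustrative)
rng = np.random.default_rng(1)
rates = rng.uniform(0.1, 1.0, size=(3, 8))
assignment, avg = mpf_schedule(rates, avg_thr=np.ones(3))
```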

Design and Research of a New Kind of Balance Adjusting System for a Centrifuge

In order to balance an environmental test centrifuge automatically and accurately and to reduce the unbalanced centrifugal force, a balance adjusting system for the centrifuge is designed. The new balance adjusting system comprises a motor-reducer, a timing belt, a screw pair, a slider-guideway and four rocker force sensors. According to the information obtained from the four rocker force sensors, the unbalance value at the two ends of the big arm is computed and a heavy block is moved to achieve balance adjustment. In this paper, the motor power and the torque required to move the heavy block are calculated. For full-load running of the centrifuge, the stress and strain of the screw pair composed of the adjusting nut and the big arm are analyzed. A successful application of the balance adjusting system is also presented. The results show that the balance adjusting system can satisfy the balancing requirements of the environmental test centrifuge.
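
The relation behind the adjustment can be stated in its simplest form, which is assumed here for illustration and is not the detailed derivation given in the paper:

```latex
% Unbalanced centrifugal force for an equivalent unbalance mass m_u at radius r_u
F_{u} = m_{u}\, r_{u}\, \omega^{2}
% Balance is restored by moving the heavy block of mass m_b along the arm until
% its offset r_b cancels the unbalance moment:
m_{b}\, r_{b} = m_{u}\, r_{u}
```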

Analysis of Testing and Operational Software Reliability in SRGM based on NHPP

Software reliability is one of the key factors in the software development process. Software reliability is estimated using reliability models based on the Non-Homogeneous Poisson Process (NHPP). In most of the literature, software reliability is predicted only in the testing phase, which can lead to wrong decision making. In this paper, software reliability in two phases, testing and operational, is studied in detail. Using the S-shaped Software Reliability Growth Model (SRGM) and the exponential SRGM, the testing and operational reliability values are obtained. Finally, the two reliability values are compared and the optimal release time is investigated.
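
For reference, the two NHPP models named above are conventionally defined by the mean value functions below, with the conditional reliability over a mission interval x after time t; these are the standard textbook forms.

```latex
% Exponential (Goel-Okumoto) and delayed S-shaped SRGM mean value functions
m_{\mathrm{exp}}(t) = a\,\big(1 - e^{-bt}\big),
\qquad
m_{\mathrm{S}}(t) = a\,\big(1 - (1 + bt)\,e^{-bt}\big)
% Conditional reliability for a mission of length x after release at time t
R(x \mid t) = \exp\!\big[-\big(m(t + x) - m(t)\big)\big]
```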

The Evolution of Quality Improvement Methodology in Malaysia's IT Industry: The Past, Present and Future

There are various approaches to implementing quality improvements. Organizations aim for a management standard which is capable of providing customers with quality assurance on their product or service via continuous process improvement. Carefully planned steps are necessary to ensure that the right quality improvement methodology (QIM) is chosen and that business operations are consistent, reliable and truly meet the customers' needs. This paper traces the evolution of QIM in Malaysia's Information Technology (IT) industry across the past, present and future; highlights some of the thinking of researchers who contributed to the science and practice of quality; and identifies leading methodologies in use today. Some of the misconceptions and mistakes leading to quality system failures are also examined and discussed. This paper aims to provide a general overview of the different types of QIM available to IT businesses for maximizing business advantages, enhancing product quality, improving process routines and increasing performance earnings.

Electromyographic Activity of the Medial Gastrocnemius and Lateral Gastrocnemius Muscles during Salat and a Specific Exercise

This paper investigates the activity of the gastrocnemius (Gas) muscle in healthy subjects during salat (the ruku' position) and a specific exercise, the Unilateral Plantar Flexion Exercise (UPFE), using electromyography (EMG). Both the lateral and medial Gas muscles were assessed. A group of undergraduates aged between 19 and 25 years voluntarily participated in this study. The myoelectric activity of the muscles was recorded and analyzed. The findings indicated that the muscles contracted during both salat and the exercise with almost the same EMG levels. The Wilcoxon rank-sum test showed no significant difference between ruku' and UPFE for both the medial (p = 0.082) and lateral (p = 0.226) Gas muscles. Therefore, salat may be useful as a strengthening exercise and also in rehabilitation programs for lower-limb activities.

The Regional Concept, Public Policy and Policy Spaces: The ARC and TVA

This paper examines two policy spaces, the ARC and TVA, and their spatialized politics. The research observes that the regional concept informs public policy and can contribute to the formation of stable policy initiatives. Using the subsystem framework to understand the political viability of policy regimes, the authors conclude that policy geographies which appeal to traditional definitions of regions are more stable over time. In contrast, geographies that fail to reflect pre-existing representations of space are engaged in more competitive subsystem politics. The paper demonstrates that the spatial practices of policy regions and their directional politics influence the political viability of programs. The paper concludes that policy spaces should institutionalize pre-existing geographies rather than manufacture new ones.

Supervisory Fuzzy Learning Control for Underwater Target Tracking

This paper presents recent work on the improvement of a robotic vision-based control strategy for an underwater pipeline tracking system. The study focuses on developing image processing algorithms and a fuzzy inference system for the analysis of the terrain. The main goal is to implement a supervisory fuzzy learning control technique to reduce errors in navigation decisions due to the pipeline occlusion problem. The system developed is capable of interpreting underwater images containing occluded pipeline, seabed and other unwanted noise. The algorithm proposed in previous work does not exploit the cooperation between fuzzy controllers, knowledge and learnt data to improve the outputs for underwater pipeline tracking. Computer simulations and prototype simulations demonstrate the effectiveness of this approach. The system's accuracy level is also discussed.

Chemical Analysis of PM2.5 during Dry Deforestation Season in Southeast Asia

In Southeast Asia, during the dry season (August to October), forest fires in Indonesia emit pollutants into the atmosphere. Over two years during this period, a total of 67 samples of PM2.5 particulate matter were collected and analyzed for total mass and elemental composition by ICP-MS after microwave digestion. A study of the 60 elements measured during these periods suggests that the concentrations of most elements, even those usually related to crustal sources, are extremely high and unpredictable during the haze period. By contrast, trace element concentrations in the non-haze months are more stable and cover a lower range. Other unexpected events and their effects on the findings are discussed.

A Cooperative Multi-Robot Control Using Ad Hoc Wireless Network

In this paper, a Cooperative Multi-robot algorithm for Carrying Targets (CMCT) is proposed. The multi-robot team consists of three robots: one is a supervisor and the others are workers that carry boxes in a store of 100×100 m2. Each robot has a self-recharging mechanism. The CMCT minimizes the robots' working time for carrying many boxes during the day by working in parallel; that is, the supervisor determines the required variables while the other robots work with the previous variables. It uses straightforward mechanical models based on simple cosine laws. It finds the robot's shortest path to the target position while avoiding obstacles by using the proposed CMCT path planning (CMCT-PP) algorithm, and it prevents collisions between robots while they are moving. The robots interact over an ad hoc wireless network. Simulation results show that the proposed system, consisting of the CMCT algorithm and the accompanying CMCT-PP algorithm, achieves a considerable improvement in time and distance over existing algorithms while performing the required tasks.
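
The cosine laws mentioned above are the standard law-of-cosines relations; the sketch below shows the kind of calculation involved, with purely illustrative distances and angles rather than values from the paper.

```python
import math

def third_side(a, b, gamma):
    """Law of cosines: length of the side opposite the included angle gamma (radians)."""
    return math.sqrt(a ** 2 + b ** 2 - 2 * a * b * math.cos(gamma))

def opposite_angle(a, b, c):
    """Inverse form: angle (radians) opposite side c, given all three side lengths."""
    return math.acos((a ** 2 + b ** 2 - c ** 2) / (2 * a * b))

# Illustrative only: unknown distance from two known distances and the included angle
d = third_side(12.0, 20.0, math.radians(35.0))
print(round(d, 2), "m")
```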