Biologically Inspired Controller for the Autonomous Navigation of a Mobile Robot in an Evasion Task

A novel biologically inspired controller for the autonomous navigation of a mobile robot in an evasion task is proposed. The controller takes advantage of the environment by calculating a measure of danger and subsequently choosing the parameters of a reinforcement-learning-based decision process. Two different reinforcement learning algorithms were used: Q-learning and Sarsa(λ). Simulations show that selecting the parameters dynamically reduces the time spent in the decision-making process, so the robot can obtain a policy that succeeds in the escape task within a realistic time.
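For orientation, the tabular update underlying algorithms of this kind can be sketched in Python as follows; this is a minimal illustration on a hypothetical discrete escape task, not the authors' implementation, and the state/action sizes, learning rate, discount factor and exploration rate are placeholder values of the sort a danger measure could select dynamically.

import numpy as np

# Minimal tabular Q-learning sketch for a hypothetical discrete escape task.
n_states, n_actions = 100, 4
Q = np.zeros((n_states, n_actions))

def q_update(s, a, r, s_next, alpha=0.1, gamma=0.95):
    # One Q-learning step; alpha and gamma could be chosen from a danger measure.
    td_target = r + gamma * Q[s_next].max()
    Q[s, a] += alpha * (td_target - Q[s, a])

def epsilon_greedy(s, epsilon=0.1):
    # Random action with probability epsilon, otherwise the greedy action.
    if np.random.rand() < epsilon:
        return np.random.randint(n_actions)
    return int(Q[s].argmax())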

Sentence Modality Recognition in French based on Prosody

This paper deals with automatic sentence modality recognition in French. In this work, only prosodic features are considered. The sentences are recognized according to the following three modalities: declarative, interrogative and exclamatory. This information will be used to animate a talking head for deaf and hearing-impaired children. We first statistically study a real radio corpus in order to assess the feasibility of automatically modeling sentence types. Then, we test two sets of prosodic features as well as two different classifiers and their combination. We further focus our attention on question recognition, as this modality is certainly the most important one for the target application.
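As a small illustration of combining two classifiers over prosodic feature vectors (not the corpus, features or models used in this work), a soft-voting combination could be sketched with scikit-learn as follows; the feature array and labels below are synthetic placeholders.

import numpy as np
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import VotingClassifier

# Hypothetical prosodic feature vectors (e.g. F0 slope, energy, duration) and
# modality labels: 0 = declarative, 1 = interrogative, 2 = exclamatory.
X = np.random.rand(300, 12)
y = np.random.randint(0, 3, 300)

combo = VotingClassifier(
    estimators=[("svm", SVC(probability=True)), ("nb", GaussianNB())],
    voting="soft",  # average the class probabilities of both classifiers
)
combo.fit(X, y)
print(combo.predict(X[:5]))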

Predicting DHF Incidence in Northern Thailand using Time Series Analysis Technique

This study aimed at developing a forecasting model for the number of Dengue Haemorrhagic Fever (DHF) incidences in Northern Thailand using time series analysis. We developed Seasonal Autoregressive Integrated Moving Average (SARIMA) models on the data collected between 2003-2006 and then validated the models using the data collected between January-September 2007. The results showed that the forecast curves were consistent with the pattern of the actual values. The most suitable model was the SARIMA(2,0,1)(0,2,0)12 model, with an Akaike Information Criterion (AIC) of 12.2931 and a Mean Absolute Percent Error (MAPE) of 8.91713. The SARIMA(2,0,1)(0,2,0)12 model fitted the data adequately, with a Portmanteau statistic Q(20) = 8.98644 (χ² critical value = 27.5871, P > 0.05). This indicated that there was no significant autocorrelation between residuals at different lag times in the SARIMA(2,0,1)(0,2,0)12 model.
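As an illustration of fitting a seasonal ARIMA of this form, a sketch with Python's statsmodels (on a short placeholder monthly series, not the actual DHF surveillance data) might look like the following; in practice a longer series would be used.

import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Placeholder monthly case counts standing in for the 2003-2006 series.
idx = pd.date_range("2003-01", periods=48, freq="MS")
cases = pd.Series(np.random.poisson(50, 48), index=idx)

# SARIMA(2,0,1)(0,2,0) with a 12-month season, as reported in the abstract.
model = SARIMAX(cases, order=(2, 0, 1), seasonal_order=(0, 2, 0, 12))
fit = model.fit(disp=False)
print(fit.aic)
forecast = fit.forecast(steps=9)  # e.g. January-September of the following year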

A Model of Heat Radiation on a Mould Surface in the Car Industry

This article focuses on the calculation of heat radiation intensity and its optimization on an aluminum mould surface. The inside of the mould is sprinkled with a special powder and its outside is heated by infra heaters located above the mould surface, up to a temperature of 250°C. In this way, artificial leathers for the car industry (e.g. the artificial leather on a car dashboard) are produced. A mathematical model of the heat radiation of infra heaters on a mould surface is described in this paper. This model allows us to calculate the heat radiation intensity on the mould surface for a given location of the infra heaters above it. It is necessary to ensure approximately the same heat radiation intensity across the mould surface by finding a suitable location for the infra heaters, and in this way the same material structure and color of the artificial leather. In the model we have used a genetic algorithm to optimize the radiation intensity on the mould surface. Experimentally measured values of the heat radiation intensity, obtained with a sensor in the surroundings of an infra heater, are used in the calculation procedures. The computational procedure was programmed in Matlab.
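To make the optimization step concrete, a compact genetic-algorithm loop in plain NumPy is sketched below; the encoding and the toy fitness function are placeholders, whereas the real objective would score the uniformity of the computed radiation intensity over the mould surface for a given heater layout.

import numpy as np

# Toy GA: each individual encodes (x, y) positions of four infra heaters.
rng = np.random.default_rng(0)
pop_size, n_genes, n_gen = 40, 8, 100

def fitness(ind):
    # Placeholder objective (maximized); the real one evaluates radiation uniformity.
    return -abs(np.std(ind) - 0.3)

pop = rng.random((pop_size, n_genes))
for _ in range(n_gen):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)][-pop_size // 2:]       # selection
    children = parents.copy()
    half, cut = len(parents) // 2, n_genes // 2
    children[:half, cut:] = parents[half:2 * half, cut:]      # one-point crossover
    children += rng.normal(0, 0.05, children.shape)           # mutation
    pop = np.vstack([parents, children])[:pop_size]
best = pop[np.argmax([fitness(ind) for ind in pop])]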

Prediction of Location of High Energy Shower Cores using Artificial Neural Networks

Artificial Neural Networks (ANNs) can be modeled for high energy particle analysis, with special emphasis on shower core location. This work describes the use of an ANN-based system configured to predict the locations of shower cores in the range 10^10.5 to 10^20.5 eV. The system receives density values as inputs and generates the coordinates of shower events recorded at 20 core positions by 80 detectors in an area of 100 meters. Twenty ANNs are trained for the purpose and the positions of the shower events are optimized using cooperative ANN learning. The results derived with input variations of up to 50% show success rates in the 90% range.
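As a sketch of the regression step (detector densities mapped to core coordinates), a small feed-forward network could be set up with scikit-learn as shown below; the data, detector geometry and network size are placeholders rather than the configuration used in this study.

import numpy as np
from sklearn.neural_network import MLPRegressor

# Placeholder data: 80 detector density values per event -> (x, y) core position.
rng = np.random.default_rng(1)
densities = rng.random((500, 80))
cores = rng.random((500, 2)) * 100.0   # coordinates within a 100 m region

net = MLPRegressor(hidden_layer_sizes=(40,), max_iter=2000)
net.fit(densities, cores)
predicted = net.predict(densities[:5])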

Closely Parametrical Model for an Electrical Arc Furnace

To maximise furnace production it is necessary to optimise furnace control, with the objectives of achieving maximum power input into the melting process and minimum network distortion and power-off time, without compromising quality and safety. This can be achieved on the one hand by appropriate electrode control and on the other hand by a minimum of AC transformer switching. The electrical arc is a stochastic process, which is the principal cause of power quality problems, including voltage dips, harmonic distortion, unbalanced loads and flicker. It is therefore difficult to build an appropriate model of an Electrical Arc Furnace (EAF). The factors that affect EAF operation are the melting or refining materials, the melting stage, the electrode position (arc length), the electrode arm control and the short circuit power of the feeder. Arc voltage, current and power are therefore defined as nonlinear functions of the arc length. In this article we propose our own empirical function and model of the EAF for the main stages of the melting process, based on measurements taken in the steel factory.
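The empirical function itself is not reproduced here, but as a hypothetical illustration of fitting a nonlinear arc-voltage versus arc-length relation to measurements, scipy's curve_fit could be used as follows; both the functional form and the synthetic measurements are illustrative assumptions, not the model of this paper.

import numpy as np
from scipy.optimize import curve_fit

# Hypothetical nonlinear arc model V(L) = a + b*L + c/(d + L).
def arc_voltage(L, a, b, c, d):
    return a + b * L + c / (d + L)

L_meas = np.linspace(0.05, 0.5, 20)              # arc length [m]
V_meas = arc_voltage(L_meas, 40, 300, 5, 0.1)    # synthetic "measurements"
V_meas = V_meas + np.random.normal(0, 2, L_meas.size)

params, _ = curve_fit(arc_voltage, L_meas, V_meas, p0=[30, 250, 1, 0.1])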

Combining LCIA and Fuzzy Risk Assessment for Environmental Impact Assessment

Environmental impact assessment (EIA) is a procedural tool of environmental management for identifying, predicting, evaluating and mitigating the adverse effects of development proposals. EIA reports usually analyze how the amounts or concentrations of pollutants comply with the relevant standards. In fact, many analytical tools can deepen the analysis of environmental impacts in EIA reports, such as life cycle assessment (LCA) and environmental risk assessment (ERA). Life cycle impact assessment (LCIA) is one of the steps in LCA, introducing the causal relationships between environmental hazards and damage. Incorporating the LCIA concept into ERA as an integrated tool for EIA can extend the focus from the regulatory compliance of environmental impacts to determining their significance. When using integrated tools, it is sometimes necessary to consider fuzzy situations due to insufficient information; therefore, ERA should be generalized to fuzzy risk assessment (FRA). Finally, the use of the proposed methodology is demonstrated through a case study of the expansion plan of the world's largest plastics processing factory.
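As a small illustration of the fuzzy side of such an assessment (a generic sketch, not the methodology of this paper), a triangular membership function for an imprecise impact score can be written directly in NumPy:

import numpy as np

def triangular(x, a, b, c):
    # Triangular membership: 0 at a and c, 1 at the peak b.
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

# Hypothetical fuzzy "high risk" set over a normalized impact score in [0, 1].
score = np.linspace(0, 1, 11)
membership_high = triangular(score, 0.5, 0.8, 1.0)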

Artificial Neural Network Model for a Low Cost Failure Sensor: Performance Assessment in Pipeline Distribution

This paper describes an automated event detection and location system for water distribution pipelines which is based upon low-cost sensor technology and signature analysis by an Artificial Neural Network (ANN). A low-cost failure sensor which measures the opacity or cloudiness of the local water flow has been designed, developed and validated, and an ANN-based system is then described which uses the time series data produced by the sensors to construct an empirical model for time series prediction and classification of events. These two components have been installed, tested and verified at an experimental site in a UK water distribution system. Verification of the system has been achieved through a series of simulated burst trials which have provided real data sets. It is concluded that the system has potential in water distribution network management.
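A minimal sketch of one-step-ahead prediction on windowed sensor time series with an ANN is given below (synthetic data and an illustrative network, not the sensors or model of this work); an event could then be flagged when the prediction error exceeds a threshold.

import numpy as np
from sklearn.neural_network import MLPRegressor

# Placeholder opacity time series; a real deployment would stream sensor data.
rng = np.random.default_rng(2)
signal = np.sin(np.linspace(0, 20, 600)) + rng.normal(0, 0.05, 600)

window = 10
X = np.array([signal[i:i + window] for i in range(len(signal) - window)])
y = signal[window:]

net = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000).fit(X, y)
errors = np.abs(net.predict(X) - y)
events = np.where(errors > 3 * errors.std())[0]   # crude anomaly flag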

Migration from Commercial to in-House Developed Learning Management Systems

Learning Management Systems provide a learning environment offering a collection of e-learning tools in a package that allows a common interface and information sharing among the tools. South East European University's initial LMS experience was with the commercial LMS ANGEL. After three years of using ANGEL, it was decided, because of the very high expenses, to develop our own software. As part of the research project team for the in-house design and development of the new LMS, we first had to select the features that would cover our needs and comply with current trends in software development, and then design and develop the system. In this paper we present the process of in-house LMS development for South East European University, its architecture, conception and strengths, with special emphasis on the process of migration and integration with other enterprise applications.

In Vitro Antibacterial and Antifungal Effects of a 30 kDa D-Galactoside-Specific Lectin from the Demosponge, Halichondria okadai

The present study was undertaken to screen the in vitro antimicrobial activities of the D-galactose-binding sponge lectin HOL-30. HOL-30 was purified from the marine demosponge Halichondria okadai by affinity chromatography. The molecular mass of the lectin was determined to be 30 kDa, with a single polypeptide, by SDS-PAGE under non-reducing and reducing conditions. HOL-30 agglutinated trypsinized and glutaraldehyde-fixed rabbit and human erythrocytes, with a preference for type O erythrocytes. The lectin was evaluated for inhibition of microbial growth by the disc diffusion method against eleven human pathogenic gram-positive and gram-negative bacteria. The lectin exhibited strong antibacterial activity against gram-positive bacteria, such as Bacillus megaterium and Bacillus subtilis. However, it showed no activity against gram-negative bacteria such as Salmonella typhi and Escherichia coli. The largest zones of inhibition were recorded for Bacillus megaterium (12 mm in diameter) and Bacillus subtilis (10 mm in diameter) at a lectin concentration of 250 μg/disc. On the other hand, the antifungal activity of the lectin was investigated against six phytopathogenic fungi using the food poisoning technique. The lectin showed maximum inhibition (22.83%) of the mycelial growth of Botryodiplodia theobromae at a concentration of 100 μg/mL of media. These findings indicate that the lectin may be of importance to clinical microbiology and may have therapeutic applications.

Global Electricity Consumption Estimation Using Particle Swarm Optimization (PSO)

An integrated Artificial Neural Network-Particle Swarm Optimization (ANN-PSO) approach is presented for analyzing global electricity consumption. To this end, the following steps are carried out. Step 1: PSO is applied to determine the world's oil, natural gas, coal and primary energy demand equations based on socio-economic indicators. The world's population, gross domestic product (GDP), oil trade movement and natural gas trade movement are used as socio-economic indicators in this study. For each socio-economic indicator, a feed-forward back-propagation artificial neural network is trained and projected over the future time domain. Step 2: global electricity consumption is projected based on the oil, natural gas, coal and primary energy consumption using PSO. Global electricity consumption is forecast up to the year 2040.
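As an illustration of the PSO step (estimating the coefficients of a demand equation from indicator data), a compact particle swarm optimizer in NumPy is sketched below; the linear equation form, the synthetic data and all parameter values are placeholders, not the demand equations estimated in this study.

import numpy as np

rng = np.random.default_rng(3)

# Placeholder: fit coefficients w of a linear demand equation y = X @ w by
# minimizing the squared error with PSO on synthetic indicator data.
X = rng.random((50, 4))            # e.g. population, GDP, oil and gas trade
y = X @ np.array([1.5, -0.7, 2.0, 0.3]) + rng.normal(0, 0.01, 50)

def cost(w):
    return np.sum((X @ w - y) ** 2)

n_particles, dim = 30, 4
pos = rng.uniform(-3, 3, (n_particles, dim))
vel = np.zeros_like(pos)
pbest, pbest_cost = pos.copy(), np.array([cost(p) for p in pos])
gbest = pbest[pbest_cost.argmin()].copy()

for _ in range(200):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    costs = np.array([cost(p) for p in pos])
    improved = costs < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
    gbest = pbest[pbest_cost.argmin()].copy()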

Techniques for Video Mosaicing

Video mosaicing is the stitching of selected frames of a video by estimating the camera motion between the frames and thereby registering successive frames of the video to arrive at the mosaic. Different techniques have been proposed in the literature for video mosaicing. Despite the large number of papers dealing with techniques to generate mosaics, only a few authors have investigated the conditions under which these techniques produce good estimates of the motion parameters. In this paper, these techniques are studied on different videos and the reasons for failures are identified. We propose algorithms incorporating outlier removal for better estimation of the motion parameters.
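One common route to robust motion parameters, shown here only as a generic sketch, is feature matching between consecutive frames followed by RANSAC-based homography estimation, which discards outlier matches; the frame file names below are placeholders.

import cv2
import numpy as np

frame1 = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
frame2 = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create()
kp1, des1 = orb.detectAndCompute(frame1, None)
kp2, des2 = orb.detectAndCompute(frame2, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

# RANSAC rejects outlier correspondences while estimating the homography.
H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
warped = cv2.warpPerspective(frame1, H, frame2.shape[::-1])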

Adjustment of a PET Scanner for PEPT

Positron emission particle tracking (PEPT) is a technique in which a single radioactive tracer particle can be accurately tracked as it moves. A limitation of PET is that in order to reconstruct a tomographic image it is necessary to acquire a large volume of data (millions of events), so it is difficult to study rapidly changing systems. PEPT, by contrast, is a very fast process. In PEPT, detection of both photons defines a line, and the annihilation is assumed to have occurred somewhere along this line. The location of the tracer can be determined to within a few mm from the coincident detection of a small number of pairs of back-to-back gamma rays, using triangulation. This can be achieved many times per second, and the track of a moving particle can be reliably followed. This technique was invented at the University of Birmingham [1]. The aim in PEPT is not to form an image of the tracer particle but simply to determine its location over time. If this tracer is followed for a long enough period within a closed, circulating system it explores all possible types of motion. The application of PEPT to industrial process systems carried out at the University of Birmingham falls into two categories: the behaviour of granular materials and of viscous fluids. Granular materials are processed in industry, for example in the manufacture of pharmaceuticals, ceramics, food and polymers, and PEPT has been used in a number of ways to study the behaviour of these systems [2]; it allows the possibility of tracking a single particle within the bed [3]. PEPT has also been used for studying systems such as fluid flow and viscous fluids in mixers [4], using a neutrally buoyant tracer particle [5].
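A rough sketch of the triangulation idea, finding the point closest in a least-squares sense to a small set of lines of response, is shown below with synthetic lines; this illustrates the principle only and is not the Birmingham location algorithm.

import numpy as np

def closest_point_to_lines(points, directions):
    # Least-squares point nearest to lines given by (point, unit direction).
    A, b = np.zeros((3, 3)), np.zeros(3)
    for p, d in zip(points, directions):
        P = np.eye(3) - np.outer(d, d)   # projector orthogonal to the line
        A += P
        b += P @ p
    return np.linalg.solve(A, b)

# Synthetic lines of response roughly passing near a tracer at (1, 2, 3).
rng = np.random.default_rng(4)
dirs = rng.normal(size=(20, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
pts = np.array([1.0, 2.0, 3.0]) + rng.normal(0, 0.002, (20, 3))  # mm-scale scatter
print(closest_point_to_lines(pts, dirs))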

Object Tracking using MACH filter and Optical Flow in Cluttered Scenes and Variable Lighting Conditions

The vision-based tracking problem is solved through a combination of optical flow, a MACH filter and log r-θ mapping. Optical flow is used for detecting regions of movement in video frames acquired under variable lighting conditions. The region of movement is segmented and then searched for the target. A template is used for target recognition in the segmented regions to detect the region of interest. The template is trained offline on a sequence of target images created using the MACH filter and log r-θ mapping. The template is applied to areas of movement in successive frames, and strong correlation is seen for in-class targets. Correlation peaks above a certain threshold indicate the presence of the target, and the target is tracked over successive frames.
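For illustration, dense optical flow between consecutive frames can be computed in OpenCV as sketched below (the frame sources and parameter values are placeholders); regions with large flow magnitude would then be segmented and passed to the MACH-filter correlation stage.

import cv2
import numpy as np

prev = cv2.imread("frame_010.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_011.png", cv2.IMREAD_GRAYSCALE)

# Dense Farneback optical flow with commonly used parameter values.
flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                    0.5, 3, 15, 3, 5, 1.2, 0)
magnitude = np.linalg.norm(flow, axis=2)

# Crude motion segmentation: keep pixels whose flow exceeds a threshold.
moving_mask = (magnitude > 1.0).astype(np.uint8)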

The Effect of Different Culturing Proportions of Deep Sea Water (DSW) to Surface Sea Water (SSW) on the Reductive Ability and Phenolic Composition of Sargassum cristaefolium

Characterized by rich mineral content, low temperature, few bacteria, and stability, with numerous applications in aquaculture, food, drinking water, and leisure, deep sea water (DSW) development has become a new industry worldwide. It has been reported that marine algae contain various biologically active compounds. This research focused on the effects of cultivating Sargassum cristaefolium with different proportions of deep sea water (DSW) and surface sea water (SSW). After two and four weeks, the total phenolic contents of Sargassum cristaefolium cultured under the different conditions were compared, and their reductive activity was assayed with potassium ferricyanide. The fresh seaweeds were oven-dried and ground to powder. The cultured algae were then extracted with water at 90°C for 1 h. The total phenolic contents were determined using the Folin-Ciocalteu method. The results are as follows: the highest total phenolic content and the best reductive ability were observed for the 1/4 proportion of DSW to SSW cultured for two weeks. Furthermore, the 1/2 proportion of DSW to SSW also showed good reductive ability and plentiful phenolic content. Finally, we confirmed that the proportion of DSW to SSW is the major factor affecting both the total phenolic content and the reductive ability of Sargassum cristaefolium. In the future, we will use this approach to mass-produce this marine alga or other microalgae for industrial applications.

Some Characterizations of Isotropic Curves In the Euclidean Space

Curves for which the square of the distance between two points is equal to zero are called minimal or isotropic curves [4]. In this work, necessary and sufficient conditions to be a pseudo helix, which is a special case of such curves, are first presented. Thereafter, it is proven that an isotropic curve's position vector and pseudo curvature satisfy a fourth-order vector differential equation. Additionally, in view of the solution of this equation, the position vector of pseudo helices is obtained.
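For orientation, the defining condition is commonly written as follows (a standard formulation recalled here for the reader, not quoted from the paper): a regular curve \alpha in complex Euclidean space is isotropic (minimal) when its tangent has vanishing square, i.e.

\langle \alpha'(s), \alpha'(s) \rangle = 0, \qquad \alpha'(s) \neq 0.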

Impact of MAC Layer on the Performance of Routing Protocols in Mobile Ad hoc Networks

A Mobile Ad hoc Network is an autonomous system of mobile nodes connected by multi-hop wireless links without centralized infrastructure support. As mobile communication gains popularity, the need for suitable ad hoc routing protocols will continue to grow. Efficient dynamic routing is an important research challenge in such a network. Bandwidth-constrained mobile devices use an on-demand approach in their routing protocols because of its effectiveness and efficiency. Many researchers have conducted numerous simulations comparing the performance of these protocols under varying conditions and constraints. Most of them do not consider the MAC protocols, which impact the relative performance of the routing protocols in different network scenarios. In this paper we investigate how the choice of MAC protocol affects the relative performance of ad hoc routing protocols under different scenarios. We have evaluated the performance of these protocols using NS2 simulations. Our results show that the performance of ad hoc routing protocols suffers when they are run over different MAC layer protocols.

Sperm Whale Signal Analysis: Comparison using the Auto Regressive model and the Daubechies 15 Wavelets Transform

This article presents results from analysing signals emitted by the sperm whale using a parametric approach and a Wavelet Transform. The extraction of the intrinsic characteristics of these unique signals emitted by marine mammals is still a difficult exercise for various reasons: firstly, the signals are non-stationary, and secondly, they are obstructed by interfering background noise. In this article, we compare the advantages and disadvantages of both methods: Auto Regressive models and the Wavelet Transform. These approaches serve as an alternative to the commonly used estimators based on the Fourier Transform, whose underlying hypotheses are in certain cases not sufficiently verified. These modern approaches provide effective results, particularly for the periodic tracking of the signal's characteristics and notably when the signal-to-noise ratio negatively affects signal tracking. Our objectives are twofold. Our first goal is to identify the animal through its acoustic signature: recognition of the marine mammal species and ultimately of the individual animal within the species. The second is much more ambitious and directly involves cetologists in studying the sounds emitted by marine mammals in an effort to characterize their behaviour. We are working on an approach based on recordings of marine mammal signals, with the findings derived from the Wavelet Transform. This article explores the reasons for using this approach. In addition, thanks to the use of new processors, these algorithms, once heavy in calculation time, can now be integrated in a real-time system.
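As a small sketch of the two signal representations compared here, an autoregressive fit and a Daubechies-15 wavelet decomposition can be applied to a recording as follows; the synthetic click below and the model orders are placeholders, not the recordings or settings of this study.

import numpy as np
import pywt
from statsmodels.tsa.ar_model import AutoReg

# Placeholder signal standing in for a recorded sperm whale click.
fs = 48000
t = np.arange(0, 0.05, 1 / fs)
click = np.exp(-200 * t) * np.sin(2 * np.pi * 8000 * t)
click = click + np.random.normal(0, 0.01, t.size)

# Parametric (autoregressive) representation of the signal.
ar_coeffs = AutoReg(click, lags=12).fit().params

# Daubechies-15 wavelet decomposition (multi-level DWT).
coeffs = pywt.wavedec(click, "db15", level=4)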

Selective Minterms Based Tabular Method for BDD Manipulations

The goal of this work is to describe a new algorithm, based on a tabular method, for finding the optimal variable order, the number of nodes for any order, and other ROBDD parameters. The tabular method makes use of a pre-built backend database table that stores the ROBDD size for selected combinations of minterms. The user uses the backend table and the proposed algorithm to find the necessary ROBDD parameters, such as the best variable order and the number of nodes. Experimental results on benchmarks are given for this technique.

Development of an Autonomous Greenhouse Gas Monitoring System

This paper describes the designs of a first and second generation autonomous gas monitoring system and the successful field trial of the final (second generation) system. Infrared sensing technology is used to detect and measure the greenhouse gases methane (CH4) and carbon dioxide (CO2) at point sources. The ability to monitor events in real time is further enhanced through the implementation of both GSM and Bluetooth technologies to communicate these data. The systems are robust and reliable, and a necessary tool where real-time monitoring of gas events is needed.