Optimization of Distribution Network Configuration for Loss Reduction Using Artificial Bee Colony Algorithm

Network reconfiguration in a distribution system is realized by changing the status of sectionalizing switches to reduce power loss in the system. This paper presents a new method that applies an artificial bee colony (ABC) algorithm to determine the sectionalizing switches to be operated in order to solve the distribution system loss minimization problem. The ABC algorithm is a population-based metaheuristic inspired by the intelligent foraging behavior of honeybee swarms. One advantage of the ABC algorithm is that it does not require external parameters such as the crossover and mutation rates used in genetic algorithms and differential evolution, which are hard to determine in advance. Another advantage is that its global search ability is implemented through a neighborhood source production mechanism, which is similar to a mutation process. To demonstrate the validity of the proposed algorithm, computer simulations are carried out on 14-, 33-, and 119-bus systems and compared with different approaches available in the literature. The proposed method outperforms the other methods in terms of solution quality and computational efficiency.
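The employed-bee, onlooker-bee and scout phases of the ABC algorithm, including the neighborhood source production move mentioned above, can be sketched for a generic continuous minimization problem. This is a minimal illustration, not the paper's switch-selection encoding; all parameter values (colony size, abandonment limit, iteration count) are assumptions:

```python
import random

def abc_minimize(f, dim, bounds, n_food=10, limit=20, max_iter=200, seed=1):
    """Minimal artificial bee colony (ABC) sketch for continuous minimization.

    Food sources are candidate solutions; employed and onlooker bees perturb
    one dimension toward a random neighbour source; sources unimproved for
    `limit` trials are abandoned and re-initialized at random by scouts."""
    rng = random.Random(seed)
    lo, hi = bounds
    foods = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_food)]
    fits = [f(x) for x in foods]
    trials = [0] * n_food

    def neighbour(i):
        # neighbourhood source production: v_ij = x_ij + phi * (x_ij - x_kj)
        k = rng.choice([j for j in range(n_food) if j != i])
        j = rng.randrange(dim)
        v = foods[i][:]
        v[j] += rng.uniform(-1, 1) * (foods[i][j] - foods[k][j])
        v[j] = min(hi, max(lo, v[j]))
        return v

    def try_improve(i):
        v = neighbour(i)
        fv = f(v)
        if fv < fits[i]:            # greedy selection between old and new source
            foods[i], fits[i], trials[i] = v, fv, 0
        else:
            trials[i] += 1

    for _ in range(max_iter):
        for i in range(n_food):     # employed bee phase
            try_improve(i)
        total = sum(1.0 / (1.0 + ft) for ft in fits)
        for _ in range(n_food):     # onlooker phase: roulette-wheel by fitness
            r, acc, i = rng.uniform(0, total), 0.0, 0
            for i, ft in enumerate(fits):
                acc += 1.0 / (1.0 + ft)
                if acc >= r:
                    break
            try_improve(i)
        for i in range(n_food):     # scout phase: abandon exhausted sources
            if trials[i] > limit:
                foods[i] = [rng.uniform(lo, hi) for _ in range(dim)]
                fits[i], trials[i] = f(foods[i]), 0

    best = min(range(n_food), key=lambda i: fits[i])
    return foods[best], fits[best]
```

Note how the global search comes entirely from the neighbour move and the scout restarts; no crossover or mutation rate has to be tuned.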

Application of Biogas Technology in Turkey

The potential, opportunities and drawbacks of biogas technology use in Turkey are evaluated in this paper. Turkey is dependent on foreign sources of energy. Use of biogas technology would therefore provide a safe way of waste disposal and recovery of renewable energy from a sustainable domestic source, which is less likely to be influenced by international price or political fluctuations. Biogas technology could, in particular, meet cooking, heating and electricity demand in rural areas and protect the environment, while also creating new job opportunities and improving socio-economic conditions.

Association of the p53 Codon 72 Polymorphism with Colorectal Cancer in South West of Iran

The p53 tumor suppressor gene plays two important roles in genomic stability: blocking cell proliferation after DNA damage until it has been repaired, and initiating apoptosis if the damage is too severe. The codon 72 polymorphism (Arg72Pro) in exon 4 of the p53 gene has been implicated in cancer risk. Various studies have investigated the status of the p53 codon 72 arginine (Arg) and proline (Pro) alleles in different populations, as well as the association of this polymorphism with various tumors. Our objective was to investigate the possible association between the p53 Arg72Pro polymorphism and susceptibility to colorectal cancer in the populations of Isfahan and Chaharmahal Va Bakhtiari, in the south west of Iran. We determined the p53 codon 72 genotypes (Arg/Arg, Arg/Pro and Pro/Pro) in blood samples from 145 colorectal cancer patients and 140 controls by nested PCR of p53 exon 4 followed by digestion with the BstUI restriction enzyme; the DNA fragments were then resolved by electrophoresis in a 2% agarose gel. The Pro allele remained as an undigested 279 bp fragment, while the Arg allele was cut into two fragments of 160 and 119 bp. Among the 145 colorectal cancer cases, 49 (33.79%) were homozygous for the Arg72 allele (Arg/Arg), 18 (12.41%) were homozygous for the Pro72 allele (Pro/Pro) and 78 (53.8%) were heterozygous (Arg/Pro). In conclusion, the p53 Arg/Arg genotype may be correlated with a possibly increased risk of colorectal cancer in the south west of Iran.
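The genotype counts reported for the cases (49 Arg/Arg, 78 Arg/Pro, 18 Pro/Pro) determine the allele frequencies directly by allele counting; a one-function sketch of that arithmetic:

```python
def allele_frequencies(n_argarg, n_argpro, n_propro):
    """Allele frequencies by allele counting: each homozygote carries two
    copies of its allele, each heterozygote one copy of each."""
    n = n_argarg + n_argpro + n_propro            # number of individuals
    arg = (2 * n_argarg + n_argpro) / (2 * n)     # Arg copies / total copies
    return arg, 1.0 - arg

arg_freq, pro_freq = allele_frequencies(49, 78, 18)
```

With the counts above this gives an Arg allele frequency of about 0.607 among the cases.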

Neural Network Learning Based on Chaos

Chaos and fractals are novel fields of physics and mathematics that offer a new view of the universe and many ideas for solving present-day problems. In this paper, a novel algorithm based on a chaotic sequence generator, with a strong ability to adapt and reach the global optimum, is proposed. The adaptive ability of the proposed algorithm operates in two steps: the first is a breadth-first search and the second is a depth-first search. The proposed algorithm is examined on two test functions, the Camel function and the Schaffer function. Furthermore, it is applied to optimize the training of multilayer neural networks.
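The two-step breadth-first/depth-first idea can be illustrated with a chaotic sequence from the logistic map. This is a generic sketch under the assumption of a one-dimensional search space, not the paper's algorithm; the window radius and sample counts are illustrative choices:

```python
def logistic_sequence(n, x0=0.7):
    """Chaotic sequence from the logistic map x' = 4x(1-x), dense in (0, 1)."""
    xs, x = [], x0
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
        xs.append(x)
    return xs

def chaotic_minimize(f, lo, hi, coarse=400, fine=400):
    """Two-stage chaotic search: a breadth-first scan of the whole interval,
    then a depth-first scan of a shrinking window around the incumbent."""
    # breadth-first: map chaotic samples over the full interval [lo, hi]
    best_x = min((lo + c * (hi - lo) for c in logistic_sequence(coarse)), key=f)
    # depth-first: chaotic samples in a narrow window around the current best
    radius = 0.05 * (hi - lo)
    for c in logistic_sequence(fine, x0=0.31):
        x = best_x + (2.0 * c - 1.0) * radius
        if lo <= x <= hi and f(x) < f(best_x):
            best_x = x
    return best_x
```

Because the logistic sequence is deterministic yet non-repeating, the search is reproducible while still covering the interval densely.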

Multi-View Neural Network Based Gait Recognition

Human identification at a distance has recently gained growing interest from computer vision researchers. Gait recognition aims essentially to address this problem by identifying people based on the way they walk [1]. Gait recognition has three steps: preprocessing, feature extraction and classification. This paper focuses on the classification step, which is essential for increasing the CCR (correct classification rate). A multilayer perceptron (MLP) is used in this work. Neural networks imitate the human brain to perform intelligent tasks [3]. They can represent complicated relationships between input and output and acquire knowledge about these relationships directly from the data [2]. In this paper we apply an MLP neural network to the 11 views in our database and compare the CCR values for these views. Experiments are performed with the NLPR database, and the effectiveness of the proposed method for gait recognition is demonstrated.

Negative Selection as a Means of Discovering Unknown Temporal Patterns

The temporal nature of negative selection is an under-exploited area. In a negative selection system, newly generated antibodies go through a maturing phase, and the survivors of that phase then wait to be activated by incoming antigens after a certain number of matches. Those without enough matches will age and die, while those with enough matches (i.e., those that are activated) become active detectors. A currently active detector may also age and die if it cannot find any match within a pre-defined (lengthy) period of time. Therefore, what matters in a negative selection system is the dynamics of the involved parties in the current time window, not the whole time duration, which may extend indefinitely. This property has the potential to define the uniqueness of negative selection in comparison with other approaches. On the other hand, a negative selection system is trained only with "normal" data samples. It has to learn and discover unknown "abnormal" data patterns on the fly by itself. Consequently, it is more appropriate to utilize negative selection as a system for pattern discovery and recognition rather than just pattern recognition. In this paper, we study the potential of using negative selection in discovering unknown temporal patterns.
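The censoring step at the core of negative selection — keeping only detectors that match no "normal" sample — can be sketched for binary strings with the common r-contiguous-bits matching rule. This is a textbook illustration under assumed parameters; the activation counting and aging dynamics discussed above are deliberately omitted:

```python
import random

def matches(a, b, r):
    """r-contiguous-bits rule: a detector matches a sample if the two strings
    agree on at least r consecutive positions."""
    run = best = 0
    for x, y in zip(a, b):
        run = run + 1 if x == y else 0
        best = max(best, run)
    return best >= r

def censor(self_set, n_detectors, length, r, rng):
    """Negative selection: generate random candidate detectors and keep only
    those that match NO sample in the self (normal) set."""
    detectors = []
    while len(detectors) < n_detectors:
        cand = tuple(rng.randint(0, 1) for _ in range(length))
        if not any(matches(cand, s, r) for s in self_set):
            detectors.append(cand)
    return detectors

def is_anomalous(sample, detectors, r):
    """A sample is flagged as abnormal if any surviving detector matches it."""
    return any(matches(d, sample, r) for d in detectors)
```

By construction, no detector can ever fire on a self sample, which is why the system can be trained on normal data alone.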

Possibilistic Clustering Technique-Based Traffic Light Control for Handling Emergency Vehicle

A traffic light provides security from traffic congestion, reducing traffic jams and organizing the traffic flow. Furthermore, the increasing congestion level in public road networks is a growing problem in many countries. Using intelligent transportation systems to give emergency vehicles a green light at intersections can reduce driver confusion, reduce conflicts, and improve emergency response times. Nowadays, wireless sensor network technology can solve many problems and offer good management of the crossroad. In this paper, we develop a new approach based on a clustering technique and graphical possibilistic fusion modeling. The proposed model is elaborated in three phases: the first decomposes the environment into clusters, followed by the intra-cluster and inter-cluster fusion processes. Finally, we show some experimental simulation results that prove the efficiency of our proposed approach.

Keywords: traffic light, wireless sensor network, controller, possibilistic network/Bayesian network.

Lateral and Longitudinal Vibration of a Rotating Flexible Beam Coupled with Torsional Vibration of a Flexible Shaft

In this study, a rotating flexible shaft-disk system carrying flexible beams is considered as a dynamic system. After neglecting nonlinear terms, the torsional vibration of the shaft-disk system and the lateral and longitudinal vibration of the flexible beam remain coupled through the motor speed. The system has three natural frequencies: the torsional natural frequency of the flexible shaft-disk system and the lateral and longitudinal natural frequencies of the flexible beam. Eigenvalue calculations show that as the shaft speed changes, the torsional natural frequency of the shaft-disk system and the beam longitudinal natural frequency do not change, but the beam lateral natural frequency does. The beam lateral natural frequency stays at the nonrotating value ωb until the motor speed ωm equals ωb; it then increases with, and remains equal to, the motor speed ωm until the motor speed reaches the shaft-disk system torsional natural frequency ωT. Beyond that point, the beam lateral natural frequency becomes equal to ωT and stays there as the motor speed ωm is increased further. Modal amplitudes and phase angles of the vibrations are also plotted against the motor speed ωm.
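The qualitative behaviour of the beam lateral natural frequency described above is a simple piecewise relation, sketched below under the assumption ωb < ωT (symbols as in the abstract; this is an illustration of the reported trend, not the eigenvalue computation itself):

```python
def beam_lateral_frequency(wm, wb, wT):
    """Beam lateral natural frequency as a function of motor speed wm:
    it stays at the nonrotating value wb until wm reaches wb, then tracks
    wm up to the shaft-disk torsional frequency wT, and locks at wT
    thereafter (assumes wb < wT)."""
    if wm <= wb:
        return wb      # below wb: unchanged from the nonrotating beam
    if wm <= wT:
        return wm      # between wb and wT: follows the motor speed
    return wT          # above wT: saturates at the torsional frequency
```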

RF Power Consumption Emulation Optimized with Interval Valued Homotopies

This paper presents a methodology for emulating the electrical power consumption of the RF device during cellular phone/handset transmission using LTE technology. The emulation methodology takes the physical environmental variables and the logical interface between the baseband and the RF system as inputs to compute the emulated power dissipation of the RF device. The emulated power between the measured points, corresponding to discrete values of the logical interface parameters, is computed by polynomial interpolation using polynomial basis functions. Evaluation of polynomial and spline curve-fitting models showed respective divergences (test errors) of 8% and 0.02% from the physically measured power consumption. The precisions of the instruments used for the physical measurements have been modeled as intervals. We have been able to model the power consumption of the RF device operating at 5 MHz using a homotopy between the two continuous power consumption functions of the RF device operating at bandwidths of 3 MHz and 10 MHz.
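Interpolating emulated power between discrete measured points can be illustrated with a plain Lagrange form of the interpolating polynomial. The measured values below are hypothetical, and this generic polynomial-basis evaluation stands in for whatever basis the paper actually uses:

```python
def lagrange_interpolate(points, x):
    """Evaluate the unique polynomial through `points` (a list of (x_i, y_i)
    pairs with distinct x_i) at abscissa x, using the Lagrange basis."""
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        term = yi
        for j, (xj, _) in enumerate(points):
            if j != i:
                # basis polynomial L_i(x): 1 at x_i, 0 at every other node
                term *= (x - xj) / (xi - xj)
        total += term
    return total

# hypothetical (parameter setting, measured power in mW) pairs
measured = [(0.0, 0.0), (1.0, 1.0), (2.0, 4.0)]
```

Between nodes the polynomial supplies the emulated value; at the nodes it reproduces the measurements exactly.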

Fuzzy Controlled Hydraulic Excavator with Model Parameter Uncertainty

The hydraulically actuated excavator, being a non-linear mobile machine, encounters many uncertainties. There are uncertainties in the hydraulic system in addition to the uncertain nature of the load. The simulation results obtained in this study show that there is a need for intelligent control of such machines, and in particular that an interval type-2 fuzzy controller is most suitable for minimizing the position error of a typical excavator's bucket under load variations. We consider model parameter uncertainties such as hydraulic fluid leakage and friction. These uncertainties also depend on the temperature and alter the bulk modulus and viscosity of the hydraulic fluid. Such uncertainties, together with the load variations, cause chattering of the bucket position. The interval type-2 fuzzy controller effectively eliminates the chattering and manages to control the end-effector (bucket) position with a positional error on the order of a few millimeters.

Integrating Agents and Computational Intelligence Techniques in E-learning Environments

In this contribution a newly developed e-learning environment is presented, which incorporates intelligent agents and computational intelligence techniques. The new e-learning environment consists of three parts: the e-learning platform front-end, the Student Questioner Reasoning and the Student Model Agent. These parts are distributed across geographically dispersed computer servers, with the main focus on the design and development of these subsystems through the use of new and emerging technologies. The parts are interconnected in an interoperable way, using web services for the integration of the subsystems, in order to enhance the user modelling procedure and achieve the goals of the learning process.

Using Genetic Programming to Evolve a Team of Data Classifiers

The purpose of this paper is to demonstrate the ability of a genetic programming (GP) algorithm to evolve a team of data classification models. The GP algorithm used in this work is "multigene" in nature, i.e. there are multiple tree structures (genes) that are used to represent team members. Each team member assigns a data sample to one of a fixed set of output classes. A majority vote, determined using the mode (highest occurrence) of the classes predicted by the individual genes, is used to determine the final class prediction. The algorithm is tested on a binary classification problem. For the case study investigated, compact classification models are obtained with accuracy comparable to alternative approaches.
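The majority-vote aggregation described above reduces to taking the mode of the genes' predictions. A minimal sketch, with the genes stood in by arbitrary callables (the evolved tree structures themselves are not reproduced here):

```python
from collections import Counter

def team_predict(genes, sample):
    """Majority vote over the class labels predicted by each gene.

    `genes` is a list of callables sample -> class label, standing in for the
    evolved trees; the mode (highest occurrence) of their predictions is the
    team's final answer. Ties go to the label that was voted first."""
    votes = [g(sample) for g in genes]
    return Counter(votes).most_common(1)[0][0]
```

For example, three hypothetical threshold classifiers voting on a scalar sample: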

ADABeV: Automatic Detection of Abnormal Behavior in Video-surveillance

Intelligent video-surveillance (IVS) systems are becoming more and more popular in security applications. The analysis and recognition of abnormal behaviours in a video sequence has gradually drawn attention in the field of IVS, since it allows a large amount of useless information to be filtered out, which guarantees high efficiency in security protection and saves considerable human and material resources. We present in this paper ADABeV, an intelligent video-surveillance framework for event recognition in crowded scenes to detect abnormal human behaviour. This framework is intended to achieve real-time alarming, reducing the lags in traditional monitoring systems. The proposed architecture addresses four main challenges: behaviour understanding in crowded scenes, hard lighting conditions, multiple kinds of input sensors and context-based adaptability to recognize the active context of the scene.

Performance Monitoring of the Refrigeration System with Minimum Set of Sensors

This paper describes a methodology for the remote performance monitoring of retail refrigeration systems. The proposed framework starts with monitoring of the whole refrigeration circuit, which allows detection of deviations from expected behavior caused by various faults and degradations. The subsequent diagnostic methods drill down deeper in the equipment hierarchy to determine root causes more specifically. An important feature of the proposed concept is that it does not require any additional sensors, so the performance monitoring solution can be deployed at a low installation cost. Moreover, only a minimum of contextual information is required, which also substantially reduces the time and cost of the deployment process.

Semi-Blind Two-Dimensional Code Acquisition in CDMA Communications

In this paper, we propose a new algorithm for joint time-delay and direction-of-arrival (DOA) estimation, here called two-dimensional code acquisition, in an asynchronous direct-sequence code-division multiple-access (DS-CDMA) array system. The algorithm relies on the eigenvector-eigenvalue decomposition of the sample correlation matrix and requires knowledge of the desired user's training sequence. The performance of the algorithm is analyzed both analytically and numerically in uncorrelated and coherent multipath environments. Numerical examples show that the algorithm is robust to an unknown number of coherent signals.

Knowledge Management Applied to Forensic Sciences

This paper presents initiatives of knowledge management (KM) applied to the forensic sciences field, especially those developed at the Forensic Science Institute of the Brazilian Federal Police. Successful projects related to knowledge sharing, drug analysis and environmental crimes are reported from the KM perspective. The described results relate to: a) the importance of having an information repository, such as a digital library, in such a multidisciplinary organization; b) the fight against drug dealing and environmental crimes, enabling the mapping of the evolution of crimes, drug trafficking flows, and the advance of deforestation in the Amazon rain forest. Perspectives on new KM projects under development and study are also presented, tracing an evolution line of the KM view at the Forensic Science Institute.

Array Signal Processing: DOA Estimation for Missing Sensors

Array signal processing involves signal enumeration and source localization. It centers on the ability to fuse the temporal and spatial information captured by sampling, at the sensors of an array, signals emitted from a number of sources, in order to carry out a specific estimation task: estimating source characteristics (mainly the localization of the sources) and/or array characteristics (mainly the array geometry). Beamforming is a general signal processing technique used to control the directionality of the reception or transmission of a signal; using beamforming we can direct the majority of the signal energy received by the array. Multiple signal classification (MUSIC) is a highly popular eigenstructure-based method for high-resolution estimation of the direction of arrival (DOA). This paper examines the effect of missing sensors on DOA estimation. The accuracy of MUSIC-based DOA estimation is degraded significantly both by missing sensors among the receiving array elements and by unequal channel gain and phase errors of the receiver.
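The missing-sensor setup can be illustrated without the full MUSIC eigendecomposition: the sketch below uses a simple delay-and-sum (steered-response) scan over a uniform linear array, modeling a failed sensor by zeroing its element. This is a lighter stand-in for the eigenstructure search, not the paper's MUSIC implementation; array size and spacing are assumptions:

```python
import cmath
import math

def steering(n, d, theta, missing=()):
    """Narrowband steering vector of an n-element uniform linear array with
    element spacing d (in wavelengths) for a source at angle theta (radians).
    Elements listed in `missing` are zeroed to emulate failed sensors."""
    return [0j if i in missing else
            cmath.exp(-2j * math.pi * i * d * math.sin(theta))
            for i in range(n)]

def doa_scan(snapshot, d, grid, missing=()):
    """Delay-and-sum DOA scan: steer the array across the angle `grid` and
    return the angle whose steered output has maximum power."""
    n = len(snapshot)

    def power(theta):
        a = steering(n, d, theta, missing)
        y = sum(ai.conjugate() * xi for ai, xi in zip(a, snapshot))
        return abs(y)

    return max(grid, key=power)
```

In the noiseless half-wavelength-spacing case a single failed sensor only lowers the peak; with noise and closely spaced sources, the lost aperture and raised sidelobes are what degrade the estimate, mirroring the degradation reported for MUSIC.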

Deep iCrawl: An Intelligent Vision-Based Deep Web Crawler

The explosive growth of the World Wide Web has posed a challenging problem in extracting relevant data. Traditional web crawlers focus only on the surface web, while the deep web keeps expanding behind the scenes. Deep web pages are created dynamically as a result of queries posed to specific web databases, and their structure makes it impossible for traditional web crawlers to access deep web contents. This paper presents Deep iCrawl, a novel vision-based approach for extracting data from the deep web. Deep iCrawl splits the process into two phases: the first includes query analysis and query translation, and the second covers the vision-based extraction of data from the dynamically created deep web pages. There are several established approaches for the extraction of deep web pages, but the proposed method aims at overcoming their inherent limitations. This paper also aims at comparing the data items and presenting them in the required order.

Fungal Leaching of Hazardous Heavy Metals from a Spent Hydrotreating Catalyst

In this study, the ability of Aspergillus niger and Penicillium simplicissimum to extract heavy metals from a spent refinery catalyst was investigated. As a first step, a spent processing catalyst from one of the oil refineries in Iran was physically and chemically characterized. Aspergillus niger and Penicillium simplicissimum were used to mobilize Al/Co/Mo/Ni from the hazardous spent catalyst. The fungi were adapted to the mixture of metals at 100-800 mg L-1, with increments in concentration of 100 mg L-1. Bioleaching experiments were carried out in batch cultures. To investigate the production of organic acids in the sucrose medium, the culture medium was analyzed by HPLC at specific time intervals after inoculation. The results obtained by inductively coupled plasma-optical emission spectrometry (ICP-OES) showed that after the one-step bioleaching process using Aspergillus niger, maximum removal efficiencies of 27%, 66%, 62% and 38% were achieved for Al, Co, Mo and Ni, respectively. The highest removal efficiencies using Penicillium simplicissimum were 32%, 67%, 65% and 38% for Al, Co, Mo and Ni, respectively.

Intelligent Agent Communication by Using DAML to Build Agent Community Ontology

This paper presents a new approach to intelligent agent communication based on an ontology for an agent community. The DARPA agent markup language (DAML) is used to build the community ontology. This paper extends the agent management specification of the Foundation for Intelligent Physical Agents (FIPA) to develop an agent role called the community facilitator (CF), which manages the community directory and community ontology. The CF helps build the agent community, so that a precise description of agent services in the community can be achieved, which facilitates agent communication. Furthermore, through ontology updates, agents with different ontologies are capable of communicating with each other. An example of an advanced traveler information system is included to illustrate the practicality of this approach.