Evaluation of the Energy Consumption per Bit in BENES Optical Packet Switch

We evaluate the average energy consumption per bit in Optical Packet Switches equipped with a BENES switching fabric realized in Semiconductor Optical Amplifier (SOA) technology. We also study the impact that the Amplified Spontaneous Emission (ASE) noise generated by a transmission system has on the power consumption of the BENES switches, due to the gain saturation of the SOAs used to realize the switching fabric. As an example, for 32×32 switches supporting 64 wavelengths and an offered traffic of 0.8, the average energy consumption per bit is 2.34·10^-1 nJ/bit, and it increases as the ASE noise introduced by the transmission systems increases.
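As a rough, illustrative sanity check (not a reproduction of the figures above), the energy per bit can be estimated by dividing the power drawn by the switching fabric by the aggregate carried bit rate. The per-SOA power, line rate and load in the sketch below are assumptions chosen for illustration only.

```python
# Illustrative back-of-the-envelope estimate of energy per bit for an NxN
# Benes-based optical packet switch built from 2x2 SOA gates.
# The per-SOA power, line rate and wavelength count below are assumptions
# for illustration only, not the figures used in the paper.
from math import log2

def energy_per_bit(N=32, wavelengths=64, line_rate_gbps=10.0,
                   load=0.8, soa_power_w=0.1):
    # A rearrangeably non-blocking Benes network of 2x2 elements needs
    # (N/2) * (2*log2(N) - 1) switching elements.
    n_elements = (N // 2) * int(2 * log2(N) - 1)
    total_power_w = n_elements * soa_power_w          # switching fabric only
    throughput_bps = N * wavelengths * line_rate_gbps * 1e9 * load
    return total_power_w / throughput_bps             # joules per bit

print(f"{energy_per_bit() * 1e9:.3e} nJ/bit")
```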

Development of Knowledge Portal using Open Source Tools: A Case Study of FIIT, UNISEL

A knowledge sharing culture contributes to a positive working environment. Currently, there is no platform for the academic staff of the Faculty of Industrial Information Technology (FIIT), Unisel to share knowledge among themselves. Sharing is done manually, through common meetings or offline discussions, and there is no repository for future retrieval. With an open source solution, the development of a knowledge-based application may reduce the cost tremendously. In this paper we discuss the domain for which this knowledge portal is developed and the deployment of open source tools such as JOOMLA, the PHP programming language and MySQL. This knowledge portal is evidence that open source tools are also reliable for developing knowledge-based portals. These recommendations will be useful to the open source community in producing more open source products in future.

Effective Defect Prevention Approach in Software Process for Achieving Better Quality Levels

Defect prevention is the most vital but habitually neglected facet of software quality assurance in any project. If applied at all stages of software development, it can reduce the time, overheads and resources required to engineer a high quality product. The key challenge for an IT industry is to engineer a software product with minimum post-deployment defects. This work is an analysis based on data obtained for five selected projects from leading software companies of varying software production competence. The main aim of this paper is to provide information on various methods and practices supporting defect detection and prevention, leading to successful software production. The defect prevention technique unearths 99% of defects. Inspection is found to be an essential technique for producing near-ideal software in software factories through enhanced methodologies of aided and unaided inspection schedules. On average, 13% to 15% of the whole project effort time spent on inspection and 25% to 30% spent on testing is required for 99% to 99.75% defect elimination. A comparison of the end results for the five selected projects across the companies is also presented, throwing light on how a particular company can position itself with an appropriate complementary ratio of inspection to testing.

Simulation of Particle Damping under Centrifugal Loads

Particle damping is a technique to reduce structural vibrations by placing small metallic particles inside a cavity that is attached to the structure at locations of high vibration amplitude. In this paper, we present an analytical model to simulate the particle damping of two-dimensional transient vibrations in structures operating under high centrifugal loads. The simulation results show that this technique remains effective as long as the ratio of the dynamic acceleration of the structure to the applied centrifugal load is more than 0.1. Particle damping increases with the particle-to-structure mass ratio. However, unlike the case of particle damping in the absence of centrifugal loads, where the damping efficiency depends strongly on the size of the cavity, here this dependence becomes very weak. Despite the simplicity of the model, the simulation results are in good agreement with the very scarce experimental data available in the literature for particle damping under centrifugal loads.

A Frugal Bidding Procedure for Replicating WWW Content

Fine-grained data replication over the Internet allows duplication of frequently accessed data objects, as opposed to entire sites, at certain locations so as to improve the performance of large-scale content distribution systems. In a distributed system, agents representing their sites try to maximize their own benefit, since they are driven by different goals such as minimizing their communication costs, latency, etc. In this paper, we use game theoretical techniques, and in particular auctions, to identify a bidding mechanism that encapsulates the selfishness of the agents while keeping a controlling hand over them. In essence, the proposed game theory based mechanism is the study of what happens when independent agents act selfishly and how to control them to maximize the overall performance. A bidding mechanism asks how one can design systems so that the agents' selfish behavior results in the desired system-wide goals. Experimental results reveal that this mechanism provides excellent solution quality while maintaining fast execution time. The comparisons are recorded against some well known techniques such as greedy, branch and bound, game theoretical auctions and genetic algorithms.
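As a hedged illustration of the kind of auction such a bidding mechanism can build on (not the paper's exact mechanism), the sketch below runs a sealed-bid second-price (Vickrey) auction in which each site bids the access-cost saving it expects from hosting a replica of an object; the cost model and site data are invented.

```python
# Hypothetical sketch of an auction round for fine-grained replica placement:
# each site bids the access-cost saving it expects from hosting a replica of
# an object; the object is awarded to the highest bidder, who pays the
# second-highest bid (Vickrey auction), which makes truthful bidding rational.
# The cost model and site data are invented for illustration.

def bid(site, obj):
    # saving = (local read rate) x (cost of fetching the object remotely)
    return site["reads"][obj] * site["remote_cost"]

def run_auction(obj, sites):
    bids = sorted(((bid(s, obj), s["name"]) for s in sites), reverse=True)
    winner = bids[0][1]
    price = bids[1][0] if len(bids) > 1 else 0.0   # second-price payment
    return winner, price

sites = [
    {"name": "A", "remote_cost": 4.0, "reads": {"obj1": 120}},
    {"name": "B", "remote_cost": 2.5, "reads": {"obj1": 300}},
    {"name": "C", "remote_cost": 6.0, "reads": {"obj1": 40}},
]
print(run_auction("obj1", sites))   # ('B', 480.0) -> B hosts the replica
```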

Coordination for Synchronous Cooperative Systems Based on Fuzzy Causal Relations

Synchronous cooperative systems (SCS) bring together users who are geographically distributed and connected through a network to carry out a task. Examples of SCS include tele-immersion and tele-conferences. In SCS, coordination is the core of the system, and it has been defined as the act of managing interdependencies between activities performed to achieve a goal. Some of the main problems that SCS present concern the management of constraints between simultaneous activities and the execution ordering of these activities. In order to resolve these problems, orderings based on Lamport's happened-before relation have been used, namely causal, Δ-causal, and causal-total orderings. They mainly differ in the degree of asynchronous execution allowed. One of the most important orderings is the causal order, which establishes that events must be seen in the cause-effect order in which they occur in the system. In this paper we show that for certain SCS (e.g. videoconferences, tele-immersion) where some degradation of the system is allowed, enforcing the causal order is still too rigid, which can negatively affect the system. We illustrate how a more relaxed ordering, which we call Fuzzy Causal Order (FCO), is useful for such systems by allowing a more asynchronous execution than the causal order. The benefit of the FCO is illustrated by applying it to a particular scenario of intermedia synchronization in an audio-conference system.
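For intuition only, the sketch below contrasts the classic vector-clock causal delivery condition with a loosened variant that tolerates a bounded causal gap; the tolerance rule is an invented illustration of relaxing causal order, not the paper's definition of Fuzzy Causal Order.

```python
# Illustrative only: classic causal delivery with vector clocks, plus a
# relaxed check that tolerates a bounded causal gap. This conveys the idea
# of loosening causal order; it is NOT the paper's exact FCO definition.

def causally_deliverable(msg_vc, sender, local_vc):
    """Standard causal order: deliver m from `sender` iff
    msg_vc[sender] == local_vc[sender] + 1 and
    msg_vc[k] <= local_vc[k] for every other process k."""
    return (msg_vc[sender] == local_vc[sender] + 1 and
            all(msg_vc[k] <= local_vc[k] for k in msg_vc if k != sender))

def fuzzily_deliverable(msg_vc, sender, local_vc, tolerance=1):
    """Relaxed rule: allow delivery even if up to `tolerance` causally
    preceding messages per process have not yet been seen."""
    return (msg_vc[sender] <= local_vc[sender] + 1 + tolerance and
            all(msg_vc[k] <= local_vc[k] + tolerance
                for k in msg_vc if k != sender))

local = {"p1": 3, "p2": 5}
msg   = {"p1": 4, "p2": 6}          # depends on one p2 message not seen yet
print(causally_deliverable(msg, "p1", local))   # False -> must be buffered
print(fuzzily_deliverable(msg, "p1", local))    # True  -> delivered earlier
```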

Effect of Incorporating Silica Fume in Fly Ash Geopolymers

This paper presents the results of an experimental study performed to investigate the effect of incorporating silica fume on the physico-mechanical properties and durability of the resulting fly ash geopolymers. Geopolymer specimens were prepared by activating fly ash incorporating additional silica fume in the range of 2.5% to 5% with a mixture of sodium hydroxide and sodium silicate solution having an Na2O content of 8%. To study durability, the specimens were immersed in 10% magnesium sulphate solution for up to 15 weeks, during which visual observation, weight changes and strength changes were monitored regularly. The addition of silica fume lowers the performance of geopolymer pastes. In mortars, however, the addition of silica fume significantly enhanced the physico-mechanical properties and durability.

Adaptive Notch Filter for Harmonic Current Mitigation

This paper presents an effective technique for harmonic current mitigation using an adaptive notch filter (ANF) to estimate current harmonics. The proposed filter consists of multiple ANF units connected in a parallel structure; each unit is governed by two ordinary differential equations. The frequency estimation is carried out based on the output of these units. The simulation and experimental results show the ability of the proposed tracking scheme to accurately estimate harmonics. The proposed filter was implemented digitally on a TMS320F2808 and used in the control of a hybrid active power filter (HAPF). The theoretical expectations are verified and demonstrated experimentally.
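A minimal sketch of one ANF unit of the common two-ODE form (e.g. the Mojiri-type filter) is given below; the gains, test signal and forward-Euler step are assumptions for illustration and do not correspond to the paper's parameters or its TMS320F2808 implementation.

```python
import math

# A minimal sketch of one adaptive-notch-filter unit of the common two-ODE
# form (e.g. the Mojiri-type ANF); the gains, input signal and Euler step
# are assumptions for illustration, not the paper's implementation.
#   x'' + theta^2 * x = 2*zeta*theta*e,   theta' = -gamma*x*theta*e,
#   e = u - x'   (x' tracks the sinusoid, theta tracks its angular frequency)

def anf_track(u, dt=1e-4, zeta=0.7, gamma=2000.0, theta0=2*math.pi*45):
    x, xdot, theta = 0.0, 0.0, theta0
    freqs = []
    for uk in u:
        e = uk - xdot
        xddot = 2*zeta*theta*e - theta**2 * x
        x     += dt * xdot
        xdot  += dt * xddot
        theta += dt * (-gamma * x * theta * e)
        freqs.append(theta / (2*math.pi))
    return freqs

t = [k*1e-4 for k in range(20000)]                 # 2 s of a 50 Hz signal
u = [math.sin(2*math.pi*50*tk) for tk in t]
print(f"estimated frequency ~ {anf_track(u)[-1]:.2f} Hz")   # ~50 Hz after convergence
```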

Evaluation of Graph-based Analysis for Forest Fire Detections

Spatial outliers in remotely sensed imagery are observed quantities showing unusual values compared to their neighboring pixel values. Various methods exist to detect spatial outliers based on spatial autocorrelation in statistics and data mining. These methods may be applied to detect forest fire pixels in MODIS imagery from NASA's AQUA satellite, because forest fire detection can be cast as finding spatial outliers using the spatial variation of brightness temperature. This point is what distinguishes our approach from traditional fire detection methods. In this paper, we propose a graph-based forest fire detection algorithm based on spatial outlier detection methods and test the proposed algorithm to evaluate its applicability. For this, the ordinary scatter plot and Moran's scatter plot were used. To evaluate the proposed algorithm, the results were compared with the MODIS fire product provided by the NASA MODIS Science Team, which showed the potential of the proposed algorithm for detecting fire pixels.
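The sketch below conveys the underlying spatial-outlier idea (compare each pixel's standardized brightness temperature with the mean of its neighbours and flag large deviations); the grid values and threshold are invented and the graph-based algorithm itself is not reproduced.

```python
# Illustrative sketch of the spatial-outlier idea behind the ordinary and
# Moran scatter plots: compare each pixel's (standardized) brightness
# temperature with the average of its 4-neighbours and flag pixels whose
# difference is unusually large. Grid values and threshold are invented.

def spatial_outliers(grid, threshold=3.0):
    rows, cols = len(grid), len(grid[0])
    vals = [v for row in grid for v in row]
    mean = sum(vals) / len(vals)
    std = (sum((v - mean) ** 2 for v in vals) / len(vals)) ** 0.5
    z = [[(grid[i][j] - mean) / std for j in range(cols)] for i in range(rows)]

    outliers = []
    for i in range(rows):
        for j in range(cols):
            nbrs = [z[i + di][j + dj]
                    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1))
                    if 0 <= i + di < rows and 0 <= j + dj < cols]
            lag = sum(nbrs) / len(nbrs)            # spatial lag (neighbour mean)
            if abs(z[i][j] - lag) > threshold:     # far from its neighbourhood
                outliers.append((i, j))
    return outliers

# A small brightness-temperature grid (K) with one hot "fire" pixel at (1, 2).
bt = [[300, 301, 299, 300],
      [302, 300, 340, 301],
      [299, 300, 301, 300],
      [300, 299, 300, 302]]
print(spatial_outliers(bt))   # expected: [(1, 2)]
```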

A Preliminary Study on the Suitability of Data Driven Approach for Continuous Water Level Modeling

Reliable water level forecasts are particularly important for warning against dangerous floods and inundation. The current study aims at investigating the suitability of the adaptive network-based fuzzy inference system for continuous water level modeling. A hybrid learning algorithm, which combines the least squares method and the back propagation algorithm, is used to identify the parameters of the network. For this study, water level data are available for the hydrological year 2002 with a sampling interval of 1 hour. The number of antecedent water levels that should be included in the input variables is determined by two statistical methods, i.e. the autocorrelation function and the partial autocorrelation function between the variables. Forecasting was done for 1 hour up to 12 hours ahead in order to compare the models' generalization at higher horizons. The results demonstrate that the adaptive network-based fuzzy inference system model can be applied successfully and provides high accuracy and reliability for river water level estimation. In general, the adaptive network-based fuzzy inference system provides accurate and reliable water level prediction for 1 hour ahead, where MAPE = 1.15% and a correlation of 0.98 were achieved. Up to 12 hours ahead, the model still shows relatively good performance, with a prediction error of less than 9.65%. The information gathered from these preliminary results provides useful guidance and reference for the design of flood early warning systems in which the magnitude and the timing of a potential extreme flood are indicated.
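Two generic ingredients mentioned above, choosing the number of antecedent water levels from the autocorrelation function and scoring forecasts with MAPE, are sketched below with an invented series, cut-off and naive forecast; the ANFIS model itself is not reproduced.

```python
# Sketch of two generic ingredients mentioned in the abstract: picking the
# number of antecedent water levels from the autocorrelation function, and
# scoring forecasts with MAPE. The series, the ACF cut-off and the naive
# persistence "forecast" are placeholders, not the paper's ANFIS model.

def acf(x, lag):
    n = len(x)
    mean = sum(x) / n
    num = sum((x[t] - mean) * (x[t - lag] - mean) for t in range(lag, n))
    den = sum((v - mean) ** 2 for v in x)
    return num / den

def n_antecedent_levels(series, max_lag=12, cutoff=0.2):
    # keep adding lagged inputs while the autocorrelation stays above cutoff
    lags = [lag for lag in range(1, max_lag + 1) if acf(series, lag) > cutoff]
    return max(lags) if lags else 1

def mape(observed, predicted):
    return 100.0 / len(observed) * sum(
        abs((o - p) / o) for o, p in zip(observed, predicted))

levels = [2.1, 2.3, 2.6, 2.9, 3.1, 3.0, 2.8, 2.5, 2.3, 2.2, 2.4, 2.7, 3.0, 3.2]
print("antecedent levels:", n_antecedent_levels(levels, max_lag=4))
print("MAPE of persistence forecast: %.2f%%" % mape(levels[1:], levels[:-1]))
```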

Analysis of Fractional Flow of Water versus Cumulative Oil Recoveries Using the Buckley-Leverett Method

To derive the fractional flow equation, oil displacement is assumed to take place under the so-called diffusive flow condition. The constraint is that fluid saturations at any point in the linear displacement path are uniformly distributed with respect to thickness, which allows the displacement to be described mathematically in one dimension. The simultaneous flow of oil and water can be modeled using thickness-averaged relative permeabilities along the centerline of the reservoir. The condition for fluid potential equilibrium is simply that of hydrostatic equilibrium, for which the saturation distribution can be determined as a function of capillary pressure and, therefore, height; that is, the fluids are distributed in accordance with capillary-gravity equilibrium. This paper focuses on the fractional flow of water versus cumulative oil recovery using the Buckley-Leverett method. Several field cases have been developed to aid the analysis. The producing water cut (at surface conditions) is compared with the cumulative oil recovery at breakthrough for the flowing fluid.
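A sketch of the classical fractional-flow relation and Welge's tangent construction is given below, using illustrative Corey-type relative permeability curves and neglecting gravity and capillary terms; the fluid and rock parameters are invented and are not the field cases analysed in the paper.

```python
# A sketch of the classical Buckley-Leverett / Welge construction with
# illustrative Corey-type relative permeability curves (gravity and capillary
# pressure neglected); the fluid and rock parameters are invented, not the
# field cases analysed in the paper.

MU_W, MU_O = 0.5, 2.0          # viscosities, cp (assumed)
SWC, SOR = 0.2, 0.2            # connate water / residual oil saturations

def krw(sw):                   # Corey water curve (assumed exponents)
    s = (sw - SWC) / (1 - SWC - SOR)
    return 0.4 * s ** 3

def kro(sw):                   # Corey oil curve
    s = (1 - sw - SOR) / (1 - SWC - SOR)
    return 0.9 * s ** 2

def fw(sw):                    # fractional flow of water
    return 1.0 / (1.0 + (kro(sw) / krw(sw)) * (MU_W / MU_O))

def breakthrough():
    """Welge tangent from (SWC, 0): the front saturation Swf maximizes the
    chord slope fw(Sw)/(Sw - SWC); extrapolating the tangent to fw = 1 gives
    the average saturation, and recovery at breakthrough is Sw_avg - SWC."""
    best = max((fw(sw) / (sw - SWC), sw)
               for sw in [SWC + 0.001 * k for k in range(1, 600)])
    slope, swf = best
    sw_avg = swf + (1.0 - fw(swf)) / slope
    return swf, fw(swf), sw_avg - SWC

swf, fwf, npd = breakthrough()
print(f"front saturation {swf:.2f}, water cut {fwf:.2f}, "
      f"recovery at breakthrough {npd:.2f} PV")
```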

IVE: Virtual Humans AI Prototyping Toolkit

The IVE toolkit has been created to facilitate research, education and development in the field of virtual storytelling and computer games. Primarily, the toolkit is intended for modelling action selection mechanisms of virtual humans, investigating level-of-detail AI techniques for large virtual environments, and exploring joint behaviour and the role-passing technique (Sec. V). Additionally, the toolkit can be used as AI middleware without any changes. The main facility of IVE is that it serves for prototyping both the AI and the virtual worlds themselves. The purpose of this paper is to describe IVE's features in general and to present our current work, including an educational game, on this platform. Keywords: AI middleware, simulation, virtual world.

Cold-pressed Kenaf and Fibreglass Hybrid Composite Laminates: Effect of Fibre Types

Natural fibres have emerged as a potential reinforcement material for composites and have thus attracted many researchers. This is mainly due to their benefits: low density, low cost, renewability, biodegradability and environmental harmlessness, together with mechanical properties comparable to those of synthetic fibre composites. The properties of hybrid composites depend strongly on several factors, including the interaction of the fillers with the polymeric matrix and the shape, size (aspect ratio) and orientation of the fillers [1]. In this study, natural fibre kenaf composites and kenaf/fibreglass hybrid composites were fabricated by a combination of hand lay-up and cold-press methods. The effect of different fibre types (powder, short and long) on the tensile properties of the composites is investigated. The kenaf composites with and without the addition of fibreglass were then characterized by tensile testing and scanning electron microscopy. A significant improvement in tensile strength and modulus was observed for the long kenaf/woven fibreglass hybrid composite, whereas the opposite trend was observed for the kenaf powder composite. Fractographic observation shows that fibre/matrix debonding causes fibre pull-out, which results in fibre and matrix fracture.

Thermal and Morphological Evaluation of Chemically Pretreated Sugarcane Bagasse

Enzymatic hydrolysis is one of the major steps involved in the conversion of sugarcane bagasse to ethanol. This process offers the potential for higher yields and selectivity, lower energy costs and milder operating conditions than chemical processes. However, factors such as lignin content, the degree of crystallinity of the cellulose and particle size limit the digestibility of the cellulose present in lignocellulosic biomasses. Pretreatment aims to improve the access of the enzyme to the substrate. In this study, sugarcane bagasse was submitted to a chemical pretreatment consisting of two consecutive steps, the first with dilute sulfuric acid (1% (v/v) H2SO4) and the second with alkaline solutions with different concentrations of NaOH (1, 2, 3 and 4% (w/v)). Thermal analysis (TG/DTG and DTA) was used to evaluate the hemicellulose, cellulose and lignin contents of the samples. Scanning electron microscopy (SEM) was used to evaluate the morphological structures of the in natura and chemically treated samples. The results showed that the pretreatments were effective in the chemical degradation of the lignocellulosic materials of the samples, and the morphological changes occurring in the biomass after pretreatment could also be observed.

On the Properties of Pseudo Noise Sequences with a Simple Proposal of Randomness Test

Maximal length sequences (m-sequences) are also known as pseudo random or pseudo noise sequences because they closely follow Golomb's popular randomness properties: (P1) balance, (P2) run, and (P3) ideal autocorrelation. Apart from these, there also exist certain other, less known properties of such sequences, all of which are discussed in this tutorial paper. Comprehensive proofs of each of these properties are provided towards a better understanding of such sequences. A simple test is also proposed at the end of the paper in order to distinguish pseudo noise sequences from truly random sequences such as Bernoulli sequences.
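As an illustration (not the test proposed in the paper), the sketch below generates an m-sequence of period 31 from the primitive polynomial x^5 + x^2 + 1 and checks the balance, run and autocorrelation properties.

```python
# A sketch that generates an m-sequence of period 2^5 - 1 = 31 from the
# primitive polynomial x^5 + x^2 + 1 (recurrence a[n] = a[n-3] XOR a[n-5])
# and checks Golomb's properties (P1)-(P3). Illustrative only; it is not the
# randomness test proposed in the paper.
from itertools import groupby

def m_sequence(length=31, seed=(1, 0, 0, 0, 0)):
    reg = list(seed)                    # reg = [a[n-1], a[n-2], ..., a[n-5]]
    out = []
    for _ in range(length):
        new = reg[2] ^ reg[4]           # a[n] = a[n-3] XOR a[n-5]
        out.append(new)
        reg = [new] + reg[:-1]
    return out

def autocorrelation(seq, shift):
    bipolar = [1 - 2 * b for b in seq]  # map 0/1 -> +1/-1
    n = len(seq)
    return sum(bipolar[i] * bipolar[(i + shift) % n] for i in range(n))

seq = m_sequence()
ones, zeros = sum(seq), len(seq) - sum(seq)
runs = [len(list(g)) for _, g in groupby(seq)]
print("P1 balance:", ones, "ones vs", zeros, "zeros")                 # 16 vs 15
print("P2 run lengths:", sorted(runs, reverse=True))
print("P3 autocorrelation (non-zero shifts):",
      sorted(set(autocorrelation(seq, s) for s in range(1, len(seq)))))  # [-1]
```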

Space Vector Pulse Width Modulation Technique Based Design and Simulation of a Three-Phase Voltage Source Converter System

A Space Vector based Pulse Width Modulation control technique for the three-phase PWM converter is proposed in this paper. The proposed control scheme is based on a synchronous reference frame model. High performance and efficiency are obtained with regard to the DC bus voltage and the power factor of the PWM rectifier, thus leading to low losses. MATLAB/SIMULINK is used as the platform for the simulations, and a SIMULINK model is presented in the paper. The results show that the proposed model demonstrates better performance and properties compared to the traditional SPWM method and drastically improves the dynamic performance of the closed loop. In the modulation scheme, a sine signal is the reference waveform and a triangle waveform is the carrier. When the sine signal is larger than the triangle signal, the output pulse goes high; when the triangle signal is higher than the sine signal, the pulse goes low. The PWM output is changed by varying the modulation index and the frequency used in the system so as to produce more pulses. The more pulses produced, the lower the harmonic content of the output voltage and the higher the resolution.
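The sine-triangle comparison described above can be sketched as follows; the 50 Hz reference, 1.5 kHz carrier and modulation index are assumptions for illustration, not the parameters of the simulated converter.

```python
import math

# A minimal sketch of the sine-triangle comparison described above: the gate
# signal is high while the sinusoidal reference exceeds the triangular
# carrier. The 50 Hz reference, 1.5 kHz carrier and modulation index are
# assumptions for illustration, not the simulated converter's parameters.

def triangle(t, f_carrier):
    """Symmetric triangle wave in [-1, 1]."""
    phase = (t * f_carrier) % 1.0
    return 4 * phase - 1 if phase < 0.5 else 3 - 4 * phase

def spwm(t, m_index=0.8, f_ref=50.0, f_carrier=1500.0):
    reference = m_index * math.sin(2 * math.pi * f_ref * t)
    return 1 if reference > triangle(t, f_carrier) else 0

# Duty cycle averaged over one fundamental period, sampled finely; for
# sinusoidal modulation it should come out close to 0.5.
samples = [spwm(k * 1e-6) for k in range(20000)]     # 20 ms at 1 us steps
print("average duty cycle:", sum(samples) / len(samples))
```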

Approach to Implementation of Power Management with Load Prioritizations in Modern Civil Aircraft

Any use of energy in industrial productive activities is accompanied by various environmental impacts. Within transportation, this holds not only for land transport, railways and maritime transport, but also for the air transport industry. Effective climate protection requires strategies and measures for reducing all greenhouse gas emissions, in particular carbon dioxide, and must take into account economic, ecological and social aspects. It now seems imperative to develop and manufacture environmentally friendly products and systems, to reduce consumption and resource use, and to save energy and power. Today's products could better serve these requirements by integrating a power management system into the electrical power system. This paper gives an overview of an approach to power management with load prioritization in modern aircraft. Load dimensioning and load management strategies on current civil aircraft are presented and used as a basis for the proposed approach.
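As a loose illustration of load prioritization (not the aircraft systems or strategies discussed in the paper), the sketch below sheds loads from the lowest priority upward whenever the available generator power cannot cover the dimensioned demand; the load table and power figures are invented.

```python
# An illustrative sketch of priority-based load shedding: when available
# generator power is insufficient, loads are shed from the lowest priority
# upward until the remaining demand fits. The load list, priorities and
# power figures are invented, not those of an actual aircraft.

def manage_loads(loads, available_kw):
    """loads: list of (name, power_kw, priority); a lower priority number
    means a more essential load. Returns the loads kept powered and the
    total power they draw."""
    kept, used = [], 0.0
    for name, power, prio in sorted(loads, key=lambda l: l[2]):
        if used + power <= available_kw:     # most essential loads first
            kept.append(name)
            used += power
    return kept, used

loads = [
    ("flight controls", 12.0, 1),
    ("avionics",         8.0, 1),
    ("fuel pumps",        6.0, 2),
    ("galley ovens",     15.0, 4),
    ("cabin lighting",    5.0, 3),
]
print(manage_loads(loads, available_kw=32.0))   # galley ovens are shed
```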

Comparative Evaluation of Color-Based Video Signatures in the Presence of Various Distortion Types

The robustness of color-based signatures in the presence of a selection of representative distortions is investigated. Considered are five signatures that have been developed and evaluated within a new modular framework. Two signatures presented in this work are directly derived from histograms gathered from video frames. The other three signatures are based on temporal information by computing difference histograms between adjacent frames. In order to obtain objective and reproducible results, the evaluations are conducted based on several randomly assembled test sets. These test sets are extracted from a video repository that contains a wide range of broadcast content including documentaries, sports, news, movies, etc. Overall, the experimental results show the adequacy of color-histogram-based signatures for video fingerprinting applications and indicate which type of signature should be preferred in the presence of certain distortions.
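The two signature families described above can be sketched as follows; the 8-bins-per-channel quantisation, the L1 matching distance and the tiny synthetic frames are assumptions for illustration, not the configuration evaluated in the paper.

```python
# A sketch of the two signature families described above: a frame-level
# color histogram and a temporal signature built from difference histograms
# of adjacent frames. The 8-bins-per-channel quantisation and L1 matching
# are assumptions, not the exact configuration evaluated in the paper.

def color_histogram(frame, bins=8):
    """frame: iterable of (r, g, b) pixels with 0-255 channels."""
    hist = [0] * (bins ** 3)
    step = 256 // bins
    n = 0
    for r, g, b in frame:
        hist[(r // step) * bins * bins + (g // step) * bins + (b // step)] += 1
        n += 1
    return [h / n for h in hist]                 # normalised histogram

def difference_signature(frames, bins=8):
    """Temporal signature: histogram differences between adjacent frames."""
    hists = [color_histogram(f, bins) for f in frames]
    return [[b - a for a, b in zip(h1, h2)] for h1, h2 in zip(hists, hists[1:])]

def l1_distance(sig_a, sig_b):
    return sum(abs(x - y) for da, db in zip(sig_a, sig_b)
               for x, y in zip(da, db))

# Two tiny synthetic "videos" of 2x2-pixel frames, the second mildly distorted.
video = [[(10, 10, 10)] * 4, [(10, 200, 10)] * 4, [(200, 200, 10)] * 4]
noisy = [[(12, 9, 11)] * 4, [(11, 198, 12)] * 4, [(199, 201, 13)] * 4]
print("distance original vs. distorted:",
      l1_distance(difference_signature(video), difference_signature(noisy)))
```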

Efficient Time Synchronization in Wireless Sensor Networks

Energy efficiency is the key requirement in wireless sensor networks, as sensors are small, cheap and deployed in very large numbers over a large geographical area, so replacing the batteries of the sensors once deployed is not an option. Different techniques can be used for energy-efficient transmission, including multi-hop algorithms, collaborative communication, cooperative communication, beamforming, routing algorithms, and phase, frequency and time synchronization. This paper reviews the need for time synchronization and proposes a BFS-based synchronization algorithm to achieve energy efficiency. The efficiency of our protocol has been tested and verified by simulation.
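One plausible reading of a BFS-based scheme (offered as an illustration, not necessarily the protocol proposed here) is to build a BFS tree from the reference node and propagate clock corrections level by level, so each node exchanges only a few messages with its parent; the topology and offsets below are invented.

```python
from collections import deque

# An illustrative sketch of BFS-based time synchronization: build a BFS tree
# rooted at the reference node and let each node correct its clock against
# its parent, level by level, so every node exchanges only a few messages.
# The topology and clock offsets are invented; this is not necessarily the
# exact protocol proposed in the paper.

def bfs_synchronize(adjacency, clocks, root):
    """Return the adjustment each node adds to its local clock so that all
    nodes agree with the root's time."""
    adjustment = {root: 0.0}
    queue = deque([root])
    while queue:
        parent = queue.popleft()
        for child in adjacency[parent]:
            if child not in adjustment:
                # pairwise offset; in a real protocol it would be measured
                # from timestamped message exchanges with the parent
                offset = clocks[child] - clocks[parent]
                adjustment[child] = adjustment[parent] - offset
                queue.append(child)
    return adjustment

adjacency = {"ref": ["a", "b"], "a": ["ref", "c"], "b": ["ref"], "c": ["a"]}
clocks = {"ref": 1000.0, "a": 1003.2, "b": 997.5, "c": 1001.1}
# each node's local clock plus its adjustment equals the reference time
print(bfs_synchronize(adjacency, clocks, "ref"))
```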

Exploiting Self-Adaptive Replication Management on Decentralized Tuple Space

Decentralized Tuple Space (DTS) implements the tuple space model among a series of decentralized hosts and provides a logical global shared tuple repository. Replication has been introduced to alleviate the performance problem incurred by remote tuple access. In this paper, we propose a replication approach for DTS in which replication policies are self-adapting. The accesses from users or other nodes are monitored and collected to contribute to the decision making. The replication policy may be changed if better performance is expected. The experiments show that this approach suitably adjusts the replication policies while incurring negligible overhead.