Evaluation of Wavelet Filters for Image Compression

The aim of this paper is to characterize a larger set of wavelet functions for implementation in a still image compression system using the SPIHT algorithm. The paper discusses important features of the wavelet functions and filters used in subband coding to convert an image into wavelet coefficients in MATLAB. Image quality is measured objectively using the peak signal-to-noise ratio (PSNR) and its variation with bit rate (bpp). The effect of different parameters is studied for the different wavelet functions. Our results provide a useful reference for designers of wavelet-based image coders.
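As a point of reference, the PSNR figure quoted above is derived from the mean squared error between the original and reconstructed images. A minimal sketch for 8-bit images follows (NumPy is used purely for illustration; the paper itself works in MATLAB):

```python
import numpy as np

def psnr(original, reconstructed, max_value=255.0):
    """Peak signal-to-noise ratio in dB between two equally sized images."""
    original = original.astype(np.float64)
    reconstructed = reconstructed.astype(np.float64)
    mse = np.mean((original - reconstructed) ** 2)
    if mse == 0:
        return float("inf")          # identical images
    return 10.0 * np.log10(max_value ** 2 / mse)
```

At a fixed bit rate (bpp), a higher PSNR indicates a better-performing wavelet filter.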

Periodic Storage Control Problem

Considering a reservoir with periodic states and different cost functions with a penalty, its release rules can be modeled as a periodic Markov decision process (PMDP). First, we prove that the policy-iteration algorithm also works for the PMDP. Then, using the policy-iteration algorithm, we obtain the optimal policies for a special aperiodic reservoir model with two cost functions under a large penalty and discuss the case where the penalty is small.
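For orientation, the policy-iteration algorithm alternates exact policy evaluation with greedy policy improvement until the policy stops changing. The sketch below is for a generic finite discounted MDP, not the paper's reservoir model; the transition tensor P, reward matrix R, and discount factor are placeholder assumptions:

```python
import numpy as np

def policy_iteration(P, R, gamma=0.95):
    """P: (S, A, S) transition probabilities, R: (S, A) expected rewards."""
    n_states = P.shape[0]
    policy = np.zeros(n_states, dtype=int)
    while True:
        # Policy evaluation: solve (I - gamma * P_pi) V = R_pi exactly
        P_pi = P[np.arange(n_states), policy]
        R_pi = R[np.arange(n_states), policy]
        V = np.linalg.solve(np.eye(n_states) - gamma * P_pi, R_pi)
        # Policy improvement: act greedily with respect to the current values
        Q = R + gamma * np.einsum("ijk,k->ij", P, V)
        new_policy = Q.argmax(axis=1)
        if np.array_equal(new_policy, policy):
            return policy, V
        policy = new_policy
```

The paper's contribution is showing that this iteration remains valid when the transition and cost structure vary periodically with time.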

Comparison of Three Turbulence Models in Wear Prediction of Multi-Size Particulate Flow through Rotating Channel

The present work compares the performance of three turbulence modeling approaches (based on the two-equation k-ε model) in predicting erosive wear in multi-size dense slurry flow through a rotating channel. All three turbulence models include a rotation modification to the production term in the turbulent kinetic energy equation. The two-phase flow field, obtained numerically using a Galerkin finite element methodology, relates the local flow velocity and concentration to the wear rate via a suitable wear model. The wear models for both the sliding wear and impact wear mechanisms account for particle size dependence. Predicted wear rates from the three turbulence models are compared for a large number of cases spanning operating parameters such as rotation rate, solids concentration, flow rate, and particle size distribution. The root-mean-square error between the FE-generated data and the correlation between maximum wear rate and the operating parameters is found to be less than 2.5% for all three models.
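For reference, the goodness-of-fit quoted above is the usual root-mean-square error between the finite-element predictions $w_i$ and the values $\hat{w}_i$ returned by the fitted correlation,

$$\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\bigl(w_i - \hat{w}_i\bigr)^2},$$

here assumed to be expressed as a percentage of the predicted maximum wear rate (the symbols and the normalisation are illustrative, not taken from the paper).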

Histopathological and Morphological Defects in Mice Prenatally Exposed to Low EMF

This research was carried out to determine the possible effects of low electromagnetic field (EMF) exposure on developing mouse fetuses. Pregnant mice were exposed to EMF at 0 mT (sham) and 1.2 mT for six hours per session, carried out on gestation days 3, 6, 9, 12 and 15. Samples from the stillborn offspring were observed for morphological defects. The heart did not show progressive cellular damage, while the lungs were congested and emphysematous. The bones were in an advanced stage of hypertrophy. A spectrum of morphological defects was observed in over 70% of the surviving offspring. These results indicate that even exposure to a low EMF is enough to induce morphological defects in prenatally exposed mice.

An Approach for Data Analysis, Evaluation and Correction: A Case Study from Man-Made River Project in Libya

The world's largest Pre-stressed Concrete Cylinder Pipe (PCCP) water supply project experienced a series of pipe failures between 1999 and 2001. This led the Man-Made River Authority (MMRA), the authority in charge of the implementation and operation of the project, to set up a rehabilitation plan for the conveyance system while maintaining the uninterrupted flow of water to consumers. At the same time, MMRA recognized the need for a long-term management tool that would facilitate repair and maintenance decisions and enable the appropriate preventive measures to be taken through continuous monitoring and estimation of the remaining life of each pipe. This management tool is known as the Pipe Risk Management System (PRMS) and is now in operation at MMRA. Both the rehabilitation plan and the PRMS require the availability of complete and accurate pipe construction and manufacturing data. This paper describes a systematic approach to data collection, analysis, evaluation and correction for the construction and manufacturing data files of Phase I pipes, which form the platform for the PRMS database and any other related decision support system.

Employee Loyalty and Telecommuting

Telecommuting has become an increasingly popular work arrangement. However, little research has examined the impact of telecommuting on the relationship between employees and the organization. This study aims to shed light on this aspect by comparing the loyalty of telecommuters and non-telecommuters from three angles: organizational loyalty, peer loyalty, and professional loyalty. Furthermore, this paper explores the dynamics among employee loyalty, productivity, and job satisfaction. Whereas previous studies have looked at employees who are not fully telecommuting, the current study concentrates on employees who work exclusively from home.

3D Oil Reservoir Visualisation Using Octree Compression Techniques Utilising Logical Grid Co-Ordinates

Octree compression techniques have been used for several years for compressing large three-dimensional data sets into homogeneous regions. This compression technique is ideally suited to datasets in which similar values occur in clusters. Oil engineers represent reservoirs as a three-dimensional grid, where hydrocarbons occur naturally in clusters. This research looks at the efficiency of storing these grids using octree compression techniques, where grid cells are divided into active and inactive regions. Initial experiments yielded high compression ratios, as only active leaf nodes and their ancestor (header) nodes are stored as a bitstream to a file on disk. Savings in computational time and memory were possible at decompression, as only active leaf nodes are sent to the graphics card, eliminating the need to reconstruct the original matrix. This results in a more compact vertex table, which can be loaded into the graphics card more quickly, giving shorter refresh delay times.
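The essential idea is that a block of cells that is entirely active or entirely inactive collapses to a single leaf. A minimal sketch follows, assuming a cubic boolean grid of active cells whose side is a power of two; the node encoding and bit layout are illustrative assumptions rather than the authors' file format:

```python
import numpy as np

def encode_octree(grid, bits=None):
    """Recursively encode a cubic boolean grid (side = power of two) as a bit list.
    A 1 marks a homogeneous leaf and is followed by its value; a 0 marks a split into 8 octants."""
    if bits is None:
        bits = []
    if grid.all() or not grid.any():             # homogeneous block -> leaf node
        bits.append(1)
        bits.append(1 if grid.flat[0] else 0)
        return bits
    bits.append(0)                               # heterogeneous block -> recurse into octants
    h = grid.shape[0] // 2
    for x in (0, h):
        for y in (0, h):
            for z in (0, h):
                encode_octree(grid[x:x + h, y:y + h, z:z + h], bits)
    return bits

# Example: a 16x16x16 reservoir grid with one small active cluster
grid = np.zeros((16, 16, 16), dtype=bool)
grid[2:6, 2:6, 2:6] = True
print(len(encode_octree(grid)), "bits instead of", grid.size)
```

Because only the active leaves need to be expanded at decompression time, the vertex data sent to the graphics card stays small.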

A Systematic Construction of Instability Bounds in LIS Networks

In this work, we study the impact of dynamically changing link slowdowns on the stability properties of packet-switched networks under the Adversarial Queueing Theory framework. In particular, we consider the Adversarial, Quasi-Static Slowdown Queueing Theory model, where each link slowdown may take on values in the two-valued set of integers {1, D} with D > 1, which remain fixed for a long time, under a (w, p)-adversary. In this framework, we present an innovative systematic construction for estimating lower bounds on the adversarial injection rate which, if exceeded, cause instability in networks that use the LIS (Longest-in-System) protocol for contention resolution. In addition, we show that the instability bound of a network that uses the LIS protocol for contention resolution may drop at injection rates p > 0 when the network size and the high slowdown D take large values. This is the best instability lower bound known for LIS networks.

Evolutionary Distance in the Yeast Genome

Whole genome duplication (WGD) increased the number of chromosomes in the yeast Saccharomyces cerevisiae from 8 to 16. Although the number of chromosomes in this organism's genome has been retained since the WGD, chromosomal rearrangement events have created an evolutionary distance between the current genome and its ancestor. Studies of eukaryotic genomes using evolutionary approaches have shown that the rearrangement distance is an approximable problem. In the case of S. cerevisiae, we describe how the rearrangement distance can be computed using a dedoubled adjacency graph drawn for 55 large paired chromosomal regions originating from the WGD. We then provide a program, extracted from a C program database, to draw a dedoubled genome adjacency graph for S. cerevisiae. From a bioinformatics perspective, using the duplicated blocks of the current S. cerevisiae genome, we infer that the genomic organization of eukaryotes has the potential to provide valuable detailed information about their ancestral genome.

Magnetohydrodynamics Boundary Layer Flows over a Stretching Surface with Radiation Effect and Embedded in Porous Medium

A steady two-dimensional magnetohydrodynamic (MHD) flow and heat transfer over a stretching vertical sheet influenced by radiation and porosity is studied. The governing boundary layer partial differential equations are reduced to a system of ordinary differential equations using a similarity transformation. The system is solved numerically using a finite difference scheme known as the Keller-box method for several values of the parameters, namely the radiation parameter N, magnetic parameter M, buoyancy parameter λ, Prandtl number Pr and permeability parameter K. The effects of these parameters on the heat transfer characteristics are analyzed and discussed. It is found that both the skin friction coefficient and the local Nusselt number decrease as the magnetic parameter M and permeability parameter K increase. The heat transfer rate at the surface decreases as the radiation parameter increases.
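To illustrate the reduction step, a similarity transformation of the standard form used for stretching-sheet problems (shown here only as a typical example; the exact equations of the paper may include further terms) is

$$\eta = y\sqrt{\frac{a}{\nu}}, \qquad u = a x f'(\eta), \qquad v = -\sqrt{a\nu}\, f(\eta), \qquad \theta(\eta) = \frac{T - T_\infty}{T_w - T_\infty},$$

for a sheet stretched with velocity $u_w = ax$, under which the momentum and energy equations reduce to coupled ordinary differential equations such as

$$f''' + f f'' - f'^2 - (M + K) f' + \lambda\,\theta = 0, \qquad \left(1 + \tfrac{4}{3}N\right)\theta'' + \Pr\, f\,\theta' = 0,$$

with boundary conditions $f(0) = 0$, $f'(0) = 1$, $\theta(0) = 1$ and $f'(\eta), \theta(\eta) \to 0$ as $\eta \to \infty$. The Keller-box method then solves this two-point boundary value problem on a finite difference grid in $\eta$.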

Bandwidth Estimation Algorithms for the Dynamic Adaptation of Voice Codec

In recent years, multimedia traffic, and VoIP services in particular, has grown dramatically. We present a new algorithm to control resource utilization and to optimize the voice codec selection during SIP call setup, based on the traffic conditions estimated on the network path. The most suitable methodologies and tools for real-time evaluation of the available bandwidth on a network path have been integrated with our proposed algorithm, which selects the best codec for a VoIP call as a function of the instantaneous available bandwidth on the path. The algorithm does not require any explicit feedback from the network, which makes it easily deployable over the Internet. We have also performed intensive tests on real network scenarios with a software prototype, verifying the algorithm's efficiency with different network topologies and traffic patterns between two SIP PBXs. The promising results obtained during the experimental validation of the algorithm are now the basis for its extension towards a larger set of multimedia services and the integration of our methodology with existing PBX appliances.
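The core selection step can be pictured as matching the estimated available bandwidth against the per-call requirements of the candidate codecs. The sketch below is only illustrative; the codecs considered, their bandwidth figures, and the headroom factor are assumptions, not the values used in the paper:

```python
# Nominal per-call bandwidth including RTP/UDP/IP overhead, in kbit/s (illustrative figures).
CODECS = [
    ("G.711", 87.2),
    ("G.726-32", 55.2),
    ("G.729", 31.2),
    ("G.723.1", 21.9),
]

def select_codec(available_kbps, headroom=0.8):
    """Return the best codec that fits within a fraction of the estimated available bandwidth."""
    budget = available_kbps * headroom
    for name, required in CODECS:        # ordered from highest to lowest quality/bandwidth
        if required <= budget:
            return name
    return CODECS[-1][0]                 # fall back to the most frugal codec

print(select_codec(150.0))   # plenty of bandwidth -> G.711
print(select_codec(40.0))    # constrained path    -> G.729
```

In the actual system this decision is taken during SIP negotiation, using the bandwidth estimate produced for the path between the two PBXs.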

Multi-Context Recurrent Neural Network for Time Series Applications

This paper presents a multi-context recurrent network for time series analysis. While simple recurrent networks (SRNs) are very popular among recurrent neural networks, they still have some shortcomings in terms of learning speed and accuracy that need to be addressed. To solve these problems, we propose a multi-context recurrent network (MCRN) with three different learning algorithms. The performance of this network is evaluated on real-world applications such as handwriting recognition and energy load forecasting. We study the performance of this network and compare it to a well-established SRN. The experimental results show that the MCRN is very efficient and well suited to time series analysis and its applications.

Measuring Urban Sustainability in Town Planners' Practice

Physical urban form is recognized as the medium for human transactions. It directly influences the travel demand of people in a specific urban area and the amount of energy used for transportation. Distorted, sprawling form often creates sustainability problems in urban areas. EU strategic planning documents declare that compact urban form and mixed land use patterns must be the main focus for achieving better sustainability in urban areas, but the methods to measure and compare these characteristics are still not clear. This paper presents simple methods to measure the spatial characteristics of urban form by analyzing the location and distribution of objects in an urban environment. An extended CA (cellular automata) model is used to simulate urban development scenarios.
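For readers unfamiliar with cellular automata, urban-growth CA models evolve a grid of land-use cells by local transition rules. The toy rule below (the threshold, neighbourhood, and seed are illustrative assumptions, not the extended model of the paper) develops a cell once enough of its neighbours are developed:

```python
import numpy as np

def urban_ca_step(developed, threshold=3):
    """One step of a toy urban-growth CA: an undeveloped cell becomes developed
    when at least `threshold` of its 8 neighbours are already developed."""
    padded = np.pad(developed.astype(int), 1)
    neighbours = sum(
        np.roll(np.roll(padded, dx, 0), dy, 1)
        for dx in (-1, 0, 1) for dy in (-1, 0, 1) if (dx, dy) != (0, 0)
    )[1:-1, 1:-1]
    return developed | ((~developed) & (neighbours >= threshold))

grid = np.zeros((50, 50), dtype=bool)
grid[23:26, 23:26] = True                 # seed a small developed core
for _ in range(10):
    grid = urban_ca_step(grid)
print(grid.sum(), "developed cells after 10 steps")
```

An extended model would add further state (land-use type, accessibility, planning constraints) to the transition rule, which is what allows alternative development scenarios to be compared.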

Detecting and Locating Wormhole Attacks in Wireless Sensor Networks Using Beacon Nodes

This paper focuses on the detection of wormhole attacks in wireless sensor networks. The wormhole attack is particularly challenging to deal with, since the adversary does not need to compromise any nodes and can use laptops or other wireless devices to send the packets over a low-latency channel. This paper introduces an easy and effective method to detect and locate wormholes: since beacon nodes are assumed to know their coordinates, the straight-line distance between each pair of them can be calculated and then compared with the corresponding hop distance, which in this paper equals the hop count × the node's transmission range R. A dramatic difference may emerge in the presence of a wormhole, and our detection mechanism is based on this observation. The approximate location of the wormhole can also be derived in further steps based on this information. To the best of our knowledge, our method is much simpler than other wormhole detection schemes that also use beacon nodes, and compared to those that place special requirements on each node (e.g., GPS receivers, tightly synchronized clocks or directional antennas), ours is more economical. Simulation results show that the algorithm is successful in detecting and locating wormholes when the density of beacon nodes reaches 0.008 per m².
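The per-pair test reduces to a single comparison: physically, the straight-line distance between two beacons can never exceed hop count × R, so a path that looks "too short" in hops for its geometric length points to a wormhole tunnel. A minimal sketch (the tolerance margin for localization noise is an illustrative assumption):

```python
import math

def wormhole_suspected(beacon_a, beacon_b, hop_count, tx_range, tolerance=1.0):
    """Flag a possible wormhole between two beacons with known (x, y) coordinates.

    In a normal multi-hop path every hop covers at most tx_range metres, so the
    Euclidean distance cannot exceed hop_count * tx_range; a clear excess suggests
    packets were tunnelled through a wormhole."""
    euclidean = math.hypot(beacon_a[0] - beacon_b[0], beacon_a[1] - beacon_b[1])
    hop_distance = hop_count * tx_range
    return euclidean > tolerance * hop_distance

# Beacons 300 m apart, yet the routing layer reports a 2-hop path with a 50 m radio range.
print(wormhole_suspected((0.0, 0.0), (300.0, 0.0), hop_count=2, tx_range=50.0))  # True
```

Repeating this test over all beacon pairs yields the set of suspicious links from which the approximate wormhole location can then be derived in the further steps mentioned above.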

Characterization of Lactose Consumption during the Biogas Production from Acid Whey by FT-IR Spectroscopy

The consumption of lactose during the anaerobic fermentation of acid cheese whey under fed-batch conditions was studied. During the 100-hour fermentation, the biogas production (CO2 and CH4) was analyzed online. In addition to the standard analyses, FT-IR spectroscopy was used to follow the consumption of lactose by the bacteria. The absorption bands at 990, 894 and 787 cm-1 in the second-derivative spectra were shown to be characteristic of lactose and were used to follow the lactose conversion. It was shown that the acid cheese whey lactose was converted by the bacteria within the first 7 hours. In the spectra of the 17-, 18- and 95-hour fermentation samples, lactose was not identified, and these results correlated with the HPLC data.

A Model for Business Network Governance: Case Study in the Pharmaceutical Industry

This paper discusses the theory behind the existence of an idealistic model for business network governance and uses a clarifying case study of governance structures and processes within a business network framework. The case study, from a German pharmaceutical company, complements existing literature by providing a comprehensive explanation of the relations between supply chains and business networks, and also between supply chain management and business network governance. Supply chains and supply chain management are only one side of inter-organizational relationships and ensure short-term performance, while real-world governance structures are needed to ensure the long-term existence of a supply chain. Within this context, a comprehensive model for business network governance is presented. An interesting finding from the case study is that multiple business network governance systems co-exist within the evaluated supply chain.

Discrete Polyphase Matched Filtering-based Soft Timing Estimation for Mobile Wireless Systems

In this paper we present a soft timing phase estimation (STPE) method for wireless mobile receivers operating at low signal-to-noise ratios (SNRs). Discrete polyphase matched (DPM) filters, a log-maximum a posteriori probability (MAP) algorithm and/or a soft-output Viterbi algorithm (SOVA) are combined to derive a new timing recovery (TR) scheme. We apply this scheme to a wireless cellular communication system model that comprises a raised cosine filter (RCF) and a bit-interleaved turbo-coded multi-level modulation (BITMM) scheme, and the channel is assumed to be memoryless. Furthermore, no clock signals are transmitted to the receiver, in contrast to classical data-aided (DA) models. This new model ensures that both the bandwidth and the power of the communication system are conserved. However, the computational complexity of ideal turbo synchronization is increased by 50%. Several simulation tests of bit error rate (BER) and block error rate (BLER) versus low SNR reveal that the proposed iterative soft timing recovery (ISTR) scheme outperforms conventional schemes.

Classifying Bio-Chip Data using an Ant Colony System Algorithm

Bio-chips are used for experiments on genes and contain various information such as genes, samples and so on. Two-dimensional bio-chips, in which one axis represents genes and the other represents samples, are widely used these days. Instead of experimenting with real genes, which costs a great deal of money and time, bio-chips are used for biological experiments. Extracting data from the bio-chips with high accuracy and finding patterns or useful information in such data is very important. Bio-chip analysis systems extract data from various kinds of bio-chips and mine the data in order to obtain useful information. One of the most commonly used methods to mine the data is classification. The algorithm used to classify the data can vary depending on the data types, numerical characteristics and so on. Considering that bio-chip data are extremely large, an algorithm that imitates an ecosystem, such as the ant algorithm, is suitable for classification. This paper focuses on finding classification rules from bio-chip data using the ant colony algorithm, which imitates an ecosystem. The developed system takes into consideration the accuracy of the discovered rules when applying them to the bio-chip data in order to predict the classes.

Development of Subjective Measures of Interestingness: From Unexpectedness to Shocking

Knowledge Discovery in Databases (KDD) is the process of extracting previously unknown but useful and significant information from massive volumes of data. Data mining is the stage in the KDD process that applies an algorithm to extract interesting patterns. Usually, such algorithms generate a huge volume of patterns. These patterns have to be evaluated using interestingness measures that reflect the user's requirements. Interestingness is defined in two ways: (i) objective measures and (ii) subjective measures. Objective measures such as support and confidence extract meaningful patterns based on the structure of the patterns, while subjective measures such as unexpectedness and novelty reflect the user's perspective. In this report, we briefly review the more widespread and successful subjective measures and propose a new subjective measure of interestingness, i.e. shocking.
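For contrast with the subjective measures discussed above, the two objective measures mentioned, support and confidence, are purely frequency-based. A minimal sketch over a toy transaction list (the data and itemsets are illustrative assumptions):

```python
def support(transactions, itemset):
    """Fraction of transactions containing every item in `itemset`."""
    itemset = set(itemset)
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(transactions, antecedent, consequent):
    """Conditional support of the rule antecedent -> consequent."""
    return support(transactions, set(antecedent) | set(consequent)) / support(transactions, antecedent)

transactions = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"milk", "butter"},
    {"bread", "milk", "butter"},
]
print(support(transactions, {"bread", "milk"}))        # 0.5
print(confidence(transactions, {"bread"}, {"milk"}))   # 0.666...
```

Subjective measures such as unexpectedness or the proposed shocking cannot be computed from frequencies alone; they also depend on the user's expectations, which is what is meant above by reflecting the user's perspective.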