A Characterized and Optimized Approach for End-to-End Delay Constrained QoS Routing

QoS routing aims to find paths between senders and receivers that satisfy the QoS requirements of an application while using network resources efficiently; the underlying routing algorithm must therefore be able to find low-cost paths that satisfy the given QoS constraints. The problem of finding a least-cost path subject to such constraints is known to be NP-hard, and several algorithms have been proposed to find near-optimal solutions. However, these heuristics either impose relationships among the link metrics to reduce the complexity of the problem, which may limit their general applicability, or are too costly in execution time to be applicable to large networks. In this paper, we analyze two algorithms: Characterized Delay Constrained Routing (CDCR) and Optimized Delay Constrained Routing (ODCR). CDCR takes an approach to delay-constrained routing that captures the trade-off between cost minimization and the level of risk with respect to the delay constraint. ODCR uses an adaptive path weight function together with an additional constraint imposed on the path cost to restrict the search space, and hence finds a near-optimal solution much more quickly.
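For intuition, a minimal label-setting sketch of the underlying delay-constrained least-cost path problem (illustrative only, not the CDCR/ODCR algorithms themselves; the graph, weights, and delay bound below are assumptions):

```python
# Label-setting search over (cost, delay) labels: paths exceeding the delay
# bound are pruned, and dominated labels are discarded.
import heapq

def delay_constrained_path(graph, src, dst, delay_bound):
    """graph: {u: [(v, cost, delay), ...]}. Returns (cost, path) or None."""
    heap = [(0, 0, src, [src])]   # labels ordered by cost
    labels = {}                   # node -> non-dominated (cost, delay) pairs
    while heap:
        cost, delay, u, path = heapq.heappop(heap)
        if u == dst:
            return cost, path     # first feasible pop is the cheapest
        # Dominance pruning: skip if a stored label beats this one in both metrics.
        if any(c <= cost and d <= delay for c, d in labels.get(u, [])):
            continue
        labels.setdefault(u, []).append((cost, delay))
        for v, c, d in graph.get(u, []):
            if delay + d <= delay_bound:
                heapq.heappush(heap, (cost + c, delay + d, v, path + [v]))
    return None

# Toy example: the direct edge is cheap but too slow; the bound forces a detour.
g = {"s": [("a", 1, 5), ("d", 1, 12)], "a": [("d", 2, 4)]}
print(delay_constrained_path(g, "s", "d", delay_bound=10))  # (3, ['s', 'a', 'd'])
```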

Geostatistical Analysis and Mapping of Ground-Level Ozone in a Medium-Sized Urban Area

Ground-level (tropospheric) ozone is one of the air pollutants of greatest concern. It is produced mainly by photochemical processes involving nitrogen oxides and volatile organic compounds in the lower atmosphere. Ozone levels become particularly high in regions close to large emissions of ozone precursors and during summer, when stagnant meteorological conditions with high insolation and high temperatures are common. This work presents results of a study of urban ozone distribution patterns in the city of Badajoz, the largest and most industrialized city in the Extremadura region (southwest Spain). Fourteen sampling campaigns, at least one per month, were carried out with an automatic portable analyzer to measure ambient air ozone concentrations, during periods selected for conditions favourable to ozone production. The measured ozone data were then analyzed using geostatistical techniques to evaluate the ozone distribution across the city. First, exploratory analysis revealed that the data were normally distributed, a desirable property for the subsequent stages of the geostatistical study. Second, in the structural analysis, theoretical spherical models provided the best fit for all monthly experimental variograms. The parameters of these variograms (sill, range and nugget) showed that the maximum distance of spatial dependence lies between 302 and 790 m, and that the air ozone concentration is not evenly distributed even over short distances. Finally, predictive ozone maps were derived for all points of the experimental study area using geostatistical (kriging) algorithms. Cross-validation showed high prediction accuracy in all cases. Probability maps based on kriging interpolation and the kriging standard deviation were also produced, providing useful information for hazard assessment.
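For reference, the theoretical spherical variogram model mentioned above has the standard textbook form, with nugget $c_0$, partial sill $c$ and range $a$ (the parameter values fitted in the study are not reproduced here):

```latex
\gamma(h) =
\begin{cases}
0, & h = 0,\\[2pt]
c_0 + c\left(\dfrac{3h}{2a} - \dfrac{h^3}{2a^3}\right), & 0 < h \le a,\\[6pt]
c_0 + c, & h > a.
\end{cases}
```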

A Theory in Optimization of Ad-hoc Routing Algorithms

In this paper, optimization of routing in ad-hoc networks is surveyed and a new method for reducing the complexity of routing algorithms is suggested. Maintaining a binary matrix at each node in the network, updated once a route has been found, lets nodes avoid re-running the routing protocol for every data transfer. The suggested algorithm can reduce the complexity of routing to the minimum possible.
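A hedged sketch of the route-caching idea described above (the class name, cache layout, and next-hop table are illustrative assumptions, not the paper's exact scheme):

```python
# Each node keeps a binary matrix whose (i, j) entry records whether a route
# from node i to node j has already been discovered, so repeated route
# discoveries can be skipped on later data transfers.
import numpy as np

class RouteCache:
    def __init__(self, n_nodes):
        self.known = np.zeros((n_nodes, n_nodes), dtype=bool)  # binary matrix
        self.next_hop = {}  # (src, dst) -> next hop, filled on discovery

    def lookup(self, src, dst):
        return self.next_hop.get((src, dst)) if self.known[src, dst] else None

    def update(self, src, dst, hop):
        # Called once route discovery completes; later transfers reuse it.
        self.known[src, dst] = True
        self.next_hop[(src, dst)] = hop

cache = RouteCache(n_nodes=5)
if cache.lookup(0, 4) is None:   # cache miss: run the routing protocol once
    cache.update(0, 4, hop=2)
print(cache.lookup(0, 4))        # subsequent transfers skip rediscovery
```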

Frequency-Domain Design of Fractional-Order FIR Differentiators

In this paper, a fractional-order FIR differentiator design method using the differential evolution (DE) algorithm is presented. In the proposed method, the FIR digital filter is designed to match the frequency response of a desired fractional-order differentiator, evaluated in the frequency domain. To verify the design performance, an alternative design method formulated in the time domain is also provided. Simulation results demonstrate the efficiency of the proposed method.
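A minimal sketch of the frequency-domain objective that a DE optimizer would minimize (an illustration, not the authors' exact cost function; the order, grid, and filter length are assumptions):

```python
# Match the FIR response to the ideal fractional differentiator
# H_d(e^{jw}) = (jw)^alpha over a grid of frequencies.
import numpy as np

def fir_response(h, w):
    # H(e^{jw}) = sum_n h[n] e^{-jwn}
    n = np.arange(len(h))
    return np.exp(-1j * np.outer(w, n)) @ h

def objective(h, alpha=0.5, n_grid=128):
    w = np.linspace(0.05 * np.pi, 0.95 * np.pi, n_grid)  # avoid w = 0
    desired = (1j * w) ** alpha   # ideal fractional-order differentiator
    return np.sum(np.abs(fir_response(h, w) - desired) ** 2)

# A DE routine (e.g. scipy.optimize.differential_evolution) would search for
# the coefficient vector h minimizing this objective.
h0 = np.zeros(21)
print(objective(h0))  # error of the all-zero filter, as a sanity check
```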

An Augmented Beam-search Based Algorithm for the Strip Packing Problem

In this paper, the use of beam search and look-ahead strategies for solving the strip packing problem (SPP) is investigated. Given a strip of fixed width W and unlimited length, and a set of n circular pieces of known radii, the objective is to determine the minimum strip length that packs all the pieces. An augmented algorithm that combines beam search with a look-ahead strategy is proposed. The look-ahead is used to evaluate the nodes at each level of the search tree; the best nodes are then retained for branching. The computational investigation shows that the proposed augmented algorithm improves the best known solutions from the literature on most of the instances used.
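A generic beam-search skeleton in the spirit of the algorithm described above (the node representation, evaluation function, and branching rule for circle packing are placeholders, not the paper's implementation):

```python
def beam_search(root, branch, evaluate, beam_width, is_complete):
    """branch(node) -> child nodes; evaluate(node) -> look-ahead score
    (lower is better); is_complete(node) -> True if the packing is finished."""
    beam, best = [root], None
    while beam:
        children = [c for node in beam for c in branch(node)]
        # Record the best completed packing seen so far.
        for c in (c for c in children if is_complete(c)):
            if best is None or evaluate(c) < evaluate(best):
                best = c
        # Look-ahead evaluation ranks partial packings; keep only the best
        # beam_width nodes for branching at the next level.
        partial = sorted((c for c in children if not is_complete(c)), key=evaluate)
        beam = partial[:beam_width]
    return best
```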

Wavelet Compression of ECG Signals Using SPIHT Algorithm

In this paper we present a novel approach to wavelet compression of electrocardiogram (ECG) signals based on the set partitioning in hierarchical trees (SPIHT) coding algorithm. The SPIHT algorithm has achieved prominent success in image compression; here we use a modified version of SPIHT for one-dimensional signals. We applied the wavelet transform with SPIHT coding to different records of the MIT-BIH database. The results show the high efficiency of this method for ECG compression.
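A sketch of the bit-plane significance test at the heart of SPIHT-style coders, applied to a 1-D wavelet decomposition (pywt is assumed available; this illustrates the coding principle only, not the full SPIHT set-partitioning algorithm used in the paper):

```python
import numpy as np
import pywt

signal = np.sin(np.linspace(0, 8 * np.pi, 512))        # stand-in for an ECG trace
coeffs = np.concatenate(pywt.wavedec(signal, "db4", level=4))

n = int(np.floor(np.log2(np.max(np.abs(coeffs)))))     # top bit plane
for plane in range(n, n - 3, -1):
    # Significance pass: coefficients with magnitude >= 2^plane are coded
    # at this bit plane; SPIHT transmits their positions via its tree sets.
    significant = np.abs(coeffs) >= 2 ** plane
    print(f"plane {plane}: {significant.sum()} significant coefficients")
```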

Improved Zero Text Watermarking Algorithm against Meaning Preserving Attacks

The Internet is largely composed of textual content, and a huge volume of digital content circulates over it daily. The ease of information sharing and reproduction has made it difficult to preserve authors' copyright. Digital watermarking emerged after 1993 as a solution to the problem of copyright protection for plain text. In this paper, we propose a zero text watermarking algorithm based on the occurrence frequency of non-vowel ASCII characters and words for copyright protection of plain text. The embedding algorithm uses the frequencies of non-vowel ASCII characters and words to generate a specialized author key. The extraction algorithm uses this key to extract the watermark and hence identify the original copyright owner. Experimental results illustrate the effectiveness of the proposed algorithm on text subjected to meaning-preserving attacks performed by five independent attackers.
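An illustrative sketch of building an author key from non-vowel letter frequencies, in the spirit of the embedding step described above (the exact key construction in the paper may differ):

```python
from collections import Counter
import hashlib

def author_key(text):
    letters = [c for c in text.lower() if c.isalpha() and c not in "aeiou"]
    freq = Counter(letters)
    # Canonical string of (character, count) pairs; hash it into a compact key.
    profile = ",".join(f"{ch}:{n}" for ch, n in sorted(freq.items()))
    return hashlib.sha256(profile.encode()).hexdigest()

original = "The quick brown fox jumps over the lazy dog."
paraphrase = "Over the lazy dog, the quick brown fox jumps."  # meaning-preserving
print(author_key(original) == author_key(paraphrase))  # True: same letter counts
```

Because the key depends only on character statistics rather than word order, a simple reordering attack leaves it unchanged, which is the kind of robustness the abstract claims.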

Dynamic Admission Control for Quality of Service in IP Networks

The goal of admission control is to support the quality-of-service demands of real-time applications via resource reservation in IP networks. In this paper we introduce a novel Dynamic Admission Control (DAC) mechanism for IP networks. DAC dynamically allocates network resources using the previously observed traffic pattern of each path, and applies a dynamic admission algorithm built on bandwidth brokers to improve bandwidth utilization. We evaluate the performance of the proposed mechanism through trace-driven simulation experiments in terms of blocking probability, throughput and normalized utilization.
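A toy admission-control decision for orientation (illustrative only; the paper's DAC mechanism additionally adapts allocations from the observed per-path traffic pattern):

```python
class BandwidthBroker:
    def __init__(self, capacity):
        self.capacity = capacity
        self.reserved = 0.0

    def admit(self, demand):
        # Admit the flow only if the reservation still fits the link capacity.
        if self.reserved + demand <= self.capacity:
            self.reserved += demand
            return True
        return False  # blocked: counts toward the blocking probability

broker = BandwidthBroker(capacity=100.0)
print(broker.admit(60.0), broker.admit(50.0))  # True False
```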

Finger Vein Recognition using PCA-based Methods

In this paper a novel algorithm is proposed to improve the accuracy of finger vein recognition. The performances of Principal Component Analysis (PCA), Kernel Principal Component Analysis (KPCA), and Kernel Entropy Component Analysis (KECA) within this algorithm are validated and compared with each other in order to determine which is the most appropriate for finger vein recognition.
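A minimal PCA feature-extraction sketch with scikit-learn (KPCA would substitute sklearn's KernelPCA; the random data and image size below stand in for vectorized finger-vein images and are assumptions):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 64 * 32))   # 100 flattened vein images (assumed size)
y = rng.integers(0, 10, size=100)     # 10 subjects

# Project onto the leading principal components, then match by nearest neighbour.
features = PCA(n_components=20).fit_transform(X)
clf = KNeighborsClassifier(n_neighbors=1).fit(features, y)
print(clf.score(features, y))
```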

Distributed Splay Suffix Arrays: A New Structure for Distributed String Search

As a structure for processing string problems, the suffix array is widely known and extensively studied. But if the string access pattern follows the "90/10" rule, a suffix array cannot take advantage of the fact that we often search for something we have just found. Although the splay tree is an efficient data structure for small documents when the access pattern follows the "90/10" rule, it requires many auxiliary structures and an excessive amount of pointer manipulation to process and search large documents efficiently. In this paper, we propose a new and conceptually powerful data structure, called the splay suffix array (SSA), for string search. This data structure combines the features of splay trees and suffix arrays into a new approach that is suitable for implementation on both conventional and clustered computers.
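As a baseline, a simple suffix-array construction and binary search (the SSA proposed above additionally layers splay-style reorganization on top so that frequently repeated queries become faster; that part is not sketched here):

```python
def build_suffix_array(s):
    # O(n^2 log n) comparison sort of suffixes; fine for small strings.
    return sorted(range(len(s)), key=lambda i: s[i:])

def contains(s, sa, p):
    # Binary search for the first suffix >= p, then check the prefix.
    lo, hi = 0, len(sa)
    while lo < hi:
        mid = (lo + hi) // 2
        if s[sa[mid]:sa[mid] + len(p)] < p:
            lo = mid + 1
        else:
            hi = mid
    return lo < len(sa) and s[sa[lo]:].startswith(p)

text = "mississippi"
sa = build_suffix_array(text)
print(contains(text, sa, "ssip"))  # True
```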

A Blind Digital Watermark in Hadamard Domain

A new blind gray-level watermarking scheme is described. In the proposed method, the host image is first divided into 4x4 non-overlapping blocks. For each block, the first two AC coefficients of its Hadamard transform are estimated using the DC coefficients of its neighboring blocks. A gray-level watermark is then added to the estimated values. Since embedding the watermark does not change the DC coefficients, the watermark can be extracted by estimating the AC coefficients and comparing them with their actual values. Several experiments were performed, and the results demonstrate the robustness of the proposed algorithm.
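A sketch of the 4x4 block Hadamard transform the scheme operates on (scipy assumed available; the estimation of AC coefficients from neighboring DC values is the paper's contribution and is not reproduced here):

```python
import numpy as np
from scipy.linalg import hadamard

H = hadamard(4) / 2.0                  # orthonormal 4x4 Hadamard matrix
block = np.arange(16, dtype=float).reshape(4, 4)

coeffs = H @ block @ H.T               # 2-D Hadamard transform of one block
dc, ac1, ac2 = coeffs[0, 0], coeffs[0, 1], coeffs[1, 0]
print(dc, ac1, ac2)                    # DC and the first two AC coefficients

restored = H.T @ coeffs @ H            # exact inverse, since H is orthonormal
print(np.allclose(restored, block))    # True
```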

Pattern Classification of Back-Propagation Algorithm Using Exclusive Connecting Network

The objective of this paper is the design of a pattern classification model based on the back-propagation (BP) algorithm for decision support systems. The standard BP model fully connects every node between successive layers from input to output; as a result, it requires long computing times and many training iterations to achieve good performance and an acceptable error rate during pattern generation or network training. The proposed model instead uses exclusive connections between hidden-layer nodes and output nodes. Its advantages are fewer iterations and better performance compared with the standard back-propagation model. We simulated several classification data sets under different settings of network factors (e.g. number of hidden layers and nodes, number of classes, and number of iterations). In most simulation cases, the BP model with the exclusive connection network outperformed standard BP. We expect this algorithm to be applicable to user face identification, data analysis, and mapping between environmental data and information.
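A hedged sketch of the "exclusive connection" idea: hidden-to-output weights are masked so that each hidden node feeds exactly one output node, reducing the number of trainable weights relative to a fully connected BP layer (the layer sizes and mask layout below are illustrative assumptions):

```python
import numpy as np

n_hidden, n_out = 6, 3
rng = np.random.default_rng(1)
W = rng.normal(size=(n_out, n_hidden))

# Assign each hidden unit to a single output: one nonzero entry per column.
mask = np.zeros_like(W)
for h in range(n_hidden):
    mask[h % n_out, h] = 1.0
W *= mask                          # keep exclusive connections only

hidden = rng.normal(size=n_hidden)
output = np.tanh(W @ hidden)
# During back-propagation the weight gradient is multiplied by the same mask,
# so the pruned connections stay zero: grad_W = np.outer(delta, hidden) * mask.
print(output)
```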

Helicopter Adaptive Control with Parameter Estimation Based on Feedback Linearization

This paper presents an adaptive feedback linearization approach to helicopter control. Ideal feedback linearization is defined for the case when the system model is known exactly. Adaptive feedback linearization is employed to obtain asymptotically exact cancellation of the inherent uncertainty in the given system parameters. The control algorithm is implemented by combining the feedback linearization technique with an adaptive method. The controller parameters are unknown, and an adaptive control law drives them towards their ideal values to provide perfect model matching between the reference model and the closed-loop plant model. The converged controller parameters then provide good estimates of the unknown plant parameters.
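For intuition, a generic single-input illustration of the feedback-linearization step and one common adaptive update (a textbook sketch; the helicopter dynamics and the paper's specific adaptation law are not reproduced here):

```latex
% For \dot{x} = f(x) + g(x)\,u with g(x) \neq 0, the linearizing control
u = \frac{-f(x) + v}{g(x)}
\quad\Longrightarrow\quad
\dot{x} = v .
% With unknown parameters, estimates \hat f, \hat g replace f, g, and for a
% linear parameterization f(x) = \theta^{\top}\phi(x) a common gradient law
\dot{\hat\theta} = -\gamma\, e\, \phi(x)
% drives the estimates toward the ideal values as the tracking error e \to 0.
```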

Photo Mosaic Smartphone Application in Client-Server Based Large-Scale Image Databases

In this paper we present a photo mosaic smartphone application built on a client-server model over large-scale image databases. Photo mosaic is not a new concept, but very few smartphone applications handle a huge number of images in a client-server environment. To support large-scale image databases, we first propose an overall framework that works as a client-server model. We then present the concept of image-PAA features to handle a huge number of images efficiently, and discuss its lower bounding property. We also present a best-match algorithm that exploits the lower bounding property of image-PAA. Finally, we implement an efficient Android-based application and demonstrate its feasibility.
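As background, a minimal sketch of classic PAA features and the lower-bounding distance they admit (the paper's image-PAA extends this idea to images; the segment count and data below are assumptions):

```python
import numpy as np

def paa(x, m):
    # Mean of each of m equal-length segments of x (len(x) divisible by m).
    return x.reshape(m, -1).mean(axis=1)

def paa_lb(xp, yp, n):
    # Provable lower bound on the Euclidean distance between the raw vectors,
    # so candidates can be pruned on PAA features without false dismissals.
    return np.sqrt(n / len(xp) * np.sum((xp - yp) ** 2))

rng = np.random.default_rng(2)
x, y = rng.normal(size=256), rng.normal(size=256)
xp, yp = paa(x, 8), paa(y, 8)
print(paa_lb(xp, yp, 256) <= np.linalg.norm(x - y))  # True: safe to prune
```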

A Study on Creation of Human-Based Co-Design Service Platform

With the approach of the digital era, various interactive service platforms and systems support human needs in daily life through different contents and measures. Design strategies have gradually shifted from function-based to user-oriented, and are often customized. In other words, the goal is for designers to incorporate users' value reactions into their creations. Creative design services for interior design require positive interaction and communication so that users can obtain full design information, recognize the style and process of their personal needs, develop creative service design, lower communication time and cost, and gain a sense of achievement. Thus, by constructing a co-design method based on communication between interior designers and users, this study identifies users' real needs and provides a co-design approach for designers and users.

BIBDs for (13, 5, 5), (16, 6, 5) and (21, 6, 4) Possibly Possessing an Automorphism of Order 3

When trying to enumerate all BIBDs with given parameters, the natural solution space is huge and grows extremely quickly with the number of points of the design. Constructive enumerations are therefore often carried out by assuming additional constraints on the design's structure, automorphisms being the most commonly used. It remains a hard task to construct designs with a trivial automorphism group, i.e. with no additional symmetry, although it is believed that most BIBDs belong to that case. In this paper, a great many new designs with parameters 2-(13, 5, 5), 2-(16, 6, 5) and 2-(21, 6, 4) are constructed by assuming the action of an automorphism of order 3. Moreover, it was possible to construct millions of such designs with no non-trivial automorphisms.
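A quick check of the standard BIBD necessary conditions r(k-1) = lambda(v-1) and bk = vr for the three parameter sets above (a sanity check on the parameters, not the constructive enumeration itself):

```python
def bibd_counts(v, k, lam):
    # Returns (replication number r, block count b) if both are integral.
    r, rem1 = divmod(lam * (v - 1), k - 1)
    b, rem2 = divmod(v * r, k)
    return (r, b) if rem1 == rem2 == 0 else None

for v, k, lam in [(13, 5, 5), (16, 6, 5), (21, 6, 4)]:
    print((v, k, lam), "->", bibd_counts(v, k, lam))
# (13, 5, 5) -> (15, 39); (16, 6, 5) -> (15, 40); (21, 6, 4) -> (16, 56)
```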

Estimation of Skew Angle in Binary Document Images Using Hough Transform

This paper presents two novel techniques for skew estimation in binary document images. The algorithms are based on connected component analysis and the Hough transform, and both focus on reducing the amount of input data provided to the Hough transform. In the first method, referred to as the word centroid approach, the centroids of selected words are used for skew detection. In the second method, referred to as the dilate & thin approach, selected characters are blocked and dilated to obtain word blocks, and thinning is then applied; the final image fed to the Hough transform contains only the thinned coordinates of the word blocks. These methods succeed in reducing the computational complexity of Hough-transform-based skew estimation algorithms. Promising experimental results are provided to demonstrate the effectiveness of the proposed methods.
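A sketch of skew estimation from a reduced point set such as word centroids, via a Hough-style vote over candidate angles (the synthetic data and scoring are illustrative assumptions; the skew is the angle at which the projected points bunch into the sharpest rows):

```python
import numpy as np

def estimate_skew(points, angles_deg):
    best_angle, best_score = 0.0, -np.inf
    for a in angles_deg:
        t = np.deg2rad(a)
        # Hough rho of each point for the line family at angle a.
        rho = points[:, 1] * np.cos(t) - points[:, 0] * np.sin(t)
        hist, _ = np.histogram(rho, bins=64)
        score = np.sum(hist.astype(float) ** 2)  # sharp peaks = aligned rows
        if score > best_score:
            best_angle, best_score = a, score
    return best_angle

# Synthetic "word centroids": 10 text lines skewed by 4 degrees.
rng = np.random.default_rng(3)
x = rng.uniform(0, 500, size=300)
row = rng.integers(0, 10, size=300) * 30.0
y = row + x * np.tan(np.deg2rad(4.0))
print(estimate_skew(np.column_stack([x, y]), np.arange(-10, 10.5, 0.5)))  # ~4.0
```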

A Dual Method for Solving General Convex Quadratic Programs

In this paper, we present a new method for solving quadratic programming problems that are not strictly convex. The constraints of the problem are linear equalities and inequalities with bounded variables. The suggested method combines active-set strategies with support methods. The algorithm and numerical experiments are presented, comparing our approach with the active-set method on randomly generated problems.
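For orientation, a minimal illustration of the equality-constrained QP subproblem that active-set methods solve at each iteration via its KKT linear system (toy data; the paper's method augments this with support-method pivoting):

```python
import numpy as np

# minimize 0.5 x^T Q x + c^T x   subject to   A x = b
Q = np.array([[2.0, 0.0], [0.0, 2.0]])
c = np.array([-2.0, -5.0])
A = np.array([[1.0, 1.0]])
b = np.array([1.0])

# KKT system: [[Q, A^T], [A, 0]] [x; lambda] = [-c; b]
K = np.block([[Q, A.T], [A, np.zeros((1, 1))]])
sol = np.linalg.solve(K, np.concatenate([-c, b]))
x, lam = sol[:2], sol[2:]
print(x, lam)   # x = [-0.25, 1.25], the minimizer on the active constraint
```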

Fuzzy Ideology based Long Term Load Forecasting

Load forecasting plays a paramount role in the operation and management of power systems. Accurate estimation of future power demand for various lead times facilitates the task of generating power reliably and economically. Here we study the forecasting of future loads for a relatively large lead time, from months to a few years (long-term load forecasting). Among the various techniques used in load forecasting, artificial intelligence techniques provide greater forecast accuracy than conventional techniques. Fuzzy logic, a very robust artificial intelligence technique, is applied in this paper to forecast load on a long-term basis. The paper gives a general algorithm for long-term load forecasting. The algorithm extends a short-term load forecasting method to the long term, and concentrates not only on the forecast load values but also on the errors incorporated into the forecast. Hence, by correcting the errors in the forecast, forecasts with very high accuracy have been achieved. The algorithm is demonstrated using data collected for the residential sector (LT2 (a) type load: domestic consumers). Load is determined for three consecutive years (April 2006 to March 2009) to demonstrate the efficiency of the algorithm, and forecasts are produced for the next two years (April 2009 to March 2011).
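A toy sketch of the fuzzy error-correction idea: recent forecast errors are fuzzified with triangular membership functions and a defuzzified correction is added back to the raw forecast (the membership breakpoints, rules, and numbers are illustrative, not the paper's):

```python
import numpy as np

def tri(x, a, b, c):
    # Triangular membership function peaking at b.
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def fuzzy_correction(err_pct):
    # Rules: NegLarge -> +3%, Zero -> 0%, PosLarge -> -3% (centroid defuzz.).
    mships = np.array([tri(err_pct, -10, -5, 0),
                       tri(err_pct, -5, 0, 5),
                       tri(err_pct, 0, 5, 10)])
    outputs = np.array([3.0, 0.0, -3.0])
    return (mships @ outputs) / mships.sum() if mships.sum() > 0 else 0.0

raw_forecast = 1200.0                # MW, naive long-term forecast (assumed)
recent_err = -4.0                    # model under-forecast by 4% recently
print(raw_forecast * (1 + fuzzy_correction(recent_err) / 100))  # nudged upward
```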

A Hybrid Machine Learning System for Stock Market Forecasting

In this paper, we propose a hybrid machine learning system based on a Genetic Algorithm (GA) and Support Vector Machines (SVM) for stock market prediction. A variety of indicators from the field of technical analysis are used as input features. We also exploit the correlation between the stock prices of different companies: to forecast the price of a stock, we use technical indicators of highly correlated stocks, not only of the stock to be predicted. The genetic algorithm selects the most informative input features from among all the technical indicators. The results show that the hybrid GA-SVM system outperforms the stand-alone SVM system.
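A compact sketch of evolutionary feature selection with an SVM fitness (scikit-learn assumed; the synthetic data stand in for technical indicators, and a simple (1+1) mutation loop stands in for the full GA for brevity):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=30,
                           n_informative=6, random_state=0)
rng = np.random.default_rng(0)

def fitness(mask):
    # Cross-validated SVM accuracy on the selected feature subset.
    if not mask.any():
        return 0.0
    return cross_val_score(SVC(), X[:, mask], y, cv=3).mean()

best = rng.random(30) < 0.5               # random initial feature mask
for _ in range(30):
    cand = best ^ (rng.random(30) < 0.1)  # flip ~10% of the bits (mutation)
    if fitness(cand) > fitness(best):
        best = cand
print(best.sum(), fitness(best))          # selected feature count and accuracy
```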