Abstract: Border Gateway Protocol (BGP) is the standard routing protocol between autonomous systems (AS) in the Internet. Empirical measurements have shown a considerable delay in BGP convergence in the event of failure. During the convergence time, BGP repeatedly advertises new routes to some destination and withdraws old ones until it reaches a stable state. It has been found that the KEEPALIVE message timer and the HOLD time are two parameters affecting the convergence speed. This paper aims to find the optimal values for the KEEPALIVE timer and the HOLD time that maximally reduce the convergence time without increasing the traffic. The optimal KEEPALIVE timer value found in this paper is 30 seconds instead of 60 seconds, and the optimal HOLD time is 90 seconds instead of 180 seconds.
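The relationship between the two timers can be made concrete. The sketch below is an illustration of the conventional timer arithmetic, not the paper's simulation setup: hold time is commonly configured as three times the KEEPALIVE interval, and a silently failed peer is detected only when the hold timer expires.

```python
def bgp_timers(keepalive_s):
    """Illustrative BGP timer relationship: hold time is conventionally
    three times the KEEPALIVE interval, and a silent peer failure is
    detected only once the hold timer expires."""
    hold_s = 3 * keepalive_s
    return {"keepalive_s": keepalive_s,
            "hold_s": hold_s,
            "worst_case_detection_s": hold_s}

default = bgp_timers(60)   # common implementation defaults: 60 s / 180 s
proposed = bgp_timers(30)  # values proposed in the paper: 30 s / 90 s
```

Halving the KEEPALIVE interval halves the worst-case failure detection delay, at the price of twice as many KEEPALIVE messages per session.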
Abstract: Image retrieval by shape similarity, given a template shape, is particularly challenging, owing to the difficulty of deriving a similarity measurement that closely conforms to the common human perception of similarity. In this paper, a new method for the representation and comparison of shapes is presented, based on the shape matrix and the snake model. It is invariant to scaling, rotation, and translation, and it can retrieve shape images with missing or occluded parts. In the method, the deformation spent by the template to match the shape images and the matching degree are used to evaluate the similarity between them.
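The shape-matrix half of the representation can be sketched briefly. The code below is a simplified illustration (grid sizes and the toy square are hypothetical, and the snake-model deformation step is omitted): the shape is sampled on a polar grid centred at its centroid, which gives translation invariance by centring, scale invariance by radius normalization, and rotation invariance by comparing under circular column shifts.

```python
import numpy as np

def shape_matrix(points, n_rings=8, n_rays=16):
    """Sample a point set on a polar grid centred at its centroid.
    Centring removes translation; dividing by the maximum radius
    removes scale. Grid sizes are illustrative."""
    pts = np.asarray(points, dtype=float)
    pts -= pts.mean(axis=0)                       # translation invariance
    r = np.hypot(pts[:, 0], pts[:, 1])
    r /= r.max()                                  # scale invariance
    theta = np.arctan2(pts[:, 1], pts[:, 0]) % (2 * np.pi)
    M = np.zeros((n_rings, n_rays), dtype=int)
    ring = np.minimum((r * n_rings).astype(int), n_rings - 1)
    ray = np.minimum((theta / (2 * np.pi) * n_rays).astype(int), n_rays - 1)
    M[ring, ray] = 1                              # mark occupied cells
    return M

def dissimilarity(A, B):
    """Rotation is handled by taking the best circular column shift."""
    return min(np.sum(A != np.roll(B, s, axis=1)) for s in range(B.shape[1]))

# toy shape: the outline of a square
square = [(x, y) for x in range(-5, 6) for y in (-5, 5)] + \
         [(x, y) for x in (-5, 5) for y in range(-5, 6)]
M = shape_matrix(square)
```

Scaling or translating the point set leaves the matrix unchanged, which is the invariance property the abstract claims for the full method.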
Abstract: The performance of a cobalt-doped sol-gel derived silica (Co/SiO2) catalyst for Fischer–Tropsch synthesis (FTS) in a slurry-phase reactor was studied using paraffin wax as the initial liquid medium. The reactive gas mixture, hydrogen (H2) and carbon monoxide (CO) in a molar ratio of 2:1, was fed at 50 ml/min. Brunauer-Emmett-Teller (BET) surface area and X-ray diffraction (XRD) techniques were employed to characterize the specific surface area and crystallinity of the catalyst, respectively. The reduction behavior of the Co/SiO2 catalyst was investigated using the Temperature-Programmed Reduction (TPR) method. Operating temperatures were varied from 493 to 533 K to find the optimum conditions for maximizing the production of liquid fuels, gasoline and diesel.
Abstract: This paper describes an algorithm to estimate real-time vehicle velocity using image processing techniques from known camera calibration parameters. The presented algorithm involves several main steps. First, the moving object is extracted using a frame differencing technique. Second, an object tracking method is applied and the speed is estimated from the displacement of the object's centroid. Several assumptions are listed to simplify the transformation from the 3D real world to 2D images. The results obtained from the experiment have been compared to the estimated ground truth. The experiment shows that the proposed algorithm achieves a velocity estimation accuracy of about ±1.7 km/h.
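The differencing-and-centroid pipeline can be sketched on synthetic frames. This is a minimal illustration, not the paper's implementation: the threshold, pixel-to-metre scale, and the static background are assumptions, and a real system would use the camera calibration parameters for the 2D-to-3D mapping.

```python
import numpy as np

def centroid(frame, background, thresh=30):
    """Moving-object centroid via frame differencing against a
    background frame (threshold value is illustrative)."""
    mask = np.abs(frame.astype(int) - background.astype(int)) > thresh
    ys, xs = np.nonzero(mask)
    return np.array([xs.mean(), ys.mean()])

def estimate_speed(c0, c1, dt, metres_per_pixel):
    """Speed in km/h from centroid displacement between frames dt apart."""
    pixels = np.linalg.norm(c1 - c0)
    return pixels * metres_per_pixel / dt * 3.6   # m/s -> km/h

# synthetic example: a 10x10 'vehicle' moves 20 px in 0.5 s
bg = np.zeros((100, 200), dtype=np.uint8)
f0, f1 = bg.copy(), bg.copy()
f0[45:55, 40:50] = 255
f1[45:55, 60:70] = 255
speed = estimate_speed(centroid(f0, bg), centroid(f1, bg),
                       dt=0.5, metres_per_pixel=0.05)
```

With these toy numbers the displacement is 20 px = 1 m over 0.5 s, i.e. 7.2 km/h.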
Abstract: Hexavalent chromium is highly toxic to most living organisms and a known human carcinogen by the inhalation route of exposure. Therefore, treatment of Cr(VI)-contaminated wastewater is essential before its discharge to natural water bodies. Reduction of Cr(VI) to Cr(III) is beneficial because a more mobile and more toxic chromium species is converted to a less mobile and less toxic form. Zero-valence-state metals, such as scrap iron, can serve as electron donors for reducing Cr(VI) to Cr(III). The influence of pH on the capacity of scrap iron to reduce Cr(VI) was investigated in this study. The maximum reduction capacity of scrap iron was observed at the beginning of the column experiments; the lower the pH, the longer the experiment retained the maximum scrap iron reduction capacity. The experimental results showed that the highest maximum reduction capacity of scrap iron was 12.5 mg Cr(VI)/g scrap iron at pH 2.0, decreasing with increasing pH down to 1.9 mg Cr(VI)/g scrap iron at pH 7.3.
Abstract: In this paper, first, a characterization of spherical pseudo null curves in semi-Euclidean space is given. Then, to investigate the position vector of a pseudo null curve, a system of differential equations whose solution gives the components of the position vector of a pseudo null curve on the Frenet axis is established by means of the Frenet equations. Additionally, in view of some special solutions of the mentioned system, characterizations of some special pseudo null curves are presented.
Abstract: This work consists of three parts. First, the alias-free
condition for the conventional two-channel quadrature mirror filter
bank is analyzed using complex arithmetic. Second, the approach
developed in the first part is applied to the complex quadrature mirror
filter bank. Accordingly, the structure is simplified and the theory is
easier to follow. Finally, a new class of complex quadrature mirror
filter banks is proposed. Interesting properties of this new structure
are also discussed.
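For reference, the standard two-channel setting the first part analyzes (textbook material, not a contribution of the paper): with analysis filters $H_0, H_1$ and synthesis filters $F_0, F_1$, the reconstructed signal is

```latex
\hat{X}(z) = \tfrac{1}{2}\bigl[H_0(z)F_0(z) + H_1(z)F_1(z)\bigr]X(z)
           + \tfrac{1}{2}\bigl[H_0(-z)F_0(z) + H_1(-z)F_1(z)\bigr]X(-z),
```

so the alias-free condition is $H_0(-z)F_0(z) + H_1(-z)F_1(z) = 0$, classically satisfied by choosing $F_0(z) = H_1(-z)$ and $F_1(z) = -H_0(-z)$, with the quadrature mirror constraint $H_1(z) = H_0(-z)$.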
Abstract: The decoding of Low-Density Parity-Check (LDPC) codes operates over a redundant structure known as the bipartite graph, meaning that the full set of bit nodes is not absolutely necessary for decoder convergence. In 2008, Soyjaudah and Catherine designed a recovery algorithm for LDPC codes based on this assumption and showed that their codes outperformed conventional LDPC codes in error-correcting performance. In this work, the use of the recovery algorithm is further explored to test the performance of LDPC codes as the number of iterations is progressively increased. For experiments conducted with small block lengths of up to 800 bits and up to 2000 iterations, the results interestingly demonstrate that, contrary to conventional wisdom, the error-correcting performance keeps increasing with the number of iterations.
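The iteration budget being varied appears explicitly in any iterative LDPC decoder. The sketch below is a generic hard-decision bit-flipping decoder, not the recovery algorithm of Soyjaudah and Catherine; the (7,4) Hamming parity-check matrix is a toy stand-in for an LDPC matrix.

```python
import numpy as np

def bit_flip_decode(H, r, max_iters):
    """Hard-decision bit-flipping decoding over a parity-check matrix.

    H         : parity-check matrix (m x n, entries 0/1)
    r         : received hard-decision bits (length n)
    max_iters : iteration budget (the quantity varied in the paper)
    Returns the codeword estimate and the number of iterations used.
    """
    c = r.copy()
    for it in range(1, max_iters + 1):
        syndrome = H @ c % 2            # which parity checks are violated
        if not syndrome.any():
            return c, it                # all checks satisfied: done
        votes = H.T @ syndrome          # violated checks touching each bit
        c[votes == votes.max()] ^= 1    # flip the most-implicated bits
    return c, max_iters

# toy (7,4) Hamming parity-check matrix as an example
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
codeword = np.zeros(7, dtype=int)       # the all-zero word is always valid
received = codeword.copy()
received[2] ^= 1                        # inject a single bit error
decoded, iters = bit_flip_decode(H, received, max_iters=2000)
```

The single injected error is corrected within a handful of iterations; the paper's experiments probe what happens to error-correcting performance as `max_iters` grows toward 2000.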
Abstract: Interpretation of aerial images is an important task in
various applications. Image segmentation can be viewed as the essential
step for extracting information from aerial images. Among many
developed segmentation methods, the technique of clustering has been
extensively investigated and used. However, determining the number
of clusters in an image is inherently a difficult problem, especially
when a priori information on the aerial image is unavailable. This
study proposes a support vector machine approach for clustering
aerial images. Three cluster validity indices, namely a distance-based index, the Davies-Bouldin index, and the Xie-Beni index, are utilized as quantitative measures of the quality of the clustering results. Comparisons of the effectiveness of these indices and of various parameter settings for the proposed method are conducted. Experimental results are provided to illustrate the feasibility of the proposed approach.
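Of the three indices, the Davies-Bouldin index is easy to state compactly: for each cluster, take the worst-case ratio of summed intra-cluster scatters to centroid separation, then average over clusters; lower is better. A minimal NumPy sketch (the toy data are illustrative, not aerial imagery):

```python
import numpy as np

def davies_bouldin(X, labels):
    """Davies-Bouldin index: lower values indicate more compact,
    better-separated clusters."""
    ks = np.unique(labels)
    centroids = np.array([X[labels == k].mean(axis=0) for k in ks])
    # scatter S_k: average distance of cluster members to their centroid
    scatter = np.array([np.linalg.norm(X[labels == k] - c, axis=1).mean()
                        for k, c in zip(ks, centroids)])
    db = 0.0
    for i in range(len(ks)):
        ratios = [(scatter[i] + scatter[j]) /
                  np.linalg.norm(centroids[i] - centroids[j])
                  for j in range(len(ks)) if j != i]
        db += max(ratios)                 # worst pairing for cluster i
    return db / len(ks)

# two tight, well-separated toy clusters give a small index value
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
              [5.0, 5.0], [5.1, 5.0], [5.0, 5.1]])
labels = np.array([0, 0, 0, 1, 1, 1])
score = davies_bouldin(X, labels)
```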
Abstract: The number of features required to represent an image can be very large. Using all available features to recognize objects can suffer from the curse of dimensionality. Feature selection and extraction are the pre-processing steps of image mining. The main issues in analyzing images are the effective identification of features and their extraction. The mining problem focused on here is the grouping of features for different shapes. Experiments have been conducted using shape outlines as the features. Shape outline readings are put through normalization and a dimensionality reduction process using an eigenvector-based method to produce a new set of readings. After this pre-processing step, the data are grouped by their shapes. Through statistical analysis of these readings together with peak measures, a robust classification and recognition process is achieved. Tests showed that the suggested methods are able to automatically recognize objects through their shapes. Finally, experiments also demonstrate the system's invariance to rotation, translation, scale, reflection and a small degree of distortion.
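The eigenvector-based reduction step described above is, in essence, principal component analysis. A minimal sketch (the random data and component count are illustrative, not the paper's shape outline readings):

```python
import numpy as np

def pca_reduce(X, n_components):
    """Eigenvector-based dimensionality reduction (PCA sketch).

    Projects the readings X (samples x features) onto the leading
    eigenvectors of their covariance matrix, producing a new,
    lower-dimensional set of readings.
    """
    Xc = X - X.mean(axis=0)                     # normalization: centring
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)      # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]           # largest variance first
    components = eigvecs[:, order[:n_components]]
    return Xc @ components                      # the new set of readings

rng = np.random.default_rng(0)
outline = rng.normal(size=(50, 8))              # 50 shapes, 8 raw features
reduced = pca_reduce(outline, n_components=2)
```

The first retained component carries the most variance by construction, which is what makes the reduced readings a compact substitute for the raw outline features.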
Abstract: There are three approaches to complete Bayesian Network (BN) model construction: totally expert-centred, totally data-centred, and semi data-centred. These three approaches constitute the basis of the empirical investigation undertaken and reported in this paper. The objective is to determine which of these three approaches is optimal for constructing a BN-based model for the performance assessment of students' laboratory work in a virtual electronic laboratory environment. BN models were constructed using all three approaches, with respect to the focus domain, and compared using a set of optimality criteria. In addition, the impact of the size and source of the training data on the performance of the totally data-centred and semi data-centred models was investigated. The results of the investigation provide additional insight for BN model constructors and contribute to the literature by providing supportive evidence for the conceptual feasibility and efficiency of structure and parameter learning from data. In addition, the results highlight other interesting themes.
Abstract: This paper demonstrates a bus location system for route buses through an experiment in a real environment. A bus location system provides information such as bus delays and positions. The system uses actual service and position data of buses, and this information must match the data in the database. The system has two potential problems. First, preparing the devices needed to obtain bus positions can be costly. Second, it can be difficult to match the service data of buses. To avoid these problems, we developed the system at low cost and in a short time by using smartphones with GPS together with the bus route system. The system realizes path planning that takes bus delays into account and displays bus positions on a map. The bus location system was demonstrated on route buses with smartphones for two months.
Abstract: The modeling paradigm places models at the center of the development process. These models are represented by languages such as UML, the language standardized by the OMG that has become essential for development. Likewise, the ontology engineering paradigm places ontologies at the center of the development process; in this paradigm we find OWL, the principal language for knowledge representation. Building ontologies from scratch is generally a difficult task. Bridges between UML and OWL have appeared in several respects, such as classes and associations. In this paper, we profit from the convergence between UML and OWL to propose an approach, based on meta-modelling and graph grammars and registered in the MDA architecture, for the automatic generation of OWL ontologies from UML class diagrams. The transformation is based on transformation rules; the level of abstraction in these rules is close to the application in order to obtain usable ontologies. We illustrate this approach with an example.
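A transformation rule of the kind described can be sketched very simply. The snippet below is a hypothetical, flattened illustration (the class names and the triple encoding are made up, and real graph-grammar rules operate at the meta-model level): each UML class maps to an owl:Class, and each association maps to an owl:ObjectProperty with a domain and a range.

```python
def uml_to_owl(classes, associations):
    """Toy transformation rule: UML classes become owl:Class, UML
    associations become owl:ObjectProperty triples (illustrative only)."""
    triples = []
    for c in classes:
        triples.append((c, "rdf:type", "owl:Class"))
    for name, src, dst in associations:
        triples.append((name, "rdf:type", "owl:ObjectProperty"))
        triples.append((name, "rdfs:domain", src))   # association source
        triples.append((name, "rdfs:range", dst))    # association target
    return triples

# hypothetical class diagram: Person --enrolledIn--> Course
triples = uml_to_owl(["Person", "Course"],
                     [("enrolledIn", "Person", "Course")])
```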
Abstract: Smart Dust particles are small smart materials used for generating weather maps. We investigate the question of the optimal number of Smart Dust particles necessary for generating precise, computationally feasible and cost-effective 3-D weather maps. We also give an optimal matching algorithm for the generalized scenario in which there are N Smart Dust particles and M ground receivers.
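For equal numbers of particles and receivers, the matching problem reduces to minimum-cost assignment. The brute-force sketch below is illustrative only (the cost matrix is made up, and it is not the paper's algorithm; a practical system would use an efficient method such as the Hungarian algorithm):

```python
from itertools import permutations

def optimal_matching(cost):
    """Brute-force minimum-cost matching of n particles to n receivers.
    cost[i][j] is the cost of assigning particle i to receiver j."""
    n = len(cost)
    best = min(permutations(range(n)),
               key=lambda p: sum(cost[i][p[i]] for i in range(n)))
    return list(best), sum(cost[i][best[i]] for i in range(n))

# hypothetical 3x3 particle-to-receiver cost matrix
cost = [[4, 1, 3],
        [2, 0, 5],
        [3, 2, 2]]
assignment, total = optimal_matching(cost)
```

Enumerating all n! assignments is only viable for tiny n, which is why the generalized N-to-M scenario calls for a dedicated algorithm.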
Abstract: A comparative analysis of Wald's and Bayes-type sequential methods for testing hypotheses is offered. The merits of the new sequential test are: universality, which consists in optimality (with the given criteria) and uniformity of decision-making regions for any number of hypotheses; simplicity, convenience and uniformity of the algorithms realizing them; reliability of the obtained results; and the possibility of guaranteeing error probabilities of desired values. Computation results for concrete examples are given, which confirm the above-stated characteristics of the new method and characterize the considered methods with respect to each other.
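For context, Wald's classical sequential probability ratio test, one of the two methods being compared, can be sketched for a Bernoulli parameter. The error levels and the data below are illustrative, not the paper's examples.

```python
import math

def sprt(samples, p0, p1, alpha=0.05, beta=0.05):
    """Wald's sequential probability ratio test for a Bernoulli
    parameter, H0: p = p0 vs H1: p = p1. Sampling stops as soon as
    the cumulative log-likelihood ratio leaves the open interval
    between the two decision boundaries."""
    a = math.log(beta / (1 - alpha))        # accept-H0 boundary
    b = math.log((1 - beta) / alpha)        # accept-H1 boundary
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # log-likelihood ratio increment for one observation
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
        if llr <= a:
            return "H0", n                  # stop, accept H0
        if llr >= b:
            return "H1", n                  # stop, accept H1
    return "undecided", len(samples)

data = [1, 1, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1]   # mostly successes
decision, n_used = sprt(data, p0=0.5, p1=0.8)
```

The early-stopping behaviour is the point of sequential testing: here the test commits to H1 after ten observations rather than a fixed sample size.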
Abstract: This paper presents the convergence analysis
of a prediction based blind equalizer for IIR channels.
Predictor parameters are estimated by using the recursive
least squares algorithm. It is shown that the prediction
error converges almost surely (a.s.) toward a scalar
multiple of the unknown input symbol sequence. It is
also proved that the convergence rate of the parameter
estimation error is of the same order as that in the iterated
logarithm law.
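The recursive least squares estimation of predictor parameters can be sketched on a toy autoregressive signal. This is a generic RLS sketch under stated assumptions (the forgetting factor, regularization constant, and AR(2) test signal are illustrative, not the paper's IIR channel setting):

```python
import numpy as np

def rls(u, d, order, lam=0.999, delta=100.0):
    """Recursive least squares sketch for estimating predictor taps.

    u     : input samples
    d     : desired samples (here the one-step-ahead targets)
    order : number of predictor taps
    lam   : forgetting factor; delta scales the initial inverse
            correlation matrix (both values are illustrative)
    """
    w = np.zeros(order)                 # parameter estimate
    P = delta * np.eye(order)           # inverse correlation matrix
    for n in range(order, len(u)):
        x = u[n - order:n][::-1]        # regressor, most recent first
        e = d[n] - w @ x                # a priori prediction error
        k = P @ x / (lam + x @ P @ x)   # gain vector
        w = w + k * e
        P = (P - np.outer(k, x @ P)) / lam
    return w

# toy AR(2) process: u[n] = 0.5 u[n-1] - 0.2 u[n-2] + noise
rng = np.random.default_rng(0)
u = np.zeros(2000)
noise = 0.01 * rng.standard_normal(2000)
for n in range(2, 2000):
    u[n] = 0.5 * u[n - 1] - 0.2 * u[n - 2] + noise[n]
w = rls(u, u, order=2)
```

On this synthetic signal the estimated taps approach the true coefficients (0.5, -0.2), illustrating the kind of parameter convergence whose rate the paper analyzes.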
Abstract: In view of their importance and usefulness in reliability theory and probability distributions, several generalizations of the inverse Gaussian distribution and the Krätzel function have been investigated in recent years. This has motivated the authors to introduce and study a new generalization of the inverse Gaussian distribution and the Krätzel function, associated with a product of a Bessel function of the third kind and a Fox-Wright generalized hypergeometric function introduced in this paper. The introduced function turns out to be a unified gamma-type function. Its incomplete forms are also discussed. Several properties of this gamma-type function are obtained. By means of this generalized function, we introduce a generalization of the inverse Gaussian distribution, which is useful in reliability analysis, diffusion processes, and radio techniques. The inverse Gaussian distribution thus introduced also provides a generalization of the Krätzel function. Some basic statistical functions associated with this probability density function, such as the moments, the Mellin transform, the moment generating function, the hazard rate function, and the mean residue life function, are also obtained. Keywords: Fox-Wright function, inverse Gaussian distribution, Krätzel function, Bessel function of the third kind.
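For reference, the classical inverse Gaussian density being generalized (a standard formula, not the paper's generalized form) is

```latex
f(x;\mu,\lambda) = \sqrt{\frac{\lambda}{2\pi x^{3}}}\,
\exp\!\left(-\frac{\lambda (x-\mu)^{2}}{2\mu^{2} x}\right),
\qquad x > 0,\; \mu > 0,\; \lambda > 0,
```

with mean $\mu$ and shape parameter $\lambda$; the paper's generalization is built by means of the unified gamma-type function described above.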
Abstract: Novel acrylated epoxidized hemp oil (AEHO) based
bioresins were successfully synthesised, characterized and applied to
biocomposites reinforced with woven jute fibre. Characterisation of
the synthesised AEHO consisted of acid number titrations and FTIR
spectroscopy to assess the success of the acrylation reaction. Three
different matrices were produced (vinylester (VE), 50/50 blend of
AEHO/VE and 100% AEHO) and reinforced with jute fibre to form
three different types of biocomposite samples. Mechanical properties
in the form of flexural and interlaminar shear strength (ILSS) were
investigated and compared for the different samples. Results from the
mechanical tests showed that AEHO and 50/50 based neat bioresins
displayed lower flexural properties compared with the VE samples.
However, when applied to biocomposites and compared with VE
based samples, AEHO biocomposites demonstrated comparable
flexural performance and improved ILSS. These results are attributed
to improved fibre-matrix interfacial adhesion due to surface-chemical
compatibility between the natural fibres and bioresin.
Abstract: In many buildings we rely on large footings to offer
structural stability. Designers often compensate for the lack of
knowledge available with regard to foundation-soil interaction by
furnishing structures with overly large footings. This may lead to a
significant increase in building expenditures if many large
foundations are present. This paper describes the interface material
law that governs the behavior along the contact surface of adjacent
materials, and the behavior of a large foundation under ultimate limit
loading. A case study is chosen that represents a common
foundation-soil system frequently used in general practice and
therefore relevant to other structures. Investigations include
compressing versus uplifting wind forces, alterations to the
foundation size and subgrade compositions, the role of the stiffness and presence of the slab, and the effect of commonly used structural joints and connections. These investigations aim to provide the
reader with an objective design approach, efficiently preventing
structural instability.
Abstract: Biological Ammonia removal (nitrification), the
oxidation of ammonia to nitrate catalyzed by bacteria, is a key part of
global nitrogen cycling. In the first step of nitrification,
chemolithoautotrophic ammonia oxidizers transform ammonia to nitrite, which is subsequently oxidized to nitrate by nitrite-oxidizing bacteria. This process can be affected by several factors. In this study the effect of influent COD on biological ammonia removal in a bench-scale biological reactor was investigated. Experiments were carried out using synthetic wastewater. The initial ammonium concentration was 25 mg NH4+-N L-1. The effect of COD between 247.55±1.8 and 601.08±3.24 mg L-1 on biological ammonia removal was investigated by varying the COD loading supplied to the reactor. From the results obtained in this study it can be concluded that in the range of 247.55±1.8 to 351.35±2.05 mg L-1 there is a direct relationship between the amount of COD and ammonia removal, whereas from above 351.35±2.05 up to 601.08±3.24 mg L-1 an inverse relationship between them was found.