Abstract: This paper addresses the problem of how one can
improve the performance of a non-optimal filter. First, the theoretical
question of a dynamical representation for a given time-correlated
random process is studied. It is demonstrated that, for a wide class of
random processes having a canonical form, there exists an equivalent
dynamical system in the sense that its output has the same covariance
function. It is shown that the dynamical approach is more effective for
simulating and estimating Markovian and non-Markovian random
processes and is computationally less demanding, especially as the
dimension of the simulated processes increases.
Numerical examples and estimation problems in low dimensional
systems are given to illustrate the advantages of the approach. A very useful application of the proposed approach is shown for the
problem of state estimation in very high-dimensional systems. Here a
modified filter for data assimilation in an oceanic numerical model
is presented, which proves very efficient owing to the introduction of
a simple Markovian structure for the output prediction-error process
and adaptive tuning of some parameters of the Markov equation.
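The simplest instance of such a covariance-equivalent dynamical representation can be sketched for an exponential covariance C(τ) = σ²·exp(−α|τ|): a first-order Markov (AR(1)) recursion whose stationary output reproduces that covariance. The parameters and discretization below are illustrative assumptions, not the model used in the paper.

```python
import math, random

def ar1_samples(sigma, alpha, dt, n, seed=0):
    """Simulate x_{k+1} = a*x_k + w_k with a = exp(-alpha*dt).

    The stationary output has covariance C(tau) = sigma^2 * exp(-alpha*|tau|),
    i.e. this AR(1) system is covariance-equivalent to that random process.
    """
    rng = random.Random(seed)
    a = math.exp(-alpha * dt)
    q = sigma * math.sqrt(1.0 - a * a)   # noise std so the variance stays sigma^2
    x = rng.gauss(0.0, sigma)            # start from the stationary distribution
    out = []
    for _ in range(n):
        out.append(x)
        x = a * x + rng.gauss(0.0, q)
    return out

def sample_cov(xs, lag):
    """Biased sample autocovariance at the given lag."""
    n = len(xs) - lag
    m = sum(xs) / len(xs)
    return sum((xs[i] - m) * (xs[i + lag] - m) for i in range(n)) / n
```

With sigma=1, alpha=0.5 and dt=0.1, the sample covariance at lag 10 steps approaches exp(−0.5·1.0) ≈ 0.607, confirming the equivalence numerically.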
Abstract: This study presents an investigation of
electrochemical variables and an application of the optimal
parameters in operating a continuous upflow electrocoagulation
reactor in removing dye. Direct red 23, which is azo-based, was used
as a representative of direct dyes. First, a batch mode was employed
to optimize the design parameters: electrode type, electrode distance,
current density and electrocoagulation time. The optimal parameters
were found to be an iron anode, an inter-electrode distance of 8 mm
and a current density of 30 A·m⁻² with a contact time of 5 min. The
performance of the continuous upflow reactor with these parameters
was satisfactory, with >95% color removal and energy consumption
on the order of 0.6-0.7 kWh·m⁻³.
Abstract: In this work, the propagation of uncertainty during the
calibration process of TRANUS, an integrated land use and transport
model (ILUTM), has been investigated. A sensitivity analysis has also
been carried out to examine which input parameters most affect the
variation of the outputs. Moreover, a probabilistic methodology for
verifying the calibration process, which equates the observed and
calculated production, has been proposed. The model chosen as an
application is that of the city of Grenoble, France. For sensitivity
analysis and uncertainty propagation, the Monte Carlo method was
employed, and a statistical hypothesis test was used for verification.
The parameters of the induced demand function in TRANUS were
assumed to be uncertain in the present case. It was found that if,
during calibration, TRANUS
converges, then with a high probability the calibration process is
verified. Moreover, a weak correlation was found between the inputs
and the outputs of the calibration process. The total effect of the
inputs on outputs was investigated, and the output variation was found
to be dictated by only a few input parameters.
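The Monte Carlo workflow described above can be sketched generically: sample the uncertain parameters, propagate them through the model, and rank inputs by their correlation with the output. The toy model and parameter distributions below are illustrative placeholders; the study's actual induced-demand function in TRANUS is not reproduced.

```python
import random, statistics

def model(p1, p2, p3):
    """Toy stand-in for a model output; the real study uses TRANUS's
    induced-demand function, which is not reproduced here."""
    return 3.0 * p1 + 0.2 * p2 ** 2 + 0.01 * p3

def monte_carlo(n=5000, seed=1):
    """Draw uncertain parameters and propagate them through the model."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        p = [rng.gauss(1.0, 0.1) for _ in range(3)]
        samples.append((p, model(*p)))
    return samples

def correlation(xs, ys):
    """Pearson correlation, used here as a cheap sensitivity index."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (sx * sy)

samples = monte_carlo()
outputs = [y for _, y in samples]
corrs = [correlation([p[i] for p, _ in samples], outputs) for i in range(3)]
# ranking the |corrs| values shows which parameter dominates the output variation
```

In this toy case the first parameter dominates, mirroring the study's finding that output variation is dictated by only a few inputs.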
Abstract: Embedding and extraction of secret information as
well as restoration of the original un-watermarked image are
highly desirable in sensitive applications like military, medical, and
law enforcement imaging. This paper presents a novel reversible
data-hiding method for digital images using an integer-to-integer
wavelet transform and a companding technique, which can embed and
recover the secret information as well as restore the image to its
pristine state. The method takes advantage of block-based
watermarking and iterative optimization of the companding threshold,
which avoids histogram pre- and post-processing. Consequently, it
reduces the associated overhead usually required in most
reversible watermarking techniques. As a result, it keeps the
distortion small between the marked and the original images.
Experimental results show that the proposed method outperforms the
existing reversible data hiding schemes reported in the literature.
Abstract: In this article, we propose to model a Net-banking
system using game theory. We adopt an extensive-form game to model
our web application. We present the model in terms of players and
strategies, together with a UML diagram of the game protocol.
Abstract: A numerical method for solving nonlinear Fredholm integral equations of the second kind is proposed. Fredholm-type equations, which have many applications in mathematical physics, are considered. The method is based on hybrid function approximations. The properties of hybrids of block-pulse functions and Chebyshev polynomials are presented and are utilized to reduce the computation of nonlinear Fredholm integral equations to the solution of a system of nonlinear algebraic equations. Some numerical examples are selected to illustrate the effectiveness and simplicity of the method.
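The reduction the abstract describes — turning a nonlinear Fredholm integral equation of the second kind into a nonlinear algebraic system — can be illustrated with plain collocation and trapezoidal quadrature instead of the hybrid block-pulse/Chebyshev basis (which is not reproduced here); the test equation and its exact solution u(x) = x are contrived for the sketch.

```python
def solve_fredholm(f, K, lam, n=101, iters=60):
    """Collocate u(x) = f(x) + lam * int_0^1 K(x,t) u(t)^2 dt on a uniform
    grid (trapezoidal weights) and solve the resulting nonlinear algebraic
    system by successive substitution."""
    h = 1.0 / (n - 1)
    xs = [i * h for i in range(n)]
    w = [h] * n
    w[0] = w[-1] = h / 2.0                 # trapezoidal quadrature weights
    u = [0.0] * n
    for _ in range(iters):
        u = [f(x) + lam * sum(w[j] * K(x, xs[j]) * u[j] ** 2 for j in range(n))
             for x in xs]
    return xs, u

# Contrived test equation with known solution u(x) = x:
#   u(x) = 0.75*x + x * int_0^1 t * u(t)^2 dt   (the integral equals 1/4)
xs, u = solve_fredholm(lambda x: 0.75 * x, lambda x, t: x * t, lam=1.0)
err = max(abs(ui - xi) for xi, ui in zip(xs, u))   # max deviation from u(x)=x
```

Successive substitution converges here because the iteration map is contractive near the small root; the paper's hybrid basis plays the role the uniform grid plays in this sketch.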
Abstract: Knowledge is a key asset for any organisation seeking to
sustain competitive advantage, but it is difficult to identify and
represent the knowledge needed to perform activities in
business processes. Effective knowledge management and
support for relevant business activities have a major impact
on the performance of the organisation as a whole, because
knowledge has the functions of directing, coordinating and
controlling actions within business processes. The study
introduces organisational morphology, a norm-based approach
applying semiotic theories, which emphasise the representation of
knowledge in norms. This approach is concerned with the
classification of activities into three categories: substantive,
communication and control activities. All activities are directed by
norms; hence three types of norms exist, each associated with a
category of activities. The paper describes the approach briefly and
illustrates the application of this approach through a case study of
academic activities in higher education institutions. The result of the
study shows that the approach provides an effective way to profile
business knowledge and the profile enables the understanding and
specification of business requirements of an organisation.
Abstract: The daily increase in organic waste materials resulting
from different activities in the country is one of the main causes of
environmental pollution. Today, given the low output of traditional
methods, the high cost of waste disposal and the resulting
environmental pollution, modern methods such as anaerobic digestion
for the production of biogas have become prevalent. Biogas collected
from anaerobic digestion is usable as a renewable energy source
similar to natural gas, though with a lower methane content and
heating value. Today, with the help of filtration and proper
preparation technologies, biogas with features fully similar to
natural gas has become accessible. At present, biogas is one of the
main sources of electrical and thermal energy and also an appropriate
option for use in four-stroke engines, diesel engines, Stirling
engines, gas turbines, gas microturbines and fuel cells to produce
electricity. Owing to its socio-economic and environmental
advantages, biogas has attracted worldwide attention as a fuel for
combined heat and power (CHP) generation. The production of biogas
through anaerobic digestion and its application in CHP power plants
in Iran can not only supply part of the country's energy demand but
can also advance sustainable development. In this article, the
necessity of developing biogas-fuelled CHP plants in the country is
addressed based on studies of its economic, environmental and social
aspects. Moreover, to demonstrate the economic importance of
establishing such power plants, the necessary calculations have been
carried out as a case study for a CHP power plant with biogas fuel.
Abstract: The pool of Internet Protocol version 4 (IPv4) addresses is shrinking, and a rapid transition method to the next-generation IP address (IPv6) should be established. This study aims to evaluate and select the best-performing IPv6 network transition mechanisms, such as IPv4/IPv6 dual stack, Transport Relay Translation (TRT) and Reverse Proxy with additional features. It also aims to prove that faster access can be achieved while ensuring optimal usage of the resources available during testing and actual implementation. This study used two test methods, Internet Control Message Protocol (ICMP) ping and Apache Benchmark (AB), to evaluate performance. Performance metrics include the average number of accesses per second, the time taken for a single access, the data transfer speed and the cost of additional requirements. Reverse Proxy with the caching feature is the most efficient mechanism because of its simpler configuration and its superior performance in the tests conducted.
Abstract: We report in this paper an automatic speech
recognition system based on dynamic programming techniques.
Temporal alignment ("retiming") is a technique used to align two
patterns for comparison. We show how this technique is adapted to
the field of automatic speech recognition. We first expose the theory
of the alignment function, which is used to compare and adjust an
unknown pattern against a set of reference patterns constituting the
vocabulary of the application. We then give the various algorithms
necessary for their implementation. The algorithms we present were
tested on part of the Arabic-language word corpus Arabdic-10 [4]
and gave fully satisfactory results. These algorithms are effective
when applied to small or medium vocabularies.
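The temporal alignment the abstract relies on is dynamic time warping; a minimal sketch of the dynamic-programming recursion and its use against a reference vocabulary follows. The scalar feature sequences and labels are hypothetical, not drawn from the Arabdic-10 corpus.

```python
def dtw_distance(a, b):
    """Dynamic-programming alignment cost between two feature sequences.

    D[i][j] = |a[i-1] - b[j-1]| + min(D[i-1][j], D[i][j-1], D[i-1][j-1]);
    the bottom-right cell is the cost of the best temporal warping.
    """
    n, m = len(a), len(b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

def recognize(unknown, references):
    """Return the vocabulary label whose reference pattern warps best."""
    return min(references, key=lambda label: dtw_distance(unknown, references[label]))

# Hypothetical one-dimensional feature templates for two vocabulary words:
refs = {"one": [1, 2, 3, 2, 1], "two": [1, 3, 5, 3, 1]}
```

A stretched utterance of a word warps onto its own reference at zero cost, which is what makes the comparison robust to speaking-rate variation.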
Abstract: Due to new distributed database applications such as
huge deductive database systems, search complexity is constantly
increasing and better algorithms are needed to speed up traditional
relational database queries. An optimal dynamic programming
method for such high-dimensional queries has the major disadvantage
of exponential complexity, so semi-optimal but faster approaches are
of interest. In this work we present a multi-agent-based mechanism to
meet this demand and compare the results with some commonly used
query optimization algorithms.
Abstract: This paper is concerned with the production of an Arabic word semantic similarity benchmark dataset. It is the first of its kind for Arabic and was developed specifically to assess the accuracy of word semantic similarity measurements. Semantic similarity is an essential component of numerous applications in fields such as natural language processing, artificial intelligence, linguistics, and psychology. Most of the reported work has been done for English. To the best of our knowledge, there is no word similarity measure developed specifically for Arabic. In this paper, an Arabic benchmark dataset of 70 word pairs is presented. New methods and the best available techniques have been used in this study to produce the Arabic dataset, including selecting and creating materials, collecting human ratings from a representative sample of participants, and calculating the overall ratings. This dataset will make a substantial contribution to future work in the field of Arabic word semantic similarity (WSS) and will hopefully serve as a reference basis from which to evaluate and compare different methodologies in the field.
Abstract: The application of flexible structures has increased
significantly in industry and aerospace missions due to their
contributions and unique advantages over their rigid counterparts.
In this paper, the vibration of a flexible structure, namely an
automobile wiper blade, is analyzed and controlled. The wiper
generates unwanted noise and vibration while wiping rain and other
particles off the windshield, which may cause annoying noise in
different frequency ranges. A two-dimensional analytical model of the
wiper blade, whose accuracy has been verified against numerical
studies in the literature, is considered in this study. Particle swarm
optimization (PSO) is employed in combination with the input shaping
(IS) technique in order to control or attenuate the amplitude of the
unwanted noise/vibration of the wiper blade.
Abstract: Image interpolation is a common problem in imaging applications. However, most existing interpolation algorithms suffer to some extent from the visual effects of blurred edges and jagged artifacts in the image. This paper presents an adaptive feature-preserving bidirectional flow process, where an inverse diffusion is performed to enhance edges along the directions normal to the isophote lines (edges), while a normal diffusion is done to remove artifacts ("jaggies") along the tangent directions. In order to preserve image features such as edges, angles and textures, the nonlinear diffusion coefficients are locally adjusted according to the first- and second-order directional derivatives of the image. Experimental results on synthetic and natural images demonstrate that our interpolation algorithm substantially improves the subjective quality of the interpolated images over conventional interpolations.
Abstract: Next-generation wireless systems, especially
cognitive radio networks, aim to utilize network resources more
efficiently. They share a wide range of available spectrum in an
opportunistic manner. In this paper, we propose a quality
management model for the short-term sub-lease of unutilized spectrum
bands to different service providers. We build our model on a
competitive secondary-market architecture. To establish the
necessary conditions for convergent behavior, we utilize techniques
from game theory. Our proposed model is based on a potential-game
approach that is suitable for systems with dynamic decision making.
The Nash equilibrium point gives the spectrum holders the ideal price
values at which profit is maximized at the highest level of customer
satisfaction. Our numerical results show that the price decisions of
the network providers depend on the price and QoS of their own
bands as well as the prices and QoS levels of their opponents' bands.
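The equilibrium computation can be sketched with best-response iteration on a deliberately simplified two-provider pricing game; the linear demand d_i = a − b·p_i + c·p_j and its parameters are illustrative assumptions, not the paper's QoS-dependent potential game.

```python
def best_response(p_other, a=1.0, b=1.0, c=0.5):
    """Profit p*(a - b*p + c*p_other) is maximized at p = (a + c*p_other)/(2b).

    a, b, c are illustrative demand parameters for substitutable spectrum
    bands, not the paper's QoS-dependent model.
    """
    return (a + c * p_other) / (2.0 * b)

def find_equilibrium(iters=50):
    """Iterate simultaneous best responses until prices stop moving.

    The fixed point p = a / (2b - c) is a Nash equilibrium: neither
    provider can raise its profit by deviating unilaterally.
    """
    p1 = p2 = 0.0
    for _ in range(iters):
        p1, p2 = best_response(p2), best_response(p1)
    return p1, p2
```

With a = b = 1 and c = 0.5 the iteration contracts (factor c/2b = 0.25 per step), so convergence to the equilibrium price a/(2b − c) = 2/3 is fast, which is the kind of convergent behavior a potential-game structure guarantees.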
Abstract: Owing to the high feed rates and ultra-high spindle
speeds used in modern machine tools, tool-path generation plays a key
role in the successful application of a High-Speed Machining (HSM)
system. Because of its importance in both high-speed machining and
tool-path generation, approximating a contour in NURBS format is a
valuable capability in CAD/CAM/CNC systems. It is much more
convenient to represent an ellipse in parametric form than to connect
laboriously determined points in a CNC system. A new approximation
method based on optimization and NURBS curves of arbitrary degree
for ellipses is presented in this study. Such operations can form the
foundation of a tool-radius compensation interpolator for NURBS
curves in CNC systems. All operating processes for a CAD tool are
presented and demonstrated with practical models.
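One standard way an ellipse is represented exactly in NURBS form — a rational quadratic Bézier segment per quarter — can be sketched as follows; this illustrates the parametric representation the abstract refers to, not the paper's optimum approximation procedure.

```python
import math

def quarter_ellipse_point(t, a, b):
    """Exact quarter ellipse as a rational quadratic Bezier (a NURBS segment).

    Control points (a,0), (a,b), (0,b) with weights 1, sqrt(2)/2, 1 reproduce
    the ellipse x^2/a^2 + y^2/b^2 = 1 exactly for t in [0, 1].
    """
    w = (1.0, math.sqrt(2.0) / 2.0, 1.0)
    px = (a, a, 0.0)
    py = (0.0, b, b)
    basis = ((1 - t) ** 2, 2 * t * (1 - t), t ** 2)   # quadratic Bernstein basis
    denom = sum(bi * wi for bi, wi in zip(basis, w))
    x = sum(bi * wi * xi for bi, wi, xi in zip(basis, w, px)) / denom
    y = sum(bi * wi * yi for bi, wi, yi in zip(basis, w, py)) / denom
    return x, y
```

Every evaluated point satisfies x²/a² + y²/b² = 1 exactly, which is why a parametric NURBS form spares the CNC interpolator from connecting laboriously computed points.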
Abstract: In order to study the effect of phosphate-solubilizing
microorganisms (PSM) and plant growth promoting rhizobacteria
(PGPR) on the yield and yield components of corn Zea mays (L. cv.
SC604), an experiment was conducted at the research farm of Sari
Agricultural Sciences and Natural Resources University, Iran, during
2007. The experiment was laid out as a split plot based on a
randomized complete block design with three replications. The
treatments were three levels of manure (20 Mg·ha⁻¹ farmyard manure,
15 Mg·ha⁻¹ green manure and a check without any manure) as main
plots and eight levels of biofertilizer (1-NPK or conventional
fertilizer application; 2-NPK+PSM+PGPR; 3-NP50%K+PSM+PGPR; 4-
N50%PK+PSM+PGPR; 5-N50%P50%K+PSM+PGPR; 6-PK+PGPR; 7-
NK+PSM and 8-PSM+PGPR) as sub-plots. Results showed that
farmyard manure application increased row number, ear weight, grain
number per ear, grain yield, biological yield and harvest index
compared to the check. Furthermore, the use of PSM and PGPR in
addition to conventional fertilizer application (NPK) improved ear
weight, row number and grain number per row and ultimately
increased grain yield in the green manure and check plots. According
to the results, in all fertilizer treatments the combined application of
PSM and PGPR could reduce P application by 50% without any
significant reduction in grain yield. However, this treatment could
not compensate for a 50% reduction in N application.
Abstract: The abnormal increase in the number of applications available for download in Android markets is a good indication that they are being reused. However, little is known about their real reusability potential, and a considerable number of these applications are reported to be of poor quality or malicious. Hence, in this paper, an approach to measure the reusability potential of classes in Android applications is proposed. The approach is not meant specifically for this particular type of application. Rather, it is intended for Object-Oriented (OO) software systems in general and also aims to provide means to prevent the classes of low-quality and defect-prone applications from being reused directly through inheritance and instantiation. An empirical investigation is conducted to measure and rank the reusability potential of the classes of randomly selected Android applications. The results obtained are thoroughly analyzed in order to understand the extent of this potential and the factors influencing it.
Abstract: In many applications there is a broad variety of
information relevant to a focal "object" of interest, and the fusion of such heterogeneous data types is desirable for classification and
categorization. While these various data types can sometimes be treated as orthogonal (such as the hull number, superstructure color,
and speed of an oil tanker), there are instances where the inference and the correlation between quantities can provide improved fusion
capabilities (such as the height, weight, and gender of a person). A
service-oriented architecture has been designed and prototyped to
support the fusion of information for such "object-centric" situations.
It is modular, scalable, and flexible, and designed to support new data sources, fusion algorithms, and computational resources without affecting existing services. The architecture is designed to simplify
the incorporation of legacy systems, support exact and probabilistic entity disambiguation, recognize and utilize multiple types of
uncertainties, and minimize network bandwidth requirements.
Abstract: In order to achieve competitive advantage and better
firm performance, the supply chain management (SCM) strategy
should support and drive forward the business strategy. This means
that the supply chain should be aligned with the business strategy; at
the same time, supply chain (SC) managers need an appropriate
information system (IS) solution to support their strategy in order to
stay competitive. Different kinds of IS strategies enable managers to
meet SC requirements by selecting the best IS strategy. It is therefore
important to align IS strategies and practices with SC strategies and
practices, which helps in planning for an IS application that supports
and enhances the SCM strategy. In this study, aligning IS with SC at
the strategy level is considered. The main aim of this paper is to align
the various IS strategies with SCM strategies and demonstrate their
impact on SC and firm performance.