Abstract: Simultaneous Saccharification and Fermentation (SSF) of sugarcane bagasse by cellulase and Pachysolen tannophilus MTCC 1077 was investigated in the present study. Important process variables for ethanol production from pretreated bagasse were optimized using Response Surface Methodology (RSM) based on central composite design (CCD) experiments. A 2³ five-level CCD with central and axial points was used to develop a statistical model for the optimization of the process variables incubation temperature (25–45°C) X1, pH (5.0–7.0) X2, and fermentation time (24–120 h) X3. The ethanol production data obtained from RSM were subjected to analysis of variance (ANOVA), fitted to a second-order polynomial equation, and contour plots were used to study the interactions among the three relevant variables of the fermentation process. The fermentation experiments were carried out in an online-monitored modular fermenter of 2 L capacity. A maximum response for ethanol production was reached at the optimum values of temperature (32°C), pH (5.6), and fermentation time (110 h). A maximum ethanol concentration of 3.36 g/l was obtained from 50 g/l pretreated sugarcane bagasse under the optimized process conditions in aerobic batch fermentation. Kinetic models, namely the Monod model, the modified logistic model, the modified logistic model incorporating the Luedeking–Piret model, and the modified logistic model incorporating the modified Luedeking–Piret model, were evaluated and their constants predicted.
Abstract: Coloring the vertices of a graph so that every two
adjacent vertices receive different colors is a very common problem in
graph theory, known as proper coloring of graphs. The
number of different proper colorings of a graph with a given
number of colors can be represented by a function called the
chromatic polynomial. Two graphs G and H are said to be
chromatically equivalent if they share the same chromatic
polynomial. A graph G is chromatically unique if G is isomorphic to
H for any graph H that is chromatically equivalent to G. The
study of chromatic equivalence and chromatic uniqueness problems
is called chromaticity. This paper shows that the wheel W12 is
chromatically unique.
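The brute-force meaning of the chromatic polynomial can be illustrated with a short sketch (not part of the paper's proof; the wheel convention used here, a hub joined to an n-vertex rim cycle, is an assumption, and notation for W12 varies between authors):

```python
from itertools import product

def count_proper_colorings(edges, n_vertices, k):
    """Brute-force evaluation of the chromatic polynomial P(G, k):
    count assignments where adjacent vertices get different colors."""
    count = 0
    for coloring in product(range(k), repeat=n_vertices):
        if all(coloring[u] != coloring[v] for u, v in edges):
            count += 1
    return count

def wheel_edges(n):
    """Wheel graph: hub vertex 0 joined to a rim cycle on vertices 1..n."""
    rim = [(i, i % n + 1) for i in range(1, n + 1)]
    spokes = [(0, i) for i in range(1, n + 1)]
    return rim + spokes

# Closed form: P(W_n, k) = k * ((k-2)**n + (-1)**n * (k-2)),
# from P(C_n, k) = (k-1)**n + (-1)**n (k-1) with k-1 colors on the rim.
n, k = 5, 4
brute = count_proper_colorings(wheel_edges(n), n + 1, k)
closed = k * ((k - 2) ** n + (-1) ** n * (k - 2))
```

Two chromatically equivalent graphs give identical counts for every k, which is what chromatic uniqueness results such as the one above rule out for non-isomorphic partners.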
Abstract: We introduce a logic-based framework for database
updating under constraints. In our framework, the constraints are
represented as an instantiated extended logic program. When performing
an update, database consistency may be violated. We provide
an approach to maintaining database consistency and study the
conditions under which the maintenance process is deterministic. We
show that the complexity of the computations and decision problems
presented in our framework is polynomial-time in each case.
Abstract: This paper focuses on the development of bond graph
dynamic model of the mechanical dynamics of an excavating mechanism
previously designed to be used with small tractors, which are
fabricated in the Engineering Workshops of Jomo Kenyatta University
of Agriculture and Technology. To develop a mechanical dynamics
model of the manipulator, forward recursive equations similar to
those applied in iterative Newton-Euler method were used to obtain
kinematic relationships between the time rates of joint variables
and the generalized Cartesian velocities of the centroids of the
links. Representing the obtained kinematic relationships in bond graph
form, with the link weights and momenta as the elements,
led to a detailed bond graph model of the manipulator.
The bond graph method was found to significantly reduce the number
of recursive computations required to obtain a mechanical dynamic
model of a 3-DOF manipulator, indicating that the bond graph
method is more computationally efficient than the Newton-Euler
method in developing dynamic models of 3-DOF planar manipulators.
The model was verified by comparing the joint torque expressions
of a two link planar manipulator to those obtained using Newton-
Euler and Lagrangian methods as analyzed in robotic textbooks. The
expressions were found to agree indicating that the model captures
the aspects of rigid body dynamics of the manipulator. Based on
the model developed, actuator sizing and valve sizing methodologies
were developed and used to obtain the optimal sizes of the pistons
and spool valve ports, respectively. It was found that, using a pump
with the sized flow rate capacity, the engine of the tractor is able to
power the excavating mechanism in digging a sandy-loam soil.
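The forward velocity recursion described above can be sketched numerically for a two-link planar arm (a minimal illustration, not the authors' bond graph model; mid-link centroids and the link parameters are assumptions):

```python
import math

def centroid_velocities(l1, l2, q1, q2, dq1, dq2):
    """Forward recursion: propagate angular and linear velocity outward
    from the base, returning the centroid velocities of both links.
    Centroids are assumed at mid-link."""
    def perp(theta):            # unit vector perpendicular to a link direction
        return (-math.sin(theta), math.cos(theta))

    w1 = dq1                    # angular velocity of link 1
    p1 = perp(q1)
    vc1 = (w1 * l1 / 2 * p1[0], w1 * l1 / 2 * p1[1])   # centroid of link 1

    w2 = dq1 + dq2              # angular velocity of link 2
    vj = (w1 * l1 * p1[0], w1 * l1 * p1[1])            # velocity of joint 2
    p2 = perp(q1 + q2)
    vc2 = (vj[0] + w2 * l2 / 2 * p2[0],
           vj[1] + w2 * l2 / 2 * p2[1])                # centroid of link 2
    return vc1, vc2
```

The same outward pass underlies both the iterative Newton-Euler method and the kinematic bonds of a bond graph model; the recursion can be checked against finite differences of the centroid positions.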
Abstract: In this paper, center conditions and bifurcation of limit cycles at the nilpotent critical point in a class of quintic polynomial differential systems are investigated. With the help of the computer algebra system MATHEMATICA, the first 10 quasi-Lyapunov constants are deduced. As a result, necessary and sufficient conditions for a center are obtained. It is also proved that 10 small-amplitude limit cycles can be created from the third-order nilpotent critical point. Hence we give a lower bound on the cyclicity of the third-order nilpotent critical point for quintic Lyapunov systems. Finally, we give a system that can bifurcate 10 limit cycles.
Abstract: The control design for unmanned underwater vehicles (UUVs) is challenging due to the uncertainties in the complex dynamic modeling of the vehicle as well as its unstructured operational environment. To cope with these difficulties, a practical robust control is therefore desirable. The paper deals with the application of coefficient diagram method (CDM) for a robust control design of an autonomous underwater vehicle. The CDM is an algebraic approach in which the characteristic polynomial and the controller are synthesized simultaneously. Particularly, a coefficient diagram (comparable to Bode diagram) is used effectively to convey pertinent design information and as a measure of trade-off between stability, response speed and robustness. In the polynomial ring, Kharitonov polynomials are employed to analyze the robustness of the controller due to parametric uncertainties.
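The Kharitonov robustness test mentioned above can be sketched as follows (an illustrative check on an assumed interval polynomial, not the authors' CDM controller design):

```python
import numpy as np

def kharitonov_polys(lower, upper):
    """Four Kharitonov polynomials of an interval polynomial with
    coefficients in ascending order c[0] + c[1]*s + ... + c[n]*s**n,
    each c[i] in [lower[i], upper[i]]. Bound choices repeat with period 4."""
    patterns = ["llhh", "hhll", "hllh", "lhhl"]
    return [[lower[i] if pat[i % 4] == "l" else upper[i]
             for i in range(len(lower))]
            for pat in patterns]

def is_hurwitz(coeffs_ascending):
    """Stability check: all roots strictly in the open left half-plane."""
    roots = np.roots(coeffs_ascending[::-1])
    return bool(np.all(roots.real < 0))

# Example family: s^3 + a2 s^2 + a1 s + a0 with interval coefficients.
lower = [1.0, 3.0, 5.0, 1.0]
upper = [2.0, 4.0, 6.0, 1.0]
robust = all(is_hurwitz(p) for p in kharitonov_polys(lower, upper))
```

By Kharitonov's theorem, stability of these four corner polynomials implies stability of the entire interval family, which is why the test is attractive for parametric-uncertainty analysis.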
Abstract: The RK5GL3 method is a numerical method for solving
initial value problems in ordinary differential equations, and is
based on a combination of a fifth-order Runge-Kutta method and
3-point Gauss-Legendre quadrature. In this paper we describe an
effective local error control algorithm for RK5GL3, which uses local
extrapolation with an eighth-order Runge-Kutta method in tandem
with RK5GL3, and a Hermite interpolating polynomial for solution
estimation at the Gauss-Legendre quadrature nodes.
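Hermite interpolation at Gauss-Legendre nodes, as used for solution estimation in RK5GL3, can be illustrated with a minimal sketch (a cubic Hermite interpolant for simplicity, not the authors' higher-order polynomial):

```python
import math

def hermite_cubic(x0, x1, y0, y1, d0, d1, x):
    """Cubic Hermite interpolant matching values and derivatives at x0, x1."""
    h = x1 - x0
    t = (x - x0) / h
    h00 = (1 + 2 * t) * (1 - t) ** 2
    h10 = t * (1 - t) ** 2
    h01 = t * t * (3 - 2 * t)
    h11 = t * t * (t - 1)
    return h00 * y0 + h10 * h * d0 + h01 * y1 + h11 * h * d1

def gl3_nodes(x0, x1):
    """3-point Gauss-Legendre quadrature nodes mapped from [-1,1] to [x0, x1]."""
    half = (x1 - x0) / 2
    mid = (x0 + x1) / 2
    r = half * math.sqrt(3 / 5)
    return (mid - r, mid, mid + r)
```

In an RK5GL3-style step, the solution and derivative values at the step endpoints would supply the Hermite data, and the interpolant would then be evaluated at the interior quadrature nodes.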
Abstract: Uncertainties in a serial production line affect the
production throughput. These uncertainties cannot be prevented in a
real production line; however, the uncertain conditions can be
controlled by a robust prediction model. Thus, a hybrid model
combining an autoregressive integrated moving average (ARIMA)
model and multiple polynomial regression is proposed to model the nonlinear
relationship between production uncertainties and throughput. The
uncertainties considered in this study are demand, break-time,
scrap, and lead-time. The nonlinear relationship between production
uncertainties and throughput is examined in the form of quadratic
and cubic regression models, whose adjusted R-squared values
were 98.3% and 98.2%, respectively. We optimized
the multiple quadratic regression (MQR) by considering the time
series trend of the uncertainties using the ARIMA model. Finally, the
hybrid ARIMA-MQR model achieves a better adjusted
R-squared of 98.9%.
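The adjusted R-squared comparison of polynomial regression fits can be sketched as follows (synthetic data; the variable names are illustrative, not the study's dataset):

```python
import numpy as np

def adjusted_r2(y, y_hat, n_params):
    """Adjusted R-squared: R^2 penalized by the number of predictors."""
    n = len(y)
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    r2 = 1 - ss_res / ss_tot
    return 1 - (1 - r2) * (n - 1) / (n - n_params - 1)

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)                                    # e.g. a demand level
y = 2.0 + 1.5 * x - 0.3 * x ** 2 + rng.normal(0, 0.5, x.size)  # e.g. throughput

coeffs = np.polyfit(x, y, deg=2)          # quadratic regression fit
y_hat = np.polyval(coeffs, x)
ar2 = adjusted_r2(y, y_hat, n_params=2)
```

Comparing `ar2` across quadratic and cubic fits mirrors the 98.3% vs 98.2% comparison in the abstract: the adjustment keeps a higher-order model from winning on raw R-squared alone.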
Abstract: Resource-constrained project scheduling is an NP-hard
optimisation problem. There are many different heuristic
strategies for shifting activities in time when resource requirements
exceed the available amounts. These strategies are frequently based
on priorities of activities. In this paper, we assume that a suitable
heuristic has been chosen to decide which activities should be
performed immediately and which should be postponed, and we
investigate the resource-constrained project scheduling problem
(RCPSP) from the implementation point of view. We propose an
efficient routine that, instead of shifting the activities, extends their
duration. It makes it possible to break down their duration into active
and sleeping subintervals. Then we can apply the classical Critical
Path Method that needs only polynomial running time. This
algorithm can easily be adapted for multi-project scheduling with
limited resources.
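The polynomial-time forward pass of the classical Critical Path Method can be sketched as follows (a generic CPM routine, not the authors' duration-extension implementation with active and sleeping subintervals):

```python
from collections import deque

def critical_path(duration, preds):
    """Classical CPM forward pass on an activity-on-node DAG.
    duration: {activity: time}; preds: {activity: [predecessors]}.
    Returns earliest start/finish times and the project makespan."""
    # Topological order via Kahn's algorithm (polynomial time).
    succs = {a: [] for a in duration}
    indeg = {a: len(preds.get(a, [])) for a in duration}
    for a, ps in preds.items():
        for p in ps:
            succs[p].append(a)
    order, queue = [], deque(a for a in duration if indeg[a] == 0)
    while queue:
        a = queue.popleft()
        order.append(a)
        for s in succs[a]:
            indeg[s] -= 1
            if indeg[s] == 0:
                queue.append(s)

    # Forward pass: earliest start = max earliest finish of predecessors.
    es, ef = {}, {}
    for a in order:
        es[a] = max((ef[p] for p in preds.get(a, [])), default=0)
        ef[a] = es[a] + duration[a]
    return es, ef, max(ef.values())
```

In the routine the abstract proposes, a resource conflict would extend an activity's duration (splitting it into active and sleeping subintervals) and this same polynomial-time pass would then be reapplied.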
Abstract: This paper presents the application of a signal
intensity independent registration criterion for 2D rigid body
registration of medical images using 1D binary projections. The
criterion is defined as the weighted ratio of two projections. The ratio
is computed on a pixel per pixel basis and weighting is performed by
setting the ratios between one and zero pixels to a standard high
value. The mean squared value of the weighted ratio is computed
over the union of the one-areas of the two projections and is
minimized by Chebyshev polynomial approximation with
n = 5 points. The sum of the x and y projections is used for translational
adjustment and a 45° projection for rotational adjustment. Twenty T1-
T2 registration experiments were performed and gave mean errors of
1.19° and 1.78 pixels. The method is suitable for contour/surface
matching. Further research is necessary to determine the robustness
of the method with regards to threshold, shape and missing data.
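The weighted-ratio criterion can be sketched as follows (the "standard high value" is an assumed constant here, since the abstract does not specify it):

```python
import numpy as np

HIGH = 10.0  # assumed "standard high value" for one/zero mismatches

def weighted_ratio_criterion(p, q):
    """Mean squared weighted ratio of two 1D binary projections,
    computed over the union of their one-areas."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    union = (p > 0) | (q > 0)             # union of the one-areas
    ratio = np.full(p.shape, HIGH)        # one/zero mismatch -> high value
    both = (p > 0) & (q > 0)
    ratio[both] = p[both] / q[both]       # matched pixels -> ratio near 1
    return np.mean(ratio[union] ** 2)

def binary_projections(img):
    """Binary x- and y-projections of a 2D binary image."""
    img = np.asarray(img)
    return (img.sum(axis=0) > 0).astype(int), (img.sum(axis=1) > 0).astype(int)
```

Perfectly aligned projections give a criterion value of 1, and any misalignment inflates it through the high-valued mismatch pixels, which is what makes the measure signal-intensity independent and minimizable over the rigid-body parameters.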
Abstract: This paper presents a protocol aiming at proving that an encryption system contains structural weaknesses, without disclosing any information on those weaknesses. A verifier can check in polynomial time that a given property of the cipher system output has been effectively realized. This property has been chosen by the prover in such a way that it cannot be achieved by known attacks or exhaustive search, but only if the prover indeed knows some undisclosed weaknesses that may effectively endanger the cryptosystem's security. This protocol has been termed a zero-knowledge-like proof of cryptanalysis. In this paper, we apply this protocol to the Bluetooth core encryption algorithm E0, used in many mobile environments, and thus suggest that its security can seriously be put into question.
Abstract: Smoothing or filtering of data is the first preprocessing step
for noise suppression in many applications involving data analysis.
The moving average is the most popular method of smoothing data;
its generalization led to the development of the Savitzky-Golay filter.
Many window smoothing methods were developed by convolving
the data with different window functions for different applications;
the most widely used window functions are the Gaussian and Kaiser
windows. Function approximation of the data by polynomial regression,
Fourier expansion, or wavelet expansion also yields smoothed data. Wavelets
also smooth the data to a great extent by thresholding the wavelet
coefficients. Almost all smoothing methods destroy peaks and
flatten them when the support of the window is increased. In certain
applications it is desirable to retain peaks while smoothing the data
as much as possible. In this paper we present a methodology, called
peak-wise smoothing, that smooths the data to any desired level
without losing the major peak features.
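The moving average and its Savitzky-Golay generalization can be sketched with plain least-squares windows (an illustration of the standard methods the abstract contrasts against, not the proposed peak-wise smoother):

```python
import numpy as np

def moving_average(y, window):
    """Simple moving average with a uniform window (same-length output)."""
    kernel = np.ones(window) / window
    return np.convolve(y, kernel, mode="same")

def savitzky_golay(y, window, order):
    """Minimal Savitzky-Golay smoother: fit a polynomial of the given
    order in each sliding window and keep its centre value."""
    half = window // 2
    y = np.asarray(y, float)
    ypad = np.pad(y, half, mode="edge")   # simple edge handling
    x = np.arange(-half, half + 1)
    out = np.empty_like(y)
    for i in range(len(y)):
        c = np.polyfit(x, ypad[i:i + window], order)
        out[i] = np.polyval(c, 0)         # value at the window centre
    return out
```

Because the Savitzky-Golay fit is polynomial rather than uniform, it preserves low-order features exactly; it still flattens sharp peaks as the window grows, which is the limitation peak-wise smoothing targets.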
Abstract: Methods to detect and localize time singularities of polynomial and quasi-polynomial ordinary differential equations are systematically presented and developed. They are applied to examples taken from different fields of application, and they are also compared to better-known methods such as those based on the existence of linear first integrals or Lyapunov functions.
Abstract: A numerical method for the Riccati equation is presented in this work. The method is based on the representation of the unknown function by a truncated series of hybrid block-pulse functions and Chebyshev polynomials. The operational matrices of the derivative and product of the hybrid functions are presented. These matrices, together with the tau method, are then utilized to transform the differential equation into a system of algebraic equations. Numerical examples are presented to demonstrate the accuracy of the proposed method.
Abstract: A system for market identification (SMI) is presented.
The resulting representations are multivariable dynamic demand
models. The market specifics are analyzed. Appropriate models and
identification techniques are chosen. Multivariate static and dynamic
models are used to represent the market behavior. The steps of the
first stage of SMI, named data preprocessing, are mentioned. Next,
the second stage, which is the model estimation, is considered in more
detail. Stepwise linear regression (SWR) is used to determine the
significant cross-effects and the orders of the model polynomials. The
estimates of the model parameters are obtained by a numerically stable
estimator. Real market data is used to analyze SMI performance.
The main conclusion is related to the applicability of multivariate
dynamic models for representation of market systems.
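Forward stepwise selection of significant terms can be sketched as follows (a generic greedy routine on synthetic data, not the SMI estimator or its numerically stable variant):

```python
import numpy as np

def forward_stepwise(X, y, max_terms):
    """Greedy forward selection: repeatedly add the column that most
    reduces the residual sum of squares of a least-squares fit."""
    n, p = X.shape
    selected = []
    base_rss = np.sum((y - y.mean()) ** 2)   # intercept-only model
    for _ in range(max_terms):
        best, best_rss = None, base_rss
        for j in range(p):
            if j in selected:
                continue
            A = np.column_stack([np.ones(n), X[:, selected + [j]]])
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            rss = np.sum((y - A @ beta) ** 2)
            if rss < best_rss:
                best, best_rss = j, rss
        if best is None:                     # no column improves the fit
            break
        selected.append(best)
        base_rss = best_rss
    return selected
```

Applied to candidate cross-effect and lag terms, such a routine identifies the significant regressors and thereby the orders of the model polynomials, as the abstract describes for SWR.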
Abstract: The Wavelet-Galerkin finite element method for
solving the one-dimensional heat equation is presented in this work.
Two types of basis functions which are the Lagrange and multi-level
wavelet bases are employed to derive the full form of matrix system.
We consider both linear and quadratic bases in the Galerkin method.
The time derivative is approximated by a polynomial time basis that
makes it easy to extend the order of approximation in time. Our
numerical results show that the rates of convergence for the linear
Lagrange and the linear wavelet bases are the same, of order 2,
while the rates of convergence for the quadratic Lagrange and the
quadratic wavelet bases are approximately of order 4. It also reveals
that the wavelet basis provides an easy way to improve the
numerical resolution, simply by increasing the desired number of
levels in the multilevel construction process.
Abstract: The paper presents an approach for handling uncertain
information in deductive databases using multivalued logics. Uncertainty
means that database facts may be assigned logical values other
than the conventional ones - true and false. The logical values represent
various degrees of truth, which may be combined and propagated
by applying the database rules. A corresponding multivalued database
semantics is defined. We show that it extends successful conventional
semantics, such as the well-founded semantics, and has polynomial-time
data complexity.
Abstract: In this paper, the implementation of a rule-based
intuitive reasoner is presented. The implementation included two
parts: the rule induction module and the intuitive reasoner. A large
weather database was acquired as the data source. Twelve weather
variables from those data were chosen as the “target variables"
whose values were predicted by the intuitive reasoner. A “complex"
situation was simulated by making only subsets of the data available
to the rule induction module. As a result, the rules induced were
based on incomplete information with variable levels of certainty.
The certainty level was modeled by a metric called "Strength of
Belief", which was assigned to each rule or datum as ancillary
information about the confidence in its accuracy. Two techniques
were employed to induce rules from the data subsets: decision trees
and multi-polynomial regression, for the discrete and the
continuous target variables, respectively. The intuitive reasoner was tested
for its ability to use the induced rules to predict the classes of the
discrete target variables and the values of the continuous target
variables. The intuitive reasoner implemented two types of
reasoning, fast and broad, where, by analogy to human thought, the
former corresponds to fast decision making and the latter to deeper
contemplation. For reference, a weather data analysis approach
which had been applied on similar tasks was adopted to analyze the
complete database and create predictive models for the same 12
target variables. The values predicted by the intuitive reasoner and
the reference approach were compared with actual data. The intuitive
reasoner reached near-100% accuracy for two continuous target
variables. For the discrete target variables, the intuitive reasoner
predicted at least 70% as accurately as the reference reasoner. Since
the intuitive reasoner operated on rules derived from only about 10%
of the total data, it demonstrated the potential advantages in dealing
with sparse data sets as compared with conventional methods.
Abstract: In this work, we apply the Modified Laplace
decomposition algorithm in finding a numerical solution of Blasius’
boundary layer equation for the flat plate in a uniform stream. The
series solution is found by first applying the Laplace transform to the
differential equation and then decomposing the nonlinear term by the
use of Adomian polynomials. The resulting series, which is exactly the
same as that obtained by Weyl (1942), was expressed as a rational
function by the use of a diagonal Padé approximant.
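The diagonal Padé construction from Taylor coefficients can be sketched as follows (a generic [m/n] solver, applied to exp for checkability rather than to the Blasius series itself):

```python
import numpy as np
from math import factorial, e

def pade(a, m, n):
    """[m/n] Pade approximant from Taylor coefficients a[0..m+n].
    Returns (p, q): numerator and denominator coefficients, q[0] = 1."""
    a = np.asarray(a, float)
    # Denominator: sum_{j=1..n} q[j]*a[m+k-j] = -a[m+k] for k = 1..n.
    A = np.array([[a[m + k - j] for j in range(1, n + 1)]
                  for k in range(1, n + 1)])
    q = np.concatenate(([1.0], np.linalg.solve(A, -a[m + 1:m + n + 1])))
    # Numerator: Cauchy product of the series with the denominator.
    p = np.array([sum(a[k - j] * q[j] for j in range(min(k, n) + 1))
                  for k in range(m + 1)])
    return p, q

a = [1 / factorial(k) for k in range(5)]     # Taylor coefficients of exp
p, q = pade(a, 2, 2)                         # diagonal [2/2] approximant
approx = np.polyval(p[::-1], 1.0) / np.polyval(q[::-1], 1.0)
```

The rational form typically extends the usable range of a truncated series, which is the motivation for applying a diagonal Padé approximant to the Blasius series.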
Abstract: During the last years, the genomes of more and more
species have been sequenced, providing data for phylogenetic
reconstruction based on genome rearrangement measures. A main task in
all phylogenetic reconstruction algorithms is to solve the median of
three problem. Although this problem is NP-hard even for the
simplest distance measures, there are exact algorithms for the breakpoint
median and the reversal median that are fast enough for practical use.
In this paper, this approach is extended to the transposition median as
well as to the weighted reversal and transposition median. Although
there is no exact polynomial algorithm known even for the pairwise
distances, we will show that it is in most cases possible to solve
these problems exactly within reasonable time by using a branch and
bound algorithm.
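The breakpoint distance underlying the breakpoint median can be sketched as follows (the pairwise distance only; the branch and bound median search is omitted):

```python
def breakpoint_distance(pi, sigma):
    """Breakpoint distance between two unsigned permutations of 1..n:
    the number of adjacencies of pi (framed by end markers 0 and n+1)
    that are not adjacencies of sigma."""
    def adjacencies(perm):
        ext = [0] + list(perm) + [len(perm) + 1]   # frame with 0 and n+1
        return {frozenset(pair) for pair in zip(ext, ext[1:])}
    return len(adjacencies(pi) - adjacencies(sigma))
```

The median-of-three problem then asks for a permutation minimizing the sum of such distances to three given genomes; even with this simple measure the median is NP-hard, which motivates the branch and bound approach.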