Abstract: Simultaneous Saccharification and Fermentation (SSF) of sugarcane bagasse by cellulase and Pachysolen tannophilus MTCC 1077 was investigated in the present study. Important process variables for ethanol production from pretreated bagasse were optimized using Response Surface Methodology (RSM) based on central composite design (CCD) experiments. A 2³ five-level CCD with central and axial points was used to develop a statistical model for the optimization of process variables: incubation temperature (25–45 °C) X1, pH (5.0–7.0) X2, and fermentation time (24–120 h) X3. Data obtained from RSM on ethanol production were subjected to analysis of variance (ANOVA), fitted with a second-order polynomial equation, and contour plots were used to study the interactions among the three relevant variables of the fermentation process. The fermentation experiments were carried out using an online-monitored modular fermenter of 2 L capacity. The maximum response for ethanol production was reached at the optimum values of temperature (32 °C), pH (5.6) and fermentation time (110 h). A maximum ethanol concentration of 3.36 g/l was obtained from 50 g/l pretreated sugarcane bagasse under the optimized process conditions in aerobic batch fermentation. Kinetic models, namely the Monod model, the modified logistic model, the modified logistic incorporated Luedeking–Piret model and the modified logistic incorporated modified Luedeking–Piret model, were evaluated and their constants predicted.
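A minimal sketch of the second-order polynomial model that CCD-based RSM fits. The design is a standard three-factor CCD in coded units; the coefficients and response values below are synthetic illustrations, not the study's measurements (in the paper the three coded factors are temperature, pH and fermentation time).

```python
# Illustrative sketch: fitting the second-order (quadratic) response-surface
# model used in RSM/CCD analysis. Design and data are synthetic assumptions.
import itertools
import numpy as np

def quadratic_design_matrix(X):
    """Columns: 1, x1..x3, all pairwise interactions, squared terms."""
    n, k = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(k)]
    cols += [X[:, i] * X[:, j] for i, j in itertools.combinations(range(k), 2)]
    cols += [X[:, i] ** 2 for i in range(k)]
    return np.column_stack(cols)

def fit_rsm(X, y):
    """Least-squares estimate of the second-order polynomial coefficients."""
    A = quadratic_design_matrix(X)
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

# Central composite design in coded units for 3 factors:
# 8 factorial points, 6 axial points (alpha ~ 1.682), 1 center point.
alpha = 1.682
factorial = np.array(list(itertools.product([-1, 1], repeat=3)), float)
axial = np.array([[s * alpha if i == j else 0.0 for j in range(3)]
                  for i in range(3) for s in (-1, 1)])
center = np.zeros((1, 3))
X = np.vstack([factorial, axial, center])

# Synthetic response generated from a known quadratic law (demonstration only).
true_beta = np.array([3.0, 0.5, -0.2, 0.8, 0.1, 0.0, -0.1, -0.4, -0.3, -0.2])
y = quadratic_design_matrix(X) @ true_beta

beta_hat = fit_rsm(X, y)
```

The fitted coefficients recover the generating polynomial exactly because the CCD supports estimation of all ten quadratic-model terms.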
Abstract: Bacterial cellulose, a biopolysaccharide, is produced by the bacterium Gluconacetobacter xylinus. Static batch fermentation for bacterial cellulose production was studied in sucrose and date syrup solutions (10% Brix) at 28 °C using G. xylinus (PTCC 1734). Results showed that the maximum yields of bacterial cellulose (BC) were 4.35 and 1.69 g/100 ml for the date syrup and sucrose media, respectively, after a 336-hour fermentation period. Comparison of the FTIR spectrum of cellulose with that of BC showed close agreement, confirming that the component produced by G. xylinus was cellulose. Determination of the areas under the X-ray diffractometry patterns demonstrated that the crystallinity of cellulose (83.61%) was higher than that of BC (60.73%). Scanning electron microscopy imaging of BC and cellulose was carried out at two magnifications, 1K and 6K. Results showed that the diameter ratio of BC to cellulose fibers was approximately 1/30, indicating that BC fibers are much finer than cellulose fibers.
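The crystallinity comparison above is an area ratio under the XRD pattern. A trivial sketch with hypothetical peak areas follows; the 83.61% and 60.73% figures in the abstract come from the actual diffractograms, not from these made-up numbers.

```python
# Illustrative sketch: crystallinity index from areas under an XRD pattern.
# The input areas below are hypothetical stand-ins, not the paper's data.
def crystallinity_percent(crystalline_area, total_area):
    """Crystallinity index = crystalline peak area / total diffractogram area."""
    if total_area <= 0:
        raise ValueError("total area must be positive")
    return 100.0 * crystalline_area / total_area

# Hypothetical peak areas for a plant cellulose and a bacterial cellulose sample.
cell = crystallinity_percent(83.6, 100.0)
bc = crystallinity_percent(60.7, 100.0)
```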
Abstract: In this paper three different approaches for person
verification and identification, i.e. by means of fingerprints, face and
voice recognition, are studied. Face recognition uses parts-based
representation methods and a manifold learning approach. The
assessment criterion is recognition accuracy. The techniques under
investigation are: a) Local Non-negative Matrix Factorization
(LNMF); b) Independent Components Analysis (ICA); c) NMF with
sparse constraints (NMFsc); d) Locality Preserving Projections
(Laplacianfaces). Fingerprint detection was approached by classical
minutiae (small graphical patterns) matching through image
segmentation, using a structural approach and a neural network as the
decision block. For voice/speaker recognition, mel cepstral and
delta-delta mel cepstral analysis were used as the main methods in
order to construct a supervised, speaker-dependent voice recognition
system. The final decision (e.g. "accept/reject" for a verification
task) is taken by applying a majority voting technique to the
three biometrics. The preliminary results, obtained for medium-sized
databases of fingerprints, faces and voice recordings, indicate the
feasibility of our approach and an overall recognition precision (about
92%) that permits the use of our system in a future complex
biometric card.
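The decision-fusion step described above can be sketched as a simple 2-of-3 majority vote. The boolean per-modality decisions are assumed inputs for illustration; the actual subsystem outputs are produced by the fingerprint, face and voice recognizers.

```python
# Minimal sketch of majority-vote fusion over three biometric modalities
# (fingerprint, face, voice). Input decisions are illustrative assumptions.
def majority_vote(decisions):
    """Accept if at least 2 of the 3 modalities accept."""
    if len(decisions) != 3:
        raise ValueError("expected exactly three modality decisions")
    return sum(decisions) >= 2

# Example: fingerprint accepts, face accepts, voice rejects -> accept.
verdict = majority_vote([True, True, False])
```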
Abstract: The purpose of this study is to investigate the chemical
degradation of the organophosphorus pesticide of parathion and
carbamate insecticide of methomyl in the aqueous phase through
Fenton process. With the employment of batch Fenton process, the
degradation of the two selected pesticides at different pH, initial
concentration, humic acid concentration, and Fenton reagent dosages
was explored. The Fenton process was found to be effective in degrading
parathion and methomyl. The optimal dosage of Fenton reagents (i.e.,
molar concentration ratio of H2O2 to Fe2+) at pH 7 for parathion
degradation was equal to 3, which resulted in 50% removal of
parathion. Similarly, the optimal dosage for methomyl degradation
was 1, resulting in 80% removal of methomyl. This study also found
that the presence of humic substances significantly enhanced pesticide
degradation by the Fenton process. The mass spectrometry results
showed that the hydroxyl free radical may attack the lowest-energy
single bonds of the investigated pesticides to form smaller molecules
that are more easily degraded through either physico-chemical or
biological processes.
Abstract: Over the past decades, automatic face recognition has become a highly active research area, mainly due to the countless application possibilities in both the private and the public sector. Numerous algorithms have been proposed in the literature to cope with the problem of face recognition; nevertheless, a group of methods commonly referred to as appearance-based has emerged as the dominant solution to the face recognition problem. Many comparative studies concerned with the performance of appearance-based methods have already been presented in the literature, not rarely with inconclusive and often contradictory results. No consensus has been reached within the scientific community regarding the relative ranking of the efficiency of appearance-based methods for the face recognition task, let alone regarding their susceptibility to appearance changes induced by various environmental factors. To tackle these open issues, this paper assesses the performance of the three dominant appearance-based methods, principal component analysis, linear discriminant analysis and independent component analysis, and compares them on an equal footing (i.e., with the same preprocessing procedure, with parameters optimized for the best possible performance, etc.) in face verification experiments on the publicly available XM2VTS database. In addition to the comparative analysis on the XM2VTS database, ten degraded versions of the database are also employed in the experiments to evaluate the susceptibility of the appearance-based methods to various image degradations which can occur in "real-life" operating conditions. Our experimental results suggest that linear discriminant analysis ensures the most consistent verification rates across the tested databases.
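As a hedged illustration of one of the three compared methods, here is a minimal PCA (eigenface) verification sketch on synthetic data. The toy gallery, subspace dimension and cosine-similarity threshold are assumptions for demonstration, not the paper's XM2VTS protocol.

```python
# Illustrative sketch of appearance-based verification with PCA (eigenfaces).
# Training images, dimensionality and threshold are made-up assumptions.
import numpy as np

rng = np.random.default_rng(0)

def fit_pca(X, n_components):
    """X: (n_samples, n_pixels). Returns the mean face and principal axes."""
    mean = X.mean(axis=0)
    # SVD of the centered data gives the eigenfaces as right singular vectors.
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:n_components]

def project(x, mean, axes):
    return axes @ (x - mean)

def verify(x_probe, x_claim, mean, axes, threshold=0.8):
    """Accept the identity claim if projected cosine similarity is high."""
    a, b = project(x_probe, mean, axes), project(x_claim, mean, axes)
    cos = a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
    return cos >= threshold

# Toy "gallery" of 20 random 64-pixel face vectors stands in for real data.
gallery = rng.normal(size=(20, 64))
mean, axes = fit_pca(gallery, n_components=5)
# A slightly perturbed copy of a gallery face should be accepted.
same = verify(gallery[0], gallery[0] + 0.01 * rng.normal(size=64), mean, axes)
```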
Abstract: The overlay approach has been widely used by many service providers for Traffic Engineering (TE) in large Internet backbones. In the overlay approach, logical connections are set up between edge nodes to form a full-mesh virtual network on top of the physical topology. IP routing is then run over the virtual network. Traffic engineering objectives are achieved by carefully routing logical connections over the physical links. Although the overlay approach has been implemented in many operational networks, it has a number of well-known scaling issues. This paper proposes a new approach to achieve traffic engineering without full-mesh overlaying, with the help of an integrated approach and an equal subset split method. Traffic engineering needs to determine the optimal routing of traffic over the existing network infrastructure by efficiently allocating resources in order to optimize traffic performance on an IP network. Even though constraint-based routing [1] in Multi-Protocol Label Switching (MPLS) was developed to address this need, it is not widely tested or debugged, so Internet Service Providers (ISPs) resort to TE methods under Open Shortest Path First (OSPF), the most commonly used intra-domain routing protocol. Determining OSPF link weights for optimal network performance is an NP-hard problem. As it is not practical to solve this problem exactly, we present a subset split method to improve efficiency and performance by minimizing the maximum link utilization in the network via a small number of link weight modifications. The results of this method are compared against the results of the MPLS architecture [9] and other heuristic methods.
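The weight-modification idea can be sketched on a toy topology: compute the maximum link utilization under shortest-path routing, then greedily accept a single link-weight change whenever it lowers that maximum. The triangle graph, the demand and the greedy +2 rule are illustrative assumptions, not the paper's subset split method.

```python
# Illustrative sketch: lowering max link utilization via a small number of
# OSPF-style link-weight changes. Topology and demands are toy assumptions.
import heapq

def shortest_path(graph, weights, src, dst):
    """Dijkstra over directed edges; returns the list of edges on the path."""
    dist, prev = {src: 0.0}, {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue
        for v in graph.get(u, []):
            nd = d + weights[(u, v)]
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [], dst
    while node != src:
        path.append((prev[node], node))
        node = prev[node]
    return path[::-1]

def max_utilization(graph, weights, capacity, demands):
    """Route every demand on its shortest path; report the worst link load."""
    load = {e: 0.0 for e in weights}
    for (s, t), vol in demands.items():
        for e in shortest_path(graph, weights, s, t):
            load[e] += vol
    return max(load[e] / capacity[e] for e in weights)

# Toy triangle topology; the direct A->B link is the thin one (capacity 5).
graph = {"A": ["B", "C"], "B": ["A", "C"], "C": ["A", "B"]}
edges = [(u, v) for u in graph for v in graph[u]]
weights = {e: 1 for e in edges}
capacity = {e: (5.0 if set(e) == {"A", "B"} else 10.0) for e in edges}
demands = {("A", "B"): 8.0}

before = max_utilization(graph, weights, capacity, demands)  # 8/5 = 1.6

# Greedy single-weight modification: keep a +2 change when it helps.
best = before
for e in edges:
    trial = dict(weights)
    trial[e] += 2
    u = max_utilization(graph, trial, capacity, demands)
    if u < best:
        best, weights = u, trial
after = best  # traffic rerouted via C: 8/10 = 0.8
```

Raising the weight of the congested direct link makes the two-hop path via C shorter, spreading the demand onto higher-capacity links.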
Abstract: In this study, the transesterification of palm oil with methanol for biodiesel production was studied using CaO–ZnO as a heterogeneous base catalyst prepared by incipient-wetness impregnation (IWI) and co-precipitation (CP) methods. The reaction parameters considered were the molar ratio of methanol to oil, the amount of catalyst, the reaction temperature, and the reaction time. The optimum conditions were found to be a 15:1 molar ratio of methanol to oil, a catalyst amount of 6 wt%, a reaction temperature of 60 °C, and a reaction time of 8 h. The effects of Ca loading, calcination temperature, and catalyst preparation method on the catalytic performance were studied. The fresh and spent catalysts were characterized by several techniques, including XRD, TPR, and XRF.
Abstract: The application of agro-industrial waste in Aluminum
Metal Matrix Composites has been receiving more attention, as such
wastes can act as reinforcing particles in the metal matrix and
enhance the strength properties of the composites. In addition,
applying these agro-industrial wastes in a useful way not only saves
manufacturing costs but also reduces environmental pollution. This
paper presents a literature review of a range of industrial wastes
and their utilization in metal matrix composites. The paper describes
the synthesis methods of agro-industrial waste filled metal matrix
composite materials and their mechanical, wear, corrosion, and
physical properties. It also highlights the current applications and
future potential of agro-industrial waste reinforced composites in the
aerospace, automotive and other construction industries.
Abstract: This paper proposes two types of non-isolated
direct AC-DC converters. First, it shows a buck-boost
converter with an H-bridge, which requires few components
(three switches, two diodes, one inductor and one capacitor) to
convert AC input to DC output directly. This circuit can handle
a wide range of output voltages. Second, a direct AC-DC buck
converter is proposed for lower output voltage applications.
This circuit is analyzed with an output voltage of 12 V. We
describe circuit topologies, operation principles and simulation
results for both circuits.
Abstract: High Strength Concrete (HSC) is defined as concrete
that meets special combination of performance and uniformity
requirements that cannot be achieved routinely using conventional
constituents and normal mixing, placing, and curing procedures. It is
a highly complex material, which makes modeling its behavior a very
difficult task. This paper aims to show the possible applicability of
Neural Networks (NN) to predicting the slump of High Strength
Concrete (HSC). Neural Network models are constructed, trained and
tested using the available test data of 349 different concrete mix
designs of High Strength Concrete (HSC) gathered from a particular
Ready Mix Concrete (RMC) batching plant. The most versatile
Neural Network model is selected to predict the slump in concrete.
The data used in the Neural Network models are arranged in a format
of eight input parameters that cover the Cement, Fly Ash, Sand,
Coarse Aggregate (10 mm), Coarse Aggregate (20 mm), Water,
Super-Plasticizer and Water/Binder ratio. Furthermore, to test the
accuracy of slump prediction, the final selected model is
further used on the data of 40 different concrete mix designs of
High Strength Concrete (HSC) taken from another batching plant.
The results are compared on the basis of error function (or
performance function).
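A hedged sketch of the kind of model involved (not the paper's trained network): a one-hidden-layer network mapping eight standardized mix-design inputs to slump, trained by plain gradient descent. The synthetic data, architecture sizes and learning rate are all assumptions for illustration.

```python
# Illustrative sketch: a minimal one-hidden-layer neural network for
# regressing slump from eight mix-design features (cement, fly ash, sand,
# coarse aggregates, water, super-plasticizer, water/binder ratio).
# Data are synthetic; nothing here reproduces the paper's model.
import numpy as np

rng = np.random.default_rng(42)

# Synthetic standardized mix-design features and a made-up target rule.
X = rng.normal(size=(200, 8))
y = (X @ rng.normal(size=8) + 0.1 * rng.normal(size=200)).reshape(-1, 1)

# One hidden layer with tanh units, linear output.
W1 = 0.1 * rng.normal(size=(8, 16)); b1 = np.zeros(16)
W2 = 0.1 * rng.normal(size=(16, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

def mse(pred, y):
    return float(np.mean((pred - y) ** 2))

_, pred0 = forward(X)
loss_before = mse(pred0, y)

lr = 0.01
for _ in range(500):
    h, pred = forward(X)
    g = 2.0 * (pred - y) / len(X)          # dL/dpred for mean squared error
    gW2 = h.T @ g; gb2 = g.sum(axis=0)
    gh = g @ W2.T * (1 - h ** 2)           # backprop through tanh
    gW1 = X.T @ gh; gb1 = gh.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

_, pred1 = forward(X)
loss_after = mse(pred1, y)
```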
Abstract: This paper highlights the importance of the selection
of a building's wall material, and the shortcomings of the most
commonly used framed structures with masonry infills. The
objective of this study is to investigate the behavior of infill walls as
structural components in existing structures. Structural infill walls are
very important to structural behavior under earthquake effects.
Structural capacity under the effect of an earthquake, displacement
and relative story displacement are affected by structural
irregularities. The presence of nonstructural masonry infill walls can
extensively modify the global seismic behavior of framed buildings.
The stability and integrity of reinforced concrete frames are enhanced
by masonry infill walls, which also alter the displacement and
base shear of the frame. Short columns are of great
importance during earthquakes, because their failure may lead to
additional structural failures and result in total building collapse.
Consequently, the effects of short columns are considered in this
study.
Abstract: In this paper, we propose a robust disease detection
method, called adaptive orientation code matching (Adaptive OCM),
which is developed from a robust image registration algorithm:
orientation code matching (OCM), to achieve continuous and
site-specific detection of changes in plant disease. We use a two-stage
framework to realize our research purpose. In the first stage,
adaptive OCM is employed, which not only realizes
continuous and site-specific observation of disease development, but
also shows excellent robustness for non-rigid plant object searching
under changes in scene illumination, translation, small rotation and
occlusion. In the second stage, a machine learning method, a support
vector machine (SVM) based on a two-dimensional (2D)
xy-color histogram feature, is utilized for pixel-wise disease
classification and quantification. The indoor experiment results
demonstrate the feasibility and potential of the proposed algorithm,
which could be implemented in real field situations for better
observation of plant disease development.
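The 2D xy-color histogram feature of the second stage can be sketched as follows. The sRGB-to-XYZ conversion matrix is the standard one; the bin count and the toy image patch are assumptions, and the abstract does not specify the exact chromaticity computation used.

```python
# Illustrative sketch: a 2D chromaticity ("xy") histogram feature of the
# kind an SVM could consume for pixel-wise classification. Bin count and
# the toy patch are assumptions.
import numpy as np

M = np.array([[0.4124, 0.3576, 0.1805],
              [0.2126, 0.7152, 0.0722],
              [0.0193, 0.1192, 0.9505]])  # linear sRGB -> CIE XYZ

def xy_histogram(rgb, bins=8):
    """rgb: (H, W, 3) floats in [0, 1]. Returns a flattened 2D xy histogram."""
    xyz = rgb.reshape(-1, 3) @ M.T
    s = xyz.sum(axis=1)
    keep = s > 1e-9                        # skip black pixels (xy undefined)
    x = xyz[keep, 0] / s[keep]
    y = xyz[keep, 1] / s[keep]
    hist, _, _ = np.histogram2d(x, y, bins=bins, range=[[0, 1], [0, 1]])
    total = hist.sum()
    return (hist / total).ravel() if total else hist.ravel()

# Toy 4x4 "leaf patch": uniform green pixels all land in a single xy bin.
patch = np.zeros((4, 4, 3)); patch[..., 1] = 0.8
feat = xy_histogram(patch)
```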
Abstract: The two-dimensional gel electrophoresis method
(2-DE) is widely used in Proteomics to separate thousands of proteins
in a sample. By comparing the protein expression levels of proteins in
a normal sample with those in a diseased one, it is possible to identify
a meaningful set of marker proteins for the targeted disease. The major
shortcomings of this approach involve inherent noise and irregular
geometric distortions of spots observed in 2-DE images. Various
experimental conditions can be the major causes of these problems. In
the protein analysis of samples, these problems eventually lead to
incorrect conclusions. In order to minimize the influence of these
problems, this paper proposes a partition based pair extension method
that performs spot-matching on a set of gel images multiple times and
segregates more reliable mapping results which can improve the
accuracy of gel image analysis. The improved accuracy of the
proposed method is analyzed through various experiments on real
2-DE images of human liver tissues.
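Our reading of the pair-extension idea, sketched with hypothetical matcher output: repeat spot-matching several times and keep only the pairs that recur, treating recurring pairs as the more reliable mappings. The matching runs below are made-up inputs, not the output of a real 2-DE matcher.

```python
# Illustrative sketch: segregating reliable spot-matching pairs by how
# often they recur across repeated matching runs. Inputs are hypothetical.
from collections import Counter

def reliable_pairs(matching_runs, min_support=2):
    """Keep (spot_a, spot_b) pairs seen in at least min_support runs."""
    counts = Counter(pair for run in matching_runs for pair in set(run))
    return {pair for pair, c in counts.items() if c >= min_support}

runs = [
    [("s1", "t1"), ("s2", "t2"), ("s3", "t9")],   # run 1: s3 mismatched
    [("s1", "t1"), ("s2", "t2"), ("s3", "t3")],
    [("s1", "t1"), ("s3", "t3")],
]
stable = reliable_pairs(runs, min_support=2)
```

The one-off mismatch ("s3", "t9") is discarded, while pairs supported by at least two runs survive.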
Abstract: Vector quantization is a powerful tool for speech
coding applications. This paper deals with LPC coding of speech
signals using a new technique called Multi Switched Split
Vector Quantization. This is a hybrid of two product code vector
quantization techniques, namely the Multi stage vector quantization
technique and the Switched split vector quantization technique. The
Multi Switched Split Vector Quantization technique quantizes the
linear predictive coefficients in terms of line spectral frequencies.
The results show that Multi Switched Split Vector Quantization
provides a better trade-off between bitrate, spectral distortion
performance, computational complexity and memory requirements
when compared to the Switched Split Vector Quantization, Multi stage
vector quantization, and Split Vector Quantization techniques. By
employing the switching technique at each stage of the vector
quantizer, the spectral distortion, computational complexity and
memory requirements were greatly reduced. Spectral distortion was
measured in dB, computational complexity in floating point
operations (flops), and memory requirements in floats.
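A minimal sketch of the split-VQ building block used by the schemes above: the LSF vector is split into sub-vectors, and each part is quantized by nearest-neighbour search in its own codebook. The random codebooks and the 3/3/4 split of a 10-dimensional LSF vector are illustrative assumptions, not trained codebooks or the paper's configuration.

```python
# Illustrative sketch of split vector quantization of an LSF vector.
# Codebooks are random stand-ins; real systems train them (e.g. by LBG).
import numpy as np

rng = np.random.default_rng(7)

def split_vq(vec, codebooks, splits):
    """Quantize each sub-vector independently; return the chosen codeword
    indices and the reconstructed vector."""
    out, idxs, start = [], [], 0
    for cb, size in zip(codebooks, splits):
        part = vec[start:start + size]
        d = np.sum((cb - part) ** 2, axis=1)   # squared error to each codeword
        i = int(np.argmin(d))
        idxs.append(i)
        out.append(cb[i])
        start += size
    return idxs, np.concatenate(out)

splits = [3, 3, 4]                 # one possible 3-way split of 10 LSFs
codebooks = [rng.normal(size=(16, s)) for s in splits]
lsf = rng.normal(size=10)
indices, lsf_hat = split_vq(lsf, codebooks, splits)
```

Splitting trades a small loss in quantization quality for dramatically smaller codebooks, which is the complexity/memory saving the abstract refers to.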
Abstract: A "Qanat" is a passive water system consisting of a series of
underground wells. A mother well was dug in a place far from the
city, where the water table could be reached perhaps 100 meters
underground; other wells were then dug to direct water toward the
city with the minimum possible gradient. Using the slope of the earth,
the builders could bring water close to the surface in the city. The
source of water, the point where the Qanat surfaces, the land slope
and the ownership lines are the important and effective factors in the
formation of routes and the division of lands, to the extent that the
use of the Qanat as a technique for extracting underground water
creates a network of routes with an organic order and hierarchy
coinciding with the slope of the land; it also guides the Qanat waters
through the traditional fabric of the salt desert and its border
provinces. Qanats are excavated at a specified distance from each
other. The quantity of water provided by Qanats depends on the kind
of land, the distance from the mountains, their geographical situation
and the rate of water supply from underground. The amount of
underground water, the possibility of Qanat excavation, the number
of Qanats and the rate of their water supply on the one hand, and the
quantity of cultivable fertile land on the other, are the important
natural factors determining the size of cities. In the same manner,
cities with several Qanats have multi-centered textures. The location
of cities is directly related to land quality, soil fertility and the
possibility of using underground water by excavating Qanats.
Observing the allowable distance for Qanat watering is a determining
factor for the distance between villages and cities. Topography, land
slope, soil quality, the watering system, ownership, the kind of
cultivation, etc. are the effective factors in directing Qanat
excavation and guiding water toward the cultivable lands, and they
also cause the formation of different textures in the land division of
farming provinces. Several divisions, such as orderly and wide,
disorderly, thin and long, comb-like, etc., are the introduction to
organic order, and at the same time they coincide completely with
environmental conditions in the typical development of ecological
architecture and planning in traditional cities and settlements.
Abstract: Recent developments in Soft computing techniques,
power electronic switches and low-cost computational hardware have
made it possible to design and implement sophisticated control
strategies for sensorless speed control of AC motor drives. Such an
attempt has been made in this work, for Sensorless Speed Control of
Induction Motor (IM) by means of Direct Torque Fuzzy Control
(DTFC), a PI-type fuzzy speed regulator and an MRAS speed estimator
strategy, which is inherently nonlinear. Direct torque
control is known to produce a quick and robust response in AC drive
systems. However, during steady state, torque, flux and current
ripples occur, so the performance of conventional DTC with a PI
speed regulator can be improved by implementing fuzzy logic
techniques. Certain important design issues, including the space
vector modulated (SVM) three-phase voltage source inverter, the
DTFC design, the generation of reference torque using a PI-type
fuzzy speed regulator and the sensorless speed estimator, have been
resolved. The proposed scheme is validated through extensive
numerical simulations in MATLAB. The simulation results indicate
that sensorless speed control of the IM with DTFC and a PI-type
fuzzy speed regulator provides satisfactory high dynamic and static
performance compared to conventional DTC with a PI speed
regulator.
Abstract: All over the world, including the Middle and East
European countries, sustainable tillage and sowing technologies are
applied increasingly broadly with a view to optimising soil resources,
mitigating soil degradation processes, saving energy resources,
preserving biological diversity, etc. As a result, altered conditions of
tillage and sowing technological processes are faced inevitably. The
purpose of this study is to determine the seedbed topsoil hardness
when using a combined sowing coulter in different sustainable tillage
technologies. The research involved a combined coulter consisting
of two dissected blade discs and a shoe coulter. In order to determine
soil hardness at the seedbed area, a multipenetrometer was used. It
was found by experimental studies that in loosened soil, a combined
sowing coulter compacts the furrow bottom, walls and soil near the
furrow equally; therefore, soil hardness there was similar at all
researched depths and no significant differences were established. In
loosened and compacted (double-rolled) soil, the impact of a
combined coulter on the hardness of the seedbed soil surface was
more considerable at a depth of 2 mm. Soil hardness at the furrow
bottom and walls to a distance of up to 26 mm was 1.1 MPa. At a depth of 10
mm, the greatest hardness was established at the furrow bottom. In
loosened and heavily compacted (rolled six times) soil, at depths of
2 and 10 mm a combined coulter compacted the furrow bottom most
of all, which had a hardness of 1.8 MPa. At a depth of 20 mm, soil
hardness within the whole investigated area varied insignificantly and
fluctuated around 2.0 MPa. The hardness of the furrow walls and the
soil near the furrow was approximately 1.0 MPa lower than that at
the furrow bottom.
Abstract: This article presents the development of efficient
algorithms for comparing tablet copies. Image recognition has
specialized uses in digital systems such as medical imaging,
computer vision, defense, communication, etc. Comparison between
two images that look indistinguishable is a formidable task. Two
images taken from different sources might look identical, but due to
different digitizing properties they are not. Moreover, small
variations in image information such as cropping, rotation, and slight
photometric alteration make direct matching techniques unsuitable.
In this paper we introduce different matching algorithms designed to
help art centers identify real painting images from fake ones.
Different vision algorithms for local image features are implemented
using MATLAB. In this framework a Table Comparison Computer
Tool "TCCT" is designed to facilitate our research. The TCCT is a
Graphical User Interface (GUI) tool used to identify images by their
shapes and objects. The parameters of the vision system are fully
accessible to the user through this graphical user interface. For
matching, it then applies different description techniques that can
identify the exact figures of objects.
Abstract: This paper presents the modeling and analysis of a 12-phase distribution static compensator (DSTATCOM), which is capable of balancing the source currents in spite of unbalanced loading and phase outages. In addition to balancing the supply current, the power factor can be set to a desired value. The theory of instantaneous symmetrical components is used to generate the twelve-phase reference currents. These reference currents are then tracked using a current-controlled voltage source inverter operated in a hysteresis band control scheme. An ideal compensator is used in place of a physical realization of the compensator. The performance of the proposed DSTATCOM is validated through MATLAB simulation, and detailed simulation results are given.
Abstract: With the advance of information technology in the
new era, the use of the Internet to access data resources has
steadily increased, and huge amounts of data have become accessible
in various forms. Naturally, network providers and agencies seek to
prevent electronic attacks that may be harmful or may be related to
terrorist activities. Authorities have therefore undertaken a variety of
methods to protect special regions from harmful data. One of the
most important approaches is to use a firewall in the network
facilities. The main objective of a firewall is to stop the transfer of
suspicious packets in several ways. However, because of blind packet
blocking, high processing power requirements and high cost, some
providers are reluctant to use firewalls. In this paper we propose a
method to find a discriminant function that distinguishes between
usual packets and harmful ones by statistical processing of the
network router logs. By discriminating these data, an administrator
may take appropriate action against the user. This method is very
fast and can be deployed simply alongside Internet routers.
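A hedged sketch of the statistical idea: learn a linear discriminant from labelled router-log feature vectors (e.g. packet size, port entropy, rate) to separate usual from harmful traffic. Fisher's linear discriminant stands in for whatever statistic the authors actually used, and the features and data below are synthetic.

```python
# Illustrative sketch: Fisher's linear discriminant on synthetic router-log
# features. Feature meanings and data are assumptions, not real logs.
import numpy as np

rng = np.random.default_rng(1)

def fisher_discriminant(X0, X1):
    """w maximizing between-class over within-class scatter, plus a
    midpoint threshold for classification."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = np.cov(X0.T) * (len(X0) - 1) + np.cov(X1.T) * (len(X1) - 1)
    w = np.linalg.solve(Sw, m1 - m0)
    thresh = 0.5 * (w @ m0 + w @ m1)
    return w, thresh

# Synthetic "usual" and "harmful" log features with shifted class means.
usual = rng.normal(loc=0.0, size=(300, 3))
harmful = rng.normal(loc=2.0, size=(300, 3))
w, thresh = fisher_discriminant(usual, harmful)

def is_harmful(x):
    return float(w @ x) > thresh

acc = (sum(not is_harmful(x) for x in usual) +
       sum(is_harmful(x) for x in harmful)) / 600
```

Because scoring a packet is a single dot product against the learned direction, this kind of discriminant is cheap enough to run next to a router, which matches the speed claim in the abstract.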