Abstract: This paper presents several diagnostic methods designed for electrical machines, especially permanent magnet (PM) machines. Such machines are commonly used in small wind and water systems and in vehicle drives. These methods are preferred by the author for periodic diagnostics of electrical machines. Special attention should be paid to the diagnostic methods for turn-to-turn insulation and for vibrations. Both of these methods were created in the Institute of Electrical Drives and Machines Komel. The vibration diagnostic method is the main thesis of the author's doctoral dissertation. This method of determining the technical condition of a PM electrical machine based on its own signals is the subject of patent application No. P.405669. The method exploits a specific structural property of machines excited by permanent magnets: the electromotive force (EMF) generated due to vibrations. A number of publications describing vibration diagnostic methods and tests of electrical machines with permanent magnets were analysed, and no method was found that determines the technical condition of such machines based on their own signals.
Abstract: This paper presents an algorithm which
combines ant colony optimization with dynamic
programming for solving a dynamic facility layout problem.
The problem is separated into two phases, a static and a dynamic
phase. In the static phase, ant colony optimization is used to find
the best-ranked layouts for each period. Then the dynamic
programming (DP) procedure is performed in the dynamic
phase to evaluate the layout set over the multi-period planning
horizon. The proposed algorithm is tested over many
problems with size ranging from 9 to 49 departments, 2 and 4
periods. The experimental results show that the proposed
method is an alternative way for the plant layout designer to
determine the layouts over a multi-period planning horizon.
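The dynamic-programming phase described above can be illustrated with a minimal sketch. It assumes the ant-colony phase has already produced a small set of candidate layouts per period; the handling and rearrangement costs below are illustrative assumptions, not the paper's data or implementation:

```python
def best_layout_sequence(period_costs, rearrange_cost):
    """period_costs[t][k]: handling cost of candidate layout k in period t.
    rearrange_cost(a, b): cost of switching from candidate a to candidate b.
    Returns the cheapest sequence of candidate indices and its total cost."""
    dp = list(period_costs[0])        # best cost ending at candidate k so far
    back = []                         # backpointers, one row per later period
    for t in range(1, len(period_costs)):
        new_dp, prev_row = [], []
        for k, cost in enumerate(period_costs[t]):
            j = min(range(len(dp)), key=lambda i: dp[i] + rearrange_cost(i, k))
            new_dp.append(dp[j] + rearrange_cost(j, k) + cost)
            prev_row.append(j)
        dp = new_dp
        back.append(prev_row)
    k = min(range(len(dp)), key=lambda i: dp[i])
    total = dp[k]
    seq = [k]
    for row in reversed(back):        # walk the backpointers to recover the path
        k = row[k]
        seq.append(k)
    return list(reversed(seq)), total
```

With two candidate layouts over three periods and a flat rearrangement cost of 3, the DP trades off cheaper per-period layouts against the cost of switching.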
Abstract: There are three distinct stages in the evolution of
economic thought, namely:
1. in the first stage, the major concern was to accelerate
economic growth with increased availability of material
goods, especially in developing economies with very low
living standards, because faster economic growth was
equated with poverty eradication.
2. in the second stage, economists made a distinction between
growth and development. Development was seen as going
beyond economic growth and bringing certain changes in
the structure of the economy, with a more equitable
distribution of the benefits of growth, the growth itself
becoming automatic and sustained.
3. the third stage is now reached. Our concern is now with
“sustainable development”, that is, development not only
for the present but also for the future.
Thus the focus changed from “sustained growth” to “sustainable
development”. Sustainable development brings to the fore the long-term
relationship between ecology and economic development.
Since its creation in 1972, UNEP has worked for
development without destruction, for environmentally sound and
sustainable development. It was realised that the environment cannot
be viewed in a vacuum: it is not separate from development, nor in
competition with it. UNEP called for the integration of the environment with
development, whereby ecological factors enter development planning,
socio-economic policies, cost-benefit analysis, trade, technology
transfer, waste management, education and other specific areas.
Industrialisation has contributed to the economic growth of
several countries. It has improved the living standards of their people
and provided benefits to society. In the process, it has also created
great environmental problems such as climate change, forest destruction
and denudation, soil erosion, and desertification.
On the other hand, industry has provided jobs and improved the
prospects of wealth for the industrialists. The working-class
communities simply had to put up with the high levels of pollution in
order to keep their jobs and protect their incomes.
The environmental problem has many roots, lying in the
political, economic, cultural and technological conditions of
modern society. Experts concede that industrial growth lies
somewhere close to the heart of the matter. Therefore, the objective
of this paper is not to document all roots of an environmental crisis
but rather to discuss the effects of industrial growth and
development.
We have come to the conclusion that although public intervention
is often unnecessary to ensure that perfectly competitive markets
function in society's best interests, such intervention is necessary
when firms or consumers pollute.
Abstract: The present study concentrates on solving the along-wind oscillation problem of a tall square building from first principles, and the across-wind oscillation problem of the same building from empirical relations obtained by experiments. The criterion for human comfort under the worst condition at the top floor of the building is considered, and a limiting height for a building of a given cross-section is predicted. Numerical integrations are carried out as and when required. The results show the severity of across-wind oscillations in comparison with along-wind oscillations. The comfort criterion is combined with the across-wind oscillation results to determine the maximum allowable height of a building for a given square cross-section.
Abstract: Mathematical programming has been applied to various
problems. For many practical problems, the assumption that the
parameters involved are deterministically known is often unjustified. In
such cases, the data contain uncertainty and are thus represented
as random variables, since they represent information about the
future. Decision-making under uncertainty involves potential risk.
Stochastic programming is a commonly used method for optimization
under uncertainty. A stochastic programming problem with recourse
is referred to as a two-stage stochastic problem. In this study, we
consider a stochastic programming problem with simple integer
recourse in which the value of the recourse variable is restricted to a
multiple of a nonnegative integer. The algorithm of a dynamic slope
scaling procedure for solving this problem is developed by using a
property of the expected recourse function. Numerical experiments
demonstrate that the proposed algorithm is quite efficient. The
stochastic programming model defined in this paper is quite useful
for a variety of design and operational problems.
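For simple integer recourse as described above, the expected recourse function can be evaluated directly on discrete scenarios. This is a minimal sketch under illustrative assumptions (equiprobable scenarios, a unit batch cost `q`); the paper's dynamic slope scaling procedure itself is not reproduced here:

```python
import math

def expected_integer_recourse(x, scenarios, q=1.0):
    """Expected recourse cost E[q * ceil((xi - x)+)] over equiprobable
    discrete scenarios xi: any shortfall relative to the first-stage
    decision x must be covered in whole integer batches, each at cost q."""
    return sum(q * math.ceil(max(0.0, xi - x))
               for xi in scenarios) / len(scenarios)
```

Evaluating this function over a grid of first-stage decisions `x` exposes the staircase structure that slope-scaling methods approximate with linear pieces.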
Abstract: In this paper, we discuss the paradigm shift in bank
capital from the “gone concern” to the “going concern” mindset. We
then propose a methodology for pricing a product of this shift called
Contingent Capital Notes (“CoCos”). The Merton Model can
determine a price for credit risk by treating the firm's equity as a
call option on its assets. Our pricing methodology for CoCos also
uses the credit spread implied by the Merton Model in a subsequent
derivative form created by John Hull et al. Here, a market-implied
asset volatility is calculated using observed market CDS spreads.
This implied asset volatility is then used to estimate the probability of
triggering a predetermined “contingency event” given the distance-to-trigger
(DTT). The paper then investigates the effect of varying
DTTs and recovery assumptions on the CoCo yield. We conclude
with an investment rationale.
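Under a normal approximation for asset-value moves (a simplifying assumption of this sketch, not the paper's exact formulation), a rough trigger probability for a given distance-to-trigger can be computed as:

```python
import math

def trigger_probability(dtt):
    """Rough probability of breaching the contingency trigger when the
    distance-to-trigger (DTT) is expressed in standard deviations of
    asset value, using the standard normal CDF via math.erf."""
    return 0.5 * (1.0 + math.erf(-dtt / math.sqrt(2.0)))
```

A DTT of zero means the trigger is at the current asset value (probability one half); larger DTTs shrink the probability along the normal tail.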
Abstract: The application of high-frequency signal injection as a speed and position observer in PMSM drives has been a research focus. At present, the precision of this method is nearly as good as that of a ten-bit encoder, but estimating the position polarity remains problematic. Based on high-frequency signal injection, this paper presents a method to compensate the position polarity for a permanent magnet synchronous motor (PMSM). Experiments were performed to test the effectiveness of the proposed algorithm, and the results demonstrate good performance.
Abstract: In a nuclear reactor, a Loss of Coolant Accident (LOCA)
covers a wide range of postulated damage or rupture of pipes in the
heat transport piping system. In the case of a LOCA with or without
failure of the emergency core cooling system in a Pressurised Heavy
Water Reactor, the Pressure Tube (PT) temperature could rise
significantly due to fuel heat up and gross mismatch of the heat
generation and heat removal in the affected channel. The extent and
nature of the deformation are important from a reactor safety point of view.
Experimental set-ups have been designed and fabricated to simulate
ballooning (radial deformation) of the PT for 220 MWe IPHWRs.
Experiments have been conducted by covering the Calandria Tube (CT)
with ceramic fibers, and then by submerging the CT in water, with the
PTs voided. In both experiments, it is observed that ballooning
initiates at a temperature of around 665 °C and that complete contact
between the PT and the CT occurs at approximately 700 °C. The
strain rate is found to be 0.116% per second. The structural integrity
of the PT is retained (no breach) in all the experiments. The PT heat-up
is found to be arrested after the contact between the PT and CT, thus
establishing the moderator as an efficient heat sink for IPHWRs.
Abstract: This paper presents a very simple and efficient
algorithm for codebook search, which reduces a great deal of
computation as compared to the full codebook search. The algorithm
is based on a sorting and centroid technique for the search. The
tabulated results show the effectiveness of the proposed algorithm in
terms of computational complexity. In this paper we also introduce a
new performance parameter, the average fractional change in pixel
value, which gives a better understanding of the closeness of the
reconstructed image to the original, since it is related to perception.
This parameter takes into consideration the average fractional change
in each pixel value.
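The metric named above can be sketched directly; since the abstract does not give its exact definition, the formula below (mean of per-pixel absolute change divided by the original pixel value) is an assumption:

```python
def avg_fractional_change(original, reconstructed):
    """Mean of |orig - recon| / orig over all pixels, given two equal-length
    sequences of pixel values (original values assumed nonzero)."""
    return sum(abs(o - r) / o
               for o, r in zip(original, reconstructed)) / len(original)
```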
Abstract: The one-class support vector machine “support vector
data description” (SVDD) is an ideal approach for anomaly or outlier
detection. However, for the applicability of SVDD in real-world
applications, ease of use is crucial. The results of SVDD are
largely determined by the choice of the regularisation parameter C
and the kernel parameter of the widely used RBF kernel. While for
two-class SVMs the parameters can be tuned using cross-validation
based on the confusion matrix, for a one-class SVM this is not
possible, because only true positives and false negatives can occur
during training. This paper proposes an approach to find the optimal
set of parameters for SVDD solely based on a training set from
one class and without any user parameterisation. Results on artificial
and real data sets are presented, underpinning the usefulness of the
approach.
Abstract: Much research into handwritten Thai character
recognition has been proposed, based on, for example, comparing heads
of characters, fuzzy logic, and structure trees. This paper presents a
system for handwritten Thai character recognition which is based on
the Ant-miner algorithm (data mining based on ant colony
optimization). Zoning is initially applied to each character.
Then three distinct features (also called attributes) of each character
in each zone are extracted. The attributes are Head zone, End point,
and Feature code. All attributes are used to construct the
classification rules by the Ant-miner algorithm in order to classify
112 Thai characters. For this experiment, the Ant-miner algorithm is
adapted, with a small change to increase the recognition rate. The
result of this experiment is a 97% recognition rate of the training set
(11200 characters) and 82.7% recognition rate of unseen data test
(22400 characters).
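The zoning step mentioned above can be sketched generically. This is a plain density-based zoning illustration under assumed inputs; the paper's actual attributes (Head zone, End point, Feature code) are richer than pixel counts:

```python
def zoning_features(bitmap, zones=3):
    """Split a square binary bitmap (a list of rows of 0/1) into a
    zones x zones grid and return the foreground-pixel count of each
    cell, row by row, as a simple feature vector."""
    step = len(bitmap) // zones
    return [sum(bitmap[r][c]
                for r in range(zr * step, (zr + 1) * step)
                for c in range(zc * step, (zc + 1) * step))
            for zr in range(zones) for zc in range(zones)]
```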
Abstract: Champs Bourcin black grape, originating from
Aquitaine, France, and planted in Sapa, Lao Cai province, exhibited
high total acidity (11.72 g/L). After 9 days of alcoholic fermentation
at 25 °C using the Saccharomyces cerevisiae UP3OY5 strain, the ethanol
concentration of the wine was 11.5% v/v; however, the wine had a
sharply sour taste. Malolactic fermentation (MLF) was carried
out with the Oenococcus oeni ATCC BAA-1163 strain, which had been
pre-adapted to acid (pH 3-4) and ethanol (8-12% v/v) conditions. We
obtained the highest viability (83.2%) upon malolactic fermentation
after 5 days at 22 °C with early stationary phase O. oeni cells
pre-adapted to pH 3.5 and 8% v/v ethanol in MRS medium. The malic
acid content in the wine decreased from 5.82 g/L to 0.02 g/L after
MLF (21 days at 22 °C). The sensory quality of the wine was
significantly improved.
Abstract: The authors report a case of swine urolithiasis caused
by improper administration of sulfamonomethoxine, which was
diagnosed by examination of urinary sediments and analysis of the
composition of the uroliths. The chemical composition of urinary
calculi obtained from pigs affected with urolithiasis was further
confirmed as sulfamonomethoxine by Fourier transform infrared
(FTIR) spectroscopy. It is suggested that the appearance of typical
fan-like or wheat-bunch crystals in urinary sediments observed under a
light microscope, together with FTIR determination of the crystals, is
helpful in diagnosing swine urolithiasis caused by sulfa calculi.
Abstract: A novel PDE solver using the multidimensional wave
digital filtering (MDWDF) technique to achieve the solution of a 2D
seismic wave system is presented. In essence, the continuous physical
system served by a linear Kirchhoff circuit is transformed to an
equivalent discrete dynamic system implemented by a MD wave
digital filtering (MDWDF) circuit. This amounts to numerically
approximating the differential equations used to describe elements of an
MD passive electronic circuit by grid-based difference equations
implemented via the so-called state quantities within the passive
MDWDF circuit. The digital model can thus track the wave field on a
dense 3D grid of points. Details about how to transform the continuous
system into a desired discrete passive system are addressed. In
addition, initial and boundary conditions are properly embedded into
the MDWDF circuit in terms of state quantities. Graphic results have
clearly demonstrated some physical effects of seismic wave (P-wave
and S-wave) propagation, including radiation, reflection, and
refraction from and across the hard boundaries. Comparison between
the MDWDF technique and the finite difference time domain (FDTD)
approach is also made in terms of the computational efficiency.
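For reference, the FDTD approach that the MDWDF technique is compared against can be sketched in one dimension. This is a generic leapfrog update of the scalar wave equation with hard (fixed) boundaries, an illustrative baseline rather than the paper's 2D seismic solver:

```python
def fdtd_step(u_prev, u_curr, courant=0.5):
    """One leapfrog update of the 1D scalar wave equation with fixed
    (hard-wall) ends. courant = c*dt/dx, which must be <= 1 for
    stability of this explicit scheme."""
    c2 = courant * courant
    u_next = [0.0] * len(u_curr)          # ends stay 0: hard boundary
    for i in range(1, len(u_curr) - 1):
        u_next[i] = (2.0 * u_curr[i] - u_prev[i]
                     + c2 * (u_curr[i + 1] - 2.0 * u_curr[i] + u_curr[i - 1]))
    return u_next
```

At a Courant number of exactly 1, a unit pulse splits into left- and right-travelling halves, the discrete analogue of wave radiation from a point disturbance.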
Abstract: The performance of adaptive beamforming degrades
substantially in the presence of steering vector mismatches. This
degradation is especially severe in the near field, because the
3-dimensional source location is more difficult to estimate than the
2-dimensional direction of arrival in far-field cases. As a solution, a
novel approach of near-field robust adaptive beamforming (RABF) is
proposed in this paper. It is a natural extension of the traditional
far-field RABF and belongs to the class of diagonal loading
approaches, with the loading level determined based on worst-case
performance optimization. However, unlike methods that solve for
the optimal loading by iteration, a simple closed-form solution is
derived here after some approximations, and consequently,
the optimal weight vector can be expressed in a closed form. Besides
simplicity and low computational cost, the proposed approach reveals
how different factors affect the optimal loading as well as the weight
vector. Its excellent performance in the near-field is confirmed via a
number of numerical examples.
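The diagonal-loading idea itself can be illustrated with a toy computation of the loaded weight vector. This sketch uses a 2x2 real covariance and direct inversion under assumed inputs; the paper's contribution, a closed form for the loading level, is not reproduced here:

```python
def diagonal_loading_weights(R, a, loading):
    """Beamforming weights w = (R + loading*I)^(-1) a for a 2x2 real
    sample covariance R and steering vector a, via explicit 2x2
    inversion. Loading regularises an ill-conditioned or mismatched R."""
    r00, r01 = R[0][0] + loading, R[0][1]
    r10, r11 = R[1][0], R[1][1] + loading
    det = r00 * r11 - r01 * r10
    return [(r11 * a[0] - r01 * a[1]) / det,
            (-r10 * a[0] + r00 * a[1]) / det]
```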
Abstract: Every day, human life involves new equipment that is
more automated and more capable, so the need for faster
processors does not seem to end. Despite new architectures and
higher frequencies, a single processor is not adequate for many
applications. Parallel processing and networks are earlier solutions
to this problem. The newer solution, putting a network of resources on a
chip, is called NOC (network on a chip). The most common topology for
a NOC is the mesh topology. There are several routing algorithms suitable
for this topology, such as XY, fully adaptive, etc. In this paper we
suggest a new algorithm named Intermittent X,Y (IX/Y). We
have implemented the new algorithm in a simulation environment to
compare its delay and power consumption with those of existing algorithms.
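The baseline XY routing algorithm mentioned above (not the proposed IX/Y, whose details the abstract does not give) is deterministic and easy to sketch:

```python
def xy_route(src, dst):
    """Deterministic XY routing on a 2D mesh NOC: move along the X
    dimension until the destination column is reached, then along Y.
    src and dst are (x, y) router coordinates; returns the hop path."""
    x, y = src
    path = [(x, y)]
    while x != dst[0]:
        x += 1 if dst[0] > x else -1
        path.append((x, y))
    while y != dst[1]:
        y += 1 if dst[1] > y else -1
        path.append((x, y))
    return path
```

Because every packet resolves X before Y, XY routing is deadlock-free on a mesh, which is one reason it is the usual reference point for new NOC routing schemes.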
Abstract: Concept maps can be generated manually or
automatically. It is important to recognize differences of the two
types of concept maps. The automatically generated concept maps
are dynamic, interactive, and full of associations between the terms
on the maps and the underlying documents. Through a specific
concept mapping system, Visual Concept Explorer (VCE), this paper
discusses how automatically generated concept maps are different
from manually generated concept maps and how different
applications and learning opportunities might be created with the
automatically generated concept maps. The paper presents several
examples of learning strategies that take advantage of
automatically generated concept maps for concept learning and
exploration.
Abstract: Long term rainfall analysis and prediction is a
challenging task especially in the modern world where the impact of
global warming is creating complications in environmental issues.
These factors which are data intensive require high performance
computational modeling for accurate prediction. This research paper
describes a prototype which is designed and developed on grid
environment using a number of coupled software infrastructural
building blocks. This grid enabled system provides the demanding
computational power, efficiency, resources, user-friendly interface,
secure job submission, and high throughput. The results obtained
using sequential execution and grid-enabled execution show that
computational performance is enhanced by 36% to 75% for a
decade of climate parameters. The large variation in performance can be
attributed to the varying degree of computational resources available for
job execution.
Grid computing enables the dynamic runtime selection, sharing
and aggregation of distributed and autonomous resources, which play
an important role not only in business, but also in scientific
and social contexts. This research paper attempts to
explore the grid enabled computing capabilities on weather indices
from HOAPS data for climate impact modeling and change
detection.
Abstract: An end-member selection method for spectral unmixing that is based on Particle Swarm Optimization (PSO) is developed in this paper. The algorithm uses the K-means clustering algorithm and a method of dynamic selection of end-members subsets to find the appropriate set of end-members for a given set of multispectral images. The proposed algorithm has been successfully applied to test image sets from various platforms such as LANDSAT 5 MSS and NOAA's AVHRR. The experimental results of the proposed algorithm are encouraging. The influence of different values of the algorithm control parameters on performance is studied. Furthermore, the performance of different versions of PSO is also investigated.
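The K-means clustering step that the abstract pairs with PSO can be sketched in its simplest form. This is plain Lloyd's algorithm on scalar values under assumed toy data; real multispectral pixels would be vectors, and the PSO-driven end-member selection itself is not reproduced:

```python
def kmeans_1d(points, centers, iters=10):
    """Lloyd's algorithm on scalar values: assign each point to its
    nearest center, then move each center to its cluster mean."""
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for p in points:
            nearest = min(range(len(centers)),
                          key=lambda j: abs(p - centers[j]))
            clusters[nearest].append(p)
        # Empty clusters keep their previous center.
        centers = [sum(c) / len(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return centers
```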
Abstract: Enzymatic saccharification of biomass for reducing
sugar production is one of the crucial processes in biofuel production
through biochemical conversion. In this study, enzymatic
saccharification of dilute potassium hydroxide (KOH) pre-treated
Tetraselmis suecica biomass was carried out by using cellulase
enzyme obtained from Trichoderma longibrachiatum. Initially, the
pre-treatment conditions were optimised by changing alkali reagent
concentration, retention time for reaction, and temperature. The T.
suecica biomass after pre-treatment was also characterized using
Fourier Transform Infrared Spectroscopy and Scanning Electron
Microscopy. These analyses revealed that functional groups such as
acetyl and hydroxyl groups, as well as the structure and surface of
T. suecica biomass, were changed by the pre-treatment, which is
favourable for the enzymatic saccharification process. Comparison of
enzymatic saccharification of untreated and pre-treated microalgal
biomass indicated that a higher level of reducing sugar can be obtained
from pre-treated T. suecica. Enzymatic saccharification of pre-treated T.
suecica biomass was optimised by changing temperature, pH, and
enzyme concentration to solid ratio ([E]/[S]). The highest conversion of
carbohydrate into reducing sugar, 95%, corresponding to a reducing sugar
yield of 20 wt% from pre-treated T. suecica, was obtained at a
temperature of 40 °C, pH 4.5, and an [E]/[S] of 0.1
after 72 h of incubation. The hydrolysate obtained from enzymatic
saccharification of pretreated T. suecica biomass was further
fermented into biobutanol using Clostridium saccharoperbutyliticum
as biocatalyst. The results from this study demonstrate a positive
prospect of application of dilute alkaline pre-treatment to enhance
enzymatic saccharification and biobutanol production from
microalgal biomass.