Abstract: A PH curve can be constructed from given parameters, but the shape of the curve is not easy to imagine from the parameter values. By contrast, a Bézier curve is constructed from its control polygon, from which the figure of the curve can be visualized. In this paper, we use the hodograph of a Bézier curve to construct a PH curve by selecting some of the control vectors and generating the remaining ones, so that the PH property is preserved.
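As background (not stated in the abstract itself), the defining property of a planar PH curve $\mathbf{r}(t)=(x(t),y(t))$ is that its hodograph components satisfy a Pythagorean condition for some polynomial $\sigma(t)$:

```latex
x'(t)^2 + y'(t)^2 = \sigma(t)^2
```

so the parametric speed $|\mathbf{r}'(t)| = |\sigma(t)|$ is itself a polynomial. Generating the remaining hodograph control vectors so that this identity holds is what preserves the PH property.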
Abstract: In recent years, various types of electric vehicles have again gained increasing attention as an environmentally benign technology in transport. Especially for urban areas with high local pollution, this zero-emission technology (at the point of use) is considered to provide suitable solutions. Yet, poor economics and limited driving ranges remain major barriers to a broader market penetration of battery electric vehicles (BEV) and fuel cell vehicles (FCV). The major result of our analyses is that the most important precondition for a further dissemination of BEV in urban areas is the establishment of emission-free zones. This is an instrument which allows the promotion of BEV without providing excessive subsidies. In addition, it is important to note that the full benefits of EV can only be harvested if the electricity used is produced from renewable energy sources. That is to say, it has to be ensured that the use of BEV in urban areas is clearly linked to a green electricity purchase model. Moreover, the introduction of a CO2-emission-based tax system would support this requirement.
Abstract: The present study aims to prepare and evaluate a self-nanoemulsifying drug delivery system (SNEDDS) of the poorly water-soluble drug valsartan in order to achieve a better dissolution rate, which would in turn help enhance oral bioavailability. The work describes a SNEDDS of valsartan using Labrafil M 1944 CS, Tween 80 and Transcutol HP. Pseudoternary phase diagrams, in the presence and absence of the drug, were plotted to determine the emulsification range and to evaluate the effect of valsartan on the emulsification behavior of the phases. Mixtures of oil (Labrafil M 1944 CS), surfactant (Tween 80) and co-surfactant (Transcutol HP) were found to be the optimum formulations. The prepared formulations were evaluated for particle size distribution, nanoemulsifying properties, robustness to dilution, self-emulsification time, turbidity, drug content and in vitro dissolution. The optimized formulations were further subjected to heating-cooling cycles, centrifugation studies, freeze-thaw cycling, particle size distribution and zeta potential measurements to confirm the stability of the formed SNEDDS. The prepared formulation revealed a significant improvement in drug solubility compared with a marketed tablet and the pure drug.
Abstract: Thermo-chemical treatment (TCT) such as pyrolysis is gaining recognition as a valid route for (i) the recovery of materials, valuable products and petrochemicals; (ii) waste recycling; and (iii) elemental characterization. Pyrolysis is also receiving renewed attention for its operational, economic and environmental advantages. In this study, samples of polyethylene terephthalate (PET) and polystyrene (PS) were pyrolysed in a micro-thermobalance reactor (using a thermogravimetric (TGA) setup). Both polymers were prepared and conditioned prior to experimentation. The main objective was to determine the kinetic parameters of the depolymerization reactions that occur within the thermal degradation process. Overall kinetic rate constants (ko) and activation energies (Eo) were determined using the general kinetics theory (GKT) method previously used by a number of authors. Fitted correlations were found and validated using the GKT, with errors within ±5%. This study represents a fundamental step towards the development of scaling relationships for the investigation of larger-scale reactors relevant to industry.
Abstract: This project describes the modeling of various mechatronic architectures, specifically robot morphologies, in an educational environment. Each structure developed by pre-school, primary and secondary students was created using the concept of reverse engineering in a constructivist environment, to be later integrated into educational software that promotes the teaching of educational robotics in a virtual and economical environment.
Abstract: An unsupervised classification algorithm is derived
by modeling observed data as a mixture of several mutually
exclusive classes that are each described by linear combinations of
independent non-Gaussian densities. The algorithm estimates the
data density in each class by using parametric nonlinear functions
that fit to the non-Gaussian structure of the data. This improves
classification accuracy compared with standard Gaussian mixture
models. When applied to textures, the algorithm can learn basis
functions for images that capture the statistically significant structure
intrinsic to the images. We apply this technique to the problem of
unsupervised texture classification and segmentation.
Abstract: This paper presents an algorithm to estimate the parameters of two closely spaced sinusoids, providing a frequency resolution more than 800 times greater than that obtained with the Discrete Fourier Transform (DFT). The strategy uses a highly optimized grid search to accurately estimate the frequency, amplitude and phase of both sinusoids, while keeping the computational effort at reasonable levels. The proposed method has three main characteristics: 1) a high frequency resolution; 2) frequency, amplitude and phase are all estimated at once in a single package; 3) it does not rely on any statistical assumption or constraint. Potential applications of this strategy include the difficult task of resolving coincident partials of instruments in musical signals.
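The abstract's optimized two-sinusoid search is not specified in detail; as a minimal illustration of the underlying idea only (the function name and all parameters below are hypothetical), a single-sinusoid frequency estimate can be refined well past the DFT bin spacing by a dense grid search around the coarse DFT peak:

```python
import numpy as np

def estimate_freq(x, fs, span=2.0, points=2001):
    """Refine a DFT peak by a dense grid search (single-sinusoid
    sketch, not the paper's optimized two-sinusoid procedure)."""
    n = len(x)
    spec = np.abs(np.fft.rfft(x))
    k = int(np.argmax(spec[1:]) + 1)            # coarse DFT peak bin
    f0 = k * fs / n                             # DFT resolution is fs/n
    t = np.arange(n) / fs
    # search a fine grid spanning a few DFT bins around the coarse peak
    grid = np.linspace(f0 - span * fs / n, f0 + span * fs / n, points)
    # the best frequency maximizes correlation with a complex exponential
    power = [abs(np.dot(x, np.exp(-2j * np.pi * f * t))) for f in grid]
    return grid[int(np.argmax(power))]

fs, f_true = 1000.0, 123.4567
t = np.arange(512) / fs
x = np.sin(2 * np.pi * f_true * t + 0.3)
f_hat = estimate_freq(x, fs)
```

Here the raw DFT resolution is fs/n ≈ 1.95 Hz, while the grid spacing is a few millihertz, so the refined estimate lands far closer to the true frequency than any DFT bin.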
Abstract: The purpose of this research was to study inspector performance when using computer-based training (CBT). The visual inspection task was a simulated printed circuit board (PCB) with several types of defects. Subjects were 16 undergraduates randomly selected from King Mongkut's University of Technology Thonburi and tested for 20/20 vision. They were divided equally by performance into two groups (control and treatment) and were given information before running the experiment. Only the treatment group received feedback after the first experiment. Results revealed that the treatment group differed significantly at the 0.01 level and detected a high percentage of defects. Moreover, the inspectors' attitude towards using CBT for inspection was good. These results show that CBT can be used in training to improve inspector performance.
Abstract: In this paper, we present a new method for
incorporating global shift invariance in support vector machines.
Unlike other approaches which incorporate a feature extraction stage,
we first scale the image and then classify it by using the modified
support vector machines classifier. Shift invariance is achieved by
replacing dot products between patterns used by the SVM classifier
with the maximum cross-correlation value between them. Unlike the
normal approach, in which the patterns are treated as vectors, in our
approach the patterns are treated as matrices (or images). Cross-correlation is computed using computationally efficient
techniques such as the fast Fourier transform. The method has been
tested on the ORL face database. The tests indicate that this method
can improve the recognition rate of an SVM classifier.
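As an illustrative sketch of the kernel substitution described above (not the paper's exact implementation), the maximum cross-correlation between two equal-size image patterns over all circular shifts can be obtained in O(N log N) via the FFT correlation theorem:

```python
import numpy as np

def max_xcorr(a, b):
    """Maximum circular cross-correlation between two equal-size
    images, computed via the FFT (correlation theorem)."""
    A = np.fft.fft2(a)
    B = np.fft.fft2(b)
    cc = np.fft.ifft2(A * np.conj(B)).real   # correlation over all shifts
    return cc.max()

rng = np.random.default_rng(0)
img = rng.standard_normal((32, 32))
shifted = np.roll(img, (3, 5), axis=(0, 1))  # circularly shifted copy
# the shifted copy matches the original as well as the original matches itself
```

Replacing the SVM dot product with this value makes the similarity measure invariant to global shifts of either pattern; note that, unlike a plain dot product, such a kernel is not guaranteed to be positive semi-definite.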
Abstract: The steady incompressible flow has been solved in cylindrical coordinates in both the vapour region and the wick structure. The governing equations in the vapour region are the continuity, Navier-Stokes and energy equations, which have been solved using the SIMPLE algorithm. To study the effect of parameter variations on heat pipe operation, a benchmark case was chosen and the effect of changing one parameter was analyzed while the others were held fixed.
Abstract: Network-Centric Air Defense Missile Systems (NCADMS) represent a major advance in air defense missile systems and have been regarded as one of the major research issues in the military domain at present. Owing to the lack of knowledge of and experience with NCADMS, modeling and simulation is an effective approach to operational analysis, compared with equation-based ones. However, the complex dynamic interactions among entities and the flexible architectures of NCADMS place new requirements and challenges on the simulation framework and models. Agent-Based Simulation (ABS) explicitly addresses modeling the behaviors of heterogeneous individuals. Agents have the capability to sense and understand things, make decisions, and act on the environment. They can also cooperate with others dynamically to perform the tasks assigned to them. ABS thus proves an effective approach for exploring the new operational characteristics emerging in NCADMS. In this paper, based on an analysis of the network-centric architecture and new cooperative engagement strategies for NCADMS, an agent-based simulation framework was designed by expanding the framework of the so-called System Effectiveness Analysis Simulation (SEAS). The framework specifies the components, the relationships and interactions between them, and the structure and behavior rules of an agent in NCADMS. Based on scenario simulations, the information and decision superiority and operational advantages of NCADMS were analyzed, and some suggestions were provided for its future development.
Abstract: To realize the vision of ubiquitous computing, it is
important to develop a context-aware infrastructure which can help
ubiquitous agents, services, and devices become aware of their
contexts because such computational entities need to adapt themselves
to changing situations. A context-aware infrastructure manages the
context model representing contextual information and provides
appropriate information. In this paper, we introduce Context-Aware
Middleware for URC System (hereafter CAMUS) as a context-aware
infrastructure for a network-based intelligent robot system and discuss
the ontology-based context modeling and reasoning approach which is
used in that infrastructure.
Abstract: Newcastle Disease Virus (NDV), an avian paramyxovirus, causes a highly contagious, generalized disease of domestic poultry and wild birds characterized by gastro-intestinal, respiratory and nervous signs. In this study, it was shown that the NDV strains AF2240 and V4-UPM are cytolytic to human promyelocytic leukemia (HL60) and human T-lymphoblastic leukemia (CEM-SS) cells. Results from the MTT cytolytic assay showed that the CD50 of NDV AF2240 against HL60 was 130 HAU, and those of NDV V4-UPM against HL60 and CEM-SS were 110.6 and 150.9 HAU, respectively. In addition, both strains were found to inhibit cell proliferation in a dose-dependent manner. The mode of cell death, apoptosis or necrosis, was further analyzed using acridine orange and propidium iodide (AO/PI) staining. Our results showed that both NDV strains induced primarily apoptosis in the treated cells at the CD50 concentration. In conclusion, both NDV strains caused cytolytic effects, primarily via apoptosis, in leukemia cells.
Abstract: The purpose of grid computing is to utilize the computational power of idle resources distributed across different areas. Given the grid's dynamism and its decentralized resources, an efficient scheduler is needed for scheduling applications. Since task scheduling is among the NP-hard problems, much research has focused on heuristic algorithms, especially genetic ones. However, since the genetic algorithm searches the problem space globally and lacks the efficiency required for local search, combining it with local search algorithms can compensate for this shortcoming. The aim of this paper is to combine the genetic algorithm with GELS (GA-GELS) as a method to solve the scheduling problem while simultaneously paying attention to two factors: completion time and the number of missed tasks. Results show that the proposed algorithm can decrease makespan while minimizing the number of missed tasks compared with traditional methods.
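As a hedged sketch of the genetic half of such an approach only (the GELS local-search stage and the missed-task objective are omitted, and all parameters are illustrative), a GA mapping independent tasks onto machines to reduce makespan might look like:

```python
import random

def makespan(assign, lengths, machines):
    """Makespan = load of the most loaded machine."""
    load = [0.0] * machines
    for task, m in enumerate(assign):
        load[m] += lengths[task]
    return max(load)

def ga_schedule(lengths, machines, pop_size=40, gens=200, seed=1):
    """Minimal GA for mapping independent tasks onto machines
    (illustrative only; no GELS local search)."""
    rng = random.Random(seed)
    n = len(lengths)
    pop = [[rng.randrange(machines) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda a: makespan(a, lengths, machines))
        survivors = pop[: pop_size // 2]           # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)
            child = p1[:cut] + p2[cut:]            # one-point crossover
            if rng.random() < 0.3:                 # mutation: reassign one task
                child[rng.randrange(n)] = rng.randrange(machines)
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda a: makespan(a, lengths, machines))

lengths = [4, 7, 3, 8, 2, 5, 6, 1]     # toy task lengths, total = 36
best = ga_schedule(lengths, machines=3)
best_span = makespan(best, lengths, 3)
```

For this toy instance the load lower bound is 36/3 = 12, and the GA typically converges to or near that balanced assignment.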
Abstract: A fuzzy classifier using multiple ellipsoids to approximate decision regions is designed in this paper. The Gustafson-Kessel algorithm (GKA), which uses an adaptive distance norm based on the covariance matrices of the prototype data points, is adopted to learn the ellipsoids. GKA is able to adapt the distance norm to the underlying distribution of the prototype data points, except that the sizes of the ellipsoids need to be determined a priori. To overcome GKA's inability to determine an appropriate ellipsoid size, a genetic algorithm (GA) is applied to learn it. With GA combined with GKA, it is shown in this paper that the proposed method outperforms the benchmark algorithms as well as other algorithms in the field.
Abstract: In this work, the possibility of constructing classifiers for face-recognition systems based on conjugation criteria is investigated. The link between bipartite conjugation, conjugation with a subspace and conjugation with the null-space is shown. A unified decision rule is investigated, which assigns a face to a class by considering the relationship between conjugation values. The described recognition method can be successfully applied in distributed video control and video surveillance systems.
Abstract: By using Mawhin's continuation theorem of coincidence degree theory, we establish the existence of 2n positive periodic solutions for n-species non-autonomous Lotka-Volterra competition systems with harvesting terms. An example is given to illustrate the effectiveness of our results.
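A typical form of the system studied in such works (the abstract itself does not display it) is the $n$-species competition model with harvesting terms,

```latex
\dot{x}_i(t) = x_i(t)\Big( r_i(t) - \sum_{j=1}^{n} a_{ij}(t)\, x_j(t) \Big) - h_i(t),
\qquad i = 1, \dots, n,
```

where the growth rates $r_i$, competition coefficients $a_{ij}$ and harvesting terms $h_i$ are continuous, positive, $\omega$-periodic functions; the continuation theorem is then applied to an equivalent operator equation to obtain the periodic solutions.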
Abstract: A frequency grouping approach for multi-channel
instantaneous blind source separation (I-BSS) of convolutive
mixtures is proposed for a lower net residual inter-symbol
interference (ISI) and inter-channel interference (ICI) than the
conventional short-time Fourier transform (STFT) approach. Starting
in the time domain, STFTs are taken with overlapping windows to
convert the convolutive mixing problem into frequency domain
instantaneous mixing. Mixture samples at the same frequency but
from different STFT windows are grouped together forming unique
frequency groups.
The individual frequency group vectors are input to the I-BSS
algorithm of choice, from which the output samples are dispersed
back to their respective STFT windows. After applying the inverse
STFT, the resulting time domain signals are used to construct the
complete source estimates via the weighted overlap-add method
(WOLA). The proposed algorithm is tested for source deconvolution
given two mixtures, and simulated along with the STFT approach to
illustrate its superiority for fairly motionless sources.
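The grouping step above can be sketched for a single channel as follows (window length, hop and names are illustrative, and the multi-channel I-BSS stage itself is omitted): samples of the same STFT bin taken from the overlapping windows form one frequency group.

```python
import numpy as np

def frequency_groups(x, win=64, hop=32):
    """Group STFT samples by frequency bin across overlapping windows
    (single-channel sketch of the grouping step only)."""
    w = np.hanning(win)
    n_frames = 1 + (len(x) - win) // hop
    frames = np.stack([x[i * hop : i * hop + win] * w for i in range(n_frames)])
    spec = np.fft.rfft(frames, axis=1)        # shape: (frames, bins)
    # one group per frequency bin: that bin's samples across all windows
    return [spec[:, k] for k in range(spec.shape[1])]

x = np.random.default_rng(0).standard_normal(1024)
groups = frequency_groups(x)                  # win//2 + 1 = 33 groups
```

Each group vector is what would be handed to the instantaneous BSS algorithm; after separation, the samples are dispersed back to their STFT windows and resynthesized by weighted overlap-add.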
Abstract: Since large power transformers are the most expensive and strategically important components of any power generation and transmission system, their reliability is crucial to the operation of the energy system. Circuit breakers are also very important elements of the power transmission line, so monitoring their events provides a knowledge base for determining the time to the next maintenance. This paper introduces a comparative method for estimating the state of transformers and circuit breakers using continuous monitoring of voltage and current, and details a new wavelet-based method for apparatus insulation monitoring. For transformer insulation monitoring, a method based on the wavelet transform and neutral-point analysis is proposed. Using EMTP tools, faults in the transformer winding were simulated with a detailed transformer winding model, and the neutral-point current of the winding was analyzed by the wavelet transform. It is shown that the neutral current of the transformer winding carries useful information about faults in the transformer insulation.
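The paper's wavelet and winding-model details are not given; as a toy illustration of why wavelet analysis of the neutral-point current is informative, a one-level Haar transform localizes an injected discontinuity (a hypothetical fault signature, not simulated EMTP data) in its detail coefficients:

```python
import math

def haar_dwt(signal):
    """One-level Haar DWT: approximation and detail coefficients."""
    a = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    d = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    return a, d

# smooth 50 Hz "neutral current" sampled at 6400 Hz, with a step
# injected at sample 601 as a stand-in for a fault transient
i_n = [math.sin(2 * math.pi * 50 * t / 6400) for t in range(1024)]
for t in range(601, 1024):
    i_n[t] += 0.5

_, detail = haar_dwt(i_n)
# the detail coefficients are tiny on the smooth sinusoid and spike
# exactly at the pair containing the discontinuity (pair index 300)
peak = max(range(len(detail)), key=lambda k: abs(detail[k]))
```

The smooth 50 Hz component produces only small sample-to-sample differences, so the detail sequence is near zero everywhere except at the transient, which is the basic mechanism a wavelet-based insulation monitor exploits.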
Abstract: Accounts of language acquisition differ significantly in their treatment of the role of prediction in language learning. In particular, nativist accounts posit that probabilistic learning about words and word sequences has little to do with how children come to use language. The accuracy of this claim was examined by testing whether distributional probabilities and frequency contributed to how well 3- to 4-year-olds repeated simple word chunks. Corresponding chunks were the same length, expressed similar content, and were all grammatically acceptable, yet the results of the study showed marked differences in performance when overall distributional frequency varied. A distributional model of language predicted the empirical findings better than a number of other models, replicating earlier findings and showing that children attend to distributional probabilities in an adult corpus. This suggests that language learning is more prediction- and error-based than based on the abstract rules that nativist accounts posit.