Abstract: The paper discusses a 3D numerical solution of the inverse boundary problem for a continuous casting process of an alloy. The main goal of the analysis presented in the paper was to estimate the heat fluxes along the external surface of the ingot. Verified information on these fluxes is crucial for the proper design of the mould, an effective cooling system, and, more generally, the whole caster. In the study, an enthalpy-porosity technique implemented in the Fluent package was used for modeling the solidification process. In this method, the phase change interface is determined on the basis of the liquid fraction approach. In the inverse procedure, sensitivity analysis was applied for retrieving the boundary conditions. A comparison of the measured and retrieved values showed a high accuracy of the computations. Additionally, the influence of the measurement accuracy on the estimated heat fluxes was investigated.
Abstract: The kinetics of palm oil catalytic cracking over
aluminum-containing mesoporous silica Al-MCM-41 (5% Al) was
investigated in a batch autoclave reactor in the temperature
range of 573-673 K. The catalyst was prepared by the sol-gel
technique and characterized by nitrogen adsorption and X-ray
diffraction. A surface area of 1276 m2/g, an average pore
diameter of 2.54 nm, and a pore volume of 0.811 cm3/g were
obtained. The catalytic cracking runs were conducted using
50 g of oil and 1 g of catalyst. The reaction pressure was
recorded at different time intervals, and the data were
analyzed with the Levenberg-Marquardt (LM) algorithm in the
Polymath software. The reaction order was found to be -1.5,
with an activation energy of 3200 J/gmol.
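As an illustration of the pressure-data fitting described above, the following sketch fits a power-law rate model to synthetic concentration-time data with the Levenberg-Marquardt algorithm (via SciPy rather than Polymath; all data, noise levels, and starting values are illustrative, not the paper's measurements):

```python
import numpy as np
from scipy.optimize import curve_fit

def conc(t, k, n, c0=1.0):
    # Integrated power-law rate law dC/dt = -k * C**n (n != 1):
    # C(t) = (C0**(1-n) - (1-n)*k*t)**(1/(1-n)); clipped to stay defined.
    base = np.maximum(c0 ** (1 - n) - (1 - n) * k * t, 1e-12)
    return base ** (1.0 / (1 - n))

# Synthetic "pressure-derived" data generated with the paper's order n = -1.5
rng = np.random.default_rng(1)
t = np.linspace(0.0, 6.0, 40)
y = conc(t, 0.05, -1.5) + rng.normal(0.0, 5e-4, t.size)

# Levenberg-Marquardt fit of the rate constant k and reaction order n
(k_fit, n_fit), _ = curve_fit(lambda t, k, n: conc(t, k, n), t, y,
                              p0=[0.04, -1.2], method="lm")
```

With clean data and a reasonable starting point, the fit recovers the generating parameters; real pressure data would first need conversion to concentration.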
Abstract: Forest fires in Thailand are an annual occurrence and a cause of air pollution. This study estimates the emissions from forest fires during 2005-2009 using the MODerate-resolution Imaging Spectro-radiometer (MODIS) sensor aboard the Terra and Aqua satellites, experimental data, and statistical data. The forest fire emissions are estimated using the equation established by Seiler and Crutzen in 1982. The spatial and temporal variation of forest fire emissions is analyzed and displayed in the form of grid density maps. The satellite data analysis suggests that 86,877 fire hotspots occurred between 2005 and 2009, the large majority (more than 80%) in deciduous forest. The peak fire period runs from January to May. The emissions from forest fires during 2005 to 2009 amounted to about 3,133,845 tons of CO, 47,610.337 tons of CO2, 204,905 tons of CH4, and 6,027 tons of N2O, or about 6,171,264 tons of CO2eq; the fires also emitted 256,132 tons of PM10. Emissions were largest in 2007, and within each year March has the maximum forest fire emissions. The areas with a high density of forest fire emissions were the forests situated in the northern, western, and upper northeastern parts of the country.
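A Seiler-Crutzen style estimate of this kind reduces to a product of burned area, biomass load, the fraction of biomass burned, combustion efficiency, and a species emission factor. The sketch below uses hypothetical parameter values, not the paper's inventory data:

```python
# Seiler-Crutzen style estimate: burned dry matter M = A * B * alpha * beta;
# emission of species s = M * EF_s. All numeric inputs below are illustrative.
def fire_emission(area_ha, biomass_t_per_ha, burn_fraction,
                  combustion_efficiency, emission_factor_g_per_kg):
    """Return the emission of one species in tonnes."""
    burned_biomass_t = (area_ha * biomass_t_per_ha
                        * burn_fraction * combustion_efficiency)
    # The emission factor is grams of pollutant per kg of dry matter burned.
    return burned_biomass_t * emission_factor_g_per_kg / 1000.0

# Example: 1,000 ha of forest with hypothetical load and factors
co_tonnes = fire_emission(1000, 60, 0.5, 0.9, 100)
```

Summing such per-cell estimates over a hotspot grid gives the density maps described in the abstract.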
Abstract: In recent years, various types of electric vehicles
have again gained increasing attention as an environmentally
benign technology in transport. Especially for urban areas with
high local pollution, this zero-emission technology (at the
point of use) is considered to provide proper solutions. Yet,
poor economics and limited driving ranges are still major
barriers to a broader market penetration of battery electric
vehicles (BEV) and fuel cell vehicles (FCV). The major result
of our analyses is that the most important precondition for a
further dissemination of BEV in urban areas is the establishment
of emission-free zones. This is an instrument that allows the
promotion of BEV without excessive subsidies. In addition, it is
important to note that the full benefits of EV can only be
harvested if the electricity used is produced from renewable
energy sources. That is to say, it has to be ensured that the
use of BEV in urban areas is clearly linked to a green
electricity purchase model. Moreover, the introduction of a
CO2-emission-based tax system would support this requirement.
Abstract: The present study aimed to prepare and evaluate a self-nanoemulsifying drug delivery system (SNEDDS) of the poorly water-soluble drug valsartan in order to achieve a better dissolution rate, which would further help enhance oral bioavailability. The work describes a SNEDDS of valsartan using Labrafil M 1944 CS, Tween 80, and Transcutol HP. Pseudoternary phase diagrams, in the presence and absence of the drug, were plotted to determine the emulsification range and to evaluate the effect of valsartan on the emulsification behavior of the phases. Mixtures consisting of oil (Labrafil M 1944 CS), surfactant (Tween 80), and co-surfactant (Transcutol HP) were found to be the optimum formulations. The prepared formulations were evaluated for particle size distribution, nanoemulsifying properties, robustness to dilution, self-emulsification time, turbidity, drug content, and in vitro dissolution. The optimized formulations were further subjected to heating-cooling cycles, centrifugation studies, freeze-thaw cycling, particle size distribution, and zeta potential measurements to confirm the stability of the formed SNEDDS formulations. The prepared formulation revealed a significant improvement in drug solubility compared with the marketed tablet and the pure drug.
Abstract: In this paper, a data correction algorithm is proposed
for cases where the ambient air temperature varies. To correct
the infrared data, the initial temperature, or the initial
infrared image data, is used, so that a target source system is
not necessary. The temperature data obtained from the infrared
detector show a nonlinear dependence on the surface temperature.
To handle this nonlinearity, a Taylor series approach is
adopted. It is shown that the proposed algorithm can reduce the
influence of the environmental temperature on the components on
the board. The main advantage of this algorithm is that it uses
only the initial temperature of the components on the board,
rather than an external reference device such as a blackbody
source, to obtain reference temperatures.
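A generic first-order Taylor linearization of a nonlinear detector response, inverted around an initial temperature taken from the first frame, can be sketched as follows. The response `f` here is a hypothetical placeholder, not the paper's calibration model:

```python
# Sketch: linearize a (hypothetical) nonlinear detector response f around the
# initial temperature T0 from the first infrared frame, then invert it, so no
# blackbody reference source is needed.
def f(T):
    # Hypothetical monotonic nonlinear response (signal vs. temperature)
    return 0.8 * T + 2e-3 * T ** 2

def df(T):
    # Analytic derivative of the hypothetical response
    return 0.8 + 4e-3 * T

def correct(signal, T0):
    # f(T) ~= f(T0) + f'(T0)*(T - T0)  =>  T ~= T0 + (s - f(T0)) / f'(T0)
    return T0 + (signal - f(T0)) / df(T0)

T_true = 32.0
T0 = 30.0                # initial component temperature from the first frame
T_est = correct(f(T_true), T0)
```

The first-order inversion is accurate as long as the actual temperature stays close to the expansion point; higher-order Taylor terms would extend the valid range.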
Abstract: This paper presents the results of empirical research
on the experiences of Polish companies from the Podkarpackie
voivodeship with implementing environmental management systems
(EMS) according to the requirements of the ISO 14001
international standard. The incentives to introduce and certify
formal EMSs, treated here as an organizational eco-innovation,
are presented.
Abstract: Thermo-chemical treatment (TCT) such as pyrolysis is
gaining recognition as a valid route for (i) recovery of
materials, valuable products, and petrochemicals; (ii) waste
recycling; and (iii) elemental characterization. Pyrolysis is
also receiving renewed attention for its operational, economic,
and environmental advantages. In this study, samples of
polyethylene terephthalate (PET) and polystyrene (PS) were
pyrolysed in a micro-thermobalance reactor (a thermogravimetric
analysis, TGA, setup). Both polymers were prepared and
conditioned prior to experimentation. The main objective was to
determine the kinetic parameters of the depolymerization
reactions that occur within the thermal degradation process.
Overall kinetic rate constants (ko) and activation energies (Eo)
were determined using the general kinetics theory (GKT) method
previously used by a number of authors. Fitted correlations were
found and validated using the GKT; errors were within ±5%. This
study represents a fundamental step towards the development of
scaling relationships for the investigation of larger-scale
reactors relevant to industry.
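The Arrhenius step common to such kinetic analyses, extracting an activation energy from rate constants at several temperatures, can be sketched as follows (the values are illustrative, not the PET/PS results):

```python
import numpy as np

# Arrhenius analysis: ln k = ln k0 - E/(R*T), so the activation energy E
# follows from the slope of ln k versus 1/T. Values below are illustrative.
R = 8.314                          # gas constant, J/(mol K)
T = np.array([600.0, 650.0, 700.0, 750.0])   # temperatures, K
E_true, k0_true = 150e3, 1e9       # assumed "true" parameters
k = k0_true * np.exp(-E_true / (R * T))      # synthetic rate constants

slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
E_est = -slope * R                 # activation energy, J/mol
k0_est = np.exp(intercept)         # pre-exponential factor
```

With rate constants from TGA mass-loss curves in place of the synthetic `k`, the same two-line fit yields the reported kinetic parameters.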
Abstract: This paper uses a quasi-steady molecular statics model
with a diamond tool to simulate the temperature rise during
nanoscale orthogonal cutting of single-crystal silicon. It
further qualitatively analyzes the temperature field of the
silicon workpiece both without and with heat transfer. The
temperature rise of the workpiece is assumed to be caused mainly
by two heat sources: plastic deformation heat and friction heat.
A theoretical model is then developed for the production of
plastic deformation heat and friction heat during nanoscale
orthogonal cutting. The contributions of the two heat sources
are summed, and the resulting total temperature rise at each
atom of the workpiece is substituted into a finite difference
heat transfer equation to compute and analyze the temperature
field at each step.
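A finite-difference heat transfer update of the kind described can be sketched in one dimension as an explicit (FTCS) step with a local source term; the grid, material, and source values below are illustrative, not the paper's:

```python
import numpy as np

# Explicit finite-difference (FTCS) heat-conduction step with a local source:
# T_new = T + alpha*dt*d2T/dx2 + source*dt. Endpoints are held fixed here.
def step(T, alpha, dt, dx, source):
    lap = np.zeros_like(T)
    lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx ** 2
    return T + alpha * dt * lap + source * dt

T = np.full(51, 300.0)              # initial temperature field, K
src = np.zeros(51)
src[25] = 1e4                       # heat deposited at the cutting zone, K/s
for _ in range(200):
    T = step(T, alpha=1e-5, dt=1e-4, dx=1e-3, source=src)
```

The stability condition `alpha*dt/dx**2 <= 0.5` holds here (it equals 1e-3); the heated node warms up and the heat spreads to its neighbours over the steps.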
Abstract: Spectrum is a scarce commodity, and the spectrum scarcity faced by wireless-based service providers has led to high congestion levels. Because all networks share a common pool of channels, exhausting the available channels forces networks to block services. Researchers have found that cognitive radio (CR) technology may resolve the spectrum scarcity. A CR is a self-configuring entity in a wireless network that senses its environment, tracks changes, and frequently exchanges information with its network. However, cognitive radio networks (CRNs) face challenges, and conditions worsen while tracking changes, i.e., when reallocating to other under-utilized channels as a primary network user arrives. In this paper, a channel (resource) reallocation technique for CRNs based on a DNA-inspired computing algorithm is proposed.
Abstract: The project describes the modeling of various
mechatronic architectures, specifically robot morphologies, in
an educational environment. Each structure, developed by
pre-school, primary, and secondary students, was created using
the concept of reverse engineering in a constructivist
environment, to later be integrated into educational software
that promotes the teaching of educational robotics in a virtual
and economical environment.
Abstract: An unsupervised classification algorithm is derived
by modeling observed data as a mixture of several mutually
exclusive classes that are each described by linear combinations of
independent non-Gaussian densities. The algorithm estimates the
data density in each class by using parametric nonlinear functions
that fit to the non-Gaussian structure of the data. This improves
classification accuracy compared with standard Gaussian mixture
models. When applied to textures, the algorithm can learn basis
functions for images that capture the statistically significant structure
intrinsic to the images. We apply this technique to the problem of
unsupervised texture classification and segmentation.
Abstract: This paper presents an algorithm to estimate the parameters of two closely spaced sinusoids, providing a frequency resolution more than 800 times greater than that obtained using the Discrete Fourier Transform (DFT). The strategy uses a highly optimized grid search to accurately estimate the frequency, amplitude, and phase of both sinusoids while keeping the computational effort at reasonable levels. The proposed method has three main characteristics: 1) high frequency resolution; 2) frequency, amplitude, and phase are all estimated at once by a single package; 3) it does not rely on any statistical assumption or constraint. Potential applications of this strategy include the difficult task of resolving coincident partials of instruments in musical signals.
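A minimal, unoptimized form of such a grid search can be sketched as follows: for each candidate frequency pair, the amplitudes and phases follow from a linear least-squares fit of cosine/sine columns, and the pair with the smallest residual wins. This is only the basic idea; the paper's optimized search strategy and resolution figures are not reproduced here:

```python
import numpy as np

def fit_two_sinusoids(x, fs, f_grid):
    """Brute-force grid search over frequency pairs; amplitudes and phases
    come for free from the least-squares coefficients of each pair."""
    n = np.arange(x.size) / fs
    best = (np.inf, None)
    for i, f1 in enumerate(f_grid):
        for f2 in f_grid[i + 1:]:
            # Design matrix: cos/sin columns at both candidate frequencies
            A = np.column_stack([
                np.cos(2 * np.pi * f1 * n), np.sin(2 * np.pi * f1 * n),
                np.cos(2 * np.pi * f2 * n), np.sin(2 * np.pi * f2 * n),
            ])
            coef, *_ = np.linalg.lstsq(A, x, rcond=None)
            r = np.sum((x - A @ coef) ** 2)
            if r < best[0]:
                best = (r, (f1, f2))
    return best[1]

# Two sinusoids closer together than the DFT bin width of this record
fs = 1000.0
t = np.arange(256) / fs
x = (1.0 * np.cos(2 * np.pi * 50.2 * t)
     + 0.7 * np.cos(2 * np.pi * 52.6 * t + 0.5))
f1, f2 = fit_two_sinusoids(x, fs, np.arange(48.0, 56.0, 0.2))
```

The 256-sample record has a DFT bin width of about 3.9 Hz, yet the search separates components 2.4 Hz apart, since the residual is minimized only when both candidate frequencies match.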
Abstract: The purpose of this research was to study inspector performance when using computer-based training (CBT). The visual inspection task was a printed circuit board (PCB) simulated with several types of defects. The subjects were 16 undergraduates randomly selected from King Mongkut's University of Technology Thonburi and tested for 20/20 vision. They were divided, matched on performance, into two equal groups (control and treatment) and were given information before running the experiment. Only the treatment group was given feedback after the first experiment. Results revealed that the treatment group differed significantly at the 0.01 level and detected a higher percentage of defects. Moreover, the inspectors' attitude toward using CBT for inspection was good. These results show that CBT can be used in training to improve inspector performance.
Abstract: Synthetic Aperture Radar (SAR) is a form of imaging radar that takes full advantage of the relative movement of the antenna with respect to the target. Through the simultaneous processing of the radar reflections over the movement of the antenna via the Range Doppler Algorithm (RDA), the superior resolution of a theoretically wider antenna, termed a synthetic aperture, is obtained. SAR can therefore achieve high-resolution two-dimensional imagery of the ground surface. In addition, two filtering steps in the range and azimuth directions provide sufficiently accurate results. This paper develops a simulation in which realistic SAR images can be generated, and the effect of velocity errors on the resulting image is investigated. Simulation results on image resolution are presented for several velocity errors. In most cases, the algorithms need to be adjusted for particular datasets or particular applications.
Abstract: This paper aims to give a full study of the dynamic
behavior of a mono-phase active power filter. First, the principle of
the parallel active power filter will be introduced. Then, a
dimensioning procedure for all its components will be explained in
detail, such as the input filter, the current and voltage controllers.
This active power filter is simulated using the OrCAD program,
demonstrating the validity of the theoretical study.
Abstract: Model-checking tools such as Symbolic Model Verifier
(SMV) and NuSMV are available for checking hardware designs.
These tools can automatically check the formal correctness of a
design. However, NuSMV is too low-level for describing a
complete hardware design. It is therefore necessary to translate
the system definition, as designed in a language such as Verilog
or VHDL, into a language such as NuSMV for validation. In this
paper, we present a meta hardware description language, Melasy,
that contains a code generator targeting existing hardware
description languages (HDLs) and model-checking languages,
thereby solving this problem.
Abstract: Network-Centric Air Defense Missile Systems (NCADMS)
represent a major advance in air defense missile systems and
have become one of the major research issues in the military
domain. Due to the lack of knowledge and experience with
NCADMS, modeling and simulation is an effective approach to
operational analysis, compared with equation-based approaches.
However, the complex dynamic interactions among entities and
the flexible architectures of NCADMS put forward new
requirements and challenges for the simulation framework and
models. Agent-based simulation (ABS) explicitly addresses the
modeling of behaviors of heterogeneous individuals. Agents have
the capability to sense and understand their surroundings, make
decisions, and act on the environment. They can also cooperate
with others dynamically to perform the tasks assigned to them.
ABS proves to be an effective approach to exploring the new
operational characteristics emerging in NCADMS. In this paper,
based on an analysis of the network-centric architecture and
new cooperative engagement strategies for NCADMS, an
agent-based simulation framework was designed by extending the
framework of the so-called System Effectiveness Analysis
Simulation (SEAS). The simulation framework specifies the
components, the relationships and interactions between them,
and the structure and behavior rules of an agent in NCADMS.
Based on scenario simulations, information and decision
superiority and operational advantages of NCADMS were analyzed,
and some suggestions were provided for its future development.
Abstract: This paper is concerned with the numerical
minimization of energy functionals in BV(Ω) (the space of
functions of bounded variation) involving total variation, for
the gray-scale 1-dimensional inpainting problem. Applications
are shown using the finite element method and the discontinuous
Galerkin method for total variation minimization. Numerical
examples show how the images recovered by these two methods
differ.
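For intuition, a smoothed version of the 1-D total-variation inpainting energy can be minimized by plain gradient descent. This is a sketch of the energy being minimized, not the paper's finite element or discontinuous Galerkin schemes, and all parameter values are illustrative:

```python
import numpy as np

# Smoothed 1-D TV inpainting: minimize
#   sum_i sqrt((u[i+1]-u[i])**2 + eps) + (lam/2) * sum_{known i} (u[i]-f[i])**2
# by gradient descent; unknown samples are filled in by the TV term alone.
def tv_inpaint(f, known, lam=10.0, eps=1e-2, steps=5000, tau=0.01):
    u = f.astype(float).copy()
    for _ in range(steps):
        d = np.diff(u)
        w = d / np.sqrt(d ** 2 + eps)        # derivative of smoothed |d|
        grad = np.zeros_like(u)
        grad[:-1] -= w                       # each d couples two samples
        grad[1:] += w
        grad[known] += lam * (u[known] - f[known])   # data fidelity term
        u -= tau * grad
    return u

f = np.array([0., 0., 0., 5., 5., 1., 1., 1.])   # samples 3..4 are "damaged"
known = np.ones(8, bool)
known[3:5] = False
u = tv_inpaint(f, known)
```

The damaged samples relax to intermediate values between the flat regions on either side, which is the monotone, minimal-variation fill the TV energy prefers.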
Abstract: The automatic discrimination of seismic signals is an important practical goal for earth-science observatories due to the large amount of information they receive continuously. An essential discrimination task is to allocate the incoming signal to a group associated with the kind of physical phenomenon producing it. In this paper, we present new techniques for seismic signal classification: local, regional, and global discrimination. These techniques were tested on seismic signals from the database of the National Geophysical Institute of the Centre National pour la Recherche Scientifique et Technique (Morocco) using the Moroccan software for seismic signal analysis.