Abstract: Based on a non-linear single-track model describing the vehicle dynamics, an optimal path planning strategy is developed. Real-time optimization is used to generate reference control values that guide the vehicle along a calculated lane which is optimal with respect to different objectives such as energy consumption, run time, safety, or comfort. A strict mathematical formulation of autonomous driving allows decisions to be made in otherwise undefined situations such as lane changes or obstacle avoidance. Based on the vehicle's position, the lane situation, and obstacle positions, the optimization problem is reformulated in real time to avoid the obstacle and any collision.
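As a rough illustration of how such a re-formulation can work, the sketch below uses a hypothetical lane-keeping cost (a point-mass stand-in, not the paper's single-track model; all weights, profiles, and the obstacle penalty are made up) to which an obstacle term is added, so that minimising the cost now selects a swerving lateral profile:

```python
import numpy as np

def path_cost(lateral_offsets, obstacle=None, w_track=1.0, w_obs=50.0):
    """Cost of a candidate lateral-offset profile along the lane."""
    offsets = np.asarray(lateral_offsets, dtype=float)
    cost = w_track * np.sum(offsets ** 2)        # stay near lane centre
    cost += np.sum(np.diff(offsets) ** 2)        # comfort: smooth steering
    if obstacle is not None:
        idx, y_obs = obstacle                    # obstacle at step idx, lateral offset y_obs
        cost += w_obs * np.exp(-(offsets[idx] - y_obs) ** 2)  # penalise proximity
    return cost

# naive enumeration over two candidate profiles (a stand-in for the
# real-time optimiser): staying on the centre line vs. swerving
candidates = [np.zeros(5), np.array([0, 0.5, 1.0, 0.5, 0])]
best = min(candidates, key=lambda c: path_cost(c, obstacle=(2, 0.0)))
```

With an obstacle on the centre line at step 2, the swerving profile gets the lower cost, mirroring how the reformulated problem shifts the optimal lane around the obstacle.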
Abstract: Photovoltaic power generation forecasting is an important task in renewable-energy power system planning and operation. This paper explores the application of neural networks (NN) to the design of systems that forecast photovoltaic power generation one week ahead, using weather databases that include global irradiance and temperature for the city of Ghardaia (southern Algeria), collected with a data acquisition system. Simulations were run and the results are discussed, showing that the neural network technique is capable of decreasing the photovoltaic power generation forecasting error.
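The forecasting task can be framed as supervised learning on lagged weather features. The sketch below illustrates this framing with a linear least-squares model standing in for the paper's neural network; the irradiance, temperature, and power series are synthetic placeholders, not the Ghardaia measurements:

```python
import numpy as np

rng = np.random.default_rng(0)
days = 120
# synthetic weather and PV power series (illustrative only)
irradiance = 6 + np.sin(np.arange(days) * 2 * np.pi / 365) + rng.normal(0, 0.1, days)
temperature = 25 + rng.normal(0, 2, days)
power = 0.8 * irradiance + 0.05 * temperature + rng.normal(0, 0.05, days)

horizon = 7  # forecast one week ahead
# features: today's weather; target: power one week later
X = np.column_stack([irradiance[:-horizon], temperature[:-horizon],
                     np.ones(days - horizon)])
y = power[horizon:]

coef, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ coef
mae = np.mean(np.abs(pred - y))  # the forecasting error to be reduced
```

An NN would replace the linear fit with a non-linear mapping, but the database layout (weather features in, week-ahead power out) and the error metric stay the same.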
Abstract: There are three distinct stages in the evolution of economic thought:
1. In the first stage, the major concern was to accelerate economic growth and increase the availability of material goods, especially in developing economies with very low living standards, since poverty eradication was equated with faster economic growth.
2. In the second stage, economists drew a distinction between growth and development. Development was seen as going beyond economic growth, bringing certain changes in the structure of the economy with a more equitable distribution of the benefits of growth, with growth becoming automatic and sustained.
3. In the third stage, now reached, the concern is with "sustainable development", that is, development not only for the present but also for the future.
Thus the focus has shifted from "sustained growth" to "sustainable development". Sustainable development brings to the fore the long-term relationship between ecology and economic development. Since its creation in 1972, UNEP has worked for development without destruction, that is, for environmentally sound and sustainable development. It was realised that the environment cannot be viewed in a vacuum: it is neither separate from development nor in competition with it. UNEP advocated integrating the environment with development, so that ecological factors enter development planning, socio-economic policies, cost-benefit analysis, trade, technology transfer, waste management, education, and other specific areas.
Industrialisation has contributed to the economic growth of several countries, improved the living standards of their people, and provided benefits to society. In the process it has also created great environmental problems such as climate change, forest destruction and denudation, soil erosion, and desertification. On the other hand, while industry has provided jobs and improved the prospects of wealth for industrialists, working-class communities have simply had to put up with high levels of pollution in order to keep their jobs and secure their incomes.
The environmental problem has many roots in the political, economic, cultural, and technological conditions of modern society, but experts concede that industrial growth lies somewhere close to the heart of the matter. The objective of this paper is therefore not to document all the roots of the environmental crisis but to discuss the effects of industrial growth and development. We conclude that although public intervention is often unnecessary to ensure that perfectly competitive markets function in society's best interests, such intervention is necessary when firms or consumers pollute.
Abstract: Linear control systems with multiple time delays, described by differential-difference equations, are often studied in modern control theory. In this paper, algebraic criteria for delay-independent stabilization and a theorem on delay-independent stabilization of linear systems with multiple time delays are established using a Lyapunov functional and the algebraic Riccati matrix equation. An illustrative example and simulation results show that the approach is effective for linear systems with multiple time delays.
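For systems of the form $\dot x(t) = A x(t) + \sum_{i=1}^{m} A_i x(t-\tau_i)$, delay-independent criteria of this kind are typically derived from a Lyapunov-Krasovskii functional of the standard form (a generic construction, not necessarily the paper's exact one):

$$V(x_t) = x^T(t) P x(t) + \sum_{i=1}^{m} \int_{t-\tau_i}^{t} x^T(s)\, Q_i\, x(s)\, ds,$$

with $P = P^T > 0$ and $Q_i = Q_i^T > 0$. Requiring $\dot V < 0$ along trajectories for all delays $\tau_i \ge 0$ yields algebraic conditions on $P$ and the $Q_i$, which can be organised as a Riccati-type matrix equation.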
Abstract: The spatial variation in plant species associated with intercropping is intended to reduce resource competition between species and increase yield potential. A field experiment on corn (Zea mays L.) and soybean (Glycine max L.) intercropping was carried out in Karaj, Iran during the 2007 growing season as a replacement-series experiment with four weed treatments: weed free, infestation of redroot pigweed, infestation of jimsonweed, and simultaneous infestation of redroot pigweed and jimsonweed. The experimental design was a randomized complete block in a factorial arrangement with three replications. Significant (P≤0.05) yield differences were observed under intercropping. Corn yield was higher in intercropping, but soybean yield was significantly reduced by corn when intercropped. However, total productivity and land-use efficiency were high under the intercropping system even with contamination by either weed species. The aggressivity of corn relative to soybean revealed the greater competitive ability of corn. A land equivalent ratio (LER) greater than 1 in all treatments indicated an intercropping advantage, and LER was highest in the 50:50 (corn/soybean) weed-free treatment. These findings suggest that intercropping corn and soybean increases total productivity per unit area and improves land-use efficiency. Considering the experimental findings, corn-soybean intercropping (50:50) may be recommended for its yield advantage, more efficient utilization of resources, and weed suppression as a biological control.
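The land equivalent ratio used above is the standard agronomic index: the sum, over crops, of each crop's intercrop yield divided by its monoculture yield, with LER > 1 indicating a yield advantage of intercropping. A minimal sketch (the yields below are hypothetical, not the paper's data):

```python
def land_equivalent_ratio(inter_yields, mono_yields):
    """LER = sum over crops of (intercrop yield / monoculture yield).
    LER > 1 indicates a yield advantage of intercropping."""
    return sum(yi / ym for yi, ym in zip(inter_yields, mono_yields))

# hypothetical corn and soybean yields in t/ha: (intercrop, monoculture)
ler = land_equivalent_ratio([5.5, 1.2], [8.0, 2.5])  # -> 1.1675, i.e. LER > 1
```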
Abstract: Most decision support systems (DSS) constructed for waste management (WM) are not widely marketed and lack practical application. This is due to the number of variables and the complexity of the mathematical models, including the assumptions and constraints required in decision making. The approach taken by many researchers in DSS modelling is to isolate a few key factors that have a significant influence on the DSS. This segmented approach does not provide a thorough understanding of the complex relationships among the many elements involved. The various elements in constructing the DSS must be integrated and optimized in order to produce a viable model that is marketable and has practical application. A DSS model used to assist decision makers should be integrated with GIS, able to give robust predictions despite the inherent uncertainties of waste generation and the plethora of waste characteristics, and able to give an optimal allocation of the waste stream among recycling, incineration, landfill, and composting.
Abstract: Mathematical programming has been applied to a wide variety of problems. For many practical problems, however, the assumption that the parameters involved are deterministically known is often unjustified. In such cases the data contain uncertainty and are represented as random variables, since they describe information about the future. Decision-making under uncertainty involves potential risk, and stochastic programming is a commonly used method for optimization under uncertainty. A stochastic programming problem with recourse is referred to as a two-stage stochastic problem. In this study, we consider a stochastic programming problem with simple integer recourse, in which the recourse variable is restricted to nonnegative integer values. An algorithm based on a dynamic slope scaling procedure is developed for solving this problem, exploiting a property of the expected recourse function. Numerical experiments demonstrate that the proposed algorithm is quite efficient. The stochastic programming model defined in this paper is useful for a variety of design and operational problems.
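The structure of simple integer recourse can be illustrated with a toy two-stage problem: any shortfall between random demand and the first-stage decision must be covered in integer units, so the second-stage cost involves a ceiling. The sketch below (costs, the uniform demand, and the crude search are all illustrative, not the paper's model or its dynamic slope scaling algorithm) evaluates the expected recourse by Monte Carlo with common random numbers:

```python
import math
import random

def expected_recourse(x, q=3.0, samples=10000, seed=1):
    """Monte Carlo estimate of E[q * ceil((xi - x)+)] for xi ~ U(0, 10)."""
    rng = random.Random(seed)  # fixed seed: common random numbers across x
    total = 0.0
    for _ in range(samples):
        xi = rng.uniform(0.0, 10.0)               # random demand
        total += q * math.ceil(max(0.0, xi - x))  # integer-valued recourse
    return total / samples

def first_stage_objective(x, c=1.0):
    return c * x + expected_recourse(x)

# crude enumeration of integer first-stage decisions (a stand-in for
# the paper's algorithm, which exploits the expected recourse function)
best_x = min(range(0, 11), key=first_stage_objective)
```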
Abstract: Identifying the nature of protein-nanoparticle interactions and favored binding sites is an important issue in the functional characterization of biomolecules and their physiological responses. Herein, the interaction of silver nanoparticles with lysozyme as a model protein has been monitored via fluorescence spectroscopy. Formation of a complex between the biomolecule and silver nanoparticles (AgNPs) induced a steady-state reduction in the fluorescence intensity of the protein at different nanoparticle concentrations. Tryptophan fluorescence quenching spectra suggested that silver nanoparticles act as a foreign quencher, approaching the protein via this residue. Analysis of the Stern-Volmer plot yielded a quenching constant of 3.73 μM⁻¹. Moreover, a single binding site in lysozyme is suggested to play a role in the interaction with AgNPs, with a lower binding affinity than that of gold nanoparticles. Unfolding studies showed that the lysozyme-AgNP complex does not undergo structural perturbation relative to the bare protein. The results of this effort will pave the way for the utilization of sensitive spectroscopic techniques in the rational design of nanobiomaterials for biomedical applications.
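The quenching constant comes from the Stern-Volmer relation F0/F = 1 + K_SV[Q], so K_SV is the slope of F0/F − 1 against quencher concentration. The sketch below generates synthetic intensities from the reported K_SV = 3.73 μM⁻¹ (the concentrations and intensities are fabricated for illustration, not measured data) and recovers the constant by a linear fit:

```python
import numpy as np

K_SV = 3.73                              # quenching constant, uM^-1 (reported value)
Q = np.array([0.0, 0.2, 0.4, 0.6, 0.8])  # quencher (AgNP) concentration, uM
F0 = 100.0                               # unquenched fluorescence intensity
F = F0 / (1 + K_SV * Q)                  # Stern-Volmer: intensity drops with [Q]

# recover K_SV as the slope of (F0/F - 1) versus [Q]
slope = np.polyfit(Q, F0 / F - 1, 1)[0]  # -> 3.73
```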
Abstract: In a nuclear reactor, a loss-of-coolant accident (LOCA) covers a wide range of postulated damage or rupture of pipes in the heat transport piping system. In the case of a LOCA with or without failure of the emergency core cooling system in a pressurised heavy water reactor, the pressure tube (PT) temperature can rise significantly due to fuel heat-up and a gross mismatch between heat generation and heat removal in the affected channel. The extent and nature of the deformation is important from a reactor safety point of view. Experimental set-ups have been designed and fabricated to simulate ballooning (radial deformation) of the PT for 220 MWe IPHWRs. Experiments were conducted on voided PTs, first with the calandria tube (CT) covered by ceramic fibres and then with the CT submerged in water. In both experiments it is observed that ballooning initiates at a temperature of around 665 °C and that complete contact between the PT and CT occurs at approximately 700 °C. The strain rate is found to be 0.116% per second. The structural integrity of the PT is retained (no breach) in all the experiments. The PT heat-up is found to be arrested after contact between the PT and CT, establishing the moderator as an efficient heat sink for IPHWRs.
Abstract: While many studies have examined the achievement gap between groups of students in school districts, few have used resilience research to investigate achievement gaps within classrooms. This paper summarizes and discusses some recent studies by Waxman, Padrón, and their colleagues, in which they examined learning environment differences between resilient and nonresilient students in reading and mathematics classrooms. The classes consisted predominantly of Hispanic elementary school students from low-income families. These studies all incorporated learning environment questionnaires and systematic observation methods. Significant differences were found between resilient and nonresilient students in their classroom learning environments and classroom behaviors. The observation results indicate that the amount and quality of teacher-student academic interaction are two of the most influential variables promoting student outcomes. The paper concludes by suggesting the following teacher practices to promote resiliency in schools: (a) using feedback from classroom observation and learning environment measures; (b) employing explicit teaching practices; and (c) understanding students on a social and personal level.
Abstract: In this cyber age, the job market is rapidly being transformed and digitalized. Submitting a paper-based curriculum vitae (CV) no longer grants a job seeker a high employability rate. This paper calls attention to the creation of the mobile curriculum vitae, or m-CV (http://mcurriculumvitae.blogspot.com), a sample individual CV developed using a weblog, which can enhance the marketability of job hunters, especially fresh graduates. The study is designed to identify the perceptions held by Malaysian university students regarding the m-CV, grounded in a modified Technology Acceptance Model (TAM). It measures the strength and direction of the relationships among three major variables: Perceived Ease of Use (PEOU), Perceived Usefulness (PU), and Behavioral Intention (BI) to use. The findings show that university students generally accepted the m-CV, perceiving it to be more useful than easy to use. Additionally, this study confirmed TAM to be a useful theoretical model for understanding and explaining students' behavioral intention to use a Web 2.0 application, the weblog, for publishing their CVs. The results underline another significant positive value of using weblogs to create personal CVs. Further research on the m-CV is highlighted in the paper.
Abstract: Many existing studies use Markov decision processes (MDPs) to model optimal route choice in stochastic, time-varying networks. However, transforming large volumes of variable traffic data into optimal route decisions is computationally challenging when MDPs are applied to real transportation networks. In this paper we model finite-horizon MDPs using directed hypergraphs. It is shown that the route-choice problem in stochastic, time-varying networks can be formulated as a minimum-cost hyperpath problem and solved in linear time. We conclude by demonstrating the significant computational advantages of the introduced methods.
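The core computation in finite-horizon route choice is a backward pass: the expected cost-to-go of a state is the minimum, over actions, of the expected edge cost plus the downstream cost. The sketch below runs this recursion on a tiny made-up network (the states, costs, and probabilities are illustrative; the paper's hypergraph formulation organises exactly this kind of recursion so it runs in linear time):

```python
# transitions[state][action] = list of (probability, next_state, cost)
transitions = {
    "A": {"toB": [(1.0, "B", 2.0)],
          "toC": [(0.5, "C", 1.0), (0.5, "B", 4.0)]},  # stochastic outcome
    "B": {"toD": [(1.0, "D", 2.0)]},
    "C": {"toD": [(1.0, "D", 5.0)]},
    "D": {},  # destination
}

def expected_costs(transitions, dest):
    """Backward induction: V[s] = min over actions of E[cost + V[next]].
    The tiny network is acyclic, so one pass in reverse order suffices."""
    V = {dest: 0.0}
    for s in ["C", "B", "A"]:  # reverse topological order (hand-ordered here)
        V[s] = min(sum(p * (c + V[ns]) for p, ns, c in outcomes)
                   for outcomes in transitions[s].values())
    return V

V = expected_costs(transitions, "D")  # V["A"] = 4.0 (go via B)
```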
Abstract: Gene Ontology is now used widely by many researchers for biological data mining and information retrieval, for the integration of biological databases, for finding genes, and for incorporating its knowledge into gene clustering. However, the increase in the size of the Gene Ontology has caused problems in maintaining and processing it. One way to keep it accessible is to cluster it into fragmented groups. Clustering the Gene Ontology is a difficult combinatorial problem that can be modeled as a graph partitioning problem. Additionally, the number k of clusters to use is not easily determined and is itself a hard algorithmic problem. Therefore, an approach to the automatic clustering of the Gene Ontology is proposed that incorporates a cohesion-and-coupling metric into a hybrid algorithm consisting of a genetic algorithm and a split-and-merge algorithm. Experimental results and an example of a modularized Gene Ontology in RDF/XML format are given to illustrate the effectiveness of the algorithm.
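The cohesion-and-coupling idea for graph partitioning can be made concrete in a few lines: cohesion counts edges inside clusters, coupling counts edges crossing cluster boundaries, and a good partition has high cohesion and low coupling. The tiny graph and partition below are illustrative only, not the paper's metric definition:

```python
# a small undirected graph and a candidate 2-cluster partition
edges = [(1, 2), (2, 3), (1, 3), (4, 5), (3, 4)]
partition = {1: "A", 2: "A", 3: "A", 4: "B", 5: "B"}

def cohesion_coupling(edges, partition):
    """Count intra-cluster (cohesion) and inter-cluster (coupling) edges."""
    cohesion = sum(1 for u, v in edges if partition[u] == partition[v])
    coupling = sum(1 for u, v in edges if partition[u] != partition[v])
    return cohesion, coupling

cohesion, coupling = cohesion_coupling(edges, partition)  # (4, 1)
```

A genetic algorithm would evolve candidate partitions scored by such a metric, with split-and-merge moves adjusting the number of clusters.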
Abstract: General requirements for knowledge representation in the form of logic rules, applicable to the design and control of industrial processes, are formulated. The characteristic behavior of decision trees (DTs) and rough set theory (RST) in extracting rules from recorded data is discussed and illustrated with simple examples. The significance of the models' drawbacks was evaluated using simulated and industrial data sets. It is concluded that the performance of DTs may be considerably poorer than that of RST in several important respects, particularly when not only a characterization of a problem is required but also detailed and precise rules tailored to the actual, specific problems to be solved.
Abstract: In networks, it is mainly small and medium-sized businesses that benefit from the knowledge, experience, and solutions offered by experts from industry and science, or from exchanges with practitioners. Associations that focus, among other things, on networking, information and knowledge transfer, and that are interested in supporting such cooperation are especially well suited to provide such networks and the appropriate web platforms. Using METORA as an example, a project developed and run by the Federal Association for Information Economy, Telecommunications and New Media e.V. (BITKOM) for the Federal Ministry of Economics and Technology (BMWi), this paper discusses how associations and other network organizations can achieve this task and what conditions they have to consider.
Abstract: Long-term rainfall analysis and prediction is a challenging task, especially in the modern world where the impact of global warming is complicating environmental issues. These data-intensive problems require high-performance computational modeling for accurate prediction. This paper describes a prototype designed and developed in a grid environment using a number of coupled software infrastructure building blocks. This grid-enabled system provides the demanded computational power, efficiency, resources, a user-friendly interface, secure job submission, and high throughput. The results obtained from sequential and grid-enabled execution show that computational performance improved by between 36% and 75% for a decade of climate parameters. The large variation in performance can be attributed to the varying degree of computational resources available for job execution.

Grid computing enables the dynamic runtime selection, sharing, and aggregation of distributed and autonomous resources, which plays an important role not only in business but also in scientific and social settings. This paper attempts to explore grid-enabled computing capabilities on weather indices from HOAPS data for climate impact modeling and change detection.
Abstract: An end-member selection method for spectral unmixing based on Particle Swarm Optimization (PSO) is developed in this paper. The algorithm uses the K-means clustering algorithm and a method of dynamically selecting subsets of end-members to find the appropriate set of end-members for a given set of multispectral images. The proposed algorithm has been successfully applied to test image sets from platforms such as LANDSAT 5 MSS and NOAA's AVHRR, and the experimental results are encouraging. The influence of different values of the algorithm's control parameters on performance is studied, and the performance of different versions of PSO is also investigated.
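For readers unfamiliar with PSO, the sketch below shows the canonical velocity/position update on a one-dimensional objective. This is generic PSO, not the paper's end-member selection variant, and the inertia and acceleration coefficients are common textbook values, not the paper's control-parameter settings:

```python
import random

def pso(objective, n_particles=20, iters=100, lo=-5.0, hi=5.0, seed=0):
    """Minimise a 1-D objective with a basic particle swarm."""
    rng = random.Random(seed)
    pos = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest = pos[:]                    # each particle's personal best position
    gbest = min(pos, key=objective)   # swarm's global best position
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            # inertia + cognitive (pbest) + social (gbest) terms
            vel[i] = (0.7 * vel[i]
                      + 1.5 * r1 * (pbest[i] - pos[i])
                      + 1.5 * r2 * (gbest - pos[i]))
            pos[i] += vel[i]
            if objective(pos[i]) < objective(pbest[i]):
                pbest[i] = pos[i]
                if objective(pos[i]) < objective(gbest):
                    gbest = pos[i]
    return gbest

best = pso(lambda x: (x - 2.0) ** 2)  # converges near the minimum at x = 2
```

In the end-member setting, a particle would instead encode a candidate subset of end-members and the objective would score unmixing quality.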
Abstract: The objective of this research was to study the factors that affect surface roughness in high-speed milling of hardened tool steel. The material used in the experiment was JIS SKD 61 tool steel hardened to 60 ± 2 HRC. A full factorial experiment was conducted with 3 factors at 3 levels (a 3³ design) and 2 replications. The factors were cutting speed, feed rate, and depth of cut. The results showed that cutting speed, feed rate, and depth of cut all had statistically significant effects on surface roughness. Higher cutting speed resulted in better surface quality, whereas higher feed rate resulted in poorer surface quality. A significant interaction between cutting speed and depth of cut was also found: high cutting speed combined with low depth of cut gave better surface quality than low cutting speed combined with high depth of cut.
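A 3³ full factorial design with 2 replications enumerates every combination of the three factor levels twice, giving 3 × 3 × 3 × 2 = 54 runs. The sketch below generates such a run list; the level values are placeholders, not the paper's actual machining parameters:

```python
from itertools import product

# hypothetical factor levels (not the paper's settings)
cutting_speed = [150, 200, 250]   # m/min
feed_rate = [0.05, 0.10, 0.15]    # mm/tooth
depth_of_cut = [0.2, 0.4, 0.6]    # mm

# every factor-level combination, each run twice (2 replications)
runs = [combo
        for combo in product(cutting_speed, feed_rate, depth_of_cut)
        for _ in range(2)]
# len(runs) == 54 experimental runs over 27 distinct treatments
```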
Abstract: Data envelopment analysis (DEA) has gained great popularity in environmental performance measurement because it can provide a synthetic, standardized environmental performance index when pollutants are suitably incorporated into the traditional DEA framework. Since some environmental performance indicators cannot be controlled by company managers, the model must be developed so that it can be applied when discretionary and/or non-discretionary factors are involved. In this paper, we present a semi-radial DEA approach to measuring environmental performance that accommodates non-discretionary factors. The model has then been applied to a real case.
Abstract: In the automotive industry, test drives are conducted during the development of new vehicle models or as part of the quality assurance of series-production vehicles. The communication on the in-vehicle network, data from external sensors, and internal data from the electronic control units are recorded by automotive data loggers during the test drives. The recordings are used for fault analysis. Since the resulting data volume is tremendous, manually analysing each recording in great detail is not feasible.

This paper proposes using machine learning to support domain experts by sparing them from contemplating irrelevant data and instead pointing them to the relevant parts of the recordings. The underlying idea is to learn normal behaviour from available recordings, i.e. a training set, and then to autonomously detect unexpected deviations and report them as anomalies.

The one-class support vector machine "support vector data description" is utilised to calculate distances of feature vectors. SVDDSUBSEQ is proposed as a novel approach that allows subsequences in multivariate time-series data to be classified. The approach allows unexpected faults to be detected without modelling effort, as shown by experimental results on recordings from test drives.
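The subsequence idea can be sketched as: slide a window over the signal, describe "normal" by an envelope around the training windows, and flag test windows that fall outside it. The sketch below uses a simple centroid-distance envelope as a stand-in for the one-class SVM / SVDD of the paper, and a synthetic sine signal with an injected fault in place of real test-drive recordings:

```python
import numpy as np

def windows(signal, width):
    """All overlapping subsequences of the given width."""
    return np.array([signal[i:i + width]
                     for i in range(len(signal) - width + 1)])

def fit_normal(train_signal, width):
    """Describe normal behaviour by a centroid and an enclosing radius."""
    W = windows(train_signal, width)
    centre = W.mean(axis=0)
    radius = np.max(np.linalg.norm(W - centre, axis=1))
    return centre, radius

def detect(signal, centre, radius, width):
    """Return start indices of windows outside the normal envelope."""
    W = windows(signal, width)
    dist = np.linalg.norm(W - centre, axis=1)
    return np.flatnonzero(dist > radius)

# normal behaviour: a steady sine; the "test drive" contains a fault spike
t = np.linspace(0, 8 * np.pi, 400)
train = np.sin(t)
test = np.sin(t).copy()
test[200:205] += 5.0                      # injected fault

centre, radius = fit_normal(train, width=20)
anomalies = detect(test, centre, radius, width=20)  # windows touching the spike
```

SVDD replaces the crude centroid-and-radius envelope with a kernelised minimum-enclosing sphere, but the report to the expert is the same: the indices of anomalous subsequences.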