Abstract: Environmental contamination is a common problem at industrial and ex-industrial sites. This article gives a brief description of commonly applied environmental investigation methodologies and possible remediation applications in Latvia. Most contaminated areas are situated in former and active industrial and military areas and in ports. Industrial and logistics activities have often had a great impact for more than a hundred years; thus the level of contamination with heavy metals, hydrocarbons, pesticides, and persistent organic pollutants is high and threatens health and the environment in general. 242 territories are currently listed in the National Register of contaminated territories in Latvia. Research and remediation of contamination in densely populated areas form an important environmental policy domain. Four investigation case studies of contaminated areas are given, describing the history of use, environmental quality assessment, and planned environmental management actions. All four case study locations are situated in Riga, the capital of the Republic of Latvia. The aim of this paper is to analyze the situation and problems in the management of contaminated areas in Latvia, to describe field research methods, and to give recommendations for the remediation industry based on scientific data and innovations.
Abstract: Robust nonlinear integrated navigation of GPS and low-cost MEMS sensors is currently a hot research topic. A robust filter is required to cope with the unpredictable discontinuities and colored noise associated with low-cost sensors. The H∞ filter has previously been used within the Extended Kalman filter and Unscented Kalman filter frameworks. The Unscented Kalman filter, however, requires a Cholesky factorization of the covariance matrix at each step, which is a numerically unstable operation. To avoid this problem, in this research the H∞ filter is designed within a square-root Unscented Kalman filter framework and is found to be 50% more robust to increased levels of colored noise.
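The square-root idea can be illustrated with a minimal sketch of the unscented transform (not the authors' filter): instead of re-factorizing the covariance P at every step, a Cholesky-type factor S (with P = S Sᵀ) is propagated directly, and the updated factor is obtained from a QR decomposition of the weighted sigma-point deviations. The weights below use κ = 0, so the center-point term vanishes and no rank-1 Cholesky update is needed; all names and parameter choices are illustrative assumptions.

```python
import numpy as np

def sr_unscented_transform(f, x, S):
    """Propagate mean x and square-root factor S (P = S S^T) through a
    nonlinear function f, returning the new mean and factor without ever
    re-factorizing a full covariance matrix.  Weights use kappa = 0, so
    the center sigma point has zero weight and drops out."""
    n = len(x)
    c = np.sqrt(n)                     # sqrt(n + kappa) with kappa = 0
    # sigma points built directly from the existing factor S
    cols = [x + c * S[:, i] for i in range(n)] + \
           [x - c * S[:, i] for i in range(n)]
    Y = np.column_stack([f(p) for p in cols])
    w = 1.0 / (2 * n)                  # equal weights for the 2n points
    mean = w * Y.sum(axis=1)
    # new square-root factor from a QR decomposition of the deviations
    dev = np.sqrt(w) * (Y - mean[:, None])
    _, R = np.linalg.qr(dev.T)
    return mean, R.T
```

For a linear map f(x) = A x this reproduces the exact propagation A P Aᵀ, which makes it easy to sanity-check; a full SR-UKF additionally needs a rank-1 Cholesky update when the center-point weight is non-zero.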
Abstract: Evolvable hardware (EHW) refers to a self-reconfiguring hardware design in which the configuration is under the control of an evolutionary algorithm (EA). A lot of research has been done in this area, and several different EAs have been introduced. Every time a specific EA is chosen for solving a particular problem, all of its components, such as population size, initialization, selection mechanism, mutation rate, and genetic operators, should be selected so as to achieve the best results. In the last three decades much research has been carried out to identify the best parameters of the EA components for different “test problems". However, different researchers propose different solutions. In this paper the behaviour of the mutation rate in a (1+λ) evolution strategy (ES) for designing logic circuits, which has not been analyzed before, is studied in depth. The mutation rate in an EHW system modifies the values of the logic cell inputs, the cell type (for example, from AND to NOR), and the circuit output. The behaviour of the mutation has been analyzed with respect to the number of generations, genotype redundancy, and the number of logic gates used in the evolved circuits. The experimental results indicate the mutation rate to be used during evolution for the design and optimization of logic circuits. Research on the best mutation rate over the last 40 years is also summarized.
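The (1+λ) ES loop described above can be sketched in a few lines; this is an illustrative toy, not the authors' system (the gate set, genome encoding, XOR target, and all parameter values are assumptions). The mutation operator touches exactly the three gene kinds named in the abstract: cell inputs, cell type, and the circuit output.

```python
import random

# a small feed-forward gate array; all choices here are illustrative
GATES = {
    "AND":  lambda a, b: a & b,
    "OR":   lambda a, b: a | b,
    "NAND": lambda a, b: 1 - (a & b),
    "NOR":  lambda a, b: 1 - (a | b),
}
GATE_NAMES = list(GATES)
N_IN, N_GATES = 2, 6          # two circuit inputs, six logic cells

def random_genome():
    # gene i may read any earlier signal (circuit inputs or previous gates)
    genes = [(random.choice(GATE_NAMES),
              random.randrange(N_IN + i),
              random.randrange(N_IN + i)) for i in range(N_GATES)]
    out = random.randrange(N_IN + N_GATES)   # which signal is the output
    return genes, out

def evaluate(genome, inputs):
    genes, out = genome
    vals = list(inputs)
    for f, a, b in genes:
        vals.append(GATES[f](vals[a], vals[b]))
    return vals[out]

def fitness(genome):
    # number of matching rows of the XOR truth table (maximum 4)
    return sum(evaluate(genome, (a, b)) == (a ^ b)
               for a in (0, 1) for b in (0, 1))

def mutate(genome, rate):
    # per-gene mutation of cell inputs, cell type, and the output gene
    genes, out = genome
    new = []
    for i, (f, a, b) in enumerate(genes):
        if random.random() < rate: f = random.choice(GATE_NAMES)
        if random.random() < rate: a = random.randrange(N_IN + i)
        if random.random() < rate: b = random.randrange(N_IN + i)
        new.append((f, a, b))
    if random.random() < rate:
        out = random.randrange(N_IN + N_GATES)
    return new, out

def one_plus_lambda(lam=4, rate=0.1, max_gen=5000):
    parent = random_genome()
    for gen in range(max_gen):
        if fitness(parent) == 4:
            return parent, gen
        best = max((mutate(parent, rate) for _ in range(lam)), key=fitness)
        if fitness(best) >= fitness(parent):   # neutral drift allowed
            parent = best
    return parent, max_gen
```

Sweeping `rate` in such a loop and recording generations-to-solution is the kind of experiment the abstract's analysis performs at scale.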
Abstract: The objective of the paper is twofold. First, to develop a
formal framework for planning for mobile agents. A logical language
based on temporal logic is proposed that can express a type of task that often arises in network management. Second, to design a
planning algorithm for such tasks. The aim of this paper is to study
the importance of finding plans for mobile agents. Although there
has been a lot of research in mobile agents, not much work has been
done to incorporate planning ideas for such agents. This paper makes
an attempt in this direction. A theoretical study of finding plans for
mobile agents is undertaken. A planning algorithm (based on the
paradigm of mobile computing) is proposed and its space, time, and
communication complexity is analyzed. The algorithm is illustrated
by working out an example in detail.
Abstract: In this paper, we have applied the homotopy perturbation
method (HPM) for obtaining the analytical solution of unsteady
flow of gas through a porous medium and we have also compared the
findings of this research with other analytical results. The results showed very good agreement between the HPM solution and the numerical solution of the problem, closer than that of other analytical methods applied previously. The homotopy perturbation method yields results of high accuracy, and the method itself is effective and succinct.
Abstract: This paper describes a platform that addresses the main research areas for e-learning educational content. Reusability concerns the possibility of using content in different courses, reducing costs and exploiting data available from repositories. In our approach the production of educational material is based on templates for reusing learning objects. In terms of interoperability, the main challenge lies in reaching the audience through different platforms. E-learning solutions must track the evolution of social consumption, since nowadays much multimedia content is accessed through social networks. Our work addresses this by implementing a platform for the generation of multimedia presentations focused on the new social media paradigm. The system produces video courses on top of the web standard SMIL (Synchronized Multimedia Integration Language), ready to be published and shared. Regarding interfaces, it is mandatory to satisfy user needs and ease communication. To this end, the platform deploys virtual teachers that provide natural interfaces, while multimodal features remove barriers for pupils with disabilities.
Abstract: Subdivision surfaces are applied to entire meshes in order to produce a smooth surface refinement of a coarse mesh. Several schemes have been introduced in this area, each providing a set of rules for converging to a smooth surface. However, computing and rendering all the vertices is costly in terms of memory consumption and runtime during the subdivision process, and it leads to a heavy computational load, especially at higher levels of subdivision. Adaptive subdivision is a method that subdivides only certain areas of the mesh while the rest retains fewer polygons. Although adaptive subdivision acts only on the selected areas, the quality of the produced surfaces, i.e. their smoothness, can be preserved much as with regular subdivision. Nevertheless, the adaptive subdivision process is burdened by two costs: the calculations needed to define the areas that require subdivision, and the removal of the cracks created by the difference in subdivision depth between selected and unselected areas. Unfortunately, at higher levels of subdivision, adaptive subdivision still suffers from high memory consumption. This research introduces an iterative adaptive subdivision process, applied to triangular meshes, that improves on the previous adaptive method by reducing memory consumption. The resulting iterative process performed noticeably better in memory usage and appearance, producing fewer polygons while preserving smooth surfaces.
Abstract: Cavitation, usually known as a destructive
phenomenon, involves turbulent unsteady two-phase flow. Having
such features, cavitating flows have become a challenging topic in numerical studies, and much research is being done to better understand bubbly flows and to propose solutions that reduce their destructive effects. Aeration may be regarded as an effective protection against cavitation erosion in many hydraulic structures, such as gated tunnels. This paper concerns numerical simulation of flow in the discharge gated tunnel of a dam using the RNG k-ε model coupled with the volume of fluid (VOF) method; the zone of the tunnel susceptible to cavitation inception is predicted. In the second step, a vent is placed in this zone for aeration, and the numerical simulation is repeated to study the effects of aeration. The results show that aeration is an effective method for eliminating cavitation in such tunnels.
Abstract: The objective of this study was to evaluate the optimal treatment conditions of the Fenton oxidation process for removing contaminants from soil slurry contaminated by petroleum hydrocarbons. This research studied several factors that affect the removal efficiency of petroleum hydrocarbons in soil slurry, including the molar ratio of hydrogen peroxide (H2O2) to ferrous ion (Fe2+), the pH, and the reaction time. The results demonstrated that the optimum conditions were an H2O2:Fe2+ molar ratio of 200:1 and a pH of 4.0; the reaction rate increased rapidly from the starting point to the 7th hour, and the destruction kinetic rate (k) was 0.24 h-1. Approximately 96% removal of petroleum hydrocarbon was observed (initial total petroleum hydrocarbon (TPH) concentration = 70±7 g kg-1).
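Treating the reported destruction rate as a first-order constant, the abstract's numbers can be sanity-checked in a few lines (the first-order model and the constant-k extrapolation are assumptions for illustration; the study reports fast kinetics only up to the 7th hour):

```python
import math

def remaining_fraction(k, t):
    """First-order decay: C(t) / C0 = exp(-k * t)."""
    return math.exp(-k * t)

def time_for_removal(k, removal):
    """Time to reach a given removal fraction under first-order kinetics."""
    return -math.log(1.0 - removal) / k

k = 0.24   # destruction kinetic rate from the abstract, 1/h
C0 = 70.0  # initial TPH, g/kg (reported as 70 +/- 7)

frac_7h = remaining_fraction(k, 7.0)   # fraction remaining after 7 h
t_96 = time_for_removal(k, 0.96)       # hours to 96% removal at constant k
print(f"TPH after 7 h: {C0 * frac_7h:.1f} g/kg "
      f"({1 - frac_7h:.0%} removed)")
print(f"Time to 96% removal at constant k: {t_96:.1f} h")
```

At constant k, 96% removal would correspond to roughly 13 hours, longer than the 7-hour fast phase, which suggests the overall removal reported in the abstract also includes a slower later stage.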
Abstract: Narratives are invaluable assets of human lives. Due to
the distinct features of narratives, they are useful for supporting human
reasoning processes. However, many useful narratives become
residuals in organizations or human minds nowadays. Researchers
have contributed effort to investigate and improve narrative generation
processes. This paper attempts to contemplate essential components in
narratives and explore a computational approach to acquire and extract
knowledge to generate narratives. The methodology and its significant benefits for decision support are presented.
Abstract: Ovshinsky initiated scientific research in the field of amorphous and disordered materials that continues to this day. The Ovshinsky Effect, in which the resistance of thin GST films is significantly reduced upon the application of a low voltage, is of fundamental importance in phase-change random access memory (PC-RAM) devices. GST stands for GeSbTe chalcogenide-type glasses. However, the Ovshinsky Effect is not without controversy. Ovshinsky thought the resistance of GST films is reduced by the redistribution of charge carriers, whereas others at that time, including many PC-RAM researchers today, argue that the GST resistance changes because the amorphous state of GST is transformed into the crystalline state by melting, with the heat supplied by external heaters. In this controversy, quantum mechanics (QM) asserts that the heat capacity of GST films vanishes, and therefore melting cannot occur, as the supplied heat cannot be conserved by an increase in GST film temperature. By precluding melting, QM re-opens the controversy between the melting and charge-carrier mechanisms. Supporting analysis is presented to show that, instead of increasing the GST film temperature, conservation proceeds by the QED-induced creation of photons within the GST film, the QED photons being confined by TIR. QED stands for quantum electrodynamics and TIR for total internal reflection. The TIR confinement of QED photons is enhanced by the fact that the heat energy absorbed in the GST film is concentrated in the TIR mode because of the film's high surface-to-volume ratio. The QED photons, having Planck energies beyond the ultraviolet, produce excitons by the photoelectric effect, the electrons and holes of which reduce the GST film resistance.
Abstract: Nurses in an Armed Force Hospital (AFH) are exposed to stronger stress than those in a civil hospital, especially in an emergency department (ED). Ironically, the stresses of these nurses have received little if any attention in past academic research. This study collected 227 samples from the emergency departments of four armed force hospitals in central and southern Taiwan. The research indicates that the top five stressors are a mass-casualty event, delayed physician support, overload of routine work, overload of assignments, and annoying paperwork. Excessive workload was found to be the primary source of stress. Nurses who perceived greater stress levels were more inclined to deploy emotion-oriented approaches and more likely to seek job rotation. Professional stressors and problem-oriented approaches were positively correlated. Unlike other local studies, this study concludes that excessive workload is more stressful in an AFH.
Abstract: In an open real-time system environment, the coexistence of different kinds of real-time and non-real-time applications confronts the system scheduling mechanism with new requirements and challenges. A two-level scheduling scheme for open real-time systems is introduced, and it is pointed out that when hard and soft real-time applications are scheduled indistinguishably as the same type of real-time application, Quality of Service (QoS) cannot be guaranteed. Such a scheme has two flaws. First, it cannot differentiate the scheduling priorities of hard and soft real-time applications; that is, it neglects the characteristic differences between hard and soft real-time applications, so it does not suit a more complex real-time environment. Second, the worst-case execution time of soft real-time applications cannot be predicted exactly, so it is not worthwhile to spend a large overhead ensuring that no soft real-time application misses its deadline, and doing so may waste resources. To solve this problem, a novel two-level real-time scheduling mechanism (including a scheduling profile and a scheduling algorithm) that adds a process for handling soft real-time applications is proposed. Finally, we verify the real-time scheduling mechanism both theoretically and experimentally. The results indicate that our scheduling mechanism achieves the following objectives. (1) It reflects the difference in priority when scheduling hard and soft real-time applications. (2) It ensures the schedulability of hard real-time applications; that is, their deadline miss rate is 0. (3) The overall deadline miss rate of soft real-time applications can be kept below 1. (4) The deadline of a non-real-time application is not set, but the scheduling algorithm used by the server S0 can avoid the “starvation" of jobs and increase QoS.
By doing this, our scheduling mechanism is more compatible with different types of applications and can be applied more widely.
Abstract: In designing river intakes and diversion structures, it is paramount that the sediments entering the intake are minimized or, if possible, completely separated. Due to high water velocity, sediments can significantly damage hydraulic structures, especially when mechanical equipment such as pumps and turbines is used. This subsequently results in wasted water, wasted electricity, and further costs. It is therefore prudent to investigate and analyze the performance of lateral intakes affected by sediment control structures. Laboratory experiments, despite their vast potential and benefits, face certain limitations and challenges, among them limitations in equipment and facilities, space constraints, equipment errors (including lack of adequate precision or mal-operation), and, finally, human error. Research has shown that, to achieve the ultimate goal of intake structure design, namely long-lasting and proficient structures, the best combination of sediment control structures (such as a sill and submerged vanes), together with the parameters that increase their performance (such as diversion angle and location), should be determined. Cost, difficulty of execution, and environmental impacts should also be included in evaluating the optimal design. This solution can then be applied to similar problems in the future. Consequently, the model used to arrive at the optimal design requires a high level of accuracy and precision in order to avoid improper design and execution of projects. The process of creating and executing the design should be as comprehensive and applicable as possible. It is therefore important that the influential parameters and vital criteria are fully understood and applied at all stages of choosing the optimal design. In this article, the parameters influencing optimal performance of the intake, the advantages and disadvantages, and the efficiency of a given design are studied.
Then, a multi-criterion decision matrix is utilized to choose the optimal model that can be used to determine the proper parameters in constructing the intake.
Abstract: This paper proposes a vibration-analysis method for on-line monitoring and predictive maintenance during the milling process. Adapting the envelope method to the diagnostics and analysis of milling tool materials is an important contribution to the qualitative and quantitative characterization of milling capacity and a step toward modeling the three-dimensional cutting process. An experimental protocol was designed and developed for the acquisition, processing, and analysis of the three-dimensional signal. Vibration envelope analysis is proposed to detect the cutting capacity of the tool, with application to the optimization of cutting parameters. The research focuses on Hilbert transform optimization to evaluate the dynamic behavior of the machine/tool/workpiece system.
Abstract: Today, social marketing has established itself as a tool of significant value for promoting changes in behaviors, attitudes, and practices. With the objective of analyzing the benefits that social marketing can bring to the organizations that use it, this research was exploratory and descriptive. The comparative method was used, through a qualitative approach, to analyze the activities developed by three institutions: the Rosa de Saron Recovery Center, the Teen Challenge house of recovery for addicts, and the Children's Cancer Institute of the Agreste (ICIA), in order to point out the benefits of social marketing in organizations that do not seek profit.
Abstract: This research investigates risk factors for defective products in auto-parts factories. Under a Bayesian framework, a generalized linear mixed model (GLMM) in which the dependent variable, the number of defective products, has a Poisson distribution is adopted. Its performance is compared with that of a Poisson GLM under a Bayesian framework. The factors considered are production process, machine, and worker. Products coded RT50 are observed. The study found that the Poisson GLMM is more appropriate than the Poisson GLM. For the production process factor, the highest risk of producing defective products lies with Process 1; for the machine factor, with Machine 5; and for the worker factor, with Worker 6.
Abstract: Water pollution assessment problems arise frequently
in environmental science. In this research, a finite difference method
for solving the one-dimensional steady convection-diffusion equation
with variable coefficients is proposed; it is then used to optimize
water treatment costs.
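As an illustration of the kind of scheme the abstract refers to, a central-difference discretization of the 1-D steady convection-diffusion equation v(x) c'(x) = (D(x) c'(x))' with Dirichlet boundary conditions can be sketched as follows (the grid size, coefficient functions, and boundary values below are illustrative assumptions, not the paper's actual setup):

```python
import numpy as np

def convection_diffusion_1d(D, v, n, c_left, c_right):
    """Solve v(x) c'(x) = (D(x) c'(x))' on [0, 1] with Dirichlet
    boundary conditions, using central differences on a uniform grid
    with n interior points; D and v are callables (variable coefficients)."""
    h = 1.0 / (n + 1)
    x = np.linspace(0.0, 1.0, n + 2)
    A = np.zeros((n, n))
    b = np.zeros(n)
    for i in range(1, n + 1):               # interior grid nodes
        Dm = D(x[i] - h / 2)                # diffusivity at the left face
        Dp = D(x[i] + h / 2)                # diffusivity at the right face
        vi = v(x[i])
        lower = Dm / h**2 + vi / (2 * h)    # coefficient of c[i-1]
        diag = -(Dm + Dp) / h**2            # coefficient of c[i]
        upper = Dp / h**2 - vi / (2 * h)    # coefficient of c[i+1]
        j = i - 1                           # row index in the interior system
        A[j, j] = diag
        if j > 0:
            A[j, j - 1] = lower
        else:
            b[j] -= lower * c_left          # fold boundary value into rhs
        if j < n - 1:
            A[j, j + 1] = upper
        else:
            b[j] -= upper * c_right
    c = np.linalg.solve(A, b)
    return x, np.concatenate(([c_left], c, [c_right]))
```

For constant coefficients D = v = 1 the exact solution with c(0) = 0 and c(1) = 1 is c(x) = (e^x - 1)/(e - 1), which provides a convenient accuracy check for the scheme.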
Abstract: Today's economy is in permanent change, giving rise to mergers, acquisitions, and cooperations between enterprises. As a consequence, process adaptations and realignments result in systems integration and software development projects. The processes and procedures used to execute such projects still rely on the craftsmanship of highly skilled workers. A generally accepted, industrialized production, characterized by high efficiency and quality, seems inevitable.
In spite of this, current concepts of software industrialization are
aimed at traditional software engineering and do not consider the
characteristics of systems integration. The present work points out
these particularities and discusses the applicability of existing
industrial concepts in the systems integration domain. Consequently, it defines further areas of research necessary to bring the field of
systems integration closer to an industrialized production, allowing a
higher efficiency, quality and return on investment.
Abstract: Global competitiveness has recently become the biggest concern of both manufacturing and service companies. Electronic commerce, as a key technology, enables firms to reach potential consumers all over the world. In this study, we present commonly used electronic payment systems and then evaluate these systems with respect to different criteria. The payment systems included in this research are the credit card, the virtual credit card, electronic money, mobile payment, credit transfer, and debit instruments. We carry out a systematic comparison of these systems with respect to three main criteria: technical, economic, and social. We conduct a fuzzy multi-criteria decision-making procedure to deal with the multi-attribute nature of the problem. The subjectivity and imprecision of the evaluation process are modeled using triangular fuzzy numbers.
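The core of such a procedure, aggregating triangular fuzzy ratings with crisp criterion weights and ranking by centroid defuzzification, can be sketched as follows (the linguistic scale, weights, and alternatives below are invented for illustration; the paper's actual criteria tree and data are not reproduced here):

```python
def tfn_add(a, b):
    """Componentwise sum of two triangular fuzzy numbers (l, m, u)."""
    return tuple(x + y for x, y in zip(a, b))

def tfn_scale(a, w):
    """Scale a triangular fuzzy number by a crisp weight w >= 0."""
    return tuple(w * x for x in a)

def centroid(a):
    """Defuzzify a triangular fuzzy number (l, m, u) by its centroid."""
    l, m, u = a
    return (l + m + u) / 3.0

def fuzzy_score(ratings, weights):
    """Weighted sum of triangular fuzzy ratings (one per criterion)."""
    total = (0.0, 0.0, 0.0)
    for r, w in zip(ratings, weights):
        total = tfn_add(total, tfn_scale(r, w))
    return total

# hypothetical linguistic scale and data, for illustration only
POOR, FAIR, GOOD = (0, 2.5, 5), (2.5, 5, 7.5), (5, 7.5, 10)
weights = [0.5, 0.3, 0.2]            # technical, economic, social
alternatives = {
    "credit card":    [GOOD, FAIR, GOOD],
    "mobile payment": [FAIR, GOOD, GOOD],
}
ranked = sorted(alternatives,
                key=lambda k: centroid(fuzzy_score(alternatives[k], weights)),
                reverse=True)
```

Keeping the ratings fuzzy until the final defuzzification step is what lets the subjectivity of the evaluators propagate through the aggregation instead of being discarded at input time.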