Abstract: Rapid urbanization, industrialization, and population growth have led to an increase in the number of automobiles, which in turn cause air pollution. Road traffic is estimated to contribute 60% of air pollution in urban areas. A case-by-case assessment is required to predict air quality in urban situations, so as to devise traffic management measures that keep air quality within tolerable limits. Calicut city in the state of Kerala, India, was chosen as the study area. Carbon monoxide (CO) concentration was monitored at 15 links in Calicut city, and air quality performance was evaluated over each link. The measured CO concentrations were compared with the National Ambient Air Quality Standards (NAAQS), and CO values were predicted using the CALINE4, IITLS, and linear regression models. The study revealed that the linear regression model performs better than the CALINE4 and IITLS models. The possible association between CO concentration and traffic parameters such as traffic flow, vehicle type, and traffic stream speed was also evaluated.
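The abstract does not reproduce the regression model itself; the following is a minimal sketch of how a linear model could relate CO concentration to traffic parameters. All numbers and variable names here are hypothetical, illustrative values, not the study's data.

```python
import numpy as np

# Hypothetical per-link observations (NOT the study's data):
flow  = np.array([900, 1200, 1500, 1800, 2100, 2400], dtype=float)  # veh/h
speed = np.array([ 45,   42,   35,   31,   25,   18], dtype=float)  # km/h
co    = np.array([1.1,  1.5,  1.9,  2.4,  2.8,  3.3])               # ppm

# Design matrix with an intercept column: co ~ b0 + b1*flow + b2*speed
X = np.column_stack([np.ones_like(flow), flow, speed])
coef, *_ = np.linalg.lstsq(X, co, rcond=None)

predicted = X @ coef
print(coef)       # fitted intercept and slopes
print(predicted)  # modelled CO estimates at the observed links
```

The same least-squares fit generalizes directly to additional predictors such as vehicle-type composition.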
Abstract: Document clustering has become an essential technology with the popularity of the Internet, which means that fast, high-quality document clustering techniques are a core topic. Text clustering, or simply clustering, is the task of discovering semantically related groups in an unstructured collection of documents. Clustering has long been popular because it provides unique ways of digesting and generalizing large amounts of information. One of the issues in clustering is extracting proper features (concepts) of a problem domain. Existing clustering technology mainly focuses on term weight calculation. To achieve more accurate document clustering, more informative features, including concept weights, are important. Feature selection is important for the clustering process because irrelevant or redundant features may misguide the clustering results. To counteract this issue, the proposed system introduces concept weights into a text clustering system built on the k-means algorithm in accordance with the principles of ontology, so that the important words of a cluster can be identified by their weight values. To a certain extent, this resolves the semantic problem in specific areas.
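A minimal sketch of the idea, under the assumption that concept weights act as per-term scaling factors applied before plain k-means; the documents, weights, and the naive fixed initialization are all illustrative, not the paper's system:

```python
import numpy as np

# Toy term-frequency vectors (4 docs x 4 terms) and assumed
# ontology-derived concept weights for each term.
docs = np.array([
    [2, 3, 0, 0],   # docs 0-1 share terms 0-1
    [3, 2, 1, 0],
    [0, 0, 3, 2],   # docs 2-3 share terms 2-3
    [0, 1, 2, 3],
], dtype=float)
concept_weight = np.array([1.5, 1.5, 1.0, 1.0])  # hypothetical weights
X = docs * concept_weight                         # weight features first

def kmeans(X, init_idx, iters=20):
    # Naive fixed initialization keeps the sketch deterministic.
    centers = X[init_idx].copy()
    for _ in range(iters):
        # assign each document to its nearest center (squared Euclidean)
        labels = np.argmin(((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1), axis=1)
        # move each center to the mean of its assigned documents
        centers = np.array([X[labels == c].mean(axis=0) for c in range(len(init_idx))])
    return labels

labels = kmeans(X, init_idx=[0, 2])
print(labels)
```

Because the weighting stretches the concept axes, documents sharing highly weighted concepts are pulled into the same cluster more strongly than raw term counts alone would.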
Abstract: A mathematical model for the Dynamics of Economic Profit is constructed by proposing a characteristic differential one-form for this dynamics (analogous to the action in Hamiltonian dynamics). After processing this form with exterior calculus, a pair of characteristic differential equations is generated and solved for the rate of change of profit P as a function of revenue R(t) and cost C(t). By contracting the characteristic differential one-form with a vortex vector, the Lagrangian is obtained for the Dynamics of Economic Profit.
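The characteristic one-form and the resulting equations are not reproduced in the abstract. As background only, any such solution for the rate of change of profit must be consistent with the elementary accounting identity that profit is revenue minus cost:

```latex
P(t) = R(t) - C(t), \qquad \frac{dP}{dt} = \frac{dR}{dt} - \frac{dC}{dt}.
```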
Abstract: This paper reviews the objectives, methods and results of previous studies on biodrying of solid waste in several countries. Biodrying of solid waste is a novel technology in developing countries such as Malaysia, where the high moisture content of organic waste complicates the segregation process for recycling purposes and diminishes the calorific value for use as a fuel source. In addition, the high moisture content encourages the breeding of vectors and disease-bearing animals. From the laboratory results, the average moisture contents of organic waste, paper, plastics and metals are 58.17%, 37.93%, 29.79% and 1.03%, respectively, for the UKM campus. Biodrying of solid waste is a simple method of waste treatment as well as a cost-efficient technology for drying solid waste. The process depends on temperature monitoring and air flow control along with the natural biodegradation of organic waste. This review shows that the biodrying method has high potential in the treatment and recycling of solid waste, and can be useful for biodrying studies and implementation in Malaysia.
Abstract: We present the development of a new underwater laser
cutting process in which a water-jet has been used along with the
laser beam to remove the molten material through kerf. The
conventional underwater laser cutting usually utilizes a high pressure
gas jet along with laser beam to create a dry condition in the cutting
zone and to eject the molten material. This creates many gas bubbles and much turbulence in the water, and produces aerosols and waste gas, which may contaminate the surrounding atmosphere while cutting radioactive components such as burnt nuclear fuel. The water-jet-assisted underwater laser cutting process produces far less turbulence and fewer aerosols in the atmosphere. Some water vapor bubbles form at the laser-metal-water interface; however, they tend to condense as they rise through the
surrounding water. We present the design and development of a
water-jet assisted underwater laser cutting head and the parametric
study of the cutting of AISI 304 stainless steel sheets with a 2 kW
CW fiber laser. The cutting performance is similar to that of gas-assisted laser cutting; however, the process efficiency is reduced due to heat convection by the water-jet and laser beam scattering by vapor. This
process may be attractive for underwater cutting of nuclear reactor
components.
Abstract: The performance of, and the plasma created by, a pulsed magnetoplasmadynamic thruster for small satellite applications are studied to better understand the ablation and plasma propagation processes occurring during the short-duration discharge. The results can be applied to improve the quality of the thruster in terms of efficiency, and to tune the propulsion system to the needs of the satellite mission. To this end, plasma measurements with a high-speed camera and induction probes, and performance measurements of mass bit and impulse bit, were conducted. Values for current sheet propagation speed, mean exhaust velocity and thrust efficiency were derived from these experimental data. The high-speed camera measurements showed a maximum in current sheet propagation speed at a medium energy input, which was confirmed by the induction probes. A quasilinear tendency was found between the mass bit and the energy input (and, correspondingly, the current action integral), as well as a linear tendency between the created impulse and the discharge energy. The highest mean exhaust velocity and thrust efficiency were found at the highest energy input.
Abstract: This paper proposes a method, combining color and layout features, for identifying documents captured with low-resolution handheld devices. On one hand, the document image color density surface is estimated and represented with an equivalent ellipse; on the other hand, the document's shallow layout structure is computed and represented hierarchically. Our identification method first uses the color information in the documents to narrow the search space to documents with a similar color distribution, and then selects the document with the most similar layout structure from the remainder of the search space.
Abstract: A cross-sectional survey design was used to collect data from 370 diabetic patients. Two instruments were used to obtain the data: an in-depth interview guide and a researcher-developed questionnaire. Fisher's exact test was used to investigate the association between the identified factors and non-adherence. The factors identified were: socio-demographic factors such as gender, age, marital status, educational level and occupation; psychosocial obstacles such as non-affordability of the prescribed diet, frustration due to the restriction, limited spousal support, feelings of deprivation, the feeling that temptation is inevitable, difficulty adhering at social gatherings, and difficulty in revealing to a host that one is diabetic; and health care provider obstacles, namely poor attitude of health workers, irregular diabetes education in clinics, a limited number of nutrition education sessions, inability of the patients to estimate the desired quantity of food, no reminder postcards or phone calls about upcoming patient appointments, and delayed start of appointments and time wasted in clinics.
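As a hedged illustration of the statistical test named above, the following applies Fisher's exact test to an entirely hypothetical 2x2 contingency table (the counts and the factor chosen are made up for the sketch, not the study's data):

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table: gender vs. adherence to the prescribed diet.
#                adherent  non-adherent
table = [[ 40,  60],    # male
         [ 90, 180]]    # female

odds_ratio, p_value = fisher_exact(table)
print(odds_ratio, p_value)
if p_value < 0.05:
    print("evidence of association between gender and non-adherence")
else:
    print("no significant association at the 5% level")
```

Fisher's exact test is preferred over the chi-squared test when some cell counts are small, which is common once a sample is split across many factor levels.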
Abstract: Economically, transformers constitute one of the largest investments in a power system. For this reason, transformer condition assessment and management is a high-priority task. If a transformer fails, it has a significant negative impact on revenue and service reliability. Monitoring the state of health of power transformers has traditionally been carried out using laboratory Dissolved Gas Analysis (DGA) tests performed at periodic intervals on oil samples collected from the transformers. DGA of transformer oil is the single best indicator of a transformer's overall condition and is a universal practice today, which started sometime in the 1960s. Failure can occur in a transformer for different reasons, and some failures can be limited or prevented by maintenance. Oil filtration is one method of removing the dissolved gases and preventing the deterioration of the oil. In this paper we analyze the DGA data by regression methods and predict the future gas concentrations in the oil. We present a comparative study of different traditional regression methods and the errors generated in their predictions. With the help of these data we can deduce the health of the transformer by identifying the type of fault, if one has occurred or will occur in the future. Additionally, the effect of filtration on transformer health is highlighted by calculating the probability of failure of a transformer with and without oil filtration.
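A minimal sketch of the trend-extrapolation idea, using a made-up DGA history (the gas, sampling interval, and concentrations are hypothetical; the paper compares several regression methods, of which this shows only a simple linear trend):

```python
import numpy as np

# Hypothetical periodic DGA record: hydrogen concentration over time.
months = np.array([0, 6, 12, 18, 24], dtype=float)
h2_ppm = np.array([12, 18, 25, 31, 38], dtype=float)  # illustrative values

# Fit a linear trend and extrapolate one sampling interval ahead.
slope, intercept = np.polyfit(months, h2_ppm, deg=1)
forecast_30 = slope * 30 + intercept
print(round(forecast_30, 1))  # predicted ppm at month 30
```

Comparing such forecasts against subsequent measurements gives the per-method prediction errors the abstract refers to, and a rising trend toward a diagnostic threshold flags a developing fault.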
Abstract: In past years, the world has witnessed significant work in the field of manufacturing. Special efforts have been made in the implementation of new technologies, management and control systems, among many others, which have all advanced the field. Following all this closely, and given the scope of new projects and the need to turn the existing flexible ideas into more autonomous and intelligent ones, i.e., to move toward more intelligent manufacturing, the present paper aims to contribute to the analysis and some customization issues of a new iCIM 3000 system at the IPSAM. In this process, special emphasis is placed on the material flow problem. To this end, besides offering a description and analysis of the system and its main parts, some tips on how to define other possible alternative material flow scenarios and a partial analysis of the combinatorial nature of the problem are offered as well. All this is done with the intention of relating it to the use of simulation tools, which have been briefly addressed with a special focus on the Witness simulation package. For better comprehension, the previous elements are supported by a few figures and expressions that help obtain the necessary data. These data and others will be used in the future, when simulating the scenarios in the search for the best material flow configurations.
Abstract: Design and modeling of nonlinear systems require knowledge of all internal acting parameters and effects. An empirical alternative is to identify the system's transfer function from input and output data as a black-box model. This paper presents a procedure using a least squares algorithm for the identification of a feed drive system's coefficients in the time domain, using a reduced model based on windowed input and output data. The command and response of the axis are first measured during the first 4 ms, and least squares are then applied to estimate the transfer function coefficients for this displacement segment. From the identified coefficients, the subsequent command response segments are predicted. The obtained results reveal a considerable potential of the least squares method to identify the system's time-based coefficients and to predict the command response accurately as compared to measurements.
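The procedure can be sketched on synthetic data: identify the coefficients of a discrete-time (ARX-type) model by least squares over a short initial window, then predict the later response from them. The second-order plant, its coefficients, and the window length below are assumptions for the demo, not the paper's feed drive model.

```python
import numpy as np

# Assumed "true" second-order plant (not the paper's feed drive):
#   y[k] = a1*y[k-1] + a2*y[k-2] + b1*u[k-1] + b2*u[k-2]
rng = np.random.default_rng(1)
a1, a2, b1, b2 = 1.5, -0.6, 0.05, 0.03
u = rng.standard_normal(200)          # synthetic command signal
y = np.zeros(200)
for k in range(2, 200):               # simulate the response
    y[k] = a1*y[k-1] + a2*y[k-2] + b1*u[k-1] + b2*u[k-2]

# Least squares over the first window only (cf. the paper's 4 ms segment)
K = np.arange(2, 40)
Phi = np.column_stack([y[K-1], y[K-2], u[K-1], u[K-2]])
theta, *_ = np.linalg.lstsq(Phi, y[K], rcond=None)

# Predict the later segments from the identified coefficients
Kp = np.arange(40, 200)
y_pred = np.column_stack([y[Kp-1], y[Kp-2], u[Kp-1], u[Kp-2]]) @ theta
print(theta)                               # recovered coefficients
print(np.max(np.abs(y_pred - y[Kp])))      # prediction error on unseen data
```

On noiseless data the coefficients are recovered exactly; with measured signals the window length trades off noise averaging against how quickly an estimate becomes available.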
Abstract: Lately, significant work in the area of Intelligent Manufacturing has become public, mainly applied within the frame of industrial purposes. Special efforts have been made in the implementation of new technologies, management and control systems, among many others, which have all advanced the field. Aware of all this, and given the scope of new projects and the need to turn the existing flexible ideas into more autonomous and intelligent ones, i.e., Intelligent Manufacturing, the present paper aims to contribute to the design and analysis of the material flow in systems, cells, or workstations under this new “intelligent” denomination. To this end, besides offering a conceptual basis for some of the key points to be taken into account and some general principles to consider in the design and analysis of the material flow, some tips on how to define other possible alternative material flow scenarios and a classification of the states a system, cell, or workstation can be in are offered as well. All this is done with the intention of relating it to the use of simulation tools, which have been briefly addressed with a special focus on the Witness simulation package. For better comprehension, the previous elements are supported by a detailed layout, other figures, and a few expressions that help obtain the necessary data. These data and others will be used in the future, when simulating the scenarios in the search for the best material flow configurations.
Abstract: The objective of this paper is to construct a creativity
composite index designed to capture the growing role of creativity in
driving economic and social development for the 27 European Union
countries.
The paper proposes a new approach for the measurement of EU-27
creative potential and for determining its capacity to attract and
develop creative human capital. We apply a modified version of the
3T model developed by Richard Florida and Irene Tinagli for
constructing a Euro-Creativity Index. The resulting indices establish a quantitative basis for policy makers, supporting their efforts to determine the contribution of creativity to economic development.
Abstract: In this paper we present the Semantic Assistant Agent (SAA), an open-source digital library agent which takes a user query for finding information in the digital library, retrieves resources' metadata, and stores it semantically. SAA uses the Semantic Web to improve browsing and searching for resources in the digital library. All metadata stored in the library are available in RDF format for querying and processing by SemanSreach, which is a part of the SAA architecture. The architecture includes a generic RDF-based model that represents relationships among objects and their components. Queries against these relationships are supported by an RDF triple store.
Abstract: Appropriate description of business processes through standard notations has become one of the most important assets for organizations. Organizations must therefore deal with quality faults in business process models, such as a lack of understandability and modifiability. These quality faults may be exacerbated when business process models are mined by reverse engineering, e.g., from existing information systems that support those business processes. Hence, business process refactoring is often used, which changes the internal structure of business processes while preserving their external behavior. This paper aims to choose the most appropriate set of refactoring operators through quality assessment concerning understandability and modifiability. These quality features are assessed through well-proven measures proposed in the literature. Additionally, a set of measure thresholds is heuristically established for applying the most promising refactoring operators, i.e., those that achieve the highest quality improvement according to the selected measures in each case.
Abstract: The back propagation algorithm calculates the weight changes of artificial neural networks, and a common approach is to use a training algorithm consisting of a learning rate and a momentum factor. The major drawbacks of this learning algorithm are the problems of local minima and slow convergence speed. The addition of an extra term, called a proportional factor, reduces the convergence time of the back propagation algorithm. We have applied the three-term back propagation to multiplicative neural network learning. The algorithm is tested on the XOR and parity problems and compared with the standard back propagation training algorithm.
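The update rule itself can be sketched in isolation: alongside the usual learning-rate and momentum terms, a third term proportional to the output error is added. The hyperparameters, signs, and the toy one-dimensional quadratic objective below are assumptions for illustration, not the paper's multiplicative-network setup.

```python
# Three-term update: learning-rate term, momentum term, and an extra
# proportional term on the output error (hyperparameters assumed).
alpha, mu, gamma = 0.1, 0.5, 0.01

def three_term_step(grad, prev_dw, error):
    # delta_w = -alpha*gradient + mu*previous_delta - gamma*error
    return -alpha * grad + mu * prev_dw - gamma * error

w, prev_dw = 5.0, 0.0        # start away from the target w* = 2
for _ in range(100):
    error = w - 2.0          # "output error" for the toy objective (w - 2)^2
    grad = 2.0 * error       # its gradient
    dw = three_term_step(grad, prev_dw, error)
    w, prev_dw = w + dw, dw
print(w)                     # converges toward 2.0
```

The proportional term adds a direct push in the error-reducing direction on top of the gradient, which is the mechanism credited with the faster convergence.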
Abstract: Pharmaceutical industries and the effluents of sewage treatment plants are the main sources of residual pharmaceuticals in water resources. These emergent pollutants may adversely impact the biophysical environment. Pharmaceutical industries often generate wastewater that varies in characteristics and quantity depending on the manufacturing processes used. Carbamazepine (CBZ, 5H-dibenzo[b,f]azepine-5-carboxamide, C15H12N2O) is a significant non-biodegradable pharmaceutical contaminant in Jordanian pharmaceutical wastewater, which is not removed by the activated sludge processes in treatment plants. Activated carbon may potentially remove this pollutant from effluents, but the high cost involved suggests that more attention should be given to the potential use of low-cost materials in order to reduce cost and environmental contamination. Powders of Jordanian non-metallic raw materials, namely Azraq Bentonite (AB), Kaolinite (K), and Zeolite (Zeo), were activated (acid and thermal treatment) and evaluated by removing CBZ. The results of batch and column experiments showed around 46% and 67% removal of CBZ, respectively.
Abstract: The performance of schedules released to a shop floor may be greatly affected by unexpected disruptions. Thus, this paper considers the flexible job shop scheduling problem when the processing times of some operations are represented by a uniform distribution with given lower and upper bounds. The objective is to find a predictive schedule that can deal with this uncertainty. The paper compares two genetic approaches for obtaining a predictive schedule. To determine the performance of the predictive schedules obtained by both approaches, an experimental study is conducted on a number of benchmark problems.
Abstract: The main aim of this research is to develop a methodology that encourages people's awareness, knowledge, and understanding of participation in flood management for cultural heritage, fostering cooperation and interaction among the government, private, and public sectors through role-play gaming simulation theory. The approach of this research is to develop a role-play gaming simulation from existing documents, games, and role-playing exercises from several sources, together with existing data on the research site. We found that role-play gaming simulation can be implemented to help improve understanding of the existing problem and of the impact of flooding on cultural heritage. The role-play game can be developed into a tool to improve people's knowledge, understanding, and awareness of participation in flood management for cultural heritage; moreover, cooperation among the government, private, and public sectors can be improved through role-play gaming simulation.