Abstract: Rapid urbanization, industrialization and population growth have led to an increase in the number of automobiles, which cause air pollution. It is estimated that road traffic contributes 60% of air pollution in urban areas. A case-by-case assessment is required to predict air quality in urban situations, so as to evolve traffic management measures that keep air quality within tolerable limits. Calicut city in the state of Kerala, India has
been chosen as the study area. Carbon Monoxide (CO) concentration
was monitored at 15 links in Calicut city and air quality performance
was evaluated over each link. The CO pollutant concentration values
were compared with the National Ambient Air Quality Standards
(NAAQS), and the CO values were predicted using the CALINE4, IITLS and linear regression models. The study has revealed that the linear regression model performs better than the CALINE4 and
IITLS models. The possible association between CO pollutant
concentration and traffic parameters like traffic flow, type of vehicle,
and traffic stream speed was also evaluated.
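The regression approach mentioned above can be illustrated with a minimal sketch; the traffic data below are invented for illustration, and the study's actual measurements and coefficients are not reproduced:

```python
import numpy as np

# Hypothetical sample data: traffic flow (veh/h), stream speed (km/h),
# and measured CO concentration (ppm) at several monitored links.
flow  = np.array([1200.0, 1800.0, 2400.0, 3000.0, 3600.0])
speed = np.array([42.0, 35.0, 31.0, 24.0, 21.0])
co    = np.array([1.1, 1.6, 2.2, 2.8, 3.3])

# Design matrix with an intercept column: CO ~ b0 + b1*flow + b2*speed
X = np.column_stack([np.ones_like(flow), flow, speed])
coef, *_ = np.linalg.lstsq(X, co, rcond=None)

def predict_co(f, s):
    """Predict CO concentration (ppm) for a given flow and stream speed."""
    return coef[0] + coef[1] * f + coef[2] * s
```

A fitted model of this form can then be compared link by link against the CALINE4 and IITLS predictions, as the study does.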
Abstract: In this paper, we present a comparative study of two computer vision systems for object recognition and tracking. The two algorithms describe different approaches based on regions, each constituted by a set of pixels, which parameterize objects in shot sequences. For image segmentation and object detection, the FCM technique is used; overlap between cluster distributions is minimized by using a suitable color space (other than RGB). The first technique takes into account a priori probabilities governing the computation of the various clusters used to track objects. A Parzen kernel method is described that allows the players in each frame to be identified, and we also show the importance of the choice of standard deviation for the Gaussian probability density function. Region matching is carried out by an algorithm that operates on the Mahalanobis distance between region descriptors in two subsequent frames and uses singular value decomposition to compute a set of correspondences satisfying both the principle of proximity and the principle of exclusion.
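The SVD-based correspondence step can be sketched generically as follows; this is a minimal illustration of proximity-weighted matching with a mutual-maximum (exclusion) filter, assuming invented region descriptors and covariance rather than the paper's actual features:

```python
import numpy as np

def match_regions(desc_a, desc_b, cov):
    """Match region descriptors across two frames (SVD correspondence sketch).

    desc_a, desc_b: (m, d) and (n, d) arrays of region descriptors.
    cov: (d, d) covariance used for the Mahalanobis distance.
    Returns a list of (i, j) index pairs.
    """
    inv_cov = np.linalg.inv(cov)
    m, n = len(desc_a), len(desc_b)
    G = np.empty((m, n))
    for i in range(m):
        for j in range(n):
            d = desc_a[i] - desc_b[j]
            G[i, j] = np.exp(-0.5 * d @ inv_cov @ d)  # proximity weight
    U, _, Vt = np.linalg.svd(G)
    k = min(m, n)
    # Replace singular values with ones to amplify mutually best pairings.
    P = U[:, :k] @ Vt[:k, :]
    pairs = []
    for i in range(m):
        j = int(np.argmax(P[i]))
        if int(np.argmax(P[:, j])) == i:   # exclusion: mutual maxima only
            pairs.append((i, j))
    return pairs
```

Pairs survive only when row and column maxima agree, which enforces the one-to-one (exclusion) constraint alongside proximity.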
Abstract: Document clustering has become an essential technology with the popularity of the Internet, which means that fast, high-quality document clustering techniques are a core topic. Text clustering, or simply clustering, is about discovering semantically
related groups in an unstructured collection of documents. Clustering
has been very popular for a long time because it provides unique
ways of digesting and generalizing large amounts of information.
One of the issues of clustering is to extract proper feature (concept)
of a problem domain. The existing clustering technology mainly
focuses on term weight calculation. To achieve more accurate
document clustering, more informative features including concept
weight are important. Feature selection is important for the clustering process because irrelevant or redundant features may misguide the clustering results. To counteract this issue, the proposed system presents concept weighting for a text clustering system developed on a k-means algorithm in accordance with the principles of ontology, so that the important words of a cluster can be identified by their weight values. To a certain extent, this has resolved the semantic problem in specific areas.
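The core idea, k-means over term vectors rescaled by concept weights, can be sketched minimally; the term-frequency matrix and weights below are invented for illustration, and the paper's ontology-derived weighting scheme is not reproduced:

```python
import numpy as np

def concept_weighted_kmeans(docs, concept_weight, k, iters=20, seed=0):
    """k-means on term vectors scaled by ontology-derived concept weights.

    docs: (n_docs, n_terms) term-frequency matrix.
    concept_weight: (n_terms,) weight per term, e.g. higher for terms
    mapped to important ontology concepts (values here are assumptions).
    Returns a cluster label per document.
    """
    X = docs * concept_weight          # emphasize concept-bearing terms
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # assign each document to its nearest center
        dist = ((X[:, None, :] - centers[None]) ** 2).sum(-1)
        labels = dist.argmin(1)
        for c in range(k):
            if (labels == c).any():
                centers[c] = X[labels == c].mean(0)
    return labels
```

Down-weighting terms that carry little conceptual content keeps them from dominating the distance computation, which is the point the abstract makes about feature selection.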
Abstract: A mathematical model for the Dynamics of Economic
Profit is constructed by proposing a characteristic differential one-form for this dynamics (analogous to the action in Hamiltonian
dynamics). After processing this form with exterior calculus, a pair of
characteristic differential equations is generated and solved for the
rate of change of profit P as a function of revenue R (t) and cost C (t).
By contracting the characteristic differential one-form with a vortex
vector, the Lagrangian is obtained for the Dynamics of Economic
Profit.
Abstract: When reconstructing a scenario, it is necessary to know the structure of the elements present in the scene in order to interpret it. In this work we link 3D scene reconstruction to evolutionary algorithms through stereo vision theory. We consider stereo vision as a method that provides the reconstruction of a scene using only a couple of images of the scene and performing some computation. Through several images of a scene, captured from different positions, stereo vision can give us an idea of the three-dimensional characteristics of the world. Stereo vision usually requires two cameras, by analogy with the mammalian vision system. In this work we employ only one camera, which is translated along a path, capturing images at regular intervals. As we cannot perform all the computations required for an exhaustive reconstruction, we employ an evolutionary algorithm to partially reconstruct the scene in real time. The algorithm employed is the fly algorithm, which employs "flies" to reconstruct the principal characteristics of the world following certain evolutionary rules.
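The evolutionary loop behind the fly algorithm can be sketched generically. The fitness below is a placeholder (distance to a hypothetical scene plane), since the real algorithm scores each fly by the photometric consistency of its projections into the captured images, which are not available here:

```python
import random

# Toy sketch: "flies" are 3D points evolved by selection and mutation.
def toy_fitness(fly):
    """Placeholder fitness: distance to an assumed scene surface z = 5."""
    x, y, z = fly
    return abs(z - 5.0)

def evolve(pop_size=30, generations=50, seed=1):
    rng = random.Random(seed)
    flies = [(rng.uniform(-1, 1), rng.uniform(-1, 1), rng.uniform(0, 10))
             for _ in range(pop_size)]
    for _ in range(generations):
        flies.sort(key=toy_fitness)
        survivors = flies[:pop_size // 2]       # selection: keep best half
        children = [(x + rng.gauss(0, 0.1),     # mutation: small jitter
                     y + rng.gauss(0, 0.1),
                     z + rng.gauss(0, 0.1))
                    for (x, y, z) in survivors]
        flies = survivors + children
    return min(flies, key=toy_fitness)
```

Because the population concentrates on high-fitness regions rather than sampling the whole volume, this kind of loop can trade completeness for real-time partial reconstruction, as the abstract describes.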
Abstract: This paper reviews the objectives, methods and results of previous studies on biodrying of solid waste in several countries. Biodrying of solid waste is a novel technology in developing countries such as Malaysia, where the high moisture content of organic waste complicates the segregation process for recycling purposes and diminishes the calorific value for use as a fuel source. In addition, the high moisture content also encourages the breeding of vectors and disease-bearing animals. From the laboratory results, the average moisture contents of organic waste, paper, plastics and metals are 58.17%, 37.93%, 29.79% and 1.03% respectively for the UKM campus. Biodrying of solid waste is a simple method of waste treatment as well as a cost-efficient technology to dry the solid waste. The process depends on temperature monitoring and air flow control along with the natural biodegradation of organic waste. This review shows that the biodrying method has high potential in the treatment and recycling of solid waste, and could be useful for biodrying studies and implementation in Malaysia.
Abstract: We present the development of a new underwater laser
cutting process in which a water-jet has been used along with the
laser beam to remove the molten material through kerf. The
conventional underwater laser cutting usually utilizes a high pressure
gas jet along with laser beam to create a dry condition in the cutting
zone and also to eject out the molten material. This causes a lot of gas
bubbles and turbulence in water, and produces aerosols and waste
gas. This may cause contamination in the surrounding atmosphere
while cutting radioactive components like burnt nuclear fuel. The
water-jet assisted underwater laser cutting process produces much
less turbulence and fewer aerosols in the atmosphere. Some water vapor bubbles are formed at the laser-metal-water interface;
however, they tend to condense as they rise up through the
surrounding water. We present the design and development of a
water-jet assisted underwater laser cutting head and the parametric
study of the cutting of AISI 304 stainless steel sheets with a 2 kW
CW fiber laser. The cutting performance is similar to that of the gas
assist laser cutting; however, the process efficiency is reduced due to
heat convection by water-jet and laser beam scattering by vapor. This
process may be attractive for underwater cutting of nuclear reactor
components.
Abstract: The performance of, and the plasma created by, a pulsed magnetoplasmadynamic thruster for small satellite applications are studied to better understand the ablation and plasma propagation processes occurring during the short-time discharge. The results can
be applied to improve the quality of the thruster in terms of efficiency,
and to tune the propulsion system to the needs required by the satellite
mission. Therefore, plasma measurements with a high-speed camera
and induction probes, and performance measurements of mass bit
and impulse bit were conducted. Values for current sheet propagation
speed, mean exhaust velocity and thrust efficiency were derived from
these experimental data. A maximum in current sheet propagation
was found by the high-speed camera measurements for a medium
energy input and confirmed by the induction probes. A quasi-linear tendency between the mass bit and the energy input (or, correspondingly, the current action integral) was found, as well as a linear tendency between the created impulse and the discharge energy. The highest mean exhaust velocity and thrust efficiency were found for the highest
energy input.
Abstract: A cross-sectional survey design was used to collect data from 370 diabetic patients. Two instruments were used in obtaining data: an in-depth interview guide and a researcher-developed questionnaire. Fisher's exact test was used to investigate the association between the identified factors and nonadherence. The factors identified were: socio-demographic factors such as gender, age, marital status, educational level and occupation; psychosocial obstacles such as non-affordability of the prescribed diet, frustration due to the restriction, limited spousal support, feelings of deprivation, the feeling that temptation is inevitable, difficulty adhering at social gatherings and difficulty revealing to a host that one is diabetic; and health-care provider obstacles, namely poor attitude of health workers, irregular diabetes education in clinics, a limited number of nutrition education sessions, inability of the patients to estimate the desired quantity of food, no reminder postcards or phone calls about upcoming patient appointments, and delayed start of appointments and time wasted in clinics.
Abstract: Economically, transformers constitute one of the largest investments in a power system. For this reason, transformer condition assessment and management is a high-priority task. If a transformer fails, it has a significant negative impact on revenue and service reliability. Monitoring the state of health of power transformers has traditionally been carried out using laboratory Dissolved Gas Analysis (DGA) tests performed at periodic intervals on oil samples collected from the transformers. DGA of transformer oil is the single best indicator of a transformer's overall condition and is a universal practice today, which started sometime in the 1960s. Failure can occur in a transformer for different reasons; some failures can be limited or prevented by maintenance. Oil filtration is one method to remove the dissolved gases and prevent the deterioration of the oil. In this paper we analyse the DGA data by regression methods and predict future gas concentrations in the oil. We present a comparative study of different traditional regression methods and the errors generated in their predictions. With the help of these data we can deduce the health of the transformer by identifying the type of fault that has occurred or will occur in the future. Additionally, the effect of filtration on transformer health is highlighted by calculating the probability of failure of a transformer with and without oil filtration.
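The trend-prediction step can be sketched with a least-squares fit to a dissolved-gas history; the yearly acetylene values below are invented for illustration, and the paper's transformer data and comparative error analysis are not reproduced:

```python
import numpy as np

# Hypothetical DGA history: acetylene concentration (ppm) sampled yearly.
years = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
c2h2  = np.array([2.0, 3.1, 3.9, 5.2, 6.0])

# Least-squares linear trend: concentration ~ a*t + b
a, b = np.polyfit(years, c2h2, 1)

def forecast(t):
    """Extrapolated gas concentration at a future time t (years)."""
    return a * t + b
```

A forecast crossing a diagnostic threshold (for example, a fault-gas limit from a DGA interpretation scheme) is what flags an incipient fault in this style of analysis.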
Abstract: In the past years, the world has witnessed significant work in the field of manufacturing. Special efforts have been made in the implementation of new technologies and management and control systems, among many others, which have all evolved the field. Following all this, and given the scope of new projects and the need to turn the existing flexible ideas into more autonomous and intelligent ones, i.e., to move toward more intelligent manufacturing, the present paper emerges with the main aim of contributing to the analysis and a few customization issues of a new iCIM 3000 system at the IPSAM. In this process, special emphasis is placed on the material flow problem. For this, besides offering a description and analysis of the system and its main parts, some tips on how to define other possible alternative material flow scenarios and a partial analysis of the combinatorial nature of the problem are offered as well. All this is done with the intention of relating it to the use of simulation tools, which have been briefly addressed with a special focus on the Witness simulation package. For better comprehension, the previous elements are supported by a few figures and expressions which help in obtaining the necessary data. Such data and others will be used in the future, when simulating the scenarios in the search for the best material flow configurations.
Abstract: A catastrophic earthquake measuring 6.3 on the Richter scale struck the Christchurch, New Zealand Central Business District on February 22, 2011, abruptly disrupting the business of teaching and learning at Christchurch Polytechnic Institute of Technology. This paper presents the findings from a study undertaken on the complexity of delivering an educational programme in the face of this traumatic natural event. Nine interconnected themes emerged from this multiple-method study: communication, decision making, leadership and followership, balancing personal and professional responsibilities, taking action, and preparedness and thinking ahead, all within a disruptive and uncertain context. Sustainable responses that maximise business continuity and provide solutions to practical challenges are among the study's recommendations.
Abstract: The purpose of the study was to determine if, among 32 brain-injured adults in community rehabilitation programs, there is a statistically significant relationship between the degree of severity of brain injury and these adults' levels of self-esteem and stress. The researcher hypothesized there would be a statistically significant difference and a statistically significant relationship in self-esteem and stress levels among TBI adults. A Pearson product-moment correlational analysis was implemented, and the results found a statistically significant relationship between self-esteem and stress levels. Future recommendations were suggested upon completion of the research.
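For reference, the Pearson product-moment coefficient used in this analysis can be computed directly; the paired scores below are invented for illustration only, not the study's data:

```python
import numpy as np

# Hypothetical paired scores: self-esteem and stress for six participants.
self_esteem = np.array([32.0, 28.0, 35.0, 22.0, 30.0, 25.0])
stress      = np.array([14.0, 19.0, 11.0, 24.0, 16.0, 21.0])

# Pearson product-moment correlation coefficient r.
r = np.corrcoef(self_esteem, stress)[0, 1]
```

A strongly negative r, as these invented values produce, would correspond to higher self-esteem co-occurring with lower stress; significance would then be judged against the critical value for the sample size.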
Abstract: Lately, significant work in the area of Intelligent Manufacturing has become public, mainly applied within the frame of industrial purposes. Special efforts have been made in the implementation of new technologies and management and control systems, among many others, which have all evolved the field. Aware of all this, and given the scope of new projects and the need to turn the existing flexible ideas into more autonomous and intelligent ones, i.e., Intelligent Manufacturing, the present paper emerges with the main aim of contributing to the design and analysis of the material flow in systems, cells or workstations under this new "intelligent" denomination. For this, besides offering a conceptual basis for some of the key points to be taken into account and some general principles to consider in the design and analysis of the material flow, some tips on how to define other possible alternative material flow scenarios and a classification of the states of a system, cell or workstation are offered as well. All this is done with the intention of relating it to the use of simulation tools, which have been briefly addressed with a special focus on the Witness simulation package. For better comprehension, the previous elements are supported by a detailed layout, other figures and a few expressions which help in obtaining the necessary data. Such data and others will be used in the future, when simulating the scenarios in the search for the best material flow configurations.
Abstract: Candida albicans ATCC 10231 had low endogenous activity of the alternative oxidase compared with that of C. albicans ATCC 10261. In C. albicans ATCC 10231 the endogenous activity declined as the cultures aged. Alternative oxidase activity could be induced in C. albicans ATCC 10231 by treatment with cyanide, but the induction of this activity required the presence of oxygen which could be replaced, at least in part, with high concentrations of potassium ferricyanide. We infer from this that the expression of the gene encoding the alternative oxidase is under the control of a redox-sensitive transcription factor.
Abstract: The effect of beak trimming on the behavior of two strains of Thai native pullets kept in floor pens was studied. Six general activities (standing, crouching, moving, comforting, roosting, and nesting), 6 beak-related activities (preening, feeding, drinking, pecking at inedible objects, feather pecking, and litter pecking), and 4 agonistic activities (head pecking, threatening, avoiding, and fighting) were measured twice a day for 15 consecutive days, starting when the pullets were 19 wk old. It was found that beak-trimmed pullets drank more frequently (P
Abstract: In this paper we present the semantic assistant agent (SAA), an open-source digital library agent which takes a user query for finding information in the digital library, retrieves resources' metadata and stores it semantically. SAA uses the Semantic Web to improve browsing and searching for resources in the digital library. All metadata stored in the library are available in RDF format for querying and processing by SemanSreach, which is a part of the SAA architecture. The architecture includes a generic RDF-based model that represents relationships among objects and their components. Queries against these relationships are supported by an RDF triple store.
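The triple-store querying idea can be illustrated with a minimal in-memory sketch; the identifiers and data below are invented, and SAA's actual RDF store and SemanSreach component are not reproduced:

```python
# Minimal in-memory triple store: a set of (subject, predicate, object)
# tuples, queried by pattern matching with wildcards.
triples = {
    ("lib:doc1", "dc:title", "Semantic Web Primer"),
    ("lib:doc1", "dc:creator", "lib:author7"),
    ("lib:doc2", "dc:creator", "lib:author7"),
}

def query(s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard."""
    return [(ts, tp, to) for (ts, tp, to) in triples
            if s in (None, ts) and p in (None, tp) and o in (None, to)]
```

For example, `query(p="dc:creator", o="lib:author7")` finds every resource by that author, which is the kind of relationship query the RDF-based model supports.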
Abstract: Appropriate description of business processes through
standard notations has become one of the most important assets for
organizations. Organizations must therefore deal with quality faults
in business process models such as the lack of understandability and
modifiability. These quality faults may be exacerbated if business
process models are mined by reverse engineering, e.g., from existing
information systems that support those business processes. Hence,
business process refactoring is often used, which changes the internal structure of business processes while preserving their external behavior. This paper aims to choose the most appropriate set of
refactoring operators through the quality assessment concerning
understandability and modifiability. These quality features are
assessed through well-proven measures proposed in the literature.
Additionally, a set of measure thresholds are heuristically established
for applying the most promising refactoring operators, i.e., those that
achieve the highest quality improvement according to the selected
measures in each case.
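The threshold-based selection of refactoring operators can be sketched as a simple rule table; the operator names, measures, and threshold values below are invented for illustration, not taken from the paper:

```python
# Quality measures computed for a mined business process model
# (hypothetical values).
measures = {"nesting_depth": 6, "gateway_count": 14, "node_count": 35}

# Heuristic thresholds: an operator is promising when its associated
# measure exceeds the limit (all entries are illustrative assumptions).
thresholds = {
    "extract_subprocess": ("node_count", 30),
    "simplify_gateways": ("gateway_count", 10),
    "flatten_structure": ("nesting_depth", 8),
}

def select_operators(measures, thresholds):
    """Return, sorted, the operators whose measure exceeds its threshold."""
    return sorted(op for op, (m, limit) in thresholds.items()
                  if measures.get(m, 0) > limit)
```

Applying only the operators whose measures breach a threshold targets the refactorings expected to yield the largest understandability or modifiability gains.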
Abstract: Pharmaceutical industries and the effluents of sewage treatment plants are the main sources of residual pharmaceuticals in water resources. These emerging pollutants may adversely impact the biophysical environment. Pharmaceutical industries often generate wastewater that changes in characteristics and quantity depending on the manufacturing processes used. Carbamazepine (CBZ), 5H-dibenzo[b,f]azepine-5-carboxamide (C15H12N2O), is a significant non-biodegradable pharmaceutical contaminant in Jordanian pharmaceutical wastewater, which is not removed by the activated sludge processes in treatment plants. Activated carbon may potentially remove this pollutant from effluents, but the high cost involved suggests that more attention should be given to the potential use of low-cost materials in order to reduce both cost and environmental contamination. Powders of Jordanian non-metallic raw materials, namely Azraq Bentonite (AB), Kaolinite (K), and Zeolite (Zeo), were activated (acid and thermal treatment) and evaluated for CBZ removal. The results of batch and column experiments showed around 46% and 67% removal of CBZ, respectively.
Abstract: The performance of schedules released to a shop floor may be greatly affected by unexpected disruptions. Thus, this paper considers the flexible job shop scheduling problem when the processing times of some operations are represented by a uniform distribution with given lower and upper bounds. The objective is to find a predictive schedule that can deal with this uncertainty. The paper compares two genetic approaches to obtaining a predictive schedule. To determine the performance of the predictive schedules obtained by both approaches, an experimental study is conducted on a number of benchmark problems.
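The evaluation step of such a predictive schedule can be sketched by Monte Carlo sampling of the uniform processing times. The two-machine flow-shop makespan below is a simplified stand-in for the full flexible job shop, and the bounds are invented for illustration:

```python
import random

def flowshop_makespan(times):
    """Makespan of a 2-machine flow shop for jobs in the given order.

    times: list of (t_machine1, t_machine2) per job.
    """
    end1 = end2 = 0.0
    for t1, t2 in times:
        end1 += t1                     # machine 1 processes jobs back to back
        end2 = max(end1, end2) + t2    # machine 2 waits for job and machine
    return end2

def expected_makespan(bounds, samples=200, seed=0):
    """Mean makespan over sampled realizations of uncertain times.

    bounds: list of ((lo1, hi1), (lo2, hi2)) uniform bounds per job.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        realization = [(rng.uniform(*b1), rng.uniform(*b2))
                       for b1, b2 in bounds]
        total += flowshop_makespan(realization)
    return total / samples
```

A genetic approach can use such a sampled expectation as the fitness of a candidate job order, favoring schedules that stay short across many realizations rather than only in the nominal case.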