Abstract: Leprosy is an infectious disease caused by
Mycobacterium leprae. The disease generally compromises
the neural fibers, leading to the development of disability.
Disabilities are changes that limit the daily activities or social
life of an individual. In leprosy, the study of disability considers
functional limitation (physical disability), limitation of activity,
and social participation, which are measured respectively by the
EHF, SALSA, and Participation scales. The objective of this
work is to propose an online monitoring system for leprosy
patients based on the information in the EHF, SALSA, and
Participation scales. The proposed system is expected to be
applied in monitoring the patient during treatment and after
healing therapy. The correlations the system draws between the
scales produce a variety of information, presenting the state of
the patient and any changes or reductions in disability. The
system provides reports with information from each of the
scales and the relationships that exist between them. In this way,
health professionals with access to patient information can
intervene with techniques for the prevention of disability.
Through the automated scales, the system shows the patient's
level and allows the patient, or their caregiver, to take preventive
measures. With an online system, it is possible to carry out the
assessments and monitor patients from anywhere.
Abstract: WikID is a wiki for industrial design engineers. An
important aspect for the viability of a wiki is the loyalty of the user
community to share their information and knowledge by adding this
knowledge to the wiki. For the initiators of a wiki it is therefore
important to use every aspect to stimulate the user community to
actively participate. In this study the focus is on the styling of the
website. The central question is: How could the WikID website be
visually designed to achieve a user experience which will incite the
user to actively participate in the WikID community? After a
literature study on the influencing factors of a website, a new
interface was designed by applying the rules found, in order to
expand this website's active user community. An online
questionnaire regarding the old or the new website gave insight
into the opinions of users. As expected, the new website was rated more
positively than the old website. However, the differences are limited.
Abstract: The production of biodiesel from crude palm oil with
a homogeneous base catalyst is impractical owing to the considerable
formation of soap. Free fatty acids (FFA) in crude palm oil need to
be reduced, e.g. by esterification. This study investigated the activity
of sulfated zirconia calcined at various temperatures for esterification
of FFA in crude palm oil to biodiesel. It was found that under a
proper reaction condition, sulfated zirconia well catalyzes
esterification. FFA content can be reduced to an acceptable value for
typical biodiesel production with a homogeneous base catalyst.
Crystallinity and sulfate attachment of sulfated zirconia depend on
calcination temperature during catalyst preparation. Calcination at
too low a temperature gives amorphous sulfated zirconia, which
has low activity for esterification of FFA. In contrast, calcination at
very high temperature removes the sulfate groups; consequently,
the conversion of FFA is reduced. The appropriate calcination
temperature range is 550–650 °C.
Abstract: With the advent of new technologies, factors related to
mental health in e-workspaces are taken into consideration more than
ever. Studies have revealed that one of the factors affecting the
productivity of employees in an organization is occupational stress.
Another influential factor is quality of work life which is important in
the improvement of work environment conditions and organizational
efficiency. In order to uncover the quality of work life level and to
investigate the impact of occupational stress on quality of work life
among information technology employees in Iran, a cross-sectional
study design was applied and data were gathered using a
questionnaire validated by a group of experts. The results of the study
showed that information technology staff have an average level of both
occupational stress and quality of work life. Furthermore, it was
found that occupational stress has a negative impact on quality of
work life. In addition, the same results were observed for role
ambiguity, role conflict, role under-load, work-pace, work
repetitiveness and tension toward quality of work life. No significant
relation was found between role overload and quality of work life.
Finally, directions for future research are proposed and discussed.
Abstract: The objective of this paper is to investigate a new
approach based on the idea of pictograms for food portion size. This
approach adopts the model of the United States Pharmacopeia- Drug
Information (USP-DI). The representation of each food portion size
is composed of three parts: frame, the connotation of dietary portion
sizes, and layout. To investigate users' comprehension of this
approach, two experiments were conducted involving 122 Taiwanese
participants, 60 male and 62 female, aged between 16 and 64 (divided
into age groups of 16-30, 31-45 and 46-64). In Experiment 1, the mean
correct rate for the understanding of food items is 48.54%
(S.D. = 95.08) and the mean response time 2.89 s (S.D. = 2.14). The
difference in correct rates for the different age groups is significant
(P*=0.00
Abstract: In Geographic Information System, one of the sources
of obtaining needed geographic data is digitizing analog maps and
evaluation of aerial and satellite photos. In this study, a method is
discussed which can be used to extract vector features and create
vectorized drawing files from aerial photos; a software application
was also developed for this purpose. Converting from raster to
vector, also known as vectorization, is the most important step
when creating vectorized drawing files. In the developed algorithm,
preprocessing is first performed on the aerial photo: converting to
grayscale if necessary, reducing noise, applying filters, and
determining the edges of objects. After these steps, the pixels
constituting the photo are followed from upper left to lower right
by examining their neighborhood relationships, and one-pixel-wide
lines or polylines are obtained. The traced lines have to be erased
to prevent confusion as vectorization continues, because otherwise
they may be perceived as new lines; however, erasing them can
cause discontinuities in the vector drawing. The image is therefore
converted from 2-bit to 8-bit, and the detected pixels are expressed
as a different value. In conclusion, the aerial photo can be converted to vector form
which includes lines and polylines and can be opened in any CAD
application.
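The pixel-following step described above can be sketched minimally as follows. This is an illustrative sketch, not the authors' code: the function names, the 8-neighborhood scan order, and the use of a VISITED marker value (standing in for the abstract's 2-bit-to-8-bit trick) are assumptions.

```python
import numpy as np

def to_grayscale(rgb):
    # Luminance-weighted grayscale conversion (ITU-R BT.601 weights).
    return rgb @ np.array([0.299, 0.587, 0.114])

def trace_lines(edges):
    """Follow edge pixels from upper left to lower right, collecting
    one-pixel-wide polylines.  Traced pixels are marked with a value
    that cannot occur in a binary image instead of being erased, so
    they are not re-detected but continuity is preserved."""
    img = edges.astype(np.uint8).copy()
    VISITED = 2
    # 8-connected neighborhood offsets, scanned in a fixed order.
    nbrs = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
            (0, 1), (1, -1), (1, 0), (1, 1)]
    polylines = []
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            if img[y, x] != 1:
                continue
            line = [(x, y)]
            img[y, x] = VISITED
            cy, cx = y, x
            while True:
                for dy, dx in nbrs:
                    ny, nx = cy + dy, cx + dx
                    if 0 <= ny < h and 0 <= nx < w and img[ny, nx] == 1:
                        line.append((nx, ny))
                        img[ny, nx] = VISITED
                        cy, cx = ny, nx
                        break
                else:          # no unvisited neighbor: polyline ends
                    break
            polylines.append(line)
    return polylines
```

Each returned polyline is an ordered list of (x, y) points ready to be written to a CAD drawing file.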
Abstract: With the advance of multimedia and diagnostic
images technologies, the number of radiographic images is increasing
constantly. The medical field demands sophisticated systems for
search and retrieval of the produced multimedia document. This
paper presents an ongoing research that focuses on the semantic
content of radiographic image documents to facilitate semantic-based
radiographic image indexing and retrieval. The proposed
model divides a radiographic image document, based on its
semantic content, into a logical structure and a semantic
structure. The logical structure represents the overall
organization of information. The semantic structure, which is bound
to logical structure, is composed of semantic objects with
interrelationships in the various spaces in the radiographic image.
Abstract: Today's healthcare industry has become more
patient-centric than profession-centric, and the quality of healthcare
and patient safety are the major concerns in modern healthcare
facilities. An unplanned extubation (UE) may be detrimental to the
patient's life and is thus one of the major indexes of patient safety
and healthcare quality. A high UE rate undermines not only
healthcare quality and patient safety policy but also the nurses'
morale and job satisfaction. The UE problem in a psychiatric
hospital is unique and may be a tough challenge for healthcare
professionals, because the patients mostly lack communication
capabilities. This paper reports a project organized to reduce the UE
rate from the then-current 2.3% to a lower, satisfactory level in the
long-term care units of a psychiatric hospital. The project was
conducted between March 1st, 2011 and August 31st, 2011. Based
on the error information gathered from various units of the hospital,
the team analyzed the root causes and proposed possible solutions
in its meetings. Four solutions were then agreed by consensus and
launched in the units in question. The UE rate was reduced to
0.17%. The experience, procedures, and tools adopted in this
project should serve as a good reference for other hospitals.
Abstract: Parametric models have been quite popular for
studying human growth, particularly in relation to biological
parameters such as peak size velocity and age at peak size velocity.
Longitudinal data are generally considered vital for fitting a
parametric model to individual-specific data and for studying the
distribution of these biological parameters in a human population.
However, cross-sectional data are easier to obtain than longitudinal
data. In this paper, we present a method of combining longitudinal
and cross-sectional data for the purpose of estimating the distribution
of the biological parameters. We demonstrate, through simulations in
the special case of the Preece–Baines model, how estimates based on
longitudinal data can be improved upon by harnessing the
information contained in cross-sectional data. We study the extent of
improvement for different mixes of the two types of data, and finally
illustrate the use of the method through data collected by the Indian
Statistical Institute.
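For reference, the Preece–Baines model 1 is the variant most often fitted in such growth studies. The abstract does not give its parameterization, so the standard form is shown here as an assumption, not the paper's own notation:

```latex
h(t) = h_1 - \frac{2\,\bigl(h_1 - h_\theta\bigr)}
                  {\exp\bigl[s_0 (t - \theta)\bigr] + \exp\bigl[s_1 (t - \theta)\bigr]}
```

Here h(t) is size at age t, h_1 is adult size, theta is an age close to the age at peak velocity, h_theta is the size at age theta, and s_0 and s_1 are rate constants; the age at peak size velocity and the peak velocity itself are functions of these five parameters.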
Abstract: Industrial robots play a vital role in automation;
however, little effort has been made to apply robots to
machining work such as grinding, cutting, milling, drilling, and
polishing. Parallel robot manipulators have high stiffness,
rigidity, and accuracy, which cannot be provided by conventional
serial robot manipulators. The aim of this paper is to perform the
modeling and the workspace analysis of a 3 DOF Parallel
Manipulator (3 DOF PM). The 3 DOF PM was modeled and
simulated using 'ADAMS'. The concept involved is based on the
transformation of motion from a screw joint to a spherical joint
through a connecting link. This work models the Parallel
Manipulator (PM) using screw joints for very accurate
positioning. A workspace analysis was done for the
determination of work volume of the 3 DOF PM. The position of the
spherical joints connected to the moving platform and the
circumferential points of the moving platform were considered for
finding the workspace. After the simulation, the position of the joints
of the moving platform was noted with respect to simulation time and
these points were given as input to the 'MATLAB' for getting the
work envelope. 'AUTOCAD' was then used for determining the work
volume. The obtained values were compared with an analytical
approach using the Pappus–Guldinus theorem. The analysis
considered the parameters link length and radius of the moving
platform. From the results, it is found that the radius of the
moving platform is directly proportional to the work volume for a
constant link length and the link length is also directly proportional
to the work volume, at a constant radius of the moving platform.
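The Pappus–Guldinus check mentioned in the abstract can be illustrated with a minimal sketch. The theorem itself is as named in the abstract; the torus example below is a generic demonstration, not the paper's manipulator geometry.

```python
import math

def pappus_volume(area, centroid_distance):
    """Pappus-Guldinus (second) theorem: the volume swept by revolving
    a plane region about an external, non-intersecting axis equals the
    region's area times the distance traveled by its centroid,
    V = 2 * pi * R * A."""
    return 2 * math.pi * centroid_distance * area

# Example: a torus generated by revolving a circle of radius r about an
# axis at distance R > r from its center; analytically V = 2*pi^2*R*r^2.
R, r = 5.0, 1.0
torus_volume = pappus_volume(math.pi * r ** 2, R)
```

The same one-liner applies to any cross-section whose area and centroid are known, which is what makes it a convenient analytical cross-check for a simulated work volume.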
Abstract: XML is a markup language which is becoming the
standard format for information representation and data exchange. A
major purpose of XML is the explicit representation of the logical
structure of a document. Much research has been performed to
exploit the logical structure of documents in information retrieval,
in order to precisely extract the user's information need from large
collections of XML documents. In this paper, we describe an XML
information retrieval weighting scheme that tries to find the most
relevant elements in XML documents in response to a user query.
We present this weighting model for information retrieval systems
that utilize plausible inferences to infer the relevance of elements in
XML documents. We also add to this model the Dempster-Shafer
theory of evidence to express the uncertainty in plausible inferences
and Dempster-Shafer rule of combination to combine evidences
derived from different inferences.
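Dempster's rule of combination, which the abstract invokes, can be sketched generically as follows. The rule is standard Dempster–Shafer theory; the mass functions in the example are illustrative and not taken from the paper's retrieval model.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozenset focal
    elements to masses) with Dempster's rule: multiply the masses of
    every pair of focal elements, assign each product to the pair's
    intersection, discard conflicting (empty-intersection) mass, and
    renormalize by 1 minus the total conflict."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    return {s: m / (1.0 - conflict) for s, m in combined.items()}

# Two sources of evidence about an element's relevance: each commits
# part of its mass to "relevant" and leaves the rest on the full frame.
rel = frozenset({"relevant"})
theta = frozenset({"relevant", "irrelevant"})
m = dempster_combine({rel: 0.6, theta: 0.4}, {rel: 0.5, theta: 0.5})
```

Combining the two sources concentrates mass on "relevant" while the residual uncertainty stays on the full frame, which is exactly the behavior used to merge evidence from different plausible inferences.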
Abstract: In rotating machinery, one of the critical components
prone to premature failure is the rolling bearing.
Consequently, early warning of an imminent bearing failure is
critical to the safety and reliability of any high-speed rotating
machine. This study is concerned with the application of Recurrence
Quantification Analysis (RQA) in fault detection of rolling element
bearings in rotating machinery. Based on the results from this study it
is reported that the RQA variable, percent determinism, is sensitive
to the type of fault investigated and therefore can provide useful
information on bearing damage in rolling element bearings.
Abstract: This paper introduces the effective speckle reduction of
synthetic aperture radar (SAR) images using inner product spaces in
the undecimated wavelet domain. There are two major areas in the
projection-onto-span algorithm where improvement can be made.
The first is the use of an undecimated wavelet transformation
instead of a discrete wavelet transformation. The second is the use
of a smoothing filter, namely a directional smoothing filter, as an
additional step. The proposed method does not need any noise
estimation or thresholding technique. Moreover, the proposed
method gives good results on both single-polarimetric and fully
polarimetric SAR images.
Abstract: This work aims to explore the factors that influence the reading comprehension process with different types of texts. In a recent study with 2nd, 3rd and 4th grade children, it was observed that reading comprehension of narrative texts was better than comprehension of expository texts. Nevertheless, it seems that not only the type of text but also other textual factors account for comprehension, depending on the cognitive processing demands posed by the text. In order to explore this assumption, three narrative and three expository texts were elaborated with different degrees of complexity. A group of 40 fourth grade Spanish-speaking children took part in the study. The children were asked to read the texts and answer orally three literal and three inferential questions for each text. The quantitative and qualitative analysis of the children's responses showed that children had difficulties with both narrative and expository texts. The problem was in answering those questions that involved establishing complex relationships among information units that were present in the text or that had to be activated from the children's previous knowledge to make an inference. Considering the data analysis, it can be concluded that there is some interaction between the type of text and the cognitive processing load of a specific text.
Abstract: Scale defects are common surface defects in hot steel rolling. The modelling of such defects is problematic and their causes are not straightforward. In this study, we investigated genetic algorithms in the search for a mathematical solution to scale formation. For this research, a high-dimensional data set from the hot steel rolling process was gathered. The synchronisation of the variables, as well as the allocation of the measurements made on the steel strip, was solved before the modelling phase.
Abstract: Encryption protects communication partners from
disclosure of their secret messages but cannot prevent traffic analysis
and the leakage of information about “who communicates with
whom”. In the presence of collaborating adversaries, this linkability
of actions can endanger anonymity. However, reliably providing
anonymity is crucial in many applications. Especially in
context-aware mobile business, where mobile users equipped with PDAs
request and receive services from service providers, providing
anonymous communication is mission-critical and challenging at the
same time. Firstly, the limited performance of mobile devices does
not allow for heavy use of expensive public-key operations which are
commonly used in anonymity protocols. Moreover, the demands for
security depend on the application (e.g., mobile dating vs. pizza
delivery service), but different users (e.g., a celebrity vs. a normal
person) may even require different security levels for the same
application. Considering both hardware limitations of mobile devices
and different sensitivity of users, we propose an anonymity
framework that is dynamically configurable according to user and
application preferences. Our framework is based on Chaum's mix-net.
We explain the proposed framework, its configuration
parameters for the dynamic behavior and the algorithm to enforce
dynamic anonymity.
Abstract: Company mergers and acquisitions reached their peak
in the twenty-first century. Mergers and acquisitions have become one
of the competitive strategies for external growth. In general, it is
believed that mergers and acquisitions can create synergies. However,
they require complete information technology system and service
integration, especially in the banking industry. Much of the research
has focused on performance evaluation, shareholder equity allocation,
or even the increase in company market value after a merger and
acquisition, whereas few scholars have focused on information system
integration after a merger and acquisition. This study examines the role
of information systems after a merger and acquisition, explaining the
benefits of information system integration using a merger and
acquisition case in the banking industry as an example. In addition, we
discuss factors that affect the performance of information system
integration, and utilize system dynamics to interpret the relationship
among factors that affect information system integration performance
in the banking industry after a merger and acquisition.
Abstract: Most fingerprint recognition techniques are based on minutiae matching and have been well studied. However, this technology still suffers from problems associated with the handling of poor-quality impressions. One problem besetting fingerprint matching is distortion. Distortion changes both geometric position and orientation and leads to difficulties in establishing a match among multiple impressions acquired from the same fingertip. Marking all the minutiae accurately, as well as rejecting false minutiae, is another issue still under research. Our work combines many methods to build a minutia extractor and a minutia matcher. The combination of multiple methods comes from a wide investigation of research papers. Some novel changes are also used in the work: segmentation using morphological operations, improved thinning, false-minutiae removal methods, minutia marking with special consideration of triple branch counting, minutia unification by decomposing a branch into three terminations, and matching in a unified x-y coordinate system after a two-step transformation.
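One standard way to implement the minutia-marking step described above is the crossing-number method, a common choice in the minutiae-extraction literature. The abstract does not specify the exact marking rule used, so the sketch below is illustrative rather than the paper's method.

```python
import numpy as np

def crossing_number(skel, y, x):
    # Crossing number of a pixel in a binary ridge skeleton: half the
    # number of 0/1 transitions around its 8-neighborhood.  CN == 1
    # marks a ridge ending, CN == 3 a bifurcation (branch).
    p = [skel[y - 1, x], skel[y - 1, x + 1], skel[y, x + 1],
         skel[y + 1, x + 1], skel[y + 1, x], skel[y + 1, x - 1],
         skel[y, x - 1], skel[y - 1, x - 1]]
    return sum(abs(p[i] - p[(i + 1) % 8]) for i in range(8)) // 2

def mark_minutiae(skel):
    """Scan a thinned binary image and collect (x, y) positions of
    ridge endings and bifurcations."""
    endings, bifurcations = [], []
    h, w = skel.shape
    for y in range(1, h - 1):        # skip the border, which has no
        for x in range(1, w - 1):    # complete 8-neighborhood
            if skel[y, x]:
                cn = crossing_number(skel, y, x)
                if cn == 1:
                    endings.append((x, y))
                elif cn == 3:
                    bifurcations.append((x, y))
    return endings, bifurcations
```

A branch found this way is exactly the structure that the paper's "minutia unification" step would then decompose into three terminations.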
Abstract: A feature weighting and selection method is proposed
which uses the structure of a weightless neuron and exploits the
principles that govern the operation of Genetic Algorithms and
Evolution. Features are coded onto chromosomes in a novel way
which allows weighting information regarding the features to be
directly inferred from the gene values. The proposed method is
notable in that it addresses several problems concerning
algorithms for feature selection and weighting, while providing
significant advantages such as speed, simplicity, and suitability for
real-time systems.
Abstract: This paper describes the application of a model
predictive controller to the problem of batch reactor temperature
control. Although a great deal of work has been done to improve
reactor throughput using batch sequence control, the control of the
actual reactor temperature remains a difficult problem for many
operators of these processes. Temperature control is important as
many chemical reactions are sensitive to temperature for formation of
desired products. The controller consists of two parts: (1) a
nonlinear control method, GLC (Global Linearizing Control), used
to create a linear model of the system, and (2) a model predictive
controller used to obtain the optimal input control sequence. The
reactor temperature is controlled to track a predetermined
temperature trajectory applied to the batch reactor. To do so, two
input signals are used: electrical power and the flow of coolant in
the coil. Simulation results show that the proposed controller has
remarkable performance in tracking the reference trajectory while
at the same time being robust against noise imposed on the system
output.