Abstract: The sanitary sewerage connection rate has become an important indicator of advanced cities. Following the construction of sanitary sewerage, maintenance and management systems are required to keep pipelines and facilities functioning well. These maintenance tasks often require sewer workers to enter manholes and pipelines, which are confined spaces short of natural ventilation and full of hazardous substances. Workers in sewers can easily be exposed to a risk of adverse health effects. This paper proposes the use of Bayesian belief networks (BBNs) as a higher level of noncarcinogenic health risk assessment for sewer workers. On the basis of epidemiological studies, actual hospital attendance records and expert experience, the BBN is capable of capturing the probabilistic relationships between the hazardous substances in sewers and their adverse health effects, and accordingly inferring the morbidity and mortality of those adverse health effects. The provision of morbidity and mortality rates for the related diseases is more informative and can alleviate the drawbacks of conventional methods.
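As a toy illustration of the kind of inference such a network supports, the sketch below builds a two-node network (exposure level influencing a respiratory effect) with made-up probabilities, then computes a marginal morbidity and a diagnostic posterior. The structure and all numbers are hypothetical, not those of the paper's BBN.

```python
# Minimal two-node Bayesian-network sketch with illustrative
# (hypothetical) probabilities, not data from the study.

# Prior probability that a sewer worker's exposure is "high"
p_high = 0.3

# Conditional probabilities of a respiratory effect given exposure level
p_effect_given_high = 0.25
p_effect_given_low = 0.05

# Marginal morbidity: P(effect) = sum over exposure states
p_effect = p_effect_given_high * p_high + p_effect_given_low * (1 - p_high)

# Diagnostic (backward) inference via Bayes' rule:
# P(high exposure | effect observed)
p_high_given_effect = p_effect_given_high * p_high / p_effect
```

A real BBN would chain many such nodes (several substances, several diseases), but each local update follows this same marginalization and Bayes-rule pattern.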
Abstract: In a complex project environment, project teams face multi-dimensional communication problems that can ultimately lead to project breakdown. Team performance varies between a face-to-face (FTF) environment and groups working remotely in a computer-mediated communication (CMC) environment. A brief review of the Input-Process-Output model suggested by James E. Driskell, Paul H. Radtke and Eduardo Salas in “Virtual Teams: Effects of Technological Mediation on Team Performance" (2003) was carried out to develop the basis of this research. This model theoretically analyzes the effects of technological mediation on team processes such as cohesiveness, status and authority relations, counter-normative behavior and communication. The empirical study described in this paper was undertaken to test the cohesiveness of diverse project teams in a multi-national organization. The study uses both quantitative and qualitative techniques for data gathering and analysis, including interviews and questionnaires for data collection and graphical data representation for analyzing the collected data. Computer-mediated technology may affect team performance through differences in cohesiveness among teams, and this difference may be moderated by factors such as the type of communication environment, the type of task and the temporal context of the team. Based on the reviewed model, sets of hypotheses were devised and tested. This research reports on a study that compared team cohesiveness among virtual teams using CMC and non-CMC communication mediums. The findings suggest that CMC can help virtual teams increase cohesiveness among their members, making CMC an effective medium for increasing productivity and team performance.
Abstract: This paper features the kinematic modelling of a 5-axis stationary articulated robot arm used to perform robotic manipulation tasks in its workspace. A 5-axis articulated robot was first designed entirely from scratch using indigenous components, and a kinematic model was derived; using this model, a pick-and-place task was performed successfully in the robot's workspace. A user-friendly GUI, developed in C++, was used to carry out the manipulation task with the developed mathematical kinematic model. The kinematic model also incorporates obstacle-avoidance algorithms during the pick-and-place operation.
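The paper's full 5-axis model is not reproduced here; a minimal sketch of forward kinematics for a planar two-link arm illustrates the chaining of joint rotations and link lengths that such a model generalises. Link lengths and joint angles below are arbitrary illustrative values.

```python
import math

def fk_2link(theta1, theta2, l1=0.3, l2=0.2):
    """Forward kinematics of a planar 2-link arm (angles in radians).

    A simplified stand-in for a 5-axis model: the end-effector position
    follows by chaining joint rotations and link offsets.
    """
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

# First joint at 0, second bent 90 degrees
x, y = fk_2link(0.0, math.pi / 2)
```

A 5-axis arm extends the same idea to 3D with one homogeneous transform per joint, typically via Denavit-Hartenberg parameters.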
Abstract: To date, English as a Second Language (ESL) educators involved in teaching language and communication to engineering students face an uphill task in developing graduates' communicative competency. This challenge is accentuated by the apparent lack of English for Specific Purposes (ESP) materials for engineering students in the engineering curriculum. As such, most ESL educators are forced to play multiple roles, taking on tasks such as curriculum design, material writing and teaching with limited knowledge of the disciplinary content. Previous research indicates that prospective professional engineers should possess several sub-sets of competency: technical, linguistic oral immediacy, meta-cognitive and rhetorical explanatory competence. Another study revealed that engineering students need to be equipped with technical and linguistic oral immediacy competence. However, little is known about whether these competency needs are in line with educators' perceptions of communicative competence. This paper examines the best mix of communicative competence subsets for engineering students in technical oral presentations. For the purpose of this study, two groups of educators were interviewed: language and communication lecturers involved in teaching a speaking course, and content experts who assess students' technical oral presentations at tertiary level. The findings indicate that these two groups differ in their perceptions.
Abstract: In a previous work, we presented the numerical
solution of the two dimensional second order telegraph partial
differential equation discretized by the centred and rotated five-point
finite difference discretizations, namely the explicit group (EG) and
explicit decoupled group (EDG) iterative methods, respectively. In
this paper, we utilize a domain decomposition algorithm on these
group schemes to divide the tasks involved in solving the same
equation. The objective of this study is to describe the development
of the parallel group iterative schemes under OpenMP programming
environment as a way to reduce the computational costs of the
solution processes using multicore technologies. A detailed performance analysis of the parallel implementations of the point and group iterative schemes is reported and discussed.
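As a point of reference for the group schemes, the sketch below applies the centred five-point stencil in plain serial Python to a simple Laplace model problem. It shows only the stencil itself, not the telegraph equation, the EG/EDG grouping, or the OpenMP parallelisation, and the grid size is illustrative.

```python
# Serial sketch of the centred five-point stencil underlying the EG
# scheme, on a Laplace model problem (Jacobi sweeps).

def jacobi_step(u):
    """One Jacobi sweep: each interior point becomes the average of
    its four neighbours; boundary values are left fixed."""
    n = len(u)
    v = [row[:] for row in u]
    for i in range(1, n - 1):
        for j in range(1, n - 1):
            v[i][j] = 0.25 * (u[i-1][j] + u[i+1][j] + u[i][j-1] + u[i][j+1])
    return v

# 5x5 grid, top edge held at 1.0, other boundaries at 0.0
n = 5
u = [[0.0] * n for _ in range(n)]
u[0] = [1.0] * n
for _ in range(200):
    u = jacobi_step(u)
```

In the group methods, several neighbouring points are updated together from one small linear system, which is what makes the domain-decomposed, multi-threaded version effective.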
Abstract: This paper presents an approach based on the adoption of a distributed cognition framework and a nonparametric multicriteria evaluation methodology (Data Envelopment Analysis, DEA) designed specifically to compare e-commerce websites from the consumer/user viewpoint. In particular, the framework considers a website's relative efficiency as a measure of its quality and usability. A website is modelled as a black box capable of providing the consumer/user with a set of functionalities. When the consumer/user interacts with the website to perform a task, he/she is involved in a cognitive activity, sustaining a cognitive cost to search, interpret and process information, and experiencing a sense of satisfaction. The degree of ambiguity and uncertainty he/she perceives and the required search time determine the size of the effort, and hence the cognitive cost, he/she has to sustain to perform the task. Conversely, performing the task and achieving the result induce a sense of gratification, satisfaction and usefulness. In total, 9 variables are measured, classified into a set of 3 website macro-dimensions (user experience, site navigability and structure). The framework is applied to compare 40 websites of businesses performing electronic commerce in the information technology market. A questionnaire to collect subjective judgements on the websites in the sample was purposely designed and administered to 85 university students enrolled in computer science and information systems engineering undergraduate courses.
Abstract: In this paper we present a combined
hashing/watermarking method for image authentication. A robust
image hash, invariant to legitimate modifications but fragile to illegitimate ones, is generated from the local image characteristics. To increase the security of the system, the watermark is generated using the image hash as a key. Quantization Index Modulation (QIM) of DCT coefficients is used for watermark embedding.
Watermark detection is performed without use of the original image.
Experimental results demonstrate the effectiveness of the presented
method in terms of robustness and fragility.
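A minimal sketch of Quantization Index Modulation on a single coefficient may clarify the embedding step: the watermark bit selects one of two interleaved quantizer lattices, and detection picks the nearer lattice. The step size `delta` and the coefficient value are illustrative, not the paper's parameters.

```python
# QIM on one (e.g. DCT) coefficient: bit 0 and bit 1 use quantizer
# lattices offset from each other by delta / 2.

def qim_embed(coeff, bit, delta=8.0):
    """Quantize coeff onto the lattice selected by the bit."""
    offset = bit * delta / 2.0
    return round((coeff - offset) / delta) * delta + offset

def qim_detect(coeff, delta=8.0):
    """Blind detection: choose the bit whose lattice is closest."""
    d0 = abs(coeff - qim_embed(coeff, 0, delta))
    d1 = abs(coeff - qim_embed(coeff, 1, delta))
    return 0 if d0 <= d1 else 1

marked = qim_embed(37.3, 1)   # embed bit 1 into coefficient 37.3
```

Detection needs no original image, matching the blind-detection property stated in the abstract; perturbations smaller than delta/4 leave the detected bit unchanged.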
Abstract: Pipe inspection is a difficult detection task. Most applications rely mainly on manual recognition of defective areas, with detection carried out by an engineer. An automated process therefore becomes necessary in order to avoid the cost incurred by such a manual process. An automated monitoring method to obtain a complete picture of the sewer condition is proposed in this work. The focus of the research is the automated identification and classification of discontinuities in the internal surface of the pipe. The methodology consists of several processing stages, including image segmentation into potential defect regions and extraction of geometrical characteristic features. Automatic recognition and classification of pipe defects are carried out by means of an artificial neural network (ANN) technique based on Radial Basis Functions (RBF). Experiments in a realistic environment have been conducted and results are presented.
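The classification stage can be sketched as the forward pass of a small RBF network: each hidden unit responds to the distance between the feature vector and its centre, and a weighted sum of the Gaussian responses scores the defect class. The centres, widths, weights and feature meanings below are illustrative, not trained values from the paper.

```python
import math

def rbf_score(x, centers, widths, weights):
    """Output of an RBF network for feature vector x: a weighted sum
    of Gaussian responses, one per hidden unit."""
    total = 0.0
    for c, s, w in zip(centers, widths, weights):
        d2 = sum((xi - ci) ** 2 for xi, ci in zip(x, c))
        total += w * math.exp(-d2 / (2 * s * s))
    return total

# Two hidden units; sign of the score separates "defect" from "clean"
centers = [(0.8, 0.9), (0.1, 0.2)]   # hypothetical prototype features
widths = [0.5, 0.5]
weights = [1.0, -1.0]
```

In practice the centres would come from clustering segmented defect regions and the weights from training on labelled pipe images.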
Abstract: The design of a pattern classifier includes an attempt
to select, among a set of possible features, a minimum subset of
weakly correlated features that better discriminate the pattern classes.
This is usually a difficult task in practice, normally requiring the
application of heuristic knowledge about the specific problem
domain. The selection and quality of the features representing each
pattern have a considerable bearing on the success of subsequent
pattern classification. Feature extraction is the process of deriving
new features from the original features in order to reduce the cost of
feature measurement, increase classifier efficiency, and allow higher
classification accuracy. Many current feature extraction techniques
involve linear transformations of the original pattern vectors to new
vectors of lower dimensionality. While this is useful for data
visualization and increasing classification efficiency, it does not
necessarily reduce the number of features that must be measured
since each new feature may be a linear combination of all of the
features in the original pattern vector. In this paper, a new approach to feature extraction is presented in which feature selection, feature extraction, and classifier training are performed simultaneously using a genetic algorithm. In this approach each feature value is first normalized by a linear equation, then scaled by the associated weight prior to training, testing, and classification. A knn classifier is used to evaluate each set of feature weights. The genetic algorithm optimizes a vector of feature weights, which are used to scale the individual features in the original pattern vectors in either a linear or a nonlinear fashion. With this approach, the number of features used in classification can be greatly reduced.
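The fitness evaluation at the heart of such an approach can be sketched as follows: a candidate weight vector scales each feature inside the knn distance, and leave-one-out accuracy serves as the fitness the genetic algorithm would maximise. The GA loop itself is omitted, and the data and weights are illustrative.

```python
import math

def knn_accuracy(weights, data, labels, k=1):
    """Leave-one-out accuracy of a knn classifier on weight-scaled
    features; this would be the GA's fitness for one weight vector."""
    def dist(a, b):
        return math.sqrt(sum(w * (x - y) ** 2
                             for w, x, y in zip(weights, a, b)))
    correct = 0
    for i, p in enumerate(data):
        neighbours = sorted(
            (dist(p, q), labels[j]) for j, q in enumerate(data) if j != i
        )[:k]
        votes = [lab for _, lab in neighbours]
        if max(set(votes), key=votes.count) == labels[i]:
            correct += 1
    return correct / len(data)

# Toy data: feature 0 separates the classes, feature 1 is pure noise
data = [(0.0, 5.0), (0.1, -3.0), (1.0, 4.9), (1.1, -3.1)]
labels = [0, 0, 1, 1]
```

A weight driven near zero effectively deletes its feature, which is how simultaneous selection and extraction emerge from the same weight vector.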
Abstract: The demand for higher-performance graphics continues to grow because of the incessant desire for realism, and rapid advances in fabrication technology have enabled us to build several processor cores on a single die. Hence, it is important to develop single-chip parallel architectures for such data-intensive
applications. In this paper, we propose an efficient PIM (processing-in-memory) architecture tailored for computer graphics, which requires a large number of memory accesses. We then address the two important tasks necessary for maximally exploiting the parallelism provided by the architecture, namely the partitioning and placement of graphics data, which respectively affect load balance and communication cost. Under the constraint of uniform partitioning, we develop approaches for optimal partitioning and placement which significantly reduce the search space.
We also present heuristics for identifying near-optimal placements, since the search space for placement is impractically large despite our optimization. We then demonstrate the effectiveness of our partitioning and placement approaches via analysis of example scenes; simulation results show considerable search-space reductions, and our placement heuristics perform close to optimal – the average ratio of communication overheads between our heuristics and the optimum was 1.05. Our uniform partitioning showed an average load-balance ratio of 1.47 for geometry processing and 1.44 for rasterization, which is reasonable.
Abstract: There are several means to measure the oxidation of edible oils, such as the acid value, the peroxide value, and the anisidine value. However, these methods require large quantities of reagents and are time-consuming. Therefore, a more convenient and time-saving way to measure the oxidation of edible oils is required. In this report, an edible oil condition sensor was fabricated using single-walled carbon nanotubes (SWNTs). In order to test the sensor, oxidized edible oils, each at a different acid value, were prepared. The SWNT sensors were immersed in these oxidized oils and the resistance changes in the sensors were measured. It was found that the conductivity of the sensors decreased as the oxidation level of the oil increased. This result suggests that a change in the oil components induced by the oxidation process in edible oils is related to the conductivity change in the SWNT sensor.
Abstract: Protective clothing limits heat transfer and hampers task performance due to its increased weight. Military protective clothing enables humans to operate in adverse environments. In the selection and evaluation of military protective clothing, attention should be given to heat strain, ergonomic and fit issues in addition to the actual protection it offers.
Fifty healthy male subjects participated in the study. The subjects were dressed in shorts, T-shirts, socks, sneakers and four different kinds of military protective clothing: CS, CSB, CS with NBC protection, and CS with NBC protection added. The ergonomic and physiological strain of each of the four outfits was investigated while the subjects walked on a treadmill (7 km/hour) with a 19.7 kg backpack. The tests showed that the highest heart rate was found wearing the NBC-protection-added outfit; the highest temperatures were observed wearing NBC protection added, followed respectively by CS with NBC protection, CSB and CS; and the highest value for thermal comfort (implying the worst thermal comfort) was also observed wearing NBC protection added.
Abstract: Speckle noise affects all coherent imaging systems
including medical ultrasound. In medical images, noise suppression
is a particularly delicate and difficult task. A tradeoff between noise
reduction and the preservation of actual image features has to be made
in a way that enhances the diagnostically relevant image content.
Even though wavelets have been extensively used for denoising
speckle images, we have found that denoising using contourlets gives
much better performance in terms of SNR, PSNR, MSE, variance and
correlation coefficient. The objective of the paper is to determine the number of levels of Laplacian pyramidal decomposition, the number of directional decompositions to perform on each pyramidal level, and the thresholding schemes that yield optimal despeckling of medical ultrasound images in particular. In the proposed method, the log-transformed original ultrasound image is subjected to the contourlet transform to obtain contourlet coefficients. The transformed image is denoised by applying thresholding techniques to the individual band-pass sub-bands using a Bayes shrinkage rule. We quantify the
achieved performance improvement.
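The thresholding step can be sketched with the standard BayesShrink rule, applied here to a plain list of coefficients rather than actual contourlet sub-bands: the threshold is the noise variance divided by the estimated signal standard deviation, followed by soft thresholding. The sub-band values below are illustrative.

```python
import math

def bayes_shrink_threshold(coeffs, noise_sigma):
    """BayesShrink threshold T = sigma_n^2 / sigma_x for one sub-band,
    with sigma_x estimated from the sub-band variance."""
    var_band = sum(c * c for c in coeffs) / len(coeffs)
    sigma_x = math.sqrt(max(var_band - noise_sigma ** 2, 0.0))
    if sigma_x == 0.0:
        # pure-noise sub-band: threshold everything away
        return max(abs(c) for c in coeffs)
    return noise_sigma ** 2 / sigma_x

def soft_threshold(c, t):
    """Shrink |c| by t, preserving the sign (soft thresholding)."""
    return math.copysign(max(abs(c) - t, 0.0), c)
```

In the paper's pipeline this shrinkage would be applied per directional sub-band of the contourlet decomposition before inverting the transform and exponentiating.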
Abstract: In recent years, the number of applications of multi-robot systems (MRS) has been growing in various areas. In practice, however, their design is often difficult: algorithms are proposed on a theoretical basis and do not consider the errors and noise present in real conditions, so they are not usable in a real environment. These errors are clearly visible in the task of target localization, where robots try to find and estimate the position of a target using their sensors. Target localization is also possible with a single robot but, as was examined, a group of mobile robots can find and localize the target more accurately and faster. The accuracy of the target position estimate is achieved through cooperation of the MRS with particle filtering. The advantage of using an MRS with particle filtering was tested on the task of fixed-target localization by a group of mobile robots.
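A minimal sketch of the cooperative estimation idea, under assumed Gaussian range sensors: several robots measure their distance to a fixed target, particles spread over the workspace are weighted by the joint measurement likelihood, and the weighted mean gives the position estimate. Robot poses, the noise level and the workspace size are all illustrative.

```python
import math
import random

random.seed(1)

TARGET = (3.0, 4.0)                    # ground truth, unknown to the filter
ROBOTS = [(0.0, 0.0), (6.0, 0.0), (0.0, 8.0)]
NOISE = 0.2                            # range-sensor noise std deviation

def measure(robot):
    """Simulated noisy range measurement from one robot to the target."""
    return math.dist(robot, TARGET) + random.gauss(0.0, NOISE)

# 1) Spread particles uniformly over a 10 x 10 workspace
particles = [(random.uniform(0, 10), random.uniform(0, 10))
             for _ in range(5000)]

# 2) Weight each particle by the likelihood of every robot's measurement
ranges = [measure(r) for r in ROBOTS]
weights = []
for p in particles:
    w = 1.0
    for robot, z in zip(ROBOTS, ranges):
        err = math.dist(robot, p) - z
        w *= math.exp(-err * err / (2 * NOISE ** 2))
    weights.append(w)

# 3) Weighted-mean estimate of the target position
s = sum(weights)
est_x = sum(w * p[0] for w, p in zip(weights, particles)) / s
est_y = sum(w * p[1] for w, p in zip(weights, particles)) / s
```

With a single robot one range measurement only constrains the target to a circle; fusing several robots' measurements in the same weight update is what sharpens the estimate.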
Abstract: Frequent patterns are patterns, such as sets of features or items, that appear in data frequently. Finding such frequent patterns has become an important data mining task because it reveals associations, correlations, and many other interesting relationships hidden in a dataset. Most of the proposed frequent pattern mining algorithms have been implemented in imperative programming languages such as C, C# and Java. The imperative paradigm is significantly inefficient when the itemset is large and the frequent pattern is long. We suggest a high-level declarative style of programming using a functional language. Our supposition is that the problem of frequent pattern discovery can be efficiently and concisely implemented via a functional paradigm, since pattern matching is a fundamental feature supported by most functional languages. Our frequent pattern mining implementation in the Haskell language confirms our hypothesis about the conciseness of the program. The performance studies on speed and memory usage support our intuition about the efficiency of functional languages.
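The paper's implementation is in Haskell; the sketch below expresses the same levelwise (Apriori-style) discovery in a functional style in Python, to illustrate the algorithmic idea rather than the paper's code. The transactions are an illustrative toy dataset.

```python
from itertools import combinations

def frequent_itemsets(transactions, minsup):
    """Levelwise frequent-itemset mining: grow candidate size k,
    keep itemsets whose support meets minsup, prune dead items."""
    items = sorted({i for t in transactions for i in t})
    support = lambda s: sum(1 for t in transactions if s <= t)
    result = {}
    k = 1
    while True:
        candidates = [frozenset(c) for c in combinations(items, k)]
        frequent = {c: support(c) for c in candidates
                    if support(c) >= minsup}
        if not frequent:
            return result
        result.update(frequent)
        # only items occurring in some frequent k-itemset can appear
        # in a frequent (k+1)-itemset (anti-monotonicity of support)
        items = sorted({i for c in frequent for i in c})
        k += 1

data = [frozenset("abc"), frozenset("abd"), frozenset("ab"), frozenset("cd")]
fi = frequent_itemsets(data, 2)
```

The candidate generation here is deliberately naive; real Apriori joins frequent (k-1)-itemsets, and FP-growth avoids candidate generation entirely, but the support-based pruning is the same.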
Abstract: Grid computing is a form of distributed computing
that involves coordinating and sharing computational power, data
storage and network resources across dynamic and geographically
dispersed organizations. Scheduling onto the Grid is NP-complete,
so there is no best scheduling algorithm for all grid computing
systems. An alternative is to select an appropriate scheduling algorithm for a given grid environment based on the characteristics of the tasks, machines and network connectivity. Job and resource scheduling is one of the key research areas in grid computing. The goal of scheduling is to achieve the highest possible system throughput and to match the application's needs with the available computing resources. The motivation of this survey is to help researchers new to the field of grid computing understand the concept of scheduling easily and contribute to developing more efficient scheduling algorithms. This should benefit interested researchers in carrying out further work in this thrust area of research.
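As one concrete example of the algorithms such a survey covers, the sketch below implements the classic min-min heuristic on an illustrative expected-time-to-compute (ETC) matrix: at each step the unscheduled task with the smallest minimum completion time is assigned to the machine achieving it. The ETC values are made up.

```python
# Min-min heuristic: ETC[t][m] = expected time of task t on machine m.

def min_min(etc):
    n_machines = len(etc[0])
    ready = [0.0] * n_machines            # machine ready times
    unscheduled = set(range(len(etc)))
    schedule = {}
    while unscheduled:
        # over all (task, machine) pairs, pick the smallest completion time
        ct, t, m = min((ready[m] + etc[t][m], t, m)
                       for t in unscheduled for m in range(n_machines))
        schedule[t] = m
        ready[m] = ct
        unscheduled.remove(t)
    return schedule, max(ready)           # task-to-machine map, makespan

etc = [[4.0, 2.0],
       [3.0, 6.0],
       [5.0, 5.0]]
schedule, makespan = min_min(etc)
```

Min-min favours short tasks and tends toward good makespan on heterogeneous machines; max-min and sufferage are the usual counterpoints when long tasks would otherwise be starved.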
Abstract: In this article, a formal specification and verification of the Rabin public-key scheme in a formal proof system is presented. The idea is to use the two views of cryptographic verification: the computational approach, relying on the vocabulary of probability theory and complexity theory, and the formal approach, based on ideas and techniques from logic and programming languages. A major objective of this article is the presentation of the first computer-proved implementation of the Rabin public-key scheme in Isabelle/HOL. Moreover, we explicate a (computer-proven) formalization of correctness as well as a computer verification of security properties using a straightforward computation model in Isabelle/HOL. The analysis uses a given database to prove formal properties of our implemented functions with computer support. The main task in designing a practical formalization of correctness, as well as efficient computer proofs of security properties, is to cope with the complexity of cryptographic proving. We reduce this complexity by exploring a lightweight formalization that enables both appropriate formal definitions and efficient formal proofs. Consequently, we obtain reliable proofs with a minimal error rate that augment the used database, which provides a formal basis for further computer-proof constructions in this area.
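The object being formalized, the Rabin scheme itself, can be sketched executably with toy parameters (ordinary Python, not the Isabelle/HOL formalization): encryption squares the message modulo n = p*q, and decryption recovers the four modular square roots via the Chinese Remainder Theorem. The primes below are far too small for real use.

```python
# Rabin public-key scheme with toy parameters.

p, q = 7, 11                 # secret primes, both congruent to 3 mod 4
n = p * q                    # public modulus

def encrypt(m):
    """Ciphertext is the square of the message mod n."""
    return (m * m) % n

def decrypt(c):
    """Return the four candidate plaintexts (square roots of c mod n)."""
    # For p, q = 3 mod 4, a square root mod p is c^((p+1)/4) mod p
    mp = pow(c, (p + 1) // 4, p)
    mq = pow(c, (q + 1) // 4, q)
    roots = set()
    for sp in (mp, p - mp):
        for sq in (mq, q - mq):
            # combine the residues with the Chinese Remainder Theorem
            r = (sp * q * pow(q, -1, p) + sq * p * pow(p, -1, q)) % n
            roots.add(r)
    return roots

c = encrypt(20)
```

The four-root ambiguity (resolved in practice with message redundancy) and the equivalence of decryption to factoring n are exactly the correctness and security properties a formalization must capture.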
Abstract: This paper presents the architecture of current filesystem implementations as well as our new filesystem SpadFS and the operating system Spad, with a rewritten VFS layer, targeted at high-performance I/O applications. The paper presents microbenchmarks and real-world benchmarks of different filesystems on the same kernel, as well as benchmarks of the same filesystem on different kernels – enabling the reader to conclude how much the performance of various tasks is affected by the operating system and how much by the physical layout of data on disk. The paper describes our novel features – most notably continuous allocation of directories and cross-file readahead – and shows their impact on performance.
Abstract: Software maintenance, which involves making enhancements, modifications and corrections to existing software systems, consumes more than half of developer time. Specification comprehensibility plays an important role in software maintenance as it permits the understanding of the system properties more easily and quickly. The use of formal notation such as B increases a specification's precision and consistency. However, the notation is regarded as being difficult to comprehend. Semi-formal notation such as the Unified Modelling Language (UML) is perceived as more accessible, but it lacks formality. Perhaps combining both notations could produce a specification that is not only accurate and consistent but also accessible to users. This paper presents an experiment conducted on a model that integrates the use of both UML and B notations, namely UML-B, versus a B model alone. The objective of the experiment was to evaluate the comprehensibility of a UML-B model compared to a traditional B model. The measurement used in the experiment focused on the efficiency in performing the comprehension tasks. The experiment employed a cross-over design and was conducted on forty-one subjects, including undergraduate and master's students. The results show that the notation used in the UML-B model is more comprehensible than the B model.
Abstract: One major difficulty facing developers of concurrent and distributed software is the analysis of concurrency-based faults such as deadlocks. Petri nets are used extensively in the
verification of correctness of concurrent programs. ECATNets are a
category of algebraic Petri nets based on a sound combination of
algebraic abstract types and high-level Petri nets. ECATNets have
'sound' and 'complete' semantics because of their integration in
rewriting logic and its programming language Maude. Rewriting logic is considered one of the most powerful logics for the description, verification and programming of concurrent systems. We previously proposed a method for translating Ada-95 tasking programs into the ECATNets formalism (Ada-ECATNet) and showed that the ECATNets formalism provides a more compact translation for Ada programs than other approaches based on simple Petri nets or Coloured Petri nets. We also showed previously how the ECATNet formalism offers Ada many validation and verification tools, such as simulation, model checking, reachability analysis and static analysis. In this paper, we describe the implementation of our translation of Ada programs into ECATNets.