Abstract: Proof-of-concept experiments were conducted to determine the feasibility of using small amounts of Dissolved Sulphur (DS) from the gaseous phase to precipitate platinum ions in chloride media. Two sets of precipitation experiments were performed in which the source of sulphur atoms was either a thiosulphate solution (Na2S2O3) or sulphur dioxide gas (SO2). In the liquid-liquid (L-L) system, complete precipitation of Pt was achieved at small dosages of Na2S2O3 (0.01 – 1.0 M) within 3 – 5 minutes. On the basis of this result, gas absorption tests were carried out mainly to achieve a sulphur solubility equivalent to 0.018 M. The idea that large amounts of precious metals could be recovered selectively from their dilute solutions by utilizing waste SO2 streams at low pressure is attractive from both economic and environmental points of view. Therefore, the mass transfer characteristics of SO2 gas associated with reactive absorption across the gas-liquid (G-L) interface were evaluated under different conditions of pressure (0.5 – 2 bar), solution temperature (20 – 50 °C) and acid strength (1 – 4 M HCl). This paper concludes with information about the selective precipitation of Pt in the presence of cations (Fe2+, Co2+, and Cr3+) in a CSTR and recommendations for scaling up laboratory data to industrial pilot-scale operations.
Abstract: This paper focuses on analyzing medical diagnostic data using classification rules in data mining and context reduction in formal concept analysis. It helps in finding redundancies among the various medical examination tests used in the diagnosis of a disease. Classification rules have been derived from positive and negative association rules using the concept lattice structure of Formal Concept Analysis. The context reduction technique of Formal Concept Analysis, along with the classification rules, has been used to find redundancies among the various medical examination tests. It also determines whether expensive medical tests can be replaced by cheaper ones.
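The context-reduction step described above can be sketched as follows (a hypothetical illustration, not the paper's data or implementation; the test names and patient ids are invented): in Formal Concept Analysis, an attribute (here, a medical test) is reducible when its extent, the set of patients with a positive result, equals the intersection of the extents of other attributes, so that test adds no diagnostic information.

```python
def reducible_attributes(context):
    """context: dict mapping attribute name -> set of object ids (its extent).
    An attribute is reducible if its extent equals the intersection of the
    extents of the other attributes that contain it."""
    reducible = []
    for attr, extent in context.items():
        supersets = [e for a, e in context.items() if a != attr and extent <= e]
        if supersets and set.intersection(*supersets) == extent:
            reducible.append(attr)
    return reducible

# Toy context: patients 1-4 versus positive test results (hypothetical data).
context = {
    "test_A": {1, 2, 3},
    "test_B": {2, 3, 4},
    "test_C": {2, 3},  # extent equals extent(test_A) & extent(test_B)
}
print(reducible_attributes(context))  # test_C is redundant
```

In this toy context, test_C could be dropped (or, in the paper's terms, an expensive test could be replaced) because its outcome is implied by the other two tests.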
Abstract: Nowadays there are more than thirty maturity models in different knowledge areas. A maturity model helps organizations find out where they stand in a specific knowledge area and how to improve. As Information Resource Management (IRM) is the concept that information is a major corporate resource and must be managed using the same basic principles used to manage other assets, assessing the current IRM status and revealing points for improvement can play a critical role in developing an appropriate information structure in organizations. In this paper we propose a framework for an information resource management maturity model (IRM3) that includes ten best practices for the maturity assessment of an organization's IRM.
Abstract: Tanzania is a developing country, which significantly lags behind the rest of the world in information communications technology (ICT), especially for the Internet. Internet connectivity to the rest of the world is via expensive satellite links, thus leaving the majority of the population unable to access the Internet due to the high cost. This paper introduces the concept of an optical WDM network for Internet infrastructure in Tanzania, so as to reduce Internet connection costs, and provide Internet access to the majority of people who live in both urban and rural areas. We also present a proposed optical WDM network, which mitigates the effects of system impairments, and provide simulation results to show that the data is successfully transmitted over a longer distance using a WDM network.
Abstract: A great deal of research in the field of information systems security has been based on a positivist paradigm. Applying the reductionist concept of the positivist paradigm to information security means missing the bigger picture; this lack of holism could be one of the reasons why security is still overlooked, comes as an afterthought, or is perceived from a purely technical dimension. We need to reshape our thinking and attitudes towards security, especially in a complex and dynamic environment such as e-Business, to develop a holistic understanding of e-Business security in relation to its context and to all the stakeholders in the problem area. In this paper we argue for the suitability of, and the need for, a more inductive, interpretive approach and qualitative research methods to investigate e-Business security. Our discussion is based on a holistic framework of enquiry, the nature of the research problem, the underlying theoretical lens and the complexity of the e-Business environment. Finally, we present a research strategy for developing a holistic framework for understanding e-Business security problems in the context of developing countries, based on an interdisciplinary inquiry that considers their needs and requirements.
Abstract: A product goes through various processes in a production flow, also known as an assembly line in manufacturing process management. Toyota created a new concept, known as the lean concept, in the manufacturing industry; today it is the leading model in manufacturing plants across the globe. The linear walking-worker assembly line is a flexible assembly system in which each worker travels down the line carrying out each assembly task at each station, so that each worker accomplishes the assembly of a unit from start to finish. This paper attempts to combine the flexibility of the walking worker with lean principles in order to quantify the benefits of applying the shop-floor principles of lean management.
Abstract: Classifying data hierarchically is an efficient approach to analyzing data. Data is usually classified into multiple categories, or annotated with a set of labels. To analyze multi-labeled data, such data must be specified by giving a set of labels as a semantic range. Data is analyzed for certain purposes. This paper shows which multi-labeled data should be the target of analysis for those purposes, and discusses the role of a single label against a set of labels by investigating the change that occurs when a label is added to the set. These discussions yield methods for the advanced analysis of multi-labeled data, based on the role of a label against a semantic range.
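One plausible reading of a semantic range (an illustrative sketch under assumed conventions, not the paper's formal definition; the record ids and labels are invented) is a set of labels that selects every multi-labeled record carrying all of those labels; adding a label to the range then exposes that label's role through how the selection changes.

```python
def in_range(records, semantic_range):
    """Select the ids of multi-labeled records whose label set
    contains every label of the semantic range."""
    return {rid for rid, labels in records.items() if semantic_range <= labels}

# Hypothetical multi-labeled records.
records = {
    "d1": {"sports", "news"},
    "d2": {"sports"},
    "d3": {"sports", "news", "video"},
}
print(in_range(records, {"sports"}))          # all three records
print(in_range(records, {"sports", "news"}))  # adding "news" narrows the range
```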
Abstract: Applying the idea of soft set theory to lattice implication algebras, the novel concept of (implicative) filteristic soft lattice implication algebras, which are related to (implicative) filters (for short, (IF-)F-soft lattice implication algebras), is introduced. Basic properties of (IF-)F-soft lattice implication algebras are derived. Two kinds of fuzzy filters of L (i.e., (∈, ∈∨qk)-fuzzy (implicative) filters) are introduced, which are generalizations of fuzzy (implicative) filters. Some characterizations for a soft set to be an (IF-)F-soft lattice implication algebra are provided. Analogously, this idea can be applied to other types of filteristic lattice implication algebras (such as fantastic or positive implicative filteristic soft lattice implication algebras).
Abstract: Existing image-based virtual reality applications allow users to view an image-based 3D virtual environment in a more interactive manner. Users can "walk through"; look left, right, up and down; and even zoom into objects in these virtual worlds of images. However, what the user sees during a "zoom in" is just a close-up view of the same image, which was taken from a distance. Thus, this does not give the user an accurate view of the object from the actual distance. In this paper, a simple technique for zooming in on an object in a virtual scene is presented. The technique is based on the 'hotspot' concept in existing applications. Instead of navigating between two different locations, the hotspots are used to focus on an object in the scene. For each object, several hotspots are created, and a different picture is taken for each hotspot. Each consecutive hotspot takes the user closer to the object. This provides the user with a correct view of the object based on his proximity to it. Implementation issues and the relevance of this technique to potential application areas are highlighted.
Abstract: There exists an injective, information-preserving function
that maps a semantic network (i.e., a directed labeled network)
to a directed network (i.e. a directed unlabeled network). The edge
label in the semantic network is represented as a topological feature
of the directed network. Also, there exists an injective function that
maps a directed network to an undirected network (i.e. an undirected
unlabeled network). The edge directionality in the directed network
is represented as a topological feature of the undirected network.
Through function composition, there exists an injective function that
maps a semantic network to an undirected network. Thus, aside from
space constraints, the semantic network construct does not have any
modeling functionality that is not possible with either a directed
or undirected network representation. Two proofs of this idea will
be presented. The first is a proof of the aforementioned function
composition concept. The second is a simpler proof involving an
undirected binary encoding of a semantic network.
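One simple encoding in the spirit of the first proof can be sketched as follows (a sketch under assumed conventions, not the paper's exact construction; the edge list and label ordering are invented): each labeled arc is replaced by a directed path whose length encodes the label, so the label survives as a purely topological feature of an unlabeled directed network.

```python
def encode(semantic_edges, labels):
    """Map labeled arcs (u, label, v) to an unlabeled directed network:
    each arc becomes a directed path u -> ... -> v whose number of
    anonymous intermediate nodes equals the label's index in a fixed
    ordering, so the label is recoverable from path length alone.
    (A fully rigorous injective mapping must also distinguish intermediate
    nodes from original nodes; the paper's proof handles that.)"""
    index = {lab: i + 1 for i, lab in enumerate(labels)}
    arcs, fresh = [], 0
    for u, lab, v in semantic_edges:
        prev = u
        for _ in range(index[lab]):
            mid = ("mid", fresh)
            fresh += 1
            arcs.append((prev, mid))
            prev = mid
        arcs.append((prev, v))
    return arcs

# Hypothetical semantic network with two edge labels.
edges = [("a", "cites", "b"), ("a", "extends", "c")]
arcs = encode(edges, ["cites", "extends"])
print(arcs)  # "cites" yields a 2-arc path, "extends" a 3-arc path
```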
Abstract: Rutting is one of the major load-related distresses in airport flexible pavements. Rutting in paving materials develops gradually with an increasing number of load applications, usually appearing as longitudinal depressions in the wheel paths, possibly accompanied by small upheavals to the sides. Significant research has been conducted to determine the factors that affect rutting and how they can be controlled. Using experimental design concepts, a series of tests can be conducted while varying the levels of the different parameters that could cause rutting in airport flexible pavements. If a proper experimental design is used, the results obtained from these tests can give better insight into the causes of rutting and into the presence of interactions and synergisms among the system variables that influence rutting. Although laboratory experiments are traditionally conducted in a controlled fashion to understand the statistical interaction of variables in such situations, this study attempts to identify the critical system variables influencing airport flexible pavement rut depth from a statistical design-of-experiments (DoE) perspective using real field data from a full-scale test facility. The test results strongly indicate that the response (rut depth) contains too much noise to allow determination of a good model. From a statistical DoE perspective, two major changes are proposed for this experiment: (1) actual replication of the tests is definitely required, and (2) nuisance variables need to be identified and blocked properly. Further investigation is necessary to determine possible sources of noise in the experiment.
Abstract: The purpose of this paper is to summarize scour protection countermeasures that use Bentonite-Enhanced Sand (BES) mixtures. The concept of ground improvement is used in this study to reduce the voids in the sand. The sand-bentonite mixture was used to bond the ground soil conditions surrounding the piles of an integral bridge. The right composition of the sand-bentonite mixture was proposed based on previous findings. The swelling effect of bentonite was also investigated to ensure there is no adverse impact on the structure of the integral bridge. Scour, another name for severe erosion, occurs when the erosive capacity of water resulting from natural and manmade events exceeds the ability of earth materials to resist its effects. According to the AASHTO LRFD Specifications (Section C3.7.5), scour is the most common cause of the collapse of highway bridges in the United States.
Abstract: This research uses computational linguistics, an area of study that employs a computer to process natural language, and aims at discerning the patterns that exist in declarative sentences used in technical texts. The approach is mathematical, and the focus is on instructional texts found on web pages. The technique developed by the author, named the MAYA Semantic Technique, is organized into four stages. In the first stage, the parts of speech in each sentence are identified. In the second stage, the subject of the sentence is determined. In the third stage, MAYA performs a frequency analysis on the remaining words to determine the verb and its object. In the fourth stage, MAYA performs statistical analysis to determine the content of the web page. The advantage of the MAYA Semantic Technique lies in its use of mathematical principles to represent grammatical operations, which improves processing and accuracy when performed on unambiguous text. The MAYA Semantic Technique is part of a proposed architecture for an entire web-based intelligent tutoring system. On a sample set of sentences, partial semantics derived using the MAYA Semantic Technique were approximately 80% accurate. The system currently processes technical text in one domain, namely Cµ programming. In this domain all the keywords and programming concepts are known and understood.
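The frequency analysis used in the third and fourth stages can be sketched in miniature (an illustrative toy with an assumed stopword list and invented sentences; the actual MAYA Semantic Technique is considerably more elaborate): count the non-stopword terms across a page's sentences and rank them to estimate the dominant content terms.

```python
import re
from collections import Counter

# Assumed, deliberately tiny stopword list for the sketch.
STOPWORDS = {"the", "a", "an", "to", "of", "is", "in", "and", "it"}

def term_frequencies(sentences):
    """Count non-stopword word occurrences across a list of sentences."""
    counts = Counter()
    for s in sentences:
        for w in re.findall(r"[a-z]+", s.lower()):
            if w not in STOPWORDS:
                counts[w] += 1
    return counts

# Hypothetical instructional sentences from a technical web page.
page = [
    "Declare the variable before the loop.",
    "The loop increments the variable.",
]
print(term_frequencies(page).most_common(2))
```

The highest-frequency terms ("variable" and "loop" here) would feed the statistical stage that characterizes the page's content.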
Abstract: The optimal design of a PI controller for Automatic Generation Control in a two-area system is presented in this paper. The concept of dual-mode control is applied in the PI controller: the proportional mode is made active when the rate of change of the error is sufficiently larger than a specified limit; otherwise the controller switches to the integral mode. A digital simulation is used in conjunction with the Hooke-Jeeves optimization technique to determine the optimum parameters (the individual gains of the proportional and integral controllers) of the PI controller. The Integrated Square of the Error (ISE), Integrated Time multiplied by Absolute Error (ITAE), and Integrated Absolute Error (IAE) performance indices are considered to measure the appropriateness of the designed controller. The proposed controller is tested on a two-area single non-reheat thermal system considering practical aspects of the problem such as deadband and Generation Rate Constraint (GRC). Simulation results show that the dual mode with optimized gain values improves control performance compared with the commonly used variable structure controller.
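The dual-mode switching rule and the three performance indices can be sketched as follows (a minimal illustration; the gains, rate threshold, sampling step and error sequence are assumptions for the sketch, not the Hooke-Jeeves-optimized values from the paper):

```python
def dual_mode_pi(errors, dt, kp=0.5, ki=0.2, rate_limit=0.1):
    """Dual-mode PI: proportional action while |de/dt| exceeds rate_limit,
    integral action otherwise."""
    u, integral, prev_e = [], 0.0, errors[0]
    for e in errors:
        rate = (e - prev_e) / dt
        if abs(rate) > rate_limit:      # fast transient: proportional mode
            u.append(kp * e)
        else:                           # slow drift: integral mode
            integral += e * dt
            u.append(ki * integral)
        prev_e = e
    return u

def indices(errors, dt):
    """Discrete ISE, ITAE and IAE for a sampled error signal."""
    ise = sum(e * e * dt for e in errors)
    itae = sum((i * dt) * abs(e) * dt for i, e in enumerate(errors))
    iae = sum(abs(e) * dt for e in errors)
    return ise, itae, iae

# Hypothetical decaying error signal sampled at dt = 1 s.
u = dual_mode_pi([1.0, 0.5, 0.2, 0.0], dt=1.0)
print(u)
print(indices([1.0, 0.5, 0.2, 0.0], dt=1.0))
```

In an optimization loop such as Hooke-Jeeves pattern search, kp and ki would be adjusted iteratively to minimize one of these indices.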
Abstract: Recent scientific investigations indicate that
multimodal biometrics overcome the technical limitations of
unimodal biometrics, making them ideally suited for everyday life
applications that require a reliable authentication system. However,
for a successful adoption of multimodal biometrics, such systems
would require large heterogeneous datasets with complex multimodal
fusion and privacy schemes spanning various distributed
environments. From experimental investigations of current
multimodal systems, this paper reports the various issues related to
speed, error recovery and privacy that impede the diffusion of such systems in real life. This calls for a robust mechanism that caters to
the desired real-time performance, robust fusion schemes,
interoperability and adaptable privacy policies.
The main objective of this paper is to present a framework that
addresses the above-mentioned issues by leveraging the
heterogeneous resource sharing capacities of Grid services and the
efficient machine learning capabilities of artificial neural networks
(ANN). Hence, this paper proposes a Grid-based neural network
framework for adopting multimodal biometrics with the view of
overcoming the barriers of performance, privacy and risk issues that
are associated with shared heterogeneous multimodal data centres.
The framework combines the concept of Grid services for reliable
brokering and privacy policy management of shared biometric
resources along with a momentum back propagation ANN (MBPANN)
model of machine learning for efficient multimodal fusion and
authentication schemes. Real-life applications would be able to adopt
the proposed framework to cater to the varying business requirements
and user privacies for a successful diffusion of multimodal
biometrics in various day-to-day transactions.
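The momentum back-propagation update underlying the MBPANN model can be sketched generically (this is the standard technique, not the paper's specific network; the learning rate, momentum factor and values below are illustrative assumptions):

```python
def momentum_update(weights, grads, prev_deltas, lr=0.1, momentum=0.9):
    """One momentum back-propagation step: each weight change blends the
    current gradient step with the previous change, smoothing learning
    and helping the network escape shallow plateaus."""
    deltas = [-lr * g + momentum * d for g, d in zip(grads, prev_deltas)]
    new_w = [w + d for w, d in zip(weights, deltas)]
    return new_w, deltas

# Two hypothetical weights updated over two steps.
w, prev = [0.5, -0.3], [0.0, 0.0]
w, prev = momentum_update(w, [0.2, -0.1], prev)   # gradient-driven step
w, prev = momentum_update(w, [0.0, 0.0], prev)    # momentum keeps moving
print(w)
```

Even with a zero gradient on the second step, the momentum term continues the previous motion, which is the property that speeds convergence in fusion training.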
Abstract: This article describes an automatic Web page filtering system. It is an open and dynamic system based on a multi-agent architecture. The system is built up from a set of agents, each having a quite precise filtering task to carry out (the filtering process is broken up into several elementary treatments, each producing a partial solution). New criteria can be added to the system without stopping its execution or modifying its environment. We want to show the applicability and adaptability of the multi-agent approach to the automatic filtering of network information. In practice, most existing filtering systems are based on modular design approaches that are limited to centralized applications whose role is to resolve static data-flow problems. Web page filtering systems, by contrast, are characterized by a data flow that varies dynamically.
Abstract: The purpose of this paper is to propose an integrated consumer health informatics utilization framework that can be used to gauge the online health information needs and usage patterns among Malaysian women. The proposed framework was developed based on four different theories/models: the Use and Gratification Theory, the Technology Acceptance Model 3, the Health Belief Model, and the Multi-level Model of Information Seeking. The relevant constructs and research hypotheses are also presented in this paper. The framework will be tested so that it can be used successfully to identify Malaysian women's preferences for online health information resources and their health information seeking activities.
Abstract: Research and development (R&D) work involves an enormous amount of data measurement and collection. This process evolves as new information is fed in, new technologies are utilized, and eventually new knowledge is created by the stakeholders, i.e., researchers, clients, and end-users. When new knowledge is created, the procedures of R&D work should evolve and produce better results through improved research skills and improved methods of data measurement and collection. This measurement improvement should then be benchmarked against a metric developed within the organization. In this paper, we suggest a conceptual metric for R&D work performance improvement (PI) at the Kuwait Institute for Scientific Research (KISR). This PI is to be measured against a set of variables in the suggested metric that are more closely correlated with organizational output, as opposed to organizational norms. The paper also discusses knowledge creation and management as added value to R&D work and measurement improvement. The research methodology followed in this work is qualitative in nature, based on a survey distributed to researchers and interviews held with senior researchers at KISR. The research and analyses in this paper also include reviewing and analyzing KISR's literature.
Abstract: This paper presents findings from an evaluation study carried out to review the UAE national ID card software. The paper consults the relevant literature to explain many of the concepts and frameworks presented herein. The findings of the evaluation work, which was primarily based on the ISO 9126 standard for system quality measurement, highlighted many practical areas that, if taken into account, are argued to be likely to increase the chances of success of similar system implementation projects.
Abstract: A concept of switched-beam antennas consisting of a 2×2 rectangular array spaced by λ/4, accompanied by null locating, was proposed in previous work. In this letter, performance evaluations of its prototype are presented. The benefits of using the proposed system have been clearly measured in terms of signal quality, throughput and delay. The impact of a position shift, in which the mesh router is not located in the expected beam direction, has also been investigated.