Abstract: Sediment and mangrove root samples from the Iko River
Estuary, Nigeria, were analyzed for microbial and polycyclic
aromatic hydrocarbon (PAH) content. The total heterotrophic
bacterial (THB) count ranged from 1.1×10^7 to 5.1×10^7 cfu/g, the total
fungal (TF) count from 1.0×10^6 to 2.7×10^6 cfu/g, and the total
coliform (TC) count from 2.0×10^4 to 8.0×10^4 cfu/g, while the
hydrocarbon-utilizing bacterial (HUB) count ranged from 1.0×10^5 to
5.0×10^5 cfu/g. THB counts correlated positively with HUB counts
(r = 0.72 to 0.93). The
organisms isolated were Staphylococcus aureus, Bacillus cereus,
Flavobacterium breve, Pseudomonas aeruginosa, Erwinia
amylovora, Escherichia coli, Enterobacter sp., Desulfovibrio sp.,
Acinetobacter lwoffii, Chromobacterium violaceum, Micrococcus
sedentarius, Corynebacterium sp., and Pseudomonas putrefaciens.
The PAHs detected were naphthalene, 2-methylnaphthalene,
acenaphthylene, acenaphthene, fluorene, phenanthrene, anthracene,
fluoranthene, pyrene, benzo(a)anthracene, chrysene,
benzo(b)fluoranthene, benzo(k)fluoranthene, benzo(a)pyrene,
dibenzo(a,h)anthracene, benzo(g,h,i)perylene, and
indeno(1,2,3-cd)pyrene, with individual PAH concentrations ranging
from 0.20 mg/kg to 1.02 mg/kg, 0.20 mg/kg to 1.07 mg/kg, and
0.20 mg/kg to 4.43 mg/kg in the benthic sediment, epipellic sediment,
and mangrove roots, respectively. Total PAH ranged from 6.30 to
9.93 mg/kg, 6.30 to 9.13 mg/kg, and 9.66 to 16.68 mg/kg in the benthic
sediment, epipellic sediment, and mangrove roots, respectively. The
high concentrations in the mangrove roots indicate bioaccumulation
of the pollutant in plant tissue. The microorganisms are of ecological
significance, and the detectable quantities of polycyclic aromatic
hydrocarbons could partition into and accumulate in the tissues of
infaunal and epifaunal organisms in the study area.
Abstract: This article presents the evolution of and technological changes implemented in the full-scale simulators developed by the Simulation Department of the Instituto de Investigaciones Eléctricas (Mexican Electric Research Institute) and located at training centers across Mexico. It reviews the latest updates, primarily from the input/output viewpoint, of the current simulators at facilities of the electrical sector and related industries, such as the Comisión Federal de Electricidad (CFE, the Mexican electric utility). Trends in these developments and their impact within the operators' scope are also presented.
Abstract: Tumor classification is a key area of research in the
field of bioinformatics. Microarray technology is commonly used in
the study of disease diagnosis using gene expression levels. The
main drawback of gene expression data is that it contains thousands
of genes but very few samples. Feature selection methods are used
to select the informative genes from the microarray. These methods
considerably improve the classification accuracy. In the proposed
method, Genetic Algorithm (GA) is used for effective feature
selection. Informative genes are identified based on the T-Statistics,
Signal-to-Noise Ratio (SNR) and F-Test values. The initial candidate
solutions of GA are obtained from top-m informative genes. The
classification accuracy of k-Nearest Neighbor (kNN) method is used
as the fitness function for GA. In this work, kNN and Support Vector
Machine (SVM) are used as the classifiers. The experimental results
show that the proposed work is suitable for effective feature
selection. With the help of the selected genes, the GA-kNN method
achieves 100% accuracy on 4 datasets, and the GA-SVM method on
5 out of 10 datasets. GA combined with kNN and SVM is thus
demonstrated to be an accurate approach for microarray-based
tumor classification.
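The pipeline the abstract describes (score genes, keep the top-m, then run a GA whose fitness is kNN accuracy) can be sketched as follows. This is a minimal illustration on synthetic data, assuming a T-statistic filter, leave-one-out kNN, and simple elitist crossover/mutation; the paper's actual parameters and datasets are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic expression matrix: 40 samples x 200 genes, two classes.
X = rng.normal(size=(40, 200))
y = np.array([0] * 20 + [1] * 20)
X[y == 1, :10] += 2.0          # genes 0-9 carry the class signal

def t_statistic(X, y):
    """Absolute two-sample T-statistic per gene."""
    a, b = X[y == 0], X[y == 1]
    num = a.mean(0) - b.mean(0)
    den = np.sqrt(a.var(0, ddof=1) / len(a) + b.var(0, ddof=1) / len(b))
    return np.abs(num / den)

def knn_accuracy(X, y, k=3):
    """Leave-one-out accuracy of a Euclidean kNN classifier (GA fitness)."""
    correct = 0
    for i in range(len(y)):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf                          # exclude the sample itself
        votes = y[np.argsort(d)[:k]]
        correct += np.bincount(votes).argmax() == y[i]
    return correct / len(y)

# Pre-filter to the top-m genes, then search subsets with a tiny GA.
m = 30
top = np.argsort(t_statistic(X, y))[::-1][:m]

pop = rng.random((20, m)) < 0.3                # binary chromosomes
for gen in range(15):
    fit = np.array([knn_accuracy(X[:, top[c]], y) if c.any() else 0.0
                    for c in pop])
    pop = pop[np.argsort(fit)[::-1]]           # elitist: best first
    children = []
    for _ in range(len(pop) // 2):             # one-point crossover + mutation
        p1, p2 = pop[rng.integers(0, 10, 2)]
        cut = rng.integers(1, m)
        child = np.concatenate([p1[:cut], p2[cut:]])
        child ^= rng.random(m) < 0.02          # bit-flip mutation
        children.append(child)
    pop[len(pop) - len(children):] = children  # replace the weakest half

best = pop[0]
selected_genes = top[best]
print(len(selected_genes), knn_accuracy(X[:, selected_genes], y))
```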
Abstract: We present a new numerical method for the computation of the steady-state solution of Markov chains. Theoretical analyses show that the proposed method, with a contraction factor α, converges to the one-dimensional null space of singular linear systems of the form Ax = 0. Numerical experiments are used to illustrate the effectiveness of the proposed method, with applications to a class of interesting models in the domain of tandem queueing networks.
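The paper's specific contraction-based iteration is not detailed in the abstract; as a baseline for the same steady-state problem, here is a hedged sketch using plain power iteration on pi P = pi (equivalently, the one-dimensional null space of A = P^T - I) for a small chain:

```python
import numpy as np

# Transition matrix of a small discrete-time Markov chain (rows sum to 1).
P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])

def steady_state(P, tol=1e-12, max_iter=10_000):
    """Power iteration for pi P = pi, i.e. the null space of A = P^T - I."""
    pi = np.full(P.shape[0], 1.0 / P.shape[0])   # uniform start
    for _ in range(max_iter):
        nxt = pi @ P
        if np.abs(nxt - pi).max() < tol:
            return nxt
        pi = nxt
    return pi

pi = steady_state(P)
print(pi)   # this birth-death chain has pi = [0.25, 0.5, 0.25]
```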
Abstract: In this paper, we first introduce the stable distribution, the stable process, and their characteristics. The α-stable distribution family has received great interest in the last decade due to its success in modeling data that are too impulsive to be accommodated by the Gaussian distribution. In the second part, we survey major applications of the α-stable distribution in telecommunications, computer science (such as network delays), signal processing, and financial markets. Finally, we focus on using the stable distribution to estimate measures of risk in stock markets and present simulated data produced with statistical software.
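As an illustrative sketch of simulating α-stable data (the statistical software used in the paper is not named in the abstract), the standard Chambers-Mallows-Stuck generator for the symmetric case (β = 0) can be written as:

```python
import numpy as np

rng = np.random.default_rng(7)

def sym_stable(alpha, size, rng):
    """Chambers-Mallows-Stuck sampler for symmetric alpha-stable variates."""
    u = rng.uniform(-np.pi / 2, np.pi / 2, size)   # uniform angle
    w = rng.exponential(1.0, size)                 # unit exponential
    return (np.sin(alpha * u) / np.cos(u) ** (1 / alpha)
            * (np.cos(u - alpha * u) / w) ** ((1 - alpha) / alpha))

x = sym_stable(1.5, 100_000, rng)
# Symmetric (median near 0) but heavy-tailed: far more mass beyond 5
# standard units than a Gaussian sample would show.
print(np.median(x), np.mean(np.abs(x) > 5))
```

For α = 1 the exponent (1 − α)/α vanishes and the formula reduces to tan(u), i.e. the Cauchy distribution, which is a quick sanity check on the generator.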
Abstract: Data mining has been integrated into application systems to enhance the quality of the decision-making process. This study focuses on the integration of data mining technology and Knowledge Management Systems (KMS), given the ability of data mining technology to create useful knowledge from large volumes of data, while KMS vitally supports the creation and use of knowledge. The integration of data mining technology and KMS is popular in business for enhancing and sustaining organizational performance. However, there is a lack of studies that apply data mining technology and KMS in the education sector, particularly to students' academic performance, which could reflect the performance of an institution of higher learning (IHL). Realizing its importance, this study seeks to integrate data mining technology and KMS to promote effective management of knowledge within IHLs. Several concepts from the literature are adapted to propose the new integrative data mining technology and KMS framework for an IHL.
Abstract: Database management systems that integrate user preferences promise better personalization, greater flexibility, and higher quality of query responses. This paper presents tentative work that studies and investigates approaches to expressing user preferences in queries. We sketch an extension of the capabilities of the SQLf language, which uses fuzzy set theory to define user preferences. Two essential points are considered: the first concerns the expression of user preferences in SQLf through a set of so-called fuzzy commensurable predicates; the second concerns the bipolar way in which these user preferences are expressed, as mandatory and/or optional preferences.
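As a minimal illustration of the fuzzy-predicate idea underlying SQLf (the query syntax and the "affordable" predicate below are hypothetical, not taken from the paper), a trapezoidal membership function can rank rows by their degree of preference satisfaction:

```python
def trapezoid(x, a, b, c, d):
    """Membership degree of a trapezoidal fuzzy set with a <= b <= c <= d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# Fuzzy predicate "price IS affordable" applied SQLf-style to a relation:
#   SELECT * FROM flats WHERE price IS affordable   (hypothetical syntax)
flats = [("F1", 400), ("F2", 650), ("F3", 900)]
ranked = sorted(
    ((name, trapezoid(price, 0, 0, 500, 800)) for name, price in flats),
    key=lambda t: -t[1],
)
print(ranked)   # F1 fully satisfies, F2 partially (0.5), F3 not at all
```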
Abstract: Open and distance learning is a fairly new concept in
Malawi. The major public provider, the Malawi College of Distance
Education, rolled out its activities only about 40 years ago. Over the
years, the demand for distance education has tremendously increased.
The present government has displayed positive political will to uplift
ODL as outlined in the Malawi Growth and Development Strategy as
well as the National Education Sector Plan. A growing national
interest in education coupled with political stability and a booming
ICT industry also raise hope for success. However, a fragile economy
with a GNI per capita of about US$200 over the last decade, poor public
funding, erratic power supply, and a lack of expertise put strain on
efforts towards the promotion of ODL initiatives. Despite the
challenges, the nation appears determined to go flat out and explore
all possible avenues that could revolutionise education access and
equity through ODL.
Abstract: This study describes a micro device integrated with
multi-chamber for polymerase chain reaction (PCR) with different
annealing temperatures. The device consists of the reaction
polydimethylsiloxane (PDMS) chip, a cover glass chip, and is
equipped with cartridge heaters, fans, and thermocouples for
temperature control. In this prototype, commercial software is utilized
to determine the geometric and operational parameters that are
responsible for creating the denaturation, annealing, and extension
temperatures within the chip. Two cartridge heaters are placed at two
sides of the chip and maintained at two different temperatures to
achieve a thermal gradient on the chip during the annealing step. The
temperatures on the chip surface are measured via an infrared imager.
Thermocouples inserted into the reaction chambers are used to
obtain the transient temperature profiles of the reaction chambers
during several thermal cycles. The experimental temperatures
compared to the simulated results show a similar trend. This work
should be of interest to those working on high-temperature-based
reactions and on genomics or cell analysis.
Abstract: It has become crucial over the years for nations to
improve their credit scoring methods and techniques in light of the
increasing volatility of the global economy. Statistical methods or
tools have been the favoured means for this; however artificial
intelligence or soft computing based techniques are becoming
increasingly preferred due to their proficient and precise nature and
relative simplicity. This work presents a comparison between Support
Vector Machines and Artificial Neural Networks, two popular soft
computing models, when applied to credit scoring. Among the
different criteria that can be used for comparison, accuracy,
computational complexity, and processing time are selected to
evaluate both models. Furthermore, the German credit scoring dataset,
a real-world dataset, is used to train and test
both developed models. Experimental results obtained from our study
suggest that although both soft computing models could be used with
a high degree of accuracy, Artificial Neural Networks deliver better
results than Support Vector Machines.
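The comparison workflow can be sketched as below. This is an assumption-laden illustration: scikit-learn models and a synthetic stand-in for the German credit dataset are used, not the paper's implementations or data.

```python
# Sketch of the SVM-vs-ANN comparison; scikit-learn and synthetic data
# are assumptions standing in for the paper's setup.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

# Synthetic two-class "credit" data: 1000 applicants, 20 numeric attributes,
# with a 70/30 good/bad class imbalance roughly like the German credit data.
X, y = make_classification(n_samples=1000, n_features=20, n_informative=10,
                           weights=[0.7, 0.3], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

scaler = StandardScaler().fit(X_tr)          # scale features for both models
X_tr, X_te = scaler.transform(X_tr), scaler.transform(X_te)

svm = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)
ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000,
                    random_state=0).fit(X_tr, y_tr)

print("SVM accuracy:", svm.score(X_te, y_te))
print("ANN accuracy:", ann.score(X_te, y_te))
```

On real data the comparison would also record training/inference times and model size, the other two criteria the abstract mentions.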
Abstract: This paper presents a Reliability-Based Topology
Optimization (RBTO) based on Evolutionary Structural Optimization
(ESO). An actual design involves uncertain conditions such as
material properties, operational loads, and dimensional variation.
Deterministic Topology Optimization (DTO) is obtained without
considering these uncertainties.
However, RBTO involves evaluation of probabilistic constraints,
which can be done in two different ways, the reliability index
approach (RIA) and the performance measure approach (PMA). The
limit state function is approximated using Monte Carlo simulation and
central composite design for the reliability analysis. ESO, one of the
topology optimization techniques, is adopted for topology
optimization. Numerical examples are presented to compare the DTO
with RBTO.
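As a hedged illustration of the reliability-analysis ingredient (not the paper's RBTO formulation), Monte Carlo simulation of a simple linear limit state g = R − S recovers the reliability index β and the failure probability, which the closed form for the normal case confirms:

```python
import random
from math import sqrt
from statistics import NormalDist

random.seed(42)

# Illustrative limit state g = R - S (resistance minus load), both normal;
# the means and standard deviations below are made-up example values.
MU_R, SIG_R = 6.0, 0.9
MU_S, SIG_S = 3.0, 1.2
N = 200_000

# Monte Carlo estimate of P(g < 0).
fails = sum(
    random.gauss(MU_R, SIG_R) - random.gauss(MU_S, SIG_S) < 0.0
    for _ in range(N)
)
pf_mc = fails / N

# Closed form for this linear-normal case, used as a check.
beta = (MU_R - MU_S) / sqrt(SIG_R**2 + SIG_S**2)   # reliability index = 2.0
pf_exact = NormalDist().cdf(-beta)

print(f"beta = {beta:.2f}, Pf(MC) = {pf_mc:.4f}, Pf(exact) = {pf_exact:.4f}")
```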
Abstract: Distributed generation (DG) has received increasing attention
in recent years. The impact of DG on various aspects of distribution
system operation, such as reliability and energy loss, depends highly
on the DG location in the distribution feeder. Optimal DG placement is
an important subject that has not yet been fully discussed.
This paper presents an optimization method to determine optimal DG
placement, based on a cost/worth analysis approach. This method
considers technical and economical factors such as energy loss, load point
reliability indices and DG costs, and particularly, portability of DG. The
proposed method is applied to a test system and the impacts of different
parameters such as load growth rate and load forecast uncertainty (LFU)
on optimum DG location are studied.
Abstract: LSP routing is among the prominent issues in MPLS
networks traffic engineering. The objective of this routing is to
increase number of the accepted requests while guaranteeing the
quality of service (QoS). Requested bandwidth is the most important
QoS criterion considered in the literature, and various heuristic
algorithms have been presented in that regard. Many
of these algorithms prevent flows through bottlenecks of the network
in order to perform load balancing, which impedes optimum
operation of the network. Here, a new routing algorithm, MIRAD, is
proposed: using only limited information about the network topology
and the links' residual bandwidth, and no knowledge of prospective
requests, it provides each request with maximum bandwidth as well as
minimum end-to-end delay via uniform load distribution across the
network. Simulation results of the proposed
algorithm show a better efficiency in comparison with similar
algorithms.
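MIRAD itself is not specified in the abstract; as an illustrative baseline for its bandwidth/delay criterion (maximize bottleneck bandwidth, then minimize end-to-end delay), here is a sketch over a hypothetical toy topology:

```python
# Hypothetical toy topology; each edge carries (residual_bandwidth, delay).
GRAPH = {
    "A": {"B": (100, 2), "C": (40, 1)},
    "B": {"D": (80, 2)},
    "C": {"D": (40, 1)},
    "D": {},
}

def all_paths(graph, src, dst, path=None):
    """Enumerate all simple paths from src to dst (fine for small graphs)."""
    path = (path or []) + [src]
    if src == dst:
        yield path
        return
    for nxt in graph[src]:
        if nxt not in path:
            yield from all_paths(graph, nxt, dst, path)

def score(graph, path):
    """(bottleneck bandwidth, -total delay): maximize the first, then the second."""
    bws, delays = zip(*(graph[a][b] for a, b in zip(path, path[1:])))
    return min(bws), -sum(delays)

best = max(all_paths(GRAPH, "A", "D"), key=lambda p: score(GRAPH, p))
print(best, score(GRAPH, best))   # A-B-D: bottleneck 80, total delay 4
```

A production LSP router would use a widest-path variant of Dijkstra instead of path enumeration, but the selection criterion is the same.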
Abstract: In this paper, an attempt is made to compute the total
optimal cost of an interdependent queuing system with controllable
arrival rates as an important performance measure of the system. An
example of application has also been presented to exhibit the use of
the model. Finally, a numerical demonstration based on a computing
algorithm, along with the variational effects of the model illustrated
graphically, has also been presented.
Abstract: With the rapid growth and development of information and communication technology, the Internet has played a definite and irreplaceable role in people's social lives in Taiwan, as in other countries. In July 2008, on a general social website, an unexpected phenomenon was noticed: more than one hundred users started forming clubs voluntarily and holding face-to-face gatherings for specific purposes. This study examines whether teenagers' social contact on the Internet is involved in their life context, and tries to reveal the teenagers' social preferences, values, and needs, which merge with and influence their social activities. The study therefore employs multiple user experience research methods, including practical observations and qualitative analysis through contextual inquiries and in-depth interviews. Based on the findings, several design implications for software related to social interaction and cultural inheritance are offered. It is concluded that the inherent values of social behaviors might be a key issue in developing computer-mediated communication or interaction designs in the future.
Abstract: Pipeline infrastructures normally represent a high cost of investment, and pipelines must be free from risks that could cause environmental hazards and potential threats to personnel safety. Pipeline integrity monitoring and management have therefore become crucial to providing unimpeded transportation and avoiding unnecessary production deferment. Proper cleaning and inspection are thus the key to safe and reliable pipeline operation, play an important role in any pipeline integrity management program, and have become a standard industry procedure. In view of this, understanding the motion (dynamic behavior), prediction, and control of PIG speed is important in executing pigging operations, as it offers significant benefits such as estimating PIG arrival time at the receiving station, planning suitable pigging operations, and improving the efficiency of pigging tasks. The objective of this paper is to review recent developments in speed control systems for pipeline PIGs. The review serves as a quick industrial reference on recent developments in pipeline PIG speed control, may encourage others to add to and update the list in the future, leading to a knowledge base, and should attract the active interest of others in sharing their viewpoints.
Abstract: Within the realm of e-government, development has moved towards testing new means for democratic decision-making, such as e-panels, electronic discussion forums, and polls. Although such new developments seem promising, they are not problem-free, and their outcomes are seldom used in subsequent formal political procedures. Nevertheless, process models offer promising potential for structuring and supporting the transparency of decision processes, in order to facilitate the integration of the public into decision-making procedures in a reasonable and manageable way. Based on real-life cases of urban planning processes in Sweden, we present an outline of an integrated framework for public decision making to: a) provide tools for citizens to organize discussion and create opinions; b) enable governments, authorities, and institutions to better analyse these opinions; and c) enable governments to account for this information in planning and societal decision making by employing a process model for structured public decision making.
Abstract: The classification of the protein structure is commonly
not performed for the whole protein but for structural domains, i.e.,
compact functional units preserved during evolution. Hence, a first
step to a protein structure classification is the separation of the
protein into its domains. We approach the problem of protein domain
identification by proposing a novel graph-theoretical algorithm. We
represent the protein structure as an undirected, unweighted, and
unlabeled graph whose nodes correspond to the secondary structure
elements of the protein. This graph is called the protein graph. The
domains are then identified as partitions of the graph, corresponding
to vertex sets obtained by maximizing an objective function,
which mutually maximizes the cycle distributions found in the
partitions of the graph. Our algorithm does not utilize any other kind
of information besides the cycle-distribution to find the partitions. If
a partition is found, the algorithm is iteratively applied to each of
the resulting subgraphs. As stop criterion, we calculate numerically
a significance level which indicates the stability of the predicted
partition against a random rewiring of the protein graph. Hence,
our algorithm terminates its iterative application automatically. We
present results for one- and two-domain proteins, compare our
results with the domains manually assigned in the SCOP database,
and discuss the differences.
Abstract: Volume rendering is widely used in medical CT image
visualization. Applying 3D image visualization to diagnostic
applications can require accurate volume rendering with high
resolution. Interpolation is important in medical image processing
applications such as image compression or volume resampling.
However, interpolation can distort the original image data through
edge blurring or blocking effects when image enhancement procedures
are applied. In this paper, we propose an adaptive tension control
method that exploits gradient information to achieve high-resolution
medical image enhancement in volume visualization, keeping restored
images as similar as possible to the originals. The experimental
results show that the proposed method can improve image quality,
demonstrating the efficacy of the adaptive tension control.
Abstract: The present work was conducted to synthesize nano-sized
zerovalent iron (nZVI) and to remove hexavalent chromium (Cr(VI)),
a highly toxic pollutant, using these nanoparticles. Batch
experiments were performed to investigate the effects of Cr(VI) and
nZVI concentrations, solution pH, and contact time on the removal
efficiency of Cr(VI). nZVI was synthesized by reduction of ferric
chloride using sodium borohydride. SEM and XRD examinations were
applied to determine the particle size and characterize the produced
nanoparticles. The results showed that
the removal efficiency decreased with Cr(VI) concentration and pH
of solution and increased with adsorbent dosage and contact time.
The Langmuir and Freundlich isotherm models were applied to the
adsorption equilibrium data, and the Langmuir isotherm model fitted
the data well. Nanoparticle ZVI presented an outstanding ability to
remove Cr(VI) due to high surface area, low particle size and high
inherent activity.
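The isotherm-fitting step can be sketched as follows, using synthetic Langmuir-generated data (illustrative values, not the paper's measurements) and the standard linearized forms of both models:

```python
import numpy as np

# Synthetic equilibrium data generated from a Langmuir isotherm
# (made-up parameter values, purely for illustration).
Q_MAX, K_L = 25.0, 0.8                             # mg/g, L/mg
Ce = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])    # equilibrium conc., mg/L
qe = Q_MAX * K_L * Ce / (1.0 + K_L * Ce)           # adsorbed amount, mg/g

def r_squared(y, y_hat):
    """Coefficient of determination for a linearized fit."""
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Linearized Langmuir: Ce/qe = Ce/q_max + 1/(K_L * q_max)
slope_l, icept_l = np.polyfit(Ce, Ce / qe, 1)
q_max_fit, k_l_fit = 1.0 / slope_l, slope_l / icept_l
r2_l = r_squared(Ce / qe, slope_l * Ce + icept_l)

# Linearized Freundlich: ln qe = ln K_F + (1/n) ln Ce
slope_f, icept_f = np.polyfit(np.log(Ce), np.log(qe), 1)
r2_f = r_squared(np.log(qe), slope_f * np.log(Ce) + icept_f)

print(f"Langmuir fit:  q_max={q_max_fit:.2f}, K_L={k_l_fit:.2f}, R^2={r2_l:.4f}")
print(f"Freundlich fit: R^2={r2_f:.4f}")
```

Because this synthetic data follows a Langmuir curve exactly, the Langmuir linearization fits perfectly while the Freundlich R² is lower, mirroring the comparison the abstract reports for the measured data.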