Abstract: This paper gives an introduction to Web mining, then
describes Web structure mining in detail, and explores the data
structure used by the Web. It also surveys different PageRank
algorithms and compares those used for information retrieval. The
basics of Web mining and the Web mining categories are explained.
Different PageRank-based algorithms, namely PageRank (PR), Weighted
PageRank (WPR), HITS (Hyperlink-Induced Topic Search), DistanceRank
and DirichletRank, are discussed and compared. Page ranks are
calculated with the PageRank and Weighted PageRank algorithms for a
given hyperlink structure. A simulation program is developed for the
PageRank algorithm because, among those discussed, PageRank is the
ranking algorithm implemented in the Google search engine. The
outputs are shown in table and chart format.
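The classical PageRank iteration summarized above can be sketched in a few lines; the hyperlink graph below is hypothetical and the damping factor d = 0.85 follows the usual convention:

```python
# Power-iteration PageRank over a small, hypothetical hyperlink graph.
def pagerank(links, d=0.85, iters=100):
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {}
        for p in pages:
            # Contribution from every page q that links to p.
            inbound = sum(pr[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - d) / n + d * inbound
        pr = new
    return pr

# Hypothetical structure: page -> set of pages it links to.
links = {"A": {"B", "C"}, "B": {"C"}, "C": {"A"}, "D": {"C"}}
ranks = pagerank(links)
```

Because every page here has at least one outgoing link, the ranks sum to 1 at every iteration; pages with more (or better-ranked) in-links, such as C, end up ranked highest.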
Abstract: New Growth Theory helps us make sense of the
ongoing shift from a resource-based economy to a knowledge-based
economy. It underscores the point that the economic processes which
create and diffuse new knowledge are critical to shaping the growth
of nations, communities and individual firms. In all too many
contributions to New (Endogenous) Growth Theory – though not in
all – central reference is made to 'a stock of knowledge', a 'stock of
ideas', etc., this variable featuring centre-stage in the analysis. Yet it
is immediately apparent that this is far from being a crystal clear
concept. The difficulty and uncertainty of being able to capture the
value associated with knowledge is a real problem. The intent of this
paper is to introduce new thinking and theorizing about knowledge and
its measurability in New Growth Theory. Moreover, the study aims to
synthesize various strands of the literature with a practical bearing
on the concept of knowledge. Through the institutional framework
found within NGT, we can indirectly measure the knowledge concept.
Institutions matter because they shape the environment for the
production and employment of new knowledge.
Abstract: The construction of a portable device for fast analysis of energetic materials is described in this paper. The developed analytical system consists of two main parts: a miniaturized microcolumn liquid chromatograph of unique construction and an original chemiluminescence detector. This novel portable device is able to selectively determine most nitramine- and nitroester-based explosives, as well as inorganic nitrates, at trace concentrations in water or soil extracts in less than 8 minutes.
Abstract: Hazardous material transportation by road is coupled
with an inherent risk of accidents causing loss of life, grievous
injuries, property losses and environmental damage. The most common
type of hazmat road accident is the release (78%) of hazardous
substances, followed by fires (28%), explosions (14%) and vapour/gas
clouds (6%).
The paper first discusses the probable 'Impact Zones' likely to be
caused by one flammable chemical (LPG) and one toxic chemical
(ethylene oxide) transported through a sizable segment of a State
Highway connecting three notified industrial zones in Surat district
in Western India, housing 26 MAH industrial units. Three
'hotspots' were identified along the highway segment depending on
the particular chemical traffic and the population distribution within
500 meters on either side. The thermal radiation and explosion
overpressure have been calculated for LPG / Ethylene Oxide BLEVE
scenarios along with toxic release scenario for ethylene oxide.
Besides, the dispersion calculations for ethylene oxide toxic release
have been made for each 'hotspot' location and the impact zones
have been mapped for the LOC concentrations. Subsequently, the
maximum Initial Isolation and protective zones were calculated based
on the ERPG-3 and ERPG-2 values of ethylene oxide respectively,
estimated for the worst-case scenario under the worst weather
conditions. The data analysis will help the local administration in
capacity building with respect to rescue/evacuation and medical
preparedness, and will provide quantitative inputs to augment the
District Offsite Emergency Plan document.
Abstract: Warranty is a powerful marketing tool for the
manufacturer and a good protection for both the manufacturer and the
customer. However, warranty always involves additional costs to the
manufacturer, which depend on product reliability characteristics and
warranty parameters. This paper presents an approach to optimisation
of warranty parameters for known product failure distribution to
reduce the warranty costs to the manufacturer while retaining the
promotional function of the warranty. A combined free-replacement
and pro-rata warranty policy is chosen as the model; the lengths of
the free-replacement period and the pro-rata period are varied, as
well as the coefficients that define the pro-rata cost function.
Multi-parametric warranty optimisation is performed using a genetic
algorithm. The obtained results serve as a guideline for the
manufacturer in choosing the warranty policy that minimises the
costs and maximises the profit.
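As an illustration of the multi-parametric optimisation idea, the sketch below minimises a purely hypothetical warranty-cost function (exponential failure times, invented cost and promotional coefficients) over the free-replacement period w1 and pro-rata period w2 with a simple genetic algorithm; it is not the paper's model:

```python
import math
import random

# Hypothetical cost model: replacements in [0, w1] cost full price, pro-rata
# claims in (w1, w1+w2] cost half on average, and a promotional revenue term
# rewards longer coverage. All coefficients are illustrative only.
def warranty_cost(w1, w2, lam=0.2, unit_cost=100.0, promo=30.0):
    free_cost = unit_cost * (1 - math.exp(-lam * w1))
    prorata_cost = 0.5 * unit_cost * (math.exp(-lam * w1) - math.exp(-lam * (w1 + w2)))
    revenue = promo * math.log(1 + w1 + w2)
    return free_cost + prorata_cost - revenue

def genetic_minimise(f, bounds, pop=40, gens=60, mut=0.3, seed=1):
    rng = random.Random(seed)
    lo, hi = bounds
    population = [(rng.uniform(lo, hi), rng.uniform(lo, hi)) for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda ind: f(*ind))
        parents = population[: pop // 2]            # truncation selection (elitist)
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]   # arithmetic crossover
            if rng.random() < mut:                        # clamped Gaussian mutation
                i = rng.randrange(2)
                child[i] = min(hi, max(lo, child[i] + rng.gauss(0, 0.5)))
            children.append(tuple(child))
        population = parents + children
    return min(population, key=lambda ind: f(*ind))

best = genetic_minimise(warranty_cost, (0.0, 10.0))
```

Because the selection is elitist, the best individual never worsens between generations; the GA trades off the two period lengths and would, in the paper's setting, additionally evolve the pro-rata cost-function coefficients.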
Abstract: This paper reports a distributed mutual exclusion
algorithm for mobile Ad-hoc networks. The network is clustered
hierarchically. The proposed algorithm treats the clustered
network as a logical tree and develops a token-passing scheme
to achieve mutual exclusion. The performance analysis and
simulation results show that its message requirement is optimal,
and thus the algorithm is energy efficient.
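The token-passing idea on a logical tree can be sketched as follows; the topology is hypothetical and the paper's cluster-specific details are not modelled, but it shows why the message count per request is bounded by the length of the tree path between requester and token holder:

```python
from collections import deque

# Minimal sketch of tree-based token passing for mutual exclusion:
# the token sits at one node, a request travels along the tree path
# to the holder, and the token travels back along the same path.
class TokenTree:
    def __init__(self, edges, holder):
        self.adj = {}
        for a, b in edges:
            self.adj.setdefault(a, set()).add(b)
            self.adj.setdefault(b, set()).add(a)
        self.holder = holder

    def _path(self, src, dst):
        # BFS path in the acyclic overlay tree.
        prev, queue = {src: None}, deque([src])
        while queue:
            u = queue.popleft()
            if u == dst:
                break
            for v in self.adj[u]:
                if v not in prev:
                    prev[v] = u
                    queue.append(v)
        path = [dst]
        while prev[path[-1]] is not None:
            path.append(prev[path[-1]])
        return path[::-1]

    def acquire(self, node):
        """Move the token to `node`; return the number of messages used."""
        path = self._path(node, self.holder)
        request_msgs = len(path) - 1   # request: node -> holder
        token_msgs = len(path) - 1     # token: holder -> node
        self.holder = node
        return request_msgs + token_msgs

# Hypothetical 5-node logical tree with the token initially at the root A.
tree = TokenTree([("A", "B"), ("A", "C"), ("B", "D"), ("B", "E")], holder="A")
```

In a balanced hierarchy the path length, and hence the message cost per critical-section entry, grows only logarithmically with the number of nodes, which is the intuition behind the optimal-message claim.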
Abstract: Foodborne illnesses have been reported to be a global
health challenge. Annual incidences of food-related diseases involve
76 million cases, of which only 14 million can be traced to known
pathogens. Poor hygienic practices have contributed greatly to this.
It has been reported that in the year 2000 about 2.1 million people
died from diarrheal diseases; hence, there is a need to ensure food
safety at all levels. This study focused on the sterility examination
and the inhibitory effect of honey samples on selected gram-negative
and gram-positive foodborne pathogens from South West Nigeria. The
laboratory examinations revealed some bacterial and fungal
contamination of the honey samples, and showed that the inhibitory
activity of the honey samples was more pronounced on the
gram-negative bacteria than on the gram-positive bacterial isolates.
An antibiotic sensitivity test conducted on the different bacterial
isolates also showed that honey inhibited the proliferation of the
tested bacteria more effectively than the employed antibiotics.
Abstract: This paper examined the influence of matching
students' learning preferences with the teaching methodology
adopted, on their academic performance in an accounting course in
two types of learning environment in one university in Lebanon:
classes with PowerPoint (PPT) vs. conventional classes. Learning
preferences were either for PPT or for Conventional methodology. A
statistically significant increase in academic achievement is found in
the conventionally instructed group as compared to the group taught
with PPT. This low effectiveness of PPT might be attributed to the
learning preferences of Lebanese students. In the PPT group, better
academic performance was found among students with
learning/teaching match as compared with students with
learning/teaching mismatch. Since the majority of students display a
preference for the conventional methodology, the result might
suggest that Lebanese students' performance is not optimized by PPT
in the accounting classrooms, not because of PPT itself, but because
it does not match the Lebanese students' learning preferences in
such a quantitative course.
Abstract: This paper describes a text mining technique for automatically extracting association rules from collections of textual documents. The technique, called Extracting Association Rules from Text (EART), relies on keyword features to discover association rules among the keywords labeling the documents. The EART system ignores the order in which the words occur and focuses instead on the words and their statistical distributions in documents. The main contributions of the technique are that it integrates XML technology with an Information Retrieval scheme (TF-IDF) for keyword/feature selection, which automatically selects the most discriminative keywords for use in association rule generation, and that it uses a data mining technique for association rule discovery. It consists of three phases: a Text Preprocessing phase (transformation, filtration, stemming and indexing of the documents), an Association Rule Mining (ARM) phase (applying our algorithm for Generating Association Rules based on a Weighting scheme, GARW) and a Visualization phase (visualization of results). Experiments were applied to Web-page news documents related to the outbreak of the bird flu disease. The extracted association rules contain important features and describe the informative news included in the document collection. The performance of the EART system was compared with that of a system using the Apriori algorithm, in terms of execution time and of evaluating the extracted association rules.
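A much-simplified sketch of the pipeline's two core steps, TF-IDF keyword scoring and keyword-to-keyword rule generation by plain support/confidence (the paper's GARW weighting scheme is not reproduced; documents and thresholds are toy values):

```python
import math
from itertools import combinations

# Toy documents as keyword sets (stand-ins for preprocessed news pages).
docs = [
    {"bird", "flu", "outbreak", "virus"},
    {"bird", "flu", "vaccine"},
    {"flu", "virus", "outbreak"},
    {"bird", "flu", "outbreak"},
]

def tfidf(term, doc, docs):
    # In EART a score of this kind drives keyword selection; rarer,
    # more discriminative terms score higher.
    tf = 1.0 if term in doc else 0.0
    df = sum(1 for d in docs if term in d)
    return tf * math.log(len(docs) / df)

def mine_rules(docs, min_support=0.5, min_confidence=0.8):
    # Plain support/confidence over keyword pairs (x -> y).
    terms = sorted(set().union(*docs))
    rules = []
    for a, b in combinations(terms, 2):
        for x, y in ((a, b), (b, a)):
            both = sum(1 for d in docs if x in d and y in d)
            have_x = sum(1 for d in docs if x in d)
            support = both / len(docs)
            confidence = both / have_x if have_x else 0.0
            if support >= min_support and confidence >= min_confidence:
                rules.append((x, y, support, confidence))
    return rules

rules = mine_rules(docs)
```

On these toy documents the rule "bird -> flu" holds with support 0.75 and confidence 1.0, while low-support terms such as "vaccine" generate no rules despite their high TF-IDF scores.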
Abstract: This paper presents a Particle Swarm Optimization
(PSO) method for determining the optimal parameters of a first-order
controller for a TCP/AQM system. The TCP/AQM model is described
by a second-order system with time delay. First, an analytical
approach, based on the D-decomposition method and Kharitonov's
lemma, is used to determine the stabilizing regions of a first-order
controller. Second, the optimal parameters of the controller are
obtained by the PSO algorithm. Finally, the proposed method is
implemented in the Network Simulator NS-2 and compared with the
PI controller.
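A generic PSO loop of the kind described can be sketched as follows; the quadratic cost surface stands in for the closed-loop TCP/AQM performance measure and is purely illustrative:

```python
import random

# Stand-in cost for tuning two controller gains (kp, ki); the optimum
# is placed at (2.0, 0.5) purely for illustration.
def cost(kp, ki):
    return (kp - 2.0) ** 2 + (ki - 0.5) ** 2

def pso(f, dim=2, swarm=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]                      # personal bests
    gbest = min(pbest, key=lambda p: f(*p))[:]       # global best
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Inertia + cognitive pull + social pull.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(*pos[i]) < f(*pbest[i]):
                pbest[i] = pos[i][:]
                if f(*pbest[i]) < f(*gbest):
                    gbest = pbest[i][:]
    return gbest

kp, ki = pso(cost)
```

In the paper's setting the candidate gains would additionally be constrained to the stabilizing region found analytically, so that the swarm only explores stable controllers.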
Abstract: In this paper, an algorithm is used to detect color defects in ceramic tiles. First, the image of a normal tile is clustered using GCMA, the Genetic C-means Clustering Algorithm, which yields the best cluster centers. C-means is a common clustering algorithm that optimizes an objective function based on a distance measure between data points and the cluster centers in the data space; here the objective function is the mean square error. After finding the best centers, each pixel of the image is assigned to the cluster with the closest center. Then, the maximum error of each cluster is computed: for each cluster, the maximum error is the largest distance between its center and the pixels that belong to it. After computing the errors, all pixels of the defective tile image are clustered based on the centers obtained from the normal tile image in the previous stage. Pixels whose distance from their cluster center exceeds the maximum error of that cluster are considered defective.
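The normal-tile/defective-tile thresholding scheme can be sketched with plain k-means standing in for GCMA (synthetic one-dimensional pixel intensities; the genetic search for centers is not reproduced):

```python
import random

# Cluster pixel intensities of a "normal" tile, record each cluster's
# maximum error (largest distance from its centre), then flag pixels of
# a test image whose distance to the nearest centre exceeds it.
def kmeans(values, k=2, iters=50, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(values, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            clusters[min(range(k), key=lambda i: abs(v - centers[i]))].append(v)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

normal = [10, 11, 12, 9, 200, 198, 202, 201]   # synthetic normal-tile pixels
centers = kmeans(normal)
# Per-cluster threshold: the maximum distance seen on the normal tile.
max_err = [max(abs(v - c) for v in normal
               if min(centers, key=lambda m: abs(v - m)) == c) for c in centers]

def defective(pixel):
    i = min(range(len(centers)), key=lambda j: abs(pixel - centers[j]))
    return abs(pixel - centers[i]) > max_err[i]
```

A pixel close to either learned intensity cluster passes, while an intensity far from both learned centres, relative to the normal tile's spread, is flagged as a color defect.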
Abstract: This paper presents a structural study of an aqueous
electrolyte for which experimental results are available: a solution
of LiCl-6H2O type in the glassy state (120 K), contrasted with pure
water at room temperature by means of partial distribution functions
(PDF) obtained from the neutron scattering technique. Based on these
partial functions, the Reverse Monte Carlo (RMC) method computes
radial and angular correlation functions, which allow a number of
structural features of the system to be explored. The obtained
curves include some artifacts. To remedy this, we propose
introducing a screened potential as an additional constraint. The
obtained results show good agreement between experimental and
computed functions and a significant improvement in the PDF curves
with the potential constraint, suggesting an efficient fit of the
pair distribution function curves.
Abstract: Quality control charts are very effective in detecting
out-of-control signals, but when a control chart signals an
out-of-control condition of the process mean, searching for a
special cause in the vicinity of the signal time does not always
lead to prompt identification of the source(s) of the out-of-control
condition, as the change point in the process parameter(s) is
usually different from the signal time. It is very important for the
manufacturer to determine at what point, and in which parameters,
the change that caused the signal occurred. Early warning of process
change would expedite the search for special causes and enhance
quality at lower cost. In this paper, the quality variables under
investigation are assumed to follow a multivariate normal
distribution with known means and variance-covariance matrix; the
process means after a one-step change remain at the new level until
the special cause is identified and removed; and it is supposed that
only one variable can change at a time. This research applies an
artificial neural network (ANN) to identify the time the change
occurred and the parameter that caused the change or shift. The
performance of the approach was assessed through a computer
simulation experiment. The results show that the neural network
performs effectively and equally well across the whole range of
shift magnitudes considered.
Abstract: This paper is concerned with delay-distribution-dependent
stability criteria for bidirectional associative memory
(BAM) neural networks with time-varying delays. Based on the
Lyapunov-Krasovskii functional and a stochastic analysis approach,
a delay-probability-distribution-dependent sufficient condition is
derived to guarantee the global asymptotic mean-square stability of
the considered BAM neural networks. The criteria are formulated in
terms of a set of linear matrix inequalities (LMIs), which can be
checked efficiently with standard numerical packages. Finally,
a numerical example and its simulation are given to demonstrate
the usefulness and effectiveness of the proposed results.
Abstract: In this paper, based on the estimation of the Cauchy matrix of linear impulsive differential equations and using the Banach fixed point theorem and the Gronwall-Bellman inequality, some sufficient conditions are obtained for the existence and exponential stability of almost periodic solutions for Cohen-Grossberg shunting inhibitory cellular neural networks (SICNNs) with continuously distributed delays and impulses. An example is given to illustrate the main results.
Abstract: Overcurrent (OC) relays are the major protection
devices in a distribution system. The operating times of the OC
relays are to be coordinated properly to avoid mal-operation of the
backup relays. OC relay time coordination in ring-fed distribution
networks is a highly constrained optimization problem which can be
stated as a linear programming problem (LPP). The purpose is to find
an optimum relay setting that minimizes the operating time of the
relays while keeping them properly coordinated to avoid
mal-operation.
This paper presents a two-phase simplex method for optimum time
coordination of OC relays. The method is based on the simplex
algorithm, which is used to find the optimum solution of an LPP. The
method introduces artificial variables to obtain an initial basic
feasible solution (IBFS). The artificial variables are removed by
the iterative process of the first phase, which minimizes the
auxiliary objective function. The second phase minimizes the
original objective function and gives the optimum time coordination
of the OC relays.
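The two phases can be sketched on a toy LP with a relay-coordination flavour (hypothetical operating-time bounds and coordination margin, not the paper's network model): minimise t1 + t2 subject to t1 >= 0.2, t2 >= 0.2 and t2 - t1 >= 0.3, written in standard form A.x = b, x >= 0 with surplus variables.

```python
# Phase 1 appends artificial variables and minimises their sum to obtain
# an initial basic feasible solution; phase 2 re-prices the tableau with
# the original objective and optimises it.
def pivot(tab, basis, row, col):
    p = tab[row][col]
    tab[row] = [v / p for v in tab[row]]
    for r in range(len(tab)):
        if r != row and tab[r][col]:
            f = tab[r][col]
            tab[r] = [v - f * w for v, w in zip(tab[r], tab[row])]
    basis[row] = col

def solve(tab, basis, ncols):
    while True:  # Bland's rule: first column with a negative reduced cost
        col = next((j for j in range(ncols) if tab[-1][j] < -1e-9), None)
        if col is None:
            return
        rows = [r for r in range(len(tab) - 1) if tab[r][col] > 1e-9]
        if not rows:
            raise ValueError("unbounded")
        pivot(tab, basis, min(rows, key=lambda r: tab[r][-1] / tab[r][col]), col)

def two_phase_simplex(A, b, c):
    m, n = len(A), len(c)
    # Phase 1 tableau: [A | I (artificials) | b], artificials basic.
    tab = [A[i][:] + [1.0 if j == i else 0.0 for j in range(m)] + [b[i]]
           for i in range(m)]
    basis = list(range(n, n + m))
    obj = [0.0] * n + [1.0] * m + [0.0]
    for row in tab:                      # price out the basic artificials
        obj = [o - r for o, r in zip(obj, row)]
    tab.append(obj)
    solve(tab, basis, n + m)
    if abs(tab[-1][-1]) > 1e-7:
        raise ValueError("infeasible")
    # Phase 2: restore the original objective over structural columns only.
    tab[-1] = [c[j] if j < n else 0.0 for j in range(n + m)] + [0.0]
    for r in range(m):
        cost = tab[-1][basis[r]]
        if cost:
            tab[-1] = [o - cost * v for o, v in zip(tab[-1], tab[r])]
    solve(tab, basis, n)                 # artificials may not re-enter
    x = [0.0] * n
    for r, j in enumerate(basis):
        if j < n:
            x[j] = tab[r][-1]
    return x, -tab[-1][-1]

# Variables: [t1, t2, s1, s2, s3] with surplus variables s1..s3.
A = [[1.0, 0.0, -1.0, 0.0, 0.0],
     [0.0, 1.0, 0.0, -1.0, 0.0],
     [-1.0, 1.0, 0.0, 0.0, -1.0]]
b = [0.2, 0.2, 0.3]
c = [1.0, 1.0, 0.0, 0.0, 0.0]
x, z = two_phase_simplex(A, b, c)
```

For this toy instance the optimum is t1 = 0.2, t2 = 0.5 with total operating time 0.7: the downstream relay operates as fast as allowed and the backup relay sits exactly one coordination margin above it.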
Abstract: Grid networks provide the ability to perform higher-throughput computing by taking advantage of many networked computers' resources to solve large-scale computation problems. As the popularity of Grid networks has increased, there is a need to efficiently distribute the load among the resources accessible on the network. In this paper, we present a stochastic network system that gives a distributed load-balancing scheme by generating almost regular networks. This network system is self-organized and depends only on local information for load distribution and resource discovery. The in-degree of each node reflects its free resources, and the job assignment and resource discovery processes required for load balancing are accomplished by fitted random sampling. Simulation results show that the generated network system provides an effective, scalable and reliable load-balancing scheme for the distributed resources accessible on Grid networks.
Abstract: The increasing importance of software quality has led to the development of new sophisticated techniques that can be used to construct models for predicting quality attributes. One such technique is the Artificial Neural Network (ANN). This paper examined the application of ANNs to software quality prediction using Object-Oriented (OO) metrics. Quality estimation here includes estimating the maintainability of software. The dependent variable in our study was maintenance effort; the independent variables were principal components of eight OO metrics. The results showed that the Mean Absolute Relative Error (MARE) of the ANN model was 0.265. We therefore found the ANN method useful for constructing software quality models.
Abstract: The mobile agent paradigm provides a promising technology for the development of distributed and open applications. However, one of the main obstacles to widespread adoption of the mobile agent paradigm appears to be security. This paper treats the security of the mobile agent against malicious host attacks and describes a generic mobile agent protection architecture. The proposed approach is based on dynamic adaptability and adopts reflexivity as a model of conception and implementation. In order to protect it against behaviour analysis attempts, the suggested approach supplies the mobile agent with a flexibility faculty allowing it to present unexpected behaviour. Furthermore, some classical protective mechanisms are used to reinforce the level of security.