Abstract: In this paper a novel algorithm is proposed that integrates the processes of fuzzy hierarchy generation and rule discovery for the automated discovery of Production Rules with Fuzzy Hierarchy (PRFH) in large databases. A concept of a frequency matrix (Freq) is introduced to summarize the large database, which helps minimize the number of database accesses and supports the identification and removal of irrelevant attribute values and weak classes during fuzzy hierarchy generation. Experimental results have established the effectiveness of the proposed algorithm.
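The abstract does not define the structure of Freq; the sketch below assumes it records attribute-value/class co-occurrence counts built in a single pass over the records, which is one way such a summary can minimize database accesses (all names and data are illustrative):

```python
from collections import defaultdict

def build_freq_matrix(records, class_index=-1):
    """Summarize the table in one pass: count how often each
    (attribute, value) pair co-occurs with each class label.
    Illustrative sketch, not the paper's exact Freq structure."""
    freq = defaultdict(lambda: defaultdict(int))
    for row in records:
        label = row[class_index]
        for attr, value in enumerate(row[:class_index]):
            freq[(attr, value)][label] += 1
    return freq

data = [
    ("high", "yes", "good"),
    ("high", "no", "good"),
    ("low", "yes", "poor"),
]
freq = build_freq_matrix(data)
print(freq[(0, "high")]["good"])  # attribute 0 = "high" with class "good": 2
```

Rare (attribute, value) pairs or classes with low total counts could then be pruned from such a summary before hierarchy generation, without further table scans.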
Abstract: As communications systems and technology become more advanced and complex, it will be increasingly important to focus on users' individual needs. Personalization and effective user profile management will be necessary to ensure the uptake and success of new services and devices, and it is therefore important to focus on the users' requirements in this area and define solutions that meet these requirements. The work on personalization and user profiles emerged from earlier ETSI work on a Universal Communications Identifier (UCI), which is a unique identifier of the user rather than a range of identifiers of the user's many communication devices or services (e.g. numbers of fixed phones at home/work, mobile phones, fax and email addresses). This paper describes work on personalization, including standardized information and preferences, and an architectural framework describing how personalization can be integrated in Next Generation Networks, together with the UCI concept.
Abstract: Let p be a prime number, F_p be a finite field and t ∈ F_p* = F_p − {0}. In this paper we obtain some properties of elliptic curves E_{p,t}: y² = x³ − t²x over F_p. In the first section we give some notations and preliminaries from elliptic curves. In the second section we consider the rational points (x, y) on E_{p,t}. We give a formula for the number of rational points on E_{p,t} over F_{p^n} for an integer n ≥ 1. We also give some formulas for the sums of the x- and y-coordinates of the points (x, y) on E_{p,t}. In the third section we consider the rank of E_t: y² = x³ − t²x and its 2-isogenous curve Ê_t over Q. We prove that the rank of E_t and Ê_t is 2 over Q. In the last section we obtain some formulas for the sums Σ_{t∈F_p*} a^n_{p,t} for an integer n ≥ 1, where a_{p,t} denotes the trace of Frobenius.
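For small primes, the point counts and traces of Frobenius discussed above can be checked by naive enumeration; a minimal sketch (illustrative p and t, not the paper's closed-form formulas):

```python
def count_points(p, t):
    """Brute-force count of points on y^2 = x^3 - t^2*x over F_p:
    all affine solutions (x, y) plus the point at infinity."""
    squares = {}
    for y in range(p):                      # group y by y^2 mod p
        squares.setdefault(y * y % p, []).append(y)
    count = 1                               # the point at infinity
    for x in range(p):
        rhs = (x ** 3 - t * t * x) % p
        count += len(squares.get(rhs, []))
    return count

p = 13
n = count_points(p, t=2)   # number of rational points over F_13
a = p + 1 - n              # trace of Frobenius a_{p,t}
print(n, a)
```

The trace satisfies Hasse's bound |a| ≤ 2√p, which is a quick sanity check on the enumeration.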
Abstract: Clustering techniques have been used by many intelligent software agents to group similar access patterns of Web users into high-level themes which express users' intentions and interests. However, such techniques have mostly focused on one salient feature of the Web documents visited by the user, namely the extracted keywords. The major aim of these techniques is to come up with an optimal threshold for the number of keywords needed to produce more focused themes. In this paper we focus on both keyword and similarity thresholds to generate more concentrated themes, and hence build a sounder model of the user behavior. The purpose of this paper is twofold: to use distance-based clustering methods to recognize overall themes from the proxy log file, and to suggest efficient cut-off levels for the keyword and similarity thresholds which tend to produce clusters with better focus and more efficient size.
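A minimal sketch of similarity-threshold clustering over extracted keywords; the Jaccard measure and the single-pass scheme are illustrative assumptions, not necessarily the paper's method:

```python
def threshold_cluster(docs, sim_threshold=0.3):
    """Group documents (keyword lists) into themes: a document joins
    the most similar existing theme if the Jaccard similarity clears
    the threshold, otherwise it starts a new theme."""
    def jaccard(a, b):
        return len(a & b) / len(a | b)

    themes = []          # each theme is the union of its keyword sets
    for doc in docs:
        kws = set(doc)
        best = max(themes, key=lambda t: jaccard(kws, t), default=None)
        if best is not None and jaccard(kws, best) >= sim_threshold:
            best |= kws                       # merge into the theme
        else:
            themes.append(kws)                # start a new theme
    return themes

docs = [["sports", "football"], ["football", "league"],
        ["stocks", "market"], ["market", "trading"]]
print(len(threshold_cluster(docs)))  # two themes
```

Raising the similarity threshold splits themes into smaller, more focused clusters; lowering it merges them, which is the trade-off the cut-off levels control.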
Abstract: In this paper, flow around two cam-shaped cylinders has been studied numerically. The equivalent diameter of the cylinders is 27.6 mm. The center-to-center spacing of the two cam-shaped cylinders is defined as the longitudinal pitch ratio, and it varies in the range of 2; … varies in the range of 50.
Abstract: Collateralized Debt Obligations are not as widely used
nowadays as they were before the 2007 subprime crisis. Nonetheless,
there remains an enthralling challenge to optimize cash flows
associated with synthetic CDOs. A Gaussian-based model is used
here in which default correlation and unconditional probabilities of
default are highlighted. Then numerous simulations are performed
based on this model for different scenarios in order to evaluate the
associated cash flows given a specific number of defaults at different
periods of time. Cash flows are not solely calculated on a single
bought or sold tranche but rather on a combination of bought and
sold tranches. With some assumptions, the simplex algorithm gives
a way to find the maximum cash flow according to correlation of
defaults and maturities. The Gaussian model used is not realistic in
crisis situations. Besides, the present system does not handle buying or
selling a portion of a tranche, but only the whole tranche. However, the
work provides the investor with relevant elements on how to know
what and when to buy and sell.
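The scenario-generation step of such a Gaussian model can be sketched with a one-factor copula; portfolio size, default probability and correlation below are illustrative, and the simplex cash-flow optimization itself is not reproduced here:

```python
import random
from statistics import NormalDist

def simulate_defaults(n_names=100, p_default=0.05, rho=0.3,
                      n_sims=10_000, seed=42):
    """One-factor Gaussian copula: name i defaults in a scenario when
    sqrt(rho)*M + sqrt(1-rho)*Z_i < Phi^{-1}(p_default), with M the
    common factor and Z_i idiosyncratic noise. Returns the number of
    defaults in each simulated scenario."""
    rng = random.Random(seed)
    threshold = NormalDist().inv_cdf(p_default)
    a, b = rho ** 0.5, (1 - rho) ** 0.5
    counts = []
    for _ in range(n_sims):
        m = rng.gauss(0, 1)                    # common market factor
        counts.append(sum(
            a * m + b * rng.gauss(0, 1) < threshold
            for _ in range(n_names)))
    return counts

counts = simulate_defaults(n_sims=2000)
print(sum(counts) / len(counts))  # average defaults, near 100 * 0.05
```

The distribution of these default counts over time is what drives the tranche cash flows that the linear program then optimizes.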
Abstract: Thailand's health system is challenged by the rising
number of patients and a decreasing ratio of medical
practitioners to patients, especially in rural areas. This may tempt
inexperienced GPs to rush through the process of anamnesis with the
risk of incorrect diagnosis. Patients have to travel far to the hospital
and wait for a long time presenting their case. Many patients try to
cure themselves with traditional Thai medicine. Many countries are
making use of the Internet for medical information gathering,
distribution and storage. Telemedicine applications are a relatively
new field of study in Thailand; the infrastructure of ICT had
hampered widespread use of the Internet for using medical
information. With recent improvements made, health and technology
professionals can work out novel applications and systems to help
advance telemedicine for the benefit of the people. Here we explore
the use of telemedicine for people with health problems in rural areas
in Thailand and present a Telemedicine Diagnosis System for Rural
Thailand (TEDIST) for diagnosing certain conditions that people
with Internet access can use to establish contact with Community
Health Centers, e.g. by mobile phone. The system uses a Web-based
input method for individual patients' symptoms, which are passed to
an expert system for the analysis of conditions and appropriate
diseases. The analysis harnesses a knowledge base and a backward
chaining component to find out which health professionals should be
presented with the case. Doctors have the opportunity to exchange
emails or chat with the patients they are responsible for or other
specialists. Patients' data are then stored in a Personal Health Record.
Abstract: Purpose: Planning and dosimetry of different VMAT algorithms (SmartArc, Ergo++, Autobeam) are compared with IMRT for Head and Neck Cancer patients. Modelling was performed to rule out the causes of discrepancies between planned and delivered dose. Methods: Five HNC patients previously treated with IMRT were re-planned with SmartArc (SA), Ergo++ and Autobeam. Plans were compared with each other and against IMRT and evaluated using DVHs for PTVs and OARs, delivery time, monitor units (MU) and dosimetric accuracy. Modelling of control point (CP) spacing, leaf-end separation and MLC/aperture shape was performed to rule out causes of discrepancies between planned and delivered doses. Additionally, estimated arc delivery times, overall plan generation times and the effect of CP spacing and number of arcs on plan generation times were recorded. Results: Single-arc SmartArc plans (SA4d) were generally better than IMRT and double-arc plans (SA2Arcs) in terms of homogeneity and target coverage. Double-arc plans seemed to have a positive role in achieving an improved Conformity Index (CI) and better sparing of some Organs at Risk (OARs) compared to Step and Shoot IMRT (ss-IMRT) and SA4d. Overall, Ergo++ plans achieved the best CI for both PTVs. Dosimetric validation of all VMAT plans without modelling was found to be lower than that of ss-IMRT. Total MUs required for delivery were on average 19%, 30%, 10.6% and 6.5% lower than ss-IMRT for SA4d, SA2d (single arc with 2° gantry spacing), SA2Arcs and Autobeam plans respectively. Autobeam was the most efficient in terms of actual treatment delivery times, whereas Ergo++ plans took the longest to deliver. Conclusion: Overall, SA single-arc plans on average achieved the best target coverage and homogeneity for both PTVs. SA2Arc plans showed improved CI and sparing of some OARs. Very good dosimetric results were achieved with modelling. Ergo++ plans achieved the best CI. Autobeam resulted in the fastest treatment delivery times.
Abstract: This paper presents a large scale, quantitative investigation of the impact of discipline differences on the student experience of using an online learning environment (OLE). Based on a representative sample of 2526 respondents, a number of significant differences in the mean rating by broad discipline area of the importance of, and satisfaction with, a range of elements of an OLE were found. Broadly speaking, the Arts and Science and Technology discipline areas reported the lowest importance and satisfaction ratings for the OLE, while the Health and Behavioural Sciences area was the most satisfied with the OLE. A number of specific, systematic discipline differences are reported and discussed. Compared to the observed significant differences in mean importance ratings, there were fewer significant differences in mean satisfaction ratings, and those that were observed were less systematic than for importance ratings.
Abstract: The POS (also called DGPS/IMU) technique can obtain the exterior orientation elements of aerial photos, so triangulation and DLG production using POS can save large numbers of ground control points (GCP); this will improve the production efficiency of DLG and reduce the cost of collecting GCPs. This paper mainly researches the POS technique in the production of 1:10 000 scale DLG with respect to GCP distribution. We designed 23 kinds of ground control point distribution schemes and used the integrated sensor orientation method to perform the triangulation experiments; based on the results of the triangulation, we produced a map at the scale of 1:10 000 and tested its accuracy. This paper puts forward appropriate GCP distribution schemes based on the experiments and research above, and lays the groundwork for the application of the POS technique in photogrammetric 4D data production.
Abstract: A number of studies highlighted problems related to
ERP systems, yet, most of these studies focus on the problems during
the project and implementation stages but not during the
post-implementation use process. Problems encountered in the process of
using ERP would hinder the effective exploitation and the extended
and continued use of ERP systems and their value to organisations.
This paper investigates the different types of problems users
(operational, supervisory and managerial) faced in using ERP and
how 'feral system' is used as the coping mechanism. The paper
adopts a qualitative method and uses data collected from two cases
and 26 interviews, to inductively develop a causal network model of
ERP usage problems and their coping mechanisms. This model classifies
post-ERP usage problems as data quality, system quality, interface
and infrastructure problems. The model also categorises the different
coping mechanisms through the use of 'feral systems', inclusive of
feral information systems, feral data and feral use of technology.
Abstract: A systematic and exhaustive method based on the group
structure of a unitary Lie algebra is proposed to generate an enormous
number of quantum codes. With respect to the algebraic structure,
the orthogonality condition, which is the central rule of generating
quantum codes, is proved to be fully equivalent to the distinguishability
of the elements in this structure. In addition, four types of
quantum codes are classified according to the relation of the codeword
operators and some initial quantum state. By linking the unitary Lie
algebra with the additive group, the classical correspondences of some
of these quantum codes can be rendered.
Abstract: A phylogenetic tree is a graphical representation of the
evolutionary relationships among three or more genes or organisms.
These trees show the relatedness of data sets, the divergence times of
species or genes, and the nature of their common ancestors. The quality
of a phylogenetic tree is judged by a parsimony criterion. Various
approaches have been proposed for constructing the most parsimonious
trees. This paper is concerned with calculating and optimizing the
number of state changes needed, the task addressed by small parsimony
algorithms. This paper proposes an enhanced small parsimony algorithm
that gives a better score based on the number of evolutionary changes
needed to produce the observed sequence changes in the tree, and also
gives the ancestors of the given input.
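The abstract does not detail its enhancement; as background, the classical Fitch small parsimony algorithm that such methods build on can be sketched as:

```python
def fitch_score(tree, leaf_states):
    """Fitch's small parsimony: minimum number of state changes
    needed on a fixed binary tree to explain the observed leaf states.
    tree: nested 2-tuples of leaf names; leaf_states: name -> state."""
    changes = 0

    def visit(node):
        nonlocal changes
        if isinstance(node, str):              # a leaf
            return {leaf_states[node]}
        left, right = (visit(child) for child in node)
        common = left & right
        if common:
            return common                      # no change needed here
        changes += 1                           # empty intersection: one change
        return left | right

    visit(tree)
    return changes

# Toy tree ((A,B),(C,D)) with nucleotide states at the leaves
tree = (("A", "B"), ("C", "D"))
score = fitch_score(tree, {"A": "T", "B": "C", "C": "A", "D": "C"})
print(score)  # 2 changes
```

A second, top-down pass over the same state sets would recover the ancestral states, which is the other output the abstract mentions.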
Abstract: The number of breakdowns experienced by a machine is a highly under-dispersed count random variable, and its value can be attributed to factors related to the mechanical input and output of that machine. Analyzing such under-dispersed count observations as a function of explanatory factors has been a challenging problem. In this paper, we aim at estimating the effects of various factors on the number of breakdowns experienced by a passenger car, based on a study performed in Mauritius over a year. We remark that the number of passenger car breakdowns is highly under-dispersed. These data are therefore modelled and analyzed using a Com-Poisson regression model. We use a quasi-likelihood estimation approach to estimate the parameters of the model. The under-dispersion parameter is estimated to be 2.14, justifying the appropriateness of the Com-Poisson distribution in modelling the under-dispersed count responses recorded in this study.
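A minimal sketch of the Com-Poisson (Conway–Maxwell–Poisson) probability mass function, whose extra parameter ν > 1 captures under-dispersion (the study estimates ν at about 2.14); the series truncation and example parameters are assumptions:

```python
def com_poisson_pmf(y, lam, nu, terms=200):
    """Conway-Maxwell-Poisson pmf: P(Y = y) proportional to
    lam**y / (y!)**nu. nu > 1 gives under-dispersion; nu = 1
    recovers the ordinary Poisson. The normalizing constant is a
    series, here truncated after `terms` terms."""
    z, term = 0.0, 1.0       # term j equals lam**j / (j!)**nu
    num = 0.0
    for j in range(terms):
        if j == y:
            num = term
        z += term
        term *= lam / (j + 1) ** nu   # incremental update avoids overflow
    return num / z

# Sanity check: nu = 1 reduces to Poisson, so P(Y=0) = exp(-lam)
print(round(com_poisson_pmf(0, lam=2.0, nu=1.0), 4))  # 0.1353
```

In the regression setting, lam is linked to the covariates while ν controls how much narrower than Poisson the response distribution is.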
Abstract: Effective cooling of electronic equipment has emerged
as a challenging and constraining problem of the new century. In the
present work the feasibility and effectiveness of jet impingement
cooling on electronics were investigated numerically and
experimentally. Studies have been conducted to see the effect of the
geometrical parameters such as jet diameter (D), jet-to-target
spacing (Z) and ratio of jet spacing to jet diameter (Z/D) on the heat
transfer characteristics. The values of Reynolds numbers considered
are in the range 7000 to 42000. The results obtained from the
numerical studies are validated by conducting experiments. From the
studies it is found that the optimum value of Z/D ratio is 5. For a
given Reynolds number, the Nusselt number increases by about 28%
if the diameter of the nozzle is increased from 1 mm to 2 mm.
Correlations are proposed for Nusselt number in terms of Reynolds
number and these are valid for air as the cooling medium.
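Jet-impingement correlations of this kind are commonly of the power-law form Nu = C·Re^m; the sketch below fits such a form by least squares in log-log space, on synthetic data rather than the paper's measurements:

```python
from math import log, exp

def fit_power_law(re, nu):
    """Least-squares fit of Nu = C * Re**m in log-log space, the
    usual form of impingement heat transfer correlations."""
    xs = [log(r) for r in re]
    ys = [log(v) for v in nu]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))     # slope = exponent m
    c = exp(my - m * mx)                       # intercept gives C
    return c, m

# Illustrative data only (not the paper's measurements):
re = [7000, 14000, 28000, 42000]
nu = [0.3 * r ** 0.6 for r in re]              # synthetic Nu values
c, m = fit_power_law(re, nu)
print(round(c, 3), round(m, 3))                # recovers 0.3 and 0.6
```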
Abstract: In this study, a two-dimensional axisymmetric, steady-state and incompressible laminar flow over a single rotating disk is numerically investigated. The finite volume method is used for solving the momentum equations. The numerical model and results
are validated by comparing it to previously reported experimental data for velocities, angles and moment coefficients. It is
demonstrated that increasing the axial distance increases the value of axial velocity and vice versa for tangential and total velocities. However, the maximum value of nondimensional radial velocity
occurs near the disk wall. It is also found that as the rotational Reynolds number increases, the moment coefficient decreases.
Abstract: Coated tool inserts can be considered as the backbone
of machining processes due to their wear and heat resistance.
However, defects of coating can degrade the integrity of these inserts
and the number of these defects should be minimized or eliminated if
possible. Recently, advancements in coating processes and
analytical tools have opened a new era for optimizing coated tools.
First, an overview is given regarding coating technology for cutting
tool inserts. Testing techniques for coating layers properties, as well
as the various coating defects and their assessment are also surveyed.
Second, an experimental approach is introduced to examine the
possible coating defects and flaws of worn multicoated carbide
inserts using two important techniques namely scanning electron
microscopy and atomic force microscopy. Finally, a simple
procedure is recommended for investigating manufacturing
defects and flaws of worn inserts.
Abstract: In this paper, the decomposition-aggregation method
is used to carry out connective stability criteria for general linear
composite system via aggregation. The large scale system is
decomposed into a number of subsystems. By associating directed
graphs with dynamic systems in an essential way, we define the
relation between system structure and stability in the sense of
Lyapunov. The stability criteria are then associated with the stability
and system matrices of subsystems as well as those interconnected
terms among subsystems using the concepts of vector differential
inequalities and vector Lyapunov functions. Then, we show that the
stability of each subsystem and stability of the aggregate model
imply connective stability of the overall system. An example is
reported, showing the efficiency of the proposed technique.
Abstract: Society has grown to rely on Internet services, and the
number of Internet users increases every day. As more and more
users become connected to the network, the window of opportunity
for malicious users to do their damage becomes very great and
lucrative. The objective of this paper is to incorporate different
techniques into a classifier system to detect and classify intrusions
from normal network packets. Among several techniques, a Steady State
Genetic-based Machine Learning Algorithm (SSGBML) will be used
to detect intrusions. Steady State Genetic Algorithm (SSGA),
Simple Genetic Algorithm (SGA), Modified Genetic Algorithm and
Zeroth Level Classifier System are investigated in this research.
SSGA is used as a discovery mechanism instead of SGA, because SGA
replaces all old rules with newly produced rules, preventing old good
rules from participating in the next rule generation. The Zeroth Level
Classifier System is used to play the role of detector by matching
incoming environment message with classifiers to determine whether
the current message is normal or intrusion and receiving feedback
from environment. Finally, in order to attain the best results,
Modified SSGA will enhance our discovery engine by using Fuzzy
Logic to optimize crossover and mutation probability. The
experiments and evaluations of the proposed method were performed
with the KDD 99 intrusion detection dataset.
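The contrast drawn above between steady-state and generational replacement can be sketched on a toy OneMax fitness; population size, rates and step count are illustrative:

```python
import random

def steady_state_ga(fitness, n_bits=20, pop_size=30,
                    steps=2000, p_mut=0.05, seed=1):
    """Steady-state GA: each step breeds one child and, if it is at
    least as fit, replaces only the worst individual, so good old
    rules survive -- unlike a simple (generational) GA, which
    discards the whole population each generation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(steps):
        p1, p2 = rng.sample(pop, 2)
        cut = rng.randrange(1, n_bits)
        child = p1[:cut] + p2[cut:]              # one-point crossover
        child = [b ^ (rng.random() < p_mut) for b in child]  # mutation
        worst = min(range(pop_size), key=lambda i: fitness(pop[i]))
        if fitness(child) >= fitness(pop[worst]):
            pop[worst] = child                   # replace only the worst
    return max(pop, key=fitness)

best = steady_state_ga(sum)   # OneMax: fitness = number of ones
print(sum(best))              # high OneMax score
```

In the intrusion detection setting, the individuals would be classifier rules and the fitness would come from environment feedback rather than a bit count.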
Abstract: The rapid urbanization of cities has a bane in the form
of road accidents that cause extensive damage to life and limb. A
number of location based factors are enablers of road accidents in the
city. The speed of travel of vehicles is non-uniform among locations
within a city. In this study, the perception of vehicle users is captured
on a 10-point rating scale regarding the degree of variation in speed
of travel at chosen locations in the city. The average rating is used to
cluster locations using fuzzy c-means clustering and classify them as
low, moderate and high speed-of-travel locations. The high
speed-of-travel locations can be identified proactively to ensure that
accidents do not occur due to the speeding of vehicles at such locations. The
advantage of fuzzy c-means clustering is that a location may be a
part of more than one cluster to a varying degree and this gives a
better picture about the location with respect to the characteristic
(speed of travel) being studied.
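A minimal sketch of fuzzy c-means on one-dimensional ratings; the sample ratings and all parameters are illustrative, not the study's data:

```python
import random

def fuzzy_c_means(xs, c=3, m=2.0, iters=100, seed=0):
    """Fuzzy c-means on 1-D ratings: every location gets a membership
    degree in each cluster (low / moderate / high speed of travel)
    rather than a single hard label."""
    rng = random.Random(seed)
    centers = rng.sample(xs, c)
    u = [[0.0] * c for _ in xs]
    for _ in range(iters):
        for i, x in enumerate(xs):                 # update memberships
            d = [abs(x - v) or 1e-9 for v in centers]
            for j in range(c):
                u[i][j] = 1.0 / sum((d[j] / dk) ** (2 / (m - 1))
                                    for dk in d)
        for j in range(c):                         # update centers
            w = [u[i][j] ** m for i in range(len(xs))]
            centers[j] = sum(wi * x for wi, x in zip(w, xs)) / sum(w)
    return centers, u

# Illustrative average ratings on the 10-point scale
ratings = [1.2, 1.5, 2.0, 4.8, 5.1, 5.5, 8.9, 9.2, 9.6]
centers, u = fuzzy_c_means(ratings)
print(sorted(round(v, 1) for v in centers))
```

The membership matrix u is what lets one location belong to more than one cluster to a varying degree, the property the abstract highlights.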