Abstract: A new dynamic clustering approach (DCPSO), based
on Particle Swarm Optimization, is proposed. This approach is
applied to unsupervised image classification. The proposed approach
automatically determines the "optimum" number of clusters and
simultaneously clusters the data set with minimal user interference.
The algorithm starts by partitioning the data set into a relatively large
number of clusters to reduce the effects of initial conditions. Using
binary particle swarm optimization the "best" number of clusters is
selected. The centers of the chosen clusters are then refined via the
K-means clustering algorithm. The experiments conducted show that
the proposed approach generally found the "optimum" number of
clusters on the tested images.
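A minimal sketch of the pipeline just described, on toy data: a binary PSO selects a subset of candidate centres, and K-means then refines the chosen centres. The data, the PSO parameters (n_particles, w, c1, c2) and the simple fitness with a per-cluster penalty are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)
# toy data: three Gaussian blobs
data = np.vstack([rng.normal(m, 0.3, (40, 2)) for m in ((0, 0), (3, 3), (0, 3))])
candidates = data[rng.choice(len(data), 10, replace=False)]  # large initial pool

def fitness(bits):
    sel = candidates[bits.astype(bool)]
    if len(sel) == 0:
        return np.inf
    d = np.linalg.norm(data[:, None] - sel[None], axis=2)
    return d.min(axis=1).mean() + 0.05 * len(sel)  # penalise extra clusters

n_particles, w, c1, c2 = 8, 0.7, 1.5, 1.5  # illustrative PSO parameters
pos = (rng.random((n_particles, len(candidates))) > 0.5).astype(float)
vel = rng.normal(0.0, 1.0, pos.shape)
pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(30):  # binary PSO: choose the "best" subset of centres
    r1, r2 = rng.random(vel.shape), rng.random(vel.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = (rng.random(vel.shape) < 1.0 / (1.0 + np.exp(-vel))).astype(float)
    f = np.array([fitness(p) for p in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[pbest_f.argmin()].copy()

centres = candidates[gbest.astype(bool)]
for _ in range(10):  # K-means refinement of the selected centres
    labels = np.linalg.norm(data[:, None] - centres[None], axis=2).argmin(axis=1)
    centres = np.array([data[labels == k].mean(axis=0) if np.any(labels == k)
                        else centres[k] for k in range(len(centres))])
```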
Abstract: Road authorities have confronted problems in
maintaining the serviceability of road infrastructure systems by using
various traditional methods of contracting. As a solution to these
problems, many road authorities have started contracting out road
maintenance works to the private sector based on performance
measures. This contracting method is named Performance-Based
Maintenance Contracting (PBMC). It is considered more cost-effective
than other traditional methods of contracting, and it has a
substantial record of success in many developed and developing
countries over the last two decades. This paper discusses and
analyses the potential issues to be considered before the introduction
of PBMC in a country.
Abstract: This paper presents four unsupervised clustering algorithms, namely sIB, RandomFlatClustering, FarthestFirst, and FilteredClusterer, which previous works have not used for network traffic classification. The methodology, the results, the clusters produced, and an evaluation of each algorithm's accuracy are presented. The efficiency of these algorithms is also assessed in terms of the time each takes to generate clusters quickly and correctly. Our work studies and tests these algorithms by classifying traffic anomalies in network traffic using attributes that have not been used before. We analyse the algorithm with the best efficiency, or the best learning performance, and compare it to the previously used K-Means. Our research can be used to develop more efficient anomaly detection systems in the future.
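The accuracy measure used to compare clustering algorithms can be sketched as a majority-vote mapping from clusters to traffic classes; the labels and cluster assignments below are toy values, not network flows.

```python
import numpy as np

def cluster_accuracy(labels, clusters):
    """Map each cluster to its majority class, then report the share correct."""
    labels, clusters = np.asarray(labels), np.asarray(clusters)
    correct = 0
    for c in np.unique(clusters):
        members = labels[clusters == c]
        correct += (members == np.bincount(members).argmax()).sum()
    return correct / len(labels)

# toy example: 2 traffic classes, 2 clusters, one misplaced instance
acc = cluster_accuracy([0, 0, 0, 1, 1, 1], [0, 0, 1, 1, 1, 1])  # 5/6 correct
```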
Abstract: Soy polyol, obtained from the hydroxylation of soy
epoxide with ethylene glycol, was prepared as a pre-polyurethane. A
two-step process was applied in the polyurethane synthesis. The blend
of soy polyol and synthetic polyol was then reacted simultaneously
with TDI (2,4):MDI (4,4') (80:20), a blowing agent, and a surfactant.
Ethylene glycol did not take part in the polyurethane synthesis; its
inclusion was used as a control. Characterization of the polyurethane
foam through impact resilience, indentation deflection, and density
can visualize the polyurethane classifications.
Abstract: Computer network courses are essential parts of the college computer science curriculum, and hands-on networking experience is well recognized as an effective approach to help students better understand network concepts, the layered architecture of network protocols, and the dynamics of networks. However, existing networking labs are usually server-based and relatively cumbersome, requiring a certain level of specialty and resources to set up and maintain the lab environment. Many universities and colleges lack the resources and infrastructure in this field and have difficulty providing students with hands-on practice labs. A new affordable and easily adoptable approach to networking labs is desirable to enhance network teaching and learning. In addition, current network labs fall short in providing hands-on practice for modern wireless and mobile network learning. With the prevalence of smart mobile devices, wireless and mobile networks are permeating various aspects of our information society. Emerging mobile technology provides computer science students with more authentic learning opportunities, especially in network learning. Mobile device based hands-on labware can provide an excellent ‘real world’ authentic learning environment for computer networks, especially for wireless network study. In this paper, we present our mobile device based hands-on labware (a series of lab modules) for computer network learning, guided by authentic learning principles to immerse students in a relevant real-world learning environment. We have been using this labware in teaching computer network, mobile security, and wireless network classes. The student feedback shows that students learn more when they have hands-on authentic learning experience.
Abstract: Screening makes it possible to detect the tumorous lesions of cervical cancer; it is particularly complex and time-consuming, because it consists of seeking a few abnormal cells among a cluster of normal cells. In this paper, we present our proposed computer system for helping doctors in screening for cervical cancer. Since the diagnosis of malignancy is based on a set of atypical morphological details of the cells, we present an unsupervised genetic algorithm for the separation of cell components, as the diagnosis is performed by analysis of the nucleus and the cytoplasm. We also give the various algorithms used for computing the morphological characteristics of cells (nucleus/cytoplasm ratio, cellular deformity, ...) necessary for the recognition of the illness.
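One of the morphological characteristics mentioned above, the nucleus/cytoplasm ratio, can be sketched directly from binary segmentation masks; the masks here are toy arrays, not the output of the genetic-algorithm separation.

```python
import numpy as np

def nucleus_cytoplasm_ratio(nucleus_mask, cell_mask):
    """Area ratio of nucleus to cytoplasm (cell area minus nucleus area)."""
    nucleus_area = nucleus_mask.sum()
    cytoplasm_area = cell_mask.sum() - nucleus_area
    return nucleus_area / cytoplasm_area

cell = np.zeros((8, 8), dtype=bool)
cell[1:7, 1:7] = True          # 36-pixel toy cell
nucleus = np.zeros((8, 8), dtype=bool)
nucleus[3:5, 3:5] = True       # 4-pixel toy nucleus
ratio = nucleus_cytoplasm_ratio(nucleus, cell)  # 4 / 32 = 0.125
```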
Abstract: Wheat has a bimodal starch granule population, and the dependency of the rate of enzymatic hydrolysis on particle size has been investigated. Ungelatinised wheaten starch granules were separated into two populations by sedimentation and decantation. Particle size was analysed by laser diffraction, and morphological characteristics were viewed using SEM. The sedimentation technique, though lengthy, gave satisfactory separation of the granules. Samples (<10 μm and original) were digested with α-amylase using a dialysis model. Granules smaller than 10 μm were digested faster than those larger than 10 μm; the digestion rate was dependent on particle size, whereby smaller granules produced a higher rate of release. The methodology and results reported here can be used as a basis for further evaluations designed to delay the release of glucose during the digestion of native starches.
Abstract: Among the many methods used for optimizing
engineering problems, mathematical (numerical) optimization
techniques are very important because they can easily be used and are
consistent with most engineering problems. Many studies have been
done on the stability analysis of three-dimensional (3D) slopes, the
related probable slip surfaces, and the determination of factors of
safety, but in most of them the force equilibrium equations are
considered in only two directions, as in simplified 2D methods. In
other words, to reduce the mathematical calculations and for
simplification, the force equilibrium equation in the third direction is
omitted. This point is considered in only a few previous studies, and
most of them have only given a factor of safety without making
enough effort to find the most probable slip surface. In this study, the
shapes of the slip surfaces are modeled and safety factors are
calculated considering the force equilibrium equations in all three
directions; the moment equilibrium equation is also satisfied in the
slip direction, and the shape of the most probable slip surface is
determined using nonlinear programming techniques. The model used
in this study is a 3D model composed of three upper surfaces which
can cover all defined and probable slip surfaces. The meshing process
is done in such a way that all elements are prismatic with quadrilateral
cross sections, and the safety factor is defined on the quadrilateral
surface at the base of each element, which is a part of the whole slip
surface. The method used to find the most probable slip surface is
nonlinear programming, in which the objective function to be
optimized is the factor of safety, a function of the soil properties and
the coordinates of the nodes on the probable slip surface. The main
reason for using a nonlinear programming method in this research is
its quick convergence to the desired solutions. The final results show
good compatibility with previously used classical and 2D methods, as
well as a reasonable convergence speed.
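The optimisation step described above can be sketched as follows: the factor of safety is treated as an objective over slip-surface node depths and minimised with a nonlinear programming routine. The quadratic toy objective and the "critical surface" values below are illustrative assumptions standing in for the real 3D limit-equilibrium computation.

```python
import numpy as np
from scipy.optimize import minimize

def factor_of_safety(depths):
    # toy stand-in: penalise deviation from a hypothetical critical surface
    critical = np.array([1.0, 2.0, 2.5, 2.0, 1.0])
    return 1.2 + 0.1 * np.sum((depths - critical) ** 2)

# minimise the safety factor over the node depths (nonlinear programming)
res = minimize(factor_of_safety, x0=np.zeros(5), method="BFGS")
fos, critical_surface = res.fun, res.x
```

With a real limit-equilibrium objective, the minimiser would return both the lowest factor of safety and the node coordinates of the most probable slip surface.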
Abstract: This paper studies the synthesis of monoacylglycerol (monolaurin) by glycerolysis of coconut oil and crude glycerol, catalyzed by Carica papaya lipase. The raw materials were coconut oil, obtained by a cold-press extraction method, and crude glycerol, obtained from the biodiesel plant at the Department of Chemistry, Uttaradit Rajabhat University, Thailand, which uses waste oils for biodiesel production through a transesterification process catalyzed by sodium hydroxide. The influences of the following variables were studied: (i) type of organic solvent, (ii) molar ratio of substrates, (iii) reaction temperature, (iv) reaction time, (v) lipase dosage, and (vi) initial water activity of the enzyme. A high yield of monoacylglycerol (58.35%) was obtained with a molar ratio of glycerol to oil of 8:1 in ethanol, a temperature controlled at 45 °C for 36 hours, an enzyme amount of 20 wt% of oil, and an initial water activity of the enzyme of 0.53.
Abstract: A mammal's body can be seen as a blood vessel network
with complex tunnels. When the heart pumps blood periodically,
blood runs through the blood vessels and rebounds from their walls.
Blood pressure signals can be measured with complex but periodic
patterns. When an artery is clamped during a surgical operation, the
spectrum of the blood pressure signals will differ from that of the
normal situation. In this investigation, intestinal artery clamping
operations were conducted on a pig to simulate intestinal blocking
during a surgical operation. Similarity theory is a convenient and easy
tool to show that the patterns of blood pressure signals with the
intestinal artery blocked and unblocked are indeed different, and the
Hilbert-Huang Transform can be applied to extract the characteristic
parameters of the blood pressure pattern. In conclusion, the patterns
of blood pressure signals in the two situations, intestinal artery
blocked and unblocked, can be distinguished by the characteristic
parameters defined in this paper.
Abstract: Formulation of the biological profile is one of the modern roles of the forensic anthropologist. The present study was conducted to estimate stature using foot and shoeprint length in a Malaysian population. This work can provide very useful information for the identification of individuals in forensic cases based on shoeprint evidence; it can help narrow down suspects and ease police investigations. Besides, stature is an important parameter in determining the partial identity of unidentified and mutilated bodies. Thus, this study can help with the problems encountered in cases of mass disasters, massacres, explosions, and assaults, where it is very hard to identify body parts because people are dismembered and become unrecognizable. Samples in this research were collected from 200 Malaysian adults (100 males and 100 females) aged from 20 to 45 years. Shoeprint length was measured from prints made by flat shoes; other information such as gender, foot length, and height of the subject was also recorded. The data were analyzed using IBM® SPSS Statistics 19 software. Results indicated that foot length has a stronger correlation with stature than shoeprint length for both feet. However, when gender was treated as unknown, both foot length and shoeprint length showed better correlations than when males and females were analyzed separately. In addition, prediction equations were developed to estimate stature using linear regression analysis of foot length and shoeprint length; foot length gives better predictions than shoeprint length.
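The prediction step can be sketched as a simple linear regression of stature on foot length. The sample and the 3.0x + 90 relation below are synthetic assumptions for illustration, not the paper's SPSS-fitted equations.

```python
import numpy as np

rng = np.random.default_rng(1)
foot_length = rng.uniform(22, 29, 50)                      # cm, synthetic sample
stature = 3.0 * foot_length + 90 + rng.normal(0, 1, 50)    # hypothetical relation

# fit the prediction equation: stature = slope * foot_length + intercept
slope, intercept = np.polyfit(foot_length, stature, 1)

def predict_stature(fl):
    return slope * fl + intercept
```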
Abstract: Signalized intersections on high-volume arterials are
often congested during peak hours, causing a decrease in through
movement efficiency on the arterial. Much of the vehicle delay
incurred at conventional intersections is caused by high left-turn
demand. Unconventional intersection designs attempt to reduce
intersection delay and travel time by rerouting left turns away from
the main intersection and replacing them with a right turn followed
by a U-turn. The proposed new type of U-turn intersection is geometrically
designed with a raised island which provides a protected U-turn
movement. In this study several scenarios based on different
distances between U-turn and main intersection, traffic volume of
major/minor approaches and percentage of left-turn volumes were
simulated using AIMSUN, a traffic microsimulation package.
Subsequently, models are proposed to compute the travel time of each
movement. By correlating these equations with field data collected at
some implemented U-turn facilities, the reliability of the proposed
models is confirmed.
With these models it would be possible to calculate travel time of
each movement under any kind of geometric and traffic condition. By
comparing travel time of a conventional signalized intersection with
U-turn intersection travel time, it would be possible to decide on
converting signalized intersections into this new kind of U-turn
facility or not. However, comparison of travel times is not within the
scope of this research; in this paper only the travel time of this
innovative U-turn facility is predicted. According to some
before-and-after studies of the traffic performance of implemented
U-turn facilities, this new type of U-turn facility commonly produces
lower travel times. Thus, the use of this type of unconventional
intersection deserves serious consideration.
Abstract: In this paper, algorithms for the automatic localisation
of two anatomical soft tissue landmarks of the head, the medial
canthus (inner corner of the eye) and the tragus (a small, pointed,
cartilaginous flap of the ear), in CT images are described. These
landmarks are to be used as a basis for an automated image-to-patient
registration system we are developing. The landmarks are localised
on a surface model extracted from CT images, based on surface
curvature and a rule-based system that incorporates prior knowledge
of the landmark characteristics. The approach was tested on a dataset
of near isotropic CT images of 95 patients. The position of the
automatically localised landmarks was compared to the position of
the manually localised landmarks. The average difference was 1.5
mm and 0.8 mm for the medial canthus and tragus, with a maximum
difference of 4.5 mm and 2.6 mm, respectively. The medial canthus
and tragus can thus be automatically localised in CT images, with
performance comparable to manual localisation.
Abstract: The prediction of transmembrane helical segments
(TMHs) in membrane proteins is an important field in bioinformatics
research. In this paper, a new method based on the discrete wavelet
transform (DWT) is developed to predict the number and location of
TMHs in membrane proteins. The protein with PDB code 1KQG is
used as an example to illustrate the prediction of the number and
location of TMHs with this method. To assess the method, 80
proteins with known 3D structure from the MPtopo database were
chosen at random as test objects (containing 325 TMHs in total); 308
of these TMHs were predicted accurately, giving an average
prediction accuracy of 96.3%. In addition, the 80 membrane proteins
were divided into 13 groups according to their function and type; the
prediction results for TMHs in all 13 groups are satisfactory.
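The DWT idea can be illustrated with a one-level Haar transform of a toy residue hydrophobicity profile: the transform smooths the signal, after which contiguous high-hydrophobicity runs are counted as helix candidates. The profile and the 0.5 threshold are assumptions for illustration, not the paper's procedure for 1KQG.

```python
import numpy as np

def haar_approx(signal):
    """One-level Haar DWT approximation (pairwise averages), upsampled back."""
    s = np.asarray(signal, dtype=float)
    approx = (s[0::2] + s[1::2]) / 2.0
    return np.repeat(approx, 2)

# toy hydrophobicity profile: two hydrophobic stretches separated by loops
profile = np.array([0.1] * 6 + [0.9] * 8 + [0.2] * 6 + [0.8] * 8 + [0.1] * 4)
smooth = haar_approx(profile)
mask = smooth > 0.5                       # candidate helix residues
# count contiguous runs above the threshold (candidate TMHs)
n_segments = np.count_nonzero(np.diff(mask.astype(int)) == 1) + int(mask[0])
```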
Abstract: Dichotomization of the outcome by a single cut-off point is an important part of various medical studies. Usually, the relationship between the resulting dichotomized dependent variable and the explanatory variables is analyzed with linear regression, probit regression, or logistic regression. However, in many real-life situations, the cut-off point dividing the outcome into two groups is unknown and can be specified only approximately, i.e. surrounded by some (small) uncertainty. This means that, in order to have any practical meaning, the regression model must be robust to this uncertainty. In this paper, we show that neither the beta in the linear regression model nor its significance level is robust to small variations in the dichotomization cut-off point. As an alternative, robust approach to the problem of uncertain medical categories, we propose using a linear regression model with a fuzzy membership function as the dependent variable. This fuzzy membership function denotes the degree to which the value of the underlying (continuous) outcome falls below or above the dichotomization cut-off point. We demonstrate that the linear regression model with the fuzzy dependent variable can be insensitive to uncertainty in the cut-off point location. We present modeling results from a real study of low hemoglobin levels in infants: we systematically test the robustness of the binomial regression model and of the linear regression model with the fuzzy dependent variable by changing the boundary of the category "Anemia", and show that the behavior of the latter model persists over a fairly wide interval.
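The proposed fuzzy dependent variable can be sketched as a smooth membership function for the category "Anemia"; the 11 g/dL cut-off and the spread parameter below are illustrative values, not those of the study. The resulting membership values would then serve as the dependent variable in an ordinary linear regression.

```python
import numpy as np

def anemia_membership(hb, cutoff=11.0, spread=0.5):
    """Degree to which hemoglobin hb falls below the cut-off (1 = clearly below)."""
    return 1.0 / (1.0 + np.exp((hb - cutoff) / spread))

hb = np.array([9.0, 10.5, 11.0, 11.5, 13.0])  # g/dL, toy measurements
mu = anemia_membership(hb)                    # fuzzy "Anemia" memberships
```

Because the membership changes smoothly around the cut-off, a small shift of the boundary perturbs the dependent variable only slightly, which is the source of the robustness argued above.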
Abstract: This paper deals with a mathematical model for fluid dynamic flows on road networks based on conservation laws. This nonlinear framework is based on the conservation of cars. We focus on a traffic circle, a finite number of roads that meet at junctions; the junctions of the circle have either one incoming and two outgoing roads or two incoming and one outgoing road. We describe the numerical schemes, with the particular boundary conditions used, that produce approximate solutions of the problem.
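A sketch of a Godunov finite-volume scheme for the conservation-of-cars model rho_t + (rho(1 - rho))_x = 0 on a single ring road; the paper's traffic circle couples several such roads through junction conditions, which are omitted in this simplification.

```python
import numpy as np

def f(r):
    return r * (1.0 - r)  # concave traffic flux, maximum at rho = 1/2

def godunov_flux(rl, rr):
    """Exact Godunov interface flux for the concave flux f(rho) = rho(1-rho)."""
    fl, fr = f(rl), f(rr)
    decel = np.minimum(fl, fr)                        # rl <= rr: min over endpoints
    accel = np.where((rl > 0.5) & (rr < 0.5), 0.25,   # sonic point inside interval
                     np.maximum(fl, fr))              # rl > rr otherwise
    return np.where(rl <= rr, decel, accel)

n = 100
dx, dt = 1.0 / n, 0.4 / n                 # CFL number 0.4 (max wave speed is 1)
rho = np.where(np.arange(n) < n // 2, 0.8, 0.2)  # congested half, free half
for _ in range(200):
    flux = godunov_flux(rho, np.roll(rho, -1))       # flux at right interfaces
    rho = rho - dt / dx * (flux - np.roll(flux, 1))  # conservative update
total_cars = rho.sum() * dx               # conserved on the periodic ring
```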
Abstract: Ontologies and tagging systems are two different ways to organize the knowledge present in the current Web. In this paper, we propose a simple method to model folksonomies, as tagging systems, with ontologies, and we show the scalability of the method using real data sets. The modeling method is composed of a generic ontology that represents any folksonomy and an algorithm that transforms the information contained in folksonomies into the generic ontology. The method allows folksonomies to be represented at any instant of time.
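The transformation into a generic ontology can be sketched as follows: folksonomy tagging events (user, tag, resource, time) are loaded into a small structure playing the role of the generic ontology. The class and property names (User, hasTag, etc.) are illustrative assumptions, not the paper's actual vocabulary.

```python
from collections import defaultdict

annotations = [                      # toy folksonomy: (user, tag, resource, time)
    ("alice", "python", "post1", "2010-01-01"),
    ("bob",   "python", "post2", "2010-01-02"),
    ("alice", "web",    "post2", "2010-01-03"),
]

ontology = {"User": set(), "Tag": set(), "Resource": set(), "Annotation": []}
tagged_with = defaultdict(set)       # resource -> tags, a derived property

for user, tag, resource, ts in annotations:
    ontology["User"].add(user)
    ontology["Tag"].add(tag)
    ontology["Resource"].add(resource)
    # each tagging event becomes a reified Annotation individual with a timestamp,
    # which is what allows the folksonomy to be queried at any instant of time
    ontology["Annotation"].append({"hasUser": user, "hasTag": tag,
                                   "hasResource": resource, "atTime": ts})
    tagged_with[resource].add(tag)
```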
Abstract: Recent years have witnessed the rapid development of
the Internet and telecommunication techniques. Information security
is becoming more and more important. Applications such as covert
communication, copyright protection, etc., stimulate the research of
information hiding techniques. Traditionally, encryption is used to
realize the communication security. However, important information
is not protected once decoded. Steganography is the art and science
of communicating in a way which hides the existence of the communication.
Important information is firstly hidden in a host data, such
as a digital image, video or audio, etc., and then transmitted secretly
to the receiver. In this paper, a data hiding model with high security
features combining both cryptography using finite state sequential
machine and image based steganography technique for communicating
information more securely between two locations is proposed.
The authors incorporated the idea of secret key for authentication
at both ends in order to achieve high level of security. Before the
embedding operation the secret information has been encrypted with
the help of finite-state sequential machine and segmented in different
parts. The cover image is also segmented into different objects through
normalized cut. Each part of the encoded secret information has been
embedded with the help of a novel image steganographic method
(PMM) on different cuts of the cover image to form different stego
objects. Finally, the stego image is formed by combining the different
stego objects and transmitted to the receiver side. At the receiving
end, the opposite processes are run to recover the original secret
message.
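An illustrative embed/extract round trip for the model above. Plain LSB substitution stands in for the authors' PMM embedding, and a fixed XOR keystream stands in for the finite-state-machine encryption; both are deliberate simplifications, not the paper's algorithms.

```python
import numpy as np

key = np.array([1, 0, 1, 1, 0, 1, 0, 0], dtype=np.uint8)  # toy keystream

def embed(cover, bits):
    """Write the given bits into the least significant bits of the cover."""
    stego = cover.copy()
    stego.flat[:len(bits)] = (stego.flat[:len(bits)] & 0xFE) | bits
    return stego

def extract(stego, n):
    """Read n bits back out of the least significant bits."""
    return (stego.flat[:n] & 1).astype(np.uint8)

secret = np.array([0, 1, 1, 0, 1, 0, 0, 1], dtype=np.uint8)
cipher = secret ^ key                                 # "encrypt" before embedding
cover = np.arange(64, dtype=np.uint8).reshape(8, 8)   # toy grayscale cover image
stego = embed(cover, cipher)
recovered = extract(stego, len(cipher)) ^ key         # decrypt after extraction
```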
Abstract: A stack with a small critical temperature gradient is
desirable for a standing wave thermoacoustic engine to obtain a low
onset temperature difference (the minimum temperature difference to
start the engine's self-oscillation). The viscous and heat relaxation
losses in the stack determine the critical temperature gradient. In this work, a
dimensionless critical temperature gradient factor is obtained based
on the linear thermoacoustic theory. It is indicated that the
impedance determines the proportion between the viscous loss, heat
relaxation losses and the power production from the heat energy. It
reveals the effects of the channel dimensions, geometrical
configuration and the local acoustic impedance on the critical
temperature gradient in stacks. The numerical analysis shows that
there exists a possible optimum combination of these parameters
which leads to the lowest critical temperature gradient. Furthermore,
several different geometries have been tested and compared
numerically.
Abstract: Coal fly ash (CFA) generated by coal-based thermal
power plants is mainly composed of some oxides having high
crystallinity, like quartz and mullite. In this study, the effect of CFA
crystallinity on lead adsorption capacity was investigated. To obtain
solids with various crystallinities, sodium hydroxide (NaOH)
solutions of 1-7 M were used to treat CFA at various temperatures
and reflux times. Furthermore, to evaluate the effect of the NaOH
treatment on adsorption capacity, the treated CFA samples were
examined as adsorbents for removing lead from solution. The results
show that treating CFA with NaOH decreases the crystallinity of
quartz and mullite. At higher NaOH concentrations (>3 M), the loss
of quartz and mullite crystallinity is additionally accompanied by the
formation of a crystalline phase called hydroxysodalite. The lower
the crystallinity, the higher the adsorption capacity.