Abstract: This study aims to segment objects using the K-means
algorithm on texture features. First, the algorithm transforms color
images into gray images, and a novel technique is described for
extracting texture features from an image. Then, within a group of
similar features, objects and background are differentiated using
the K-means algorithm. Finally, a new object segmentation algorithm
based on morphological techniques is proposed. The experiments
described include the segmentation of single and multiple objects,
and the region of an object can be accurately segmented out. As
shown in this paper, the results can help to perform image
retrieval and to analyze the features of an object.
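The pipeline in this abstract (gray image, texture features, K-means clustering) can be sketched roughly as follows. The local-variance texture feature and the checkerboard test image are illustrative stand-ins, since the paper's actual feature extraction and morphological post-processing are not detailed here:

```python
import numpy as np

def local_variance(gray, win=3):
    # Texture feature: grey-level variance in a sliding window.
    # (The paper's exact texture features are not specified; this is a stand-in.)
    pad = win // 2
    padded = np.pad(gray, pad, mode="edge")
    out = np.empty(gray.shape, dtype=float)
    for i in range(gray.shape[0]):
        for j in range(gray.shape[1]):
            out[i, j] = padded[i:i + win, j:j + win].var()
    return out

def kmeans_1d(values, k=2, iters=20):
    # Minimal k-means on a flat feature array, quantile-initialised
    # so the result is deterministic.
    x = values.ravel().astype(float)
    centers = np.quantile(x, np.linspace(0.0, 1.0, k))
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        for c in range(k):
            if np.any(labels == c):
                centers[c] = x[labels == c].mean()
    return labels.reshape(values.shape)

# Toy grey image: flat background, checkerboard-textured "object" in the centre.
img = np.full((20, 20), 100.0)
for i in range(5, 15):
    for j in range(5, 15):
        img[i, j] = 0.0 if (i + j) % 2 else 200.0

segmentation = kmeans_1d(local_variance(img), k=2)
```

The flat background has near-zero local variance while the textured centre has high variance, so the two clusters separate the object from the background.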
Abstract: In this paper, a novel scheme is proposed for ownership identification and authentication of color images, deploying cryptography and digital watermarking as the underlying technologies: the former is used to compute a content-based hash and the latter to embed the watermark. The host image whose owner will claim rightful ownership is first transformed from RGB to the YST color space, designed exclusively for watermarking applications. Geometrically, YS ⊥ T, and the T channel corresponds to the chrominance component of the color image, making it suitable for embedding the watermark. The T channel is divided into 4×4 non-overlapping blocks; the block size is important for enhanced localization, security, and low computation. Each block, together with the ownership information, is then processed by SHA-160, a one-way hash function, to compute a content-based hash that is always unique and resistant to birthday attacks, unlike MD5, which may yield collisions, i.e. H(m) = H(m'). The watermark payload varies from block to block and is computed from the variance factor α. The quality of the watermarked images is quite high both subjectively and objectively. Our scheme is blind, computationally fast, and exactly locates tampered regions.
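A minimal sketch of the per-block content hashing step, assuming a NumPy array stands in for the T channel and using Python's hashlib SHA-1 (a 160-bit digest) in place of the paper's SHA-160; the 4×4 block size matches the abstract, while the owner string and tamper test are illustrative:

```python
import hashlib
import numpy as np

def block_hashes(channel, owner_info, block=4):
    # 160-bit content-based hash for each non-overlapping block x block
    # region, binding the block contents to the ownership information.
    h, w = channel.shape
    digests = {}
    for i in range(0, h - h % block, block):
        for j in range(0, w - w % block, block):
            data = channel[i:i + block, j:j + block].tobytes() + owner_info.encode()
            digests[(i, j)] = hashlib.sha1(data).hexdigest()  # SHA-1: 160 bits
    return digests

# Toy 8x8 "T channel": tampering inside a block changes only that block's digest,
# which is what gives the scheme its tamper localization.
t = np.arange(64, dtype=np.uint8).reshape(8, 8)
d1 = block_hashes(t, "owner: Alice")
t2 = t.copy()
t2[0, 0] ^= 1                     # tamper a single pixel
d2 = block_hashes(t2, "owner: Alice")
changed = [k for k in d1 if d1[k] != d2[k]]
```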
Abstract: In this research, we study a control method for a
multi-vehicle system that takes into account the limited
communication range of each vehicle. When controlling networked
vehicles with limited communication range, it is important to
control the communication network structure of the multi-vehicle
system in order to keep the network connected. Accordingly, we aim
in particular to steer the network structure to a target structure.
We formulate the networked multi-vehicle system with disturbances
and communication constraints as a hybrid dynamical system, and
then study optimal control problems for this system. It is shown
that the system converges to the objective network structure in finite
time when it is controlled by the receding horizon method.
Additionally, the optimal control problems can be converted into
mixed-integer problems, which are solvable by a branch-and-bound
algorithm.
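As a rough illustration of the branch-and-bound idea invoked above, the following sketch solves a toy 0/1 knapsack problem, standing in for the paper's mixed-integer optimal control problems (whose exact formulation is not given in the abstract):

```python
def branch_and_bound(values, weights, capacity):
    # Depth-first branch and bound for a toy 0/1 knapsack.
    # Bound: LP relaxation of the remaining items (fractional knapsack).
    n = len(values)
    order = sorted(range(n), key=lambda i: values[i] / weights[i], reverse=True)
    v = [values[i] for i in order]
    w = [weights[i] for i in order]

    def bound(k, cap, val):
        # Optimistic value: fill the remaining capacity fractionally.
        for i in range(k, n):
            if w[i] <= cap:
                cap -= w[i]
                val += v[i]
            else:
                return val + v[i] * cap / w[i]
        return val

    best = 0
    stack = [(0, capacity, 0)]          # (next item index, remaining cap, value)
    while stack:
        k, cap, val = stack.pop()
        if val > best:
            best = val                  # val is always achievable
        if k == n or bound(k, cap, val) <= best:
            continue                    # prune: this branch cannot beat best
        stack.append((k + 1, cap, val))                     # exclude item k
        if w[k] <= cap:
            stack.append((k + 1, cap - w[k], val + v[k]))   # include item k
    return best
```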
Abstract: In this work, social stratification is considered one of
the significant factors that generate the phenomenon of "terrorism",
and the emphasis is placed on the correlation between the two, with
the aim of creating an info-logical model of the generation of the
phenomenon of "terrorism" based on the stratification process.
Abstract: Agricultural residue such as oil palm fronds (OPF) is
cheap, widespread, and available throughout the year. Hemicelluloses
extracted from OPF can be hydrolyzed to their monomers and used in
the production of xylooligosaccharides (XOs). The objective of the
present study was to optimize the enzymatic hydrolysis of OPF
hemicellulose for the production of XOs by varying pH, temperature,
and enzyme and substrate concentration. Hemicellulose was extracted
from OPF using 3 M potassium hydroxide (KOH) at a temperature of
40°C for 4 h with stirring at 400 rpm. The hemicellulose was then
hydrolyzed using Trichoderma longibrachiatum xylanase at different
pH values, temperatures, and enzyme and substrate concentrations.
XOs were characterized by reducing sugar determination. The optimum
conditions for producing XOs from OPF hemicellulose were pH 4.6, a
temperature of 40°C, an enzyme concentration of 2 U/mL, and a 2%
substrate concentration. The results established the suitability of
oil palm fronds as a raw material for the production of XOs.
Abstract: Using Dynamic Bayesian Networks (DBNs) to model genetic regulatory networks from gene expression data is one of the major paradigms for inferring interactions among genes. Averaging over a collection of models when predicting the network is preferable to relying on a single high-scoring model. In this paper, two kinds of model-search approaches are compared: Greedy hill-climbing Search with Restarts (GSR) and Markov Chain Monte Carlo (MCMC) methods. GSR is preferred in many papers, but there has been no comparative study of which approach is better for DBN models. Different types of experiments were carried out to benchmark these approaches. Our experimental results demonstrate that, on average, the MCMC methods outperform GSR in the accuracy of the predicted network while having comparable time efficiency. By proposing different variations of MCMC and employing a simulated annealing strategy, the MCMC methods become more efficient and stable. Apart from comparing these approaches, another objective of this study is to investigate the feasibility of using DBN modeling approaches to infer gene networks from a few snapshots of high-dimensional gene profiles. Through experiments on synthetic data as well as systematic data, the results reveal how the performance of these approaches is influenced as the target gene network varies in network size, data size, and system complexity.
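A toy sketch of the two search strategies being compared, assuming one-bit flips over bit-strings as a stand-in for single edge changes in a DBN structure, and an invented multimodal score in place of a real network scoring metric such as BDe or BIC:

```python
import math
import random

def score(state):
    # Invented multimodal score over bit-strings, standing in for a
    # DBN network score; it has deceptive local optima on purpose.
    s = sum(state)
    return -(s - 3) ** 2 + (2 if state[0] == 1 and state[-1] == 1 else 0)

def neighbors(state):
    # One-bit flips play the role of single edge additions/deletions.
    return [state[:i] + (1 - state[i],) + state[i + 1:] for i in range(len(state))]

def greedy_with_restarts(n=6, restarts=5, seed=0):
    # GSR: hill-climb to a local optimum, restart, keep the best.
    rng = random.Random(seed)
    best = None
    for _ in range(restarts):
        cur = tuple(rng.randint(0, 1) for _ in range(n))
        improved = True
        while improved:
            improved = False
            for nb in neighbors(cur):
                if score(nb) > score(cur):
                    cur, improved = nb, True
        if best is None or score(cur) > score(best):
            best = cur
    return best

def mcmc_search(n=6, steps=2000, temp=1.0, seed=0):
    # Metropolis sampler over structures; the best state visited is kept.
    # Downhill moves are sometimes accepted, which lets it escape traps.
    rng = random.Random(seed)
    cur = tuple(rng.randint(0, 1) for _ in range(n))
    best = cur
    for _ in range(steps):
        prop = rng.choice(neighbors(cur))
        if rng.random() < min(1.0, math.exp((score(prop) - score(cur)) / temp)):
            cur = prop
        if score(cur) > score(best):
            best = cur
    return best
```

Lowering `temp` over time turns the sampler into the simulated annealing variant mentioned in the abstract.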
Abstract: This article presents the development of efficient
algorithms for comparing tablet copies. Image recognition has
specialized uses in digital systems such as medical imaging,
computer vision, defense, and communication. Comparison between
two images that look indistinguishable is a formidable task: two
images taken from different sources might look identical, but due
to different digitizing properties they are not, while small
variations in image information such as cropping, rotation, and
slight photometric alteration make direct matching techniques
unsuitable. In this paper we introduce different matching
algorithms designed to help art centers identify real painting
images from fake ones. Different vision algorithms for local image
features are implemented using MATLAB. In this framework, a Table
Comparison Computer Tool (TCCT) is designed to facilitate our
research. The TCCT is a Graphical User Interface (GUI) tool used to
identify images by their shapes and objects. The parameters of the
vision system are fully accessible to the user through this
graphical user interface. For matching, it applies different
description techniques that can identify the exact figures of
objects.
Abstract: With the advance of information technology in the new
era, the use of the Internet to access data resources has steadily
increased, and huge amounts of data have become accessible in
various forms. Naturally, network providers and agencies seek to
prevent electronic attacks that may be harmful or related to
terrorist activity, and this has led the authorities to undertake a
variety of methods to protect sensitive regions of the network from
harmful data. One of the most important approaches is to use a
firewall in the network facilities. The main objective of a
firewall is to stop the transfer of suspicious packets in several
ways. However, because of its blind packet blocking, high
processing-power requirements, and high cost, some providers are
reluctant to use a firewall. In this paper we propose a method for
finding a discriminant function to distinguish between normal
packets and harmful ones by statistical processing of the network
router logs. Based on this discrimination, an administrator may
take appropriate action against the user. This method is very fast
and can be used simply alongside the Internet routers.
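One plausible reading of the proposed statistical discrimination is a Fisher-style linear discriminant fitted to per-packet features from router logs; the features (packet size, request rate) and the synthetic data below are invented for illustration, as the abstract does not specify them:

```python
import numpy as np

def fit_discriminant(normal, harmful):
    # Fisher linear discriminant: project onto w = S^-1 (mu1 - mu0),
    # where S is the pooled within-class scatter matrix.
    mu0, mu1 = normal.mean(axis=0), harmful.mean(axis=0)
    s = (np.cov(normal.T) * (len(normal) - 1)
         + np.cov(harmful.T) * (len(harmful) - 1))
    w = np.linalg.solve(s, mu1 - mu0)
    # Decision threshold: midpoint of the projected class means.
    threshold = 0.5 * ((normal @ w).mean() + (harmful @ w).mean())
    return w, threshold

def is_harmful(x, w, threshold):
    return float(x @ w) > threshold

# Synthetic per-packet feature vectors: [packet size, requests/sec].
rng = np.random.default_rng(0)
normal = rng.normal([500.0, 0.2], [50.0, 0.5], size=(200, 2))
harmful = rng.normal([1400.0, 5.0], [50.0, 0.5], size=(200, 2))
w, t = fit_discriminant(normal, harmful)
```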
Abstract: Covering-based rough sets are an extension of rough sets,
based on a covering instead of a partition of the universe; they
are therefore more powerful than rough sets in describing some
practical problems. However, by extending rough sets, covering-based
rough sets can increase the roughness of each model in recognizing
objects. How to obtain better approximations from covering-based
rough set models is therefore an important issue. In this paper,
two concepts, determinate elements and indeterminate elements in a
universe, are proposed and given precise definitions. This research
makes a reasonable refinement of the covering elements from a new
viewpoint, and the refinement may generate better approximations in
covering-based rough set models. To validate the theory, it is
applied to eight major covering-based rough set models adapted from
the literature. The result is that, in all these models, the lower
approximation increases effectively; correspondingly, in all models
the upper approximation decreases, with the exception of two models
in some special situations. Therefore, the roughness of recognizing
objects is reduced. This research provides a new approach to the
study and application of covering-based rough sets.
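The basic lower and upper approximations under a covering, which the refinement above aims to tighten, can be sketched as follows (this shows only the classic definitions, not the eight model variants studied in the paper):

```python
def lower_upper(covering, target):
    # Lower approximation: union of covering blocks contained in the target.
    # Upper approximation: union of covering blocks that intersect the target.
    lower, upper = set(), set()
    for block in covering:
        if block <= target:
            lower |= block
        if block & target:
            upper |= block
    return lower, upper

# Universe {1..6} with a covering: blocks may overlap, unlike a partition.
covering = [{1, 2}, {2, 3, 4}, {4, 5}, {5, 6}]
target = {1, 2, 3, 4}
low, up = lower_upper(covering, target)
```

The gap between the two approximations (here the element 5) is the "roughness" the paper's refinement seeks to reduce.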
Abstract: Nuclear matrix protein 22 (NMP22) is an FDA-approved
biomarker for bladder cancer. The objective of this study was to
develop a simple NMP22 immunosensor (NMP22-IMS) for accurate
measurement of NMP22. The NMP22-IMS was constructed with NMP22
antibody immobilized on screen-printed carbon electrodes; the
construction procedure and antibody immobilization are simple.
Results showed that the NMP22-IMS has an excellent (r² ≥ 0.95)
response range (20–100 ng/mL). In conclusion, a simple and reliable
NMP22-IMS was developed, capable of precisely determining urine
NMP22 levels.
Abstract: The objectives were to identify cyanide-degrading
bacteria and to study their cyanide removal efficiency.
Agrobacterium tumefaciens SUTS 1, a new strain of cyanide-degrading
microorganism, was isolated. The maximum growth of SUTS 1 reached
4.7 × 10^8 CFU/mL within 4 days. The cyanide removal efficiency was
studied at 25, 50, and 150 mg/L cyanide, and the residual cyanide,
ammonia, nitrate, nitrite, pH, and cell counts were analyzed. At 25
and 50 mg/L cyanide, SUTS 1 achieved similar removal efficiencies
of approximately 87.50%; at 150 mg/L cyanide, the removal
efficiency increased to 97.90%. Cell counts of SUTS 1 increased
when the cyanide concentration was set lower. Ammonia increased as
the removal efficiency increased, nitrate increased as ammonia
decreased, and nitrite was not detected in any experiment. pH
values also increased when the cyanide concentrations were set
higher.
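The removal efficiencies quoted above follow from the standard definition, efficiency = (initial − residual) / initial × 100%; the residual concentrations below are back-calculated for illustration and are not reported in the abstract:

```python
def removal_efficiency(initial_mg_per_l, residual_mg_per_l):
    # Percent of cyanide removed, from initial and residual concentrations.
    return 100.0 * (initial_mg_per_l - residual_mg_per_l) / initial_mg_per_l

# Hypothetical residuals consistent with the reported efficiencies:
eff_150 = removal_efficiency(150.0, 3.15)   # ~97.9 %
eff_50 = removal_efficiency(50.0, 6.25)     # 87.5 %
```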
Abstract: Systems Analysis and Design is a key subject in
Information Technology courses, but students do not find it easy to
cope with, since it is not "precise" like programming and not exact
like Mathematics. It is a subject that works with many concepts,
modeling ideas into visual representations and then translating the
pictures into a real-life system. To complicate matters, users who
are not necessarily familiar with computers need to give their
input to ensure that they get the system they need. Systems
Analysis and Design also covers two fields, namely Analysis,
focusing on the analysis of the existing system, and Design,
focusing on the design of the new system. To be able to test the
analysis and design of a system, it is necessary to develop the
system, or at least a prototype of it, to test the validity of the
analysis and design. The skills necessary in each aspect differ
vastly: project management skills, database knowledge, and
object-oriented principles are all necessary. In the context of a
developing country, where students enter tertiary education
underprepared and the digital divide is alive and well, students
need to be motivated to learn the necessary skills and to get an
opportunity to test them in a "live" but protected environment,
within the framework of a university. The purpose of this article
is to improve the learning experience in Systems Analysis and
Design by reviewing the underlying teaching principles used, the
teaching tools implemented, the observations made, and the
reflections that will influence future developments in Systems
Analysis and Design. Action research principles allow the focus to
be on a few problematic aspects during a particular semester.
Abstract: The objective of this research was to study the foot
anthropometry of children aged 7-12 years in the South of Thailand.
Thirty-three dimensions were measured on 305 male and 295 female
subjects across three age ranges (7-12 years old). The
instrumentation consisted of four types of anthropometer, a digital
vernier caliper, a digital height gauge, and a measuring tape. The
mean values (and standard deviations) of age, height, and weight of
the male subjects were 9.52 (±1.70) years, 137.80 (±11.55) cm, and
37.57 (±11.65) kg; those of the female subjects were 9.53 (±1.70)
years, 137.88 (±11.55) cm, and 34.90 (±11.57) kg, respectively.
Comparison of the 33 measured anthropometric dimensions between
male and female subjects showed significant sex differences in size
in almost all areas (p
Abstract: Nowadays, the yearly increase in the human population
results in increasing water usage and demand. Saen Saep canal is an
important canal in Bangkok. The main objective of this study is to
use an Artificial Neural Network (ANN) model to estimate the
Chemical Oxygen Demand (COD) from data collected at 11 sampling
sites. The data were obtained from the Department of Drainage and
Sewerage, Bangkok Metropolitan Administration, during 2007-2011.
Twelve water quality parameters, all of which affect the COD, are
used as the input to the models. The experimental results indicate
that the ANN model provides a high correlation coefficient
(R = 0.89).
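The reported R = 0.89 is a correlation between observed and ANN-estimated COD; a minimal sketch of that evaluation metric (Pearson's R), with invented values in place of the study's data:

```python
import numpy as np

def pearson_r(observed, predicted):
    # Pearson correlation coefficient between observations and model output.
    o = observed - observed.mean()
    p = predicted - predicted.mean()
    return float((o * p).sum() / np.sqrt((o ** 2).sum() * (p ** 2).sum()))

# Illustrative COD values only (the study's data are not reproduced here).
obs = np.array([12.0, 30.0, 45.0, 60.0, 52.0])
pred = np.array([15.0, 28.0, 40.0, 63.0, 50.0])
r = pearson_r(obs, pred)
```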
Abstract: Optical 3D measurement of objects is valuable in numerous
industrial applications, and in various cases the shape acquisition
of weakly textured objects is essential. Examples are repetition
parts made of plastic or ceramic, such as housing parts or ceramic
bottles, as well as agricultural products like tubers. These parts
are often conveyed in a wobbling way during automated optical
inspection, so conventional 3D shape acquisition methods like laser
scanning might fail. In this paper, a novel approach for acquiring
the 3D shape of weakly textured and moving objects is presented. To
facilitate such measurements, an active stereo vision system with
structured light is proposed. The system consists of multiple
camera pairs and auxiliary laser pattern generators; it performs
the shape acquisition within one shot and is beneficial for rapid
inspection tasks. An experimental setup, including hardware and
software, has been developed and implemented.
Abstract: The ability of the brain to organize information and generate the functional structures we use to act, think and communicate is a common and easily observable natural phenomenon. In object-oriented analysis, these structures are represented by objects. Objects have been extensively studied and documented, but the process that creates them is not understood. In this work, a new class of discrete, deterministic, dissipative, host-guest dynamical systems is introduced. The new systems have extraordinary self-organizing properties. They can host information representing other physical systems and generate the same functional structures as the brain does. A simple mathematical model is proposed. The new systems are easy to simulate by computer, and the measurements needed to confirm the assumptions are abundant and readily available. Experimental results presented here confirm the findings. Applications are many, but among the most immediate are object-oriented engineering, image and voice recognition, search engines, and neuroscience.
Abstract: Food safety is an important concern for holiday makers in
foreign and unfamiliar tourist destinations. In fact, risk from
food in these destinations influences tourist perception; it can
potentially affect physical health and lead to an inability to
pursue planned activities. The objective of this paper was to
compare foreign tourists' demographics, including gender, age, and
education level, with their level of perceived risk towards food
safety. A total of 222 foreign tourists staying at Khao San Road in
Bangkok were used as the sample. Independent-samples t-tests,
analysis of variance, and Least Significant Difference (LSD) post
hoc tests were utilized. The findings revealed that there were few
demographic differences in the level of perceived risk among the
foreign tourists. The post hoc test indicated a significant
difference between the older and the younger tourists, and between
the higher and lower levels of education. Ranking tourists'
perceived risk towards food safety unveiled some interesting
results. Tourists' perceived risk of food safety in established
restaurants can be ranked as i) cleanliness of dining utensils, ii)
sanitation of the food preparation area, and iii) cleanliness of
food seasoning and ingredients, whereas their perceived risk of
food safety in street food and drink can be ranked as i)
cleanliness of stalls and pushcarts, ii) cleanliness of the food
sold, and iii) personal hygiene of street food hawkers or vendors.
Abstract: The objective of this research was to find the diffusion properties of vehicles on the road by using the V-Sphere Code. The diffusion coefficient and the height of the wake were estimated with the LES option and the third-order MUSCL scheme. We evaluated the code using the changes in the moments of the Reynolds stress along the mean streamline. The results show that at the leading part of a bluff body the LES has some advantages over RANS, since the changes in the strain rates are larger at the leading part. We estimated that the diffusion coefficient computed from the Reynolds stress (non-dimensional) was about 0.96 times the mean velocity.
Abstract: This paper deals with stakeholders' decisions within energy-neutral urban redevelopment processes. The decisions of these stakeholders during the process will make or break energy-neutral ambitions. An extensive-form game theory model gave insight into the behavioral differences of stakeholders regarding energy-neutral ambitions and the effects of changing legislation. The results show that new legislation regarding spatial planning slightly influences the behavior of stakeholders. Active behavior by the municipality will still result in the best outcome. Nevertheless, the municipality becomes more powerful when acting passively and can make use of planning tools to provide governance towards energy-neutral urban redevelopment. Moreover, organizational support, recognizing the necessity of energy neutrality, keeping focused, and collaboration among stakeholders are crucial elements in achieving the objective of an energy-neutral urban (re)development.
Abstract: As the majority of faults are found in a few of a
system's modules, there is a need to identify the modules that are
affected more severely than others so that proper maintenance can
be done in time, especially for critical applications. Neural
networks have already been applied in software engineering to build
reliability growth models and to predict gross change or
reusability metrics. Neural networks are sophisticated non-linear
modeling techniques that are able to model complex functions; they
are used when the exact relationship between inputs and outputs is
not known, and a key feature is that they learn this relationship
through training. In the present work, various neural-network-based
techniques are explored and a comparative analysis is performed for
predicting the level of maintenance needed, by predicting the level
of severity of faults present in NASA's public domain defect
dataset. The different algorithms are compared on the basis of Mean
Absolute Error, Root Mean Square Error, and accuracy values. It is
concluded that the Generalized Regression Network is the best
algorithm for classifying software components into different levels
of severity of fault impact. The algorithm can be used to develop a
model for identifying modules that are heavily affected by faults.
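The three comparison metrics named above can be sketched directly; the severity labels below are invented examples, not NASA data:

```python
import math

def mae(actual, predicted):
    # Mean Absolute Error: average magnitude of prediction errors.
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def rmse(actual, predicted):
    # Root Mean Square Error: penalizes large errors more than MAE.
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

def accuracy(actual, predicted):
    # Fraction of exactly-matched severity classifications.
    return sum(a == p for a, p in zip(actual, predicted)) / len(actual)

# Hypothetical severity levels (1 = low ... 4 = critical).
actual = [1, 2, 3, 4, 2, 3]
predicted = [1, 2, 4, 4, 2, 2]
```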