Abstract: The color histogram is considered the oldest method used
by CBIR systems for indexing images. However, global histograms do
not capture spatial information, which is why later techniques have
attempted to overcome this limitation by introducing segmentation as
a preprocessing step. Local histograms rely on weak segmentation,
while other methods, such as the Color Coherence Vector (CCV), are
based on strong segmentation. Indexing with local histograms
consists of splitting the image into N overlapping blocks, or
sub-regions, and then computing the histogram of each block.
Measuring the dissimilarity between two images is consequently
reduced to computing the distances between the N local histograms of
the two images, resulting in N*N values; generally, the lowest value
is used to rank images, meaning that the lowest value designates
which sub-region is used to index the images of the collection being
queried. In this paper, we put the local histogram indexing method
under the spotlight in order to compare its results against those
given by the global histogram. We also address another noteworthy
issue that arises when relying on local histograms, namely which of
the N*N values to trust when comparing images, in other words, on
which of the N*N sub-region pairs to base the indexing. Based on the
results achieved here, relying on local histograms, which imposes
extra overhead on the system by adding a segmentation preprocessing
step, does not necessarily produce better results. In addition, we
propose some ideas for selecting the local histogram used to encode
the image, rather than simply relying on the local histogram having
the lowest distance to the query histograms.
Abstract: Frequent pattern mining is the process of finding a
pattern (a set of items, subsequences, substructures, etc.) that
occurs frequently in a data set. It was proposed in the context of
frequent itemsets and association rule mining. Frequent pattern
mining is used to find inherent regularities in data, such as which
products are often purchased together. Its applications include
basket data analysis, cross-marketing, catalog design, sales
campaign analysis, Web log (click stream) analysis, and DNA sequence
analysis. However, one of the bottlenecks of frequent itemset mining
is that as the data grow, the time and resources required to mine
them increase at an exponential rate. In this investigation a new
algorithm is proposed which can be used as a pre-processor for
frequent itemset mining. FASTER (FeAture SelecTion using Entropy and
Rough sets) is a hybrid pre-processor algorithm which uses entropy
and rough sets to carry out record reduction and feature (attribute)
selection, respectively. FASTER can produce a speedup of 3.1 times
for frequent itemset mining compared to the original algorithm while
maintaining an accuracy of 71%.
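The entropy half of such a pre-processor can be sketched as an
entropy-based feature-selection pass; this is an illustrative sketch
under our own assumptions (a fixed threshold, records as lists of
categorical values), not the authors' FASTER implementation:

```python
import math
from collections import Counter

def entropy(values):
    """Shannon entropy (bits) of a list of categorical values."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def select_features(records, threshold=0.5):
    """Keep only attribute columns whose entropy exceeds the
    threshold, i.e. drop near-constant attributes that carry little
    information for subsequent itemset mining."""
    n_attrs = len(records[0])
    keep = [j for j in range(n_attrs)
            if entropy([r[j] for r in records]) > threshold]
    return keep, [[r[j] for j in keep] for r in records]
```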
Abstract: Frequent, continuous speech training has proven to be
a necessary part of a successful speech therapy process, but
constraints on travel time and work commitments become key
obstacles, especially for individuals living in remote areas or for
dependent children with working parents. In order to ameliorate
speech difficulties with ample guidance from speech therapists, a
website has been developed that supports speech therapy and training
for people with articulation disorders in the standard Thai language.
This web-based program has the ability to record speech training
exercises for each speech trainee. The records will be stored in a
database for the speech therapist to investigate, evaluate, compare
and keep track of all trainees’ progress in detail. Speech trainees can
request live discussions via video conference call when needed.
Communication through this web-based program facilitates and
reduces training time in comparison to walk-in training or
appointments. This type of training also allows people with
articulation disorders to practice speech lessons whenever and
wherever is convenient for them, which can lead to a more regular
training process.
Abstract: One of the most important tasks in urban remote
sensing is the detection of impervious surfaces (IS), such as roofs and
roads. However, detecting IS in heterogeneous areas remains one of
the most challenging tasks. In this study, an object-based approach
for detecting concrete roofs is proposed, and a new rule-based
classification was developed to detect concrete roof tiles. The
proposed rule-based classification was applied to WorldView-2
imagery, and the results showed that the proposed rules have good
potential for identifying concrete roof material from WorldView-2
images, with 85% accuracy.
Abstract: The importance of formal specification in the software
life cycle is hardly a secret to anyone. Formal specifications use
mathematical notation to describe the properties of an information
system precisely, without unduly constraining the way in which these
properties are achieved. Producing a correct, high-quality software
specification is not an easy task. This study concerns how a group
of rectifiers can communicate with each other and work together to
prepare and produce a correct formal software specification. WBCS
has been implemented based mainly on the proposed supported
cooperative work model and on a survey of existing Web-based
collaborative writing tools. This paper aims to assess the
feasibility of executing the Web-based collaboration process using
WBCS. The purpose of the test is to evaluate the system as a whole
for functionality and fitness for use, based on the evaluation test
plan.
Abstract: Sudoku is a logic-based combinatorial puzzle game that
people of all ages enjoy playing. The challenging and addictive
nature of this game has made it ubiquitous. Most magazines,
newspapers, puzzle books, etc. publish many Sudoku puzzles every
day. These puzzles often come in different levels of difficulty so
that all players, from beginner to expert, can enjoy the game.
Generating puzzles with different levels of difficulty is a major
concern of Sudoku designers. Several works in the literature propose
ways of generating puzzles with a desired level of difficulty. In
this paper, we propose a method based on constraint satisfaction
problems to evaluate the difficulty of Sudoku puzzles. We then
propose a hill climbing method to generate puzzles with different
levels of difficulty. Whereas other methods are usually capable of
generating puzzles with only a few difficulty levels, our method can
be used to generate puzzles with an arbitrary number of difficulty
levels. We test our method by generating puzzles of varying
difficulty, having a group of 15 people solve all the puzzles, and
recording the time they spend on each puzzle.
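The hill-climbing idea can be sketched generically: mutate a
candidate puzzle and keep the mutation only if it brings the
measured difficulty at least as close to the target. The difficulty
and mutation functions below are toy stand-ins (blank-cell count and
cell toggling), not the paper's CSP-based evaluator:

```python
import random

def hill_climb(initial, difficulty, target, mutate, steps=1000, seed=0):
    """Hill climbing toward a target difficulty score: accept a
    random mutation only if it moves the score at least as close to
    the target."""
    rng = random.Random(seed)
    state = initial
    best_gap = abs(difficulty(state) - target)
    for _ in range(steps):
        candidate = mutate(state, rng)
        gap = abs(difficulty(candidate) - target)
        if gap <= best_gap:
            state, best_gap = candidate, gap
        if best_gap == 0:
            break
    return state

def toy_difficulty(grid):
    """Toy stand-in for the CSP-based evaluator: count blank (0) cells."""
    return sum(row.count(0) for row in grid)

def toy_mutate(grid, rng):
    """Toggle one random cell between 'given' (1) and 'blank' (0)."""
    new = [row[:] for row in grid]
    i, j = rng.randrange(9), rng.randrange(9)
    new[i][j] = 0 if new[i][j] else 1
    return new
```

Because the target is a number rather than a discrete label, the same
loop works for an arbitrary number of difficulty levels.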
Abstract: As various portable devices have been launched in recent
years, conducting smart business with them has become common. Since
smart business allows company-internal resources to be used from an
external remote location, user authentication that can identify
authentic users is an important factor. The most common form of user
authentication uses a user ID and password. With ID and password
authentication, users must view and enter their authentication
information themselves. In such an authentication system, which
depends on the user's vision, there is a threat of password leaks
through snooping while the user enters his or her authentication
information. This study designed and produced a user authentication
module that uses an actuator to respond to this snooping threat.
Abstract: The growth of wireless devices affects the availability
of the limited frequency spectrum, as spectrum bands are a natural
resource that cannot be expanded. Meanwhile, licensed frequencies
are idle most of the time. Cognitive radio is one solution to these
problems. Cognitive radio is a promising technology that allows
unlicensed users, known as secondary users (SUs), to access licensed
bands without interfering with licensed users, or primary users
(PUs). As cloud computing has become popular in recent years,
cognitive radio networks (CRNs) can be integrated with cloud
platforms. One of the important issues in CRNs is security. It is a
problem because CRNs use radio frequencies as the transmission
medium and therefore share the same issues as other wireless
communication systems. Another critical issue in CRNs is
performance. Security has an adverse effect on performance, and
there are trade-offs between them. The goal of this paper is to
investigate the trade-off between security and performance in CRNs
with supporting cloud platforms. Furthermore, queueing network
models with preemptive resume and preemptive repeat identical
priority are applied in this project to measure the impact of
security on performance in CRNs with or without a cloud platform.
The generalized exponential (GE) distribution is used to reflect the
bursty inter-arrival and service times at the servers. The results
show that the best performance is obtained when security is disabled
and the cloud platform is enabled.
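For reference, a GE-type stream can be viewed as a mixture of zero
(bulk-arrival) gaps and exponential gaps. A sampling sketch under
the standard parameterization by mean 1/rate and squared coefficient
of variation (SCV >= 1) is shown below; parameter names are our own:

```python
import random

def ge_sample(rate, scv, rng):
    """One GE-distributed inter-arrival time with mean 1/rate and
    squared coefficient of variation scv: with probability
    1 - 2/(scv+1) the gap is zero (a bulk arrival), otherwise it is
    exponential with rate 2*rate/(scv+1)."""
    p = 2.0 / (scv + 1.0)      # probability of a non-zero gap
    if rng.random() > p:
        return 0.0             # bulk arrival: zero inter-arrival time
    return rng.expovariate(rate * p)
```

With scv = 1 this reduces to a plain exponential stream; larger scv
values produce the burstiness the abstract refers to.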
Abstract: In this paper, an edge-strength guided multiscale
retinex (EGMSR) approach is proposed for color image contrast
enhancement. In EGMSR, the pixel-dependent weight associated with
each pixel in the single-scale retinex output image is computed
according to the edge strength around that pixel, in order to
prevent over-enhancement of the noise contained in smooth
dark/bright regions. Further, by fusing together the enhanced
results of EGMSR and adaptive multiscale retinex (AMSR), we can
obtain a natural fused image with high contrast and proper tonal
rendition. Experimental results on several low-contrast images show
that the proposed approach can produce natural and appealing
enhanced images.
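A minimal sketch of the pixel-dependent weighting idea follows,
using our own toy gradient measure and mapping rather than the exact
EGMSR formulation: strong edges get weights near one, smooth regions
near zero, so noise in smooth areas is not amplified:

```python
def edge_strength(img, i, j):
    """Gradient magnitude (central differences, clamped at borders)
    as a proxy for the edge strength around pixel (i, j)."""
    gx = img[i][min(j + 1, len(img[0]) - 1)] - img[i][max(j - 1, 0)]
    gy = img[min(i + 1, len(img) - 1)][j] - img[max(i - 1, 0)][j]
    return (gx * gx + gy * gy) ** 0.5

def edge_weights(img, k=0.1):
    """Pixel-dependent weights: near 1 on strong edges, near 0 in
    smooth regions, damping noise amplification there."""
    return [[1.0 - 1.0 / (1.0 + k * edge_strength(img, i, j))
             for j in range(len(img[0]))] for i in range(len(img))]
```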
Abstract: Consumer-to-Consumer (C2C) e-commerce has been
growing at a very high speed in recent years. Since identical or
nearly identical products compete with one another through keyword
search in C2C e-commerce, some sellers describe their products with
spam keywords that are popular but unrelated to their products.
Although such products get more chances to be retrieved and selected
by consumers than those without spam keywords, the spam keywords
mislead consumers and waste their time. This problem has been
reported in many commercial services such as eBay and Taobao, but
there has been little research on solving it. As a solution, this
paper proposes a method to classify whether the keywords of a
product are spam or not. The proposed method assumes that a keyword
for a given product is more reliable if the keyword is observed
commonly in the specifications of products that are the same as, or
of the same kind as, the given product. This is because the
hierarchical category of a product is, in general, determined
precisely by the seller of the product, and so is its specification.
Since higher layers of the hierarchical category represent more
general kinds of products, a reliability degree is determined
separately for each layer. Hence, reliability degrees from the
different layers of a hierarchical category become features for
keywords, and they are used together with features derived only from
specifications to classify the keywords. Support Vector Machines are
adopted as the basic classifier using these features, since they are
powerful and widely used in many classification tasks. In the
experiments, the proposed method is evaluated on a gold-standard
dataset from Yi-han-wang, a Chinese C2C e-commerce site, and is
compared with a baseline method that does not consider the
hierarchical category. The experimental results show that the
proposed method outperforms the baseline in F1-measure, which
demonstrates that spam keywords are effectively identified using the
hierarchical category in C2C e-commerce.
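The per-layer reliability features can be sketched as follows, under
a hypothetical data layout in which each product carries a category
path and a set of specification terms (this is an illustration of
the idea, not the authors' feature extractor):

```python
def keyword_reliability(keyword, products, category_path, layer):
    """Fraction of products sharing the first `layer` levels of the
    category path whose specification mentions the keyword."""
    same = [p for p in products
            if p["category"][:layer] == category_path[:layer]]
    if not same:
        return 0.0
    hits = sum(1 for p in same if keyword in p["spec"])
    return hits / len(same)

def layer_features(keyword, product, products, depth):
    """One reliability value per category layer; deeper layers
    restrict the comparison to more specific kinds of products."""
    return [keyword_reliability(keyword, products,
                                product["category"], layer)
            for layer in range(1, depth + 1)]
```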
Abstract: The bureaucracy reform program drives the Indonesian
government to change its management in order to enhance
organizational performance. Information technology has become one of
the strategic areas that organizations try to improve. A knowledge
management system is an information system that supports the
implementation of knowledge management in government; it is
categorized under the people perspective because it depends strongly
on human interaction and participation. A strategic plan for
developing a knowledge management system can be determined using
several information system strategy methods. This research was
conducted to identify the types of strategic methods for information
systems, the stages of activity of each method, and their strengths
and weaknesses. A literature review was used to identify and
classify the strategic methods, differentiate the method types, and
categorize their common activities, strengths, and weaknesses. As a
result, six strategic information system methods are identified and
compared; the Balanced Scorecard and Risk Analysis are found to be
the most commonly used strategic methods and to have the greatest
strengths.
Abstract: Recently, GPS data have been used in many studies to
automatically reconstruct travel patterns for trip surveys. The aim
is to minimize the use of questionnaire surveys and travel diaries
so as to reduce their negative effects. In this paper, data acquired
from the GPS receiver and accelerometer embedded in smartphones are
utilized to predict the mode of transportation used by the phone
carrier. For prediction, Support Vector Machines (SVM) and Adaptive
Boosting (AdaBoost) are employed. Moreover, a method to improve the
prediction results of these algorithms is also proposed. The results
suggest that the prediction accuracy of AdaBoost after this
improvement is better than that of the other approaches.
Abstract: In remote sensing, shadows cause problems in many
applications, such as change detection and classification. Shadows
are cast by elevated objects and can directly affect the accuracy of
extracted information. For these reasons, it is very important to
detect shadows, particularly in urban high-spatial-resolution
imagery, where they pose a significant problem. This paper focuses
on automatic shadow detection based on a new spectral index for
multispectral imagery, the Shadow Detection Index (SDI). The new
spectral index was tested on different areas of WorldView-2 images,
and the results demonstrated that it can extract shadows effectively
and automatically, with an accuracy of 94%. Furthermore, the new
shadow detection index improved road extraction from 82% to 93%.
Abstract: The need to extract R&D keywords from issues and use
them to retrieve R&D information is increasing rapidly. However, it
is difficult to identify related issues or to distinguish between
them. Although the similarity between issues cannot be measured
directly, with an R&D lexicon, issues that share the same R&D
keywords can be identified. In detail, the R&D keywords associated
with a particular issue imply the key technology elements needed to
solve that issue.
Furthermore, the relationships among issues that share the same
R&D keywords can be shown more systematically by clustering them
according to keywords. Thus, sharing R&D results and reusing R&D
technology can be facilitated. Indirectly, redundant investment in
R&D can be reduced, as the relevant R&D information can be shared
among corresponding issues and the reusability of related R&D
improved. Therefore, a methodology for clustering issues from the
perspective of common R&D keywords is proposed to satisfy these
demands.
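The clustering step can be sketched as connected components over the
shared-keyword relation; this is a simplification of the proposed
methodology, using union-find over a hypothetical issue-to-keywords
mapping:

```python
def cluster_by_keywords(issue_keywords):
    """issue_keywords: dict mapping issue -> set of R&D keywords.
    Returns clusters of issues connected by shared keywords."""
    parent = {i: i for i in issue_keywords}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    first_issue = {}  # keyword -> first issue seen with it
    for issue, kws in issue_keywords.items():
        for kw in kws:
            if kw in first_issue:
                union(issue, first_issue[kw])
            else:
                first_issue[kw] = issue

    clusters = {}
    for issue in issue_keywords:
        clusters.setdefault(find(issue), set()).add(issue)
    return list(clusters.values())
```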
Abstract: In this paper, we consider a nonlinear feedback control
called augmented automatic choosing control (AACC) for nonlinear
systems with constrained input, using weighted-gradient-optimization
automatic choosing functions. The constant term that arises from the
linearization of a given nonlinear system is treated as a
coefficient of a stable zero dynamics. The parameters of the control
are suboptimally selected by maximizing the stable region in the
sense of Lyapunov with the aid of a genetic algorithm. This approach
is applied to a field excitation control problem of a power system
to demonstrate the effectiveness of the AACC. Simulation results
show that the new controller can improve performance remarkably
well.
Abstract: Given a large sparse signal, the goal is to reconstruct
the signal precisely and accurately from as few measurements as
possible. Although this seems possible in theory, the difficulty
lies in building an algorithm that achieves accurate and efficient
reconstruction. This paper proposes a new method for reconstructing
sparse signals, Least Support Orthogonal Matching Pursuit (LS-OMP),
and merges it with the theory of Partially Known Support (PKS) to
obtain a second method, Partially Known Least Support Orthogonal
Matching Pursuit (PKLS-OMP).
The new methods rely on a greedy algorithm to compute the
support, whose cost depends on the number of iterations. To speed
this up, PKLS-OMP incorporates partial knowledge of the support into
its algorithm. The methods recover the original signal efficiently,
simply, and accurately provided the sampling matrix satisfies the
Restricted Isometry Property (RIP).
Simulation results also show that they outperform many algorithms,
especially for compressible signals.
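An illustrative pure-Python sketch of the underlying OMP greedy loop
follows, with an optional known-support seed standing in for the
partially-known-support idea; this is not the authors' exact
PKLS-OMP, and it assumes k > len(known_support):

```python
def transpose(A):
    """Transpose of a list-of-rows matrix."""
    return [list(col) for col in zip(*A)]

def solve(M, b):
    """Gaussian elimination with partial pivoting, for a small
    dense linear system M x = b."""
    n = len(M)
    aug = [row[:] + [b[i]] for i, row in enumerate(M)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(aug[r][c]))
        aug[c], aug[p] = aug[p], aug[c]
        for r in range(c + 1, n):
            f = aug[r][c] / aug[c][c]
            for j in range(c, n + 1):
                aug[r][j] -= f * aug[c][j]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(aug[r][j] * x[j] for j in range(r + 1, n))
        x[r] = (aug[r][n] - s) / aug[r][r]
    return x

def omp(A, y, k, known_support=()):
    """Greedy OMP: repeatedly pick the column most correlated with
    the residual, then least-squares re-fit on the chosen support.
    A non-empty known_support seeds the support (the partially-
    known-support idea)."""
    m, n = len(A), len(A[0])
    support = list(known_support)
    residual = y[:]
    coef = []
    while len(support) < k:
        corr = [abs(sum(A[i][j] * residual[i] for i in range(m)))
                for j in range(n)]
        for j in support:          # never re-pick a chosen column
            corr[j] = -1.0
        support.append(max(range(n), key=lambda j: corr[j]))
        # Least squares on the support: solve (As^T As) c = As^T y.
        As = [[A[i][j] for j in support] for i in range(m)]
        Ast = transpose(As)
        AtA = [[sum(a * b for a, b in zip(r1, r2)) for r2 in Ast]
               for r1 in Ast]
        Aty = [sum(Ast[r][i] * y[i] for i in range(m))
               for r in range(len(support))]
        coef = solve(AtA, Aty)
        residual = [y[i] - sum(As[i][r] * coef[r]
                               for r in range(len(support)))
                    for i in range(m)]
    x = [0.0] * n
    for r, j in enumerate(support):
        x[j] = coef[r]
    return x
```

Exact recovery of a k-sparse signal by this loop is what the RIP
condition mentioned above guarantees.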
Abstract: Information about node locations is an important
criterion for many applications in wireless sensor networks. In
hop-based range-free localization methods, anchors broadcast
localization messages containing a hop count value to the whole
network. Each node receives these messages, calculates its distance
to each anchor in hops, and then approximates its own position.
However, the estimated distances can introduce large errors and
affect localization precision. To address this problem, this paper
proposes an algorithm in which each unknown node takes the nearest
anchor as a reference and selects the two other most accurate
anchors to compute its estimated location. Compared with the DV-Hop
algorithm, experimental results show that the proposed algorithm has
a lower average localization error and is more effective.
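The hop-size correction at the heart of DV-Hop-style methods can be
sketched as follows, under a hypothetical data layout in which
`hops` maps ordered anchor pairs to inter-anchor hop counts:

```python
import math

def avg_hop_size(anchor, anchors, hops):
    """DV-Hop average hop size at one anchor: total straight-line
    distance to the other anchors divided by total hops to them."""
    others = [a for a in anchors if a != anchor]
    dist = sum(math.dist(anchors[anchor], anchors[a]) for a in others)
    nhops = sum(hops[(anchor, a)] for a in others)
    return dist / nhops

def estimated_distance(anchor, anchors, hops, node_hops):
    """Estimated distance from an unknown node to an anchor: the
    node's hop count times that anchor's average hop size."""
    return node_hops[anchor] * avg_hop_size(anchor, anchors, hops)
```

It is these per-anchor estimates, computed from hop counts alone,
that accumulate the errors the proposed anchor-selection rule aims
to reduce.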