Abstract: Rule discovery is an important technique for mining knowledge from large databases. The use of objective measures for discovering interesting rules leads to another data mining problem, albeit one of reduced complexity. Data mining researchers have therefore studied subjective measures of interestingness to reduce the volume of discovered rules and ultimately improve the overall efficiency of the KDD process.
In this paper we study the novelty of discovered rules as a subjective measure of interestingness. We propose a hybrid approach, based on both objective and subjective measures, that quantifies the novelty of discovered rules in terms of their deviation from known rules (knowledge). We analyze the types of deviation that can arise between two rules and categorize the discovered rules according to a user-specified threshold. We implement the proposed framework and experiment with several public datasets; the experimental results are promising.
Abstract: The fundamental motivation of this paper is how gaze estimation can be used effectively in games. In games, precise point-of-gaze estimation is not always essential when aiming at targets; the ability to move a cursor accurately onto a target is just as important. From a game-production point of view, expressing head movement and gaze movement separately can also be advantageous for conveying a sense of presence: a representative example is panning the background image with head movement while moving a cursor with gaze movement. The widely used technique for point-of-gaze (POG) estimation is based on the relative position between the center of the corneal reflections of infrared light sources and the center of the pupil. However, computing the pupil center requires relatively complicated image processing, so processing delay is a concern, since minimizing input latency is one of the most important requirements in games. In this paper, a method is proposed that estimates head movement using only the corneal reflections of two infrared light sources in different locations. Furthermore, a method to control a cursor using both gaze movement and head movement is proposed. The proposed methods are evaluated with game-like applications; performance similar to conventional methods is confirmed, and aiming control with lower computational cost and stress-free, intuitive operation is obtained.
Abstract: This paper proposes an efficient learning method for layered neural networks based on the selection of training data and on the input characteristics of output-layer units. Compared with more recent architectures such as pulse neural networks and quantum neural computation, the multilayer network remains widely used because of its simple structure. When the learning objects are complicated, however, problems such as unsuccessful learning or excessively long training times remain unsolved. Focusing on the input data during the learning stage, we conducted an experiment to identify the data that produce large errors and interfere with the learning process. Our method divides the learning process into several stages. In general, the inputs to an output-layer unit oscillate during learning on complicated problems. The multi-stage learning method proposed by the authors for function approximation problems classifies the training data in a phased manner according to their learnability prior to training the multilayer neural network, and we demonstrate the validity of this method. Specifically, this paper verifies by computer experiments that both learning accuracy and learning time are improved when backpropagation (BP) is used as the learning rule within the multi-stage method. Oscillatory phenomena of the learning curve play an important role in learning performance, and the authors discuss their occurrence mechanisms. Furthermore, by observing behavior during learning, the authors discuss why the errors of some data remain large even after learning.
Abstract: Web sites are rapidly becoming the preferred medium for daily tasks such as information search, company presentation, and shopping. At the same time, we live in a period in which visual appearance plays an increasingly important role in daily life. Despite designers' efforts to develop web sites that are both user-friendly and attractive, it is difficult to guarantee the outcome's aesthetic quality, since visual appearance is a matter of individual perception and opinion. In this study, we attempt to develop an automatic system for the aesthetic evaluation of web pages, the building blocks of web sites. Based on image processing techniques and artificial neural networks, the proposed method categorizes an input web page according to its visual appearance and aesthetic quality. The employed features are multiscale/multidirectional textural and perceptual color properties of the web pages, fed to a perceptron ANN trained as the evaluator. The method is tested on university web sites, and the results suggest that it performs well on web page aesthetic evaluation tasks, with around 90% correct categorization.
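As a concrete illustration of the classification stage, the sketch below trains a simple perceptron on two-dimensional feature vectors. The feature names (texture energy, color contrast), values, and labels are invented for illustration; they are not the paper's actual multiscale textural and color features.

```python
# Minimal perceptron sketch for two-class aesthetic categorization.
# Features and labels below are hypothetical, not the paper's dataset.

def train_perceptron(samples, labels, lr=0.1, epochs=50):
    """samples: list of feature vectors; labels: +1 / -1."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * activation <= 0:  # misclassified: move boundary
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

# Hypothetical (texture_energy, color_contrast) features per page
pages = [[0.9, 0.8], [0.8, 0.9], [0.2, 0.1], [0.1, 0.3]]
labels = [1, 1, -1, -1]  # +1 = aesthetically pleasing, -1 = not
w, b = train_perceptron(pages, labels)
print([predict(w, b, x) for x in pages])
```

In practice the input vector would hold the multiscale texture and color descriptors extracted from the page screenshot, but the update rule is the same.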
Abstract: Aldehydes, as secondary lipid oxidation products, are highly specific to the oxidative degradation of the particular polyunsaturated fatty acids present in foods. Gas chromatographic analysis of these volatile compounds has been widely used for monitoring the deterioration of food products. A static headspace gas chromatography method with flame ionization detection (SHS-GC-FID) was developed and applied to monitor the aldehydes present in processed foods such as bakery, meat, and confectionery products.
Five selected aldehydes were determined in samples without any sample preparation, except grinding for the bakery and meat products. The SHS-GC analysis separates propanal, pentanal, hexanal, heptanal, and octanal within 15 min. Aldehydes were quantified in fresh and stored samples; the obtained ranges were 1.62±0.05 – 9.95±0.05 mg/kg in crackers, 6.62±0.46 – 39.16±0.39 mg/kg in sausages, and 0.48±0.01 – 1.13±0.02 mg/kg in cocoa spread cream. From these results it can be concluded that the proposed method is suitable for different types of samples, and that the aldehyde content varies with the sample type and differs between fresh and stored samples of the same type.
Abstract: This paper describes a smart energy monitoring system that uses a wireless sensor network to monitor electrical usage in a smart house. The proposed system is composed of wireless plugs and an energy-control wallpad server. Each wireless plug integrates an AC power socket, a relay to switch the socket on and off, a Hall-effect sensor to sense the load current of the attached appliance, and a Kmote, a wireless communication interface based on TinyOS. We evaluated the wireless plug in a laboratory, and we analyzed and present energy consumption data collected from household electrical appliances over three months.
Abstract: To offer a large variety of products while maintaining low costs, high speed, and high quality in a mass customization product development environment, platform-based product development offers substantial benefits in many industries. This paper proposes a product configuration strategy based on a similarity measure, incorporating knowledge engineering principles such as a product information model, ontology engineering, and formal concept analysis.
Abstract: Nowadays, network threats cause enormous damage to the Internet community day by day. In this situation, more and more people try to protect their networks using traditional mechanisms such as firewalls and intrusion detection systems. Among security tools, the honeypot is particularly versatile: honeypots are systems meant to be attacked or interacted with in order to gather information about attackers, their motives, and their tools. In this paper, we describe the usefulness of low-interaction and high-interaction honeypots and compare the two. We then propose a hybrid honeypot architecture that combines low- and high-interaction honeypots to mitigate the drawbacks of each. In this architecture, the low-interaction honeypot is used as a traffic filter: activities such as port scanning can be detected effectively by the low-interaction honeypot and stopped there. Traffic that cannot be handled by the low-interaction honeypot is handed over to the high-interaction honeypot. In this case, the low-interaction honeypot acts as a proxy, whereas the high-interaction honeypot offers an optimal level of realism. To prevent the high-interaction honeypot from spreading infections, a containment environment (VMware) is used.
Abstract: This paper presents a comparative analysis of a new unsupervised PCA-based technique for steel plate texture segmentation aimed at defect detection. The proposed scheme, called Variance-Based Component Analysis (VBCA), employs PCA for feature extraction, applies a feature reduction algorithm based on the variance of the eigenpictures, and classifies pixels as defective or normal. While classic PCA relies on a clusterer such as k-means for pixel clustering, VBCA employs thresholding and some post-processing operations to label pixels as defective or normal. The experimental results show that VBCA is 12.46% more accurate and 78.85% faster than classic PCA.
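The thresholding idea can be sketched as follows: project patch features onto the highest-variance principal component and flag outlying projections as defective. The synthetic patch statistics below are invented for illustration; the paper's actual eigenpicture features and post-processing differ.

```python
import numpy as np

# Sketch of variance-based thresholding in the spirit of VBCA:
# PCA for the dominant direction, then an outlier threshold instead of
# a k-means clusterer. Data is synthetic, not real steel-plate texture.
rng = np.random.default_rng(0)
normal = rng.normal(0.0, 0.1, size=(200, 4))   # normal texture patches
defects = rng.normal(2.0, 0.1, size=(10, 4))   # hypothetical defect patches
patches = np.vstack([normal, defects])

centered = patches - patches.mean(axis=0)
cov = centered.T @ centered / len(patches)
eigvals, eigvecs = np.linalg.eigh(cov)         # eigenvalues ascending
top = eigvecs[:, -1]                           # highest-variance direction

scores = centered @ top                        # 1-D projection of each patch
flagged = np.abs(scores) > 2 * scores.std()    # simple outlier threshold
print(int(flagged.sum()))                      # patches labeled defective
```

Using the absolute projection makes the rule independent of the arbitrary sign of the eigenvector returned by `eigh`.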
Abstract: Switched-mode converters now play a significant role in modern society. Their operation is often crucial in the various electrical applications that affect everyday life. Therefore, the quality of the converters needs to be verified reliably. Recent studies have shown that a converter can be fully characterized by a set of frequency responses, which can be used efficiently to validate its proper operation. Consequently, several methods have been proposed to measure the frequency responses quickly and accurately, most often based on correlation techniques. These measurement methods, however, are highly sensitive to external errors and system nonlinearities. This fact has often been overlooked, and the necessary uncertainty analysis of the measured responses has been neglected. This paper presents a simple approach to analyzing the noise and nonlinearities in the frequency-response measurements of switched-mode converters. Coherence analysis is applied to form a confidence interval characterizing the noise and nonlinearities involved in the measurements. The presented method is verified by practical measurements of a high-frequency switched-mode converter.
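The statistic underlying such an analysis can be sketched as below. This is the magnitude-squared coherence computed by segment averaging, not the paper's specific confidence-interval construction; the signals are synthetic.

```python
import numpy as np

def msc(x, y, nseg=8):
    """Magnitude-squared coherence via segment-averaged periodograms.

    Values near 1 indicate a clean linear input/output relation;
    lower values reveal noise or nonlinear distortion at that frequency.
    """
    n = len(x) // nseg
    X = np.fft.rfft(x[:n * nseg].reshape(nseg, n), axis=1)
    Y = np.fft.rfft(y[:n * nseg].reshape(nseg, n), axis=1)
    Pxx = (np.abs(X) ** 2).mean(axis=0)        # averaged auto-spectrum of x
    Pyy = (np.abs(Y) ** 2).mean(axis=0)        # averaged auto-spectrum of y
    Pxy = (X * np.conj(Y)).mean(axis=0)        # averaged cross-spectrum
    return np.abs(Pxy) ** 2 / (Pxx * Pyy)

rng = np.random.default_rng(0)
u = rng.normal(size=4096)       # excitation signal
noise = rng.normal(size=4096)   # independent disturbance

print(msc(u, u).mean())         # identical signals: coherence == 1
print(msc(u, noise).mean())     # unrelated signals: coherence is low
```

Averaging over several segments is essential: with a single segment the estimate is identically 1 regardless of the signals.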
Abstract: Most systems deal with time-varying signals. Power efficiency can be achieved by adapting the system activity to the variations of the input signal. In this context, an adaptive-rate filtering technique based on level-crossing sampling is devised. It adapts the sampling frequency and the filter order by following the local variations of the input signal, thereby correlating the processing activity with the signal variations. The proposed technique requires interpolation; a drastic reduction in the interpolation error is achieved by exploiting symmetry during the interpolation process. The processing error of the proposed technique is calculated, and its computational complexity is deduced and compared to that of the classical approach. The results promise a significant gain in computational efficiency and hence in power consumption.
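The level-crossing sampling principle can be sketched as follows: a sample is produced only when the signal crosses one of a set of predefined amplitude levels, so the effective sampling rate follows the local signal activity. The test signal, levels, and linear interpolation of the crossing instant are illustrative choices, not the paper's exact scheme.

```python
import numpy as np

def level_crossing_sample(t, x, levels):
    """Return (time, level) pairs where the signal crosses any level."""
    samples = []
    for i in range(1, len(x)):
        for lev in levels:
            if (x[i - 1] - lev) * (x[i] - lev) < 0:  # sign change: crossed
                # linear interpolation for the crossing instant
                frac = (lev - x[i - 1]) / (x[i] - x[i - 1])
                samples.append((t[i - 1] + frac * (t[i] - t[i - 1]), lev))
    return samples

t = np.linspace(0, 1, 1000)
x = np.sin(2 * np.pi * 2 * t)        # 2 Hz test tone, two periods
levels = [-0.5, 0.0, 0.5]
s = level_crossing_sample(t, x, levels)
print(len(s))                        # samples produced by signal activity
```

A fast-varying signal generates many crossings (high sampling rate); a quiet signal generates almost none, which is the source of the power savings.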
Abstract: Manufacturing companies aspire to high delivery availability despite recurring disruptions. To ensure high delivery availability, safety stocks are required. However, safety stock ties up additional capital and merely compensates for disruptions instead of eliminating their causes. The intention is to increase the stability of production by configuring production planning and control systematically, so that the safety stock can be reduced. The largest proportion of inventory in manufacturing companies is caused by batch inventory, schedule deviations, and variability of demand rates. These causes of high inventory levels can be reduced by configuring production planning and control specifically, and hence the inventory level can be lowered. This is enabled by synchronizing the lot sizes, smoothing the demand, and optimizing order release, sequencing, and capacity control.
Abstract: Soft topological spaces are considered as mathematical tools for dealing with uncertainties, and a fuzzy topological space is a special case of the soft topological space. The purpose of this paper is to study soft topological spaces. We introduce some new concepts in soft topological spaces such as soft first-countable spaces, soft second-countable spaces and soft separable spaces, and some basic properties of these concepts are explored.
Abstract: Granular computing deals with the representation of information in the form of aggregates and with related methods for their transformation and analysis in problem solving. A granulation scheme based on clustering and rough set theory, focused on the structured conceptualization of information, is presented in this paper. Experiments with the proposed method on four labeled datasets show good results on the classification problem. The proposed granulation technique is semi-supervised, incorporating both global and local information during granulation. A tree structure is also proposed to represent the results of the attribute-oriented granulation.
Abstract: The main focus of this paper is human-induced forces, a load that is often decisive in the design of footbridges. Almost all existing force models for this type of load (defined either in the time or the frequency domain) are developed under the assumption of perfect periodicity of the force and are based on force measurements conducted on rigid (i.e., high-frequency) surfaces. To verify the conclusions of different authors, vertical pressure measurements during walking were performed using pressure gauges in various configurations. The obtained forces are analyzed using the Fourier transform. Design criteria and load models proposed by widely used standards and by other researchers are introduced and compared.
Abstract: In this paper, mathematical models for the permutation flow shop scheduling and job shop scheduling problems are proposed. The first is a mixed integer programming model; since the problem is NP-complete, this model can only be used for smaller instances for which an optimal solution can be computed. For large instances, another model is proposed that is suitable for solving the problem with stochastic heuristic methods. For the job shop scheduling problem, a mathematical model and its main representation schemes are presented.
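For the permutation representation suited to stochastic heuristics, the quantity a heuristic evaluates at every step is the makespan of a job permutation. A minimal sketch with illustrative processing times (not taken from the paper's instances):

```python
# Makespan of a permutation flow shop schedule: every job visits the
# machines in the same order, and jobs are processed in sequence `perm`.

def makespan(perm, p):
    """p[j][m] = processing time of job j on machine m."""
    machines = len(p[0])
    completion = [0.0] * machines      # completion time of last job per machine
    for j in perm:
        for m in range(machines):
            earlier = completion[m - 1] if m > 0 else 0.0
            # job j starts on machine m when both the machine is free
            # and the job has finished on the previous machine
            completion[m] = max(completion[m], earlier) + p[j][m]
    return completion[-1]

p = [[3, 2], [1, 4], [2, 2]]           # 3 jobs, 2 machines (illustrative)
print(makespan([0, 1, 2], p))          # 11
print(makespan([1, 0, 2], p))          # 9: a better job order
```

A stochastic heuristic (e.g. simulated annealing or a genetic algorithm) would search over permutations using this evaluation as its objective.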
Abstract: In this paper, a simple watermarking method for color images is proposed. The method embeds the watermark in the histograms of the HSV planes using visual cryptography watermarking. The method has been shown to be robust against various image processing operations, such as filtering, compression, and additive noise, and against various geometrical attacks, such as rotation, scaling, cropping, flipping, and shearing.
Abstract: Segmentation of ultrasound images is challenging because of interference from speckle noise and the fuzziness of boundaries. In this paper, a segmentation scheme using fuzzy c-means (FCM) clustering that incorporates both intensity and texture information is proposed to extract breast lesions from ultrasound images. First, the nonlinear structure tensor, which helps refine the edges detected from intensity, is used to extract speckle texture. Then, spatial FCM clustering is applied to the resulting image feature space for segmentation. In experiments with simulated and clinical ultrasound images, spatial FCM clustering with both intensity and texture information yields more accurate results than conventional FCM or spatial FCM without texture information.
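The core FCM updates (fuzzy memberships and cluster centers) can be sketched on synthetic one-dimensional intensities; this omits the spatial regularization and texture features of the paper's scheme, and the deterministic center initialization is a simplifying assumption.

```python
import numpy as np

def fcm(data, c=2, m=2.0, iters=30):
    """Minimal fuzzy c-means on data of shape (n_samples, n_features)."""
    # simple deterministic init: centers spread between data extremes
    centers = np.linspace(data.min(axis=0), data.max(axis=0), c)
    for _ in range(iters):
        d = np.linalg.norm(data[:, None, :] - centers[None], axis=2) + 1e-12
        u = d ** (-2.0 / (m - 1))          # membership ~ inverse distance
        u /= u.sum(axis=1, keepdims=True)  # memberships sum to 1 per sample
        um = u ** m
        centers = (um.T @ data) / um.sum(axis=0)[:, None]
    return u, centers

# Two well-separated synthetic intensity clusters (not ultrasound data)
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(0.2, 0.02, 50),
                       rng.normal(0.8, 0.02, 50)])[:, None]
u, centers = fcm(data)
labels = u.argmax(axis=1)                  # hard labels from fuzzy memberships
```

In the paper's scheme, each pixel's feature vector would additionally contain the structure-tensor texture channels, and the membership update would include a spatial neighborhood term.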
Abstract: The sum of absolute differences (SAD) algorithm is heavily used in motion estimation, a computationally demanding step in motion picture encoding. To enhance the performance of motion picture encoding on a VLIW processor, an efficient implementation of the SAD algorithm on that processor is essential. The SAD algorithm is programmed as a nested loop with a conditional branch. On VLIW processors, loops are usually optimized by software pipelining, but research on the optimal scheduling of software pipelining for nested loops, especially nested loops with conditional branches, is rare. In this paper, we propose an optimal scheduling and implementation of the SAD algorithm with a conditional branch on a VLIW DSP. The proposed scheduling first transforms the nested loop with a conditional branch into a single loop with a conditional branch, considering both full utilization of the ILP capability of the VLIW processor and early escape from the loop. It then applies a modulo scheduling technique developed for single loops. Based on this scheduling strategy, an optimal implementation of the SAD algorithm on the TMS320C67x, a VLIW DSP, is presented. Experiments on a TMS320C6713 DSK show that an H.263 encoder with the proposed SAD implementation outperforms H.263 encoders with other SAD implementations, and that the code size of the optimal SAD implementation is small enough for embedded environments.
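The nested-loop-with-conditional-branch structure of SAD, including the early escape the scheduling exploits, can be sketched in plain code. This is a reference model of the computation, not the VLIW assembly scheduling itself; block sizes and values are illustrative.

```python
# Reference SAD between a current block and a candidate block, with the
# early-escape branch: abort once the partial sum can no longer beat
# the best SAD found so far in the motion search.

def sad(cur, ref, best_so_far):
    total = 0
    for row_c, row_r in zip(cur, ref):       # outer loop: block rows
        for a, b in zip(row_c, row_r):       # inner loop: pixels in a row
            total += abs(a - b)
        if total >= best_so_far:             # conditional branch in the loop
            return best_so_far               # early escape: candidate loses
    return total

cur = [[10] * 4] * 4                         # 4x4 current block
ref = [[12] * 4] * 4                         # 4x4 candidate block
print(sad(cur, ref, best_so_far=1000))       # full sum: 4*4*|10-12| = 32
print(sad(cur, ref, best_so_far=5))          # escapes after the first row
```

The paper's transformation collapses the two loops into one so that modulo scheduling can software-pipeline the body while still realizing this early escape.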
Abstract: This paper proposes a Web service and service-oriented architecture (SOA) for the computer-adaptive testing (CAT) process in e-learning systems. The proposed architecture is developed to solve an interoperability problem of the CAT process by using Web services. The proposed SOA and Web service define all the services needed for the interactions between systems in order to deliver items and essential data from the Web service to the CAT Web-based application. These services are implemented in an XML-based architecture, providing platform independence and interoperability between the Web service and CAT Web-based applications.