Abstract: CEMTool is a command-style design and analysis package for
scientific and technological algorithms, built on a matrix-based
computation language. In this paper, we present new 2D & 3D
finite element method (FEM) packages for CEMTool. We discuss
the detailed structures and the important features of the pre-processor,
solver, and post-processor of the CEMTool 2D & 3D FEM packages. In
contrast to the existing MATLAB PDE Toolbox, our proposed FEM
packages can deal with combinations of reserved words. We can also
control the mesh in a very effective way. With the introduction
of a new mesh generation algorithm and a fast solving technique, our
FEM packages can guarantee shorter computation times than the
MATLAB PDE Toolbox. Consequently, with our new FEM packages,
we can overcome some disadvantages and limitations of the existing
MATLAB PDE Toolbox.
Abstract: In this paper we describe the recognition process of Greek compound words using the PC-KIMMO software. We try to show certain limitations of the system with respect to the principles of compound formation in Greek. Moreover, we discuss the computational processing of phenomena such as stress and syllabification which are indispensable for the analysis of such constructions and we try to propose linguistically-acceptable solutions within the particular system.
Abstract: This paper presents a software quality support tool, a
Java source code evaluator and code profiler based on
computational intelligence techniques. It is a Java prototype
developed by the AI Group [1] of the Research Laboratories at
Universidad de Palermo: an Intelligent Java Analyzer (in Spanish:
Analizador Java Inteligente, AJI). It represents a new approach to
evaluating and identifying inaccurate source code usage and,
transitively, the software product itself.
The aim of this project is to provide the software development
industry with a new tool to increase software quality by extending
the value of source code metrics through computational intelligence.
Abstract: The simultaneous determination of multiple components, phenol, resorcinol, and catechol, with a chemometric technique, a PC-ranking artificial neural network (PCranking-ANN) algorithm, is reported in this study. Based on the data correlation coefficient method, 3 representative PCs are selected from the scores of the original UV spectral data (35 PCs) as the input patterns for the ANN to build a neural network model. The results were obtained after 8000 training iterations. The RMSEP values for phenol, resorcinol and catechol with PCranking-ANN were 0.6680, 0.0766 and 0.1033, respectively. The calibration ranges were 0.50-21.0, 0.50-15.1 and 0.50-20.0 μg ml-1 for phenol, resorcinol and catechol, respectively. The proposed method was successfully applied to the determination of phenol, resorcinol and catechol in synthetic and water samples.
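The PC-ranking step of the abstract above, selecting representative principal components by their correlation with the measured concentrations, can be sketched as follows. The score data and the two-PC cutoff are toy assumptions, and `pearson` and `rank_pcs` are hypothetical helper names, not the authors' code.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def rank_pcs(pc_scores, target, k=3):
    """Rank PC score vectors by |correlation| with the target
    concentrations and return the indices of the top k."""
    ranked = sorted(range(len(pc_scores)),
                    key=lambda i: abs(pearson(pc_scores[i], target)),
                    reverse=True)
    return ranked[:k]

# Toy score vectors for three PCs over four calibration samples.
target = [1.0, 2.0, 3.0, 4.0]              # known concentrations
scores = [[1.1, 2.0, 2.9, 4.2],            # PC 0: tracks the target
          [0.3, -0.1, 0.2, 0.0],           # PC 1: essentially noise
          [-1.0, -2.1, -2.8, -4.1]]        # PC 2: anti-correlated
selected = rank_pcs(scores, target, k=2)   # indices of the 2 best PCs
```

Ranking by the absolute correlation keeps both strongly correlated and strongly anti-correlated components, since either carries predictive information for the network.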
Abstract: Modern spatial database management systems require a unique Spatial Access Method (SAM) in order to solve complex spatial queries efficiently, and the spatial data structure takes a prominent place in the SAM. An inadequate data structure leads to poor algorithmic choices and a deficient understanding of algorithm behavior on the spatial database. A key step in developing a better semantic spatial object data structure is to quantify the performance effects of the semantic and outlier detections that are not reflected in previous tree structures (the R-Tree and its variants). This paper explores a novel SSRO-Tree, a SAM built on the topo-semantic approach. The paper shows how to identify and handle semantic spatial objects together with outlier objects during page overflow/underflow, using gain/loss metrics. We introduce a new SSRO-Tree algorithm which achieves better performance in practice on selection queries than the R*-Tree and RO-Tree algorithms.
Abstract: Binary phase-only filter digital watermarking
embeds the phase information of the discrete Fourier transform of the
image into the corresponding magnitudes for better image authentication.
This paper proposes an approach that implements watermark
embedding by quantizing the magnitudes, and discusses how to
regulate the quantization steps based on the frequencies of the magnitude
coefficients carrying the embedded watermark and how to embed the
watermark with low-frequency quantization. The theoretical analysis and
simulation results show that the algorithm flexibility, security, watermark
imperceptibility, and detection performance of binary phase-only
filter digital watermarking can be effectively improved with quantization-based
watermark embedding, and that the robustness against JPEG
compression is also increased to some extent.
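The magnitude-quantization embedding described above can be illustrated with a generic quantization index modulation (QIM) sketch. The two-lattice offset scheme, the step size, and the helper names are our own illustrative assumptions, not the paper's exact quantizer.

```python
def embed_bit(magnitude, bit, step):
    """Quantize a DFT magnitude onto one of two interleaved lattices:
    multiples of step carry bit 0, multiples shifted by step/2 carry bit 1."""
    q = round((magnitude - bit * step / 2.0) / step)
    return q * step + bit * step / 2.0

def detect_bit(magnitude, step):
    """Decide which lattice the received magnitude lies closest to."""
    d0 = abs(magnitude - embed_bit(magnitude, 0, step))
    d1 = abs(magnitude - embed_bit(magnitude, 1, step))
    return 0 if d0 <= d1 else 1

step = 8.0                      # larger step: more robust, less imperceptible
for bit in (0, 1):
    watermarked = embed_bit(37.3, bit, step)
    noisy = watermarked + 1.5   # any distortion below step/4 is survivable
    assert detect_bit(noisy, step) == bit
```

The step size realizes exactly the trade-off the abstract regulates: a larger quantization step widens the guaranteed detection margin (step/4) against distortions such as JPEG compression, at the price of a larger embedding change.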
Abstract: Segmentation of a color image composed of different
kinds of regions can be a hard problem, namely the computation of
exact texture fields and the decision on the optimum number of
segmentation areas when the image contains similar and/or
non-stationary texture fields. A novel neighborhood-based segmentation
approach is proposed. A genetic algorithm is used in the proposed
segment-pass optimization process; in this pass, an energy function
defined on Markov Random Fields is minimized. In
this paper we use an adaptive threshold estimation method for image
thresholding in the wavelet domain based on generalized
Gaussian distribution (GGD) modeling of the subband coefficients. This
method, called NormalShrink, is computationally more efficient and
adaptive because the parameters required for estimating the threshold
depend on the subband data energy used in the pre-stage of
segmentation. A quadtree is employed to implement the multiresolution
framework, which enables the use of different strategies at
different resolution levels; hence, the computation can be
accelerated. The experimental results obtained with the proposed
segmentation approach are very encouraging.
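The NormalShrink threshold mentioned above is commonly given as T = β·σ̂²/σ̂_y with β = sqrt(ln(L_k/J)), where σ̂ is the noise estimate, σ̂_y the subband standard deviation, L_k the subband size, and J the number of decomposition levels. The sketch below applies it with soft thresholding to a toy subband; the coefficient values and the noise estimate are illustrative assumptions (σ̂ is normally estimated from the finest diagonal subband as median(|c|)/0.6745).

```python
import math

def normal_shrink_threshold(subband, sigma_noise, levels):
    """T = beta * sigma_n^2 / sigma_y with beta = sqrt(ln(L_k / J))."""
    L = len(subband)
    beta = math.sqrt(math.log(L / levels))
    sigma_y = math.sqrt(sum(c * c for c in subband) / L)
    return beta * sigma_noise ** 2 / max(sigma_y, 1e-12)

def soft_threshold(coeffs, T):
    """Shrink every coefficient toward zero by T (soft thresholding)."""
    return [math.copysign(max(abs(c) - T, 0.0), c) for c in coeffs]

# Toy detail subband: two strong edge coefficients among weak noise.
subband = [0.2, -0.3, 5.0, 0.1, -4.0, 0.25, 0.15, -0.2]
T = normal_shrink_threshold(subband, sigma_noise=1.0, levels=2)
denoised = soft_threshold(subband, T)   # noise zeroed, edges preserved
```

Because T depends only on the subband's own energy and size, each subband of the quadtree decomposition receives its own adaptive threshold, which is what makes the method cheap to use as a segmentation pre-stage.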
Abstract: Authentication plays a vital role in many secure
systems. Most of these systems require the user to log in with a
secret password or passphrase before entering the system. This is to
ensure that all valuable information is kept confidential while also
guaranteeing its integrity and availability. However, to achieve this
goal, users are required to memorize high-entropy passwords or
passphrases, and such meaningless strings of data are unfortunately
sometimes difficult to remember. This paper presents a new scheme which
assigns a weight to each personal question given to the user for
revealing the encrypted secret or password. The focus of this
scheme is to offer fault tolerance to users by allowing them to forget
the answers to a subset of questions and still recover the
secret and achieve successful authentication. A comparison of the
security levels of the weight-based and weightless secret recovery schemes
is also discussed. The paper concludes with a few areas that require
further investigation in this research.
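The weighting idea in the scheme above can be illustrated with a minimal sketch: each question carries a weight, and the secret becomes recoverable once the weights of the correctly answered questions reach a threshold. The questions, weights, and threshold below are toy assumptions, and the real scheme reveals an encrypted secret rather than comparing plaintext answers.

```python
def can_recover(answers, reference, weights, threshold):
    """Sum the weights of correctly answered questions; the secret is
    recoverable when the accumulated weight reaches the threshold, so a
    user may forget the answers to a subset of low-weight questions."""
    score = sum(w for a, r, w in zip(answers, reference, weights) if a == r)
    return score >= threshold

reference = ["rex", "lisbon", "blue", "1999"]   # stored answers (toy data)
weights   = [4, 3, 2, 1]                        # per-question weights
threshold = 7

# Forgetting the two low-weight answers still authenticates...
assert can_recover(["rex", "lisbon", "?", "?"], reference, weights, threshold)
# ...but forgetting a high-weight one does not.
assert not can_recover(["?", "lisbon", "blue", "1999"], reference, weights, threshold)
```

Setting all weights equal recovers the weightless scheme the paper compares against; unequal weights let the designer demand the answers that carry the most entropy.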
Abstract: An approach to the design of stable control systems with ultimately wide ranges of uncertainly disturbed parameters is offered. The method relies on using nonlinear structurally stable functions from catastrophe theory as controllers. The theoretical part presents an analysis of the designed nonlinear second-order control systems; as the most important cases, integrators in series, the canonical controllable form, and Jordan forms are considered. The analysis shows that, due to the added controllers, the systems become stable and insensitive to any disturbance of the parameters. The experimental part presents MATLAB simulation of the design of control systems for epidemic spread, aircraft angular motion, and submarine depth. The results of the simulation confirm the efficiency of the offered design method. Keywords: Catastrophes, robust control, simulation, uncertain parameters.
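A minimal numerical illustration of the idea above: a structurally stable cubic (fold-type) controller applied to a scalar plant with an uncertain parameter a. The scalar plant, the gain, and the simulation settings are our own toy assumptions; the abstract's designs concern second-order systems and concrete applications.

```python
def simulate(a, k1=1.0, x0=2.0, dt=1e-3, steps=20000):
    """Euler simulation of dx/dt = a*x + u with the structurally
    stable cubic controller u = -k1*x - x**3."""
    x = x0
    for _ in range(steps):
        x += dt * (a * x - k1 * x - x ** 3)
    return x

# The open loop dx/dt = a*x is unstable for every a > 0, yet the closed
# loop remains bounded over a wide range of the uncertain parameter a.
states = {a: simulate(a) for a in (-5.0, 0.0, 3.0, 10.0)}
```

The cubic term dominates for large |x| regardless of a, which is the source of the insensitivity to parameter disturbances: the state always settles into a bounded equilibrium set whose size grows only as sqrt(a - k1).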
Abstract: Game theory can be used to analyze conflicting
interests in the field of information hiding. In this paper, a 2-phase game
is used to build an embedder-attacker system and analyze the
limits of the hiding capacity of embedding algorithms: the embedder
minimizes the expected damage and the attacker maximizes it. In the
system, the embedder first consumes its resource to build embedded
units (EU) and insert the secret information into the EU. The attacker
then distributes its resource evenly over the attacked EU. The expected
equilibrium damage, which is the maximum damage from the
point of view of the attacker and the minimum from that of the embedder,
is evaluated for the case in which the attacker attacks a
subset of all the EU. Furthermore, the optimal equilibrium capacity
of hiding information is calculated through the optimal number of EU
carrying the embedded secret information. Finally, illustrative examples
of the optimal equilibrium capacity are presented.
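The equilibrium computation can be illustrated with a deliberately simplified payoff model of our own choosing (the paper's damage function and resource model differ): the embedder splits defence resource E over n EU, the attacker splits attack resource A evenly over k attacked EU, an attacked unit is destroyed when its attack share exceeds its defence share, and the embedder earns a fixed gain per EU built.

```python
def worst_damage(n, E=10.0, A=6.0):
    """Attacker's best response: choose k in 1..n and split A evenly over
    the k attacked units; a unit is destroyed when A/k exceeds its defence
    E/n, so attacking k units causes damage k if A/k > E/n, else 0."""
    return max((k for k in range(1, n + 1) if A / k > E / n), default=0)

def optimal_units(gain=0.55, max_units=30):
    """Embedder's equilibrium choice: the number of EU that maximizes
    the guaranteed payoff gain*n - worst_damage(n)."""
    return max(range(1, max_units + 1),
               key=lambda n: gain * n - worst_damage(n))
```

With these toy numbers the guaranteed payoff peaks at five embedded units; the optimal equilibrium capacity then follows from the information carried per EU.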
Abstract: A data warehouse (DW) is a system whose value and role lie in supporting decision-making through querying. Queries to a DW are critical with regard to their complexity and length; they often access millions of tuples and involve joins between relations and aggregations. Materialized views can provide better performance for DW queries. However, these views carry a maintenance cost, so materializing all views is not possible. An important challenge of the DW environment is materialized view selection, because we have to balance the trade-off between query performance and view maintenance cost. Therefore, in this paper, we introduce a new approach to this challenge based on Two-Phase Optimization (2PO), which is a combination of Simulated Annealing (SA) and Iterative Improvement (II), together with the use of a Multiple View Processing Plan (MVPP). Our experiments show that 2PO outperforms the original algorithms in terms of query processing cost and view maintenance cost.
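The 2PO combination named above, Iterative Improvement to reach a local optimum followed by Simulated Annealing started from it, can be sketched on a toy view-selection model. The benefit/maintenance figures and the additive cost function are our own illustrative assumptions, not an MVPP cost model.

```python
import math
import random

random.seed(7)

# Toy cost model (illustrative): materializing view i saves BENEFIT[i]
# in query cost but adds MAINTAIN[i] in maintenance cost.
BENEFIT  = [30, 22, 18, 11, 9, 7, 5, 3]
MAINTAIN = [12, 25, 6, 4, 10, 2, 8, 1]
BASE = 200

def cost(s):
    return BASE + sum(m - b for b, m, x in zip(BENEFIT, MAINTAIN, s) if x)

def neighbors(s):
    for i in range(len(s)):
        t = list(s)
        t[i] = 1 - t[i]
        yield t

def iterative_improvement(s):
    """Phase 1 (II): flip single views downhill until a local optimum."""
    improved = True
    while improved:
        improved = False
        for t in neighbors(s):
            if cost(t) < cost(s):
                s, improved = t, True
    return s

def simulated_annealing(s, T=50.0, cooling=0.95, steps=400):
    """Phase 2 (SA): restart from the II optimum to try to escape it."""
    best = s
    for _ in range(steps):
        t = random.choice(list(neighbors(s)))
        d = cost(t) - cost(s)
        if d < 0 or random.random() < math.exp(-d / T):
            s = t
        if cost(s) < cost(best):
            best = s
        T *= cooling
    return best

solution = simulated_annealing(iterative_improvement([0] * 8))
```

In this additive toy model II alone already finds the optimum; the SA phase matters when benefits interact through shared sub-expressions of the view processing plan, which creates the local optima II gets stuck in.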
Abstract: Near-infrared (NIR) spectroscopy is a widely used
method for material identification for laboratory and industrial applications.
While standard spectrometers only allow measurements at
one sampling point at a time, NIR Spectral Imaging techniques can
measure, in real-time, both the size and shape of an object as well as
identify the material the object is made of. The online classification
and sorting of recovered paper with NIR Spectral Imaging (SI)
is used with success in the paper recycling industry throughout
Europe. Recently, the globalisation of the recycling material streams
has caused water-based flexographic-printed newspapers, mainly from the
UK and Italy, to appear in central Europe as well. These flexo-printed
newspapers are not sufficiently de-inkable with the standard de-inking
process originally developed for offset-printed paper. This de-inking
process removes the ink from recovered paper and is the fundamental
processing step to produce high-quality paper from recovered paper.
Thus, the flexo-printed newspapers are a growing problem for the
recycling industry as they reduce the quality of the produced paper
if their amount exceeds a certain limit within the recovered paper
material.
This paper presents the results of a research project for the
development of an automated entry inspection system for recovered
paper that was jointly conducted by CTR AG (Austria) and PTS
Papiertechnische Stiftung (Germany). Within the project an NIR
SI prototype for the identification of flexo-printed newspaper has
been developed. The prototype can identify and sort out flexo-printed
newspapers in real time and achieves a detection accuracy of over 95%
for flexo-printed newspaper. NIR SI, the technology the
prototype is based on, allows the development of inspection systems
for incoming goods in a paper production facility as well as industrial
sorting systems for recovered paper in the recycling industry in the
near future.
Abstract: This paper proposes a novel multi-format stream grid
architecture for real-time image monitoring systems. The system, based
on a three-tier architecture, includes a stream receiving unit, a stream
processor unit, and a presentation unit. It is a distributed-computing,
loosely coupled architecture whose benefit is that the number of required
servers can be adjusted depending on the load of the image
monitoring system. The stream receiving unit supports multiple capture
source devices and multi-format stream compression encoders. The stream
processor unit includes three modules: a stream clipping
module, an image processing module, and an image management module.
The presentation unit can display image data on several different platforms.
We verified the proposed grid architecture with an actual image
monitoring test, using a fast image matching method with
adjustable parameters for different monitoring situations. A background
subtraction method is also implemented in the system. Experimental
results showed that the proposed architecture is robust, adaptive, and
powerful for image monitoring.
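The background subtraction step mentioned above can be sketched minimally; the frame data, the threshold, and the running-average update rule are illustrative assumptions, not the system's actual implementation.

```python
def background_subtract(frame, background, threshold=25):
    """Mark pixels whose absolute difference from the background model
    exceeds the threshold as foreground (1), the rest as background (0)."""
    return [[1 if abs(p - b) > threshold else 0
             for p, b in zip(frow, brow)]
            for frow, brow in zip(frame, background)]

def update_background(background, frame, alpha=0.05):
    """Running-average background model: slowly adapt toward new frames
    so lighting drift is absorbed without absorbing moving objects."""
    return [[(1 - alpha) * b + alpha * p
             for p, b in zip(frow, brow)]
            for frow, brow in zip(frame, background)]

background = [[10, 10, 10], [10, 10, 10], [10, 10, 10]]
frame      = [[10, 10, 10], [10, 200, 10], [10, 10, 10]]  # moving object
mask = background_subtract(frame, background)
```

The threshold plays the same role as the adjustable matching parameters in the abstract: a higher value suppresses sensor noise at the cost of missing low-contrast motion.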
Abstract: In this paper, we study the multi-scenario knapsack problem, a variant of the well-known NP-hard single knapsack problem. We investigate the use of an adaptive algorithm for heuristically solving the problem. The method combines two complementary phases: a size reduction phase and a dynamic 2-opt procedure. First, the reduction phase applies a polynomial reduction strategy that is used to reduce the problem size. Second, the adaptive search procedure is applied in order to attain a feasible solution. Finally, the performance of two versions of the proposed algorithm is evaluated on a set of randomly generated instances.
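The two phases can be illustrated on a plain 0/1 knapsack instance (the multi-scenario variant aggregates several profit scenarios, which we omit here): a greedy construction gives a feasible solution, and a 2-opt pass swaps one packed item for one unpacked item whenever the swap stays feasible and improves the value. The instance data and helper names are illustrative.

```python
def greedy_solution(values, weights, capacity):
    """Feasible start: pack items by decreasing value/weight ratio."""
    order = sorted(range(len(values)),
                   key=lambda i: values[i] / weights[i], reverse=True)
    chosen, load = [0] * len(values), 0
    for i in order:
        if load + weights[i] <= capacity:
            chosen[i], load = 1, load + weights[i]
    return chosen

def two_opt(chosen, values, weights, capacity):
    """Swap a packed item for an unpacked one while feasible and improving."""
    improved = True
    while improved:
        improved = False
        load = sum(w for w, x in zip(weights, chosen) if x)
        for i in range(len(chosen)):
            for j in range(len(chosen)):
                if chosen[i] and not chosen[j]:
                    new_load = load - weights[i] + weights[j]
                    if new_load <= capacity and values[j] > values[i]:
                        chosen[i], chosen[j] = 0, 1
                        load, improved = new_load, True
    return chosen

values, weights, capacity = [6, 10], [3, 6], 6
start = greedy_solution(values, weights, capacity)       # packs item 0 only
best = two_opt(list(start), values, weights, capacity)   # swaps in item 1
```

The tiny instance shows why the second phase is needed: the ratio-greedy start is a local choice that a single feasible swap can improve from value 6 to the optimum 10.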
Abstract: The paper proposes and validates a new method of solving instances of the vehicle routing problem (VRP). The approach is based on a multiple agent system paradigm. The paper contains the VRP formulation, an overview of the multiple agent environment used and a description of the proposed implementation. The approach is validated experimentally. The experiment plan and the discussion of experiment results follow.
Abstract: In a virtual organization, a Knowledge Discovery (KD)
service comprises distributed data resources and computing grid nodes.
The computational grid is integrated with the data grid to form the
Knowledge Grid, which implements the Apriori algorithm for mining
association rules on the grid network. This paper describes the development
of a parallel and distributed version of the Apriori algorithm on the
Globus Toolkit using the Message Passing Interface extended with Grid
Services (MPICH-G2). The Knowledge Grid is created on top of the data and
computational grids to support decision making in real-time
applications. In this paper, a case study describes the design and
implementation of local and global mining of frequent item sets. The
experiments were conducted on different configurations of the grid
network, and the computation time was recorded for each operation. We
analyzed our results for various grid configurations, and they show that
the speedup of the computation time is almost superlinear.
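The local/global mining split described above can be sketched serially: each grid node counts candidate itemsets on its own partition, and the counts are then merged globally, the step MPICH-G2 would perform via message passing. The transactions, node layout, and the restriction to 1- and 2-itemsets are illustrative assumptions.

```python
from itertools import combinations

def local_counts(transactions):
    """One node: support counts of all 1- and 2-itemsets in its partition."""
    counts = {}
    for t in transactions:
        items = sorted(set(t))
        for r in (1, 2):
            for c in combinations(items, r):
                counts[c] = counts.get(c, 0) + 1
    return counts

def global_frequent(node_counts, min_support):
    """Merge local counts from all nodes and keep globally frequent
    itemsets (the reduction step of the distributed algorithm)."""
    total = {}
    for counts in node_counts:
        for itemset, c in counts.items():
            total[itemset] = total.get(itemset, 0) + c
    return {s: c for s, c in total.items() if c >= min_support}

node1 = [["milk", "bread"], ["milk", "eggs"]]       # partition on node 1
node2 = [["milk", "bread"], ["bread"]]              # partition on node 2
frequent = global_frequent([local_counts(node1), local_counts(node2)],
                           min_support=2)
```

Because support counts are additive across partitions, only the compact count tables travel over the network, never the raw transactions, which is what makes the grid decomposition pay off.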
Abstract: The Generalized Center String (GCS) problem is
generalized from the Common Approximate Substring and
Common Substring problems. GCS is known to be NP-hard;
the difficulty lies in the explosion of potential candidates,
and which sequences may fail to contain a motif is not known
in advance in any particular biological gene process. GCS can
be solved by frequent pattern-mining techniques and is known
to be fixed-parameter tractable with respect to the input
sequence length and symbol-set size. Efficient methods known
as Bpriori algorithms can solve GCS with reasonable
time/space complexity; the Bpriori 2 and Bpriori 3-2
algorithms find center strings of any length together with the
positions of all their instances in the input sequences. In this
paper, we reduce the time/space complexity of the Bpriori
algorithm with a Constraint-Based Frequent Pattern mining
(CBFP) technique which integrates the ideas of constraint-based
mining and FP-tree mining. The CBFP mining technique
solves the GCS problem not only for center strings of any
length but also for the positions of all their mutated copies in
the input sequences. It constructs a TRIE-like FP-tree to
represent the mutated copies of center strings of any length,
along with constraints to restrain the growth of the consensus
tree. The complexity of the CBFP mining technique and of the
Bpriori algorithm is analyzed for the worst case and the
average case, and the correctness of the algorithm is
demonstrated by comparison with the Bpriori algorithm on
artificial data.
Abstract: The classification of random and natural textures is still
one of the biggest challenges in the fields of image processing and
pattern recognition. In this paper, texture feature extraction using
the Slant Hadamard Transform (SHT) was studied and compared to other
signal-processing-based texture classification schemes. A
parametric SHT was also introduced and employed for natural
texture feature extraction. We showed that a subtly modified
parametric SHT can outperform the ordinary Walsh-Hadamard
transform and the discrete cosine transform. Experiments were carried
out on a subset of the VisTex random natural texture images using a
kNN classifier.
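A sketch of the pipeline described above, using the ordinary Walsh-Hadamard transform that the paper compares against (the Slant Hadamard Transform replaces some basis rows with slant vectors, which we do not reproduce here). The toy 1-D "texture" rows and the 1-NN setting are illustrative assumptions.

```python
def wht(signal):
    """Fast Walsh-Hadamard transform; length must be a power of two."""
    a = list(signal)
    h = 1
    while h < len(a):
        for i in range(0, len(a), h * 2):
            for j in range(i, i + h):
                a[j], a[j + h] = a[j] + a[j + h], a[j] - a[j + h]
        h *= 2
    return a

def features(signal):
    """Texture features: magnitudes of the transform coefficients."""
    return [abs(c) for c in wht(signal)]

def knn_classify(sample, train, k=1):
    """k-nearest-neighbour vote in feature space (squared Euclidean)."""
    f = features(sample)
    ranked = sorted(train, key=lambda tv: sum((a - b) ** 2
                    for a, b in zip(f, features(tv[0]))))
    labels = [label for _, label in ranked[:k]]
    return max(set(labels), key=labels.count)

# Toy 1-D "texture" rows: smooth versus alternating (striped) patterns.
train = [([4, 4, 4, 4, 4, 4, 4, 4], "smooth"),
         ([5, 5, 4, 4, 5, 5, 4, 4], "smooth"),
         ([9, 1, 9, 1, 9, 1, 9, 1], "striped"),
         ([8, 0, 8, 0, 8, 0, 8, 0], "striped")]
label = knn_classify([7, 1, 7, 1, 7, 1, 7, 1], train)
```

Taking magnitudes makes the features insensitive to the sign of the pattern, so textures differing only by contrast inversion fall close together in feature space.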
Abstract: This paper investigates the problem of sampling from transactional data streams. We introduce CFISDS, a content-based sampling algorithm that works on a landmark window model of data streams and preserves a more informative sample in the sample space. The algorithm, which is based on closed frequent itemset mining tasks, first initializes a concept lattice using the initial data and then updates the lattice structure using an incremental mechanism that inserts, updates, and deletes nodes in the concept lattice in a batch manner. The algorithm extracts the final samples on demand of the user. Experimental results show the accuracy of CFISDS on synthetic and real datasets, although CFISDS is not faster than existing sampling algorithms such as Z and DSS.
Abstract: This paper presents a technique for the diagnosis of abdominal aortic aneurysm in magnetic resonance imaging (MRI) images. First, our technique segments the aorta in MRI images, a required step in determining the volume of the aorta, which is in turn the important step in diagnosing an abdominal aortic aneurysm. Our proposed technique detects the volume of the aorta in MRI images using a new external energy for the snakes model, calculated from Laws' texture. The new external energy increases the capture range of the snakes model more efficiently than the old external energies of snakes models. Second, our technique diagnoses the abdominal aortic aneurysm with a Bayesian classifier, a classification model based on statistical theory. The features for classifying an abdominal aortic aneurysm, namely area, perimeter, and compactness, were derived from the aorta contour produced by the segmentation of our snakes model. We also compare the proposed technique with the traditional snakes model. In our experiments, 30 images were used for training, and 20 images were tested and compared with expert opinion. The experimental results show that our technique provides an accuracy of more than 95%.
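The contour features named above (area, perimeter, compactness) can be computed directly from a segmented contour. The polygon representation and the standard compactness definition P²/(4πA) are our assumptions; the paper does not spell out its exact formulas.

```python
import math

def area(contour):
    """Polygon area from the contour points (shoelace formula)."""
    n = len(contour)
    return abs(sum(contour[i][0] * contour[(i + 1) % n][1]
                   - contour[(i + 1) % n][0] * contour[i][1]
                   for i in range(n))) / 2.0

def perimeter(contour):
    """Sum of the edge lengths around the closed contour."""
    n = len(contour)
    return sum(math.dist(contour[i], contour[(i + 1) % n]) for i in range(n))

def compactness(contour):
    """P^2 / (4*pi*A): 1.0 for a circle, larger for irregular contours
    such as an aneurysmal bulge of the aorta."""
    return perimeter(contour) ** 2 / (4.0 * math.pi * area(contour))

square = [(0, 0), (10, 0), (10, 10), (0, 10)]   # toy cross-section contour
feats = (area(square), perimeter(square), compactness(square))
```

Compactness is scale-invariant, which is why it pairs well with the scale-dependent area and perimeter as inputs to the Bayesian classifier.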