Abstract: Rule Discovery is an important technique for mining
knowledge from large databases. Using objective measures to
discover interesting rules leads to another data mining problem,
albeit one of reduced complexity. Data mining researchers have
studied subjective measures of interestingness to reduce the volume
of discovered rules to ultimately improve the overall efficiency of
the KDD process.
In this paper we study the novelty of discovered rules as a
subjective measure of interestingness. We propose a hybrid approach
based on both objective and subjective measures to quantify novelty
of the discovered rules in terms of their deviations from the known
rules (knowledge). We analyze the types of deviation that can arise
between two rules and categorize the discovered rules according to
a user-specified threshold. We implement the proposed framework
and experiment with some public datasets. The experimental results
are promising.
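To make the deviation-based categorization above concrete, the following is a minimal illustrative sketch in Python, assuming a rule is simply a pair of item sets (antecedent, consequent) and using a symmetric-difference deviation measure and a threshold that are hypothetical, not the paper's exact formulation.

```python
# Illustrative sketch (not the paper's exact scheme): categorize a discovered
# association rule by its deviation from a set of known rules, where each rule
# is a pair (antecedent items, consequent items).

def deviation(rule, known):
    """Symmetric-difference based deviation between two rules, in [0, 1]."""
    (a1, c1), (a2, c2) = rule, known
    d_ant = len(a1 ^ a2) / max(len(a1 | a2), 1)
    d_con = len(c1 ^ c2) / max(len(c1 | c2), 1)
    return (d_ant + d_con) / 2

def categorize(rule, known_rules, threshold=0.5):
    """Label a discovered rule as 'conforming' or 'novel' w.r.t. known rules."""
    d_min = min(deviation(rule, k) for k in known_rules)
    return ("novel" if d_min > threshold else "conforming", d_min)

known = [({"bread"}, {"butter"}), ({"beer"}, {"chips"})]
print(categorize(({"bread", "milk"}, {"butter"}), known, threshold=0.5))
```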
Abstract: Most systems deal with time-varying
signals. Power efficiency can be achieved by adapting the system
activity to the input signal variations. In this context,
an adaptive-rate filtering technique based on level-crossing sampling
is devised. It adapts the sampling frequency and the filter order
by following the local variations of the input signal. Thus, it correlates the
processing activity with the signal variations. Interpolation is required
in the proposed technique. A drastic reduction in the interpolation
error is achieved by exploiting symmetry during the interpolation
process. The processing error of the proposed technique is
calculated. The computational complexity of the proposed filtering
technique is deduced and compared to that of the classical approach. Results
promise a significant gain in computational efficiency and hence
in power consumption.
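A minimal sketch of the level-crossing sampling idea underlying the technique is given below, assuming a uniform grid of amplitude levels and an illustrative test signal; the adaptive filter-order selection and interpolation steps of the proposed technique are not shown.

```python
import numpy as np

# Minimal sketch of level-crossing sampling (assumed uniform level grid, not the
# paper's exact scheme): a sample (t, x) is recorded only when the signal crosses
# one of the amplitude levels, so denser signal activity yields more samples.

def level_crossing_sample(t, x, delta=0.25):
    levels = np.arange(x.min(), x.max() + delta, delta)
    samples = []
    for i in range(1, len(x)):
        lo, hi = sorted((x[i - 1], x[i]))
        if np.any((levels > lo) & (levels <= hi)):   # a level was crossed
            samples.append((t[i], x[i]))
    return np.array(samples)

t = np.linspace(0, 1, 1000)
x = np.sin(2 * np.pi * 3 * t) + 0.3 * np.sin(2 * np.pi * 40 * t) * (t > 0.7)
print(len(level_crossing_sample(t, x)), "level-crossing samples out of", len(t))
```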
Abstract: Masonry cavity walls are loaded by wind pressure and vertical load from upper floors. These loads result in bending moments and compression forces in the ties connecting the outer and the inner wall of a cavity wall. Large cavity walls are furthermore loaded by differential movements from the temperature gradient between the outer and the inner wall, which results in a critical increase of the bending moments in the ties. Since the ties are loaded by combined compression and moment forces, the load-bearing capacity is derived from instability equilibrium equations. Most of these are iterative, since exact instability solutions are complex to derive, not to mention the extra complexity introduced by dimensional instability from the temperature gradients. Using an inverse variable substitution and comparing an exact theory with an analytical instability solution, a method to design tie connectors in cavity walls was developed. The method takes into account constraint conditions limiting the free length of the wall tie, and the instability in case of pure compression, which gives an optimal load-bearing capacity. The model is illustrated with examples from practice.
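The type of instability equilibrium involved can be illustrated with the textbook Euler buckling load and second-order moment amplification for a tie of effective free length $L_e$; these are standard relations, not the paper's derivation.

```latex
% Textbook relations (not the paper's derivation) illustrating the kind of
% instability equilibrium involved: Euler buckling load of a tie with effective
% free length $L_e$ and the second-order amplification of its bending moment.
\[
  P_{cr} = \frac{\pi^{2} E I}{L_e^{2}}, \qquad
  M \approx \frac{M_0}{1 - P / P_{cr}},
\]
% where $P$ is the axial compression force in the tie, $M_0$ the first-order
% moment from wind pressure and differential temperature movements, and the
% constraint conditions of the wall reduce the free length $L_e$.
```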
Abstract: The need for an appropriate system of evaluating students'
educational development is a key problem in achieving the predefined
educational goals. The intensity of related papers in recent years that
try to prove or disprove the necessity and adequacy of student
assessment corroborates this. Some of these studies
tried to increase the precision of determining question weights in
scientific examinations. In all of them, however, there has been an attempt
to adjust the initial question weights while the accuracy and precision
of those initial question weights are still in question. Thus, in
order to increase the precision of the assessment of students'
educational development, the present study proposes a new
method for determining the initial question weights by considering
question factors such as difficulty, importance, and complexity,
and implementing a combined method of PROMETHEE and fuzzy
analytic network process using a data mining approach to improve
the model's inputs. The results of the implemented case study demonstrate
the improved performance and precision of the proposed
model.
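As a rough illustration of the outranking step, the following Python sketch computes PROMETHEE II net flows with the usual preference function over hypothetical question scores for difficulty, importance, and complexity; the criterion weights are placeholders for what the fuzzy ANP stage would provide, and the fuzzy and data mining components are not shown.

```python
import numpy as np

# Minimal PROMETHEE II sketch (usual preference function, hypothetical weights),
# showing how factor scores such as difficulty, importance and complexity could
# be turned into relative question rankings via net outranking flows.

def promethee_net_flows(scores, weights):
    """scores: (n_questions, n_criteria) matrix; weights sum to 1."""
    n = scores.shape[0]
    phi = np.zeros(n)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            phi[i] += weights @ (scores[i] > scores[j]).astype(float) / (n - 1)
            phi[i] -= weights @ (scores[j] > scores[i]).astype(float) / (n - 1)
    return phi

scores = np.array([[0.8, 0.6, 0.7],    # question 1: difficulty, importance, complexity
                   [0.4, 0.9, 0.5],
                   [0.6, 0.5, 0.9]])
weights = np.array([0.4, 0.35, 0.25])  # hypothetical criterion weights (e.g. from fuzzy ANP)
print(promethee_net_flows(scores, weights))
```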
Abstract: This paper examines the problem of strategic
management under highly turbulent and dynamic business environmental
conditions. As shown, the high complexity of the problem can be
managed with the use of System Dynamics models and computer
simulation to obtain insight into, and a thorough understanding of, the
interdependencies between the organizational structure and the
business environment, so that effective product-market
strategies can be designed. Simulation reveals the underlying forces
that hold together the structure of an organizational system in relation
to its environment. Such knowledge contributes to the avoidance
of fundamental planning errors and enables appropriate, proactive,
well-focused action.
Abstract: This paper proposes a method for diagnosing ball screw
preload loss through the Hilbert-Huang Transform (HHT) and
a Multiscale Entropy (MSE) process. The proposed method can
diagnose ball screw preload loss from vibration signals while the
machine tool is in operation. Ball screws with maximum dynamic preloads
of 2%, 4%, and 6% were predesigned, manufactured, and tested
experimentally. Signal patterns are discussed and revealed using
Empirical Mode Decomposition (EMD) with the Hilbert spectrum.
Different preload features are extracted and discriminated using HHT.
The irregularity development of a ball screw with preload loss is
determined and abstracted using MSE based on complexity
perception. Experimental results show that the proposed method can
predict the status of ball screw preload loss. Smart sensing of
ball screw health is also possible based on a comparative
evaluation of MSE through the signal processing and pattern matching of
EMD/HHT. This diagnosis method achieves effective prognosis of
preload loss and is convenient to use.
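A minimal multiscale entropy sketch is shown below for reference, using the common coarse-graining plus sample entropy formulation with default parameters m = 2 and r = 0.2 times the standard deviation; the vibration data here is synthetic, and the EMD/HHT stages are not included.

```python
import numpy as np

# Minimal multiscale entropy (MSE) sketch: coarse-grain the signal at each scale
# and compute sample entropy; m and r are the usual defaults, not values from
# the paper, and the input below is synthetic noise, not measured vibration.

def sample_entropy(x, m=2, r=None):
    r = 0.2 * np.std(x) if r is None else r
    def count(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        hits = 0
        for i in range(len(templ)):
            d = np.max(np.abs(templ - templ[i]), axis=1)   # Chebyshev distance
            hits += np.sum(d <= r) - 1                     # exclude self-match
        return hits
    b, a = count(m), count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, max_scale=5):
    out = []
    for tau in range(1, max_scale + 1):
        n = len(x) // tau
        coarse = x[:n * tau].reshape(n, tau).mean(axis=1)  # coarse-graining
        out.append(sample_entropy(coarse))
    return out

rng = np.random.default_rng(0)
print(multiscale_entropy(rng.normal(size=2000), max_scale=3))
```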
Abstract: Recognizing human actions from videos is an active
field of research in computer vision and pattern recognition. Human
activity recognition has many potential applications such as video
surveillance, human-machine interaction, sports video retrieval, and
robot navigation. Currently, local descriptors and bag-of-visual-words
models achieve state-of-the-art performance for human action
recognition. The main challenge in feature description is how to
represent local motion information efficiently. Most
previous works focus on extending 2D local descriptors to 3D
ones to describe local information around every interest point. In this
paper, we propose a new spatio-temporal descriptor based on a space-time
description of moving points. Our description is based on an
Accordion representation of video, which is well suited to recognizing
human actions from 2D local descriptors without the need for 3D
extensions. We use the bag-of-words approach to represent videos.
We quantize 2D local descriptors describing both temporal and spatial
features with a good compromise between computational complexity
and action recognition rates. We have achieved impressive results on
publicly available action datasets.
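The bag-of-words encoding step can be sketched as follows, using synthetic descriptors and a random-sample codebook in place of a learned one; the Accordion representation and the actual 2D descriptor extraction are not reproduced here.

```python
import numpy as np

# Minimal bag-of-visual-words sketch (synthetic data; a random-sample codebook is
# used instead of k-means to keep the example self-contained): each 2D local
# descriptor of a video is assigned to its nearest codeword, and the video is
# represented by the normalized histogram of codeword counts.

rng = np.random.default_rng(0)
train_descriptors = rng.normal(size=(5000, 64))        # stand-in for local descriptors
codebook = train_descriptors[rng.choice(5000, size=50, replace=False)]

def encode_video(descriptors, codebook):
    dists = np.linalg.norm(descriptors[:, None, :] - codebook[None, :, :], axis=2)
    words = dists.argmin(axis=1)                        # vector quantization step
    hist = np.bincount(words, minlength=len(codebook)).astype(float)
    return hist / hist.sum()

video_descriptors = rng.normal(size=(300, 64))          # descriptors of one video
print(encode_video(video_descriptors, codebook).shape)  # (50,) histogram for a classifier
```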
Abstract: Based on combined shape and texture
features, a fast object detection method with rotation-invariant features
is proposed in this paper. A quick template matching scheme based
on online learning, designed for online applications, is also introduced.
The experimental results show that the proposed
approach has lower computational complexity and a
higher detection rate while keeping almost the same performance
as the HOG-based method, and is therefore more suitable for
run-time applications.
Abstract: In this paper a deterministic polynomial-time
algorithm is presented for the Clique problem. The case is considered
as the problem of omitting the minimum number of vertices from the
input graph so that none of the zeroes of the graph's adjacency
matrix (except the main diagonal entries) would remain on the
adjacency matrix of the resulting subgraph. The existence of a
deterministic polynomial-time algorithm for the Clique problem, as
an NP-complete problem, would prove the equality of the P and NP
complexity classes.
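The reformulation can be checked on small graphs with the brute-force sketch below; this is only a validation of the equivalence between cliques and zero-free off-diagonal adjacency submatrices, not the deterministic polynomial-time algorithm claimed in the paper.

```python
import itertools
import numpy as np

# Sketch of the reformulation only (brute force, NOT the paper's polynomial-time
# algorithm): a vertex subset is a clique exactly when the induced adjacency
# submatrix contains no off-diagonal zeros, so a maximum clique corresponds to
# omitting the minimum number of vertices that removes all such zeros.

def is_clique(adj, verts):
    sub = adj[np.ix_(verts, verts)]
    return np.all(sub + np.eye(len(verts)) >= 1)

def max_clique_bruteforce(adj):
    n = len(adj)
    for size in range(n, 0, -1):
        for verts in itertools.combinations(range(n), size):
            if is_clique(adj, list(verts)):
                return list(verts)
    return []

adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 1],
                [1, 1, 0, 1],
                [0, 1, 1, 0]])
print(max_clique_bruteforce(adj))   # one maximum clique, e.g. [0, 1, 2]
```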
Abstract: In this paper, we propose a new modular approach, called neuroglial, consisting of two neural networks, one slow and one fast, which emulates a recently discovered biological reality. The implementation is based on complex multi-time-scale systems; validation is performed on the model of the asynchronous machine. We applied a geometric approach based on the Gerschgorin circles for the decoupling of fast and slow variables, and the method of singular perturbations for the development of reduced models.
This new architecture allows for smaller networks with less complexity and better performance in terms of mean square error and convergence than the single network model.
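A minimal sketch of the Gerschgorin-circle step is given below on a hypothetical state matrix: discs whose centres lie far into the left half-plane are grouped as fast variables and the rest as slow, which is the kind of grouping a singular-perturbation reduction would then exploit.

```python
import numpy as np

# Minimal sketch (hypothetical matrix, not the asynchronous machine model):
# compute the Gerschgorin discs of a state matrix and use the disc locations to
# group state variables into "slow" (discs near the origin) and "fast" (discs
# far in the left half-plane) sets before a singular-perturbation reduction.

def gerschgorin_discs(A):
    centers = np.diag(A)
    radii = np.sum(np.abs(A), axis=1) - np.abs(centers)
    return centers, radii

A = np.array([[ -1.0,  0.2,   0.1,   0.0],
              [  0.1, -2.0,   0.0,   0.2],
              [  0.3,  0.0, -50.0,   1.0],
              [  0.0,  0.5,   2.0, -80.0]])

centers, radii = gerschgorin_discs(A)
slow = [i for i, c in enumerate(centers) if abs(c) < 10]   # threshold is illustrative
fast = [i for i, c in enumerate(centers) if abs(c) >= 10]
print("discs:", list(zip(centers, radii)))
print("slow variables:", slow, "fast variables:", fast)
```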
Abstract: An ontology is a data model that represents a set of
concepts in a given field and the relationships among those concepts.
As the emphasis on achieving the Semantic Web continues to escalate,
ontologies for all types of domains will increasingly be developed.
These ontologies may become large and complex, and as their size
and complexity grows, so will the need for multi-user interfaces for
ontology curation. Herein a functionally comprehensive, generic
approach to maintaining an ontology as a relational database is
presented. Unlike many other ontology editors that utilize a database,
this approach is entirely domain-generic and fully supports Web-based,
collaborative editing, including the designation of different
levels of authorization for users.
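A minimal, domain-generic sketch of how such an ontology could be held in a relational database is shown below using Python and SQLite; the table and column names are illustrative assumptions, not the schema of the presented system.

```python
import sqlite3

# Minimal, domain-generic sketch (not the paper's actual schema): an ontology
# stored as two relational tables, one for concepts and one for typed
# relationships between concepts, plus a user table for authorization levels.

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE concept (
    id    INTEGER PRIMARY KEY,
    name  TEXT NOT NULL UNIQUE
);
CREATE TABLE relationship (
    subject_id  INTEGER NOT NULL REFERENCES concept(id),
    relation    TEXT    NOT NULL,            -- e.g. 'is_a', 'part_of'
    object_id   INTEGER NOT NULL REFERENCES concept(id)
);
CREATE TABLE app_user (
    name   TEXT PRIMARY KEY,
    level  TEXT NOT NULL CHECK (level IN ('viewer', 'editor', 'curator'))
);
""")
conn.executemany("INSERT INTO concept(name) VALUES (?)", [("enzyme",), ("protein",)])
conn.execute("INSERT INTO relationship VALUES (1, 'is_a', 2)")
print(conn.execute("""SELECT s.name, r.relation, o.name
                      FROM relationship r
                      JOIN concept s ON s.id = r.subject_id
                      JOIN concept o ON o.id = r.object_id""").fetchall())
```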
Abstract: Many companies already have Excel, so it is economical and practical to perform material requirements planning (MRP) in Excel. For products with several components, however, linking the relationships between product tables is a complex problem, because the relationships depend on the bill of materials (BOM). This paper presents an algorithm to create MRP in Excel and to link the relationships between tables. The study reveals that MRP created by the algorithm is easier and faster to build than MRP created manually. With this technique, MRP in Excel can be a good way to improve a company's productivity.
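The BOM-driven linking that makes spreadsheet MRP awkward can be sketched as a level-by-level requirements explosion; the Python below uses a hypothetical two-level product and only illustrates the computation the Excel algorithm automates.

```python
# Minimal sketch of the BOM explosion behind MRP (hypothetical product data, not
# the paper's Excel algorithm): given demand for a finished product, propagate
# gross requirements down the bill of materials level by level.

bom = {                       # parent -> list of (component, quantity per parent)
    "bike":  [("frame", 1), ("wheel", 2)],
    "wheel": [("rim", 1), ("spoke", 32)],
}

def explode(item, quantity, requirements=None):
    requirements = {} if requirements is None else requirements
    requirements[item] = requirements.get(item, 0) + quantity
    for component, per_parent in bom.get(item, []):
        explode(component, quantity * per_parent, requirements)
    return requirements

print(explode("bike", 10))
# {'bike': 10, 'frame': 10, 'wheel': 20, 'rim': 20, 'spoke': 640}
```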
Abstract: The "cocktail party problem" refers to a well-known human auditory ability: we can recognize a specific sound that we want to listen to even if many undesirable sounds or noises are mixed with it. Blind source separation (BSS) based on independent component analysis (ICA) is one of the methods by which we can separate a particular signal from a mixture under simple hypotheses. In this paper, we propose an online approach to blind source separation using the sliding DFT and time-domain independent component analysis. The proposed method reduces calculation complexity in comparison with conventional methods and can be applied to parallel processing using digital signal processors (DSPs). We evaluate this method and show its effectiveness.
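The sliding DFT front end can be sketched as the following recursive bin update, which is the standard formulation rather than the paper's full pipeline; the subsequent time-domain ICA stage is not shown.

```python
import numpy as np

# Minimal sliding-DFT sketch (illustrative, not the paper's full BSS pipeline):
# each new sample updates the N-point DFT bins recursively in O(N) instead of
# recomputing an FFT, which keeps an online, sample-by-sample frontend for the
# subsequent time-domain ICA cheap.

def sliding_dft(x, N=64):
    bins = np.zeros(N, dtype=complex)
    window = np.zeros(N)
    twiddle = np.exp(2j * np.pi * np.arange(N) / N)
    spectra = []
    for n, sample in enumerate(x):
        oldest = window[n % N]                      # sample leaving the window
        window[n % N] = sample
        bins = (bins + sample - oldest) * twiddle   # recursive bin update
        if n >= N - 1:
            spectra.append(bins.copy())
    return np.array(spectra)

x = np.sin(2 * np.pi * 5 * np.arange(256) / 64)
print(sliding_dft(x).shape)          # one N-bin spectrum per incoming sample
```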
Abstract: Robot manipulators are highly coupled nonlinear
systems; therefore, the real system and the mathematical model of the
dynamics used for control system design are not the same. Hence, fine-tuning
of the controller is always needed. For better tuning, fast simulation
is desirable. Since Matlab incorporates LAPACK to increase the speed
of matrix computation, the dynamics and the forward and
inverse kinematics of the PUMA 560 are modeled in Matlab/Simulink in
such a way that all operations are matrix based, which yields very short
simulation times. This paper compares PID parameter tuning using
Genetic Algorithm, Simulated Annealing, Generalized Pattern Search
(GPS), and hybrid search techniques. Controller performances for all
these methods are compared in terms of joint-space ITSE and
Cartesian-space ISE for tracking circular and butterfly trajectories.
A disturbance signal is added to check the robustness of the controller. The
GA-GPS hybrid search technique shows the best results for tuning the PID
controller parameters in terms of ITSE and robustness.
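For reference, the ITSE objective that the search techniques minimize can be sketched as below; a first-order plant stands in for the PUMA 560 joint dynamics (an assumption made for compactness), and any of the compared methods would search over the (kp, ki, kd) triple.

```python
import numpy as np

# Minimal sketch of the ITSE objective used to compare tuning methods: simulate a
# simple closed loop (a first-order plant standing in for the PUMA 560 joint
# dynamics, which are far more complex) under PID control and integrate t*e(t)^2.

def itse(kp, ki, kd, dt=0.001, t_end=2.0, tau=0.1):
    y, integral, prev_e, cost = 0.0, 0.0, 1.0, 0.0
    for t in np.arange(0.0, t_end, dt):
        e = 1.0 - y                               # unit step reference
        integral += e * dt
        derivative = (e - prev_e) / dt
        u = kp * e + ki * integral + kd * derivative
        y += dt * (-y + u) / tau                  # first-order plant dy/dt = (u - y)/tau
        cost += t * e * e * dt                    # ITSE accumulation
        prev_e = e
    return cost

# any search technique (GA, SA, GPS, hybrid, ...) would minimize this over (kp, ki, kd)
print(itse(2.0, 1.0, 0.01), itse(8.0, 2.0, 0.02))
```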
Abstract: A new topology of unified power quality conditioner
(UPQC) is proposed for power quality (PQ) improvement in
a three-phase four-wire (3P-4W) distribution system. For neutral
current mitigation, a star-hexagon transformer is connected in shunt
near the load along with a three-leg voltage source inverter (VSI)
based UPQC. For the mitigation of the source neutral current, the use of
passive elements is advantageous over active compensation due
to ruggedness and lower control complexity. In addition,
connecting a star-hexagon transformer for neutral current mitigation
reduces the overall rating of the UPQC. The performance of the
proposed topology of the 3P-4W UPQC is evaluated for power-factor
correction, load balancing, neutral current mitigation, and mitigation
of voltage and current harmonics. A simple control algorithm based
on the Unit Vector Template (UVT) technique is used as the control
strategy of the UPQC for mitigation of the different PQ problems. In this
control scheme, the current/voltage control is applied to the
fundamental supply currents/voltages instead of the fast-changing APF
currents/voltages, thereby reducing the computational delay.
Moreover, no extra control is required for neutral source current
compensation; hence the number of current sensors is reduced. The
performance of the proposed topology of the UPQC is analyzed through
simulation results using MATLAB with its Simulink and
Power System Blockset toolboxes.
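The core of the UVT idea can be sketched in the idealized form below, without the PLL, DC-bus regulation, or switching stage a real UPQC controller needs: the sensed supply voltages are normalized to unit templates and scaled by the desired current magnitude to form the reference supply currents.

```python
import numpy as np

# Minimal sketch of the Unit Vector Template (UVT) idea (idealized, hypothetical
# magnitudes, no PLL or DC-bus loop): the sensed supply voltages are scaled to
# unit amplitude, and reference supply currents are obtained by multiplying the
# unit templates by the desired current magnitude, so they remain sinusoidal and
# in phase with the supply.

t = np.arange(0, 0.04, 1e-4)                       # two 50 Hz cycles
vm, w = 325.0, 2 * np.pi * 50
v_abc = np.array([vm * np.sin(w * t),
                  vm * np.sin(w * t - 2 * np.pi / 3),
                  vm * np.sin(w * t + 2 * np.pi / 3)])

u_abc = v_abc / vm                                 # unit vector templates
i_ref = 10.0 * u_abc                               # 10 A peak balanced reference currents
print(np.max(np.abs(i_ref)), "A peak; templates bounded by", np.max(np.abs(u_abc)))
```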
Abstract: This paper introduces two decoders for binary linear
codes based on metaheuristics. The first one uses a genetic algorithm,
and the second is based on a combination of a genetic algorithm with
a feed-forward neural network. The decoder based on genetic
algorithms (DAG), applied to BCH and convolutional codes, gives good
performance compared to the Chase-2 and Viterbi algorithms respectively,
and reaches the performance of OSD-3 for some Residue
Quadratic (RQ) codes. This algorithm is less complex for linear
block codes of large block length; furthermore, its performance
can be improved by tuning the decoder's parameters, in particular the
number of individuals per population and the number of generations.
In the second algorithm, the search space, in contrast to that of DAG,
which was limited to the codeword space, covers the whole binary
vector space. It avoids a great number of coding operations
by using a neural network. This greatly reduces the complexity of
the decoder while maintaining comparable performance.
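A toy genetic-algorithm decoder in the spirit of DAG is sketched below for the (7,4) Hamming code; the code, population size, and operators are illustrative choices, not the paper's configuration, and the neural-network variant is not shown.

```python
import numpy as np

# Minimal genetic-algorithm decoder sketch for a toy linear block code (the (7,4)
# Hamming code, purely illustrative): individuals are information words, fitness
# is the negative Euclidean distance between the received soft values and the
# BPSK-modulated codeword, and the fittest individual is returned after a few
# generations of selection, crossover and mutation.

G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 0, 1, 1],
              [0, 0, 1, 0, 1, 1, 1],
              [0, 0, 0, 1, 1, 0, 1]])

def fitness(info, received):
    codeword = info @ G % 2
    return -np.sum((received - (1 - 2 * codeword)) ** 2)   # BPSK: 0 -> +1, 1 -> -1

def ga_decode(received, pop_size=20, generations=30, pm=0.05, rng=np.random.default_rng(0)):
    pop = rng.integers(0, 2, size=(pop_size, 4))
    for _ in range(generations):
        scores = np.array([fitness(ind, received) for ind in pop])
        parents = pop[np.argsort(scores)][-pop_size // 2:]          # keep the best half
        cut = rng.integers(1, 4, size=pop_size // 2)
        children = np.array([np.concatenate((parents[i % len(parents)][:c],
                                             parents[(i + 1) % len(parents)][c:]))
                             for i, c in enumerate(cut)])            # one-point crossover
        children ^= (rng.random(children.shape) < pm).astype(int)    # mutation
        pop = np.vstack((parents, children))
    scores = np.array([fitness(ind, received) for ind in pop])
    return pop[np.argmax(scores)]

received = np.array([0.9, 1.1, -0.8, 1.2, -0.2, 0.7, -1.0])   # noisy BPSK observation
print(ga_decode(received))
```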
Abstract: Many multimedia communication applications require a
source to transmit messages to multiple destinations subject to a quality
of service (QoS) delay constraint. To support delay-constrained
multicast communications, computer networks need to guarantee an
upper bound on the end-to-end delay from the source node to each of
the destination nodes. This is known as the multicast delay problem.
On the other hand, if the same message fails to arrive at each
destination node at the same time, inconsistency and unfairness
problems may arise among users. This is the multicast delay-variation
problem. The problem of finding a minimum-cost multicast
tree with delay and delay-variation constraints has been proven to
be NP-complete. In this paper, we propose an efficient heuristic
algorithm, namely the Economic Delay and Delay-Variation Bounded
Multicast (EDVBM) algorithm, based on a novel heuristic function,
to construct an economic delay- and delay-variation-bounded multicast
tree. A noteworthy feature of this algorithm is that it has a very high
probability of finding the optimal solution in polynomial time with
low computational complexity.
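The two constraints that define a feasible tree can be sketched as the following check on a candidate multicast tree with hypothetical edge delays; EDVBM's heuristic construction itself is not reproduced here.

```python
# Minimal sketch of the two constraints EDVBM must satisfy (feasibility check of a
# candidate multicast tree, not the heuristic itself): every source-to-destination
# delay must stay below the delay bound, and the spread between the largest and
# smallest destination delays must stay below the delay-variation bound.

def end_to_end_delay(tree, edge_delay, node):
    """tree maps child -> parent; delays are summed up to the source (parent None)."""
    total = 0.0
    while tree[node] is not None:
        total += edge_delay[(tree[node], node)]
        node = tree[node]
    return total

def is_feasible(tree, edge_delay, destinations, delay_bound, variation_bound):
    delays = [end_to_end_delay(tree, edge_delay, d) for d in destinations]
    return max(delays) <= delay_bound and max(delays) - min(delays) <= variation_bound

tree = {"s": None, "a": "s", "d1": "a", "d2": "s"}                  # hypothetical tree
edge_delay = {("s", "a"): 2.0, ("a", "d1"): 3.0, ("s", "d2"): 4.0}  # hypothetical delays
print(is_feasible(tree, edge_delay, ["d1", "d2"], delay_bound=6.0, variation_bound=1.5))
```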
Abstract: An accurate optimal design of laminated composite
structures may present considerable difficulties due to the complexity
and multi-modality of the functional design space. The Big
Bang-Big Crunch (BB-BC) optimization method is a relatively new
technique and has already proved to be a valuable tool for structural
optimization. In the present study the exceptional efficiency of the
method is demonstrated by an example of the lay-up optimization
of multilayered anisotropic cylinders based on a three-dimensional
elasticity solution. It is shown that, due to its simplicity and speed,
the BB-BC method is much more efficient for this class of problems
compared to genetic algorithms.
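For reference, the basic Big Bang-Big Crunch loop can be sketched as below on a toy objective (the sphere function standing in for the laminate lay-up objective); the population size, iteration count, and shrinking-radius schedule are illustrative assumptions.

```python
import numpy as np

# Minimal Big Bang-Big Crunch sketch on a toy objective: each Big Bang scatters
# candidates around the current centre of mass with a shrinking radius, and each
# Big Crunch collapses them to a fitness-weighted centre of mass.

def bb_bc(objective, dim, bounds=(-5.0, 5.0), pop=30, iters=50, rng=np.random.default_rng(0)):
    lo, hi = bounds
    candidates = rng.uniform(lo, hi, size=(pop, dim))            # initial Big Bang
    for k in range(1, iters + 1):
        costs = np.array([objective(c) for c in candidates])
        weights = 1.0 / (costs + 1e-12)
        center = weights @ candidates / weights.sum()            # Big Crunch: centre of mass
        radius = (hi - lo) / k                                   # shrinking search radius
        candidates = center + rng.normal(size=(pop, dim)) * radius
        candidates = np.clip(candidates, lo, hi)                 # next Big Bang
    return center, objective(center)

best, cost = bb_bc(lambda x: np.sum(x ** 2), dim=4)
print(best, cost)
```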
Abstract: We describe a formal specification and verification of the Rabin public-key scheme in the formal proof system Isabelle/HOL. The idea is to use the two views of cryptographic verification: the computational approach, relying on the vocabulary of probability theory and complexity theory, and the formal approach, based on ideas and techniques from logic and programming languages. The analysis presented uses a given database to prove formal properties of our implemented functions with computer support. The main task in designing a practical formalization of correctness as well as security properties is to cope with the complexity of cryptographic proving. We reduce this complexity by exploring a lightweight formalization that enables both appropriate formal definitions and efficient formal proofs. This yields the first computer-proved implementation of the Rabin public-key scheme in Isabelle/HOL. Consequently, we get reliable proofs with a minimal error rate, augmenting the used database. This provides a formal basis for more computer proof constructions in this area.
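The functions being formalized can be illustrated with a textbook Rabin sketch in Python using toy primes; this is not the Isabelle/HOL development itself, only the scheme whose correctness and security properties it proves.

```python
# Textbook Rabin scheme sketch in Python (the paper's formalization is in
# Isabelle/HOL; this only illustrates the encryption and decryption functions
# whose properties are proved). Primes p, q are congruent to 3 mod 4 so that
# square roots modulo each prime are easy to compute.

p, q = 7, 11                      # toy primes, p = q = 3 (mod 4); n = 77 is the public key
n = p * q

def encrypt(m, n):
    return (m * m) % n            # c = m^2 mod n

def decrypt(c, p, q):
    """Return the four square roots of c modulo n = p*q via the CRT."""
    mp = pow(c, (p + 1) // 4, p)  # square roots modulo p and q
    mq = pow(c, (q + 1) // 4, q)
    _, yp, yq = extended_gcd(p, q)
    roots = set()
    for sp in (mp, p - mp):
        for sq in (mq, q - mq):
            roots.add((sp * q * yq + sq * p * yp) % (p * q))
    return roots

def extended_gcd(a, b):
    if b == 0:
        return a, 1, 0
    g, x, y = extended_gcd(b, a % b)
    return g, y, x - (a // b) * y

m = 20
c = encrypt(m, n)
print(c, decrypt(c, p, q))        # the original message is among the four roots
```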
Abstract: Supply chain management has become more
challenging with the emerging trends of globalization and
sustainability. Lately, research related to perishable product supply
chains, in particular agricultural food products, has emerged. This is
attributed to the additional complexity of managing this type of
supply chain given the recently increased concern for public health,
food quality, food safety, demand and price variability, and the
limited lifetime of these products. Inventory management for agri-food
supply chains is of vital importance due to product
perishability and customers' demand for quality. This paper
concentrates on developing a simulation model of a real-life case
study of a two-echelon production-distribution system for agri-food
products. The objective is to improve a set of performance measures
by developing a simulation model that helps in evaluating and
analysing the performance of these supply chains. Simulation results
show that the model can help improve overall system performance.