Abstract: CIM is the standard formalism for modeling management
information developed by the Distributed Management Task
Force (DMTF) in the context of its WBEM proposal, designed to
provide a conceptual view of the managed environment. In this
paper, we propose the inclusion of formal knowledge representation
techniques, based on Description Logics (DLs) and the Web Ontology
Language (OWL), in CIM-based conceptual modeling, and then we
examine the benefits of such a decision. The proposal is specified
as a CIM metamodel level mapping to a highly expressive subset
of DLs capable of capturing all the semantics of the models. The
paper shows how the proposed mapping provides CIM diagrams with
precise semantics and can be used for automatic reasoning about the
management information models, as a design aid, by means of new-generation
CASE tools, thanks to the use of state-of-the-art automatic
reasoning systems that support the proposed logic and use algorithms
that are sound and complete with respect to the semantics. Such a
CASE tool framework has been developed by the authors and its
architecture is also introduced. The proposed formalization is not
only useful at design time, but also at run time through the use of
rational autonomous agents, in response to a need recently recognized
by the DMTF.
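A mapping of this kind can be sketched with a few Description Logic axioms. The axioms below are illustrative only and are not taken from the paper's metamodel-level mapping; the concept and role names are hypothetical stand-ins for CIM schema elements:

```latex
% A CIM subclass relationship becomes DL concept subsumption:
\mathit{System} \sqsubseteq \mathit{LogicalElement}

% A CIM association with a minimum multiplicity of one becomes
% an existential role restriction:
\mathit{System} \sqsubseteq \exists\, \mathit{component}.\mathit{ManagedElement}

% A value constraint on a CIM property becomes a universal restriction:
\mathit{System} \sqsubseteq \forall\, \mathit{component}.\mathit{ManagedElement}
```

Axioms of this form are exactly what OWL DL reasoners consume, which is what enables the design-time consistency and subsumption checks described above.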
Abstract: Reaction-diffusion systems are mathematical models that describe how the concentration of one or more substances distributed in space changes under the influence of local chemical reactions, in which the substances are converted into each other, and diffusion, which causes the substances to spread out in space. The classical representation of a reaction-diffusion system is given by semi-linear parabolic partial differential equations, whose general form is ∂tX(x, t) = DΔX(x, t), where X(x, t) is the state vector, D is the matrix of the diffusion coefficients and Δ is the Laplace operator. If the solutes move in a homogeneous system in thermal equilibrium, the diffusion coefficients are constants that depend neither on the local concentrations of solvent and solutes nor on the local temperature of the medium. In this paper a new stochastic reaction-diffusion model is presented in which the diffusion coefficients are functions of the local concentration, viscosity and frictional forces of solvent and solute. Such a model provides a more realistic description of the molecular kinetics in non-homogeneous and highly structured media such as the intra- and inter-cellular spaces. The movement of a molecule A from a region i to a region j of the space is described as a first-order reaction Ai −k→ Aj, where the rate constant k depends on the diffusion coefficient. Representing the diffusional motion as a chemical reaction makes it possible to treat a reaction-diffusion system as a pure reaction system and to simulate it with Gillespie-inspired stochastic simulation algorithms. The stochastic time evolution of the system is given by the occurrence of diffusion events and chemical reaction events. At each time step an event (reaction or diffusion) is selected from a probability distribution of waiting times determined by the specific speeds of the reaction and diffusion events. Redi is the software tool developed to implement this model of reaction-diffusion kinetics and dynamics.
It is free software that can be downloaded from http://www.cosbi.eu. To demonstrate the validity of the new reaction-diffusion model, the simulation results obtained with Redi for chaperone-assisted protein folding in the cytoplasm are reported. This case study is attracting renewed attention from the scientific community due to current interest in protein aggregation as a potential cause of neurodegenerative diseases.
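The diffusion-as-reaction scheme lends itself to a direct-method (Gillespie-style) simulation. The following minimal sketch is not the Redi implementation; it simulates molecules of A hopping between two regions i and j as a pair of first-order reactions with rate constant k:

```python
import math
import random

def gillespie_diffusion(n_i, n_j, k, t_end, seed=0):
    """Direct-method SSA in which the hop of one molecule of A from
    region i to region j (and back) is treated as a first-order
    reaction with rate constant k, as in the diffusion-as-reaction
    scheme described above."""
    rng = random.Random(seed)
    t = 0.0
    while t < t_end:
        a1 = k * n_i                 # propensity of A_i -> A_j
        a2 = k * n_j                 # propensity of A_j -> A_i
        a0 = a1 + a2
        if a0 == 0.0:
            break
        # Exponentially distributed waiting time to the next event.
        t += -math.log(1.0 - rng.random()) / a0
        if t > t_end:
            break
        # Choose which diffusion event fires.
        if rng.random() * a0 < a1:
            n_i, n_j = n_i - 1, n_j + 1
        else:
            n_i, n_j = n_i + 1, n_j - 1
    return n_i, n_j
```

With chemical reaction events added to the same event list alongside the diffusion hops, the identical loop simulates a full reaction-diffusion system.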
Abstract: Currently, the WWW is the first resource scholars turn to
when seeking information. However, analyzing and interpreting this
volume of information leads to overload for researchers pursuing their
work.
Trend detection in scientific publication retrieval systems helps
scholars find relevant, new and popular special areas by
visualizing the trend of an input topic.
However, there is little research on trend detection in scientific
corpora, and the models proposed so far do not appear to be suitable.
Previous works lack an appropriate representation scheme for
research topics.
This paper describes a method that combines Semantic Web technologies
and ontologies to support advanced search functions such as trend
detection in the context of a scholarly Semantic Web system (SSWeb).
Abstract: A new idea for analyzing power system failures with
the use of artificial neural networks is proposed. An analysis of the
possibility of simulating the phenomena accompanying system faults and
restitution is described. It is shown that no universal model exists for
simulating these phenomena over the whole analyzed range.
The main classical methods for searching for an optimal structure and
for parameter identification are described briefly. An example with
calculation results is shown.
Abstract: In this paper, we first give the representation of the general solution of the following least-squares problem (LSP): Given matrices X ∈ Rn×p, B ∈ Rp×p and A0 ∈ Rr×r, find a matrix A ∈ Rn×n such that ‖XT AX − B‖ = min, s.t. A([1, r]) = A0, where A([1, r]) is the r×r leading principal submatrix of the matrix A. We then consider a best approximation problem: given an n×n matrix Ã with Ã([1, r]) = A0, find Â ∈ SE such that ‖Ã − Â‖ = minA∈SE ‖Ã − A‖, where SE is the solution set of the LSP. We show that the best approximation solution Â is unique and derive an explicit formula for it.
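The paper's explicit formula is not reproduced in the abstract. As an illustration only, the LSP can also be solved numerically, because the map A ↦ XᵀAX is linear in the entries of A: fix the leading principal submatrix and solve a linear least-squares problem over the remaining entries. The sketch below is that generic approach, not the authors' closed form:

```python
import numpy as np

def solve_lsp(X, B, A0):
    """Numerical sketch of the LSP: minimize ||X.T @ A @ X - B||_F
    subject to A[:r, :r] == A0, via linear least squares over the
    free entries of A (illustrative; not the paper's explicit formula)."""
    n, r = X.shape[0], A0.shape[0]
    free = [(i, j) for i in range(n) for j in range(n)
            if not (i < r and j < r)]
    A_fixed = np.zeros((n, n))
    A_fixed[:r, :r] = A0
    # Remove the fixed block's contribution from the target.
    target = (B - X.T @ A_fixed @ X).ravel()
    # The map A -> X.T @ A @ X is linear: one column per free entry.
    cols = []
    for i, j in free:
        E = np.zeros((n, n))
        E[i, j] = 1.0
        cols.append((X.T @ E @ X).ravel())
    M = np.stack(cols, axis=1)
    v = np.linalg.lstsq(M, target, rcond=None)[0]
    A = A_fixed
    for (i, j), val in zip(free, v):
        A[i, j] = val
    return A
```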
Abstract: Mathematical models of dynamics employing exterior calculus are representations of a single unifying principle; namely, the description of a dynamic system with a characteristic differential one-form on an odd-dimensional differentiable manifold leads, by analysis with exterior calculus, to a set of differential equations and a characteristic tangent vector (vortex vector) which define transformations of the system. Using this principle, a mathematical model for economic growth is constructed by proposing a characteristic differential one-form for economic growth dynamics (analogous to the action in Hamiltonian dynamics), then generating a pair of characteristic differential equations and solving these equations for the rate of economic growth as a function of labor and capital. By contracting the characteristic differential one-form with the vortex vector, the Lagrangian for economic growth dynamics is obtained.
Abstract: Mel Frequency Cepstral Coefficient (MFCC) features
are widely used as acoustic features for speech recognition as well
as speaker recognition. In the MFCC feature representation, the Mel
frequency scale is used to get high resolution in the low-frequency
region and low resolution in the high-frequency region. This kind of
processing is good for obtaining stable phonetic information, but not
suitable for speaker features that are located in the high-frequency
regions. The
speaker individual information, which is non-uniformly distributed
in the high frequencies, is equally important for speaker recognition.
Based on this fact, we propose an admissible wavelet-packet-based
filter structure for speaker identification. Multiresolution capabilities
of wavelet packet transform are used to derive the new features.
The proposed scheme differs from previous wavelet based works,
mainly in designing the filter structure. Unlike others, the proposed
filter structure does not follow the Mel scale. The closed-set speaker
identification experiments performed on the TIMIT database show
improved identification performance compared to other commonly
used Mel-scale-based filter structures using wavelets.
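The admissible filter structure itself is not specified in the abstract. As a generic illustration of uniform (non-Mel) wavelet-packet subband features, the sketch below builds a full Haar wavelet-packet tree and returns log subband energies; the depth, wavelet and band layout are illustrative assumptions, not the authors' design:

```python
import numpy as np

def haar_split(x):
    """One orthonormal Haar analysis step: approximation and detail."""
    even, odd = x[0::2], x[1::2]
    return (even + odd) / np.sqrt(2.0), (even - odd) / np.sqrt(2.0)

def wavelet_packet_features(frame, depth=3):
    """Full Haar wavelet-packet tree of the given depth, returning log
    subband energies.  All 2**depth bands have equal width, i.e. the
    tree deliberately does not follow the Mel scale.  The depth and
    wavelet here are illustrative choices, not the authors' design."""
    bands = [np.asarray(frame, dtype=float)]
    for _ in range(depth):
        bands = [half for b in bands for half in haar_split(b)]
    return np.array([np.log(np.sum(b * b) + 1e-12) for b in bands])
```

Because each Haar step is orthonormal, the subband energies sum to the energy of the input frame, which makes the decomposition easy to sanity-check.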
Abstract: This paper examines the depiction of Muslim militants in Thai newspapers in 2004. Stuart Hall's "representation" and "public idioms" are used as theoretical frameworks. Critical Discourse Analysis is employed as a methodology to examine 240 news articles from two leading Thai-language newspapers. The results show that the militants are usually labeled as "southern bandits." This suggests that they are merely culprits of the violence in the deep south of Thailand. They are usually described as people who cause turbulence, and consequently the military have to get rid of them. However, other aspects of the groups, such as their political agenda or the failures of the Thai state in dealing with the Malay Muslims, were not mentioned in the news stories. In this time of violence, the researcher argues that this kind of newspaper coverage may help perpetuate this discourse about the Malay Muslims instead of providing a fuller picture of the ongoing conflicts.
Abstract: This paper presents a novel approach to finding a
priori interesting regions in mammograms. In order to delineate those
regions of interest (ROIs) in mammograms which appear to be
prominent, a topographic representation called the iso-level contour
map, consisting of iso-level contours at multiple intensity levels,
together with threshold-based region segmentation, has been proposed.
The simulation results indicate that the computed boundary achieves a
detection accuracy of 99.5%.
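The idea of an iso-level contour map can be sketched with plain multi-level thresholding: at each intensity level, keep only the boundary pixels of the super-level region. This is a minimal illustration of the concept, not the authors' algorithm:

```python
import numpy as np

def iso_level_contours(img, levels):
    """For each intensity level, threshold the image and keep only the
    boundary pixels of the super-level region, yielding a stack of
    iso-level contours approximating a topographic map of the image."""
    contours = []
    for lv in levels:
        m = img >= lv
        interior = np.zeros_like(m)
        # A pixel is interior if it and its 4 neighbours are above level.
        interior[1:-1, 1:-1] = (m[1:-1, 1:-1]
                                & m[:-2, 1:-1] & m[2:, 1:-1]
                                & m[1:-1, :-2] & m[1:-1, 2:])
        contours.append(m & ~interior)
    return contours
```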
Abstract: This paper proposes a declarative language for
knowledge representation (Ibn Rochd), and its environment of
exploitation (DeGSE). The DeGSE system was designed and
developed to facilitate writing applications in Ibn Rochd. The system
was tested on several knowledge bases of ascending complexity,
culminating in a system for the recognition of a plant or a tree, and
in advisors for purchasing a car, for pedagogical and academic guidance,
and for bank savings and credit. Finally, the limits of the language and
research perspectives are stated.
Abstract: This paper reviews various approaches that have been
used for the modeling and simulation of large-scale engineering
systems and determines their appropriateness in the development of a
RICS modeling and simulation tool. Bond graphs, linear graphs,
block diagrams, differential and difference equations, modeling
languages, cellular automata and agents are reviewed. This tool
should be based on linear graph representation and should support
symbolic programming, functional programming, the development of
non-causal models and the incorporation of decentralized approaches.
Abstract: Years of extensive research in the field of speech
processing for compression and recognition over the last five decades
have resulted in intense competition among the various methods and
paradigms introduced. In this paper we survey the different representations
of speech in the time-frequency and time-scale domains
for the purposes of compression and recognition. These representations
are examined across a variety of related work.
In particular, we emphasize methods related to Fourier analysis
paradigms and wavelet based ones along with the advantages and
disadvantages of both approaches.
Abstract: This paper presents a comparative study of the two most
popular control strategies for Permanent Magnet Synchronous Motor
(PMSM) drives: field-oriented control (FOC) and direct torque
control (DTC). The comparison is based on various criteria including
basic control characteristics, dynamic performance, and
implementation complexity. The study is done by simulation using
the Simulink Power System Blockset that allows a complete
representation of the power section (inverter and PMSM) and the
control system. The simulation and evaluation of both control
strategies are performed using actual parameters of Permanent
Magnet Synchronous Motor fed by an IGBT PWM inverter.
Abstract: In this paper we propose a method for vision systems
to consistently represent functional dependencies between different
visual routines along with relational short- and long-term knowledge
about the world. Here the visual routines are bound to visual properties
of objects stored in the memory of the system. Furthermore,
the functional dependencies between the visual routines are seen
as a graph also belonging to the object's structure. This graph is
parsed in the course of acquiring a visual property of an object to
automatically resolve the dependencies of the bound visual routines.
Using this representation, the system is able to dynamically rearrange
the processing order while keeping its functionality. Additionally, the
system is able to estimate the overall computational costs of a certain
action. We will also show that the system can efficiently use that
structure to incorporate already acquired knowledge and thus reduce
the computational demand.
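The dependency-resolution step can be illustrated with a plain topological sort over a routine graph. The routine names and costs below are hypothetical, and this is a sketch of the idea rather than the authors' system:

```python
from graphlib import TopologicalSorter

def plan(routines, goal):
    """routines maps a visual routine name to (cost, [dependencies]).
    Parse the dependency graph below the requested visual property and
    return an execution order plus the total estimated computational
    cost, mirroring the graph parsing described above."""
    needed, stack = set(), [goal]
    while stack:                      # collect the reachable subgraph
        r = stack.pop()
        if r not in needed:
            needed.add(r)
            stack.extend(routines[r][1])
    graph = {r: routines[r][1] for r in needed}
    order = list(TopologicalSorter(graph).static_order())
    return order, sum(routines[r][0] for r in order)
```

Because only the subgraph below the goal is scheduled, already-acquired properties can simply be dropped from the graph, reducing the estimated cost, in the spirit of the knowledge reuse described above.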
Abstract: A state-of-the-art Speaker Identification (SI) system requires a robust feature extraction unit followed by a speaker modeling scheme for generalized representation of these features. Over the years, Mel-Frequency Cepstral Coefficients (MFCC), modeled on the human auditory system, have been used as a standard acoustic feature set for SI applications. However, due to the structure of its filter bank, MFCC captures vocal tract characteristics more effectively in the lower frequency regions. This paper proposes a new set of features using a complementary filter bank structure which improves the distinguishability of speaker-specific cues present in the higher frequency zone. Unlike high-level features that are difficult to extract, the proposed feature set involves little computational burden during the extraction process. When combined with MFCC via a parallel implementation of speaker models, the proposed feature set outperforms baseline MFCC significantly. This proposition is validated by experiments conducted on two different kinds of public databases, namely YOHO (microphone speech) and POLYCOST (telephone speech), with Gaussian Mixture Models (GMM) as the classifier for various model orders.
Abstract: Complex engineering design problems consist of
numerous factors of varying criticality. Treating fundamental design features and minor details alike results in an extensive
waste of time and effort. Design parameters should be introduced gradually, as appropriate, based on their significance in the
problem context. This motivates the representation of design parameters at multiple levels of an abstraction hierarchy. However, developing abstraction hierarchies is an area that is not well
understood. Our research proposes a novel hierarchical abstraction methodology to plan effective engineering designs and processes. It
provides a theoretically sound foundation to represent, abstract and stratify engineering design parameters and tasks according to causality and criticality. The methodology creates abstraction
hierarchies in a recursive and bottom-up approach that guarantees no
backtracking across any of the abstraction levels. The methodology consists of three main phases, representation, abstraction, and layering to multiple hierarchical levels. The effectiveness of the
developed methodology is demonstrated by a design problem.
Abstract: In this paper, we present an analytical study of the
representation of images by the magnitudes of their transforms with
discrete wavelets. Such a representation serves as a model for
complex cells in the early stage of visual processing and is of high
technical usefulness for image understanding, because it makes the
representation insensitive to small local shifts. We found that if the
signals are band limited and of zero mean, then reconstruction from
the magnitudes is unique up to the sign for almost all signals. We
also present an iterative reconstruction algorithm which yields very
good reconstruction, up to the sign and minor numerical errors in the
very low frequencies.
Abstract: A feed-forward, back-propagation Artificial Neural
Network (ANN) model has been used to forecast the occurrences of
wastewater overflows in a combined sewerage reticulation system.
This approach was tested to evaluate its applicability as an
alternative to the common practice of developing a complete
conceptual, mathematical hydrological-hydraulic model of the
sewerage system to enable such forecasts. The ANN approach
obviates the need for a priori understanding and representation of the
underlying hydrological-hydraulic phenomena in mathematical terms
but enables learning the characteristics of a sewer overflow from the
historical data.
The performance of the standard feed-forward, back-propagation
of error algorithm was enhanced by a modified data normalizing
technique that enabled the ANN model to extrapolate into the
territory that was unseen by the training data. The algorithm and the
data normalizing method are presented along with the ANN model
output results that indicate a good accuracy in the forecasted sewer
overflow rates. However, it was revealed that accurate
forecasting of the overflow rates is heavily dependent on the
availability of real-time flow monitoring at the overflow structure
to provide antecedent flow rate data. The ability of the ANN to
forecast the overflow rates without the antecedent flow rates (as is
the case with traditional conceptual reticulation models) was found to
be quite poor.
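The modified data normalizing technique is not detailed in the abstract. One common way to let a bounded-output network extrapolate is to map the training range onto an inner sub-interval such as [0.2, 0.8] rather than [0, 1], leaving headroom for values outside the training range; the sketch below illustrates that generic idea under that assumption:

```python
def make_normalizer(train_values, lo=0.2, hi=0.8):
    """Map the training range onto [lo, hi] instead of [0, 1], so the
    de-normalization can still recover targets that fall outside the
    range seen during training.  A generic sketch of the idea only;
    the paper's exact normalizing technique is not specified here."""
    vmin, vmax = min(train_values), max(train_values)
    span = (vmax - vmin) / (hi - lo) if vmax > vmin else 1.0
    def norm(v):
        return lo + (v - vmin) / span
    def denorm(u):
        return vmin + (u - lo) * span
    return norm, denorm
```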
Abstract: Apart from geometry, functionality is one of the most
significant hallmarks of a product. The functionality of a product can
be considered the fundamental justification for the product's
existence. Therefore a functional analysis including a complete and
reliable descriptor has a high potential to improve product
development process in various fields especially in knowledge-based
design. One of the important applications of the functional analysis
and indexing is in retrieval and design reuse concept. More than 75%
of design activity for a new product development contains reusing
earlier and existing design know-how. Thus, the analysis and
categorization of product functions, concluded by functional
indexing, directly influences design optimization. This paper
elucidates and evaluates the major classes of functional analysis by
discussing their major methods. Finally, it presents a novel hybrid
approach for functional analysis.
Abstract: Text categorization is the problem of classifying text documents into a set of predefined classes. After a preprocessing step the documents are typically represented as large sparse vectors. When training classifiers on large collections of documents, both the time and memory restrictions can be quite prohibitive. This justifies the application of feature selection methods to reduce the dimensionality of the document-representation vector. Four feature selection methods are evaluated: Random Selection, Information Gain (IG), Support Vector Machine (called SVM_FS) and Genetic Algorithm with SVM (GA_FS). We show that the best results were obtained with the SVM_FS and GA_FS methods for a relatively small dimension of the feature vector, compared with the IG method, which involves longer vectors for quite similar classification accuracies. We also present a novel method to better correlate SVM kernel parameters (polynomial or Gaussian kernel).
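Of the four methods, Information Gain is the simplest to state precisely. The sketch below computes the IG of a single binary term-presence feature with respect to class labels; it illustrates the standard definition rather than the authors' implementation:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def information_gain(has_term, labels):
    """IG of one binary term-presence feature with respect to the class
    labels: H(C) - sum over v of p(v) * H(C | feature = v).  For feature
    selection, the terms with the highest IG are kept."""
    n = len(labels)
    ig = entropy(labels)
    for v in (True, False):
        subset = [c for h, c in zip(has_term, labels) if h == v]
        if subset:
            ig -= len(subset) / n * entropy(subset)
    return ig
```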