Abstract: In recent years, rapid advances in information-technology software and hardware, along with the digital imaging revolution in the medical domain, have facilitated the generation and storage of large collections of images by hospitals and clinics. Searching these large image collections effectively and efficiently poses significant technical challenges and raises the need for intelligent retrieval systems. Content-Based Image Retrieval (CBIR) consists of retrieving the most visually similar images to a given query image from a database of images [5]. Medical CBIR applications pose unique challenges but at the same time offer many new opportunities. On the one hand, while one can easily understand news or sports videos, a medical image is often completely incomprehensible to untrained eyes.
Abstract: A new paradigm for software design and development models software by its business process, translates the model into a process execution language, and has it run by a supporting execution engine. This process-oriented paradigm promotes modeling of software by less technical users or business analysts as well as rapid development. Since business process models may be shared by different organizations and sometimes even by different business domains, it is interesting to apply a technique used in traditional software component technology to design reusable business processes. This paper discusses an approach that applies a technique for software component fabrication to the design of process-oriented software units, called process components. These process components result from decomposing a business process of a particular application domain into subprocesses, with the aim that the process components can be reused in different process-based software models. The approach is quantitative because the quality of a process component design is measured from technical features of the process components. The approach is also strategic because the measured quality is assessed against business-oriented component management goals. A software tool has been developed to measure how good a process component design is, according to the required managerial goals and compared to other designs. We also discuss how we benefit from reusable process components.
Abstract: With the growth of electricity generation from gas
energy, gas pipeline reliability can substantially impact electric
generation. A physical disruption to a pipeline or to a compressor
station can interrupt the flow of gas or reduce its pressure and
lead to the loss of multiple gas-fired electric generators, which
could dramatically reduce the supplied power and threaten power
system security. A gas pressure drop during peak loading time on a
pipeline system is a common problem in networks with insufficient
transportation capacity; it limits gas transportation and causes
many problems for thermal power systems in supplying their demand.
For feasible generation scheduling in networks without sufficient
gas transportation capacity, it is necessary to consider gas
pipeline constraints in the optimization problem and to evaluate
the impact of power-plant gas consumption on pipeline operating
conditions. This paper studies the operation of gas-fired power
plants in critical conditions when the demands for gas and
electricity peak together. An integrated gas-electric model is
used to incorporate the gas pipeline constraints into the economic
dispatch problem for gas-fueled thermal generating units.
Abstract: With the rapid growth in business size, today's businesses are orienting towards electronic technologies. Amazon.com and eBay.com are some of the major stakeholders in this regard. Unfortunately, the enormous size and hugely unstructured nature of data on the web, even for a single commodity, has become a cause of ambiguity for consumers. Extracting valuable information from such ever-increasing data is an extremely tedious task and is fast becoming critical to the success of businesses. Web content mining can play a major role in solving these issues. It involves using efficient algorithmic techniques to search and retrieve the desired information from seemingly impossible-to-search unstructured data on the Internet. Application of web content mining can be very encouraging in the areas of Customer Relations Modeling, billing records, logistics investigations, product cataloguing and quality management. In this paper we present a review of some very interesting, efficient yet implementable techniques from the field of web content mining and study their impact on business user needs, focusing on both the customer and the producer. The techniques we review include mining by developing a knowledge-base repository of the domain, iterative refinement of user queries for personalized search, using a graph-based approach for the development of a web crawler, and filtering information for personalized search using website captions. These techniques have been analyzed and compared on the basis of their execution time and the relevance of the results they produced against a particular search.
Abstract: In recent years, tuned mass damper (TMD) control systems for civil engineering structures have attracted considerable attention. This paper focuses on the application of particle swarm optimization (PSO) to design and optimize the parameters of the TMD control scheme for achieving the best results in the reduction of the building response under earthquake excitations. The Integral of the Time multiplied Absolute value of the Error (ITAE), based on the relative displacement of all floors in the building, is taken as the performance index of the optimization criterion. The problem of robust TMD controller design is formulated as an optimization problem based on the ITAE performance index, to be solved using the PSO technique, which has a strong ability to find near-optimal solutions. An 11-story realistic building, located in the city of Rasht, Iran, is considered as a test system to demonstrate the effectiveness of the proposed method. Analysis of the results through time-domain simulation and several performance indices reveals that the designed PSO-based TMD controller has an excellent capability in reducing the response of the seismically excited example building.
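The ITAE criterion used above can be sketched as follows. This is a minimal illustration of the index itself, not the authors' optimization code; summing the floor displacements into a single error signal is an assumption made here for compactness.

```python
import numpy as np

def itae(t, displacements):
    """Integral of Time multiplied Absolute Error: J = integral of t*|e(t)| dt,
    approximated with the trapezoidal rule. `displacements` holds the relative
    displacement of each floor over time, shape (n_samples, n_floors); folding
    all floors into one error signal is an illustrative assumption."""
    e = np.abs(displacements).sum(axis=1)
    f = t * e
    return float(np.sum((f[1:] + f[:-1]) * np.diff(t)) / 2.0)

# a well-damped response scores lower (better) than a sustained oscillation
t = np.linspace(0.0, 10.0, 1001)
damped = np.exp(-0.5 * t) * np.sin(2 * np.pi * t)
sustained = np.sin(2 * np.pi * t)
assert itae(t, damped[:, None]) < itae(t, sustained[:, None])
```

In a PSO loop, each candidate TMD parameter set would be simulated and this scalar J used as the particle's fitness, with the time weighting penalizing errors that persist late in the response.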
Abstract: The literature reports a large number of approaches for
measuring the similarity between protein sequences. Most of these
approaches estimate this similarity using alignment-based techniques
that do not necessarily yield biologically plausible results, for two
reasons.
First, for the case of non-alignable (i.e., not yet definitively aligned
and biologically approved) sequences such as multi-domain, circular
permutation and tandem repeat protein sequences, alignment-based
approaches do not succeed in producing biologically plausible results.
This is due to the nature of the alignment, which is based on the
matching of subsequences in equivalent positions, while non-alignable
proteins often have similar and conserved domains in non-equivalent
positions.
Second, the alignment-based approaches lead to similarity measures
that depend heavily on the parameters set by the user for the alignment
(e.g., gap penalties and substitution matrices). For easily alignable
protein sequences, it is possible to supply a suitable combination of
input parameters that allows such an approach to yield biologically
plausible results. However, for difficult-to-align protein sequences,
supplying different combinations of input parameters yields different
results. Such variable results create ambiguities and complicate the
similarity measurement task.
To overcome these drawbacks, this paper describes a novel and
effective approach for measuring the similarity between protein
sequences, called SAF for Substitution and Alignment Free. Without
resorting either to the alignment of protein sequences or to substitution
relations between amino acids, SAF is able to efficiently detect the
significant subsequences that best represent the intrinsic properties of
protein sequences, those underlying the chronological dependencies of
structural features and biochemical activities of protein sequences.
Moreover, by using a new efficient subsequence matching scheme,
SAF more efficiently handles protein sequences that contain similar
structural features with significant meaning in chronologically
non-equivalent positions. To show the effectiveness of SAF, extensive
experiments were performed on protein datasets from different
databases, and the results were compared with those obtained by
several mainstream algorithms.
Abstract: The MATCH project [1] entails the development of an
automatic diagnosis system that aims to support the treatment of
colon cancer by discovering mutations that occur in tumour
suppressor genes (TSGs) and contribute to the development of
cancerous tumours. The system is based on a) colon cancer clinical
data and b) biological information derived by data mining
techniques from genomic and proteomic sources. The core mining
module will consist of popular, well-tested hybrid feature
extraction methods and new combined algorithms designed especially
for the project. Elements of rough sets, evolutionary computing,
cluster analysis, self-organizing maps and association rules will
be used to discover the associations between genes and their
influence on tumours [2]-[11].
The methods used to process the data have to address its high
complexity, potential inconsistency and the problem of missing
values. They must integrate all the useful information necessary
to answer the expert's question. For this purpose, the system has
to learn from data, or allow a domain specialist to interactively
specify, the part of the knowledge structure it needs to answer a
given query. The program should also take into account the
importance/rank of the particular parts of the data it analyses,
and adjust the algorithms used accordingly.
Abstract: The paper deals with the estimation of the amplitudes and phases of an analogue multi-harmonic band-limited signal from irregularly spaced sampling values. To this end, assuming the signal's fundamental frequency is known in advance (i.e., estimated at an independent stage), a complexity-reduced algorithm for signal reconstruction in the time domain is proposed. The reduction in complexity is achieved owing to completely new analytical and summarized expressions that enable quick estimation at a low numerical error. The proposed algorithm for the calculation of the unknown parameters requires O((2M+1)^2) flops, while the straightforward solution of the obtained equations takes O((2M+1)^3) flops (M is the number of harmonic components). It can be applied in signal reconstruction, spectral estimation, system identification, and other important signal processing problems. The proposed method can also be used for precise RMS measurements (for power and energy) of a periodic signal based on the presented signal reconstruction. The paper investigates the errors related to the signal parameter estimation, and computer simulations demonstrate the accuracy of the algorithm.
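The underlying estimation problem can be illustrated with a plain least-squares solve of the same (2M+1)-parameter system. This sketch takes the straightforward O((2M+1)^3) route that the paper's algorithm improves upon; the function name and interface are illustrative, not the paper's.

```python
import numpy as np

def estimate_harmonics(t, x, f0, M):
    """Least-squares estimate of the DC level and the amplitude/phase of
    M harmonics of a known fundamental f0, from irregularly spaced
    samples (t, x). Model: x(t) = c0 + sum_m a_m*cos(m*w0*t) + b_m*sin(m*w0*t),
    so amplitude A_m = hypot(a_m, b_m) and phase phi_m = atan2(-b_m, a_m)."""
    cols = [np.ones_like(t)]
    for m in range(1, M + 1):
        w = 2 * np.pi * m * f0 * t
        cols += [np.cos(w), np.sin(w)]
    A = np.column_stack(cols)                  # (N, 2M+1) design matrix
    p, *_ = np.linalg.lstsq(A, x, rcond=None)  # direct solve, O((2M+1)^3)
    amps = [np.hypot(p[2*m - 1], p[2*m]) for m in range(1, M + 1)]
    phases = [np.arctan2(-p[2*m], p[2*m - 1]) for m in range(1, M + 1)]
    return p[0], amps, phases
```

With noiseless synthetic data and irregular sampling times, the amplitudes and phases are recovered to numerical precision, which is the baseline the complexity-reduced algorithm matches at lower cost.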
Abstract: Information sharing and exchange, rather than
information processing, is what characterizes information
technology in the 21st century. Ontologies, as shared common
understanding, gain increasing attention, as they appear as the
most promising solution to enable information sharing both at
a semantic level and in a machine-processable way. Domain
Ontology-based modeling has been exploited to provide
shareability and information exchange among diversified,
heterogeneous applications of enterprises.
Contextual ontologies are "an explicit specification of
contextual conceptualization". That is, an ontology is
characterized by concepts that have multiple representations
and may exist in several contexts. Hence, contextual
ontologies are a set of concepts and relationships which are
seen from different perspectives. Contextualization allows
ontologies to be partitioned according to their contexts.
The need for contextual ontologies in enterprise modeling
has become crucial due to the nature of today's competitive
market. Information resources in enterprises are distributed
and diversified, and need to be shared and communicated
locally through the intranet and globally through the internet.
This paper discusses the roles that ontologies play in
enterprise modeling, and how ontologies assist in building a
conceptual model in order to provide communicative and
interoperable information systems. The issue of enterprise
modeling based on contextual domain ontology is also
investigated, and a framework is proposed for an enterprise
model that consists of various applications.
Abstract: High-quality requirements analysis is one of the most
crucial activities to ensure the success of a software project, so
requirements verification for software systems becomes more and
more important in Requirements Engineering (RE), and it is one of
the most helpful strategies for improving the quality of a
software system. Related work shows that requirements elicitation
and analysis can be facilitated by ontological approaches and
semantic web technologies.
In this paper, we propose a hybrid method which aims to verify
requirements with structural and formal semantics to detect
interactions. The proposed method is twofold: one part models
requirements with the semantic web language OWL to construct a
semantic context; the other is a set of interaction detection
rules which are derived from scenario-based analysis and
represented in the Semantic Web Rule Language (SWRL). The
SWRL-based rules work with rule engines such as Jess to reason
over the semantic context for the requirements and thus detect
interactions. The benefits of the proposed method lie in three
aspects: the method (i) provides systematic steps for modeling
requirements with an ontological approach, (ii) offers a synergy
of requirements elicitation and domain engineering for knowledge
sharing, and (iii) its rules can systematically assist in
requirements interaction detection.
Abstract: Recently, neural networks have shown good
results for detection of a certain pattern in a given image. In
our previous papers [1-5], a fast algorithm for pattern
detection using neural networks was presented. This
algorithm was designed based on cross correlation in the
frequency domain between the input image and the weights
of neural networks. Image conversion into symmetric shape
was established so that fast neural networks can give the
same results as conventional neural networks. Another
configuration of symmetry was suggested in [3,4] to improve
the speed-up ratio. In this paper, our previous algorithm for
fast neural networks is further developed. The frequency-domain
cross correlation is modified to compensate for the symmetry
condition required of the input image. Two new ideas are
introduced to modify the cross correlation algorithm. Both
methods accelerate the fast neural networks because there is no
need to convert the input image into a symmetric one as before.
Theoretical and practical results show that both approaches
provide a higher speed-up ratio than the previous algorithm.
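The core speed-up idea, computing cross-correlation by pointwise multiplication in the frequency domain instead of sliding a window in space, can be sketched as below. This is a generic illustration of the frequency-domain trick, not the authors' fast-neural-network formulation, and the array sizes are arbitrary.

```python
import numpy as np

def xcorr_fft(image, template):
    """Circular cross-correlation of `template` with `image`, computed in
    the frequency domain: correlation corresponds to multiplication by
    the conjugate spectrum, costing O(N log N) instead of O(N^2)."""
    H, W = image.shape
    F = np.fft.fft2(image)
    T = np.fft.fft2(template, s=(H, W))  # zero-pad template to image size
    return np.real(np.fft.ifft2(F * np.conj(T)))

# the correlation surface peaks where the template aligns with the image
rng = np.random.default_rng(1)
img = rng.standard_normal((64, 64))
patch = img[20:36, 28:44].copy()         # a 16x16 sub-image as the pattern
score = xcorr_fft(img, patch)
assert np.unravel_index(np.argmax(score), score.shape) == (20, 28)
```

In the fast-neural-network setting, the template role is played by the neuron weights, so one FFT of the image is shared across all neurons.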
Abstract: The purpose of this work is fast design optimization of
the seal chamber. The study includes the mass transfer between the
lower and upper chambers of the seal chamber for hot-water
application pumps.
The use of the Fluent 12.1 commercial code made it possible to
capture complex flow with heat and mass transfer, radiation,
Taylor instability, and buoyancy effects. The realizable k-epsilon
model was used for turbulence modeling, and radiation heat losses
were taken into account. The temperature distribution in the seal
region is predicted with respect to heat addition.
Results show that the model can be simplified by excluding the
water domain in the lower chamber from the calculations. CFD
simulations make it possible to improve the seal chamber design to
meet the target water temperature around the seal. This study can
be used for the analysis of different seal chamber configurations.
Abstract: In this paper, we have developed a method to
compute fractal dimension (FD) of discrete time signals, in the
time domain, by modifying the box-counting method. The size
of the box is dependent on the sampling frequency of the
signal. The number of boxes required to completely cover the
signal are obtained at multiple time resolutions. The time
resolutions are made coarse by decimating the signal. The loglog
plot of total number of boxes required to cover the curve
versus size of the box used appears to be a straight line, whose
slope is taken as an estimate of FD of the signal. The results
are provided to demonstrate the performance of the proposed
method using parametric fractal signals. The estimation
accuracy of the method is compared with that of Katz, Sevcik,
and Higuchi methods. In addition, some properties of the FD
are discussed.
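A minimal sketch of such a decimation-based box-counting estimator is given below. The exact box-size and counting rules here are illustrative assumptions in the spirit of the abstract, not the paper's definitive formulation.

```python
import numpy as np

def boxcount_fd(x, fs, levels=5):
    """Box-counting fractal dimension of a sampled signal: the box width
    is tied to the sampling period 1/fs, coarser time resolutions are
    produced by decimation, and the FD estimate is the negated slope of
    the log-log line of box count versus box size."""
    sizes, counts = [], []
    for k in range(levels):
        step = 2 ** k                     # decimation factor
        y = x[::step]                     # coarser time resolution
        box = step / fs                   # box side at this resolution
        spans = np.abs(np.diff(y))        # vertical extent per column
        n = np.sum(np.maximum(np.ceil(spans / box), 1.0))
        sizes.append(box)
        counts.append(n)
    slope = np.polyfit(np.log(sizes), np.log(counts), 1)[0]
    return -slope
```

On a smooth ramp the estimate approaches 1, while on white noise it approaches 2, bracketing the range expected of a curve's fractal dimension.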
Abstract: Hierarchical high-level Petri nets (HHPNs) and their
timed versions are a useful tool to model systems in a variety of
application domains, ranging from logistics to complex workflows.
This paper addresses an application domain which is receiving more
and more attention: the procedure that arranges the final
inpatient charge in the payment office, and its management. We
shall show that Petri-net-based analysis is able to reduce the
delays during the procedure, so that inpatient charges can be more
reliable and on time.
Abstract: Increasing the detection rate and reducing the false
positive rate are important problems in Intrusion Detection
Systems (IDS). Although preventative techniques such as access
control and authentication attempt to keep intruders out, these
can fail, and intrusion detection has been introduced as a second
line of defence. Rare events are events that occur very
infrequently, and their detection is a common problem in many
domains. In this paper we propose an intrusion detection method
that combines rough sets and fuzzy clustering. Rough set theory is
used to decrease the amount of data and get rid of redundancy.
Fuzzy c-means clustering allows objects to belong to several
clusters simultaneously, with different degrees of membership. Our
approach allows us to recognize not only known attacks but also to
detect suspicious activity that may be the result of a new,
unknown attack. The experimental results on the Knowledge
Discovery and Data Mining (KDD Cup 1999) dataset show that the
method is efficient and practical for intrusion detection systems.
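The fuzzy c-means step described above can be sketched as follows. This is the standard FCM update (memberships from inverse-distance ratios, centers as membership-weighted means), shown in isolation, not the paper's full rough-set pipeline.

```python
import numpy as np

def fuzzy_cmeans(X, c, m=2.0, iters=100, seed=0):
    """Standard fuzzy c-means: each point belongs to all c clusters with
    memberships in [0, 1] summing to 1. u_ik is proportional to
    d_ik^(-2/(m-1)); centers are membership-weighted means of the data."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))   # random memberships, rows sum to 1
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        d = np.maximum(d, 1e-12)                 # guard exact-hit distances
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U
```

On two well-separated blobs the memberships become nearly crisp, while points lying between clusters keep split memberships, which is the property used above to flag suspicious, in-between activity rather than forcing a hard label.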
Abstract: A time-domain numerical model within the
framework of transmission line modeling (TLM) is developed to
simulate electromagnetic pulse propagation inside multiple
microcavities forming photonic crystal (PhC) structures. The model
developed is quite general and is capable of simulating complex
electromagnetic problems accurately. The field quantities can be
mapped onto a passive electrical circuit equivalent, which ensures
that TLM is provably stable and conservative at a local level.
Furthermore, the circuit representation allows a high level of
hybridization of TLM with other techniques and with lumped circuit
models of components and devices. A photonic crystal structure
formed by rods (or blocks) of high-permittivity dielectric
material embedded in a low-permittivity background medium is
simulated as an
example. The model developed gives vital spatio-temporal
information about the signal, and also gives spectral information over
a wide frequency range in a single run. The model has wide
applications in microwave communication systems, optical
waveguides and electromagnetic materials simulations.
Abstract: This paper presents optimization-based damping controllers for the Unified Power Flow Controller (UPFC) to improve the damping of power system oscillations. The design problem of the UPFC damping controller and system configuration is formulated as an optimization with a time-domain-based objective function, solved by means of the Adaptive Tabu Search (ATS) technique. The UPFC is installed in a Single Machine Infinite Bus (SMIB) system for performance analysis of the power system and simulated using MATLAB's Simulink. The simulation results of these studies show that the designed controller has a tremendous capability in damping power system oscillations.
Abstract: The optimal grid spacing and turbulence model for the
2D numerical analysis of a vertical-axis water turbine (VAWaterT)
operating in a 2 m/s freestream current has been investigated. The
results of five different spatial domain discretizations and two
turbulence models (k-ω SST and k-ε RNG) have been compared, in
order to obtain the optimal y+ parameter distribution along the
blade walls during a full rotor revolution. The resulting optimal
mesh appears to be quite similar to that obtained for the
numerical analysis of a vertical-axis wind turbine.
Abstract: This paper details the application of a genetic
programming framework for induction of useful classification rules
from a database of income statements, balance sheets, and cash flow
statements for North American public companies. Potentially
interesting classification rules are discovered. Anomalies in the
discovery process merit further investigation of the application of
genetic programming to the dataset for the problem domain.
Abstract: Background noise is particularly damaging to speech
intelligibility for people with hearing loss, especially
sensorineural loss patients. Several investigations of speech
intelligibility have demonstrated that sensorineural loss patients
need a 5-15 dB higher SNR than normal-hearing subjects. This paper
describes a Discrete Cosine Transform Power Normalized Least Mean
Square (DCT-LMS) algorithm to improve the SNR and speed up the
convergence of the LMS for sensorineural loss patients. Since it
requires only real arithmetic, it achieves a faster convergence
rate than time-domain LMS, and the transformation improves the
eigenvalue distribution of the input autocorrelation matrix of the
LMS filter. The DCT has good orthonormal, separable, and energy
compaction properties. Although the DCT does not separate
frequencies, it is a powerful signal decorrelator. It is a
real-valued transform and thus can be used effectively in
real-time operation. The advantages of DCT-LMS over the standard
LMS algorithm are shown via SNR and eigenvalue ratio
computations. Exploiting the symmetry
of the basis functions, the DCT transform matrix [AN] can be
factored into a series of ±1 butterflies and rotation angles. This
factorization results in one of the fastest DCT implementations.
There are different ways to obtain such factorizations; this work
uses the fast factored DCT algorithm developed by Chen et al. The
computer simulation results show the superior convergence
characteristics of the proposed algorithm: the SNR is improved by
at least 10 dB for input SNRs at or below 0 dB, with faster
convergence speed and better time and frequency characteristics.
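A minimal transform-domain LMS sketch in this spirit is shown below. It builds the orthonormal DCT-II matrix directly rather than using the fast factored Chen DCT the paper employs, and the step sizes and the system-identification test setup are illustrative assumptions.

```python
import numpy as np

def dct_matrix(N):
    """Orthonormal DCT-II matrix, used here as the decorrelating transform."""
    n = np.arange(N)
    C = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * n[None, :] + 1) * n[:, None] / (2 * N))
    C[0] /= np.sqrt(2.0)
    return C

def dct_lms(x, d, N=8, mu=0.1, beta=0.9, eps=1e-6):
    """Transform-domain LMS: rotate each tap vector by the DCT, then
    normalize the step size per bin by a running power estimate, which
    evens out the eigenvalue spread for colored (correlated) input."""
    C = dct_matrix(N)
    w = np.zeros(N)
    p = np.ones(N)                         # per-bin power estimates
    errs = np.zeros(len(x))
    for k in range(N - 1, len(x)):
        v = C @ x[k - N + 1:k + 1][::-1]   # DCT of the current tap window
        p = beta * p + (1 - beta) * v * v  # track power in each bin
        e = d[k] - w @ v
        w += mu * e * v / (p + eps)        # power-normalized update
        errs[k] = e
    return w, errs
```

Driving the filter with correlated AR(1) input while identifying a known 8-tap system, the a priori error decays quickly despite the input's large eigenvalue spread, which is the behavior the abstract attributes to the DCT's decorrelating action.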