Abstract: Combinatorial optimization problems arise in many scientific and practical applications, so many researchers seek new or improved methods that solve these problems with high-quality results in less time. Genetic Algorithms (GA) and Simulated Annealing (SA) have both been used to solve optimization problems. Both GA and SA search a solution space through a sequence of iterative states, but there are significant differences between them: the GA operates in parallel on a set of solutions and exchanges information using the crossover operation, whereas SA works on a single solution at a time. In this work SA and GA are combined using a new technique in order to overcome the disadvantages of both algorithms.
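The abstract does not specify how the two methods are combined; one common hybrid, sketched below purely as an illustration, runs the GA's crossover/mutation loop over a population but accepts each offspring with SA's Metropolis criterion under a cooling temperature. The one-max objective and all parameter values are illustrative assumptions, not the paper's method.

```python
import math
import random

def fitness(bits):
    # Toy objective: maximize the number of 1-bits (one-max).
    return sum(bits)

def crossover(a, b):
    # Single-point crossover: the GA's mechanism for exchanging information.
    p = random.randrange(1, len(a))
    return a[:p] + b[p:]

def mutate(bits, rate=0.05):
    return [1 - b if random.random() < rate else b for b in bits]

def hybrid_ga_sa(n_bits=30, pop_size=20, generations=100, t0=2.0, cooling=0.95):
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    temp = t0
    for _ in range(generations):
        new_pop = []
        for parent in pop:
            child = mutate(crossover(parent, random.choice(pop)))
            delta = fitness(child) - fitness(parent)
            # SA-style Metropolis acceptance: always keep improvements,
            # occasionally keep worse children while the temperature is high.
            if delta >= 0 or random.random() < math.exp(delta / temp):
                new_pop.append(child)
            else:
                new_pop.append(parent)
        pop = new_pop
        temp = max(temp * cooling, 1e-6)
    return max(pop, key=fitness)

random.seed(0)
best = hybrid_ga_sa()
```

The Metropolis step lets the population escape local optima early on, while the cooling schedule makes the search increasingly greedy, combining the GA's parallel exploration with SA's controlled acceptance of worse moves.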
Abstract: A hybrid-feature-based adaptive particle filter algorithm is presented for object tracking in real scenarios with a static camera.
The hybrid feature combines two effective features: the Grayscale Arranging Pairs (GAP) feature and the color histogram feature. The GAP feature has high discriminative ability even under severe illumination variation and dynamic background
elements, while the color histogram feature is highly reliable for identifying detected objects. The combination of the two features compensates for the shortcomings of either feature alone. Furthermore, we adopt a target-model
updating scheme so that external problems such as changes in visual angle can be overcome. An automatic initialization algorithm is introduced which provides precise initial positions of objects. The experimental
results show the good performance of the proposed method.
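As an illustration of the underlying mechanism only, a minimal bootstrap particle filter for a 1-D position is sketched below; a simple Gaussian observation likelihood stands in for the paper's hybrid GAP/color-histogram weighting, and the motion model and all parameters are assumptions.

```python
import math
import random

def particle_filter_track(observations, n_particles=300, motion_std=1.0, obs_std=2.0):
    """Minimal bootstrap particle filter tracking a 1-D position.
    In the paper the weight would come from the hybrid GAP/color-histogram
    likelihood; here a Gaussian observation likelihood stands in for it."""
    random.seed(0)
    particles = [observations[0] + random.gauss(0, obs_std) for _ in range(n_particles)]
    estimates = []
    for z in observations:
        # 1. Predict: propagate each particle through a random-walk motion model.
        particles = [p + random.gauss(0, motion_std) for p in particles]
        # 2. Weight: likelihood of the observation given each particle.
        weights = [math.exp(-(p - z) ** 2 / (2 * obs_std ** 2)) for p in particles]
        total = sum(weights) or 1.0
        weights = [w / total for w in weights]
        # State estimate = weighted mean of the particle set.
        estimates.append(sum(p * w for p, w in zip(particles, weights)))
        # 3. Resample: draw a new particle set proportional to the weights.
        particles = random.choices(particles, weights=weights, k=n_particles)
    return estimates
```

Swapping the Gaussian weighting for a feature-based likelihood (and the scalar state for a bounding-box state) turns this skeleton into a visual tracker of the kind the abstract describes.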
Abstract: While the form of crises may change, their essence
remains the same (such as a cycle of abundant liquidity, rapid credit
growth, and a low-inflation environment followed by an asset-price
bubble). The current market turbulence began in the mid-2000s when
the US economy shifted to imbalanced internal and external
macroeconomic positions. We see two key causes of these problems:
loose US monetary policy in the early 2000s and US government
guarantees on securities issued by government-sponsored
enterprises, further fueled by financial innovations such as
structured credit products. We have identified both negative and
positive lessons deriving from this crisis and divided the negative
lessons into three groups: financial products and valuation, processes
and business models, and strategic issues. Moreover, we address key
risk management lessons and exit strategies derived from the current
crisis and recommend policies that should help diminish the negative
impact of future potential crises.
Abstract: The MATCH project [1] entails the development of an
automatic diagnosis system that aims to support the treatment of
colon cancer by discovering mutations that occur in tumour
suppressor genes (TSGs) and contribute to the development of
cancerous tumours. The system is based on a) colon cancer clinical
data and b) biological information derived by data mining
techniques from genomic and proteomic sources. The core mining
module will consist of popular, well-tested hybrid feature
extraction methods and new combined algorithms designed especially
for the project. Elements of rough sets, evolutionary computing,
cluster analysis, self-organizing maps and association rules will
be used to discover associations between genes and their influence
on tumours [2]-[11].
The methods used to process the data have to address its high
complexity, potential inconsistency and the problem of missing
values. They must integrate all the useful information necessary to
answer the expert's question. For this purpose, the system has to
learn from data, or allow a domain specialist to interactively
specify, the part of the knowledge structure it needs to answer a
given query. The program should also take into account the
importance/rank of the particular parts of the data it analyses,
and adjust the algorithms used accordingly.
Abstract: Literature reveals that many investors rely on technical trading rules when making investment decisions. If stock markets are efficient, one cannot achieve superior results by using these trading rules. However, if market inefficiencies are present, profitable opportunities may arise. The aim of this study is to investigate the effectiveness of technical trading rules in 34 emerging stock markets. The performance of the rules is evaluated by utilizing White's Reality Check and the Superior Predictive Ability test of Hansen, along with an adjustment for transaction costs. These tests are able to evaluate whether the best model performs better than a buy-and-hold benchmark. Further, they provide an answer to data snooping problems, which is essential to obtain unbiased outcomes. Based on our results we conclude that technical trading rules are not able to outperform a naïve buy-and-hold benchmark on a consistent basis. However, we do find significant trading rule profits in 4 of the 34 investigated markets. We also present evidence that technical analysis is more profitable in crisis situations. Nevertheless, this result is relatively weak.
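The rule universe the study tests is not listed in the abstract; a moving-average crossover is one of the most common technical trading rules, and a sketch of it against a buy-and-hold benchmark, with a proportional transaction cost charged on each switch, might look as follows (window lengths and cost level are illustrative assumptions):

```python
def sma(prices, window):
    # Simple moving average; None until enough history is available.
    out = []
    for i in range(len(prices)):
        if i + 1 < window:
            out.append(None)
        else:
            out.append(sum(prices[i + 1 - window:i + 1]) / window)
    return out

def ma_crossover_return(prices, short=5, long=20, cost=0.001):
    # Be long when the short SMA is above the long SMA, flat otherwise;
    # a proportional transaction cost is paid on every position switch.
    s, l = sma(prices, short), sma(prices, long)
    wealth, in_market = 1.0, False
    for t in range(1, len(prices)):
        if s[t - 1] is None or l[t - 1] is None:
            continue
        signal = s[t - 1] > l[t - 1]
        if signal != in_market:
            wealth *= 1 - cost
            in_market = signal
        if in_market:
            wealth *= prices[t] / prices[t - 1]
    return wealth - 1.0

def buy_and_hold_return(prices):
    return prices[-1] / prices[0] - 1.0
```

Comparing the best such rule against `buy_and_hold_return` over many candidate windows is exactly the setting where the Reality Check / SPA tests are needed, since picking the best rule in-sample invites data snooping.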
Abstract: None of the process models in software development
addresses software performance evaluation and modeling; moreover,
uncertainty exists in information systems because of the inherent
nature of requirements, and this may cause further challenges in
the software development process. By defining an extended version
of UML (Fuzzy-UML), functional requirements of software that are
specified under uncertainty can be supported. In this study, the
behavioral description of uncertain information systems by means of
fuzzy-state diagrams is central; moreover, the role of behavioral
diagrams in F-UML is investigated in the software performance
modeling process. To this end, a fuzzy sub-profile is used.
Abstract: This paper presents an exact analytical model for
optimizing stability of thin-walled, composite, functionally graded
pipes conveying fluid. The critical flow velocity at which divergence
occurs is maximized for a specified total structural mass in order to
ensure the economic feasibility of the attained optimum designs. The
composition of the material of construction is optimized by defining
the spatial distribution of volume fractions of the material
constituents using piecewise variations along the pipe length. The
major aim is to tailor the material distribution in the axial direction so
as to avoid the occurrence of divergence instability without the
penalty of increasing structural mass. Three types of boundary
conditions have been examined; namely, Hinged-Hinged, Clamped-
Hinged and Clamped-Clamped pipelines. The resulting optimization
problem has been formulated as a nonlinear mathematical
programming problem solved by invoking the MATLAB optimization
toolbox routines, which implement the constrained function
minimization routine “fmincon” interacting with the
associated eigenvalue problem routines. In fact, the proposed
mathematical models have succeeded in maximizing the critical flow
velocity without mass penalty and producing efficient and economic
designs having enhanced stability characteristics as compared with
the baseline designs.
Abstract: This paper develops a fuzzy net present value (FNPV) method that takes vague cash flows and an imprecise required rate of return into account for evaluating the value of Build-Operate-Transfer (BOT) sport facilities. In order to clearly manifest a more realistic capital budgeting model based on the classical net present value (NPV) method, some uncertain financial elements in the NPV formula are fuzzified as triangular fuzzy numbers. Through careful manipulation of fuzzy set theory, we show that the proposed FNPV model is an explicit extension of the classical (crisp) model and can be more practicable than a non-fuzzy model for financial managers seeking to capture the essence of capital budgeting for sport facilities.
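A minimal sketch of a fuzzy NPV of this kind, assuming triangular fuzzy numbers (TFNs) written as (lower, modal, upper) and standard approximate TFN arithmetic (this is an illustration, not the paper's exact formulation):

```python
def tfn_add(x, y):
    # Component-wise addition of triangular fuzzy numbers (a, b, c).
    return tuple(xi + yi for xi, yi in zip(x, y))

def tfn_div(x, y):
    # Approximate division for positive TFNs: pair smallest numerator with
    # largest denominator and vice versa to bound the result.
    (a, b, c), (d, e, f) = x, y
    return (a / f, b / e, c / d)

def fuzzy_npv(initial_outlay, cash_flows, rate):
    """Fuzzy NPV with TFN cash flows and a TFN required rate of return.

    initial_outlay: crisp positive number paid at t = 0
    cash_flows:     list of TFNs (a, b, c), one per period
    rate:           TFN required rate, e.g. (0.05, 0.08, 0.10)
    """
    npv = (-initial_outlay, -initial_outlay, -initial_outlay)
    for t, cf in enumerate(cash_flows, start=1):
        rd, rm, ru = rate
        # Discount factor per period is itself a TFN, (1 + r)^t component-wise.
        discount = ((1 + rd) ** t, (1 + rm) ** t, (1 + ru) ** t)
        npv = tfn_add(npv, tfn_div(cf, discount))
    return npv
```

When every TFN collapses to a single value, the result reduces to the classical NPV, which is the extension property the abstract refers to.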
Abstract: Data mining uses a variety of techniques, each of which is useful for some particular task. It is important to have a deep understanding of each technique and to be able to perform sophisticated analysis. In this article we describe a tool built to simulate a variation of the Kohonen network to perform unsupervised clustering and to support the entire data mining process up to results visualization. A graphical representation helps the user to find a strategy to optimize classification by adding, moving or deleting a neuron in order to change the number of classes. The tool is also able to automatically suggest a strategy for optimizing the number of classes. The tool is used to classify macroeconomic data that report the most developed countries' imports and exports. It is possible to classify the countries based on their economic behaviour and to use an ad hoc tool to characterize the commercial behaviour of a country in a selected class from the analysis of the positive and negative features that contribute to class formation.
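A minimal 1-D Kohonen (self-organizing map) training loop of the kind such a tool simulates can be sketched as follows; the unit count, decay schedules and random seed are illustrative assumptions, not the tool's actual settings:

```python
import math
import random

def train_som(data, n_units=3, epochs=50, lr0=0.5):
    """Minimal 1-D Kohonen map: each unit is a prototype vector; the winning
    unit and its neighbours move toward each presented sample."""
    dim = len(data[0])
    random.seed(1)
    units = [[random.random() for _ in range(dim)] for _ in range(n_units)]
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)                  # learning rate decays
        radius = max(1.0 * (1 - epoch / epochs), 0.01)   # neighbourhood shrinks
        for x in data:
            # Best-matching unit = nearest prototype (squared distance).
            bmu = min(range(n_units),
                      key=lambda k: sum((units[k][d] - x[d]) ** 2
                                        for d in range(dim)))
            for k in range(n_units):
                # Gaussian neighbourhood: closer units move more.
                h = math.exp(-((k - bmu) ** 2) / (2 * radius ** 2))
                for d in range(dim):
                    units[k][d] += lr * h * (x[d] - units[k][d])
    return units

def classify(units, x):
    # Assign a sample to the class of its nearest prototype.
    return min(range(len(units)),
               key=lambda k: sum((units[k][d] - x[d]) ** 2
                                 for d in range(len(x))))
```

Adding, moving or deleting a neuron, as the tool allows, corresponds to editing the `units` list directly and changes the number of classes available to `classify`.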
Abstract: In the recent past Learning Classifier Systems have
been successfully used for data mining. Learning Classifier System
(LCS) is basically a machine learning technique which combines
evolutionary computing, reinforcement learning, supervised or
unsupervised learning and heuristics to produce adaptive systems. A
LCS learns by interacting with an environment from which it
receives feedback in the form of numerical reward. Learning is
achieved by trying to maximize the amount of reward received. All
LCS models, more or less, comprise four main components: a finite
population of condition–action rules, called classifiers; the
performance component, which governs the interaction with the
environment; the credit assignment component, which distributes the
reward received from the environment to the classifiers accountable
for the rewards obtained; the discovery component, which is
responsible for discovering better rules and improving existing ones
through a genetic algorithm. The concatenation of the production
rules in an LCS forms the genotype, and therefore the GA operates
on a population of classifier systems. This approach is known as the
'Pittsburgh' Classifier Systems. Other LCSs, which perform their GA
at the rule level within a population, are known as 'Michigan' Classifier
Systems. The most predominant representation of the discovered
knowledge is the standard production rules (PRs) in the form of IF P
THEN D. The PRs, however, are unable to handle exceptions and do
not exhibit variable precision. The Censored Production Rules
(CPRs), an extension of PRs, were proposed by Michalski and
Winston; they exhibit variable precision and support an efficient
mechanism for handling exceptions. A CPR is an augmented
production rule of the form: IF P THEN D UNLESS C, where
Censor C is an exception to the rule. Such rules are employed in
situations, in which conditional statement IF P THEN D holds
frequently and the assertion C holds rarely. By using a rule of this
type we are free to ignore the exception condition when the
resources needed to establish its presence are tight or there is simply
no information available as to whether it holds or not. Thus, the IF P
THEN D part of CPR expresses important information, while the
UNLESS C part acts only as a switch and changes the polarity of D
to ~D. In this paper the Pittsburgh-style LCS approach is used for
automated discovery of CPRs. An appropriate encoding scheme is
suggested to represent a chromosome as a fixed-size set of
CPRs. Suitable genetic operators are designed for the set of CPRs
and for individual CPRs, and an appropriate fitness function is
proposed that incorporates basic constraints on CPRs. Experimental results are
presented to demonstrate the performance of the proposed learning
classifier system.
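The CPR form IF P THEN D UNLESS C can be sketched directly. The bird/penguin rule below is a hypothetical illustration; the `check_censor` flag models ignoring the censor when the resources needed to establish it are tight, and a censor returning `None` models the case where no information about it is available:

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class CPR:
    """Censored Production Rule: IF premise THEN decision UNLESS censor."""
    premise: Callable[[dict], bool]
    decision: str
    censor: Callable[[dict], Optional[bool]]   # may return None = unknown

    def fire(self, example: dict, check_censor: bool = True):
        if not self.premise(example):
            return None                          # rule does not apply
        if check_censor:
            c = self.censor(example)
            if c is True:
                return "not " + self.decision    # censor flips polarity of D
        # Censor false, unknown, or skipped under tight resources: conclude D.
        return self.decision

# Hypothetical example: "IF bird THEN flies UNLESS penguin".
rule = CPR(premise=lambda e: e.get("bird", False),
           decision="flies",
           censor=lambda e: e.get("penguin"))    # None when attribute missing
```

A Pittsburgh-style chromosome is then simply a fixed-size list of such `CPR` objects, with genetic operators acting both on the list and on individual rules.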
Abstract: In this paper, a new time-delay estimation
technique based on the cross ΨB-energy operator [5] is
introduced. This quadratic energy detector measures how
much of one signal is present in another. The location of the
peak of the energy operator, corresponding to the maximum
interaction between the two signals, is the estimate of the
delay. The method is a fully data-driven approach. The
discrete version of the continuous-time form of the cross
ΨB-energy operator is presented for its implementation. The
effectiveness of the proposed method is demonstrated on real
underwater acoustic signals arriving from targets, and the
results are compared to the cross-correlation method.
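A sketch of the idea, assuming the common discrete cross Teager-Kaiser-type energy (which reduces to the usual Teager energy when the two signals coincide) rather than the paper's exact operator; the delay is read off as the lag of maximum accumulated cross-energy:

```python
def cross_energy(x, y):
    # Discrete cross Teager-Kaiser-type energy; for x == y it reduces to
    # the ordinary Teager energy x[n]^2 - x[n+1]*x[n-1].
    return [x[n] * y[n] - 0.5 * (x[n + 1] * y[n - 1] + x[n - 1] * y[n + 1])
            for n in range(1, len(x) - 1)]

def estimate_delay(x, y, max_lag):
    """Slide y against x and return the lag maximizing the accumulated
    cross-energy, i.e. the point of maximum interaction between the signals."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        # Overlapping segments of x and y at this relative lag.
        if lag >= 0:
            xs, ys = x[:len(x) - lag], y[lag:]
        else:
            xs, ys = x[-lag:], y[:len(y) + lag]
        score = sum(abs(v) for v in cross_energy(xs, ys))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag
```

Unlike cross-correlation, the score at each lag is a quadratic energy measure of the interaction between the two signals, which is the property the abstract exploits.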
Abstract: The paper deals with the estimation of amplitude and phase of an analogue multi-harmonic band-limited signal from irregularly spaced sampling values. To this end, assuming the signal fundamental frequency is known in advance (i.e., estimated at an independent stage), a complexity-reduced algorithm for signal reconstruction in the time domain is proposed. The reduction in complexity is achieved owing to completely new analytical and summarized expressions that enable a quick estimation at a low numerical error. The proposed algorithm for the calculation of the unknown parameters requires O((2M+1)²) flops, while the straightforward solution of the obtained equations takes O((2M+1)³) flops (M is the number of harmonic components). It is applied in signal reconstruction, spectral estimation, system identification, as well as in other important signal processing problems. The proposed method of processing can be used for precise RMS measurements (for power and energy) of a periodic signal based on the presented signal reconstruction. The paper investigates the errors related to the signal parameter estimation, and a computer simulation demonstrates the accuracy of these algorithms.
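For contrast with the paper's reduced-complexity algorithm, the straightforward route is a least-squares solve over the 2M+1 unknowns (a DC term plus a cosine/sine pair per harmonic), which is the O((2M+1)³) approach the abstract mentions. A sketch, assuming a known fundamental f0 (function name and interface are illustrative):

```python
import numpy as np

def estimate_harmonics(t, samples, f0, M):
    """Least-squares amplitudes/phases of a multi-harmonic signal sampled at
    irregular instants t, given a known fundamental frequency f0.
    Model: dc + sum_m A_m * cos(2*pi*f0*m*t + phi_m), m = 1..M."""
    w = 2 * np.pi * f0
    # Design matrix: DC column plus cos/sin pair for each harmonic.
    cols = [np.ones_like(t)]
    for m in range(1, M + 1):
        cols.append(np.cos(m * w * t))
        cols.append(np.sin(m * w * t))
    A = np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(A, samples, rcond=None)
    dc = coeffs[0]
    # cos coeff = A_m*cos(phi_m), sin coeff = -A_m*sin(phi_m).
    amps = [np.hypot(coeffs[2 * m - 1], coeffs[2 * m]) for m in range(1, M + 1)]
    phases = [np.arctan2(-coeffs[2 * m], coeffs[2 * m - 1]) for m in range(1, M + 1)]
    return dc, amps, phases
```

The point of the paper's analytical expressions is to avoid forming and solving this full system, reducing the cost by one order in (2M+1).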
Abstract: Three alumina-supported Pt-Sn catalysts have been
prepared by means of co-impregnation and characterized by XRD and
N2 adsorption. The influence of catalyst composition and reaction
conditions on the conversion and selectivity were investigated in the
hydrogenation of acetic acid in an isothermal integral fixed bed
reactor. The experiments were performed over the temperature
interval 468-548 K, at liquid hourly space velocities (LHSV) of
0.3-0.7 h⁻¹ and pressures between 1.0 and 5.0 MPa. A composition of
0.75% Pt-1.5% Sn offers a good compromise as an optimized acetic
acid hydrogenation catalyst, and the conversion and selectivity can
be tuned through the
variation of reaction conditions.
Abstract: In recent years, scanning probe microscopy/atomic force
microscopy (SPM AFM) has gained acceptance over a wide spectrum
of research and science applications. Most work focuses on
physical, chemical and biological fields, while less attention is
devoted to manufacturing and machining aspects. The purpose of the
current study is to assess the possible implementation of SPM AFM
features and the NanoScope software in general machining
applications, with special attention to the tribological aspects of
cutting tools. The surface morphology of coated and uncoated
as-received carbide inserts is examined, analyzed, and
characterized through the determination of the appropriate scanning
settings, the suitable data-type imaging techniques and the most
representative data analysis parameters using the MultiMode SPM AFM
in contact mode. The NanoScope operating software is used to
capture three data types of real-time images: “Height”,
“Deflection” and “Friction”. Three scan sizes are independently
performed: 2, 6, and 12 μm with a 2.5 μm vertical range (Z).
Offline-mode analysis includes the determination of three
functional topographical parameters: surface “Roughness”, power
spectral density (“PSD”) and “Section”. The 12 μm scan size in
association with “Height” imaging is found efficient for capturing
every tiny feature and tribological aspect of the examined surface.
Also, “Friction” analysis is found to produce a comprehensive
explanation of the lateral characteristics of the scanned surface.
Many surface defects and flaws have been precisely detected and
analyzed.
Abstract: We have previously introduced an ultrasonic imaging
approach that combines harmonic-sensitive pulse sequences with a
post-beamforming quadratic kernel derived from a second-order
Volterra filter (SOVF). This approach is designed to produce images
with high sensitivity to nonlinear oscillations from microbubble
ultrasound contrast agents (UCA) while maintaining high levels of
noise rejection. In this paper, a two-step algorithm is presented
for computing the coefficients of the quadratic kernel so as to
reduce the tissue component introduced by motion, maximize noise
rejection, and increase specificity while optimizing sensitivity to
the UCA. In the first step, quadratic kernels from individual
singular modes of the PI data matrix are compared in terms of their
ability to maximize the contrast-to-tissue ratio (CTR). In the second
step, quadratic kernels resulting in the highest CTR values are
convolved. The imaging results indicate that a signal processing
approach to this clinical challenge is feasible.
Abstract: A simple mobile engine-driven pneumatic paddy
collector made of locally available materials using local
manufacturing technology was designed, fabricated, and tested for
collecting and bagging of paddy dried on concrete pavement. The
pneumatic paddy collector had the following major components:
radial flat bladed type centrifugal fan, power transmission system,
bagging area, frame and the conveyance system. Results showed
significant differences in collecting capacity, noise level, and fuel
consumption when the rotational speed of the air mover shaft was varied.
Other parameters such as collecting efficiency, air velocity,
augmented cracked grain percentage, and germination rate were not
significantly affected by varying rotational speed of the air mover
shaft. The pneumatic paddy collector had a collecting efficiency of
99.33 % with a collecting capacity of 2685.00 kg/h at maximum
rotational speed of centrifugal fan shaft of about 4200 rpm. The
machine entailed an investment cost of P 62,829.25. The break-even
weight of paddy was 510,606.75 kg/yr at a collecting cost of 0.11
P/kg of paddy. Utilizing the machine for 400 hours per year
generated an income of P 23,887.73. The projected time needed to
recover cost of the machine based on 2685 kg/h collecting capacity
was 2.63 years.
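The reported payback period can be checked directly from the figures quoted above:

```python
# Figures quoted in the abstract (P = Philippine pesos).
investment = 62829.25        # machine investment cost, P
capacity_kg_per_h = 2685.0   # collecting capacity, kg/h
hours_per_year = 400         # assumed annual utilization from the abstract
annual_income = 23887.73     # income generated per year, P

annual_throughput = capacity_kg_per_h * hours_per_year  # kg of paddy per year
payback_years = investment / annual_income              # simple payback period
```

The simple payback of about 2.63 years matches the projected cost-recovery time stated in the abstract.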
Abstract: This paper discusses the design of knowledge
integration of clinical information extracted from distributed
medical ontologies in order to improve a machine-learning-based
multi-label coding assignment system. The proposed approach is
implemented using a machine-learning decision tree technique on
university hospital data for patients with Coronary Heart
Disease (CHD). The preliminary results show that
the use of medical ontologies improves the overall
system performance.
Abstract: Information sharing and exchange, rather than
information processing, is what characterizes information
technology in the 21st century. Ontologies, as shared common
understanding, gain increasing attention, as they appear as the
most promising solution to enable information sharing both at
a semantic level and in a machine-processable way. Domain
Ontology-based modeling has been exploited to provide
shareability and information exchange among diversified,
heterogeneous applications of enterprises.
Contextual ontologies are “an explicit specification of
contextual conceptualization”; that is, an ontology is
characterized by concepts that have multiple representations
and may exist in several contexts. Hence, contextual
ontologies are a set of concepts and relationships which are
seen from different perspectives. Contextualization allows
ontologies to be partitioned according to their contexts.
The need for contextual ontologies in enterprise modeling
has become crucial due to the nature of today's competitive
market. Information resources in an enterprise are distributed
and diversified and need to be shared and communicated
locally through the intranet and globally through the internet.
This paper discusses the roles that ontologies play in
enterprise modeling, and how ontologies assist in building a
conceptual model in order to provide communicative and
interoperable information systems. The issue of enterprise
modeling based on contextual domain ontology is also
investigated, and a framework is proposed for an enterprise
model that consists of various applications.
Abstract: In this paper, a modified optimal sliding mode control with a proposed method for designing the sliding surface is presented. Because the previous sliding-mode approach cannot produce a bounded and suitable input, a new variation of the sliding manifold is proposed to obviate these problems in a structural system. Although sliding mode control is a powerful method for rejecting disturbances and noise, the resulting chattering is harmful to actuators. To decrease the chattering phenomenon, optimal control is added to the sliding mode control. Numerical simulations show that the proposed method not only reduces the intense variations in the system inputs but also produces more efficient responses than sliding mode control or optimal control alone.
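As an illustration of sliding mode control and the chattering issue only: the sketch below controls a double integrator and smooths the switching with a boundary-layer saturation, a standard textbook chattering remedy used here in place of the paper's optimal-control term; the plant, gains and time step are all assumed.

```python
def simulate_smc(x0, v0, dt=0.001, steps=5000, lam=2.0, k=5.0, phi=0.05):
    """Sliding mode control of a double integrator x'' = u toward the origin.
    Sliding surface s = v + lam*x; control u = -k*sat(s/phi), where the
    saturation (boundary layer of width phi) replaces the discontinuous
    sign(s) that causes chattering."""
    x, v = x0, v0
    for _ in range(steps):
        s = v + lam * x
        sat = max(-1.0, min(1.0, s / phi))   # smooth replacement for sign(s)
        u = -k * sat
        v += u * dt                          # semi-implicit Euler integration
        x += v * dt
    return x, v
```

Once the state reaches the surface s = 0, the closed loop behaves like x' = -lam*x, so x and v decay exponentially; with a pure sign(s) control the input would instead switch sign at nearly every step near the surface, which is the chattering the abstract seeks to suppress.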
Abstract: We study dynamic instability in high-rise steel moment
resisting frames (SMRFs) subjected to synthetic long-period ground
motions caused by hypothetical huge subduction earthquakes. Since
long duration and long dominant periods are characteristic of
long-period ground motions, interstory drifts may enter the negative
postyield stiffness range many times when high-rise buildings are
subjected to long-period ground motions. Through the case studies of
9 high-rise SMRFs designed in accordance with Japanese design
practice of the 1980s, we demonstrate that drifting, or the accumulation of
interstory drifts in one direction, occurs at the lower stories of the
SMRFs, if their natural periods are close to the dominant periods of the
long-period ground motions. The drifting led to residual interstory
drift ratios over 0.01, or to collapse when the design base shear was small.