Abstract: In this study, a black-box model of a coupled-tank system is obtained using fuzzy sets, and the derived model is tested via an adaptive neuro-fuzzy inference system (ANFIS). To achieve better control performance, the parameters of three controller types, a classical proportional-integral-derivative (PID) controller, a fuzzy PID controller and the function tuner method, are tuned by the genetic algorithm, an evolutionary computation method. All tuned controllers are applied to the fuzzy model of the coupled-tank experimental setup and analyzed under different reference input values. The results show that the function tuner method demonstrates more robust control performance and guarantees closed-loop stability.
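The GA-based tuning step can be sketched in miniature. The toy real-coded genetic algorithm below (tournament selection, blend crossover, Gaussian mutation) minimizes a quadratic stand-in for a closed-loop cost; the cost surface, gain bounds and all GA settings are illustrative assumptions, not the paper's actual coupled-tank performance index.

```python
import numpy as np

def ga_minimize(cost, lo, hi, pop=40, gens=60, seed=4):
    """Toy real-coded GA: tournament selection, blend crossover,
    Gaussian mutation. Settings are illustrative only."""
    rng = np.random.default_rng(seed)
    P = rng.uniform(lo, hi, size=(pop, len(lo)))
    for _ in range(gens):
        f = np.array([cost(p) for p in P])
        # binary tournaments pick one parent per population slot
        i, j = rng.integers(pop, size=(2, pop))
        parents = np.where((f[i] < f[j])[:, None], P[i], P[j])
        mates = parents[rng.permutation(pop)]
        a = rng.random((pop, 1))
        P = a * parents + (1 - a) * mates      # blend crossover
        P += rng.normal(0.0, 0.05, P.shape)    # Gaussian mutation
        P = np.clip(P, lo, hi)
    f = np.array([cost(p) for p in P])
    return P[np.argmin(f)]

# stand-in cost surface whose optimum plays the role of ideal gains
target = np.array([2.0, 0.5, 0.1])
cost = lambda g: float(np.sum((g - target) ** 2))
best = ga_minimize(cost, np.zeros(3), np.array([5.0, 2.0, 1.0]))
```

In a real tuning run, `cost` would instead simulate the closed loop on the plant model and return a performance index such as the integral of absolute error.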
Abstract: Stochastic modeling concerns the use of probability
to model real-world situations in which uncertainty is present.
Therefore, the purpose of stochastic modeling is to estimate the
probability of outcomes within a forecast, i.e., to predict what
might happen under different conditions or decisions.
In the present study, we present a model of a stochastic diffusion
process based on the bi-Weibull distribution function (its trend
is proportional to the bi-Weibull probability density function). In
general, the Weibull distribution has the ability to assume the
characteristics of many different types of distributions. This has
made it very popular among engineers and quality practitioners, who
have considered it the most commonly used distribution for studying
problems such as modeling reliability data, accelerated life testing,
and maintainability modeling and analysis. In this work, we start
by obtaining the probabilistic characteristics of this model, such as
the explicit expression of the process, its trends, and its distribution,
by transforming the diffusion process into a Wiener process as shown in
Ricciardi's theorem. Then, we develop the statistical inference of
this model using the maximum likelihood methodology. Finally, we
analyse with simulated data the computational problems associated
with the parameter estimation, an issue of great importance in
applications to real data, using convergence analysis methods. Overall,
the use of a stochastic model reflects only a pragmatic decision on
the part of the modeler. According to the data that is available and
the universe of models known to the modeler, this model represents
the best currently available description of the phenomenon under
consideration.
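As a simplified illustration of the maximum likelihood step, the sketch below fits an ordinary two-parameter Weibull distribution by profile likelihood, not the paper's bi-Weibull diffusion process; the grid of candidate shapes and the synthetic data are assumptions.

```python
import numpy as np

def weibull_logpdf(t, shape, scale):
    """Log-density of the two-parameter Weibull distribution."""
    z = t / scale
    return np.log(shape / scale) + (shape - 1) * np.log(z) - z ** shape

def weibull_mle(t, shapes=np.linspace(0.2, 5.0, 481)):
    """Profile-likelihood fit: for each candidate shape k the scale
    MLE has the closed form (mean(t**k))**(1/k); keep the best pair."""
    best = (-np.inf, None, None)
    for k in shapes:
        lam = np.mean(t ** k) ** (1.0 / k)
        ll = weibull_logpdf(t, k, lam).sum()
        if ll > best[0]:
            best = (ll, k, lam)
    return best[1], best[2]

rng = np.random.default_rng(0)
data = rng.weibull(1.5, size=2000) * 2.0   # true shape 1.5, scale 2.0
k_hat, lam_hat = weibull_mle(data)
```

The closed-form scale step is what makes the one-dimensional grid search over the shape parameter practical; for the bi-Weibull process the likelihood would instead involve the transition densities of the diffusion.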
Abstract: Intrusion Detection Systems are an essential tool for
network security infrastructure. However, IDSs have a serious
problem: they generate a massive number of alerts, most of which are
false positives that can hide true alerts and confuse the analyst
trying to identify the alerts that report the true attacks.
The purpose of this paper is to present a formal model of a
correlation engine that reduces false positive alerts based on
contextual vulnerability information. To that end, we propose a model
based on the non-monotonic JClassicδє description logic, augmented
with a default (δ) and an exception (є) operator, which allows
dynamic inference according to contextual information.
Abstract: Most self-tuning fuzzy systems, which are
automatically constructed from learning data, are based on the
steepest descent method (SDM). However, this approach often
requires a long convergence time and gets stuck in shallow
local minima. One solution is to use fuzzy rule modules
with a small number of inputs such as DIRMs (Double-Input Rule
Modules) and SIRMs (Single-Input Rule Modules). In this paper,
we consider a (generalized) DIRMs model composed of double
and single-input rule modules. Further, in order to reduce the
redundant modules for the (generalized) DIRMs model, pruning and
generative learning algorithms for the model are suggested. To show
their effectiveness, numerical simulations on function approximation,
Box-Jenkins and obstacle avoidance problems are performed.
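A SIRMs-connected model of the kind discussed above can be sketched as an importance-weighted sum of single-input fuzzy inferences; the Gaussian membership functions, rule centers, consequents and weights below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def gauss(x, c, s):
    return np.exp(-0.5 * ((x - c) / s) ** 2)

def sirm_output(x, centers, consequents, sigma=0.5):
    """Single-input rule module: Gaussian membership grades with
    weighted-average defuzzification over the rule consequents."""
    mu = gauss(x, centers, sigma)
    return np.sum(mu * consequents) / np.sum(mu)

def sirms_model(xs, modules, weights):
    """SIRMs-connected output: importance-weighted sum of the
    per-module inferences, one module per input variable."""
    return sum(w * sirm_output(x, c, b)
               for x, (c, b), w in zip(xs, modules, weights))

# two inputs, three rules each (all numbers are assumptions)
modules = [(np.array([0.0, 0.5, 1.0]), np.array([0.0, 0.5, 1.0])),
           (np.array([0.0, 0.5, 1.0]), np.array([1.0, 0.5, 0.0]))]
weights = [0.6, 0.4]
y = sirms_model([0.0, 0.0], modules, weights)
```

Because each module sees only one input, the number of rules grows linearly rather than exponentially in the number of inputs, which is the property that mitigates SDM's convergence problems; a DIRM would simply take a pair of inputs per module.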
Abstract: Course learning outcomes (CLOs) and the abilities at the time of graduation, referred to as Student Outcomes (SOs), must be assessed for ABET accreditation. A question in an assessment must target a CLO as well as an SO and must represent a required level of competence. This paper presents the idea of an Expert System (ES) to select a proper question to satisfy ABET accreditation requirements. For the ES implementation, seven attributes of a question are considered, including the learning outcomes and Bloom’s Taxonomy level. A database contains all the data about a course, including course content topics, course learning outcomes and the CLO-SO relationship matrix. The knowledge base of the presented ES contains a pool of questions, each tagged with the specified attributes. The questions and their attributes represent expert opinions. With an implicit rule base, the inference engine finds the best possible question satisfying the required attributes. It is shown that such an ES can be implemented and applied to a course with success. An application example is presented to demonstrate the working of the proposed ES.
Abstract: Hydrologic models are increasingly used as tools to
predict stormwater quantity and quality from urban catchments.
However, due to a range of practical issues, most models produce
gross errors in simulating complex hydraulic and hydrologic systems.
Difficulty in finding a robust approach for model calibration is one of
the main issues. Though automatic calibration techniques are
available, they are rarely used in common commercial hydraulic and
hydrologic modelling software, e.g. MIKE URBAN. This is partly
due to the need for a large number of parameters and large datasets in
the calibration process. To overcome this practical issue, a
framework for automatic calibration of a hydrologic model was
developed in the R platform and is presented in this paper. The model was
developed based on the time-area conceptualization. Four calibration
parameters, including initial loss, reduction factor, time of
concentration and time-lag were considered as the primary set of
parameters. Using these parameters, automatic calibration was
performed using Approximate Bayesian Computation (ABC). ABC is
a simulation-based technique for performing Bayesian inference
when the likelihood is intractable or computationally expensive to
compute. To test the performance and usefulness, the technique was
used to simulate three small catchments in Gold Coast. For
comparison, simulation outcomes from the same three catchments
using commercial modelling software, MIKE URBAN were used.
The graphical comparison shows that the MIKE URBAN results agree
strongly, falling within the upper and lower 95% credible intervals of
the posterior predictions obtained via ABC. Statistical validation of
the posterior runoff predictions using the coefficient of determination
(CD), root mean square error (RMSE) and maximum error (ME) gave
reasonable results for the three study catchments. The main benefit of using
ABC over MIKE URBAN is that ABC provides a posterior
distribution for runoff flow prediction, and therefore associated
uncertainty in predictions can be obtained. In contrast, MIKE
URBAN provides only a point estimate. Based on the results of the
analysis, the developed ABC framework appears to perform well for
automatic calibration.
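The core of ABC rejection sampling described above can be illustrated on a toy problem; the Gaussian "runoff" data, flat prior, summary statistic and tolerance below are stand-in assumptions, not the paper's hydrologic model.

```python
import numpy as np

rng = np.random.default_rng(1)
observed = rng.normal(3.0, 1.0, size=200)   # stand-in "observed" data
s_obs = observed.mean()                      # summary statistic

def abc_rejection(n_draws=20000, eps=0.05):
    """ABC rejection: draw theta from the prior, simulate data under
    each theta, keep theta when the simulated summary statistic is
    within eps of the observed one. No likelihood is ever evaluated."""
    theta = rng.uniform(0.0, 10.0, size=n_draws)          # flat prior
    sims = rng.normal(theta[:, None], 1.0, size=(n_draws, 200))
    keep = np.abs(sims.mean(axis=1) - s_obs) < eps
    return theta[keep]

posterior = abc_rejection()
lo, hi = np.percentile(posterior, [2.5, 97.5])  # 95% credible interval
```

The accepted draws approximate the posterior, so credible intervals for predictions come for free, which is exactly the advantage over a point-estimate calibration noted in the abstract; in the calibration setting, the simulator would be the time-area runoff model and theta the four calibration parameters.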
Abstract: This paper proposes a method of learning topics for
broadcasting contents. There are two kinds of texts related to
broadcasting contents. One is the broadcasting script, a series of
texts including directions and dialogues. The other is blogposts,
which contain relatively abstract content, stories, and diverse
information about the broadcasting contents. Although the two kinds
of text cover similar broadcasting content, the words used in
blogposts and broadcasting scripts differ. When unseen words appear, a
method is needed to incorporate them into the existing topics. In
this paper, we introduce a semantic
vocabulary expansion method to reflect unseen words. We expand
topics of the broadcasting script by incorporating the words in
blogposts. Each word in blogposts is added to the most semantically
correlated topics. We use word2vec to get the semantic correlation
between words in blogposts and topics of scripts. The vocabularies of
topics are updated and then posterior inference is performed to
rearrange the topics. In experiments, we verified that the proposed
method can discover more salient topics for broadcasting contents.
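The assignment of an unseen blog word to its most semantically correlated topic can be sketched with cosine similarity over word vectors; the tiny two-dimensional vectors below merely stand in for trained word2vec embeddings, and the words and topics are invented for illustration.

```python
import numpy as np

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def assign_to_topic(word_vec, topic_words, embeddings):
    """Assign an unseen word to the topic whose member words are,
    on average, most cosine-similar to it."""
    scores = {t: np.mean([cosine(word_vec, embeddings[w]) for w in ws])
              for t, ws in topic_words.items()}
    return max(scores, key=scores.get)

# toy 2-d "embeddings" standing in for trained word2vec vectors
emb = {"drama": np.array([1.0, 0.1]), "actor": np.array([0.9, 0.2]),
       "goal":  np.array([0.1, 1.0]), "match": np.array([0.2, 0.9])}
topics = {"show": ["drama", "actor"], "sport": ["goal", "match"]}
new_word = np.array([0.15, 0.95])     # vector of an unseen blog word
topic = assign_to_topic(new_word, topics, emb)
```

After such assignments expand the topic vocabularies, posterior inference over the topic model would be rerun to rearrange the topics, as the abstract describes.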
Abstract: This paper presents a method in which expert
knowledge is applied to a fuzzy inference model. Even a less
experienced person, e.g. an urban planner or official, could benefit
from the use of such a system. The analysis result is obtained in a very
short time, so a large number of the proposed locations can also be
verified in a short time. The proposed method is intended for assessing
the locations of car parks in a city. The paper shows selected examples
of locations of the P&R facilities in cities planning to introduce the
P&R. Analyses of existing facilities are also shown in the paper
and are confronted with the opinions of their users, with
particular emphasis on unpopular locations. The results of the
analyses are compared with an expert analysis of P&R facility
locations that was commissioned by the city and with the opinions of
existing facilities' users expressed on social networking
sites. The obtained results are consistent with actual users’ feedback.
The proposed method proves effective, yet it does not require the
involvement of a large team of experts or large financial outlays
for complicated research. The method also makes it possible to
propose alternative locations for P&R facilities. Although the results
of the method are approximate, they are no worse than the results of
analyses by employed experts. The advantage of this method is its ease
of use, which simplifies professional expert analysis. The ability to
analyze a large number of alternative locations gives a broader
view on the problem. It is valuable that the arduous analysis performed
by a team of people can be replaced by the model's calculations. According
to the authors, the proposed method is also suitable for
implementation on a GIS platform.
Abstract: In this paper, we propose the variational EM inference
algorithm for the multi-class Gaussian process classification model
that can be used in the field of human behavior recognition. The
algorithm simultaneously derives both the posterior distribution of a
latent function and estimators of the hyper-parameters in a
multi-class Gaussian process classification model. Our algorithm is
based on the Laplace approximation (LA) technique and the variational
EM framework, and proceeds in two steps: the expectation and
maximization steps. First, in the expectation step, using the Bayesian
formula and LA technique, we derive approximately the posterior
distribution of the latent function indicating the possibility that each
observation belongs to a certain class in the Gaussian process
classification model. Second, in the maximization step, using the
derived posterior distribution of the latent function, we compute the
maximum likelihood estimators of the covariance-matrix
hyper-parameters needed to define the prior distribution of the latent
function. These two steps are repeated until a convergence condition
is satisfied. Moreover, we apply the proposed algorithm to a human
action classification problem using a public database, namely the KTH
human action data set. Experimental results show that the proposed
algorithm performs well on this data set.
Abstract: The growing demand for electrical energy increases the
load on the power system, which in turn increases the occurrence
of frequent oscillations in the system. These oscillations stem
from a lack of the damping torque required to suppress the
disturbances of the power system. FACTS devices such as the
Unified Power Flow Controller (UPFC) can control power flow,
reduce sub-synchronous resonances and increase transient stability.
Hence, the UPFC is used to damp the oscillations occurring in the power
system. This research focuses on adapting a neuro-fuzzy controller
for the UPFC design in a single-machine infinite-bus (SMIB) power
system with a linearized synchronous machine model
(Heffron-Phillips). This model gains the
capability to improve the transient stability and to damp the
oscillations of the system.
Abstract: Throughout history, people have made estimates
and inferences about the future by using their past experiences.
Developing information technologies and improvements in
database management systems make it possible to extract useful
information from the knowledge at hand for strategic decisions.
Accordingly, different methods have been developed, and data mining by
association rule learning is one such method. The Apriori algorithm,
one of the well-known association rule learning algorithms, is not
commonly applied to spatio-temporal data sets. However, it is possible
to embed time and space features into the data sets and make the
Apriori algorithm a suitable technique for mining spatio-temporal
association rules. Lake Van, the largest lake in Turkey, is a
closed basin. This feature causes the volume of the lake to increase or
decrease with changes in the amount of water it holds. In this study,
evaporation, humidity, lake altitude, amount of rainfall and
temperature parameters recorded in Lake Van region throughout the
years are used by the Apriori algorithm and a spatio-temporal data
mining application is developed to identify overflows and newly formed
soil regions (underflows) occurring in the coastal parts of
Lake Van. Identifying the possible causes of overflows and underflows
can help alert experts to take precautions and make the
necessary investments.
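The level-wise search at the heart of Apriori can be sketched as follows; the toy weather-style transactions and the 0.6 support threshold are illustrative assumptions, not the Lake Van data.

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Plain Apriori: generate candidate itemsets level by level,
    pruning any candidate whose support falls below min_support."""
    n = len(transactions)
    level = {frozenset([i]) for t in transactions for i in t}
    freq = {}
    while level:
        counts = {c: sum(c <= t for t in transactions) for c in level}
        current = {c: v / n for c, v in counts.items()
                   if v / n >= min_support}
        freq.update(current)
        # join frequent k-sets into (k+1)-set candidates
        keys = list(current)
        level = {a | b for a, b in combinations(keys, 2)
                 if len(a | b) == len(a) + 1}
    return freq

# toy transactions: co-occurring conditions per observation period
tx = [frozenset(t) for t in
      [{"rain", "humid", "rise"}, {"rain", "humid"},
       {"rain", "rise"}, {"humid", "rise"}, {"rain", "humid", "rise"}]]
freq = apriori(tx, min_support=0.6)
```

Embedding time and space as extra items in each transaction, as the abstract suggests, turns the same frequent-itemset machinery into a spatio-temporal rule miner.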
Abstract: Inference plays an important role in the learning
process and it can lead to a rapid acquisition of a second language.
When learning a non-native language, e.g. a critical language like
Arabic, students depend on the teacher’s support most of the time
to learn new concepts. The students focus on memorizing the new
vocabulary and on learning all the grammatical rules. Hence,
the students become mechanical and cannot produce the language
easily. As a result, relying heavily on the teacher, they are unable
to predict the meaning of words in context; they cannot link their
prior knowledge or even identify the meaning of the words
without the teacher’s support. This study explores how the
teacher guides students’ learning during the inference process and
what learning processes can direct students’ inference.
Abstract: Web mining aims to discover and extract useful
information. Different users may have different search goals when
they submit queries to a search engine. The inference and analysis of
user search goals can be very useful for improving the results
returned for a user's search query. In this project,
we propose a novel approach to infer user search goals by analyzing
search engine query logs. First, feedback sessions are constructed
from user click-through logs; they efficiently reflect the
information needs of users. Second, we propose a preprocessing
technique to clean unnecessary data from the web log file (the
feedback sessions). Third, we propose a technique to generate
pseudo-documents representing the feedback sessions for clustering.
Finally, we implement the k-medoids clustering algorithm to discover
the different user search goals and to provide better results for a
search query based on the feedback sessions.
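The k-medoids step can be sketched with a basic PAM-style loop over a precomputed distance matrix; the toy "pseudo-document" points below are illustrative, not actual feedback-session vectors.

```python
import numpy as np

def k_medoids(D, k, iters=100, seed=0):
    """Basic k-medoids on a precomputed distance matrix: assign each
    point to its nearest medoid, then re-pick each cluster's medoid
    as the member minimizing total in-cluster distance."""
    rng = np.random.default_rng(seed)
    medoids = rng.choice(D.shape[0], size=k, replace=False)
    for _ in range(iters):
        labels = np.argmin(D[:, medoids], axis=1)
        new = []
        for j in range(k):
            members = np.where(labels == j)[0]
            sub = D[np.ix_(members, members)]
            new.append(members[np.argmin(sub.sum(axis=1))])
        new = np.array(new)
        if set(new) == set(medoids):
            break
        medoids = new
    return labels, medoids

# toy pseudo-document embeddings: two well-separated groups
pts = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
                [5.0, 5.0], [5.1, 5.0], [5.0, 5.1]])
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)
labels, medoids = k_medoids(D, k=2)
```

Unlike k-means, the cluster centers here are always actual pseudo-documents, so each discovered search goal can be read off directly from its medoid.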
Abstract: In addition to environmental parameters such as rain
and temperature, crop disease is a major factor affecting the
quality and quantity of crop yield. Hence, disease
management is a key issue in agriculture. For the management of
disease, it needs to be detected at an early stage so that it can be
treated properly and its spread controlled. Nowadays, it is possible
to use images of a diseased leaf to detect the type of disease using
image processing techniques. This can be achieved by extracting
features from the images, which can then be used with classification
algorithms or content-based image retrieval systems. In this paper, a
color image is used to extract features such as the mean and standard
deviation after region cropping. The selected features
are taken from the cropped image at different image sizes.
The extracted features are then used for
classification with a Fuzzy Inference System (FIS).
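The mean and standard-deviation feature extraction after region cropping can be sketched as follows; the synthetic two-tone "leaf" image and the crop coordinates are illustrative assumptions.

```python
import numpy as np

def region_features(img, top, left, h, w):
    """Crop a region from an RGB image and return the per-channel
    mean and standard deviation as a 6-element feature vector."""
    crop = img[top:top + h, left:left + w].astype(float)
    means = crop.mean(axis=(0, 1))
    stds = crop.std(axis=(0, 1))
    return np.concatenate([means, stds])

# synthetic image: a healthy green patch next to a brownish lesion
img = np.zeros((64, 64, 3), dtype=np.uint8)
img[:, :32] = (40, 160, 50)      # green region
img[:, 32:] = (150, 90, 40)      # brown region
f_green = region_features(img, 0, 0, 64, 32)
f_brown = region_features(img, 0, 32, 64, 32)
```

Such 6-element vectors (or their per-size variants) would then be fed as crisp inputs to the FIS, whose membership functions map the channel statistics to disease classes.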
Abstract: In this study, three robust prediction methods, namely the artificial neural network (ANN), the adaptive neuro-fuzzy inference system (ANFIS) and the support vector machine (SVM), were used for computing the resonant frequency of A-shaped compact microstrip antennas (ACMAs) operating in the UHF band. Firstly, the resonant frequencies of 144 ACMAs with various dimensions and electrical parameters were simulated with the help of IE3D™, based on the method of moments (MoM). The ANN, ANFIS and SVM models for computing the resonant frequency were then built from the simulation data: 124 simulated ACMAs were utilized for training and the remaining 20 ACMAs were used for testing the models. The performance of the ANN, ANFIS and SVM models was compared in the training and test processes. The average percentage errors (APE) of the computed resonant frequencies in training were 0.457% for the ANN, 0.399% for the ANFIS and 0.600% for the SVM. The constructed models were then tested, and APE values of 0.601% for the ANN, 0.744% for the ANFIS and 0.623% for the SVM were achieved. These results show that the ANN, ANFIS and SVM methods can be successfully applied to compute the resonant frequency of ACMAs, since they are useful and versatile methods that yield accurate results.
Abstract: We investigated the management system of a heating enterprise, including strategic planning based on the balanced scorecard (BSC), quality management in accordance with the ISO 9001 Quality Management System (QMS) standard, and analysis of the system based on expert judgment using fuzzy inference. To carry out this work we used the theory of fuzzy sets, the QMS in accordance with ISO 9001, the BSC, the construction of business processes in the IDEF0 notation, and modeling with Matlab simulation tools and LabVIEW graphical programming. The results of the work are as follows: we identified possibilities for improving the management of a heat-supply plant based on the QMS; after justification and adaptation, a software tool was used to automate a series of management functions, to reduce resource use, and to keep the system up to date; and an application for analyzing the QMS based on fuzzy inference was created, with a novel organization of the communication software enabling the analysis of relevant data from the enterprise management system.
Abstract: In an urban context, urban nodes such as amenities or
hazards certainly affect house prices, and classic hedonic analysis
employs distance variables measured from each urban node.
However, the effects of distances to facilities on house prices
generally do not represent the true price of the property. Distance
variables measured on the same surface suffer from a problem called
multicollinearity, which manifests in regression as inflated variance
and unstable coefficient estimates. In this paper, we provide a
theoretical framework for identifying and gathering data with less
bias, and also provide a specific sampling method for locating the
sample region so as to avoid the spatial multicollinearity problem in
the three-distance-variable case.
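Multicollinearity among distance variables is commonly diagnosed with variance inflation factors (VIFs); the sketch below computes VIFs for three synthetic distance variables, two of them nearly collinear. The data-generating choices are assumptions for illustration, not the paper's sampling design.

```python
import numpy as np

def vif(X):
    """Variance inflation factor of each column of X: regress the
    column on the remaining columns and report 1 / (1 - R^2)."""
    n, p = X.shape
    out = []
    for j in range(p):
        y = X[:, j]
        Z = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        r2 = 1.0 - resid.var() / y.var()
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

rng = np.random.default_rng(2)
d1 = rng.uniform(0, 1, 500)              # distance to node 1
d2 = d1 + rng.normal(0, 0.05, 500)       # nearly collinear with d1
d3 = rng.uniform(0, 1, 500)              # independent node
v = vif(np.column_stack([d1, d2, d3]))
```

Large VIFs for `d1` and `d2` flag exactly the inflated-variance problem the abstract describes; a sampling region chosen so that the distance surfaces decorrelate would pull these values back toward 1.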
Abstract: Two finite element (FEM) models are presented in
this paper to address the random nature of the response of glued
timber structures made of wood segments with variable elastic
moduli evaluated from 3600 indentation measurements. This
database served to create as many ensembles as there were
segments in the tested beam. Statistics of these ensembles
were then assigned to the given segments of the beams, and the Latin
Hypercube Sampling (LHS) method was used to perform 100
simulations, resulting in an ensemble of 100 deflections subjected
to statistical evaluation. Here, a detailed geometrical arrangement of
individual segments in the laminated beam was considered in the
construction of a two-dimensional FEM model subjected to four-point
bending to comply with the laboratory tests. Since laboratory
measurements of local elastic moduli may in general suffer from a
significant experimental error, it appears advantageous to exploit the
full scale measurements of timber beams, i.e. deflections, to improve
their prior distributions with the help of the Bayesian statistical
method. This, however, requires an efficient computational model
when simulating the laboratory tests numerically. To this end, a
simplified model based on Mindlin’s beam theory was established.
The improved posterior distributions show that the most significant
change of the Young’s modulus distribution takes place in laminae in
the most strained zones, i.e. in the top and bottom layers within the
beam center region. Posterior distributions of moduli of elasticity
were subsequently utilized in the 2D FEM model and compared with
the original simulations.
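The Latin Hypercube Sampling step can be sketched on the unit hypercube as follows; mapping the stratified uniforms onto the actual elastic-modulus distributions (e.g. via inverse CDFs) is omitted, and the sample sizes are illustrative.

```python
import numpy as np

def latin_hypercube(n, dims, rng):
    """LHS on the unit hypercube: each dimension's range is split
    into n equal strata with exactly one draw per stratum, and the
    strata are independently permuted across dimensions."""
    u = (np.arange(n)[:, None] + rng.random((n, dims))) / n
    for j in range(dims):
        u[:, j] = u[rng.permutation(n), j]
    return u

rng = np.random.default_rng(3)
sample = latin_hypercube(100, 2, rng)   # 100 runs, 2 random inputs
```

The one-draw-per-stratum property is what lets as few as 100 simulations cover each marginal modulus distribution evenly, which plain Monte Carlo sampling of the same size cannot guarantee.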
Abstract: In urban areas, several landmarks may affect housing
prices and rents, and hedonic analysis should employ distance
variables corresponding to each landmark. Unfortunately, the effects
of distances to landmarks on housing prices are generally not
consistent with the true price. These distance variables may cause
magnitude errors in regression, pointing to a problem of spatial
multicollinearity. In this paper, we provide some approaches for
obtaining samples with less bias and a method for locating the
specific sampling area so as to avoid the multicollinearity problem
in the case of two specific landmarks.
Abstract: Load modeling is one of the central functions in
power system operations. Electricity cannot be stored, so an
electric utility needs estimates of future demand to manage
production and purchasing in an economically reasonable way. A
majority of recently reported approaches are based on neural
networks. The attraction of these methods lies in the
assumption that neural networks are able to learn properties of the
load. However, the development of the methods is not finished, and
the lack of comparative results on different model variations is a
problem. This paper presents a new approach in order to predict the
Tunisia daily peak load. The proposed method employs a
computational intelligence scheme based on the Fuzzy neural
network (FNN) and support vector regression (SVR). Experimental
results indicate that the proposed FNN-SVR technique achieves
significantly better prediction accuracy than several classical
techniques.
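As a hedged stand-in for the regression component (the paper's FNN-SVR hybrid is not reproduced here), the sketch below fits a closely related Gaussian-kernel ridge regression to a synthetic daily peak-load curve; the trend, the periodic term and the kernel settings are all assumptions.

```python
import numpy as np

def krr_fit(X, y, gamma, lam=1e-3):
    """Gaussian-kernel ridge regression: solve (K + lam*I) a = y."""
    K = np.exp(-gamma * ((X[:, None] - X[None, :]) ** 2).sum(-1))
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def krr_predict(X_train, alpha, X_new, gamma):
    """Predict at new inputs from the fitted dual coefficients."""
    K = np.exp(-gamma * ((X_new[:, None] - X_train[None, :]) ** 2).sum(-1))
    return K @ alpha

# synthetic daily peak-load curve: linear trend plus a weekly cycle
t = np.linspace(0.0, 4.0, 200)[:, None]
load = 100 + 5 * t.ravel() + 10 * np.sin(2 * np.pi * t.ravel())
alpha = krr_fit(t, load, gamma=10.0)
pred = krr_predict(t, alpha, t, gamma=10.0)
rmse = np.sqrt(np.mean((pred - load) ** 2))
```

An SVR would replace the squared loss with an epsilon-insensitive one (yielding sparse support vectors), and the FNN component of the hybrid would supply or refine the features fed to the kernel machine.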