Abstract: In this work, we propose a new method for solving the matrix equation AXB = F. The method can be viewed as a generalized form of the well-known global full orthogonalization method (Gl-FOM) for solving linear systems with multiple right-hand sides; hence, we call it the extended Gl-FOM (EGl-FOM). To implement EGl-FOM, generalized forms of the block Krylov subspace and the global Arnoldi process are presented. Finally, numerical experiments illustrate the efficiency of the new method.
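The abstract does not detail the EGl-FOM iteration itself, but for small problems the equation AXB = F can be solved directly via the vectorization identity vec(AXB) = (B^T ⊗ A) vec(X). The following NumPy sketch is a direct baseline useful for validating an iterative solver's output, not the proposed method:

```python
import numpy as np

# Direct baseline for AXB = F via vec(AXB) = (B^T kron A) vec(X).
# This is NOT the EGl-FOM iteration from the abstract -- just a
# small-scale reference solution for checking an iterative solver.
def solve_axb_f(A, B, F):
    n, m = A.shape[1], B.shape[0]
    K = np.kron(B.T, A)                           # (B^T kron A)
    x = np.linalg.solve(K, F.flatten(order="F"))  # vec() is column-major
    return x.reshape((n, m), order="F")

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((3, 3))
X_true = rng.standard_normal((4, 3))
F = A @ X_true @ B
X = solve_axb_f(A, B, F)
```

The Kronecker system has size nm x nm, which is exactly why Krylov methods such as EGl-FOM are needed for large problems.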
Abstract: The theory of rough sets is generalized by means of a filter. The filter is induced by binary relations and is used to generalize the basic rough set concepts. The knowledge representation and processing of binary relations in the style of rough set theory are investigated.
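For concreteness, the classical relation-based lower and upper approximations that such generalizations build on can be sketched in a few lines. This is a minimal illustration with a hypothetical universe and relation, not the paper's filter-based definitions:

```python
# Classical relation-based rough approximations (a minimal sketch;
# the filter-based generalization in the abstract extends these notions).
def successor(R, x):
    """Successor neighborhood of x under binary relation R (set of pairs)."""
    return {y for (a, y) in R if a == x}

def lower_approx(U, R, X):
    # x belongs to the lower approximation iff its neighborhood lies inside X
    return {x for x in U if successor(R, x) <= X}

def upper_approx(U, R, X):
    # x belongs to the upper approximation iff its neighborhood meets X
    return {x for x in U if successor(R, x) & X}

U = {1, 2, 3, 4}
R = {(1, 1), (2, 2), (2, 3), (3, 2), (3, 3), (4, 4)}  # example relation
X = {1, 2}
lo, hi = lower_approx(U, R, X), upper_approx(U, R, X)
```

A set X is "rough" exactly when the two approximations differ, as they do here.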
Abstract: The Generalized Center String (GCS) problem generalizes the Common Approximate Substring and Common Substring problems. GCS is known to be NP-hard; the difficulty lies in the explosion of potential candidates, since the length of the center string is not known in advance and neither is it known, for any particular biological process, which sequences may fail to contain a motif. GCS can be solved by frequent-pattern-mining techniques and is known to be fixed-parameter tractable with respect to the input sequence length and symbol-set size. The efficient Bpriori algorithms solve GCS with reasonable time/space complexity; in particular, the Bpriori 2 and Bpriori 3-2 algorithms have been proposed for center strings of any length and report the positions of all their instances in the input sequences. In this paper, we reduce the time/space complexity of the Bpriori algorithms with a Constraint-Based Frequent Pattern mining (CBFP) technique that integrates the ideas of constraint-based mining and FP-tree mining. CBFP solves the GCS problem not only for center strings of any length but also for the positions of all their mutated copies in the input sequences. It constructs a trie-like FP-tree to represent the mutated copies of center strings of any length, with constraints to restrain the growth of the consensus tree. The complexity of both CBFP and the Bpriori algorithms is analyzed in the worst and average cases, and the correctness of the algorithm is demonstrated by comparison with Bpriori on artificial data.
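The trie-like structure at the heart of the approach can be sketched minimally: a trie that records, for every substring pattern up to a bounded length, the positions of its occurrences in each input sequence. This is only an illustration of the data structure, not the CBFP algorithm itself (no mutation tolerance and no constraint pruning):

```python
# Minimal occurrence-recording trie (a sketch of the "trie-like FP-tree"
# idea; the real CBFP technique adds mutated copies and constraint pruning).
def build_trie(sequences, max_len):
    trie = {}
    for sid, seq in enumerate(sequences):
        for i in range(len(seq)):
            node = trie
            for j in range(i, min(i + max_len, len(seq))):
                node = node.setdefault(seq[j], {"_occ": []})
                node["_occ"].append((sid, i))  # (sequence id, start position)
    return trie

def occurrences(trie, pattern):
    node = trie
    for ch in pattern:
        if ch not in node:
            return []
        node = node[ch]
    return node["_occ"]

seqs = ["ACGTAC", "TACG"]
trie = build_trie(seqs, max_len=3)
```

Every frequent pattern query then becomes a single walk down the trie, which is what makes support counting cheap once the structure is built.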
Abstract: In this paper, the sums of squares of linear regression are carried over to sums of squares of semi-parametric regression. We show that the different sums of squares in linear regression correspond to various deviance statistics in semi-parametric regression. In addition, the coefficient of determination derived for the linear regression model generalizes easily to a coefficient of determination for the semi-parametric regression model. An application is then given to support the theory of linear and semi-parametric regression, and the study is further supported with a simulated-data example.
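The linear-regression side of this correspondence is the classical decomposition SST = SSR + SSE and the coefficient of determination R^2 = 1 - SSE/SST, which can be verified numerically on simulated data:

```python
import numpy as np

# Sum-of-squares decomposition SST = SSR + SSE and R^2 for ordinary
# linear regression, on simulated data (illustrative parameters).
rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 50)
y = 2.0 + 0.5 * x + rng.normal(0, 0.3, 50)

X = np.column_stack([np.ones_like(x), x])        # intercept + slope design
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta

sst = np.sum((y - y.mean()) ** 2)      # total sum of squares
sse = np.sum((y - y_hat) ** 2)         # residual (error) sum of squares
ssr = np.sum((y_hat - y.mean()) ** 2)  # regression sum of squares
r2 = 1.0 - sse / sst                   # coefficient of determination
```

It is this decomposition that the paper maps onto deviance statistics in the semi-parametric setting.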
Abstract: Segmentation of a color image composed of different kinds of regions can be a hard problem, especially when exact texture fields must be computed and when the optimum number of segmentation regions must be decided for an image containing similar and/or non-stationary texture fields. A novel neighborhood-based segmentation approach is proposed. A genetic algorithm is used in the proposed segment-pass optimization process; in this pass, an energy function defined on Markov random fields is minimized. We use an adaptive threshold estimation method for image thresholding in the wavelet domain, based on generalized Gaussian distribution (GGD) modeling of the subband coefficients. This method, called NormalShrink, is computationally efficient and adaptive because the parameters required for estimating the threshold depend on the subband data energy used in the pre-segmentation stage. A quadtree is employed to implement the multiresolution framework, which enables the use of different strategies at different resolution levels and hence accelerates the computation. The experimental results using the proposed segmentation approach are very encouraging.
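A minimal sketch of the NormalShrink-style threshold is given below. The exact parameter conventions vary slightly across papers, so this should be read as an assumption-laden illustration: the noise level is estimated robustly from the finest diagonal detail subband, and the threshold scales with the noise variance over the subband standard deviation:

```python
import numpy as np

# NormalShrink-style threshold for one wavelet subband (a sketch; the
# beta factor and level convention follow one common formulation and
# may differ from the paper's).
def normal_shrink_threshold(subband, hh1, n_levels):
    sigma = np.median(np.abs(hh1)) / 0.6745          # robust noise estimate
    sigma_y = max(subband.std(), 1e-12)              # subband std deviation
    beta = np.sqrt(np.log(subband.size / n_levels))  # scale-dependent factor
    return beta * sigma ** 2 / sigma_y

def soft_threshold(c, t):
    # shrink coefficients toward zero by t
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

rng = np.random.default_rng(2)
hh1 = rng.normal(0, 1.0, (64, 64))       # noise-dominated finest subband
subband = rng.normal(0, 2.0, (64, 64))   # synthetic subband to denoise
t = normal_shrink_threshold(subband, hh1, n_levels=3)
den = soft_threshold(subband, t)
```

Because all three quantities come directly from the subband data, no training or manual tuning is needed, which is the "adaptive" property the abstract refers to.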
Abstract: In this paper we present a novel approach for density estimation. The proposed approach uses the logistic regression model to obtain an initial estimate of the given empirical density. Since the empirical data do not exactly follow the logistic regression model, there is a deviation between the empirical density and the density estimated by the logistic regression model, and this deviation may be positive and/or negative. We model this deviation with a linear combination of Gaussians (LCG) with positive and negative components, and we use the expectation-maximization (EM) algorithm to estimate the parameters of the LCG. Experiments on real images demonstrate the accuracy of our approach.
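The EM machinery underlying the approach can be illustrated with standard two-component Gaussian mixture estimation. Note this is only the conventional EM: the paper's LCG model additionally allows negative component weights, which requires a modified, constrained M-step not shown here:

```python
import numpy as np

# Standard EM for a two-component 1-D Gaussian mixture -- the basic
# machinery the abstract builds on.  The LCG model of the paper allows
# negative weights, which needs a constrained M-step (not shown).
def em_gmm(x, n_iter=200):
    mu = np.array([x.min(), x.max()], dtype=float)   # spread-out init
    var = np.array([x.var(), x.var()])
    w = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: per-point responsibilities of each component
        pdf = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = w * pdf
        r /= r.sum(axis=1, keepdims=True)
        # M-step: closed-form updates of weights, means, variances
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

rng = np.random.default_rng(3)
x = np.concatenate([rng.normal(-2, 0.5, 500), rng.normal(3, 1.0, 500)])
w, mu, var = em_gmm(x)
```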
Abstract: Team efficacy beliefs show promise in enhancing team performance. Using a model-based quantitative research design, we investigated the antecedents and performance consequences of generalized team efficacy (potency) in a sample of 56 capital projects executed by 15 Fortune 500 companies in the process industries. Empirical analysis of our field survey showed that generalized team efficacy beliefs were positively associated with an objective measure of project cost performance. Regression analysis revealed that team competence, empowering leadership, and performance feedback all predicted generalized team efficacy beliefs, and tests of mediation revealed that generalized team efficacy fully mediated the relationship between these three inputs and project cost performance.
Abstract: The functioning of a biometric system depends in large part on the performance of its similarity measure. Frequently a generic distance function such as the Euclidean or Mahalanobis distance is applied to the task of matching biometric feature vectors. However, the accuracy of a biometric system can often be greatly improved by designing a customized matching algorithm optimized for a particular biometric application. In this paper we propose a tailored similarity measure for behavioral biometric systems, based on expert knowledge of the feature-level data in the domain. We compare the performance of the proposed matching algorithm to that of other well-known distance functions and demonstrate its superiority in the chosen domain.
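The two generic distance functions the abstract names as baselines are easily computed; note how the Mahalanobis distance discounts differences along high-variance feature directions, which is exactly the kind of domain knowledge a customized measure encodes more directly:

```python
import numpy as np

# The two generic baselines named in the abstract (the data below is
# synthetic, for illustration only).
def euclidean(u, v):
    return float(np.linalg.norm(u - v))

def mahalanobis(u, v, cov):
    d = u - v
    return float(np.sqrt(d @ np.linalg.inv(cov) @ d))

rng = np.random.default_rng(4)
# feature vectors with variance 4 along the first axis, 1 along the second
X = rng.multivariate_normal([0, 0], [[4.0, 0.0], [0.0, 1.0]], size=500)
cov = np.cov(X, rowvar=False)
u, v = np.array([2.0, 0.0]), np.array([0.0, 0.0])
d_euc = euclidean(u, v)
d_mah = mahalanobis(u, v, cov)
```

Here the same displacement yields a smaller Mahalanobis distance because it lies along the high-variance direction of the feature distribution.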
Abstract: In this paper, we propose a new method to distinguish between arousal and relaxation states using multiple features acquired from a photoplethysmogram (PPG) and a support vector machine (SVM). To induce arousal and relaxation states in subjects, two kinds of sound stimuli are used, and the corresponding biosignals are obtained with the PPG sensor. Two features, the pulse-to-pulse interval (PPI) and the pulse amplitude (PA), are extracted from the acquired PPG data, and a nonlinear classification between arousal and relaxation is performed using the SVM.
This methodology has several advantages compared with previous similar studies. First, we extract two separate features from the PPG, namely the PPI and the PA. Second, to improve classification accuracy, SVM-based nonlinear classification is performed. Third, to avoid the classification problems caused by features generalized across all subjects, we define a threshold for each subject according to his or her individual features.
Experimental results showed an average classification accuracy of 74.67%, and the proposed method showed better identification performance than single-feature-based methods. From these results, we confirm that arousal and relaxation can be classified using the SVM and PPG features.
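The two features can be extracted with simple peak detection, sketched below on a synthetic PPG-like waveform. Real PPG data would need band-pass filtering and a more robust peak detector; the signal, sampling rate, and threshold here are all illustrative assumptions:

```python
import numpy as np

# Extracting the abstract's two features -- pulse-to-pulse interval (PPI)
# and pulse amplitude (PA) -- from a synthetic PPG-like signal via naive
# local-maximum peak detection (a sketch; real PPG needs filtering).
def find_peaks(sig, min_height):
    idx = [i for i in range(1, len(sig) - 1)
           if sig[i] > sig[i - 1] and sig[i] >= sig[i + 1] and sig[i] > min_height]
    return np.array(idx)

fs = 100.0                              # assumed sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
ppg = np.sin(2 * np.pi * 1.2 * t) ** 3  # ~1.2 Hz pulse-like waveform
peaks = find_peaks(ppg, min_height=0.5)
ppi = np.diff(peaks) / fs               # pulse-to-pulse intervals, seconds
pa = ppg[peaks]                         # pulse amplitudes
```

The resulting (PPI, PA) pairs are what would then be fed to the SVM classifier.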
Abstract: Accurate demand forecasting is one of the key issues in inventory management of spare parts. Modeling future consumption becomes especially difficult for lumpy demand patterns, which are characterized by intervals with no demand at all and periods of actual demand occurrences with large variation in demand levels; many forecasting methods perform poorly when demand for an item is lumpy.
In this study, based on the characteristics of the lumpy demand patterns of spare parts, a hybrid forecasting approach is developed that uses a multi-layer perceptron neural network together with a traditional recursive method for forecasting future demands. In the described approach, the multi-layer perceptron is trained to forecast the occurrences of non-zero demands, and a conventional recursive method is then used to estimate the quantity of the non-zero demands. To evaluate the performance of the proposed approach, its forecasts were compared with those obtained by the Syntetos & Boylan approximation and by a multi-layer perceptron neural network, a generalized regression neural network, and an Elman recurrent neural network, all recently employed in this area. The models were applied to forecast the future demand for spare parts of the Arak Petrochemical Company in Iran, using 30 real data sets. The results indicate that the forecasts obtained by the proposed model are superior to those obtained by the other methods.
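The Syntetos & Boylan approximation (SBA) used as a benchmark is Croston's method with a bias correction: demand size and inter-demand interval are smoothed separately, and the ratio is scaled by (1 - alpha/2). A minimal reference implementation, with an illustrative demand history:

```python
# Syntetos-Boylan approximation (SBA) for lumpy/intermittent demand:
# Croston-style separate exponential smoothing of the non-zero demand
# size (z) and the inter-demand interval (p), with the (1 - alpha/2)
# bias correction.
def sba_forecast(demand, alpha=0.1):
    z = p = None          # smoothed demand size and interval
    q = 1                 # periods elapsed since last non-zero demand
    for d in demand:
        if d > 0:
            if z is None:
                z, p = float(d), float(q)     # initialize on first demand
            else:
                z += alpha * (d - z)
                p += alpha * (q - p)
            q = 1
        else:
            q += 1
    if z is None:
        return 0.0                            # no demand ever observed
    return (1 - alpha / 2) * z / p            # per-period demand rate

history = [0, 0, 5, 0, 0, 0, 4, 0, 6, 0, 0, 5, 0, 0, 0, 3]
f = sba_forecast(history)
```

The forecast is a per-period demand rate, which is the quantity the hybrid approach's two components (occurrence and size) jointly estimate.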
Abstract: The Com-Poisson distribution is capable of modeling count responses irrespective of their mean-variance relation, and its parameters, when fitted to simple cross-sectional data, can be efficiently estimated using the maximum likelihood (ML) method. In a regression setup, however, ML estimation of the parameters of the Com-Poisson-based generalized linear model is computationally intensive. In this paper, we propose a quasi-likelihood (QL) approach to estimate the effect of the covariates on the Com-Poisson counts and investigate the performance of this method with respect to the ML method. QL estimates are consistent and almost as efficient as ML estimates. Simulation studies show that the efficiency loss in the estimation of all the parameters using the QL approach, as compared to the ML approach, is quite negligible, whereas the QL approach is far less computationally involved than the ML approach.
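The computational burden the abstract refers to stems from the Com-Poisson normalizing constant, an infinite series that must be evaluated numerically. A minimal sketch of the pmf via series truncation (truncation point chosen to keep the terms within floating-point range):

```python
import math

# Conway-Maxwell-Poisson (Com-Poisson) pmf: P(X=k) = lam^k / (k!)^nu / Z,
# where Z is an infinite series evaluated here by truncation.  nu < 1
# gives over-dispersion, nu > 1 under-dispersion, nu = 1 the ordinary
# Poisson distribution.
def com_poisson_pmf(k, lam, nu, kmax=150):
    z = sum(lam ** j / math.factorial(j) ** nu for j in range(kmax + 1))
    return lam ** k / math.factorial(k) ** nu / z

def moments(lam, nu, kmax=150):
    # numerical mean and variance from the truncated series
    z = sum(lam ** j / math.factorial(j) ** nu for j in range(kmax + 1))
    m1 = sum(j * lam ** j / math.factorial(j) ** nu for j in range(kmax + 1)) / z
    m2 = sum(j * j * lam ** j / math.factorial(j) ** nu for j in range(kmax + 1)) / z
    return m1, m2 - m1 ** 2

# nu = 1 must recover the ordinary Poisson pmf
poisson = math.exp(-3.0) * 3.0 ** 2 / math.factorial(2)
cmp_val = com_poisson_pmf(2, lam=3.0, nu=1.0)
mean_od, var_od = moments(3.0, 0.5)   # nu < 1: over-dispersed
```

Re-evaluating this series for every observation at every optimization step is what makes ML estimation of the regression model expensive, and what the QL approach avoids.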
Abstract: Methods for organizing web data into groups, in order to analyze web-based hypertext data and facilitate data availability, are very important given the number of documents available online. The task of clustering web-based document structures therefore has many applications, e.g., improving information retrieval on the web, better understanding user navigation behavior, improving the servicing of web users' requests, and increasing web information accessibility. In this paper we investigate a new approach for clustering web-based hypertexts on the basis of their graph structures. The hypertexts are represented as so-called generalized trees, which are more general than the usual directed rooted trees, e.g., DOM trees. As an important preprocessing step, we measure the structural similarity between the generalized trees with a similarity measure d. We then apply agglomerative clustering to the resulting similarity matrix in order to create clusters of hypertext graph patterns representing navigation structures. In the present paper we run our approach on a data set of hypertext structures and obtain good results in Web Structure Mining. Furthermore, we outline the application of our approach in Web Usage Mining as future work.
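The clustering stage of this pipeline, agglomerative clustering on a precomputed pairwise distance matrix, can be sketched with SciPy. The matrix below is made up for illustration; in the paper it would come from the tree-similarity measure d:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# Agglomerative clustering on a precomputed structural-distance matrix
# (the matrix is illustrative; the paper derives it from the generalized-
# tree similarity measure d).  Items 0,1 are close, items 2,3 are close.
D = np.array([
    [0.00, 0.10, 0.90, 0.80],
    [0.10, 0.00, 0.85, 0.90],
    [0.90, 0.85, 0.00, 0.15],
    [0.80, 0.90, 0.15, 0.00],
])
Z = linkage(squareform(D), method="average")    # condensed form required
labels = fcluster(Z, t=2, criterion="maxclust")  # cut into two clusters
```

Average linkage is one reasonable choice here; the paper's preprocessing only needs to deliver a symmetric matrix of pairwise structural distances for this step to apply.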
Abstract: This paper describes the application of a model predictive controller to the problem of batch reactor temperature control. Although a great deal of work has been done to improve reactor throughput using batch sequence control, control of the actual reactor temperature remains a difficult problem for many operators of these processes. Temperature control is important because many chemical reactions are sensitive to temperature in the formation of the desired products. The controller consists of two parts: (1) a nonlinear control method, global linearizing control (GLC), which creates a linear model of the system, and (2) a model predictive controller, which computes the optimal input control sequence. The reactor temperature is tuned to track a predetermined temperature trajectory applied to the batch reactor. To do so, two input signals are used: the electrical power and the flow of coolant in the coil. Simulation results show that the proposed controller has remarkable performance in tracking the reference trajectory while remaining robust against noise imposed on the system output.
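The MPC half of this GLC+MPC structure can be sketched for a scalar linear model, which is what the GLC stage would deliver. The model coefficients, horizon, and weights below are illustrative assumptions, not values identified from a real reactor:

```python
import numpy as np

# Unconstrained linear MPC for a toy scalar model x+ = a*x + b*u, as a
# stand-in for the GLC-linearized reactor (a, b, horizon, and weights
# are illustrative).
def mpc_step(a, b, x0, ref, horizon=10, lam=0.01):
    # Prediction over the horizon: x_k = a^k x0 + sum_j a^(k-1-j) b u_j
    F = np.array([a ** k for k in range(1, horizon + 1)])
    G = np.zeros((horizon, horizon))
    for k in range(horizon):
        for j in range(k + 1):
            G[k, j] = a ** (k - j) * b
    # Minimize ||G u + F x0 - ref||^2 + lam ||u||^2 (normal equations)
    H = G.T @ G + lam * np.eye(horizon)
    u = np.linalg.solve(H, G.T @ (ref - F * x0))
    return u[0]                        # receding horizon: apply first move

a, b = 0.9, 0.5
x, ref = 0.0, np.full(10, 1.0)         # track a constant setpoint of 1.0
for _ in range(30):
    u = mpc_step(a, b, x, ref)
    x = a * x + b * u
```

The real controller would add input constraints and a time-varying reference trajectory, but the receding-horizon least-squares core is the same.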
Abstract: The IFS is a scheme for describing and manipulating complex fractal attractors using simple mathematical models. More precisely, the most popular "fractal-based" algorithms for both representation and compression of computer images involve some implementation of the method of Iterated Function Systems (IFS) on complete metric spaces. In this paper a new generalized space, called the multi-fuzzy fractal space, is constructed. On this space a distance function is defined, and its completeness is proved. The completeness of this space ensures the existence of a fixed-point theorem for the family of continuous mappings; this theorem is the fundamental result on which the IFS methods are based and from which the fractals are built. The defined mappings are proved to satisfy some generalizations of the contraction condition.
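The ordinary (crisp) IFS fixed-point idea that the paper generalizes can be illustrated with the chaos game: three contraction maps of ratio 1/2 whose unique attractor, guaranteed by the contraction fixed-point theorem, is the Sierpinski triangle. This sketch only shows the classical case, not the multi-fuzzy generalization:

```python
import random

# Chaos game for a classical IFS: three maps w_i(p) = (p + v_i) / 2,
# each a contraction with ratio 1/2 toward a triangle vertex v_i.
# Iterating a random composition drives any start point onto the
# attractor (the Sierpinski triangle).
VERTICES = [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0)]

def chaos_game(n, seed=0):
    random.seed(seed)
    x, y = 0.3, 0.3                # arbitrary start inside the triangle
    pts = []
    for _ in range(n):
        vx, vy = random.choice(VERTICES)
        x, y = (x + vx) / 2, (y + vy) / 2
        pts.append((x, y))
    return pts

pts = chaos_game(5000)
```

Plotting `pts` reveals the triangle; the point is that the attractor is a fixed point of the combined map, which is exactly the structure the completeness proof in the paper secures for the generalized space.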
Abstract: In this paper, by using Mawhin's continuation theorem of coincidence degree and a method based on a delay differential inequality, some sufficient conditions are obtained for the existence and global exponential stability of periodic solutions of cellular neural networks with distributed delays and impulses on time scales. The results of this paper generalize previously known results.
Abstract: The inherent flexibility of XML in both structure and semantics makes mining XML data a complex task, with more challenges than traditional association rule mining in relational databases. In this paper, we propose a new model for the effective extraction of generalized association rules from an XML document collection. We directly use frequent subtree mining techniques in the discovery process and do not ignore the tree structure of the data in the final rules. The frequent subtrees, selected according to the user-provided support, are split into complementary subtrees to form the rules. We explain our model in multiple steps, from data preparation to rule generation.
Abstract: In this paper, a nonconforming mixed finite element method is studied for semilinear pseudo-hyperbolic partial integro-differential equations. Using the interpolation technique instead of the generalized elliptic projection, optimal error estimates for the corresponding unknown function are given.
Abstract: A new generalization of the new class of matrix polynomial sets is obtained. An explicit representation and an expansion of the matrix exponential in a series of these matrices are given for these matrix polynomials.
Abstract: In this study, two new classes of generalized homeomorphisms are introduced, and one of these classes is shown to have a group structure. Moreover, some properties of these two classes of homeomorphisms are obtained.
Abstract: Requirements that should be met when determining the regimes of circuits with variable elements are formulated. An interpretation of the variations in the regimes, based on projective geometry, enables adequate expressions for determining and comparing the regimes to be derived. It is proposed to use, as the parameters of a generalized equivalent generator of an active two-pole with a changeable resistor, the load current and voltage that make the current through this resistor equal to zero.