Abstract: We have applied a new accelerated algorithm for linear discriminant analysis (LDA) to face recognition with a support vector machine. The new algorithm has the advantage of optimal selection of the step size. Both the gradient descent method and the new algorithm have been implemented in software and evaluated on the Yale face database B. The eigenfaces obtained by these approaches have been used to train a KNN classifier. The recognition rate of the new algorithm is compared with that of gradient descent.
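The abstract does not specify the accelerated algorithm itself, but the contrast it draws can be illustrated generically: on a quadratic objective, gradient descent admits a closed-form optimal step size per iteration, whereas a fixed step must be tuned by hand. A minimal sketch, with a hypothetical 2x2 problem, assuming the textbook objective f(x) = ½xᵀAx − bᵀx:

```python
# Illustrative sketch only: gradient descent on a quadratic objective
# f(x) = 0.5 x^T A x - b^T x, comparing a fixed step size with the
# per-iteration optimal step alpha = (g^T g) / (g^T A g).
# The paper's accelerated LDA algorithm is not given in the abstract;
# the matrix A, vector b and step 0.1 below are hypothetical.

def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gradient_descent(A, b, steps=50, alpha=None):
    """Minimize 0.5 x^T A x - b^T x. If alpha is None, use the
    optimal step size each iteration; otherwise use the fixed alpha."""
    x = [0.0] * len(b)
    for _ in range(steps):
        g = [ax - bi for ax, bi in zip(matvec(A, x), b)]  # gradient A x - b
        if dot(g, g) < 1e-18:
            break
        step = alpha if alpha is not None else dot(g, g) / dot(g, matvec(A, g))
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x_opt = gradient_descent(A, b)             # optimal step each iteration
x_fix = gradient_descent(A, b, alpha=0.1)  # hand-tuned fixed step
```

Both runs approach the exact solution (1/11, 7/11) of Ax = b; the optimally stepped run needs no tuning and converges in far fewer iterations.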
Abstract: At the end of the 20th century, the development of transport corridors and the improvement of their technical parameters became topical. To this end, many countries, Georgia among them, undertook the construction of new highways and railways as well as the reconstruction and modernization of existing transport infrastructure. It is necessary to examine the artificial structures (bridges and tunnels) on the existing routes, as they are very old. This report addresses the peculiarities of tunnel reconstruction, a theme we consider important for the modernization of the existing road infrastructure. We must remark that the methods for determining the mining pressure on tunnels have been worked out for the design of new tunnels, whereas it is necessary to account for the additional mining pressure that arises during reconstruction. This report presents methods for calculating the additional mining pressure during tunnel reconstruction; a computer program was developed, and it was determined that during reconstruction the additional mining pressure is one third of the main mining pressure.
Abstract: This paper proposes rough set models with three different levels of knowledge granules in incomplete information systems under a tolerance relation defined by the similarity between objects according to their attribute values. By introducing a dominance relation on the universe of discourse, similarity classes are decomposed into three subclasses (a little-better subclass, a little-worse subclass and a vague subclass), which in turn decomposes the lower and upper approximations into three components. Using these components, information retrieval can effectively find naturally hierarchical expansions of queries and construct answers to elaborative queries. The approach is illustrated by applying the rough set models to the design of an information retrieval system that accesses documents expanded at different granularities. The proposed method enhances the application of rough set models through more flexible expansions and elaborative queries in information retrieval.
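The three-way decomposition above builds on the standard tolerance-based lower and upper approximations, which can be sketched concretely. A minimal illustration, assuming the usual convention that a missing value (here `None`) is tolerant of anything; the objects and target set are hypothetical:

```python
# Hedged sketch: lower and upper approximations under a tolerance
# relation in an incomplete information system (None marks a missing
# attribute value). The paper's decomposition into little-better,
# little-worse and vague subclasses is not reproduced here; this shows
# only the baseline approximations it refines.

def tolerant(x, y):
    """Two objects are tolerant if their known attribute values agree."""
    return all(a is None or b is None or a == b for a, b in zip(x, y))

def approximations(objects, target):
    """Return (lower, upper) approximations of the index set `target`."""
    classes = {i: {j for j, y in enumerate(objects) if tolerant(x, y)}
               for i, x in enumerate(objects)}
    lower = {i for i, c in classes.items() if c <= target}
    upper = {i for i, c in classes.items() if c & target}
    return lower, upper

objects = [
    (1, 0, None),  # object 0: third attribute unknown
    (1, 0, 2),     # object 1
    (0, None, 2),  # object 2: second attribute unknown
    (0, 1, 1),     # object 3
]
lower, upper = approximations(objects, target={1, 2})
```

Object 0 is tolerant of object 1 (the unknown value matches anything), so its tolerance class overlaps the target without being contained in it: it lands in the upper but not the lower approximation.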
Abstract: An approach is offered for more precise detection of baselines (border lines) in handwritten cursive text, and general problems of handwritten text segmentation are also analyzed. The proposed method addresses problems arising in handwriting recognition for text with a specific slant, in other words, where the letters of the words do not lie on the same vertical line. Some recognition systems use the ascending and descending parts of letters, found after detecting the word's baseline, as informative features; in such systems, failures in baseline detection degrade recognition quality and lower the recognition rate. Unlike other methods, here the borders are found over small pieces containing segmentation elements and are defined as a set of linear functions, with separate borders found for the top and bottom baselines. Finally, Azerbaijani cursive handwritten texts written in the Latin alphabet by different authors are analyzed.
Abstract: Shirvan is located on a plain in Northern Khorasan province, in the north east of Iran, and has a semiarid to temperate climate. To investigate the annual changes in some qualitative parameters, such as electrical conductivity, total dissolved solids and chloride concentration, over ten consecutive years, fourteen groundwater sources, including deep as well as semi-deep wells, were sampled and analyzed using standard methods. The trends in the obtained data over these years were analyzed, and the effects of different factors on the changes in electrical conductivity, chloride concentration and total dissolved solids were clarified. The results showed that the values of some qualitative parameters increased over the 10-year period, leading to a decrease in water quality. The results also showed that growth in urban population, together with extensive industrialization in the studied area, are the most important influences on groundwater quality. Furthermore, a decrease in water quantity is also evident, owing to greater water utilization and the droughts that have occurred in the region in recent years.
Abstract: Hazard rate estimation is one of the important topics in forecasting earthquake occurrence. Forecasting earthquake occurrence is part of statistical seismology, whose main subject is the point process. Generally, the earthquake hazard rate is estimated from the point process likelihood equation, called the Hazard Rate Likelihood of Point Process (HRLPP). In this research, we develop an estimation method, hazard rate single decrement (HRSD), adapted from an estimation method used in actuarial studies. Here, each individual is associated with an earthquake whose inter-event time is exponentially distributed. The epicenter and occurrence time of each earthquake are used to estimate the hazard rate. Finally, a case study of earthquake hazard rate is given, and the hazard rates obtained by the HRLPP and HRSD methods are compared.
Abstract: This paper deals with the project selection problem. The project selection problem first arose in the field of operations research, following production concepts from the basic product mix problem. Later, the introduction of managerial considerations brought qualitative factors and criteria into the problem alongside quantitative ones. To handle both kinds of criteria, an analytic network process is developed in this paper, enhanced with fuzzy set theory to capture the vagueness of experts' judgments in evaluating the alternatives. Additionally, a modified version of the least-squares method, formulated as a non-linear programming model, is added to the developed group decision making structure in order to elicit the final weights from the comparison matrices. Finally, the structure developed in this paper is validated on a case study, and a sensitivity analysis is performed to validate the response of the model to changes in conditions.
Abstract: The delivery of commodities over a network is an important design problem in daily life and in many transportation applications. Delivery performance is evaluated by the system reliability of delivering commodities from a source node to a sink node in the network, and the system reliability is maximized to find the optimal routing. The design problem is not simple, however, because (1) each path segment has randomly distributed attributes; (2) there are multiple commodities that consume various path capacities; and (3) the optimal routing must complete the delivery process within the allowable time constraints. In this paper, we focus on the design optimization of the Multi-State Flow Network (MSFN) for multiple commodities. We propose an efficient approach to evaluate the system reliability in the MSFN with respect to randomly distributed path attributes and to find the optimal routing subject to the allowable time constraints. The delivery rates, also known as delivery currents, of the path segments are evaluated, and the minimal-current arcs are eliminated to reduce the complexity of the MSFN. The correct optimal routing is thereby found and its worst-case reliability is evaluated; the reliability of the optimal routing is shown to be at least as high as the worst-case measure. Two benchmark examples are used to demonstrate the proposed method, and comparisons between the original and the reduced networks show that the proposed method is very efficient.
Abstract: Many studies have applied the Theory of Planned Behavior (TPB) to predicting health behaviors among unique populations. However, a new paradigm is emerging in which the focus is directed to modification and expansion of the TPB model rather than utilization of the traditional theory. This review proposes new models modified from the Theory of Planned Behavior and suggests an appropriate study design that can be used to test the models within the physical activity and dietary practice domains among Type 2 diabetics in Kenya. The review was conducted by means of a literature search in the fields of nutrition behavior, health psychology and mixed methods, using predetermined key words. The results identify pre-intention and post-intention gaps within the TPB model that need to be filled. Additional psychosocial factors are proposed for inclusion in the TPB model to generate new models, whose efficacy is to be tested using a mixed methods design.
Abstract: Locality Sensitive Hashing (LSH) is one of the most promising techniques for solving the nearest neighbour search problem in high dimensional spaces. Euclidean LSH is the most popular variant of LSH and has been successfully applied in many multimedia applications. However, Euclidean LSH has limitations that affect both structure and query performance. Its main limitation is large memory consumption: to achieve good accuracy, a large number of hash tables is required. In this paper, we propose a new hashing algorithm that overcomes the storage space problem and improves query time, while keeping accuracy similar to that achieved by the original Euclidean LSH. Experimental results on a real large-scale dataset show that the proposed approach achieves good performance and consumes less memory than Euclidean LSH.
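The improved algorithm is not described in the abstract, but the baseline it compares against is the standard Euclidean LSH family of Datar et al., h(v) = ⌊(a·v + b)/w⌋ with a Gaussian projection vector a and offset b uniform in [0, w). A minimal sketch (the dimension, bucket width and sample points are hypothetical):

```python
# Hedged sketch of the standard Euclidean LSH hash family:
# h(v) = floor((a . v + b) / w). Each hash table in an LSH index
# concatenates several such functions, which is exactly why the
# memory cost the abstract criticizes grows with the table count.
import random

def make_euclidean_hash(dim, w, rng):
    """Build one hash function h(v) with Gaussian a and uniform b."""
    a = [rng.gauss(0.0, 1.0) for _ in range(dim)]
    b = rng.uniform(0.0, w)
    def h(v):
        return int((sum(ai * vi for ai, vi in zip(a, v)) + b) // w)
    return h

rng = random.Random(0)
h = make_euclidean_hash(dim=3, w=4.0, rng=rng)
p = [1.0, 2.0, 3.0]
q = [1.1, 2.1, 2.9]
# Nearby points such as p and q collide with high probability under a
# single h; any one draw gives no guarantee, so many tables are kept.
```

The closure is deterministic once drawn: the same vector always maps to the same integer bucket.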
Abstract: Combining classifiers is a useful method for solving
complex problems in machine learning. The ECOC (Error Correcting
Output Codes) method has been widely used for designing combining
classifiers with an emphasis on the diversity of classifiers. In this
paper, in contrast to the standard ECOC approach in which individual
classifiers are chosen homogeneously, classifiers are selected
according to the complexity of the corresponding binary problem. We
use the SATIMAGE database (containing 6 classes) for our experiments. The recognition error rate of our proposed method is 10.37%, which indicates a considerable improvement over the conventional ECOC and stacked generalization methods.
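The generic ECOC mechanism referred to above can be sketched briefly: each class receives a binary codeword, each bit position is one binary classifier, and a sample is assigned to the class whose codeword is nearest in Hamming distance to the observed classifier outputs. A minimal sketch with a hypothetical 4-class, 6-bit code (the paper's complexity-based classifier selection is not reproduced):

```python
# Hedged sketch of ECOC decoding. The codebook below is hypothetical,
# with minimum pairwise Hamming distance 4, so any single classifier
# error is corrected; it is not the code used in the paper.

CODEBOOK = {
    "A": (0, 0, 0, 1, 1, 1),
    "B": (0, 1, 1, 0, 0, 1),
    "C": (1, 0, 1, 0, 1, 0),
    "D": (1, 1, 0, 1, 0, 0),
}

def hamming(u, v):
    return sum(a != b for a, b in zip(u, v))

def ecoc_decode(bits, codebook=CODEBOOK):
    """Pick the class whose codeword is closest to the observed bits."""
    return min(codebook, key=lambda c: hamming(codebook[c], bits))

# One flipped bit still decodes to the right class:
label = ecoc_decode((1, 0, 1, 0, 1, 1))  # one bit away from class C
```

The diversity the abstract mentions comes from each binary classifier solving a different class partition; the paper's contribution is to choose those classifiers by the difficulty of each partition rather than homogeneously.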
Abstract: Cellular automata have been used in the design of cryptosystems. Recently, some secret sharing schemes based on linear memory cellular automata, applicable to both text and images, have been introduced. In this paper, we show that these secret sharing schemes are vulnerable to collusion among dishonest participants. We propose a cheating model for secret sharing schemes based on linear memory cellular automata, and for this purpose we present a novel uniform model for representing all secret sharing schemes based on cellular automata. Participants can cheat by sending bogus shares or bogus transition rules. Cheaters can cooperate to corrupt a shared secret and compute a cheating value added to it; honest participants, unaware of the cheating, accept the incorrect secret as the valid one. We prove that the cheaters can recover the valid secret by removing the cheating value from the corrupted secret, and we provide methods for calculating the cheating value.
Abstract: The prediction of transmembrane helical segments (TMHs) in membrane proteins is an important topic in bioinformatics research. In this paper, a method based on the discrete wavelet transform (DWT) has been developed to predict the number and location of TMHs in membrane proteins. The protein with PDB code 1F88 was chosen as an example to describe the prediction of the number and location of TMHs using this method. A test data set containing a total of 19 protein sequences was used to assess the effectiveness of the method. Compared with the prediction results of DAS, PRED-TMR2, SOSUI, HMMTOP2.0 and TMHMM2.0, the obtained results indicate that the presented method has higher prediction accuracy.
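The kind of decomposition such methods apply to a hydrophobicity profile can be illustrated with a single level of the Haar DWT, the simplest wavelet: the approximation coefficients keep the smooth hydrophobic trend in which helices show up, while the detail coefficients absorb residue-level noise. A minimal sketch (the wavelet, level count and decision rule actually used in the paper are not given in the abstract, and the profile values below are hypothetical):

```python
# Hedged sketch: one level of the Haar discrete wavelet transform
# applied to a short hydrophobicity profile. This illustrates the
# smoothing/denoising split a DWT-based TMH predictor relies on,
# not the paper's specific pipeline.
import math

def haar_dwt(signal):
    """One-level Haar DWT; returns (approximation, detail) lists.
    Assumes an even-length input."""
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
    return approx, detail

# Hypothetical hydrophobicity values along a short sequence window:
profile = [1.8, 2.5, 4.2, 4.5, 1.3, -0.4, -3.5, -3.2]
approx, detail = haar_dwt(profile)
```

The transform is orthonormal, so energy is preserved and the original profile is exactly recoverable from the two coefficient lists.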
Abstract: The anaerobic digestion process is one of the alternative methods for converting organic waste into methane gas, a fuel and energy source. The activity of various kinds of microorganisms is the main driver of the anaerobic digestion that produces methane gas. In this study, a modified Anaerobic Baffled Reactor (ABR) with a working volume of 50 liters was therefore designed to identify the microorganisms involved in biogas production. A mixture of 75% kitchen waste and 25% sewage sludge was used as substrate. Observations of the microorganisms in the ABR showed small populations of protozoa (5%) and fungi (2%) in the system, while almost 93% of the microbial population consisted of bacteria. It is thus clear that bacteria are responsible for the anaerobic biodegradation of kitchen waste. Results show that in the acidification zone of the ABR (the front compartments of the reactor), fast-growing bacteria capable of growth at high substrate levels and reduced pH were dominant, and that a shift to slower-growing scavenging bacteria that grow better at higher pH occurred toward the end of the reactor. Owing to their ability to remain active in an acetate environment, the percentages of Methanococcus, Methanosarcina and Methanothrix were higher than those of other methane formers in the system.
Abstract: Recent years have seen a reinvigoration of interest in drug discovery from medicinal plants for the support of health in all parts of the world. This study was designed to examine the in vitro antimicrobial activities of methanolic and ethanolic extracts of the flowers and leaves of Chenopodium album L. The flowers and leaves of Chenopodium album Linn. were collected from East Esfahan, Iran. The effects of the methanolic and ethanolic extracts were tested against 4 bacterial strains using the disc and well diffusion methods. Results showed that the flower and leaf methanolic and ethanolic extracts of C. album have no activity against the selected bacterial strains. Our study indicates that various factors affect the antimicrobial properties of plant extracts.
Abstract: After the terrorist attack of September 11, 2001 in the U.S., the container security issue received great attention, especially from the U.S. government, which deployed many measures to promote or improve security systems. The U.S. government not only enhances its national security system but also allies with other countries against potential future terrorist attacks. The CSI (Container Security Initiative), for example, encourages foreign ports outside the U.S. to become CSI ports as part of the U.S. anti-terrorism network. Although promoting security can partly achieve the goal of anti-terrorism, it affects the efficiency of the container supply chain, which is the main concern when implementing inspection measures. This paper proposes a quick estimation methodology for the inspection service rate, based on a berth allocation heuristic, such that the inspection activities will not affect the original container supply chain. Theoretical and simulation results show that this approach is effective.
Abstract: A Fuzzy Cognitive Map (FCM) is a causal graph that shows the relations between the essential components of a complex system. Experts who are familiar with the system components and their relations can generate a related FCM. A significant gap arises, however, when human experts cannot produce an FCM, or when no expert is available to produce one, so a new mechanism must be used to bridge this gap. In this paper, a novel learning method is proposed to construct the causal graph from historical data using a metaheuristic, namely Tabu Search (TS). The efficiency of the proposed method is shown by comparing its results on several numerical examples with those of other methods.
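The inference rule such a learning method fits can be sketched: concept activations are updated through the weight matrix and squashed by a sigmoid, and the learner (here Tabu Search, omitted) adjusts the weights until simulated activations match the historical data. A minimal sketch of one synchronous update step, with a hypothetical weight matrix and threshold parameter:

```python
# Hedged sketch of one FCM inference step: A(t+1)_j = f(sum_i W_ij A(t)_i)
# with a sigmoid f. The Tabu Search that would tune W against historical
# data is not shown; W, lam and the initial state are hypothetical.
import math

def fcm_step(activations, weights, lam=1.0):
    """One synchronous FCM update. weights[i][j] is the causal
    influence of concept i on concept j."""
    n = len(activations)
    nxt = []
    for j in range(n):
        s = sum(weights[i][j] * activations[i] for i in range(n))
        nxt.append(1.0 / (1.0 + math.exp(-lam * s)))  # sigmoid squashing
    return nxt

W = [[0.0, 0.6, -0.3],
     [0.0, 0.0, 0.8],
     [0.5, 0.0, 0.0]]   # hypothetical causal weights in [-1, 1]
state = [0.5, 0.2, 0.9]
state = fcm_step(state, W)
```

Iterating the step drives the map toward a fixed point or cycle; a learner scores candidate weight matrices by how closely these trajectories track the recorded system states.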
Abstract: This paper presents a multi-objective model addressing the two main objectives in designing rural road networks: minimization of user operation costs and maximization of the population covered. As budgets are often limited, a reasonable trade-off must be found between cost and social benefit in this type of network. The model is solved for a real-world rural road network, and all non-dominated solutions are obtained. An analysis is then made of the (possibly) most interesting solutions, those providing the better trade-offs. This analysis, coupled with knowledge of the real-world scenario (typically provided by decision makers), provides a suitable method for evaluating road networks in rural areas of developing countries.
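The non-dominated solutions mentioned above are those for which no other solution is at least as good in both objectives and strictly better in one. A minimal sketch of that filtering step for the two objectives in the abstract (the candidate solutions below are hypothetical; the actual network model and solver are not given):

```python
# Hedged sketch: extracting the Pareto front for (cost, coverage)
# pairs, minimizing cost and maximizing population covered.

def dominates(a, b):
    """a = (cost, coverage). a dominates b if a is no worse in both
    objectives and strictly better in at least one."""
    return (a[0] <= b[0] and a[1] >= b[1]) and (a[0] < b[0] or a[1] > b[1])

def pareto_front(solutions):
    """Keep only solutions not dominated by any other solution."""
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t != s)]

# Hypothetical (cost, population covered) candidates:
candidates = [(100, 5000), (120, 5000), (150, 9000), (90, 3000), (150, 8000)]
front = pareto_front(candidates)
```

Here (120, 5000) is dominated by (100, 5000) (same coverage, lower cost) and (150, 8000) by (150, 9000); the remaining three points form the trade-off curve a decision maker would inspect.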
Abstract: This paper presents an online method that learns the corresponding points of an object from un-annotated grayscale images containing instances of the object. In the first image processed, an ensemble of node points is automatically selected and then matched in the subsequent images. A Bayesian posterior distribution for the locations of the nodes in the images is formed: the likelihood is built from Gabor responses, and the prior assumes the mean shape of the node ensemble to be similar in a translation- and scale-free space. An association model is applied to separate object nodes from background nodes, and the posterior distribution is sampled with a Sequential Monte Carlo method. The matched object nodes are inferred to be the corresponding points of the object instances. The results show that our system matches the object nodes as accurately as other methods that train the model on annotated training images.