Abstract: The back-propagation (BP) algorithm calculates the weight
changes of an artificial neural network, and a two-term algorithm
with a dynamically optimal learning rate and a momentum factor
is commonly used. Recently, the addition of an extra term, called a
proportional factor (PF), to the two-term BP algorithm was proposed.
The third term increases the speed of the BP algorithm. However,
the PF term can also impair the convergence of the BP algorithm, so
optimization approaches for evaluating the learning parameters are
required to make the three-term BP algorithm practical to apply.
This paper considers the optimization of the new back-propagation
algorithm by using derivative information. A family of approaches
exploiting the derivatives with respect to the learning rate, momentum
factor, and proportional factor is presented. These approaches
autonomously compute the derivatives in the weight space, using
information gathered from the forward and backward procedures. The three-term
BP algorithm and the optimization approaches are evaluated using
the benchmark XOR problem.
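As a concrete illustration of the update rule described above, the sketch below trains a small network on the XOR benchmark with a three-term weight change: a gradient term scaled by the learning rate, a momentum term reusing the previous change, and a proportional-factor term scaled by the current output error. The network size (2-2-1), sigmoid activations, and the constants `lr`, `mom`, and `pf` are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

# Three-term BP sketch on XOR: weight change = learning-rate * gradient
# + momentum * previous change + proportional factor * current error.
# All architecture and parameter choices here are illustrative assumptions.

rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[0.], [1.], [1.], [0.]])

def sig(z):
    return 1.0 / (1.0 + np.exp(-z))

W1 = rng.normal(0.0, 1.0, (2, 2)); b1 = np.zeros(2)
W2 = rng.normal(0.0, 1.0, (2, 1)); b2 = np.zeros(1)
dW1 = np.zeros_like(W1); db1 = np.zeros_like(b1)
dW2 = np.zeros_like(W2); db2 = np.zeros_like(b2)

lr, mom, pf = 0.5, 0.7, 0.01  # learning rate, momentum, proportional factor

def forward(X):
    H = sig(X @ W1 + b1)          # hidden layer
    return H, sig(H @ W2 + b2)    # output layer

err0 = np.mean((T - forward(X)[1]) ** 2)  # mean squared error before training

for _ in range(5000):
    H, Y = forward(X)
    e = T - Y                      # output error; this also drives the PF term
    dY = e * Y * (1.0 - Y)         # output delta (descent direction)
    dH = (dY @ W2.T) * H * (1.0 - H)

    # three-term update: gradient + momentum + proportional factor
    dW2 = lr * (H.T @ dY) + mom * dW2 + pf * (H.T @ e)
    db2 = lr * dY.sum(0) + mom * db2 + pf * e.sum(0)
    dW1 = lr * (X.T @ dH) + mom * dW1 + pf * (X.T @ (e @ W2.T))
    db1 = lr * dH.sum(0) + mom * db1 + pf * (e @ W2.T).sum(0)

    W2 += dW2; b2 += db2
    W1 += dW1; b1 += db1

err1 = np.mean((T - forward(X)[1]) ** 2)  # mean squared error after training
```

With a small `pf`, the extra term mostly accelerates early training; as the abstract notes, an ill-chosen proportional factor can hurt convergence, which is why the parameter-optimization approaches are needed.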
Abstract: In this paper, two models based on functional networks
are employed to solve a classification problem. Functional networks
are generalized neural networks that permit the specification of
their initial topology using knowledge about the problem at hand. In
this setting, after analyzing the available data and their relations, we
systematically discuss a numerical analysis method for
functional networks and apply two functional network models to
the XOR problem. The XOR problem, which cannot be solved by a
two-layered neural network, can be solved by a two-layered functional
network; this reveals the computational power of functional
networks. The performance of the proposed models was validated
on classification problems.
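The contrast drawn above can be sketched as follows: where a two-layered perceptron cannot separate XOR, a functional network learns the coefficients of neuron functions rather than connection weights, so its parameters can be fitted by ordinary least squares in one step. The basis choice {1, x, y, xy} below is an illustrative assumption, not the paper's exact construction.

```python
import numpy as np

# Functional-network sketch for XOR: learn coefficients of basis functions
# (here {1, x, y, x*y}) by least squares instead of iterating on weights.
# The basis choice is an illustrative assumption.

X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
t = np.array([0., 1., 1., 0.])

# design matrix: each column is one basis function evaluated at the inputs
Phi = np.column_stack([np.ones(4), X[:, 0], X[:, 1], X[:, 0] * X[:, 1]])

# one-step parameter estimation by ordinary least squares
coef, *_ = np.linalg.lstsq(Phi, t, rcond=None)
pred = Phi @ coef
# coef ≈ [0, 1, 1, -2], i.e. f(x, y) = x + y - 2xy reproduces XOR exactly
```

The product term xy is what the linear perceptron lacks; because the functional network may embed such problem knowledge in its topology, the XOR mapping becomes exactly representable.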