Abstract: This paper presents a new way of looking at online
principal component analysis (OPCA). Given a data matrix X ∈
R^{m×n}, we characterise the online updates of its covariance as a
matrix perturbation problem. Up to the principal components, it
turns out that online updates of the batch PCA can be captured
by a symmetric matrix perturbation of the batch covariance matrix.
We show that, for n ≥ n_0 ≫ 1, the batch covariance and
its update become almost similar. Finally, we use our new setup of
online updates to derive a bound on the angular distance between the
principal components of X and those of its update.
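
To make the perturbation viewpoint concrete, the following sketch works out one online step as a symmetric perturbation; the 1/n normalization and the single-sample update rule are illustrative assumptions, not taken from the paper.

% Batch covariance of the first n samples x_1, ..., x_n in R^m, and its
% update when one new sample x arrives:
\[
  C_n = \frac{1}{n}\sum_{i=1}^{n} x_i x_i^{\top}, \qquad
  C_{n+1} = \frac{n}{n+1}\,C_n + \frac{1}{n+1}\,x\,x^{\top}
          = C_n + \underbrace{\tfrac{1}{n+1}\bigl(x x^{\top} - C_n\bigr)}_{E_n}.
\]
% E_n is symmetric with norm O(1/n) for bounded samples, so the online
% update is a vanishing symmetric perturbation of the batch covariance.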
Abstract: Minimization methods for training feed-forward networks with backpropagation are compared. Feed-forward network training is a special case of functional minimization, where no explicit model of the data is assumed. Due to the high dimensionality of the data, linearization of the training problem through the use of orthogonal basis functions is therefore not desirable. The focus is on functional minimization on any basis. A number of methods based on local gradient and Hessian matrices are discussed, and modifications of several first- and second-order training methods are considered. Experiments on share-rate data show that conjugate-gradient and quasi-Newton methods outperform gradient-descent methods. The Levenberg-Marquardt algorithm is of special interest in financial forecasting.
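
As a complement, here is a minimal, self-contained sketch of the kind of comparison the abstract describes, using synthetic data in place of the share-rate series and SciPy's generic optimizers in place of the paper's implementations; every name, architecture choice, and hyperparameter below is an illustrative assumption.

import numpy as np
from scipy.optimize import minimize, least_squares, approx_fprime

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                 # inputs (e.g. lagged rates)
y = np.sin(X @ np.array([1.0, -0.5, 0.3]))    # synthetic target series

H = 5                                         # hidden units
n_params = 3 * H + H + H + 1                  # W1, b1, W2, b2

def unpack(theta):
    # Slice the flat parameter vector into the network's weights.
    i = 0
    W1 = theta[i:i + 3 * H].reshape(3, H); i += 3 * H
    b1 = theta[i:i + H]; i += H
    W2 = theta[i:i + H]; i += H
    b2 = theta[i]
    return W1, b1, W2, b2

def loss(theta):
    # Forward pass of a one-hidden-layer tanh network, mean squared error.
    W1, b1, W2, b2 = unpack(theta)
    pred = np.tanh(X @ W1 + b1) @ W2 + b2
    return 0.5 * np.mean((pred - y) ** 2)

theta0 = 0.1 * rng.normal(size=n_params)

# Plain gradient descent with a finite-difference gradient (baseline only;
# backpropagation would supply exact gradients instead).
theta = theta0.copy()
for _ in range(500):
    theta -= 0.1 * approx_fprime(theta, loss, 1e-6)
print("gradient descent   :", loss(theta))

# Conjugate-gradient and quasi-Newton (BFGS) minimizers from SciPy.
for method in ("CG", "BFGS"):
    res = minimize(loss, theta0, method=method)
    print(f"{method:19s}:", res.fun)

# Levenberg-Marquardt operates on the residual vector, not the scalar loss.
def residuals(theta):
    W1, b1, W2, b2 = unpack(theta)
    return np.tanh(X @ W1 + b1) @ W2 + b2 - y

res_lm = least_squares(residuals, theta0, method="lm")
print("Levenberg-Marquardt:", 0.5 * np.mean(res_lm.fun ** 2))

On a smooth least-squares objective like this, the curvature-aware methods (CG, BFGS, Levenberg-Marquardt) typically reach a lower loss in far fewer iterations than fixed-step gradient descent, which is the qualitative pattern the abstract reports.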