An Improved Conjugate Gradient Based Learning Algorithm for Back Propagation Neural Networks

The conjugate gradient optimization algorithm is combined with the modified back propagation algorithm to yield a computationally efficient algorithm for training multilayer perceptron (MLP) networks (CGFR/AG). The computational efficiency is enhanced by adaptively modifying the initial search direction, as described in the following steps: (1) modification of the standard back propagation algorithm by introducing a gain-variation term in the activation function, (2) calculation of the gradient of the error with respect to the weight and gain values, and (3) determination of a new search direction using the information calculated in step (2). The performance of the proposed method is demonstrated by comparing its accuracy and computation time with those of the conjugate gradient algorithm used in the MATLAB neural network toolbox. The results show that the computational efficiency of the proposed method is better than that of the standard conjugate gradient algorithm.
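The gain term referred to in steps (1) and (2) can be read as a per-neuron slope parameter of the activation function. The following is a minimal sketch, assuming a logistic activation f(net) = 1/(1 + exp(-c*net)) with gain c and a squared-error loss for a single output node; the function and variable names are illustrative, not the paper's.

import numpy as np

def activation(net, c):
    # Logistic activation with gain c: f(net) = 1 / (1 + exp(-c * net)).
    return 1.0 / (1.0 + np.exp(-c * net))

def gradients(x, target, w, c):
    # Gradients of E = 0.5 * (o - t)^2 with respect to the weights w
    # and the gain c, for one output node with input vector x.
    net = x @ w                           # weighted input
    o = activation(net, c)                # output under gain c
    delta = (o - target) * o * (1.0 - o)  # common factor (o - t) * f'(net) / c
    grad_w = delta * c * x                # dE/dw: the gain scales the usual term
    grad_c = delta * net                  # dE/dc: gradient w.r.t. the gain itself
    return grad_w, grad_c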

From External Varieties to Internal Varieties − A Modular Perspective

Product customization is an essential requirement for manufacturing firms seeking higher customer satisfaction and the fulfillment of business targets. To achieve these objectives, firms need to handle both external varieties, such as customer preferences, government regulations, and cultural considerations, and internal varieties, such as the functional requirements of the product, production efficiency, and quality. Both kinds of variety need to be captured and integrated in order to produce a customized product. These varieties are presented and discussed in this paper from the perspective of modular product design and the development process. Related development strategies, such as modularity, component commonality, product family design, and product platforms, are presented with a view to achieving product variety quickly and economically. A case example covering both the concept of modular design and the platform-based product development process is also presented with the help of the design structure matrix (DSM) tool. The paper concludes with several managerial implications and directions for future research.
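A DSM, as used in the case example, is simply a square matrix whose rows and columns both index the product's components, with a nonzero entry marking an interaction between two components. The following is a minimal sketch with hypothetical component names; reordering rows and columns to expose dense diagonal blocks is the usual way modules are identified.

import numpy as np

# Hypothetical components of a product; the names are illustrative only.
components = ["frame", "motor", "controller", "display", "battery"]
idx = {name: i for i, name in enumerate(components)}

# Build the DSM: entry [i, j] = 1 means components i and j interact.
dsm = np.zeros((len(components), len(components)), dtype=int)
interactions = [("motor", "frame"), ("controller", "motor"),
                ("controller", "battery"), ("display", "controller")]
for a, b in interactions:
    dsm[idx[a], idx[b]] = 1
    dsm[idx[b], idx[a]] = 1  # treat interactions as symmetric

# Components whose rows share many entries are candidates for the same
# module; clustering the matrix into dense blocks reveals them.
print(dsm)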

An Improved Learning Algorithm Based on the Conjugate Gradient Method for Back Propagation Neural Networks

The conjugate gradient optimization algorithm, usually applied to nonlinear least squares problems, is presented and combined with the modified back propagation algorithm, yielding a new, fast training algorithm for multilayer perceptron (MLP) networks (CGFR/AG). The approach presented in this paper consists of three steps: (1) modification of the standard back propagation algorithm by introducing a gain-variation term in the activation function, (2) calculation of the gradient of the error with respect to the weight and gain values, and (3) determination of the new search direction by exploiting the gradient information calculated in step (2) as well as the previous search direction. The proposed method improves the training efficiency of the back propagation algorithm by adaptively modifying the initial search direction. The performance of the proposed method is demonstrated by comparison with the conjugate gradient algorithm from the neural network toolbox on the chosen benchmarks. The results show that the number of iterations the proposed method requires to converge is less than 20% of that required by the standard conjugate gradient and neural network toolbox algorithms.
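Step (3) combines the current gradient with the previous search direction, which is the defining move of a conjugate gradient method. The following is a minimal sketch of a Fletcher-Reeves update (consistent with the CGFR name, though the exact update rule and the fixed step size used here are assumptions; the paper's method additionally feeds the gain gradients into the initial search direction).

import numpy as np

def cg_fletcher_reeves(grad_fn, theta, n_iters=100, step=0.01):
    # grad_fn(theta) returns the error gradient at parameter vector theta.
    g = grad_fn(theta)
    d = -g                                 # initial direction: steepest descent
    for _ in range(n_iters):
        theta = theta + step * d           # a line search would pick step adaptively
        g_new = grad_fn(theta)
        if np.linalg.norm(g_new) < 1e-10:  # gradient vanished: converged
            break
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d              # mix new gradient with old direction
        g = g_new
    return theta

# Example on a quadratic error surface E(theta) = 0.5*theta'A theta - b'theta,
# whose gradient is A @ theta - b:
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
theta_star = cg_fletcher_reeves(lambda t: A @ t - b, np.zeros(2))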