The BGMRES Method for Generalized Sylvester Matrix Equation AXB − X = C and Preconditioning

In this paper, we present the block generalized minimal residual (BGMRES) method for solving the generalized Sylvester matrix equation AXB − X = C. This method, however, may fail to converge on some problems. We therefore construct a polynomial preconditioner for BGMRES and show why the preconditioned method is superior to some existing block solvers. Finally, numerical experiments demonstrate the effectiveness of the method.
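As a rough illustration of the problem being solved (a minimal sketch, not the paper's BGMRES implementation), the identity vec(AXB) = (Bᵀ ⊗ A) vec(X) turns AXB − X = C into the ordinary linear system (Bᵀ ⊗ A − I) vec(X) = vec(C), which a Krylov solver can attack using only the action of the Sylvester operator; the sizes and matrices below are arbitrary well-conditioned test data:

    import numpy as np
    from scipy.sparse.linalg import LinearOperator, gmres

    # Sketch: solve AXB - X = C by vectorization and plain (unblocked,
    # unpreconditioned) GMRES; entries are arbitrary test values chosen
    # so that the operator B^T kron A - I is safely nonsingular.
    rng = np.random.default_rng(0)
    n, p = 60, 40
    A = 3 * np.eye(n) + 0.1 * rng.standard_normal((n, n))
    B = np.eye(p) + 0.05 * rng.standard_normal((p, p))
    C = rng.standard_normal((n, p))

    def matvec(v):
        X = v.reshape((n, p), order="F")   # undo the column-major vec()
        return (A @ X @ B - X).ravel(order="F")

    op = LinearOperator((n * p, n * p), matvec=matvec)
    x, info = gmres(op, C.ravel(order="F"))
    X = x.reshape((n, p), order="F")
    print(info, np.linalg.norm(A @ X @ B - X - C))  # 0 and a small residual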

Convergence and Comparison Theorems of the Modified Gauss-Seidel Method

In this paper, the modified Gauss-Seidel method with a new preconditioner for solving the linear system Ax = b, where A is a nonsingular M-matrix with unit diagonal, is considered. Convergence properties and comparison theorems for the proposed method are established. Two examples are given to show the efficiency and effectiveness of the modified Gauss-Seidel method with the presented preconditioner.
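For concreteness, here is a minimal sketch of the general recipe (the paper's particular preconditioner may differ): premultiply the system by P = I + S, where S carries the negated first superdiagonal of A, a classical choice for unit-diagonal M-matrices, and run plain Gauss-Seidel on PAx = Pb:

    import numpy as np

    def gauss_seidel(A, b, x0, iters=200, tol=1e-10):
        DL = np.tril(A)                    # D - L part of the splitting A = (D - L) - U
        U = -np.triu(A, 1)
        x = x0.copy()
        for _ in range(iters):
            x_new = np.linalg.solve(DL, U @ x + b)
            if np.linalg.norm(x_new - x) < tol:
                return x_new
            x = x_new
        return x

    n = 100
    A = np.eye(n) - 0.2 * np.eye(n, k=1) - 0.2 * np.eye(n, k=-1)  # unit-diagonal M-matrix
    b = A @ np.ones(n)                     # exact solution is the all-ones vector

    S = np.zeros_like(A)
    i = np.arange(n - 1)
    S[i, i + 1] = -A[i, i + 1]             # negated first superdiagonal of A
    P = np.eye(n) + S
    x = gauss_seidel(P @ A, P @ b, np.zeros(n))
    print(np.linalg.norm(x - 1.0))         # tiny: the preconditioned iteration converged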

Comparison of Two Types of Preconditioners for Stokes and Linearized Navier-Stokes Equations

To solve saddle point systems efficiently, several preconditioners have been proposed. Among the many ways of constructing preconditioners for linear systems arising from saddle point problems, the relaxed dimensional factorization (RDF) preconditioner and the augmented Lagrangian (AL) preconditioner are used for both steady and unsteady Navier-Stokes equations. In this paper we compare the RDF preconditioner with the modified AL (MAL) preconditioner to determine which is more effective for solving the Navier-Stokes equations. Numerical experiments indicate that the MAL preconditioner is more efficient and robust, especially for moderate viscosities and stretched grids in steady problems. For unsteady cases, the convergence rate of the RDF preconditioner is slightly faster than that of the MAL preconditioner in some circumstances, but the parameter of the RDF preconditioner is more sensitive than that of the MAL preconditioner. Moreover, the convergence rate of the MAL preconditioner remains quite acceptable. We therefore conclude that the MAL preconditioner is more competitive than the RDF preconditioner. The experiments are implemented with the IFISS package.
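For background, the systems in question have the 2-by-2 block (saddle point) form K = [[A, Bᵀ], [B, 0]]. The sketch below is not RDF or MAL (both are more elaborate and avoid exact Schur complements); it is a generic block upper-triangular preconditioner with the exact Schur complement, shown only to illustrate why a good preconditioner collapses the GMRES iteration count:

    import numpy as np
    from scipy.sparse.linalg import LinearOperator, gmres

    # GMRES on K = [[A, B^T], [B, 0]] with the block upper-triangular
    # preconditioner P = [[A, B^T], [0, S]], S = -B inv(A) B^T. With the
    # exact Schur complement, preconditioned GMRES needs only a few steps.
    rng = np.random.default_rng(1)
    n, m = 80, 30
    F = rng.standard_normal((n, n))
    A = F @ F.T + n * np.eye(n)            # SPD (1,1) block (stand-in for a Laplacian)
    B = rng.standard_normal((m, n))        # full-rank constraint (divergence) block
    K = np.block([[A, B.T], [B, np.zeros((m, m))]])
    S = -B @ np.linalg.solve(A, B.T)       # exact Schur complement
    P = np.block([[A, B.T], [np.zeros((m, n)), S]])
    Pinv = LinearOperator(K.shape, matvec=lambda v: np.linalg.solve(P, v))

    b = K @ np.ones(n + m)                 # exact solution is the all-ones vector
    x, info = gmres(K, b, M=Pinv)
    print(info, np.linalg.norm(x - 1.0))   # 0 and a small error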

Some Preconditioners for Block Pentadiagonal Linear Systems Based on New Approximate Factorization Methods

In this paper, with the aim of obtaining a high-efficiency parallel algorithm for solving sparse block pentadiagonal linear systems on vector and parallel processors, stair matrices are used to construct parallel polynomial approximate inverse preconditioners. These preconditioners are appropriate when the goal is to maximize parallelism. Some theoretical results about these preconditioners are presented, and it is also described how to construct them effectively for any nonsingular block pentadiagonal H-matrix. In addition, the effectiveness of these preconditioners is illustrated by numerical experiments arising from the two-dimensional biharmonic equation.
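The following is a generic sketch of a polynomial approximate-inverse preconditioner (the paper's stair-matrix construction is more specialized). Splitting A = D − N with D = diag(A), the truncated Neumann series A⁻¹ ≈ (I + M + M² + … + Mᵏ) D⁻¹ with M = D⁻¹N is applied using matrix-vector products only, which is the source of the parallelism the abstract refers to:

    import numpy as np

    def poly_prec_apply(A, r, k=3):
        """Approximate inv(A) @ r with a degree-k Neumann polynomial."""
        d = np.diag(A)
        term = r / d                       # inv(D) r
        z = term.copy()
        for _ in range(k):
            term = term - (A @ term) / d   # term <- M @ term, M = I - inv(D) A
            z += term
        return z

    n = 200
    A = 4 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # diagonally dominant test matrix
    r = np.ones(n)
    for k in (1, 3, 5):
        z = poly_prec_apply(A, r, k)
        print(k, np.linalg.norm(A @ z - r) / np.linalg.norm(r))  # residual shrinks with k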

Advanced Neural Network Learning Applied to Pulping Modeling

This paper reports work done to improve the modeling of complex processes when only small experimental data sets are available. Neural networks are used to capture the nonlinear underlying phenomena contained in the data set and to partly eliminate the burden of having to specify completely the structure of the model. Two different types of neural networks were used for the pulping application. Three-layer feedforward neural networks, trained with Preconditioned Conjugate Gradient (PCG) methods, were used in this investigation. Preconditioning is a technique for improving convergence by lowering the condition number and increasing the clustering of the eigenvalues. The idea is to solve the modified problem M⁻¹Ax = M⁻¹b, where M is a positive-definite preconditioner closely related to A. We mainly focused on PCG-based training methods originating from optimization theory, namely Preconditioned Conjugate Gradient with Fletcher-Reeves Update (PCGF), Preconditioned Conjugate Gradient with Polak-Ribiere Update (PCGP), and Preconditioned Conjugate Gradient with Powell-Beale Restarts (PCGB). The behavior of the PCG methods in the simulations proved to be robust against phenomena such as oscillations due to large step sizes.
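A minimal sketch of the two conjugate-direction updates named in the abstract, applied to a generic smooth objective (a stand-in for the network training loss; the preconditioner is taken as the identity here, so this is the unpreconditioned skeleton of PCGF/PCGP):

    import numpy as np

    # Fletcher-Reeves: beta = (g1 @ g1) / (g0 @ g0)
    # Polak-Ribiere:   beta = (g1 @ (g1 - g0)) / (g0 @ g0)
    def nonlinear_cg(f, grad, x0, variant="FR", iters=500, tol=1e-8):
        x, g = x0.copy(), grad(x0)
        d = -g
        for _ in range(iters):
            t = 1.0                        # backtracking (Armijo) line search
            while f(x + t * d) > f(x) + 1e-4 * t * (g @ d) and t > 1e-12:
                t *= 0.5
            x_new = x + t * d
            g_new = grad(x_new)
            if np.linalg.norm(g_new) < tol:
                return x_new
            if variant == "FR":
                beta = (g_new @ g_new) / (g @ g)
            else:                          # "PR", clipped at zero (implicit restart)
                beta = max(0.0, g_new @ (g_new - g) / (g @ g))
            d = -g_new + beta * d
            x, g = x_new, g_new
        return x

    # Toy usage: an ill-conditioned quadratic, where plain gradient descent
    # would zigzag but conjugate directions do not.
    Q = np.diag(np.linspace(1.0, 100.0, 50))
    x = nonlinear_cg(lambda v: 0.5 * v @ Q @ v, lambda v: Q @ v,
                     np.ones(50), variant="PR")
    print(np.linalg.norm(x))               # near zero, the minimizer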

An Incomplete Factorization Preconditioner for LMS Adaptive Filter

In this paper an efficient incomplete factorization preconditioner is proposed for the Least Mean Squares (LMS) adaptive filter. The proposed preconditioner is approximated from a priori knowledge of the factors of the input correlation matrix with an incomplete strategy, motivated by the sparsity pattern of the upper triangular factor in the QRD-RLS algorithm. The convergence properties of the resulting IPLMS algorithm are comparable with those of the transform-domain LMS (TDLMS) algorithm. Simulation results show the efficiency and robustness of the proposed algorithm at reduced computational complexity.
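A minimal sketch of the underlying idea: precondition the LMS update with a factor of the input correlation matrix R = E[xxᵀ]. The paper builds this factor incompletely, from the sparsity pattern of the QRD-RLS triangular factor; the sketch below uses a full Cholesky factor of a known AR(1) correlation purely for clarity, and all sizes and constants are arbitrary test values:

    import numpy as np

    rng = np.random.default_rng(2)
    N, L, mu = 4000, 8, 0.05
    w_true = rng.standard_normal(L)        # unknown system to identify

    u = np.zeros(N)                        # correlated AR(1) input: slow for plain LMS
    for k in range(1, N):
        u[k] = 0.9 * u[k - 1] + rng.standard_normal()

    a = 0.9                                # theoretical AR(1) correlation matrix
    R = np.array([[a ** abs(i - j) for j in range(L)] for i in range(L)]) / (1 - a * a)
    U = np.linalg.cholesky(R).T            # upper triangular factor, R = U^T U

    w = np.zeros(L)
    for k in range(L, N):
        x = u[k - L:k][::-1]               # regressor, most recent sample first
        d = w_true @ x + 1e-3 * rng.standard_normal()
        e = d - w @ x
        z = np.linalg.solve(U, np.linalg.solve(U.T, x))   # z = inv(R) x via the factor
        w += mu * z * e                    # preconditioned LMS update
    print(np.linalg.norm(w - w_true))      # small misadjustment after convergence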

Modeling of Pulping of Sugar Maple Using Advanced Neural Network Learning

This paper reports work done to improve the modeling of complex processes when only small experimental data sets are available. Neural networks are used to capture the nonlinear underlying phenomena contained in the data set and to partly eliminate the burden of having to specify completely the structure of the model. Two different types of neural networks were used for the pulping of sugar maple application. Three-layer feedforward neural networks, trained with Preconditioned Conjugate Gradient (PCG) methods, were used in this investigation. Preconditioning is a technique for improving convergence by lowering the condition number and increasing the clustering of the eigenvalues. The idea is to solve the modified problem M⁻¹Ax = M⁻¹b, where M is a positive-definite preconditioner closely related to A. We mainly focused on PCG-based training methods originating from optimization theory, namely Preconditioned Conjugate Gradient with Fletcher-Reeves Update (PCGF), Preconditioned Conjugate Gradient with Polak-Ribiere Update (PCGP) and Preconditioned Conjugate Gradient with Powell-Beale Restarts (PCGB). The behavior of the PCG methods in the simulations proved to be robust against phenomena such as oscillations due to large step sizes.
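As a complement to the FR/PR sketch after the companion abstract above, the Powell-Beale variant (PCGB) restarts the search direction at steepest descent whenever successive gradients lose orthogonality; the constant 0.2 below is Powell's customary choice and should be read as a tunable assumption:

    import numpy as np

    # Powell's restart test: |g_new @ g_old| >= 0.2 * ||g_new||^2
    def needs_restart(g_new, g_old, c=0.2):
        return abs(g_new @ g_old) >= c * (g_new @ g_new)

    # Inside the conjugate gradient loop one would then write:
    #     d = -g_new if needs_restart(g_new, g) else -g_new + beta * d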