Accelerating convergence and avoiding the inversion of the Jacobian matrix are important issues in the solution of nonlinear algebraic equations (NAEs). This paper develops an approach with a splitting-linearizing technique applied to the nonlinear terms to reduce their effect. We decompose the nonlinear terms in the NAEs through a splitting parameter and then linearize the NAEs around the values at the previous step, obtaining a linear system. Through the concept of maximal orthogonal projection, which minimizes a merit function within a selected interval of splitting parameters, the optimal parameter can be quickly determined. At each step, a linear system is solved by the Gaussian elimination method, and the whole iteration procedure converges very quickly. Several numerical tests demonstrate the high performance of the optimal split-linearization iterative method (OSLIM).

Nonlinear partial differential equations (PDEs) frequently appear in many critical problems of the natural sciences and engineering. Mathematical models in physics, mechanics, and other disciplines are applied to describe physical phenomena. Although some phenomena may be modeled by linear PDEs, more complex assumptions on the material properties and geometric domain of a problem require nonlinear PDEs. In most situations, solving nonlinear PDEs analytically and exactly is impossible. Because of the crucial role of nonlinear problems, researchers have proposed different numerical methods to solve nonlinear PDEs, which are first transformed into nonlinear algebraic equations (NAEs) by numerical discretization. As mentioned in [

For the vector form of the NAEs:

The Newton method possesses a significant advantage in that it is quadratically convergent. However, its main drawback is sensitivity to the initial point

Let

According to

Nowadays, there exist different splitting techniques to decompose

For the LAE, there are many discussions of splitting methods to determine the best relaxation factor

In this paper, a new splitting idea is implemented and applied to decompose

Then, we can decompose

The idea of the splitting technique in [

We give an initial guess

For

The above process is a novel splitting-linearizing method for solving the NAEs.
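The splitting-linearizing process above can be sketched in code. The paper's exact splitting rule and merit function are in the elided equations, so the sketch below uses an assumed blending rule on a hypothetical toy system: each quadratic term is written as a convex combination, with weight s, of its first-order Taylor linearization about the previous iterate and a simple lagged linearization, and s is selected by directly minimizing the residual norm over a grid.

```python
import numpy as np

# Hypothetical toy system (not from the paper):
#   F1(x, y) = x^2 + y^2 - 4 = 0,  F2(x, y) = x*y - 1 = 0
def F(v):
    x, y = v
    return np.array([x**2 + y**2 - 4.0, x * y - 1.0])

def linear_system(vk, s):
    """Split-linearized system A(s) v = b(s) about the previous iterate vk.
    Assumed blending rule: each quadratic term mixes its Taylor linearization
    (weight s) with a lagged linearization (weight 1 - s), e.g.
    x^2 -> (1+s)*xk*x - s*xk^2,  x*y -> (1+s)/2*(yk*x + xk*y) - s*xk*yk."""
    xk, yk = vk
    A = np.array([[(1 + s) * xk,       (1 + s) * yk],
                  [0.5 * (1 + s) * yk, 0.5 * (1 + s) * xk]])
    b = np.array([4.0 + s * (xk**2 + yk**2),
                  1.0 + s * xk * yk])
    return A, b

def oslim_step(vk, s_grid):
    """Pick the splitting parameter that most reduces the residual norm."""
    best_r, best_v = np.inf, vk
    for s in s_grid:
        A, b = linear_system(vk, s)
        v = np.linalg.solve(A, b)   # Gaussian elimination (LU) on the linear system
        r = np.linalg.norm(F(v))
        if r < best_r:
            best_r, best_v = r, v
    return best_v

v = np.array([2.0, 0.5])            # initial guess
for _ in range(12):
    v = oslim_step(v, np.linspace(0.1, 1.0, 10))
```

Note that only a small linear system is solved per step; no Jacobian inverse is ever formed.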

In this study, the Gaussian elimination method solves
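For completeness, a minimal Gaussian elimination solver with partial pivoting, of the kind invoked at each iterative step, might look like the following sketch (any LU-based solver would serve equally well):

```python
import numpy as np

def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    for k in range(n - 1):
        p = k + np.argmax(np.abs(A[k:, k]))          # pivot row
        A[[k, p]], b[[k, p]] = A[[p, k]], b[[p, k]]  # swap rows
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]                    # elimination multiplier
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):                   # back substitution
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x
```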

For a linear system

Let

The minimization in

To optimize

Let

We can search the minimal value of
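A simple way to search the minimal value of a scalar merit function over a selected interval is a golden-section search; the quadratic below is only a stand-in for the actual merit function of the splitting parameter:

```python
import math

def golden_section_min(phi, a, b, tol=1e-8):
    """Minimize a unimodal scalar function phi on the interval [a, b]."""
    invphi = (math.sqrt(5) - 1) / 2          # inverse golden ratio, ~0.618
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    while b - a > tol:
        if phi(c) < phi(d):
            b, d = d, c                      # minimum lies in [a, d]
            c = b - invphi * (b - a)
        else:
            a, c = c, d                      # minimum lies in [c, b]
            d = a + invphi * (b - a)
    return 0.5 * (a + b)

# Stand-in merit function with its minimum at s = 0.7
s_opt = golden_section_min(lambda s: (s - 0.7)**2 + 0.1, 0.0, 1.0)
```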

An interesting feature of the OSLIM is that, by using the Cauchy-Schwarz inequality in

In general,

The procedures of the OSLIM to solve the NAEs are summarized as follows:

Give

Give an initial guess

For

If

To further assess the performance of the OSLIM, we investigate its convergence order. For solving a scalar equation
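Given three successive errors (or residual norms), the computed order of convergence can be estimated by the standard formula COC = ln(e_{k+1}/e_k) / ln(e_k/e_{k-1}); a minimal helper:

```python
import math

def coc(errors):
    """Computed order of convergence from the last three successive errors:
    COC = ln(e_{k+1}/e_k) / ln(e_k/e_{k-1})."""
    e0, e1, e2 = errors[-3:]
    return math.log(e2 / e1) / math.log(e1 / e0)

# A quadratically convergent error sequence, e_{k+1} ~ e_k^2
errs = [1e-1, 1e-2, 1e-4]
```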

For solving the NAEs, we can generate the sequence

We first consider [

Obviously, two roots are (1, 1) and (1, −1); however, other roots also exist.

As noted by Yeih et al. [^{−4}. When solving this problem by the residual-norm based algorithm (RNBA) [^{−7}. The above iteration algorithms, based on the evolution of the residual vector, do not satisfy the given convergence criterion, and the third solution cannot be obtained by these algorithms.

Using the OSLIM to solve this problem, a translation is applied as follows:

By using the OSLIM, we take ^{−16}, 6.66 × 10^{−16}). In

Consider the two-variable nonlinear equations [

Hirsch et al. [

To compare with the result obtained by Atluri et al. [^{−7}, 8.913 × 10^{−7}). By using the OSLIM, we take ^{−15}, 5.33 × 10^{−15}), which are much smaller than those in [

Consider a system of three algebraic equations in three variables:

Obviously

By using the OSLIM, we take ^{−15}, 1.332 × 10^{−15}). The OSLIM converges faster and is more accurate than the method of Atluri et al. [^{−4}, 7.574 × 10^{−4}, 3.005 × 10^{−6}). In

Here, we consider the following boundary value problem:

The exact solution is

By using the finite difference discretization of
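The boundary value problem and its discretization appear in the elided equations. As an illustration of how a central-difference discretization turns a nonlinear BVP into NAEs that a linearize-and-solve iteration handles, consider the manufactured problem u'' = u² + g(x), u(0) = u(1) = 0, with g chosen so that u(x) = x(1 − x) is the exact solution (this problem is an assumption for illustration only, not the example in the text):

```python
import numpy as np

n = 20
h = 1.0 / n
x = np.linspace(0.0, 1.0, n + 1)
g = -2.0 - x**2 * (1 - x)**2          # manufactured so u(x) = x(1 - x) is exact

u = np.zeros(n + 1)                    # initial guess; u[0] = u[n] = 0 are the BCs
for _ in range(20):
    # Linearize u^2 about the previous iterate: u^2 ≈ 2*uk*u - uk^2
    A = np.zeros((n - 1, n - 1))
    b = np.zeros(n - 1)
    for i in range(1, n):
        j = i - 1
        A[j, j] = -2.0 / h**2 - 2.0 * u[i]
        if j > 0:
            A[j, j - 1] = 1.0 / h**2
        if j < n - 2:
            A[j, j + 1] = 1.0 / h**2
        b[j] = g[i] - u[i]**2
    u[1:n] = np.linalg.solve(A, b)     # one linear solve per iteration

err = np.max(np.abs(u - x * (1 - x)))  # nodal error against the exact solution
```

Because the exact solution is quadratic, the central difference is exact here and the iteration reproduces the nodal values to machine precision.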

Setting ^{−10} and ^{−4}, which converges faster and is more accurate than that of Atluri et al. [^{−4}, and the errors are on the order of 10^{−3}. COC = 1.9741 is obtained, which reveals that the OSLIM is almost quadratically convergent.

Next, consider a test example given by Krzyworzcka [

To describe the coefficient matrices of

Setting ^{−10}, ^{−6}. COC = 1.9996 is obtained, which shows again that the OSLIM is quadratically convergent.

| −0.2803136806 | −0.1169099198 | −0.0693778627 | −0.0576450538 | −0.0600860299 |
| −0.0703323454 | −0.0878221002 | −0.1158487807 | −0.1634144710 | −0.2539580395 |

The following example is given by Roose et al. [

The OSLIM can adopt two different formulations to solve


To describe the coefficient matrices of

Setting ^{−10}, ^{−6}. The residual errors of

0.1030955 | 1.4359354 |

Setting ^{−10},

Consider the following two-dimensional nonlinear Bratu equation:

After collocating points inside the domain and on the boundary simultaneously to satisfy the

By using the OSLIM, we take ^{−5} as shown in ^{−6}. Interestingly, the OSLIM can be used to solve a highly nonlinear PDE with high accuracy.

When evaluating the performance of newly developed iterative methods for solving nonlinear algebraic equations (NAEs), there are two critical factors: accuracy and convergence speed. In this article, we first decomposed the nonlinear terms in the NAEs through a splitting parameter, and then we linearized the NAEs around the values at the previous step to derive a system of linear algebraic equations (LAEs). Through the maximal orthogonal projection concept, at each iterative step we select the optimal value of the splitting parameter and use the Gaussian elimination method to solve the LAEs. The features of the proposed method are summarized as follows: (a) The merit function is expressed in terms of the coefficient matrix, the right-hand side vector, and the value of the unknown vector at the previous step. (b) The minimization is easily carried out by a search within a preferred interval through simple operations. (c) The OSLIM is insensitive to the initial guess and does not need the inversion of the Jacobian matrix at each iterative step, which saves much computational time. We have successfully applied the OSLIM to solve the NAEs resulting from nonlinear boundary value problems in one- and two-dimensional spatial domains. Through the numerical tests, the convergence order of the OSLIM is between linear and quadratic.