RMIT University Library - Learning Lab

NM1 Newton’s method

 


You are used to solving equations using basic algebraic operations and perhaps the quadratic formula. However, some equations cannot be solved using these methods. For example, the equations \[\begin{align*} e^{-x} & =x & \left(1\right) \end{align*}\] and \[\begin{align*} -\ln\left(x\right) & =x^{3} & \left(2\right) \end{align*}\] cannot be solved using conventional methods, but they can be solved using numerical methods. The important feature of equations \(\left(1\right)\) and \(\left(2\right)\) is that they can be rearranged into the form \(f\left(x\right)=0.\) That is, the problem can be transformed into finding a root of a function \(f\left(x\right)\). (A root of a function \(f\left(x\right)\) is a value of \(x\) for which \(f\left(x\right)=0.\))

Common numerical methods for root finding are:

  1. Newton’s method (also called the Newton-Raphson method)

  2. the Bisection method.

In this module we consider Newton’s method.

The Newton-Raphson Method

Newton’s method (Newton-Raphson) can provide the value of \(x\) such that \(f\left(x\right)=0\).

It has the form: \[\begin{align*} x_{n+1} & =x_{n}-\frac{f\left(x_{n}\right)}{f'\left(x_{n}\right)} & \left(3\right) \end{align*}\] where \(n\) starts at \(0.\) The method is iterative: you start at \(n=0\) and increase \(n\) by one at each step until you achieve the desired accuracy. A short code sketch of the iteration is given after the notes below.

Note that:

  1. \(x_{0}\) is called the first estimate of the root, or the initial guess,

  2. \(x_{n+1}\) is the new estimate of the root, and \(x_{n}\) is the old estimate of the root,

  3. \(f\left(x_{n}\right)\) is the value of the function at \(x_{n}\), and

  4. \(f'\left(x_{n}\right)\) is the value of the derivative of \(f\left(x\right)\) at \(x_{n}.\)
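The iteration translates directly into a short program. The sketch below is one possible Python implementation under a simple stopping rule (stop when successive estimates differ by less than a tolerance); the names `newton`, `f`, `fprime`, `tol` and `max_iter` are our own choices for illustration and are not part of this module.

```python
def newton(f, fprime, x0, tol=1e-7, max_iter=50):
    """Minimal sketch of Newton's method.

    Repeats x_{n+1} = x_n - f(x_n) / f'(x_n), starting from the
    initial guess x0, until two successive estimates differ by
    less than tol.
    """
    x = x0
    for _ in range(max_iter):
        x_new = x - f(x) / fprime(x)   # one Newton step
        if abs(x_new - x) < tol:       # estimates have settled down
            return x_new
        x = x_new
    raise RuntimeError("Newton's method did not converge")
```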

Example 1

Find the solution of \(e^{-x}=x\) for \(0<x<1\) to three decimal places.

Solution:

First define the function \[\begin{align*} f\left(x\right) & =e^{-x}-x. \end{align*}\] (Note that you could also use \(f\left(x\right)=x-e^{-x},\) as it is equal to zero at the same value of \(x\) as \(f\left(x\right)=e^{-x}-x.\)) We want to find \(x\) where \(f\left(x\right)=0\). To do this we need an initial guess \(x_{0}\) of the root. This is easily found by graphing both \(y=x\) and \(y=e^{-x}:\) the point where the graphs intersect is the point where \(f\left(x\right)=0.\) The graphs are shown below.

[Figure: intersection of the graphs \(y=e^{-x}\) and \(y=x\)]

A suitable value for \(x_{0}\) is \(0.5\), and the derivative is

\[\begin{align*} f'\left(x\right) & =-e^{-x}-1. \end{align*}\]

Using eqn \(\left(3\right)\), we can start the method \[\begin{align*} x_{1} & =x_{0}-\frac{f\left(x_{0}\right)}{f'\left(x_{0}\right)}\\ & =0.5-\frac{f\left(0.5\right)}{f'\left(0.5\right)}\\ & =0.5-\frac{e^{-0.5}-0.5}{-e^{-0.5}-1}\\ & =0.5663\\ x_{2} & =0.5663-\frac{f\left(0.5663\right)}{f'\left(0.5663\right)}\\ & =0.5663-\frac{e^{-0.5663}-0.5663}{-e^{-0.5663}-1}\\ & =0.5671\\ x_{3} & =x_{2}-\frac{f\left(x_{2}\right)}{f'\left(x_{2}\right)}\\ & =0.5671-\frac{e^{-0.5671}-0.5671}{-e^{-0.5671}-1}\\ & =0.5671. \end{align*}\] There is no change in the first four decimal places, so we stop; to three decimal places the answer is \(x=0.567\,.\)
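Assuming the `newton` sketch given earlier, the calculation in Example 1 can be reproduced as follows; the numerical output should agree with the working above.

```python
import math

# f(x) = e^(-x) - x and its derivative, as defined in Example 1
f = lambda x: math.exp(-x) - x
fprime = lambda x: -math.exp(-x) - 1

root = newton(f, fprime, x0=0.5)
print(round(root, 3))  # expected: 0.567
```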

Example 2

Find the solution of \(-\ln\left(x\right)=x^{3}\) correct to five decimal places with an initial guess of \(x_{0}=1.\)

Solution:

Define the function \[\begin{align*} f\left(x\right) & =x^{3}+\ln\left(x\right) \end{align*}\] then \[\begin{align*} f'\left(x\right) & =3x^{2}+\frac{1}{x}. \end{align*}\] In this case we are given a value for \(x_{0},\) so there is no need to graph the functions, but we do it anyway. (It is a good idea to graph the functions because it gives us an idea of the answer. If Newton’s method gives a result that differs from what we see in the graph, we have to consider that a mistake has been made, or that Newton’s method has found another root that we did not see on our graph.)

[Figure: intersection of the graphs \(y=-\ln\left(x\right)\) and \(y=x^{3}\)]

From the graph we see the answer should be about \(0.7\).

Using Newton’s method: \[\begin{align*} x_{1} & =x_{0}-\frac{f\left(x_{0}\right)}{f'\left(x_{0}\right)}\\ & =1-\frac{1^{3}+\ln\left(1\right)}{3\cdot1^{2}+\left(\frac{1}{1}\right)}\\ & =1-\frac{1+0}{3+1}\\ & =0.75\\ x_{2} & =0.75-\frac{\left(0.75\right)^{3}+\ln\left(0.75\right)}{3\cdot\left(0.75\right)^{2}+\left(\frac{1}{0.75}\right)}\\ & =0.7055775\\ x_{3} & =0.7055775-\frac{\left(0.7055775\right)^{3}+\ln\left(0.7055775\right)}{3\cdot\left(0.7055775\right)^{2}+\left(\frac{1}{0.7055775}\right)}\\ & =0.7047098\\ x_{4} & =0.7047098-\frac{\left(0.7047098\right)^{3}+\ln\left(0.7047098\right)}{3\cdot\left(0.7047098\right)^{2}+\left(\frac{1}{0.7047098}\right)}\\ & =0.7047095. \end{align*}\] As there is no change in the first five decimal places, we stop and take the answer as \(x=0.70471\,.\) Note that the fifth decimal digit has been rounded up.
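Again assuming the `newton` sketch from earlier, Example 2 can be reproduced in the same way.

```python
import math

# f(x) = x^3 + ln(x) and its derivative, as defined in Example 2
f = lambda x: x**3 + math.log(x)
fprime = lambda x: 3 * x**2 + 1 / x

root = newton(f, fprime, x0=1.0)
print(round(root, 5))  # expected: 0.70471
```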

Convergence of the Newton-Raphson Method

Convergence describes how quickly the iterations reach an acceptable approximation to the value of \(x\) for which \(f\left(x\right)=0\). Slow convergence implies many iterations, while fast convergence means few iterations.

Convergence of the method depends on the function \(f\left(x\right)\) and the initial guess. If the initial guess is in an area where the gradient of \(f\left(x\right)\) is nearly zero, convergence can be slow.

Consider the function \(f\left(x\right)=x^{4}-1.\) It is graphed below.

[Figure: graph of \(f\left(x\right)=x^{4}-1\), showing roots at \(x=-1\) and \(x=1\)]

Obviously the solutions of \(f\left(x\right)=0\) are \(x=-1\) and \(x=1\). These can be obtained by basic algebra and there is no need to use Newton’s method. However, we will use this function to see what happens with different guesses for \(x_{0}\). We quote the results to two decimal places and use a stopping tolerance of \(0.0000001\) in the calculations.

Let \(x_{0}=0.1\). In this region the gradient of \(f\left(x\right)\) is close to zero. Newton’s method takes 25 iterations to get the result \(x=1.00.\)

Let \(x_{0}=0.25\) then it takes 15 iterations to get the result \(x=1.00.\)

Let \(x_{0}=0.5\) then it takes 9 iterations to get the result \(x=1.00.\)

Let \(x_{0}=0.75\) then it takes 6 iterations to get the result \(x=1.00.\)

So we see that the number of iterations decreases (faster convergence) as the gradient of \(f\left(x\right)\) at the initial guess increases.
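This effect can be explored with a small experiment. The sketch below counts the iterations used for \(f\left(x\right)=x^{4}-1\) with each of the starting values above. The exact counts depend on the stopping rule, so they may differ slightly from the figures quoted, but the trend (fewer iterations as \(x_{0}\) moves away from the flat region) is the same. The helper `newton_count` is illustrative, not part of the module.

```python
def newton_count(f, fprime, x0, tol=1e-7, max_iter=100):
    """Return the root estimate and the number of iterations used."""
    x = x0
    for n in range(1, max_iter + 1):
        x_new = x - f(x) / fprime(x)
        if abs(x_new - x) < tol:
            return x_new, n
        x = x_new
    raise RuntimeError("did not converge")

f = lambda x: x**4 - 1
fprime = lambda x: 4 * x**3

for x0 in (0.1, 0.25, 0.5, 0.75):
    root, n = newton_count(f, fprime, x0)
    print(f"x0 = {x0}: converged to {root:.2f} in {n} iterations")
```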

Convergence to Another Root

If the function \(f\left(x\right)\) has several roots, Newton’s method may converge to any of them. For example, the function \(f\left(x\right)=x^{3}+2x^{2}-5x-6\) has three roots, at \(x=-3,-1\) and \(2,\) as shown below.

[Figure: graph of \(f\left(x\right)=x^{3}+2x^{2}-5x-6\), showing roots at \(x=-3\), \(x=-1\) and \(x=2\)]

We might expect that setting \(x_{0}=-2\) would lead the method to the root at \(x=-3\) or \(x=-1\). However, Newton’s method finds the root at \(x=2\): since \(f\left(-2\right)=4\) and \(f'\left(-2\right)=-1\), the first iteration gives \(x_{1}=-2-\frac{4}{-1}=2,\) which is already a root. This is why it is important to graph the function if possible.
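Using the `newton` sketch from earlier, this behaviour is easy to verify; starting from \(x_{0}=-2\) the method lands on \(x=2\) in a single step.

```python
# f(x) = x^3 + 2x^2 - 5x - 6 and its derivative
f = lambda x: x**3 + 2 * x**2 - 5 * x - 6
fprime = lambda x: 3 * x**2 + 4 * x - 5

# The first step from x0 = -2 is x1 = -2 - f(-2)/f'(-2) = -2 - 4/(-1) = 2,
# which is already a root, so the method returns 2 rather than -3 or -1.
root = newton(f, fprime, x0=-2.0)
print(root)  # expected: 2.0
```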


What's next... NM2 Simpson’s rule
