0. What is Newton's method
In a nutshell, Newton's method is an iterative method for finding a solution to the equation $f(x) = 0$. You start from an arbitrary point $x_0$; the point where the tangent line to $f$ at $x_0$ crosses the x-axis becomes the next iterate $x_1$. Repeating this, you get a sequence $x_0, x_1, x_2, \cdots$ that approaches a solution, and you stop once $|x_k - x_{k-1}| < \epsilon$. It's quite simple, isn't it?
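Concretely, the tangent line at $x_k$ is $y = f(x_k) + f'(x_k)(x - x_k)$; setting $y = 0$ and solving for $x$ gives the update rule
$$x_{k+1} = x_k - \frac{f(x_k)}{f'(x_k)}.$$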
1. Implementation
Here we're gonna take a look at a basic implementation of Newton's method.
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import seaborn  # only used here to style the plots
%matplotlib inline
def newton_method(f, df, x_0, eps=1e-10, max_iter=1000):
    """Newton's method for 1-dimension."""
    x_old = x_0
    x_list = [x_old]  # keep every iterate so we can plot the trajectory later
    i = 0
    while True:
        x_new = x_old - f(x_old) / df(x_old)  # Newton update
        x_list.append(x_new)
        if abs(x_new - x_old) < eps:  # stop once the iterates stabilize
            break
        x_old = x_new
        i += 1
        if max_iter <= i:  # safety cap on the number of iterations
            break
    return x_new, np.array(x_list)
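As a quick sanity check (a small test of my own, not part of the original notebook), the positive root of $x^2 - 2$ should come out as $\sqrt{2}$:
# Sanity check: the positive root of x^2 - 2 is sqrt(2) = 1.41421356...
root, track = newton_method(lambda x: x**2 - 2, lambda x: 2*x, x_0=1.0)
print(root)        # -> 1.4142135623730951
print(len(track))  # only a handful of iterates, thanks to fast convergence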
2. Find a solution
Now we're ready to find a solution by Newton's method. Here we'll find a root of $x^3 - x^2 - 6x = 0$. At first, we're gonna look at what $x^3 - x^2 - 6x$ looks like.
def f(x):
    return x**3 - x**2 - 6*x

x = np.linspace(-4, 4, 101)
y = f(x)
plt.plot(x, y)
plt.ylim(-40, 40)
plt.title("What 'x^3 - x^2 - 6x' looks like", fontsize=16)
plt.show()
Actually, you can work out the solutions of this one by hand with comparative ease: it factors as $x(x - 3)(x + 2)$, so the roots are $-2$, $0$, and $3$ :)
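Just to double-check that factorization numerically (a quick aside I've added, using NumPy's polynomial solver):
# Cross-check: roots of x^3 - x^2 - 6x, with coefficients [1, -1, -6, 0]
print(np.roots([1, -1, -6, 0]))  # -> 3, -2, 0 (ordering may vary)
Anyway, let's see the result of Newton's method!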
def df(x):
    return 3*(x**2) - 2*x - 6

result, x_list = newton_method(f, df, -3)
print('Result of Newton method : {}'.format(result))
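We can verify the answer directly too (again, a quick check of my own): $f$ should be numerically zero at the returned point, and since we started from $x_0 = -3$, we expect to land on the nearby root $-2$.
print(f(result))                 # -> ~0.0, so result really is a root
print(np.isclose(result, -2.0))  # -> True: from x_0 = -3 we reach -2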
Yes! It converged to $-2$, just as we expected. Now we might wanna follow the tracks of Newton's method :)
x = np.linspace(-4, 6, 101)
y = f(x)
plt.figure(figsize=(12, 4))
plt.plot(x, y)
for i, xx in enumerate(x_list):
    plt.scatter(xx, f(xx),
                label='{} iteration'.format(i))
plt.legend()
plt.xlim(-4, -1)
plt.ylim(-40, 40)
plt.title("Newton's method around x=-2 in 'x^3 - x^2 - 6x'", fontsize=16)
plt.show()
We can also compare this with gradient descent, as follows. Note that the update below, $x_{k+1} = x_k - \alpha f(x_k)$, treats $f$ itself as the gradient, i.e. it descends the antiderivative $F$ with $F'(x) = f(x)$; a local minimum of $F$ is exactly a point where $f(x) = 0$.
def gradient_method(f, df, x, alpha=0.08, eps=1e-10, max_iter=100000):
    """
    gradient descent for 1-dimension

    Treats f as the gradient of its antiderivative, so the step is
    x - alpha * f(x).  (df is unused; it is kept only so the signature
    parallels newton_method.)
    """
    x_old = x
    x_list = [x_old]
    i = 0
    while True:
        x_new = x_old - alpha * f(x_old)  # gradient-descent step
        x_list.append(x_new)
        if abs(f(x_new)) < eps:  # stop once we are (numerically) at a root
            break
        x_old = x_new
        i += 1
        if max_iter <= i:  # guard against non-convergence
            break
    return x_new, np.array(x_list)
result, x_list = gradient_method(f=f, df=df, x=-3)
print('Result of gradient descent : {}'.format(result))
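One thing worth seeing side by side (a comparison I've added): Newton's method converges quadratically, so from the same starting point it should need far fewer iterations than this first-order scheme.
# Compare the number of updates each method needed from x_0 = -3
_, newton_track = newton_method(f, df, -3)
_, gd_track = gradient_method(f=f, df=df, x=-3)
print('newton iterations   :', len(newton_track) - 1)
print('gradient iterations :', len(gd_track) - 1)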
plt.figure(figsize=(12, 6))
x = np.linspace(-4, 4, 101)
y = f(x)
plt.plot(x, y)
for i, xx in enumerate(x_list):
    plt.scatter(xx, f(xx),
                label='{} iteration'.format(i))
plt.legend()
plt.xlim(-4, -1)
plt.ylim(-40, 40)
plt.title("Gradient descent around x=-2 in 'x^3 - x^2 - 6x'", fontsize=16)
plt.show()
Reference:
機械学習のエッセンス (The Essence of Machine Learning), 加藤公一 (Kimikazu Kato)