BFGS, short for the Broyden–Fletcher–Goldfarb–Shanno method, is the most widely used quasi-Newton method. It is an algorithm for unconstrained nonlinear optimization, well suited to small- and medium-scale problems; its limited-memory variant, L-BFGS, extends the idea to large-scale ones. BFGS builds on Newton's method but avoids the high cost of computing the Hessian matrix at every iteration.
The core idea of BFGS is to use information accumulated over past iterations to approximate the inverse of the Hessian matrix. Rather than computing the Hessian exactly as Newton's method does, BFGS maintains a symmetric positive-definite approximation to its inverse, updated at each step from differences of iterates and gradients. This approximation is far more practical than evaluating second derivatives directly.
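Concretely, with the step difference $s_k = x_{k+1} - x_k$ and the gradient difference $y_k = \nabla f(x_{k+1}) - \nabla f(x_k)$, the inverse-Hessian approximation $H_k$ is updated as:

$$
H_{k+1} = \left(I - \frac{s_k y_k^\top}{y_k^\top s_k}\right) H_k \left(I - \frac{y_k s_k^\top}{y_k^\top s_k}\right) + \frac{s_k s_k^\top}{y_k^\top s_k}
$$

The update preserves symmetry and, as long as the curvature condition $y_k^\top s_k > 0$ holds, positive definiteness of $H_k$.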
Below is a sample Python implementation of the BFGS algorithm:
```python
import numpy as np

def bfgs(f, x0, eps=1e-6, max_iter=1000):
    """Minimize f starting from x0 using the BFGS quasi-Newton method."""
    n = x0.shape[0]
    B = np.eye(n)                 # approximation to the *inverse* Hessian
    x = x0.astype(float)
    k = 0
    gk = grad(f, x)
    while np.linalg.norm(gk) > eps and k < max_iter:
        pk = -np.dot(B, gk)               # quasi-Newton search direction
        alpha = line_search(f, x, pk)     # step length along pk
        x_new = x + alpha * pk
        sk = x_new - x                    # step difference s_k
        x = x_new
        gk_new = grad(f, x)
        yk = gk_new - gk                  # gradient difference y_k
        gk = gk_new
        # BFGS update of the inverse-Hessian approximation
        # (assumes the curvature condition y_k^T s_k > 0):
        rho = 1.0 / np.dot(yk, sk)
        I = np.eye(n)
        B = (I - rho * np.outer(sk, yk)) @ B @ (I - rho * np.outer(yk, sk)) \
            + rho * np.outer(sk, sk)
        k += 1
    return x
```
```python
def grad(f, x, eps=1e-6):
    """Numerical gradient of f at x via central differences."""
    x = x.astype(float)   # work on a float copy; do not mutate the caller's array
    n = x.shape[0]
    g = np.zeros(n)
    for i in range(n):
        xi = x[i]
        x[i] = xi + eps
        f1 = f(x)
        x[i] = xi - eps
        f2 = f(x)
        x[i] = xi         # restore the coordinate
        g[i] = (f1 - f2) / (2 * eps)
    return g
```
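As a quick sanity check of the central-difference scheme, the numerical gradient should agree closely with the analytic gradient of a simple test function (this standalone snippet re-defines a local copy of the same routine so it runs on its own; the test function is an illustrative choice):

```python
import numpy as np

def num_grad(f, x, eps=1e-6):
    # Central-difference gradient, same scheme as grad above.
    x = x.astype(float)
    g = np.zeros(x.shape[0])
    for i in range(x.shape[0]):
        xi = x[i]
        x[i] = xi + eps
        f1 = f(x)
        x[i] = xi - eps
        f2 = f(x)
        x[i] = xi
        g[i] = (f1 - f2) / (2 * eps)
    return g

f = lambda x: x[0]**2 + 3 * x[1]**2   # analytic gradient: (2*x0, 6*x1)
g = num_grad(f, np.array([1.0, 2.0]))
print(g)                              # close to [2, 12]
```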
The `line_search` helper used above can be a simple backtracking (Armijo) search, one common choice:

```python
def line_search(f, x, pk, alpha=1.0, shrink=0.5, c=1e-4, max_iter=50):
    """Backtracking (Armijo) line search: shrink alpha until
    the sufficient-decrease condition holds."""
    fx = f(x)
    gx = grad(f, x)
    for _ in range(max_iter):
        if f(x + alpha * pk) <= fx + c * alpha * np.dot(gx, pk):
            break
        alpha *= shrink
    return alpha
```
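Putting the pieces together, a condensed, self-contained run on a small quadratic might look like this (the test function, starting point, and use of an analytic gradient are illustrative choices, not from the listing above):

```python
import numpy as np

def demo_bfgs(f, g, x0, eps=1e-8, max_iter=100):
    # Condensed BFGS with analytic gradient g and backtracking line search.
    n = x0.shape[0]
    H = np.eye(n)                  # inverse-Hessian approximation
    x = x0.astype(float)
    gk = g(x)
    for _ in range(max_iter):
        if np.linalg.norm(gk) <= eps:
            break
        p = -H @ gk
        alpha, fx = 1.0, f(x)
        # Backtracking (Armijo) step halving with a floor on alpha.
        while f(x + alpha * p) > fx + 1e-4 * alpha * (gk @ p) and alpha > 1e-12:
            alpha *= 0.5
        s = alpha * p
        x_new = x + s
        y = g(x_new) - gk
        x, gk = x_new, g(x_new)
        rho = 1.0 / (y @ s)
        I = np.eye(n)
        # Inverse-Hessian BFGS update.
        H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
            + rho * np.outer(s, s)
    return x

# Minimize f(x) = (x0 - 1)^2 + 2*(x1 + 0.5)^2, whose minimum is at (1, -0.5).
f = lambda x: (x[0] - 1)**2 + 2 * (x[1] + 0.5)**2
g = lambda x: np.array([2 * (x[0] - 1), 4 * (x[1] + 0.5)])
x_star = demo_bfgs(f, g, np.array([0.0, 0.0]))
print(x_star)                      # converges to approximately (1, -0.5)
```

On a convex quadratic like this, BFGS converges superlinearly and recovers the exact Hessian inverse after a few updates.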