Introduction to Gradient Descent

Outline

  • What's a Gradient

  • What does it mean?

  • How to Search

  • AutoGrad

What's a Gradient

  • Derivative: the abstract expression of the rate of change

  • Partial derivative: the rate of change along one specific axis

  • Gradient: a vector

\[ \nabla f = \left(\frac{\partial f}{\partial x_1}, \frac{\partial f}{\partial x_2}, \cdots, \frac{\partial f}{\partial x_n}\right) \]
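
For example, for a concrete two-variable function, each component of the gradient is simply the corresponding partial derivative:

\[ f(x_1, x_2) = x_1^2 + 3x_2 \quad\Rightarrow\quad \nabla f = \left(\frac{\partial f}{\partial x_1}, \frac{\partial f}{\partial x_2}\right) = (2x_1,\ 3) \]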

[Figure: plot of a gradient field]

What does it mean?

  • The direction of each arrow indicates the direction of the gradient
  • The length (norm) of each arrow indicates how fast the function value increases

[Figure: what the gradient means]
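
As a concrete check of both points, take \(f(x, y) = x^2 + y^2\) at the point \((1, 2)\): the gradient points straight away from the origin (the direction of fastest increase), and its norm is the maximal rate of increase at that point:

\[ \nabla f(1, 2) = (2x,\ 2y)\big|_{(1, 2)} = (2,\ 4), \qquad \lVert \nabla f(1, 2) \rVert = \sqrt{2^2 + 4^2} = 2\sqrt{5} \]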

How to search

  • Search in the direction opposite to the gradient (the steepest-descent direction)

[Figure: gradient search]

For instance

\[ \theta_{t+1} = \theta_t - \alpha_t \nabla f(\theta_t) \]

[Animations: gradient descent in two dimensions]
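
A minimal sketch of this update rule, assuming a toy quadratic loss \(f(\theta) = (\theta - 5)^2\) and a fixed step size \(\alpha = 0.1\) (both chosen purely for illustration):

import tensorflow as tf

theta = tf.Variable(0.0)          # initial parameter theta_0
alpha = 0.1                       # fixed step size alpha_t

for t in range(100):
    with tf.GradientTape() as tape:
        loss = (theta - 5.0) ** 2                 # f(theta) = (theta - 5)^2
    grad = tape.gradient(loss, theta)             # grad f(theta_t)
    theta.assign_sub(alpha * grad)                # theta_{t+1} = theta_t - alpha * grad

print(theta.numpy())              # approaches the minimizer 5.0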

AutoGrad

  • with tf.GradientTape() as tape:
    • Build the computation graph
    • \(loss = f_\theta(x)\)
  • [w_grad] = tape.gradient(loss, [w])
import tensorflow as tf
w = tf.constant(1.)
x = tf.constant(2.)
y = x * w                        # computed outside the tape, so it is not recorded
with tf.GradientTape() as tape:
    tape.watch([w])              # constants are not watched automatically
    y2 = x * w                   # recorded by the tape
grad1 = tape.gradient(y, [w])    # y was never recorded by this tape
grad1
[None]
with tf.GradientTape() as tape:
    tape.watch([w])
    y2 = x * w
grad2 = tape.gradient(y2, [w])   # dy2/dw = x = 2.0
grad2
[<tf.Tensor: id=30, shape=(), dtype=float32, numpy=2.0>]
try:
    grad2 = tape.gradient(y2, [w])
except Exception as e:
    print(e)
GradientTape.gradient can only be called once on non-persistent tapes.
  • Keep the tape persistent so the gradient can be computed more than once
with tf.GradientTape(persistent=True) as tape:
    tape.watch([w])
    y2 = x * w
grad2 = tape.gradient(y2, [w])
grad2
[<tf.Tensor: id=35, shape=(), dtype=float32, numpy=2.0>]
grad2 = tape.gradient(y2, [w])
grad2
[<tf.Tensor: id=39, shape=(), dtype=float32, numpy=2.0>]
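
Because a persistent tape holds on to its resources, it is good practice to drop the reference once it is no longer needed:

del tape   # release the resources held by the persistent tape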

\(2^{nd}\)-order

  • y = xw + b

  • \(\frac{\partial{y}}{\partial{w}} = x\)
  • \(\frac{\partial^2 y}{\partial w^2} = \frac{\partial y'}{\partial w} = \frac{\partial x}{\partial w} = \text{None}\)

b = tf.constant(0.)

with tf.GradientTape() as t1:
    t1.watch([w])
    with tf.GradientTape() as t2:
        t2.watch([w, b])
        y = x * w + b
    dy_dw, dy_db = t2.gradient(y, [w, b])

d2y_dw2 = t1.gradient(dy_dw, w)   # None: dy_dw = x does not depend on w
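
For contrast, a minimal sketch (reusing w from above) of a case where the second derivative is non-zero, e.g. \(y = w^3\) with \(\frac{\partial^2 y}{\partial w^2} = 6w\):

with tf.GradientTape() as t1:
    t1.watch([w])
    with tf.GradientTape() as t2:
        t2.watch([w])
        y3 = w ** 3
    dy_dw = t2.gradient(y3, w)      # 3 * w^2
d2y_dw2 = t1.gradient(dy_dw, w)     # 6 * w, i.e. 6.0 at w = 1.0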