Table of Contents

What's Gradient
What does it mean
How to Search
AutoGrad
Derivative: the abstract expression of a rate of change.
Partial derivative: the rate of change along one specific axis.
Gradient: a vector collecting all the partial derivatives.
\[ \nabla f = \left( \frac{\partial f}{\partial x_1}, \frac{\partial f}{\partial x_2}, \cdots, \frac{\partial f}{\partial x_n} \right) \]
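For example, with \(f(x_1, x_2) = x_1^2 x_2\), the gradient simply collects the partial derivative along each axis:

\[ \nabla f = \left( \frac{\partial f}{\partial x_1}, \frac{\partial f}{\partial x_2} \right) = \left( 2 x_1 x_2, \; x_1^2 \right) \]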
\[ \theta_{t+1} = \theta_t - \alpha_t \nabla f(\theta_t) \]
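This update rule is what a training loop implements: follow the negative gradient with step size \(\alpha_t\). A minimal sketch in TensorFlow (the function \((\theta - 5)^2\), the starting point, and the learning rate are illustrative, not from the original):

import tensorflow as tf

theta = tf.Variable(0.)
alpha = 0.1

for step in range(100):
    with tf.GradientTape() as tape:
        loss = (theta - 5.) ** 2       # f(theta)
    grad = tape.gradient(loss, theta)  # ∇f(theta_t)
    theta.assign_sub(alpha * grad)     # theta_{t+1} = theta_t - alpha * grad

print(theta.numpy())  # approaches 5.0, the minimizer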
import tensorflow as tf

w = tf.constant(1.)
x = tf.constant(2.)
y = x * w          # computed before the tape starts recording

with tf.GradientTape() as tape:
    tape.watch([w])
    y2 = x * w

grad1 = tape.gradient(y, [w])  # y was never recorded on the tape
grad1

[None]
with tf.GradientTape() as tape:
    tape.watch([w])
    y2 = x * w

grad2 = tape.gradient(y2, [w])  # y2 was recorded, so dy2/dw = x = 2.0
grad2

[<tf.Tensor: id=30, shape=(), dtype=float32, numpy=2.0>]
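tape.watch is only needed here because w is a tf.constant; a trainable tf.Variable is watched automatically. A sketch of the same computation with a variable (w_var is an illustrative name, not from the original):

w_var = tf.Variable(1.)   # variables are recorded without tape.watch

with tf.GradientTape() as tape:
    y3 = x * w_var        # x = tf.constant(2.) from above

tape.gradient(y3, w_var)  # <tf.Tensor: ... numpy=2.0>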
try:
    grad2 = tape.gradient(y2, [w])  # second call on the same tape
except Exception as e:
    print(e)

GradientTape.gradient can only be called once on non-persistent tapes.
with tf.GradientTape(persistent=True) as tape:
    tape.watch([w])
    y2 = x * w

grad2 = tape.gradient(y2, [w])
grad2

[<tf.Tensor: id=35, shape=(), dtype=float32, numpy=2.0>]

grad2 = tape.gradient(y2, [w])  # a persistent tape can be queried repeatedly
grad2

[<tf.Tensor: id=39, shape=(), dtype=float32, numpy=2.0>]
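A persistent tape keeps its intermediate results alive, so the TensorFlow guide recommends dropping the reference once you are done with it:

del tape  # release the resources held by the persistent tape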
For \(y = xw + b\), the first derivative with respect to \(w\) is \(y' = x\), so the second derivative is

\(\frac{\partial^2 y}{\partial w^2} = \frac{\partial y'}{\partial w} = \frac{\partial x}{\partial w} = \text{None}\)

Since \(x\) does not depend on \(w\), TensorFlow reports this unconnected gradient as None rather than 0.
b = tf.constant(3.)  # b was not defined above; the value here is illustrative

with tf.GradientTape() as t1:
    t1.watch([w, b])
    with tf.GradientTape() as t2:
        t2.watch([w, b])
        y = x * w + b
    dy_dw, dy_db = t2.gradient(y, [w, b])
d2y_dw2 = t1.gradient(dy_dw, w)  # None: dy_dw = x does not depend on w
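To get a non-None second derivative from the nested tapes, the function must be nonlinear in w; a sketch (the cubic and the names are illustrative, not from the original):

with tf.GradientTape() as t1:
    t1.watch([w])
    with tf.GradientTape() as t2:
        t2.watch([w])
        z = w ** 3            # z' = 3w^2, z'' = 6w
    dz_dw = t2.gradient(z, w)
d2z_dw2 = t1.gradient(dz_dw, w)
d2z_dw2                       # 6.0 at w = 1.0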