Source code
#> tutorial: https://www.cnblogs.com/xianhan/p/9090426.html
import tensorflow as tf  # TensorFlow 1.x API, as used throughout the tutorial
import numpy as np
import matplotlib.pyplot as plt

# Step 1: Build the model
# 1. The linear model in TensorFlow
## Placeholder: an entry point through which actual data values are fed into the model when gradient descent runs, e.g. house area (x) and house price (y_).
x = tf.placeholder(tf.float32, [None, 1])  # x: an Nx1 placeholder for the input vector
## Variables: the values we are trying to find that drive the cost function to its minimum, e.g. W and b.
W = tf.Variable(tf.zeros([1,1]))  # tf.zeros([1,1]) builds a 2-D array of one row with one element: [[ 0.]]
b = tf.Variable(tf.zeros([1]))    # tf.zeros([1]) builds a 1-D array with a single element: [0.]
## The linear model (y = W.x + b) in TensorFlow is then:
y = tf.matmul(x, W) + b
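## A quick shape check at graph-construction time (illustrative; it uses only the ops defined above):
## x is [None, 1] and W is [1, 1], so tf.matmul(x, W) is [None, 1], and adding b (shape [1]) broadcasts across the rows.
# print(y.get_shape())  # => (?, 1)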
# 2. The cost function in TensorFlow
## Just as with the inputs, we create a placeholder through which the actual house prices (y_) of the data points are fed into the model.
y_ = tf.placeholder(tf.float32, [None, 1])
## The cost is the sum of squared errors:
cost = tf.reduce_sum(tf.pow(y_ - y, 2))  # the summed squared error over the sample points serves as the fitting cost
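## A sanity check of the same cost outside the graph (illustrative; the demo arrays are made-up numbers, not part of the tutorial's data):
# y_true_demo = np.array([[1.5], [1.0]])
# y_pred_demo = np.array([[1.0], [2.0]])
# np.sum((y_true_demo - y_pred_demo) ** 2)  # => 0.25 + 1.0 = 1.25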
# 3. Data
## Since we have no actual data points for house price (y_) and house area (x), we generate them.
## For simplicity, the house price (ys) is a fixed linear function of the house area (xs), here 2*xs + 20.
## (This loop only illustrates the generation; the training loop below regenerates the data point by point.)
for i in range(100):
    # create fake data standing in for actual data
    xs = np.array([[i]])
    ys = np.array([[2*i+20]])
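## The same fake dataset can also be built in one vectorized step (an equivalent sketch; xs_all/ys_all are names introduced here and not used elsewhere):
xs_all = np.arange(100).reshape(-1, 1)  # house areas 0..99 as a 100x1 column
ys_all = 2 * xs_all + 20                # the matching house prices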
# 4. Gradient descent
## With the linear model, the cost function and the data in place, we can run gradient descent to minimize the cost and obtain the "good" values of W and b.
learning_rate = 0.001  # learning rate, a.k.a. step size: how far each training step moves along the steepest gradient direction
train_step = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)
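## Roughly what minimize() does under the hood: compute dcost/dW and dcost/db, then step against them.
## A hand-rolled equivalent sketch (grad_W, grad_b and manual_step are names introduced here for illustration):
grad_W, grad_b = tf.gradients(cost, [W, b])
manual_step = tf.group(
    tf.assign(W, W - learning_rate * grad_W),
    tf.assign(b, b - learning_rate * grad_b))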
# Step 2: Train the model
## Training means running gradient descent either a predetermined number of times, or until the cost function falls below some predetermined threshold.
# 1. A TensorFlow quirk
## All variables need to be initialized at the start of training, otherwise they may carry residual values from a previous run.
init = tf.global_variables_initializer()  # tf.initialize_all_variables() is the old, deprecated name for this op
# 2. The TensorFlow session
## Although TensorFlow is a Python library and Python is an interpreted language, by default TensorFlow does not execute operations eagerly, for performance reasons; so the init above has not actually run yet.
## Instead, TensorFlow computation happens inside a session: create one (sess) and use sess.run() to execute ops.
session = tf.Session()
session.run(init)
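## The more idiomatic form closes the session automatically when the block ends (an equivalent sketch of the two lines above):
# with tf.Session() as sess:
#     sess.run(init)
#     ...  # the training loop below would sit inside this block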
steps = 50  # at this fixed learning rate, pushing the iteration count much higher makes the updates unstable on the larger inputs, and the estimates start to oscillate (visible near the end of the output below)
# Similarly, we call session.run() inside a loop to execute the train_step defined above
arrayX = []
arrayY = []
for i in range(steps):
    # Create fake data for y = W*x + b where W=2, b=0.2
    xs = np.array([[i]])
    ys = np.array([[2*i+0.2]])
    # xs = np.array([x_data[i]]);
    # ys = np.array([y_true[i]]);
    arrayX.extend(xs[0])
    arrayY.extend(ys[0])
    # Train
    feed = {x: xs, y_: ys}
    session.run(train_step, feed_dict=feed)  # feed them into train_step
    # View
    print("After %d iteration:" % i)
    print("W:%f" % session.run(W))
    print("b:%f" % session.run(b))
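## The other stopping rule mentioned above, training until the cost falls below a threshold, would look roughly like this (a sketch; the threshold and loop bound are illustrative values, not from the tutorial):
# threshold = 1e-4
# for i in range(10000):
#     session.run(train_step, feed_dict={x: xs, y_: ys})
#     if session.run(cost, feed_dict={x: xs, y_: ys}) < threshold:
#         break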
# Visualization
print("W:\n", session.run(W))
print("b:\n", session.run(b))
arrayX = np.array(arrayX)
arrayX = arrayX.reshape((1, steps))
arrayB = np.array(np.full(steps, session.run(b)))  # repeat the fitted b once per sample point
arrayB = arrayB.reshape(1, steps)
arrayB = np.transpose(arrayB)
# print("arrayB:\n",arrayB);
predictYs = np.dot(np.transpose(arrayX), session.run(W)) + arrayB  # the fitted line evaluated at every x
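## Equivalently, the predictions can come from the graph itself rather than a numpy re-implementation of the model (a sketch using the y op defined above):
# predictYs = session.run(y, feed_dict={x: np.transpose(arrayX)})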
# print(predictYs);
# print(arrayX)
# print(arrayY)
plt.rcParams['figure.dpi'] = 300  # figure resolution
plt.scatter(arrayX, arrayY, marker='*', color='red', s=10, label='Actual Dataset')
plt.scatter(arrayX, predictYs, marker='o', color='green', s=8, label='Fit Dataset')
plt.legend(loc='best')  # let matplotlib pick the best legend position
plt.show()
Output:
After 0 iteration:
W:0.000000
b:0.000400
After 1 iteration:
W:0.004399
b:0.004799
After 2 iteration:
W:0.021145
b:0.013172
After 3 iteration:
W:0.057885
b:0.025419
After 4 iteration:
W:0.121430
b:0.041305
After 5 iteration:
W:0.216945
b:0.060408
After 6 iteration:
W:0.347000
b:0.082084
After 7 iteration:
W:0.510645
b:0.105462
After 8 iteration:
W:0.702795
b:0.129480
After 9 iteration:
W:0.914212
b:0.152971
After 10 iteration:
W:1.132310
b:0.174781
After 11 iteration:
W:1.342846
b:0.193921
After 12 iteration:
W:1.532252
b:0.209704
After 13 iteration:
W:1.690099
b:0.221846
After 14 iteration:
W:1.810968
b:0.230480
After 15 iteration:
W:1.895118
b:0.236090
After 16 iteration:
W:1.947663
b:0.239374
After 17 iteration:
W:1.976575
b:0.241075
After 18 iteration:
W:1.990276
b:0.241836
After 19 iteration:
W:1.995707
b:0.242122
After 20 iteration:
W:1.997456
b:0.242209
After 21 iteration:
W:1.997927
b:0.242232
After 22 iteration:
W:1.998075
b:0.242238
After 23 iteration:
W:1.998169
b:0.242242
After 24 iteration:
W:1.998251
b:0.242246
After 25 iteration:
W:1.998325
b:0.242249
After 26 iteration:
W:1.998393
b:0.242251
After 27 iteration:
W:1.998455
b:0.242254
After 28 iteration:
W:1.998512
b:0.242256
After 29 iteration:
W:1.998564
b:0.242258
After 30 iteration:
W:1.998613
b:0.242259
After 31 iteration:
W:1.998658
b:0.242261
After 32 iteration:
W:1.998701
b:0.242262
After 33 iteration:
W:1.998741
b:0.242263
After 34 iteration:
W:1.998778
b:0.242264
After 35 iteration:
W:1.998813
b:0.242265
After 36 iteration:
W:1.998846
b:0.242266
After 37 iteration:
W:1.998878
b:0.242267
After 38 iteration:
W:1.998907
b:0.242268
After 39 iteration:
W:1.998935
b:0.242269
After 40 iteration:
W:1.998960
b:0.242269
After 41 iteration:
W:1.998989
b:0.242270
After 42 iteration:
W:1.999004
b:0.242270
After 43 iteration:
W:1.999050
b:0.242271
After 44 iteration:
W:1.999007
b:0.242270
After 45 iteration:
W:1.999222
b:0.242275
After 46 iteration:
W:1.998624
b:0.242262
After 47 iteration:
W:2.000731
b:0.242307
After 48 iteration:
W:1.993301
b:0.242152
After 49 iteration:
W:2.021338
b:0.242724
W:
[[ 2.02133822]]
b:
[ 0.24272442]