Copyright notice: this is an original post by the author. Reposting is welcome; please credit the source. Contact: 460356155@qq.com
TensorFlow is an open-source deep learning framework developed by Google, and is currently the most widely used deep learning framework.
1. Installation
Installing TensorFlow on Ubuntu 16.04 is straightforward:
pip install tensorflow==1.1.0 --user
To verify that the installation succeeded:
>>> import tensorflow as tf
>>> tf.__version__
'1.1.0'
>>> session = tf.Session()
>>> a = tf.constant(100)
>>> b = tf.constant(200)
>>> print(session.run(a+b))
300
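The snippet above only evaluates constants. The training script in the next section instead feeds inputs at run time through tf.placeholder and feed_dict; here is a minimal sketch of that pattern (the names a, b and total are purely illustrative):

# Minimal placeholder/feed_dict sketch (illustrative names only)
import tensorflow as tf

a = tf.placeholder(tf.float32)   # value supplied when the graph is run
b = tf.placeholder(tf.float32)
total = a + b

with tf.Session() as sess:
    print(sess.run(total, feed_dict={a: 100.0, b: 200.0}))   # 300.0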
2. MNIST Training
Define a three-layer fully connected network: 784 × 300 × 10. The full code is below:
# -*- coding:utf-8 -*-
u"""Training MNIST with TensorFlow"""

__author__ = 'zhengbiqing 460356155@qq.com'

import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data

# Hyperparameters
learning_rate = 0.5
epochs = 5000
batch_size = 128


def main():
    # Model definition
    # Input images are 28 x 28 = 784 pixels
    x = tf.placeholder(tf.float32, [None, 784])

    # Input layer ----> hidden layer: weight and bias initialization
    W1 = tf.Variable(tf.random_normal([784, 300], stddev=0.03), name='W1')
    b1 = tf.Variable(tf.random_normal([300]), name='b1')

    # Hidden layer ----> output layer: weight and bias initialization
    W2 = tf.Variable(tf.random_normal([300, 10], stddev=0.03), name='W2')
    b2 = tf.Variable(tf.random_normal([10]), name='b2')

    # Hidden layer output
    hidden_out = tf.add(tf.matmul(x, W1), b1)
    hidden_out = tf.nn.relu(hidden_out)

    # Model output (logits); sparse_softmax_cross_entropy applies the softmax
    # internally, so no explicit tf.nn.softmax is needed here
    model_out = tf.add(tf.matmul(hidden_out, W2), b2)

    # Cross-entropy loss
    y = tf.placeholder(tf.int64, [None])
    cross_entropy = tf.losses.sparse_softmax_cross_entropy(labels=y, logits=model_out)

    # Optimizer and the objective to minimize
    optimizer = tf.train.GradientDescentOptimizer(learning_rate=learning_rate).minimize(cross_entropy)

    # Test accuracy: fraction of predictions that match the labels
    correct = tf.equal(tf.argmax(model_out, 1), y)
    accuracy = tf.reduce_mean(tf.cast(correct, tf.float32))

    # MNIST dataset
    mnist = input_data.read_data_sets("MNIST_data/")

    # Create the session
    with tf.Session() as sess:
        # Initialize all variables
        sess.run(tf.global_variables_initializer())

        # Training loop
        for epoch in range(epochs):
            batch_xs, batch_ys = mnist.train.next_batch(batch_size)
            sess.run(optimizer, feed_dict={x: batch_xs, y: batch_ys})

            # Report test accuracy every 50 iterations
            if epoch % 50 == 0:
                acc = sess.run(accuracy, feed_dict={x: mnist.test.images, y: mnist.test.labels})
                print('Epoch:%d, Acc:%f' % (epoch, acc))


if __name__ == '__main__':
    main()
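Once the training loop finishes, the same session can also be used for a single prediction. A minimal sketch, assuming it is added inside the with tf.Session() block after the loop (x, model_out and mnist are the names from the script above):

# Hypothetical addition inside the session, after the training loop:
# predict the digit of the first test image and compare with its label
prediction = tf.argmax(model_out, 1)
pred = sess.run(prediction, feed_dict={x: mnist.test.images[:1]})
print('predicted: %d, label: %d' % (pred[0], mnist.test.labels[0]))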
Run output:
zbq@zbq:~/tf$ python tf-minist.py
Extracting MNIST_data/train-images-idx3-ubyte.gz
Extracting MNIST_data/train-labels-idx1-ubyte.gz
Extracting MNIST_data/t10k-images-idx3-ubyte.gz
Extracting MNIST_data/t10k-labels-idx1-ubyte.gz
Epoch:0, Acc:0.097400
Epoch:50, Acc:0.606300
Epoch:100, Acc:0.726400
Epoch:150, Acc:0.745900
Epoch:200, Acc:0.751400
......
Epoch:4800, Acc:0.957200
Epoch:4850, Acc:0.957800
Epoch:4900, Acc:0.958000
Epoch:4950, Acc:0.958700
After 5000 iterations the accuracy reaches about 95%, which is quite decent for a simple three-layer fully connected network.
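For reference, the size of this network follows directly from the 784 × 300 × 10 structure; a small worked count (plain arithmetic, not part of the original script):

# Parameter count for the 784 x 300 x 10 fully connected network
w1 = 784 * 300   # input -> hidden weights
b1 = 300         # hidden biases
w2 = 300 * 10    # hidden -> output weights
b2 = 10          # output biases
print(w1 + b1 + w2 + b2)   # 238510

That is roughly 240,000 trainable parameters.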