TensorFlow Study Notes

# Activate the virtualenv before using TensorFlow
$ source ~/tensorflow/bin/activate

# The prompt is prefixed after activation
(tensorflow)$

# Deactivate when finished
(tensorflow)$ deactivate

# Remove TensorFlow
$ rm -r ~/tensorflow

 

1. Getting Started With TensorFlow

  The basic unit of TensorFlow is the tensor; a tensor's rank is its number of dimensions.
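  For instance (literal values, following the official Getting Started tutorial):

3                                 # a rank 0 tensor: a scalar with shape []
[1., 2., 3.]                      # a rank 1 tensor: a vector with shape [3]
[[1., 2., 3.], [4., 5., 6.]]      # a rank 2 tensor: a matrix with shape [2, 3]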

  

  A TensorFlow program is built as a computational graph. Nodes come in several types, including constants and variables.

  

  To evaluate nodes in the computational graph, a session is needed.

sess = tf.Session()
print(sess.run([node1, node2]))

  Visualization: TensorBoard can be used to display the computational graph.
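  A minimal sketch of writing the default graph out for TensorBoard (assuming TensorFlow 1.x; the './logs' directory name is an arbitrary choice):

import tensorflow as tf
a = tf.constant(3.0, name='a')
b = tf.constant(4.0, name='b')
total = tf.add(a, b, name='total')
writer = tf.summary.FileWriter('./logs', tf.get_default_graph())  # serializes the graph for TensorBoard
writer.close()
# then run: tensorboard --logdir=./logs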

  

  Placeholders provide function-like behavior:

a = tf.placeholder(tf.float32)
b = tf.placeholder(tf.float32)
adder_node = a + b  # + provides a shortcut for tf.add(a, b)

print(sess.run(adder_node, {a: 3, b: 4.5}))

  Variables let you add trainable parameters to the graph.
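  For example, as in the Getting Started tutorial (a minimal sketch; assumes the session `sess` from above):

W = tf.Variable([.3], dtype=tf.float32)
b = tf.Variable([-.3], dtype=tf.float32)
x = tf.placeholder(tf.float32)
linear_model = W * x + b
sess.run(tf.global_variables_initializer())  # variables must be explicitly initialized
print(sess.run(linear_model, {x: [1, 2, 3, 4]}))  # ≈ [0., 0.3, 0.6, 0.9]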

 

2. Deep MNIST for Experts

The InteractiveSession class is more convenient and flexible.

import tensorflow as tf
sess = tf.InteractiveSession()

Placeholders with a shape argument can automatically catch bugs caused by operations on tensors with inconsistent shapes.

x = tf.placeholder(tf.float32, shape=[None, 784])  # None means the number of samples is variable; each sample has length 784 (28 * 28)
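A hedged illustration of that shape check (values are hypothetical): feeding an array whose second dimension is not 784 fails immediately.

import numpy as np
bad_batch = np.zeros((5, 100), dtype=np.float32)
# sess.run(x, feed_dict={x: bad_batch})  # raises ValueError: cannot feed value of shape (5, 100)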

 Includes a tutorial on deep convolutional neural networks.

 

3. Read Data

batchsize: the number of samples taken from the training set for each training step

iteration: one training pass using batchsize samples

epoch: one training pass using every sample in the training set
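A quick numeric sketch of how these relate (the sample count and batch size are hypothetical):

num_samples = 1000                              # hypothetical training-set size
batch_size = 100
iters_per_epoch = num_samples // batch_size
print(iters_per_epoch)                          # 10 iterations make up one epoch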

 

# the files to read are listed in a Python list
filename = []
# tf.train.string_input_producer() creates a queue of file names
filename_queue = tf.train.string_input_producer(filename, shuffle=False, num_epochs=5)
# a reader pulls data from the file-name queue via reader.read
reader = tf.WholeFileReader()
key, value = reader.read(filename_queue)
# tf.train.string_input_producer defines an epoch counter as a local variable, which must be initialized
tf.local_variables_initializer().run()
# the queue is only filled once start_queue_runners is called
threads = tf.train.start_queue_runners(sess=sess)
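A hedged sketch of the read loop that would follow (assumes an active default session `sess`, e.g. tf.InteractiveSession(), and that `filename` actually lists files):

try:
    while True:
        k, v = sess.run([key, value])  # dequeues one file name and its raw contents per call
except tf.errors.OutOfRangeError:
    pass  # raised once every file has been produced num_epochs times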

For details, see: http://blog.csdn.net/buptgshengod/article/details/72956846

 

numpy

import numpy as np
data = np.loadtxt(c)  # np.loadtxt (not np.loadtext) returns an ndarray
ones = np.ones((2, 3))  # a new array of the given shape and type, filled with ones
i = 1 * 10 ** (-10)  # ** means power; its precedence is higher than *
np.eye(3, dtype=int)  # 3x3 identity matrix of ints: ones on the diagonal, zeros elsewhere
vector = np.hstack((vector, temp))  # concatenation along the second axis (horizontally for 2-D arrays)
np.savetxt(filename, array, fmt='%d')  # np.savetxt (not np.save) accepts fmt; writes the data as integers

 

4. Getting Your Hands Dirty

To transpose a 1-D numpy array, you only need to change its shape:

array_1d = np.array([1, 2])
array_1d.shape = (2, 1)  # finished: now a 2x1 column vector
array_2d.transpose()  # for 2-D arrays, use the transpose method

A Python assert statement declares that its boolean condition must be true:

assert len(lists) >= 5  # raises AssertionError if the condition is false

 A single leading underscore in a name marks the attribute as "private", intended for internal use only. (_spam)
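 A minimal illustration (class and attribute names are hypothetical):

class Config(object):
    def __init__(self):
        self._spam = 42  # leading underscore: internal use by convention

c = Config()
print(c._spam)  # still accessible; the underscore is a convention, not enforcement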

 

 collections is a standard-library module providing many useful container classes, such as namedtuple (a factory for tuple subclasses with named fields):

>>> from collections import namedtuple
>>> Point = namedtuple('Point', ['x', 'y'])
>>> p = Point(1, 2)
>>> p.x
1
>>> p.y
2

 5. Model  

  The four parameters in strides are: the first is the number of images (the batch), the second the image height, the third the image width, and the fourth the number of channels.
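  A minimal sketch of a convolution using these strides (shapes follow the MNIST tutorials linked below; TF 1.x assumed):

import tensorflow as tf
x = tf.placeholder(tf.float32, shape=[None, 28, 28, 1])           # NHWC: batch, height, width, channels
W = tf.Variable(tf.truncated_normal([5, 5, 1, 32], stddev=0.1))   # 5x5 filters, 1 input channel, 32 output channels
conv = tf.nn.conv2d(x, W, strides=[1, 1, 1, 1], padding='SAME')   # stride 1 along batch, height, width, channels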

  Convolutional neural networks in TensorFlow for beginners: https://www.jianshu.com/p/70c8d6663b00
  Deep MNIST (Chinese TensorFlow docs): http://www.tensorfly.cn/tfdoc/tutorials/mnist_pros.html
  Batch Normalization: https://morvanzhou.github.io/tutorials/machine-learning/tensorflow/5-13-BN/
  martin-gorner/tensorflow-mnist-tutorial (batch normalization): https://github.com/martin-gorner/tensorflow-mnist-tutorial

6. Computing AUC

import tensorflow as tf
a = tf.Variable([0.1, 0.5])
b = tf.Variable([0.2, 0.6])
auc = tf.contrib.metrics.streaming_auc(a, b)
sess = tf.Session()
sess.run(tf.initialize_all_variables())
sess.run(tf.initialize_local_variables())  # try commenting this line and you'll get an error
train_auc = sess.run(auc)
print(train_auc)

https://stackoverflow.com/questions/39435341/how-to-calculate-auc-with-tensorflow

 7. Visualization (a good reference)

Getting started with TensorBoard, TensorFlow's visualization tool:

https://blog.csdn.net/mao_feng/article/details/54731098

tensorflow + mnist + cnn + batch normalization

https://github.com/hwalsuklee/tensorflow-mnist-cnn

 

8. Overfitting and Remedies

dropout (a minimal sketch follows these links): https://blog.csdn.net/smf0504/article/details/55254818

batch normalization https://blog.csdn.net/whitesilence/article/details/75667002

slim CNN code example: https://www.2cto.com/kf/201706/649266.html
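As a quick reference for dropout, a minimal TF 1.x sketch (`h` stands for some hypothetical hidden-layer tensor):

keep_prob = tf.placeholder(tf.float32)
h_drop = tf.nn.dropout(h, keep_prob)  # randomly zeroes units and rescales the rest by 1/keep_prob
# feed keep_prob=0.5 (say) during training and keep_prob=1.0 during evaluation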

 

9. Paper Writing

CNN architectures: https://blog.csdn.net/fengbingchun/article/details/50529500
