Machine Learning: Implementing a Simple Neural Network (missing code)

Course URL: https://www.imooc.com/learn/813html

1. Basic Concepts of Machine Learning

2. The Perceptron Classification Algorithm

1. Overall description of the classification algorithm

Classification behavior

Dot product of vectors

Transpose of a matrix

2. The perceptron classification algorithm

Add the w0 and x0 terms: when z is greater than 0 the output is 1; when z is less than 0 the output is -1.
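The quantizer just described can be sketched as follows (the weight and input values here are made up for illustration):

```python
import numpy as np

def step(z):
    """Quantizer: 1 when the net input z is positive, otherwise -1."""
    return np.where(z > 0, 1, -1)

# net input with the threshold w0 folded in (its input x0 is fixed at 1)
w = np.array([0.0, 0.5, -0.5])   # [w0, w1, w2], illustrative values
x = np.array([2.0, 1.0])         # one input sample
z = np.dot(w[1:], x) + w[0]      # z = 0.5*2 - 0.5*1 = 0.5
print(step(z))                   # -> 1
```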

x(j): the array of input signal components.

The weight adjustment ∇w(j) is needed only when the perceptron misclassifies.

The learning rate is tuned to the situation at hand.

Update the first component.

Likewise, update the second component.

Update the third component.

When a new sample is fed in, update the weights again.

The threshold is initialized to 0; the signal component x0 is fixed at 1 and is therefore omitted.

Note that the algorithm requires the first case (figure 1), i.e. linearly separable data; the latter two cases are not suitable for the perceptron algorithm.

Goal: find the dashed line that separates the two classes.

Steps:

(1) Initialize the weight vector w.

(2) Feed a sample x into the perceptron.

(3) Take the dot product: multiply the components pairwise and sum.

(4) Feed the result into the step (activation) function to get 1 or -1.

(5) If the prediction is correct, output the signal and we have the final result; if it is wrong, feed the error back as described above, update the weight vector, and feed the remaining original training samples (or new training samples) into the perceptron again.
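The steps above boil down to one update per sample; a minimal sketch of a single misclassified-sample update, with illustrative numbers:

```python
import numpy as np

eta = 0.1                        # learning rate, illustrative
w = np.zeros(3)                  # [w0, w1, w2], threshold w0 included
xi = np.array([1.0, 2.0])        # one training sample
target = 1                       # its true class

# steps (3)-(4): net input, then the step (activation) function
z = np.dot(w[1:], xi) + w[0]
prediction = 1 if z > 0 else -1  # z = 0 here, so prediction = -1

# step (5): misclassified, so update the weight vector
update = eta * (target - prediction)   # 0.1 * (1 - (-1)) = 0.2
w[1:] += update * xi                   # [0.2, 0.4]
w[0] += update                         # 0.2
print(w)                               # -> [0.2 0.2 0.4]
```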

3. Python Implementation of the Perceptron Classification Algorithm

1. Implementing the perceptron object

In y, the label 1 corresponds to the sample [1,2,3] of X, and -1 corresponds to [4,5,6].

The training procedure is complete.

Take the dot product of the input signals, check whether the prediction is greater than 0, and return the class: 1 or -1.

The neuron algorithm is done.

2. Parsing and visualizing the data

The data to load is comma-separated.

Read the data and show the first ten rows.

Assign the label column (column index 4) to y.

Convert the strings into numbers.
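The string-to-number step is a one-liner with np.where; a tiny example (the label array is made up from the iris class names):

```python
import numpy as np

labels = np.array(['Iris-setosa', 'Iris-versicolor', 'Iris-setosa'])
# -1 for setosa, 1 for everything else
y = np.where(labels == 'Iris-setosa', -1, 1)
print(y)   # -> [-1  1 -1]
```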

Extract columns 0 and 2.

Visualize the data.

The data is linearly separable.

3. Classifying the data with the neural network

Fill the prepared regions with different colors.

Use meshgrid to build two 2-D matrices.

Take the first element and repeat it 185 times to form the first row; the row is then repeated to give a 255-row matrix.

ravel: flattens a 2-D array back into the 1-D vector it was expanded from.
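A small illustration of what meshgrid and ravel do (using a tiny grid rather than the lesson's 185-column one):

```python
import numpy as np

a = np.array([1, 2, 3])
b = np.array([10, 20])

# meshgrid repeats a along the rows and b along the columns
xx1, xx2 = np.meshgrid(a, b)
print(xx1)           # [[1 2 3]
                     #  [1 2 3]]
print(xx2)           # [[10 10 10]
                     #  [20 20 20]]

# ravel flattens each grid back to one dimension, so pairing the two
# flattened arrays enumerates every grid point exactly once
print(xx1.ravel())   # [1 2 3 1 2 3]
print(xx2.ravel())   # [10 10 10 20 20 20]
```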

4. The Adaptive Linear Neuron

1. Basic principle of the adaptive linear neuron

Use the summation result directly as the final output and compare it with the target.

If the tangent slope is positive, decrease the value of the neuron's w component.

If the tangent slope is negative, increase the value of the neuron's w component.

Repeat until w reaches the minimum of the cost.
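The slope rule above is plain gradient descent; a one-dimensional sketch on an illustrative cost J(w) = (w - 3)**2:

```python
# minimize J(w) = (w - 3)**2 by stepping against the tangent slope
def grad(w):
    return 2 * (w - 3)   # dJ/dw

w, eta = 0.0, 0.1
for _ in range(100):
    # positive slope -> w decreases; negative slope -> w increases
    w -= eta * grad(w)
print(round(w, 3))       # -> 3.0, the minimum of J
```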

2. Code implementation of the adaptive neuron

The adaptive linear neuron code is complete.

The main difference is the algorithm inside the fit function.

Copy the previous section's code into the earlier project folder.

Lines marked with ## are the modified places.

Use a chart to view the training process.

neuralNetwork.py

#-*- coding:utf-8 -*-
import numpy as np
'''
eta: learning rate
n_iter: number of passes over the training set
W: weight vector of the neuron
errors: records the number of misclassifications per pass
'''
class Perceptron(object):
    # initialization
    def __init__(self, eta=0.01, n_iter=10):
        self.eta = eta
        self.n_iter = n_iter

    '''
    Train the neuron on the input data; X holds the sample vectors,
    y the corresponding class labels.
    X: shape [n_samples, n_features]
    X: [[1,2,3],[4,5,6]]
    n_samples: 2
    n_features: 3
    y: [1, -1]
    '''
    def fit(self, X, y):
        # +1 for the w0 discussed earlier, i.e. the step-function threshold
        self.W = np.zeros(1 + X.shape[1])  # e.g. [0., 0., 0.]
        self.errors = []
        '''
        X: [[1,2,3],[4,5,6]]
        y: [1,-1]
        zip(X, y) -> [([1,2,3], 1), ([4,5,6], -1)]
        '''
        for _ in range(self.n_iter):
            error = 0  # misclassification count for this pass
            dw = np.zeros(1 + X.shape[1])  # accumulated weight update for the pass
            '''
            zip: pairs up the corresponding elements of its arguments
            >>> a = [1,2,3]
            >>> b = [4,5,6]
            >>> list(zip(a, b))
            [(1, 4), (2, 5), (3, 6)]
            '''
            for xi, target in zip(X, y):
                '''
                update = eta * (y - y_pred)
                xi is a vector, so update * xi expands to:
                [∇w(1) = xi[0]*update, ∇w(2) = xi[1]*update, ∇w(3) = xi[2]*update]
                '''
                update = self.eta * (target - self.predict(xi))
                dw[1:] += update * xi
                dw[0] += update
                error += int(update != 0)
            self.W += dw
            self.errors.append(error)

    '''
    z = W0*1 + W1*X1 + ... + Wn*Xn
    '''
    def net_input(self, xi):
        return np.dot(self.W[1:], xi) + self.W[0]

    def predict(self, xi):
        return np.where(self.net_input(xi) > 0, 1, -1)


# read the data file
file = './iris.data.csv'
import pandas as pd

df = pd.read_csv(file, header=None)
df.head(10)  # show the first ten rows

# plot the raw data
import matplotlib.pyplot as plt
import numpy as np

y = df.loc[0:99, 4].values        # label column (index 4)
y = np.where(y == 'Iris-setosa', -1, 1)
# X takes columns 0 and 2 (sepal length, petal length)
X = df.iloc[0:100, [0, 2]].values

plt.scatter(X[:50, 0], X[:50, 1], color='red', marker='o', label='setosa')
plt.scatter(X[50:100, 0], X[50:100, 1], color='blue', marker='x', label='versicolor')
plt.xlabel('sepal length')
plt.ylabel('petal length')
plt.legend(loc='upper left')
# plt.show()

# train and plot the misclassification curve
ppn = Perceptron(0.1, 20)
ppn.fit(X, y)
print(ppn.W)
plt.scatter(range(1, len(ppn.errors) + 1), ppn.errors, color='red', marker='o')

# plt.show()

# define a function that plots the classifier's decision regions
from matplotlib.colors import ListedColormap


def plot_decision_regions(X, y, classifier, resolution=0.02):
    markers = ('s', 'x', 'o', 'v')
    colors = ('red', 'blue', 'lightgreen', 'grey', 'cyan')
    cmap = ListedColormap(colors[:len(np.unique(y))])

    x1_min, x1_max = X[:, 0].min() - 1, X[:, 0].max() + 1
    x2_min, x2_max = X[:, 1].min() - 1, X[:, 1].max() + 1

    xx1, xx2 = np.meshgrid(np.arange(x1_min, x1_max, resolution),
                           np.arange(x2_min, x2_max, resolution))

    # classify every grid point, then reshape back to the grid shape
    z = classifier.predict(np.array([xx1.ravel(), xx2.ravel()]))
    z = z.reshape(xx1.shape)
    plt.contourf(xx1, xx2, z, alpha=0.8, cmap=cmap)
    plt.xlim(xx1.min(), xx1.max())
    plt.ylim(xx2.min(), xx2.max())

    for idx, cl in enumerate(np.unique(y)):
        plt.scatter(x=X[y == cl, 0], y=X[y == cl, 1], alpha=0.8, c=cmap(idx), marker=markers[idx], label=cl)


# plot the decision boundary together with the raw data
plot_decision_regions(X, y, ppn)
plt.xlabel('sepal length')
plt.ylabel('petal length')
plt.legend(loc='upper left')
# plt.show()

iris.data.csv

5.1,3.5,1.4,0.2,Iris-setosa
4.9,3.0,1.4,0.2,Iris-setosa
4.7,3.2,1.3,0.2,Iris-setosa
4.6,3.1,1.5,0.2,Iris-setosa
5.0,3.6,1.4,0.2,Iris-setosa
5.4,3.9,1.7,0.4,Iris-setosa
4.6,3.4,1.4,0.3,Iris-setosa
5.0,3.4,1.5,0.2,Iris-setosa
4.4,2.9,1.4,0.2,Iris-setosa
4.9,3.1,1.5,0.1,Iris-setosa
5.4,3.7,1.5,0.2,Iris-setosa
4.8,3.4,1.6,0.2,Iris-setosa
4.8,3.0,1.4,0.1,Iris-setosa
4.3,3.0,1.1,0.1,Iris-setosa
5.8,4.0,1.2,0.2,Iris-setosa
5.7,4.4,1.5,0.4,Iris-setosa
5.4,3.9,1.3,0.4,Iris-setosa
5.1,3.5,1.4,0.3,Iris-setosa
5.7,3.8,1.7,0.3,Iris-setosa
5.1,3.8,1.5,0.3,Iris-setosa
5.4,3.4,1.7,0.2,Iris-setosa
5.1,3.7,1.5,0.4,Iris-setosa
4.6,3.6,1.0,0.2,Iris-setosa
5.1,3.3,1.7,0.5,Iris-setosa
4.8,3.4,1.9,0.2,Iris-setosa
5.0,3.0,1.6,0.2,Iris-setosa
5.0,3.4,1.6,0.4,Iris-setosa
5.2,3.5,1.5,0.2,Iris-setosa
5.2,3.4,1.4,0.2,Iris-setosa
4.7,3.2,1.6,0.2,Iris-setosa
4.8,3.1,1.6,0.2,Iris-setosa
5.4,3.4,1.5,0.4,Iris-setosa
5.2,4.1,1.5,0.1,Iris-setosa
5.5,4.2,1.4,0.2,Iris-setosa
4.9,3.1,1.5,0.1,Iris-setosa
5.0,3.2,1.2,0.2,Iris-setosa
5.5,3.5,1.3,0.2,Iris-setosa
4.9,3.1,1.5,0.1,Iris-setosa
4.4,3.0,1.3,0.2,Iris-setosa
5.1,3.4,1.5,0.2,Iris-setosa
5.0,3.5,1.3,0.3,Iris-setosa
4.5,2.3,1.3,0.3,Iris-setosa
4.4,3.2,1.3,0.2,Iris-setosa
5.0,3.5,1.6,0.6,Iris-setosa
5.1,3.8,1.9,0.4,Iris-setosa
4.8,3.0,1.4,0.3,Iris-setosa
5.1,3.8,1.6,0.2,Iris-setosa
4.6,3.2,1.4,0.2,Iris-setosa
5.3,3.7,1.5,0.2,Iris-setosa
5.0,3.3,1.4,0.2,Iris-setosa
7.0,3.2,4.7,1.4,Iris-versicolor
6.4,3.2,4.5,1.5,Iris-versicolor
6.9,3.1,4.9,1.5,Iris-versicolor
5.5,2.3,4.0,1.3,Iris-versicolor
6.5,2.8,4.6,1.5,Iris-versicolor
5.7,2.8,4.5,1.3,Iris-versicolor
6.3,3.3,4.7,1.6,Iris-versicolor
4.9,2.4,3.3,1.0,Iris-versicolor
6.6,2.9,4.6,1.3,Iris-versicolor
5.2,2.7,3.9,1.4,Iris-versicolor
5.0,2.0,3.5,1.0,Iris-versicolor
5.9,3.0,4.2,1.5,Iris-versicolor
6.0,2.2,4.0,1.0,Iris-versicolor
6.1,2.9,4.7,1.4,Iris-versicolor
5.6,2.9,3.6,1.3,Iris-versicolor
6.7,3.1,4.4,1.4,Iris-versicolor
5.6,3.0,4.5,1.5,Iris-versicolor
5.8,2.7,4.1,1.0,Iris-versicolor
6.2,2.2,4.5,1.5,Iris-versicolor
5.6,2.5,3.9,1.1,Iris-versicolor
5.9,3.2,4.8,1.8,Iris-versicolor
6.1,2.8,4.0,1.3,Iris-versicolor
6.3,2.5,4.9,1.5,Iris-versicolor
6.1,2.8,4.7,1.2,Iris-versicolor
6.4,2.9,4.3,1.3,Iris-versicolor
6.6,3.0,4.4,1.4,Iris-versicolor
6.8,2.8,4.8,1.4,Iris-versicolor
6.7,3.0,5.0,1.7,Iris-versicolor
6.0,2.9,4.5,1.5,Iris-versicolor
5.7,2.6,3.5,1.0,Iris-versicolor
5.5,2.4,3.8,1.1,Iris-versicolor
5.5,2.4,3.7,1.0,Iris-versicolor
5.8,2.7,3.9,1.2,Iris-versicolor
6.0,2.7,5.1,1.6,Iris-versicolor
5.4,3.0,4.5,1.5,Iris-versicolor
6.0,3.4,4.5,1.6,Iris-versicolor
6.7,3.1,4.7,1.5,Iris-versicolor
6.3,2.3,4.4,1.3,Iris-versicolor
5.6,3.0,4.1,1.3,Iris-versicolor
5.5,2.5,4.0,1.3,Iris-versicolor
5.5,2.6,4.4,1.2,Iris-versicolor
6.1,3.0,4.6,1.4,Iris-versicolor
5.8,2.6,4.0,1.2,Iris-versicolor
5.0,2.3,3.3,1.0,Iris-versicolor
5.6,2.7,4.2,1.3,Iris-versicolor
5.7,3.0,4.2,1.2,Iris-versicolor
5.7,2.9,4.2,1.3,Iris-versicolor
6.2,2.9,4.3,1.3,Iris-versicolor
5.1,2.5,3.0,1.1,Iris-versicolor
5.7,2.8,4.1,1.3,Iris-versicolor
6.3,3.3,6.0,2.5,Iris-virginica
5.8,2.7,5.1,1.9,Iris-virginica
7.1,3.0,5.9,2.1,Iris-virginica
6.3,2.9,5.6,1.8,Iris-virginica
6.5,3.0,5.8,2.2,Iris-virginica
7.6,3.0,6.6,2.1,Iris-virginica
4.9,2.5,4.5,1.7,Iris-virginica
7.3,2.9,6.3,1.8,Iris-virginica
6.7,2.5,5.8,1.8,Iris-virginica
7.2,3.6,6.1,2.5,Iris-virginica
6.5,3.2,5.1,2.0,Iris-virginica
6.4,2.7,5.3,1.9,Iris-virginica
6.8,3.0,5.5,2.1,Iris-virginica
5.7,2.5,5.0,2.0,Iris-virginica
5.8,2.8,5.1,2.4,Iris-virginica
6.4,3.2,5.3,2.3,Iris-virginica
6.5,3.0,5.5,1.8,Iris-virginica
7.7,3.8,6.7,2.2,Iris-virginica
7.7,2.6,6.9,2.3,Iris-virginica
6.0,2.2,5.0,1.5,Iris-virginica
6.9,3.2,5.7,2.3,Iris-virginica
5.6,2.8,4.9,2.0,Iris-virginica
7.7,2.8,6.7,2.0,Iris-virginica
6.3,2.7,4.9,1.8,Iris-virginica
6.7,3.3,5.7,2.1,Iris-virginica
7.2,3.2,6.0,1.8,Iris-virginica
6.2,2.8,4.8,1.8,Iris-virginica
6.1,3.0,4.9,1.8,Iris-virginica
6.4,2.8,5.6,2.1,Iris-virginica
7.2,3.0,5.8,1.6,Iris-virginica
7.4,2.8,6.1,1.9,Iris-virginica
7.9,3.8,6.4,2.0,Iris-virginica
6.4,2.8,5.6,2.2,Iris-virginica
6.3,2.8,5.1,1.5,Iris-virginica
6.1,2.6,5.6,1.4,Iris-virginica
7.7,3.0,6.1,2.3,Iris-virginica
6.3,3.4,5.6,2.4,Iris-virginica
6.4,3.1,5.5,1.8,Iris-virginica
6.0,3.0,4.8,1.8,Iris-virginica
6.9,3.1,5.4,2.1,Iris-virginica
6.7,3.1,5.6,2.4,Iris-virginica
6.9,3.1,5.1,2.3,Iris-virginica
5.8,2.7,5.1,1.9,Iris-virginica
6.8,3.2,5.9,2.3,Iris-virginica
6.7,3.3,5.7,2.5,Iris-virginica
6.7,3.0,5.2,2.3,Iris-virginica
6.3,2.5,5.0,1.9,Iris-virginica
6.5,3.0,5.2,2.0,Iris-virginica
6.2,3.4,5.4,2.3,Iris-virginica
5.9,3.0,5.1,1.8,Iris-virginica

Further reading

1. Neural network primer: http://www.ruanyifeng.com/blog/2017/07/neural-network.html

2. GitHub project: https://github.com/a414351664/Perceptron-sort-algorithm
