AI in Practice - Assignment 5 - 陳澤寅

16071070 _ 陳澤寅 _ Assignment 5:

1. Overview

Item | Content
Course | AI in Practice 2019
Assignment requirements | assignment requirements
My goal in this course | understand AI theory and improve my coding ability
How this assignment helps me reach that goal | understand how a single-layer neural network works, grasp the pros and cons of several gradient descent methods, and implement a simple algorithm myself

2. Problem Requirements and Code Logic

  • Train a logical AND gate and a logical OR gate, and write up the results and code in a blog post
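Both gates below use the same single-layer setup: one-hot targets, a sigmoid activation over a linear map, and a cross-entropy loss minimized by gradient descent. In the code's notation (inputs X, weights W, bias b, one-hot targets Y), the forward pass and the weight gradient used in the update loop are:

```latex
\hat{Y} = \sigma(XW + b), \qquad
\sigma(z) = \frac{1}{1 + e^{-z}}, \qquad
\nabla_{W} L \propto X^{\top}\left(\hat{Y} - Y\right)
```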

1) AND gate

import numpy as np

# Truth-table inputs and AND labels, converted to one-hot targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
y_final = np.zeros((4, 2))
y_final[np.arange(4), y] = 1

weights = np.random.randn(2, 2)
bias = np.ones((1, 2))
loss = np.inf
# Stopping threshold (the printed results below suggest a looser
# threshold, e.g. 0.01, was used for that particular run).
while loss > 0.0001:
    # Forward pass: sigmoid of the linear combination.
    output = 1 / (1 + np.exp(-np.dot(X, weights) - bias))
    # Gradient step with an implicit learning rate of 1.
    weights -= np.dot(X.T, (output - y_final))
    bias -= np.mean(output - y_final, axis=0, keepdims=True)
    # Cross-entropy loss, averaged over all entries of the output.
    loss = -np.mean(np.log(output) * y_final)
print(output)
print(weights)
print(bias)
print(loss)

Code results:

*************
outputs:
[[  9.99980721e-01   2.00393691e-05]
 [  9.75025709e-01   2.52898406e-02]
 [  9.75025709e-01   2.52898406e-02]
 [  2.85468143e-02   9.71092396e-01]]
*************
weights:
[[-7.19543715  7.16967209]
 [-7.19543715  7.16967209]]
*************
loss:
0.00999197747583
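The loop above relies on an implicit learning rate of 1 for the weight update, which can be unstable on other problems. A minimal sketch of the same AND-gate training with an explicit learning rate and a fixed iteration budget (the values lr = 0.5 and 5000 steps are assumptions for illustration, not from the original run):

```python
import numpy as np

# Truth-table inputs and AND labels as one-hot targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
y_onehot = np.zeros((4, 2))
y_onehot[np.arange(4), y] = 1

rng = np.random.default_rng(0)
weights = rng.standard_normal((2, 2))
bias = np.zeros((1, 2))
lr = 0.5  # explicit learning rate (assumed value)

for _ in range(5000):
    # Forward pass and averaged gradient step.
    output = 1 / (1 + np.exp(-X @ weights - bias))
    grad = output - y_onehot
    weights -= lr * X.T @ grad / 4
    bias -= lr * grad.mean(axis=0, keepdims=True)

# Final forward pass; the argmax column is the predicted gate output.
output = 1 / (1 + np.exp(-X @ weights - bias))
pred = output.argmax(axis=1)
print(pred)
```

Averaging the gradient over the four samples and scaling by lr makes the step size independent of the batch size, at the cost of needing more iterations than the sum-based update in the post.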

Visualization:

import matplotlib.pyplot as plt

# Plot the four truth-table points, colored by label.
for i in range(X.shape[0]):
    if y[i] == 0:
        plt.plot(X[i, 0], X[i, 1], '.', c='b')
    elif y[i] == 1:
        plt.plot(X[i, 0], X[i, 1], '^', c='r')
plt.axis([-0.2, 1.2, -0.2, 1.2])

# Decision boundary from the column-0 logit.
w = -weights[0, 0] / weights[1, 0]
b = -bias[0, 0] / weights[1, 0]

x = np.linspace(0, 1, 10)
boundary = w * x + b  # renamed from y to avoid clobbering the labels
plt.plot(x, boundary)
plt.show()
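The slope w and intercept b of the plotted line come from setting the column-0 logit to zero, i.e. the set of points where the class-0 score crosses the decision threshold (using the code's indexing, where weights[i, j] weights input i toward class j):

```latex
w_{00}x_1 + w_{10}x_2 + b_0 = 0
\quad\Longrightarrow\quad
x_2 = -\frac{w_{00}}{w_{10}}\,x_1 - \frac{b_0}{w_{10}}
```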

2) OR gate

import numpy as np
import matplotlib.pyplot as plt

# Truth-table inputs and OR labels, converted to one-hot targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 1])
y_final = np.zeros((4, 2))
y_final[np.arange(4), y] = 1

weights = np.random.randn(2, 2)
bias = np.ones((1, 2))
loss = 100
while loss > 0.001:
    output = 1 / (1 + np.exp(-np.dot(X, weights) - bias))
    weights -= np.dot(X.T, (output - y_final))
    bias -= np.mean(output - y_final, axis=0, keepdims=True)
    # Cross-entropy over the true-class probabilities only
    # (unlike the AND version, which averaged over all entries).
    loss = -np.mean(np.log(output[np.arange(4), y]))
print(output)
print(weights)
print(bias)
print(loss)

Code results:

*************
outputs:
[[  9.97338067e-01   2.66987992e-03]
 [  6.65036579e-04   9.99332973e-01]
 [  6.65039192e-04   9.99332983e-01]
 [  1.18202142e-09   9.99999999e-01]]
*************
weights:
[[-13.24170158  13.2357432 ]
 [-13.24170551  13.23572832]]
*************
loss:
0.000999993120121

Visualization: (figure not reproduced here; analogous to the AND-gate plot above)
