1. Principle of the support vector machine (SVM) algorithm
The support vector machine (Support Vector Machine, commonly abbreviated SVM) is a supervised learning method widely used for statistical classification and regression analysis. It maps input vectors into a higher-dimensional space and constructs a maximum-margin hyperplane there. Two parallel hyperplanes are built on either side of the hyperplane that separates the data, and the separating hyperplane is chosen to maximize the distance between these two parallel hyperplanes. The assumption is that the larger the distance (margin) between the parallel hyperplanes, the smaller the classifier's overall error.
1. The basic idea of the support vector machine
For a linearly separable task, find a separating hyperplane with the maximum margin, as shown in the figure.
(1) The basic form of the support vector machine:
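The formula here appeared as an image in the original post; in standard textbook notation, the basic (hard-margin) form is:

```latex
\min_{\mathbf{w},\,b}\ \frac{1}{2}\lVert \mathbf{w} \rVert^{2}
\quad \text{s.t.} \quad y_i\bigl(\mathbf{w}^{\mathsf T}\mathbf{x}_i + b\bigr) \ge 1,
\qquad i = 1, \dots, m
```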
(2) The soft-margin optimization objective:
where the 0-1 loss counts the number of misclassified samples.
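The objective here was also an image; in the standard notation, written with the 0-1 loss, it is:

```latex
\min_{\mathbf{w},\,b}\ \frac{1}{2}\lVert \mathbf{w} \rVert^{2}
 + C \sum_{i=1}^{m} \ell_{0/1}\!\bigl( y_i(\mathbf{w}^{\mathsf T}\mathbf{x}_i + b) - 1 \bigr),
\qquad
\ell_{0/1}(z) = \begin{cases} 1, & z < 0 \\ 0, & \text{otherwise} \end{cases}
```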
(3) The kernel method:
where φ(·) is the feature mapping function.
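The kernel formulas were images in the original; the standard form, with a feature map φ replacing the raw inputs, is:

```latex
f(\mathbf{x}) = \mathbf{w}^{\mathsf T}\phi(\mathbf{x}) + b,
\qquad
\kappa(\mathbf{x}_i, \mathbf{x}_j) = \phi(\mathbf{x}_i)^{\mathsf T}\phi(\mathbf{x}_j)
```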
II. General experimental steps:
(1) load the data;
(2) normalize the data;
(3) run the SVM to find the optimal hyperplane;
(4) plot the separating hyperplane and the support vectors;
(5) use polynomial features to run a linear SVM in a higher-dimensional space;
(6) choose a suitable kernel function and run a nonlinear SVM;
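The steps above can be sketched compactly with a scikit-learn pipeline (a minimal sketch; the dataset, kernel, and gamma value are illustrative choices, and the plotting step is covered by the full scripts later in the post):

```python
from sklearn.datasets import make_moons
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC

# (1) load the data
X, y = make_moons(noise=0.15, random_state=666)

# (2) + (3)/(6): normalize, then fit an SVM (RBF kernel here)
clf = Pipeline([
    ('scaler', StandardScaler()),
    ('svm', SVC(kernel='rbf', gamma=1.0)),
])
clf.fit(X, y)
print(clf.score(X, y))  # training accuracy
```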
III. Strengths and weaknesses of the algorithm:
Strengths:
(1) kernel functions allow mapping into high-dimensional spaces;
(2) kernel functions make nonlinear classification tractable;
(3) the classification idea is simple: maximize the margin between the samples and the decision surface;
(4) classification performance is good.
Weaknesses:
(1) SVM is difficult to apply to large-scale training sets;
(2) solving multi-class problems with SVM is awkward;
(3) it is sensitive to missing data, and to the choice of parameters and kernel function.
2. Mathematical derivation
Training a linearly separable support vector machine can in practice be cast as a constrained optimization problem:
Derivation:
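The derivation was shown as an image in the original; a standard sketch, via the Lagrangian, is:

```latex
L(\mathbf{w}, b, \boldsymbol{\alpha})
  = \frac{1}{2}\lVert \mathbf{w} \rVert^{2}
  - \sum_{i=1}^{m} \alpha_i \bigl( y_i(\mathbf{w}^{\mathsf T}\mathbf{x}_i + b) - 1 \bigr),
\qquad \alpha_i \ge 0
```

Setting the partial derivatives with respect to w and b to zero gives the stationarity conditions:

```latex
\mathbf{w} = \sum_{i=1}^{m} \alpha_i y_i \mathbf{x}_i,
\qquad
\sum_{i=1}^{m} \alpha_i y_i = 0
```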
Result:
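The result was shown as an image; the standard outcome, obtained by substituting the stationarity conditions back into the Lagrangian, is the dual problem:

```latex
\max_{\boldsymbol{\alpha}} \;
  \sum_{i=1}^{m} \alpha_i
  - \frac{1}{2} \sum_{i=1}^{m} \sum_{j=1}^{m}
    \alpha_i \alpha_j \, y_i y_j \, \mathbf{x}_i^{\mathsf T} \mathbf{x}_j
\quad \text{s.t.} \quad
  \sum_{i=1}^{m} \alpha_i y_i = 0, \qquad \alpha_i \ge 0
```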
For linearly non-separable data, training the support vector machine can in practice be cast as a constrained soft-margin optimization problem:
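This formulation was shown as an image; written with slack variables ξᵢ in the standard notation, it is:

```latex
\min_{\mathbf{w},\,b,\,\boldsymbol{\xi}} \;
  \frac{1}{2}\lVert \mathbf{w} \rVert^{2} + C \sum_{i=1}^{m} \xi_i
\quad \text{s.t.} \quad
  y_i\bigl(\mathbf{w}^{\mathsf T}\mathbf{x}_i + b\bigr) \ge 1 - \xi_i,
  \qquad \xi_i \ge 0
```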
3. Code implementation
I. Linear SVM
import numpy as np
from sklearn.datasets import load_iris
import matplotlib.pyplot as plt
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC
from matplotlib.colors import ListedColormap
import warnings

def plot_decision_boundary(model, axis):
    # evaluate the model on a dense grid and shade each predicted region
    x0, x1 = np.meshgrid(
        np.linspace(axis[0], axis[1], int((axis[1] - axis[0]) * 100)).reshape(-1, 1),
        np.linspace(axis[2], axis[3], int((axis[3] - axis[2]) * 100)).reshape(-1, 1)
    )
    x_new = np.c_[x0.ravel(), x1.ravel()]
    y_predict = model.predict(x_new)
    zz = y_predict.reshape(x0.shape)
    custom_cmap = ListedColormap(['#EF9A9A', '#FFF59D', '#90CAF9'])
    plt.contourf(x0, x1, zz, cmap=custom_cmap)  # note: 'linewidth' is not a contourf argument and was removed
    # the two margin boundaries w.x + b = +1 and w.x + b = -1
    w = model.coef_[0]
    b = model.intercept_[0]
    plot_x = np.linspace(axis[0], axis[1], 200)
    up_y = -w[0] / w[1] * plot_x - b / w[1] + 1 / w[1]
    down_y = -w[0] / w[1] * plot_x - b / w[1] - 1 / w[1]
    up_index = (up_y >= axis[2]) & (up_y <= axis[3])
    down_index = (down_y >= axis[2]) & (down_y <= axis[3])
    plt.plot(plot_x[up_index], up_y[up_index], c='black')
    plt.plot(plot_x[down_index], down_y[down_index], c='black')

warnings.filterwarnings("ignore")
data = load_iris()
x = data.data
y = data.target
x = x[y < 2, :2]  # keep only the first two classes and the first two features
y = y[y < 2]
scaler = StandardScaler()
scaler.fit(x)
x = scaler.transform(x)
svc = LinearSVC(C=1e9)  # a very large C approximates a hard margin
svc.fit(x, y)
plot_decision_boundary(svc, axis=[-3, 3, -3, 3])
plt.scatter(x[y == 0, 0], x[y == 0, 1], c='r')
plt.scatter(x[y == 1, 0], x[y == 1, 1], c='b')
plt.show()
Output:
II. Nonlinear: polynomial features
import numpy as np
from sklearn import datasets
import matplotlib.pyplot as plt
from sklearn.preprocessing import PolynomialFeatures, StandardScaler
from sklearn.svm import LinearSVC
from sklearn.pipeline import Pipeline
from matplotlib.colors import ListedColormap
import warnings

def plot_decision_boundary(model, axis):
    # evaluate the model on a dense grid and shade each predicted region
    x0, x1 = np.meshgrid(
        np.linspace(axis[0], axis[1], int((axis[1] - axis[0]) * 100)).reshape(-1, 1),
        np.linspace(axis[2], axis[3], int((axis[3] - axis[2]) * 100)).reshape(-1, 1)
    )
    x_new = np.c_[x0.ravel(), x1.ravel()]
    y_predict = model.predict(x_new)
    zz = y_predict.reshape(x0.shape)
    custom_cmap = ListedColormap(['#EF9A9A', '#FFF59D', '#90CAF9'])
    plt.contourf(x0, x1, zz, cmap=custom_cmap)

def PolynomialSVC(degree, C=1.0):
    # polynomial feature expansion, then scaling, then a linear SVM
    return Pipeline([
        ('poly', PolynomialFeatures(degree=degree)),
        ('std_scaler', StandardScaler()),
        ('linearSVC', LinearSVC(C=C))  # fixed: C was hard-coded to 1e9, ignoring the parameter
    ])

warnings.filterwarnings("ignore")
poly_svc = PolynomialSVC(degree=3)
X, y = datasets.make_moons(noise=0.15, random_state=666)
poly_svc.fit(X, y)
plot_decision_boundary(poly_svc, axis=[-1.5, 2.5, -1.0, 1.5])
plt.scatter(X[y == 0, 0], X[y == 0, 1], c='red')
plt.scatter(X[y == 1, 0], X[y == 1, 1], c='blue')
plt.show()
Output:
III. Nonlinear: kernel method
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.pipeline import Pipeline
from sklearn import datasets
from matplotlib.colors import ListedColormap
import numpy as np
import matplotlib.pyplot as plt
import warnings

def plot_decision_boundary(model, axis):
    # evaluate the model on a dense grid and shade each predicted region
    x0, x1 = np.meshgrid(
        np.linspace(axis[0], axis[1], int((axis[1] - axis[0]) * 100)).reshape(-1, 1),
        np.linspace(axis[2], axis[3], int((axis[3] - axis[2]) * 100)).reshape(-1, 1)
    )
    x_new = np.c_[x0.ravel(), x1.ravel()]
    y_predict = model.predict(x_new)
    zz = y_predict.reshape(x0.shape)
    custom_cmap = ListedColormap(['#EF9A9A', '#FFF59D', '#90CAF9'])
    plt.contourf(x0, x1, zz, cmap=custom_cmap)

def RBFKernelSVC(gamma=1.0):
    # scaling, then an SVM with an RBF (Gaussian) kernel
    return Pipeline([
        ('std_scaler', StandardScaler()),
        ('svc', SVC(kernel='rbf', gamma=gamma))
    ])

warnings.filterwarnings("ignore")
X, y = datasets.make_moons(noise=0.15, random_state=666)
svc = RBFKernelSVC(gamma=100)  # a large gamma yields very tight, localized decision regions
svc.fit(X, y)
plot_decision_boundary(svc, axis=[-1.5, 2.5, -1.0, 1.5])
plt.scatter(X[y == 0, 0], X[y == 0, 1], c='red')
plt.scatter(X[y == 1, 0], X[y == 1, 1], c='blue')
plt.show()
Output: