Resource link: https://pan.baidu.com/s/1c1MIm1E password: gant
chapter 2 : linear regression with one feature
************************************************************************************************************
************************************************************************************************************
chapter 4 : linear regression with multiple features
************************************************************************************************************
************************************************************************************************************
chapter 5 : Octave
x = [0:0.01:1];
y1 = sin(2*pi*x);
plot(x, y1);
y2 = cos(2*pi*x);
hold on; plot(x, y2);          % overlay a second curve on the same axes
xlabel('time'); ylabel('value'); title('my plot');
legend('sin', 'cos');
print -dpng 'my.png';          % save the current figure to a file
close;
figure(1); plot(x, y1);        % draw in separate figure windows
figure(2); plot(x, y2);
figure(3);
subplot(1,2,1); plot(x, y1);   % 1x2 grid of panels, first panel
subplot(1,2,2); plot(x, y2);
axis([0.5 1 -1 1]);            % change the axis ranges of x and y
clf;                           % clear the current figure
a = magic(5)
imagesc(a);                    % visualize a matrix as an image
imagesc(a), colorbar, colormap gray;
*************************************************************************************************************
*************************************************************************************************************
chapter 6 : logistic regression and regularization
************************************************************************************************************
************************************************************************************************************
chapter 7 : regularization
************************************************************************************************************
************************************************************************************************************
chapter 8 : neural network
********************************************************************************************************
********************************************************************************************************
chapter 10 : Deciding what to try next
Diagnosing bias and variance
**********************************************************************************************************
**********************************************************************************************************
chapter 11 : precision and recall
*********************************************************************************************************
*********************************************************************************************************
chapter 12 : SVM
Kernels need to satisfy a technical condition called "Mercer's Theorem" to make sure SVM packages' optimizations run correctly and do not diverge.
Polynomial kernel: k(x, l) = (xᵀl + constant)^degree
The SVM training objective is a convex optimization problem, so solvers are guaranteed to find the global optimum.
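A minimal sketch of the point above, in Python with NumPy (names like `poly_kernel` and the toy data are my own, not from the course): a polynomial kernel with a nonnegative constant and integer degree satisfies Mercer's condition, which we can sanity-check numerically by confirming the Gram matrix it produces is symmetric positive semidefinite.

```python
import numpy as np

def poly_kernel(x, l, constant=1.0, degree=2):
    """Polynomial kernel k(x, l) = (x^T l + constant)^degree."""
    return (x @ l + constant) ** degree

# Toy data: 5 points in 3 dimensions.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))

# Gram matrix: K[i, j] = k(x_i, x_j).
K = np.array([[poly_kernel(X[i], X[j]) for j in range(len(X))]
              for i in range(len(X))])

# Mercer's condition implies K is symmetric positive semidefinite,
# i.e. all eigenvalues are >= 0 (up to floating-point error).
eigvals = np.linalg.eigvalsh(K)
print(np.all(eigvals >= -1e-9))
```

If the kernel violated Mercer's condition (e.g. a negative constant can do this), some eigenvalues could be negative and the SVM dual would no longer be a convex problem.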
*********************************************************************************************************
*********************************************************************************************************
chapter 13 : Unsupervised learning and clustering
*********************************************************************************************************
*********************************************************************************************************
chapter 14 : PCA
chapter 15 :