[AI] Deep Mathematics - Bayes

Mathematics is like the universe; ordinary folk only care about the practical parts of it.

scikit-learn (sklearn) official documentation (Chinese edition)

scikit-learn: Machine Learning in Python

 

A novel collection of online book resources; very nice.

Machine Learning Principles

 

Bayesian Machine Learning


9. [Bayesian] "I'm a Bayesian, whom shall I fear" series - Gaussian Process【ignore】

 Stochastic Processes 

[Scikit-learn] 1.1 Generalized Linear Models - Bayesian Ridge Regression【equivalent effect】

 

8. [Bayesian] "I'm a Bayesian, whom shall I fear" series - Variational Autoencoders

 Sparse Representation 

[UFLDL] Generative Model

[UFLDL] *Sparse Representation

 

7. [Bayesian] "I'm a Bayesian, whom shall I fear" series - Boltzmann Distribution【ignore】

 Bayesian Networks 

[Scikit-learn] Dynamic Bayesian Network - Conditional Random Field【denoising, part-of-speech tagging】

 

6. [Bayesian] "I'm a Bayesian, whom shall I fear" series - Markov and Hidden Markov Models【HMM and its extensions】

 Time-Series Models 

[Scikit-learn] Dynamic Bayesian Network - HMM【basic practice】

[Scikit-learn] Dynamic Bayesian Network - Kalman Filter【vehicle localization and prediction】

[Scikit-learn] *Dynamic Bayesian Network - Particle Filter【robot self-localization】

 

5. [Bayesian] "I'm a Bayesian, whom shall I fear" series - Continuous Latent Variables【dimensionality reduction: PCA, PPCA, FA, ICA】

 Probabilistic Dimensionality Reduction 

[Scikit-learn] 4.4 Dimensionality reduction - PCA

[Scikit-learn] 2.5 Dimensionality reduction - Probabilistic PCA & Factor Analysis

[Scikit-learn] 2.5 Dimensionality reduction - ICA

[Scikit-learn] 1.2 Dimensionality reduction - Linear and Quadratic Discriminant Analysis
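
Since these decompositions all share the same scikit-learn estimator interface, a minimal side-by-side sketch may help (synthetic data; the component counts and noise level are assumptions for illustration, not taken from the linked posts):

```python
import numpy as np
from sklearn.decomposition import PCA, FactorAnalysis, FastICA

rng = np.random.default_rng(0)
# 5-D observations generated from 2 latent sources plus noise.
latent = rng.normal(size=(500, 2))
X = latent @ rng.normal(size=(2, 5)) + 0.1 * rng.normal(size=(500, 5))

pca = PCA(n_components=2).fit(X)            # maximum-variance linear projection
fa = FactorAnalysis(n_components=2).fit(X)  # probabilistic latent-factor model
ica = FastICA(n_components=2).fit(X)        # statistically independent components

print(pca.explained_variance_ratio_)
print(fa.transform(X).shape, ica.transform(X).shape)
```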

 

4. [Bayesian] "I'm a Bayesian, whom shall I fear" series - Variational Inference【formula derivation walkthrough】

 Probabilistic Clustering 

[Scikit-learn] 2.1 Clustering - Gaussian mixture models & EM

[Scikit-learn] 2.1 Clustering - Variational Bayesian Gaussian Mixture
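
A minimal sketch contrasting the two estimators above (synthetic blobs; the component counts and the concentration prior are illustrative assumptions):

```python
import numpy as np
from sklearn.mixture import GaussianMixture, BayesianGaussianMixture
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=500, centers=3, random_state=0)

# Classic EM: the number of components is fixed in advance.
gmm = GaussianMixture(n_components=3).fit(X)

# Variational Bayes: give it too many components and let the
# Dirichlet prior shrink the weights of the unneeded ones.
vb_gmm = BayesianGaussianMixture(n_components=10, weight_concentration_prior=0.01).fit(X)

print(np.round(gmm.weights_, 2))
print(np.round(vb_gmm.weights_, 2))  # most weights should end up near zero
```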

 

3. [Bayesian] "I'm a Bayesian, whom shall I fear" series - Latent Variables【concept explanation】

 Latent Variable Models 

[Bayes] Concept Search and LSI

[Bayes] Concept Search and PLSA

[Bayes] Concept Search and LDA

 

 

2. [Bayesian] "I'm a Bayesian, whom shall I fear" series - Exact Inference【ignore】

1. [Bayesian] "I'm a Bayesian, whom shall I fear" series - Naive Bayes with Prior【a minimal example of Bayes in text classification】

 Naive Bayes 

[ML] Naive Bayes for Text Classification【overview of the principles】

[Bayes] Maximum Likelihood estimates for text classification【code implementation】

[Scikit-learn] 1.9 Naive Bayes【Naive Bayes with different priors】
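
A minimal scikit-learn sketch of Naive Bayes for text classification (the toy corpus and labels are made up for illustration; alpha is the Dirichlet-style smoothing prior on the word probabilities):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy corpus; texts and labels are invented for illustration only.
texts = ["free prize money now", "meeting schedule for monday",
         "win cash prize free", "project review meeting notes"]
labels = ["spam", "ham", "spam", "ham"]

# Bag-of-words counts + multinomial likelihood with Laplace-style smoothing.
model = make_pipeline(CountVectorizer(), MultinomialNB(alpha=1.0))
model.fit(texts, labels)

print(model.predict(["free money meeting"]))
```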

 

 

 Relationships among Common Distributions 

<Statistical Inference> goto: 647/686

 

 Prior and Posterior Distributions 

[Math] From Prior to Posterior distribution【prior/posterior basics】

[Bayes] qgamma & rgamma: Central Credible Interval【posterior interval estimation】

[Bayes] Multinomials and Dirichlet distribution【Dirichlet distribution】
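
As a rough Python counterpart of the R qgamma-based interval mentioned above (a sketch only; the Gamma posterior and its parameters are assumed for illustration, not taken from the linked post):

```python
from scipy import stats

# Assumed example: a Gamma(a, b) posterior for a Poisson rate
# (shape a, rate b); scipy parameterizes Gamma with scale = 1/rate.
a, b = 12.0, 4.0
posterior = stats.gamma(a, scale=1.0 / b)

# Central 95% credible interval: cut 2.5% from each tail,
# the Python analogue of R's qgamma(c(0.025, 0.975), a, b).
lower, upper = posterior.ppf([0.025, 0.975])
print(f"95% credible interval: ({lower:.3f}, {upper:.3f})")
```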

 

Two concepts here are especially important:

      • Non-informative prior
      • Jeffreys prior

Working with the posterior is where Bayesian statistical inference happens:

      • Posterior distribution and sufficiency
      • Posterior distribution under a noninformative prior
      • Posterior distribution under a conjugate prior
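
A minimal worked example of the conjugate case (a Beta prior on a Binomial success probability; the symbols a, b, n, k are illustrative assumptions, not from the linked posts):

$$
\theta \sim \mathrm{Beta}(a, b), \qquad k \mid \theta \sim \mathrm{Binomial}(n, \theta)
$$
$$
p(\theta \mid k) \;\propto\; \theta^{k}(1-\theta)^{n-k}\,\theta^{a-1}(1-\theta)^{b-1}
\;=\; \theta^{a+k-1}(1-\theta)^{b+n-k-1},
$$

so the posterior is again a Beta distribution, Beta(a + k, b + n - k): the prior family is preserved, which is exactly what "conjugate" means.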

Combined with a loss function: Bayesian statistical decision theory

      • Squared loss
      • Weighted squared loss
      • Absolute loss
      • Linear loss function
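
For reference, the Bayes estimators these losses lead to (standard results, stated here for convenience rather than taken from the linked material):

$$
\hat\theta_{\text{squared}} = \mathbb{E}[\theta \mid x], \qquad
\hat\theta_{\text{absolute}} = \operatorname{median}(\theta \mid x), \qquad
\hat\theta_{\text{weighted}} = \frac{\mathbb{E}[\,w(\theta)\,\theta \mid x\,]}{\mathbb{E}[\,w(\theta) \mid x\,]}.
$$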

 

 Sampling Methods 

An approximate evaluation strategy: Bayesian computational methods

    • MCMC sampling methods

[Bayes] MCMC (Markov Chain Monte Carlo)【exploits the stationarity of Markov chains】

(a).  The Metropolis-Hastings algorithm

(b).  The Gibbs sampling algorithm
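
A minimal random-walk Metropolis-Hastings sketch (the standard-normal target and step size are assumptions for illustration, not the implementation from the linked post):

```python
import numpy as np

def metropolis_hastings(log_target, n_samples=10_000, step=1.0, x0=0.0, seed=0):
    """Random-walk Metropolis-Hastings for a 1-D target density."""
    rng = np.random.default_rng(seed)
    samples = np.empty(n_samples)
    x = x0
    for i in range(n_samples):
        proposal = x + step * rng.normal()          # symmetric Gaussian proposal
        log_alpha = log_target(proposal) - log_target(x)
        if np.log(rng.uniform()) < log_alpha:       # accept with probability min(1, alpha)
            x = proposal
        samples[i] = x
    return samples

# Example target: standard normal, specified up to a constant.
samples = metropolis_hastings(lambda x: -0.5 * x**2)
print(samples[2000:].mean(), samples[2000:].std())  # roughly 0 and 1 after burn-in
```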

 

 Remaining items, not yet organized 

 

 

 

non-Bayesian Machine Learning


Algorithm Outline

[ML] Roadmap: a long way to go【navigation for the learning path】

 

 

Basic Concepts

[UFLDL] Basic Concept【basic ML concepts】

[UFLDL] *Train and Optimize 

 

 

Basic Algorithms

[Scikit-learn] 1.5 Generalized Linear Models - SGD for Regression

[Scikit-learn] 1.5 Generalized Linear Models - SGD for Classification
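
A minimal sketch of both SGD estimators (synthetic data; the losses and hyperparameters are illustrative assumptions):

```python
import numpy as np
from sklearn.linear_model import SGDRegressor, SGDClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))

# Regression: squared loss fitted by stochastic gradient descent.
y_reg = X @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.normal(size=200)
reg = SGDRegressor(max_iter=1000, tol=1e-3).fit(X, y_reg)

# Classification: hinge loss, i.e. a linear SVM trained by SGD.
y_clf = (X[:, 0] + X[:, 1] > 0).astype(int)
clf = SGDClassifier(loss="hinge", max_iter=1000, tol=1e-3).fit(X, y_clf)

print(reg.coef_, clf.score(X, y_clf))
```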

 

Online Learning

[Scikit-learn] 1.1 Generalized Linear Models - Comparing various online solvers

[Scikit-learn] Yield miniBatch for online learning.
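
A minimal sketch of the mini-batch idea with partial_fit (the generator and batch size are assumptions for illustration, not the code from the linked post):

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

def yield_minibatches(X, y, batch_size=32):
    """Yield successive (X, y) mini-batches, as a stand-in for a data stream."""
    for start in range(0, len(X), batch_size):
        yield X[start:start + batch_size], y[start:start + batch_size]

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
y = (X[:, 0] - X[:, 3] > 0).astype(int)

clf = SGDClassifier()
for X_batch, y_batch in yield_minibatches(X, y):
    # partial_fit updates the model incrementally, one mini-batch at a time.
    clf.partial_fit(X_batch, y_batch, classes=np.array([0, 1]))

print(clf.score(X, y))
```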

 

 

Linear Problems

[UFLDL] Linear Regression & Classification

 

Linear Fitting

[Scikit-learn] 1.1 Generalized Linear Models - from Linear Regression to L1&L2【least squares --> regularization】

[Scikit-learn] 1.1 Generalized Linear Models - Lasso Regression【L1-related material; regularization can of course also be used for classification】

[ML] Bayesian Linear Regression【an example of incremental online learning】

[Scikit-learn] 1.4 Support Vector Regression【based on the outermost margin】

[Scikit-learn] Theil-Sen Regression【fairly robust to noise】
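
A minimal sketch putting these fitters side by side (synthetic data; the regularization strengths are illustrative assumptions):

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, Lasso, BayesianRidge, TheilSenRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))
y = X @ np.array([2.0, 0.0, -1.0, 0.0]) + 0.1 * rng.normal(size=100)

models = {
    "ols": LinearRegression(),          # plain least squares
    "ridge": Ridge(alpha=1.0),          # L2 penalty shrinks coefficients
    "lasso": Lasso(alpha=0.1),          # L1 penalty drives some coefficients to zero
    "bayes_ridge": BayesianRidge(),     # priors on weights, regularization learned from data
    "theil_sen": TheilSenRegressor(),   # median-based, robust to outliers
}
for name, model in models.items():
    model.fit(X, y)
    print(name, np.round(model.coef_, 2))
```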

 

Linear Classification

# Discriminative Models

[Scikit-learn] 1.1 Generalized Linear Models - Logistic regression & Softmax【cast as maximum likelihood; the parameters can also be regularized】

[Scikit-learn] 1.1 Generalized Linear Models - Neural network models【MLP, multi-layer perceptron】

[ML] Bayesian Logistic Regression【how statistical classification methods differ】

[Scikit-learn] 1.4 Support Vector Regression【linearly separable case】
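
A minimal sketch of the discriminative route (logistic regression plus an MLP; the data and hyperparameters are assumptions for illustration):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 2))
y = (X[:, 0] + 2 * X[:, 1] > 0).astype(int)

# Logistic regression: maximum likelihood with an L2 penalty on the weights.
logreg = LogisticRegression(C=1.0).fit(X, y)

# MLP: the same discriminative idea with a nonlinear hidden layer.
mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000).fit(X, y)

print(logreg.score(X, y), mlp.score(X, y))
```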

 

# Generative Models

Naive Bayes【see "Bayesian Machine Learning" above】

[ML] Linear Discriminant Analysis【in progress】

 

 

Decision Trees

[ML] Decision Tree & Ensembling Methods【Bagging vs. Boosting vs. SVM】

 

 

Dimensionality Reduction

[UFLDL] Dimensionality Reduction【an overview of dimensionality-reduction methods in the broad sense】

 

 

Clustering

[Scikit-learn] 2.3 Clustering - kmeans

[Scikit-learn] 2.3 Clustering - Spectral clustering

[Scikit-learn] *2.3 Clustering - DBSCAN: Density-Based Spatial Clustering of Applications with Noise

[Scikit-learn] *2.3 Clustering - MeanShift
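
A minimal sketch running the four clusterers above on the same toy data (the dataset and parameters are illustrative assumptions):

```python
import numpy as np
from sklearn.cluster import KMeans, SpectralClustering, DBSCAN, MeanShift
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=300, centers=3, random_state=0)

labels = {
    "kmeans": KMeans(n_clusters=3, n_init=10).fit_predict(X),
    "spectral": SpectralClustering(n_clusters=3).fit_predict(X),
    "dbscan": DBSCAN(eps=1.0, min_samples=5).fit_predict(X),  # density-based; -1 marks noise points
    "meanshift": MeanShift().fit_predict(X),                  # mode-seeking; bandwidth estimated from data
}
for name, lab in labels.items():
    print(name, np.unique(lab))
```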

 

End.
