Py之imblearn: A Detailed Guide to the imblearn/imbalanced-learn Library — Introduction, Installation, and Usage

Table of Contents

Introduction to the imblearn/imbalanced-learn library

Installing the imblearn/imbalanced-learn library

Using the imblearn/imbalanced-learn library

Introduction to the imblearn/imbalanced-learn library

imblearn/imbalanced-learn is a Python package that provides a number of re-sampling techniques commonly used on datasets exhibiting strong between-class imbalance. It is compatible with scikit-learn and is part of the scikit-learn-contrib projects.

imbalanced-learn is tested on Python 3.6+. The dependency requirements are based on the latest scikit-learn release:

  • scipy(>=0.19.1)
  • numpy(>=1.13.3)
  • scikit-learn(>=0.22)
  • joblib(>=0.11)
  • keras 2 (optional)
  • tensorflow (optional)


Installing the imblearn/imbalanced-learn library

pip install imblearn
pip install imbalanced-learn
pip install -U imbalanced-learn
conda install -c conda-forge imbalanced-learn



Using the imblearn/imbalanced-learn library

Most classification algorithms only perform optimally when the number of samples in each class is roughly equal. Highly skewed datasets, in which a minority class is heavily outnumbered by one or more other classes, have proven to be a challenge while at the same time becoming more and more common.
One way to address this problem is to re-sample the dataset so as to offset the imbalance, in the hope of arriving at a more robust and fair decision boundary than would otherwise be obtained.

Re-sampling techniques fall into four categories:

  1. Under-sampling the majority class(es).
  2. Over-sampling the minority class.
  3. Combining over- and under-sampling.
  4. Creating ensemble balanced sets.

Below is a list of the methods currently implemented in this module.

  • Under-sampling

    1. Random majority under-sampling with replacement
    2. Extraction of majority-minority Tomek links [1]
    3. Under-sampling with Cluster Centroids
    4. NearMiss-(1 & 2 & 3) [2]
    5. Condensed Nearest Neighbour [3]
    6. One-Sided Selection [4]
    7. Neighbourhood Cleaning Rule [5]
    8. Edited Nearest Neighbours [6]
    9. Instance Hardness Threshold [7]
    10. Repeated Edited Nearest Neighbours [14]
    11. AllKNN [14]
  • Over-sampling

    1. Random minority over-sampling with replacement
    2. SMOTE - Synthetic Minority Over-sampling Technique [8]
    3. SMOTENC - SMOTE for Nominal and Continuous features [8]
    4. bSMOTE(1 & 2) - Borderline SMOTE of types 1 and 2 [9]
    5. SVM SMOTE - Support Vectors SMOTE [10]
    6. ADASYN - Adaptive synthetic sampling approach for imbalanced learning [15]
    7. KMeans-SMOTE [17]
  • Over-sampling followed by under-sampling

    1. SMOTE + Tomek links [12]
    2. SMOTE + ENN [11]
  • Ensemble classifier using samplers internally

    1. Easy Ensemble classifier [13]
    2. Balanced Random Forest [16]
    3. Balanced Bagging
    4. RUSBoost [18]
  • Mini-batch resampling for Keras and Tensorflow