[Paper Roundup] A Comprehensive List of Knowledge Distillation Papers: Grasp the New Research Directions in One Read!

Knowledge Distillation Papers

Early Papers

Model Compression, Rich Caruana, 2006
Distilling the Knowledge in a Neural Network, Hinton, J. Dean, 2015
Knowledge Acquisition from Examples Via Multiple Models, P. Domingos, 1997
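Since the list opens with Hinton et al.'s Distilling the Knowledge in a Neural Network, a minimal PyTorch sketch of that paper's soft-target loss may help orient readers. The temperature T=4.0 and mixing weight alpha=0.5 below are illustrative assumptions, not values taken from any paper in this list.

```python
# A minimal sketch of the soft-target distillation loss from
# "Distilling the Knowledge in a Neural Network" (Hinton et al., 2015).
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Mix the teacher's softened-distribution KL loss with hard-label cross-entropy."""
    # Soften both distributions with temperature T; the T**2 factor keeps the
    # soft-target gradients on a scale comparable to the hard-label term.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T ** 2)
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

if __name__ == "__main__":
    student = torch.randn(8, 10)          # dummy student logits: batch of 8, 10 classes
    teacher = torch.randn(8, 10)          # dummy teacher logits
    labels = torch.randint(0, 10, (8,))   # dummy ground-truth labels
    print(distillation_loss(student, teacher, labels))
```

In practice the teacher logits come from a frozen, pretrained network evaluated on the same batch; the student is then trained end to end on this combined loss.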