JavaShuo › Tags › distillation
Deep Learning → NLP → Improving Multi-Task Deep Neural Networks via Knowledge Distillation for Natural
2019-12-04
Paper Reading: Improving Multi-Task Deep Neural Networks via Knowledge Distillation for Natural Language Under
2019-12-06
Understanding "Distillation as a Defense to Adversarial Perturbations against Deep Neural Networks"
2019-11-08
[DL] Model Distillation
2019-11-08
Awesome Knowledge-Distillation
2019-11-30
Knowledge Distillation
2019-12-01