Papers on distillation:
(2006) Model Compression
(2014) Do Deep Nets Really Need to be Deep? --- paper notes
(2015) Distilling the Knowledge in a Neural Network --- paper notes

Abstract: This paper proposes defensive distillation. The main idea is to use knowledge extracted from a DNN to reduce the effectiveness of adversarial samples crafted against that DNN.
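The mechanism shared by these papers is training on "soft" targets: class probabilities produced by a softmax whose logits are divided by a temperature T > 1, which smooths the distribution the student (or, in defensive distillation, the re-trained network) learns from. A minimal sketch of that softening step (the function name and example logits are illustrative, not from any of the cited papers):

```python
import math

def softmax_with_temperature(logits, T=1.0):
    """Softmax over logits scaled by temperature T.

    T=1 gives the ordinary softmax; larger T flattens the
    distribution, producing the soft targets used in distillation.
    """
    # Subtract the max for numerical stability before exponentiating.
    m = max(z / T for z in logits)
    exps = [math.exp(z / T - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Illustrative logits: one strongly dominant class.
logits = [8.0, 2.0, 0.0]
hard = softmax_with_temperature(logits, T=1.0)   # near one-hot
soft = softmax_with_temperature(logits, T=20.0)  # smoothed soft targets
```

At T=1 the dominant logit takes almost all the probability mass; at high T the relative ranking of classes is preserved but the gap shrinks, exposing the "dark knowledge" about similarity between classes that the student is trained to match.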
Copyright (C)ICode9.com, All Rights Reserved.