
Soft weight-sharing for DNN


Approach

The pre-trained network is re-trained under a compression-inducing prior by minimizing the negative log-posterior

L(w) = -log p(D|w) - τ log p(w),

where p(w) is the prior over the weights w, p(D|w) is the model likelihood, and τ trades off the data-fit term against the compression term. The prior is a mixture of Gaussians over the individual weights,

p(w) = ∏_i ∑_j π_j N(w_i | μ_j, σ_j²),

whose means, variances and mixing proportions are learned jointly with the weights; one component is pinned at zero so that many weights are pulled toward it and can later be pruned.
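
A minimal sketch of this objective, assuming PyTorch (the paper's reference code is in Keras, and names such as MixturePrior, num_components and tau are illustrative): a learnable Gaussian mixture prior is evaluated over all network weights, and its negative log-probability, scaled by τ, is added to the usual cross-entropy data term.

```python
# Sketch only: soft weight-sharing objective = data term + tau * mixture complexity term.
import math

import torch
import torch.nn as nn
import torch.nn.functional as F


class MixturePrior(nn.Module):
    """Gaussian mixture prior p(w) = prod_i sum_j pi_j N(w_i | mu_j, sigma_j^2)."""

    def __init__(self, num_components=16):
        super().__init__()
        # Learnable means, log-variances and mixing logits; component 0 is pinned
        # at zero in neg_log_prob() to encourage pruning.
        self.mu = nn.Parameter(torch.linspace(-0.6, 0.6, num_components))
        self.log_sigma2 = nn.Parameter(torch.full((num_components,), -5.0))
        self.pi_logits = nn.Parameter(torch.zeros(num_components))

    def neg_log_prob(self, w):
        # w: 1-D tensor holding all network weights, shape (N,)
        mu = torch.cat([torch.zeros(1, device=w.device), self.mu[1:]])  # zero-pinned component
        log_pi = F.log_softmax(self.pi_logits, dim=0)
        # log N(w_i | mu_j, sigma_j^2) for every weight/component pair, shape (N, J)
        log_norm = -0.5 * ((w[:, None] - mu[None, :]) ** 2 / self.log_sigma2.exp()[None, :]
                           + self.log_sigma2[None, :] + math.log(2 * math.pi))
        # -log p(w) = -sum_i log sum_j pi_j N(w_i | mu_j, sigma_j^2)
        return -torch.logsumexp(log_pi[None, :] + log_norm, dim=1).sum()


def soft_weight_sharing_loss(model, prior, x, y, tau=5e-6):
    """Data term -log p(D|w) (cross-entropy) plus tau times the complexity term -log p(w)."""
    data_term = F.cross_entropy(model(x), y)
    w = torch.cat([p.view(-1) for p in model.parameters()])
    return data_term + tau * prior.neg_log_prob(w)
```

Keeping τ small lets the data term preserve accuracy while the mixture slowly pulls the weights toward a few shared cluster centers.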


After re-training, each weight is set to the mean of the mixture component that takes the most responsibility for it. This leaves the network with only a handful of distinct weight values, and the weights claimed by the zero-pinned component become exactly zero and can be pruned.
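
A sketch of this assignment step under the same assumptions (PyTorch; pi, mu and sigma2 stand for the fitted mixture parameters from the hypothetical MixturePrior above): responsibilities are computed for every weight/component pair and each weight is snapped to the mean of its arg-max component.

```python
# Sketch only: post-training quantization by hard responsibility assignment.
import math

import torch


@torch.no_grad()
def quantize_weights(w, pi, mu, sigma2):
    """Snap each entry of the 1-D weight tensor w to the mean of its most responsible component."""
    log_norm = -0.5 * ((w[:, None] - mu[None, :]) ** 2 / sigma2[None, :]
                       + sigma2.log()[None, :] + math.log(2 * math.pi))
    log_resp = pi.log()[None, :] + log_norm   # unnormalized log-responsibilities, shape (N, J)
    assignment = log_resp.argmax(dim=1)       # most responsible component per weight
    return mu[assignment]                     # quantized (and partly pruned) weights


# Usage (illustrative): overwrite each parameter tensor of a trained model in place.
# with torch.no_grad():
#     for p in model.parameters():
#         p.copy_(quantize_weights(p.view(-1), pi, mu, sigma2).view_as(p))
```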

Experiment

References:
Karen Ullrich, Edward Meeds, Max Welling. "Soft Weight-Sharing for Neural Network Compression." ICLR 2017.
