Three Weight Initialization Methods
This post summarizes three weight initialization methods. The first two are fairly common; the last one is the most recent.
1. Gaussian
Weights are randomly drawn from Gaussian distributions with fixed mean (e.g., 0) and fixed standard deviation (e.g., 0.01).
This is the most common initialization method in deep learning.
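As a minimal sketch (the function name and the 256 -> 128 layer size below are illustrative assumptions, not taken from any particular framework), fixed-Gaussian initialization in NumPy could look like this:

```python
import numpy as np

def gaussian_init(shape, mean=0.0, std=0.01):
    # Draw every weight independently from N(mean, std^2),
    # e.g. mean = 0 and std = 0.01 as mentioned above.
    return np.random.normal(loc=mean, scale=std, size=shape)

# Example: weights for a hypothetical 256 -> 128 fully connected layer.
W = gaussian_init((256, 128))
```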
2. Xavier
This method proposes to adopt a properly scaled uniform or Gaussian distribution for initialization.
In Caffe (an open framework for deep learning) [2], the weights in the network are initialized by drawing them from a distribution with zero mean and a specific variance:

$$\mathrm{Var}(W) = \frac{1}{n_{\mathrm{in}}}$$

Where W is the initialization distribution for the neuron in question, and n_in is the number of neurons feeding into it. The distribution used is typically Gaussian or uniform.
In Glorot & Bengio’s paper [1], the original recommendation is

$$\mathrm{Var}(W) = \frac{2}{n_{\mathrm{in}} + n_{\mathrm{out}}}$$

Where n_out is the number of neurons the result is fed to.
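A minimal NumPy sketch covering both variants (the function name, keyword arguments, and the 256 -> 128 layer size are illustrative assumptions, not Caffe's actual code):

```python
import numpy as np

def xavier_init(n_in, n_out, distribution="uniform", caffe_style=False):
    # caffe_style=True uses Var(W) = 1 / n_in (the Caffe-style variance);
    # otherwise the original Glorot & Bengio Var(W) = 2 / (n_in + n_out).
    var = 1.0 / n_in if caffe_style else 2.0 / (n_in + n_out)
    if distribution == "uniform":
        # A uniform distribution on [-a, a] has variance a^2 / 3.
        a = np.sqrt(3.0 * var)
        return np.random.uniform(-a, a, size=(n_in, n_out))
    return np.random.normal(0.0, np.sqrt(var), size=(n_in, n_out))

W_glorot = xavier_init(256, 128)                   # Var(W) = 2 / (n_in + n_out)
W_caffe = xavier_init(256, 128, caffe_style=True)  # Var(W) = 1 / n_in
```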
References:
[1] X. Glorot and Y. Bengio. Understanding the difficulty of training deep feedforward neural networks. In International Conference on Artificial Intelligence and Statistics, pages 249–256, 2010.
[2] Y. Jia, E. Shelhamer, J. Donahue, S. Karayev, J. Long, R. Girshick, S. Guadarrama, and T. Darrell. Caffe: Convolutional architecture for fast feature embedding. arXiv:1408.5093, 2014.
3. MSRA
This method was proposed to enable training extremely deep rectified (ReLU) models directly from scratch [1].
In this method, weights are initialized with a zero-mean Gaussian distribution whose standard deviation is

$$\mathrm{std} = \sqrt{\frac{2}{k_l^2\, d_{l-1}}}$$

Where k_l is the spatial filter size in layer l and d_{l-1} is the number of filters in layer l−1.
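A minimal NumPy sketch of this rule (the function name, the (d_out, d_prev, k, k) weight layout, and the 3×3 / 64 -> 128 example sizes are illustrative assumptions):

```python
import numpy as np

def msra_init(k, d_prev, d_out):
    # std = sqrt(2 / (k^2 * d_prev)), where k is the spatial filter size
    # and d_prev is the number of filters in the previous layer.
    std = np.sqrt(2.0 / (k * k * d_prev))
    # Weight tensor laid out as (output filters, input filters, k, k).
    return np.random.normal(0.0, std, size=(d_out, d_prev, k, k))

# Example: 3x3 convolution mapping 64 channels to 128 channels.
W = msra_init(k=3, d_prev=64, d_out=128)
```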
Reference:
[1] K. He, X. Zhang, S. Ren, and J. Sun. Delving deep into rectifiers: Surpassing human-level performance on ImageNet classification. arXiv:1502.01852, 2015.