Deep Learning Notes: weight_initialization
weight_initialization
Occam's razor: the simplest approach is to set every weight to a constant, 0 or 1. This fails in practice: with identical weights, every neuron in a layer computes the same output and receives the same gradient, so the units never differentiate from one another.
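A minimal sketch of the symmetry problem (the layer sizes and the constant 0.5 are arbitrary choices for illustration): when all weights in a layer are equal, every hidden unit computes exactly the same function of the input.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))        # batch of 4 inputs, 3 features

# constant init: every hidden unit gets an identical weight column
W = np.full((3, 5), 0.5)           # 5 hidden units, all the same
h = np.tanh(x @ W)                 # every column of h is identical

print(np.allclose(h, h[:, :1]))    # True: the units are indistinguishable
```

Since the forward activations are identical, the backward gradients are identical too, and gradient descent can never break the tie.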
General rule for setting weights
The general rule for setting the weights in a neural network is to set them to be close to zero without being too small.
Good practice is to start your weights in the range $[-y, y]$ where $y = 1/\sqrt{n}$
($n$ is the number of inputs to a given neuron).
Uniform Distribution:
Sample weights uniformly from the interval $[-y, y]$, where $y = 1/\sqrt{n}$.
Normal Distribution:
Sample weights from a normal distribution with a mean of 0 and a standard deviation of $y = 1/\sqrt{n}$.
No explicit initialization
Exploit special network structure to dilute the influence of initialization:
For example, Batch Normalization (BN) standardizes each layer's activations to a mean of 0 and a standard deviation of 1, handling this automatically and largely removing the influence of the initial weights.
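A minimal sketch of the BN normalization step (training-mode statistics only, omitting the learnable scale and shift; the batch shape and the badly scaled inputs are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
# activations from a poorly initialized layer: wrong mean and scale
h = rng.normal(loc=3.0, scale=10.0, size=(512, 64))

# batch normalization core step: standardize each feature over the batch
eps = 1e-5
h_bn = (h - h.mean(axis=0)) / np.sqrt(h.var(axis=0) + eps)

print(np.allclose(h_bn.mean(axis=0), 0.0, atol=1e-7))  # True: mean ~ 0
print(np.allclose(h_bn.std(axis=0), 1.0, atol=1e-3))   # True: std ~ 1
```

Because the output statistics are fixed regardless of the input scale, the layers after BN see well-behaved activations even when the initial weights are poorly scaled.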
Summary
Set weights close to zero but not too small: draw them from a uniform distribution over $[-y, y]$ or a normal distribution with mean 0 and standard deviation $y = 1/\sqrt{n}$ (where $n$ is the fan-in), or use a structure such as Batch Normalization to make training insensitive to the initialization.