Neural Networks Week 5 Tutorial Notes
Why is the error so large at the start of training? Because the weights are randomly initialized, so the model's first predictions are essentially random guesses.
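A minimal sketch of this point, on hypothetical data (the line y = 3x, the learning rate, and the seed are all assumed for illustration): with a randomly initialized weight the initial error is large, and LMS-style updates drive it down.

```python
import numpy as np

# Hypothetical data: the true underlying line is y = 3x.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100)
y = 3.0 * x

w = rng.normal()        # randomly initialized weight -> large initial error
lr = 0.1                # assumed learning rate

def mse(w):
    return np.mean((y - w * x) ** 2)

initial_error = mse(w)
for _ in range(3):                    # a few passes over the data
    for xi, yi in zip(x, y):
        w += lr * (yi - w * xi) * xi  # LMS update after each sample
final_error = mse(w)

print(initial_error, final_error)
```

The initial error depends entirely on where the random weight happened to land; training pulls `w` toward the true slope of 3.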
Even though the error does not go all the way to zero, the fitted line is good enough: the correlation between the model's outputs and the targets is high, which means the model has captured the underlying structure of the process that generated the data. That is what it means to have a reasonable model.
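A small sketch of that idea, again with assumed numbers: on noisy data the error can never reach zero (the noise remains), yet the correlation between predictions and targets stays high because the line matches the underlying structure.

```python
import numpy as np

# Hypothetical noisy linear data: y = 3x plus Gaussian noise.
rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, size=200)
y = 3.0 * x + rng.normal(scale=0.3, size=200)

# Closed-form least-squares slope for a line through the origin
w = (x @ y) / (x @ x)
pred = w * x

error = np.mean((y - pred) ** 2)   # nonzero: the noise cannot be fit away
corr = np.corrcoef(pred, y)[0, 1]  # but the correlation is high

print(error, corr)
```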
The LMS algorithm learns the line very well: the trained line (drawn in black) sits right on top of the data. The different approaches produce similar results.
Least squares does all of its "thinking" in one go: it solves for the weights in closed form. LMS, by contrast, iterates, so we have to worry about things like the learning rate. What happens if we increase the learning rate?
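A sketch comparing the two approaches on the same hypothetical data (the weights, learning rate, and epoch count are illustrative assumptions): ordinary least squares computes the answer in one shot, while LMS reaches a similar answer by iterating with a learning rate.

```python
import numpy as np

# Hypothetical data: y = 1 + 2x plus a little noise.
rng = np.random.default_rng(2)
X = np.column_stack([np.ones(200), rng.uniform(-1, 1, 200)])  # bias + input
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.1, size=200)

# Least squares: one-shot closed-form solution
w_ls, *_ = np.linalg.lstsq(X, y, rcond=None)

# LMS: sample-by-sample updates, which need a learning rate
w_lms = np.zeros(2)
lr = 0.1
for _ in range(50):                       # several passes over the data
    for xi, yi in zip(X, y):
        w_lms += lr * (yi - xi @ w_lms) * xi

print(w_ls, w_lms)   # the two answers end up close to each other
```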
For example, if we put the learning rate far too high, say 100, the updates overshoot the minimum and the algorithm may not converge.
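A sketch of that failure mode, with assumed data and rates: the same LMS loop converges with a small learning rate but blows up when the rate is set to 100, because each update overshoots and the error grows instead of shrinking.

```python
import numpy as np

# Hypothetical data: y = 3x, with |x| bounded away from zero.
rng = np.random.default_rng(3)
x = rng.uniform(0.5, 1.0, size=50)
y = 3.0 * x

def final_error(lr):
    w = 0.0
    for xi, yi in zip(x, y):
        w += lr * (yi - w * xi) * xi   # LMS update
    return np.mean((y - w * x) ** 2)

print(final_error(0.1))   # small: training converges
print(final_error(100))   # enormous: the updates overshoot and diverge
```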
When we trained the model for 5000 iterations, we got a nice fit.
The batch algorithm updates the weights once per pass, using the whole dataset at a time, whereas LMS updates after every single sample.
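A sketch of the two update styles side by side on hypothetical data (learning rate and pass counts assumed): the batch version averages the gradient over all samples before each update, while the LMS version updates after every example; both approach the true slope.

```python
import numpy as np

# Hypothetical data: y = 3x.
rng = np.random.default_rng(4)
x = rng.uniform(-1, 1, size=100)
y = 3.0 * x

lr = 0.1

# Batch: one update per pass, using the mean gradient over all samples
w_batch = 0.0
for _ in range(200):
    w_batch += lr * np.mean((y - w_batch * x) * x)

# LMS (per-sample): one update per example
w_lms = 0.0
for _ in range(5):
    for xi, yi in zip(x, y):
        w_lms += lr * (yi - w_lms * xi) * xi

print(w_batch, w_lms)   # both approach the true slope of 3
```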