tf.variable_scope and tf.get_variable
Experiment 1: no random seed, a different initializer for each variable
import tensorflow as tf

# Experiment 1: each variable gets its own initializer; no seed is set anywhere.
with tf.variable_scope("test"):
    a1 = tf.get_variable(name='a1', shape=[2, 3],
                         initializer=tf.random_normal_initializer(mean=0, stddev=1))
    a2 = tf.get_variable(name='a2', shape=[1],
                         initializer=tf.constant_initializer(1))
    a3 = tf.get_variable(name='a3', shape=[2, 3],
                         initializer=tf.ones_initializer())

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(a1))
    print(sess.run(a2))
    print(sess.run(a3))

Output:
[[ 0.53831905 -0.48800603  0.80798125]
 [-1.9583933  -0.01016556 -0.9655879 ]]
[1.]
[[1. 1. 1.]
 [1. 1. 1.]]
Experiment 2: no random seed, one initializer shared by the whole scope
import tensorflow as tf

# Experiment 2: the initializer is set once on tf.variable_scope and is
# inherited by every tf.get_variable call inside it; still no seed.
with tf.variable_scope("tes1t",
                       initializer=tf.random_normal_initializer(mean=0, stddev=1)):
    a1 = tf.get_variable(name='a1', shape=[2, 3])
    a2 = tf.get_variable(name='a2', shape=[1])
    a3 = tf.get_variable(name='a3', shape=[2, 3])

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(a1))
    print(sess.run(a2))
    print(sess.run(a3))

Output:
[[-0.4393101  -0.3091908   0.09686434]
 [-0.06059294 -0.7490989  -0.49343875]]
[-0.21072532]
[[ 0.03515918 -1.1747551   1.6267052 ]
 [ 0.5114391  -0.2678874   1.7599828 ]]
The values produced in Experiments 1 and 2 are completely different from run to run, because no seed was set and the initialization is random.
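If run-to-run randomness is unwanted but you do not want to pass a seed to every initializer, TF 1.x also has a graph-level seed. The following is a minimal sketch, not part of the original experiments (the scope name "seed_demo" and variable names are made up here): with tf.set_random_seed set on the graph, re-running the same script should give the same initial values even though the initializers carry no seed of their own.

import tensorflow as tf

# Illustration (assumed setup): a graph-level seed instead of per-initializer seeds.
# Ops without an explicit op-level seed derive theirs from the graph seed, so
# re-running this script reproduces the same initial values.
tf.set_random_seed(1234)

with tf.variable_scope("seed_demo",
                       initializer=tf.random_normal_initializer(mean=0, stddev=1)):
    b1 = tf.get_variable(name='b1', shape=[2, 3])
    b2 = tf.get_variable(name='b2', shape=[1])

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(b1))  # identical across runs of this script
    print(sess.run(b2))  # b1 and b2 still differ from each other within one run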
Experiment 3: same initializer with a fixed random seed
import tensorflow as tf

# Experiment 3: same scope-level initializer as Experiment 2, but with a fixed seed.
with tf.variable_scope("tes1t1",
                       initializer=tf.random_normal_initializer(mean=0, stddev=1, seed=1234)):
    a1 = tf.get_variable(name='a1', shape=[2, 3])
    a2 = tf.get_variable(name='a2', shape=[1])
    a3 = tf.get_variable(name='a3', shape=[2, 3])

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(a1))
    print(sess.run(a2))
    print(sess.run(a3))

Output:
[[ 0.51340485 -0.255814    0.6519913 ]
 [ 1.3923638   0.37256798  0.20336303]]
[0.51340485]
[[ 0.51340485 -0.255814    0.6519913 ]
 [ 1.3923638   0.37256798  0.20336303]]
Experiment 3 shows that with the same initializer and the same seed, the first element of a1, a2, and a3 is identical, and a1 and a3 match element for element. In other words, the seed fixes the entire random sequence: the first value, the second value, and so on up to the N-th are all determined, so every variable drawing from that seeded initializer gets the same sequence.
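As a quick numeric check of this claim, a minimal sketch, assuming these lines are added inside the with tf.Session() block of Experiment 3 so that sess, a1, a2 and a3 are still available:

import numpy as np

# Verify the Experiment 3 observations numerically.
v1, v2, v3 = sess.run([a1, a2, a3])
print(np.array_equal(v1, v3))       # True: same initializer + same seed -> identical tensors
print(np.isclose(v2[0], v1[0, 0]))  # True: a2's single value equals a1's first value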
Experiment 4: running the same code again under a new scope name ("tes1t12"), with the same initializer and seed, reproduces exactly the same values.

import tensorflow as tf

# Experiment 4: same initializer and seed as Experiment 3, new scope name.
with tf.variable_scope("tes1t12",
                       initializer=tf.random_normal_initializer(mean=0, stddev=1, seed=1234)):
    a1 = tf.get_variable(name='a1', shape=[2, 3])
    a2 = tf.get_variable(name='a2', shape=[1])
    a3 = tf.get_variable(name='a3', shape=[2, 3])

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(a1))
    print(sess.run(a2))
    print(sess.run(a3))

Output:
[[ 0.51340485 -0.255814    0.6519913 ]
 [ 1.3923638   0.37256798  0.20336303]]
[0.51340485]
[[ 0.51340485 -0.255814    0.6519913 ]
 [ 1.3923638   0.37256798  0.20336303]]
With tf.variable_scope and a fixed seed, the same random initialization can be reproduced exactly: the variables start from identical initial values every time.
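To make the reproducibility claim concrete, here is a minimal sketch (the helper name build_and_init is made up for illustration, not from the original post) that builds the same variable twice in two fresh graphs with the same seed and checks that the initial values match:

import numpy as np
import tensorflow as tf

def build_and_init(seed):
    # Build the scope in a brand-new graph and return the variable's initial value.
    with tf.Graph().as_default():
        with tf.variable_scope("repro",
                               initializer=tf.random_normal_initializer(mean=0, stddev=1,
                                                                        seed=seed)):
            a1 = tf.get_variable(name='a1', shape=[2, 3])
        with tf.Session() as sess:
            sess.run(tf.global_variables_initializer())
            return sess.run(a1)

run1 = build_and_init(seed=1234)
run2 = build_and_init(seed=1234)
print(np.allclose(run1, run2))  # True: the same seed reproduces the same initialization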
Summary
Without a random seed, tf.get_variable produces different values on every run, whether the initializer is passed to each variable individually (Experiment 1) or set once on tf.variable_scope (Experiment 2). Fixing the seed in the initializer (Experiments 3 and 4) makes the random sequence fully deterministic, so variables sharing that initializer get identical values and the initialization can be reproduced across runs.