Sharing a Layer in Keras
Calling the same layer object multiple times shares that layer, including its weights.
```python
from keras.layers import Input, Flatten, Dense
from keras.models import Model

input1 = Input(shape=(28, 28))
input2 = Input(shape=(28, 28))
x1 = Flatten()(input1)
x1 = Dense(60, activation="relu")(x1)
x2 = Flatten()(input2)
x2 = Dense(60, activation="relu")(x2)
d = Dense(10, activation='softmax')  # one layer object, shared by both branches
output1 = d(x1)
output2 = d(x2)
model1 = Model(input=[input1], output=[output1])
model2 = Model(input=[input2], output=[output2])
print(model1.summary())
print(model2.summary())
```

Result:

```
____________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to
====================================================================================================
input_1 (InputLayer)             (None, 28, 28)        0
____________________________________________________________________________________________________
flatten_1 (Flatten)              (None, 784)           0           input_1[0][0]
____________________________________________________________________________________________________
dense_1 (Dense)                  (None, 60)            47100       flatten_1[0][0]
____________________________________________________________________________________________________
dense_3 (Dense)                  (None, 10)            610         dense_1[0][0]
====================================================================================================
Total params: 47710
____________________________________________________________________________________________________
None
____________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to
====================================================================================================
input_2 (InputLayer)             (None, 28, 28)        0
____________________________________________________________________________________________________
flatten_2 (Flatten)              (None, 784)           0           input_2[0][0]
____________________________________________________________________________________________________
dense_2 (Dense)                  (None, 60)            47100       flatten_2[0][0]
____________________________________________________________________________________________________
dense_3 (Dense)                  (None, 10)            610         dense_2[0][0]
====================================================================================================
Total params: 47710
____________________________________________________________________________________________________
None
```

Note that `dense_3` appears in both summaries: it is the same layer object `d`, so its 610 parameters are shared between the two models. (The original post defined `model2` from `input1`/`output1`, which made it a duplicate of `model1`; the intended definition uses `input2`/`output2`.)
If you change `x2` to:

```python
x2 = Dense(70, activation="relu")(x2)
```

then `output2 = d(x2)` raises an error:

```
Exception: Input 0 is incompatible with layer dense_3: expected shape=(None, 60), found shape=(None, 70)
```

The reason is that the first call, `d(x1)`, fixed the shared layer's weight matrix to shape (60, 10). Sharing a layer means sharing its parameters, so every subsequent call must receive input of dimension 60.
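To see why the error occurs without running Keras itself, here is a minimal sketch of the shared-layer mechanism in plain NumPy. The `SharedDense` class is hypothetical (an illustration, not Keras's implementation): it builds its kernel lazily on the first call, like Keras's `build()`, and every later call must match that fixed input dimension.

```python
import numpy as np

class SharedDense:
    """Hypothetical mimic of a shared Keras Dense layer:
    the kernel is created on the first call and reused afterwards."""
    def __init__(self, units):
        self.units = units
        self.kernel = None  # created lazily, like Keras build()

    def __call__(self, x):
        in_dim = x.shape[-1]
        if self.kernel is None:
            # First call fixes the kernel shape, e.g. (60, 10)
            rng = np.random.default_rng(0)
            self.kernel = rng.standard_normal((in_dim, self.units))
        elif self.kernel.shape[0] != in_dim:
            # Same failure mode as Keras's "Input 0 is incompatible" error
            raise ValueError(
                f"expected shape=(None, {self.kernel.shape[0]}), "
                f"found shape=(None, {in_dim})")
        return x @ self.kernel

d = SharedDense(10)
out1 = d(np.zeros((1, 60)))   # first call: kernel becomes (60, 10)
out2 = d(np.ones((1, 60)))    # second call reuses the same kernel
# d(np.ones((1, 70)))         # would raise ValueError, like the Keras error above
```

Both calls go through the one kernel stored on `d`, which is exactly why a 70-dimensional input cannot be fed to a layer whose weights were already built for dimension 60.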
Summary

Calling a layer object more than once shares it: the weights created on the first call are reused by, and constrain, every subsequent call.