Deep Learning: Implementing the Xception Network with TensorFlow 2.0
1. Overview of the Xception Network
Xception, proposed in 2017, is a lightweight network that balances accuracy against parameter count; the name is short for "Extreme Inception".
2. Key Innovation
It introduces an operation similar to the depthwise separable convolution (Depthwise Separable Convolution).
Why only "similar to" a depthwise separable convolution?
1) A standard depthwise separable convolution first applies a 3x3 depthwise convolution that filters each input channel spatially, then a 1x1 pointwise convolution that mixes the channels.
2) Xception's depthwise separable convolution performs the two steps in the opposite order: the 1x1 pointwise convolution comes first, followed by the channel-wise 3x3 spatial convolution.
So the steps are reversed relative to the traditional depthwise separable convolution, but the paper's author reports that the two orderings perform almost identically, so the implementation here simply uses the traditional form; a short sketch of both orderings follows.
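A minimal Keras sketch of the two orderings (the 64-input-channel / 128-output-channel example and the variable names are illustrative assumptions, not taken from the original post):

import tensorflow as tf
from tensorflow.keras import layers

inputs = tf.random.normal((1, 32, 32, 64))  # dummy feature map: batch=1, 32x32, 64 channels

# Traditional depthwise separable convolution:
# step 1 - a 3x3 depthwise conv filters each channel spatially,
# step 2 - a 1x1 pointwise conv mixes the channels.
x = layers.DepthwiseConv2D((3, 3), padding='same', use_bias=False)(inputs)
x = layers.Conv2D(128, (1, 1), use_bias=False)(x)
print(x.shape)  # (1, 32, 32, 128)

# Xception-style ("extreme Inception") ordering:
# step 1 - a 1x1 pointwise conv mixes the channels first,
# step 2 - a 3x3 depthwise conv then filters each resulting channel spatially.
y = layers.Conv2D(128, (1, 1), use_bias=False)(inputs)
y = layers.DepthwiseConv2D((3, 3), padding='same', use_bias=False)(y)
print(y.shape)  # (1, 32, 32, 128)

# Keras' SeparableConv2D fuses the traditional two-step form into a single layer.
z = layers.SeparableConv2D(128, (3, 3), padding='same', use_bias=False)(inputs)
print(z.shape)  # (1, 32, 32, 128)

Because Keras' SeparableConv2D already implements the traditional ordering in one layer, the implementation in section 4 uses it directly.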
3. Network Architecture
The architecture consists of an entry flow (blocks 1-4), a middle flow of eight identical residual blocks (blocks 5-12), and an exit flow (blocks 13-14); the implementation below follows it block by block.
4. Network Implementation
import tensorflow as tf
from tensorflow.keras import layers
from tensorflow.keras.layers import (Input, Conv2D, SeparableConv2D, BatchNormalization,
                                     Activation, MaxPooling2D, GlobalAveragePooling2D, Dense)
from tensorflow.keras.models import Model

def Xception(nb_class, input_shape):
    input_ten = Input(shape=input_shape)

    # Entry flow
    # block 1: 299,299,3 -> 147,147,64
    x = Conv2D(32, (3, 3), strides=(2, 2), use_bias=False)(input_ten)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)
    x = Conv2D(64, (3, 3), use_bias=False)(x)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)

    # block 2: 147,147,64 -> 74,74,128
    residual = Conv2D(128, (1, 1), strides=(2, 2), padding='same', use_bias=False)(x)
    residual = BatchNormalization()(residual)
    x = SeparableConv2D(128, (3, 3), padding='same', use_bias=False)(x)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)
    x = SeparableConv2D(128, (3, 3), padding='same', use_bias=False)(x)
    x = BatchNormalization()(x)
    x = MaxPooling2D((3, 3), strides=(2, 2), padding='same')(x)
    x = layers.add([x, residual])

    # block 3: 74,74,128 -> 37,37,256
    residual = Conv2D(256, (1, 1), strides=(2, 2), padding='same', use_bias=False)(x)
    residual = BatchNormalization()(residual)
    x = Activation('relu')(x)
    x = SeparableConv2D(256, (3, 3), padding='same', use_bias=False)(x)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)
    x = SeparableConv2D(256, (3, 3), padding='same', use_bias=False)(x)
    x = BatchNormalization()(x)
    x = MaxPooling2D((3, 3), strides=(2, 2), padding='same')(x)
    x = layers.add([x, residual])

    # block 4: 37,37,256 -> 19,19,728
    residual = Conv2D(728, (1, 1), strides=(2, 2), padding='same', use_bias=False)(x)
    residual = BatchNormalization()(residual)
    x = Activation('relu')(x)
    x = SeparableConv2D(728, (3, 3), padding='same', use_bias=False)(x)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)
    x = SeparableConv2D(728, (3, 3), padding='same', use_bias=False)(x)
    x = BatchNormalization()(x)
    x = MaxPooling2D((3, 3), strides=(2, 2), padding='same')(x)
    x = layers.add([x, residual])

    # Middle flow
    # blocks 5-12: 19,19,728 -> 19,19,728 (eight identical residual blocks)
    for i in range(8):
        residual = x
        x = Activation('relu')(x)
        x = SeparableConv2D(728, (3, 3), padding='same', use_bias=False)(x)
        x = BatchNormalization()(x)
        x = Activation('relu')(x)
        x = SeparableConv2D(728, (3, 3), padding='same', use_bias=False)(x)
        x = BatchNormalization()(x)
        x = Activation('relu')(x)
        x = SeparableConv2D(728, (3, 3), padding='same', use_bias=False)(x)
        x = BatchNormalization()(x)
        x = layers.add([x, residual])

    # Exit flow
    # block 13: 19,19,728 -> 10,10,1024
    residual = Conv2D(1024, (1, 1), strides=(2, 2), padding='same', use_bias=False)(x)
    residual = BatchNormalization()(residual)
    x = Activation('relu')(x)
    x = SeparableConv2D(728, (3, 3), padding='same', use_bias=False)(x)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)
    x = SeparableConv2D(1024, (3, 3), padding='same', use_bias=False)(x)
    x = BatchNormalization()(x)
    x = MaxPooling2D((3, 3), strides=(2, 2), padding='same')(x)
    x = layers.add([x, residual])

    # block 14: 10,10,1024 -> 10,10,2048, then global pooling and classifier
    x = SeparableConv2D(1536, (3, 3), padding='same', use_bias=False)(x)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)
    x = SeparableConv2D(2048, (3, 3), padding='same', use_bias=False)(x)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)
    x = GlobalAveragePooling2D()(x)
    output_ten = Dense(nb_class, activation='softmax')(x)

    model = Model(input_ten, output_ten)
    return model

img_height = img_width = 299   # input resolution assumed here; set to match your data
model_xception = Xception(24, (img_height, img_width, 3))
model_xception.summary()
The number of trainable parameters is fairly modest, which is why this counts as a classic lightweight network.
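A minimal sketch of compiling and training the model, assuming integer class labels and a prepared tf.data pipeline named train_ds (neither is defined in this post):

import tensorflow as tf

# Optimizer, loss, and metrics are illustrative choices, not from the original post.
model_xception.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
    loss='sparse_categorical_crossentropy',   # assumes integer class labels 0..23
    metrics=['accuracy'])

print(model_xception.count_params())  # roughly 21 million weights for 24 classes

# train_ds is a hypothetical tf.data.Dataset yielding (image, label) batches,
# with images already resized to (img_height, img_width, 3) and scaled to [0, 1].
# model_xception.fit(train_ds, epochs=10)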
Keep working hard!