PyTorch Implementation of Classic Models 2: AlexNet
AlexNet
- Network structure
- Paper summary
- Code implementation
1) Import the required packages

```python
# 1) Import the required packages
import torch
import torch.nn as nn
import torch.nn.functional as F  # not used below, but handy for functional ops
import torchvision
```

2) Build the network model
```python
# 2) Build the network model (expects 227x227 RGB input)
class AlexNet(nn.Module):
    def __init__(self):
        super(AlexNet, self).__init__()
        self.layer1 = nn.Sequential(
            nn.Conv2d(in_channels=3, out_channels=96, kernel_size=11, stride=4),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),
            # The paper applies LRN here; PyTorch provides
            # nn.LocalResponseNorm(size=5, alpha=1e-4, beta=0.75) for this.
        )
        self.layer2 = nn.Sequential(
            nn.Conv2d(in_channels=96, out_channels=256, kernel_size=5,
                      stride=1, padding=2, groups=2),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),
            # LRN would go here as well, as in the paper.
        )
        self.layer3 = nn.Sequential(
            nn.Conv2d(in_channels=256, out_channels=384, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )
        self.layer4 = nn.Sequential(
            nn.Conv2d(in_channels=384, out_channels=384, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )
        self.layer5 = nn.Sequential(
            nn.Conv2d(in_channels=384, out_channels=256, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),
        )
        # From here the network switches from convolutions to fully connected layers
        self.layer6 = nn.Sequential(
            nn.Linear(in_features=6 * 6 * 256, out_features=4096),
            nn.ReLU(inplace=True),
            nn.Dropout(),
        )
        self.layer7 = nn.Sequential(
            nn.Linear(in_features=4096, out_features=4096),
            nn.ReLU(inplace=True),
            nn.Dropout(),
        )
        self.layer8 = nn.Linear(in_features=4096, out_features=1000)

    def forward(self, x):
        x = self.layer5(self.layer4(self.layer3(self.layer2(self.layer1(x)))))
        x = x.view(-1, 6 * 6 * 256)  # flatten before the fully connected layers
        x = self.layer8(self.layer7(self.layer6(x)))
        return x
```

3) Set up the dataset, network, optimizer, loss function, etc.
4) Train the model
5) Save the model parameters
6) Load the model and test it
總結(jié)
以上是生活随笔為你收集整理的PyTorch 实现经典模型2:AlexNet的全部內(nèi)容,希望文章能夠幫你解決所遇到的問題。
- 上一篇: PyTorch 实现经典模型1:LeNe
- 下一篇: PyTorch 实现经典模型3:VGG