[PyTorch, Learning] - 3.7 Concise Implementation of Softmax Regression
Reference: 3.7. Concise Implementation of Softmax Regression
Implementing softmax regression with PyTorch.
import torch
from torch import nn
from torch.nn import init
import numpy as np
import sys
sys.path.append("..")
import d2lzh_pytorch as d2l  # helper utilities used throughout the book's examples
3.7.1. Obtaining and Reading the Data

batch_size = 256
train_iter, test_iter = d2l.load_data_fashion_mnist(batch_size)  # Fashion-MNIST data iterators
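As a quick sanity check (not part of the original post), one batch can be inspected to confirm the shapes the model below expects; the printed shapes assume the standard 1 x 28 x 28 Fashion-MNIST images.

# Sketch: peek at one training batch (assumes the d2l loaders yield
# (image, label) tensors like torchvision's FashionMNIST).
X, y = next(iter(train_iter))
print(X.shape)  # expected: torch.Size([256, 1, 28, 28])
print(y.shape)  # expected: torch.Size([256])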
3.7.2. Defining and Initializing the Model

num_inputs = 784   # 28 x 28 pixels per image
num_outputs = 10   # 10 clothing categories

class LinearNet(nn.Module):
    def __init__(self, num_inputs, num_outputs):
        super(LinearNet, self).__init__()
        self.linear = nn.Linear(num_inputs, num_outputs)

    def forward(self, x):
        # flatten each image from (batch, 1, 28, 28) to (batch, 784) before the linear layer
        y = self.linear(x.view(x.shape[0], -1))
        return y

net = LinearNet(num_inputs, num_outputs)

# initialize the weights with small Gaussian noise and the biases with zeros
init.normal_(net.linear.weight, mean=0, std=0.01)
init.constant_(net.linear.bias, val=0)
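For comparison, here is a minimal sketch of the same model written with nn.Sequential; it assumes a PyTorch version that provides nn.Flatten and is not the class used in the original post.

# Sketch: equivalent model built with nn.Sequential; nn.Flatten replaces the
# manual view() call in LinearNet.forward.
seq_net = nn.Sequential(
    nn.Flatten(),
    nn.Linear(num_inputs, num_outputs)
)
init.normal_(seq_net[1].weight, mean=0, std=0.01)
init.constant_(seq_net[1].bias, val=0)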
3.7.3. Softmax and Cross-Entropy Loss Function

# CrossEntropyLoss combines softmax and cross-entropy in one numerically stable operation
loss = nn.CrossEntropyLoss()
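This is why the network outputs raw scores rather than probabilities: nn.CrossEntropyLoss applies log-softmax internally. A small sketch with made-up logits illustrates the equivalence.

# Sketch (made-up tensors): CrossEntropyLoss on raw logits equals
# log_softmax followed by NLLLoss.
logits = torch.tensor([[0.1, 2.0, -1.0], [1.5, 0.3, 0.2]])
labels = torch.tensor([1, 0])
manual = nn.NLLLoss()(torch.log_softmax(logits, dim=1), labels)
combined = nn.CrossEntropyLoss()(logits, labels)
print(torch.allclose(manual, combined))  # True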
3.7.4. Defining the Optimization Algorithm

optimizer = torch.optim.SGD(net.parameters(), lr=0.1)  # mini-batch SGD with learning rate 0.1
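For intuition, optimizer.step() applies the plain SGD rule w <- w - lr * grad to every parameter; a tiny sketch with a hypothetical scalar parameter shows the same step done by hand.

# Sketch (hypothetical parameter): the update torch.optim.SGD(lr=0.1) performs.
w = torch.tensor([1.0], requires_grad=True)
(3 * w).sum().backward()      # gradient of 3*w with respect to w is 3
with torch.no_grad():
    w -= 0.1 * w.grad         # w becomes 1.0 - 0.1 * 3 = 0.7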
3.7.5. Training the Model

num_epochs = 5
d2l.train_ch3(net, train_iter, test_iter, loss, num_epochs, batch_size, None, None, optimizer)
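d2l.train_ch3 wraps the usual PyTorch training loop; the two None arguments appear to be the manual parameter list and learning rate slots, which are not needed when an optimizer is supplied. Roughly, each epoch runs the standard pattern below (a sketch, not the actual function internals).

# Sketch: the forward/backward/update cycle train_ch3 is assumed to run per mini-batch.
for X, y in train_iter:
    l = loss(net(X), y)      # forward pass and loss on one mini-batch
    optimizer.zero_grad()    # clear gradients from the previous step
    l.backward()             # back-propagate
    optimizer.step()         # update the parameters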
3.7.6. Testing

# Testing: compare true labels with the model's predictions on one test batch
X, y = next(iter(test_iter))
true_labels = d2l.get_fashion_mnist_labels(y.numpy())
pred_labels = d2l.get_fashion_mnist_labels(net(X).argmax(dim=1).numpy())
titles = [true + '\n' + pred for true, pred in zip(true_labels, pred_labels)]

d2l.show_fashion_mnist(X[0:9], titles[0:9])
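Beyond eyeballing nine images, the overall test accuracy can be computed with a short loop; this is a sketch of the standard pattern, while the original post relies on d2l's helpers instead.

# Sketch: accuracy of the trained net over the whole test set.
acc_sum, n = 0.0, 0
with torch.no_grad():
    for X, y in test_iter:
        acc_sum += (net(X).argmax(dim=1) == y).float().sum().item()
        n += y.shape[0]
print(acc_sum / n)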
Summary

With nn.Linear, nn.CrossEntropyLoss and torch.optim.SGD, softmax regression on Fashion-MNIST comes down to a few lines of model, loss and optimizer definitions, with d2l.train_ch3 handling the training loop.