Checking GPU memory usage
1. Using the pynvml package (actual GPU memory usage, including cache)
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # 0 is the GPU index
meminfo = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(meminfo.total / 1024**2)  # total GPU memory (MiB)
print(meminfo.used / 1024**2)   # used GPU memory (MiB)
print(meminfo.free / 1024**2)   # free GPU memory (MiB)
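If you want to check every visible GPU rather than a single index, the same calls can be wrapped in a small helper. This is only a sketch; the function name gpu_mem_info is mine, not from the original post.

import pynvml

def gpu_mem_info():
    """Return (total, used, free) in MiB for every visible GPU, via NVML."""
    pynvml.nvmlInit()
    try:
        stats = []
        for i in range(pynvml.nvmlDeviceGetCount()):
            handle = pynvml.nvmlDeviceGetHandleByIndex(i)
            mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
            stats.append((mem.total / 1024**2, mem.used / 1024**2, mem.free / 1024**2))
    finally:
        pynvml.nvmlShutdown()  # always release the NVML context
    return stats

for i, (total, used, free) in enumerate(gpu_mem_info()):
    print(f"GPU {i}: total={total:.0f} MiB, used={used:.0f} MiB, free={free:.0f} MiB")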
2. torch.cuda.memory_allocated (theoretical GPU memory usage, excluding cache)
import torch

print(torch.cuda.memory_allocated(device=0) / (1024 * 1024))  # 0 is the GPU index; value in MiB

Docs: https://pytorch.org/docs/master/generated/torch.cuda.memory_allocated.html?highlight=torch%20cuda%20memory_allocated#torch.cuda.memory_allocated
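To see the effect of the cache in the numbers, you can compare memory_allocated with torch.cuda.memory_reserved, which also counts the blocks held by PyTorch's caching allocator. A minimal sketch; the tensor size here is an arbitrary example of mine.

import torch

x = torch.zeros(1024, 1024, device="cuda:0")                   # ~4 MiB of live tensor data
allocated = torch.cuda.memory_allocated(device=0) / 1024**2    # live tensors only, no cache
reserved = torch.cuda.memory_reserved(device=0) / 1024**2      # live tensors + cached blocks
print(f"allocated: {allocated:.1f} MiB, reserved: {reserved:.1f} MiB")

del x
torch.cuda.empty_cache()  # hand cached blocks back to the driver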
Summary

pynvml reports the GPU's actual memory footprint, which includes PyTorch's allocator cache and the CUDA context, while torch.cuda.memory_allocated only counts the memory currently occupied by tensors; pick whichever matches what you want to measure.
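As a quick sanity check of that difference, the two methods can be run side by side. This sketch assumes a single GPU at index 0 and is my illustration, not part of the original post.

import pynvml
import torch

x = torch.zeros(1024, 1024, device="cuda:0")  # allocate something so both numbers are non-trivial

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
nvml_used = pynvml.nvmlDeviceGetMemoryInfo(handle).used / 1024**2
pynvml.nvmlShutdown()

torch_alloc = torch.cuda.memory_allocated(device=0) / 1024**2
print(f"NVML used: {nvml_used:.0f} MiB, torch allocated: {torch_alloc:.1f} MiB")
# The gap is the CUDA context plus cached allocator blocks (and any other processes on the GPU).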