How to check if CUDA is available: code examples
Example 1: CUDA version check
nvcc --version
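If nvcc is not on the PATH this command simply fails, so a script may want to call it defensively. A minimal Python sketch of that idea (the try/except structure and messages are illustrative, not part of any standard tool):
import subprocess

try:
    # nvcc ships with the CUDA Toolkit; if it is missing, the toolkit is likely not installed
    out = subprocess.run(["nvcc", "--version"], capture_output=True, text=True, check=True)
    print(out.stdout)
except (FileNotFoundError, subprocess.CalledProcessError):
    print("nvcc not found - the CUDA Toolkit does not appear to be installed")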
Example 2: PyTorch GPU check
In [1]: import torch
In [2]: torch.cuda.current_device()
Out[2]: 0
In [3]: torch.cuda.device(0)
Out[3]: <torch.cuda.device at 0x7efce0b03be0>
In [4]: torch.cuda.device_count()
Out[4]: 1
In [5]: torch.cuda.get_device_name(0)
Out[5]: 'GeForce GTX 950M'
In [6]: torch.cuda.is_available()
Out[6]: True
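Note that torch.cuda.current_device() and the other calls above can raise an error when no GPU is present, so it is safer to gate them on torch.cuda.is_available() first. A minimal sketch using only the calls already shown:
import torch

if torch.cuda.is_available():
    idx = torch.cuda.current_device()
    print(f"Using GPU {idx}: {torch.cuda.get_device_name(idx)}")
else:
    print("CUDA is not available; falling back to CPU")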
Example 3: check if PyTorch is using the GPU (minimal example)
import torch
import torch.nn as nn
dev = torch.device("cuda") if torch.cuda.is_available() else torch.device("cpu")
t1 = torch.randn(1,2)
t2 = torch.randn(1,2).to(dev)
print(t1)
print(t2)
t1.to(dev)          # .to() returns a new tensor; it does not move t1 in place
print(t1)
print(t1.is_cuda)   # False: t1 is still on the CPU
t1 = t1.to(dev)     # reassign to actually move t1 to the chosen device
print(t1)
print(t1.is_cuda)   # True if a GPU is available
class M(nn.Module):
    def __init__(self):
        super().__init__()
        self.l1 = nn.Linear(1, 2)

    def forward(self, x):
        x = self.l1(x)
        return x

model = M()
model.to(dev)
print(next(model.parameters()).is_cuda)
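To confirm the model actually runs on the GPU, the input tensor must live on the same device as the parameters. A short follow-up sketch, reusing dev and model from the example above:
x = torch.randn(4, 1).to(dev)   # input must be on the same device as the model
out = model(x)
print(out.device)               # cuda:0 when a GPU is available, otherwise cpu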
Example 4: check the CUDA version used by PyTorch
torch.version.cuda
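torch.version.cuda reports the CUDA version PyTorch was built against (it is None for CPU-only builds). A few related checks, a sketch assuming a CUDA-enabled PyTorch build:
import torch

print(torch.version.cuda)                # CUDA version PyTorch was compiled with (None on CPU-only builds)
print(torch.backends.cudnn.version())    # cuDNN version, if available
if torch.cuda.is_available():
    print(torch.cuda.get_device_capability(0))   # compute capability of GPU 0, e.g. (5, 0)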
Example 5: how to tell if CUDA is installed (Windows)
On Windows, the CUDA Toolkit installer typically places the CUDA Samples in a versioned folder such as:
C:\ProgramData\NVIDIA Corporation\CUDA Samples\v11.1
and, once the samples have been built, the compiled binaries (including deviceQuery.exe) end up under:
C:\ProgramData\NVIDIA Corporation\CUDA Samples\v11.1\bin\win64\Release
If these folders exist, a CUDA Toolkit is installed, and the version number in the path matches the installed toolkit.
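A quick programmatic check along the same lines, a minimal sketch that only looks for the default install location above (the glob pattern and messages are illustrative):
from pathlib import Path

samples_root = Path(r"C:\ProgramData\NVIDIA Corporation\CUDA Samples")
versions = sorted(p.name for p in samples_root.glob("v*")) if samples_root.exists() else []
if versions:
    print("CUDA Samples found for toolkit version(s):", ", ".join(versions))
else:
    print("No CUDA Samples folder found in the default location")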