PyTorch: how to remove .cuda() from a tensor

If you have a float tensor f and want to convert it to long, you do long_tensor = f.long().

If you have a CUDA tensor, i.e. the data is on the GPU, and want to move it to the CPU, you can do cuda_tensor.cpu().

So to convert a torch.cuda.FloatTensor A to a long tensor on the CPU, do A.long().cpu().
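A minimal runnable sketch of the two steps above, run on a CPU-only machine (the tensor names f, long_tensor, and cpu_tensor are illustrative; on a GPU box f could be a torch.cuda.FloatTensor):

```python
import torch

f = torch.rand(3)  # a float tensor

# dtype conversion: float -> long
long_tensor = f.long()

# device move: for a tensor already on the CPU this is a no-op
cpu_tensor = long_tensor.cpu()

print(long_tensor.dtype)       # torch.int64
print(cpu_tensor.device.type)  # cpu
```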

Best practice since PyTorch 0.4.0 is to write device-agnostic code: that is, instead of using .cuda() or .cpu(), you can simply use .to(torch.device("cpu")).

A = A.to(device=torch.device("cpu"))

Note that .to() is not an "in-place" operation (see, e.g., this answer), so you need to assign the result back to A.
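A quick sketch of why the reassignment matters (run here on a CPU-only machine; on a GPU machine A could start out on "cuda"):

```python
import torch

A = torch.rand(2, 2)

A.to(torch.device("cpu"))           # has no effect on A: .to() is not in-place
A = A.to(torch.device("cpu"))       # assign the result back to actually move A

print(A.device.type)  # cpu
```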

If you have a tensor t,

t = t.cpu() 

would be the old way.

t = t.to("cpu")

would be the new API.
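Putting it together, a small device-agnostic sketch (the variable names device and t are illustrative; the torch.cuda.is_available() guard lets the same code run with or without a GPU):

```python
import torch

# pick the GPU when one is present, otherwise fall back to the CPU
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

t = torch.rand(3).to(device)  # same line works on either kind of machine
t = t.to("cpu")               # new-style equivalent of t = t.cpu()

print(t.device.type)  # cpu
```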