Is there a PyTorch method to check the number of CPUs?
At present PyTorch doesn't support multi-CPU clusters in its DistributedDataParallel implementation, so I am assuming you mean the number of CPU cores.
There's no direct equivalent of the GPU count method (torch.cuda.device_count()), but you can get the number of threads available for computation in PyTorch with
torch.get_num_threads()
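
As a minimal sketch, here is how you might compare that value with what the OS reports and, if you want, pin PyTorch to a specific thread count (torch.get_num_threads() and torch.set_num_threads() are real PyTorch calls; halving the count to approximate physical cores is only an assumption for illustration):

import os
import torch

# Threads PyTorch will use for intra-op parallelism
print(torch.get_num_threads())

# Logical CPUs reported by the OS; may differ from the above
print(os.cpu_count())

# Optionally cap PyTorch's threads, e.g. roughly one per physical core
# (assumes 2 logical CPUs per core; adjust for your machine)
torch.set_num_threads(max(1, (os.cpu_count() or 1) // 2))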
Just use the standard library:
import os
os.cpu_count()
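
Note that os.cpu_count() reports logical CPUs (including hyper-threads) and can return None if the count can't be determined, so a defensive variant (the fallback of 1 is just an assumption for illustration) would be:

import os
num_cpus = os.cpu_count() or 1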