DDP using multiple CPUs? #573
lyyc199586 started this conversation in General
Hi all,
I currently don't have access to any GPUs, but I can use multiple CPUs. Is it possible to use DistributedDataParallel with multiple CPUs in neuraloperator? If so, how do I modify the config file?
I noticed in neuraloperator/neuralop/training/torch_setup.py (line 30 at commit a9e63cb) that once we set `config.distributed.use_distributed=True`, the device has to be `cuda`.
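
For reference, plain PyTorch DDP does run on CPU when the process group is initialized with the `gloo` backend (the CUDA path typically uses `nccl`, which requires GPUs). Below is a minimal sketch of CPU-only DDP in plain PyTorch, independent of neuraloperator's config and `torch_setup.py`; the `run_worker` function and the toy linear model are just illustrative stand-ins:

```python
import os
import torch
import torch.distributed as dist
import torch.multiprocessing as mp
from torch.nn.parallel import DistributedDataParallel as DDP

def run_worker(rank, world_size):
    # "gloo" is the backend PyTorch supports for CPU-only DDP;
    # "nccl" (used for CUDA tensors) requires GPUs.
    os.environ["MASTER_ADDR"] = "127.0.0.1"
    os.environ["MASTER_PORT"] = "29500"
    dist.init_process_group("gloo", rank=rank, world_size=world_size)

    model = torch.nn.Linear(16, 16)  # stand-in for a neural operator model
    ddp_model = DDP(model)           # no device_ids argument on CPU

    opt = torch.optim.SGD(ddp_model.parameters(), lr=1e-3)
    x = torch.randn(8, 16)
    loss = ddp_model(x).pow(2).mean()
    loss.backward()  # gradients are all-reduced across the CPU processes
    opt.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    world_size = 2  # one process per CPU worker
    mp.spawn(run_worker, args=(world_size,), nprocs=world_size)
```

If `torch_setup.py` only supports `cuda` when `use_distributed=True`, the analogous change would presumably be initializing the process group with `gloo` and keeping the model and data on CPU, though I haven't verified that against neuraloperator's setup code.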