
Searched defs:set_requires_gradient_sync (Results 1 – 2 of 2) sorted by relevance

/external/pytorch/torch/distributed/_composable/
    replicate.py:163  def set_requires_gradient_sync(self, requires_gradient_sync: bool) -> None:  (member in DDP)
/external/pytorch/torch/distributed/_composable/fsdp/
    fully_shard.py:226  def set_requires_gradient_sync(  (member in FSDPModule)
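
A minimal sketch of how the FSDPModule method found above is typically used: turning gradient synchronization off during gradient-accumulation micro-batches and back on for the final one. It assumes a distributed process group and device mesh have already been initialized (e.g., via torchrun); the model, data, and micro-batch count are illustrative, not part of the search results.

    import torch
    import torch.nn as nn
    from torch.distributed._composable.fsdp import fully_shard

    model = nn.Sequential(nn.Linear(16, 16), nn.Linear(16, 1))
    fully_shard(model)  # module now exposes FSDPModule methods such as set_requires_gradient_sync
    optim = torch.optim.Adam(model.parameters())

    num_microbatches = 4
    for step, batch in enumerate(microbatches):  # `microbatches` is assumed to be defined elsewhere
        is_last = (step + 1) % num_microbatches == 0
        # Skip gradient synchronization on all but the last micro-batch,
        # then synchronize once before the optimizer step.
        model.set_requires_gradient_sync(is_last)
        loss = model(batch).sum()
        loss.backward()
        if is_last:
            optim.step()
            optim.zero_grad()

The DDP counterpart in replicate.py follows the same pattern: the flag controls whether gradients are all-reduced during backward, which is what makes accumulation without per-step communication possible.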