Searched refs:allow_in_graph (Results 1 – 16 of 16) sorted by relevance
/external/pytorch/torch/_dynamo/
  decorators.py
    77: def allow_in_graph(fn):
    87: return [allow_in_graph(x) for x in fn]
    327: allow_in_graph(einops.rearrange)
    328: allow_in_graph(einops.reduce)
    330: allow_in_graph(einops.repeat)  # available since einops 0.2.0
    332: allow_in_graph(einops.einsum)  # available since einops 0.5.0
    334: allow_in_graph(einops.pack)  # available since einops 0.6.0
    336: allow_in_graph(einops.unpack)  # available since einops 0.6.0
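
Note: the einops registrations above use the call form of ``allow_in_graph``. A minimal sketch of the same pattern for a user-defined helper (the helper name and body below are illustrative, not taken from these results):

    import torch
    import torch._dynamo

    def blackbox_helper(x):
        # Dynamo does not trace into this function; the call is recorded
        # as a single node in the captured graph instead.
        return x / (x.norm() + 1e-6)

    torch._dynamo.allow_in_graph(blackbox_helper)

    @torch.compile(backend="eager")
    def fn(x):
        return blackbox_helper(x) * 2

    fn(torch.randn(8))
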
  __init__.py
    8: allow_in_graph,
/external/pytorch/docs/source/ |
  torch.compiler_fine_grain_apis.rst
    27: …allow_in_graph``", "The annotated callable goes as is in the TorchDynamo graph. For example, a bla…
    75: different backend compilers, you might have to call ``allow_in_graph`` for
    78: ``torch.compiler.allow_in_graph``
    81: ``torch.compiler.allow_in_graph`` is useful when the relevant function frame
    85: function is decorated with ``allow_in_graph``, TorchDynamo treats it as a
    89: ``allow_in_graph`` skips TorchDynamo completely on the decorated function
    91: closures, and others. Use `allow_in_graph` with caution. PyTorch downstream
    93: features, but ``allow_in_graph`` bypasses TorchDynamo. Using ``allow_in_graph``
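
Note: as the doc excerpts above describe, the decorated callable goes into the TorchDynamo graph as-is rather than being traced. A minimal sketch using the public ``torch.compiler.allow_in_graph`` decorator (function names are illustrative):

    import torch

    @torch.compiler.allow_in_graph
    def opaque_scale(x, factor):
        # TorchDynamo skips tracing this frame and emits a single call
        # node for it, so its Python internals never reach the compiler.
        return x * factor

    @torch.compile(backend="eager")
    def model_step(x):
        return opaque_scale(x, 3.0).relu()

    model_step(torch.randn(4, 4))
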
  torch.compiler_api.rst
    18: allow_in_graph
  torch.compiler_faq.rst
    372: For other transforms, as a workaround, use ``torch._dynamo.allow_in_graph``
    374: ``allow_in_graph`` is an escape hatch. If your code does not work with
    377: ``allow_in_graph``.
    379: By using ``allow_in_graph`` to annotate a function, you must make sure
    396: return torch._dynamo.allow_in_graph(torch.vmap(torch.sum))(x)
    401: A common pitfall is using ``allow_in_graph`` to annotate a function that
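
Note: the FAQ match at its line 396 shows the escape-hatch pattern for functorch transforms. A self-contained sketch of that snippet (input shape chosen arbitrarily):

    import torch
    import torch._dynamo

    @torch.compile(backend="eager")
    def fn(x):
        # Wrap the transform so Dynamo records it as one opaque call
        # instead of tracing through functorch internals.
        return torch._dynamo.allow_in_graph(torch.vmap(torch.sum))(x)

    fn(torch.randn(4, 5))  # sums over dim 1 for each of the 4 rows
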
/external/pytorch/torch/nn/attention/ |
  bias.py
    27: torch._dynamo.allow_in_graph(is_flash_attention_available)
    28: torch._dynamo.allow_in_graph(can_use_flash_attention)
    29: torch._dynamo.allow_in_graph(can_use_efficient_attention)
    30: torch._dynamo.allow_in_graph(SDPAParams)
/external/pytorch/torch/compiler/ |
  __init__.py
    43: def allow_in_graph(fn):
    120: return torch._dynamo.allow_in_graph(fn)
/external/pytorch/test/dynamo/ |
  test_interop.py
    35: from torch._dynamo import allow_in_graph
    38: f = allow_in_graph(f)
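
Note: the test above uses the call form on an existing function; ``allow_in_graph`` returns the function it was given, so rebinding it is harmless. A short sketch of that usage (names are illustrative):

    import torch
    from torch._dynamo import allow_in_graph

    def f(x, y):
        return torch.sin(x) + torch.cos(y)

    f = allow_in_graph(f)  # registers f and returns it unchanged

    @torch.compile(backend="eager")
    def g(x, y):
        return f(x, y) * 2

    g(torch.randn(3), torch.randn(3))
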
  test_sdpa.py
    15: SDPAParams = torch._dynamo.allow_in_graph(SDPAParams)
  test_aot_autograd_cache.py
    691: @torch._dynamo.allow_in_graph
    710: @torch._dynamo.allow_in_graph
    753: @torch._dynamo.allow_in_graph
  test_decorators.py
    34: torch._dynamo.allow_in_graph(torch.sub)
    199: torch._dynamo.allow_in_graph(my_custom_function)
  test_autograd_function.py
    334: @torch._dynamo.allow_in_graph
    928: torch._dynamo.allow_in_graph(FooTensor)
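
Note: these matches decorate a custom autograd.Function class and register a tensor subclass (``FooTensor``). A hedged sketch of the Function pattern, assuming a simple forward/backward pair (the class below is illustrative, not the one in the test):

    import torch
    import torch._dynamo

    @torch._dynamo.allow_in_graph
    class MySin(torch.autograd.Function):
        @staticmethod
        def forward(ctx, x):
            ctx.save_for_backward(x)
            return x.sin()

        @staticmethod
        def backward(ctx, grad_out):
            (x,) = ctx.saved_tensors
            return grad_out * x.cos()

    @torch.compile(backend="eager")
    def fn(x):
        # Dynamo places MySin.apply in the graph without tracing its body.
        return MySin.apply(x)

    out = fn(torch.randn(4, requires_grad=True))
    out.sum().backward()
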
  test_misc.py
    39: from torch._dynamo import allow_in_graph
    7258: @allow_in_graph
    7379: @torch._dynamo.allow_in_graph
    7413: @torch._dynamo.allow_in_graph
  test_repros.py
    4218: @torch._dynamo.allow_in_graph
    5502: @torch._dynamo.allow_in_graph
/external/pytorch/torch/sparse/ |
  semi_structured.py
    131: torch._dynamo.allow_in_graph(cls)
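
Note: here ``allow_in_graph`` is applied to a class (the semi-structured sparse tensor subclass), so constructing instances inside compiled code appears as a single call node. A minimal sketch of registering a hypothetical wrapper subclass the same way; whether compilation of such a subclass works end to end also depends on the subclass following the usual tensor-subclass rules:

    import torch
    import torch._dynamo

    class WrapperTensor(torch.Tensor):
        # Hypothetical minimal tensor subclass, standing in for the real
        # SparseSemiStructuredTensor registered in semi_structured.py.
        @staticmethod
        def __new__(cls, data):
            return torch.Tensor._make_subclass(cls, data)

    # Register the class itself: Dynamo then treats WrapperTensor(...) as
    # an allowed call instead of tracing __new__.
    torch._dynamo.allow_in_graph(WrapperTensor)
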
/external/pytorch/test/functorch/ |
  test_eager_transforms.py
    43: from torch._dynamo import allow_in_graph
    5132: f = allow_in_graph(f)