Searched defs:retain_graph (Results 1 – 11 of 11) sorted by relevance
/external/pytorch/torch/csrc/distributed/autograd/
    autograd.cpp
        14: bool retain_graph) { in backward()

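The hit above is the C++ entry point behind torch.distributed.autograd.backward, where retain_graph is a plain bool. A minimal single-process sketch of the Python-side call follows; the worker name, port, and toy loss are illustrative assumptions, and real jobs span several RPC workers.

import os
import torch
import torch.distributed.autograd as dist_autograd
import torch.distributed.rpc as rpc

# Single-process RPC setup purely for illustration.
os.environ.setdefault("MASTER_ADDR", "localhost")
os.environ.setdefault("MASTER_PORT", "29500")
rpc.init_rpc("worker0", rank=0, world_size=1)

with dist_autograd.context() as context_id:
    w = torch.ones(2, 2, requires_grad=True)
    loss = (w * 2).sum()
    # retain_graph here reaches the bool parameter seen at autograd.cpp:14.
    dist_autograd.backward(context_id, [loss], retain_graph=False)
    grads = dist_autograd.get_gradients(context_id)

rpc.shutdown()
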
/external/pytorch/torch/csrc/autograd/
    autograd.cpp
        167: std::optional<bool> retain_graph, in backward()
        188: std::optional<bool> retain_graph, in grad()

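These two hits are the C++ implementations behind torch.autograd.backward and torch.autograd.grad; retain_graph is optional there and, when left unset, falls back to the value of create_graph. A small sketch of the Python-facing behaviour:

import torch

x = torch.randn(3, requires_grad=True)
y = (x ** 2).sum()

# Keep the graph alive so it can be walked a second time below.
torch.autograd.backward(y, retain_graph=True)

# Second pass over the same graph; without retain_graph=True above this
# call would raise "Trying to backward through the graph a second time".
(g,) = torch.autograd.grad(y, x)
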
/external/pytorch/torch/_functorch/
    eager_transforms.py
        166: outputs, inputs, grad_outputs=None, retain_graph=False, create_graph=True [argument]
        425: def wrapper(cotangents, retain_graph=True, create_graph=None): [argument]

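The eager_transforms.py hits are the internal _autograd_grad helper and the cotangent wrapper returned by vjp; the wrapper defaults to retain_graph=True so it can be applied to more than one cotangent. A sketch under the assumption that torch.func.vjp is the public entry point for that wrapper:

import torch
from torch.func import vjp

x = torch.randn(3)
out, vjp_fn = vjp(torch.sin, x)

# Because the returned wrapper keeps the graph (retain_graph=True by default),
# it can be evaluated for several cotangents in turn.
(g1,) = vjp_fn(torch.ones_like(out))
(g2,) = vjp_fn(2 * torch.ones_like(out))
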
/external/pytorch/torch/csrc/jit/runtime/
    register_prim_ops_fulljit.cpp
        232: auto retain_graph = pop(stack).toOptional<bool>(); in __anon992a14c81902() [local]
        271: auto retain_graph = pop(stack).toOptional<bool>(); in __anon992a14c81a02() [local]
    register_distributed_ops.cpp
        237: bool retain_graph = pop(stack).toBool(); in __anonfc71e5fa0802() [local]
    register_prim_ops.cpp
        1132: auto retain_graph = pop(stack).toOptional<bool>(); in __anonb356bc0f4602() [local]

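These runtime hits register the backward op for the JIT interpreter: each lambda pops retain_graph off the interpreter stack as an optional bool (a plain bool for the distributed variant). A hedged sketch of TorchScript code that should end up in those registrations:

import torch

@torch.jit.script
def two_passes(x: torch.Tensor):
    y = (x * x).sum()
    # Inside TorchScript this lowers to the registered backward op,
    # which pops retain_graph from the interpreter stack.
    y.backward(retain_graph=True)
    y.backward()
    return x.grad

print(two_passes(torch.randn(3, requires_grad=True)))
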
/external/pytorch/torch/autograd/
    functional.py
        173: retain_graph=None, [argument]

/external/pytorch/torch/csrc/distributed/rpc/
    init.cpp
        493: bool retain_graph) { in rpc_init()

/external/pytorch/test/
    test_decomp.py
        103: outputs, inputs, grad_outputs=None, retain_graph=False, create_graph=True [argument]

/external/pytorch/torch/
    _tensor.py
        526: self, gradient=None, retain_graph=None, create_graph=False, inputs=None [argument]

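The _tensor.py hit is the public Tensor.backward signature, where retain_graph=None means "same as create_graph". A minimal usage sketch:

import torch

w = torch.randn(2, requires_grad=True)
loss = (w * w).sum()

loss.backward(retain_graph=True)  # graph is kept for the next call
loss.backward()                   # would raise a RuntimeError without the flag above
print(w.grad)                     # gradients from both passes accumulate
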
/external/pytorch/test/functorch/
    test_ops.py
        74: outputs, inputs, grad_outputs=None, retain_graph=False, create_graph=True [argument]