Lines Matching full:backward
77 r"""Register a backward hook.
92 See :ref:`backward-hooks-execution` for more information on how and when this hook
102 >>> b.sum().backward(retain_graph=True)
107 >>> b.sum().backward(retain_graph=True)
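
These matches appear to come from the :meth:`torch.autograd.graph.Node.register_hook` docstring. A minimal sketch of the registration pattern, assuming that API (the doubling hook is illustrative, not part of the original)::

    import torch

    a = torch.tensor([0.0, 0.0, 0.0], requires_grad=True)
    b = a.clone()
    # b.grad_fn is the Node that produced b; the hook sees the gradients
    # flowing into and out of that Node and may return replacement grad_inputs
    handle = b.grad_fn.register_hook(lambda grad_inputs, grad_outputs: (grad_outputs[0] * 2,))
    b.sum().backward(retain_graph=True)
    print(a.grad)  # tensor([2., 2., 2.])
    handle.remove()  # the hook stays active until explicitly removed
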
115 r"""Register a backward pre-hook.
129 See :ref:`backward-hooks-execution` for more information on how and when this hook
138 >>> b.sum().backward(retain_graph=True)
143 >>> b.sum().backward(retain_graph=True)
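
The pre-hook variant, :meth:`torch.autograd.graph.Node.register_prehook`, runs before the Node computes its gradients, so it only sees (and may replace) the ``grad_outputs``. A sketch under the same setup as above::

    import torch

    a = torch.tensor([0.0, 0.0, 0.0], requires_grad=True)
    b = a.clone()
    handle = b.grad_fn.register_prehook(lambda grad_outputs: (grad_outputs[0] * 2,))
    b.sum().backward()
    print(a.grad)  # tensor([2., 2., 2.])
    handle.remove()
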
215 operation saves a tensor for backward (this includes intermediary results
223 namely when executing :func:`torch.Tensor.backward()` or
257 >>> y.sum().backward()
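
These matches describe the saved-tensor hooks mechanism: whenever an operation saves a tensor for backward, a pack hook decides what is actually stored, and an unpack hook restores it when the backward pass needs it. A minimal sketch using the public :class:`torch.autograd.graph.saved_tensors_hooks` context manager (the hook bodies are illustrative)::

    import torch

    def pack_hook(t):
        # runs at forward time, when an op saves `t` for backward;
        # whatever is returned here is stored in place of `t`
        return t.detach().clone()

    def unpack_hook(packed):
        # runs at backward time, when the saved tensor is needed again
        return packed

    x = torch.randn(3, requires_grad=True)
    with torch.autograd.graph.saved_tensors_hooks(pack_hook, unpack_hook):
        y = x.pow(2)  # pow saves x for backward, routing it through pack_hook
    y.sum().backward()  # unpack_hook fires here
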
288 Context manager under which tensors saved by the forward pass will be stored on CPU, then retrieved for backward.
292 then copied back to the original device when needed for the backward pass.
324 >>> y.sum().backward() # all CPU tensors are moved back to GPU, for backward
325 >>> # all intermediary tensors are released (deleted) after the call to backward
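
A built-in application of those hooks is :class:`torch.autograd.graph.save_on_cpu`, which these lines document. A sketch, assuming a CUDA device is available (without one the offloading is a no-op)::

    import torch

    if torch.cuda.is_available():
        x = torch.randn(5, device="cuda", requires_grad=True)
        with torch.autograd.graph.save_on_cpu(pin_memory=True):
            # every tensor saved for backward inside this block is moved
            # to (pinned) CPU memory at forward time
            y = x.pow(2).sin()
        y.sum().backward()  # saved tensors are copied back to the GPU here
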
413 r"""Register a multi-grad backward hook.
420 for any ``inputs`` specified for the current ``.backward()`` or ``.grad()`` call,
437 See :ref:`backward-hooks-execution` for more information on how and when this hook
454 >>> c.sum().backward(retain_graph=True)
456 >>> c.sum().backward(inputs=(a,), retain_graph=True)
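
These lines belong to :func:`torch.autograd.graph.register_multi_grad_hook`: the hook fires once per backward call, after the gradients with respect to every watched tensor have been computed, with ``None`` entries for tensors not needed by the current call. A sketch mirroring the docstring's doctest::

    import torch

    a = torch.randn(3, requires_grad=True)
    b = torch.randn(3, requires_grad=True)

    def hook(grads):
        # one Optional[Tensor] entry per watched tensor, in registration order
        print([g is not None for g in grads])

    handle = torch.autograd.graph.register_multi_grad_hook((a, b), hook)
    c = a * b
    c.sum().backward(retain_graph=True)               # prints [True, True]
    c.sum().backward(inputs=(a,), retain_graph=True)  # prints [True, False]
    handle.remove()
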
478 ), "expected this hook to be called inside a backward call"
509 assert id != -1, "expected this hook to be called inside a backward call"
525 # NOTE [Allow mutation on tensors saved for backward]
527 # 1. Tensor gets saved for backward
536 # 3. during backward
580 # Tensors saved for backward have an entry in _tid_to_weakhandle
599 "Trying to backward outside of the 'allow_mutation_on_saved_tensors' context"
631 # saved at one point, but cleared by backward before it is modified
636 # >>> out.backward()
670 """Context manager under which mutating tensors saved for backward is allowed.
672 Under this context manager, tensors saved for backward are cloned on mutation,
673 so the original version can still be used during backward. Normally, mutating a tensor
674 saved for backward raises an error when that tensor is used during backward.
676 To ensure the correct behavior, both the forward and backward should be run under
693 ... # backward
694 ... out.sum().backward()
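
A sketch of the context manager in use, staying close to the docstring's own doctest: both forward and backward run inside the ``with`` block, and the in-place mutation no longer breaks the backward because the pre-mutation value is cloned::

    import torch

    with torch.autograd.graph.allow_mutation_on_saved_tensors():
        # forward
        a = torch.ones(2, 3, requires_grad=True)
        b = a.clone()
        out = (b ** 2).sum()  # pow saves b for backward
        b.sin_()              # mutates a tensor saved for backward in place
        # backward still sees the pre-mutation clone of b
        out.backward()
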
769 return Variable._execution_engine.run_backward(  # Calls into the C++ engine to run the backward pass
771 )
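
The final two matches are the hand-off from Python into the C++ autograd engine; the hooks above are all ultimately invoked from inside this ``run_backward`` call. Assuming these lines sit in the shared helper behind the public entry points, these are the user-facing spellings that reach it::

    import torch

    x = torch.randn(3, requires_grad=True)
    y = (x * 2).sum()
    y.backward(retain_graph=True)                     # Tensor.backward
    torch.autograd.backward([y], retain_graph=True)   # functional form
    (gx,) = torch.autograd.grad(y, x)                 # returns the gradient
                                                      # instead of accumulating
                                                      # into x.grad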