Lines Matching full:backward
36 r"""Save given tensors for a future call to :func:`~Function.backward`.
41 All tensors intended to be used in the backward pass should be saved
47 nor outputs of :func:`forward`, are saved for backward, your custom Function
48 may not support double backward.
49 Custom Functions that do not support double backward should decorate their
50 :func:`backward` method with ``@once_differentiable`` so that performing
51 double backward raises an error. If you'd like to support double backward,
52 you can either recompute intermediaries based on the inputs during backward
54 …`double backward tutorial <https://pytorch.org/tutorials/intermediate/custom_function_double_backw…
57 In :func:`backward`, saved tensors can be accessed through the :attr:`saved_tensors`
78 >>> def backward(ctx, grad_out):
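The matches at source lines 36-78 come from the ``save_for_backward`` docstring. A minimal sketch (illustrative names, not the file's own example) of a Function that saves tensors with ``ctx.save_for_backward`` and declines double backward via ``@once_differentiable``::

    import torch
    from torch.autograd.function import Function, once_differentiable

    class ScaledDot(Function):
        # Hypothetical example: y = (x * w).sum(), saving both inputs for backward.
        @staticmethod
        def forward(ctx, x, w):
            ctx.save_for_backward(x, w)   # retrieved later as ctx.saved_tensors
            return (x * w).sum()

        @staticmethod
        @once_differentiable              # double backward through this op raises an error
        def backward(ctx, grad_out):
            x, w = ctx.saved_tensors
            return grad_out * w, grad_out * x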
171 >>> def backward(ctx, grad_output):
179 >>> b.backward() # RuntimeError: one of the variables needed for gradient
201 efficiency of backward computation. You still need to accept a gradient
202 for each output in :meth:`~Function.backward`, but it's always going to
217 >>> def backward(ctx, g1, g2): # still need to accept g2
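Source lines 201-217 are from the ``mark_non_differentiable`` docstring. A sketch in the same spirit as its sorting example (assuming a 1-D input): the integer index output is marked non-differentiable, yet ``backward`` must still accept its always-zero gradient ``g2``::

    import torch
    from torch.autograd import Function

    class Sort(Function):
        # Hypothetical example: returns sorted values plus their indices.
        @staticmethod
        def forward(ctx, x):
            sorted_x, idx = torch.sort(x)
            ctx.mark_non_differentiable(idx)   # no gradient ever flows through idx
            ctx.save_for_backward(x, idx)
            return sorted_x, idx

        @staticmethod
        def backward(ctx, g1, g2):             # still need to accept g2
            x, idx = ctx.saved_tensors
            grad_x = torch.zeros_like(x)
            grad_x.index_add_(0, idx, g1)      # route gradients back to original positions
            return grad_x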
233 prior to calling the :func:`backward` and :func:`jvp` methods.
244 >>> def backward(ctx, g1, g2):
257 >>> def backward(ctx, g1, g2):
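Source lines 233-257 belong to the ``set_materialize_grads`` docstring. A sketch (illustrative) of a Function that turns grad materialization off and therefore has to tolerate ``None`` gradients in ``backward``::

    import torch
    from torch.autograd import Function

    class SplitInHalf(Function):
        # Hypothetical example: splits an even-length 1-D tensor into two halves.
        @staticmethod
        def forward(ctx, x):
            ctx.set_materialize_grads(False)   # unused outputs deliver None, not zeros
            n = x.numel() // 2
            ctx.n = n
            return x[:n], x[n:]

        @staticmethod
        def backward(ctx, g1, g2):
            # With materialization disabled, either gradient may be None;
            # only allocate zeros when a real gradient has to be assembled.
            if g1 is None and g2 is None:
                return None
            a = g1 if g1 is not None else torch.zeros(ctx.n)
            b = g2 if g2 is not None else torch.zeros(ctx.n)
            return torch.cat([a, b])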
294 Apply method used when executing this Node during the backward
297 # The user should define either backward or vjp but never both.
298 backward_fn = self._forward_cls.backward # type: ignore[attr-defined]
300 if backward_fn is not Function.backward and vjp_fn is not Function.vjp:
302 "Implementing both 'backward' and 'vjp' for a custom "
331 name + "Backward", (BackwardCFunction,), {"_forward_cls": cls}
380 retrieved during the backward pass. Tensors should not be stored
382 backward compatibility). Instead, tensors should be saved either with
384 ``backward`` (equivalently, ``vjp``) or :func:`ctx.save_for_forward`
398 ``setup_context`` is not overridden. Setting up the ctx for backward
401 override ``setup_context``. Setting up the ctx for backward happens
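Source lines 380-401 describe the two ways to write ``forward``: combined with ctx, or ctx-free with a separate ``setup_context``. A sketch of the second style (assuming the PyTorch 2.0 ``setup_context(ctx, inputs, output)`` signature)::

    import torch
    from torch.autograd import Function

    class Exp(Function):
        # Hypothetical example: forward takes no ctx; setup_context stows state for backward.
        @staticmethod
        def forward(x):
            return x.exp()

        @staticmethod
        def setup_context(ctx, inputs, output):
            # Outputs, like inputs, are saved through save_for_backward rather
            # than being stored directly on ctx.
            ctx.save_for_backward(output)

        @staticmethod
        def backward(ctx, grad_out):
            (y,) = ctx.saved_tensors
            return grad_out * y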
409 def backward(ctx: Any, *grad_outputs: Any) -> Any: member in _SingleLevelFunction
410 …r"""Define a formula for differentiating the operation with backward mode automatic differentiatio…
427 :func:`backward` will have ``ctx.needs_input_grad[0] = True`` if the
432 "You must implement either the backward or vjp method for "
433 "your custom autograd.Function to use it with backward "
437 # vjp and backward are aliases of each other
438 vjp = backward
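Source lines 409-438 define ``backward`` on ``_SingleLevelFunction`` and alias ``vjp`` to it; line 427 notes that ``ctx.needs_input_grad`` records which inputs require gradients. A sketch (illustrative) of a ``backward`` that uses that flag to skip unneeded work::

    import torch
    from torch.autograd import Function

    class MatMul(Function):
        # Hypothetical example: Y = A @ B for 2-D tensors.
        @staticmethod
        def forward(ctx, a, b):
            ctx.save_for_backward(a, b)
            return a @ b

        @staticmethod
        def backward(ctx, grad_out):
            a, b = ctx.saved_tensors
            grad_a = grad_b = None
            if ctx.needs_input_grad[0]:
                grad_a = grad_out @ b.t()   # dL/dA = G B^T
            if ctx.needs_input_grad[1]:
                grad_b = a.t() @ grad_out   # dL/dB = A^T G
            return grad_a, grad_b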
468 the :meth:`forward` and :meth:`backward` static methods. Then, to use your custom
473 correct methods on ``ctx`` and validating your backward function using
489 >>> def backward(ctx, grad_output):
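Source lines 468-489 come from the ``Function`` class docstring, which recommends validating a custom ``backward`` with ``torch.autograd.gradcheck``. A usage sketch with a hypothetical Function::

    import torch
    from torch.autograd import Function, gradcheck

    class Cube(Function):
        # Hypothetical Function used only to demonstrate gradcheck validation.
        @staticmethod
        def forward(ctx, x):
            ctx.save_for_backward(x)
            return x ** 3

        @staticmethod
        def backward(ctx, grad_out):
            (x,) = ctx.saved_tensors
            return grad_out * 3 * x ** 2

    # gradcheck compares the analytical backward against finite differences;
    # double-precision inputs keep the numerical comparison meaningful.
    x = torch.randn(4, dtype=torch.double, requires_grad=True)
    assert gradcheck(Cube.apply, (x,), eps=1e-6, atol=1e-4)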
518 True only if this autograd.Function's forward, backward, and jvp (if they
613 # These functions would raise an error in backward anyway.
645 This class is here only for backward compatibility reasons.
765 This class is here only for backward compatibility reasons.
787 def backward(self, *gradients: Any) -> Any: # type: ignore[override] member in NestedIOFunction
789 Shared backward utility.
842 User-defined backward.