:github_url: https://github.com/pytorch/functorch

functorch
===================================

.. currentmodule:: functorch

functorch provides `JAX-like <https://github.com/google/jax>`_ composable function transforms for PyTorch.

.. warning::

   We've integrated functorch into PyTorch. As the final step of the
   integration, the functorch APIs are deprecated as of PyTorch 2.0.
   Please use the torch.func APIs instead and see the
   `migration guide <https://pytorch.org/docs/main/func.migrating.html>`_
   and `docs <https://pytorch.org/docs/main/func.html>`_
   for more details.

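As a minimal sketch of that migration (assuming PyTorch >= 2.0 is installed), moving off the deprecated APIs typically amounts to swapping the import namespace; the transform names themselves are unchanged:

```python
import torch

# Before (deprecated as of PyTorch 2.0):
#   from functorch import grad, vmap
# After:
from torch.func import grad, vmap  # same transforms, new namespace

# Call sites do not change:
def square_sum(x):
    return (x ** 2).sum()

x = torch.tensor([1.0, 2.0, 3.0])
print(grad(square_sum)(x))  # gradient of sum(x**2) is 2 * x
```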
What are composable function transforms?
----------------------------------------

- A "function transform" is a higher-order function that accepts a numerical function
  and returns a new function that computes a different quantity.

- functorch has auto-differentiation transforms (``grad(f)`` returns a function that
  computes the gradient of ``f``), a vectorization/batching transform (``vmap(f)``
  returns a function that computes ``f`` over batches of inputs), and others.

- These function transforms can compose with each other arbitrarily. For example,
  composing ``vmap(grad(f))`` computes a quantity called per-sample-gradients that
  stock PyTorch cannot efficiently compute today.
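The bullets above can be sketched in a few lines. This example uses the ``torch.func`` namespace that now hosts these transforms (assuming PyTorch >= 2.0): ``grad`` turns a scalar-valued function into its gradient function, and composing it with ``vmap`` yields per-sample gradients in a single call:

```python
import torch
from torch.func import grad, vmap

def f(x):
    # a scalar-valued "numerical function" of a vector input
    return (x ** 2).sum()

x = torch.tensor([1.0, 2.0, 3.0])

# grad(f) is a new function that computes df/dx
df = grad(f)
print(df(x))  # tensor([2., 4., 6.]) -- i.e. 2 * x

# vmap(grad(f)) maps the gradient function over a batch of inputs,
# producing one gradient row per sample without a Python loop
batch = torch.stack([x, 10 * x])
print(vmap(grad(f))(batch))
```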

Why composable function transforms?
-----------------------------------

There are a number of use cases that are tricky to do in PyTorch today:

- computing per-sample-gradients (or other per-sample quantities)
- running ensembles of models on a single machine
- efficiently batching together tasks in the inner-loop of MAML
- efficiently computing Jacobians and Hessians
- efficiently computing batched Jacobians and Hessians
Composing :func:`vmap`, :func:`grad`, and :func:`vjp` transforms allows us to express the above
without designing a separate subsystem for each.
This idea of composable function transforms comes from the `JAX framework <https://github.com/google/jax>`_.

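For instance, the Jacobian use cases above can be sketched with ``jacrev`` (shown here from ``torch.func``, the namespace the warning above recommends; assuming PyTorch >= 2.0). Batching a Jacobian computation is just one more composition with ``vmap``:

```python
import torch
from torch.func import jacrev, vmap

def f(x):
    # elementwise sin, so the Jacobian is diag(cos(x))
    return x.sin()

x = torch.randn(3)
J = jacrev(f)(x)           # (3, 3) Jacobian of f at x

# A batch of Jacobians, one per row of xs, without a loop:
xs = torch.randn(5, 3)
Js = vmap(jacrev(f))(xs)   # shape (5, 3, 3)
```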
Read More
---------

Check out our `whirlwind tour <whirlwind_tour>`_ or some of our tutorials mentioned below.


.. toctree::
   :maxdepth: 2
   :caption: functorch: Getting Started

   install
   notebooks/whirlwind_tour.ipynb
   ux_limitations

.. toctree::
   :maxdepth: 2
   :caption: functorch API Reference and Notes

   functorch
   experimental
   aot_autograd

.. toctree::
   :maxdepth: 1
   :caption: functorch Tutorials

   notebooks/jacobians_hessians.ipynb
   notebooks/ensembling.ipynb
   notebooks/per_sample_grads.ipynb
   notebooks/neural_tangent_kernels.ipynb
   notebooks/aot_autograd_optimizations.ipynb
   notebooks/minifier.ipynb