
Commit 0965fe6

Generate Python docs from pytorch/pytorch@46a5e12

Parent: 36c7a87

2,816 files changed: 7,823 additions & 7,737 deletions


main/.buildinfo

Lines changed: 1 addition & 1 deletion

@@ -1,4 +1,4 @@
 # Sphinx build info version 1
 # This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
-config: 415b8e915ec5776ff33d82339ead392c
+config: 2cd854d2e4565aea0d120a636e3d768b
 tags: 645f666f9bcd5a90fca523b33c5a78b7

main/_images/RReLU.png

Binary image changed (13 Bytes)

main/_images/ReduceLROnPlateau.png

Binary image changed (-43 Bytes)

main/_sources/generated/exportdb/index.rst.txt

Lines changed: 2 additions & 2 deletions

@@ -1414,7 +1414,7 @@ list_contains
 
 .. note::
 
-    Tags: :doc:`python.assert <python.assert>`, :doc:`torch.dynamic-shape <torch.dynamic-shape>`, :doc:`python.data-structure <python.data-structure>`
+    Tags: :doc:`torch.dynamic-shape <torch.dynamic-shape>`, :doc:`python.assert <python.assert>`, :doc:`python.data-structure <python.data-structure>`
 
 Support Level: SUPPORTED
 
@@ -1469,7 +1469,7 @@ list_unpack
 
 .. note::
 
-    Tags: :doc:`python.control-flow <python.control-flow>`, :doc:`python.data-structure <python.data-structure>`
+    Tags: :doc:`python.data-structure <python.data-structure>`, :doc:`python.control-flow <python.control-flow>`
 
 Support Level: SUPPORTED
 

main/_sources/generated/exportdb/python.assert.rst.txt

Lines changed: 1 addition & 1 deletion

@@ -60,7 +60,7 @@ list_contains
 
 .. note::
 
-    Tags: :doc:`python.assert <python.assert>`, :doc:`torch.dynamic-shape <torch.dynamic-shape>`, :doc:`python.data-structure <python.data-structure>`
+    Tags: :doc:`torch.dynamic-shape <torch.dynamic-shape>`, :doc:`python.assert <python.assert>`, :doc:`python.data-structure <python.data-structure>`
 
 Support Level: SUPPORTED
 

main/_sources/generated/exportdb/python.control-flow.rst.txt

Lines changed: 1 addition & 1 deletion

@@ -62,7 +62,7 @@ list_unpack
 
 .. note::
 
-    Tags: :doc:`python.control-flow <python.control-flow>`, :doc:`python.data-structure <python.data-structure>`
+    Tags: :doc:`python.data-structure <python.data-structure>`, :doc:`python.control-flow <python.control-flow>`
 
 Support Level: SUPPORTED
 

main/_sources/generated/exportdb/python.data-structure.rst.txt

Lines changed: 2 additions & 2 deletions

@@ -147,7 +147,7 @@ list_contains
 
 .. note::
 
-    Tags: :doc:`python.assert <python.assert>`, :doc:`torch.dynamic-shape <torch.dynamic-shape>`, :doc:`python.data-structure <python.data-structure>`
+    Tags: :doc:`torch.dynamic-shape <torch.dynamic-shape>`, :doc:`python.assert <python.assert>`, :doc:`python.data-structure <python.data-structure>`
 
 Support Level: SUPPORTED
 
@@ -202,7 +202,7 @@ list_unpack
 
 .. note::
 
-    Tags: :doc:`python.control-flow <python.control-flow>`, :doc:`python.data-structure <python.data-structure>`
+    Tags: :doc:`python.data-structure <python.data-structure>`, :doc:`python.control-flow <python.control-flow>`
 
 Support Level: SUPPORTED
 

main/_sources/generated/exportdb/torch.dynamic-shape.rst.txt

Lines changed: 1 addition & 1 deletion

@@ -755,7 +755,7 @@ list_contains
 
 .. note::
 
-    Tags: :doc:`python.assert <python.assert>`, :doc:`torch.dynamic-shape <torch.dynamic-shape>`, :doc:`python.data-structure <python.data-structure>`
+    Tags: :doc:`torch.dynamic-shape <torch.dynamic-shape>`, :doc:`python.assert <python.assert>`, :doc:`python.data-structure <python.data-structure>`
 
 Support Level: SUPPORTED
 

main/_sources/notes/extending.rst.txt

Lines changed: 43 additions & 0 deletions

@@ -141,6 +141,18 @@ the autograd engine.
 to calling backward, and so your code will need to handle such objects as if they were
 tensors filled with zeros. The default value of this setting is True.
 
+In addition to ``ctx`` methods, the :class:`~Function` class supports the following
+class attribute:
+
+- :attr:`~Function.clear_saved_tensors_on_access`: When set to ``True`` on the
+  :class:`~Function` subclass, accessing ``ctx.saved_tensors`` in the backward pass
+  will clear the internal references to those tensors. This allows the tensors to be
+  freed as soon as the local variables returned by ``saved_tensors`` go out of scope,
+  rather than waiting for the buffers to be cleared at the end of the Node's execution.
+  This can reduce peak memory usage in backward passes where saved tensors are only
+  needed once. The default is ``False``. Note that ``saved_tensors`` can only be
+  accessed once when this is enabled; a second access will raise an error.
+
 **Step 3:** If your :class:`~Function` does not support double backward
 you should explicitly declare this by decorating backward with the
 :func:`~function.once_differentiable`. With this decorator, attempts to
@@ -257,6 +269,37 @@ And here, we optimize the above example by calling set_materialize_grads(False):
         # Gradients of non-Tensor arguments to forward must be None.
         return grad_output * ctx.constant, None
 
+Here is an example using ``clear_saved_tensors_on_access`` to reduce peak memory during
+the backward pass. This function computes two matrix multiplications, and in backward
+we free the intermediate tensor after computing its gradient but before computing
+the remaining gradients::
+
+    class TwoMatmuls(Function):
+        clear_saved_tensors_on_access = True
+
+        @staticmethod
+        def forward(ctx, x, weight1, weight2):
+            inter = x.mm(weight1)
+            ctx.save_for_backward(x, weight1, inter, weight2)
+            return inter.mm(weight2)
+
+        @staticmethod
+        def backward(ctx, grad_output):
+            x, weight1, inter, weight2 = ctx.saved_tensors
+
+            # Compute gradients for second matmul
+            grad_weight2 = inter.t().mm(grad_output)
+            grad_inter = grad_output.mm(weight2.t())
+
+            # Free inter and weight2, no longer needed
+            del inter, weight2
+
+            # Compute gradients for first matmul
+            grad_weight1 = x.t().mm(grad_inter)
+            grad_x = grad_inter.mm(weight1.t())
+
+            return grad_x, grad_weight1, grad_weight2
+
 If you need any "intermediate" Tensors computed in :meth:`~Function.forward` to be saved,
 either they must be returned as outputs, or combine ``forward`` and :meth:`~Function.setup_context`
 (see :ref:`combining-forward-context`).
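The gradient formulas in the `TwoMatmuls` backward above can be sanity-checked without PyTorch. The following is a minimal NumPy sketch (not part of the commit; all variable names are illustrative) that recomputes the same gradients for `loss = output.sum()` and compares one entry of each against a finite difference:

```python
import numpy as np

# Check the TwoMatmuls backward formulas:
#   grad_weight2 = inter^T @ g,  grad_inter = g @ weight2^T,
#   grad_weight1 = x^T @ grad_inter,  grad_x = grad_inter @ weight1^T
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 3))
w1 = rng.standard_normal((3, 5))
w2 = rng.standard_normal((5, 2))

def f(x, w1, w2):
    # The forward pass: two chained matmuls
    return (x @ w1) @ w2

# Upstream gradient for loss = output.sum() is all-ones
g = np.ones((4, 2))

inter = x @ w1
grad_w2 = inter.T @ g
grad_inter = g @ w2.T
grad_w1 = x.T @ grad_inter
grad_x = grad_inter @ w1.T

eps = 1e-6
def fd(param, idx, rebuild):
    # Finite-difference derivative of the summed output w.r.t. param[idx]
    p = param.copy()
    p[idx] += eps
    return (rebuild(p).sum() - f(x, w1, w2).sum()) / eps

assert abs(fd(w2, (0, 0), lambda p: f(x, w1, p)) - grad_w2[0, 0]) < 1e-4
assert abs(fd(w1, (1, 2), lambda p: f(x, p, w2)) - grad_w1[1, 2]) < 1e-4
assert abs(fd(x, (2, 1), lambda p: f(p, w1, w2)) - grad_x[2, 1]) < 1e-4
print("gradient formulas match finite differences")
```

Because the loss is linear in each argument when the others are held fixed, the finite difference is exact up to rounding, so the tight tolerance holds.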

main/accelerator.html

Lines changed: 1 addition & 1 deletion

@@ -141,7 +141,7 @@
 fbq('track', 'PageView');
 </script>
 <script>
-document.documentElement.setAttribute('data-version', 'main (2.11.0a0+gitbbc4e47 )');
+document.documentElement.setAttribute('data-version', 'main (2.11.0a0+git46a5e12 )');
 </script>
 <noscript>
 <img height="1" width="1" src="https://www.facebook.com/tr?id=243028289693773&ev=PageView&noscript=1" />
