
Commit e1278d4

Fix typo in autograd docs
1 parent 574cfe3 commit e1278d4

1 file changed


torch/autograd/__init__.py

Lines changed: 1 addition & 1 deletion
@@ -49,7 +49,7 @@ def backward(variables, grad_variables=None, retain_graph=None, create_graph=Non
     The graph is differentiated using the chain rule. If any of ``variables``
     are non-scalar (i.e. their data has more than one element) and require
     gradient, the function additionaly requires specifying ``grad_variables``.
-    It should be a sequence of matching length, that containins gradient of
+    It should be a sequence of matching length, that contains gradient of
     the differentiated function w.r.t. corresponding variables (``None`` is an
     acceptable value for all variables that don't need gradient tensors).
 
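For context, the corrected sentence documents how ``grad_variables`` is used: when any output passed to ``torch.autograd.backward`` is non-scalar, you must supply a gradient tensor of matching length for it. Below is a minimal sketch of such a call, assuming the Variable-era signature shown in the hunk header; the tensor shapes and values are illustrative and not part of this commit.

import torch
from torch.autograd import Variable

# y has two elements, so it is non-scalar and backward() requires
# a grad_variables sequence of matching length.
x = Variable(torch.ones(2), requires_grad=True)
y = x * 3

# Each entry is the gradient of some scalar w.r.t. the matching output;
# torch.ones(2) here weights both elements of y equally.
torch.autograd.backward([y], grad_variables=[torch.ones(2)])
print(x.grad)  # Variable containing [3, 3]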
