commit e1278d4 (1 parent: 574cfe3)
torch/autograd/__init__.py
@@ -49,7 +49,7 @@ def backward(variables, grad_variables=None, retain_graph=None, create_graph=Non
     The graph is differentiated using the chain rule. If any of ``variables``
     are non-scalar (i.e. their data has more than one element) and require
     gradient, the function additionaly requires specifying ``grad_variables``.
-    It should be a sequence of matching length, that containins gradient of
+    It should be a sequence of matching length, that contains gradient of
     the differentiated function w.r.t. corresponding variables (``None`` is an
     acceptable value for all variables that don't need gradient tensors).
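
The docstring edited above describes how ``torch.autograd.backward`` handles non-scalar outputs: when a variable has more than one element, a matching gradient seed must be passed via ``grad_variables``. A minimal sketch of that behavior, assuming the Variable-era API shown in this diff (newer PyTorch releases take plain tensors and name the argument ``grad_tensors``):

import torch
from torch.autograd import Variable

# y has more than one element, so backward() requires grad_variables:
# one gradient tensor per variable, matching its shape.
x = Variable(torch.ones(3), requires_grad=True)
y = x * 2

torch.autograd.backward([y], grad_variables=[torch.ones(3)])
print(x.grad)  # each entry is 2: d(2*x_i)/dx_i, seeded with ones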