torch.autograd is PyTorch's automatic differentiation engine that powers neural network training. Let's try to understand it through an example.
Here is a sample where PyTorch calculates the backward pass for us:
import torch

x = torch.rand(1, requires_grad=True)
y = torch.rand(1)
v = x * y
w = torch.log(v)
w.backward()     # backward() returns None; it writes dw/dx into x.grad
xGrad = x.grad
Let's assume we got the following output and move on with these numbers:
x = tensor([0.0559], requires_grad=True)
y = tensor([0.5163])
w = tensor([-3.5450], grad_fn=<LogBackward0>)
xGrad = tensor([17.8828])
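As a side note, torch.autograd.grad gives you the same gradient in a functional style, without writing it into x.grad. A minimal sketch, to be called instead of backward() (backward() frees the graph unless you pass retain_graph=True):

(xGrad,) = torch.autograd.grad(w, x)   # returns a tuple containing dw/dx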
This is the computational graph that PyTorch generates:
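You can also inspect this graph programmatically by walking the grad_fn chain backwards from the output of the sample above (the exact object addresses will differ on your machine):

print(w.grad_fn)                  # <LogBackward0 object at 0x...>
print(w.grad_fn.next_functions)   # ((<MulBackward0 object at 0x...>, 0),)
# MulBackward0 in turn points to AccumulateGrad for x and None for y,
# because y was created with requires_grad=False.
print(w.grad_fn.next_functions[0][0].next_functions)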
w Calculation
1) Let's recreate the functions from the generated graph by tracing it backwards:
x = tensor([0.0559], requires_grad=True)
y = tensor([0.5163])
v = 0.0559 * 0.5163 = 0.02886117
w = ln(0.02886117) = -3.5452581859176
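These numbers are easy to verify in plain Python (math.log is the natural logarithm, ln):

import math

v = 0.0559 * 0.5163    # 0.02886117
w = math.log(v)        # -3.5452581859176 (approximately)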
xGrad Calculation
This part requires some background on basic derivative rules and the chain rule.
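Applying the chain rule to w = ln(v) with v = x * y:

dw/dx = dw/dv * dv/dx = (1/v) * y = y / (x * y) = 1/x

A quick numeric check with the rounded values printed above (1/0.0559 ≈ 17.8891, slightly off from PyTorch's 17.8828 only because the printed x is rounded to four decimals while PyTorch differentiates the full-precision value):

x_val = 0.0559
y_val = 0.5163
v_val = x_val * y_val
xGrad = (1.0 / v_val) * y_val   # chain rule; simplifies to 1 / x_val
print(xGrad)                    # ≈ 17.8891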