Could someone explain to me why we use requires_grad=True in the input here?
14:12 28 Jan 2026
import torch
import torch.nn as nn

# Example of target with class indices
loss = nn.CrossEntropyLoss()
input = torch.randn(3, 5, requires_grad=True)  # <=============== WHY ?
target = torch.empty(3, dtype=torch.long).random_(5)
output = loss(input, target)
output.backward()

# Example of target with class probabilities
input = torch.randn(3, 5, requires_grad=True)
target = torch.randn(3, 5).softmax(dim=1)
output = loss(input, target)
output.backward()
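For context on what I've tried: my understanding is that the flag has something to do with autograd tracking, and the sketch below (my own experiment, not from the docs) seems to show that without requires_grad=True the call to backward() fails, while with it input.grad gets populated:

```python
import torch
import torch.nn as nn

loss = nn.CrossEntropyLoss()
target = torch.empty(3, dtype=torch.long).random_(5)

# With requires_grad=True, autograd records the ops on `input`,
# so backward() can compute d(loss)/d(input) into input.grad.
input = torch.randn(3, 5, requires_grad=True)
loss(input, target).backward()
print(input.grad is not None)  # True: gradients were populated

# Without requires_grad, no graph is built and backward() raises.
detached = torch.randn(3, 5)
try:
    loss(detached, target).backward()
except RuntimeError as e:
    print("backward failed:", e)
```

Is that the whole story, or is there more to it?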

This code block is taken from the PyTorch CrossEntropyLoss documentation page. Could you please explain why requires_grad=True is set on the input here? I cannot find any explanation for it on that page.

Thanks a lot.

pytorch cross-entropy