Backward propagation paths with multiple losses

Reposted from: https://www.jb51.net/article/213149.htm

1. Multiple losses

 import torch

 x = torch.tensor(2.0, requires_grad=True)
 y = x**2
 z = x
 # backward pass along the first path
 y.backward()
 x.grad    # tensor(4.)
 # backward pass along the second path
 z.backward()
 x.grad    # tensor(5.) -- gradients accumulate
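Because gradients accumulate in `x.grad`, calling `backward()` once per loss is equivalent to summing the losses and calling `backward()` once. A minimal sketch (variable names are my own, not from the original post):

```python
import torch

# Approach 1: one backward() call per loss; gradients accumulate in x.grad.
x = torch.tensor(2.0, requires_grad=True)
y = x**2
z = x
y.backward()   # d(x^2)/dx = 2x = 4
z.backward()   # dx/dx = 1, accumulated -> 5
grad_separate = x.grad.clone()

# Approach 2: sum the losses first, then a single backward() call.
x2 = torch.tensor(2.0, requires_grad=True)
loss = x2**2 + x2
loss.backward()
grad_summed = x2.grad.clone()

print(grad_separate, grad_summed)  # tensor(5.) tensor(5.)
```

Both approaches leave the same total gradient on the leaf, which is why accumulation is the default behavior (and why optimizers call `zero_grad()` between steps).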

 Official documentation:

torch.autograd.backward(tensors, grad_tensors=None, retain_graph=None, create_graph=False, grad_variables=None)

 Computes the sum of gradients of given tensors w.r.t. graph leaves. The graph is differentiated using the chain rule.

 Gradients computed along different paths are accumulated into the tensor's `.grad`.
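One caveat when the losses share intermediate nodes: by default the first `backward()` call frees the computation graph, so a second call through the same intermediate fails. Passing `retain_graph=True` keeps the graph alive; the accumulated gradient is still the sum over both paths. A minimal sketch (example values are my own):

```python
import torch

x = torch.tensor(2.0, requires_grad=True)
h = x**2          # intermediate node shared by both losses
loss1 = 3 * h
loss2 = h + 1

# Keep the shared graph alive for the second backward pass.
loss1.backward(retain_graph=True)
loss2.backward()

print(x.grad)     # d(3x^2)/dx + d(x^2+1)/dx = 12 + 4 -> tensor(16.)
```

Without `retain_graph=True`, the second `backward()` raises a RuntimeError because the buffers of the shared graph were already freed.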

Original post: https://www.cnblogs.com/BlueBlueSea/p/15542288.html