Error: RuntimeError: Trying to backward through the graph a second time, but the saved intermediate results have already been freed

This error comes up frequently when training a GAN:

RuntimeError: Trying to backward through the graph a second time, but the saved intermediate results have already been freed. Specify retain_graph=True when calling .backward() or autograd.grad() the first time.

In plain terms: on the second attempt to backpropagate through the graph, the intermediate results saved during the forward pass have already been freed.
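A minimal sketch (not from the original post) of how this happens: by default, `backward()` frees the graph's saved buffers, so a second `backward()` over the same graph fails unless the graph is rebuilt or `retain_graph=True` is passed.

```python
import torch

x = torch.randn(3, requires_grad=True)
y = x.exp().sum()          # exp() saves its output for the backward pass

y.backward()               # first backward works; the saved buffers are then freed
# y.backward()             # a second backward here would raise the RuntimeError above

# Option 1: keep the buffers if two backward passes are really needed
y = x.exp().sum()
y.backward(retain_graph=True)
y.backward()               # now allowed

# Option 2 (usually better): redo the forward pass to build a fresh graph
y = x.exp().sum()
y.backward()
```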

The cause becomes clear once you look at the training loop: in a GAN, one tensor, fake, is shared between the generator (gen) and the discriminator (disc). The generator's forward pass builds the graph that produces fake; if the discriminator's loss is backpropagated through that graph, its saved buffers are freed, and the later generator update then tries to backpropagate through it a second time.

The fix is to call detach() on fake before feeding it to the discriminator, as sketched below.
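A minimal training-step sketch of the fix (gen, disc, opt_gen, opt_disc, criterion and the layer sizes are assumed placeholder names, not from the original post): detaching fake cuts it out of the generator's graph, so the discriminator's backward pass neither touches nor frees the generator's buffers.

```python
import torch
import torch.nn as nn

# Placeholder models and optimizers for illustration only
gen = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 784))
disc = nn.Sequential(nn.Linear(784, 128), nn.LeakyReLU(0.2), nn.Linear(128, 1))
opt_gen = torch.optim.Adam(gen.parameters(), lr=2e-4)
opt_disc = torch.optim.Adam(disc.parameters(), lr=2e-4)
criterion = nn.BCEWithLogitsLoss()

real = torch.randn(32, 784)          # stand-in for a real batch
noise = torch.randn(32, 64)
fake = gen(noise)                    # fake lives between gen and disc

# --- discriminator step ---
# fake.detach() means this backward() stops at fake and never enters
# (or frees) the generator's part of the graph.
disc_real = disc(real)
disc_fake = disc(fake.detach())
loss_disc = (criterion(disc_real, torch.ones_like(disc_real)) +
             criterion(disc_fake, torch.zeros_like(disc_fake))) / 2
opt_disc.zero_grad()
loss_disc.backward()
opt_disc.step()

# --- generator step ---
# Here fake is used with its graph intact, so gradients flow back into gen.
output = disc(fake)
loss_gen = criterion(output, torch.ones_like(output))
opt_gen.zero_grad()
loss_gen.backward()
opt_gen.step()
```

Without the detach(), loss_disc.backward() would run through the generator's graph and free its buffers, and the subsequent loss_gen.backward() would trigger exactly the RuntimeError above.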

Original article: https://www.cnblogs.com/lfri/p/14866263.html