Setting learning rate decay

Method 1: https://www.pytorchtutorial.com/pytorch-learning-rate-decay/
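The link above is a tutorial on learning rate decay in PyTorch; the standard tool for this is torch.optim.lr_scheduler. As a minimal sketch (the Linear model and the base learning rate of 0.1 are placeholders), MultiStepLR reproduces the same [30, 80] milestone schedule used in Method 2 below:

import torch.nn as nn
import torch.optim as optim

model = nn.Linear(10, 2)  # placeholder model
optimizer = optim.SGD(model.parameters(), lr=0.1)

# Multiply the learning rate by gamma = 0.1 at epochs 30 and 80.
scheduler = optim.lr_scheduler.MultiStepLR(optimizer, milestones=[30, 80], gamma=0.1)

for epoch in range(90):
    # ... one epoch of training: forward, backward, optimizer.step() ...
    scheduler.step()  # advance the schedule once per epoch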

Method 2: drop the learning rate manually inside the training loop:

import os

# lr_step = [30, 80]: epochs at which the learning rate is dropped.
# opt, model, optimizer and save_model come from the surrounding training script.

if epoch in opt.lr_step:
    # Save a checkpoint before dropping the learning rate.
    save_model(os.path.join(opt.save_dir, 'model_{}.pth'.format(epoch)),
               epoch, model, optimizer)
    # Decay the learning rate by a factor of 10 for each milestone passed.
    lr = opt.lr * (0.1 ** (opt.lr_step.index(epoch) + 1))
    print('Drop LR to', lr)
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr
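Here opt.lr_step.index(epoch) + 1 counts how many milestones have been reached, so with lr_step = [30, 80] the learning rate becomes opt.lr * 0.1 at epoch 30 and opt.lr * 0.01 at epoch 80, matching the MultiStepLR schedule in Method 1.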
Original article: https://www.cnblogs.com/xyzluck/p/13023517.html