The PyTorch code is as follows:
Generator_loss = G_loss_A + G_loss_B
Generator_loss.backward()
self.G_optim.step()
For the Paddle version, is it enough to change it to the following?
Generator_loss = G_loss_A + G_loss_B
Generator_loss.backward()
And what does self.G_optim.step() correspond to in Paddle?
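For reference, here is a minimal sketch of how this step could look with the Paddle 2.x dynamic-graph API, where optimizers also expose step() and clear_grad() (the latter playing the role of PyTorch's zero_grad()). The layer and variable names below (gen_a, gen_b, g_optim) are illustrative stand-ins, not taken from the original repo:

```python
import paddle

# Stand-in generators; in the real code these would be the two generator networks.
gen_a = paddle.nn.Linear(4, 4)
gen_b = paddle.nn.Linear(4, 4)

# One optimizer over both generators' parameters, analogous to self.G_optim.
g_optim = paddle.optimizer.Adam(
    learning_rate=1e-4,
    parameters=gen_a.parameters() + gen_b.parameters(),
)

x = paddle.randn([2, 4])
g_loss_a = paddle.mean(gen_a(x) ** 2)   # stand-in for G_loss_A
g_loss_b = paddle.mean(gen_b(x) ** 2)   # stand-in for G_loss_B

generator_loss = g_loss_a + g_loss_b
generator_loss.backward()               # same call as in PyTorch dygraph
g_optim.step()                          # Paddle 2.x optimizers also provide step()
g_optim.clear_grad()                    # Paddle's analogue of zero_grad()
```

Note that under the older static-graph / fluid-style API, the usual replacement for the backward-and-step pair is optimizer.minimize(loss) instead.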