Hi all,
I recently reproduced this paper, but I found something odd: the learning rate drops from 1e-3 to 4e-5 at the very first step of the first epoch and stays there until the end of training. I am not sure whether this is a bug related to accelerator.prepare() or not.
Any discussion is appreciated!
Here is how I print the learning rate.
Before training, it returns 1e-3, matching my setting.
During training, the learning rate is always 4e-5.
I also tried changing args.lradj ('type1', 'constant', and 'COS'), but the behavior is the same in every case.
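For reference, a minimal sketch of the kind of check I mean (the model, optimizer, and `Accelerator` usage here are placeholders, not the repository's actual code): print the learning rate from `optimizer.param_groups` before and after `accelerator.prepare()` to see where the value changes.

```python
import torch

# Toy model and optimizer standing in for the paper's setup.
model = torch.nn.Linear(8, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Before prepare(): the learning rate matches the configured value.
lrs_before = [g["lr"] for g in optimizer.param_groups]
print(lrs_before)  # [0.001]

# Hypothetical check with accelerate (commented out, as it needs a
# distributed launch environment):
# from accelerate import Accelerator
# accelerator = Accelerator()
# model, optimizer = accelerator.prepare(model, optimizer)
# Then print again at the first training step:
# print([g["lr"] for g in optimizer.param_groups])
```

Comparing these two printouts at the first step should show whether the drop to 4e-5 happens inside `prepare()` or in the scheduler logic afterwards.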