Stabilizing training metrics #123520
Hey GitHubbers, I'm not sure if this is the right place to ask this question; if not, please let me know which space to use. The following curves show the training metrics. Training doesn't start smoothly, but it then follows a good-fit trend and converges. What could be the reason for the fluctuation before the 20th epoch, and how do I get a smoother curve? Note that I'm using dropout regularization and batch normalization, and I'm also using an LR schedule.
Replies: 1 comment 7 replies
If the orange line were your training loss, I'd be more worried. Higher fluctuations in the validation error are fairly common, especially with dropout and batch norm. Admittedly, though, this is quite high variability. What do you see if you use a lower LR for the first 20 epochs? If you suspect there may be an error in the code, you could share it. But again, it's probably fine.
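If you want to try a lower LR for the first 20 epochs, one option is to put a warmup phase in front of your existing schedule. Here is a minimal sketch assuming a PyTorch setup; `model`, `base_lr`, `warmup_epochs`, and the cosine decay are placeholders, so swap in whatever optimizer and schedule you're actually using.

```python
import torch
from torch import nn
from torch.optim.lr_scheduler import LinearLR, CosineAnnealingLR, SequentialLR

# Placeholder model and hyperparameters -- substitute your own.
model = nn.Linear(10, 1)
base_lr = 1e-3
warmup_epochs = 20   # keep the LR low until the curves settle
total_epochs = 100

optimizer = torch.optim.Adam(model.parameters(), lr=base_lr)

# Phase 1: start at 10% of base_lr and ramp up linearly over the first 20 epochs.
warmup = LinearLR(optimizer, start_factor=0.1, end_factor=1.0,
                  total_iters=warmup_epochs)
# Phase 2: hand over to the usual decay schedule (cosine here, as an example).
decay = CosineAnnealingLR(optimizer, T_max=total_epochs - warmup_epochs)

scheduler = SequentialLR(optimizer, schedulers=[warmup, decay],
                         milestones=[warmup_epochs])

for epoch in range(total_epochs):
    # ... your training and validation loop goes here ...
    scheduler.step()  # step once per epoch
```

Whether this actually smooths the early validation curve depends on your data and batch size, but a gentler start often reduces the early swings without hurting the final fit.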