Stabilizing training metrics #123520

May 12, 2024 · 1 comment · 7 replies

If the orange line were your training loss, I'd be more worried. Larger fluctuations in the validation error are fairly common, particularly when dropout and batchnorm are in use. Admittedly, though, this is quite high variability. What do you see if you use a lower LR for the first 20 epochs (see the sketch below for one way to do that)? If you suspect there may be an error in the code, you could share it. But again, it's probably fine.

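For anyone wanting to try the lower-LR suggestion, here is a minimal PyTorch sketch (not part of the original thread) of one way to hold the learning rate down for the first 20 epochs using `LambdaLR`. The model, the 0.1 scaling factor, and the epoch count are illustrative placeholders, not values recommended in the discussion.

```python
import torch
from torch import nn

model = nn.Linear(10, 2)  # stand-in for the real model
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# LambdaLR multiplies the base LR by the returned factor each epoch:
# here, a 10x lower LR for epochs 0-19, then the full base LR afterwards.
scheduler = torch.optim.lr_scheduler.LambdaLR(
    optimizer,
    lr_lambda=lambda epoch: 0.1 if epoch < 20 else 1.0,
)

for epoch in range(100):
    # ... run the usual training/validation loop for one epoch here ...
    scheduler.step()  # advance the schedule once per epoch
```

If the validation curve smooths out with the lower initial LR, a gentler warmup or a smaller base LR is probably all that's needed.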
Answer selected by sondosaabed
Labels: Programming Help (programming languages, open source, and software development) · 2 participants