Hi,
do you happen to have any of your training/validation losses logged somewhere, and if so, is there any chance I could have that log?
I'd like to get a rough sense of whether the hyperparameter changes I make have a positive or negative impact on convergence speed, without doing a full training run. The background is that I have a more modern DGX available and can potentially train with more data, and I intend to make up some of the added cost by scaling the batch size and learning rate upwards. I'd just like to check whether my runs look roughly similar to yours in terms of training behavior, i.e. loss decrease per computation step.
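For context, the heuristic I had in mind for the scaling is the common linear scaling rule: grow the learning rate proportionally to the batch size, with a short warmup to keep the first steps stable. A minimal PyTorch sketch of what I mean (the base values, batch sizes, warmup length, and the stand-in model are my own assumptions, not values from this repository):

```python
import torch

# Assumed baseline settings -- replace with the values from the actual config.
BASE_BATCH_SIZE = 32
BASE_LR = 1e-4

# Target batch size for the larger DGX run.
new_batch_size = 256

# Linear scaling rule: scale the LR by the batch-size ratio.
scaled_lr = BASE_LR * (new_batch_size / BASE_BATCH_SIZE)

model = torch.nn.Linear(128, 10)  # stand-in for the real model
optimizer = torch.optim.Adam(model.parameters(), lr=scaled_lr)

# Short linear warmup so the larger LR doesn't destabilize early training.
WARMUP_STEPS = 500
scheduler = torch.optim.lr_scheduler.LambdaLR(
    optimizer, lambda step: min(1.0, (step + 1) / WARMUP_STEPS)
)
```

When comparing such a run against yours, I would plot loss against samples seen (step × batch size) rather than raw optimizer steps, so curves from runs with different batch sizes stay comparable.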
Best regards,
Frederic