Describe the feature
The optimize-step methods of `Trainer` should be refactored so that two separate functions are no longer needed for optimization with and without gradients. This will improve the code structure and scalability.
Additionally, hooks for customizing the optimize step should be provided, such as custom closure hooks, loss-balancing techniques, etc.
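One possible shape for the unified step, sketched in plain Python with illustrative names (`compute_loss`, `requires_closure`, and `closure_hooks` are assumptions, not the existing `Trainer` API): a single `optimize_step` builds one closure, runs user-supplied hooks on the loss (e.g. for loss balancing), and dispatches on whether the optimizer needs the closure itself.

```python
class Trainer:
    """Sketch of a single optimize step that unifies the closure and
    non-closure code paths (names here are illustrative, not the real API)."""

    def __init__(self, compute_loss, optimizer, closure_hooks=None):
        self.compute_loss = compute_loss   # callable(batch) -> loss
        self.optimizer = optimizer         # object exposing .step([closure])
        # Hooks can transform the loss inside the closure,
        # e.g. rescaling or combining terms for loss balancing.
        self.closure_hooks = closure_hooks or []

    def optimize_step(self, batch):
        def closure():
            loss = self.compute_loss(batch)
            for hook in self.closure_hooks:
                loss = hook(loss)
            return loss

        if getattr(self.optimizer, "requires_closure", False):
            # Optimizers such as L-BFGS call the closure themselves,
            # possibly several times per step.
            return self.optimizer.step(closure)
        # Plain optimizers: evaluate the closure once, then step.
        loss = closure()
        self.optimizer.step()
        return loss
```

With this structure, adding a new customization point (a hook, a loss transform) touches one method instead of two parallel implementations.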
It should be implemented because
No response
Additional context
No response
Would you like to work on this issue?
Yes