PyTorch Optimizers #5367
Conversation
```yaml
Title: 'Optimizers' # Required; the file name should be the same as the title, but lowercase, with dashes instead of spaces, and all punctuation removed
Description: 'A PyTorch optimizer is a tool that helps with the process of training a machine learning model. This involves adjusting the parameters of the model during training in order to minimize the error between the predicted output and the actual output.' # Required; ideally under 150 characters and starts with a noun (used in search engine results and content previews)
Subjects: # Please only use Subjects in the subjects.md file (https://github.com/Codecademy/docs/blob/main/documentation/subjects.md). If that list feels insufficient, feel free to create a new Subject and add it to subjects.md in your PR!
  - 'How to use an optimizer'
  - 'Constructing it'
  - 'Per-parameter options'
Tags: # Please only use Tags in the tags.md file (https://github.com/Codecademy/docs/blob/main/documentation/tags.md). If that list feels insufficient, feel free to create a new Tag and add it to tags.md in your PR!
  - 'PyTorch'
  - 'Optimizers'
CatalogContent: # Please use course/path landing page slugs, rather than linking to individual content items. If listing multiple items, please put the most relevant one first
  - 'Concept Entry'
  - 'content/pytorch'
```
Suggested change:

```yaml
Title: 'Optimizers'
Description: 'Helps adjust the model parameters during training to minimize the error between the predicted output and the actual output.'
Subjects:
  - 'Data Science'
  - 'Machine Learning'
  - 'AI'
Tags:
  - 'Deep Learning'
  - 'Libraries'
  - 'TensorFlow'
CatalogContent:
  - 'intro-to-py-torch-and-neural-networks'
  - 'paths/data-science'
  - 'paths/machine-learning'
```
**torch.optim** is a package implementing various optimization algorithms. The most commonly used methods are already supported, and the interface is general enough that more sophisticated ones can easily be integrated in the future.
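As a quick illustration of the "commonly used methods" point, several well-known algorithms ship as classes in `torch.optim` and all share the same `step()`/`zero_grad()` interface (the sample below is non-exhaustive):

```python
import torch.optim as optim

# A few of the built-in optimization algorithms in torch.optim;
# each is a class that can be constructed with model parameters
for name in ("SGD", "Adam", "AdamW", "RMSprop", "Adagrad"):
    print(name, hasattr(optim, name))  # each prints True
```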
Suggested change:

Optimizers help adjust the model parameters during training to minimize the error between the predicted output and the actual output. They use the gradients calculated through backpropagation to update the model in a direction that reduces this error, improving the model's performance over time.
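To make the gradient-update idea concrete, here is a minimal sketch of one hand-rolled update step, which is what an optimizer automates; the parameter value and learning rate are illustrative assumptions, not part of the entry:

```python
import torch

# A single learnable parameter (illustrative starting value)
w = torch.tensor(1.0, requires_grad=True)

# Loss with minimum at w = 3; backpropagation fills in w.grad
loss = (w - 3.0) ** 2
loss.backward()
print(w.grad)  # d/dw (w - 3)^2 at w = 1 is -4

# One plain gradient-descent step: w -= lr * w.grad
# (an SGD optimizer's step() does this for every parameter)
with torch.no_grad():
    w -= 0.1 * w.grad
print(w)  # moved from 1.0 toward the minimum at 3.0
```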
## How to use an optimizer

To use `torch.optim`, you have to construct an optimizer object that will hold the current state and update the parameters based on the computed gradients.

## Constructing it

To construct an optimizer, you have to give it an iterable containing the parameters (all should be `Variable`s) to optimize. Then, you can specify optimizer-specific options such as the learning rate, weight decay, etc.

Example:

```pseudo
optimizer = optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
optimizer = optim.Adam([var1, var2], lr=0.0001)
```
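Besides a flat iterable of parameters, PyTorch optimizers also accept a list of parameter groups, each a dict with its own options; this is how per-parameter settings such as different learning rates are configured. The two-layer model below is a hypothetical example:

```python
import torch.nn as nn
import torch.optim as optim

# Hypothetical two-part model used only to show parameter groups
model = nn.Sequential(nn.Linear(4, 8), nn.Linear(8, 2))

# Each dict is a parameter group; a group without an explicit 'lr'
# falls back to the default learning rate given to the optimizer
optimizer = optim.SGD(
    [
        {"params": model[0].parameters()},               # uses default lr (0.01)
        {"params": model[1].parameters(), "lr": 1e-3},   # overrides lr for this group
    ],
    lr=1e-2,
    momentum=0.9,
)

print([group["lr"] for group in optimizer.param_groups])  # [0.01, 0.001]
```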
Suggested change:

## How to use an optimizer

To use an optimizer, you must import the `torch.optim` package and construct an optimizer object that will hold the current state and update the parameters based on the computed gradients.

## Syntax

```pseudo
# Importing the package
import torch.optim as optim

# Instantiate the model that will be trained
model = YourModel()

# Instantiate the optimizer of choice, passing it the model's parameters
optimizer = optim.OptimizerName(model.parameters())

# Training the model
for epoch in range(num_epochs):
    # Steps for training the model using an optimizer
```
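Filling in the steps sketched above, here is a minimal runnable version; the toy data, model, and hyperparameters are illustrative assumptions, not part of the entry:

```python
import torch
import torch.nn as nn
import torch.optim as optim

torch.manual_seed(0)

# Toy data for the line y = 2x (illustrative only)
x = torch.linspace(-1, 1, 32).unsqueeze(1)
y = 2 * x

model = nn.Linear(1, 1)                 # the model that will be trained
loss_fn = nn.MSELoss()
optimizer = optim.SGD(model.parameters(), lr=0.1)

for epoch in range(100):
    optimizer.zero_grad()               # clear gradients from the previous step
    loss = loss_fn(model(x), y)         # forward pass and loss computation
    loss.backward()                     # backpropagation computes the gradients
    optimizer.step()                    # the optimizer updates the parameters

print(loss.item())  # loss shrinks toward 0 as the weight approaches 2
```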
Looks good, @Beto-Garcia!
👋 @Beto-Garcia 🎉 Your contribution(s) can be seen here: https://www.codecademy.com/resources/docs/pytorch/optimizers Please note it may take a little while for changes to become visible.
Description

New Concept Entry Created: `content/pytorch/concepts/optimizers/optimizers.md`

Issue Solved

Closes #5234

Type of Change

Checklist

- The changes are made against the `main` branch.
- The issue is linked in the Issues Solved section.