
PyTorch Optimizers #5367

Merged 12 commits into Codecademy:main on Nov 6, 2024

Conversation

Beto-Garcia (Contributor)

Description

New Concept Entry Created: content/pytorch/concepts/optimizers/optimizers.md

Issue Solved

Closes #5234

Type of Change

  • Adding a new entry

Checklist

  • All writings are my own.
  • My entry follows the Codecademy Docs style guide.
  • My changes generate no new warnings.
  • I have performed a self-review of my own writing and code.
  • I have checked my entry and corrected any misspellings.
  • I have made corresponding changes to the documentation if needed.
  • I have confirmed my changes are not being pushed from my forked main branch.
  • I have confirmed that I'm pushing from a new branch named after the changes I'm making.
  • I have linked any issues that are relevant to this PR in the Issues Solved section.

@cigar-galaxy82 cigar-galaxy82 self-assigned this Oct 4, 2024
@cigar-galaxy82 cigar-galaxy82 added new entry New entry or entries status: under review Issue or PR is currently being reviewed pytorch PyTorch labels Oct 12, 2024
Comment on lines 2 to 13
Title: 'Optimizers' # Required; the file name should be the same as the title, but lowercase, with dashes instead of spaces, and all punctuation removed
Description: 'A PyTorch optimizer is a tool that helps with the process of training a machine learning model. This involves adjusting the parameters of the model during training in order to minimize the error between the predicted output and the actual output.' # Required; ideally under 150 characters and starts with a noun (used in search engine results and content previews)
Subjects: # Please only use Subjects in the subjects.md file (https://github.com/Codecademy/docs/blob/main/documentation/subjects.md). If that list feels insufficient, feel free to create a new Subject and add it to subjects.md in your PR!
- 'How to use an optimizer'
- 'Constructing it'
- 'Per-parameter options'
Tags: # Please only use Tags in the tags.md file (https://github.com/Codecademy/docs/blob/main/documentation/tags.md). If that list feels insufficient, feel free to create a new Tag and add it to tags.md in your PR!
- 'PyTorch'
- 'Optimizers'
CatalogContent: # Please use course/path landing page slugs, rather than linking to individual content items. If listing multiple items, please put the most relevant one first
- 'Concept Entry'
- 'content/pytorch'
Suggested change
Title: 'Optimizers' # Required; the file name should be the same as the title, but lowercase, with dashes instead of spaces, and all punctuation removed
Description: 'A PyTorch optimizer is a tool that helps with the process of training a machine learning model. This involves adjusting the parameters of the model during training in order to minimize the error between the predicted output and the actual output.' # Required; ideally under 150 characters and starts with a noun (used in search engine results and content previews)
Subjects: # Please only use Subjects in the subjects.md file (https://github.com/Codecademy/docs/blob/main/documentation/subjects.md). If that list feels insufficient, feel free to create a new Subject and add it to subjects.md in your PR!
- 'How to use an optimizer'
- 'Constructing it'
- 'Per-parameter options'
Tags: # Please only use Tags in the tags.md file (https://github.com/Codecademy/docs/blob/main/documentation/tags.md). If that list feels insufficient, feel free to create a new Tag and add it to tags.md in your PR!
- 'PyTorch'
- 'Optimizers'
CatalogContent: # Please use course/path landing page slugs, rather than linking to individual content items. If listing multiple items, please put the most relevant one first
- 'Concept Entry'
- 'content/pytorch'
Title: 'Optimizers'
Description: 'Helps adjust the model parameters during training to minimize the error between the predicted output and the actual output.'
Subjects:
- 'Data Science'
- 'Machine Learning'
- 'AI'
Tags:
- 'Deep Learning'
- 'Libraries'
- 'TensorFlow'
CatalogContent:
- 'intro-to-py-torch-and-neural-networks'
- 'paths/data-science'
- 'paths/machine-learning'

Comment on lines 16 to 18
**torch.optim** is a package implementing various optimization algorithms.

Most commonly used methods are already supported, and the interface is general enough, so that more sophisticated ones can also be easily integrated in the future.
Suggested change
**torch.optim** is a package implementing various optimization algorithms.
Most commonly used methods are already supported, and the interface is general enough, so that more sophisticated ones can also be easily integrated in the future.
Optimizers help adjust the model parameters during training to minimize the error between the predicted output and the actual output. They use the gradients calculated through backpropagation to update the model in a direction that reduces this error, improving the model's performance over time.
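The mechanism described in the suggested text can be sketched in a few lines of plain Python, with no PyTorch required: a single parameter `w` is repeatedly nudged against the gradient of the squared error, which is exactly the update an optimizer's step performs. The `train` helper and the data are illustrative, not from the entry under review.

```python
# Minimal gradient-descent sketch: fit w in y = w * x by stepping w
# against the gradient of the mean squared error, mirroring what an
# optimizer's step does with gradients from backpropagation.

def train(xs, ys, lr=0.1, epochs=50):
    w = 0.0  # initial parameter value
    for _ in range(epochs):
        # d/dw of mean((w*x - y)^2) is mean(2 * (w*x - y) * x)
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
        w -= lr * grad  # move w in the direction that reduces the error
    return w

xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]  # generated by y = 2 * x
w = train(xs, ys)
print(round(w, 3))  # converges toward 2.0
```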

Comment on lines 20 to 32
## How to use an optimizer

To use ```torch.optim``` you have to construct an optimizer object that will hold the current state and will update the parameters based on the computed gradients.

## Constructing it

To construct an Optimizer you have to give it an iterable containing the parameters (all should be Variable s) to optimize. Then, you can specify optimizer-specific options such as the learning rate, weight decay, etc.

Example:
```codebyte/js
optimizer = optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
optimizer = optim.Adam([var1, var2], lr=0.0001)
```
Suggested change
## How to use an optimizer
To use ```torch.optim``` you have to construct an optimizer object that will hold the current state and will update the parameters based on the computed gradients.
## Constructing it
To construct an Optimizer you have to give it an iterable containing the parameters (all should be Variable s) to optimize. Then, you can specify optimizer-specific options such as the learning rate, weight decay, etc.
Example:
```codebyte/js
optimizer = optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
optimizer = optim.Adam([var1, var2], lr=0.0001)
```
## How to use an optimizer
To use an optimizer, import the `torch.optim` package and construct an optimizer object that will hold the current state and update the parameters based on the computed gradients.
## Syntax
```pseudo
# Import the package
import torch.optim as optim

# Instantiate the model to be trained
model = YourModel()

# Instantiate the optimizer of choice, passing in the model's parameters
optimizer = optim.OptimizerName(model.parameters())

# Train the model
for epoch in range(num_epochs):
    # Steps for training the model using the optimizer
```

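The loop pattern in the suggested syntax can be made concrete with a hand-rolled stand-in for an optimizer, written in plain Python so it runs without PyTorch. `SimpleSGD` here is a hypothetical class that only mimics the `zero_grad()`/`step()` interface of `torch.optim` optimizers; the gradient is computed by hand in place of backpropagation.

```python
# Hand-rolled sketch of the torch.optim usage pattern: construct an
# optimizer over the parameters, then zero_grad / compute loss /
# set gradients / step inside the training loop.

class SimpleSGD:
    def __init__(self, params, lr):
        self.params = params  # list of dicts: {"value": float, "grad": float}
        self.lr = lr

    def zero_grad(self):
        for p in self.params:
            p["grad"] = 0.0

    def step(self):
        for p in self.params:
            p["value"] -= self.lr * p["grad"]

# One parameter w, fitting y = w * x to the single point (x=3, y=6)
w = {"value": 0.0, "grad": 0.0}
optimizer = SimpleSGD([w], lr=0.05)

for epoch in range(100):
    optimizer.zero_grad()
    pred = w["value"] * 3.0
    loss = (pred - 6.0) ** 2
    w["grad"] = 2 * (pred - 6.0) * 3.0  # gradient set by hand (no autograd)
    optimizer.step()

print(round(w["value"], 3))  # approaches 2.0
```

In real PyTorch code, `loss.backward()` would populate the gradients that `step()` consumes.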
@Sriparno08 Sriparno08 self-assigned this Oct 26, 2024
@Sriparno08 Sriparno08 added status: under review Issue or PR is currently being reviewed and removed status: ready for next review labels Oct 26, 2024
@Sriparno08 Sriparno08 (Collaborator) left a comment
Looks good, @Beto-Garcia!

@Sriparno08 Sriparno08 merged commit c21db67 into Codecademy:main Nov 6, 2024
6 checks passed

github-actions bot commented Nov 6, 2024

👋 @Beto-Garcia
You have contributed to Codecademy Docs, and we would like to know more about you and your experience.
Please take a minute to fill out this four question survey to help us better understand Docs contributions and how we can improve the experience for you and our learners.
Thank you for your help!

🎉 Your contribution(s) can be seen here:

https://www.codecademy.com/resources/docs/pytorch/optimizers

Please note it may take a little while for changes to become visible.
If you're appearing as anonymous and want to be credited, see here.

@Sriparno08 Sriparno08 added status: review 2️⃣ completed and removed status: under review Issue or PR is currently being reviewed labels Nov 6, 2024
Linked issue: [Concept Entry] PyTorch Optimizers
3 participants