typo
Kartikay Khandelwal committed Apr 10, 2024
1 parent e812a07 commit e043dc6
Showing 1 changed file with 2 additions and 2 deletions.
torchtune/utils/_checkpointing/_checkpointer.py
@@ -261,7 +261,7 @@ class FullModelHFCheckpointer(_CheckpointerInterface):
     the Llama-2-7b-hf model from the meta-llama repo (https://huggingface.co/meta-llama/Llama-2-7b-hf)
     A few notes about the checkpoint reading logic:
-    - HF checkpoint names usually oredered by ID (eg: 0001_of_0003, 0002_of_0003, etc.) To ensure
+    - HF checkpoint names usually ordered by ID (eg: 0001_of_0003, 0002_of_0003, etc.) To ensure
     we read the files in the right order, we sort the checkpoint file names before reading
     - Checkpoint conversion to and from HF's format requires access to model params which are
     read directly from the "config.json" file. This helps ensure we either load the weights

@@ -574,7 +574,7 @@ def save_checkpoint(
     """
     Save TorchTune checkpoint to file. If ``intermediate_checkpoint`` is True, an additional
     checkpoint file ``recipe_state.pt`` is created in ``_output_dir`` which contains the recipe
-    state. The output state dicts have the following formats:
+    state.
     Args:
         state_dict (Dict[str, Any]): Checkpoint state dict to be written out to file
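For context, the sorting behavior described in the first hunk's docstring can be illustrated with a short sketch. This is not torchtune's actual implementation; the function name, directory argument, and filename pattern below are hypothetical, and the sketch only shows how zero-padded shard IDs (e.g. 0001_of_0003) let a plain lexicographic sort return sharded HF checkpoint files in reading order.

from pathlib import Path
from typing import List

def sorted_checkpoint_files(checkpoint_dir: str, pattern: str = "*.bin") -> List[Path]:
    # Hypothetical helper, not part of torchtune: collect sharded checkpoint
    # files and return them in ascending shard-ID order.
    files = Path(checkpoint_dir).glob(pattern)
    # Zero-padded shard IDs ("0001_of_0003", "0002_of_0003", ...) sort
    # correctly with an ordinary lexicographic sort on the filename.
    return sorted(files, key=lambda p: p.name)

# Illustrative usage: shards named like pytorch_model-0001_of_0003.bin in
# "checkpoints/" come back as 0001, 0002, 0003.
# shard_paths = sorted_checkpoint_files("checkpoints")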
