auto-generating sphinx docs
pytorchbot committed Nov 5, 2024
1 parent 2d7cc98 commit efb57e8
Showing 7 changed files with 16 additions and 16 deletions.
Original file line number Diff line number Diff line change
@@ -540,7 +540,7 @@ <h1>Source code for torchtune.training.checkpointing._checkpointer</h1><div clas
<span class="sd"> checkpoint_dir (str): Directory containing the checkpoint files</span>
<span class="sd"> checkpoint_files (List[str]): List of checkpoint files to load. Since the checkpointer takes care</span>
<span class="sd"> of sorting by file ID, the order in this list does not matter</span>
-<span class="sd"> model_type (ModelType): Model type of the model for which the checkpointer is being loaded</span>
+<span class="sd"> model_type (str): Model type of the model for which the checkpointer is being loaded</span>
<span class="sd"> output_dir (str): Directory to save the checkpoint files</span>
<span class="sd"> adapter_checkpoint (Optional[str]): Path to the adapter weights. Default is None</span>
<span class="sd"> recipe_checkpoint (Optional[str]): Path to the recipe state checkpoint file. Default is None</span>
@@ -559,7 +559,7 @@ <h1>Source code for torchtune.training.checkpointing._checkpointer</h1><div clas
<span class="bp">self</span><span class="p">,</span>
<span class="n">checkpoint_dir</span><span class="p">:</span> <span class="nb">str</span><span class="p">,</span>
<span class="n">checkpoint_files</span><span class="p">:</span> <span class="n">List</span><span class="p">[</span><span class="nb">str</span><span class="p">],</span>
-<span class="n">model_type</span><span class="p">:</span> <span class="n">ModelType</span><span class="p">,</span>
+<span class="n">model_type</span><span class="p">:</span> <span class="nb">str</span><span class="p">,</span>
<span class="n">output_dir</span><span class="p">:</span> <span class="nb">str</span><span class="p">,</span>
<span class="n">adapter_checkpoint</span><span class="p">:</span> <span class="n">Optional</span><span class="p">[</span><span class="nb">str</span><span class="p">]</span> <span class="o">=</span> <span class="kc">None</span><span class="p">,</span>
<span class="n">recipe_checkpoint</span><span class="p">:</span> <span class="n">Optional</span><span class="p">[</span><span class="nb">str</span><span class="p">]</span> <span class="o">=</span> <span class="kc">None</span><span class="p">,</span>
@@ -588,7 +588,7 @@ <h1>Source code for torchtune.training.checkpointing._checkpointer</h1><div clas
<span class="p">)</span>

<span class="bp">self</span><span class="o">.</span><span class="n">_resume_from_checkpoint</span> <span class="o">=</span> <span class="n">resume_from_checkpoint</span>
-<span class="bp">self</span><span class="o">.</span><span class="n">_model_type</span> <span class="o">=</span> <span class="n">model_type</span>
+<span class="bp">self</span><span class="o">.</span><span class="n">_model_type</span> <span class="o">=</span> <span class="n">ModelType</span><span class="p">[</span><span class="n">model_type</span><span class="p">]</span>
<span class="bp">self</span><span class="o">.</span><span class="n">_output_dir</span> <span class="o">=</span> <span class="n">Path</span><span class="p">(</span><span class="n">output_dir</span><span class="p">)</span>

<span class="c1"># recipe_checkpoint contains the recipe state. This should be available if</span>
@@ -751,7 +751,7 @@ <h1>Source code for torchtune.training.checkpointing._checkpointer</h1><div clas
<span class="sd"> checkpoint_dir (str): Directory containing the checkpoint files</span>
<span class="sd"> checkpoint_files (Union[List[str], Dict[str, str]]): List of checkpoint files to load. Since the checkpointer takes care</span>
<span class="sd"> of sorting by file ID, the order in this list does not matter. TODO: update this</span>
-<span class="sd"> model_type (ModelType): Model type of the model for which the checkpointer is being loaded</span>
+<span class="sd"> model_type (str): Model type of the model for which the checkpointer is being loaded</span>
<span class="sd"> output_dir (str): Directory to save the checkpoint files</span>
<span class="sd"> adapter_checkpoint (Optional[str]): Path to the adapter weights. Default is None</span>
<span class="sd"> recipe_checkpoint (Optional[str]): Path to the recipe state checkpoint file. Default is None</span>
@@ -767,7 +767,7 @@ <h1>Source code for torchtune.training.checkpointing._checkpointer</h1><div clas
<span class="bp">self</span><span class="p">,</span>
<span class="n">checkpoint_dir</span><span class="p">:</span> <span class="nb">str</span><span class="p">,</span>
<span class="n">checkpoint_files</span><span class="p">:</span> <span class="n">Union</span><span class="p">[</span><span class="n">List</span><span class="p">[</span><span class="nb">str</span><span class="p">],</span> <span class="n">Dict</span><span class="p">[</span><span class="nb">str</span><span class="p">,</span> <span class="nb">str</span><span class="p">]],</span>
-<span class="n">model_type</span><span class="p">:</span> <span class="n">ModelType</span><span class="p">,</span>
+<span class="n">model_type</span><span class="p">:</span> <span class="nb">str</span><span class="p">,</span>
<span class="n">output_dir</span><span class="p">:</span> <span class="nb">str</span><span class="p">,</span>
<span class="n">adapter_checkpoint</span><span class="p">:</span> <span class="n">Optional</span><span class="p">[</span><span class="nb">str</span><span class="p">]</span> <span class="o">=</span> <span class="kc">None</span><span class="p">,</span>
<span class="n">recipe_checkpoint</span><span class="p">:</span> <span class="n">Optional</span><span class="p">[</span><span class="nb">str</span><span class="p">]</span> <span class="o">=</span> <span class="kc">None</span><span class="p">,</span>
@@ -1152,7 +1152,7 @@ <h1>Source code for torchtune.training.checkpointing._checkpointer</h1><div clas
<span class="sd"> checkpoint_dir (str): Directory containing the checkpoint files</span>
<span class="sd"> checkpoint_files (List[str]): List of checkpoint files to load. Currently this checkpointer only</span>
<span class="sd"> supports loading a single checkpoint file.</span>
-<span class="sd"> model_type (ModelType): Model type of the model for which the checkpointer is being loaded</span>
+<span class="sd"> model_type (str): Model type of the model for which the checkpointer is being loaded</span>
<span class="sd"> output_dir (str): Directory to save the checkpoint files</span>
<span class="sd"> adapter_checkpoint (Optional[str]): Path to the adapter weights. Default is None</span>
<span class="sd"> recipe_checkpoint (Optional[str]): Path to the recipe state checkpoint file. Default is None</span>
@@ -1168,7 +1168,7 @@ <h1>Source code for torchtune.training.checkpointing._checkpointer</h1><div clas
<span class="bp">self</span><span class="p">,</span>
<span class="n">checkpoint_dir</span><span class="p">:</span> <span class="nb">str</span><span class="p">,</span>
<span class="n">checkpoint_files</span><span class="p">:</span> <span class="n">List</span><span class="p">[</span><span class="nb">str</span><span class="p">],</span>
-<span class="n">model_type</span><span class="p">:</span> <span class="n">ModelType</span><span class="p">,</span>
+<span class="n">model_type</span><span class="p">:</span> <span class="nb">str</span><span class="p">,</span>
<span class="n">output_dir</span><span class="p">:</span> <span class="nb">str</span><span class="p">,</span>
<span class="n">adapter_checkpoint</span><span class="p">:</span> <span class="n">Optional</span><span class="p">[</span><span class="nb">str</span><span class="p">]</span> <span class="o">=</span> <span class="kc">None</span><span class="p">,</span>
<span class="n">recipe_checkpoint</span><span class="p">:</span> <span class="n">Optional</span><span class="p">[</span><span class="nb">str</span><span class="p">]</span> <span class="o">=</span> <span class="kc">None</span><span class="p">,</span>
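The substantive change in the diff above is that the checkpointer constructors now accept `model_type` as a plain string and resolve it internally with `ModelType[model_type]`. As a minimal sketch of that lookup pattern (the `ModelType` members below are an illustrative subset, not torchtune's full enum):

```python
from enum import Enum

class ModelType(Enum):
    # Illustrative subset; torchtune defines many more members.
    LLAMA2 = "llama2"
    LLAMA3 = "llama3"

# Bracket indexing on an Enum class looks a member up BY NAME,
# which is what ModelType[model_type] in the new constructor does.
member = ModelType["LLAMA2"]
print(member)  # ModelType.LLAMA2

# An unrecognized name raises KeyError, so a bad config fails fast
# at checkpointer construction time rather than deep in loading.
try:
    ModelType["NOT_A_MODEL"]
except KeyError:
    print("unknown model type rejected")
```

One consequence of this design is that user configs can stay pure YAML/strings, while the rest of the code still works with the typed enum.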
2 changes: 1 addition & 1 deletion main/_sources/deep_dives/checkpointer.rst.txt
@@ -443,7 +443,7 @@ For this section we'll use the Llama2 13B model in HF format.
checkpoint_dir=checkpoint_dir,
checkpoint_files=pytorch_files,
output_dir=checkpoint_dir,
-    model_type=ModelType.LLAMA2
+    model_type="LLAMA2"
)
torchtune_sd = checkpointer.load_checkpoint()
2 changes: 1 addition & 1 deletion main/deep_dives/checkpointer.html
@@ -847,7 +847,7 @@ <h2>Putting this all together<a class="headerlink" href="#putting-this-all-toget
<span class="n">checkpoint_dir</span><span class="o">=</span><span class="n">checkpoint_dir</span><span class="p">,</span>
<span class="n">checkpoint_files</span><span class="o">=</span><span class="n">pytorch_files</span><span class="p">,</span>
<span class="n">output_dir</span><span class="o">=</span><span class="n">checkpoint_dir</span><span class="p">,</span>
-<span class="n">model_type</span><span class="o">=</span><span class="n">ModelType</span><span class="o">.</span><span class="n">LLAMA2</span>
+<span class="n">model_type</span><span class="o">=</span><span class="s2">&quot;LLAMA2&quot;</span>
<span class="p">)</span>
<span class="n">torchtune_sd</span> <span class="o">=</span> <span class="n">checkpointer</span><span class="o">.</span><span class="n">load_checkpoint</span><span class="p">()</span>

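Caller-side, the docs diffs above switch from passing the enum member to passing its name as a string. A hypothetical stand-in class (not torchtune's actual implementation, which also handles adapter and recipe checkpoints) showing the before/after call pattern:

```python
from enum import Enum
from pathlib import Path
from typing import List

class ModelType(Enum):
    # Illustrative subset of torchtune's ModelType enum.
    LLAMA2 = "llama2"

class CheckpointerSketch:
    """Hypothetical stand-in for the checkpointers in this commit,
    reduced to the constructor behavior that changed."""
    def __init__(
        self,
        checkpoint_dir: str,
        checkpoint_files: List[str],
        model_type: str,  # was: model_type: ModelType
        output_dir: str,
    ):
        self._checkpoint_dir = Path(checkpoint_dir)
        self._checkpoint_files = checkpoint_files
        # New: resolve the string to the enum internally.
        self._model_type = ModelType[model_type]  # was: = model_type
        self._output_dir = Path(output_dir)

checkpointer = CheckpointerSketch(
    checkpoint_dir="/tmp/llama2-13b",
    checkpoint_files=["pytorch_model-00001-of-00003.bin"],
    model_type="LLAMA2",  # was: model_type=ModelType.LLAMA2
    output_dir="/tmp/llama2-13b",
)
assert checkpointer._model_type is ModelType.LLAMA2
```

The directory and file names here are placeholders; the point is only that callers no longer need to import `ModelType` to construct a checkpointer.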