Commit b2003d5: formatting fixes

Ssukriti committed Apr 8, 2024
1 parent cacea88
Showing 1 changed file with 8 additions and 3 deletions: README.md
To summarize, you can pick either python for singleGPU jobs or use accelerate launch …

### LoRA Tuning Example

Set peft_method = "lora". You can additionally pass any arguments from [LoraConfig](https://github.com/foundation-model-stack/fms-hf-tuning/blob/main/tuning/config/peft_config.py#L21).
```python
# Args you can pass
r: int = 8
lora_alpha: int = 32
target_modules: List[str] = field(
    ...  # remaining arguments collapsed in the diff view
)
lora_dropout: float = 0.05

```
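The fields above belong to a Python dataclass. As a rough illustration of how such a config behaves, here is a minimal stand-in (a simplified sketch, not the actual `LoraConfig` class; the `target_modules` default shown is an assumption):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class LoraConfigSketch:
    # Simplified stand-in mirroring the field names in the excerpt above;
    # the target_modules default is an assumption for illustration.
    r: int = 8
    lora_alpha: int = 32
    target_modules: List[str] = field(default_factory=lambda: ["q_proj", "v_proj"])
    lora_dropout: float = 0.05

# Any field can be overridden at construction time, which is what the
# CLI arguments ultimately do:
cfg = LoraConfigSketch(r=16, lora_dropout=0.1)
print(cfg.r, cfg.lora_alpha, cfg.target_modules)
```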
Example command to run:

```bash
python tuning/sft_trainer.py \
    ...
```

You can specify attention or linear layers. With the CLI, you can specify layers …
### Prompt Tuning

Set peft_method = 'pt'. You can additionally pass any arguments from [PromptTuningConfig](https://github.com/foundation-model-stack/fms-hf-tuning/blob/main/tuning/config/peft_config.py#L39).
```python
# prompt_tuning_init can be either "TEXT" or "RANDOM"
prompt_tuning_init: str = "TEXT"
num_virtual_tokens: int = 8
...
tokenizer_name_or_path: str = "llama-7b-hf"
```
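As with LoRA, these fields belong to a dataclass. A minimal sketch (a simplified stand-in for `PromptTuningConfig`, with the TEXT/RANDOM constraint from the comment above made explicit; the validation is added here for illustration):

```python
from dataclasses import dataclass

@dataclass
class PromptTuningConfigSketch:
    # Simplified stand-in mirroring the field names in the excerpt above.
    prompt_tuning_init: str = "TEXT"
    num_virtual_tokens: int = 8
    tokenizer_name_or_path: str = "llama-7b-hf"

    def __post_init__(self):
        # prompt_tuning_init can be either "TEXT" or "RANDOM", per the excerpt.
        if self.prompt_tuning_init not in ("TEXT", "RANDOM"):
            raise ValueError("prompt_tuning_init must be 'TEXT' or 'RANDOM'")

cfg = PromptTuningConfigSketch(prompt_tuning_init="RANDOM", num_virtual_tokens=16)
print(cfg.prompt_tuning_init, cfg.num_virtual_tokens)
```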

Example command you can run:

```bash

accelerate launch \
    ...
tuning/sft_trainer.py \
    ...
```

### Fine Tuning

Set peft_method = 'None'

Full fine tuning needs more compute resources, so it is advised to use the multiGPU method.
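That resource gap can be made concrete with back-of-the-envelope arithmetic comparing trainable parameter counts under full fine tuning versus LoRA; the model shape below (4096 hidden size, 32 layers, two adapted projections per layer) is an illustrative assumption, not taken from this README:

```python
# Back-of-the-envelope: trainable parameters, full fine tuning vs. LoRA.
# The model shape here is an illustrative assumption.
hidden = 4096                  # hidden dimension
layers = 32                    # transformer layers
adapted_per_layer = 2          # e.g. q_proj and v_proj
r = 8                          # LoRA rank, as in the config above

# Full fine tuning trains every weight of the adapted matrices;
# LoRA trains only the low-rank factors A (r x d) and B (d x r).
full_params = layers * adapted_per_layer * hidden * hidden
lora_params = layers * adapted_per_layer * 2 * r * hidden

print(f"full: {full_params:,}  lora: {lora_params:,}  "
      f"ratio: {full_params // lora_params}x")
```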
```bash

accelerate launch \
    ...
```
