Following the steps in the README to run `finetune_llama-2-7b-32k-mqa.sh`, I got the error below:
```
Traceback (most recent call last):
  File "/home/ubuntu/training/OpenChatKit/training/dist_clm_train.py", line 478, in <module>
    main()
  File "/home/ubuntu/training/OpenChatKit/training/dist_clm_train.py", line 443, in main
    pipe = get_pp_module(args, config, device, use_dp)
  File "/home/ubuntu/training/OpenChatKit/training/pipeline_parallel/dist_pp_utils.py", line 7, in get_pp_module
    return GpipeAsync(args, config, device, use_dp)
  File "/home/ubuntu/training/OpenChatKit/training/pipeline_parallel/dist_gpipe_pipeline_async.py", line 197, in __init__
    self.model = _StageMiddle(args, config, device)
  File "/home/ubuntu/training/OpenChatKit/training/modules/dist_gpt_pp_module.py", line 130, in __init__
    super(GPTStageMiddle, self).__init__(args, config)
  File "/home/ubuntu/training/OpenChatKit/training/modules/dist_gpt_pp_module.py", line 35, in __init__
    from .llama_modules import GPTEmbeddings, GPTBlock, GPTLMHead
  File "/home/ubuntu/training/OpenChatKit/training/modules/llama_modules.py", line 37, in <module>
    from flash_attn.layers.rotary import (
ModuleNotFoundError: No module named 'flash_attn'
```
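The traceback indicates that the `flash_attn` package is not present in the Python environment running the training script. A minimal sketch to confirm this, assuming it is run in the same environment that launches `finetune_llama-2-7b-32k-mqa.sh` (the PyPI package name is `flash-attn`):

```python
# Quick check for the missing dependency reported in the traceback.
# Assumption: run this in the same Python environment used by
# finetune_llama-2-7b-32k-mqa.sh.
import importlib.util

if importlib.util.find_spec("flash_attn") is None:
    # flash_attn is not installed; installing it (typically
    # `pip install flash-attn`, which needs a compatible CUDA toolchain)
    # is the usual prerequisite before llama_modules.py can import it.
    print("flash_attn is missing from this environment")
else:
    # Same import path that fails in llama_modules.py, line 37.
    import flash_attn.layers.rotary  # noqa: F401
    print("flash_attn is available")
```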