Fine-tuning qwen2.5-1.5b on the same data is much worse than qwen2-1.5b #1010
Unanswered · 1215thebqtic asked this question in Q&A
After LoRA fine-tuning both qwen2.5-1.5b and qwen2-1.5b on the same text-correction data, qwen2.5 performs much worse than qwen2: recall is essentially the same, but precision is far lower, and the model often makes unnecessary edits to text that did not need correction. What could be the cause, and what can I do to improve qwen2.5's precision? Thanks!
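For context on the symptom described above (recall unchanged, precision down), here is a minimal sketch of a sentence-level precision/recall check for correction outputs. It is an assumed evaluation scheme, not the project's official metric script: any edit to an already-correct sentence counts as a false positive, which is exactly the kind of behavior that lowers precision while leaving recall intact.

```python
# Minimal sketch (assumed sentence-level evaluation, not the repo's official eval):
# a "positive" is any sentence the model edits. Unnecessary edits on sentences
# that needed no correction show up as false positives and drag precision down.

def correction_prf(sources, references, predictions):
    """Compute sentence-level precision, recall, and F1 for correction outputs."""
    tp = fp = fn = 0
    for src, ref, pred in zip(sources, references, predictions):
        changed = pred != src       # model proposed an edit
        needs_fix = ref != src      # gold reference says an edit was needed
        if changed and needs_fix and pred == ref:
            tp += 1                 # correct edit on a sentence that needed one
        elif changed and (not needs_fix or pred != ref):
            fp += 1                 # unnecessary or wrong edit
        elif not changed and needs_fix:
            fn += 1                 # missed an error
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1


if __name__ == "__main__":
    srcs = ["他明天要出查。", "我们一起吃饭。"]
    refs = ["他明天要出差。", "我们一起吃饭。"]
    preds = ["他明天要出差。", "我们一起去吃饭。"]  # second edit is unnecessary -> lowers precision only
    print(correction_prf(srcs, refs, preds))       # precision 0.5, recall 1.0
```

Running this on held-out data for both fine-tuned checkpoints can confirm whether the precision gap really comes from spurious edits on clean sentences, which would point toward data balance (too few "no change needed" examples) or decoding settings rather than the base model itself.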