Hello.
I would like to proceed with fine-tuning on non-English data.
I referred to the fine-tuning guide you gave me before (#70),
and I also referred to this person's post (#157).
We found that the fine-tuning code loads and uses the existing tokenizer, and the tokenizer is never retrained or extended with the new data.
As a result, even after fine-tuning, the model produces inconsistent outputs on non-English data.
I would appreciate any advice on how to handle this.
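For reference, here is a minimal sketch of one common workaround: extending the existing tokenizer with tokens that cover the non-English corpus and resizing the model's embedding matrix before fine-tuning. This assumes a Hugging Face-style tokenizer and model, which may not match this repository's stack; the checkpoint name and the token list are placeholders, not values from this project.

```python
# Sketch only: assumes a Hugging Face tokenizer/model; "base-model" and the
# token list are hypothetical placeholders, not names from this repository.
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("base-model")      # placeholder checkpoint
model = AutoModelForCausalLM.from_pretrained("base-model")   # placeholder checkpoint

# Collect pieces from the non-English corpus that the existing tokenizer
# does not cover, then register them as new tokens.
new_tokens = ["예시", "토큰"]                                  # hypothetical examples
num_added = tokenizer.add_tokens(new_tokens)

# Grow the embedding matrix so the new token ids get trainable rows;
# the new rows are randomly initialized and learned during fine-tuning.
if num_added > 0:
    model.resize_token_embeddings(len(tokenizer))
```

Whether this is appropriate here depends on how the fine-tuning script expects the tokenizer to be built, so I would still appreciate guidance on the intended approach.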