
Can I serve a SpeechBrain-trained Whisper model with faster-whisper? #1139

Open
cod3r0k opened this issue Nov 14, 2024 · 5 comments

Comments


cod3r0k commented Nov 14, 2024

Can I serve a SpeechBrain-trained Whisper model with faster-whisper?

@MahmoudAshraf97 (Collaborator)

You have to convert it to CT2 first. There are several converters available; check the CT2 documentation for more information.
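For a Whisper checkpoint already in Hugging Face format, the conversion step can be sketched with CTranslate2's Transformers converter CLI. A minimal sketch; the model id, output directory, and quantization choice below are placeholders, not values from this thread:

```shell
# Convert a Hugging Face Whisper checkpoint to CTranslate2 format.
# --model accepts a Hub id or a local directory (placeholder values here).
ct2-transformers-converter --model openai/whisper-large-v2 \
    --output_dir whisper-large-v2-ct2 \
    --copy_files tokenizer.json preprocessor_config.json \
    --quantization float16
```

The `--copy_files` option carries the tokenizer and preprocessor configs into the output directory so faster-whisper can load them alongside the converted weights.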


cod3r0k commented Nov 14, 2024

Great, can you help me more? What is CT2? @MahmoudAshraf97

@MahmoudAshraf97 (Collaborator)

The backend of Faster Whisper
https://github.com/OpenNMT/CTranslate2/


cod3r0k commented Nov 15, 2024

Great. You mean that I should do as below:

from transformers import WhisperProcessor, WhisperForConditionalGeneration

# Load the Whisper model from Hugging Face
processor = WhisperProcessor.from_pretrained("openai/whisper-large")
model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-large")

# Save the model in Hugging Face format
model.save_pretrained("whisper_huggingface")
processor.save_pretrained("whisper_huggingface")

Then

ct2-transformers-converter --model whisper_huggingface --output_dir whisper_ctranslate2

Am I doing it correctly?

@MahmoudAshraf97 (Collaborator)

Exactly. If the model you have is not in Hugging Face format, you need to convert it to that format first, then to CT2.
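Once converted, the CT2 output directory can be loaded directly by faster-whisper. A minimal sketch; the directory name follows the converter invocation above, and the audio file name, device, and compute type are placeholder assumptions:

```python
# Minimal sketch: serve a CTranslate2-converted Whisper model with faster-whisper.
# "whisper_ctranslate2" is the converter's output directory; "audio.wav" is a
# placeholder input file.
MODEL_DIR = "whisper_ctranslate2"

def transcribe(audio_path: str, model_dir: str = MODEL_DIR) -> str:
    # Import deferred so the sketch can be read/imported without the package installed.
    from faster_whisper import WhisperModel

    # Loads the CT2 model from a local directory; device/compute_type are assumptions.
    model = WhisperModel(model_dir, device="cpu", compute_type="int8")

    # transcribe() returns a lazy generator of segments plus transcription info.
    segments, info = model.transcribe(audio_path, beam_size=5)
    return " ".join(segment.text.strip() for segment in segments)

if __name__ == "__main__":
    print(transcribe("audio.wav"))
```

Note that `model.transcribe` returns a generator, so the segments are only decoded when iterated.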
