[Question] Faster-Whisper Home Assistant GPU or Tensor (Coral) Support? #37
Comments
Faster-whisper supports GPU execution. Coral TPU is not supported since it does not work well with recurrent neural networks: SYSTRAN/faster-whisper#203. The last link, however, suggests the TFLite version of https://github.com/usefulsensors/openai-whisper, which could work, but no promises.
Update: I doubt that a model of this size would fit into the Edge TPU anyway. Your best bet is a GPU. Probably even an NVIDIA Jetson, but those are very expensive.
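For reference, a minimal sketch of running faster-whisper on a GPU via its Python API; the model size ("small"), the float16 compute type, and the audio file name are just example choices, and a CUDA-capable card with the CUDA/cuDNN libraries installed is assumed:

```python
from faster_whisper import WhisperModel

# Load a Whisper model on the GPU (assumes CUDA and cuDNN are available).
# "small" and float16 are example choices; adjust to your hardware.
model = WhisperModel("small", device="cuda", compute_type="float16")

# Transcribe an audio file; segments are generated lazily.
segments, info = model.transcribe("audio.wav", beam_size=5)
print("Detected language:", info.language)
for segment in segments:
    print(f"[{segment.start:.2f}s -> {segment.end:.2f}s] {segment.text}")
```

With `device="cpu"` the same code falls back to CPU execution, which is why the library itself does not need a special build for GPU, only the CUDA runtime available in the environment.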
Thanks for the fast response. Do I understand correctly that I need to create a Docker image to support GPU?
Don't know about Docker. This line just suggests what to set: rhasspy3/rhasspy3/configuration.yaml, line 492 at commit 11e8d30.
Do you know where I can find the Makefile or the build file for the Docker image rhasspy/wyoming-whisper?
Nope, sorry.
@Shulyaka, you can find it here: https://github.com/rhasspy/wyoming-addons. I also added a PR to support GPU; for the moment it works for whisper but not for piper:
The ticket SYSTRAN/faster-whisper#203 says that it is not possible to run on TPU, but the whisper-jax project has done a migration to Coral TPU, if the Hugging Face test is correct.
A blog article also made some benchmarks. Could an implementation on Coral TPU be a great option, especially for cheap hardware?
I do not think it would be possible to run Whisper-JAX on a Google Coral TPU (consumer), since the model weights to be loaded are too heavy; at least this is what I had read in some other discussions.
I host the Faster-Whisper Docker container on my server. Can it use other hardware to speed things up? I could not find anything about it.
Thanks!
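One way to check this from inside the container is to ask CTranslate2 (the backend faster-whisper uses) which devices it can see. This is only a sketch and assumes the `ctranslate2` Python package is importable in the container:

```python
import ctranslate2

# Number of CUDA devices visible to CTranslate2 (0 means CPU-only execution).
gpu_count = ctranslate2.get_cuda_device_count()
print("CUDA devices visible to CTranslate2:", gpu_count)

if gpu_count > 0:
    # Compute types supported on the GPU (e.g. float16, int8_float16).
    print("GPU compute types:", ctranslate2.get_supported_compute_types("cuda"))
else:
    # Fallback: compute types supported on the CPU (e.g. int8, float32).
    print("CPU compute types:", ctranslate2.get_supported_compute_types("cpu"))
```

If the count is 0, the container is running CPU-only regardless of what hardware the host has, typically because the GPU was not passed through to the container.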