I was curious to know the memory requirements for running the model. Can anyone who has run the model share the memory usage and any compute numbers, such as how long it took to run for your task?
It seems to use about 18 GB of VRAM when launching the demo. Response time is pretty fast on an NVIDIA RTX 4000 SFF Ada Generation (20 GB): I can give it roughly 10 seconds of audio and get a reply within about 4 seconds.
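In case it helps others reproduce these numbers, here is a minimal sketch for measuring peak VRAM and latency, assuming the model runs under PyTorch on CUDA. `run_inference` and `audio_path` are placeholders for whatever entry point the demo actually exposes, not part of the repo's API.

```python
import time
import torch

def profile_inference(run_inference, audio_path):
    # Placeholder wrapper: run_inference stands in for the demo's actual call.
    torch.cuda.reset_peak_memory_stats()
    start = time.perf_counter()
    output = run_inference(audio_path)   # the model call being measured
    torch.cuda.synchronize()             # wait for GPU work to finish
    elapsed = time.perf_counter() - start
    peak_gb = torch.cuda.max_memory_allocated() / 1024**3
    print(f"Latency: {elapsed:.2f}s, peak VRAM: {peak_gb:.1f} GB")
    return output
```

`nvidia-smi` during inference should show a similar figure, though it also counts CUDA context overhead, so it typically reads a bit higher than the allocator's number.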