Replies: 2 comments
-
It does not make much practical sense unless you have some very specific conditions or large batches, but you can try. Make sure to read this before proceeding.
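To illustrate what "trying GPU" for the jit model could look like, here is a minimal sketch using PyTorch. The file name `silero_vad.jit` is a hypothetical placeholder, and the CPU fallback is my own addition, not something prescribed by the repo:

```python
def pick_device(cuda_available: bool) -> str:
    # Map a CUDA-availability flag to a PyTorch device string.
    return "cuda" if cuda_available else "cpu"

def load_jit_model(path: str):
    # Requires PyTorch. map_location places the TorchScript weights
    # directly on the chosen device, avoiding an extra CPU->GPU copy.
    import torch
    device = pick_device(torch.cuda.is_available())
    model = torch.jit.load(path, map_location=device)
    model.eval()
    return model, device

if __name__ == "__main__":
    # "silero_vad.jit" is a placeholder path for illustration.
    model, device = load_jit_model("silero_vad.jit")
    print(f"model loaded on {device}")
```

Note that every input tensor must be moved to the same device as the model (e.g. `audio.to(device)`) before calling it, or PyTorch will raise a device-mismatch error.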
-
GPU inference would definitely be helpful :) I have a use case where I need to process multiple hours of sound captured by a wearable device.
-
❓ Questions and Help
Hi.
Thank you for the provided repo and models.
The wiki says that "Using batching or GPU can also improve performance considerably".
I've tried to run the VAD models on GPU and ran into errors.
Can I run the jit or ONNX models on GPU?
Thanks
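For the ONNX side of the question, a minimal sketch of requesting GPU execution with ONNX Runtime follows. The provider list is standard ONNX Runtime usage, but `make_session` and the model path are illustrative names, and CUDA support requires the `onnxruntime-gpu` build:

```python
def gpu_first_providers() -> list:
    # Provider order is a preference list: ONNX Runtime uses the first
    # provider it can initialize and falls back to the rest, so this
    # tries CUDA and silently degrades to CPU.
    return ["CUDAExecutionProvider", "CPUExecutionProvider"]

def make_session(model_path: str):
    # Requires the onnxruntime (or onnxruntime-gpu) package.
    import onnxruntime as ort
    return ort.InferenceSession(model_path, providers=gpu_first_providers())
```

You can check which provider actually got used with `session.get_providers()`; if it reports only `CPUExecutionProvider`, the CUDA build or driver is missing.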