
Add Docker Installation to Whisper-Jax #138

Open · wants to merge 2 commits into main
Conversation

pourmand1376

Building a Docker-compatible Whisper-JAX setup took me a full day.

None of the methods recommended by the JAX team worked for a GPU-compatible installation, so I used a nightly build from NVIDIA JAX and installed all of your dependencies on top of it.

Also, some dependencies were not listed in setup.py. I have added a complete list of dependencies in requirements.txt.
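The approach described above can be sketched as a Dockerfile. This is only an illustration of the idea, not the file from the PR: the base image tag, working directory, and entrypoint below are assumptions.

```dockerfile
# Start from an NVIDIA-built JAX nightly image (tag is illustrative)
# instead of installing jax[cuda] by hand, which is what kept failing.
FROM ghcr.io/nvidia/jax:nightly

WORKDIR /app

# Install the full dependency list added in this PR's requirements.txt.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the rest of the repository and launch the app (entrypoint assumed).
COPY . .
CMD ["python", "app.py"]
```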


@im2ex im2ex left a comment


Reading through the changes, I didn't find anything wrong or suspicious (security-wise).
It worked immediately for me on Ubuntu 22.04 with Podman.
I could not test the last four lines of docker-compose.yaml, though (they work differently under Podman, but they don't hurt).


im2ex commented Sep 11, 2023

Hello, thanks so much for this PR. It came at just the right time for me.
I struggled with all kinds of dependency and CUDA problems this morning while trying to install, and then found this PR.

I came back to leave this comment and saw that I am a little too late: the "Merging is blocked" notice was not there before. I hope my review helps.

In case someone struggles with Podman and GPU support: with Podman, only the CPU is used when starting up via `podman-compose up`, but after following https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/1.14.1/cdi-support.html the GPU works right away. (I started the container by hand; I don't yet know how to bring this into docker-compose.)
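For anyone following that link, the CDI route boils down to roughly these two steps (commands are from the linked NVIDIA Container Toolkit docs; the image name is a placeholder):

```shell
# Generate a CDI spec describing the GPUs of the installed NVIDIA driver.
sudo nvidia-ctk cdi generate --output=/etc/cdi/nvidia.yaml

# Start the container by hand, requesting all GPUs via the CDI device name,
# and check that the GPU is visible inside the container.
podman run --rm --device nvidia.com/gpu=all <whisper-jax-image> nvidia-smi
```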

And another tip: it takes quite a while until the model is fetched (almost 7 GB, depending on internet speed). So commit the container locally once that is done; otherwise it is easy to lose the model and have to wait quite a while again.
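A rough sketch of that tip, with illustrative container and image names; an alternative is to mount the default Hugging Face cache directory as a volume so the downloaded model survives container removal:

```shell
# Snapshot the running container after the ~7 GB model download finishes.
podman commit whisper-jax whisper-jax:with-model

# Or persist the model outside the container entirely by mounting the
# default Hugging Face cache path (~/.cache/huggingface) as a volume.
podman run -v "$HOME/.cache/huggingface:/root/.cache/huggingface" <whisper-jax-image>
```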
