
A step-by-step guide to setting up Nvidia GPUs with CUDA support for Docker (and Compose) containers on a NixOS host




Docker+Compose with CUDA (Nvidia container toolkit) + Jupyter & Python DL libraries on NixOS guide

Guide updated to: CUDA 12.3 (NixOS host) | CUDA 12.1 (Docker container) | Python 3.11 | PyTorch 2.1.2

This repository includes a step-by-step guide for :

  • running deep learning libraries (such as PyTorch 2.1 and vLLM on JupyterLab)
  • in docker containers
  • via docker-compose
  • with full CUDA support (container CUDA version: 12.1)
  • on NixOS hosts (host CUDA version: 12.3)

The primary target of this guide is setting up deep learning projects on NixOS systems with Nvidia GPUs.
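
For a rough sense of where the guide ends up, the snippet below shows the general shape of a Compose service that requests a GPU through the Nvidia container toolkit. It is only a minimal sketch for orientation, not a file from this repository: the service name, image tag and command are placeholder assumptions, and the compose files used in the later steps differ.

```yaml
# Minimal sketch only, not one of this repository's files.
# The service name and image tag are placeholders; the guide's own
# compose files (final step) are the source of truth.
services:
  jupyter:
    image: nvidia/cuda:12.1.1-base-ubuntu22.04   # swap in your actual image
    command: nvidia-smi                          # quick check that the GPU is visible
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1                           # or `count: all`
              capabilities: [gpu]
```

The `deploy.resources.reservations.devices` block is the Compose-spec way of asking the Nvidia runtime for GPU access; the host-side setup that makes this work is exactly what the steps below cover.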

If you're on NixOS

Follow the guide from step 01.

If you're not on NixOS, but have installed Docker + the Nvidia container toolkit successfully, and pass the tests mentioned in step 02

Follow the guide from step 03. The remaining steps are not specific to NixOS. Continue along!

Just a heads up

While most of the files (and project folders) are provided, you might have to update a couple of files, especially towards the final step. So, taking the time to go through the documentation is definitely advised.

Contributions

This guide is admittedly best suited to the author's personal setup. However, contributions are most welcome.

There's a fair chance that this guide no longer works (or is simply outdated) by the time you read it. Please feel free to open an issue in that case.