
ToriLinux - Linux LiveCD for offline AI training and inference

[Screenshot: vLLM running on ToriLinux]

ToriLinux is a LiveCD distribution based on Arch Linux. It currently includes the following projects preinstalled, along with their dependencies:

If you would like to see another AI-related project included in ToriLinux, please open an issue.

Features

  • Easy setup: just boot the ISO and you will have a working setup for training and/or inference with Large Language Models, Stable Diffusion, and more.
  • Fully offline training and/or inference.
  • Includes a performance state switcher that reduces GPU temperatures when inference is not running (NVIDIA only; automatic, llama.cpp, and vLLM modes supported). A sketch of the idea follows this list.
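
The switcher itself ships with ToriLinux; the short shell sketch below only illustrates the general idea, assuming nvidia-smi is available and that GPU utilization is an acceptable idle signal (the clock values and polling interval are arbitrary, and locking clocks requires root):

    #!/usr/bin/env bash
    # Illustrative sketch only -- not the actual ToriLinux switcher.
    # Lowers GPU 0's clock ceiling while it is idle and removes the limit when busy.
    set -euo pipefail

    IDLE_MAX_CLOCK=300   # assumed low clock ceiling in MHz
    POLL_SECONDS=10

    while true; do
        # Current GPU utilization in percent, e.g. "0" or "97".
        util=$(nvidia-smi -i 0 --query-gpu=utilization.gpu --format=csv,noheader,nounits)

        if [ "$util" -eq 0 ]; then
            # Idle: lock clocks to a low range to reduce temperature and power draw.
            nvidia-smi -i 0 --lock-gpu-clocks=0,"$IDLE_MAX_CLOCK" > /dev/null
        else
            # Busy: reset clocks so inference runs at full speed.
            nvidia-smi -i 0 --reset-gpu-clocks > /dev/null
        fi

        sleep "$POLL_SECONDS"
    done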

Usage

To use ToriLinux:

  1. Install Ventoy on a USB drive.
  2. Download the latest ISO from the GitHub Actions workflows and copy it to the USB drive.
  3. Boot from the USB drive (select it as the boot device in BIOS/UEFI).
  4. Log in with the username tori and password tori. You can also connect over SSH (see the example below).
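
For example, to connect over SSH from another machine on the same network (the IP address is a placeholder; use whatever address the booted machine reports, e.g. via ip addr):

    # Placeholder address; replace with the booted machine's actual IP.
    ssh tori@192.168.1.50
    # Password: tori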

The following options are currently available:

  1. Headless: Headless variant without GUI, for servers.
  2. With GUI: A variant with a GUI, for desktops.
  3. Empty: A variant WITHOUT projects, with CUDA/ROCm only.
            Headless          With GUI          Empty
    NVIDIA  NVIDIA headless   NVIDIA with GUI   NVIDIA empty
    AMD     AMD headless      AMD with GUI      AMD empty

Note: You need to be logged in to GitHub to download artifacts. You can use nightly.link to download artifacts without authorization.

Misc

Note that you need pre-downloaded models on a local hard drive or an NFS server, or enough RAM and an internet connection to download models directly into RAM.
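
For example, models could be served from an NFS export or pulled straight into RAM-backed storage; the host, paths, and repository name below are placeholders, and NFS client utilities and the Hugging Face CLI are assumed to be available:

    # Mount pre-downloaded models from an NFS server (placeholder host and paths).
    sudo mkdir -p /mnt/models
    sudo mount -t nfs 192.168.1.10:/srv/models /mnt/models

    # Or download a model directly into RAM via a tmpfs mount.
    sudo mount -t tmpfs -o size=64G tmpfs /mnt/models
    huggingface-cli download <repo-id> --local-dir /mnt/models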

Note that the following projects are not available in the ROCm version:

The server for building the ROCm version is provided by @Sepera-okeq.