AI tutorials on running ROCm, PyTorch, llama.cpp, Ollama, Stable Diffusion and LM Studio in Incus / LXD containers

I’ve written four AI-related tutorials that you might be interested in.

Quick Notes:

  • The tutorials are written for Incus, but LXD users can simply replace incus commands with lxc (see the sketch after this list).
  • I’m using an AMD 5600G APU, but most of what you’ll see in the tutorials also applies to discrete GPUs. Whenever something is APU-specific, I have marked it as such.
  • Even though I use ROCm in my containers, Nvidia CUDA users should also find these guides helpful.
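As a quick illustration of the Incus-to-LXD swap and of GPU passthrough, here is a minimal sketch. The container name rocm-box and the Ubuntu image are placeholders of my own, not taken from the tutorials; adjust them to your setup.

```bash
# Launch a container with Incus (LXD users: replace "incus" with "lxc").
incus launch images:ubuntu/22.04 rocm-box

# Pass the host GPU through so ROCm (or CUDA) is visible inside the container.
incus config device add rocm-box gpu gpu

# The same steps with LXD's client:
#   lxc launch ubuntu:22.04 rocm-box
#   lxc config device add rocm-box gpu gpu
```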

Tutorials:

  1. AI tutorial: ROCm and PyTorch on AMD APU or GPU
  2. AI tutorial: llama.cpp and Ollama servers + plugins for VS Code / VS Codium and IntelliJ
  3. AI tutorial: Stable Diffusion SDXL with Fooocus
  4. AI tutorial: LLMs in LM Studio