!eWOErHSaiddIbsUNsJ:nixos.org

NixOS CUDA

292 Members
CUDA packages maintenance and support in nixpkgs | https://github.com/orgs/NixOS/projects/27/ | https://nixos.org/manual/nixpkgs/unstable/#cuda



7 Mar 2025
@mdietrich:matrix.org mdietrich: Hey all, first of all thank you for your work; last time I tried to use any CUDA-related programs and services I had to give up, because this joint effort had not been set up yet.
I am just wondering whether I am doing something wrong when trying to set up llama-cpp and open-webui on my NixOS machine. I've set up the nix-community cache (and ollama with CUDA support installs fine in any case), but neither enabling nixpkgs.config.cudaSupport nor overwriting e.g. llama-cpp's package with `services.llama-cpp.package = pkgs.overwrite { config.cudaSupport = true; config.rocmSupport = false; }` just downloads and installs the appropriate packages; both lead to extremely long build times. Are these packages (llama-cpp and open-webui, of which I think onnxruntime takes the longest) just not built in the community cache?
13:13:54
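[Editor's note: the usual way to enable CUDA for a single package in a NixOS configuration is `.override` on the derivation rather than a `pkgs.overwrite` call. A minimal sketch follows; it assumes the llama-cpp derivation accepts `cudaSupport`/`rocmSupport` arguments, which may vary across nixpkgs revisions.]

```nix
# configuration.nix fragment (sketch; argument names are assumptions)
{ pkgs, ... }:
{
  services.llama-cpp = {
    enable = true;
    # Per-package override: pass feature flags the derivation accepts.
    package = pkgs.llama-cpp.override {
      cudaSupport = true;
      rocmSupport = false;
    };
  };

  # Alternatively, enable CUDA globally. Note this changes the hashes of
  # every CUDA-aware package, so anything not present in a binary cache
  # (e.g. the nix-community cache) will be rebuilt locally:
  # nixpkgs.config.cudaSupport = true;
}
```

Either way, long build times usually mean the resulting store paths are not in any configured binary cache, which is what the Hydra link below helps check.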
@ss:someonex.net SomeoneSerge (back on matrix): Let's see
13:15:53
@ss:someonex.net SomeoneSerge (back on matrix): https://hydra.nix-community.org/job/nixpkgs/cuda/llama-cpp.x86_64-linux
13:15:55


