| NixOS CUDA |
| CUDA packages maintenance and support in nixpkgs | https://github.com/orgs/NixOS/projects/27/ | https://nixos.org/manual/nixpkgs/unstable/#cuda |
| Message | Time |
|---|---|
| 28 Feb 2025 | |
| In reply to @hexa:lossy.network: Recursion aside, the default value is going to be discarded because there is a stdenv in the parent scope :( | 08:44:51 |
| In reply to @hexa:lossy.network: It contains the propagated build inputs referenced by the cmake module. It's not in dev, because withPackages sources dev... | 08:46:15 |
| hm, changing to that yields the same derivations as before, but maybe that is because nothing currently relies on the cuda backendStdenv 🤔 | 17:35:50 | |
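An aside for readers: the "default value is discarded" behaviour above is a general `callPackage` property, sketched below with illustrative names (the `pkg` function and its string results are not real nixpkgs code):

```nix
# Sketch: callPackage fills every function argument whose name exists in
# its scope, so an argument's default value is used only when the scope
# lacks that name. With `stdenv` present in pkgs, the default is ignored.
let
  pkgs = import <nixpkgs> { };
  pkg = { stdenv ? null }:
    if stdenv == null then "default used" else "parent-scope value used";
in
pkgs.callPackage pkg { }  # => "parent-scope value used"
```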
| 2 Mar 2025 | |
| In case anybody is interested: I got nsight compute working: https://github.com/Snektron/nixos-config/blob/main/packages/nsight-compute.nix (last time I checked there were still issues running the one in nixpkgs, and I don't think the related PR has changed in the meantime) | 11:46:23 |
| 3 Mar 2025 | |
| Hello, I'm having trouble getting CUDA working on NixOS. I made a post on Discourse but I thought I'd ask here since it's specific to CUDA: https://discourse.nixos.org/t/ollama-cuda-driver-library-init-failure-3/61068/2 In short, I installed ollama, but ollama reports a "cuda driver library init failure: 3" error. I guess error 3 corresponds to the init failure code. I think my issue is that I have CUDA 12.4 installed, when my GPU supports CUDA 12.8? Although I'm not certain what the CUDA version reported by the driver means. | 12:50:49 |
| Also I hope it's ok to cross-post like this. Apologies if that seems pushy. | 12:51:36 | |
| I also thought the error might be because I'm using the GPU to run Wayland, but I assume that just like CPUs, GPUs can run multiple workloads? Or does CUDA need to have exclusive access to the GPU? (I know these are very naive questions, I just never dealt with GPUs before) | 12:56:23 |
| connor (he/him) (UTC-8): I just noticed that pkgs/development/libraries/science/math/tensorrt/extension.nix is a thing. At first glance, this code seems dead to me (or at least I wasn't able to find a place where it is called from). It seems that nowadays all of the TensorRT-related code lives in pkgs/development/cuda-modules. The last commit (excluding automated reformatting) that touched pkgs/development/libraries/science/math/tensorrt seems to be 8e800cedaf24f5ad9717463b809b0beef7677000, authored by you in 2023. That commit also removed pkgs/development/libraries/science/math/tensorrt/generic.nix, so I am guessing that you forgot to also delete the extension.nix? | 13:43:50 |
| ruro: yes, seems likely :l | 16:39:14 | |
| little_dude: it's fine to cross-post! Sorry it's not working; my only suggestion would be to try running it with whatever flags Ollama needs to enable debugging and/or verbose logging. The version difference between the CUDA driver version and the CUDA library version is fine -- it just means you can run CUDA libraries up to and including 12.8. The GPU definitely supports multiple workloads, so that shouldn't be a problem either. I'm strapped for time, so I probably won't be able to help debug or troubleshoot, but I think some other people in here use ollama, so they might be able to chime in. | 16:46:44 |
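For the debugging suggestion, a minimal sketch of what that could look like as NixOS configuration, assuming the stock `services.ollama` module and ollama's `OLLAMA_DEBUG` environment switch:

```nix
# Sketch: ollama with CUDA acceleration plus debug logging, so the CUDA
# init failure is reported with more context in the service logs.
{ config, pkgs, ... }:
{
  services.ollama = {
    enable = true;
    acceleration = "cuda";     # pulls in the CUDA-enabled build
    environmentVariables = {
      OLLAMA_DEBUG = "1";      # ollama's own verbose-logging switch
    };
  };
}
```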
| 4 Mar 2025 | |
| I have prepared a cudaPackages_12 update from 12.4 to 12.8 here: https://github.com/NixOS/nixpkgs/pull/386983 Can you have a look? I also included a nixpkgs-review result: 229 marked as broken / 219 failed to build / 2455 packages built. But I am having a hard time figuring out which build failures are new and which were happening even before. Can you advise on the best way to proceed? Please comment on GitHub, as I am not always following the discussion here. | 10:48:13 |
| An ideal thing for me would be if someone indicated the list of packages that really need their builds fixed before the merge happens, and I will (try to) work on fixing those. | 10:53:33 |
| In addition to Connor's suggestions, can you check what the output is when you run cudaPackages.saxpy? | 11:26:55 |
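For anyone unfamiliar with it, `cudaPackages.saxpy` is nixpkgs' small CUDA smoke-test package; one way to build it out of tree might look like this (a sketch, assuming the usual unfree-license gate applies to its CUDA dependencies):

```nix
# Sketch: build nixpkgs' CUDA smoke test; running the resulting binary on
# the GPU machine exercises the driver/runtime pairing end to end.
let
  pkgs = import <nixpkgs> {
    config.allowUnfree = true;  # the CUDA libraries it links are unfree
  };
in
pkgs.cudaPackages.saxpy
```

Building that with `nix-build` and running the program from `./result` on the target machine should either print the computed values or fail with the same driver-level error as the real workload.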
| Maybe the merge of this PR should happen shortly after the merge of the ROCm update in #367695, to avoid doing massive rebuilds twice? https://github.com/NixOS/nixpkgs/pull/367695 | 12:12:17 |
| 7 Mar 2025 | |
| Hey all, first of all thank you for your work; last time I tried to use any CUDA-related programs and services, I had to give up because this joint effort had not been set up yet. I am just wondering if I am doing something wrong when trying to set up llama-cpp and open-webui on my NixOS machine. I've set up the nix-community cache (and ollama with CUDA support installs fine in any case), but either enabling nixpkgs.config.cudaSupport or overriding e.g. llama-cpp's package with `services.llama-cpp.package = pkgs.overwrite { config.cudaSupport = true; config.rocmSupport = false; }` | 13:12:26 |
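For what it's worth, `pkgs.overwrite` is not a real attribute; a sketch of the two usual approaches, assuming the current `llama-cpp` package (which takes a `cudaSupport` flag) and the stock `services.llama-cpp` module:

```nix
# Sketch, not a confirmed fix: two common ways to get a CUDA-enabled
# llama-cpp; pick one, they are alternatives rather than a pair.
{ config, pkgs, ... }:
{
  # Option 1: enable CUDA for all of nixpkgs (a large rebuild unless the
  # nix-community cache already has the results).
  nixpkgs.config.cudaSupport = true;

  # Option 2: override just this one package via its cudaSupport flag.
  services.llama-cpp = {
    enable = true;
    package = pkgs.llama-cpp.override { cudaSupport = true; };
  };
}
```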