| NixOS CUDA | 315 Members | 63 Servers |
| CUDA packages maintenance and support in nixpkgs | https://github.com/orgs/NixOS/projects/27/ | https://nixos.org/manual/nixpkgs/unstable/#cuda |
| Sender | Message | Time |
|---|---|---|
| 28 Apr 2024 | | |
| | I ran into the famous "No help topic for" issue with nvidia-container-toolkit | 14:41:26 |
| | In reply to @vid:matrix.org: Can't say much about the state of 23.11 rn, but unstable offers a CDI module | 17:09:01 |
| | In reply to @vid:matrix.org: It should be mounted from the host | 17:09:34 |
| 29 Apr 2024 | | |
| | | 11:43:06 |
| | In reply to @ss:someonex.net: I switched over to unstable, but am still going in circles. I feel like I'm one statement away from success, but it still seems infinitely far away. Here's my cuda-specific config at the moment (it's included from configuration.nix): | 12:52:21 |
| | a lot of that is probably superstition, I've accessed a lot of in-flight convos/repos | 12:53:03 |
| | it doesn't even seem to be including cuda_runtime.h, which I thought would be part of cudatoolkit | 12:55:39 |
| | I don't think I'm trying to do anything too exotic, basically "cuda in docker with widely used hardware," so I don't know why it doesn't "just work." The only thing different is I want to use the Intel IGP for display (which works fine) and nvidia for cuda. At this point I'm willing to sacrifice the ~250mb for using the nvidia display driver if it "just works" | 12:57:25 |
| | one notable thing is I don't have services.xserver.videoDrivers defined at all. "intel" didn't work for some reason (12700k). | 13:00:30 |
| | | 13:08:05 |
| | In reply to @vid:matrix.org: You need to enable either the nvidia_x11 or the datacenter driver | 13:15:09 |
| | The option is at hardware.nvidia-container-toolkit. Edit: I see now you have set it | 13:15:54 |
| | I don't see many references to the datacentre driver; if it'd work with a 3090 it seems like it would be good for not causing complexity with displays. | 13:19:16 |
| | I use nvidia_x11 with a 3090 | 13:21:05 |
| | (Note you don't have to enable the xserver or anything to use it; the option is just confusingly named) | 13:22:47 |
| | ok, I'm sorry to ask this, but do you see anything obviously wrong in my config? nvidia-smi does find the card, but none of the libraries/docker seem to work. /run/cdi/nvidia-container-toolkit.json exists but isn't populated | 13:30:06 |
| | there seems to be a fundamental problem when nvidia-container-toolkit is installed: every docker command yields "no help topic for <cmd>" | 13:38:56 |
| | @vid do you have an example container you're trying to run? Looks close to my setup so I could give it a try | 14:30:37 |
| | | 15:29:37 |
| | it was just the stock llama.cpp repo, following the instructions for the docker "light" setup. It was probably something I was doing wrong, but after spending a weekend on this, I got it running on ubuntu without pulling out a single hair. I'm going to have to stick to that camp, but I will keep an eye on nixos 'cause I really like the ideas | 15:47:43 |
| | I found that doing … whereas … My minimal settings are documented in this issue: https://github.com/NixOS/nixpkgs/issues/305312 | 16:05:20 |
| 30 Apr 2024 | | |
| | | 14:20:23 |
| 1 May 2024 | | |
| | Has anyone managed to get TensorFlow and PyTorch in the same Python environment on a recent nixos-unstable? This seems to have broken at some point in the last few months. | 02:41:09 |
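For reference, a minimal sketch of the setup discussed on 29 Apr (unstable's CDI module, the nvidia_x11 driver on a 3090, Intel iGPU driving the display). The option paths exist in nixpkgs, but this exact combination is an assumption, not a verified config:

```nix
{ config, pkgs, ... }:
{
  # Generates the CDI spec under /run/cdi/; despite the name,
  # no running X server is required for this.
  hardware.nvidia-container-toolkit.enable = true;

  # The proprietary kernel driver still has to be loaded for CUDA;
  # videoDrivers is the (confusingly named) way to select it.
  services.xserver.videoDrivers = [ "nvidia" ];

  virtualisation.docker.enable = true;
}
```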
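To check whether the CDI spec actually got generated (the empty /run/cdi/nvidia-container-toolkit.json mentioned above), commands along these lines should help. This assumes nvidia-ctk is on the PATH and a CDI-aware Docker (25+); treat it as a sketch, not a verified recipe:

```
# The generated spec should list GPU device nodes, not be empty.
head /run/cdi/nvidia-container-toolkit.json

# List the devices CDI knows about.
nvidia-ctk cdi list

# Request the GPU via its CDI device name instead of --gpus.
docker run --rm --device nvidia.com/gpu=all ubuntu nvidia-smi
```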
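The llama.cpp docker "light" setup mentioned above is roughly invoked like the following (image tag as published by the llama.cpp project; the model path, prompt, and layer count are placeholders):

```
docker run --rm --gpus all \
  -v /path/to/models:/models \
  ghcr.io/ggerganov/llama.cpp:light-cuda \
  -m /models/model.gguf -p "some prompt" -ngl 99
```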
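As a sketch of what the TensorFlow-plus-PyTorch question means in practice: on nixpkgs, the combined environment would look roughly like this shell.nix. The attribute names ps.tensorflow and ps.torch exist in nixpkgs; cudaSupport and allowUnfree are assumptions about the desired build:

```nix
{ pkgs ? import <nixpkgs> {
    config = { allowUnfree = true; cudaSupport = true; };
  } }:
pkgs.mkShell {
  packages = [
    # One interpreter closure containing both frameworks; this is the
    # combination reported broken on recent nixos-unstable.
    (pkgs.python3.withPackages (ps: [ ps.tensorflow ps.torch ]))
  ];
}
```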