NixOS CUDA | 319 Members
CUDA packages maintenance and support in nixpkgs | https://github.com/orgs/NixOS/projects/27/ | https://nixos.org/manual/nixpkgs/unstable/#cuda | 63 Servers
| Message | Time |
|---|---|
| 29 Apr 2024 | |
| I don't think I'm trying to do anything too exotic though, basically "CUDA in Docker with widely used hardware," so I don't know why it doesn't "just work." The only thing different is that I want to use the Intel iGPU for display (which works fine) and NVIDIA for CUDA. At this point I'm willing to sacrifice the ~250 MB for using the NVIDIA display driver if it "just works". | 12:59:12 |
| One notable thing is I don't have services.xserver.videoDrivers defined at all; "intel" didn't work for some reason (12700K). | 13:01:46 |
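For reference, a minimal sketch of the setup being described: Intel iGPU for display, NVIDIA card for CUDA only. The option names are real NixOS options, but the exact combination is an assumption for illustration, not a verified config from this conversation:

```nix
{ config, pkgs, ... }:
{
  # "modesetting" is the usual replacement for the old "intel" X driver,
  # which often fails on Alder Lake iGPUs like the 12700K's. Despite the
  # xserver prefix, listing "nvidia" here is also how the nvidia_x11
  # driver gets loaded for CUDA; no display has to use it.
  services.xserver.videoDrivers = [ "modesetting" "nvidia" ];

  hardware.opengl.enable = true;  # renamed to hardware.graphics in later releases

  hardware.nvidia = {
    modesetting.enable = true;
    open = false;  # proprietary kernel module; an assumption, fine for a 3090
  };
}
```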
| In reply to @vid:matrix.org: You need to enable either the nvidia_x11 or the datacenter driver | 13:15:09 |
| The option is hardware.nvidia-container-toolkit. Edit: I see now you have set it | 13:15:54 |
| I don't see many references to the datacenter driver; if it works with a 3090, it seems like it would be good for avoiding complexity with displays. | 13:19:16 |
| I use nvidia_x11 with a 3090 | 13:21:05 |
| (Note you don't have to enable the xserver or anything to use it; the option is just confusingly named) | 13:22:47 |
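The confusingly named option referred to above is `hardware.nvidia-container-toolkit`. A hedged sketch of how it is typically enabled alongside Docker, assuming a recent nixos-unstable (on slightly older channels the same functionality lived under `virtualisation.containers.cdi.dynamic.nvidia.enable`):

```nix
{
  virtualisation.docker.enable = true;

  # Generates the CDI spec (/run/cdi/nvidia-container-toolkit.json) that
  # lets containers request the GPU by device name.
  hardware.nvidia-container-toolkit.enable = true;
}
```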
| OK, I'm sorry to ask this, but do you see anything obviously wrong in my config? nvidia-smi does find the card, but none of the libraries/Docker seem to work. /run/cdi/nvidia-container-toolkit.json exists but isn't populated | 13:30:06 |
| There seems to be a fundamental problem when nvidia-container-toolkit is installed: every docker command yields "no help topic for" <cmd> | 13:38:56 |
| @vid do you have an example container you’re trying to run? Looks close to my setup, so I could give it a try | 14:30:37 |
| It was just the stock llama.cpp repo, following the instructions for the Docker "light" setup. It was probably something I was doing wrong, but after spending a weekend on this, I got it running on Ubuntu without pulling out a single hair. I'm going to have to stick to that camp, but I will keep an eye on NixOS because I really like the ideas | 15:47:43 |
| I found that doing … whereas … My minimal settings are documented in this issue: https://github.com/NixOS/nixpkgs/issues/305312 | 16:05:20 |
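For illustration only, a sketch of what a minimal setup along those lines typically contains; this is an assumption, not the actual contents of issue #305312:

```nix
{
  services.xserver.videoDrivers = [ "nvidia" ];
  hardware.nvidia.modesetting.enable = true;

  virtualisation.docker.enable = true;
  hardware.nvidia-container-toolkit.enable = true;
}
```

With the generated CDI spec in place, a container requests the GPU with `docker run --device nvidia.com/gpu=all …` rather than the legacy `--gpus all` flag.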
| 1 May 2024 | |
| Has anyone managed to get TensorFlow and PyTorch in the same Python environment on a recent nixos-unstable? This seems to have broken at some point in the last few months. | 02:43:27 |
| I haven't, but that seems about right -- they either don't pin or pin different versions of dependencies :l | 04:44:10 |
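A shell.nix sketch of the kind of environment in question; whether it still evaluates on a given nixos-unstable pin is exactly what's being asked. The `allowUnfree`/`cudaSupport` settings are assumptions for a CUDA build:

```nix
{ pkgs ? import <nixpkgs> {
    config = { allowUnfree = true; cudaSupport = true; };
  }
}:

pkgs.mkShell {
  packages = [
    # Both frameworks in one interpreter; mismatched pins of shared
    # dependencies (numpy, protobuf, ...) are the usual breakage.
    (pkgs.python3.withPackages (ps: [ ps.tensorflow ps.torch ]))
  ];
}
```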
| I think my ISP hates me running … | 05:24:16 |