!eWOErHSaiddIbsUNsJ:nixos.org

NixOS CUDA

287 Members
CUDA packages maintenance and support in nixpkgs | https://github.com/orgs/NixOS/projects/27/ | https://nixos.org/manual/nixpkgs/unstable/#cuda



10 Oct 2025
[23:25:40] connor (burnt/out) (UTC-8) (@connorbaker:matrix.org): cuda-legacy is going to be such a pain in the ass if the roughly nine hours I just spent trying to build PyTorch against CUDA 11.4 is any indication
[23:26:30] connor (burnt/out) (UTC-8) (@connorbaker:matrix.org): (I was not successful; will resume trying with PyTorch 2.6 instead of 2.7 later)
11 Oct 2025
[02:29:45] Tristan Ross (@rosscomputerguy:matrix.org): Hey, connor (he/him) (UTC-7) & SomeoneSerge (back on matrix). Either of you wanna collab on getting Tenstorrent support into nixpkgs? I'm the only one working on it but I think since this is in a realm of AI, ML, and GPU-like computing, it would make sense to involve people already touching that stuff.
[15:37:38] connor (burnt/out) (UTC-8) (@connorbaker:matrix.org): I’d love to but I don’t have time :(
[16:20:35] Gaétan Lepage (@glepage:matrix.org): FYI: I'm working on bumping onnx[runtime] in https://github.com/NixOS/nixpkgs/pull/450587
However, the build fails... More investigation needed.
[18:17:28] SomeoneSerge (back on matrix) (@ss:someonex.net):
In reply to @rosscomputerguy:matrix.org:
> Hey, connor (he/him) (UTC-7) & SomeoneSerge (back on matrix). Either of you wanna collab on getting Tenstorrent support into nixpkgs? I'm the only one working on it but I think since this is in a realm of AI, ML, and GPU-like computing, it would make sense to involve people already touching that stuff.
YES! /looks at the calendar, lowers the volume/ yes, though very much part time god hiw do i learn to say no
[18:17:37] SomeoneSerge (back on matrix) (@ss:someonex.net): * YES! /looks at the calendar, lowers the volume/ yes, though very much part time god how do i learn to say no
12 Oct 2025
[17:19:19] Tristan Ross (@rosscomputerguy:matrix.org): Heh, it's not too much. Bulk of the reviewing is https://github.com/NixOS/nixpkgs/pull/444813
13 Oct 2025
[15:41:41] Collin Arnett (@collinarnett:matrix.org): Hello! Have ya'll run into this problem with the nvidia-runtime-container? https://github.com/llm-d/llm-d/issues/117#issuecomment-2992256350 apparently there is a patch for it here https://github.com/NVIDIA/k8s-device-plugin/pull/1183/files
[15:52:49] Collin Arnett (@collinarnett:matrix.org): * Hello! Have ya'll run into this problem with the nvidia-container-toolkit? https://github.com/llm-d/llm-d/issues/117#issuecomment-2992256350 apparently there is a patch for it here https://github.com/NVIDIA/k8s-device-plugin/pull/1183/files
[16:01:29] SomeoneSerge (back on matrix) (@ss:someonex.net): connor (he/him) (UTC-7): 8am instead of 7, rsvp?
[17:22:55] connor (burnt/out) (UTC-8) (@connorbaker:matrix.org): Yes, 8am Pacific
[17:53:23] @gmacon:matrix.org left the room.
15 Oct 2025
[03:56:45] danielrf (@danielrf:matrix.org): connor (he/him) (UTC-7): Hey, just fyi. This looks very similar to an issue we had fixed in jetpack-nixos: https://github.com/NixOS/nixpkgs/issues/451912 I can't recall if our fix was generic enough to also be applicable to nixpkgs' nvidia-container-toolkit
[07:03:11] connor (burnt/out) (UTC-8) (@connorbaker:matrix.org): pain
[07:05:36] connor (burnt/out) (UTC-8) (@connorbaker:matrix.org): Yeah Jared had written a udevadm settle for some devices; when I refactored to use upstream’s container toolkit stuff I commented it out hoping waiting on the modprobe nvgpu service was enough
[07:06:36] connor (burnt/out) (UTC-8) (@connorbaker:matrix.org):
Relevant PRs:
  • https://github.com/anduril/jetpack-nixos/pull/317
  • https://github.com/anduril/jetpack-nixos/pull/331
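The jetpack-nixos fix referenced above amounts to making the container-toolkit CDI generation wait until the NVIDIA device nodes actually exist. A minimal NixOS-module sketch of that idea, assuming a hypothetical unit name (the real service and option names in nixpkgs/jetpack-nixos may differ):

```nix
{ pkgs, ... }:
{
  # Hypothetical unit name; the actual CDI-generation service may be called something else.
  systemd.services."nvidia-container-toolkit-cdi-generator" = {
    # Start only after kernel modules (including the nvgpu/NVIDIA driver) are loaded.
    after = [ "systemd-modules-load.service" ];
    serviceConfig = {
      # Block until udev has processed pending device events, so /dev/nvidia*
      # exists before the CDI spec is generated.
      ExecStartPre = "${pkgs.systemd}/bin/udevadm settle --timeout=30";
    };
  };
}
```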
[18:50:04] Gaétan Lepage (@glepage:matrix.org):
connor (he/him) (UTC-7) SomeoneSerge (back on matrix): Torch 2.9.0 was just released.
I'm working on the bump.
It requires libnvshmem_host.so.3. I never heard of OpenSHMEM before.
Do we have it already in nixpkgs?
[19:43:52] apyh (@apyh:matrix.org): oh i have this in a fork, sec
[19:44:18] Ari Lotter (@arilotter:matrix.org) joined the room.
[19:44:26] Ari Lotter (@arilotter:matrix.org): (still me sorry, diff devices w bad key management)
[19:44:30] Ari Lotter (@arilotter:matrix.org): https://github.com/PsycheFoundation/psyche/blob/main/nix/nvshmem.nix
[19:44:35] Ari Lotter (@arilotter:matrix.org): we don't have nvshmem in nixpkgs
[19:45:38] Ari Lotter (@arilotter:matrix.org): i'm using this with torch 2.9.0 :)
[20:03:40] Gaétan Lepage (@glepage:matrix.org): Thanks for sharing! I guess we'll need to cleanly upstream this then?
[20:03:41] Gaétan Lepage (@glepage:matrix.org): * Thanks for sharing! I guess we'll need to cleanly upstream this then.
[20:53:02] connor (burnt/out) (UTC-8) (@connorbaker:matrix.org): I could probably add it to the CUDA 13 PR; nvshmem is one of the dependencies of libcublasmp I didn’t try to package
16 Oct 2025
[00:46:36] Ari Lotter (@arilotter:matrix.org): lmk if i can help - 2.9.0 (nightly) is in active usage in the above project
[04:53:50] connor (burnt/out) (UTC-8) (@connorbaker:matrix.org): Looks like it should be very doable to package — it’s a redist so shouldn’t be too bad and can re-use all the helpers we’ve got for that. Will take a closer look tomorrow
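For reference, a rough sketch of what a standalone NVSHMEM derivation could look like (in the spirit of the psyche nvshmem.nix linked above), rather than the cuda-modules redist helpers connor mentions. The version, URL layout, and hash below are placeholders, not verified values:

```nix
{ lib, stdenv, fetchurl, autoPatchelfHook, cudaPackages }:

stdenv.mkDerivation (finalAttrs: {
  pname = "nvshmem";
  version = "3.0.6"; # placeholder version

  src = fetchurl {
    # Placeholder URL loosely following NVIDIA's redist layout; check the real
    # redistrib manifest for the exact path and filename.
    url = "https://developer.download.nvidia.com/compute/nvshmem/redist/libnvshmem/linux-x86_64/libnvshmem-linux-x86_64-${finalAttrs.version}_cuda12-archive.tar.xz";
    hash = lib.fakeHash; # replace with the real hash
  };

  nativeBuildInputs = [ autoPatchelfHook ];
  buildInputs = [
    stdenv.cc.cc.lib
    cudaPackages.cuda_cudart
  ];

  # Driver libraries come from the host at runtime, not from the closure.
  autoPatchelfIgnoreMissingDeps = [ "libcuda.so.1" ];

  # The redist archive ships prebuilt include/ and lib/ (containing
  # libnvshmem_host.so.3); copy them and let autoPatchelfHook fix runpaths.
  installPhase = ''
    runHook preInstall
    mkdir -p $out
    cp -r include lib $out/
    runHook postInstall
  '';

  meta.license = lib.licenses.unfreeRedistributable;
})
```

Upstreaming would more likely go through the cuda-modules redist machinery connor describes, so treat this only as a stopgap for local use.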
[11:39:49] Niclas Overby Ⓝ (@niclas:overby.me): Is there something like rust-overlay for CUDA, so you can specify exactly which CUDA version to use?
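For what it's worth, nixpkgs itself ships versioned CUDA package sets (cudaPackages_11_8, cudaPackages_12_4, and so on, depending on the nixpkgs revision), so pinning a CUDA version is usually a plain overlay that swaps out the default cudaPackages rather than a separate overlay project. A sketch, where the cudaPackages_12_4 attribute is an assumption about what the pinned nixpkgs actually provides:

```nix
# Pin CUDA for the whole package set via an overlay; packages that take
# cudaPackages as an argument can also be overridden individually.
import <nixpkgs> {
  config = {
    allowUnfree = true; # CUDA is unfree
    cudaSupport = true; # build packages with CUDA enabled where supported
  };
  overlays = [
    (final: prev: {
      # Assumes this nixpkgs revision ships a cudaPackages_12_4 set.
      cudaPackages = final.cudaPackages_12_4;
    })
  ];
}
```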


