!eWOErHSaiddIbsUNsJ:nixos.org

NixOS CUDA

301 Members
CUDA packages maintenance and support in nixpkgs | https://github.com/orgs/NixOS/projects/27/ | https://nixos.org/manual/nixpkgs/unstable/#cuda
59 Servers



10 Mar 2026
@connorbaker:matrix.orgconnor (burnt/out) (UTC-8) 13.2 is out 🫩 https://developer.download.nvidia.com/compute/cuda/redist/ 03:35:22
@connorbaker:matrix.orgconnor (burnt/out) (UTC-8) danielrf Orin is supported by 13.2/JP7: https://developer.nvidia.com/blog/cuda-13-2-introduces-enhanced-cuda-tile-support-and-new-python-features/#embedded_devices 06:10:08
@glepage:matrix.orgGaëtan Lepage I got you connor (burnt/out) (UTC-8)
https://github.com/NixOS/nixpkgs/pull/498523
11:52:20
@glepage:matrix.orgGaëtan Lepage We do have libcublasmp. Is this doc outdated? https://github.com/NixOS/nixpkgs/blob/master/pkgs/development/cuda-modules/README.md#distinguished-packages 12:31:03
@connorbaker:matrix.orgconnor (burnt/out) (UTC-8) Ah yep it’s outdated, I packaged nvshmem: https://github.com/NixOS/nixpkgs/blob/master/pkgs/development/cuda-modules/packages/libnvshmem.nix 15:28:13
@glepage:matrix.orgGaëtan Lepage connor (burnt/out) (UTC-8) if I want to bump libcublasmp (to 0.7.x) for example, how do I know which cudaPackages_X set should be affected? 17:36:23
@connorbaker:matrix.orgconnor (burnt/out) (UTC-8) A very deep reading of the changelog, package contents changes, and thorough rebuilds and runtime verification for consumers 17:38:14
@connorbaker:matrix.orgconnor (burnt/out) (UTC-8) Yet another reason we need test suites for downstream packages which exercise those libraries — relying on NVIDIA’s samples (if they’re even available) isn’t sufficient because we care about whether consumers break 17:39:46
@connorbaker:matrix.orgconnor (burnt/out) (UTC-8) All of the assertions I added to the packages were the result of a ton of reading and gleaning meaning through changelogs and actual package contents changes 17:40:16
@glepage:matrix.orgGaëtan Lepage Sounds like a ton of fun :') 17:47:23
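[Editor's aside: a mechanical starting point for the "which cudaPackages_X is affected" question above is to compare the libcublasmp version pinned in each package set before diving into changelogs. A minimal sketch, run from a nixpkgs checkout; the `cudaPackages_*` attribute names and the presence of `libcublasmp` in each set are assumptions, not verified against master:]

```nix
# check-libcublasmp.nix — evaluate with:  nix eval -f ./check-libcublasmp.nix
# Sketch only: reports which CUDA package sets pin which libcublasmp version,
# falling back to "n/a" where the attribute is missing.
let
  pkgs = import ./. { config.allowUnfree = true; };
in
{
  cuda_12_8 = pkgs.cudaPackages_12_8.libcublasmp.version or "n/a";
  cuda_12_9 = pkgs.cudaPackages_12_9.libcublasmp.version or "n/a";
  cuda_13_0 = pkgs.cudaPackages_13_0.libcublasmp.version or "n/a";
}
```

[Sets whose reported version changes across a bump are the ones whose consumers need rebuilds and runtime checks, per the advice above.]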
@cameron-matrix:matrix.orgCameron Barker joined the room.18:18:26
11 Mar 2026
@justbrowsing:matrix.orgKevin Mittman (jetlagged/UTC+8) Redacted or Malformed Event 01:54:11
@glepage:matrix.orgGaëtan Lepage connor (burnt/out) (UTC-8) would you agree with a 12.8 -> 12.9 global bump before messing around with 13.0? 11:05:21
@ctheune:matrix.flyingcircus.ioTheuni changed their display name from Theuni to Christian Theune.14:13:00
@connorbaker:matrix.orgconnor (burnt/out) (UTC-8) Sure! I remember some weird breakages a while back when I had wanted to bump immediately after 12.9 became available, but hopefully they’re all resolved by now :) 16:08:54
@glepage:matrix.orgGaëtan Lepage https://github.com/NixOS/nixpkgs/pull/498861 16:43:46
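[Editor's aside: the "global bump" discussed above changes which `cudaPackages` set nixpkgs exposes by default; users who need to stay on a specific release during the transition can pin one themselves. A minimal sketch using a standard overlay; `cudaPackages_12_9` is assumed to exist at your nixpkgs revision:]

```nix
# Sketch: pin the default CUDA package set regardless of the nixpkgs default.
# The overlay redirects the generic `cudaPackages` alias to a versioned set.
import <nixpkgs> {
  config = {
    allowUnfree = true;
    cudaSupport = true;
  };
  overlays = [
    (final: prev: { cudaPackages = final.cudaPackages_12_9; })
  ];
}
```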
@glepage:matrix.orgGaëtan Lepage

connor (burnt/out) (UTC-8)
About https://github.com/NixOS/nixpkgs/pull/498681, I plan to build torch and vllm. If this works fine, I will merge it.
With the CUDA PRs on the way, I won't have the capacity to exhaustively test all of them.

No objection on your side?

23:37:24
@glepage:matrix.orgGaëtan Lepage (same reasoning for https://github.com/NixOS/nixpkgs/pull/498678#issuecomment-4035473707). 23:39:46
@glepage:matrix.orgGaëtan Lepage * (same reasoning for https://github.com/NixOS/nixpkgs/pull/498678). 23:39:52
@connorbaker:matrix.orgconnor (burnt/out) (UTC-8) Sounds good! I’ll leave a comment on them 23:51:04
@glepage:matrix.orgGaëtan Lepage I'm testing the CUDA bump more thoroughly though.
~1.3k rebuilds left (out of 1.8k)
23:53:19
@glepage:matrix.orgGaëtan Lepage *

connor (burnt/out) (UTC-8)
About https://github.com/NixOS/nixpkgs/pull/498681, I plan to build torch and vllm. If this works fine, I will merge it.
With all the CUDA PRs in the queue, I won't have the capacity to exhaustively test all of them.

No objection on your side?

23:54:04
12 Mar 2026
@ctheune:matrix.flyingcircus.ioTheuni changed their display name from Christian Theune to Theuni.07:18:55
@bjth:matrix.orgBryan Honof

It looks like torch's supportedTorchCudaCapabilities was out of sync with upstream. https://github.com/NixOS/nixpkgs/pull/499216

How would I use nixpkgs-review to test these changes?

10:53:19
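[Editor's aside: besides keeping `supportedTorchCudaCapabilities` in sync, the capability list torch actually builds for can be narrowed per-build via nixpkgs config, which also shrinks test rebuilds. A minimal sketch assuming the `config.cudaCapabilities` option; the capability strings below are examples only, not a recommendation:]

```nix
# Sketch: build torch for an explicit set of CUDA compute capabilities.
# config.cudaCapabilities is assumed here; values are illustrative.
let
  pkgs = import <nixpkgs> {
    config = {
      allowUnfree = true;
      cudaSupport = true;
      cudaCapabilities = [ "8.6" "8.9" "9.0" ];
    };
  };
in
pkgs.python3Packages.torch
```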
@glepage:matrix.orgGaëtan Lepage

Thanks for the PR!

Well, you don't want to rebuild all torch consumers for this. What you can do is the following:

nixpkgs-review --extra-nixpkgs-config "{ allowUnfree = true; cudaSupport = true; }" -p python3Packages.torch -p python3Packages.vllm -p python3Packages.torchvision
12:39:42
@glepage:matrix.orgGaëtan Lepage I'll try to have a look at it before next week 12:39:58


