
NixOS CUDA

307 Members
CUDA packages maintenance and support in nixpkgs | https://github.com/orgs/NixOS/projects/27/ | https://nixos.org/manual/nixpkgs/unstable/#cuda



29 Jan 2026
@hexa:lossy.networkhexa (UTC+1)nixos-25.11 should be fairly cheap on rebuilds14:12:03
@hexa:lossy.networkhexa (UTC+1)the opencv 4.13.0 update is required to update openvino too14:12:44
@snakyeyes:matrix.orgGilles Poncelet joined the room.22:07:17
30 Jan 2026
@connorbaker:matrix.orgconnor (burnt/out) (UTC-8)Can someone review/merge https://github.com/NixOS/nixpkgs/pull/485211?02:20:04
@connorbaker:matrix.orgconnor (burnt/out) (UTC-8)Also coming up: https://github.com/NixOS/nixpkgs/pull/48520803:10:33
@matthewcroughan:defenestrate.itmatthewcroughan changed their display name from matthewcroughan to matthewcroughan @fosdem.13:50:24
31 Jan 2026
@bjth:matrix.orgBryan HonofHey hey, live from FOSDEM here. Is there an easy way to generate those manifest JSON files? Or is that a fully manual process?16:26:02
@bjth:matrix.orgBryan HonofNevermind, should've read the README. :)16:26:59
@connorbaker:matrix.orgconnor (burnt/out) (UTC-8)Oh god is it up to date16:32:45
@connorbaker:matrix.orgconnor (burnt/out) (UTC-8)Those manifests should come directly from NVIDIA (but they need a new line added to comply with the Nixpkgs formatter)16:33:13
1 Feb 2026
@sigmasquadron:matrix.orgFernando Rodrigues changed their display name from SigmaSquadron to Fernando Rodrigues.10:43:22
@glepage:matrix.orgGaétan Lepage OpenCV 4.13.0 bump has just been merged! 22:56:18
3 Feb 2026
@justbrowsing:matrix.orgKevin Mittman (UTC-7) Hi Bryan Honof, I can help answer questions about the JSON manifests. In reply to connor (burnt/out) (UTC-8): You could have mentioned that, happy to add a newline at the end 21:41:47
@connorbaker:matrix.orgconnor (burnt/out) (UTC-8) Gaétan Lepage: PR for fixes related to CUDA 13 breakages: https://github.com/NixOS/nixpkgs/pull/485208 22:06:42
4 Feb 2026
@shadowrz:nixos.devYorusaka Miyabi joined the room.01:48:31
@benesim:benesim.orgBenjamin Isbarn joined the room.09:10:02
@benesim:benesim.orgBenjamin Isbarn Hi, I'm trying to run an application that uses OpenCV with CUDA support, built with Nix, on an NVIDIA® Jetson™ Orin™ Nano 8GB. This fails with essentially the following message: Internal Error: OpenCV(4.11.0) /build/source/modules/dnn/src/cuda4dnn/init.hpp:55: error: (-217:Gpu API call) CUDA driver version is insufficient for CUDA runtime version in function 'getDevice' (code: GpuApiCallError, -217). I did the old /run/opengl-driver/lib trick, which worked flawlessly on another device, a PC running a 3050. But this doesn't seem to work on the Jetson (I do see that the libcuda I symlinked into /run/opengl-driver/lib gets loaded when running it under strace). I tried to use the same CUDA version that's on the Jetson (i.e. cat /usr/local/cuda/version.json gave me "version" : "11.4.19", so I went with cuda_11_4 in Nix). Any help would be highly appreciated :) 11:39:09
@connorbaker:matrix.orgconnor (burnt/out) (UTC-8) So your Orin is running JetPack 5, correct?
Where did you find/get cuda_11_4? I'm not aware of that. How did you build OpenCV, from which commit, how did you configure Nixpkgs, etc.
20:13:16
@glepage:matrix.orgGaétan Lepage

RE: effort to migrate towards cuda 13 treewide

magma fails to build with cuda 13. I opened https://github.com/NixOS/nixpkgs/pull/487064 to fix it.

22:37:31
5 Feb 2026
@connorbaker:matrix.orgconnor (burnt/out) (UTC-8)Merging; also made a backport for it since it's worth having there as well01:31:25
@benesim:benesim.orgBenjamin Isbarn

Yes, it's currently on JetPack 5.1.3. I'm using 11cb3517b3af6af300dd6c055aeda73c9bf52c48 from nixpkgs (still 25.05 ;)). As for OpenCV:

          opencv = pkgs.opencv.override {
            enableCudnn = true;
            cudaPackages = pkgs.cudaPackages_11_4;
          };

and I'm using this for the nixpkgs config:

    {
      config = {
        cudaSupport = true;
        allowUnfree = true;
        allowBroken = true;
      };
      overlays = [
        (import rust-overlay)
      ];
    }
07:31:18
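For context, a sketch of how the two fragments above fit together; this is a guess at the overall setup, with `nixpkgs` assumed to point at the commit quoted in the message (note connor's earlier question about whether `cudaPackages_11_4` even exists in that tree):

```nix
# Hypothetical sketch combining the config and the override from the message.
let
  pkgs = import nixpkgs {
    system = "aarch64-linux";
    config = {
      cudaSupport = true;
      allowUnfree = true;
      allowBroken = true;
    };
  };
in
pkgs.opencv.override {
  enableCudnn = true;
  # as quoted in the message; availability of this package set is unverified
  cudaPackages = pkgs.cudaPackages_11_4;
}
```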
@benesim:benesim.orgBenjamin Isbarn

ok, I made some progress; it turns out libcuda has a couple more dependencies on the Jetson:

    ldd /run/opengl-driver/lib/libcuda.so
        linux-vdso.so.1 (0x0000ffffa1fb3000)
        libc.so.6 => /lib/aarch64-linux-gnu/libc.so.6 (0x0000ffffa0796000)
        libnvrm_gpu.so => /usr/lib/aarch64-linux-gnu/tegra/libnvrm_gpu.so (0x0000ffffa0729000)
        libnvrm_mem.so => /usr/lib/aarch64-linux-gnu/tegra/libnvrm_mem.so (0x0000ffffa0711000)
        libm.so.6 => /lib/aarch64-linux-gnu/libm.so.6 (0x0000ffffa0666000)
        libdl.so.2 => /lib/aarch64-linux-gnu/libdl.so.2 (0x0000ffffa0652000)
        librt.so.1 => /lib/aarch64-linux-gnu/librt.so.1 (0x0000ffffa063a000)
        libpthread.so.0 => /lib/aarch64-linux-gnu/libpthread.so.0 (0x0000ffffa0609000)
        libnvrm_sync.so => /usr/lib/aarch64-linux-gnu/tegra/libnvrm_sync.so (0x0000ffffa05f2000)
        libnvrm_host1x.so => /usr/lib/aarch64-linux-gnu/tegra/libnvrm_host1x.so (0x0000ffffa05d1000)
        /lib/ld-linux-aarch64.so.1 (0x0000ffffa1f83000)
        libnvos.so => /usr/lib/aarch64-linux-gnu/tegra/libnvos.so (0x0000ffffa05b1000)
        libnvsocsys.so => /usr/lib/aarch64-linux-gnu/tegra/libnvsocsys.so (0x0000ffffa059d000)
        libstdc++.so.6 => /lib/aarch64-linux-gnu/libstdc++.so.6 (0x0000ffffa03b8000)
        libnvsciipc.so => /usr/lib/aarch64-linux-gnu/tegra/libnvsciipc.so (0x0000ffffa0393000)
        libnvrm_chip.so => /usr/lib/aarch64-linux-gnu/tegra/libnvrm_chip.so (0x0000ffffa037f000)
        libgcc_s.so.1 => /lib/aarch64-linux-gnu/libgcc_s.so.1 (0x0000ffffa035b000)

I set LD_LIBRARY_PATH so that the linker is able to load those, which worked for a small sample C program that I wrote. Need to try this workaround for the "big app" now ;)

10:23:26
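The workaround described above can be sketched as follows, using the Tegra library path from the ldd output; this is a guess at the setup on the device, not a verified recipe:

```shell
# Hypothetical sketch: make the Jetson's Tegra driver libraries (the extra
# dependencies of libcuda.so) visible to the dynamic linker before launching
# the application.
export LD_LIBRARY_PATH="/usr/lib/aarch64-linux-gnu/tegra${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
# On the device, one could then check that every dependency resolves, e.g.:
#   ldd /run/opengl-driver/lib/libcuda.so | grep "not found"
echo "$LD_LIBRARY_PATH"
```

The `${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}` expansion only appends a colon plus the old value when the variable was already set, avoiding a dangling `:` (which would add the current directory to the search path).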


