!eWOErHSaiddIbsUNsJ:nixos.org

NixOS CUDA

290 Members
CUDA packages maintenance and support in nixpkgs | https://github.com/orgs/NixOS/projects/27/ | https://nixos.org/manual/nixpkgs/unstable/#cuda



7 Mar 2025
@mdietrich:matrix.orgmdietrich Wait a minute, I am slightly confused as llama-cpp seems to actually have cuda support now that I rebuilt a couple of minutes ago. It just does not use my GPU when running inference even though it reports it as visible and usable. Maybe a configuration mistake on my side (although I am using the default NixOS service). I'll look into open-webui and onnxruntime now... 13:27:31
@mdietrich:matrix.orgmdietrichYes, onnxruntime does recompile, as well as python3.12-torch-2.5.1. I'm checking the hashes now...13:33:36
@mdietrich:matrix.orgmdietrich

I am definitely building onnxruntime myself even though I get:

> nix path-info --override-input nixpkgs github:NixOS/nixpkgs/9f41a78ead0fbe2197cd4c09b5628060456cd6e3 --derivation .\#nixosConfigurations.$(hostname).pkgs.onnxruntime
• Updated input 'nixpkgs':
    'github:nixos/nixpkgs/32fb99ba93fea2798be0e997ea331dd78167f814?narHash=sha256-ozoOtE2hGsqh4XkTJFsrTkNxkRgShxpQxDynaPZUGxk%3D' (2025-02-21)
  → 'github:NixOS/nixpkgs/9f41a78ead0fbe2197cd4c09b5628060456cd6e3?narHash=sha256-WWXRCTOWcKvtzqzVgBMON0/TWcFMyWq831HQUITE4rs%3D' (2025-02-21)
/nix/store/a22vqi9d0ndhlcy1yxw4m3ir4z7ckfrg-onnxruntime-1.20.1.drv

Which is the same hash as the Hydra build's store path.

13:48:48
@mdietrich:matrix.orgmdietrichI get the same hash for pytorch locally and in hydra as well!13:56:11
@ss:someonex.netSomeoneSerge (back on matrix) And if you nix build --override-input nixpkgs github:NixOS/nixpkgs/9f41a78ead0fbe2197cd4c09b5628060456cd6e3 .#nixosConfigurations.$(hostname).pkgs.onnxruntime? 13:59:42
@mdietrich:matrix.orgmdietrichThen I'm building nccl and cudnn-frontend for some reason?14:15:15
@ss:someonex.netSomeoneSerge (back on matrix)Well this certainly shouldn't be happening if the hashes indeed match14:21:40
@ss:someonex.netSomeoneSerge (back on matrix)Which hydra eval did you refer to?14:22:00
@ss:someonex.netSomeoneSerge (back on matrix)* Which hydra eval did you refer to, can you link it?14:22:06
@mdietrich:matrix.orgmdietrich

Sorry, I am back now. It seems that my setup had complicated things: I was trying stuff on a laptop while the actual setup was on another host (the one with the GPU), but I did use the correct hostname for the workstation, which should (that is the whole point of Nix, right?) lead to the same build. (Both systems are x86_64.) However, I was also trying to enable cudaSupport both globally and via per-package overrides, which might have led to me making a mistake, I don't know. All I can say is that

nix build --override-input nixpkgs github:NixOS/nixpkgs/9f41a78ead0fbe2197cd4c09b5628060456cd6e3 .#nixosConfigurations.$(hostname).pkgs.onnxruntime

now just downloads onnxruntime from the cache, which is the expected behaviour. I'm going to check without overridden input and pytorch again and then with the whole system.
Another question though: how would I override cudaSupport for a single package and its dependencies? llama-cpp is easy (llama-cpp.package = pkgs.llama-cpp.override { cudaSupport = true; }), but open-webui itself does not have a cudaSupport option, while onnxruntime and pytorch do.

15:27:49
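For the per-package case asked about above, a sketch of the override pattern (a hypothetical NixOS module fragment; the exact option path for your service may differ):

```nix
# Sketch only: enable CUDA for llama-cpp alone via a package override,
# leaving the rest of the package set untouched.
{ pkgs, ... }:
{
  services.llama-cpp = {
    enable = true;
    package = pkgs.llama-cpp.override { cudaSupport = true; };
  };
}
```

Packages without their own cudaSupport flag (like open-webui) can only pick up CUDA through their dependencies, which is why a per-package override does not reach them.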
@mdietrich:matrix.orgmdietrich Building without the overridden nixpkgs input forces a rebuild (I used plain nix build .#nixosConfigurations.$(hostname).pkgs.onnxruntime) 15:29:00
@mdietrich:matrix.orgmdietrichFor earlier, I meant the derivation store path of https://hydra.nix-community.org/build/3297277#tabs-details15:39:13
@mdietrich:matrix.orgmdietrichOk, python3.12-torch-2.5.1 is fetched from the community cache again iff I override the nixpkgs input again to the same hash as in https://hydra.nix-community.org/build/3534138#tabs-buildinputs15:43:38
@mdietrich:matrix.orgmdietrich As in nix build --override-input nixpkgs github:NixOS/nixpkgs/e9b0ff70ddc61c42548501b0fafb86bb49cca858 .#nixosConfigurations.$(hostname).pkgs.python3Packages.pytorch 15:43:55
@mdietrich:matrix.orgmdietrichIf I don't, then it rebuilds15:44:23
@mdietrich:matrix.orgmdietrichDoes that mean I have to find the right commit in nixpkgs that is somewhere between the onnxruntime hydra build, pytorch hydra build and my system that lets me fetch all of them?15:45:26
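Pinning the flake's nixpkgs input to the same revision a Hydra jobset evaluated is one way to keep everything substitutable. A hypothetical flake.nix fragment, reusing the revision from the command above:

```nix
{
  # Pin nixpkgs to the exact revision the nix-community Hydra jobset
  # built against, so CUDA packages resolve to already-cached store paths.
  inputs.nixpkgs.url = "github:NixOS/nixpkgs/e9b0ff70ddc61c42548501b0fafb86bb49cca858";
}
```

This trades freshness for cache hits: updating the pin past the jobset's evaluation will again force local rebuilds until Hydra catches up.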
@mdietrich:matrix.orgmdietrichOooorrr I just need to update my nixpkgs again. Weird, it was not that old, only a week or so. I guess that fixed it (apart from llama-cpp not actually using the GPU, but that is another issue). Thanks for the debugging help, I definitely learned new ways of debugging such problems here!15:52:49
@mdietrich:matrix.orgmdietrichWell, not so fast: I now get a build error of /nix/store/8msyislppkkr4dxzhir6qd2vc6qg23am-python3.12-rapidocr-onnxruntime-1.4.4.drv, which seems unrelated to https://hydra.nixos.org/build/291903538, which fails because of SDL's build failure (a dependency for some reason). I get a segfault somewhere in python in the pytestCheckPhase of the build.16:55:43
@ss:someonex.netSomeoneSerge (back on matrix)

Another question though: How would I override cudaSupport for a single package and its dependencies?

That's why we use import nixpkgs { config.cudaSupport = true; }

20:55:11
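In NixOS module form, the global switch quoted above is (a minimal sketch, assuming the standard nixpkgs.config option):

```nix
# Applies cudaSupport across the whole package set, so dependencies
# such as onnxruntime and pytorch are built with CUDA as well.
{ nixpkgs.config.cudaSupport = true; }
```

Note that flipping this rebuilds (or re-fetches) every CUDA-aware package in the closure, not just the one you care about.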
8 Mar 2025
@hexa:lossy.networkhexa @connorbaker:matrix.org I don't suppose your talk was recorded? 12:23:13
@connorbaker:matrix.orgconnor (he/him)It probably was, just taking them a while to upload. If not, the slides are available here: https://github.com/ConnorBaker/2025-planet-nix19:57:32
@ruroruro:matrix.orgruroUnless I am misunderstanding something, looking at the Southern California Linux Expo YouTube channel, it seems that the only Nix-related streams from the 6th and 7th of March are for Room 101. And the "Evaluating the Nix Evaluator" talk was scheduled for Room 2 (which I am assuming is actually Room 102)? I wonder...20:42:23
@justbrowsing:matrix.orgKevin Mittman (UTC-8)
In reply to @ruroruro:matrix.org
Unless I am misunderstanding something, looking at the Southern California Linux Expo YouTube channel, it seems that the only Nix-related streams from the 6th and 7th of March are for Room 101. And the "Evaluating the Nix Evaluator" talk was scheduled for Room 2 (which I am assuming is actually Room 102)? I wonder...
There were two room tracks for PlanetNix
23:57:15
@justbrowsing:matrix.orgKevin Mittman (UTC-8)It usually takes a month or so for the complete set of talks to be cut, edited and uploaded23:58:46
9 Mar 2025
@justbrowsing:matrix.orgKevin Mittman (UTC-8)oh TIL they have livestreams00:23:32
@connorbaker:matrix.orgconnor (he/him)Room 2 for Planet Nix was conference room 104, which I didn’t see on the livestreams when I checked this morning 🤷‍♂️02:25:53
@ruroruro:matrix.orgruro SomeoneSerge (UTC+U[-12,12]): sorry to bother you with this again, but could you please take a final look at https://github.com/NixOS/nixpkgs/pull/379768#discussion_r1977975372 and let me know if you prefer setting badPlatformsConditions/hydraPlatforms for broken packages or filtering them in release-cuda.nix? 17:25:00
@ruroruro:matrix.orgruroAlso, apparently the "Evaluation Errors" tab missing for individual evaluations isn't a nix-community specific problem. The same thing happens with the official Hydra instance, so I opened an issue upstream: https://github.com/NixOS/hydra/issues/1453.17:28:52


