| NixOS CUDA | 308 Members |
| CUDA packages maintenance and support in nixpkgs | https://github.com/orgs/NixOS/projects/27/ | https://nixos.org/manual/nixpkgs/unstable/#cuda | 60 Servers |
| Sender | Message | Time |
|---|---|---|
| 27 Oct 2024 | ||
In reply to @msanft:matrix.org: Yeah, we don't link gcc directly in nvcc but provide it independently via the overridden stdenv | 09:40:39 | |
In reply to @glepage:matrix.org: A horror security-wise though xD | 11:08:32 | |
connor (he/him) (UTC-7): hiya, any idea why there's nothing like find_package(protobuf) in onnx/onnx before it tests for TARGET protobuf::libprotobuf[-lite]? Is it supposed to inherit these variables from onnxruntime or something? | 15:32:43 | |
| 28 Oct 2024 | ||
| I would have thought that, but ONNX is also built directly and not just as a dependency of onnxruntime, so I don't think it's because they intend ONNX to be built solely as a subproject and leave configuration up to the parent. I don't understand how they expect it to be magically detected… but it works for me from what I remember (https://github.com/ConnorBaker/cuda-packages/blob/main/cudaPackages-common/onnx.nix). Although I'm doing the cursed C++/Python install and had to carefully choose dependencies from the normal package set and from Python packages 🤷♂️ | 06:39:55 | |
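For context, the pattern being discussed above — a project testing whether a parent build already defined the Protobuf targets before resolving them itself — would conventionally look something like this (a hedged sketch, not ONNX's actual CMakeLists.txt; the question above is precisely that ONNX appears to test for the target without a visible `find_package` fallback):

```cmake
# If a parent project (e.g. one vendoring this repo via add_subdirectory)
# has already created the imported targets, reuse them; otherwise resolve
# Protobuf ourselves.
if(NOT TARGET protobuf::libprotobuf)
  find_package(Protobuf REQUIRED)
endif()

target_link_libraries(onnx PRIVATE protobuf::libprotobuf)
```

When only the `if(TARGET ...)` check exists without the `find_package` branch, the build works solely when something earlier in the configure run (a parent project, a toolchain file, or an injected `CMAKE_PROJECT_INCLUDE`) has already defined the targets.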
| 29 Oct 2024 | ||
| 30 Oct 2024 | ||
In reply to @connorbaker:matrix.org: Hey, wanna publish a review on exposing extendMkDerivationArgs? | 15:24:02 | |
| Could help with upstreaming the tricks | 15:24:30 | |
Btw at NixCon Tom brought up moving cudaSupport et al. into the system, which I think was raised here at least on a few occasions | 15:26:05 | |
| Maaaybe it's time? :) | 15:26:11 | |
We could even skip exposing nixpkgsFun in the public api then: could be just pkgsCross.cuda | 15:27:51 | |
GPU tests? That's buildPlatform with cuda | 15:28:50 | |
In reply to @ss:someonex.net: I'd really like capabilities to be exposed as part of a potential pkgsCross, because they are effectively a platform target for the purposes of codegen; thoughts? | 19:02:51 | |
In reply to @connorbaker:matrix.org: That's exactly the idea | 20:19:26 | |
| I don't think we should be extending the triples, but it makes sense in the parsed system, I think | 20:20:19 | |
| 31 Oct 2024 | ||
In reply to @connorbaker:matrix.org: Hi connor (he/him) (UTC-7), first off, the Nix CUDA team is doing awesome work! I have spent the last week reading everything CUDA-related in nixpkgs and watching NixCon talks :) I am also trying to run onnxruntime with the tensorrt backend on a Jetson device with NixOS. Really looking forward to how you might solve the puzzle there. I am new to Nix, therefore I hoped it would be simpler for me to patch the
Is this doable like this, do you recommend this? I find it amazing that | 15:44:26 | |
| I have no idea if it'll work, but please do try and let me know! I'm not familiar with the Jetson zoo, or how the binary wheels are packaged. It sounds reasonable — the wheel should have libraries built to target Jetson devices which link against CUDA libraries, and using autoPatchelfHook and including CUDA libraries in buildInputs should patch the libraries so they resolve to the ones provided by Nixpkgs… I don't fully understand what cuda_compat does, but my understanding is that it serves as a shim between newer CUDA libraries and an older CUDA driver on the Jetson? At any rate, try it out and let me know! | 16:16:51 | |
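The approach described above can be sketched as a Nix expression. This is a hedged illustration only, untested on Jetson: the wheel filename, version, and the exact set of cudaPackages attributes are assumptions and would need to match the actual wheel from the Jetson zoo.

```nix
{ python3Packages, autoPatchelfHook, cudaPackages }:

python3Packages.buildPythonPackage {
  pname = "onnxruntime-gpu";
  version = "1.18.0"; # hypothetical; match the downloaded wheel
  format = "wheel";

  # Hypothetical local copy of the prebuilt aarch64 wheel from the Jetson zoo.
  src = ./onnxruntime_gpu-1.18.0-cp310-cp310-linux_aarch64.whl;

  # autoPatchelfHook rewrites the RPATH of the wheel's .so files so their
  # NEEDED libraries resolve against the buildInputs below.
  nativeBuildInputs = [ autoPatchelfHook ];

  # CUDA libraries the patched libraries should link against; the attribute
  # names here follow the nixpkgs cudaPackages set but are assumptions for
  # this sketch.
  buildInputs = with cudaPackages; [
    cuda_cudart
    libcublas
    cudnn
    tensorrt
  ];
}
```

If autoPatchelfHook reports unresolved dependencies at build time, the missing libraries name exactly what still has to be added to buildInputs.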
| Ok awesome, thanks for the quick reply! At least there is not something completely wrong with the approach. I will let you know if I can make it work 👌 Did anyone else get onnxruntime working with the tensorrt execution provider on a jetson? | 16:24:25 | |
| I can’t remember if I managed to get it building on Jetson, I got distracted and started doing work on the nix interpreter | 16:26:15 | |
| Also, which Jetson generation are you using? Please let it be at least Xavier :( | 16:27:48 | |
It's the Orin NX, luckily :) I am free to choose a different platform; I am still evaluating what is best for our AI-on-the-edge use case. From what you say it seems like it's not going to be impossible, so I will work my way through it. I hope I can find a good solution; if I do, I will try to contribute it to nixpkgs or jetpack-nixos | 16:31:01 | |
| I am sorry, `nix run github:SomeoneSerge/pkgs#pkgsCuda.some-pkgs-py.stable-diffusion-webui` doesn't work anymore | 17:06:45 | |
In reply to @search-sense:matrix.org: Yeah, didn't keep track of it, must've broken with a nixpkgs bump | 17:13:06 | |
| I unfortunately lost the working commit for this repo | 17:25:04 | |
In reply to @ss:someonex.net: Was this in reference to an existing PR or something new? I apologize, I've been really scattered the last two weeks :l | 17:57:28 | |