| 3 Nov 2025 |
Gaétan Lepage | But I confirm that firefox builds fine (no gcc-wrapper triggering disallowedRequisites) with both PRs applied. | 00:26:58 |
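(For context, a minimal sketch of how a check like this is expressed; disallowedRequisites is a standard mkDerivation attribute, but the firefox wiring below is hypothetical, not what nixpkgs actually does:)
(final: prev: {
  firefox = prev.firefox.overrideAttrs (old: {
    # Make the build fail if gcc-wrapper (stdenv.cc) leaks into the runtime closure.
    disallowedRequisites = (old.disallowedRequisites or [ ]) ++ [ final.stdenv.cc ];
  });
})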
Daniel Fahey | CUDA refactor victim fix https://github.com/NixOS/nixpkgs/pull/457870 ready to merge | 13:09:11 |
Ari Lotter | is this a horrible idea, if i need cuda support and don't want to wait hours for builds? :)
(final: prev: {
  python312Packages = prev.python312Packages.overrideScope (pyfinal: pyprev: {
    # Swap the from-source torch for the prebuilt binary wheel.
    torch = pyfinal.torch-bin;
  });
})
| 21:28:33 |
Gaétan Lepage | RE {cudaPackages.nccl, onnxruntime}: remove reference to nvcc in binary:
We need to patch both nccl's libnccl.so and onnxruntime's libonnxruntime_providers_cuda.so for the fix to actually work. | 23:10:06 |
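(A minimal sketch of the kind of fixup being described, assuming nixpkgs' remove-references-to is enough to scrub the embedded store path; the overlay wiring is illustrative, not the actual PR:)
(final: prev: {
  cudaPackages = prev.cudaPackages.overrideScope (cudaFinal: cudaPrev: {
    nccl = cudaPrev.nccl.overrideAttrs (old: {
      nativeBuildInputs = (old.nativeBuildInputs or [ ]) ++ [ final.removeReferencesTo ];
      postFixup = (old.postFixup or "") + ''
        # Scrub the nvcc store path from the shared object; the same
        # treatment would apply to onnxruntime's libonnxruntime_providers_cuda.so.
        remove-references-to -t ${cudaFinal.cuda_nvcc} $out/lib/libnccl.so
      '';
    });
  });
})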
| 4 Nov 2025 |
connor (burnt/out) (UTC-8) | should be fine, but I'd always recommend using pythonPackagesExtensions since it's a little nicer to use | 06:38:57 |
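(A sketch of the pythonPackagesExtensions variant being recommended, which applies the override to every python package set at once rather than just python312Packages:)
(final: prev: {
  pythonPackagesExtensions = prev.pythonPackagesExtensions ++ [
    (pyfinal: pyprev: {
      torch = pyfinal.torch-bin;
    })
  ];
})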
SomeoneSerge (back on matrix) | I still have no explanation for why we cannot seem to reproduce the nvcc reference with saxpy | 15:07:47 |
SomeoneSerge (back on matrix) | It's frustrating | 15:08:03 |
SomeoneSerge (back on matrix) | Elaborated on github, but here for redundancy: the reference in onnxruntime only appears when nvcc is propagated by all these cuda libs, https://github.com/NixOS/nixpkgs/pull/457424#issuecomment-3475736738 | 15:11:32 |
Gaétan Lepage | TIL: python3Packages.torchWithRocm is apparently sensitive to config.cudaSupport. | 20:11:25 |
Ari Lotter | ugh i wish we could compile packages with cudaCapabilities individually per-capability and merge them later, it's such a nightmare adding one new capability level and it causing a huge 8-hour recompile.. | 20:40:40 |
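(For reference, capabilities are a single nixpkgs-wide setting today, which is why there is no per-capability build to merge; the capability list below is illustrative:)
import <nixpkgs> {
  config = {
    allowUnfree = true;
    cudaSupport = true;
    # Appending one new capability invalidates every CUDA derivation at once.
    cudaCapabilities = [ "7.5" "8.6" "8.9" ];
  };
}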
connor (burnt/out) (UTC-8) | These aliases must die, they make my life so difficult | 21:45:22 |
connor (burnt/out) (UTC-8) | Join the club
And it’s not even like we could do a mega-build in an intermediate derivation and then prune unused capabilities according to whatever the user requested, because the amount of generated device code is so large that linking will fail lmao | 21:46:17 |
connor (burnt/out) (UTC-8) | Gaétan Lepage are any of SomeoneSerge (back on matrix)’s comments on https://github.com/NixOS/nixpkgs/pull/457803 actionable or is it good to merge? | 21:48:00 |
connor (burnt/out) (UTC-8) | Also, would you mind reviewing https://github.com/NixOS/nixpkgs/pull/458619? | 21:48:09 |
hacker1024 | This is most likely due to a dependency, but I will also point out that all torch variants are sensitive to it at the moment due to an unconditional version access
https://github.com/NixOS/nixpkgs/blob/b3d51a0365f6695e7dd5cdf3e180604530ed33b4/pkgs/development/python-modules/torch/source/default.nix#L458
| 21:48:19 |
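(An illustrative sketch of that failure mode, not the actual torch expression: a version read that sits outside any cudaSupport guard forces the CUDA package set to be evaluated for every variant:)
{ lib, config, cudaPackages, ... }:
let
  # Unconditional: evaluates cudaPackages even for ROCm/CPU-only builds.
  unguarded = cudaPackages.cudaVersion;
  # Guarded: only touched when CUDA support is actually requested.
  guarded = lib.optionalString config.cudaSupport cudaPackages.cudaVersion;
in { inherit unguarded guarded; }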
Gaétan Lepage | I quadruple-checked.
Both commits of my PR are actually necessary to get an nvcc-free onnxruntime. | 21:48:42 |
Gaétan Lepage | Let me change one comment to mention the bisection | 21:48:57 |
Gaétan Lepage | connor (burnt/out) (UTC-8), I reviewed nccl-tests. Feel free to merge | 22:08:11 |
Ari Lotter | i'm trying to fix this exact linker error right now 😭 trying to get flash-attn built for cuda capabilities 7.5 thru 12.0a, and i'm so stuck, and every rebuild with an attempted fix takes ~2 hours... any ideas? 😭 | 22:17:28 |
Ari Lotter | maybe we're just screwed :) | 22:20:25 |
Robbie Buxton | Which flash attention version | 22:24:21 |
Robbie Buxton | V2 or v3 | 22:24:27 |
Robbie Buxton | And from what git tag? | 22:24:51 |
Ari Lotter | v2, from tag v2.8.2 | 22:29:50 |
Robbie Buxton | I think there is currently a PR open in nixpkgs to add this, is that the one you’re building? | 22:30:41 |
Ari Lotter | oh neat, no | 22:31:37 |
Ari Lotter | let me compare my derivation with that one | 22:31:40 |
Ari Lotter | ok yeah, decently similar. difference is i'm building against cutlass 4.0 instead of 4.1, and.. somehow my deps list is wayy simpler, yet the build works (on previous versions of my derivation, pre updating CUDA)? very strange.. | 22:35:13 |
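(A hedged sketch of the kind of cutlass pin being compared; the attribute name, tag, and hash are placeholders rather than the actual PR's values:)
(final: prev: {
  cutlass = prev.cutlass.overrideAttrs (old: rec {
    version = "4.1.0";
    src = final.fetchFromGitHub {
      owner = "NVIDIA";
      repo = "cutlass";
      rev = "v${version}";
      hash = final.lib.fakeHash; # placeholder; substitute the real hash
    };
  });
})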