NixOS CUDA (!eWOErHSaiddIbsUNsJ:nixos.org)

281 Members · 58 Servers

Topic: CUDA packages maintenance and support in nixpkgs | https://github.com/orgs/NixOS/projects/27/ | https://nixos.org/manual/nixpkgs/unstable/#cuda



24 Nov 2025
[16:24:13] SomeoneSerge (back on matrix): No details in this one, but here's what I consider worth focusing on: https://md.someonex.net/s/ik86rsZp7
[16:27:47] SomeoneSerge (back on matrix): Go ahead and open a PR to nixos-homepage. If anyone has objections they can voice them there, but I don't expect there to be any.
[16:40:12] yorik.sar: Is there some place where these points are expanded? For example, what does "e.g. cudb" mean there?
[16:40:18] yorik.sar: Ok, will do!
[16:56:12] SomeoneSerge (back on matrix): A scattering of PRs and issues on GitHub, but say, are you free to join us tomorrow at 21:15 CET (20:15 UTC)? We try to sync every Tuesday by video.
[17:00:21] yorik.sar: Sure, I could join.
[17:09:47] SomeoneSerge (back on matrix): connor (burnt/out) (UTC-8), Gaétan Lepage: doing the same time as last week, right? (Sent an update; Connor, yours seems to auto-reject.)
[17:31:54] connor (burnt/out) (UTC-8): Huh, I don't even see anything in my junk mail :/
[17:34:43] connor (burnt/out) (UTC-8):
Two thoughts:
  • Helion backend for einops
  • Optuna integration with Helion to allow for persistent studies (e.g., if the seed is fixed and the number of generations is increased, the optimization should resume from where it stopped rather than start an entirely new optimization)
[17:36:11] connor (burnt/out) (UTC-8): Serge, I'm working on a matrix of the behavior of propagating and consuming setup hooks through packages.
[17:59:02] SomeoneSerge (back on matrix): I think the key bit is that buildInputs are already (0, 0) away from the current derivation (because we're building for (0, 1)).
[18:00:45] SomeoneSerge (back on matrix): But for the PR we should just drop the offset-checking logic from the hook, imo, and avoid propagating an extra patchelf.
[19:28:30] connor (burnt/out) (UTC-8): Created this table by building everything in the scope I added and assembling the information from the logs (no build succeeds, since I don't produce an out output): https://github.com/ConnorBaker/nix-propagation-behavior/blob/1afbd58f2af1468d4564722b7180cc4d89967ef3/README.md
[19:28:35] connor (burnt/out) (UTC-8):
Using the table, our "users" are packages consuming the stub outputs, which we would expect to happen from nativeBuildInputs or buildInputs (or both).
If used in nativeBuildInputs, we need to install the hook in propagated-host-host-deps (for (-1, -1)) or propagated-build-inputs (for (-1, 0)).
If used in buildInputs, we need to install the hook in propagated-build-build-deps (for (-1, -1)) or propagated-native-build-inputs (for (-1, 0)).
Since we're modifying binaries in place through patchelf, the (-1, -1) offsets make more sense to me, but I can install to propagated-build-inputs and propagated-native-build-inputs instead for the (-1, 0) offsets.
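[The layout described in the message above can be sketched concretely. This is an illustrative simulation only: a mktemp directory stands in for a real $out, and the file names simply follow the nix-support dependency-file layout the message names; nothing here is produced by an actual nixpkgs build.]

```shell
# Simulated package output; in a real build this would be $out.
out=$(mktemp -d)
mkdir -p "$out/nix-support"

# The stub output ships a setup hook for its consumers to source.
cat > "$out/nix-support/setup-hook" <<'EOF'
echo "stub setup hook sourced"
EOF

# For consumers using this package from buildInputs that should see it at
# the (-1, 0) offsets, append the package's path to
# propagated-native-build-inputs. Per the discussion, this must happen in
# postFixup rather than postInstall, because fixupPhase regenerates these
# dependency files and would clobber entries written earlier.
echo "$out" >> "$out/nix-support/propagated-native-build-inputs"

cat "$out/nix-support/propagated-native-build-inputs"
```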
[19:30:10] connor (burnt/out) (UTC-8):
I think the "wonky" behavior I had seen previously (essentially, the hook wasn't sourced when put in propagated-build-inputs or propagated-native-build-inputs) can be explained by the fact that I was installing the hook during postInstall. Since the dependency files are replaced in fixupPhase, those entries were clobbered.
Now that I'm installing them in postFixup, it's not an issue.
[21:35:10] connor (burnt/out) (UTC-8): https://github.com/NixOS/nixpkgs/pull/459416 has been updated; please consider merging it.
[22:32:54] SomeoneSerge (back on matrix):
Nice, thanks! I had one like this somewhere too; clearly haven't looked at it in too long.
But yeah, it must be (( "$relHostOffset" <= "$relTargetOffset" )) || continue
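[The guard quoted above can be sketched in isolation. A minimal, hypothetical stand-alone version: the relHostOffset/relTargetOffset values are hard-coded stand-ins for the offsets a setup hook would compute for a dependency relative to the current derivation, run outside any real stdenv. The hook body only acts when the host offset is at most the target offset; the quoted form uses || continue because it runs inside a loop over dependencies.]

```shell
# Stand-in offsets; in a real hook these would be derived from the
# stdenv-provided offset variables for the dependency being processed.
relHostOffset=0
relTargetOffset=1

# The guard: skip work unless hostOffset <= targetOffset.
if (( relHostOffset <= relTargetOffset )); then
  echo "hook applies at offsets ($relHostOffset, $relTargetOffset)"
else
  echo "hook skipped"
fi
```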
25 Nov 2025
[02:37:33] connor (burnt/out) (UTC-8): https://github.com/NixOS/nixpkgs/pull/464779
[04:45:45] connor (burnt/out) (UTC-8): Hope to have the TensorRT version bump up tomorrow... but that also involves bumping onnx-tensorrt, tensorrt-oss, and a few other things simultaneously :F
[04:50:08] connor (burnt/out) (UTC-8):
While building every combination of TensorRT 10.x and CUDA release on x86_64-linux to verify I could remove cudnn and nvrtc as buildInputs, I prodded Claude to make the Optuna+Helion thing I mentioned earlier: https://github.com/ConnorBaker/helion/blob/claude/enhance-helion-autotuner-01V2JcV61tYZPi7y6cuZpE54/helion/autotuner/optuna_search.py
It doesn't work with the version of Helion we have in Nixpkgs because of some API changes to parallel_benchmarking, but it seems like what I wanted. Haven't had the chance to mess with it though :l
[15:01:41] connor (burnt/out) (UTC-8): https://github.com/NixOS/nixpkgs/pull/464947
[15:52:14] connor (burnt/out) (UTC-8): https://github.com/NixOS/nixpkgs/pull/464957
[20:23:29] connor (burnt/out) (UTC-8): https://github.com/NixOS/nixpkgs/pull/465047
[20:24:55] Gaétan Lepage: https://github.com/NixOS/nixpkgs/pull/450587
26 Nov 2025
[02:57:01] connor (burnt/out) (UTC-8): SomeoneSerge (back on matrix): I updated https://github.com/NixOS/nixpkgs/pull/459416, please merge 🫩
[15:37:52] connor (burnt/out) (UTC-8): 🤦‍♂️
[15:38:34] connor (burnt/out) (UTC-8): https://github.com/onnx/onnx-tensorrt/pull/1043 moved from the pycuda package to the cuda Python package and documented it nowhere I've been able to find.


