| 22 Aug 2025 |
connor (he/him) | Sure! I’ll let Serge know as well since I have him on Signal | 00:52:56 |
stick | connor (he/him) (UTC-7): jfyi i tried to update cusparselt to 0.8.0 using your redist scripts, but it failed - it seems the scripts need to be adjusted somehow | 09:42:02 |
| 23 Aug 2025 |
stick | Found the issue
New releases contain files for both cuda12 and cuda13, introducing another level of nesting in the json
See
https://developer.download.nvidia.com/compute/cusparselt/redist/redistrib_0.7.1.json
vs
https://developer.download.nvidia.com/compute/cusparselt/redist/redistrib_0.8.0.json | 12:18:03 |
stick | I manually edited the json to adhere to the old scheme and will create a cusparselt update PR. It is not ideal, and i suspect the upcoming cuda releases will use the newer scheme too | 12:23:03 |
stick | cusparselt PR in https://github.com/NixOS/nixpkgs/pull/436186 | 12:42:56 |
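For context, the workaround above amounts to dropping the new per-CUDA-major nesting. Below is a minimal sketch of that normalisation, assuming the 0.8.0 manifest nests a cuda12/cuda13 attribute set under each platform key (as the two manifests linked above suggest); it is illustrative only, not the actual redist update script.

```nix
# Hypothetical helper, not the actual redist tooling: normalise a new-style
# redistrib manifest (which nests a cuda12/cuda13 level under each platform,
# as in redistrib_0.8.0.json) back to the old flat 0.7.1 layout by keeping a
# single CUDA major version.  The schema details are assumptions based on the
# two manifests linked above.
let
  manifest = builtins.fromJSON (builtins.readFile ./redistrib_0.8.0.json);

  # "linux-x86_64" = { cuda12 = {...}; cuda13 = {...}; }  ->  the cuda12 body
  pickCudaMajor = cudaKey: value:
    if builtins.isAttrs value && builtins.hasAttr cudaKey value
    then builtins.getAttr cudaKey value
    else value;  # per-package strings like "name"/"version" pass through

  flattenPackage = cudaKey: pkg:
    if builtins.isAttrs pkg
    then builtins.mapAttrs (_platform: pickCudaMajor cudaKey) pkg
    else pkg;    # top-level strings like "release_date" pass through

in
  builtins.toJSON (builtins.mapAttrs (_name: flattenPackage "cuda12") manifest)
```

Evaluating something like this with `nix-instantiate --eval --strict` should yield a JSON string in the old 0.7.1 shape that the existing update scripts can consume.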
| Lun joined the room. | 21:07:38 |
Lun | # TODO: 2. We should probably abandon attributes such as `torchWithCuda` (etc.)
# as they routinely end up consuming the wrong arguments\
# (dependencies without cuda support).
# Instead we should rely on overlays and nixpkgsFun.
# (@SomeoneSerge)
_tritonEffective ?
Is there an issue tracking dropping WithCuda aliases/a discussion thread about this somewhere?
| 21:43:40 |
| 24 Aug 2025 |
connor (he/him) | Not that I know of; probably hasn’t been made due to lack of time. Feel free to make one, I can add the CUDA tag to it so it doesn’t get lost. (Well, it’ll be lost in our giant backlog, but whatever.) | 18:31:36 |
connor (he/him) | Kevin Mittman SomeoneSerge (Ever OOMed by Element): I don’t think I ever set something up, but Kevin, we should definitely talk about the database application Serge has been building to track changes and information about CUDA binaries/releases | 18:34:00 |
Kevin Mittman (EOY sleep) | stick: https://github.com/NVIDIA/build-system-archive-import-examples/issues/6 | 18:45:27 |
| 25 Aug 2025 |
connor (he/him) | If someone could clear up what the goals are of the following projects and if/how they relate to each other I’d appreciate it:
- warp (https://github.com/NVIDIA/warp)
- CuTe (https://docs.nvidia.com/cutlass/media/docs/cpp/cute/00_quickstart.html)
- tilus (https://github.com/NVIDIA/tilus)
- CCCL Python libraries (https://nvidia.github.io/cccl/python/index.html)
Also, was cuTile released as CuTe, or is that something else?
| 14:34:19 |
apyh | quick question for y'all - since 427a439 was merged, setuptools is now at v80, but from my understanding Torch only builds with setuptools <80... how does anything torch-related still work in nixpkgs, and how does cuda torch work at all? | 22:23:24 |
| 26 Aug 2025 |
| Bryan Honof joined the room. | 11:45:34 |
Bryan Honof | Hey all, I have created this tracking issue for bumping CUDA to version 13 https://github.com/NixOS/nixpkgs/issues/437087, not sure if that's the right way to do it. :) | 11:46:39 |
connor (he/him) | I mean if it builds on master I don’t think the CUDA variant should fail. Idk what’s built currently through nix-community. | 16:51:05 |
connor (he/him) | Will try to take a look later Bryan. Not sure how well the scripts work with 13. | 16:51:06 |
apyh | i'm trying to build master but yaknow, gotta build the world, starting with llvm | 21:23:25 |
| 27 Aug 2025 |
Kevin Mittman (EOY sleep) | There are a fair amount of changes in 13.0, is there already a thread? | 01:45:33 |
hexa | torch is cached, setuptools upgrades are rarely an issue | 02:11:52 |
hexa | projects tend to pin it proactively and we tend to unpin it | 02:12:11 |
apyh | In reply to @hexa:lossy.network ("torch is cached, setuptools upgrades are rarely an issue"): I'm not super experienced with this - i ask because I'm trying to build torch 2.9 nightly myself, and i can't just bump all versions with the submodule unroll thing, because it fails to build with a newer setuptools. how is torch "cached" such that it doesn't need to build with the setuptools in nixpkgs? | 02:20:20 |
hexa | https://hydra.nix-community.org/job/nixpkgs/cuda/python3Packages.torch.x86_64-linux | 02:20:54 |
hexa | but yeah, that version is outdated | 02:21:14 |
hexa | why newer versions wouldn't work with a newer setuptools but older versions do is beyond me 🙂 | 02:21:31 |
apyh | yeah no i don't think torch 2.7.1 *should* build with newer setuptools.. | 02:57:01 |
apyh | oh, i see, 2.7 doesn't have a max version on setuptools (https://github.com/pytorch/pytorch/blob/release/2.7/pyproject.toml) but 2.8 does (https://github.com/pytorch/pytorch/blob/release/2.8/pyproject.toml) | 02:58:29 |
apyh | i guess however it's built in nixpkgs can ignore that constraint, since it somehow will still run setup.py | 02:58:57 |
hexa | they're pinning to a version with distutils exposed, we can just provide distutils 🤷 | 03:25:52 |
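A rough sketch of what "just provide distutils" plus relaxing the pin could look like for an out-of-tree build of a newer torch. The sed pattern, the exact pin text in torch's pyproject.toml, and the use of the standalone `distutils` package are assumptions for illustration, not what nixpkgs actually ships:

```nix
# Hedged sketch for building a newer torch against setuptools >= 80: strip
# the upstream "setuptools<80" upper bound at patch time and provide the
# standalone distutils package explicitly (per the discussion above, torch
# pins a setuptools that still exposes distutils).
final: prev: {
  pythonPackagesExtensions = prev.pythonPackagesExtensions ++ [
    (pyFinal: pyPrev: {
      torch = pyPrev.torch.overridePythonAttrs (old: {
        postPatch = (old.postPatch or "") + ''
          # Relax any quoted "setuptools<..." style requirement to plain
          # "setuptools"; the exact pin string upstream is an assumption.
          sed -i -E 's/"setuptools[<>=!~][^"]*"/"setuptools"/' pyproject.toml
        '';
        # Make the standalone distutils package available at build time.
        build-system = (old.build-system or [ ]) ++ [ pyFinal.distutils ];
      });
    })
  ];
}
```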