| 25 Feb 2026 |
Hugo | HedgeDoc does not allow me to copy the complete log | 15:21:26 |
hexa (UTC+1) | https://bpa.st | 15:21:35 |
Hugo | https://bpa.st/5CQA | 15:22:04 |
Hugo | * Related to PyTorch but not to 2.10: it looks like my CI has been unable to build it since yesterday morning (it was still python3.13-torch-2.9.1) 🤔 Torch 2.10.0 does not solve that issue.
I build with:
run: cd nixpkgs-master
     nix-build -I nixpkgs=. --arg config '{ allowUnfree = true; cudaSupport = true; openclSupport = true; rocmSupport = false; }' --option allow-import-from-derivation false --max-jobs 1 -A python313Packages.torch
Here is the last log (after 3h12 of build). https://bpa.st/5CQA
Any idea what could be the cause?
| 15:22:17 |
Hugo | Edited my message with this link as well. | 15:22:31 |
Gaétan Lepage | Are you sure that you're not simply OOMing? | 19:43:36 |
Hugo | I investigated my metrics; it looks like an OOM for 2.10, but not for the latest failure of 2.9.1 🤔 | 20:28:31 |
Hugo | I started a build with fewer cores, will see where that goes (most likely from ~3h10 to ~5 hours) | 20:29:13 |
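A minimal sketch of what "a build with fewer cores" could look like, based on the nix-build command quoted earlier; the `--cores 16` value is an arbitrary assumption for illustration, not a figure stated in the discussion:

```shell
# Sketch: same build as above, but capping per-derivation parallelism
# with --cores to lower peak memory use. 16 is a placeholder value.
cd nixpkgs-master
nix-build -I nixpkgs=. \
  --arg config '{ allowUnfree = true; cudaSupport = true; openclSupport = true; rocmSupport = false; }' \
  --option allow-import-from-derivation false \
  --max-jobs 1 --cores 16 \
  -A python313Packages.torch
```

Trading `--cores` down lengthens the build (as noted above) but bounds how many compiler processes compete for RAM at once.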
| 26 Feb 2026 |
| Theuni joined the room. | 07:05:25 |
Theuni | 👋 | 07:09:09 |
Theuni | Gaétan Lepage: if you need help testing the vllm 0.16 branch, i can do that. | 07:27:08 |
Gaétan Lepage | Hi! Thanks :)
I rebased the PR yesterday: https://github.com/NixOS/nixpkgs/pull/490175
I couldn't do much as its dependency xgrammar is broken. Once I've properly bumped the {c,q}utlass et al. overrides, then sure, help with testing will be welcome :) | 07:57:01 |
Gaétan Lepage | From my initial testing, 2.10 is significantly heavier to build.
It took me >1h instead of 45 min to build with cudaSupport on a 64-thread Threadripper | 07:58:09 |
| 28 Feb 2026 |
| Kevin Mittman (jetlagged/UTC+8) changed their display name from Kevin Mittman (UTC-8) to Kevin Mittman (UTC+8). | 05:22:22 |
Gaétan Lepage | RE: vllm update to 0.16.0
Although vllm 0.16.0 initially depended on torch==2.10.0, this changed.
Indeed, upstream force-pushed the tag to revert this change and go back to requiring torch==2.9.1.
I learnt about this by chance while discussing it with a coworker. | 10:28:07 |
| 2 Mar 2026 |
| Kevin Mittman (jetlagged/UTC+8) changed their display name from Kevin Mittman (UTC+8) to Kevin Mittman (jetlagged/UTC+8). | 01:12:50 |
hexa (UTC+1) | @SomeoneSerge (back on matrix) https://hydra.nixos.org/build/323045198 👋 | 15:28:08 |
| Maren Horner joined the room. | 20:37:42 |
Gaétan Lepage | https://github.com/nixos-cuda/infra/commit/eeb5eb95d8eb0abba5fa50d14500c0d20a2e5d12
🥲 | 23:46:37 |
Lun | 😭 | 23:47:25 |
hexa (UTC+1) | Yeah, I'm afraid unless we GC harder this is going to be a tough sell. | 23:52:02 |
| 3 Mar 2026 |
SomeoneSerge (matrix works sometimes) | =\ | 16:23:53 |
caniko | any chance to build gimp and handbrake? | 19:53:02 |
| 5 Mar 2026 |
kaya 𖤐 | Not sure if it's been mentioned here before, but for anyone affected by flash-attn builds OOM-ing: I noticed an upstream patch that tries to counter it: https://github.com/Dao-AILab/flash-attention/pull/2079
Might be possible to apply it to the nix package 🤔 | 13:26:33 |
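A hedged sketch of how that upstream patch might be tried against the nixpkgs package: everything here is an assumption (the `python3Packages.flash-attn` attribute path, that the PR's `.patch` URL applies cleanly, and the deliberately empty hash placeholder), not a tested fix:

```shell
# Hypothetical: test-build flash-attn with the upstream OOM patch from
# PR #2079 applied via an ad-hoc override. Attribute names are guesses;
# the empty hash must be replaced with the value Nix reports on first fetch.
nix-build -E '
  with import <nixpkgs> { config = { allowUnfree = true; cudaSupport = true; }; };
  python3Packages.flash-attn.overrideAttrs (old: {
    patches = (old.patches or [ ]) ++ [
      (fetchpatch {
        url = "https://github.com/Dao-AILab/flash-attention/pull/2079.patch";
        hash = "";  # placeholder: fill in the real hash after a first attempt
      })
    ];
  })
'
```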
Robbie Buxton | Omg the bane of my existence | 16:39:35 |
Robbie Buxton | That has OOMed on an ungodly amount of RAM | 16:40:08 |