| 2 Mar 2026 |
Lun | 😭 | 23:47:25 |
hexa (UTC+1) | Yeah, I'm afraid unless we GC harder this is going to be a tough sell. | 23:51:53 |
| 3 Mar 2026 |
SomeoneSerge (matrix works sometimes) | =\ | 16:23:53 |
caniko | any chance to build gimp and handbrake? | 19:53:02 |
| 5 Mar 2026 |
kaya 𖤐 | Not sure if it's been mentioned here before, but for anyone affected by flash-attn builds OOM-ing: I noticed an upstream patch that tries to counter it https://github.com/Dao-AILab/flash-attention/pull/2079
Might be possible to apply it to the nix package 🤔 | 13:26:33 |
Robbie Buxton | Omg the bane of my existence | 16:39:35 |
Robbie Buxton | That has oomed on an ungodly amount of RAM | 16:40:08 |
Robbie Buxton | Nice to see they are trying to fix it | 16:40:26 |
| 6 Mar 2026 |
connor (burnt/out) (UTC-8) | I found zram gave an amazing compression ratio (I think the data being allocated by NVCC was all zeros) so even though it allocated upwards of .25TB of RAM I didn’t need to reduce the number of jobs | 05:13:31 |
Gaétan Lepage | I enabled this on our builders. | 10:18:40 |
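(The zram setup being discussed can be sketched as a NixOS config fragment. This assumes the stock `zramSwap` module; the `algorithm` and `memoryPercent` values here are illustrative, not what the builders actually use.)

```nix
# Hedged sketch: enable zram-backed swap so NVCC's large, highly
# compressible (mostly zero-filled) allocations don't OOM the build.
{
  zramSwap = {
    enable = true;
    algorithm = "zstd";   # zstd compresses zero-heavy pages very well
    memoryPercent = 150;  # illustrative: let zram cover more than physical RAM
  };
}
```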
connor (burnt/out) (UTC-8) | Yay talking at state of the union is over, sorry I asked for packages we test specifically and then mentioned none of them | 18:59:31 |
mike | hi all | 19:02:58 |
mike | any guide for using nix on ubuntu for cuda torch? | 19:03:27 |
mike | i am basically having out of memory compiling it all on my machine | 19:11:16 |
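(For the Ubuntu-with-nix case above, one common workaround is to pull CUDA torch from a binary cache instead of compiling locally, and to cap build parallelism. A hedged sketch, assuming a non-flake nixpkgs config; the cache setup itself is done separately, e.g. via `cachix use cuda-maintainers`.)

```nix
# ~/.config/nixpkgs/config.nix sketch: make torchWithCuda resolvable,
# so cached binaries can be substituted rather than built from source.
{
  allowUnfree = true;  # CUDA packages are unfree
  cudaSupport = true;
  # If a build is still unavoidable, lowering max-jobs/cores in nix.conf
  # (e.g. max-jobs = 1) keeps peak RAM usage down at the cost of speed.
}
```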
mike | ok i got it working, will document once i've run all the tests. | 19:19:50 |