
NixOS CUDA

275 Members
CUDA packages maintenance and support in nixpkgs | https://github.com/orgs/NixOS/projects/27/ | https://nixos.org/manual/nixpkgs/unstable/#cuda



4 Nov 2025
@glepage:matrix.orgGaétan Lepage I quadruple-checked.
Both commits of my PR are actually necessary to get an nvcc-free onnxruntime.
21:48:42
@glepage:matrix.orgGaétan LepageLet me change one comment to mention the bisection21:48:57
@glepage:matrix.orgGaétan Lepage connor (burnt/out) (UTC-8), I reviewed nccl-tests. Feel free to merge 22:08:11
@arilotter:matrix.orgAri Lotter i'm trying to fix this exact linker error right now 😭 trying to get flash-attn built for cuda capabilities 7.5 thru 12.0a, and i'm so stuck, and every rebuild with an attempted fix takes ~2 hours... any ideas? 😭 22:17:28
@arilotter:matrix.orgAri Lottermaybe we're just screwed :)22:20:25
@sporeray:matrix.orgRobbie Buxton Which flash attention version 22:24:21
@sporeray:matrix.orgRobbie BuxtonV2 or v322:24:27
@sporeray:matrix.orgRobbie BuxtonAnd from what got tag?22:24:51
@sporeray:matrix.orgRobbie Buxton* And from what git tag?22:24:59
@arilotter:matrix.orgAri Lotter v2, from tag v2.8.2 22:29:50
@sporeray:matrix.orgRobbie BuxtonI think there is currently a pr open in nixpkgs to add this, is that the one you’re building?22:30:41
@arilotter:matrix.orgAri Lotteroh neat, no22:31:37
@arilotter:matrix.orgAri Lotterlet me compare my derivation with that one22:31:40
@arilotter:matrix.orgAri Lotterok yeah, decently similar. difference is i'm building against cutlass 4.0 instead of 4.1, and.. somehow my deps list is wayy simpler, yet the build works (on previous versions of my derivation, pre updating CUDA)? very strange..22:35:13
@arilotter:matrix.orgAri Lotter
but yeah i just smash into
```
build/lib.linux-x86_64-cpython-312/flash_attn_2_cuda.cpython-312-x86_64-linux-gnu.so: PC-relative offset overflow in PLT entry for `_ZNK3c1010TensorImpl4sizeEl'
```
🤷
22:35:28
@arilotter:matrix.orgAri Lotteri'm so tired of CUDA nightmares 😭 im so close to giving up and building dockerized devenvs, i just really don't want to give in..... :(22:37:57
@glepage:matrix.orgGaétan Lepage (It's a secret, but you might want to add https://cache.nixos-cuda.org as a substituter, it is slowly getting more and more artifacts)
Public key: cache.nixos-cuda.org:74DUi4Ye579gUqzH4ziL9IyiJBlDpMRn9MBN8oNan9M=
22:44:02
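(For anyone wanting to try it, a minimal sketch of wiring that cache into a NixOS configuration using the standard `nix.settings` options; the URL and public key are the ones posted above, everything else is just the usual module boilerplate:)
```nix
# Sketch: add the nixos-cuda cache as an extra substituter.
{
  nix.settings = {
    extra-substituters = [ "https://cache.nixos-cuda.org" ];
    extra-trusted-public-keys = [
      "cache.nixos-cuda.org:74DUi4Ye579gUqzH4ziL9IyiJBlDpMRn9MBN8oNan9M="
    ];
  };
}
```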
@glepage:matrix.orgGaétan Lepage connor (burnt/out) (UTC-8), Serge and I got #457803 ready.
We are waiting for nixpkgs's CI to get fixed (https://github.com/NixOS/nixpkgs/pull/458647).
Let's merge ASAP
23:38:07
@sporeray:matrix.orgRobbie Buxton For flash attention you should use the version of cutlass in the repo 23:54:57
@sporeray:matrix.orgRobbie Buxton They have a rev 23:55:06
@sporeray:matrix.orgRobbie Buxton In csrc/cutlass 23:56:01
5 Nov 2025
@apyh:matrix.orgapyhah fair enough 00:10:30
@ss:someonex.netSomeoneSerge (back on matrix) step 1: torchWithCuda = pkgsCuda.....torch (we were supposed to be here now, but it got out of hand)
step 2: torchWithCuda = warn "..." pkgsCuda...
step 3: torchWithCuda = throw
00:12:18
@ss:someonex.netSomeoneSerge (back on matrix)and what we really want is late binding and incremental builds00:13:41
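(Roughly what those three steps could look like as aliases, using `lib.warn` and `throw`; the attribute paths and messages below are illustrative, not the actual nixpkgs ones, and `pkgsCuda` stands for the CUDA-enabled package set from the message above:)
```nix
{ lib, pkgsCuda }:

{
  # step 1: plain re-export from the CUDA-enabled package set
  torchWithCuda = pkgsCuda.python3Packages.torch;

  # step 2: same value, but evaluation prints a deprecation warning
  # torchWithCuda = lib.warn "torchWithCuda is deprecated; use pkgsCuda directly"
  #   pkgsCuda.python3Packages.torch;

  # step 3: evaluation fails outright
  # torchWithCuda = throw "torchWithCuda has been removed; use pkgsCuda directly";
}
```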
@connorbaker:matrix.orgconnor (burnt/out) (UTC-8)Why are you building for so many CUDA capabilities? I can’t really think of a reason you’d need that range in particular.01:59:14
@connorbaker:matrix.orgconnor (burnt/out) (UTC-8)Added to merge queue02:07:23
@apyh:matrix.orgapyh
In reply to @connorbaker:matrix.org
Why are you building for so many CUDA capabilities? I can’t really think of a reason you’d need that range in particular.
it's a distributed ml training application that needs to run on everything from gtx 10xx gpus to modern data center GH/GB200s :/
03:27:37
@apyh:matrix.orgapyhmost common hardware is gonna be 30xx 40xx 50xx, h100, a100, b20003:27:56
@apyh:matrix.orgapyhthough.. i could just see what pytorch precompiled wherls runs on and limit to that 03:28:54
