
NixOS CUDA

289 Members
CUDA packages maintenance and support in nixpkgs | https://github.com/orgs/NixOS/projects/27/ | https://nixos.org/manual/nixpkgs/unstable/#cuda



14 Aug 2024
@caniko:matrix.orgcaniko
In reply to @ss:someonex.net
The last I see in the logs is https://github.com/SomeoneSerge/nixpkgs-cuda-ci/commit/997229a3acb24e73898da3286a2e0caeb81bc918#diff-216b2b7bfde9416c79d133bacb031e95702a20bdedb548c0b055c837aa4f6a9cR68
The maintainers' cache is in low-maintenance mode right now. If you're willing, you can try the nix-community cachix, but please note that their CUDA jobset isn't officially stabilized yet and can be pulled at any moment. Both caches are provided without any obligations, etc, etc, etc
Thanks; however, I actually don't know how to use this
19:10:21
@ss:someonex.netSomeoneSerge (back on matrix) Something like nix flake lock --update-input nixpkgs github:NixOS/nixpkgs/$commitid if I'm not mistaken 19:11:09
@ss:someonex.netSomeoneSerge (back on matrix) or is it nix flake update --update-input? 19:11:23
@caniko:matrix.orgcaniko but will nix flake update --update-input automatically start using cachix? 19:14:44
@ss:someonex.netSomeoneSerge (back on matrix)This is unrelated to cachix, this just pins a different nixpkgs version. When nix "builds" a derivation it looks at whether its inputs are available, and if not it builds or substitutes them, recursively. Which substituters to use (including cachix) is a global nix configuration.19:19:35
@ss:someonex.netSomeoneSerge (back on matrix)So if you pick a particular nixpkgs commit and look at its pytorch, it'll correspond to a hash, and if the substituter (cachix) says it knows this hash you'll end up downloading the prebuilt thing instead of running the expensive build19:20:33
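The pinning-plus-substituter flow described above can be sketched as follows. This is a hedged sketch, not an exact transcript of the commands discussed: `$commitid` is a placeholder for whatever nixpkgs revision the cache has built, `--override-input` is the flag recent Nix uses for repinning a single flake input, and the public key line is a placeholder that must be taken from the cache's own page.

```shell
# Hedged sketch: repin a flake's nixpkgs input to a specific commit
# so its pytorch hash matches what the cache has already built.
# $commitid is a placeholder for the nixpkgs revision to pin.
nix flake lock --override-input nixpkgs "github:NixOS/nixpkgs/$commitid"

# Which substituters Nix consults is global configuration, e.g. in
# /etc/nix/nix.conf (key shown is a placeholder; copy the real one
# from the cache's page):
#
#   extra-substituters = https://nix-community.cachix.org
#   extra-trusted-public-keys = nix-community.cachix.org-1:<key>
```

With both in place, a rebuild of the flake substitutes the prebuilt store paths instead of compiling pytorch locally.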
@kaya:catnip.eekaya 𖤐 changed their profile picture.19:52:20
@chikyuukaiten:matrix.orgchikyuukaiten joined the room.20:03:50
16 Aug 2024
@hacker1024:matrix.orghacker1024 I have updated the tensorflow-bin derivation to support Jetsons: https://github.com/NixOS/nixpkgs/pull/334996
In doing so, though, I made it use cudaPackages_12 instead of 11. I believe this is correct for x86_64 too. On the Jetson, at least, it cannot use CUDA 11 as it's hardcoded to dlopen CUDA 12.
04:53:10
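The claim that the `-bin` package is hardcoded to dlopen CUDA 12 can be checked directly on the prebuilt shared object. A hedged sketch, with the library path as a placeholder: versioned sonames embedded in the binary show which runtime it will try to load.

```shell
# Hedged sketch: inspect which CUDA runtime soname a prebuilt
# library expects to dlopen. The path is a placeholder for wherever
# the tensorflow-bin shared object lands in the Nix store.
strings /path/to/libtensorflow.so.2 | grep -E 'libcudart\.so\.[0-9]+'
```

If only `libcudart.so.12` appears, linking the package against `cudaPackages_11` cannot work, matching the PR's switch to `cudaPackages_12`.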
@ss:someonex.netSomeoneSerge (back on matrix)...-bin packages 😩13:17:52
@ss:someonex.netSomeoneSerge (back on matrix)Thank you13:17:54
@ss:someonex.netSomeoneSerge (back on matrix)Uhmm hercules has been exhausting my disk space and crashing every other hour for the past day13:29:07
@ss:someonex.netSomeoneSerge (back on matrix)Fun13:29:22
17 Aug 2024
@hacker1024:matrix.orghacker1024
In reply to @ss:someonex.net
...-bin packages 😩
Yeah I'd love to get TensorFlow building from source but it takes about 8 hours per build on my fastest AArch64 device
04:34:43
@hacker1024:matrix.orghacker1024We'll be getting 04:34:53
@hacker1024:matrix.orghacker1024* We'll be getting some Jetson AGX Orins soon so I might give it another shot then04:35:31
@dhofer:matrix.orgdhofer joined the room.14:24:36
@polykernel:kde.org@polykernel:kde.org left the room.22:41:04
21 Aug 2024
@philiptaron:matrix.orgPhilip Taron (UTC-8) I see there's cudaPackages_12_3 but not _12_4 or _12_5. My nvidia-smi output says 12.5; is there something I should be worried about or is it steady as it goes? 14:49:22
@ss:someonex.netSomeoneSerge (back on matrix)IIRC as long as cudaRuntimeGetVersion() <= cudaDriverGetVersion() and you're OK (nvidia-smi reports the latter, the former is what you link by runpaths in nixpkgs)15:04:10
@ss:someonex.netSomeoneSerge (back on matrix)* IIRC as long as cudaRuntimeGetVersion() <= cudaDriverGetVersion() you're OK (nvidia-smi reports the latter, the former is what you link by runpaths in nixpkgs)15:04:18
@philiptaron:matrix.orgPhilip Taron (UTC-8) Sounds good. Thanks SomeoneSerge (UTC+3). 15:08:23
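The rule of thumb above (runtime version ≤ driver version) can be checked from a shell. A hedged sketch under two assumptions: `nvidia-smi` prints a "CUDA Version:" field in its header (the driver-supported version), and the nixpkgs CUDA package set exposes its runtime version as `cudaPackages.cudaVersion` (attribute names can differ across nixpkgs revisions).

```shell
# Hedged sketch: compare the driver-supported CUDA version against
# the runtime version the pinned nixpkgs links by runpath.
# Requires an NVIDIA driver and a flakes-enabled nix.
driver_cuda=$(nvidia-smi | grep -oP 'CUDA Version: \K[0-9]+\.[0-9]+')
runtime_cuda=$(nix eval --raw nixpkgs#cudaPackages.cudaVersion)
echo "driver supports CUDA $driver_cuda; nixpkgs runtime is CUDA $runtime_cuda"
# You are OK as long as runtime_cuda <= driver_cuda.
```

So a driver reporting 12.5 happily runs binaries linked against the 12.3 runtime from `cudaPackages_12_3`; the absence of a `cudaPackages_12_5` attribute is not itself a problem.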
@justbrowsing:matrix.orgKevin Mittman (UTC-8) SomeoneSerge (UTC+3): I'm back. Did you get the EGL issue resolved? 20:00:38
* @ss:someonex.netSomeoneSerge (back on matrix) stresses the brain muscle really hard trying to remember which of the EGL issues he might have discussed earlier20:28:48
@ss:someonex.netSomeoneSerge (back on matrix)The singularity thing?20:29:06
23 Aug 2024
@kaya:catnip.eekaya 𖤐 changed their profile picture.23:35:56
@hexa:lossy.networkhexa