
NixOS CUDA

311 Members
CUDA packages maintenance and support in nixpkgs | https://github.com/orgs/NixOS/projects/27/ | https://nixos.org/manual/nixpkgs/unstable/#cuda



Sender · Message · Time
29 Jun 2024
@ss:someonex.netSomeoneSerge (matrix works sometimes)
In reply to @connorbaker:matrix.org
Hey all; I expect I'll be away for about a week starting in a few hours as I try to find housing on the other side of the country :)
Wow, moving again?
11:57:29
@ss:someonex.netSomeoneSerge (matrix works sometimes)
In reply to @ss:someonex.net
Should work this time: https://github.com/NixOS/nixpkgs/pull/323056
Can I bum a nixpkgs-review? xD
I think this only really breaks jax and tensorflow, which are broken anyway 🤔
13:24:52
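[For anyone unfamiliar: nixpkgs-review builds every package affected by a PR. A sketch of the invocation for the PR above, assuming the tool is on PATH; the `--extra-nixpkgs-config` flag is how CUDA-specific review builds are usually requested:]

```shell
# Build everything that PR 323056 changes, in a sandboxed worktree
nixpkgs-review pr 323056

# Same review, but with CUDA support enabled in the nixpkgs config
nixpkgs-review pr 323056 --extra-nixpkgs-config '{ cudaSupport = true; }'
```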
@connorbaker:matrix.orgconnor (he/him)
In reply to @ss:someonex.net
Wow, moving again?
This time just me (not my dad and brother), since I’m moving for work
13:52:42
30 Jun 2024
@lromor:matrix.orglromor SomeoneSerge (UTC+3): I answered your github question, feel free to contact me here in case you need more info. 18:49:40
@ss:someonex.netSomeoneSerge (matrix works sometimes)
In reply to @ss:someonex.net
I think this only really breaks jax and tensorflow, which are broken anyway 🤔
Jax builds, looking into tf ✅
21:51:48
2 Jul 2024
@zimbatm:numtide.comJonas Chevalier joined the room.10:08:21
@zimbatm:numtide.comJonas Chevalier 👋 thought I would join in on the fun for a bit 10:24:19
@hexa:lossy.networkhexa Great having you here! Feel free to enable the (unfun) config.cudaSupport flag at any time now 🙂 10:41:14
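[For reference, the flag in question is an ordinary nixpkgs config option; a minimal sketch of enabling it when importing nixpkgs (CUDA packages are unfree, so allowUnfree is needed as well):]

```nix
# Import nixpkgs with CUDA support enabled globally.
import <nixpkgs> {
  config = {
    allowUnfree = true; # CUDA toolkit and friends are unfree
    cudaSupport = true; # honoured by packages with optional CUDA support
  };
}
```

[On NixOS the equivalent is `nixpkgs.config.cudaSupport = true;` in configuration.nix.]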
@hexa:lossy.networkhexaWondering what it would take to build and ship unfree redistributable things10:41:57
@hexa:lossy.networkhexa The discussion was running in #platform-governance:nixos.org until the Zulip happened 🫠 10:42:20
@lromor:matrix.orglromor
In reply to @zimbatm:numtide.com
šŸ‘‹ thought I would join in on the fun for a bit
Someone at numtide developed nix-ld, am I right?
10:42:57
@hexa:lossy.networkhexathat would be Mic9210:43:00
@lromor:matrix.orglromorNice10:43:06
@zimbatm:numtide.comJonas Chevalier
In reply to @hexa:lossy.network
Wondering what it would take to build and ship unfree redistributable things
I'd love to do that
11:41:34
@zimbatm:numtide.comJonas Chevalier
In reply to @lromor:matrix.org
Nice
Yes, and also https://github.com/numtide/nix-gl-host which might be relevant here
11:42:19
@zimbatm:numtide.comJonas ChevalierFor a while I was running CI on all the unfree packages on https://github.com/numtide/nixpkgs-unfree, but it got really expensive, because my CI doesn't cache failed builds, so it would rebuild broken tensorflows and friends over and over again.11:55:19
@zimbatm:numtide.comJonas ChevalierI have been reading the tensorflow packaging and friends for the past few days, and gained a lot of appreciation for the work that happened there. This goes in the leaderboard of most difficult packaging to work on.12:02:51
@hexa:lossy.networkhexaabsolutely does. also the reason we are lagging behind on the tensorflow source-built package unfortunately.12:17:26
@zimbatm:numtide.comJonas Chevalieryeah, it looks like the hard bit isn't the lack of hardware but the amount of combinatorial manpower required.12:24:42
@zimbatm:numtide.comJonas Chevalieri'd love to throw hardware at that problem :)12:24:59
@zimbatm:numtide.comJonas Chevalier are the cases where you would use the *WithCuda packages instead of import nixpkgs { config.cudaSupport = true; }? This could create a similar situation as I had when trying to mix torch and tensorflow in one python.withPackages, where you get package name collisions. 12:50:18
@ss:someonex.netSomeoneSerge (matrix works sometimes)Holaaaa14:55:06
@ss:someonex.netSomeoneSerge (matrix works sometimes)
In reply to @zimbatm:numtide.com
What are the cases where you would use the *WithCuda packages instead of import nixpkgs { config.cudaSupport = true; }? This could create a situation similar to the one I had when trying to mix torch and tensorflow in one python.withPackages, where you get package name collisions.
Honestly, not many I think. "A standalone executable that has no transitive dependencies that require enabling optional cuda support". E.g. I think python3Packages.torchWithCuda would link against the wrong ucc/ucx/openmpi unless config.cudaSupport is set
15:00:46
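[To make the contrast concrete, a sketch of the two approaches; illustrative only, the attribute names in the returned set are made up:]

```nix
let
  # Approach 1: default nixpkgs; cherry-pick the CUDA variant of one package.
  # Transitive dependencies (ucc, ucx, openmpi, ...) are still built
  # without CUDA support.
  pkgs = import <nixpkgs> { config.allowUnfree = true; };

  # Approach 2: flip the flag globally, so every package that honours
  # config.cudaSupport, including transitive dependencies, is rebuilt with it.
  pkgsCuda = import <nixpkgs> {
    config = { allowUnfree = true; cudaSupport = true; };
  };
in {
  torchPartial = pkgs.python3Packages.torchWithCuda; # deps still non-CUDA
  torchFull    = pkgsCuda.python3Packages.torch;     # consistent closure
}
```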
@ss:someonex.netSomeoneSerge (matrix works sometimes) I think they were mainly introduced to be put in passthru.tests and for nixpkgs-reviews 15:01:32
@ss:someonex.netSomeoneSerge (matrix works sometimes) What I think we need instead is to somehow [more] publicly expose nixpkgsFun so we can access it from all-packages.nix and prepare the pkgsCuda and pkgsRocm attributes 15:03:14
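[A rough, hypothetical sketch of what such attributes could look like. nixpkgsFun is the internal re-import function used in pkgs/top-level; the exact wiring, and whether the existing config is merged this way, would differ in a real patch:]

```nix
# Hypothetical: expose CUDA/ROCm variant package sets, analogous to
# pkgsCross, by re-importing nixpkgs with the relevant flag flipped.
{
  pkgsCuda = nixpkgsFun { config = config // { cudaSupport = true; }; };
  pkgsRocm = nixpkgsFun { config = config // { rocmSupport = true; }; };
}
```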


