!eWOErHSaiddIbsUNsJ:nixos.org

NixOS CUDA

289 Members
CUDA packages maintenance and support in nixpkgs | https://github.com/orgs/NixOS/projects/27/ | https://nixos.org/manual/nixpkgs/unstable/#cuda



24 Dec 2024
@hexa:lossy.networkhexa (UTC+1) * and ideally someone can disable that test instead?14:04:52
@matthewcroughan:defenestrate.itmatthewcroughan Amen, you are not being well enough supported financially to solve tough issues, most people in this community are not. 14:38:25
@matthewcroughan:defenestrate.itmatthewcroughanThe polish is a matter of resources14:38:58
@matthewcroughan:defenestrate.itmatthewcroughanI did some postfixups, etc and worked around it14:39:15
@matthewcroughan:defenestrate.itmatthewcroughanBut yes, I also landed on the same PRs as you, and I'm not sure why it is still an issue despite those PRs having been merged14:39:32
@collinarnett:matrix.orgCollin Arnett joined the room.21:10:15
@collinarnett:matrix.orgCollin ArnettHello, are there any packages in the haskellPackages set that set up cudaSupport that I can use as an example to add CUDA support for hasktorch? The author submitted a PR to bump the version here and has marked it as broken, so I figured it would be good to get first-class CUDA support implemented here as well. https://github.com/NixOS/nixpkgs/pull/367998/21:16:42
25 Dec 2024
@ss:someonex.netSomeoneSerge (back on matrix)Oooooh that's a really great question, not least because, afaik, the haskell package set is its own thing with slightly different override patterns than elsewhere?00:04:12
@ss:someonex.netSomeoneSerge (back on matrix)I imagine the Haskell packages are auto-generated? If so, there must be some customization examples for other native/FFI libraries that might use dlopen at runtime or pkg-config at build time, e.g. wrappers for OpenGL or Vulkan00:07:08
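[The generated packages are customized via the Haskell package-set overrides; a minimal sketch of the pattern being discussed, assuming (hypothetically) that libtorch-ffi exposes a `cuda` cabal flag and is overridden in something like configuration-nix.nix:]

  { pkgs, ... }: self: super: {
    # Sketch only: names and the "cuda" flag are assumptions based on the
    # hasktorch PR discussed above, not a tested nixpkgs change.
    libtorch-ffi =
      let
        drv = super.libtorch-ffi.override {
          c10 = pkgs.libtorch-bin;
          torch = pkgs.libtorch-bin;
          torch_cpu = pkgs.libtorch-bin;
        };
      in
      if pkgs.config.cudaSupport
      then pkgs.haskell.lib.enableCabalFlag drv "cuda"
      else drv;
  }

[This mirrors how other FFI-heavy Haskell packages are patched up after generation: the generated `callPackage` call stays untouched and the override layer swaps in the system dependencies and toggles cabal flags.]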
@connorbaker:matrix.orgconnor (burnt/out) (UTC-8) If accelerate is working or supported, check that out. Not sure it’s still supported given it relies on LLVM 12 (or earlier) for LLVM-HS.
Outside of that, not sure what people use for GPU stuff with Haskell
00:14:03
@collinarnett:matrix.orgCollin Arnett* Hello, are there any packages in the haskellPackages set that set up cudaSupport that I can use as an example to add CUDA support for hasktorch? The author submitted a PR to bump the version here and has marked it as unbroken, so I figured it would be good to get first-class CUDA support implemented here as well. https://github.com/NixOS/nixpkgs/pull/367998/03:28:32
@collinarnett:matrix.orgCollin Arnett

He ended up doing this in the hackage-packages.nix file:


  "libtorch-ffi" = callPackage
    ({ mkDerivation, async, base, bytestring, c10, containers, hspec
     , inline-c, inline-c-cpp, lib, libtorch-ffi-helper
     , optparse-applicative, safe-exceptions, sysinfo, template-haskell
     , text, torch, torch_cpu, torch_cuda ? null } :
     mkDerivation {
       pname = "libtorch-ffi";
       version = "2.0.1.1";
       sha256 = "0m6gg0z6dc67rxijqycyza197365xf1p71s74a8p4pkc2m2yl6p3";
       libraryHaskellDepends = [
         async base bytestring containers inline-c inline-c-cpp
         libtorch-ffi-helper optparse-applicative safe-exceptions sysinfo
         template-haskell text
       ];
       librarySystemDepends = [ c10 torch torch_cpu ];
       testHaskellDepends = [ base hspec safe-exceptions ];
       homepage = "https://github.com/hasktorch/hasktorch#readme";
       description = "Haskell bindings for PyTorch";
       license = lib.licenses.bsd3;
       configureFlags = [
         "--extra-include-dirs=${lib.getDev pkgs.libtorch-bin}/include/torch/csrc/api/include"
       ]  ++ lib.optionals pkgs.config.cudaSupport [ "-f cuda" ];
     }) ({
       c10 = pkgs.libtorch-bin;
       torch_cpu = pkgs.libtorch-bin;
       torch = pkgs.libtorch-bin;
     } // lib.optionalAttrs (pkgs.config.cudaSupport) {
       torch_cuda = pkgs.libtorch-bin;
     });
11:39:48
@ss:someonex.netSomeoneSerge (back on matrix) I'll follow up on github 17:19:13
26 Dec 2024
@jiashuaixu:matrix.orgJesseDoes anyone have any configuration files for deep learning on NixOS? I want to use CUDA to train PyTorch models on NixOS, but I can't install CUDA and cuDNN correctly. I tried some configurations but failed. Can anyone share their configuration files with me? I use a 4090 graphics card.10:43:05
@glepage:matrix.orgGaétan Lepage Have you set cudaSupport = true in your nixpkgs config? 10:43:54
@glepage:matrix.orgGaétan LepageThis enables cuda support for all packages that support it in nixpkgs10:44:08
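[For reference, the setting being discussed looks like this in a NixOS configuration; a minimal sketch:]

  {
    nixpkgs.config = {
      # CUDA is distributed under an unfree license, so this is required too.
      allowUnfree = true;
      # Build CUDA-enabled variants of every package that supports it.
      cudaSupport = true;
    };
  }

[Note that enabling this globally triggers large rebuilds from source unless a binary cache with CUDA builds is configured.]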
@trofi:matrix.org@trofi:matrix.org joined the room.15:50:27
@trofi:matrix.org@trofi:matrix.orgPlease have a look at https://github.com/NixOS/nixpkgs/pull/368366 . I have no idea what I am doing.15:50:46
@connorbaker:matrix.orgconnor (burnt/out) (UTC-8) Oh my god builtins.sort requires strict total orderings? 16:28:17
@trofi:matrix.org@trofi:matrix.org No, it requires a strict weak ordering, but >= does not provide one. a >= b can't act as lessThan. b < a can, or !(b >= a) can as well. 16:31:28
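[To illustrate the point: builtins.sort takes a lessThan comparator, and a non-strict one like (a: b: a >= b) breaks the strict-weak-ordering contract because it returns true when a == b. A strict comparator with the same intent (descending order) is fine:]

  let
    descendingWrong = a: b: a >= b;  # non-strict: returns true for equal elements
    descendingRight = a: b: b < a;   # strict: the valid form of the same ordering
  in
  builtins.sort descendingRight [ 1 3 2 ]
  # evaluates to [ 3 2 1 ]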
@ss:someonex.netSomeoneSerge (back on matrix) Read your blog post. You got a talent for discovering this stuff before anyone else 18:57:22
@trofi:matrix.org@trofi:matrix.org

If it makes you feel a bit better cuda is not alone in getting sort a bit wrong:

  • https://github.com/NixOS/nixpkgs/pull/368429
  • https://github.com/NixOS/nixpkgs/pull/368433
23:14:25
27 Dec 2024
@connorbaker:matrix.orgconnor (burnt/out) (UTC-8)Trofi would you ping Valentin on the issue? Feels like it’d be good to have this requirement stated in the docs01:08:02
28 Dec 2024
@connorbaker:matrix.orgconnor (burnt/out) (UTC-8) ugh thinking about software making me sad
Samuel Ainsworth did you ever find some sort of serenity with CUDA and Nixpkgs?
00:43:26
@connorbaker:matrix.orgconnor (burnt/out) (UTC-8)I'm having thoughts about https://github.com/connorbaker/cuda-packages. In particular, does it make sense to include CUDA stuff in Nixpkgs proper when we can't take advantage of anything but eval checks? Would nix-community be a better home? Just having a growing sense of dread about updating and trying to maintain fast-moving libraries in an environment where stuff can (or does) break constantly and there's no notification of such breakage (except maybe by the community Hydra instance?). There's also the understanding that in Nixpkgs, everything works together simultaneously. As an example, I'd hate to try to upgrade OpenCV (or PyTorch) so it works with newer versions of CUDA, only to find out it causes some gnarly Darwin/ROCm/non-CUDA issue. Thinking out-of-tree designs would afford us the ability to break stuff, though that comes with a number of drawbacks (duplicating nix expressions for packages and having slight variations, merging in upstream changes, etc.). Maybe this is just fatigue talking; I think a number of these complaints were raised in a discourse post Sam made a few years ago.00:54:19
@connorbaker:matrix.orgconnor (burnt/out) (UTC-8)I mean, I certainly want to upstream the library functions and additional setup hooks/logging functionality I wrote because they're (in my opinion) widely useful. Just... the CUDA stuff.00:55:26
@connorbaker:matrix.orgconnor (burnt/out) (UTC-8)* I'm having thoughts about https://github.com/connorbaker/cuda-packages. In particular, does it make sense to include CUDA stuff in Nixpkgs proper when we can't take advantage of anything but eval checks? Would nix-community be a better home? Just having a growing sense of dread about updating and trying to maintain fast-moving libraries in an environment where stuff can (or does) break constantly and there's no notification of such breakage (except maybe by the community Hydra instance?). There's also the understanding that in Nixpkgs, everything works together simultaneously. As an example, I'd hate to try to upgrade OpenCV (or PyTorch) so it works with newer versions of CUDA, only to find out it causes some gnarly Darwin/ROCm/non-CUDA issue. Thinking out-of-tree designs would afford us the ability to break stuff, though that comes with a number of drawbacks (duplicating nix expressions for packages and having slight variations, merging in upstream changes, etc.). Maybe this is just fatigue talking; I think a number of these complaints were raised in a discourse post Samuel made a few years ago.12:33:14
@lromor:matrix.orglromorIs anyone at chaos congress?16:16:44
@trofi:matrix.org@trofi:matrix.orgGood idea! Done as https://github.com/NixOS/nix/issues/12106#issuecomment-256437584316:36:40
@matthewcroughan:defenestrate.itmatthewcroughan changed their display name from matthewcroughan to matthewcroughan (DECT: 56490).18:41:55


