
NixOS CUDA

CUDA packages maintenance and support in nixpkgs | https://github.com/orgs/NixOS/projects/27/ | https://nixos.org/manual/nixpkgs/unstable/#cuda



14 May 2025
@ss:someonex.netSomeoneSerge (Ever OOMed by Element)Yea I shut down myself for like 12 hours. Coffee time!11:48:56
16 May 2025
@ss:someonex.netSomeoneSerge (Ever OOMed by Element) connor (he/him) (UTC-7): I guess it's trivial but I only now confirmed that you actually can _file = jsonPath in evalModules! 15:25:22
@connorbaker:matrix.orgconnor (he/him) (UTC-7)Remind me, _file (just?) helps with debugging because it tells you where configurations came from on disk, right? I remember seeing it set by lib.importApply (introduced by Robert from flake-parts IIRC)15:30:18
@ss:someonex.netSomeoneSerge (Ever OOMed by Element) The option licenses.distribution_path."CUDA Toolkit" is defined both null and not null, in ... 15:41:27
@ss:someonex.netSomeoneSerge (Ever OOMed by Element) ... could be somethin-something-undefined (because of a lambda) or could be a path 15:42:16
@ss:someonex.netSomeoneSerge (Ever OOMed by Element) * ... could be somethin-something-undefined (because of a lambda or let-in) or could be a path 15:42:27
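A minimal sketch of both points above (the option, values, and file names are illustrative): attaching _file to a module passed to evalModules is what makes the "in ..." part of errors like this name the real source files.

let
  lib = import <nixpkgs/lib>;
  eval = lib.evalModules {
    modules = [
      {
        options.licenses.distribution_path = lib.mkOption {
          type = lib.types.attrsOf (lib.types.nullOr lib.types.path);
          default = { };
        };
      }
      {
        # Hypothetical JSON source; _file makes it show up in error messages.
        _file = ./defaults.json;
        config.licenses.distribution_path."CUDA Toolkit" = null;
      }
      {
        _file = ./overrides.json;
        config.licenses.distribution_path."CUDA Toolkit" = ./EULA.txt;
      }
    ];
  };
in
# Forcing the merged value throws an error along the lines of:
#   The option `licenses.distribution_path."CUDA Toolkit"' is defined both
#   null and not null, in `.../defaults.json' and `.../overrides.json'.
eval.config.licenses.distribution_path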
@hexa:lossy.networkhexa (UTC+1)working on migrating python3 to 3.1322:28:11
17 May 2025
@terrorjack:matrix.orgterrorjack set a profile picture.08:53:50
19 May 2025
@ss:someonex.netSomeoneSerge (Ever OOMed by Element) connor (he/him) (UTC-7): I started (1) switching to a column-oriented format in the last push, and I also (2) attrsOf (enum [ 1 ]) instead of listOf to enforce uniqueness... this is over-engineered isn't it? 02:05:47
@oak:universumi.fioak 🏳️‍🌈♥️ changed their display name from oak 🫱⭕🫲 to oak.10:59:27
@ss:someonex.netSomeoneSerge (Ever OOMed by Element) * connor (he/him) (UTC-7): I started (1) switching to a column-oriented format in the last push, and I also (2) use attrsOf (enum [ 1 ]) instead of listOf to enforce uniqueness... this is over-engineered isn't it? 02:06:08
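A sketch of the trick being described, as a module fragment (the option name and keys are illustrative): with listOf, merging definitions concatenates lists and silently keeps duplicates, while with attrsOf (enum [ 1 ]) the "elements" are attribute names, which de-duplicate by construction.

{ lib, ... }:
{
  options.capabilities = lib.mkOption {
    # listOf str would merge duplicate entries into e.g. [ "sm_89" "sm_89" ]:
    # type = lib.types.listOf lib.types.str;
    type = lib.types.attrsOf (lib.types.enum [ 1 ]);
    default = { };
  };

  # Attribute names are unique by construction, and because every value must
  # be 1, repeated definitions of the same name merge without conflict.
  config.capabilities = {
    sm_89 = 1;
    sm_90 = 1;
  };
}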
@oak:universumi.fioak 🏳️‍🌈♥️ changed their display name from oak to oak 🏳️‍🌈♥️.11:01:11
@hexa:lossy.networkhexa (UTC+1)tensorflow is still disabled on 3.1315:41:14
@hexa:lossy.networkhexa (UTC+1)* tensorflow-bin is still disabled on 3.1315:41:17
@hexa:lossy.networkhexa (UTC+1)yeah, so 2.20 nightlies support 3.13 since april15:53:16
@hexa:lossy.networkhexa (UTC+1)they really dropped the ball here15:53:20
20 May 2025
@connorbaker:matrix.orgconnor (he/him) (UTC-7)too much brain ouch04:11:11
@connorbaker:matrix.orgconnor (he/him) (UTC-7)talk about it tomorrow if you're still good for our chat?04:12:13
@breakds:matrix.orgbreakds Can someone here help take a look at this PR: https://github.com/NixOS/nixpkgs/pull/408555 ? It packages flashinfer, a library for transformer inference. The produced package works locally, but I am not very confident that I did everything correctly in the packaging - it would be good to have an expert look at it. Thanks a lot! 15:59:56
@connorbaker:matrix.orgconnor (he/him) (UTC-7) SomeoneSerge (UTC+U[-12,12]): check out the latest commit on https://github.com/NixOS/nixpkgs/pull/406531, should have addressed everything we talked about this morning
I still need to review yours, should be able to later today 🫠
17:48:26
@ss:someonex.netSomeoneSerge (Ever OOMed by Element)I fell asleep after the 3rd coffee19:58:36
21 May 2025
@justbrowsing:matrix.orgKevin Mittmanarches 👀01:40:13
@connorbaker:matrix.orgconnor (he/him) (UTC-7)I should be able to do the second half of the review on your PR tomorrow morning. Gonna try to be at work before 6:30 again.04:44:41
@ss:someonex.netSomeoneSerge (Ever OOMed by Element)What's your opinion on this?13:36:17
23 May 2025
@connorbaker:matrix.orgconnor (he/him) (UTC-7)Okay hoping I get a sudden burst of productivity and get through my backlog12:29:14
@ss:someonex.netSomeoneSerge (Ever OOMed by Element) connor (he/him) (UTC-7): I'm finally beginning to delete things! 19:52:09
24 May 2025
@ereslibre:ereslibre.socialereslibre

Hi! If you have some time, I have a couple of PRs waiting for review/merge:

  • Allow providing CSV files for nvidia-ctk (fixes Jetson devices): https://github.com/NixOS/nixpkgs/pull/401840
  • Fix nvidia-ctk for the (deprecated) Nvidia Docker runtime: https://github.com/NixOS/nixpkgs/pull/407290

Thanks :)

07:46:45
@little_dude:matrix.orglittle_dude sent an attachment (log file).09:07:51
@little_dude:matrix.orglittle_dude

Hello, this was a long time ago, but I'm finally back to trying to run ollama :D

saxpy doesn't work. I used this flake:

{
  description = "CUDA saxpy test";
  inputs.nixpkgs.url = "nixpkgs";
  outputs =
    { self, nixpkgs }:
    {
      devShell.x86_64-linux =
        let
          pkgs = import nixpkgs {
            system = "x86_64-linux";
            config.allowUnfree = true; # Required for CUDA
          };
        in
        pkgs.mkShell {
          name = "cuda-saxpy-shell";
          buildInputs = [
            pkgs.cudaPackages.saxpy
            pkgs.cudaPackages.cudatoolkit
          ];
          shellHook = ''
            export CUDA_PATH=${pkgs.cudatoolkit}
            export EXTRA_LDFLAGS="-L/lib -L${pkgs.linuxPackages.nvidia_x11}/lib"
            export EXTRA_CCFLAGS="-I/usr/include"
            # Should I set this?
            # export LD_LIBRARY_PATH=${pkgs.cudaPackages.cudatoolkit.lib}/lib:$LD_LIBRARY_PATH
          '';
        };
    };
}

I think I'm running into the same(?) initialization error (see the attached log file of LD_DEBUG=libs saxpy).

The output of nvidia-smi:

[little-dude@system76-laptop:~/cuda-tests]$ nvidia-smi 
Sat May 24 11:08:06 2025       
+-----------------------------------------------------------------------------------------+
| NVIDIA-SMI 570.144                Driver Version: 570.144        CUDA Version: 12.8     |
|-----------------------------------------+------------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id          Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |           Memory-Usage | GPU-Util  Compute M. |
|                                         |                        |               MIG M. |
|=========================================+========================+======================|
|   0  NVIDIA GeForce RTX 4060 ...    Off |   00000000:01:00.0 Off |                  N/A |
| N/A   46C    P0            590W /  115W |      12MiB /   8188MiB |     13%      Default |
|                                         |                        |                  N/A |
+-----------------------------------------+------------------------+----------------------+
                                                                                         
+-----------------------------------------------------------------------------------------+
| Processes:                                                                              |
|  GPU   GI   CI              PID   Type   Process name                        GPU Memory |
|        ID   ID                                                               Usage      |
|=========================================================================================|
|    0   N/A  N/A            3706      G   ...me-shell-48.1/bin/gnome-shell          2MiB |
+-----------------------------------------------------------------------------------------+
09:09:08


