
Nix Data Science

2 Jan 2024
@1h0:matrix.org joined the room. 11:33:37
3 Jan 2024
@benoitdr:matrix.org (benoitdr): Still a question: I have successfully packaged ctransformers based on the information above, but it cannot see my GPU. If I were to pip install it, I would use the cuda extra (pip install ctransformers[cuda]). Is there a way to pass the same option to buildPythonPackage? 13:01:55
@benoitdr:matrix.org (benoitdr)
In reply to @benoitdr:matrix.org
Still a question: I have successfully packaged ctransformers based on the information above, but it cannot see my GPU. If I were to pip install it, I would use the cuda extra (pip install ctransformers[cuda]). Is there a way to pass the same option to buildPythonPackage?

(Partially) answering myself: looking at setup.py, I can see that the cuda option corresponds to 2 extra packages:

        "cuda": [
            "nvidia-cuda-runtime-cu12",
            "nvidia-cublas-cu12",
        ],
14:18:44
@ss:someonex.net (Someone S): (casual reminder that these are poetry2nix-generated wrappers for the pypi wheels, expect them to break) 14:33:54
@ss:someonex.net (Someone S)
In reply to @benoitdr:matrix.org
Still a question: I have successfully packaged ctransformers based on the information above, but it cannot see my GPU. If I were to pip install it, I would use the cuda extra (pip install ctransformers[cuda]). Is there a way to pass the same option to buildPythonPackage?
Pass the respective flags to cmake
14:34:10
@ss:someonex.net (Someone S) *

(casual reminder that these are poetry2nix-generated wrappers for the pypi wheels, expect them to break)

EDIT: ah, this is from setup.py, nvm

14:34:33
@benoitdr:matrix.org (benoitdr): Need more info ... indeed these 2 pkgs from setup.py are NVIDIA proprietary and only distributed as wheels on PyPI. So what is the way out (if any)? I have already tried to include python310Packages.pycuda, cudaPackages.libcublas, cudaPackages.cuda_cudart, cudaPackages.cudatoolkit, and also passing allowUnfree = true; cudaSupport = true; to the nixpkgs config, without success. I'm feeling a bit stuck here ... 14:43:55
@ss:someonex.net (Someone S): You need to pass this flag during the ctransformers build: https://github.com/marella/ctransformers/blob/ed02cf4b9322435972ff3566fd4832806338ca3d/CMakeLists.txt#L6 14:45:55
@ss:someonex.net (Someone S)

cudaPackages.cudatoolkit

You can remove this one

14:46:22
@ss:someonex.net (Someone S)

python310Packages.pycuda

I don't see pycuda in ctransformers' dependencies?

14:47:01
@ss:someonex.net (Someone S)
In reply to @ss:someonex.net
You need to pass this flag during the ctransformers build: https://github.com/marella/ctransformers/blob/ed02cf4b9322435972ff3566fd4832806338ca3d/CMakeLists.txt#L6
Their setup.py introduces an ad hoc environment variable for that: https://github.com/marella/ctransformers/blob/ed02cf4b9322435972ff3566fd4832806338ca3d/setup.py#L10C41-L10C42
14:50:11
@ss:someonex.net (Someone S): It would've been better if they weren't wrapping/hiding cmake from the user, but at least there's a variable 14:50:37
@ss:someonex.net (Someone S): So you can set something like env.CT_CUBLAS = "ON" 14:51:22
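
A minimal sketch of how this fits into a buildPythonPackage call, assuming the ctransformers expression shown later in the thread; env.* attributes are exported as environment variables in the build sandbox, which is where setup.py can read CT_CUBLAS and turn on its cuBLAS code path:

    ctransformers = pkgs.python310.pkgs.buildPythonPackage rec {
      pname = "ctransformers";
      version = "0.2.27";
      # src, nativeBuildInputs, etc. as in the shell.nix further down
      env.CT_CUBLAS = "ON";  # read by setup.py, which enables the CT_CUBLAS cmake option
    };
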
@benoitdr:matrix.org (benoitdr): Yep, env.CT_CUBLAS = "ON"; was the trick. I still need cudatoolkit in propagatedBuildInputs, nothing more. Thanks a lot! 15:58:26
4 Jan 2024
@ss:someonex.net (Someone S)

I still need cudatoolkit in propagatedBuildInputs

cudaPackages.cudatoolkit is being deprecated, you do not need it; cf. nixpkgs' torch or opencv4 derivations for examples

04:47:57
@ss:someonex.net (Someone S): Also there's no need to propagate it 04:48:08
@thetootler:matrix.org (thetootler) joined the room. 06:25:27
@benoitdr:matrix.org (benoitdr): Strange ... It doesn't work without it on my side 19:56:48
5 Jan 2024
@palo:xaos.space joined the room. 08:38:35
@palo:terranix.org (palo) joined the room. 08:54:18
@ava:milliways.info joined the room. 10:16:35
@ava:milliways.info left the room. 10:16:48
@Hmpffff:matrix.org (Hmpffff) joined the room. 13:17:55
@ss:someonex.net (Someone S)
In reply to @benoitdr:matrix.org
Strange ... It doesn't work without it on my side
Could you elaborate? What are the errors?
17:31:24
6 Jan 2024
@benoitdr:matrix.org (benoitdr)

Here is my shell.nix file:

let
  nixpkgs = fetchTarball "https://github.com/NixOS/nixpkgs/archive/04220ed6763637e5899980f98d5c8424b1079353.tar.gz";
  pkgs = import nixpkgs { config = { allowUnfree = true; }; overlays = []; };
  ctransformers = pkgs.python310.pkgs.buildPythonPackage rec {
      pname = "ctransformers";
      version = "0.2.27";
      format = "setuptools";
      src = pkgs.python310.pkgs.fetchPypi {
        inherit pname version;
        sha256 = "25653d4be8a5ed4e2d3756544c1e9881bf95404be5371c3ed506a256c28663d5";
      };
      doCheck = false;
      dontUseCmakeConfigure = true;
      nativeBuildInputs = with pkgs; [
        python310Packages.setuptools
        python310Packages.scikit-build
        python310Packages.ninja
        python310Packages.cmake
      ];
      propagatedBuildInputs = with pkgs; [
        python310Packages.huggingface-hub
        python310Packages.py-cpuinfo
        cudaPackages.cudatoolkit
      ];
      env.CT_CUBLAS = "ON";
    };
in
  pkgs.mkShell {
    packages = with pkgs; [
      (python310.withPackages (ps: with ps; [
        ctransformers
      ]))
    ];
  }

Without cudatoolkit, at compile time, ctransformers complains that cublas is not found, and at runtime, ctransformers cannot use the GPU.
As far as I understand, that seems fairly logical to me. How would ctransformers find the CUDA libraries without cudatoolkit?

10:29:05
@benoitdr:matrix.org (benoitdr)

BTW, the same thing happens if I set cudaSupport to true like this:

pkgs = import nixpkgs { config = { allowUnfree = true; cudaSupport = true; }; overlays = []; };

cudatoolkit is still needed

10:43:54
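
A hedged aside on cudaSupport: the nixpkgs config flag is only consulted by in-tree packages whose expressions take it as an argument (torch, opencv4, ...); an out-of-tree buildPythonPackage call like the one above never reads it, so setting it cannot change this build. A simplified sketch of that in-tree pattern, with placeholder values:

    # Simplified sketch of the pattern in-tree CUDA-aware packages use:
    { lib, config, cudaPackages, buildPythonPackage
    , cudaSupport ? config.cudaSupport, ... }:

    buildPythonPackage {
      pname = "example";    # placeholder
      version = "0.0.0";    # placeholder
      # CUDA inputs are only added when the flag is set:
      buildInputs = lib.optionals cudaSupport
        (with cudaPackages; [ cuda_cudart libcublas ]);
    }

The custom ctransformers expression above does nothing like this, so it has to add its CUDA inputs and the CT_CUBLAS flag explicitly.
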
@ss:someonex.net (Someone S)
In reply to @benoitdr:matrix.org

Here is my shell.nix file: […]

By listing the individual CUDA libraries and nvcc as inputs; cf. torch/default.nix and opencv/4.nix for examples

The documentation needs to be clarified about this...

14:40:09
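
A hedged sketch of what "listing the individual CUDA libraries and nvcc as inputs" could look like for the shell.nix above; the split between nativeBuildInputs and buildInputs follows the torch/opencv4 pattern and is an assumption, not a verified recipe:

    nativeBuildInputs = with pkgs; [
      python310Packages.setuptools
      python310Packages.scikit-build
      python310Packages.ninja
      python310Packages.cmake
      cudaPackages.cuda_nvcc          # the CUDA compiler belongs in nativeBuildInputs
    ];
    buildInputs = with pkgs.cudaPackages; [
      cuda_cudart                     # CUDA runtime (cuda_runtime.h, libcudart)
      libcublas                       # cuBLAS headers and libraries
    ];
    propagatedBuildInputs = with pkgs; [
      python310Packages.huggingface-hub
      python310Packages.py-cpuinfo    # no cudatoolkit, and nothing CUDA propagated
    ];

As the next messages show, this alone can still fail while the cmake call hidden behind scikit-build looks for a single toolkit root.
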
7 Jan 2024
@benoitdr:matrix.org (benoitdr)

Indeed, a bit of documentation would help ;-)
Looking at the cudatoolkit sources, I can see that it imports many, many things I don't need, so I would be happy to get rid of it.
Unfortunately, if I replace it with cuda_nvcc, cuda_cudart and libcublas, ctransformers doesn't build anymore.

-- Using CUDA architectures: 52;61;70
-- Unable to find cuda_runtime.h in "/nix/store/p8058x6fpdlw7hy72qsqn41qhllqncgm-cuda_nvcc-11.8.89/include" for CUDAToolkit_INCLUDE_DIR.
-- Unable to find cublas_v2.h in either "" or "/nix/math_libs/include"
-- Could NOT find CUDAToolkit (missing: CUDAToolkit_INCLUDE_DIR) (found version "11.8.89")
CMake Warning at CMakeLists.txt:163 (message):
  cuBLAS not found

Looking at CMakeLists.txt:

if (CT_CUBLAS)
    find_package(CUDAToolkit)

So it seems there is a problem with the CUDAToolkit_INCLUDE_DIR variable.
Not sure it's related, but looking at https://cmake.org/cmake/help/latest/module/FindCUDAToolkit.html, that module sets CUDAToolkit_INCLUDE_DIRS (with an extra S)

10:27:28
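
One hedged workaround for the split cudaPackages layout, since FindCUDAToolkit expects a single toolkit root: merge the relevant packages with symlinkJoin and point CMake's documented CUDAToolkit_ROOT hint at the result. This is a sketch under those assumptions, not something verified against ctransformers:

    cudaJoined = pkgs.symlinkJoin {
      name = "cuda-merged";
      paths = with pkgs.cudaPackages; [
        cuda_nvcc
        cuda_cudart    # headers/libs may live in separate outputs (.dev, .lib); add them if needed
        libcublas
      ];
    };

    # then, inside the ctransformers derivation:
    #   env.CUDAToolkit_ROOT = "${cudaJoined}";
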
@ss:someonex.net (Someone S): Is cuda_nvcc in nativeBuildInputs? 10:29:56
@benoitdr:matrix.org (benoitdr): yes 10:34:09


