| 13 May 2024 |
SomeoneSerge (matrix works sometimes) | IIRC tensorflow propagates its protobuf as a python package | 00:36:25 |
SomeoneSerge (matrix works sometimes) | For a pure native project that produces an ELF this wouldn't be a problem: libtorch can link its own protobuf via RUNPATH, and so can the native parts of tensorflow. You can throw them into the same closure and they'd never conflict, unless you actually loaded both from a single executable. But the python package just shows up in sys.path... | 00:38:53 |
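[Editorial sketch] The flat-namespace problem described above can be shown in a few lines of Python. Unlike ld.so, where each ELF resolves its dependencies via its own RUNPATH, everything on `sys.path` shares one namespace, so only one copy of a given module name is ever importable. The module name `proto` here is a hypothetical stand-in for a vendored protobuf:

```python
import sys
import tempfile
from pathlib import Path

# Two hypothetical "closures", each shipping its own copy of a module
# named `proto` (standing in for a vendored protobuf binding).
a = Path(tempfile.mkdtemp())
b = Path(tempfile.mkdtemp())
(a / "proto.py").write_text("VERSION = 'torch-vendored'\n")
(b / "proto.py").write_text("VERSION = 'tf-vendored'\n")

# sys.path is a flat, ordered search list: whichever directory comes
# first wins for every importer in the process.
sys.path.insert(0, str(b))
sys.path.insert(0, str(a))  # `a` now shadows `b`

import proto
print(proto.VERSION)  # the copy from `a`; the copy in `b` is unreachable
```

With ld.so, libtorch and tensorflow's native parts could each carry their own protobuf via RUNPATH; in Python, the second copy is simply shadowed.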
connor (he/him) | Ah okay | 00:39:25 |
SomeoneSerge (matrix works sometimes) | All in all, we just need python to discard its import system in favour of something at least as flexible as ld.so | 00:40:42 |
SomeoneSerge (matrix works sometimes) | * All in all, we just need python to discard its import system in favour of something at least as flexible as ld.so. Which is not to say the latter can't be improved | 00:41:14 |
| @pascal.grosmann:scs.ems.host changed their display name from Pascal Grosmann to Pascal Grosmann - Urlaub 🚐 🏝️ 🏄♂️ 18.05. - 15.09.. | 08:13:30 |
Gaétan Lepage | Ok thanks connor (he/him) (UTC-5)! | 09:33:31 |
Gaétan Lepage | Where is the right spot to set cudaSupport = true for a flake based nixos install ? | 13:53:50 |
Gaétan Lepage | (I am using flake-part, if that matters) | 13:54:03 |
trexd | In reply to @glepage:matrix.org Where is the right spot to set cudaSupport = true for a flake based nixos install ? You can just do this:
perSystem = {
  pkgs,
  system,
  ...
}: {
  _module.args.pkgs = import nixpkgs {
    inherit system;
    config.allowUnfree = true;
    config.cudaSupport = true;
  };
};
| 14:08:16 |
trexd | Oh, I reread your message. Can't you just do nixpkgs.config.cudaSupport = true; if you want to globally enable cudaSupport? | 14:12:35 |
trexd | * Oh, I reread your message. Can't you just do nixpkgs.config.cudaSupport = true; in your configuration.nix if you want to globally enable cudaSupport? | 14:12:53 |
Gaétan Lepage | In reply to @trexd:matrix.org Oh, I reread your message. Can't you just do nixpkgs.config.cudaSupport = true; in your configuration.nix if you want to globally enable cudaSupport? Great, this seems to work, thanks! | 14:14:40 |
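[Editorial sketch] For reference, a minimal sketch of what that might look like in a NixOS configuration.nix, assuming the standard module-system nixpkgs instance (option names as in nixpkgs at the time):

```nix
# Sketch: enable CUDA globally for the system's nixpkgs instance.
# cudaSupport pulls in unfree packages, so allowUnfree is needed too.
{ config, pkgs, ... }:
{
  nixpkgs.config = {
    allowUnfree = true;
    cudaSupport = true;
  };
}
```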
SomeoneSerge (matrix works sometimes) | (nixpkgs-config as flake input when) | 14:54:12 |
Kevin Mittman (UTC-7) | In reply to @ss:someonex.net Wdym by "single input" and by the "closure with packages for a component"? I received a request to provide a product-level tarball with all of the components for, e.g., CUDA. IMHO that's basically a runfile, so it's unclear whether that would be helpful or not. | 19:19:20 |
SomeoneSerge (matrix works sometimes) | Yes sounds like a step backwards | 19:20:29 |
SomeoneSerge (matrix works sometimes) | * Yes sounds like a step backwards imo | 19:20:35 |
Kevin Mittman (UTC-7) | Right, so for manifest v4 we're trying to add build and runtime dependencies to the JSON. Those dlopen()s are still a thorn, though | 19:57:47 |
| 14 May 2024 |
| kaya 𖤐 changed their profile picture. | 10:49:13 |
aidalgol | Thread about zluda over on Discourse: https://discourse.nixos.org/t/overlaying-packages-using-cuda-to-use-zluda/45374 They're trying to get cudaPackages to use zluda. | 19:35:24 |
| 15 May 2024 |
ˈt͡sɛːzaɐ̯ | In reply to @aidalgol:matrix.org Thread about zluda over on Discourse: https://discourse.nixos.org/t/overlaying-packages-using-cuda-to-use-zluda/45374 They're trying to get cudaPackages to use zluda. (wonder if localai works without blaslt. because the nix zluda package excludes that for now...) | 00:11:32 |
| evax joined the room. | 21:57:42 |
| 16 May 2024 |
evax | Hi I'm trying to get jax with cuda to work in WSL using a flake, but the GPU is never recognized. Torch in the same flake recognizes the device. I've tried setting nixpkgs to both nixos-23.11 and nixos-unstable. | 19:11:22 |
trexd | In reply to @evax:matrix.org Hi I'm trying to get jax with cuda to work in WSL using a flake, but the GPU is never recognized. Torch in the same flake recognizes the device. I've tried setting nixpkgs to both nixos-23.11 and nixos-unstable. Can you post your Nix code? | 19:18:24 |
evax | {
description = "Jax+cuda shell";
nixConfig = {
extra-substituters = [
"https://cuda-maintainers.cachix.org"
];
extra-trusted-public-keys = [
"cuda-maintainers.cachix.org-1:0dq3bujKpuEPMCX6U4WylrUDZ9JyUG0VpVZa7CNfq5E="
];
};
inputs = {
nixpkgs.url = "github:NixOS/nixpkgs/nixos-23.11";
flake-utils.url = "github:numtide/flake-utils";
};
outputs = { self, nixpkgs, flake-utils }:
flake-utils.lib.eachDefaultSystem (system:
let
config = {
allowUnfree = true;
cudaSupport = true;
};
pkgs = (import nixpkgs { inherit system config; }).pkgs;
python3 = pkgs.python311;
deps = ps: with ps; [
jax
jaxlib
];
devPython = python3.withPackages(ps: with ps; (deps(ps) ++ [
ipython
]));
in rec {
inherit pkgs;
devShell = pkgs.stdenv.mkDerivation {
name = "jax-shell";
buildInputs = [
devPython
];
shellHook = ''
export CUDA_PATH=${pkgs.cudatoolkit}
export LD_LIBRARY_PATH=/usr/lib/wsl/lib:${pkgs.linuxPackages.nvidia_x11}/lib:${pkgs.ncurses5}/lib
export EXTRA_LDFLAGS="-L/lib -L${pkgs.linuxPackages.nvidia_x11}/lib"
export EXTRA_CCFLAGS="-I/usr/include"
'';
};
defaultPackage = devShell;
}
);
}
| 21:02:04 |
| 17 May 2024 |
evax | This was originally with Nix on top of Alma Linux in WSL; I switched to NixOS-WSL and have the same issue | 07:07:30 |
SomeoneSerge (matrix works sometimes) |
export LD_LIBRARY_PATH=/usr/lib/wsl/lib:${pkgs.linuxPackages.nvidia_x11}/lib:${pkgs.ncurses5}/lib
You don't want to reference nvidia_x11 in WSL environments. Even on linux we don't reference it directly, cf. various posts about "impure drivers" on github and discourse | 08:38:07 |
SomeoneSerge (matrix works sometimes) |
/usr/lib/wsl/lib:
I forget now, is libcuda.so placed directly under this path, or in a subdirectory? | 08:38:38 |
SomeoneSerge (matrix works sometimes) | Could you also please gist the errors, and the logs for LD_DEBUG=libs python -c "import torch; torch.cuda.is_available()" and LD_DEBUG=libs python -c "..." (some minimal code to make jax attempt loading libcuda)? | 08:39:58 |
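[Editorial sketch] LD_DEBUG=libs is a glibc loader feature: it makes ld.so trace its library search and load decisions to stderr. A hedged sketch of capturing and filtering such a trace from Python, using `true` as a stand-in target since it runs without torch or jax installed:

```python
import os
import subprocess

# LD_DEBUG=libs asks glibc's dynamic loader to log every library
# search and load to stderr; filtering that log for "libcuda" shows
# whether (and from which path) the driver library was picked up.
env = dict(os.environ, LD_DEBUG="libs")
proc = subprocess.run(["true"], env=env, capture_output=True, text=True)

cuda_lines = [line for line in proc.stderr.splitlines() if "libcuda" in line]
print(proc.returncode, len(cuda_lines))
```

Against the real reproducer one would run, e.g., `LD_DEBUG=libs python -c "import torch; torch.cuda.is_available()" 2> ld.log` and then search ld.log for libcuda.so.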
evax | thanks, let me try these things | 08:45:37 |