23 Jul 2024 |
| kaya changed their profile picture. | 05:30:28 |
| Ezzobir Bezziou joined the room. | 08:22:21 |
| sellout joined the room. | 13:32:24 |
SomeoneSerge (utc+3) | In reply to @ss:someonex.net Sounds like "why aren't we doing this yet?" Speaking of which, we need to stop copy-pasting this stuff because why aren't these default: https://github.com/NixOS/nixpkgs/pull/328713/files#diff-2d862bf9684df6fdface7aabf2af2b1767eb17ba2f9ab8b2e7db03d22c3c0660R196-R216 | 13:49:06 |
SomeoneSerge (utc+3) | Should be like a single option | 13:49:23 |
SomeoneSerge (utc+3) | A single option to blacklist everything, and then you add a few exceptions | 13:49:45 |
hexa (UTC+1) | ideally | 13:58:01 |
| @trofi:matrix.org left the room. | 19:05:15 |
24 Jul 2024 |
Pascal | Does anyone else have trouble building ollama-cuda from nixos-unstable lately? | 05:14:53 |
hexa (UTC+1) | Pascal: anything concrete that you are seeing? | 09:40:09 |
| kaya changed their profile picture. | 21:52:00 |
25 Jul 2024 |
Pascal | Compilation of nixpkgs#ollama-cuda fails with: '/nix/store/4fxpzb7rqf19r5hypxpzbf72jaa6mphw-cuda_cudart-12.2.140-dev/include/cuda_runtime.h:82:10: fatal error: crt/host_config.h: No such file or directory' | 05:54:18 |
Pascal | @hexa (UTC+1) 👆 | 05:55:35 |
Pascal | This is on non-NixOS (Ubuntu 22.04). | 06:00:40 |
ˈt͡sɛːzaɐ̯ | Pascal: I can't reproduce that on NixOS; github:NixOS/nixpkgs/289eafaaf02177e5814d8738cf57e259f9eae46e#ollama-cuda builds fine. (Should be master from about half an hour ago.) | 06:27:22 |
Pascal | In reply to @julius:mtx.liftm.de Pascal: I can't reproduce that on a nixos, github:NixOS/nixpkgs/289eafaaf02177e5814d8738cf57e259f9eae46e#ollama-cuda builds fine. (should master from half an ~hour ago.) Thanks! I'll give that one a go later. | 09:17:32 |
Pascal | @julius:mtx.liftm.de: Just checked: Works on NixOS, but still fails on Ubuntu (using nix run --impure). Same error: crt/host_config.h not found. Dang... it worked until 3 weeks or so ago.
It fails during 'Compiling the CUDA compiler identification source file', oddly using /usr/local/cuda/bin/nvcc despite finding /nix/store/...-cuda_nvcc... 🤷‍♂️ | 15:07:36 |
SomeoneSerge (utc+3) | In reply to @phirsch:matrix.org @julius:mtx.liftm.de: Just checked: Works on NixOS, but still fails on Ubuntu (using nix run --impure). Same error: crt/host_config.h not found. Dang... it worked until 3 weeks or so ago.
It fails during 'Compiling the CUDA compiler identification source file', oddly using /usr/local/cuda/bin/nvcc despite finding /nix/store/...-cuda_nvcc... 🤷‍♂️ Try entering a shell with cudaPackages.cuda_nvcc I guess | 15:13:03 |
Pascal | Since I don't have any CUDA env vars set nor nvcc on the PATH, I suspect that the nvcc location is explicitly specified in the derivation's build script... | 15:13:14 |
SomeoneSerge (utc+3) | I didn't check but I'd guess ollama doesn't propagate nvcc (not necessarily bad) and then it ends up picking up the bad nvcc from ubuntu (/usr/...) | 15:14:17 |
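[Editor's note: a minimal sketch of the PATH-shadowing hypothesis above. It uses throwaway stub scripts instead of real nvcc binaries, and the /tmp paths are invented for illustration; the point is only that whichever directory comes first on PATH wins the compiler probe.]

```shell
#!/bin/sh
# Simulate a "system" CUDA dir and a Nix-store dir, each with a stub nvcc.
mkdir -p /tmp/nvcc-demo/system /tmp/nvcc-demo/nixstore
printf '#!/bin/sh\necho system-nvcc\n' > /tmp/nvcc-demo/system/nvcc
printf '#!/bin/sh\necho nix-nvcc\n'    > /tmp/nvcc-demo/nixstore/nvcc
chmod +x /tmp/nvcc-demo/system/nvcc /tmp/nvcc-demo/nixstore/nvcc
# The "system" copy shadows the packaged one because it comes first on PATH:
PATH="/tmp/nvcc-demo/system:/tmp/nvcc-demo/nixstore:$PATH" nvcc
# prints: system-nvcc
```

This mirrors what an impure build on Ubuntu can hit: if /usr/local/cuda/bin precedes the Nix-provided nvcc on PATH during the CMake compiler-identification step, the host toolchain gets picked up.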
Pascal | Now this is weird: Simply running
nix run github:NixOS/nixpkgs/289eafaaf02177e5814d8738cf57e259f9eae46e#ollama serve
(without -cuda!) works, showing 'library=cuda' and accurate GPU info... Can't claim to understand why, but it looks like I don't actually need the '-cuda' variant. (Suspect that it's somehow managing to dlopen() the system CUDA SOs?) | 15:31:57 |
Pascal | @ss:someonex.net: Thanks for the tips (only saw your replies after refreshing just now). Might try that as well (although I might already be good according to the above). | 15:41:55 |
26 Jul 2024 |
| Jason Schnitzer joined the room. | 01:35:00 |
| nathan72419 joined the room. | 16:11:45 |
connor (he/him) (UTC-7) | As a heads up, I’m moving this week so my availability will probably be more limited than it was already for a few weeks :/ | 16:27:09 |
27 Jul 2024 |
Pascal | @SomeoneSerge (UTC+3) @ˈt͡sɛːzaɐ̯ No dice... While ollama (without '-cuda') somehow manages to get GPU serial and VRAM allocation info, it doesn't use the GPU when actually running a model (outputs 'Not compiled with GPU offload support'). And unfortunately, using 'nix run --impure' as above from within a nix shell with 'nvcc' from nixpkgs still fails because it's using nvcc from /usr/local/... | 07:04:48 |
28 Jul 2024 |
| matthewcroughan changed their display name from matthewcroughan to matthewcroughan - going to nix.camp. | 16:12:17 |