| 27 Aug 2025 |
nbp | and if the Nvidia backend is enabled, we might have yet another build of Firefox. | 16:08:04 |
@aloisw:julia0815.de | From the package definition it looks like you can build it without all of those and it will use BLAS, is that not good enough for Firefox? | 16:10:53 |
@aloisw:julia0815.de | I don't think I'm the only one who doesn't want their Firefox package to depend on CUDA and ROCm for features they don't even use. | 16:11:37 |
nbp | The problem is that we would then have 3 variants of Firefox: one optimized with CUDA, one with ROCm, and one without.
Sounds like we might want some re-linking mechanism to substitute one llama-cpp for another on a completed build of Firefox. | 16:14:32 |
K900 | Can we just let it use vulkan compute or opencl or whatever generic shit it has | 16:19:38 |
@aloisw:julia0815.de | It has all of them. | 16:19:59 |
K900 | Cool can we just have it use that | 16:20:09 |
K900 | Until someone complains | 16:20:12 |
K900 | And then we tell them to fuck off and override llama-cpp if they want | 16:20:20 |
@aloisw:julia0815.de | aloisw@exodus ~> nix path-info -S /nix/store/cwfaak5cpb6s49g427a4x9agxxgd1djc-llama-cpp-6210
/nix/store/cwfaak5cpb6s49g427a4x9agxxgd1djc-llama-cpp-6210 201550672
aloisw@exodus ~> nix path-info -S /nix/store/9pjd655imkq4qf0vv31ad80jhmszqh8s-llama-cpp-6210
/nix/store/9pjd655imkq4qf0vv31ad80jhmszqh8s-llama-cpp-6210 127200088
aloisw@exodus ~> nix path-info -S /nix/store/2z14x11p6l8wrbk7zjilid3259qgqjrm-llama-cpp-6210
/nix/store/2z14x11p6l8wrbk7zjilid3259qgqjrm-llama-cpp-6210 94045536
Wat, first is CPU, second is Vulkan, third is OpenCL. | 16:25:14 |
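[Editor's note: the three `nix path-info -S` closure sizes above are in bytes. A quick sketch for rendering them human-readable with GNU coreutils' `numfmt` (the byte counts are copied verbatim from the log; the CPU/Vulkan/OpenCL labels follow the message above):]

```shell
# Closure sizes reported by `nix path-info -S` above, in bytes:
# CPU, Vulkan, and OpenCL builds of llama-cpp respectively.
for size in 201550672 127200088 94045536; do
  numfmt --to=iec --suffix=B "$size"
done
```

So roughly 193MB, 122MB, and 90MB: the CPU-only build has the largest closure, which is the "Wat" being expressed.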
| 29 Aug 2025 |
nbp | https://blog.mozilla.org/en/firefox/firefox-ai/speeding-up-firefox-local-ai-runtime/ onnxruntime dependency explained | 10:26:38 |
hexa | tried tab grouping recently and it was indeed super snappy | 13:18:22 |