!PbcQeaWcgMyjVfeGQN:nixos.org

Nix Mozilla 🦊🐦🐒

152 Members | 45 Servers

A room about a number of weird animals (also known as Mozilla products): Firefox, Thunderbird, Spidermonkey, NSS, cacert. Also a little bit of fun times, small amounts of extreme, when building weird animals. But for bugs please file GitHub issues. | Release Schedule: https://whattrainisitnow.com | Crash-Stats: https://crash-stats.mozilla.org/search/?distribution_id=%3Dnixos&product=Firefox&product=Thunderbird



Sender | Message | Time
27 Aug 2025
nbp (@nbp:mozilla.org): https://bugzilla.mozilla.org/show_bug.cgi?id=1985439#c1 seeing that as part of my bugmail … This sounds like very bad news for Nix, given that I tried llama.cpp with various backends on my last computer, to find the ones which run and the ones which perform well. [16:03:08]
hexa (@hexa:lossy.network): huh? [16:04:50]
nbp (@nbp:mozilla.org): llama.cpp might have a way to select between various backends, but last time I tried it was a nightmare to get everything at once (i.e. it did not compile). [16:07:33]
nbp (@nbp:mozilla.org): and if the Nvidia backend is enabled, we might have yet another build of Firefox. [16:08:04]
aloisw (@aloisw:julia0815.de): From the package definition it looks like you can build it without all of those and it will use BLAS; is that not good enough for Firefox? [16:10:53]
aloisw (@aloisw:julia0815.de): I don't think I'm the only one who doesn't want their Firefox package to depend on CUDA and ROCm for features they don't even use. [16:11:37]
nbp (@nbp:mozilla.org): The problem is that we would then have three variants of Firefox: one optimized with CUDA, one with ROCm, and one without? Sounds like we might want some re-linking mechanism to substitute one llama-cpp for another in a completed build of Firefox. [16:14:32]
K900 (@k900:0upti.me): Can we just let it use Vulkan compute or OpenCL or whatever generic shit it has [16:19:38]
aloisw (@aloisw:julia0815.de): It has all of them. [16:19:59]
K900 (@k900:0upti.me): Cool, can we just have it use that [16:20:09]
K900 (@k900:0upti.me): Until someone complains [16:20:12]
K900 (@k900:0upti.me): And then we tell them to fuck off and override llama-cpp if they want [16:20:20]
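
For context, overriding the packaged llama-cpp is roughly what K900 is suggesting here. A minimal sketch, assuming the llama-cpp derivation in nixpkgs exposes backend flags such as cudaSupport, rocmSupport, and vulkanSupport (check the actual package arguments in your nixpkgs revision); swapping the top-level attribute through an overlay would let any dependent package, including a hypothetical future Firefox that links against llama.cpp, pick up the chosen backend:

{
  # NixOS module fragment: replace the top-level llama-cpp with a CUDA-enabled
  # variant for everything that depends on it. The flag name below is an
  # assumption; adjust it to whatever the llama-cpp package in your nixpkgs
  # checkout actually accepts.
  nixpkgs.overlays = [
    (final: prev: {
      llama-cpp = prev.llama-cpp.override {
        cudaSupport = true; # or rocmSupport / vulkanSupport, depending on hardware
      };
    })
  ];
}
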
aloisw (@aloisw:julia0815.de):
aloisw@exodus ~> nix path-info -S /nix/store/cwfaak5cpb6s49g427a4x9agxxgd1djc-llama-cpp-6210
/nix/store/cwfaak5cpb6s49g427a4x9agxxgd1djc-llama-cpp-6210        201550672
aloisw@exodus ~> nix path-info -S /nix/store/9pjd655imkq4qf0vv31ad80jhmszqh8s-llama-cpp-6210
/nix/store/9pjd655imkq4qf0vv31ad80jhmszqh8s-llama-cpp-6210        127200088
aloisw@exodus ~> nix path-info -S /nix/store/2z14x11p6l8wrbk7zjilid3259qgqjrm-llama-cpp-6210
/nix/store/2z14x11p6l8wrbk7zjilid3259qgqjrm-llama-cpp-6210         94045536

Wat, first is CPU, second is Vulkan, third is OpenCL. [16:25:14]
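
A comparison like the one above can be reproduced by building the variants with explicit backend flags and asking for their closure sizes (nix path-info -S prints the closure size in bytes). A rough sketch, assuming the override flag names match the llama-cpp arguments in the nixpkgs checkout being tested:

# Build three llama-cpp variants and compare closure sizes.
# vulkanSupport / openclSupport are assumed flag names; check the package
# definition for the exact arguments.
nix build --impure --expr '(import <nixpkgs> { }).llama-cpp' -o result-cpu
nix build --impure --expr '(import <nixpkgs> { }).llama-cpp.override { vulkanSupport = true; }' -o result-vulkan
nix build --impure --expr '(import <nixpkgs> { }).llama-cpp.override { openclSupport = true; }' -o result-opencl
nix path-info -S ./result-cpu ./result-vulkan ./result-opencl
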
29 Aug 2025
@luna-null:matrix.org left the room. [02:40:17]
nbp (@nbp:mozilla.org): https://blog.mozilla.org/en/firefox/firefox-ai/speeding-up-firefox-local-ai-runtime/ (onnxruntime dependency explained) [10:26:38]
hexa (@hexa:lossy.network): tried tab grouping recently and it was indeed super snappy [13:18:22]
hexa (@hexa:lossy.network): [image: image.png] [13:33:02]
K900 (@k900:0upti.me): Hmm I wonder if the -bin packages need to be fixed for this too [14:09:38]
K900 (@k900:0upti.me): I did get some "AI" popup in nightly the other day [14:09:43]
2 Sep 2025
hexa (@hexa:lossy.network): Also on 142.0 [01:00:38]
3 Sep 2025
ghpzin (@ghpzin:envs.net) joined the room. [07:55:19]
5 Sep 2025
vcunat (@vcunat:matrix.org): Firefox 145 will not have 32-bit Linux support. [16:00:40]
vcunat (@vcunat:matrix.org): Though in nixpkgs it hasn't built for a while already. [16:01:37]
emily (@emilazy:matrix.org): I guess Chromium must still support it, as Steam uses Chromium? 🤔 [16:04:50]
K900 (@k900:0upti.me): Steam uses 64-bit Chromium [16:06:29]
emily (@emilazy:matrix.org): wait, so Steam isn't actually 32-bit? [16:06:56]
K900 (@k900:0upti.me): The main executable is [16:08:16]
K900 (@k900:0upti.me): The Chrome bits are not [16:08:19]
emily (@emilazy:matrix.org): I see [16:09:04]
hexa (@hexa:lossy.network): (redacted or malformed event) [19:25:26]
