| 2 Oct 2025 |
Gaétan Lepage | It built successfully on my system! Running a full nixpkgs-review with cuda support right now. | 13:28:48 |
Daniel Fahey | Looking forward to seeing it build on https://hydra.nixos-cuda.org hehe | 13:32:04 |
| 3 Oct 2025 |
Daniel Fahey | Gaétan Lepage: good morning! I think I got to the bottom of the failing Python 3.12 CPU build. Can I please have your opinion on https://github.com/NixOS/nixpkgs/pull/447722#issuecomment-3364822620? I think I'll just mark Python 3.12 CPU builds as broken. | 08:49:51 |
| 4 Oct 2025 |
| @palasso:matrix.org left the room. | 10:43:55 |
lon | Oh btw, v0.11 is out now https://github.com/vllm-project/vllm/releases/tag/v0.11.0 | 12:33:08 |
lon | Thank you all for making this package work and keeping it so up to date, you rock ❤️ | 12:33:56 |
Daniel Fahey | Gaétan Lepage: I think this PR is good to merge, let's ship | 14:09:13 |
Daniel Fahey | No one likes us, we don't care! We are Nix, super Nix https://repology.org/project/python%3Avllm/versions | 14:14:54 |
Daniel Fahey | Gaétan Lepage: see https://github.com/NixOS/nixpkgs/compare/master...daniel-fahey:nixpkgs:update-vllm
What's best? Shall I supersede your PR?
| 19:18:57 |
Gaétan Lepage | I merged the vllm bump PR. | 19:19:22 |
Gaétan Lepage | Now I'm working on bumping it to v0.11.0 | 19:19:33 |
Daniel Fahey | Me too lol | 19:20:28 |
Gaétan Lepage | Will open the PR in a minute | 19:20:59 |
Daniel Fahey | Cool, I'm writing up what I've discovered | 19:25:37 |
lon | Redacted or Malformed Event | 22:10:50 |
Daniel Fahey | lon: oh, you deleted it (hehe). I actually didn't even think about it and had completely forgotten nixpkgs didn't have CUDA 13 yet (https://github.com/NixOS/nixpkgs/pull/437723)
Thanks for looking into it all the same
| 22:29:03 |
Gaétan Lepage | Daniel Fahey: FYI vllm is broken on master, as my PR was merged slightly too soon. | 23:12:03 |
Gaétan Lepage | If you have a bit of time to investigate, please go ahead :) | 23:12:12 |
lon | Yes, sorry, I deleted it because I saw your commit and it's the same as mine (save for the update script! I didn't know that was a pattern people in nixpkgs used, TIL) | 23:13:31 |
lon | the nvidia/cutlass dependency can also be updated fwiw, with the update script
| 23:18:40 |
lon | (image attachment: image.png) | 23:18:43 |
Daniel Fahey | yeah, just started rewriting it | 23:23:25 |
Daniel Fahey | How can you tell? Hydra? Got a link? | 23:27:58 |
Daniel Fahey | Looks okay to me; some other problem? CUDA build?
[daniel@laptop:~/Source/nixpkgs]$ nix-build -I nixpkgs=https://github.com/NixOS/nixpkgs/archive/b967613ed760449a73eaa73d7b69eb45e857ce1a.tar.gz --expr 'with import <nixpkgs> { }; python313Packages.vllm'
unpacking 'https://github.com/NixOS/nixpkgs/archive/b967613ed760449a73eaa73d7b69eb45e857ce1a.tar.gz' into the Git cache...
/nix/store/amncczb34wd5zingwclr3sqa6q7kahay-python3.13-vllm-0.11.0
[daniel@laptop:~/Source/nixpkgs]$ ./result/bin/vllm --help
INFO 10-05 00:44:00 [__init__.py:216] Automatically detected platform cpu.
usage: vllm [-h] [-v] {chat,complete,serve,bench,collect-env,run-batch} ...
vLLM CLI
positional arguments:
  {chat,complete,serve,bench,collect-env,run-batch}
    chat                 Generate chat completions via the running API server.
    complete             Generate text completions based on the given prompt via the running API server.
    collect-env          Start collecting environment information.
    run-batch            Run batch prompts and write results to file.
options:
  -h, --help             show this help message and exit
  -v, --version          show program's version number and exit
For full list: vllm [subcommand] --help=all
For a section: vllm [subcommand] --help=ModelConfig (case-insensitive)
For a flag: vllm [subcommand] --help=max-model-len (_ or - accepted)
Documentation: https://docs.vllm.ai
| 23:45:17 |
| 5 Oct 2025 |
Daniel Fahey | I ended up rewriting the whole thing if you want to give it a spin and leave a review? https://github.com/NixOS/nixpkgs/pull/448828 | 12:26:53 |
Daniel Fahey | I've been using this one-liner while cobbling it together
git show cc6098112333e5ac645aa14f2ea9f70878d8fe22:pkgs/development/python-modules/vllm/default.nix \
> pkgs/development/python-modules/vllm/default.nix \
&& git diff pkgs/development/python-modules/vllm/default.nix \
&& ./pkgs/development/python-modules/vllm/update.sh \
&& git diff pkgs/development/python-modules/vllm/default.nix
cc6098112333e5ac645aa14f2ea9f70878d8fe22 being the Nixpkgs revision with vLLM v0.10.2; you can also test it with other revisions corresponding to other versions. I almost went to town writing tests, but I'd had enough fun by then
| 12:29:44 |
Daniel Fahey | yeah, see https://wiki.nixos.org/wiki/Nixpkgs/Update_Scripts
and you use e.g. passthru.updateScript = ./update.sh
(apparently), it's my first time writing one, will have to wait and see if @r-ryantm uses it
| 12:50:30 |
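Daniel Fahey | For anyone following along, a minimal sketch of the wiki pattern. Everything here except the passthru.updateScript line is a made-up placeholder, not the actual vLLM expression:
```nix
{ lib, buildPythonPackage, fetchFromGitHub }:

buildPythonPackage rec {
  pname = "example";    # hypothetical package, for illustration only
  version = "1.0.0";

  src = fetchFromGitHub {
    owner = "example-org";
    repo = "example";
    rev = "v${version}";
    hash = lib.fakeHash;    # replace with the real hash
  };

  # nixpkgs tooling (e.g. the @r-ryantm bot) discovers this attribute
  # and runs the script to bump version and hashes automatically
  passthru.updateScript = ./update.sh;

  meta = {
    description = "Illustrative package showing the updateScript pattern";
    license = lib.licenses.mit;
  };
}
```
| 12:52:00 |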
Daniel Fahey | one could also automate updating e.g. https://github.com/NixOS/nixpkgs/blob/107f8b572eb41058b610f99aba21b9a1b5925cf8/pkgs/development/python-modules/vllm/default.nix#L183-216, but I thought what I'd done was try-hard over-engineering enough already
I really wanted to try and make a reference implementation that could easily be adapted to other complicated Python packages that have multiple git deps | 12:54:29 |
Daniel Fahey | https://www.explainxkcd.com/wiki/index.php/1319:_Automation | 12:55:42 |
Daniel Fahey | vLLM is becoming a huge project and pillar in the ecosystem, sometimes their cadence for releases is daily, and going through and checking each is fiddly and tedious. May well save us all some time, but it's good to just not have to worry https://www.explainxkcd.com/wiki/index.php/1205:_Is_It_Worth_the_Time%3F
hope it keeps working, lol | 13:01:36 |