| 2 Oct 2025 |
Daniel Fahey | No worries. | 13:04:15 |
Daniel Fahey | Looks okay, I'm going to force push again with the corrected commit message crediting you | 13:05:36 |
Gaétan Lepage | Ok! | 13:12:44 |
Gaétan Lepage | It built successfully on my system! Running a full nixpkgs-review with cuda support right now. | 13:28:48 |
Daniel Fahey | Looking forward to seeing it build on https://hydra.nixos-cuda.org hehe | 13:32:04 |
| 3 Oct 2025 |
Daniel Fahey | Gaétan Lepage: good morning! Think I got to the bottom of the failing Python 3.12 CPU build. Can I please have your opinion on https://github.com/NixOS/nixpkgs/pull/447722#issuecomment-3364822620 ? I think I'll just mark Python 3.12 CPU builds as broken? | 08:49:51 |
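(For readers following along: marking a build combination as broken in Nixpkgs is usually done via `meta.broken`. A minimal sketch, assuming the CPU/CUDA split is exposed as a `cudaSupport` flag — the exact predicate here is an illustration, not necessarily what the PR ended up doing:)

```nix
# Hypothetical sketch of marking Python 3.12 CPU-only builds as broken.
# `cudaSupport` and the exact predicate are assumptions for illustration.
meta = {
  # pythonOlder "3.13" is true for Python 3.12 and earlier
  broken = !cudaSupport && pythonOlder "3.13";
};
```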
| 4 Oct 2025 |
lon | Oh btw, v0.11 is out now https://github.com/vllm-project/vllm/releases/tag/v0.11.0 | 12:33:08 |
lon | Thank you all for making this package work and keeping it so up to date, you rock ❤️ | 12:33:56 |
Daniel Fahey | Gaétan Lepage: I think this PR is good to merge, let's ship | 14:09:13 |
Daniel Fahey | No one likes us, we don't care! We are Nix, super Nix https://repology.org/project/python%3Avllm/versions | 14:14:54 |
Daniel Fahey | Gaétan Lepage: see https://github.com/NixOS/nixpkgs/compare/master...daniel-fahey:nixpkgs:update-vllm
What's best? Shall I supersede your PR?
| 19:18:57 |
Gaétan Lepage | I merged the vllm bump PR. | 19:19:22 |
Gaétan Lepage | Now I'm working on bumping it to v0.11.0 | 19:19:33 |
Daniel Fahey | Me too lol | 19:20:28 |
Gaétan Lepage | Will open the PR in a minute | 19:20:59 |
Daniel Fahey | Cool, I'm writing up what I've discovered | 19:25:37 |
lon | Redacted or Malformed Event | 22:10:50 |
Daniel Fahey | lon: oh, you deleted it (hehe). I actually didn't even think about it and had completely forgotten nixpkgs didn't have CUDA 13 yet (https://github.com/NixOS/nixpkgs/pull/437723)
Thanks for looking into it all the same
| 22:29:03 |
Gaétan Lepage | Daniel Fahey FYI vllm is broken on master as my PR was merged slightly too soon. | 23:12:03 |
Gaétan Lepage | If you have a bit of time to investigate, please go on :) | 23:12:12 |
lon | Yes, sorry, I deleted because I saw your commit and it's the same as mine (save for the update script! I didn't know that was a pattern people in nixpkgs used, TIL) | 23:13:31 |
lon | the nvidia/cutlass dependency can also be updated fwiw, with the update script
| 23:18:40 |
lon | [image attachment: image.png] | 23:18:43 |
Daniel Fahey | yeah, just started rewriting it | 23:23:25 |
Daniel Fahey | How can you tell? Hydra? Got a link? | 23:27:58 |
Daniel Fahey | Looks okay for me, some other problem? CUDA build?
[daniel@laptop:~/Source/nixpkgs]$ nix-build -I nixpkgs=https://github.com/NixOS/nixpkgs/archive/b967613ed760449a73eaa73d7b69eb45e857ce1a.tar.gz --expr 'with import <nixpkgs> { }; python313Packages.vllm'
unpacking 'https://github.com/NixOS/nixpkgs/archive/b967613ed760449a73eaa73d7b69eb45e857ce1a.tar.gz' into the Git cache...
/nix/store/amncczb34wd5zingwclr3sqa6q7kahay-python3.13-vllm-0.11.0
[daniel@laptop:~/Source/nixpkgs]$ ./result/bin/vllm --help
INFO 10-05 00:44:00 [__init__.py:216] Automatically detected platform cpu.
usage: vllm [-h] [-v] {chat,complete,serve,bench,collect-env,run-batch} ...
vLLM CLI
positional arguments:
  {chat,complete,serve,bench,collect-env,run-batch}
    chat          Generate chat completions via the running API server.
    complete      Generate text completions based on the given prompt via the running API server.
    collect-env   Start collecting environment information.
    run-batch     Run batch prompts and write results to file.
options:
  -h, --help      show this help message and exit
  -v, --version   show program's version number and exit
For full list: vllm [subcommand] --help=all
For a section: vllm [subcommand] --help=ModelConfig (case-insensitive)
For a flag: vllm [subcommand] --help=max-model-len (_ or - accepted)
Documentation: https://docs.vllm.ai
| 23:45:17 |
| 5 Oct 2025 |
Daniel Fahey | I ended up rewriting the whole thing if you want to give it a spin and leave a review? https://github.com/NixOS/nixpkgs/pull/448828 | 12:26:53 |
Daniel Fahey | I've been using this one-liner while cobbling it together
git show cc6098112333e5ac645aa14f2ea9f70878d8fe22:pkgs/development/python-modules/vllm/default.nix \
> pkgs/development/python-modules/vllm/default.nix \
&& git diff pkgs/development/python-modules/vllm/default.nix \
&& ./pkgs/development/python-modules/vllm/update.sh \
&& git diff pkgs/development/python-modules/vllm/default.nix
cc6098112333e5ac645aa14f2ea9f70878d8fe22 being the Nixpkgs revision with vLLM v0.10.2; you can also test it with other revisions corresponding to other versions. I almost went to town writing tests, but I'd had enough fun by then
| 12:29:44 |
Daniel Fahey | yeah, see https://wiki.nixos.org/wiki/Nixpkgs/Update_Scripts
and you use e.g. passthru.updateScript = ./update.sh
(apparently); it's my first time writing one, so I'll have to wait and see if @r-ryantm uses it
| 12:50:30 |
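(For context, the wiki pattern mentioned above usually amounts to a one-line attribute in the derivation. A minimal sketch — the surrounding derivation is illustrative, not the actual vllm expression:)

```nix
# Hypothetical sketch of wiring an update script into a Python package.
buildPythonPackage rec {
  pname = "vllm";
  version = "0.11.0";
  # ... src, build-system, dependencies elided ...

  # r-ryantm's bot discovers and runs this to bump version and hashes
  passthru.updateScript = ./update.sh;
}
```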