Hydra

31 Aug 2025

| Message | Time |
|---|---|
| Is there any workaround to fix a cached failure on Hydra that cannot be reproduced locally? Or is the solution just to wait for some cache expiry? Asking for https://hydra.nixos.org/build/306350994 | 16:47:29 |
| Restarted | 17:05:05 |
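
Not a workaround as such, but for inspecting a build like this programmatically: Hydra answers with JSON when asked for it, so a build's state can be checked from the command line (the exact fields in the response are best confirmed against the Hydra API docs).

```console
# Fetch a machine-readable description of the build;
# the buildstatus field encodes how the build ended.
$ curl -sH 'Accept: application/json' https://hydra.nixos.org/build/306350994
```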

1 Sep 2025

| Message | Time |
|---|---|
| John Ericson: Since we switched to nix-eval-jobs in Hydra, I noticed that evaluation is no longer "pure", because it uses --expr in combination with builtins.getFlake. So I opened this pull request to add an alternative map function that can be applied to the expression that nix-eval-jobs receives: https://github.com/nix-community/nix-eval-jobs/pull/378 | 13:47:52 |
| @joerg:thalheim.io: I think it can still be pure? | 15:47:58 |
| Can getFlake with lock file info be done with pure eval? | 15:48:29 |
| https://github.com/nix-community/nix-eval-jobs/blob/a579b1a416dc04d50c0dc2832e9da24b0d08dbac/src/nix-eval-jobs.cc#L454 | 15:49:14 |
| If you pass --pure-eval, what happens? | 15:51:51 |
| I thought we had some logic to do an initial fetch so we had the info we needed for pure eval? | 15:52:11 |
| We can do the new flag in nix-eval-jobs, but it isn't really a nix-eval-jobs-specific problem, so I'd rather have a more general solution | 15:52:46 |
| --pure-eval wouldn't work with getFlake because those inputs are not populated into the Nix path. | 16:25:06 |
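
For context on the pure-eval exchange, a minimal sketch of current Nix behaviour (the error text is approximate and the commit hash is a placeholder): builtins.getFlake rejects unlocked flake references under pure evaluation but accepts fully locked ones, which is presumably what the "initial fetch" logic mentioned above is meant to provide.

```console
# Pure evaluation (the default for `nix eval`) rejects an unlocked reference:
$ nix eval --expr '(builtins.getFlake "github:NixOS/nixpkgs").lib.version'
error: cannot call 'getFlake' on unlocked flake reference 'github:NixOS/nixpkgs' (use --impure to override)

# A reference locked to a full commit works even in pure mode:
$ nix eval --expr '(builtins.getFlake "github:NixOS/nixpkgs/0000000000000000000000000000000000000000").lib.version'

# --impure lifts the restriction, which is exactly the impurity under discussion:
$ nix eval --impure --expr '(builtins.getFlake "github:NixOS/nixpkgs").lib.version'
```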

11 Sep 2025

| Message | Time |
|---|---|
| There was a kernel regression causing the tailscale test to fail, which was supposedly resolved in 6.12.46. How can I see whether the Hydra build servers are running a fixed kernel so the test passes, and what would trigger an attempt to rebuild/test tailscale? | 23:26:44 |
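
One hedged note on the question above: if the regression is in the guest kernel, the NixOS test VM boots the kernel from the nixpkgs revision being tested rather than the build machine's kernel, so the relevant check is which kernel version a given branch pins (the branch name below is an assumption about which channel feeds the jobset).

```console
# Ask a nixpkgs branch which kernel version its default linuxPackages pins:
$ nix eval --raw github:NixOS/nixpkgs/nixos-unstable#linuxPackages.kernel.version
```

A kernel bump changes the test derivation's inputs, so the next evaluation of the jobset should queue a fresh build of the tailscale test on its own.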

12 Sep 2025

| Message | Time |
|---|---|
| Relevant folks are in -> #infra:nixos.org | 00:27:52 |

16 Sep 2025

| Message | Time |
|---|---|
| Hi, could it be that the GitHub Status plugin is currently not working as expected for flakes? For some time I have been missing the reported build status for my flake on GitHub. The error message according to hydra-notify is the following: https://github.com/NixOS/hydra/blob/274027eb504c7fe090e00c16fd94f4b832981095/src/lib/Hydra/Plugin/GithubStatus.pm#L100, which is right according to the regex, though I am not seeing what should be wrong on my end; it looks to me like the narHash that is in the URL is unexpected. My jobset definition is here: https://github.com/Shawn8901/nixos-configuration/blob/b1f6e066c77c4eb078bcf873bf6e7eeb3a91db16/.hydra/jobsets.nix#L28. I don't see where I could cause the narHash to be included, and I assume it's more an internal issue here. | 19:19:18 |
| I saw that there is https://github.com/NixOS/hydra/issues/1486, but my hydra.conf looks fine according to what is mentioned in the issue | 19:23:16 |
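
To make the suspected mismatch concrete, a hypothetical illustration (the hash value is made up): per the message above, the regex on the linked plugin line matches a plain GitHub URI, and a locked flake URI that carries a narHash query parameter would fall outside it.

```
# the shape the plugin's regex appears to expect:
github:Shawn8901/nixos-configuration
# the shape of a locked flake URI (narHash value is a placeholder):
github:Shawn8901/nixos-configuration?narHash=sha256-AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=
```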