| 20 Nov 2024 |
@aloisw:kde.org | In reply to @dre:imad.nyc Unless I'm confused, I meant in the nix repo: where is system defined as a flag that can be accepted? For example, man nix.conf tells me that it takes allow-unsafe-native-code-during-evaluation as a flag, which is defined here: https://github.com/NixOS/nix/blob/32becc87fef7340600df75ffed6e7c6bc56aa827/src/libexpr/eval-settings.hh#L54 system is defined in the libstore settings: https://github.com/NixOS/nix/blob/32becc87fef7340600df75ffed6e7c6bc56aa827/src/libstore/globals.hh#L188 | 05:48:53 |
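(For anyone following along: the pattern in globals.hh and eval-settings.hh is that each setting is a field that registers itself under the string name documented in man nix.conf. The toy sketch below only illustrates that idea; it is not the actual Nix Setting/Config classes, and the names and default value are made up.)

```cpp
#include <iostream>
#include <map>
#include <string>
#include <utility>

// Toy registry: a setting registers itself by name, so a key like "system"
// in nix.conf ends up writing to a typed field on a settings object.
struct SettingsRegistry {
    std::map<std::string, std::string *> byName;
};

struct StringSetting {
    std::string value;
    StringSetting(SettingsRegistry & reg, std::string def, const std::string & name)
        : value(std::move(def))
    {
        reg.byName[name] = &value;
    }
};

struct Settings {
    SettingsRegistry reg;
    // Analogous to the `system` setting in libstore's globals.hh
    // (name and default here are purely illustrative).
    StringSetting system{reg, "x86_64-linux", "system"};
};

int main()
{
    Settings s;
    // Simulate `system = aarch64-linux` appearing in nix.conf:
    *s.reg.byName.at("system") = "aarch64-linux";
    std::cout << s.system.value << "\n"; // prints aarch64-linux
}
```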
@aloisw:kde.org | In reply to @dre:imad.nyc Do you think you could point me to the entrypoint for --help in nix3 commands? I see showManPage, but as far as I can tell that's just nix2 showHelp: https://github.com/NixOS/nix/blob/32becc87fef7340600df75ffed6e7c6bc56aa827/src/nix/main.cc#L242 | 05:55:23 |
| updown joined the room. | 06:57:39 |
John Ericson | Eelco: OK https://github.com/NixOS/nix/issues/11928 is the ticket I would like to fix involving CA things. Don't worry about that. | 21:35:34 |
John Ericson | https://github.com/NixOS/nix/issues/11927 This is the plain code-cleanup, no-new-features ticket that I would like to see done before attempting it | 21:36:03 |
connor (burnt/out) (UTC-8) | As I've been working on the persistent lists PR, I've been thinking about the data structures we use currently for lists and attribute sets.
Lists: https://github.com/NixOS/nix/blob/0498e2259a0e6ae2f6e121579d35ace79c5b3ef2/src/libexpr/value.hh#L136
Attribute sets: https://github.com/NixOS/nix/blob/0498e2259a0e6ae2f6e121579d35ace79c5b3ef2/src/libexpr/attr-set.hh
Has anyone got metrics on what the most commonly used operations are on lists and attribute sets?
I'd like to think that we could find out what operations we need to make cheap and select a data structure which provides a low cost for those operations. | 22:25:53 |
tomberek | Most common is opUpdate. I've got some valgrind stuff somewhere. | 23:25:07 |
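(Some context on why opUpdate being the hot operation matters: attribute sets are kept as a sorted array of attributes — see attr-set.hh linked above — so lhs // rhs is essentially a merge of two sorted arrays with the right operand winning on name collisions, i.e. an O(n + m) allocation and copy per update. The sketch below is a self-contained illustration of that merge using plain strings and ints; it is not the actual Bindings code.)

```cpp
#include <iostream>
#include <string>
#include <utility>
#include <vector>

using Attr = std::pair<std::string, int>; // (name, value); int values for brevity
using Bindings = std::vector<Attr>;       // kept sorted by name, like Nix's Bindings

// Merge two sorted attribute sets; on duplicate names the right-hand side
// wins, which is the semantics of the `//` (update) operator.
Bindings update(const Bindings & lhs, const Bindings & rhs)
{
    Bindings out;
    out.reserve(lhs.size() + rhs.size());
    size_t i = 0, j = 0;
    while (i < lhs.size() && j < rhs.size()) {
        if (lhs[i].first < rhs[j].first) out.push_back(lhs[i++]);
        else if (rhs[j].first < lhs[i].first) out.push_back(rhs[j++]);
        else { out.push_back(rhs[j++]); ++i; } // same name: rhs shadows lhs
    }
    for (; i < lhs.size(); ++i) out.push_back(lhs[i]);
    for (; j < rhs.size(); ++j) out.push_back(rhs[j]);
    return out;
}

int main()
{
    Bindings a{{"bar", 1}, {"foo", 2}};
    Bindings b{{"baz", 3}, {"foo", 4}};
    for (auto & [name, v] : update(a, b))
        std::cout << name << " = " << v << "\n"; // bar = 1, baz = 3, foo = 4
}
```

The full copy per update is exactly what a persistent structure (e.g. a HAMT- or RRB-style tree) could make cheaper, at the possible price of slower lookups and worse cache behaviour, which is why the operation mix matters.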
Sergei Zimmerman (xokdvium) | In reply to @connorbaker:matrix.org As I've been working on the persistent lists PR, I've been thinking about the data structures we use currently for lists and attribute sets.
Lists: https://github.com/NixOS/nix/blob/0498e2259a0e6ae2f6e121579d35ace79c5b3ef2/src/libexpr/value.hh#L136
Attribute sets: https://github.com/NixOS/nix/blob/0498e2259a0e6ae2f6e121579d35ace79c5b3ef2/src/libexpr/attr-set.hh
Has anyone got metrics on what the most commonly used operations are on lists and attribute sets?
I'd like to think that we could find out what operations we need to make cheap and select a data structure which provides a low cost for those operations. I doubt that this information is readily available. It should be possible to either instrument cppnix to collect it or use profilers (think gprof/callgrind); callgrind would be the easiest approach | 23:26:06 |
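(A minimal sketch of the instrumentation route, assuming one adds counters to the evaluator by hand — nothing like this exists in cppnix as far as I know, and the counter names and the atexit report are made up for illustration.)

```cpp
#include <atomic>
#include <cstdio>
#include <cstdlib>

// Toy operation counters that could be spliced into the evaluator's list and
// attribute-set primitives; totals are printed when the process exits.
struct OpCounters {
    std::atomic<unsigned long long> attrUpdate{0}, attrLookup{0},
                                    listConcat{0}, listElem{0};
};

static OpCounters counters;

static void reportCounters()
{
    std::fprintf(stderr,
        "attr update: %llu\nattr lookup: %llu\nlist concat: %llu\nlist elem:   %llu\n",
        counters.attrUpdate.load(), counters.attrLookup.load(),
        counters.listConcat.load(), counters.listElem.load());
}

int main()
{
    std::atexit(reportCounters);
    // In a real instrumentation pass the increments would live in the
    // corresponding Expr*::eval methods (update, select, list concat, ...).
    counters.attrUpdate++;
    counters.attrLookup += 3;
    counters.listConcat++;
}
```

Callgrind gives cost per function for free, while counters like these could also record argument sizes (e.g. how large the merged attrsets are), which is the part that matters most for choosing a data structure.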
tomberek | I used this:
valgrind --tool=callgrind ./build/src/nix/nix eval --file '<nixpkgs/nixos>' system --arg configuration '{boot.isContainer=true;system.stateVersion="24.11";}' --argstr system "x86_64-linux"
and got these graphs: https://github.com/NixOS/nix/issues/9034#issue-1910269659
| 23:33:08 |
| 21 Nov 2024 |
| @nerves:bark.lgbt changed their profile picture. | 02:51:38 |
matthewcroughan | [nixos@nixos:~]$ nix-shell -p pciutils
error:
… while calling the 'import' builtin
at «string»:1:18:
1| {...}@args: with import <nixpkgs> args; (pkgs.runCommandCC or pkgs.runCommand) "shell" { buildInputs = [ (pciutils) ]; } ""
| ^
… while realising the context of a path
… while calling the 'findFile' builtin
at «string»:1:25:
1| {...}@args: with import <nixpkgs> args; (pkgs.runCommandCC or pkgs.runCommand) "shell" { buildInputs = [ (pciutils) ]; } ""
| ^
error: experimental Nix feature 'flakes' is disabled; add '--extra-experimental-features flakes' to enable it
| 15:16:11 |
matthewcroughan | Does this mean that Nix just doesn't work out of the box now because of flakes? Or have I misconfigured something? | 15:16:21 |
matthewcroughan | [nixos@nixos:~]$ nix --version
nix (Nix) 2.24.10
| 15:16:45 |
Kamilla 'ova | In reply to @matthewcroughan:defenestrate.it
[nixos@nixos:~]$ nix-shell -p pciutils
error:
… while calling the 'import' builtin
at «string»:1:18:
1| {...}@args: with import <nixpkgs> args; (pkgs.runCommandCC or pkgs.runCommand) "shell" { buildInputs = [ (pciutils) ]; } ""
| ^
… while realising the context of a path
… while calling the 'findFile' builtin
at «string»:1:25:
1| {...}@args: with import <nixpkgs> args; (pkgs.runCommandCC or pkgs.runCommand) "shell" { buildInputs = [ (pciutils) ]; } ""
| ^
error: experimental Nix feature 'flakes' is disabled; add '--extra-experimental-features flakes' to enable it
you probably have something like nixpkgs=flake:nixpkgs in your nix path | 15:27:30 |
| n-hass joined the room. | 20:59:53 |
| 22 Nov 2024 |
| @tanvir:hackliberty.org joined the room. | 20:14:24 |
| 23 Nov 2024 |
jade_ | In reply to @matthewcroughan:defenestrate.it Does this mean that Nix just doesn't work out of the box now because of flakes? Or have I misconfigured something? this is probably this bug https://github.com/NixOS/nixpkgs/issues/292465
effectively, if you build a nixos config with flakes it won't enable flakes in the resulting system.
this would literally be a one-line change to fix and I've just not got around to it. you just need to make nixpkgs.flake.source set flakes to be enabled
| 01:32:14 |
Mic92 | In reply to @matthewcroughan:defenestrate.it
got this for the first time today
nixinate-phone> building '/nix/store/z03lxdp085hqpaxglqm7iphvdsvcdgvs-cardinal-24.09.drv'...
nixinate-phone> bad JSON log message from builder: [json.exception.parse_error.101] parse error at line 1, column 4099: syntax error while parsing array - invalid literal; last read: '"\u001b[01m\u001b[K/nix/store/pxb3zpbg0qdccadh884fag33va0xb4ds-gcc-13.3.0/include/c++/13.3.0/variant:1522:33:\u001b[m\u001b[K required from '\u001b[01m\u001b[Kstd::enable_if_t<(is_constructible_v<_Tp, _Args ...> && __exactly_once<_Tp>), _Tp&> std::variant<_Types>::\u001b[01;32m\u001b[Kemplace\u001b[m\u001b[K(_Args&& ...) \u001b[35m\u001b[K[with _Tp = RTNeural::ModelT<float, 1, 1, RTNeural::GRULayerT<float, 1, 32, RTNeural::SampleRateCorrectionMode::None>, RTNeural::DenseT<float, 32, 1> >; _Args = {}; _Types = {NullModel, RTNeural::ModelT<float, 1, 1, RTNeural::GRULayerT<float, 1, 8, RTNeural::SampleRateCorrectionMode::None>, RTNeural::DenseT<float, 8, 1> >, RTNeural::ModelT<float, 2, 1, RTNeural::GRULayerT<float, 2, 8, RTNeural::SampleRateCorrectionMode::None>, RTNeural::DenseT<float, 8, 1> >, RTNeural::ModelT<float, 3, 1, RTNeural::GRULayerT<float, 3, 8, RTNeural::SampleRateCorrectionMode::None>, RTNeural::DenseT<float, 8, 1> >, RTNeural::ModelT<float, 1, 1, RTNeural::GRULayerT<float, 1, 12, RTNeural::SampleRateCorrectionMode::None>, RTNeural::DenseT<float, 12, 1> >, RTNeural::ModelT<float, 2, 1, RTNeural::GRULayerT<float, 2, 12, RTNeural::SampleRateCorrectionMode::None>, RTNeural::DenseT<float, 12, 1> >, RTNeural::ModelT<float, 3, 1, RTNeural::GRULayerT<float, 3, 12, RTNeural::SampleRateCorrectionMode::None>, RTNeural::DenseT<float, 12, 1> >, RTNeural::ModelT<float, 1, 1, RTNeural::GRULayerT<float, 1, 16, RTNeural::SampleRateCorrectionMode::None>, RTNeural::DenseT<float, 16, 1> >, RTNeural::ModelT<float, 2, 1, RTNeural::GRULayerT<float, 2, 16, RTNeural::SampleRateCorrectionMode::None>, RTNeural::DenseT<float, 16, 1> >, RTNeural::ModelT<float, 3, 1, RTNeural::GRULayerT<float, 3, 16, RTNeural::SampleRateCorrectionMode::None>, RTNeural::DenseT<float, 16, 1> >, RTNeural::ModelT<float, 1, 1, RTNeural::GRULayerT<float, 1, 20, RTNeural::SampleRateCorrectionMode::None>, RTNeural::DenseT<float, 20, 1> >, RTNeural::ModelT<float, 2, 1, RTNeural::GRULayerT<float, 2, 20, RTNeural::SampleRateCorrectionMode::None>, RTNeural::DenseT<float, 20, 1> >, RTNeural::ModelT<float, 3, 1, RTNeural::GRULayerT<float, 3, 20, RTNeural::SampleRateCorrectionMode::None>, RTNeural::DenseT<float, 20, 1> >, RTNeural::ModelT<float, 1, 1, RTNeural::GRULayerT<float, 1, 32, RTNeural::SampleRateCorrectionMode::None>, RTNeural::DenseT<float, 32, 1> >, RTNeural::ModelT<float, 2, 1, RTNeural::GRULayerT<float, 2, 32, RTNeural::SampleRateCorrectionMode::None>, RTNeural::DenseT<float, 32, 1> >, RTNeural::ModelT<float, 3, 1, RTNeural::GRULayerT<float, 3, 32, RTNeural::SampleRateCorrectionMode::None>, RTNeural::DenseT<float, 32, 1> >, RTNeural::ModelT<float, 1, 1, RTNeural::GRULayerT<float, 1, 40, RTNeural::SampleRateCorrectionMode::None>, RTNeural::DenseT<float, 40, 1> >, RTNeural::ModelT<float, 2, 1, RTNeural::GRULayerT<float, 2, 40, RTNeural::SampleRateCorrectionMode::None>, RTNeural::DenseT<float, 40, 1> >, RTNeural::ModelT<float, 3, 1, RTNeural::GRULayerT<float, 3, 40, RTNeural::SampleRateCorrectionMode::None>, RTNeural::DenseT<float, 40, 1> >, RTNeural::ModelT<float, 1, 1, RTNeural::GRULayerT<float, 1, 64, RTNeural::SampleRateCorrectionMode::None>, RTNeural::DenseT<float, 64, 1> >, RTNeural::ModelT<float, 2, 1, RTNeural::GRULayerT<float, 2, 64, RTNeural::SampleRateCorrectionMode::None>, RTNeural::DenseT<float, 64, 1> >, RTNeural::ModelT<float, 3, 1, 
RTNeural::GRULayerT<float, 3, 64, RTNeural::SampleRateCorrectionMode::None>, RTNeural::DenseT<float, 64, 1> >, RTNeural::ModelT<float, 1, 1, RTNeural::LSTMLayerT<float, 1, 8, RTNeural::SampleRateCorrectionMode::None>, RTNeural::DenseT<float, 8, 1> >, RTNeural::ModelT<float, 2, 1, RTNeural::LSTMLayerT<float, 2, 8, RTNeural::SampleRateCorrectionMode::None>, RTNeural::DenseT<float, 8, 1> >, RTNeural::ModelT<float, 3, 1, RTNeural::LSTMLayerT<float, 3, 8, RTNeural::SampleRateCorrectionMode::None>, RTNeural::DenseT<float, 8, 1> >, RTNeural::ModelT<float, 1, 1, RTNeural::LSTMLayerT<float, 1, 12, RTNeural::SampleRateCorrectionMode::None>, RTNeural::DenseT<float, 12, 1> >, RTNeural::ModelT@nix {"a'; expected ']'
* Robert Hensing (roberth): did you not fix some json parsing error the other day? | 07:20:49 |
Mic92 | Must have been something else. Here we go: https://github.com/NixOS/nix/pull/11939 | 08:08:42 |
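(Not claiming this is what that PR does, but the general shape of a tolerant handler for this failure is: try to parse the @nix payload as JSON and, when nlohmann throws a parse_error — as it did above on the truncated/interleaved builder output — fall back to treating the line as plain log text instead of erroring out. A self-contained sketch assuming nlohmann::json; parseNixLogLine is a made-up helper name.)

```cpp
#include <iostream>
#include <optional>
#include <string>
#include <nlohmann/json.hpp>

// Try to interpret a builder log line carrying an "@nix " JSON payload.
// On malformed JSON (e.g. truncated or interleaved output), return nullopt
// so the caller can emit the raw line instead of aborting.
std::optional<nlohmann::json> parseNixLogLine(const std::string & line)
{
    static const std::string marker = "@nix ";
    auto pos = line.find(marker);
    if (pos == std::string::npos) return std::nullopt;
    try {
        return nlohmann::json::parse(line.substr(pos + marker.size()));
    } catch (nlohmann::json::parse_error & e) {
        std::cerr << "bad JSON log message from builder: " << e.what() << "\n";
        return std::nullopt;
    }
}

int main()
{
    auto good = parseNixLogLine(R"(@nix {"action":"msg","level":1,"msg":"hello"})");
    auto bad  = parseNixLogLine(R"(@nix {"a)"); // truncated, like the trace above
    std::cout << (good ? "parsed" : "fallback") << " / "
              << (bad ? "parsed" : "fallback") << "\n"; // parsed / fallback
}
```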
p14 | I have nix build running in a loop and I'm unable to Ctrl-C it. :( It spends a lot of time in 'getting git revision count of /path/to/nixpkgs', and if the process gets killed, it just goes straight into the next nix build, and my shell isn't getting a chance to see the Ctrl-C for the shell while loop. | 11:01:06 |
| 24 Nov 2024 |
| @ixlun:matrix.org changed their display name from Matthew Leach to Matthew L. | 00:43:21 |
| WeetHet joined the room. | 11:45:30 |
@trofi:matrix.org | Is the meson build system in nix-2.25 intended to be used to build nix on systems that don't yet have nix running? If it is, how do you pass options to subprojects from the top level? This seems to fail (the option is in src/libstore/meson.options):
$ meson setup .. -Dsandbox-shell=/bin/sh
...
../meson.build:4:0: ERROR: Unknown options: "sandbox-shell"
| 11:56:26 |
@trofi:matrix.org | In reply to @trofi:matrix.org
nix-2.25.0 does not build against the latest release of libgit2: src/libfetchers/git-utils.cc:288:13: error: ‘git_mempack_write_thin_pack’ was not declared in this scope. It looks like it depends on an unreleased feature: https://github.com/libgit2/libgit2/commit/f9c35fb50998d1c9d26293a18ade3d7c32f6ecb0. Is it intentional? I would hope for a silently disabled feature rather than a hard build failure. I see people keep encountering libgit2 failures: https://github.com/NixOS/nix/issues/11925 | 12:04:20 |
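(On the "silently disabled feature" point: one common way to get that behaviour is to gate the call on libgit2's version macros at compile time instead of referencing the new symbol unconditionally. The sketch below is illustrative only — the version threshold is a placeholder, since git_mempack_write_thin_pack wasn't in any libgit2 release at the time, and this is not what git-utils.cc actually does.)

```cpp
#include <iostream>
#include <git2.h>
#include <git2/version.h>

// Hypothetical cut-off: enable the optional code path only when the libgit2
// headers are new enough. The "1.9" threshold is a placeholder, not a fact.
#if LIBGIT2_VER_MAJOR > 1 || (LIBGIT2_VER_MAJOR == 1 && LIBGIT2_VER_MINOR >= 9)
#  define HAVE_GIT_MEMPACK_WRITE_THIN_PACK 1
#else
#  define HAVE_GIT_MEMPACK_WRITE_THIN_PACK 0
#endif

int main()
{
    git_libgit2_init();
#if HAVE_GIT_MEMPACK_WRITE_THIN_PACK
    std::cout << "thin-pack writing available\n";
#else
    std::cout << "thin-pack writing disabled (libgit2 too old)\n";
#endif
    git_libgit2_shutdown();
}
```

In practice a configure-time probe (e.g. meson's cc.has_function('git_mempack_write_thin_pack', ...)) would be more robust than hard-coding a version, since it also copes with distro backports.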
| Matej Urbas joined the room. | 13:11:54 |
@trofi:matrix.org | * Is the meson build system in nix-2.25 intended to be used to build nix on systems that don't yet have nix running? If it is, how do you pass options to subprojects from the top level? This seems to fail (the option is in src/libstore/meson.options):
$ meson setup .. -Dsandbox-shell=/bin/sh
...
../meson.build:4:0: ERROR: Unknown options: "sandbox-shell"
UPDATE: meson setup .. -Dlibstore:sandbox-shell=/bin/sh does work (https://mesonbuild.com/Subprojects.html)
| 16:54:08 |
Mic92 | In reply to @p14:matrix.org I have nix build running in a loop and I'm unable to Ctrl-C it. :( It spends a lot of time in 'getting git revision count of /path/to/nixpkgs', and if the process gets killed, it just goes straight into the next nix build, and my shell isn't getting a chance to see the Ctrl-C for the shell while loop. Try nix build -f . instead. | 21:08:03 |
p14 | In reply to @joerg:thalheim.io Try nix build -f . instead. I ended up going this route, thanks! | 21:08:38 |
| 25 Nov 2024 |
roberth | In reply to @joerg:thalheim.io Robert Hensing (roberth): did you not fix some json parsing error the other day? ported, but it doesn't seem to be a complete fix | 08:19:32 |