| 29 Mar 2022 |
@rnhmjoj:maxwell.ydns.eu | this is an article by the author of lzip that goes through the problems with xz: http://web.archive.org/web/20220128214314/https://www.nongnu.org/lzip/xz_inadequate.html | 12:43:01 |
atemu12 | toonn: Good to know, thanks! | 12:43:57 |
Linux Hackerman | I've learnt something new today! Thanks all :) | 12:44:29 |
atemu12 | toonn: In that case, an LZMA compressor like lzip or XZ would be best if you want best compression with slow but not unreasonably slow speeds. Otherwise, zstd. | 12:45:07 |
atemu12 | toonn: Though if it's only needed by people who re-build the stdenv, they'd appreciate the orders-of-magnitude speed advantage of zstd more than the few MiB saved by LZMA | 12:47:17 |
toonn | atemu12: The idea is that network bandwidth makes a bigger difference than the decompression speed. | 12:53:19 |
toonn | I think I'll just try both and do some minimal benchmarking. | 12:54:13 |
atemu12 | toonn: At higher LZMA levels, that's actually not necessarily true IIRC. Also, a user hacking on the stdenv likely unpacks the tarball more often than they download it, and likely needs to download so many source files that the difference between LZMA and zstd is a drop in the ocean | 12:56:22 |
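A minimal sketch of the kind of benchmarking toonn mentions, using only Python's standard library: `lzma` produces the xz format discussed above, and `zlib` stands in as a faster, weaker codec for comparison (zstd itself would need the third-party `zstandard` package, which is assumed unavailable here). The payload and the chosen presets are illustrative assumptions, not a claim about real stdenv tarballs:

```python
import lzma
import time
import zlib

# Illustrative payload: highly repetitive text, loosely mimicking source code.
data = (b"def build(stdenv, fetchurl):\n    return stdenv.mkDerivation {}\n") * 20000

def bench(name, compress, decompress):
    # Time compression, verify the round trip, then time decompression.
    t0 = time.perf_counter()
    blob = compress(data)
    t1 = time.perf_counter()
    restored = decompress(blob)
    t2 = time.perf_counter()
    assert restored == data  # sanity check: lossless round trip
    print(f"{name:>8}: {len(blob):>8} bytes, "
          f"compress {t1 - t0:.3f}s, decompress {t2 - t1:.3f}s")

bench("xz -1", lambda d: lzma.compress(d, preset=1), lzma.decompress)
bench("xz -9", lambda d: lzma.compress(d, preset=9), lzma.decompress)
bench("zlib -6", lambda d: zlib.compress(d, 6), zlib.decompress)
```

On typical data this shows the trade-off under discussion: higher LZMA presets shave bytes at a steep cost in compression time, while decompression stays comparatively cheap, which is why the download-count vs unpack-count question matters.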