!CcTBuBritXGywOEGWJ:matrix.org

NixOS Binary Cache Self-Hosting

157 Members
About how to host a very large-scale binary cache and more
54 Servers



21 Aug 2023
[11:25:27] @elvishjerricco:matrix.org: optane is kind of insane
[11:26:02] @elvishjerricco:matrix.org: there was a huge influx of supply like 6 months ago or something when intel announced they were killing the division, so I managed to pick up four 110GB nvme sticks at a quarter their normal price
[11:26:15] @linus:schreibt.jetzt: nice
22 Aug 2023
[05:30:57] @elvishjerricco:matrix.org: whoa; just did a full system update with the "new" cache storage, and while optane made the querying instant, the download bandwidth was way reduced. zpool iostat 1 showed a pretty steady read bandwidth of ~80MB/s but I wasn't getting anywhere near that over the wire; whereas on the SSD I saturated the gigabit easily
[05:32:30] @elvishjerricco:matrix.org: (also this pool can easily do sequential reads upwards of 500MB/s)
[06:38:37] @linus:schreibt.jetzt: huh
[06:38:41] @linus:schreibt.jetzt: did you rewrite the nars as well?
[06:38:58] @linus:schreibt.jetzt: I could imagine that happening if you're downloading a lot of small nars, but wouldn't expect it for big ones
[06:39:20] @linus:schreibt.jetzt: * I could imagine that happening if you're downloading a lot of small nars and those nars are on HDD, but wouldn't expect it for big ones
[13:46:12] Zhaofeng Li (@zhaofeng:zhaofeng.li):
> In reply to @elvishjerricco:matrix.org: I didn't realize accessing narinfos was such a burden
(finally trying to catch up with stuff)
narinfos aren't actually individual small files, the server is doing a database query
23 Aug 2023
[00:01:06] Sofi (@sofo:matrix.org) joined the room.
[04:07:56] @elvishjerricco:matrix.org:
> In reply to @linus:schreibt.jetzt: did you rewrite the nars as well?
yep. Did a send/recv of the whole dataset. Maybe I needed to make the small blocks property a little bigger
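The "small blocks property" here is presumably ZFS's special_small_blocks, which decides which blocks land on the Optane special vdev; it only affects newly written data, which is why the send/recv rewrite was needed. A sketch of the workflow under that assumption (pool and dataset names are hypothetical):

```shell
# Check current values; special_small_blocks must stay below recordsize,
# otherwise *every* block is routed to the special vdev:
zfs get recordsize,special_small_blocks tank/cache

# Route blocks up to 64K to the special (Optane) vdev for new writes:
zfs set special_small_blocks=64K tank/cache

# Existing blocks keep their old placement; rewriting the dataset via
# send/recv migrates them to the new policy:
zfs snapshot tank/cache@rewrite
zfs send tank/cache@rewrite | zfs recv tank/cache-new
```

Note these commands need root and a pool that actually has a special vdev attached; they are shown for orientation, not as a tested recipe.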
[04:08:29] @elvishjerricco:matrix.org:
> In reply to @zhaofeng:zhaofeng.li: (finally trying to catch up with stuff) narinfos aren't actually individual small files, the server is doing a database query
I'm still using a silly ole simple nar file cache :)
[09:12:47] @linus:schreibt.jetzt: Anyone else using attic and getting "errors" like this?

copying path '/nix/store/s4jqyj35hii03rs7j5n6vn7gpgp6ja81-source' from 'http://attic.geruest.sphalerite.tech:8080/magic'...
warning: error: unable to download 'http://attic.geruest.sphalerite.tech:8080/magic/nar/s4jqyj35hii03rs7j5n6vn7gpgp6ja81.nar': HTTP error 200 (curl error: Transferred a partial file); retrying in 267 ms
warning: error: unable to download 'http://attic.geruest.sphalerite.tech:8080/magic/nar/s4jqyj35hii03rs7j5n6vn7gpgp6ja81.nar': HTTP error 200 (curl error: Transferred a partial file); retrying in 640 ms
warning: error: unable to download 'http://attic.geruest.sphalerite.tech:8080/magic/nar/s4jqyj35hii03rs7j5n6vn7gpgp6ja81.nar': HTTP error 200 (curl error: Transferred a partial file); retrying in 1122 ms
warning: error: unable to download 'http://attic.geruest.sphalerite.tech:8080/magic/nar/s4jqyj35hii03rs7j5n6vn7gpgp6ja81.nar': HTTP error 200 (curl error: Transferred a partial file); retrying in 2698 ms
[09:15:54] @andreas.schraegle:helsinki-systems.de: I've seen that "error" before, but not with attic. Sadly don't remember why/when exactly, though.
[09:17:00] @linus:schreibt.jetzt: hm

> CURLE_PARTIAL_FILE (18)
> A file transfer was shorter or larger than expected. This happens when the server first reports an expected transfer size, and then delivers data that does not match the previously given size.
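This error is easy to reproduce: a server that advertises `Transfer-Encoding: chunked` but closes the socket before sending the terminating zero-length chunk triggers exactly curl exit code 18. A self-contained sketch (the port and response body are arbitrary):

```shell
# One-shot server (via python3) that sends a single chunk of a chunked
# response and then closes without the terminating "0\r\n\r\n":
python3 - <<'EOF' &
import socket
s = socket.socket()
s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
s.bind(("127.0.0.1", 8099))
s.listen(1)
conn, _ = s.accept()
conn.recv(4096)  # read and discard the request
conn.sendall(b"HTTP/1.1 200 OK\r\nTransfer-Encoding: chunked\r\n\r\n"
             b"5\r\nhello\r\n")  # one 5-byte chunk, no final 0-chunk
conn.close()
EOF
sleep 1
rc=0
curl -s http://127.0.0.1:8099/ || rc=$?
echo
echo "curl exit: $rc"   # 18 = CURLE_PARTIAL_FILE
```

The status line is 200 OK, which is why Nix reports the confusing "HTTP error 200": the HTTP exchange succeeded, only the body was truncated.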
[09:17:56] @linus:schreibt.jetzt: ah, curling the URL does the same thing
[09:18:06] @linus:schreibt.jetzt:
$ curl -v http://attic.geruest.sphalerite.tech:8080/magic/nar/ja7cry6cb9wwclhlphmffgg4fv0ky4cd.nar >/dev/null
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
*   Trying [2a01:4f9:1a:f600:5650::36]:8080...
* Connected to attic.geruest.sphalerite.tech (2a01:4f9:1a:f600:5650::36) port 8080 (#0)
> GET /magic/nar/ja7cry6cb9wwclhlphmffgg4fv0ky4cd.nar HTTP/1.1
> Host: attic.geruest.sphalerite.tech:8080
> User-Agent: curl/8.1.1
> Accept: */*
>
< HTTP/1.1 200 OK
< x-attic-cache-visibility: public
< transfer-encoding: chunked
< date: Wed, 23 Aug 2023 09:17:52 GMT
<
{ [0 bytes data]
* transfer closed with outstanding read data remaining
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
* Closing connection 0
curl: (18) transfer closed with outstanding read data remaining
[09:18:10] @linus:schreibt.jetzt: I guess I'll open an attic issue
[09:22:42] @linus:schreibt.jetzt: oh nvm
[09:23:30] @linus:schreibt.jetzt: turns out the backing s3 bucket was configured wrong and the chunks were missing
[09:23:44] @linus:schreibt.jetzt: though attic should probably recognise that error and report it, at least in its own log ^^
[11:05:56] Julien (@julienmalka:matrix.org): I get a lot of « InternalServerError: The server encountered an internal error or misconfiguration. » in the middle of my attic push. Anyone here had the same issue?
[11:06:29] Julien (@julienmalka:matrix.org): (Usually if I relaunch the same command it will just work fine)
[11:07:30] @linus:schreibt.jetzt: check the atticd logs
[11:07:47] @linus:schreibt.jetzt: I've been having that when it fails to acquire a db connection from the pool, probably because all the connections in the pool are busy
[11:07:52] @linus:schreibt.jetzt: I think it has a 30s timeout
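For the "check the atticd logs" step, a sketch assuming atticd runs as a systemd service named `atticd` (the unit name, time window, and grep pattern are assumptions; adjust to your deployment):

```shell
# Surface connection-pool and timeout errors from a recent attic push:
journalctl -u atticd --since "-1 hour" --no-pager \
  | grep -iE "pool|timed out|timeout|error"
```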
[11:16:12] Julien (@julienmalka:matrix.org): I yes
[11:16:21] Julien (@julienmalka:matrix.org): * Ah yes
[11:16:29] Julien (@julienmalka:matrix.org): It looks like that's the problem



Room Version: 10