| 9 Feb 2026 |
| nki ⚡️ changed their display name from nki ⚡️⚡️ to nki ⚡️. | 21:15:57 |
| 10 Feb 2026 |
| pneumatic changed their display name from ribosomerocker to pneumatic. | 10:28:16 |
| S3N joined the room. | 14:39:42 |
| ret2pop changed their display name from ret2pop ⚡️ to ret2pop. | 22:01:38 |
| 11 Feb 2026 |
| @topopolis:matrix.org left the room. | 00:12:46 |
| M̸̙̜̔̇Ǎ̴͎̙͔G̸̞̈N̸͔͍̝͗͋̾Ő̷͖̼͈̽̚L̷̻͚̓̔I̷̛͔̰̟̔Å̴̩̍ ̷̦̒̇͝M̷̱̠̺̉̎A̵̼̎͗͘Ỹ̸̬̲͂̕H̷̙̖͂Ē̷͉̦̌͒M̶͈̥̽̐ changed their display name from mag to magnolia_mayhem. | 03:48:42 |
| M̸̙̜̔̇Ǎ̴͎̙͔G̸̞̈N̸͔͍̝͗͋̾Ő̷͖̼͈̽̚L̷̻͚̓̔I̷̛͔̰̟̔Å̴̩̍ ̷̦̒̇͝M̷̱̠̺̉̎A̵̼̎͗͘Ỹ̸̬̲͂̕H̷̙̖͂Ē̷͉̦̌͒M̶͈̥̽̐ changed their display name from magnolia_mayhem to magnolia_mayhem -- w̵̳͐e̵̖͆l̶͖͘c̷̡̊ó̸̖m̴̳̿ȩ̵̀ ̴͕̈́t̶̰̎o̶̘͗ ̸͕̈́h̷̟̽e̷̬̕l̶̦͂l̶̛͓. | 03:50:04 |
| M̸̙̜̔̇Ǎ̴͎̙͔G̸̞̈N̸͔͍̝͗͋̾Ő̷͖̼͈̽̚L̷̻͚̓̔I̷̛͔̰̟̔Å̴̩̍ ̷̦̒̇͝M̷̱̠̺̉̎A̵̼̎͗͘Ỹ̸̬̲͂̕H̷̙̖͂Ē̷͉̦̌͒M̶͈̥̽̐ changed their display name from magnolia_mayhem -- w̵̳͐e̵̖͆l̶͖͘c̷̡̊ó̸̖m̴̳̿ȩ̵̀ ̴͕̈́t̶̰̎o̶̘͗ ̸͕̈́h̷̟̽e̷̬̕l̶̦͂l̶̛͓ to M̸̙̜̔̇Ǎ̴͎̙͔G̸̞̈N̸͔͍̝͗͋̾Ő̷͖̼͈̽̚L̷̻͚̓̔I̷̛͔̰̟̔Å̴̩̍ ̷̦̒̇͝M̷̱̠̺̉̎A̵̼̎͗͘Ỹ̸̬̲͂̕H̷̙̖͂Ē̷͉̦̌͒M̶͈̥̽̐. | 03:51:33 |
| fnctr changed their display name from yliceee to fnctr. | 11:34:55 |
| fnctr changed their profile picture. | 11:36:26 |
| fnctr changed their profile picture. | 11:39:11 |
| fnctr changed their profile picture. | 11:40:11 |
| fnctr changed their profile picture. | 11:41:57 |
| fnctr changed their profile picture. | 11:46:43 |
| 13 Feb 2026 |
| hoplopf joined the room. | 10:19:51 |
chreekat | as a sanity check, if a tool was introduced that created a drv for every source file of every haskell dependency of your haskell package, that would be unreasonable, right? Way too many drvs? | 15:04:19 |
maralorn | Not necessarily. Probably? Eval times might be horrendous, but I wouldn't bet on that. The question is usually what these drvs do: for many use cases, e.g. compiling, the overhead per drv makes it very slow. | 15:09:47 |
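To make the shape under discussion concrete, here is a hypothetical "one drv per source file" sketch (paths, module names, and file discovery are all illustrative; this is not chreekat's tool). Each module compile becomes its own derivation, so the per-drv overhead maralorn mentions is paid once per file:

```nix
# Hypothetical sketch: one derivation per Haskell source file.
# All paths and names are illustrative.
{ pkgs ? import <nixpkgs> { } }:
let
  compileModule = file:
    pkgs.runCommand "compile-${baseNameOf (toString file)}"
      { nativeBuildInputs = [ pkgs.ghc ]; } ''
        mkdir -p $out
        # A self-contained module compiles like this; real modules
        # would also need the .hi files of their imports wired in.
        ghc -c ${file} -odir $out -hidir $out
      '';
in
map compileModule [ ./src/A.hs ./src/B.hs ]
```

With hundreds of dependencies this easily multiplies into tens of thousands of drvs, which is where eval time and per-drv sandbox setup start to dominate.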
chreekat | interesting | 15:11:09 |
chreekat | context is that i'm experimenting with using casa as a source for package sources. Right now, the tool fetches tree manifests and blobs from casa and assembles them into a package source tree that can be built in a later step | 15:12:04 |
chreekat | using IFD it's unusable, but the goal is to try using dynamic derivations / rfc92 | 15:12:38 |
chreekat | (pre-generation would be another option) | 15:13:27 |
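For readers unfamiliar with why IFD is the bottleneck here, this is the minimal import-from-derivation shape (illustrative stub, not chreekat's actual tool): the manifest is built at evaluation time and then imported, which blocks eval on a build.

```nix
# Minimal IFD sketch (names hypothetical). Importing a derivation's
# output forces that derivation to be built during evaluation --
# the pattern that becomes unusable at scale.
{ pkgs ? import <nixpkgs> { } }:
let
  manifest = pkgs.runCommand "manifest.nix" { } ''
    # A real tool would fetch tree manifests and blobs from casa here.
    echo '{ packages = [ "foo" "bar" ]; }' > $out
  '';
in
(import manifest).packages
```

Dynamic derivations (RFC 92) aim to move this dependency into the build graph itself, so evaluation no longer has to wait on intermediate builds.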
maralorn | I have optimised our work CI by making one hlint/ormolu check derivation per file. Without caching it is a bit slower, but once the store is populated the speedup is significant. | 15:14:01 |
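A per-file lint check like the one maralorn describes could look roughly like this (a sketch, assuming hlint from nixpkgs; the file list and names are hypothetical):

```nix
# Illustrative per-file hlint check derivation. Each file gets its
# own drv, so unchanged files hit the binary cache -- which is where
# the speedup on a populated store comes from.
{ pkgs ? import <nixpkgs> { } }:
let
  hlintFile = file:
    pkgs.runCommand "hlint-${baseNameOf (toString file)}"
      { nativeBuildInputs = [ pkgs.hlint ]; } ''
        hlint ${file}
        touch $out
      '';
in
pkgs.linkFarmFromDrvs "hlint-checks"
  (map hlintFile [ ./src/Main.hs ./src/Lib.hs ])
```

The `linkFarmFromDrvs` aggregate gives CI a single attribute to build while preserving per-file caching underneath.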
toonn | chreekat: Isn't that what haskell.nix does? | 15:16:18 |
chreekat | it doesn't use casa as far as I know. | 15:21:33 |
toonn | No, re drv per Cabal component. Though that's not exactly what you said, I suppose. | 15:22:20 |
maralorn | Per Cabal component and per source file is a big difference | 15:23:14 |
chreekat | Yeah, and ironically per-component builds aren't my goal right now. My main goal is a tool that makes stack.yaml(.lock) the source of truth for a nix build, and that is also fast. I guess using casa is an orthogonal goal, but it feels right because casa is, or could be, a better solution than the all-cabal-hashes repo | 15:25:07 |
chreekat | it was sort of the next generation of "better interface to hackage" that fp complete was experimenting with. stack uses it, but besides that it sort of got trapped in amber as fpco moved away from haskell | 15:25:59 |