| 16 Oct 2022 |
| Rizky Maulana Nugraha joined the room. | 23:23:13 |
| 22 Oct 2022 |
tpw_rules | anybody thought about how to have several versions of cutensor? | 01:33:16 |
tpw_rules | for now there are only two old ones, and they are auto-selected based on which cuda toolkit is in use. anyone remember why? | 01:33:28 |
tpw_rules | FRidh: ^ | 01:36:48 |
FRidh | I don't know. I suppose more can be added, it's just work to be done. | 08:01:40 |
| 23 Oct 2022 |
| waa joined the room. | 21:31:55 |
| 24 Oct 2022 |
| waa left the room. | 08:10:13 |
| 31 Oct 2022 |
| underpantsgnome changed their display name from underpantsgnome! to underpantsgnome. | 20:29:26 |
| 6 Nov 2022 |
tpw_rules | where is a link to the hercules instance again? | 01:47:12 |
tpw_rules | i found it but it seems like something is wedged: https://hercules-ci.com/github/SomeoneSerge/nixpkgs-unfree | 01:54:44 |
tpw_rules | any idea why recent builds haven't completed? was there a staging merge a while ago? | 01:54:56 |
tpw_rules | Someone S: ? | 02:05:42 |
SomeoneSerge (back on matrix) | Oh wow. I'll have a look | 17:22:38 |
SomeoneSerge (back on matrix) | The agent's systemd service had somehow been failing to resolve some domain names since October 22. I restarted it; it seems to be online now | 17:27:19 |
tpw_rules | it looks like things are better now? i guess now i just have to be patient for the builds | 19:14:31 |
tpw_rules | https://github.com/NixOS/nixpkgs/pull/199910 | 21:59:28 |
| 8 Nov 2022 |
| pbsds changed their profile picture. | 00:46:17 |
| 9 Nov 2022 |
| breakds joined the room. | 08:01:07 |
| eahlberg joined the room. | 16:24:14 |
breakds | Hi! First I would like to say "thank you" to the Nix CUDA maintainers and the community! | 16:47:27 |
breakds | I am trying to build pytorch with CUDA 11.8, and it complains about cudnn:
at /nix/store/171cjmpyl45dz6dy818i3kf7x3nijkpg-source/pkgs/development/libraries/science/math/cudnn/generic.nix:28:1:
27|
28| assert useCudatoolkitRunfile || (libcublas != null);
| ^
29|
Since I am not very familiar with how the CUDA libraries are packaged, I am wondering: given that cudaPackages_11_8 does not seem to have libcublas, should I turn on useCudatoolkitRunfile (it is false by default) for CUDA 11.8? Judging from the name, it seems to suggest that the libcublas files can be found in the cudatoolkit runfile itself; am I wrong?
Thanks a lot!
| 16:47:30 |
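[Editor's note] For readers following the question above, a hedged sketch (not from the chat) of how one might point a torch build at CUDA 11.8. The attributes `cudaPackages_11_8`, `config.cudaSupport`, and `config.allowUnfree` are real nixpkgs knobs; the overlay shape for swapping the default CUDA package set is an assumption and may differ across nixpkgs revisions.

```nix
# Sketch only: build python3Packages.torch against cudaPackages_11_8 by
# overriding the default cudaPackages attribute via an overlay.
{ pkgs ? import <nixpkgs> {
    config = {
      allowUnfree = true;   # CUDA redistributables are unfree
      cudaSupport = true;   # enable CUDA across the package set
    };
    overlays = [
      # Assumed override shape: make cudaPackages point at the 11.8 set.
      (final: prev: { cudaPackages = final.cudaPackages_11_8; })
    ];
  }
}:
pkgs.python3Packages.torch
```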
SomeoneSerge (back on matrix) | Ouch. This is actually quite an embarrassing oversight | 19:57:20 |
breakds | Thanks for taking a look at this. Is it that CUDA 11.8 should be handled differently? | 20:00:20 |
breakds | https://github.com/NixOS/nixpkgs/blob/master/pkgs/development/libraries/science/math/cudnn/extension.nix#L11 I am looking at this line | 20:00:33 |
breakds | Not sure why there is a difference between versions above and below 11.4 | 20:01:20 |
SomeoneSerge (back on matrix) | No, it's much simpler than that:) https://github.com/NixOS/nixpkgs/pull/200426 | 20:02:29 |
breakds | Thanks for the fix! I need to learn more about how cuda packaging works. | 20:03:59 |
SomeoneSerge (back on matrix) | Regarding this particular issue: you probably already found this, but the redist packages' attributes (like libcublas) are populated from a json file, stored right in the repo. With every release we download the json "manifest" file from nvidia and keep it as is | 20:13:37 |
breakds | In reply to @ss:someonex.net: Thanks for the explanation! It makes sense to me now. | 20:25:42 |
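[Editor's note] As background to the manifest explanation above: the redist attributes populated from NVIDIA's vendored JSON manifest surface as ordinary derivations under `cudaPackages`. A hedged usage sketch; `libcublas` and `cudnn` are real nixpkgs attributes, but the shell-environment framing here is illustrative, not from the chat.

```nix
# Sketch only: consuming manifest-generated redist packages like any other
# derivation in a development shell.
{ pkgs ? import <nixpkgs> { config.allowUnfree = true; } }:
let
  cudaPackages = pkgs.cudaPackages_11_8;
in
pkgs.mkShell {
  buildInputs = [
    cudaPackages.libcublas  # attribute generated from NVIDIA's JSON manifest
    cudaPackages.cudnn
  ];
}
```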
breakds | A separate question, as I read from https://discourse.nixos.org/t/announcing-the-nixos-cuda-maintainers-team-and-a-call-for-maintainers/18074 : are x86_64-linux compute cycles still needed for GitHub Actions? I have a spare RTX 3080 not attached to any machine at the moment, and I am not sure what the best way is to make it useful to the project. Shall I build a machine to run GitHub Actions? | 20:35:33 |