!eWOErHSaiddIbsUNsJ:nixos.org

NixOS CUDA

290 Members
CUDA packages maintenance and support in nixpkgs | https://github.com/orgs/NixOS/projects/27/ | https://nixos.org/manual/nixpkgs/unstable/#cuda
58 Servers



6 Mar 2023
[21:16:14] hexa: Someone S: why did you even bother getting such a crappy card 😛
[21:29:32] Kevin Mittman (EOY sleep): Okay, raised the issue and will be looking into it soon-ish
[21:54:53] SomeoneSerge (back on matrix), replying to hexa ("torch-bin w cuda 34s"): Wdym, it was the other way around
[21:55:35] hexa: i expected the LD_PRELOAD= to enable cuda support?
[21:56:03] SomeoneSerge (back on matrix): Oh. Yea, it was the stub to prevent it from loading the actual libcuda xD
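
A minimal sketch of the stub trick being discussed, assuming a CUDA driver stub library is available (the path below is invented for illustration): preloading a stub libcuda.so makes the dynamic linker resolve driver symbols to the stub instead of the real driver, so torch falls back to CPU.

```python
import os
import subprocess

# Hypothetical path to a CUDA driver *stub* library; the real location in a
# Nix store or CUDA toolkit install will differ (this path is an assumption).
STUB_LIBCUDA = "/path/to/stubs/libcuda.so"

# Preload the stub so libcuda symbols resolve to it rather than the real driver.
env = dict(os.environ, LD_PRELOAD=STUB_LIBCUDA)

# torch can no longer initialize the driver and should report CUDA as unavailable.
subprocess.run(
    ["python3", "-c", "import torch; print(torch.cuda.is_available())"],
    env=env,
    check=True,
)
```
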
[21:56:14] hexa: ah lol
[21:56:27] hexa: did the two previous runs use cuda?
[21:56:58] SomeoneSerge (back on matrix): That app they're using prints a line "using cpu" or "using gpu"
[21:57:07] hexa: oh, silly me
[21:57:11] hexa: got it
[21:57:13] SomeoneSerge (back on matrix): So all of the runs, including OP's, use cpu
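
A quick way to confirm that, independent of whatever "using cpu"/"using gpu" line the application prints, is to ask torch directly; a minimal sketch:

```python
import torch

# Ask torch directly whether this build can see CUDA at all.
print("torch version:", torch.__version__)
print("built with CUDA:", torch.version.cuda)        # None on a CPU-only build
print("cuda available:", torch.cuda.is_available())  # False if the driver (or a stub) blocks it
if torch.cuda.is_available():
    print("device:", torch.cuda.get_device_name(0))
```
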
[21:59:45] SomeoneSerge (back on matrix): An obvious first step would be to decipher upstream's CI (dockerfiles?) and compare the cmake flags, I guess. And step zero is to find someone motivated to do that
[22:01:07] hexa: yikes, step zero sounds hard
[22:01:32] hexa: but yeah, understanding how they arrive at their wheel is the way
[22:01:54] SomeoneSerge (back on matrix), replying to their own message above: Extracting actual CMakeCache.txt and literally running diff?
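
Diffing the two CMakeCache.txt files needs access to both build trees, but a lighter-weight proxy is to compare the compile-time configuration each build reports about itself. A sketch, assuming one dump is produced under the nixpkgs torch and one under the upstream wheel (the file names are illustrative):

```python
import difflib
import torch

# torch.__config__.show() reports the compile-time configuration (compiler,
# CUDA/cuDNN versions, relevant CMake-level options) baked into the build.
# Run this once under the nixpkgs torch and once under the upstream wheel,
# saving each dump to a file:
with open("torch-config-nixpkgs.txt", "w") as f:
    f.write(torch.__config__.show())

# With both dumps in hand, a plain unified diff highlights the differences:
a = open("torch-config-nixpkgs.txt").read().splitlines()
b = open("torch-config-upstream.txt").read().splitlines()
print("\n".join(difflib.unified_diff(a, b, "nixpkgs", "upstream")))
```
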
[22:02:12] hexa: https://github.com/pytorch/pytorch/tree/master/.github/workflows
[22:02:14] hexa: which one 😄
[22:02:28] hexa: or is it here? https://github.com/pytorch/pytorch/tree/master/.circleci
[22:02:35] SomeoneSerge (back on matrix): ))))
[22:04:28] hexa: they have their own actions https://github.com/pytorch/pytorch/tree/master/.github/actions
[22:04:34] hexa: that they use in their workflows
[22:06:44] SomeoneSerge (back on matrix): 2018 https://discuss.pytorch.org/t/how-is-pytorch-pip-version-compiled-flags-libs-cpu-only/18459
[22:07:11] SomeoneSerge (back on matrix): (redacted or malformed event)
[22:07:58] SomeoneSerge (back on matrix), replying to hexa ("or is it here? https://github.com/pytorch/pytorch/tree/master/.circleci"): You anticipated so many options and they were all wrong
[22:08:22] hexa: yeah, I'm surprised.
[22:08:29] SomeoneSerge (back on matrix): But are you impressed
[22:08:33] hexa: oh yes, I am
[22:08:41] hexa: thanks for teaching me, master
[22:13:45] hexa: we are also missing libgomp? (edited)
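
One way to check whether the packaged torch actually has an OpenMP runtime wired in (assuming "missing libgomp" refers to the GNU OpenMP runtime the CPU backend links against); a rough sketch for Linux:

```python
import torch

# Was OpenMP support compiled into this torch build at all?
print("openmp available:", torch.backends.openmp.is_available())

# On Linux, check which OpenMP runtime actually got mapped into the process
# (libgomp for GCC builds, libomp/libiomp for LLVM/Intel ones).
with open("/proc/self/maps") as maps:
    loaded = {line.rsplit("/", 1)[-1].strip()
              for line in maps
              if "gomp" in line or "libomp" in line or "iomp" in line}
print("loaded OpenMP runtimes:", loaded or "none")
```
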


