| 6 Mar 2023 |
hexa | Someone S: why did you even bother getting such a crappy card 😛 | 21:16:14 |
Kevin Mittman (EOY sleep) | Okay raised the issue and will be looking into it soon-ish | 21:29:32 |
SomeoneSerge (back on matrix) | In reply to @hexa:lossy.network torch-bin w cuda 34s Wdym, it was the other way around | 21:54:53 |
hexa | i expected the LD_PRELOAD= to enable cuda support? | 21:55:35 |
SomeoneSerge (back on matrix) | Oh. Yea, it was the stub to prevent it from loading the actual libcuda xD | 21:56:03 |
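[Editor's note: the stub trick discussed above works because the dynamic loader binds whatever `libcuda` it finds first; an LD_PRELOAD'ed stub wins the lookup, so CUDA never initializes. A minimal, hypothetical probe of that binding step (not from the log; the function name is illustrative):

```python
import ctypes

def probe_libcuda():
    """Return 'gpu' if the CUDA driver library can be bound, else 'cpu'.

    With a stub libcuda.so in LD_PRELOAD, the stub is what gets bound,
    so real CUDA initialization never happens even on a GPU machine."""
    try:
        ctypes.CDLL("libcuda.so.1")
        return "gpu"
    except OSError:
        return "cpu"

print("probe:", probe_libcuda())
```
]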
hexa | ah lol | 21:56:14 |
hexa | did the two previous runs use cuda? | 21:56:27 |
SomeoneSerge (back on matrix) | That app they're using prints a line "using cpu" or "using gpu" | 21:56:58 |
hexa | oh, silly me | 21:57:07 |
hexa | got it | 21:57:11 |
SomeoneSerge (back on matrix) | So all of the runs, including OP's, use cpu | 21:57:13 |
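[Editor's note: the "using cpu"/"using gpu" line boils down to `torch.cuda.is_available()`, which is the real PyTorch API; the import fallback below is only there to keep this sketch self-contained on machines without torch installed:

```python
def report_device():
    """Reproduce the app's 'using cpu' / 'using gpu' status line."""
    try:
        import torch
        # torch.cuda.is_available() is False for a CPU-only build,
        # which is exactly what all the runs in this thread showed.
        return "using gpu" if torch.cuda.is_available() else "using cpu"
    except ImportError:
        return "using cpu"

print(report_device())
```
]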
SomeoneSerge (back on matrix) | An obvious first step would be to decipher upstream's CI (dockerfiles?) and compare the cmake flags, I guess. And step zero is to find someone motivated to do that | 21:59:45 |
hexa | yikes, step zero sounds hard | 22:01:07 |
hexa | but yeah, understanding how they arrive at their wheel is the way | 22:01:32 |
SomeoneSerge (back on matrix) | In reply to @ss:someonex.net An obvious first step would be to decipher upstream's CI (dockerfiles?) and compare the cmake flags, I guess. And step zero is to find someone motivated to do that Extracting actual CMakeCache.txt and literally running diff? | 22:01:54 |
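[Editor's note: the "literally running diff" idea amounts to parsing the `NAME:TYPE=VALUE` entries of both builds' `CMakeCache.txt` and comparing values. A sketch, with invented sample cache contents standing in for upstream's wheel build and the nixpkgs build:

```python
def parse_cmake_cache(text):
    """Parse CMakeCache.txt entries of the form NAME:TYPE=VALUE into a dict."""
    flags = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith(("#", "//")):
            continue  # skip comments and blanks
        if "=" in line:
            key, value = line.split("=", 1)
            if ":" in key:
                flags[key.split(":", 1)[0]] = value
    return flags

# Illustrative contents only; the real files would be extracted from the
# two builds being compared.
upstream = parse_cmake_cache("USE_CUDA:BOOL=ON\nUSE_MKLDNN:BOOL=ON")
nixpkgs = parse_cmake_cache("USE_CUDA:BOOL=ON\nUSE_MKLDNN:BOOL=OFF")
diff = {name: (upstream.get(name), nixpkgs.get(name))
        for name in sorted(set(upstream) | set(nixpkgs))
        if upstream.get(name) != nixpkgs.get(name)}
print(diff)  # → {'USE_MKLDNN': ('ON', 'OFF')}
```
]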
hexa | https://github.com/pytorch/pytorch/tree/master/.github/workflows | 22:02:12 |
hexa | which one 😄 | 22:02:14 |
hexa | or is it here? https://github.com/pytorch/pytorch/tree/master/.circleci | 22:02:28 |
SomeoneSerge (back on matrix) | )))) | 22:02:35 |
hexa | they have their own actions https://github.com/pytorch/pytorch/tree/master/.github/actions | 22:04:28 |
hexa | that they use in their workflows | 22:04:34 |
SomeoneSerge (back on matrix) | 2018 https://discuss.pytorch.org/t/how-is-pytorch-pip-version-compiled-flags-libs-cpu-only/18459 | 22:06:44 |
SomeoneSerge (back on matrix) | In reply to @hexa:lossy.network or is it here? https://github.com/pytorch/pytorch/tree/master/.circleci You anticipated so many options and they were all wrong | 22:07:58 |
hexa | yeah, I'm surprised. | 22:08:22 |
SomeoneSerge (back on matrix) | But are you impressed | 22:08:29 |
hexa | oh yes, I am | 22:08:33 |
hexa | thanks for teaching me, master | 22:08:41 |
hexa | we are also missing libgomp? | 22:13:45 |