| 6 Mar 2023 |
hexa | maybe https://github.com/pytorch/builder/blob/3eb479e831d3d8bd80d6c71203a51a4d22f93c7f/wheel/build_wheel.sh | 22:40:21 |
SomeoneSerge (matrix works sometimes) | I'm beginning to wonder if we should go back to the pytorch repo | 22:40:30 |
SomeoneSerge (matrix works sometimes) | In reply to @hexa:lossy.network maybe https://github.com/pytorch/builder/blob/3eb479e831d3d8bd80d6c71203a51a4d22f93c7f/wheel/build_wheel.sh This one is, if we trust pytorch discourse, for Darwin | 22:40:54 |
SomeoneSerge (matrix works sometimes) | And there's manywheel for Linux | 22:41:02 |
SomeoneSerge (matrix works sometimes) | But all of these scripts are full of conditional flags | 22:41:10 |
SomeoneSerge (matrix works sometimes) | So, somewhere out there, something must be calling them with certain flags set 🤔 | 22:41:31 |
hexa | https://github.com/pytorch/builder/blob/3eb479e831d3d8bd80d6c71203a51a4d22f93c7f/wheel/build_all.sh#L12 | 22:42:10 |
hexa | yep, darwin | 22:42:13 |
SomeoneSerge (matrix works sometimes) | .github/workflows/_binary-build-linux.yml
203: docker exec -t "${container_name}" bash -c "source ${BINARY_ENV_FILE} && bash /builder/${{ inputs.PACKAGE_TYPE }}/build.sh"
| 22:44:35 |
SomeoneSerge (matrix works sometimes) | .github/workflows/_linux-build.yml
161: docker exec -t "${container_name}" sh -c '.jenkins/pytorch/build.sh'
| 22:45:01 |
SomeoneSerge (matrix works sometimes) | Yes, I think this is the right one: https://github.com/pytorch/pytorch/blob/39e8311a29b5713c8858cab73a8f713a7f3d531c/.github/workflows/_binary-build-linux.yml#L205
...but they still take the flags from elsewhere and just propagate them | 22:51:02 |
SomeoneSerge (matrix works sometimes) | aaaaaand 0 workflows run https://github.com/pytorch/pytorch/actions/workflows/_binary-build-linux.yml | 22:51:55 |
hexa | yeah, why would they run that 😄 | 22:52:07 |
SomeoneSerge (matrix works sometimes) | https://github.com/pytorch/pytorch/actions/runs/4337823562 | 22:54:01 |
SomeoneSerge (matrix works sometimes) | Here https://github.com/pytorch/pytorch/actions/runs/4337823562/jobs/7574087583#step:14:305 | 22:55:00 |
SomeoneSerge (matrix works sometimes) | -DBUILD_LIBTORCH_CPU_WITH_DEBUG=0
Ok, how do we check we don't have any debug symbols in our libs?
| 22:55:59 |
hexa | objdump --syms | 22:59:10 |
SomeoneSerge (matrix works sometimes) | -DUSE_NCCL=1 | 22:59:17 |
SomeoneSerge (matrix works sometimes) | hmmm, I didn't even know it could be built without CUDA | 23:00:18 |
SomeoneSerge (matrix works sometimes) | In reply to @hexa:lossy.network objdump --syms Seems fine | 23:04:58 |
hexa | agreed | 23:05:05 |
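The `objdump --syms` check above can be sketched like this (the tiny demo library and file names are illustrative, not from the actual torch build):

```shell
# Build a tiny shared library twice -- once with -g, once without -- and look
# for DWARF sections. A release build (-DBUILD_LIBTORCH_CPU_WITH_DEBUG=0)
# should look like the second case: no .debug_* sections.
cat > demo.c <<'EOF'
int answer(void) { return 42; }
EOF
cc -shared -fPIC -g demo.c -o with_debug.so
cc -shared -fPIC demo.c -o without_debug.so

# A debug build carries .debug_info; a release build does not.
objdump -h with_debug.so | grep -q '\.debug_info' && echo "with_debug: has debug info"
objdump -h without_debug.so | grep -q '\.debug_info' || echo "without_debug: no debug info"
```

Grepping `objdump -h` for `.debug_*` sections is a coarser but quicker test than reading the full `--syms` symbol table; an empty result on the shipped `.so` files is what "seems fine" looks like.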
SomeoneSerge (matrix works sometimes) | -DUSE_FBGEMM? | 23:07:02 |
hexa |
FBGEMM (Facebook GEneral Matrix Multiplication) is a low-precision, high-performance matrix-matrix multiplication and convolution library
| 23:09:25 |
SomeoneSerge (matrix works sometimes) | ❯ nix log nixpkgs#python3Packages.torch
...
-- USE_EIGEN_FOR_BLAS : ON
-- USE_FBGEMM : ON
-- USE_FAKELOWP : OFF
-- USE_KINETO : ON
...
| 23:09:47 |
| 7 Mar 2023 |
SomeoneSerge (matrix works sometimes) | (how does one get an nvim diffsplit online?) | 00:01:28 |
SomeoneSerge (matrix works sometimes) | Meanwhile, two notable GLIBCXX_3.4.30 failures remain: python3Packages.jax (not jaxlib) and python3Packages.torchvision | 00:39:55 |
SomeoneSerge (matrix works sometimes) | Same error I saw with gpflow, something about scipy/optimize:
from ._highs._highs_wrapper import _highs_wrapper
E ImportError: /nix/store/205vsmbfhq1q2vhgskpqyymqvba4mscp-gcc-11.3.0-lib/lib/libstdc++.so.6: version `GLIBCXX_3.4.30' not found (required by /nix/store/yi7jc5p2mlwb3j37j7gwj15bk45j6xqs-python3.10-scipy-1.9.3/lib/python3.10/site-packages/scipy/optimize/_highs/_highs_wrapper.cpython-310-x86_64-linux-gnu.so)
| 00:40:34 |
SomeoneSerge (matrix works sometimes) | MKL -> down to 5s | 01:29:23 |
SomeoneSerge (matrix works sometimes) | Mystery solved, I guess | 01:29:38 |