
NixOS CUDA

288 Members
CUDA packages maintenance and support in nixpkgs | https://github.com/orgs/NixOS/projects/27/ | https://nixos.org/manual/nixpkgs/unstable/#cuda



6 Mar 2023
@hexa:lossy.network hexa

then they use scripts to install

  • openssl
  • libpng
  • jni
  • patchelf (yay)
  • mkl
  • conda (nay)
22:36:17
@hexa:lossy.network hexa: nice [22:36:50]
@hexa:lossy.network hexa: yeah, I'm tired. [22:37:23]
@hexa:lossy.network hexa: https://github.com/pytorch/builder/blob/3eb479e831d3d8bd80d6c71203a51a4d22f93c7f/libtorch/Dockerfile#L34 [22:39:22]
@hexa:lossy.network hexa: that is the Dockerfile [22:39:26]
@ss:someonex.net SomeoneSerge (back on matrix): Halfway there, we just need to know what to run inside the container [22:40:01]
@hexa:lossy.network hexa: maybe https://github.com/pytorch/builder/blob/3eb479e831d3d8bd80d6c71203a51a4d22f93c7f/wheel/build_wheel.sh [22:40:21]
@ss:someonex.net SomeoneSerge (back on matrix): I'm beginning to wonder if we should go back to the pytorch repo [22:40:30]
@ss:someonex.net SomeoneSerge (back on matrix)
In reply to @hexa:lossy.network
maybe https://github.com/pytorch/builder/blob/3eb479e831d3d8bd80d6c71203a51a4d22f93c7f/wheel/build_wheel.sh
This one is, if we trust pytorch discourse, for Darwin
22:40:54
@ss:someonex.net SomeoneSerge (back on matrix): And there's manywheel for Linux [22:41:02]
@ss:someonex.net SomeoneSerge (back on matrix): But all of these scripts are full of conditional flags [22:41:10]
@ss:someonex.net SomeoneSerge (back on matrix): So, somewhere out there something must call them with certain flags set on 🤔 [22:41:31]
@hexa:lossy.network hexa: https://github.com/pytorch/builder/blob/3eb479e831d3d8bd80d6c71203a51a4d22f93c7f/wheel/build_all.sh#L12 [22:42:10]
@hexa:lossy.network hexa: yep, darwin [22:42:13]
@ss:someonex.net SomeoneSerge (back on matrix)
.github/workflows/_binary-build-linux.yml
203:          docker exec -t "${container_name}" bash -c "source ${BINARY_ENV_FILE} && bash /builder/${{ inputs.PACKAGE_TYPE }}/build.sh"
22:44:35
@ss:someonex.net SomeoneSerge (back on matrix)
.github/workflows/_linux-build.yml
161:          docker exec -t "${container_name}" sh -c '.jenkins/pytorch/build.sh'
22:45:01
@ss:someonex.net SomeoneSerge (back on matrix): Yes, I think this is the right one: https://github.com/pytorch/pytorch/blob/39e8311a29b5713c8858cab73a8f713a7f3d531c/.github/workflows/_binary-build-linux.yml#L205 ...but they still take the flags from elsewhere and just propagate them [22:51:02]
@ss:someonex.net SomeoneSerge (back on matrix): aaaaaand 0 workflows run https://github.com/pytorch/pytorch/actions/workflows/_binary-build-linux.yml [22:51:55]
@hexa:lossy.network hexa: yeah, why would they run that 😄 [22:52:07]
@ss:someonex.net SomeoneSerge (back on matrix): https://github.com/pytorch/pytorch/actions/runs/4337823562 [22:54:01]
@ss:someonex.net SomeoneSerge (back on matrix): Here: https://github.com/pytorch/pytorch/actions/runs/4337823562/jobs/7574087583#step:14:305 [22:55:00]
@ss:someonex.net SomeoneSerge (back on matrix)

-DBUILD_LIBTORCH_CPU_WITH_DEBUG=0

Ok, how do we check we don't have any debug symbols in our libs?

22:55:59
@hexa:lossy.network hexa: objdump --syms [22:59:10]
@ss:someonex.net SomeoneSerge (back on matrix): -DUSE_NCCL=1 [22:59:17]
@ss:someonex.net SomeoneSerge (back on matrix): hmmm, I didn't even know it can be built without cuda [23:00:18]
@ss:someonex.net SomeoneSerge (back on matrix)
In reply to @hexa:lossy.network
objdump --syms
Seems fine
23:04:58
@hexa:lossy.network hexa: agreed [23:05:05]
@ss:someonex.net SomeoneSerge (back on matrix): -DUSE_FBGEMM? [23:07:02]
@hexa:lossy.network hexa

FBGEMM (Facebook GEneral Matrix Multiplication) is a low-precision, high-performance matrix-matrix multiplications and convolution library

23:09:25
@ss:someonex.net SomeoneSerge (back on matrix)
❯ nix log nixpkgs#python3Packages.torch
...

--   USE_EIGEN_FOR_BLAS    : ON
--   USE_FBGEMM            : ON
--     USE_FAKELOWP          : OFF
--   USE_KINETO            : ON
...
23:09:47
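[Editor's note] The USE_* switches can be filtered straight out of the replayed build log to diff against the flags upstream's wheels use. A sketch, assuming the `--   USE_FOO : ON` cmake summary format shown above:

```shell
# Replay the recorded torch build log and keep only the cmake
# USE_* feature lines; the grep pattern is tied to the summary
# format printed by cmake's feature report.
nix log nixpkgs#python3Packages.torch | grep -E -e '--[[:space:]]+USE_'
```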


