!kFJOpVCFYFzxqjpJxm:nixos.org

Nix HPC

73 Members
Nix for High Performance Computing clusters
18 Servers



13 Dec 2023
@ss:someonex.net SomeoneSerge (UTC+U[-12,12]) *

https://gist.github.com/SomeoneSerge/3f894ffb5f97e55a0a5cfc10dfbc66e1#file-slurm-nix-L45-L61

Does it make sense that this consistently blocks after the following message?

vm-test-run-slurm> submit # WARNING: Open MPI accepted a TCP connection from what appears to be a
vm-test-run-slurm> submit # another Open MPI process but cannot find a corresponding process
vm-test-run-slurm> submit # entry for that peer.
vm-test-run-slurm> submit # 
vm-test-run-slurm> submit # This attempted connection will be ignored; your MPI job may or may not
vm-test-run-slurm> submit # continue properly.
vm-test-run-slurm> submit # 
vm-test-run-slurm> submit #   Local host: node1
vm-test-run-slurm> submit #   PID:        831

Also the same test with `final: prev: { mpi = final.mpich; }` reports the wrong world size of 1 (thrice)

02:29:50
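For reference, the overlay in that failing variant swaps which MPI implementation the rest of the package set resolves. A minimal sketch, assuming a plain Nixpkgs import (`mpi` and `mpich` are the standard Nixpkgs attribute names):

# Point the generic `mpi` alias at MPICH instead of the default Open MPI;
# everything depending on `mpi` in this package set then links MPICH.
import <nixpkgs> {
  overlays = [
    (final: prev: { mpi = final.mpich; })
  ];
}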
@connorbaker:matrix.org joined the room. 14:47:19
@ss:someonex.net SomeoneSerge (UTC+U[-12,12])

(trying to build mpich with pmix support)

/nix/store/p58l5qmzifl20qmjs3xfpl01f0mqlza2-binutils-2.40/bin/ld: lib/.libs/libmpi.so: undefined reference to `PMIx_Resolve_nodes'
...
/nix/store/p58l5qmzifl20qmjs3xfpl01f0mqlza2-binutils-2.40/bin/ld: lib/.libs/libmpi.so: undefined reference to `PMIx_Abort'
/nix/store/p58l5qmzifl20qmjs3xfpl01f0mqlza2-binutils-2.40/bin/ld: lib/.libs/libmpi.so: undefined reference to `PMIx_Commit'
collect2: error: ld returned 1 exit status

why won't people just use cmake

18:54:34
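The undefined PMIx_* references suggest configure found the PMIx headers but never put libpmix on the link line. A hedged sketch of the kind of override presumably being attempted; the configure flags and the LIBS workaround are assumptions about MPICH's autoconf setup, not the actual Nixpkgs expression:

# Hypothetical override: build MPICH against PMIx (flag names assumed,
# untested) and force libpmix onto the link line.
mpich.overrideAttrs (old: {
  buildInputs = (old.buildInputs or [ ]) ++ [ pmix ];
  configureFlags = (old.configureFlags or [ ]) ++ [
    "--with-pmi=pmix"
    "--with-pmix=${pmix}"
  ];
  # Blunt workaround for the missing PMIx_* symbols at link time.
  LIBS = "-lpmix";
})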
@ss:someonex.net SomeoneSerge (UTC+U[-12,12]) *

I mean, the official guide literally suggests using LD_LIBRARY_PATH to communicate the location of host libraries 🤦:

user@testbox:~/mpich-4.0.2/build$ LD_LIBRARY_PATH=~/slurm/22.05/inst/lib/ \
> ../configure --prefix=/home/user/bin/mpich/ --with-pmilib=slurm \
> --with-pmi=pmi2 --with-slurm=/home/lipi/slurm/master/inst
18:56:07
14 Dec 2023
@ss:someonex.net SomeoneSerge (UTC+U[-12,12]) *

/nix/store/kzwmscs89rbmv9nycpvj02qz36lsgf9h-openpmix-4.2.7/include/pmix/src/include/pmix_config.h:#define PMIX_CONFIGURE_CLI ... \'--bindir=/nix/store/4c64iac91ja5r5llzs525kgl90996h09-openpmix-4.2.7-bin/bin ...

Beautiful, it's literally impossible to put all of this pmicc stuff into a separate output so that gcc wouldn't be part of the runtime closure

03:55:54
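A speculative sketch of the obvious countermeasure, not a known-good fix: scrub the baked-in -bin store path from the installed header so the main output stops retaining the wrapper binaries (and, through them, gcc). The output layout here is an assumption, and since the configure line embeds more than one such reference, a single substitution likely won't suffice:

# Hypothetical post-processing of openpmix; only helps if pmix_config.h is
# where the -bin path leaks into the runtime closure.
openpmix.overrideAttrs (old: {
  postInstall = (old.postInstall or "") + ''
    substituteInPlace $out/include/pmix/src/include/pmix_config.h \
      --replace "$bin" "/bindir-reference-removed"
  '';
})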
@ss:someonex.net SomeoneSerge (UTC+U[-12,12]) *

mpich+pmix outside the VM:

AddressSanitizer:DEADLYSIGNAL
=================================================================
==122066==ERROR: AddressSanitizer: SEGV on unknown address 0x601efafd3868 (pc 0x2b4e42acd6c7 bp 0x000000000001 sp 0x7ffc5f4344a0 T0)
==122066==The signal is caused by a READ memory access.
    #0 0x2b4e42acd6c7 in MPIR_pmi_init (/nix/store/f40za7vl7n0r4awd8b0jz1xla5srkmnw-mpich-4.1.2/lib/libmpi.so.12+0x4716c7)
    #1 0x2b4e42a497f8 in MPII_Init_thread (/nix/store/f40za7vl7n0r4awd8b0jz1xla5srkmnw-mpich-4.1.2/lib/libmpi.so.12+0x3ed7f8)
    #2 0x2b4e42a4a574 in MPIR_Init_impl (/nix/store/f40za7vl7n0r4awd8b0jz1xla5srkmnw-mpich-4.1.2/lib/libmpi.so.12+0x3ee574)
    #3 0x2b4e42784d5b in PMPI_Init (/nix/store/f40za7vl7n0r4awd8b0jz1xla5srkmnw-mpich-4.1.2/lib/libmpi.so.12+0x128d5b)
    #4 0x4027f0 in main (/nix/store/3raslv01lvsk5f5vx30wcivx28fwsh92-pps-samples-0.0.0/bin/roundtrip+0x4027f0)
    #5 0x2b4e454dffcd in __libc_start_call_main (/nix/store/gqghjch4p1s69sv4mcjksb2kb65rwqjy-glibc-2.38-23/lib/libc.so.6+0x27fcd)
    #6 0x2b4e454e0088 in __libc_start_main_alias_1 (/nix/store/gqghjch4p1s69sv4mcjksb2kb65rwqjy-glibc-2.38-23/lib/libc.so.6+0x28088)
    #7 0x403d14 in _start (/nix/store/3raslv01lvsk5f5vx30wcivx28fwsh92-pps-samples-0.0.0/bin/roundtrip+0x403d14)

AddressSanitizer can not provide additional info.
SUMMARY: AddressSanitizer: SEGV (/nix/store/f40za7vl7n0r4awd8b0jz1xla5srkmnw-mpich-4.1.2/lib/libmpi.so.12+0x4716c7) in MPIR_pmi_init
==122066==ABORTING

Works fine in the VM though

11:03:20
@ss:someonex.net SomeoneSerge (UTC+U[-12,12]) I'm desperate: whenever I build mpich or ompi with -fsanitize=address, FindMPI.cmake stops working 18:21:22
@ss:someonex.net SomeoneSerge (UTC+U[-12,12]) (also FindMPI begins by trying to locate mpiexec; this all looks so odd) 18:22:16
@ss:someonex.net SomeoneSerge (UTC+U[-12,12]) 🤔 Neither mpich nor openmpi in Nixpkgs ship these .pcs: https://gitlab.kitware.com/cmake/cmake/-/blob/2744f14db1e87f4b7f6eb6f30f7c84ea52ce4a7a/Modules/FindMPI.cmake#L1584-1592 19:49:50
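A hedged sketch of one way to ship them: alias mpich's single mpich.pc under the module names FindMPI's pkg-config probe appears to search for (names read off the FindMPI source linked above, not verified across CMake versions; the .pc files living in $out is an assumption):

# Hypothetical override adding the pkg-config names CMake's FindMPI falls
# back to; symlinks reuse mpich.pc, which merges all languages anyway.
mpich.overrideAttrs (old: {
  postInstall = (old.postInstall or "") + ''
    for m in mpi-c mpi-cxx mpi-fort; do
      ln -s mpich.pc "$out/lib/pkgconfig/$m.pc"
    done
  '';
})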
@ss:someonex.net SomeoneSerge (UTC+U[-12,12]) *

Should we do something like https://github.com/NixOS/nixpkgs/blob/5c1d2b0d06241279e6a70e276397a5e40e867499/pkgs/build-support/alternatives/blas/default.nix? Or should we ask upstream why they don't generate those?

For context, openmpi more or less maps onto cmake's expectations, whereas mpich just merges everything into a single file:

❯ ls /nix/store/nf8fqx39w8ib34hp24lrz48287xdbxd8-openmpi-4.1.6/lib/pkgconfig/
ompi-c.pc  ompi-cxx.pc  ompi-f77.pc  ompi-f90.pc  ompi-fort.pc  ompi.pc  orte.pc
❯ ls /nix/store/24sv27w3j1j3p7lxyh689bzbhmixxf35-mpich-4.1.2/lib/pkgconfig
mpich.pc
❯ cat /nix/store/24sv27w3j1j3p7lxyh689bzbhmixxf35-mpich-4.1.2/lib/pkgconfig/mpich.pc
...
Cflags:   -I${includedir}
...
# pkg-config does not understand Cxxflags, etc. So we allow users to
# query them using the --variable option

cxxflags=  -I${includedir}
fflags=-fallow-argument-mismatch -I${includedir}
fcflags=-fallow-argument-mismatch -I${includedir}
21:01:10
3 Jan 2024
@ss:someonex.net SomeoneSerge (UTC+U[-12,12]) Ah damn, running nixpkgs-built singularity images on NixOS with nixpkgs' apptainer is broken (again): https://github.com/apptainer/apptainer/blob/3c5a579e51f57b66a92266a0f45504d55bcb6553/internal/pkg/util/gpu/nvidia.go#L96C2-L103 21:53:59
@ss:someonex.net SomeoneSerge (UTC+U[-12,12]) * singularityce adopted nvidia-container-cli too -> --nv broken there as well 22:14:31
4 Jan 2024
@ss:someonex.net SomeoneSerge (UTC+U[-12,12]) ok so the libnvidia-container patch is still functional (it does ignore ldconfig and scan /run/opengl-driver/lib) 04:40:05
@ss:someonex.net SomeoneSerge (UTC+U[-12,12])

But apptainer doesn't seem to even run it:

❯ NVIDIA_VISIBLE_DEVICES=all strace ./result/bin/singularity exec --nv --nvccli writable.img python -c "" |& rg nvidia
futex(0x55fed1c4c888, FUTEX_WAIT_PRIVATE, 0, NULLINFO:    Setting --writable-tmpfs (required by nvidia-container-cli)
04:40:29
@ss:someonex.net SomeoneSerge (UTC+U[-12,12])

Although it claims it does:

❯ APPTAINER_MESSAGELEVEL=100000 NVIDIA_VISIBLE_DEVICES=all ./result/bin/singularity exec --nv --nvccli writable.img python -c ""
...
DEBUG   [U=1001,P=888790]  create()                      nvidia-container-cli
DEBUG   [U=1001,P=888822]  findOnPath()                  Found "nvidia-container-cli" at "/run/current-system/sw/bin/nvidia-container-cli"
DEBUG   [U=1001,P=888822]  findOnPath()                  Found "ldconfig" at "/run/current-system/sw/bin/ldconfig"
DEBUG   [U=1001,P=888822]  NVCLIConfigure()              nvidia-container-cli binary: "/run/current-system/sw/bin/nvidia-container-cli" args: ["--user" "configure" "--no-cgroups" "--device=all" "--compute" "--utility" "--ldconfig=@/run/current-system/sw/bin/ldconfig" "/nix/store/rzycmg66zpap6gjb5ylmvd8ymlfb7fag-apptainer-1.2.5/var/lib/apptainer/mnt/session/final"]
DEBUG   [U=1001,P=888822]  NVCLIConfigure()              Running nvidia-container-cli in user namespace
DEBUG   [U=1001,P=888790]  create()                      Chroot into /nix/store/rzycmg66zpap6gjb5ylmvd8ymlfb7fag-apptainer-1.2.5/var/lib/apptainer/mnt/session/final
...
04:41:45
@ss:someonex.net SomeoneSerge (UTC+U[-12,12]) (`result` refers to an apptainer build) 04:42:19
@ss:someonex.net SomeoneSerge (UTC+U[-12,12])

Btw

❯ strace /run/current-system/sw/bin/nvidia-container-cli "--user" "configure" "--no-cgroups" "--device=all" "--compute" "--utility" "--ldconfig=@/run/current-system/sw/bin/ldconfig" "/nix/store/rzycmg66zpap6gjb5ylmvd8ymlfb7fag-apptainer-1.2.5/var/lib/apptainer/mnt/session/final"
...
capset({version=_LINUX_CAPABILITY_VERSION_3, pid=0}, {effective=0, permitted=1<<CAP_CHOWN|1<<CAP_DAC_OVERRIDE|1<<CAP_DAC_READ_SEARCH|1<<CAP_FOWNER|1<<CAP_KILL|1<<CAP_SETGID|1<<CAP_SETUID|1<<CAP_SETPCAP|1<<CAP_NET_ADMIN|1<<CAP_SYS_CHROOT|1<<CAP_SYS_PTRACE|1<<CAP_SYS_ADMIN|1<<CAP_MKNOD, inheritable=1<<CAP_WAKE_ALARM}) = -1 EPERM (Operation not permitted)
...
nvidia-container-cli: permission error: capability change failed: operation not permitted

Was this even supposed to work without root?

04:50:21
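The capset in the strace asks for CAP_SYS_ADMIN and friends, which an unprivileged process cannot grant itself, so some privilege (root, setuid, or file capabilities) looks required. One speculative NixOS route, untested, with the capability list copied from the strace above rather than from upstream docs:

# Hypothetical wrapper granting nvidia-container-cli file capabilities;
# whether this set is sufficient (or wise) is unverified.
{ pkgs, ... }: {
  security.wrappers.nvidia-container-cli = {
    owner = "root";
    group = "root";
    source = "${pkgs.libnvidia-container}/bin/nvidia-container-cli";
    capabilities = "cap_chown,cap_dac_override,cap_dac_read_search,cap_fowner,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_admin,cap_sys_chroot,cap_sys_ptrace,cap_sys_admin,cap_mknod+ep";
  };
}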
@ss:someonex.net SomeoneSerge (UTC+U[-12,12]) Also, is it ok that $out/var/lib/apptainer/mnt/session/final is read-only? 04:50:54


