Mpirun launches multiple runs (Version 8Feb2023)

Dear all,

I compiled the latest version of LAMMPS (8Feb2023) today and wanted to run it in the same way as the version I used before, 23Jun2022, i.e.:

mpirun -np nproc lmp -in input

This worked fine with the former version, but with the new version the same command apparently starts the input in serial nproc times. Running just:

mpirun -np 8 lmp

without an input file specified gives, for the 23Jun2022 version:

LAMMPS (23 Jun 2022 - Update 3)
WARNING: Using I/O redirection is unreliable with parallel runs. Better use -in switch to read input file. (src/lammps.cpp:530)
using 1 OpenMP thread(s) per MPI task

and for the 8Feb2023 version:

LAMMPS (8 Feb 2023)
using 1 OpenMP thread(s) per MPI task
LAMMPS (8 Feb 2023)
using 1 OpenMP thread(s) per MPI task
Total wall time: 0:00:00
LAMMPS (8 Feb 2023)
using 1 OpenMP thread(s) per MPI task
Total wall time: 0:00:00
LAMMPS (8 Feb 2023)
using 1 OpenMP thread(s) per MPI task
Total wall time: 0:00:00
LAMMPS (8 Feb 2023)
using 1 OpenMP thread(s) per MPI task
Total wall time: 0:00:00
LAMMPS (8 Feb 2023)
using 1 OpenMP thread(s) per MPI task
Total wall time: 0:00:00
LAMMPS (8 Feb 2023)
using 1 OpenMP thread(s) per MPI task
Total wall time: 0:00:00
LAMMPS (8 Feb 2023)
using 1 OpenMP thread(s) per MPI task
Total wall time: 0:00:00

To double-check, I recompiled both versions just now in exactly the same way, using CMake with the “most” preset, but the strange behaviour remains. Is this a problem with the new version, or could my PC be the issue?
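
For reference, the configure and build steps looked roughly like this (build directory and job count are just placeholders):

mkdir build && cd build
# load the "most" preset and configure
cmake -C ../cmake/presets/most.cmake ../cmake
# compile
cmake --build . -j 8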

Cheers
Jakob

This effect is usually due to one of two reasons:

  • when configuring LAMMPS, CMake could not find an MPI library automatically and fell back to the bundled MPI STUBS library. This gives you a serial LAMMPS executable, and starting several copies of it runs that many independent serial calculations, since they do not try to communicate with each other
  • when running LAMMPS, you are using an mpirun/mpiexec command that comes from a different MPI library than the one LAMMPS was compiled with (e.g. MPICH vs. OpenMPI). In that case mpirun cannot communicate the rank assignment, and each process runs as if it had been launched on its own. A quick way to check both is shown below.
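
A quick check for both cases (a sketch; the exact output will differ on your system) is to compare the MPI library reported by the executable with the mpirun you are actually calling:

# MPI library compiled into the LAMMPS binary
lmp -h | grep -A 2 "MPI v"
# mpirun found first in PATH and the MPI library it belongs to
which mpirun
mpirun --version

If the first command reports MPI STUBS, you have the first case; if it reports e.g. MPICH while mpirun --version reports OpenMPI (or vice versa), you have the second.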

Please have a look at the output of lmp -h.
At some point it prints the MPI library and version that were used to compile LAMMPS, which tells you which of the two cases above applies.

It can be something like this:

MPI v4.0: MPICH Version:	4.0.2
MPICH Release date:	Thu Apr  7 12:34:45 CDT 2022
MPICH ABI:	14:2:2

Or this:

MPI v1.0: LAMMPS MPI STUBS for LAMMPS version 8 Feb 2023

Thanks a lot for the feedback; CMake picking up the wrong MPI library was indeed the issue!
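
In case it helps anyone else, a minimal sketch of the kind of reconfigure that resolves this, assuming the MPI compiler wrapper is mpicxx and a build directory named build:

rm -rf build && mkdir build && cd build
# point CMake at the MPI compiler wrapper so FindMPI detects the right library
cmake -C ../cmake/presets/most.cmake -D BUILD_MPI=yes -D CMAKE_CXX_COMPILER=mpicxx ../cmake
cmake --build . -j 8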

I compiled my version of lmp using CMake after loading MPICH, but after compilation lmp -h still shows the MPI STUBS library.
Is there more to making MPICH available than running ‘module load mpich’?
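
For completeness, here is what I can check on my side (module name taken from my environment; paths are placeholders):

module load mpich
# the MPICH compiler wrappers and launcher should now be on PATH
which mpicc mpicxx mpirun
mpicxx -show
# a stale cache keeps the old STUBS result, so clear it before reconfiguring
rm -f build/CMakeCache.txt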