USER-LB package

Dear all,

I attempted to run the example simulation of a coarse-grained polymer diffusing in a lattice Boltzmann fluid using 8 MPI processes. Please find attached the input files I used, which are identical to those in the examples/USER/lb/polymer/ directory of the LAMMPS distribution. When I ran a short simulation of 200 time steps, I received the following messages appended to the usual LAMMPS output:

In direct memory block for handle type DATATYPE, 3 handles are still allocated
In indirect memory block 0 for handle type DATATYPE, 17 handles are still allocated
In direct memory block for handle type DATATYPE, 3 handles are still allocated
In indirect memory block 0 for handle type DATATYPE, 17 handles are still allocated
In direct memory block for handle type DATATYPE, 3 handles are still allocated
In indirect memory block 0 for handle type DATATYPE, 17 handles are still allocated
In direct memory block for handle type DATATYPE, 3 handles are still allocated
In direct memory block for handle type DATATYPE, 3 handles are still allocated
In indirect memory block 0 for handle type DATATYPE, 17 handles are still allocated
In direct memory block for handle type DATATYPE, 3 handles are still allocated
In indirect memory block 0 for handle type DATATYPE, 17 handles are still allocated
In indirect memory block 0 for handle type DATATYPE, 17 handles are still allocated
In direct memory block for handle type DATATYPE, 3 handles are still allocated
In indirect memory block 0 for handle type DATATYPE, 17 handles are still allocated

There are two things I would like to clarify. Can these messages be ignored, or do they indicate an error in the source code that could affect the results obtained? In the latter case, what might have triggered them? The messages seem to suggest that some MPI datatypes were not freed before the program exited. I have tried adding MPI_Type_free calls to the destructor of fix_lb_fluid.cpp to free the derived MPI datatypes created in this code; however, these messages still appear.
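To illustrate what I mean, the pattern I tried is essentially the one below. This is only a minimal, self-contained sketch of the idea; the datatype and variable names are placeholders and not the actual ones used in fix_lb_fluid.cpp:

#include <mpi.h>

// Minimal sketch: create a derived MPI datatype, use it, and free it again
// before MPI_Finalize. My understanding is that an MPI library with handle
// tracking enabled (apparently the case here) reports derived types that are
// still allocated at exit, which is what the messages above appear to be.
int main(int argc, char **argv)
{
  MPI_Init(&argc, &argv);

  MPI_Datatype column;                             // placeholder name
  MPI_Type_vector(10, 1, 10, MPI_DOUBLE, &column); // e.g. a strided column of doubles
  MPI_Type_commit(&column);

  // ... the datatype would be used in communication calls here ...

  MPI_Type_free(&column);   // the kind of call I added to the destructor

  MPI_Finalize();
  return 0;
}

(Compiled with mpicxx and run under mpirun in the usual way.)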

Best,
Karthik


data.polymer (31.8 KB)

in.polymer_setgamma (5.59 KB)

Dear all,

I attempted to run the example simulation of a coarse-grained polymer diffusing in a lattice Boltzmann fluid using 8 MPI processes. Please find attached the input files I used, which are identical to those in the examples/USER/lb/polymer/ directory of the LAMMPS distribution. When I ran a short simulation of 200 time steps, I received the following messages appended to the usual LAMMPS output:

which LAMMPS version, what hardware/OS/compiler/MPI-library is this with?

In direct memory block for handle type DATATYPE, 3 handles are still allocated
In indirect memory block 0 for handle type DATATYPE, 17 handles are still allocated
[...]

There are two things I would like to clarify. Can these messages be ignored, or do they indicate an error in the source code that could affect the results obtained? In the latter case, what might have triggered them? The messages seem to suggest that some MPI datatypes were not freed before the program exited. I have tried adding MPI_Type_free calls to the destructor of fix_lb_fluid.cpp to free the derived MPI datatypes created in this code; however, these messages still appear.

your observation is correct. there are data types created in fix lb/fluid that are not freed. i don’t see the MPI messages, but i do see the memory leak with valgrind’s memcheck tool. this leak should be remedied. however, since it is a rather small memory leak and you are not likely to create and delete the fix many times, it should not have a negative effect on your calculation. nevertheless, the leak will be fixed in the next LAMMPS patch release, as we aim to keep LAMMPS “valgrind clean” as much as possible.

thanks for reporting. if you come across something similar again, please consider filing it as an issue on github at: https://github.com/lammps/lammps/issues

thanks in advance,
axel.

Dear Sir,

Thank you for the explanation. The details you requested are as follows:

LAMMPS version: 16Mar18
Hardware: Intel® Xeon® CPU X5650 @ 2.67GHz
OS : CentOS release 6.9
Compiler: mpicxx

I am not sure how to determine the MPI library being used. May I know what the remedy for this issue is?

Best,
Karthik

Dear Sir,

Thank you for the explanation. The details you requested are as follows:

LAMMPS version: 16Mar18
Hardware: Intel® Xeon® CPU X5650 @ 2.67GHz
OS : CentOS release 6.9
Compiler: mpicxx

I am not sure how to determine the MPI library being used.

mpicxx -show will usually reveal the folder where the MPI headers and libraries are located and thus give an indication of which MPI library is being used.

May I know what the remedy for this issue is?

as you surmised, freeing the custom data types does the trick. here is the change that will be included in the next LAMMPS patch release:

https://github.com/lammps/lammps/pull/1518/files#diff-ce66a9c3f60f5efb2708b70d91ce3f07
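to be clear, the link above is the authoritative diff. in essence, the change pairs the derived data types created by the fix with matching MPI_Type_free calls in its destructor. a schematic sketch of that pattern (the class and member names below are illustrative only, not the literal code from the pull request):

#include <mpi.h>

// schematic only: pair the derived-type creation done when an object is set
// up with an MPI_Type_free when it is destroyed, so no MPI handles are left
// allocated at MPI_Finalize.
struct FixExample {                                  // illustrative stand-in
  MPI_Datatype passtype;                             // placeholder member name
  FixExample() {
    MPI_Type_vector(8, 1, 8, MPI_DOUBLE, &passtype);
    MPI_Type_commit(&passtype);
  }
  ~FixExample() {
    MPI_Type_free(&passtype);                        // the kind of cleanup that was missing
  }
};

int main(int argc, char **argv)
{
  MPI_Init(&argc, &argv);
  {
    FixExample fix;   // datatype is freed when this goes out of scope
  }
  MPI_Finalize();
  return 0;
}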

axel.