Granular simulation: Memory leak & Segmentation fault

Hi,

Thank you for your reply, Axel. Following up on my previous post, I checked
again with the latest stable version of LAMMPS and the segmentation fault
still happens. Attached is a small test case that generates the segfault
when more than 2 cores are used but runs fine on 2 cores. I have other
cases that trigger the error at different core counts. Again, the crash
happens before any commands in the input file are executed, and it only
occurs on Linux, not on Windows.
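
For concreteness, the failure mode described above corresponds to a standard
MPI launch. Assuming an MPI build of LAMMPS named lmp_mpi and a placeholder
input file name (the actual attachment is not reproduced here):

  mpirun -np 2 lmp_mpi -in in.granular   # runs fine
  mpirun -np 4 lmp_mpi -in in.granular   # segfaults before the first input command executes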

actually, i believe that this input will produce bogus results even if it
doesn't crash. i don't think anybody has ever tested running a hybrid pair
style with this kind of granular model. hybrid/overlay is definitely wrong,
but even for plain hybrid, i am not certain whether it should work the way
you are using it.
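
For readers following along: with hybrid/overlay, every sub-style mapped to
a type pair contributes its force, so two granular contact models assigned
to the same pair would double-count the contact force; with plain hybrid,
each type pair is computed by exactly one sub-style. A hypothetical sketch
of the plain-hybrid pattern under discussion (not the attached deck; the
sub-styles and coefficient values are placeholders):

  # two atom types, each type pair handled by exactly one granular sub-style
  pair_style hybrid gran/hertz/history 200000.0 NULL 50.0 NULL 0.5 1 &
                    gran/hooke/history 200000.0 NULL 50.0 NULL 0.5 1
  pair_coeff 1 1 gran/hertz/history
  pair_coeff 2 2 gran/hooke/history
  pair_coeff 1 2 gran/hooke/history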

please submit this example input deck with some explanation as an issue on
the GitHub project, and a LAMMPS developer will look into it when there is
time.

axel.

Hi Daren,
You could run your simulation case using LIGGGHTS, which is LAMMPS-based software specialized for granular applications.

Best,
Carlos

We’re close to releasing new versions of the granular pair styles which
allow for per-type values of the various coeffs. I think that would
eliminate the need for you to run hybrid/overlay.

Steve
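
For reference, the per-type support Steve mentions is presumably the work
that later shipped as the general "granular" pair style, where each type
pair can carry its own contact model and coefficients without any hybrid
wrapper. A minimal sketch (coefficient values are placeholders):

  atom_style sphere
  pair_style granular
  # per-type-pair Hertzian contact: normal stiffness, damping, then tangential model
  pair_coeff 1 1 hertz 1000.0 50.0 tangential mindlin NULL 1.0 0.4
  pair_coeff 2 2 hertz  800.0 40.0 tangential mindlin NULL 1.0 0.3
  pair_coeff 1 2 hertz  900.0 45.0 tangential mindlin NULL 1.0 0.35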