[lammps-users] Any way to stop losing atoms?

Dear all,

I have been working with LAMMPS for over a month now, but I still frequently run into the problem that LAMMPS loses track of some of my atoms.

This results in either an “ERROR: Lost atoms” or “ERROR: Out of range atoms - cannot compute PPPM”.

I am starting from a random glass composition (atoms placed randomly in a box, with a minimum separation of 1 Angstrom) at 5000 K, which I then cool to 300 K.
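For context, the setup looks roughly like this in LAMMPS input syntax (a sketch, not my actual script; the region name, atom counts, and seeds are illustrative):

```
# sketch: random initial placement followed by a minimum-separation cleanup
# (region name, counts, and seeds are illustrative)
region         box block 0 40 0 40 0 40 units box
create_box     3 box
create_atoms   1 random 1000 482793 box      # Si
create_atoms   2 random 2000 128347 box      # O
create_atoms   3 random  600 992811 box      # Na
# remove any atom pairs closer than 1 Angstrom
delete_atoms   overlap 1.0 all all
```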

I have tried using “fix nve/limit 0.1” to restrict atom movement, but as soon as I leave that regime (after 20k timesteps or so) and switch to nvt or npt, I get the aforementioned errors. Neither “minimize”, “neigh_modify”, nor “fix mytime all dt/reset 10 1.0e-7 0.001 0.2 units box” fixed the problem.
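In case it helps, this is roughly the staged protocol I am attempting (a sketch; timestep values, damping constants, and run lengths are illustrative):

```
# sketch of the staged equilibration (all values illustrative)
minimize       1.0e-4 1.0e-6 1000 10000     # relax the worst overlaps first
timestep       0.1                          # small timestep while far from equilibrium
fix            lim all nve/limit 0.1        # cap per-step displacement
run            20000
unfix          lim
timestep       1.0
fix            heat all nvt temp 5000.0 5000.0 100.0
run            50000                        # equilibrate at 5000 K before cooling to 300 K
```

The crash happens right after the switch from the nve/limit stage to the nvt (or npt) stage.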

Bizarrely, I could run simulations using an old data file in which I had placed atoms in a pseudo-crystal structure (cubic Si-O, with some atoms then replaced by Na to reach my desired composition). That starting point seems to be closer to equilibrium, since those runs (with some of the aforementioned commands) complete without losing atoms.

I have attached one of the simulations I would like to run that crashes due to lost atoms.

Any help would be greatly appreciated!



30percentNaO2.data (334 KB)

30percentNaO2.input (12.1 KB)

OOpotentials.data (37.9 KB)

SiOpotentials.data (38.3 KB)

How many processors do you run on? Do all processor counts
give the lost-atoms error? Atoms are generally lost b/c
they move far away from either the simulation box or
a processor's sub-domain within the box before reneighboring
is done. This generally means your model is bad. So I would
monitor thermo output every step (big T is bad), and dump
every step near when this happens and viz your system to
ensure the dynamics are not bogus. You should also be
checking for reneighboring every step if you anticipate
atoms might move a long way.
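Concretely, something along these lines (a suggestion; the dump filename is illustrative):

```
# reneighbor every step and watch the run closely (suggested settings)
neigh_modify   delay 0 every 1 check yes
thermo         1                             # big jumps in T flag bogus dynamics
dump           dbg all atom 1 debug.lammpstrj
```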