Bin size equal to 1/2 the maximum pair cutoff rather than the maximum pair cutoff

Dear Community,

According to the documentation, "by default, for neighbor style bin, LAMMPS uses bins that are 1/2 the size of the maximum pair cutoff".

In the original paper [1, p. 3], it is, however, suggested to bin atoms "into cells of size d >= r_s", where r_s = r_c + delta with r_c being the cutoff distance and delta the skin distance. Hence, only the atoms in the 26 neighboring cells have to be tested in 3D.

Choosing 1/2 the size of the maximum pair cutoff (instead of the cutoff) in LAMMPS would require testing (about) 5 x 5 x 5 - 1 = 124 cells in 3D since an atom could be "in contact" with an atom in a neighbor's neighbor cell. In other words, in 1D, an atom in cell 1 could interact with an atom in cell 3.
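To make the counting concrete, here is a minimal Python sketch of the naive cell count (purely illustrative, not LAMMPS code; the helper name `naive_stencil_cells` is made up):

```python
import math

def naive_stencil_cells(cutoff, bin_size):
    # cell layers to search on each side of an atom's own cell
    reach = math.ceil(cutoff / bin_size)
    # full cube of cells minus the atom's own cell
    return (2 * reach + 1) ** 3 - 1

r_s = 1.0                                  # cutoff + skin
print(naive_stencil_cells(r_s, r_s))       # 26  (bins = cutoff)
print(naive_stencil_cells(r_s, r_s / 2))   # 124 (bins = 1/2 cutoff)
```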

Hence, why is the bin size equal to 1/2 of the maximum pair cutoff and not the maximum pair cutoff, although the first choice seems to imply significantly more tests and a longer computation time than the second?

Thank you in advance,
Dominik

[1] S. Plimpton, Fast parallel algorithms for short-range molecular dynamics, Journal of Computational Physics, 117(1), 1-19, 1995.

I think there is a flaw in your logic. You may be testing more cells with 1/2 the bin size, but there are fewer atoms per bin (1/8th in 3D, assuming a homogeneous particle density), so at worst you would compute as many distance tests in both cases. You only have some additional overhead from looping over more cells.
But keep in mind that the cutoff-based distance is only a lower bound. Another condition is that the box has to be divided into an integer number of bins, and with a smaller minimum bin size you have to stray less from the optimal size. Also consider that with half the cutoff you can eliminate cells in the corners of the stencil, so ultimately the number of distance tests would be smaller with 1/2 the cutoff than with the full cutoff.
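
One way to put numbers on this is a toy calculation (a sketch only, not LAMMPS's actual stencil code; the pruning rule, keeping a cell if its closest point to the central cell lies within the cutoff, is an assumption of this sketch):

```python
import math
from itertools import product

def stencil(cutoff, bin_size):
    """Keep a cell if its closest point to the central cell lies within
    `cutoff`; return (number of cells kept, total volume distance-tested)."""
    reach = math.ceil(cutoff / bin_size)
    kept = 0
    for i, j, k in product(range(-reach, reach + 1), repeat=3):
        # closest approach between the central cell and cell (i, j, k)
        gap = [max(0, abs(n) - 1) * bin_size for n in (i, j, k)]
        if math.dist(gap, (0.0, 0.0, 0.0)) <= cutoff:
            kept += 1
    return kept, kept * bin_size ** 3

r_s = 1.0
for frac in (1.0, 0.5):
    cells, volume = stencil(r_s, frac * r_s)
    print(f"bin = {frac} * r_s: {cells} cells, tested volume = {volume:.3f} * r_s^3")
# bin = 1.0 * r_s: 27 cells, tested volume = 27.000 * r_s^3
# bin = 0.5 * r_s: 125 cells, tested volume = 15.625 * r_s^3
```

With homogeneous density, the number of distance tests is proportional to the tested volume, so the half-size stencil already does roughly 40% fewer tests. In this crude model the gain at exactly 1/2 the cutoff comes from the tighter covering: the 5x5x5 cube of half-size bins spans only +/-1.25 r_s instead of the +/-1.5 r_s of the 3x3x3 cube, trimming the corners of the searched region; explicit pruning of whole cells kicks in at still smaller bins.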

I made an empirical test with a toy MD code and found that when using bins smaller than (about) 1/2 the cutoff for building the cells, the overhead of iterating over the cells outweighs the benefit of having to look at fewer pairs of atoms. But using half the cutoff is a definite speed gain, due to the ability to eliminate far-away cells from the stencil when testing distances, even with a cutoff that is optimal for box binning.

axel.

Another way to think of the trade-off is that with large bins you include (and thus have to distance-test) more atoms that are outside the sphere of radius cutoff+skin. This happens when you build the neighbor list. With a stencil of smaller neighbor bins, which excludes bins wholly outside cutoff+skin, you have fewer atoms outside the sphere. In the limit, you could have tiny bins which nearly exactly cover the volume of the sphere. But, as Axel said, there is overhead to looping over that many bins with few atoms per bin. The bin size of 1/2 the cutoff is a generally good compromise between those two extremes.
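
To illustrate those two extremes, here is a sweep over bin sizes using the same toy pruning rule as the sketch above (again just an illustration, not what LAMMPS actually does):

```python
import math
from itertools import product

def stencil_stats(cutoff, bin_size):
    # same pruning rule as in the earlier sketch: keep a cell whose
    # closest point to the central cell lies within the cutoff sphere
    reach = math.ceil(cutoff / bin_size)
    kept = 0
    for i, j, k in product(range(-reach, reach + 1), repeat=3):
        gap = [max(0, abs(n) - 1) * bin_size for n in (i, j, k)]
        if math.dist(gap, (0.0, 0.0, 0.0)) <= cutoff:
            kept += 1
    return kept, kept * bin_size ** 3

r_s = 1.0
sphere = 4.0 / 3.0 * math.pi * r_s ** 3   # ideal lower bound on tested volume
for frac in (1.0, 0.5, 0.25, 0.125, 0.0625):
    cells, volume = stencil_stats(r_s, frac * r_s)
    print(f"bin = {frac:6.4f} * r_s: {cells:6d} cells, "
          f"tested volume = {volume:5.2f} * r_s^3 (sphere: {sphere:.2f})")
```

As the bins shrink, the tested volume approaches the (4/3) pi r_s^3 of the cutoff sphere, while the number of stencil cells grows roughly like 1/bin_size^3, which is where the looping overhead comes from.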

In the original paper, only bins = cutoff size were tried.

Steve