Differences in results between simulations run on Windows and MacOS

Dear all,

I am running the latest version of LAMMPS on Windows 7 in parallel, and also on my Mac (running OS X Mavericks) in serial.

I am running granular simulations, which consist of pouring molecules composed of overlapping spheres into a box (interacting via the Hertzian potential) and then calculating the resulting packing fraction.

When I run identical simulations on both MacOS and Windows, the results differ slightly (by about 0.3%) between the two operating systems. This is still within the error of my method for calculating the packing fraction of the box of particles, but it seems strange that the results should differ at all. The two simulations are identical: same random seed, and the same commands in the same order (apart from the difference between "communicate" and "comm_modify" in the MacOS and Windows LAMMPS versions). Yet the simulations run slightly differently.

Has anyone else seen this, or is there a way of rectifying this? I would greatly appreciate any help!

Many thanks,


> yet the simulations run slightly differently.

Are they near identical at the beginning, then slowly drift apart? That's normal behavior in many contexts.

See this doc page:
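A minimal sketch (plain Python, not LAMMPS itself) of why a serial run and a parallel run need not agree bit-for-bit: floating-point addition is not associative, so summing the same per-particle contributions in a different order, as happens with a different core count or domain decomposition, can give a slightly different total. The values below are contrived to make the effect obvious:

```python
# Floating-point addition is not associative: the same numbers summed
# in a different order can round differently at each step.
vals = [0.1, 1e16, -1e16, 0.1]

left_to_right = sum(vals)       # ((0.1 + 1e16) - 1e16) + 0.1
reordered = sum(sorted(vals))   # same numbers, ascending order

# The two sums disagree even though the inputs are identical.
print(left_to_right, reordered, left_to_right == reordered)
```

In a real MD run each timestep accumulates many such tiny discrepancies, which is why the trajectories slowly drift apart rather than staying identical.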


This is probably the answer - thank you! The only thing that seemed suspicious is that the results from the simulations run on Windows were consistently about 0.3-0.4 percent higher than those run on the Mac, when averaging the packing fraction over about 1,000 molecules (or 10,000 atoms) for thousands of different molecule shapes in a periodic box. (On the Windows side, I am running the simulations on a cluster with 40 cores.) But it seems likely that this could be due to rounding error, especially since it is so systematic.

Thank you for your help!