[lammps-users] inputs, units and final simulation box dimensions

I had a quick question about the output I see in the log file coming out
of a LAMMPS run. As can be seen in the sample output below, I give
inputs to create a (50, 100, 50) simulation box, and lj units are
definitely in use.

However, when the system reports back on the creation of the box, it
comes in at (87.358, 174.716, 87.358).

Later, when I graph the dump outputs, I find that this second set of
numbers does indeed define the box that was used in the computation,
given the placement of atoms in the visualization.

Has anybody else come across this, or does anyone have an explanation?
I imagine I'm missing something simple here, but despite my best efforts
I couldn't come up with a good explanation for this one.

Sample log file output:

Try adding the "units box" keyword to the region command; otherwise it
will interpret the boundaries as being given in lattice spacings.
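For example, a region command like the following (a hypothetical sketch, since the original input script isn't shown) would produce a box of exactly the requested size in lj distance units:

```
# Assumed input fragment: "units box" forces the bounds below to be
# read as distance units rather than multiples of the lattice spacing.
region mybox block 0 50 0 100 0 50 units box
create_box 1 mybox
```

Without "units box", the same bounds are multiplied by the lattice spacing, which is how 50 becomes 87.358 in the reported box.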

Hope this helps,

  Lutz

In lj units, if sigma is 1, then your box has to be something else to
give you a requested reduced density. The formula rho* = (N/V) sigma^3
is the conversion factor. So if you choose rho* and LAMMPS sets
sigma = 1, then V is not 50 x 100 x 50 but 87.358 x 174.716 x 87.358 in
your example.

This is discussed on the lattice command doc page and in any textbook
covering LJ reduced units.
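The arithmetic can be checked directly. The numbers in the post are consistent with an fcc lattice (4 atoms per cubic unit cell) at an assumed reduced density of rho* = 0.75 (the original input isn't shown, so this density is an inference from 87.358 / 50):

```python
# In lj units with sigma = 1, LAMMPS picks the lattice spacing a so that
# the lattice reproduces the requested reduced density:
#   rho* = (atoms per unit cell) / a^3  =>  a = (n_cell / rho*)**(1/3)
# For fcc, n_cell = 4.
rho_star = 0.75  # assumed reduced density, inferred from the posted numbers
a = (4.0 / rho_star) ** (1.0 / 3.0)
print(a)  # lattice spacing, ~1.74716

# Region bounds given in lattice spacings get scaled by a:
print(50 * a, 100 * a, 50 * a)  # ~87.358 174.716 87.358
```

So the (87.358, 174.716, 87.358) box is just (50, 100, 50) lattice spacings, exactly as Lutz's "units box" suggestion implies.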

Hope that helps,
Steve