timestep in granular simulations

Hi!

I’m a PhD candidate at the Technical University of Madrid (Spain). We are running LAMMPS simulations of granular systems. Our protocol applies linear compression to 2D systems of viscous, Hookean particles, and we then analyze the statistical distribution of some stress components (obtained with the “compute stress/atom virial” command).

We have noticed that if we use the same input and data file and change only the timestep (*), we obtain solutions with very different statistical distributions of the stress components (all the chosen timesteps are below the critical value).

We think this result is a direct consequence of the velocity-Verlet integration scheme, the non-linear nature of these systems, and the finite precision of computers: when the timestep is very small, some of the overlaps produced by each integration step are also very small and fall below the resolution that can be represented, so the resulting distribution of forces, and hence the dynamic evolution of the system, are different. We have prepared a paper on this issue, and one of the referees has suggested that it could be a bug in the LAMMPS code, but we do not think so. Indeed, we implemented the velocity-Verlet algorithm in another code to model a different (simpler) non-linear system and verified that its dynamic evolution also depends on the timestep. The point is that, in this particular case, the stress distribution could be strongly determined by the timestep.
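The mechanism can be sketched outside LAMMPS in a few lines of plain Python (a hedged illustration; the stiffness, mass and times below are illustrative and not taken from our inputs): a ball falling onto a Hookean floor, integrated with velocity Verlet over the same physical time at different timesteps. Because the contact force switches on only when an overlap is detected at a discrete step, the post-bounce state carries a timestep-dependent error:

```python
# Velocity-Verlet integration of a ball hitting a Hookean (linear-spring)
# floor. The contact force is piecewise: zero until overlap > 0, so the
# discrete sampling of contact onset makes the outcome timestep-dependent.
# Stiffness k, mass (= 1) and times are illustrative values only.

def force(x, k=100.0):
    overlap = -x                      # floor at x = 0; overlap when x < 0
    return k * overlap if overlap > 0.0 else 0.0

def verlet(x, v, dt, steps):
    f = force(x)
    for _ in range(steps):
        v += 0.5 * dt * f             # half kick
        x += dt * v                   # drift
        f = force(x)
        v += 0.5 * dt * f             # half kick
    return x, v

# Same physical time (t = 3), different timesteps:
for dt, steps in [(0.01, 300), (0.001, 3000), (0.0001, 30000)]:
    x, v = verlet(1.0, -1.0, dt, steps)
    print(dt, x, v)                   # final x and v differ slightly with dt
```

The differences are tiny here because there is a single smooth contact; in a dense packing every contact event contributes such an error, and the non-linearity amplifies the divergence.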

What do you think about possible bugs in the LAMMPS code?

Thank you very much!

Ignacio

(*): We have changed the number of integration steps too, so that the simulated physical time, the force rate and the total force are always the same.

e.g.:

xxxxxxxxxxxxxxxxxxxxx SIMULATION A:

units lj
dimension 2
[…]
pair_style gran/hooke 202200 NULL 550.0 NULL 0. 0
read_data initial.grn

pair_coeff * *

neigh_modify exclude type 2 3
neigh_modify […]

group balls type 1
group … […]

timestep 0.001
variable rampforce equal -500.0*(step/100000)

communicate single vel yes

fix 1 all enforce2d
fix 2 piston aveforce 0.0 v_rampforce 0.0
fix 3 rightwall aveforce v_rampforce 0.0 0.0
fix 4 wall freeze
fix 5 all nve

compute 3 all stress/atom virial

dump output all custom 10000 packings.out x y id vx vy c_3[1] c_3[2] c_3[4]

run 100000

xxxxxxxxxxxxxxxxxxxx SIMULATION B:

[…] (as in Simulation A)

timestep 0.0001

variable rampforce equal -500.0*(step/1000000)

[…] (as in Simulation A)

run 1000000

xxxxxxxxxxxxxxxxxxxxxxxx

 

> What do you think about possible bugs in the LAMMPS code?

if you are looking for the philosophical perspective, you can find it here:

http://lammps.sandia.gov/unbug.html

outside of that, it is difficult to make a statement about this
without having specific (simple) examples that demonstrate the
behavior. i also suggest searching through the mailing list archives.
i vaguely remember a discussion about some fundamental issues with the
velocity updates being done the way they are.

ciao,
    axel.

Hi Axel,

Thanks for your rapid answer. Maybe this is a better and simpler example of how the timestep choice can determine the evolution of a non-linear system:
A rigid body (consisting of two balls) moves down until it collides elastically with two other balls that are at rest but located at slightly different heights. After this event, balls 3 and 4 start to move and then collide with a fifth particle that is also at rest. Well, the final position and velocity of particle 5 are strongly determined by the timestep selection. The simulation can be run, for example, with timesteps of 0.2, 0.1, 0.01 and 0.001 to check how the position of particle 5 changes after 30, 60, 600 and 6000 timesteps, respectively.

We have used LAMMPS as well as other codes and have obtained similar results in this case. We think that velocity-Verlet, finite precision, non-linearity and the timestep choice cause a similar effect in granular systems under compression, but we would like to rule out any possible bug in LAMMPS.

xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx (SCRIPT)
units lj
dimension 2
atom_style granular
boundary f f p
newton off
pair_style gran/hooke 10.0 NULL 0.0 NULL 0. 0
read_data initial.grn

pair_coeff * *

group piston type 1
group balls type 2

timestep 0.1
communicate single vel yes

fix 1 all enforce2d
fix 2 piston rigid/nve single
fix 3 balls nve

dump output all custom 10 balls.out x y id vx vy

run 60

xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx (INITIAL.GRN)
DATA FILE

5 atoms

2 atom types

-5.0 5.0 xlo xhi
-7.0 3.0 ylo yhi
-0.5 0.5 zlo zhi

Atoms

1 1 1.0 1.90985932 -0.5 2.0 0.0
2 1 1.0 1.90985932 0.5 2.0 0.0
3 2 1.0 1.90985932 -0.5 1.0 0.0
4 2 1.0 1.90985932 0.5 0.98 0.0
5 2 1.0 1.90985932 0.0 -4.0 0.0

Velocities

1 0 -1.0 0 0 0 0
2 0 -1.0 0 0 0 0
3 0 0 0 0 0 0
4 0 0 0 0 0 0
5 0 0 0 0 0 0
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
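A 1-D analogue of this five-ball test can be sketched in plain Python (a hedged illustration, not the LAMMPS run above; masses, stiffness and spacings are illustrative): three equal balls interacting through a purely Hookean contact force, as in pair_style gran/hooke with zero damping. The impulse of ball 1 is passed along the chain, and the final velocity of the last ball depends measurably on the timestep:

```python
# Three equal-mass balls on a line; linear-spring repulsion when two
# neighbours are closer than the contact distance d. Ball 1 approaches
# the chain, and velocity Verlet is run over the same physical time
# with different timesteps. Parameters are illustrative only.

def contact_forces(x, k=10.0, d=1.0):
    """Hookean repulsion between adjacent balls when they overlap."""
    f = [0.0] * len(x)
    for i in range(len(x) - 1):
        overlap = d - (x[i + 1] - x[i])
        if overlap > 0.0:
            f[i]     -= k * overlap   # push left ball leftward
            f[i + 1] += k * overlap   # push right ball rightward
    return f

def run(dt, t_total):
    x = [0.0, 2.0, 4.0]               # positions
    v = [1.0, 0.0, 0.0]               # ball 1 moves toward the chain
    f = contact_forces(x)
    for _ in range(int(round(t_total / dt))):
        v = [vi + 0.5 * dt * fi for vi, fi in zip(v, f)]
        x = [xi + dt * vi for xi, vi in zip(x, v)]
        f = contact_forces(x)
        v = [vi + 0.5 * dt * fi for vi, fi in zip(v, f)]
    return x, v

for dt in (0.1, 0.01, 0.001):
    x, v = run(dt, 6.0)
    print(dt, v)                      # final velocity of ball 3 varies with dt
```

In the continuum limit the impulse is transferred completely (ball 3 ends at velocity 1, the others at rest); the discrete contact-onset detection leaves a timestep-dependent residual.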

Ignacio,

Two points:

First, look into the literature if you are talking about the statistics of forces that arise from force chains in a compressed, i.e. jammed, granular assembly. The literature shows universal behaviour in many compression situations, e.g. triaxial compression (some of these studies even used LAMMPS).

Second, you have to distinguish whether you are talking about individual trajectories or about statistics extracted as averages over your entire system. Given the chaotic nature of n-body problems, it is not surprising that single-particle trajectories diverge under different timesteps.
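This distinction can be illustrated with the standard textbook chaotic system, the logistic map at r = 4 (a hedged, non-granular example): two orbits started 1e-10 apart separate to order one within about a hundred iterations, yet their long-time statistics agree, since both sample the same invariant density:

```python
# Chaotic trajectory divergence vs. robust statistics, shown with the
# logistic map x -> 4 x (1 - x). Not a granular model -- just the
# textbook illustration of sensitivity to initial conditions.

def logistic_orbit(x0, n, r=4.0):
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        xs.append(x)
    return xs

a = logistic_orbit(0.3, 50000)
b = logistic_orbit(0.3 + 1e-10, 50000)

# Trajectories: a 1e-10 perturbation grows to order one within ~100 steps.
max_gap = max(abs(p - q) for p, q in zip(a[:100], b[:100]))

# Statistics: both time averages sit near 0.5, the mean of the
# invariant density 1 / (pi * sqrt(x (1 - x))).
mean_a = sum(a) / len(a)
mean_b = sum(b) / len(b)
print(max_gap, mean_a, mean_b)
```

The analogous question for the granular runs is whether the stress histograms, not the particle trajectories, converge as the timestep is reduced.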

Hi Eric,

Many thanks for your comments. I know some papers dealing with force/stress distributions, but I have not found any that deal with the effect of the timestep (only one timestep value is used, and a velocity-Verlet scheme is not always applied). The thing is that we are carrying out biaxial tests, and if we use different, randomly generated samples subjected to the same protocol, we always obtain the same distribution; but if we change the timestep (keeping the physical time and force rate fixed), the distributions also change. Although smaller timesteps do not produce significant differences in their associated statistical outputs, I think this does not necessarily mean that they give more realistic results.

Elastic collisions of n bodies and granular compression are different problems, but I have not found a simpler example to show how the timestep selection can affect the evolution of a system. I think something similar happens during the driving process (which includes many inelastic collisions and interactions) and affects the final distribution of forces. At least, that is what we have obtained from our simulations.

Thank you very much,

ignacio