[lammps-users] Trouble Using the rRESPA multi-timescale integrator

Hello,

I’m having trouble using the run_style respa command within the NPT ensemble. I’m attempting to simulate a box of pure propane using the OPLS-AA force field. I want to use rRESPA so I can evaluate the intramolecular forces on a smaller timestep than the intermolecular ones. I’ve noticed that when I apply:

timestep 1.0
run_style respa 2 1

which is a uniform timestep of 1.0 fs at both integration levels, I get different results than when I simply run the simulation with the velocity-Verlet integrator and a timestep of 1.0 fs (everything seems to equilibrate quite nicely with that scheme). When I make the loop factor larger than 1, the system becomes even more unstable.
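For context, the kind of split I’m after would look something like this (the level assignments below are illustrative, not my actual script -- bonded terms on the inner level, nonbonded on the outer):

```
timestep  2.0                # outer timestep in fs; inner = 2.0/4 = 0.5 fs
run_style respa 2 4 bond 1 angle 1 dihedral 1 improper 1 pair 2 kspace 2
```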

I searched through some of the mail threads and noticed that in #2514 an issue was resolved related to using rRESPA within the NPT ensemble. I installed the most recent release of LAMMPS (10Mar10), thinking that perhaps my problem was a bug of the same sort that has since been patched, but that didn’t seem to help. I’ve tried both pppm and ewald for the long-range forces, and in both cases I see the same behavior.

Here’s my input script:

> I get different results when I simply run the simulation with the
> velocity verlet integrator and a timestep of 1.0fs

What does this mean, specifically?

Re: instability - if you do the pairwise interactions at much more than
1 fs, then you will likely have problems. The old mail thread discussed
one bug that was fixed, but another, having to do with the tallying of
pressure, was not. That one is really a bookkeeping issue that should
make a (hopefully) small contribution to the system pressure.

Steve

What I meant is that when I run rRESPA with a loop factor of 1 (i.e., all forces are actually computed on the same timestep) and a timestep of 1.0 fs, I get different results than when I use the default run_style with a timestep of 1.0 fs. So everything is being computed at 1.0 fs either way. The same thing happens when I try a larger timestep of 2.0 fs.
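Concretely, the two setups I’m comparing are (just the relevant commands, not my full input script):

```
# run A: plain velocity-Verlet
timestep  1.0
run_style verlet

# run B: rRESPA with a collapsed loop (factor 1) -- should match run A
timestep  1.0
run_style respa 2 1
```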

I’ve attached some output for comparison.

verlet.txt is the output of an NPT simulation of pure propane (250 molecules) at 300 K and 41.94 atm with a timestep of 1.0 fs, using the default run_style.

respa.txt is the output of the same simulation, but with an inner-loop timestep of 1.0 fs and an outer-loop timestep of 1.0 fs [run_style respa 2 1].

I’ve only printed out a few lines of output, but you can already see there’s a pretty huge difference in things like the temperature and the pairwise energy. I also noticed that there are many more neighbor-list builds in the Verlet run than in the rRESPA run.

Any ideas?

Many thanks!

Will

verlet.txt (3.21 KB)

respa.txt (3.28 KB)

Can you post as small an example as possible of your input script
and data file? I will take a look at this.

Steve

I’ve attached an input script as well as a small (10 propane molecules) data file. If this is too small, I can produce something larger without a problem. Thanks for the help!

Will

npt.in (1.36 KB)

FilledConfig.data (30.4 KB)

William - I think I fixed this with the 25Mar10 patch. It is an issue
having to do with when the box-size updates are applied during an rRESPA
run with NPT. I thought we were doing the 2 updates at symmetric positions
within the timestep hierarchy, but that appears to cause other problems.
With the change, you get the desired behavior that running rRESPA with
collapsed loops (of size 1) gives the same answer as Verlet.

However, it's still a bit unclear to me whether we are doing the 2nd
box-size update in the correct place, so you should run your bigger system
and verify that the pressure and box size equilibrate to reasonable values
compared to a Verlet run, when running with the usual rRESPA loop factors.
Note that Verlet vs. rRESPA might give slightly different values (e.g., box
size and density), since they are different models. But hopefully the
differences will be small if you use reasonable rRESPA values and the
code is working correctly.

Steve
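One quick way to do the comparison Steve suggests is to average the tail of the thermo output from each run and check that pressure and volume agree to within their fluctuations. A minimal sketch, assuming whitespace-separated thermo output with a header line naming the columns (the column names "Press"/"Volume" and the equilibration cut are assumptions, not from the thread):

```python
# Sketch: compare equilibrated averages between two LAMMPS log files.
# Assumes a thermo block of the form "Step Temp Press Volume" followed by
# rows of numbers; adjust the column names to your thermo_style.

def parse_thermo(text, columns=("Press", "Volume")):
    """Return {column: [values...]} parsed from a thermo block."""
    lines = text.strip().splitlines()
    header = lines[0].split()
    idx = {c: header.index(c) for c in columns}
    data = {c: [] for c in columns}
    for line in lines[1:]:
        parts = line.split()
        if len(parts) != len(header):
            continue  # skip lines that are not thermo rows
        try:
            row = [float(p) for p in parts]
        except ValueError:
            continue  # skip non-numeric lines (warnings, timing summary)
        for c in columns:
            data[c].append(row[idx[c]])
    return data

def tail_mean(values, frac=0.5):
    """Average over the last `frac` of the run (crude equilibration cut)."""
    tail = values[int(len(values) * (1 - frac)):]
    return sum(tail) / len(tail)
```

Applied to the two logs in this thread, one would read verlet.txt and respa.txt, call tail_mean on the Press and Volume columns of each, and check that the rRESPA averages fall within the Verlet run's fluctuations.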