Hi all,
I’ve run into an issue that is probably related to the force fields I’m using (or my own inexperience), and I’d like to share it so you can help me figure out my mistake. I recently needed to validate the available DMF (N,N-dimethylformamide) force fields, namely GROMOS 54A7 and OPLS-AA.
I created a simulation box whose edge length is 2.5× the cutoff (the cutoff is 1.2 nm, so the edge is ~3.0 nm). The van der Waals interaction is potential-shifted at the cutoff, and the long-range Coulomb interaction is treated with P3M in LAMMPS and PME in GROMACS.
Both force fields give the correct, expected density at 1 atm and 300 K in a 1 ns NPT simulation, although in GROMACS the pressure-coupling time constant has to be 2 ps, with the compressibility set to 6.27e-5 1/bar. With tau_p = 1 ps the box size changes dramatically, but this is not my main concern, since LAMMPS gives the same result with 1 ps.
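For reference, the GROMACS settings described above correspond to an .mdp fragment like the one below. The barostat choice is an assumption on my part (any pressure-coupling scheme with these time constants shows the behavior):

```
; nonbonded settings
cutoff-scheme    = Verlet
rvdw             = 1.2                 ; nm
vdw-modifier     = Potential-shift
rcoulomb         = 1.2                 ; nm
coulombtype      = PME
; NPT pressure coupling
pcoupl           = Parrinello-Rahman   ; assumed; barostat type not critical here
tau-p            = 2.0                 ; ps
compressibility  = 6.27e-5             ; 1/bar
ref-p            = 1.01325             ; bar (= 1 atm)
```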
The issue shows up in the NVT simulations, where I want to measure the derived pressure and study the system’s dynamics. The average pressure is extremely sensitive to the simulation box: a change of only 0.5 Å in the edge length shifts it noticeably. In addition, the instantaneous pressure sweeps a wide range, roughly -2000 to +2000 atm.
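For scale, here is a back-of-envelope estimate of the expected instantaneous-pressure fluctuation from the macroscopic compressibility, sigma_P ~ sqrt(kB*T / (V*kappa_T)). The 3.0 nm edge (2.5 × 1.2 nm cutoff), 300 K, and 6.27e-5 1/bar are the values from above; the formula itself is only an order-of-magnitude sanity check, not an exact result for a small virial-based pressure:

```python
import math

# sigma_P ~ sqrt(kB*T / (V * kappa_T)): thermodynamic estimate of
# pressure fluctuations for a fixed-volume box of liquid.
kB = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                  # temperature, K
edge = 3.0e-9              # box edge, m (2.5 x 1.2 nm cutoff)
V = edge ** 3              # box volume, m^3
kappa_T = 6.27e-5 / 1.0e5  # compressibility, converted 1/bar -> 1/Pa

sigma_P_pa = math.sqrt(kB * T / (V * kappa_T))
sigma_P_bar = sigma_P_pa / 1.0e5
print(f"sigma_P ~ {sigma_P_bar:.0f} bar")  # on the order of 1e2 bar
```

Even this estimate gives fluctuations of order a hundred bar for a 3 nm box, and instantaneous virial pressures in MD typically swing considerably more than that, so I’m trying to understand whether my ±2000 atm range is normal noise or a setup problem.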
I can share my files and logs with you—please let me know your thoughts and insights.
I used the same force fields and parameters for both the NPT and NVT simulations.
I generated the parameters using both LigParGen (OPLS-AA) and the ATB server (GROMOS 54A7).
Best
Russell