I guess my questions are more related to MD fundamentals, but since I'm using LAMMPS for the simulations, and since looking at the books hasn't (yet) provided an answer, I thought I'd ask here. I use NVT and NPT with the following commands to equilibrate small systems (organic molecules in the liquid phase, 5000 to 30000 atoms):
fix 1 all nvt temp 300 300 0.06
fix 1 all npt iso 0.0 0.0 0.6 temp 300 300 0.06
After that, I run NVE to check energy conservation. When I plot temperature against time over the whole simulation, I observe larger fluctuations during NVT and NPT than during the subsequent NVE run (about 30% larger). This behavior persists for different values of the thermostat's damping constant. My first instinct was to expect the opposite: a better-controlled temperature in NVT than in NVE. So, is this typical (and physical) behavior? And if so, could someone please explain why NVE does a better job with temperature fluctuations than the thermostat does?
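For context, one way to make the "30% larger" observation quantitative is to compare the measured relative fluctuation of the instantaneous kinetic temperature, sigma_T / <T>, against the canonical-ensemble expectation sqrt(2 / (3N)) for N atoms. Below is a minimal sketch of that comparison; the temperature arrays here are synthetic placeholders standing in for whatever you extract from the LAMMPS thermo output (the function names are mine, not LAMMPS commands):

```python
import numpy as np

def rel_temp_fluctuation(temps):
    """Relative fluctuation sigma_T / <T> of a temperature time series."""
    temps = np.asarray(temps, dtype=float)
    return temps.std() / temps.mean()

def canonical_rel_fluctuation(n_atoms):
    """Canonical-ensemble expectation sqrt(2 / (3N)) for N atoms."""
    return np.sqrt(2.0 / (3.0 * n_atoms))

if __name__ == "__main__":
    # Synthetic stand-ins for temperature columns parsed from log.lammps:
    rng = np.random.default_rng(0)
    t_nvt = 300.0 + 4.0 * rng.standard_normal(10_000)  # hypothetical NVT trace
    t_nve = 300.0 + 3.0 * rng.standard_normal(10_000)  # hypothetical NVE trace

    print("NVT sigma_T/<T>:", rel_temp_fluctuation(t_nvt))
    print("NVE sigma_T/<T>:", rel_temp_fluctuation(t_nve))
    print("canonical expectation, N=5000:", canonical_rel_fluctuation(5000))
```

If the NVT trace matches the canonical value while the NVE trace sits below it, the thermostat is doing exactly what it should: sampling the canonical distribution of kinetic energy rather than pinning the instantaneous temperature.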