I am having problems with lost atoms in my simulations.
My starting simulation deforms the simulation box (effectively compressing in the z direction from above and below) to take the system from its starting configuration to the required density. The setup consists of a fluid layer sandwiched between an upper and a lower surface. The boundary conditions used here are p p p.
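The compression stage looks roughly like the following (a sketch only; the group names and numerical values here are illustrative, not my exact input):

```
# Compress the box in z via fix deform (values illustrative)
boundary        p p p
fix             squeeze all deform 1 z final 0.0 40.0 units box
run             50000
unfix           squeeze
write_restart   compressed.restart
```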
Once I have reached the correct density, I write a restart file and use restart2data to convert it into a data file. I then change the boundary conditions to p p f, as I don't want any interactions between the upper and lower surfaces across the periodic z boundary.
However, when I run this (NVT), it loses atoms within the first 400 timesteps. I have tried decreasing the timestep to 0.1 fs, lowering the temperature, and running NVE instead, but I keep losing atoms in every case.
I suspect it may be due to atoms in the output file lying outside the box dimensions (I don't think they were remapped into the simulation cell using the PBCs before the restart file was written). However, I am not sure, and I am wondering if anyone has any suggestions.
Is there a way to change the boundary conditions within a single simulation (i.e. run the initial deform for N timesteps, then change the boundary from p p p to p p f, then run NVT), so I don't have to deal with restart files? Alternatively, is there a way to remap all the atoms into the box dimensions just before the restart file is written? Or are there other common issues that could be causing my problem?
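For the first option, what I am hoping is possible is something along these lines (assuming change_box accepts a boundary keyword; again, values are illustrative):

```
# Stage 1: compress under fully periodic boundaries
fix             squeeze all deform 1 z final 0.0 40.0 units box
run             50000
unfix           squeeze

# Stage 2: switch z to fixed (non-periodic) and equilibrate
change_box      all boundary p p f
fix             thermo all nvt temp 300.0 300.0 100.0
run             100000
```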
Thanks in advance,