I’ve been working on polymer translocation. My LAMMPS input file contains 2 runs (equilibration and translocation). I equilibrate the system for 2,000,000 steps, after which the translocation runs for 800,000 steps. Because of the random force, this work must be repeated 2000 times to get a good distribution of translocation times, and that takes too long.
Is it correct to equilibrate the system only once, use the dump file as a data file, and then just repeat the translocation process 2000 times?
a) It would be better to save a data file with write_data. You cannot use read_data on a dump file, since it has a different format and does not contain the same information.
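For illustration, the end of the equilibration input could look like the sketch below (thermostat settings, temperature, and file names are placeholders, not taken from your script):

```
# ... pair style, bonds, groups, etc. as in your original input ...

fix        therm all langevin 1.0 1.0 10.0 48271   # thermostat with its own RNG seed
fix        integ all nve
run        2000000                                  # equilibration

write_data equilibrated.data                        # restartable data file, NOT a dump
```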
b) Yes, you can start multiple trajectories from the same data file, but you need to decorrelate them first. That is often done by re-initializing the velocities with the velocity command, adding a small(!) random displacement to the atoms with the displace_atoms command, and running a short additional equilibration. The divergence of such trajectories is usually exponential, so a few tens of thousands of steps should be enough. For all of these steps you should use different, nonconsecutive random number seeds (for each step and each copy of the run) so that velocities, displacements, and other randomness-driven commands (e.g. fix langevin) have different, uncorrelated RNG sequences in the different runs, for better statistical relevance.
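A minimal sketch of one such decorrelated copy, assuming hypothetical seed values, temperature, displacement magnitude, and file names:

```
# one translocation copy (sketch); give every copy its own distinct,
# nonconsecutive seeds, e.g. via -var on the command line
variable   seed1 equal 12391
variable   seed2 equal 74833
variable   seed3 equal 90901

read_data  equilibrated.data

# re-initialize velocities and slightly perturb positions to decorrelate
velocity        all create 1.0 ${seed1} dist gaussian
displace_atoms  all random 0.01 0.01 0.01 ${seed2} units box

fix        therm all langevin 1.0 1.0 10.0 ${seed3}
fix        integ all nve

run        20000      # short re-equilibration so trajectories diverge
run        800000     # translocation run
```

In practice you could launch the 2000 copies from the same input by passing the seeds on the command line, e.g. `lmp -var seed1 12391 -var seed2 74833 -var seed3 90901 -in translocate.in`, changing the values for each copy.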