Using moltemplate for ReaxFF simulation

Thank you for looking into this, Aidan. I did notice the non-numerical output before, but I was getting it only when I had 1 or 3 chains, not when I had 2 chains. I had resolved NaN values once before by correcting the geometry, so I was watching out for that. But since I create multiple chains by replicating the same initial structure (for the case with 1 or 2 chains), I did not expect the cause to be overlapping particles (if there was an error in 1 chain, it should have been there for 2 chains too).

When I tried it again just now with only 2 chains, I get numerical results at first (for energy, pressure and pair energy) and then a segmentation fault. The data file I used is attached; the other files are as before. I also checked the initial coordinates of all atoms, and they are all distinct and within the box dimensions. Could you please try checking with this file as well? I am probably missing something very basic here.
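(For anyone wanting to rule out overlapping particles from inside LAMMPS itself, the delete_atoms command can do it; a minimal sketch, where the 0.3 Angstrom cutoff is an arbitrary choice, not a recommended value:)

# report/remove any pair of atoms closer than the cutoff
delete_atoms overlap 0.3 all all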

Thanks again,

Ankit

pmmaReax.data (9.77 KB)

I ran your two-chain example and it ran without incident.

Step Temp E_pair E_mol TotEng Press
0 0 26020.801 0 26020.801 13776.29
10000 608.25635 -26143.364 0 -25593.996 -463.48989
Loop time of 27.618 on 1 procs for 10000 steps with 304 atoms

Dear Aidan and Ankit,

    I can reproduce the crash. Ankit is using "pair_style reax/c".
It's not clear whether the crash is caused by code in USER-REAXC;
however, if Ankit is desperate, he might want to contact the
corresponding author of that package for help debugging this error.
(Metin Aktulga. His web page is at: http://www.cse.msu.edu/~hma)
There are other ways to get around the problem (see below).
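    For reference, a typical reax/c setup looks something like the
following (a minimal sketch; the force-field file name and element
ordering are placeholders, not Ankit's actual settings):

units real
atom_style charge
pair_style reax/c NULL                          # NULL = no control file
pair_coeff * * ffield.reax C H O                # elements listed in atom-type order
fix qeq all qeq/reax 1 0.0 10.0 1.0e-6 reax/c   # charge equilibration (required with reax/c)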

    The problem may be hard to reproduce because LAMMPS runs without
crashing on these files if you run it in serial.

lmp_ubuntu -i in.CHO

    However, I get a segfault if I run it in parallel (after only a
few hundred iterations).

mpirun -np 2 lmp_ubuntu -i in.CHO

    In addition to this, there is definitely a problem with the
physics in Ankit's simulation. For example, when I minimize the
system before running MD, the segfault no longer occurs:

mpirun -np 2 lmp_ubuntu -i in_min.CHO

Either way, when I run the simulation, the atoms explode quickly. It
seems like really bad physics in combination with MPI is causing the
segfault. Fixing the physics will probably get rid of the problem.
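    A minimization step before MD amounts to only a couple of lines
in the input script; a sketch along these lines (not the exact
contents of "in_min.CHO"):

min_style cg                        # conjugate-gradient minimizer
minimize 1.0e-4 1.0e-6 1000 100000  # energy tol, force tol, max iterations, max force evals
# ...followed by the usual fix nve / run commands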

(However, in case the original author of USER-REAXC wants to dig into
why the segfault is occurring, I posted the files needed to reproduce
the crash. If it matters, I am using Ubuntu 14.04 running on an Intel
Xeon E5620.)

----- other details -----

    I edited Ankit's input files (posted on 1/26) in order to make
them work better with moltemplate and VMD. (Attached. I also fixed
the problem with the boundary box being too small and added a
minimization step to "in_min.CHO".)
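    (For anyone checking their own data files: the box size is set by
the xlo/xhi, ylo/yhi, zlo/zhi lines in the data file header; the
numbers below are hypothetical.)

0.0 60.0 xlo xhi
0.0 60.0 ylo yhi
0.0 60.0 zlo zhi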

    As mentioned before, when you run LAMMPS in serial, it does not
crash. Nevertheless, I suggest that Ankit look at his trajectory.
(If it helps, I've attached some instructions for doing this using
VMD.)
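    To generate a trajectory file to look at, a dump command along
these lines works (the file name and output interval are arbitrary):

# write coordinates (and charges) every 100 steps for visualization
dump traj all custom 100 traj.lammpstrj id type q x y z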

Cheers

andrew

    I added several new files. The "system.in.init" and
"system.in.settings" files would normally be created by moltemplate.
It was necessary to do it this way to get around the "-nocheck"
problem he was running into. The "system.psf" file was created by
topotools and is handy if Ankit wants to look at his trajectory in
VMD. Feel free to ignore all of these files.
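    (For context, a moltemplate-style input script normally just
pulls these files in with "include" commands; a sketch:)

include "system.in.init"       # units, atom_style, pair_style, etc.
read_data "system.data"
include "system.in.settings"   # pair_coeff, fix qeq/reax, etc.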

README_visualize.txt (2.85 KB)

reaxc_segfault_2016-1-28.tar.bz2 (9.72 KB)

This latest example, reaxc_segfault_2016-1-28, also ran without incident on 16 and 17 procs and is also clean under valgrind. The only problem was that the temperature temporarily became very large (40,000 K), so the trajectory is inaccurate and unphysical. This is not a LAMMPS problem, but a problem with either the initial structure or the interatomic potential.
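A standard way to tame this kind of temperature blow-up (beyond minimizing first) is to cap the per-step atom displacement during early equilibration; a sketch, with arbitrary values:

timestep 0.25                  # ReaxFF usually needs a small timestep (fs in real units)
fix relax all nve/limit 0.1    # limit atom moves to 0.1 Angstrom per step
run 5000
unfix relax
fix integrate all nve          # then switch to normal integration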