ReaxFF error: Non-numeric atom coords - simulation unstable

Dear all,

I am reaching out for guidance as this is my first time using ReaxFF. My goal is to reproduce results from the paper J. Phys. Chem. C 2015, 119, 31, 17876–17886 (https://doi.org/10.1021/acs.jpcc.5b04650) using the 2 August 2023 version of LAMMPS. However, I get “Non-numeric atom coords - simulation unstable” errors. From my reading, I understand that this error is usually related to poor dynamics caused by an incorrect potential or a bad configuration (close contacts). Here’s what I’ve tried:

First, I copied and pasted the force field file and the Al slab configuration from the supplementary material. I ran NVT at 300 K to test the system’s stability with the force field. With the given Al slab structure, the simulation ran without errors, although the temperature dropped to 0. I attached the structure and log files to this post (1_data.Al_slab, 1_log.lammps_Alslab).

Second, I increased the z box length to 10 nm to match the system in the paper. Running the simulation with only the Al slab crashed within a few steps with the “Non-numeric atom coords - simulation unstable” error; the temperature, pressure, and total energy all became NaN. I attached the structure and log files (2_data.Al_slab, 2_log.lammps_Alslab_10nm).

Third, I reduced the time step from 0.1 fs to 0.05 fs (the authors used 0.2 fs in the paper). The simulation ran without errors, but the pressure was extremely high. Visualization didn’t reveal any close contacts. The log file is attached (3_log.lammps_Alslab_10nm_0.05fs).

Lastly, adding the 150 oxygen molecules mentioned in the paper led to a crash during the minimization step with the error “Not enough space for bonds!”. I attached the structure and log files (4_data.AlO_slab, 4_log.lammps_AlO).

Any insights or suggestions on resolving these issues would be greatly appreciated.

Best,
Sue

Here is the input file:

# ------------------------ INITIALIZATION ----------------------------

units           real
dimension       3
boundary        p       p       p
atom_style      charge

# ----------------------- ATOM DEFINITION ----------------------------

read_data data.AlO_slab

group Al type 1
group O type 2

neighbor 2.0 bin     # 2.0 Å skin for pairwise neighbor lists
neigh_modify delay 0 every 10 check no

# ------------------------ FORCE FIELDS ------------------------------

## ReaxFF Al/Oxygen
pair_style reaxff lmp_control
pair_coeff * * ffield.reax.AlO Al O

# ------------------------ Equilibration Run -------------------------------

##COMPUTES

compute  pe all pe/atom   # potential energy
compute  s all stress/atom NULL
compute  u all displace/atom
compute  equ all temp
compute  thermo_press_lmp all pressure thermo_temp
compute reax all pair reaxff
variable eb equal c_reax[1]
variable ep equal c_reax[12]

thermo_style custom step time temp press etotal pe vol v_eb v_ep # print these every N steps
thermo 1 # output thermodynamics every N steps

##DUMP
dump dump1 all atom 1 dump.reax.AlO

velocity all create 300 400 dist gaussian mom yes rot no
velocity all scale 300 temp equ

timestep 0.1  # 0.1 fs timestep
fix 1 all nvt temp 300 300 0.01
fix 2 all qeq/reaxff 1 0.0  10.0 1e-6 reaxff

min_style cg
minimize 1.0e-6 1.0e-6 100000 100000

# ------------------------ Dynamic Run -------------------------------

run 10000

# ------------------------ Restart & Data file -----------------------

write_restart Al_slab.rest
write_data data.Al_slab_final

1_data.Al_slab (43.1 KB)
1_log.lammps_Alslab (1.3 MB)
ffield.reax.AlO (3.3 KB)
lmp_control (405 Bytes)
2_log.lammps_Alslab_10nm (14.8 KB)
2_data.Al_slab (43.1 KB)
3_log.lammps_Alslab_10nm_0.05fs (1.3 MB)
4_data.AlO_slab (67.8 KB)
4_log.lammps_AlO (3.2 KB)

Hi @suepark,

Some comments:

  • “using the 2 August 2023 version of LAMMPS” Is there a reason for this? If not, consider updating. This is a very old release at LAMMPS’s development pace; several issues and bugs have been fixed since, including ReaxFF-related ones.
  • “I copied and pasted the force field file and the Al slab configuration from the supplementary material.” The slab and force field files provided are in PDF format. Be specific and mention that you converted them yourself; this is a common source of error with ReaxFF.
  • Please read the forum guidelines to format your input files accordingly.
  • Your nvt Tdamp parameter is wrong (it is smaller than your timestep, which makes no sense). This explains why your temperature drops to 0. Please read the fix nvt documentation to learn how to set a better value; see the sketch below this list. Changing it and running with the latest release makes your initial case run without issue.
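
For reference, the fix nvt documentation recommends a Tdamp of around 100 timesteps. A minimal sketch of one common idiom (the exact value is a judgment call, not a prescription):

timestep 0.1                             # 0.1 fs with real units
fix 1 all nvt temp 300 300 $(100.0*dt)   # Tdamp = 100 timesteps = 10 fs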

Hi @Germain,

Thank you so much for your response. I’ve edited the post to align with the input format guidelines.

Following your advice, I changed the Tdamp to 100, and the temperature remained at 300.
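
Concretely, the thermostat line now reads:

fix 1 all nvt temp 300 300 100   # Tdamp = 100 fs, i.e. 1000 timesteps at 0.1 fs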

I also tried using a newer version of LAMMPS (29Aug2024_update2), which still generated an error “Not enough space for bonds!” for the Al slab system.

Regarding the force field file, I selected the text in the PDF with my mouse, then copied and pasted it. For the Al slab, I copied and pasted the coordinates in the same way and converted the xyz file to the data format using ASE.

To test whether the geometry might be incorrect, I created an Al(111) slab with ASE. The bulk simulation runs without crashing, although the pressure is extremely high. The structure and log files are attached as 1_data.Al_bulk and 1_log.lammps_bulk. However, the simulation of the slab crashes with the error “Not enough space for bonds!”. This suggests that either the potential or the settings in my input file may be wrong. The structure and log files are 2_data.Al111_slab and 2_log.lammps_slab.

Any insights or suggestions you could provide would be tremendously appreciated.

Best,
Sue

1_data.Al_bulk (287.0 KB)
1_log.lammps_bulk (1.3 MB)
2_data.Al111_slab (287.0 KB)
2_log.lammps_slab (5.6 KB)

You can also search the forum for potential solutions by entering this exact error message in the search engine.

Here is one thread that already popped up for ReaxFF. The forum also contains the full archive of the former LAMMPS mailing list; you may find more insights there.
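
For the serial (non-KOKKOS) ReaxFF implementation, the usual workaround discussed in those threads is to enlarge the preallocated buffers through the optional safezone and mincap keywords of pair_style reaxff. A minimal sketch (the values are illustrative guesses, not tuned for your system; the defaults are 1.2 and 50):

pair_style reaxff lmp_control safezone 1.6 mincap 100
pair_coeff * * ffield.reax.AlO Al O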

You should be using the KOKKOS (OpenMP) version of ReaxFF instead of the deprecated serial version, even if you are not using GPUs.

@stamoor has been pretty clear about that …

“IMO anyone and everyone should be using the KOKKOS version of ReaxFF. Not only is it more memory robust and will never have these hbondchk errors, it is also faster on CPUs, at least in most cases that I’ve benchmarked, or same speed at the very least.”

– Stan Moore (2024/10) on MatSci.org:

Lammps hbondchk failed.

“I highly suggest using the KOKKOS package for ReaxFF, works in serial for CPUs too.”

– Stan Moore (2024/10) on MatSci.org:

Segmentation fault: address not mapped to object at address 0xc2cfb87c.

“You could also try the KOKKOS version which doesn’t use the safezone, mincap, and minhbonds factors which can bloat the memory if you set them too high.”

– Stan Moore (2025/01) on MatSci.org:

Possible memory problem with Reaxff when the total atom number increased.

I’ve run reaxff/kk jobs on up to ~6000 CPU cores with no problems.
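
For a concrete starting point, switching an existing input to KOKKOS is mostly a matter of command-line switches. A minimal sketch, assuming an OpenMP build with 4 threads per MPI rank (adjust the counts and the input file name to your setup):

# run with: mpirun -np 2 lmp -k on t 4 -sf kk -in in.AlO
# the -sf kk suffix makes these styles resolve to reaxff/kk and qeq/reaxff/kk
pair_style reaxff lmp_control
fix 2 all qeq/reaxff 1 0.0 10.0 1e-6 reaxff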


Also check that your force field satisfies the following physical constraints:

  1. ATM r_vdw ≥ r_s ≥ r_p ≥ r_pp
  2. ATM r_vdw > rcore2
  3. ATM ecore2 > epsilon
  4. ATM eta > 7.2 gamma (QEQ), eta > 8.13 gamma (ACKS2)
  5. ATM bcut_acks2 ≥ max(r_s)
  6. electronegativity hierarchy, i.e. chi(O) > chi(N) > chi(C) > chi(H)
  7. BND De_s ≥ De_p ≥ De_pp
  8. BND p_be2 ≥ p_bo2, p_bo4, p_bo6
  9. BND p_bo6 ≥ p_bo4 ≥ p_bo2
  10. OFD r_pp ≥ r_p ≥ r_s
  11. OFD.X.Y.alpha ≥ max(ATM.X.alpha, ATM.Y.alpha)
  12. ANG p_pen1 ≥ |p_val1|
  13. ANG p_pen1 ≥ |p_val2|
  14. TOR V1 ≥ |V2|
  15. TOR V1 ≥ |V3|
  16. HBD r0_hb > r_s (ensure H-bond length exceeds σ-bond radius)

Constraint #4 is the most important; it’s called the “polarization catastrophe” [Troubleshooting and warnings — ReaxFF 2025.1 documentation]. The closer you get to that bound, the more ill-conditioned the QEq matrix becomes, and the more likely you are to run into numerical issues with your simulation.

A lot of force fields in the past have cheated on those physical constraints to overfit a very specific system under study in a given paper. If the system you are simulating doesn’t match the training data of the original paper almost exactly, that force field is likely not transferable to your new application.

One more word of caution: check that the pi and pi-pi coefficients are properly trained in your force field if they are relevant to your application. In my personal experience, there are many potentials with C/H/O/N atoms, but not all of them have the pi-bond parameters trained, so a benzene molecule might behave in a completely unphysical manner, e.g. hydrolyze in a few femtoseconds, even though benzene has a half-life in water measured in days or even weeks.


You can also use the Kokkos Serial version of ReaxFF on CPUs instead of the OpenMP version; it may be a little simpler to build and run, as sketched below.
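
A minimal sketch of that route, assuming a CMake build (the input file name is illustrative):

# build:  cmake -D PKG_KOKKOS=yes -D Kokkos_ENABLE_SERIAL=yes ../cmake && make
# run:    lmp -k on -sf kk -in in.AlO   # no “t N” thread argument needed for the Serial backend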