memory error when using read_dump command


I am using the read_dump command to run a minimization on every snapshot of my MD run. However, after going through a few snapshots from my dump file, the code aborts with the error message “p0: not enough space for bonds! total=1015904 allocated=1015236”. Below is my input script:

units real
atom_style charge # coarse-grain liquids, solids, metal
boundary p p p # boundary conditions

read_data (same data file was used to generate the dump file previously)
neighbor 2 bin
neigh_modify every 10 delay 0 check no
pair_style reax/c NULL checkqeq no
pair_coeff * * ffield.reax 6 #Al Cu can also be defined in datafile
compute gv all pair reax/c
variable eb equal c_gv[1]
variable ea equal c_gv[2]
variable evdw equal c_gv[11]
compute mycoord all coord/atom 3.20
dump 1 all custom 10 minimize_1.dump id type x y z c_mycoord
thermo_style custom etotal pe ke vol press temp pxx pyy pzz pxy pxz pyz lx ly lz v_eb v_ea v_evdw
thermo_modify line multi flush yes
thermo 1

variable b loop 1020
label loop01

variable a equal 20000+1000*$b
read_dump foo.dump $a x y z
minimize 1.0e-4 1.0e-6 10 1000

next b
jump SELF loop01

In the dump file (foo.dump) I output xu, yu and zu, but the read_dump command only takes x, y and z as arguments. I read in earlier posts on this mailing list that the memory problem can arise from a bad input structure. Could this discrepancy be the cause of the problem?
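If the unwrapped coordinates turn out to be the issue, one workaround (an untested sketch, assuming the production MD can be re-run) is to write wrapped coordinates into the dump file in the first place, so read_dump never has to fold large unwrapped displacements back into the box:

```
# in the original production run: dump wrapped x/y/z instead of xu/yu/zu
dump 2 all custom 1000 foo.dump id type x y z
```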

Dr. Karthik Guda
Department of Mechanical and Nuclear Engineering
Penn State University

That's an error message from within the reax/c library.
Maybe Ray has an idea as to why this happens
when you repeatedly invoke the library for multiple minimizations.


Your reax/c neighbor list ran out of space due to some unknown error,
most likely because the unwrapped coordinates are being wrapped back into
the box. You can also try the safezone and mincap keywords to increase
the sizes of the allocated arrays.
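For reference, safezone and mincap are optional keywords of pair_style reax/c that scale up its internal memory allocations; a sketch (the values 1.6 and 100 are illustrative guesses, larger than the defaults of 1.2 and 50):

```
# enlarge reax/c's memory safety factor and minimum per-atom capacity
pair_style reax/c NULL checkqeq no safezone 1.6 mincap 100
```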

