LAMMPS community,
I am computing the RDF for a periodic 3D fcc lattice with a Lennard-Jones potential, and I am seeing a size-dependent effect in my results that I cannot explain. For a periodic box, the formulation of g(r) should not depend on the number of atoms in the system, yet when I increase nx, ny, and nz in my script (shown below), every g(r) value decreases significantly. In testing this, the values eventually converge at roughly 1,000,000 atoms, but I don't see why they should change with the number of atoms at all.
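For reference, the normalization I have in mind is the standard pair-distribution definition (V is the box volume, N the number of atoms, r_ij the pair distances):

g(r) = \frac{V}{4\pi r^2 N(N-1)} \left\langle \sum_{i=1}^{N} \sum_{j \neq i} \delta(r - r_{ij}) \right\rangle

By this definition g(r) is intensive, so for the same lattice and the same cutoff it should not change with system size.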
# Simulation settings
package gpu 1
boundary p p p
atom_style atomic
units metal

# General model information
variable nx equal 1 # number of cells in x direction
variable ny equal 1 # number of cells in y direction
variable nz equal 1 # number of cells in z direction

# Create Atoms
lattice fcc 1.0
region box block 0 ${nx} 0 ${ny} 0 ${nz} units lattice
create_box 1 box
create_atoms 1 box
mass 1 1.0

# Potential
variable rcut equal 2.5
variable padding equal 2*${rcut}
variable sigAA equal 1.0
variable epsAA equal 1.0
pair_style lj/cut/gpu ${rcut}
pair_coeff 1 1 ${epsAA} ${sigAA}
pair_modify shift yes

# Compute radial distribution (RDF) function
comm_modify cutoff ${padding}
compute myRDF all rdf 100 1 1 cutoff ${rcut}
fix fix_rdf all ave/time 1 1 1 c_myRDF[*] file out.rdf mode vector
run 0
I also computed the RDF separately by importing the dump file into OVITO and did not see a size-dependent effect there: increasing the number of atoms did not change the g(r) values computed in OVITO (using the same number of bins and the same cutoff).
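For completeness, the OVITO check was done roughly along these lines (a sketch using OVITO's Python module; the dump file name is a placeholder, and I am assuming the 'coordination-rdf' table key used by recent OVITO versions):

from ovito.io import import_file
from ovito.modifiers import CoordinationAnalysisModifier

# Load the LAMMPS dump (placeholder file name) and compute the RDF with the
# same settings as the in-script compute: 100 bins, cutoff 2.5.
pipeline = import_file("dump.lammpstrj")
pipeline.modifiers.append(CoordinationAnalysisModifier(cutoff=2.5, number_of_bins=100))
data = pipeline.compute()

# The RDF ends up in a data table; print r and g(r) for each bin.
rdf = data.tables['coordination-rdf'].xy()
for r, g in rdf:
    print(r, g)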
Any insight into this behavior within LAMMPS would be appreciated.
Thank you,
Chloe