RDF not exactly normalized to 1

I am trying to compute the radial distribution function for certain atom types, and I have a problem: LAMMPS does not normalize g(r) to 1.0 at the cutoff. I am using the commands:

'pair_style lj/charmm/coul/long 10.00 10.10'

and

'compute myRDF all rdf 500 2 2'

'fix RDF all ave/time 500 1 500 c_myRDF file rdf.data mode vector'

When I average each bin over the run, the value in the last bin comes out to about 1.1.

Am I doing something wrong in the RDF calculation? How can I get LAMMPS to normalize it closer to 1?

Thank you,

J. Berry


The rdf compute in LAMMPS has been carefully revised and tested
against external codes multiple times, so chances are that the result
you are observing is correct, provided you use the most recent version
of the module (the last changes were in late 2015, to correct for
finite-size effects).

Rather than asking how to "force" the result to match your
expectations, you should try to find out why you get the result you
see. A limit != 1.0 can be perfectly valid: g(r) approaches 1.0 at
large r only for atomic, homogeneous systems sampled from a system in
equilibrium.
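One way to quantify the deviation instead of forcing it away is to average g(r) over the last few bins of the output file. Below is a minimal Python sketch that parses a file written by 'fix ave/time ... mode vector' and reports the tail average of g(r). The column layout (bin index, r, g(r), coordination number) is an assumption based on the usual output of compute rdf with one type pair; check the '# Row ...' header line of your own rdf.data before relying on it.

```python
# Sketch: estimate the large-r limit of g(r) from a LAMMPS
# "fix ave/time ... mode vector" output file (e.g. rdf.data).
# Assumed per-bin row format: index, r, g(r), coord(r).

def gr_tail_average(lines, n_tail=20):
    """Average g(r) over the last n_tail bins of the final frame."""
    frames = []
    current = None
    for line in lines:
        parts = line.split()
        if not parts or parts[0].startswith('#'):
            continue                      # skip comment/header lines
        if len(parts) == 2:               # "timestep nrows" line -> new frame
            current = []
            frames.append(current)
        elif current is not None:
            current.append(float(parts[2]))  # third column assumed to be g(r)
    if not frames or not frames[-1]:
        raise ValueError("no RDF data found")
    tail = frames[-1][-n_tail:]
    return sum(tail) / len(tail)

# Tiny synthetic example in the assumed format, for illustration:
sample = """\
# Time-averaged data for fix RDF
# TimeStep Number-of-rows
# Row c_myRDF[1] c_myRDF[2] c_myRDF[3]
500 4
1 1.25 0.00 0.00
2 3.75 0.80 0.10
3 6.25 1.05 0.50
4 8.75 1.10 1.20
"""
print(gr_tail_average(sample.splitlines(), n_tail=2))  # -> 1.075
```

For a real run you would pass 'open("rdf.data")' instead of the synthetic sample; a tail value that stays near 1.1 across longer averaging windows points to a real physical or system-setup cause rather than statistical noise.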

axel.