[lammps-users] Entropy and Neighbor command

Dear all,

I am trying to characterize a box with a single metallic element; the model is very simple. I am trying Ni, Fe, Cr and many others. Among several thermodynamic parameters, I am also trying to evaluate the entropy. In order to get an overall indicative value, I added the following lines to the input file:

compute entatom all entropy/atom 0.25 5

compute entot all reduce sum c_entatom
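For completeness, the summed value is then written to the thermo output with something like the lines below (the output interval is just an example):

# print the summed per-atom entropy alongside the usual thermo columns
thermo 100
thermo_style custom step temp pe c_entot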

After changing many other minor input parameters, I saw that the results remain, let's say, stable.

However, I found a huge difference when adding the following line:

neighbor 2.5 bin

It decreases the entropy by a lot (on the order of 20-30%). I also saw that changing the skin distance (2.5 here) barely changes the results; they only change when the neighbor command is added or removed. Honestly, I cannot figure out why the entropy changes so much. Other thermodynamic values (e.g. pe) do not change very much either, with or without the neighbor command. Thank you.

Cheers,

Stefano

I suspect that you are making incorrect assumptions about what is computed here.

Please look at the description of the compute more carefully. The purpose of this compute is to determine some kind of (dis)order parameter to interpret local structure, not to compute thermodynamic properties.

Axel.

please also note that the settings of the neighbor command should not change the results by more than what can be justified by the non-associative nature of floating-point math, unless they are chosen to be incorrect, i.e. they leave out neighbors that need to be considered or update the neighbor list not frequently enough for the structural changes that occur.
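for example, settings along the lines of the ones below are usually on the conservative side (the skin value is system and units dependent):

# check every step and rebuild whenever an atom may have moved more than half the skin
neighbor 2.0 bin
neigh_modify delay 0 every 1 check yes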

axel.

What is available in LAMMPS is documented in the manual.

And I would suggest looking up in a statistical mechanics or thermodynamics textbook what options you have, in principle, to access the properties you are looking for. I only know about determining free energy differences associated with specific processes.

Axel.

Dear Axel,

Thank you for your answer. I did read the compute description, but not carefully enough, I guess. Sorry for that. So now the question is whether there actually is a command for evaluating the entropy of the system. For instance, what if I need to evaluate all the components of the Gibbs free energy of a binary alloy? Thank you.

Cheers,

Stefano

Dear Stefano,

It is indeed possible to expand the total entropy of a system into an infinite series of entropic terms due to two-body, three-body, four-body, ..., n-body correlations. This has been shown in the paper by A. Baranyai and D. J. Evans, Phys. Rev. A 40 (7), 3817-3822 (1989). This expansion is valid for ensembles with constant chemical potential, not constant number of atoms/particles. What you are computing with the entropy/atom compute is the two-body term of this expansion, in a simplified way, provided you are in an ensemble with constant chemical potential.
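For reference, the two-body (pair) term for a homogeneous one-component system is usually written as

S_2/N = -2\pi \rho k_B \int_0^\infty \left[ g(r) \ln g(r) - g(r) + 1 \right] r^2 \, dr

with \rho the number density and g(r) the pair correlation function.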

Note that the expression for the two-body entropy depends heavily on g(r): If g(r) is not strictly equal to 1 (in the mathematical sense) for large distances, then the contribution of large r to the entropy will become (artificially) very important. In a simulation, you can choose a maximum r that you employ in the calculation of g(r) (for instance half of the length of the simulation box) and check if the S_2 term is still sensitive to r.
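In LAMMPS, one rough way to do such a check is to define the per-atom compute several times with different cutoffs and compare the averaged values; the cutoff values below are only placeholders, and each one must of course stay within the range covered by the neighbor list:

# per-atom pair entropy with three trial cutoffs (placeholder values)
compute ent4 all entropy/atom 0.25 4.0
compute ent5 all entropy/atom 0.25 5.0
compute ent6 all entropy/atom 0.25 6.0
# per-atom averages for comparison
compute ave4 all reduce ave c_ent4
compute ave5 all reduce ave c_ent5
compute ave6 all reduce ave c_ent6
thermo_style custom step c_ave4 c_ave5 c_ave6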

Another approximate way is to use the “quasi-harmonic” approach proposed by J. Schlitter (if you are familiar with Python, you can have a look at a related code I wrote some time ago: https://github.com/evoyiatzis/IntramolecularEntropy).
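For reference, Schlitter's formula gives an upper bound to the entropy of the form

S \le \frac{k_B}{2} \ln \det\left[ \mathbf{1} + \frac{k_B T e^2}{\hbar^2} \mathbf{M} \boldsymbol{\sigma} \right]

where M is the diagonal mass matrix and \sigma is the covariance matrix of the atomic position fluctuations sampled along the trajectory.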

Cheers
Evangelos