
The ReaxFFSiO paper says, "All parameters are determined from QC
calculations (no empirical data is used) making it a first principles FF."

How is Reax empirical? How is my statement not justified by the paper?

Wrong. Still empirical. Again, any classical potential with force field
constants (potential parameters) fitted to experimental or ab initio
computational results is "empirical". This goes for BKS and ReaxFF. They
can "claim" their formalism is inspired by or derived from QM, but they
still have parameters empirically fitted to QM calculations - therefore
they are still empirical.

Take my advice: you will be laughed at and ridiculed badly if you go out
there and say BKS and ReaxFF are ab initio methods.

Full stop here. Read MD textbooks.

I fully respect you and want to learn from you; I just do not understand
why my interpretation of these papers is wrong.

I was not aware I sent this to the mailing list.
Why are you saying that I am trying to fully understand ab initio MD from
you? Can I not ask questions without making you think I want a dissertation
on the topic?

This is a LAMMPS mailing list, not free online tutoring.

The BKS paper states:

"In this Letter, we present a scheme for the development of bulk force
fields, starting from ab initio calculations on small clusters."

They fit their parameters to ab initio results of small clusters.

Benjamin,

When you make a force field, you have to decide on a functional form, and then fit the model parameters to some set of data. The data is often a bunch of empirically determined structures, or reaction energies, or whatever, but many times this is problematic because there isn’t a lot of data out there for non-equilibrium structures, and such. If you want to do dynamics, you will be sampling non-equilibrium states, and how do you know your model parameters hold up there, if none of that was in the training set?

An alternative is to calculate the energies of a bunch of different configurations (including non-equilibrium states) using quantum mechanics. You can get lots of data over a broader spectrum of configurations. Is that “better”? It depends on how good the QM calculations are, and what your purpose is.
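
To make that fitting step concrete, here is a minimal Python sketch (not anyone’s actual workflow; the potential form, training data, and numbers are invented for illustration) that calibrates the two free parameters of a Lennard-Jones pair interaction against a handful of reference energies, which could equally well have come from experiment or from QM single-point calculations. It uses NumPy and SciPy’s curve_fit:

import numpy as np
from scipy.optimize import curve_fit

def lj_energy(r, epsilon, sigma):
    # 12-6 Lennard-Jones pair energy with free parameters epsilon, sigma
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 ** 2 - sr6)

# Hypothetical training set: separations (Angstrom) and reference
# energies (eV) -- these could be experimental or ab initio numbers.
r_train = np.array([3.0, 3.4, 3.8, 4.2, 5.0, 6.0])
e_train = np.array([0.095, 0.000, -0.010, -0.008, -0.004, -0.001])

# Least-squares calibration of the free parameters; this step is what
# makes the resulting force field "empirical", no matter where the
# reference energies came from.
popt, pcov = curve_fit(lj_energy, r_train, e_train, p0=[0.01, 3.4])
print("fitted epsilon, sigma:", popt)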

Now, you might think that a force field calibrated on “empirical data” is an “empirical force field,” and that one calibrated on ab initio calculations is an “ab initio force field,” but that would be wrong. Ray’s point is that an MM force field is ALWAYS “empirical” because it has been “empirically calibrated” to some set of data, no matter where you got that data (experiment or QM). “Ab initio” means “from first principles”, but any sort of calibration would preclude this. QM methods calculate strictly from theory, and so they qualify. In fact, some people don’t even like to call DFT with pseudopotentials “ab initio”.

My two and a half cents on the same story. The term “first principles” comes from a time when electrons, protons and neutrons were assumed to be the fundamental particles composing all matter (not counting the massless photon). The Schrodinger equation was the first successful attempt (formalism) to compute/predict the properties of matter by just combining the fundamental properties of such particles (masses, charges, spins) with their quantum nature (wave character). In theory (approximations used to establish the form of the Hamiltonian or that of the wave function aside), there are no free parameters in the quantum mechanical model of matter.

Empirical potentials in general tend to avoid explicitly describing the electrons as well as their wave character. The result is a bunch of ad hoc (better or worse motivated) expressions/functionals that (ideally) incorporate parameters to mimic the average behavior of the electrons under certain conditions, hence their empirical character. ReaxFF, REBO, BKS, OPLS et al. are examples covering different attempts to achieve this goal. Take ReaxFF, for example: you pretty much need a new parametrization every single time you aim to study a new system, never mind the battery of parameters tabulated inside the FF potential file. How can this ever be called a first-principles approach?

The best and worst ally of all empirical models is Moore’s law. These models were born from our need to compute and learn about the world while not waiting for computers to become faster. Yet they will perish as soon as they pass onto the other side of Moore’s curve. This has already been happening on certain systems where the CPU-GPU transition has allowed scientists to employ full first-principles methodologies to study the problem of interest in real time. Once you reach that point there is no way back to the empirical FF engine.
Carlos

Disclaimer: This message was composed specifically for the original author of the post, and so it does not aim to ignite a philosophical discussion on the pros and cons of FFs, the validity of Moore’s law, or the possible existence of a technological singularity.

Thank you to everyone for their extensive responses. They have indeed been super helpful. I understand why none of the potentials implemented in LAMMPS are first-principles but rather empirical fits to first-principles or experimental data. However, with terms in a potential (such as a Coulombic term), there is indeed some physics in that term (as it follows Coulomb’s law). In the culture, would this not be somewhat considered semi-empirical? Once again, I know it is not first-principles as no one is solving Schrodinger’s equation here, but since there is some physics involved, I would think it would be classified as semi-empirical. However, Dr. Shan did stipulate that anything with free parameters must be considered empirical.

However, if the culture says that QM calculations must be involved for it to be called even semi-empirical, I understand the reasoning.

Ben

Benjamin wrote:

Thank you to everyone for their extensive responses. They have indeed been super helpful. I understand why none of the potentials implemented in LAMMPS are first-principles but rather empirical fits to first-principles or experimental data…

comments:

Actually, potentials are derived from more complicated theories. For example, consider the Coulomb potential (1/r): it can be obtained from the electromagnetic Lagrangian. Or the Yukawa potential, which models nuclear interactions, can be derived from the Standard Model Lagrangian. Usually, once you have a Lagrangian density you can derive all the interaction forces from it. Wouldn’t it be awesome to have one theory, i.e. one Lagrangian, describing all the forces we find in nature?
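
For reference, and just as a sketch in standard notation rather than anything quoted from the thread, the two potentials mentioned above are usually written as

    V_{\mathrm{Coulomb}}(r) = \frac{q_1 q_2}{4\pi\varepsilon_0 r},
    \qquad
    V_{\mathrm{Yukawa}}(r) = -\frac{g^2}{r}\, e^{-r/\lambda},
    \quad \lambda = \frac{\hbar}{m c},

where g is the coupling constant and m the mass of the mediating particle; the Coulomb form is recovered in the massless-mediator limit, λ → ∞.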

Benjamin wrote:

Once again, I know it is not first-principles as no one is solving Schrodinger’s equation here, but since there is some physics involved, I would think it would be classified as semi-empirical. However, Dr. Shan did stipulate that anything with free parameters must be considered empirical.

comment:

Even the Schrodinger equation has limitations in the modeling of matter. For example, we should use the Dirac equation (relativistic quantum mechanics) to include the electron spin.

The many-body problem in quantum mechanics is really troublesome, and therefore it is necessary to use different approximations. One such successful theory is density functional theory (but even DFT has limitations, because it is a theory). Many semi-empirical potentials can be obtained from quantum mechanical theory. I can also say it is empirical because, if I made my own interaction potential and named it the “OG POT”, I might add an extra term ^___^ to the equation if I felt this extra term helps to model the interaction. Sometimes we must go to the experimental data and try to make a math model to fit the data.

Regards,

Oscar G,

> Thank you to everyone for their extensive responses. They have indeed been super helpful. I understand why none of the potentials implemented in LAMMPS are first-principles but rather empirical fits to first-principles or experimental data. However, with terms in a potential (such as a Coulombic term), there is indeed some physics in that term (as it follows

Coulomb’s law was developed much earlier than Schrodinger’s equation and wave function theory, so people do not consider Coulomb’s law part of the ab initio or first-principles family. Therefore, including Coulomb’s law does not make an empirical potential more ab initio.

> Coulomb’s law). In the culture, would this not be somewhat considered semi-empirical? Once again, I know it is not first-

Please be aware that the term “semi-empirical” means something very specific, particularly in computational chemistry, where it refers to a class of approximate Hartree-Fock methods. Please don’t use the word “semi-empirical” in MD.

> principles as no one is solving Schrodinger’s equation here, but since there is some physics involved, I would think it would be classified as semi-empirical. However, Dr. Shan did stipulate that anything with free parameters must be considered

Empirical potentials often have terms that include a lot of physics, such as van der Waals, Coulombics, exponentially decaying attraction/repulsion and even bond orders. Many of these terms are derived from or inspired by wave function theory. The thing is that adjusting, fitting and training the coefficients to reproduce QM results makes them empirical (see the sketch at the end of this message).

> empirical.

> However, if the culture says that QM calculations must be involved for it to be called even semi-empirical, I understand the

I cannot stress this enough: don’t use the word “semi-empirical” in the context of MD.

Ray
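
To make Ray’s last point concrete: a BKS-type pair interaction has a physically motivated functional form (Coulomb electrostatics plus a Buckingham-style exponential repulsion and an r^-6 dispersion term), but every coefficient in it is a fitted number. Here is a minimal Python sketch, with placeholder parameter values for illustration rather than the published BKS set:

import math

COULOMB_CONST = 14.399645  # e^2 / (4*pi*eps0) in eV * Angstrom

def bks_like_pair_energy(r, q_i, q_j, A, b, C):
    # E(r) = k*q_i*q_j/r + A*exp(-b*r) - C/r^6
    # Physically motivated form; A, b, C and the partial charges are
    # free coefficients that have to be fitted to reference data.
    return COULOMB_CONST * q_i * q_j / r + A * math.exp(-b * r) - C / r**6

# Hypothetical Si-O-like pair at r = 1.6 Angstrom with placeholder values.
print(bks_like_pair_energy(1.6, q_i=2.4, q_j=-1.2, A=18000.0, b=4.9, C=130.0))

Swap in a different fitted parameter set and the same functional form describes a different material, which is exactly why it is the fitting, not the form, that makes the model empirical.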