[lammps-users] about MEAM source code

Ajing,

Both the SW and Tersoff potentials use ghost atoms. Otherwise you would not achieve any significant parallel efficiency beyond a few processors. Take a look at the papers on spatial decomposition for MD.

About the MEAM potential, I believe it still uses the Fortran sources, since that was a fast and reasonable way to include the force field in LAMMPS. You are right that mixed-language compilation is more complex, especially if you are using different compilers (xlf, g++, etc.), but you are more than welcome to port the code to C/C++.

Regards, Javier


Javier,

Have you looked at the LAMMPS source code yet? Could you please point me to where I can find how these ghost atoms are treated? Yes, I know that in the general case ghost atoms are required for parallel computing, but I was curious why this would not be the case for SW and Tersoff.

Now, I have a similar Fortran version, but I discovered its speed is pretty low. So I am planning to incorporate it into LAMMPS in C++. But I will need Steve's help to figure out all the related issues.

Ajing

Pair SW and Tersoff use ghost atoms. They are in the neighbor list.

Re: Fortran, it's because the author (Greg Wagner) had a previous
implementation of MEAM in Fortran, and the quickest route to
getting it working was to keep the majority of the code and wrap
it in a bit of C++ for inclusion in LAMMPS. If you have the time/energy
to rewrite it in C++, more power to you. But there is nothing inherently
slower about Fortran than C or C++. Unless the Fortran version
is doing something inefficient, I don't know why a C++ version
would become faster.

Steve

Thanks, Javier and Steve,

Obviously, I need to do more learning on LAMMPS. Email communication is part of that :)

Re: Fortran, as you said, if C++ would not do a better job than Fortran, then I should probably focus on the current version. The poor efficiency of my code is probably due to a couple of extra loops over the neighbor list (for the many-body terms). I will study it further. Thanks again.

Ajing