sphere particles - setting the angular velocity

joe,

please always copy the mailing list on your replies.
useful discussions should be archived in the mailing
list archives, so that people in a similar situation
can later look up the solutions there, in case the
discussion doesn't lead to a change to the code or
the documentation.

Thanks for the reply.

This program will be generating scripts that run a few thousand timesteps,
and then reserialise all the data back into python to calculate constraints
mid-simulation and work out where to add more elements - I don't want to add
the overhead of writing to a second file, which will slow things down even more.

i disagree: using all those individual
commands is usually *less* efficient.
also, the data file will be *much* more
compact than a file with all the individual
commands.

What I really want is a c++ interface to initialise and retrieve the data
myself programmatically, but that approach isn't documented so this seemed
quicker...

the main documentation is the source code,
which is pretty readable.

there also is the option to compile LAMMPS
as a library and use it from C or C++.
... and there is a python wrapper, too.

if you are worried about i/o, why not use
the python wrapper and feed it the text
directly? that would seem like an even
better approach to what you are trying
to do, based on the limited information
i have.

it would require a few additions to the
library interface that are specific to
granular simulations, but that would
be fairly straightforward.
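
e.g. a minimal, untested sketch, assuming
the stock python wrapper and a shared-library
build (the input script name is made up):

from lammps import lammps

lmp = lammps()               # wraps the C library interface via ctypes
lmp.file("in.granular")      # set up the system from an input script
for _ in range(10):
    lmp.command("run 1000")  # feed commands as strings, no extra files
    # inspect state here and decide where to add more elements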

when I say bug - the documentation doesn't explicitly say you can do this,
it seems to imply that 'rot yes' is only applicable when zeroing the

rot yes applies to zeroing the rotational momentum
of the _entire group_ when initializing particles to
a distribution of random velocities. this has no relevance
to what you are doing. in general, granular
media simulation is not a very common use of LAMMPS
(the LIGGGHTS fork would probably be a better place to
look at and work with), and thus support for relevant
features is - at best - spotty (which motivated the
existence of LIGGGHTS in the first place).
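
e.g. this is the case where it matters
(temperature and seed are made up):

velocity all create 300.0 482748 rot yes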

velocity - but that would leave no way to do it (apart from a data file)
when you can modify everything else...

nod. if you don't want to modify the sources,
a data file is all that you have left to do what
you want to do.

Attached is a script that, as far as I can tell, should set the angular
speed and doesn't: you can see that the angular y we try to set ends up
in the linear y. I also attach the output I get. (there also seems to be
something insane going on with the scaling of units, but I'll get to the

that is a common mistake. many commands
default to using coordinate data in terms of
lattice spacings rather than length units. add
"units box" and you are likely to get what you
are expecting.
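
e.g. (group name and numbers are made up):

velocity ball set 0.0 5.0 0.0 units box

without "units box" those values would be
taken in lattice spacings, as described above.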

cheers,
    axel.

tried something like "velocity ID set WX WY WZ rot yes"

The velocity command does not allow setting the angular velocity
(omega) of spherical particles. In what you have above,
WX, etc. will be the linear velocity. The rot keyword does
something else. See the doc page.

The set command could easily be extended to set omega
like it does angular momentum now (for aspherical particles).

However, I agree with Axel that if you want to do this
for many particles, writing a data file is the way to go.
The Velocities section of a data file defines 6 quantities
per atom for spherical particles: the linear and angular velocities
of each.
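
For example, with atom_style sphere each line of the section is
atom-ID, then vx vy vz, then wx wy wz (the values here are made up):

Velocities

1 0.0 0.0 0.0 0.0 5.0 0.0
2 0.1 0.0 0.0 0.0 0.0 2.5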

Steve

OK, replying to all as instructed.

The python interface is broken on my machine, because of the way I
compiled lammps - I wrote to Steve about this a while back, but failed
to report it properly, and just got on with trying to write a script
generator instead. I want to ultimately write a 'proper' set of Python
bindings to the simulator, which don't rely on dynamic linking - the
script generator is a 'version 1'. Having just had a peek inside
library.cpp it looks like even using the functions it lists will
require extensive understanding of the rest of the code; but that's
fine, I'll follow the breadcrumbs.

re: 'the source code is the documentation' - the src folder is a flat
directory structure with 521 files in it. A simple box diagram of the
structure of the program, where the different data structures live,
etc., would make it a lot easier to use lammps as a library. Maybe I'll
even draw one after writing a cleaner wrapper.

I have found liggghts even more difficult to compile than lammps! I
don't need any of the extra features it provides - I am only
interested in mechanical interactions, which lammps has already (and
which will be faster without having to check whether they're enabled).

I'll try the data file route, and when I have a working version 1 I'll
see about improving the speed with a proper programmatic interface.
(you can watch my progress, or lack thereof, on github by the way:
https://github.com/joe-jordan/pydem )

Thanks,

Joe

OK, replying to all as instructed.

The python interface is broken on my machine, because of the way I
compiled lammps - I wrote to Steve about this a while back, but failed
to report it properly, and just got on with trying to write a script
generator instead. I want to ultimately write a 'proper' set of Python
bindings to the simulator, which don't rely on dynamic linking - the

i don't understand what you are saying here. python modules
*have* to be DSOs and thus require dynamic linkage. this is
how most python packages, even those that ship with
python, are compiled.

script generator is a 'version 1'. Having just had a peek inside
library.cpp it looks like even using the functions it lists will
require extensive understanding of the rest of the code; but that's
fine, I'll follow the breadcrumbs.

well, you cannot have your cake and eat it.
you want to "mess" with LAMMPS' internal
data structures, so you have to learn what
data lives where, and you have to understand
some of the motivations, particularly around
parallelization, behind why LAMMPS is
structured the way it is. i agree that it is
intimidating at the beginning, but so is
every big package code, and there is no way
around it unless you are willing to sacrifice
performance or ease of adding new features.

the german physicist friedrich hund is quoted as
saying "the sum of all evil is a constant."
so for everything that makes things easier
on one side, you'll have to pay with some
complications elsewhere.

re: 'the source code is the documentation' - the src folder is a flat
directory structure with 521 files in it. A simple box diagram of the
structure of the program, where the different data structures live,
etc., would make it a lot easier to use lammps as a library. Maybe I'll
even draw one after writing a cleaner wrapper.

http://lammps.sandia.gov/doc/Developer.pdf

is a starting point. the second thing is that - for the most part -
you don't have to concern yourself with the many derived classes
and can stick to the interface classes.

I have found liggghts even more difficult to compile than lammps! I

i find the notion that lammps is difficult to compile confusing,
since in my experience it is one of the easiest software packages
to compile, at least in the scientific computing field.
there are some packages that are near impossible to compile
(*and* get to function correctly) that make compiling lammps look
like a walk in the park. it does require a little knowledge of
your computing and development environment, but for somebody
developing software that should not be a problem. if it is, then
you have some bigger issues waiting for you down the line.

cheers,
    axel.

apologies to all for yet another email in this thread.

OK, replying to all as instructed.

The python interface is broken on my machine, because of the way I
compiled lammps - I wrote to Steve about this a while back, but failed
to report it properly, and just got on with trying to write a script
generator instead. I want to ultimately write a 'proper' set of Python
bindings to the simulator, which don't rely on dynamic linking - the

i don't understand what you are saying here. python modules
*have* to be DSOs and thus require dynamic linkage. this is
how most python packages, even those that ship with
python, are compiled.

to clarify - your python wrapper uses CDLL() to open a binary C
library and call the functions inside. This is most certainly not how
normal python C extensions work (cf. scipy, numpy, ...); they call
compilers when you run setup.py, so you know that compilation and
linking have worked once you've 'built' the python library. They are
statically linked at compile time to the relevant C code, and dynamically
loaded from python land - yours dynamically loads the C library at
runtime, making it incredibly difficult to debug why the import fails.
I'm not saying it's bad - it exposes the C interface in a few lines of
python, very neat - just that it's unconventional and difficult to fix
if it doesn't 'magically work'.
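
i.e. as far as I can tell the import boils down to something like
this (a simplified sketch; the actual library name and search path
depend on the build):

from ctypes import CDLL

# resolved only at runtime: if the shared object is missing or has
# unresolved symbols (e.g. MPI ones), this raises an OSError with
# very little context
lib = CDLL("liblammps.so")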

script generator is a 'version 1'. Having just had a peek inside
library.cpp it looks like even using the functions it lists will
require extensive understanding of the rest of the code; but that's
fine, I'll follow the breadcrumbs.

well, you cannot have your cake and eat it.
you want to "mess" with LAMMPS' internal
data structures, so you have to learn what
data lives where, and you have to understand
some of the motivations, particularly around
parallelization, behind why LAMMPS is
structured the way it is. i agree that it is
intimidating at the beginning, but so is
every big package code, and there is no way
around it unless you are willing to sacrifice
performance or ease of adding new features.

I don't want to mess with lammps internal structures; I want to be
able to assign and retrieve attributes on atoms, before and after the
simulation has run a batch of timesteps. From the source:

/* ----------------------------------------------------------------------
   extract a pointer to an internal LAMMPS atom-based entity
   name = desired quantity, e.g. x or mass
   returns a void pointer to the entity
     which the caller can cast to the proper data type
   returns a NULL if Atom::extract() does not recognize the name
   customize by adding names to Atom::extract()
------------------------------------------------------------------------- */

void *lammps_extract_atom(void *ptr, char *name)
{
  LAMMPS *lmp = (LAMMPS *) ptr;
  return lmp->atom->extract(name);
}

ptr is the library class instance, and name is the property you want -
how on earth do you specify which atom you're requesting the property
from? The user will need to investigate the LAMMPS::atom property to
see if it's a list or static/singleton pattern, and so on. The
interface isn't 'clean' in that you need to understand how the
implementation works in order to use it. Also, to be pernickety, that
comment should really be in the header file, where you would normally
define an interface, rather than the C++ implementation file.
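
From poking around in atom.cpp, my best guess is that the returned
pointer spans *all* the local atoms, and the caller indexes into it
after casting. Via the python wrapper (with lmp a lammps() instance)
that would look something like this (untested; the trailing argument
encodes the C type in lammps.py, 3 meaning double**):

x = lmp.extract_atom("x", 3)      # per-atom coordinate array
print(x[0][0], x[0][1], x[0][2])  # coordinates of the first local atom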

the german physicist friedrich hund is quoted as
saying "the sum of all evil is a constant."
so for everything that makes things easier
on one side, you'll have to pay with some
complications elsewhere.

re: 'the source code is the documentation' - the src folder is a flat
directory structure with 521 files in it. A simple box diagram of the
structure of the program, where the different data structures live,
etc., would make it a lot easier to use lammps as a library. Maybe I'll
even draw one after writing a cleaner wrapper.

http://lammps.sandia.gov/doc/Developer.pdf

is a starting point. the second thing is that - for the most part -
you don't have to concern yourself with the many derived classes
and can stick to the interface classes.

(apologies - I missed this. the link is right there on the homepage! :-) )

I have found liggghts even more difficult to compile than lammps! I

i find the notion that lammps is difficult to compile confusing,
since in my experience it is one of the easiest software packages
to compile, at least in the scientific computing field.
there are some packages that are near impossible to compile
(*and* get to function correctly) that make compiling lammps look
like a walk in the park. it does require a little knowledge of
your computing and development environment, but for somebody
developing software that should not be a problem. if it is, then
you have some bigger issues waiting for you down the line.

I am comparing lammps to scipy in my benchmark for 'easy to compile' -
'sudo python setup.py install' is my usual experience. In lammps' case,
it might need an additional argument to specify whether to try
to build with MPI. As it stands, I had to write a makefile, download
and build copies of fftw and libjpeg, and then call make and peer at
the output. My fault for using a mac, I guess.

cheers,
axel.

Thanks for your help - I should be on the right track now.

Joe

apologies to all for yet another email in this thread.

there is nothing bad about that,
as long as it leads to helpful
information and the resolution of problems.

since this is covering new ground,
it is certainly more interesting than
seeing the same questions being
posted over and over again.

to clarify - your python wrapper uses CDLL() to open a binary C
library and call the functions inside. This is most certainly not how
normal python C extensions work (cf. scipy, numpy, ...); they call
compilers when you run setup.py, so you know that compilation and
linking have worked once you've 'built' the python library. They are

that is part right and part wrong. the setup.py file(s)
in the python directory *do* call the compiler to
compile and link the DSO. using CDLL() on it
is indeed unusual, though.

statically linked at compile time to the relevant C code, and dynamically

that is not correct. you must not have static linkage in a DSO.

loaded from python land - yours dynamically loads the C library at

*any* compiled object on a modern unix machine
needs to load the C library. i don't understand
why this is a problem.

runtime, making it incredibly difficult to debug why the import fails.
I'm not saying it's bad - it exposes the C interface in a few lines of
python, very neat - just that it's unconventional and difficult to fix
if it doesn't 'magically work'.

i've compiled and installed a lot of python packages myself
(mainly for deployment on HPC clusters) and have been through
python's special "dependency" hell and back multiple times.
from that experience i cannot say that either approach is
better or worse. it all depends on how self-contained a package
is, how much time is dedicated to developing and testing it, and
how many people report problems in a way that lets them
be resolved. numpy and scipy definitely stand out as
rather well maintained packages that seem to be sufficiently
coordinated in their development. that is more the exception
than the rule. things in python land are fairly easy as long
as you stay within its boundaries and have a fully self-consistent
and self-contained setup. things get pretty messy pretty
fast once that is no longer the case.

[...]

I am comparing lammps to scipy in my benchmark for 'easy to compile' -
'sudo python setup.py install' is my usual experience. In lammps' case,
it might need an additional argument to specify whether to try
to build with MPI. As it stands, I had to write a makefile, download
and build copies of fftw and libjpeg, and then call make and peer at
the output. My fault for using a mac, I guess.

neither fftw nor libjpeg is required, and that
is explicitly mentioned in the general compilation
documentation as well as in the
specific instructions for the python interface.

http://lammps.sandia.gov/doc/Section_python.html

the serial version based on a minimal source tree
with just the granular package enabled should
indeed "just work".

axel.

In fact, on trying to reproduce the problem, it appears there isn't
one! A simple 'sudo python setup_serial.py install' builds and
installs the python library with no problems. I suspect things like
jpeg support are missing, but that's no biggie (I'm weird enough to
have written my own renderers already).

I think what I was doing wrong before was trying to compile the C
binary myself, and drop it in place. The error it gave was something
about some MPI symbols being missing (although it took me an hour or
so to actually find the correct exception, as there were so many
try/except blocks in the region of the CDLL call!)

In any case it's all working for me now, including a bug I hadn't even
reported. My ideas for improving the interface can go into a patch
for the python lib, I guess :-)

Thanks for your help,

Joe