GPU in LAMMPS

Dear All,

Our group is going to use GPUs with the LAMMPS software, and we would like to know:
How should we design the GPU system so that it works effectively with LAMMPS?
And what is the maximum number of GPUs that can be used in LAMMPS?

Thank you

Best regards,
rizal

Hi

In principle, every high-end "gaming" PC with NVIDIA GPUs is well suited to run LAMMPS with either the GPU or the USER-CUDA package (the official release of the latter is coming soon, though the code is already included in the download).
So for detailed advice, check the appropriate forums, or maybe buy one or two PC gaming magazines with hardware advice.

Both packages can make good use of multiple GPUs, either by running multiple simulations at the same time or by using multiple GPUs for one MD run.
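For a rough idea of what that looks like in practice (a sketch only; the executable and input file names are hypothetical, and the exact package/suffix options depend on your LAMMPS version):

  # one MD run spread over the GPUs in one box, shared by 4 MPI ranks
  # (needs gpu-enabled styles, e.g. via the -sf gpu suffix switch or a
  #  "package gpu" command plus */gpu pair styles in the input script):
  mpirun -np 4 lmp_gpu -sf gpu -in in.mysystem

  # or run two independent simulations at the same time, one per GPU,
  # e.g. by pinning each job to a device with CUDA_VISIBLE_DEVICES:
  CUDA_VISIBLE_DEVICES=0 lmp_gpu -sf gpu -in in.run1 &
  CUDA_VISIBLE_DEVICES=1 lmp_gpu -sf gpu -in in.run2 &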

One important consideration: if you want to run an X server on the machine as well, you should add a low-end GPU for that (or use an onboard GPU).

Besides the obvious questions (which GPUs/CPUs to use), the mainboard is an important choice. It should offer at least two x16 PCIe 2.0 slots (and make sure that the GPUs can run in at least x8/x8 mode).
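A quick sanity check after assembly (a sketch; 10de is the NVIDIA PCI vendor ID, and some fields are only visible when run as root):

  # show the maximum and negotiated PCIe link width for the NVIDIA devices;
  # LnkCap is what the card supports, LnkSta is what it actually got:
  lspci -vv -d 10de: | grep -E "LnkCap|LnkSta"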

The same considerations as for a gaming PC also apply to a server machine.

I would not use factory-overclocked GPUs; a little more reliability is probably worth it when you run jobs 24/7.

In terms of GPUs, I would take either
2x GTX 570 (270 € with 1.2 GB and 350 € with 2.5 GB)
or
2x GTX 580 (400 € with 1.5 GB and 500 € with 3.0 GB)

I would not opt for the GTX 590 right now, because it produces more heat, and a dual-GPU card cuts the available host bandwidth per GPU by another factor of two when you run multi-GPU simulations.

If you are considering buying more than one machine, remember that an Ethernet network is not a viable solution for multi-node GPU simulations. You more or less need something like InfiniBand for that.

The professional GPUs are somewhat slower than the consumer cards but offer ECC, if you want that. (I doubt that it is necessary: if bit flips occurred regularly, I would expect them to show up as segmentation faults due to invalid atom IDs in the neighbor lists, which make up most of the memory requirements, and that is something I have not seen during 6 months of nonstop use of 30 GTX 470s.)
On the other hand, they cost about 8 times as much as the consumer GPUs for the same performance.

Also, if you use such a dual-GPU solution, you don't need a 1500 W power supply; ~900 W should be more than adequate.

Here you can find some GPU comparisons in single and double precision running the USER-CUDA package: http://forums.nvidia.com/index.php?showtopic=193770

The GTX 570 should be about as fast as the GTX 480, and the GTX 580 about 20% faster.

Christian

-------- Original Message --------

Dear All,

Our group is going to use GPUs with the LAMMPS software, and we would like to know:

I would recommend first buying just one upper-range GPU, testing with it,
and seeing how it performs for the kind of simulations that you are running.
What degree of speedup you see and how well it works depends a lot
on what simulations you want to do and how.
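One simple way to get that number, sketched here with hypothetical file names and assuming a LAMMPS binary built with the GPU package:

  # CPU-only reference run:
  mpirun -np 8 lmp_gpu -in in.mysystem -log log.cpu
  # the same input with gpu-suffixed styles enabled:
  mpirun -np 8 lmp_gpu -sf gpu -in in.mysystem -log log.gpu
  # compare the "Loop time" lines at the end of the two log files:
  grep "Loop time" log.cpu log.gpu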

How should we design the GPU system so that it works effectively with LAMMPS?

There is not one answer to this. It all depends on your budget, the experience of
your system administrators, your knowledge of PC technology, power supply and
cooling capacities, vendor expertise, and tolerance for failure.

You can go for "gaming" hardware and save a lot of money, but you have to
be careful not to overload the PCIe bus. Some mainboards use a "switch"
that allows them to have more than two x16 PCIe slots with just one south
bridge chip, but using a third or even fourth GPU in such systems will
result in overloading the PCIe bus. For some types of GPU-accelerated
LAMMPS simulations the impact will be minor, for others significant.

Be wary of vendors that are pushing dual or even 4-way 8-core or 12-core
Opteron processor mainboards with GPUs. I've seen a lot of those recently,
but they are a bad choice for GPU computing, particularly with
LAMMPS (for now).

And what is the maximum number of GPUs that can be used in LAMMPS?

Before you hit the limit in LAMMPS, you will run out of money. ;-)

cheers,
    axel.

Dear Christian and Axel,

Thanks for your advice, I really appreciate it. We can now consult the GPU experts here (in Taiwan) based on your advice.

Thank you,

best Regards,
rizal

2011/7/11 Axel Kohlmeyer <[email protected]>