[lammps-users] Problems allocating large arrays

Hi everybody,

I’m trying to create a fix that uses a large 3d array (number of atoms x 3 x a size I choose). I can choose the number of atoms I want, but when I try to make the “size I choose” larger than a specific number (around 9000 in my case), no matter how many atoms I create, LAMMPS tries to allocate a negative amount of memory and then crashes. I’m using valgrind to try to track down the problem, but I can’t find where this happens. I just wondered if any of you have ever faced a problem like this.

Regards,

Alexandre

alexandre,

> Hi everybody,
>
> I'm trying to create a fix that uses a large 3d array ( atom numbers x 3 x
> size I choose ). I can choose the number of atoms I want, but when I try to
> choose the "size I choose" larger than a specific number ( around 9000 in my
> case ), no matter how many atoms I created, LAMMPS tries to allocate a
> negative memory, and then it crashes. I'm using valgrind to try to find the
> problem, but I couldn't find where this happens. I just wondered if anyone
> of you have ever faced a problem like that.

have you computed how large a block of memory that would be?
3-dimensional arrays have a habit of "exploding" in size very fast.

since the code in memory.cpp uses "int" to compute the amount
of memory needed, you are limited to about 2 gigabytes per malloc.
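
just to illustrate the arithmetic (the numbers below are made up, but in
the ballpark you describe):

  #include <cstdio>

  int main() {
    // made-up sizes: 10000 atoms x 3 x 9000 doubles
    int natoms = 10000, ncols = 9000;

    // computed in plain int arithmetic, this exceeds 2^31 bytes and
    // typically wraps around to a negative number
    int nbytes_int = natoms * 3 * ncols * (int) sizeof(double);

    // the same count done in 64-bit arithmetic shows the real requirement
    long long nbytes_ll =
      (long long) natoms * 3LL * ncols * (long long) sizeof(double);

    printf("int arithmetic:    %d bytes\n", nbytes_int);
    printf("64-bit arithmetic: %lld bytes (~%.2f GB)\n",
           nbytes_ll, nbytes_ll / 1e9);
    return 0;
  }

with smaller dimensions the two results agree; the wraparound only shows
up once the request crosses the ~2 GB mark.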

axel.

The fact is that I did the test with just one particle and still had the same problem. But anyway, I’ll check the memory.

2011/2/25 Axel Kohlmeyer <[email protected]>

> The fact is that I did the test for just one particle, and still have the
> same problem. But anyway, I'll check the memory.

the other possibility is that you are using uninitialized variables,
or that you are corrupting memory somewhere. the next step would be
to compile with debug info and set a breakpoint on the
method in Memory that you are using.
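
for example, something along these lines (the breakpoint name depends on
which Memory method your fix actually calls; "lmp_serial" and "in.test"
are just placeholders for your binary and input script):

  gdb ./lmp_serial
  (gdb) break Memory::create_3d_double_array
  (gdb) run < in.test
  (gdb) info args
  (gdb) bt

once it stops, "info args" shows the dimensions that were actually
requested and "bt" shows where the call came from. for that to work
you need to rebuild with -g (and ideally -O0) in the makefile's
compiler flags.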

to give any kind of qualified advice, you have to post your code.

axel.