Hi LAMMPS-users,
I am trying to run the rhodopsin input file on a GPU and am getting this error: Cuda error: Cuda_NeighborBuild: neighbor build kernel execution failed in file 'neighbor.cu' in line 252 : unspecified launch failure.
Can anyone help me here?
LAMMPS (2 Jul 2013)
Using LAMMPS_CUDA
USER-CUDA mode is enabled (…/lammps.cpp:394)
using 1 OpenMP thread(s) per MPI task
CUDA: Activate GPU
Using device 0: Tesla K20Xm
Scanning data file …
4 = max bonds/atom
8 = max angles/atom
18 = max dihedrals/atom
2 = max impropers/atom
Reading data file …
orthogonal box = (-27.5 -38.5 -36.3646) to (27.5 38.5 36.3615)
1 by 1 by 1 MPI processor grid
32000 atoms
32000 velocities
27723 bonds
40467 angles
56829 dihedrals
1034 impropers
Finding 1-2 1-3 1-4 neighbors …
4 = max # of 1-2 neighbors
12 = max # of 1-3 neighbors
24 = max # of 1-4 neighbors
26 = max # of special neighbors
Replicating atoms …
orthogonal box = (-27.5 -38.5 -36.3646) to (192.5 269.5 254.54)
1 by 1 by 1 MPI processor grid
2048000 atoms
1774272 bonds
2589888 angles
3637056 dihedrals
66176 impropers
Finding 1-2 1-3 1-4 neighbors …
4 = max # of 1-2 neighbors
12 = max # of 1-3 neighbors
24 = max # of 1-4 neighbors
26 = max # of special neighbors
Finding SHAKE clusters …
103488 = # of size 2 clusters
232512 = # of size 3 clusters
47808 = # of size 4 clusters
270912 = # of frozen angles
PPPMCuda initialization …
G vector = 0.24522
grid = 96 125 120
stencil order = 5
absolute RMS force accuracy = 0.0300582
relative force accuracy = 9.05193e-05
brick FFT buffer size/proc = 1641250 1440000 146250
WARNING: # CUDA: You asked for the usage of Coulomb Tables. This is not supported in CUDA Pair forces. Setting is ignored.
(…/pair_lj_charmm_coul_long_cuda.cpp:171)
CUDA: VerletCuda::setup: Allocate memory on device for maximum of 2048000 atoms…
CUDA: Using precision: Global: 4 X: 8 V: 8 F: 4 PPPM: 4
Setting up run …
CUDA: VerletCuda::setup: Upload data…
Cuda error: Cuda_NeighborBuild: neighbor build kernel execution failed in file 'neighbor.cu' in line 252 : unspecified launch failure.
Thanks & Regards,
Padma Pavani