Rigid polyhedral unit

Dear Users,

I am simulating a system with ~100 polyhedral units and want to make them rigid. I am unable to do so using the ‘group’ feature, since LAMMPS does not allow more than 32 groups. The molecule ID does not work either, since the polyhedral units are connected to each other. Is there any way to make them rigid in LAMMPS?

Thanks,

the molecule ID is not connected to any structural feature and is merely a way to “partition” the system by assigning numbers to groups of atoms.
for biological systems, the molecule ID is often assigned to individual residues, not molecules.
fix rigid also has the option “custom” where you can use a property assigned via fix property/atom or an atom-style variable to define the rigid bodies.
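as a minimal sketch of the “custom” option with an atom-style variable (the body size of 20 atoms and the variable name are made up for illustration, assuming each body consists of consecutive atom IDs):

    # hypothetical: map each consecutive block of 20 atom IDs to one body
    variable bodyid atom ceil(id/20)
    fix 1 all rigid custom v_bodyid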

what is more important is whether the individual rigid units share atoms or not. if they do, you cannot use any of the fix rigid variants, but have to use fix poems instead. that fix allows reading the rigid body definitions from a file.
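for example, a sketch of the relevant command (the fix ID, group, and file name here are placeholders):

    fix 2 all poems file bodies.txt

where bodies.txt lists which atoms belong to each rigid body; see the fix poems doc page for the exact file format.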

Axel.

Thanks for the clarification. As you suggested, “fix poems” worked for me. However, when I run lmp_mpi on more than one processor (in parallel), it crashes with a segmentation fault. I compiled the code with Open MPI (as described in the README file). Below is the error message with 2 processors:

rcut in model: 6
ntypes in model: 3
[dcc-slogin-01:76568] *** Process received signal ***
[dcc-slogin-01:76568] Signal: Segmentation fault (11)
[dcc-slogin-01:76568] Signal code: Address not mapped (1)
[dcc-slogin-01:76568] Failing at address: (nil)
[dcc-slogin-01:76568] [ 0] /lib64/libpthread.so.0(+0xf630)[0x7f11d640e630]
[dcc-slogin-01:76568] [ 1] /dscrhome/mg422/LAMMPS/July21_MODAL/lammps-3Mar20/src/lmp_mpi(__intel_avx_rep_memcpy+0x444)[0xb245c4]
[dcc-slogin-01:76568] [ 2] /admin/apps/rhel7/openmpi-4.0.2-intel-18.0.2/lib/libopen-pal.so.40(opal_convertor_unpack+0x122)[0x7f11d54424f2]
[dcc-slogin-01:76568] [ 3] /admin/apps/rhel7/openmpi-4.0.2-intel-18.0.2/lib/openmpi/mca_pml_ob1.so(mca_pml_ob1_recv_request_progress_match+0x1cd)[0x7f11c3be1ebd]
[dcc-slogin-01:76568] [ 4] /admin/apps/rhel7/openmpi-4.0.2-intel-18.0.2/lib/openmpi/mca_pml_ob1.so(mca_pml_ob1_recv_req_start+0x2a8)[0x7f11c3be0ed8]
[dcc-slogin-01:76568] [ 5] /admin/apps/rhel7/openmpi-4.0.2-intel-18.0.2/lib/openmpi/mca_pml_ob1.so(mca_pml_ob1_irecv+0x1c0)[0x7f11c3bd0630]
[dcc-slogin-01:76568] [ 6] /opt/apps/rhel7/openmpi-4.0.2-intel-18.0.2/lib/libmpi.so.40(ompi_coll_base_bcast_intra_generic+0x624)[0x7f11d6ee3f54]
[dcc-slogin-01:76568] [ 7] /opt/apps/rhel7/openmpi-4.0.2-intel-18.0.2/lib/libmpi.so.40(ompi_coll_base_bcast_intra_binomial+0xae)[0x7f11d6ee4c8e]
[dcc-slogin-01:76568] [ 8] /admin/apps/rhel7/openmpi-4.0.2-intel-18.0.2/lib/openmpi/mca_coll_tuned.so(ompi_coll_tuned_bcast_intra_dec_fixed+0xf6)[0x7f11c357ab76]
[dcc-slogin-01:76568] [ 9] /opt/apps/rhel7/openmpi-4.0.2-intel-18.0.2/lib/libmpi.so.40(MPI_Bcast+0x67)[0x7f11d6ea6d37]
[dcc-slogin-01:76568] [10] /dscrhome/mg422/LAMMPS/July21_MODAL/lammps-3Mar20/src/lmp_mpi[0x648e41]
[dcc-slogin-01:76568] [11] /dscrhome/mg422/LAMMPS/July21_MODAL/lammps-3Mar20/src/lmp_mpi[0x5a1acf]
[dcc-slogin-01:76568] [12] /dscrhome/mg422/LAMMPS/July21_MODAL/lammps-3Mar20/src/lmp_mpi[0x596c32]
[dcc-slogin-01:76568] [13] /dscrhome/mg422/LAMMPS/July21_MODAL/lammps-3Mar20/src/lmp_mpi[0x46154d]
[dcc-slogin-01:76568] [14] /dscrhome/mg422/LAMMPS/July21_MODAL/lammps-3Mar20/src/lmp_mpi[0x462ad8]
[dcc-slogin-01:76568] [15] /dscrhome/mg422/LAMMPS/July21_MODAL/lammps-3Mar20/src/lmp_mpi[0xa894de]
[dcc-slogin-01:76568] [16] /lib64/libc.so.6(__libc_start_main+0xf5)[0x7f11d6053555]
[dcc-slogin-01:76568] [17] /dscrhome/mg422/LAMMPS/July21_MODAL/lammps-3Mar20/src/lmp_mpi[0x40c6e9]
[dcc-slogin-01:76568] *** End of error message ***

Please suggest how I can resolve this.

Thanks,
Mayank

this is impossible to debug remotely without a minimal but complete input deck for testing.

axel.