<< Running error >>

Hello.

I successfully compiled GULP 6.2 using the Intel compilers from Intel oneAPI (Intel(R) 64, Version 2024.2.1 Build 20240711).

The issue is in running it: I get the following error:

mpirun -np 2 gulp example1.gin 
Abort(2139023) on node 0 (rank 0 in comm 0): Fatal error in PMPI_Init: Unknown error class, error stack:
MPIR_Init_thread(192)........: 
MPID_Init(1665)..............: 
MPIDI_OFI_mpi_init_hook(1625): 
open_fabric(2726)............: 
find_provider(2904)..........: OFI fi_getinfo() failed (ofi_init.c:2904:find_provider:No data available)

Running it serially as gulp example1.gin returns:

gulp example1.gin 
Abort(2139023) on node 0 (rank 0 in comm 0): Fatal error in PMPI_Init: Unknown error class, error stack:
MPIR_Init_thread(192)........: 
MPID_Init(1665)..............: 
MPIDI_OFI_mpi_init_hook(1625): 
open_fabric(2726)............: 
find_provider(2904)..........: OFI fi_getinfo() failed (ofi_init.c:2904:find_provider:No data available)
[unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=2139023
:
system msg for write_line failure : Bad file descriptor
Abort(2139023) on node 0 (rank 0 in comm 0): Fatal error in PMPI_Init: Unknown error class, error stack:
MPIR_Init_thread(192)........: 
MPID_Init(1665)..............: 
MPIDI_OFI_mpi_init_hook(1625): 
open_fabric(2726)............: 
find_provider(2904)..........: OFI fi_getinfo() failed (ofi_init.c:2904:find_provider:No data available)
[unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=2139023
:
system msg for write_line failure : Bad file descriptor

Program received signal SIGSEGV: Segmentation fault - invalid memory reference.

Backtrace for this error:
#0  0xbf5aca23c5d in ???
#1  0xbf5aca22d85 in ???
#2  0xbf5ac04531f in ???
	at ./signal/../sysdeps/unix/sysv/linux/x86_64/libc_sigaction.c:0
#3  0xbf5a21e27f4 in MPIR_Err_return_comm
	at ../../src/mpi/errhan/errutil.c:428
#4  0xbf5a230ea95 in PMPI_Init
	at ../../src/mpi/init/init.c:160
#5  0xbf5ac4ffece in pmpi_init_
	at ../../src/binding/fortran/mpif_h/initf.c:275
#6  0x557f0acb857b in ???
#7  0x557f0ad019c8 in ???
#8  0x557f09761a85 in ???
#9  0xbf5ac02a1c9 in __libc_start_call_main
	at ../sysdeps/nptl/libc_start_call_main.h:58
#10  0xbf5ac02a28a in __libc_start_main_impl
	at ../csu/libc-start.c:360
#11  0x557f09761ac4 in ???
#12  0xffffffffffffffff in ???
Segmentation fault (core dumped)

Any help is welcome!
Camps

It’s probably because there’s a mistake in the command line you use to run the program. There are two ways you can correctly do this:

mpirun -np 2 gulp < example1.gin

or

mpirun -np 2 gulp example1

With your original command, GULP appends the .gin extension and would look for an input file called “example1.gin.gin”.
Regards,
Julian

Hey, Julian.

Thanks for your answer, but that didn’t work. In both cases, I got a similar error:

mpirun -np 2 gulp < example1.gin 
Abort(2139023) on node 0 (rank 0 in comm 0): Fatal error in PMPI_Init: Unknown error class, error stack:
MPIR_Init_thread(192)........: 
MPID_Init(1665)..............: 
MPIDI_OFI_mpi_init_hook(1625): 
open_fabric(2726)............: 
find_provider(2904)..........: OFI fi_getinfo() failed (ofi_init.c:2904:find_provider:No data available)
Abort(2139023) on node 0 (rank 0 in comm 0): Fatal error in PMPI_Init: Unknown error class, error stack:
MPIR_Init_thread(192)........: 
MPID_Init(1665)..............: 
MPIDI_OFI_mpi_init_hook(1625): 
open_fabric(2726)............: 
find_provider(2904)..........: OFI fi_getinfo() failed (ofi_init.c:2904:find_provider:No data available)

and

mpirun -np 2 gulp example1
Abort(2139023) on node 0 (rank 0 in comm 0): Fatal error in PMPI_Init: Unknown error class, error stack:
MPIR_Init_thread(192)........: 
MPID_Init(1665)..............: 
MPIDI_OFI_mpi_init_hook(1625): 
open_fabric(2726)............: 
find_provider(2904)..........: OFI fi_getinfo() failed (ofi_init.c:2904:find_provider:No data available)

Regards,
Camps

In that case it’s hard to know what’s going on, as it’s specific to your machine/configuration. You should check that a basic MPI program (such as a pi calculator) works with your installation. It would be best to talk to one of your local system administrators to check that MPI is installed and configured properly, as this doesn’t seem to be a GULP problem per se.
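
For reference, the “OFI fi_getinfo() failed … No data available” message comes from libfabric failing to find a usable network provider, which often just means the oneAPI runtime environment isn’t fully set up, or that the default provider expects fabric hardware your machine doesn’t have. A minimal sanity check independent of GULP would look something like this (the setvars.sh path below is the default install location and the mpiicx wrapper name is from recent oneAPI releases — both may differ on your system):

source /opt/intel/oneapi/setvars.sh
fi_info -l

cat > hello.c <<'EOF'
#include <mpi.h>
#include <stdio.h>
int main(int argc, char **argv) {
    /* Minimal MPI program: each rank reports its rank and the world size. */
    MPI_Init(&argc, &argv);
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    printf("rank %d of %d\n", rank, size);
    MPI_Finalize();
    return 0;
}
EOF
mpiicx hello.c -o hello
mpirun -np 2 ./hello

fi_info -l lists the OFI providers libfabric can actually find on your machine. If the hello program fails with the same fi_getinfo error, forcing the TCP provider is a common fallback when no high-speed fabric is detected:

FI_PROVIDER=tcp mpirun -np 2 ./hello

If that works, running GULP the same way (FI_PROVIDER=tcp mpirun -np 2 gulp < example1.gin) should too.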
Regards,
Julian