Paramiko and remote workflow exec_command

Hello,

I am trying to use paramiko to launch, on a remote cluster, the execution of a Python script that creates the workflow. The script then generates a new workflow once the previous calculation is finished, and so on.
When I send the command from my laptop with paramiko's exec_command('python script.py'), the database authentication fails on the remote cluster when the script resets the database.

When I execute the script directly on the remote cluster, the workflow is created, so it is not a configuration error.

I also tried executing the Python script inside a sh job, but with the same result. It seems to be related to paramiko and the SSH client, but beyond that I have no idea where to look to solve this problem. Or maybe the issue is in the LaunchPad initialization? I am currently doing:
lpad = LaunchPad.auto_load()
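
Roughly, the remote script.py does something like this (a simplified sketch with a placeholder task; the real script builds the actual calculations):

from fireworks import LaunchPad, Firework, ScriptTask

# connect to the database using the FireWorks configuration found on the cluster
lpad = LaunchPad.auto_load()

# reset the database (this is the step where authentication fails over paramiko)
lpad.reset('', require_password=False)

# build and add a workflow (placeholder task here)
fw = Firework(ScriptTask.from_str('echo "hello"'))
lpad.add_wf(fw)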

I would appreciate any help!

Hi Florian

Have you tried setting the locations of the various configuration files directly / explicitly instead of using auto_load()? e.g.

lp = LaunchPad.from_file("path/to/file/on/remote")
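
(where the path would typically be the full, absolute path to your my_launchpad.yaml on the remote machine, so that nothing depends on what auto_load() happens to find in the paramiko session)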

Other than that, you can also check whether you can connect directly to your MongoDB collection via pymongo. This avoids the FireWorks codebase altogether. If you can get a pymongo connection working remotely through paramiko, it should also work with FireWorks.
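
For example, something along these lines (just a rough sketch; the host, user, password, and database name are placeholders to replace with the values from your my_launchpad.yaml):

import paramiko

# small pymongo connection test to run on the remote cluster
remote_test = (
    'python -c "'
    "from pymongo import MongoClient; "
    "db = MongoClient('mongodb://USER:PASSWORD@DBHOST:27017/DBNAME')['DBNAME']; "
    "print(db.command('ping'))"
    '"'
)

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('remote.cluster.address', username='your_user')

stdin, stdout, stderr = ssh.exec_command(remote_test)
print('stdout:', stdout.read().decode())
print('stderr:', stderr.read().decode())  # any authentication error should show up here
ssh.close()

If the ping succeeds through paramiko, you know the credentials and network access are fine and the problem is on the FireWorks configuration side.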