Error: Too many open files

I’m trying to optimize a potential in Python using SciPy’s minimizer. The minimizer runs a series of EAM calculations in LAMMPS, updates the potential, and then runs the EAM calculations again. After about 300 iterations I receive the following error. Any idea what the issue is?

Traceback (most recent call last):
  File "Reparameterization.py", line 8129, in <module>
    CostFunctionMinimized = minimize(CostFunction, FittingNumpy[:,1], args=Arguments, method="Nelder-Mead", options={"disp": True, "maxiter" : MinimizeMaxiter}, tol=1E-2)
  File "C:\Users\XXX\AppData\Local\Programs\Python\Python39\lib\site-packages\scipy\optimize\_minimize.py", line 611, in minimize
    return _minimize_neldermead(fun, x0, args, callback, bounds=bounds,
  File "C:\Users\XXX\AppData\Local\Programs\Python\Python39\lib\site-packages\scipy\optimize\optimize.py", line 793, in _minimize_neldermead
    fxc = func(xc)
  File "C:\Users\XXX\AppData\Local\Programs\Python\Python39\lib\site-packages\scipy\optimize\optimize.py", line 464, in function_wrapper
    return function(np.copy(x), *(wrapper_args + args))
  File "Reparameterization.py", line 3405, in CostFunction
    lmp.file(LammpsInFileAddress)
  File "C:\Users\XXX\AppData\Local\LAMMPS 64-bit 2Aug2023 with Python\Python\lammps\core.py", line 601, in file
    self.lib.lammps_file(self.lmp, path)
  File "C:\Users\XXX\AppData\Local\LAMMPS 64-bit 2Aug2023 with Python\Python\lammps\core.py", line 49, in __exit__
    raise self.lmp._lammps_exception
Exception: ERROR on proc 0: cannot open eam/alloy potential file M3_EgtSatZblPpm-2.eampot: Too many open files (src/potential_file_reader.cpp:59)

Well, you have too many open files. This error comes from the operating system, not from LAMMPS itself. You can read about it, for example, here: ulimit - Too many open files - Ask Ubuntu
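On Linux/macOS you can inspect (and, if needed, raise for the current session) the per-process open-file limit that triggers this error; the value 4096 below is only an example:

```shell
# Show the current soft limit on open file descriptors for this shell
ulimit -n

# Raise the soft limit for the current session (example value; the hard
# limit shown by `ulimit -Hn` is the ceiling you can raise it to)
# ulimit -n 4096
```

Raising the limit only delays the error, though; the real fix is to stop leaking file handles in the first place.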

Are you properly closing the created LAMMPS instances?
Python will not close them automatically until the entire run is complete.
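A minimal sketch of closing each instance explicitly inside the cost function, assuming the standard `lammps` Python module; the function and argument names here (`cost_function`, `in_file`) are hypothetical, not from the original script:

```python
def cost_function(params, in_file):
    # Import inside the function so the sketch only requires the
    # `lammps` module when actually called.
    from lammps import lammps

    lmp = lammps(cmdargs=["-log", "none"])  # fresh instance per evaluation
    try:
        lmp.file(in_file)              # run the LAMMPS input script
        energy = lmp.get_thermo("pe")  # example: read the potential energy
    finally:
        lmp.close()  # release the instance and its open file handles
    return energy

# `cost_function` can then be passed to scipy.optimize.minimize via `args`.
```

Without the `lmp.close()` call, every minimizer iteration leaves a LAMMPS instance (and its open files) alive until the interpreter exits, which eventually exhausts the OS file-descriptor limit.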

Thanks! That was the issue: Python does not close the instances automatically.

Thanks for your insights, that was helpful.