RecursionError: maximum recursion depth exceeded

When I try to download band gap data with the API, I can only retrieve 1000 records at a time before the script fails with 'RecursionError: maximum recursion depth exceeded'. Is there any way to deal with this problem? My code is as follows:

import pandas as pd
from mp_api.client import MPRester

api_key = "..."  # my Materials Project API key

def main():
    with MPRester(api_key) as mpr:
        # Query summary docs for all Ti-O materials, requesting only the needed fields
        docs = mpr.summary.search(
            elements=["Ti", "O"],
            fields=["material_id", "band_gap", "formula_pretty"],
        )

        # Collect the requested fields into a list of dicts, one per material
        mpid_bgap_dict = [
            {
                "material_id": doc.material_id,
                "band_gap": doc.band_gap,
                "formula_pretty": doc.formula_pretty,
            }
            for doc in docs
        ]

        df = pd.DataFrame(mpid_bgap_dict)
        # df = pd.DataFrame(docs, columns=["material_id", "band_gap"])

        df.to_csv("D:/nps/mpfiles/band_gap_data.csv", index=False)
        print("CSV file saved successfully")

if __name__ == "__main__":
    main()

Try initializing the DataFrame with DataFrame.from_records(). See pandas.DataFrame.from_records — pandas 2.1.3 documentation
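
For example, a minimal sketch (mpid_bgap_dict being the list of dicts built in your script):

    import pandas as pd

    # mpid_bgap_dict is the list of dicts from the script above
    df = pd.DataFrame.from_records(mpid_bgap_dict)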

I'm very glad that you replied so quickly, but your suggestion doesn't work.
I changed the line "df = pd.DataFrame(mpid_bgap_dict)" to "df = pd.DataFrame.from_records(mpid_bgap_dict)", but I still get the same error.

Please send the full error traceback. Thanks!

OK, the error traceback is as follows:
C:\Users\bnx00\AppData\Local\Programs\Python\Python310\lib\site-packages\mp_api\client\mprester.py:182: UserWarning: mpcontribs-client not installed. Install the package to query MPContribs data, or construct pourbaix diagrams: 'pip install mpcontribs-client'
  warnings.warn(
Retrieving SummaryDoc documents:   0%|          | 0/5279 [00:00<?, ?it/s]
Traceback (most recent call last):
  File "C:\Users\bnx00\AppData\Local\Programs\Python\Python310\lib\site-packages\mp_api\client\core\client.py", line 579, in _submit_requests
    data_tuples = self._multi_thread(use_document_model, params_list, pbar, timeout)
  File "C:\Users\bnx00\AppData\Local\Programs\Python\Python310\lib\site-packages\mp_api\client\core\client.py", line 643, in _multi_thread
    data, subtotal = future.result()
  File "C:\Users\bnx00\AppData\Local\Programs\Python\Python310\lib\concurrent\futures\_base.py", line 451, in result
    return self.__get_result()
  File "C:\Users\bnx00\AppData\Local\Programs\Python\Python310\lib\concurrent\futures\_base.py", line 403, in __get_result
    raise self._exception
  File "C:\Users\bnx00\AppData\Local\Programs\Python\Python310\lib\concurrent\futures\thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "C:\Users\bnx00\AppData\Local\Programs\Python\Python310\lib\site-packages\mp_api\client\core\client.py", line 685, in _submit_request_and_process
    response = self.session.get(
  File "C:\Users\bnx00\AppData\Local\Programs\Python\Python310\lib\site-packages\requests\sessions.py", line 602, in get
    return self.request("GET", url, **kwargs)
  File "C:\Users\bnx00\AppData\Local\Programs\Python\Python310\lib\site-packages\requests\sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
  File "C:\Users\bnx00\AppData\Local\Programs\Python\Python310\lib\site-packages\requests\sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
  File "C:\Users\bnx00\AppData\Local\Programs\Python\Python310\lib\site-packages\requests\adapters.py", line 486, in send
    resp = conn.urlopen(
  File "C:\Users\bnx00\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\connectionpool.py", line 769, in urlopen
    conn = self._get_conn(timeout=pool_timeout)
  File "C:\Users\bnx00\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\connectionpool.py", line 291, in _get_conn
    if conn and is_connection_dropped(conn):
  File "C:\Users\bnx00\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\util\connection.py", line 20, in is_connection_dropped
    return not conn.is_connected
  File "C:\Users\bnx00\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\connection.py", line 257, in is_connected
    return not wait_for_read(self.sock, timeout=0.0)
  File "C:\Users\bnx00\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\util\wait.py", line 117, in wait_for_read
    return wait_for_socket(sock, read=True, timeout=timeout)
  File "C:\Users\bnx00\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\util\wait.py", line 110, in wait_for_socket
    return wait_for_socket(sock, read, write, timeout)
  File "C:\Users\bnx00\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\util\wait.py", line 110, in wait_for_socket
    return wait_for_socket(sock, read, write, timeout)
  File "C:\Users\bnx00\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\util\wait.py", line 110, in wait_for_socket
    return wait_for_socket(sock, read, write, timeout)
  [Previous line repeated 981 more times]
  File "C:\Users\bnx00\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\util\wait.py", line 106, in wait_for_socket
    if _have_working_poll():
RecursionError: maximum recursion depth exceeded
Retrieving SummaryDoc documents: 19%|████████████████████████████████████▎ | 1000/5279 [00:02<00:10, 422.57it/s]

It's strange: I used this API to download about 150,000 structures earlier this year without any trouble, but now that I'm trying to get some feature data for further training, I run into this error.

Interesting. Can you confirm that you’re running the latest versions of mp-api and emmet-core? Thanks!

I'm using the same version as in September, mp-api 0.39.3. Would it be necessary to upgrade to the newest version?

Yes, it’s always advised to upgrade to the latest versions of mp-api and emmet-core.
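
Something like this should pull in the latest releases (assuming pip manages your environment):

    pip install -U mp-api emmet-core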

It didn't work. I still get the same error, and the 1000-records-at-a-time limit is still there.
Should I apply for a new API key?

A new API key probably wouldn't help here. It's hard to say, but this could be an issue with your internet connection dropping. If you're running this on unstable home WiFi, try running it from an interactive node on a compute cluster if you have access to one. You could also create a fresh Python environment and install the latest versions of everything to rule out any issues with your current environment. HTH.
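
For what it's worth, as far as I can tell the 1000 figure is just the client's default chunk size for paginated downloads, not a hard cap, so the query is most likely dying on a dropped connection mid-download. If that's the case, one workaround is to wrap the query in a simple retry loop. A rough sketch (the fetch_with_retries helper, its max_attempts/wait_seconds parameters, and the choice of caught exceptions are my own additions, not part of mp-api):

    import time
    from mp_api.client import MPRester

    api_key = "..."  # your Materials Project API key

    def fetch_with_retries(max_attempts=5, wait_seconds=10):
        # Re-run the whole query if it dies on a transient failure
        # (e.g. a dropped connection surfacing as the RecursionError above).
        for attempt in range(1, max_attempts + 1):
            try:
                with MPRester(api_key) as mpr:
                    return mpr.summary.search(
                        elements=["Ti", "O"],
                        fields=["material_id", "band_gap", "formula_pretty"],
                    )
            except (RecursionError, ConnectionError) as exc:
                print(f"Attempt {attempt} failed: {exc!r}")
                time.sleep(wait_seconds)
        raise RuntimeError("All retry attempts failed")

    docs = fetch_with_retries()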