FailResponseException while paginating the Pipedrive API

Hello,
I have a problem fetching all activities and emails.
Because the Pipedrive API does not support getting activities and emails for a specific lead, I have to fetch all of them and then filter them per lead afterwards (a small sketch of that filtering step follows the main code below).
There are a lot of them (over 150k).
When executing the code below, I get the following error on a random request: it can fail on the 10th request just as easily as on the 110th.
Code:

// activitiesApiInstance is the Pipedrive API client's activities API instance, created elsewhere.
// The collection endpoint cannot be filtered by lead, so this fetches every activity page by page.
async function getAllActivitiesForLead(leadId) {
    let activities = [];
    let cursor = null;
    const limit = 500;

    const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

    try {
        while (true) {
            console.log(`Fetching activities for leadId ${leadId}, cursor: ${cursor}, limit: ${limit}`);

            const response = await activitiesApiInstance.getActivitiesCollection({
                cursor,
                limit
            });

            const { data, additional_data } = response;

            if (!data || data.length === 0) {
                console.warn(`No activities found for leadId ${leadId} with cursor ${cursor}.`);
                break;
            }

            activities = activities.concat(data);

            if (!additional_data || !additional_data.next_cursor) {
                console.log(`All activities fetched for leadId ${leadId}. Total: ${activities.length}`);
                break;
            }

            cursor = additional_data.next_cursor;

            await sleep(2000);
        }
    } catch (error) {
        console.error(`Error fetching activities for leadId ${leadId}:`, error);
        throw error;
    }

    return activities;
}
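
Once everything is fetched, the per-lead step I described above is roughly this (a simplified sketch, not my exact code; I'm assuming here that each activity object exposes a lead_id field, since that is the link to a lead I rely on):

// Simplified illustration only: pick out the already-fetched activities that belong to one lead.
// Assumption: each activity has a lead_id field pointing at the lead it belongs to.
function filterActivitiesByLead(allActivities, leadId) {
    return allActivities.filter((activity) => activity.lead_id === leadId);
}

// Example usage after the fetch above:
// const allActivities = await getAllActivitiesForLead(leadId);
// const activitiesForLead = filterActivitiesByLead(allActivities, leadId);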

Error:

FailResponseException {
    success: false,
    message: 'Timeout of 60000ms exceeded',
    errorCode: undefined,
    context: undefined
}

I tried increasing the timeouts and the delay between requests, but it made no difference, so I suspect the problem is not on my side, although I don't rule out that it could still be my fault.
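
For reference, this is the kind of wrapper I experimented with around each page request; the delay and retry values here are illustrative, not the exact numbers I used:

// Illustrative only: the same getActivitiesCollection call, retried a few times
// with an increasing delay before giving up on the page.
async function fetchPageWithRetry(cursor, limit, maxAttempts = 3) {
    const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

    for (let attempt = 1; attempt <= maxAttempts; attempt++) {
        try {
            return await activitiesApiInstance.getActivitiesCollection({ cursor, limit });
        } catch (error) {
            console.warn(`Page request failed (attempt ${attempt}/${maxAttempts}):`, error.message);
            if (attempt === maxAttempts) throw error;
            // Wait longer after each failed attempt (5s, 10s, 15s, ...).
            await sleep(5000 * attempt);
        }
    }
}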
Please help, this is a very important and urgent project.
Thanks for your help!