Q: Script to import from Pipedrive to google sheets for last 20 days

Question from StackOverflow
(Script to import from Pipedrive to google sheets for last 20 days - Stack Overflow)

I have this script for importing from Pipedrive to Google Sheets that works great. I would like it to only pull the matches from the last 20 days instead of from a specific date. I would also like to clear the rows each time and add them in fresh, rather than appending them to the next empty row.

function GetPipedriveDeals() {
  var ss = SpreadsheetApp.openById('sheet name'); // note: openById expects the spreadsheet ID, not its name
  var sheet = ss.getSheetByName("Sheet1");

  var url = "https://api.pipedrive.com/v1/activities?user_id=0&start=0&limit=500&start_date=2017-06-01&api_token=xxxxxxxxxxxxxxxxxxxx";

  var response = UrlFetchApp.fetch(url);
  var dataSet = JSON.parse(response.getContentText());
  var data;

  for (var i = 0; i < dataSet.data.length; i++) {
    data = dataSet.data[i];
    sheet.appendRow([data.user_id, data.type, data.add_time, data.note, data.org_name]);
  }
}

Reply from StackOverflow

You can use the start_date & end_date parameters to fetch the last 20 days of records, and sheet.clear() to clear the contents of the sheet. Refer to the code below. Hope this helps!

function GetPipedriveDeals() {
  var ss = SpreadsheetApp.openById('sheet name');
  var sheet = ss.getSheetByName("Sheet1");

  sheet.clear(); // clear the contents of the sheet

  var startDate = getStartDate();
  var endDate = Utilities.formatDate(new Date(), "GMT", "yyyy-MM-dd");
  var url = "https://api.pipedrive.com/v1/activities?user_id=0&start=0&start_date=" + startDate + "&end_date=" + endDate + "&api_token=xxxxxxxxxxxxxxxxxxxx";
  var response = UrlFetchApp.fetch(url);
  var dataSet = JSON.parse(response.getContentText());
  var data;

  for (var i = 0; i < dataSet.data.length; i++) {
    data = dataSet.data[i];
    sheet.appendRow([data.user_id, data.type, data.add_time, data.note, data.org_name]);
  }
}

function getStartDate() {
  var result = new Date();
  result.setDate(result.getDate() - 20); // 20 days ago
  return result.convertDate();
}

// format a Date as yyyy-MM-dd
Date.prototype.convertDate = function() {
  var mm = this.getMonth() + 1;
  var dd = this.getDate();

  return [this.getFullYear(),
          (mm > 9 ? '' : '0') + mm,
          (dd > 9 ? '' : '0') + dd
         ].join('-');
};
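For reference, the yyyy-MM-dd formatting done by the convertDate helper above can be reproduced in plain JavaScript, which makes it easy to sanity-check the 20-day window outside Apps Script. The helper names toYMD and startDateString below are illustrative, not from the thread; inside Apps Script, Utilities.formatDate(date, "GMT", "yyyy-MM-dd") does the same job.

```javascript
// Format a Date as yyyy-MM-dd, mirroring what the convertDate helper produces
// (uses local time; adjust for time zones if you need GMT semantics).
function toYMD(date) {
  var mm = date.getMonth() + 1; // getMonth() is zero-based
  var dd = date.getDate();
  return [date.getFullYear(),
          (mm > 9 ? '' : '0') + mm,
          (dd > 9 ? '' : '0') + dd].join('-');
}

// Start date N days back, e.g. 20 for this thread's use case.
function startDateString(daysBack) {
  var d = new Date();
  d.setDate(d.getDate() - daysBack); // setDate handles month/year rollover
  return toYMD(d);
}
```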

Sorry to pull up an old thread.

I’m using a similar query to pull today’s, yesterday’s, and this month’s total calls.

The code works okay, but when I enter today’s dates, i.e. start date 2019-01-28 to end date 2019-01-28, it doesn’t bring back all results from today. However, running start date 2019-01-28 to end date 2019-01-29 does pull all results.

I’ve tried using 2019-01-28 00:00:00 to 2019-01-28 23:59:59, but that doesn’t help.

On a side note, which date does the API use when selecting start and end date?

I think I’ve fixed it by changing:

Start Date = "2019-01-28 00:00:00",
End Date = "2019-01-28 23:59:59",

to:

Start Date = "2019-01-28 00:00:00",
End Date = "2019-01-28 24:00:00",

Glad it’s working :slight_smile:

Just as a side note, 2019-01-28 24:00:00 is probably automatically converted to 2019-01-29 00:00:00.
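That rollover matches how JavaScript’s own Date normalizes overflowing components; a quick illustrative sketch (not Pipedrive-specific):

```javascript
// JavaScript Date rolls overflowing components forward, so an hour value
// of 24 on Jan 28 becomes midnight on Jan 29.
var d = new Date(2019, 0, 28, 24, 0, 0); // months are zero-based: 0 = January
console.log(d.getMonth() + 1, d.getDate(), d.getHours());
```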

I tried your code but I’m getting an error. I want to pull in all Pipedrive deals to date, including custom fields.

Can you help? See error below

Hi. The ID passed to openById is wrong. Take a look at Google Apps Script’s documentation: https://developers.google.com/apps-script/reference/spreadsheet/spreadsheet-app#openById(String)

Hey Dani, so my code works perfectly fine on a separate workbook with about three sheets and performs relatively quickly. However, once on another workbook with many sheets, formulas, and conditional formatting, it performs quite slowly and does not end up retrieving all 4400 deals.

I understand that this is a performance bottleneck due to the number of sheets, data size, formulas & referencing, but is there any way I can overcome this without purchasing an upgrade to my G Suite account?
Currently it only pulls around 1000 rows.

Please see the code attached

The implementation of the pagination is wrong. You’re making way too many calls :grin:

There is a maximum number of items returned in any case (I believe it’s 500), so even if you pass limit=20000, that will not help.

Instead of doing a while(start<20001), you should look at the additional_data.pagination object to know if there are more items in the collection, and pass the start value to the next call accordingly.
Here you can find a more detailed explanation: https://pipedrive.readme.io/docs/core-api-concepts-pagination

I think that alone will improve your code and response time massively.

See if you can adjust the pagination and let me know if it helps :slight_smile:

Hey Dani, Thanks for the quick response !

So for the additional data pagination object is it enough to set it up as follows :

var url = "https://api.pipedrive.com/v1/deals?user_id=0&start=0&limit=20000&more_items_in_collection=true&next_start=500";

then fetch it with :

var response = UrlFetchApp.fetch(url + token);

I did that, and for some reason I’m still only getting 500 deals. Is there any way you could elaborate on the additional_data.pagination setup and initialization?

See code attached.

Here’s an example that I wrote for you of how to correctly use the pagination with Google Apps Scripts.
I hope it helps.

var token = '<your-token>';
var url = 'https://<your-company-domain>.pipedrive.com/v1/deals'
+ '?api_token=' + token
+ '&limit=500';
var deals = [];
var start = 0;
do {
  var response = JSON.parse(UrlFetchApp.fetch(url + '&start=' + start).getContentText());
  deals = deals.concat(response.data);
  start = response.additional_data.pagination.next_start;
} while (response.additional_data.pagination.more_items_in_collection);

and in deals you’ll have all of your deals ready to be used.
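Once the deals array is filled, writing it to the sheet in one batch with setValues is typically much faster than calling appendRow once per deal, which may help with the slowness mentioned in this thread. A sketch, where the buildRows helper and the chosen columns are illustrative rather than from the thread:

```javascript
// Map API deal objects to spreadsheet rows (columns chosen for illustration).
function buildRows(deals) {
  return deals.map(function(deal) {
    return [deal.org_name, deal.title, deal.status];
  });
}

// In Apps Script you would then write all rows in a single call:
//   var rows = buildRows(deals);
//   sheet.getRange(2, 1, rows.length, rows[0].length).setValues(rows);
```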

Thank you so much, Dani! It works perfectly fine, but for some reason it’s still very slow.

I managed to pull only 1135 deals on the workbook with many sheets and formulas (I guess this is due to the number of formulas/references and the size of the Google workbook).

The best thing for me to do right now is to run it on the empty workbook (much quicker) and copy the sheet over to the populated workbook.

As the volume of my deals increases, at some point I will need to upgrade to the G Suite business plan to increase my execution time to 30 minutes.

Thank you so much, I really appreciated your help on this.


I think that what you’re saying makes sense. It’s probably a good idea to run it on an empty workbook, first.
Glad it helped :slight_smile:

Hi Dani,

thanks for your help. I have found another piece of code that allows me to export a certain pipeline, incl. a filter, into Google Sheets. But it doesn’t seem to work. Let me walk you through my steps.

In Google Sheets -> Tools -> Script Editor I have pasted the following code:

function myFunction() {
  // Standard functions to call the spreadsheet sheet and activesheet
  function GetPipedriveDeals() {
    var ss = SpreadsheetApp.getActiveSpreadsheet();
    var sheets = ss.getSheets();
    var sheet = ss.getActiveSheet();

    // the way the url is built; next step is to iterate to the end, because the api only allows a fixed number of calls (100) — this way i can slowly fill the sheet.
    var url    = "https://api.pipedrive.com/v1/deals?start=";
    var limit  = "&limit=500";
    var filter = "&filter_id=20";
    var pipeline = 2; // put a pipeline id specific to your PipeDrive setup
    var start  = 0;
  //  var end  = start+50;
    var token  = "&api_token=xxxx"

    // call the api and fill dataAll with the jsonparse.
    // then the information is set into the
    // dataSet so that i can refill dataAll with new data.

    var response = UrlFetchApp.fetch(url+start+limit+filter+token);
    var dataAll = JSON.parse(response.getContentText());
    var dataSet = dataAll;

    // create array where the data should be put
    var rows = [A2], data;

    for (var i = 0; i < dataSet.data.length; i++) {
      data = dataSet.data[i];

      if(data.pipeline_id === pipeline){
        rows.push([data.org_name, data.title, data.owner_name, data.status, data.pipeline_id]); // your JSON entities here
      }
    }
    Logger.log( JSON.stringify(rows,null,2) );   // Log transformed data

    return rows;
  }
}

  1. Then in the sheet in the cell A1 I have pasted: =myFunction()
  2. As I would like the sheet to update every hour I have added a trigger.

Unfortunately the code doesn’t work. -> My sheet remains empty. I would be very grateful if you could review the code.

Thank you!

Hi :slight_smile:
I can see a couple of suspicious things that you might want to look into.

  1. I’m pretty sure that your cell is empty because you have the function GetPipedriveDeals defined inside your main function myFunction, but it’s never called. This means your myFunction is not returning anything.
    Just get rid of one of the two, for example, get rid of GetPipedriveDeals and see if it changes anything.
  2. This line
    var rows = [A2], data;
    looks like an error. A2 is not defined anywhere. Try:
    var rows = [], data;

Hope that helps :+1:
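A corrected sketch incorporating both suggestions (a single top-level function, and rows starting as an empty array) might look like this. The dealsToRows helper name is illustrative, not from the thread; splitting it out keeps the filtering logic checkable on its own:

```javascript
// Filter deals by pipeline and map them to spreadsheet rows
// (illustrative helper, not from the thread).
function dealsToRows(deals, pipelineId) {
  var rows = []; // start empty instead of [A2]
  for (var i = 0; i < deals.length; i++) {
    var data = deals[i];
    if (data.pipeline_id === pipelineId) {
      rows.push([data.org_name, data.title, data.owner_name, data.status, data.pipeline_id]);
    }
  }
  return rows;
}

// In Apps Script, a single top-level custom function can then return the rows:
//   function myFunction() {
//     var response = UrlFetchApp.fetch(url + start + limit + filter + token);
//     var dataAll = JSON.parse(response.getContentText());
//     return dealsToRows(dataAll.data, 2);
//   }
```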

Hi Dani,

thanks a lot! I have managed to get the code working. I still have two questions, I hope you could answer.

  1. When I fill the table with data, could I keep the header row of the table, for example “Company Name” etc.? Or what other solutions could I use?
  2. Where can I find a full list of all the JSON entities? I am using “data.org_name”, but I would also need to return some other custom information that we add to each deal.

I am very sorry for these noob questions. I have tried to research this myself, unfortunately unsuccessfully.

Thank you and best wishes,

  1. I’m not 100% sure what you mean… I think that on G Spreadsheets you can choose where to include the function, so if you call =myFunction() on A2 instead of A1, it should start filling from A2.
  2. Depending on which entity you need to get the list of fields for, you can make an API call to /activityFields
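The field-listing endpoints (such as /v1/dealFields for deals) return one entry per field, including custom fields and their hash-style keys. A small helper can pull out the name/key pairs; the listFieldKeys name and the mocked response shape below are illustrative, assuming the usual { data: [...] } envelope:

```javascript
// Extract name/key pairs from a Pipedrive field-listing response
// (e.g. GET /v1/dealFields), assuming the usual { data: [...] } envelope.
function listFieldKeys(fieldsResponse) {
  return fieldsResponse.data.map(function(field) {
    return { name: field.name, key: field.key };
  });
}

// Hypothetical Apps Script usage:
//   var resp = UrlFetchApp.fetch('https://api.pipedrive.com/v1/dealFields?api_token=' + token);
//   Logger.log(listFieldKeys(JSON.parse(resp.getContentText())));
```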

Hello Dani,

I made this script, but I think one of my previous steps is wrong. Can you help me?

Hi @Leticia_Castro,
Can you give some more details?

Okay, @dani,

I want to extract more deals than the limit of 500 using additional_data.pagination, the same thing Daniel asked about above. But my query (image) doesn’t work. I followed the steps, but I think the first part of my query is wrong.

My problem:
I have a pipeline with around 20000 deals, and my query returns just 500 because of the limit. I saw in this post that it is possible to get more using additional_data.pagination, but I don’t know how to do this.