How To Batch JavaScript API Requests For Better Performance

Often, we find ourselves in situations where we need to make hundreds of API calls. Running them all in parallel can overwhelm the server, while executing them one by one is highly inefficient.

A better approach is to process the requests in batches—executing a batch in parallel, waiting for the responses, and then proceeding with the next batch.

In this article, we will make 20 API calls to https://jsonplaceholder.typicode.com/todos/, fetching each todo by its id. We will process these requests in batches of 5 at a time.

Before we jump into the code, let's briefly go over Promise.all().

Understanding Promise.all()

Promise.all() takes an array of promises as input and returns a single promise. This returned promise resolves only when all the input promises are fulfilled. However, if any promise in the array rejects, Promise.all() immediately rejects with that error.

Besides Promise.all(), there are other useful Promise methods worth exploring, such as Promise.allSettled() and Promise.any().
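As a quick illustration of both behaviors (using setTimeout to simulate a slow operation):

```javascript
// Promise.all resolves once every input promise fulfills,
// preserving input order regardless of completion order.
const fast = Promise.resolve("fast");
const slow = new Promise(resolve => setTimeout(() => resolve("slow"), 50));

Promise.all([slow, fast]).then(results => {
  console.log(results); // [ 'slow', 'fast' ] — input order, not completion order
});

// If any input promise rejects, Promise.all rejects immediately
// with that first rejection reason.
Promise.all([fast, Promise.reject(new Error("boom"))])
  .catch(err => console.log(err.message)); // boom
```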

Let's Write Some Code

We will create an array of requests and execute them in batches.

const urls = Array.from({ length: 20 }, (_, i) => `https://jsonplaceholder.typicode.com/todos/${i + 1}`);

This will create a urls array containing 20 URLs. We will execute 5 requests at a time, and once we receive their responses, we will proceed with the next batch of 5 until all 20 requests are completed.
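Before wiring this into fetch calls, here is a tiny standalone sketch of how slice() carves an array into batches, using plain numbers in place of URLs:

```javascript
// Split an array into chunks of `batchSize` using slice().
const items = Array.from({ length: 20 }, (_, i) => i + 1);
const batchSize = 5;

const batches = [];
for (let i = 0; i < items.length; i += batchSize) {
  // slice() never reads past the end of the array,
  // so a shorter final batch is handled automatically
  batches.push(items.slice(i, i + batchSize));
}

console.log(batches.length); // 4
console.log(batches[0]);     // [ 1, 2, 3, 4, 5 ]
```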

Now, let's write our main function, which will split the requests into batches and execute each batch in parallel. We will place all the following code in a single file, named batchDemo.js.

async function fetchInBatches(urls, batchSize) {
  // we will keep track of the batch count
  let batchNo = 0;

  // sleep function (returns a Promise that resolves after `ms` milliseconds)
  const sleep = (ms) => new Promise(resolve => setTimeout(resolve, ms));

  // iterate through the `urls` array
  for (let i = 0; i < urls.length; i += batchSize) {
    // increment for each batch
    batchNo++;

    // pick the next batch of URLs
    const batch = urls.slice(i, i + batchSize);

    // make all API calls in parallel for the current batch;
    // a failed request resolves to null instead of rejecting the whole batch
    const batchResults = await Promise.all(
      batch.map(url => fetch(url).then(res => res.json()).catch(() => null))
    );

    console.log(`### batch ${batchNo} ###`);
    console.log(batchResults);

    // pause briefly between batches to make the batch boundaries easy to observe
    await sleep(1000);
  }
}

// build the `urls` array
const urls = Array.from({ length: 20 }, (_, i) => `https://jsonplaceholder.typicode.com/todos/${i + 1}`);

// call the function with the batch size of 5
fetchInBatches(urls, 5).then(() => console.log("all done!!"));

Now, let's run our code and verify that the requests are indeed being executed in batches.

$ node batchDemo.js
### batch 1 ###
[
  { userId: 1, id: 1, title: 'delectus aut autem', completed: false },
  {
    userId: 1,
    id: 2,
    title: 'quis ut nam facilis et officia qui',
    completed: false
  },
  { userId: 1, id: 3, title: 'fugiat veniam minus', completed: false },
  { userId: 1, id: 4, title: 'et porro tempora', completed: true },
  {
    userId: 1,
    id: 5,
    title: 'laboriosam mollitia et enim quasi adipisci quia provident illum',
    completed: false
  }
]
### batch 2 ###
[
  {
    userId: 1,
    id: 6,
    title: 'qui ullam ratione quibusdam voluptatem quia omnis',
    completed: false
  },
  {
    userId: 1,
    id: 7,
    title: 'illo expedita consequatur quia in',
    completed: false
  },
  {
    userId: 1,
    id: 8,
    title: 'quo adipisci enim quam ut ab',
    completed: true
  },
  {
    userId: 1,
    id: 9,
    title: 'molestiae perspiciatis ipsa',
    completed: false
  },
  {
    userId: 1,
    id: 10,
    title: 'illo est ratione doloremque quia maiores aut',
    completed: true
  }
]
### batch 3 ###
...

We can see that our requests were successfully executed in batches. This approach helps optimize resource consumption on the client side while preventing the server from being overwhelmed by hundreds of requests at once.
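One caveat of the .catch(() => null) approach above is that failures silently become null. If you would rather see why a request failed, Promise.allSettled() (mentioned earlier) is a natural fit. Here is a minimal sketch; settleBatch is a hypothetical helper name, not part of the code above:

```javascript
// Run a batch of promise-returning tasks in parallel and report
// every outcome: fulfilled values as-is, rejections as error objects.
async function settleBatch(tasks) {
  const settled = await Promise.allSettled(tasks.map(task => task()));
  return settled.map(result =>
    result.status === "fulfilled"
      ? result.value
      : { error: String(result.reason) }
  );
}

// Inside fetchInBatches, the Promise.all line could then become:
//   const batchResults = await settleBatch(
//     batch.map(url => () => fetch(url).then(res => res.json()))
//   );
```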

Hope you found this article helpful.

Thank you for reading!


My First Go Book

Learn Go from scratch with hands-on examples! Whether you're a beginner or an experienced developer, this book makes mastering Go easy and fun.

🚀 Get it Now