Handling multiple promises

This article covers the different ways to handle multiple promises in TypeScript and when you might use each approach. Throughout, we will mostly be retrieving a list of posts from an external REST API: suppose we want to fetch some posts from JSONPlaceholder using the Fetch API.

type Post = {
  id: number;
  title: string;
};

const fetchPost = async (postId: number): Promise<Post> => {
  console.log(`Fetching post ${postId}`);
  const response = await fetch(
    `https://jsonplaceholder.typicode.com/posts/${postId}`,
  );
  const fullPost = await response.json();
  console.log(`Fetched post ${postId}`);
  return {
    id: fullPost.id,
    title: fullPost.title,
  };
};

Firstly, let’s look at how we would retrieve the posts by executing our requests sequentially.

Sequential Execution / Promise sequencing

.forEach

We might be tempted to use .forEach to fetch all of the posts.

const postIds = [1, 2, 3];
const posts: Post[] = [];
postIds.forEach(async (postId) => {
  const post = await fetchPost(postId);
  posts.push(post);
});
console.log('Posts', posts);

But, we get some unexpected results when we look at our logs:

Terminal window
[LOG]: "Fetching post 1"
[LOG]: "Fetching post 2"
[LOG]: "Fetching post 3"
[LOG]: "Posts", []
[LOG]: "Fetched post 2"
[LOG]: "Fetched post 1"
[LOG]: "Fetched post 3"

Firstly, we don’t have any posts in the posts array! This is because .forEach does not wait for promises, as mentioned in the MDN Documentation. So, right after we start firing off our requests, the console.log('Posts', posts); statement is executed, which will always print an empty array.

Secondly, even if we did have posts in the posts array, you will notice that the order in which the requests are sent and the order in which they finish are different. This means that the order of our posts array could be different each time!

Playground Link
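
To see why, it helps to remember that .forEach calls its callback and throws away the return value. Conceptually, it behaves something like this rough sketch (simplified, ignoring the index and array arguments):

const forEachSketch = <T>(items: T[], callback: (item: T) => unknown): void => {
  for (const item of items) {
    // The return value (here, the Promise returned by our async callback)
    // is simply discarded, so nothing ever waits for it.
    callback(item);
  }
};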

for...of

The for…of loop offers a more intuitive way of handling promises sequentially.

const posts: Post[] = [];
for (const postId of postIds) {
  const post = await fetchPost(postId);
  posts.push(post);
}
console.log('Posts', posts);

This gives us the following output:

Terminal window
[LOG]: "Fetching post 1"
[LOG]: "Fetched post 1"
[LOG]: "Fetching post 2"
[LOG]: "Fetched post 2"
[LOG]: "Fetching post 3"
[LOG]: "Fetched post 3"
[LOG]: "Posts", [{
"id": 1,
"title": "sunt aut facere repellat provident occaecati excepturi optio reprehenderit"
}, {
"id": 2,
"title": "qui est esse"
}, {
"id": 3,
"title": "ea molestias quasi exercitationem repellat qui ipsa sit aut"
}]

Playground Link

Now, we have a posts array with our expected posts in the same order that we started fetching them in.

.reduce()

As described in the MDN Documentation, promise sequencing can also be achieved using .reduce().

const posts = await postIds.reduce(
  async (acc, postId) => {
    const fetchedPosts = await acc;
    const newPost = await fetchPost(postId);
    return [...fetchedPosts, newPost];
  },
  Promise.resolve([] as Post[]),
);

Playground Link

This is the equivalent of:

const posts = await Promise.resolve([] as Post[])
  .then(async (fetchedPosts) => {
    const newPost = await fetchPost(1);
    return [...fetchedPosts, newPost];
  })
  .then(async (fetchedPosts) => {
    const newPost = await fetchPost(2);
    return [...fetchedPosts, newPost];
  })
  .then(async (fetchedPosts) => {
    const newPost = await fetchPost(3);
    return [...fetchedPosts, newPost];
  });

Playground Link

This achieves the same result as using for...of, just written in a more functional style.

Do you need to run promises sequentially?

If we look at our logs again,

Terminal window
[LOG]: "Fetching post 1"
[LOG]: "Fetched post 1"
[LOG]: "Fetching post 2"
[LOG]: "Fetched post 2"
[LOG]: "Fetching post 3"
[LOG]: "Fetched post 3"

We can see that we only start fetching the next post once we have finished fetching the previous post.

If we start timing how long this takes:

const timeNow = Date.now();
const postIds = [1, 2, 3];
const posts = await postIds.reduce(
  async (acc, postId) => {
    const fetchedPosts = await acc;
    const newPost = await fetchPost(postId);
    return [...fetchedPosts, newPost];
  },
  Promise.resolve([] as Post[]),
);
console.log('Posts', posts);
console.log(`Executed in ${(Date.now() - timeNow) / 1000}s`);

We can see that it takes about 0.3 seconds to fetch all of the posts sequentially. This is the sum of all of the request times.

Terminal window
[LOG]: "Executed in 0.283s"

But do we actually need to fetch the posts sequentially? Sometimes we do, for example when we do not want to overwhelm the server with too many requests. Since we are only fetching three posts here, though, we can consider fetching them in parallel.

Parallel Execution

Promise.all()

Promise.all() offers a way of only continuing once all of the promises in the array have resolved. If you remember, when we used .forEach(), the requests were being fired off without waiting for the previous request to complete. This is exactly the behaviour that we want.

const fetchPostPromises = postIds.map((postId) => fetchPost(postId));
const posts = await Promise.all(fetchPostPromises);

Here, I have used .map() to create an array of promises. As soon as .map() runs, the requests are already being fired off in parallel. If we did not care about the results of the requests, we could continue executing our function without doing anything else. However, since we do want the results, we use Promise.all() to wait for all of the requests to complete.
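
To compare with the sequential version, we can wrap the parallel fetch in the same timing code as before; something like:

const timeNow = Date.now();
const fetchPostPromises = postIds.map((postId) => fetchPost(postId));
const posts = await Promise.all(fetchPostPromises);
console.log('Posts', posts);
console.log(`Executed in ${(Date.now() - timeNow) / 1000}s`);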

Output:

Terminal window
[LOG]: "Fetching post 1"
[LOG]: "Fetching post 3"
[LOG]: "Fetched post 2"
[LOG]: "Posts", [{
"id": 1,
"title": "sunt aut facere repellat provident occaecati excepturi optio reprehenderit"
}, {
"id": 2,
"title": "qui est esse"
}, {
"id": 3,
"title": "ea molestias quasi exercitationem repellat qui ipsa sit aut"
}]
[LOG]: "Executed in 0.078s"

You will notice that, unlike with .forEach(), the posts are in the same order in which we started fetching them. This is because Promise.all() returns the results of the promises in the same order as its input array. The other thing to note is that, by executing the requests in parallel, we reduce the overall time it takes to fetch all of the posts to 0.078s. This is roughly the time of the slowest request rather than the sum of all of the request times.

Playground Link

Error handling and observability

What happens if one of the promises fails?

Promise.all() will reject as soon as one of the promises rejects. So, if we have multiple promises that reject, we will only get the error from the first rejected promise.
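
The fetchPostThatWillResolve and fetchPostThatWillReject helpers used below simulate requests that succeed or fail after a delay. They are not spelled out here, but a minimal sketch of what they might look like is the following (the rejecting variant reappears shortly with extra logging added):

const sleepResolve = (seconds: number) =>
  new Promise<void>((resolve) => setTimeout(resolve, seconds * 1000));

const fetchPostThatWillResolve = async ({
  postId,
  timeOutInSeconds,
}: {
  postId: number;
  timeOutInSeconds: number;
}) => {
  // Simulates a request that succeeds after the given delay.
  await sleepResolve(timeOutInSeconds);
  return { postId };
};

const fetchPostThatWillReject = async ({
  postId,
  timeOutInSeconds,
}: {
  postId: number;
  timeOutInSeconds: number;
}) => {
  // Simulates a request that fails after the given delay.
  await sleepResolve(timeOutInSeconds);
  throw new Error(`Not found`);
};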

try {
  const posts = await Promise.all([
    fetchPostThatWillResolve({
      postId: 1,
      timeOutInSeconds: 1,
    }),
    fetchPostThatWillReject({
      postId: 2,
      timeOutInSeconds: 0.5,
    }),
    fetchPostThatWillReject({
      postId: 3,
      timeOutInSeconds: 2,
    }),
  ]);
  console.log('Posts', posts);
} catch (error) {
  console.error('Failed to fetch posts', error);
}

So, in this case, we will only get the error from postId: 2 because it is the first promise that rejects.

Terminal window
[ERR]: "Failed to posts", Not found

Playground Link

But you’ll see that, unfortunately, our error log isn’t very helpful: we don’t know which post failed. Rather than wrapping the entire Promise.all() call in a try...catch block, we can wrap each individual promise in a try...catch block so that, if one fails, we have more context about which post failed.

const fetchPostThatWillReject = async ({
  postId,
  timeOutInSeconds,
}: {
  postId: number;
  timeOutInSeconds: number;
}) => {
  try {
    await sleepResolve(timeOutInSeconds);
    throw new Error(`Not found`);
  } catch (error) {
    console.error(`Failed to retrieve post ${postId}`, error);
    throw error;
  }
};

Playground Link

Now when we run the code, we can see that we have much more context about which post failed.

Terminal window
[ERR]: "Failed to retrieve post 2", Not found
[ERR]: "Failed to retrieve post 3", Not found

We also see logs for postId: 3, which fails after the request for postId: 2 has already failed. This is because, although the Promise.all() call rejects as soon as the first promise rejects, the remaining promises still continue to execute.

However, it is important to note that you will only see the logs for all of the errors if the runtime you are executing the code in does not shut down the process after the first error. In scenarios like serverless functions or scripts, an unhandled Promise.all() rejection can kill the process immediately, so you cannot rely on the remaining promises having logged their errors.

Promise.allSettled()

Promise.allSettled() is a variant of Promise.all() that will not reject. Instead, it will always resolve to an array of objects representing the outcome of each promise. This is useful if you want to more reliably observe and log the outcome of all of the promises in an array, even if more than one of them rejects.

const postFetchResults = await Promise.allSettled([
  fetchPostThatWillResolve({
    postId: 1,
    timeOutInSeconds: 1,
  }),
  fetchPostThatWillReject({
    postId: 2,
    timeOutInSeconds: 0.5,
  }),
  fetchPostThatWillReject({
    postId: 3,
    timeOutInSeconds: 2,
  }),
]);
const posts = postFetchResults.filter(
  (result) => result.status === 'fulfilled',
);
console.log('Posts', posts);

Playground Link

If we look at the logs, we can see that we get logs for all of the promises that were rejected and we still receive the posts that were successfully fetched.

Terminal window
[ERR]: "Failed to retrieve post 2", Not found
[ERR]: "Failed to retrieve post 3", Not found
[LOG]: "Posts", [{
"status": "fulfilled",
"value": {
"postId": 1
}
}]
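
Note that filtering the settled results still leaves us with the { status, value } wrapper objects, as the log above shows. If we only wanted the post data itself, we could narrow to the fulfilled results with a type predicate and map out their values; for example:

const fulfilledPosts = postFetchResults
  .filter(
    // The { postId } shape here matches the resolving helper used above.
    (result): result is PromiseFulfilledResult<{ postId: number }> =>
      result.status === 'fulfilled',
  )
  .map((result) => result.value);
console.log('Posts', fulfilledPosts); // [{ postId: 1 }]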

Sometimes, you may not want to continue executing code after some of the promises have been rejected. In this case, you can check whether any of the promises were rejected and then throw an error.

if (postFetchResults.some((result) => result.status === 'rejected')) {
  throw new Error('Failed to retrieve posts');
}

One downside of this approach is that we lose the stack trace of the error that was thrown in the rejected promise. You can get around this by using the Error.cause property to pass the error from the rejected promise to the new error that we throw.

if (postFetchResults.some((result) => result.status === 'rejected')) {
  throw new Error('Failed to retrieve posts', {
    cause: postFetchResults.find((result) => result.status === 'rejected')
      ?.reason,
  });
}

I have chosen to use the first rejected promise’s error as the cause, but you could also map all of the errors and pass them as an array to the cause property. However, this can make the logs hard to read due to the sheer amount of information. This is usually why having the try/catch block inside of each fetchPost function is better for capturing more specific context about the errors that occurred.
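
If you did want to surface every failure at once, a sketch of that alternative could look like the following (bearing in mind that a cause holding an array of errors can be noisy to read in logs):

const rejectedResults = postFetchResults.filter(
  (result): result is PromiseRejectedResult => result.status === 'rejected',
);
if (rejectedResults.length > 0) {
  throw new Error('Failed to retrieve posts', {
    // Attach every rejection reason rather than just the first one.
    cause: rejectedResults.map((result) => result.reason),
  });
}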

Bonus

for await...of for async iterators

Sometimes, we need to execute a series of promises that depend on one another and so must run sequentially. For example, we may need to fetch a list of users from a paginated API: in order to retrieve the next page of users, we need to use the results from the previous page, which usually include some kind of pagination token.

A common abstraction for handling this sequencing of dependent promises is the async iterator. For example, the AWS SDK for JavaScript v3 provides async generator functions that handle retrieving the next page of a paginated response.

We could get all the users based on a QueryCommand as follows:

import { DynamoDBClient } from '@aws-sdk/client-dynamodb';
import { DynamoDBDocumentClient, paginateQuery } from '@aws-sdk/lib-dynamodb';

const dynamoDbClient = new DynamoDBClient();
const dynamoDbDocumentClient = DynamoDBDocumentClient.from(dynamoDbClient);

const paginator = paginateQuery(
  {
    client: dynamoDbDocumentClient,
    pageSize: 25,
  },
  {
    TableName: 'some-table',
    KeyConditionExpression: 'userId = :userId',
    ExpressionAttributeValues: {
      ':userId': '12345',
    },
  },
);

const users = [];
for await (const page of paginator) {
  users.push(...(page.Items ?? []));
}

This makes use of for await…of, which is specifically designed to work with async iterators.

This is notably different from the earlier sequential examples because, in this case, we genuinely need the results of the previous iteration to get the next page of results.
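
Outside of the AWS SDK, we can build the same kind of abstraction ourselves. The following is a minimal, hypothetical sketch: fetchUserPage and its response shape are made up for illustration, but the pattern of feeding the previous page’s token into the next request is the same:

type UserPage = { users: string[]; nextToken?: string };

// Stand-in for a real paginated API call; replace with your own fetch logic.
const fetchUserPage = async (nextToken?: string): Promise<UserPage> => {
  const pages: Record<string, UserPage> = {
    start: { users: ['ada', 'grace'], nextToken: 'page2' },
    page2: { users: ['alan'] },
  };
  return pages[nextToken ?? 'start'];
};

async function* paginateUsers() {
  let nextToken: string | undefined;
  do {
    // Each request depends on the token returned by the previous page.
    const page = await fetchUserPage(nextToken);
    yield page;
    nextToken = page.nextToken;
  } while (nextToken);
}

const allUsers: string[] = [];
for await (const page of paginateUsers()) {
  allUsers.push(...page.users);
}
console.log('Users', allUsers);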

Error handling and observability

Where do we need to handle errors from an async iterator?

Placing the try...catch block inside of the for await...of loop body will not work, because the promise is awaited by the loop itself, before the loop body runs.

async function* getNumbers() {
  yield 1;
  yield Promise.reject(new Error("Something went wrong"));
}

(async function () {
  for await (const num of getNumbers()) {
    try {
      console.log(num);
    } catch (error) {
      console.error(error);
    }
  }
})();

Playground Link

Instead, we want to wrap the entire for await...of loop inside of a try...catch block.

async function* getNumbers() {
  yield 1;
  yield Promise.reject(new Error("Something went wrong"));
}

(async function () {
  try {
    for await (const num of getNumbers()) {
      console.log(num);
    }
  } catch (error) {
    console.error(error);
  }
})();

Playground Link