2721. Execute Asynchronous Functions in Parallel

Medium · JavaScript

Problem Description

This problem asks you to implement a function that mimics the behavior of Promise.all() without using the built-in method. You need to create a function called promiseAll that takes an array of asynchronous functions as input.

Each function in the input array:

  • Takes no arguments
  • Returns a promise when called

Your promiseAll function should return a new promise that behaves as follows:

Success Case:

  • The returned promise should resolve when ALL promises from the input functions have resolved successfully
  • The resolved value should be an array containing all the resolved values
  • The order of values in the result array must match the order of functions in the input array
  • All functions should execute in parallel (not sequentially)

Failure Case:

  • If ANY promise rejects, the returned promise should immediately reject
  • The rejection reason should be the same as that of the first promise that rejected
  • Once a rejection occurs, it doesn't matter if other promises are still pending

For example, if you have three async functions that resolve to values [1, 2, 3], your function should return a promise that resolves to [1, 2, 3]. But if the second function rejects with error "failed", your promise should reject with "failed" regardless of whether the other functions succeed.
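
To make the expected behavior concrete, here is a hedged usage sketch; the helper functions fnA, fnB, fnC and their delays are made up for illustration and are not part of the problem statement:

// Hypothetical timer-based functions used only to illustrate the contract.
const fnA = () => new Promise<number>(resolve => setTimeout(() => resolve(1), 200));
const fnB = () => new Promise<number>(resolve => setTimeout(() => resolve(2), 100));
const fnC = () => new Promise<number>((_, reject) => setTimeout(() => reject("failed"), 50));

promiseAll([fnA, fnB])
    .then(values => console.log(values));   // [1, 2] after ~200ms, in input order

promiseAll([fnA, fnB, fnC])
    .then(values => console.log(values))
    .catch(err => console.log(err));        // "failed" after ~50ms (fail-fast)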

The solution uses a counter to track completed promises and an array to store results at their correct indices. When the counter reaches the total number of functions, all promises have resolved and the result array is returned. If any promise rejects, the entire operation immediately rejects with that error.

Intuition

To understand how to solve this problem, let's think about what Promise.all needs to accomplish. We have multiple asynchronous operations running in parallel, and we need to know when they're all done while preserving their order.

The key insight is that we need to track two things:

  1. How many promises have completed - to know when we're done
  2. The results in their original order - even though promises may resolve in any order

Since promises can resolve at different times, we can't just push results into an array as they come in - that would lose the original ordering. Instead, we need to place each result at its specific index position.

Think of it like waiting for multiple friends to arrive at a party. You have a list of who's coming, and you need to know when everyone has arrived. Each friend might arrive at different times, but you want to remember them in the order they were on your original guest list, not the order they showed up.

This leads us to use:

  • A counter (cnt) to track how many promises have resolved
  • A pre-allocated array (ans) where we can place results at their correct indices
  • A loop with closures to capture each index i, ensuring each promise's result goes to the right spot

For error handling, we adopt a "fail-fast" approach - as soon as any promise rejects, we immediately reject the entire operation. This is like a chain being only as strong as its weakest link. We don't need to wait for other promises because the overall operation has already failed.

The wrapper new Promise((resolve, reject) => {...}) gives us control over when to resolve the overall promise (once the counter equals the array length) and when to reject it (on the first error).
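
Putting these pieces together, the overall shape of the solution looks roughly like the skeleton below (promiseAllSketch is just an illustrative name; the complete implementations follow in the next section):

// Rough skeleton of the idea, not the final code.
function promiseAllSketch<T>(functions: (() => Promise<T>)[]): Promise<T[]> {
    return new Promise<T[]>((resolve, reject) => {
        let cnt = 0;                                 // how many promises have resolved
        const ans = new Array<T>(functions.length);  // results at their original indices

        for (let i = 0; i < functions.length; i++) {
            functions[i]()
                .then(res => {
                    ans[i] = res;                    // the closure over `i` preserves order
                    if (++cnt === functions.length) resolve(ans);
                })
                .catch(reject);                      // fail-fast on the first rejection
        }
    });
}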

Solution Implementation

Python:

import asyncio
from typing import List, Callable, TypeVar, Awaitable

T = TypeVar('T')

async def promiseAll(functions: List[Callable[[], Awaitable[T]]]) -> List[T]:
    """
    Executes an array of coroutine-returning functions in parallel and returns
    a list of all resolved values in order.
    If any coroutine raises an exception, the entire operation fails immediately.

    Args:
        functions: List of functions that return coroutines (async functions)

    Returns:
        List of resolved values from all coroutines
    """
    # Create a list to store all coroutine objects
    coroutines = []

    # Execute each coroutine-returning function to get the coroutine objects
    for function in functions:
        # Call the function to get the coroutine and add it to the list
        coroutines.append(function())

    # Use asyncio.gather to run all coroutines concurrently
    # If any coroutine raises an exception, gather will raise it immediately
    # The results are returned in the same order as the input
    results = await asyncio.gather(*coroutines)

    return list(results)


# Example usage:
# async def get_value():
#     return 42
#
# async def example():
#     result = await promiseAll([get_value])
#     print(result)  # [42]


# Alternative implementation that more closely mirrors the JavaScript behavior:
async def promiseAll_alternative(functions: List[Callable[[], Awaitable[T]]]) -> List[T]:
    """
    Alternative implementation that manually tracks completion similar to the JS version.

    Args:
        functions: List of functions that return coroutines

    Returns:
        List of resolved values from all coroutines
    """
    # Initialize result list with the same length as input
    results = [None] * len(functions)

    # Create tasks for all coroutines
    tasks = []
    for index, promise_function in enumerate(functions):
        # Create a task for each coroutine function
        task = asyncio.create_task(promise_function())
        tasks.append((index, task))

    # Wait for all tasks to complete
    for index, task in tasks:
        try:
            # Wait for the task and store result at corresponding index
            result = await task
            results[index] = result
        except Exception as error:
            # If any task fails, cancel all remaining tasks
            for _, remaining_task in tasks:
                if not remaining_task.done():
                    remaining_task.cancel()
            # Re-raise the exception
            raise error

    return results

Java:

import java.util.concurrent.CompletableFuture;
import java.util.concurrent.CompletionException;
import java.util.function.Supplier;
import java.util.List;
import java.util.ArrayList;
import java.util.concurrent.atomic.AtomicInteger;

/**
 * Utility class for handling multiple asynchronous operations in parallel.
 */
public class PromiseUtils {

    /**
     * Executes an array of CompletableFuture-returning functions in parallel and returns
     * a CompletableFuture that completes with a list of all resolved values in order.
     * If any CompletableFuture fails, the entire operation fails immediately.
     *
     * @param <T> The type of the result values
     * @param functions - List of suppliers that return CompletableFutures
     * @return CompletableFuture that completes with a list of resolved values
     */
    public static <T> CompletableFuture<List<T>> promiseAll(List<Supplier<CompletableFuture<T>>> functions) {
        // Create the result CompletableFuture that will be returned
        CompletableFuture<List<T>> resultFuture = new CompletableFuture<>();

        // Handle edge case of empty input
        if (functions.isEmpty()) {
            resultFuture.complete(new ArrayList<>());
            return resultFuture;
        }

        // Track the number of completed futures
        AtomicInteger completedCount = new AtomicInteger(0);

        // Initialize result list with the same size as input
        List<T> results = new ArrayList<>();
        for (int i = 0; i < functions.size(); i++) {
            results.add(null);
        }

        // Execute each CompletableFuture-returning function
        for (int index = 0; index < functions.size(); index++) {
            final int currentIndex = index;
            Supplier<CompletableFuture<T>> futureSupplier = functions.get(index);

            // Execute the supplier and handle its CompletableFuture
            futureSupplier.get()
                .whenComplete((result, throwable) -> {
                    if (throwable != null) {
                        // If any future fails, immediately fail the main future
                        resultFuture.completeExceptionally(throwable);
                    } else {
                        // Store the result at the corresponding index
                        synchronized (results) {
                            results.set(currentIndex, result);
                        }

                        // Increment completed count and check if all are done
                        if (completedCount.incrementAndGet() == functions.size()) {
                            // If all futures have completed, complete the main future
                            resultFuture.complete(results);
                        }
                    }
                });
        }

        return resultFuture;
    }

    /**
     * Example usage:
     * List<Supplier<CompletableFuture<Integer>>> suppliers = List.of(
     *     () -> CompletableFuture.completedFuture(42)
     * );
     * CompletableFuture<List<Integer>> future = promiseAll(suppliers);
     * future.thenAccept(System.out::println); // [42]
     */
}

C++:

#include <vector>
#include <future>
#include <functional>
#include <memory>
#include <mutex>
#include <thread>

/**
 * Executes an array of promise-returning functions in parallel and returns
 * a future that completes with a vector of all resolved values in order.
 * If any promise fails, the entire operation fails immediately.
 *
 * @param functions - Vector of functions that return futures
 * @returns Future that resolves to a vector of resolved values
 */
template<typename T>
std::future<std::vector<T>> promiseAll(const std::vector<std::function<std::future<T>()>>& functions) {
    // Create a promise to control the final result
    auto finalPromise = std::make_shared<std::promise<std::vector<T>>>();

    // Get the future from the promise
    std::future<std::vector<T>> finalFuture = finalPromise->get_future();

    // Handle empty input case
    if (functions.empty()) {
        finalPromise->set_value(std::vector<T>());
        return finalFuture;
    }

    // Shared state for tracking completion
    struct SharedState {
        std::mutex mutex;
        std::vector<T> results;
        size_t resolvedCount = 0;
        bool hasRejected = false;
    };

    auto sharedState = std::make_shared<SharedState>();
    sharedState->results.resize(functions.size());

    // Execute each promise-returning function
    for (size_t index = 0; index < functions.size(); ++index) {
        // Launch a detached thread for each function
        std::thread([index, functions, sharedState, finalPromise]() {
            try {
                // Execute the function and get its future
                std::future<T> future = functions[index]();

                // Wait for the result
                T result = future.get();

                // Store the result thread-safely
                {
                    std::lock_guard<std::mutex> lock(sharedState->mutex);

                    // Check if already rejected
                    if (sharedState->hasRejected) {
                        return;
                    }

                    // Store the result at the corresponding index
                    sharedState->results[index] = result;
                    sharedState->resolvedCount++;

                    // If all promises have resolved, resolve the main promise
                    if (sharedState->resolvedCount == functions.size()) {
                        finalPromise->set_value(sharedState->results);
                    }
                }
            }
            catch (...) {
                // Handle any exception from the future
                std::lock_guard<std::mutex> lock(sharedState->mutex);

                // If not already rejected, reject the main promise
                if (!sharedState->hasRejected) {
                    sharedState->hasRejected = true;
                    finalPromise->set_exception(std::current_exception());
                }
            }
        }).detach();
    }

    return finalFuture;
}

/**
 * Example usage:
 * auto promise = promiseAll<int>({
 *     []() { return std::async(std::launch::async, []() { return 42; }); }
 * });
 * std::vector<int> results = promise.get(); // [42]
 */

TypeScript:

/**
 * Executes an array of promise-returning functions in parallel and returns
 * a promise that resolves with an array of all resolved values in order.
 * If any promise rejects, the entire operation rejects immediately.
 *
 * @param functions - Array of functions that return promises
 * @returns Promise that resolves to an array of resolved values
 */
async function promiseAll<T>(functions: (() => Promise<T>)[]): Promise<T[]> {
    return new Promise<T[]>((resolve, reject) => {
        // Track the number of resolved promises
        let resolvedCount = 0;

        // Initialize result array with the same length as input
        const results = new Array(functions.length);

        // Execute each promise-returning function
        for (let index = 0; index < functions.length; index++) {
            const promiseFunction = functions[index];

            // Execute the function and handle its promise
            promiseFunction()
                .then(result => {
                    // Store the result at the corresponding index
                    results[index] = result;
                    resolvedCount++;

                    // If all promises have resolved, resolve the main promise
                    if (resolvedCount === functions.length) {
                        resolve(results);
                    }
                })
                .catch(error => {
                    // If any promise rejects, immediately reject the main promise
                    reject(error);
                });
        }
    });
}

/**
 * Example usage:
 * const promise = promiseAll([() => new Promise(res => res(42))]);
 * promise.then(console.log); // [42]
 */

Solution Approach

Let's walk through the implementation step by step:

1. Create a wrapper promise:

return new Promise<T[]>((resolve, reject) => {
    // Implementation here
});

We wrap everything in a new promise that we can control - deciding when to resolve with all results or reject with an error.

2. Initialize tracking variables:

let cnt = 0;
const ans = new Array(functions.length);

  • cnt tracks how many promises have successfully resolved
  • ans is a pre-allocated array with the same length as the input functions array to store results at their correct positions

3. Execute all functions in parallel:

for (let i = 0; i < functions.length; ++i) {
    const f = functions[i];
    f()
        .then(res => {
            // Handle success
        })
        .catch(err => {
            // Handle failure
        });
}

The loop immediately invokes all functions without waiting for any to complete. Each function call returns a promise that we attach handlers to.

4. Handle successful resolution:

.then(res => {
    ans[i] = res;
    cnt++;
    if (cnt === functions.length) {
        resolve(ans);
    }
})

When a promise resolves:

  • Store the result at index i in the ans array (preserving order)
  • Increment the counter
  • Check if all promises are done (cnt === functions.length)
  • If all are complete, resolve the main promise with the complete results array

5. Handle rejection:

.catch(err => {
    reject(err);
})

If any promise rejects, immediately reject the main promise with the same error. This implements the "fail-fast" behavior where the first rejection causes the entire operation to fail.

Key implementation details:

  • The closure in the loop captures the value of i for each iteration, ensuring each result goes to the correct index
  • All functions are invoked immediately in the loop, achieving parallel execution
  • The counter approach avoids needing to check the array for completion on every resolution
  • No additional error tracking is needed since we reject on the first error

This implementation efficiently handles both the happy path (all succeed) and error cases (any failure) while maintaining the correct order of results.

Example Walkthrough

Let's walk through a concrete example with 3 async functions to see how the solution works:

Input functions:

  • fn1: resolves with value 10 after 100ms
  • fn2: resolves with value 20 after 50ms
  • fn3: resolves with value 30 after 150ms

Step-by-step execution:

  1. Initialization (t=0ms):

    • Create wrapper promise
    • Set cnt = 0
    • Create ans = [undefined, undefined, undefined] (array of length 3)
  2. Start all functions (t=0ms):

    • Loop executes immediately:
      • i=0: Call fn1(), attach .then() and .catch() handlers
      • i=1: Call fn2(), attach .then() and .catch() handlers
      • i=2: Call fn3(), attach .then() and .catch() handlers
    • All three functions are now running in parallel
  3. First resolution (t=50ms):

    • fn2 resolves with 20 (it's the fastest)
    • Its .then() handler executes with i=1:
      • ans[1] = 20, so ans = [undefined, 20, undefined]
      • cnt = 1
      • Check: cnt (1) !== functions.length (3), so continue waiting
  4. Second resolution (t=100ms):

    • fn1 resolves with 10
    • Its .then() handler executes with i=0:
      • ans[0] = 10, so ans = [10, 20, undefined]
      • cnt = 2
      • Check: cnt (2) !== functions.length (3), so continue waiting
  5. Final resolution (t=150ms):

    • fn3 resolves with 30
    • Its .then() handler executes with i=2:
      • ans[2] = 30, so ans = [10, 20, 30]
      • cnt = 3
      • Check: cnt (3) === functions.length (3)
      • Call resolve(ans) with [10, 20, 30]

Result: The wrapper promise resolves with [10, 20, 30] - values are in the original order despite fn2 finishing first.
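
The timing above can be reproduced with a small sketch; fn1, fn2, and fn3 are hypothetical timer-based functions matching the delays in this walkthrough:

// Sketch of the success scenario from the walkthrough.
const fn1 = () => new Promise<number>(resolve => setTimeout(() => resolve(10), 100));
const fn2 = () => new Promise<number>(resolve => setTimeout(() => resolve(20), 50));
const fn3 = () => new Promise<number>(resolve => setTimeout(() => resolve(30), 150));

promiseAll([fn1, fn2, fn3]).then(values => {
    console.log(values); // [10, 20, 30] at ~150ms, even though fn2 finished first
});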

Alternative scenario with rejection:

If fn2 had rejected with error "Network error" at t=50ms (sketched after this list):

  • Its .catch() handler would execute immediately
  • Call reject("Network error")
  • The wrapper promise rejects with "Network error"
  • fn1 and fn3 continue running but their results are ignored
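
A sketch of this rejection scenario, reusing fn1 and fn3 from the sketch above and swapping in a hypothetical failing variant of fn2:

// failingFn2 rejects after 50ms instead of resolving.
const failingFn2 = () =>
    new Promise<number>((_, reject) => setTimeout(() => reject("Network error"), 50));

promiseAll([fn1, failingFn2, fn3])
    .then(values => console.log(values))   // never runs
    .catch(err => console.log(err));       // "Network error" at ~50ms (fail-fast)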

This demonstrates how the solution maintains order through indexed assignment and implements fail-fast behavior on errors.

Time and Space Complexity

Time Complexity: O(n) where n is the number of functions in the input array.

The code iterates through the array of functions once with a for loop, and for each function, it invokes it and attaches .then() and .catch() handlers. The iteration itself takes O(n) time. Each promise is executed asynchronously and in parallel, so the actual execution time depends on the slowest promise to resolve, but the setup and handling of promises is O(n).

Space Complexity: O(n) where n is the number of functions in the input array.

The space complexity includes:

  • The ans array which stores the results of all promises: O(n)
  • The closure variables (cnt, resolve, reject) which are constant: O(1)
  • Each promise's .then() and .catch() handlers create closures that capture the index i and reference to ans, cnt, resolve, and reject: O(n) total for all handlers

Therefore, the overall space complexity is O(n).

Common Pitfalls

1. Using the Loop Variable Directly in Callbacks (JavaScript/TypeScript)

Pitfall: A classic mistake is capturing the loop variable incorrectly in the promise handlers:

// WRONG - This will not preserve the correct order
for (var i = 0; i < functions.length; i++) {
    functions[i]()
        .then(res => {
            ans[i] = res; // Bug: 'i' will be functions.length for all callbacks
            cnt++;
            if (cnt === functions.length) {
                resolve(ans);
            }
        })
        .catch(reject);
}

Why it fails: When using var, the variable i is function-scoped, not block-scoped. By the time the promises resolve, the loop has completed and i equals functions.length for all callbacks, so every result is written to the same out-of-range index.

Solution: Use let for block scoping or create a closure:

// Solution 1: Use 'let' for block scoping
for (let i = 0; i < functions.length; i++) {
    // 'i' is now block-scoped and captured correctly
}

// Solution 2: Create an IIFE closure
for (var i = 0; i < functions.length; i++) {
    (function(index) {
        functions[index]()
            .then(res => {
                ans[index] = res;
                // ...
            });
    })(i);
}

2. Not Handling Empty Arrays

Pitfall: Forgetting to handle the edge case when the input array is empty:

// Problem: with an empty input array, the loop body never runs,
// so this check (inside .then) is never reached and the promise never settles
if (cnt === functions.length) {
    resolve(ans);
}

Solution: Add explicit handling for empty arrays:

if (functions.length === 0) {
    resolve([]);
    return;
}

3. Sequential Execution Instead of Parallel (Python)

Pitfall: Accidentally awaiting each coroutine in sequence:

# WRONG - This executes sequentially
results = []
for func in functions:
    result = await func()  # Waits for each one to complete before starting the next
    results.append(result)

Solution: Create all coroutines first, then await them together:

# Correct - Executes in parallel
coroutines = [func() for func in functions]
results = await asyncio.gather(*coroutines)

4. Not Preserving Order When Using Manual Tracking

Pitfall: Appending results as they arrive instead of placing them at their original index:

// WRONG - Order is not preserved
functions[i]()
    .then(res => {
        ans.push(res); // Results will be in completion order, not original order
        cnt++;
        if (cnt === functions.length) {
            resolve(ans);
        }
    })

Solution: Always use the index to place results:

// Correct - Preserves original order
ans[i] = res; // Place at specific index

5. Memory Leaks with Unhandled Rejections

Pitfall: When implementing manual cancellation in Python, forgetting to properly clean up tasks:

# Potential issue - tasks might not be properly cancelled
for index, task in tasks:
    try:
        result = await task
    except Exception as error:
        # Other tasks continue running in the background
        raise error

Solution: Ensure all pending tasks are cancelled on failure:

except Exception as error:
    # Cancel all remaining tasks
    for _, remaining_task in tasks:
        if not remaining_task.done():
            remaining_task.cancel()
    raise error