This content originally appeared on DEV Community and was authored by Bouhadjila Hamza
TL;DR: If you're doing a lot of `fetch()` in loops, especially to internal services, you're probably wasting time and compute. Here's how using `Set`, `Map`, and a custom batching/retry tool cut our response time from 4s to 900ms — and what you can do to replicate it.
The Problem
Imagine you're building a simple dashboard that lists books from your favorite public API — say Open Library. For each book, you also want to display the author's details, which are hosted on a different endpoint.
You start with this basic idea:
```javascript
const books = await fetchBooks(); // returns 50 books
const results = await Promise.all(
  books.map((book) => fetchAuthor(book.author_id))
);
```
It works, but suddenly things slow down.
Every request to your dashboard makes 50+ fetch calls to the author endpoint. You’re:
- Re-fetching the same authors multiple times
- Sending all requests in parallel (no concurrency control)
- Making your backend cry 😢
The Fix
1. Deduplicate with `Set`
You realize that many books share the same authors. So instead of fetching for each book, you deduplicate them:
```javascript
const uniqueAuthorIds = new Set(books.map(b => b.author_id));
```
Now instead of 50 fetches, maybe you're down to 20. ✅
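As a quick illustration with made-up IDs, the dedup step collapses repeated authors like so:

```javascript
const books = [
  { title: 'Book One', author_id: 'OL1A' },
  { title: 'Book Two', author_id: 'OL1A' }, // same author as Book One
  { title: 'Book Three', author_id: 'OL2A' },
];

// A Set silently drops duplicate values, so each author ID appears once.
const uniqueAuthorIds = new Set(books.map((b) => b.author_id));

console.log(uniqueAuthorIds.size); // 2 fetches instead of 3
```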
2. Use `Map` for Fast Lookup
After fetching the author info, store them in a `Map` for constant-time access:
```javascript
const authorMap = new Map();
authorResponses.forEach(({ item, result, success }) => {
  if (success) authorMap.set(item, result.data);
});

const booksWithAuthors = books.map(book => ({
  ...book,
  author: authorMap.get(book.author_id) || null
}));
```
3. Throttle & Retry with `oh-no-again`
I built a utility called `oh-no-again` to help manage request batching with:
- Concurrency limits
- Retry logic
- Timeouts
- Lifecycle hooks
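Under the hood, a batcher like this is essentially a small worker pool with per-item retry. Here's a minimal, hypothetical sketch of the idea — not the library's actual source; names and defaults are illustrative:

```javascript
// Minimal worker-pool batcher sketch: `concurrency` workers pull items
// off a shared cursor, retrying each item up to `retries` times.
async function batchWithRetry(items, concurrency, handler, { retries = 2, delay = 50 } = {}) {
  const results = new Array(items.length);
  let next = 0;

  async function worker() {
    while (next < items.length) {
      const i = next++; // claim the next item (safe: this line runs synchronously)
      for (let attempt = 0; ; attempt++) {
        try {
          results[i] = { item: items[i], result: await handler(items[i]), success: true };
          break;
        } catch (err) {
          if (attempt >= retries) {
            results[i] = { item: items[i], error: err, success: false };
            break;
          }
          await new Promise((r) => setTimeout(r, delay)); // back off before retrying
        }
      }
    }
  }

  await Promise.all(Array.from({ length: Math.min(concurrency, items.length) }, worker));
  return results;
}
```

The real package layers timeouts, abort handling, and lifecycle hooks on top of this core loop.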
How to Use `oh-no-again.js`: Full Tutorial
Let’s walk through the full flow using Open Library’s Books and Authors API.
Step 1: Install the package
```bash
npm install oh-no-again
```
Step 2: Fetch books from Open Library
You’ll need a mock function to fetch a list of books. Each book contains an `author_id`:
```javascript
async function fetchBooks() {
  return [
    { title: 'Book One', author_id: 'OL23919A' },
    { title: 'Book Two', author_id: 'OL23919A' },
    { title: 'Book Three', author_id: 'OL2162282A' },
  ];
}
```
Step 3: Create a deduplicated list of authors
```javascript
const books = await fetchBooks();
const uniqueAuthorIds = new Set(books.map(b => b.author_id));
```
Step 4: Batch fetch author info with retries and timeouts
```javascript
import { requestBatcher } from 'oh-no-again';

const authorResponses = await requestBatcher(
  Array.from(uniqueAuthorIds),
  5, // Max 5 concurrent fetches
  (authorId) => ({
    method: 'GET',
    url: `https://openlibrary.org/authors/${authorId}.json`,
    headers: {
      accept: 'application/json',
    },
  }),
  {
    delay: 50, // Delay between retries in ms
    timeout: 1000, // Timeout per request in ms
    retries: 2, // Retry twice before giving up
    returnMeta: true, // Attach the item and result in the output
    failFast: false, // Don't abort all on first error
    hooks: { // optional
      onRetry: (err, attempt) =>
        console.warn(`Retry #${attempt + 1}:`, err.message),
      onAbort: (err) => console.warn('Aborted:', err.message),
      onFailure: (err) => console.error('Final failure:', err.message),
      onSuccess: (res) => console.log('Success:', res),
    },
  }
);
```
Step 5: Map responses to a lookup `Map`
```javascript
const authorMap = new Map();
authorResponses.forEach(({ item, result, success }) => {
  if (success) authorMap.set(item, result.data);
});
```
Step 6: Attach authors to books
```javascript
const booksWithAuthors = books.map(book => ({
  ...book,
  author: authorMap.get(book.author_id) || null
}));

console.log(booksWithAuthors);
```
You’re done! 🚀
The Result
- ✅ Response time dropped from 4s → 900ms on average
- ✅ No redundant requests
- ✅ Fine-tuned control over retries and batching
- ✅ Cleaner, more reliable backend
Takeaways
- Avoid fetching inside loops without deduplication
- Use `Set` to ensure you don’t repeat work
- Use `Map` for blazing-fast ID-based access
- Use batching + concurrency control (like `oh-no-again.js`)
- Profile before optimizing randomly
Want to Try It?
Check out `oh-no-again` on npm:
👉 https://www.npmjs.com/package/oh-no-again
Planned features:
- Dynamic batch sizing
- Circuit breaker logic
- Built-in metrics/logging support
Happy batching! 🚀
Bouhadjila Hamza | Sciencx (2025-07-12T10:39:43+00:00) How Sets, Maps, and One NPM Package Saved My Backend’s Performance. Retrieved from https://www.scien.cx/2025/07/12/how-sets-maps-and-one-npm-package-saved-my-backends-performance/