Lately, I’ve been diving into TanStack Query (formerly React Query) to learn more about its powerful data fetching and caching features. But before I start relying on tools that abstract everything for me, I wanted to understand the problem caching solves and implement a basic version myself.
This post walks through:
- The problem I noticed in my app
- How I came up with the idea of caching
- How I built a simple in-memory cache
- What I learned from the process
🧠 The Problem That Sparked the Idea
In one of my projects, I display a paginated list of users. Every time I changed the page, an API call was made—even if I had already fetched that page before. This felt inefficient.
Then I thought:
"Why not store the results of each API call in memory, and reuse them if the same request is made again within a short time?"
That's when I got the idea of building a simple in-memory cache:
✅ If I’ve already fetched page 2, don’t fetch it again within 5 minutes.
❌ Otherwise, fetch it from the API and cache it.
This idea led me to implement manual caching logic in my React hook.
🔄 First, The Hook Without Caching
Initially, my custom hook looked like this:
```js
import { useState, useEffect } from "react";

export const useGetAllUser = (currentPage, pageSize) => {
  const [users, setUsers] = useState([]);
  const [totalUsers, setTotalUsers] = useState(0);
  const [loading, setLoading] = useState(true);
  const [error, setError] = useState(null);

  useEffect(() => {
    const fetchUsers = async () => {
      setLoading(true);
      setError(null);
      try {
        const res = await fetchData("/users", {
          _page: currentPage,
          _limit: pageSize,
        });
        setUsers(res.data);
        setTotalUsers(Number(res.headers["x-total-count"]));
      } catch (err) {
        setError(err.message);
      } finally {
        setLoading(false);
      }
    };
    fetchUsers();
  }, [currentPage, pageSize]);

  return { users, totalUsers, loading, error };
};
```
It worked—but made unnecessary API calls whenever a user switched between pages.
💡 Adding Manual Caching (In-Memory)
To fix this, I added a simple cache:
```js
const userCache = {}; // A plain JS object to store cached results
const CACHE_DURATION = 5 * 60 * 1000; // Cache entries are valid for 5 minutes
```
Then I modified the hook to check the cache before calling the API:
```js
import { useState, useEffect } from "react";

export const useGetAllUser = (currentPage, pageSize) => {
  const [users, setUsers] = useState([]);
  const [totalUsers, setTotalUsers] = useState(0);
  const [loading, setLoading] = useState(true);
  const [error, setError] = useState(null);

  useEffect(() => {
    const cacheKey = `page_${currentPage}_limit_${pageSize}`;
    const now = Date.now();
    const cached = userCache[cacheKey];

    if (cached && now - cached.timestamp < CACHE_DURATION) {
      // Serve from cache
      setUsers(cached.data);
      setTotalUsers(cached.total);
      setLoading(false);
      return;
    }

    // Fetch from API and store in cache
    const fetchUsers = async () => {
      setLoading(true);
      setError(null);
      try {
        const res = await fetchData("/users", {
          _page: currentPage,
          _limit: pageSize,
        });
        const data = res.data;
        const total = Number(res.headers["x-total-count"]);
        setUsers(data);
        setTotalUsers(total);
        userCache[cacheKey] = { data, total, timestamp: Date.now() };
      } catch (err) {
        setError(err.message);
      } finally {
        setLoading(false);
      }
    };
    fetchUsers();
  }, [currentPage, pageSize]);

  return { users, totalUsers, loading, error };
};
```
🧩 How It Works – Step by Step
1. Generate a unique key (`page_2_limit_10`) based on the page and page size.
2. Check the cache:
   - If the key exists and the data is fresh (within 5 minutes), use it directly.
   - If not, fetch new data from the API.
3. Update the cache with the new data and a timestamp.
4. Avoid repeated fetches for already-viewed pages within a short time.
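The steps above can also be sketched framework-free, which makes the cache logic easier to test in isolation. This is a minimal sketch, not the hook from the post: `getUsersCached` and the injected `fetchPage` callback are hypothetical names standing in for the real API call.

```js
const cache = {};
const CACHE_DURATION = 5 * 60 * 1000; // 5 minutes

async function getUsersCached(page, limit, fetchPage) {
  const cacheKey = `page_${page}_limit_${limit}`; // step 1: unique key
  const cached = cache[cacheKey];
  if (cached && Date.now() - cached.timestamp < CACHE_DURATION) {
    return cached.result; // step 2: fresh cache hit, no network call
  }
  const result = await fetchPage(page, limit); // step 2: cache miss, hit the API
  cache[cacheKey] = { result, timestamp: Date.now() }; // step 3: store with timestamp
  return result; // step 4: next call within 5 minutes reuses this
}
```

Injecting `fetchPage` as a parameter also makes the function trivial to unit-test with a fake fetcher.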
This was the “aha!” moment for me—simple caching reduced redundant network calls and made the app feel snappier.
💻 Example Usage in Component
Here’s how I use the hook in the UI:
```jsx
const { users, totalUsers, loading, error } = useGetAllUser(currentPage, pageSize);

return (
  <>
    {loading ? <SkeletonLoader /> : <UserGrid users={users} />}
    <Pagination
      current={currentPage}
      total={totalUsers}
      pageSize={pageSize}
      onChange={(page) => setCurrentPage(page)}
    />
  </>
);
```
This is clean and declarative—the hook handles everything behind the scenes.
📘 What I Learned
From this experience, I understood:
- How to think like a caching system
- How to build basic memoization logic
- The trade-offs of manual caching (such as unbounded global memory usage and no automatic invalidation)
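Both trade-offs can be partially patched by hand. As a hypothetical sketch (these helpers are not in the original hook), a size cap addresses unbounded memory, and an explicit invalidation function lets you clear stale pages after a mutation such as creating a user:

```js
const userCache = {};
const MAX_ENTRIES = 50; // cap memory: evict the oldest entry past this size

function setCacheEntry(key, entry) {
  const keys = Object.keys(userCache);
  if (keys.length >= MAX_ENTRIES) {
    // Evict the entry with the oldest timestamp (a naive LRU-like policy)
    const oldest = keys.reduce((a, b) =>
      userCache[a].timestamp <= userCache[b].timestamp ? a : b
    );
    delete userCache[oldest];
  }
  userCache[key] = { ...entry, timestamp: Date.now() };
}

function invalidateUserCache() {
  // Call after a mutation so stale pages get refetched on the next render
  for (const key of Object.keys(userCache)) delete userCache[key];
}
```

Even with these patches, the invalidation is still manual: you have to remember to call it everywhere the data can change, which is exactly the bookkeeping a library can take over.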
Most importantly, I understood why TanStack Query exists—it handles all this, and much more:
- Automatic stale handling
- Background refetching
- DevTools for cache inspection
- Garbage collection
- Seamless React integration
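For comparison, here is a rough sketch of what the same hook might look like with TanStack Query. This assumes v5's `useQuery` and `keepPreviousData` from `@tanstack/react-query`, plus the same `fetchData` helper from earlier; it is an illustration, not a drop-in replacement:

```js
import { useQuery, keepPreviousData } from "@tanstack/react-query";

export const useGetAllUser = (currentPage, pageSize) =>
  useQuery({
    // The query key plays the role of our manual cacheKey
    queryKey: ["users", currentPage, pageSize],
    queryFn: async () => {
      const res = await fetchData("/users", {
        _page: currentPage,
        _limit: pageSize,
      });
      return { users: res.data, totalUsers: Number(res.headers["x-total-count"]) };
    },
    staleTime: 5 * 60 * 1000, // plays the role of CACHE_DURATION
    placeholderData: keepPreviousData, // keeps the old page visible while fetching
  });
```

All the state variables, the cache object, and the freshness check disappear; the library keys, stores, and invalidates the data for you.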