Practical recipes for blazing-fast lookups in the browser and Node — keys, eviction (TTL/LRU), invalidation, memory safety, and TypeScript-friendly utilities.

Introduction
You fetch a user by ID five times in one minute. The data barely changes, yet your app keeps hitting the network and re-rendering. That’s wasted time and money. The good news? A tiny, boring tool fixes most of this: an object as a cache.
Objects are lightning-fast for key lookups and already everywhere in JS. With a few patterns — key normalization, TTL, LRU, invalidation, and safe serialization — you can turn a plain object into a reliable cache layer that speeds up UIs and reduces load on APIs.
This post is a practical guide to using Objects (and when to prefer Map/WeakMap) as caches in the browser and Node, with production-ready snippets, React hooks, and TypeScript ergonomics.
1) Why an Object Cache?
- O(1) average lookup: constant-time access by key.
- Zero dependencies: good defaults work with plain JS.
- Tiny footprint: minimal overhead vs. many cache libraries.
- Easy to instrument: you control storage, eviction, and metrics.
Rule of thumb: If you can compute a stable string key for a value and your dataset is modest in size, an object cache is often perfect.
2) Objects vs. Map vs. WeakMap (Pick the Right Container)
Object {}
- Keys: strings & symbols (numbers become strings).
- Great for simple, serializable keys (IDs, routes, composite strings).
- Works with JSON easily (debug/logging).
Map
- Keys: any value (objects, functions).
- Maintains insertion order, iterable by default.
- Better for many entries, frequent adds/removes; performance is typically more predictable than objects for large caches.
WeakMap
- Keys: objects only, held weakly (no memory leaks if key becomes unreachable).
- Perfect for caching derived data about objects (AST nodes, DOM nodes, class instances).
- Not iterable (by design).
Guideline:
- Have string keys → Object or Map (prefer Map if size grows or you need iteration).
- Have object keys → WeakMap.
- Need LRU → Map is usually simpler (because of ordered iteration), but you can still implement LRU over objects.
3) The Smallest Useful Object Cache
const cache = Object.create(null); // null-prototype: safer dictionary

function getUserCached(id, fetchUser) {
  if (cache[id]) return cache[id]; // hit
  const promise = fetchUser(id);
  cache[id] = promise; // store the Promise immediately for de-duping
  promise.catch(() => {
    // Drop failed entries so the next call can retry.
    // (Keep the rejected promise instead if you'd rather avoid a request stampede.)
    delete cache[id];
  });
  return promise;
}
Why this works:
- De-dupes concurrent requests: store the Promise immediately; all callers await the same one.
- Null-prototype avoids __proto__ pitfalls and accidental prototype hits.
4) Key Design: Make Collisions Impossible
A) Single primitive key
cache[userId] = value; // OK when userId is a stable string/number
B) Composite keys
function keyUser(id, fields) {
  // Copy before sorting so we don't mutate the caller's array
  return `user:${id}|fields:${[...fields].sort().join(',')}`;
}

cache[keyUser(42, ['name', 'email'])] = data;
C) Request key from config
function stableKey(config) {
  // Sort keys for stability. Note: a replacer array filters keys at every depth,
  // so this is best suited to flat config objects.
  return 'cfg:' + JSON.stringify(config, Object.keys(config).sort());
}
Tips
- Use a clear namespace prefix ("user:", "post:") to avoid cross-feature collisions.
- For objects, stable stringify: sort keys to ensure the same string for equivalent configs.
- If keys can be large, hash the stable string (e.g., SHA-1) and store ns:hash; a sketch follows.
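For example, a sketch of the hashing idea using the Web Crypto API (the global crypto.subtle is available in secure browser contexts and in Node 18+; hashedKey is a name made up here):

async function hashedKey(ns, longKey) {
  const bytes = new TextEncoder().encode(longKey);
  const digest = await crypto.subtle.digest('SHA-1', bytes); // key shortening only, not security
  const hex = [...new Uint8Array(digest)]
    .map(b => b.toString(16).padStart(2, '0'))
    .join('');
  return `${ns}:${hex}`; // e.g., "cfg:<40-char hex digest>"
}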
5) TTL Cache (Time-to-Live) for Freshness
const cacheTTL = Object.create(null);

function setTTL(key, value, ttlMs) {
  cacheTTL[key] = { value, expires: Date.now() + ttlMs };
}

function getTTL(key) {
  const entry = cacheTTL[key];
  if (!entry) return undefined;
  if (Date.now() > entry.expires) {
    delete cacheTTL[key];
    return undefined;
  }
  return entry.value;
}
When to use:
- API responses that expire after a known time (e.g., 60s).
- Feature flags, small lists, public resources.
Gotcha:
- TTL eviction runs on read. For long-lived apps, add a periodic sweep (setInterval) or sweep during writes to prevent memory growth; a sketch follows.
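A possible sweep over the cacheTTL store above (startSweep is a name made up here):

function startSweep(intervalMs = 60_000) {
  const timer = setInterval(() => {
    const now = Date.now();
    for (const k in cacheTTL) {
      if (now > cacheTTL[k].expires) delete cacheTTL[k];
    }
  }, intervalMs);
  timer.unref?.(); // Node: don't keep the process alive; no-op in browsers
  return () => clearInterval(timer); // call to stop sweeping
}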
6) LRU (Least Recently Used) Cache Over an Object
Objects don’t preserve order. We can maintain an auxiliary doubly linked list — but that’s tedious. If you can, use Map for LRU since it remembers insertion order:
class LRU {
  constructor(limit = 100) {
    this.limit = limit;
    this.map = new Map();
  }
  get(key) {
    if (!this.map.has(key)) return undefined;
    const val = this.map.get(key);
    this.map.delete(key);
    this.map.set(key, val); // move to most recent
    return val;
  }
  set(key, val) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, val);
    if (this.map.size > this.limit) {
      const firstKey = this.map.keys().next().value; // least recent
      this.map.delete(firstKey);
    }
  }
  has(key) { return this.map.has(key); }
  delete(key) { return this.map.delete(key); }
  clear() { this.map.clear(); }
}
If you must use an object only, keep a parallel array/queue of keys and evict oldest — acceptable for small limits (< few hundred), but Map is cleaner.
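If you do go object-only, here is a rough sketch of that key-queue idea (note it evicts the oldest insert, i.e., FIFO rather than true LRU):

const store = Object.create(null);
const order = []; // keys in insertion order
const LIMIT = 100;

function setWithEviction(key, value) {
  if (!(key in store)) order.push(key);
  store[key] = value;
  while (order.length > LIMIT) {
    delete store[order.shift()]; // evict the oldest insert
  }
}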
7) Stale-While-Revalidate (SWR) in 20 Lines
Show cached data immediately (stale), then refresh in the background and update cache.
const cacheSWR = Object.create(null);

async function swr(key, fetcher, ttlMs = 60_000) {
  const entry = cacheSWR[key];
  const now = Date.now();
  if (entry && entry.value && (now - entry.ts) < ttlMs) {
    // Serve the cached value now and refresh in the background.
    // entry.refresh de-dupes concurrent background refreshes.
    entry.refresh ??= fetcher()
      .then(v => {
        cacheSWR[key] = { value: v, ts: Date.now() };
        entry.refresh = null;
        return v;
      })
      .catch(() => { entry.refresh = null; });
    return entry.value;
  }
  // No entry or too stale: fetch fresh and cache (blocking)
  const value = await fetcher();
  cacheSWR[key] = { value, ts: now };
  return value;
}
UX win: near-instant renders with eventual freshness.
8) Promise De-Duping (Avoid Request Stampedes)
Store promises as values so simultaneous calls share work:
const inflight = Object.create(null);

function dedup(key, makePromise) {
  if (inflight[key]) return inflight[key];
  inflight[key] = makePromise().finally(() => { delete inflight[key]; });
  return inflight[key];
}
Use dedup inside your fetch wrappers to guard endpoints.
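For example, a thin wrapper over a GET endpoint (the URL shape is illustrative):

async function getUser(id) {
  return dedup(`user:${id}`, () =>
    fetch(`/api/users/${id}`).then(r => r.json())
  );
}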
9) Cache Invalidation (The Hard Part)
- Direct invalidation:
delete cache[key]
after you mutate the backing data. - Tag-based invalidation: keep groups of keys under a tag; delete them together.
const byTag = Object.create(null);

function tagKey(tag, key) {
  (byTag[tag] ||= new Set()).add(key);
}

function invalidateTag(tag) {
  for (const k of byTag[tag] || []) delete cache[k];
  byTag[tag]?.clear();
}
- Version pinning: include a schema/version number in the key namespace ("v2:user:42"). Bump it to nuke old values automatically.
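A minimal sketch of version pinning (CACHE_VERSION and vkey are names made up here):

const CACHE_VERSION = 'v2'; // bump when the cached shape changes

function vkey(ns, id) {
  return `${CACHE_VERSION}:${ns}:${id}`;
}

cache[vkey('user', 42)] = data; // stored as "v2:user:42"; stale "v1:..." entries are never read again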
10) React Hooks: useObjectCache and SWR-style Hook
import { useEffect, useRef, useState } from "react";

function useObjectCache<T>(key: string, fetcher: () => Promise<T>, ttl = 30_000) {
  const storeRef = useRef<{ [k: string]: { v: T; ts: number } }>(Object.create(null));
  const store = storeRef.current;
  const [data, setData] = useState<T | null>(() => {
    const e = store[key];
    return e && (Date.now() - e.ts < ttl) ? e.v : null;
  });
  useEffect(() => {
    let cancelled = false;
    const e = store[key];
    if (e && (Date.now() - e.ts < ttl)) {
      setData(e.v); // key may have changed: sync state to the fresh cached entry
      return;
    }
    fetcher().then(v => {
      store[key] = { v, ts: Date.now() };
      if (!cancelled) setData(v);
    });
    return () => { cancelled = true; };
  }, [key, ttl, fetcher, store]);
  return data;
}
Notes:
- The hook keeps a component-local object cache (in a ref). For app-wide caches, lift the store to module scope or context.
- Add a refresh() function for manual invalidation (see the sketch below).
- For SWR, return stale immediately and start a background fetch; update state on arrival.
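One possible shape for the refresh() idea, as a sketch (useCachedValue and sharedStore are hypothetical names; the store is module-level, so the cache is shared app-wide):

import { useCallback, useEffect, useRef, useState } from "react";

const sharedStore: Record<string, { v: unknown; ts: number }> = Object.create(null);

export function useCachedValue<T>(key: string, fetcher: () => Promise<T>, ttl = 30_000) {
  const fetcherRef = useRef(fetcher);
  fetcherRef.current = fetcher; // always use the latest fetcher without re-running effects
  const [data, setData] = useState<T | null>(() => {
    const e = sharedStore[key];
    return e && Date.now() - e.ts < ttl ? (e.v as T) : null;
  });
  const refresh = useCallback(() => {
    return fetcherRef.current().then(v => {
      sharedStore[key] = { v, ts: Date.now() };
      setData(v);
      return v;
    });
  }, [key]);
  useEffect(() => {
    const e = sharedStore[key];
    if (e && Date.now() - e.ts < ttl) setData(e.v as T); // key changed: sync to cache
    else void refresh();
  }, [key, ttl, refresh]);
  return { data, refresh };
}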
11) TypeScript Ergonomics
A) Typed cache interface
type Cache<T> = Record<string, T>;

function makeCache<T>() {
  const store: Cache<T> = Object.create(null);
  return {
    get: (k: string) => store[k] as T | undefined,
    set: (k: string, v: T) => (store[k] = v),
    has: (k: string) => Object.hasOwn(store, k),
    del: (k: string) => delete store[k],
    clear: () => Object.keys(store).forEach(k => delete store[k]),
  };
}
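Usage looks like this (the User type is just an example):

type User = { id: string; name: string };

const users = makeCache<User>();
users.set('user:42', { id: '42', name: 'Ada' });
const hit = users.get('user:42'); // User | undefined
users.del('user:42');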
B) Generic memoize with object cache
export function memoize<A extends any[], R>(
  fn: (...args: A) => R,
  keyFn: (...args: A) => string = (...args) => JSON.stringify(args)
) {
  const cache: Record<string, R> = Object.create(null);
  return (...args: A): R => {
    const k = keyFn(...args);
    return Object.hasOwn(cache, k) ? cache[k] : (cache[k] = fn(...args));
  };
}
12) Persistence: Caching Across Page Loads
When to persist: only if data is expensive to recompute and safe to store. Consider staleness and privacy.
A) localStorage layer (small & fast)
function loadCache(key) {
  try {
    const raw = localStorage.getItem(key);
    return raw ? JSON.parse(raw) : Object.create(null);
  } catch {
    return Object.create(null); // corrupt JSON: start fresh
  }
}

function saveCache(key, obj) {
  localStorage.setItem(key, JSON.stringify(obj));
}
Gotchas:
- localStorage is synchronous; debounce writes (a sketch follows the list).
- Don't store secrets; consider IndexedDB for large blobs.
- Persist with a versioned envelope ({ v, ts, data }) and TTL.
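A minimal debounced-save sketch on top of saveCache above (saveCacheDebounced is a name made up here):

let saveTimer;

function saveCacheDebounced(key, obj, delayMs = 500) {
  clearTimeout(saveTimer);
  saveTimer = setTimeout(() => saveCache(key, obj), delayMs); // coalesce bursts into one write
}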
B) IndexedDB for larger caches
Keep object cache in memory, periodically mirror to IndexedDB asynchronously (Dexie/idb libraries help). On boot, hydrate the in-memory cache from disk.
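A rough sketch with the idb helper library (assuming its openDB API; the database and store names are made up):

import { openDB } from 'idb';

const dbp = openDB('app-cache', 1, {
  upgrade(db) { db.createObjectStore('kv'); },
});

// On boot: hydrate the in-memory object cache from disk.
async function hydrate(cache) {
  const db = await dbp;
  let cursor = await db.transaction('kv').store.openCursor();
  while (cursor) {
    cache[String(cursor.key)] = cursor.value;
    cursor = await cursor.continue();
  }
}

// On write (or periodically): mirror an entry to disk.
async function mirror(key, value) {
  const db = await dbp;
  await db.put('kv', value, key);
}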
13) Concurrency: Avoid Races and Thundering Herds
- Lock per key: use the in-flight promise trick (Section 8).
- Write-after-read hazard: if two updates for the same key race, store a monotonically increasing epoch:
const epoch = Object.create(null);

async function writeLatest(key, producer) {
  const e = (epoch[key] = (epoch[key] || 0) + 1);
  const val = await producer();
  if (epoch[key] === e) cache[key] = val; // only the latest write wins
}
- Stale writes: include an ETag/version in your key or value and reject older updates.
14) Observability: Metrics & Debugging
Add optional counters:
const stats = { hit: 0, miss: 0, set: 0, evict: 0 };

function getWithStats(key) {
  if (Object.hasOwn(cache, key)) { stats.hit++; return cache[key]; }
  stats.miss++;
  return undefined;
}
Expose a debug console command to inspect current keys and sizes. Log evictions to validate TTL/LRU behavior.
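For example, a dev-only console helper (the __cacheDebug name is made up; wire it to your own cache and stats):

// Dev-only: call __cacheDebug() in the browser or Node console
Object.assign(globalThis, {
  __cacheDebug: () => ({
    keys: Object.keys(cache),
    size: Object.keys(cache).length,
    stats: { ...stats },
  }),
});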
15) Security & Safety
- Prototype pollution: use null-prototype objects for caches and never merge untrusted keys directly. Guard keys:
const BLOCKED = new Set(['__proto__', 'prototype', 'constructor']);

function assertSafeKey(key) {
  if (BLOCKED.has(key)) throw new Error('Unsafe key: ' + key);
}
- Memory leaks: caches grow unless you evict. Use TTL or LRU, or cap size.
- Sensitive data: don't cache secrets in long-lived stores (especially not localStorage).
16) Performance Notes
- Lookups are cheap; the expensive parts are serialization and fetches.
- If key computation is heavy (e.g., stable stringify), memoize the key or precompute once.
- Avoid writing to localStorage on every update; debounce (300–1000ms).
- For massive hot paths, prefer Map; it's highly optimized and iterable.
17) Real-World Patterns You’ll Reuse
A) API GET cache with TTL + dedupe
const cache = Object.create(null);
const inflight = Object.create(null);

async function getJson(url, ttl = 30_000) {
  const k = `GET:${url}`;
  const e = cache[k];
  const now = Date.now();
  if (e && now - e.ts < ttl) return e.data; // fresh
  if (inflight[k]) return inflight[k]; // dedupe
  inflight[k] = fetch(url)
    .then(r => r.json())
    .then(data => (cache[k] = { data, ts: Date.now() }).data)
    .finally(() => { delete inflight[k]; });
  return inflight[k];
}
B) Function result memoization (pure functions)
const fib = memoize((n) => n < 2 ? n : fib(n-1) + fib(n-2));
C) Cache + invalidation on mutation
async function updateUser(id, patch) {
  const res = await fetch(`/api/users/${id}`, { method: 'PATCH', body: JSON.stringify(patch) });
  const updated = await res.json();
  cache[`user:${id}`] = updated; // write-through
  // Or invalidate related lists:
  invalidateTag('users:list');
  return updated;
}
D) Component-local derivation cache with WeakMap
const derived = new WeakMap(); // key: object, value: computed data

function getDerived(user) {
  let d = derived.get(user);
  if (!d) {
    d = expensiveCompute(user);
    derived.set(user, d);
  }
  return d;
}
E) SWR for settings
function getSettings() {
  return swr('settings', () => fetch('/api/settings').then(r => r.json()), 60_000);
}
18) Testing Your Cache
- Hit/miss behavior: first call miss, second call hit.
- TTL expiry: simulate time advance (use fake timers; see the sketch after this list).
- LRU eviction: exceed limit and assert oldest key disappears.
- Concurrent dedupe: call the same key twice; ensure fetcher only runs once.
- Invalidation: after mutation, read reflects new value.
- WeakMap GC: can’t test directly, but ensure you never retain keys in arrays/sets.
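For instance, a TTL-expiry test sketch with Vitest fake timers (assuming the makeTTL utility from Section 19; the import path is hypothetical, and Jest's modern fake timers work similarly):

import { expect, it, vi } from 'vitest';
import { makeTTL } from './cache';

it('expires entries after the TTL', () => {
  vi.useFakeTimers();
  vi.setSystemTime(0); // pin the mocked clock
  const ttl = makeTTL<string>();
  ttl.set('k', 'v', 1000); // expires at t = 1000
  expect(ttl.get('k')).toBe('v'); // fresh hit
  vi.setSystemTime(1001); // jump past expiry
  expect(ttl.get('k')).toBeUndefined(); // expired miss
  vi.useRealTimers();
});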
19) Copy-Paste Utilities (Batteries Included)
// null-prototype record
export const dict = <T>() => Object.create(null) as Record<string, T>;

// TTL cache
export function makeTTL<T>() {
  const store = dict<{ v: T; exp: number }>();
  return {
    get(k: string) {
      const e = store[k];
      if (!e) return undefined;
      if (Date.now() > e.exp) { delete store[k]; return undefined; }
      return e.v;
    },
    set(k: string, v: T, ttlMs: number) {
      store[k] = { v, exp: Date.now() + ttlMs };
    },
    delete(k: string) { delete store[k]; },
    clear() { for (const k in store) delete store[k]; },
  };
}

// In-flight deduper
export function makeDeduper<R>() {
  const inflight = dict<Promise<R>>();
  return (key: string, run: () => Promise<R>) => {
    if (inflight[key]) return inflight[key];
    inflight[key] = run().finally(() => { delete inflight[key]; });
    return inflight[key];
  };
}
// Stable key from object (sorted top-level keys; a replacer array filters keys
// at every depth, so this suits flat config objects)
export function stableKey(obj: object) {
  const keys = Object.keys(obj).sort();
  return JSON.stringify(obj, keys);
}
Conclusion
Using objects as caches is a small change with outsized payoff:
- Speed: instant O(1) lookups and fewer network calls.
- Simplicity: tiny utilities you control and understand.
- Correctness: with TTL/LRU, dedupe, and invalidation, caches stay fresh and small.
- Ergonomics: clean React hooks, TypeScript types, and predictable keys.
Pro Tip: Start with a null-prototype object cache + Promise dedupe. Add TTL for freshness. If your cache grows or you need ordering, switch to Map (or WeakMap for object keys). Persist only what’s safe — and test the edge cases (expiry, eviction, races).
Call to Action
How are you caching today — plain object, Map LRU, or a library?
Share your pattern (and a gotcha you’ve hit) in the comments. If this helped, bookmark it and send to a teammate who’s still re-fetching the same data on every render.