hard · Senior Backend Engineer · E-commerce
How would you design a caching strategy for a high-traffic web application with stale data tolerance?
Posted 18/04/2026
by Mehedy Hasan Ador
Question Details
At an e-commerce company interview:
> "Our product listing page gets 50K requests/sec. The database can handle 5K queries/sec. Products update every 15 minutes via a sync job. Design a caching strategy that handles this traffic with acceptable staleness."
Suggested Solution
Multi-Layer Caching Architecture
```
Browser Cache → CDN Cache → Application Cache (Redis) → Database
   (client)      (edge)           (server)              (source)
```
Layer 1: Browser Cache
```ts
// Static product data — cache aggressively
app.get("/api/products", (req, res) => {
  res.set("Cache-Control", "public, max-age=300, stale-while-revalidate=600");
  // Fresh for 5 min, serve stale for up to 10 min while revalidating
  res.json(products);
});
```
Layer 2: CDN Cache (Cloudflare/Fastly)
Rules:
- `/api/products/*` → cache 5 min, serve stale 10 min
- `/api/products/[id]` → cache 10 min, serve stale 15 min
- `/api/search` → no cache (dynamic)

```ts
// Surrogate-Key for targeted invalidation
res.set("Surrogate-Key", "products product-123");
// When product 123 updates: PURGE product-123 from CDN
```
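As a rough sketch of how that purge might be issued, the helper below builds a request against Fastly's surrogate-key purge endpoint. The service ID and API key shown are placeholders, and the exact endpoint shape should be checked against your CDN's documentation:

```typescript
// Hypothetical helper: build a surrogate-key purge request for Fastly.
// serviceId and apiKey are placeholders, not real credentials.
function buildPurgeRequest(serviceId: string, apiKey: string, surrogateKey: string) {
  return {
    method: "POST" as const,
    url: `https://api.fastly.com/service/${serviceId}/purge/${surrogateKey}`,
    headers: { "Fastly-Key": apiKey, Accept: "application/json" },
  };
}

// After the 15-minute sync job updates product 123:
const purgeReq = buildPurgeRequest("my-service-id", "my-api-key", "product-123");
// fetch(purgeReq.url, { method: purgeReq.method, headers: purgeReq.headers })
```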
Layer 3: Redis Application Cache
```ts
import Redis from "ioredis";

class CacheService {
  private redis: Redis;

  constructor(redis: Redis) {
    this.redis = redis;
  }

  async get<T>(key: string, fetcher: () => Promise<T>, ttl = 300): Promise<T> {
    const cached = await this.redis.get(key);
    if (cached) {
      const parsed = JSON.parse(cached);
      // Stale-while-revalidate: past the logical expiry,
      // serve the stale copy and refresh in the background
      if (parsed.expiresAt < Date.now()) {
        this.refreshInBackground(key, fetcher, ttl);
      }
      return parsed.data as T;
    }
    // Cache miss — fetch and store
    const data = await fetcher();
    await this.set(key, data, ttl);
    return data;
  }

  private refreshInBackground<T>(key: string, fetcher: () => Promise<T>, ttl: number) {
    // Fire-and-forget: the caller is never blocked on the refresh
    fetcher()
      .then((data) => this.set(key, data, ttl))
      .catch(console.error);
  }

  private async set<T>(key: string, data: T, ttl: number) {
    await this.redis.set(
      key,
      JSON.stringify({ data, expiresAt: Date.now() + ttl * 1000 }),
      "EX",
      ttl * 2 // Redis TTL = 2x logical TTL, so stale entries survive long enough to serve
    );
  }
}
```
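To make the stale-while-revalidate bookkeeping easy to trace, here is the same logic as a synchronous, in-memory sketch: a `Map` stands in for Redis, and the clock is passed in explicitly. The names (`MemorySwrCache`, `fetchProducts`) are illustrative, not from the post:

```typescript
type Entry<T> = { data: T; expiresAt: number };

class MemorySwrCache {
  private store = new Map<string, Entry<unknown>>();

  get<T>(key: string, fetcher: () => T, ttlMs: number, now: number): T {
    const entry = this.store.get(key) as Entry<T> | undefined;
    if (entry) {
      if (entry.expiresAt < now) {
        // Stale: serve the old value, but refresh for the next caller
        this.store.set(key, { data: fetcher(), expiresAt: now + ttlMs });
      }
      return entry.data;
    }
    const data = fetcher(); // miss: fetch from source and store
    this.store.set(key, { data, expiresAt: now + ttlMs });
    return data;
  }
}

const cache = new MemorySwrCache();
let version = 1;
const fetchProducts = () => `catalog-v${version++}`;

const a = cache.get("products", fetchProducts, 300_000, 0);       // miss → "catalog-v1"
const b = cache.get("products", fetchProducts, 300_000, 100_000); // fresh → "catalog-v1"
const c = cache.get("products", fetchProducts, 300_000, 400_000); // stale → still "catalog-v1", refreshed behind the scenes
const d = cache.get("products", fetchProducts, 300_000, 450_000); // next call sees "catalog-v2"
```

Note how the stale read at `400_000` still returns the old value instantly; only the *next* reader pays nothing and sees the refreshed data, which is exactly the "never block users on cache refresh" property.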
Cache Invalidation Strategies
The 3 Golden Rules
1. Cache at the right layer — browser for static, CDN for semi-static, Redis for dynamic
2. Use stale-while-revalidate — never block users on cache refresh
3. Invalidate smartly — tag-based for bulk, event-based for single items
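Rule 3 can be sketched with a small in-memory tag index (tag → keys) maintained alongside the cache; in production the same idea maps onto CDN surrogate keys or Redis sets. The class and method names here are illustrative:

```typescript
class TaggedCache {
  private values = new Map<string, unknown>();
  private tagIndex = new Map<string, Set<string>>(); // tag → keys carrying it

  set(key: string, value: unknown, tags: string[]) {
    this.values.set(key, value);
    for (const tag of tags) {
      if (!this.tagIndex.has(tag)) this.tagIndex.set(tag, new Set());
      this.tagIndex.get(tag)!.add(key);
    }
  }

  get(key: string) {
    return this.values.get(key);
  }

  // Bulk invalidation: drop every key carrying the tag
  invalidateTag(tag: string) {
    for (const key of this.tagIndex.get(tag) ?? []) this.values.delete(key);
    this.tagIndex.delete(tag);
  }
}

const tc = new TaggedCache();
tc.set("product:123", { name: "Widget" }, ["products", "product-123"]);
tc.set("product:456", { name: "Gadget" }, ["products"]);

// Event-based: the sync job updated product 123, purge just that item
tc.invalidateTag("product-123");
```

Purging the broad `products` tag instead would drop every listing entry at once, which is the bulk path for a full catalog resync.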
Traffic Flow at Scale
```
50,000 req/sec incoming
  → 45,000 served from CDN      (90%)
  →  4,500 served from Redis     (9%)
  →    500 served from Database  (1%)
```
Database load: 500 qps (well within 5K capacity) ✅
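The numbers above follow from two assumed hit rates: 90% at the CDN and 90% of the remainder at Redis. A quick sanity check of that arithmetic:

```typescript
// Derive each layer's load from the incoming rate and assumed hit rates.
function layerLoads(incoming: number, cdnHitRate: number, redisHitRate: number) {
  const cdn = incoming * cdnHitRate;          // absorbed at the edge
  const redis = (incoming - cdn) * redisHitRate; // absorbed by the app cache
  const db = incoming - cdn - redis;          // what actually reaches the database
  return { cdn, redis, db };
}

const loads = layerLoads(50_000, 0.9, 0.9);
// → { cdn: 45000, redis: 4500, db: 500 }; the database stays well under 5K qps
```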