Caching as a Distributed Systems Problem in Frontend Apps
February 26, 2026
Browser cache, CDN cache, Next.js cache, React cache — when you have four caches between your user and your data, you have a distributed systems problem. Cache coherence strategies for the modern web stack.
The Caching Stack
A single request to a Next.js application can traverse multiple caching layers:
┌─────────────────────────────────────────────────────────────────┐
│ The Frontend Caching Stack │
├─────────────────────────────────────────────────────────────────┤
│ │
│ User Request │
│ │ │
│ ▼ │
│ ┌─────────────────────────────────────────────┐ │
│ │ Layer 1: Browser Cache │ │
│ │ • HTTP Cache (Cache-Control, ETag) │ │
│ │ • Service Worker Cache │ │
│ │ • Memory Cache (bfcache) │ │
│ │ • React Query / SWR Cache │ │
│ └────────────────────┬────────────────────────┘ │
│ │ Cache MISS │
│ ▼ │
│ ┌─────────────────────────────────────────────┐ │
│ │ Layer 2: CDN Edge Cache │ │
│ │ • Vercel Edge Network │ │
│ │ • Cloudflare │ │
│ │ • Fastly │ │
│ │ • Regional PoPs worldwide │ │
│ └────────────────────┬────────────────────────┘ │
│ │ Cache MISS │
│ ▼ │
│ ┌─────────────────────────────────────────────┐ │
│ │ Layer 3: Application Cache │ │
│ │ • Next.js Data Cache │ │
│ │ • Next.js Full Route Cache │ │
│ │ • React Cache (request memoization) │ │
│ │ • Incremental Static Regeneration │ │
│ └────────────────────┬────────────────────────┘ │
│ │ Cache MISS │
│ ▼ │
│ ┌─────────────────────────────────────────────┐ │
│ │ Layer 4: Backend/Database Cache │ │
│ │ • Redis / Memcached │ │
│ │ • Database query cache │ │
│ │ • Connection pooling │ │
│ └────────────────────┬────────────────────────┘ │
│ │ Cache MISS │
│ ▼ │
│ ┌─────────────────────────────────────────────┐ │
│ │ Origin: Database │ │
│ └─────────────────────────────────────────────┘ │
│ │
└─────────────────────────────────────────────────────────────────┘
Each layer has different:
- TTL semantics — seconds vs. minutes vs. hours
- Invalidation mechanisms — time-based, tag-based, on-demand
- Consistency guarantees — strong, eventual, stale-while-revalidate
- Scope — per-user, per-region, global
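Several of these knobs can be set from a single response. A minimal sketch (the header names are standard HTTP; the helper name and the assumption that your CDN honors `s-maxage` are mine):

```typescript
// Build a Cache-Control value that gives each layer its own TTL:
// max-age is honored by browsers (Layer 1), while shared caches such as
// CDNs (Layer 2) prefer s-maxage when it is present.
interface LayerTtls {
  browserSeconds: number; // Layer 1 TTL
  cdnSeconds: number; // Layer 2 TTL
}

export function buildCacheControl({ browserSeconds, cdnSeconds }: LayerTtls): string {
  return `public, max-age=${browserSeconds}, s-maxage=${cdnSeconds}`;
}

// Layers 3-4 (application and backend caches) are configured in code
// (revalidate options, Redis TTLs), not via response headers.
```

Used in a route handler, `buildCacheControl({ browserSeconds: 60, cdnSeconds: 300 })` lets the CDN hold a response five times longer than the browser.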
Cache Coherence Problems
The Stale Data Problem
┌─────────────────────────────────────────────────────────────────┐
│ Stale Data Scenario │
├─────────────────────────────────────────────────────────────────┤
│ │
│ Timeline: │
│ │
│ T=0 User A requests /product/123 │
│ └── CDN caches: { price: $100, stock: 50 } │
│ │
│ T=30s Admin updates product │
│ └── Database: { price: $80, stock: 0 } │
│ │
│ T=45s User B requests /product/123 │
│ └── CDN returns cached: { price: $100, stock: 50 } │
│ └── User B purchases "in stock" item at wrong price! │
│ │
│ T=60s CDN cache expires │
│ └── New requests see correct data │
│ │
│ Impact: 30 seconds of incorrect pricing/inventory │
│ │
└─────────────────────────────────────────────────────────────────┘
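The window above compounds when several time-based layers sit in series: a browser can fetch a response just before the CDN entry expires and then hold it for its own max-age, so the worst case is roughly the sum of the TTLs. A small helper (hypothetical name) makes that explicit:

```typescript
// Worst-case visible staleness under purely time-based expiry.
// Each layer can cache a response that the layer below it served at the
// very end of that lower layer's TTL, so the windows add up.
export function worstCaseStalenessSeconds(layerTtlSeconds: number[]): number {
  return layerTtlSeconds.reduce((total, ttl) => total + ttl, 0);
}

// Browser 60s + CDN 60s + Next.js data cache 60s:
// a write can stay invisible to some users for up to 180 seconds.
```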
The Thundering Herd Problem
┌─────────────────────────────────────────────────────────────────┐
│ Thundering Herd Scenario │
├─────────────────────────────────────────────────────────────────┤
│ │
│ Popular product page, 10,000 concurrent users │
│ │
│ T=0 CDN cache expires │
│ │
│ T=0.001 Request 1 → Cache MISS → Origin │
│ T=0.002 Request 2 → Cache MISS → Origin │
│ T=0.003 Request 3 → Cache MISS → Origin │
│ ... │
│ T=0.100 Request 1000 → Cache MISS → Origin │
│ │
│ Origin receives 1000 identical requests simultaneously │
│ └── Database overwhelmed │
│ └── Response time degrades to 5+ seconds │
│ └── Some requests timeout │
│ └── Cascade failure possible │
│ │
│ Solution: Request coalescing / single-flight │
│ │
└─────────────────────────────────────────────────────────────────┘
The Cache Inconsistency Problem
┌─────────────────────────────────────────────────────────────────┐
│ Multi-Layer Inconsistency │
├─────────────────────────────────────────────────────────────────┤
│ │
│ User navigates: /products → /product/123 → /cart │
│ │
│ /products page (CDN cached 5 min ago) │
│ └── Shows: Product 123 - $100, In Stock │
│ │
│ /product/123 page (Fresh from origin) │
│ └── Shows: Product 123 - $80, Out of Stock │
│ │
│ /cart API (React Query cached) │
│ └── Shows: Product 123 - $100, In Stock │
│ │
│ User sees THREE different states of the same data │
│ └── Trust erosion │
│ └── Incorrect purchase decisions │
│ └── Support tickets │
│ │
└─────────────────────────────────────────────────────────────────┘
Browser Cache Strategies
HTTP Cache Headers
// next.config.js - Static asset caching
module.exports = {
async headers() {
return [
{
// Immutable assets (hashed filenames)
source: '/_next/static/:path*',
headers: [
{
key: 'Cache-Control',
value: 'public, max-age=31536000, immutable',
},
],
},
{
// Images with content hash
source: '/images/:path*',
headers: [
{
key: 'Cache-Control',
value: 'public, max-age=31536000, immutable',
},
],
},
{
// HTML pages - short cache with revalidation
source: '/:path*',
headers: [
{
key: 'Cache-Control',
value: 'public, max-age=0, must-revalidate',
},
],
},
{
// API routes - no caching by default
source: '/api/:path*',
headers: [
{
key: 'Cache-Control',
value: 'private, no-cache, no-store, must-revalidate',
},
],
},
];
},
};
ETag-based Validation
// api/products/[id]/route.ts
import { createHash } from 'crypto';
export async function GET(
request: Request,
{ params }: { params: { id: string } }
) {
const product = await db.products.findUnique({
where: { id: params.id },
});
if (!product) {
return new Response('Not Found', { status: 404 });
}
// Generate ETag from content
const etag = createHash('md5')
.update(JSON.stringify(product) + product.updatedAt.toISOString())
.digest('hex');
// Check If-None-Match header
const ifNoneMatch = request.headers.get('If-None-Match');
if (ifNoneMatch === etag) {
return new Response(null, { status: 304 });
}
return Response.json(product, {
headers: {
ETag: etag,
'Cache-Control': 'private, max-age=0, must-revalidate',
'Last-Modified': product.updatedAt.toUTCString(),
},
});
}
Stale-While-Revalidate Pattern
// Client-side SWR implementation
import useSWR from 'swr';
const fetcher = (url: string) => fetch(url).then((r) => r.json());
function ProductPrice({ productId }: { productId: string }) {
const { data, error, isValidating, mutate } = useSWR(
`/api/products/${productId}/price`,
fetcher,
{
// Return stale data immediately
revalidateOnFocus: true,
revalidateOnReconnect: true,
// Background revalidation interval
refreshInterval: 30000, // 30 seconds
// Dedupe requests within window
dedupingInterval: 5000,
// Keep previous data while revalidating
keepPreviousData: true,
// Error retry configuration
errorRetryCount: 3,
errorRetryInterval: 1000,
// Callback when data changes
onSuccess: (data, key, config) => {
// Track price changes for analytics
if (data.priceChanged) {
analytics.track('price_updated', { productId });
}
},
}
);
return (
<div>
<span className={isValidating ? 'opacity-50' : ''}>
${data?.price ?? '--'}
</span>
{isValidating && <Spinner size="sm" />}
</div>
);
}
CDN Cache Strategies
Vercel Edge Cache Configuration
// app/products/[id]/page.tsx
import { unstable_cache } from 'next/cache';
// Cache in the Next.js Data Cache, with tags for invalidation
async function getProduct(id: string) {
const product = await db.products.findUnique({
where: { id },
});
return product;
}
const getCachedProduct = unstable_cache(
getProduct,
['product'],
{
tags: ['products'], // For bulk invalidation
revalidate: 60, // Revalidate every 60 seconds
}
);
export async function generateMetadata({ params }: { params: { id: string } }) {
const product = await getCachedProduct(params.id);
return {
title: product?.name,
};
}
export default async function ProductPage({ params }: { params: { id: string } }) {
const product = await getCachedProduct(params.id);
return <ProductDetails product={product} />;
}
// Segment-level cache configuration
export const revalidate = 60; // ISR: regenerate every 60 seconds
export const dynamic = 'force-static'; // Prefer static generation
Cache Tags and Targeted Invalidation
// lib/cache.ts
import { revalidateTag, revalidatePath } from 'next/cache';
import { unstable_cache } from 'next/cache';
/**
* Hierarchical cache tag system
*/
export function createCacheTags(entity: string, id?: string): string[] {
const tags = [entity]; // e.g., 'products'
if (id) {
tags.push(`${entity}:${id}`); // e.g., 'products:123'
}
return tags;
}
/**
* Cached data fetcher with automatic tagging
*/
export function cachedFetch<T>(
key: string,
fetcher: () => Promise<T>,
options: {
tags: string[];
revalidate?: number;
}
) {
return unstable_cache(fetcher, [key], {
tags: options.tags,
revalidate: options.revalidate ?? 60,
});
}
/**
* Invalidation strategies
*/
export async function invalidateProduct(productId: string) {
// Invalidate specific product
revalidateTag(`products:${productId}`);
// Also invalidate lists that include this product
revalidateTag('products:list');
// Invalidate related paths
revalidatePath(`/products/${productId}`);
}
export async function invalidateAllProducts() {
// Nuclear option - invalidates everything tagged 'products'
revalidateTag('products');
}
export async function invalidateCategory(categoryId: string) {
revalidateTag(`categories:${categoryId}`);
revalidateTag('products:list'); // Category change affects product lists
}
// Webhook handler for CMS updates
export async function handleCMSWebhook(payload: CMSWebhookPayload) {
switch (payload.event) {
case 'product.updated':
await invalidateProduct(payload.data.id);
break;
case 'product.deleted':
await invalidateProduct(payload.data.id);
await invalidateAllProducts(); // Lists need refresh
break;
case 'category.updated':
await invalidateCategory(payload.data.id);
break;
case 'global.settings.updated':
// Purge everything tagged 'site' (assumes all cached fetches carry this tag)
revalidateTag('site');
break;
}
}
Surrogate Keys (Fastly/Cloudflare)
// Custom cache key headers for CDN
export async function GET(request: Request) {
const product = await getProduct(request);
return Response.json(product, {
headers: {
// Fastly Surrogate-Key
'Surrogate-Key': `product-${product.id} category-${product.categoryId} products`,
// Cloudflare Cache-Tag
'Cache-Tag': `product-${product.id},category-${product.categoryId},products`,
// CDN cache duration
'CDN-Cache-Control': 'max-age=3600',
// Browser cache duration (shorter)
'Cache-Control': 'max-age=60, stale-while-revalidate=300',
},
});
}
// Purge API
async function purgeFromCDN(tags: string[]) {
// Fastly
await fetch(`https://api.fastly.com/service/${SERVICE_ID}/purge`, {
method: 'POST',
headers: {
'Fastly-Key': FASTLY_API_KEY,
'Content-Type': 'application/json',
},
body: JSON.stringify({ surrogate_keys: tags }),
});
// OR Cloudflare
await fetch(
`https://api.cloudflare.com/client/v4/zones/${ZONE_ID}/purge_cache`,
{
method: 'POST',
headers: {
Authorization: `Bearer ${CF_API_TOKEN}`,
'Content-Type': 'application/json',
},
body: JSON.stringify({ tags }),
}
);
}
Next.js Cache Architecture
The Four Caches
┌─────────────────────────────────────────────────────────────────┐
│ Next.js Internal Caches │
├─────────────────────────────────────────────────────────────────┤
│ │
│ 1. Request Memoization (React Cache) │
│ └── Scope: Single request │
│ └── Dedupes identical fetch() calls │
│ └── Automatic in Server Components │
│ │
│ 2. Data Cache │
│ └── Scope: Cross-request, persistent │
│ └── Stores fetch() results │
│ └── Controlled by revalidate option │
│ │
│ 3. Full Route Cache │
│ └── Scope: Build time + ISR │
│ └── Stores rendered HTML + RSC payload │
│ └── Controlled by dynamic/revalidate │
│ │
│ 4. Router Cache (Client) │
│ └── Scope: Browser session │
│ └── Stores visited route RSC payloads │
│ └── Prefetched route segments │
│ │
└─────────────────────────────────────────────────────────────────┘
Controlling Cache Behavior
// app/products/[id]/page.tsx
// 1. Request Memoization - automatic deduplication
// fetch() calls are memoized automatically; plain database queries must be
// wrapped in React's cache() to get the same behavior.
import { cache } from 'react';
const getProduct = cache(async (id: string) =>
  db.products.findUnique({ where: { id } })
);
async function ProductPage({ params }: { params: { id: string } }) {
  // These three calls are deduped into ONE database query
  const product = await getProduct(params.id);
  const productForMeta = await getProduct(params.id); // Deduped
  const productForBreadcrumb = await getProduct(params.id); // Deduped
return <div>{/* ... */}</div>;
}
// 2. Data Cache - fetch options
async function getProductWithCacheControl(id: string) {
  // Default behavior: cached in Next.js 14, uncached in Next.js 15+
  const res1 = await fetch(`${API}/products/${id}`);
// Cache for 60 seconds
const res2 = await fetch(`${API}/products/${id}`, {
next: { revalidate: 60 },
});
// No cache (always fresh)
const res3 = await fetch(`${API}/products/${id}`, {
cache: 'no-store',
});
// Cache with tags for invalidation
const res4 = await fetch(`${API}/products/${id}`, {
next: {
revalidate: 3600,
tags: [`product-${id}`],
},
});
}
// 3. Full Route Cache - page-level control
// (shown together for reference; these options conflict, pick per route)
export const revalidate = 60; // ISR every 60 seconds
export const dynamic = 'force-dynamic'; // Disable full route cache
export const fetchCache = 'force-no-store'; // Disable data cache
// 4. Router Cache - client-side invalidation
// (separate file: a client component cannot share a file with the
// async Server Component above)
'use client';
import { useRouter } from 'next/navigation';
function RefreshButton() {
const router = useRouter();
return (
<button onClick={() => router.refresh()}>
Refresh Data
</button>
);
}
Cache Debugging
// middleware.ts - Cache header inspection
import { NextResponse, type NextRequest } from 'next/server';
export function middleware(request: NextRequest) {
const response = NextResponse.next();
// Add cache debugging headers
if (process.env.NODE_ENV === 'development') {
response.headers.set('X-Cache-Debug', 'true');
}
return response;
}
// Custom cache wrapper with logging
import { unstable_cache } from 'next/cache';
export function cachedWithLogging<T>(
fn: () => Promise<T>,
keyParts: string[],
options: { tags: string[]; revalidate?: number }
) {
const cached = unstable_cache(fn, keyParts, options);
return async () => {
const start = performance.now();
const result = await cached();
const duration = performance.now() - start;
// Log cache behavior (sub-millisecond = cache hit)
console.log(
`[Cache] ${keyParts.join(':')} - ${duration.toFixed(2)}ms - ${
duration < 1 ? 'HIT' : 'MISS'
}`
);
return result;
};
}
React Query / TanStack Query Cache
Multi-Level Cache Coordination
// lib/query-client.ts
import { QueryClient, HydrationBoundary } from '@tanstack/react-query';
export const queryClient = new QueryClient({
defaultOptions: {
queries: {
// Cache time in memory
gcTime: 1000 * 60 * 5, // 5 minutes
// Consider data stale after
staleTime: 1000 * 30, // 30 seconds
// Retry configuration
retry: 3,
retryDelay: (attemptIndex) => Math.min(1000 * 2 ** attemptIndex, 30000),
// Refetch behavior
refetchOnWindowFocus: true,
refetchOnReconnect: true,
refetchOnMount: true,
// Structural sharing (performance)
structuralSharing: true,
},
},
});
// Hydration from server cache
export function HydrateClient({
children,
state,
}: {
children: React.ReactNode;
state: unknown;
}) {
return (
<HydrationBoundary state={state}>
{children}
</HydrationBoundary>
);
}
Optimistic Updates with Cache Sync
// hooks/use-update-product.ts
import { useMutation, useQueryClient } from '@tanstack/react-query';
interface Product {
id: string;
name: string;
price: number;
stock: number;
}
export function useUpdateProduct() {
const queryClient = useQueryClient();
return useMutation({
mutationFn: async (update: Partial<Product> & { id: string }) => {
const response = await fetch(`/api/products/${update.id}`, {
method: 'PATCH',
body: JSON.stringify(update),
});
return response.json();
},
// Optimistic update
onMutate: async (newData) => {
// Cancel outgoing refetches
await queryClient.cancelQueries({
queryKey: ['product', newData.id],
});
// Snapshot previous value
const previousProduct = queryClient.getQueryData<Product>([
'product',
newData.id,
]);
// Optimistically update
queryClient.setQueryData<Product>(['product', newData.id], (old) => ({
...old!,
...newData,
}));
// Also update in list cache
queryClient.setQueryData<Product[]>(['products'], (old) =>
old?.map((p) => (p.id === newData.id ? { ...p, ...newData } : p))
);
return { previousProduct };
},
// Rollback on error
onError: (err, newData, context) => {
if (context?.previousProduct) {
queryClient.setQueryData(
['product', newData.id],
context.previousProduct
);
}
},
// Sync with server on success
onSettled: (data, error, variables) => {
// Invalidate to ensure consistency
queryClient.invalidateQueries({
queryKey: ['product', variables.id],
});
queryClient.invalidateQueries({
queryKey: ['products'],
});
},
});
}
Server State Synchronization
// Coordinating server cache with client cache
// app/products/[id]/page.tsx
import { dehydrate, HydrationBoundary, QueryClient } from '@tanstack/react-query';
async function getProduct(id: string) {
const res = await fetch(`${process.env.API_URL}/products/${id}`, {
next: { revalidate: 60, tags: [`product-${id}`] },
});
return res.json();
}
export default async function ProductPage({ params }: { params: { id: string } }) {
const queryClient = new QueryClient();
// Prefetch into React Query cache
await queryClient.prefetchQuery({
queryKey: ['product', params.id],
queryFn: () => getProduct(params.id),
staleTime: 60 * 1000, // Match server cache
});
return (
<HydrationBoundary state={dehydrate(queryClient)}>
<ProductDetails productId={params.id} />
</HydrationBoundary>
);
}
// Client component uses hydrated data (separate file)
'use client';
import { useQuery } from '@tanstack/react-query';
function ProductDetails({ productId }: { productId: string }) {
const { data: product } = useQuery({
queryKey: ['product', productId],
queryFn: () => fetch(`/api/products/${productId}`).then((r) => r.json()),
staleTime: 60 * 1000,
});
// Data is immediately available from hydration
return <div>{product?.name}</div>;
}
Cache Coherence Strategies
Strategy 1: Time-Based Expiration (Simple)
/**
* All caches expire at same intervals
* Trade-off: Simple but can serve stale data
*/
const CACHE_CONFIG = {
// Align TTLs across layers
browser: 60, // 1 minute
cdn: 60, // 1 minute (aligned)
server: 60, // 1 minute (aligned)
database: 300, // 5 minutes (longer, less volatile)
};
// All layers use same TTL
export async function GET(request: Request) {
const data = await getCachedData();
return Response.json(data, {
headers: {
'Cache-Control': `public, max-age=${CACHE_CONFIG.browser}, s-maxage=${CACHE_CONFIG.cdn}`,
},
});
}
Strategy 2: Event-Driven Invalidation (Consistent)
/**
* Invalidate all layers on data change
* Trade-off: Complex but consistent
*/
// Event bus for cache coordination (in-process; a serverless or
// multi-instance deployment needs a shared bus such as Redis pub/sub)
import { EventEmitter } from 'events';
const cacheEvents = new EventEmitter();
// Data mutation triggers invalidation
async function updateProduct(id: string, data: ProductUpdate) {
// 1. Update database
const product = await db.products.update({
where: { id },
data,
});
// 2. Emit cache invalidation event
cacheEvents.emit('invalidate', {
type: 'product',
id,
timestamp: Date.now(),
});
return product;
}
// Cache invalidation handler
cacheEvents.on('invalidate', async ({ type, id }) => {
// 1. Invalidate Next.js cache
revalidateTag(`${type}-${id}`);
// 2. Purge CDN cache
await purgeCDN([`${type}-${id}`]);
// 3. Broadcast to connected clients (for client cache)
await broadcastInvalidation({ type, id });
});
// Client-side listener
function useCacheInvalidation() {
const queryClient = useQueryClient();
useEffect(() => {
const eventSource = new EventSource('/api/cache-events');
eventSource.onmessage = (event) => {
const { type, id } = JSON.parse(event.data);
// Invalidate React Query cache
queryClient.invalidateQueries({
queryKey: [type, id],
});
};
return () => eventSource.close();
}, [queryClient]);
}
Strategy 3: Version-Based Cache Keys
/**
* Include version in cache keys
* Trade-off: Guaranteed fresh data, more cache misses
*/
interface CacheVersion {
products: number;
categories: number;
settings: number;
}
// Store versions (Redis or database)
async function getVersions(): Promise<CacheVersion> {
  return {
    products: Number(await redis.get('version:products') ?? 1),
    categories: Number(await redis.get('version:categories') ?? 1),
    settings: Number(await redis.get('version:settings') ?? 1),
  };
}
async function bumpVersion(key: keyof CacheVersion): Promise<number> {
return redis.incr(`version:${key}`);
}
// Include version in all cache keys
async function getProductCached(id: string) {
const versions = await getVersions();
const cacheKey = `product:${id}:v${versions.products}`;
return unstable_cache(
() => db.products.findUnique({ where: { id } }),
[cacheKey],
{ revalidate: 3600 } // Long TTL since version handles freshness
)();
}
// On product update, bump version
async function updateProduct(id: string, data: ProductUpdate) {
const product = await db.products.update({
where: { id },
data,
});
// All product caches automatically invalidated
await bumpVersion('products');
return product;
}
Strategy 4: Stale-While-Revalidate Everywhere
/**
* Accept staleness, revalidate in background
* Trade-off: Fast but eventual consistency
*/
// CDN header
const SWR_HEADERS = {
'Cache-Control': 'public, max-age=1, stale-while-revalidate=59',
// Serve stale for up to 59 seconds while revalidating
};
// Client-side with React Query
const queryConfig = {
staleTime: 1000, // Data stale after 1 second
gcTime: 60 * 1000, // Keep in cache for 60 seconds
refetchOnMount: 'always', // Always check freshness
};
// Visual indicator for stale data
function DataWithStalenessIndicator<T>({
data,
isStale,
children,
}: {
data: T;
isStale: boolean;
children: (data: T) => React.ReactNode;
}) {
return (
<div className={isStale ? 'opacity-75' : ''}>
{children(data)}
{isStale && (
<span className="text-xs text-gray-500">
Updating...
</span>
)}
</div>
);
}
Preventing Thundering Herd
Request Coalescing
// lib/single-flight.ts
type PendingRequest<T> = Promise<T>;
const inflight = new Map<string, PendingRequest<unknown>>();
export async function singleFlight<T>(
key: string,
fn: () => Promise<T>
): Promise<T> {
// Check for in-flight request
const existing = inflight.get(key);
if (existing) {
return existing as Promise<T>;
}
// Create new request
const promise = fn().finally(() => {
inflight.delete(key);
});
inflight.set(key, promise);
return promise;
}
// Usage
async function getProduct(id: string) {
return singleFlight(`product:${id}`, async () => {
const product = await db.products.findUnique({
where: { id },
});
return product;
});
}
Probabilistic Early Expiration
/**
* XFetch algorithm - probabilistic early recomputation
* Prevents synchronized cache expiry
*/
function shouldRecomputeEarly(
ttl: number,
delta: number, // Time to recompute
beta: number = 1
): boolean {
const now = Date.now();
const expiry = now + ttl;
// Probability increases as expiry approaches
const random = Math.random();
const xfetch = delta * beta * Math.log(random);
return now - xfetch >= expiry;
}
// Cache wrapper with early recomputation
async function cachedWithXFetch<T>(
key: string,
fn: () => Promise<T>,
ttl: number
): Promise<T> {
const cached = await cache.get(key);
if (cached) {
const { value, delta, expiresAt } = cached;
const remainingTTL = expiresAt - Date.now();
// Maybe recompute early
if (shouldRecomputeEarly(remainingTTL, delta)) {
// Recompute in background
recomputeAsync(key, fn, ttl);
}
return value;
}
// Cache miss - compute and store
const start = Date.now();
const value = await fn();
const delta = Date.now() - start;
await cache.set(key, {
value,
delta,
expiresAt: Date.now() + ttl,
});
return value;
}
Cache Warming
// Pre-warm caches before they expire
async function warmProductCaches() {
// Get most accessed products
const popularProducts = await db.products.findMany({
orderBy: { viewCount: 'desc' },
take: 100,
});
// Warm caches in parallel
await Promise.all(
popularProducts.map(async (product) => {
const cacheKey = `product:${product.id}`;
// Check if close to expiry
const ttl = await cache.ttl(cacheKey);
if (ttl < 60) {
// Less than 60 seconds remaining
// Re-fetch and cache
await getCachedProduct(product.id);
}
})
);
}
// Run every minute
setInterval(warmProductCaches, 60000);
Cache Architecture Patterns
Pattern 1: Write-Through Cache
┌─────────────────────────────────────────────────────────────────┐
│ Write-Through Cache │
├─────────────────────────────────────────────────────────────────┤
│ │
│ Write Operation: │
│ │
│ Client ──► Cache ──► Database │
│ (sync) (sync) │
│ │
│ Both updated atomically │
│ ✓ Strong consistency │
│ ✗ Write latency (2x) │
│ │
└─────────────────────────────────────────────────────────────────┘
async function updateProduct(id: string, data: ProductUpdate) {
  // Write-through: update the database, then mirror into the cache.
  // Note: the cache write cannot be truly atomic with the DB write;
  // a crash between the two leaves a short inconsistency window.
  const product = await db.products.update({
    where: { id },
    data,
  });
  await cache.set(`product:${id}`, product, { ex: 3600 });
// Invalidate derived caches
await Promise.all([
revalidateTag(`product-${id}`),
cache.del(`products:category:${product.categoryId}`),
]);
return product;
}
Pattern 2: Write-Behind (Async)
┌─────────────────────────────────────────────────────────────────┐
│ Write-Behind Cache │
├─────────────────────────────────────────────────────────────────┤
│ │
│ Write Operation: │
│ │
│ Client ──► Cache ──────────► Queue ──────────► Database │
│ (sync) (async) (async) │
│ │
│ Cache updated immediately, DB updated eventually │
│ ✓ Low write latency │
│ ✗ Potential data loss │
│ ✗ Eventually consistent │
│ │
└─────────────────────────────────────────────────────────────────┘
// Write to cache immediately, sync to DB async
async function updateProductAsync(id: string, data: ProductUpdate) {
// Optimistic cache update
const current = await cache.get(`product:${id}`);
const updated = { ...current, ...data, updatedAt: new Date() };
await cache.set(`product:${id}`, updated, { ex: 3600 });
// Queue database write
await queue.add('db-sync', {
type: 'product-update',
id,
data,
timestamp: Date.now(),
});
return updated;
}
// Background worker
queue.process('db-sync', async (job) => {
const { id, data } = job.data;
try {
await db.products.update({
where: { id },
data,
});
} catch (error) {
// Rollback cache on failure
const freshData = await db.products.findUnique({ where: { id } });
await cache.set(`product:${id}`, freshData, { ex: 3600 });
throw error;
}
});
Pattern 3: Cache-Aside (Lazy Loading)
┌─────────────────────────────────────────────────────────────────┐
│ Cache-Aside Pattern │
├─────────────────────────────────────────────────────────────────┤
│ │
│ Read Operation: │
│ │
│ 1. Client ──► Cache (check) │
│ │ │
│ ├── HIT ──► Return cached │
│ │ │
│ └── MISS ──► Database ──► Cache ──► Return │
│ (populate) │
│ │
│ Write Operation: │
│ │
│ Client ──► Database ──► Invalidate Cache │
│ │
│ ✓ Simple, works with existing caches │
│ ✗ Cache miss penalty │
│ ✗ Potential inconsistency window │
│ │
└─────────────────────────────────────────────────────────────────┘
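Unlike the two patterns above, cache-aside keeps the cache out of the write path entirely. A minimal sketch, with an in-memory `Map` standing in for Redis and an injected loader standing in for the database query (both are stand-ins, not the `cache`/`db` clients used earlier):

```typescript
type Loader<T> = (key: string) => Promise<T>;

export class CacheAside<T> {
  private store = new Map<string, T>();

  constructor(private load: Loader<T>) {}

  // Read path: check the cache, populate on miss
  async get(key: string): Promise<T> {
    const hit = this.store.get(key);
    if (hit !== undefined) return hit;
    const value = await this.load(key); // cache miss penalty paid here
    this.store.set(key, value);
    return value;
  }

  // Write path: write to the source of truth, then invalidate (not update);
  // the next read repopulates from the database
  async write(key: string, save: () => Promise<void>): Promise<void> {
    await save();
    this.store.delete(key);
  }
}
```

Invalidating rather than updating on write avoids caching a value the database write might later reject, at the cost of one extra miss.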
Debugging Cache Issues
// Cache debugging middleware
export function withCacheDebug<T>(
fn: () => Promise<T>,
context: { layer: string; key: string }
): () => Promise<T> {
return async () => {
const start = performance.now();
const result = await fn();
const duration = performance.now() - start;
const cacheHit = duration < 5; // Sub-5ms typically indicates cache hit
console.log(JSON.stringify({
layer: context.layer,
key: context.key,
duration: duration.toFixed(2),
hit: cacheHit,
timestamp: new Date().toISOString(),
}));
return result;
};
}
// Response headers for debugging
export function addCacheDebugHeaders(
response: Response,
info: {
hit: boolean;
layer: string;
age: number;
ttl: number;
}
): Response {
const headers = new Headers(response.headers);
headers.set('X-Cache', info.hit ? 'HIT' : 'MISS');
headers.set('X-Cache-Layer', info.layer);
headers.set('X-Cache-Age', String(info.age));
headers.set('X-Cache-TTL', String(info.ttl));
return new Response(response.body, {
status: response.status,
headers,
});
}
Summary
Frontend caching is a distributed systems problem because:
- Multiple independent caches — Browser, CDN, server, database all make caching decisions independently
- No global clock — TTLs expire at different times across layers
- Network partitions — Invalidation messages can fail or delay
- Consistency vs. availability — Must choose how stale is acceptable
Coherence strategies:
| Strategy | Consistency | Latency | Complexity |
|---|---|---|---|
| Time-based TTL | Eventual | Low | Low |
| Event-driven invalidation | Strong | Medium | High |
| Version-based keys | Strong | Medium | Medium |
| Stale-while-revalidate | Eventual | Very Low | Low |
The right choice depends on your data's volatility and your users' tolerance for staleness. Product prices need strong consistency. Profile pictures can be eventually consistent. Design your cache layers accordingly.