Memory Management in Long-Lived JavaScript Applications
A dashboard that runs for 8 hours will eventually crawl. A SaaS app open across browser tabs accumulates invisible weight. These aren't edge cases—they're the natural consequence of JavaScript's memory model meeting real-world usage patterns.
This guide covers why memory leaks happen in long-running applications, how to detect them, and architectural patterns that prevent them from occurring in the first place.
Why Long-Lived Apps Are Different
┌─────────────────────────────────────────────────────────────────────┐
│ SHORT-LIVED VS LONG-LIVED APPS │
├─────────────────────────────────────────────────────────────────────┤
│ │
│ SHORT-LIVED (Marketing site, blog) │
│ ══════════════════════════════════ │
│ • User visits page │
│ • Page loads, renders │
│ • User navigates away (full page load) │
│ • Memory released, fresh start │
│ • Leaks don't accumulate │
│ │
│ LONG-LIVED (Dashboard, SaaS, IDE) │
│ ═════════════════════════════════ │
│ • User opens app at 9 AM │
│ • SPA navigation (no full reload) │
│ • Subscriptions, WebSockets, intervals │
│ • Data fetched, cached, refetched │
│ • Components mount/unmount thousands of times │
│ • Memory grows: 50MB → 200MB → 500MB → crash │
│ • Session length: 4-12 hours │
│ │
│ THE ACCUMULATION PROBLEM │
│ ════════════════════════════ │
│ │
│ Memory │
│ ▲ │
│ │ ╱ App crashes │
│ │ ╱╱╱ │
│ │ ╱╱╱╱ │
│ │ ╱╱╱╱ │
│ │ ╱╱╱╱ ← GC can't reclaim │
│ │ ╱╱╱╱ leaked objects │
│ │ ╱╱╱╱ │
│ │ ╱╱╱╱ │
│ │ ╱╱╱╱ │
│ │ ╱╱╱╱ │
│ │ ╱╱╱╱ │
│ │ ╱╱╱╱ │
│ └──────────────────────────────────────────────────▶ Time │
│ 1hr 2hr 3hr 4hr 5hr 6hr 7hr │
│ │
└─────────────────────────────────────────────────────────────────────┘
JavaScript Memory Model Primer
How Garbage Collection Works
┌─────────────────────────────────────────────────────────────────────┐
│ V8 GARBAGE COLLECTION │
├─────────────────────────────────────────────────────────────────────┤
│ │
│ HEAP STRUCTURE │
│ ══════════════ │
│ │
│ ┌─────────────────────────────────────────────────────────────┐ │
│ │ HEAP │ │
│ │ ┌──────────────────┐ ┌────────────────────────────────┐ │ │
│ │ │ YOUNG GEN │ │ OLD GEN │ │ │
│ │ │ (Nursery) │ │ (Tenured) │ │ │
│ │ │ │ │ │ │ │
│ │ │ New objects │ │ Objects that survived │ │ │
│ │ │ allocated here │ │ multiple GC cycles │ │ │
│ │ │ │ │ │ │ │
│ │ │ Scavenged │ │ Mark-Sweep-Compact │ │ │
│ │ │ frequently │ │ (less frequent, slower) │ │ │
│ │ │ (~ms) │ │ (~100ms pause) │ │ │
│ │ │ │ │ │ │ │
│ │ │ Size: ~16MB │ │ Size: ~1.4GB (can grow) │ │ │
│ │ └──────────────────┘ └────────────────────────────────┘ │ │
│ └─────────────────────────────────────────────────────────────┘ │
│ │
│ REACHABILITY (What keeps objects alive) │
│ ═══════════════════════════════════════ │
│ │
│ GC Roots: │
│ ├─ Global object (window/globalThis) │
│ ├─ Currently executing function's local variables │
│ ├─ Closure scopes │
│ ├─ Active event listeners │
│ ├─ Pending callbacks (setTimeout, Promise) │
│ └─ WebSocket/EventSource connections │
│ │
│ If an object is reachable from ANY root → NOT collected │
│ If unreachable from ALL roots → collected │
│ │
└─────────────────────────────────────────────────────────────────────┘
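The reachability rule can be made concrete with a toy mark phase over an object graph. This is purely illustrative (V8's real collector is generational and incremental), but it captures the invariant: reachable from any root means retained.

```typescript
// Toy mark phase: anything reachable from a root survives; everything else
// would be collected. Illustrative only, not how V8 actually works.
type HeapObject = { name: string; refs: HeapObject[] };

function markReachable(roots: HeapObject[]): Set<HeapObject> {
  const marked = new Set<HeapObject>();
  const stack = [...roots];
  while (stack.length > 0) {
    const obj = stack.pop()!;
    if (marked.has(obj)) continue;
    marked.add(obj);
    stack.push(...obj.refs); // follow outgoing references
  }
  return marked;
}

const listenerClosure: HeapObject = { name: 'closure', refs: [] };
const orphan: HeapObject = { name: 'orphan', refs: [] };
const windowRoot: HeapObject = { name: 'window', refs: [listenerClosure] };

const live = markReachable([windowRoot]);
console.log(live.has(listenerClosure), live.has(orphan)); // true false
```

A leaked event listener is exactly the `windowRoot → listenerClosure` edge: as long as that edge exists, everything the closure references survives every GC cycle.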
What Constitutes a "Leak"
A memory leak occurs when objects that should be garbage collected remain reachable:
// NOT A LEAK: Object becomes unreachable
function processData() {
const largeArray = new Array(1000000).fill('data');
return largeArray.length;
}
// After return, largeArray is unreachable → GC will collect
// LEAK: Object remains reachable unintentionally
const cache = new Map();
function processData(id: string) {
const largeArray = new Array(1000000).fill('data');
cache.set(id, largeArray); // Now reachable via global cache
return largeArray.length;
}
// largeArray stays in memory forever (until cache.delete or clear)
The Seven Deadly Leaks
1. Forgotten Event Listeners
// ❌ LEAK: Listener never removed
function DashboardWidget({ onUpdate }: { onUpdate: () => void }) {
useEffect(() => {
window.addEventListener('resize', handleResize);
document.addEventListener('visibilitychange', handleVisibility);
// Missing cleanup!
}, []);
// Component unmounts, but listeners remain
// Each mount adds NEW listeners
// After 100 navigations: 100 resize handlers
}
// ✅ FIX: Always clean up
function DashboardWidget({ onUpdate }: { onUpdate: () => void }) {
useEffect(() => {
const handleResize = () => { /* ... */ };
const handleVisibility = () => { /* ... */ };
window.addEventListener('resize', handleResize);
document.addEventListener('visibilitychange', handleVisibility);
return () => {
window.removeEventListener('resize', handleResize);
document.removeEventListener('visibilitychange', handleVisibility);
};
}, []);
}
// ✅ BETTER: Use AbortController for multiple listeners
function DashboardWidget() {
  useEffect(() => {
    const controller = new AbortController();
    const { signal } = controller;
    const handleResize = () => { /* ... */ };
    const handleVisibility = () => { /* ... */ };
    const handleKeydown = () => { /* ... */ };
    window.addEventListener('resize', handleResize, { signal });
    document.addEventListener('visibilitychange', handleVisibility, { signal });
    document.addEventListener('keydown', handleKeydown, { signal });
return () => controller.abort(); // Removes ALL listeners at once
}, []);
}
2. Uncleared Timers and Intervals
// ❌ LEAK: Interval never cleared
function PollingComponent({ endpoint }: { endpoint: string }) {
const [data, setData] = useState(null);
useEffect(() => {
const intervalId = setInterval(async () => {
const response = await fetch(endpoint);
setData(await response.json());
}, 5000);
// Missing cleanup!
// Interval keeps running after unmount
// Tries to setData on unmounted component
// Holds reference to component closure
}, [endpoint]);
return <div>{JSON.stringify(data)}</div>;
}
// ✅ FIX: Clear interval on unmount
function PollingComponent({ endpoint }: { endpoint: string }) {
const [data, setData] = useState(null);
useEffect(() => {
let isMounted = true;
const intervalId = setInterval(async () => {
const response = await fetch(endpoint);
if (isMounted) {
setData(await response.json());
}
}, 5000);
return () => {
isMounted = false;
clearInterval(intervalId);
};
}, [endpoint]);
return <div>{JSON.stringify(data)}</div>;
}
// ✅ BETTER: Use a custom hook with proper cleanup
function usePolling<T>(
fetcher: () => Promise<T>,
intervalMs: number
): { data: T | null; error: Error | null } {
const [data, setData] = useState<T | null>(null);
const [error, setError] = useState<Error | null>(null);
useEffect(() => {
const controller = new AbortController();
const poll = async () => {
try {
const result = await fetcher();
if (!controller.signal.aborted) {
setData(result);
}
} catch (e) {
if (!controller.signal.aborted) {
setError(e as Error);
}
}
};
poll(); // Initial fetch
const intervalId = setInterval(poll, intervalMs);
return () => {
controller.abort();
clearInterval(intervalId);
};
}, [fetcher, intervalMs]);
return { data, error };
}
3. Closure-Captured Variables
// ❌ LEAK: Closure captures large data
function DataProcessor() {
const [processedCount, setProcessedCount] = useState(0);
const processLargeDataset = useCallback(() => {
const hugeDataset = fetchHugeDataset(); // 100MB array
return () => {
// This closure captures hugeDataset
// Even if we only use .length, entire array is retained
console.log(`Processed ${hugeDataset.length} items`);
setProcessedCount(hugeDataset.length);
};
}, []);
useEffect(() => {
const cleanup = processLargeDataset();
return cleanup; // hugeDataset retained until unmount
}, [processLargeDataset]);
}
// ✅ FIX: Extract only what's needed
function DataProcessor() {
const [processedCount, setProcessedCount] = useState(0);
const processLargeDataset = useCallback(() => {
const hugeDataset = fetchHugeDataset();
const count = hugeDataset.length; // Extract needed value
// Process dataset
processItems(hugeDataset);
// hugeDataset can now be GC'd
// Closure only captures count (a number)
return () => {
console.log(`Processed ${count} items`);
setProcessedCount(count);
};
}, []);
}
// ❌ LEAK: Closures in event handlers hold component state
function SearchComponent() {
const [results, setResults] = useState<SearchResult[]>([]); // Could be large
useEffect(() => {
const handler = () => {
// This closure captures `results` - entire array retained
console.log('Current results:', results.length);
};
window.addEventListener('beforeunload', handler);
return () => window.removeEventListener('beforeunload', handler);
}, [results]); // Re-registers on every results change!
}
// ✅ FIX: Use ref for values needed in stable closures
function SearchComponent() {
const [results, setResults] = useState<SearchResult[]>([]);
const resultsRef = useRef(results);
useEffect(() => {
resultsRef.current = results;
}, [results]);
useEffect(() => {
const handler = () => {
// Closure captures ref (stable), not results array
console.log('Current results:', resultsRef.current.length);
};
window.addEventListener('beforeunload', handler);
return () => window.removeEventListener('beforeunload', handler);
}, []); // Empty deps - handler registered once
}
4. Detached DOM Nodes
// ❌ LEAK: References to removed DOM elements
class LegacyWidget {
private containerRef: HTMLElement | null = null;
private chartInstance: any = null;
mount(container: HTMLElement) {
this.containerRef = container;
this.chartInstance = new HeavyChartLibrary(container);
}
unmount() {
// DOM element removed, but we still hold reference
this.containerRef?.remove();
// chartInstance still references internal DOM nodes
// containerRef still points to detached node
}
}
// ✅ FIX: Null out references explicitly
class LegacyWidget {
private containerRef: HTMLElement | null = null;
private chartInstance: any = null;
mount(container: HTMLElement) {
this.containerRef = container;
this.chartInstance = new HeavyChartLibrary(container);
}
unmount() {
// Destroy chart instance first
this.chartInstance?.destroy();
this.chartInstance = null;
// Then remove and dereference DOM
this.containerRef?.remove();
this.containerRef = null;
}
}
// ❌ LEAK: Storing DOM references in React state
function CanvasDrawing() {
const [elements, setElements] = useState<HTMLElement[]>([]);
const addElement = () => {
const div = document.createElement('div');
document.body.appendChild(div);
setElements(prev => [...prev, div]); // Storing DOM node in state
};
// Even if divs are removed from DOM, state holds references
}
// ✅ FIX: Store IDs or data, not DOM references
function CanvasDrawing() {
const [elementIds, setElementIds] = useState<string[]>([]);
const elementsRef = useRef(new Map<string, HTMLElement>());
const addElement = () => {
const id = crypto.randomUUID();
const div = document.createElement('div');
div.dataset.id = id;
document.body.appendChild(div);
elementsRef.current.set(id, div);
setElementIds(prev => [...prev, id]);
};
const removeElement = (id: string) => {
const el = elementsRef.current.get(id);
el?.remove();
elementsRef.current.delete(id);
setElementIds(prev => prev.filter(i => i !== id));
};
useEffect(() => {
return () => {
// Cleanup on unmount
elementsRef.current.forEach(el => el.remove());
elementsRef.current.clear();
};
}, []);
}
5. Unbounded Caches and Stores
// ❌ LEAK: Cache grows forever
const responseCache = new Map<string, Response>();
async function fetchWithCache(url: string): Promise<Response> {
if (responseCache.has(url)) {
return responseCache.get(url)!;
}
const response = await fetch(url);
responseCache.set(url, response.clone()); // Never evicted!
return response;
}
// After fetching 10,000 different URLs: 10,000 cached responses
// ✅ FIX: LRU cache with size limit
class LRUCache<K, V> {
private cache = new Map<K, V>();
private maxSize: number;
constructor(maxSize: number) {
this.maxSize = maxSize;
}
get(key: K): V | undefined {
const value = this.cache.get(key);
if (value !== undefined) {
// Move to end (most recently used)
this.cache.delete(key);
this.cache.set(key, value);
}
return value;
}
set(key: K, value: V): void {
if (this.cache.has(key)) {
this.cache.delete(key);
} else if (this.cache.size >= this.maxSize) {
// Remove least recently used (first item)
      const firstKey = this.cache.keys().next().value;
      if (firstKey !== undefined) this.cache.delete(firstKey);
}
this.cache.set(key, value);
}
}
const responseCache = new LRUCache<string, Response>(100);
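A quick sanity check of the eviction order (re-declaring a minimal version of the class so the snippet runs standalone):

```typescript
// Minimal LRU mirroring the class above, self-contained for demonstration.
class MiniLRU<K, V> {
  private cache = new Map<K, V>();
  constructor(private maxSize: number) {}
  get(key: K): V | undefined {
    const value = this.cache.get(key);
    if (value !== undefined) {
      this.cache.delete(key); // move to end (most recently used)
      this.cache.set(key, value);
    }
    return value;
  }
  set(key: K, value: V): void {
    if (this.cache.has(key)) {
      this.cache.delete(key);
    } else if (this.cache.size >= this.maxSize) {
      const firstKey = this.cache.keys().next().value;
      if (firstKey !== undefined) this.cache.delete(firstKey);
    }
    this.cache.set(key, value);
  }
}

const lru = new MiniLRU<string, number>(2);
lru.set('a', 1);
lru.set('b', 2);
lru.get('a');    // touch 'a'; 'b' is now least recently used
lru.set('c', 3); // capacity exceeded: evicts 'b'
console.log(lru.get('a'), lru.get('b'), lru.get('c')); // 1 undefined 3
```

Relying on `Map`'s insertion-order iteration for recency is the whole trick: re-inserting a key moves it to the back, so the front is always the eviction candidate.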
// ✅ BETTER: Use WeakMap for object keys (auto-cleanup)
const objectMetadata = new WeakMap<object, Metadata>();
function attachMetadata(obj: object, meta: Metadata) {
objectMetadata.set(obj, meta);
// When obj is GC'd, the entry is automatically removed
}
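A small illustration of the WeakMap contract (`Metadata` here is just a placeholder shape): entries are only observable while you hold a strong reference to the key.

```typescript
// WeakMap keys must be objects; the map holds them weakly, so the entry
// disappears once no other strong reference to the key exists.
interface Metadata { label: string }

const objectMetadata = new WeakMap<object, Metadata>();

let session: object | null = { id: 1 };
objectMetadata.set(session, { label: 'admin' });
console.log(objectMetadata.get(session)?.label); // admin

session = null;
// No strong references remain; the GC is now free to drop both the key
// and its metadata. There is deliberately no way to observe this from JS
// (WeakMap has no size and cannot be iterated).
```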
// ✅ EVEN BETTER: Time-based expiration
class TTLCache<K, V> {
  private cache = new Map<K, { value: V; expiry: number }>();
  private cleanupId: ReturnType<typeof setInterval>;
  constructor(private ttlMs: number) {
    // Keep the timer id so the cache itself can be torn down;
    // otherwise the interval pins the cache in memory forever
    this.cleanupId = setInterval(() => this.cleanup(), ttlMs);
  }
  dispose(): void {
    clearInterval(this.cleanupId);
  }
get(key: K): V | undefined {
const entry = this.cache.get(key);
if (!entry) return undefined;
if (Date.now() > entry.expiry) {
this.cache.delete(key);
return undefined;
}
return entry.value;
}
set(key: K, value: V): void {
this.cache.set(key, {
value,
expiry: Date.now() + this.ttlMs,
});
}
private cleanup(): void {
const now = Date.now();
for (const [key, entry] of this.cache) {
if (now > entry.expiry) {
this.cache.delete(key);
}
}
}
}
6. Subscription Accumulation
// ❌ LEAK: Subscriptions pile up
function RealtimeData({ channelId }: { channelId: string }) {
const [messages, setMessages] = useState<Message[]>([]);
useEffect(() => {
const ws = new WebSocket(`wss://api.example.com/channels/${channelId}`);
ws.onmessage = (event) => {
const message = JSON.parse(event.data);
setMessages(prev => [...prev, message]); // Array grows forever!
};
return () => ws.close();
}, [channelId]);
}
// ✅ FIX: Bounded message list
function RealtimeData({ channelId }: { channelId: string }) {
const [messages, setMessages] = useState<Message[]>([]);
const MAX_MESSAGES = 100;
useEffect(() => {
const ws = new WebSocket(`wss://api.example.com/channels/${channelId}`);
ws.onmessage = (event) => {
const message = JSON.parse(event.data);
setMessages(prev => {
const next = [...prev, message];
// Keep only last N messages
return next.slice(-MAX_MESSAGES);
});
};
return () => ws.close();
}, [channelId]);
}
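The `slice(-MAX_MESSAGES)` trick in isolation, for anyone skeptical that it keeps the array bounded:

```typescript
// Appending through slice(-N) keeps only the N most recent items.
const MAX_MESSAGES = 3;
let messages: number[] = [];
for (const incoming of [1, 2, 3, 4, 5]) {
  messages = [...messages, incoming].slice(-MAX_MESSAGES);
}
console.log(messages); // [3, 4, 5]
```

Memory stays proportional to `MAX_MESSAGES` no matter how long the socket stays open, at the cost of allocating a fresh array per message (fine at chat-message rates; use a circular buffer for high-frequency streams).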
// ❌ LEAK: Multiple subscriptions without cleanup
class DataStore {
private subscriptions = new Set<(data: Data) => void>();
subscribe(callback: (data: Data) => void) {
this.subscriptions.add(callback);
// No unsubscribe returned!
}
notify(data: Data) {
this.subscriptions.forEach(cb => cb(data));
}
}
// Component subscribes on every mount
function DataConsumer() {
useEffect(() => {
dataStore.subscribe(handleData); // Added on mount
// Never removed!
}, []);
}
// ✅ FIX: Return unsubscribe function
class DataStore {
private subscriptions = new Set<(data: Data) => void>();
subscribe(callback: (data: Data) => void): () => void {
this.subscriptions.add(callback);
return () => {
this.subscriptions.delete(callback);
};
}
}
function DataConsumer() {
useEffect(() => {
const unsubscribe = dataStore.subscribe(handleData);
return unsubscribe; // Cleanup on unmount
}, []);
}
7. React-Specific Leaks
// ❌ LEAK: Un-cancelled fetch in an async effect
function UserProfile({ userId }: { userId: string }) {
  const [user, setUser] = useState<User | null>(null);
  useEffect(() => {
    // If userId changes rapidly, an old request can resolve last and
    // overwrite the newer user's data; the pending promise also retains
    // the component's closure until it settles
    fetch(`/api/users/${userId}`)
      .then(r => r.json())
      .then(data => setUser(data)); // Race: stale response wins
  }, [userId]);
}
// ✅ FIX: Cancel stale requests
function UserProfile({ userId }: { userId: string }) {
const [user, setUser] = useState<User | null>(null);
useEffect(() => {
const controller = new AbortController();
fetch(`/api/users/${userId}`, { signal: controller.signal })
.then(r => r.json())
.then(data => setUser(data))
.catch(e => {
if (e.name !== 'AbortError') throw e;
});
return () => controller.abort();
}, [userId]);
}
// ❌ LEAK: Context value causing re-renders and retained references
const DataContext = createContext<{
data: HugeDataset;
refresh: () => void;
}>({ data: [], refresh: () => {} });
function DataProvider({ children }: { children: React.ReactNode }) {
const [data, setData] = useState<HugeDataset>([]);
const value = {
data,
refresh: () => fetchData().then(setData),
};
// New object every render → all consumers re-render
// If consumers hold refs to old data, leaks accumulate
return <DataContext.Provider value={value}>{children}</DataContext.Provider>;
}
// ✅ FIX: Memoize context value
function DataProvider({ children }: { children: React.ReactNode }) {
const [data, setData] = useState<HugeDataset>([]);
const refresh = useCallback(() => {
fetchData().then(setData);
}, []);
const value = useMemo(() => ({ data, refresh }), [data, refresh]);
return <DataContext.Provider value={value}>{children}</DataContext.Provider>;
}
Detecting Memory Leaks
Chrome DevTools Memory Panel
┌─────────────────────────────────────────────────────────────────────┐
│ DEVTOOLS MEMORY WORKFLOW │
├─────────────────────────────────────────────────────────────────────┤
│ │
│ 1. BASELINE SNAPSHOT │
│ ════════════════════ │
│ • Open app, let it stabilize │
│ • Force GC (click trash icon) │
│ • Take heap snapshot │
│ • Note: Total size, Object count │
│ │
│ 2. PERFORM SUSPECT ACTION │
│ ═══════════════════════════ │
│ • Navigate to route and back │
│ • Open modal and close │
│ • Load data and clear │
│ • Repeat 5-10 times │
│ │
│ 3. COMPARISON SNAPSHOT │
│ ════════════════════════ │
│ • Force GC again │
│ • Take second snapshot │
│ • Select "Comparison" view │
│ • Look for: # Delta > 0 for components/objects │
│ │
│ 4. ANALYZE RETAINERS │
│ ════════════════════════ │
│ • Click on leaked object │
│ • Expand "Retainers" panel │
│ • Follow chain to find root │
│ │
│ Example Retainer Chain: │
│ Window │
│ └─ listeners │
│ └─ resize (array) │
│ └─ Object (event handler) │
│ └─ closure scope │
│ └─ MyComponent (LEAKED!) │
│ └─ props │
│ └─ largeData (500KB) │
│ │
└─────────────────────────────────────────────────────────────────────┘
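The comparison step boils down to one number: growth per iteration. A small helper (the name is mine, not a DevTools API) for turning two snapshot totals into that number:

```typescript
// Convert two heap-snapshot totals (bytes) into MB leaked per iteration.
// Near zero after a forced GC means the action cleans up after itself.
function perIterationGrowthMB(
  baselineBytes: number,
  finalBytes: number,
  iterations: number
): number {
  return (finalBytes - baselineBytes) / iterations / (1024 * 1024);
}

// Example: 50MB baseline, 71MB after 10 modal open/close cycles
const growth = perIterationGrowthMB(50 * 1024 * 1024, 71 * 1024 * 1024, 10);
console.log(growth); // 2.1
```

A steady ~2MB per open/close cycle is a strong leak signal; a one-time bump that flattens out on later iterations is usually just warmed-up caches.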
Automated Leak Detection
// scripts/memory-test.ts
import puppeteer from 'puppeteer';
interface MemorySnapshot {
usedJSHeapSize: number;
timestamp: number;
}
async function detectMemoryLeak(
url: string,
action: (page: puppeteer.Page) => Promise<void>,
iterations: number = 10
): Promise<{ leaked: boolean; growth: number }> {
  const browser = await puppeteer.launch({
    headless: true,
    // Expose window.gc so forceGC() below can trigger collection
    args: ['--js-flags=--expose-gc'],
  });
const page = await browser.newPage();
await page.goto(url);
await page.waitForNetworkIdle();
// Force GC and get baseline
await forceGC(page);
const baseline = await getMemoryUsage(page);
// Perform action multiple times
for (let i = 0; i < iterations; i++) {
await action(page);
await page.waitForTimeout(100);
}
// Force GC and measure
await forceGC(page);
const final = await getMemoryUsage(page);
await browser.close();
const growth = final.usedJSHeapSize - baseline.usedJSHeapSize;
const growthMB = growth / 1024 / 1024;
  // Threshold: if memory grew by more than 0.1MB per iteration, likely a leak
  const leaked = growthMB > iterations * 0.1;
console.log(`Baseline: ${(baseline.usedJSHeapSize / 1024 / 1024).toFixed(2)}MB`);
console.log(`Final: ${(final.usedJSHeapSize / 1024 / 1024).toFixed(2)}MB`);
console.log(`Growth: ${growthMB.toFixed(2)}MB over ${iterations} iterations`);
console.log(`Leaked: ${leaked}`);
return { leaked, growth };
}
async function forceGC(page: puppeteer.Page): Promise<void> {
  // window.gc only exists when Chrome is launched with --js-flags=--expose-gc
  await page.evaluate(() => (window as any).gc?.());
  await page.waitForTimeout(100);
}
async function getMemoryUsage(page: puppeteer.Page): Promise<MemorySnapshot> {
return page.evaluate(() => ({
usedJSHeapSize: (performance as any).memory?.usedJSHeapSize || 0,
timestamp: Date.now(),
}));
}
// Usage in tests
describe('Memory Leak Tests', () => {
it('should not leak memory when navigating routes', async () => {
const result = await detectMemoryLeak(
'http://localhost:3000',
async (page) => {
await page.click('[data-testid="nav-dashboard"]');
await page.waitForSelector('[data-testid="dashboard"]');
await page.click('[data-testid="nav-settings"]');
await page.waitForSelector('[data-testid="settings"]');
},
20
);
expect(result.leaked).toBe(false);
});
it('should not leak memory when opening/closing modals', async () => {
const result = await detectMemoryLeak(
'http://localhost:3000/dashboard',
async (page) => {
await page.click('[data-testid="open-modal"]');
await page.waitForSelector('[data-testid="modal"]');
await page.click('[data-testid="close-modal"]');
await page.waitForSelector('[data-testid="modal"]', { hidden: true });
},
50
);
expect(result.leaked).toBe(false);
});
});
Performance Observer API
// lib/memory-monitor.ts
class MemoryMonitor {
private samples: Array<{ timestamp: number; usedHeap: number }> = [];
private intervalId: number | null = null;
private readonly maxSamples = 100;
start(intervalMs: number = 5000): void {
if (!('memory' in performance)) {
console.warn('Performance.memory not available');
return;
}
this.intervalId = window.setInterval(() => {
const memory = (performance as any).memory;
this.samples.push({
timestamp: Date.now(),
usedHeap: memory.usedJSHeapSize,
});
// Keep bounded
if (this.samples.length > this.maxSamples) {
this.samples.shift();
}
// Check for leak pattern
this.analyzeGrowth();
}, intervalMs);
}
stop(): void {
if (this.intervalId) {
clearInterval(this.intervalId);
this.intervalId = null;
}
}
private analyzeGrowth(): void {
if (this.samples.length < 10) return;
const recentSamples = this.samples.slice(-10);
const firstHeap = recentSamples[0].usedHeap;
const lastHeap = recentSamples[recentSamples.length - 1].usedHeap;
const growthRate = (lastHeap - firstHeap) / firstHeap;
// If memory grew by more than 50% in 10 samples, warn
if (growthRate > 0.5) {
console.warn(
`Potential memory leak detected: ${(growthRate * 100).toFixed(1)}% growth`,
{
from: `${(firstHeap / 1024 / 1024).toFixed(2)}MB`,
to: `${(lastHeap / 1024 / 1024).toFixed(2)}MB`,
}
);
// Report to monitoring
this.reportLeak(growthRate);
}
}
private reportLeak(growthRate: number): void {
// Send to your monitoring service
fetch('/api/monitoring/memory-alert', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
growthRate,
samples: this.samples.slice(-10),
url: window.location.href,
userAgent: navigator.userAgent,
}),
}).catch(() => {}); // Fire and forget
}
getStats(): { current: number; peak: number; average: number } {
if (this.samples.length === 0) {
return { current: 0, peak: 0, average: 0 };
}
const heaps = this.samples.map(s => s.usedHeap);
return {
current: heaps[heaps.length - 1],
peak: Math.max(...heaps),
average: heaps.reduce((a, b) => a + b, 0) / heaps.length,
};
}
}
// Usage
if (process.env.NODE_ENV === 'development') {
const monitor = new MemoryMonitor();
monitor.start(10000); // Sample every 10 seconds
}
Architectural Patterns for Memory Safety
1. Disposal Pattern
// lib/disposable.ts
interface Disposable {
dispose(): void;
}
class DisposableGroup implements Disposable {
private disposables: Disposable[] = [];
private isDisposed = false;
add<T extends Disposable>(disposable: T): T {
if (this.isDisposed) {
disposable.dispose();
throw new Error('Cannot add to disposed group');
}
this.disposables.push(disposable);
return disposable;
}
dispose(): void {
if (this.isDisposed) return;
this.isDisposed = true;
// Dispose in reverse order (LIFO)
while (this.disposables.length > 0) {
const disposable = this.disposables.pop()!;
try {
disposable.dispose();
} catch (e) {
console.error('Error disposing:', e);
}
}
}
}
// Usage in components
function useDisposables() {
const disposables = useRef(new DisposableGroup());
useEffect(() => {
return () => disposables.current.dispose();
}, []);
return disposables.current;
}
// Example: Complex component with multiple resources
interface DataSource {
  subscribe(onData: (data: unknown) => void): () => void; // returns unsubscribe
}
function DataVisualization({ dataSource }: { dataSource: DataSource }) {
  const disposables = useDisposables();
  const canvasRef = useRef<HTMLCanvasElement>(null);
  useEffect(() => {
    const canvas = canvasRef.current!;
    // Each resource registered for cleanup
    // (assumes ChartLibrary implements dispose())
    const chart = disposables.add(new ChartLibrary(canvas));
    const resizeObserver = disposables.add(
      createResizeObserver(canvas, () => chart.resize())
    );
    const dataSubscription = disposables.add(
      createDisposable(dataSource.subscribe(data => chart.update(data)))
    );
    // All cleaned up automatically when component unmounts
  }, [dataSource, disposables]);
  return <canvas ref={canvasRef} />;
}
// Helper to wrap cleanup functions
function createDisposable(cleanup: () => void): Disposable {
return { dispose: cleanup };
}
function createResizeObserver(
element: HTMLElement,
callback: () => void
): Disposable {
const observer = new ResizeObserver(callback);
observer.observe(element);
return createDisposable(() => observer.disconnect());
}
2. Resource Pool Pattern
// lib/resource-pool.ts
class ResourcePool<T> {
private available: T[] = [];
private inUse = new Set<T>();
private createResource: () => T;
private resetResource: (resource: T) => void;
private maxSize: number;
constructor(options: {
create: () => T;
reset: (resource: T) => void;
maxSize: number;
initialSize?: number;
}) {
this.createResource = options.create;
this.resetResource = options.reset;
this.maxSize = options.maxSize;
// Pre-create resources
for (let i = 0; i < (options.initialSize || 0); i++) {
this.available.push(this.createResource());
}
}
acquire(): T {
let resource: T;
if (this.available.length > 0) {
resource = this.available.pop()!;
} else if (this.inUse.size < this.maxSize) {
resource = this.createResource();
} else {
throw new Error('Pool exhausted');
}
this.inUse.add(resource);
return resource;
}
release(resource: T): void {
if (!this.inUse.has(resource)) {
throw new Error('Resource not from this pool');
}
this.inUse.delete(resource);
this.resetResource(resource);
this.available.push(resource);
}
// For async usage
async withResource<R>(fn: (resource: T) => Promise<R>): Promise<R> {
const resource = this.acquire();
try {
return await fn(resource);
} finally {
this.release(resource);
}
}
}
// Example: Canvas pool for offscreen rendering
const canvasPool = new ResourcePool<OffscreenCanvas>({
create: () => new OffscreenCanvas(1024, 1024),
reset: (canvas) => {
const ctx = canvas.getContext('2d')!;
ctx.clearRect(0, 0, canvas.width, canvas.height);
},
maxSize: 10,
initialSize: 2,
});
// Usage
async function renderThumbnail(image: ImageBitmap): Promise<Blob> {
return canvasPool.withResource(async (canvas) => {
const ctx = canvas.getContext('2d')!;
ctx.drawImage(image, 0, 0, canvas.width, canvas.height);
return canvas.convertToBlob({ type: 'image/jpeg', quality: 0.8 });
});
}
3. Weak Reference Pattern
// lib/weak-cache.ts
class WeakCache<K extends object, V> {
private cache = new WeakMap<K, V>();
get(key: K): V | undefined {
return this.cache.get(key);
}
set(key: K, value: V): void {
this.cache.set(key, value);
}
// When key is GC'd, entry is automatically removed
}
// For string keys with weak values
class WeakValueCache<V extends object> {
private cache = new Map<string, WeakRef<V>>();
private registry = new FinalizationRegistry<string>((key) => {
this.cache.delete(key);
});
get(key: string): V | undefined {
const ref = this.cache.get(key);
return ref?.deref();
}
  set(key: string, value: V): void {
    // Unregister any previous value for this key so its finalizer
    // can't delete the fresh entry later
    const oldRef = this.cache.get(key);
    if (oldRef) this.registry.unregister(oldRef);
    const ref = new WeakRef(value);
    this.cache.set(key, ref);
    this.registry.register(value, key, ref);
  }
}
// Example: Component instance cache
const componentCache = new WeakValueCache<React.Component>();
// When component is unmounted and GC'd, cache entry is auto-removed
4. Bounded Data Structures
// lib/bounded-structures.ts
class CircularBuffer<T> {
private buffer: (T | undefined)[];
private head = 0;
private tail = 0;
private count = 0;
constructor(private capacity: number) {
this.buffer = new Array(capacity);
}
push(item: T): T | undefined {
let evicted: T | undefined;
if (this.count === this.capacity) {
// Buffer full, overwrite oldest
evicted = this.buffer[this.tail];
this.tail = (this.tail + 1) % this.capacity;
} else {
this.count++;
}
this.buffer[this.head] = item;
this.head = (this.head + 1) % this.capacity;
return evicted;
}
toArray(): T[] {
const result: T[] = [];
let index = this.tail;
for (let i = 0; i < this.count; i++) {
result.push(this.buffer[index]!);
index = (index + 1) % this.capacity;
}
return result;
}
get length(): number {
return this.count;
}
}
// Usage: Keep last N log entries
const logBuffer = new CircularBuffer<LogEntry>(1000);
function log(entry: LogEntry) {
const evicted = logBuffer.push(entry);
// evicted is the removed entry (if buffer was full)
}
// Bounded Map with LRU eviction
class BoundedMap<K, V> {
private map = new Map<K, V>();
constructor(private maxSize: number) {}
get(key: K): V | undefined {
const value = this.map.get(key);
if (value !== undefined) {
// Move to end (most recently used)
this.map.delete(key);
this.map.set(key, value);
}
return value;
}
set(key: K, value: V): void {
if (this.map.has(key)) {
this.map.delete(key);
} else if (this.map.size >= this.maxSize) {
// Remove first (least recently used)
      const firstKey = this.map.keys().next().value;
      if (firstKey !== undefined) this.map.delete(firstKey);
}
this.map.set(key, value);
}
delete(key: K): boolean {
return this.map.delete(key);
}
clear(): void {
this.map.clear();
}
}
5. Subscription Manager
// lib/subscription-manager.ts
type Unsubscribe = () => void;
class SubscriptionManager {
private subscriptions = new Map<string, Unsubscribe>();
private disposed = false;
add(key: string, unsubscribe: Unsubscribe): void {
if (this.disposed) {
unsubscribe();
return;
}
// Remove existing subscription with same key
this.remove(key);
this.subscriptions.set(key, unsubscribe);
}
remove(key: string): void {
const unsubscribe = this.subscriptions.get(key);
if (unsubscribe) {
unsubscribe();
this.subscriptions.delete(key);
}
}
dispose(): void {
if (this.disposed) return;
this.disposed = true;
this.subscriptions.forEach(unsubscribe => {
try {
unsubscribe();
} catch (e) {
console.error('Error unsubscribing:', e);
}
});
this.subscriptions.clear();
}
}
// React hook
function useSubscriptions() {
const manager = useRef(new SubscriptionManager());
useEffect(() => {
return () => manager.current.dispose();
}, []);
return manager.current;
}
// Usage
function LiveDashboard({ channels }: { channels: string[] }) {
const subscriptions = useSubscriptions();
  useEffect(() => {
    channels.forEach(channel => {
      const ws = new WebSocket(`wss://api.example.com/${channel}`);
      subscriptions.add(channel, () => ws.close());
    });
    // Close this render's channels when the effect re-runs, so channels
    // dropped from the list don't linger; on unmount the manager
    // disposes whatever remains
    return () => channels.forEach(channel => subscriptions.remove(channel));
  }, [channels, subscriptions]);
}
React Query Memory Management
React Query handles caching but needs configuration for long-lived apps:
// lib/query-client.ts
import { QueryClient } from '@tanstack/react-query';
export const queryClient = new QueryClient({
defaultOptions: {
queries: {
// Remove from cache 5 minutes after last observer unmounts
gcTime: 5 * 60 * 1000, // formerly cacheTime
// Mark as stale after 30 seconds
staleTime: 30 * 1000,
// Prevent memory buildup from background refetches
refetchOnWindowFocus: false,
// Retry with backoff, but not forever
retry: 3,
},
},
});
// Periodic cache cleanup for long sessions
if (typeof window !== 'undefined') {
setInterval(() => {
const cache = queryClient.getQueryCache();
const queries = cache.getAll();
// Remove queries not accessed in last 30 minutes
const thirtyMinutesAgo = Date.now() - 30 * 60 * 1000;
queries.forEach(query => {
if (
query.state.dataUpdatedAt < thirtyMinutesAgo &&
query.getObserversCount() === 0
) {
cache.remove(query);
}
});
}, 5 * 60 * 1000); // Run every 5 minutes
}
// For specific queries that hold large data
export function useLargeDataQuery(id: string) {
return useQuery({
queryKey: ['large-data', id],
queryFn: () => fetchLargeData(id),
// Aggressive cleanup for large data
gcTime: 60 * 1000, // 1 minute
staleTime: 30 * 1000,
// Don't keep in cache when component unmounts
refetchOnMount: true,
});
}
Monitoring in Production
// lib/memory-reporting.ts
interface MemoryReport {
timestamp: number;
usedJSHeapSize: number;
totalJSHeapSize: number;
jsHeapSizeLimit: number;
sessionDuration: number;
routeChanges: number;
url: string;
}
class MemoryReporter {
private sessionStart = Date.now();
private routeChanges = 0;
private reportInterval: number | null = null;
start() {
// Track route changes
if (typeof window !== 'undefined') {
const originalPushState = history.pushState;
history.pushState = (...args) => {
this.routeChanges++;
return originalPushState.apply(history, args);
};
}
// Report every 5 minutes
this.reportInterval = window.setInterval(() => {
this.report();
}, 5 * 60 * 1000);
// Report on page unload
window.addEventListener('pagehide', () => this.report());
}
stop() {
if (this.reportInterval) {
clearInterval(this.reportInterval);
}
}
private report() {
if (!('memory' in performance)) return;
const memory = (performance as any).memory;
const report: MemoryReport = {
timestamp: Date.now(),
usedJSHeapSize: memory.usedJSHeapSize,
totalJSHeapSize: memory.totalJSHeapSize,
jsHeapSizeLimit: memory.jsHeapSizeLimit,
sessionDuration: Date.now() - this.sessionStart,
routeChanges: this.routeChanges,
url: window.location.pathname,
};
// Send to analytics
navigator.sendBeacon(
'/api/monitoring/memory',
JSON.stringify(report)
);
}
}
// Initialize in app
if (process.env.NODE_ENV === 'production') {
const reporter = new MemoryReporter();
reporter.start();
}
Production Checklist
Component Level
- Every useEffect has a cleanup function
- Event listeners use AbortController or explicit removal
- Timers (setTimeout, setInterval) are cleared
- Subscriptions return unsubscribe functions
- WebSocket connections are closed on unmount
Data Layer
- Caches have size limits (LRU, TTL)
- React Query gcTime configured appropriately
- Large data structures use bounded collections
- WeakMap/WeakRef for object references where applicable
Architecture
- Disposal pattern for complex resources
- Subscription manager for multiple subscriptions
- Resource pools for frequently created objects
- Memory monitoring in production
Testing
- Automated memory leak tests in CI
- Performance.memory tracking in staging
- Long-session manual testing (4+ hours)
- Memory profiling during development
Monitoring
- Memory usage reporting to analytics
- Alerts on memory growth patterns
- User session duration tracking
- Correlation with route changes
Summary
Memory leaks in long-lived applications are architectural problems, not just coding mistakes. They accumulate from:
- Forgotten cleanups - Event listeners, timers, subscriptions
- Closure captures - Large data retained by callbacks
- Unbounded growth - Caches, logs, history without limits
- Framework misuse - React effects without cleanup, stale closures
The solution is systematic:
- Disposal patterns - Centralized cleanup management
- Bounded structures - Hard limits on data growth
- Weak references - Let GC do its job
- Automated detection - Catch leaks before production
A dashboard that runs for 8 hours should use the same memory as one running for 8 minutes. Anything else is a bug waiting to crash your users' browsers.