JavaScript's Event Loop Is Lying to You
Beyond the basics — microtask queue, macrotask ordering, queueMicrotask, and how misunderstanding these causes subtle bugs in React state updates and async flows in production apps.
The Mental Model You Learned Is Incomplete
You've probably seen some version of this diagram:
┌─────────────────────────────────────────────────────────────────┐
│                      THE SIMPLIFIED MODEL                       │
├─────────────────────────────────────────────────────────────────┤
│                                                                 │
│   Call Stack               Event Queue                          │
│   ┌──────────┐             ┌──────────┐                         │
│   │          │   ◀──────   │ callback │                         │
│   │  func()  │             │ callback │                         │
│   │  main()  │             │ callback │                         │
│   └──────────┘             └──────────┘                         │
│                                                                 │
│   "When the stack is empty, pick from the queue"                │
│                                                                 │
└─────────────────────────────────────────────────────────────────┘
This model explains setTimeout and click handlers well enough. But it falls apart the moment you mix Promises, async/await, and React state updates. The reality is more nuanced — and the nuances bite.
The Actual Model: Multiple Queues
JavaScript doesn't have one queue. It has at least two, with different priorities:
┌─────────────────────────────────────────────────────────────────┐
│                        THE ACTUAL MODEL                         │
├─────────────────────────────────────────────────────────────────┤
│                                                                 │
│   Call Stack                                                    │
│   ┌──────────┐                                                  │
│   │   code   │                                                  │
│   │   runs   │                                                  │
│   │   here   │                                                  │
│   └────┬─────┘                                                  │
│        │                                                        │
│        ▼                                                        │
│  ┌──────────────────────────────────────────────────────────┐   │
│  │ MICROTASK QUEUE (high priority)                          │   │
│  │ Promise.then, queueMicrotask, MutationObserver           │   │
│  │ ──────────────────────────────────────────────────────── │   │
│  │ ALL microtasks drain before ANY macrotask runs           │   │
│  └────────────────────────────┬─────────────────────────────┘   │
│                               │                                 │
│                               ▼                                 │
│  ┌──────────────────────────────────────────────────────────┐   │
│  │ MACROTASK QUEUE (lower priority)                         │   │
│  │ setTimeout, setInterval, setImmediate, I/O, UI events    │   │
│  │ ──────────────────────────────────────────────────────── │   │
│  │ Only ONE macrotask runs, then check microtasks again     │   │
│  └──────────────────────────────────────────────────────────┘   │
│                                                                 │
└─────────────────────────────────────────────────────────────────┘
The critical difference:
MICROTASKS:
- Promise.then / .catch / .finally
- queueMicrotask()
- MutationObserver
- process.nextTick (Node.js - even higher priority)
MACROTASKS (also called "tasks"):
- setTimeout / setInterval
- setImmediate (Node.js)
- I/O callbacks
- UI rendering
- requestAnimationFrame (runs during the render step, just before paint — not a classic macrotask)
- User events (click, scroll, etc.)
The Event Loop Algorithm
Here's what actually happens on each "tick":
┌─────────────────────────────────────────────────────────────────┐
│                     EVENT LOOP ITERATION                        │
├─────────────────────────────────────────────────────────────────┤
│                                                                 │
│   1. Execute ONE macrotask (if available)                       │
│        │                                                        │
│        ▼                                                        │
│   2. Execute ALL microtasks                                     │
│        │  └── Including any NEW microtasks added in this step   │
│        │                                                        │
│        ▼                                                        │
│   3. Render (if needed - ~60fps, so roughly every 16ms)         │
│        │  ├── requestAnimationFrame callbacks                   │
│        │  ├── Style calculation                                 │
│        │  ├── Layout                                            │
│        │  └── Paint                                             │
│        │                                                        │
│        ▼                                                        │
│   4. Go to step 1                                               │
│                                                                 │
└─────────────────────────────────────────────────────────────────┘
KEY INSIGHT:
Microtasks are greedy - they ALL run before the next macrotask.
This includes microtasks that were queued BY other microtasks.
Proof: The Order Experiment
console.log('1: Sync start');

setTimeout(() => {
  console.log('2: setTimeout (macrotask)');
}, 0);

Promise.resolve().then(() => {
  console.log('3: Promise.then (microtask)');
});

queueMicrotask(() => {
  console.log('4: queueMicrotask (microtask)');
});

console.log('5: Sync end');

// Output:
// 1: Sync start
// 5: Sync end
// 3: Promise.then (microtask)
// 4: queueMicrotask (microtask)
// 2: setTimeout (macrotask)
Even though setTimeout(..., 0) was called first, it runs last. The microtasks jump the queue.
A More Complex Example
console.log('1: Start');

setTimeout(() => {
  console.log('2: Timeout 1');
  Promise.resolve().then(() => {
    console.log('3: Promise inside timeout 1');
  });
}, 0);

setTimeout(() => {
  console.log('4: Timeout 2');
}, 0);

Promise.resolve()
  .then(() => {
    console.log('5: Promise 1');
    return Promise.resolve();
  })
  .then(() => {
    console.log('6: Promise 2');
    queueMicrotask(() => {
      console.log('7: Microtask inside promise chain');
    });
  });

console.log('8: End');

// Output:
// 1: Start
// 8: End
// 5: Promise 1
// 6: Promise 2
// 7: Microtask inside promise chain
// 2: Timeout 1
// 3: Promise inside timeout 1
// 4: Timeout 2
Notice:
- All sync code runs first (1, 8)
- All microtasks drain (5, 6, 7)
- Then ONE macrotask (2)
- Its microtasks drain (3)
- Then next macrotask (4)
Where This Causes Real Bugs
Bug 1: The State Update Race
// React component with a subtle timing bug
function SearchComponent() {
  const [query, setQuery] = useState('');
  const [results, setResults] = useState([]);
  const [isSearching, setIsSearching] = useState(false);

  const handleSearch = async () => {
    setIsSearching(true);
    // The network response arrives via an I/O (macro)task;
    // the await resumes afterward as a microtask
    const response = await fetch(`/api/search?q=${query}`);
    const data = await response.json();
    setResults(data);
    setIsSearching(false);
  };

  // BUG: User types fast, what happens?
  //
  // Timeline:
  //   t=0:  User types "a", handleSearch() called
  //   t=10: User types "ab", handleSearch() called again
  //   t=50: First fetch completes for "a"
  //   t=60: Second fetch completes for "ab"
  //
  // Expected: Results for "ab"
  // Actual: Could be results for "a" (race condition)

  return (/* ... */);
}
The fix requires understanding that fetch responses are macrotasks that can arrive in any order:
function SearchComponent() {
  const [query, setQuery] = useState('');
  const [results, setResults] = useState([]);
  const [isSearching, setIsSearching] = useState(false);

  // Track the latest request
  const latestRequestRef = useRef<AbortController | null>(null);

  const handleSearch = async () => {
    // Cancel the previous request
    latestRequestRef.current?.abort();
    const controller = new AbortController();
    latestRequestRef.current = controller;

    setIsSearching(true);
    try {
      const response = await fetch(`/api/search?q=${query}`, {
        signal: controller.signal,
      });
      const data = await response.json();
      // Only update if this is still the latest request
      if (latestRequestRef.current === controller) {
        setResults(data);
        setIsSearching(false);
      }
    } catch (error) {
      if (error.name !== 'AbortError') {
        setIsSearching(false); // don't leave the spinner stuck on real errors
        throw error;
      }
      // Aborted: a newer request is in flight and will clear isSearching
    }
  };

  return (/* ... */);
}
Bug 2: The "setState Isn't Immediate" Confusion
function Counter() {
  const [count, setCount] = useState(0);

  const handleClick = () => {
    setCount(count + 1);
    console.log('Count after setState:', count); // Still 0!

    // WHY?
    // `count` is a const captured by this closure - setCount never mutates it.
    // React queues the update (batched by its scheduler) and re-renders later,
    // so the new value only exists in the NEXT render's closure.
  };

  return <button onClick={handleClick}>{count}</button>;
}
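The closure behavior is easy to reproduce without React. This sketch uses a toy `miniUseState` stand-in (a hypothetical helper, not React's implementation) to show why setting state queues an update instead of mutating the captured value:

```javascript
// Toy model of useState: state is only recomputed on "re-render".
// This is an illustrative sketch, NOT how React is implemented.
function miniUseState(initial) {
  let state = initial;
  const pending = [];
  const setState = (update) => {
    // Like React: just queue the update, don't mutate anything yet
    pending.push(update);
  };
  const rerender = () => {
    // "Re-render": apply the queued updates to produce the next state
    for (const update of pending) {
      state = typeof update === 'function' ? update(state) : update;
    }
    pending.length = 0;
    return state;
  };
  return [() => state, setState, rerender];
}

const [getCount, setCount, rerender] = miniUseState(0);

setCount(getCount() + 1);        // queues the value 1
const seenAfterSet = getCount(); // still 0 - nothing was mutated
const nextRender = rerender();   // 1 - the update applies on "re-render"

setCount(c => c + 1);            // functional updates see the latest value
setCount(c => c + 1);
const afterTwoFunctional = rerender(); // 3
```

The same mechanics explain Bug 3 below: three `setCount(count + 1)` calls all queue the same stale value, while functional updates compose.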
Understanding the timing:
┌─────────────────────────────────────────────────────────────────┐
│                    CLICK HANDLER EXECUTION                      │
├─────────────────────────────────────────────────────────────────┤
│                                                                 │
│   1. handleClick() starts (sync)                                │
│   2. setCount(1) - queues an update with React's scheduler      │
│   3. console.log(count) - runs sync, count is still 0           │
│   4. handleClick() ends                                         │
│   5. Event handler complete (macrotask done)                    │
│   6. Microtask queue / React scheduler runs                     │
│   7. React processes the batched updates                        │
│   8. Re-render with count = 1                                   │
│                                                                 │
└─────────────────────────────────────────────────────────────────┘
The right mental model:
function Counter() {
  const [count, setCount] = useState(0);

  const handleClick = () => {
    // Option 1: Use the updater form to see the next value
    setCount(prevCount => {
      const nextCount = prevCount + 1;
      console.log('Next count will be:', nextCount);
      return nextCount;
    });

    // Option 2: Use useEffect to react to changes (shown below)
  };

  useEffect(() => {
    console.log('Count changed to:', count);
  }, [count]);

  return <button onClick={handleClick}>{count}</button>;
}
Bug 3: The Multiple setState Batching Surprise
function MultipleUpdates() {
  const [count, setCount] = useState(0);

  const handleClick = () => {
    // How many re-renders?
    setCount(count + 1);
    setCount(count + 1);
    setCount(count + 1);
    // Answer: ONE re-render (React batches)
    // Final count: 1 (not 3!)
    // Because all three calls see count = 0
  };

  // Fix: Use functional updates
  const handleClickCorrect = () => {
    setCount(c => c + 1);
    setCount(c => c + 1);
    setCount(c => c + 1);
    // Still ONE re-render (batched)
    // But final count: 3 (each sees the previous result)
  };

  return (/* ... */);
}
Bug 4: The setTimeout Escape Hatch Footgun
Before React 18, this was a common "fix" that caused other bugs:
// Pre-React 18: setTimeout escaped batching
function OldPattern() {
  const [a, setA] = useState(0);
  const [b, setB] = useState(0);

  const handleClick = () => {
    // In React 17, this caused TWO renders:
    setTimeout(() => {
      setA(1);
      setB(1);
    }, 0);
    // React 17 only batched inside event handlers, so each setState
    // inside the timeout callback triggered its own render
  };

  // React 18 fixed this with automatic batching everywhere -
  // but old code relying on the non-batched behavior breaks
}
Bug 5: The Promise.resolve Timing Assumption
// Code that assumes Promise.resolve is "safe"
function DataSync() {
  const [localData, setLocalData] = useState(null);

  const syncWithServer = async () => {
    const serverData = await fetchFromServer();

    // Developer thinks: "I'll use Promise.resolve to ensure
    // state is set before I do the next thing"
    setLocalData(serverData);
    await Promise.resolve(); // "Flush state updates"

    // BUG: This doesn't guarantee React has re-rendered!
    // Promise.resolve only queues a microtask;
    // React's commit phase may not have happened yet
    console.log('Data should be synced now'); // Maybe not!
    sendAnalytics({ dataSynced: true }); // Potentially wrong
  };
}
The actual flush mechanism:
import { flushSync } from 'react-dom';

function DataSync() {
  const [localData, setLocalData] = useState(null);

  const syncWithServer = async () => {
    const serverData = await fetchFromServer();

    // flushSync forces a synchronous render and commit - use sparingly!
    flushSync(() => {
      setLocalData(serverData);
    });

    // NOW the DOM is guaranteed to reflect the new state
    console.log('Data is definitely synced');
    sendAnalytics({ dataSynced: true });
  };
}
The Microtask Starvation Problem
Because microtasks drain completely before the next macrotask, you can accidentally starve the event loop:
// DANGEROUS: Infinite microtask loop
function processInMicrotasks(items) {
  let index = 0;

  function processNext() {
    if (index < items.length) {
      processItem(items[index]);
      index++;
      queueMicrotask(processNext); // Schedule the next item as a microtask
    }
  }

  queueMicrotask(processNext);
}

processInMicrotasks(millionItems);

// PROBLEM:
// - All million items process before ANY macrotask
// - No UI updates possible
// - No setTimeout callbacks fire
// - User sees a frozen page
The correct approach:
// SAFE: Yield to the macrotask queue periodically
function processWithYielding(items, batchSize = 100) {
  let index = 0;

  function processBatch() {
    const batchEnd = Math.min(index + batchSize, items.length);
    // Process a batch synchronously
    while (index < batchEnd) {
      processItem(items[index]);
      index++;
    }
    if (index < items.length) {
      // Yield to the macrotask queue - allows UI updates, events
      setTimeout(processBatch, 0);
    }
  }

  processBatch();
}

// Even better with modern APIs:
let lastYield = performance.now();

async function processWithScheduler(items) {
  for (const item of items) {
    processItem(item);
    // Yield to the browser periodically
    if (shouldYield()) {
      await scheduler.yield(); // newer API - check browser support
      // or: await new Promise(r => setTimeout(r, 0));
      lastYield = performance.now();
    }
  }
}

function shouldYield() {
  // Yield if we've held the thread longer than ~5ms (to keep 60fps headroom)
  return performance.now() - lastYield > 5;
}
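The yielding behavior is directly observable. Below is a variant of `processWithYielding` with an injected `onItem` callback (an assumption added for testability): only the first batch runs synchronously, and everything else is deferred behind `setTimeout`.

```javascript
// Sketch: verify that batching really yields.
function processWithYielding(items, onItem, batchSize = 100) {
  let index = 0;

  function processBatch() {
    const batchEnd = Math.min(index + batchSize, items.length);
    while (index < batchEnd) {
      onItem(items[index]);
      index++;
    }
    if (index < items.length) {
      setTimeout(processBatch, 0); // yield to the macrotask queue
    }
  }

  processBatch();
}

const processed = [];
processWithYielding(
  Array.from({ length: 250 }, (_, i) => i),
  x => processed.push(x),
  100
);

// Only the first batch of 100 has run by the time the call returns;
// items 100-249 are processed later, across subsequent macrotasks
const processedSynchronously = processed.length; // 100
```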
queueMicrotask: The Right Tool
queueMicrotask was added to give explicit microtask control without Promise overhead:
// These are roughly equivalent:
Promise.resolve().then(() => doSomething());
queueMicrotask(() => doSomething());

// But queueMicrotask:
// 1. Has less overhead (no Promise object created)
// 2. Expresses intent more clearly
// 3. Surfaces errors differently

// Error handling difference:
Promise.resolve().then(() => {
  throw new Error('Promise error');
});
// The error becomes an unhandled promise rejection,
// reported via the unhandledrejection event

queueMicrotask(() => {
  throw new Error('Microtask error');
});
// The error is thrown as an ordinary uncaught exception when the
// microtask runs - it hits window.onerror / uncaughtException,
// with a conventional stack trace
When to Use queueMicrotask
// USE CASE 1: Coalesce multiple sync calls
class EventEmitter {
  private listeners: Set<Function> = new Set();
  private pendingNotify = false;

  emit() {
    if (!this.pendingNotify) {
      this.pendingNotify = true;
      queueMicrotask(() => {
        this.pendingNotify = false;
        this.notifyListeners();
      });
    }
  }
  // Multiple emit() calls in the same sync block = one notify
}

// USE CASE 2: Ensure a callback is always async
function fetchWithCallback(url: string, callback: (data: any) => void) {
  const cached = cache.get(url);
  if (cached) {
    // DON'T: callback(cached) - would be sync
    // DO: ensure async consistency
    queueMicrotask(() => callback(cached));
  } else {
    fetch(url)
      .then(r => r.json())
      .then(data => {
        cache.set(url, data);
        callback(data); // Already async from fetch
      });
  }
}

// USE CASE 3: Post-construction initialization
class Component {
  constructor() {
    // Can't call methods that depend on full construction
    queueMicrotask(() => {
      this.initialize(); // Runs after the constructor completes
    });
  }
}
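Use case 2 (the always-async callback) can be verified directly. `getData` and `cache` below are illustrative names, not a real API — the point is that even on a cache hit, the caller can never observe the callback firing synchronously:

```javascript
// Sketch of the "always async" guarantee from use case 2.
const cache = new Map([['users', ['ada', 'grace']]]);

function getData(key, callback) {
  const cached = cache.get(key);
  if (cached !== undefined) {
    // Defer the cached result so callers see consistent async timing
    queueMicrotask(() => callback(cached));
    return;
  }
  // ...fall back to a real async fetch here...
}

let delivered = null;
getData('users', data => { delivered = data; });

// Synchronously after the call, the callback has NOT run yet -
// it is parked on the microtask queue
const deliveredSynchronously = delivered; // null
```

Without the `queueMicrotask`, cached calls would complete synchronously and uncached calls asynchronously — the inconsistency sometimes called "releasing Zalgo".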
React's Relationship with the Event Loop
React has its own scheduler that works with (not against) the event loop:
┌─────────────────────────────────────────────────────────────────┐
│                    REACT'S SCHEDULING MODEL                     │
├─────────────────────────────────────────────────────────────────┤
│                                                                 │
│   Event occurs (click, etc.)                                    │
│        │                                                        │
│        ▼                                                        │
│   React schedules the update (microtask or internal scheduler)  │
│        │                                                        │
│        ▼                                                        │
│  ┌─────────────────────────────────────────────────────────┐    │
│  │ RENDER PHASE (interruptible with concurrent rendering)  │    │
│  │  - Call component functions                             │    │
│  │  - Build the new virtual DOM                            │    │
│  │  - Compute diffs                                        │    │
│  │  - Can be interrupted by higher-priority work           │    │
│  └────────────────────────────┬────────────────────────────┘    │
│                               │                                 │
│                               ▼                                 │
│  ┌─────────────────────────────────────────────────────────┐    │
│  │ COMMIT PHASE (synchronous, uninterruptible)             │    │
│  │  - Apply DOM changes                                    │    │
│  │  - Run useLayoutEffect                                  │    │
│  │  - Schedule useEffect (via React's scheduler)           │    │
│  └─────────────────────────────────────────────────────────┘    │
│                                                                 │
└─────────────────────────────────────────────────────────────────┘
useEffect vs useLayoutEffect Timing
function TimingDemo() {
  const [count, setCount] = useState(0);

  useLayoutEffect(() => {
    // Runs SYNCHRONOUSLY after DOM mutation, before paint
    // Blocks browser paint
    // Good for: DOM measurements, preventing flicker
    console.log('useLayoutEffect:', count);
  }, [count]);

  useEffect(() => {
    // Runs AFTER browser paint
    // Scheduled as a macrotask (or via React's own scheduling)
    // Good for: data fetching, subscriptions, most side effects
    console.log('useEffect:', count);
  }, [count]);

  console.log('Render:', count);
  return <div>{count}</div>;
}

// Click to increment count from 0 to 1:
//
// Output order:
// 1. "Render: 1"           (during the render phase)
// 2. "useLayoutEffect: 1"  (sync, after DOM update, before paint)
// 3. Browser paints
// 4. "useEffect: 1"        (after paint)
The Concurrent Mode Twist
React 18's concurrent rendering (the features behind what was once called "Concurrent Mode") adds more complexity:
function ConcurrentExample() {
  const [isPending, startTransition] = useTransition();
  const [count, setCount] = useState(0);

  const handleClick = () => {
    // Urgent update - high priority
    setCount(c => c + 1);

    // Transition - low priority, interruptible
    // (setExpensiveState / computeExpensive are illustrative names)
    startTransition(() => {
      setExpensiveState(computeExpensive(count));
    });

    // React may:
    // 1. Process the urgent update immediately
    // 2. Start the transition
    // 3. Interrupt the transition if the user clicks again
    // 4. Restart the transition with the new state
  };
}
Async/Await: Syntax Sugar with Timing Implications
async/await is Promise syntax sugar, which means microtask timing:
async function example() {
  console.log('1: Before await');
  await Promise.resolve();
  console.log('2: After await');
}

console.log('3: Before call');
example();
console.log('4: After call');

// Output:
// 3: Before call
// 1: Before await
// 4: After call
// 2: After await

// The await splits the function at a microtask boundary
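You can observe the split directly: calling an async function executes it synchronously up to the first `await`, and everything after that resumes as a microtask. A minimal sketch:

```javascript
// Calling an async function runs its body synchronously up to the
// first await; the continuation is queued as a microtask.
let phase = 'not started';

async function example() {
  phase = 'before await'; // runs synchronously, during the call itself
  await Promise.resolve();
  phase = 'after await';  // runs later, as a microtask
}

example();

// Immediately after the call returns, the continuation has NOT run:
const phaseRightAfterCall = phase; // 'before await'
```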
The "Await Nothing" Pattern
// Sometimes you need to yield to microtask queue
async function processWithBreathing(items: Item[]) {
for (let i = 0; i < items.length; i++) {
processItem(items[i]);
// Yield to microtask queue every 100 items
// Allows other microtasks (like React updates) to run
if (i % 100 === 0) {
await Promise.resolve();
}
}
}
// But remember: this still blocks macrotasks!
// For true UI responsiveness, yield to macrotask queue:
async function processWithUIBreathing(items: Item[]) {
for (let i = 0; i < items.length; i++) {
processItem(items[i]);
if (i % 100 === 0) {
// Yield to macrotask queue - allows UI updates
await new Promise(r => setTimeout(r, 0));
}
}
}
Multiple Awaits vs Promise.all
// Sequential (slow)
async function sequential() {
  const a = await fetchA(); // Wait
  const b = await fetchB(); // Then wait
  const c = await fetchC(); // Then wait
  return [a, b, c];
}
// Total time: fetchA + fetchB + fetchC

// Parallel (fast)
async function parallel() {
  const [a, b, c] = await Promise.all([
    fetchA(),
    fetchB(),
    fetchC(),
  ]);
  return [a, b, c];
}
// Total time: max(fetchA, fetchB, fetchC)

// But there's a timing subtlety:
async function parallelWithWork() {
  // These fire immediately
  const promiseA = fetchA();
  const promiseB = fetchB();

  // This sync work happens while the fetches are in flight
  const syncResult = heavyComputation();

  // Now we wait for the fetches
  const [a, b] = await Promise.all([promiseA, promiseB]);
  return { a, b, syncResult };
}
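The "start early, await late" subtlety is observable with a stub: the side effects of calling the function happen immediately, before any `await`. `fakeFetch` below is a stand-in for a network call, not a real API:

```javascript
// The work starts the moment you CALL the function,
// not the moment you await its promise.
const started = [];

function fakeFetch(name) {
  started.push(name); // side effect runs immediately, on call
  return Promise.resolve(`${name}-data`);
}

const promiseA = fakeFetch('A');
const promiseB = fakeFetch('B');

// Both "requests" are already in flight before any await happens
const startedBeforeAwait = started.length; // 2

// ...later: const [a, b] = await Promise.all([promiseA, promiseB]);
```

This is why `Promise.all([fetchA(), fetchB()])` parallelizes: both calls happen synchronously, and `Promise.all` merely waits for results that are already in flight.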
Node.js: Additional Queue Complexity
Node.js adds more queues and process.nextTick:
┌─────────────────────────────────────────────────────────────────┐
│                       NODE.JS EVENT LOOP                        │
├─────────────────────────────────────────────────────────────────┤
│                                                                 │
│  ┌─────────────────────────────────────────────────────────┐    │
│  │ process.nextTick queue (HIGHEST priority)               │    │
│  │ Runs BEFORE microtasks                                  │    │
│  └────────────────────────────┬────────────────────────────┘    │
│                               ▼                                 │
│  ┌─────────────────────────────────────────────────────────┐    │
│  │ Microtask queue (Promise.then, queueMicrotask)          │    │
│  └────────────────────────────┬────────────────────────────┘    │
│                               ▼                                 │
│  ┌─────────────────────────────────────────────────────────┐    │
│  │ Timers phase (setTimeout, setInterval)                  │    │
│  └────────────────────────────┬────────────────────────────┘    │
│                               ▼                                 │
│  ┌─────────────────────────────────────────────────────────┐    │
│  │ I/O callbacks phase                                     │    │
│  └────────────────────────────┬────────────────────────────┘    │
│                               ▼                                 │
│  ┌─────────────────────────────────────────────────────────┐    │
│  │ setImmediate phase                                      │    │
│  └────────────────────────────┬────────────────────────────┘    │
│                               ▼                                 │
│  ┌─────────────────────────────────────────────────────────┐    │
│  │ Close callbacks phase (socket.on('close'))              │    │
│  └─────────────────────────────────────────────────────────┘    │
│                                                                 │
│  Between callbacks, Node drains the nextTick queue and then     │
│  the microtask queue (since Node 11, not just once per phase)   │
│                                                                 │
└─────────────────────────────────────────────────────────────────┘
// Node.js ordering example
setImmediate(() => console.log('1: setImmediate'));
setTimeout(() => console.log('2: setTimeout'), 0);
process.nextTick(() => console.log('3: nextTick'));
Promise.resolve().then(() => console.log('4: Promise'));
console.log('5: Sync');

// Output:
// 5: Sync
// 3: nextTick     (highest-priority async)
// 4: Promise      (microtask, after nextTick)
// 2: setTimeout   (OR...)
// 1: setImmediate (the order between these two depends on timing)
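Two parts of this ordering can be pinned down deterministically (unlike the top-level setTimeout/setImmediate race): nothing asynchronous runs until the synchronous script finishes, and the nextTick queue drains before Promise microtasks. A small Node-only sketch:

```javascript
// Node-only: demonstrate the deterministic part of the ordering.
const order = [];

process.nextTick(() => order.push('nextTick'));
Promise.resolve().then(() => order.push('promise'));

// Both callbacks are still queued while synchronous code runs
const lengthDuringSync = order.length; // 0

queueMicrotask(() => {
  // By the time any microtask runs, the nextTick queue has fully
  // drained, so 'nextTick' is guaranteed to come first
  console.log(order); // [ 'nextTick', 'promise' ]
});
```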
Debugging Event Loop Issues
Tool 1: Logging with Timing Context
function logWithTiming(label: string) {
  const stack = new Error().stack?.split('\n')[2] || '';
  console.log(`[${performance.now().toFixed(2)}ms] ${label}`, stack.trim());
}

// Usage
logWithTiming('State update called');
setState(newValue);
logWithTiming('After setState');
Tool 2: Visualizing Queue Execution
function traceEventLoop() {
  console.log('=== SYNC START ===');

  queueMicrotask(() => console.log('MICRO 1'));

  setTimeout(() => {
    console.log('MACRO 1');
    queueMicrotask(() => console.log('MICRO inside MACRO 1'));
  }, 0);

  Promise.resolve().then(() => {
    console.log('MICRO 2 (promise)');
    queueMicrotask(() => console.log('MICRO 3 (nested)'));
  });

  setTimeout(() => console.log('MACRO 2'), 0);

  console.log('=== SYNC END ===');
}
// The trace output shows the exact ordering
Tool 3: Detecting Microtask Starvation
function detectStarvation() {
  let macrotaskRan = false;

  setTimeout(() => {
    macrotaskRan = true;
  }, 0);

  const start = performance.now();
  let iterations = 0;

  function checkStarvation() {
    iterations++;
    if (!macrotaskRan && iterations < 100000) {
      queueMicrotask(checkStarvation);
    } else {
      const duration = performance.now() - start;
      // Because microtasks starve the timer, this typically stops at the
      // iteration cap with macrotaskRan still false
      console.log(`${iterations} microtasks in ${duration}ms; macrotask ran: ${macrotaskRan}`);
      if (duration > 16) {
        console.warn('Possible frame drop - microtask queue too long');
      }
    }
  }

  queueMicrotask(checkStarvation);
}
Patterns and Anti-Patterns
Anti-Pattern: Assuming Sync After Await
// ❌ WRONG: Assuming an immediate update
async function saveAndNotify() {
  await api.save(data);
  setIsSaved(true);
  showNotification('Saved!'); // May show before the UI updates
}

// ✅ RIGHT: React to the state change in an effect
async function saveAndNotify() {
  await api.save(data);
  setIsSaved(true);
  // The notification is shown via a useEffect watching isSaved:
}

useEffect(() => {
  if (isSaved) {
    showNotification('Saved!');
  }
}, [isSaved]);
Anti-Pattern: setTimeout(0) for "Next Tick"
// ❌ MISLEADING: setTimeout(0) isn't "next tick"
setTimeout(() => {
  // This runs after ALL pending microtasks
  // AND possibly after browser paint
  // Nested timers are also clamped to >= 4ms, so "0" can mean 4ms+
}, 0);

// ✅ CLEAR: Use queueMicrotask for microtask timing
queueMicrotask(() => {
  // Runs after the current sync code, before any macrotask
});

// ✅ CLEAR: Use setTimeout when you WANT macrotask timing
setTimeout(() => {
  // Runs after microtasks drain and possibly after paint
}, 0);
Anti-Pattern: Polling Microtasks
// ❌ DANGEROUS: Polling with microtasks
async function waitForCondition() {
  while (!condition) {
    await Promise.resolve(); // Infinite microtask loop!
  }
}

// ✅ SAFE: Polling with macrotasks
async function waitForCondition() {
  while (!condition) {
    await new Promise(r => setTimeout(r, 10)); // Yields properly
  }
}

// ✅ BETTER: Event-driven
function waitForCondition() {
  return new Promise(resolve => {
    eventEmitter.once('conditionMet', resolve);
  });
}
Pattern: Debouncing Across Microtasks
// Coalesce rapid updates into one
function createMicrotaskDebouncer<T>(
  callback: (value: T) => void
) {
  let pending: T | undefined;
  let scheduled = false;

  return (value: T) => {
    pending = value;
    if (!scheduled) {
      scheduled = true;
      queueMicrotask(() => {
        scheduled = false;
        callback(pending!);
      });
    }
  };
}

// Usage
const debouncedUpdate = createMicrotaskDebouncer((value: number) => {
  console.log('Update:', value);
});

// All these calls in the same sync block = one callback
debouncedUpdate(1);
debouncedUpdate(2);
debouncedUpdate(3);
// Logs: "Update: 3"
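The coalescing is observable synchronously: after three back-to-back calls, the callback still hasn't fired, because the single notification is parked on the microtask queue. A plain-JavaScript sketch of the same debouncer:

```javascript
// Plain-JS version of the microtask debouncer, with a snapshot taken
// synchronously to show that the callback is deferred.
function createMicrotaskDebouncer(callback) {
  let pending;
  let scheduled = false;

  return (value) => {
    pending = value;
    if (!scheduled) {
      scheduled = true;
      queueMicrotask(() => {
        scheduled = false;
        callback(pending); // fires once, with the LAST value
      });
    }
  };
}

const calls = [];
const update = createMicrotaskDebouncer(v => calls.push(v));

update(1);
update(2);
update(3);

// Still inside the same sync block: nothing has fired yet
const callsDuringSync = calls.length; // 0
// One microtask later, calls becomes [3]
```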
Quick Reference
Timing Cheat Sheet
IMMEDIATE (sync):
├── Regular code execution
├── throw new Error()
└── console.log()
MICROTASK (before next macrotask):
├── Promise.then/.catch/.finally
├── queueMicrotask()
├── MutationObserver
├── process.nextTick() (Node - even higher priority)
└── await (after the awaited promise resolves)
MACROTASK (after all microtasks):
├── setTimeout / setInterval
├── setImmediate (Node)
├── I/O callbacks
├── UI events (click, scroll)
└── requestAnimationFrame (runs in the render step, before paint)
AFTER PAINT:
├── requestIdleCallback
└── IntersectionObserver callbacks
React Timing
EVENT HANDLER RUNS (macrotask)
  │
  ├── setState() calls queue updates
  │
  └── Event handler completes
        │
        ▼
MICROTASKS / REACT SCHEDULER
  │
  ├── React batches pending updates
  ├── Calls component functions (render phase)
  │
  └── Applies changes to the DOM (commit phase)
        │
        ├── useLayoutEffect (sync, blocks paint)
        │
        ▼
BROWSER PAINT
  │
  ▼
useEffect (after paint)
Debugging Questions
"Why isn't my state update showing?"
→ Check if you're reading state synchronously after setState
→ State updates are batched and applied in microtasks/scheduler
"Why is my effect running multiple times?"
→ Check dependency array
→ Check if dependencies are new objects each render
"Why is my UI freezing?"
→ Check for microtask starvation (infinite queueMicrotask loops)
→ Check for long synchronous operations
"Why are my requests returning in wrong order?"
→ Race conditions - later macrotasks can complete before earlier ones
→ Use AbortController or request ID tracking
Closing Thoughts
The event loop isn't lying to you — it's just more complicated than the simplified model suggests. The distinction between microtasks and macrotasks isn't academic; it directly impacts:
- When your React state updates become visible
- Whether your UI stays responsive during heavy operations
- How race conditions manifest in async code
- Why "just add setTimeout(0)" sometimes fixes things and sometimes makes them worse
The mental model to carry with you: microtasks are greedy, macrotasks are fair. Microtasks drain completely, starving everything else. Macrotasks take turns, allowing the browser to breathe.
When you internalize this, the timing bugs that used to feel random start making sense. And that's when you can start writing async code that works correctly not by accident, but by design.
The event loop isn't magic. It's just a very specific algorithm that runs in a very specific order. Learn the order, and you control the timing.