How `async`/`await` Actually Works — The State Machine V8 Builds From Your Code
async/await feels like writing synchronous code that magically handles asynchronous operations. But there's no magic — V8 transforms every async function into a resumable state machine backed by implicit Promise allocations, microtask scheduling, and generator-like suspend/resume mechanics. This article traces the complete transformation: from your source code through the parser, into Ignition bytecodes, and down to the microtask queue that drives execution.
The Spec: What async/await Must Do
The ECMAScript specification (§15.8) defines async functions with precise semantics:
- An `async` function always returns a Promise
- An `await` expression suspends execution, wrapping the awaited value in `Promise.resolve()`, then resumes when it settles
- Return values are wrapped: `return 42` → the returned Promise resolves with 42
- Thrown exceptions: `throw err` → the returned Promise rejects with err
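These rules can be observed directly:

```javascript
// Observing the spec semantics: return values resolve, throws reject.
async function ok() { return 42; }             // resolves with 42
async function bad() { throw new Error('x'); } // rejects with the error

console.log(ok() instanceof Promise); // true — always a Promise
ok().then(v => console.log(v));       // 42
bad().catch(e => console.log(e.message)); // x
```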
┌─────────────────────────────────────────────────────────────────────────────┐
│ async/await DESUGARING (CONCEPTUAL) │
├─────────────────────────────────────────────────────────────────────────────┤
│ │
│ Your code: │
│ async function fetchUser(id) { │
│ const response = await fetch(`/api/users/${id}`); │
│ const user = await response.json(); │
│ return user; │
│ } │
│ │
│ What V8 effectively does: │
│ function fetchUser(id) { │
│ return new Promise((resolve, reject) => { │
│ let state = 0; │
│ let response, user; │
│ │
│ function step(value) { │
│ try { │
│ switch (state) { │
│ case 0: │
│ state = 1; │
│ return Promise.resolve(fetch(`/api/users/${id}`)) │
│ .then(step, reject); │
│ case 1: │
│ response = value; │
│ state = 2; │
│ return Promise.resolve(response.json()) │
│ .then(step, reject); │
│ case 2: │
│ user = value; │
│ resolve(user); │
│ return; │
│ } │
│ } catch (err) { │
│ reject(err); │
│ } │
│ } │
│ step(); │
│ }); │
│ } │
│ │
│ Each `await` becomes a state transition point. │
│ The function "suspends" by returning, and "resumes" via .then(). │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
This conceptual desugaring is close but not exact. V8's actual implementation is more optimized — let's trace it.
V8's Internal Representation: JSAsyncFunctionObject
When V8 encounters an async function, it creates a JSAsyncFunctionObject at runtime. The key data structures:
┌─────────────────────────────────────────────────────────────────────────────┐
│ V8 INTERNAL OBJECTS FOR async FUNCTION │
├─────────────────────────────────────────────────────────────────────────────┤
│ │
│ JSAsyncFunctionObject │
│ ├── function: the async function's code │
│ ├── context: lexical scope (closures) │
│ ├── receiver: `this` binding │
│ ├── input_or_debug_pos: current awaited value or debug info │
│ ├── resume_mode: how to resume (next/throw/return) │
│ ├── continuation: bytecode offset to resume at │
│ ├── parameters_and_registers: saved local variables │
│ └── promise: the outer Promise returned to caller │
│ │
│ This is essentially a generator with an implicit Promise wrapper. │
│ V8 actually implements async functions using the same machinery │
│ as generators — they share the suspend/resume infrastructure. │
│ │
│ Source: src/objects/js-generator.h │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
Ignition Bytecodes: The Actual State Machine
V8's Ignition interpreter generates specific bytecodes for async/await. Let's see what bytecodes your function produces:
async function fetchUser(id) {
const response = await fetch(`/api/users/${id}`);
const user = await response.json();
return user;
}
┌─────────────────────────────────────────────────────────────────────────────┐
│ IGNITION BYTECODES (simplified) │
├─────────────────────────────────────────────────────────────────────────────┤
│ │
│ // === Function entry === │
│ 0: CreateAsyncFunctionObject │
│ → Allocates JSAsyncFunctionObject │
│ → Creates the outer Promise (the one returned to caller) │
│ │
│ // === Before first await === │
│ 4: LdaGlobal [fetch] // Load `fetch` │
│ 8: CallProperty1 r0, r1 // Call fetch(url) │
│ 12: Star r2 // Store result (a Promise) in r2 │
│ │
│ // === First await === │
│ 14: SuspendGenerator r2 // Save registers + bytecode offset │
│ → Saves: continuation = 18 (bytecode offset to resume at) │
│ → Saves: all live registers (r0, r1, r2, ...) │
│ → Returns to caller with the awaited promise │
│ │
│ // ... execution returns to the event loop ... │
│ // ... microtask fires when fetch resolves ... │
│ │
│ // === Resume after first await === │
│ 18: ResumeGenerator r2 // Restore registers │
│ → Restores all saved local variables │
│ → resume_mode determines: value (resolve) or throw (reject) │
│ 22: Star r3 // response = resolved value │
│ │
│ // === Before second await === │
│ 24: CallProperty0 r3 [json] // Call response.json() │
│ 28: Star r4 // Store result │
│ │
│ // === Second await === │
│ 30: SuspendGenerator r4 // Save registers + offset │
│ → continuation = 34 │
│ → Returns to event loop again │
│ │
│ // === Resume after second await === │
│ 34: ResumeGenerator r4 │
│ 38: Star r5 // user = resolved value │
│ │
│ // === Return === │
│ 40: AsyncFunctionResolve r5 // Resolve the outer Promise with user │
│ 42: Return │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
The key bytecodes:
- `SuspendGenerator`: saves all registers and the current bytecode offset (the continuation point). The async function "freezes" here.
- `ResumeGenerator`: restores all registers and jumps to the saved continuation offset. The function "thaws" here.
- `AsyncFunctionResolve`: resolves the outer promise (the one returned to the caller).
- `AsyncFunctionReject`: rejects the outer promise (when an exception escapes the function).
The Await Mechanism: Promise Resolution and Microtasks
When V8 hits an await, it doesn't just call .then(). The actual mechanism involves several internal operations:
// V8 internal pseudocode for `await expression`
function PerformAwait(asyncFunction, value, outerPromise) {
// Step 1: Wrap the value in a Promise (if it isn't one already)
const promise = PromiseResolve(value);
// Step 2: Create internal resolve/reject handlers
// These handlers will RESUME the async function
const onFulfilled = CreateAsyncFunctionResume(asyncFunction, NEXT);
const onRejected = CreateAsyncFunctionResume(asyncFunction, THROW);
// Step 3: Attach handlers to the promise
// This is an *internal* .then() — not user-visible
PerformPromiseThen(promise, onFulfilled, onRejected);
// Step 4: Suspend — return control to the caller/event loop
// The async function's registers are saved on the JSAsyncFunctionObject
}
┌─────────────────────────────────────────────────────────────────────────────┐
│ AWAIT EXECUTION FLOW │
├─────────────────────────────────────────────────────────────────────────────┤
│ │
│ Code: const result = await somePromise; │
│ │
│ Step 1: PromiseResolve(somePromise) │
│ ├── Is somePromise already a native Promise? │
│ │ ├── YES → use it directly (no wrapping) ◄── V8 optimization │
│ │ └── NO → Promise.resolve(somePromise) │
│ │ → Creates a new Promise that resolves to the value │
│ └── Result: a Promise p │
│ │
│ Step 2: PerformPromiseThen(p, onFulfilled, onRejected) │
│ ├── If p is already resolved: │
│ │ → EnqueueMicrotask(onFulfilled(p.result)) │
│ └── If p is pending: │
│ → Add onFulfilled/onRejected to p's reaction list │
│ → They'll fire when p settles │
│ │
│ Step 3: Suspend async function │
│ ├── Save all registers to JSAsyncFunctionObject │
│ ├── Save continuation bytecode offset │
│ └── Return control (to caller or event loop) │
│ │
│ ... time passes, promise settles ... │
│ │
│ Step 4: Microtask fires → onFulfilled(resolvedValue) executes │
│ ├── ResumeGenerator: restore registers │
│ ├── resume_mode = NEXT → result = resolvedValue │
│ └── Continue executing bytecodes after the await │
│ │
│ OR if rejected: │
│ Step 4: Microtask fires → onRejected(reason) executes │
│ ├── ResumeGenerator: restore registers │
│ ├── resume_mode = THROW → throw reason at the await point │
│ └── Enters try/catch if one exists, or rejects outer promise │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
The Extra Microtask Tick (and Optimization)
In older V8 versions, await always created an extra wrapping Promise — adding an unnecessary microtask tick. V8 v7.2+ (Node.js 12+) optimized this:
┌─────────────────────────────────────────────────────────────────────────────┐
│ AWAIT MICROTASK OPTIMIZATION │
├─────────────────────────────────────────────────────────────────────────────┤
│ │
│ Old behavior (V8 < 7.2): │
│ await nativePromise: │
│ 1. Create throwaway Promise │
│ 2. PerformPromiseThen on throwaway │
│ 3. Throwaway resolves → microtask → create ANOTHER Promise │
│ 4. That resolves → microtask → finally resume │
│ Total: 3 Promises created, 2+ extra microtask ticks │
│ │
│ New behavior (V8 7.2+): │
│ await nativePromise: │
│ 1. Detect that value is already a native Promise │
│ 2. Skip wrapping → attach handlers directly │
│ 3. Promise resolves → microtask → resume │
│ Total: 0 extra Promises, 1 microtask tick │
│ │
│ This is the "fast async" optimization from V8 blog: │
│ https://v8.dev/blog/fast-async │
│ │
│ Check: IsPromise(value) && value.constructor === Promise │
│ If true → skip Promise.resolve() wrapping entirely │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
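The single-tick behavior is observable by racing an `await` against a `.then()` chain (assuming Node.js 12+ / V8 7.2+):

```javascript
// With the fast path, resuming after `await nativePromise` costs exactly one
// microtask tick — the same queue depth as the first .then() handler.
Promise.resolve()
  .then(() => console.log('then 1'))
  .then(() => console.log('then 2'));

(async () => {
  await Promise.resolve();
  console.log('after await'); // interleaves between 'then 1' and 'then 2'
})();
// Output: then 1, after await, then 2
// (Pre-7.2 V8 would print 'after await' last, after extra wrapping ticks.)
```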
Microtask Scheduling: The Execution Order
Understanding await requires understanding microtask scheduling:
async function foo() {
console.log('A');
await Promise.resolve();
console.log('B');
}
console.log('1');
foo();
console.log('2');
Output: 1, A, 2, B
┌─────────────────────────────────────────────────────────────────────────────┐
│ EXECUTION TRACE │
├─────────────────────────────────────────────────────────────────────────────┤
│ │
│ Call Stack Microtask Queue │
│ ────────── ─────────────── │
│ 1. console.log('1') (empty) │
│ Output: "1" │
│ │
│ 2. foo() called │
│ → console.log('A') (empty) │
│ Output: "A" │
│ │
│ 3. await Promise.resolve() │
│ → Promise.resolve() is │
│ already resolved │
│ → Enqueue resume handler [resumeFoo] │
│ → SUSPEND foo ← foo returns here │
│ → Return to caller │
│ │
│ 4. console.log('2') [resumeFoo] │
│ Output: "2" │
│ │
│ 5. Call stack empty → drain (empty) │
│ microtask queue │
│ → resumeFoo executes │
│ → console.log('B') │
│ Output: "B" │
│ │
│ Final output: 1, A, 2, B │
│ │
│ KEY: Code after `await` ALWAYS runs in a microtask, │
│ even if the awaited value is already resolved. │
│ This ensures consistent async behavior. │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
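The same queue discipline explains why resumed async code always beats timers — the microtask queue drains completely before the event loop runs the next macrotask:

```javascript
setTimeout(() => console.log('timeout'), 0); // macrotask (timer queue)

(async () => {
  await Promise.resolve();
  console.log('resume'); // microtask — runs first
})();
// Output: resume, timeout
```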
Multiple Awaits: The State Machine in Action
async function pipeline() {
console.log('start');
const a = await step1(); // await point 0 → state 1
console.log('after step1');
const b = await step2(a); // await point 1 → state 2
console.log('after step2');
const c = await step3(b); // await point 2 → state 3
console.log('done');
return c;
}
┌─────────────────────────────────────────────────────────────────────────────┐
│ STATE MACHINE EXECUTION │
├─────────────────────────────────────────────────────────────────────────────┤
│ │
│ Continuation Code Region Next State │
│ ──────────── ─────────── ────────── │
│ Entry (0) log('start'), call step1() → Suspend at await 0 │
│ │
│ Resume (1) a = resolved_value await 0 resolved │
│ log('after step1') │
│ call step2(a) → Suspend at await 1 │
│ │
│ Resume (2) b = resolved_value await 1 resolved │
│ log('after step2') │
│ call step3(b) → Suspend at await 2 │
│ │
│ Resume (3) c = resolved_value await 2 resolved │
│ log('done') │
│ AsyncFunctionResolve(c) → Done │
│ │
│ Each await point = one SuspendGenerator + one ResumeGenerator │
│ Each resume happens in a NEW microtask (new call stack) │
│ Local variables a, b, c survive across suspensions │
│ (saved in JSAsyncFunctionObject.parameters_and_registers) │
│ │
│ Stack trace at resume (3): │
│ pipeline (resumed at bytecode offset 34) │
│ ← no parent frames from original call! Clean microtask stack. │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
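A runnable sketch of the pipeline, with hypothetical step functions filled in, shows the locals surviving all three suspensions:

```javascript
// Hypothetical step implementations for the pipeline above.
const step1 = async () => 10;
const step2 = async (a) => a + 5;
const step3 = async (b) => b * 2;

async function pipeline() {
  console.log('start');
  const a = await step1();  // suspend/resume #1
  console.log('after step1');
  const b = await step2(a); // suspend/resume #2
  console.log('after step2');
  const c = await step3(b); // suspend/resume #3
  console.log('done');
  return c;
}

pipeline().then(c => console.log(c)); // 30 — a, b, c survived three suspensions
```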
Error Handling: try/catch with await
async function safeFetch(url) {
try {
const response = await fetch(url); // Can reject
const data = await response.json(); // Can reject
return data;
} catch (error) {
return { error: error.message };
}
}
┌─────────────────────────────────────────────────────────────────────────────┐
│ ERROR HANDLING IN async STATE MACHINE │
├─────────────────────────────────────────────────────────────────────────────┤
│ │
│ When an awaited Promise REJECTS: │
│ 1. Microtask fires: onRejected(reason) handler │
│ 2. ResumeGenerator with mode = THROW │
│ 3. V8 throws `reason` at the exact bytecode of the await │
│ 4. Normal exception propagation: walks up the handler table │
│ 5. If try/catch exists → caught, execution continues in catch │
│ 6. If no catch → AsyncFunctionReject(outerPromise, reason) │
│ │
│ Bytecodes for try/catch around await: │
│ │
│ PushTryCatch handler=@catch_offset │
│ ... (code inside try) │
│ SuspendGenerator // await inside try │
│ ResumeGenerator // resume — check mode │
│ JumpIfResumeThrow @catch_offset // if THROW → jump to catch │
│ ... (continue try block) │
│ PopTryCatch │
│ Jump @end │
│ │
│ @catch_offset: │
│ ... (catch block code) │
│ │
│ The handler table maps bytecode ranges to catch blocks. │
│ SuspendGenerator saves the handler table entry too, │
│ so ResumeGenerator can throw into the correct catch scope. │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
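The THROW resume mode is what makes an ordinary `catch` block work across a suspension:

```javascript
async function safe() {
  try {
    await Promise.reject(new Error('boom')); // rejection → resume with THROW
    return 'unreachable';
  } catch (e) {
    return 'caught: ' + e.message; // the throw lands at the await, inside try
  }
}

safe().then(console.log); // caught: boom
```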
Promise Allocation Cost
Every async function creates at least one Promise (the outer return Promise). Each await may create additional Promises for wrapping:
┌─────────────────────────────────────────────────────────────────────────────┐
│ PROMISE ALLOCATION COUNT │
├─────────────────────────────────────────────────────────────────────────────┤
│ │
│ async function f() { │
│ const a = await promise1; // 0-1 extra Promises (optimized: 0) │
│ const b = await promise2; // 0-1 extra Promises │
│ return a + b; │
│ } │
│ │
│ Minimum allocations (V8 optimized path): │
│ 1 Promise (outer return) │
│ 1 JSAsyncFunctionObject │
│ 2 PromiseReaction objects (for the 2 awaits) │
│ Total: ~4 heap allocations │
│ │
│ Non-native awaited value (e.g., thenable): │
│ + 1 wrapping Promise per non-native await │
│ + 1 extra microtask tick per wrapping │
│ │
│ Comparison to manual Promise chains: │
│ promise1.then(a => promise2.then(b => a + b)) │
│ 2 Promise reaction chains (similar allocation count) │
│ │
│ async/await is NOT more expensive than manual Promise chains. │
│ V8's optimizations make them roughly equivalent. │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
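The wrapping cost for non-native thenables shows up in scheduling order — a sketch, assuming modern V8 semantics:

```javascript
// A thenable (not a native Promise) must be wrapped, which costs extra
// microtask ticks before the async function resumes.
const thenable = { then(resolve) { resolve('thenable'); } };

(async () => {
  console.log(await thenable);
})();

Promise.resolve()
  .then(() => console.log('tick 1'))
  .then(() => console.log('tick 2'));
// 'tick 1' prints before 'thenable': the wrapper's PromiseResolveThenableJob
// must run first, and only then is the resume handler enqueued.
// (Awaiting a native Promise here would print before 'tick 1'.)
```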
TurboFan Optimization of async Functions
V8's TurboFan compiler can optimize async functions, but with limitations:
┌─────────────────────────────────────────────────────────────────────────────┐
│ TURBOFAN AND async FUNCTIONS │
├─────────────────────────────────────────────────────────────────────────────┤
│ │
│ What TurboFan CAN optimize: │
│ - Code between await points (synchronous sections) │
│ - Type specialization of local variables │
│ - Inlining of synchronous function calls │
│ - Dead code elimination within each state │
│ │
│ What TurboFan CANNOT optimize away: │
│ - The suspend/resume mechanism itself │
│ - Promise allocations (required by spec) │
│ - Microtask scheduling (required by spec) │
│ - The JSAsyncFunctionObject allocation │
│ │
│ Key insight: TurboFan treats each "segment" between awaits as │
│ a separate optimization unit. Local variable types can be │
│ specialized within a segment but must be re-checked after resume │
│ (because the function was suspended — anything could've happened). │
│ │
│ Optimization tip: Keep hot code in synchronous functions called │
│ from async functions, not spread across await points. │
│ │
│ // Prefer this: │
│ async function process(data) { │
│ const result = await fetch(url); │
│ return heavyComputation(result); // TurboFan optimizes this │
│ } │
│ │
│ // Over this: │
│ async function process(data) { │
│ const a = await step1(); │
│ // ... complex math between awaits ... │
│ const b = await step2(); │
│ // fragmented — each segment optimized independently │
│ } │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
async/await vs Generators: The Shared Machinery
V8 implements async functions using the same infrastructure as generators. The similarity is deep:
// Generator function
function* gen() {
const a = yield promise1;
const b = yield promise2;
return a + b;
}
// async function
async function fn() {
const a = await promise1;
const b = await promise2;
return a + b;
}
┌─────────────────────────────────────────────────────────────────────────────┐
│ GENERATORS vs async FUNCTIONS IN V8 │
├─────────────────────────────────────────────────────────────────────────────┤
│ │
│ Generator async function │
│ ────────────────────────────────────────────────────────── │
│ Object type: JSGeneratorObject JSAsyncFunctionObject │
│ Suspend bytecode: SuspendGenerator SuspendGenerator (same!) │
│ Resume bytecode: ResumeGenerator ResumeGenerator (same!) │
│ Yield/Await: yield value await value │
│ Auto-advance: NO (manual .next()) YES (microtask resumes) │
│ Return wrapper: { value, done } Promise │
│ Error handling: .throw(err) Promise rejection │
│ │
│ The core difference: async functions automatically chain │
│ Promise reactions to drive the resume, while generators │
│ require manual .next() calls. │
│ │
│ In V8's source code (src/builtins/builtins-async-function-gen.cc): │
│ The async function builtins create a JSAsyncFunctionObject │
│ (which inherits from JSGeneratorObject) and set up the │
│ Promise reaction chains that auto-resume on completion. │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
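The "auto-advance" row is the entire difference. A hand-written driver can turn a generator into an async function, mirroring what V8's builtins do internally — a sketch of the idea, not V8's actual code:

```javascript
// Drive a generator to completion, treating each yielded value like `await`.
function run(genFn, ...args) {
  return new Promise((resolve, reject) => {
    const gen = genFn(...args);
    function step(method, value) {
      let result;
      try {
        result = gen[method](value);   // resume at the last yield
      } catch (err) {
        return reject(err);            // uncaught throw → reject outer promise
      }
      if (result.done) return resolve(result.value);
      Promise.resolve(result.value).then(
        v => step('next', v),          // fulfilled → resume with the value
        e => step('throw', e)          // rejected → throw at the yield point
      );
    }
    step('next', undefined);
  });
}

// `yield` now behaves like `await`:
run(function* () {
  const a = yield Promise.resolve(1);
  const b = yield Promise.resolve(2);
  return a + b;
}).then(v => console.log(v)); // 3
```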
Async Iteration: for await...of
for await...of combines the async state machine with the iterator protocol:
async function processStream(stream) {
for await (const chunk of stream) {
await process(chunk);
}
}
This desugars to approximately:
async function processStream(stream) {
const iterator = stream[Symbol.asyncIterator]();
while (true) {
const { value: chunk, done } = await iterator.next();
if (done) break;
await process(chunk);
}
}
Each iteration involves:
- `await iterator.next()` — suspend, wait for the next value
- Check `done` — break if true
- `await process(chunk)` — suspend, wait for processing
- Loop back
The state machine has two suspension points per iteration — this doubles the microtask overhead compared to synchronous iteration.
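A minimal async iterable makes the two suspension points concrete (async generators run on the same suspend/resume machinery; `process` here is a hypothetical stand-in):

```javascript
// An async generator producing three chunks; each loop iteration suspends
// once in iterator.next() and once in process().
async function* chunks() {
  yield 'a';
  yield 'b';
  yield 'c';
}

const process = async (chunk) => chunk.toUpperCase(); // hypothetical processing

(async () => {
  for await (const chunk of chunks()) {
    console.log(await process(chunk)); // A, B, C
  }
})();
```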
Debugging: Stack Traces Across Awaits
A major challenge with async/await: stack traces at resume points don't include the original call stack:
async function main() {
await step1(); // Original call stack: main → step1
}
async function step1() {
await step2(); // Stack at resume: step1 → step2 (main is gone!)
}
async function step2() {
throw new Error('fail');
// Stack trace: Error at step2 (step2:2)
// at async step1 (step1:2)
// at async main (main:2)
// V8 reconstructs the chain using Promise linkage — but it's async!
}
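The reconstruction is visible in Node's stack strings (Node.js 12+ enables zero-cost async stack traces by default). Note the throw must occur after an `await` — otherwise the sync frames are still on the real stack:

```javascript
async function step2() { await Promise.resolve(); throw new Error('fail'); }
async function step1() { await step2(); }
async function main()  { await step1(); }

main().catch(e => console.log(e.stack));
// The stack lists "at async step1" and "at async main" frames even though
// the real call stack at throw time contained only the resumed step2.
```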
V8 implements async stack traces by walking the Promise chain:
┌─────────────────────────────────────────────────────────────────────────────┐
│ ASYNC STACK TRACE RECONSTRUCTION │
├─────────────────────────────────────────────────────────────────────────────┤
│ │
│ When Error is created in step2: │
│ Real call stack: [step2] (resumed via microtask) │
│ │
│ V8 async trace reconstruction: │
│ 1. step2's outer Promise is the value awaited by step1 │
│ 2. step1's outer Promise is the value awaited by main │
│ 3. Walk this chain → reconstruct: main → step1 → step2 │
│ │
│ Flag: --async-stack-traces (enabled by default since V8 7.3) │
│ Cost: V8 stores additional metadata on JSAsyncFunctionObjects │
│ to enable this chain walk. ~2% memory overhead. │
│ │
│ DevTools show "async" frames with gray background: │
│ step2 (sync frame) │
│ -- async -- │
│ step1 (async continuation) │
│ -- async -- │
│ main (async continuation) │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
Top-Level Await in ES Modules
Top-level await (available in ES modules) works the same way — the module itself becomes an async function:
// module.js
const data = await fetch('/config.json').then(r => r.json());
export default data;
┌─────────────────────────────────────────────────────────────────────────────┐
│ TOP-LEVEL AWAIT │
├─────────────────────────────────────────────────────────────────────────────┤
│ │
│ A module with top-level await: │
│ 1. Module evaluation is async (returns a Promise) │
│ 2. Importing modules WAIT for the Promise to resolve │
│ 3. Module graph evaluation respects dependencies: │
│ │
│ Module A (has top-level await) │
│ Module B (imports from A) — B waits for A to resolve │
│ Module C (imports from A and B) — C waits for both │
│ │
│ V8 wraps the module's top-level code in an async function body. │
│ The SuspendGenerator/ResumeGenerator mechanism is the same. │
│ The module's namespace object isn't populated until all awaits resolve. │
│ │
│ Danger: Top-level await blocks ALL importing modules. │
│ A slow fetch in module A freezes everything that depends on A. │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
V8 Source Files Reference
| Concept | File | Notes |
|---|---|---|
| JSAsyncFunctionObject | src/objects/js-generator.h | Inherits from JSGeneratorObject |
| Async function builtins | src/builtins/builtins-async-function-gen.cc | CreateAsyncFunctionObject, AsyncFunctionResolve/Reject |
| SuspendGenerator bytecode | src/interpreter/bytecodes.h | Shared with generators |
| ResumeGenerator bytecode | src/interpreter/bytecodes.h | Shared with generators |
| Await optimization | src/builtins/builtins-async-function-gen.cc | PromiseResolve fast path |
| Microtask queue | src/execution/microtask-queue.cc | Processes async resume handlers |
| Async stack traces | src/execution/isolate.cc | CaptureAsyncStackTrace |
| Bytecode generator | src/interpreter/bytecode-generator.cc | VisitAwait, VisitSuspend |
The Bottom Line
async/await is not syntax sugar over callbacks — it's a compiler-generated state machine. V8 transforms each async function into a JSAsyncFunctionObject (inheriting from generator infrastructure) that uses SuspendGenerator/ResumeGenerator bytecodes to freeze and thaw execution at each await point. All local variables are saved on the object's register array, and a continuation offset tracks where to resume. Each await creates a Promise reaction that schedules a microtask to resume the function when the awaited value settles. V8 optimizes away redundant Promise wrapping for native Promises (the "fast async" optimization) and reconstructs async stack traces by walking the Promise chain. The practical implication: code between await points is a synchronous unit that TurboFan can optimize independently, while the suspend/resume boundary has an irreducible cost in Promise allocations and microtask scheduling.