
Fix INP in TypeScript: Reduce Interaction Latency in Type-Safe Codebases

TypeScript is the default language for most production front-end teams, yet its compiler configuration is one of the least-examined sources of Interaction to Next Paint (INP) regressions. When the TypeScript compiler targets ES5, emits decorator metadata, duplicates helper functions across every file, and allows type-only imports to survive as runtime values, the resulting JavaScript is heavier, slower to parse, and harder for V8 to optimize than the original source code warrants. On a mid-tier Android device, the gap between a well-configured TypeScript project and a carelessly configured one can be 120-160ms of additional interaction latency — enough to push a borderline INP score from “needs improvement” into “poor” territory.

This guide covers six concrete changes to your TypeScript compiler configuration and coding patterns that directly reduce INP. Each step includes before-and-after code, the exact tsconfig option to change, and an explanation of why the runtime behavior improves. These fixes are framework-agnostic and apply equally to React, Vue, Svelte, and vanilla TypeScript projects. For framework-specific interaction patterns, see the companion guide on fixing INP in React.

Expected results

Applying all six steps to a typical mid-size TypeScript SPA (React 18, Webpack 5, ~600KB initial bundle) produces measurable INP improvements on field data:

Before: 280ms INP (Needs Improvement) — ES5 target, no importHelpers, decorator metadata enabled, class polymorphism in event handlers

After: 110ms INP (Good) — ES2022 target, tslib importHelpers, emitDecoratorMetadata disabled, discriminated unions, scheduler.yield

TL;DR — quick wins:
  • Set target: "ES2022" in tsconfig to eliminate async/await down-leveling
  • Add importHelpers: true and install tslib to remove per-file helper duplication
  • Disable emitDecoratorMetadata unless you are using reflect-metadata DI containers
  • Enable isolatedModules: true and verbatimModuleSyntax: true to prevent type-only imports leaking as runtime values
  • Replace class hierarchies in event handlers with discriminated unions
  • Declare scheduler.yield types and use it to break long interaction tasks

Common causes of TypeScript-related INP regressions

Before diving into fixes, it helps to understand exactly which TypeScript compilation choices generate runtime overhead that the browser must execute during user interactions. INP measures the time from a user gesture (click, keypress, pointer down) to the next visual frame, so anything that executes synchronously on the main thread during that window directly adds latency.

  • ES5 down-leveling of async/await: With a target below ES2017, the TypeScript compiler transforms every async function into a state machine built on the __awaiter helper (and, for ES5, __generator as well). A single 20-line async event handler can expand to 80+ lines of generated state-machine code. V8 cannot inline or optimize these as aggressively as native async/await, and the additional code volume increases parse time on every page load.
  • Decorator metadata injection: emitDecoratorMetadata: true injects Reflect.metadata("design:type", ...) and Reflect.metadata("design:paramtypes", ...) calls throughout your output. Every decorated class, method, and parameter receives these calls. In a large Angular-style DI codebase, this can add dozens of synchronous reflect-metadata invocations to class instantiation paths that fire during interactions.
  • Per-file helper duplication without tslib: Without importHelpers: true, the TypeScript compiler inlines the full helper function body (__spreadArray, __awaiter, __extends, etc.) into every file that uses the corresponding syntax. A 200-file codebase with 30 files using spread syntax duplicates __spreadArray 30 times. Each copy must be parsed and compiled by V8 on first execution.
  • Type-only imports surviving as runtime values: Without isolatedModules: true or verbatimModuleSyntax: true, a developer can write import { SomeType } from './heavy-module' and use it only in a type position. In a full-program compilation the compiler elides it, but bundlers using per-file transforms (esbuild, SWC, Vite) cannot tell — they emit the import and pull the entire module into the bundle, including all of its side effects.
  • Class hierarchy polymorphism in hot event paths: Classes that extend other classes through multiple levels create V8 hidden-class transitions on construction and megamorphic call sites on method dispatch. When the same polymorphic method is called from a tight event handler loop — for example, dispatching 40 action objects per keystroke in a rich text editor — V8 de-optimizes the call site and falls back to slower dictionary lookups.
  • Untyped or mistyped scheduler primitives: Many TypeScript teams are not using scheduler.yield(), requestIdleCallback, or requestAnimationFrame correctly in interaction handlers because the types for these APIs are either absent (scheduler) or misunderstood. Without proper async task-splitting, long synchronous work executes inside the interaction's blocking window and directly inflates INP.

Step-by-step fix

Set tsconfig target to ES2022 and enable isolatedModules

The single highest-impact change to your TypeScript configuration is moving target from ES5 or ES2015 to ES2022. This instructs the compiler to emit native async/await, optional chaining (?.), nullish coalescing (??), logical assignment operators (??=, ||=, &&=), class fields, and Array.at() without any down-leveling transformation. All evergreen browsers, and Node.js 18 and later, support ES2022 natively.

Pair this with isolatedModules: true, which forces each file to be compilable in isolation — a requirement for esbuild, SWC, and Vite's transpile-only mode. Enabling it surfaces type-only import leaks as compile-time errors rather than silent runtime bugs.

Also add importsNotUsedAsValues: "error" (TypeScript 4.x) or its replacement verbatimModuleSyntax: true (TypeScript 5.x). This makes the compiler reject any import of a type-only binding that is not annotated with the type keyword. Bundlers then see explicit import type statements and can safely elide them without pulling in the source module.

JSON — tsconfig.json (Before)
{
  "compilerOptions": {
    "target": "ES5",
    "module": "CommonJS",
    "strict": true,
    "experimentalDecorators": true,
    "emitDecoratorMetadata": true
  }
}
JSON — tsconfig.json (After)
{
  "compilerOptions": {
    "target": "ES2022",
    "module": "ESNext",
    "moduleResolution": "Bundler",
    "strict": true,
    "isolatedModules": true,
    "verbatimModuleSyntax": true,
    "experimentalDecorators": true,
    "emitDecoratorMetadata": false,
    "importHelpers": true,
    "useDefineForClassFields": true
  }
}
Browser support note: ES2022 is safe for all major browsers released since 2021. If you still support IE11 (rare in 2026), keep a separate tsconfig.legacy.json for a legacy build with ES5 target and serve it only via differential serving. Do not let IE11 support force your modern-browser users to pay the down-leveling tax.
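If you do maintain such a legacy build, it can extend the main config and override only the emit options. A minimal sketch (the tsconfig.legacy.json filename follows the note above and is a convention, not a TypeScript requirement):

```json
{
  "extends": "./tsconfig.json",
  "compilerOptions": {
    "target": "ES5",
    "module": "CommonJS",
    "importHelpers": true
  }
}
```

Because the legacy config inherits strict and importHelpers from the main file, both builds stay type-identical and only the emitted syntax differs.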

Install tslib and enable importHelpers

Even after switching to ES2022, TypeScript still emits helper functions for features that have no native equivalent at your target — most notably __decorate, __param, and __metadata for experimental decorators, plus module-interop helpers such as __importDefault. Lower-target builds add many more (__awaiter, __extends, __assign, __spreadArray, and friends). Without importHelpers: true, the compiler inlines the full body of each used helper into every file that needs it, so a codebase with 200 decorated files can end up with 200 copies of __decorate across its output chunks.

The fix is to install tslib as a production dependency and set importHelpers: true. The compiler then emits import { __decorate } from "tslib" (or whichever helpers a file needs) at the top of each file, and your bundler deduplicates those imports into a single shared copy. The tslib package is only a few kilobytes minified and gzipped — a one-time cost that replaces potentially hundreds of kilobytes of duplicated inline helpers.

Shell — Install tslib
npm install tslib
# or
pnpm add tslib
# or
yarn add tslib
TypeScript — Before (inline helpers, ES5 target)
// event-handler.ts
async function handleClick(event: MouseEvent): Promise<void> {
  const data = await fetchData();
  updateUI(data);
}

// TypeScript ES5 output inlines ~50 lines of __awaiter/__generator
// into every file that uses async/await
TypeScript — After (ES2022, importHelpers)
// event-handler.ts
// ES2022 output: async/await is emitted verbatim
// No __awaiter, no __generator. The engine runs native microtask scheduling.
async function handleClick(event: MouseEvent): Promise<void> {
  const data = await fetchData();
  updateUI(data);
}

// Bundled output is substantially smaller for async-heavy files, and V8
// can apply its optimized handling of native async/await.

To verify the helpers are being imported rather than inlined, run tsc --listEmittedFiles and inspect a compiled output file. You should see import { __decorate } from "tslib" rather than the full function body.

Remove decorator metadata and type-only import leaks

emitDecoratorMetadata: true is required by older Angular versions and some NestJS DI containers that use reflect-metadata to infer constructor parameter types at runtime. If your project does not actually use reflection-based dependency injection, this option silently injects metadata calls throughout your compiled output.

With emitDecoratorMetadata enabled, every decorated class constructor receives an additional call pattern at module evaluation time. In a large codebase this happens synchronously across many modules during the initial script parse and during any dynamic import that loads a module with decorated classes. This synchronous work runs before the browser can commit the frame that follows the user interaction, directly contributing to INP.

TypeScript — Decorator metadata output (what gets emitted)
// Source
@Injectable()
class UserService {
  constructor(private http: HttpClient) {}
}

// Emitted JS with emitDecoratorMetadata: true
// The Reflect.metadata calls execute synchronously at module load
UserService = __decorate([
  Injectable(),
  __metadata("design:paramtypes", [HttpClient])
], UserService);

// With emitDecoratorMetadata: false, only the decorator call runs
UserService = __decorate([Injectable()], UserService);

The companion issue is type-only import leaks. With verbatimModuleSyntax: true, any import that is used only as a type must be written with the type keyword. This protects you from a bundler silently pulling in a large module because a single type import was not annotated correctly.

TypeScript — Type-only import fix
// Before: bundler may keep this import alive at runtime
import { HeavyDataProcessor } from './processors/heavy';

function processEvent(event: MouseEvent): void {
  const processor: HeavyDataProcessor = getProcessor(); // type-only usage
  processor.run(event.target);
}

// After: explicit type import — bundler elides it with zero runtime cost
import type { HeavyDataProcessor } from './processors/heavy';

function processEvent(event: MouseEvent): void {
  const processor: HeavyDataProcessor = getProcessor();
  processor.run(event.target);
}

Replace class hierarchies with discriminated unions in hot paths

Class inheritance is a natural modeling tool, but in interaction-critical code paths it creates hidden costs that are invisible in TypeScript source but very visible to the V8 optimizer. When an event handler receives objects of different subclass types and dispatches to polymorphic methods, V8 must maintain inline cache entries for each observed type. Once a call site sees more than a handful of distinct hidden classes, V8 marks it megamorphic and falls back to a slower generic lookup path.

Discriminated unions with a switch on a literal discriminant field avoid this entirely. The switch is a monomorphic operation — V8 compiles it to a jump table or a series of simple integer comparisons that it can fully optimize. This pattern is also how Redux reducers, XState event handlers, and most modern TypeScript design systems model their action types, precisely because of these performance characteristics.

TypeScript — Before (class hierarchy, polymorphic dispatch)
// Class hierarchy — V8 sees megamorphic call sites in dispatch()
abstract class UIAction {
  abstract execute(state: AppState): AppState;
}

class SelectItemAction extends UIAction {
  constructor(public readonly id: string) { super(); }
  execute(state: AppState): AppState {
    return { ...state, selectedId: this.id };
  }
}

class FilterAction extends UIAction {
  constructor(public readonly query: string) { super(); }
  execute(state: AppState): AppState {
    return { ...state, filter: this.query };
  }
}

class SortAction extends UIAction {
  constructor(public readonly field: string, public readonly dir: 'asc' | 'desc') { super(); }
  execute(state: AppState): AppState {
    return { ...state, sortField: this.field, sortDir: this.dir };
  }
}

// Handler fires on every keystroke — polymorphic, hard for V8 to optimize
function handleAction(action: UIAction, state: AppState): AppState {
  return action.execute(state); // megamorphic call site
}
TypeScript — After (discriminated union, monomorphic switch)
// Discriminated union — V8 optimizes the switch to a jump table
type UIAction =
  | { type: 'SELECT_ITEM'; id: string }
  | { type: 'FILTER'; query: string }
  | { type: 'SORT'; field: string; dir: 'asc' | 'desc' };

// Fully type-safe: TypeScript narrows the union in each case branch.
// Runtime: V8 compiles this to a monomorphic branch chain or jump table.
function handleAction(action: UIAction, state: AppState): AppState {
  switch (action.type) {
    case 'SELECT_ITEM':
      return { ...state, selectedId: action.id };
    case 'FILTER':
      return { ...state, filter: action.query };
    case 'SORT':
      return { ...state, sortField: action.field, sortDir: action.dir };
  }
}

// TypeScript enforces exhaustiveness. Add a never-guard if needed:
// default: { const _: never = action; return state; }

The same principle applies to virtual-scroll row renderers. If your virtualized list renders items using a class hierarchy (a BaseRow extended by TextRow, ImageRow, and VideoRow), refactor to a discriminated union of plain objects and a single render function with a switch. For type-correct virtual scrolling in TypeScript with libraries like @tanstack/react-virtual, type the row data union explicitly and narrow it in the item render callback. For more on this, see the guide on JavaScript performance optimization.

TypeScript — Virtual scrolling with typed discriminated rows
import * as React from 'react';
import { useVirtualizer } from '@tanstack/react-virtual';

type ListRow =
  | { kind: 'text'; content: string; id: string }
  | { kind: 'image'; src: string; alt: string; id: string }
  | { kind: 'separator'; label: string; id: string };

function VirtualList({ rows }: { rows: ListRow[] }) {
  const parentRef = React.useRef<HTMLDivElement>(null);

  const virtualizer = useVirtualizer({
    count: rows.length,
    getScrollElement: () => parentRef.current,
    estimateSize: (index) => {
      // Discriminated union lets us size rows without class instanceof checks
      const row = rows[index];
      switch (row.kind) {
        case 'text': return 48;
        case 'image': return 200;
        case 'separator': return 32;
      }
    },
    overscan: 5,
  });

  return (
    <div ref={parentRef} style={{ height: '600px', overflow: 'auto' }}>
      <div style={{ height: virtualizer.getTotalSize(), position: 'relative' }}>
        {virtualizer.getVirtualItems().map((virtualItem) => {
          const row = rows[virtualItem.index];
          return (
            <div
              key={row.id}
              style={{
                position: 'absolute',
                top: 0,
                left: 0,
                width: '100%',
                transform: `translateY(${virtualItem.start}px)`,
              }}
            >
              {renderRow(row)}
            </div>
          );
        })}
      </div>
    </div>
  );
}

function renderRow(row: ListRow): React.ReactNode {
  switch (row.kind) {
    case 'text': return <TextRow content={row.content} />;
    case 'image': return <ImageRow src={row.src} alt={row.alt} />;
    case 'separator': return <SeparatorRow label={row.label} />;
  }
}

Type and use scheduler.yield to break up long tasks

The Scheduler API's scheduler.yield() method is the most direct tool for reducing INP in interaction handlers. It yields control back to the browser so it can process pending input events and commit paint frames, then resumes your async function in a new task. Unlike setTimeout(fn, 0) — which goes to the back of the task queue and is clamped to a minimum of 4ms once timers nest — the continuation after scheduler.yield() is prioritized, so the browser resumes your work ahead of other queued tasks.

As of TypeScript 5.x, the Scheduler interface with yield() is not included in the bundled lib.dom.d.ts. You need to declare it yourself in a project-wide declaration file. The type declaration is straightforward and should include the signal option for cancellation.

TypeScript — scheduler.d.ts (global declaration)
// src/types/scheduler.d.ts
// Extends the global Scheduler interface with the yield() method.
// Remove this file once TypeScript ships official lib.dom types for it.

export {};

declare global {
  interface SchedulerYieldOptions {
    signal?: AbortSignal | 'inherit';
  }

  interface Scheduler {
    yield(options?: SchedulerYieldOptions): Promise<void>;
  }

  // Depending on your TypeScript version, lib.dom.d.ts may not declare the
  // scheduler global at all. Declare it here; remove this line if it
  // conflicts with your lib's built-in declaration.
  var scheduler: Scheduler;
}
TypeScript — Using scheduler.yield in an interaction handler
// Process a large filtered dataset on user input without blocking the frame.
// The yieldToMain() helper provides a fallback for browsers without scheduler.yield.

async function yieldToMain(): Promise<void> {
  if ('scheduler' in globalThis && typeof scheduler.yield === 'function') {
    return scheduler.yield();
  }
  return new Promise<void>((resolve) => setTimeout(resolve, 0));
}

async function handleFilterInput(query: string, items: Item[]): Promise<void> {
  const CHUNK_SIZE = 200;
  const results: Item[] = [];

  for (let i = 0; i < items.length; i += CHUNK_SIZE) {
    const chunk = items.slice(i, i + CHUNK_SIZE);

    for (const item of chunk) {
      if (matchesQuery(item, query)) {
        results.push(item);
      }
    }

    // Yield after each chunk. The browser can commit a frame and process
    // any subsequent keystrokes before we continue filtering.
    if (i + CHUNK_SIZE < items.length) {
      await yieldToMain();
    }
  }

  renderFilteredResults(results);
}

// Wire to input event with debounce to avoid redundant work
const debouncedFilter = debounce((e: Event) => {
  const query = (e.target as HTMLInputElement).value;
  handleFilterInput(query, allItems);
}, 50);
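The debounce helper used above is assumed rather than imported; a minimal typed sketch (not from any particular library):

```typescript
// Minimal debounce sketch — assumed by the example above, not a library API.
// Collapses a burst of calls into a single trailing invocation.
function debounce<Args extends unknown[]>(
  fn: (...args: Args) => void,
  waitMs: number,
): (...args: Args) => void {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (...args: Args): void => {
    if (timer !== undefined) clearTimeout(timer);
    timer = setTimeout(() => fn(...args), waitMs);
  };
}
```

A production implementation might add a cancel() method for component-unmount cleanup, but the trailing-edge behavior above is all the filter example requires.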

This pattern is also applicable to bundle-splitting scenarios. For more on reducing the JavaScript payload that causes long tasks in the first place, see the guide on reducing bundle size for better INP and the companion fix for Webpack-specific INP issues.

Avoid heavy generics and conditional types in hot-path modules

TypeScript's generic system and conditional types are erased entirely at compile time and have zero direct runtime cost. However, they have two indirect performance effects that matter for INP.

First, complex generic instantiations — deeply nested mapped types, recursive conditional types, large tuple manipulations — increase TypeScript's own compile time significantly. In a large monorepo with incremental builds, slow type-checking creates developer-cycle pressure to skip type-checking in the hot-module-reload path. Teams that do this with transpileOnly: true in ts-jest or --transpileOnly in ts-node often miss type errors that manifest as runtime exceptions during interaction handlers.
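A sketch of the kind of bug a transpile-only pipeline lets through (the property typo is deliberate; the names are illustrative):

```typescript
// tsc --noEmit rejects this file; esbuild/SWC emit it unchanged, and the
// typo surfaces as a TypeError inside an interaction handler at runtime.
interface Item {
  label: string;
}

function render(item: Item): string {
  // @ts-expect-error -- 'lable' is a typo that only the type-checker catches
  return item.lable.toUpperCase();
}
```

Running a separate full type-check (tsc --noEmit) in CI restores this safety net without slowing down the transpile-only dev loop.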

Second, some patterns that feel purely type-level actually influence what code gets emitted. const enum is the canonical example: it looks like a type-level abstraction but inlines numeric literals at every use site, and it is incompatible with isolatedModules because per-file transpilers such as esbuild cannot inline const enum members imported from other files. If your project relies on const enums for event codes or key codes, migrate to a regular enum, to an as const object, or to string literal union types, which have no runtime footprint at all.

TypeScript — const enum vs. as const object
// Before: const enum breaks isolatedModules across files
// Do NOT use this when your bundler uses per-file transpilation
const enum KeyCode {
  Enter = 13,
  Escape = 27,
  Space = 32,
}

// After option A: regular enum (safe with isolatedModules, small runtime object)
enum KeyCode {
  Enter = 13,
  Escape = 27,
  Space = 32,
}

// After option B: as const object (zero runtime overhead, tree-shakeable)
const KeyCode = {
  Enter: 13,
  Escape: 27,
  Space: 32,
} as const;

type KeyCode = typeof KeyCode[keyof typeof KeyCode];

// After option C: string literal union (preferred for interaction events)
type InteractionKey = 'Enter' | 'Escape' | ' ';

function handleKeyDown(event: KeyboardEvent): void {
  const key = event.key as InteractionKey;
  switch (key) {
    case 'Enter': confirmSelection(); break;
    case 'Escape': cancelSelection(); break;
    case ' ': toggleSelection(); break;
  }
}

For interaction-critical modules, also audit your use of Object.keys(), Object.entries(), and especially for...in in tight loops: for...in walks the prototype chain and visits every enumerable property, including any that earlier build configurations or polyfills attached to your objects. After disabling decorator metadata, it is worth a quick console.log check that objects in hot paths carry only the keys you expect.
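The quick console.log check mentioned above can look like this (the object shape is illustrative):

```typescript
// Spot check: objects flowing through hot loops should expose only their
// declared keys — no leftover build-tool or polyfill properties.
const action = { type: 'FILTER', query: 'abc' };

const keys = Object.keys(action);
console.log(keys); // expected: ['type', 'query']
```

If anything beyond the declared fields shows up, trace it back to the build step or polyfill that attached it before trusting iteration-based code in that path.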

For a comprehensive look at how INP interacts with the full JavaScript execution pipeline, the INP guide covers measurement, thresholds, and browser internals in depth.

Verification

After applying these changes, verify improvement at each level: compiler output, bundle size, and field INP data.

Compiler output verification: Run tsc --noEmit first to confirm zero type errors after the configuration changes. Then do a production build and inspect the output with source-map-explorer or your bundler's bundle analyzer. Look for: absence of __awaiter and __generator inline bodies (they should only appear as imports from tslib if at all), absence of Reflect.metadata calls if you disabled emitDecoratorMetadata, and a reduction in total bundle size of roughly 8-15% compared to the ES5 baseline.

Shell — Bundle analysis commands
# Check for inline helper functions in output
grep -r "__awaiter\|__generator\|__spreadArray" ./dist --include="*.js" | wc -l
# Should return 0 after enabling importHelpers with tslib

# Check for Reflect.metadata calls
grep -r "Reflect.metadata" ./dist --include="*.js" | wc -l
# Should return 0 after disabling emitDecoratorMetadata

# Measure bundle size delta
du -sh ./dist/*.js
npx source-map-explorer dist/main.*.js

Runtime performance verification: Use Chrome DevTools' Performance panel to record an interaction trace. In the trace, look at the Bottom-Up view filtered to the interaction's blocking period. After these changes, you should see the removal of __awaiter call frames, shorter script evaluation times during dynamic imports, and the appearance of scheduler.yield task boundaries that chop previously long tasks into sub-50ms chunks.

Field data verification: Deploy to a staging environment and use the web-vitals npm package to collect real INP measurements from actual users. The onINP callback gives you the full attribution chain including which interaction triggered the worst INP and which long-task attribution group caused the delay. Use the performance budget tool to set a target INP threshold and monitor regressions in CI.

TypeScript — INP measurement with web-vitals
import { onINP, type INPMetricWithAttribution } from 'web-vitals/attribution';

onINP((metric: INPMetricWithAttribution) => {
  const { value, attribution } = metric;
  const { interactionTarget, interactionType, inputDelay, processingDuration, presentationDelay } = attribution;

  // Log breakdown to understand where latency is coming from
  console.table({
    'INP (ms)': value,
    'Interaction target': interactionTarget,
    'Interaction type': interactionType,
    'Input delay (ms)': inputDelay,
    'Processing duration (ms)': processingDuration,
    'Presentation delay (ms)': presentationDelay,
  });

  // Send to your analytics endpoint
  navigator.sendBeacon('/analytics/inp', JSON.stringify({
    rating: metric.rating,       // 'good' | 'needs-improvement' | 'poor'
    value,
    interactionTarget,
    processingDuration,
  }));
}, { reportAllChanges: false });

Common pitfalls

  • Setting target without checking lib: When you change target to ES2022, also verify that your lib array includes "ES2022" or "ESNext". If lib is not set, TypeScript infers it from target automatically, but if you have an explicit lib array that still lists only "ES5" and "DOM", you will get type errors for native ES2022 APIs like Array.at() and Object.hasOwn().
  • Forgetting to install tslib as a production dependency: importHelpers emits import { __awaiter } from "tslib" into your compiled output. If tslib is only in devDependencies, your production bundle will fail to resolve it at runtime in SSR environments. Always add it to dependencies.
  • Enabling verbatimModuleSyntax without auditing re-exports: Many shared utility modules use barrel re-exports (export { Foo } from './foo') that mix type and value exports. With verbatimModuleSyntax, all type re-exports must use export type. Run a full tsc --noEmit build immediately after enabling it and fix all resulting errors before shipping.
  • Using scheduler.yield without an AbortSignal in navigations: If a user triggers a navigation while your interaction handler is mid-yield, the yielded async function may resume on a page that has already been replaced. Always pair scheduler.yield with an AbortSignal tied to the component or navigation lifecycle when working in SPAs. See the INP in React fix for patterns using useEffect cleanup and AbortController.
  • Benchmarking only in Chrome on a developer machine: TypeScript-related INP improvements are most pronounced on mid-tier Android devices where V8's JIT tier-up is slower and parse time is a larger fraction of interaction latency. Always verify improvements using Chrome's CPU throttling (6x slowdown) or on a real device via Chrome Remote Debugging. Results on a MacBook Pro may understate the real-world improvement by 3-5x.
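The AbortSignal pairing described in the scheduler.yield pitfall can be sketched as follows (helper names are illustrative, not from any library):

```typescript
// Abortable chunked work. If the signal fires mid-yield (navigation,
// component unmount), processing stops instead of resuming against a
// torn-down view. Falls back to setTimeout where scheduler.yield is absent.
async function yieldToMain(): Promise<void> {
  const s = (globalThis as { scheduler?: { yield?: () => Promise<void> } }).scheduler;
  if (s && typeof s.yield === 'function') return s.yield();
  return new Promise((resolve) => setTimeout(resolve, 0));
}

async function processWithAbort(items: number[], signal: AbortSignal): Promise<number[]> {
  const out: number[] = [];
  for (let i = 0; i < items.length; i += 100) {
    if (signal.aborted) return out; // bail out before touching stale state
    for (const n of items.slice(i, i + 100)) out.push(n * 2);
    await yieldToMain();
  }
  return out;
}
```

In a React SPA, the AbortController would typically live in a useEffect and call controller.abort() in its cleanup function, so every in-flight chunked task dies with the component.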

Frequently asked questions

Does TypeScript itself make INP worse?

TypeScript types are erased at compile time and have no direct runtime cost. The INP impact comes from what the TypeScript compiler emits: down-leveling transforms that convert async/await to generator state machines, decorator metadata injection via Reflect.metadata, and per-file helper function duplication. Configuring tsconfig correctly eliminates all of these. The TypeScript language itself is not the problem — the default compiler configuration choices are.

How much does moving the compile target from ES5 to ES2022 improve INP?

Moving from ES5 to ES2022 in a typical React TypeScript SPA reduces total JavaScript parse and compile time by 15-35% because native async/await and optional chaining are far more compact than their polyfilled equivalents. Combined with importHelpers via tslib, teams commonly see INP drop from 280ms to under 140ms on mid-tier Android devices. The improvement is most pronounced in codebases that make heavy use of async/await in event handlers and data-fetching logic.

Do I still need isolatedModules if I use Vite, esbuild, or SWC?

Yes. Vite, esbuild, and SWC transpile TypeScript files in isolation without full program type information. Without isolatedModules: true, you risk emitting code that references erased const enum values or type-only imports as runtime values, which produces silent runtime errors in production that you will never see in development. Enabling isolatedModules surfaces these issues as compile-time errors rather than silent production bugs.

How do I type scheduler.yield() in TypeScript?

The Scheduler API's yield() method is not yet in TypeScript's bundled lib.dom.d.ts. Declare it with a global interface augmentation: create a scheduler.d.ts file containing declare global { interface Scheduler { yield(options?: { signal?: AbortSignal | 'inherit' }): Promise<void>; } }. Then call await scheduler.yield() inside async interaction handlers after feature-detecting it with if ('scheduler' in globalThis && typeof scheduler.yield === 'function'). Provide a setTimeout(resolve, 0) fallback for older browsers.

Are discriminated unions really faster than class hierarchies at runtime?

In hot interaction paths, yes. V8 can inline and monomorphize switch statements over string or number literal discriminants far more aggressively than it can optimize polymorphic method calls. The difference is most pronounced when the same code path processes many distinct action types per frame, which is common in reducer patterns and event bus handlers. For a 10-type action union processed on every keystroke, switching from class dispatch to discriminated union switch can reduce the handler's CPU time by 40-60% on mid-tier devices.
