Fix INP in Vue 3

Interaction to Next Paint (INP) measures how long the page takes to visually respond to user input: the time from an interaction (click, tap, or key press) until the next frame is painted in response. The reported INP is roughly the worst such latency observed over the page's lifetime. In Vue 3 apps, poor INP is most often caused by deep reactive objects triggering broad component re-renders, v-for lists that are fully re-diffed when unrelated state changes, synchronous event handlers doing heavy work before yielding, and long tasks blocking the main thread. This guide covers the highest-impact Vue 3-specific fixes with working code examples.
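The raw data behind INP comes from the browser's Event Timing API, so you can log slow interactions yourself during development. A minimal sketch -- the `slowInteractions` helper is illustrative, and the 200 ms cutoff matches the upper bound of a "good" INP score:

```javascript
// Pure helper: keep only real interactions (interactionId > 0) that
// exceeded the threshold. Hover/scroll entries have interactionId 0.
function slowInteractions(entries, threshold = 200) {
  return entries.filter((e) => e.interactionId && e.duration >= threshold);
}

// Browser-only wiring: observe event-timing entries as they arrive.
if (typeof window !== 'undefined' && 'PerformanceObserver' in window) {
  new PerformanceObserver((list) => {
    for (const e of slowInteractions(list.getEntries())) {
      console.warn(`Slow ${e.name}: ${Math.round(e.duration)} ms`);
    }
  }).observe({ type: 'event', buffered: true, durationThreshold: 40 });
}
```

Run this in development builds and watch the console while exercising the UI; each warning names the event type behind a slow interaction.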

Expected results

  • Before: INP 400 ms (Needs improvement) -- deep reactive updates, no v-memo, blocking event handlers
  • After: INP 135 ms (Good) -- shallowRef, v-memo, debounced handlers, Web Workers

Step-by-step fix

Use shallowRef and shallowReactive for large data structures

Vue's ref() and reactive() recursively wrap every nested property in a Proxy. For large arrays or deeply-nested objects, this means every property access or mutation during an interaction carries reactivity overhead. Switch to shallowRef() and shallowReactive() so Vue only tracks top-level changes. When you need to trigger a re-render after mutating nested data in a shallowRef, call triggerRef() explicitly.

Vue 3 -- shallowRef vs ref
<script setup>
import { ref, shallowRef, triggerRef, shallowReactive } from 'vue';

// Bad: ref() deep-wraps all 10,000 row objects in Proxies
// const rows = ref(await fetchTableData());

// Good: shallowRef() only tracks the array reference itself
const rows = shallowRef(await fetchTableData()); // 10,000 rows

// Mutating nested data in a shallowRef: use triggerRef()
function updateRowStatus(id, status) {
  const row = rows.value.find(r => r.id === id);
  if (row) {
    row.status = status; // direct mutation, no proxy overhead
    triggerRef(rows);    // tell Vue the ref changed
  }
}

// shallowReactive: tracks only top-level keys of an object
const state = shallowReactive({
  data: [],      // replacing state.data triggers reactivity
  loading: false,
  error: null,
});

// Replacing the array triggers reactivity normally
async function loadData() {
  state.loading = true;
  state.data = await fetchTableData(); // assignment tracked
  state.loading = false;
}
</script>

Apply v-memo to expensive list items

By default, any reactive state change that affects a parent component causes Vue to diff and patch every child in a v-for list. The v-memo directive accepts a dependency array and tells Vue to skip the vnode diff for a list item entirely when all dependency values are unchanged. Add it to expensive list items with the minimum set of values that can affect their rendered output -- typically the item's ID plus any selection or highlight state.

Vue template -- v-memo on list items
<template>
  <!-- Bad: entire list re-renders when selectedId changes -->
  <ul>
    <li v-for="item in items" :key="item.id">
      <ProductRow :item="item" :selected="item.id === selectedId" />
    </li>
  </ul>

  <!-- Good: only the row whose selection state changed is re-diffed -->
  <ul>
    <li
      v-for="item in items"
      :key="item.id"
      v-memo="[item.id, item.id === selectedId, item.updatedAt]"
    >
      <ProductRow :item="item" :selected="item.id === selectedId" />
    </li>
  </ul>

  <!-- v-memo also works on standalone expensive sub-trees -->
  <div v-memo="[chartData, theme]">
    <HeavyChartComponent :data="chartData" :theme="theme" />
  </div>
</template>

<script setup>
import { ref, shallowRef } from 'vue';
const items = shallowRef([]);
const selectedId = ref(null);
</script>

Debounce expensive event handlers

Event handlers that fire on every keystroke or mouse move can queue many expensive reactive updates in rapid succession, each blocking the main thread and increasing INP. Wrap any handler that triggers non-trivial work in a debounce function so the work runs only after the user pauses. For search inputs, 200--300 ms is a common debounce interval that feels responsive while eliminating redundant work.

Vue 3 -- debounced search handler
<script setup>
import { ref, shallowRef } from 'vue';

// Minimal debounce utility (or use lodash/useDebounceFn from VueUse)
function debounce(fn, delay) {
  let timer;
  return (...args) => {
    clearTimeout(timer);
    timer = setTimeout(() => fn(...args), delay);
  };
}

const query = ref('');
const results = shallowRef([]);
const isLoading = ref(false);

async function runSearch(value) {
  if (!value.trim()) { results.value = []; return; }
  isLoading.value = true;
  results.value = await searchAPI(value); // API call or heavy filter
  isLoading.value = false;
}

// Bad: the search (and everything it triggers) runs on every keystroke
function onInputBad(e) {
  query.value = e.target.value;
  runSearch(query.value);
}

// Good: update the input state immediately, debounce only the expensive work.
// Debouncing query.value itself would make the controlled input lag behind
// the user's typing.
const debouncedSearch = debounce(runSearch, 250);
function onInput(e) {
  query.value = e.target.value;   // keep the controlled input in sync
  debouncedSearch(query.value);   // search 250 ms after the last keystroke
}
</script>

<template>
  <input type="search" :value="query" @input="onInput" placeholder="Search..." />
  <ul>
    <li v-for="r in results" :key="r.id">{{ r.title }}</li>
  </ul>
</template>

Offload heavy computation to a Web Worker

Any computation that consistently takes more than 50 ms -- sorting large datasets, parsing CSV/JSON, running algorithms -- belongs in a Web Worker. Workers run on a separate thread, so the computation itself cannot block the main thread or harm INP (though structured-cloning very large postMessage payloads still costs main-thread time, so keep messages lean). Create a useWorker() composable that wraps the postMessage / onmessage communication in a Promise-based API, making it easy to call from Vue components without callback spaghetti.

worker.js + useWorker composable
// src/workers/data.worker.js
// Runs on a separate thread -- no Vue reactivity, no DOM access
self.onmessage = function (e) {
  const { type, payload } = e.data;

  if (type === 'SORT') {
    const sorted = [...payload.rows].sort((a, b) => {
      return payload.direction === 'asc'
        ? a[payload.key] > b[payload.key] ? 1 : -1
        : a[payload.key] < b[payload.key] ? 1 : -1;
    });
    self.postMessage({ type: 'SORT_DONE', result: sorted });
  }

  if (type === 'FILTER') {
    const filtered = payload.rows.filter(row =>
      Object.values(row).some(v =>
        String(v).toLowerCase().includes(payload.query.toLowerCase())
      )
    );
    self.postMessage({ type: 'FILTER_DONE', result: filtered });
  }
};

// src/composables/useWorker.js
import { onUnmounted } from 'vue';

// Vite can only bundle the worker when `new URL('…', import.meta.url)`
// appears literally in source, so build the URL at the call site and pass it in.
export function useWorker(workerUrl) {
  const worker = new Worker(workerUrl, { type: 'module' });
  const pending = new Map(); // response type -> resolve callback

  worker.onmessage = (e) => {
    const resolve = pending.get(e.data.type);
    if (resolve) {
      resolve(e.data.result);
      pending.delete(e.data.type);
    }
  };

  function send(type, payload) {
    return new Promise((resolve) => {
      // the worker replies with `${type}_DONE`, e.g. SORT -> SORT_DONE
      pending.set(type + '_DONE', resolve);
      worker.postMessage({ type, payload });
    });
  }

  onUnmounted(() => worker.terminate());
  return { send };
}

// Usage in a component
// <script setup>
// const { send } = useWorker(
//   new URL('../workers/data.worker.js', import.meta.url)
// );
// const sorted = await send('SORT', { rows, key: 'name', direction: 'asc' });
// </script>

Split long tasks and yield to the browser between chunks

When you must process a large dataset on the main thread -- for example, building a reactive data structure from a large import -- do it in chunks and yield between them. Note that await nextTick() alone is not enough: it resolves as a microtask after Vue flushes its DOM updates, before the browser has a chance to paint or process input. Follow each chunk with a macrotask yield -- scheduler.yield() where supported, or a zero-delay setTimeout -- so the browser can handle pending input events and paint frames between chunks. This prevents a single long task from inflating INP even when the total work cannot be moved to a Worker.

Vue 3 -- chunked processing with yielding
<script setup>
import { ref, shallowRef, nextTick } from 'vue';

const processedRows = shallowRef([]);
const progress = ref(0);

// Macrotask yield: gives the browser a chance to paint and handle input.
// scheduler.yield() is not yet available everywhere; fall back to setTimeout.
function yieldToBrowser() {
  if (typeof scheduler !== 'undefined' && scheduler.yield) {
    return scheduler.yield();
  }
  return new Promise((resolve) => setTimeout(resolve, 0));
}

async function processLargeDataset(rawRows) {
  const CHUNK_SIZE = 200;
  const result = [];

  for (let i = 0; i < rawRows.length; i += CHUNK_SIZE) {
    const chunk = rawRows.slice(i, i + CHUNK_SIZE);

    // Process this chunk synchronously
    for (const row of chunk) {
      result.push(transformRow(row)); // your transform logic
    }

    // Update progress and show partial results
    progress.value = Math.round((i / rawRows.length) * 100);
    processedRows.value = [...result];

    // Flush Vue's DOM updates, then yield so the browser can
    // handle clicks and paint a frame before the next chunk
    await nextTick();
    await yieldToBrowser();
  }

  progress.value = 100;
}

function transformRow(row) {
  // Example: normalize, compute derived fields, etc.
  return {
    ...row,
    fullName: `${row.firstName} ${row.lastName}`,
    totalValue: row.quantity * row.unitPrice,
  };
}
</script>

<template>
  <p v-if="progress < 100">Processing... {{ progress }}%</p>
  <DataTable v-else :rows="processedRows" />
</template>

Quick checklist

  • Large arrays and objects use shallowRef() or shallowReactive() instead of ref()
  • Every expensive v-for list item has a v-memo directive with a minimal dependency array
  • Event handlers that trigger expensive work are wrapped in a debounce function
  • Computation taking over 50 ms is moved to a Web Worker via useWorker()
  • Large synchronous loops on the main thread are split into chunks, yielding to the browser (scheduler.yield() or a zero-delay setTimeout) between chunks
  • No long tasks (over 50 ms) appear in the Chrome Performance panel during interactions

Frequently asked questions

What causes poor INP in Vue 3 apps?

The most common causes are deep reactive objects triggering unnecessary component re-renders on every interaction, synchronous event handlers performing expensive DOM mutations or data processing before yielding, large v-for lists re-rendering entirely when unrelated state changes, and long tasks blocking the main thread during user input. Use the Chrome DevTools Performance panel to identify which tasks run during interactions and how long they take.

Can switching from ref() to shallowRef() really improve INP?

Yes. Vue's ref() and reactive() track every nested property with Proxy traps. For large objects such as tables with hundreds of rows and many columns, every read and write traverses the proxy chain. Switching to shallowRef() eliminates this overhead and can cut interaction time significantly for data-heavy components. You can measure the difference with Vue DevTools before and after the change.

How does v-memo decide when to skip re-rendering an item?

v-memo accepts a dependency array. When Vue processes a v-for re-render, it checks whether all values in the array are strictly equal to those from the previous render for that item. If they are all equal, Vue skips the vnode diff entirely and reuses the cached DOM subtree. Only items with at least one changed dependency are diffed and patched, making large list re-renders proportional to the number of changed items rather than the total list size.
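The per-item check can be sketched as a plain function (a simplified illustration of the comparison, not Vue's exact internals):

```javascript
// Simplified sketch of the strict-equality check v-memo performs per item.
// If every dependency is unchanged, the cached vnode subtree is reused.
function memoDepsUnchanged(prevDeps, nextDeps) {
  if (prevDeps.length !== nextDeps.length) return false;
  return prevDeps.every((value, i) => Object.is(value, nextDeps[i]));
}
```

So for v-memo="[item.id, item.id === selectedId]", changing selectedId flips the second dependency only for the previously-selected and newly-selected rows; every other row's dependencies are unchanged and its diff is skipped.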

Does KeepAlive help with INP?

KeepAlive caches the component tree so remounting is skipped on re-activation, which improves the first interaction cost after navigating back to a cached route. It does not reduce interaction latency within an already-mounted component. Use KeepAlive for route-level caching and the other techniques in this guide (shallowRef, v-memo, debounce, Workers) to fix INP within active views.
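With Vue Router 4, route-level caching looks like this (a sketch; the :max limit is optional and illustrative):

```html
<!-- App.vue: cache route components so navigating back skips a full remount -->
<RouterView v-slot="{ Component }">
  <KeepAlive :max="10">
    <component :is="Component" />
  </KeepAlive>
</RouterView>
```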

How do I find which component or event handler is slow?

Open Vue DevTools and record in its Timeline while performing the slow interaction; it shows per-component render times. For lower-level event timing, use the Chrome DevTools Performance panel -- press Record, interact, then Stop and look for long tasks (red triangles) triggered by your event handler. The INP breakdown in the Performance panel's interactions track shows input delay, processing duration, and presentation delay separately, which tells you exactly where time is being spent.

How do I test INP locally before deploying?

Use the Chrome DevTools Performance panel with CPU throttling (4x slowdown) to simulate mid-range mobile devices. Interact with the page (click buttons, type in inputs, open menus) and look for long tasks in the flame chart. The Web Vitals Chrome Extension shows real-time INP scores as you interact. For Vue, pay attention to hydration-related interaction delays.
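When interpreting the numbers you measure, the official thresholds are 200 ms and 500 ms at the 75th percentile. A tiny helper (the function name is illustrative) for classifying a measured value:

```javascript
// Official INP thresholds: good <= 200 ms, poor > 500 ms,
// "needs improvement" in between.
function inpRating(ms) {
  if (ms <= 200) return 'good';
  if (ms <= 500) return 'needs-improvement';
  return 'poor';
}
```

By these thresholds, the 135 ms target in this guide rates "good", while the 400 ms starting point rates "needs improvement".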
