How to Measure Core Web Vitals: A Complete Guide
Core Web Vitals are the three metrics Google uses to evaluate your site's real-world user experience: Largest Contentful Paint (LCP), Cumulative Layout Shift (CLS), and Interaction to Next Paint (INP). Measuring them accurately is the first step toward optimization. This guide walks through seven tools and techniques, from quick audits to continuous monitoring.
There are two types of performance data: field data collected from real users (the gold standard Google uses for ranking) and lab data from simulated tests (useful for debugging). You need both. Field data tells you what users actually experience; lab data tells you why.
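Every tool below rates each metric against Google's published thresholds (LCP ≤ 2.5 s, INP ≤ 200 ms, CLS ≤ 0.1 for "good"). As a reference, that classification can be sketched in a few lines:

```javascript
// Google's published Core Web Vitals thresholds -- the same cut-offs
// PageSpeed Insights and CrUX use to label a metric.
const THRESHOLDS = {
  LCP: { good: 2500, poor: 4000 }, // milliseconds
  INP: { good: 200, poor: 500 },   // milliseconds
  CLS: { good: 0.1, poor: 0.25 },  // unitless score
};

// Classify a single measurement the way the tools in this guide do.
function rateMetric(name, value) {
  const t = THRESHOLDS[name];
  if (!t) throw new Error(`Unknown metric: ${name}`);
  if (value <= t.good) return 'good';
  if (value <= t.poor) return 'needs-improvement';
  return 'poor';
}

console.log(rateMetric('LCP', 1800)); // 'good'
console.log(rateMetric('CLS', 0.15)); // 'needs-improvement'
console.log(rateMetric('INP', 600));  // 'poor'
```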
Step-by-step guide
Run a PageSpeed Insights audit
PageSpeed Insights (PSI) at pagespeed.web.dev is the fastest way to get both field and lab data in one report. Enter your URL and wait about 30 seconds for the analysis. The results show two sections: Field Data from CrUX (real users over 28 days) and Lab Data from a Lighthouse run.
Focus on the field data section first -- these are the real metrics Google uses for ranking. If the field data shows "not enough data," your URL does not have enough Chrome traffic for CrUX. In that case, check the origin-level data which aggregates all pages on your domain.
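If you want to pull the same CrUX field data programmatically rather than through the PSI UI, the Chrome UX Report API exposes it directly. A sketch, assuming you have created an API key in Google Cloud and enabled the Chrome UX Report API (the `records:queryRecord` endpoint is the real one; `buildCruxQuery` and `fetchFieldData` are illustrative helpers):

```javascript
// Build a CrUX API request body. Pass `origin` instead of `url` to get
// origin-level data when a single URL lacks sufficient traffic.
function buildCruxQuery({ url, origin, formFactor = 'PHONE' }) {
  if (!url && !origin) throw new Error('url or origin is required');
  return origin ? { origin, formFactor } : { url, formFactor };
}

// Query the CrUX API. `apiKey` is a placeholder for your own key.
async function fetchFieldData(query, apiKey) {
  const res = await fetch(
    `https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=${apiKey}`,
    {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(query),
    },
  );
  if (!res.ok) throw new Error(`CrUX API error: ${res.status}`);
  const { record } = await res.json();
  // p75 is the value PSI reports for field data.
  return record.metrics.largest_contentful_paint.percentiles.p75;
}
```

Falling back to an origin query mirrors the "not enough data" advice above: when a URL has too little Chrome traffic, the origin-level record often still exists.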
Use Chrome DevTools Performance panel
Open Chrome DevTools (F12), navigate to the Performance tab, and check the "Web Vitals" checkbox. Click the reload button to record a fresh page load. The timeline shows LCP (marked with a blue diamond), layout shifts (highlighted in pink), and long tasks (red corners on yellow bars).
For INP measurement, start a recording, then interact with the page naturally -- click buttons, type in inputs, toggle menus. Stop recording and look for "Interactions" in the Interactions lane. Each interaction shows its total delay, including input delay, processing time, and presentation delay.
1. Open DevTools (F12 or Cmd+Opt+I)
2. Go to Performance tab
3. Check 'Web Vitals' checkbox at the top
4. Click the circular record button, then reload
5. Wait for page load to complete
6. Click Stop
7. Look for:
- LCP marker (blue diamond on timeline)
- Layout Shift entries (pink highlights)
- Long Tasks (yellow bars with red corners)
- Interactions track (for INP analysis)
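The layout-shift entries DevTools highlights are not simply summed: CLS takes the worst "session window" of shifts (shifts less than 1 s apart, each window capped at 5 s, shifts following recent user input excluded). A simplified sketch of that aggregation, runnable against synthetic entries:

```javascript
// Compute CLS from layout-shift entries using session windows:
// a new window starts when the gap since the previous shift reaches
// 1 s or the window spans 5 s; CLS is the largest window total.
function computeCLS(entries) {
  let cls = 0;
  let windowSum = 0;
  let windowStart = 0;
  let prevTime = -Infinity;
  for (const e of entries) {
    if (e.hadRecentInput) continue; // user-initiated shifts don't count
    const gap = e.startTime - prevTime;
    const span = e.startTime - windowStart;
    if (gap >= 1000 || span >= 5000) {
      // Start a new session window at this shift.
      windowSum = 0;
      windowStart = e.startTime;
    }
    windowSum += e.value;
    prevTime = e.startTime;
    cls = Math.max(cls, windowSum);
  }
  return cls;
}
```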
Install the Web Vitals Chrome extension
The Web Vitals Chrome extension adds a badge overlay that shows real-time LCP, CLS, and INP scores as you browse. The badge turns green (good), yellow (needs improvement), or red (poor) based on the current page's metrics.
Click the badge to expand a detailed view showing the exact values, the LCP element, the largest CLS source, and the slowest interaction. This is invaluable for quick spot-checking during development without needing to open DevTools.
Add the web-vitals JavaScript library
The web-vitals library by Google is a tiny (~2KB) package that measures CWV using the same methodology as CrUX. Install it and add measurement callbacks that send data to your analytics endpoint. This gives you continuous field data from every real user visit.
```js
import { onLCP, onCLS, onINP, onFCP, onTTFB } from 'web-vitals';

function sendToAnalytics(metric) {
  // Send to your analytics endpoint
  const body = JSON.stringify({
    name: metric.name, // 'LCP', 'CLS', 'INP', etc.
    value: metric.value, // The metric value
    rating: metric.rating, // 'good', 'needs-improvement', 'poor'
    delta: metric.delta, // Change since last report
    id: metric.id, // Unique ID for deduplication
    navigationType: metric.navigationType,
  });

  // Use sendBeacon for reliability during page unload
  if (navigator.sendBeacon) {
    navigator.sendBeacon('/api/analytics', body);
  } else {
    fetch('/api/analytics', { body, method: 'POST', keepalive: true });
  }
}

// Measure all Core Web Vitals
onLCP(sendToAnalytics);
onCLS(sendToAnalytics);
onINP(sendToAnalytics);

// Optional: additional metrics
onFCP(sendToAnalytics);
onTTFB(sendToAnalytics);
```
Use navigator.sendBeacon instead of fetch for analytics. Beacons are queued by the browser and delivered even as the page unloads, which prevents the data loss that a plain fetch can suffer when the user navigates away.
Check Google Search Console CWV report
In Google Search Console, navigate to Core Web Vitals under "Experience" in the sidebar. The report shows how many of your URLs are rated Good, Needs Improvement, or Poor for both mobile and desktop.
Click on specific issues to see which URL groups are affected. Search Console groups URLs with similar structures (e.g., all product pages) to help you identify patterns. Fix the template or component causing the issue, and all pages in that group improve.
Run a WebPageTest audit for deep analysis
WebPageTest provides the most detailed performance analysis available. It generates waterfall charts showing every network request, filmstrip views of the visual loading sequence, and detailed breakdowns of TTFB, render timing, and resource loading.
Configure the test with settings matching your target audience: select a test location close to your users, choose an appropriate connection profile (for example, 4G for mobile or Cable for desktop), and pick the device type. Run at least three tests and use the median result to account for variance.
```text
URL: https://your-site.com
Test Location: Virginia, USA (or nearest to your users)
Browser: Chrome
Connection: 4G (9 Mbps, 170 ms RTT)
Number of Tests: 3 (use median)
Repeat View: First View + Repeat View
Advanced:
  - Capture Video: Yes
  - Timeline: Yes
  - Block: (leave empty unless testing third-party impact)
```
Set up continuous monitoring
One-time audits are not enough. Performance regresses with every code change, dependency update, and content addition. Set up automated monitoring that runs Lighthouse audits on every pull request (using Lighthouse CI) or on a schedule (using a monitoring service).
Define performance budgets -- maximum acceptable values for each metric -- and configure alerts when they are exceeded. This catches regressions before they reach production and affect real users.
```yaml
ci:
  collect:
    url:
      - https://your-site.com/
      - https://your-site.com/blog/
    numberOfRuns: 3
  assert:
    assertions:
      "categories:performance":
        - error
        - minScore: 0.9
      largest-contentful-paint:
        - warn
        - maxNumericValue: 2500
      cumulative-layout-shift:
        - warn
        - maxNumericValue: 0.1
      interactive:
        - warn
        - maxNumericValue: 3800
  upload:
    target: temporary-public-storage
```
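Lighthouse can also enforce budgets through a standalone budget file passed with the `--budget-path` flag. A sketch of a `budget.json` mirroring the assertions above (illustrative numbers; supported metric names vary by Lighthouse version, and resource budgets are in KiB):

```json
[
  {
    "path": "/*",
    "timings": [
      { "metric": "largest-contentful-paint", "budget": 2500 },
      { "metric": "cumulative-layout-shift", "budget": 0.1 },
      { "metric": "interactive", "budget": 3800 }
    ],
    "resourceSizes": [
      { "resourceType": "script", "budget": 300 },
      { "resourceType": "total", "budget": 1000 }
    ]
  }
]
```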
Summary
| Tool | Data Type | Best For |
|---|---|---|
| PageSpeed Insights | Field + Lab | Quick overview and pass/fail status |
| Chrome DevTools | Lab | Debugging specific issues |
| Web Vitals Extension | Lab (local) | Quick spot-checking during development |
| web-vitals library | Field | Continuous real-user monitoring |
| Search Console | Field | URL-level CWV status for SEO |
| WebPageTest | Lab | Deep waterfall analysis |
| Lighthouse CI | Lab | Automated regression detection |
Frequently asked questions
What is the difference between field data and lab data?
Field data (also called Real User Monitoring or RUM) is collected from actual users visiting your site in the real world. Lab data is measured in a controlled, simulated environment. Field data from CrUX is what Google uses for ranking, so it is the authoritative source. Lab data from Lighthouse is useful for debugging specific issues but may not reflect real-world conditions.
How long does it take for CrUX data to update?
CrUX uses a rolling 28-day window. After making performance improvements, it takes at least 28 days for the changes to fully reflect in CrUX data. PageSpeed Insights and Search Console both pull from CrUX, so they also lag by this amount. For immediate feedback, use lab tools like Lighthouse or DevTools.
Can I measure Core Web Vitals on localhost?
Lab tools like Lighthouse and Chrome DevTools work on localhost, but they measure simulated conditions rather than real-world performance. Field data tools like CrUX and the web-vitals library require real users on a public URL. For local development, use Lighthouse with mobile simulation for a reasonable approximation.
Which Core Web Vital is most important to fix first?
Start with whichever metric is in the Poor range. If multiple metrics need improvement, prioritize LCP first because it has the biggest impact on user perception and is the hardest to pass -- only 68.3% of origins score Good on LCP, compared to 87.1% for INP and 80.9% for CLS.
How do I measure INP specifically?
INP requires real user interaction data, so lab tools alone cannot fully measure it. Use the web-vitals library to collect INP from real users, or use Chrome DevTools Performance panel to record interactions manually. The Chrome Web Vitals extension also shows INP in real-time as you interact with the page.
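Under the hood, INP is not always the single worst interaction: one high outlier is ignored for every 50 interactions, approximating the 98th percentile. A simplified sketch of that selection logic (real inputs come from Event Timing API entries grouped per interaction; this version just takes a list of interaction latencies in milliseconds):

```javascript
// Select the INP value from a page's interaction latencies: normally
// the worst one, but for busy pages one outlier is skipped per 50
// interactions to approximate the 98th percentile.
function selectINP(durations) {
  if (durations.length === 0) return undefined;
  const sorted = [...durations].sort((a, b) => b - a); // worst first
  const index = Math.min(sorted.length - 1, Math.floor(durations.length / 50));
  return sorted[index];
}

console.log(selectINP([40, 80, 120])); // 120 -- few interactions: the worst one counts
```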