Fix TTFB in Angular
A standard Angular SPA sends a minimal HTML shell and waits for the JavaScript bundle to download, parse, execute, and make API requests before any content appears -- often over a second on a mid-range device. Angular Universal SSR (packaged as @angular/ssr since Angular 17) removes that blank-shell phase by rendering the full HTML on the server, so the first bytes the browser receives already contain real content. Combined with TransferState, edge deployment, and proper caching headers, TTFB can drop from over a second to under 250ms.
Expected results
Before: 1,350ms TTFB (Poor) -- SPA only, no SSR, no caching, remote server
After: 220ms TTFB (Good) -- Angular Universal SSR, TransferState, edge deployment, HTTP/2
Step-by-step fix
Enable Angular Universal SSR with @angular/ssr
Running ng add @angular/ssr scaffolds a server.ts file that creates an Express.js application and uses Angular's CommonEngine to render the app for each incoming request. The rendered HTML is returned as the response body. The client receives a fully-populated HTML document before any JavaScript executes, which means TTFB now reflects actual content rather than a blank shell. The same AppComponent tree runs on both server and client without code changes.
# 1. Add SSR to an existing Angular 17+ project
ng add @angular/ssr
# This creates:
# server.ts -- Express entry point
# src/app/app.config.server.ts -- server-specific providers
# and updates angular.json with a server build target
// server.ts -- generated by ng add @angular/ssr, customise as needed
import 'zone.js/node';
import { APP_BASE_HREF } from '@angular/common';
import { CommonEngine } from '@angular/ssr';
import express from 'express';
import { fileURLToPath } from 'node:url';
import { dirname, join, resolve } from 'node:path';
import bootstrap from './src/main.server';

export function app(): express.Express {
  const server = express();
  const serverDistFolder = dirname(fileURLToPath(import.meta.url));
  const browserDistFolder = resolve(serverDistFolder, '../browser');
  const indexHtml = join(serverDistFolder, 'index.server.html');
  const commonEngine = new CommonEngine();

  // Serve static assets with long cache
  server.get('*.*', express.static(browserDistFolder, {
    maxAge: '1y',
    immutable: true,
  }));

  // Render Angular app for all other routes
  server.get('*', (req, res, next) => {
    commonEngine
      .render({
        bootstrap,
        documentFilePath: indexHtml,
        url: `${req.protocol}://${req.headers.host}${req.originalUrl}`,
        publicPath: browserDistFolder,
        providers: [{ provide: APP_BASE_HREF, useValue: req.baseUrl }],
      })
      .then(html => {
        res.setHeader('Cache-Control', 'public, max-age=60, stale-while-revalidate=300');
        res.send(html);
      })
      .catch(err => next(err));
  });

  return server;
}

function run(): void {
  const port = process.env['PORT'] || 4000;
  const server = app();
  server.listen(port, () =>
    console.log(`Node server listening on http://localhost:${port}`)
  );
}

run();
Use TransferState to avoid duplicate API calls
Without TransferState, the server renders the page using data fetched from APIs; the client then bootstraps Angular, finds the same services, and fires the same API requests again. This wastes bandwidth and adds latency. TransferState serialises the server-fetched data into a <script type="application/json"> tag in the HTML, and the client reads that embedded data instead of going back to the network. Note that withFetch() alone does not enable this: withFetch() switches HttpClient to the Fetch API (recommended under SSR), while the automatic HTTP transfer cache comes from provideClientHydration(), shown after the manual example below.
// product.service.ts
import {
  Injectable, inject, PLATFORM_ID, TransferState, makeStateKey
} from '@angular/core';
import { HttpClient } from '@angular/common/http';
import { isPlatformBrowser } from '@angular/common';
import { Observable, of } from 'rxjs';
import { tap } from 'rxjs/operators';

export interface Product { id: number; name: string; price: number; }

const PRODUCTS_KEY = makeStateKey<Product[]>('products');

@Injectable({ providedIn: 'root' })
export class ProductService {
  private http = inject(HttpClient);
  private transferState = inject(TransferState);
  private platformId = inject(PLATFORM_ID);

  getProducts(): Observable<Product[]> {
    // On the client, read from TransferState if the server already fetched
    if (isPlatformBrowser(this.platformId)) {
      const cached = this.transferState.get<Product[]>(PRODUCTS_KEY, []);
      if (cached.length) {
        this.transferState.remove(PRODUCTS_KEY); // consume once
        return of(cached);
      }
    }
    // On the server (or on the client if not cached): fetch from the API
    return this.http.get<Product[]>('/api/products').pipe(
      tap(products => {
        // Store in TransferState so the client HTML receives it
        if (!isPlatformBrowser(this.platformId)) {
          this.transferState.set(PRODUCTS_KEY, products);
        }
      })
    );
  }
}
// app.config.ts -- enable HttpClient with the Fetch API
import { ApplicationConfig } from '@angular/core';
import { provideHttpClient, withFetch } from '@angular/common/http';
import { provideRouter } from '@angular/router';

export const appConfig: ApplicationConfig = {
  providers: [
    provideRouter([]),
    // withFetch() switches HttpClient to the Fetch API (recommended for SSR)
    provideHttpClient(withFetch()),
  ],
};
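Manual TransferState keys give you full control, but for plain HttpClient GET requests Angular can do the same bookkeeping automatically. A minimal sketch of that alternative, assuming Angular 17+, where provideClientHydration() enables the HTTP transfer cache and withHttpTransferCacheOptions() tunes it:

// app.config.ts -- alternative: hydration's built-in HTTP transfer cache
import { ApplicationConfig } from '@angular/core';
import { provideHttpClient, withFetch } from '@angular/common/http';
import { provideRouter } from '@angular/router';
import {
  provideClientHydration,
  withHttpTransferCacheOptions,
} from '@angular/platform-browser';

export const appConfig: ApplicationConfig = {
  providers: [
    provideRouter([]),
    provideHttpClient(withFetch()),
    // GET responses fetched during SSR are embedded in the HTML and replayed
    // on the client, so services need no manual TransferState code
    provideClientHydration(
      withHttpTransferCacheOptions({ includePostRequests: false })
    ),
  ],
};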
Cache HTTP responses at the server level
Even with SSR, slow backend API calls during the render add directly to TTFB. An HttpInterceptorFn that caches GET responses in a server-side Map lets repeated renders of the same URL reuse data for the lifetime of the server process instead of hitting the origin every time. Combine this with Cache-Control response headers so reverse proxies, CDNs, and browsers can also cache the rendered HTML, bringing TTFB near zero for repeat requests.
// server-cache.interceptor.ts
import {
  HttpInterceptorFn, HttpRequest, HttpHandlerFn, HttpResponse
} from '@angular/common/http';
import { inject, PLATFORM_ID } from '@angular/core';
import { isPlatformServer } from '@angular/common';
import { of } from 'rxjs';
import { tap } from 'rxjs/operators';

// Module-level cache shared across requests on the same server instance
const serverCache = new Map<string, HttpResponse<unknown>>();
const cacheTimestamps = new Map<string, number>();
const CACHE_TTL_MS = 30_000; // 30 seconds

export const serverCacheInterceptor: HttpInterceptorFn = (
  req: HttpRequest<unknown>,
  next: HttpHandlerFn
) => {
  const platformId = inject(PLATFORM_ID);

  // Only cache GET requests on the server
  if (!isPlatformServer(platformId) || req.method !== 'GET') {
    return next(req);
  }

  const key = req.urlWithParams;
  const cached = serverCache.get(key);
  const ts = cacheTimestamps.get(key) ?? 0;
  if (cached && Date.now() - ts < CACHE_TTL_MS) {
    return of(cached.clone());
  }

  return next(req).pipe(
    tap(event => {
      if (event instanceof HttpResponse) {
        serverCache.set(key, event.clone());
        cacheTimestamps.set(key, Date.now());
      }
    })
  );
};
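One caveat with a module-level Map: expired entries are only overwritten, never removed, so the cache can grow without bound on a long-running instance. A minimal periodic sweep, using the same maps and TTL defined above:

// server-cache.interceptor.ts (continued) -- evict expired entries so the
// module-level cache stays bounded on long-running server instances
setInterval(() => {
  const now = Date.now();
  for (const [key, ts] of cacheTimestamps) {
    if (now - ts >= CACHE_TTL_MS) {
      cacheTimestamps.delete(key);
      serverCache.delete(key);
    }
  }
}, CACHE_TTL_MS);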
// app.config.server.ts -- register the interceptor for server builds
import { ApplicationConfig, mergeApplicationConfig } from '@angular/core';
import { provideServerRendering } from '@angular/platform-server';
import { provideHttpClient, withFetch, withInterceptors } from '@angular/common/http';
import { appConfig } from './app.config';
import { serverCacheInterceptor } from './server-cache.interceptor';

const serverConfig: ApplicationConfig = {
  providers: [
    provideServerRendering(),
    // A later provideHttpClient call overrides the one from appConfig, so
    // re-declare withFetch() here alongside the server-only interceptor
    provideHttpClient(withFetch(), withInterceptors([serverCacheInterceptor])),
  ],
};

export const config = mergeApplicationConfig(appConfig, serverConfig);
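For reference, the merged config is consumed by the server entry point that ng add @angular/ssr scaffolds -- main.server.ts typically looks like this:

// src/main.server.ts -- server bootstrap wiring in the merged config
import { bootstrapApplication } from '@angular/platform-browser';
import { AppComponent } from './app/app.component';
import { config } from './app/app.config.server';

const bootstrap = () => bootstrapApplication(AppComponent, config);

export default bootstrap;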
Deploy to edge functions
Running Angular Universal in a single AWS or GCP region means users far from that region pay geographic latency as a fixed component of TTFB -- often 100-300ms before any rendering work begins. Deploying to an edge platform (Vercel, Cloudflare Workers) moves compute and caching close to the user, cutting that overhead to under 20ms for most visitors. One nuance: CommonEngine depends on Node APIs, so on Vercel the render runs in a Node serverless function, while the s-maxage directive in the example below lets Vercel's edge network cache the rendered HTML so most repeat requests never reach the function at all.
// vercel.json -- deploy Angular Universal to Vercel Edge Network
{
  "buildCommand": "ng build --configuration production",
  "outputDirectory": "dist/my-app/browser",
  "framework": "angular",
  "rewrites": [
    { "source": "/(.*)", "destination": "/api/ssr" }
  ]
}
// api/ssr.ts -- Vercel serverless function wrapping CommonEngine
import type { VercelRequest, VercelResponse } from '@vercel/node';
import { CommonEngine } from '@angular/ssr';
import { APP_BASE_HREF } from '@angular/common';
import bootstrap from '../src/main.server';
import { join } from 'node:path';

const engine = new CommonEngine();

export default async function handler(req: VercelRequest, res: VercelResponse) {
  const html = await engine.render({
    bootstrap,
    documentFilePath: join(process.cwd(), 'dist/my-app/browser/index.html'),
    url: `https://${req.headers.host}${req.url}`,
    providers: [{ provide: APP_BASE_HREF, useValue: '/' }],
  });
  // s-maxage lets the edge cache serve repeat requests without re-rendering
  res.setHeader('Cache-Control', 'public, s-maxage=60, stale-while-revalidate=300');
  res.setHeader('Content-Type', 'text/html');
  res.status(200).send(html);
}
Add Cache-Control headers for static assets and API responses
TTFB on repeat visits is determined by caching. Angular's production build appends content hashes to all JavaScript and CSS filenames, so these can be cached indefinitely with immutable. The rendered HTML should use short-lived caches with stale-while-revalidate so users get fast responses while fresh renders happen in the background. API responses used during SSR should also carry appropriate Cache-Control headers so the server-side cache interceptor has correct TTL guidance.
// server.ts -- comprehensive Cache-Control setup
import express from 'express';
import { join } from 'node:path';

export function app(): express.Express {
  const server = express();
  const browserDistFolder = join(process.cwd(), 'dist/my-app/browser');

  // Hashed JS/CSS/fonts -- immutable, cache forever
  // (Express route strings don't support {a,b} globs, so use a regex)
  server.get(/\.(?:js|css|woff2?|ttf)$/, express.static(browserDistFolder, {
    maxAge: '1y',
    immutable: true,
    // Emits: Cache-Control: public, max-age=31536000, immutable
  }));

  // Images -- cache 30 days, allow revalidation
  server.get(/\.(?:png|jpe?g|webp|avif|svg)$/, express.static(browserDistFolder, {
    maxAge: '30d',
    // Emits: Cache-Control: public, max-age=2592000
  }));

  // SSR HTML -- short cache with stale-while-revalidate
  server.get('*', (req, res, next) => {
    if (req.path.startsWith('/api/')) return next();
    res.setHeader(
      'Cache-Control',
      'public, max-age=30, stale-while-revalidate=300, stale-if-error=86400'
    );
    // ... call CommonEngine.render() as in step 1, then res.send(html) ...
  });

  // API proxy responses -- cache based on data freshness requirements
  server.get('/api/products', (_req, res, next) => {
    res.setHeader('Cache-Control', 'public, max-age=60, stale-while-revalidate=600');
    next(); // hand off to the actual proxy or data handler
  });

  return server;
}
Quick checklist
- ng add @angular/ssr is applied and the server.ts entry point is used in production builds
- All data services use TransferState to avoid duplicate API calls on the client
- The server-side HTTP caching interceptor is registered for GET requests
- SSR is deployed to an edge network (Vercel, Cloudflare Workers, or equivalent)
- Hashed static assets are served with Cache-Control: immutable
- HTML responses use stale-while-revalidate for fast repeat visits
Frequently asked questions
Does enabling SSR automatically improve TTFB?
Not automatically. SSR reduces time-to-meaningful-content for the user, but raw TTFB can be slower than serving a static shell if the server-side render is slow due to cold starts, slow API calls, or single-region deployment. The real gains come from combining SSR with server-side caching (so most requests are served from cache), edge deployment (so geographic latency is minimal), and TransferState (so the client does not refetch what the server already loaded).
What does TransferState actually do?
TransferState is an Angular service that serialises data from the server-rendered context into a <script type="application/json"> tag embedded in the response HTML. Without it, the Angular client bootstraps and immediately fires the same API requests the server already made to generate the page, causing a visible content flash as the client re-fetches and re-renders. TransferState makes the client read from the embedded JSON instead of the network, eliminating that duplicate request entirely.
Can I get a fast TTFB without running a Node server?
Yes, with static pre-rendering. Angular 17's ng build --prerender generates static HTML files for all routes at build time. These are served as plain files from a CDN with no server compute, giving TTFB under 50ms for most users. The trade-off is that page content reflects build-time data; pages with frequently changing data need SSR or ISR (incremental static regeneration via a short stale-while-revalidate cache).
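With the Angular 17 application builder, prerendering can also be switched on in angular.json instead of on the command line -- an illustrative excerpt (project paths are placeholders):

// angular.json (excerpt) -- build-time prerendering with the application builder
"architect": {
  "build": {
    "builder": "@angular-devkit/build-angular:application",
    "options": {
      "outputPath": "dist/my-app",
      "prerender": true,
      "ssr": { "entry": "server.ts" }
    }
  }
}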
Why is TTFB still slow after enabling SSR?
The most common causes are: the Node.js server sitting in a single region far from many users (adding 50-300ms of geographic latency), slow database or downstream API calls made synchronously during the render, no response caching so every request triggers a full render, large server bundles with high cold-start parse time, and memory leaks in long-running server instances causing periodic GC pauses that stall the render. Use the Server-Timing response header to measure render time separately from network time.
How do I measure TTFB?
In Chrome DevTools' Network tab, click the HTML document request and inspect the Timing section -- the "Waiting for server response" row is TTFB. In the Performance tab, TTFB is the interval between navigation start and the first response byte. For field data from real users, install the web-vitals package and call onTTFB(console.log) in main.ts. Add a Server-Timing header in Express (res.setHeader('Server-Timing', 'render;dur=42')) to expose the server render duration separately in DevTools.
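A minimal field-measurement sketch with the web-vitals package; the /analytics endpoint is a placeholder for your own collector:

// main.ts -- report real-user TTFB (npm install web-vitals)
import { onTTFB } from 'web-vitals';

onTTFB(metric => {
  // metric.value is this page load's TTFB in milliseconds
  navigator.sendBeacon('/analytics', JSON.stringify({
    name: metric.name,
    value: metric.value,
  }));
});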
What causes poor TTFB in general?
The most common causes are uncached server-side rendering (each request triggers full page generation), slow database queries without indexes, hosting on a single-region origin server far from users, and missing CDN caching headers. For Angular, check that prerendered or ISR pages are served from CDN edge nodes rather than hitting the origin on every request.
Related resources
Complete TTFB Guide
Deep dive into TTFB components, server rendering strategies, and CDN caching patterns.
Fix TTFB in Next.js
Next.js TTFB fixes using the App Router, React Server Components, and edge runtime deployment.
Fix TTFB in Vue
Nuxt 3 SSR, server-side data fetching with useFetch, and Nitro edge deployment.
Continue learning
Complete TTFB Guide
Deep dive into TTFB -- thresholds, measurement, and optimization strategies.
Fix LCP in Angular
Related performance optimization for the same framework.
Fix CLS in Angular
Related performance optimization for the same framework.
CWV Score Explainer
Enter your scores for personalized fix recommendations.