Caching Strategies for Modern Web Apps
Leverage HTTP caching, service workers, and CDNs to improve speed and offline support.
Why Caching Is Your Biggest Performance Lever
The fastest HTTP request is the one that never happens. A well-implemented caching strategy can eliminate 80-90% of repeat requests, dramatically reducing load times for returning visitors and lowering your server costs. But caching is also one of the most misunderstood topics in web development — get it wrong and you'll serve stale content or, worse, break your application.
HTTP Cache Headers
The foundation of web caching is the Cache-Control header. Here are the directives that matter most:
- max-age=31536000, immutable: for versioned assets (JS, CSS, images with content hashes in their filenames). These can be cached forever because the filename changes when the content changes.
- no-cache: doesn't mean "don't cache." It means "cache it, but revalidate with the server before using it." The server responds with 304 (Not Modified) if the content hasn't changed, saving bandwidth.
- no-store: actually means "don't cache." Use this for sensitive data like banking pages or authenticated API responses.
- stale-while-revalidate=60: serve the cached version immediately while fetching a fresh copy in the background. This gives users instant responses while keeping content reasonably fresh.
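As a minimal sketch of how these directives map to real responses, the helper below picks a Cache-Control value based on the request path. The function name, path patterns, and hash regex are illustrative assumptions, not a standard API; adapt the patterns to your build tool's output.

```typescript
// Hypothetical helper: choose a Cache-Control value by request path.
// The rules mirror the directives above; adjust patterns to your build output.
function cacheControlFor(path: string): string {
  // Content-hashed assets (e.g. app.3f2a9c1b.js) are safe to cache forever.
  if (/\.[0-9a-f]{6,}\.(js|css|png|woff2)$/.test(path)) {
    return "public, max-age=31536000, immutable";
  }
  // Authenticated API responses should never be stored.
  if (path.startsWith("/api/account")) {
    return "private, no-store";
  }
  // HTML documents: always revalidate so deploys show up immediately.
  if (path.endsWith(".html") || path === "/") {
    return "no-cache";
  }
  // Everything else: serve stale for a minute while refreshing in the background.
  return "public, max-age=60, stale-while-revalidate=60";
}
```

A middleware or CDN configuration would then set the header from this one function, keeping the caching rules in a single place.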
CDN Caching
A CDN caches your content at edge servers distributed globally, so users download assets from a server geographically close to them. For static assets, CDN caching is essentially free performance:
- Set long max-age values for hashed assets
- Use the CDN's purge/invalidation API when you deploy new versions of non-hashed assets
- Consider edge functions (Vercel Edge, Cloudflare Workers) for dynamic content that can be cached regionally
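A purge call from a deploy script might look like the sketch below. The endpoint shape follows Cloudflare's cache-purge API; the zone ID, token, and error handling are placeholders, and other CDNs expose a different contract, so check your provider's documentation.

```typescript
// Build the purge request separately so the payload shape is easy to test.
function buildPurgeRequest(zoneId: string, urls: string[]): { url: string; body: string } {
  return {
    // Cloudflare-style endpoint; other CDNs use different URLs and payloads.
    url: `https://api.cloudflare.com/client/v4/zones/${zoneId}/purge_cache`,
    body: JSON.stringify({ files: urls }),
  };
}

// Fire the purge after a deploy of non-hashed assets.
async function purge(zoneId: string, apiToken: string, urls: string[]): Promise<void> {
  const req = buildPurgeRequest(zoneId, urls);
  const res = await fetch(req.url, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiToken}`,
      "Content-Type": "application/json",
    },
    body: req.body,
  });
  if (!res.ok) throw new Error(`Purge failed: ${res.status}`);
}
```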
Application-Level Caching
Beyond HTTP headers, your application can cache data at multiple layers:
- In-memory caching (e.g., Redis) for database query results and computed data. Set appropriate TTLs and invalidation rules.
- Next.js Data Cache for fetch responses in server components. Use revalidate options to control freshness.
- Client-side state with libraries like TanStack Query, which provides built-in cache management with configurable stale times and background revalidation.
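The common pattern behind these layers is cache-aside: check the cache first, fall back to the data source, and store the result with a TTL. Here is a self-contained sketch using an in-process Map as a stand-in for Redis; the class and function names are illustrative.

```typescript
type Entry<T> = { value: T; expiresAt: number };

// Minimal in-process TTL cache (a stand-in for Redis in this sketch).
class TtlCache<T> {
  private store = new Map<string, Entry<T>>();

  constructor(private ttlMs: number) {}

  get(key: string): T | undefined {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expiresAt) {
      this.store.delete(key); // expired: evict lazily on read
      return undefined;
    }
    return entry.value;
  }

  set(key: string, value: T): void {
    this.store.set(key, { value, expiresAt: Date.now() + this.ttlMs });
  }

  delete(key: string): void {
    this.store.delete(key);
  }
}

// Cache-aside: return cached data if fresh, otherwise load and cache it.
async function getOrLoad<T>(cache: TtlCache<T>, key: string, load: () => Promise<T>): Promise<T> {
  const hit = cache.get(key);
  if (hit !== undefined) return hit;
  const value = await load();
  cache.set(key, value);
  return value;
}
```

With Redis the shape is the same; get/set/delete become network calls and the TTL is passed to the server instead of tracked locally.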
Cache Invalidation
The famous quote goes: "There are only two hard things in Computer Science: cache invalidation and naming things." Here's how to make invalidation less painful:
- Use content-based hashing for static assets. When the file content changes, the hash changes, and the old URL naturally expires from caches. This is what build tools like Webpack and Vite do by default.
- Use short TTLs for dynamic data. A 5-minute cache on an API response is usually a great tradeoff — it absorbs traffic spikes without serving dangerously stale data.
- Implement event-driven invalidation for critical data. When a user updates their profile, explicitly purge the cached version rather than waiting for the TTL to expire.
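The profile example above can be sketched as follows. The cache here is a plain Map standing in for Redis, and updateProfileInDb is a placeholder for a real database write; the key scheme is an assumption.

```typescript
// Cache of user profiles, keyed by a hypothetical "profile:<userId>" scheme.
const profileCache = new Map<string, { name: string }>();

// Placeholder for a real database write.
async function updateProfileInDb(userId: string, profile: { name: string }): Promise<void> {
  // persist the profile somewhere durable
}

// Event-driven invalidation: persist first, then purge the stale cache entry
// immediately rather than waiting for a TTL to expire.
async function updateProfile(userId: string, profile: { name: string }): Promise<void> {
  await updateProfileInDb(userId, profile);
  profileCache.delete(`profile:${userId}`);
}
```

The ordering matters: deleting after the write guarantees the next read repopulates the cache with the new data rather than racing against the old value.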
The goal isn't to cache everything aggressively. It's to cache the right things with the right lifetimes, so your users get fast responses and fresh data simultaneously.
Triforce Team
The Triforce Software team shares insights on software development, accessibility, and performance.