
Next.js 16 Cache Components Guide for Marketing Sites

Mark Shvaya
16 min read

AI Summary

Next.js Cache Components are the new caching model in Next.js 16 that uses a "use cache" directive to mark functions, components, or routes as cacheable. Combined with Partial Prerendering (PPR), cacheLife profiles, and cacheTag invalidation, they let marketing sites ship static-fast shells with dynamic personalization. This guide covers the directive, the supporting APIs, migration from unstable_cache, real-world patterns for marketing pages, and the gotchas that show up after deploy.

The Short Answer: What Are Next.js Cache Components?

Next.js Cache Components are an opt-in caching primitive in Next.js 16 that wraps any async function, component, or file in a "use cache" directive. The function becomes a cache boundary: its arguments form the cache key, its return value lives in the Data Cache, and every subsequent call with the same arguments skips the work and returns the cached value. For marketing sites, that means a CMS-driven homepage, blog listing, or pricing page can render once at the edge and serve thousands of visitors in single-digit milliseconds.

The model replaces the older unstable_cache wrapper and the implicit fetch caching that App Router shipped with originally. Cache Components are explicit, composable, and pair with Partial Prerendering (PPR) so a single page can be partly static and partly dynamic—static shell, dynamic personalization holes, all on the same URL.

What you get with Cache Components:

  • Directive-based API: "use cache" at file, function, or component scope
  • cacheLife: tunable freshness with built-in profiles (seconds, minutes, hours, days, weeks, max)
  • cacheTag + updateTag: on-demand invalidation tied to content events
  • PPR compatibility: static shells with dynamic holes on the same route
  • Edge-friendly: CDN tag invalidation works with the Vercel platform out of the box

Source: Next.js docs—use cache directive.

Why Cache Components Matter for Marketing Sites

Marketing sites have an awkward shape. Most of the page is identical for every visitor—hero, navigation, testimonials, footer—but a sliver is personalized: a logged-in badge, a geolocation-aware CTA, a session-aware pricing toggle. Old caching models forced you to choose: cache the whole page and ditch personalization, or render dynamically and pay the latency on every visit.

Cache Components break that tradeoff. The static parts cache; the dynamic parts stream. The user sees the cached shell instantly and the dynamic holes fill in over the same response. Vercel's PPR launch data showed median TTFB drops of 40–80% on representative marketing pages versus full dynamic rendering.

The performance math compounds with the rest of Next.js performance optimization—static shells are smaller, faster, and easier for the CDN to serve. They also pass Core Web Vitals more reliably because LCP candidates are typically in the cached shell, not the streamed holes.

TTFB by Rendering Strategy (Next.js 16, 50th percentile)

  • Static: 85ms
  • Cache Components: 130ms
  • PPR: 220ms
  • Full Dynamic: 710ms

Marketing homepage on Vercel Edge, US-East, p50 of 1,000 requests.

The "use cache" Directive: Three Scopes

The directive is a string literal that mirrors the existing "use client" and "use server" conventions. Place it at the top of a file, function, or component to mark that scope as cacheable. Next.js builds a cache key from the function's serializable arguments, then stores the return value in the Data Cache.

File Scope: Cache Every Export

Place "use cache" at the top of a file and every async export becomes a cache boundary. Useful for data layers where every query should cache by default.

lib/data/posts.ts — file-level "use cache":

'use cache'

import { cacheLife, cacheTag } from 'next/cache'
import { db } from '@/lib/db'

export async function getPost(slug: string) {
  cacheLife('hours')
  cacheTag(`post:${slug}`)
  return db.post.findUnique({ where: { slug } })
}

export async function listPosts() {
  cacheLife('hours')
  cacheTag('posts:list')
  return db.post.findMany({ orderBy: { date: 'desc' } })
}

Function Scope: Cache One Async Function

Place "use cache" inside a function to scope caching to that single call site. This is the most common pattern—you cache the expensive call but leave neighboring functions uncached.

app/page.tsx — function-level cache:

import { cacheLife, cacheTag } from 'next/cache'
import { sanity, homepageQuery } from '@/lib/sanity' // your CMS client module

async function getHomepageContent() {
  'use cache'
  cacheLife('hours')
  cacheTag('homepage')
  return await sanity.fetch(homepageQuery)
}

export default async function HomePage() {
  const content = await getHomepageContent()
  return <Hero {...content.hero} />
}

Component Scope: Cache an Async Server Component

Mark an async Server Component itself as cacheable. The component's rendered output—not just the data—is stored in the cache. Pair this with PPR and the cached component becomes part of the static prerender.

Cached Server Component:

import { cacheLife, cacheTag } from 'next/cache'

async function PricingTable({ region }: { region: string }) {
  'use cache'
  cacheLife('hours')
  cacheTag(`pricing:${region}`)

  const tiers = await getPricing(region)
  return (
    <div className="grid md:grid-cols-3 gap-6">
      {tiers.map(t => <Tier key={t.id} {...t} />)}
    </div>
  )
}

export default function PricingPage() {
  return <PricingTable region="us" />
}

cacheLife: Tuning Freshness Windows

cacheLife controls three windows: how long a cached value stays fresh, how long it can be served as stale-while-revalidating, and when it should be evicted entirely. The function ships with named profiles that match common content cadences, plus a custom object form for fine-grained control.

For most marketing surfaces, the named profiles are enough. Reach for the custom object only when the named profiles don't fit—a daily-updated stat ticker, a rate-limited third-party API, or a regulated content type with explicit freshness requirements.

Profile  | Stale | Revalidate | Best For
---------|-------|------------|-------------------------------------------
seconds  | 0s    | 1s         | Live counters, stock tickers
minutes  | 5m    | 1m         | Search results, recent activity
hours    | 1h    | 15m        | Marketing pages, blog listings, pricing
days     | 1d    | 1h         | Blog posts, case studies, evergreen content
weeks    | 1w    | 1d         | Documentation, glossary, FAQ
max      | 1y    | 1y         | Logos, brand assets, static lists

When cacheLife is omitted, Next.js applies its built-in default profile (15-minute revalidation with no hard expiry), not the max profile. Prefer explicit named profiles in production so freshness intent stays visible.

Custom cacheLife Object

Custom freshness for a daily report:

async function getDailyReport() {
  'use cache'
  cacheLife({
    stale: 60 * 60,        // 1 hour stale window on the client
    revalidate: 60 * 30,   // revalidate every 30 minutes server-side
    expire: 60 * 60 * 24,  // hard evict after 24 hours
  })
  return fetchReport()
}

The three windows play distinct roles: stale governs how long the client-side router cache may reuse the value without asking the server; revalidate governs how often the server refreshes the entry in the background while continuing to serve the existing copy (stale-while-revalidate); expire is the hard ceiling: past it the entry is evicted and the next request blocks on a full re-execution.

cacheTag and updateTag: On-Demand Invalidation

Time-based revalidation works fine for content that changes on a predictable cadence, but most marketing sites need event-based invalidation. A writer publishes a post, an admin updates pricing, a webhook fires from the CMS—and the cache should clear immediately, not wait for the next revalidate window.

cacheTag attaches one or more string labels to a cached value. updateTag, called from a Server Action, immediately invalidates and refreshes every cached value carrying that label; route handlers such as CMS webhooks use revalidateTag from the same next/cache module. The pattern is purpose-built for content workflows.

CMS webhook handler invalidating tagged caches:

// app/api/sanity/webhook/route.ts
import { revalidateTag } from 'next/cache' // updateTag is Server-Action-only
import { NextResponse } from 'next/server'

export async function POST(req: Request) {
  const body = await req.json()
  const { type, slug } = body

  if (type === 'post') {
    revalidateTag(`post:${slug}`)
    revalidateTag('posts:list')
  }

  if (type === 'page') {
    revalidateTag(`page:${slug}`)
  }

  return NextResponse.json({ revalidated: true })
}

Tagging Strategies for Marketing Sites

  • Per-entity tags: post:hello-world, page:pricing—invalidate one piece of content.
  • Collection tags: posts:list, products:featured—invalidate listing pages when any item changes.
  • Type tags: cms:all—nuclear option, useful for schema migrations.
  • Region tags: pricing:us, pricing:eu—invalidate one geo without touching others.
  • Combination tags: apply multiple tags to a single cache so one entry can be invalidated from several angles.

Pairing Cache Components with Partial Prerendering

Partial Prerendering (PPR) is the rendering model that pairs a cached static shell with streamed dynamic holes on the same route. Cache Components define the cached parts; Suspense boundaries define the dynamic parts. The cached shell ships from the CDN as static HTML, the dynamic holes stream in as the request resolves.

Enable Cache Components globally with cacheComponents: true in next.config.js, then mark dynamic boundaries with <Suspense>. (In Next.js 15 canaries this was the experimental.ppr flag plus per-route experimental_ppr exports; Next.js 16 folds PPR into the Cache Components model.) Anything outside a Suspense boundary that doesn't read dynamic request data (cookies, headers, searchParams) becomes part of the cached shell.

PPR-enabled marketing homepage:

// next.config.js (Next.js 16)
module.exports = {
  cacheComponents: true,
}

// app/page.tsx
import { Suspense } from 'react'

export default function HomePage() {
  return (
    <>
      <Hero />              {/* cached, static */}
      <FeaturesGrid />      {/* cached, static */}
      <Suspense fallback={<UserBadgeSkeleton />}>
        <UserBadge />       {/* dynamic, streamed */}
      </Suspense>
      <Testimonials />      {/* cached, static */}
      <Suspense fallback={<RecentSignupsSkeleton />}>
        <RecentSignups />   {/* dynamic, streamed */}
      </Suspense>
      <Footer />            {/* cached, static */}
    </>
  )
}

The result: TTFB is the cached shell's TTFB (typically <100ms from edge), LCP candidates render from the static prerender, and only the personalized fragments wait on the dynamic work. Website speed optimization stops being a tradeoff with personalization—you get both.

Cache Hit Rate vs. Time-to-Live (Marketing Homepage)

[Chart: cache hit rate vs. TTL, from 0% with no cache across 1 min, 5 min, 15 min, 1 hour, 6 hours, 1 day, and 1 week windows.] Hit rate plateaus near 95% past the 1-hour mark on tag-invalidated content.

Migrating from unstable_cache to Cache Components

Most App Router apps still use unstable_cache—the wrapper-style API that took a fetcher, options object with tags and revalidate, and returned a memoized version. Cache Components replace it cleanly. The migration is mechanical and can be done one function at a time.

Step-by-Step Migration

  1. Locate every unstable_cache call site. Most are in lib/data or app/api directories.
  2. Replace the wrapper with the directive. Add "use cache" at the top of the function and remove the unstable_cache wrapper.
  3. Convert tags to cacheTag calls. Each tag string becomes a cacheTag(string) call inside the function.
  4. Convert revalidate options to cacheLife. Pick a named profile or pass a custom object with the same revalidate seconds.
  5. Swap revalidateTag for updateTag in Server Actions. updateTag expires and refreshes the entry immediately (read-your-own-writes); route handlers and webhooks keep using revalidateTag.
  6. Test cache key behavior. Cache Components hash the function arguments; ensure no closures over external state are baked into the key.

Before — unstable_cache:

import { unstable_cache } from 'next/cache'

export const getPost = unstable_cache(
  async (slug: string) => db.post.findUnique({ where: { slug } }),
  ['post'],
  { tags: ['posts'], revalidate: 3600 }
)

After — Cache Components:

import { cacheLife, cacheTag } from 'next/cache'

export async function getPost(slug: string) {
  'use cache'
  cacheLife('hours')
  cacheTag(`post:${slug}`)
  return db.post.findUnique({ where: { slug } })
}

Pro Tip:

Run npx @next/codemod next-async-request-api after upgrading to Next.js 16. The codemod handles cookies/headers async migration and flags unstable_cache call sites that need manual conversion. It won't auto-rewrite the cache calls (the directive scope changes are too contextual), but it produces a clean inventory.

Real-World Patterns for Marketing Sites

Pattern 1: CMS-Driven Pages with Webhook Invalidation

The classic marketing setup: Sanity, Contentful, or Storyblok feeds the page, a webhook fires on publish, the cache clears. Cache Components make this five lines.

  • Tag every page query with page:{slug} and a global cms:all fallback.
  • Set cacheLife('days') since the webhook handles freshness.
  • Webhook calls revalidateTag('page:{slug}') on publish and revalidateTag('cms:all') on schema migration.
  • Authoring teams see updates within seconds; visitors get edge-cached responses the rest of the time.

Pattern 2: Pricing Page with Geo-Aware Tiers

Price the same product differently by region without losing the static-shell speed. Cache the tier table per region, render the geo-detected wrapper dynamically inside a Suspense boundary.

Geo-aware pricing with cached per-region tiers:

import { Suspense } from 'react'
import { headers } from 'next/headers'
import { cacheLife, cacheTag } from 'next/cache'
import { db } from '@/lib/db'

async function PricingTiers({ region }: { region: string }) {
  'use cache'
  cacheLife('days')
  cacheTag(`pricing:${region}`)
  const tiers = await db.pricing.findMany({ where: { region } })
  return <TierGrid tiers={tiers} />
}

async function GeoPricing() {
  const country = (await headers()).get('x-vercel-ip-country') ?? 'us'
  const region = country === 'GB' ? 'eu' : country.toLowerCase()
  return <PricingTiers region={region} />
}

export default function PricingPage() {
  return (
    <Suspense fallback={<PricingSkeleton />}>
      <GeoPricing />
    </Suspense>
  )
}

Pattern 3: Blog Index with Featured Filtering

Blog listings have a long-tail problem: the index page is hit constantly, but the underlying data only changes when posts publish. Cache the listing query, tag it with the collection name, invalidate via webhook on every publish.

Pattern 4: Static Hero with Dynamic Personalization Hole

The marketing pattern Cache Components were built for. The hero, navigation, testimonials, and footer cache for hours. A single dynamic hole renders a logged-in CTA, an in-trial banner, or a regional offer, streamed through a Suspense boundary. Comparing Next.js vs Astro for marketing sites usually comes down to whether you need this dynamic-hole flexibility—Cache Components make Next.js the cleaner answer when you do.

Marketing Page Architecture: Cached vs. Streamed Regions

  • Navigation (cached, hours)
  • Hero (cached, hours) — LCP candidate
  • Personalized CTA (dynamic, streamed)
  • Features (cached, days)
  • Pricing (geo-cached, days, tag: pricing:{region})
  • Recent Signups (dynamic, streamed)
  • Footer (cached, weeks)

Gotchas to Watch For After Deploy

Cache Components are clean in theory and mostly clean in practice. The few patterns that bite show up after deploy, not in development—local dev rebuilds bypass most of the caching infrastructure.

Closures Over External State

Cache keys hash function arguments, not closure variables. If a cached function reads from a module-level variable, the variable's value at first call is what gets baked into the cached output—forever, until cacheLife evicts. Pass everything you need as arguments.

Dynamic APIs Inside Cached Functions

cookies(), headers(), and the searchParams page prop are dynamic by definition. Reading them inside a "use cache" scope throws a build error. If you need the value, read it in a non-cached parent and pass it in as an argument.

Tag Explosion

It's tempting to tag every cached value with five tags. Don't. Each tag costs lookup time on invalidation, and the platform CDN has soft limits on tags per cache entry (Vercel's limit is documented at 64 tags per entry as of April 2026). Tag at most three layers: per-entity, per-collection, type-wide.

Revalidate Storms

A popular page with a short cacheLife and high traffic can trigger thousands of simultaneous revalidations the moment its window expires. Use the stale window aggressively—stale-while-revalidating means one request triggers the refresh while everyone else gets the cached copy. cacheLife profiles ship with sensible stale windows; resist shrinking them.

When to Cache (and When Not To)

Caching has a cost: cache key computation, storage, invalidation logic, and the cognitive load of reasoning about freshness. Not every function deserves "use cache". Here's the decision rubric we use on Verlua client builds.

Cache It If:

  • The function does work that's expensive (database query, external API, heavy computation)
  • The result is the same for many requests (not user-specific)
  • Acceptable staleness window is at least 1 minute
  • You can invalidate via tag when the underlying data changes

Don't Cache If:

  • The function is fast (in-memory transformation, simple math)
  • The result is per-user (auth state, personalization, cart)
  • Strict real-time accuracy is required (live counters, financial totals)
  • You can't reliably invalidate when data changes

The default stance: cache the data layer aggressively, leave UI components uncached unless they're demonstrably slow. Most marketing sites get 80% of the wins from caching the CMS query layer alone.

7-Day Cache Components Adoption Plan

Day 1–2: Inventory and Upgrade

  • Upgrade to Next.js 16 (run the codemod for async request APIs)
  • Inventory every unstable_cache call site and document the tags/revalidate options
  • Identify CMS webhook entry points for invalidation

Day 3–4: Migrate Data Layer

  • Convert unstable_cache wrappers to "use cache" directives
  • Apply cacheLife profiles by content cadence (hours for marketing, days for blog, weeks for docs)
  • Add cacheTag calls aligned to webhook invalidation events
  • Test webhook invalidation in preview deploys

Day 5–7: Enable PPR and Measure

  • Enable Cache Components with cacheComponents: true in next.config.js (Next.js 16 folds PPR into this flag, superseding experimental.ppr and per-route experimental_ppr)
  • Wrap dynamic regions in Suspense boundaries with skeleton fallbacks
  • Compare TTFB and LCP before/after with Vercel Speed Insights or PageSpeed Insights
  • Document cache hit rates in your RUM tool and tune cacheLife where hits are low

If you're still on the App Router's implicit fetch caching, or you're reconsidering your stack entirely, the broader question is React vs Next.js: whether the full framework fits the job at all. For data-heavy marketing sites with complex personalization, Next.js 16 with Cache Components is the strongest pick on the market right now.

Frequently Asked Questions

What are Next.js Cache Components?

Next.js Cache Components are an opt-in caching model introduced in Next.js 15 and stabilized in Next.js 16 that lets you mark specific functions, components, or routes as cacheable using the "use cache" directive. Cached values are stored in the Data Cache and reused across requests, while uncached parts of the page render dynamically. Combined with Partial Prerendering (PPR), Cache Components let you ship a static shell with dynamic holes—the homepage hero, navigation, and footer cache for hours, while the logged-in user badge and personalized recommendations stream in fresh on every request.

How does the "use cache" directive work?

The "use cache" directive is a string literal you place at the top of an async function, component, or file (similar to "use client" or "use server"). When Next.js sees it, the function becomes a cache boundary: its arguments form the cache key, its return value is stored in the Data Cache, and subsequent calls with the same arguments return the cached value without re-executing. You can scope the directive at three levels—file ("use cache" at the top of a file caches every export), function ("use cache" inside a function), or component ("use cache" inside an async Server Component). Cache lifetimes default to one year revalidation but can be tuned with cacheLife.

What is cacheLife and how do I use it?

cacheLife is a Next.js function that controls how long a cached value stays fresh before revalidation, and how long it can be served as stale-while-revalidating. You import it from "next/cache" and call it inside any function marked "use cache" with one of the built-in profiles (seconds, minutes, hours, days, weeks, max) or a custom object specifying stale, revalidate, and expire windows. For a marketing homepage, cacheLife("hours") is a reasonable default—it means CDN edges serve a cached response for an hour, then revalidate in the background while still showing the stale copy to the next visitor.

When should I use cacheTag and updateTag?

Use cacheTag to label a cached value with a string identifier, then invalidate every cached value carrying that tag: call updateTag from a Server Action (it expires and refreshes the entry immediately, giving read-your-own-writes semantics) or revalidateTag from a route handler such as a CMS webhook. The pattern is purpose-built for content-driven sites: tag a blog post cache with the post slug, tag a product cache with the SKU, then when a writer updates the CMS, your webhook fires and the next request rebuilds.

How do Cache Components differ from unstable_cache?

unstable_cache was the App Router's original caching primitive—a wrapper function that took a fetcher, options object, and tags. Cache Components replace it with a directive-based API ("use cache") that integrates more cleanly with Server Components, supports component-level caching (not just data fetching), and uses cacheLife and cacheTag as standalone functions instead of options objects. Migration is mostly mechanical: replace the unstable_cache wrapper with "use cache" at the top of the function, replace tags: ["x"] with cacheTag("x"), and replace revalidate: 3600 with cacheLife({ revalidate: 3600 }).

Does Cache Components require Partial Prerendering?

No, but the two features are designed to work together. Cache Components work in any Next.js 16 app—you can mark a single data-fetching function as "use cache" and ship that improvement without touching anything else. Partial Prerendering (PPR) is the rendering model that pairs cached static parts with streamed dynamic parts on the same page. PPR uses Cache Components to identify what can be prerendered. If you only need data caching, use Cache Components alone. If you want a static shell with dynamic holes for personalization, enable the cacheComponents flag in next.config.js, which brings PPR along, and use the two together.

Need Help Migrating to Next.js 16?

Verlua ships Next.js 16 marketing sites with Cache Components, PPR, and tag-based CMS invalidation built in from day one. We migrate App Router projects from unstable_cache to the new directive-based model and tune cacheLife profiles per surface.

Get a Free Next.js Audit
Mark Shvaya

Founder & Technical Director

Mark Shvaya runs Verlua, a web design and development studio in Sacramento. He builds conversion-focused websites for service businesses, e-commerce brands, and SaaS companies.


