How I Handle Sanity Draft Mode Without Sacrificing Edge Performance
Apr 27, 2026 · 4 min read
The problem with naive draft implementations
Most Sanity + Next.js tutorials show you draftMode() from next/headers and call it a day. You check the cookie, swap your GROQ query to include drafts, and suddenly every page request hits the Sanity CDN at runtime. Your published pages lose edge caching. Your TTFB jumps from 80ms to 350ms because you're waiting on a CDN round-trip for every visitor.
I've shipped six client projects where editors needed live preview but the public site had to stay fast. Here's the pattern I use to keep published pages on the edge while draft mode works for authenticated users.
Separate data-fetching functions by draft state
Instead of one function that branches on draftMode().isEnabled, I write two:
// app/lib/sanity/queries.ts
import { client } from './client';
import { draftMode } from 'next/headers';
const PUBLISHED_QUERY = `*[_type == "post" && slug.current == $slug][0] {
  title, body, publishedAt, "image": mainImage.asset->url
}`;

const DRAFT_QUERY = `*[_type == "post" && slug.current == $slug] | order(_updatedAt desc) [0] {
  title, body, publishedAt, "image": mainImage.asset->url, _updatedAt
}`;
export async function getPublishedPost(slug: string) {
  return client.fetch(PUBLISHED_QUERY, { slug }, {
    next: { revalidate: 3600, tags: [`post:${slug}`] }
  });
}
export async function getDraftPost(slug: string) {
  return client.fetch(DRAFT_QUERY, { slug }, {
    perspective: 'previewDrafts',
    useCdn: false,
    // previewDrafts requires an authenticated read token
    // (env var name is a convention, adjust to your setup)
    token: process.env.SANITY_API_READ_TOKEN,
    // No revalidate window: drafts are always fetched fresh
    cache: 'no-store',
  });
}

The published function uses ISR with a one-hour window and on-demand tags. The draft function hits the Sanity API directly: no CDN, with perspective: 'previewDrafts' to see unpublished changes.
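For completeness, the client module both functions import might look like this. This is a minimal sketch, not from the original post; the env var names and API version are assumptions:

```typescript
// app/lib/sanity/client.ts (sketch; env var names are assumptions)
import { createClient } from 'next-sanity';

export const client = createClient({
  projectId: process.env.NEXT_PUBLIC_SANITY_PROJECT_ID!,
  dataset: process.env.NEXT_PUBLIC_SANITY_DATASET ?? 'production',
  apiVersion: '2024-01-01',
  // CDN for published reads; getDraftPost overrides this per request
  useCdn: true,
});
```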
Now in your page component:
// app/posts/[slug]/page.tsx
import { draftMode } from 'next/headers';
import { getPublishedPost, getDraftPost } from '@/lib/sanity/queries';
export default async function PostPage({ params }: { params: { slug: string } }) {
  const draft = draftMode().isEnabled;
  const post = draft
    ? await getDraftPost(params.slug)
    : await getPublishedPost(params.slug);
  if (!post) return <div>Not found</div>;
  return (
    <article>
      <h1>{post.title}</h1>
      {draft && <div className="draft-banner">Draft mode active</div>}
      {/* render body */}
    </article>
  );
}

Toggling draft mode from Sanity Studio
I add a "Preview" button in the document action bar. In sanity.config.ts:
// sanity.config.ts
import { defineConfig } from 'sanity';
import { structureTool } from 'sanity/structure';
import { visionTool } from '@sanity/vision';
import schemas from './schemas';
export default defineConfig({
  projectId: 'abc123',
  dataset: 'production',
  plugins: [
    structureTool(),
    visionTool(),
  ],
  schema: { types: schemas },
  document: {
    actions: (prev, { schemaType }) => {
      if (schemaType !== 'post') return prev;
      return [
        ...prev,
        // A document action is a component function; it receives the
        // draft and published versions of the document as props
        (props) => ({
          label: 'Preview',
          onHandle: () => {
            const slug = (props.draft ?? props.published)?.slug?.current;
            if (!slug) return;
            // The Studio runs in the browser, so both values must be
            // exposed to the client bundle, and the secret is visible
            // to anyone who can load the Studio
            const previewUrl = `${process.env.NEXT_PUBLIC_SITE_URL}/api/draft?secret=${process.env.SANITY_PREVIEW_SECRET}&slug=${slug}`;
            window.open(previewUrl, '_blank');
            props.onComplete();
          },
        }),
      ];
    },
  },
});

The route handler at app/api/draft/route.ts:
// app/api/draft/route.ts
import { draftMode } from 'next/headers';
import { redirect } from 'next/navigation';
import { NextRequest } from 'next/server';
export async function GET(req: NextRequest) {
  const secret = req.nextUrl.searchParams.get('secret');
  const slug = req.nextUrl.searchParams.get('slug');
  if (secret !== process.env.SANITY_PREVIEW_SECRET || !slug) {
    return new Response('Invalid request', { status: 401 });
  }
  draftMode().enable();
  redirect(`/posts/${slug}`);
}

Now editors click "Preview" in Studio, the cookie is set, and they see draft content. Public visitors never trigger that code path.
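The post above only shows how to enter preview; editors also need a way out. A sketch of a companion route, with the path and redirect target being my choice rather than the author's:

```typescript
// app/api/disable-draft/route.ts (hypothetical companion route)
import { draftMode } from 'next/headers';
import { redirect } from 'next/navigation';

export async function GET() {
  // Clears the draft-mode cookie so the editor sees published content again
  draftMode().disable();
  redirect('/');
}
```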
Why this keeps edge performance intact
Published requests hit the edge function, read from Next.js Data Cache (or Vercel's edge cache if you're on their platform), never wait on Sanity. Draft requests bypass all caching, but only editors see them. You measure your P95 TTFB on production traffic, not internal previews.
I've seen projects where devs enabled draft mode globally during development and forgot to turn it off. Suddenly every page became dynamic. This pattern makes the split explicit in code.
On-demand revalidation from webhooks
When an editor publishes a post, I revalidate the specific tag:
// app/api/revalidate/route.ts
import { revalidateTag } from 'next/cache';
import { NextRequest } from 'next/server';
export async function POST(req: NextRequest) {
  // In production, verify the webhook signature before trusting the
  // payload (e.g. isValidSignature from the @sanity/webhook package)
  const body = await req.json();
  const slug = body?.slug;
  if (!slug) return new Response('Missing slug', { status: 400 });
  revalidateTag(`post:${slug}`);
  return new Response('Revalidated', { status: 200 });
}

A Sanity webhook (configured in project settings) hits this endpoint on publish. The edge cache updates within seconds, and editors never need to purge manually.
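For the webhook to send a slug, its GROQ projection has to include one. A sketch of the webhook settings in the Sanity project dashboard, with illustrative values:

```
URL:        https://your-site.example/api/revalidate
Trigger on: create, update, delete
Filter:     _type == "post"
Projection: { "slug": slug.current }
```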
Where this falls short
If you have deeply nested Sanity references (post → author → author's recent posts), the draft query can get expensive. I've had queries take 800ms because we expanded four levels of references. In those cases I either simplify the preview (show fewer fields) or accept the cost — it's only for editors.
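One way to cheapen that path is a separate, shallower query used only for preview. A hypothetical example; the projection fields beyond the original post's query are made up:

```typescript
// A trimmed projection for preview only: expand references one level
// deep and skip anything the editor doesn't need to see live
const DRAFT_QUERY_LITE = `*[_type == "post" && slug.current == $slug]
  | order(_updatedAt desc) [0] {
  title, body, _updatedAt,
  "authorName": author->name
}`;
```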
Also, if your site has user-generated content (comments, likes) that changes constantly, ISR might not fit. But for marketing sites, docs, blogs — this pattern has saved me from re-architecting every time a client asks for live preview.