How I Shaved 140 kB Off a Next.js Bundle by Lazy-Loading Sanity Portable Text
Apr 27, 2026 · 5 min read
Most Sanity projects ship the entire @portabletext/react serializer tree to the client, even when only a homepage hero uses rich text. On a recent agency project, a single blog post detail page loaded 140 kB of JavaScript just to render headings, links, and a custom YouTube embed block. First Contentful Paint sat at 1.8 s on 3G. The client wanted sub-1-second FCP and a Lighthouse performance score above 95.
The Problem: Portable Text Serializers Are Heavy
Portable Text is Sanity's block content format. You define custom serializers for marks, blocks, and inline objects. The official React package works beautifully, but it bundles every serializer—even unused ones—into your client JavaScript if you import it in a client component.
In my case, the schema included a content field with headings, lists, links, images, and a custom youtubeEmbed block. The serializer map looked like this:
// app/blog/[slug]/page.tsx (initial, bad)
import { PortableText } from '@portabletext/react';
import { YouTubeEmbed } from '@/components/YouTubeEmbed';
import { SanityImage } from '@/components/SanityImage';

const components = {
  types: {
    image: SanityImage,
    youtubeEmbed: YouTubeEmbed,
  },
  marks: {
    link: ({ value, children }: any) => (
      <a href={value.href} className="underline">{children}</a>
    ),
  },
};

export default async function BlogPost({ params }: { params: { slug: string } }) {
  const post = await sanityFetch({ query: POST_QUERY, params });
  return (
    <article>
      <h1>{post.title}</h1>
      <PortableText value={post.content} components={components} />
    </article>
  );
}

This page was a server component, but PortableText and my custom serializers pulled in client-side dependencies: react-player for YouTube embeds, next/image (already tree-shaken, but still), and the full Portable Text runtime. Vercel's bundle analyzer showed 142 kB uncompressed in the client chunk.
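If you want to reproduce that kind of audit yourself, the official @next/bundle-analyzer package wraps the Next config. A minimal setup might look like this (the ANALYZE env flag is just a common convention, not a requirement):

```javascript
// next.config.mjs
import bundleAnalyzer from '@next/bundle-analyzer';

// Only run the analyzer when explicitly requested, e.g.
//   ANALYZE=true next build
const withBundleAnalyzer = bundleAnalyzer({
  enabled: process.env.ANALYZE === 'true',
});

export default withBundleAnalyzer({
  // your existing Next.js config goes here
});
```

Running `ANALYZE=true next build` then produces per-chunk treemap reports for the client and server bundles, which is where numbers like the 142 kB above come from.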
Step One: Move Portable Text Into a Client Boundary
Next.js App Router lets you isolate client JavaScript. I extracted the PortableText call into a separate client component and kept the data-fetching server component lean.
// app/blog/[slug]/PortableTextRenderer.tsx
'use client';
import { PortableText } from '@portabletext/react';
import dynamic from 'next/dynamic';

// Both components are named exports, so the dynamic() loader has to
// pick them off the module namespace.
const YouTubeEmbed = dynamic(
  () => import('@/components/YouTubeEmbed').then((mod) => mod.YouTubeEmbed),
  { ssr: false }
);
const SanityImage = dynamic(
  () => import('@/components/SanityImage').then((mod) => mod.SanityImage)
);

const components = {
  types: {
    image: SanityImage,
    youtubeEmbed: YouTubeEmbed,
  },
  marks: {
    link: ({ value, children }: any) => (
      <a href={value.href} className="underline">{children}</a>
    ),
  },
};

export function PortableTextRenderer({ value }: { value: any }) {
  return <PortableText value={value} components={components} />;
}

Now the server component imports only the client boundary:
// app/blog/[slug]/page.tsx
import { PortableTextRenderer } from './PortableTextRenderer';

export default async function BlogPost({ params }: { params: { slug: string } }) {
  const post = await sanityFetch({ query: POST_QUERY, params });
  return (
    <article>
      <h1>{post.title}</h1>
      <PortableTextRenderer value={post.content} />
    </article>
  );
}

This alone saved 18 kB by letting Next.js code-split the renderer into a separate chunk loaded only when the user navigates to a blog post.
Step Two: Lazy-Load the Entire Renderer Below the Fold
Most blog posts have a hero, metadata, and a share bar before the article body; the Portable Text block starts 600–800 pixels down the page. I wrapped the renderer in a dynamic() import with ssr: false, so its chunk loads and renders on the client only after the initial paint.
// app/blog/[slug]/page.tsx
import dynamic from 'next/dynamic';

const PortableTextRenderer = dynamic(
  () => import('./PortableTextRenderer').then((mod) => mod.PortableTextRenderer),
  { ssr: false }
);

export default async function BlogPost({ params }: { params: { slug: string } }) {
  const post = await sanityFetch({ query: POST_QUERY, params });
  return (
    <article>
      <header className="mb-12">
        <h1>{post.title}</h1>
        <time>{post.publishedAt}</time>
      </header>
      <PortableTextRenderer value={post.content} />
    </article>
  );
}

Bundle size dropped to 74 kB. FCP improved to 0.9 s on 3G. The trade-off: users see a blank space for 100–200 ms while the chunk loads. I added a skeleton loader inside a <Suspense> boundary to smooth the transition. One caveat: recent Next.js releases reject `ssr: false` inside Server Components, so on newer versions the dynamic() call has to move into a thin Client Component wrapper instead of living in page.tsx.
Step Three: Render Plain Blocks on the Server
For posts with no custom blocks—just headings, paragraphs, and links—I wrote a lightweight server-side serializer that outputs plain HTML. I check the Portable Text array for custom types in the RSC, then conditionally render.
// lib/hasCustomBlocks.ts
export function hasCustomBlocks(value: any[]): boolean {
  // Extend this list as the schema grows, or flip the check to
  // block._type !== 'block' to catch any future custom type.
  return value.some(
    (block) => block._type === 'youtubeEmbed' || block._type === 'image'
  );
}

// app/blog/[slug]/page.tsx
import { PortableTextRenderer } from './PortableTextRenderer';
import { hasCustomBlocks } from '@/lib/hasCustomBlocks';
import { renderPlainPortableText } from '@/lib/renderPlainPortableText';

export default async function BlogPost({ params }: { params: { slug: string } }) {
  const post = await sanityFetch({ query: POST_QUERY, params });
  const usesCustomBlocks = hasCustomBlocks(post.content);
  return (
    <article>
      <h1>{post.title}</h1>
      {usesCustomBlocks ? (
        <PortableTextRenderer value={post.content} />
      ) : (
        <div dangerouslySetInnerHTML={{ __html: renderPlainPortableText(post.content) }} />
      )}
    </article>
  );
}

The renderPlainPortableText function is a 40-line pure function that maps blocks to HTML strings. No client JavaScript needed. Posts without embeds now ship 12 kB of hydration JS instead of 74 kB.
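As a rough illustration, a minimal version of such a serializer could look like the sketch below. The type shapes, the escapeHtml helper, and the handled styles and marks are my assumptions; a real schema would also need list handling and more decorators:

```typescript
// lib/renderPlainPortableText.ts (illustrative sketch, not the author's exact code)
type PortableTextSpan = { _type: 'span'; text: string; marks?: string[] };
type PortableTextBlock = {
  _type: 'block';
  style?: string;
  markDefs?: { _key: string; _type: string; href?: string }[];
  children: PortableTextSpan[];
};

const TAG_FOR_STYLE: Record<string, string> = {
  normal: 'p',
  h2: 'h2',
  h3: 'h3',
  blockquote: 'blockquote',
};

function escapeHtml(text: string): string {
  return text
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;');
}

function renderSpan(span: PortableTextSpan, block: PortableTextBlock): string {
  let html = escapeHtml(span.text);
  for (const mark of span.marks ?? []) {
    if (mark === 'strong') html = `<strong>${html}</strong>`;
    else if (mark === 'em') html = `<em>${html}</em>`;
    else {
      // Non-decorator marks reference a markDef by _key (e.g. links).
      const def = block.markDefs?.find((d) => d._key === mark);
      if (def?._type === 'link' && def.href) {
        html = `<a href="${escapeHtml(def.href)}">${html}</a>`;
      }
    }
  }
  return html;
}

export function renderPlainPortableText(blocks: PortableTextBlock[]): string {
  return blocks
    .map((block) => {
      const tag = TAG_FOR_STYLE[block.style ?? 'normal'] ?? 'p';
      const inner = block.children
        .map((span) => renderSpan(span, block))
        .join('');
      return `<${tag}>${inner}</${tag}>`;
    })
    .join('\n');
}
```

Because the output lands in dangerouslySetInnerHTML, escaping every span's text (and the link href) before interpolation is the part you cannot skip.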
Results and Trade-Offs
After all three steps, the median blog post bundle dropped from 142 kB to 12 kB. Posts with embeds load 74 kB. Lighthouse performance score went from 82 to 97. LCP improved by 400 ms.
The downside: increased complexity. I now maintain two rendering paths and a custom server-side serializer. For teams that frequently add new Portable Text block types, this can become a maintenance burden. But for marketing sites with stable content schemas, the performance gain is worth it.
If you're shipping Sanity Portable Text in a Next.js app and your client bundles are over 100 kB, audit your serializers. Move them into client boundaries, lazy-load below the fold, and consider server-rendering simple blocks. The gains compound when you serve millions of page views.