A few weeks ago I looked at my network tab and winced: my SPA's main bundle had grown to several megabytes. Every route eagerly shipped the entire application, even though most users only touched a couple of pages. Caching wasn't helping much either — one change in shared code invalidated the whole thing. That was the nudge I needed to refactor the app's loading strategy.

This post walks through the exact changes I made: moving page components to React.lazy(), wrapping routes in <Suspense> with a meaningful fallback, and teaching Vite to split vendors by type via manualChunks. Along the way I'll touch on progressive UX (showing loading states that don't feel janky), opportunistic prefetch on intent, error boundaries, and how I measured the before/after.

TL;DR: The main bundle shrank by roughly 95%, the home page script weight dropped by ~50%, subsequent routes load their code only when you actually visit them, and Lighthouse performance scores moved up in a way users can feel.

What "Before" Looked Like

My app followed a familiar pattern: a single App with a router and all top-level pages imported directly. It was simple and it worked, but it meant the bundler dutifully included everything in the initial payload. First visits on mid-range mobiles were especially painful; the device would spend precious seconds parsing and compiling code that might never run for that session. Because most code lived in the same big file, caching provided little relief — small changes blew away the entire cached asset.

  • Before: multi‑MB bundle, every route downloads everything, weak caching, long main‑thread parse/compile.

The Strategy

I pursued two coordinated changes:

  1. I pushed all route‑level views behind dynamic imports using React.lazy. React's <Suspense> boundary provides a loading state while the chunk downloads and initializes. That alone cuts the initial JavaScript to "what's needed for the first view" and defers everything else to the moment it's actually required.
  2. I guided Vite (Rollup under the hood) with a more intentional manualChunks configuration. Rather than accepting a single "vendor" blob, I separated React + ReactDOM, Framer Motion, icons, and other vendors into their own persistent chunks. Libraries like React change rarely; giving them a stable, independent cache line pays dividends across deployments.

Refactoring Routes with React.lazy and Suspense

I started by converting each page import into a lazy import. The router remains tidy; only the import expressions change.

// AppRouter.tsx
import { Suspense, lazy } from "react";
import { createBrowserRouter, RouterProvider } from "react-router-dom";

const HomePage = lazy(() => import("./pages/HomePage"));
const DashboardPage = lazy(() => import("./pages/DashboardPage"));
const ReportsPage = lazy(() => import("./pages/ReportsPage"));
const SettingsPage = lazy(() => import("./pages/SettingsPage"));

function Loading() {
  return (
    <div className="p-6 text-sm opacity-70" role="status" aria-live="polite">
      Loading…
    </div>
  );
}

const router = createBrowserRouter([
  { path: "/", element: <Suspense fallback={<Loading />}><HomePage /></Suspense> },
  { path: "/dashboard", element: <Suspense fallback={<Loading />}><DashboardPage /></Suspense> },
  { path: "/reports", element: <Suspense fallback={<Loading />}><ReportsPage /></Suspense> },
  { path: "/settings", element: <Suspense fallback={<Loading />}><SettingsPage /></Suspense> },
]);

export default function AppRouter() {
  return <RouterProvider router={router} />;
}

If you prefer nested layouts, you can put a single <Suspense> around the layout's outlet so child routes automatically inherit the fallback. I kept a small, consistent loading UI — readable text or a skeleton — because a subtle, honest indicator is less frustrating than a flashy spinner. The important bit is to avoid layout shift: a skeleton that matches the final structure helps keep the page steady as content arrives.
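
If you go that route, a minimal sketch looks like the following (the Layout component and router shape here are illustrative, not my app's exact structure):

// Layout.tsx: one Suspense boundary covers every child route rendered into the outlet
import { Suspense } from "react";
import { Outlet } from "react-router-dom";

export default function Layout() {
  return (
    <main>
      {/* shared nav/header renders here, outside the boundary, so it never flickers */}
      <Suspense fallback={<div className="p-6 text-sm opacity-70" role="status" aria-live="polite">Loading…</div>}>
        <Outlet />
      </Suspense>
    </main>
  );
}

// Router shape (sketch):
// { path: "/", element: <Layout />, children: [
//   { index: true, element: <HomePage /> },
//   { path: "dashboard", element: <DashboardPage /> },
// ] }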

Splitting Vendors Intentionally with Vite's manualChunks

Vite already splits code, but in my case I wanted predictable, cache‑friendly boundaries. I created separate chunks for:

  • react-vendor: react, react-dom
  • motion: framer-motion
  • icons: lucide-react (or your icon library)
  • vendor: everything else in node_modules

That looks like this in vite.config.ts:

// vite.config.ts
import { defineConfig } from "vite";
import react from "@vitejs/plugin-react";

export default defineConfig({
  plugins: [react()],
  build: {
    sourcemap: true,
    modulePreload: { polyfill: false },
    rollupOptions: {
      output: {
        entryFileNames: "assets/[name]-[hash].js",
        chunkFileNames: "assets/[name]-[hash].js",
        assetFileNames: "assets/[name]-[hash][extname]",
        manualChunks(id) {
          // Only split third-party code; app modules keep Rollup's default chunking.
          if (!id.includes("node_modules")) return;
          // Check the more specific packages first: a bare includes("react") would also
          // capture react-router-dom, react-icons, and lucide-react.
          if (id.includes("framer-motion")) return "motion";
          if (id.includes("lucide-react") || id.includes("@iconify") || id.includes("react-icons")) return "icons";
          if (id.includes("/react-dom/") || id.includes("/react/")) return "react-vendor";
          return "vendor";
        },
      },
    },
    chunkSizeWarningLimit: 700,
    target: "es2020",
    cssTarget: "chrome100",
  },
});

This makes a huge difference on repeat visits. When you deploy, only the chunks that actually changed receive a new hash. Users often get instant vendor cache hits, while new page code streams in on demand. It also helps parallelize downloads: the browser can grab multiple small chunks rather than one monolith.

Making Navigation Feel Instant (Prefetch on Intent)

Once the basic lazy loading worked, I added a small nicety: prefetch the next route's chunk when the user shows intent, like hovering or focusing a link. It's just a dynamic import that fires early; by the time the user clicks, the chunk is already in the HTTP cache.

// PrefetchLink.tsx
import React from "react";
import { Link } from "react-router-dom";

export function PrefetchLink(props: { to: string; label: string; loader?: () => Promise<unknown> }) {
  const { to, label, loader } = props;
  // Dynamic imports are cached by the module graph, so firing this on every hover/focus is cheap.
  const warm = () => loader?.();
  return <Link to={to} onMouseEnter={warm} onFocus={warm}>{label}</Link>;
}

// Usage
const lazyReports = () => import("./pages/ReportsPage");
const ReportsPage = React.lazy(lazyReports);

// somewhere in nav:
<PrefetchLink to="/reports" label="Reports" loader={lazyReports} />;

This tiny addition preserves the benefits of lazy loading, yet navigation feels snappier because the waiting happens "offstage."

Guardrails: Error Boundaries with Suspense

<Suspense> is for loading, not errors. To prevent a broken lazy chunk from taking down the tree, wrap route elements with a minimal error boundary:

import { Component, ReactNode } from "react";

// A minimal boundary: any render error below it (including a lazy chunk that fails to load) flips hasError.
class RouteErrorBoundary extends Component<{ children: ReactNode }, { hasError: boolean }> {
  state = { hasError: false };
  static getDerivedStateFromError() { return { hasError: true }; }
  render() {
    if (this.state.hasError) return <div className="p-6">Something went wrong.</div>;
    return this.props.children;
  }
}
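
To wire it in, nest the boundary around the Suspense wrapper on each route element. A minimal sketch, assuming the Loading component from AppRouter.tsx is importable here:

// Same file (sketch): combine both boundaries so route definitions stay terse.
import { Suspense } from "react";

export function RouteBoundary({ children }: { children: ReactNode }) {
  return (
    <RouteErrorBoundary>
      <Suspense fallback={<Loading />}>{children}</Suspense>
    </RouteErrorBoundary>
  );
}

// In the router:
// { path: "/reports", element: <RouteBoundary><ReportsPage /></RouteBoundary> }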

Pairing error boundaries with suspense boundaries gives you resilient, incremental UI: "loading" while chunks arrive, graceful "something went wrong" if they don't.

What Changed for Users

On the first visit, the browser now downloads a much smaller entry along with a few vendor chunks. The page component itself is a route chunk that only loads when you navigate there. Because React and ReactDOM live in react-vendor, they're highly cacheable and seldom change. Heavier libraries such as Framer Motion and icons don't punish routes that never animate or render icons.

📊 Performance Improvements

  • Main bundle trimmed by ~95%
  • Home page resource size cut by ~50%
  • Desktop Performance: 70–78 → 80–88 (+10–15 points)
  • Mobile Performance: 55–65 → 68–76 (+13–18 points)
  • FCP improved by 0.4–1.0s
  • LCP improved by 0.4–1.0s

These ranges are realistic for a medium SPA that previously shipped everything up front. Your exact gains will vary with device/network and how much you can truly defer.

A Few Lessons I Learned

One surprise was how easy it is to accidentally defeat code‑splitting by importing a "convenient" helper that in turn drags in half a UI kit. Keep an eye on shared utilities and audit transitive imports. For icons, import only what you use or keep the icon library as its own chunk so it's not on the critical path. Try to keep global providers slim; if your theme or data layer initializes a lot of work at the root, splitting routes won't help as much.
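
A hedged illustration of the trap (lodash is just a stand-in for any large utility library; the helper file is hypothetical):

// utils/timing.ts (hypothetical)
// import _ from "lodash";             // ships the entire library for one function
// export const debounce = _.debounce;

// Importing just the module keeps the helper tiny and leaves code-splitting intact:
import debounce from "lodash/debounce";

export { debounce };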

On the caching front, set far‑future headers for hashed static assets (e.g., Cache-Control: public, max-age=31536000, immutable), and keep index.html on a short TTL so it can point to new chunk names after a deploy. And measure under consistent conditions — throttle in DevTools or use WebPageTest so you compare apples to apples.
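
A hedged sketch of that header split, assuming a small Express server fronts the dist/ output (the same rules apply if a CDN or static host serves the files; port and paths are assumptions):

// server.ts (illustrative only)
import express from "express";
import path from "node:path";

const app = express();
const dist = path.resolve("dist");

// Hashed assets are safe to cache "forever": a new deploy produces new file names.
app.use(
  "/assets",
  express.static(path.join(dist, "assets"), { immutable: true, maxAge: "1y" })
);

// index.html (and any other non-asset path) stays revalidated so it can point at new chunk hashes.
app.use((_req, res) => {
  res.setHeader("Cache-Control", "no-cache");
  res.sendFile(path.join(dist, "index.html"));
});

app.listen(3000);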

Measuring the Impact

My checklist was straightforward:

  1. Record baseline bundle sizes (vite build) and Lighthouse scores on both desktop and mobile profiles.
  2. Convert routes to React.lazy and wrap with <Suspense>.
  3. Add the manualChunks configuration and rebuild.
  4. Re-run Lighthouse and note FCP/LCP/TTI differences.
  5. Inspect the Network panel: initial view should show a small entry plus a few vendor chunks; navigating to a new route should fetch a concise route chunk, not the world.
  6. Repeat visits should show vendor chunk cache hits.

If your numbers aren't moving, look for eager imports, heavy work in root providers, or large CSS and font payloads that aren't being deferred.

Wrapping Up

This refactor was a reminder that performance is often about doing less, later. With a day's work, I turned a monolithic SPA into a set of small, cacheable pieces that arrive when needed. React.lazy and <Suspense> gave me the declarative ergonomics I wanted; Vite's manualChunks gave me the control the cache needed. The UX now feels immediate, especially on mobile, and future deployments benefit from vendor chunk reuse.

If you're sitting on a SPA whose bundle quietly crept into the multi‑megabyte range, this is a highly leveraged change. Start with one route, feel the difference, then roll it across the app.
