Attempt with TanStack Start and Material UI
TanStack, known for its exceptional utilities like React Table, Query, and Router, has unveiled TanStack Start — a meta-framework addressing real-world performance and development challenges in full-stack React apps.

For more than five years, Next.js has been my default framework for any web development outside of work. This personal blog you are reading right now went through at least three iterations. I changed the CSS framework multiple times, I changed backend technologies—started with AWS Amplify, moved to Serverless, then SST. After staying in the AWS ecosystem for over five years, I finally wanted to give Google Firebase a shot and migrated everything over. I also kept switching the component library—Chakra UI, Tailwind, Flowbite—but Next.js never changed. It was not that I trusted it blindly. I was just comfortable with it. It was my "safe" default.
But as Next.js evolved from a simple tool into a massive ecosystem, the developer experience shifted with it. Between complex caching rules and rigid server-client boundaries, things started feeling heavier than they needed to be.
Growing larger, growing more complex
Starting with Next.js 13, things became less stable. React Server Components were introduced alongside the App Router, and the framework began changing its foundational assumptions frequently.
Suddenly, everything became "server-side by default." We entered a world of use client, use server, and use cache. The paradigm flipped entirely, bringing frequent hydration issues along the way.
We adapted to the idea that everything was cached by default in Next.js 14. Then Next.js 15 arrived with Turbopack and a completely inverted mental model: nothing is cached by default. You now have to explicitly opt in to caching behavior.
Next.js 15 made Turbopack the default build tool, moving away from Webpack. The Rust-based bundler promised 10x performance improvements, but developers report variable experiences—excelling at hot refresh but struggling with broken imports, high resource consumption, and cold starts.
It is not just problems within Next.js either. Deploying across cloud environments is its own challenge. Vercel takes it natively without issues. AWS Amplify supports it to a reasonable extent. Platforms like Render and Netlify work too, but I have hit issues with API routes and SSR on initial setup. And Firebase? It does not natively support API routes and SSR—you have to convert them into Firebase Functions. Technically every cloud provider has to come up with some workaround to make it work.
Does that make it difficult? Not for me. There was always a way to deploy anywhere. It was still my choice and I never changed it—until recently.
TanStack Start enters the arena
For every popular React library, TanStack seems to have its own alternative. I remember it started with:
- TanStack Router — as an alternative to React Router DOM
- TanStack Query (React Query) — a data-fetching and caching solution, an alternative to RTK Query and SWR
- TanStack Table — for enterprise-grade tables, and surprisingly it is backed by AG Grid itself
- TanStack Form — as an alternative to React Hook Form
- TanStack Start — the full-stack framework, an alternative to Next.js and Astro
They also shipped TanStack Store, Ranger, DB, CLI, Virtual, Pacer, and more. But I never tried them because the established alternatives worked fine for me and I preferred to stick with what I knew.
Then in the last few months, almost every Next.js and React developer I follow started promoting TanStack Start. That got my attention. I thought of giving it a real try—not just reading about it, but building something with it.
Understanding the fundamental differences
Before jumping in, I wanted to understand the core architectural differences between Next.js and TanStack Start.
Server-first vs client-first with selective SSR. In Next.js, every component is a React Server Component by default. You start on the server and explicitly opt into client-side interactivity with use client. This works well for content-heavy websites and SEO-critical pages. TanStack Start assumes you are building an interactive application. You have fine-grained control over the rendering mode via the ssr property on each route.
Routing with type safety. TanStack Router generates a routeTree.gen.ts file. If you change a route parameter, every link using that route fails at build time—not at runtime. That is a meaningful difference when your app grows.
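The idea behind compile-time route checking can be illustrated framework-free. This is a simplified sketch in plain TypeScript, not TanStack Router's actual implementation: parameter names are extracted from the path pattern at the type level, so passing the wrong params object fails to compile.

```typescript
// Extract parameter names from a path pattern at the type level,
// e.g. "/posts/$slug" yields { slug: string }.
type ParamsOf<Path extends string> =
  Path extends `${infer _Head}/$${infer Param}/${infer Rest}`
    ? { [K in Param]: string } & ParamsOf<`/${Rest}`>
    : Path extends `${infer _Head}/$${infer Param}`
      ? { [K in Param]: string }
      : {};

// Build a concrete URL; a missing or misspelled param is a compile error.
function buildPath<Path extends string>(path: Path, params: ParamsOf<Path>): string {
  return path.replace(/\$([A-Za-z]+)/g, (_, name) => (params as Record<string, string>)[name]);
}

const url = buildPath('/posts/$slug', { slug: 'tanstack-start' });
// buildPath('/posts/$slug', { id: 'x' }) would not compile.
```

TanStack Router achieves the same effect by generating the route tree as types, so every Link and navigate call is checked against real routes.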
Isomorphic loaders vs async server components. Next.js uses async server components that run exclusively on the server. If you need that data on the client for subsequent interactions, the framework has to re-fetch or stream from the server. TanStack uses isomorphic loaders—the same code runs on the server during the initial load and on the client during navigation. This avoids waterfalls and unnecessary server round-trips.
Server functions. Next.js 15 uses Server Actions, primarily designed for forms and mutations. They are POST-only and can feel tightly coupled to the render cycle. TanStack Start server functions support any HTTP method, built-in validation with Zod, and composable middleware.
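The composable-middleware idea is framework-agnostic. A minimal sketch in plain TypeScript (not TanStack Start's actual API) shows how a check like auth can wrap a handler and be reused across server functions:

```typescript
type Ctx = { userId?: string };
type Handler<In, Out> = (input: In, ctx: Ctx) => Out;
type Middleware = <In, Out>(next: Handler<In, Out>) => Handler<In, Out>;

// A middleware that rejects calls without an authenticated user.
const requireAuth: Middleware = (next) => (input, ctx) => {
  if (!ctx.userId) throw new Error('unauthorized');
  return next(input, ctx);
};

// Compose middleware right-to-left around a handler.
function compose<In, Out>(handler: Handler<In, Out>, ...mws: Middleware[]): Handler<In, Out> {
  return mws.reduceRight((acc, mw) => mw(acc), handler);
}

const getPost = compose(
  (input: { slug: string }, _ctx) => ({ title: `Post ${input.slug}` }),
  requireAuth,
);
```

In TanStack Start the equivalent chain also carries input validation (e.g. a Zod schema) before the handler runs, so the handler only ever sees typed, validated data.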
Build tooling. Next.js 15 is pushing Turbopack, which is still maturing. TanStack Start uses Vite, which has been battle-tested for years with a massive plugin ecosystem and predictable performance.
Deployment. Next.js is heavily optimized for Vercel. TanStack Start does not care where you deploy—Cloudflare Workers, Netlify, any Node.js server, your pick.
The experiment: rebuilding my blog with TanStack
After understanding the differences, I wanted a real development comparison. This time I did not want to build it from scratch myself. I asked Cursor to recreate my entire Next.js + shadcn + Firebase blog application using the TanStack ecosystem.
The migration map looked like this:
| What | Next.js Stack | TanStack Stack |
|---|---|---|
| Framework + Routing | Next.js 15 App Router | TanStack Start + TanStack Router |
| Data Fetching | Direct Firestore reads (no caching layer) | TanStack Query |
| Forms | react-hook-form | TanStack Form |
| Shared State | Raw useState | TanStack Store |
| Tables | @tanstack/react-table (already used) | @tanstack/react-table |
| UI Library | shadcn/ui + Radix + Tailwind v4 | Material UI v9 |
| Build Tool | Turbopack | Vite 7 |
Why everything TanStack? It is just an experiment. If I am trying it, I might as well try everything from their ecosystem. I will bring in TanStack DB and the rest later.
Why Material UI? I may have to work with it in the future, so exploring it alongside a new framework felt like a good opportunity.
Side-by-side: what I actually found
After building both versions, here is how they compare across the criteria that matter to me.
Bundle size
I built both projects and measured the output. Here are the actual numbers:
| Metric | Next.js 15 (Turbopack) | TanStack Start (Vite 7) |
|---|---|---|
| Client JS | ~2,903 KB (2.83 MB) | ~1,823 KB (1.78 MB) |
| Server JS | ~2,244 KB (2.19 MB) | ~335 KB (0.33 MB) |
| Build time | 29.1 seconds | 25.4 seconds (19.5s client + 5.9s server) |
| Total build output | ~378 MB (standalone mode) | Client + server assets only |
A few things worth noting. The TanStack client bundle is 37% smaller than Next.js. The server-side difference is even more dramatic—TanStack's server chunks are 85% smaller. Vite's Rollup-based production build does aggressive tree-shaking that shows up clearly in the numbers.
The Next.js total of 378 MB is inflated because output: 'standalone' bundles the entire Node.js runtime and node_modules into the build folder. That is by design for containerized deployments, so it is not an apples-to-apples comparison on total size. The client and server JS chunks are the fair comparison, and TanStack wins on both.
The TanStack build did flag one chunk at 640 KB (index-B0KLAaJs.js) as oversized—likely the MUI core bundle. That is worth splitting with dynamic imports, but for a first pass it is a solvable problem.
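One way to split an oversized vendor chunk in a Vite build is Rollup's manualChunks option. A sketch of what that could look like here, assuming the MUI core really is the culprit (untested against this project):

```typescript
// vite.config.ts (fragment): route large vendor deps into their own chunks
// so the main bundle stays under the size warning threshold.
export default {
  build: {
    rollupOptions: {
      output: {
        manualChunks(id: string) {
          if (id.includes('node_modules/@mui')) return 'mui';
          if (id.includes('node_modules/firebase')) return 'firebase';
        },
      },
    },
  },
};
```

Dynamic imports for heavy admin-only routes would have a similar effect without touching the config.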
Build time was close—29.1 seconds for Next.js versus 25.4 for TanStack. Not a massive difference, but Vite's split between client and server builds feels more transparent than Turbopack's single pass.
Routing: file-based, nested layouts, type safety
Both frameworks use file-based routing with nested layouts, but the experience is different.
| Aspect | Next.js (my blog) | TanStack (rebuilt) |
|---|---|---|
| File convention | app/ directory with page.tsx, layout.tsx | src/routes/ with file names as routes |
| Nested layouts | Route groups like (public)/layout.tsx, admin/layout.tsx | Pathless layouts like _public.tsx, admin.tsx |
| Dynamic routes | [slug], [id] folder naming | $slug, $id in file names |
| Type-safe routes | No — links are plain strings, typos are runtime errors | Yes — routeTree.gen.ts auto-generated, broken links fail at build time |
| Route groups | (public) for grouping without URL impact | _public prefix for pathless layout segments |
The type-safe routing in TanStack is a genuine improvement. In my Next.js app, a typo in a Link href is a runtime surprise. In the TanStack version, it is a compile-time error.
History, memory, and hash routers
TanStack Router ships with support for history, memory, and hash-based routing out of the box. This matters if you are building apps that need to work in non-standard environments—embedded webviews, Electron, or situations where the URL cannot change. Next.js only supports history-based routing. For my blog, this is not a dealbreaker, but for application development it is a meaningful advantage.
Data fetching and caching
This is where the difference is most visible.
My Next.js app has no caching layer at all. It reads directly from Firestore in server components. Every page request hits the database. I never added React Query or SWR because Next.js server components felt like they should handle it—but there is no client-side cache for subsequent navigations.
The TanStack version uses TanStack Query with a QueryClient wired into the router. Data fetched during server-side rendering is automatically dehydrated and available on the client. Subsequent navigations use the cache. I also get stale-while-revalidate, background refetching, and optimistic updates without any extra setup.
| Behavior | Next.js (my blog) | TanStack (rebuilt) |
|---|---|---|
| Server data on client | Re-fetch or stream from server | Dehydrated from SSR, cached on client |
| Client-side cache | None | TanStack Query with configurable stale time |
| Optimistic updates | Manual implementation needed | Built into TanStack Query mutations |
| Background refetch | Not available | Automatic with TanStack Query |
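The stale-while-revalidate behavior that TanStack Query provides can be sketched in a few lines of plain TypeScript (a simplified illustration, not the library's implementation): cached data is returned immediately, and a background refetch runs only when the entry is older than the stale time.

```typescript
type Entry<T> = { data: T; fetchedAt: number };

class SwrCache<T> {
  private entries = new Map<string, Entry<T>>();

  constructor(
    private fetcher: (key: string) => Promise<T>,
    private staleTimeMs: number,
  ) {}

  // Return cached data instantly if present; refetch in the background when stale.
  async get(key: string, now = Date.now()): Promise<T> {
    const hit = this.entries.get(key);
    if (hit) {
      if (now - hit.fetchedAt > this.staleTimeMs) {
        // Stale: kick off a background revalidation, but serve the old data now.
        void this.fetcher(key).then((data) =>
          this.entries.set(key, { data, fetchedAt: Date.now() }),
        );
      }
      return hit.data;
    }
    const data = await this.fetcher(key);
    this.entries.set(key, { data, fetchedAt: now });
    return data;
  }
}
```

TanStack Query layers refetch triggers (window focus, reconnect, intervals) and SSR dehydration on top of this same core idea.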
Optimistic UI
TanStack Query has built-in optimistic update support on mutations. You can update the UI instantly and roll back if the server rejects the change. In my Next.js app, achieving the same thing would require manual state management, custom rollback logic, and careful coordination between server actions and client state. TanStack makes it declarative.
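The pattern TanStack Query automates (apply the change immediately, snapshot the previous state, roll back on failure) can be sketched in plain TypeScript. This is an illustration of the concept, not the library's API:

```typescript
type Post = { id: string; likes: number };

async function likeOptimistically(
  state: Map<string, Post>,
  id: string,
  sendToServer: () => Promise<void>,
): Promise<void> {
  const previous = state.get(id);
  if (!previous) return;

  // 1. Optimistically update the UI state before the server responds.
  state.set(id, { ...previous, likes: previous.likes + 1 });

  try {
    await sendToServer();
  } catch {
    // 2. Server rejected the change: roll back to the snapshot.
    state.set(id, previous);
  }
}
```

In TanStack Query, the snapshot happens in onMutate, the rollback in onError, and a final refetch reconciles with the server; here it is collapsed into one function for clarity.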
Devtools
| Tool | Next.js (my blog) | TanStack (rebuilt) |
|---|---|---|
| Query devtools | Not installed | React Query Devtools (bottom-right panel) |
| Router devtools | None available | TanStack Router Devtools available (not yet enabled) |
| Linting | ESLint 9 with next/core-web-vitals | tsc --noEmit only |
The TanStack ecosystem ships its own devtools for both the query layer and the router. My Next.js app has ESLint configured but no runtime devtools. The TanStack rebuild has React Query Devtools mounted, which gives real-time visibility into cache state, query status, and refetch behavior. That alone has been useful for debugging data flow.
Server functions and middleware
Both apps have server-side logic, but the patterns differ.
My Next.js app uses Server Actions ('use server') for auth operations and Route Handlers (route.ts) for the contact API. Auth protection is handled by Edge Middleware that reads a session cookie and redirects unauthenticated users.
The TanStack app uses createServerFn for server functions and createAPIFileRoute for API endpoints. Auth protection is done in the route's beforeLoad hook—no separate middleware file. Server functions live in a dedicated src/server/functions/ directory, which feels more organized than scattering 'use server' across component files.
| Aspect | Next.js (my blog) | TanStack (rebuilt) |
|---|---|---|
| Server functions | Server Actions ('use server' in action files) | createServerFn in dedicated server/functions/ |
| API routes | app/api/*/route.ts | src/routes/api/-*.ts with createAPIFileRoute |
| Auth middleware | Edge Middleware (middleware.ts) | Route beforeLoad hook |
| Validation | Zod (in forms) | Zod (in server functions and forms) |
SSR and API routes
Both apps run SSR. My Next.js blog uses force-dynamic on most pages and outputs a standalone Node.js server. The TanStack app has ssr: true registered globally and runs on a node-server preset through Vite.
API-wise, both have similar endpoint coverage—health/contact/auth/analytics. The TanStack API file route convention with the - prefix (-analytics.ts, -sitemap.ts) took a moment to get used to, but it works cleanly once you understand the pattern.
React Server Components
My Next.js app uses RSC by default—server components fetch data, client components handle interactivity with 'use client' boundaries. The TanStack app does not use RSC. It is a classic SSR + client React setup. TanStack Start has experimental RSC support through @tanstack/react-start-rsc (it was pulled in as a transitive dependency), but this project does not opt into it.
For a blog with admin dashboard, the absence of RSC in TanStack is not a limitation. The isomorphic loader pattern with TanStack Query achieves similar performance characteristics—data loads on the server, hydrates on the client, and subsequent navigations are handled client-side without round-trips.
Live deployment
The local build worked perfectly the first time. The Vercel deployment did not. The function crashed on cold start with Cannot find package '@google-cloud/firestore' imported from /var/task/_libs/firebase-admin.mjs, even though firebase-admin was clearly in package.json and resolving fine on my machine.
Reading through the TanStack Start deployment docs made it clear that I needed to wire in the Nitro Vite plugin and select the vercel preset to produce a proper serverless function bundle. Adding nitro({ preset: 'vercel' }) to vite.config.ts got the build through Vercel's pipeline cleanly — but the same module-not-found error came right back at runtime.
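For reference, the change amounted to a couple of lines in vite.config.ts. Import paths shown here follow the docs I was reading and may differ across TanStack Start and Nitro versions:

```typescript
// vite.config.ts: TanStack Start + Nitro targeting Vercel's serverless runtime
import { defineConfig } from 'vite';
import { tanstackStart } from '@tanstack/react-start/plugin/vite';
import { nitro } from 'nitro/vite';

export default defineConfig({
  plugins: [
    tanstackStart(),
    nitro({ preset: 'vercel' }), // emit a Vercel serverless function bundle
  ],
});
```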
The cause turned out to be specific to Nitro v3 and firebase-admin. Nitro v3 (released late 2025) inlines externals into _libs/*.mjs by default and only keeps packages listed in traceDeps external. firebase-admin does not survive that bundling for two reasons: it calls require('@google-cloud/firestore') dynamically, which Nitro currently fails to trace (nitrojs/nitro#4094 is still open), and its transitive dep google-gax uses __dirname to locate its .proto files, which does not exist in ESM scope after Nitro's CJS→ESM conversion.
I fixed it by keeping firebase-admin and its Google Cloud peer deps entirely out of the bundle and shipping them as real node_modules into the Vercel function output via a small postbuild script. The script walks the dependency tree from firebase-admin and preserves nested node_modules paths so that packages with multiple installed versions — node-fetch@2 for google-gax, node-fetch@3 for google-auth-library — each end up in the right place. Once that landed, the function loaded cleanly and the app is live at ganesan-dev-26-tanstack-start.vercel.app.
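A simplified sketch of the idea behind that postbuild script, with names and paths that are illustrative rather than the real thing: walk a package's transitive runtime dependencies and copy each one from the local node_modules into the function's output. The nested-version handling described above (multiple node-fetch copies) is elided here for brevity.

```typescript
import * as fs from 'node:fs';
import * as path from 'node:path';

// Copy a package and its transitive runtime deps from one node_modules
// tree into another. Nested-version resolution is elided for brevity.
function copyDepTree(srcModules: string, destModules: string, pkg: string, seen = new Set<string>()): void {
  if (seen.has(pkg)) return;
  seen.add(pkg);

  const srcDir = path.join(srcModules, pkg);
  if (!fs.existsSync(srcDir)) return; // hoisted elsewhere or optional

  const destDir = path.join(destModules, pkg);
  fs.mkdirSync(path.dirname(destDir), { recursive: true }); // handles @scoped/ dirs
  fs.cpSync(srcDir, destDir, { recursive: true });

  const manifest = JSON.parse(fs.readFileSync(path.join(srcDir, 'package.json'), 'utf8'));
  for (const dep of Object.keys(manifest.dependencies ?? {})) {
    copyDepTree(srcModules, destModules, dep, seen);
  }
}

// Illustrative usage, assuming Nitro's Vercel output layout:
// copyDepTree('node_modules', '.vercel/output/functions/__fallback.func/node_modules', 'firebase-admin');
```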
Performance comparison
Both apps are deployed side by side, so I could measure them on the same network from the same machine. The setup:
| App | Deployed URL | Platform |
|---|---|---|
| Next.js blog | ganesan.dev | Firebase Hosting |
| TanStack blog | ganesan-dev-26-tanstack-start.vercel.app | Vercel |
Performance metrics to compare
I ran Lighthouse 13 against both home pages (mobile preset = simulated mid-tier device on Slow 4G; desktop preset = wired throttling) and probed raw HTTP timings with a Node fetch loop. Numbers below are from a single test location, so treat them as directional rather than absolute.
| Metric | Next.js 15 (Firebase Hosting) | TanStack Start (Vercel) | Difference |
|---|---|---|---|
| First Contentful Paint (FCP) | 1,190 ms | 2,910 ms | Next.js 1.72 s faster |
| Largest Contentful Paint (LCP) | 2,890 ms | 3,150 ms | Next.js 260 ms faster |
| Time to Interactive (TTI) | 5,330 ms | 4,370 ms | TanStack 960 ms faster |
| Total Blocking Time (TBT) | 2,710 ms | 338 ms | TanStack ~8× lower |
| Cumulative Layout Shift (CLS) | 0.001 | 0.001 | Tie |
| Lighthouse score (mobile) | 61 / 100 | 75 / 100 | TanStack +14 |
| Lighthouse score (desktop) | 87 / 100 | 84 / 100 | Next.js +3 |
| Time to First Byte (TTFB, warm) | ~1,390 ms | ~305 ms | TanStack ~4.5× faster |
| JS transferred (network) | 326 KB | 222 KB | TanStack 32% lighter |
| Page load (home, warm cache) | 1.43 s | 0.43 s | TanStack ~3.3× faster |
Two patterns jump out. TanStack Start ships materially less JavaScript (326 KB → 222 KB, a 32% drop) and Vercel's edge gets the first byte back in roughly a quarter of the time Firebase Hosting does for me, which together translate into a much lower Total Blocking Time and a +14 point Lighthouse score on mobile. Next.js still wins FCP and LCP though, because its statically generated HTML paints content the moment it arrives — TanStack Start's SSR'd shell paints later because the route's data has to come back before the LCP element is rendered. Desktop scores are within margin of error, which is the boring-but-honest answer: on a fast network and a fast machine, both feel instant.
Final thoughts
In the React ecosystem, Next.js has been something of a monopoly lately. Astro came and found its niche. Newer frameworks like SolidJS and Lit are carving their own paths. The arrival of TanStack Start is healthy competition.
It is not about "killing" Next.js. It is about having a choice again. TanStack Start prioritizes explicitness over magic and stability over constant reinvention. The type-safe routing alone is worth the exploration. The integrated query layer with devtools, the isomorphic data loading, the deployment flexibility—these are not small things.
Will I migrate my blog permanently? I am not sure yet. But I now have a real alternative that I actually built something with, not just read about.
What I genuinely liked: the folder structure and the way server functions are organized — both make day-to-day development noticeably easier to reason about. What is holding me back from a full migration is platform breadth. Next.js has first-class, well-trodden paths to AWS, Vercel, Firebase, Cloudflare, and pretty much anywhere else you might want to ship. TanStack Start gets there through Nitro, which is powerful but, as this post hopefully made clear, still has rough edges on less common targets. I am not ready to bet a production app on it.