Freelancer Tamal
Technical SEO · 14 min · May 14, 2026

JavaScript SEO: Rendering, Hydration & Why Googlebot Still Misses Content

Googlebot renders JavaScript — but with delays, quirks, and limits that bite SPAs hard. Here's what actually gets indexed, what doesn't, and the rendering strategies that work in 2026.

Freelancer Tamal · SEO Expert · Rangpur, Bangladesh · 6+ years experience

Googlebot renders JavaScript. That much is officially settled. What's not settled — and is the source of most JS-SEO confusion — is what gets rendered, when, with what limits, and how reliably. SPAs that "work fine for users" routinely have content missing from Google's index. Here's the working 2026 model.

Table of contents

1. How Googlebot processes JavaScript (the two-pass model)
2. What gets indexed vs what gets dropped
3. SSR vs SSG vs ISR vs CSR — when each works
4. The hydration mismatch trap
5. Diagnosing rendering issues
6. AEO-specific JS considerations
7. FAQ

How does Googlebot process JavaScript?

Quick answer

Googlebot uses a two-pass model: (1) the initial crawl indexes the raw HTML; (2) a deferred render queue executes JavaScript in an evergreen (always-current) Chromium and reindexes the result. The render pass can lag the crawl pass by minutes to days depending on site importance. Content available only after JS execution is indexed on the second pass — meaning fresh content takes longer to appear in search than equivalent server-rendered content.

What gets indexed vs what gets dropped

Indexed: content rendered into the DOM within ~5 seconds, navigation links present in rendered HTML, schema injected before render completion. Dropped or unreliably indexed: content behind user interaction (clicks, hovers), content loaded via infinite scroll without proper paginated URLs, content loaded after timeouts >5s, content gated by cookie consent that blocks until accepted. **Googlebot doesn't click, scroll, or accept cookies — anything requiring those actions is invisible.**
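One practical consequence: if tab or accordion content needs to be indexed, render it into the DOM up front and hide it with CSS instead of fetching it on click. A minimal sketch of the pattern (function and class names here are illustrative, not a real API):

```typescript
// Hidden-but-present content is indexable; content fetched on click is not,
// because Googlebot never clicks. Names are illustrative only.
function renderTab(label: string, body: string, active: boolean): string {
  // The body is in the server HTML regardless of which tab is active,
  // so both passes of Googlebot's pipeline can see it.
  const cls = active ? "tab" : "tab hidden"; // .hidden { display: none }
  return `<section class="${cls}"><h2>${label}</h2><p>${body}</p></section>`;
}

const html = renderTab("Specifications", "Weight: 1.2 kg", false);
console.log(html.includes("Weight: 1.2 kg")); // true — crawlable even while hidden
```

The same reasoning applies to infinite scroll: keep the lazy UX, but make each content batch reachable at its own paginated URL so the crawler has something to fetch without scrolling.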

SSR vs SSG vs ISR vs CSR — when each works

SSR (server-side rendering): safest for dynamic content, freshness, and SEO. SSG (static site generation): fastest, ideal for content that changes infrequently. ISR (incremental static regeneration, Next.js): good middle ground. CSR (client-side rendering only): worst for SEO; avoid for any indexable content. **In 2026, defaulting to CSR for marketing pages is malpractice — use SSR or SSG.**
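To make the ISR trade-off concrete, here's a toy TTL cache that mimics the idea. This is a deliberately simplified sketch: real ISR (e.g. a Next.js `revalidate` export) is handled by the framework and serves the stale copy while regenerating in the background, whereas this version re-renders inline on the first stale hit.

```typescript
// Toy ISR-style cache: serve a prerendered page until its TTL expires, then
// re-render. Crawlers always receive full HTML, never an empty JS shell.
type Entry = { html: string; renderedAt: number };
const cache = new Map<string, Entry>();
const REVALIDATE_MS = 60 * 60 * 1000; // 1 hour, like `revalidate: 3600`

function renderPage(path: string): string {
  return `<h1>Content for ${path}</h1>`; // stands in for a full server render
}

function serve(path: string, now: number): string {
  const hit = cache.get(path);
  if (hit && now - hit.renderedAt < REVALIDATE_MS) {
    return hit.html; // static-fast path: no render work at request time
  }
  const html = renderPage(path); // stale or missing: regenerate and cache
  cache.set(path, { html, renderedAt: now });
  return html;
}
```

The design point: SSG is this cache with an infinite TTL, SSR is a TTL of zero, and ISR lets you pick a freshness window per route.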

The hydration mismatch trap

When server-rendered HTML differs from client-rendered HTML (different timestamps, conditionally rendered widgets, locale-detected content), React and Vue surface hydration errors. These break interactivity and can cause Googlebot to see content that disappears post-hydration. Audit by comparing view-source against the rendered DOM in Search Console's URL Inspection — discrepancies are red flags.
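The mismatch is easy to reproduce with anything time-dependent. A stripped-down illustration with no framework involved — just the two HTML strings a framework like React would compare at hydration time:

```typescript
// The server and the client render the "same" component at different moments,
// so the two HTML strings differ and the framework logs a hydration error.
function renderBanner(now: Date): string {
  return `<p>Deal ends ${now.toISOString()}</p>`;
}

const serverHtml = renderBanner(new Date("2026-05-14T09:00:00Z")); // at request time
const clientHtml = renderBanner(new Date("2026-05-14T09:00:03Z")); // seconds later, in the browser
console.log(serverHtml === clientHtml); // false → hydration mismatch
// Fix: compute the timestamp once on the server and pass it to the client,
// or render a stable placeholder and update it after mount.
```

Locale detection and feature flags fail the same way whenever the server and client resolve them differently.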

Diagnosing rendering issues

Search Console URL Inspection > Live Test > View rendered HTML — this is the ground truth for what Googlebot sees. Cross-check with Screaming Frog (JS rendering enabled) and the Rich Results Test's rendered HTML view. (Google retired the standalone Mobile-Friendly Test in late 2023; URL Inspection covers that ground now.) **If your important content isn't in the URL Inspection rendered HTML, it isn't being indexed — fix the rendering, not the SEO copy.**
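You can automate part of this check: fetch the raw (pre-JS) HTML yourself — e.g. with curl — and verify that the phrases you need to rank for actually appear in it. A hypothetical helper, not a real tool's API:

```typescript
// Return the must-index phrases that are absent from the raw, pre-JS HTML.
// Anything this flags exists only after rendering — i.e. it sits at the mercy
// of Googlebot's deferred render queue and is invisible to raw-HTML crawlers.
function missingFromRawHtml(rawHtml: string, phrases: string[]): string[] {
  const haystack = rawHtml.toLowerCase();
  return phrases.filter((p) => !haystack.includes(p.toLowerCase()));
}

const rawShell = `<html><body><div id="root"></div></body></html>`; // typical CSR shell
console.log(missingFromRawHtml(rawShell, ["pricing plans", "free trial"]));
// → [ 'pricing plans', 'free trial' ] — everything is client-rendered
```

Run it against a server-rendered page and the list should come back empty; a non-empty list on key landing pages is the rendering problem made visible.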

AEO-specific JS considerations

AI engine crawlers (GPTBot, ClaudeBot, PerplexityBot) generally do NOT execute JavaScript as of early 2026. They index raw HTML only. Pages whose primary content requires JS rendering are largely invisible to AI training and live retrieval — even when Googlebot indexes them fine. SSR or SSG is non-negotiable for AEO.

Frequently asked

Does Next.js / Remix / TanStack Start fix JS SEO automatically?

Mostly. Modern meta-frameworks default to SSR/SSG, which solves indexation. The remaining work is route-level: ensure each page's loader resolves data on the server, not in a client useEffect.

Are SPAs ever appropriate for SEO-dependent content?

Rarely. Authenticated dashboards behind login? Fine. Public marketing/blog/product pages? Use SSR or SSG. The performance and SEO costs of CSR are no longer worth the developer-experience benefits.

Does Googlebot wait for lazy-loaded images?

Yes, if implemented with the native loading="lazy" attribute or a well-behaved IntersectionObserver — Googlebot renders with a very tall viewport rather than scrolling, so observer-based loaders usually fire. Custom lazy-loading that depends on actual scroll events rarely works for Googlebot — use the native attribute.
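The key difference: with native lazy loading, the `<img>` and its src are in the markup from the start, so no scrolling is required for the crawler to discover the image. A tiny sketch of the markup to emit:

```typescript
// The src is present in the HTML whether or not the image has loaded yet,
// so Googlebot can index it without scrolling. Scroll-driven lazy-loaders
// that omit src until a scroll event leave nothing for the crawler to find.
function lazyImage(src: string, alt: string): string {
  return `<img src="${src}" alt="${alt}" loading="lazy" decoding="async">`;
}

console.log(lazyImage("/img/hero.webp", "Product hero shot"));
// → <img src="/img/hero.webp" alt="Product hero shot" loading="lazy" decoding="async">
```

If you must use a JS lazy-loader, keep the real URL in the markup (e.g. as the src, not only in a data attribute swapped in on scroll).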

Will AI crawlers ever execute JavaScript?

Some are starting to (Bing and ChatGPT browsing render in real time), but training-corpus crawlers still default to raw HTML. Don't bet AEO visibility on AI engines becoming SPA-compatible — ship SSR.

How do I test what GPTBot sees?

Fetch your page with GPTBot's user agent and compare it to a browser view: `curl -A 'Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko); compatible; GPTBot/1.0; +https://openai.com/gptbot' https://yoursite.com`. Any content visible in the browser but absent from the curl response is invisible to AI crawlers.
