JavaScript rendering is the single most common technical AEO problem. AI crawlers vary widely in how reliably they execute JavaScript. A site that depends on client-side rendering may be partially or fully invisible to engines that don’t render JS, even when the same site indexes well in Google.
The rendering problem
A modern web page can deliver content in three ways:
- Server-side rendering (SSR). The HTML returned by the server contains the actual content. The page is fully readable without JavaScript.
- Client-side rendering (CSR). The HTML is a near-empty shell. JavaScript runs in the browser to fetch and render content.
- Hybrid (SSR + hydration). The server returns rendered HTML, then JavaScript adds interactivity. The initial HTML has the content.
For AI crawlers, only the first and third reliably work. CSR-only pages are usually empty to crawlers that don’t render JavaScript.
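The difference is easy to see by running the same content check against both kinds of initial HTML. A minimal sketch; the two HTML snippets and file names (`csr.html`, `ssr.html`) are hypothetical examples, not real responses:

```shell
#!/bin/sh
# Hypothetical CSR shell: the initial HTML carries no content, only a
# mount point and a script tag. This is all a non-rendering crawler sees.
cat > csr.html <<'EOF'
<html><body><div id="root"></div><script src="/app.js"></script></body></html>
EOF

# Hypothetical SSR response: the same page with content in the initial HTML.
cat > ssr.html <<'EOF'
<html><body><h1>Running Shoe Fit Guide</h1><p>How to measure your feet.</p></body></html>
EOF

# The check a non-rendering crawler effectively performs: is there an
# <h1> in the raw HTML, before any JavaScript runs?
for f in csr.html ssr.html; do
  if grep -q "<h1[^>]*>" "$f"; then
    echo "$f: content visible without JavaScript"
  else
    echo "$f: empty shell, invisible to non-rendering crawlers"
  fi
done
```

The CSR shell fails the check even though a browser would render the identical content; that gap is the entire problem.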
How major crawlers handle JavaScript
Behavior varies by operator and is not always documented:
- Googlebot renders JavaScript reliably. Pages indexed by Google generally work for AI engines that ground their answers in Google's index.

- Bingbot renders JavaScript with some limitations; pages that rely heavily on client-side JS may be crawled or indexed incompletely.
- GPTBot, ClaudeBot, and PerplexityBot render JavaScript inconsistently. Documentation is sparse; observed behavior suggests assuming no rendering is the safe default.
- User-initiated fetch bots (`ChatGPT-User`, `Claude-User`, `Perplexity-User`) typically do render JS because they're emulating a user request.
The conservative position: assume training and search crawlers do not render JavaScript reliably and design accordingly.
What this means for site architecture
For sites that depend on JavaScript for primary content:
- SSR or static rendering is effectively required. Pure SPAs with client-side data fetching are not AEO-viable.
- Pre-rendering for crawler user agents is a common middle ground. The server detects the bot and returns pre-rendered HTML.
- Hybrid rendering with SSR for initial paint and JS for interactivity is the modern standard and works well.
For specific patterns and trade-offs:
- Lazy loading and AI — content that loads after user interaction.
- Dynamic rendering for AEO — serving pre-rendered HTML to crawlers.
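The user-agent check behind dynamic rendering can be sketched at the shell level; in practice this logic lives at the CDN or reverse-proxy layer, and the bot list below is illustrative, not exhaustive. The `variant_for` function name is hypothetical:

```shell
#!/bin/sh
# Decide which variant to serve based on the requesting user agent:
# "prerendered" for known AI/search crawlers, "spa" for everyone else.
variant_for() {
  case "$1" in
    *GPTBot*|*ClaudeBot*|*PerplexityBot*|*Googlebot*|*bingbot*)
      echo "prerendered" ;;
    *)
      echo "spa" ;;
  esac
}

variant_for "Mozilla/5.0 (compatible; GPTBot/1.0)"      # prints "prerendered"
variant_for "Mozilla/5.0 (Windows NT 10.0) Chrome/120"  # prints "spa"
```

Note that the prerendered variant must contain the same content users see in the browser; if it diverges, dynamic rendering shades into cloaking.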
How to test rendering
The fastest test:
```bash
curl -A "GPTBot" https://example.com/affected-page | grep -o "<h1[^>]*>[^<]*</h1>"
```
If the H1 appears in the curl output, the content is in the initial HTML. If not, JavaScript is required to render it.
A more thorough test:
- Fetch the page with curl using each major crawler’s user agent.
- For each, save the response body.
- Read the saved HTML: does it contain the actual page content, or just script tags and an empty root element?
- Compare with what a real browser sees. Major content visible in the browser but missing from the curl response is the problem.
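The steps above can be scripted. A sketch, assuming the crawler names listed earlier as user-agent tokens (real UA strings in the wild include more detail); the `fetch_as_bots` function name and output file names are hypothetical:

```shell
#!/bin/sh
# Fetch one URL as each major crawler and save the raw response bodies
# for side-by-side inspection.
fetch_as_bots() {
  url="$1"
  for ua in GPTBot ClaudeBot PerplexityBot Googlebot bingbot; do
    curl -sA "$ua" "$url" -o "body-$ua.html"
    # A crude content signal: count heading tags in the raw response.
    headings=$(grep -o "<h[1-6][^>]*>" "body-$ua.html" | wc -l)
    echo "$ua: $(wc -c < "body-$ua.html") bytes, $headings headings"
  done
}

# Example (placeholder URL):
# fetch_as_bots "https://example.com/affected-page"
```

A near-zero byte count or heading count for every crawler UA, on a page that renders fine in a browser, is the CSR signature.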
What to fix
In rough order of impact:
- Server-render the primary content. Title, headings, key paragraphs, and structured data all need to be in the initial HTML.
- Server-render schema markup. Schema injected by JavaScript at runtime is not seen by crawlers that don’t render JS.
- Server-render internal links. Crawlers can't follow `onClick` handlers; links need to be actual `<a href>` elements.
- Server-render meta tags. Title, description, canonical, and Open Graph tags all need to be in the head of the initial HTML.
Interactive elements (filters, dropdowns, modals) can be JS-driven without harming AEO, as long as the underlying content is server-rendered.
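Each item in the list above can be verified with a grep over a saved initial-HTML response. A minimal sketch; the sample page and its contents are hypothetical, and the patterns are deliberately crude:

```shell
#!/bin/sh
# Hypothetical saved response; in practice, capture one with e.g.
#   curl -sA "GPTBot" https://example.com/page -o page.html
cat > page.html <<'EOF'
<html><head>
<link rel="canonical" href="https://example.com/guide">
<script type="application/ld+json">{"@type":"Article"}</script>
</head><body>
<h1>Fit Guide</h1>
<a href="/sizing">Sizing chart</a>
</body></html>
EOF

# Report whether each must-be-server-rendered element appears in the raw HTML.
check() {
  if grep -q "$2" page.html 2>/dev/null; then echo "OK    $1"; else echo "MISS  $1"; fi
}

check "h1 heading"       "<h1[^>]*>"
check "JSON-LD schema"   "application/ld+json"
check "canonical link"   'rel="canonical"'
check "meta description" 'name="description"'
check "href links"       '<a [^>]*href='
```

On this sample the meta description check reports `MISS`, which is exactly the kind of gap the script is meant to surface.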
What not to do
- Cloak. Don’t serve different content to bots than to users. Engines detect and penalize this.
- Rely on `<noscript>` tags. Crawlers that render JS see the JS-rendered content; crawlers that don't may not render `<noscript>` either. Use SSR instead.
- Defer hydration of critical content. Even with SSR, if hydration replaces the rendered HTML with different content for users, crawlers and users see different things.
- Use JavaScript to insert canonicals or robots tags. These need to be in the initial HTML.
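One way to catch both accidental cloaking and hydration mismatches is to diff the body served to a crawler UA against the body served to a browser UA. A sketch; the `compare_ua` function name is hypothetical, and the browser UA string is abbreviated:

```shell
#!/bin/sh
# Fetch the same URL with a crawler UA and a browser UA, then compare.
# A large diff suggests cloaking or a hydration mismatch worth investigating.
compare_ua() {
  url="$1"
  curl -sA "GPTBot" "$url" -o bot.html
  curl -sA "Mozilla/5.0 (X11; Linux x86_64) Chrome/120" "$url" -o browser.html
  if diff -q bot.html browser.html > /dev/null; then
    echo "identical"
  else
    echo "differs: diff bot.html browser.html to inspect"
  fi
}

# Example (placeholder URL):
# compare_ua "https://example.com/affected-page"
```

Small diffs (timestamps, nonces) are normal; substantive content differences are the red flag.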
Modern frameworks
Most modern frameworks support SSR or static rendering. Defaults vary:
- Next.js, Nuxt, Remix, SvelteKit, Astro — SSR or static rendering by default. AEO-friendly out of the box.
- Plain React, Vue, Angular SPAs — CSR by default. Need explicit pre-rendering or SSR setup.
- Headless CMS architectures — depend on the consumer; static site generators are AEO-friendly, runtime SPAs are not.
The migration cost from CSR to SSR is real, but the AEO impact justifies it for content sites, especially where citation visibility is a core goal.
Implementation example
At AwesomeShoes Co., the frontend team has built product guides as a client-side app. The AEO lead discovers that AI search crawlers receive nearly empty HTML, so detailed fit guidance never reaches answer engines.
Implementation discussion: the frontend engineer migrates core guide sections to SSR, the technical SEO lead verifies that headings and schema appear in initial HTML responses, and the content strategist rewrites key passages into extractable answer blocks. The team checks both crawl render snapshots and citation movement to ensure the change is understandable, technically correct, and effective.