TL;DR
- 42% of top websites render content via JavaScript that AI crawlers can't execute
- GPTBot, PerplexityBot, and ClaudeBot don't fully execute client-side JavaScript
- Server-side rendering (SSR) is the most reliable fix for AI readability
- A simple JavaScript-disabled browser test reveals if your site has this problem
- Six categories of AI readability failure exist beyond just JavaScript rendering
The Problem No One Is Talking About
Every conversation about AI visibility focuses on content — what to write, how to structure it, what schema to add. Almost nobody is asking the more fundamental question: can AI actually read your site at all?
The answer, for a significant portion of marketing and SaaS websites, is no. 42% of the web's most visited pages deliver their primary content via JavaScript rendering, and AI crawlers don't execute JavaScript the way browsers do. On nearly half of the web's top pages, the AI is effectively reading an empty shell.
How AI Crawlers Actually Work
The major AI platforms — ChatGPT, Perplexity, Claude, Gemini — each use different methods to retrieve and index web content. Some use dedicated crawlers (GPTBot, PerplexityBot, ClaudeBot). Some rely on existing indices from Bing, Google, or Brave. What they have in common: most do not execute JavaScript at the depth a full browser does.
This means when GPTBot visits your homepage, it retrieves the initial HTML. If your navigation, hero section, and core value proposition are rendered by JavaScript that runs after page load, the crawler sees none of it.
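You can see this for yourself with a plain HTTP fetch, which is effectively what a non-rendering crawler does. The TypeScript sketch below assumes Node 18+ (built-in fetch, top-level await in an ES module); the URL and phrase are placeholders for your own page and a line of your hero copy:

```ts
// crawler-view.ts: approximate what a non-rendering crawler receives.
// One HTTP GET, no JavaScript execution. Assumes Node 18+ (built-in
// fetch, top-level await in an ES module). URL and phrase are
// placeholders; substitute your own page and a line of your hero copy.
const url = "https://example.com/";
const phrase = "your hero headline";

const res = await fetch(url);
const html = await res.text();

console.log(`Fetched ${html.length} bytes of raw HTML`);
console.log(
  html.includes(phrase)
    ? "Phrase present in the initial HTML, so crawlers can read it."
    : "Phrase missing, so it is likely rendered client-side."
);
```

If the phrase only appears after the page runs in a real browser, your site falls into the first category below.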
The Six Categories of AI Readability Failure
- JavaScript-rendered content — Primary content loaded via React, Vue, or Angular that crawlers see as blank HTML (a minimal sketch of this failure follows this list)
- Firewall blocking — Cloudflare, WAF rules, or bot protection settings that silently block AI crawlers
- Missing or incorrect schema — Product pages, FAQ sections, and pricing data without structured markup
- Thin content depth — Pages with fewer than 300 words of substantive content
- Hidden content — Key information inside accordions, tabs, or modals not in the initial DOM
- Broken crawl paths — Incorrect robots.txt rules, missing sitemaps, or accidental noindex tags
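The first category is the most common, and it is easy to reproduce in miniature. Here is a deliberately broken React sketch; the component name and copy are hypothetical:

```tsx
// ClientOnlyHero.tsx: category 1 in miniature. The headline exists only
// after JavaScript runs in the browser, so a crawler reading the raw
// HTML sees an empty <div>.
import { useEffect, useState } from "react";

export function ClientOnlyHero() {
  const [copy, setCopy] = useState<string | null>(null);

  useEffect(() => {
    // Runs only in the browser, after page load. A crawler that does
    // not execute JavaScript never reaches this line.
    setCopy("We cut your deployment time in half.");
  }, []);

  // Initial HTML response: <div></div>. Nothing for the crawler to read.
  return <div>{copy}</div>;
}
```

Rendered in a browser, the headline appears almost instantly, which is why teams rarely notice the problem. In the raw HTML response, the div is empty.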
The Fix
The most reliable solution is server-side rendering (SSR) or static site generation (SSG). Next.js, Nuxt, and SvelteKit all support SSR out of the box. When your server sends fully rendered HTML rather than a JavaScript shell, every crawler sees the same content your users see.
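As a rough illustration rather than a prescription, here is what that looks like in the Next.js App Router, where Server Components render on the server by default; getPlans() and its endpoint are hypothetical placeholders for your own data source:

```tsx
// app/page.tsx: Next.js App Router sketch. Server Components render on
// the server by default, so the headline and the fetched plan names
// arrive as real HTML in the initial response.
// getPlans() and its endpoint are hypothetical placeholders.
async function getPlans(): Promise<string[]> {
  const res = await fetch("https://api.example.com/plans");
  return res.json();
}

export default async function Home() {
  const plans = await getPlans();
  return (
    <main>
      <h1>We cut your deployment time in half.</h1>
      <ul>
        {plans.map((plan) => (
          <li key={plan}>{plan}</li>
        ))}
      </ul>
    </main>
  );
}
```

The headline and plan names in this sketch ship inside the initial HTML response, so a crawler that never executes JavaScript still reads them.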
Three additional fixes to prioritise:
- Add a sitemap.xml, reference it in your robots.txt, and submit it to Google and Bing, whose indices several AI platforms rely on
- Review your robots.txt and any Cloudflare or WAF rules to confirm AI bot user agents are not being blocked (a sample robots.txt follows this list)
- Add JSON-LD schema markup to your homepage, pricing page, and any product or service pages (a sketch of this follows as well)
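For the robots.txt review, a permissive baseline looks like the sketch below. GPTBot, ClaudeBot, and PerplexityBot are the published user-agent tokens for OpenAI, Anthropic, and Perplexity; the domain is a placeholder:

```
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

Sitemap: https://example.com/sitemap.xml
```

For the schema item, one common pattern is to emit JSON-LD from a small component. A minimal sketch for a pricing page, with placeholder product and price values:

```tsx
// PricingSchema.tsx: emits Product/Offer JSON-LD as a script tag.
// All names and values here are placeholders.
export function PricingSchema() {
  const schema = {
    "@context": "https://schema.org",
    "@type": "Product",
    name: "Example Plan",
    offers: {
      "@type": "Offer",
      price: "49.00",
      priceCurrency: "USD",
    },
  };
  return (
    <script
      type="application/ld+json"
      dangerouslySetInnerHTML={{ __html: JSON.stringify(schema) }}
    />
  );
}
```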
Frequently Asked Questions
Can AI crawlers read JavaScript-rendered content?
Most AI crawlers like GPTBot and ClaudeBot do not fully execute JavaScript. They retrieve the initial HTML, so content rendered by React, Vue, or Angular after page load is often invisible to them.
How do I test if AI can read my website?
Disable JavaScript in your browser (Chrome DevTools → Settings → Preferences → Debugger → Disable JavaScript) and reload your key pages. What you see is roughly what AI crawlers see.
What is the best fix for JavaScript rendering issues?
Server-side rendering (SSR) or static site generation (SSG) using frameworks like Next.js, Nuxt, or SvelteKit ensures fully rendered HTML is sent to all crawlers.