Technical · Mar 2026 · 7 min read

The 42% Invisible Problem: Why AI Can't Read Your JavaScript

Arjun Mehta · Published Mar 2026

42% of the web's most visited pages deliver their primary content via JavaScript rendering. AI crawlers don't execute JavaScript the same way browsers do. For nearly half the internet, the AI is reading an empty page.

The Problem No One Is Talking About

Every conversation about AI visibility focuses on content — what to write, how to structure it, what schema to add. Almost nobody is asking the more fundamental question: can AI actually read your site at all?

The answer, for a significant portion of marketing and SaaS websites, is no.

Here's what's happening. Modern websites — particularly those built on React, Next.js, Vue, or Angular — deliver much of their content through client-side JavaScript rendering. The HTML that loads initially is often a shell: a few div tags, a script import, and nothing else. The actual content — headlines, product descriptions, pricing, features — is assembled in the browser by JavaScript executing after the initial load.
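To make the gap concrete, here is a minimal sketch (all markup and page copy hypothetical) of what a crawler that does not execute JavaScript extracts from a client-rendered shell versus a server-rendered page:

```javascript
// extractVisibleText mimics a naive, non-JS-executing crawler:
// strip scripts, styles, and tags, then collapse whitespace.
function extractVisibleText(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, "")
    .replace(/<style[\s\S]*?<\/style>/gi, "")
    .replace(/<[^>]+>/g, " ")
    .replace(/\s+/g, " ")
    .trim();
}

// Client-rendered shell: the content only exists after the bundle runs.
const csrShell = `<!doctype html><html><body>
  <div id="root"></div>
  <script src="/bundle.js"></script>
</body></html>`;

// Server-rendered page: the content is in the HTML itself.
const ssrPage = `<!doctype html><html><body>
  <h1>Acme Analytics</h1>
  <p>Dashboards your whole team can read.</p>
</body></html>`;

console.log(extractVisibleText(csrShell)); // ""
console.log(extractVisibleText(ssrPage));  // "Acme Analytics Dashboards your whole team can read."
```

The shell yields an empty string: nothing for an AI model to quote, summarise, or cite.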

Google's crawlers learned to handle this years ago. AI crawlers often haven't.

How AI Crawlers Actually Work

The major AI platforms — ChatGPT, Perplexity, Claude, Gemini — each use different methods to retrieve and index web content. Some use dedicated crawlers (GPTBot, PerplexityBot, ClaudeBot). Some rely on existing indices from Bing, Google, or Brave. What they have in common: most do not execute JavaScript at the depth a full browser does.

This means when GPTBot visits your homepage, it retrieves the initial HTML. If your navigation, hero section, and core value proposition are rendered by JavaScript that runs after page load, the crawler sees none of it. Your page is, for all practical purposes, invisible.

GPTBot, PerplexityBot, and ClaudeBot are all identifiable in your server logs. If you're not seeing requests from these crawlers, that's itself a signal worth investigating.
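A quick sketch of that log check, assuming a conventional access-log format (the sample line and helper name here are hypothetical; the user-agent tokens are the published ones):

```javascript
// Flag known AI crawler user agents in access-log lines.
const AI_CRAWLERS = ["GPTBot", "PerplexityBot", "ClaudeBot"];

function detectAiCrawler(logLine) {
  return AI_CRAWLERS.find((bot) => logLine.includes(bot)) || null;
}

const line =
  '203.0.113.7 - - [12/Mar/2026:10:01:44 +0000] "GET / HTTP/1.1" 200 ' +
  '"Mozilla/5.0 AppleWebKit/537.36; compatible; GPTBot/1.1; +https://openai.com/gptbot"';

console.log(detectAiCrawler(line)); // "GPTBot"
```

Running something like this over a week of logs tells you which AI crawlers are visiting at all, and how often.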

The Six Categories of AI Readability Failure

Based on Hema's Site Health audit data, JavaScript rendering is the most common blocker, but it is only one of six categories of AI readability failure we consistently see.

How to Find Out If You Have This Problem

Start with the simplest possible test. Disable JavaScript in your browser (Chrome: DevTools → Settings → Debugger → Disable JavaScript) and reload your most important pages. What you see is roughly what AI crawlers see. If your homepage becomes a blank white screen, you have a problem.

For a more thorough audit, Hema's Site Health module runs a 6-module AI readability audit across your entire site — Technical Fixes, Issue Alerts, Firewall & Security, Hidden Content, Content Quality & Depth, and Data Tags & Schema — and surfaces every issue with a severity score and a ready-to-use fix instruction for your developer.

The Fix

The most reliable solution is server-side rendering (SSR) or static site generation (SSG). Next.js, Nuxt, and SvelteKit all support SSR out of the box. When your server sends fully rendered HTML rather than a JavaScript shell, every crawler sees the same content your users see.

If a full SSR migration isn't feasible right now, the interim approach is hybrid rendering: serve a static HTML fallback for your most important pages — homepage, product pages, pricing — while continuing to use client-side rendering for interactive elements like dashboards and account pages that don't need to be indexed.
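One way to express that hybrid policy is a simple route allow-list consulted at build or request time (the route names and helper below are hypothetical, not a real framework API):

```javascript
// Marketing pages get a static HTML fallback; app surfaces that
// don't need to be indexed stay client-rendered.
const PRERENDER_ROUTES = ["/", "/pricing", "/product", "/blog"];

function shouldPrerender(path) {
  return PRERENDER_ROUTES.some(
    (route) => path === route || path.startsWith(route + "/")
  );
}

console.log(shouldPrerender("/pricing"));       // true
console.log(shouldPrerender("/app/dashboard")); // false
```

The point of centralising the decision is that adding a newly important page to the crawlable set becomes a one-line change.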

Three additional fixes to prioritise alongside rendering:

The brands that fix these issues first will have a permanent crawl advantage over competitors who never audit their AI readability. Most of your competitors haven't done this yet.

Run your free AI Site Health audit at tryhema.com → Technical Fixes. You'll see your overall readability score across all six modules and a prioritised list of every issue on your site within 10 minutes.
