🤖 SEO Tool · Google 2 MB Limit

Check your site against
Google's 2 MB crawl limit

Googlebot won't fully index pages or resources heavier than 2 MB. Enter a URL — the tool analyzes your HTML document, every CSS & JS file, Googlebot accessibility, and more.

What the tool checks

Full audit in seconds — no sign-up, no installation

📄

HTML document size

Googlebot downloads and processes only the first 2 MB of HTML. Everything beyond is ignored — critical for pages with large inline data or JSON-LD blocks.
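The kind of size check described here can be sketched in a few lines of Python (the function name and report fields are illustrative, not the tool's actual code):

```python
LIMIT = 2 * 1024 * 1024  # the 2 MB per-document cap discussed above

def audit_html_size(raw: bytes, limit: int = LIMIT) -> dict:
    """Report how much of a fetched HTML document falls under the limit."""
    return {
        "total_bytes": len(raw),
        "indexed_bytes": min(len(raw), limit),  # bytes Googlebot will process
        "truncated": len(raw) > limit,          # True if the tail is ignored
    }
```

For a 3 MB page, `truncated` comes back `True`: only the first 2 MB would be processed and everything after that point, including any trailing JSON-LD, is invisible to Google.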

🎨

CSS and JS files

Each external stylesheet and script also has a 2 MB limit. Heavy bundles not only slow page load but also restrict how much Googlebot can parse and render.
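To audit each asset, a checker first has to collect the external URLs. A minimal regex-based sketch (a real implementation would use a proper HTML parser; the function name is hypothetical):

```python
import re

def external_assets(html: str) -> list[str]:
    """Pull external stylesheet and script URLs; each faces its own 2 MB cap."""
    css = re.findall(r'<link\b[^>]*href="([^"]+\.css[^"]*)"', html)
    js = re.findall(r'<script\b[^>]*src="([^"]+)"', html)
    return css + js
```

Each returned URL would then be fetched and measured individually, since the limit applies per file, not to the page total.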

🤖

Googlebot accessibility

We check robots.txt, X-Robots-Tag headers, and the meta robots tag. A page can be open to users but invisible to Google at the same time.
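Both the X-Robots-Tag header and the meta robots tag carry comma-separated directives, and either one can block indexing on its own. A minimal sketch of how they might be combined (the function name is hypothetical):

```python
def is_indexable(x_robots_tag: str = "", meta_robots: str = "") -> bool:
    """Return False if either directive source blocks indexing."""
    directives = {
        d.strip().lower()
        for source in (x_robots_tag, meta_robots)
        for d in source.split(",")
    }
    # "noindex" or the shorthand "none" (= noindex, nofollow) blocks the page
    return not ({"noindex", "none"} & directives)
```

This is why a page can look fine in a browser yet stay out of the index: the blocking signal may live only in a response header that users never see.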

🗜️

Compression & caching

gzip/brotli compression reduces transfer size 3–10×. Without it, even a 900 KB file wastes crawl budget and slows every page load. Cache-Control headers prevent repeated downloads of the same assets.
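A check along these lines boils down to inspecting two response headers. A sketch of such an audit (field names follow standard HTTP headers; the function itself is illustrative):

```python
def audit_headers(headers: dict[str, str]) -> list[str]:
    """Flag missing compression and caching on a response-header mapping."""
    issues = []
    if headers.get("Content-Encoding") not in ("gzip", "br", "zstd"):
        issues.append("response is not compressed (no gzip/brotli)")
    if "Cache-Control" not in headers:
        issues.append("no Cache-Control header; assets re-downloaded every visit")
    return issues
```

An empty list means both signals are present; each string describes one finding to surface in the report.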

🔒

Security headers

Audit of HSTS, CSP, X-Frame-Options, and other headers. They affect both user security and Chrome's Page Experience signals.

🗺️

robots.txt & Sitemap

We parse the robots.txt file: is Googlebot blocked, are there sitemap links, and which rules apply to the checked URL.
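The same parsing can be reproduced with Python's standard-library `urllib.robotparser` (the robots.txt content below is made up for the example; `site_maps()` requires Python 3.8+):

```python
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: Googlebot
Disallow: /private/

Sitemap: https://example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

blocked = not rp.can_fetch("Googlebot", "https://example.com/private/page")
allowed = rp.can_fetch("Googlebot", "https://example.com/pricing")
sitemaps = rp.site_maps()  # list of Sitemap URLs declared in the file, or None
```

Here `blocked` is True for the disallowed path, `allowed` is True for the rest of the site, and the sitemap link is picked up from the file.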

Frequently asked questions

Everything about the 2 MB limit and indexing

Why did Google set the limit at exactly 2 MB?
Google officially documents a 15 MB download limit, but only the first 2 MB of HTML content is indexed. This constraint exists so Googlebot can efficiently crawl billions of pages without spending excessive time on any single one.
Is the 2 MB limit applied before or after compression?
The limit applies to the decompressed content size. If your HTML is 600 KB gzip-compressed but expands to 2.5 MB — the last portion won't be indexed. This tool always shows the real uncompressed size.
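The point can be demonstrated with Python's `gzip` module (the sizes here are synthetic, chosen only to cross the threshold):

```python
import gzip

LIMIT = 2 * 1024 * 1024  # the indexing cap discussed above

raw = b"<li class='product'>Example item</li>\n" * 80000  # ~2.9 MB of markup
wire = gzip.compress(raw)  # what actually travels over HTTP

# The transfer is small, but the limit is judged on the decompressed size:
decompressed = len(gzip.decompress(wire))
truncated = decompressed > LIMIT
```

The repetitive markup compresses to well under the limit on the wire, yet `truncated` is True because the expanded document is larger than 2 MB.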
What happens if a JS file exceeds 2 MB?
Googlebot won't execute the full script. If your content is rendered via JavaScript (SPA), the bot won't see part of the page. This is especially critical for e-commerce and news sites. Use code splitting and consider SSR/SSG.
How do I enable gzip/brotli compression?
Nginx: gzip on; gzip_types text/css application/javascript;
Apache: enable mod_deflate and mod_brotli.
Cloudflare: compression is on by default for all traffic.
Node.js / Express: use the compression package.
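For nginx, the directives above expand to something like the following (an illustrative fragment, not a drop-in config: brotli needs the ngx_brotli module, and the MIME list and cache lifetime should match your assets):

```nginx
# http {} context — illustrative values, tune for your stack
gzip on;
gzip_comp_level 6;
gzip_min_length 1024;
gzip_types text/css application/javascript application/json image/svg+xml;

# requires the ngx_brotli module
brotli on;
brotli_comp_level 6;
brotli_types text/css application/javascript application/json image/svg+xml;

# inside a server {} block: long-lived caching for fingerprinted assets
location ~* \.(css|js)$ {
    add_header Cache-Control "public, max-age=31536000, immutable";
}
```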
Why does the page return 200, but the Googlebot accessibility check shows an error?
Some servers detect the User-Agent and block bots via 403/503 or return empty content. This can also happen via Cloudflare Bot Fight Mode, Fail2Ban, or WAF rules. Check your firewall and bot protection settings.
Does file size affect Core Web Vitals?
Not directly, but a large JS file increases parse and execution time, which worsens LCP and TBT. Large CSS blocks rendering. Both metrics feed into Google's Page Experience ranking signals.
💼

Need an SEO audit, web development, or custom tools?

We build high-performance web products and SEO solutions. Ivatech agency is available for new projects on Upwork.

Work with Ivatech on Upwork