Sitemap Exceeds 50MB / 50,000 URL Limit

Updated April 2026 · By SitemapFixer Team

Google enforces two hard limits on sitemap files: 50,000 URLs and 50MB uncompressed. When your sitemap crosses either threshold, Google logs "Sitemap is too large" in Search Console and processes none of the URLs inside - your entire content catalog loses its sitemap-driven discovery until you split the file.

Check your sitemap against Google limits
We report URL count, uncompressed size, and recommended split strategy
Analyze My Sitemap

What is this error?

A sitemap exceeds Google's limits when it contains more than 50,000 <url> entries or is larger than 50MB uncompressed (the gzipped size doesn't matter - the decompressed bytes are what count). Search Console reports "Sitemap is too large" or "Sitemap is too big." The file is rejected entirely - Google does not partially process it.

Why does it happen?

This hits e-commerce sites with large catalogs (100k+ SKUs), marketplaces with many user-generated listings, news publishers with deep archives, and documentation sites with many versioned pages. Often the sitemap stays under 50k URLs but crosses 50MB because of long URLs, image/video extensions, or detailed <news:news> metadata that inflates per-entry size.

Why does it hurt SEO?

Complete rejection: if Google can't process the sitemap, every URL inside loses its discovery and lastmod boost. For large sites this is catastrophic - tens of thousands of pages suddenly rely only on internal link discovery, which is slower and less reliable. Indexing coverage gradually decays over weeks as Google's internal index of the site goes stale.

How to detect it

Check your sitemap's URL count with grep -o "<loc>" sitemap.xml | wc -l (safer than grep -c, which counts matching lines and undercounts sitemaps written on a single line) and its size with ls -lh sitemap.xml (uncompressed; gunzip .gz files first, since Google measures the decompressed size). Sitemap Fixer reports both metrics automatically, and if you're over either limit, it computes an optimal split into balanced child sitemaps you can pass to your generator.
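The same check can be scripted. Here is a minimal Python sketch (the limits are Google's documented values; the function name and return shape are our own) that counts <url> entries and measures the uncompressed size, handling gzipped files:

```python
import gzip
import re

MAX_URLS = 50_000
MAX_BYTES = 50 * 1024 * 1024  # 50MB, measured uncompressed

def check_sitemap(path):
    """Return (url_count, uncompressed_bytes, within_limits) for a sitemap file."""
    # Google counts the decompressed bytes, so gunzip .gz files before measuring.
    if path.endswith(".gz"):
        with gzip.open(path, "rb") as f:
            data = f.read()
    else:
        with open(path, "rb") as f:
            data = f.read()
    # Count <url> opening tags rather than lines, so single-line sitemaps
    # produced by many generators are counted correctly.
    url_count = len(re.findall(rb"<url>", data))
    size = len(data)
    return url_count, size, url_count <= MAX_URLS and size <= MAX_BYTES
```

Run it against each sitemap file you serve; if `within_limits` is False on either axis, the file needs to be split.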

How to fix it

1. Split the URLs into multiple child sitemaps (e.g., sitemap-posts-1.xml, sitemap-posts-2.xml).
2. Keep each child under 45,000 URLs and 45MB to leave safety headroom.
3. Create a sitemap_index.xml that lists each child with its own <loc> and <lastmod>.
4. Organize children by type (products, categories, posts) for easier per-segment monitoring.
5. Reference the index (not individual children) in robots.txt and Search Console.
6. Automate splitting in your build pipeline so growth doesn't re-trigger the limit.
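The splitting step can be automated with a short script. This is a simplified sketch (file names, the chunk size, and the function signature are illustrative; a production version would also carry over <lastmod> and any image/news extensions per URL):

```python
from xml.sax.saxutils import escape

def write_split_sitemaps(urls, base_url, prefix="sitemap-posts", chunk=45_000):
    """Split `urls` into child sitemaps of at most `chunk` entries each and
    write a sitemap_index.xml referencing every child. Returns child filenames."""
    children = []
    for i in range(0, len(urls), chunk):
        name = f"{prefix}-{i // chunk + 1}.xml"
        entries = "".join(
            f"  <url><loc>{escape(u)}</loc></url>\n" for u in urls[i:i + chunk]
        )
        with open(name, "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
                    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
                    f"{entries}</urlset>\n")
        children.append(name)
    # The index is what you submit to Search Console; it points at each child.
    index = "".join(
        f"  <sitemap><loc>{escape(base_url + '/' + name)}</loc></sitemap>\n"
        for name in children
    )
    with open("sitemap_index.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
                '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
                f"{index}</sitemapindex>\n")
    return children
```

The 45,000-entry chunk (rather than 50,000) mirrors step 2's headroom, so routine catalog growth doesn't immediately re-trigger the limit.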

Real-world example

A marketplace had 128,000 listings in a single sitemap.xml and saw "Sitemap is too large" in GSC for 3 weeks. After splitting into 3 child sitemaps under an index (sitemap-listings-1/2/3.xml), Google began processing all 128,000 URLs within 48 hours and indexed coverage rose from 41,000 to 98,000 over the following 6 weeks.

Frequently Asked Questions

What are Google's sitemap size limits?
Each sitemap file can contain up to 50,000 URLs and must not exceed 50MB uncompressed. Gzipped size doesn't count - it's the decompressed file size that matters. Hit either limit and Google stops processing the file.
How do I split a sitemap that's too large?
Create a sitemap index file (sitemap_index.xml) that points to multiple child sitemaps, each under 50k URLs. Organize children by content type (posts, products, categories) so you can also see indexing coverage per segment in Search Console.
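A minimal index file looks like this (the hostname, filenames, and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap-posts-1.xml</loc>
    <lastmod>2026-04-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-products-1.xml</loc>
    <lastmod>2026-04-01</lastmod>
  </sitemap>
</sitemapindex>
```

Note the root element is <sitemapindex> with <sitemap> children, not <urlset> with <url> children.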
Can I submit a sitemap index to Google Search Console?
Yes. Submit just the index URL - Google will discover and fetch all child sitemaps automatically. The Sitemaps report in Search Console shows aggregated coverage plus per-child breakdowns.
Fix this in your sitemap now
Enter your domain and get a full sitemap audit in 60 seconds
Analyze My Sitemap Free