Discovered - currently not indexed
"Discovered - currently not indexed" is one of the most frustrating Google Search Console statuses because it means Google knows your URL exists (it found it in your sitemap or through a link) but has deliberately chosen not to crawl it yet. Google typically explains this by saying it delayed the request to avoid overloading your site. In practice, it almost always signals a crawl budget, site quality, or server capacity problem.
What this GSC status means
Google knows about the URL - it is in Google's known URLs list - but it has not been fetched yet. Google explicitly states: "Typically, Google wanted to crawl the URL but this was expected to overload the site; therefore Google rescheduled the crawl." In reality, the rescheduling signal also correlates with low internal URL priority, poor site authority, and template-heavy or thin content patterns. Google is making a deliberate decision to not spend crawl budget on this URL right now.
Common causes
- Slow server response times (TTFB over 600ms) that make Google throttle crawl rate.
- Bloated sitemaps listing thousands of URLs, many of which are thin or duplicative.
- Weak internal linking - the URL is in the sitemap but has zero or few internal links.
- Low overall site authority (new domain, few backlinks) - Google's crawl budget is partly a function of PageRank.
- A burst of new URLs that exceeds your normal publishing volume, signaling a potential spam pattern.
- URLs that look like auto-generated patterns (tag/filter/search parameters).
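The last cause is easy to check mechanically: scan your submitted URLs for filter, search, or tag patterns. A minimal sketch in Python - the parameter and path lists are assumptions for illustration, so adjust them to your own URL scheme:

```python
from urllib.parse import urlparse, parse_qs

# Query parameters and path segments that often mark auto-generated,
# low-value URLs. These sets are assumptions -- tune them to your site.
SUSPECT_PARAMS = {"tag", "filter", "sort", "page", "q", "s", "search"}
SUSPECT_SEGMENTS = {"tag", "tags", "search"}

def looks_auto_generated(url: str) -> bool:
    """Heuristic: flag URLs with filter/search parameters or tag paths."""
    parsed = urlparse(url)
    params = set(parse_qs(parsed.query).keys())
    segments = {seg for seg in parsed.path.lower().split("/") if seg}
    return bool(params & SUSPECT_PARAMS) or bool(segments & SUSPECT_SEGMENTS)

urls = [
    "https://example.com/blog/how-to-fix-indexing",
    "https://example.com/shop?filter=red&sort=price",
    "https://example.com/tag/seo",
]
print([u for u in urls if looks_auto_generated(u)])
# prints ['https://example.com/shop?filter=red&sort=price', 'https://example.com/tag/seo']
```

If a large share of your sitemap trips a heuristic like this, prune those URLs before worrying about server speed or authority.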
How it affects indexing
The URLs do not rank at all - they are not in the index. If it is a temporary queue, they will be crawled and likely indexed in the coming days or weeks. But when the bucket is large and stagnant (hundreds or thousands of URLs sitting there for months) it indicates Google has deprioritized most of your content. New publishing without authority growth will just add to the queue.
How to diagnose
In GSC, open the Page indexing report and click "Discovered - currently not indexed". Look at the count and the URLs - are they your highest-value pages or low-priority ones? Check Settings > Crawl stats for average response time and host issues. Run a site crawl and compare the crawled URL set against your sitemap to find orphan pages (URLs in the sitemap with zero internal links). Verify server response time is under 500ms on a representative sample of URLs.
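The orphan-page check boils down to set arithmetic once you have two URL lists: everything in the sitemap versus everything your crawler found an internal link to. A hedged sketch in Python, assuming you can export the linked-URL set from whatever crawler you use:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text: str) -> set[str]:
    """Extract <loc> values from a standard sitemap file."""
    root = ET.fromstring(xml_text)
    return {loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")}

def orphan_pages(sitemap: set[str], internally_linked: set[str]) -> set[str]:
    """URLs submitted in the sitemap that no internal link points to."""
    return sitemap - internally_linked

# Inline example sitemap; in practice, fetch the real file and export
# the linked-URL set from your crawl data.
xml_text = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/a</loc></url>
  <url><loc>https://example.com/b</loc></url>
</urlset>"""
linked = {"https://example.com/a"}
print(orphan_pages(sitemap_urls(xml_text), linked))
# prints {'https://example.com/b'}
```

Every URL this surfaces is a page you are asking Google to crawl while signaling, through your own linking, that it does not matter.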
How to fix
1. Clean your sitemap - remove thin, duplicate, redirecting, and non-canonical URLs. Only submit high-quality canonicals.
2. Check Crawl Stats in GSC and work on server TTFB - aim for under 500ms average.
3. Add internal links from high-traffic, high-authority pages to each stuck URL (not just from the footer).
4. Consolidate or noindex thin pages so they do not dilute the quality signal for the whole site.
5. For critical URLs, use URL Inspection > Request Indexing (limit ~10/day).
6. Build topical clusters - a hub page linking to related posts lifts every URL in the cluster.
7. Earn external backlinks to the site overall - higher authority = bigger crawl budget.
8. Split very large sitemaps by content type (products vs blog vs categories) to help Google prioritize.
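The sitemap split in the last step can be automated by bucketing URLs on path prefix and writing one sitemap per bucket. A minimal sketch, assuming your URL paths encode content type - the prefixes here are illustrative, not a standard:

```python
from urllib.parse import urlparse

# Path-prefix -> sitemap bucket. These prefixes are assumptions; map
# them to however your site actually organizes its URLs.
BUCKETS = {"/products/": "products", "/blog/": "blog", "/category/": "categories"}

def bucket_for(url: str) -> str:
    path = urlparse(url).path
    for prefix, name in BUCKETS.items():
        if path.startswith(prefix):
            return name
    return "other"

def split_sitemap(urls: list[str]) -> dict[str, list[str]]:
    """Group URLs into per-type buckets, one future sitemap file each."""
    buckets: dict[str, list[str]] = {}
    for url in urls:
        buckets.setdefault(bucket_for(url), []).append(url)
    return buckets

urls = [
    "https://example.com/products/widget",
    "https://example.com/blog/crawl-budget",
    "https://example.com/about",
]
print(split_sitemap(urls))
```

Write each bucket to its own file (sitemap-products.xml, sitemap-blog.xml, and so on) and reference them from a sitemap index; GSC then reports indexing coverage per file, which makes it obvious which content type is stuck.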