Google Search Console API: Developer Guide
The Google Search Console API gives developers programmatic access to the same data available in the GSC web interface — and more. It removes the 1,000-row UI limit, enables scheduled data collection, supports automation workflows, and is the foundation for custom SEO dashboards and monitoring tools. This guide covers everything from initial setup to building production-ready data pipelines.
What the GSC API Provides
The Search Console API exposes three major capabilities. The Search Analytics API returns query, page, country, and device performance data — clicks, impressions, CTR, and position — with the same 16-month retention window as the UI. The URL Inspection API lets you programmatically inspect any URL in a property you have access to for its indexing status, coverage state, mobile usability, and AMP validity. The Sitemaps API allows you to list submitted sitemaps, submit new ones, and delete existing ones — all without logging into the interface. Together, these cover most of what an SEO or developer would need to automate GSC workflows.
Setting Up API Access in Google Cloud
Start by going to the Google Cloud Console (console.cloud.google.com) and creating a new project or selecting an existing one. Enable the "Google Search Console API" from the API Library. Then create credentials — for server-side scripts, create a Service Account and download the JSON key file; for user-facing apps, create an OAuth 2.0 Client ID. Finally, add the service account email (ending in @[project].iam.gserviceaccount.com) as a user on your GSC property (Settings > Users and permissions) with at least Restricted permission, which grants read access. This delegation step is what many tutorials skip — without it, the service account has no access to your property data despite having valid API credentials.
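As a minimal sketch of the service-account path, here's how a script might authenticate using the google-auth and google-api-python-client libraries. The key file name is a placeholder for whatever you downloaded from Google Cloud:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholder path to the JSON key downloaded from Google Cloud
KEY_FILE = "service-account-key.json"
SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]

# Load the service account credentials with the read-only GSC scope
credentials = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=SCOPES
)

# Build a Search Console client (v1 covers Search Analytics, Sitemaps,
# and URL Inspection)
service = build("searchconsole", "v1", credentials=credentials)

# Sanity check: list the properties this service account can see.
# An empty list usually means the delegation step above was skipped.
print(service.sites().list().execute())
```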
OAuth 2.0 Authentication Flow
For user-delegated access (the user authorizes your app to access their GSC data), implement the OAuth 2.0 authorization code flow. Redirect the user to Google's authorization endpoint with the scope https://www.googleapis.com/auth/webmasters.readonly (read-only) or https://www.googleapis.com/auth/webmasters (full access). After the user grants access, Google redirects back to your callback URL with an authorization code, which you exchange for an access token and refresh token. Store the refresh token securely — it allows you to obtain new access tokens without requiring the user to re-authorize. Access tokens expire after 1 hour; use the refresh token to obtain a new one automatically.
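A sketch of this flow using the google-auth-oauthlib library follows; the client_secret.json filename and callback URL are placeholders, and the redirect URI must match one registered on your OAuth client:

```python
from google_auth_oauthlib.flow import Flow

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]

# client_secret.json is the OAuth 2.0 Client ID file from Google Cloud
flow = Flow.from_client_secrets_file(
    "client_secret.json",
    scopes=SCOPES,
    redirect_uri="https://example.com/oauth/callback",  # placeholder
)

# Step 1: send the user to Google's consent screen. access_type="offline"
# requests a refresh token; prompt="consent" ensures one is issued even
# if the user has authorized before.
auth_url, _ = flow.authorization_url(access_type="offline", prompt="consent")
print("Visit:", auth_url)

# Step 2: in your callback handler, exchange the returned code for tokens:
# flow.fetch_token(code=code_from_callback)
# credentials = flow.credentials  # .token, .refresh_token, .expiry
```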
Search Analytics API: Parameters and Filters
The Search Analytics API uses a POST request to the endpoint: https://searchconsole.googleapis.com/webmasters/v3/sites/SITE_URL/searchAnalytics/query (where SITE_URL is your URL-encoded property). The request body specifies startDate, endDate, dimensions (an array of "query", "page", "country", "device", "date"), rowLimit (max 25,000), and optionally dimensionFilterGroups to apply query or page filters. The response includes a rows array where each row contains keys (the dimension values) and the four metrics. Always request only the dimensions you need — adding dimensions multiplies the result set and can inflate your row count significantly, requiring more pagination.
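Here's what such a request looks like through the Python client, assuming the authenticated `service` object from the earlier sketch and a placeholder property URL:

```python
# Assumes `service` is an authenticated Search Console client and that
# https://www.example.com/ is a property you have access to.
SITE_URL = "https://www.example.com/"

request = {
    "startDate": "2024-01-01",
    "endDate": "2024-01-31",
    "dimensions": ["query", "page"],  # request only what you need
    "rowLimit": 25000,                # API maximum per request
    "dimensionFilterGroups": [{
        "filters": [{
            "dimension": "page",
            "operator": "contains",
            "expression": "/blog/",
        }]
    }],
}

response = service.searchanalytics().query(
    siteUrl=SITE_URL, body=request
).execute()

# Each row's keys align with the dimensions array, in order
for row in response.get("rows", []):
    query, page = row["keys"]
    print(query, page, row["clicks"], row["impressions"],
          row["ctr"], row["position"])
```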
URL Inspection API
The URL Inspection API (https://searchconsole.googleapis.com/v1/urlInspection/index:inspect) accepts a POST body with inspectionUrl and siteUrl. The response includes the index verdict (PASS, FAIL, or NEUTRAL) alongside a human-readable coverageState, the canonical URL Google selected, mobile usability status, crawl details including last crawl time and HTTP response code, and rich result eligibility. This is particularly powerful for monitoring: build a script that runs URL Inspection on your most important pages weekly and alerts you if any page's verdict changes from PASS to FAIL. The API quota is 2,000 requests per day per property, so prioritize your critical pages.
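A minimal inspection call through the same client looks like this; the page and property URLs are placeholders:

```python
# Assumes `service` is an authenticated searchconsole v1 client and the
# URLs below are placeholders for pages in your verified property.
body = {
    "inspectionUrl": "https://www.example.com/important-page/",
    "siteUrl": "https://www.example.com/",
}

result = service.urlInspection().index().inspect(body=body).execute()
index_status = result["inspectionResult"]["indexStatusResult"]

print(index_status["verdict"])             # PASS, FAIL, or NEUTRAL
print(index_status.get("coverageState"))   # human-readable coverage string
print(index_status.get("lastCrawlTime"))   # when Googlebot last crawled
```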
Sitemap Management via API
The Sitemaps API exposes three operations: list (GET all sitemaps submitted to a property with their status and last read time), submit (PUT a new sitemap URL to a property), and delete (DELETE a sitemap from GSC — note this only removes it from GSC, not from your server). Use the list endpoint to monitor whether your sitemaps are being read successfully and whether Google is reporting errors. The lastDownloaded field tells you when Googlebot last fetched the sitemap, and the errors and warnings counts tell you if there are sitemap-level issues. Automate a weekly check on sitemap health as part of your SEO monitoring infrastructure.
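A sketch of all three operations with the Python client, again with placeholder URLs:

```python
# Assumes `service` is an authenticated client; URLs are placeholders.
SITE_URL = "https://www.example.com/"
SITEMAP_URL = "https://www.example.com/sitemap.xml"

# Submit (or resubmit) a sitemap -- an HTTP PUT under the hood
service.sitemaps().submit(siteUrl=SITE_URL, feedpath=SITEMAP_URL).execute()

# List all submitted sitemaps and check their health
sitemaps = service.sitemaps().list(siteUrl=SITE_URL).execute()
for sm in sitemaps.get("sitemap", []):
    print(sm["path"], sm.get("lastDownloaded"),
          sm.get("errors"), sm.get("warnings"))

# Remove a sitemap from GSC (does not touch the file on your server)
# service.sitemaps().delete(siteUrl=SITE_URL, feedpath=SITEMAP_URL).execute()
```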
Rate Limits and Quotas
The Search Console API has a default quota of 1,200 queries per minute per site and 1,200 queries per minute per user. Each API call counts as one query regardless of the rowLimit requested. For the URL Inspection API, the limit is 2,000 requests per day per property, shared across all users and applications. If you exceed rate limits, the API returns a 429 response — implement exponential backoff in your code to handle this gracefully. For large-scale data collection (pulling months of data across many dimensions), spread requests over multiple hours rather than hammering the API in a single burst.
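One way to implement that backoff is a small retry wrapper around any client request; the retry counts and status codes here are reasonable defaults, not prescribed values:

```python
import random
import time

from googleapiclient.errors import HttpError

def execute_with_backoff(request, max_retries=5):
    """Execute a googleapiclient request, retrying 429/5xx with backoff."""
    for attempt in range(max_retries):
        try:
            return request.execute()
        except HttpError as err:
            if err.resp.status in (429, 500, 503) and attempt < max_retries - 1:
                # Exponential backoff with jitter: ~1s, 2s, 4s, 8s...
                time.sleep(2 ** attempt + random.random())
            else:
                raise

# Usage (assumes `service`, SITE_URL, and `request` from earlier examples):
# response = execute_with_backoff(
#     service.searchanalytics().query(siteUrl=SITE_URL, body=request)
# )
```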
Pulling More Than 25,000 Rows
The Search Analytics API maxes out at 25,000 rows per request, but you can paginate using the startRow parameter. Set startRow to 0 for the first request, 25000 for the second, 50000 for the third, and so on. Continue until the response returns fewer rows than your rowLimit — that signals you've reached the end of the dataset. For very large sites, also consider splitting requests by date: instead of pulling 90 days in one request, pull 7-day windows sequentially. This reduces row count per request and makes it easier to store and process data incrementally. The BigQuery export remains the best option for truly exhaustive data collection at scale.
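The pagination loop is straightforward; here's a sketch that collects every row for one date range, building on the earlier examples:

```python
def fetch_all_rows(service, site_url, start_date, end_date, dimensions):
    """Paginate through Search Analytics results using startRow."""
    row_limit = 25000  # API maximum per request
    start_row = 0
    all_rows = []
    while True:
        body = {
            "startDate": start_date,
            "endDate": end_date,
            "dimensions": dimensions,
            "rowLimit": row_limit,
            "startRow": start_row,
        }
        response = service.searchanalytics().query(
            siteUrl=site_url, body=body
        ).execute()
        rows = response.get("rows", [])
        all_rows.extend(rows)
        # A short (or empty) page signals the end of the dataset
        if len(rows) < row_limit:
            return all_rows
        start_row += row_limit
```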
Building a GSC Data Pipeline
A production GSC data pipeline has four components: a scheduler (cron job, Cloud Scheduler, or GitHub Actions) that triggers daily; a Python or Node.js script that authenticates, paginates through the API, and writes data to a destination; a storage layer (BigQuery, PostgreSQL, or Google Sheets) where data accumulates over time; and a reporting layer (Looker Studio, Metabase, or a custom dashboard) that reads from the storage layer. Keep your pipeline idempotent — if it runs twice for the same date, it should upsert rather than duplicate rows. Include error logging and alerting so you know immediately if an API call fails or credentials expire.
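To make the idempotency requirement concrete, here's a minimal storage step using SQLite as a stand-in for your real warehouse; the table schema and column choices are illustrative assumptions. The composite primary key makes re-runs safe, since INSERT OR REPLACE overwrites rather than duplicates:

```python
import sqlite3

conn = sqlite3.connect("gsc.db")  # placeholder database file
conn.execute("""
    CREATE TABLE IF NOT EXISTS search_analytics (
        date TEXT, query TEXT, page TEXT,
        clicks INTEGER, impressions INTEGER, ctr REAL, position REAL,
        PRIMARY KEY (date, query, page)
    )
""")

def upsert_rows(rows, date):
    # `rows` as returned by the Search Analytics API with
    # dimensions=["query", "page"]; rerunning for the same date
    # replaces existing rows instead of duplicating them.
    conn.executemany(
        "INSERT OR REPLACE INTO search_analytics VALUES (?, ?, ?, ?, ?, ?, ?)",
        [
            (date, r["keys"][0], r["keys"][1],
             r["clicks"], r["impressions"], r["ctr"], r["position"])
            for r in rows
        ],
    )
    conn.commit()
```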