What this does
Every few hours the workflow reads your sitemap, asks Google which of those URLs are actually indexed, and submits the ones that aren't straight to Google's Indexing API. Every submission is logged so the workflow doesn't waste API quota re-submitting the same page over and over.
About 2-5 minutes per run. Most runs do nothing because everything is already indexed – the workflow only burns time when there are genuinely new pages to push.
The problem this solves
Google says it crawls your site automatically. In practice, new pages can sit unindexed for two weeks before Googlebot gets around to them – sometimes longer if your site has low crawl budget. For a site publishing 5-10 new pages a week, that's 5-10 pages earning zero traffic for half a month each.
The fix is the Google Indexing API. You can submit a URL directly and Google usually crawls it within hours instead of weeks. The problem is doing it consistently:
- You publish a page. You're meant to copy-paste the URL into GSC's URL Inspection tool, click "Request Indexing", wait for the "URL submitted" confirmation. Five minutes per page.
- You forget. The page sits unindexed for a month before someone notices traffic isn't coming.
- You remember sometimes. The pages you remembered to submit get indexed in a day. The ones you forgot don't. Your indexation rate becomes random.
The workflow makes "submit every new URL to Google" a thing that happens whether you remember or not. It reads your sitemap (which is already authoritative – your CMS generates it), checks what's indexed, submits what isn't, logs everything for audit. You don't think about indexing again.
For sites with 100+ pages a quarter, this gets pages earning traffic 5-15 days sooner on average, which compounds into real money over a year.
What you put in
Two config values, set once:
- The sitemap URL (usually https://yourdomain.com/sitemap.xml)
- The GSC property the site is verified under (e.g. https://yourdomain.com/)
Plus a service account with GSC access to your property + the Indexing API enabled in Google Cloud. The setup doc walks you through this – it's 10 minutes of clicking in Google Cloud Console.
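For reference, the configuration boils down to something like the sketch below. The field names are illustrative – they aren't the workflow's actual node parameters – and the two defaults (90-day window, 14-day debounce) are the ones described further down:

```typescript
// Illustrative config shape – field names are assumptions, not the workflow's actual parameters.
interface WorkflowConfig {
  sitemapUrl: string;        // the sitemap the workflow reads every run
  gscProperty: string;       // the verified GSC property the URLs belong to
  maxAgeDays: number;        // only consider URLs published in the last N days (default 90)
  resubmitAfterDays: number; // debounce window: skip URLs submitted in the last N days (default 14)
}

const config: WorkflowConfig = {
  sitemapUrl: "https://yourdomain.com/sitemap.xml",
  gscProperty: "https://yourdomain.com/",
  maxAgeDays: 90,
  resubmitAfterDays: 14,
};
```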
That's it. The workflow runs on its own from there.
What you get out
Every run produces:
- A list of URLs that were not indexed and got submitted to Google
- A Google Sheet row per submission – URL, date, response from the API, status (success / quota-exceeded / error)
- Nothing in your inbox unless something errors. The workflow is meant to be invisible when it's working.
You can open the sheet any time to see indexing trends: how many pages got pushed last month, which ones failed, whether your sitemap is generating expected URL volume.
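If you want to build your own reporting on top of the sheet, each submission row is a flat record. A sketch of its shape – the column names here are illustrative, the template's exact headers may differ:

```typescript
// Illustrative shape of one row in the submission-history sheet.
interface SubmissionRow {
  url: string;                                    // the URL that was submitted
  submittedAt: string;                            // date of the submission (ISO string)
  apiResponse: string;                            // raw response from the Indexing API
  status: "success" | "quota-exceeded" | "error"; // outcome recorded for the run
}
```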
How long per run
Workflow time: 2-5 minutes per scheduled run. Most of it is waiting on GSC's URL Inspection API, which is rate-limited.
Your time: zero, after setup. The workflow runs on the schedule you pick (every 2 hours, every day, whatever fits the publishing pace). No notification spam by default.
End-to-end per month for a typical site (~50 new URLs): about 30-60 minutes of workflow runtime spread across the month. Your time: zero.
When this is a good fit
- You publish at least 4-5 new pages a week. Below that, manual submission via GSC is fine.
- Your sitemap is up to date – generated by your CMS or by a tool you trust. Garbage sitemap = garbage submissions.
- You're on Google's good side. The Indexing API has rules: don't submit duplicate URLs in tight loops, don't submit pages that are clearly low-quality. The workflow respects rate limits, but it can't fix a site that's publishing thin pages.
- You'd rather get pages indexed in days, not weeks. For some sites this is a competitive edge, especially in news-adjacent and rapid-publishing niches.
When this isn't a good fit
- Your site publishes a handful of pages a quarter. Manual submission for 5-10 pages is faster than building this.
- You don't have GSC verified. The whole thing requires it.
- You need guaranteed support for pages that aren't job postings or livestreams. The Indexing API officially supports job postings + livestreams. For regular pages it still mostly works, but Google may deprioritize those requests at any time. We use it for blog content + landing pages and it works – just not guaranteed forever.
- You're trying to fix a site-wide indexation problem that comes from low page quality, duplicate content, or thin pages. Submission won't help; the page will get crawled and rejected. Fix the underlying issues first.
What's actually under the hood
The workflow runs on n8n. The shape:
- Schedule trigger (every 2 hours by default)
- Fetch sitemap XML over HTTP
- Parse XML to JSON. If the sitemap is a sitemap index, recurse into each child sitemap (sketched after this list).
- Filter to URLs published in the last N days (configurable – default 90)
- Check submission history in Google Sheets – skip anything already submitted in the last 14 days
- For each remaining URL: call GSC's URL Inspection API. Did Google index it?
- Filter to URLs Google says are NOT indexed
- For each: call Google's Indexing API with a URL_UPDATED request
- Add a delay between submissions to stay under the rate limit
- Log each submission to the history sheet
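A minimal sketch of the sitemap half of that list – fetch, parse, recurse into a sitemap index, apply the age filter. It assumes the fast-xml-parser package and Node 18+ fetch; in the actual workflow the same steps are n8n HTTP Request and XML nodes:

```typescript
import { XMLParser } from "fast-xml-parser";

const parser = new XMLParser();

// fast-xml-parser returns a single object (not an array) when an element appears only once.
const asArray = <T>(x: T | T[] | undefined): T[] =>
  x === undefined ? [] : Array.isArray(x) ? x : [x];

type SitemapEntry = { loc: string; lastmod?: string };

// Fetch a sitemap and return every <loc> with its <lastmod>.
// If the document is a sitemap index, recurse into each child sitemap.
async function collectUrls(sitemapUrl: string): Promise<SitemapEntry[]> {
  const xml = await (await fetch(sitemapUrl)).text();
  const doc: any = parser.parse(xml);

  if (doc.sitemapindex) {
    // Sitemap index: <sitemapindex><sitemap><loc>…</loc></sitemap>…
    const children = asArray<any>(doc.sitemapindex.sitemap);
    const nested = await Promise.all(children.map((s) => collectUrls(s.loc)));
    return nested.flat();
  }

  // Regular sitemap: <urlset><url><loc>…</loc><lastmod>…</lastmod></url>…
  return asArray<any>(doc.urlset?.url).map((u) => ({ loc: u.loc, lastmod: u.lastmod }));
}

// Keep only URLs whose <lastmod> falls inside the configured window (default 90 days).
function recentUrls(urls: SitemapEntry[], maxAgeDays = 90): SitemapEntry[] {
  const cutoff = Date.now() - maxAgeDays * 24 * 60 * 60 * 1000;
  return urls.filter((u) => u.lastmod && new Date(u.lastmod).getTime() >= cutoff);
}
```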
There's also a fallback path: if a URL was submitted but Google's response was "Quota exceeded", the workflow queues it for retry next run.
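The two Google API calls from the list, sketched with plain fetch. It assumes you already have a service-account access token with the Search Console and Indexing API scopes (in the workflow the n8n Google credential handles that); the endpoints and request bodies are the documented ones for the URL Inspection API and the Indexing API:

```typescript
// Ask GSC's URL Inspection API whether Google currently has this URL indexed.
async function isIndexed(url: string, siteUrl: string, token: string): Promise<boolean> {
  const res = await fetch("https://searchconsole.googleapis.com/v1/urlInspection/index:inspect", {
    method: "POST",
    headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
    body: JSON.stringify({ inspectionUrl: url, siteUrl }),
  });
  const data = await res.json();
  // A verdict of PASS means the URL is on Google; anything else becomes a submission candidate.
  return data?.inspectionResult?.indexStatusResult?.verdict === "PASS";
}

// Submit one URL to the Indexing API as a URL_UPDATED notification.
// Returns a status that maps onto the sheet's success / quota-exceeded / error column.
async function submitUrl(url: string, token: string): Promise<"success" | "quota-exceeded" | "error"> {
  const res = await fetch("https://indexing.googleapis.com/v3/urlNotifications:publish", {
    method: "POST",
    headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
    body: JSON.stringify({ url, type: "URL_UPDATED" }),
  });
  if (res.ok) return "success";
  if (res.status === 429) return "quota-exceeded"; // daily quota hit – queue for retry next run
  return "error";
}

// The pause between submissions that keeps a batch of new URLs under the rate limit.
const pause = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));
```

The 429 branch is what feeds the retry queue: the URL is logged as quota-exceeded instead of success, and the next run picks it up again.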
The reason this is a workflow and not a one-off script is the history-aware logic. A naive version submits every URL every run, burns its daily quota in 10 minutes, and gets you nothing. The history check + 14-day debounce means submissions actually get spread out so the quota lasts the day.
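A sketch of that debounce, assuming the history rows have already been read from the sheet; the function and field names are illustrative:

```typescript
// One row from the submission-history sheet (see "What you get out" above).
type HistoryRow = { url: string; submittedAt: string };

// Drop candidate URLs that were already submitted within the last `windowDays` days (default 14).
// This is the check that stops the daily quota being burned on the same URLs every run.
function notRecentlySubmitted(candidates: string[], history: HistoryRow[], windowDays = 14): string[] {
  const cutoff = Date.now() - windowDays * 24 * 60 * 60 * 1000;
  const recentlySubmitted = new Set(
    history
      .filter((row) => new Date(row.submittedAt).getTime() >= cutoff)
      .map((row) => row.url),
  );
  return candidates.filter((url) => !recentlySubmitted.has(url));
}
```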
What you own at handover
- The full n8n workflow file
- The Google Sheet template (submission history + a dashboard tab)
- A setup doc walking through service account creation in Google Cloud + GSC permission grants
- A runbook for the common errors (quota exceeded, permission denied, sitemap-not-reachable)
- A Loom showing one full run from sitemap to submission
- Optional add-on: email alert if the daily quota gets exhausted (it means your sitemap is generating URLs faster than the API can absorb – a useful early signal)
Why I can help
The mechanics of this workflow are simple. n8n + two Google API calls. What earns its keep is the surrounding logic that makes the workflow robust:
- The 14-day debounce so you don't re-submit the same URL hundreds of times and waste quota. Sounds obvious. Naive versions skip it.
- The sitemap-index recursion so the workflow works for multi-sitemap sites without rewriting.
- The "submitted but quota exceeded" retry queue so a bad day doesn't lose URLs.
- The age filter so you don't waste quota on pages already indexed for 2 years.
Each of these came from running the workflow at scale and watching it fail in a specific way. The version you get already has each fix baked in.
What it costs to run
Per run: free. Google's Indexing API + GSC URL Inspection API are both free within their quotas (200 submissions per day for the Indexing API; a separate, much higher daily quota for inspection). n8n hosting if you self-host: about $5/month on a small VPS.
Build cost: 1-3 days of my time – the setup is mostly Google Cloud service-account plumbing + sheet wiring, not custom prompt work.
How to start
Book a call. Bring your sitemap URL + access to your GSC. We'll set up a service account on the call. The workflow can be running on a schedule by the end of the same week.
More SEO automations
Keyword cannibalization detection
Watch a keyword sheet, pull 30 days of GSC data for each target client, and use GPT-4o to flag which keywords have multiple pages competing – with remediation steps written back to the sheet.
Read the build →
Content decay detection
A weekly GSC pull that compares the last 7 days to the previous 28, classifies every page by decay severity, and sends a Slack + email digest of what to refresh.
Read the build →
SEO competitor research
Track competitor publishing cadence, ranking shifts, and content gaps automatically.
Page coming soon