What this does
Pick an existing blog post that used to rank and isn't ranking anymore. The workflow scrapes it, then scrapes the current top 10 results for the same keyword. It figures out what those new top-ranking articles cover that yours doesn't, plus what your article still does well. Then it rewrites only the sections that need updating, keeps the rest, and gives you back a refreshed article in a Google Doc.
About 10 minutes from "this old post is decaying" to a draft of the refreshed version.
The problem this solves
Content rot is the slow, predictable killer of SEO traffic. A post you wrote two years ago that ranked #2 is now at position #14. Not because it got worse, but because the search results moved on without you. New articles got published. The reader's expectations shifted. Google's idea of what "answers this query" changed.
The fix is content refreshes. You take old posts, check what the current top-ranking pages cover that yours doesn't, rewrite the gaps, re-publish. Done well, content refreshes are one of the highest-ROI things you can do in SEO – you already have the URL, the backlinks, the indexation. You just need to make the page worth ranking again.
The problem is the cost. A real content refresh – not just "update the date and republish" – takes 3-4 hours per post. You have to read the current top 10 articles, read your own article, do the comparison, decide what to rewrite, then actually rewrite it without breaking the parts that still work.
That works for one or two posts. It doesn't work when you have 80 declining posts in your library. So most teams either don't refresh content, or do shallow "update the date, change two sentences" refreshes that don't actually help.
This workflow does the comparison + rewrite work in 10 minutes per post. Your editor's job becomes reviewing what got rewritten and what was preserved – the high-judgement part – not the grind of doing the comparison from scratch.
What you put in
A Google Sheet row per post you want to refresh:
- The URL of your existing post
- The primary keyword it targets (or used to target)
- The topic – a short phrase that captures what the article is about, mainly there to give GPT context
- Optional: the product / brand you want to keep promoting if the post has product mentions
Set the row's status to "Planned" and the workflow picks it up.
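For concreteness, here's a minimal sketch of one queue row as data. The field names are illustrative stand-ins for this sketch, not the actual sheet headers, which get set up during the build:

```typescript
// Illustrative shape of one refresh-queue row. Field names are
// assumptions for this sketch, not the workflow's real column headers.
type RefreshRow = {
  url: string;       // the existing post to refresh
  keyword: string;   // primary keyword it targets (or used to target)
  topic: string;     // short phrase, mainly there to give GPT context
  product?: string;  // optional: product/brand to keep promoting
  status: "Planned" | "In progress" | "Done";
};

const row: RefreshRow = {
  url: "https://example.com/blog/content-refresh-guide",
  keyword: "content refresh",
  topic: "refreshing decaying blog posts for SEO",
  status: "Planned", // the workflow only picks up "Planned" rows
};
```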
What you get out
A Google Doc with:
- The refreshed article – your original sections that still hold up, plus rewritten sections where the SERP analysis shows gaps
- A summary of what changed and why – which sections got rewritten, which got preserved, what new content was added and based on what
- The article in a clean, edit-ready format with proper H2 / H3 / bullet structure
You review the diff (the "what changed and why" note saves time here), fact-check the new sections, and re-publish to your CMS.
How long per refresh
Workflow time: 8-12 minutes per post. The SERP scraping + 14 sequential GPT calls take most of it.
Your edit time: 30-45 minutes per post. Read the change summary, spot-check the new sections, make voice tweaks, validate the structure still flows.
End-to-end per refresh: about 1 hour of human time. Down from 3-4 hours. At 10 refreshes a month, that's 20-30 hours saved.
When this is a good fit
- You have a blog with at least 30-40 posts older than 12 months. Below that, you don't have enough decaying inventory for refreshes to be the highest-leverage move.
- You can identify which posts are decaying (via GSC, Ahrefs, or your analytics). The workflow doesn't auto-detect – you hand it the URLs you want refreshed.
- You'd rather get 10 mediocre posts back to ranking than write 1 new post. Content refresh ROI usually beats new-content ROI when your existing library is decent.
- You have a style guide or you're willing to align voice during setup. Without one, the rewrites read differently from your originals and the post feels like patchwork.
When this isn't a good fit
- Your blog is brand new. Refreshes are for established libraries.
- You write evergreen content where the SERP doesn't shift much. Some niches (e.g. fundamental engineering tutorials) don't decay the way most B2B content does. The refresh logic doesn't add value.
- You're hoping for "auto-detect decaying posts and auto-rewrite them on a schedule." That's a different workflow. This one starts when you hand it a URL.
- The post is structurally broken (wrong target keyword, no real value, dated for reasons that can't be fixed). The refresh polishes the rough edges. It can't fix a fundamentally weak post.
What's actually under the hood
The workflow runs on n8n with about 14 GPT calls in sequence. Here's the shape (a code sketch follows the list):
- Read the URL + keyword + topic from your sheet
- Scrape the existing article (via Jina)
- Pull the top 10 SERP results for the keyword
- For each of the top 10, scrape + summarize (10 GPT calls, one per article)
- Comparison call: "Here are the top 10 article summaries, and here's the user's existing article. What does the user's article do well? What's it missing that the others now cover? What's it doing that's no longer needed?"
- Decision call per section: "For each section of the user's article, decide: keep as-is, rewrite, expand, or delete?"
- Rewrite calls for each section flagged "rewrite" or "expand" – each one runs with context about what the section needs to cover now
- Compile: stitch the kept sections + the rewritten sections back into a full article
- Write the change summary doc: which sections changed, why, what new content was added
- Output everything to Google Docs
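In code, that sequence looks roughly like the sketch below – scrape, serpTop10, gpt, and writeGoogleDoc are hypothetical stand-ins for the n8n nodes, and the prompts are compressed paraphrases of the real ones:

```typescript
// Hypothetical stand-ins for the n8n nodes – none of these are real APIs.
declare function scrape(url: string): Promise<string>;          // Jina scrape
declare function serpTop10(keyword: string): Promise<string[]>; // SERP API
declare function gpt(prompt: string): Promise<string>;          // one GPT call
declare function writeGoogleDoc(title: string, body: string): Promise<void>;

async function refreshPost(url: string, keyword: string): Promise<void> {
  const original = await scrape(url);
  const serpUrls = await serpTop10(keyword);

  // 10 GPT calls: scrape + summarize each top-ranking article
  const summaries: string[] = [];
  for (const competitor of serpUrls) {
    summaries.push(
      await gpt(`Summarize what this article covers:\n${await scrape(competitor)}`)
    );
  }

  // Comparison call: strengths, gaps, dead weight
  const comparison = await gpt(
    `Top 10 summaries:\n${summaries.join("\n---\n")}\n\nUser's article:\n${original}\n\n` +
    `What does the user's article do well, what is it missing, what is no longer needed?`
  );

  // Decision call: keep / rewrite / expand / delete, per section
  const decisions = await gpt(
    `For each section of the article, decide: keep, rewrite, expand, or delete.\n${comparison}`
  );

  // Simplified: rewrite calls run only for sections flagged "rewrite" or
  // "expand", and the real workflow stitches kept + rewritten sections
  // deterministically rather than in a single compile call.
  const draft = await gpt(`Compile the final article from:\n${decisions}\n${original}`);
  const changeSummary = await gpt(`Summarize what changed and why:\n${decisions}`);

  await writeGoogleDoc("Refreshed article", draft);
  await writeGoogleDoc("Change summary", changeSummary);
}
```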
The reason this works is the per-section decision step. Most "AI content refreshers" just rewrite the entire article. Result: you lose the parts that were already strong, the voice shifts, the post feels like a different article entirely. The decision step preserves what works and only touches what needs touching.
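To make the decision step concrete, its output for one post might look something like this. The sections and reasons are invented for illustration – the real schema is whatever the tuned prompt returns:

```typescript
// Invented example of a per-section decision payload.
const decisions = [
  { heading: "What is a content refresh?", action: "keep",
    reason: "Still matches what the top 10 cover." },
  { heading: "Step-by-step process", action: "rewrite",
    reason: "Top-ranking articles now walk through AI-assisted steps; ours doesn't." },
  { heading: "Why freshness matters", action: "expand",
    reason: "Competitors back this with current examples; ours is thin." },
  { heading: "2022 benchmark data", action: "delete",
    reason: "Dated stats the current SERP no longer rewards." },
] as const;
```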
The diff summary at the end is the other piece that earns its keep. You don't have to compare the old article to the new one yourself – the workflow tells you exactly what changed and why. That cuts the review time in half.
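As an invented illustration, one change summary reads something like:

```
Rewritten: "Step-by-step process" – top-ranking articles now walk through
AI-assisted steps, which the original section didn't cover.
Preserved: "What is a content refresh?" – still aligned with the SERP.
Deleted: "2022 benchmark data" – dated stats the SERP no longer rewards.
```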
What you own at handover
- The full n8n workflow file
- Every GPT prompt in plain text, documented
- The Google Sheet templates (refresh queue + change log)
- The style-guide doc that the workflow uses to keep rewrites on-voice
- Optional add-on: a GSC integration that flags decaying posts automatically (this is its own setup, available if you want it)
- Optional add-on: WordPress / Webflow push from approved refreshes
- A Loom showing the end-to-end refresh loop
- A runbook for the edge cases: what to do when the workflow's "rewrite" decision was wrong, how to override per-section, how to handle posts where the original keyword no longer matches what people search for
Why I can help
Content refreshes are one of those things where the easy version is useless and the right version is expensive to build. The easy version is "scrape the old post, send to GPT, ask it to rewrite." Output: a different article in a slightly different voice that doesn't actually plug the SERP gaps.
The right version requires three things most refresh tools skip:
- A SERP analysis step that knows what the top-ranking articles actually do well (not what they're titled)
- A per-section decision step that distinguishes "this is fine, keep it" from "this needs rewriting" from "this section shouldn't exist anymore"
- A diff summary so the editor can review changes without re-reading the whole article
I built this after running content refreshes manually for two clients in 2024 and realising the per-section decision was the part nobody automates. The prompts encode the rules I worked out by hand – when to keep, when to rewrite, when to delete.
What it costs to run
Per refresh: about $0.25-$0.60 in OpenAI tokens (14 GPT calls, mostly cheap GPT-4.1-mini). SERP scraping API: $0.05-$0.10 per refresh. Total: under $1 per article refresh.
Build cost: 2+ weeks of my time to wire the workflow, tune the comparison + decision prompts to your editorial voice, set up the sheets, optionally add the GSC decay-detection step, and train your editor on the review loop.
How to start
Book a call. Bring 3 declining posts from your blog – ideally posts that used to rank and don't anymore. We'll run the workflow on one of them during the call. You decide on the spot whether the refreshed draft is the kind of thing your editor can actually ship.
More SEO automations
Keyword cannibalization detection
Watch a keyword sheet, pull 30 days of GSC data for each target client, and use GPT-4o to flag which keywords have multiple pages competing – with remediation steps written back to the sheet.
Read the build →
Content decay detection
A weekly GSC pull that compares the last 7 days to the previous 28, classifies every page by decay severity, and sends a Slack + email digest of what to refresh.
Read the build →
Automated indexing for new pages
Walk your sitemap, ask GSC which URLs are still not indexed, and submit them to Google's Indexing API on a schedule.
Read the build →