Public Tool Used to Target Specific Pages
The problem involved Google’s “Remove Outdated Content” tool. It is open to the public and is meant to clear search listings for links that no longer work or that show outdated snippets. Attackers found a way to turn it against working pages by submitting removal requests that falsely signaled those pages were gone.
In one case, a journalist’s article remained live on its original site. But after repeated requests through this tool, the article kept disappearing from search results. Each time it was restored through Google Search Console, the same page would get deindexed again.
Case-Sensitive URLs Opened a Loophole
The method worked by changing the capitalization in the URL. Swapping a capital letter into the path produced a link that pointed to a page that didn’t exist on the publisher’s server. When Google checked that altered version and received a 404 error, it treated the original lowercase page as gone and removed it from search results.
The loophole came from a mismatch in how the system handled uppercase and lowercase URLs. URL paths are case sensitive on most servers, but Google’s removal tool didn’t fully respect that in this context and matched the dead-looking variant to the live original. As a result, a single changed letter was enough to fool the system into delisting an active page.
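To see why the altered addresses came back as errors, it helps to compare how a case-sensitive server answers the two versions of a link. The short Python sketch below is only an illustration and assumes a hypothetical article URL; it uses the third-party requests library to fetch the original path and an uppercased copy and prints their status codes, which on a case-sensitive server would typically be 200 and 404 respectively.

```python
from urllib.parse import urlsplit, urlunsplit

import requests  # third-party: pip install requests


def status_for(url: str) -> int:
    """Return the HTTP status code for a URL without following redirects."""
    return requests.get(url, allow_redirects=False, timeout=10).status_code


def compare_case_variants(url: str) -> None:
    """Fetch a URL and a copy of it with the path uppercased.

    On a case-sensitive server the altered path usually returns 404
    even though the original article is still live and returns 200.
    """
    parts = urlsplit(url)
    altered = urlunsplit(parts._replace(path=parts.path.upper()))
    print(f"{url} -> {status_for(url)}")
    print(f"{altered} -> {status_for(altered)}")


if __name__ == "__main__":
    # Hypothetical URL used purely for illustration.
    compare_case_variants("https://example.com/news/original-article")
```

That gap between what the server reports and what the removal tool concludes is the whole loophole: the 404 belongs to a URL nobody ever published.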
Publishers Saw Hundreds of Articles Removed
The attack wasn’t limited to one story. Reports showed that more than 400 articles were affected. These removals happened even though the pages were still live and untouched. Site owners had to check their Search Console daily, cancel unauthorized removals, and manually restore visibility. The removals kept happening in cycles.
These incidents were flagged in Google’s user forums, where publishers asked whether it was possible to block such attacks. The system had no way to prevent outsiders from making repeat requests. Google eventually responded and confirmed the issue was under review.
Google Confirms the Bug and Closes the Gap
In follow-up discussions, Google confirmed the removals were caused by a case-handling error in the tool. The company said the bug had affected a small number of websites. It has since updated the system and restored the wrongly removed pages.
One option for website owners is to redirect every case variant of a URL to its lowercase form. Because the altered addresses would then resolve to the live page instead of returning a 404, the tool would have had no error to act on. But setting that up requires changes to the server configuration or the site’s application code, which many publishers might not know how to do.
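As one hedged illustration rather than any publisher’s actual setup, the Flask sketch below applies that rule at the application level: any request whose path contains uppercase letters gets a permanent redirect to the all-lowercase version. The route and handler names are hypothetical; in practice the same behavior is often configured as a rewrite rule in the web server itself.

```python
from flask import Flask, redirect, request  # third-party: pip install flask

app = Flask(__name__)


@app.before_request
def enforce_lowercase_path():
    """Redirect any path containing uppercase letters to its lowercase form."""
    path = request.path
    if path != path.lower():
        query = request.query_string.decode()
        target = path.lower() + (f"?{query}" if query else "")
        # A 301 tells crawlers the lowercase address is the canonical one.
        return redirect(target, code=301)


@app.route("/news/<slug>")
def article(slug):
    # Hypothetical article handler standing in for the real site.
    return f"Article: {slug}"
```

With a rule like this in place, a request for /News/Original-Article answers with a redirect to /news/original-article rather than a 404, so a removal check no longer sees the page as missing.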
Public Reporting Tools Remain a Risk
This case highlighted how even simple systems can be misused when protections are missing. Tools meant to clean up search results can become entry points for attacks if they don’t include verification or limits.
Although this specific flaw is no longer active, the broader issue remains. Tools that allow anyone to submit changes carry risks for those who publish online. Webmasters are advised to keep track of their pages in Search Console and watch for unusual patterns in visibility.
