Update: I’m building out a web app that handles the hassle of IndexNow submissions.
While building Public.Law, I realized I had a “crawl budget problem”: with 500,000+ content pages, there are too many for search engines to re-crawl the whole site at once, so I need to be strategic about how they discover updates.
This led me to IndexNow—a protocol that might actually solve the “tell search engines what changed” problem properly.
Google Killed Sitemap Pings
In June 2023, Google deprecated its sitemap ping endpoint, citing internal studies showing that these submissions were ineffective and attracted spam.
Google now recommends passive discovery through sitemaps and Search Console. This works fine for small sites, but when you have hundreds of thousands of pages, waiting for crawlers to discover changes is inefficient.
IndexNow: A Different Approach
IndexNow is a protocol Microsoft and Yandex introduced in October 2021. Instead of waiting for crawlers, you actively notify search engines when content changes via HTTP POST.
The interesting part: when you submit to one search engine, it gets shared with the others. Current participants include Bing, Yandex, Seznam, and Naver.
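Here’s roughly what a submission looks like. This is a minimal sketch: the host, key, and URL below are placeholders, and I’m posting to the shared api.indexnow.org endpoint so one notification reaches all the participating engines.

```python
# Minimal IndexNow submission sketch (hypothetical key, host, and URL).
import json
import urllib.request

payload = {
    "host": "www.example.com",                  # your site's hostname
    "key": "0123456789abcdef0123456789abcdef",  # the API key you generated
    "keyLocation": "https://www.example.com/0123456789abcdef0123456789abcdef.txt",
    "urlList": [
        "https://www.example.com/some-updated-page",
    ],
}

req = urllib.request.Request(
    "https://api.indexnow.org/indexnow",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json; charset=utf-8"},
    method="POST",
)

# 200 or 202 means the submission was accepted.
with urllib.request.urlopen(req) as resp:
    print(resp.status)
```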
The AI Connection
Microsoft’s integration of Bing with ChatGPT creates an interesting architectural connection. IndexNow potentially gives content a faster path to AI visibility since Bing feeds into these systems. I haven’t tested this extensively, but the pieces seem to line up.
Implementation Reality
IndexNow adoption seems limited, probably due to implementation complexity and uncertainty about ROI.
The technical requirements: generate an API key, host a verification file, implement automated submissions with rate limiting (10,000 URLs/day), and handle response codes properly. The protocol has been stable since 2021, but like any API integration, there are operational details to work through.
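For the response-code piece, here’s a sketch of how I’d map the status codes the spec documents to actions. The retry/backoff policy is my own assumption, not part of the protocol.

```python
# Map IndexNow status codes (200, 202, 400, 403, 422, 429) to next steps.
def classify_response(status: int) -> str:
    if status in (200, 202):
        return "accepted"      # 202: received, key validation still pending
    if status == 429:
        return "retry-later"   # too many requests: back off and resubmit
    if status == 400:
        return "fix-payload"   # bad request: malformed JSON or fields
    if status == 403:
        return "fix-key"       # key invalid or key file not reachable
    if status == 422:
        return "fix-urls"      # URLs don't belong to the host, or key mismatch
    return "unexpected"
```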
My Take
For Public.Law’s scale (500k+ pages), the shift from passive discovery to active notification makes architectural sense. Google’s absence from the consortium is notable, but the crawl budget efficiency alone might make it worthwhile.
The implementation is straightforward: generate an API key, host a verification file, POST when content changes. The tricky part is deciding when to trigger submissions without flooding the endpoint.
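One approach I’m considering (a sketch, not something I’ve shipped): queue changed URLs and flush them in periodic batches, so repeat edits to the same page collapse into a single submission. The flush interval and batch cap below are arbitrary assumptions; `submit_fn` would be something like the POST sketch above.

```python
# Collect changed URLs and flush them in batches instead of submitting
# on every edit, to avoid flooding the IndexNow endpoint.
import threading
import time


class IndexNowQueue:
    def __init__(self, submit_fn, flush_every=300, max_batch=10_000):
        self._pending: set[str] = set()
        self._lock = threading.Lock()
        self._submit_fn = submit_fn      # callable that POSTs a list of URLs
        self._flush_every = flush_every  # seconds between flushes
        self._max_batch = max_batch      # cap the size of a single submission

    def add(self, url: str) -> None:
        with self._lock:
            self._pending.add(url)       # dedupes repeat edits to one URL

    def run(self) -> None:
        while True:
            time.sleep(self._flush_every)
            with self._lock:
                batch = list(self._pending)[: self._max_batch]
                self._pending.difference_update(batch)
            if batch:
                self._submit_fn(batch)
```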
I’m planning to implement it. For large-scale content sites, it seems like a useful addition to the indexing toolkit.