Measuring whether a standard works by checking if anyone uses it before it exists is backward. And yet that’s exactly what The SEO Framework just did.

They published data showing that across six months, 57 AI bots, and 180,000 AI-related requests to their site, not a single bot requested llms.txt. Their conclusion: implementing it would be “a waste of resources.” Yoast SEO and Rank Math, they imply, are selling a feature that doesn’t do what it promises.

The data is solid. I have no reason to doubt the methodology. The conclusion is where it breaks.

How standards actually get adopted

Web standards don’t get adopted because bots start requesting something that doesn’t exist yet. They get adopted because publishers start serving something, which gives crawlers a reason to look for it, which gives more publishers a reason to serve it. That’s the cycle. Someone has to go first.

XML sitemaps followed exactly this path. Google launched the protocol in June 2005. For over a year, Google was…