Recently I wrote that the open web doesn’t need more defenders, it needs builders. That the next open web needs machine-readable architecture: content structured for machines, not just rendered for browsers. Linked knowledge graphs that search engines and AI agents can consume without reverse-engineering HTML.

That’s the argument. Here’s what I actually built.

The problem that started it

I was building an SEO plugin for EmDash, and the core problem was generating a valid schema.org @graph for every page. Not just a flat snippet, but a proper linked graph: WebSite, WebPage, Article, Person, BreadcrumbList, all wired together with @id references so an agent or search engine can walk the relationships.

I’d already written that logic for this blog. It worked, but it was tangled into the Astro components. When I needed the same graph logic in EmDash, the choice was: copy it and maintain two versions, or extract it into something shareable. Before AI, “extract this into a shared library”…
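To make the "linked graph" idea concrete, here is a minimal sketch of what such a schema.org @graph looks like when assembled in code. This is not the plugin's actual implementation; the `buildGraph` helper, URLs, and `#fragment` identifiers are illustrative assumptions. The key point is that nodes reference each other by `@id` instead of nesting copies, so a consumer can walk the relationships.

```typescript
// Illustrative sketch only: buildGraph and all identifiers are hypothetical.
type GraphNode = Record<string, unknown> & { "@type": string; "@id": string };

function buildGraph(siteUrl: string, pagePath: string, authorName: string): object {
  const pageUrl = new URL(pagePath, siteUrl).href;

  const website: GraphNode = {
    "@type": "WebSite",
    "@id": `${siteUrl}#website`,
    url: siteUrl,
  };
  const person: GraphNode = {
    "@type": "Person",
    "@id": `${siteUrl}#author`,
    name: authorName,
  };
  const webpage: GraphNode = {
    "@type": "WebPage",
    "@id": `${pageUrl}#webpage`,
    url: pageUrl,
    // A reference by @id, not an embedded copy of the WebSite node.
    isPartOf: { "@id": website["@id"] },
  };
  const article: GraphNode = {
    "@type": "Article",
    "@id": `${pageUrl}#article`,
    isPartOf: { "@id": webpage["@id"] },
    author: { "@id": person["@id"] },
    mainEntityOfPage: { "@id": webpage["@id"] },
  };
  const breadcrumb: GraphNode = {
    "@type": "BreadcrumbList",
    "@id": `${pageUrl}#breadcrumb`,
    itemListElement: [
      { "@type": "ListItem", position: 1, name: "Home", item: siteUrl },
      { "@type": "ListItem", position: 2, name: "Article", item: pageUrl },
    ],
  };

  return {
    "@context": "https://schema.org",
    "@graph": [website, webpage, article, person, breadcrumb],
  };
}
```

Rendered as JSON-LD in a `<script type="application/ld+json">` tag, an agent can resolve `author: { "@id": … }` against the `Person` node elsewhere in the same @graph without scraping the page's HTML.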