AI agents hit a broken link and give up. Add one script tag and your 404 pages start suggesting the right page — in a format agents already understand.
Enter your domain to get a ready-to-paste script tag
As real users browse your site, the script quietly learns every page — its URL, title, and headings. Your site index builds itself over time, no sitemap required.
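What the script records per page can be sketched roughly as below. The field names here are assumptions for illustration, not the actual agent-404 payload:

```javascript
// Hedged sketch: one index entry per page view. The real snippet would
// gather these values from the DOM and ship them to the indexing endpoint.
function buildIndexEntry(url, title, headings) {
  return { url, title, headings };
}

// In the browser, the inputs would come from the live page, e.g.:
//   buildIndexEntry(
//     location.href,
//     document.title,
//     [...document.querySelectorAll('h1, h2, h3')].map((h) => h.textContent.trim())
//   )
const entry = buildIndexEntry(
  'https://docs.acme.com/v3/auth',
  'Authentication Guide',
  ['Authentication', 'API keys']
);
```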
An agent follows an outdated link and lands on your 404. The script recognizes it's a dead end and asks: "what did they probably mean?"
The URL is fuzzy-matched against your index. Version changes (v2→v3), typos, restructured paths — it figures out where the content moved.
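A minimal sketch of that matching step, assuming a Levenshtein-based similarity score over URL paths (the real matcher may weight path segments or headings differently):

```javascript
// Edit distance between two strings, via the standard dynamic program.
function levenshtein(a, b) {
  const dp = Array.from({ length: a.length + 1 }, (_, i) => [i]);
  for (let j = 1; j <= b.length; j++) dp[0][j] = j;
  for (let i = 1; i <= a.length; i++) {
    for (let j = 1; j <= b.length; j++) {
      dp[i][j] = Math.min(
        dp[i - 1][j] + 1,     // deletion
        dp[i][j - 1] + 1,     // insertion
        dp[i - 1][j - 1] + (a[i - 1] === b[j - 1] ? 0 : 1) // substitution
      );
    }
  }
  return dp[a.length][b.length];
}

// Score every indexed path against the dead URL's path; 1.0 means exact.
function suggest(deadPath, indexedPaths) {
  return indexedPaths
    .map((p) => ({
      path: p,
      score: 1 - levenshtein(deadPath, p) / Math.max(deadPath.length, p.length),
    }))
    .sort((x, y) => y.score - x.score);
}

const [best] = suggest('/v2/auth', ['/v3/auth', '/v3/api', '/pricing']);
// best.path === '/v3/auth' — one character off, so it scores 0.875
```

A v2→v3 change is a single-character edit, so the moved page wins by a wide margin over unrelated paths.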
Suggestions appear as a human-readable list and as schema.org JSON-LD — a format AI agents already know how to parse. No integration needed on their side.
{
  "deadUrl": "https://docs.acme.com/v2/auth",
  "suggestions": [
    {
      "url": "https://docs.acme.com/v3/auth",
      "title": "Authentication Guide",
      "score": 0.85,
      "matchType": "moved"
    }
  ],
  "jsonLd": { "@context": "https://schema.org", ... }
}
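One plausible shape for that embedded JSON-LD, using schema.org's `ItemList` vocabulary; agent-404's actual markup may differ in detail:

```javascript
// Hedged sketch: turn suggestions into schema.org JSON-LD. An ItemList of
// ListItem entries, each wrapping a WebPage, is standard schema.org usage.
function toJsonLd(suggestions) {
  return {
    '@context': 'https://schema.org',
    '@type': 'ItemList',
    itemListElement: suggestions.map((s, i) => ({
      '@type': 'ListItem',
      position: i + 1,
      item: { '@type': 'WebPage', url: s.url, name: s.title },
    })),
  };
}

const jsonLd = toJsonLd([
  { url: 'https://docs.acme.com/v3/auth', title: 'Authentication Guide' },
]);
```

Dropped into the 404 page inside a `<script type="application/ld+json">` tag, this is machine-readable to any agent or crawler that understands schema.org.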
You migrated your docs from v2 to v3. Now every AI agent, every Stack Overflow answer, every tutorial links to pages that don't exist anymore. agent-404 notices the version shift and sends them to the right v3 page.
You moved /blog/ to /articles/ and set up redirects for the top 20 posts. But there were 200 more you forgot about. The fuzzy matcher catches those without you maintaining a redirect map.
LLM agents, coding assistants, and RAG pipelines follow links baked into training data. When those links go stale, they hallucinate answers. Give them a structured path to the current content instead.
Search crawlers hit your dead pages and log them as crawl errors. With JSON-LD suggestions embedded in the 404, crawlers can discover the right content instead of just flagging another broken link.
Fully open source. Self-host or use the hosted version.
Add one script tag. Your 404 pages start working for you.