Guide
How to make your site usable by AI agents.
Most agent-readiness fixes are ordinary web quality fixes made explicit: reachable pages, clear navigation, readable content, and obvious next steps. Dedicated machine-readable files help, but they work best when the public site is already coherent.
1. Make the site reachable
- Serve the public site over HTTPS.
- Avoid blocking ordinary, well-behaved crawlers by default.
- Use `robots.txt` to express policy instead of accidental denial (see the sketch after this list).
- Make sure important public pages return real HTML, not challenge pages.
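For example, a permissive-by-default `robots.txt` might look like the sketch below. The domain and the disallowed paths are placeholders, not recommendations for any particular site:

```txt
# Allow well-behaved crawlers by default
User-agent: *
Allow: /

# Opt specific private areas out explicitly, rather than denying everything
Disallow: /admin/
Disallow: /cart/

# Point crawlers at the sitemap
Sitemap: https://example.com/sitemap.xml
```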
2. Make the site navigable
- Use clear titles, headings, and organization identity signals.
- Publish a useful `sitemap.xml` (see the sketch after this list).
- Make contact, pricing, documentation, product, and policy routes obvious where relevant.
- Remove duplicate or conflicting labels for the same task.
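A minimal `sitemap.xml` sketch, with placeholder URLs and a placeholder date:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per important public page -->
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2025-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/pricing</loc>
  </url>
  <url>
    <loc>https://example.com/docs</loc>
  </url>
</urlset>
```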
3. Make the content absorbable
- Render meaningful text in the initial HTML where possible.
- Use semantic headings and sections.
- Avoid hiding the core content behind client-only shells.
- Keep summaries, product descriptions, and instructions specific.
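As a sketch, a page whose core content ships in the initial HTML might look like this. The product and copy are hypothetical; the point is that the headings and text exist before any client-side JavaScript runs:

```html
<!-- Core content rendered server-side, not injected by a client-only shell -->
<main>
  <h1>Acme Widgets Inventory API</h1>
  <section>
    <h2>What it does</h2>
    <p>Query and update warehouse stock levels over a REST API.</p>
  </section>
  <section>
    <h2>Getting started</h2>
    <p>Create an API key on the dashboard, then call /v1/stock.</p>
  </section>
</main>
```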
4. Make handoff paths obvious
Agents do not need to complete every task autonomously. They do need to know where to send the human next. Forms, booking links, contact routes, API docs, issue trackers, or contribution guides can all be valid handoff paths when they are clear.
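One way to make handoff paths legible is a plain set of labeled links. The routes below are hypothetical; what matters is that each link says what the human will get:

```html
<!-- Explicit handoff paths an agent can surface to a human -->
<nav aria-label="Next steps">
  <a href="/contact">Contact sales</a>
  <a href="/book-demo">Book a demo</a>
  <a href="/docs/api">Read the API documentation</a>
  <a href="https://github.com/example/project/issues">File an issue</a>
</nav>
```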
5. Add dedicated agent resources
Once the public site is coherent, add resources such as `llms.txt`, `agents.json`, OpenAPI specs, feeds, changelogs, MCP discovery, and `security.txt`. These help agents move from "can scrape this page" to "can understand the site as a system."