Make your site readable by AI

llms.txt is an emerging standard that helps LLMs and AI agents understand and navigate your site's content.

Why llms.txt?

A simple file for a complex world

Just as robots.txt guides crawlers and sitemap.xml supports indexing, llms.txt provides curated context for AI agents.
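The analogy above is also a statement about location: all three files live at well-known paths under the domain root. A minimal sketch (the helper name `root_file_urls` is an assumption of this example, not part of any spec):

```python
from urllib.parse import urljoin

# The well-known root files a site can expose.
# llms.txt sits next to robots.txt and sitemap.xml at the domain root.
ROOT_FILES = ["robots.txt", "sitemap.xml", "llms.txt"]

def root_file_urls(base_url: str) -> list[str]:
    """Return the root-level URLs where crawlers and agents look."""
    if not base_url.endswith("/"):
        base_url += "/"
    return [urljoin(base_url, name) for name in ROOT_FILES]

print(root_file_urls("https://example.com"))
# ['https://example.com/robots.txt', 'https://example.com/sitemap.xml',
#  'https://example.com/llms.txt']
```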

Content control

Define what agents can use and how.
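One way an agent could honor such rules: treat the `agents` pattern from an allow list (like the example file later on this page) as a shell-style wildcard. The matching semantics here are an assumption of this sketch, not a defined part of llms.txt:

```python
import fnmatch

def agent_allowed(agent: str, allow_rules: list[dict]) -> bool:
    """Check an agent name against allow rules; "*" matches any agent."""
    # Hypothetical rule shape: [{"agents": "<pattern>"}, ...]
    return any(fnmatch.fnmatch(agent, rule.get("agents", ""))
               for rule in allow_rules)

rules = [{"agents": "*"}]
print(agent_allowed("research-bot", rules))  # True
```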

Structure for LLMs

Provide sections, glossaries, endpoints and defined intents.

Web-friendly

Lives alongside robots.txt and sitemap.xml without conflicts.

Get started

Implement llms.txt in three steps

Create the file at the site root, define sections and publish.

1. Create the file

Add an llms.txt file at your domain root, e.g. https://example.com/llms.txt.

2. Define sections

Specify the intent, paths, and priority for each content section.

3. Publish and verify

Publish the file, then use the guide to validate and optimize it.
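The verify step can be sketched as a small check that the required top-level keys are present. The required set ("version", "title", "sections") is an assumption based on the example on this page, not a formal schema:

```python
REQUIRED_KEYS = {"version", "title", "sections"}

def validate_llms_txt(text: str) -> list[str]:
    """Return a list of problems; an empty list means the file looks OK."""
    present = set()
    for line in text.splitlines():
        stripped = line.strip()
        if not stripped or stripped.startswith("#"):
            continue  # skip blank lines and comments
        if not line.startswith((" ", "\t")) and ":" in stripped:
            # Top-level "key: value" line
            present.add(stripped.split(":", 1)[0])
    return [f"missing key: {key}" for key in sorted(REQUIRED_KEYS - present)]

example = """\
# llms.txt
version: 1
title: "Product documentation"
sections:
  - path: /docs
    intent: support
"""
print(validate_llms_txt(example))  # []
```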

llms.txt
# llms.txt — curated context for AI agents
version: 1
title: "Product documentation"
sections:
  # Each section maps a path prefix to the intent agents should use it for.
  - path: /docs
    intent: support
  - path: /blog
    intent: research
allow:
  # "*" permits all agents; restrict by name to limit access.
  - agents: "*"
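An agent consuming the example above might pick the section matching its task. The parsing below is deliberately naive and assumes the two-key path/intent layout shown in this guide:

```python
def parse_sections(text: str) -> list[dict]:
    """Extract {"path", "intent"} entries from a sections block."""
    sections, current = [], None
    for raw in text.splitlines():
        line = raw.strip()
        if line.startswith("- path:"):
            current = {"path": line.split(":", 1)[1].strip()}
            sections.append(current)
        elif line.startswith("intent:") and current is not None:
            current["intent"] = line.split(":", 1)[1].strip()
    return sections

doc = """\
sections:
  - path: /docs
    intent: support
  - path: /blog
    intent: research
"""
sections = parse_sections(doc)
support = [s["path"] for s in sections if s["intent"] == "support"]
print(support)  # ['/docs']
```

A support-oriented agent would be steered to /docs, a research-oriented one to /blog.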