With Gartner projecting that 61% of searches will start on AI platforms by the end of 2026, a new standard is emerging to help websites communicate with language models: llms.txt. Proposed in September 2024, this file aims to change how AIs understand and index your content.
In this article, we explore what llms.txt is, how it differs from traditional files like robots.txt, and how to implement it to maximize your visibility on AI answer engines.
- 61% of searches will start on AI platforms by 2026 (Gartner)
- Google AI Overviews appear in 55% of searches
- llms.txt adoption has been growing continuously since September 2024
What is llms.txt?
llms.txt is a standardized text file placed at the root of your website (example: https://yoursite.com/llms.txt). Its objective: provide AI crawlers with a structured and prioritized view of your most important content.
Unlike a sitemap that exhaustively lists all your pages, llms.txt presents a curated selection of your essential resources. It's a sort of "reading guide" for LLMs that allows them to quickly understand what your company does and where to find your strategic content.
The format was initially proposed by Jeremy Howard in September 2024 and quickly gained traction in the tech community. The core idea: adapt the robots.txt concept to the era of language models, but with an inclusive rather than restrictive approach.
llms.txt vs robots.txt vs sitemap.xml
To fully understand the value of llms.txt, let's compare it to files you already know.
| Characteristic | robots.txt | sitemap.xml | llms.txt |
|---|---|---|---|
| Objective | Control access | List all pages | Guide to key content |
| Approach | Restrictive (block) | Exhaustive (list everything) | Curated (select) |
| Target | Traditional crawlers | Search engines | Language models (LLM) |
| Format | Allow/Disallow directives | Structured XML | Readable Markdown |
| Number of links | N/A | Unlimited | 20-50 recommended |
robots.txt remains essential for controlling which bots can access which parts of your site. It continues to work for AI crawlers like GPTBot (OpenAI), ClaudeBot (Anthropic) or PerplexityBot.
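For reference, a minimal robots.txt excerpt that explicitly allows these AI crawlers could look like the sketch below. The directives are illustrative only; adapt them to your own access policy.

```
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /
```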
sitemap.xml helps traditional search engines discover all your pages. But its exhaustive nature isn't optimal for LLMs that need to quickly understand your value proposition.
llms.txt fills a gap: it offers a structured summary, readable by humans and machines, that points to your most strategic resources.
How AI Crawlers Use llms.txt
Major AI players deploy crawlers to feed their models and search features:
Anthropic (Claude)
Anthropic was one of the first to officially adopt the llms.txt standard. ClaudeBot consults this file to prioritize content indexing. Claude uses this information to provide more accurate and better-sourced responses.
OpenAI (ChatGPT)
GPTBot analyzes llms.txt when present to understand a site's structure and priorities. Even without officially announced support, observations suggest that ChatGPT's search (formerly SearchGPT) takes these hints into account.
Perplexity
Perplexity, as an AI-based answer engine, particularly benefits from llms.txt to quickly identify authoritative sources on a given topic.
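If you want to verify that these crawlers actually fetch your file, your server access logs are the quickest source of truth. Below is a minimal Python sketch, assuming a standard combined-format log named access.log; the file path and the bot list are assumptions to adapt to your setup.

```python
# Count AI-crawler requests for /llms.txt in a web server access log.
from collections import Counter

AI_BOTS = ["GPTBot", "ClaudeBot", "PerplexityBot"]  # user agents mentioned above

hits = Counter()
with open("access.log", encoding="utf-8", errors="ignore") as log:
    for line in log:
        if "/llms.txt" in line:
            for bot in AI_BOTS:
                if bot in line:
                    hits[bot] += 1

for bot, count in hits.most_common():
    print(f"{bot}: {count} requests for /llms.txt")
```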
How to Create Your llms.txt File
An llms.txt file follows a simple Markdown-based format. Here are the steps to create yours.
Step 1: Basic Structure
The file starts with a title and description of your site, followed by thematic sections.
```
# Your Company Name

> Concise description of your activity and value proposition.
> This section helps LLMs understand your positioning.

## Documentation

- [Getting Started Guide](https://yoursite.com/docs/getting-started): Complete introduction to our solution
- [API Reference](https://yoursite.com/docs/api): API technical documentation
- [FAQ](https://yoursite.com/faq): Frequently asked questions

## Products

- [Main Product](https://yoursite.com/product): Description and features
- [Pricing](https://yoursite.com/pricing): Available plans and options

## Blog

- [Article 1](https://yoursite.com/blog/article-1): Topic covered
- [Article 2](https://yoursite.com/blog/article-2): Topic covered
```
Step 2: Complete Example
Here's a concrete example for a SaaS company:
```
# AI Labs Audit

> AI Labs Audit is a visibility audit platform for conversational AIs.
> We help companies measure and optimize their presence on ChatGPT,
> Claude, Gemini and Perplexity.

## About

- [Homepage](https://ailabsaudit.com/): Platform presentation
- [About](https://ailabsaudit.com/about): Our mission and team
- [Contact](https://ailabsaudit.com/contact): Contact us

## Features

- [AI Audit](https://ailabsaudit.com/features/audit): Multi-AI visibility analysis
- [Reports](https://ailabsaudit.com/features/reports): Detailed reports
- [Tracking](https://ailabsaudit.com/features/tracking): Continuous monitoring

## Resources

- [Blog](https://ailabsaudit.com/blog): Articles and guides
- [AEO Glossary](https://ailabsaudit.com/glossary): Sector definitions
- [llms.txt Guide](https://ailabsaudit.com/blog/llms-txt): This guide

## Pricing

- [Plans](https://ailabsaudit.com/#pricing): Options and rates
```
Step 3: Deployment
Place the file at the root of your website:
```
# The final URL should be: https://yoursite.com/llms.txt

# For a Flask/Django site, place it in the static folder
# or configure a dedicated route

# For a static site, simply place it at the root: /llms.txt
```
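If you go the dedicated-route path, here is a minimal sketch assuming a Flask app; the file name and project layout are illustrative and should be adapted to your own structure.

```python
# Serve llms.txt as plain text at the site root from a Flask app.
from pathlib import Path
from flask import Flask, Response

app = Flask(__name__)

@app.route("/llms.txt")
def llms_txt():
    # Assumes llms.txt sits next to this file; adjust the path as needed.
    content = Path("llms.txt").read_text(encoding="utf-8")
    # text/plain ensures crawlers receive the raw Markdown, not an HTML page.
    return Response(content, mimetype="text/plain")
```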
Best Practices for llms.txt
1. Limit the Number of Links (20-50 maximum)
More isn't better. LLMs work better with structured and prioritized information. Select your 20 to 50 most strategic pages rather than listing everything.
2. Organize by Logical Categories
Use clear sections (Documentation, Products, Blog, etc.) to help AIs understand your content structure. This facilitates processing and categorization.
3. Write Useful Descriptions
Each link should be accompanied by a brief description: this is what LLMs rely on to judge whether a resource is relevant.
4. Update Quarterly
Your llms.txt should reflect your current content. Plan a quarterly review at minimum, and update it immediately when:
- You publish major content
- You launch a new product or service
- You restructure the site
- Your positioning changes
5. Start with the Most Important
Order matters. Place your most strategic pages at the beginning of the file. LLMs generally pay more attention to the first listed items.
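To keep practices 1 and 3 in check over time, a small script can count the links in your llms.txt and flag entries that lack a description. The following Python sketch is illustrative; the file name and the link pattern are assumptions based on the format shown above.

```python
# Sanity-check an llms.txt file: link count and presence of descriptions.
import re

LINK_RE = re.compile(r"^- \[(?P<title>[^\]]+)\]\((?P<url>[^)]+)\)(?P<desc>.*)$")

links = []
with open("llms.txt", encoding="utf-8") as f:
    for line in f:
        match = LINK_RE.match(line.strip())
        if match:
            links.append(match)

print(f"{len(links)} links found (recommended: 20-50)")
for match in links:
    # A description should follow the closing parenthesis, e.g. ": Short summary".
    if not match.group("desc").lstrip(" :"):
        print(f"Missing description: {match.group('url')}")
```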
Current State of Adoption
As of January 2026, llms.txt adoption is growing rapidly. Here are the main players that support the standard, officially or in practice:
Confirmed Support
- Anthropic: Official support for Claude and ClaudeBot
- Cursor: The AI IDE uses llms.txt to understand projects
- Mintlify: Documentation platform with native integration
- Several frameworks: Integrations available for Next.js, Astro, etc.
Observed Support (unofficial)
- OpenAI/SearchGPT: GPTBot analyzes llms.txt when present
- Perplexity: Observed usage for sourcing
- Google AI: Likely experimentation with Gemini
The absence of official announcement doesn't mean absence of support. Most AI crawlers analyze text files at site roots, and llms.txt's Markdown format makes it particularly readable.
Limitations and Perspectives
Current Limitations
No formal standard: llms.txt is not yet a W3C or IETF standard. Its adoption relies on de facto convention rather than formal specification.
No usage guarantee: Even if you create a perfect llms.txt, nothing guarantees AIs will use it. It's an optimization opportunity, not a visibility guarantee.
Maintenance required: An outdated llms.txt can be counterproductive if it points to deleted pages or deprecated content.
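To catch that drift early, you can periodically verify that every URL listed in your llms.txt still resolves. Below is a minimal Python sketch, assuming a local llms.txt file and the third-party requests library; treat it as a starting point rather than a full link checker.

```python
# Flag llms.txt entries whose URLs no longer resolve (deleted or moved pages).
import re
import requests

URL_RE = re.compile(r"\]\((https?://[^)]+)\)")

with open("llms.txt", encoding="utf-8") as f:
    urls = URL_RE.findall(f.read())

for url in urls:
    try:
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = None
    if status is None or status >= 400:
        print(f"Check this entry: {url} (status: {status})")
```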
2026-2027 Perspectives
Likely standard evolution includes:
- Formal specification: An RFC or official standard could emerge
- Extensions: Support for metadata (update date, priority, language)
- CMS integration: Automatic generation in WordPress, Shopify, etc.
- Validation tools: Validators similar to those for robots.txt
Measure Your Current AI Visibility
Before optimizing, measure. Our audits analyze your presence on ChatGPT, Claude, Gemini and Perplexity.
Request a Free Audit
Conclusion: Should You Adopt llms.txt?
The answer is yes, for several reasons:
- Minimal cost: Creating an llms.txt takes less than an hour
- High potential: Even a marginal impact on AI visibility can generate significant traffic
- Pioneer advantage: Few sites have adopted it yet, which makes it a differentiation opportunity
- No risk: At worst, the file is ignored; at best, it improves your visibility
In a context where searches are massively migrating to AI platforms, every optimization counts. llms.txt is a piece of the AEO (Answer Engine Optimization) puzzle you shouldn't neglect.
Frequently Asked Questions About llms.txt
What is llms.txt?
llms.txt is a standardized file placed at the root of your website that provides AI crawlers (ChatGPT, Claude, Perplexity) with a structured view of your most important content. It complements robots.txt and sitemap.xml by specifically targeting language models.
What is the difference between llms.txt and robots.txt?
robots.txt controls crawler access (allow/block). llms.txt guides AIs toward your priority content, with no notion of blocking. robots.txt targets traditional crawlers, while llms.txt targets language models like GPT-4 or Claude.
How many links to include in llms.txt?
Best practices recommend between 20 and 50 links maximum. Prioritize quality over quantity: include only your most strategic pages, and keep them well documented and regularly updated.
How often to update llms.txt?
A quarterly update is recommended at minimum. Also update after each major content publication, site restructuring or new product/service launch.
Which AI crawlers use llms.txt?
In January 2026, Anthropic (Claude), Cursor, Mintlify and several other players officially support llms.txt. OpenAI and Perplexity also analyze this file even without officially announced support. Adoption is growing rapidly.