Your website might be invisible to AI.
Not to Google; you’ve probably got that covered with SEO. But when someone asks ChatGPT, Claude, or Perplexity a question about your industry, your business, or the services you offer… does the AI know you exist?
This is the problem LLMs.txt is trying to solve.
It’s a simple text file, similar to robots.txt or your sitemap, that tells AI systems exactly what your website is about and which pages matter most. Think of it as a cheat sheet you’re handing directly to artificial intelligence.
But here’s the catch: not every website needs one. And not every business should rush to implement it.
In this guide, we’ll break down what LLMs.txt actually is, how it works, and most importantly, which types of service businesses stand to benefit the most from early adoption.
What Is LLMs.txt? (The Simple Explanation)
LLMs.txt is a proposed web standard created by Jeremy Howard, co-founder of Answer.AI, in September 2024. It’s a plain text file written in Markdown format that sits in your website’s root directory (like yoursite.com/llms.txt).
The file gives large language models a structured, easy-to-read summary of your website’s most important content: your key pages, services, documentation, and resources.
Instead of forcing AI crawlers to wade through your entire website (with all its navigation menus, sidebars, JavaScript, and ads), you’re giving them a clean, curated guide to what actually matters.
Why Was LLMs.txt Created?
The Problem with Traditional Web Crawling for AI
When Google crawls your website, it indexes everything: every page, every link, every piece of content. That works fine for traditional search because Google has a massive infrastructure designed specifically for this purpose.
AI systems like ChatGPT, Claude, and Perplexity work differently.
They don’t maintain a permanent index of the web. Instead, they process information in real-time, pulling content when users ask questions. And they face a fundamental limitation: context windows.
How Context Windows Limit LLM Understanding
A context window is essentially the AI’s working memory: how much text it can process at once. While these windows have grown significantly (some models now handle 100,000+ tokens), they’re still finite.
When an AI tries to understand your website by reading raw HTML pages, most of that context window gets wasted on:
- Navigation menus and footers
- JavaScript and CSS code
- Advertisements and tracking scripts
- Sidebar widgets and related post links
- Cookie consent banners
The actual content, the part that matters, might be only 20% of what the AI is processing. That’s incredibly inefficient.
The Difference Between Human-Readable and AI-Readable Content
Humans can glance at a webpage and instantly know what’s important. We automatically filter out the navigation, skip the ads, and focus on the main content.
AI doesn’t have that intuition. It processes everything sequentially and treats a navigation link with the same initial weight as a critical service description.
LLMs.txt solves this by giving AI a pre-filtered, structured summary. No noise. Just signal.
How Does LLMs.txt Work?
The Markdown File Structure Explained
LLMs.txt uses Markdown formatting, the same lightweight syntax developers use for README files on GitHub. It’s both human-readable and machine-parseable.
The basic structure includes:
- H1 heading (#): Your website or company name
- Blockquote (>): A brief description of what your site is about
- H2 headings (##): Section categories (Services, Resources, Contact, etc.)
- Bulleted links (-): URLs with descriptions in [Link Text](URL): Description format
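To make that structure concrete, here’s a small Python sketch that assembles an llms.txt string from those four components. The business name, URLs, and descriptions are placeholders, not a real site.

```python
# Sketch: building a minimal llms.txt from the four components above.
# All names and URLs below are placeholders.

def build_llms_txt(name, description, sections):
    """Render llms.txt content as a Markdown string.

    sections maps an H2 section name to a list of
    (link_text, url, blurb) tuples.
    """
    lines = [f"# {name}", "", f"> {description}", ""]
    for heading, links in sections.items():
        lines.append(f"## {heading}")
        for text, url, blurb in links:
            lines.append(f"- [{text}]({url}): {blurb}")
        lines.append("")
    return "\n".join(lines)

doc = build_llms_txt(
    "Example Dental",
    "Family and cosmetic dentistry serving the Springfield area.",
    {
        "Services": [
            ("Dental Implants", "https://example.com/implants",
             "Implant placement and restoration"),
        ],
        "Contact": [
            ("Book an Appointment", "https://example.com/book",
             "Online scheduling for new and existing patients"),
        ],
    },
)
print(doc)
```

You could just as easily write the file by hand in a text editor; the point is only that the format reduces to an H1, a blockquote, and H2 sections of described links.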
Key Components of an LLMs.txt File
Every effective LLMs.txt file should include:
- Clear identification: Who you are and what you do
- Priority pages: Your most important service or product pages
- Supporting content: Resources, guides, and FAQs that provide context
- Contact/conversion pages: How users can take action
Optional but helpful:
- Important notes or caveats about your business
- An “Optional” section for lower-priority pages that can be skipped if context is limited
LLMs.txt vs Robots.txt vs Sitemap.xml: What’s the Difference?
These three files serve different purposes. Here’s how they compare:
| Feature | Robots.txt | Sitemap.xml | LLMs.txt |
|---|---|---|---|
| Primary Purpose | Tell crawlers what NOT to access | List ALL indexable pages | Highlight IMPORTANT pages for AI |
| Target Audience | Search engine bots | Search engines | Large language models |
| File Format | Plain text with directives | XML | Markdown |
| Required? | No, but recommended | No, but strongly recommended | No (proposed standard) |
| Content Focus | Access permissions | Comprehensive page list | Curated content summary |
| Best For | Blocking sensitive pages | SEO indexing | AI visibility and context |
When to Use Each File
Robots.txt: Use to block private pages, duplicate content, or sections you don’t want crawled (admin areas, staging pages, etc.)
Sitemap.xml: Use to help search engines discover and index all your public pages efficiently
LLMs.txt: Use to guide AI systems to your highest-value content and provide context for how your site is structured
Can They Work Together?
Absolutely. These files are complementary, not competing.
Your robots.txt controls access. Your sitemap provides comprehensive discovery. Your LLMs.txt provides AI-specific prioritization and context.
Think of it like a library: robots.txt is the “Staff Only” sign, the sitemap is the complete card catalog, and LLMs.txt is the librarian’s “recommended reading” list.
Which Service Websites Need LLMs.txt the Most?
Not every website benefits equally from LLMs.txt. Based on how AI systems process and cite information, certain industries and business types stand to gain significantly more than others.
Here’s our priority ranking:
| Industry | Need Level | Primary Benefit | Implementation Complexity |
|---|---|---|---|
| Law Firms | High | Complex practice areas need clear structure | Low |
| Dental/Medical Practices | High | Service + location clarity for local AI queries | Low |
| Multi-Location Businesses | High | Location-service mapping for AI | Medium |
| SaaS/Developer Tools | High | Documentation discoverability | Medium |
| E-commerce (Complex Catalogs) | Medium-High | Product and policy clarity | Medium |
| Professional Services | Medium | Methodology and expertise signals | Low |
| Educational Institutions | Medium | Course and program discovery | Medium |
1. Law Firms and Legal Services (High Priority)
Law firms have complex, content-rich websites that AI often struggles to parse correctly. You might have dozens of practice area pages, attorney profiles, case results, and legal resources, but without a clear structure, AI might pull from the wrong page or miss your expertise entirely.
Pages to prioritize in your LLMs.txt:
- Practice area pages (personal injury, family law, criminal defense, estate planning)
- Attorney profile pages with credentials and case experience
- Case results and testimonials
- Legal FAQ and resource libraries
- Location and service area pages
Why it matters: When someone asks AI, “What should I do after a car accident in Dallas?” you want your personal injury content to be the source it cites, not a generic legal directory.
Example LLMs.txt section for a law firm:
## Practice Areas
- [Personal Injury](https://example.com/personal-injury): Car accidents, truck accidents, slip and fall, wrongful death representation
- [Family Law](https://example.com/family-law): Divorce, child custody, child support, adoption services
- [Criminal Defense](https://example.com/criminal-defense): DWI defense, drug charges, assault, theft charges

## Our Attorneys
- [Attorney Profiles](https://example.com/attorneys): Meet our legal team with combined 75+ years experience
- [Case Results](https://example.com/case-results): Notable verdicts and settlements

## Legal Resources
- [Personal Injury FAQ](https://example.com/faq/personal-injury): Common questions about injury claims in Texas
- [What To Do After an Accident](https://example.com/blog/after-car-accident): Step-by-step guide for accident victims
2. Dental Practices and Healthcare Providers (High Priority)
Dental practices face a unique challenge: they need to rank for both service-based queries (“dental implants cost”) and location-based queries (“dentist near me”). AI systems increasingly handle both types of questions.
Pages to prioritize in your LLMs.txt:
- Individual service pages (implants, Invisalign, emergency dental, cosmetic dentistry)
- Patient education content explaining procedures
- Insurance and payment information
- New patient resources
- Location pages for multi-location practices
Why it matters: An empty dental chair costs $200-300 per hour. If AI assistants start recommending competitors because your site structure is unclear, that’s revenue walking out the door.
Example LLMs.txt section for a dental practice:
## Dental Services
- [Dental Implants](https://example.com/dental-implants): Full implant placement, All-on-4, implant-supported dentures
- [Invisalign](https://example.com/invisalign): Clear aligner treatment for teens and adults
- [Emergency Dental Care](https://example.com/emergency-dentist): Same-day appointments for toothaches, broken teeth, dental trauma
- [Cosmetic Dentistry](https://example.com/cosmetic-dentistry): Veneers, teeth whitening, smile makeovers

## Patient Information
- [Insurance & Financing](https://example.com/insurance): We accept most PPO plans, with CareCredit and in-house payment plans available
- [New Patient Information](https://example.com/new-patients): What to expect at your first visit, forms, and policies

## About Our Practice
- [Meet Our Dentists](https://example.com/our-team): Dr. Smith (20+ years experience), Dr. Johnson (pediatric specialist)
- [Office Tour](https://example.com/office-tour): State-of-the-art facility with latest dental technology
3. Multi-Location Service Businesses (High Priority)
If you operate multiple locations, whether you’re a dental group, a law firm with satellite offices, or a franchise business, AI has a particularly hard time understanding which location serves which area.
Pages to prioritize in your LLMs.txt:
- Individual location pages with addresses and service areas
- Service pages that specify location availability
- Centralized FAQ that applies across locations
- Booking or contact pages for each location
Why it matters: Without a clear structure, AI might tell a user to visit your downtown location when your suburban office is 5 minutes from their home. Or worse, it might not mention you at all because it can’t determine your service area.
4. SaaS Companies and Developer Tools
This is actually where LLMs.txt has seen the most adoption so far. Companies like Cloudflare, Vercel, Anthropic, and Stripe have implemented LLMs.txt files because developers frequently use AI coding assistants that need to reference documentation.
Pages to prioritize:
- API documentation and reference guides
- Getting started and quickstart guides
- Pricing and feature comparison pages
- Integration guides and tutorials
5. E-commerce Stores with Complex Catalogs
If you sell products with detailed specifications, size guides, or complex policies, LLMs.txt helps AI understand your catalog structure.
Pages to prioritize:
- Category and collection pages
- Shipping and return policies
- Size guides and buying guides
- FAQ pages
6. Professional Services Firms
Consulting firms, agencies, and B2B service providers benefit from a clearly structured methodology and case study content.
Pages to prioritize:
- Service methodology pages
- Case study libraries
- Team expertise and credentials
- Industry-specific service pages
7. Educational Institutions and Course Providers
Universities, online course platforms, and training providers have complex content hierarchies that benefit from an AI-friendly structure.
Pages to prioritize:
- Course catalog and curriculum pages
- Admission requirements and deadlines
- Faculty directories
- Tuition and financial aid information
Who Probably Doesn’t Need LLMs.txt? (Be Honest With Yourself)
LLMs.txt isn’t for everyone. If any of these describe your situation, you can probably skip it for now:
Single-page portfolio websites: If your entire web presence is one page, there’s nothing to structure.
Simple brochure sites under 10 pages: The overhead isn’t worth it. Focus on making those 10 pages excellent first.
Businesses with no content marketing strategy: LLMs.txt highlights your best content. If you don’t have content worth highlighting, start there.
Sites already struggling with basic SEO fundamentals: Fix your foundation first. If your site has crawl errors, missing meta tags, or broken links, those problems matter more than LLMs.txt.
How to Create an LLMs.txt File (Step-by-Step)
Step 1: Audit Your Most Important Pages
Before writing anything, identify the 15-30 pages that best represent your business. Ask yourself:
- Which pages do I most want AI to reference when answering questions about my industry?
- What content demonstrates our expertise and authority?
- Which pages convert visitors into leads or customers?
Don’t dump your entire sitemap. This is a curated list, not a comprehensive index.
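A practical way to start the shortlist is to pull candidates from the sitemap you already have. The Python sketch below parses sitemap.xml and filters out an obviously low-value path pattern; the sitemap content and the `/tag/` filter are stand-ins for your real file and your real exclusion rules.

```python
# Sketch: extracting candidate URLs from a sitemap as raw material for
# the audit. SITEMAP is a stand-in; in practice, read your real file.
import xml.etree.ElementTree as ET

SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/personal-injury</loc></url>
  <url><loc>https://example.com/blog/tag/misc</loc></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Return every <loc> URL in a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", NS)]

# Drop obvious low-value paths (tag archives here) before hand-picking
# the 15-30 pages that actually belong in llms.txt.
candidates = [u for u in sitemap_urls(SITEMAP) if "/tag/" not in u]
print(candidates)
```

The script only narrows the pool; the final selection should still be a human judgment call about which pages represent your expertise.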
Step 2: Write the File in Markdown Format
Open any plain text editor (Notepad, TextEdit, VS Code) and create a new file called llms.txt.
Start with your H1 header and description:
# Your Business Name
> One-sentence description of what you do and who you serve.
Add sections with H2 headers and bulleted links:
## Section Name
- [Page Title](https://yoursite.com/page-url): Brief description of what this page covers
Keep descriptions concise: under 100 characters per link. AI skims quickly.
Step 3: Upload to Your Root Directory
Your LLMs.txt file should be accessible at https://yoursite.com/llms.txt.
For WordPress: Use a file manager plugin like WP File Manager, or access your hosting control panel (cPanel) and upload to the public_html folder.
For Shopify: Create a new page with the handle “llms” and use a custom template, or add it through theme customization.
For custom sites: Upload directly to your root directory via FTP or your hosting file manager.
Step 4: Verify and Test Your Implementation
After uploading, open a new browser tab and navigate to yoursite.com/llms.txt. You should see your Markdown file displayed as plain text.
If you see a 404 error, double-check:
- The filename is exactly llms.txt (no spaces, correct extension)
- The file is in the root directory, not a subfolder
- Your server isn’t blocking .txt files
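Beyond loading the URL in a browser, you can lint the file’s contents against the conventions from this guide: an H1 first, a blockquote description, and short link descriptions. The Python sketch below does exactly that; the sample content is hypothetical.

```python
# Sketch: a quick structural check for llms.txt content.
import re

# Matches "- [Text](https://...): description" and captures the description.
LINK = re.compile(r"^- \[[^\]]+\]\(https?://[^)]+\): (.+)$")

def lint_llms_txt(text, max_desc=100):
    """Return a list of problems found in llms.txt content."""
    problems = []
    lines = [l for l in text.splitlines() if l.strip()]
    if not lines or not lines[0].startswith("# "):
        problems.append("file should start with an H1 heading")
    if not any(l.startswith("> ") for l in lines):
        problems.append("missing blockquote description")
    for l in lines:
        m = LINK.match(l)
        if m and len(m.group(1)) > max_desc:
            problems.append(f"description over {max_desc} chars: {l[:40]}...")
    return problems

sample = """# Example Firm

> Personal injury law firm serving Dallas.

## Practice Areas
- [Personal Injury](https://example.com/pi): Car accident representation
"""
print(lint_llms_txt(sample))  # prints [] when every check passes
```

This is a convenience, not a spec validator; since LLMs.txt is still a proposed standard, there is no official conformance tool to check against.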
Platform-Specific Implementation Guides
WordPress (Manual + Plugin Options)
Manual method:
- Log in to your hosting cPanel
- Open File Manager
- Navigate to public_html
- Upload your llms.txt file
- Verify at yoursite.com/llms.txt
Plugin options:
- Yoast SEO: Recent versions can auto-generate LLMs.txt from your sitemap data
- Rank Math: Offers more control over which content types appear
- Website LLMs.txt plugin: Dedicated plugin with 30,000+ installs that auto-generates and tracks AI crawler access
Shopify Implementation
Shopify doesn’t allow direct file uploads to the root directory. Workaround:
- Go to Online Store → Pages
- Create a new page with URL handle llms
- Add your content in Markdown format
- This makes it accessible at yoursite.com/pages/llms
Note: This isn’t the ideal root location, but it’s the best Shopify natively allows.
Custom Websites and Static Site Generators
For sites built with Next.js, Hugo, Jekyll, or similar:
- Create llms.txt in your public or static folder
- The file will automatically be served from your root URL after build
- For dynamic generation, create an API route that returns Markdown with text/plain content type
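As a sketch of that last option, here’s a framework-agnostic WSGI handler in Python that serves hypothetical llms.txt content with a text/plain content type. Flask or Django routes (or a Next.js route handler) express the same idea in their own syntax.

```python
# Sketch: a WSGI app that serves /llms.txt as text/plain.
# The file content below is a placeholder.

LLMS_TXT = """# Example Co
> What we do, in one sentence.

## Services
- [Consulting](https://example.com/consulting): Strategy engagements
"""

def app(environ, start_response):
    if environ.get("PATH_INFO") == "/llms.txt":
        body = LLMS_TXT.encode("utf-8")
        start_response("200 OK", [
            ("Content-Type", "text/plain; charset=utf-8"),
            ("Content-Length", str(len(body))),
        ])
        return [body]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"not found"]
```

For local testing, `wsgiref.simple_server.make_server("", 8000, app).serve_forever()` from the standard library is enough; the important detail is the text/plain content type, so AI crawlers receive raw Markdown rather than rendered HTML.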
Does LLMs.txt Actually Work? (The Honest Truth)
Let’s be real: LLMs.txt is still a proposed standard, not an established protocol. Before you invest time implementing it, you deserve to know what the data actually shows.
What the Data Actually Shows
Semrush’s experiment: Semrush implemented LLMs.txt on their sister site, Search Engine Land, in March 2025. Their findings after several months:
- Zero visits from Google-Extended (Google’s AI crawler)
- Zero visits from GPTBot (OpenAI’s crawler)
- Zero visits from PerplexityBot or ClaudeBot
- Traditional crawlers like Googlebot did visit, but with no special treatment
Adoption numbers: As of mid-2025, only about 951 domains had published an LLMs.txt file, a tiny fraction of the web.
The Skeptic’s Case Against LLMs.txt
Google’s John Mueller stated directly: “No AI system currently uses llms.txt.”
Critics argue:
- Traditional robots.txt and sitemaps already serve similar purposes
- Without official support from OpenAI, Google, or Anthropic, it’s speculative
- It could become another abandoned web standard like meta keywords
- Time spent on LLMs.txt might be better spent on proven SEO fundamentals
The Optimist’s Case For Early Adoption
On the other hand:
Anthropic has implemented it themselves. When the company behind Claude publishes its own LLMs.txt file, that’s a meaningful signal, even if they haven’t officially announced crawler support.
The downside risk is minimal. Creating an LLMs.txt file takes 30-60 minutes. If it never gets adopted, you’ve lost an hour. If it becomes standard, you’re ahead of 99.9% of websites.
It’s useful regardless of AI crawlers. The process of creating an LLMs.txt file forces you to audit and prioritize your content that’s valuable for content strategy even if AI never reads the file.
First-mover advantage compounds. If major AI platforms announce support in 2026 or 2027, sites that already have well-structured LLMs.txt files will have an immediate advantage.
Our Recommendation: When It’s Worth Your Time
Implement LLMs.txt if:
- You have 20+ pages of quality content worth highlighting
- Your industry is frequently queried in AI assistants (legal, medical, technical)
- You have developer resources or can spare 1-2 hours for implementation
- You want to stay ahead of emerging standards
Skip it for now if:
- Your site has fewer than 10 pages
- You’re still working on basic SEO fundamentals
- Your content isn’t authoritative enough to be worth citing
- You have zero technical resources for implementation
The Future of LLMs.txt and GEO (Generative Engine Optimization)
Why Early Adoption Matters
Web standards often follow a predictable pattern: early skepticism, slow adoption, then sudden acceleration when major players commit.
Remember when SSL certificates were “optional”? When mobile-responsive design was “nice to have”? When schema markup was “probably overkill”?
Each of these became essential almost overnight once Google made them ranking factors.
LLMs.txt could follow the same trajectory. If Google adds support via AI Overviews, or if OpenAI or Anthropic officially adopts the standard, the sites that already have well-structured files will have a head start.
How AI Search Is Changing Discovery
The way people find information is shifting:
- 36 million US adults are projected to use generative AI for search by 2028, more than double 2024 numbers
- AI-driven answers increasingly replace traditional search results for informational queries
- Users trust AI-curated recommendations, meaning the sources AI cites gain significant visibility
This isn’t about replacing SEO; it’s about expanding your visibility to include AI-powered discovery channels.
Preparing Your Website for AI-First Search
Beyond LLMs.txt, consider:
- Structured data markup: Schema.org markup helps both search engines and AI understand your content
- Clear content hierarchy: Use proper heading structure (H1 → H2 → H3) consistently
- Authoritative, citable content: Create content that’s specific enough to be worth referencing
- FAQ content: Question-and-answer format matches how users query AI assistants
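On the structured data point, Schema.org’s FAQPage type is the standard way to mark up question-and-answer content. A Python sketch (with one sample question) that emits the JSON-LD you’d embed in a `<script type="application/ld+json">` tag might look like:

```python
# Sketch: generating Schema.org FAQPage markup as JSON-LD.
import json

def faq_jsonld(qa_pairs):
    """Build FAQPage JSON-LD from (question, answer) pairs."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": q,
                "acceptedAnswer": {"@type": "Answer", "text": a},
            }
            for q, a in qa_pairs
        ],
    }, indent=2)

markup = faq_jsonld([
    ("Is LLMs.txt the same as robots.txt?",
     "No. Robots.txt controls access; LLMs.txt highlights priority content."),
])
print(markup)
```

Most CMS SEO plugins can generate this markup for you; the hand-rolled version is shown only to make the structure visible.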
How Inshalytics Can Help Implement LLMs.txt
Creating an effective LLMs.txt file requires more than just listing URLs. It requires understanding which content deserves AI visibility, how to structure information for maximum clarity, and how this fits into your broader digital marketing strategy.
At Inshalytics, we offer:
Full content audit for AI readability: We analyze your existing content to identify which pages should be prioritized for AI visibility and which need improvement before being highlighted.
Custom LLMs.txt file creation: We build your file following best practices, with proper Markdown formatting and descriptions optimized for how AI systems process information.
Ongoing monitoring and updates: As your content evolves, your LLMs.txt should too. We help maintain and update your file as you publish new content and retire old pages.
Comprehensive GEO strategy: LLMs.txt is just one piece. We help position your entire web presence for AI-powered discovery, from schema markup to content optimization.
Key Takeaways
- LLMs.txt is a proposed standard for helping AI systems understand your website’s most important content: think of it as a curated guide for large language models.
- It’s not yet officially supported by major AI companies, but early signals (like Anthropic publishing their own file) suggest potential future adoption.
- Law firms, dental practices, and multi-location businesses stand to benefit most due to their complex content structures and local search dynamics.
- Implementation is low-risk: the process takes 1-2 hours, forces useful content auditing, and positions you ahead of competitors if the standard gains traction.
- Don’t neglect SEO fundamentals: LLMs.txt complements, not replaces, traditional optimization. Fix your foundation first.
Frequently Asked Questions
Is LLMs.txt the same as robots.txt?
No. Robots.txt tells crawlers what they can’t access. LLMs.txt tells AI what they should prioritize. They serve opposite purposes and work together.
Do I need LLMs.txt if I have a sitemap?
A sitemap lists everything. LLMs.txt highlights what matters most. Your sitemap might have 500 URLs; your LLMs.txt should have 15-50. They’re complementary, not redundant.
Will LLMs.txt hurt my traditional SEO?
No. LLMs.txt has no impact on traditional search engine rankings. It’s a separate file that targets AI systems specifically.
How often should I update my LLMs.txt file?
Review quarterly at a minimum. Update whenever you publish significant new content, retire old pages, or restructure your website. Outdated links in your LLMs.txt file reflect poorly on your site’s maintenance.
Which AI crawlers actually read LLMs.txt?
Currently, none have officially confirmed support. However, the standard is gaining attention, and companies like Anthropic have published their own files, suggesting they see potential value in the format.
Can LLMs.txt block AI from using my content?
No. LLMs.txt is for prioritization, not permission. To block AI crawlers, you need to modify your robots.txt file with directives for specific AI user agents (like GPTBot or ClaudeBot).
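For example, a robots.txt fragment that opts specific AI crawlers out of your whole site might look like the sketch below. GPTBot, ClaudeBot, and Google-Extended are documented user agent names as of this writing, but check each vendor’s current documentation before relying on them:

```
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```

Note that Google-Extended controls AI training and grounding use of your content; it does not affect Googlebot’s normal search crawling.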