Replies: 2 comments
-
Sorry for duplicating the idea in #80692 (comment) and #81182
-
Hi, we have gotten these requests here too:
The last link includes a comment on how to do this in userland for now. We have also implemented this here:
Goals

- `next/metadata` support
- `public/llms.txt` to be auto-served
- `pages/api/llms.txt.ts` that can return dynamically generated values

Non-Goals
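To make the third goal concrete, here is a hypothetical sketch of what a `pages/api/llms.txt.ts` handler might look like. The file path comes from the goal above, but the assumption that Next.js would serve its output at `/llms.txt` (rather than `/api/llms.txt`, as Pages Router API routes work today) is exactly what this proposal asks for. The minimal `Res` type keeps the sketch self-contained; a real app would use `NextApiRequest`/`NextApiResponse` from `next`.

```typescript
// Hypothetical sketch of the proposed `pages/api/llms.txt.ts` handler.
// Assumption: under this proposal, Next.js would serve the result at
// /llms.txt; today a Pages Router API route only answers at /api/llms.txt.
// Minimal structural type so the sketch runs without `next` installed.
export type Res = {
  setHeader(name: string, value: string): void;
  status(code: number): Res;
  send(body: string): void;
};

export default async function handler(_req: unknown, res: Res) {
  // Dynamically generated values, e.g. pulled from a CMS or route manifest.
  const lines = [
    "# Example Site",
    "> Generated at " + new Date().toISOString(),
    "",
    "## Docs",
    "- [Getting started](https://example.com/docs)",
  ];
  res.setHeader("Content-Type", "text/plain; charset=utf-8");
  res.status(200).send(lines.join("\n"));
}
```

The dynamic piece (here just a timestamp) is the point of this goal: a static `public/llms.txt` cannot reflect content that changes at request time.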
Background

`llms.txt` is an emerging convention (see: https://llmstxt.org/) proposed to standardize how websites communicate their preferences regarding how Large Language Models (LLMs) like ChatGPT or Gemini should crawl, ingest, and use site content. This format is similar in spirit to `robots.txt`, but tailored for LLMs, and can include a plain-Markdown summary of the site along with curated links to content intended for LLM consumption.

Proposal
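As a sketch of the convention described at https://llmstxt.org/ (the site name, summary, and links below are illustrative, not prescribed), an `llms.txt` file might look like:

```txt
# Example Site

> One-line summary of the site for LLM crawlers.

## Docs

- [Getting started](https://example.com/docs): introduction and setup
- [API reference](https://example.com/api): full API documentation
```

Note that, unlike `robots.txt`, the body is plain Markdown, which is what makes it easy for LLMs to ingest directly.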
Introduce built-in support in Next.js for serving an `llms.txt` file from the root of the site, similar to how `robots.txt` is handled, to explicitly indicate which parts of a site are designed to be LLM-friendly or LLM-restricted.

Next.js already emphasizes performance, SEO, and modern web practices. By enabling an easy way to serve `llms.txt`, developers can adopt this emerging convention without custom server configuration. This aligns with Next.js's commitment to modern, responsible web development.
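As one of the comments above notes, this can already be done in userland today. A sketch using a standard App Router route handler (the file path `app/llms.txt/route.ts` and the file contents are illustrative; `GET` handlers returning a web-standard `Response` are documented Next.js behavior):

```typescript
// Userland sketch: app/llms.txt/route.ts (App Router route handler).
// A route handler at this path serves GET /llms.txt; the llms.txt
// contents below are illustrative placeholders.
export async function GET(): Promise<Response> {
  const lines = [
    "# Example Site",
    "> Illustrative summary for LLM crawlers.",
    "",
    "## Docs",
    "- [Getting started](https://example.com/docs)",
  ];
  return new Response(lines.join("\n"), {
    headers: { "Content-Type": "text/plain; charset=utf-8" },
  });
}
```

Built-in support along the lines of this proposal would mainly remove the need for each project to hand-roll this route, the same convenience `next/metadata` already provides for `robots.txt` and `sitemap.xml`.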