robots.txt
Source URL: https://nextjs.org/docs/app/api-reference/file-conventions/metadata/robots
Add or generate a robots.txt file that matches the Robots Exclusion Standard in the root of the `app` directory to tell search engine crawlers which URLs they can access on your site.
Static robots.txt
```txt
User-Agent: *
Allow: /
Disallow: /private/

Sitemap: https://acme.com/sitemap.xml
```

Generate a Robots file
Add a robots.js or robots.ts file that returns a Robots object.
Good to know:
`robots.js` is a special Route Handler that is cached by default unless it uses a Dynamic API or dynamic config option.
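Because the file stays static unless it touches request-time data, a common pattern is to branch on an environment variable so that non-production deployments are never indexed. The following is a sketch, not the library's canonical example: the `NODE_ENV` check and URLs are illustrative, and a local `Robots` type (matching the Robots object shape documented below) stands in for `MetadataRoute.Robots` so the snippet is self-contained.

```typescript
// Sketch: serve a restrictive robots file outside production so that
// staging/preview deployments are not indexed. NODE_ENV check and URLs
// are illustrative.
// Local stand-in for MetadataRoute.Robots so this compiles without 'next'.
type Robots = {
  rules: {
    userAgent?: string | string[]
    allow?: string | string[]
    disallow?: string | string[]
  }
  sitemap?: string | string[]
}

export default function robots(): Robots {
  if (process.env.NODE_ENV !== 'production') {
    // Block everything on non-production deployments.
    return { rules: { userAgent: '*', disallow: '/' } }
  }
  return {
    rules: { userAgent: '*', allow: '/', disallow: '/private/' },
    sitemap: 'https://acme.com/sitemap.xml',
  }
}
```

Since only an environment variable is read, this does not opt the route into dynamic rendering.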
```ts
import type { MetadataRoute } from 'next'

export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: '*',
      allow: '/',
      disallow: '/private/',
    },
    sitemap: 'https://acme.com/sitemap.xml',
  }
}
```

Output:
```txt
User-Agent: *
Allow: /
Disallow: /private/

Sitemap: https://acme.com/sitemap.xml
```

Customizing specific user agents
You can customize how individual search engine bots crawl your site by passing an array of user agents to the `rules` property. For example:
```ts
import type { MetadataRoute } from 'next'

export default function robots(): MetadataRoute.Robots {
  return {
    rules: [
      {
        userAgent: 'Googlebot',
        allow: ['/'],
        disallow: '/private/',
      },
      {
        userAgent: ['Applebot', 'Bingbot'],
        disallow: ['/'],
      },
    ],
    sitemap: 'https://acme.com/sitemap.xml',
  }
}
```

Output:
```txt
User-Agent: Googlebot
Allow: /
Disallow: /private/

User-Agent: Applebot
Disallow: /

User-Agent: Bingbot
Disallow: /

Sitemap: https://acme.com/sitemap.xml
```

Robots object
```ts
type Robots = {
  rules:
    | {
        userAgent?: string | string[]
        allow?: string | string[]
        disallow?: string | string[]
        crawlDelay?: number
      }
    | Array<{
        userAgent: string | string[]
        allow?: string | string[]
        disallow?: string | string[]
        crawlDelay?: number
      }>
  sitemap?: string | string[]
  host?: string
}
```

Version History
| Version | Changes |
|---|---|
| v13.3.0 | `robots` introduced. |
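The `crawlDelay` and `host` fields of the Robots object are not exercised by the examples above. The following sketch uses both; the values are purely illustrative, and a local `Robots` type (the array-of-rules branch of the type above) stands in for `MetadataRoute.Robots` so the snippet is self-contained.

```typescript
// Sketch using the optional crawlDelay and host fields; values illustrative.
// Local stand-in for MetadataRoute.Robots so this compiles without 'next'.
type Robots = {
  rules: Array<{
    userAgent: string | string[]
    allow?: string | string[]
    disallow?: string | string[]
    crawlDelay?: number
  }>
  sitemap?: string | string[]
  host?: string
}

export default function robots(): Robots {
  return {
    rules: [
      // Ask well-behaved bots to wait 10 seconds between requests.
      { userAgent: '*', allow: '/', crawlDelay: 10 },
    ],
    // Advertise a preferred domain to crawlers that honor the Host directive.
    host: 'https://acme.com',
    sitemap: 'https://acme.com/sitemap.xml',
  }
}
```

Note that `Crawl-delay` and `Host` are non-standard directives that not every crawler honors; Googlebot, for instance, ignores `Crawl-delay`.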