# Generate the robots.txt file for a site
seo-in-astro can automatically generate a robots.txt file for your Astro site.
## Enabling robots.txt generation

The robots.txt file is generated by default, with no configuration needed: seo-in-astro ships sensible default rules that work well for most cases.
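For reference, robots.txt directives are grouped by user agent. A permissive setup that lets every crawler index the whole site looks like this (shown for illustration only; the integration's actual default rules may differ):

```txt
User-agent: *
Allow: /
```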
## Viewing the robots.txt file

1. Run a build for your Astro project:

   ```sh
   npm run build
   # or: yarn build
   # or: pnpm build
   ```

2. Start the preview server:

   ```sh
   npm run preview
   # or: yarn preview
   # or: pnpm preview
   ```

3. Open http://localhost:4321/robots.txt in your browser to see the generated file.
## Advanced robots configuration

The default rules cover most cases, but you can override them with the robotsTxt option in the integration configuration. This option lets you specify custom rules, user agents, and more.
```ts
// astro.config.mjs
import { defineConfig } from "astro/config";
import { seoInAstro } from "@dlcastillop/seo-in-astro";

export default defineConfig({
  integrations: [
    seoInAstro({
      baseUrl: "https://example.com",
      siteName: "Example",
      defaultOgImg: "/default-og.png",
      robotsTxt: {
        rules: [
          {
            userAgent: "*",
            allow: "/",
            disallow: ["/admin", "/private"],
          },
          {
            userAgent: "Googlebot",
            allow: "/",
            crawlDelay: 10,
          },
        ],
      },
    }),
  ],
});
```
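To see how rule objects like these map onto robots.txt directives, here is a small standalone sketch. It is not seo-in-astro's actual implementation; the `RobotsRule` interface and `serializeRules` helper are hypothetical names used purely to illustrate the expected mapping:

```typescript
// Hypothetical sketch of how a rules array could be serialized into
// robots.txt directives. Not the integration's real code.
interface RobotsRule {
  userAgent: string;
  allow?: string | string[];
  disallow?: string | string[];
  crawlDelay?: number;
}

function serializeRules(rules: RobotsRule[]): string {
  // Normalize a value that may be a single path or a list of paths.
  const toArray = (v?: string | string[]) =>
    v === undefined ? [] : Array.isArray(v) ? v : [v];

  return rules
    .map((rule) => {
      const lines = [`User-agent: ${rule.userAgent}`];
      for (const path of toArray(rule.allow)) lines.push(`Allow: ${path}`);
      for (const path of toArray(rule.disallow)) lines.push(`Disallow: ${path}`);
      if (rule.crawlDelay !== undefined) lines.push(`Crawl-delay: ${rule.crawlDelay}`);
      return lines.join("\n");
    })
    .join("\n\n"); // blank line between user-agent groups
}

console.log(
  serializeRules([
    { userAgent: "*", allow: "/", disallow: ["/admin", "/private"] },
    { userAgent: "Googlebot", allow: "/", crawlDelay: 10 },
  ])
);
```

With the rules from the configuration above, this sketch prints one directive group per user agent, separated by a blank line, with each `disallow` entry expanded into its own `Disallow:` line.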