Generate the robots.txt file for a site

seo-in-astro can generate a robots.txt file for your Astro site automatically, with no extra configuration.

The robots.txt file is generated by default; no configuration is needed. To see it in action:

  1. Run a build for your Astro project.

    npm run build
  2. Start the preview server.

    npm run preview
  3. Open http://localhost:4321/robots.txt in your browser to see the generated file.

By default, seo-in-astro generates a robots.txt file with sensible default rules that work well for most cases.
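The exact defaults depend on the integration version, but a typical permissive default for a robots.txt file looks like the following. Treat this as an illustration of the format, not seo-in-astro's guaranteed output:

```
User-agent: *
Allow: /
```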

However, you can override these defaults with the robotsTxt option in the integration configuration, which lets you specify custom rules, user agents, and more.

astro.config.mjs

import { defineConfig } from "astro/config";
import { seoInAstro } from "@dlcastillop/seo-in-astro";

export default defineConfig({
  integrations: [
    seoInAstro({
      baseUrl: "https://example.com",
      siteName: "Example",
      defaultOgImg: "/default-og.png",
      robotsTxt: {
        rules: [
          {
            userAgent: "*",
            allow: "/",
            disallow: ["/admin", "/private"],
          },
          {
            userAgent: "Googlebot",
            allow: "/",
            crawlDelay: 10,
          },
        ],
      },
    }),
  ],
});
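Assuming standard robots.txt serialization (one directive per line, with rule groups separated by blank lines), the configuration above should produce output along these lines:

```
User-agent: *
Allow: /
Disallow: /admin
Disallow: /private

User-agent: Googlebot
Allow: /
Crawl-delay: 10
```

Note that Crawl-delay is a non-standard directive: Googlebot ignores it, though some other crawlers honor it.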