# Generate the robots.txt file for a site

`seo-in-astro` can generate the `robots.txt` file for your Astro site with automatic configuration.

## Enabling robots.txt generation

The `robots.txt` file is generated by default, with no configuration needed. `seo-in-astro` ships sensible default rules that work well for most sites.

## Viewing the robots.txt file

1. Run a build for your Astro project.

   ```sh
   # npm
   npm run build

   # yarn
   yarn build

   # pnpm
   pnpm build
   ```

2. Start the preview server.

   ```sh
   # npm
   npm run preview

   # yarn
   yarn preview

   # pnpm
   pnpm preview
   ```

3. Open `http://localhost:4321/robots.txt` in your browser to see the generated file.
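
   The exact default rules depend on your `seo-in-astro` version, but a permissive default `robots.txt` typically looks something like this:

   ```txt
   User-agent: *
   Allow: /
   ```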

## Advanced robots configuration

By default, `seo-in-astro` generates a `robots.txt` file with sensible rules that work well for most cases. To override these defaults, use the `robotsTxt` option in the integration configuration. It lets you specify custom rules, target specific user agents, set crawl delays, and more.

```js
// astro.config.mjs
import { defineConfig } from "astro/config";
import { seoInAstro } from "seo-in-astro";

export default defineConfig({
  integrations: [
    seoInAstro({
      baseUrl: "https://example.com",
      siteName: "Example",
      defaultOgImg: "/default-og.png",
      robotsTxt: {
        rules: [
          {
            userAgent: "*",
            allow: "/",
            disallow: ["/admin", "/private"],
          },
          {
            userAgent: "Googlebot",
            allow: "/",
            crawlDelay: 10,
          },
        ],
      },
    }),
  ],
});
```
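
Assuming the integration maps each entry in `rules` to the standard `robots.txt` directives (`User-agent`, `Allow`, `Disallow`, `Crawl-delay`), the configuration above would produce output along these lines:

```txt
User-agent: *
Allow: /
Disallow: /admin
Disallow: /private

User-agent: Googlebot
Allow: /
Crawl-delay: 10
```

This is a sketch of the expected output, not a guaranteed byte-for-byte result; run a build and check `dist/robots.txt` (or the preview URL above) to confirm what your version emits.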