The v6 major release of Nuxt Robots adds new content composables, improves validation with warnings support, and includes several important bug fixes.
## ⚠️ Breaking Changes
### Site Config v4
Nuxt Site Config is a module used internally by Nuxt Robots. Its major update to v4.0.0 shouldn't have any direct effect on your site; however, you may want to double-check the breaking changes.
### `robots:config` Hook Context
The `HookRobotsConfigContext` now includes a `warnings: string[]` field alongside the existing `errors: string[]`. If you use the `robots:config` Nitro hook and inspect the context, be aware of this new field.
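A minimal sketch of inspecting the new field from a Nitro plugin (the plugin filename is illustrative; `defineNitroPlugin` is auto-imported in Nitro projects):

```typescript
// server/plugins/robots-warnings.ts
export default defineNitroPlugin((nitroApp) => {
  nitroApp.hooks.hook('robots:config', (ctx) => {
    // v6 adds `warnings: string[]` alongside the existing `errors: string[]`
    if (ctx.warnings.length)
      console.warn('[robots] warnings:', ctx.warnings)
    if (ctx.errors.length)
      console.error('[robots] errors:', ctx.errors)
  })
})
```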
## 🚀 Features
### `defineRobotsSchema()` Composable
A new composable for `@nuxt/content` v3 that simplifies adding robots fields to your content schema.
```ts
import { defineCollection, defineContentConfig, z } from '@nuxt/content'
import { defineRobotsSchema } from '@nuxtjs/robots/content'

export default defineContentConfig({
  collections: {
    pages: defineCollection({
      type: 'page',
      source: '**/*.md',
      schema: z.object({
        ...defineRobotsSchema(z),
      }),
    }),
  },
})
```

This replaces the previous `asRobotsCollection()` helper, which is now deprecated.
### Validation Warnings
The robots.txt validation system now supports warnings in addition to errors. The first warning checks for `Disallow: /api` rules, which may unintentionally block API routes that need to be accessible.
Warnings appear in the devtools debug view alongside errors, helping you catch potential misconfigurations before they cause issues.
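For example, a configuration like the following would now surface a warning in the devtools debug view, since blocking `/api` wholesale can break routes your pages depend on (a sketch; adjust to your own config):

```typescript
// nuxt.config.ts
export default defineNuxtConfig({
  modules: ['@nuxtjs/robots'],
  robots: {
    // triggers the new `Disallow: /api` validation warning
    disallow: ['/api'],
  },
})
```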
### Production Debug Route
A new `/__robots__/debug-production.json` server route is available in development. It fetches your production site's robots.txt, validates it, and returns a structured response with errors, warnings, parsed groups, and sitemaps. This makes it easy to compare your local configuration against what's live in production.
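Assuming a response shaped like the description above (the field names and types here are assumptions for illustration, not the module's actual interfaces), consuming the endpoint might look like:

```typescript
// Hypothetical shape of the debug endpoint's JSON response.
interface RobotsDebugResponse {
  errors: string[]
  warnings: string[]
  groups: { userAgent: string[], disallow: string[], allow: string[] }[]
  sitemaps: string[]
}

// Reduce a fetched response to a one-line summary for quick comparison.
function summarizeDebug(res: RobotsDebugResponse): string {
  return [
    `${res.errors.length} error(s)`,
    `${res.warnings.length} warning(s)`,
    `${res.groups.length} group(s)`,
    `${res.sitemaps.length} sitemap(s)`,
  ].join(', ')
}

// Usage during development (requires the dev server running):
// const res = await $fetch<RobotsDebugResponse>('/__robots__/debug-production.json')
// console.log(summarizeDebug(res))
```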
## 🔧 Bug Fixes
### `skipSiteIndexable` Now Skips `Disallow: /`
Previously, setting `skipSiteIndexable: true` (used by sitemap generation) only skipped the site config indexable check. It now also filters out `Disallow: /` root disallow rules from path matching, ensuring sitemap URLs are correctly generated on staging or non-indexable environments. Specific path rules like `/admin` still apply as expected.
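A hypothetical sketch of the filtering behavior (the function name and rule shape are illustrative, not the module's internals): root-level `Disallow: /` entries are dropped, while specific paths survive.

```typescript
interface RobotsGroup {
  userAgent: string[]
  disallow: string[]
  allow: string[]
}

// With skipSiteIndexable, drop only the root `Disallow: /` rule from each
// group so sitemap URL matching isn't blanket-blocked on staging.
function filterRootDisallows(groups: RobotsGroup[]): RobotsGroup[] {
  return groups.map(group => ({
    ...group,
    disallow: group.disallow.filter(path => path !== '/'),
  }))
}
```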
### Route Rules Nullish Guard
Route rules with `undefined` or `null` values no longer cause runtime errors. The `normaliseRobotsRouteRule` function now safely handles nullish input.
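The guard can be pictured roughly like this (a simplified illustration, not the module's actual implementation): nullish values short-circuit to `undefined` instead of being destructured.

```typescript
interface RobotsRouteRule {
  indexable?: boolean
  rule?: string
}

// Normalise a route-rule value, returning undefined for nullish input
// rather than throwing when destructuring null/undefined.
function normaliseRouteRule(rule: unknown): RobotsRouteRule | undefined {
  if (rule === null || rule === undefined)
    return undefined
  if (typeof rule === 'boolean')
    return { indexable: rule }
  if (typeof rule === 'string')
    return { rule }
  return rule as RobotsRouteRule
}
```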
### Devtools Meta Tag Parsing
Meta tag parsing in the devtools debug view is now attribute order agnostic, fixing cases where `<meta content="..." name="robots">` was not detected.
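The idea behind the fix can be sketched like this: match each `<meta>` tag first, then read its attributes individually, so `name` and `content` can appear in either order (a simplified standalone sketch, not the devtools code):

```typescript
// Extract the robots meta tag's content attribute, regardless of
// whether `name` appears before or after `content` in the markup.
function parseRobotsMeta(html: string): string | undefined {
  const metaTags = html.match(/<meta\b[^>]*>/gi) ?? []
  for (const tag of metaTags) {
    const name = tag.match(/name\s*=\s*"([^"]*)"/i)?.[1]
    if (name?.toLowerCase() !== 'robots')
      continue
    return tag.match(/content\s*=\s*"([^"]*)"/i)?.[1]
  }
  return undefined
}
```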
## 🚀 Features
- **content**: Add `defineRobotsSchema()` composable - by @harlan-zw in #283 (ac97d)
- **validation**: Add warnings support and warn on `/api` disallow - by @harlan-zw in #287 (379bb)
## 🐞 Bug Fixes
- Guard against undefined `routeRules` values - by @harlan-zw in #277 (393d7)
- `skipSiteIndexable` now skips `disallow: /` robots.txt rules - by @harlan-zw in #282 (57957)
- **devtools**:
  - Make meta tag parsing attribute order agnostic - by @harlan-zw in #280 (37f52)
  - Production inspect mode - by @harlan-zw (3a626)