express-robots-middleware


Middleware for Express.js applications that dynamically generates a robots.txt file. Version 1.0.1 is the current stable release; updates have been infrequent since 2016. It supports multiple user agents, allow/disallow directives, and crawl-delay configuration. Its main draws are simple integration as route middleware and no dependencies beyond Express itself.

error: TypeError: Cannot read property 'set' of undefined
cause: The middleware was used without being bound to a route (e.g., app.use(robotsMiddleware) instead of app.get('/robots.txt', robotsMiddleware)).
fix: Bind the middleware to the '/robots.txt' route: app.get('/robots.txt', robotsMiddleware).
error: TypeError: expressRobotsMiddleware is not a function
cause: The default export was imported in a CommonJS environment without unwrapping the .default property.
fix: Use const expressRobotsMiddleware = require('express-robots-middleware').default;
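The interop issue can be illustrated without the package itself: transpiled ES modules expose their callable under a .default property when loaded via require() (the export shape below is an assumption for illustration, not the package's verified internals):

```javascript
// Assumed shape of a transpiled ES module as seen from require() in CommonJS
const mod = { __esModule: true, default: (options) => () => options };

// Wrong: the namespace object itself is not callable
// mod({ UserAgent: '*' }); // TypeError: mod is not a function

// Right: unwrap the default export, falling back for plain CJS exports
const expressRobotsMiddleware = mod.default || mod;
const middleware = expressRobotsMiddleware({ UserAgent: '*' });
```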
gotcha: Options must be passed as an object with capitalized keys: UserAgent, Disallow, Allow, CrawlDelay. Lowercase keys are silently ignored.
fix: Use exactly { UserAgent: string, Disallow: string[], Allow: string, CrawlDelay: string }.
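How these capitalized keys plausibly map to robots.txt directives can be sketched with a hand-rolled serializer (the exact output format is an assumption, not the package's verified behavior):

```javascript
// Hypothetical serializer mirroring the option shape above
function renderRobots({ UserAgent, Disallow, Allow, CrawlDelay }) {
  const lines = [`User-agent: ${UserAgent}`];
  for (const path of Disallow) lines.push(`Disallow: ${path}`); // one line per path
  lines.push(`Allow: ${Allow}`);
  lines.push(`Crawl-delay: ${CrawlDelay}`);
  return lines.join('\n') + '\n';
}

const txt = renderRobots({
  UserAgent: '*',
  Disallow: ['/private', '/tmp'],
  Allow: '/',
  CrawlDelay: '10',
});
```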
breaking: In version 1.0.0 the middleware returned a function that sent the response directly; in 1.0.1 it calls next() for compatibility with Express error handling.
fix: Upgrade to 1.0.1 to avoid unexpected behavior with Express error middleware.
gotcha: The middleware does not validate its options. Invalid or missing fields result in an empty robots.txt or runtime errors.
fix: Always provide all required fields: UserAgent, a Disallow array, Allow, and CrawlDelay.
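Since the middleware performs no validation, a small guard run before constructing it can catch mistakes early (a hypothetical helper, not part of the package):

```javascript
// Hypothetical pre-flight check; throws instead of letting the middleware
// silently serve an empty robots.txt
function assertRobotsOptions(options) {
  if (typeof options.UserAgent !== 'string') throw new TypeError('UserAgent must be a string');
  if (!Array.isArray(options.Disallow)) throw new TypeError('Disallow must be an array of strings');
  if (typeof options.Allow !== 'string') throw new TypeError('Allow must be a string');
  if (typeof options.CrawlDelay !== 'string') throw new TypeError('CrawlDelay must be a string');
  // Lowercase keys are silently ignored by the middleware, so flag them here
  for (const key of ['userAgent', 'disallow', 'allow', 'crawlDelay']) {
    if (key in options) throw new TypeError(`Use the capitalized key, not "${key}"`);
  }
  return options;
}
```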
npm install express-robots-middleware
yarn add express-robots-middleware
pnpm add express-robots-middleware

Creates an Express app that serves a robots.txt with a disallow list and crawl delay.

import express from 'express';
import expressRobotsMiddleware from 'express-robots-middleware';

const app = express();

// Capitalized keys are required; lowercase keys are silently ignored
const robotsMiddleware = expressRobotsMiddleware({
  UserAgent: '*',                  // applies to all crawlers
  Disallow: ['/private', '/tmp'],  // paths crawlers must skip
  Allow: '/',
  CrawlDelay: '10'                 // delay in seconds
});

// Bind to the route; app.use() alone will fail at request time
app.get('/robots.txt', robotsMiddleware);

app.listen(3000, () => console.log('Server running on port 3000'));
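Without spinning up a server, the route's behavior can be exercised against a mock response object (a sketch: the handler body below is a hand-rolled stand-in for what the configured middleware plausibly sends, not the package's actual implementation):

```javascript
// Hand-rolled equivalent of the configured '/robots.txt' handler (assumed output)
function robotsHandler(req, res) {
  const body = [
    'User-agent: *',
    'Disallow: /private',
    'Disallow: /tmp',
    'Allow: /',
    'Crawl-delay: 10',
  ].join('\n') + '\n';
  res.set('Content-Type', 'text/plain');
  res.send(body);
}

// Minimal mock of the Express response surface used above
const calls = {};
const res = {
  set: (name, value) => { calls.header = [name, value]; return res; },
  send: (body) => { calls.body = body; return res; },
};
robotsHandler({}, res);
```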