Express Limiter
Express Limiter is a rate-limiting middleware for Express applications, built specifically on Redis. It lets developers configure limits based on request properties such as IP address, user ID, or custom lookup functions, with granular control over total requests, expiration windows, whitelisting, and custom handling of rate-limited requests. The package is at version 1.6.1, last published to npm in September 2017. Due to its age and lack of updates, it is largely considered unmaintained, and actively developed alternatives such as `express-rate-limit` are preferred for new projects. Its distinguishing feature is tight coupling with Redis, which makes it suitable for distributed rate limiting.
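As a sketch of the per-user configuration mentioned above: the `lookup` option takes dotted paths that are resolved against the request object, so limits can be keyed by an authenticated user's ID rather than an IP. The `/api/profile` path and `req.user.id` are assumptions here (populated by your own auth middleware), not part of the library.

```javascript
// Sketch: keying limits on an authenticated user rather than an IP.
// Assumes `app` is an Express app, `client` a connected Redis client,
// and that earlier auth middleware has populated `req.user.id`.
const limiter = require('express-limiter')(app, client);

limiter({
  path: '/api/profile',
  method: 'get',
  lookup: ['user.id'],   // dotted path resolved against req, i.e. req.user.id
  total: 100,
  expire: 1000 * 60 * 60 // 100 requests per user per hour
});
```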
Common errors
error: Redis connection error: Error: connect ECONNREFUSED 127.0.0.1:6379
Start a local Redis server (`redis-server`) and verify its configuration. Ensure your application's Redis client is configured with the correct host and port.

error: TypeError: require(...) is not a function
Call the `require('express-limiter')` result with your Express app/router and a Redis client: `const limiter = require('express-limiter')(app, client);`

error: All requests are being rate-limited globally, not per-user.
Set `app.set('trust proxy', true)` in your Express application and configure `lookup: 'headers.x-forwarded-for'` in your limiter options. Set trust proxy to a specific IP or subnet if known, for better security.

Warnings
- breaking: The `express-limiter` package is considered abandoned, with no significant updates since September 2017. For new projects or actively maintained applications, it is strongly recommended to use a modern, maintained alternative such as `express-rate-limit`.
- gotcha: This library critically depends on a running Redis instance. If the Redis connection fails, the `express-limiter` middleware may fail or behave unexpectedly unless `ignoreErrors` is configured.
- gotcha: When running behind a proxy (e.g., Nginx, cloud load balancers), `connection.remoteAddress` will likely reflect the proxy's IP, not the actual client's. This can lead to global rate limiting for all users behind that proxy.
- gotcha: By default (`ignoreErrors: false`), any error from Redis prevents the middleware from calling `next()`, potentially stalling requests. The default behavior also sends a generic 'Rate limit exceeded' message.
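The proxy and Redis-failure gotchas above can be addressed together. The following is a sketch, assuming an Nginx-style reverse proxy that sets `X-Forwarded-For`; the `/api/action` path and limits mirror the Quickstart:

```javascript
// Sketch: rate limiting behind a reverse proxy, tolerating Redis outages.
const express = require('express');
const app = express();
const client = require('redis').createClient();
client.on('error', (err) => console.error('Redis Client Error', err));

const limiter = require('express-limiter')(app, client);

app.set('trust proxy', 1); // or a specific IP/subnet for tighter security

limiter({
  path: '/api/action',
  method: 'get',
  lookup: ['headers.x-forwarded-for'], // key on the forwarded client IP
  total: 150,
  expire: 1000 * 60 * 60,
  ignoreErrors: true // let requests through if Redis is unavailable
});
```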
Install
npm install express-limiter
yarn add express-limiter
pnpm add express-limiter

Imports
- limiterFactory
  wrong: `import limiterFactory from 'express-limiter';`
  correct: `const limiterFactory = require('express-limiter');`
- limiter
  `const limiter = require('express-limiter')(app, client);`
- limiterMiddleware
  wrong: `app.use(limiter);`
  correct: `app.get('/api/action', limiter({ lookup: 'connection.remoteAddress' }), function (req, res) { /* ... */ });`
Quickstart
const express = require('express');
const app = express();
const client = require('redis').createClient();
// Basic error logging for redis client
client.on('error', (err) => console.error('Redis Client Error', err));
const limiter = require('express-limiter')(app, client);
limiter({
path: '/api/action',
method: 'get',
lookup: ['connection.remoteAddress'],
total: 150,
expire: 1000 * 60 * 60 // 150 requests per hour
});
app.get('/api/action', function (req, res) {
res.status(200).send('ok');
});
// Start the Express server
const PORT = process.env.PORT || 3000;
app.listen(PORT, () => {
console.log(`Express server running on port ${PORT}`);
});
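Beyond the Quickstart, the whitelisting and custom rate-limited handling mentioned in the overview can be sketched as follows. The `/api/search` path, the `api_key` query parameter, and the limit values are assumptions for illustration:

```javascript
// Sketch: whitelisting and a custom response for rate-limited requests.
// Assumes `app` and a connected Redis `client` as in the Quickstart.
const limiter = require('express-limiter')(app, client);

limiter({
  path: '/api/search',
  method: 'get',
  lookup: ['connection.remoteAddress'],
  total: 50,
  expire: 1000 * 60, // 50 requests per minute
  whitelist: function (req) {
    // Skip rate limiting for requests carrying an API key.
    return !!req.query.api_key;
  },
  onRateLimited: function (req, res, next) {
    // Replace the generic 'Rate limit exceeded' response.
    res.status(429).json({ error: 'Too many requests, slow down.' });
  }
});
```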