# HTTP Content Encoding Handler
The `http-encoding` package provides a robust and consistent API for handling HTTP message body `Content-Encoding` headers. It supports a comprehensive set of common encodings, including Gzip, raw Deflate (with or without zlib wrappers), Brotli, Zstandard (Zstd), and Base64. The library is compatible with Node.js environments (specifically `>=v18.0.0`) and modern browsers, offering both promise-based buffer APIs (`decodeBuffer`, `encodeBuffer`) and web-standard `TransformStream` streaming APIs (`createDecodeStream`, `createEncodeStream`). It intelligently leverages native `CompressionStream` and `DecompressionStream` APIs for performance where available. All encoding names are handled case-insensitively, and various no-op encodings like 'identity' are explicitly supported. Currently at version 2.2.0, it is actively maintained as part of the broader HTTP Toolkit ecosystem. Its primary differentiator is its unified API for multiple complex encoding schemes, making it ideal for tasks involving HTTP proxying, inspection, or manipulation.
## Common errors
- `Error: The encoding 'brotli' cannot be handled synchronously.`
  - **Cause:** Using `decodeBufferSync` or `encodeBufferSync` with encodings like Brotli or Zstandard that require asynchronous processing.
  - **Fix:** Replace calls to `decodeBufferSync` or `encodeBufferSync` with their asynchronous counterparts, `decodeBuffer` and `encodeBuffer`.
- `TypeError: http_encoding_1.decodeBuffer is not a function`
  - **Cause:** Incorrect module import syntax, typically from `require()`-ing a named ESM export as a default, or incorrectly destructuring the CommonJS exports.
  - **Fix:** Use the named ESM import: `import { decodeBuffer } from 'http-encoding';`.
## Warnings
- **Gotcha:** The `decodeBufferSync` method is less performant and has limited codec support: it excludes Brotli and Zstandard. It is not recommended for production use with those encodings or in high-performance scenarios.
- **Gotcha:** The library leans heavily on the native `CompressionStream` and `DecompressionStream` APIs for its streaming functionality in Node.js 18+ and modern browsers. While it aims for broad compatibility, older Node.js versions or non-standard environments may offer limited functionality or require polyfills for these web-standard streams.
- **Gotcha:** The package takes an ESM-first approach, in line with its Node.js `>=v18.0.0` engine requirement. CommonJS `require()` may offer partial interoperability, but ESM `import` statements are strongly advised for full type safety, consistent behavior, and future compatibility.
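If you need to verify that the web-standard stream globals the library relies on are present in your runtime, a small feature check (using only standard globals, no library code) could look like this:

```typescript
// CompressionStream and DecompressionStream are globals in Node.js >= 18 and
// modern browsers; in older runtimes they may be missing and need a polyfill.
const g = globalThis as Record<string, unknown>;

export const hasNativeCompressionStreams =
  typeof g.CompressionStream === 'function' &&
  typeof g.DecompressionStream === 'function';

console.log(`Native compression streams available: ${hasNativeCompressionStreams}`);
```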
## Install
- `npm install http-encoding`
- `yarn add http-encoding`
- `pnpm add http-encoding`
## Imports
- `decodeBuffer`

  ```typescript
  import { decodeBuffer } from 'http-encoding';
  // or, in CommonJS:
  const { decodeBuffer } = require('http-encoding');
  ```

- `createEncodeStream` (note: this is a named export, not a default export)

  ```typescript
  import { createEncodeStream } from 'http-encoding';
  ```

- `gzip`

  ```typescript
  import { gzip, brotliDecompress } from 'http-encoding';
  // or via a namespace import:
  import * as encoding from 'http-encoding';
  encoding.gzip(...);
  ```
## Quickstart
```typescript
import { decodeBuffer, encodeBuffer, createDecodeStream, createEncodeStream } from 'http-encoding';
import { Readable } from 'stream'; // Node.js only, for the streaming example

async function handleContentEncoding() {
  const originalText = 'Hello, world! This is a test of HTTP content encoding.';
  const originalBuffer = Buffer.from(originalText, 'utf-8');

  // --- Buffer API example (Gzip) ---
  console.log('-- Buffer API (Gzip) --');
  const gzippedBuffer = await encodeBuffer(originalBuffer, 'gzip', { level: 9 });
  console.log(`Original size: ${originalBuffer.length} bytes, gzipped size: ${gzippedBuffer.length} bytes`);

  const decodedGzipBuffer = await decodeBuffer(gzippedBuffer, 'gzip');
  console.log(`Decoded gzip matches original: ${decodedGzipBuffer.toString('utf-8') === originalText}`);

  // --- Streaming API example (Brotli, Node.js) ---
  console.log('\n-- Streaming API (Brotli) --');

  // createEncodeStream and createDecodeStream return web-standard TransformStream
  // instances. In Node.js, bridge from native streams with Readable.toWeb; a cast
  // may be needed in TypeScript unless your tsconfig includes 'dom' lib types.
  const encoderStream = createEncodeStream('brotli');
  const decoderStream = createDecodeStream('brotli');
  if (!encoderStream || !decoderStream) {
    console.error('Brotli streaming not available. Ensure Node.js >= 18.');
    return;
  }

  // Pipe the original content through the encoder stream and collect the output:
  const encodedChunks: Buffer[] = [];
  const encodedStream = Readable.toWeb(Readable.from([originalBuffer]))
    .pipeThrough(encoderStream as any);
  for await (const chunk of encodedStream) encodedChunks.push(Buffer.from(chunk));
  const encodedTotal = Buffer.concat(encodedChunks);
  console.log(`Original size: ${originalBuffer.length} bytes, Brotli encoded size: ${encodedTotal.length} bytes`);

  // Pipe the encoded content back through the decoder stream:
  const decodedChunks: Buffer[] = [];
  const decodedStream = Readable.toWeb(Readable.from([encodedTotal]))
    .pipeThrough(decoderStream as any);
  for await (const chunk of decodedStream) decodedChunks.push(Buffer.from(chunk));
  const decodedTotal = Buffer.concat(decodedChunks);
  console.log(`Decoded Brotli matches original: ${decodedTotal.toString('utf-8') === originalText}`);
}

handleContentEncoding().catch(console.error);
```