Pako - JavaScript Zlib Compression
Pako is a high-performance JavaScript port of the zlib library, designed for fast data compression and decompression directly within web browsers and Node.js environments. The current stable version is 2.1.0. It aims to provide binary-compatible results with the original C zlib implementation (v1.2.8, as per the README), achieving near-native speed in modern JavaScript engines for this CPU-intensive task. Pako differentiates itself through its modular design, which allows individual components to be used or browserified, and its support for raw deflate/inflate streams as well as the gzip format. It handles various input types, including automatic UTF-8 encoding of strings during compression and optional UTF-8 string output during decompression. Its release cadence is feature-driven and as-needed rather than tied to fixed time intervals.
Common errors
- ReferenceError: require is not defined
  - cause: Attempting to use `require()` in an ES module environment (e.g., Node.js with `"type": "module"` in `package.json`, or a browser without a bundler that transpiles CJS).
  - fix: For Pako v2.x, ensure your environment supports CommonJS or use a bundler (like Webpack or Rollup) configured for CJS interop. For Pako v3.x, switch to ES module imports: `import { deflate } from 'pako';`
- TypeError: pako.deflate is not a function
  - cause: This typically occurs when using `import pako from 'pako'` with Pako v3, or when mixing CJS/ESM import styles. In v3, `deflate` (and other core functions) are named exports, not properties of a default `pako` object.
  - fix: For Pako v3, change your import to `import { deflate, inflate } from 'pako';` and call the functions directly as `deflate(...)` and `inflate(...)`. If using v2 with a bundler that mishandles default exports, try `import * as pako from 'pako';` or stick with `const pako = require('pako');`.
- Error: incorrect header check
  - cause: The input data provided to `pako.inflate` (or the `Inflate` class) is corrupted, incomplete, or was compressed in a different format than expected (e.g., raw deflate data passed to `inflate`, which expects a zlib header, or vice versa).
  - fix: Verify the integrity and completeness of your compressed data, and make sure the decompression options (e.g., `raw` or `windowBits`) match the format used during compression. For example, use `pako.inflateRaw(data)` for raw deflate data, or `pako.inflate(data, { to: 'string' })` if you expect UTF-8 string output from a compatible `deflate` operation.
Warnings
- breaking: Starting with v3.0.0, Pako transitioned to an ESM-first package. While bundlers may handle CJS interop, direct `require('pako')` for the main entry point will lead to issues, and `import pako from 'pako'` will behave differently or require specific build configurations compared to v2.x.
- breaking: In Pako v3.x, methods like `deflate`, `inflate`, `deflateRaw`, and `inflateRaw` are direct named exports from the top-level `pako` module, rather than properties of a default `pako` object (e.g., `pako.deflate`). This changes the way these functions are accessed.
- gotcha: When using the streaming `Deflate` or `Inflate` classes, errors are not thrown as exceptions but are instead indicated by the `err` property on the instance after `push` operations. Failing to check `deflator.err` or `inflator.err` can lead to silent data corruption or incomplete results.
Install
- npm install pako
- yarn add pako
- pnpm add pako
Imports
- pako
  import pako from 'pako';
  const pako = require('pako');
- Deflate class
  import { Deflate } from 'pako';
  const { Deflate } = require('pako');
- inflate function
  import { inflate } from 'pako';
  const { inflate } = require('pako');
Quickstart
const pako = require('pako');
const data = { message: 'Hello, pako!', timestamp: Date.now(), items: ['a', 'b', 'c'] };
const stringifiedData = JSON.stringify(data);
console.log('Original size (bytes):', Buffer.byteLength(stringifiedData, 'utf8'));
// Compress stringified data
const compressed = pako.deflate(stringifiedData, { level: 9 }); // Use highest compression level
console.log('Compressed size (bytes):', compressed.length);
// Decompress back to string
const restoredString = pako.inflate(compressed, { to: 'string' });
console.log('Decompressed string matches original:', restoredString === stringifiedData);
const restoredData = JSON.parse(restoredString);
console.log('Restored data:', restoredData);
// Demonstrating stream interface for larger data
const deflator = new pako.Deflate();
deflator.push(new TextEncoder().encode('First chunk of data. '), false);
deflator.push(new TextEncoder().encode('Second chunk, and this is the end.'), true);
if (deflator.err) {
  console.error('Deflate error:', deflator.msg);
} else {
  console.log('Stream compressed data length:', deflator.result.length);
}