{"id":15667,"library":"kafkajs","title":"KafkaJS Client","description":"KafkaJS is a modern Apache Kafka client library for Node.js, currently at stable version 2.2.4. It covers producing and consuming messages as well as administrative tasks, with native support for Kafka 0.10+ and for 0.11 features such as transactions and custom authentication mechanisms. The library ships with full TypeScript definitions, ensuring a robust development experience. It takes a low-level, high-performance approach and maintains a frequent release cadence for bug fixes and minor enhancements, as shown by the series of patch releases that followed the v2.0.0 major release. A key differentiator is its attention to Node.js-specific behavior and robust error handling, such as the fixes that eliminated CPU spikes when brokers are unreachable.","status":"active","version":"2.2.4","language":"javascript","source_language":"en","source_url":"https://github.com/tulios/kafkajs","tags":["javascript","kafka","sasl","scram","typescript"],"install":[{"cmd":"npm install kafkajs","lang":"bash","label":"npm"},{"cmd":"yarn add kafkajs","lang":"bash","label":"yarn"},{"cmd":"pnpm add kafkajs","lang":"bash","label":"pnpm"}],"dependencies":[],"imports":[{"note":"The primary entry point for configuring the Kafka client. While CommonJS `require` still works, ESM `import` is the standard for modern Node.js applications, especially with TypeScript.","wrong":"const { Kafka } = require('kafkajs')","symbol":"Kafka","correct":"import { Kafka } from 'kafkajs'"},{"note":"These interfaces are typically instantiated via the `Kafka` instance (e.g., `kafka.producer()`), but their types are exported directly from the main `kafkajs` package. Avoid deep imports from internal paths such as `kafkajs/lib/...`, which are not part of the public API.","wrong":"import { Producer } from 'kafkajs/lib/producer'","symbol":"Producer, Consumer, Admin","correct":"import { Producer, Consumer, Admin } from 'kafkajs'"},{"note":"In TypeScript, prefer `import type` when importing only type definitions: it guarantees no runtime code is emitted, improving bundle size and clarity.","wrong":"import { EachMessagePayload } from 'kafkajs'","symbol":"EachMessagePayload, KafkaConfig","correct":"import type { EachMessagePayload, KafkaConfig } from 'kafkajs'"},{"note":"Provides access to the built-in partitioners, `Partitioners.DefaultPartitioner` and `Partitioners.LegacyPartitioner`. Import the `Partitioners` object and access the specific partitioner via dot notation.","wrong":"const { DefaultPartitioner } = require('kafkajs/partitioners')","symbol":"Partitioners","correct":"import { Partitioners } from 'kafkajs'"}],"quickstart":{"code":"import { Kafka, logLevel } from 'kafkajs';\n\nconst kafka = new Kafka({\n  clientId: 'my-app',\n  brokers: [process.env.KAFKA_BROKER_1 ?? 'localhost:9092'],\n  logLevel: logLevel.INFO,\n});\n\n// Note: v2.x logs a warning unless a partitioner is chosen explicitly; pass\n// { createPartitioner: Partitioners.LegacyPartitioner } to keep v1.x message routing.\nconst producer = kafka.producer();\nconst consumer = kafka.consumer({ groupId: 'my-group' });\n\nconst run = async () => {\n  // Producer\n  await producer.connect();\n  await producer.send({\n    topic: 'test-topic',\n    messages: [\n      { value: 'Hello KafkaJS user!' },\n      { key: 'my-key', value: 'Another message with a key!' },\n    ],\n  });\n  console.log('Message sent by producer.');\n\n  // Consumer\n  await consumer.connect();\n  await consumer.subscribe({ topic: 'test-topic', fromBeginning: true });\n\n  await consumer.run({\n    eachMessage: async ({ topic, partition, message }) => {\n      console.log({\n        value: message.value?.toString(),\n        key: message.key?.toString(),\n        headers: message.headers,\n        topic,\n        partition,\n      });\n    },\n  });\n  console.log('Consumer started, waiting for messages...');\n};\n\nrun().catch(async (e) => {\n  console.error(`[example/kafkajs] ${e.message}`, e);\n  await producer.disconnect();\n  await consumer.disconnect();\n  process.exit(1);\n});\n\nprocess.on('SIGTERM', async () => {\n  console.log('SIGTERM received, disconnecting Kafka clients.');\n  await producer.disconnect();\n  await consumer.disconnect();\n  process.exit(0);\n});","lang":"typescript","description":"This quickstart demonstrates how to initialize a Kafka client, send messages using a producer, and consume messages using a consumer, including basic error handling and graceful shutdown."},"warnings":[{"fix":"Thoroughly read the official v2.0.0 migration guide (https://kafka.js.org/docs/migration-guide-v2.0.0) before upgrading. Pay special attention to partitioner configuration and Node.js version requirements.","message":"Upgrading from v1.x to v2.0.0 introduces several breaking changes, including a new default partitioner, removal of Node.js 10/12 support, and changes to admin client methods and TypeScript enums. Not addressing these can lead to messages being routed to different partitions or to runtime errors.","severity":"breaking","affected_versions":">=2.0.0"},{"fix":"Upgrade to KafkaJS v2.1.0 or newer to resolve the issue of excessive CPU usage during prolonged broker unavailability.","message":"Prior to v2.1.0, KafkaJS consumers could exhibit 100% CPU utilization when all configured Kafka brokers were unavailable, leading to application unresponsiveness and crashes.","severity":"gotcha","affected_versions":"<2.1.0"},{"fix":"Upgrade to KafkaJS v2.2.3 or newer to fix the SASL/PLAIN authentication regression.","message":"A regression in versions 2.2.0 through 2.2.2 broke SASL/PLAIN authentication, preventing clients from connecting correctly when using this mechanism.","severity":"gotcha","affected_versions":"2.2.0 - 2.2.2"},{"fix":"Ensure you are using KafkaJS v2.2.4 or newer to mitigate issues with consumers getting stuck after throttling events.","message":"Consumers could get stuck or become unresponsive after very brief throttling periods, potentially halting message processing; this bug was resolved in v2.2.4.","severity":"gotcha","affected_versions":"<2.2.4"},{"fix":"Upgrade to KafkaJS v2.2.4 or newer to prevent infinite crash loops when no brokers are available, improving client resilience.","message":"Older versions could enter an infinite crash loop when no brokers were available during startup or reconnection, leading to application instability.","severity":"gotcha","affected_versions":"<2.2.4"},{"fix":"Ensure each logical consumer group (e.g., for different applications or environments) uses a unique `groupId`. Use the admin client's `describeGroups` to verify that a group contains only the expected members.","message":"Re-using a `groupId` across different applications, or across independent deployments of the same application, can lead to unexpected consumer behavior such as receiving messages for unsubscribed topics or frequent rebalances.","severity":"gotcha","affected_versions":"all"}],"env_vars":null,"last_verified":"2026-04-21T00:00:00.000Z","next_check":"2026-07-20T00:00:00.000Z","problems":[{"fix":"Upgrade KafkaJS to v2.2.4 or newer to fix the consumer group rejoining logic after `ILLEGAL_GENERATION` errors.","cause":"A consumer group coordination issue where a rebalance incorrectly led to an `ILLEGAL_GENERATION` error preventing consumers from rejoining the group.","error":"The group is rebalancing, so a rejoin is needed (ILLEGAL_GENERATION)"},{"fix":"Upgrade KafkaJS to v2.1.0 or newer to resolve the 100% CPU utilization issue during broker unavailability.","cause":"The KafkaJS client entering a tight loop when all configured Kafka brokers are unreachable or unavailable.","error":"Application consuming 100% CPU without processing messages."},{"fix":"Upgrade KafkaJS to v2.2.3 or newer to fix the regression in SASL/PLAIN authentication.","cause":"A regression in v2.2.0 through v2.2.2 broke SASL/PLAIN authentication support.","error":"Failed to connect: SASL authentication failed (when using PLAIN mechanism)"},{"fix":"Upgrade KafkaJS to v2.1.0 or newer to fix persistent errors when producing after a topic authorization error.","cause":"A producer could get stuck and persistently fail to send messages after encountering a topic authorization error.","error":"Error: Topic authorization failed (producer getting stuck)"}],"ecosystem":"npm"}