{"id":10528,"library":"avsc","title":"Avsc - Avro for JavaScript","description":"Avsc is a pure JavaScript implementation of the Apache Avro specification, currently stable at version 5.7.9. It provides fast and compact data serialization and deserialization, often outperforming JSON with smaller encodings. Key features include comprehensive support for Avro type inference, schema evolution, logical types (e.g., handling JavaScript Date objects transparently), and remote procedure calls (RPC) with IDL support. The library is actively maintained, with recent minor updates indicating ongoing development. It differentiates itself by offering a complete Avro ecosystem within JavaScript, making it suitable for high-performance data interchange and integration with Avro-based systems like Apache Kafka, especially in Node.js environments. The package also ships with built-in TypeScript type definitions.","status":"active","version":"5.7.9","language":"javascript","source_language":"en","source_url":"git://github.com/mtth/avsc","tags":["javascript","api","avdl","avpr","avro","avsc","binary","buffer","data","typescript"],"install":[{"cmd":"npm install avsc","lang":"bash","label":"npm"},{"cmd":"yarn add avsc","lang":"bash","label":"yarn"},{"cmd":"pnpm add avsc","lang":"bash","label":"pnpm"}],"dependencies":[{"reason":"Optional dependency for decompressing Avro container files using Snappy compression.","package":"snappy","optional":true},{"reason":"Optional dependency for checksum validation in compressed Avro container files.","package":"buffer-crc32","optional":true}],"imports":[{"note":"While CommonJS `require` is common in older Node.js examples, ESM `import * as avro` is the correct pattern for modern TypeScript/ESM projects. 
TypeScript users will typically prefer named imports when possible.","wrong":"const avro = require('avsc');","symbol":"avro","correct":"import * as avro from 'avsc';"},{"note":"The primary `Type` class for schema and value operations is a named export. Ensure you import it directly or access it via the `avro` namespace.","wrong":"import avro from 'avsc'; const type = avro.Type; // 'avro' is not the default export","symbol":"Type","correct":"import { Type } from 'avsc';"},{"note":"For Avro RPC implementations, the `Service` class is a named export. It is used to define and create servers based on Avro protocols.","wrong":"import avro from 'avsc'; const service = avro.Service; // 'avro' is not the default export","symbol":"Service","correct":"import { Service } from 'avsc';"},{"note":"Utility functions like `createFileDecoder` are provided as named exports for direct import, simplifying usage and enabling tree-shaking in ESM environments.","wrong":"import avro from 'avsc'; const decoder = avro.createFileDecoder;","symbol":"createFileDecoder","correct":"import { createFileDecoder } from 'avsc';"}],"quickstart":{"code":"import { Type } from 'avsc';\n\nconst petSchema = {\n  type: 'record',\n  name: 'Pet',\n  fields: [\n    {\n      name: 'kind',\n      type: { type: 'enum', name: 'PetKind', symbols: ['CAT', 'DOG', 'FISH'] }\n    },\n    { name: 'name', type: 'string' }\n  ]\n};\n\n// Create an Avro Type from a schema definition\nconst petType = Type.forSchema(petSchema);\n\n// Encode a JavaScript object into an Avro binary buffer\nconst myPet = { kind: 'CAT', name: 'Albert' };\nconst buf = petType.toBuffer(myPet);\n\nconsole.log('Original object:', myPet);\nconsole.log('Encoded buffer:', buf.toString('hex'));\n\n// Decode the Avro binary buffer back into a JavaScript object\nconst decodedPet = petType.fromBuffer(buf);\n\nconsole.log('Decoded object:', decodedPet);\n\n// Example of schema inference for similar structures\nconst addressType = Type.forValue({\n  city: 
'Cambridge',\n  zipCodes: ['02138', '02139'],\n  visits: 2\n});\n\nconst otherAddress = { city: 'Seattle', zipCodes: ['98101'], visits: 3 };\nconst otherBuf = addressType.toBuffer(otherAddress);\nconsole.log('\\nInferred schema for:', otherAddress);\nconsole.log('Encoded (inferred) buffer:', otherBuf.toString('hex'));\n","lang":"typescript","description":"This quickstart demonstrates how to define an Avro schema, create an Avro `Type`, and then use it to encode and decode JavaScript objects into binary buffers. It also shows basic type inference."},"warnings":[{"fix":"Review and update code that handles Avro union types. If you relied on a wrapper object around union values, you may need to adjust your parsing logic or explicitly opt back into the wrapped representation via `Type.forSchema(schema, { wrapUnions: true })` (though migrating to the unwrapped format is recommended).","message":"Version 4.0.0 introduced a significant breaking change in how Avro unions are represented. Unions are now unwrapped by default, meaning they no longer use an encapsulating object for their value. This can affect code expecting the old wrapped structure.","severity":"breaking","affected_versions":">=4.0.0"},{"fix":"For applications requiring exact 64-bit integer precision, configure `avsc` to use a custom 'long' type backed by `BigInt` (Node.js >= 10) or a library like `long.js`. This is done by defining the custom type with `avro.types.LongType.__with(...)` and registering it when creating `Type` instances, e.g., `Type.forSchema(schema, { registry: { 'long': longType } })`.","message":"JavaScript's native `number` type is a 64-bit floating-point number, which can lead to precision loss for Avro's 64-bit integer (`long`) type if the values exceed `Number.MAX_SAFE_INTEGER` (2^53 - 1).","severity":"gotcha","affected_versions":">=0.11"},{"fix":"For complex IDL definitions with multiple imports, consider flattening your schema into a single file or using pre-processing steps to resolve imports before passing them to `avro.readProtocol` or `avro.assembleProtocol`. 
Note that plain schema-parsing helpers such as `readSchema` do not resolve imports at all.","message":"Import support for Avro IDL (AVDL) and JSON protocol (AVPR) files, introduced in v3.3.0, does not fully cover nested or external `import` statements when parsing schemas, unlike some other Avro implementations. This can complicate large, modular schema definitions.","severity":"gotcha","affected_versions":">=3.3.0"},{"fix":"Strictly adhere to Avro schema evolution rules (e.g., only add fields with default values, do not remove fields that lack defaults, and ensure type changes follow Avro's promotion rules). Thoroughly test schema compatibility between different versions of your data producers and consumers.","message":"While `avsc` supports schema evolution, providing incompatible reader and writer schemas (e.g., removing required fields or changing fundamental types without a compatible evolution rule) will result in runtime errors when resolving or decoding.","severity":"gotcha","affected_versions":">=0.11"}],"env_vars":null,"last_verified":"2026-04-19T00:00:00.000Z","next_check":"2026-07-18T00:00:00.000Z","problems":[{"fix":"Verify the integrity of the Avro binary data. Ensure the schema used for decoding (`reader's schema`) is compatible with the schema used for encoding (`writer's schema`). 
If reading from a file, check for file corruption or an incorrect file type.","cause":"Attempting to decode a corrupted or malformed Avro buffer, or using a reader schema that is fundamentally incompatible with the writer's schema.","error":"RangeError: Attempt to access memory outside buffer bounds"},{"fix":"Ensure that all non-nullable fields defined in your Avro schema are present in the JavaScript object you are attempting to serialize, or make the field nullable in the schema definition (e.g., `['null', 'string']`).","cause":"A JavaScript object being encoded is missing a field that is defined as 'required' (non-nullable) in the Avro schema.","error":"AvroTypeException: missing required field: 'fieldName'"},{"fix":"Review your Avro schema definition carefully against the Avro specification. Check for typos, missing `type` properties, incorrect capitalization of primitive types, or invalid complex type structures.","cause":"The schema provided to `Type.forSchema` is syntactically incorrect, incomplete, or contains invalid Avro type definitions.","error":"TypeError: \"type\" must be a valid Avro type."},{"fix":"Ensure all named types are fully defined or accessible within the context of the combined schemas when performing schema evolution. Check for correct namespaces and names of referenced types.","cause":"Occurs during schema resolution for evolution, indicating a named type (e.g., a record, enum, or fixed type) referenced in the writer's or reader's schema cannot be found or resolved.","error":"Error: No matching type for schema 'avro.FullName'"}],"ecosystem":"npm"}