{"id":13026,"library":"csv-parse","title":"CSV Parser for Node.js and Web","description":"csv-parse is a robust and flexible CSV parsing library for both Node.js and web environments, currently at version 6.2.1. It efficiently converts CSV text input into arrays or objects. A core feature is its implementation of the Node.js `stream.Transform` API, enabling scalable processing of large datasets with minimal memory footprint. For simpler use cases, it also offers convenient callback-based and synchronous APIs. Key differentiators include its extensive options for handling various CSV formats (delimiters, quotes, escapes, comments, line breaks, etc.), multiple distribution targets (Node.js, Web, ESM, CJS), a long and stable history since its initial release in 2010, and a strong focus on complete test coverage. The package is part of the larger `csv` project and integrates seamlessly with related packages like `csv-generate` and `csv-stringify`. It maintains a regular release cadence with ongoing development and support from Adaltas, making it a reliable choice for CSV parsing needs.","status":"active","version":"6.2.1","language":"javascript","source_language":"en","source_url":"https://github.com/adaltas/node-csv","tags":["javascript","csv","parse","parser","convert","tsv","stream","backend","frontend","typescript"],"install":[{"cmd":"npm install csv-parse","lang":"bash","label":"npm"},{"cmd":"yarn add csv-parse","lang":"bash","label":"yarn"},{"cmd":"pnpm add csv-parse","lang":"bash","label":"pnpm"}],"dependencies":[],"imports":[{"note":"For streaming or callback-based parsing in ESM. Since v5, all imports are named exports; default imports are no longer supported.","wrong":"import parse from 'csv-parse'","symbol":"parse","correct":"import { parse } from 'csv-parse'"},{"note":"For synchronous parsing in ESM. 
The path changed from `csv-parse/lib/sync` to `csv-parse/sync` in v5.","wrong":"import { parse } from 'csv-parse/lib/sync'","symbol":"parse (sync)","correct":"import { parse } from 'csv-parse/sync'"},{"note":"For streaming or callback-based parsing in CommonJS. Although `require('csv-parse')` works, destructuring is recommended for consistency with ESM.","wrong":"const parse = require('csv-parse')","symbol":"parse","correct":"const { parse } = require('csv-parse')"},{"note":"For synchronous parsing in CommonJS. The path for the sync module changed from `csv-parse/lib/sync` to `csv-parse/sync` in v5.","wrong":"const parse = require('csv-parse/lib/sync')","symbol":"parse (sync)","correct":"const { parse } = require('csv-parse/sync')"},{"note":"Type import for configuration options in TypeScript.","symbol":"Options","correct":"import type { Options } from 'csv-parse'"},{"note":"For browser-specific ESM builds. Directly importing from 'csv-parse' might pull Node.js polyfills, leading to issues.","wrong":"import { parse } from 'csv-parse'","symbol":"parse (browser ESM)","correct":"import { parse } from 'csv-parse/browser/esm'"}],"quickstart":{"code":"import assert from \"assert\";\nimport { parse } from \"csv-parse\";\n\nconst records = [];\n// Initialize the parser with a custom delimiter\nconst parser = parse({\n  delimiter: \":\",\n  // Pass `columns: true` here to output objects keyed by header names;\n  // it is omitted in this example, so each record is an array of fields\n});\n\n// Use the readable stream API to consume records asynchronously\nparser.on(\"readable\", function () {\n  let record;\n  while ((record = parser.read()) !== null) {\n    records.push(record);\n  }\n});\n\n// Catch any parsing or stream errors\nparser.on(\"error\", function (err) {\n  console.error(\"Parsing error:\", err.message);\n});\n\n// Test that the parsed records match the expected records\nparser.on(\"end\", function () {\n  assert.deepStrictEqual(records, [\n    [\"root\", \"x\", \"0\", 
\"0\", \"root\", \"/root\", \"/bin/bash\"],\n    [\"someone\", \"x\", \"1022\", \"1022\", \"\", \"/home/someone\", \"/bin/bash\"],\n  ]);\n  console.log(\"CSV parsing complete and successful!\");\n});\n\n// Write data to the stream - for a file, you'd pipe fs.createReadStream here\nparser.write(\"root:x:0:0:root:/root:/bin/bash\\n\");\nparser.write(\"someone:x:1022:1022::/home/someone:/bin/bash\\n\");\n\n// Close the writable stream to signal no more data will be written\nparser.end();\n","lang":"typescript","description":"Demonstrates basic stream-based CSV parsing with a custom delimiter, error handling, and record collection."},"warnings":[{"fix":"For ESM, use named imports: `import { parse } from 'csv-parse'`. For CJS sync API, change `require('csv-parse/lib/sync')` to `require('csv-parse/sync')`. Review the documentation for specific paths if you encounter module resolution errors.","message":"Version 5 of `csv-parse` introduced breaking changes related to ECMAScript Modules (ESM) migration. CommonJS consumers need to update import paths for sync modules from `require('csv-parse/lib/sync')` to `require('csv-parse/sync')`. All imports for `parse` (stream/callback) and `parse/sync` (synchronous) are now named exports, meaning `import { parse } from 'csv-parse'` is correct, while `import parse from 'csv-parse'` will fail.","severity":"breaking","affected_versions":">=5.0.0"},{"fix":"Update your parsing options and error handling logic to use the new names. Consult the `csv-parse` API documentation for the full list of renamed options and error codes.","message":"Several options were renamed in `csv-parse` v5 to improve clarity and consistency. Examples include `relax` renamed to `relax_quotes`, `skip_lines_with_empty_values` to `skip_records_with_empty_values`, and `skip_lines_with_error` to `skip_records_with_error`. 
Error codes were also renamed, e.g., `CSV_RECORD_DONT_MATCH_COLUMNS_LENGTH` became `CSV_RECORD_INCONSISTENT_COLUMNS`.","severity":"breaking","affected_versions":">=5.0.0"},{"fix":"Always attach your final processing logic (e.g., verifying `records` array content) to the `parser.on('end', ...)` event, not `parser.on('finish', ...)`.","message":"Incorrect handling of stream events can lead to incomplete data. The `finish` event is emitted when the writable stream has flushed all its input data, but it does not mean all *parsed records* have been consumed from the readable stream. To ensure all records are processed, you must use the `end` event of the readable stream.","severity":"gotcha","affected_versions":">=3.0.0"},{"fix":"Ensure your CSV data adheres to standard formatting. Use options like `relax_quotes: true` (named `relax` before v5) or `relax_column_count: true` (name unchanged in v5) to make the parser more tolerant of minor inconsistencies, but be aware this might obscure actual data quality issues. For unescaped quotes, ensure data is pre-processed or the `escape` option is correctly configured.","message":"Malformed CSV data can lead to parsing errors. Common issues include unescaped delimiters within text fields, unescaped double quotes within quoted strings, inconsistent delimiters, or inconsistent row lengths.","severity":"gotcha","affected_versions":">=3.0.0"},{"fix":"Always specify the correct `encoding` option if your CSV file is not UTF-8. For example, `parse({ encoding: 'latin1' })`. 
If reading from a file, ensure the `fs.createReadStream` also uses the correct encoding.","message":"When working with different character encodings (e.g., UTF-8, ISO-8859-1), incorrect or unspecified encoding can lead to garbled text or parsing errors, especially with special characters.","severity":"gotcha","affected_versions":">=3.0.0"}],"env_vars":null,"last_verified":"2026-04-19T00:00:00.000Z","next_check":"2026-07-18T00:00:00.000Z","problems":[{"fix":"For ESM, use `import { parse } from 'csv-parse'` or `import { parse } from 'csv-parse/sync'`. For CommonJS, ensure you are using the correct `require` paths, e.g., `const { parse } = require('csv-parse')` or `const { parse } = require('csv-parse/sync')` (note the change from `/lib/sync` in v5+).","cause":"Attempting to `require()` an ESM module from `csv-parse` in a CommonJS context (Node.js versions that enforce ESM for `.js` files when `\"type\": \"module\"` is present in `package.json`), or when using the wrong import path for CommonJS.","error":"Error [ERR_REQUIRE_ESM]: Must use import to load ES Module: ..."},{"fix":"For browser environments, use the specific browser ESM or IIFE builds provided by the library. For example, `import { parse } from 'csv-parse/browser/esm'` if using a module bundler like Webpack. Ensure your bundler is configured to correctly handle Node.js polyfills if you must use the Node.js distribution in the browser.","cause":"This typically occurs in browser environments when a Node.js-specific module (like `buffer` or stream polyfills) is expected but not available or correctly bundled. This often happens when importing the Node.js distribution of `csv-parse` into a browser project without proper polyfilling/bundling setup.","error":"Error: Cannot find module 'buffer' (or '_stream_readable.js:46 Uncaught Error: Cannot find module 'buffer'')"},{"fix":"Inspect the CSV data for malformed quoted fields. 
If the data quality is inconsistent, consider setting the `relax_quotes` option to `true` (formerly `relax`) to make the parser more tolerant, or ensure the `escape` option is correctly configured if a custom escape character is used.","cause":"An unescaped quote character was found at an unexpected location, usually within a quoted field, causing the parser to incorrectly terminate the field. This also includes an opening quote not being closed by the end of the data.","error":"Parse Error: CSV_INVALID_CLOSING_QUOTE"},{"fix":"Verify the CSV data for structural consistency. If variable column counts are expected, set the `relax_column_count` option to `true` (available since v3; the name is unchanged in v5). If the `columns` option is enabled and a record doesn't match the defined columns, the error `CSV_RECORD_DONT_MATCH_COLUMNS_LENGTH` (now `CSV_RECORD_INCONSISTENT_COLUMNS`) may occur.","cause":"A record (row) in the CSV has a different number of fields (columns) than previous records, which violates the expected consistent structure.","error":"Parse Error: CSV_RECORD_INCONSISTENT_FIELDS_LENGTH"}],"ecosystem":"npm","meta_description":null,"install_score":null,"install_tag":null,"quickstart_score":null,"quickstart_tag":null,"pypi_latest":null,"cli_name":""}