{"id":13406,"library":"jsonparse","title":"JSON Streaming Parser for Node.js","description":"jsonparse is a pure JavaScript library providing a streaming JSON parser for Node.js environments. Currently at version 1.3.1, this package lets developers process large JSON inputs incrementally rather than loading the entire structure into memory, as the built-in `JSON.parse` does. This streaming capability is its primary differentiator, enabling memory-efficient handling of large datasets or continuous data feeds. However, the package has seen no updates since 2012, indicating it is no longer actively maintained. Its release cadence is effectively nonexistent, and it targets very old Node.js versions (0.2.0+). Developers should weigh its age carefully before adopting it in modern applications.","status":"abandoned","version":"1.3.1","language":"javascript","source_language":"en","source_url":"ssh://git@github.com/creationix/jsonparse","tags":["javascript"],"install":[{"cmd":"npm install jsonparse","lang":"bash","label":"npm"},{"cmd":"yarn add jsonparse","lang":"bash","label":"yarn"},{"cmd":"pnpm add jsonparse","lang":"bash","label":"pnpm"}],"dependencies":[],"imports":[{"note":"This library is CommonJS-only, reflecting its age. ESM `import` statements are not supported.","wrong":"import JSONParse from 'jsonparse';","symbol":"JSONParse","correct":"const JSONParse = require('jsonparse');"}],"quickstart":{"code":"const JSONParse = require('jsonparse');\n\n// Imagine this is a very large JSON stream from a file or network.\n// For demonstration, we use a string and simulate chunks.\nconst largeJsonString = `{\n  \"header\": {\"requestId\": \"abc-123\", \"timestamp\": \"${new Date().toISOString()}\"},\n  \"data\": [\n    {\"id\": 1, \"name\": \"Item A\", \"value\": 100},\n    {\"id\": 2, \"name\": \"Item B\", \"value\": 200},\n    {\"id\": 3, \"name\": \"Item C\", \"value\": 300},\n    {\"id\": 4, \"name\": \"Item D\", \"value\": 400},\n    {\"id\": 5, \"name\": \"Item E\", \"value\": 500}\n  ],\n  \"footer\": {\"processed\": true}\n}`;\n\nconst parser = new JSONParse();\nlet processedItemsCount = 0;\n\n// onValue fires for every primitive value, and for objects/arrays once they are complete.\n// It receives the value as its only argument; the current key (or array index) is available\n// as `this.key`, and `this.stack` holds the enclosing containers as {value, key, mode} entries.\nparser.onValue = function (value) {\n  if (this.stack.length === 2 && this.stack[1].key === 'data' && typeof this.key === 'number') {\n    // A complete element of the top-level 'data' array,\n    // e.g. {\"id\": 1, \"name\": \"Item A\", \"value\": 100}\n    console.log('Parsed item:', value);\n    processedItemsCount++;\n  } else if (this.stack.length === 1 && this.key === 'header') {\n    // The top-level 'header' object is complete.\n    console.log('Parsed header:', value);\n  } else if (this.stack.length === 0) {\n    // The root value is complete: the whole document has been parsed.\n    console.log(`\\nStreaming parse complete. Total items processed: ${processedItemsCount}`);\n  }\n};\n\nparser.onError = function (err) {\n  console.error('Error during JSON parsing:', err.message);\n  process.exit(1);\n};\n\n// Simulate receiving data in chunks. The parser has no end() method;\n// it emits each value as soon as write() has seen enough bytes.\nconst chunkSize = 50; // Smaller chunks for demonstration\nfor (let i = 0; i < largeJsonString.length; i += chunkSize) {\n  parser.write(largeJsonString.substring(i, i + chunkSize));\n}","lang":"javascript","description":"This quickstart demonstrates how to use `jsonparse` to incrementally parse a large JSON string, handling individual objects as they complete in the stream. It shows how to initialize the parser, use `this.stack` and `this.key` inside `onValue` to react to specific paths (array items, top-level objects, and the completed root), and report parsing errors via `onError`."},"warnings":[{"fix":"Consider modern, actively maintained streaming JSON parsers such as `JSONStream` or `@streamparser/json`, which offer better performance, security, and ESM compatibility. If full-document parsing is sufficient, `JSON.parse` is native and highly optimized.","message":"This package is effectively abandoned, with its last release over a decade ago (v1.3.1 in 2012). It has not been updated for modern JavaScript features, Node.js versions, or security practices; using it in contemporary projects may cause compatibility issues or introduce unpatched vulnerabilities.","severity":"breaking","affected_versions":">=1.0.0"},{"fix":"Ensure your project uses CommonJS, or use dynamic `import()` if you must load it from an ESM context (generally discouraged for unmaintained libraries). Better yet, refactor to a modern ESM-compatible streaming parser.","message":"The package is CommonJS-only (`require`). It does not support ES Modules (`import`), the standard module system in modern Node.js environments. Attempting to use `import` will result in a runtime error.","severity":"gotcha","affected_versions":">=1.0.0"},{"fix":"Benchmark against native `JSON.parse` (if the full string fits in memory) or other performant streaming libraries to assess suitability for your specific performance requirements. Libraries like `@streamparser/json` are designed for speed while remaining pure JS.","message":"This pure JavaScript implementation may be significantly slower than native JSON parsing (`JSON.parse`) or streaming parsers that use C++ addons for performance-critical operations. For high-throughput scenarios, this can be a bottleneck.","severity":"gotcha","affected_versions":">=1.0.0"},{"fix":"Thoroughly test `onValue` logic, especially with complex or malformed JSON. Pay attention to `this.stack` and `this.key` to determine the current parsing depth and path. Consider a JSONPath-like approach if you only need specific parts of the JSON.","tags":["performance","memory"],"message":"The `onValue` callback receives only the value; parsing context must be read from the parser itself (`this.stack`, `this.key`). Correctly reconstructing or acting upon specific parts of a deeply nested JSON structure requires careful state management within the `onValue` handler to avoid memory leaks or incorrect data handling.","severity":"gotcha","affected_versions":">=1.0.0"}],"env_vars":null,"last_verified":"2026-04-19T00:00:00.000Z","next_check":"2026-07-18T00:00:00.000Z","problems":[{"fix":"Change your import statement to `const JSONParse = require('jsonparse');` to correctly load the CommonJS module.","cause":"Attempting to use `new JSONParse()` after importing with ES Modules syntax (`import JSONParse from 'jsonparse'`) in a context where only CommonJS `require` is supported for this package.","error":"TypeError: JSONParse is not a constructor"},{"fix":"Inspect the input JSON for syntax errors (e.g., missing commas, unquoted keys, incorrect primitive values). Ensure that the data passed to `parser.write()` forms valid JSON when concatenated, and handle failures by overriding `parser.onError`.","cause":"The input stream contains malformed JSON, or the `write` method received an incomplete or invalid JSON chunk that could not be parsed correctly.","error":"SyntaxError: Unexpected token"}],"ecosystem":"npm","meta_description":null,"install_score":null,"install_tag":null,"quickstart_score":null,"quickstart_tag":null,"pypi_latest":null,"cli_name":"","cli_version":null}