{"id":17240,"library":"got-scraping","title":"Got Scraping","description":"Got Scraping is an HTTP client built as an extension of the popular `got` library, specifically designed for web scraping tasks by making requests appear browser-like out of the box. It integrates features such as automatic browser-like header generation through the `header-generator` package, which allows for specifying desired browser, device, locale, and operating system profiles. It also simplifies proxy management, supporting HTTP and HTTPS proxies, including HTTP/2, and performs automatic ALPN negotiation for target servers. The package requires Node.js >=16 due to stability concerns with HTTP/2 in earlier versions. The current stable version is 4.2.1. However, as of its recent deprecation announcement, `got-scraping` is officially End-of-Life (EOL) and will no longer receive updates or support. For new projects, the maintainers strongly recommend migrating to `impit` (github.com/apify/impit), a modern, `fetch`-API-based HTTP client built on Rust's `reqwest` library, which offers a similar feature set and performance benefits.","status":"deprecated","version":"4.2.1","language":"javascript","source_language":"en","source_url":"https://github.com/apify/got-scraping","tags":["javascript","typescript"],"install":[{"cmd":"npm install got-scraping","lang":"bash","label":"npm"},{"cmd":"yarn add got-scraping","lang":"bash","label":"yarn"},{"cmd":"pnpm add got-scraping","lang":"bash","label":"pnpm"}],"dependencies":[{"reason":"Core HTTP client library that `got-scraping` extends and builds upon.","package":"got","optional":false}],"imports":[{"note":"The package is ESM-only since v4. 
For compatibility with CommonJS modules, use a dynamic import: `gotScraping ??= (await import('got-scraping')).gotScraping;`","wrong":"const { gotScraping } = require('got-scraping');","symbol":"gotScraping","correct":"import { gotScraping } from 'got-scraping';"},{"note":"Type definition for configuring `headerGeneratorOptions` when customizing browser-like headers.","symbol":"HeaderGeneratorOptions","correct":"import type { HeaderGeneratorOptions } from 'got-scraping';"},{"note":"Exports a function to retrieve the internal HTTP/HTTPS agents used by `got-scraping`, which can be useful for advanced agent management or debugging purposes.","symbol":"getAgents","correct":"import { getAgents } from 'got-scraping';"}],"quickstart":{"code":"import { gotScraping } from 'got-scraping';\n\nasync function fetchExampleData() {\n    try {\n        const response = await gotScraping.get({\n            url: 'https://httpbin.org/headers', // A simple endpoint to inspect request headers\n            proxyUrl: process.env.PROXY_URL ?? 
'http://username:password@proxy.example.com:8000',\n            useHeaderGenerator: true,\n            headerGeneratorOptions: {\n                browsers: [\n                    { name: 'chrome', minVersion: 90, maxVersion: 99 }\n                ],\n                devices: ['desktop'],\n                locales: ['en-US', 'de-DE'],\n                operatingSystems: ['windows', 'macos']\n            },\n            // Standard Got options are also available\n            timeout: { request: 30000 }, // 30-second total request timeout\n            retry: { limit: 2, methods: ['GET'] }, // Retry GET requests up to 2 times\n            headers: {\n                'X-Custom-Header': 'Scraping-Demo-Request'\n            }\n        });\n\n        console.log('Status Code:', response.statusCode);\n        console.log('Response Body (first 500 chars):', response.body.substring(0, 500));\n        // Further processing of `response.body` (e.g., JSON.parse) would happen here.\n    } catch (error) {\n        console.error('Failed to fetch data:', error instanceof Error ? error.message : error);\n    }\n}\n\nfetchExampleData();","lang":"typescript","description":"Demonstrates a basic GET request using `gotScraping` with proxy support and custom header generation options, showing how to fetch and log response data."},"warnings":[{"fix":"For new projects, use `impit` (npm install impit). For existing projects, evaluate the effort required to migrate to `impit` or understand that `got-scraping` will not be maintained further.","message":"The `got-scraping` package has been officially declared End-of-Life (EOL) by its maintainers. It will no longer receive updates, bug fixes, or support. 
Users are strongly advised to migrate to the recommended alternative, `impit`, for new projects and to plan migration for existing projects.","severity":"breaking","affected_versions":">=4.0.0"},{"fix":"Ensure your project is configured as an ES module (e.g., add `\"type\": \"module\"` to `package.json`) and use `import { gotScraping } from 'got-scraping';`. If you must use CommonJS, employ dynamic imports: `const { gotScraping } = await import('got-scraping');` or `gotScraping ??= (await import('got-scraping')).gotScraping;`.","message":"`got-scraping` is an ESM-only module since version 4. It can only be loaded with `import` statements in ES modules or dynamically with `import()` in CommonJS contexts. A direct `require()` call fails with `ERR_REQUIRE_ESM`.","severity":"breaking","affected_versions":">=4.0.0"},{"fix":"Upgrade your Node.js environment to version 16 or later using a version manager like `nvm` (e.g., `nvm install 16 && nvm use 16`).","message":"The package requires Node.js version 16 or newer. Running `got-scraping` on older Node.js versions may cause instability (particularly with HTTP/2 connections) or outright failures.","severity":"breaking","affected_versions":">=4.0.0"},{"fix":"Double-check the `proxyUrl` format and credentials. Ensure the proxy server is reachable and configured to accept connections from your environment. Use environment variables or a secure configuration store for credentials, not hardcoded values in production.","message":"When using proxies, ensure the `proxyUrl` is correctly formatted with scheme, host, port, and credentials if required (e.g., `http://username:password@myproxy.com:1234`). 
Incorrectly formatted URLs or invalid credentials will lead to proxy connection failures.","severity":"gotcha","affected_versions":">=4.0.0"},{"fix":"Pass the same `sessionToken` object with every request in a session: `gotScraping.get({ url: '...', sessionToken: mySessionObject });`. A new object always triggers generation of a fresh set of headers.","message":"By default, `got-scraping` generates new browser-like headers for each request. To keep a consistent set of headers across multiple requests, pass the same `sessionToken` object (any non-primitive value works, e.g., `{}`) with each request; requests sharing a token reuse the same generated headers.","severity":"gotcha","affected_versions":">=4.0.0"}],"env_vars":null,"last_verified":"2026-04-22T00:00:00.000Z","next_check":"2026-07-21T00:00:00.000Z","problems":[{"fix":"Use `import { gotScraping } from 'got-scraping';` in ES modules. From CommonJS code, load the package with a dynamic import instead: `const { gotScraping } = await import('got-scraping');`.","cause":"Calling CommonJS `require()` inside an ES module (e.g., a `.mjs` file or a package with `\"type\": \"module\"`), where `require` is not available. Note that calling `require('got-scraping')` from a CommonJS file fails differently, with `ERR_REQUIRE_ESM`.","error":"ReferenceError: require is not defined"},{"fix":"Import the named export: `import { gotScraping } from 'got-scraping';`. `gotScraping` is a pre-configured `got` instance, so call it directly (`gotScraping(options)`) or use its method shortcuts (`gotScraping.get(url)`, `gotScraping.post(options)`); do not apply the `new` keyword.","cause":"The imported value is not the callable instance, typically because a default or namespace import was used instead of the named `gotScraping` export, or because `new` was applied to it (it is an instance, not a constructor).","error":"TypeError: gotScraping is not a function"},{"fix":"Ensure the `proxyUrl` includes valid username and password credentials: `http://username:password@proxy.example.com:port`. 
Verify that the credentials are correct for your proxy.","cause":"The configured proxy server requires authentication, but either no credentials were provided or they are incorrect.","error":"Error: Proxy CONNECT Error: 407 Proxy Authentication Required"},{"fix":"Upgrade your Node.js environment to version 16 or higher. Use a Node version manager (e.g., `nvm`) to switch easily: `nvm install 16 && nvm use 16`.","cause":"`got-scraping` requires Node.js >=16. Installing or running it on an older version triggers npm's engines warning and may lead to runtime failures, such as syntax errors from unsupported language features.","error":"npm WARN EBADENGINE Unsupported engine"}],"ecosystem":"npm","meta_description":null}