Graphlit TypeScript Client SDK
The Graphlit TypeScript Client SDK, currently at version 1.0.20260422004, provides the official programmatic interface to the Graphlit Platform. The SDK simplifies the development of AI-powered applications by abstracting the complexities of knowledge retrieval and large language model (LLM) integration. It lets developers ingest diverse content types, including PDFs, websites, audio, and video, and leverage capabilities such as Retrieval-Augmented Generation (RAG) for conversational AI, automated insight extraction (summaries, entities, metadata), and knowledge-graph construction. The client library is actively maintained, with frequent releases covering features such as OpenAI Responses API support, streaming-provider resilience with automatic retries, and a comprehensive agent framework. Key differentiators include built-in context management, artifact collection, and an autonomous agent harness, aimed at accelerating the development of sophisticated AI applications.
Common errors

- error: Error: The 'graphlit-client' package requires Node.js version >=20.0.0.
  fix: Upgrade to Node.js 20 or later; use nvm to manage multiple Node.js versions.
- error: Graphlit API Error: Unauthorized - Invalid API Key or Secret
  fix: Ensure the GRAPHLIT_API_KEY and GRAPHLIT_SECRET_KEY environment variables are correctly set and contain valid credentials. Regenerate keys in the Graphlit platform if necessary.
- error: ReferenceError: require is not defined in ES module scope
  fix: Use ESM syntax (`"type": "module"` in package.json) with `import { GraphlitClient } from 'graphlit-client';`. If using CommonJS, ensure your setup correctly transpiles to or targets CommonJS modules, since the library may be ESM-only.

Warnings
- breaking: Starting with v1.6.0, the Graphlit Client SDK requires Node.js version 20 or higher. Earlier Node.js versions are no longer supported and will cause runtime errors.
- gotcha: The SDK automatically routes eligible OpenAI models (e.g., GPT-5.4) through the OpenAI Responses API for improved performance and intelligence. This can affect how certain parameters are handled (e.g., `temperature` is ignored unless reasoning effort is 'none'). Additionally, `tool_choice: "required"` is sent automatically on the first API request to ensure tool engagement, then reverts to `auto`.
- gotcha: Error event handling for streaming providers has been refined. Providers that previously both emitted an error event and threw an error now only emit the error event, preventing duplicate error-handling logic. A new `ProviderError` class normalizes error reporting.
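The automatic-retry behavior mentioned above is internal to the SDK. As a rough sketch of the pattern (not the SDK's actual implementation), a retry wrapper with exponential backoff looks like this:

```typescript
// Minimal sketch of retry-with-exponential-backoff, the kind of resilience
// the SDK applies internally to transient streaming-provider failures.
// This is an illustration, not the SDK's actual implementation.
async function withRetries<T>(
  fn: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 250
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (error) {
      lastError = error;
      if (attempt < maxAttempts) {
        // Exponential backoff: 250 ms, 500 ms, 1000 ms, ...
        await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** (attempt - 1)));
      }
    }
  }
  throw lastError;
}
```

Because the SDK already retries internally, a wrapper like this is only useful around your own calls, not around `streamAgent` itself.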
Install
npm install graphlit-client
yarn add graphlit-client
pnpm add graphlit-client

Imports
- GraphlitClient
  wrong: const GraphlitClient = require('graphlit-client').GraphlitClient;
  correct: import { GraphlitClient } from 'graphlit-client';
- StreamAgentOptions
  wrong: import type { StreamAgentOptions } from 'graphlit-client/types';
  correct: import { StreamAgentOptions } from 'graphlit-client';
- ProviderError
  wrong: import { ProviderError } from 'graphlit-client/errors';
  correct: import { ProviderError } from 'graphlit-client';
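If your project must stay on CommonJS, dynamic `import()` is one way to load an ESM-only package without converting the whole codebase. The sketch below demonstrates the pattern with a Node.js built-in module; the commented line shows how the same call would look for graphlit-client:

```typescript
// Dynamic import() works from both CommonJS and ESM code, so it can load an
// ESM-only package without switching the project to "type": "module".
// Demonstrated here with a built-in module to keep the sketch self-contained.
async function loadEsmModule(): Promise<string> {
  // In real code: const { GraphlitClient } = await import('graphlit-client');
  const path = await import('node:path');
  return path.join('a', 'b');
}

loadEsmModule().then((joined) => console.log(joined));
```

Note that TypeScript compiled with `"module": "commonjs"` downlevels `import()` to `require()`, which defeats the purpose for genuinely ESM-only packages; use `"module": "node16"` or later to preserve the dynamic import.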
Quickstart
import { GraphlitClient, StreamAgentOptions } from 'graphlit-client';
async function main() {
  const apiKey = process.env.GRAPHLIT_API_KEY ?? '';
  const secretKey = process.env.GRAPHLIT_SECRET_KEY ?? '';
  if (!apiKey || !secretKey) {
    console.error('Please set GRAPHLIT_API_KEY and GRAPHLIT_SECRET_KEY environment variables.');
    process.exit(1);
  }

  const client = new GraphlitClient({
    apiKey,
    secretKey,
    environment: 'prod' // Or 'dev', 'staging', etc.
  });

  const agentId = 'your-agent-id'; // Replace with a valid agent ID from your Graphlit platform
  const prompt = 'Summarize the key findings from our latest quarterly report.';
  console.log(`Running agent ${agentId} with prompt: "${prompt}"`);

  const agentOptions: StreamAgentOptions = {
    useResponsesApi: true, // Example option to route through the OpenAI Responses API
    debug: false,
    conversationId: `user-session-${Date.now()}`,
    // Additional options like model, maxTokens, and temperature can be configured here.
  };

  try {
    const stream = client.streamAgent(agentId, prompt, agentOptions);
    for await (const chunk of stream) {
      if (chunk.content) {
        process.stdout.write(chunk.content);
      }
      // Additional handling for other chunk types (e.g., tool_calls, reasoning)
    }
    console.log('\nAgent stream finished successfully.');
  } catch (error) {
    console.error('Error during agent streaming:', error);
  }
}
main().catch(console.error);
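The exact shape of `ProviderError` is not documented here. As a sketch of what normalized error reporting buys you, the following hypothetical helper collapses heterogeneous provider failures into one predictable shape (the field names and retryability heuristic are assumptions, not the SDK's actual API):

```typescript
// Sketch of the kind of normalization a ProviderError-style class performs:
// collapsing heterogeneous provider failures into one predictable shape so
// downstream error handling doesn't branch on each provider's error format.
// Fields and the retryability heuristic are illustrative assumptions.
interface NormalizedError {
  provider: string;
  message: string;
  retryable: boolean;
}

function normalizeProviderError(provider: string, error: unknown): NormalizedError {
  const message = error instanceof Error ? error.message : String(error);
  // Treat rate limits and timeouts as retryable; everything else as fatal.
  const retryable = /rate limit|timeout|429|503/i.test(message);
  return { provider, message, retryable };
}
```

In the quickstart's catch block, a single check on a normalized `retryable` flag could then decide whether to re-run the agent or surface the failure to the user.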