LangChain xAI Integration

1.2.2 · active · verified Sun Apr 12

An integration package connecting xAI and LangChain (current version 1.2.2). It provides the `ChatXAI` interface for interacting with xAI's Grok models, including chat completions, streaming, and Live Search. The library is a partner package in the broader LangChain ecosystem, which follows a rapid release cycle with continuous updates to its integrations.

Warnings

Install
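The PyPI package name matches the import used in the quickstart below; a typical install (version pinning left to the reader):

```shell
pip install -U langchain-xai
```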

Imports
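The quickstart uses the following imports, listed here for quick reference:

```python
from langchain_xai import ChatXAI
from langchain_core.messages import HumanMessage, SystemMessage
```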

Quickstart

This example shows how to initialize the `ChatXAI` model, invoke it with a list of messages to get a response, and stream a response chunk by chunk. Ensure your `XAI_API_KEY` is set as an environment variable or passed directly to the `ChatXAI` constructor.

import os
from langchain_xai import ChatXAI
from langchain_core.messages import HumanMessage, SystemMessage

# Set your xAI API key as an environment variable (e.g., in your shell: export XAI_API_KEY="YOUR_API_KEY")
# Or pass it directly as xai_api_key="YOUR_API_KEY" to ChatXAI
api_key = os.environ.get("XAI_API_KEY", "YOUR_XAI_API_KEY_HERE") # Replace with actual key or ensure env var is set

if api_key == "YOUR_XAI_API_KEY_HERE":
    print("Warning: XAI_API_KEY not found in environment variables. Please set it or pass it directly.")
    print("Skipping example due to missing API key.")
else:
    model = ChatXAI(
        model="grok-4",
        temperature=0,
        xai_api_key=api_key # Pass the API key
    )

    messages = [
        SystemMessage(content="You are a helpful assistant."),
        HumanMessage(content="What is the capital of France?"),
    ]

    print("Invoking Grok model...")
    response = model.invoke(messages)
    print(response.content)

    print("\nStreaming response:")
    for chunk in model.stream(messages):
        print(chunk.content, end="", flush=True)  # flush so tokens appear as they arrive
    print()
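The Live Search feature mentioned above can be enabled by passing `search_parameters` to the constructor. A minimal sketch, assuming the `search_parameters` field accepts xAI's Live Search options (the `mode` and `max_search_results` keys here follow xAI's Live Search API and should be checked against current docs) and that a valid `XAI_API_KEY` is set:

```python
import os

from langchain_xai import ChatXAI

if os.environ.get("XAI_API_KEY"):
    # "auto" lets the model decide whether to search;
    # max_search_results caps how many sources are consulted.
    search_model = ChatXAI(
        model="grok-4",
        search_parameters={"mode": "auto", "max_search_results": 3},
    )
    result = search_model.invoke("What was xAI's most recent announcement?")
    print(result.content)
else:
    print("Skipping Live Search example: XAI_API_KEY is not set.")
```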
