LangChain IBM Integration

1.0.6 · active · verified Tue Apr 14

langchain-ibm is an integration package that connects LangChain with IBM watsonx.ai, leveraging the ibm-watsonx-ai SDK. It provides wrappers for various IBM watsonx.ai capabilities, including chat models, large language models (LLMs), embedding models, and rerankers, enabling developers to build AI applications using IBM's foundation models within the LangChain framework. The library is actively maintained with frequent updates and releases.
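The capability-to-wrapper mapping can be summarized in a short sketch. The class names below reflect the package's public API in recent releases, but verify them against your installed version:

```python
# Capability -> wrapper class exposed by langchain_ibm
# (names believed current; check your installed version's docs).
WATSONX_WRAPPERS = {
    "chat models": "ChatWatsonx",
    "LLMs (text completion)": "WatsonxLLM",
    "embedding models": "WatsonxEmbeddings",
    "rerankers": "WatsonxRerank",
}

for capability, cls in WATSONX_WRAPPERS.items():
    print(f"{capability}: langchain_ibm.{cls}")
```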

Install

pip install -U langchain-ibm

Imports

from langchain_ibm import ChatWatsonx, WatsonxLLM, WatsonxEmbeddings, WatsonxRerank

Quickstart

This quickstart demonstrates how to initialize and use the `ChatWatsonx` model. It requires an IBM Cloud API key and a watsonx.ai Project ID, which should be set as environment variables for security. You also need to specify a `model_id` and the appropriate `url` for your watsonx.ai service instance.

import os
from getpass import getpass
from langchain_ibm import ChatWatsonx
from ibm_watsonx_ai.foundation_models.schema import TextChatParameters

# Prompt for credentials if they are not already set in the environment
# (langchain-ibm reads the API key from the WATSONX_APIKEY variable)
if not os.environ.get("WATSONX_APIKEY"):
    os.environ["WATSONX_APIKEY"] = getpass("Enter your IBM Cloud API Key: ")
if not os.environ.get("WATSONX_PROJECT_ID"):
    os.environ["WATSONX_PROJECT_ID"] = getpass("Enter your IBM watsonx.ai Project ID: ")

# Define model parameters
parameters = TextChatParameters(
    temperature=0.7,
    max_completion_tokens=500
)

# Initialize the ChatWatsonx model
# Replace with your actual model_id and url
model = ChatWatsonx(
    model_id="ibm/granite-13b-chat-v2", 
    url="https://us-south.ml.cloud.ibm.com", # Example URL, choose based on your region
    project_id=os.environ["WATSONX_PROJECT_ID"],
    params=parameters,
)

# Invoke the model
response = model.invoke("What is the capital of France?")
print(response.content)
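Since the quickstart reads credentials from the environment, it can help to fail fast before constructing the model. The helper below is a generic, stdlib-only sketch (not part of langchain-ibm); because the exact API-key variable name has varied across versions and docs (current LangChain docs use WATSONX_APIKEY), it takes the required names as an argument:

```python
import os

def check_env(required, env=os.environ):
    """Return the names of required environment variables that are unset or empty.

    Illustrative helper, not part of langchain-ibm: call it before
    constructing ChatWatsonx to surface missing configuration early.
    """
    return [name for name in required if not env.get(name)]

# Demo with a fake environment instead of real credentials:
fake_env = {"WATSONX_PROJECT_ID": "my-project"}
missing = check_env(("WATSONX_APIKEY", "WATSONX_PROJECT_ID"), env=fake_env)
print(missing)  # -> ['WATSONX_APIKEY']
```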
