SambaNova LangChain Integration

1.1.0 · active · verified Thu Apr 16

An integration package connecting SambaNova and LangChain, enabling developers to leverage SambaNova's large language models. It provides chat models (`ChatSambaNova`) and embedding models (`SambaNovaEmbeddings`) for seamless interaction with SambaCloud and SambaStack. The current version is 1.1.0, with a regular release cadence that incorporates new features and ensures compatibility with LangChain updates.

Install
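The package follows the standard LangChain partner-package naming convention; assuming it is published on PyPI as `langchain-sambanova`:

```shell
# Install (or upgrade) the SambaNova LangChain integration
pip install -U langchain-sambanova
```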

Imports

from langchain_sambanova import ChatSambaNova, SambaNovaEmbeddings

Quickstart

This quickstart demonstrates how to set up and use the `ChatSambaNova` model for chat completions and `SambaNovaEmbeddings` for text embeddings. It requires setting `SAMBANOVA_API_KEY` (and optionally `SAMBANOVA_API_BASE` for SambaStack) as environment variables.
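Before running the example, export the credentials in your shell. The values below are placeholders; the SambaStack base URL in particular depends on your deployment:

```shell
# API key from the SambaNova console (placeholder value)
export SAMBANOVA_API_KEY="your_sambanova_api_key_here"

# Only needed when targeting a SambaStack deployment (placeholder URL)
export SAMBANOVA_API_BASE="https://your-sambastack-host/v1"
```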

import os
from langchain_sambanova import ChatSambaNova

# Ensure SAMBANOVA_API_KEY is set as an environment variable
# For SambaStack, SAMBANOVA_API_BASE might also be required.
os.environ.setdefault('SAMBANOVA_API_KEY', 'your_sambanova_api_key_here')

# Initialize the SambaNova Chat model
llm = ChatSambaNova(
    model="Meta-Llama-3.3-70B-Instruct",  # Replace with your desired model
    max_tokens=1024,
    temperature=0.7,
    top_p=0.01,
)

# Invoke the model with a message
messages = [
    ("system", "You are a helpful assistant that translates English to French."),
    ("human", "I love programming."),
]

response = llm.invoke(messages)
print(response.content)

# Example for embeddings
from langchain_sambanova import SambaNovaEmbeddings

embeddings = SambaNovaEmbeddings(model="E5-Mistral-7B-Instruct") # Replace with your desired embedding model

query_embedding = embeddings.embed_query("What is the meaning of life?")
print(f"Embedding snippet: {str(query_embedding)[:50]}...")
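Embedding vectors are typically compared with cosine similarity, e.g. to rank documents against a query. A minimal sketch of that ranking step, using hand-made toy vectors in place of real model output (calling `embed_query` / `embed_documents` requires a valid API key):

```python
import math

def cosine_similarity(a, b):
    # cos(a, b) = dot(a, b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# In practice these would come from embeddings.embed_query(...) and
# embeddings.embed_documents(...); toy vectors stand in here.
query_vec = [0.1, 0.3, 0.5]
doc_vecs = [[0.1, 0.3, 0.5], [0.9, -0.2, 0.0]]

scores = [cosine_similarity(query_vec, d) for d in doc_vecs]
best = max(range(len(doc_vecs)), key=lambda i: scores[i])
print(best)  # the identical vector scores ~1.0, so index 0 wins
```

The same function applies unchanged to the 4096-dimensional vectors an embedding model returns; only the inputs differ.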
