Databricks Agents SDK
The Databricks Agents SDK provides tools and interfaces for building AI agents on the Databricks platform. It supports creating, deploying, and managing agents that can use tools and LLMs, and it integrates with MLflow for logging and tracing. The current version is 1.9.4; releases track the evolving AI and MLflow ecosystems.
Warnings
- gotcha The `databricks-agents` library requires a Databricks workspace and a deployed LLM serving endpoint to function correctly, even for basic usage. It is not designed for standalone, local-only operation without a Databricks backend.
- gotcha The library integrates heavily with the LangChain ecosystem. Breaking changes or significant API shifts in underlying LangChain packages (e.g., `langchain-core`, `langchain-community`) can indirectly affect usage patterns or require code adjustments in `databricks-agents` applications.
- gotcha For logging, tracing, and experiment tracking, `databricks-agents` is designed to work with MLflow. While MLflow is not strictly mandatory, skipping the integration forfeits valuable debugging and performance-monitoring capabilities.
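Because the library assumes a reachable Databricks workspace (first gotcha above), it can help to fail fast when credentials are missing before constructing any agent objects. A minimal pre-flight check using only the standard library; the variable names mirror those in the Quickstart, but the helper itself is illustrative and not part of the SDK:

```python
import os

REQUIRED_VARS = ("DATABRICKS_HOST", "DATABRICKS_TOKEN")

def missing_databricks_config(env=os.environ):
    """Return the names of required Databricks variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

# Example: check a partial configuration before doing any SDK work
missing = missing_databricks_config({"DATABRICKS_HOST": "https://example.cloud.databricks.com"})
if missing:
    print(f"Missing Databricks configuration: {', '.join(missing)}")
```

Calling this at startup turns an opaque mid-run connection failure into an immediate, actionable error message.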
Install
- pip install databricks-agents
Imports
- ChatAgent
from databricks.agents.chat.agent import ChatAgent
- DatabricksLLM
from databricks.agents.providers.databricks import DatabricksLLM
- DollyTool
from databricks.agents.tools.dolly.tool import DollyTool
Quickstart
import os
from databricks.agents.chat.agent import ChatAgent
from databricks.agents.providers.databricks import DatabricksLLM
# Ensure these environment variables are set for Databricks connectivity
# export DATABRICKS_HOST="https://<your-workspace-url>"
# export DATABRICKS_TOKEN="dapi<your-token>"
# export DATABRICKS_LLM_ENDPOINT="<your-model-serving-endpoint>" (e.g., 'databricks-mixtral-8x7b-instruct')
databricks_host = os.environ.get('DATABRICKS_HOST', 'https://your-workspace.cloud.databricks.com')
databricks_token = os.environ.get('DATABRICKS_TOKEN', 'YOUR_DATABRICKS_TOKEN')
llm_endpoint = os.environ.get('DATABRICKS_LLM_ENDPOINT', 'databricks-mixtral-8x7b-instruct')
# Initialize a Databricks-backed LLM provider
llm = DatabricksLLM(
    endpoint=llm_endpoint,
    host=databricks_host,
    api_token=databricks_token,
    temperature=0.1,
)
# Create a ChatAgent
agent = ChatAgent(
    llm=llm,
    # Add tools as needed, e.g., tools=[DollyTool()]
    verbose=True,
)
# Interact with the agent
response = agent.chat("What is the capital of France?")
print(f"Agent response: {response}")
# Example with a follow-up question
response_follow_up = agent.chat("And what is its most famous landmark?")
print(f"Agent follow-up response: {response_follow_up}")
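For debugging, or in environments where the `ChatAgent` abstraction is unavailable, the same model serving endpoint can be queried directly over the Databricks Model Serving REST API (`POST /serving-endpoints/<name>/invocations` with an OpenAI-style `messages` payload). A stdlib-only sketch that builds such a request; the host, token, and endpoint name are placeholders matching the Quickstart's assumptions, and the request is constructed but not sent:

```python
import json
import urllib.request

def build_chat_request(host, token, endpoint, prompt, temperature=0.1):
    """Build an (unsent) urllib Request for a Databricks chat serving endpoint."""
    url = f"{host}/serving-endpoints/{endpoint}/invocations"
    payload = {
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request(
    "https://your-workspace.cloud.databricks.com",
    "dapi-example-token",
    "databricks-mixtral-8x7b-instruct",
    "What is the capital of France?",
)
print(req.full_url)
# Sending it (urllib.request.urlopen(req)) requires a live workspace, so it is
# not executed here.
```

This is also a useful cross-check: if a raw invocation succeeds but the agent fails, the problem is in the agent layer rather than the endpoint or credentials.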