{
  "id": 1981,
  "library": "databricks-agents",
  "title": "Databricks Agents SDK",
  "description": "The Databricks Agents SDK provides tools and interfaces for building AI agents on the Databricks platform. It supports the creation, deployment, and management of agents that can use tools and LLMs, and it integrates with MLflow for logging and tracing. The current version is 1.9.4, with frequent updates that track the evolving AI and MLflow ecosystems.",
  "status": "active",
  "version": "1.9.4",
  "language": "en",
  "source_language": "en",
  "source_url": "https://github.com/databricks/databricks-agents",
  "tags": ["databricks", "agents", "llm", "ai", "mlflow", "langchain"],
  "install": [
    {"cmd": "pip install databricks-agents", "lang": "bash", "label": "Install core library"}
  ],
  "dependencies": [
    {"reason": "Required for using OpenAI LLMs with Databricks Agents.", "package": "openai", "optional": true},
    {"reason": "Required for using Anthropic LLMs (e.g., Claude) with Databricks Agents.", "package": "anthropic", "optional": true},
    {"reason": "Required for using Google Generative AI LLMs with Databricks Agents.", "package": "google-generativeai", "optional": true},
    {"reason": "Required for using LlamaIndex integrations.", "package": "llama-index", "optional": true},
    {"reason": "Required for using Hugging Face models.", "package": "huggingface-hub", "optional": true}
  ],
  "imports": [
    {"symbol": "ChatAgent", "correct": "from databricks.agents.chat.agent import ChatAgent"},
    {"symbol": "DatabricksLLM", "correct": "from databricks.agents.providers.databricks import DatabricksLLM"},
    {"symbol": "DollyTool", "correct": "from databricks.agents.tools.dolly.tool import DollyTool"}
  ],
  "quickstart": {
    "code": "import os\nfrom databricks.agents.chat.agent import ChatAgent\nfrom databricks.agents.providers.databricks import DatabricksLLM\n\n# Ensure these environment variables are set for Databricks connectivity:\n# export DATABRICKS_HOST=\"https://<your-workspace-url>\"\n# export DATABRICKS_TOKEN=\"dapi<your-token>\"\n# export DATABRICKS_LLM_ENDPOINT=\"<your-model-serving-endpoint>\" (e.g., 'databricks-mixtral-8x7b-instruct')\n\ndatabricks_host = os.environ.get('DATABRICKS_HOST', 'https://your-workspace.cloud.databricks.com')\ndatabricks_token = os.environ.get('DATABRICKS_TOKEN', 'YOUR_DATABRICKS_TOKEN')\nllm_endpoint = os.environ.get('DATABRICKS_LLM_ENDPOINT', 'databricks-mixtral-8x7b-instruct')\n\n# Initialize a Databricks-backed LLM provider\nllm = DatabricksLLM(\n    endpoint=llm_endpoint,\n    host=databricks_host,\n    api_token=databricks_token,\n    temperature=0.1\n)\n\n# Create a ChatAgent\nagent = ChatAgent(\n    llm=llm,\n    # Add tools as needed, e.g., tools=[DollyTool()]\n    verbose=True\n)\n\n# Interact with the agent\nresponse = agent.chat(\"What is the capital of France?\")\nprint(f\"Agent response: {response}\")\n\n# Example with a follow-up question\nresponse_follow_up = agent.chat(\"And what is its most famous landmark?\")\nprint(f\"Agent follow-up response: {response_follow_up}\")",
    "lang": "python",
    "description": "This quickstart demonstrates how to set up a `ChatAgent` using a Databricks-served LLM. It requires a Databricks host, token, and an LLM serving endpoint, typically configured via environment variables. The agent is initialized with the LLM and can then process chat prompts. Ensure your Databricks workspace has a suitable model serving endpoint deployed for the `DatabricksLLM`."
  },
  "warnings": [
    {"fix": "Ensure you have access to a Databricks workspace and an appropriate Databricks Model Serving endpoint (e.g., for Mixtral or Llama 2), and configure the `DATABRICKS_HOST`, `DATABRICKS_TOKEN`, and `DATABRICKS_LLM_ENDPOINT` environment variables or pass these values directly.", "message": "The `databricks-agents` library requires a Databricks workspace and a deployed LLM serving endpoint to function correctly, even for basic usage. It is not designed for standalone, local-only operation without a Databricks backend.", "severity": "gotcha", "affected_versions": "All versions"},
    {"fix": "Regularly consult the `databricks-agents` and LangChain documentation, and pin your `langchain`-related dependencies to specific minor versions to avoid unexpected changes. Test your agent implementations thoroughly after any dependency updates.", "message": "The library integrates heavily with the LangChain ecosystem. Breaking changes or significant API shifts in underlying LangChain components (e.g., `langchain-core`, `langchain-community`) can indirectly change usage patterns or require code adjustments in `databricks-agents` applications.", "severity": "gotcha", "affected_versions": "All versions, particularly with new LangChain releases"},
    {"fix": "Familiarize yourself with MLflow's LLM logging capabilities. Ensure MLflow is configured in your environment, or explicitly enabled in your agent's context, to get automatic logging of prompts, responses, and tool calls.", "message": "For logging, tracing, and experiment tracking, `databricks-agents` is designed to work with MLflow. While MLflow is not strictly mandatory, users who skip the integration lose valuable debugging and performance-monitoring capabilities.", "severity": "gotcha", "affected_versions": "All versions"}
  ],
  "env_vars": null,
  "last_verified": "2026-04-09T00:00:00.000Z",
  "next_check": "2026-07-08T00:00:00.000Z"
}