LlamaIndex IBM Watsonx LLM
0.7.0.post1 · verified Sat May 09
LlamaIndex integration for IBM watsonx.ai foundation models. The current version, 0.7.0.post1, requires Python >=3.11,<3.14. Lightly maintained, with occasional releases following LlamaIndex updates.
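The Python constraint above can be checked at runtime before installing; `supported_python` is an illustrative helper, not part of the package.

```python
import sys

def supported_python(version_info=sys.version_info) -> bool:
    """Return True if the interpreter satisfies >=3.11,<3.14,
    the constraint stated for 0.7.0.post1."""
    return (3, 11) <= tuple(version_info[:2]) < (3, 14)
```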
Install

pip install llama-index-llms-ibm

Common errors

error ModuleNotFoundError: No module named 'llama_index.llms.ibm'
cause Outdated package or wrong import path for v0.7+. The correct subpackage is not installed.
fix Install the package with pip install llama-index-llms-ibm and import with from llama_index.llms.ibm import WatsonxLLM.

error ValueError: The api_key client must be specified, or set the WATSONX_APIKEY environment variable.
cause Missing or empty API key.
fix Set the WATSONX_APIKEY environment variable or pass the api_key parameter.

error ClientError: Error: ibm_watsonx_ai...Project ID is required for this operation.
cause Missing project ID required for watsonx.ai.
fix Set the WATSONX_PROJECT_ID environment variable or pass the project_id parameter.

Warnings
breaking In v0.7.0, the import path changed: `from llama_index.llms import IBM` no longer works. Import from the `llama_index.llms.ibm` subpackage instead, preferably as `from llama_index.llms.ibm import WatsonxLLM`.
fix Update imports to use the correct subpackage path and the WatsonxLLM class.
breaking The `model_id` parameter is now required; the previous default has been removed, so code that omitted the model will fail.
fix Always specify `model_id` when creating a WatsonxLLM instance.
gotcha IBM watsonx.ai credentials must be passed via environment variables (WATSONX_APIKEY, WATSONX_PROJECT_ID) or constructor arguments. Passing only the API key without a project ID raises a missing-credentials error.
fix Ensure both WATSONX_APIKEY and WATSONX_PROJECT_ID are set.
deprecated The `IBM` class alias is deprecated in v0.7.0 and will be removed in a future release. Use `WatsonxLLM` instead.
fix Replace `IBM` with `WatsonxLLM`.
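The credentials gotcha above can be caught before any client is built; `check_watsonx_env` is a hypothetical helper (the library itself only raises later, at construction or call time).

```python
import os

def check_watsonx_env() -> dict:
    """Fail fast with a clear message if either watsonx.ai
    credential variable is missing or empty."""
    creds = {}
    for var in ("WATSONX_APIKEY", "WATSONX_PROJECT_ID"):
        value = os.environ.get(var, "").strip()
        if not value:
            raise EnvironmentError(
                f"{var} is not set; both WATSONX_APIKEY and "
                "WATSONX_PROJECT_ID are required for watsonx.ai."
            )
        creds[var] = value
    return creds
```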
Imports

- IBM
  wrong: from llama_index.llms import IBM
  correct: from llama_index.llms.ibm import IBM
- IBM Watsonx
  wrong: from llama_index.llms.ibm import Watsonx
  correct: from llama_index.llms.ibm import WatsonxLLM
Quickstart

import os

from llama_index.llms.ibm import WatsonxLLM

# Both credentials are required (see Warnings above).
api_key = os.environ.get('WATSONX_APIKEY', '')
project_id = os.environ.get('WATSONX_PROJECT_ID', '')

llm = WatsonxLLM(
    model_id='meta-llama/llama-3-70b-instruct',  # required in v0.7.0+
    url='https://us-south.ml.cloud.ibm.com',  # watsonx.ai endpoint for your region
    api_key=api_key,
    project_id=project_id,
)

response = llm.complete('What is the capital of France?')
print(response.text)