LlamaIndex Groq LLM Integration
Version 0.5.0 · verified Fri May 01
LlamaIndex integration for Groq LLMs, providing high-speed inference via Groq's hardware-accelerated API. Version 0.5.0 requires Python 3.10+ and supports LlamaIndex v0.11+ with the new import structure. Released on PyPI; updates follow LlamaIndex releases.
pip install llama-index-llms-groq

Common errors
error ModuleNotFoundError: No module named 'llama_index'
cause LlamaIndex core is not installed; 'llama-index-llms-groq' depends only on 'llama-index-core', not the full 'llama-index' meta-package.
fix Run 'pip install llama-index-llms-groq' (installs 'llama-index-core' automatically), or install the full suite: 'pip install llama-index llama-index-llms-groq'.
error ImportError: cannot import name 'Groq' from 'llama_index.llms'
cause Old-style import path. In LlamaIndex v0.11+, LLM classes live in per-provider subpackages: 'llama_index.llms.<provider>'.
fix Use 'from llama_index.llms.groq import Groq'.
error AuthenticationError: API key not found. Pass your API key via the 'api_key' parameter or set the 'GROQ_API_KEY' environment variable.
cause No API key was provided: neither the 'api_key' parameter nor the 'GROQ_API_KEY' environment variable is set.
fix Set the 'GROQ_API_KEY' environment variable, or pass the 'api_key' parameter when initializing Groq.
Warnings
breaking LlamaIndex v0.11 introduced a new package structure: per-integration packages such as 'llama-index-llms-groq' replace the old monolithic 'llama-index' plus 'llama-index[groq]'. Code using 'from llama_index.llms import Groq' from the old package must be updated to 'from llama_index.llms.groq import Groq'.
fix Run 'pip install llama-index-llms-groq' and update all import paths to 'from llama_index.llms.groq import Groq'.
deprecated The 'llama-index' meta-package (version <0.11) included all integrations by default. As of v0.11, integrations are separate packages to reduce bloat; the meta-package no longer includes Groq automatically.
fix Install 'llama-index-llms-groq' explicitly
gotcha The API key must be available when the client is created: if 'GROQ_API_KEY' is not set in the process environment and no 'api_key' argument is given, calls fail with AuthenticationError. Passing it explicitly, e.g. 'llm = Groq(model=..., api_key=os.environ["GROQ_API_KEY"])', leaves no ambiguity about where the key comes from.
fix Always provide the 'api_key' parameter, or ensure 'GROQ_API_KEY' is set in the environment before the process starts.
gotcha Some models are token-limited (e.g., 'mixtral-8x7b-32768' has a 32k context). Exceeding the limit raises 'ValueError: Request too large for model...'. Check model limits before sending long documents.
fix Chunk the input so each request fits the model's context window, or pick a model whose window is large enough; note that 'llama3-70b-8192' and 'gemma2-9b-it' have only 8k contexts, smaller than 'mixtral-8x7b-32768' (32k).
Imports
- Groq
  wrong: from llama_index.llms.groq import Groq as GroqLLM
  correct: from llama_index.llms.groq import Groq
- Groq
  deprecated: from llama_index.llms import Groq
  correct: from llama_index.llms.groq import Groq
Quickstart
import os
from llama_index.llms.groq import Groq
api_key = os.environ.get("GROQ_API_KEY", "")
if not api_key:
    raise ValueError("Set GROQ_API_KEY environment variable")
llm = Groq(model="mixtral-8x7b-32768", api_key=api_key)
response = llm.complete("What is Groq?")
print(response)
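Besides 'complete()', LlamaIndex LLMs expose a streaming interface, 'stream_complete()', which yields response deltas as they arrive. A hedged sketch that skips gracefully when the package or key is absent (the skip guard is illustration added here, not part of the library):

```python
import os

try:
    from llama_index.llms.groq import Groq
except ImportError:  # package not installed in this environment
    Groq = None

if Groq is None or not os.environ.get("GROQ_API_KEY"):
    print("skipped: install llama-index-llms-groq and set GROQ_API_KEY")
else:
    llm = Groq(model="mixtral-8x7b-32768",
               api_key=os.environ["GROQ_API_KEY"])
    # stream_complete yields partial responses; .delta holds the new text
    for chunk in llm.stream_complete("What is Groq?"):
        print(chunk.delta, end="", flush=True)
    print()
```

Streaming is useful for long answers, since tokens reach the user as they are generated instead of after the full completion.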