Graphiti Core
Graphiti Core is a Python library for building and querying temporal context graphs, designed primarily as memory for AI agents. It supports real-time, incremental updates to knowledge graphs, tracking how facts and relationships evolve over time without full recomputation, and offers hybrid search (semantic, keyword, and graph traversal) across multiple graph databases and LLM providers. The current version is 0.28.2; releases are frequent and address new features, performance, and security.
Warnings
- breaking Graphiti-core versions prior to 0.28.2 contain a Cypher injection vulnerability. It is critical to update to 0.28.2 or later to mitigate this security risk.
- breaking Version 0.28.1 replaced the `diskcache` dependency with a `sqlite-based cache` to resolve a CVE. If your application directly interacted with `diskcache` or relied on specific `diskcache` configurations, this change might require adjustments.
- gotcha Graphiti defaults to using OpenAI for LLM inference and embedding. An `OPENAI_API_KEY` environment variable must be set for the library to function correctly unless an alternative LLM provider is explicitly configured and installed (e.g., `graphiti-core[anthropic]`).
- gotcha Version 0.28.0 introduced a 'driver operations architecture redesign'. Users with custom database drivers or those who extensively customized database interactions might need to review and update their implementations to align with the new architecture.
- gotcha Graphiti's ingestion pipeline is designed for high concurrency, but concurrency defaults to a low value to avoid LLM provider 429 (rate limit) errors. If ingestion throughput is critical, raise it via the `SEMAPHORE_LIMIT` environment variable.
- gotcha Graphiti requires an external graph database (e.g., Neo4j, FalkorDB, Kuzu, Amazon Neptune) to operate. Ensure a compatible database is running and accessible with the correct connection parameters.
- gotcha Due to `graphiti-core`'s asynchronous nature and persistent connections, integrating it directly into applications with incompatible concurrency models (e.g., certain web frameworks) can lead to 'Event loop is closed' errors. For production, isolating `graphiti-core` in its own process, such as via a dedicated API service, is recommended for stability.
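Several of the defaults above are controlled through environment variables read before the first `Graphiti` call. A minimal sketch; the key value and the limit of 20 are illustrative placeholders, not recommended settings:

```python
import os

# Graphiti defaults to OpenAI for LLM inference and embeddings; it reads
# OPENAI_API_KEY from the environment. (Placeholder value for illustration.)
os.environ.setdefault('OPENAI_API_KEY', 'sk-your-key-here')

# Raise ingestion concurrency once you know your provider's rate limits.
# Graphiti reads SEMAPHORE_LIMIT from the environment; 20 is an example value.
os.environ['SEMAPHORE_LIMIT'] = '20'

print(os.environ['SEMAPHORE_LIMIT'])
```

Set these before constructing `Graphiti`, since the values are read at client initialization.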
Install
- pip install graphiti-core
- pip install graphiti-core[falkordb]
- pip install graphiti-core[kuzu]
- pip install graphiti-core[neptune]
- pip install graphiti-core[anthropic,groq,google-genai]
Imports
- Graphiti
from graphiti_core import Graphiti
- EpisodeType
from graphiti_core.nodes import EpisodeType
Quickstart
import asyncio
import os
from datetime import datetime, timezone

from graphiti_core import Graphiti
from graphiti_core.driver.falkordb_driver import FalkorDriver
from graphiti_core.nodes import EpisodeType

# Ensure OPENAI_API_KEY is set and FalkorDB is running:
# Example: export OPENAI_API_KEY="sk-..."
# Example: docker run -p 6379:6379 -p 3000:3000 -it --rm falkordb/falkordb:latest

async def main():
    # Initialize Graphiti with the FalkorDB driver (defaults to localhost)
    graphiti = Graphiti(
        graph_driver=FalkorDriver(
            host=os.environ.get('FALKORDB_HOST', 'localhost'),
            port=int(os.environ.get('FALKORDB_PORT', '6379')),
        )
    )
    try:
        # Build indices and constraints (run once during setup)
        print("Building indices and constraints...")
        await graphiti.build_indices_and_constraints()
        print("Indices built.")

        # Add an episode (information to be stored in the graph)
        episode_body = (
            "Alice met Bob at the AI conference in San Francisco on March 15, 2024. "
            "They discussed the latest developments in graph databases and decided "
            "to collaborate on a new project using Graphiti and FalkorDB."
        )
        print("Adding episode...")
        await graphiti.add_episode(
            name="Conference Meeting",
            episode_body=episode_body,
            source=EpisodeType.text,
            source_description="Conference notes",
            reference_time=datetime(2024, 3, 15, tzinfo=timezone.utc),
        )
        print("Episode added.")

        # Hybrid search over the knowledge graph; results are edges carrying a fact
        print("Searching the graph...")
        search_results = await graphiti.search(
            query="What did Alice and Bob discuss?",
            num_results=3,
        )
        print("\nSearch Results:")
        if search_results:
            for i, result in enumerate(search_results, start=1):
                print(f"- Result {i}: {result.fact}")
        else:
            print("No search results found.")
    finally:
        # Always close the database connection
        await graphiti.close()
        print("Connection closed.")

if __name__ == "__main__":
    if not os.environ.get('OPENAI_API_KEY'):
        raise RuntimeError(
            "OPENAI_API_KEY is not set; Graphiti defaults to OpenAI for LLM "
            "inference and embeddings, so this quickstart cannot run without it."
        )
    asyncio.run(main())
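Beyond free text, `EpisodeType.json` lets you ingest structured data as a serialized JSON string. The sketch below only builds the payload (the field names are invented example data); the commented call shows how it would be passed to `add_episode`, mirroring the text episode above:

```python
import json

# A structured episode body; the keys here are arbitrary example data.
meeting = {
    "person_a": "Alice",
    "person_b": "Bob",
    "event": "AI conference",
    "city": "San Francisco",
    "date": "2024-03-15",
}
episode_body = json.dumps(meeting)

# Inside an async context with a connected Graphiti instance, this would be
# ingested much like the text episode (call shown for illustration only):
# await graphiti.add_episode(
#     name="Conference Meeting (structured)",
#     episode_body=episode_body,
#     source=EpisodeType.json,
#     source_description="structured notes",
#     reference_time=datetime(2024, 3, 15, tzinfo=timezone.utc),
# )

print(episode_body)
```

Structured episodes are useful when the source data is already normalized (CRM records, logs), since the extraction step can map fields directly onto entities and relationships.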