{"id":4605,"library":"langchain-postgres","title":"LangChain PostgreSQL Integration","description":"langchain-postgres is an integration package that connects LangChain abstractions with PostgreSQL, leveraging its robust features for vector stores, chat message history, and LangGraph checkpoint saving. It is currently at version 0.0.17 and receives regular updates. The package supports the asyncpg and psycopg3 drivers, enabling efficient and scalable interactions with PostgreSQL databases.","status":"active","version":"0.0.17","language":"en","source_language":"en","source_url":"https://github.com/langchain-ai/langchain-postgres","tags":["LangChain","PostgreSQL","Vector Store","Chat History","LangGraph","Database","AI","LLM"],"install":[{"cmd":"pip install -U langchain-postgres","lang":"bash","label":"Latest Stable Release"},{"cmd":"pip install -U 'langchain-postgres[psycopg]' # for psycopg3 driver","lang":"bash","label":"With Psycopg3 Driver"},{"cmd":"pip install -U 'langchain-postgres[asyncpg]' # for asyncpg driver","lang":"bash","label":"With Asyncpg Driver"}],"dependencies":[{"reason":"Core LangChain abstractions are utilized by this integration package.","package":"langchain-core","optional":false},{"reason":"Default PostgreSQL driver (Psycopg 3) for synchronous operations.","package":"psycopg","optional":true},{"reason":"Asynchronous PostgreSQL driver.","package":"asyncpg","optional":true},{"reason":"Used for database abstraction and connection management.","package":"sqlalchemy","optional":false},{"reason":"Python client for the pgvector PostgreSQL extension; required by PGVectorStore for vector similarity search.","package":"pgvector","optional":false},{"reason":"Numerical operations, primarily for embedding handling.","package":"numpy","optional":false}],"imports":[{"symbol":"PGEngine","correct":"from langchain_postgres import PGEngine"},{"symbol":"PGVectorStore","correct":"from langchain_postgres import PGVectorStore"},{"symbol":"PostgresChatMessageHistory","correct":"from langchain_postgres.chat_message_histories import PostgresChatMessageHistory"},{"note":"PostgresSaver for LangGraph checkpoints is located in `langgraph.checkpoint.postgres`, not directly in `langchain_postgres`.","wrong":"from langchain_postgres import PostgresSaver","symbol":"PostgresSaver","correct":"from langgraph.checkpoint.postgres import PostgresSaver"},{"note":"As of v0.0.14+, `PGVector` is deprecated. Migrate to `PGVectorStore` for improved performance and manageability.","wrong":"from langchain_postgres import PGVector","symbol":"PGVector","correct":"from langchain_postgres.vectorstores import PGVector"}],"quickstart":{"code":"import os\nfrom langchain_core.documents import Document\nfrom langchain_core.embeddings import DeterministicFakeEmbedding\nfrom langchain_postgres import PGEngine, PGVectorStore\n\n# Replace with your PostgreSQL connection string\nCONNECTION_STRING = os.environ.get('POSTGRES_CONNECTION_STRING', 'postgresql+psycopg://langchain:langchain@localhost:6024/langchain')\n\n# Initialize PGEngine\nengine = PGEngine.from_connection_string(url=CONNECTION_STRING)\n\n# Define vector size and embedding service\nVECTOR_SIZE = 768  # Adjust based on your embedding model\nembedding = DeterministicFakeEmbedding(size=VECTOR_SIZE)\nTABLE_NAME = \"my_doc_collection\"\n\n# Create the table backing the vector store\nengine.init_vectorstore_table(\n    table_name=TABLE_NAME,\n    vector_size=VECTOR_SIZE,\n)\n\n# Create a synchronous PGVectorStore instance\nstore = PGVectorStore.create_sync(\n    engine=engine,\n    table_name=TABLE_NAME,\n    embedding_service=embedding,\n)\n\n# Add documents\ndocs = [\n    Document(page_content=\"Apples and oranges\"),\n    Document(page_content=\"Cars and airplanes\"),\n    Document(page_content=\"Dogs and cats\"),\n]\nstore.add_documents(docs)\n\n# Perform a similarity search\nquery = \"fruits\"\nresults = store.similarity_search(query, k=1)\nprint(f\"Similarity search for '{query}': {results[0].page_content}\")\n\n# Example for Chat Message History\n# PostgresChatMessageHistory takes a raw psycopg connection, not the PGEngine\nimport uuid\n\nimport psycopg\nfrom langchain_postgres.chat_message_histories import PostgresChatMessageHistory\n\nsync_connection = psycopg.connect('postgresql://langchain:langchain@localhost:6024/langchain')\nCHAT_TABLE = \"chat_messages\"\n# Create the backing table once before first use\nPostgresChatMessageHistory.create_tables(sync_connection, CHAT_TABLE)\n\nsession_id = str(uuid.uuid4())\nchat_history = PostgresChatMessageHistory(CHAT_TABLE, session_id, sync_connection=sync_connection)\nchat_history.add_user_message(\"Hello LangChain Postgres!\")\nchat_history.add_ai_message(\"Hi there!\")\nprint(f\"Chat history for session {session_id}: {chat_history.messages}\")","lang":"python","description":"This quickstart demonstrates how to set up `PGEngine` for connecting to PostgreSQL, initialize a `PGVectorStore` for document embedding and similarity search, and use `PostgresChatMessageHistory` for persisting chat messages. It uses `DeterministicFakeEmbedding` for demonstration purposes; in a real application, replace it with an actual embedding model. Note that `PostgresChatMessageHistory` takes its table name and session ID as positional arguments and a raw `psycopg` connection via `sync_connection` (it does not use `PGEngine`), and its backing table must be created once with `PostgresChatMessageHistory.create_tables`. Remember to set your `POSTGRES_CONNECTION_STRING` environment variable."},"warnings":[{"fix":"Replace `PGVector` imports and instantiations with `PGVectorStore`. Refer to the official migration guide for detailed steps.","message":"In versions 0.0.14 and higher, the `PGVector` class has been deprecated. Users should migrate to `PGVectorStore` for improved performance and manageability.","severity":"breaking","affected_versions":">=0.0.14"},{"fix":"Ensure your connection string or connection object creation includes `autocommit=True` and `row_factory=dict_row` (e.g., `from psycopg.rows import dict_row`). For production, consider `ConnectionPool` from `psycopg_pool` to manage connections.","message":"When using `PostgresSaver` (from `langgraph.checkpoint.postgres`) with manually created PostgreSQL connections, it is crucial to include `autocommit=True` and `row_factory=dict_row` in the connection parameters. Failure to do so can lead to `TypeError` exceptions from rows being returned as tuples instead of dictionaries, and to tables created by `.setup()` not being persisted.","severity":"gotcha","affected_versions":"All"},{"fix":"Modify your PostgreSQL connection strings to use `postgresql+psycopg://` as the scheme instead of `postgresql+psycopg2://`.","message":"The connection string format has changed: `langchain-postgres` works with `psycopg3`, so update connection strings from `postgresql+psycopg2://...` to `postgresql+psycopg://...`.","severity":"breaking","affected_versions":"All"},{"fix":"Always call `checkpointer.setup()` on your `PostgresSaver` instance when using a database for the first time or when ensuring table existence.","message":"For `PostgresSaver`, the `.setup()` method must be called once to create the required tables; operations will fail if the tables have not been initialized.","severity":"gotcha","affected_versions":"All"},{"fix":"Tune connection pool settings (e.g., `ASYNCPG_POOL_MAX_SIZE`, `ASYNCPG_POOL_MIN_SIZE`), increase PostgreSQL `max_connections`, implement connection retry logic, and consider a connection pooler such as PgBouncer.","message":"LangChain applications, especially those using checkpoints in production deployments, can hit 'too many clients already' errors when the PostgreSQL connection limit is exceeded. This is often caused by a high number of runs, large checkpoint sizes, or unmanaged connection pools.","severity":"gotcha","affected_versions":"All"}],"env_vars":null,"last_verified":"2026-04-12T00:00:00.000Z","next_check":"2026-07-11T00:00:00.000Z"}