psycopg-binary
psycopg-binary is the pre-compiled binary distribution of Psycopg 3, a modern PostgreSQL database adapter for Python. It ships C-accelerated code and bundled libraries, so no local build toolchain or libpq headers are required. Although `psycopg-binary` is a separate package on PyPI, it is an optional accelerator for the core `psycopg` library (Psycopg 3), not a standalone driver. It is actively maintained (version 3.3.3 at the time of writing) and follows a frequent release cadence, often tracking new Python and PostgreSQL releases.
Warnings
- breaking The `psycopg-binary` package provides C optimizations for Psycopg 3, but the primary library to import is `psycopg`. Do not `import psycopg_binary` directly. If migrating from `psycopg2-binary`, the installation command is `pip install "psycopg[binary]"` (or `"psycopg[binary,pool]"`), and the import path changes from `import psycopg2` to `import psycopg`.
- gotcha The `psycopg-binary` package bundles its own copies of C libraries such as `libpq` and `libssl`. These can conflict with other Python modules or system libraries (potentially causing segfaults, especially with Python's `ssl` module under concurrency), and system library upgrades will not update the bundled copies. For production environments, building from source (`pip install "psycopg[c]"`) is often advised so the driver links against system libraries.
- breaking Psycopg 3 introduces several API changes compared to Psycopg 2. Key differences include server-side parameter binding (which can break queries like `SET TimeZone`), a redesigned connection pool, and different handling of date/time infinities. Refer to the official 'Differences from psycopg2' documentation for a complete list of changes.
- gotcha Psycopg 3.3.x officially supports Python 3.10 through 3.14. Older Python versions (e.g., 3.7–3.9) are only supported by earlier Psycopg 3.x releases. Ensure your Python environment meets the requirements of the `psycopg-binary` version you are installing.
- gotcha Older versions of `psycopg[binary]` (prior to 3.1.11) had reports of high memory consumption, particularly for large `SELECT` queries, due to query caching. This issue was addressed in `psycopg` version 3.1.11.
Install
pip install "psycopg[binary,pool]"
Imports
- connect
from psycopg import connect
Quickstart
import os
import psycopg

# Retrieve connection string from environment or use a default
conn_str = os.environ.get(
    "PSYCOPG_CONN_STRING",
    "dbname=test user=postgres password=mysecretpassword host=localhost port=5432",
)

try:
    # Establish a connection using a context manager for automatic closing
    with psycopg.connect(conn_str) as conn:
        # Open a cursor to perform database operations
        with conn.cursor() as cur:
            # Example: create a table from a clean slate
            cur.execute("DROP TABLE IF EXISTS test_table")
            cur.execute(
                "CREATE TABLE test_table (id SERIAL PRIMARY KEY, name VARCHAR(100))"
            )

            # Example: insert data with bound parameters
            cur.execute("INSERT INTO test_table (name) VALUES (%s)", ("Alice",))
            cur.execute("INSERT INTO test_table (name) VALUES (%s)", ("Bob",))

            # Commit the transaction (autocommit is off by default; note the
            # connection context manager also commits on a clean exit)
            conn.commit()

            # Example: query data
            cur.execute("SELECT id, name FROM test_table ORDER BY id")
            print("--- Data from test_table ---")
            for record in cur:
                print(f"ID: {record[0]}, Name: {record[1]}")
except psycopg.Error as e:
    print(f"Database error: {e}")
    # In a production application, handle this more robustly,
    # e.g., logging the error and retrying or exiting cleanly.