banks
Banks (version 2.4.1) is a Python library that works as a prompt programming language for Large Language Models (LLMs). It simplifies creating, templating, versioning, and managing LLM prompts, offering an intuitive alternative to f-strings for complex prompt engineering. The project has an active release cadence, with features and fixes landing frequently.
Warnings
- gotcha Python 3.9 compatibility was temporarily dropped in v2.3.0 (it was removed from the test matrix) and restored in v2.4.1. Python 3.9 users should pin Banks v2.4.1 or newer to avoid compatibility issues.
- breaking In v1.8.0, `litellm` and `redis` transitioned from direct requirements to optional dependencies. If your application relies on features that use these libraries (e.g., LLM text generation within prompts or caching), you must install Banks with the `[all]` extra (e.g., `pip install "banks[all]"`).
- gotcha For asynchronous operations, the environment variable `BANKS_ASYNC_ENABLED` must be set to `true`, and you should use the `AsyncPrompt` class instead of `Prompt`. Failure to set the environment variable will result in synchronous behavior even with `AsyncPrompt`.
- breaking Version 2.0.0 introduced breaking dependency changes (changelog entry: `[breaking] fix deps again`). Users upgrading from versions prior to 2.0.0 may need to adjust their dependency tree.
Install
- pip install banks
- pip install "banks[all]"
Imports
- Prompt
from banks import Prompt
- DirectoryTemplateRegistry
from banks.registries import DirectoryTemplateRegistry
Quickstart
from banks import Prompt
# Define a prompt template using Banks' templating language
prompt_template = """
{% chat role="system" %}
You are a {{ persona }}.
{% endchat %}
{% chat role="user" %}
Hello {{ name }}!
{% endchat %}
"""
# Create a Prompt object
p = Prompt(prompt_template)
# Render the prompt with context to generate chat messages
chat_messages = p.chat_messages({"persona": "friendly assistant", "name": "Alice"})
# Print the generated chat messages (e.g., for use with an LLM client)
print(chat_messages)
# The rendered messages pair each role with its content:
#   system: "You are a friendly assistant."
#   user:   "Hello Alice!"