{"id":6542,"library":"baml-py","title":"BAML Python Bindings","description":"BAML (Basically a Made-up Language) is a domain-specific language and toolchain designed to build reliable AI workflows and agents by transforming prompt engineering into schema engineering. It generates type-safe client code for Python (and other languages like TypeScript, Ruby, Go), enabling structured outputs from Large Language Models with built-in features like streaming, retries, and broad model support. The library is actively maintained with frequent releases, currently at version 0.220.0.","status":"active","version":"0.220.0","language":"en","source_language":"en","source_url":"https://github.com/BoundaryML/baml","tags":["AI","LLM","structured-output","prompt-engineering","DSL","type-safe","codegen"],"install":[{"cmd":"pip install baml-py","lang":"bash","label":"Install baml-py"},{"cmd":"baml-cli init","lang":"bash","label":"Initialize BAML project"},{"cmd":"baml-cli generate","lang":"bash","label":"Generate Python client code"}],"dependencies":[{"reason":"Recommended for managing API keys for LLMs.","package":"python-dotenv","optional":true}],"imports":[{"note":"BAML generates a `baml_client` directory; interactions happen via this generated client, not directly through the `baml-py` package itself. `b` is typically the synchronous client instance. For async, use `from baml_client.async_client import b`.","wrong":"import baml_py","symbol":"b","correct":"from baml_client.sync_client import b"},{"note":"Custom types (schemas) defined in `.baml` files are converted into Pydantic models and placed in the generated `baml_client.types` module.","wrong":"from baml_py.types import ...","symbol":"GeneratedTypes","correct":"from baml_client.types import YourSchemaName"},{"note":"Used for programmatic cancellation of BAML function calls.","symbol":"AbortController","correct":"from baml_py import AbortController"}],"quickstart":{"code":"import os\n\n# NOTE: This example assumes you have run 'baml-cli init' and 'baml-cli generate'.\n# A 'baml_src' directory with a BAML function (e.g., ExtractResume) and a 'baml_client'\n# directory with generated code must exist in your project.\n\n# Example BAML function definition (e.g., in baml_src/main.baml):\n# class Resume {\n#   name string\n#   title string\n# }\n# function ExtractResume(resume_text: string) -> Resume {\n#   client \"openai/gpt-4o\"\n#   prompt \"\"\"\n#     Parse the following resume and return structured data.\n#     {{ resume_text }}\n#     {{ ctx.output_format }}\n#   \"\"\"\n# }\n\n# Ensure your OpenAI API key is set as an environment variable\nos.environ['OPENAI_API_KEY'] = os.environ.get('OPENAI_API_KEY', 'sk-fake-key-for-test')\n\nfrom baml_client.sync_client import b\nfrom baml_client.types import Resume\n\ndef process_resume(resume_text: str) -> Resume:\n    print(f\"Processing resume: {resume_text[:50]}...\")\n    try:\n        # Call your BAML function, which is now a Python method on 'b'\n        response = b.ExtractResume(resume_text=resume_text)\n        print(\"Successfully extracted resume data:\")\n        print(f\"  Name: {response.name}\")\n        print(f\"  Title: {response.title}\")\n        return response\n    except Exception as e:\n        print(f\"An error occurred: {e}\")\n        # In a real application, you'd handle specific BAML errors like BamlValidationError\n        raise\n\nif __name__ == \"__main__\":\n    sample_resume = (\n        \"Name: Alice Wonderland\\n\"\n        \"Title: Software Engineer\\n\"\n        \"Experience: 5 years at ExampleCorp, developing scalable backend services.\\n\"\n        \"Education: M.S. Computer Science, University of XYZ\"\n    )\n    \n    # This call would only succeed if a baml_src/main.baml with ExtractResume is defined \n    # and baml-cli generate has been run.\n    # mock_resume_data = process_resume(sample_resume)\n    print(\"To run this, ensure baml-cli init and baml-cli generate have been executed.\")\n    print(\"And OPENAI_API_KEY is set in your environment.\")\n\n","lang":"python","description":"This quickstart demonstrates how to use `baml-py` after initializing a BAML project and generating client code. First, install `baml-py` and use `baml-cli init` to create a `baml_src` directory with example BAML functions (e.g., `ExtractResume` defined in a `.baml` file). Then, run `baml-cli generate` to create the `baml_client` Python module. The generated client (`b`) allows you to call your BAML-defined functions directly from Python with type safety. Ensure necessary LLM API keys (e.g., `OPENAI_API_KEY`) are set in your environment."},"warnings":[{"fix":"After upgrading `baml-py`, update the `version` field in `generators.baml` and ensure your `baml-cli` and VSCode extension are also updated to match. Rerun `baml-cli generate`.","message":"Maintain consistent versions across BAML components. The `baml-py` package, `baml-cli` (installed globally or via `uv`/`poetry`), the VSCode BAML extension, and the `generator` version specified in your `generators.baml` file must all be in sync. Mismatched versions can lead to code generation errors or runtime issues.","severity":"breaking","affected_versions":"All versions"},{"fix":"Define or modify your LLM functions, schemas, and clients only within the `.baml` files in your `baml_src` directory. The `baml_client` will be updated automatically.","message":"The `baml_client` directory and its contents are entirely auto-generated. Do not manually edit files within `baml_client`, as changes will be overwritten the next time `baml-cli generate` (or the VSCode extension on save) runs.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Ensure all required environment variables are set in your execution environment or loaded via `python-dotenv`. For local development, check the playground settings in the VSCode extension.","message":"API keys for LLM providers (e.g., OpenAI, Anthropic, Google) are typically configured in `baml_src/clients.baml` and usually rely on environment variables (e.g., `env.OPENAI_API_KEY`). If these variables are not correctly set, BAML client calls will fail.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Use `%load_ext autoreload` and `%autoreload 2` at the start of your notebook. Then, import the `baml_client` module directly (e.g., `import app.baml_client as client`) and call functions with the module prefix (e.g., `client.b.ExtractResume(...)`).","message":"When using BAML in Jupyter Notebooks, the standard `from baml_client.sync_client import b` import pattern does not work well with the `%autoreload` extension. Changes in `.baml` files might not be reflected without a kernel restart.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Always include `{{ ctx.output_format }}` at the end of your BAML function's prompt string.","message":"All BAML function prompts intended for structured output **must** include `{{ ctx.output_format }}` in the prompt template. Omitting this Jinja expression means the model never sees the expected output schema, leading to parsing failures or unexpected plain-text output.","severity":"gotcha","affected_versions":"All versions"}],"env_vars":null,"last_verified":"2026-04-15T00:00:00.000Z","next_check":"2026-07-14T00:00:00.000Z","problems":[]}