Aleph Alpha Python Client


The `aleph-alpha-client` is the official Python client for interacting with Aleph Alpha's API endpoints. It provides synchronous and asynchronous interfaces to access various AI capabilities, including completion, embedding, evaluation, and tool calling with large language and multimodal models. The library is actively maintained, with frequent releases, and is currently at version 11.5.1.

Install

Install the client from PyPI:

pip install aleph-alpha-client

Imports

The quickstart below uses the following imports:

from aleph_alpha_client import Client, CompletionRequest, Prompt

Quickstart

This quickstart demonstrates how to initialize the synchronous client, construct a basic text completion request, and print the model's response. It expects the Aleph Alpha API token to be set as an environment variable `ALEPH_ALPHA_API_TOKEN`.

import os
from aleph_alpha_client import Client, CompletionRequest, Prompt

# Ensure ALEPH_ALPHA_API_TOKEN is set in your environment variables
api_token = os.environ.get("ALEPH_ALPHA_API_TOKEN")
if not api_token:
    raise ValueError("ALEPH_ALPHA_API_TOKEN environment variable is not set. Please set it to your API token.")

# Instantiate the synchronous client
client = Client(token=api_token)

# Define the prompt and completion request
request = CompletionRequest(
    prompt=Prompt.from_text("Provide a short description of AI:"),
    maximum_tokens=64,
)

try:
    # Send the completion request; the model is selected per call,
    # e.g. "luminous-base" or "pharia-1-llm-7b-control"
    response = client.complete(request, model="luminous-base")
    print(response.completions[0].completion)
except Exception as e:
    print(f"An error occurred: {e}")
