Tinker Python SDK

0.18.0 · active · verified Thu Apr 16

Tinker is the official Python SDK for the Tinker API, designed for fine-tuning large language models (LLMs). It abstracts away the complexities of distributed GPU training so that developers can focus on data and algorithms. The current version is 0.18.0, with ongoing development and documentation updates.

Install

pip install tinker

Imports

import tinker
from tinker import types

Quickstart

This quickstart demonstrates how to initialize the Tinker SDK, set up a service client, and create a LoRA training client. It also includes a basic asynchronous example of a training step, illustrating the typical pattern for interacting with the Tinker API: calls are queued on the server and return futures. A Tinker API key must be set in the TINKER_API_KEY environment variable.
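
Because the SDK reads the key from the environment, it helps to fail fast when it is missing. The helper below is illustrative only (`require_api_key` is not part of the Tinker API):

```python
import os

def require_api_key(name: str = "TINKER_API_KEY") -> str:
    """Return the named environment variable, raising a clear error if unset."""
    value = os.environ.get(name, "").strip()
    if not value:
        raise RuntimeError(f"{name} is not set; export it before creating a client.")
    return value
```

Calling this once at startup, before constructing any clients, turns a confusing downstream authentication failure into an immediate, actionable error.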

import os
import tinker
from tinker import types

# The Tinker API key must be set in the environment before creating clients.
assert os.environ.get("TINKER_API_KEY"), "Set TINKER_API_KEY before running."

# Initialize the Tinker ServiceClient
service_client = tinker.ServiceClient()

# Create a LoRA training client (example for fine-tuning)
training_client = service_client.create_lora_training_client(
    base_model="meta-llama/Llama-3.2-1B",
    rank=32,
)

# Example of an asynchronous training step. The data below is a placeholder;
# in practice, build Datum objects from your tokenized training examples.
async def run_optim_step():
    # A Datum pairs tokenized model input with the inputs the loss function needs.
    # Token IDs here are dummies; use your model's tokenizer in practice.
    dummy_model_input = types.ModelInput.from_ints(tokens=[1, 2, 3])
    # Placeholder loss inputs; the required keys depend on the loss function
    # (this is a simplified example; refer to the full docs for actual usage).
    dummy_loss_fn_inputs = {"labels": types.TensorData(data_float=[1.0, 2.0])}
    datum = types.Datum(model_input=dummy_model_input, loss_fn_inputs=dummy_loss_fn_inputs)

    # Queue the forward/backward pass, then apply an Adam optimizer step.
    # Each call returns a future that resolves when the server finishes the work.
    fwd_bwd_future = await training_client.forward_backward_async(
        [datum], loss_fn="cross_entropy"
    )
    optim_future = await training_client.optim_step_async(
        types.AdamParams(learning_rate=1e-5)
    )
    await fwd_bwd_future.result_async()
    await optim_future.result_async()
    print("Optim step complete.")

import asyncio
asyncio.run(run_optim_step())
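
The quickstart's submit-then-await pattern (queue an operation, receive a future, resolve it later) can be sketched with plain asyncio, independent of the Tinker SDK. All names below are illustrative stand-ins, not SDK classes:

```python
import asyncio

class FakeTrainingClient:
    """Illustrative stand-in for a client that queues work and hands back futures."""

    async def submit(self, op_name: str) -> asyncio.Future:
        # Queue the operation and immediately return a future for its result.
        future = asyncio.get_running_loop().create_future()
        asyncio.create_task(self._run(op_name, future))
        return future

    async def _run(self, op_name: str, future: asyncio.Future) -> None:
        await asyncio.sleep(0)  # simulate server-side processing
        future.set_result(f"{op_name}: done")

async def main():
    client = FakeTrainingClient()
    # Queue both operations before resolving either, as in the quickstart.
    fwd_bwd = await client.submit("forward_backward")
    optim = await client.submit("optim_step")
    return [await fwd_bwd, await optim]

results = asyncio.run(main())
print(results)
```

Queuing several operations before awaiting any of them lets the server pipeline the work, which is why the quickstart issues the optimizer step without first blocking on the forward/backward result.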
