Azure Monitor Ingestion Client Library for Python

The Azure Monitor Ingestion client library for Python is used to send custom logs to Azure Monitor using the Logs Ingestion API. This library allows you to send data from virtually any source to supported built-in tables or custom tables that you create in a Log Analytics workspace, with support for extending their schemas. The current version is 1.1.0.

Install

pip install azure-monitor-ingestion

Imports

from azure.identity import DefaultAzureCredential
from azure.monitor.ingestion import LogsIngestionClient

Quickstart

This quickstart demonstrates how to ingest custom logs into Azure Monitor using the `LogsIngestionClient`. It requires a configured Data Collection Endpoint (DCE) and Data Collection Rule (DCR) in Azure. Authentication is handled via `DefaultAzureCredential` from `azure-identity`, which supports various authentication flows. The log data is sent as a list of dictionaries, where each dictionary represents a log entry and must conform to the schema defined in the DCR, including a `TimeGenerated` field.
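As a minimal sketch of that schema requirement, a small helper can build conforming entries with a timezone-aware ISO 8601 timestamp. The helper and its column names are illustrative, not part of the library: the keys must match whatever columns your DCR stream actually declares.

```python
from datetime import datetime, timezone

def make_log_entry(computer: str, custom_field: str, event_id: int) -> dict:
    # Column names below are illustrative; they must match the stream
    # declaration in your Data Collection Rule (DCR).
    return {
        # timezone-aware UTC timestamp in ISO 8601 form, e.g. 2024-01-01T12:00:00+00:00
        "TimeGenerated": datetime.now(timezone.utc).isoformat(),
        "Computer": computer,
        "CustomField_s": custom_field,
        "EventId_d": event_id,
    }

entry = make_log_entry("Server1", "Value1", 1001)
```

Building entries through one helper keeps the timestamp format consistent across every record you upload.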

import os
from datetime import datetime, timezone

from azure.core.exceptions import HttpResponseError
from azure.identity import DefaultAzureCredential
from azure.monitor.ingestion import LogsIngestionClient

# --- Prerequisites ---
# 1. An Azure Log Analytics workspace.
# 2. A Data Collection Endpoint (DCE) for your workspace.
# 3. A Data Collection Rule (DCR) associated with your DCE and workspace.
# 4. A stream defined within your DCR, often with a name like 'Custom-MyStreamName'.
# 5. The service principal or managed identity used for authentication must have the
#    'Monitoring Metrics Publisher' role on the Data Collection Rule resource.

# Set environment variables (or replace directly in code for testing):
# os.environ['DATA_COLLECTION_ENDPOINT'] = 'https://<your-dce-name>.<region>.ingest.monitor.azure.com'
# os.environ['LOGS_DCR_RULE_ID'] = '<your-data-collection-rule-immutable-id>'
# os.environ['LOGS_DCR_STREAM_NAME'] = 'Custom-YourCustomStreamName'

endpoint = os.environ.get('DATA_COLLECTION_ENDPOINT', 'https://example.ingest.monitor.azure.com')
rule_id = os.environ.get('LOGS_DCR_RULE_ID', 'dcr-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx')
stream_name = os.environ.get('LOGS_DCR_STREAM_NAME', 'Custom-MyTableRawData')

# For local development, DefaultAzureCredential will attempt to use environment variables
# (AZURE_TENANT_ID, AZURE_CLIENT_ID, AZURE_CLIENT_SECRET), Azure CLI, or Visual Studio Code login.
# For deployed applications, it can use Managed Identity.
credential = DefaultAzureCredential()

client = LogsIngestionClient(endpoint=endpoint, credential=credential)

# Each entry must match the schema of the DCR stream. datetime.now(timezone.utc)
# is used instead of the deprecated datetime.utcnow() so that TimeGenerated is a
# timezone-aware ISO 8601 timestamp.
logs = [
    {
        "TimeGenerated": datetime.now(timezone.utc).isoformat(),
        "Computer": "Server1",
        "CustomField_s": "Value1",
        "EventId_d": 1001
    },
    {
        "TimeGenerated": datetime.now(timezone.utc).isoformat(),
        "Computer": "Server2",
        "CustomField_s": "Value2",
        "EventId_d": 1002
    }
]

try:
    client.upload(rule_id=rule_id, stream_name=stream_name, logs=logs)
    print("Logs uploaded successfully.")
except HttpResponseError as e:
    print(f"Upload failed: {e}")
    # For detailed troubleshooting, enable logging:
    # import logging, sys
    # logging.getLogger('azure.monitor.ingestion').setLevel(logging.DEBUG)
    # logging.getLogger('azure.monitor.ingestion').addHandler(logging.StreamHandler(stream=sys.stdout))
    # client = LogsIngestionClient(endpoint=endpoint, credential=credential, logging_enable=True)
