Azure Monitor Ingestion Client Library for Python
The Azure Monitor Ingestion client library for Python is used to send custom logs to Azure Monitor using the Logs Ingestion API. This library allows you to send data from virtually any source to supported built-in tables or custom tables that you create in a Log Analytics workspace, with support for extending their schemas. The current version is 1.1.0.
Warnings
- breaking The legacy HTTP Data Collector API (used by older methods like sending directly to `/api/logs` with shared keys) is deprecated and scheduled for retirement by September 2026. The `azure-monitor-ingestion` library uses the newer Logs Ingestion API with Data Collection Endpoints (DCEs) and Data Collection Rules (DCRs).
- breaking The Logs Ingestion API will enforce TLS 1.2 or higher for all connections starting March 1, 2026. Older TLS versions will no longer be supported.
- gotcha Successful log ingestion requires a pre-configured Data Collection Endpoint (DCE) and Data Collection Rule (DCR) in Azure, along with a Log Analytics Workspace. These resources are not created by the Python library and must be set up via the Azure portal, CLI, or ARM templates.
- gotcha The principal used for authentication (e.g., a Service Principal or Managed Identity) must have the 'Monitoring Metrics Publisher' role assigned on the Data Collection Rule resource to successfully upload logs.
- gotcha Log entries must conform to the schema defined in your Data Collection Rule (DCR). Specifically, each log entry typically requires a `TimeGenerated` field in ISO 8601 format.
- gotcha While the Logs Ingestion API has different ingestion limits than legacy methods, Log Analytics workspaces still have overall ingestion volume rate limits. Exceeding these limits can lead to data being dropped.
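The `TimeGenerated` gotcha above can be sketched with a small helper that builds a schema-conformant entry using timezone-aware ISO 8601 timestamps (avoid `datetime.utcnow()`, which is deprecated and produces naive timestamps with no UTC offset). The field names here (`Computer`, `EventId_d`, `CustomField_s`) are illustrative; they must match the stream schema defined in your own DCR.

```python
from datetime import datetime, timezone

def make_log_entry(computer: str, event_id: int, message: str) -> dict:
    # Hypothetical helper: builds one entry for an assumed DCR stream schema.
    # Replace the field names with whatever YOUR DCR stream actually declares.
    return {
        # ISO 8601 with an explicit UTC offset, as the Logs Ingestion API expects.
        "TimeGenerated": datetime.now(timezone.utc).isoformat(),
        "Computer": computer,
        "EventId_d": event_id,
        "CustomField_s": message,
    }

entry = make_log_entry("Server1", 1001, "Value1")
print(entry["TimeGenerated"].endswith("+00:00"))  # True: timezone-aware timestamp
```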
Install
- pip
pip install azure-monitor-ingestion azure-identity
Imports
- LogsIngestionClient
from azure.monitor.ingestion import LogsIngestionClient
- LogsIngestionClient (async)
from azure.monitor.ingestion.aio import LogsIngestionClient
- DefaultAzureCredential
from azure.identity import DefaultAzureCredential
Quickstart
import os
from datetime import datetime, timezone
from azure.core.exceptions import HttpResponseError
from azure.identity import DefaultAzureCredential
from azure.monitor.ingestion import LogsIngestionClient
# --- Prerequisites ---
# 1. An Azure Log Analytics workspace.
# 2. A Data Collection Endpoint (DCE) for your workspace.
# 3. A Data Collection Rule (DCR) associated with your DCE and workspace.
# 4. A stream defined within your DCR, often with a name like 'Custom-MyStreamName'.
# 5. The service principal or managed identity used for authentication must have the
#    'Monitoring Metrics Publisher' role on the Data Collection Rule resource.
# Set environment variables (or replace directly in code for testing):
# os.environ['DATA_COLLECTION_ENDPOINT'] = 'https://<your-dce-name>.<region>.ingest.monitor.azure.com'
# os.environ['LOGS_DCR_RULE_ID'] = '<your-data-collection-rule-immutable-id>'
# os.environ['LOGS_DCR_STREAM_NAME'] = 'Custom-YourCustomStreamName'
endpoint = os.environ.get('DATA_COLLECTION_ENDPOINT', 'https://example.ingest.monitor.azure.com')
rule_id = os.environ.get('LOGS_DCR_RULE_ID', 'dcr-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx')
stream_name = os.environ.get('LOGS_DCR_STREAM_NAME', 'Custom-MyTableRawData')
# For local development, DefaultAzureCredential will attempt to use environment variables
# (AZURE_TENANT_ID, AZURE_CLIENT_ID, AZURE_CLIENT_SECRET), Azure CLI, or Visual Studio Code login.
# For deployed applications, it can use Managed Identity.
credential = DefaultAzureCredential()
client = LogsIngestionClient(endpoint=endpoint, credential=credential)
# Use timezone-aware timestamps: datetime.utcnow() is deprecated and yields naive
# timestamps without the UTC offset that ISO 8601 parsing expects.
logs = [
    {
        "TimeGenerated": datetime.now(timezone.utc).isoformat(),
        "Computer": "Server1",
        "CustomField_s": "Value1",
        "EventId_d": 1001
    },
    {
        "TimeGenerated": datetime.now(timezone.utc).isoformat(),
        "Computer": "Server2",
        "CustomField_s": "Value2",
        "EventId_d": 1002
    }
]
try:
    client.upload(rule_id=rule_id, stream_name=stream_name, logs=logs)
    print("Logs uploaded successfully.")
except HttpResponseError as e:
    print(f"Upload failed: {e}")
# For detailed troubleshooting, enable logging:
# import logging, sys
# logging.getLogger('azure.monitor.ingestion').setLevel(logging.DEBUG)
# logging.getLogger('azure.monitor.ingestion').addHandler(logging.StreamHandler(stream=sys.stdout))
# client = LogsIngestionClient(endpoint=endpoint, credential=credential, logging_enable=True)
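For partial failures, `upload` accepts an `on_error` callback; the error object passed to it exposes the underlying exception (`error`) and the entries that could not be uploaded (`failed_logs`). A minimal sketch, in which the callback name, the `_FakeUploadError` stand-in, and the dead-letter list are illustrative:

```python
failed_entries = []

def on_upload_error(upload_error):
    # Collect failed entries for retry or dead-lettering instead of losing them.
    failed_entries.extend(upload_error.failed_logs)
    print(f"{len(upload_error.failed_logs)} entries failed: {upload_error.error}")

# With a real client (requires the Azure resources from the Quickstart):
# client.upload(rule_id=rule_id, stream_name=stream_name, logs=logs, on_error=on_upload_error)

# Stand-in object so the callback can be exercised without Azure access:
class _FakeUploadError:
    error = RuntimeError("ingestion throttled")
    failed_logs = [{"Computer": "Server1", "EventId_d": 1001}]

on_upload_error(_FakeUploadError())
print(len(failed_entries))  # 1
```

Without `on_error`, the first failed chunk raises and any remaining chunks are not attempted, so a callback is the usual choice for best-effort bulk ingestion.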