{"id":2881,"library":"azure-monitor-ingestion","title":"Azure Monitor Ingestion Client Library for Python","description":"The Azure Monitor Ingestion client library for Python is used to send custom logs to Azure Monitor using the Logs Ingestion API. This library allows you to send data from virtually any source to supported built-in tables or to custom tables that you create in a Log Analytics workspace, with support for extending their schemas. The current version is 1.1.0.","status":"active","version":"1.1.0","language":"en","source_language":"en","source_url":"https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/monitor/azure-monitor-ingestion","tags":["azure","monitor","logs","ingestion","observability","cloud","logging"],"install":[{"cmd":"pip install azure-monitor-ingestion azure-identity","lang":"bash","label":"Install with pip"}],"dependencies":[{"reason":"Required Python version.","package":"python","version":">=3.9","optional":false},{"reason":"Required for authentication using Microsoft Entra ID credentials such as DefaultAzureCredential.","package":"azure-identity","optional":false}],"imports":[{"symbol":"LogsIngestionClient","correct":"from azure.monitor.ingestion import LogsIngestionClient"},{"note":"Use the '.aio' submodule for asynchronous client operations. Requires an asynchronous HTTP framework such as aiohttp.","symbol":"LogsIngestionClient (async)","correct":"from azure.monitor.ingestion.aio import LogsIngestionClient"},{"symbol":"DefaultAzureCredential","correct":"from azure.identity import DefaultAzureCredential"}],"quickstart":{"code":"import os\nfrom datetime import datetime, timezone\nfrom azure.identity import DefaultAzureCredential\nfrom azure.monitor.ingestion import LogsIngestionClient\n\n# --- Prerequisites ---\n# 1. An Azure Log Analytics workspace.\n# 2. A Data Collection Endpoint (DCE) for your workspace.\n# 3. A Data Collection Rule (DCR) associated with your DCE and workspace.\n# 4. 
A stream defined within your DCR, often with a name like 'Custom-MyStreamName'.\n# 5. The service principal or managed identity used for authentication must have the\n#    'Monitoring Metrics Publisher' role on the Data Collection Rule resource.\n\n# Set environment variables (or replace directly in code for testing):\n# os.environ['DATA_COLLECTION_ENDPOINT'] = 'https://<your-dce-name>.<region>.ingest.monitor.azure.com'\n# os.environ['LOGS_DCR_RULE_ID'] = '<your-data-collection-rule-immutable-id>'\n# os.environ['LOGS_DCR_STREAM_NAME'] = 'Custom-YourCustomStreamName'\n\nendpoint = os.environ.get('DATA_COLLECTION_ENDPOINT', 'https://example.ingest.monitor.azure.com')\nrule_id = os.environ.get('LOGS_DCR_RULE_ID', 'dcr-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx')\nstream_name = os.environ.get('LOGS_DCR_STREAM_NAME', 'Custom-MyTableRawData')\n\n# For local development, DefaultAzureCredential will attempt to use environment variables\n# (AZURE_TENANT_ID, AZURE_CLIENT_ID, AZURE_CLIENT_SECRET), the Azure CLI, or a Visual Studio Code login.\n# For deployed applications, it can use Managed Identity.\ncredential = DefaultAzureCredential()\n\nclient = LogsIngestionClient(endpoint=endpoint, credential=credential)\n\n# Timestamps should be timezone-aware ISO 8601; datetime.utcnow() is deprecated since Python 3.12.\nlogs = [\n    {\n        \"TimeGenerated\": datetime.now(timezone.utc).isoformat(),\n        \"Computer\": \"Server1\",\n        \"CustomField_s\": \"Value1\",\n        \"EventId_d\": 1001\n    },\n    {\n        \"TimeGenerated\": datetime.now(timezone.utc).isoformat(),\n        \"Computer\": \"Server2\",\n        \"CustomField_s\": \"Value2\",\n        \"EventId_d\": 1002\n    }\n]\n\ntry:\n    client.upload(rule_id=rule_id, stream_name=stream_name, logs=logs)\n    print(\"Logs uploaded successfully.\")\nexcept Exception as e:\n    print(f\"An error occurred: {e}\")\n    # For detailed troubleshooting, enable logging:\n    # import logging, sys\n    # logging.getLogger('azure.monitor.ingestion').setLevel(logging.DEBUG)\n    # 
logging.getLogger('azure.monitor.ingestion').addHandler(logging.StreamHandler(stream=sys.stdout))\n    # client = LogsIngestionClient(endpoint=endpoint, credential=credential, logging_enable=True)\n","lang":"python","description":"This quickstart demonstrates how to ingest custom logs into Azure Monitor using the `LogsIngestionClient`. It requires a configured Data Collection Endpoint (DCE) and Data Collection Rule (DCR) in Azure. Authentication is handled via `DefaultAzureCredential` from `azure-identity`, which supports various authentication flows. The log data is sent as a list of dictionaries, where each dictionary represents a log entry and must conform to the schema defined in the DCR, including a `TimeGenerated` field."},"warnings":[{"fix":"Migrate to the Logs Ingestion API by using the `azure-monitor-ingestion` library. This requires setting up DCEs and DCRs in Azure and using Microsoft Entra ID (e.g., Managed Identity or Service Principal) for authentication instead of workspace keys.","message":"The legacy HTTP Data Collector API (used by older methods like sending directly to `/api/logs` with shared keys) is deprecated and scheduled for retirement by September 2026. The `azure-monitor-ingestion` library uses the newer Logs Ingestion API with Data Collection Endpoints (DCEs) and Data Collection Rules (DCRs).","severity":"breaking","affected_versions":"<1.1.0 (for those migrating from older custom ingestion methods)"},{"fix":"Ensure your Python environment and underlying OS are configured to use TLS 1.2 or higher for outbound connections. Most modern Python installations on supported operating systems should meet this requirement by default.","message":"The Logs Ingestion API will enforce TLS 1.2 or higher for all connections starting March 1, 2026. 
Older TLS versions will no longer be supported.","severity":"breaking","affected_versions":"All versions calling the API after March 1, 2026"},{"fix":"Before using the library, create a Log Analytics Workspace, a Data Collection Endpoint, and a Data Collection Rule in your Azure subscription. The DCR defines the schema of the logs you will send.","message":"Successful log ingestion requires a pre-configured Data Collection Endpoint (DCE) and Data Collection Rule (DCR) in Azure, along with a Log Analytics Workspace. These resources are not created by the Python library and must be set up via the Azure portal, CLI, or ARM templates.","severity":"gotcha","affected_versions":"All"},{"fix":"Assign the 'Monitoring Metrics Publisher' role to the identity used by `DefaultAzureCredential` on the specific Data Collection Rule resource in the Azure portal.","message":"The principal used for authentication (e.g., a Service Principal or Managed Identity) must have the 'Monitoring Metrics Publisher' role assigned on the Data Collection Rule resource to successfully upload logs.","severity":"gotcha","affected_versions":"All"},{"fix":"Review your DCR's stream schema and ensure that the dictionaries you pass to the `upload` method match its expected fields and types. Always include a timezone-aware timestamp such as `\"TimeGenerated\": datetime.now(timezone.utc).isoformat()`.","message":"Log entries must conform to the schema defined in your Data Collection Rule (DCR). Specifically, each log entry typically requires a `TimeGenerated` field in ISO 8601 format.","severity":"gotcha","affected_versions":"All"},{"fix":"Monitor your ingestion volume. If you anticipate exceeding the default limits, contact Microsoft support to request an increase for your Log Analytics workspace.","message":"While the Logs Ingestion API has different ingestion limits than legacy methods, Log Analytics workspaces still have overall ingestion volume rate limits. 
Exceeding these limits can lead to data being dropped.","severity":"gotcha","affected_versions":"All"}],"env_vars":null,"last_verified":"2026-04-11T00:00:00.000Z","next_check":"2026-07-10T00:00:00.000Z"}