Openlayer integrates with Azure Content Understanding, Microsoft’s service for extracting structured data and insights from documents, images, audio, and video using LLMs.

Monitoring Azure Content Understanding

To use monitoring mode, instrument your code to publish the analysis requests your AI system makes to the Openlayer platform. Each begin_analyze call (and its corresponding poller.result()) is automatically traced and published with inputs, outputs, latency, token usage, and the underlying model used.

Setup

Instrument your client:
Python
import os
from azure.ai.contentunderstanding import ContentUnderstandingClient
from azure.ai.contentunderstanding.models import AnalysisInput
from azure.core.credentials import AzureKeyCredential

# 1. Set the environment variables
os.environ["OPENLAYER_API_KEY"] = "YOUR_OPENLAYER_API_KEY_HERE"
os.environ["OPENLAYER_INFERENCE_PIPELINE_ID"] = "YOUR_OPENLAYER_INFERENCE_PIPELINE_ID_HERE"

# 2. Wrap the Content Understanding client with `trace_azure_content_understanding`
from openlayer.lib import trace_azure_content_understanding, configure

# Configure if you want to upload documents to Openlayer storage
configure(
    attachment_upload_enabled=True,   # upload binary/file attachments
    url_upload_enabled=True,          # also download & re-upload external URLs
)

client = trace_azure_content_understanding(
    ContentUnderstandingClient(
        endpoint="YOUR_AZURE_CONTENT_UNDERSTANDING_ENDPOINT_HERE",
        credential=AzureKeyCredential("YOUR_AZURE_CONTENT_UNDERSTANDING_KEY_HERE"),
        api_version="2025-11-01",
    )
)

# 3. Use the client normally — tracing happens automatically
poller = client.begin_analyze(
    analyzer_id="prebuilt-invoice",
    inputs=[AnalysisInput(url="https://example.com/invoice.pdf")],
)
result = poller.result()

See full Python example
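Once poller.result() returns, you can flatten the extracted fields for downstream use. The sketch below works on a plain dict mirroring the general shape of a Content Understanding analysis response (result → contents → fields); the field names ("VendorName", "InvoiceTotal") and value keys ("valueString", "valueNumber") are illustrative assumptions, not the authoritative schema — check the response models of your SDK version.

```python
# Hedged sketch: sample_response approximates the nested shape of an
# analysis result; exact keys vary by analyzer and API version.
sample_response = {
    "result": {
        "contents": [
            {
                "fields": {
                    "VendorName": {"type": "string", "valueString": "Contoso Ltd."},
                    "InvoiceTotal": {"type": "number", "valueNumber": 1234.56},
                }
            }
        ]
    }
}

def extract_fields(response: dict) -> dict:
    """Flatten the first content's fields into a {name: value} dict."""
    fields = response["result"]["contents"][0].get("fields", {})
    out = {}
    for name, field in fields.items():
        # The value key depends on the field type (valueString, valueNumber, ...)
        out[name] = field.get("valueString", field.get("valueNumber"))
    return out

print(extract_fields(sample_response))
# {'VendorName': 'Contoso Ltd.', 'InvoiceTotal': 1234.56}
```

Because tracing wraps the client transparently, this post-processing needs no changes — the raw inputs and outputs are captured before your code touches the result.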

Once instrumented, every analysis call is automatically published to Openlayer. In the “Data” page of your Openlayer data source, you can see the traces for each request. As your AI system’s requests are continuously published and logged by Openlayer, you can create tests that run at a regular cadence on top of them.
To store the source files alongside your traces in Openlayer, enable attachment uploads in configure() by setting attachment_upload_enabled=True (for binary inputs) and url_upload_enabled=True (to also fetch and persist URL-referenced files).