OpenTelemetry (OTel) is an open-source framework for collecting observability data. It is widely used by frameworks such as Semantic Kernel, the Vercel AI SDK, Spring AI, and others.

You can configure Openlayer as the backend for your OTel trace data. If you are already using a framework that captures OTel traces, you can point it to Openlayer’s OTel endpoint to export traces and monitor your AI system.

OpenTelemetry endpoint

Openlayer accepts OTel traces at the following endpoint: https://api.openlayer.com/v1/otel. This endpoint uses the OTLP protocol and expects telemetry data in protobuf format over HTTPS.

Most OTel-instrumented SDKs use this format by default, but be sure to check your SDK’s documentation to confirm your setup.

To send OTel data to Openlayer, configure your SDK to use the endpoint above and include the correct authentication headers. This is typically done using the environment variables shown below.

OTEL_EXPORTER_OTLP_ENDPOINT=https://api.openlayer.com/v1/otel
OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer YOUR_OPENLAYER_API_KEY, x-bt-parent=pipeline_id:YOUR_OPENLAYER_PIPELINE_ID"
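If you prefer to configure the exporter in code rather than through environment variables, the sketch below shows one way to do it in Python, assuming the opentelemetry-sdk and opentelemetry-exporter-otlp-proto-http packages. The header names and values mirror the environment variables above; adapt them to your setup.

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# When the endpoint is passed in code (instead of via OTEL_EXPORTER_OTLP_ENDPOINT),
# the HTTP exporter uses it as-is, so include the full /v1/traces path.
exporter = OTLPSpanExporter(
    endpoint="https://api.openlayer.com/v1/otel/v1/traces",
    headers={
        "Authorization": "Bearer YOUR_OPENLAYER_API_KEY",
        "x-bt-parent": "pipeline_id:YOUR_OPENLAYER_PIPELINE_ID",
    },
)

# Register a tracer provider that batches spans and exports them to Openlayer.
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)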

If your SDK or OTel Collector setup relies on signal-specific environment variables, set the traces endpoint to the full path https://api.openlayer.com/v1/otel/v1/traces, since signal-specific endpoints are used as-is and do not have /v1/traces appended automatically.
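For example, with the standard signal-specific variable that would be:

OTEL_EXPORTER_OTLP_TRACES_ENDPOINT=https://api.openlayer.com/v1/otel/v1/traces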

Property mapping

When Openlayer receives OTel data, it transforms the incoming spans into its own trace format. This involves mapping attributes from the GenAI semantic conventions and from popular frameworks onto Openlayer's trace data model.

The OTel GenAI semantic convention is still evolving. If an integration does not work as expected or if Openlayer does not parse all attributes correctly, please reach out.
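For reference, a manually created span carrying GenAI semantic-convention attributes might look like the sketch below. The gen_ai.* attribute names follow the current convention and may change as it evolves; exactly how each attribute surfaces in Openlayer's trace model is not guaranteed here.

from opentelemetry import trace

tracer = trace.get_tracer("my-llm-app")

# Attribute names follow the (still-evolving) OTel GenAI semantic conventions.
with tracer.start_as_current_span("chat gpt-4o") as span:
    span.set_attribute("gen_ai.system", "openai")
    span.set_attribute("gen_ai.request.model", "gpt-4o")
    span.set_attribute("gen_ai.usage.input_tokens", 120)
    span.set_attribute("gen_ai.usage.output_tokens", 45)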

Libraries and frameworks with OpenTelemetry support

Any OpenTelemetry-compatible instrumentation can be used to export traces to Openlayer.

The libraries and frameworks below are already instrumented for OpenTelemetry, so their traces can be exported to Openlayer. Check out their dedicated integration guides to learn how to set this up: