Openlayer can act as an OpenTelemetry backend,
enabling trace ingestion from any OpenTelemetry-compatible instrumentation library. This guide shows how to use the OpenLIT
library to instrument an LLM framework or provider and send trace data to
Openlayer for monitoring and evaluation.
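For the exported traces to reach Openlayer, the OTLP exporter must be pointed at Openlayer's ingestion endpoint before instrumentation starts. A minimal sketch using the standard OpenTelemetry environment variables follows; the endpoint URL and the API-key placeholder are illustrative assumptions, so substitute the values from your Openlayer project settings:

```python
import os

# Standard OpenTelemetry exporter settings, read by the OTLP exporter at
# startup. Both values below are placeholders -- replace them with the
# endpoint and API key from your Openlayer project.
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "https://<your-openlayer-otel-endpoint>"
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = "Authorization=Bearer <YOUR_OPENLAYER_API_KEY>"
```

These variables must be set in the process environment before the instrumentation library initializes its exporter, which is why they appear ahead of the initialization step below.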
Initialize OpenLIT instrumentation in your application.
```python
import openlit

openlit.init(disable_batch=True)
```
3. Run LLMs and workflows as usual
Once instrumentation is set up, you can run your LLM calls as usual.
Trace data will be automatically captured and exported to Openlayer, where
you can begin testing and analyzing it. For example:
```python
from openai import OpenAI

client = OpenAI()

client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "How are you doing today?"}],
)
```