Spring AI is a Spring-based framework for building AI applications. It ships with built-in OpenTelemetry instrumentation, making it easy to export trace data. This guide shows how to export Spring AI traces to Openlayer for observability and evaluation.
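Before configuring anything, your build needs the tracing dependencies. A typical Maven setup might look like the sketch below — note that the Spring AI starter artifact name varies across Spring AI versions, and the model starter shown (OpenAI) is only an example; swap in the starter for your provider:

```xml
<!-- Example Spring AI model starter; the artifact ID depends on your
     Spring AI version and model provider (OpenAI shown as an assumption). -->
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-starter-model-openai</artifactId>
</dependency>

<!-- Bridges Micrometer's tracing API to OpenTelemetry -->
<dependency>
    <groupId>io.micrometer</groupId>
    <artifactId>micrometer-tracing-bridge-otel</artifactId>
</dependency>

<!-- Exports spans over OTLP (the protocol Openlayer's endpoint accepts) -->
<dependency>
    <groupId>io.opentelemetry</groupId>
    <artifactId>opentelemetry-exporter-otlp</artifactId>
</dependency>
```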
With the dependencies in place, Spring Boot will auto-configure OpenTelemetry tracing. You just need to:

- Set the OTLP endpoint (pointing to Openlayer)
- Enable tracing for Spring AI
Example configuration:
```yaml
spring:
  application:
    name: my-llm-app
  ai:
    chat:
      observations:
        include-prompt: true       # Include prompt content in tracing (disabled by default for privacy)
        include-completion: true   # Include completion content in tracing (disabled by default)

management:
  tracing:
    sampling:
      probability: 1.0             # Sample 100% of requests for full tracing (adjust in production as needed)
  observations:
    annotations:
      enabled: true                # Enable @Observed (if you use observation annotations in code)
```
3. Point to Openlayer's OpenTelemetry endpoint
Finally, point your application to Openlayer’s OpenTelemetry endpoint via
the following environment variables:
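A minimal sketch using the standard OTLP exporter environment variables — the endpoint URL and authorization header shown here are assumptions, so check your Openlayer workspace settings for the exact values:

```shell
# Standard OpenTelemetry OTLP exporter variables.
# The URL below is a placeholder for Openlayer's OTLP endpoint —
# confirm the real value in your Openlayer dashboard.
export OTEL_EXPORTER_OTLP_ENDPOINT="https://api.openlayer.com/v1/otel"

# Authenticate the exporter with your Openlayer API key
# (header name/format assumed; verify against Openlayer's docs).
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer $OPENLAYER_API_KEY"
```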
Once instrumentation is set up, you can run your Spring application and LLM calls as usual.
Trace data will be automatically captured and exported to Openlayer, where
you can begin testing and analyzing it.