Learn how to export Semantic Kernel traces to Openlayer
Semantic Kernel is an open-source SDK from Microsoft for building AI applications in languages such as Python, C#, and Java. It ships with built-in OpenTelemetry instrumentation, making it easy to export trace data. This guide shows how to export Semantic Kernel traces to Openlayer for observability and evaluation.
While this guide shows code snippets in Python, the integration also works for the other programming languages supported by Semantic Kernel, such as C# and Java.
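Before running your application, point Semantic Kernel's OpenTelemetry output at Openlayer. A minimal sketch using the standard OTLP environment variables is shown below; the endpoint URL and API key are placeholders, and the experimental diagnostics flag should be verified against the current Semantic Kernel telemetry documentation:

```python
import os

# Standard OTLP exporter settings. Substitute the ingestion endpoint and
# API key from your Openlayer project settings (values here are placeholders).
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "https://api.openlayer.com/v1/otel"
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = "Authorization=Bearer YOUR_OPENLAYER_API_KEY"

# Semantic Kernel gates its GenAI spans behind an experimental flag
# (check the Semantic Kernel observability docs for the current name).
os.environ["SEMANTICKERNEL_EXPERIMENTAL_GENAI_ENABLE_OTEL_DIAGNOSTICS"] = "true"
```

With these variables set, any OTLP-capable span exporter picks up the endpoint and headers automatically, so no exporter code needs to change when you switch backends.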
Once instrumentation is set up, you can run your LLM calls as usual. Trace data will be captured automatically and exported to Openlayer, where you can begin testing and analyzing it. For example:
```python
from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion
from semantic_kernel.prompt_template import InputVariable, PromptTemplateConfig

kernel = Kernel()
kernel.add_service(
    OpenAIChatCompletion(ai_model_id="gpt-4o-mini"),
)

prompt = """{{$input}}
Please provide a concise response to the question above."""

prompt_template_config = PromptTemplateConfig(
    template=prompt,
    name="question_answerer",
    template_format="semantic-kernel",
    input_variables=[
        InputVariable(name="input", description="The question from the user", is_required=True),
    ],
)

summarize = kernel.add_function(
    function_name="answerQuestionFunc",
    plugin_name="questionAnswererPlugin",
    prompt_template_config=prompt_template_config,
)

await kernel.invoke(summarize, input="What's the meaning of life?")
```
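Note that the final `await` must run inside an event loop. In a standalone script, you can drive it with `asyncio.run`; a minimal sketch is below (the body of `main` is illustrative, standing in for the kernel setup and invocation above):

```python
import asyncio

async def main() -> str:
    # In a real script, build the kernel here and return the invocation result:
    #   result = await kernel.invoke(summarize, input="What's the meaning of life?")
    #   return str(result)
    return "result"

answer = asyncio.run(main())
print(answer)
```

In a Jupyter notebook, a top-level `await` works as-is, since the notebook already runs an event loop.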