The examples below show how Openlayer’s monitoring mode can be used in isolation.

For examples that span development and monitoring modes, check out the Templates.

OpenAI Chat Completions - Python
Monitoring OpenAI chat completion calls in Python.
Stack: Python, OpenAI

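A minimal sketch of the pattern this example covers, assuming the Openlayer Python SDK's `trace_openai` wrapper and the `OPENLAYER_API_KEY` / `OPENLAYER_INFERENCE_PIPELINE_ID` environment variables (check the example itself for the exact setup):

```python
import os

import openai
from openlayer.lib import trace_openai

# Assumed setup: Openlayer credentials and the target inference pipeline
# come from environment variables (OPENAI_API_KEY must also be set).
os.environ["OPENLAYER_API_KEY"] = "YOUR_OPENLAYER_API_KEY"
os.environ["OPENLAYER_INFERENCE_PIPELINE_ID"] = "YOUR_INFERENCE_PIPELINE_ID"

# Wrap the OpenAI client; chat completion calls made through it are traced
# and published to the Openlayer inference pipeline.
openai_client = trace_openai(openai.OpenAI())

response = openai_client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What can I monitor with Openlayer?"}],
)
print(response.choices[0].message.content)
```
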
OpenAI Chat Completions - TypeScript
Monitoring OpenAI chat completion calls in TypeScript.
Stack: TypeScript, OpenAI

Tracing a RAG system
Tracing every step of a RAG pipeline.
Stack: Python, OpenAI

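The RAG example generalizes the wrapper above by decorating each step of the pipeline so retrieval and generation show up as separate steps in a single trace. A minimal sketch, assuming the SDK's `trace()` decorator alongside `trace_openai` (the retriever here is a placeholder):

```python
import openai
from openlayer.lib import trace, trace_openai

openai_client = trace_openai(openai.OpenAI())


@trace()
def retrieve_context(query: str) -> str:
    # Placeholder retriever; a real pipeline would query a vector store here.
    return "Openlayer traces and monitors production LLM systems."


@trace()
def generate_answer(query: str, context: str) -> str:
    response = openai_client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": f"Answer using this context: {context}"},
            {"role": "user", "content": query},
        ],
    )
    return response.choices[0].message.content


@trace()
def answer_question(query: str) -> str:
    # The nested calls are grouped into one trace for the request.
    return generate_answer(query, retrieve_context(query))


print(answer_question("What does Openlayer do?"))
```
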
Anthropic Messages - Python
Monitoring Anthropic message creation calls in Python.
Stack: Python, Anthropic

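The Anthropic example follows the same client-wrapper pattern. A sketch, assuming the SDK exposes a `trace_anthropic` wrapper analogous to `trace_openai` (the name and setup are assumptions; see the example for the exact API):

```python
import anthropic
from openlayer.lib import trace_anthropic  # assumed wrapper, analogous to trace_openai

# Assumes OPENLAYER_API_KEY, OPENLAYER_INFERENCE_PIPELINE_ID, and
# ANTHROPIC_API_KEY are set in the environment.
anthropic_client = trace_anthropic(anthropic.Anthropic())

message = anthropic_client.messages.create(
    model="claude-3-5-sonnet-latest",
    max_tokens=256,
    messages=[{"role": "user", "content": "Summarize what Openlayer monitoring does."}],
)
print(message.content[0].text)
```
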
Azure OpenAI Chat Completions - Python
Monitoring Azure OpenAI chat completion calls in Python.
Stack: Python, Azure OpenAI

Mistral AI Chat Completions - Python
Monitoring Mistral AI chat completion and streaming calls in Python.
Stack: Python, Mistral AI

LLMs with LangChain - Python
Monitoring LLMs built with LangChain using a callback handler.
Stack: Python, LangChain

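The LangChain examples use a callback handler rather than a client wrapper. A minimal sketch, assuming the SDK's LangChain integration exposes an `OpenlayerHandler` (module path and handler name should be checked against the installed SDK version):

```python
from langchain_openai import ChatOpenAI
from openlayer.lib.integrations import langchain_callback  # assumed module path

# Assumes OPENLAYER_API_KEY, OPENLAYER_INFERENCE_PIPELINE_ID, and
# OPENAI_API_KEY are set in the environment.
openlayer_handler = langchain_callback.OpenlayerHandler()

# Attach the handler so every LLM call made through the chain is traced.
chat = ChatOpenAI(model="gpt-4o-mini", callbacks=[openlayer_handler])
print(chat.invoke("What is a LangChain callback handler?").content)
```
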
LLMs with LangChain - TypeScript
Streaming data from LangChain to Openlayer in TypeScript.
Stack: TypeScript, LangChain

Vertex AI via LangChain - Python
Monitoring Vertex AI calls via LangChain using a callback handler.
Stack: Python, Vertex AI, LangChain

Ollama via LangChain - Python
Monitoring Ollama calls via LangChain using a callback handler.
Stack: Python, Ollama, LangChain

Groq Chat Completions - Python
Monitoring Groq LLM chat completion calls in Python.
Stack: Python, Groq

OpenAI Assistants API - Python
Monitoring OpenAI Assistants API runs in Python.
Stack: Python, OpenAI

OpenAI Assistants API - TypeScript
Monitoring OpenAI Assistants API runs in TypeScript.
Stack: TypeScript, OpenAI

Manually streaming data for monitoring - Python
Monitoring a tabular classification model in production.
Stack: Python, Traditional ML

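When there is no wrapper or callback to lean on, as in the tabular classification example above, rows can be streamed to an inference pipeline directly. A sketch of that call, assuming the `Openlayer` client's `inference_pipelines.data.stream` method; the config field names below are illustrative and should be checked against the API reference:

```python
from openlayer import Openlayer

# Assumes OPENLAYER_API_KEY is set in the environment.
client = Openlayer()

client.inference_pipelines.data.stream(
    inference_pipeline_id="YOUR_INFERENCE_PIPELINE_ID",
    # Assumed tabular classification config; verify field names in the API reference.
    config={
        "class_names": ["churned", "retained"],
        "feature_names": ["age", "plan", "monthly_spend"],
        "categorical_feature_names": ["plan"],
        "predictions_column_name": "prediction",
    },
    rows=[
        {"age": 42, "plan": "pro", "monthly_spend": 59.0, "prediction": 1},
    ],
)
```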