Platform rules require your team to take specific actions within Openlayer. Unlike evidence-based rules, they don’t require any uploads — they’re satisfied automatically as you use the platform.

How completion works

When you instrument your app and start capturing production traces, rules like “Capture production traces” and “Enable monitoring notifications” are marked complete. When you integrate Openlayer into your CI/CD pipeline and run tests, rules like “Setup development” and “Capture pre-production data” are satisfied. Compliance becomes a byproduct of good engineering practices.
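As a rough illustration of what "instrumenting your app" captures, the sketch below wraps a function so every call records its inputs, output, and latency. The names (`trace`, `TRACE_LOG`, `answer`) are illustrative assumptions for this sketch, not the Openlayer SDK's actual API:

```python
import time
from functools import wraps

# Hypothetical in-memory trace log; a real integration would ship
# these records to the observability platform instead.
TRACE_LOG = []

def trace(func):
    """Record inputs, output, and latency for each call (illustrative)."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        output = func(*args, **kwargs)
        TRACE_LOG.append({
            "function": func.__name__,
            "inputs": {"args": args, "kwargs": kwargs},
            "output": output,
            "latency_ms": (time.perf_counter() - start) * 1000,
        })
        return output
    return wrapper

@trace
def answer(question: str) -> str:
    # Stand-in for a model call; a real app would invoke an LLM here.
    return f"Echo: {question}"

answer("What is our refund policy?")
```

Once calls flow through a wrapper like this in production, the trace history accumulates on its own, which is why the corresponding rules complete without any manual upload.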

What this produces for auditors

Each platform rule generates a specific type of evidence in Openlayer:
  • Observability rules produce continuous test run history with timestamps, pass/fail status, and the specific data points that triggered failures — plus full trace logs of every production request, including inputs, outputs, latency, cost, and intermediate steps.
  • Offline testing rules produce version-controlled test results tied to git commits, showing systematic evaluation of every system change before it reaches production.
  • Evaluation rules produce historical results for each test type — bias, prompt injection, PII, hallucination, and more — with trends over time.
  • Project metadata rules produce structured records of ownership, risk classification, and approval status for every AI initiative in your workspace.
This evidence maps directly to specific articles in each standard. Open a framework’s Documentation tab to see exactly which articles each rule addresses.
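To make the evidence types above concrete, here is a sketch of what one trace record and one test-run record might contain. All field names and values are illustrative assumptions, not Openlayer's actual export schema:

```python
# Hypothetical trace-log record: one production request with its
# inputs, output, latency, cost, and intermediate steps.
trace_record = {
    "timestamp": "2024-05-01T12:00:00Z",
    "inputs": {"question": "What is our refund policy?"},
    "output": "Refunds are available within 30 days.",
    "latency_ms": 412.7,
    "cost_usd": 0.0031,
    "steps": [
        {"name": "retrieve_context", "latency_ms": 88.2},
        {"name": "llm_call", "latency_ms": 324.5},
    ],
}

# Hypothetical test-run record: results tied to a git commit,
# with pass/fail status per test type.
test_run = {
    "commit": "3f2a9c1",     # illustrative commit hash
    "test_type": "pii",      # e.g. bias, prompt injection, hallucination
    "status": "passed",
    "failing_rows": [],      # data points that triggered failures, if any
}

print(test_run["status"])
```

Records of this general shape, accumulated over time, are what give auditors timestamps, trends, and per-commit traceability.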