Continuously score production LLM traffic and track quality over time.
Monitor results – production traces with scores.
Monitors work with traces instrumented using any of the common LLM span attribute conventions (`openinference.*`, `ai.prompt`/`ai.response`, `gen_ai.*`).
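As a rough sketch (not a documented API of this product): emitting a plain OpenTelemetry span with `gen_ai.*` attributes is one way to produce traffic in this shape. The span name, model, and token counts below are illustrative placeholders.

```python
from opentelemetry import trace

tracer = trace.get_tracer("chat-service")

# Record one LLM call as a span carrying gen_ai.* attributes
# (OpenTelemetry GenAI semantic conventions). Instrumentation that
# emits openinference.* or ai.prompt/ai.response works the same way.
with tracer.start_as_current_span("chat-completion") as span:
    span.set_attribute("gen_ai.system", "openai")
    span.set_attribute("gen_ai.request.model", "gpt-4o-mini")
    span.set_attribute("gen_ai.usage.input_tokens", 182)
    span.set_attribute("gen_ai.usage.output_tokens", 47)
```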
Traces search page with scores created by a 'monitor'.
Navigate to monitors via the clock icon. You'll land on the monitor overview page.
Create monitor modal.
Narrow which traces a monitor scores by filtering on `spanName`, `serviceName`, or full-text `searchText`.
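As a loose illustration (assuming, since it is not stated here, that `serviceName` maps to the OpenTelemetry `service.name` resource attribute and `spanName` to the span's name), this is where those values are typically set when instrumenting a service:

```python
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider

# Assumption: the monitor's serviceName filter matches the OTel
# service.name resource attribute, and spanName matches the span name.
provider = TracerProvider(
    resource=Resource.create({"service.name": "chat-backend"})
)
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("chat-backend")
with tracer.start_as_current_span("summarize-ticket") as span:
    # ... make the LLM call and set gen_ai.* attributes as shown earlier ...
    pass
```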
Monitor options – Metrics.
Monitor options – sample & filter.
Monitor results – Scores.
Monitor overview list.
Edit monitor modal.