Observability
Tracing is crucial for debugging and improving your AI app: it gives you visibility into every execution step while collecting valuable data for evaluations and fine-tuning. With Laminar, you can start tracing with a single line of code.
import { Laminar, observe } from '@lmnr-ai/lmnr';

// automatically traces common LLM frameworks and SDKs
Laminar.initialize({ projectApiKey: "..." });

// you can also manually trace any function
const myFunction = observe({ name: 'myFunc' }, async () => {
  // the function body runs inside a traced span
});

Tommy He
CTO, Clarum

I can attest to it being the only reliable and performant LLM monitoring platform I've tried. Founding team is great to talk to and super responsive.
Hashim Rehman
CTO, Remo

Laminar's evals help us maintain high accuracy while moving fast, and their team is incredibly responsive. We now use them for every LLM-based feature we build.
Michael Ettlinger
CTO, Saturn

Laminar's tracing is genuinely great. So much better than the others I've tried.
Automatic tracing of LLM frameworks and SDKs with 1 line of code
Simply initialize Laminar at the top of your project and popular LLM frameworks and SDKs will be traced automatically.
Real-time traces
Don't wait for your AI workflows and agents to finish to debug them. Laminar's tracing engine provides real-time traces.
Browser agent observability
Laminar automatically records high-quality browser sessions and syncs them with agent traces to help you see what the browser agent sees. This drastically improves the debugging experience and allows you to fix issues 10x faster.

LLM playground
Open LLM spans in the playground to experiment with prompts and models.
Datasets
Build datasets from span data for evals, fine-tuning and prompt engineering.
Labels
Label your spans with custom tags to make them more informative.
Open-source and easy to self-host
Laminar is fully open-source and easy to self-host.
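A minimal self-hosting sketch, assuming the Docker Compose workflow from the open-source lmnr-ai/lmnr repository (the repository URL and compose setup are assumptions based on the standard clone-and-compose pattern, not confirmed by this page):

```shell
# Clone the open-source repository (assumed location)
git clone https://github.com/lmnr-ai/lmnr
cd lmnr

# Launch the full stack locally in the background
docker compose up -d
```

Once the stack is running, point the SDK's `projectApiKey` (and base URL, if your deployment differs from the default) at your own instance instead of the hosted platform.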