Endpoints
| Endpoint | Method | Description |
|---|---|---|
| `/api/v1/ingest/v1/traces` | POST | OTLP trace export (JSON) |
| `/api/v1/ingest/v1/metrics` | POST | OTLP metrics export (JSON, processing deferred) |
Both endpoints require an API key with the `telemetry:write` scope.
Quick Setup
Set the standard OpenTelemetry environment variable, `OTEL_EXPORTER_OTLP_ENDPOINT`, to point at MeshAI:
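A minimal sketch, assuming your MeshAI deployment lives at `https://meshai.example` and your key is in `MESHAI_API_KEY` (both placeholders). OTLP/HTTP exporters append the signal path (`/v1/traces`, `/v1/metrics`) to the base endpoint, which is why the base below stops at `/api/v1/ingest`:

```shell
# Base endpoint: the exporter appends /v1/traces (or /v1/metrics) itself,
# matching the ingest paths in the table above. Host is a placeholder.
export OTEL_EXPORTER_OTLP_ENDPOINT="https://meshai.example/api/v1/ingest"

# MeshAI ingests OTLP JSON; SDK support for http/json varies, so check yours.
export OTEL_EXPORTER_OTLP_PROTOCOL="http/json"

# The key must carry the telemetry:write scope. Some SDKs require the header
# value to be URL-encoded (Bearer%20...).
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer ${MESHAI_API_KEY}"
```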
What Gets Extracted
When MeshAI receives trace data, it automatically:

- Discovers agents from span attributes (`agent.name`, `service.name`)
- Extracts token usage from the `gen_ai.usage.input_tokens` and `gen_ai.usage.output_tokens` attributes
- Records model info from the `gen_ai.request.model` and `gen_ai.system` attributes
- Registers new agents in your registry if they don't already exist
Python Example
Node.js Example
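In Node.js you would typically wire `@opentelemetry/sdk-node` to an OTLP HTTP exporter, just as in Python. As an illustration of what actually reaches the trace endpoint, here is a hand-built OTLP/JSON batch sent with Node 18+'s built-in `fetch`; the host, API key, and trace/span IDs are placeholders:

```javascript
// Sketch: the OTLP/JSON body the SDK would POST to MeshAI's trace endpoint.
// IDs and timestamps are made-up placeholders.
const payload = {
  resourceSpans: [{
    resource: {
      // service.name becomes the discovered agent name in MeshAI.
      attributes: [{ key: "service.name", value: { stringValue: "my-agent" } }],
    },
    scopeSpans: [{
      scope: { name: "my-agent" },
      spans: [{
        name: "llm-call",
        traceId: "5b8efff798038103d269b633813fc60c",
        spanId: "eee19b7ec3c1b174",
        kind: 1,
        startTimeUnixNano: "1700000000000000000",
        endTimeUnixNano: "1700000001000000000",
        attributes: [
          { key: "gen_ai.system", value: { stringValue: "openai" } },
          { key: "gen_ai.request.model", value: { stringValue: "gpt-4o" } },
          // OTLP/JSON encodes 64-bit integers as strings.
          { key: "gen_ai.usage.input_tokens", value: { intValue: "120" } },
          { key: "gen_ai.usage.output_tokens", value: { intValue: "48" } },
        ],
      }],
    }],
  }],
};

// Not invoked here: send the batch (requires Node 18+ for global fetch).
async function sendTraces(baseUrl, apiKey) {
  const res = await fetch(`${baseUrl}/api/v1/ingest/v1/traces`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`, // key needs telemetry:write scope
    },
    body: JSON.stringify(payload),
  });
  return res.json();
}
```

Calling `sendTraces("https://meshai.example", process.env.MESHAI_API_KEY)` should resolve to the JSON described under Response Format.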
Framework Integration
Most agent frameworks support OpenTelemetry natively or via plugins:

| Framework | OTel Support |
|---|---|
| LangChain | `langchain-opentelemetry` package |
| CrewAI | Built-in `OTEL_EXPORTER_OTLP_ENDPOINT` support |
| AutoGen | OpenTelemetry tracing via `autogen-ext` |
| Semantic Kernel | Built-in `ActivitySource` (maps to OTel) |
| Custom agents | Use any OTLP SDK |
Response Format
| Field | Description |
|---|---|
| `accepted_spans` | Number of spans successfully processed |
| `agents_discovered` | Number of new agents auto-registered from this batch |
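For illustration, a response to a two-span batch from one new agent might look like this (values invented; the exact field set may differ by version):

```json
{
  "accepted_spans": 2,
  "agents_discovered": 1
}
```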
Metrics (Preview)
The metrics endpoint (`/api/v1/ingest/v1/metrics`) accepts OTLP metric exports and acknowledges them. Metrics-based anomaly detection is coming in Phase 2.
