MeshAI accepts standard OTLP/HTTP JSON exports. Point any OpenTelemetry-instrumented agent at MeshAI and it will discover agents, extract token usage, and populate the registry automatically.

Endpoints

| Endpoint | Method | Description |
|---|---|---|
| /api/v1/ingest/v1/traces | POST | OTLP trace export (JSON) |
| /api/v1/ingest/v1/metrics | POST | OTLP metrics export (JSON, processing deferred) |
Both require an API key with the telemetry:write scope.
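If you want to see the wire format, the traces endpoint can be exercised directly. The following is a minimal sketch using only the Python standard library; the trace/span IDs and timestamps are hand-rolled placeholders here (real SDKs generate them for you), and msh_YOUR_API_KEY must be replaced with a real key:

```python
import json
import urllib.request

# Minimal OTLP/HTTP JSON trace payload: one resource, one scope, one span.
payload = {
    "resourceSpans": [{
        "resource": {"attributes": [
            {"key": "service.name", "value": {"stringValue": "my-summarizer"}},
        ]},
        "scopeSpans": [{
            "scope": {"name": "manual"},
            "spans": [{
                "traceId": "5b8aa5a2d2c872e8321cf37308d69df2",  # 16 bytes, hex
                "spanId": "051581bf3cb55c13",                    # 8 bytes, hex
                "name": "agent.run",
                "kind": 1,
                "startTimeUnixNano": "1700000000000000000",
                "endTimeUnixNano": "1700000001000000000",
                "attributes": [
                    {"key": "gen_ai.usage.input_tokens",
                     "value": {"intValue": "1500"}},
                ],
            }],
        }],
    }],
}

def export_traces(api_key: str) -> bytes:
    """POST the payload to the MeshAI traces ingest endpoint."""
    req = urllib.request.Request(
        "https://api.meshai.dev/api/v1/ingest/v1/traces",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()
```

In practice you would let an OTLP SDK build and batch these payloads; this sketch only illustrates the JSON shape the endpoint accepts.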

Quick Setup

Set the standard OpenTelemetry environment variable to point at MeshAI:
export OTEL_EXPORTER_OTLP_ENDPOINT=https://api.meshai.dev/api/v1/ingest
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer msh_YOUR_API_KEY"
export OTEL_EXPORTER_OTLP_PROTOCOL=http/json
That’s it. Any OTLP-compatible SDK or agent will now export traces to MeshAI.
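This works because, per the OTLP spec, HTTP exporters derive per-signal URLs by appending /v1/traces or /v1/metrics to OTEL_EXPORTER_OTLP_ENDPOINT, which is how the base endpoint above maps to the ingest paths in the table. A quick sketch of that derivation:

```python
import os

# SDKs append the signal path to the base endpoint from the environment.
base = os.environ.get(
    "OTEL_EXPORTER_OTLP_ENDPOINT",
    "https://api.meshai.dev/api/v1/ingest",  # MeshAI base endpoint as a fallback
)
traces_url = f"{base.rstrip('/')}/v1/traces"
metrics_url = f"{base.rstrip('/')}/v1/metrics"
print(traces_url)
print(metrics_url)
```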

What Gets Extracted

When MeshAI receives trace data, it automatically:
  1. Discovers agents from span attributes (agent.name, service.name)
  2. Extracts token usage from gen_ai.usage.input_tokens and gen_ai.usage.output_tokens attributes
  3. Records model info from gen_ai.request.model and gen_ai.system attributes
  4. Registers new agents in your registry if they don’t already exist

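The extraction steps above can be sketched roughly as follows. This is a simplification for illustration only (MeshAI's server-side logic isn't published); it flattens OTLP JSON attributes and applies the agent.name → service.name fallback:

```python
def _attr_map(attributes):
    """Flatten OTLP JSON keyValue attributes into a plain dict."""
    out = {}
    for kv in attributes:
        value = kv.get("value", {})
        # OTLP JSON wraps each attribute value in a typed field.
        out[kv["key"]] = (
            value.get("stringValue")
            or value.get("intValue")
            or value.get("doubleValue")
        )
    return out

def extract_agent_info(span, resource_attrs):
    """Pull agent identity, model info, and token usage from one span."""
    attrs = _attr_map(span.get("attributes", []))
    res = _attr_map(resource_attrs)
    return {
        # agent.name wins; fall back to the resource's service.name.
        "agent": attrs.get("agent.name") or res.get("service.name"),
        "model": attrs.get("gen_ai.request.model"),
        "system": attrs.get("gen_ai.system"),
        "input_tokens": int(attrs.get("gen_ai.usage.input_tokens") or 0),
        "output_tokens": int(attrs.get("gen_ai.usage.output_tokens") or 0),
    }
```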
Python Example

Using the OpenTelemetry Python SDK:
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Configure exporter to send to MeshAI
exporter = OTLPSpanExporter(
    endpoint="https://api.meshai.dev/api/v1/ingest/v1/traces",
    headers={"Authorization": "Bearer msh_YOUR_API_KEY"},
)

provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("my-agent")

# Your agent code — spans are automatically exported
with tracer.start_as_current_span("agent.run") as span:
    span.set_attribute("agent.name", "my-summarizer")
    span.set_attribute("gen_ai.request.model", "gpt-4o")
    span.set_attribute("gen_ai.usage.input_tokens", 1500)
    span.set_attribute("gen_ai.usage.output_tokens", 800)
    # ... agent logic ...

Node.js Example

const { NodeTracerProvider } = require("@opentelemetry/sdk-trace-node");
const { BatchSpanProcessor } = require("@opentelemetry/sdk-trace-base");
const { OTLPTraceExporter } = require("@opentelemetry/exporter-trace-otlp-http");

const exporter = new OTLPTraceExporter({
  url: "https://api.meshai.dev/api/v1/ingest/v1/traces",
  headers: { Authorization: "Bearer msh_YOUR_API_KEY" },
});

const provider = new NodeTracerProvider();
provider.addSpanProcessor(new BatchSpanProcessor(exporter));
provider.register();

Framework Integration

Most agent frameworks support OpenTelemetry natively or via plugins:
| Framework | OTel Support |
|---|---|
| LangChain | langchain-opentelemetry package |
| CrewAI | Built-in OTEL_EXPORTER_OTLP_ENDPOINT support |
| AutoGen | OpenTelemetry tracing via autogen-ext |
| Semantic Kernel | Built-in ActivitySource (maps to OTel) |
| Custom agents | Use any OTLP SDK |
Set the environment variables above and these frameworks will export traces to MeshAI automatically.

Response Format

{
  "success": true,
  "data": {
    "accepted_spans": 12,
    "agents_discovered": 1
  }
}
| Field | Description |
|---|---|
| accepted_spans | Number of spans successfully processed |
| agents_discovered | Number of new agents auto-registered from this batch |
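A client can use these fields to confirm the batch landed. A small sketch, using a hard-coded copy of the documented response body rather than a live call:

```python
import json

# Example ingest response body, matching the documented shape above.
body = '{"success": true, "data": {"accepted_spans": 12, "agents_discovered": 1}}'

resp = json.loads(body)
if resp["success"]:
    data = resp["data"]
    print(f"accepted {data['accepted_spans']} spans, "
          f"discovered {data['agents_discovered']} new agents")
```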

Metrics (Preview)

The metrics endpoint (/api/v1/ingest/v1/metrics) accepts OTLP metric exports and acknowledges them. Metrics-based anomaly detection is coming in Phase 2.