MeshAI accepts standard OTLP/HTTP JSON exports. Point any OpenTelemetry-instrumented agent at MeshAI and it will discover agents, extract token usage, and populate the registry automatically.
Endpoints
| Endpoint | Method | Description |
|---|---|---|
| `/api/v1/ingest/v1/traces` | POST | OTLP trace export (JSON) |
| `/api/v1/ingest/v1/metrics` | POST | OTLP metrics export (JSON, processing deferred) |
Both endpoints require an API key with the `telemetry:write` scope.
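For reference, here is a minimal sketch of a raw export against the traces endpoint using Python's `requests` library instead of an SDK. The payload follows the standard OTLP/JSON encoding; the trace/span IDs, timestamps, and attribute values are placeholders, and in practice an OpenTelemetry SDK builds and batches this payload for you (see Quick Setup below).
```python
import time
import requests  # assumed to be installed; any HTTP client works

now_ns = int(time.time() * 1e9)

# Minimal OTLP/JSON trace payload: one resource, one scope, one span.
payload = {
    "resourceSpans": [{
        "resource": {"attributes": [
            {"key": "service.name", "value": {"stringValue": "my-summarizer"}},
        ]},
        "scopeSpans": [{
            "scope": {"name": "manual-export"},
            "spans": [{
                "traceId": "5b8aa5a2d2c872e8321cf37308d69df2",  # placeholder, hex-encoded
                "spanId": "051581bf3cb55c13",                    # placeholder, hex-encoded
                "name": "agent.run",
                "kind": 1,  # SPAN_KIND_INTERNAL
                "startTimeUnixNano": str(now_ns),
                "endTimeUnixNano": str(now_ns + 1_000_000),
                "attributes": [
                    {"key": "agent.name", "value": {"stringValue": "my-summarizer"}},
                    {"key": "gen_ai.request.model", "value": {"stringValue": "gpt-4o"}},
                    {"key": "gen_ai.usage.input_tokens", "value": {"intValue": "1500"}},
                    {"key": "gen_ai.usage.output_tokens", "value": {"intValue": "800"}},
                ],
            }],
        }],
    }],
}

resp = requests.post(
    "https://api.meshai.dev/api/v1/ingest/v1/traces",
    json=payload,
    headers={"Authorization": "Bearer msh_YOUR_API_KEY"},
)
print(resp.json())
```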
Quick Setup
Set the standard OpenTelemetry environment variables to point at MeshAI:
```bash
export OTEL_EXPORTER_OTLP_ENDPOINT=https://api.meshai.dev/api/v1/ingest
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer msh_YOUR_API_KEY"
export OTEL_EXPORTER_OTLP_PROTOCOL=http/json
```
That’s it. Any OTLP-compatible SDK or agent will now export traces to MeshAI.
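With those variables set, an exporter created with no arguments picks up the endpoint and headers from the environment. A minimal Python sketch, assuming `opentelemetry-exporter-otlp-proto-http` is installed:
```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# No endpoint or headers passed here; the exporter reads
# OTEL_EXPORTER_OTLP_ENDPOINT and OTEL_EXPORTER_OTLP_HEADERS from the environment.
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(OTLPSpanExporter()))
trace.set_tracer_provider(provider)
```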
When MeshAI receives trace data, it automatically:
- Discovers agents from span attributes (`agent.name`, `service.name`)
- Extracts token usage from the `gen_ai.usage.input_tokens` and `gen_ai.usage.output_tokens` attributes
- Records model info from the `gen_ai.request.model` and `gen_ai.system` attributes
- Registers new agents in your registry if they don't already exist
Python Example
Using the OpenTelemetry Python SDK:
```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Configure the exporter to send to MeshAI
exporter = OTLPSpanExporter(
    endpoint="https://api.meshai.dev/api/v1/ingest/v1/traces",
    headers={"Authorization": "Bearer msh_YOUR_API_KEY"},
)

provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("my-agent")

# Your agent code; spans are exported automatically
with tracer.start_as_current_span("agent.run") as span:
    span.set_attribute("agent.name", "my-summarizer")
    span.set_attribute("gen_ai.request.model", "gpt-4o")
    span.set_attribute("gen_ai.usage.input_tokens", 1500)
    span.set_attribute("gen_ai.usage.output_tokens", 800)
    # ... agent logic ...
```
Node.js Example
Using the OpenTelemetry Node.js SDK (provider wiring shown with the 1.x JS SDK API):
```javascript
const { OTLPTraceExporter } = require("@opentelemetry/exporter-trace-otlp-http");
const { NodeTracerProvider } = require("@opentelemetry/sdk-trace-node");
const { BatchSpanProcessor } = require("@opentelemetry/sdk-trace-base");

// Configure the exporter to send to MeshAI
const exporter = new OTLPTraceExporter({
  url: "https://api.meshai.dev/api/v1/ingest/v1/traces",
  headers: { Authorization: "Bearer msh_YOUR_API_KEY" },
});

const provider = new NodeTracerProvider();
provider.addSpanProcessor(new BatchSpanProcessor(exporter)); // batches spans before export
provider.register(); // makes this the global tracer provider
```
Framework Integration
Most agent frameworks support OpenTelemetry natively or via plugins:
| Framework | OTel Support |
|---|---|
| LangChain | `langchain-opentelemetry` package |
| CrewAI | Built-in `OTEL_EXPORTER_OTLP_ENDPOINT` support |
| AutoGen | OpenTelemetry tracing via `autogen-ext` |
| Semantic Kernel | Built-in `ActivitySource` (maps to OTel) |
| Custom agents | Use any OTLP SDK |
Set the environment variables above and these frameworks will export traces to MeshAI automatically.
Response
A successful export to the traces endpoint returns:
```json
{
  "success": true,
  "data": {
    "accepted_spans": 12,
    "agents_discovered": 1
  }
}
```
| Field | Description |
|---|---|
| `accepted_spans` | Number of spans successfully processed |
| `agents_discovered` | Number of new agents auto-registered from this batch |
Metrics (Preview)
The metrics endpoint (`/api/v1/ingest/v1/metrics`) accepts OTLP metric exports and acknowledges them, but processing is currently deferred. Metrics-based anomaly detection is coming in Phase 2.
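If you want to exercise the preview endpoint now, here is a minimal sketch of pointing the OpenTelemetry Python metrics SDK at it (again assuming `opentelemetry-exporter-otlp-proto-http` is installed); MeshAI will acknowledge these exports even though processing is deferred:
```python
from opentelemetry import metrics
from opentelemetry.exporter.otlp.proto.http.metric_exporter import OTLPMetricExporter
from opentelemetry.sdk.metrics import MeterProvider
from opentelemetry.sdk.metrics.export import PeriodicExportingMetricReader

# Periodically export metrics to the MeshAI preview endpoint
exporter = OTLPMetricExporter(
    endpoint="https://api.meshai.dev/api/v1/ingest/v1/metrics",
    headers={"Authorization": "Bearer msh_YOUR_API_KEY"},
)
provider = MeterProvider(metric_readers=[PeriodicExportingMetricReader(exporter)])
metrics.set_meter_provider(provider)
```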