Option A: Proxy (Fastest — Zero Code)

The MeshAI proxy sits between your agents and LLM providers. One env var change and you’re monitoring.
1. Get your proxy key

Go to app.meshai.dev → Settings → Proxy Keys → Create.
2. Set the env var

# For Anthropic
export ANTHROPIC_BASE_URL=https://proxy.meshai.dev/v1/anthropic/k/YOUR_PROXY_KEY

# For OpenAI
export OPENAI_BASE_URL=https://proxy.meshai.dev/v1/openai/k/YOUR_PROXY_KEY
3. Run your agent

No code changes are needed. Every request your agent makes to the provider now flows through MeshAI automatically.

Option B: Python SDK

1. Install

pip install meshai-sdk
2. Initialize and register

from meshai import MeshAI

client = MeshAI(api_key="msh_...", agent_name="my-agent")
client.register(
    framework="crewai",
    model_provider="openai",
    model_name="gpt-4o",
)
3. Enable auto-tracking

# Automatic heartbeats
client.start_heartbeat()

# Track usage (or use framework wrappers)
client.track_usage(
    model_provider="openai",
    model_name="gpt-4o",
    input_tokens=1500,
    output_tokens=800,
)

What You’ll See

After connecting, your dashboard shows:
  • Agent Registry — all your agents with status, framework, model
  • Cost Attribution — spend per agent, team, and model
  • Anomaly Detection — cost spikes, behavioral drift, security alerts
  • Governance — policy enforcement, audit trail, compliance score

Next Steps