## Documentation Index
Fetch the complete documentation index at: https://docs.meshai.dev/llms.txt
Use this file to discover all available pages before exploring further.
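The index follows the llms.txt convention: a markdown file whose bullet list links out to the individual documentation pages. A minimal sketch of pulling page URLs out of such a file — the sample content below is hypothetical, not the real index, which you should fetch from the URL above:

```python
# Minimal sketch: extract page URLs from an llms.txt-style markdown index.
# The sample text is hypothetical; fetch the real file from the URL above.
sample = """# MeshAI Docs

- [Installation](https://docs.meshai.dev/installation.md): setup guide
- [Usage](https://docs.meshai.dev/usage.md): SDK walkthrough
"""

pages = [
    line.split("](", 1)[1].split(")", 1)[0]  # URL between "](" and ")"
    for line in sample.splitlines()
    if line.startswith("- [")  # llms.txt links are markdown list items
]
print(pages)
```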
## Installation

```shell
pip install "meshai-sdk[openai]"
```
## Usage

```python
from meshai import MeshAI
from meshai.integrations.openai import wrap_openai
import openai

# Initialize MeshAI
client = MeshAI(api_key="msh_...", agent_name="my-agent")
client.register(framework="custom", model_provider="openai")

# Wrap the OpenAI client
oai = wrap_openai(openai.OpenAI(), meshai=client)

# Use as normal — all calls auto-tracked
response = oai.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello"}],
)

# Model, tokens, and cost are captured automatically
# Even when switching models:
response = oai.chat.completions.create(
    model="gpt-4o-mini",  # Different model — tracked separately
    messages=[{"role": "user", "content": "Hello"}],
)
```
## What’s Tracked

For each API call, MeshAI captures:

- Model name (extracted from the response, not hardcoded)
- Input tokens (`prompt_tokens`)
- Output tokens (`completion_tokens`)
- Request type (`chat.completions`)
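As an illustration of where those fields live on an OpenAI-style chat response, here is a hedged sketch of assembling such a record; `extract_call_record` and the stub classes are hypothetical, not part of the MeshAI SDK:

```python
from dataclasses import dataclass


# Stub types mirroring the relevant shape of an OpenAI chat response.
@dataclass
class Usage:
    prompt_tokens: int
    completion_tokens: int


@dataclass
class ChatResponse:
    model: str  # the model the API actually served, not the one requested
    usage: Usage


def extract_call_record(response, request_type="chat.completions"):
    """Build a usage record like the one described above from a response."""
    return {
        "model": response.model,
        "input_tokens": response.usage.prompt_tokens,
        "output_tokens": response.usage.completion_tokens,
        "request_type": request_type,
    }


record = extract_call_record(
    ChatResponse(
        model="gpt-4o-2024-08-06",
        usage=Usage(prompt_tokens=9, completion_tokens=12),
    )
)
print(record)
```

Reading the model name off the response is what makes tracking robust to model switches: the record always reflects what the API actually served.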
## Alternative: Proxy (Zero-Code)

If you prefer not to use the SDK at all, point the OpenAI base URL at the MeshAI proxy:

```shell
export OPENAI_BASE_URL=https://proxy.meshai.dev/v1/openai/k/msh_YOUR_PROXY_KEY
```

All OpenAI calls are then routed through MeshAI automatically.
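Because the official `openai` Python SDK reads `OPENAI_BASE_URL` from the environment, application code does not change at all. A sketch of the same setup from Python (the proxy key is a placeholder):

```python
import os

# Zero-code setup: only the environment changes; application code is untouched.
# msh_YOUR_PROXY_KEY is a placeholder; substitute your real proxy key.
os.environ["OPENAI_BASE_URL"] = (
    "https://proxy.meshai.dev/v1/openai/k/msh_YOUR_PROXY_KEY"
)

# Existing code stays exactly as before; openai.OpenAI() picks up the base URL:
# client = openai.OpenAI()
# client.chat.completions.create(model="gpt-4o", messages=[...])
print(os.environ["OPENAI_BASE_URL"])
```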