Installation

pip install meshai-sdk[pydantic-ai]

Usage

from meshai import MeshAI
from meshai.integrations.pydantic_ai import track_pydantic_ai
from pydantic_ai import Agent

client = MeshAI(api_key="msh_...", agent_name="my-pydantic-agent")
client.register(framework="pydantic-ai")

# Enable global tracking
track_pydantic_ai(client)

# Run your agents as normal
agent = Agent("openai:gpt-4o", system_prompt="You are a helpful assistant")

# Both async and sync are tracked
result = agent.run_sync("What is AI governance?")
print(result.data)
# Model and tokens tracked automatically

How It Works

track_pydantic_ai patches Agent.run and Agent.run_sync to intercept LLM calls. After each agent run, it:
  1. Extracts the model name from the agent’s model configuration
  2. Extracts token counts from RunResult.usage() (RunUsage object)
  3. Infers the provider from the model string prefix
  4. Sends the usage event to MeshAI (buffered, non-blocking)
This works with all models Pydantic AI supports: OpenAI, Anthropic, Gemini, Groq, Mistral, and others.
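The four steps above can be sketched roughly as follows. This is an illustrative sketch only, not the SDK's actual internals: DummyAgent, DummyResult, infer_provider, and the events list are hypothetical stand-ins for Pydantic AI's Agent, its run result, the provider-inference helper, and MeshAI's buffered event queue.

```python
from dataclasses import dataclass

# Hypothetical stand-ins for illustration; not the real MeshAI or Pydantic AI types.

@dataclass
class Usage:
    request_tokens: int
    response_tokens: int

class DummyResult:
    def __init__(self, data, usage):
        self.data = data
        self._usage = usage
    def usage(self):
        return self._usage

class DummyAgent:
    def __init__(self, model):
        self.model = model  # e.g. "openai:gpt-4o"
    def run_sync(self, prompt):
        return DummyResult(f"answer to: {prompt}", Usage(12, 34))

def infer_provider(model: str) -> str:
    # Step 3: the provider comes from the model string prefix
    # ("openai:gpt-4o" -> "openai")
    return model.split(":", 1)[0] if ":" in model else "unknown"

events = []  # stands in for MeshAI's buffered, non-blocking event queue

def track(agent_cls):
    # Patch run_sync so every call records a usage event after the run
    original = agent_cls.run_sync
    def patched(self, prompt):
        result = original(self, prompt)           # run the agent as normal
        usage = result.usage()                    # Step 2: token counts
        events.append({                           # Step 4: buffer the event
            "model": self.model,                  # Step 1: model name
            "provider": infer_provider(self.model),
            "input_tokens": usage.request_tokens,
            "output_tokens": usage.response_tokens,
        })
        return result
    agent_cls.run_sync = patched

track(DummyAgent)
result = DummyAgent("openai:gpt-4o").run_sync("What is AI governance?")
print(events[0]["provider"], events[0]["input_tokens"])  # openai 12
```

The key design point is that patching preserves the original return value, so callers see the same RunResult they would without tracking; the event capture is a side effect.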

Alternative: Proxy (Zero-Code)

If your Pydantic AI agents use OpenAI or Anthropic, route through the proxy:
export OPENAI_BASE_URL=https://proxy.meshai.dev/v1/openai/k/msh_YOUR_PROXY_KEY
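With the base URL exported, no code changes are needed: the OpenAI Python SDK underneath Pydantic AI reads OPENAI_BASE_URL from the environment. A minimal sketch, where the proxy key and script name are placeholders:

```shell
# Point the OpenAI SDK at the MeshAI proxy (placeholder key)
export OPENAI_BASE_URL=https://proxy.meshai.dev/v1/openai/k/msh_YOUR_PROXY_KEY

# Run your Pydantic AI app unchanged; usage is tracked at the proxy
python my_agent_app.py
```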