Installation

pip install meshai-sdk[agno]

Usage

from meshai import MeshAI
from meshai.integrations.agno import track_agno
from agno.agent import Agent
from agno.models.openai import OpenAIChat

client = MeshAI(api_key="msh_...", agent_name="my-agno-agent")
client.register(framework="agno")

# Enable global tracking
track_agno(client)

# Run your agents as normal
agent = Agent(
    model=OpenAIChat(id="gpt-4o"),
    instructions=["You are a helpful assistant"],
)
response = agent.run("What is AI governance?")
# Model and tokens tracked automatically

How It Works

track_agno patches Agent.run() to intercept LLM calls. After each agent run, it:
  1. Extracts the model name from the agent’s model configuration
  2. Extracts token counts from the response usage metrics
  3. Infers the provider from the model name
  4. Sends the usage event to MeshAI (buffered, non-blocking)
Works with all Agno-supported model providers: OpenAI, Anthropic, Google, Groq, and others.
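The steps above follow a standard monkey-patching pattern. This is an illustrative sketch only, not MeshAI's actual implementation: the stub Agent, Usage, and Response classes and the infer_provider helper are hypothetical stand-ins so the example is self-contained.

```python
from dataclasses import dataclass, field

@dataclass
class Usage:
    input_tokens: int = 0
    output_tokens: int = 0

@dataclass
class Response:
    content: str
    usage: Usage = field(default_factory=Usage)

class Agent:  # hypothetical stand-in for agno.agent.Agent
    def __init__(self, model_id: str):
        self.model_id = model_id

    def run(self, prompt: str) -> Response:
        return Response(content=f"echo: {prompt}", usage=Usage(12, 34))

def infer_provider(model_id: str) -> str:
    """Step 3: guess the provider from the model name prefix."""
    prefixes = {"gpt": "openai", "o1": "openai", "claude": "anthropic",
                "gemini": "google", "llama": "groq"}
    for prefix, provider in prefixes.items():
        if model_id.lower().startswith(prefix):
            return provider
    return "unknown"

events = []  # stand-in for the SDK's buffered, non-blocking event queue

def track(agent_cls):
    """Patch Agent.run so every call records model, tokens, and provider."""
    original_run = agent_cls.run

    def wrapped_run(self, prompt, *args, **kwargs):
        response = original_run(self, prompt, *args, **kwargs)
        events.append({
            "model": self.model_id,                       # step 1
            "input_tokens": response.usage.input_tokens,  # step 2
            "output_tokens": response.usage.output_tokens,
            "provider": infer_provider(self.model_id),    # step 3
        })  # step 4: the real SDK buffers and sends this asynchronously
        return response

    agent_cls.run = wrapped_run

track(Agent)
agent = Agent(model_id="gpt-4o")
agent.run("What is AI governance?")
```

Because the patch wraps the method on the class, every agent instance is tracked globally without changing call sites, which is why agent code runs unmodified after track_agno(client).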

Phidata Compatibility

If you are migrating from Phidata, the same integration works: Agno is Phidata's successor and exposes the same agent interface.

Alternative: Proxy (Zero-Code)

If your Agno agents use OpenAI or Anthropic, you can skip the SDK integration entirely and route requests through the MeshAI proxy by overriding the API base URL:
export OPENAI_BASE_URL=https://proxy.meshai.dev/v1/openai/k/msh_YOUR_PROXY_KEY
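Assuming the standard OpenAI SDK convention of reading OPENAI_BASE_URL from the environment, a quick sanity check that requests will route through the proxy might look like this (the key value is a placeholder):

```python
import os

# Normally set via `export OPENAI_BASE_URL=...` before launching the process
os.environ["OPENAI_BASE_URL"] = "https://proxy.meshai.dev/v1/openai/k/msh_YOUR_PROXY_KEY"

# The OpenAI client falls back to the public API when the variable is unset
base = os.environ.get("OPENAI_BASE_URL", "https://api.openai.com/v1")
print("routing via proxy:", base.startswith("https://proxy.meshai.dev"))
```

No agent code changes are needed: the underlying OpenAI client picks up the base URL at startup, so existing Agno agents are proxied transparently.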