## Documentation Index
Fetch the complete documentation index at: https://docs.meshai.dev/llms.txt
Use this file to discover all available pages before exploring further.
## Installation

```shell
pip install "meshai-sdk[agno]"
```
## Usage

```python
from meshai import MeshAI
from meshai.integrations.agno import track_agno

from agno.agent import Agent
from agno.models.openai import OpenAIChat

client = MeshAI(api_key="msh_...", agent_name="my-agno-agent")
client.register(framework="agno")

# Enable global tracking
track_agno(client)

# Run your agents as normal
agent = Agent(
    model=OpenAIChat(id="gpt-4o"),
    instructions=["You are a helpful assistant"],
)
response = agent.run("What is AI governance?")
# Model and tokens are tracked automatically
```
## How It Works

`track_agno` patches `Agent.run()` to intercept LLM calls. After each agent run, it:

- Extracts the model name from the agent's model configuration
- Extracts token counts from the response usage metrics
- Infers the provider from the model name
- Sends the usage event to MeshAI (buffered and non-blocking)
Works with all Agno-supported model providers: OpenAI, Anthropic, Google, Groq, and others.
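The steps above can be sketched as a standard method-patching pattern. This is a minimal illustration only: the `Agent` class below is a stand-in for `agno.agent.Agent`, and `infer_provider`, the response shape, and the `events` buffer are hypothetical simplifications of what the real integration does internally.

```python
import functools

# Stand-in for agno's Agent class (illustrative; the real class has a
# much richer interface and returns a structured response object).
class Agent:
    def __init__(self, model_id):
        self.model_id = model_id

    def run(self, prompt):
        # Fake response carrying usage metrics, mimicking the fields
        # the tracker reads after a real agent run.
        return {
            "content": f"answer to: {prompt}",
            "usage": {"input_tokens": 12, "output_tokens": 34},
        }

def infer_provider(model_name):
    # Hypothetical provider inference from the model name.
    if model_name.startswith("gpt-"):
        return "openai"
    if model_name.startswith("claude-"):
        return "anthropic"
    return "unknown"

events = []  # stand-in for MeshAI's buffered, non-blocking event queue

def track(agent_cls):
    """Patch agent_cls.run so every call also records a usage event."""
    original_run = agent_cls.run

    @functools.wraps(original_run)
    def patched_run(self, prompt, *args, **kwargs):
        response = original_run(self, prompt, *args, **kwargs)
        # After the run: extract model, provider, and token counts,
        # then buffer the event instead of sending it synchronously.
        events.append({
            "model": self.model_id,
            "provider": infer_provider(self.model_id),
            "tokens": response["usage"],
        })
        return response

    agent_cls.run = patched_run

track(Agent)
agent = Agent("gpt-4o")
agent.run("What is AI governance?")
print(events[0]["provider"])  # "openai"
```

Because the patch wraps rather than replaces the original `run`, the agent's behavior and return value are unchanged; tracking is purely a side effect.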
## Phidata Compatibility

If you are migrating from Phidata, the same integration works without changes: Agno is Phidata's successor and shares the same agent interface.
## Alternative: Proxy (Zero-Code)

If your Agno agents use OpenAI or Anthropic, route their traffic through the MeshAI proxy by overriding the base URL:

```shell
export OPENAI_BASE_URL=https://proxy.meshai.dev/v1/openai/k/msh_YOUR_PROXY_KEY
```
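This works because the official OpenAI Python SDK (v1.x) reads `OPENAI_BASE_URL` from the environment when a client is constructed, so no application code changes are required. A minimal sketch, setting the variable from Python instead of the shell (the proxy key placeholder is copied from the example above and must be replaced with your own):

```python
import os

# Point the OpenAI SDK at the MeshAI proxy. Equivalent to the shell
# `export` above; substitute your real proxy key for the placeholder.
os.environ["OPENAI_BASE_URL"] = (
    "https://proxy.meshai.dev/v1/openai/k/msh_YOUR_PROXY_KEY"
)

# From here on, a client created without an explicit base_url picks up
# the proxy automatically:
#   from openai import OpenAI
#   client = OpenAI()  # requests now route through the MeshAI proxy
print(os.environ["OPENAI_BASE_URL"])
```

Set the variable before any client is created; clients constructed earlier keep the base URL they were built with.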