
heartbeat()

Send a single heartbeat to confirm the agent is alive. Called automatically by start_heartbeat(), but you can also call it manually.
client.heartbeat()
The heartbeat updates the agent’s last_heartbeat timestamp and keeps its status as healthy. If no heartbeat is received for 2 minutes, the agent status changes to degraded. After 5 minutes, it changes to down.
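The thresholds above can be sketched as a small pure function. This is an illustrative helper, not part of the SDK; the name status_from_last_heartbeat is hypothetical, but the 2-minute and 5-minute cutoffs match the behavior described here.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical helper mirroring the documented thresholds:
# healthy under 2 minutes, degraded from 2 to 5 minutes, down after 5.
def status_from_last_heartbeat(last_heartbeat: datetime, now: datetime) -> str:
    age = now - last_heartbeat
    if age < timedelta(minutes=2):
        return "healthy"
    if age < timedelta(minutes=5):
        return "degraded"
    return "down"

now = datetime.now(timezone.utc)
status_from_last_heartbeat(now - timedelta(seconds=90), now)   # healthy
status_from_last_heartbeat(now - timedelta(minutes=3), now)    # degraded
status_from_last_heartbeat(now - timedelta(minutes=10), now)   # down
```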

start_heartbeat()

Start a background thread that sends heartbeats at a regular interval.
client.start_heartbeat()
The interval is configured via the heartbeat_interval constructor parameter (default: 30 seconds). The background thread is a daemon thread — it stops automatically when your process exits.
# Custom interval
client = MeshAI(api_key="msh_...", agent_name="my-agent", heartbeat_interval=60)
client.register(framework="custom", model_provider="openai")
client.start_heartbeat()  # Sends every 60 seconds
To stop the heartbeat:
client.stop_heartbeat()

track_usage()

Report a single LLM usage event — tokens consumed, model used, and optional cost.
client.track_usage(
    model_provider="openai",
    model_name="gpt-4o",
    input_tokens=1500,
    output_tokens=800,
    request_type="chat.completions",
    cost_usd=0.035,
)
- model_provider (string, required): LLM provider. One of openai, anthropic, google, nvidia, bedrock.
- model_name (string, required): Model identifier (e.g., gpt-4o, claude-sonnet-4-20250514).
- input_tokens (integer, required): Number of input/prompt tokens.
- output_tokens (integer, required): Number of output/completion tokens.
- request_type (string, optional): Type of request (e.g., chat.completions, embeddings).
- cost_usd (float, optional): Explicit cost in USD. If omitted, MeshAI calculates the cost from its model pricing tables.
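A common pattern is to assemble the track_usage() keyword arguments in one place. The sketch below is an assumption-laden example: the usage_event helper is hypothetical, and the per-million-token prices are illustrative placeholders, not MeshAI's actual pricing tables.

```python
# Illustrative per-1M-token prices (placeholders, NOT MeshAI's pricing tables).
PRICES = {"gpt-4o": {"input": 2.50, "output": 10.00}}

def usage_event(model_name, input_tokens, output_tokens, provider="openai"):
    """Build the kwargs for client.track_usage() from raw token counts."""
    event = {
        "model_provider": provider,
        "model_name": model_name,
        "input_tokens": input_tokens,
        "output_tokens": output_tokens,
        "request_type": "chat.completions",
    }
    prices = PRICES.get(model_name)
    if prices is not None:
        # Pass an explicit cost when we have a local price; otherwise omit
        # cost_usd and let MeshAI compute it from its pricing tables.
        event["cost_usd"] = (
            input_tokens * prices["input"] + output_tokens * prices["output"]
        ) / 1_000_000
    return event

# client.track_usage(**usage_event("gpt-4o", 1500, 800))
```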

Auto-Tracking Wrappers

Instead of calling track_usage() manually, use framework wrappers that capture usage automatically:
# OpenAI
import openai
from meshai.integrations.openai import wrap_openai
oai = wrap_openai(openai.OpenAI(), meshai=client)

# Anthropic
import anthropic
from meshai.integrations.anthropic import wrap_anthropic
ant = wrap_anthropic(anthropic.Anthropic(), meshai=client)

# CrewAI
from crewai import Crew
from meshai.integrations.crewai import MeshAICrewCallback
crew = Crew(callbacks=[MeshAICrewCallback(meshai=client)])

# LangChain
from meshai.integrations.langchain import MeshAILangChainCallback
chain.invoke(input, config={"callbacks": [MeshAILangChainCallback(meshai=client)]})

# AutoGen
from autogen import AssistantAgent
from meshai.integrations.autogen import MeshAIAutoGenCallback
agent = AssistantAgent(callbacks=[MeshAIAutoGenCallback(meshai=client)])
See the Framework Guides for complete examples.