Installation

pip install meshai-sdk[bedrock]

Usage

from meshai import MeshAI
from meshai.integrations.bedrock import wrap_bedrock
import boto3

client = MeshAI(api_key="msh_...", agent_name="my-agent")
client.register(framework="custom", model_provider="aws")

# Wrap the Bedrock client
bedrock = wrap_bedrock(
    boto3.client("bedrock-runtime"),
    meshai=client,
)

# Use converse() — auto-tracked
response = bedrock.converse(
    modelId="anthropic.claude-sonnet-4-20250514-v1:0",
    messages=[{"role": "user", "content": [{"text": "Hello"}]}],
)

# invoke_model() is also tracked
response = bedrock.invoke_model(
    modelId="amazon.titan-text-express-v1",
    contentType="application/json",
    body='{"inputText": "Hello"}',
)

How It Works

wrap_bedrock patches both converse() and invoke_model() on the Bedrock runtime client. After each call, it:
  1. Extracts the model name from the modelId parameter
  2. Extracts token counts from the response usage metadata
  3. Infers the provider from the model ID prefix (e.g., anthropic., amazon., meta.)
  4. Sends the usage event to MeshAI (buffered, non-blocking)
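Step 3 can be sketched as a prefix lookup on the model ID. The mapping and helper below are illustrative, not MeshAI's actual table; the ID shapes come from the examples above (note that inference-profile IDs carry a leading region segment, e.g. us.anthropic....):

```python
# Illustrative provider table; real Bedrock model IDs start with a
# vendor segment such as "anthropic.", "amazon.", or "meta.".
PREFIX_TO_PROVIDER = {
    "anthropic": "anthropic",
    "amazon": "amazon",
    "meta": "meta",
    "mistral": "mistral",
    "cohere": "cohere",
}

def infer_provider(model_id: str) -> str:
    """Infer the provider from a Bedrock model ID.

    Scans each dot-separated segment so region-prefixed inference
    profile IDs ("us.anthropic....") resolve the same as plain IDs.
    """
    for segment in model_id.split("."):
        if segment in PREFIX_TO_PROVIDER:
            return PREFIX_TO_PROVIDER[segment]
    return "unknown"

print(infer_provider("amazon.titan-text-express-v1"))  # amazon
```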

Alternative: Proxy (Zero-Code)

For Bedrock, the proxy approach requires configuring a custom endpoint in your boto3 client:
bedrock = boto3.client(
    "bedrock-runtime",
    endpoint_url="https://proxy.meshai.dev/v1/bedrock/k/msh_YOUR_PROXY_KEY",
)
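The proxy key rides in the endpoint path, so the only per-environment value is the key itself. A small sketch of assembling that URL, assuming a hypothetical proxy_endpoint helper (the URL shape is taken from the example above):

```python
def proxy_endpoint(proxy_key: str) -> str:
    """Build the MeshAI Bedrock proxy endpoint URL for a given proxy key."""
    return f"https://proxy.meshai.dev/v1/bedrock/k/{proxy_key}"

print(proxy_endpoint("msh_YOUR_PROXY_KEY"))
# https://proxy.meshai.dev/v1/bedrock/k/msh_YOUR_PROXY_KEY
```

Once the client points at this endpoint, converse() and invoke_model() are called exactly as in the Usage section, with no wrapping required.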