## Documentation Index

Fetch the complete documentation index at: https://docs.meshai.dev/llms.txt

Use this file to discover all available pages before exploring further.
## Installation

```bash
pip install "meshai-sdk[bedrock]"
```

(The quotes keep shells like zsh from treating the `[bedrock]` extra as a glob pattern.)
## Usage

```python
import boto3

from meshai import MeshAI
from meshai.integrations.bedrock import wrap_bedrock

client = MeshAI(api_key="msh_...", agent_name="my-agent")
client.register(framework="custom", model_provider="aws")

# Wrap the Bedrock runtime client
bedrock = wrap_bedrock(
    boto3.client("bedrock-runtime"),
    meshai=client,
)

# converse() calls are tracked automatically
response = bedrock.converse(
    modelId="anthropic.claude-sonnet-4-20250514-v1:0",
    messages=[{"role": "user", "content": [{"text": "Hello"}]}],
)

# invoke_model() is also tracked
response = bedrock.invoke_model(
    modelId="amazon.titan-text-express-v1",
    body='{"inputText": "Hello"}',
)
```
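The responses come back in Bedrock's native shapes, which differ between the two APIs. As a minimal sketch (response shapes taken from the public Bedrock Converse and Titan Text APIs, not anything MeshAI-specific; the helper names are illustrative):

```python
import io
import json


def parse_converse(response):
    """Pull the reply text and token counts out of a converse() response."""
    text = response["output"]["message"]["content"][0]["text"]
    usage = response["usage"]  # {"inputTokens": ..., "outputTokens": ...}
    return text, usage["inputTokens"], usage["outputTokens"]


def parse_titan_invoke(response):
    """Pull the reply text out of an invoke_model() response for a Titan text model.

    invoke_model() returns the payload as a streaming body, so it must be
    read and JSON-decoded before use.
    """
    payload = json.loads(response["body"].read())
    return payload["results"][0]["outputText"]


# Illustrative response shapes (stand-ins for real API responses):
converse_resp = {
    "output": {"message": {"content": [{"text": "Hi there"}]}},
    "usage": {"inputTokens": 4, "outputTokens": 3},
}
invoke_resp = {"body": io.BytesIO(b'{"results": [{"outputText": "Hi"}]}')}

print(parse_converse(converse_resp))    # ('Hi there', 4, 3)
print(parse_titan_invoke(invoke_resp))  # Hi
```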
## How It Works

`wrap_bedrock` patches both `converse()` and `invoke_model()` on the Bedrock runtime client. After each call, it:

- Extracts the model name from the `modelId` parameter
- Extracts token counts from the response usage metadata
- Infers the provider from the model ID prefix (e.g., `anthropic.`, `amazon.`, `meta.`)
- Sends the usage event to MeshAI (buffered, non-blocking)
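The steps above can be sketched roughly like this. This is a simplified illustration of the wrapping technique, not the SDK's actual code: `wrap_converse`, `infer_provider`, and the event field names are all assumptions, and the real SDK buffers and sends events asynchronously rather than calling a callback inline:

```python
def infer_provider(model_id: str) -> str:
    """The provider is the first dotted segment of the model ID,
    e.g. 'anthropic.claude-...' -> 'anthropic'."""
    return model_id.split(".", 1)[0]


def wrap_converse(original_converse, record_event):
    """Return a converse() replacement that records a usage event after each call."""
    def converse(**kwargs):
        response = original_converse(**kwargs)
        usage = response.get("usage", {})
        record_event({
            "model": kwargs.get("modelId"),
            "provider": infer_provider(kwargs.get("modelId", "")),
            "input_tokens": usage.get("inputTokens", 0),
            "output_tokens": usage.get("outputTokens", 0),
        })
        return response
    return converse


# Demo with a stubbed converse() (no AWS call is made):
events = []
stub = lambda **kwargs: {"usage": {"inputTokens": 5, "outputTokens": 7}}
tracked = wrap_converse(stub, events.append)
tracked(modelId="anthropic.claude-sonnet-4-20250514-v1:0",
        messages=[{"role": "user", "content": [{"text": "Hello"}]}])
print(events[0]["provider"], events[0]["output_tokens"])  # anthropic 7
```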
## Alternative: Proxy (Zero-Code)

For Bedrock, the proxy approach requires configuring a custom endpoint in your boto3 client:

```python
import boto3

bedrock = boto3.client(
    "bedrock-runtime",
    endpoint_url="https://proxy.meshai.dev/v1/bedrock/k/msh_YOUR_PROXY_KEY",
)
```
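With the endpoint override in place, application code stays exactly as it was. Assuming the proxy forwards requests to Bedrock transparently and records usage on the way through, a call looks just like the direct version:

```python
import boto3

bedrock = boto3.client(
    "bedrock-runtime",
    region_name="us-east-1",  # assumption: whichever region your account uses
    endpoint_url="https://proxy.meshai.dev/v1/bedrock/k/msh_YOUR_PROXY_KEY",
)

# Same call as with the wrapped client; no SDK wrapping needed.
response = bedrock.converse(
    modelId="amazon.titan-text-express-v1",
    messages=[{"role": "user", "content": [{"text": "Hello"}]}],
)
```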