# Documentation Index

Fetch the complete documentation index at: https://docs.meshai.dev/llms.txt

Use this file to discover all available pages before exploring further.
## Installation

```bash
pip install "meshai-sdk[langchain]"
```

The quotes keep shells such as zsh from interpreting the square brackets in the extras specifier.
## Usage with LangChain

```python
from meshai import MeshAI
from meshai.integrations.langchain import MeshAICallbackHandler
from langchain_openai import ChatOpenAI

client = MeshAI(api_key="msh_...", agent_name="my-chain")
client.register(framework="langchain")

handler = MeshAICallbackHandler(client)

# Pass the handler to any LangChain model
llm = ChatOpenAI(model="gpt-4o", callbacks=[handler])
response = llm.invoke("What is MeshAI?")
# Model and token usage are tracked automatically
```
## Usage with LangGraph

```python
from langgraph.graph import StateGraph
from meshai.integrations.langchain import MeshAICallbackHandler

# `client` is the registered MeshAI client from the example above;
# `graph` is a compiled StateGraph.
handler = MeshAICallbackHandler(client)

# Pass the handler as config to graph execution
config = {"callbacks": [handler]}
result = graph.stream(input_data, config=config)
```
## Usage with Chains

```python
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_messages([...])
chain = prompt | llm  # llm already carries the handler via callbacks=[handler]

result = chain.invoke({"topic": "AI governance"})
```
## How It Works

`MeshAICallbackHandler` implements LangChain's callback interface. Its `on_llm_end` method:

- Extracts the model name from `LLMResult.llm_output`
- Extracts token usage from the response metadata
- Infers the provider from the model name
- Sends the event to MeshAI (buffered, non-blocking)

It works with any LangChain-compatible model: `ChatOpenAI`, `ChatAnthropic`, `ChatGoogleGenerativeAI`, etc.
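The steps above can be sketched in plain Python. This is an illustrative sketch, not MeshAI's actual implementation: the class name, the prefix-based provider table, and the queue-plus-worker buffer are all assumptions made for the example; only the `on_llm_end` hook name and the `llm_output` fields (`model_name`, `token_usage`, as populated by OpenAI-style models) come from the description above.

```python
import queue
import threading

def infer_provider(model_name: str) -> str:
    """Heuristic provider inference from the model name (assumed mapping)."""
    prefixes = {"gpt": "openai", "o1": "openai",
                "claude": "anthropic", "gemini": "google"}
    for prefix, provider in prefixes.items():
        if model_name.lower().startswith(prefix):
            return provider
    return "unknown"

class SketchCallbackHandler:
    """Illustrative stand-in for MeshAICallbackHandler's on_llm_end logic."""

    def __init__(self):
        # Buffered, non-blocking: events go into a queue drained by a
        # background worker thread instead of blocking the LLM call.
        self.buffer = queue.Queue()
        self.sent = []  # stands in for the MeshAI backend
        threading.Thread(target=self._drain, daemon=True).start()

    def on_llm_end(self, response, **kwargs):
        # `response` mimics LLMResult: llm_output carries model name and usage.
        llm_output = response.get("llm_output") or {}
        model = llm_output.get("model_name", "unknown")
        usage = llm_output.get("token_usage", {})
        event = {
            "model": model,
            "provider": infer_provider(model),
            "prompt_tokens": usage.get("prompt_tokens", 0),
            "completion_tokens": usage.get("completion_tokens", 0),
        }
        self.buffer.put(event)  # returns immediately; caller is never blocked

    def _drain(self):
        while True:
            event = self.buffer.get()
            self.sent.append(event)  # a real handler would POST to MeshAI here
            self.buffer.task_done()

handler = SketchCallbackHandler()
handler.on_llm_end({"llm_output": {"model_name": "gpt-4o",
                                   "token_usage": {"prompt_tokens": 12,
                                                   "completion_tokens": 34}}})
handler.buffer.join()  # wait for the worker to flush the buffered event
print(handler.sent[0]["provider"])  # openai
```

The queue decouples the LLM call path from network I/O, which is why the real handler can report usage without adding latency to `llm.invoke`.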