
Installation

pip install "meshai-sdk[langchain]"

Usage with LangChain

from meshai import MeshAI
from meshai.integrations.langchain import MeshAICallbackHandler
from langchain_openai import ChatOpenAI

client = MeshAI(api_key="msh_...", agent_name="my-chain")
client.register(framework="langchain")

handler = MeshAICallbackHandler(client)

# Pass to any LangChain model
llm = ChatOpenAI(model="gpt-4o", callbacks=[handler])
response = llm.invoke("What is MeshAI?")
# Model and tokens tracked automatically

Usage with LangGraph

from langgraph.graph import StateGraph
from meshai.integrations.langchain import MeshAICallbackHandler

handler = MeshAICallbackHandler(client)  # client from the LangChain example above

# Pass callbacks in the config dict at execution time.
# `graph` is your compiled StateGraph; stream() yields state updates.
config = {"callbacks": [handler]}
for chunk in graph.stream(input_data, config=config):
    print(chunk)

Usage with Chains

from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_messages([...])
chain = prompt | llm  # llm already has the handler from callbacks=[handler]
result = chain.invoke({"topic": "AI governance"})

How It Works

MeshAICallbackHandler implements LangChain’s BaseCallbackHandler interface. Its on_llm_end hook:
  1. Extracts the model name from LLMResult.llm_output
  2. Extracts token usage from the response metadata
  3. Infers the provider from the model name
  4. Sends to MeshAI (buffered, non-blocking)
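Steps 1–3 can be sketched in plain Python. The dict shapes below (model_name, token_usage) mirror what LangChain typically places in LLMResult.llm_output for OpenAI-style models; the SDK's actual field names and prefix table are internal, so treat both helpers as illustrative assumptions:

```python
def infer_provider(model_name: str) -> str:
    # Guess the provider from common model-name prefixes (assumed mapping).
    name = model_name.lower()
    if name.startswith("gpt") or name.startswith("o1"):
        return "openai"
    if name.startswith("claude"):
        return "anthropic"
    if name.startswith("gemini"):
        return "google"
    return "unknown"


def extract_usage(llm_output: dict) -> dict:
    # Pull the model name and token counts out of an llm_output payload,
    # falling back to safe defaults when fields are missing.
    model = llm_output.get("model_name", "unknown")
    usage = llm_output.get("token_usage", {})
    return {
        "model": model,
        "provider": infer_provider(model),
        "prompt_tokens": usage.get("prompt_tokens", 0),
        "completion_tokens": usage.get("completion_tokens", 0),
    }


record = extract_usage({
    "model_name": "gpt-4o",
    "token_usage": {"prompt_tokens": 12, "completion_tokens": 34},
})
# record["provider"] == "openai"
```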
Works with any LangChain-compatible chat model: ChatOpenAI, ChatAnthropic, ChatGoogleGenerativeAI, etc.
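"Buffered, non-blocking" (step 4) generally means events are queued in memory and flushed by a background thread, so the LLM call path never waits on network I/O. A minimal stdlib-only sketch of that pattern, with a stand-in `send_batch` callable in place of the SDK's internal transport:

```python
import queue
import threading


class BufferedSender:
    """Queue events and flush them in batches on a daemon thread."""

    def __init__(self, send_batch, batch_size=10, flush_interval=1.0):
        self._q = queue.Queue()
        self._send_batch = send_batch      # stand-in for the real transport
        self._batch_size = batch_size
        self._flush_interval = flush_interval
        worker = threading.Thread(target=self._run, daemon=True)
        worker.start()

    def track(self, event):
        # Never blocks the caller: just enqueue and return.
        self._q.put(event)

    def _run(self):
        batch = []
        while True:
            try:
                batch.append(self._q.get(timeout=self._flush_interval))
            except queue.Empty:
                pass  # timed out waiting; fall through to flush check
            # Flush when the batch is full, or when the queue drains.
            if batch and (len(batch) >= self._batch_size or self._q.empty()):
                self._send_batch(batch)
                batch = []
```

Because the worker is a daemon thread, a crashing or exiting process may drop unsent events; real SDKs typically add an explicit flush-on-shutdown for that reason.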