AgentBasis integrates with LangChain via callbacks, providing full visibility into your chains’ execution steps.

Basic Usage

Create a callback handler and pass it to your LangChain objects (LLMs, chains, or agents) via the callbacks field of the run config.
from agentbasis.frameworks.langchain import get_callback_handler
from langchain_openai import ChatOpenAI

# 1. Create a callback handler
handler = get_callback_handler()

# 2. Pass it to your LangChain model or chain
llm = ChatOpenAI(model="gpt-4")

# 3. Execute with callbacks
response = llm.invoke(
    "Hello world", 
    config={"callbacks": [handler]}
)
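Under the hood, a LangChain callback handler receives lifecycle events (on_llm_start, on_llm_end, and so on) as the run proceeds, each tagged with a run ID. The mechanics can be sketched with plain-Python stand-ins; the classes below are toys for illustration, not the real agentbasis or langchain APIs, whose internals may differ:

```python
import uuid

class RecordingHandler:
    """Toy stand-in for a callback handler: records each event it receives."""
    def __init__(self):
        self.events = []

    def on_llm_start(self, prompt, *, run_id):
        self.events.append(("llm_start", prompt, run_id))

    def on_llm_end(self, output, *, run_id):
        self.events.append(("llm_end", output, run_id))

class ToyLLM:
    """Fake model that fires callbacks around a canned response."""
    def invoke(self, prompt, config=None):
        handlers = (config or {}).get("callbacks", [])
        run_id = uuid.uuid4()          # one run_id ties start/end together
        for h in handlers:
            h.on_llm_start(prompt, run_id=run_id)
        output = f"echo: {prompt}"     # a real model call would happen here
        for h in handlers:
            h.on_llm_end(output, run_id=run_id)
        return output

handler = RecordingHandler()
ToyLLM().invoke("Hello world", config={"callbacks": [handler]})
```

After the call, handler.events holds one start/end pair sharing a run_id, which is the raw material a tracing backend assembles into a trace.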

Chains & Agents

For complex chains, using the handler ensures you see the entire trace tree (parent chain -> child steps -> LLM call).
from langchain_core.prompts import PromptTemplate
from agentbasis.frameworks.langchain import get_callback_config

# Using get_callback_config() is a convenient shortcut
config = get_callback_config()

# LCEL composition (LLMChain is deprecated in recent LangChain releases)
chain = PromptTemplate.from_template("Answer this: {query}") | llm

# The trace will show the Chain execution wrapping the LLM call
result = chain.invoke(
    {"query": "What is the capital of France?"}, 
    config=config
)
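The trace tree itself is reconstructed from parent IDs: the chain emits its own start/end events, and the nested LLM call carries the chain's run ID as its parent_run_id, which is how the parent -> child edges are rebuilt. A minimal sketch of that nesting, again using toy classes rather than the real APIs:

```python
import uuid

class TreeHandler:
    """Records (event, name, run_id, parent_run_id) tuples."""
    def __init__(self):
        self.events = []

    def on_start(self, name, run_id, parent_run_id=None):
        self.events.append(("start", name, run_id, parent_run_id))

    def on_end(self, name, run_id):
        self.events.append(("end", name, run_id, None))

def run_chain(query, handler):
    """Toy chain: opens its own span, then a child LLM span inside it."""
    chain_id = uuid.uuid4()
    handler.on_start("chain", chain_id)

    llm_id = uuid.uuid4()
    handler.on_start("llm", llm_id, parent_run_id=chain_id)  # child of the chain
    answer = f"answer to: {query}"
    handler.on_end("llm", llm_id)

    handler.on_end("chain", chain_id)
    return answer

handler = TreeHandler()
run_chain("What is the capital of France?", handler)
# event order: chain start -> llm start (parent=chain) -> llm end -> chain end
```

Grouping events by run_id and following parent_run_id upward yields exactly the tree described above: the chain execution wrapping the LLM call.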