
When to use LangChain

  • You’re already using LangChain or LangGraph
  • You want pre-built agent patterns (ReAct, etc.)
  • You need LangChain’s ecosystem of integrations

Create an agent

Use the CLI to scaffold a new agent project that targets LangChain:

superserve create-agent my_agent --framework langchain

Define tools with @superserve.tool

Decorate a function with @superserve.tool to turn it into a Ray-distributed tool that LangChain agents can call directly. The decorator's arguments (num_cpus, memory) declare the resources each tool invocation needs:
import superserve
from langchain.agents import create_agent
from langchain_openai import ChatOpenAI


@superserve.tool(num_cpus=1)
def search_web(query: str) -> str:
    """Search the web for information."""
    return f"Results for: {query}"


@superserve.tool(num_cpus=2, memory="4GB")
def analyze_data(data: str) -> dict:
    """Analyze data with heavy computation."""
    return {"result": f"Analysis of {data}"}


def make_agent():
    llm = ChatOpenAI(model="gpt-4o-mini")
    return create_agent(llm, tools=[search_web, analyze_data])


superserve.serve(make_agent, name="my-agent", num_cpus=1, memory="2GB")
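To make the decorator pattern above concrete, here is a minimal pure-Python sketch of what a resource-annotating tool decorator looks like from the caller's side. This `tool` function is a hypothetical stand-in, not superserve's implementation (which dispatches calls to Ray workers): it records the resource arguments as metadata and leaves the function directly callable, which is the behavior the examples in this guide rely on.

```python
import functools


def tool(func=None, *, num_cpus=1, memory=None):
    """Illustrative stand-in for a resource-annotating tool decorator.

    Supports both bare-decorator and parameterized-decorator usage.
    Records resource requirements as metadata; does NOT distribute work.
    """
    def wrap(f):
        @functools.wraps(f)  # preserve __name__ and __doc__ for tool schemas
        def wrapper(*args, **kwargs):
            return f(*args, **kwargs)
        wrapper.resources = {"num_cpus": num_cpus, "memory": memory}
        return wrapper

    if func is not None:  # used as @tool with no arguments
        return wrap(func)
    return wrap           # used as @tool(num_cpus=2, memory="4GB")


@tool(num_cpus=2, memory="4GB")
def analyze_data(data: str) -> dict:
    """Analyze data with heavy computation."""
    return {"result": f"Analysis of {data}"}
```

Because the wrapped function stays callable and keeps its docstring, the agent framework can still derive a tool description from it, while the serving layer reads the attached resource metadata at schedule time.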

Wrap existing tools

Use superserve.tool() as a wrapper for existing LangChain tools or plain functions:
import superserve
from langchain.agents import create_agent
from langchain_openai import ChatOpenAI
from langchain_community.tools import DuckDuckGoSearchRun


# Wrap a LangChain tool for Ray execution
search = superserve.tool(DuckDuckGoSearchRun(), num_cpus=1)


# Wrap a plain function
def calculate(a: int, b: int) -> int:
    """Add two numbers together."""
    return a + b

ray_calculate = superserve.tool(calculate)


def make_agent():
    llm = ChatOpenAI(model="gpt-4o-mini")
    return create_agent(llm, tools=[search, ray_calculate])


superserve.serve(make_agent, name="calc-agent")
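The function-call form shown above follows the same pattern. As a hedged sketch (again using a hypothetical stand-in `tool`, not superserve's code): wrapping an existing callable should preserve its name and docstring, since LangChain builds the tool's schema and description from exactly those attributes.

```python
import functools


def tool(func, *, num_cpus=1, memory=None):
    """Illustrative function-call form: wrap an existing callable and
    record its resource requirements as metadata."""
    @functools.wraps(func)  # keep __name__/__doc__ so schemas stay intact
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    wrapper.resources = {"num_cpus": num_cpus, "memory": memory}
    return wrapper


def calculate(a: int, b: int) -> int:
    """Add two numbers together."""
    return a + b


ray_calculate = tool(calculate, num_cpus=1)
```

The wrapped tool behaves like the original function, so it can be passed to create_agent exactly as a plain function would be.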

Next steps