FunctionCallingAgent

FunctionCallingAgent uses the LLM's native function-calling interface — OpenAI tool_calls or Anthropic tool_use — which yields more reliable tool selection than ReAct-style prompting, especially when multiple tools are available.

Requirements

  • OpenAILLM, AnthropicLLM, GeminiLLM, or MistralLLM (all support call_with_tools())
  • For other providers, use ReActAgent instead

Usage

from synapsekit import FunctionCallingAgent, CalculatorTool, FileReadTool
from synapsekit.llm.openai import OpenAILLM
from synapsekit.llm.base import LLMConfig

llm = OpenAILLM(LLMConfig(model="gpt-4o-mini", api_key="sk-..."))

agent = FunctionCallingAgent(
    llm=llm,
    tools=[CalculatorTool(), FileReadTool()],
    max_iterations=10,
    system_prompt="You are a helpful data analyst.",
)

# run() is a coroutine — call it from an async context (e.g. via asyncio.run)
answer = await agent.run("Read ./data.csv and tell me the row count.")

How it works

  1. Tool schemas are sent to the LLM as JSON alongside the conversation
  2. The LLM responds with tool_calls (or plain text if no tool is needed)
  3. Each requested tool is executed and its result is appended as a role: tool message
  4. Steps 1–3 repeat until the LLM returns text with no tool calls
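The loop above can be sketched in plain Python. Everything here is a hypothetical stand-in — the ToyLLM, the tool registry, and the message shapes are modeled on the OpenAI wire format to make the four steps visible, not on synapsekit's actual internals:

```python
import json

# Hypothetical tool registry: name -> callable (stands in for CalculatorTool etc.)
TOOLS = {"calculator": lambda expression: str(eval(expression))}

# OpenAI-style schema the agent sends alongside the conversation (step 1).
TOOL_SCHEMAS = [{
    "type": "function",
    "function": {
        "name": "calculator",
        "description": "Evaluate an arithmetic expression.",
        "parameters": {
            "type": "object",
            "properties": {"expression": {"type": "string"}},
            "required": ["expression"],
        },
    },
}]

class ToyLLM:
    """Scripted stand-in for call_with_tools(): first asks for a tool, then answers."""
    def __init__(self):
        self.turn = 0

    def call_with_tools(self, messages, tools):
        self.turn += 1
        if self.turn == 1:  # step 2: respond with a tool call
            return {"tool_calls": [{"id": "call_1", "name": "calculator",
                                    "arguments": json.dumps({"expression": "144 / 12"})}]}
        return {"content": "144 / 12 is 12."}  # step 4: plain text ends the loop

def run_agent(llm, question, max_iterations=10):
    messages = [{"role": "user", "content": question}]
    for _ in range(max_iterations):
        reply = llm.call_with_tools(messages, TOOL_SCHEMAS)
        if not reply.get("tool_calls"):          # no tool calls -> final answer
            return reply["content"]
        for call in reply["tool_calls"]:         # step 3: execute each tool
            result = TOOLS[call["name"]](**json.loads(call["arguments"]))
            messages.append({"role": "tool", "tool_call_id": call["id"],
                             "content": result})
    raise RuntimeError("max_iterations exceeded")

print(run_agent(ToyLLM(), "What is 144 / 12?"))  # -> 144 / 12 is 12.
```

The real agent runs this loop asynchronously and the model chooses tools freely; the scripted ToyLLM simply walks the loop once through each step.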

Anthropic example

from synapsekit.llm.anthropic import AnthropicLLM
from synapsekit.llm.base import LLMConfig

llm = AnthropicLLM(LLMConfig(
    model="claude-sonnet-4-6",
    api_key="sk-ant-...",
))

agent = FunctionCallingAgent(llm=llm, tools=[CalculatorTool()])
answer = await agent.run("What is 144 / 12?")

Gemini example

from synapsekit.llm.gemini import GeminiLLM
from synapsekit.llm.base import LLMConfig

llm = GeminiLLM(LLMConfig(
    model="gemini-1.5-pro",
    api_key="your-google-api-key",
    provider="gemini",
))

agent = FunctionCallingAgent(llm=llm, tools=[CalculatorTool()])
answer = await agent.run("What is 2 to the power of 10?")

Mistral example

from synapsekit.llm.mistral import MistralLLM
from synapsekit.llm.base import LLMConfig

llm = MistralLLM(LLMConfig(
    model="mistral-large-latest",
    api_key="your-mistral-key",
    provider="mistral",
))

agent = FunctionCallingAgent(llm=llm, tools=[CalculatorTool()])
answer = await agent.run("What is the square root of 256?")

Parameters

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| llm | BaseLLM | required | Must implement call_with_tools() |
| tools | list[BaseTool] | required | Available tools |
| max_iterations | int | 10 | Max tool-call rounds |
| memory | AgentMemory \| None | auto | Custom memory instance |
| system_prompt | str | "You are a helpful AI assistant." | System instruction |