Some LLM providers offer server-side tools that execute on their own infrastructure, with no client-side code required. The model decides when to invoke them.

Anthropic Server Tools

// Agent and anthropic are assumed to be imported from the Runflow SDK
const agent = new Agent({
  name: 'Research Agent',
  instructions: 'Search the web to answer questions with citations.',
  model: anthropic('claude-sonnet-4-6'),
  modelConfig: {
    serverTools: [
      { type: 'web_search_20250305', name: 'web_search' }
    ]
  }
});

const result = await agent.process({
  message: 'What are the latest AI breakthroughs this week?'
});

Code Execution

const agent = new Agent({
  name: 'Data Analyst',
  instructions: 'Analyze data using Python code.',
  model: anthropic('claude-sonnet-4-6'),
  modelConfig: {
    serverTools: [
      { type: 'code_execution_20250825', name: 'code_execution' }
    ]
  }
});
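As with the research agent above, the analyst is then invoked through `process`. A minimal sketch; the prompt text and filename below are illustrative, not part of the Runflow API:

```typescript
// The model decides when to call the server-side code_execution tool;
// the Python runs on Anthropic's infrastructure, not your machine.
const result = await agent.process({
  message: 'Load the attached CSV and compute the month-over-month growth rate.'
});

console.log(result);
```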

Both Together

modelConfig: {
  serverTools: [
    { type: 'web_search_20250305', name: 'web_search' },
    { type: 'code_execution_20250825', name: 'code_execution' }
  ]
}
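In context, this combined configuration sits on the same Agent constructor used above. A sketch assuming the same Runflow SDK imports; the agent name and instructions are illustrative:

```typescript
// One agent with both Anthropic server tools enabled. The model may
// chain them: search the web first, then analyze the results in Python.
const agent = new Agent({
  name: 'Research Analyst',
  instructions: 'Search the web for current data, then analyze it with Python.',
  model: anthropic('claude-sonnet-4-6'),
  modelConfig: {
    serverTools: [
      { type: 'web_search_20250305', name: 'web_search' },
      { type: 'code_execution_20250825', name: 'code_execution' }
    ]
  }
});
```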

xAI Server Tools

xAI Grok models support native web search, X/Twitter search, and code execution through their Responses API. However, these native tools (web_search, x_search, code_interpreter) require the Responses API format, which is not yet proxied through Runflow; support is coming soon. For now, use Anthropic server tools or custom function tools for web search.

Provider Support

| Provider  | Web Search           | Code Execution            | X Search    |
|-----------|----------------------|---------------------------|-------------|
| Anthropic | web_search_20250305  | code_execution_20250825   | -           |
| xAI       | Coming soon          | Coming soon               | Coming soon |
| OpenAI    | -                    | -                         | -           |
| Gemini    | -                    | -                         | -           |
| Groq      | -                    | -                         | -           |

Next Steps

Structured Output: Get guaranteed JSON responses

Reasoning: Enable chain-of-thought thinking