Some LLM providers offer server-side tools that execute on their infrastructure, so no client-side code is needed. The model decides when to use them.
## Anthropic Server Tools

### Web Search

### Code Execution

### Both Together
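As a sketch of how both Anthropic server tools can be attached to a single request: the tool `type` strings come from the Provider Support table below, while the model name, token limit, and helper function here are purely illustrative (the real request also goes through your configured Runflow provider, not shown).

```python
# Sketch: a Messages-API-shaped request body that enables Anthropic's
# web search and code execution server tools together. The provider
# runs these tools itself; no local handler code is needed.
import json

def build_request(prompt: str) -> dict:
    # Model name and max_tokens are illustrative placeholders.
    return {
        "model": "claude-sonnet-4-5",
        "max_tokens": 1024,
        "tools": [
            # Server tools are declared like client tools, but execute
            # on Anthropic's infrastructure.
            {"type": "web_search_20250305", "name": "web_search"},
            {"type": "code_execution_20250825", "name": "code_execution"},
        ],
        "messages": [{"role": "user", "content": prompt}],
    }

body = build_request(
    "Find today's EUR/USD rate and compute a 2% fee on 150 EUR."
)
print(json.dumps(body, indent=2))
```

The model can then chain the two: search for current data, then run code against it, all within one response.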
## xAI Server Tools
xAI Grok models support native web search, X/Twitter search, and code execution. These work via the Responses API.

xAI native tools (`web_search`, `x_search`, `code_interpreter`) require the Responses API format, which is not yet proxied through Runflow. Coming soon. For now, use Anthropic server tools or custom function tools for web search.
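The custom-function-tool workaround mentioned above can be sketched as a standard client-side tool: the model requests a call, and your code fulfills it. The tool name `search_web`, its schema, and the stubbed handler below are all hypothetical; in practice the handler would call a real search API.

```python
# Sketch of a client-side function tool standing in for native web
# search. The schema uses the common JSON Schema tool format; the name
# "search_web" and its parameters are hypothetical.
web_search_tool = {
    "name": "search_web",
    "description": "Search the web and return top result snippets.",
    "input_schema": {
        "type": "object",
        "properties": {
            "query": {"type": "string", "description": "Search query"},
            "max_results": {"type": "integer", "default": 5},
        },
        "required": ["query"],
    },
}

def handle_tool_call(name: str, args: dict) -> str:
    # Your code executes the call (e.g. against a search API of your
    # choice) and returns the result text to the model as a tool result.
    if name == "search_web":
        return f"(results for {args['query']!r} would go here)"
    raise ValueError(f"unknown tool: {name}")

print(handle_tool_call("search_web", {"query": "grok responses api"}))
```

Unlike server tools, this runs on your infrastructure, so it works with any provider Runflow proxies.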
## Provider Support
| Provider | Web Search | Code Execution | X Search |
|---|---|---|---|
| Anthropic | `web_search_20250305` | `code_execution_20250825` | - |
| xAI | Coming soon | Coming soon | Coming soon |
| OpenAI | - | - | - |
| Gemini | - | - | - |
| Groq | - | - | - |
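The table can be mirrored in code as a simple capability map, for example to fail fast before attaching an unsupported tool. The dictionary keys and the helper function are illustrative, not part of the Runflow API:

```python
# Illustrative capability map mirroring the Provider Support table.
# Only Anthropic currently exposes server tools through Runflow.
SERVER_TOOLS = {
    "anthropic": {
        "web_search": "web_search_20250305",
        "code_execution": "code_execution_20250825",
    },
    # xAI entries will be filled in once the Responses API is proxied.
    "xai": {},
    "openai": {},
    "gemini": {},
    "groq": {},
}

def server_tool_type(provider: str, capability: str) -> str:
    """Return the tool type string, or raise if unsupported."""
    tools = SERVER_TOOLS.get(provider, {})
    if capability not in tools:
        raise ValueError(f"{provider} has no server-side {capability}")
    return tools[capability]

print(server_tool_type("anthropic", "web_search"))  # web_search_20250305
```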
## Testing in Prompt Studio
Test server tools in the Portal:

- Open Prompts with an Anthropic provider selected
- Click the config icon to reveal the Search and Code buttons
- Click Search to enable web search
- Ask a question that requires current information; the model will search the web automatically
Server tools are only available for Anthropic providers in the Prompt Studio. Web search adds latency (5-15 seconds) as the model performs real web queries.
## Next Steps

- **Structured Output**: get guaranteed JSON responses
- **Reasoning**: enable chain-of-thought thinking