---
title: Tools
description: Define and use tools in model and agent workflows.
type: guide
summary: Define function tools, design schemas, handle tool errors, use schema-only tools, provider tools, and MCP tools.
---

# Tools

Tools can be schema-only declarations for direct streaming, executable Python
functions for agents, or provider-executed tools that run outside your process.

## Define function tools

Decorate an async function with `@ai.tool`:

```python
import ai


@ai.tool
async def contact_mothership(query: str) -> str:
    """Contact the mothership for important decisions."""
    return "Soon."
```

The tool name comes from the function name. The model receives the function
parameters as a JSON schema and the docstring as the tool description.
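Conceptually, the decorator only needs the function object to derive both. A minimal standard-library sketch of the same extraction (this is an illustration, not the SDK's actual implementation):

```python
import inspect


async def contact_mothership(query: str) -> str:
    """Contact the mothership for important decisions."""
    return "Soon."


# The tool name and description can be read straight off the function:
tool_name = contact_mothership.__name__
tool_description = inspect.getdoc(contact_mothership)
print(tool_name, "-", tool_description)
```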

## Design tool schemas

The function signature becomes the tool schema. The docstring becomes the tool
description the model sees.

```python
@ai.tool
async def scan_sector(sector: str, depth: int = 1) -> str:
    """Scan a mothership sector at the requested depth."""
    return f"{sector}: clear at depth {depth}"
```

The model receives `sector` as a required string and `depth` as an optional
integer with a default.
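The resulting JSON schema would look roughly like the dict below (a sketch; the exact shape depends on the SDK's schema generation):

```python
# Approximate JSON schema generated from scan_sector's signature.
# `sector` has no default, so it is required; `depth` defaults to 1.
scan_sector_schema = {
    "type": "object",
    "properties": {
        "sector": {"type": "string"},
        "depth": {"type": "integer", "default": 1},
    },
    "required": ["sector"],
}
```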

## Validate arguments

Tool arguments are validated against the generated Pydantic model before your
function runs:

```python
@ai.tool
async def set_alert_level(level: int) -> str:
    """Set the mothership alert level."""
    return f"Alert level set to {level}"
```

If the model sends `{"level": "high"}`, validation fails and the agent returns
an error tool result instead of calling the function.
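You can reproduce the validation step in isolation with Pydantic. The model class here is a hypothetical stand-in for what the SDK generates from the signature:

```python
from pydantic import BaseModel, ValidationError


# Stand-in for the model the SDK would generate from set_alert_level.
class SetAlertLevelArgs(BaseModel):
    level: int


try:
    SetAlertLevelArgs.model_validate({"level": "high"})
except ValidationError as exc:
    # The agent surfaces an error message like this to the model
    # instead of calling the function.
    error_type = exc.errors()[0]["type"]
    print(error_type)
```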

## Handle tool errors

Tool exceptions become `ToolCallResult` events with `is_error=True`. The model
sees the error text on the next turn:

```python
async with agent.run(model, messages) as stream:
    async for event in stream:
        if isinstance(event, ai.events.ToolCallResult):
            for result in event.results:
                if result.is_error:
                    print(f"{result.tool_name} failed: {result.result}")
```

The original exception is available on `event.exception` for logging.

## Use schema-only tools

Pass `ai.Tool` objects directly to `ai.stream` when you want the model to emit
tool calls but you do not want the SDK to execute them:

```python
tool = ai.Tool(
    kind="function",
    name="contact_mothership",
    args=ai.tools.FunctionToolArgs(
        description="Contact the mothership.",
        params={
            "type": "object",
            "properties": {"query": {"type": "string"}},
            "required": ["query"],
        },
    ),
)

async with ai.stream(model, messages, tools=[tool]) as stream:
    async for event in stream:
        if isinstance(event, ai.events.ToolEnd):
            print(event.tool_call.tool_args)
```

## Use provider-executed tools

Provider-executed tools run on the provider side. Pass them to `ai.stream` in
the `tools` list:

```python
messages = [
    ai.user_message("Check the latest mothership telemetry reports."),
]

async with ai.stream(
    model,
    messages,
    tools=[ai.providers.anthropic.tools.web_search(max_uses=3)],
) as stream:
    async for event in stream:
        if isinstance(event, ai.events.TextDelta):
            print(event.chunk, end="", flush=True)
```

When you route through AI Gateway, you can mix provider-specific tool factories
with AI Gateway tool factories:

```python
tools = [
    ai.providers.anthropic.tools.web_search(max_uses=3),
    ai.providers.ai_gateway.tools.perplexity_search(max_results=5),
]
```

## Use MCP tools

The Model Context Protocol (MCP) adapter converts server tools into agent tools:

```python
tools = await ai.mcp.get_http_tools(
    "http://localhost:3000/mcp",
    headers={"Authorization": "Bearer your_access_token_here"},
)

agent = ai.agent(tools=tools)
```

Use `ai.mcp.get_stdio_tools` for subprocess-based MCP servers.


---

For a semantic overview of all documentation, see [/sitemap.md](/sitemap.md)

For an index of all available documentation, see [/llms.txt](/llms.txt)