---
title: Getting Started
description: Build LLM-powered apps and agents.
type: guide
summary: Install the AI SDK and stream your first agent run.
---

# Getting Started

The AI SDK for Python is a toolkit for building large language model (LLM)
applications and agent loops. It gives you composable primitives: models,
messages, streams, tools, agents, and hooks. You wire them together with plain
async Python.

## Prerequisites

* **Python 3.12 or later.**
* **uv**, **pip**, or another Python dependency manager.

## Install

```bash title="Terminal"
uv add ai
```
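Or, with pip (assuming the package is published under the same name, `ai`, used by the uv command above):

```bash title="Terminal"
pip install ai
```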

Then import the package in Python:

```python
import ai
```

## Your first agent

This example defines one tool and lets the default agent loop run it:

```python title="hello_agent.py"
import asyncio
import ai


@ai.tool
async def contact_mothership(query: str) -> str:
    """Contact the mothership for important decisions."""
    return "Soon."


async def main() -> None:
    model = ai.get_model("anthropic/claude-sonnet-4")
    agent = ai.agent(tools=[contact_mothership])

    messages = [
        ai.system_message(
            "Use the contact_mothership tool when asked about the future."
        ),
        ai.user_message("When will the robots take over?"),
    ]

    async with agent.run(model, messages) as stream:
        async for event in stream:
            if isinstance(event, ai.events.TextDelta):
                print(event.chunk, end="", flush=True)


if __name__ == "__main__":
    asyncio.run(main())
```

Run the file:

```bash title="Terminal"
uv run hello_agent.py
```

The agent streams text to the terminal. When the model requests
`contact_mothership`, the default loop executes the tool, appends the tool
result to the message history, and continues until the model returns a final
assistant message.
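Conceptually, the default loop behaves like this minimal sketch. Plain dicts and a stubbed model stand in for the SDK's real message, model, and tool types, so treat it as an illustration of the control flow, not the actual implementation:

```python
import asyncio


# Stubbed tool, mirroring the example above.
async def contact_mothership(query: str) -> str:
    return "Soon."


# Stubbed model: requests the tool once, then produces a final answer.
async def fake_model(messages: list[dict]) -> dict:
    if not any(m["role"] == "tool" for m in messages):
        return {
            "role": "assistant",
            "tool_call": {"name": "contact_mothership", "args": {"query": "future"}},
        }
    return {"role": "assistant", "content": "Soon, per the mothership."}


async def run_loop(messages: list[dict]) -> list[dict]:
    tools = {"contact_mothership": contact_mothership}
    while True:
        reply = await fake_model(messages)
        messages.append(reply)
        call = reply.get("tool_call")
        if call is None:
            # Final assistant message with no tool request: the loop ends.
            return messages
        # Execute the requested tool and append its result to the history.
        result = await tools[call["name"]](**call["args"])
        messages.append({"role": "tool", "name": call["name"], "content": result})


history = asyncio.run(
    run_loop([{"role": "user", "content": "When will the robots take over?"}])
)
```

The real loop adds streaming, hooks, and error handling on top of this shape, but the cycle is the same: model turn, tool execution, appended result, next model turn.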

## Streaming without an agent

If you want the model response without a tool-execution loop, call `ai.stream`
directly:

```python title="stream.py"
import asyncio
import ai


async def main() -> None:
    model = ai.get_model("anthropic/claude-sonnet-4")
    messages = [
        ai.system_message("Be concise."),
        ai.user_message("Explain why the robots keep asking about batteries."),
    ]

    async with ai.stream(model, messages) as s:
        async for event in s:
            if isinstance(event, ai.events.TextDelta):
                print(event.chunk, end="", flush=True)


if __name__ == "__main__":
    asyncio.run(main())
```

Once the stream has been fully consumed, `s.message`, `s.text`, `s.tool_calls`,
`s.output`, and `s.usage` are populated.
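One way to picture this: the stream object accumulates text deltas while you iterate, so its aggregate fields are only complete after the loop finishes. A toy illustration with made-up classes, not the SDK's actual types:

```python
import asyncio


class TextDelta:
    def __init__(self, chunk: str) -> None:
        self.chunk = chunk


class ToyStream:
    """Accumulates text while it is being iterated."""

    def __init__(self, events: list) -> None:
        self._events = events
        self.text = ""  # complete only once iteration finishes

    async def __aiter__(self):
        for event in self._events:
            if isinstance(event, TextDelta):
                self.text += event.chunk
            yield event


async def consume() -> str:
    s = ToyStream([TextDelta("Batteries "), TextDelta("keep us going.")])
    async for _ in s:
        pass  # a real consumer would print event.chunk here
    return s.text


final_text = asyncio.run(consume())
```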

## What's next

* **Core concepts**: Learn the primitives that shape the SDK.
* **Streaming**: Stream model responses and inspect events.
* **Agents**: Add tools, customize the loop, and handle approvals.
* **Samples**: Focused, single-file examples live in
  [`examples/samples/`](https://github.com/vercel-labs/ai-python/tree/main/examples/samples).
* **End-to-end demos**: `fastapi-vite` (web chat with human-in-the-loop
  approval), `multiagent-textual` (parallel agents with a terminal UI),
  `temporal-direct` (durable agent runs).


---

For a semantic overview of all documentation, see [/sitemap.md](/sitemap.md)

For an index of all available documentation, see [/llms.txt](/llms.txt)