All examples are available in the pipecat-subagents repository.

Documentation Index
Fetch the complete documentation index at: https://daily-mb-ui-agent.mintlify.app/llms.txt
Use this file to discover all available pages before exploring further.
Local examples
These examples run all agents in a single process using the default AsyncQueueBus.
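The real AsyncQueueBus API isn't shown on this page; a minimal sketch of the idea it names, agents in one process exchanging messages over per-topic asyncio queues, might look like this (the class name, `publish`, and `receive` are illustrative stand-ins, not the framework's actual interface):

```python
import asyncio
from collections import defaultdict


class InProcessBus:
    """Toy stand-in for the default AsyncQueueBus: one asyncio.Queue per topic.

    Illustrates only the single-process message-passing idea the local
    examples rely on; the real bus's methods and semantics may differ.
    """

    def __init__(self):
        self._queues = defaultdict(asyncio.Queue)

    async def publish(self, topic, message):
        await self._queues[topic].put(message)

    async def receive(self, topic):
        return await self._queues[topic].get()


async def main():
    bus = InProcessBus()
    await bus.publish("greeter", {"text": "hello"})
    return await bus.receive("greeter")


print(asyncio.run(main()))
```

Because everything lives in one process, no serialization or network hop is involved; that is what the distributed examples below swap out.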
Single Agent
The simplest use case: one BaseAgent running a complete voice pipeline through the AgentRunner. No bus bridge, no multi-agent coordination.

Two LLM Agents
A greeter and a support agent that transfer control between each other.
Demonstrates agent handoff with handoff_to() and the @tool decorator.

Two LLM Agents with TTS
Same as above, but each agent has its own TTS with a distinct voice. The main agent has no TTS; audio comes from the active LLM agent through the bus.
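The pattern behind both two-agent examples can be sketched minimally. Here `VoiceAgent`, `speak`, and the voice labels are hypothetical stand-ins (the real agents use handoff_to() and actual TTS services); the sketch only shows the shape of a handoff in which the active agent, and therefore the voice, changes:

```python
import asyncio


class VoiceAgent:
    """Hypothetical stand-in for an LLM agent that owns its own TTS voice."""

    def __init__(self, name, voice):
        self.name = name
        self.voice = voice  # distinct per-agent voice (here just a label)

    def speak(self, text):
        # Stand-in for real TTS synthesis: tag the text with this agent's voice.
        return f"[{self.voice}] {text}"


async def main():
    greeter = VoiceAgent("greeter", "en-US-warm")
    support = VoiceAgent("support", "en-US-neutral")
    audio_out = []
    # The main agent has no TTS of its own; it forwards whatever audio the
    # currently active agent produces.
    active = greeter
    audio_out.append(active.speak("Hi! How can I help?"))
    active = support  # handoff: the active agent changes, and so does the voice
    audio_out.append(active.speak("Let me look into that."))
    return audio_out


print(asyncio.run(main()))
```

In the real example the handoff is triggered by the model itself, via a tool registered with the @tool decorator that calls handoff_to().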
LLM + Flows Agent
An LLM router agent combined with a FlowsAgent for structured restaurant reservations. Demonstrates mixing agent types.

Parallel Debate
A moderator spawns three worker agents in parallel using task_group(). Each worker argues from a different perspective. Demonstrates task coordination.

Voice Code Assistant
A voice agent backed by a code worker that uses Claude Agent SDK with tools (Read, Bash, Glob, Grep) to answer coding questions. Demonstrates task-based agent integration.
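The fan-out pattern the Parallel Debate example describes can be sketched in plain asyncio. task_group() is the framework's own coordination helper and its exact signature isn't shown here; `worker`, `moderator`, and the perspective labels below are illustrative assumptions:

```python
import asyncio

PERSPECTIVES = ["optimist", "skeptic", "pragmatist"]


async def worker(perspective, topic):
    # Stand-in for an LLM worker agent arguing from one perspective.
    await asyncio.sleep(0)  # yield, as a real agent would while awaiting the model
    return f"{perspective} on {topic}"


async def moderator(topic):
    # Fan out one worker per perspective in parallel, then gather their
    # arguments in order. The framework's task_group() presumably wraps a
    # coordination pattern like this.
    return await asyncio.gather(*(worker(p, topic) for p in PERSPECTIVES))


print(asyncio.run(moderator("tabs vs spaces")))
```

The same fan-out/collect shape underlies the Voice Code Assistant, where a single code worker (rather than three debaters) runs as a task behind the voice agent.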
Reference applications
These live in their own repos and exercise a coherent slice of the framework end-to-end, not just one feature.

Music Player (UI Agent)
A voice-driven music browser backed by a live Deezer catalog. Exercises the full UI Agent surface: voice/UI separation with attach_ui_bridge, <ui_state>-grounded Q&A, multi-turn deixis with keep_history=True, parallel fan-out via start_user_task_group with streaming worker results into a Discovery grid, ack-first ordering for slow tools, and a long-lived singleton catalog agent.

Distributed examples
These examples run agents across separate processes or machines, communicating via Redis or WebSocket.

Redis Handoff
The two-agent handoff split across separate processes using RedisBus. The main transport agent runs independently from the LLM agents.

Remote Proxy Assistant
A main agent connects to a remote LLM server over WebSocket using proxy agents. Demonstrates point-to-point distributed deployment.
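What changes in the distributed examples is the transport: messages that the in-process bus passed as plain Python objects must be serialized to cross a Redis channel or WebSocket. The envelope below is a hypothetical wire format for illustration, not the framework's actual one:

```python
import json


def encode_envelope(sender, recipient, payload):
    """Serialize a bus message for transport (hypothetical wire format).

    A distributed bus such as RedisBus needs some JSON-like framing so that
    the sending and receiving processes agree on sender, recipient, and body.
    """
    return json.dumps({"from": sender, "to": recipient, "payload": payload})


def decode_envelope(raw):
    """Reverse of encode_envelope: parse a received message back into a dict."""
    return json.loads(raw)


raw = encode_envelope("main", "llm-server", {"type": "user_text", "text": "hi"})
msg = decode_envelope(raw)
print(msg["to"], msg["payload"]["text"])
```

A round trip is lossless for JSON-serializable payloads, which is the constraint any cross-process bus imposes that the local AsyncQueueBus does not.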