MeetStream Guide: Create an AI Agent & Bring It Into a Meeting


What is MIA?

MIA stands for MeetStream Infrastructure Agent — the platform layer that lets you create, configure, and deploy AI agents into live meetings. Through the MIA tab in the dashboard, you define how your agent listens, thinks, and acts during a call.

This guide walks you through creating an AI agent using MIA, connecting it to an MCP server for tool calling, and deploying it into a live meeting.

Applies to: Google Meet, Zoom, Microsoft Teams. Support: docs.meetstream.ai • API: api.meetstream.ai


1) Open the MeetStream dashboard

  1. Go to app.meetstream.ai.
  2. Navigate to the MIA tab.
  3. Click Create New Agent.

MIA tab — Create New Agent


2) Choose a mode: Realtime vs Pipeline

MeetStream offers two agent modes. Pick the one that fits your use case.

Realtime
  • How it works: a single provider handles everything (LLM, TTS, STT + MCP)
  • Latency: lower — fewer hops between services
  • Best for: fast conversational agents where speed matters

Pipeline
  • How it works: each component (LLM, TTS, STT) can use a different provider
  • Latency: higher — each stage is a separate call
  • Best for: fine-tuned setups where you want the best provider per capability

Tip: Start with Realtime mode if you want the fastest response times. Switch to Pipeline when you need specific provider combinations.


3) Select the agent response type

Choose how the agent should interact during the meeting:

Response type    Behavior
Voice            Agent listens and responds with spoken audio
Chat             Agent responds via text in the meeting chat
Action           Agent performs actions silently (no voice or chat output)

In all three modes, the agent can perform tool actions if an MCP server is connected (see step 6).


4) Select provider and model

Pick the LLM provider and model that will power your agent. The available options depend on the mode you chose in step 2:

  • Realtime mode — select one provider/model for the entire pipeline.
  • Pipeline mode — select a provider/model individually for LLM, TTS, and STT.

5) Add a system prompt

The system prompt defines how your agent behaves — its personality, constraints, and instructions.

Write a clear prompt that tells the agent:

  • What role it plays (e.g., “You are a meeting assistant that takes notes and creates action items”)
  • What it should and shouldn’t do
  • How it should respond (tone, length, format)

Example

You are a helpful meeting assistant for an engineering team.
Summarize discussions, track action items, and create Linear tickets
when asked. Keep responses concise and professional.

6) Connect an MCP server (tool calling)

MCP (Model Context Protocol) lets your agent call external tools — like creating tickets, querying databases, or triggering workflows. This section shows how to set up an MCP server locally using Docker and expose it to MeetStream.

Step 1: Install Docker

  1. Download and install the latest Docker Desktop from docker.com.
  2. Sign in to Docker Desktop.

Step 2: Add an MCP server from the Docker catalog

  1. In Docker Desktop, look for the MCP tab (new feature).
  2. Go to the Catalogue section.
  3. Search for an MCP server — for example, Linear, GitHub, or any other available server.
  4. Add the server and authorize it when prompted.

Docker MCP Catalogue

Step 3: Run the MCP gateway

Start the Docker MCP gateway with streaming transport on port 8080:

$ docker mcp gateway run --transport streaming --port 8080

Docker starts the MCP gateway locally on port 8080. The output also includes a bearer token — save it; you'll need it in step 4.

Important: Copy and store the bearer token from the command output. You will need it to authenticate MeetStream with your MCP gateway.

Step 4: Tunnel with ngrok

Your MCP gateway is running on localhost:8080, but MeetStream needs a public URL. Use ngrok to create a tunnel:

$ ngrok http 8080

ngrok will print a public HTTPS URL like:

https://c711-64-71-17-105.ngrok-free.app

Your MCP endpoint is now reachable at:

https://c711-64-71-17-105.ngrok-free.app/mcp

For Docker MCP gateway, always append /mcp to the ngrok URL.
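To avoid typos when assembling the endpoint, you can build it programmatically. A minimal Python sketch (the helper name is illustrative; the URL is the example tunnel from above):

```python
def mcp_endpoint(ngrok_url: str) -> str:
    """Build the MCP endpoint from an ngrok tunnel URL.

    The Docker MCP gateway serves its endpoint at the /mcp path,
    so normalize any trailing slash and append it.
    """
    return ngrok_url.rstrip("/") + "/mcp"

print(mcp_endpoint("https://c711-64-71-17-105.ngrok-free.app"))
# → https://c711-64-71-17-105.ngrok-free.app/mcp
```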

Step 5: Add the MCP server URL in the dashboard

Back in the MeetStream dashboard (agent creation screen):

  1. In the MCP Server URL field, enter your full URL:
     https://c711-64-71-17-105.ngrok-free.app/mcp
  2. In the Header section, add the bearer token:
     Authorization: Bearer <YOUR_BEARER_TOKEN>
  3. Click Fetch — MeetStream will connect to your MCP server and retrieve a list of available tools/actions.
  4. Select the tools you want the agent to use (e.g., create, list, edit).
  5. Click Save to finalize the agent.
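The value you paste into the Header section is a standard HTTP Authorization header. A trivial Python sketch of the expected shape (the helper name is illustrative, not part of any MeetStream SDK):

```python
def auth_header(bearer_token: str) -> dict:
    """Return the header entry expected by the dashboard's Header section."""
    return {"Authorization": f"Bearer {bearer_token}"}

# Replace the placeholder with the token printed by the gateway:
headers = auth_header("<YOUR_BEARER_TOKEN>")
```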

MCP Server Configuration


7) Bring the agent into a meeting

Now that your agent is created, you can deploy it into a live meeting using the API.

API endpoint

POST https://api.meetstream.ai/api/v1/bots/create_bot

Required fields

Include these three fields in your payload alongside the standard bot parameters:

Field                    Purpose
agent_config_id          The ID of the agent you created in the dashboard
socket_connection_url    WebSocket endpoint for the agent bridge connection
live_audio_required      WebSocket endpoint for live audio streaming to the agent

Example cURL

$ curl -X POST "https://api.meetstream.ai/api/v1/bots/create_bot" \
    -H "Authorization: Token <YOUR_API_KEY>" \
    -H "Content-Type: application/json" \
    -d '{
      "meeting_link": "<YOUR_MEETING_LINK>",
      "agent_config_id": "<YOUR_AGENT_CONFIG_ID>",
      "socket_connection_url": {
        "websocket_url": "wss://agent-meetstream-prd-main.meetstream.ai/bridge"
      },
      "live_audio_required": {
        "websocket_url": "wss://agent-meetstream-prd-main.meetstream.ai/bridge/audio"
      }
    }'

Once the bot joins the meeting, your AI agent is live — you can start talking to it immediately.
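The same request can be made from code. A minimal Python sketch using only the standard library (the WebSocket URLs are the example values from the cURL call; replace the placeholders with your own meeting link, agent ID, and API key):

```python
import json
import urllib.request

API_URL = "https://api.meetstream.ai/api/v1/bots/create_bot"

def build_create_bot_payload(meeting_link: str, agent_config_id: str) -> dict:
    """Assemble the create_bot payload with the three agent-specific fields."""
    return {
        "meeting_link": meeting_link,
        "agent_config_id": agent_config_id,
        "socket_connection_url": {
            "websocket_url": "wss://agent-meetstream-prd-main.meetstream.ai/bridge"
        },
        "live_audio_required": {
            "websocket_url": "wss://agent-meetstream-prd-main.meetstream.ai/bridge/audio"
        },
    }

def create_bot(api_key: str, meeting_link: str, agent_config_id: str) -> dict:
    """POST the payload to create_bot and return the parsed JSON response."""
    payload = build_create_bot_payload(meeting_link, agent_config_id)
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": "Token <YOUR_API_KEY>",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```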


8) End-to-end summary

Here’s the full flow at a glance:

  1. Dashboard → MIA tab → Create New Agent
  2. Mode → Realtime (fast, single provider) or Pipeline (flexible, multi-provider)
  3. Response type → Voice / Chat / Action
  4. Provider & model → Pick the LLM (and TTS/STT if Pipeline)
  5. System prompt → Define the agent’s behavior
  6. MCP server → Docker MCP gateway → ngrok tunnel → add URL + bearer token → fetch & select tools
  7. Deploy → POST /api/v1/bots/create_bot with agent_config_id + WebSocket URLs
  8. Talk → Agent is live in the meeting

Troubleshooting

MCP Fetch returns no tools

  • Confirm the Docker MCP gateway is running (docker mcp gateway run ...).
  • Confirm ngrok is tunneling to the correct port (8080).
  • Make sure you appended /mcp to the ngrok URL.
  • Verify the bearer token in the header is correct.
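To narrow down which link in the chain is broken, you can probe the endpoint directly. A hedged Python sketch: it sends a bare JSON-RPC `tools/list` request (without the full MCP handshake, so the gateway may return an error body — any HTTP response still proves the tunnel, port, and path are reachable, while a connection error points at ngrok or the gateway itself):

```python
import json
import urllib.error
import urllib.request

def probe_mcp_gateway(mcp_url: str, bearer_token: str) -> int:
    """POST a minimal JSON-RPC request to the MCP endpoint; return the HTTP status.

    This is only a reachability check, not a full MCP session:
    an HTTP status (even 4xx) means the URL and headers reach the gateway.
    """
    body = json.dumps({"jsonrpc": "2.0", "id": 1, "method": "tools/list"}).encode()
    req = urllib.request.Request(
        mcp_url,
        data=body,
        headers={
            "Authorization": f"Bearer {bearer_token}",
            "Content-Type": "application/json",
            "Accept": "application/json, text/event-stream",
        },
        method="POST",
    )
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # reachable, but the gateway rejected the request

# Example (uses the ngrok URL from step 4):
# status = probe_mcp_gateway("https://c711-64-71-17-105.ngrok-free.app/mcp", "<YOUR_BEARER_TOKEN>")
```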

Agent doesn’t respond in the meeting

  • Check that agent_config_id matches the saved agent in the dashboard.
  • Ensure both WebSocket URLs are included in the create_bot payload.
  • Verify the meeting link is valid and the bot has joined successfully (check webhook events).

ngrok tunnel expired

  • Free ngrok tunnels rotate URLs on restart. Re-run ngrok http 8080 and update the MCP Server URL in the dashboard.

For webhook event handling (bot lifecycle + post-call processing), see the Webhook Events Guide.


FAQ

What does MIA stand for?

MeetStream Infrastructure Agent. It’s the platform layer that powers agent creation, configuration, and deployment into meetings.

Can I use my own LLM API key?

Provider and model selection is handled through the MIA dashboard. Contact support at docs.meetstream.ai for details on custom provider configurations.

What’s the difference between Realtime and Pipeline mode?

Realtime uses a single provider for LLM, TTS, STT, and MCP — it’s faster because everything runs through one service. Pipeline lets you mix different providers for each component (e.g., one model for speech-to-text, another for the LLM), giving you more flexibility at the cost of slightly higher latency.

Can the agent perform actions in all response types (Voice, Chat, Action)?

Yes. As long as an MCP server is connected, the agent can execute tool actions regardless of whether its response type is set to Voice, Chat, or Action.

Do I need Docker to use MCP with MeetStream?

Not necessarily. Docker is one way to run an MCP server locally using the built-in MCP catalog and gateway. If you already have an MCP-compatible server hosted elsewhere, you can point MeetStream directly to its URL.

Why do I need to append /mcp to the ngrok URL?

The Docker MCP gateway exposes its MCP endpoint at the /mcp path. Without it, MeetStream won’t reach the correct route and the Fetch will fail.

Can I connect multiple MCP servers to one agent?

Currently, you configure one MCP server URL per agent. If you need tools from multiple sources, consider running a gateway that aggregates them behind a single endpoint.

What happens if my ngrok tunnel goes down during a meeting?

The agent loses access to MCP tools while the tunnel is down. It will still be in the meeting but won’t be able to execute tool calls. Restart ngrok, update the MCP URL in the dashboard, and redeploy if needed. For production use, host your MCP server on a stable endpoint instead of a tunnel.

Where do I find my agent_config_id?

After saving your agent in the MIA dashboard, the agent_config_id is shown in the agent details. You can also retrieve it via the MeetStream API.

Can I update an agent’s configuration after creating it?

Yes. Go to the MIA tab in the dashboard, select your agent, and edit its settings (mode, prompt, MCP server, tools, etc.). Changes apply to new deployments — agents already in a meeting will use the configuration they were launched with.

Which meeting platforms are supported?

Google Meet, Zoom, and Microsoft Teams.