AI Integrations
MeetStream provides two ways to bring its documentation and API directly into your AI workflow: a Claude Code Plugin that pre-loads Claude with deep MeetStream knowledge and an MCP Server for real-time doc search inside your coding tools.
Claude Code Plugin
The MeetStream Claude Code Plugin extends Claude Code with native MeetStream knowledge, letting you deploy bots, query transcripts, and work with the MeetStream API directly from your terminal, without switching context.
Installation
Run the following three commands inside Claude Code to install the plugin:
1. Add the MeetStream marketplace
Registers the MeetStream plugin marketplace with Claude Code. This makes MeetStream plugins discoverable — no plugins are installed yet.
2. Install the plugin
Installs the MeetStream plugin from the marketplace you just added.
3. Activate the plugin
Loads the installed plugin into your current Claude Code session. Run this after installation for MeetStream skills and commands to become available.
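Taken together, the three steps above typically look like the following inside a Claude Code session. The marketplace repo and plugin name shown here are placeholders (the exact identifiers are not given in this document), and the activation step may instead be done through the interactive `/plugin` manager:

```
# 1. Register the MeetStream plugin marketplace (repo name is a placeholder)
/plugin marketplace add meetstream/claude-code-plugins

# 2. Install the plugin from that marketplace (plugin name is a placeholder)
/plugin install meetstream@meetstream

# 3. Enable the plugin in the current session (or enable it via the /plugin menu)
/plugin enable meetstream@meetstream
```

Substitute the real marketplace and plugin identifiers from MeetStream's installation instructions before running these.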
What’s included
- Bot lifecycle management via natural language
- Live transcription and audio streaming setup
- WebSocket bot control commands
- Webhook and callback event handling
- Google Calendar integration guidance
- Ready-to-use API call generation
Tip: Pair the Claude Code plugin with the MCP Server for the most complete experience. The plugin covers common patterns, while the MCP server performs live documentation lookups on demand.
MCP Server
The MeetStream MCP (Model Context Protocol) server lets AI tools like Cursor, VS Code, Claude Code, and Claude.ai query our live documentation on demand.
MCP Server URL: https://docs.meetstream.ai/_mcp/server
Setup Instructions
Cursor
- Open Cursor Settings → Features → MCP Servers
- Click Add new MCP server
- Set Name to meetstream-docs
- Set Type to SSE
- Set URL to https://docs.meetstream.ai/_mcp/server
- Click Save and restart Cursor

VS Code (Copilot), Claude Code, and Claude.ai connect to the same server URL through their own MCP settings.
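For editors that read MCP servers from a JSON config file (Cursor supports an `mcp.json`), the same connection can be sketched as an entry like the one below. The field names follow the common MCP client convention for SSE/URL-based servers; check them against your editor's current documentation:

```json
{
  "mcpServers": {
    "meetstream-docs": {
      "url": "https://docs.meetstream.ai/_mcp/server"
    }
  }
}
```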
Once connected, your AI assistant can search and reference MeetStream documentation in real time during your sessions.
Recommended Setup
For the best developer experience, use both together:
With both enabled, LLMs can answer questions about any MeetStream endpoint, generate working API calls, debug webhook payloads, and help architect integrations, all grounded in accurate, up-to-date documentation.
