Live video streaming in MeetStream

This guide explains how to receive live video from a MeetStream bot over WebSocket while a meeting is in progress. You provide a URL when you create the bot; MeetStream connects to your server and streams fragmented MP4 (fMP4) data you can record or process.


Platform support

Live video streaming is supported for Google Meet and Microsoft Teams meetings only. It is not available for other platforms (including Zoom) at this time.


What you need

  • A create bot request that sets video_required to true and includes live_video_required.websocket_url (see below).
  • A WebSocket server you control that implements the message protocol in this document.
  • For local development, a way to expose that server on the public internet with wss:// (for example ngrok or cloudflared).

What happens during a session

  1. You create a bot with a meeting_link for Google Meet or Teams and pass your WebSocket URL in the payload.
  2. After the bot joins and recording starts, MeetStream opens a WebSocket connection to your URL.
  3. You receive JSON control messages (video_stream_start, periodic video_latency_ping, and video_stream_end) and binary frames containing fMP4 chunks in order.
  4. You respond to each video_latency_ping with video_latency_pong so latency can be measured.

Create bot payload

Use this shape when creating a bot in MeetStream:

```json
{
  "meeting_link": "https://...",
  "video_required": true,
  "live_video_required": {
    "websocket_url": "wss://your-server.example.com/"
  }
}
```

You can also use camelCase keys if your client prefers: liveVideoRequired and websocketUrl.

WebSocket URL rules

  • Must be a string starting with ws:// or wss://.
  • For production, use wss:// (TLS).
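These rules are simple enough to check before you send the create-bot request. A minimal sketch in Python (the function name and `production` flag are illustrative, not part of the MeetStream API):

```python
from urllib.parse import urlparse

def validate_websocket_url(url: str, production: bool = False) -> str:
    """Check a websocket_url value against the rules above (illustrative helper)."""
    scheme = urlparse(url).scheme
    if scheme not in ("ws", "wss"):
        raise ValueError("websocket_url must start with ws:// or wss://")
    if production and scheme != "wss":
        raise ValueError("production URLs must use wss:// (TLS)")
    return url
```

Running this before bot creation turns a silent "no connection" failure into an immediate error in your own code.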

WebSocket protocol

Messages from MeetStream (text / JSON)

video_stream_start

Sent once after the connection is established.

```json
{
  "type": "video_stream_start",
  "bot_id": "bot-123",
  "codec": "h264",
  "audio_codec": "aac",
  "container": "fmp4",
  "width": 1920,
  "height": 1080,
  "framerate": 25,
  "audio_sample_rate": 44100,
  "audio_bitrate": "128k"
}
```

video_latency_ping

Sent periodically.

```json
{
  "type": "video_latency_ping",
  "bot_id": "bot-123",
  "seq": 42,
  "sent_at_ms": 1743500000123
}
```

video_stream_end

Sent when the stream is stopping.

```json
{
  "type": "video_stream_end",
  "bot_id": "bot-123",
  "duration_seconds": 152.7
}
```

Messages from MeetStream (binary)

  • Raw fMP4 bytes from the recorder. Append chunks in order to build a continuous stream or file.

Messages you send back (text / JSON)

For every video_latency_ping, reply with:

```json
{
  "type": "video_latency_pong",
  "seq": 42,
  "sent_at_ms": 1743500000123,
  "server_received_at_ms": 1743500000189,
  "bot_id": "bot-123"
}
```

Echo the same seq, sent_at_ms, and bot_id from the ping; set server_received_at_ms to a millisecond timestamp when your server handled the ping.
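The echo rules above can be captured in a small helper. A sketch in Python (the function name is illustrative; the field names follow the protocol described above):

```python
import json
import time

def make_pong(ping: dict) -> str:
    """Build a video_latency_pong reply: echo seq, sent_at_ms, and bot_id
    from the ping, and stamp server_received_at_ms with the current time."""
    return json.dumps({
        "type": "video_latency_pong",
        "seq": ping["seq"],
        "sent_at_ms": ping["sent_at_ms"],
        "server_received_at_ms": int(time.time() * 1000),
        "bot_id": ping["bot_id"],
    })
```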


Implementing your WebSocket server

You can use any stack that speaks WebSocket. Your server should:

  1. Accept connections on ws:// or wss://.
  2. Parse text frames as JSON.
  3. On video_stream_start, prepare your output (file, buffer, pipeline).
  4. On each video_latency_ping, send a video_latency_pong as above.
  5. On binary frames, append bytes in order to your output.
  6. On video_stream_end or disconnect, finalize and close your output.

Minimal handling loop

  1. Track state per connection (for example bot_id, open output handle).
  2. Text JSON: handle video_stream_start, video_latency_ping, video_stream_end as described.
  3. Binary: append to the current output if video_stream_start was already received.
  4. Disconnect: flush and close resources.
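The per-connection state and dispatch in steps 1–3 can be sketched independently of any WebSocket library. The class and function names below are illustrative; the output here is an in-memory buffer standing in for a file handle:

```python
import json
from dataclasses import dataclass, field

@dataclass
class StreamState:
    """Per-connection state for one bot session (illustrative names)."""
    bot_id: str = "unknown"
    started: bool = False
    output: bytearray = field(default_factory=bytearray)

def on_text(state: StreamState, raw: str) -> str:
    """Dispatch one JSON control message; return its type."""
    msg = json.loads(raw)
    kind = msg.get("type", "")
    if kind == "video_stream_start":
        state.bot_id = msg.get("bot_id", "unknown")
        state.started = True
    elif kind == "video_stream_end":
        state.started = False
    return kind

def on_binary(state: StreamState, data: bytes) -> None:
    """Append fMP4 bytes only once the stream has started."""
    if state.started:
        state.output.extend(data)
```

The same dispatch logic slots into the `ws.on("message", ...)` handler or `async for` loop of the reference implementations below.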

Production tips

  • Terminate TLS in front of your app and expose wss:// to MeetStream.
  • Allow large WebSocket frames if your platform has limits.
  • Process writes sequentially per stream so chunk order is preserved.
  • Isolate streams per bot or session.
  • Plan storage and backpressure for long meetings.
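One way to get both sequential writes and backpressure is a bounded queue with a single writer task per stream: producers block in `put` when the writer falls behind. A minimal asyncio sketch (names are illustrative; `None` is used as an end-of-stream sentinel):

```python
import asyncio

async def sequential_writer(queue: asyncio.Queue, out) -> int:
    """Drain chunks one at a time so write order matches arrival order.
    A bounded queue (maxsize) provides backpressure: producers block in
    `await queue.put(...)` when the writer falls behind. Returns bytes written."""
    written = 0
    while True:
        chunk = await queue.get()
        if chunk is None:  # sentinel: stream ended
            break
        out.write(chunk)
        written += len(chunk)
    return written
```

The WebSocket handler would `await queue.put(data)` for each binary frame and `await queue.put(None)` on `video_stream_end`, keeping file I/O off the receive path.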

Reference implementations

Client side — creating a bot with live video

Send a POST to the MeetStream API to create a bot with live video streaming enabled. The response includes the bot_id you will see in WebSocket messages.

cURL

```shell
curl -X POST https://api.meetstream.ai/api/bots \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "meeting_link": "https://meet.google.com/abc-defg-hij",
    "video_required": true,
    "live_video_required": {
      "websocket_url": "wss://your-server.example.com/"
    }
  }'
```

Node.js

```javascript
const response = await fetch("https://api.meetstream.ai/api/bots", {
  method: "POST",
  headers: {
    "Authorization": "Bearer YOUR_API_KEY",
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    meeting_link: "https://meet.google.com/abc-defg-hij",
    video_required: true,
    live_video_required: {
      websocket_url: "wss://your-server.example.com/",
    },
  }),
});

const bot = await response.json();
console.log("Created bot:", bot.bot_id);
```

Python

```python
import requests

resp = requests.post(
    "https://api.meetstream.ai/api/bots",
    headers={
        "Authorization": "Bearer YOUR_API_KEY",
        "Content-Type": "application/json",
    },
    json={
        "meeting_link": "https://meet.google.com/abc-defg-hij",
        "video_required": True,
        "live_video_required": {
            "websocket_url": "wss://your-server.example.com/",
        },
    },
)

bot = resp.json()
print("Created bot:", bot["bot_id"])
```

Server side — Node.js WebSocket receiver

A complete receiver that writes incoming fMP4 chunks to .mp4 files on disk. Each connection maps to one bot session.

```javascript
import { createWriteStream } from "node:fs";
import { mkdir } from "node:fs/promises";
import { join } from "node:path";
import { WebSocketServer } from "ws";

const PORT = Number(process.env.WS_PORT || 9876);
const HOST = process.env.WS_HOST || "0.0.0.0";
const OUT_DIR = process.env.OUTPUT_DIR || join(process.cwd(), "recordings");

await mkdir(OUT_DIR, { recursive: true });

const wss = new WebSocketServer({ host: HOST, port: PORT });

wss.on("connection", (ws) => {
  let writeStream = null;
  let botId = "bot";
  let bytesWritten = 0;

  ws.on("message", (data, isBinary) => {
    // Binary frames: append fMP4 chunks to the output file
    if (isBinary) {
      if (!writeStream) return;
      const buf = Buffer.isBuffer(data) ? data : Buffer.from(data);
      bytesWritten += buf.length;
      writeStream.write(buf);
      return;
    }

    const msg = JSON.parse(data.toString());

    if (msg.type === "video_stream_start") {
      botId = msg.bot_id || "unknown";
      const outPath = join(OUT_DIR, `${botId}_${Date.now()}.mp4`);
      writeStream = createWriteStream(outPath);
      bytesWritten = 0;
      console.log(`Recording started -> ${outPath}`);
    }

    if (msg.type === "video_latency_ping") {
      ws.send(JSON.stringify({
        type: "video_latency_pong",
        seq: msg.seq,
        sent_at_ms: msg.sent_at_ms,
        server_received_at_ms: Date.now(),
        bot_id: msg.bot_id || botId,
      }));
    }

    if (msg.type === "video_stream_end") {
      console.log(`Recording ended for ${botId} (${msg.duration_seconds}s)`);
      if (writeStream) { writeStream.end(); writeStream = null; }
    }
  });

  ws.on("close", () => {
    if (writeStream) { writeStream.end(); writeStream = null; }
  });
});

console.log(`Listening on ws://${HOST}:${PORT}/`);
```

Run with:

```shell
npm install ws
node server.mjs
```

(The `.mjs` extension is enough for Node to treat the file as an ES module; no extra flags are needed on currently supported Node versions.)

Server side — Python WebSocket receiver

The same receiver in Python using the websockets library.

```python
import asyncio
import json
import os
import time

import websockets

PORT = int(os.environ.get("WS_PORT", 9876))
HOST = os.environ.get("WS_HOST", "0.0.0.0")
OUT_DIR = os.environ.get("OUTPUT_DIR", os.path.join(os.getcwd(), "recordings"))

os.makedirs(OUT_DIR, exist_ok=True)


async def handle(ws):
    out_file = None
    bot_id = "bot"
    bytes_written = 0

    try:
        async for message in ws:
            # Binary frames: append fMP4 chunks to the output file
            if isinstance(message, bytes):
                if out_file is None:
                    continue
                out_file.write(message)
                bytes_written += len(message)
                continue

            msg = json.loads(message)

            if msg["type"] == "video_stream_start":
                bot_id = msg.get("bot_id", "unknown")
                path = os.path.join(OUT_DIR, f"{bot_id}_{int(time.time())}.mp4")
                out_file = open(path, "wb")
                bytes_written = 0
                print(f"Recording started -> {path}")

            elif msg["type"] == "video_latency_ping":
                await ws.send(json.dumps({
                    "type": "video_latency_pong",
                    "seq": msg["seq"],
                    "sent_at_ms": msg["sent_at_ms"],
                    "server_received_at_ms": int(time.time() * 1000),
                    "bot_id": msg.get("bot_id", bot_id),
                }))

            elif msg["type"] == "video_stream_end":
                print(f"Recording ended for {bot_id} ({msg.get('duration_seconds')}s)")
                if out_file:
                    out_file.close()
                    out_file = None
    finally:
        if out_file:
            out_file.close()


async def main():
    async with websockets.serve(handle, HOST, PORT):
        print(f"Listening on ws://{HOST}:{PORT}/")
        await asyncio.Future()

asyncio.run(main())
```

Run with:

```shell
pip install websockets
python server.py
```

Reaching a server on your laptop

If MeetStream runs in the cloud and your receiver runs locally, expose the local WebSocket port with a tunnel (for example ngrok or cloudflared) and use the public wss:// URL in live_video_required.websocket_url.

Point the tunnel at the port your video receiver listens on, and do not reuse an endpoint that already serves a different WebSocket integration.


Troubleshooting

| Symptom | What to check |
| --- | --- |
| No connection or no data | Confirm video_required is true, the URL is correct, and the meeting is Google Meet or Teams. |
| No binary chunks | Confirm the tunnel or firewall allows inbound connections and TLS is valid for wss://. |
| TLS or certificate errors | Verify certificates, the tunnel URL, and that you use wss:// in production. |
| Scheduled or recurring bots missing live video | Include live_video_required in every create-bot payload your automation sends. |

Early in a session, the first binary chunks may arrive before your handler has finished preparing its output; handle them gracefully rather than closing the connection. Once set up, chunks should flow steadily.
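If you want to keep those early bytes instead of dropping them, a small buffer that holds binary frames until the output is ready works. A sketch (class and method names are illustrative):

```python
import io

class EarlyChunkBuffer:
    """Hold binary frames that arrive before the output is ready,
    then flush them in arrival order once it is (illustrative sketch)."""

    def __init__(self):
        self._pending = []
        self._out = None

    def attach(self, out) -> None:
        """Call once the output (file, pipe, ...) is open: flush backlog."""
        self._out = out
        for chunk in self._pending:
            self._out.write(chunk)
        self._pending.clear()

    def write(self, data: bytes) -> None:
        if self._out is None:
            self._pending.append(data)
        else:
            self._out.write(data)
```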


Security

  • Prefer wss:// in production.
  • Do not put secrets in URLs if you can avoid it; protect your receiver with auth, IP restrictions, or a private network where possible.
  • Treat tunnel URLs as sensitive while testing.

Quick test checklist

  1. Start your WebSocket receiver.
  2. Expose it with a public wss:// URL if needed.
  3. Create a bot with a Google Meet or Teams meeting_link and live_video_required.websocket_url set to that URL.
  4. Confirm you receive video_stream_start, then growing binary traffic.
  5. Confirm you send video_latency_pong for each video_latency_ping.
  6. Confirm video_stream_end when the session ends.

If something still fails, note your bot_id, meeting platform, and timestamps when contacting MeetStream support.