Research · Voice AI · SaaS

AI Meet

UX research teams are bottlenecked by researcher availability — every interview needs a human to moderate, schedule, and synthesize. As research scales, so does headcount. We asked: what happens if you remove the moderator from the room entirely?

The Need

A typical qual study involves 8–12 participants, each requiring a 45-minute block in a researcher's calendar. Then another block for synthesis. At scale, this is a fixed headcount problem — more research means more researchers.

The ask: make it possible to run 50 concurrent interviews with zero researcher time per session, while keeping the depth and nuance that makes qual research valuable in the first place.

The Key Insight

The AI is a participant in the call — not a post-processing feature.

The agent joins the WebRTC session via Stream's OpenAI Realtime bridge. It can hear, respond, pause when interrupted, and adjust based on what the participant says — in real time. From the participant's perspective, the experience is indistinguishable from talking to a human interviewer.

Zero friction for participants

No account, no download. Participants click a link, enter their name, and join — a guest WebRTC token is issued on the spot.
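In production the guest token is minted by Stream's server SDK, but under the hood it is a short-lived HMAC-signed JWT. A minimal sketch of that shape, using only `node:crypto` rather than the SDK (the `user_id` convention and one-hour expiry are assumptions):

```typescript
import { createHmac } from "node:crypto";

const b64url = (s: string | Buffer): string => Buffer.from(s).toString("base64url");

// Illustrative: mint a short-lived signed token for a guest participant.
// Real deployments would use Stream's server SDK for this instead.
function mintGuestToken(guestName: string, apiSecret: string): string {
  const header = b64url(JSON.stringify({ alg: "HS256", typ: "JWT" }));
  const payload = b64url(
    JSON.stringify({
      user_id: `guest-${guestName.toLowerCase().replace(/\s+/g, "-")}`,
      exp: Math.floor(Date.now() / 1000) + 60 * 60, // valid for one hour
    })
  );
  const signature = createHmac("sha256", apiSecret)
    .update(`${header}.${payload}`)
    .digest("base64url");
  return `${header}.${payload}.${signature}`;
}
```

Because the token is issued server-side on join, the participant never sees a signup form: the name they type is all the identity the call needs.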

Reports assemble while you sleep

When the call ends, a background job fetches the transcript, runs synthesis via agent-kit, and writes a structured markdown report to the DB. The researcher wakes up to findings.

How It Works

01
Researcher
Creates an interview template

Sets agent instructions, voice, model, and temperature. Each booking gets a shareable link at /b/{id}.
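The template is the unit the agent is configured from. A sketch of its shape and the share-link convention (field names beyond instructions, voice, model, and temperature are assumptions):

```typescript
// Sketch of an interview template; the source names only these four agent settings.
interface InterviewTemplate {
  id: string;
  instructions: string; // the system prompt the agent interviews with
  voice: string;        // e.g. a Realtime API voice name
  model: string;
  temperature: number;
}

// Every booking gets its own public join link under /b/{id}.
function bookingLink(baseUrl: string, bookingId: string): string {
  return `${baseUrl}/b/${bookingId}`;
}
```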

02
Researcher
Books a participant

One booking per participant. System creates a Stream call and stores the meeting URL.

03
Participant
Joins via the link

Enters name → system issues a guest token → participant joins the Stream WebRTC call. No account needed.

04
AI Agent
Conducts the interview

Stream webhook fires on session_started. AI agent joins via OpenAI Realtime API. Custom instructions, voice, temperature applied.
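The webhook handler is essentially a dispatcher on the event type: session start pulls the agent in, session end kicks off processing. A sketch of that routing (the exact event names and payload shape on the wire are assumptions):

```typescript
type Action = "join_agent" | "process_results" | "ignore";

// Sketch: route incoming Stream webhook events to pipeline actions.
function routeWebhook(event: { type: string }): Action {
  switch (event.type) {
    case "call.session_started":
      return "join_agent";      // have the Realtime agent join the call
    case "call.session_ended":
      return "process_results"; // fetch transcript, run summarization
    default:
      return "ignore";
  }
}
```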

05
System
Processes results

On call end: transcript JSONL fetched, Inngest summarizes with agent-kit (gpt-4o), markdown summary written to DB.
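Before the summarizer sees anything, the transcript JSONL has to be flattened into plain dialogue. A sketch of that step, assuming each line carries a `{speaker, text}` pair (the actual field names in Stream's transcript format may differ):

```typescript
// Assumed per-line shape of the fetched transcript JSONL.
interface TranscriptLine {
  speaker: string;
  text: string;
}

// Turn raw JSONL into the speaker-labeled dialogue handed to the summarizer.
function transcriptToPrompt(jsonl: string): string {
  return jsonl
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line) as TranscriptLine)
    .map(({ speaker, text }) => `${speaker}: ${text}`)
    .join("\n");
}
```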

06
Researcher
Reviews findings

Reads AI summary, full transcript, and recording. Research workspace surfaces findings, quotes, and patterns.

Stack

Framework: Next.js 16 App Router · React 19 · TypeScript
Styling: Tailwind CSS v4 · shadcn/ui
Auth: Clerk
Database: PostgreSQL · Drizzle ORM
API: tRPC · TanStack Query
Real-time: Stream Video SDK (WebRTC)
AI: OpenAI Realtime API · @inngest/agent-kit
Jobs: Inngest

Build Log

01
Product Shell

Brand landing page, Clerk auth routing, responsive dashboard with sidebar + mobile bottom tabs.

02
Data Model & API

MVP schema: activity_types + bookings + booking_status enum. Drizzle migration. tRPC server/client wiring.
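A sketch of what that MVP schema might look like in Drizzle; the table names and enum come from the log above, but every column name, status value, and relation here is an assumption:

```typescript
import { pgTable, pgEnum, uuid, text, real, timestamp } from "drizzle-orm/pg-core";

// Assumed status values; only the enum's existence is stated in the build log.
export const bookingStatus = pgEnum("booking_status", [
  "upcoming",
  "active",
  "completed",
  "cancelled",
]);

// An "activity type" doubles as the interview template (instructions, voice, model, temperature).
export const activityTypes = pgTable("activity_types", {
  id: uuid("id").primaryKey().defaultRandom(),
  name: text("name").notNull(),
  instructions: text("instructions").notNull(),
  voice: text("voice").notNull(),
  model: text("model").notNull(),
  temperature: real("temperature").notNull(),
});

// One booking per participant, holding the Stream meeting URL once created.
export const bookings = pgTable("bookings", {
  id: uuid("id").primaryKey().defaultRandom(),
  activityTypeId: uuid("activity_type_id").references(() => activityTypes.id).notNull(),
  participantName: text("participant_name"),
  status: bookingStatus("status").notNull().default("upcoming"),
  meetingUrl: text("meeting_url"),
  createdAt: timestamp("created_at").defaultNow().notNull(),
});
```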

03
Researcher CRUD

Interview template CRUD with inline agent config editor, duplicate/delete, copy share link. Booking list and creation.

04
Real-time Interview Pipeline

Stream call creation on booking, public /b/[bookingId] join page, AI agent auto-join via OpenAI Realtime, transcript + recording + Inngest summary pipeline.

05
Research Workspace

Booking detail with Summary/Transcript/Recording tabs. Activity type detail with findings, quotes, affinity groups, journey map, and persona cards.

06
Access & Onboarding

Guest join flow fully public (no Clerk required). Dashboard onboarding banners. Invite-only landing. noindex robots.

07
Live Transcript & Markdown

Live caption panel during active call (Stream closed captions). Markdown renderer for AI-generated summaries.
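One wrinkle with live captions is that speech-to-text emits partial lines that are later finalized; the panel should replace a speaker's in-flight partial rather than stack duplicates. A sketch of that buffer logic (the caption shape and five-line window are assumptions, not Stream's API):

```typescript
// Assumed caption shape: partial captions arrive with final=false, then are finalized.
interface Caption {
  speaker: string;
  text: string;
  final: boolean;
}

// Append a caption, replacing the same speaker's unfinalized partial,
// and keep only the most recent `max` lines for the panel.
function pushCaption(buffer: Caption[], next: Caption, max = 5): Caption[] {
  const last = buffer[buffer.length - 1];
  const base =
    last && !last.final && last.speaker === next.speaker
      ? buffer.slice(0, -1)
      : buffer;
  return [...base, next].slice(-max);
}
```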

Status: Invite-only beta
Category: AI UX Research Platform
Built with: aidente