ConvoForge

A behavioural-intelligence platform for the conversations that matter. Microlearning, real-time speech and body-language analysis, AI-avatar roleplay, and a live overlay that quietly coaches you mid-meeting.

Category: AI · Coaching
Stack: Next.js · Node · Python ML
LLM: OpenRouter · Gemini · Claude
Status: In final testing

The problem

Communication coaching has always been a luxury good — expensive, infrequent, and impossible to apply in the moment when it matters. We wanted a coach that learns your specific patterns, gives you targeted micro-drills between meetings, and rides along during the meetings themselves with the lightest possible touch.

What we built

ConvoForge has four pillars. Microlearning ships a 60–90 second drill every weekday, calibrated to your weakest signal. Speech analysis ingests recorded calls and rates filler-word density, pace, energy, and turn-taking. Body-language analysis uses webcam ML to flag posture and gaze patterns. Avatar roleplay lets you rehearse a hard conversation with a configurable AI counterpart.
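As a rough sketch of what the speech-analysis pillar measures, two of the signals — filler-word density and pace — reduce to simple per-transcript arithmetic. The filler list, transcript format, and function names below are illustrative assumptions, not ConvoForge's actual pipeline:

```python
# Illustrative speech metrics: filler density and pace from a transcript.
# FILLERS and the scoring are assumptions for this sketch.
FILLERS = {"um", "uh", "like", "basically"}

def speech_metrics(transcript: str, duration_sec: float) -> dict:
    words = [w.strip(".,!?").lower() for w in transcript.split()]
    filler_count = sum(1 for w in words if w in FILLERS)
    return {
        "filler_density": filler_count / max(len(words), 1),  # fillers per word
        "pace_wpm": len(words) / (duration_sec / 60),         # words per minute
    }

m = speech_metrics("Um, so I think, uh, we should ship it", 10.0)
```

Energy and turn-taking need prosody and diarization respectively, so they sit further down the pipeline, but the report they feed into is the same flat dictionary of per-signal scores.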

The live meeting overlay is the piece we’re proudest of: a small, opt-in heads-up display during a Zoom or Google Meet call that shows pace, monologue length, and a single one-word nudge when something obvious is happening. No transcripts, no advice, no AI voice in your ear — just one number you actually look at.
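The "one-word nudge when something obvious is happening" amounts to a tiny decision rule over the live signals. The thresholds and nudge words here are invented for illustration; the point is the shape — at most one nudge, and the common case is showing nothing:

```python
# Sketch of the overlay's nudge selection. Thresholds and signal names
# are assumptions, not the shipped values.
from typing import Optional

def pick_nudge(pace_wpm: float, monologue_sec: float) -> Optional[str]:
    if monologue_sec > 90:   # long uninterrupted monologue wins over pace
        return "pause"
    if pace_wpm > 180:       # rushing
        return "slower"
    if pace_wpm < 100:       # flat delivery
        return "energy"
    return None              # default: stay silent
```

Ordering the checks by severity means the overlay never has to arbitrate between two simultaneous nudges — the first rule that fires is the only one shown.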

The AI angle

Multimodal end-to-end. Audio runs through transcription and prosody analysis. Video runs through pose and gaze detection. The combined signal is structured into a per-session report that Claude turns into a narrative summary and, more importantly, a single actionable drill for tomorrow. The overlay uses small, fast models behind a strict latency budget so the loop closes in real time.
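The "single actionable drill" step can be sketched as: normalize every signal so higher means better, find the weakest one, and map it to a drill. The report fields, scaling, and drill texts below are assumptions made for this sketch:

```python
# Illustrative per-session report and drill selection; field names,
# normalization, and drills are assumptions, not ConvoForge internals.
session_report = {
    "audio": {"filler_density": 0.08, "energy": 0.6},
    "video": {"gaze_on_camera": 0.45},
}

def pick_drill(report: dict) -> str:
    # Normalize so higher = better for every signal.
    scores = {
        "fillers": 1.0 - min(report["audio"]["filler_density"] * 10, 1.0),
        "energy": report["audio"]["energy"],
        "gaze": report["video"]["gaze_on_camera"],
    }
    weakest = min(scores, key=scores.get)
    drills = {
        "fillers": "Re-record one answer with zero filler words.",
        "energy": "Deliver tomorrow's opener at 20% more vocal energy.",
        "gaze": "Hold camera gaze through one full 90-second update.",
    }
    return drills[weakest]
```

Everything else in the session report still exists — it just goes to the LLM for the narrative summary rather than to the user directly.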

How it’s used

  • Founders running fundraises who need to compress weeks of pitch reps into days.
  • Sales teams looking for measurable improvement on call structure and energy.
  • Anyone preparing for a hard conversation — a difficult 1:1, a board update, a negotiation — who wants to rehearse against a counterpart that doesn’t flinch.

What it taught us

Behavioural change happens at the smallest unit. The temptation with multimodal AI is to throw the whole report at the user. We learned to suppress 95% of it — one drill, one number, one nudge. The compound effect over weeks is bigger than any single 40-page session report ever produced.