Open protocol for expressive AI agents. Connect with OpenClaw, MCP, HTTP/WS, or drop in a web component and ship a live face in minutes.
Pick your integration surface. Same protocol, different entry points.
Map agent lifecycle events to expressive face states — via plugin, MCP tools, or direct protocol calls.
State, audio, and mouth amplitude have separate ownership to prevent desync and race conditions.
Lifecycle events and intent changes push thinking, working, speaking, and emotion updates over HTTP/WS.
Speech chunks are delivered through /api/audio and sequenced with /api/speak + audio-seq for interruption-safe playback.
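To illustrate how audio-seq keeps playback interruption-safe, here is a hypothetical client-side sketch in Python: each new utterance bumps a sequence number, and chunks tagged with an older number are dropped instead of played. The names (`AudioSequencer`, `on_chunk`) are illustrative, not part of the protocol.

```python
class AudioSequencer:
    """Drops audio chunks that belong to an interrupted utterance."""

    def __init__(self):
        self.current_seq = 0
        self.queue = []

    def start_utterance(self):
        """Called when a new /api/speak request begins; invalidates
        any chunks still in flight from the previous utterance."""
        self.current_seq += 1
        self.queue.clear()
        return self.current_seq

    def on_chunk(self, seq, chunk):
        """Accept a chunk only if it carries the current audio-seq."""
        if seq != self.current_seq:
            return False  # stale: this utterance was interrupted
        self.queue.append(chunk)
        return True

player = AudioSequencer()
first = player.start_utterance()
player.on_chunk(first, b"hello")
second = player.start_utterance()            # user interrupts mid-sentence
assert not player.on_chunk(first, b"world")  # late chunk is dropped
assert player.on_chunk(second, b"hi again")
```

The point of the guard is that a late-arriving chunk from a cancelled utterance can never play over the new one, no matter the network ordering.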
The browser computes RMS amplitude directly from waveform data, so mouth motion follows real audio rather than guessed network values.
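The amplitude computation itself is simple. A minimal Python sketch of RMS over a chunk of normalized PCM samples (the browser does the same math over Web Audio waveform data):

```python
import math

def rms_amplitude(samples):
    """Root-mean-square of a chunk of PCM samples in [-1.0, 1.0].

    Mirrors what the viewer computes in the browser from waveform
    data; shown in Python for clarity.
    """
    if not samples:
        return 0.0
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# A full-scale square wave has RMS 1.0; silence has RMS 0.0.
print(rms_amplitude([1.0, -1.0, 1.0, -1.0]))  # 1.0
print(rms_amplitude([0.0] * 4))               # 0.0
```

Because the value is derived from the audio actually being played, mouth motion stays in phase with speech even when network delivery is bursty.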
Swap visual identity with a single attribute. Each pack defines its own geometry, colors, and personality.
Phone, tablet, laptop, TV — one web component, every screen size.
Open Face is a foundation layer for expressive agent UX, not just a demo widget.
Map terminal/tool lifecycle to visible thinking, speaking, and status on a second display or desktop overlay.
Run an agent persona that visibly changes state while triaging requests, escalating, or waiting on external systems.
Drive a branded face in real time from chat + TTS events with interrupt-safe audio sequencing.
Render on embedded browsers or control panels using the same JSON protocol used by the web dashboard.
Show emotional and cognitive context while tutoring: puzzled, determined, and excited, with compound expression blending.
Create unique personalities with pack-level geometry, palette, animation tuning, and strict schema-backed contracts.
Fastest path: run the server, open the viewer, push a state.
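Under assumptions about the default port and endpoint shape (check your server's configuration and the published schema for the real values), the "push a state" step could look like this in Python:

```python
import json
from urllib import request

BASE = "http://localhost:3000"  # assumed default port; adjust to your server

def build_state_payload(state, emotion=None):
    """Illustrative payload shape; field names follow the protocol
    description here, not necessarily the exact published schema."""
    body = {"state": state}
    if emotion is not None:
        body["emotion"] = emotion
    return body

def push_state(state, emotion=None):
    """POST a state change to a running Open Face server."""
    req = request.Request(
        f"{BASE}/api/state",  # hypothetical endpoint path
        data=json.dumps(build_state_payload(state, emotion)).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return resp.status

# push_state("thinking")            # viewer shows the thinking face
# push_state("speaking", "happy")   # speaking with a happy expression
```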
State and face-definition contracts are explicit and versioned. Integrate from any runtime that can speak JSON.
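A strict, versioned contract means a client can validate a message before sending it. A minimal sketch, assuming a hypothetical version tag and state set (the published schema is the source of truth):

```python
SCHEMA_VERSION = "1"                                        # hypothetical version tag
KNOWN_STATES = {"idle", "thinking", "working", "speaking"}  # assumed state set

def validate_state_message(msg):
    """Return a list of validation errors (empty means valid).

    Mirrors the 'strict validation' idea with illustrative rules;
    the real contract defines the actual fields and values.
    """
    errors = []
    if msg.get("version") != SCHEMA_VERSION:
        errors.append(f"unsupported version: {msg.get('version')!r}")
    if msg.get("state") not in KNOWN_STATES:
        errors.append(f"unknown state: {msg.get('state')!r}")
    return errors

print(validate_state_message({"version": "1", "state": "thinking"}))    # []
print(validate_state_message({"version": "2", "state": "pondering"}))   # two errors
```

Rejecting malformed messages on the client keeps a bad integration from putting the face into an undefined state.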
State transitions, TTS chunks, and viewer amplitude are intentionally separated for reliable lip sync and interruption handling.
Geometry + palette + animation personality are fully pack-driven. Default, Classic, Zen, Robot, Sticker, and custom packs.
Blend two emotions with intensity. "Nervously excited" is happy + concerned at 0.7 intensity.
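One way such a compound expression could be represented as a JSON payload, with illustrative field names (the real schema may differ):

```python
def blend(primary, secondary, intensity):
    """Build a compound-expression payload.

    Field names ('emotion', 'blend', 'with', 'intensity') are
    assumptions for illustration. Out-of-range intensity is
    rejected rather than clamped.
    """
    if not 0.0 <= intensity <= 1.0:
        raise ValueError("intensity must be within [0, 1]")
    return {
        "emotion": primary,
        "blend": {"with": secondary, "intensity": intensity},
    }

# "Nervously excited": happy blended with concerned at 0.7 intensity.
print(blend("happy", "concerned", 0.7))
```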
Use standalone viewer, dashboard overlay, web component embeds, MCP tools, and edge rooms with the same behavior model.
Reduced motion support, colorblind-safe options, strict validation, and test-backed runtime behavior across packages.