EMOTION ECHO

.Collaborative Project

GOAL

To develop an AI interface that analyzes EEG patterns to provide empathetic support for designers' creative processes and wellbeing, addressing critical gaps in current AI design tools

.Project Overview

PROJECT OVERVIEW

.THE PROBLEM

Bridging the Empathy Gap

Most design tools, including AI-powered ones, operate blindly: they see only prompts, clicks, and geometry, with no sense of the designer’s emotional state or deeper intent. Instead of clarifying the process, they often just add noise.


This problem extends to existing brain-to-3D pipelines, which are too rigid for professional work. They force designers into linear, one-way workflows that offer minimal control and fail to support true iterative exploration.


These tools also ignore the human cost of creation. By failing to detect stress or creative blocks, unresponsive AI systems increase the risk of cognitive overload and burnout. Designers are left juggling a fragmented ecosystem of isolated software (like Blender and Rhino) that demands constant attention but lacks any shared awareness of their context or needs.

.Solution

THE EMPATHETIC FRAMEWORK

Data Capture and Input

EEG stream
Raw EEG data is captured from EEG headsets via interfaces such as the Emotiv Cortex API, including performance metrics like "stress": 0.67 and "engagement": 0.82, plus higher-level labels such as "cognitive_load": "high" and "emotional_valence": "anxious_engagement".
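
As a rough illustration, one sample of this stream can be held in a small typed record. The field names below mirror the metrics quoted above, but the NeuralSample class itself is a hypothetical container, not an object from the Cortex API.

from dataclasses import dataclass

@dataclass
class NeuralSample:
    # Performance metrics in the 0-1 range, as reported by the headset stream.
    stress: float
    engagement: float
    focus: float
    # Higher-level labels derived downstream from the raw metrics.
    cognitive_load: str = "unknown"
    emotional_valence: str = "unknown"

# Hypothetical payload shaped like the example values quoted above.
sample = NeuralSample(stress=0.67, engagement=0.82, focus=0.74,
                      cognitive_load="high",
                      emotional_valence="anxious_engagement")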


Preprocessing
Techniques such as wavelet thresholding and Kalman filtering clean the signal and remove artifacts so emotional and cognitive patterns can be interpreted reliably.
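
A minimal sketch of the wavelet-thresholding half of this step, assuming a single-channel NumPy signal; the 'db4' wavelet, the decomposition level, and the universal threshold are illustrative defaults rather than the project's actual settings.

import numpy as np
import pywt

def wavelet_denoise(signal: np.ndarray, wavelet: str = "db4", level: int = 4) -> np.ndarray:
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Estimate the noise level from the finest detail band (median absolute deviation).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    threshold = sigma * np.sqrt(2 * np.log(len(signal)))
    # Soft-threshold every detail band, keep the approximation coefficients untouched.
    denoised = [coeffs[0]] + [pywt.threshold(c, threshold, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[: len(signal)]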


Hybrid AI analysis
A hybrid engine combines REST-based source localization with NeuroGPT deep learning models to capture both spatial brain patterns and temporal dynamics, improving recognition of emotion, attention, and cognitive state.
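
One simple way to combine the two branches is late fusion of their state probabilities. The sketch below assumes both the source-localization branch and the NeuroGPT branch emit a probability per candidate state; the state labels and the equal weighting are purely illustrative.

import numpy as np

STATES = ["calm_focus", "anxious_engagement", "creative_flow", "overload"]

def fuse_predictions(spatial_probs: np.ndarray,
                     temporal_probs: np.ndarray,
                     spatial_weight: float = 0.5) -> str:
    # Weighted average of the two branches' probability vectors.
    fused = spatial_weight * spatial_probs + (1.0 - spatial_weight) * temporal_probs
    fused /= fused.sum()  # renormalise after weighting
    return STATES[int(np.argmax(fused))]

# Example: both branches lean towards "anxious_engagement".
state = fuse_predictions(np.array([0.2, 0.5, 0.2, 0.1]),
                         np.array([0.1, 0.6, 0.2, 0.1]))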


Semantic processing
In parallel, the system understands what the designer is working on (for example, “modular furniture”), so neural signals are interpreted in the context of concrete creative tasks rather than as abstract numbers.

The Agentic Framework

Data Integration Agent / System Orchestrator
Fuses EEG data and design context, filters noise, detects anomalies, and routes requests to other agents using a routing decision (for example, flagging “high engagement with elevated stress” as high priority).
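
A minimal sketch of the routing rule hinted at above; the thresholds, signal-quality cut-off, and agent names are assumptions for illustration.

def route_request(stress: float, engagement: float, signal_quality: float) -> dict:
    # Drop windows whose signal quality is too poor to interpret.
    if signal_quality < 0.5:
        return {"target": "none", "priority": "discard", "reason": "noisy_window"}
    # High engagement with elevated stress is escalated, per the example above.
    if engagement > 0.7 and stress > 0.6:
        return {"target": "intelligence_agent", "priority": "high",
                "reason": "high engagement with elevated stress"}
    return {"target": "intelligence_agent", "priority": "normal",
            "reason": "routine_monitoring"}

decision = route_request(stress=0.67, engagement=0.82, signal_quality=0.9)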

Intelligence Agent
Runs cognitive analysis and pattern recognition on the EEG stream, interprets emotional states, and monitors for signs that the designer is approaching overload or burnout.


Designer Assistant Agent
Acts as the main user-facing layer, talking with the designer, surfacing metrics in plain language, and proposing concrete next steps as "user_facing_response" options.


Studio and MCP Agent
Connects the empathic framework to tools such as Blender via the Model Context Protocol (MCP), generates design iterations, updates interface elements, applies edits, and helps select promising directions.
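
MCP requests are JSON-RPC 2.0 messages, so a tool invocation from this agent could be built as below. The "tools/call" method comes from the MCP specification, but the tool name "generate_design_iteration" and its arguments are hypothetical stand-ins for whatever the Blender-side server actually exposes.

import json

def build_mcp_tool_call(request_id: int, stress: float, engagement: float) -> str:
    request = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {
            "name": "generate_design_iteration",  # hypothetical Blender-side tool
            "arguments": {
                "base_snapshot": "last_happy_timeline_save",
                "variation_count": 3,
                "neural_context": {"stress": stress, "engagement": engagement},
            },
        },
    }
    return json.dumps(request)

print(build_mcp_tool_call(1, stress=0.67, engagement=0.82))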

Data Flow Diagram

.Emotion ECHO STUDIO INTERFACE

THE EMOTION ECHO STUDIO


Emotion Echo Studio is the working testbed where these ideas come together as real interactions that give designers both support and control.

Interface: Project Dashboard

Empathic wellbeing and intervention system

Pattern recognition
The system looks for specific neural signatures, such as high stress combined with repeated focus drops during afternoon critique sessions.


Intervention recommendations
When it detects patterns like strong focus (74%) and engagement (82%) paired with elevated stress (67%), it can suggest short, targeted interventions such as a 2‑minute breathing exercise or a 120‑second microbreak, phrased directly to the user.


Context-aware advice
If the designer is sculpting a concept and shows high stress with a frustrated emotional valence, the system may suggest switching to a low‑cognitive task to protect wellbeing without halting progress.
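
A minimal sketch of the intervention logic described above: the numeric thresholds mirror the example figures (74% focus, 82% engagement, 67% stress), while the task labels and the suggestion texts are illustrative assumptions.

def suggest_intervention(focus: float, engagement: float, stress: float,
                         valence: str, current_task: str) -> str | None:
    # Strong focus and engagement with elevated stress: suggest a short reset.
    if focus > 0.7 and engagement > 0.8 and stress > 0.6:
        return ("You've been deeply focused for a while. "
                "A 2-minute breathing break now could protect that momentum.")
    # Frustration during a demanding task: propose a low-cognitive-load switch.
    if stress > 0.6 and valence == "frustrated" and current_task == "concept_sculpting":
        return ("This sculpt seems to be fighting back. "
                "Want to switch to organising reference images for a few minutes?")
    return None  # no intervention needed

message = suggest_intervention(0.74, 0.82, 0.67, "anxious_engagement", "concept_sculpting")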

Interface: Designer Wellbeing

AI Assistant

Neural timeline and version control

Auto-save and timeline creation
The system automatically saves when it detects peaks in excitement, engagement, and focus, tying design snapshots to positive mental states.


Branching and recovery
When stress spikes, it can propose branching from the “last happy timeline save” or generating options from that point to help the designer recover a more productive direction.
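
A minimal sketch of the auto-save and branching behaviour described above; the peak and stress thresholds and the snapshot structure are assumptions for illustration.

import time

timeline: list[dict] = []  # snapshots tagged with the neural state that produced them

def maybe_autosave(scene_id: str, excitement: float, engagement: float, focus: float) -> None:
    # Save when all three positive metrics peak together (illustrative threshold).
    if min(excitement, engagement, focus) > 0.75:
        timeline.append({"scene": scene_id, "timestamp": time.time(),
                         "state": {"excitement": excitement,
                                   "engagement": engagement, "focus": focus}})

def branch_suggestion(stress: float) -> dict | None:
    # On a stress spike, offer to branch from the last "happy" snapshot.
    if stress > 0.7 and timeline:
        return {"action": "branch", "from_snapshot": timeline[-1]}
    return None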


MCP-based sharing of context
Through MCP, this emotional and timeline context travels with the designer across tools like Blender and Rhino, so state awareness is not locked to a single application.

Interface: Clay Studio

.Validation and Sizing

VALIDATION

Thinking and Semantic Decision Making
The AI understands the design context and the user's emotional state, helping the designer through the design workflow.

Moulding with Tangible Interfaces
A 3D Clay Studio module dynamically adjusts tool parameters such as brush size and opacity based on EEG feedback, enabling quick, intuitive form-finding by combining physical moulding with brain-driven prompts.
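
A minimal sketch of how EEG feedback could modulate sculpting parameters; the mapping ranges are illustrative, and pushing the resulting values into Blender would go through the MCP connection described earlier.

def brush_parameters(focus: float, stress: float) -> dict:
    # Clamp inputs to the expected 0-1 range before mapping.
    focus = min(max(focus, 0.0), 1.0)
    stress = min(max(stress, 0.0), 1.0)
    # Higher focus -> finer, more opaque strokes; higher stress -> softer, more forgiving brush.
    return {
        "brush_size": round(0.05 + 0.45 * (1.0 - focus), 3),  # illustrative range
        "opacity": round(0.3 + 0.7 * focus, 3),
        "smoothing": round(0.2 + 0.6 * stress, 3),
    }

params = brush_parameters(focus=0.74, stress=0.67)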

Refining with Intuitive controls
Core refinement actions are mapped onto a Tangible User Interface (TUI), allowing designers to manipulate digital 3D forms through direct, physical controls.

Parameterisation
A proof-of-concept pipeline converts mesh-like outputs into editable, parametric CAD models inside a standard CAD environment, solving the usual problem that brain-generated meshes cannot be refined professionally.
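
As a highly simplified illustration of this idea, the sketch below fits a primitive (an axis-aligned box) to a cloud of mesh vertices and emits its dimensions as editable parameters; the actual pipeline would hand such parameters to a CAD kernel rather than return a dictionary.

import numpy as np

def box_parameters(vertices: np.ndarray) -> dict:
    # vertices: (N, 3) array of mesh vertex positions.
    lo, hi = vertices.min(axis=0), vertices.max(axis=0)
    size = hi - lo
    return {
        "origin": lo.tolist(),      # placement of the parametric box
        "width": float(size[0]),    # editable dimensions, one per axis
        "depth": float(size[1]),
        "height": float(size[2]),
    }

# Example: a rough blob of points becomes a box with three editable dimensions.
params = box_parameters(np.random.rand(500, 3) * [0.4, 0.6, 1.2])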

CONTACT ME

Like what you see?
Let's get in touch.

REN©2025