BCI · Unity · C# · Signal Processing

Inner Light (BCI + VR Neurotech)

Real-time brain-computer interface (BCI) translation and VR biofeedback architecture.

The Challenge

Translating raw EEG data (brainwaves) into an intuitive, embodied visual metaphor in VR, in real time, without noticeable latency and without inducing motion sickness in the biofeedback loop.

The Impact

Successfully prototyped an EEG-to-avatar system with no perceptible latency, bridging the gap between raw neurological data and fluid human-computer interaction.

Architecture & Implementation

Signal-processing pipelines are notoriously difficult to implement inside real-time graphics engines. Inner Light leverages custom C# buffers to handle asynchronous UDP streams from the Emotiv EEG headset, performing a lightweight FFT (Fast Fourier Transform) on the Unity main thread to map specific frequency bands (Alpha/Theta) to Shader Graph parameters on the user's avatar.
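A minimal sketch of that pipeline is below. The UDP port, packet layout (packed little-endian float32 samples), sample rate, and the `_AlphaGlow` shader property are illustrative assumptions, not the actual Emotiv wire format; a Goertzel-style per-bin DFT stands in for a full FFT library to keep the example self-contained.

```csharp
using System;
using System.Collections.Concurrent;
using System.Net;
using System.Net.Sockets;
using UnityEngine;

public class EegShaderBridge : MonoBehaviour
{
    const int SampleRate = 128;          // assumed EEG sample rate (Hz)
    const int WindowSize = 128;          // 1-second analysis window

    readonly ConcurrentQueue<float> samples = new ConcurrentQueue<float>();
    readonly float[] window = new float[WindowSize];
    UdpClient udp;
    Material avatarMaterial;

    void Start()
    {
        avatarMaterial = GetComponent<Renderer>().material;
        udp = new UdpClient(54123);      // assumed local port
        udp.BeginReceive(OnPacket, null);
    }

    // Runs on a thread-pool thread: only enqueue, never touch Unity APIs.
    void OnPacket(IAsyncResult ar)
    {
        IPEndPoint remote = null;
        byte[] data = udp.EndReceive(ar, ref remote);
        for (int i = 0; i + 4 <= data.Length; i += 4)
            samples.Enqueue(BitConverter.ToSingle(data, i));
        udp.BeginReceive(OnPacket, null);
    }

    void Update()
    {
        // Drain new samples into a sliding window on the main thread.
        while (samples.TryDequeue(out float s))
        {
            Array.Copy(window, 1, window, 0, WindowSize - 1);
            window[WindowSize - 1] = s;
        }

        float alpha = BandPower(8f, 12f);   // Alpha band
        float theta = BandPower(4f, 8f);    // Theta band
        avatarMaterial.SetFloat("_AlphaGlow", alpha / (alpha + theta + 1e-6f));
    }

    // Per-bin DFT band power: cheap enough to run each frame for a
    // handful of bins, avoiding an external FFT dependency.
    float BandPower(float loHz, float hiHz)
    {
        float power = 0f;
        int loBin = Mathf.CeilToInt(loHz * WindowSize / SampleRate);
        int hiBin = Mathf.FloorToInt(hiHz * WindowSize / SampleRate);
        for (int k = loBin; k <= hiBin; k++)
        {
            float re = 0f, im = 0f;
            for (int n = 0; n < WindowSize; n++)
            {
                float phase = 2f * Mathf.PI * k * n / WindowSize;
                re += window[n] * Mathf.Cos(phase);
                im -= window[n] * Mathf.Sin(phase);
            }
            power += (re * re + im * im) / WindowSize;
        }
        return power;
    }

    void OnDestroy() => udp?.Close();
}
```

Keeping the socket callback free of Unity calls matters: UnityEngine APIs are not thread-safe, so the concurrent queue is the hand-off point between the network thread and the main thread.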

Human-in-the-Loop Biofeedback

The core of the system relies on a continuous feedback loop. As the user alters their emotional state, the BCI detects shifts in neural oscillations. This data modifies the avatar's light emission and particle behavior in real time, which in turn influences the user's psychological state. To preserve frame rate (90 FPS is required for VR), all particle calculations are offloaded to GPU compute shaders.
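The GPU offload described above can be sketched as the C# dispatch side of a compute shader. The kernel name (`StepParticles`), buffer layout, particle count, and the `_Arousal` parameter are assumptions for illustration; the band-power value computed from the EEG stream would be fed in each frame to close the loop.

```csharp
using UnityEngine;

public class ParticleFieldDriver : MonoBehaviour
{
    public ComputeShader particleShader;   // assumed to define kernel "StepParticles"
    const int ParticleCount = 65536;

    ComputeBuffer positions;               // one float4 per particle
    int kernel;

    void Start()
    {
        kernel = particleShader.FindKernel("StepParticles");
        positions = new ComputeBuffer(ParticleCount, sizeof(float) * 4);
        particleShader.SetBuffer(kernel, "_Positions", positions);
    }

    // Called once per frame with the latest normalized alpha-band power,
    // so particle behavior tracks the user's neural state on the GPU.
    public void Step(float alphaLevel)
    {
        particleShader.SetFloat("_Arousal", alphaLevel);
        particleShader.SetFloat("_DeltaTime", Time.deltaTime);
        // 64 threads per group; dispatch enough groups to cover all particles.
        particleShader.Dispatch(kernel, ParticleCount / 64, 1, 1);
    }

    void OnDestroy() => positions?.Release();
}
```

Because the particle buffer never leaves GPU memory, the per-frame CPU cost is a few uniform uploads and one dispatch call, which is what keeps the loop inside the 90 FPS budget.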