SYNAPTICON (2025) – Closed-Loop AI Multimodal Generation from Brain Signals: A BCI Framework Integrating EEG Decoding with LLMs and Transformers.
SYNAPTICON is a research prototype at the intersection of neuro-hacking, non-invasive brain-computer interfaces (BCIs), and foundation models, probing new territories of human expression, aesthetics, and AI alignment. Envisioning a cognitive “Panopticon” where biological and synthetic intelligent systems converge, it implements a pipeline that couples temporal neural dynamics with pretrained language representations and operationalizes them in a closed loop for live performance. At its core lies a live “Brain Waves-to-Natural Language-to-Aesthetics” system that translates neural states into decoded speech and then into immersive audiovisual output, shaping altered perceptual experiences and inviting audiences to engage directly with the user’s mind. SYNAPTICON provides a reproducible reference for foundation-model-assisted BCIs, suitable for studies of speech decoding, neuroaesthetics, and human–AI co-creation.
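Conceptually, the closed loop is a short acquire-decode-render cycle: sample an EEG window, extract features, map the features to natural language, and map the language to audiovisual parameters that feed back into the performance. The Python sketch below is a minimal illustration under stated assumptions, not SYNAPTICON's actual implementation: the sampling rate, frequency bands, and every function body (acquire_window, features_to_prompt, prompt_to_av_params) are hypothetical stand-ins for the project's trained speech decoder, LLM, and generative audiovisual models.

```python
# Minimal closed-loop sketch of an EEG -> language -> aesthetics cycle.
# Every name, rate, and function body below is an illustrative assumption.
import numpy as np

FS = 250        # Hz; a typical sampling rate for an 8-channel OpenBCI board
WINDOW_S = 2.0  # length of each analysis window in seconds
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

rng = np.random.default_rng(0)

def acquire_window() -> np.ndarray:
    """Stand-in for a live EEG stream; returns (channels x samples) of noise."""
    return rng.normal(size=(8, int(FS * WINDOW_S)))

def band_powers(window: np.ndarray) -> dict:
    """Mean spectral power per frequency band, averaged over channels."""
    freqs = np.fft.rfftfreq(window.shape[1], d=1.0 / FS)
    psd = np.abs(np.fft.rfft(window, axis=1)) ** 2
    return {name: float(psd[:, (freqs >= lo) & (freqs < hi)].mean())
            for name, (lo, hi) in BANDS.items()}

def features_to_prompt(powers: dict) -> str:
    """Placeholder for the 'neural state -> natural language' step;
    a trained speech decoder feeding an LLM would sit here."""
    dominant = max(powers, key=powers.get)
    return f"Render a scene evoked by a {dominant}-dominant neural state."

def prompt_to_av_params(prompt: str) -> dict:
    """Placeholder for the aesthetics step: text -> audiovisual controls.
    A generative audio/visual model would replace this deterministic stub."""
    seed = sum(ord(c) for c in prompt) % 1000
    return {"hue": seed / 1000.0, "tempo_bpm": 60 + seed % 80}

# One pass per window keeps the loop closed: what the audience hears and
# sees is always driven by the most recently sampled neural state.
for step in range(3):
    prompt = features_to_prompt(band_powers(acquire_window()))
    print(step, prompt, prompt_to_av_params(prompt))
```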
Barqué-Duran, A., & Llauradó, A. (2025). How Foundation Models Are Reshaping Non-Invasive Brain–Computer Interfaces: A Case for Novel Human Expression and Alignment. 39th Annual Conference on Neural Information Processing Systems (NeurIPS 2025).
CREDITS
Directed & Produced:
Albert.DATA (Albert Barqué-Duran)
Technical Managers:
Ada Llauradó & Ton Cortiella
Audio Engineer:
Jesús Vaquerizo / I AM JAS
Extra Performer:
Teo Rufini
Partners:
Sónar+D
OpenBCI
BSC (Barcelona Supercomputing Center)
.NewArt { foundation;}
CBC (Center for Brain & Cognition)
Universitat Pompeu Fabra
Departament de Cultura - Generalitat de Catalunya