

SYNAPTICON (2025) – Evolving Human-AGI Interaction Paradigms for the Post-Anthropocentric Era: From Command-and-Control Metaphors to Cognitive Co-Experience Frameworks
SYNAPTICON is a radical experiment that merges neuro-hacking, brain-computer interfaces (BCI), and foundation models to explore new realms of human expression, aesthetics, and surveillance. SYNAPTICON's innovative framework envisions a new era of the “Panopticon”, where cognitive and algorithmic systems converge, enabling real-time monitoring, modulation, and prediction of thought, behavior, and creativity. Through the use of BCIs and state-of-the-art AI-driven cognitive models and architectures, SYNAPTICON blurs the boundaries between the self and surveillance, offering insights into the neural and algorithmic fabric of human perception. By developing a real-time “Brain Waves-to-Natural Language-to-Aesthetics” system, SYNAPTICON first translates neural states into decoded speech and then into powerful audiovisual expressions for altered perception. This visionary project proposes a new genre of performance art that invites audiences to engage directly with Albert.DATA’s mind, while prompting critical dialogue on the future of neuro-rights and synthetic identities.

SYNAPTICON – Closed-Loop AI Multimodal Generation from Brain Signals: A BCI Framework Integrating EEG Decoding with LLMs and Transformers.
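The closed-loop pipeline described above could be sketched, in highly simplified form, as: extract band power from an EEG window, decode it into a coarse cognitive label, and turn that label into a text prompt for a downstream generator. This is a minimal illustrative sketch only; the band thresholds, the two-state labels, and the `state_to_prompt` mapping are assumptions standing in for the project's actual EEG decoder and LLM/Transformer stages.

```python
# Hypothetical sketch of a "Brain Waves -> Natural Language -> Aesthetics" loop.
# The real system uses trained EEG decoders and LLMs; here a simple FFT band-power
# heuristic and a lookup table stand in for those stages.
import numpy as np

def band_power(eeg, fs, lo, hi):
    # Mean spectral power of the signal in the [lo, hi) Hz band
    freqs = np.fft.rfftfreq(eeg.shape[-1], 1 / fs)
    psd = np.abs(np.fft.rfft(eeg)) ** 2
    mask = (freqs >= lo) & (freqs < hi)
    return psd[..., mask].mean()

def decode_state(eeg, fs=250):
    # Toy decoder: compare alpha (8-13 Hz) vs beta (13-30 Hz) power
    alpha = band_power(eeg, fs, 8, 13)
    beta = band_power(eeg, fs, 13, 30)
    return "calm" if alpha > beta else "aroused"

def state_to_prompt(state):
    # Placeholder for the LLM stage that verbalizes the decoded state
    # into a prompt driving the audiovisual generator (assumed mapping)
    palettes = {"calm": "slow drifting blue fields",
                "aroused": "strobing red fractals"}
    return f"Generate visuals: {palettes[state]}"

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fs = 250
    t = np.arange(0, 2, 1 / fs)
    # Synthetic 2-second EEG window dominated by 10 Hz alpha activity
    eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
    print(state_to_prompt(decode_state(eeg, fs)))
```

In a real closed loop this would run continuously on streamed BCI data, with the generated audiovisuals in turn altering the performer's neural state.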

CREDITS
Directed & Produced:
Albert.DATA (Albert Barqué-Duran)
Technical Managers:
Ada Llauradó & Ton Cortiella
Audio Engineer:
Jesús Vaquerizo / I AM JAS
Extra Performer:
Teo Rufini
Partners:
Sónar+D
OpenBCI
BSC (Barcelona Supercomputing Center)
.NewArt { foundation;}
CBC (Center for Brain & Cognition)
Universitat Pompeu Fabra
Departament de Cultura - Generalitat de Catalunya