NeuralDrift is a love letter to Pacific Rim and a collaborative multiplayer neurogame based on brain-computer interfaces.
NeuralDrift was created in less than 36 hours during the WearHacks Montreal 2014 hackathon using Muse EEG headsets, a LEGO Mindstorms robot, and an Android tablet. The goal is for two players to use their brains (literally) to control and steer the robot. Each player controls one tread of the tank-like robot, and depending on the intensity of their brain signals, that tread accelerates or decelerates. If the players' brainwaves are not at the same intensity, the robot ends up running in circles; to get the robot where they want it to go, the players need to communicate with each other and attempt to (as Idris Elba's character would say in a deep and meaningful tone) drift together.
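The control scheme above can be sketched in a few lines. This is only an illustration in Python, not the actual game code (the real system ran through MATLAB and a Processing GUI); the function name, the normalized 0-to-1 intensity inputs, and the linear mapping to motor speed are all my assumptions.

```python
def tread_speeds(left_intensity, right_intensity, max_speed=100.0):
    """Map each player's normalized brain-signal intensity (0.0 to 1.0)
    to a speed for their tread of the tank-like robot.

    Hypothetical linear mapping for illustration; the actual game
    used its own signal processing and tuning.
    """
    left = max(0.0, min(1.0, left_intensity)) * max_speed
    right = max(0.0, min(1.0, right_intensity)) * max_speed
    return left, right

# Matched intensities drive the robot straight...
print(tread_speeds(0.8, 0.8))  # → (80.0, 80.0)
# ...while mismatched intensities make it veer in circles.
print(tread_speeds(0.9, 0.2))  # → (90.0, 20.0)
```

Because each tread is driven independently, the robot only moves in a straight line when both players hold their signals at the same level, which is what forces them to "drift" together.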
I was the designer of the team. While my teammates focused on essential back-end elements like signal acquisition from the EEGs, signal processing, machine learning, and server communications using Bluetooth and MATLAB, I was in charge of the front-end: making logos and icons, and designing and programming the graphical user interface (GUI) seen by the players. My task was to build an intuitive, easy-to-understand interface that would translate all of the numbers flowing from the EEGs through the servers to the robot into something more qualitative.
I used Processing to make bars that fill and empty depending on the value ranges provided by the EEGs, and I worked closely with Raymundo to make sure that the interface, which we ported to an Android tablet mounted on the LEGO Mindstorms robot, worked properly, and that both the robot and the interface could communicate via Bluetooth with the computer that received and processed the data from the EEG devices through a server. Since the players need to calibrate their brainwaves before the game proper can begin, I also added instructions and commands to the application that runs the GUI.
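The bar display boils down to normalizing a raw processed EEG value into a fill fraction. A minimal sketch of that idea, in Python rather than the Processing code actually used, and with hypothetical names; the assumption is that the calibration step captures a per-player minimum and maximum against which later values are scaled:

```python
def bar_fill(value, calib_min, calib_max):
    """Convert a raw processed EEG value into a 0.0-1.0 fill level
    for a player's on-screen bar, scaled against the range captured
    during calibration. (Illustrative; names are hypothetical.)"""
    if calib_max <= calib_min:
        # Degenerate calibration range; show an empty bar.
        return 0.0
    fraction = (value - calib_min) / (calib_max - calib_min)
    # Clamp so values outside the calibrated range don't overflow the bar.
    return max(0.0, min(1.0, fraction))

print(bar_fill(55.0, 10.0, 100.0))  # → 0.5, a half-full bar
```

Clamping matters here: live EEG values drift outside the calibrated range, and the bar should pin at empty or full rather than render outside its bounds.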
In the end, NeuralDrift received a mention as the most creative hack of the weekend, has been showcased at We Are Wearables Toronto and BCI-Montreal meetups, and as of February 2015, articles were still being written about it.
Raymundo and Hubert have since been working on MuLES, a more sophisticated version of the software they created to allow easy interfacing between computers and EEG devices.
NeuralDrift official website (Archived)
TechVibes article featuring WearHacks Montreal (and NeuralDrift)
Hubert Banville (Brain-Computer Interfaces & Machine Learning)
Raymundo Cassani (Acquisition of signals & Communication with servers)
Yannick Roy (Acquisition of signals & Communication with servers)
William Thong (Machine Learning)
Ana Tavera Mendoza (Design)