The algorithms learn from your behavior. Before playing the game, you train them to recognize when you are focusing your attention on an object. A pulse of light bounces around the virtual room, and each time it hits a small colored ball in front of you, you think about the ball. At that moment, when you focus on the light and it stimulates your brain, the system reads the electrical spikes of your brain activity.
After you do this for a few minutes, the game learns to recognize when you are concentrating on an item. “We look at specific brain signals,” Mr. Alcaide said, “and once we understand them, we can use them.”
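For readers curious what that calibration step might look like in software, here is a rough sketch in Python. It is not Neurable's code: the use of scikit-learn, the linear discriminant classifier, the channel and sample counts, and the simulated recordings are all assumptions standing in for the real system, which only illustrates the idea of labeling short EEG snippets by whether a flash hit the object you were focusing on, then fitting a classifier.

```python
# A rough sketch of the calibration described above: the system collects short
# EEG "epochs" time-locked to each flash, labels them by whether the flash hit
# the ball the player was told to focus on, and fits a simple classifier.
# The numbers, the simulated data, and the LDA model are assumptions, not
# Neurable's actual design.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

N_CHANNELS = 7      # seven EEG sensors, as in the prototype
N_SAMPLES = 64      # assumed samples recorded after each flash
N_FLASHES = 300     # flashes seen during a few minutes of training

# Simulated recordings standing in for real EEG: 1 = the flash hit the ball
# the player was focusing on, 0 = it hit something else.
labels = rng.integers(0, 2, size=N_FLASHES)
epochs = rng.normal(size=(N_FLASHES, N_CHANNELS, N_SAMPLES))
epochs[labels == 1] += 0.3   # attended flashes evoke a slightly stronger response

# Flatten each epoch into a feature vector and fit the classifier.
X = epochs.reshape(N_FLASHES, -1)
clf = LinearDiscriminantAnalysis()
clf.fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```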
When you play the game, the same light bounces around the virtual room. When it hits the item you are thinking about, the system can identify the increase in brain activity.
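A companion sketch shows how the in-game step might work: each time the light hits an object, the classifier scores the EEG snippet that follows, and the object whose flashes draw the strongest response is selected. Again, this is purely illustrative, with simulated data and an assumed classifier rather than anything from Neurable.

```python
# Illustrative only: score each flash with a trained classifier and pick the
# object whose flashes produce the highest average "attended" probability.
# All data here is simulated, not real EEG.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
N_CHANNELS, N_SAMPLES = 7, 64          # mirrors the calibration sketch above

def select_object(clf, flash_epochs, flash_targets, n_objects):
    """Pick the object whose flashes get the highest mean attended-probability."""
    probs = clf.predict_proba(flash_epochs.reshape(len(flash_epochs), -1))[:, 1]
    scores = [probs[flash_targets == obj].mean() for obj in range(n_objects)]
    return int(np.argmax(scores))

# Stand-in for a classifier trained during calibration (see the sketch above).
train_y = rng.integers(0, 2, size=200)
train_X = rng.normal(size=(200, N_CHANNELS * N_SAMPLES)) + 0.3 * train_y[:, None]
clf = LinearDiscriminantAnalysis().fit(train_X, train_y)

# One simulated round of play: four objects, player focusing on object 2.
n_objects, n_flashes = 4, 40
flash_targets = rng.integers(0, n_objects, size=n_flashes)
flash_epochs = rng.normal(size=(n_flashes, N_CHANNELS, N_SAMPLES))
flash_epochs[flash_targets == 2] += 0.3   # stronger response when "their" object flashes

print("selected object:", select_object(clf, flash_epochs, flash_targets, n_objects))
```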
The technique works with equipment that already exists. Neurable’s prototype uses virtual reality goggles from HTC, a consumer electronics company, and seven EEG sensors placed at specific spots around your head. But given the physical limits of what these sensors can read, an EEG-based game is unlikely to do more than slowly and simply select digital objects.