CES 2020 Introduces Mind-Reading Interfaces

Within the world of technology, the ability to interface human brains with computers has mainly been the realm of science fiction – until now, that is. Several manufacturers have introduced working brain-computer interfaces that let users control devices with their thoughts. Earlier attempts at tech like this focused on approximating a user's intent from facial muscle contractions. These new interfaces appear to do something closer to the real thing, reading brain activity directly and matching it to the control of an object or an on-screen character.

NextMind Reads Your Vision

A headset developed by innovative designers NextMind attaches a sensor to the back of your head to read activity in your visual cortex. When you focus on a location on the screen, the device can use that focus as an input to open menus or make selections. While this offers a lot of promise for future implementations of the technology, NextMind's headset isn't consumer-ready yet. However, at CES 2020, the company was selling a developer kit in the hopes that innovative entrepreneurs would adopt its input device for a consumer product.

BrainCo’s FocusOne Brainwave Visualizer

Another device that measures brain activity comes from BrainCo. Its FocusOne headband reads brainwaves to gauge how focused the wearer currently is. The creators see a use for it in schools, helping students achieve and sustain attention. The headband has connected lights that signal to the user or an observer when it detects intense focus on a task or thought. BrainCo hopes the device will be a useful tool for teaching kids the art of concentration.

Baby Steps Are Still Transformative

Amidst the backdrop of multi-million-dollar partnerships among self-driving car companies and incremental updates to the gadgets we already own, this sort of new interface is a breath of fresh air. If it takes off, it could signal a paradigm shift in how users interact with UIs and control devices. We may eventually reach the point where using a mouse is considered archaic and quaint.