A Broader Kind of Brain-Computer Control

Researchers have shown that rhesus macaques equipped with brain-computer interfaces can navigate virtual environments using only neural activity, a result that points toward more natural forms of machine control than many current BCI systems provide. The work stands out not simply because the animals could move a cursor-like object, but because they were able to steer through richer virtual settings, including avatar-style movement that more closely resembles how a living body or wheelchair might one day be directed.

According to the New Scientist report, each of the three monkeys received three separate 96-electrode implants, for a total of 288 electrodes per animal (roughly 300). The arrays were placed not only in the primary motor cortex, the region most commonly used in brain-computer interface research, but also in dorsal and ventral premotor areas associated with higher-level movement planning. Signals from these regions were decoded by an AI model and translated into control of objects and avatars displayed on a 3D monitor.
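The report does not describe the decoding model itself, so the following is purely an illustrative sketch of what the simplest version of such a pipeline can look like: a linear (ridge-regression) readout that maps binned spike counts from roughly 300 channels to 2D velocity commands. All names, dimensions, and the synthetic data here are assumptions for illustration, not details from the study.

```python
import numpy as np

# Hypothetical illustration of a neural velocity decoder: binned spike
# counts from ~300 electrodes in, (vx, vy) movement commands out.
# The study's actual model is not described in the source.

rng = np.random.default_rng(0)
n_channels, n_bins = 300, 5000            # electrodes, time bins (assumed)

X = rng.poisson(3.0, size=(n_bins, n_channels)).astype(float)   # spike counts
true_W = rng.normal(size=(n_channels, 2))                       # synthetic ground truth
y = X @ true_W + rng.normal(scale=5.0, size=(n_bins, 2))        # vx, vy targets

# Ridge regression: W = (X^T X + lambda * I)^-1 X^T y
lam = 10.0
W = np.linalg.solve(X.T @ X + lam * np.eye(n_channels), X.T @ y)

def decode_velocity(spike_counts: np.ndarray) -> np.ndarray:
    """Turn one time bin of spike counts into a (vx, vy) command."""
    return spike_counts @ W

print(decode_velocity(X[0]))   # velocity command for the first time bin
```

Real decoders are typically recurrent or Kalman-filter based and recalibrated per session, but the basic shape, neural features in, movement commands out, is the same.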

Why the Placement of the Sensors Matters

Much previous BCI research has asked a human participant to imagine a specific physical action, such as moving a finger, in order to move a cursor or select an item on a screen. That approach can work, but it is often described as unintuitive and mentally taxing. The report cites researcher Peter Janssen’s view that the newer implant placement may tap into a more abstract and intuitive layer of movement planning, rather than requiring a user to simulate an awkward, isolated motion.

If that interpretation holds up, it would be a meaningful shift. A brain-computer interface becomes more useful when it asks the brain to express intent in a way the brain naturally represents, rather than forcing the user to learn a strange substitute language of muscle imagery. In the reported experiments, the animals could control a sphere moving through a landscape from a fixed viewpoint and also guide animated monkey avatars from a third-person perspective. The researchers said later tests included navigating virtual buildings, opening doors, and moving between rooms.

That progression matters because it suggests a BCI that is not limited to simple cursor-pointing tasks. It begins to look more like generalized navigation.

From Virtual Environments to Real-World Mobility

The long-term applications described in the report are practical rather than theatrical. Janssen and colleagues hope the approach could eventually help people with paralysis explore virtual spaces more naturally or control electric wheelchairs in the physical world. That is an important distinction. The goal is not merely to produce eye-catching demonstrations of animals in VR; it is to discover whether neural signals tied to intended movement can be decoded in a way that reduces training friction and expands what assistive systems can do.

There are obvious limits. Human trials are still some distance away, and the report notes that identifying the equivalent implant locations in humans will require additional work, because those brain areas are not yet mapped with enough precision for immediate clinical translation. Even so, the researchers believe the concept should be feasible in people, and may even become easier once human participants can be instructed directly.

The experiment therefore sits at an interesting intersection of neuroscience, AI, and assistive technology. AI is not replacing the neural interface here; it is acting as the translator that turns complex patterns of brain activity into usable commands. As decoding models improve, so does the prospect of moving beyond rigid BCI tasks toward systems that feel less like operating a machine and more like expressing intent.
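To make that distinction concrete: if the relevant signals really do encode where the animal wants to go rather than which muscle it imagines moving, the decoder’s job shifts from regressing fine-grained velocities to recognizing a small set of navigation intents. The sketch below is a deliberately simple, hypothetical stand-in (nearest-centroid classification over made-up intent labels), not the study’s method:

```python
import numpy as np

# Hypothetical sketch of intent-level decoding: classify a window of
# neural features into a small set of navigation intents rather than
# continuous velocities. Labels and features are illustrative only.

INTENTS = ["forward", "turn_left", "turn_right", "stop"]

rng = np.random.default_rng(1)
n_features = 300
prototypes = rng.normal(size=(len(INTENTS), n_features))  # stand-in templates

def decode_intent(features: np.ndarray) -> str:
    """Pick the intent whose learned template best matches the features
    (nearest-centroid scoring, a deliberately simple stand-in)."""
    scores = prototypes @ features
    return INTENTS[int(np.argmax(scores))]

window = rng.normal(size=n_features)   # one window of neural features
print(decode_intent(window))           # e.g. "turn_left"
```

A discrete command set like this also maps naturally onto wheelchair control, which is one reason intent-level decoding is attractive for the assistive applications the researchers describe.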

  • Three rhesus macaques were implanted with roughly 300 electrodes each across motor and premotor brain regions.
  • An AI model decoded neural signals into movement through virtual environments.
  • The researchers hope the approach could eventually support intuitive wheelchair control or virtual exploration for people with paralysis.

The deeper importance of the study is not that monkeys moved through a digital world. It is that the control may have come from a higher-level representation of wanting to move, rather than from a forced mental simulation of a single body part. If future work confirms that, BCIs may become less alien to use and much more useful in daily life.

This article is based on reporting by New Scientist; the original was published on newscientist.com.