- Apple announced plans to extend Switch Control support to brain-computer interfaces
- The feature would make devices like the iPhone and the Vision Pro headset usable by people with conditions like ALS
- Combined with Apple’s AI-driven Personal Voice feature, brain-computer interfaces could let people think words and hear them spoken in a synthetic version of their own voice
Our smartphones and other devices are the key to so many personal and professional tasks all day. Using these devices can be difficult or outright impossible for those with ALS and other conditions. Apple believes it has a possible solution: thinking. Specifically, a brain-computer interface (BCI) built with the Australian neurotech startup Synchron could provide hands-free, thought-controlled versions of the operating systems for the iPhone, iPad and Vision Pro headset.
A brain implant to control your phone may seem extreme, but it can be the key for those with serious spinal cord injuries or related conditions to engage with the world. Apple supports Switch Control for those with the implant, which is embedded near the brain’s motor cortex. The implant picks up the brain’s electrical signals when a person thinks about moving, translates that activity, and feeds it to Apple’s Switch Control software, where it becomes digital actions such as selecting icons on a screen or navigating a virtual environment.
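The model described above, where a decoded brain signal stands in for a physical switch, can be sketched in a few lines. Switch Control works by scanning: a highlight cycles through on-screen items, and a single "select" input picks the highlighted one. The sketch below is purely illustrative; the intent names and scanner logic are invented for the example, and Apple's and Synchron's actual software is not public.

```python
# Hypothetical sketch of switch-style scanning input, where decoded
# motor intents ("next", "select") replace a physical switch press.
# All names are illustrative, not Apple's or Synchron's real APIs.

ICONS = ["Messages", "Photos", "Music", "Settings"]

def run_scanner(intents, icons=ICONS):
    """Advance the highlight on each 'next' intent; return the icon
    chosen when a 'select' intent arrives, or None if none does."""
    index = 0
    for intent in intents:
        if intent == "next":      # decoded "move" signal advances the scan
            index = (index + 1) % len(icons)
        elif intent == "select":  # decoded "select" signal confirms choice
            return icons[index]
    return None

# A stream of decoded intents: think "move" twice, then "select".
print(run_scanner(["next", "next", "select"]))  # Music
```

The point of the scanning design is that a single reliable binary signal is enough to reach every control on screen, which is why a BCI that can only detect one or two intents can still drive a whole operating system.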
Brain implants, AI voices
Of course, it’s still early days for the system. It can be slow compared to tapping and it will take time for developers to build better BCI tools. But speed is not the point right now. The point is that people could use the brain implant and an iPhone to interact with a world they were otherwise locked out of.
The possibilities are even greater when you look at how it could pair with AI-generated personal voice clones. Apple’s Personal Voice feature lets users record a sample of their own speech so that, if they lose the ability to speak, they can generate synthetic speech that still sounds like them. It is not quite indistinguishable from the real thing, but it is close, and far more human than the robotic imitations familiar from old movies and TV shows.
Right now these voices are triggered by touch, eye tracking or other assistive input. But with BCI integration, the same people could “think” their voice into existence. They could speak just by intending to speak, and the system would do the rest. Imagine someone with ALS not only navigating their iPhone with their thoughts, but also speaking again through the same device by “typing” statements for their synthetic voice clone to say.
While it is incredible that a brain implant can let someone control a computer with their mind, AI could take it to another level. It would not only help people use tech, but also let them be themselves in a digital world.