Did Apple secretly build a brain-computer interface into Vision Pro?

Wallstreetcn
2023.06.06 04:32

An Apple AR neurotechnology researcher says the headset needs no brain implant to predict user behavior, and can also adjust the virtual environment based on the user's state.

With no brain implant required, has Apple AR "quietly" built in a "brain-computer interface" that infers and predicts user behavior?

Apple's AR headset has finally arrived after a long wait, igniting the market's enthusiasm. Beyond its many innovative features, its neurotechnology also deserves attention.

On Tuesday, Sterling Crispin, who has worked in the AR/VR field for ten years and has served as a neurotechnology researcher on Apple's AR project, tweeted about the neurotechnology research and development behind Apple AR.

According to Crispin, his work at Apple included supporting the foundational development of Vision Pro, mindfulness experiences, and more ambitious neurotechnology research. Broadly, much of his work involved detecting users' mental states from signals of their bodies and brains during immersive experiences.

In mixed-reality or virtual-reality experiences, AI models attempt to predict whether a user is curious, mind-wandering, afraid, attentive, recalling a past experience, or in some other cognitive state.

These states can be inferred from measurements such as eye tracking, electroencephalography (EEG), heart rate and rhythm, muscle activity, blood density in the brain, blood pressure, and skin conductance, making it possible to predict behavior.
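To make the idea concrete, here is a minimal, purely illustrative sketch of mapping a few normalized biosignal features to a coarse cognitive-state label. The feature names, thresholds, and rules are all invented for this example; a real system would use trained machine-learning models over many more signals.

```python
# Hypothetical sketch: inferring a coarse cognitive state from a few
# normalized biosignal features. All thresholds are invented for
# illustration; real systems would use trained ML models.

from dataclasses import dataclass

@dataclass
class BiosignalSample:
    pupil_dilation: float    # 0..1, normalized against the user's baseline
    heart_rate_var: float    # 0..1, higher = more relaxed autonomic state
    skin_conductance: float  # 0..1, higher = more physiological arousal

def infer_state(s: BiosignalSample) -> str:
    """Map one feature sample to a coarse cognitive-state label."""
    if s.skin_conductance > 0.7 and s.heart_rate_var < 0.3:
        return "fearful"          # high arousal, low relaxation
    if s.pupil_dilation > 0.6 and s.skin_conductance > 0.4:
        return "attentive"        # engaged, moderately aroused
    if s.pupil_dilation < 0.3 and s.skin_conductance < 0.3:
        return "mind-wandering"   # low engagement, low arousal
    return "neutral"

print(infer_state(BiosignalSample(0.8, 0.5, 0.5)))  # attentive
```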

According to Apple's patent description and Crispin's introduction, Apple's neural technology can predict user behavior and adjust virtual environments based on user states.

The coolest result was predicting what a user would click before they actually clicked it: a person's pupils often react before a click, because they expect something to happen afterwards.

By monitoring this eye behavior, the system can create biofeedback, dynamically redesigning the user interface in real time to elicit more of these anticipatory pupil responses. This amounts to a crude "brain-computer interface" achieved through the eyes, with no invasive brain surgery required.
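The pupil-anticipation idea above can be sketched as a simple detector: flag "intent to select" when pupil diameter rises above the user's recent baseline while gaze dwells on one target. This is not Apple's implementation; the window size and threshold are assumptions for illustration.

```python
# Hypothetical sketch of pupil-based click anticipation. The baseline
# window and rise threshold are illustrative numbers, not Apple's.

from collections import deque

class ClickAnticipator:
    def __init__(self, baseline_window=30, rise_threshold=0.15):
        self.history = deque(maxlen=baseline_window)  # recent pupil diameters
        self.rise_threshold = rise_threshold          # relative rise that counts

    def update(self, pupil_diameter: float, dwelling_on_target: bool) -> bool:
        """Feed one eye-tracker sample; return True if a click is anticipated."""
        if len(self.history) == self.history.maxlen:
            baseline = sum(self.history) / len(self.history)
            anticipated = (dwelling_on_target and
                           pupil_diameter > baseline * (1 + self.rise_threshold))
        else:
            anticipated = False  # not enough samples for a baseline yet
        self.history.append(pupil_diameter)
        return anticipated

anticipator = ClickAnticipator()
for _ in range(30):
    anticipator.update(3.0, dwelling_on_target=True)    # steady ~3 mm baseline
print(anticipator.update(3.6, dwelling_on_target=True)) # True: 20% dilation
```

The biofeedback loop the text describes would sit on top of this: when anticipation fires reliably for a given interface layout, keep that layout; when it doesn't, vary the interface and measure again.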

Other techniques for inferring cognitive state include flashing visual or auditory stimuli at users too quickly for them to consciously notice, then measuring their reactions.

Another patent details using machine learning together with signals from the body and brain to predict how focused or relaxed a user is, or how effectively they are learning, and then updating the virtual environment to reinforce those states.

Imagine an adaptive immersive environment that helps you learn, work, or relax by changing what you see and hear in the background.

Finally, Crispin believes that Apple AR is a promising step down the road for VR, but the industry will not fully realize this technology's grand vision until the end of the 2020s.