AI-Activated Eye Tracking Feature is Coming to iPhone and iPad



Apple won't announce iOS 18 and iPadOS 18 until WWDC 2024 next month, but the company is already previewing a few new accessibility features that are likely to be included in these major updates. The suite of new tools includes an eye tracking capability and other life-enhancing features coming to iPhones and iPads later this year.

Eye tracking for iPhones and iPads
Apple is taking a page from the eye tracking technology used in the Vision Pro and bringing it to iPhones and iPads. The company says the feature, which uses the front-facing camera, allows a user to control an iPhone or iPad simply by looking at a specific area of a page or app.
Apple adds that no additional hardware accessories are needed to enable eye tracking on these devices. However, the feature will require a quick setup and calibration, and it will rely on on-device machine learning. The company also notes that Eye Tracking works across iOS and iPadOS apps.

Apple’s Eye Tracking feature for iPhone and iPad uses machine learning / © Apple

This is actually not a new concept. Prominent manufacturers like Samsung and LG have already added eye tracking technologies to their smartphones, although those were limited to certain functions like browsing and video playback.
Listen through vibration
Another accessibility feature is Music Haptics, which is set to aid people with hearing disabilities. When enabled, the iPhone's vibration motor will produce varying vibrations and taps based on the tune of the song or track currently playing.
Music Haptics will be compatible with the millions of songs in the Apple Music app, and Apple intends to release an API later this year so developers can bring the feature to third-party apps.
No more motion sickness
If you're someone who uses their device when commuting or traveling in a vehicle, the new Vehicle Motion Cues could be a great help in reducing your motion sickness.
As the name implies, it will assist users in a moving vehicle by displaying animated dots along the edges of an iPhone or iPad's screen. The motion of the dots will correspond to the direction and movement of the vehicle as detected by the device. Apple says these will "reduce sensory conflict," which usually results in motion sickness for passengers.

Apple’s new accessibility feature for iPhone and iPad adds Vehicle Motion Cues to prevent motion sickness / © Apple

Speech enhancements
Apple is also tapping AI for new speech enhancements in iOS 18. These include Vocal Shortcuts, which lets users assign custom utterances to improve voice control with Siri, while the new Listen for Atypical Speech will use machine learning to better recognize atypical speech patterns. Both are designed primarily for users whose speech is affected by conditions like stroke and ALS.

Voice control and sound recognition for CarPlay
Two of the key accessibility features on iPhones and iPads are coming to CarPlay as well. Voice Control will give riders hands-free control over apps and features in CarPlay, while Sound Recognition will generate visual alerts on the car's display for car horns and sirens.
Beyond these changes, other improvements are planned through software updates, such as a Reader Mode for Magnifier and support for launching Detection Mode via the Action Button. There is also a new Virtual Trackpad coming to the AssistiveTouch feature.
Separately, the Vision Pro is getting Live Captions, which will display real-time transcriptions of dialogue in audio apps and during FaceTime calls.


Many of these new accessibility tools are expected to arrive with the iOS 18 and iPadOS 18 updates shipping to compatible iPhones and iPads in the fall. Which of these additions is your favorite, and which do you think is the most essential?
