Apple Announces New AI- and ML-Powered Eye Tracking

On Wednesday, Apple announced a set of new accessibility features for iPhone and iPad. The company regularly releases accessibility features to help people with physical disabilities use its devices more easily. This year, it is introducing Eye Tracking, which lets users operate their devices with eye movements alone, along with Vocal Shortcuts, which lets users perform tasks with custom sounds, and Music Haptics, which lets users experience music through vibrations.

The company announced the features in a post on its newsroom. Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives, said, “Each year, we break new ground when it comes to accessibility. These new features will make an impact in the lives of a wide range of users, providing new ways to communicate, control their devices, and move through the world.”

First, the Eye Tracking feature gives users a built-in way to operate iPhone and iPad with their eyes alone. Powered by artificial intelligence (AI), it uses the front-facing camera, which can be calibrated to the user’s eyes, together with on-device machine learning (ML), making the devices easier to control for people with physical disabilities. Because the processing stays on the device, Apple says it cannot access the user’s data.
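The article does not mention a developer-facing API for the new Eye Tracking feature, so the sketch below is only an illustration of how on-device gaze estimation with the front-facing camera already works on iOS: it uses ARKit’s existing face tracking, which exposes a lookAtPoint gaze estimate. The class name GazeReader is a made-up placeholder, and this is not Apple’s accessibility implementation.

```swift
import ARKit

// Illustrative only: ARKit face tracking runs on-device and exposes a gaze estimate.
final class GazeReader: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Face tracking requires a device with a TrueDepth front camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Called whenever ARKit updates its anchors.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // lookAtPoint is ARKit's estimate of where the user's eyes converge,
            // expressed in the face anchor's coordinate space.
            let gaze = face.lookAtPoint
            print("Gaze estimate: \(gaze.x), \(gaze.y), \(gaze.z)")
        }
    }
}
```

Like the accessibility feature Apple describes, ARKit’s face-tracking data in this path is processed on the device rather than sent to Apple.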

Another new feature, Music Haptics, gives users with hearing difficulties a new way to experience music. On iPhone, it uses the Taptic Engine to sync taps, textures, and vibrations to a song’s audio. Apple says the feature works with millions of songs in the Apple Music library, and developers will be able to integrate it into their own music apps through an API that will be made public.
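The Music Haptics API itself has not been published yet, so as a rough sketch of how an iOS app drives the Taptic Engine today, here is a minimal example using the existing Core Haptics framework. The function name and the event’s intensity, sharpness, and duration are arbitrary values chosen for illustration.

```swift
import CoreHaptics

// Plays a single one-second vibration on devices that support Core Haptics.
func playSampleHaptic() throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // One continuous haptic event; intensity and sharpness are example values.
    let event = CHHapticEvent(
        eventType: .hapticContinuous,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.8),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.4)
        ],
        relativeTime: 0,
        duration: 1.0
    )

    let pattern = try CHHapticPattern(events: [event], parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```

A Music Haptics-style experience would presumably generate many such events in time with a song’s audio, but the public API for that is what Apple says is still to come.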

Next, Vocal Shortcuts, a new feature for iPhone and iPad, is aimed at people with speech-related disabilities. It lets users teach Siri custom utterances that launch shortcuts and complete tasks; a brief developer-side sketch of how such shortcuts are defined follows below. Another new feature, Vehicle Motion Cues, adds animated dots to the edges of the screen to reduce the sensory conflict between what a person sees and what they feel. Citing research, Apple said this conflict is one of the main causes of motion sickness and that the feature can help lessen its effects.
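On the developer side, the shortcuts Siri can trigger (and that a Vocal Shortcut could therefore invoke) are commonly exposed with Apple’s existing App Intents framework. The sketch below assumes a hypothetical “Open Favorites” action and made-up type names; it shows the general pattern for a Siri-runnable shortcut, not anything specific to the new Vocal Shortcuts feature.

```swift
import AppIntents

// A hypothetical app action that Siri (and thus a Vocal Shortcut) could trigger.
struct OpenFavoritesIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Favorites"

    func perform() async throws -> some IntentResult {
        // The app's real work (e.g. navigating to a favorites screen) would go here.
        return .result()
    }
}

// Registers a spoken phrase so the intent appears as a Siri shortcut.
struct ExampleShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenFavoritesIntent(),
            phrases: ["Open favorites in \(.applicationName)"],
            shortTitle: "Open Favorites",
            systemImageName: "star"
        )
    }
}
```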

In addition, CarPlay is gaining Voice Control, Sound Recognition, and Color Filters to support people with a range of disabilities. For users with hearing difficulties, Apple’s newest product, the Vision Pro, will also get a live-captioning capability that works across the entire system.
