This New Apple Feature Will Let You Control iPad and iPhone With Just Your Eyes

Apple continues to push the boundaries of inclusive design with a suite of impressive new accessibility features set to roll out later this year. Among them is Eye Tracking, which lets users control iPad and iPhone with just their eyes. These features, powered by Apple silicon and on-device artificial intelligence, aim to empower users of all abilities to connect, create, and navigate the world around them.

Eye Tracking: Control Your Devices with Your Gaze

One of the most innovative additions is Eye Tracking. Designed for users with physical disabilities, the feature lets them control their iPad or iPhone using only their eyes. It relies on the front-facing camera and on-device AI to set up and calibrate in seconds, and all data used for the feature stays on the device, keeping the experience seamless and secure.

Eye Tracking lets users navigate apps, activate elements with Dwell Control, and perform actions such as button presses, swipes, and other gestures, all without physical touch. This opens up a world of possibilities for users with limited mobility.

Key Benefits of Eye Tracking

  • Simple Setup: Users can set up and calibrate Eye Tracking within seconds.
  • No Extra Hardware: This feature works across iOS and iPadOS without needing additional accessories.
  • Dwell Control: Users can interact with apps by holding their gaze on an element to activate it, including buttons, swipes, and other gestures.

Music Haptics: Feel the Rhythm

For users who are deaf or hard of hearing, Apple introduces Music Haptics. This feature utilizes the iPhone’s Taptic Engine to translate music into a series of taps, textures, and vibrations, creating a unique tactile experience. Music Haptics works across millions of songs in the Apple Music library and will be available as an API for developers, enhancing accessibility in third-party apps.
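
Apple hasn't shared details of that developer API in this announcement, but the existing Core Haptics framework already illustrates how a rhythm can be rendered as taps and textures on the Taptic Engine. The sketch below is purely illustrative and not the Music Haptics API itself: it assumes you already have a list of beat timestamps and plays a sharp tap for each one over a soft continuous texture.

```swift
import CoreHaptics

// Illustrative sketch only: this uses Core Haptics (not the announced
// Music Haptics API) to show how beats can be felt as taps and textures.
final class BeatHapticsPlayer {
    private var engine: CHHapticEngine?

    init() throws {
        // Only create the engine on devices with a Taptic Engine.
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
        engine = try CHHapticEngine()
        try engine?.start()
    }

    /// Plays a sharp transient tap at each beat time (in seconds),
    /// layered over a low-intensity continuous "texture".
    func play(beatTimes: [TimeInterval]) throws {
        guard let engine else { return }

        var events: [CHHapticEvent] = beatTimes.map { time in
            CHHapticEvent(
                eventType: .hapticTransient,
                parameters: [
                    CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                    CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
                ],
                relativeTime: time
            )
        }

        // Soft background vibration running underneath the taps.
        events.append(
            CHHapticEvent(
                eventType: .hapticContinuous,
                parameters: [
                    CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.3),
                    CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.2)
                ],
                relativeTime: 0,
                duration: (beatTimes.last ?? 0) + 1.0
            )
        )

        let pattern = try CHHapticPattern(events: events, parameters: [])
        let player = try engine.makePlayer(with: pattern)
        try player.start(atTime: CHHapticTimeImmediate)
    }
}
```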

Enhanced Speech Features for Diverse Needs

Apple is also introducing several features that cater to a wide range of speech needs. Vocal Shortcuts allows users to assign custom phrases that Siri can recognize to activate shortcuts and perform complex tasks.
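
Apple hasn't published developer documentation for Vocal Shortcuts in this announcement, but the actions that shortcuts (and therefore custom spoken phrases) can trigger are typically exposed by apps through the existing App Intents framework. The minimal sketch below is only an assumption about how an app might surface such an action, and the intent name is made up; once an action appears in Shortcuts, a user could map a custom utterance to it.

```swift
import AppIntents

// Hypothetical example: "SendDailyCheckInIntent" is a made-up action name.
// Exposing it as an App Intent makes it available to Shortcuts and Siri,
// the kind of action a user could launch with a custom Vocal Shortcut.
struct SendDailyCheckInIntent: AppIntent {
    static var title: LocalizedStringResource = "Send Daily Check-In"
    static var description = IntentDescription("Sends your daily check-in message.")

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Real app logic (for example, sending the message) would go here.
        return .result(dialog: "Your daily check-in has been sent.")
    }
}
```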

Listen for Atypical Speech uses on-device machine learning to adapt to diverse speech patterns, particularly beneficial for individuals with conditions like cerebral palsy or ALS. These features build upon existing tools in iOS 17, providing more control and customization for users who are nonspeaking or at risk of losing their speech.

Vehicle Motion Cues: Combat Motion Sickness

Apple addresses the issue of motion sickness with Vehicle Motion Cues. This feature uses on-screen animations to represent changes in vehicle movement, minimizing sensory conflict and reducing discomfort for passengers. The feature utilizes built-in sensors to detect motion and can be set to activate automatically or through the Control Center.

Accessibility Enhancements for CarPlay and visionOS

Accessibility improvements extend to CarPlay with the addition of Voice Control, Color Filters, and Sound Recognition. These features allow for voice-activated navigation, visual customization, and alerts for car horns and sirens, catering to a broader range of users.

visionOS, Apple’s new spatial computing platform, will also include a suite of accessibility features. Live Captions, for instance, will be available system-wide, allowing users to follow dialogue in live conversations and audio from apps. Vision Pro users will also benefit from enhanced caption control, support for additional hearing devices, and features like Reduce Transparency and Dim Flashing Lights for users with low vision.

A Commitment to Inclusive Design

These updates are just a fraction of the accessibility improvements coming to Apple’s ecosystem. From new voices for VoiceOver to a redesigned Magnifier experience, Apple continues to refine and expand its accessibility toolkit.

These innovations highlight Apple’s understanding of the real-world challenges faced by people with disabilities. The ability to control iPad and iPhone with just your eyes is not only a technological feat but a life-changing tool for individuals with physical disabilities. Similarly, Music Haptics opens up the world of music to users who are deaf or hard of hearing, letting them experience songs in a way they couldn’t before.

These advancements are more than technical updates; they are a testament to Apple’s mission to empower all users. With these new features, Apple is not just enhancing its devices but also making a profound impact on the lives of millions around the world.
