Notes on: A Deep Dive into Apple's Industry-Leading Screen-Reader, VoiceOver
- John Walker

- Dec 8, 2022
- 2 min read
A 12/08/2022 webinar by Sight Tech Global, view on YouTube
Moderated by Matthew Panzarino, with speakers Dean Hudson and Areeba Kamal, both Apple employees
Sight Tech Global's synopsis: Learn how Apple's Accessibility team is harnessing innovations across hardware, software, and machine learning to support users who are blind or low-vision. Hear what’s new in VoiceOver, and other vision accessibility tools across Apple devices.
What's new in vision accessibility: Magnifier app update for iOS devices with LiDAR: Detection Mode
Door Detection: Can tell users how many doors there are, read out their text and symbols, say whether they are open or closed (and how to open them), and several other pertinent details
Combine with People Detection and Image Descriptions for more complete, dimensional experiences
VoiceOver: 20+ new languages; new customization options such as voices; Text Checker to help catch typos that are typically only noticed visually; and more
Since the iOS ecosystem is so integrated, there are great possibilities in combining hardware, software, and machine learning (ML)
Areeba says Apple considers a11y to be a core value built into everything from the beginning--built in, not purchased separately or added on later
Apple's teams include folks with disabilities who help create and test software and products
Dean, who is blind, points out how unusual it is that he can go to the Apple Store and test all Apple products with no special accommodations
Having folks with disabilities on the teams really helps them focus on improving efficiency for small but vital functions
72% of blind people use iPhones (though exclusively? Or alongside Android?)
ML is combined with on-device sensors to give blind users a more dimensional experience
ML is used to fill in missing bits of information--e.g., to describe an image that doesn't have alt text (see the sketch below)
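To make the alt-text point concrete, here's a minimal sketch (the view controller and image names are hypothetical, not from the talk): when a developer supplies an accessibilityLabel, VoiceOver reads that text; when one is missing, VoiceOver can fall back on on-device ML to generate a description.

```swift
import UIKit

// Hypothetical example of supplying alt text explicitly so VoiceOver
// doesn't have to guess at the image's contents.
final class PhotoViewController: UIViewController {
    private let imageView = UIImageView(image: UIImage(named: "golden-gate"))

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(imageView)

        // Mark the image as an accessibility element and describe it;
        // VoiceOver reads this label instead of an ML-generated guess.
        imageView.isAccessibilityElement = true
        imageView.accessibilityLabel = "Golden Gate Bridge at sunset, viewed from Baker Beach"
    }
}
```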
Live Captions, Voice Control, even Photos also use ML
Back Tap: Set up a shortcut using Back Tap under Accessibility settings to assign custom actions to double- and triple-tapping the back of an iPhone. Originated for users with motor disabilities but works for everyone
But Apple focuses on users with disabilities for their accessibility features--any benefit to able-bodied users is a happy side effect but not the goal
Devs also have specific a11y tools
Unity plugin: Unity is a popular game engine. Apple put a lot of effort into the plugin so that devs can ensure games are accessible as they're being built
But all of Apple's own frameworks, including SwiftUI, are accessible by default (see the sketch below)
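A minimal sketch of what "accessible by default" means in practice (the view and its contents are hypothetical): standard SwiftUI controls come with VoiceOver support for free, and an explicit label is only needed where the visuals don't speak for themselves.

```swift
import SwiftUI

struct SettingsRow: View {
    @State private var notificationsOn = true

    var body: some View {
        HStack {
            // Accessible out of the box: VoiceOver announces the toggle's
            // label and on/off state with no extra code.
            Toggle("Notifications", isOn: $notificationsOn)

            // An icon-only button needs an explicit label to be meaningful.
            Button(action: { /* open help */ }) {
                Image(systemName: "questionmark.circle")
            }
            .accessibilityLabel("Help")
        }
    }
}
```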
The Accessibility Inspector in Xcode lets devs audit and debug custom code (like the sketch below)
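For a sense of the "custom code" that needs auditing, here's a hypothetical custom control (not from the talk; drawing code omitted for brevity): because it doesn't use standard controls, VoiceOver knows nothing about it until its accessibility attributes are set explicitly--exactly what the Inspector helps verify.

```swift
import UIKit

// A custom-drawn rating control. The Inspector checks that elements like
// this expose a label, value, and traits to VoiceOver.
final class StarRatingView: UIView {
    var rating = 3 {
        didSet {
            accessibilityValue = "\(rating) of 5 stars"
            setNeedsDisplay()
        }
    }

    override init(frame: CGRect) {
        super.init(frame: frame)
        isAccessibilityElement = true
        accessibilityLabel = "Rating"
        accessibilityValue = "\(rating) of 5 stars"
        accessibilityTraits = .adjustable
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    // The .adjustable trait lets VoiceOver users swipe up/down to change
    // the value via these overrides.
    override func accessibilityIncrement() { rating = min(rating + 1, 5) }
    override func accessibilityDecrement() { rating = max(rating - 1, 0) }
}
```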
Coming next: More cutting-edge tech integrations which they can't discuss; making the existing experiences even easier
Also: Apple recently released a great ad around accessibility, which also demonstrates a really nice use of audio descriptions: https://www.youtube.com/watch?v=tVErGewfgdg
Note how well the closed-caption read-out has been paced to also leave room for the audio descriptions--a lot of forethought went into that!

