To build great products, product teams need a solid foundation in the mediums they work in and in how users interact with those mediums.
Designers and researchers use their knowledge of the problem space and user experience (UX) principles to develop complex workflows. Engineers use their understanding of coding for mobile devices, and how people use those devices, to build mobile apps.
Teams have intuitive knowledge based on years of designing, engineering, and testing, and use that knowledge to create interactions. They know that users can hover, click, and drag using a mouse, and that those interactions are difficult on a mobile device.
This often isn’t the case for assistive technology (AT). If team members don’t have lived experience using AT, they may not have internalized how people using it interact with digital products.
Fortunately, this problem has a simple solution: teams should use assistive technology themselves. By practicing, they can internalize how those interactions work, and take them into account when creating products.
The good news is that most devices already have assistive technology built right in, no installation required. And for the tools that do require installation, free versions are available.
Available Tools
- Screen readers
  - VoiceOver on macOS
  - VoiceOver on iOS
  - TalkBack on Android
  - NVDA for Windows (free installation)
- Voice input
  - Voice Control on macOS
  - Voice Control on iOS
  - Voice Access on Android
  - Voice Access on Windows 11
  - Windows Speech Recognition on Windows 10
  - Talon on macOS, Windows, and Linux (free installation)
- Screen magnification
  - How to zoom in on what’s on screen on macOS
  - Magnifier on iOS
  - Magnification on Android
  - Magnifier on Windows
How to Learn
It might take some time to get familiar with how these tools work, and to learn how to differentiate a good user experience from an okay one. Remember, there was a time when you didn’t know how to do anything on a computer; you gained those skills with practice over time.
Here are some pointers to help get started:
- Watch videos of people using a screen reader
  - Search for “screen reader demo” on YouTube
  - This playlist of demos from the University of Maryland is a great start
- Pick a standard web task (check your email, add a product to a shopping cart) and try doing it with different assistive technologies:
  - Increase the page zoom to 400% in your browser
  - Increase the text size via device settings
  - Use your keyboard to navigate (on macOS, this may require configuration)
  - Use a screen reader
  - Use voice input
- Go through the examples on the California School for the Blind Screen Reader Training website using VoiceOver, JAWS, ChromeVox, or NVDA
- Set up a screen reader testing environment on your computer
You will begin to develop a sense of where things are broken (a modal only seems to open with a mouse click), where there is additional and unnecessary burden (having to tab through everything on the page to reach a modal), and where things simply work (you can interact with a modal immediately after opening it).
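To make the modal example concrete, here is a minimal sketch in TypeScript of the “simply works” case. The ids, markup, and handlers are hypothetical and not taken from any particular codebase; the point is the keyboard activation and focus management.

```typescript
// Hypothetical markup: a trigger button (#open-dialog), a dialog container
// (#dialog, with role="dialog" and aria-modal="true" in the HTML), and a
// close button (#close-dialog) inside the dialog.
const openButton = document.querySelector<HTMLButtonElement>("#open-dialog");
const dialog = document.querySelector<HTMLElement>("#dialog");
const closeButton = document.querySelector<HTMLButtonElement>("#close-dialog");

openButton?.addEventListener("click", () => {
  // A native <button> fires "click" for Enter and Space as well as mouse
  // clicks, so keyboard and screen reader users can open the dialog too.
  dialog?.removeAttribute("hidden");
  // Move focus into the dialog so users can interact with it immediately,
  // instead of tabbing through everything on the page to reach it.
  closeButton?.focus();
});

closeButton?.addEventListener("click", () => {
  dialog?.setAttribute("hidden", "");
  // Return focus to the trigger so the user doesn't lose their place.
  openButton?.focus();
});
```

The broken and burdensome cases above usually come from skipping one of these steps: wiring the open action to a mouse-only event on a non-focusable element, or leaving focus behind on the page after the dialog opens.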
Note: Learning to use assistive technology is not a substitute for doing research with actual users of assistive technology.
How We Know We’re Doing This
- All our design, content, and interaction decisions are inclusive and consider assistive technology in addition to cursor and touch interaction
- Our team members are actively deliberating accessibility decisions alongside all other design decisions
- Our work is manually tested with assistive technology
- We are making product decisions for assistive technology only after team members have actually used assistive technology
How We Know We’re Coming Up Short
- Our accessibility solutions are layered into the project after initial decisions are made
- We have only a single point of view on how to support accessibility
- Our accessibility testing is done only with automated tools