I/O 2019 and a11y

import a11y from './images/a11y-2019.mp4';

I care deeply about building truly accessible apps, so naturally the first I/O '19 session I watched was "What's new in Android accessibility?" 🐱 Here are my key takeaways:

  • Live Transcribe 📝 is an app that adds live captions to any speech picked up by the microphone! When captions appear, the UX is impressively accessible, with high-contrast and large-text options available. Live Transcribe

  • The a11y suite now ships with captioning for videos, even in the Photos app. This runs on recurrent neural networks within your device, so your data never leaves your phone! 🤖

  • TalkBack's context menu now has a search option. This lets you jump straight to a CTA or sub-section within the view rather than having to navigate the entire screen 📺 Accessibility Suite

  • Voice Access enhancements add a dotted overlay across your screen so your device can be controlled by voice, just like using Google Assistant! 🤳 (a bit experimental, IMHO) Voice Access

<video width="30%" controls autoPlay src={a11y} type="video/mp4" />
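On the captioning theme: a web video embed like the one above can ship its own captions via a WebVTT `<track>` element, rather than relying on on-device auto-captioning. A minimal sketch in MDX, assuming a hypothetical `a11y-2019.vtt` captions file (not part of this site) next to the video:

```jsx
// The .vtt path below is a hypothetical example, not an actual asset of this post.
import captions from './images/a11y-2019.vtt';

// Using a child <track> instead of a self-closing tag; `default` shows
// the captions without the viewer having to enable them.
<video width="30%" controls autoPlay src={a11y} type="video/mp4">
  <track kind="captions" src={captions} srcLang="en" label="English" default />
</video>
```

Note the React-style attribute casing (`srcLang`), which JSX requires in place of HTML's `srclang`.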

This post is licensed under CC BY 4.0 by the author.