Hey everybody, what's up? It's Rob Dodson. Welcome back to the A11ycasts show. Today I want to start a little mini-series as part of our playlist where I go through and look at different kinds of assistive technology. In my opinion, the whole goal of accessibility is making sure that as many people as possible can access the content of your site or application using the tools that make them feel the most productive. And there's a lot of different tools that people use in order to access content on the web. They could use the keyboard and mouse. They could use just the keyboard. They could use a switch device like this fella over here. They could use voice access, right? So there's a lot of things out there, and as a developer it can be really overwhelming thinking, gosh, how am I going to support all of these different tools and their different input methods? What I think is useful is to first understand the full breadth of tools that are available to folks, and then understand how they work so you can see where their behaviors overlap. Then, as developers, we can target those overlapping spots, which maximizes our effort and makes sure we're getting as many people in the door as possible.

So what we're going to start with today is a mobile screen reader. I've got my little Pixel phone right here, and I'm going to be using the screen reader that comes on Android, which is called TalkBack. If you have an iPhone, there is a similar screen reader called VoiceOver that comes pre-installed on those phones. What I'm going to show you today is just the basics of navigating with TalkBack: how to access a page, how to work with some controls, things like that. If you're using VoiceOver, a lot of this will be pretty similar, but I also want to do a later episode and show you how to specifically use VoiceOver as well. So follow me over here to my phone and we'll get started. All right.
So over here on our device, the first thing we're going to do is go to the Settings menu, and we'll scroll down to where it says Accessibility. There are a number of options here. The first one that I recommend you turn on is this accessibility shortcut. So click on that, make sure it's turned on, and make sure that the shortcut service is set to TalkBack. What this does is it lets you hold both of the volume buttons down, and that will turn on the screen reader. It's really convenient; you don't have to go to the Settings menu every time. And then there's a lot of other things inside of here: there's text-to-speech, there's TalkBack itself (which we can turn on), font size, a bunch of other cool things. We're just going to use TalkBack today. So let's go ahead and turn that on, and you'll hear it start announcing the second I turn it on, so get ready for that. "TalkBack on. TalkBack." The first thing I always try to figure out with any screen reader is actually how to make it stop talking. The trick there is, once you're on a piece of text it's announcing, you can just tap the screen again and that'll tell TalkBack to chill out. So I'll show you an example. "Over the last..." Right. Very useful for when you're debugging and you land on a large block of text.

Now, as you may have noticed as I'm touching the screen, this is a gesture-based UI. With most desktop screen readers, there's a number of keyboard shortcuts that you can hit to move around and do special things. On the phone, we're pretty limited here, right? All we can really do is swipe and do different gestures. The nice thing is, if you don't know any of the gestures yet, you can actually just touch stuff to move around the screen. It's called explore by touch: you can touch anything on the screen that looks like content, and the screen reader will just announce it for you. So I'll show you an example. "Introduction."
"How to use Polymer with... what over the left..." Right. So that's really nice if you're just not familiar with any of the gestures yet. Now, the main gesture for navigating, though, is pretty straightforward: it's just a swipe to the right to move forward and a swipe to the left to move back. "Most frameworks, but web... most over the..." Pretty straightforward. And then if you want to scroll, you use two fingers to do the scrolling action. Lastly, the thing you probably want to know most is how to click on something. So what I'm going to do here is I'll just swipe down to an anchor, and then once that item is focused by the screen reader, we can double-tap the screen in order to click it. "But webpack... polymer-webpack-loader. Progress bar, 21 percent. GitHub, webpack-contrib, polymer-webpack-loader. 100 percent." Stop. There we go. So you see, sometimes it gets a little chatty and you've got to tap something on the screen just to quiet it down. But yeah, so we clicked the link, and if we want to go back, we can find the back button. "Back button. Progress bar, 70 percent." Double-tap that and it'll take us back to where we were before.

So you might remember, when we've used VoiceOver, the screen reader that comes with the Mac, we sometimes open up this thing called the rotor, which gives you all sorts of cool global commands to quickly jump around the page. We've got a similar thing here in TalkBack, and it requires some fancy gestures. There's a global context menu and a local context menu, and I'll show you both of those. The global context menu, you open by doing kind of an L shape, and you've got to do it really fast, so it's a little tricky; I always have to use my thumb and do a little L shape. "Alert: global context menu, showing items one to eight of eight." And you can see there's a number of things inside of here: it'll read the last item, you can jump to the top of the page, you can dim the screen.
So some interesting stuff. You can get to your TalkBack settings, which is handy, but I tend to not use this menu that often, so I'll cancel. "The most framework..." The other one, though, that's actually very useful, is the local context menu, and you open that by drawing an upside-down L. "Local context menu, showing items one to eight of eight." What this lets us do is change the navigation mode for TalkBack. So for instance, if we want to navigate by headings, or we just want to navigate by links, we can do that from this menu. So let's change it and navigate just by headings. Now, the interesting thing is it looks like nothing has really changed here, but if we start swiping left and right now, it's going to jump from heading to heading.

It can be a little annoying to have to open that context menu every time you want to change the navigation mode, so one sort of last little power-user feature: you can actually just swipe up or swipe down, and it will cycle through the different navigation modes. So if I wanted to navigate just by links, I could just swipe down. You can see it's just navigating through some of these links at the bottom of this iframe. Or, if I want, I can swipe back up to go back to headings, or I can go all the way up and go back to the default navigation mode. "Headings. Landmarks. Default." And now we're back to navigating as we were before, so it'll try to go from paragraphs to links to major landmark regions and be sort of smart as it does so.

Now, because this is a gesture-based screen reader, and because we're working with a much smaller viewport on a mobile device, there are some limitations and some undesirable behaviors that you will see as you're navigating the mobile web. And I want to point some of those out, because they're really important for developers to understand. So over here, I've got kind of your classic responsive website, and I've got the hamburger menu up here on the top left.
We can click on that, because I want you to see the content that's off screen right now. So you can see here I've got a bunch of links in my off-screen menu. And what I want to do next is actually navigate with the screen reader, and I want you to listen to what is announced as I'm swiping. It's actually reading the off-screen content there, because all that stuff is still in the DOM. It's just not in the viewport right now, but the screen reader doesn't really know that. So it's just reading through everything, assuming that the user should be able to access it right now. As a developer, it's really important for you to understand that, because you might think, "Oh well, I've got a smaller viewport, I'll put modal windows and other things off screen." But if you don't mark those as display: none, if you don't add aria-hidden and maybe add tabindex="-1" to the interactive controls, then those items will actually still be accessible to the user, and they might be able to click on them in a state where you don't expect it. So that's something to definitely watch out for, and it's a good reason to use the inert polyfill, which I can link to down in the show notes and which we've talked about in previous episodes.

The other thing to keep an eye out for is unique control types. So I've got an input type="slider" over here... or sorry, input type="range". This is the native one, and I want you to listen to how the screen reader announces it. "50 percent. Slider. Use volume keys to adjust." Right, so it told me I can hit the up and down volume keys, as you can see here, to adjust it. But let's say I create my own slider using ARIA. So below I have this div that says "I'm a slider," and it has role="slider" on it. Listen to the way the screen reader announces this. "50 percent. Slider. Use volume keys to adjust." So you heard there, the phone instructed us to use the volume keys to update the slider.
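As a quick aside before we dig into the slider problem: the hiding advice for that off-screen menu can be sketched in code. This is a minimal sketch, not code from the episode; the drawer markup (shown in the comments) and all the names are made up for illustration.

```javascript
// Hypothetical off-screen drawer markup:
//   <nav id="drawer" aria-hidden="true">
//     <a href="/home" tabindex="-1">Home</a>
//     <a href="/about" tabindex="-1">About</a>
//   </nav>
//
// While the drawer is closed, aria-hidden="true" keeps the screen reader
// from reading it, and tabindex="-1" keeps its links out of focus order.

// Pure helper: which attribute values apply for a given open state.
function drawerAttributes(open) {
  return {
    'aria-hidden': open ? 'false' : 'true',
    linkTabindex: open ? null : '-1', // null = remove the attribute
  };
}

// DOM wiring: call this from the hamburger button's click handler.
function setDrawerOpen(drawer, open) {
  const attrs = drawerAttributes(open);
  drawer.setAttribute('aria-hidden', attrs['aria-hidden']);
  drawer.querySelectorAll('a').forEach((link) => {
    if (attrs.linkTabindex === null) link.removeAttribute('tabindex');
    else link.setAttribute('tabindex', attrs.linkTabindex);
  });
}
```

The inert polyfill mentioned in the episode bundles this up for you: marking the drawer inert hides it from assistive technology and removes its contents from the tab order in one go.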
The problem is, if I hit the volume keys on this phone, those events will not be passed through to my custom ARIA slider that I've built. There's really no way for us to access those events until we have something like the Accessibility Object Model, which I can link to down in the show notes. That's kind of an emerging standard that we're working on, and part of the Accessibility Object Model will actually let you access those accessibility events. But today, without access to them, there are certain control types I really can't replicate that well on mobile, and this is one of them. You might think, "Ah well, I could use touchstart and touchend and implement a custom gesture or something like that." The problem there is that the screen reader itself is a gesture-based UI, and so any gesture that you do will actually be intercepted by the screen reader. There is this notion of pass-through gestures, where you can double-tap and long-hold the second press to do a gesture and kind of drag around, and that maybe could work, but you're relying on the user being a power user who knows every feature of their screen reader. It's not what the screen reader instructed me to do just then, so I might not know to actually do that. Instead of doing your own gesture UI, a suggestion from developer Patrick Lauke is that you could actually just add plus and minus buttons next to your slider, because we know clicking works. That way the user could navigate to those and increment or decrement the value. So sometimes you have to get a little creative, right? Depending on the control type that you're building, make sure you're testing it on mobile to see if you can get the same experience using a mobile screen reader as you would with a desktop screen reader. So anyway, yeah, that about covers it for today. That's the basics of using TalkBack on your Android device.
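Before we wrap up completely, here's what that plus-and-minus-buttons idea might look like. Again, this is a hedged sketch, not code from the episode; the markup in the comments and all the names are hypothetical.

```javascript
// Sketch of the +/- button workaround for a custom slider. Clicking
// (double-tap under TalkBack) works reliably even when volume-key
// events never reach the page. Assumed markup:
//   <div id="mySlider" role="slider" tabindex="0"
//        aria-valuemin="0" aria-valuemax="100"
//        aria-valuenow="50">I'm a slider</div>
//   <button id="dec">-</button> <button id="inc">+</button>

// Pure stepping logic: keep the value inside [min, max].
function clampStep(value, step, min, max) {
  return Math.min(max, Math.max(min, value + step));
}

// DOM wiring: each button nudges aria-valuenow, which is the value
// the screen reader announces for a role="slider" element.
function attachSliderButtons(slider, decBtn, incBtn, step = 10) {
  const move = (delta) => {
    const min = Number(slider.getAttribute('aria-valuemin'));
    const max = Number(slider.getAttribute('aria-valuemax'));
    const now = Number(slider.getAttribute('aria-valuenow'));
    slider.setAttribute('aria-valuenow', String(clampStep(now, delta, min, max)));
  };
  decBtn.addEventListener('click', () => move(-step));
  incBtn.addEventListener('click', () => move(step));
}
```

The point is that a plain button gives the screen reader user an interaction they already know, instead of relying on events the page can't receive.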
I'm going to do a follow-up on how to use VoiceOver for your iPhone users out there. But if you have any questions, you can leave them for me below in the comments or hit me up on a social network of your choosing. As always, thank you so much for watching, and I'll see you next time.