Thanks, everyone. I'm sorry I don't have any slides. If that bothers you, I would suggest closing your eyes until the demo part. So seriously, close your eyes, try it. OK. My name is Austin Seraphin, and I've been blind since birth. I was born in 1977. I started programming when I was seven, on an Apple IIe, and that was the first computer that had a screen reader. That's how a blind person uses a computer: with a program that reads the screen, usually with synthesized speech. After the Apple II, in the 90s, I moved to MS-DOS. I used that for a while. Then I used Windows, unfortunately. After one Windows freak-out, I couldn't do it anymore. I switched to Linux, which has a screen reader built right into the kernel, which is nice. Then in 2010, I got an iPhone, and that was really a turning point for me. All of Apple's products have accessibility built in, including a screen reader called VoiceOver and screen magnification software called Zoom. After the iPhone, I got two Macs and an iPad. So the iPhone changed my universe as soon as it entered it.

The blind usually get nothing from companies, so we felt really skeptical in the blind community when Apple announced that the next iPhone, or iOS, would have VoiceOver. We just kind of made fun of it. We were just like, this is stupid. How could a blind person even use a touchscreen? They're just doing this. And then a friend of mine got one and just started going nuts about it. She called me and was just like, this is the greatest thing. And I was like, come on. We were just making fun of this like three days ago, really. And she was like, yeah, yeah, this is so great. I was like, what about the touchscreen? Oh, the touchscreen's awesome. This is the next big thing for the blind. So I went to the AT&T store with my mom to check it out. Being a nerd, I read the manual the night before, so I had an idea of what to expect. And I began using it immediately, right in the store.
I checked the weather and the Stocks app, which renders stock charts. And that's the little feature that sold me, because it showed me Apple's attention to detail: from the first time, I could read a stock chart. So we brought it up to the front, and my mom asked the guy at the AT&T store if I could get a text message on it. And he said, I don't know. I don't think so. And that's kind of the problem. At a lot of these AT&T stores and Apple stores, the employees will know that these features exist, but they might only know how to turn them on, and that's about it. They don't really know their capabilities. So don't believe it when they say something's impossible, because it's probably not. And so I said, whatever, there's only one way to find out. So my mom sent me a text, and it came right through. It said, hi Austin, and everyone felt moved, even the guy at the AT&T store.

The iPhone opened up my world. The thing to understand about the iPhone is that it makes so many other assistive devices obsolete. There's a whole bunch of devices made just for the blind. They have a limited market, so they're expensive, and they're often not that well built or supported. And so I brought my iPhone home, and I was excited, because now I could run apps. Everyone was talking about apps. No one told me which apps. They were just talking about apps. So I was like, okay. So I was thinking about these different devices, and I thought about, for instance, a device called a color identifier. They cost $200, the price of a subsidized iPhone, and you point it at an object and it tells you the object's color. So I typed "color identifier" into the App Store, and an app came right up called Color ID, for $2. I downloaded it, I installed it, and my heart was just so big. I was like, this really is the next big thing. This is so awesome. I ran it, and it just kept saying black, black, black. And my heart sunk, and I was just like, once again, harsh physical reality has reasserted itself and slapped me in the face.
This isn't the next big thing, whatever. But before I turned it off in frustration, I thought for a second. I thought, well, wait a minute. It's two o'clock in the morning. Yeah, you can all tell where this is headed. It's pitch black out. I had no lights on, because I don't need to. And yeah, I forgot: sighted people need light to see color. So I turned on the light, and out came the colors, and it was amazing, and that's when I knew that the iPhone really was a magical device. An interesting thing about that app is that he didn't even write it for the blind, and that's the advantage to building in accessibility like that: he had no idea.

There are a lot of examples of this. I'll give a few more. For instance, there's a money reader. I believe it was Jack Hamlin earlier who was saying how here in America all of our bills are the same size. We don't demarcate them in any tactile way. So the blind have to fold our currency, and people are like, oh, it's so great that you do that. I'm like, no, it's not great. It's stupid and annoying. And so there is a device that costs $100. You put money in it, and it'll tell you the denomination of the currency. But you could spend $10 or less on an app to do it. The one I like costs $10. You just hold your money under the iPhone and it tells you what it is. There's object recognition. There are devices for $1,200. You take a picture of a box or a can or whatever, and it identifies the object for you. Or there's an app called TapTapSee. That one is free, and then you pay a reasonable subscription, and it identifies the object. There's GPS, for instance. There are GPS devices that can cost like $800. There are a bunch of good GPS apps now for the blind. I really like one called BlindSquare. It taps into Foursquare's database of locations to give you crowdsourced location information. It'll tell you things that are around you. Scientific calculator: a talking scientific calculator costs $250.
There's a $5 app that'll do it, and of course Apple has a calculator built in. There's an alarm clock: $40 for a talking alarm clock. Of course, that's built in. There are devices called note takers that cost $1,000 and just do the basic functions: note taking, address book, things like that. For $1,000. The iPhone has all that built in, and with a much better web browser. The web browsers on these note takers are pathetic. They're terrible. So if you add all that stuff up, that'd be about $3,600 for assistive devices versus $232 for an iPhone and some apps. We can splurge a little. Let's throw in a Bluetooth keyboard. Call it an even $300. You'd spend 12 times more on assistive devices than you would for an iPhone and a handful of apps. That's the difference the iPhone makes for the blind. That's why accessibility is so important.

And accessibility helps everyone. There are lots of great examples of accessibility going mainstream. For instance, Siri. I always laugh inside now when people are like, oh, it's great. I use Siri and it talks to me. It reads me things. I'm like, yeah, it is great. It's how I do everything. Audiobooks are a cool example. The blind have been reading books this way for decades, and I'm sure some marketer at some point was like, hey, we could sell these to sighted people, and now they're ubiquitous. Even little things, like reading a menu at a restaurant, things sighted people take for granted, we cannot do. And accessible apps are just better for everyone.

So I want to quickly demonstrate VoiceOver. VoiceOver is really cool. It actually supports three types of output. It has synthesized speech, it has Braille displays, and also switch controls for people with motor difficulties. And so that means that as a developer, by just learning one set of APIs, you can cater to three totally different groups of disabilities: the blind, with speech, and they use Braille too; the deaf-blind, who can't hear the speech, so they have to use Braille;
and also people with motor difficulties, with the switch-based controls. To turn on VoiceOver, you go to Settings, General, Accessibility, and you can turn it on there. Siri can do it too. Don't do this until you've learned the gestures and what you're doing. I've had people email me and be like, I heard your talk, and then I turned on VoiceOver, and I can't use my phone now. So you might want to read up a little.

The cool thing about VoiceOver, and what they've done on the iPhone, is that it combines two different paradigms. Think about the job of a screen reader. A screen is a two-dimensional output: it's got rows and columns, width and height. Speech, though, is a one-dimensional output. When you're listening to me, you have to listen from the beginning to the end of what I'm saying. And so the task of a screen reader, then, is to take a two-dimensional output and render it as a one-dimensional output. And Apple has two different ways of doing this.

So let me just turn on my iPhone. I have my speech set fast; here's how I have it. All right. So let me turn this down a little for you. There it is. Speech rate? 60%. 50%. Unlock button. You all hear that? Unlock button. There's Twitter. See? You can hear all the things on my lock screen, for instance. So there are two ways of browsing the screen. The two-dimensional paradigm is that you can move your finger around the screen, and VoiceOver will read you the elements under your finger. And that's nice because it gives you spatial relations. For a long time, you know, if a sighted person would say, oh, you know, tap this button in the upper left-hand corner or whatever, a blind person would say, well, that doesn't mean anything to me. That's not how I use a computer. So for the first time, we have that. The other way is the one-dimensional view, and that's by swiping. If you swipe right with one finger, you go to the next element.
Swipe left to go to the previous element. Thank you, Colin. All right, that covers the two-dimensional part, very good. So you can swipe to the next and previous element. You can swipe up and down and use the rotor for finer resolution: by character, word, heading, things like that. You double-tap, by the way: you can double-tap anywhere on the screen to activate the element in VoiceOver focus. So for instance, if it's focused on the unlock button, I can double-tap anywhere, and it'll unlock my iPhone. So there are some other gestures too. You can scroll with three fingers. There are a bunch of other ones you can read about as well.

So for a developer, what this means is that the standard Cocoa elements work by default. If you use them and you label your controls, you've done 90% of the work in a lot of cases. And Apple has some really good accessibility guidelines as well.

As soon as I had my Color ID experience, I realized, of course, that the next thing for me to do, being a programmer, was to start programming this thing. Now, of course, I tried the standard Xcode and Objective-C way. And I know there's been a lot of Xcode bashing at this conference so far, and that's cool. I feel like I have a special right to bash it, though, because if you think Xcode's confusing, try closing your eyes and using it with VoiceOver. You don't even want to know. You don't want to know. The joke I like to make when I promote RubyMotion to people is I say, yes, it's a commercial product. It costs $200, but I'd spend that on beer if I had to use Xcode and Objective-C every day. So I think it pays for itself. And of course, RubyMotion is command-line oriented. You can use any editor you want. We all know this. And this is extra good for the blind. Give me Vim and Ruby in the terminal any day. Interface Builder has no accessibility either, so I had to learn how to build views programmatically. Fortunately, that's exactly what we do in RubyMotion.
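To make that concrete, here's a minimal sketch of building and labeling views programmatically in RubyMotion. This won't run outside the RubyMotion toolchain, and the class name, frames, and label strings are invented for illustration; only the UIKit calls and accessibility properties are the real APIs being discussed.

```ruby
# Sketch: labeling controls built programmatically in RubyMotion.
class UpdateController < UIViewController
  def viewDidLoad
    super
    button = UIButton.buttonWithType(UIButtonTypeSystem)
    button.frame = [[20, 80], [280, 44]]
    # Giving a standard button a title also gives VoiceOver
    # something to read -- standard elements work by default.
    button.setTitle("Update", forState: UIControlStateNormal)
    view.addSubview(button)

    icon = UIImageView.alloc.initWithImage(UIImage.imageNamed("refresh"))
    icon.frame = [[20, 140], [44, 44]]
    # An image has nothing readable by default, so label it explicitly.
    icon.isAccessibilityElement = true
    icon.accessibilityLabel = "Refresh"
    view.addSubview(icon)
  end
end
```

That one explicit `accessibilityLabel` is often the whole difference between an element VoiceOver can announce and one it skips.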
So after I got RubyMotion, I realized that there were some things I could do to help make it even more accessible and better. And that's why I started writing motion-accessibility, which is what I'm going to demonstrate for you now. First, I wrote some Ruby wrappers around Apple's UIAccessibility protocols. And then I started trying to write some simple apps, and I realized a problem. It was something I was warned about. When I first contacted Laurent about RubyMotion to ask if I could use it, he said, yeah, I think you can. I'm just worried about the iOS Simulator. And I said, well, whatever, we'll cross that bridge when we come to it. And the iOS Simulator, it's true, doesn't work very well with VoiceOver. So what I've done is made the motion-accessibility console, and that's what I want to demonstrate to you first.

All right, are we good? Are you seeing this stupid little demo app I made? Good? OK. So we've got a label, we've got a text field, and we've got an update button. You fill in text in the text field, press the update button, and it updates the label. Very simple. So let's see how that renders in text. The first command is browse, or b. You can see... come on, VoiceOver, I'm trying to make you look good here. All right. So you see there is a UIView with three subviews. There's a UINavigationBar and a UITabBar. So let's browse into the UIView. And here's our UILabel, here's our text field, and there's the update button. So we can say touch 2, and then pass a string as an argument; since it's a text field, it's just expecting a string. Hello, San Francisco. You can see that's updated there in the simulator. Why is my simulator broken? What? Command-1, Command-2. All right. This is the right crowd for this event. OK. So you can see that that's updated. And now we're going to do the update button. Now, the update button has what's called an accessibility label, and I'll tell you more about that in a sec. But the upside of this is I can say touch update.
And there it is. It's updated the label. So it's just a little demo, but it shows you the basic workflow. The next thing I want to show you is the accessibility inspector, and the way you bring that up is by browsing a view without any subviews. Oh, another browse command I should tell you about is view, or v. That just returns the view that you're currently browsing. So I could also say accessibility, which is often shortened to a11y, and inspect the view.

So let's go through a few of these attributes. And by the way, you set these attributes in one of two ways. You can define methods in your class. For instance, you'd say def accessibilityLabel. If you're using RubyMotion, you can use camel case or snake case, whatever floats your boat. So you can either define your method, or, if you've got an instance of a view in a variable, you can just set it that way: view.accessibilityLabel = whatever.

The accessibility label is the most important thing. That's what VoiceOver reads. So if you take one thing away from this, make your accessibility labels. The accessibility hint you usually don't need, but if an element is doing something non-standard, you can put in a hint to give some additional information. There are accessibility traits, and there are a bunch of these. This one has static text set, because that's what it is. There's static text, button, adjustable, plays sound; there are a bunch of different traits you can look up. These simply tell VoiceOver how to interact with that element. The accessibility value is for something like a slider. The accessibility language is the language VoiceOver uses. The accessibility frame is the frame that VoiceOver considers that element. It defaults to the frame of the view. The accessibility activation point: remember I said you can double-tap anywhere on the screen. The activation point is the point that VoiceOver simulates touching, and that defaults to the center of the view.
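In code, those two ways of setting attributes look roughly like this. It's a RubyMotion sketch, not runnable outside the toolchain, and the class name and strings are invented for illustration; the property and trait names are the real UIAccessibility API.

```ruby
# Way 1: define methods in your own view class.
class TemperatureView < UIView
  def accessibilityLabel
    "Temperature"
  end

  def accessibilityValue
    "72 degrees"  # what VoiceOver reads as the current value
  end
end

# Way 2: set attributes directly on an instance in a variable.
label = UILabel.new
label.text = "72"
label.accessibilityLabel  = "Temperature"
label.accessibilityHint   = "Updates every minute"
label.accessibilityTraits = UIAccessibilityTraitStaticText
```

Either style works; defining methods is handy for custom views, while direct assignment is quick for one-off instances.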
The accessibility path is something new in iOS 7, and that's the area that VoiceOver highlights. So for instance, in Apple's Maps app, it's really cool: you can drag your finger around and actually hear the different streets and intersections and things like that. It uses the accessibility path for that. Accessibility view is modal: kind of self-explanatory. If you've got a modal view, you can set that to true. That tells VoiceOver to ignore siblings of the receiver. Should group accessibility children: that's set to false by default. When you're swiping, remember, the single-finger swipe usually reads from left to right and from top to bottom. That can be confusing sometimes. So if you set that to true, it'll respect your subviews array. Accessibility elements hidden: pretty self-explanatory, it just hides the elements under it. And isAccessibilityElement: again, pretty self-explanatory. That means that VoiceOver will address and read that element. That's not always set to true. For instance, on a table, isAccessibilityElement is false, because you're not interacting with the table, you're interacting with the cells in that table, and those are true. The accessibility identifier is an additional way to identify your element. Then at the end, we have accessible: true. That's the next thing that I want to show you, and that's the automated accessibility inspector. I mean, the automated accessibility testing.

Oh, and by the way, if you're doing something that doesn't descend from a UIView or UIControl, you've got to use what's called an accessibility element. That gets a little heavy, but you initialize it like this, with an accessibility container, which is often self. And then you would define a few simple methods to tell it the order of the elements to read. You can see here, it's just a few traits. It's just kind of a mini thing. So now I know what you're thinking by this point.
This is great info, I want to make my apps accessible, but this is a lot to know, and I'm probably never going to memorize all this stuff. So that's why I'm really proud to show you automated accessibility testing. Now, it wouldn't be much of a demonstration if the element's already accessible, so let's create something. I would say the biggest complaint of VoiceOver users is unlabeled buttons in apps. So we're going to do that now. I know some VoiceOver users are cringing that I'm creating an unlabeled button. It's a controlled environment, so it's okay. So there's our button; I just did UIButton.new. We'll pull up the accessibility inspector. You can see it says accessible: false, and it puts out an NSLog telling you exactly what you have to do. So it tells you to set the accessibility label, and that for a button you can use the setTitle method, and it's telling us we need to make it an accessibility element by giving it a frame. So let's do that now. Someone tell me if I mistype it: setTitle, forState: UIControlStateNormal. Okay, button.frame. I'm just going to do this. And there we go: accessible: true.

So there are actually a few components built into the inspector here. First, there's the accessible predicate, and that just returns true or false, of course. Then there's also the accessibility doctor, and that's just a11y.doctor. It's not going to tell us anything now, but if there are problems, it'll log those messages, as we saw. And it'll also return the view. If there is a problem, it's going to return the view with the problem, because it browses a lot of these views recursively. The third thing I want to show you is the accessibility test. Let's say you're doing something like a custom UIView or something. We're just going to do something simple here. So it doesn't know what you want it to be, because it's just a UIView. The way you would do that is with the accessibility test: you'd say the custom view's accessibility test equals, let's say, UILabel.
And then it'll again tell you exactly what you have to do to make that custom view act like a UILabel to VoiceOver. The advantage to this, then, since we have a predicate, is that you can put these in your specs. And this is great, because the only thing worse than an inaccessible app is an app that is accessible, and then they upgrade it and break the accessibility, usually with the words "totally redesigned user interface." Whenever a blind person sees that in a changelog for what's new in an app, we go, what did they break now? And sadly, it's usually true. So this is just a little demo, and it's basically recreating what we did with the button. You can see that some of this I just commented out. So if we try raking this, you'll see: again, it tells us exactly what we have to do. The spec failed. So now I'll uncomment these lines, and they just do exactly what we just did in the REPL. We rake the spec again, and there we go. And now let's rake all the specs, and you can see that I have specs for every single one of the views and custom controls. I hope this isn't the one time it crashes. Nope. Bam. Yeah. (laughter)

So that's what I've been working on. By the way, this all started last year, during the question-and-answer session after this talk. Someone asked me, and I don't even know who, maybe they can take credit for it. Someone asked me if there was a way to do automated accessibility. And I said, that's a great idea. And now, a year later, we've got it. The first time I wrote that views should be accessible in a spec, it sent chills down my spine. It was just amazing. These are recursive, as I said. So if you're feeling lazy, you could always test your key window, or even your app. You could say app should be accessible. I'd recommend against doing that, though, because you can get confused going down the subview hierarchy, and also, if you're testing your view controllers, you're testing your app anyway.
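To give a feel for what that recursive predicate is doing, here is a tiny plain-Ruby model of the idea. The class name and the exact rule are illustrative stand-ins, not motion-accessibility's real implementation, which inspects the actual UIAccessibility attributes; the point is just that a container passes only if everything beneath it passes.

```ruby
# Toy model of a recursive "accessible?" check in plain Ruby.
# FakeView stands in for UIView; the real gem checks labels,
# traits, isAccessibilityElement, and more.
class FakeView
  attr_accessor :accessibility_label
  attr_reader :subviews

  def initialize(label = nil)
    @accessibility_label = label
    @subviews = []
  end

  def add_subview(view)
    @subviews << view
  end

  # A leaf is accessible if it has a non-empty label.
  # A container is accessible if all of its subviews are --
  # this recursion is what lets you test a whole window at once.
  def accessible?
    if subviews.empty?
      !accessibility_label.to_s.empty?
    else
      subviews.all?(&:accessible?)
    end
  end
end

window = FakeView.new
button = FakeView.new            # the dreaded unlabeled button
window.add_subview(button)
puts window.accessible?          # false: the button has no label
button.accessibility_label = "Update"
puts window.accessible?          # true
```

Because the check recurses, one failing leaf is enough to flag the whole window, which is exactly why a single unlabeled button shows up in a spec run.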
But you can also test your view controllers for accessibility. You could say controller should be accessible. So that's what I've been working on. In conclusion: the iPhone allows the blind to do wonderful things. Accessibility helps everyone, and it's the right thing to do. RubyMotion increases productivity for the sighted and for the blind. In most cases, as I've shown you, you can make your app accessible with relatively little effort. It doesn't take a lot of time to do it. It's not that hard. The combination of RubyMotion and motion-accessibility, I think, is the perfect environment for the blind developer, and for anyone who wants to do accessibility. I do have a special offer: I do freelance accessibility consulting, and if you're using my gem, I'll give you a discount. And I'll leave you with this thought: if Apple hadn't made devices like the iPhone accessible, I wouldn't be standing here now giving this talk. Thanks very much.