Good evening everyone, and welcome to a presentation titled Accessible Charts Using Music and Haptics. As Manjula already said, I'm a product manager at Spotify at the moment. Previously I worked at Yahoo Finance as a product manager, and before that as an Android engineer. We are also joined today by Yatin, the developer lead on this project, who will be speaking about implementation details in just a bit. A special thanks to Larry Goldberg and the accessibility team at Verizon Media for supporting this project and advising on everything that went into making it happen. And thank you to the Hello A11y team for giving us this opportunity to present. Let's get to it. In today's agenda we'll be covering the motivation for the project: what led to the problem statement, and how we discovered the most pressing concerns of real users with low vision or blindness. We'll get into some of the design-specific decisions behind the feature set we finally ended up with. There's also a newish announcement as of last week which isn't part of the slides, but because it's so recent, Yatin will share it with us. We'll walk through the Android architecture diagram, and I'll show you a demo in just a bit. This project is also open sourced, so we'll get to the implementation and some of the next steps we'd like to take, as a community and team, to push this further. To give a little bit of context: finance charts are essentially one of the centerpieces of a stock or finance app, which was the initial motivation for making the accessible charts feature. And charts in finance are just one application of data visualization.
This overall concept can be applied to many more platforms than mobile, and to many other data visualizations beyond line charts. To take a step back and assess the status quo: charts quickly render hundreds of data points that help us analyze trends; in the finance context, that's the movement of a stock's price. Charts are great for people who can see, or who can see well. They can quickly identify key markers such as domain, range, and points of interest at a glance. For visually impaired users, however, who rely on screen readers to access information on digital devices, most charts and data visualizations only have a label similar to that of an image, with no way to meaningfully interact with the data. In the charts considered accessible today, the status quo is for screen readers to read the data points as X, Y; X, Y; X, Y, as they would appear in a tabular format. This is problematic on at least two levels. First, after about five or eight data points it becomes difficult for the user to keep a mental picture of the trend shown on the chart, which is the entire point of it. Second, across the spectrum from low vision to blindness, it's cognitively disruptive and inconsistent to see a chart on the screen and hear a table from the screen reader. Users with low vision face similar challenges with data visualizations across multiple domains, one of the most important and impactful of which is education. This is one of the reasons we chose to open source the solution: to make this technique available to developers and designers who work on education software, among other areas. Next we'll talk about some of the design challenges related to accessible data visualizations, and especially charts, for low vision and blind users.
The solution essentially combines music, or more specifically tones that scale and map to the human audible frequency range, which basically means that as the stock price goes up and down, so does the pitch of the associated tones. We also have a haptic and spoken track, with a seamless transition between those modalities, which we'll show in a demo later. Note that the solution presented here is for Android, but the same or a similar solution is also available in the Yahoo Finance iOS app. Only the Android part is open sourced, but the solution itself is very extensible to desktop and other platforms. The design considerations you'll notice in the upcoming demo are as follows. In the roughly eight user studies we did with blind users during the discovery phase, we validated that beyond a handful of data points it becomes challenging for users to create that mental picture we spoke about. Another important thing we learned was that spatial recognition on the device is challenging. By this we mean: if a sighted user taps a certain point on the mobile device, tapping that same point again is pretty straightforward. For a blind user, however, it is not straightforward to spatially recognize the same coordinates on the screen, which is why we chose to make the chart a full-screen experience, so the start and end of the chart coincide with the edges of the actual device. The gist of the solution is that the data points are mapped to a range of human-audible tones that convey the relative values of the time series. As the Y value increases, the pitch of the tone goes up, and vice versa. Since this was implemented on mobile, we also used haptic feedback to indicate points of interest, which in the finance use case are the highest and lowest points of the chosen time range, the previous close, and the current stock price, as the user scrubs through the chart.
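The value-to-pitch mapping described here can be sketched as a small pure function. This is a minimal illustrative sketch, not the actual implementation; the 300–1000 Hz band, the class and method names, and the linear interpolation are all assumptions.

```java
// Sketch: map a data value to a frequency in a comfortable audible band.
// The 300-1000 Hz bounds and the linear interpolation are illustrative
// choices, not taken from the actual source.
class ToneMapper {
    public static double valueToFrequencyHz(double value, double min, double max) {
        double minFreqHz = 300.0, maxFreqHz = 1000.0; // assumed audible band
        if (max == min) return (minFreqHz + maxFreqHz) / 2.0; // flat series: middle pitch
        double t = (value - min) / (max - min);
        t = Math.max(0.0, Math.min(1.0, t)); // clamp out-of-range values
        return minFreqHz + t * (maxFreqHz - minFreqHz);
    }
}
```

With the demo chart's range (low 55.59, high 59.96), a current price of 59.75 would land near the top of the band, so it would sound as one of the highest pitches on the chart.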
On Android, that means sliding with two fingers: the user hears tones corresponding to the data points the pointer is focused on. At any point, when they decide to stop and explore an area more granularly, they can release the pointer and the screen reader will announce the last data point they were on. From there, users can focus back and forth on the data points preceding and following that particular point, as they would in a table, which is what the status quo has made them used to so far. Crucially, it stays consistent, because the overall chart is divided into invisible panels, each of which represents a data point. So when the user is on one data point on the X axis, the thing being read to them is also visually consistent for users with low vision rather than blindness. As the user scrubs the chart, an additional thing we do is give haptic feedback on the points of interest, the high, low, previous close, and current stock price, to indicate that there is something interesting there. After the first round of user studies, we validated that the solution works for the set of users we were aiming to improve the experience for. As you will see after the demo, about 80% of the participants were able to draw the overall trend of the chart using the solution, which was really encouraging to see. Another design consideration was the overall description, or label, of the chart, covering things like the X and Y axis ranges. The heading structure is also really important to note. If you look at the status quo chart, what all users see, you'll notice the time ranges are at the bottom of the chart. In the accessible charts full-screen experience, the time range buttons are at the top, so the user doesn't have to waste time interacting with the chart before they get to pick the range they're most interested in.
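The "invisible panels" idea, splitting the chart width evenly so each horizontal slice selects exactly one data point, can be sketched as follows. The class, function, and parameter names are illustrative, not taken from the actual code.

```java
// Sketch: map a touch x-coordinate to the index of the data point whose
// invisible panel contains it. Each panel is one n-th of the chart width.
class PanelIndex {
    public static int xToPointIndex(float x, float chartWidthPx, int pointCount) {
        float panelWidth = chartWidthPx / pointCount;
        int index = (int) (x / panelWidth);
        // Clamp so a touch on the exact right edge still lands on the last point.
        return Math.max(0, Math.min(pointCount - 1, index));
    }
}
```

Because the same panel always maps to the same data point, scrubbing and the screen reader announcement stay consistent with what is drawn on screen.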
These are some of the optimizations we made to customize the experience for a low vision or blind user. One piece of functionality that may not come across in the demo, unless you use it on the actual device, is that there is a different texture to the tones when the stock price is above versus below the previous close, something that is visually marked in financial charts and that we tried to convey with an audio signal. One of the most important lessons we learned during research and development was that users want a nuanced, customizable solution that works with them in their unique context. This is why we added the ability to change the pitch, or the frequency range, the user is most comfortable hearing. Another point of personalization is the date format, which means the date can be read as "the first of January 2021", or just "January 1", or just "first", to make it as concise or as verbose as the user prefers. Let's watch the demo now; I'm going to share my YouTube screen. If you could let me know that it's working, that would be great. Do you see the YouTube screen? Yatin, can you give me a thumbs up if you can see it? Yeah, we can. [Demo audio: "BZ three months chart, double tap to explore... double tap to activate... BZ three months chart, trending up, current price 59.75, previous close 57.63, high 59.96, low 55.59, swipe or drag two fingers across the chart to explore... double tap to activate... the second of September 2019, 59.06, in list, 15 items... August 26, 2019, 58.16... August 19, 2019, 55.92... the 12th of August 2019, 56.65... BZ three months chart."] That was the demo, which basically shows how all of these elements come together. And this is the response from the user studies we were speaking about: these are actual drawings from a subset of the users we tested with.
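The date-format personalization mentioned a moment ago (reading "the first of January 2021", "January 1", or just the day) could be modeled along these lines using java.time. The enum and format patterns are assumptions for illustration, not the app's actual settings model.

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.util.Locale;

// Illustrative verbosity levels for how a data point's date is spoken.
class SpokenDate {
    public enum Verbosity { FULL, MEDIUM, SHORT }

    public static String format(LocalDate date, Verbosity verbosity) {
        switch (verbosity) {
            case FULL:   // "the 1 of January 2021" (ordinal suffixes omitted for brevity)
                return "the " + date.getDayOfMonth() + " of "
                        + date.format(DateTimeFormatter.ofPattern("MMMM yyyy", Locale.US));
            case MEDIUM: // "January 1"
                return date.format(DateTimeFormatter.ofPattern("MMMM d", Locale.US));
            default:     // SHORT: just the day of month
                return Integer.toString(date.getDayOfMonth());
        }
    }
}
```

The screen reader would then speak whichever form the user selected in settings, trading conciseness against context.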
The bottom left is the reference chart that we represented with this audio solution, and a majority of the users were able to draw, with varying levels of accuracy, the trend of the chart. Next we have Yatin to talk about how this was implemented and get into the meat of the solution. Over to you.

Great, thank you, appreciate it. On Android, we approached this feature by creating a new custom view that draws the chart and overlays it with the list of points. To use Android terminology, we use a Canvas to draw the chart and a RecyclerView to populate the list of points. Each point in that list has a description containing its price and timestamp, and using Android's TalkBack screen reader, these descriptions are read out loud as the user focuses on a data point and swipes over to the next one. The user can also press down two fingers to interact with the trend line and hear tones play out loud, with the pitch of each tone matching the point's relative position in the chart. We've extracted this audio chart view into its own project, so any developer can pick it up, place it into their app, and load it by providing a list of the data point view models we've defined. They can act upon the chart to do certain things, like play a summary audio of all the data points. The chart takes care of the scrubbing and releasing gestures, and when you're done using it, in code you dispose of it to clean up any resources. Early last year we open sourced this project, titled Songbird, for other Android app developers to use if they're interested. If you'd like to check it out, the link to the GitHub repository is on this slide. Please feel free to leave comments or open an issue; we're always open to feedback, and we thank you for it.
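The lifecycle described here, load the chart with data point view models, optionally play a summary, then dispose, can be sketched off-device. Everything below is illustrative: the class, interface, and method names are assumptions, not Songbird's actual API.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the kind of API surface described in the talk; names are
// illustrative and may differ from the real Songbird library.
class AudioChartSketch {
    // Data point view model: timestamp, value, and a spoken description.
    public static class Point {
        final long timestamp; final double value; final String description;
        Point(long timestamp, double value, String description) {
            this.timestamp = timestamp; this.value = value; this.description = description;
        }
    }

    public interface AudioChartController {
        void load(List<Point> points);  // supply data point view models
        void playSummaryAudio();        // play a tone for every point in order
        void dispose();                 // release audio/haptic resources
    }

    // Minimal fake implementation so the lifecycle can be exercised off-device.
    public static class RecordingChart implements AudioChartController {
        public final List<String> log = new ArrayList<>();
        private List<Point> points = new ArrayList<>();
        public void load(List<Point> points) { this.points = points; log.add("loaded " + points.size()); }
        public void playSummaryAudio() { for (Point p : points) log.add("tone " + p.value); }
        public void dispose() { log.add("disposed"); }
    }
}
```

On a real device, the implementation behind the interface would own the Canvas drawing, the RecyclerView of focusable points, and the tone playback, which is why an explicit dispose step matters.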
We also got some direct user feedback through the app store that the charts were "pretty awesome" and that the next big thing could be accessible indicators, which is an advanced charting feature. This user appreciated that they would be able to get into the forex market for the first time; they felt like that's now a very real possibility. And on Twitter, a user shared our Android demo and said they had never seen this method before, and that it's probably the most unique accessibility solution to a problem they've seen in a while. They also said that translating graphs to tones is "pretty trippy", and I'm assuming in a good way. Some of the next steps we've wanted to take for a while were to use the pentatonic scale to make the tones more pleasant to listen to; the built-in tone generator that Android provides isn't as pleasant. Another integration we wanted to look into was Google Assistant, so a user could ask the assistant to play the audio for a stock's performance today. And we want to continue development of the open source library and get the word out there, gathering feedback from real users and other developers to see how we can improve and grow the feature set for this project. On the first point, as of last week, we have a new version of this project using piano tones, which is far more pleasant to listen to. It uses the sharp keys, the black keys on the piano, which in any order sound pleasant together. So as you scrub through, tones from that scale play according to the trend of the chart. But if you prefer the raw pitch range and tones, because you have a little more control over them, you can use an older version of the project and still have that. Now you do have the option to use piano tones, which I believe is far more pleasant. Great.
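On the piano-tones update: the black keys form a pentatonic scale (C#, D#, F#, G#, A#), and their frequencies follow from equal temperament relative to A4 = 440 Hz. Here's a sketch of snapping a point's normalized position onto that five-note scale; the snapping scheme and names are assumptions, not the library's actual code.

```java
// Sketch: quantize a chart value onto the black-key pentatonic scale.
// Semitone offsets from A4 (440 Hz) for C#5, D#5, F#5, G#5, A#5.
class Pentatonic {
    static final int[] BLACK_KEY_SEMITONES = {4, 6, 9, 11, 13};

    // Equal-temperament frequency for a semitone offset from A4.
    public static double semitoneToHz(int semitonesFromA4) {
        return 440.0 * Math.pow(2.0, semitonesFromA4 / 12.0);
    }

    // Snap a normalized chart value (0.0..1.0) onto the five-note scale.
    public static double valueToPentatonicHz(double normalized) {
        double n = Math.max(0.0, Math.min(1.0, normalized));
        int idx = (int) Math.round(n * (BLACK_KEY_SEMITONES.length - 1));
        return semitoneToHz(BLACK_KEY_SEMITONES[idx]);
    }
}
```

Because a pentatonic scale has no dissonant semitone intervals, any sequence of these notes sounds musical, which is why the black keys are a natural fit for sonifying an arbitrary trend line.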
Thank you for listening in and joining us. If we have any questions, we're happy to take them.

Yeah, this is quite interesting. I'm just curious to know: since it's charts with high and low points, it works here, but are you looking to extend it to other forms of images too? I've seen people struggling with interpreting maps and such. Do you have any plans to take this into interpreting other forms of images?

Data visualizations are slightly different from images in that we have access to the source of the data, in this case the X and Y arrays, which we can present in different ways, so it's a little more straightforward. There are applications for maps, and especially for other forms of data visualization, for example a pie chart or another kind of chart, that are way easier to do than an actual image, where you'd have to extract qualities or attributes from the image to convey it more meaningfully. That can be done, and is being done, with machine learning at various companies that extract information like who the people in a certain image are, and so on. The same thing, I imagine, can be done with maps. As far as this project is concerned, it's very focused on data visualizations at the moment, but that's a really interesting idea.

Yeah, thank you. I hope you progress ahead. It was very interesting to listen to you.

Thank you. We have a question in the chat: is it possible to have a hands-on workshop about this? Yes, depending on what the hands-on workshop would entail. Whoever asked this question, would you like to elaborate? For the person who asked: if you would like to try it yourself, you can download the latest Yahoo Finance Android app and either enable TalkBack on your device or go to the settings page and enable audio charts.
And you should be able to see a button on the stock detail page, under the chart, that opens the audio chart page. So if you want to see it live in production and try it out, that's possible now. Let me paste the link to the Play Store and also to the open source project, where you can see the code that Yatin wrote and exactly how it's implemented. If you have questions, feel free to open an issue on the open source repository and we'll, I mean Yatin will, try to answer it. Great. Anyone else with any questions? Go ahead, we still have time.

There's one question from YouTube. Krishna asked: how can this help users with cognitive disabilities, like people with dyslexia, or people with color vision deficiency? So, how does it help those particular use cases?

Someone with color blindness should, if there's enough contrast on the actual chart, be able to use regular charts to begin with. But if not, we've taken care to have enough contrast in the focus states, the actual line chart, the labels, the buttons, everything in the accessible chart experience, to make it even more accessible in that respect. With respect to cognitive disabilities: the more modalities, or ways of interacting with a piece of data or information, we offer, the more options users have to absorb that data and interpret it. In that way, it's indirectly more accessible to people with cognitive disabilities. That wasn't the initial intent; this particular feature was very focused on low vision and blind users, but it does have applications for other forms of disability, which, as you'll see, is a common theme: any accessibility feature tends to have a lot of overlap.

I think that's a very interesting question somebody asked about cognitive disability, because users can't remember so many things, as the participants already showed in the study itself.
So there are many users who would prefer to have something read out to them, right? Right. So in that way, users with cognitive disabilities can still see the chart, and if they can't remember it, they can at any time have TalkBack read it out and make sense of it; at least some users with cognitive disabilities, and even people with learning disabilities, use a screen reader. And that's a good thing to know. In this view they only see the chart, so there's no distraction for them, if I'm not wrong. No, there isn't. So in that case they have enough time, and there's no distraction, so they can at least try to understand or make sense of the chart; I guess that would be the way forward, in my understanding. Right, and the good part is they don't have to listen to every single data point. A lot of people with cognitive disabilities are able to interpret music a lot better than they interpret the spoken or written word. So once they hit a data point they're interested in, or something they'd like to explore further, that's when the spoken feedback takes over. Otherwise, it's the tones that are being experienced.