Good afternoon. Can you all hear me? All right, there we go. It's so good to see you all here. I did not know Beyoncé was the theme, but I'm so excited about it. So, as Kim mentioned, I'm going to talk to you today about open source, open mind. I'm faculty at the University of Florida. I'm an assistant professor there, and I run the SoundPAD Lab, and this is where the sound thing comes in. So open source, open mind: we use open source software to create a lot of our research projects, and then open mind is the second half of the presentation, where we'll talk about how, when we're creating and contributing to these open source projects, we can think about them from an inclusive design perspective. All right, so what do I work on? I said computer science, but more specifically, we work on a very specific application of sound in computer interfaces called 3D sound. Typically when you think of 3D, you think of the picture here on the left, where you put on a headset and you're automatically in a virtual world. So visually, you've got it; you know what 3D visually is. But what in the world is 3D sound? Luckily, I am going to tell you. How many of you have used Unity or Unreal, any of those engines? Okay, a few of you. If you've noticed, as of late there is a checkbox you can check that says, hey, do you want to turn on that 3D audio spatializer? I know it's in Unity especially. And now you're wondering, what in the world is that? I'm so glad you asked. Nobody asked, but I'm glad you asked. When we think of 3D visually, two different pictures are given to your eyes. As you see in the picture on the left, there are two different pictures, and your brain combines them and gives you the sense of depth and of space.
We do the same sort of thing with 3D audio: we give two different sounds to your ears, and when your brain combines them, the sound is perceived as coming from a specific location in space. I know you're thinking, what kind of crazy magic is that? This is made possible by the use of head-related transfer functions, or HRTFs. I'm pretty sure almost nobody in the room knows what these are, but from a general perspective, head-related transfer functions describe the acoustic transformation a sound in the environment undergoes on its way to your ear. Here's how we can measure them. Imagine our friend up here on the left sitting inside an anechoic chamber, a room that has no reflections, with two microphones inside their ears. As you can see, there are speakers all around this person. A sound is played, an impulse actually, from a certain direction, and we record all of these impulse responses. If you look up, you can see the left ear gets the signal and the right ear gets the signal. In this example the left ear is the closer ear to the sound, so it receives a higher amplitude, and the right ear is the farther ear, so it receives the sound with a slight delay. We can take that amplitude difference, that time delay, and some shadowing caused by the head, combine them all together, and create a filter such that we can take any sound source, convolve it with that impulse response, and it sounds like the sound is coming from that point in space around you. By doing this for all the different locations in space, we get a whole set of head-related transfer functions. That lets us treat sound as a signal and manipulate it using DSP, digital signal processing, to emulate how sound actually travels in space.
So we can take a mono, single-channel sound source, split it in two, one for the left ear and one for the right ear, impose a delay on the farther side, and then filter it. Don't worry, this is not on the test, I promise. And when a person hears the result over headphones, it actually sounds like it's coming from a location in space. You all are not excited enough about this. But you know what? I was prepared for that, because sometimes you need a little convincing of why being able to put a sound anywhere in space should be of interest to you. In my lab, we use this technology, combined with open source software and cloud software services, to build solutions for a wide variety of populations. But before I talk about any of those projects, I have to thank my minions. I mean my lab group. These are the folks I work with, and without a lot of them, a lot of this wouldn't be possible. These are grad students, undergrads, and master's students, and I'll also show you some work from students in my 3D audio special topics course. Actually, I want to go back for a second. Our group is kind of like the Avengers, I like to think; everybody has their own specific specialties and skills. And in this picture, Ben has on green, so he's the Hulk. So yeah, this is our Avengers picture. Everybody is special in their own unique way. Love it. These are a few of the open source software packages that we use in the lab, with varying degrees of openness. One project is literally called Beep Boop Blippity Bop. Just pause for a second. This is one of the reasons you don't give students free rein to name projects anything they want. I call it Quadruple B. But the take-home point is here.
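That left-ear/right-ear recipe, split the mono source, delay the far ear, attenuate it, can be sketched in a few lines of Python. To be clear, this is an illustrative toy I'm adding, not the lab's actual code: it uses a simple spherical-head approximation for the interaural time difference and a made-up level falloff in place of real measured HRTF filters.

```python
import numpy as np

def spatialize(mono, sample_rate, azimuth_deg,
               head_radius=0.0875, speed_of_sound=343.0):
    """Toy binaural renderer: approximate the interaural time difference (ITD)
    and interaural level difference (ILD) for a source at azimuth_deg
    (0 = straight ahead, positive = to the right) and return a stereo array.
    A real system would instead convolve the signal with measured HRIRs."""
    az = np.radians(azimuth_deg)
    # Woodworth's ITD approximation for a rigid spherical head.
    itd = (head_radius / speed_of_sound) * (az + np.sin(az))
    delay_samples = int(round(abs(itd) * sample_rate))
    # Crude ILD: attenuate the far ear more as the source moves off-center.
    near_gain, far_gain = 1.0, 1.0 - 0.4 * abs(np.sin(az))
    far = np.concatenate([np.zeros(delay_samples), mono]) * far_gain
    near = np.concatenate([mono, np.zeros(delay_samples)]) * near_gain
    if azimuth_deg >= 0:   # source on the right: right ear is the near ear
        left, right = far, near
    else:
        left, right = near, far
    return np.stack([left, right], axis=1)
```

For a source at 90 degrees (hard right), the left channel starts with a run of zeros, the delayed, quieter far-ear signal, which is exactly the cue the brain reads as "over there on my right."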
Imagine that you are... not even imagine, just a quick show of hands: how many of you ride a motorcycle? All 10 of you, wow. I can't believe that in this room full of computer scientists we don't have more bikers. Really? I'm joking. Anyway, imagine you are a biker and you need to change lanes. This is a quick little depiction out of the rider training manual, where you have to do a head check: you literally turn your head and look before you change lanes. But guess what? You're taking your eyes off the road in front of you. To combat that eyes-off-the-road problem, which could lead to real danger, we've created, using Python and the UDOO board, two open source pieces, an actual prototype that puts sensors inside a rider's helmet to sense the proximity of incoming cars, so you can actually hear them as sounds in 3D space. Remember that 3D audio thing? We can put sounds anywhere around you. You can ride a motorcycle and perceive things coming from anywhere around you without having to turn your head: 360-degree awareness. To make sure the motorcyclist is safe and we're not just clouding their ears with all sorts of sounds, we deliver the audio over bonephones. We have wireless bone-conduction headphones that connect to the UDOO; bonephones conduct sound to your ears by sitting on the bone right here, rather than covering your ears. So that's one project. Another project is a bit similar, except in the aviation domain. This one's called Plane Sense, and it uses an open source project called Stratux. Stratux is an open source aviation project, actually traffic and weather, but we use just the traffic portion of it so that pilots don't have to look down at their instrumentation as much to figure out the proximity of other aircraft.
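The helmet idea boils down to a mapping from each detected vehicle to the two things the spatializer needs: a direction and a loudness. Here is a minimal sketch of that mapping; the sensor ranges, the linear loudness falloff, and the function name are my own assumptions for illustration, not the project's real design.

```python
def vehicle_to_cue(bearing_deg, distance_m, max_range_m=30.0):
    """Map a detected vehicle's bearing (degrees, relative to the rider's
    heading) and distance to spatializer parameters: an azimuth in
    [-180, 180) and a gain that rises as the vehicle gets closer."""
    if distance_m > max_range_m:
        return None  # out of sensing range: no alert sound at all
    gain = 1.0 - distance_m / max_range_m            # closer = louder
    azimuth = ((bearing_deg + 180.0) % 360.0) - 180.0  # wrap to [-180, 180)
    return {"azimuth_deg": azimuth, "gain": round(gain, 3)}
```

So a car at 270 degrees (over the rider's left shoulder) and 15 metres out becomes a half-volume sound at minus 90 degrees, and anything beyond the range cutoff stays silent instead of clouding the rider's ears.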
Pilots can hear sounds in 3D space that correspond to the proximity of other aircraft. Here we've increased their situational awareness without overloading their visual channel. If any of you have ever seen a cockpit, there are a thousand different panels and gauges. Why add another one for them to look at when you can use the ears and show them where things are? So this is another open source technology we're using in this domain. This next one isn't saving anybody's life; it's more of a fun one. It's called 3D Aid, and it uses Pidgin. Has anybody ever seen Pidgin before? Maybe? A few of you. Okay, cool. So we hacked up Pidgin. Pidgin is a multi-platform messaging client that works with a bunch of different instant-messaging protocols. Imagine that you want to receive your instant messages, but you don't want to constantly look up to see who delivered a message and how important it is. You can create settings here that say: if I receive a message from my boss, I want it to come from a point in 3D space really close to me. But if it's Bill down the hallway, and Bill just wants to talk about the Cavaliers, nobody wants to hear about the Cavaliers, so you can say, okay, it can come from a farther-away location, perhaps even the trash can. It's a way to customize your whole instant-messaging listening experience so you don't always have to pop out of whatever you're doing to determine whether a message is from someone important. The actual location the sound comes from can tell you whether it's something you should pay attention to. And this next one is not exactly a 3D audio project, but I still think it's cool. We were approached by a PC company that I can't name yet to use BCI. Has anyone heard of BCI, brain-computer interfaces? A few of you.
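The boss-up-close, Bill-by-the-trash-can setup is really just a per-sender lookup table feeding a distance into the renderer. A tiny sketch of that idea, with sender names, the metre scale, and the default all invented for illustration rather than taken from the 3D Aid plugin:

```python
# Hypothetical per-sender placement settings, in the spirit of 3D Aid.
SENDER_DISTANCE = {
    "boss": 0.5,      # metres: right next to you, hard to ignore
    "teammate": 2.0,
    "bill": 8.0,      # far away, over by the trash can
}

def message_position(sender, default_distance=4.0):
    """Return the virtual distance a sender's message should sound from;
    unknown senders land at a neutral middle distance."""
    return SENDER_DISTANCE.get(sender.lower(), default_distance)
```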
For the rest of the room: BCIs, or brain-computer interfaces, are sensors that typically come as commercial off-the-shelf headsets with lots of electrodes on them, and you can actually read brainwave data with them. Using these, you can basically train a machine on patterns of data. We map the different commands the brain is giving to key commands. For this project, the company we're working with has a person who was previously a heavy-metal drummer and is currently an amputee, and we'd like to restore his ability to play the drums again. So we've created a sort of MIDI trigger using this BCI headset, with JUCE, a cross-platform audio application framework, facilitating a lot of this. As you can see, we're using open source to save the world, right? We're creating a customized solution for that project. And this is a picture using one of the first models we tried. We actually didn't end up using this device, but this shows one of the concepts behind it: a person is literally sitting in front of the computer with this device, and we're literally just changing the face of an emoji to see if we can get it to change. This was one of the early phases. All right, and the last project I'll tell you about is called Sensing the Museums. If you were a person who is visually impaired and you went to visit a museum, it would be a pretty boring experience. Why? Because everything's on the wall; it's visual. The museum is already boring in the first place, right? So if you can't see anything, then it's extra boring. I'm joking, museums are fun. But if you're visually impaired, there isn't much that's accessible to you.
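The pipeline here, classify a trained brain pattern, fire a drum note, can be sketched very simply. Everything in this snippet is an assumption for illustration: the command names, the confidence threshold, and the note numbers (which I've taken from the General MIDI drum map); the real project's classifier and mapping live in its BCI and JUCE code.

```python
# General MIDI percussion note numbers, used here as stand-in targets.
MIDI_NOTES = {"kick": 36, "snare": 38, "hihat": 42}

def classify_command(feature_scores, threshold=0.6):
    """Pick the trained command with the highest score, but only if it
    clears a confidence threshold; otherwise report no command."""
    best = max(feature_scores, key=feature_scores.get)
    return best if feature_scores[best] >= threshold else None

def to_midi_event(feature_scores):
    """Turn one frame of classifier scores into a MIDI note-on, or None."""
    cmd = classify_command(feature_scores)
    if cmd is None:
        return None
    return ("note_on", MIDI_NOTES[cmd], 100)  # (event, note, velocity)
```

The threshold matters: without it, noisy headset frames would constantly fire the loudest-scoring drum, which is exactly what you don't want in a live performance.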
So we're collaborating with the Museum Studies Program at the University of Florida to create a very accessible museum experience. We have an indoor localization system such that, standing anywhere in the room, you can hear sounds coming from the locations of all the different exhibits and interact with them. The first one is a Panama Canal exhibit, so you can say, okay, do I want to hear an oral history? I can walk over here. Do I want to hear about some sports? You can walk around and personalize your whole experience. It's basically a way to make the museum more accessible for a population that people generally don't think about when designing museums. As you can see, we have a lot going on in my lab. But when you take a step back, when you're designing software for all of us, you have to reflect and think about the population you're actually designing for. So this is the transition into the solutions-for-humanity portion: making a case for inclusive design. Inclusive design means that when you design a piece of software or a product, you think about everyone who could potentially use it, not just the general case. Some of these slides may or may not have been borrowed without permission from my friend Nancy Douyon from Uber, but we shall move along. We can all agree that we all design software and that there are some terrible interfaces out there. Some of us are guilty of creating them, and if you're not convinced, I can show you this. All right, everyone has seen one of these: you go in, you pay for your items, but what button are you supposed to press to complete the credit transaction? Someone said the green one. Yeah, it says okay. Usually green means yes, confirmation, okay; it actually says enter and yes on it, but nothing really says okay.
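Under the hood, the museum experience needs one small geometric step: given where the listener is standing and which way they're facing, work out the direction and distance of each exhibit so its sound can be placed there. A sketch of that step, with the exhibit names, room coordinates, and function shape all invented for illustration:

```python
import math

# Hypothetical exhibit positions in room coordinates (metres).
EXHIBITS = {"Panama Canal": (2.0, 5.0), "Sports": (8.0, 1.0)}

def exhibit_cues(listener_xy, heading_deg):
    """For each exhibit, return its azimuth relative to the listener's
    heading (0 = straight ahead, positive = to the right) and its distance,
    ready to hand to a spatializer."""
    lx, ly = listener_xy
    cues = {}
    for name, (ex, ey) in EXHIBITS.items():
        # Bearing measured clockwise from the +y axis of the room.
        bearing = math.degrees(math.atan2(ex - lx, ey - ly))
        rel = ((bearing - heading_deg + 180.0) % 360.0) - 180.0
        cues[name] = {"azimuth_deg": round(rel, 1),
                      "distance_m": round(math.hypot(ex - lx, ey - ly), 2)}
    return cues
```

As the visitor walks or turns, the indoor localization system updates `listener_xy` and `heading_deg`, and every exhibit's sound slides around them accordingly.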
But then we could think, all right, let's press this middle button here, because there's an arrow pointing to the okay; perhaps that's the answer. Or maybe even the button that says credit, because we're trying to do a credit transaction. As you can see, there are some real ambiguities in this interface, so we can all agree that interfaces could be designed a bit better. In the same vein of things that are poorly designed, CNET put out an article saying that women generally prefer Apple products and men prefer Samsung. But if you physically look at both products, you'd see that the Samsung phones are bigger and you can't put them in your pocket. So it could actually be a preference driven by being able to take the phone with you, rather than a preference for the object itself. The article asks: will the iPhone 6 fit in your pocket? If you're a dude, yep. Everybody else, no. We have to think about the whole population when we're creating these things. And also about the whole population: things that are colored "nude." Nude is not one color. This yellowish-looking bra is not nude for everyone, and calling these things sheer and nude sends the message that anyone who doesn't fit is outside the norm, that they're super weird. We need to design things that consider everybody. It wasn't until the last five years or so that you could even get Band-Aids in colors other than the generic flesh tone. So hopefully I'm making the case for inclusive design to you. But in case I'm not, and I'm sorry Corey, I have to bring this out: let's look at Microsoft's pride and joy from, I think, maybe last year or the year before. Do you know who Tay is? Some of you know where I'm going with this. Tay was an artificial agent, a Twitter bot, and it was an experiment in AI and machine learning.
The bot was supposed to communicate with millennials, and they thought it would be a good idea for Tay to use Twitter to learn how to talk. You're already laughing. After about 24 hours, Tay had learned some crazy things, like this colorful language here. So inclusive design is always needed; Tay learned how to be just the most vile person on the planet. Now, Tay was actually tested for two years before they released it in the States. But the problem is, it was tested in China, where there isn't free speech and a lot of speech is censored. Of course it worked there; people couldn't say crazy stuff. This is America. We're crazy. We should know. So it was very well intentioned, but inclusive design, having someone there to say, hey, guess what, Twitter is not a good place to learn from all people, would have reined this thing in. We don't need a bot learning how to talk from Twitter. The next point is that inclusion for one group does not have to mean exclusion for another, because everyone can be excluded. For example: no baby-changing stations in the men's restroom. Here's a picture of a man trying to change a baby on the floor of the men's restroom, because until recently a lot of men's restrooms did not have changing tables. When you're designing things, you have to think of everyone who will use them, not just the usual use case. And if you ignore these inclusive design practices, you actually run into ethical issues. There was a Snapchat filter created, I want to say almost a year ago, that was clearly yellowface. These are the things where, if you do not have a person of color in the room to let you know this is not a good idea, you can get into some real trouble. And the second image here is actually a picture of the algorithm for a hand-soap dispenser at work. We can play the video. All right, watch this.
Black hand: nothing. Larry: go. Black hand: nothing. Larry: go. No. "Racist mother..." sinks. No, I wouldn't say that. I wouldn't go that far and say that the sink is racist. But when you're creating the algorithms for these sensors to detect whether a hand is there, you need to consider all hues of hands, not just the one that represents the majority of the people in the room. The take-home point here is that you need to diversify the data you train your applications on so that they can accept a wide variety of input. So, in case you were asleep this entire time, here are a few take-home points. I'm not asking you to become diversity experts or anything like that; just think about these four things. First, when you're creating software that you expect to impact a wide variety of people, make sure you include a wide variety of people in the design process; ask them how it would work for them. And if you don't have access to those populations, imagine what you're doing from the perspective of someone much different from you. I know we're very selfish, we're American, we're taught to think about ourselves, but just turn your brain, for half a second, to somebody who is much, much different from you. Also, if you are creating a system, try to get rid of any generalized assumptions your system might make about how a person would use it: anything that has to do with gender, size, ability, or experience, think about the wide spectrum. And lastly, if you are creating algorithms, one way to avoid bias is to diversify your training data; make sure you're training on a wide spectrum of data points. All right, well, thank you all so much. It has been a pleasure.
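One concrete habit that follows from that last take-home point: before you ever train, look at how your groups are represented in the data, because skew like the soap dispenser's is visible in a one-line report. A minimal sketch, with the group labels purely illustrative:

```python
from collections import Counter

def balance_report(labels):
    """Report each group's share of a labeled training set, so obvious
    skew (e.g. one skin tone dominating a hand-detection dataset) is
    caught before training rather than after deployment."""
    counts = Counter(labels)
    total = sum(counts.values())
    return {group: round(n / total, 2) for group, n in counts.items()}
```

If the report comes back 80/20, you know to collect more examples of the underrepresented group (or reweight) before shipping the sensor.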