Good morning everybody. Or good evening, whatever time it is where you are. Here in Redmond, it is morning. It's Thursday, and it's time for another episode of the Visual Studio remote office hours. Since last week I had a few updates to my home office. Look at this thing. I got this microphone arm. It's a very cheap one, but it does the trick. And here's a really cool thing: when I'm not doing my video recording or doing meetings, I'll do this. And it's gone, and you can still hear me if I yell, right? Very cool. This is probably the thing I've been most excited about, because it makes the quality of my voice so much better. Super happy about that. Here's another thing. I forgot my pen — this is the Surface Pen. I forgot it in the office, and I went in the other day to pick it up. I find myself doing way more screenshots now than I ever have, because I can no longer just show my screen to a colleague. I have to take a screenshot and send it to them, often. So I do a single click on the top here, and that puts my screen into screen capture mode. Then I can just outline what I want to share, and it automatically copies it to my clipboard. So that's really cool. The same thing happens if you don't have a pen: Windows+Shift+S. You can try it now on Windows 10; that will bring you into a screen snipping mode. Anyway, those were a few updates on the home office. This is going to be the last time you're going to see the basketball hoop. I'm going to move that out. So I'm sorry for everyone who's been eager to see me try to dunk a ball in that. Not happening. All right, so that was an update on that. Today we have a fantastic show, because we are getting a rare insight into how our internal customer research on the Visual Studio team takes place. We have labs, we have interviews, we have surveys, we have a bunch of things that we do.
And typically it's the program managers, such as myself and others, that talk to customers and learn things. And all of this goes under the umbrella term customer research, I think. And here to talk about that, we have Mr. Research himself. Carl, welcome to the show. Thank you. Thank you for being so generous. Thank you for welcoming me into your garage. Yeah, I mean, I feel privileged; not many people get to be part of this, so thank you. I wonder, Carl, before we get into the meat of the matter — you've been at Microsoft for many years. Would you mind introducing yourself to everybody here? Sure, okay, yeah. I'm the old-timer now, I guess, at this point, so I've seen the changes in how we engage with customers over the years. My title is UX researcher. And my job is — well, a great way of thinking about it is to help us as an organization figure out what is the right thing to build and how to build it right. The first half is about what customers really need: trying to peel back that onion to figure out what is the core problem that we need to solve. The second part is, you know, you can think about it as the usability part, the utility part. How do we make it usable, not only from the discoverability side, but also from that day-to-day usage side, right? That sort of: I'm working in the editor and it has to work smoothly for me. So I'm helping the organization figure out those two halves of that equation. Anything else you want to know about me? I was an old C programmer way back when, did image processing stuff, and sort of discovered human-computer interaction by accident. I had a degree in medical computer science and did research at a cellular level for many years. So then I figured, hey, we're multicellular creatures, so I'll take that skill set and just translate it to us. And that seems to have worked okay. So. That's very cool.
I think it was Bill Gates who, when he was asked what he would do if he wasn't into computers, said he would be in biotech. So that's maybe the opposite of you — you came from that world. Maybe, maybe, exactly, exactly. I guess the other side of the fence always looks more interesting. So anyway, yeah. Okay. So basically your team is one that facilitates the various product teams around Visual Studio. I think a lot of people maybe are not aware of this, but typically the way we're organized within Visual Studio is that we have a bunch of product teams, as we call them, or feature crews. Each is usually one PM and about five to seven engineers, and that includes the engineering lead. And they work on different things. I used to be on one that did Visual Studio extensibility. There is one for the editor, there is one for source control — the Git integration — and so on and so forth. They all of course work together between the teams, but they are also their own unit at the same time. And they each come to you, right? And they say, Carl, I'm going to build this feature, or I've heard about this problem — can you help me? Is that how it starts? Very often it does. But sometimes we're not quite aware of the problem that we have yet, right? We think it's a problem, but we don't understand what is really going on. So we want to really have people use this stuff, see this stuff, try it out, for us to figure out what is really going on. So in some cases the trigger is not a feature, but rather: we have these students coming to Visual Studio, and they're installing it but not running it. What is going on here? And that's where I bring in the techniques to get to that deeper level of understanding as to why and what and who and where and all that fun stuff.
So that's where I come into play. It starts at different places. Sometimes it's an idea that somebody has — would customers want this? In other cases, it's like, what the heck is going on here? So it can come from many different directions. Okay, so you said something that grabbed my attention. You said that some people, or students or whatever, would install Visual Studio but then never run it? Exactly. How did you figure that out? What's the reason behind that? Wow, okay, great question. So yeah, it had us flummoxed, right? I mean, when you ran the numbers, you'd think: you just spent all that time installing Visual Studio — why not run it? Well, what we did is we brought a bunch of CS students into the lab. Many of them were from the UW because it's close by. We basically gave them a browser and said, hey, somebody said Visual Studio might be an interesting product for you, why don't you go take a look at it? And we simply watched them, right? Now, what was interesting is that at some point through the installation process — all of us folks who know Visual Studio know it takes a fair amount of time to install, and it's gotten a lot better over the years, but still — the youth today kind of think of it as a five-minute thing. Ask any college student how long it should take to install a tool, and they will say five minutes. Well, what happened was they would start installing it, and after about three minutes they started getting twitchy, expecting it to be done soon. And then at some point they're on the internet looking at pictures of dogs and cats, because they're just waiting for it to install. And they've almost sort of forgotten about Visual Studio at that point.
And so that's one explanation for that, but it was intriguing to watch that in the lab: watching people have that experience, and not only seeing their expectations, but actually seeing when they start getting bored. Because you could see it in their face, you could see it in their attitude, you could see it in their physical mannerisms as they're installing Visual Studio. So what do they do? Do they reach for their phone at that point? No, they're kind of stuck in the lab, right? They're waiting for this thing to install, and they get bored and start going to the internet and looking at other things. And that actually led to some inspiration. We said: can we keep their attention while they're installing Visual Studio? Can we give them some upfront insights into how to use Visual Studio while they're waiting — not only to occupy their time, but, almost more importantly, so that when they first start Visual Studio, they kind of already know what to do? They already have a notion of what a solution and a project are, how to pick the first template, and how to do the basic things to do Hello World, for example, right? And what we discovered is that if we could get them almost a sort of visual tutorial that they can explore beforehand, they were far more successful when they first opened Visual Studio. So that was one of the deeper learnings we had along the way. That makes sense. And it's kind of funny that I didn't know it came from there, because it's something we talked about for well over a year, maybe two years, right? About having something that makes the installation process a little bit more interesting or engaging somehow. So that all came from that research you did back then, I guess. Yeah, and there were several deeper learnings around that too. For example, people don't want to read a whole lot of text. We kind of know that, right?
But when you see it — when you see people skimming — it really brings home the point that you need to be very crisp in your communication. And the other thing that was really interesting: when you want to learn how to do something, a little animated GIF that shows you where you need to go, instead of reading two or three paragraphs of "go to this menu and hit that dropdown" and da, da, da, da — a little animated GIF was something that people really gravitated towards, because they could see it in action. It was kind of small, but it gave them the general place to start, and that proved to be very, very effective. You do need to think about how to do those animated GIFs, because they can be pretty poorly done too. If it's too small, you can't really see it very well. So you have to think about it a little bit, but it's very effective if you get it right. Okay, so we have a question here from the Q&A asking: if I turn off the telemetry, will I then show up as having installed the product but never run it? So in other words, are these numbers real, or could they be a consequence of people disabling telemetry? That I don't know. You should definitely talk to Kathy about this one, because she did that initial body of research and did that instrumentation work. So you would need to ask her about that. I don't know for sure. All right, I should ask her. That's a really good question, actually. It is a great question, yeah. Okay, so we have a way of bringing people into a lab. We literally have a lab in one of the buildings. Is it Building 16? It is, yes, it is Building 16. And it's a full-on lab where people come in and sit down and you take them through tasks. How does that process work? You want me to show you some pictures? Yeah, let's see them. All right, all right. So I'll go to share screen and then pop through pictures. I'll just kind of bring up PowerPoint.
I'll show full screen here, share the full screen, and then let me know when you all can see it. Is that okay? Can you see that okay? Here we go. Yep, we see it. Yeah, it's coming through. Okay, I'll show you the cleaned-up pictures from when they first built it — all the shiny ones, right? Before it actually got messy and things got busy. So basically there's a big old garage door. We can open it up, make it very inviting. And there's this shared space here, and you see the labs around it. The goal was to create almost a design space, a maker space, so we can surround ourselves with our customers experiencing these ideas, these new products, these new features, and start brainstorming about solutions to some of the problems we're seeing, in a space that is very, very collaborative, I should say. And that's why you see these screens up on the side here. I'll show you a few more pictures. What's happening is you're actually seeing inside the lab: you're seeing the customer engage with the product, trying to figure it out, trying to use it for the first time, or using a new version of it that we've changed and tweaked in some meaningful way, and we're trying to investigate if that's a useful change or not. This is looking at it from the team's perspective on the side. This is the side where we work, and I'll show you a couple of pictures of us in action. The users are on the other side of that mirror — it's a one-way mirror, actually. And the purpose of that is that we don't want to bias the customer in any way. So we want them to go in there with some tasks to do, to go at it, and for us to watch what happens without interfering, or being very deliberate when we need to jump in. So these are pictures from a typical Thursday.
One of the things that we do in the lab is that these individual teams that you spoke about earlier develop a cadence. Very often Thursdays are a really big day, though it happens on other days of the week too. What happens is, on a Monday we figure out what we want to bring into the lab in front of customers. And it could be, like I said, an idea, or it could be something that we're tuning. And we just pre-schedule these customers every week. So every Thursday, every team knows: I'm going to have customers in front of me, and I can go explore ideas. It could be just a standard sort of usability thing, or it could be a mock-up of some idea where we want to find out if it has legs — if it's a meaningful change for our customers, if it can fit into their workflow and work style in a meaningful way. So every week gives us that cadence, that constant stream where we've always got users around us. Some other pictures around that. This is again the team in this design space. We're absorbing what we're hearing and what we're seeing. And I think that's an important point here too: we're not only listening to customers, we're watching how they're doing stuff, and looking at the two together, right? What people say is really useful. Sometimes what people don't see is equally important, right? Especially when you talk about discoverability of things. What people don't notice, they can't tell us about, but we can see them not noticing it. And last week you talked a little bit about the eye tracker and stuff like that. So we even have the ability, when appropriate, to put on an eye tracker to see what they're not looking at — to see what part of the UI they're missing that we need to somehow bolster up in some meaningful way. So — I do want to stop for a moment. Yeah, I have a follow-up question here. Oh, you do, all right, yes. All right, so this is amazing.
This is like a proper lab, right? You actually do proper testing here. This is significant. And I think I saw a number a while ago saying that you run like 10,000 people through customer research a year or something like that. That's a staggering amount of people. It is staggering, but that 10,000 also reflects other things, not just the lab, right? If you think about how we did our job 10 or 15 years ago — there were specs, right? Then we went to agile, to lean, to where we all are now. The lab is just one space, right? It's just one place where customer engagement happens. It's convenient for many folks because it creates a cadence; it creates a healthy forum for having that conversation. But we also have interviews — hopefully some of the audience have participated in these — where we will remotely share stuff with you and ask for your feedback. So when you think about that 10,000, it really encompasses all of what we call zero-distance touch points, which is still massive when you think about 10,000 direct customer engagements, right? That's a big number, and it's an amazing number. It was so amazing that Satya came about a year and a half ago and was quite awestruck by the whole experience and seeing how much we do. No wonder — 10,000 is just a gigantic number. But the obvious question then is: how do you find 10,000 people? I assume they're different people; they can't be the same people every time. Or maybe that's okay? How do you go about that? That's a great question, and a great segue, because when we talk about the lab — well, the world is not Redmond and Seattle. The lab is where people physically come in. But again, we reach out remotely, and we have a number of different ways of connecting with customers. For example, we have an organization that will recruit for us. They will reach out to folks and ask them to participate.
We'll give them a profile, and they will seek out and contact those folks and ask if they can participate for 45 minutes to an hour. Lab sessions typically are like an hour and a half, because we will do multiple different things. For instance, if we are testing APIs — we can test APIs for usability, for example, and do it in an iterative sort of way — coding takes a little while, so they're in for an hour and a half. And then there are services out there that allow us to do these little mini usability studies that are like 15 minutes long. We create a screener, it goes out to the world, and people sign up to participate in these little mini studies. So there are all kinds of different venues by which we do that. And then of course conferences, right? In fact, let me show you a picture from Build, one of the conferences. Yeah, okay — Build, unfortunately, is remote this year. Well, it's going to be interesting, right? But anybody who's attended Build will recognize this, because at Build we sort of turned around the story. Normally it's about downloading stuff to our customers; here it's reversed. It's about uploading from our customers — them telling us their story, telling us about issues, and us taking that back to the lab. So again, there are many, many different venues to reach that 10,000 number. And, you know, last year we had this wall of people submitting feedback to us — direct feedback in a very, very structured way, in some cases even adding pictures — and that wall just took over. I mean, just there on that one wall, we had 660 people having given us very targeted information about the product: what they like, what they don't like, what they see as an improvement to it. I remember that. I thought that was such a powerful thing.
So what Carl and team did was, in Building 18, where the Visual Studio team is, on the second floor, there was this big long wall down a corridor, and they basically just completely covered it in tiny little pieces of paper that each represented the story of a user. And those were handwritten — in the users' own handwriting, even, I think. And going through, you categorized them. Some were about the setup experience, some were about the editor experience, or something like that; they were categorized into different things. So as PMs — or anyone, engineers and whatnot — we could go down and see and follow and learn firsthand about a direct feeling that users have engaging with the product, or a frustration, or something they're really happy about. Because I think oftentimes it's easy to just focus on the problems that we have to fix in the product, but many times we also have to identify: well, what are the things that work really well? Let's make sure we don't screw those up, right? One thing that stood out — that I thought was the most hilarious thing ever — was a guy who was very unhappy, very unhappy, that we changed the logo. Do you know what I'm talking about? Oh, I still have it. Can you talk about what he did? So in years past, we would have these feedback walls, and people would scribble on a sticky, maybe put their picture on it, and it was sort of loosely formed. And it proved to be very hard for us to get it back to the teams, right? Who does this belong to? If somebody says, oh, performance is not so great — well, who do you bring that to, right? Because very often performance is specific to an activity. It's specific to something.
And what happened there was we put some structure on it. We have something called the hypothesis framework, where everybody builds these hypotheses, and we have a sort of Mad Libs-style fill-in-the-blank moment, where we say: as this type of developer who does this kind of work, my problem is this, because... And when you fill in that stuff, it gives us a much richer vein of information for us to then get back to the teams — to the proper people in the teams — to really understand the source of that anxiety or that design idea. What happened there is, his thing was: as a long-term Visual Studio user who's a tattoo lover, you have changed the icon. He had the original Visual Studio logo tattooed on his arm, and in the picture he shows the original logo. So, jokingly, he was describing his pain point of us changing the Visual Studio logo — this imprinted tattoo on his arm is no longer current. It was so funny. So awesome. We talk about how, hey, this has consequences if we build one feature over another, or we don't fix that bug because we instead used the time to fix a bug somewhere else. It has consequences, but never a consequence like that. That was just from left field. Yeah, that was, yeah, yeah. But the other thing I think is important to talk about here, especially in these times, is how important a product like Visual Studio is to our customers, right? And how important it is for us to be very, very thoughtful in what we change and what we add into Visual Studio, and how we think about it, right? Clearly, when you have to live in it four, five, eight, ten hours a day, it becomes something that's very personal to you. And that's important to consider too. And this is why these venues are so important, and why these engagements are so important to us — because we can't understand all the different contexts.
There are so many different perspectives that people bring into their daily lives, and creating a tool that has affordances for those different perspectives and those different skills and those different contexts is quite challenging. Right, and it is something we take very seriously. But, to echo what you were saying, with everything going on now it's even more important, right? That we take this very seriously, because this is now used for very serious business around the world, right? To stop these things going on. And so it adds a little bit of pressure, I'd say. Hey, but that's good, right? It's a good pressure, and we feel good about it being used for good things. There's been a really good sense, I think, on the team — the team all up, Julia, Amanda, and all the way down — that we're making a difference for all the people that use Visual Studio, and of course other programmers using other tools as well. It's actually making a real difference: keeping the world running right now, and coming up with algorithms and stuff that can sequence DNA and come up with cures and whatnot. So that feels very empowering when you think about it. We as an industry, as programmers, as developers and so on — even though it doesn't seem like we're doing much, right? We're sitting at home. We're not healthcare workers or anything like that. No, but we make sure their email works, that they can communicate, all this sort of stuff. Right, exactly, exactly. Yep. So Carl, we have another question here. This is a good one. Oh, okay. So you mentioned that we bring beginners in to test Visual Studio, right? We have them in to run setup, and we look at how we can improve setup by studying new, beginner Visual Studio users. But what about power users or experts?
Like, do we do any studies on them to make sure that any problems they face are also taken good care of? Oh, no, no, no — again, we're very hypothesis-driven. In that particular case, we were investigating a specific problem. In the quant data, right, we found out what these students were doing, and so that led to that investigation. But in our lab, and particularly with the IDE, we have a very broad spectrum of customers coming in. In fact, we even had this great experience where a developer who was blind came in and used the debugger. And it was an amazing experience, because the entire dev team showed up to understand the accessibility of the debugger in general. Just watching that customer struggle with things that we just didn't think about, or kind of overlooked, or didn't prioritize correctly — it was a remarkable experience for them and a remarkable experience for the product, because I think it inspired that entire team to focus in on those issues. They learned a lot and really turned around the accessibility of that product. And I have many folks, you know, 20-year veterans, who will come in — "I remember VB6", you know, sort of thing, right? And that actually speaks to what is really kind of a challenge here: for any product that you use so much, you develop habits. Think about the editor, for example, and how an editor works. It's a big deal. And you're on a path where you do need to make progress, you do need to improve on things, but you also need to be very cognizant of those patterns of behavior. So the challenge becomes trying to understand how you can augment, how you can work within that learned behavior, and still be able to provide additional value in a meaningful sort of way.
So yeah, and then, you know, I have folks coming in who have to zoom in on the screen, have to increase the font size, because they're in their fifties and their eyesight is starting to become challenging. So we really invite as many different folks in as we can, and we love the diversity — the diversity of people using Visual Studio, week to week. That person who comes at it from a very different perspective, who offers us that perspective, gives us yet another way of thinking about how we should approach the problems. Yeah, and that's a really interesting thing with habits, right? Because we all do things differently. One way that I see that very clearly with our users is in how we gather feedback. People can send feedback from within Visual Studio, whether it's a bug report or a suggestion ticket, right? And something can be really, really important for one person for us to fix — it's the core of their workflow. But they are, you know, unique, in the sense that — well, there aren't really any unique Visual Studio users; we are too many, like 8 million or whatever. So usually, when we see that people open a bug saying, hey, this thing doesn't work, we think about it as being 1,000 people. One person equals 1,000 people. And that is just a number we made up. But the point is that there are a lot of people that have the same problem, and it could still be relatively small compared to the 8 million Visual Studio users. So it might be something super important to one person, but not to the others. But that doesn't mean that thing isn't important to that person just because they're doing something niche, or their workflow or their habits have them use the product in a certain way.
And one way I think the UX lab and your team really come in and shed light on this sort of stuff is — here's a very concrete example. How do you build your solution? Well, there are probably three ways that I can think of. There might be more; I'm sure there are more. You can use a keyboard shortcut — a lot of people do that, that's very common. But when you take people through the lab, even people that have used Visual Studio for years, you learn that a lot of people actually go up to the top-level Build menu and click the Build menu item itself. Or they right-click their solution and say Build Solution from there. And so all of a sudden you learn that for the same task, there are multiple ways of doing it. And it's not like one is more correct than the others, but you might have optimized for a certain flow because you thought, this is how people always do it. It never occurred to you that, hey, people do it differently. And so then you end up with all these bugs and you don't necessarily know how to prioritize them. And we struggle a little bit with that. But that's where the lab really comes in, right? Because you can very clearly see that our assumptions about how people use the product probably aren't the right ones. Right, right. And then it becomes interesting. I'm going to circle back a little bit to the student profile. We sort of view running and debugging as the same thing, right? We've conflated the two together, and that's been a long-term Visual Studio thing. Well, when students come in and write some code — write their Hello World code — they look for a run command. And there isn't a run command. There's a play button, that little green arrow that looks familiar, but it's about debugging, and that doesn't quite make sense: I don't want to debug, I just want to run my code. And you will watch them look for a run command anywhere in the product.
And again, going back to what we just talked about: so many different people coming at it from different perspectives, and trying to weave through all that is fascinating and challenging. It's really hard, but it's hard in a good sort of way. Yeah, so this is a good segue, Carl, because so far we've been talking about bringing people into the lab. This is something very, very few teams or companies can do, because it's not a cheap thing. We have a whole team dedicated to running these labs, and so it's probably only the big tech companies that can do that. But we do other things for this sort of qualitative information gathering. We do customer interviews, where we Skype people, or use Microsoft Teams, or call them on the phone. And we have that down to a science too, right? There's a certain recipe for how to do it right; there's a lot of stuff we've learned over the years. How do you conduct a good interview, and how do you do a bad one? Can you talk about that a little bit? How do you even get started with doing these interviews, and how do you make sure that your own biases don't predetermine the outcome before you even start? Stuff like that — some common pitfalls? Yeah, so there are multiple parts to your question here. Yes. One is: what if I'm not a big company? I'm part of a small team, and I don't have these resources. That's an interesting question too, right? Then there's: how do I do this successfully? How do I remove my own biases from this? And then there is this rich space of things you can do to engage with customers — all these opportunities — and they're each special in their own way and in how you approach them. So your question is really pretty open. Do you want to pick one of them over the other? Do you want to start with one, or?
Yeah, let's start with, like, if you're not a big tech company, how can you do your own customer research? So one of the things that, in fact, even happened in the office when we were in the office: I had a team who, we really didn't have the ability to resource them. And they were dealing with some docking windows issues. And they were trying to understand, is this going to be intuitive, and all that stuff, right? And so instead of sort of going into this large structured sort of approach, I said, follow me. And we walked to another building, another team room, and we walked into one of these team rooms and looked around for a dev. And basically, jokingly, "You look like a dev. Can we chat with you for about five minutes?" And we brought our laptop with us. We had a private build of that feature and we gave them a task to do. Hey, could you do us a favor and try docking these windows? And we just watched this person try to dock these windows. And we would say, try docking it here, try docking it there, try different things. And suddenly we said, ooh, that worked great; that, not so much so. And we were able to take that very quickly back. And we only had to do that like two, three times. You know, when you see three people in a row having the same problem, you're going, well, you know, there's a chance that this is going to be a big problem for everybody. Okay. Okay. That makes sense. So was it important then that the person that you interviewed there, I guess at that time, was not on the team building that docking feature? It was someone that had no idea, no prior bias either way. So if you were to do this in your own company or wherever you might work, let's say that at your company you sell shoes online. 
And so you have a bunch of developers, and if you go to them and ask them, they might know the problem if it's a small team, but you can go to someone in accounting that would be a potential customer, right? Goes online, but doesn't know anything about the technical aspects that you're about to ask them about. Would that be how you do it? Absolutely. You know, you pick shoes, you know, everybody buys shoes, right? So is somebody in accounting going to be fundamentally different than somebody in delivery around shoes? Well, maybe, right? Maybe not. And so, you know, watching somebody try to buy shoes online, if that's sort of the goal and you're working with a new idea and seeing how they use it, it's that simple. Now the hard part here, for all of us, is bias, right? We talk about bias, you know, internal versus external customer, and so on. There's also our own biases. And when you have a feature and somebody struggles with that feature, your first inclination is to do what? Help, right? We're all trained to help, right? In our industry, we're built to help, right? When we see somebody struggling, we wanna fix that problem. In this case, you have to sort of back away from that and let that problem happen. And in your brain, you have to go, if I can let this problem happen so I can understand it well, then I have half a shot at figuring out how to solve it for a large population of people. That's the hard part. The hard part is watching somebody struggle and be in pain and be confused and not wanting to jump in, rather taking the approach of letting it happen as it would. And then afterwards, you can help that person. That's perfectly okay. But the focus at that moment is on learning. You're learning, learning, learning, learning. You're not helping, you're learning. And everything you do is about trying to mitigate those biases to maximize that learning. 
Does that make sense? Absolutely. I think, so last time you and I worked together was on Visual Studio Codespaces, and it was about figuring out, we had a bunch of hypotheses: what are some typical customer issues? What are some problem hypotheses and what are some solution hypotheses? And we started calling people, over 50 people we called. We had an hour with them each to go through a bunch of things. And this was all done on Microsoft Teams. And I remember a lot of the things that we thought were gonna be validated. That's a slam dunk. Our assumption was that everybody here has that particular problem, or I think that this is a good solution for a previously established problem. And it turned out that that was wrong. And oftentimes when those people struggled with finding the solution or identifying the problem or something like that, you know, my initial reaction was I wanted to help them get there somehow. Because they would typically say, oh, there's a problem with Visual Studio when I do such and such. And that's not why you're doing the interview, but I kind of wanted to go in and say, well, have you tried doing this and this instead? Like, completely irrelevant to what we were doing, right? And then derailing the conversation and that sort of stuff. But I just find it so fascinating that we can be so wrong all the time. And then, are we, though, so wrong? I think, you know, you think back in sort of software and the evolution of how we do things. And, you know, we were in a model where the most knowledgeable person was the person who we would assume to be right. If you are in a frame where you're really brainstorming, you're really engaged, you're really sort of looking for opportunities in many different places, you better be wrong. Most of the time, right? You can't be right. There's no way you can be right. The world is very, very complicated. 
And the real trick here is, the faster you can figure out you were wrong, right, and the faster you can learn from that mistake, the better you're gonna get at coming up with a solution that the broadest set of people will have the maximum benefit from. Yeah, you wanna be able to figure these things out so you can course correct early on, before you get too deep into the development cycles, right, and have to back out and re-architect and all that. And so a great thing to do at sort of a physical level, what you'll see in the lab, is we'll bring in these low-fidelity mockups, where what we do is we show you how that experience could feel if you were to use it. Okay? And in some cases, it's a simple sort of what we call a user-step, system-response model, where you do this, you see this, you do this and this. Then you can do interesting things like, what would you do next? Which is almost like this mini sort of usability thing. Would they know what to do next if we show them the screen? And in some cases it's really easy and simple to create this sort of mockup. It's literally taking a screenshot, slapping some MS Paint UI on top of it, enough to give the customer a sense of what it could feel like, okay? And sometimes, very often, that is sufficient to find out that, well, we were smoking something that day, and rip it up and start all over again. But when we start all over again, we go, wow, that didn't work, right? PerfTips, which I worked on several years ago with a team, is a great example of that. The first idea that we had, this is something that came internally. Some higher-up people had this notion for this window that would give performance information, and it would sort of look like the Windows sort of thing, and it shows CPU and memory and stuff. Like a task monitor. Yeah, kind of like that. And then as you step through your code, this thing would sort of, you know, change. 
And what we did is really simple: we took a screenshot, we did sort of a really cheap variation on top of that. We showed you stepping through code and seeing that window. And without explaining anything to them, we asked them, what do you think that thing is? You know, they would say, I think it's something about, you know, my CPU, but, you know, I guess it's dealing with, you know, my memory. It was all these really vague sort of answers. And then we asked the question that's always great to ask: what would you do with that thing? Oh, I'd probably close it. Which is kind of like the kiss of death, right? Once it's closed, it's not coming back. And so, because it was a low-fidelity mockup, we did this two weeks in a row, couldn't get it to work. We made some changes to see if we could; it failed over and over and over again. Nobody got it. Nobody could articulate how they would use it in the real world. And so we basically ripped it up and started over again. And that led to the PerfTips story that we have today, where the goal was to democratize performance. And so that's the feature you see today. Nice. Are there any other sort of surprises that you have just observed over the years, where we were sure about a certain outcome that then turned out to be different? Are there any fun stories from the trenches? Oh God, there's so many. There's big ones and small ones. And where do I start? Console app, that's an interesting one. You know how the console app works if you write your little hello world story. So like a .NET console app, for instance? Exactly, something simple like that, right? So I'm gonna circle back to the student story because it's just top of mind right now for me. So students, and these are CS, these are computer science students, right? From the University of Washington and some neighboring schools, so really smart people. 
And a lot of them were at the junior level, already had gone through a full suite of coding classes. And they would be writing their hello world app just to get a flavor for it. And that was interesting too, because the question that we had is, is hello world still relevant for new developers, for the people coming in today? Turns out, yeah, it is. But you think about it, it's a concept from way back, from Kernighan and Ritchie's C programming in the 1970s, when that notion was introduced, but it seems to be still relevant today. And so they're writing this code, and they write this hello world and then they run it. And then the screen will flash and go away. And they're like, what's going on? Now, this is a behavior that's been in Visual Studio forever, right? The fact that you have to stop it, you have to put an input read or a breakpoint, something in there, in order to keep that code from completing its execution. And it's been something that we've just assumed all along. And the PM who watched this week after week watched these students hitting the same problem. And they were left in the dark. It would disappear, like, what happened? I tried to run it and nothing happened. And it was only in the situation where things were running really, really slow that you kinda saw something appear that looked like, there was something in there that could be your code. And so he took it upon himself to make the change so that the window would stick around. And what was really fun is that he got an email, after they got it out the door, from a very senior-level developer who just loved this new feature, because he could take that console window and dock it, he put it onto his other screen and it stayed there. And he was able to do the work that he needed to do. And he loved it, and for him it was a new feature, right? So that was a pretty interesting pivot on something that we had all just assumed, after years and years and years, of this is how it works, right? 
Oh, are you there? Sorry, I muted it because I have painters outside and they have this machine that pumps paint. So I was muting my phone, sorry. So let's see here. So what's really cool about what Carl just said: there was a test in the lab that was happening about students and how starting up Visual Studio for the first time was being perceived by these students. But what's really cool is that the people that were sitting there in that room observing, the product team sitting in the room observing all these students, they started seeing patterns that they weren't there for. It was just something that emerged, and they noticed something, and they could brainstorm it right then and there and basically come to fix some problems that they didn't even know existed till they had those students in there for completely different reasons. And I guess that's a quite frequent thing, I would assume, right, Carl? Yeah, I'm back by the way, so I got kicked. No one noticed but me. Okay, well done. So I had that moment of panic, like, did you even hear what I said? It just sort of stopped, so could you repeat the, just frame the question for me one more time? I'm sorry. Yeah, sorry, Carl. It's that we had the finding where they figured out the console app was shutting down if they didn't have a console read-line in there. Like, they figured that out by just observing people in the lab; they weren't there to test out that scenario. It just happened to be a pattern that emerged, and then they were able to fix something that they didn't even know needed fixing. And so is that something that happens frequently? Oh yeah, and the thing is, when you watch somebody, in this case, Augustine was the person who just, it drove him, watching that week to week just drove him nuts. Seeing that stupid thing happen week to week just drove him nuts. And it gave him that passion and that ammunition to go, we have to do something about this, right? 
And it happens all the time because, when you watch somebody struggle with something, we talked earlier about watching a blind developer struggle with your code, with your app, and for things that you've missed, things that you've thought about in sort of not quite the right way, you didn't really understand the problem well enough. That's a source of energy. That's a source of inspiration. That's where you're willing to sort of step it up, right? And that just happens all the time because, well, it's like we all wanna help. We all wanna fix things, right? It's intrinsic to our industry. It's intrinsic to who we are as tech people. And so watching and seeing and hearing this stuff is just powerful. Can't shake it. How are we doing on time? We're good, we got four minutes left. And so there's one thing we haven't talked about yet, Carl. And unfortunately, we're not gonna have much time to do it, but we went through bringing people into a UX lab, which very few people can do. We talked about, hey, you can do interviews with customers over Teams or Skype or whatever is your preferred method, or over the phone. You can sort of interview or have run-throughs of your features with people that are in your vicinity, in your office, you know? And we do another thing, which is we do a lot of surveys as well. And I'm sure that everyone watching this, you've seen us ask you questions, whether it's inside Visual Studio or on the Visual Studio website somewhere. Even the blog asks you for your feedback. And so, you know, a common fear, of course, is that we ask too much, too many times we give you these things. But there's a reason, and there's a method to the madness, right? How do these surveys work, and why is it important that people answer them if they have the time and the capacity? So, also a great question, because there's multiple aspects to it. 
So, when we learn stuff in the small, like, you know, we're in the lab and we bring in, you know, five, six, 10, 15 people and we start learning what's the right way of asking the question, what is really the core problem? Then the next thing is, you know, how pervasive is this? Is everybody suffering from this, or is this a meh? Sort of, you know, are there different types of people that are more affected than others, right? In some cases it's a meh for some people; for other people, it's like a the-world-is-coming-to-an-end maybe moment. And so the survey helps us get that breadth perspective. In some cases we're also trying, depending on what the hypothesis is, what we're trying to learn, it could be, you know, we're trying to say, hey, are we missing the boat here in some fundamental way, and getting that input from customers? And who are these people using this in the first place? You know, we have limited visibility, we have instrumentation and stuff like that, but who you are as a person and what you're trying to accomplish is something that we're trying to make more visible, and these surveys give us that window into that. And so, you know, we always feel like, are we asking too many surveys? And we have some controls in place, as you know, to sort of mitigate that. But yeah, it's a very real sort of issue: how do you reach out to all these people to get this feedback without overburdening people? And I think, you know, it's still a struggle, and it probably will be a struggle for some time, until, you know, we all figure out what the right path and rhythm for all this is. But it is incredibly important for all of us, you know, anybody who has these questions. We're trying to learn, and our customers are our colleagues in all this, and, you know, learning doesn't happen in isolation, it happens in collaboration. And so this is just another form of collaboration. 
Yeah, and oftentimes we use the survey because we think there might be something that we're missing the boat on, as you were saying. And if it turns out to be true, that could lead to us bringing people in specifically to test out some of these problem areas in the lab, or doing some customer interviews, and really getting qualitative data on what the problem is here, so we can get to the bottom of it. But the survey can help us figure out if there is a there there to begin with. Exactly. It really goes both ways, right? We have sort of a, I don't have it with me, a kind of graph that shows the, you know, we talk about qualitative and quantitative. Sometimes the quantitative shows an issue that's like, what's going on here, that the qualitative then helps you bring light to. Sometimes you have a qualitative observation and you're going, we're not really collecting quantitative information on that problem; we didn't know that we should be doing that. And then it goes back to figuring out how we can get that information at that quant level. And so there is sort of this back and forth that's always happening. And that tension and that back and forth is where the learning kind of really happens. Awesome. Carl, we are at the end here. Thank you so much. This was super fun to learn about, and seeing those pictures from the lab was just amazing. It's really cool stuff that's happening there. So thank you so much. You're very welcome. Thank you. That was, it was fun. Awesome. All right. All right. So before we leave, if you watch this on YouTube, remember to hit the subscribe button below. We really appreciate your viewership here, and feel free to share the video. Next week is Build, so we won't be doing this office hour because of Build. There'll be so much more content coming to Channel 9 and YouTube, or wherever they're gonna put it, that you're gonna be so busy that you won't even have time for this anyway. So we're gonna be starting up after Build. 
So the week after, hopefully. So thank you so much for tuning in and see you next time.