I'm terrible at remembering to do that. So here's a little bit about me. I am a web analyst on the IT communications team at McGill. I'm actually switching over, though: I'm going to be going into communications and external relations at McGill starting at the beginning of July. Throughout my career, I've kind of flip-flopped back and forth between the two departments. McGill is the fourth university that I've worked at. Prior to McGill, I worked at the University of Victoria, where I also did UX exercises and UX research. Before that, I worked at Western, and prior to that at King's University College. There is a link to the presentation slides right here; you will be able to see this at the end of the presentation as well. If there are any Drupal developers in this room, we used the ShURLy module to create this tool so that we're able to shorten links for people in our community and not use Bitly or Ow.ly, which our information security people told us was a security issue. So now we have our own link shortener. If you attended my last presentation, this slide is going to look very familiar to you. I thought I would start this presentation in a similar manner, but not from a tactical perspective: to talk about the difference between how we used to create websites and how we create websites now. 1997 was about the time that I started working with websites. This image, even though it looks funny, is very indicative of how websites looked back then. I don't know if there's anyone in the room who was creating websites back then, but the process of planning and creating the architecture was pretty simple. You sat down at a desk and you created a flowchart with circles and lines that showed what your navigation structure was going to look like, and there wasn't really any democratic process.
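As an aside for the Drupal folks: the core idea behind a link shortener like the one mentioned above can be sketched in a few lines. This is a hypothetical illustration, not the ShURLy module's actual implementation; the short domain and class names are made up.

```python
import hashlib

# Illustrative sketch of a hash-based link shortener (NOT ShURLy's code).
# A short code is derived from a hash of the long URL; a registry detects
# collisions and keeps the code-to-URL mapping stable.

ALPHABET = "0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ"

def to_base62(n: int, length: int = 6) -> str:
    """Encode an integer as a fixed-length base-62 string."""
    chars = []
    for _ in range(length):
        n, rem = divmod(n, 62)
        chars.append(ALPHABET[rem])
    return "".join(chars)

class Shortener:
    def __init__(self, domain: str = "https://example.edu/go/"):  # assumed domain
        self.domain = domain
        self.codes = {}  # short code -> long URL

    def shorten(self, url: str) -> str:
        # Take 8 bytes of the SHA-256 digest as an integer seed for the code.
        digest = int.from_bytes(hashlib.sha256(url.encode()).digest()[:8], "big")
        code = to_base62(digest)
        # On the rare collision with a different URL, bump the seed and retry.
        while self.codes.get(code, url) != url:
            digest += 1
            code = to_base62(digest)
        self.codes[code] = url
        return self.domain + code

    def resolve(self, short_url: str) -> str:
        """Look up the original URL for a previously issued short link."""
        return self.codes[short_url[len(self.domain):]]
```

The same long URL always yields the same short link, which is what lets a campus community share and reuse them.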
It was something usually one person did because the job happened to land in their lap, and they just sketched out their information structure without really getting any input, which is a terrible way to do things. For me, around 2004, I realized that there was a shift: we were paying more attention to what user needs were. At this point I went back to school. The university I was working at at the time was Western, and they supported a lot of people in the department in going back and taking courses in the Faculty of Information and Media Studies so that we could learn more about creating better information interfaces for our users. The next university I went to, the University of Victoria, was the first university where I started doing focus group exercises and usability exercises. That was in 2008, so from a university perspective, I think that was pretty early. I'm hearing a lot more now when I go to conferences about other universities doing UX testing, but back then it was something that was a little bit on the leading edge, which was kind of neat. Right now at McGill, we're in a space where a lot of our projects start with UX research and are really based on following principles related to UX design. Here is an example of one of those projects. This is the first stage of our web evolution project: the launch of our new homepage, which went live in January of this year. There was extensive research and testing that was done to come up with this design. With the prior version of the website, it was really one of those experiences where there was a very small pool of people making decisions.
In this exercise, we made a really focused effort to reach out to different departments across campus and different audience groups, really communicating what it was we were trying to do and working through different exercises with them where we could get down to: who are we trying to reach on our homepage, and how can we best reach them? How can we best interact with them or provide the information they need to see? I'm not sure if you saw the prior version of our homepage. It was kind of a dog's breakfast; there was something for everyone on that site. For this site, I was really happy because we chose just one target audience, and that's prospective students, and only prospective students, as the audience for the homepage. That choice was made after extensive research identifying how people are using our website, who is actually coming to our website, and what they're getting out of the homepage experience. There is background information about all of that; for everything I'm giving information on, if you have questions later, feel free to come up. I'm just going to touch on it very lightly here. I'm now going to get into the nitty-gritty of the presentation. For this presentation, I wanted to provide something extra in terms of UX. It's not a basic primer on how you can do UX at your university. I was really happy because I went to see Aiden's presentation yesterday, and I wanted to mention that his presentation covered really well a lot of the basic concepts and best practices for doing UX at your university. The slides, I believe (you can correct me if I'm wrong, Aiden Foster), are at fosterinteractive.com slash UX tips. That's where you can find them. You may want background information about some of these concepts, because I'm going a step beyond that: I'm going to talk about how to do this effectively in a team structure.
If you have questions, that's a good place for you to go. I also have a resource slide at the end of my presentation with some additional resources you can look into. So, benefits. Why do UX exercises? These are some pretty standard reasons why. I think there are a few higher ed people here. How many people conduct UX exercises or follow UX design principles at your institution? So there are a few people. For those who aren't doing it right now, is there maybe an interest in doing UX? I know this is still something that's emerging. If you're interested in finding ways to encourage interest in UX at the university you work at, that's a discussion we had at the higher ed summit yesterday, and there were a lot of suggested tools for that, so feel free to approach me afterward if you're interested. If you're not familiar with the benefits of UX, here are some very standard things you can get out of following these principles. When participants take part in our exercises and we gather information through UX research, we get better information about what our audience's needs are. We also get a better understanding of our user interfaces: how well they work, and what concerns there are that we need to address. And we get information about user preferences for look and feel. At McGill, we have some unique things available to us. Because we have such an extensive team, our development team at McGill includes technical writers, trainers, support staff, and developers, and all these people are located on campus. They're not people who would traditionally be involved in these UX exercises or participating in them.
But we have realized, through a lot of the work that we do, that there is great benefit in finding ways to incorporate the different roles on our web team into the tasks that we're doing, and we wanted to explore ways to do this with UX. So, of course, the way to look at that is to think about how people might be able to participate in these exercises. In terms of who we get to participate in our focus groups, we are often looking at target audience members, and it depends on what project we're working on who those people are. But basically, they're often students, faculty, staff, alumni, and maybe external community members who come to the website on a regular basis. But there are also other roles, other people participating in these exercises. We have facilitators, people like myself who would go in and facilitate a usability test or a workshop or a user journey mapping exercise. We have note-takers. Observing is another opportunity for people to participate, as is being a project team member: generally, people that are working on these projects to revise and revamp these websites. So, what are some benefits that we've realized from incorporating these additional roles? There are a number of benefits that we started to experience right off the get-go when we incorporated a broader team into our exercises. One was that we were hearing more often about possible technology enhancements. This came from having our developers participate in our UX exercises. They participated quite extensively in the user journey mapping exercises that we did across campus in different departments. One of the things they were able to identify is that there was a real importance in making sure it was easy for people to create content that could be shared across websites.
And we have that to some extent now because we have a bunch of integrations. We have a bunch of tools available to our site managers where they can easily pull in course information from our eCalendar, so it's dynamically updated on their sites, or biography information from our Banner system, so they don't have to keep contact information up to date. But one of the things that is missing is the idea of just allowing people to create a content block that could be shared across departments. Not just a pool of information but, for example, admissions information, which is displayed on pretty much every faculty website we have. That information could potentially be managed and laid out by the Enrolment Services office, for example, and it could be automatically updated everywhere. That was an idea proposed by one of our developers. And I know it might seem like, oh, why didn't that occur to anyone before? But you aren't always aware of what the possibilities are from a technical perspective; when developers are involved, they can present those ideas, and they also get to see the need, which helps them get behind making that update in the system. So it's a two-way improvement to our process. In terms of ways to improve our web processes specifically, another improvement came when we did usability testing, or actually focus group workshops, in the Faculty of Engineering, and we had some high-level key stakeholders participate. I talked a little in my previous presentation about an improvement we have in the Faculty of Engineering, where we now have a committee of high-level strategic people that can make authoritative decisions on websites. These are people who are not site managers or editors; they're people that can actually effect communications and strategy changes.
And having that committee allows us to communicate the updates, changes, and enhancements we're making in our tool with people that can actually effect communication strategy changes. What was happening before was that we would make enhancements to our tool, we would communicate them to our site managers, and then those updates would fall flat, because the site managers didn't have the authority to say, hey, we need to introduce this change on our website that is going to affect the way we communicate with our community. Having that committee in the Faculty of Engineering is really beneficial now, and the idea for it was inspired by people at that level participating in our focus group exercises. So there are benefits, but there are also challenges, and this is the section where I'm going to get into the challenges. I did this presentation for my partner, who calls this the story section. It's the section where I get into some of the failures we had in trying to incorporate different people into our user experience exercises. I don't know if anyone has any ideas about what might have gone wrong when we tried to incorporate our support staff, our developers, our key stakeholders. But it will be very obvious, I think, once I start talking about it. There were mostly issues related to things like personal bias in terms of experiences that people had had, especially past experiences. There were difficulties with point of view: people had difficulty putting themselves in the shoes of the target audience we were actually trying to get information about. And another thing: if we brought people into these exercises as facilitators or note-takers or in other roles, sometimes they had difficulty remaining objective and would get pulled into conversation with our test participants about the decisions they were making.
I've actually seen some of these team members challenge some of the input being given at these exercises, which, as you know, is a big no-no. So I will go through a few of those. To start this section, I'm going to give you an overview of the range of exercises. This is not a complete list, but these are some of the types of exercises we conduct at McGill: workshops, user journey mapping, surveys, tree testing, usability testing, A/B testing, and reviewing existing data, of course. There are a bunch of other things we do too. We don't do all of these things for every project; it depends on the complexity of the project, whether we have other existing data that can feed the project, and a bunch of other factors that help us decide what is necessary. Again, I'm not going to get in-depth into what these types of exercises are, but you can refer to Aiden's presentation if you want background on what they are and how you can do them. Here's an example of a focus group exercise that we did: a diagram of the types of information students look for when they arrive on the engineering homepage. This is what's called an open card sort. I don't know if anyone's familiar with that term. For those who might not be, it's a type of workshop where you bring in participants and you allow them to sort and categorize content under categories that they determine themselves. The Faculty of Engineering obviously had an existing website. The reason we decided to do this exercise, which is kind of starting things at the base level, was that we did an initial analysis of the website. We did a benchmarking analysis, looked at other engineering sites, and realized we had a big jump to make in terms of how we needed to improve this site.
So we started by just getting people into a room, having them talk about what it is they need to see on this site, and organizing that information in terms of importance. All of the pictures I'm showing, by the way, are from the actual exercises I'm talking about; they're photos taken during the exercises. We got a lot of great participation. There was engagement, we got some really good input, and everyone was really excited, which is really great to see when you conduct these types of exercises. We also had one of the key stakeholders participate, and we got some good ideas from that key stakeholder. But after the exercise wrapped up, he made a comment that a lot of the input he had provided was actually related to a site that didn't exist anymore and to conversations he had had in the past, and it wasn't relevant to the actual experience of students at this time. Unfortunately, by that point, all the information he had provided was already mixed in with the data we had collected. This is a difficulty when you work in a university environment: there are a lot of staff members who have been at the university for a long time, and they're prone to bringing in this kind of historical point of view. So it's something we found we had to watch out for. The solution we came up with is that we are now doing a better job of communicating goals and objectives, especially to people participating from outside the audience group. The audience group is obviously going to stay focused on the information we're trying to gather, but for other people we bring into the discussion, we're really trying to do a better job of making sure they're aware of whose perspective we're looking at. Here's another example of an exercise that we did. This one was related to a service department.
I find that in the types of UX testing we do, some of the most difficult departments to test for are service departments, because they often have a crazy number of labels, services, and pieces of information to organize, and they're using a lot of terminology that's really internal, or terminology that's kind of technical. So it's a more difficult type of audience testing to really get right, and we usually do extensive testing. This is one of those cases. We were trying to create a service category label structure for a service catalog for IT services. There were 150 services that needed to be sorted and categorized; I've never done a card sort this big. So we did it as a closed card sort, in contrast to the open card sort. In a closed card sort you actually provide the categories; you don't ask people to make up the categories, and as you can understand, it would have been really overwhelming if we'd asked them to do that. But we did give them the opportunity to create other labels if they felt that was necessary. Here is what that looked like. I was a little concerned when we started the exercise that they would find it overwhelming, but they really dug in and made it happen, and it was really great. We actually did it in two stages. In the first series of workshops, we focused on choosing the category labels. We had done a bunch of research and come up with a set of candidate labels we could test for every category, and we got participants to sort those labels based on what worked best. Once the categories were chosen, we then, with a separate group of people, showed them the categories, gave them the 150 services, and asked them to sort those services under the categories chosen in the previous exercise. We got a lot of great data out of this.
We were really lucky with this project because we had a long timeline, so there was lots of opportunity to experiment with different prototypes that we could test and to do extensive workshops like this, and we got some good information out of it. But we had some difficulties. Again, we were involving our team members in this exercise, and in this one in particular we had difficulty: when I went back to some of the notes that were taken, we had support staff members participating, and some of the note-takers had focused on collecting information not about the navigation structure but about the IT services themselves. So I had pages of notes about what people thought about their email going down previously, or about how long it took for their tickets to get answered online, rather than information that was specifically related to what we were trying to test. That told us we needed to provide better structure in terms of guidance, and sometimes training, on what we're looking for from note-takers and facilitators in these exercises and the actual information they need to gather. We didn't want to stop people from collecting related information, because it's useful; it's useful for departments to know these things. But we wanted them to keep those notes separate from the information they're gathering for the UX exercise. This is another exercise that we did. This is a user journey map; I talked about this earlier. This was part of the homepage redesign. We did testing at both of our campuses, and we talked to a lot of audience members across a number of different departments. One of the problems with this project was that, because we were doing so many workshops in so many different departments, it wasn't always possible to get students, our target audience, involved in these exercises.
The people that did participate, staff members and key stakeholders, were able to provide useful information, but again, sometimes it's not current or relevant, or it's tainted by past experience. And there are also the optics. One piece of feedback we often get from students is the sense that these websites aren't working for them because nobody asks them what they need to see. So it's really important for us to try to get students to participate in these exercises, not just because they provide valuable feedback, but also because, when these kinds of messages come up, someone is able to say: well, yes, I am a student, and I was asked, and I did get to participate in these exercises. As you can imagine, we had difficulty recruiting students during exam periods and around the holidays, exactly as you would expect. What we did in those cases is we worked from scripts that were transcripts of interviews with students. We were able to interview students at times when it was easy for them to provide information to us, often in the evening or on the weekend, and then we took those transcripts back to our exercises and had people participate, but drawing their input from those transcripts, so that they were really providing input relevant to the current student experience. So this is the last one I'm going to get into. This is Optimal Workshop: an example of a tree test and a tree test report. I love this tool, and I'm clearly not alone: last year I went to a bunch of conferences, and at every conference there were three or four presentations that had Optimal Workshop Treejack reports in the slides. This is the output of that big exercise where we were trying to get people to sort all of those services.
So after we did those workshops, we created two prototypes, and we wanted to see which navigation prototype was going to work best, so we tree tested them both. If you're not familiar with Optimal Workshop, I'll show you how it works from the top. It's a usability test to start, so this is the question we asked: where would you click to find information on how to set up your McGill email on your computer? This graphic is a visual display of the navigation paths that people took. Everybody started here on the service catalog homepage. The lines going out from that first pie chart show the different directions people went in. Green means they went in a good direction. Red means they couldn't find what they were looking for; it was a bad decision. Blue means they had to go back: they got lost and backtracked in the navigation structure. We coupled this reporting with the notes we took in Reframer, which is another tool available in Optimal Workshop. We've been experimenting with ways to tag the feedback we're getting during our tree testing and usability testing so we can easily go back to these red circles and find out exactly what was happening at that spot: why people were confused, or didn't know where to go, or whatever caused the breakdown in the navigation structure. That's how we were able to identify what the actual issues were and better refine the navigation. Again, here there was a bit of a difficulty. When we initially tried to do the tree testing (and this is something that is sometimes suggested), we thought, let's run our usability test internally to see what the internal feedback is going to be. So we brought the test over to the other side of our IT department floor and tested it with a couple of network people, and they had difficulty actually navigating the structure. It actually worried us.
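The green/red/blue outcome coding described above can be sketched as a simple path classifier. This is a hypothetical illustration of the idea, not Optimal Workshop's actual logic, and the node names are made up: a path that ends at the right node with no backtracking is a direct success (green), one that ends there after revisiting an earlier node is an indirect success (blue), and anything else is a failure (red).

```python
def classify_path(path, correct_node):
    """Classify one participant's tree-test click path.

    path: ordered list of visited nodes, starting at the tree root.
    correct_node: the node that answers the task.
    """
    # A backtrack is any revisit of a node seen earlier in the path.
    backtracked = any(node in path[:i] for i, node in enumerate(path))
    if path and path[-1] == correct_node:
        return "indirect success" if backtracked else "direct success"
    return "failure"
```

Aggregating these labels per task is what produces the proportions shown in each pie chart of the report.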
It became very obvious very quickly that the problem they were having was that they were looking at it from the perspective of someone internal to our department. We use very different language internally in the way we label our services, and we have an internal structure to the way our teams are organized that they were expecting to see mirrored in this navigation structure, and it wasn't there. We realized from that that it's not always possible to incorporate your own people as test participants in these exercises. That isn't always the case; sometimes it is possible, it just wasn't here. So I guess the best-practice takeaway is to be aware that you have to continually refine and tailor your practices based on the different restrictions of the project. Now I'm going to quickly show you a few of the solutions I mentioned as I went through the challenges we've experienced. One of them (we spoke about this a little at the higher ed summit yesterday, if anybody went) is the idea of having a clearly defined plan. We had assumed that having the actual project plan, a plan about how the web project was going to go, would be enough. We realized we also had to have a plan in place for our participants, our facilitators, and our note-takers, to let them know how the project was going to go from their perspective: to clearly define what the goals and objectives were, the exercises we were doing, and the timeline and responsibilities, so they knew when they would be able to participate, provide feedback, and take part in our analysis, and to provide structure for that team analysis. That way they're not providing feedback as the workshop is going on; they're giving their analysis in meetings afterwards, when we're actually focusing on the analysis. Another set of things we developed for our participants: we give them how-to documentation.
We provide information in training sessions if needed. We have more advanced templates and guidelines that we make available, and there's the idea of transcripts from interviews with audience members. Basically, when we started to incorporate other team members in these exercises, we just assumed they would know what the goals were, what information we needed to collect, and how to collect it, just from a basic overview of what was happening. We realized it actually sometimes needs a very detailed explanation. Here's a slide from one of the exercises we did where we were involving people that don't normally do UX testing. We did a presentation specifically about the exercise: we explained the mechanics of what it was we were trying to construct, we talked about how we would construct it and why we followed that process, and then we talked about how we were going to use that tool afterwards to create a design for our website. This wasn't necessary for every project. This was related to the redesign of the homepage, and as you can imagine, with all those workshops we did, there were some very high-level strategic people involved, so this type of detail was necessary. It's not always necessary, but sometimes it is. Here's another example. This is a set of guidelines we presented to note-takers. It's related to that piece I mentioned earlier about the difficulty we had with staff members recording tangential information. We realized it was important to actually outline exactly what it was we were trying to get feedback on. We couldn't just assume that people would understand that we were there to test the navigation and that we were looking for navigation feedback. So, in summary, I just wanted to say again that there are a lot of benefits.
There are the regular benefits that you always get out of following UX design practices and doing UX research: you get a better understanding of user needs, you get an improved interface, and you have a better understanding of user preferences for look and feel. But on top of that, if you're willing to incorporate other people into these exercises (developers, trainers, key stakeholders), you will get an additional level of benefit out of the experience. We're very lucky at McGill because, at other universities I've worked at, the teams that provide development, training, and documentation weren't always present on campus; at McGill they are, so it's easier for us to incorporate them. But I'd love to hear in the future about situations where these types of exercises were made possible even where those teams aren't located on campus. Just to wrap up, I wanted to mention one last point: I think another side benefit of this is a better relationship between our developers and training staff and our audience members, both because they see each other at the exercises and because they gain a much more in-depth understanding of what our user needs are. So I'm going to show these pictures again as I wrap up, and point out the people in them who are not audience members. This person here in the blue shirt is one of our support team members. In that picture there, in the top right of the photo, you'll see some of our development team; there are developers there. Here on the far right, again, you see one of our project team members. And in this picture here, the woman seated is one of our trainers, taking notes.
So here are the additional resources I said I would share. Aiden also had some really good resources in his presentation.

[Audience member:] You mentioned Rocket Surgery Made Easy. Steve Krug's other book, Don't Make Me Think, is an earlier UX, or I guess usability testing, tome. It was where I first started, and it's the thing that really inspired me. Rocket Surgery is basically about the act of how to do user testing, how not to bias yourself, and so on, and Don't Make Me Think is sort of the primer for that. Both are awesome.

Yes, I agree. So that's that. One other thing to wrap up: we are hiring right now. We're looking for a developer for our team. If you are interested in applying, or if you know someone who might be interested in doing development in a university environment, we would love it if you would pass on this information. The job is posted on our HR website, the McGill HR website; you can take a look. And that is it. So I have a little bit of time for questions, if anyone has any. Yes?

[Question:] You mentioned you have a UX how-to toolkit that you distribute internally for workshopping, I guess to spread the knowledge wider. Is that available publicly, or is that a private thing?

It's not available publicly, and that's because we tailor it to the department we're working with; it's not the same process every time. We didn't want to make it available online because we don't want people to refer to something that's specific to one project and might not be relevant to their project.

[Question:] But it would be similar to the example you showed before, with "these are the exact questions you should be trying to get the answers to" in those circumstances?
Exactly, yeah, and it incorporates some of the other material: the slide with the questions is another thing we might include in there. Basically it's an overview of the mechanics of how the exercise is going to happen, the whys, and what's going to happen next, along with some background information for you as a participant: where you're going to participate and what input you're going to give. So it's focused on the people who are going to take part. Anybody else? Yes?

So how is your team structured? Is there a full-time UX specialist, or is it more like a web developer or content specialist with UX as part of their job? How is it structured at your university?

There is no dedicated UX person at the university right now. We have this web evolution project in place, but, actually, that's not accurate: there's no UX person in a centrally supported department, but there is a UX position in the library. I think the library is one of those departments that saw the need for an ongoing person to continually do UX. We have people on our team with expertise and background in doing UX, but the way our team is structured, we share our work. It's a development team, so we all do UX exercises, just as we all do usability testing. As part of that web evolution project that's happening, though, we're going to be doing a better job, I think, of defining the difference between the people who create and manage content and provide content resources at the university, who are in communication services, versus the development team, which is the team that works more on creating the tools we work with. There should be a more defined sense of who is going to be around providing that resource. I'm actually about to move into that department; I'll start a role there at the beginning of July, so I will see that come together. Anybody else?
Can I ask a little bit about how the UX tasks are structured? If a faculty comes to you and says they need a website, do you provide the whole website, or do you do the UX testing and then hand it off to an external vendor?

So we work as closely with the department as possible to make it possible for them to do these things themselves. In a lot of cases, people in the department don't have the time, and they don't have the expertise, to take on a redesign project. In those cases, if it's a very strategic project, we might come in and provide support, but we try to make it possible for site managers to at least participate in, if not facilitate, these exercises themselves. We realize that if we do that, they will be able to continue doing UX testing on an ongoing basis as needed, and continue to evaluate the site, which is necessary; we can't continue to provide in-depth support to them indefinitely. So it's great for us to be able to empower them to do it themselves. But it's not always possible, and at that point it's sometimes necessary to turn the project over to an external vendor, or to help the department hire new staff, maybe temporarily, to see them through it. So there's a bunch of different solutions you can look at. So, yes?

I have a question about your user research, so not so much the usability testing, but the research at the beginning. How did you go about recruiting people from your audience for that?

Yeah, there are some great tips I've come across; one of them was shared at the summit yesterday. As I mentioned, somebody asked how we recruit students, and that is something that's always difficult. We tried a bunch of different incentives to get students to participate: gift certificates, gifts, things like that. The thing that inevitably works the best is chocolate bars at the library.
They have to be full-sized chocolate bars. If you sit in the library with little Halloween chocolate bars, that doesn't work; they have to be the full bar, like an Oh Henry! or a Kit Kat. Then students approach the desk, and I've had experiences where you can't have just one person doing the testing. You have to have multiple people, because at times I've been inundated by students coming up and asking how they can get a chocolate bar, and if you say they just have to sit down and do this test, then it's really easy to recruit students.

It's a little bit more difficult for the workshops, because you're actually trying to get students into a space. Food is awesome, though; pizza sometimes works. It sounds funny, but it totally makes sense, especially in the library: they're hungry, they're studying. If you do it during exam period, it's great, because there are students looking for a little bit of a break, and if they don't participate right away, they'll come back to you when they're ready to take one. For the workshops, though, one of the key things is just not to schedule during exam period or close to the holidays, and to do it at a time that really works well. The time of year we find works best is the very start of the semester, and I don't know why, but we get a really good turnout. You'd think they would be really busy with a whole bunch of things, but I think at that time they're still really fresh. They have time in their calendars because their courses haven't really ramped up yet, and they can devote time to these types of exercises. So, something to consider.

How do you recruit prospective students, or how do you do research with them? You mentioned the redesign of the McGill homepage focuses really on prospective students. How do you learn about them? You won't find them in the library.

Absolutely.
So there are two ways. One, we look at new students at the beginning of the school year, because they have recently been through the application and acceptance process. The other way is we collaborate with guidance counselors at CEGEPs and high schools to set up testing sessions there. I've had a bunch of different experiences with that, but usually the people we connect with are really good at identifying students who are going to be keen and really good participants, and we've gotten really good input out of it. That's what we did for the homepage: we went into Dawson, I think, and a couple of other CEGEPs and high schools that feed into McGill specifically, and did testing with them.

Just on your card sorting exercises: you mentioned open and closed, that is, with predefined categories or not. That in-person format means you have to get everyone in the room, and that's good energy as well, but I know there are online card sorting tools. Have you experimented with any of those, with positive or negative results?

We did. One of the things that kept us from doing that more often is that when we want to run those types of exercises, we usually look at doing them in our computer labs, so it's still an in-person experience even though the tool is online. But we can't bring food into our computer labs, so then we can't entice people to participate. And when we send out the card sorting exercise as something people can do remotely, we don't get the input, and especially the qualitative data, that we can capture if they're doing it in person. So that's one of the key reasons why we haven't used them more.
I can't remember the name. Is it OptimalSort? I think it's from Optimal Workshop. We've tried it a couple of times, but we've found that the depth of the data we've gotten from those experiences hasn't been as rich as in person.

Just because of the conversation that happens while people are sticking things to a whiteboard, that kind of thing?

Absolutely, yeah. The really good feedback we get comes from listening to people, especially when they have conversations where there's conflict: one person says it should be this, the other person says it should be that, and then they provide a lot of detail about why. That's something we don't get online; they're just not going to provide that feedback if they're doing the exercise on their own.

Really good questions. I think that's it, we're out of time, so I think it's time for lunch. Thanks, everyone, for coming.