Hi everyone, I'm Mary Kachorek from Legal Services State Support in Minnesota, which Jane just mentioned. We spend all this time and money creating these great new resources and tools to better serve our clients. But are we doing enough to make sure we're actually getting those tools to our client population? And even if we are, could we be doing even more?

Up north in Minnesota, the frigid north of Minnesota, we've spent some time thinking about how we feature projects and resources on our statewide public information site, LawHelpMN. Say you just launched a new project and it's getting pretty low traffic. What if you changed how you showcase it on your website? That sounds like a good idea, but there are still a few things to consider. First, what changes are you going to try? Will doing this negatively impact traffic to other parts of your site? How are you going to measure the impact if you do try something? And how much time and staff resources will this take?

This is where an A/B testing tool can be really useful. Who here has heard of A/B testing? Hands in the air, yeah? Who's actually done some A/B testing? Fewer hands in the air? A couple? Okay, awesome. Brian in the corner, great.

A/B testing is a way of conducting controlled, randomized experiments with the goal of improving a website metric. We've been using A/B testing in our TIG project to find the best way to post our Document Assembly Library on LawHelpMN. Everybody seems to do this a little differently, so we decided to run some tests to see what setup drives the most clicks to our A2J interviews on LawHelp Interactive.

One thing we looked at was the button we use on our homepage. We've used button A, the little green one on the left, for a while, and it's been doing a fine job. But what if we changed things up? What if we made it blue? Added a new icon? Took out our weird made-up name, "Form Helper"? You see where this is going? Yeah, maybe? Okay.
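The "controlled, randomized experiment" described here boils down to splitting visitors into buckets at random, but consistently, so a returning visitor always sees the same version. A minimal sketch of that assignment step (the function names and hashing scheme are illustrative assumptions, not how any particular testing tool actually buckets visitors):

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically bucket a visitor into a variant.

    Hashing the visitor ID together with the experiment name means each
    visitor always sees the same variant on repeat visits, while the split
    across all visitors stays roughly even.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Example: split a stream of visitors between the green (A) and blue (B) buttons.
counts = {"A": 0, "B": 0}
for i in range(10_000):
    counts[assign_variant(f"visitor-{i}", "homepage-button")] += 1
print(counts)  # roughly a 50/50 split
```

The deterministic hash is the important design choice: it gives a random-looking split without having to store which bucket each visitor landed in.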
So here is our experiment. For a month, half the people coming to LawHelpMN saw our homepage with our test button in blue, and the other half saw the same page with the test button in green. Any guesses as to how it came out? Who thinks blue is going to win? Oh, okay. Yeah, some people? Well, blue won by a lot. After a month, we saw a 747% increase in conversions to our Document Assembly page. Thank you. It's all thanks to the little blue button.

The takeaway from this is pretty obvious, right? Use the blue button. But of course, we did not stop there. We started looking at all of these links on our right-hand navigation bar. What if we change the order? We guessed moving things lower on the page would decrease conversions, but by how much? Maybe it's worth it if we get more hits to the other resources we're also looking to push.

In addition to all this crazy fun we're having on our homepage, we've been testing out some other ideas for posting Document Assembly interviews on LawHelpMN. How much content on a staging page is too much content and causes users to drop out? Do you really need a whole page devoted to the technical requirements to run A2J? There are some great resources out there on LSNTAP and elsewhere about web accessibility for low-literacy users. We're using some of those resources already to help guide this project, but there are also all of you brilliant people out there. This project is still in progress, so if you have an idea, please let me know and we can run it in the next round of testing.

All right, so how did we do all of this? The wise people at LSC recommended that we use A/B testing software, and we decided to go with Optimizely. Did any of the three A/B testers here use Optimizely? Yeah, great. We'll talk. We'll talk Optimizely. There are a lot of other A/B testing tools out there; Optimizely just works great for us and this project. Once you learn the basics, Optimizely is really easy.
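For context on a figure like that 747%, lift is just the relative change in conversion rate between the two arms of the experiment. A quick sketch of the arithmetic (the visitor and conversion counts below are made up for illustration; they are not the actual numbers from the talk):

```python
def conversion_lift(control_conversions, control_visitors,
                    variant_conversions, variant_visitors):
    """Relative increase in the variant's conversion rate over the control's."""
    control_rate = control_conversions / control_visitors
    variant_rate = variant_conversions / variant_visitors
    return (variant_rate - control_rate) / control_rate

# Hypothetical counts: a jump from 0.5% to about 4.2% conversion
# works out to roughly a 747% lift.
lift = conversion_lift(5, 1000, 42, 992)
print(f"{lift:.0%}")  # → 747%
```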
They have an editor tool that guides you to select the container you want to modify and then gives you a few options for changing things up. If you know HTML, great, you can just edit away. For the rest of us, they have tools that guide you through making link, text, color, and all kinds of other style changes. You can edit existing content, like this login button here, or you can add new content just for your experiment page. You can play around with where things show up on the page: you select your container, drag it around, and drop it where you want it to go. The tool also lets you scale containers to make things show up bigger or smaller on the page.

Tracking your goals is pretty easy. You just select the parts of the page that you want to track. For our homepage example, we made a lot of click goals because we wanted to see how changing up just a few things affected the rest of the page. If you want, you can track page views instead of clicks. Optimizely also automatically tracks engagement; that's someone clicking on any part of your experiment page.

Optimizely gives you a lot of flexibility about where your experiment will run. You can set up a simple match if you only want your experiment to show up on one or a few URLs. If you want your experiment to show up across your entire site, you can do that too. That's what we did with that first example, because the right navigation carries through to the whole site.

You can also target your audience, meaning who gets to participate in the experiment. We're still playing around with this, but I can see it being really useful if you're trying to optimize part of your website for a certain community of users. You can target by language, device, location; there are a lot of different variables you can choose from depending on the project and who you're serving.

Tracking your results is super easy. You can start or pause your experiment whenever you want, and you get a nice summary of the unique visitors and the days running.
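The URL targeting described here is conceptually simple: on each page load, decide whether the current URL qualifies for the experiment. A rough sketch of the two modes mentioned, a "simple match" on specific URLs versus a site-wide match (the matching rules and example URLs here are my own simplification, not Optimizely's exact semantics):

```python
from urllib.parse import urlparse

def normalize(url: str) -> tuple[str, str]:
    """Return (host, path) with trailing slashes and query strings dropped."""
    p = urlparse(url)
    return p.netloc, p.path.rstrip("/")

def url_qualifies(url: str, mode: str, targets: list[str]) -> bool:
    """Decide whether an experiment should run on this page.

    mode="simple"   -> run only on the listed URLs
    mode="sitewide" -> run on every page of the listed hosts
                       (like the right-nav test that carried through the site)
    """
    host, path = normalize(url)
    if mode == "simple":
        return (host, path) in [normalize(t) for t in targets]
    if mode == "sitewide":
        return host in [normalize(t)[0] for t in targets]
    raise ValueError(f"unknown mode: {mode}")

# A simple match ignores query strings but not the path:
print(url_qualifies("https://example.org/forms?ref=x", "simple",
                    ["https://example.org/forms"]))        # True
print(url_qualifies("https://example.org/other", "simple",
                    ["https://example.org/forms"]))        # False
# A site-wide match runs on any page of the host:
print(url_qualifies("https://example.org/any/page", "sitewide",
                    ["https://example.org"]))              # True
```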
You can see how your variations stacked up against the original, or baseline, version of the page. Here you can see variations one and two beat variation three on the one goal we had set up.

What's the cost? Everything we've done so far has been on a free starter account; you can pay more to get more functionality. I'm like a solid B web administrator, I'm not a coder at all, and I was able to learn it all pretty quickly. Now if we have an idea to try something, I can set it up in about half an hour, which is pretty cool.

To sum it all up, this has been a really good use of staff time. It's been, unexpectedly, a lot of fun. And I think there are a lot of possibilities for how we can use A/B testing to better serve our clients. Thank you.

Thank you, Mary.
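Before acting on "variation one beat the baseline," it is worth checking whether the gap could just be chance. One standard way to do that, independent of whatever significance calculation a given testing tool reports, is a two-proportion z-test; the counts below are hypothetical:

```python
from math import erf, sqrt

def two_proportion_p_value(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for 'these two conversion rates differ'."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical: 5/1000 conversions on the baseline vs 42/992 on the variant.
p = two_proportion_p_value(5, 1000, 42, 992)
print(p < 0.05)  # True: a gap this large is very unlikely to be noise
```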