Hey MozCon, how we doing, day three? Yeah, we're out here, we're learning. And now, guess what, you're gonna hear from a CRO at an SEO conference. Wild, right? But it turns out that if you send all of that traffic to your websites and the websites aren't doing the work for you, what's the point? Right? It's all about that ROI.

And in my years as a CRO, I've seen some shit. I've seen some terrible tests. Because what so often happens is that we get in a room with our friends, with our colleagues, and we start brainstorming ways that we can change our website. Inevitably, the highest-paid opinion in the room says, "Oh my god, that's too risky," and lo and behold, the button test is born. Let's just test the color of the button. It's safe, it's easy, we can implement it with just a little CSS. It's so great. Guess what? Nobody cares what color your buttons are. Nobody cares.

Okay, but actually, an accessibility interlude: it is very important to have a button that can be perceived by people with color blindness, along with all those other wonderful tips we heard yesterday in the accessibility talk. With those caveats, really, nobody cares what color your button is, because by the time they're on your website, ready to click that red or green or purple or blue button, they've already interacted with dozens, if not hundreds, of your marketing touchpoints. Which means that all of these touchpoints are actually our opportunity to influence how they perceive our products, how they perceive our brand, what benefits they might receive as a customer. These are opportunities.

And so that takes us away from this science, this inherently risky thing (of course, testing is always going to be risky) and brings us more into the art: what could we be doing? What is out there on the internet? And yeah, we're gonna break a few eggs while we do it, but that's the fun, right? You can't have a really great omelet without breaking a few eggs. Obviously, I'm the first person to ever say that, so you heard it here first.

What I'm talking about is going out on the internet and allowing yourself to be inspired. Some of the best tests I've run on a website at a conversion point have been inspired by other people's marketing. Yes, what are the Amazons doing? What are the Etsys doing? Because our audience doesn't live in a bubble, so why should our test ideas? Why should we limit ourselves to what is already on our website and already part of our marketing arsenal? Have the confidence to go out and say, "I fucking love this thing, I'm gonna take it, even though it's not related in any way, shape, or form to the industry vertical I'm in, and I'm gonna make it work."

Seriously, I had a 100% lift. Yes, a 100% lift, because I stole something from an e-commerce website and put it on a nonprofit website. Shouldn't happen like that, but guess what: people shopping on Amazon are also gonna go and donate to my nonprofit. They're not just viewing one website.

And now the fun part: we get to play with fire, because that's the most fun part of science, right? Or is that just me? Okay, I'm gonna have to unpack that with my therapist next week. That's okay, a little bit of pyro. But yes, now we have wonderful ideas. We've been inspired. We're out there on the internet.
We're looking. Now we have to take a step back and say, okay, how do we actually execute this in a way that we can prove to our bosses, prove to ourselves, and prove to our literal bottom line that we are making an impact?

Obviously, the first thing to do is start with your data and your own observations. Yes, I am talking about the Google Analytics of the world, but also, if you're in an ad platform or in Search Console: what is working for your audience? Are there groups of people that are interacting with, or performing on, your website differently than others? These are the biggest insights we have. You can see a little device report here, device category, one of my faves. You can just take a quick look at how our audience is performing. Are we getting more traffic on one device than others? Should we be paying more attention to our mobile interface because we're getting more and more mobile traffic?

But I'm also talking about the qualitative: what are people interacting with, and what are they completely ignoring? These are fascinating because you can actually see where people are clicking. And if there's a bunch of text, like you can see lit up here on the headline, people often highlight for reading comprehension. Those are the points where you can say, oh, people really care about what we're saying in that paragraph. Should we build an additional module? Should we pull that up to the top of the page? Should we make our headline more about that factoid? Because chances are, if they're reading it that intensely, they care about it.

I also encourage you to go back and tag all of your buttons, like Dana told us yesterday, so you can have some quantitative data to back up that qualitative data. Because, data nerd here, you've got to get all that data.

So here's our example: our mobile users have smaller average purchase amounts and a lower conversion rate. Simple, right? We're just at the observation stage; we're not trying to unpack why. Well, not yet. Now we are. We've got our observations, we've looked at all of our data, and now we can start brainstorming why these differences exist. Are our mobile users just in a different part of their consideration journey? Maybe. But as Pete just talked about, there are several points on that consideration journey where we can have an impact with our mobile experience.

And when I'm talking about what's in your power to change, I mean literally everything on your page, not just the button. It's everything. This is a Chime example; I'm working in financial services right now, which is, again, a hard left turn from my nonprofit life. But I love watching this page because they are constantly testing on it. I take a screenshot of this page once a week and go back and say, okay, what did they change? What did they keep? Because it's fascinating to watch.

So here we go, here's an example. We've got our original observation and the things that are within our power to change. Could we nudge them to bigger purchases by suggesting different products? Reframing our core benefits? Maybe using a little bit of that social pressure? Maybe featuring our app, if you have an app?

So we've got all of our ideas. We've pulled stuff from the internet. You've taken screenshots of things that you like and hopefully have organized them in a way that's more than just keeping them on your desktop for a year. We don't want to talk about it.
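To make that mobile observation concrete: here's a minimal sketch of that kind of device-category breakdown, assuming a hypothetical analytics export with one row per session. The file name and column names are made up for illustration.

```python
# Minimal sketch: segmenting conversion rate and average purchase by device.
# Assumes a hypothetical export (sessions.csv) with one row per session and
# columns: device_category, converted (0/1), purchase_amount.
import pandas as pd

sessions = pd.read_csv("sessions.csv")

by_device = sessions.groupby("device_category").agg(
    sessions=("converted", "size"),
    conversion_rate=("converted", "mean"),
    avg_purchase=("purchase_amount", "mean"),
)

# Flag segments converting below the site-wide rate: the kind of observation
# ("mobile converts lower, with smaller purchases") that seeds a hypothesis.
overall_rate = sessions["converted"].mean()
by_device["underperforming"] = by_device["conversion_rate"] < overall_rate
print(by_device.sort_values("conversion_rate"))
```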
It's a little messy, but now we're gonna go through... hello?... now we're gonna go through and write a hypothesis, and this is where the science really starts. I'm sure you've seen hypotheses before in your lives, probably in physical science class, where we did actually get to burn some stuff, which was, I know, again, fun. So we're articulating what we're going to be changing, and why. Oh, they took away my Britney mic. Okay? That's fine. Everything is fine. Double mic, double the fun. Here we go.

So when we're articulating a hypothesis, we are talking about what we're changing: what we want to change on this page, usually a single thing, because again, if we change a bunch of stuff and then nothing happens, we actually don't know why nothing happened. So we're gonna change something to something else, and it's gonna impact a main KPI. Again, what are we trying to drive? Usually it's some sort of financial conversion or lead generation. And then the reason why you think it's going to help, because if you don't articulate why it's gonna help, why are you testing it? Why do you think it's gonna make a difference? You're gonna say, "Oh, this thing will do better." Yeah, but why? And it's important to do this because you're gonna learn something no matter what, even if the test fails. And when I say "fails," I mean your test version didn't win, because it's not truly a failure if you're learning and you're staying focused.

About that whole staying-focused thing (this is getting really fun; it's an experience for all of us now): what we're going to do is prioritize our test experiences based on how much we think they're gonna help. We've seen so many different prioritization frameworks this week, and I freaking love all of them, and of course I have one in a spreadsheet, because data girl over here. Everything I own is in a spreadsheet, everything. It keeps me organized, except for my desktop, which is where all my screenshots live.

So what I've done here is I've articulated all my tests. I've given them a name, I've given them my hypothesis, I've articulated where they're gonna run, all of these really juicy details about the nuts and bolts of my test. But then I've also calculated an effort-benefit score, benefit obviously being my expected impact, plus my learning priority. Learning priority is something I added as a consultant, because sometimes my clients were like, "I want to do that thing, make sure you capture that, I want to do it." Okay, I'll give you a number, it'll be in the spreadsheet. That was five years ago. It's still in here.
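If a spreadsheet isn't your thing, the same backlog works as a structured record. Here's a minimal sketch; the field names and the one-to-three scale are my illustration of the scoring described here: benefit (expected impact plus learning priority) minus the technical effort that gets subtracted next.

```python
# Minimal sketch: a test-backlog row as a structured record. Field names and
# the 1-3 scale (low/medium/high) are illustrative, mirroring the
# effort-benefit scoring described in the talk.
from dataclasses import dataclass

@dataclass
class TestIdea:
    name: str
    hypothesis: str          # "Changing X to Y will lift KPI Z because W"
    kpi: str                 # the one metric this test is trying to move
    expected_impact: int     # 1 = low, 2 = medium, 3 = high
    learning_priority: int   # how badly stakeholders want this answered
    technical_effort: int    # dev lift: 1 = low, 2 = medium, 3 = high

    @property
    def priority_score(self) -> int:
        # Benefit (impact + learning priority) minus effort, so a
        # high-impact (3), medium-effort (2) test nets out ahead.
        return self.expected_impact + self.learning_priority - self.technical_effort

backlog = [
    TestIdea(
        name="Mobile product nudge",
        hypothesis="Suggesting bundles on mobile will lift average order "
                   "value because buyers see relevant add-ons at checkout",
        kpi="avg_order_value",
        expected_impact=3, learning_priority=2, technical_effort=2,
    ),
]
backlog.sort(key=lambda t: t.priority_score, reverse=True)  # "sort the column"
```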
And then I subtract the technical effort to balance that out, because we don't want to do something that's going to be a huge lift on our dev team, something that's gonna take three weeks to build, when really we're looking for something more in the middle. So I've added some numbers here: high impact is a three, medium impact is a two. Subtract the medium effort and we have a three, and I can sort the column. Because math.

So there we have it. Now we're ready to run some tests and evaluate all of our test results. I want you to plot them out, give them a timeline, and make sure that everyone is on board with the order of tests you're going to run. This is where that effort-benefit score really comes in handy, because you can say, oh, this is going to be a huge technical lift for our dev team, so we're gonna schedule that thing for later in the year. In the meantime, we're still going to get some learnings, because we have a couple of lower-hanging-fruit test ideas that won't take as long to execute. Maybe they won't be as big of an impact, but then you're constantly learning.

And yes, we do need to run our tests on a certain audience size. I'm not going to go into this; I know math is hard for a lot of you, as we've discovered this week. But there are calculators for this. You can plug and play with your numbers to see how many people you need to run your test on before you start running that test. And then, when your boss comes to you and says, "Hey, I know that test has been running for four days, where are my results?" you can say, "Oh no, wait, we still need 50,000 more visitors to this test before we're gonna be able to tell you if there was a pattern of change."

I usually try to let my tests run for at least two weeks, to normalize a bit of that weirdness in the data. So if you're running an email campaign, or there's a promo that goes up, or you've done really well with digital PR and you've gotten a bunch of traffic to your site over the course of two or three days, you let that traffic normalize so that you understand what's actually working for the full audience, not just the weirdos who clicked on a link somewhere. I mean, we like those people. That's the point, right? You're doing great with your digital PR.

And then we're reaching for something called confidence: statistical significance. What does this actually mean if you're not a scientist? It's kind of a tricky concept. We're evaluating a pattern of change so that we're confident that if we ran the exact same test 20 times, 19 times out of 20 we would get the same result. That's your 95% confidence, and that's really all we're talking about with statistical significance. It's not that complicated. If you want to get into the complicated weeds, find me during the break, because I will nerd out about that all day long.

But if you check those results too often, you're doing what's called repeated significance testing, and you're introducing random chance: the possibility that at the moment you check the test results, it just so happens that you have a winner. So these are results from the exact same test, taken two weeks apart. On the left you can see my control is the winner, with a 96% chance to win. We should have pulled the plug right away, right? But always let your data normalize, because two weeks later, lo and behold, our test version was the winner. If we had pulled the plug at the beginning of that test, all we would have measured was our email traffic.
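About those calculators: here's a minimal sketch of the standard two-proportion sample-size formula most of them implement, using only Python's standard library. The 3% baseline and 3.6% target rates are made-up numbers for illustration.

```python
# Minimal sketch of the math behind A/B sample-size calculators:
# the standard two-proportion formula, standard library only.
from math import ceil, sqrt
from statistics import NormalDist

def visitors_per_variant(baseline: float, target: float,
                         alpha: float = 0.05, power: float = 0.80) -> int:
    """Visitors needed in EACH variant to detect a lift from
    `baseline` to `target` conversion rate."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # two-sided significance
    z_power = z.inv_cdf(power)
    p_bar = (baseline + target) / 2
    n = ((z_alpha * sqrt(2 * p_bar * (1 - p_bar))
          + z_power * sqrt(baseline * (1 - baseline) + target * (1 - target)))
         / (target - baseline)) ** 2
    return ceil(n)

# Detecting a lift from a 3% to a 3.6% conversion rate takes roughly
# 14,000 visitors per variant, which is why four days of traffic
# usually isn't an answer.
print(visitors_per_variant(0.03, 0.036))
```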
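And for the 19-out-of-20 idea and the peeking problem together: a minimal sketch of a two-proportion z-test, roughly the kind of check behind a "chance to win" readout (real tools vary; many use Bayesian or sequential methods instead). The sample-size guard is the anti-peeking rule.

```python
# Minimal sketch of a two-proportion z-test, plus the guard that keeps you
# from calling a winner before the planned sample size is reached.
from math import sqrt
from statistics import NormalDist

def z_test_p_value(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for 'A and B convert at the same rate'."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

REQUIRED_N = 13914  # decided up front from the sample-size calculation

def verdict(conv_a: int, n_a: int, conv_b: int, n_b: int,
            alpha: float = 0.05) -> str:
    # The anti-peeking rule: no verdict before the precomputed sample
    # size is reached, however tempting the day-four numbers look.
    if min(n_a, n_b) < REQUIRED_N:
        return "keep running: sample size not reached"
    p = z_test_p_value(conv_a, n_a, conv_b, n_b)
    return "significant at 95%" if p < alpha else "no detectable difference"
```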
We would have ended up losing a bunch of money, and losing a bunch of opportunity from the rest of our audience.

So, that was a lot, but the steps are actually pretty easy. First thing to do: be inspired. Don't keep yourself in your bubble, don't keep yourself within your industry. Go out there, and if you see something that you love, take a screenshot, take it to your boss, and say, "I think we can do XYZ with this. I think it would be really cool. If this other person is doing it, it might work for us, too."

Next thing: build your hypotheses. Never, ever, ever, ever, ever, ever, ever run a test without a hypothesis.

Finally, prioritize. Ruthlessly prioritize. You can't run every test. Not a single person in here, no matter how big their website or how enterprise their scale, can run every single test they want to run. It's just not physically possible.

And then go back to the beginning. Take what you've learned, build on your plan, and add those things to the roadmap. Even if your boss is like, "Oh, I have this really great idea, we should absolutely do this thing next," you can say, "We'll add it to the roadmap and evaluate its effort and benefit score."

And of course, don't forget to share with your colleagues. What we found is that only 14% of businesses say they're sharing their experimentation results across departments. That's not so great, because what if another department learned something that you could use to improve your bottom line? We all win. We're all reaching for the same KPIs, we're all trying to build the same customer base. Why not share? Why not even post about it on LinkedIn or Twitter? I love testing Twitter; I get so many cool ideas there.

The limit truly does not exist: if we are constantly testing, we will constantly be improving. I've linked all of those resources in this presentation, here on my website, and I hope to see you all on Twitter.