So, jumping right into it, this is Conversion Rate Optimization 101, and we're going to talk about how to raise the conversion rate on your website. First, to introduce myself: I'm Chris Edwards. I run a data-driven marketing agency called Data Driven Labs. I've been doing website development for about 18 years, I've been a WordPress developer for about six of those years, and I've been doing Google Analytics for 13 years. I used it back when it was still Urchin, if anybody remembers that. I've been a digital marketer doing SEO for about 11 years, and I've just been a data nerd my entire life. I've always loved numbers, always loved trying to figure out the meaning behind things, looking at statistics and understanding those different pieces. And that goes right into what we're going to be talking about today. This talk is about data. Conversion Rate Optimization is a hypothesis-and-testing kind of idea. We're not going to make decisions on our website based on what we think looks good; we're going to ask other people, "What looks good to you?" And this is where I get to tell you: if you're a designer, you're going to hate me. If you're a developer, you're going to love me, because I'm going to tell you how to tell your designer that you're right and they're wrong. So this will actually be pretty interesting. With Conversion Rate Optimization, we all have an idea of what we think looks good, but our target audience may like something different, and we're going to talk about what to do about that.
A good example: I worked with a website once that was targeted only at developers, and it had a very basic, text-only listing of the features of the plan they offered and what it did. I thought it was ugly. We decided to test, and we're going to show a bunch of different tests in a little bit, what happened if we put in a really nice, modern-looking pricing table that made everything clear. We knew it was going to help a ton, that they were going to start converting like crazy. It dropped their conversion rate. We realized that developers don't really care what it looks like. As a matter of fact, when it starts looking too professional, they start to lose trust in it. They start to say, well, this is more like some design company coming in trying to sell me a bunch of things, versus another developer who's just trying to teach me stuff. That's what we mean when we say we want to make websites for a target audience. I'm going to ask that you hold your questions to the end; we may answer them along the way, and if not, I'm reserving enough time to do questions at the end. So the first thing we've got to do is get into what I call the CRO mindset. You're going to fail, and you've got to get yourself ready to fail. You're going to find that certain tests are not going to work the way you wanted them to. So keep an open mind as you run these tests, and don't be worried about what the results are going to be. Failing is actually part of the process. If we don't fail, we're not doing this right, because what we're trying to do is test different things.
Go work with any scientist and they'll tell you the same thing. Actually, Elon Musk put out a video from SpaceX showing all the rockets they tried to land, and it was just a three-minute montage of rockets crashing and blowing up. You're watching this going, wow. And then it shows the one that lands, and since that first landing, they've been able to land most of those rockets, not all of them, but most. Because of every single one of those failures, they were able to learn something to improve the next one. That's what this whole process is: a system of winning and failing to figure out what we want to do and what we don't want to do again. We're going to learn how to do more with less. There are a lot of talks going on here today teaching SEO, marketing, all those things. I do talks on those as well, and driving more traffic to your site is always great. But what if you didn't have to drive more traffic to your site to make more sales? If you already have traffic coming to your site, there's a way to get those visitors to start converting. So we're going to do a lot more while doing a whole lot less of other things. And then we're going to just follow the data. I used to have a slide from Frozen here, but I thought it was too corny, so I took it out. But: just let it go. There are going to be points when you're looking at your data and your tests and you're going to be a little worried. You're going to say, I don't like how this is going, or I really don't like the way this looks, and you're going to want to quit. But if the test hasn't actually reached its results yet, we have to stand back and let the data tell us what's happening.
And that's a little hard to do. You have to be willing to take a step back and say, I don't think this is working, I think I'm failing on this, but let it completely fail. Don't cancel before it fails. We'll talk more about that towards the end of the talk. The second thing we need to do, as with anything, is start with a plan. You don't want to test your website just to test. What I find with a lot of these talks is that sometimes people leave and they're instantly sitting out in the lobby trying to set up their first test, and they tell me, I'm just randomly picking this part of my site because it's the easiest thing to test right now, even though it really doesn't matter; the thing that does matter is too hard to test, so I don't want to do it right now. The idea is: define your metrics, create some goals, and then figure out the exact plan and model you're going after. We're going to start big with any test and refine down. We're not going to start by testing little tiny pieces of the site; we're going to test big, large items first, make sure the big things make a big difference, and then work on the minute things. You also need to know why you're doing the test, and we're going to talk about the different ways to know why. First thing: create a backlog. With any plan, we want to plan out everything we're going to do. The thing with A/B testing is that depending on your traffic, you may plan for a test to take two weeks, and based on the traffic and what you changed, that test could fail really fast. I've had a test that I planned to take three weeks fail within three hours of launching. I had enough data to show that the test was not working.
And I needed to move on to another test. Luckily, I had created a backlog, so I had other tests lined up and ready to go. As soon as that test failed, I jumped right into my next test. I wasn't wasting time. Because if a test fails and you have to turn it off and then figure out what you're going to do next, that's just wasted time. You want to always be testing. So create that backlog so you have things you can switch in right away, and always know what's coming up next. This is where we talk about the why. We're going to collect two different types of data: quantitative and qualitative. The first one is quantitative data. Quantitative data is just a fancy word for, pretty much, analytics. This is the actual data, the numbers we're going to get. There are a whole bunch of tools here, some of which you'll recognize. Jetpack, which many of you probably use on your sites. Google Analytics, a major tool that's free. And there are others out there: Kissmetrics, Mixpanel, Adobe Analytics, and a bunch of smaller ones as well. These are all analytics tools you can put on your site. Obviously the most common is Google Analytics: easy install, and you're tracking. Everybody uses it, mainly because it's free. Just out of curiosity, how many in here use Google Analytics? All right. How many here don't track anything on their website and have no idea what's happening? All right, well, you're just starting, so, perfect. There's nothing wrong with that. This whole talk is about learning why we use this data and why it's really important. The more advanced tools are Mixpanel, Kissmetrics, and Adobe Analytics. These have a price tag, depending on where you're at. Kissmetrics starts at about $600 a month, Mixpanel at around $100 a month, and Adobe Analytics is pretty expensive.
But those are used more in larger businesses or advanced applications. Everything we're going to teach here, you can do with Google Analytics for free; those are just other solutions that exist. Just out of curiosity, does anybody in here use Kissmetrics or Mixpanel? Okay, we may breeze through one spot in this talk. Building your funnel. We want to know how people check out on your website, or how that conversion happens. There's a conversion funnel they go through, in multiple steps, and you want to build out that funnel. We're going to skip this section, but this is basically how you build one in Kissmetrics. We're going to talk about this tool here: Hotjar. You can install it; they have a free version and a paid version, priced by how many hits your website gets. It's actually very competitively priced. It's not $600 a month. If your site's really large, it might be, but for most sites it comes out free, or something like $20 a month. But you can build out your funnel. If you look here, and I'm going to read from my screen because it's a little easier, you have the homepage, the product page, the cart, checkout, and the thank-you page. This is the process someone goes through in checking out on this website, and we can see the drop-off rate all the way down. We see that 52% of people drop off and never reach our product page. Then 20% of people drop off at the product page, so 80% go on to add something to the cart. And then there's a huge drop-off at the checkout phase before the thank-you page, and reaching the thank-you page means they completed the purchase. We want to understand and build that funnel so we can find out what to test later. In addition to the quantitative data, we want to collect qualitative data, which is a little different.
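Before we move on: the drop-off arithmetic for a funnel like that is simple to reproduce yourself. This is a minimal sketch with made-up step names and visitor counts (not real Hotjar output), just to show how each step's drop-off rate and the overall conversion rate are computed.

```python
# Hypothetical funnel counts, illustrating the drop-off math from the
# funnel example above (step names and visitor numbers are made up).
funnel = [
    ("Homepage", 10000),
    ("Product page", 4800),
    ("Cart", 3840),
    ("Checkout", 1500),
    ("Thank you", 600),
]

# Drop-off between each pair of consecutive steps.
for (step, visitors), (next_step, next_visitors) in zip(funnel, funnel[1:]):
    drop = 1 - next_visitors / visitors
    print(f"{step} -> {next_step}: {drop:.0%} drop-off")

# Overall conversion rate: completed purchases / visitors at the top.
overall = funnel[-1][1] / funnel[0][1]
print(f"Overall conversion rate: {overall:.1%}")
```

With these illustrative numbers, 52% drop off before the product page and 20% drop off at the product page, matching the shape of the funnel described above.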
This is where we actually reach out. I can look at that funnel and say, we had a huge number of people dropping off between add-to-cart and the checkout page, which means something's happening: people are adding items to the cart, but they're not actually going to checkout. We want to figure out what happened there, but we might not be able to tell just by looking at it. There are a lot of ways we can jump in and do some error testing, but we still may not know what's going on. This is where qualitative data comes in. A little survey pops up: Why are you buying this product today? Do we have everything you need? Do you need any assistance? Can we improve on anything? We're asking them to give us feedback so we know what's going on, because everybody has a reason that we may not be able to figure out just by looking at graphs and numbers. There are different tools that do this. You have Hotjar again; you'll hear me say Hotjar a lot, because Hotjar has a whole suite of tools that covers a lot of these pieces. And then there are others, like Qualaroo and Olark, that do just those surveys. Hotjar does have a free option, as we talked about earlier, but the other two are paid services. I don't know their pricing, but they are decent tools. Then we can do heat maps. Heat maps are really interesting because you can learn what people are looking at and what's important on your website. You can find hotspots. You can see here that people on a desktop still read with their mouse; they hover over things they're interested in. What heat maps do is track the mouse movements and the clicks of people on your website.
So we can see on this site that people are really interested in the button on the left, not so much the button on the right, even though the button on the right is a brighter color. And a whole bunch of people are going up to the top of the site. I don't know what that part of the site is, it's hidden behind there, but whatever's up there is really interesting to them. We may want to build off of that and maybe change some of our calls to action to reflect what's going on there. We just need to see: what are they reading? What are they ignoring? All of this comes back to the A/B testing portion in a few minutes. One note with heat maps: you also want to look at things like having your models look at your products. This is actually interesting. It's not something you can just add as a tool on your website. Some universities have done this kind of testing with a helmet that literally follows your eyes and tracks where people look while viewing advertising. What they found is that when the model is looking at the product, a lot of people look at her and then look at the product. When she's looking right at the camera, nobody really paid attention to the product in the picture. So make sure that when you have people in your imagery, everything points to what you want visitors to do. And if not, there's an idea for something to A/B test down the road. This is more to give you ideas for when you're looking at your site and trying to figure out what to test. Visitor recordings: this is another thing Hotjar does. It records, not through your webcam or anything crazy like that, that's the NSA, but it records the actual mouse movements.
It doesn't record their computer; it records them inside the browser, inside your website, so we can follow where their mouse goes. We use this to identify where they're pausing, where they're stopping, what's going on, or to identify lost visitors. A lot of times people are looking for something on your site. Maybe they're looking for the "I agree" button for your terms of service and they can't find it. You'll be able to tell, because the mouse will just be wandering around, and then finally you'll see the moment they find it: they quickly move to it and click it. It's really great for seeing how they navigate your site and identifying what's going on with those customers. With long pauses, we actually used this with a company we were working with that had a long checkout process. We saw a long pause, and a big drop-off during that pause. We realized it was the spot where we asked people for their VIN number. It was a sell-your-car-online service, so it needed the VIN. Well, for most people, the only place they know to find their VIN is on the car itself, and depending on where you live, that might mean going down to a parking garage, so people were bailing out. The way we fixed it was with a little checklist: here are the items you're going to need for our process, please make sure you have all of these before going through. That helped increase the conversion rate, because people got extremely frustrated when, after going through six or seven steps of a really long form, they were asked for a VIN number. They basically rage quit. They said, I'm not doing this, and they quit.
Versus when you present that ahead of time: they gather all that information first, and then going through that long form isn't so bad for them. Again, Hotjar pops up here, but you also have Crazy Egg, Inspectlet, and ClickTale; those do screen recordings as well. I believe those other three are paid; Crazy Egg might have a free version now, but they all have paid versions. This slide basically says that Hotjar does most of the things those tools do, so I just threw that in there. I'm not endorsed by Hotjar or anything like that; I just love the tool, because I prefer to have one tool doing everything, since it all integrates, rather than a bunch of different tools that slow your site down, cause other performance issues, and can conflict with each other. Now that we've collected all this data, the next thing to do is go back to the backlog we created and update it. We created our plan of what we want to test; we've now researched the data on those tests. In that research, we may have realized there are other big items we didn't look at that we need to go back and test. We may say, you know what, there's a bigger problem here; let's figure out what's going on there and fix and test that first. And finally, after all this work, we're at the A/B testing part. All of that was prep work to give us ideas of what to A/B test. Who here is familiar with at least the concept of A/B testing? All right. It's basically the blind test of putting a Pepsi and a Coke in front of you, having you taste both, and seeing which one you like. One of the biggest things about A/B testing that makes it different is the way you do it.
And you want to use certain tools that make this possible. You have Optimizely and VWO. Those are paid tools; they're well tested, they're used on very large sites, and they work great. I actually like both of them; I don't really have a preference. It usually comes down to which pricing model works out for a client: you look at how many visitors you get and how many tests you're running, run it through their calculator, and it tells you how much it costs per month. A new player in the space is Google Optimize. Google created Google Optimize 360, which was a great tool, but it was going to be part of their 360 package, which is very costly. They've now released Google Optimize as a free tool. It works just like Optimizely and VWO, so I highly suggest trying it out first; it's free. So far I've liked it. I've not personally tested it on very large projects with a large amount of hits, only on smaller stuff. But I do know, for example, from talking with one of the guys over at Bluehost, that they tested it on their homepage, which gets a lot of traffic, and it worked great for them. So it's definitely a tool to look at. It's free, and you can start testing on your website tonight if you'd like. I think it's at google.com/optimize; just search for Google Optimize and you'll find the signup page. The reason I suggest these tools is that they do a really great job of letting you test variations at the same time. What a lot of people try to do with A/B testing is put one variation of their site up, run it for a week, then put another variation up the next week and compare the data. I promise you're not comparing apples to apples.
For all you know, something could have happened that week. A lot of things happen around us that affect everything we do: world news, politics, financial stuff, the stock market. So many things could affect your sales from week to week that it's not fair to test one week against another; the sales could differ because of something outside your control. The way these tools work is that if everyone in this room went to my website with one of these tests running, you'd get variation A, you'd get variation B, you'd get A, you'd get B. Each person coming to the site is served one of the variations, at the same time, evened out. The tool tracks everyone coming in, tracks that data, and tracks the conversion rates. That way it's a lot more even, because we're literally getting people at the same time, so whatever environmental effects are going on won't skew the test. All these tools work really simply: you add a simple JavaScript snippet to your website. They all work with WordPress, and they'll work if you don't use WordPress, with any other CMS or any kind of site you have. As long as you can drop a JavaScript snippet on it, it will work. Since we're at a WordCamp: these tools work with anything, but they work especially great with WordPress. There's an easy-to-use WYSIWYG editor. This is a site we were testing, and we were going to test this Learn More button down at the bottom. This little bar pops up on the bottom of the site every time you go there. If you can't read it, it says, "Save $400 or more a year on your gas. Learn more." To test this button, we open it up; this example uses Optimizely. They all work very similarly, so I didn't make an example for each, I just used one.
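Going back to how these tools serve both variations at the same time: the core trick is that each visitor is bucketed into a variant deterministically, so the same person always sees the same version on repeat visits, while traffic splits roughly evenly across variants. This is a minimal sketch of that idea only; it is not how Optimizely, VWO, or Google Optimize actually implement their bucketing.

```python
import hashlib

def assign_variant(visitor_id: str, variants=("A", "B")) -> str:
    """Deterministically bucket a visitor into a variant.

    Hashing the visitor ID means the same visitor always lands in the
    same bucket, and a good hash spreads visitors roughly evenly.
    A sketch of the concept, not any vendor's real algorithm.
    """
    digest = hashlib.md5(visitor_id.encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# The same visitor always gets the same variant on every visit:
first_visit = assign_variant("visitor-123")
repeat_visit = assign_variant("visitor-123")
```

Because assignment happens per visitor rather than per week, both variants run simultaneously and share whatever outside events are going on, which is exactly the apples-to-apples comparison the week-by-week approach loses.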
When you right-click on that Learn More button, you'll notice we're able to select, and it's selected here, sorry, it's kind of small, "edit element," then "edit," and we can actually edit the text of that button. We can easily change it out. You don't have to log into WordPress, you don't have to get into your code; it changes right here on the fly. So when you access this, it opens up, and there's our text. We're going to change this Learn More button to say Buy Now, even though it's actually a free product. It was just an interesting idea to test. Now we have two variants: the control and our variant. And notice that the only thing we've changed on this entire page is that one button. Even though this product is not something you buy, that gave a 21% increase in conversions from people seeing the Buy Now button. So we kept the Buy Now, even though there was nothing to buy. We were actually running this test on purpose as an example of a failing test, and it didn't fail. That's why I say this is such a cool process: you learn things that will just baffle you. I still can't tell you with 100% certainty the psychology behind why that worked, but it did; more people clicked it. It was a really interesting concept to try out, and I'm going to show you a couple of other fun things. So, we've got to figure out what we want to test first, right? I just showed you a button, for example, but we also want to know where to test these buttons, because we need a lot of traffic to run tests, and we don't want to test on a page nobody visits. So first, find the pages with high traffic. In this case, on that site we were talking about, Fuelzee, they had a couple of blog posts that were performing really well. "Why does my car make a squeaking noise when I drive?"
That's not good, but a lot of people search for it; we were getting 19,000 people a month coming to the site from just that one article. So we want to test on that page. The other post was "Why does my car make a loud screeching sound when I hit the brakes?" Again, you should probably go see a mechanic, but if you want to use Google to fix your car, you can do that too. We had about 1,500 a month on that page. If I were only running the test on one page, I'd go with the first one. For me, I ran it on both pages, because together they covered the majority of my traffic. Overall, the first accounted for 85% of the traffic to this website and the other for 6.84%. Between the two, I'm covering about 91% of the traffic, so I'm not going to add this test to every page, just those two. So, what do we test? First: test your buttons. You can change your button color; you can change the text color of your buttons. Your call-to-action button, as you'll hear in many talks, is the main key of your site. It's what everyone sees, and changing its color could make a major difference, whether that makes the button stand out less or stand out more. So play with the buttons on your site and test those. Notice on this site, everything else is the same, but the button colors were changed, and just changing that button color to red gave a 21% increase. Now, here's my disclaimer: don't take this talk and go change every green button on your site to red. No. This is what you want to test, because what works on one site will not always work on others. On a different site, changing red to green may change the conversion rate just as much. This is where you get to go back to the designer and tell them they were wrong, because this color works better.
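That page-picking step, adding your biggest pages until you cover most of your traffic, can be sketched in a few lines. The two top shares below mirror the talk's numbers (85% and 6.84%); the other pages and the 90% threshold are made-up assumptions for illustration.

```python
# Hypothetical traffic shares per page. The top two mirror the talk's
# example (85% and 6.84% of pageviews); the rest are made-up filler.
page_share = {
    "/squeaking-noise": 0.85,
    "/screeching-brakes": 0.0684,
    "/about": 0.03,
    "/contact": 0.02,
}

# Add pages, biggest first, until the test covers at least 90% of traffic.
covered, test_pages = 0.0, []
for page, share in sorted(page_share.items(), key=lambda kv: -kv[1]):
    test_pages.append(page)
    covered += share
    if covered >= 0.90:
        break

print(test_pages, f"{covered:.2%} of traffic covered")
```

With these numbers, two pages are enough, which is exactly why the test only needed to go on those two URLs instead of the whole site.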
Now all the designers are giving me bad looks. Test your images. The Barack Obama campaign in 2008 did a lot of A/B testing, and in 2012 it did a ton, more A/B testing than any other political campaign. It's actually really interesting to read up on what his team did with digital marketing and how that helped him in the election. They knew every little thing. Every time you went to his website, you were being A/B tested; something was constantly changing on the site. A/B testing for politics is an exciting field. We've been consulted to do it, and it's crazy, because you have a very small amount of time to do all your testing. You have to run a lot of tests really fast, but you also have a lot of traffic. In this case, the images were being tested. You have this one that's focused more on him; you see some people around, but you really just see him, and he's looking towards the camera. The other one shows him communicating, having a conversation, with a whole bunch of other people; he's not really the focus of that photo. The text in the image, besides being laid out slightly differently, is pretty much the same, and everything else on the site is the same. In this case there was a 19% increase, and I believe the goal was signing up for the dinner, or the donations piece. Whatever their goal was, a 19% increase. I used to stand here in this talk and try to explain why I think this one won: he's shown as part of the community, he's talking to people. And then I went to a talk by, I think it was Mike Demo from BoldGrid, and he said: who cares why? If the test works, that's all that matters.
If you sit there, you'll rack your brain forever trying to figure out why one did better, and unless you literally talk to every single person, you're not going to discover why. And it doesn't really matter, because ultimately you have a 19% increase in your conversions. Yes, sir? [Audience: What about the "Dinner with Barack" text? The other version had "Dinner with Barack" on the right.] Yes, and we'll actually talk about that in just a few minutes. That's a good example: in this test they did switch other things, and it could have been that as well. That's actually under our "don't screw up" section, so I'll talk about it there. Testing layouts: sidebars, again. Some theme layouts still have them and give you the option of left, right, or no sidebar at all. Test those out and see what they do. In this example, the site got a 52% versus a 22% conversion rate by having the sidebar on the right. Funny enough, the Kissmetrics blog ran the same test and got the opposite result: the left side performed way better than the right. So it really depends on your audience. Kissmetrics has a lot of marketers visiting who are used to seeing a sidebar on the right, so maybe on the left it stood out to them more. Again, who cares why? Just run the test and see. Test different flows, like sign-up flows. When a person first becomes a customer, test different pieces of that experience. For example: how many of you sign up for something and immediately get that welcome email, you see it pop up the moment you sign up for a service? The problem is, are you going to go read that email? Now you have a dilemma.
You're thinking, okay, this email just popped up. Do I want to go read it, or do I want to continue going through the site? You've just interrupted their experience of going through your product. Maybe when they sign up, you wait and follow their session, and send that welcome email six hours or a day later: hey, it was great to see you sign up for our service yesterday, we're so happy to have you, here are your next steps. Have you set up this, this, and this? Now you're contacting them once you know they're no longer in your app, and you're inviting them back. And again, test that; test what time period works best. When do you promote upgrades, if you're doing a freemium service and want to move people to paid? When do you start advertising that? When do you start talking about it? When do you put up that wall that says, no, you can't do this until you pay? Figure out where that is. Your different features: test turning certain features on and off. Maybe if you have too many features enabled right away, you scare people off. Maybe you hide some in the beginning and introduce them later in the user flow: by the way, we have other features you should check out. Try testing these different pieces. And then test your money pages. This is where we make our money on the site. This is a store, and this is my favorite test of all of these. The store obviously sells t-shirts; they've got different t-shirts on here. And this is where I say: think outside the box. Have some fun when you're doing these tests. There's one difference between these two images: there's a fake beard on this guy. No, he did not grow a beard and pose the same way. It is literally a Photoshopped beard.
If it weren't on this small screen, you could literally tell it's photoshopped when you zoom in and look. So it's a fun test; you can just see what happens. This is where you get to have a little bit of fun while, you know, you really are working. The hipster one won. This is why I have a beard now, by the way. When I first started doing this talk, I did not have a beard. Now I have a beard. So, interestingly enough, it performed way better. And again, I have no idea why. This is one I actually would like to know why, but I don't know; it worked out. And you're not selling beards. They're not selling grooming products on the site; they're selling t-shirts. So that's why I say you can think outside the box. When I first saw this test, I was like, I don't know if I would have even thought about doing something like that. I would have been focusing on the button color or maybe the way he's standing or something like that. But all they did was do this really cheaply: they didn't send it back to the photographer or anything like that. They just jumped in and put a fake beard on. So, this is where we get to, though: when you're having all this fun, you don't want to screw up. This is the image I originally used when I did this talk. That's a very obvious screw-up, and what I say is your screw-ups won't always be obvious. I don't know, Hurricane Irma affected this area as well, didn't it? Up here in Atlanta? Well, I lived down in central Florida, right around the Disney area. Hurricane Irma did a little bit of damage and took out a lot of signs. One of the places that lost the most signs was Disney. Disney replaced their signs literally within six hours of the storm, which was pretty amazing. But this one stayed up for about three or four days before all the news picked it up. And then somebody realized, whoops, we made a mistake, and they obviously went and fixed it.
It now actually does say Epcot; I did go and verify it just the other day. But this is where I say: there had to be quality control in there. People missed it, and it got put up onto a sign. I'm sure tons of people drove under it and didn't even notice until somebody actually pointed it out and posted it everywhere, basically. And then everyone notices it after that. So you're not always going to see your mistakes. You've got to really look very closely. And this is where we get to what you were talking about, sir: staying focused. Change one element at a time. If you're going to try switching these two blocks, only do that. Don't switch these two blocks and then also switch the content that's in the block. In that Obama campaign example, you had two different types of images: one where he was with a group of people interacting, versus one where you saw him focused more just on you. The problem was they also switched the text, so we don't know if that text switch is actually what caused the change. So we want to test one element at a time. Don't go change the color of your buttons and switch everything around, because while one version may perform better, we now don't know what actually made it perform better, and we're going to have to do a bunch more tests to figure that out. So focus on one item. Change the item, see how it did. If it wins, implement it, and then go test the buttons or the next piece. Next, if you have lower traffic, don't do too many variants. I always like just trying to do two variants; every once in a while, throw a third variant in there. But I like to say it's normally going to take, even with a really good solid test, a minimum of 5,000 visitors per variant. So you're talking about 10,000 visitors to really hit statistical significance on something. When you add that next variant, now you need 15,000. And the more you add, the more you need. And that's on a good test.
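Those per-variant numbers can be sanity-checked with a standard normal-approximation sample-size formula. This is only an illustrative sketch; the function name and the hardcoded z-scores are my choices, not what any particular testing tool uses:

```python
import math

# Standard normal z-scores (from tables); illustrative choices, not tool defaults
Z_CONFIDENCE = {"95%": 1.645, "99%": 2.326}  # one-sided significance level
Z_POWER_80 = 0.842                           # 80% power

def visitors_per_variant(base_rate, relative_lift, confidence="99%"):
    """Rough two-proportion sample size per variant (normal approximation)."""
    p1 = base_rate
    p2 = base_rate * (1 + relative_lift)
    z = Z_CONFIDENCE[confidence] + Z_POWER_80
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil(z ** 2 * variance / (p2 - p1) ** 2)

# A 5% baseline conversion rate with a hoped-for 30% relative lift
# lands right around the ~5,000-per-variant rule of thumb:
n = visitors_per_variant(0.05, 0.30)
print(n, "per variant;", n * 2, "for A/B;", n * 3, "for three variants")
```

Notice how fast the requirement grows for a close test: drop the expected lift from 30% to 10% and the same formula asks for roughly eight times the traffic per variant.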
If it's a very close test, it may take a lot more, and so the more variants you put in, the more traffic you're going to need. Now, Facebook: who here has the Facebook app? So Facebook, no lie, they like to track a lot of things; that just recently came up in the news a whole lot. Facebook runs A/B tests all the time. If you've ever pulled open your Facebook app, found a feature, and then a couple hours later gone, wait, where did that feature I just used go? It's because they're A/B testing. Facebook is interesting because they have such a large user base that they can run a test and hit statistical significance within five minutes of starting it, sometimes less. Because it goes out to so many users, they instantly have feedback. They can run multivariate tests and a lot of different tests all at once because they get that feedback so fast. But if you're not Facebook and you don't have a billion users on your platform, you're going to have to tailor that down a little bit and maybe focus on just one item at a time, because it may take you two months, depending on your traffic, to get an actual solid answer. So try not to do too many tests at once. And then don't call them too early. There are a bunch of these A/B significance calculators out there, and what they'll do is show you, over in the corner, something like: test B converted 33% better than test A; we're 99% certain that changes in test B will improve your conversion rate. You want that to be a 99 or even a 99.5% significance, and it's a long math equation. We can go in and start learning the math today, or we can just go use a calculator, because that's easier.
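If you'd rather see what such a calculator is doing under the hood, the core of it is a two-proportion z-test. Here is a minimal sketch (names are mine; real tools layer sequential re-checking on top of this basic math):

```python
import math

def significance(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-proportion z-test: returns (relative lift, confidence B beats A)."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    confidence = 0.5 * (1 + math.erf(z / math.sqrt(2)))  # one-sided normal CDF
    return (p_b - p_a) / p_a, confidence

# 9% vs 10.8% conversion on 5,000 visitors per variant
lift, conf = significance(450, 5000, 540, 5000)
print(f"lift {lift:.0%}, confidence {conf:.1%}")
```

A 9%-versus-12% result on only 1,000 visitors per variant comes out just under 99% by this math, which is exactly why you plug the raw numbers into a calculator instead of eyeballing the percentages.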
But you can plug those numbers in and it will actually tell you, because even though you see 9% versus 12%, which normally looks great, it might not be significant. The way these tools work is they keep re-checking the results as the test runs, so they can tell you once they're pretty darn sure. You don't want to implement a change that you think is going to improve your site and then have it actually hurt your site in the long run. Double-check amazing results. If you got a 956% increase in conversions, you're either going to get a really huge raise and promotion, or, if it comes out that that was false and you implemented it, you might be looking for a new job. So you want to really dive in when you see something that's just unbelievable, like that hipster test; I would probably have re-run that test three different times because it was so surprising. With that developer site test I talked about, on every other site we ran it, having a nice good-looking pricing table always, always increased conversions. So we re-ran that test three or four different times, making some slight tweaks. We even re-tested ourselves because it just seemed so odd, and every time it came out around the same numbers. So if you see something that looks a little odd, retest it. Also retest down the road. Don't just assume what worked today is going to work next year. So after you go through that whole backlog, go back and recreate some of those tests, retest those theories, and make sure they still work the way you want them to. And so that's it for the presentation. I'd love to open the floor; we've got about 15 minutes or so for questions. Yes, sir, right here in the back. Yeah, can you talk a little bit more about timing?
You've talked about waiting too long and too short. I'm a little unclear: you said don't call tests too early, so what's too early? How long is the right amount? Yes, so his question was, what is the right amount of time to keep testing? What's too early, what's too late? With statistical significance, there's an equation in those tools that will tell you if you've hit it; they'll actually tell you to keep testing or that you've hit it. When we hit significance, we usually switch tests out pretty quickly. Every once in a while we may let them run, based on the schedule we have with that client. For example, some clients we have set up so that we only launch new tests on Mondays, because of how that client is set up. So if a test ends on a Friday and we hit significance, we may just leave it on and let it run for a couple extra days and see if the data starts to change. But it shouldn't, because it's usually pretty solid at that point, until we're ready to change it. I wouldn't keep the test running forever, because once you hit that significance level, it shouldn't change. What you want to do is come in and make those changes permanent, because the way these changes are being made on your site is through JavaScript, and you want to go back in and make them manually on your site instead of through JavaScript. The other option is, with all these tools, you can actually select a winner in the tool and it will display that winning variant to all of your visitors, so 100% of your traffic. That way you can effectively implement it right away; it does it through that JavaScript piece, and you can leave it that way until the development team can actually get in there.
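Under the hood, "display the winning variant to 100% of your traffic" is just a traffic-allocation rule. Here's a rough sketch of the kind of deterministic bucketing these tools do; the function and parameter names are my own invention, not any vendor's API:

```python
import hashlib

def assign_variant(visitor_id, variants=("A", "B"), exposure=1.0):
    """Deterministically bucket a visitor so the same id always gets
    the same variant. `exposure` is the fraction of traffic included
    in the test; everyone else sees the control (the regular site)."""
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    bucket = int(digest, 16) % 10_000
    if bucket >= exposure * 10_000:
        return variants[0]  # outside the test: regular site
    return variants[bucket % len(variants)]

# Normal 50-50 test:
print(assign_variant("visitor-42"))
# "Select a winner": serve variant B to everyone until the
# development team hard-codes the change into the site.
print(assign_variant("visitor-42", variants=("B",)))
```

The same `exposure` knob is how a large site can show a test to only 1% of its audience while everyone else gets the regular page.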
We had a lot of clients where there was about a two-week turnaround with the development team, so we would implement it that way, send it to the development team and say make this change, and then they'd come back a little bit later and say, we've made the change, turn off your test, and we'd turn off our test. All right, there was a question over here. Sure, can you talk a little bit about desktop versus mobile? Yes, all these tools allow mobile testing as well. When you're running your tests, your site's going to look different on each device. With these tools you can select the device, and normally, unless you're changing actual text on the site, you should run desktop tests and mobile tests separately, because people use the two devices differently. The tools do allow you to select which device to run the test on, and you should treat mobile tests and desktop tests as separate tests. Yes, sir, in the back. Okay, so my question is, you're going to have, like you already see with Facebook, it's now blocked by default in Mozilla, so Facebook can't track you outside of the Facebook tab on Firefox, right? And like someone joked before, Facebook kind of opened everyone's eyes, because as marketers we knew this happened forever, right? But now even my mom's asking me, how do I stop them from seeing all my stuff? And maybe that's going to be built into platforms. So how are you preparing for that, and how are you going to tell clients that it's going to change the market? Yes, so GDPR: there are two pieces of reliance, and one is the reliance on the third-party tool. GDPR made me really want to start rethinking my career recently, because I do analytics. But with GDPR, the whole point of it is personally identifiable information. Hotjar became GDPR compliant about a month ago, and Google Analytics just released their statement.
They are now 100% compliant, as is their Google Optimize product, out of the box. What that means is, and again, disclaimer: I am not an attorney, consult your attorney for legal advice on this. But with GDPR and these tools, as long as you're not tracking any personally identifiable information, you're okay. So as long as you're not tracking their name or anything that identifies them as a person, and you're not holding that data. For example, Hotjar has the ability to do screen recordings. My understanding is the way they fixed it is, when someone is filling out a form, it masks all form fields so you can't see what's being filled out and can't relate it back to the person. So that's kind of the idea: it can't be identified back. With this, you're just another number in my system; I'm not actually identifying you as a person. You have to check with the tools to be 100% sure. I know Google Optimize says they are. I don't know about Optimizely or Visual Website Optimizer, but I'm pretty sure they're going to be, because those are some pretty hefty fines if they're not. Yes, yes. If somebody's logged in, if you have an actual system where you're tracking the user ID and they're logged in, it's okay to track the individual, correct? No, because, in reference to this testing, you would just run a test that says, I'm testing my logged-in audience. Don't identify that this is Chris Edwards, that he logged in and we're tracking Chris Edwards's activities, because now you've identified it to me. With Visual Website Optimizer and all that, you're not identifying it to the person, you're just doing it across the traffic. But if you figured out a way to filter it down so you could actually start identifying who it is, then that can become a GDPR issue.
Now, on the application side, that's a whole different animal, but again, that's where I say consult a legal person. GDPR is very confusing because it was written by lawyers and politicians, and tech people were kind of kept out of the loop, so it's very confusing on all ends. So, yes, Sam. I would try a whole different element. The question was, if your test is not doing very well, would you move on to just changing the color and trying a different variant related to that one, or would you try a whole different test? Sometimes it could just be the color or the text, and you can try that; you can actually deploy that as a third variant with the original test still going, if you have enough traffic for it. I've done that with other tests where I see one really doing well with a color change on the button, and I go, well, green's working, I wonder if blue would work, so I've added a third variant that's blue and tested all three of them. You can go that route, but if it's really not performing at all, I would try testing something completely different. There will be points where tests seem to drag on, and with most of these tools, they give you an estimate, based on your traffic, of how long it's going to take to reach statistical significance. If that's longer than you're willing to wait, because your boss is demanding and wants to see something now, then maybe try a different test that has a bigger impact at that time. That's right, 10 to 15,000 is where I usually see it. If it's a really major difference, like everybody absolutely loves the new test variant, it could be less, but for most of them it's about 10 to 15,000. I've had some at about 5,000. I've even had one at about a thousand, because it was so significant. So, okay, great question, actually.
So what do you do if you're a small, tiny site and you're just not getting that many visitors? That's where we go back to the qualitative and quantitative testing. Go use the heat maps and try to identify what people are looking at on your site, and move things around based on that. Now you're not making as scientific a decision; you're making a somewhat scientific guess, but it is still a guess. With those, you can just look at: where are people looking on my site, what are they not looking at, can I eliminate those things, and then try changing up those different items and see if that helps. A/B testing really has to have the numbers to prove anything. You can still run one and just maybe not look for 99%, as long as you realize that you do have a margin of error if you don't go all the way to 99%. All right, yes, ma'am? Kind of feeding off that one: can you do something like one out of every 10, or some kind of qualifier, to implement something for a low-traffic site? For a low-traffic site, you want to use all of your traffic, because there's not a lot of it, so you want a 50-50 split. If you have a large site and you only want to test a small percentage of your audience, you can actually say, I want only 1% of my audience to see the test; the rest of the people just get my regular site. So you can change it up, but on a low-traffic site you want to test every single person, because when you have low traffic, you need every person's opinion. All right, well, thank you everybody for coming out. Thank you.