Matt's talk was actually a perfect introduction to what I'm going to talk about, because Matt talked a lot about conversion optimization, which is incredibly useful. But a lot of companies are not doing it right now. I would estimate that less than one or two percent of the 2.5 million companies that spend money on Google AdWords are actually doing split testing. And I want to talk about a cultural problem that I believe is the main reason why more companies are not doing conversion optimization. That cultural problem is a kind of war between creativity and data. And I'm going to suggest some ideas on how to solve it, obviously.

So I'm going to start with a story, and the story starts in 2006. By all standards, Google in 2006 was already an incredibly successful company. They were about to make ten billion dollars that year. And maybe more importantly, "to google" became an official verb in the dictionary. So by all standards, a really successful company. Now, something else happened in 2006: Google hired their first visual design lead. That's a weird story by itself, that such a tremendously successful company only hired its first visual design lead in 2006, when it was already quite big. But that's not the story. The real story happens three years later, when Doug Bowman, that's his name, resigned, and Google lost their visual design lead. Doug wrote a blog post explaining the reasons why he quit Google, and the most salient quote in it is: "I can't operate in an environment like that." Let's look at this in a bit more detail. He can't operate in an environment like that. Now you wonder: what does he mean? Google is a fantastic company. These are just some pictures from Google offices all over the world. "I can't operate in an environment like that."
What he meant, very specifically, is that he didn't want to work in an environment where the design philosophy is strictly driven by data. The way he put it: a philosophy that lives and dies strictly by the sword of data. And he gave examples. For instance, when Google couldn't decide between two different blues that he suggested, they would just test 41 different shades of blue with all the traffic they have, to see which one attracted the most clicks. And he mentioned debates where he would have to argue whether a border should be three, four or five pixels wide. He didn't like the fact that they removed all subjectivity and instead only looked at the data.

This is a fake conversation I made up between Doug and Marissa Mayer. Clearly fake, because there's no way Marissa would have used an iPhone. But you can imagine the discussions they had, where Doug sends something over and says: I thought about this a lot, I think this would work. And the Google people tell him: well, have you tested this?

Now, this is not a single incident; this is a big problem. All over the web you have a lot of people who take all the stuff that Matt was talking about and say: I don't like that. Here are some quotes from rather successful and influential designers. Nathan Barry says: "A/B testing is like sandpaper. You can use it to smooth out details, but you can't really create anything with it." Or: "I'm skeptical of A/B testing. The approach ignores the magic and the soul." On the other hand, you've got a lot of people who work in this industry and have huge problems finding designers, and they say: oh man, designers, they always want to create, create, create. They don't care about the goals and the revenue and the results. So there is this war between data-driven marketers and designers, to some extent. And it makes you want to say: just stop. This is not leading us anywhere.
In order to understand this, let's zoom out and see that this is part of a broader problem. First of all, I really like this way of looking at it. The term is not mine; it was coined by an ad-tech company called MediaMath: the Mad Men versus the Math Men. And the graph in the middle exemplifies what is meant by that: Google now rakes in more advertising revenue than the entire American print industry. So clearly this is a broader problem, and it's also an older problem. Some could argue it actually starts with Aristotle's distinction between episteme and techne, roughly knowledge versus craft. And we had a similar discussion about these things in the 20th century, in a very influential lecture called "The Two Cultures and the Scientific Revolution" by C.P. Snow, a man who was both a chemist and a novelist. He said something serious: I believe the intellectual life of the whole of Western society is being split into two groups, the literary intellectuals and the scientists. And they're not talking to each other, and that's a big problem. This stuff is still holding us back today.

So I want to start by looking at some misconceptions that I believe are the reasons here. There are some very extreme perspectives in here, because they are out there, so: discretion advised. One of the first misconceptions is that creativity lacks iteration, that creativity lacks process. You will find this amongst people who are very data-driven. They think creativity is somehow a single stroke of genius, just as Paul McCartney supposedly wrote "Yesterday" in his sleep: he was dreaming, woke up, and had to write down the notes. What a genius! True story, but not the norm. If we look at the history of art, we find that a lot of artists, in particular during the Renaissance, painted many different versions of a single painting.
These are Leonardo da Vinci's two versions of the Virgin of the Rocks, not "on the rocks". The slightly Instagrammed-looking version on the left is the older one, and there are some other subtle differences here. Beethoven was known to have a very rigorous process of changing and iterating through his symphonies before he arrived at the final version. And here is an interesting example by Piet Mondrian. If we go back and forth between these two: this one is the original, the other one is an altered version where the blue and the red are simply exchanged. This could be a typical A/B test, actually, in online marketing. He stuck with this version, but what is known is that he went through a lot of iterations, because you can X-ray these paintings and see that he constantly changed them. Some neuroscientists actually went on and did an eye-tracking study with this altered version, and they found that the left one, the original, did a much better job of having people's gaze wander around the painting, whereas in the right one the viewer's gaze was trapped inside this dark blue hole. So here's a quick hint for creatives presenting their work these days: showcase your process. It helps a lot when you talk to suits and data people.

Second misconception: science is all about data. All numbers, no feel. And this is precisely contradicted by one of the most important scientific discoveries of the 20th century, by Watson and Crick: figuring out the double-helix structure of DNA. Watson had the idea in a dream. Obviously they were working with lots of data, but the idea occurred to him in his dream. Third misconception: data is already science. Here are the numbers, what else do you want? That's not true either.
The scientific process does accumulate a lot of data, but often the view of that data changes when someone has an innovative idea. So we have scientific dreamers and iterative painters on both sides. In many ways you could say: yes, that's very yin and yang, we surely need both. But what I'm really trying to say is that this all starts with culture. If you are running a company, or working in a company, where you could potentially be testing your websites and you're not doing it right now, that may be because of this internal struggle between creativity and data and how to resolve it.

So let's get a bit more practical. This isn't the choice you face when you run an online company or an app. That would be a very stupid choice: a radical redesign, when you want to do something new, with no testing, on one side, and optimizing pixels, with testing, on the other. What I would say is: you should definitely test. You should definitely test because, why not? Because you can. You can get all this knowledge through testing. There are maybe just two exceptions where you shouldn't test: when you have little to no traffic, or when you haven't made a single sale or conversion. But in those cases Matt will be able to help you, and then you should get into testing. These are just some tools, which Matt pointed out before, that make it incredibly easy to set up tests these days. And the question remains: when we look at tests, what should you test? We still have this spectrum between smaller changes and radical redesigns, and I'm going to bring forward arguments against both extremes.
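To make the "you should definitely test" step concrete for readers of this transcript: once a test has run, the question "did the variant really win?" comes down to a significance check. Below is a minimal sketch of a two-proportion z-test in Python, using only the standard library. It is not tied to any tool mentioned in the talk, and the traffic and conversion numbers are invented for illustration.

```python
# Minimal sketch of judging an A/B test with a two-proportion z-test.
# Stdlib only; all numbers are hypothetical.
from math import sqrt, erf

def ab_test_z(visitors_a, conversions_a, visitors_b, conversions_b):
    """Return (relative_lift, two_sided_p_value) for variant B vs control A."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled rate under the null hypothesis "no difference".
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF, via erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return (p_b - p_a) / p_a, p_value

# Example: control converts at 3.0%, a bolder variant at 3.9%.
lift, p = ab_test_z(10_000, 300, 10_000, 390)
print(f"lift: {lift:.0%}, p-value: {p:.4f}")
```

With numbers like these the variant is a clear winner at the usual 5% threshold; with a much smaller lift on the same traffic, the p-value would stay high and the test would be inconclusive.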
This guy, Rand Fishkin, the CEO and founder of Moz, a great company, is actually one of the people who suggest you should test radical redesigns. Mostly because, if you look at the different sorts of tests you could run, with a button-color test on one end and a completely different landing page on the other, the different landing page will take less time to give you statistically significant results. More importantly, he had an experience where they used a conversion-rate company to help improve their website. On the left is the page they started with, and on the right is the very long page they ended up with, which gave them a 52% increase in sales. For SEOmoz that was $1 million in revenue, from a single test.

Chris Goward, another bearded man from North America, the CEO of WiderFunnel, a conversion optimization agency, suggests the opposite. He says: no, you should test smaller changes, simply because otherwise you're going to miss out on insights. If you test everything at once, with a completely new landing page, you will not know which change actually drove revenue the most. I'm going to go ahead and say my personal sweet spot is in the middle. You should be creative in your tests and you should strive to test bold things, but not necessarily complete redesigns. The reason: on the one side, when Rand suggests complete redesigns as a testing tool, he doesn't take into account that it takes a lot of time and work to completely redesign a website. On the other side, the suggestion that you should only change very small things, because otherwise you miss out on the insights, is a bit of a first-world problem: "we increased sales by 120%, but we just don't know what did it."
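Rand's point that a bolder change "will take less time to give you statistically significant results" can be made concrete with a common back-of-the-envelope sample-size rule, roughly n ≈ 16·p·(1−p)/δ² per variant for 95% confidence and 80% power, where p is the baseline conversion rate and δ the absolute difference you expect to detect. The baseline rate and lifts below are hypothetical, not figures from the talk.

```python
# Back-of-the-envelope sketch: the traffic needed per variant shrinks
# with the square of the expected lift, so bolder tests finish sooner.
# Rule of thumb: n ~ 16 * p * (1 - p) / delta^2 (alpha=0.05, power=0.8).
def visitors_needed(baseline_rate, relative_lift):
    delta = baseline_rate * relative_lift  # absolute difference to detect
    return round(16 * baseline_rate * (1 - baseline_rate) / delta ** 2)

baseline = 0.03  # a hypothetical 3% conversion rate
for lift in (0.05, 0.20, 0.50):
    n = visitors_needed(baseline, lift)
    print(f"{lift:.0%} expected lift -> ~{n:,} visitors per variant")
```

A tiny tweak expected to move a 3% rate by 5% relative needs hundreds of thousands of visitors per variant, while a redesign expected to move it by 50% needs only a few thousand, which is exactly the trade-off against Goward's "you lose the insight into which change did it".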
Now, when we look at the role of art and science again, or of creativity and data in CRO, I want to look at a typical conversion optimization framework; a lot of this has been covered by Matt. You start by analyzing the status quo, looking at your analytics data, your Google Analytics, your KISSmetrics: where are people dropping off? That is purely data- and science-driven. Then you get into the phase where you actually have to come up with a hypothesis: how could we improve things? What tests should we run? This is where you clearly need to be creative. If you just change your button color, sure, that's a decent test, and some people have had very good results with it. But there's a good chance you'll get better results if you focus on bolder ideas and are really creative in this hypothesis process. The actual process of creation is obviously still a creative enterprise, and then running a valid test and analyzing and interpreting the results is again driven by data and science.

What I would briefly like to mention: we have tools for most of these phases. We've got analytics for the status-quo analysis, we've got persuasion psychology, case studies and frameworks for the hypothesizing phase, and we've got plenty of platforms on the testing side. What we currently don't have, and I think it's something that could really resolve this conflict between data and creativity, is something I like to call creative data: data tools that help you during the creation process. Because that's also the process that takes the longest of all, and right now we're on our own there as creatives.
We are slowly moving towards something like creative data. In 2009, the same year Doug Bowman resigned from Google, Adobe bought Omniture for almost two billion dollars: a giant marriage between a company that builds tools for creatives and one of the leading enterprise analytics companies, which helps big Fortune 500 companies optimize their websites through testing. That's quite promising, and this is the vision Adobe had: an all-in-one platform that enables you to create and test things continuously. So that's a step towards creative data.

I believe creative data, the sort of tools we use during the design process, once we're done analyzing and done hypothesizing, has a few constraints: it needs to be fast, it needs to be objective, it needs to be communicative, and it must not be all too final, because otherwise it would again endanger the creative process. Some examples. Notable is a really great tool for getting qualitative data from your team. When you are in the process of designing something, you can put up mock-ups on Notable and get very quick feedback from the team or other stakeholders in the design process; their claim is better interfaces through faster iteration. Five Second Test is an interesting one for getting qualitative data from users during the design process: you can put up your mock-up, your website or some intermediate step, and have random people on the internet look at it for five seconds and tell you what they saw. That's quite useful. I myself am working at EyeQuant, obviously, as a founder. Our background is in neuroscience, and we provide software that predicts within seconds how people will look at a website. So that's quantitative creative data, and that's also how our customers use it.
Generally, I think what we'll see on the tool level is that creation and testing, creativity and data, are going to come closer together. Optimizely is doing similar things: they already enable you to change the design a little bit before you set up a test, within the same platform. And I think that's great, because, as everybody says, you've got to embrace both. You've got to be both creative and data-driven in order to be successful. Thank you very much, and I look forward to your questions.