All right, good morning. We're going to get started; we're already five or six minutes late. It's a pleasure to see you here. I'm going to be talking about my experience of having run through a couple of startups, and in this specific case, a case study of a startup I've been involved in. We're going to go into some of the gory details of how we pivoted seven times before we figured out what we wanted to do, or what problem we were actually trying to solve. The title, MVP Hacks, is about clearing up some misconceptions about what an MVP actually is, and then about how you can be smart about designing MVPs so that you spend very little and still get maximum learning.

So I actually picked this up from Jeff some time ago. Jeff talks about what a typical design thinking process looks like: you have a big idea, and you build a big prototype out of it. You take the big prototype, run it through a bunch of tests, and come up with a long list of learnings. Then you iterate through this process, in a cycle, until you have enough confidence about what you're trying to build. In some sense, if you were to simplify, that's what the design thinking flow looks like.

Now, if you compare this with the lean startup flow, it's slightly different. You still have a big idea, but you highlight all the big assumptions, the riskiest assumptions, that you have. Out of those, you pick the most important assumption you want to validate. You build a simple prototype, or come up with a simple technique, to validate the hypothesis around that riskiest assumption. Then you go through some focused tests to build some learning, and you iterate by refining your idea.
So by taking one thing at a time, you're refining your idea, and in the process deciding whether to pivot or persevere. That, in a nutshell, is how design thinking compares with the lean startup approach to an MVP.

Where this comes from is the BML cycle, the build-measure-learn cycle. You have an idea. You build something, which produces some code. Once you deploy it, you measure how things are working and collect some data. Based on the data, you have some learnings. And the idea is to go through this loop as quickly as possible. That, in a nutshell, is the lean startup build-measure-learn cycle.

In fact, Kent Beck came up with an interesting twist on this cycle. Instead of going from an idea to code and then to learning, what if you reverse it? You start with what you want to learn. Then you work backwards: if I want to learn this, what data do I need to collect to be able to learn it? And if I want to collect that data, what experiments do I need to run to collect it and end up with validated learning? That's an interesting twist on the BML cycle, but the spirit behind it is the same, and that's where the term MVP came from.

As for the term MVP itself, I'm going to let you read the bookish definition of what an MVP is. Makes sense? So what is the goal of an MVP? As the diagram we saw earlier shows, the goal of an MVP is to get validated learning as quickly and as cheaply as possible. Now, a lot of people have read this, and we've all concluded different things about what we mean by an MVP. I've been at a company where an MVP means release one: six months of building a product, and they call that an MVP. I've been at companies where they don't build anything, and they call that an MVP.
So the term MVP seems to be used very loosely, in all sorts of different contexts. But without arguing about who is right or wrong, what we need to focus on is the outcome we want to achieve, and what we can do to get that outcome as quickly as possible. I'm actually going to play a quick audio clip.

So my name is Paul Howell, and I'm going to talk about a specific technique that my startup has used to conduct really realistic, really effective user tests of our ideas. About a year ago, we had an idea for a social purchase-sharing app, where you would stream out what you're buying to your friends, and they would share back with you what they were buying. It was going to be great: a social networking take on product reviews. And being a lean startup, we mocked it up with static prototypes. We got it in front of a lot of people, and they said, I'm not going to use it, but I could see how other people might use it. We heard that again and again, and we asked, why wouldn't you use it? And they said, because I don't know which of my friends would actually use this. And it made sense. It's a social application, and if people don't see the real faces of friends they can emotionally connect with, they can't actually grok what it's going to be about.

So we said, all right, we need to make this a more realistic test than what we've done. We drilled down on the most important interaction on the site: someone makes a purchase, shares it on Facebook, and it appears in their friends' news feeds. And we wanted to know, would people actually click? Would they care? So we thought about how to make this realistic. Did we have to build the whole thing, a pretty serious-sized app, to find out? Or could we fake it really well? We decided to try to fake it, and the technique we used was a Greasemonkey script.
So if you're not familiar with Greasemonkey, it's a browser extension that runs simple little JavaScript files that change the way a website appears. Here's an Amazon product page. After I installed the script, every time I go there, a little yellow box pops up showing competitors' prices. So the script is specific to one page, and it alters that page. I went on rentacoder.com and described what I wanted. I found a guy in the Philippines who was willing to build the script for $40. We sent him $40 across the ocean, and a few days later, sure enough, back came the script. And it's pretty simple; that's the whole thing right there. You just drag it into Chrome or Firefox, and it wakes up every time the browser lands on its target page. So every time we went to Facebook, it would wake up and run itself. The next thing we did was our standard procedure: we posted an ad on Craigslist saying we were running a social media focus group, and we brought people in.
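The actual $40 script isn't shown in the talk, so purely as an illustration, here is a minimal sketch of what a Greasemonkey-style userscript like the Amazon price-box example might look like. Everything here, the `@match` pattern, the `buildPriceBox` helper, and the hard-coded competitor prices, is an assumption for illustration, not the startup's real code; for an MVP test like this, hard-coded fake data is the whole point, since you're testing the presentation, not real prices.

```javascript
// ==UserScript==
// @name   Competitor Price Box (illustrative sketch)
// @match  https://www.amazon.com/*
// ==/UserScript==

// Hypothetical fake data: for a realism test, hard-coded values are fine.
const COMPETITOR_PRICES = [
  { store: "Store A", price: 24.99 },
  { store: "Store B", price: 22.5 },
];

// Build the HTML for the little yellow box as a string.
// Kept as a pure function so the logic can run outside a browser too.
function buildPriceBox(prices) {
  const rows = prices
    .map((p) => `<li>${p.store}: $${p.price.toFixed(2)}</li>`)
    .join("");
  return (
    `<div style="background:#ffff99;border:1px solid #ccc;padding:8px">` +
    `<strong>Competitors' prices</strong><ul>${rows}</ul></div>`
  );
}

// In the browser, Greasemonkey runs this file on every matching page load,
// so we just attach the box to the page. Guarded so the pure logic above
// can also execute in a non-browser environment.
if (typeof document !== "undefined") {
  const box = document.createElement("div");
  box.innerHTML = buildPriceBox(COMPETITOR_PRICES);
  document.body.prepend(box);
}
```

The same shape works for the Facebook test Paul describes: match the news-feed page instead, and inject a fake feed item with real friends' names and faces so participants react to something that feels genuine.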