Live from Washington D.C., it's theCUBE. Covering .conf 2017, brought to you by Splunk.

Welcome back to the district, everybody. We are here at .conf 2017. This is theCUBE, the leader in live tech coverage. I'm Dave Vellante with my co-host, George Gilbert. Doug Merritt here, the CEO of Splunk. Doug, thanks for stopping by theCUBE.

Thanks for having me here, Dave.

You're welcome, good job this morning. You are a positive guy, great energy. You've got the fun t-shirt: "I like big data, and I cannot lie." The t-shirts I love, so great. You guys are a fun company, so congratulations.

Oh, thank you.

How's it feel?

It feels great. I mean, you're surrounded by 7,000 fans that are getting value out of the products that you distribute to them, and the energy is just off the charts, as you said. It's truly an honor to be surrounded by people that care about your company as much as these people do.

Well, one of the badges of honor that Splunk carries at your shows is spontaneous laughter and spontaneous applause. You get a lot of that, right? And that underscores the nature of your customer base and the passion that they have for you guys.

From the very beginning, from the first code that Erik Swan and Rob Das pushed out, the whole focus has been on making sure that you please the user. The tenets that they created to drive Splunk still stand today, and I think a lot of that spontaneous laughter and applause goes back to this: if you really pay attention to your customer and you really focus all your energy on making sure they're successful, then life gets a lot easier.

Well, it's interesting to watch the ascendancy of Splunk. If you go back to 2010, 2011, everybody was talking about big data. It was the next big thing. Splunk never really hopped on that meme from a narrative standpoint, but now you kind of are big data. You kind of need big data platforms to analyze all this data. Talk about that shift.
And I still don't think that we are the lead flag waver on big data, and so much of that goes back to our belief in how you serve customers. Customers have problems, and you've got to create a solution to solve that problem for them. Increasingly these days, those problems can be solved in a much more effective way with big data. But big data is the after-effect. It's not the lead of the story, it's the substantiation of the story. So what I think Splunk has done uniquely well, whether it's our origins in IT operations and systems administration, or our foray into security operations centers, security analytics, and security analyst support, is we've started with: what is the problem that we're trying to solve? And then, because we're so good at dealing with big data, obviously we're going to take an unstructured data, big data approach to that problem.

So you're about two years in. You were telling us off camera that Splunk has a tendency toward a little ADD. You came in and helped with a little prioritization exercise. What have you learned in two years?

Infinite, we've got to have an hour for that. I think part of the ADD is because the platform is so powerful, it can solve almost any problem. And what we need to do to help our customers is listen to them and figure out what are the repeat problems that we can actually scale and bring to lots of different people. And that's been part of that focus problem, or focus opportunity, we have: if you can solve just about anything, how do you help your customers understand what they should do first, second, and third? And I think that's part of the dilemma we see in the big data space, as people started with "I want to just amass all the data." And I think that was a leftover from where those big data platforms started, which George and I were talking about.
If I'm Yahoo, if I'm Google, if I'm LinkedIn, if I'm Facebook, the guys that originated MapReduce and the whole Hadoop ecosystem, my job is data. Literally, that's all I have and that's all I monetize and drive. So I have both the motivation and the technical and engineering know-how to just put every bit of data I possibly can somewhere for later retrieval. But even those organizations have a hard time really optimizing that data. So the average org needs to start in a different spot. It's not just put everything somewhere I can later retrieve it. It's: what problem am I trying to solve? What data do I need to solve that problem? And then how do I use it? How do I bring it into something and then visualize it so that I get immediate payback in return? I think you guys talked to Michael Ibbitson on the show; he was in my keynote. A lot of, I think, the magic he brought to Gatwick and to Dubai Airports is: let's just start with, can we get people through security in five minutes or less? What data do we need? And then you can move on to the next problem, and the next problem. I think it's a more practical and more effective way of looking at big data, through the customer solution lens.

Yeah, great story, Dubai Airport. Go ahead, George.

When you look at these sort of customer adjacencies, are you looking at what is the most relevant next batch of data relative to what I've accumulated for the first problem? Or is it an analytic solution that addresses a similar end customer, a similar department? How do you find those adjacencies and attack them?

So the good news and the beauty of Splunk is, it's not difficult to get data into the platform. And when you do the surveys of data scientists, and I think Richard talked about this in his keynote, they all unanimously come back and say, we spend 60 to 80% of our time just trying to wrangle data. It's like, well, that's not super helpful. How do you get data in quickly?
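That "get data in quickly" point is usually described as schema-on-read: store the raw event untouched and extract fields only when a question is asked, rather than spending the 60-80% wrangling time up front. A minimal Python sketch of the idea, with a hypothetical log format (this is an illustration of the concept, not Splunk's actual engine):

```python
import re

# Raw events are kept as-is; no schema is imposed at ingest time.
raw_events = [
    '2017-09-26T10:01:12 host=web01 status=500 path=/checkout',
    '2017-09-26T10:01:13 host=web02 status=200 path=/home',
    '2017-09-26T10:01:14 host=web01 status=500 path=/checkout',
]

def extract_fields(event):
    """Schema-on-read: pull key=value pairs out of the raw text at search time."""
    return dict(re.findall(r'(\w+)=(\S+)', event))

# A "search-time" question: how many 500 errors per host?
errors_per_host = {}
for event in raw_events:
    fields = extract_fields(event)
    if fields.get('status') == '500':
        host = fields['host']
        errors_per_host[host] = errors_per_host.get(host, 0) + 1

print(errors_per_host)  # {'web01': 2}
```

The payoff is that a new question (errors per path, requests per minute) needs no re-ingestion, only a new search-time extraction.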
So we've always been effective at getting massive amounts of data in because of the way that we architected the system. The challenge for us is: how do you marry domain expertise with the different algorithms, queries, or usage of the data so you get that specific solution to a problem? So we've built up a whole practice of looking at the data sources that are in. What do we know from our customer base that says here are the top use cases that have been able to take advantage of those data sources for these outcomes? And that's how we try and work with customers to say: all right, you've already brought server logs, firewall logs, and API streams from these four AWS services into Splunk, and you've already got this benefit. What are the next two things you can do with that data to get additional benefit?

So in a sense, you've got a template for mapping out a customer journey.

Yes.

It says here are the next steps. It's like a field guide to move them along in maturity.

Right. And you can codify that. And that's been the hard part: creating the open source contribution framework, for lack of a better word, around what all these different uses are. But the final mile, or the final inch, that most of these customers are trying to drive to is different for every single customer. And that's, again, part of the challenge with AI and ML, and what we were highlighting on stage this morning. There are three different dimensions you're dealing with simultaneously. One is: what data sets are you bringing together? As you add different data, it radically changes the outcome. What algorithms are you driving? As you tweak an algorithm, even on the same data, it radically changes the outcome. And then what functional lens are you putting in place?
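That second dimension, that tweaking the algorithm on the same data radically changes the outcome, shows up with even the simplest anomaly rules. A sketch (illustrative only, not Splunk's ML Toolkit) where a z-score rule and an IQR rule disagree on the very same series:

```python
data = [10, 11, 10, 12, 11, 10, 30, 11, 10, 55]

# Algorithm 1: flag points more than 3 standard deviations from the mean.
mean = sum(data) / len(data)
std = (sum((x - mean) ** 2 for x in data) / len(data)) ** 0.5
zscore_outliers = [x for x in data if abs(x - mean) > 3 * std]

# Algorithm 2: flag points above 1.5x the interquartile range.
# (Crude quartile picks, good enough to make the point.)
s = sorted(data)
q1, q3 = s[len(s) // 4], s[3 * len(s) // 4]
iqr_outliers = [x for x in data if x > q3 + 1.5 * (q3 - q1)]

print(zscore_outliers)  # []
print(iqr_outliers)     # [30, 55]
```

The spikes at 30 and 55 inflate the standard deviation enough that the z-score rule flags nothing, while the quartile-based rule flags both: same data, different algorithm, radically different outcome.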
And so if you want to solve baggage handling at the airport, like one of Michael Ibbitson's guys, you need some rich aviation and logistics experience to actually understand that domain. How do you bring that domain expertise together with the actual data sets and the algorithms so you get that rapid payback? Creating enough of those so they're easily digestible and easily actionable by our customers, that is the horizon that we're trying to pierce through.

And that leads to an ecosystem question, does it not?

It does.

Is that the answer, or part of the answer, for that last mile or last inch?

It's a huge chunk of the answer, because you go back to: I need that domain expertise. And pharmaceutical drug exploration expertise is different than general healthcare or medical expertise. And if you're not able to bring that practical experience together with the ability to easily wrangle data and some data scientists that can write these really interesting and effective ML routines, then it's difficult to get that value.

So I know George will jump in here in a second, but what are you guys doing explicitly on that front? Where does that fall in the priority list? Is it percolating?

So I think what's made Splunk unique from the very beginning is a whole host of things, but one is we made it accessible for an average person to get data in, to store data, and to extract value. A lot of the technologies out there, you can cobble together and eventually get to Splunk, but it's really long, painful, and difficult. If you take that same orientation into this now over-hyped ML/AI world, it's the same thing. How do you raise the bar so that an average person on an average day, with domain expertise and some understanding of data, can find ways to get value back out?
So there's certainly a technology problem, because you've got to be able to do it at scale, at speed, with integrity, but I think it's almost as much, or maybe more, of a user interface and approachability problem. There just are not enough data scientists and data experts who are also computer science experts to go around and solve this problem for the world.

So it sounds like there are two approaches. There's the customer-specific last mile, and then what you were talking about earlier, in the keynote and at the analyst breakout, which is to find the horizontal use cases that you can bake into what Richard called curated experiences, which are really ML models that need minimal, light touch from the customer.

Yes.

So help us understand how those can build out with the customer last mile, so the customer wakes up with a platform.

So we have over 1,500 solutions as part of Splunkbase, which really are those mini curated experiences. For my Palo Alto environment, a combination of Palo Alto, us, and third parties created a Palo Alto solution that is able to read data in from the different Palo Alto technologies and provide dashboards, alerts, or remediations that really assist the Palo Alto team in doing their job more effectively. So there are over 1,500 of those in Splunkbase. What Rick in the IT operations and app dev arena, and Haiyan in the security arena, are responsible for is: how do we continue to gen up the ecosystem so we get more and more of those experiences? How can we extend from Palo Alto firewalls to overall network and perimeter visibility? That's a combination now of reading in Palo Alto firewall logs, plus the other firewall technologies they likely have, plus network data, plus endpoint data, so we can get visibility. And that's almost always a hyper-heterogeneous environment, especially when you start to drive applications in AWS, maybe some in GCP, maybe some in Azure. They all have different formats.
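That hyper-heterogeneous format problem is, at its core, a normalization problem: map each vendor's field names onto one common schema before correlating anything across sources. A hedged sketch of the idea in Python, with made-up vendor formats and field names (real firewall logs differ; Splunk addresses this with its Common Information Model):

```python
# Two hypothetical vendor formats describing the same kind of firewall event.
palo_alto_event = {'src': '10.0.0.5', 'dst': '8.8.8.8', 'action': 'deny'}
other_fw_event = {'source_ip': '10.0.0.5', 'dest_ip': '1.1.1.1', 'verdict': 'DROP'}

# Per-vendor mappings onto one common schema, so downstream searches
# and dashboards only ever see normalized field names and values.
FIELD_MAPS = {
    'palo_alto': {'src': 'src_ip', 'dst': 'dest_ip', 'action': 'action'},
    'other_fw':  {'source_ip': 'src_ip', 'dest_ip': 'dest_ip', 'verdict': 'action'},
}
ACTION_MAP = {'deny': 'blocked', 'DROP': 'blocked', 'allow': 'allowed', 'ACCEPT': 'allowed'}

def normalize(vendor, event):
    """Rename vendor-specific fields and canonicalize the action value."""
    fmap = FIELD_MAPS[vendor]
    out = {fmap[k]: v for k, v in event.items() if k in fmap}
    out['action'] = ACTION_MAP.get(out.get('action'), out.get('action'))
    return out

events = [normalize('palo_alto', palo_alto_event),
          normalize('other_fw', other_fw_event)]
blocked = [e for e in events if e['action'] == 'blocked']
print(blocked)  # both events, now under one schema with action='blocked'
```

Once everything lands in the common schema, a single query over `blocked` events spans every vendor, which is the point of the perimeter-visibility extension described above.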
They've got different virtualization technologies that represent all those different on-prem renditions. So I think the world continues to get more complex, and the more that we can help corral the community into, here are the buying centers, here are the pain points, use the technology to finish and deliver that curated experience, the easier it is and the better it is for our customers.

Doug, I know you're super busy and you've got to go, but sort of a last question. We've seen Splunk go from startup, to pre-IPO, to a successful IPO, with a couple of bumps along the way. Now you guys are over a billion dollars, and it feels like there's much more to come. The ecosystem is growing, adoption is really, really solid. The richness of the platform continues to grow. Where do you see it going from here?

I really do believe, in my heart, my deepest heart, that this is the next five, 10, 20 billion dollar organization out there, and it's less the money than the representation of what that means: reaching millions to tens of millions to hundreds of millions of people with these curated experiences, with these solutions, with insights, across hundreds of thousands to potentially millions of different entities out there, organizations, whether non-profit, governmental, or commercial. Marc Andreessen is famous for saying the world is becoming a software world. I agree, and I'll take it one step further: I think the world is becoming a data-driven and data-insight world. Software is key to that, but you implement software so you can get insights and be intelligent and sense and respond and continue to iterate and grow. And I believe that Splunk is the best positioned company and technology on the planet right now to lean in and make this practical and approachable for the millions of end users and the hundreds of thousands of organizations that need that capability.

So much more to talk about. Doug Merritt, thanks so much for coming by theCUBE.

Thank you so much.
Really a pleasure having you.

Thank you, George.

All right, keep it right there, everybody. We'll be right back with our next guest. This is hashtag Splunkconf17, check that out, and check out hashtag CubeGems. This is theCUBE. We're live, and we're right back from D.C. Bye bye.