Live from the MGM Grand Convention Center in Las Vegas, Nevada, it's theCUBE at Splunk .conf 2014. Brought to you by headline sponsor, Splunk. Here are your hosts, Jeff Kelly and Jeff Frick. Hi, welcome back everybody. Jeff Frick here with theCUBE. We're at Splunk .conf 2014, the fifth annual Splunk user conference, at the MGM Grand in Las Vegas, Nevada. They outgrew the Cosmopolitan, they outgrew the Aria, they outgrew everything. I think Godfrey said there were 175 people at the first one five years ago, and I think there are over 4,000 Splunk practitioners, employees, partners, and customers here. Splunkers, I guess, is the right word. So we're excited to be here. It's our third year. Joined by my co-host in this next segment, I'm Jeff Kelly from Wikibon, and we're joined by Matt Olson, principal architect at CenturyLink. Matt, I think it's your first time on theCUBE, welcome. Thank you, glad to be here. So tell us a little bit about CenturyLink. I think, you know, we think CenturyLink, we think cloud. Tell us a little bit about CenturyLink and your kind of value proposition. Yeah, so we're the third largest telecom in the US, and we've grown by leaps and bounds through some organic growth but also merger and acquisition. And I think, like most of telecom right now, we're in the midst of a massive transformation, really into a data services company. That means we're accommodating a massive transformation, with decline in some areas and just incredible growth in others. Right, so let's talk about that a little bit. So becoming a data services company, what does that mean? Does that mean providing analytics? Does that mean providing data movement? What does that mean from your perspective? First and foremost, it means providing broadband-based and internet-based services. And it really means moving from a mode in which we're selling pipes to a mode in which we're selling services that run over those pipes.
So it's really been quite a challenge in terms of monitoring and performance management, which is my key responsibility, because you need the ability to link what you're seeing in terms of the service delivery with what you're seeing in terms of the underlying infrastructure, the pipes, the network elements, and such. So when you're delivering high-value services over those pipes, you've got to be able to make sure those pipes are up and running and operating smoothly at all times. And the key is understanding what's happening end to end, because in service delivery to a customer from a point within the core of your network, you're going to traverse multiple topologies, multiple network elements, and lots of different vendors, and each of those elements, each of those segments, might look okay, but when you add it up end to end you find it's a different story. So talk a little bit about Splunk and how that's helping you do that. Clearly it's a critical part of your job and a critical part of delivering value to your customers. Oh, it's absolutely central. And I think the key with Splunk is that we're able to break down data silos very easily. As a large telecom, and especially a telecom that's grown through acquisition, you have a number of different silos, a number of different information domains, and Splunk has really allowed us to break down the barriers between those silos and provide an integrated, correlated view of what's happening across those domains. So maybe take us a little bit before and after. Before you started working with Splunk and trying to break down those silos, what was your approach to trying to do that? Was it a manual approach? How would you go about doing it?
Well, we had a very traditional approach, which was based upon traditional row-oriented databases, and generally you have your DBAs, you have your people focused on schema development, you have your people focused on Java and GUI development, and you generally have multiple teams and multiple areas across those domains. So we started using Splunk, I think as many businesses do, just for simple log parsing, kind of a tactical spot solution, and then we realized that we could do prototype development very, very quickly. Because in terms of time to market, if you want to inhale data, parse it, work with it, and then build views of it in a traditional environment, you've got to work in each of those specialized areas: work with your DBAs on schema to bring in the data, and then go work with another group on the development. And we quickly ascertained that with Splunk we were able, in one place, in a one-stop shop, to very simply prototype all the way from data acquisition through to the useful views. And then we realized that those prototypes are fully functional as production operational tools. So we began to transition more of the load and more of the focus from the traditional, actually Oracle-based, solutions over to Splunk, and over time, actually very quickly, that became our core focus and we kind of pushed aside the old legacy solution. Interesting. So we heard from the CEO at his keynote yesterday, and he talked a little bit about seeing from customers the need to do this kind of rapid application development and roll it out as fast as possible. And we've talked to customers yesterday around kind of the whole DevOps area and where Splunk fits in there. It sounds like that's one of the key value propositions for you. Oh, absolutely.
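The one-stop prototyping Matt describes, from data acquisition through to a useful view, can happen inside a single Splunk search. A minimal sketch of that kind of search (the index, sourcetype, and field names here are hypothetical, not CenturyLink's actual configuration):

```
index=netops sourcetype=syslog
| rex field=_raw "%(?<facility>[A-Z0-9]+)-(?<severity>\d)-(?<mnemonic>[A-Z_]+)"
| stats count AS events BY host, mnemonic
| sort - events
```

Saved as a dashboard panel, the same search that served as the prototype becomes the production operational view, which is the transition he describes.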
Can you maybe give us an example, add a little color? What's one of the use cases, one of the coolest things you've found, or maybe not coolest, one of the most important things that you've found using that approach? Well, I think first and foremost, we start with pain. You work with different organizations, different customer organizations, and determine what the pain points are, where they're hurting. Because nobody ever really has the time to produce detailed requirements of what they need. So you find out where the pain is and then you kind of dive in and prototype something. And I think the coolest initial cases were simply situations where something had gone wrong with the service. Something was clearly a mess, and yet you had these multiple groups each scrambling in their own little tool sets trying to figure out what was going on. And with Splunk we were often able, very, very quickly, to link what was being observed in terms of issues with the service to what the platforms were saying, the syslog and SNMP data and such telling you what's going on with the platforms. And you can really quickly, really easily provide a view that shows you, okay, here's where your service is failing, and that exactly right there is the underlying cause, because you can link what's happening with the machine data from the platform perspective with what's being observed with the service. And we had a number of cases like that, where we had a number of people in mad scrambles, they had war rooms, these long operations, and we were able to just quickly dive in, grab data, whip something together, and illustrate the root cause of the problem. So, and not to belabor the point, but with the old approach, the more rigid relational approach where you have to know the question you want to ask beforehand, it sounds like that kind of analysis wouldn't be possible. Yeah. And so clearly that's an area where it's interesting, I'd love to talk more about it.
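The correlation Matt describes, linking service-layer symptoms with platform-layer machine data, can be expressed in one search across two indexes. A sketch under assumed names (a hypothetical `service_kpi` index of call-quality records and a `netops` index of syslog events; nothing here is CenturyLink's actual setup):

```
(index=service_kpi sourcetype=voip_cdr mos<3.5) OR (index=netops sourcetype=syslog severity<=3)
| bin _time span=5m
| stats count(eval(sourcetype=="voip_cdr")) AS service_errors,
        count(eval(sourcetype=="syslog")) AS platform_errors
        BY _time, host
| where service_errors > 0 AND platform_errors > 0
```

Hosts where both counts are nonzero in the same five-minute window are the candidates for "here's where your service is failing, and that right there is the underlying cause."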
At some point I'd like to hear how you see this kind of impacting some of those traditional vendors who take that approach, but I know Jeff wants to get to his questions as well. Yeah, it's just interesting too with telecom, and really the growth of data centers. We're constantly hearing about what's next in data centers, and obviously cloud is huge, public cloud and hybrid clouds and private clouds, and you guys are the stuff that's actually up in the cloud, the metal and electricity. So I want to talk a bit about how your business is growing and transforming because of this cloud transformation, and the additional challenges that that kind of rate of growth is throwing your way when you've got to fix things. Yep, oh absolutely. It's a matter of scrambling to keep up with mad growth in certain areas. Cloud is certainly one of them, and also services delivered via the cloud. And it's certainly a challenge tying together all of the pieces involved in the service delivery, because you now have, I think, a scenario where you often have not only bundled services and functionality but everything sort of coexisting in a virtual environment. It does offer, though, I think some tremendous advantages, in terms of not only the simplicity of provisioning but also the ability to establish a more consistent and controllable environment for the services. But it's a learning curve. Because for the consumer of the cloud, I just want to turn it on, turn it off, add my stuff, pull my stuff down, but from your point of view, supporting that and architecting it, it's real stuff that's got to be connected, it's got to work together, and you've got to actually execute on that vision. So is taking a cloud-based architecture approach, from a delivery point of view, actually easier from your side?
I think ultimately it's easier, yes, but there's a substantial sort of upfront investment in expertise and in establishing an architecture that's going to be very robust and support the rapid deployment. That's really the key. And following up on Jeff's question, how is this approach with new tools like Splunk so different from the old approach with the old tools? Well, I think the key is that you need tools that are inherently flexible and extensible, and you need tools that are within our control to a degree. I mean, the thing is, as a service provider, you need the ability to quickly go in with your knowledge of the particular subject matter you're working with and the problem you're trying to solve, and you need to be able to quickly iterate through solutions without depending on rigid technologies or undue vendor engagement. Right, you can't really iterate if you have to go back to a data model or build a new schema and it's going to take three months. That's not an iterative approach. You've got to be able to ask a question, not get quite the answer you were looking for, iterate, and ask a new question. That's something that some of these new approaches provide. So I want to get your take a little bit on kind of Splunk's model. We've talked to a lot of clients here, customers of Splunk, and a lot of them started kind of small with Splunk and then moved to different use cases. Can you kind of walk us through how that evolved in your organization?
Absolutely. So we started, as I said, with the initial tactical focus simply on parsing log data to get at operational issues with platforms and such, and then extended into sort of the service layer, where we have a lot more data related to the services we're delivering to our customers, with a focus really on operational support. But then we quickly discovered that that same underlying data set, and that same functionality provided by Splunk, can then be leveraged for executive dashboards and presentation, for trend analysis, and then true analytics, and in fact for business intelligence type functionality as well, because often the underlying data set is very much overlapping and often identical. So what we've done is really to begin our initial development with an operational focus, but then, as we've developed the operational tools, we've discovered more and more use cases that can be readily supported in other realms, in planning and engineering, even in marketing and product management and such. Interesting. So how does, or doesn't, Splunk's kind of licensing model allow you to expand in that way? Yeah, and I think the licensing model in my mind is really key, because it's simple. It's simple, it's clear cut, there's none of this back and forth around which feature sets you have licenses for, and no complex higher-order math around any of that, and that simplicity is extraordinarily important in my mind. The other really key development, and something I've been very focused on at .conf, is that now with 6.2 we have the availability of even simpler pivot-type models, and I think tools that will really allow those BI-type users to quickly become comfortable and productive with Splunk. So when you say BI, you mean more kind of on the reporting side, versus more the interactive, real-time operational analytics?
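The reuse Matt describes, the same machine data feeding both an operations view and an executive trend dashboard, often amounts to nothing more than a different aggregation over the same index. A sketch, again with hypothetical index and field names rather than CenturyLink's actual data:

```
index=service_kpi sourcetype=voip_cdr
| timechart span=1d avg(mos) AS avg_call_quality BY region
```

The operational team might run the same data through a five-minute alerting search, while this daily rollup becomes the trend-analysis panel for planning or product management.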
And also, well, some of it is real time, I think, because you have the impact of various marketing approaches and an understanding of what's happening with your various products, but yeah, it's really, I think, the analytics in the business realm as opposed to so much in the operational realm. Interesting, because in Godfrey's keynote yesterday he kind of put up that slide that showed the old EDW BI model and kind of the new Splunk approach, and in your case it sounds like Splunk is disrupting not just the underlying relational database but even up to that layer, the EDW and the BI space as well. Oh, I think so. It's very interesting. Yeah, it is interesting, because at the end of the day it's about getting the right information to the right people at the right time so they can take action and do something with it. And I'm just curious, for the people that have more of a traditional BI approach, knowledge, experience, expertise, what's the reception to this style of BI, I guess? Well, I think the initial reception has been quite positive, in that there's kind of shock and awe when people see what you can produce and how quickly you can produce it. It's really very, very appealing. I think the next step, though, is to get to a point where those same users are comfortable producing those views for themselves. You know, it's one thing to impress them with how rapidly you can develop views of the data. The next step is to impress them with how quickly they can produce those views of the data themselves, and I think these new tools are going to enable that. Right, so moving to that self-service. Right, exactly. So expand on that a little bit. Does Splunk make it possible for a really non-technical person to do that, or is that something you think they still need to invest in?
Yeah, and I think with the latest releases, 6.1.2 and now 6.2, a lot of what's being introduced is the ability to work with data models and with pivots and such, which are intuitively familiar to data analysts, people who have grown up using Excel and pivot tables and VLOOKUP and all those nifty tools. Well, yeah, that's interesting. I mean, we've been hearing about kind of that move to the self-service intelligence model for 10, 15 years, and still the traditional BI space has kind of stalled out. The metric you usually hear is that 20% of any given organization is actually using these BI tools; they've never really been able to crack that number. So what's it going to take to do that? Is it simply bringing in tools that kind of mimic Excel and the tools that people know and understand, or what's it going to take to break through that number? Oh, I think it's the intuitive interface and the intuitive tools, and I think that pivots are particularly effective in that area. So what else are you looking for from Splunk? As we look forward, and we're here next year talking to you, what are some of the things you're hoping to hear announced from Splunk, or areas where you hope they invest over the next 12 months? Well, I think a lot of what I would be looking for personally is sort of consolidation of some of the functionality that's already been introduced. Things like the ability to connect to external databases via DB Connect are, in my mind, extraordinarily important, as is Hunk, because Splunk has always been very good, I think, at coexisting with other tools and other solutions, and those sorts of capabilities really add tremendously to the functionality.
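The data models and pivots Matt mentions let an analyst build the same kind of report without writing raw search language. A hedged sketch of the pivot command form, assuming a hypothetical data model named Network_Ops with a Syslog_Events object (the names are illustrative, not from any real deployment):

```
| pivot Network_Ops Syslog_Events count(Syslog_Events) AS events
    SPLITROW host AS Host
    SPLITCOL severity
```

This is the SPL a pivot-UI session generates behind the scenes; the Excel-style user assembles the same rows-and-columns view interactively, which is the comfort-and-productivity point he's making.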
And so I think, from my perspective, the real key is continuing to sort of extend and harden those sorts of capabilities and integrate them more fully, because it's extraordinarily powerful when you can take something like Splunk and have it leverage reference data, topology data, and all these external data stores, and also coexist with Hadoop and HDFS, and then potentially do so in a cloud environment. That becomes an extraordinarily powerful environment. You're looking to expand to different use cases, different underlying storage, and other architectural approaches, kind of unifying those in a way that you can't do with more point solutions. Exactly. All right, well, we're out of time unfortunately, Matt. Thanks so much for coming on theCUBE. Really appreciate it, great insights. Thanks for watching. We'll be right back with the next segment. Please stick around. Excellent.