Live from Seattle, Washington, it's theCUBE at Tableau Conference 2014. Brought to you by headline sponsor, Tableau. Here are your hosts, John Furrier and Jeff Kelly.

Okay, welcome back everyone. We're here live in Seattle, Washington for Tableau's Data 14. This is theCUBE, our flagship program, where we go out to the events to extract the signal from the noise. I'm John Furrier, the founder of SiliconANGLE. I'm here with my co-host, Jeff Kelly, big data Wikibon analyst, and our next guest, George Mathew, president and chief operating officer of Alteryx. Welcome back to theCUBE. Seems like we've been interviewing with you for many, many years. But welcome back, and excited to be here at Tableau, because I remember an interview, it was either last year or the year before, where we talked about your relationship with visualization when it really was kicking off, going mainstream, crossing the chasm. Now it's the killer app. What's the update? Give us a quick update. What's going on here?

Yeah, so first of all, thank you again, John, for having me on theCUBE. It's been an amazing run that we've seen in this market, particularly around data discovery and how the various forms of data discovery are really delivered to users. Tableau has certainly dominated the market, largely becoming the de facto solution for visual analysis and dashboarding in a rapid-fire, data-discovery-oriented way. And what we saw in this space was the ability to then make the data prep, the data blending, the visual workflow of moving information into an analytical model, into a Tableau data extract, directly into a workbook, one seamless continuum that a user should be able to do. And really that's what Alteryx represents. We have had an amazing run in the last two years as a partner with Tableau. Over 180 joint customers together. We have seen a lot of the data analysts that love to use Tableau for that rapid-fire visual analysis and dashboarding work
use Alteryx for the prep, the blending, the modeling work that comes natural to how they deliver experiences to their end users.

You know, one of the most exciting things about my job, being an entrepreneur and doing some side startups like CrowdChat, is also sitting down and talking to tech athletes like yourself, where people do pontificate about the future. I ask predictive questions like, what's going to happen next year? You called it right. I remember when we were talking about data visualization, you kind of hit this whole prep thing. And earlier I was commenting with another guest around this whole storytelling conversation, and I kind of rolled my eyes. It's almost like too much marketing. But you can't tell the story until you know what the story is, and you've got to get the data for the story you're trying to figure out. So this data prep is probably the killer thing at the show, what I'm seeing as the real, I would say, top-notch story here, which is that data prep is top of mind for all customers. So I've got to ask you, one, you were right there, so you've got to get the credit, you know, for being a visionary and also for executing on it. But what is the critical issue for the customers that you talk to, what's the pattern? What's the common thread around data prep? Is it complexity? Is it the data acquisition? Is it the diversity of the data? Is it the personnel? What do you see as the critical data prep thing?

Yeah, so initially we saw a challenge in just being able to get well-formed, well-organized, high-quality data in the hands of analytical decision makers. And so purely being able to do the cleansing, the prep was very important. But recently what I'm noticing is that it's not just about how to get that prep right, but about what you actually want to do with the data: how you can take a petabyte-scale information source like Hadoop and be able to blend it with Excel spreadsheets. And that's a very natural situation.
It's not an uncommon thing for most organizations to have a marketing function with data that resides across three, four different boundaries and needs to come together in a more seamless way to create an analytical model. So I think the shift in the market that's occurred is, first and foremost, getting that self-reliance of prep and cleansing into the hands of users, which has been occurring largely driven by the leadership of Alteryx for the last two and a half, three years. But now we're seeing what we think of as blending across variant sources becoming the de facto reason why people like the HPs of the world, the folks that are using products like Alteryx today, are starting to get that end user enablement largely into the hands of users.

I'm smiling because I'm looking at your press release section that says data blending without limits, so I want to get to that in a second. But also I'm smiling because I wrote a post back in 2008, actually, about Twitter data. But it was about dirty data versus clean data. And what I'm seeing in this model of recursive, iterative, fast decision making is that data prep is not a static environment. The old way of doing data cleaning was you'd send it out to the hinterlands of the data cleaners, right? And then they'd do stuff and it would come back, with a lag effect, a time lag. Now the data cleaning is all about, one, getting good data, not dirty data; cleaning the dirty data; and deciding whether you want to recycle that back into the system. It's kind of like water, right? You want the clean water in the system. So can you talk about that dynamic? And is it getting ahead of ourselves to have this conversation with customers, or is this a real conversation?

It's a real conversation. And if you look at it, you used the analogy of water, right? We think about it as flow, right?
How data is iterative, how it flows, how it moves across various boundaries in a seamless way so that a customer asks not only the first question, but the 15th question and the 500th question, and you have the ability to repeatedly create a data infrastructure that matters, not for the centralization of that effect, but for the decentralization and the end user enablement around data. And Tableau has been great in the sense that it's driven this through visual analysis and their rapid-fire experience, and it becomes a natural activity for a product like Alteryx to function with a Tableau user, because every time that repeatability needs to occur in the flow, like how water flows, Alteryx becomes a very natural experience for those users.

I want to get your comment on this, maybe correct me if I'm wrong, and also slap me around a bit if I'm really off base. I see you guys as a real engine underneath Tableau, because Tableau is sitting on top of other systems that are already built, and some may get destroyed. Like if some say, oh, SAP's changing, it doesn't matter, Tableau's going to sit on top of things, right? You guys are an engine underneath that's going to create the user experience abstraction layer, if you will. In other words, people use Google search because it just works. They can do a zillion searches and get what they're looking for. Tableau's got that same effect, where the user experience is becoming very GUI-based, very Google-like.

That's right.

And so underneath you need technology. Are you guys that engine underneath at some level? And if so, how would you put that? Are you a carburetor? Are you the oil? I mean, how would you describe your role as an engine under Tableau?

Yeah, so when you look at how Alteryx gets used today, we are the cylinders and the pistons in that engine. We make it very seamless for repeatable injection of data as the gasoline in that analogy. Data's the new oil, the new gas.
It comes into that engine, and you have a repeatable way of creating power from it. And so you have that repeatability in the hands of users, where Tableau becomes, to extend the car analogy, the dashboard, the interactive experience on the front end for the driver. You have to have both to have a well-functioning car. And we see this becoming more and more relevant, particularly as the types of data coming into an organization are very, very different than they were even five years ago, or even two years ago. You're dealing with more social streams of information. You're dealing with information that's coming from cloud sources. You're dealing with a litany of Excel content. You're dealing with the rise of Hadoop and NoSQL in the data management space. And all of those things have a place where the blending needs to naturally occur before the dashboard can be created, or the visual analysis can be delivered rapid fire in Tableau. We see ourselves being the natural substrate to make that happen.

So before I go to Jeff, I'm going to ask a question. So data blending, if you will, and I guess it was the long country road to get to what I was trying to get out of you here, the blending is essentially the mixture, if you will. That's the combustion of the oil, gas, whatever, the combination that suits the purpose of the user. Is that what you're trying to get at?

That's right. And sometimes you can actually do some things with the blended source that become more advanced in nature, right? You can run a predictive model. You can run a statistical model. We actually did some analysis more recently on our user base, and we see that 42% of our customers who start with the data blending experience in our product are, within a few short weeks, going on to spatial problem solving, predictive statistical algorithm problem solving, being able to do the combination of the two together, and then outputting to places like Tableau.
And so it is both an engine as well as an accelerant of how the analytics occur inside of organizations. The end result is going to be impressive, because it's going to drive organizations to make more reliable decisions faster than was ever possible in the last two decades.

I mean, I can talk about this for hours, but Jeff, I want to get Jeff in here, because I could go on this all day.

I've got a couple questions. I just want to clarify one thing. So when you talk about repeatability, are you talking about essentially this whole question of going from a POC, where you're experimenting, you find an insight, to actually productionizing this? So it's a continuous dashboard, a continuous visualization environment that an analyst can come back to again and again. Is that what we're talking about when we say repeatability?

Absolutely. Because you can take things that are understood as items that can be operationalized in the dashboard and put them into a process that's scheduled. You can put them into a process that is parameterized. You can put them into an application that can be consumed in a cloud infrastructure, or directly behind the four walls on a server. And that operationalizing work needs to happen in just as seamless a way as the rapid-fire analysis work would occur in Tableau. We've become that engine for a lot of the users that Tableau largely has been driving in the marketplace.

So I want to dig into your approach. We talked about the data blending, but you also mentioned being able to apply predictive models. So kind of walk us through a typical user experience. You mentioned they start off with the data blending and then move on to the predictive capabilities.

That's right.

How does that typically work? Is it a very natural process? Or is it something where you have to go in and actually help your customers understand what some of the possibilities are?
Because you're talking about a more advanced type of application of analytics.

We are. And one of the things that we decided up front, despite the sophistication of the things that you would want to do in a product like Alteryx, or frankly, as you can see, in a product like Tableau as it's evolving in this market, is that we built the Alteryx experience so that you wouldn't have to be a programmer but would still be able to do very sophisticated things. So a lot of our users are using the product to blend data, prep it, and output it to places like Tableau, and they're never going to touch all the advanced capabilities that we provide, right? Almost half or more of those users are just using it for prep purposes, blending purposes, and output to places like Tableau. The other half, I would say over a period of time, usually weeks and months, are now wondering, well, what if I put in a regression model to understand what the predictability of this operating model looks like inside of my organization? What if I wanted to introduce a decision tree? What if I wanted to do a stronger set of time series analysis? Would I want to do a clustering model, a forecasting model, a machine learning model? And we look at that as a very natural next step to the data prep and blending. And so for the folks who have the wherewithal to do that, without necessarily having to be programmers, again, it's a drag-and-drop visual interface for building analytical models; you can introduce the predictive statistical capability, the spatial capability, just as naturally as you were doing prep. And it's one seamless user experience for both the analytic design time as well as the analytic runtime. So I don't see it as a separate set of users. I see it as a continuum where certain users, based on their role and responsibility, are going to go deeper, and others are going to just do what we do well, the prep and blending stuff, and go straight to Tableau for their visual analytic work.
So another issue that I've found with kind of the BI space generally is, as a horizontal platform, BI has been somewhat successful in providing insights looking back on what organizations have done, not so great on the predictive, looking forward. And the other thing is, a horizontal platform is not focused on vertical industries or vertical use cases. And I know you take a little bit more of a vertical approach, or you have that on offer for customers if they want to take that approach. Talk about why you think that's important, the vertical focus.

In your question there are two statements, and I want to unpack both of them. One is that BI historically has been looking at things from a view of what's behind you in the rearview mirror, and how do you report against it? And we generally agree that's been the case for most BI solutions. I was pretty involved with one in my past life before coming to lead the Alteryx team. And we saw that reporting and ad hoc analysis was on historical data. We think of ourselves at Alteryx as a product that enables the next best decision, to understand the predictive element that helps you decide who your next 20 best customers are, and to make that very seamless to repeatedly automate. I think the model has played itself out for us and Tableau because we are largely horizontal solutions that get applied in very interesting vertical IP. And the vertical IP gets packaged via a great dashboard being created in Tableau that represents what a person inside oil and gas would want to do, or what a person inside of telecommunications would want to do. In Alteryx, it becomes a churn analytics application in telecommunications that someone has created, because it's the representation of the best packaging of analytic IP in that realm. And that is a more natural scale model for both our companies, right?
We see our users creating some of the best models in Alteryx and the best viz, the best visual analytics, in Tableau, and that becomes what drives an industry forward.

That's an interesting approach, because I always thought the challenge for the BI companies is how do you build a horizontal application but also somehow have the depth to go into specific verticals, and it's very difficult to do that. So you're saying you're leveraging your users.

Users and partners.

Users and partners. Users experiment with the data, come up with some best practices, and now you can share that with other customers. How do you help them actually do that? Practically speaking, how do you go out to your customers and try to take the best from their experience and share that with the other customers?

Well, one of the things that we've been doing, and it's actually very similar to what Tableau has been doing, is enabling our public cloud infrastructure, respectively gallery.alteryx.com for the Alteryx users and Tableau Public for Tableau, as a sharing experience for that kind of content. So the best industry content, for instance, the best horizontal content of apps that have been built with Alteryx, is on the Gallery. We literally have hundreds of thousands of users signed up, tens of thousands of users actively using the analytics infrastructure within gallery.alteryx.com on a day-in, day-out basis today. Hundreds of public applications, thousands of private applications, all hosted on that public cloud infrastructure. In a similar way, you know the numbers that Tableau Public has generated in the last two to three years. Largely, this becomes the area where collaboration, sharing, and best practices emerge for two communities, and oftentimes those communities intersect each other, with the packaging of Alteryx content and apps that gets shared and published with a dashboard on Tableau Public.
Is that collaboration that you're enabling, both Alteryx and Tableau, one of the key distinguishing factors, in your opinion, from kind of the old school model of enterprise software?

Yeah, I think there's a few things that are going on. One is exactly that collaboration and that user experience, but the other is the self-reliance factor. When you look at most of the BI space historically, it's been a very authored BI analytics situation, where someone curates data, packages it into a data warehouse, perhaps creates a semantic layer on top of it, and then produces a dashboard or an ad hoc report that gets shared to hundreds, thousands of users. And the difference in the Alteryx community, the Tableau community, is that they are more self-reliant. They are about how data can come in to their desktop, how it can be blended very quickly, how the analytical models can be created, how it can be pushed out to a dashboard, and they don't depend on any other individual but themselves to get that work done. And our products really push the envelope of self-reliance in a way that the last generation of BI analytics tools frankly didn't do. And that's where this market is really taking itself by storm.

George, I want to get your take on, as I mentioned earlier, you guys made the right call. Good vision, certainly your background, Omniture and SAP, you kind of know what's going on in the nuances, and it's nice to see that you can see the trends at the end user level. But I want you to share with the audience who might not have experience with Tableau: what's this company all about? And what's their strategy? You're very intimate, obviously; you've had great success working with these guys as well as other folks. What's the story here? What's the big story at this event this year that people need to pay attention to, that's relevant not only about Tableau, but about the industry itself?
Yeah, so I think Jeff actually asked a question which is highly related to what the trend line is. Last year, this conference had 3,200-ish customers and prospects visiting the event. This year, there's 5,500; you add the partners and you add the Tableau team members, and there's probably somewhere in the range of 7,000. Literally, this market is doubling in participation and usage. And the question is, why is that occurring? And it's exactly the point that I made earlier. The power is going back to the people. And Tableau's slogan is awesome in that sense, because it's data for the people. And when you look at that trend that's occurred in the market, the self-reliance is not just going to be toward visual analysis and dashboarding, where Tableau is the dominant leader in this space; we're now seeing that data and analytics for the people come down to how they prep data, how they blend data, how they build repeatable models, how they build analytical workflows in products like Alteryx. And I have mentioned this to you before, John: this is like a decade of generational change that's underway, where the data and the analytics are basically coming back to the people. And the reason it's occurring is that the tools that have served those people to date are good for Excel-like analysis, right? And Excel is still the dominant solution in this space for that work. But if you had to graduate from that level to the next degree of more sophisticated analytics, you'd be writing SAS code, right? And so there's nothing in between that enables users.

Yeah, so this brings up, we love to just pontificate on things, but also analyze and guess and kind of connect the dots. So let's try to connect the dots. Gen one computing, I'll call it mainframes and minis; then the PC revolution; the web; and now the revolution we're in now. You had Microsoft dominating. Microsoft dominated: they had the desktop, they had the operating system and the applications.
Now you have the born-in-the-cloud phenomenon, okay? SaaS, but the real cloud, the DevOps cloud, a whole other user experience is exploding. So that begs the question: Tableau is not necessarily just a visualization company. They could be the next Microsoft, an OS that's born on the cloud. I mean, that's not a far-fetched thesis.

So I had an opportunity, before I was at SAP, to be part of the early team at salesforce.com. And about 10 years ago, Salesforce defined what the first generation of an enterprise-scale cloud application could be. And that was, I think, one of the last great enterprise-focused software companies that built a dominant business model in their space, being CRM. I think the analytics space has a fundamental reinvention underway, and it's being led by a company like Tableau being the centerpiece of how that's occurring. And you saw it today in the product announcements, right? You saw it in how expansive the cloud capabilities in the product were. You looked at what Dave Story presented today as Project Elastic. I asked him directly, by the way, was that vapor or was that real code? He said that was live code on the computer, on the iPad.

Yeah, so what you're witnessing is...

It wasn't the demo version. It was real code.

What you're witnessing is what Salesforce did in the CRM space. Companies like Alteryx and Tableau are reinventing the BI analytics market. And that is not going to happen in one fell swoop. It's going to happen, we've talked about the baseball analogy before. We're in the third, perhaps fourth inning of this ball game. I think we're in the fourth inning now. But there's a lot more innings of this...

Well, Tableau thinks they're in the bottom of the first, because they only have 20,000 customers.

Well, yeah, when you look at the user seats, and then you look at the users, it is very, very early on. When you look at the user experience, I think these solutions are getting more and more capable.
I think we're in the fourth inning of what I call the born-in-the-cloud revolution. And that is a complete reconstruction, a transformation, of how software gets consumed and executed. And I think this puts a direct line in the sand against Microsoft's old monolithic, bloated software model.

One of the things that I talked about at our user conference just a few months back is to compare when companies started the first version of their software. And so just for fun, I did that: SPSS, 1969. SAS, 1972. You look at the introduction of open source R in 2007. You look at the introduction of Tableau, 2004. You look at the introduction of Alteryx, 2006. You look at the introduction of Cloudera, 2009. Every one of the companies that are more recent in this experience is built off of not only a native cloud fabric...

Workday.

Workday's in there, they're cloud. But there's another related point. They're built on in-memory. Everyone is taking advantage of the fact that memory is more readily available to your compute needs than, say, for instance, disk and storage was. And so that's a big shift.

George, always great to have you in theCUBE. One, you're a leading executive, you're a tech athlete. You're a CUBE alumni, but you've got great insight, not just about your company, but about the industry. Really appreciate you taking the time. George Mathew, president and chief operating officer of Alteryx. We'll be right back with our next guest after this. This is theCUBE, live in Seattle. I'm John Furrier with Jeff Kelly. We'll be right back. Thank you guys.