We're at Hadoop Summit 2012, not Hadoop World. I'm John Furrier with SiliconANGLE.com and I'm joined by my co-host Jeff Kelly. Jeff, we're back. That was a good segment with Geoffrey Moore.

That was a great segment. Just a wealth of knowledge, some really good advice, I think, for both the vendor community and also the practitioners in the enterprise just looking to get into the data.

Yeah, cool. So Mitch is here with Syncsort. So Mitch, tell us what's going on with you guys. Tell us what's happening.

Hey, great. First of all, it's great to be around theCUBE and at Hadoop Summit this morning. Syncsort, I don't know how much you know about us, but we're a software company that's been around for a remarkably long time, more than 40 years now, which is a very interesting phenomenon. Not many software companies can make that claim. And we're really here to focus on the value that we bring to the Hadoop ecosystem. Our technology is really around ETL, so we're a strong player in the data integration space. We leveraged our mainframe sort heritage to create ETL technology about 10 years ago, and over the last couple of years we've taken that investment and leveraged it into products that help accelerate the adoption of Hadoop. So we're really excited to be here and be a part of the ecosystem at the show.

Geoffrey Moore, obviously, is an old-time industry legend in Silicon Valley, author of Crossing the Chasm, talking about adapt or die to the incumbents. He mentioned Teradata as an example of that kind of company, saying they can sit back and say structured data, structured data, and not go with the new way, or they can show that commitment. So adapt or die is essentially the philosophy. You guys have been around the block, so you've been through many cycles, right? Talk about this cycle. What's exciting about it? How does this fit into your business model?
Sure. Well, I think what's really exciting about this cycle is that it's a real inflection point in the way people are approaching information processing. They've always been hindered to some extent by the amount of data they can actually process, and our business has always been about removing those barriers. The original value proposition for our company, even 40 years ago, was about returning time, which is the single most valuable thing an organization has, by increasing the performance, efficiency, and productivity of processing information in their environment. And so the same tried and true technologies can actually be adapted to accelerate adoption and bring more data than ever to bear on real business problems. At the end of the day, John, that's really what it's all about. It's not about the means to the end; the end result is improving the insight people get that is actionable to drive their business results. And we think we contribute strongly to that.

Just continuing on that theme, what are some best practices or tips you've learned as you've adapted to this new industry? What role can some of the more incumbent vendors that have been around for a while play? In talking with your own customers who maybe aren't into big data yet, do you find that incumbent vendors like yourselves are going to be an important gateway for big data to make it into the enterprise?

Well, we think we're certainly well positioned with our customers, who tend to be large enterprises already doing complex data integration tasks. So we're in a perfect role to help and advise them as they begin to experiment with Hadoop, and we're very much seeing a lot of organizations doing that. They're putting small clusters in; they're beginning to learn how to do this.
And one of the things we think is a real challenge for them is not just building the infrastructure, but actually having the skills available in the organization to use it effectively to do real work. So one piece of advice I have for them would be to leverage existing skill sets in the organization as best they can by working with vendors, Syncsort and others, that are helping to grow that maturity within the ecosystem and make it easier for them to use Hadoop.

Let's dig into that a little bit, the skills gap. It's well known, it's documented in the McKinsey report, the need for data scientists, but also for the more infrastructure-level professional practitioners. So let's dig into how you actually do that. If you're an enterprise, how do you leverage that existing IT staff in this new world? Is it a combination of making the tools easier to use from a vendor perspective, but also training and education opportunities for the DBAs and others to up their skills for the big data world? How do you actually make that a reality?

Well, I think that's a great question, Jeff. Reality, of course, is very difficult to achieve relative to the vision, but one of the things you want to look for is the predominant skill sets being used in the organization today, where there are a lot of folks who know how to deal with large data, and I'll use large as the word to describe the era before, quote, big data.
Folks have developed strong skills around SQL programming, but even those skills, while well understood, are not as efficient as they need to be to deal with data at this scale and the number of things people want to do with big data. So what we really look at is providing an environment in which people can become productive programming at the higher level of the tasks they need to achieve, without the lower-level technical detail behind it. Using tools such as our DMExpress ETL tool, they can actually become productive writing jobs in as little as two days. We think this is a huge quantum leap in being able to say, look, I don't need to worry about how to learn to program MapReduce and understand the inner workings of the Hadoop environment; I should focus on understanding my data, and that's where the real value is.

I wrote a tweet about trust, because the BBC wrote a story about Linus Torvalds getting an award, and the big thing about open source is trust. But at the same time, we also cover the big guys, like IBM, EMC, and HP, with massive customers. You guys have an install base, and you've done a deal with Hortonworks that you've announced here, so you're committing to the technology partnership with Hortonworks. But let's talk about the trust you have with customers, because as Geoffrey Moore was saying, if you're not investing in the future, then you're not committed, and the customers are looking at that. Talk about that dynamic, because you have to deliver solutions to your customers, and they expect you to have a Hadoop story, and you have to deliver on it. You might have to run fast at first, but you do have to get that trust solved. Talk about the trust relationship and why Hadoop's important for your customers.

I think Hadoop is certainly important to our customers, because they're really looking over the edge at how they can deliver more to their business.
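The point about skipping hand-written MapReduce is easier to see with a concrete sketch. Below is a minimal, hypothetical word count in Python written in the map/reduce style, the canonical Hadoop example. A real Hadoop Streaming job would read lines from stdin and emit tab-separated key/value pairs, with Hadoop performing the shuffle and sort between the two phases, but the shape of the logic a developer would otherwise write by hand is the same.

```python
from itertools import groupby
from operator import itemgetter

def mapper(lines):
    """Map phase: emit a (word, 1) pair for every word seen."""
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reducer(pairs):
    """Reduce phase: after a sort (Hadoop's shuffle), sum counts per word."""
    for word, group in groupby(sorted(pairs), key=itemgetter(0)):
        yield word, sum(count for _, count in group)

if __name__ == "__main__":
    sample = ["big data big value", "data at scale"]
    counts = dict(reducer(mapper(sample)))
    print(counts)  # → {'at': 1, 'big': 2, 'data': 2, 'scale': 1, 'value': 1}
```

Even this toy version shows why a higher-level ETL tool is attractive: the map, sort, and reduce plumbing is boilerplate that has nothing to do with understanding the data itself.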
The business is demanding the ability to harness that big data to create business insight, and that translates into revenue-bearing opportunities or greater efficiencies for the organization. So it's clear that IT organizations have a mandate to explore this technology as they move there. We've already established a trusted relationship because we're used in thousands of deployments across almost 70 countries worldwide. Customers have been relying on our technology across our mainframe customer base for more than 40 years, and in our data integration space for more than a decade. So we are a recognized vendor in that area today, and we're trusted with their largest, most data-intense challenges already. That makes us a natural partner for them to look to to lead the charge into Hadoop, and we're leveraging all of that experience, expertise, and trust to create the next wave of technology to make it easy for them.

When Hortonworks talks about enterprise ready, which Jeff and I have been riffing on these two days, if I'm Hortonworks and I say, Mitch, tell me what enterprise ready means. Remember, they're just a start-up, too. They're moving as fast as they can, pedaling as fast as they can, as we say. But when they go to you and say, hey, you guys are in the business, what does enterprise ready mean?

Wow, that's a great question, because I think there are different perspectives, obviously, depending on your point of view.
So from the point of view of someone who serves customers whose business depends on our products day in and day out, enterprise ready to me means that it installs easily, that the time to value it delivers is quick to realize, that we can train people so you're not dependent on a very small group and can amplify the effect of the tool very quickly, and that it performs consistently well, is efficient in its delivery, and provides the productivity the organization expects. So it really does come down to that trust factor and the proven reliability of using our technology day in and day out for years.

I'd love to get your perspective on the ecosystem. There are a lot of companies here participating in the summit that play different roles in the Hadoop ecosystem: players like yourself, plus a lot of upstarts in the Hadoop community focused on one particular aspect of Hadoop. So what's your perspective? What's the atmosphere like here at the show, and in general in the ecosystem? Are all these different types of companies playing nicely together? What's your experience been?

Well, our experience has been pretty interesting. In some respects, we're one of those upstarts, so we're happy to be a member of the community. What I think is interesting to see is that this show has grown tenfold in attendance over the last four years, and I understand there are more than 2,200 people here today. That's an important proof point that says we're getting close to the tipping point where interest peaks and we can expect adoption to accelerate. I think the ecosystem is getting larger as people see the undeniable wave of big data that's going to permeate businesses and enterprises of all sizes, and they're preparing to add value to that.
I think everybody's looking for where they can help mature the Hadoop environment and the ecosystem, to make it enterprise ready so that large enterprises can trust it well enough to go from experimentation into actual production, which is where they're going to derive the value. So to me, it's an exciting time. It's a perfect time to be here in the Hadoop community and to contribute to making it more mature.

I'm sure a lot of our audience, people and companies, are considering getting into big data. They know it's something they need to embrace at some point. What are some best practices you've learned from your customers as they've started to embrace big data and bring it into existing IT infrastructures? And from a cultural perspective, how do you get people to understand or believe that big data analytics is really a good way to run a business? What are some best practices you've seen from your customers, or even some mistakes to avoid?

I think it's too early to say definitively what the best practices are. We're still learning them, to be perfectly honest. We spend a lot of time in our own organization experimenting to understand how to scale the technology, and we're doing very well in that regard. What we're seeing is that people are approaching this first in experimentation mode. It's certainly not something I would recommend people immediately rebuild their entire data integration or data warehousing strategy around. You want to look for what you believe are the high-value problems, focus on ways to solve those, and test them in a safe way. We think we can certainly help in a number of ways to begin with: making it very easy to load data into the Hadoop framework and process it, and accelerating performance within Hadoop, which limits your investment in the infrastructure by making what you have much more efficient.
And then ultimately I think the big one is to get over the skills gap quickly, so that people are focused on what they want to get out of it and what the value of that information is, rather than on how to actually do it. So to me that's the best advice: look for things that help you get over that learning curve very, very quickly, so you can actually derive value from it and take hold in the enterprise.

Mitch, my final question, because we've got to get the hook here for our next guest: given all your experience in the industry and who you are with your company, what's your advice to a lot of the younger entrepreneurs and younger executives as they get into this big data space? We heard from Geoffrey Moore that this is a classic crossing-the-chasm scenario. You've got to go out and build a business, get some customers, and support those customers. It's not like the big lucky-strike tornado app that goes on the iPad and gets a zillion downloads. It's a lot of blocking and tackling, but a lot of value to be created. What's your advice to those folks?

Well, I think there's always a certain excitement around new technologies, and let's be honest, big data as a term has gathered momentum. It's like the snowball turning into an avalanche as it rolls down the mountain. My advice is to not get too wrapped up in the hype and to stay focused on the fact that the technology, any technology, not just Hadoop, is really a means to an end. Stay focused on the value you're delivering to the organization in terms of new opportunity, and then you'll keep it in the proper perspective. So take the pragmatic approach: crawl before you walk and walk before you run, so you don't overextend the capabilities while this is maturing.
So go cautiously but optimistically into the future, because I think there's a tremendous amount of value available in big data. And the last piece of advice is to stay focused on the bottom line, which is what you can do to increase your organization's ability to improve the time to gaining the insight that really drives the business.

Mitch Siegel, Vice President of Marketing at Syncsort. Thanks for coming on theCUBE. We appreciate it. Great news with Hortonworks, and great stuff in the direction you guys are innovating. We'll be right back inside theCUBE after a short break.