From around the globe, it's theCUBE, with digital coverage of Data Automated, an event series brought to you by Io-Tahoe.

Okay, we're back with Adam Worthington, who's the CTO and co-founder of Ethos. Adam, good to see you. How are things across the pond?

Good, thank you. I'm sure our weather's a little worse than yours on the other side, but good.

Hey, so let's set it up. Tell us about yourself, what your role as CTO entails, and give us the lowdown on Ethos.

Sure. So yeah, Adam Worthington, as you said, CTO and co-founder of Ethos. We're a pretty young company; we're in our sixth year. We specialize in emerging, disruptive technology within the infrastructure, data center and cloud space. My role is the technical lead, so it's my job to be an expert in all of the technologies that we work with, which can be a challenge if you have a huge portfolio; that's one of the reasons we've deliberately kept ours focused. I'm also key to the technical validation and evaluation of the new technologies and new vendors that we take on.

So you guys are really technology experts, data experts, and probably also experts in process and delivering customer outcomes, right?

That's a great word there, Dave: outcomes. That's a lot of what I like to speak to customers about. Sometimes that gets lost, particularly in highly technical fields; a virtualization guy or a network guy can very quickly start talking about the nuts and bolts of the technology. And I'm a techie, I'm absolutely a nerd, like the best techies are. But fundamentally we're putting in technology to meet business outcomes, to solve business problems, and to enable a better way of doing things. That's what we try to do.

Love it. We love tech too, but really it's all about the customer. So let's talk about smart data. When you throw out terms like this it can feel buzzwordy, but let's get into the meat of it.
What does that mean to you? What are the critical aspects of so-called smart data?

Cool. Well, it'll probably help to step back a little and set the scene in terms of where I came from and the types of problems I saw out in the field. I'm really an infrastructure and solutions architect by trade, and relatively organically over time, my personal framework and approach came to focus on three core design principles: simplicity, flexibility and efficiency, whatever it was I was designing. Obviously those mean different things depending on the technology area we're working with, but it's been a pretty good touchstone. What we realized when we started Ethos was that those principles could be used more broadly: the absolute best of the new breed of technologies, the ones that really disrupt and significantly improve upon the status quo, do so in one or more of those three areas, ideally all of them, by being more simple, more flexible and more efficient.

And if we look at the challenges that enterprises, organizations of a particular size, have around data, maybe it's good to reflect on the opposite end of the story: why data is often quite dumb. With traditional approaches, we have limited visibility into the data that we're actually storing and using within our infrastructure. What we've ended up with over time, through no fault of the organizations that run infrastructure like this, is silos everywhere: silos of expertise, born out of specialized teams for virtualization, for networking, for database administration, for example, and silos of infrastructure, which create data fragmentation.
So copies of data in different areas of the infrastructure, and replication of that data set, or replication in terms of application environments. That's what we tend to focus on, and it's resonating with more and more organizations. There's a survey from one of the vendors we've worked with, actually our launch vendor five and a half years ago, a vendor called Cohesity, who are on the panel later. They partnered with a research firm called Vanson Bourne on a global market survey: 900 respondents, all different sectors, all different countries, so US, UK, Germany and a bunch of others. And what they found was pretty shocking. It was a Cohesity survey, so it was focused on secondary data, but the lessons learned from it apply right across the gamut of infrastructure and data organization.

Just some stats to pull out from my notes: 85% of the organizations surveyed store their data in between two and five public clouds. 63% of organizations have between four and fifteen copies of exactly the same data. Nearly nine out of ten respondents believe their organization's secondary data is fragmented across the silos we touched on and is, or will become, nearly impossible to manage over the long term. And 91% of organizations' leadership were concerned about the level of visibility their teams have into their infrastructure.

So those are the areas that a smart approach to data will directly address: reducing silos, which comes from simplifying, moving away from complexity of infrastructure; reducing the number of copies of data that we have across the infrastructure; and reducing the number of application environments that we need for different requirements. The smarter we get with data, in my eyes anyway, the further we move away from those negatives.
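The four-to-fifteen-copies problem Adam cites is, at its core, a duplicate-detection problem. As a loose illustration of the idea only, not any vendor's actual implementation, here is a minimal sketch that fingerprints files by content hash to find redundant copies across storage locations; the directory "silos" are hypothetical stand-ins for real storage tiers.

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def fingerprint(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash a file's contents in chunks so large files don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

def find_duplicate_copies(roots):
    """Group files from several storage 'silos' by content hash.

    Returns {hash: [paths]} for hashes seen more than once, i.e. the
    redundant copies a consolidation effort would target.
    """
    by_hash = defaultdict(list)
    for root in roots:
        for p in Path(root).rglob("*"):
            if p.is_file():
                by_hash[fingerprint(p)].append(p)
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}
```

Real platforms do this at block granularity with far more sophistication, but the principle, identify identical content wherever it lives, is the same.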
Wow, there was a lot in that answer, so let me try to summarize it. You started with simplicity, flexibility, efficiency; of course that's what customers want. I was going to ask you what challenges customers are facing, and I think you've laid it out here. But I want to pick up on some of the data you talked about. The public cloud creep adds complexity and diversity in skill requirements. The copies-of-data point is so true: data is just like tribbles, if you're a Star Trek fan; it just expands and replicates. So that's an expense, and it adds complexity. Siloed data means you spend a lot of time trying to figure out who's got the right data and what's the real truth, so there are a lot of manual processes involved. And then visibility is obviously critical. So those are the problems, and you've talked about how you address them. But how does it work? What's involved in injecting smarts into your data life cycle?

Well, as I say, there are very good reasons why customers are in the situation that they're in: the limits of traditional approaches to infrastructure. Look at something as fundamental as storage, for example, and the applications that utilize data, or something as fundamental as backup and archive. What that's typically required is completely separate infrastructure from everything else. But when we're talking about the data itself, what would be perfect is if we could back up the data and also use it for other things. And that's where a technology provider like Cohesity can come in, for example. Their technology is incredibly simple, but it's also incredibly powerful, and it allows simplification and consolidation of data.
And then look at getting insights out of that data. Fundamentally, traditional infrastructure is put in for a point purpose, a point requirement. It wasn't really incumbent on it to expose any information about the data stored within it, which makes it really tricky to do anything outside of that point application environment. And that's where something like Io-Tahoe can come in, abstracting away the complexity and more directly delivering insight.

So these are the kinds of areas. Actually, I didn't have this prepared, but one of my favorite quotes is from the French philosopher and mathematician Blaise Pascal: "I would have written you a shorter letter, but I didn't have the time." I love that quote for lots of reasons, and it has a direct application to what we're talking about: it is actually really complicated to develop a technology capability that makes things simple, that more directly meets the needs of the business through tech, that provides self-service capability. And I don't just mean self-service in a superficial sense; I mean making data and infrastructure make sense to the business users who are using them. My belief is that technology shouldn't require its users to be technology experts. What we really want them to be, and what they should be, is business experts, and any technology they use should enable and inform what they're looking to achieve. Those are the types of technologies that get me excited: not necessarily the geeky, complicated ones, but those that are really focused on simplicity, capability and control of data.

Yeah, okay. So you talked about backup. We're going to hear from Cohesity a little bit later, and beyond backup, data protection and data management.
That insight piece, the visibility you talked about earlier, that's what Io-Tahoe is bringing to the table with its software. So that's another component of the tech stack, if you will. And then you talked about simplicity. We're going to hear from Pure Storage; they're all about simple storage. They call it the modern data experience, I think. So those are some of the aspects. And your job, correct me if I'm wrong, is to put all of that together in a solution and then help the customer realize what we talked about earlier, that business outcome.

Yeah, it's about understanding both sides. Key to us, and to our ability to deliver on exactly what you just said, is being experts in the capabilities and the newer, better ways of doing things, but also having the business understanding to ask the right questions and identify how new or better approaches could help solve these issues. And yeah, you touched on the three vendors we work with that you have on the panel, genuinely three of the most exciting and innovative technology providers out there: Cohesity, Pure Storage and Io-Tahoe.

Pure is a great example. A lot of the way they've made their way in the market is through simplicity and through reducing data redundancy, et cetera. But another area I really like is that with their platforms you can do more with less. That's not just about reducing data redundancy; it's about creating application environments where the same infrastructure can service different requirements, able to handle random I/O, without getting too low-level, as well as sequential.
So what that means is that you don't necessarily have to move data from application environment A, do one thing with it, manipulate it, then move it to application environment B, then environment C, in a left-to-right analytics workflow. You can keep the data where it is, use it for different requirements within the infrastructure, and again, do more with less. And that's not just about simplicity and efficiency; it significantly reduces the time to value of that data as well. If I were to pick one soundbite that resonates across all of the vendors we have on the panel later, it's that they're able to deliver better TCO, better ROI, and significantly reduce the time to value of data; that's something they all have in common. But to answer your question, yeah, you're exactly right. It's key to us to understand customer requirements, position the right technology with the right vendors, and help customers achieve more and do better than they are.

Adam, I wonder if you could give us your insights, based on your experience with customers, into what success looks like. I'm interested in what they're measuring. I'm big on end-to-end cycle times and taking a systems view, but of course customers want to measure everything, whether it's the productivity of developers or time to insights, et cetera. What are the KPIs that are driving success and outcomes?

Well, historically in our space the KPIs have always been a bit woolly. You talk about total cost of ownership, return on investment and time to value. I've worked at many different types of companies with many different types of infrastructure, often quite complicated requirements and the infrastructure to service them, and being able to put together anything particularly realistic around ROI and TCO, something that actually gets proven out once the solution is in, is challenging.
But these newer, better approaches that are simpler, more flexible and more efficient enable you to build a true story and actually deliver whatever you promised around ROI and TCO. And the key metric for data, as you say, and I've said it a couple of times now, is time to value. In our scoping, in understanding the requirements, we specifically call out business outcomes, what the organization is looking to achieve, and then attach metrics to those outcomes. That does a few different things, but above all it provides defined success criteria, whether that's the success criteria for a proof of concept or for the overall solution that goes in. Being able to speak that language and, as I said before, more directly meet the needs of the business through tech in a crystallized, defined way is something we're only realistically able to do now with the types of technologies we're working with; historically it was borderline.

Yeah, so when you think about the business case, the ROI is the benefit over the cost. Obviously with a lower TCO you lower the denominator, so you're going to increase the value. But I would really stress that the numerator, especially in this world of data, is ultimately the most important part. The TCO is fundamental, but it's really becoming table stakes. You've got to be simple, you've got to be efficient, you've got to be agile, but that's what enables the numerator, whether that's new customer revenue or cost savings across the business. And again, that comes from taking that systems view. Do you have examples you can share with us, even anonymized, of customers you've worked with that are maybe a little further down the journey, proof points here?
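The business-case arithmetic Dave sketches, benefit as the numerator and cost as the denominator, can be made concrete. A toy model with entirely made-up numbers, just to show why shrinking TCO and growing the benefit compound on each other:

```python
def roi(benefit: float, tco: float) -> float:
    """Simple return on investment: net benefit relative to total cost of ownership."""
    return (benefit - tco) / tco

# Hypothetical before/after figures for a storage consolidation project.
before = {"tco": 500_000.0, "annual_benefit": 600_000.0}
after = {"tco": 300_000.0, "annual_benefit": 750_000.0}  # lower denominator, higher numerator

roi_before = roi(before["annual_benefit"], before["tco"])  # (600k - 500k) / 500k = 0.2
roi_after = roi(after["annual_benefit"], after["tco"])     # (750k - 300k) / 300k = 1.5
```

Cutting cost 40% while growing the benefit 25% takes the return from 20% to 150%, which is the "table stakes enable the numerator" point in miniature.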
Sure. It's quite easy, and very gratifying, when you can say to a customer: we know you've been doing it this way for years, and this is what your infrastructure architecture, or your data architecture, looks like; if we implemented this technology or this new approach, we could enable something as simple but often really powerful as reducing your rack footprint from a storage perspective. I worked on a project where a customer reduced their footprint from, I think it was nine, just under ten, fully loaded racks, which were providing their fundamental underlying storage architecture. They were able to consolidate that down, with additional capacity and significantly greater performance, into less than half a rack.

Or look at data protection, which you mentioned earlier. Another organization, in a project that's just nearing completion at the moment, is a huge organization with literally petabytes of data servicing their backup and archive. And it wasn't just the sheer volume of data; I think I'm right in saying they had five different backup applications, depending on what area of infrastructure they were backing up: virtualization was handled one way, one database environment differently from another, and they were using something else again in the cloud. With the consolidated approach we recommended and worked on with them, they were able to significantly reduce complexity and reduce the time it took them to do what they needed. One of their key problems was that they had gone past the threshold of being able to back everything up within their window.
When they tried to do a DR test, to spin everything back up in a secondary data center, they weren't able to achieve it within the timescales that their disaster recovery and business continuity plans demanded. With this project, just before they went into production, we ran a proof, a DR test using the newer approach, and they were able to recover the entire production workload in minutes. That compares to hours before, and those hours were just for a handful of workloads. They got the entire estate up and running in, I think, something like an hour, and the core production systems were up practically instantaneously.

If you step back and look at what these customers are trying to achieve, they want to be able to recover from any issues as quickly as possible, understand what they're dealing with from an infrastructure perspective, and reduce costs. Another customer we worked with recently had huge challenges around GDPR, and they were understandably very concerned. This was a little while ago, but it's not a conversation that's gone away; just about everybody I still speak to has issues and concerns around GDPR compliance, understanding where their data is stored, and being in a position to react effectively to subject access requests. That was a key metric, a key target, for an infrastructure solution we worked on with them, and we were able to provide them with insight into their data set and enable them to respond to compliance queries and subject access requests in significantly less time and with significantly less effort.

Awesome, thank you for that. I want to pick up on that a little bit. So the first example is: can you get your infrastructure in order and bust down those silos?
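Responding to a GDPR subject access request boils down to finding every record tied to one person across scattered data sets, which is exactly where fragmentation hurts. As a rough, purely illustrative sketch, not Io-Tahoe's actual method, here is a search over hypothetical CSV exports from two silos; every name and value in it is made up:

```python
import csv
import io

# Hypothetical exports from two silos; a real estate would have many more,
# in many formats, which is exactly why automated discovery matters.
CRM_CSV = "email,name,city\nalice@example.com,Alice,Leeds\nbob@example.com,Bob,York\n"
BILLING_CSV = "invoice,customer_email,amount\n1001,alice@example.com,250\n1002,carol@example.com,99\n"

def subject_access_search(datasets, subject_email):
    """Return every row, tagged with its source, that mentions the data subject."""
    hits = []
    for source, raw in datasets.items():
        for row in csv.DictReader(io.StringIO(raw)):
            if subject_email in row.values():
                hits.append((source, row))
    return hits

matches = subject_access_search({"crm": CRM_CSV, "billing": BILLING_CSV}, "alice@example.com")
```

Doing this by hand across petabytes and five backup applications is what made the response times Adam describes so long; cataloging the data once makes the lookup cheap.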
When I talk to customers, and I've talked to a number in banks, insurance companies, other financial services and manufacturing, what they tell me is that when they're able to streamline that data life cycle and bring in automation and intelligence, if you will, they obviously compress the time to value, but they also take on way more initiatives and projects that they can deliver for the business. And you talked before about the lines of business having self-service. The business feels genuinely invested in the data, that it's their data, without the confusion and finger-pointing. So that's huge. And I think your other example is right on as well: really clear business value that organizations are seeing. So thanks for those. Now really is the time to get these houses in order, if you will, because it drives competitive advantage, and, to take your second example, in this isolation economy, being able to respond to things like privacy requests is increasingly critical. Adam, give us your final thoughts. Bring us home in this segment.

Well, my final thought is something we didn't particularly touch on, something that's slightly hidden and isn't spoken about as much as I think it could be. We've already covered how traditional approaches to infrastructure can be complicated and inefficient, and how that impacts a user's ability to be agile. But what you also find with traditional approaches, and you've touched on some of the benefits of the newer ones, is that they're often very prescriptive. They're designed for a particular purpose. The infrastructure environment is served up to users in a packaged way, which means they have to use it in whatever way has been dictated. So that self-service aspect comes in from a flexibility standpoint as well.
These platforms, and the platform approach, which in my eyes is the right way to deliver technology, enable the infrastructure to be used flexibly. What we find is that if you put this capability into the hands of business users and data users, they start innovating in how they use that data and how they bring benefits to the business. If a platform is too prescriptive, they can't do that. With these new approaches you get all the metrics we've touched on, which is fantastic from a cost standpoint and an agility standpoint, but it also means the innovators in the business, the people who really understand what they're looking to achieve, now have the tools to innovate without being constrained. And I've started to see, with projects we've completed, that if you do it the right way, if you articulate the capability and empower the business users in the right way, those businesses are in a significantly better position to take advantage of it and really match, and significantly beat, their competition in whatever space they're in.

Super, Adam. It's a really exciting space. We spent the last ten years gathering all this data, trying to slog through it and figure it out, and now, with the tools and automation capabilities we have, it really is a new era of innovation and insight. So, Adam Worthington, thanks so much for coming on theCUBE and participating in this program.

No, it's been excellent. Thank you very much for inviting me; it's been a pleasure.

All right, stay safe, and thank you everybody. This is Dave Vellante for theCUBE.