From the CUBE Studios in Palo Alto and Boston, connecting with thought leaders all around the world, this is a CUBE Conversation.

Well, there's no question these days that in the world of business, it's all about data. Data is king: how you harvest data, how you organize your data, how you distribute your data, how you secure your data. All very important questions. And certainly a leader in the data replication business is HVR. We're joined now by their CEO, Anthony Brooks-Williams, and by Diwakar Goel, the Global Chief Data Officer at GE. We're going to talk about, you guessed it, data. Gentlemen, thanks for being with us. Good to have you here on the CUBE Conversation.

Thank you. Thanks, John.

Yeah, well, let's just characterize the relationship between the two companies, between GE and HVR. Maybe, Diwakar, take us back to how you got to HVR, if you will, and maybe a little bit about the evolution of that relationship, where it's gone from day one.

No, absolutely. It's now actually a long time back, almost five, five and a half years, that we started working with Anthony, and honestly, those were our early days of big data. We all had different kinds of data warehousing platforms, but we were transitioning into the big data ecosystem, and we needed a partner that could help us get more real-time data. And that's when we started working with Anthony. And I would say, John, over the years we have learned a lot and our partnership has grown a lot, and it's grown based on the needs. When we started, honestly, it was just being able to replicate large sources. And to give you context, GE being GE, we have the fifth largest Oracle ERP, we have the seventh largest SAP ERP, and just by the nature of getting those systems in, it was a challenge. We had to work through different solutions, because some of the normal ones wouldn't work.
As we matured and really started using the data over the last two, three years specifically, we had different challenges. The challenges were: is the data completely accurate? Are we losing and dropping some data? When you're bringing in three billion, five billion rows of data every five to six hours, even if you've dropped one percent, you've lost a huge set of insights, right? So that's when we started working with Anthony more around the nuances: what could be causing us to lose some data, or duplicate some data sets? And I think our partnership has been very good, because some of our use cases have been unique and we've continuously pushed Anthony and the team to deliver. Not that these use cases will stay unique; in some cases we were just ahead, by the nature of what we were handling.

Anthony, how about the HVR approach then? Diwakar just took us through, at a somewhat high level, how this relationship has evolved. They started with big data, and now it's gone all the way to fine-tuning the accuracy that's so important. Latency is obviously a huge topic too from your side of the fence. But how do you address it? Take GE, for example: in terms of understanding their business, learning their business, their capabilities, maybe where their holes are, where their weaknesses were, and shoring that up. How did you approach that from the HVR side?

Yeah, I mean, winding back a few years, it obviously starts when you get in there and find an initial use case, and that was moving data into a certain data warehouse platform, whether it be around analytics or reporting, such as Diwakar mentioned. And that's, I mean, most commonly what we see from a lot of customers. It's the typical use cases: real-time analytics, and moving the data to an area for consolidated reporting. And most probably, in these times, it's in the cloud.
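The completeness problem Diwakar describes, catching a dropped one percent in a multi-billion-row load, is commonly attacked by reconciling source against target with per-bucket row counts and checksums, so a discrepancy is not just detected but localized. A minimal, hypothetical sketch follows; the function names and bucketing scheme are illustrative only, not HVR's actual compare mechanism:

```python
# Hypothetical source/target reconciliation: rows are hashed into a fixed
# number of buckets, and each bucket keeps a (row count, XOR-of-row-hashes)
# pair. A dropped or duplicated row changes exactly one bucket's pair,
# narrowing down where the replication stream went wrong.
import hashlib
from collections import defaultdict

def bucket_digest(rows, key_fn, buckets=16):
    """Aggregate rows into [count, checksum] per hash bucket."""
    stats = defaultdict(lambda: [0, 0])
    for row in rows:
        key = str(key_fn(row)).encode()
        b = int(hashlib.md5(key).hexdigest(), 16) % buckets
        stats[b][0] += 1                     # rows seen in this bucket
        stats[b][1] ^= int(hashlib.md5(repr(row).encode()).hexdigest(), 16)
    return stats

def reconcile(source_rows, target_rows, key_fn, buckets=16):
    """Return the set of buckets where source and target disagree."""
    src = bucket_digest(source_rows, key_fn, buckets)
    tgt = bucket_digest(target_rows, key_fn, buckets)
    return {b for b in set(src) | set(tgt)
            if src.get(b, [0, 0]) != tgt.get(b, [0, 0])}
```

At production scale the bucketed digests would be computed inside each database rather than in application code, but the shape of the check is the same: identical digests mean no rows were lost or duplicated, and a mismatch points at a small slice of the data to re-compare.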
For GE, where that's evolved, and GE are a top customer for us, we work across many of their business units, their different BUs. GE have another arm, Predix, which is their industrial IoT platform, that actually OEMs us as well for a solution they sell to other companies in the space. But where we've worked with GE is the ability, one, just to support the scale, the complexity, the volume of data, many different source systems, many different BUs, whether it be the aviation division or the power division or those types, sending that data across. And the difference as well, where they've really pushed us, where Diwakar and the team pushed us, is around the accuracy, to the exact point that Diwakar mentions. This data is typically financial data. This is data that they run their business off. This is data that their exec team, their CEOs, get dashboards on, on a daily basis. It can't be wrong. Not only do businesses these days want to make decisions on the freshest data that they can, and specifically over the last year, because it's a matter of survival. Not only is it about winning, it's about survival, and doing business in the most cost-effective way. But then that type of data that we're moving, the financial data, the financial data lakes that we built for GE, capturing this out of SAP systems where we have some of the features and benefits, that's where they've really pushed us around the accuracy. And that's where you can't ever, but especially these days, have a typical customer-vendor approach. It has to be a partnership. That was one of the things, when Diwakar and I spoke a while ago: how do we really push and develop a partnership between the two companies, between the two organizations? And that's key. And that's where we've been pushed, and there's a bunch of new things we're working on for them based on where they're going as a business, where there'll be different sources, different targets. And so that's where it's worked out well for both companies.
So, Diwakar, how about the margin of error then, in terms of accuracy? Because I'm hearing from Anthony that this is something you really pushed them on, right? You know, 96, 97% doesn't cut it, right? I mean, you can't be that close. It's got to be spot on. And at what point in your data journey, if you will, did it come home to roost that the accuracy had to improve, or that you needed a solution that would get you where you needed to be to operate your various businesses?

I think, John, it basically stems from a broader question: what are you using the data for? You know, a lot of us, when we were starting this journey, wanted to use the data for a lot of analytical use cases. And that basically means you want to look at a broad pattern and say, okay, you know what, do I have a significant amount of inventory sitting at one plant? Or, you know, is there a bigger problem with how I'm negotiating with a vendor, and do I want to change that? And for those use cases, you know, good-enough data gives you an indicator as to how you want to work with them, right? But when you want to take your data into far more critical, high-fidelity processes, whether you're trying to capture the latest signal from an airplane (and if you had five more signals, perhaps you solve the mystery of the missing Malaysian plane), or you're trying to close and report on your financials, then the fidelity and accuracy of the data has to be spot on. And what you realize is, you know, you unlock a certain set of value with analytical use cases, but if you truly want to unlock what can be done with big data, you have to go to the operational level. You have to run your business using the data in real time. It's not about, you know, in hindsight, how can I do things better? If I want to make real-time decisions on, you know, what I need to make right now, what's my healthcare indicator suggesting, how do I change the dosage for a customer or a patient, right? It has to be very operational.
It has to be very accurate. And the margin of error then almost becomes zero, because you are dealing with things where, if you go wrong, it can cost lives, right? So that's where we are. And I think, honestly, being able to solve that problem has now opened up a massive door of what all we can do with data.

Yeah, I would just build on that as well. I mean, as a company, we're in the data movement business, obviously: sources and targets. I mean, that's the table-stakes stuff, what we support. It's our ability to bridge these modern environments and the legacy environments. And you see that across all organizations: a lot of their data still sits in these legacy-type environments, but they're all transitioning to the target systems, the new-world ones that we see, the more modern, bleeding-edge environments. So we have to support those. But then at the same time, it's building on the performance and the accuracy of the total product, beyond just being able to connect the data. And that's where we get driven down that path by companies like GE, with Diwakar, who've pushed us. But it's really bridging those environments.

You know, it also seems that with regard to data, you could look at this almost like verb tenses: what happened, what is happening, what will happen, right? So looking at it through that prism, Diwakar, if you will, in terms of the kind of information that you can glean from this vast repository of data: as opposed to what did happen, what's going on right now, and then what can we make happen down the road? Where does HVR factor into that for you, in terms of not only having those kinds of insights, but also making sure that the right people within your organization have access to the information that they need, and maybe only what they need?

No, you're right. And John, it's funny, you're using a different analogy, but I keep referring to it as tail lights versus headlights, right?
Gone are the days when you could just look back at what happened. You need to be able to look forward, right? And I think real-time data is no longer a questionable need. It's a necessity. And I think one of the things we often miss is that real-time data is actually a very collaborative piece, in how it brings the various operators together. Because in the past, if you go a little bit old school, people would go and do their job, and then they would come back and submit what they did, right? And then you would accumulate what everybody did and make sense out of it. Now, as people are doing things live, you are hearing about it. So for example, if I'm issuing payments across many different places, I need to know how much balance I need to keep in the bank. It's the simplest example, right? Now, I can always stack my bank account with a ton of money, but then I'm losing money, because now I'm blocking that money. And especially if you think about GE, which has 6,000 bank accounts, if I keep stacking them, I will practically go bankrupt, right? But if I have an inference of what's happening every time a payment gets issued by anybody, I know in real time, and it allows me to adjust for optimal liquidity. As simple as it sounds, it saves you $100 million a year if you do it right. So it just fundamentally changes things. We need to think about real-time data as simply how you need to operate. It's no longer an option.

Yeah, I mean, that's what we've seen this past year. We were fortunate: we had a great 2020, just under a hundred percent year-over-year growth. Why? It's about the immediacy of data, so that companies can act accordingly. I mean, these days, it's table stakes. It's about winning, or just surviving, compared to years ago, when day-old data, week-old data, that was okay. In largely these legacy-type, batch-oriented technologies, that was fine. It's not anymore.
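The liquidity example Diwakar gives, adjusting cash across thousands of bank accounts as payments stream in rather than reconciling after the fact, boils down to maintaining a running per-account balance from an event feed and flagging accounts that fall below a floor. A toy sketch under assumed event shapes (the class and field names are invented for illustration and reflect neither GE's nor HVR's actual systems):

```python
# Hypothetical event-driven liquidity view: each payment event updates a
# running balance the moment it is replicated, instead of waiting for an
# end-of-day batch, so treasury can top up only the accounts that need it.
from collections import defaultdict

class LiquidityView:
    def __init__(self, opening_balances):
        # Unknown accounts start at zero balance.
        self.balances = defaultdict(float, opening_balances)

    def apply(self, event):
        """Apply one payment event: {'account': str, 'amount': float},
        where a negative amount is a payment going out."""
        self.balances[event["account"]] += event["amount"]

    def accounts_below(self, floor):
        """Accounts whose balance has dropped under the liquidity floor."""
        return {a: b for a, b in self.balances.items() if b < floor}
```

For example, applying a large outgoing payment immediately surfaces the affected account in `accounts_below`, which is the real-time signal that makes over-stacking every account unnecessary.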
I mean, exactly what Diwakar is saying: it's table stakes, that's what it is.

And I think, John, in fact, I see it getting pushed even further, right? Because what happens is, I get real-time data from HVR, but then I'm actually doing some work after that to get real-time insights. And there is a lag from that point to when I'm actually generating insights I can act on. Now there is more and more of a need to shorten that cycle time, right? So from HVR I not only get data while it's fresh, I actually get signals when I need to act on something. So I think, in fact, the need of the future is going to be far more event-driven, where every time something happens that I need to act on, how can technologies like HVR help me understand it?

Anthony, what does scale do to all this? I think Diwakar touched on it briefly with accuracy. If you've got a small discrepancy on a small data set, no big deal, right? But all of a sudden, if there are issues down the road and you're talking about millions and millions of inputs, now we've got a problem. So just in terms of scale, and with an operation the size of GE, what kind of impacts does that have?

Massive. I mean, it's part of the reason why we win, why we've been successful. We have the ability to scale very well from this highly distributed architecture that we have. And so that's what's been fortunate for us over the last years. What is the stat? I mean, something like 90% of the world's data was generated over the last couple of years. And that just feeds into more. I mean, scale is key. Not only scale: complexity at scale is the key thing. That's where we've been fortunate to hit a sweet spot, and that's somewhere GE can push us and challenge us on a daily basis. The same as we do with another company, the biggest online e-commerce platform: massive scale, massive scale.
And so we get pushed the whole time, pushed to improve all the time. But fortunately, we have a very good solution that fits into that. And I think "worse" is the wrong word; it's just going to continue to grow. The problem's not going away. You know, the volumes are going to increase massively.

So, Diwakar, if I could, before we wrap up here, I'm just curious: put on your forward-thinking glasses right now. In terms of the data capabilities that HVR is providing you, are they driving you to different kinds of horizons in terms of your business strategy, or are your business strategies driving the data solutions that you need? I mean, which way is it right now, in terms of how you are going to expand your services down the road?

It's an interesting question. I think, and Anthony can keep correcting me on this one, but if you think about big data solutions, they were largely designed for a different world historically. They were designed for IoT, parametric kinds of data sets, a different kind of world. So there was a big catch-up that a lot of these companies had to do to make it relevant even for the other relational data sets, transactional data sets, and everything else, right? So a big part of what I feel Anthony and other companies have been focusing on is really making this relevant for that world. And I feel companies like HVR are now absolutely there. And as they are there, they are now starting to think about solving for, or I would say focusing on, people who are early in their stage, and how they can get them up quickly and efficiently, because that's a lot of the challenge, right? So I would say, and I don't know if Anthony's focus is on us, and rightly so, it should not be just us, but I think where they're going is, for example, how do you connect with all the different cloud vendors?
So when a company wants to come live, and if they're using data from, say, their HR or Workday solution, or the Concur travel solution, they can just come to HVR, plug and play, and say, okay, enable me data from all of these, and it's there. What took us six months to get to, they get today, right? So I think, rightly so, Anthony and the team are focusing on that. And I would say we have been partnering with Anthony more, perhaps pushing a little more, on getting not only accurate data, but also now on the paradigm of compliant data. Because I think what you're going to start seeing is that companies, especially in certain kinds of industries like financial, healthcare, and others, will need data certifications of various kinds, and that will require each of these tools to meet compliance standards they were not originally designed for, right? So I think that's a different paradigm that, again, Anthony and the team are really doing great in helping us get to.

Yeah, that was good, Diwakar; there's quite a bit to unpack there. You know, with companies such as GE, we've been on a journey for many years. So that's why we're deployed across the enterprise. And it starts off with: I have this source system on one side, that target system on the other. These target systems, you know, become more frequently either data lakes or environments that have moved from running on premise to running in the cloud, to newer platforms that are built for the cloud. We've seen the uptick in companies like Snowflake and those types, and, I mean, we see this with BigQuery from Google and those types of environments. So we see those, and those are things we've got to support along the way as well. But then at the same time, more and more data starts getting generated in your non-traditional platforms, I mean, cloud-based applications and those things, which we then support and build into this whole framework.
But at the same time, to what Diwakar was saying, there are the legal requirements, the regulatory requirements on the type of data that is now being used. Years ago, you would never typically have companies moving their most valuable data, their financial data, into these cloud-based environments. Well, today they are; it happens. And so with that comes a whole bunch of regulation and security, and we've certainly seen, particularly over this last year, the uptick in scrutiny: these transactions get another level of scrutiny when you bring in new products, and there are requirements that they go through. You know, basically the security and legal reviews are a lot longer and more in-depth than they used to be. And that's just typical of the areas they're deploying these technologies in as well, where you're taking some technologies that weren't necessarily built for the modern world, and they are now adapting to the modern world. So it's quite complex, and a lot to unpack there, but you've got to be on top of all of that. And that's where you then work with your top customers, like a GE, and that feeds your roadmap. That feeds where, one, you obviously make a decision and say, this is where we believe the market's going, these are things we know we need to go support even though no customer has asked us for them yet. But the majority of it is still the customers pushing the bleeding edge that are pushing you as well. And that feeds the roadmap. And, you know, there are a number of new platforms GE has pushed us to go support, and features that Diwakar and the team have pushed us on around accuracy and security and those types of things. So it's an all-encompassing approach.

I think we've set up an entirely new CUBE Conversation we're going to have down the road, I think. Yeah. All right. Hey, gentlemen, thank you for the time. I certainly appreciate it. Really enjoyed it.
And I wish you both a very happy and, importantly, a healthy 2021. Thank you both. Appreciate it.

Thank you. Thank you. Thanks, Anthony. Bye-bye.