Now, as we all know, businesses are awash with data, from financial services to healthcare to supply chain and logistics and beyond. Our activities, and increasingly the actions of machines, are generating new and more useful information in far greater volumes than we've ever seen. Meanwhile, our data-hungry society's expectations for experiences keep rising. Everybody wants to leverage and monetize all this new data coming from smart devices and renewable sources around the globe. All this data surrounds us, but more often than not it lives in silos, which makes it very difficult to consume, share, and turn into value. New types of data and analytics make things even more complicated: data from ERP systems, images, data generated by deep learning and machine learning platforms. This is the reality organizations face today, and effectively leveraging all this data has become an enormous challenge.

Today we're going to discuss these modern data challenges and the emergence of so-called smart data fabrics as a key solution to those challenges. To do so, we're joined by thought leaders from InterSystems, a really creative technology provider that's attacking some of the most challenging data obstacles. InterSystems tells us it's dedicated to helping customers address their critical scalability, interoperability, and speed-to-value challenges. In this first segment we welcome Scott, the global head of data platforms at InterSystems, to discuss the context behind these issues and how smart data fabrics provide a solution.

Scott, welcome. Good to see you again.

Thanks a lot, it's good to be here.

Yeah, so look, you and I go back several years. You've worked in tech and data management your whole career, and you've seen many data management solutions from the early days.
And then we went through the Hadoop era together, and you've come across a number of customer challenges that have changed and evolved along the way. So what are some of the most pressing issues you see today when you're talking to customers? Put on your technical hat if you want to.

Well, Dave, I think you described it well. It's a perfect storm out there. There's just data everywhere: it's coming from devices, it's coming from new kinds of processing paradigms, and people are trying to capture and harness the value of that data. At the same time, you talked about silos. I've talked about data silos through my entire career, and I think the interesting thing is that for so many years we said we've got to reduce the silos, we've got to integrate the data, we've got to consolidate the data. That was a really good paradigm for a long time, but frankly, in the perfect storm you described, the sources are just too varied. The agility a business unit needs to operate and manage its customers is creating enormous pressure, and I think ultimately silos aren't going away. So there's a realization that, okay, we're going to have these silos and we want to manage them, but how do we really take advantage of data that may live across different parts of our business and in different organizations? And of course the expectation of the consumer is at an all-time high, right? They expect that we're going to understand their needs, or they're going to find some other provider. Pulling all of this together means that our customers and businesses around the world are struggling to keep up, and it's forcing a paradigm shift in underlying data management.
We started many, many years ago with data marts, then data warehouses, and then we graduated to data lakes, where we expanded beyond traditional transactional data into all kinds of different data. At each step along the way we helped businesses thrive, survive, compete, and win. But with the perfect storm you've described, I think those technologies are now just one piece of the puzzle that's really required for success, and this is what's leading to data fabrics and data meshes in the industry.

So what are data fabrics? What problems do they solve? How do they work? Can you unpack that for us?

Yeah. The idea behind it, and this is not to the exclusion of the other technologies I described, data warehouses and data lakes and so on, is that data fabrics take the best of those worlds but add in the notion of data connectivity with provenance as a way to integrate data, versus data consolidation. And when you think about it, data has gravity, right? It's expensive to move data, and it's expensive in human cost to run ETL processes where you don't have known provenance of the data. So being able to play the data where it lies, and connect the information from disparate systems to learn new things about your business, is really the ultimate goal. Think about the world today: we hear about issues with the supply chain, and supply and logistics is a big issue, right? Why is that? All of these companies are data driven, they've got lots of access to data, they have formalized and automated their processes, they've installed software. And all of that software sits in different systems within different companies, but being able to connect that information together without changing the underlying systems is an important way to learn and optimize, with supply and logistics as an example.
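The "play the data where it lies" idea can be sketched in miniature. Below, a hypothetical illustration using two SQLite files as stand-ins for two operational silos: a fabric-style query connects them in place and joins across them, rather than ETL-ing everything into one consolidated store. All names here (files, tables, SKUs) are invented for the example and are not from InterSystems.

```python
import sqlite3

# Two "silos": separate database files standing in for separate
# operational systems (e.g., an order system and an inventory system).
orders = sqlite3.connect("orders.db")
orders.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER, sku TEXT, qty INTEGER)")
orders.execute("DELETE FROM orders")
orders.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                   [(1, "A-100", 5), (2, "B-200", 2)])
orders.commit()
orders.close()

inventory = sqlite3.connect("inventory.db")
inventory.execute("CREATE TABLE IF NOT EXISTS stock (sku TEXT, on_hand INTEGER)")
inventory.execute("DELETE FROM stock")
inventory.executemany("INSERT INTO stock VALUES (?, ?)",
                      [("A-100", 3), ("B-200", 10)])
inventory.commit()
inventory.close()

# Connect and query across both silos without copying either one
# into the other -- the data stays where it lies.
con = sqlite3.connect("orders.db")
con.execute("ATTACH DATABASE 'inventory.db' AS inv")
short = con.execute("""
    SELECT o.sku, o.qty, s.on_hand
    FROM orders o JOIN inv.stock s ON o.sku = s.sku
    WHERE o.qty > s.on_hand
""").fetchall()
print(short)  # SKUs where demand exceeds stock: [('A-100', 5, 3)]
```

A real data fabric does this across heterogeneous systems with provenance tracking; the point of the sketch is only that the join happens at query time, with no consolidation step.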
And that's a key use case for data fabrics: being able to connect, have provenance, not interfere with the operational systems, but glean additional knowledge by combining data from multiple operational systems.

And to your point, data by its very nature is distributed around the globe; it's on different clouds, it's in different systems. You mentioned data mesh before. How do data fabrics relate to the concept of data mesh? Are they competing, or are they complementary?

Ultimately we think they're complementary, and we actually like to talk about smart data fabrics as a way to combine the best of the two worlds.

What is that?

The biggest thing is that a lot of data fabric architecture talks about centralized processing, while data mesh is more about distributed processing. Ultimately, we think a smart data fabric should support both, have them be interchangeable, and let each be used where it makes the most sense. There are cases where it makes sense to process for a local business unit, or even on a device for real-time kinds of implementations. There are other cases where centralized processing of multiple data sources makes sense. What we're saying is that your technology, and the architecture you define behind that technology, should allow for both where they make the most sense.

What's the bottom-line business benefit of implementing a data fabric? What can I expect if I go that route?

I think there are a couple of things, right? Being able to interact with customers in real time and to manage through changes in the marketplace is certainly a key concept. Time to value is another. If you think about the supply and logistics discussion we had before: no company is going to rewrite its ERP operational system.
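The centralized-versus-distributed distinction described above can be made concrete with a toy aggregation. In the sketch below (all names invented for illustration), the same totals are computed two ways: centrally, by pulling every row to one place, and distributed, by letting each source aggregate locally so that only small partial summaries move. A smart data fabric, as described, would let you choose either per workload.

```python
# Hypothetical sources: two plants, each holding its own rows locally.
sources = {
    "plant_a": [("A-100", 5), ("A-100", 2), ("B-200", 1)],
    "plant_b": [("B-200", 4), ("A-100", 1)],
}

def centralized_total(sources):
    # Centralized style: move every row to the central engine, then aggregate.
    all_rows = [row for rows in sources.values() for row in rows]
    totals = {}
    for sku, qty in all_rows:
        totals[sku] = totals.get(sku, 0) + qty
    return totals

def distributed_total(sources):
    # Mesh style: each source aggregates locally; only partial totals travel.
    def local_agg(rows):
        partial = {}
        for sku, qty in rows:
            partial[sku] = partial.get(sku, 0) + qty
        return partial

    totals = {}
    for partial in (local_agg(rows) for rows in sources.values()):
        for sku, qty in partial.items():
            totals[sku] = totals.get(sku, 0) + qty
    return totals

# Both strategies agree; they differ in where the work (and the data
# movement) happens, which is the trade-off the interview describes.
assert centralized_total(sources) == distributed_total(sources)
print(distributed_total(sources))  # {'A-100': 8, 'B-200': 5}
```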
That's how they manage and run the business. But being able to glean additional insight from that data, combined with data from a partner, data from a customer, or algorithmic data such as a forecast you create, and to combine all of that without interfering with the operational process and get those answers quickly, is an important thing. So: seeing through the silos, having the connectivity and the interoperability, and combining that with flexibility in the analytics and in the algorithms you might want to run against that data. In today's world there's certainly the notion of predictive modeling and relational theory, but now also machine learning and deep learning algorithms, and having all of those be interchangeable is another important concept behind a data fabric. You're not relegated to one type of processing: it's your data, you have multiple different processing engines, and you may want to interchange them over time.

Actually, when you said real time, I infer from that that I don't have a zillion copies of the data sitting in a bunch of silos. Is that a correct premise?

You try to minimize your copies of the data. There's a nirvana that says there's only ever one copy of the data; that's probably impossible. But you certainly don't want to be forced into making multiple copies of the data unnecessarily, just to support different processing engines.

And you've recently made some enhancements to the data fabric capability that ostensibly take it to the next level. Is that the smart piece? Is that machine intelligence? Can you describe what's in there?

Well, ultimately the business benefit is that we all have a single source of truth for the company.
And so what we're doing is combining multiple technologies in a single set of software, which makes that software agile, supportable, and not fragile for deploying applications. At its core, we're saying we want to be able to consume any kind of data. A data fabric architecture is predicated on the fact that you're going to have relational data, document data, maybe key-value data, maybe images, maybe other things, and you don't want to be limited by the kind of data you want to process. That's certainly what we build into our product set. Then you want to be able to run any appropriate algorithm against that data without doing a bunch of massive ETL processes, and without making another copy of the data and moving it somewhere else. To that end, we have taken our award-winning engine, which provides traditional analytic and relational capabilities, and we've now integrated machine learning. So you can bring machine learning algorithms to the data without having to move the data to the machine learning algorithm. What does that mean? Number one, your application developer doesn't have to think differently to take advantage of the new algorithms, which is a really good thing. The other thing is that if you're playing that algorithm where the data actually exists, in your operational system, the round trip from running the model, to inferring a decision you want to make, to actually implementing that decision can happen instantaneously. In other kinds of architectures, you may make a copy of the data and move it somewhere else. That takes time; there's latency; the data gets stale; and your model may not be as effective because it's running against stale data.
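The "bring the algorithm to the data" pattern can be sketched with a small stand-in. Below, a scoring function is registered with the database engine itself, so inference runs next to the rows rather than exporting a copy of the data to a separate ML system. The model here is a toy with assumed weights, and every name (table, columns, threshold) is invented for illustration; this is not InterSystems' actual integrated ML.

```python
import sqlite3

def risk_score(qty, days_late):
    # Toy linear "model" with assumed weights, for illustration only.
    return 0.1 * qty + 0.5 * days_late

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE shipments (id INTEGER, qty INTEGER, days_late INTEGER)")
con.executemany("INSERT INTO shipments VALUES (?, ?, ?)",
                [(1, 10, 0), (2, 3, 4)])

# Register the scoring function with the engine: the algorithm now lives
# where the data lives, callable directly from SQL.
con.create_function("risk_score", 2, risk_score)

# Score and decide in one round trip against the live operational rows --
# no export step, so there is no stale copy between scoring and acting.
flagged = con.execute(
    "SELECT id FROM shipments WHERE risk_score(qty, days_late) > 1.5"
).fetchall()
print(flagged)  # shipments whose risk score exceeds the threshold: [(2,)]
```

The design point mirrors the interview: because scoring happens in the same place the operational data exists, the model always sees current data and the resulting decision can be acted on immediately.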
We've now taken all of that off the table by pulling that processing inside the data fabric, inside the single source of truth.

And you manage all that complexity. So you've got one system, which makes it cost effective, and you're bringing modern tooling to the platform. Is that right?

That's correct.

How can people learn more, and maybe continue the conversation with you if they have other questions? Call or write?

Yeah, certainly check out our website. We've got a lot of information about the different kinds of solutions, the different industries, the different technologies. Or reach out: scottge@intersystems.com.

Excellent. Thank you, Scott, really appreciate it. And great to see you again.

Good to see you.

All right, keep it right there. We have a demo coming up next. You want to see smart data fabrics in action? Stay tuned.