Now as we all know, businesses are awash with data, from financial services to healthcare to supply chain and logistics and more. Our activities, and increasingly actions from machines, are generating new and more useful information in much larger volumes than we've ever seen. Meanwhile, our data-hungry society's expectations for experiences are increasingly elevated. Everybody wants to leverage and monetize all this new data coming from smart devices and innumerable sources around the globe. All this data surrounds us, but more often than not it lives in silos, which makes it very difficult to consume, share, and make valuable. These factors, combined with new types of data and analytics, make things even more complicated: data from ERP systems to images to data generated from deep learning and machine learning platforms. This is the reality that organizations are facing today, and as such, effectively leveraging all of this data has become an enormous challenge. Today, we're going to be discussing these modern data challenges and the emergence of so-called smart data fabrics as a key solution to said challenges. To do so, we're joined by thought leaders from InterSystems, a really creative technology provider that's attacking some of the most challenging data obstacles. InterSystems tells us that they're dedicated to helping customers address their critical scalability, interoperability, and speed-to-value challenges. In this first segment, we welcome Scott Gnau, the global head of data platforms at InterSystems, to discuss the context behind these issues and how smart data fabrics provide a solution. Scott, welcome, good to see you again. Thanks a lot, it's good to be here. 
Yeah, so look, you and I go back several years. You've worked in tech and data management your whole career, you've seen many data management solutions from the early days, we went through the Hadoop era together, and you've come across a number of customer challenges that have changed and evolved along the way. So what are some of the most pressing issues that you see today when you're talking to customers? And put on your technical hat if you want to. Well, Dave, I think you described it well. It's a perfect storm out there. There's just data everywhere, it's coming from devices, it's coming from new kinds of processing paradigms, and people are trying to capture and harness the value from this data. At the same time, you talked about silos, and I've talked about data silos through my entire career. I think the interesting thing about it is, for so many years we've said we've got to reduce the silos, we've got to integrate the data, we've got to consolidate the data, and that was a really good paradigm for a long time. But frankly, with the perfect storm that you described, the sources are just too varied. The required agility for a business unit to operate and manage their customers is creating enormous pressure, and I think ultimately silos aren't going away. So there's a realization that, okay, we're going to have these silos and we want to manage them, but how do we really take advantage of data that may live across different parts of our business and in different organizations? And then of course the expectation of the consumer is at an all-time high, right? They expect that we're going to treat them and understand their needs, or they're going to find some other provider. So pulling all of this together really means that our customers and businesses around the world are struggling to keep up, and it's forcing a new paradigm shift in underlying data management, right? 
We started many, many years ago with data marts and then data warehouses, and then we graduated to data lakes, where we expanded beyond just traditional transactional data into all kinds of different data. And at each step along the way, we helped businesses to thrive and survive and compete and win. But with the perfect storm that you've described, I think those technologies are now just a piece of the puzzle that is really required for success, and this is really what's leading to data fabrics and data meshes in the industry. So what are data fabrics? What problems do they solve? How do they work? Can you break that down for us? Yeah, so the idea behind it is, and this is not to the exclusion of the other technologies I described, data warehouses and data lakes and so on, but data fabrics take the best of those worlds and add in the notion of being able to do data connectivity with provenance as a way to integrate data, versus data consolidation. And when you think about it, data has gravity, right? It's expensive to move data. It's expensive in terms of human costs to do ETL processes where you don't have known provenance of the data. So being able to play data where it lies and connect the information from disparate systems to learn new things about your business is really the ultimate goal. Think about the world today: we hear about issues with the supply chain, and supply and logistics is a big issue, right? Why is that an issue? Because all of these companies are data driven. They've got lots of access to data, they have formalized and automated their processes, they've installed software. And all of that software is in different systems within different companies, but being able to connect that information together without changing the underlying systems is an important way to learn and optimize for supply and logistics, as an example. 
And that's a key use case for data fabrics: being able to connect, have provenance, not interfere with the operational systems, but glean additional knowledge by combining data from multiple different operational systems. And to your point, data by its very nature is distributed around the globe, it's on different clouds, it's in different systems. You mentioned data mesh before. How do data fabrics relate to this concept of data mesh? Are they competing? Are they complementary? Ultimately we think that they're complementary, and we actually like to talk about smart data fabrics as a way to combine the best of the two worlds. What is that? The biggest thing really is that a lot of data fabric architecture talks about centralized processing, while in data meshes it's more about distributed processing. Ultimately we think a smart data fabric will support both, have them be interchangeable, and let each be used where it makes the most sense. There are some things where it makes sense to process for a local business unit, or even on a device for real-time kinds of implementations. There are other areas where centralized processing of multiple different data sources makes sense, and what we're saying is that your technology, and the architecture that you define behind that technology, should allow for both where they make the most sense. What's the bottom line business benefit of implementing a data fabric? What can I expect if I go that route? I think there are a couple of things, right? Certainly being able to interact with customers in real time and being able to manage through changes in the marketplace is a key concept. Time-to-value is another key concept. If you think about the supply and logistics discussion that I had before, right? No company is going to rewrite their ERP operational system. 
It's how they manage and run their business. But being able to glean additional insights from that data, combined with data from a partner, combined with data from a customer, or combined with algorithmic data, like some sort of forecast you may create that you want to fold in, and being able to combine that together without interfering with the operational process and get those answers quickly is an important thing. So seeing through the silos, being able to do the connectivity, being able to have interoperability, and then combining that with flexibility on the analytics and flexibility on the algorithms you might want to run against that data. Because in today's world, of course, there's the notion of predictive modeling and relational theory, but now we're also adding in machine learning and deep learning algorithms, and having all of those things be interchangeable is another important concept behind data fabric. So you're not relegated to one type of processing. You're saying it's data, and I have multiple different processing engines, and I may want to interchange them over time. So, actually, when you said real time, I infer from that that I don't have a zillion copies of the data and it's not in a bunch of silos. Is that a correct premise? You try to minimize your copies of the data. Certainly there's a nirvana that says there's only ever one copy of data. That's probably impossible, but you certainly don't want to be forced into making multiple copies of data unnecessarily to support different processing engines. And so you've recently made some enhancements to the data fabric capability that take it, ostensibly, to the next level. Is that the smart piece? Is that machine intelligence? Can you describe what's in there? Well, ultimately the business benefit is that we all have a single source of truth for a company. 
And so what we're doing is combining multiple technologies in a single set of software that makes that software agile and supportable, and not fragile for deployment of applications. At its core, what we're saying is we want to be able to consume any kind of data, and I think your data fabric architecture is predicated on the fact that you're going to have relational data, you're going to have document data, you may have key-value store data, you may have images, you may have other things, and you don't want to be limited by the kind of data that you want to process. That certainly is what we build into our product set. And then you want to be able to have any kind of algorithm, where appropriate, run against that data without having to do a bunch of massive ETL processes or make another copy of the data and move it somewhere else. And so to that end, we have taken our award-winning engine, which provides traditional analytic capabilities and relational capabilities, and we've now integrated machine learning. So you basically can bring machine learning algorithms to the data without having to move the data to the machine learning algorithm. What does that mean? Well, number one, your application developer doesn't have to think differently to take advantage of the new algorithm. So that's a really good thing. The other thing that happens is, if you're playing that algorithm where the data actually exists from your operational system, that means the round trip, from running the model, to inferring some decision you want to make, to actually implementing that decision, can happen instantaneously. As opposed to other kinds of architectures where you may want to make a copy of the data and move it somewhere else: that takes time, that adds latency, and now the data gets stale. Your model may not be as efficient because you're running against stale data. 
We've now taken all of that off the table by being able to pull that processing inside the data fabric, inside of the single source of truth. And you've got to manage all that complexity. So you've got one system, which makes it cost effective, and you're bringing modern tooling to the platform, is that right? That's correct. How can people learn more and maybe continue the conversation with you if they have other questions? Call or write. Yeah, certainly check out our website. We've got a lot of information about the different kinds of solutions, the different industries, the different technologies. Reach out: scottg at intersystems.com. Excellent, thank you Scott, really appreciate it. And it's great to see you again. Good to see you. All right, keep it right there. We have a demo coming up next. You want to see smart data fabrics in action? Stay tuned. Okay, so now that we've heard Scott talk about smart data fabrics, it's time to see this in action. Right now we're joined by Jess Jowdy, who's the manager of healthcare field engineering at InterSystems. She's going to give a demo of how smart data fabrics actually work, and she's going to show how embedding a wide range of analytics capabilities, including data exploration, business intelligence, natural language processing, and machine learning, directly within the fabric makes it faster and easier for organizations to gain new insights and power intelligent, predictive, and prescriptive services and applications. Now, according to InterSystems, smart data fabrics are applicable across many industries, from financial services to supply chain to healthcare and more. Jess today is going to be speaking through the lens of a healthcare-focused demo. Don't worry, Joe Lichtenberg will get into some of the other use cases that you're probably interested in hearing about. That will be in our third segment, but for now, let's turn it over to Jess. Jess, good to see you. 
Hi, yeah, thank you so much for having me. For this demo, we're really going to be bucketing the features of a smart data fabric into four different segments: connections, collections, refinements, and analysis. And we'll see that throughout the demo as we go. So without further ado, let's just go ahead and jump into this demo, and you'll see my screen pop up here. I actually like to start at the end of the demo, so I like to begin by illustrating what an end user is going to see. And don't mind the screen, because I gave you a little sneak peek of what's about to happen. But essentially what I'm going to be doing is using Postman to simulate a call from an external application. So we talked about being in the healthcare industry. This could be, for instance, a mobile application that a patient is using to view an aggregated summary of information across that patient's continuity of care, or some other kind of application. So we might be pulling information in this case from an electronic medical record. We might be grabbing clinical history from that. We might be grabbing clinical notes from a medical transcription software, or adverse reaction warnings from a clinical risk grouping application, and so much more. So I'm really going to be simulating a patient logging in on their phone and retrieving this information through this Postman call. What I'm going to do is just hit send. I've already preloaded everything here. And I'm going to be looking for information where the last name of this patient is Simmons, and their medical record number, or their patient identifier in the system, is 32345. And as you can see, I have this single JSON payload that showed up here of just relevant clinical information for my patient whose last name is Simmons, all within a single response. So fantastic, right? 
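For readers who want to picture the consuming side of that call, here is a minimal Python sketch of an application parsing a single aggregated JSON payload like the one in the demo. The field names and structure are assumptions for illustration only, not InterSystems' actual response schema.

```python
import json

# Hypothetical aggregated payload, modeled on the demo's single JSON
# response for patient "Simmons", MRN 32345. Field names are assumed.
SAMPLE_RESPONSE = json.dumps({
    "patient": {"lastName": "Simmons", "mrn": "32345"},
    "clinicalHistory": [{"source": "EMR", "condition": "Hypertension"}],
    "clinicalNotes": [{"source": "transcription", "note": "Follow-up in 2 weeks"}],
    "riskWarnings": [{"source": "risk-grouping", "warning": "Adverse reaction: penicillin"}],
})

def summarize(payload: str) -> dict:
    """Flatten the aggregated record into a per-source summary,
    as a mobile app might before rendering it."""
    record = json.loads(payload)
    summary = {"patient": record["patient"]["lastName"]}
    for section in ("clinicalHistory", "clinicalNotes", "riskWarnings"):
        summary[section] = [item["source"] for item in record.get(section, [])]
    return summary

print(summarize(SAMPLE_RESPONSE))
```

The point of the sketch is that the caller sees one payload and one schema, regardless of how many backend systems contributed to it.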
Typically though, when we see responses that look like this, there is an assumption that this service is interacting with a single backend system, and that single backend system is in charge of packaging that information up and returning it back to the caller. But in a smart data fabric architecture, we're able to expand the scope to handle information across different, in this case, clinical applications. So how did this actually happen? Let's peel back another layer and really take a look at what happened in the background. What you're looking at here is our mission control center for our smart data fabric. On the left, we have our APIs that allow users to interact with particular services. On the right, we have our connections to our different data silos. And in the middle here, we have our data fabric coordinator, which is going to be in charge of the refinement and analysis, those key pieces of our smart data fabric. So let's look back and think about the example we just showed. I received an inbound request for information for a patient whose last name is Simmons. My end user is requesting to connect to that service, and that's happening here at my patient data retrieval API location. Users can define any number of different services and APIs depending on their use cases, and to that end, we also support full lifecycle API management within this platform. When you're dealing with APIs, I always like to make a little shout-out on this: you really want to make sure you have a granular enough security model to handle and limit which APIs and which services a consumer can interact with. In the IRIS platform, which we're talking about today, we have a very granular role-based security model that allows you to handle that. But it's really important in a smart data fabric to consider who's accessing your data and in what context. Can I just interrupt you for a second, Jess? So you were showing on the left-hand side of the demo a couple of APIs. 
I presume that can be a very long list. I mean, what do you see as typical? I mean, you can have hundreds of these APIs depending on what services an organization is serving up for their consumers. So yeah, we've seen hundreds of these services listed here. So my question is, obviously security is critical in the healthcare industry, and API security is a really hot topic these days. How do you deal with that? Yeah, and I think API security is interesting because it can happen at so many layers. So there's interaction with the API itself: can I even see this API and leverage it? And then within an API call, you then have to deal with, all right, which endpoints or what kind of interactions within that API am I allowed to do? What data am I getting back? And with healthcare data, the whole idea of consent to see certain pieces of data is critical. So the way that we handle that is, like I said, the same thing at different layers. There is access to a particular API, which can happen within the IRIS product, and we also see it happening with an API management layer, which has become a really hot topic with a lot of organizations. And then when it comes to data security, that really happens under the hood within your smart data fabric. So that role-based access control becomes very important in assigning roles and permissions to certain pieces of information. Getting that granular becomes the cornerstone of security. And that's been designed in. It's not a bolt-on, as they like to say. Okay, can we get into collect now? Of course. We're going to move on to the collection piece at this point in time, which involves pulling information from each of my different data silos to create an overall aggregated record. So, commonly, each data source requires a different method for establishing connections and collecting this information. So, for instance, interactions with an EMR may require leveraging a standard healthcare messaging format like FHIR. 
Interactions with a homegrown enterprise data warehouse, for instance, may use SQL. Cloud-based solutions managed by a vendor may only allow you to use web service calls to pull data. So it's really important that the data fabric platform you're using has the flexibility to connect to all of these different systems and applications. And I'm about to log out, so I'm going to keep my session going here. So it's incredibly important that your data fabric has the flexibility to connect to all these different kinds of applications and data sources, in all these different kinds of formats, and over all of these different kinds of protocols. So let's think back on our example here. I had four different applications that I was requesting information from to create that payload that we saw initially. Those are listed here under this operation section, so these are going out and connecting to downstream systems to pull information into my smart data fabric. What's great about the IRIS platform is it has an embedded interoperability platform, so there are all of these native adapters that can support the common connections that we see for different kinds of applications. So whether you're using REST or SOAP or SQL or FTP, regardless of the protocol, there's an adapter to help you work with that. And we also think of the types of formats that we typically see data coming in as. In healthcare, we have HL7, we have FHIR, we have CCDs across the industry. JSON is really hitting the market strong now, and XML payloads, flat files. We need to be able to handle all these different kinds of formats over these different kinds of protocols. So to illustrate that, if I click through these, when I select a particular connection, on the right side panel I'm going to see the different settings that are associated with that particular connection that allow me to collect information back into my smart data fabric. 
In this scenario, my connection to my ChartScript application in this example communicates over a SOAP connection. When I'm grabbing information for my clinical risk grouping application, I'm using a SQL-based connection. When I'm connecting to my EMR, I'm leveraging a standard healthcare messaging format known as FHIR, which is a REST-based protocol. And when I'm working with my health record management system, I'm leveraging a standard HTTP adapter. So you can see how we can be flexible when dealing with these different kinds of applications and systems. And then it becomes important to be able to validate that you've established those connections correctly, and to be able to do it in a reliable and quick way. Because if you think about it, you could have hundreds of these different kinds of applications built out, and you want to make sure that you're maintaining and understanding those connections. So I can actually go ahead and test one of these applications and put in, for instance, my patient's last name and their MRN, and make sure that I'm actually getting data back from that system. So it's a nice little sanity check as we're building out the data fabric, to ensure that we're able to establish these connections appropriately. So turnkey adapters are fantastic, and as you can see, we're leveraging them all here. But sometimes these connections are going to require going one step further and building something really specific for an application. So why don't we go one step further here and talk about doing something custom or doing something innovative. It's important for users to have the ability to develop and go beyond an out-of-the-box or black-box approach, to be able to develop things that are specific to their data fabric or specific to a particular connection. In this scenario, the IRIS data platform gives users access to the entire underlying code base. 
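The one-protocol-per-system pattern Jess walks through can be sketched as a simple adapter registry: each downstream system is reached over a different protocol, but the fabric asks for data the same way. This is an illustrative toy, not IRIS's actual adapter API; the system names mirror the demo, and the Adapter class is hypothetical.

```python
# Toy adapter registry: one adapter per downstream system, each with
# its own protocol, all exposing the same fetch() interface.
class Adapter:
    def __init__(self, protocol):
        self.protocol = protocol

    def fetch(self, last_name, mrn):
        # A real adapter would speak SOAP/SQL/REST/HTTP here.
        return {"protocol": self.protocol, "patient": (last_name, mrn)}

REGISTRY = {
    "ChartScript": Adapter("SOAP"),
    "RiskGrouping": Adapter("SQL"),
    "EMR": Adapter("FHIR/REST"),
    "HealthRecordMgmt": Adapter("HTTP"),
}

def collect(last_name, mrn):
    """Fan the same logical request out through every adapter."""
    return {name: a.fetch(last_name, mrn) for name, a in REGISTRY.items()}

print(collect("Simmons", "32345")["EMR"]["protocol"])  # prints FHIR/REST
```

The design point is that callers of `collect` never see the protocol differences; that variability is pushed down into the adapters.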
So you not only get an opportunity to view how we're establishing these connections or how we're building out these processes, but you have the opportunity to inject your own kind of processing, your own kinds of pipelines, into this. So as an example, you can leverage any number of different programming languages right within this pipeline. And so I went ahead and I injected Python. Python is a very up-and-coming language, right? We see more and more developers turning towards Python to do their development. So it's important that your data fabric supports those kinds of developers and users that have standardized on these kinds of programming languages. This particular script here, as you can see, actually calls out to our turnkey adapter. So we see a combination of out-of-the-box code that is provided in this data fabric platform from IRIS, combined with organization-specific or user-specific customizations that are included in this Python method. So it's a nice little combination of how we bring the developer experience in and mix it with the out-of-the-box capabilities that we can provide in a smart data fabric. Wow. Yeah, I'll pause. There's a lot here. Actually, if I could, I just want to sort of play that back. So we went through the connect and the collect phase, and we're going into refine. So it's a good place to stop. Yeah, so before we get there, we heard a lot about fine-grained security, which is crucial. We heard a lot about different data types, multiple formats. You've got the ability to bring in different dev tools. We heard about FHIR, which is, of course, big in healthcare. Absolutely. That's the standard. And then SQL for traditional kinds of structured data, and then web services like HTTP, you mentioned. And so you have a rich collection of capabilities within this single platform. Absolutely. 
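The mix of turnkey and custom code she describes might look like the following sketch: a custom Python step that calls a stand-in for the built-in adapter, then layers organization-specific logic on top. Every name here is hypothetical, chosen only to illustrate the pattern.

```python
def turnkey_fetch(mrn):
    """Stand-in for the out-of-the-box adapter call the platform provides."""
    return {"mrn": mrn, "lastName": "simmons"}

def custom_step(mrn):
    """User-injected pipeline step: reuse the turnkey adapter, then
    apply organization-specific cleanup and validation."""
    record = turnkey_fetch(mrn)                      # out-of-the-box piece
    record["lastName"] = record["lastName"].title()  # org-specific cleanup
    record["validated"] = record["mrn"].isdigit()    # org-specific check
    return record

print(custom_step("32345"))
```

The value of the pattern is that the custom step composes with the platform-provided code instead of replacing it.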
And I think that's really important when you're dealing with a smart data fabric, because what you're effectively doing is consolidating all of your processing, all of your collection, into a single platform. So that platform needs to be able to handle any number of different kinds of scenarios and technical challenges. So you've got to pack that platform with as many of these features as you can to consolidate that processing. All right, so now we're going into refinement. We're going into refinement. Exciting. So how do we actually do refinement? Where does refinement happen? And how does this whole thing end up being performant? Well, the key to all of that is this SDF coordinator, which stands for Smart Data Fabric coordinator. What this particular process is doing is essentially orchestrating all of these calls to all of these different downstream systems. It's collecting that information, aggregating it, and refining it into that single payload that we saw get returned to the user. So really, this coordinator is the main event when it comes to our data fabric. And in the IRIS platform, we actually allow users to build these coordinators using web-based tool sets to make it intuitive. So we can take a sneak peek at what that looks like. As you can see, it follows a flowchart-like structure. So there's a start, there's an end, and then there are these different arrows that point to different activities throughout the business process. And so there are all these different actions that are being taken within our coordinator. You can see an action for each of the calls to each of our different data sources to go retrieve information. And then we also have this sync call at the end that is in charge of essentially making sure that all of those responses come back before we package them together and send them out. So this becomes really crucial when we're creating that data fabric. 
And this is a very simple data fabric example, where we're just grabbing data and consolidating it together. But you can have really complex orchestrators and coordinators that do any number of different things. So for instance, I could inject SQL logic or SQL code into this. I can have conditional logic. I can do looping. I can do error trapping and handling. So we're talking about a whole number of different features that can be included in this coordinator. Like I said, we have a very simple process here that's just calling out, grabbing all those different data elements from all those different data sources, and consolidating them. We'll look back at this coordinator in a second, when we make this data fabric a bit smarter and start introducing that analytics piece to it. So this is in charge of the refinement. And so at this point in time, we've looked at connections, collections, and refinements. And just to summarize what we've seen, because I always like to go back and take a look at everything: we have our initial API connection, we have our connections to our individual data sources, and we have our coordinators there in the middle that are in charge of collecting the data and refining it into a single payload. As you can imagine, there's a lot going on behind the scenes of a smart data fabric, right? There are all these different processes that are interacting. So it's really important that your smart data fabric platform has really good traceability, really good logging, because you need to be able to know, if there was an issue, where did that issue happen, in which connected process, and how did it affect the other processes that are related to it. In IRIS, we have this concept called a visual trace. 
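The fan-out-then-sync behavior of the coordinator can be sketched in a few lines of Python with a thread pool: call every source concurrently, block until all responses arrive (the "sync" step), then refine everything into one payload. The source names follow the demo; the fetch logic is a placeholder, not the platform's actual mechanism.

```python
from concurrent.futures import ThreadPoolExecutor

# The four downstream systems from the demo.
SOURCES = ["ChartScript", "RiskGrouping", "EMR", "HealthRecordMgmt"]

def fetch(source, mrn):
    """Placeholder for an adapter call to one downstream system."""
    return {"source": source, "mrn": mrn}

def coordinate(mrn):
    with ThreadPoolExecutor(max_workers=len(SOURCES)) as pool:
        futures = {s: pool.submit(fetch, s, mrn) for s in SOURCES}
        # The "sync" step: block until every downstream call returns.
        results = {s: f.result() for s, f in futures.items()}
    # Refinement: package everything into a single payload.
    return {"mrn": mrn, "sections": results}

payload = coordinate("32345")
print(sorted(payload["sections"]))
# prints ['ChartScript', 'EMR', 'HealthRecordMgmt', 'RiskGrouping']
```

Running the calls concurrently rather than sequentially is what keeps the round trip performant as the number of sources grows.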
And what our clients use this for is basically to step through the entire history of a request, from when it initially came into the smart data fabric to when data was sent back out from it. So I didn't call out the time, but if you noted the time, this is when we sent that request in. And you can see my patient's name and their medical record number here, and you can see that that instigated four different calls to four different systems, represented by these arrows going out. So we sent something to ChartScript, to our health record management system, to our clinical risk grouping application, and into my EMR through their FHIR server. So every outbound application gets a request, and we pull back all of those individual pieces of information from all of those different systems, and we bundle them together. And for my FHIR lovers, here's our FHIR bundle that we got back from our FHIR server. So this is a really good way of being able to validate that I am appropriately grabbing the data from all these different applications and then ultimately consolidating it into one payload. Now, we change this into a JSON format before we deliver it, but these are those data elements brought together. And this screen would also be used for being able to see things like error trapping, or errors that were thrown, alerts, warnings. Developers might put log statements in just to validate that certain pieces of code are executing. So this really becomes the one-stop shop for understanding what's happening behind the scenes with your data fabric. Yeah, sure. Who did what, and where did it go? What did the machine do? What went wrong, and where did that go wrong? Exactly. Right at your fingertips. Right, and I'm a visual person, so a bunch of log files to me is not the most helpful; being able to see that this happened at this time and in this location gives me the understanding I need to actually troubleshoot a problem. 
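As a rough mental model of what a visual trace captures, here is a toy trace recorder in Python: each hop is time-stamped and attributed to a component, so a failure can be pinned to a specific connected process. IRIS's visual trace is a richer, graphical tool; the class below is purely illustrative.

```python
import time

class Trace:
    """Toy request trace: one time-stamped event per hop."""
    def __init__(self):
        self.events = []

    def record(self, component, message):
        self.events.append((time.time(), component, message))

    def timeline(self):
        # Events are appended in call order, so this is chronological.
        return [f"{component}: {message}" for _, component, message in self.events]

trace = Trace()
trace.record("API", "inbound request: Simmons / MRN 32345")
for system in ("ChartScript", "HealthRecordMgmt", "RiskGrouping", "EMR"):
    trace.record("Coordinator", f"outbound call to {system}")
trace.record("API", "aggregated payload returned")
print(trace.timeline()[0])  # prints: API: inbound request: Simmons / MRN 32345
```

Even this toy shows the troubleshooting payoff Jess describes: the timeline answers "which connected process, and when" at a glance, which a flat pile of log files does not.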
This business orchestration piece, can you say a little bit more about that? How are people using it? What's the business impact of the business orchestration? The business orchestration, especially in the smart data fabric, is really that crucial part of being able to create a smart data fabric. So think of your business orchestrator as doing the heavy lifting of any kind of processing that involves data, right? It's bringing data in. It's analyzing that information. It's transforming that data into a format that your consumer is going to understand. It's doing any additional injection of custom logic. So really, your coordinator, that orchestrator that sits in the middle, is the brains behind your smart data fabric. And this is available today? It's all available today? Yeah, it all works. And we have a number of clients that are using this technology to support these kinds of use cases. Awesome demo. Anything else you want to show us? Well, we can keep going. I have a lot to say, but really, this is our data fabric, and the core competency of IRIS is making it smart, right? So I won't spend too much time on this, but essentially, if we go back to our coordinator here, we can see here's that original pipeline that we saw, where we're pulling data from all these different systems, collecting it, and sending it out. But then we see two more steps at the end here, which involve getting a readmission prediction and then returning that prediction. So we can not only deliver data back as part of a smart data fabric, but we can also deliver insights back to users and consumers, based on data that we've aggregated as part of a smart data fabric. 
So in this scenario, we're actually taking all that data that we just looked at and running it through a machine learning model that exists within the smart data fabric pipeline, producing a readmission score to determine whether this particular patient is at risk for readmission within the next 30 days, which is a typical problem that we see in the healthcare space. What's really exciting about what we're doing in the IRIS world is that we're bringing analytics close to the data with integrated ML. So in this scenario, we're actually creating the model, training the model, and then executing the model directly within the IRIS platform. There's no shuffling of data and no external connections required to make this happen. And it doesn't require having a PhD in data science to understand how to do it; it leverages really basic SQL-like syntax to construct and execute these predictions. So it goes one step further than the traditional data fabric example, introducing the ability to deliver actionable insights to our users based on the data that we've brought together. Well, that readmission probability is huge, because it directly affects the cost for the provider and the patient. So if you can anticipate the probability of readmission, and either do things at that moment, or as an outpatient perhaps, to minimize that probability, then that's huge. That drops right to the bottom line. Absolutely. And that really takes us from a data fabric to a smart data fabric at the end of the day, which is what makes this so exciting. Awesome demo. Thank you. Fantastic. Are you cool with people getting in touch with you? Oh, yes, absolutely. You can find me on LinkedIn, Jessica Jowdy. We'd love to hear from you. I always love talking about this topic, so I'd be happy to engage on that. Great stuff. Thank you so much.
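For a rough sense of what a 30-day readmission score represents, here is a hand-rolled sketch with invented coefficients. This is purely illustrative: it is not IntegratedML or the actual IRIS mechanism, which trains and executes the model in-platform via SQL-like statements; the features and weights below are assumptions made up for the example.

```python
# Hedged illustration, not IntegratedML. A tiny hand-set logistic model
# stands in to show what a 30-day readmission probability looks like.
import math

# Hypothetical coefficients; a real model would be trained, not hand-set.
WEIGHTS = {"prior_admissions": 0.9, "age_over_65": 0.6, "chronic_conditions": 0.4}
BIAS = -2.0

def readmission_score(patient: dict) -> float:
    """Probability (0..1) that the patient is readmitted within 30 days."""
    z = BIAS + sum(WEIGHTS[k] * patient.get(k, 0) for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))   # logistic squashing to a probability

high_risk = readmission_score({"prior_admissions": 3, "age_over_65": 1,
                               "chronic_conditions": 2})
low_risk = readmission_score({"prior_admissions": 0, "age_over_65": 0,
                              "chronic_conditions": 0})
```

The point of the integrated approach described above is that this kind of scoring runs where the aggregated data already lives, rather than shipping the bundle off to a separate analytics environment.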
Okay, don't go away, because in the next segment, we're going to dig into the use cases where data fabric is driving business value. Stay right there. Today, more than ever before, organizations are striving to gain a competitive advantage, deliver more value to customers, reduce risk, and respond more quickly to the needs of the business. Now, to achieve these goals, organizations need easy access to a single view of accurate, consistent, and, very importantly, trusted data. If it's not trusted, nobody's going to use it. And they need it all in near real time. However, the growing volumes and complexities of data make this difficult to achieve in practice, not to mention the organizational challenges that have evolved as data becomes increasingly important to winning in the marketplace. Specifically, as data grows, so does the prevalence of data silos, making integrating and leveraging data from internal and external sources a real challenge. Now, in this final segment, we'll hear from Joe Lichtenberg, who's the global head of product and industry marketing, and he's going to discuss how smart data fabrics can be applied to different industries. And by way of these use cases, we'll probe Joe's vast knowledge base and ask him to highlight how InterSystems, which touts a next-gen approach to customer 360, leverages a smart data fabric to provide organizations of varying sizes and sectors, in financial services, supply chain, logistics, and healthcare, with a better, faster, and easier way to deliver value to the business. Joe, welcome. Great to have you here. Thank you. It's great to be here. That was some intro. I could not have said it better myself, so thank you for that. Thank you. Well, we're happy to have you on the show. It's great to be here. You've made a career helping large businesses and small businesses with technology solutions, and then scaling those solutions to meet whatever needs they had.
And of course, you're a vocal advocate, as is your company, of data fabrics. We talked to Scott earlier about data fabrics and how they relate to data mesh, big discussions in the industry. So tell us more about your perspective. Sure. So first I would say that I have been in this industry for a very long time. Like you, I'm sure, I've spent decades working with customers and with technology to solve these same kinds of challenges. For decades, companies have been working with lots and lots of data, trying to get business value from it to solve all sorts of different challenges. And I will tell you that I've seen many different approaches and different technologies over the years. Early on, it was point-to-point connections with custom coding. I worked with integration platforms 20 years ago, with the advent of web services and service-oriented architectures, exposing endpoints with WSDL and getting access to disparate data from across the organization. And more recently, obviously, data warehouses and data lakes, and now moving workloads to the cloud with cloud-based data marts and data warehouses. Lots of approaches that I've seen over the years, and yet challenges still remain in terms of getting access to a single, trusted, real-time view of data. So recently we ran a survey of more than 500 business users across different industries, and 86% told us that they still lack confidence in using their data to make decisions. That's a huge number, right? And if you think about all of the work and all of the technology and approaches over the years, that is a surprising number. Drilling into why that is, there were three main reasons. One is latency: the amount of time that it takes to access the data, process it, and make it fit for purpose. By the time the business has access to the data and the information that they need, the opportunity has passed. It's elapsed time, not speed of light, right?
That too, maybe, but it takes a long time. If you think about these processes, you have to take the data, copy it, run ETL processes, and prepare it. So that's one. The second is the amount of data that's disparate, sitting in data silos; organizations are still struggling with data that is dispersed across different systems in different formats. And the third is data democratization. The business really wants access to the data so that they can drill into it, ask ad hoc questions and then the next question, and see where the information leads them, rather than having sort of pre-structured data and pre-structured queries and having to go back to IT, put the request back on the queue again, and wait. So it takes too long, the data is too hard to get to because it's in silos, and the data lacks context, because it's technical people serving up the data to the business people, and there's a mismatch. Exactly, right? So we call that data democratization: giving the business access to the data and the tools that they need to get the answers they need in the moment. So the skeptic in me, because you're right, I have seen this story before, and the problems seem to keep coming up year after year, decade after decade, but I'm an optimist. And so... As am I. So I sometimes say, okay, same wine, new bottle, but it feels like it's different this time around with data fabrics. You guys talk about smart data fabrics. From your perspective, what's different? Yeah, it's very exciting, and it's a fundamentally different approach. If you think about all of these prior approaches, and by the way, all of these prior approaches have added value, right? It's not like they were bad, but there are still limitations, and the business still isn't getting access to all the data that they need in the moment, right? So data warehouses are terrific.
If you know the questions that you want answered, you take the data and structure it in advance, and so now you're serving the business with sort of preplanned answers to preplanned queries, right? The data fabric, what we call a smart data fabric, is fundamentally different. Rather than taking the data in batch mode and making it fit for purpose, with all the complexity and delays associated with that, with a data fabric we're accessing the data on demand, as it's needed, as it's requested by the business or by applications or by the data scientist, directly from the source systems. So you're not necessarily copying it. You're not FTP-ing it, like, I've got it, you take it. You're basically using the same source. You're pulling the data on demand as it's being requested by the consumers, and then all of the data management processes that need to be applied, integration and transformation to get the data into a consistent format, business rules, analytic queries, and, as Jess showed with machine learning, predictive and prescriptive analytics, all sorts of powerful capabilities are built into the fabric. So as you're pulling the data on demand, all of these processes are being applied, and the net result is that you're addressing the limitations around latency and silos that we've seen in the past. Okay, so InterSystems has a lot of customers in different industries: supply chain, financial services, manufacturing, and we just covered healthcare. What are you seeing in terms of applications of smart data fabrics in the real world? Yeah, we see it in every industry. InterSystems, as you know, has been around now for 43 years, and we have tens of thousands of customers in every industry, and this architectural pattern is now providing value for really critical use cases in every industry.
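The on-demand pattern contrasted here with batch ETL can be sketched as follows. This is illustrative Python only, not a real product API; the source systems, fields, and transformation rules are assumptions invented for the example.

```python
# Sketch of the on-demand fabric pattern described above (illustrative, not
# a product API): instead of batch-copying data, pull from the source
# systems at request time and apply transformations in-flight.

# Hypothetical live source queries; in practice these would hit the
# systems of record directly.
def query_crm(customer_id):     return {"name": "Acme Corp", "region": "emea"}
def query_billing(customer_id): return {"balance_cents": 125000}

def normalize(record: dict) -> dict:
    """In-flight transformation: consistent field names and units."""
    return {
        "name": record["name"],
        "region": record["region"].upper(),
        "balance_usd": record["balance_cents"] / 100,
    }

def customer_view(customer_id: str) -> dict:
    """Assemble a single consistent view on demand: no copies, no ETL delay."""
    merged = {**query_crm(customer_id), **query_billing(customer_id)}
    return normalize(merged)
```

The design point is that `normalize` runs inside the request path, so the consumer always sees a fresh, consistent view rather than last night's batch copy.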
So I'm happy to talk about some that we're seeing. I could actually spend three hours right here; I'm very passionate about working with customers, and there are all sorts of exciting cases. What are some of your favorites? So obviously, supply chain right now is going through a very challenging time, with the combination of what's happening with the pandemic and disruptions. And now I understand eggs are difficult to come by; I just heard that on NPR. Yeah, and it's in part a data problem, in big part a data problem, is that fair? Yeah. So in supply chain, first there's supply chain visibility. Organizations want a real-time, or near real-time, expansive view of what's happening across the entire supply chain, from supply all the way through distribution. That's only part of the issue, but it's a huge real-time data silos problem. If you think about your extended supply chain, it's complicated enough with all the systems and silos inside your firewall, before you even get to your suppliers; even just thinking about your tier-one suppliers, let alone tier two and tier three. And then building on top of real-time visibility is what the industry calls a control tower, what we call the ultimate control tower. It has built-in analytics to sense disruptions and exceptions as they occur, predict the likelihood of those disruptions occurring, and then provide data-driven and analytics-driven guidance on the best way to deal with them. So, for example, an order is missing line items, or a cargo ship is stuck off port somewhere. What do you do about it? Do you reroute a different cargo ship? Do you take an order that's en route to a different client and reroute that? What's the cost associated with it? What's the impact associated with it? So that's a huge issue right now around control towers for supply chain. So that's one. Can I ask you a question about that?
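One simple way to picture the exception-sensing piece of a control tower is a statistical threshold over historical transit times: flag a shipment when it falls far outside the norm. This is a hedged sketch with assumed data shapes, not InterSystems functionality; a real control tower would use far richer signals and trained models.

```python
# Illustrative control-tower sketch (assumed data shapes, not a product API):
# sense exceptions by comparing a shipment's transit time against a simple
# statistical threshold derived from history.
from statistics import mean, stdev

historical_days = [5, 6, 5, 7, 6, 5, 6]   # past transit times for this lane, in days

def is_disruption(transit_days: float, history: list, k: float = 2.0) -> bool:
    """Flag a shipment as an exception if it is k standard deviations past the mean."""
    return transit_days > mean(history) + k * stdev(history)
```

A cargo ship stuck off port would show up as an exception the moment its elapsed transit time crosses the threshold, which is what triggers the "what do you do about it" guidance discussed above.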
Because you and I have both seen a lot, but we've never seen, at least I haven't, the economy completely shut down like it was in March of 2020. And now we're seeing this sort of slingshot effect, almost like when you're driving on the highway: you don't know why, but all of a sudden you slow down, then you speed up, you think it's okay, then you slow down again. Do you feel like you guys can help get a handle on that? Because it goes both ways: sometimes you can't get the product, and sometimes there's too much product, and that's not good for business either. Yeah, absolutely. You want to smooth out the peaks and valleys, and that's a big business goal, a business challenge, for supply chain executives. You want to make sure that you can respond to demand, but you don't want to overstock, because there's cost associated with that as well. So how do you optimize the supply chain? It's very much a data silo and real-time challenge, so it's a perfect fit for this new architectural pattern. Right, what else? So if we look at financial services, we have many, many customers in financial services. That's another industry with many different sources of data that all hold information organizations could use to really move the needle, if they could just get to that single source of truth in real time. We bucket many of the implementations and use cases that we do around what we call business 360 and customer 360. With business 360, there are all sorts of ways to add business value by having a real-time operational view across all of the different geos and parts of the business, especially in these very large, global financial services institutions like capital markets and investment firms and so forth. So around business 360: having a real-time view of risk, operational performance, regulatory compliance, things like that.
With customer 360, there's a whole set of use cases around hyper-personalization of customers in real time, next best action, looking at how you can sell more, increase share of wallet, and cross-sell and upsell to customers. We also do a lot in terms of predicting customer churn. If you have all the historical data and you know the likelihood of customers churning, you can proactively intercede, and it's much more cost-effective to keep assets under management and keep clients than to go out and win new clients for the firm. There's a very interesting use case from one of our customers in Latin America, Banco do Brasil, the largest bank in all of Latin America. They have a very innovative CTO who's always looking for new ways to move the needle for the bank, and one of their ideas, which we're working with them on, is to generate net new revenue streams by bringing new business to the bank. They identified a large percentage of the population in Latin America that does no banking. So these people have no banking history, not only with Banco do Brasil, but with any bank, and there's a fair amount of risk associated with offering services to this segment of the population that's not associated with any banks or financial institutions. Yeah, there's no historical data on them. None. So it's a data challenge. And so they're bringing in data from a variety of different sources, social media, open-source data that they find online, and so forth, and, with us, running risk models to identify the citizens for whom there's acceptable risk to offer their services. There's going to be a huge market of unbanked people in Latin America. Wow, that's interesting. Totally. And if you can lower the risk, you can tap that market and be first. And they are, yeah. Yeah, so very exciting.
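The unbanked-risk idea, combining signals from alternative data sources into a risk score with an acceptance threshold, might look conceptually like this. Purely illustrative: the features, weights, and threshold are invented for the sketch and have nothing to do with Banco do Brasil's actual model.

```python
# Hedged illustration of the pattern described (not the bank's real model):
# combine signals from alternative, non-bank data sources into a risk score,
# then accept applicants whose score falls below a threshold.

# Hypothetical feature weights; a real model would be trained on data.
def risk_score(applicant: dict) -> float:
    """Higher score = higher risk. Features come from non-bank sources."""
    score = 1.0
    score -= 0.3 * min(applicant.get("years_at_address", 0), 10) / 10  # stability
    score -= 0.2 * applicant.get("utility_payments_on_time", 0)        # payment-history proxy
    score -= 0.1 * applicant.get("verified_income", 0)                 # open-source/declared data
    return max(score, 0.0)

def acceptable(applicant: dict, threshold: float = 0.6) -> bool:
    """Offer services only when the modeled risk is acceptable."""
    return risk_score(applicant) <= threshold
```

The interesting design point in the use case above is not the scoring itself but that the inputs come from many external, siloed sources, which is exactly the aggregation problem the fabric is meant to solve.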
In manufacturing, we know Industry 4.0, which is about taking the OT data, the data from the MES systems and the real-time streaming data from the machine controllers, and integrating it with the IT data, your data warehouses, your ERP systems, and so forth, to have not only a real-time view of manufacturing from supply and source all the way through demand, but also predictive maintenance and things like that. So that's very big right now in manufacturing. Kind of cool to hear these use cases beyond healthcare, which is obviously your wheelhouse. Scott defined this term of smart data fabrics as different from data fabrics. So when we think about these use cases, what's the value-add of so-called smart data fabrics? Yeah, it's a great question. So we did not define the term data fabric, or enterprise data fabric. The analysts are now all over it; they're all saying it's the future of data management. It's a fundamentally different architectural approach: being able to access the data on demand. The canonical definition of a data fabric is to access the data where it lies and apply a set of data management processes, but interestingly, it does not include analytics. And we firmly believe that most of these use cases gain value from having analytics built directly into the fabric, whether that's business rules, or predictive analytics to predict the likelihood of a customer churning or a machine on the shop floor failing, or prescriptive analytics: if there's a problem in the supply chain, what's the guidance for the supply chain managers to take the best action, right? Prescriptive analytics based on data. So rather than taking the data in the data fabric and moving it to another environment to run those analytics, with the complexity and latency that involves, having all of those analytics capabilities built directly into the fabric, which is why we call it a smart data fabric, brings a lot of value to our customers.
So it simplifies the whole data lifecycle, the data pipelining, the hyper-specialized roles that you have to have. You can really just focus on one platform. Exactly. Yeah, it's a simpler architecture and faster speed to production. A big differentiator for our technology, for InterSystems IRIS, is that most, if not all, of the capabilities that are needed are built into one engine, right? So you don't need to stitch together 10 or 15 or 20 different data management services: a relational database and a non-relational database and a caching layer and a data warehouse and security and so forth. And you can do that; there are many ways to build this data fabric architecture, right? InterSystems is not the only way. But if you can speed up and simplify the implementation of the fabric by having most of what you need in one engine, one product, that gets you where you need to go much, much faster. Joe, how can people learn more about smart data fabrics and some of the use cases that you presented here? Yeah, come to our website, intersystems.com. If you go to intersystems.com slash smart data fabric, that'll take you there. I know that you probably have dozens more examples, but are you cool if people reach out to you? How can they get in touch? Oh, I would love that. Feel free to reach out to me on LinkedIn. It's Joe Lichtenberg; I think it's linkedin.com slash Joe Lichtenberg. I'd love to connect. Awesome. Joe, thanks so much for your time. I really appreciate it. It was great to be here. Thank you, Dave. All right, I hope you've enjoyed our program today. We heard from Scott Gnau, who helped us understand this notion of data fabrics and smart data fabrics and how they can address the data challenges faced by the vast majority of organizations today. Jess Jowdy's demo was awesome; it was really a highlight of the program, in which she showed the smart data fabric in action.
And Joe Lichtenberg, we just heard from him, dug into some of the prominent use cases and proof points. We hope this content was educational and inspires you to action. Now don't forget, all these videos are available on demand to watch, re-watch and share. Go to thecube.net, check out siliconangle.com for all the news and analysis and we'll summarize the highlights of this program. And go to intersystems.com because there are a ton of resources there. In particular, there's a knowledge hub where you'll find some excellent educational content and online learning courses. There's a resource library with analyst reports, technical documentation, videos, great freebies. So check it out. This is Dave Vellante on behalf of theCUBE and our supporter, InterSystems. Thanks for watching and we'll see you next time.