Hello and welcome. My name is Mark Horstman, Data Evangelist with Dataversity. We would like to thank you for joining the latest installment of the monthly Dataversity webinar series, Advanced Analytics with William McKnight. Today, William will be discussing choosing your provider for implementing a data fabric. Just a couple of points to get us started. Due to the large number of people that attend these sessions, you will be muted during the webinar. For questions, we will be collecting them via the Q&A section. If you would like to chat with us or with each other, we certainly encourage you to do so; just note that the Zoom chat defaults to sending to just the panelists, but you may absolutely switch that to network with everyone. To open the Q&A or the chat panel, you may find the icons for those features in the bottom middle of your screen. To answer the most commonly asked question, as always, we will send a follow-up email to all registrants within a couple of business days containing the links to the slides. And yes, we are recording and will also send a link out to the recording of this session, as well as any additional information requested throughout the webinar. Now, let me introduce to you our speaker for this series, William McKnight. William has advised many of the world's best-known organizations; his strategies form the information management plan for leading companies in numerous industries. He is a prolific author and popular keynote speaker and trainer. He has performed dozens of benchmarks on leading database, data lake, streaming, and data integration products. William is a leading global influencer in data warehousing and master data management. He leads McKnight Consulting Group, which has twice placed on the Inc. 5000 list. And with that, I will give the floor to William to get today's webinar started. Hello and thank you, Mark. I trust that my slide is being shown appropriately here. It sure is. 
Alrighty. Well, thank you. And welcome, everybody, for an important topic, one that is increasingly dominating, I guess, the buzz in data management circles these days. And that's data fabric. And I saw that this was going somewhere last year, around mid-year, when Microsoft was introducing Microsoft Fabric. And we'll get to that. But I thought it was going to go somewhere. And we have definitely seen this happen. So a lot of our clients are asking for data fabric. But I can guarantee every one of them has a different opinion on what that is. But they know they need it. And I think you need it too. But I think we need to know what we're needing here, what we're asking for, and what we're possibly getting. Now, I definitely lean into the products that are out there and available for your data fabric. And that's why I've titled this presentation Choosing Your Provider for Implementing a Data Fabric. But indeed, data fabric is an architectural construct. And it is something that you can put together yourself. And we'll get to that. We'll get to the DIY approach versus a product approach. But interest in building these is very high, and mostly for good reason. It enables real-time trustworthy data across multiple clouds, which we all have. And you can expedite the migration of analytics to the cloud, guarantee security, get that one panel of access into all of your data, accelerate governance, and quickly generate business value. I think it's definitely providing a lot of value out there. It takes a while to get there. I can't say anybody has a fully mature data fabric in place yet. It's a work in progress. And definitely these products that are data fabric products are works in progress, for sure. And the method for building a data fabric had, prior to I guess mid last year, been assumed to be a messy do-it-yourself blend of everything, right? Data warehouses, data lakes, data integration tools, data governance tools, et cetera. 
But some of these advancements have moved this to more streamlined and integrated solutions that simplify the process. So, now, one more thing. In my study of data fabric for this presentation, I thought I knew data fabrics. I started studying for this, and I realized there's so much more to know. And the more I learned, the more I realized I need to learn about this. So just know that we're all in that boat when it comes to data fabrics. There's still going to be some sloppiness and some messiness around what this definition is. But we'll get there. Something related here, I thought I would share with you, is the publication of our Enterprise Contribution Ranking Report for Enterprise Data Integration. It's now published and available. We went through all the criteria that you see there for all the products that you see there. It's the most technical, competitive report of its kind in the market. And you can get it at that memorable URL you see at the top there from one of the companies that purchased reprint rights. Now, first, let's talk about the data mesh. Why are we talking about the data mesh when we're here to talk about the data fabric? Well, it's because there's a lot of confusion out there about the differences. And I think they work together really well. And so you can definitely pull ideas from both of these into your data architecture. And you will not go wrong. But what is a data mesh? Okay, I'll give it to you straight. I'll give it to you from a working definition standpoint, not really something out of analyst speak. So I'm going to say that I feel like it's an acknowledgement of the decentralized nature of our data environments these days. And that means that we have multiples of these things. We have multiple data warehouses, data lakes, data integration environments, as shown here. 
And so this brings a little bit of, shall I say, science into the picture to help make sure that when we're doing that, they're still working together. There are still common pieces in play here that allow us to work together and get a cohesive whole going. A lot of the same objectives as a data fabric, by the way. So this helps with concepts like a customer being defined differently across different parts of the company; it helps get that onto a single page, in regards to things like that. It allows for more adaptive governance, governance that has more leverage in the organization, because there are connection points between all the various sub data architectures within the organization. This concept of a data mesh is not attached to a vendor. It's consultant led and driven. The mesh aims to decentralize every component based upon what I call business domains and what has now come to be called a data product, which is confusing to me because I thought my clients were building data products when they were building products based on their data that they then sold to the market. So it's not that. So let's put that definition aside, because this definition of data product has taken over. So it might be departmental, but more likely, and the science says, to make it along the lines of a business domain, something like customer, product, et cetera. So each is managed in their own architecture with shared data channels, shared components, et cetera. Now, you cannot, I will say, build a great data mesh or a data fabric without being pretty good at data warehousing and data laking, if that's a word. So with a lot of these things, as you can see, the data warehouse is there. It's a component of the overall architecture, as it will be for the data fabric as well. But sort of the assumption here as you look at this is there's a great data warehouse there. 
And if there's not a great data warehouse there, you're doing all this stuff around it, and it's best to put your energies into fortifying that data warehouse. So you want to bring the data warehouse to a certain level of maturity before trying to make a mesh out of it, before trying to make a fabric out of it. Hopefully that goes without saying, but I wanted to say it because I think we're moving forward, hard charging into the future sometimes with these architectural constructs and trying to gloss over our problems. So these are complementary approaches, the data mesh and the data fabric. Some of the principles and benefits of the data mesh are domain ownership, ownership where it belongs. So these domains offer a bounded context where a team will own a domain within the organization. It's data as a product. I love that concept, where data sets are exposed via APIs. There's a catalog over the top. And with federated governance, you have global standards of service levels and quality. There's lots of self-service inherent within a data mesh, and it's frequently compared to microservices. It has that kind of a feel to it, that kind of an idea, breaking a bigger thing into smaller pieces. And some of the principles are data democratization, get this data out to everybody that needs it within the organization, and cost efficiencies, because you're really getting good at a few products. They just happen to be across the organization, and you're reducing technical debt with the data mesh. But okay, that's the data mesh. Hopefully I'm not confusing you, because I'm switching to the data fabric here. And again, they work together really well. So what is the data fabric? Well, I don't know how to best graphically represent this. And I looked around and nobody seems to do anything more than what I've done here, which is: there's your architecture, and I threw something over the top. A piece of fabric, if you will. It's really kind of that. 
It's a fabric over the top of your existing architecture. Now hopefully, or quite possibly, the architecture is a data mesh behind the fabric. All right. So the fabric is that unified and integrated architectural pattern that provides a consistent and scalable approach to managing and accessing data across multiple sources and formats. There are some commonalities that organizations desperately want to have across all of their data, like security, like one panel of access to all the data, different types of components of data governance. Okay, the fabric provides that. And so that is the great thing about fabric, plus it has a lot of connectivity to many, many data sources that you have in your organization. So lakes, warehouses, all sorts of operational sources as well. So fabric provides these common shared services, connectivity, and application portability. The fabric is all about using metadata to automate the management and governance of data. So more automation is made possible by providing a unified, centralized platform for data management, integration, and governance. And driving these data quality rules and different types of data analysis throughout the organization, no matter where the data may be, is really a great thing. Now, a lot of us know what data virtualization is. You might think of the fabric as data virtualization on steroids, with a lot more componentry involved in the process. So data virtualization, to me, is like a logical data fabric. And still important in the mix too, by the way. Now there's another term, debated as to whether I should bring this up here or not, but it is kind of thrown about in context of the data fabric and can be a confusing point as well. So much going on. And that's the data cloud. And so what is the relationship of the fabric to the cloud? The fabric can operate on top of a data cloud. Now, so what's the difference? What's the data cloud? 
So just like I said five minutes ago, yeah, the architecture for that data warehouse better be good. Well, what is around that data warehouse? What other components are there to that data warehouse? Now, to me, the data clouds seem like a play by various vendors to try to make sure that everything you need, you get from them for your data architecture. So you need a catalog. Yeah, we got it. Data integration. Data warehouse, of course. Data access. We got it here. We're bundling it in a cloud. And I'm calling that the data cloud. And I'm not disparaging it, by the way. But that is something that works in conjunction with a data fabric. A data fabric is platform agnostic. It can be implemented to integrate data residing on multiple data clouds, on-premises data centers even, or even edge devices. So this gives you great flexibility and adaptability to specific organizational needs. The data fabric is the architectural blueprint for connecting and managing data across the diverse environment. The data cloud is a specific platform or services that provide data storage, processing, and analytics capabilities, leveraging cloud technologies. So I hope that helps. Now, let's look at the factors driving the need for data fabric. I don't need to go through these. It's everything that's driving us towards this idea that data is important to the organization. And it's becoming increasingly important. And I have seen this time and again over the past few years, organizations readily acknowledging now that it's all about their data. And it's all about data getting ready for AI and that sort of thing. And these are the factors driving the need for data fabric. Because a fabric helps you organize the architecture that you have and brings some commonalities to it that, again, the organizations out there desperately need to have. So big data, cloud computing, edge computing, they've all contributed to this increasing complexity of data management. 
So some of this is not new. We're just naming it now. A data architecture that connects an organization's data sources and processing within a unified layer, allowing universal access through a single interface: that's the data fabric. It standardizes data to overcome differences in formatting, schema, and file types, making it valuable for end users in various use cases. So what are the principles of a data fabric? Hopefully it's coming together a little bit. Data fabric is intelligent and automated. So it doesn't just sit there and connect to sources. It uses metadata and semantic models to understand what's going on in there. It's a step towards overall automation. It's going to allow for more automation down the road because it unifies disparate data systems. It embeds governance, strengthens security and privacy, and provides more data accessibility to workers, particularly to business users. A data fabric can allow decision makers to view all data more cohesively to better understand, for example, a customer lifecycle. It counteracts the problem of data gravity, which is the idea that data becomes more difficult to move as it grows in size. It operates around the idea of loosely coupling data in platforms with the applications that need it. And we'll get into some of the vendors behind the data fabric here shortly. But the vendors have definitely picked up on this. The benefits of the fabric are integrated intelligence. Like I said, more than just, here's your data, we know where it is now, come and get it. It's doing things. Data democratization. This gives everybody access, well, in a fully mature fabric, everybody access to all the relevant data. It improves data security because security policy can now be enforced across the board. It gives you universal access to all data sources. It standardizes the data formats, because the data that I've seen in organizations is in so many different formats. 
But this is going to standardize the data that you see, that you get back from these various sources. Reduce complexity for end users, improve data integration and standardization. And it enables access to data from various sources across the organization. So one thing that I'll hone in on here is the data security, because it's going to give you better data protection, which is something that, let's just say, I'm asked about all the time these days. It's a huge risk to organizations that the wrong data gets in the wrong hands. The broadening of data access also doesn't mean compromising on data security and privacy. In fact, it means that more data governance guardrails are put in place around access controls, ensuring specific data is only available to certain roles. Some of these data fabric initiatives really come from data security, because this is the benefit that some organizations are glomming onto and saying, that is what I need, and that is why you're getting into a data fabric. Now, there's probably a good 10 reasons why you might want to get into a data fabric, and you may not care about the others. Now, I suggest that you probably do care about all of them at some level, and you will care about all of them in due time. But if the data security aspect of a data fabric, for example, if that cuts the mustard on your ROI for the project, go for it. Go for a data fabric. Am I advocating a data fabric here today, or just here to tell you about it? I am advocating a data fabric now. So some of the challenges, some of the big challenges, are if you're masking a poor data architecture. That's always a bad idea. Now, I must admit that your data maturity might be allowed to be a little bit lower if you're going to implement a fabric, but not so much lower that it isn't still good. Hope that made sense. So still be sure you have a good underlying data architecture before you turn it into a fabric. 
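The governance guardrails just described, where access controls ensure specific data is only available to certain roles, can be pictured with a small sketch. This is a hypothetical illustration, not any fabric product's API: a central policy maps roles to the fields they may see, and every result passing through the fabric layer is filtered against it.

```python
# A minimal sketch of fabric-style access guardrails: a single, central
# policy maps roles to the fields they are entitled to, and a redact()
# step enforces it on every record. All names here are invented.
POLICY = {
    "analyst": {"customer_id", "region", "lifetime_value"},
    "marketing": {"customer_id", "region"},
}

RECORD = {
    "customer_id": 42,
    "region": "EMEA",
    "lifetime_value": 1830.50,
    "ssn": "123-45-6789",   # sensitive; granted to no role in POLICY
}

def redact(record, role):
    """Return only the fields the given role is entitled to see."""
    allowed = POLICY.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}

print(redact(RECORD, "marketing"))  # ssn and lifetime_value are stripped
```

The point of the sketch is that the policy lives in one place, so broadening data access does not mean loosening control: adding a source or a consumer does not require re-implementing security in each application.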
Some of the key features of a fabric, and hopefully this is starting to gel and come out for you: the connectivity across all data sources and processing. To me, this is usually a number one consideration in picking a data fabric. You want access to all data sources and processing that you have today, that you may have in the future, and that you can think about. It's a unified layer with a single interface, accessible regardless of location, again on premises or on the edge, and it standardizes data formats for you. It's a unifying layer that connects all of an organization's data sources and processing within a single interface. It eliminates the need to deal with the complexities of different locations and formats, providing seamless access to all data regardless of its origin. So what are the components of a data fabric? Hopefully this is starting to gel, right? Data integration, yeah, because you're going to be moving data with your data fabric. Data governance, putting those security policies, those data access policies, and everything else into the entirety of your data organization. Data catalog. Data virtualization, so all data is at your disposal. Data analytics and data security. These are the components of a data fabric. Is this what all of them provide, all of the so-called data fabric vendors? Pretty much yes, at this level, absolutely. I will get into what you can expect to get, where the most important, or the most, I'd say, beneficial components of a commercial data fabric are, and where some of the lesser important ones are, as we go along here. The use cases for a data fabric are, heck, I hate to even create a list, but these are some of the common things I've seen, but there are so many more. And you can think of them, I'm sure, for your own organization. It's really a data architecture. It's a great data architecture. 
By integrating across various data sources, data scientists can create a holistic view of the customers, and data analysts can create a holistic view of products and sales. The scope and scale that it can handle is big, as it eliminates data silos. So how about artificial intelligence? Everybody's getting into artificial intelligence. And most new applications, at least that I can see, have a component of artificial intelligence in them, if they're not full-on AI projects. Data fabrics can help AI applications to be more accurate. By providing AI applications with access to a wider variety of data, the fabrics can help to ensure that the models are trained on better, more representative data. Data fabrics can help AI applications to be more efficient. Data fabrics can help AI applications to be more scalable. All right, these are some of the things you want to take off the table when you step into AI. As a matter of fact, what I've seen over and over again throughout the years, and I'm sure you have as well, if you think about it: there's a lot of great ideas that executives and others get for an organization. But then they think about, well, let's look at that data hurdle that we have to get over to get there. And I don't know if I'm up for that challenge right now. So since there's a big old data hurdle out there, let's walk away from this great project that I can think of. But what if that were not such a hurdle? And in my view, data fabrics can really help out with that. So you can get down to business quicker. You can get these projects done quicker because it makes your data much more pliable for various uses within the organization. We've got to get away from this idea of, I'm building this application, one and done, I have no consideration for anything else, and that sort of thing. And we must do this with a great architecture, of which I think a fabric provides a great component. 
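The holistic customer view described above comes from federating differently shaped sources behind one common schema. Here is a minimal sketch of that idea, assuming two hypothetical sources (a "warehouse" and a "CRM", both faked with in-memory SQLite databases) and invented table and field names; a real fabric does this across many systems and transports.

```python
import sqlite3

# Two stand-in sources, each with its own schema for "customer".
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE customers (cust_id INTEGER, name TEXT)")
warehouse.execute("INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex')")

crm = sqlite3.connect(":memory:")
crm.execute("CREATE TABLE accounts (account_no INTEGER, account_name TEXT)")
crm.execute("INSERT INTO accounts VALUES (3, 'Initech')")

# The "fabric" layer: a per-source query that maps local columns onto one
# canonical shape, so callers see a single logical customer entity.
SOURCES = [
    (warehouse, "SELECT cust_id, name FROM customers"),
    (crm, "SELECT account_no, account_name FROM accounts"),
]

def query_customers():
    """Federate all sources into one standardized result set."""
    rows = []
    for conn, sql in SOURCES:
        for cid, name in conn.execute(sql):
            rows.append({"customer_id": cid, "customer_name": name})
    return rows

print(sorted(r["customer_name"] for r in query_customers()))
# → ['Acme', 'Globex', 'Initech']
```

The consumer never learns which system a row came from, which is exactly the "as if it were stored in one place" property the talk keeps returning to.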
Data fabric in action integrates data from various sources, like Snowflake, Kafka, Salesforce. It enables data reuse for various different use cases. I just mentioned that. And it reduces time and cost for data re-engineering. Fabrics enable the integration and analysis of data from diverse sources for various use cases. This allows for data reuse, improved decision making, and reduced costs associated with data re-engineering. Solving these data distribution challenges addresses the scattered data landscape. It solves the challenge of scattered data across various internal and external systems and enables organizations to access all their data from any location as if it were stored in one place, as Microsoft would say, in one lake, and presented in a clear, business-friendly format. This is going to simplify data analysis and decision making across the organization. Now, there are components of physical data fabric and logical data fabric in all data fabrics, okay? They don't all have to be 100% one or the other. As a matter of fact, ultimately you're going to have some logical data fabric in the mix. But if you can give that logical data fabric an assist with some physical data fabric components, that is going to be great. Now, that's a decision that you have to make as you step into your architecture and look and decide where you are going to place your energy. There are so many ways you can go with it. Data cloud, data mesh, data fabric, artificial intelligence. Should we consolidate? Should we not? There's which projects make sense, and our data quality and our data security. What about all that? There's no end. Now, data fabric can be implemented in either one of these ways: physically moving the data or virtually accessing it. Physical fabrics centralize data in a common environment, kind of like the data warehouse or the data lake idea. While logical fabrics access data wherever it resides. 
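The physical-versus-logical distinction just drawn can be sketched in a few lines. This is a toy illustration with an invented source function, not any product's behavior: the physical style copies (materializes) data into a central store once, while the logical style leaves data where it is and fetches on demand.

```python
# A hypothetical remote source, standing in for a warehouse, API, etc.
def fetch_orders():
    return [{"order_id": 1, "total": 99.0}, {"order_id": 2, "total": 25.0}]

class PhysicalFabric:
    """Physical style: data is moved into a central store at load time."""
    def __init__(self):
        self.store = {"orders": fetch_orders()}  # copied once, read locally

    def read(self, name):
        return self.store[name]

class LogicalFabric:
    """Logical style: nothing moves; each read goes back to the source."""
    def __init__(self):
        self.sources = {"orders": fetch_orders}

    def read(self, name):
        return self.sources[name]()  # fetched on demand, always current

# Both expose the same read() interface, so consumers don't care which
# style (or mix of styles) sits behind a given data set.
for fabric in (PhysicalFabric(), LogicalFabric()):
    print(fabric.read("orders"))
```

The trade-off the sketch hints at is the one to test in a proof of concept: the physical copy reads fast but can go stale and costs storage and movement; the logical read is always current but pays source latency on every access.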
And again, you'll probably end up with some combination of this. Everything's ultimately physical, right? But who's to say you don't need to do some more consolidation of your data before you put the logical data fabric on it? Well, that's a decision that you have to make by looking at your situation, looking at your maturity, and, I would say, definitely doing some proof of concept with data fabric to see what kind of performance you're going to get, and growing it out from there as always, right? So, implementation considerations: your time and money investment in this, and what value do you intend to realize? Now, it's not good enough for me to sit here and say you need a data fabric, it's a great data architecture for you. It has to make sense from a business standpoint. It has to improve applications. As a matter of fact, I can't sit here and say, well, your application is going to be improved for the next 10 years, that's good, and expect a general statement like that to win the day. I've already asked: what if you could have your data as a service to the rest of the organization, and they don't have to think about that data mountain that they have to get over in order to do a good business idea? I've already said that, but what if that mountain wasn't there? Where would the value come from? Now, would you get deployments up and running quicker and save money in the deployment cycle? Sure. Yes, you would. And would getting projects up and out the door quicker mean that you start realizing value quicker? Absolutely. That's a good thing. But there's so much more to it. There are so many more business capabilities that you might enable. What you ultimately need to do is look at your next several applications that will use the data fabric and look at how they're going to benefit from it. And they might benefit because you can build them quicker and cheaper. Great. 
They might also benefit from just being able to be better, like predictive maintenance, for example, be better at what they're trying to do for the business. And wouldn't that be great? Yes. How much better? Well, this all gets into return on investment, but ultimately, you end up having to do something of that nature to step into this. Now, I hope the guy in the blue shirt is okay. I'm not sure what he's doing there. All right. Now, this is a little bit about how it does the things it does and some of the things that you want to start to look at in these data fabric tool sets, or if you're doing it yourself, what you'll need to build. You want automatic, more or less automatic data integrations based on artificial intelligence, based on metadata. Advanced data fabrics leverage AI and metadata to automate data integrations, enabling efficient and scalable deployments. If there's any big area of data management out there that is having cycles pulled out of it left and right because they're not needed anymore due to artificial intelligence, it's data integration. Data integration has now become like merging libraries, but the books speak different languages, so you just got to slot them into the right place. Semantic inferencing is helping tremendously. It untangles the data meaning, matches up that data smartly, cleans and fixes all the data bumps that you're going to have out there and enriching the data otherwise, and improving schema translation. Schemas are different and can lead to a lot of confusion doing it oneself. And so the data fabric helps with all of this. All data fabrics on the market are semantic driven, and that's an absolute must, and fortunately they are. Fortunately, I will say that the market has not flooded us with an overuse of data fabric yet, the term data fabric yet. While a data fabric could have a much more shallow meaning than what I'm presenting here, it can do one or two things, I suppose, and you could call it a data fabric. 
So far the market has resisted calling those things data fabric, and that's only good for us. However, as you, the buyer of such things, start to insist on data fabric: I want a data fabric, I need a data fabric. As that term starts picking up, and I think it will, we're going to start seeing more watered-down data fabric usage out there. So beware; now's a good time to look at these products, as we'll do now. Key considerations for choosing a provider. Now, before I get into the considerations, let's just look at the do-it-yourself approach to a data fabric. And I'm not here to say you must get a tool or you can't get the fabric. You certainly can do it yourself. As a matter of fact, we did. We did a benchmark recently where we took a do-it-yourself approach. We stitched together several pieces, which I'll get to, and did our own. And then we did enterprise things with that environment. We projected the cost of that out a good few years. Did the same thing with a data fabric provider. And I can tell you, it was SAP Datasphere. We've come to know that product pretty well. But there are other products out there. And I'll carry on with what we found as we go along here. But while I'm mentioning a product, let me mention some of the others you may have heard of: Appian, Cambridge Semantics, Cinchy, Denodo has something, Informatica has something, K2View, I recall, Microsoft with its Fabric, let me get to that, Solix, Stardog. Not all of the magnificent seven, I can say it, have jumped into this fray yet. It's all a work in progress. And again, I think this is something that is going to pick up, and the rest of them are going to jump in, and we're going to be awash in fabric here pretty soon. Like the Microsoft Fabric. Let me take a minute on that, because that's kind of the 800-pound gorilla in the room. I get asked about it all the time. We've been on Fabric, Microsoft Fabric that is, since probably the first month that it was in pre-release. So we've been using it. 
Come to know a little bit about it. First question I get is, is it a data fabric? Because they're not calling it one. They're calling it Microsoft Fabric. Well, they can call it whatever they want. They are Microsoft, by the way. But I'm saying it is very much like a data fabric, and the day may come when they call it that. It's an end-to-end cloud-based software as a service solution for data and analytics offered, obviously, by Microsoft. The focus is on data integration, data engineering, data science, and business intelligence. Now, the pricing for Microsoft Fabric warrants its own webinar. So I don't want to get into it too much here because it's a deep pit, but it's priced by Azure stock keeping units. And I think we're all trying to figure it out at this point. I feel like this enhancement of Synapse, plus the lakehouse concept, is going to catch on fast. And the Fabric is one data security, one governance, one compliance framework, and no more using separate analytics services from different vendors, as Microsoft Fabric offers a solution that really takes out any potential competition by having the power to unify data. Now, I say Microsoft Fabric, and I'm telling you about Microsoft Fabric, but a lot of the products I could say this for as well. One of the special things, I guess, about Microsoft Fabric is its Copilot capabilities. We've all heard about Copilot, right? And some of these capabilities are soon to be launched, but you're going to be able to use large language models to help you with your coding directly in your workspace, and other powerful capabilities: Fabric helps you monitor your organization's data in real time, and you can trigger actions based on data insights. Now, Databricks is a partner in Fabric, but Microsoft has more or less become a competitor to it with its data lakehouse approach. So watch this space; there's going to be things happening here, I'm sure. So keep an eye on this. 
I think Microsoft Fabric is a great evolution of Synapse. And as I think about it, I think that about the other fabric products out there too. So anyway, I've kind of taken an aside on that particular product, but it is one of the products that, obviously, is not a do-it-yourself approach to a data fabric. Why would you want a provider approach like that? Well, again, we did a study, and here's some of what we showed. We compared the tasks of creating, integrating, distributing, and managing a data fabric with a common do-it-yourself set against a leading fabric provider, which, I've already mentioned, is SAP Datasphere. Our do-it-yourself data fabric stack included a leading data governance tool, a leading data integration tool, an augmented S3 for the data lake, and a leading tool for the data warehouse. And in our study, again, projecting to the enterprise over three years, a small data fabric will cost about $716,000 with a product and $2.2 million without; a large one, $2.8 million with and $8.7 million without. Now, anytime I mention cost, and some of these are mind-boggling costs, right, I will also mention that the return on investment is there in a well-done data fabric. And that's what you want to focus on, not just measuring out various costs necessarily, but measuring out the output, the return. And it is there with a fabric. Now, with one of these commercial fabrics versus a do-it-yourself approach, there are differences. And here's where we found the differences were. There are large differences, large meaning large in a positive way for the commercial fabric: system administration, a lot easier; data modeling, a lot easier; data security, a lot easier. Somewhat easier: data acquisition, integration, and the consumption. Small, but still there: data preparation. And we didn't really find the catalogs that are part of the data fabric products to necessarily be that much different or better than a commercial data catalog. 
So it's to the point where I wouldn't necessarily want the catalog from the fabric provider. Yes, that'll add overhead; to each their own on that, but it's something to look for. Don't expect huge catalog bumps when you go to a fabric. Now, choosing a provider: a lot of recommendations here. And I hate to use an overused term, but I'll say it because it applies: one throat to choke. Yeah, one throat to choke with a data fabric purchase. So far, again, the market has resisted promoting non-integrated tools as a data fabric. "Oh, we have these three tools, let's package them and call it a fabric." That hasn't quite happened yet. With one of these products, organizations can streamline data management processes and achieve greater efficiency in handling complex data projects, as we saw in our test. So hat tip to SAP Datasphere for what they provide with their data fabric; that's the one we used for our test. Okay, complete connectivity to data sources. Now we're getting into, I think I have six things for you, the things to look at. Universal data source connectivity. Let me define universal here, because universal makes it sound like everything, all time, forever and ever. Not necessarily. Of course, the sources you have in your organization, yes; what you are getting into, and what you can see that you're getting into, I would say as far out as you can see, the next three to five years. And also be sure that you do a little thinking here in terms of what you're not planning on but what may be a possibility. The good news is that most of these products have great, what we would call universal, data source connectivity: the data fabrics connect to and ingest data from a wide variety of source systems, regardless of location or type, be it hybrid, multi-cloud, private, or platform.
And it abstracts the technical complexities: data fabrics hide the technical details, like connection type, JDBC, ODBC, REST APIs, what have you, to present a unified view of all ingested data. Then there's equal treatment of diverse data sets. When I first heard this as a goal of data fabrics, equal treatment of all data sources, I thought, well, no, some are more important than others. But if they're all brought to a nice high standard in terms of making them accessible to me as a developer, engineer, data scientist, what have you, then why not? Why not make them all equal? So, equal treatment of diverse data sets. As someone that's run these organizations a lot, what I like is things that are quantified, where we can list the data sources: one, two, three, four, et cetera. That is great when we can do that. When it's not a list, when things are maybe on the list, maybe not on the list, maybe you've got half a source here and the other half there, the more confusing it is, the harder it is to get things done. So I like this, ultimately: equal treatment of diverse data sets. Centralized access, one panel. One-panel access to your entire federated environment. The entire environment, including multiple clouds, including multiple of the so-called data products in a data mesh. Yeah, so you build the data mesh, you've got these multiple data products out there; how do you bring them together into something meaningful? Well, that would be your data fabric. Universal data access, centralized data understanding, actionable insights. And this is a step towards more action coming out of data more or less automatically. Okay, the data is decentralized, but bringing it back into some sort of centralized understanding of the entire environment enables more automation and artificial intelligence. Single point of access, just-in-time access, single-pane-of-glass view.
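To make that abstraction idea concrete, here is a minimal Python sketch of hiding connection details behind one interface so every source presents the same unified view of records. All class and variable names here are hypothetical illustrations, not any vendor's actual API.

```python
from abc import ABC, abstractmethod
import csv
import io
import json

class Connector(ABC):
    """Hides connection details (JDBC, ODBC, REST, ...) behind one interface."""

    @abstractmethod
    def read(self) -> list:
        """Return records as a list of dicts, regardless of the source's format."""

class CsvConnector(Connector):
    """A source that happens to speak CSV."""
    def __init__(self, text: str):
        self.text = text

    def read(self) -> list:
        return list(csv.DictReader(io.StringIO(self.text)))

class JsonConnector(Connector):
    """A source that happens to speak JSON (e.g. behind a REST API)."""
    def __init__(self, payload: str):
        self.payload = payload

    def read(self) -> list:
        return json.loads(self.payload)

# A single pane of glass: iterate every source the same way,
# with no knowledge of how each one is physically connected.
sources = [
    CsvConnector("id,name\n1,Ada\n2,Grace"),
    JsonConnector('[{"id": "3", "name": "Edsger"}]'),
]
all_rows = [row for src in sources for row in src.read()]
```

The point of the sketch is only the shape: a consumer sees one `read()` call per source, which is the "equal treatment of diverse data sets" idea in miniature.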
Is the access the greatest looking right now, necessarily? No, probably not. But just having access to that data is wonderful. And let's face it, in most of the projects that we're doing out there right now, we're not building dashboards anymore. We're not building reports anymore. We're doing things with the data. We're moving that data right into automated processes. We're not bringing data to a point where it has to be interacted with as much. That's usually a sidelight, I guess, of the projects that we're doing. Next: preparation of data. I've separated this out from integration of data. We're talking here about getting predefined transformations working on data, versioning data, automatic schema drift detection and resolution (it's always going to happen; best be prepared for it), and workflow automation for repetitive tasks like preparing Salesforce data, something you'll do on a daily basis. Data preparation entails unifying schemas, rectifying differences in data models, and cleansing data so it's easily serviceable for data integration. This is important because data preparation often kicks off the analytics or application-loading lifecycle that's foundational to many of our data fabric use cases. Many offerings are broadening this paradigm with self-service constructs based upon low-code methods with point-and-click GUIs. Now we talk about integration of data. Standardization is key. The data, the data models, the schemas, the semantics: they need to be unified to ensure compatibility across sources. Can you imagine trying to look at a date field and you have different formats of date in the field because they came from different places? Yeah, none of that. And that would be true for things that are wildly more complex than that simple example. Unified data for diverse applications. Multiple integration methods like ETL, ELT, replication, change data capture, and streaming.
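The schema drift detection mentioned above can be sketched very simply: compare what arrives against what you expect, and report the difference so a pipeline can resolve it rather than fail. This is a minimal illustration under assumed field names, not any product's implementation.

```python
def detect_schema_drift(expected: dict, incoming_record: dict) -> dict:
    """Compare an incoming record's fields against the expected schema.

    Returns the added and missing field names, so a pipeline can resolve
    the drift (alert, auto-map, or quarantine) instead of breaking.
    """
    incoming_fields = set(incoming_record)
    expected_fields = set(expected)
    return {
        "added": sorted(incoming_fields - expected_fields),
        "missing": sorted(expected_fields - incoming_fields),
    }

# Hypothetical example: a source renamed "created_at" and added "region".
expected = {"id": int, "email": str, "created_at": str}
record = {"id": 7, "email": "a@b.com", "created": "2024-01-05", "region": "EMEA"}
drift = detect_schema_drift(expected, record)
# drift["added"] -> ["created", "region"]; drift["missing"] -> ["created_at"]
```

Real fabric products layer resolution policies on top of this detection step, but the detection itself is essentially a set comparison like the one shown.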
So is a data fabric a data integration tool? Yes, yes it is. And it's a unified data access tool, a unified data security tool, a unified data governance tool. And it's built using semantic models. This is how it does what it does. I put this in here because you want to look at how well that is done. A data fabric provides a semantic layer that enables a map of your metadata, unifying data based upon its meaning rather than just its location, and preserving the business context and logic of that data across the fabric. I mean, I've been trying to get organizations moving towards a data fabric well before it was called a data fabric, okay? And I faced that data mountain that I've been talking about. With a data fabric in place, I won't face that data mountain. We'll be able to get things done much more readily in the organizations that have data fabrics. Excuse me. And they all use semantic models. And I'm putting this in here partly to make sure that we remember, a year from now, two years from now, that data fabrics should be based on semantic models. If any vendor is pushing a so-called data fabric out there and it's not based on a semantic model, you're not going to get the benefits that I'm talking about here. So let's just keep that in mind. Data management. Yeah, it manages data. Do they all have their own database? Yes and no. Yes, they mostly do. But they also work with other databases as well. A data fabric must support data governance, including security and access control. Vendors of data fabrics must either provide mechanisms for data modeling, data profiling, data quality, lifecycle management, and other conventional components of data governance and data management, or they must integrate with them. And this is where it gets a little fuzzy: well, if it's just fully integrating with other tools, is it truly a data fabric?
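The "unifying data by meaning rather than location" point can be shown with a toy semantic model: one business term maps to its different physical columns across systems, and consumers ask for the term, not the column. The source names and column names here are invented for illustration.

```python
# A tiny semantic model: each business term maps to the physical column
# that carries it in each source system. Consumers query by meaning
# ("customer_id"), not by physical location.
SEMANTIC_MODEL = {
    "customer_id": {
        "crm.contacts": "contact_uid",
        "billing.invoices": "cust_no",
    },
}

def resolve(term: str, source: str) -> str:
    """Translate a business term into the physical column for one source."""
    return SEMANTIC_MODEL[term][source]

# The same business concept resolves differently per system:
resolve("customer_id", "crm.contacts")      # -> "contact_uid"
resolve("customer_id", "billing.invoices")  # -> "cust_no"
```

A production semantic layer also carries business logic, lineage, and context, but the core mechanism is this meaning-to-location mapping.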
Long gone are the days when there could be some sort of laminated data architecture that I could bring, without knowing a company, right into that company and say, hey, here's what we're going to do. Everybody's different. And everybody has their own thing going on, oftentimes their own spaghetti architectures going on. And you've got to pick up the pieces right where they are. There's no wishful thinking here. It is what it is. Take the past, move forward, make a fabric of it, and get to the point where you can get the benefit of things like user behavior monitoring, access control methods, and detailed data provenance. Where did the data come from? That's becoming increasingly important in this regulatory environment that we're getting into. And finally, and sometimes this is the biggest benefit of a data fabric: unified data security. Data fabrics implement various security measures to protect sensitive information, like access control, data encryption, masking, anonymization, auditing and logging, and so on. Now, what will you find when you step into the market for data fabric providers? You will find data virtualization, data movement, and data federation: strong capabilities there. Data federation is when there's a single point of access to multiple data sources; it federates the data by creating a unified schema and translating queries to the specific language of each source. Low code, no code: they're mostly that. Tens to hundreds of connectors; I am sure that in the future it'll be thousands for some of these. There's data harmonization, where these tools provide an evolving data model that harmonizes data of any variation, that is, structure, schema, source, and so on, based on uniform standards and business terms. This semantic layer is an overlay for the underlying source data, accessed in memory or through data virtualization, that works together with those.
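The federation definition above, one unified request translated into each source's native language, can be sketched in a few lines. The dialects and translator functions here are hypothetical stand-ins for real query engines.

```python
# Each source registers a tiny translator that turns a unified request
# (table name plus filter) into that source's native query syntax.
TRANSLATORS = {
    "postgres": lambda table, flt: f"SELECT * FROM {table} WHERE {flt};",
    "rest_api": lambda table, flt: f"GET /{table}?filter={flt.replace(' ', '')}",
}

def federate(request, sources):
    """Fan one unified request out to every source in its own dialect."""
    table, flt = request
    return {name: TRANSLATORS[name](table, flt) for name in sources}

# One logical query, two native queries:
queries = federate(("orders", "region = 'EMEA'"), ["postgres", "rest_api"])
# queries["postgres"] -> "SELECT * FROM orders WHERE region = 'EMEA';"
```

A real federation engine also merges the results back into one result set against the unified schema; the translation step shown is the half that makes the single point of access possible.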
Knowledge graphing; a data catalog, that's for storing data assets and metadata and enriching those assets with business logic, taxonomies, and data models as needed; data profiling, which we talked about; the data marketplace capabilities of some of these data fabric providers, which we didn't talk much about; and natural language query. Of course, any query engine that's being built today is built with natural language, and so these definitely are as well. So you want to step into the market and evaluate these potential providers; here are some miscellaneous topics related to that. Like pricing: it's confusing. It can be. You are priced, in one example, and I think several examples actually, on data warehouse capacity units and data lake compute terabytes. Yeah, weird. Data integration jobs per hour per month: if I see something like that, I'm going to write some big jobs. An optional data catalog: it's an optional add-on cost, and if you do get it, how much are you crawling? How many hours are you going to crawl that catalog per month? I've already said that catalogs have nominal benefit here, but they'll get better. And I could probably put five more bullets on here right now for pricing, because it's kind of like this: it's based upon utilization. Now, the future of data fabric: integration with artificial intelligence and machine learning, decentralized data management, focus on interoperability and open standards, enhanced security and privacy features; more of the same, more of what I've told you about data fabrics today. Self-service analytics and citizen data scientists, with more things like Microsoft Copilot in the mix to make it all easier to get there. The utopia that a CEO would want for their data environment isn't here yet, but this is a step in that direction, where he or she can pretty much press a button and there are no data issues; it's just what you're doing with the data.
It's just APIs and apps on top of a great data infrastructure supported by a data fabric. Best practices: yeah, a lot of these obviously could apply to any project, but they're definitely worth mentioning, because I see that a lot of shops are getting gung-ho about the data fabric, pushing headlong into it, and forgetting some of the basics. Define clear goals and objectives. Assess your data landscape; this is part of determining if you're building it yourself or getting into a product. Consider user needs and adoption. Standardize data formats: how do you want the data formats standardized? You still need that governance forum. Yeah, implement a robust data governance framework; you still need that for the many, many data decisions that will happen throughout your data fabric journey. Focus on data quality, always, right? Prioritize security, of course. And embrace a collaborative approach across technical builders, business users, and application developers. In summary, and if you have any questions, by the way, I'm going to leave a few minutes here in just a minute for those, so go ahead and get them in: a data fabric is a unifying, integrated architecture that provides a consistent and scalable approach to managing and accessing data across multiple sources and formats. That's your top line right there. There's a bunch of benefits that I talked about, and it reduces complexity for everyone involved. It includes integration, governance, cataloging, virtualization, analytics, and security; those are the components of a great data fabric. Purchase considerations are complete connectivity to data sources, centralized entry, preparation of data, integration of data, support for semantic models, data management, and data security. These are some of the key things that you want to look at as you step into the fabric market. And in our analysis, we found that it's better to buy than it is to build. I'll put the caveat on there: you may be different.
You may have a great, mature data environment and great capabilities to do these things with componentry as opposed to a unified fabric product. And that's great too. But just know that both are at your disposal. And that brings me to the end of the formal part of the presentation, choosing your provider for implementing a data fabric. And with that, I'll pass it back to Mark to see if we have any questions. Yeah, our community is so fun, because we've got lots of fun questions and things that I think about a lot as well. So we'll get into a couple here. Is there any real-world research or case study that confirms the benefits or a positive difference for data fabric? There's the one I did, which I talked about a little bit. But it was more assuming you do want a fabric, and whether you should get into a product or build it yourself. So it did not start from the premise of: is a fabric going to provide me return on investment? I would say to that question, no, I'm not aware of any such study. It's one of those things that's so new, and people are dipping their toes in. But I think there's a lot of benefits out there that people can realize, as you went over today, which was excellent. Your diagram depicts multiple data warehouses in the data mesh or fabric. Do they all need to be at a certain level of maturity before becoming a component? Ah, great question. I love that question. Yes. Yes, I tried to stress that a little bit as well: the underlying lake here needs to be done well. The underlying warehouse needs to be done well. I can't tell you how many organizations will call something a data warehouse that's really just a loose collection of different data. The data's not really integrated, not really looked at from a quality perspective, not modeled together or anything like that. But it's in one database now. So you did get somewhere. You did get somewhere.
So that's a low-maturity data warehouse, if you will, not really fit yet for putting in a mesh or a fabric. Please fortify those warehouses to some degree and get them up to standard. Now, last year I gave a presentation on maturity modeling your data management environment. You might look to that presentation, because I had a section in there on data warehousing and what I mean by level three maturity, which is where I would say that warehouse needs to be to be effective in a mesh or fabric. But there are several things you want to look at. Awesome. Yeah, I totally agree. What types of existing data environments and infrastructures might benefit best from a data fabric? Well, I've become an advocate for data fabric, so I almost want to say all, but I know I can't quite do that, because that's not right. If you're at a super high level of maturity with your data environment and you're getting all your needs met, that's the bottom line, by the way. The bottom line is not, oh, I've got to have a great fabric, I've got to have a great dimensional data model, I've got to have a great this or that, that the analysts and consultants care about. It's what your users care about, what your application teams care about, in terms of: are they getting what they need so they can do their job, which is to run the business. That's the bottom line now. But I think, to get there, I see many enterprises and what they've got, and data fabric really helps you get moving in the right direction, getting all the data under control. You don't have to worry so much about security. You don't have to build some complicated data integration layer that integrates all this data and harmonizes it. That's what the fabric's there for. So hopefully I've shown that it does provide quite a few benefits. Yeah, I love that answer. I'm smiling in agreement over here.
Well, thank you, William, for this great presentation and Q&A. I'm afraid that's all we have time for today. Just to remind everyone, we will be posting the recorded webinar and slides to Dataversity.net within a couple of business days, and we will send out a follow-up email with the links and other information requested throughout the event today. Thank you again for attending today's webinar, and I hope everyone has a wonderful day.