Oh. Hello and welcome. My name is Shannon Kemp, and I'm the Chief Digital Manager at DataVersity. We'd like to thank you for joining this DataVersity webinar, Data Monetization: Demonstrating Quantifiable Financial Benefits from Enterprise Data Management, sponsored today by Information Asset and Your Data Connect. Just a couple of points to get us started. Due to the large number of people that attend these sessions, you will be muted during the webinar. For questions, we will be collecting them via the Q&A in the bottom right-hand corner of your screen. Or if you'd like to tweet, we encourage you to share highlights or questions via Twitter using hashtag DataVersity. And if you'd like to chat with us or with each other, we certainly encourage you to do so; just click the chat icon in the bottom right-hand corner of your screen for that feature. And as always, we will send a follow-up email within two business days containing links to the slides, the recording of the session, and any additional information requested throughout the webinar. Now, let me introduce our speakers for today: Greg, Alex, and Sunil. Sunil is the CEO and founder of Your Data Connect. He has spent more than a decade in data governance work, including stints at IBM and Information Asset. Sunil has worked with hundreds of clients across six continents and multiple industries, including banking, insurance, life sciences, retail, telecommunications, manufacturing, oil and gas, healthcare, and government. Greg is the Chief Revenue Officer at Your Data Connect, responsible for sales and marketing. He has held sales leadership roles at Information Asset, Collibra, Immuta, Initiate Systems, and Business Objects. And Alex is the Chief Technical Officer at Your Data Connect. His responsibilities in this role include promoting best practices within the organization, developing the technical skills of others, identifying new opportunities, and driving innovation through technology.
And with that, I will turn it over to Sunil to get us started. Hello and welcome. Greg? Yeah, I think Sunil is having audio issues, but I'll go ahead and kick things off, Shannon. So thank you. This is Greg. Can you hear me okay, Shannon? Am I good? So thanks, everyone, for joining today. Once again, my name is Greg Orsada, and today we're going to run through a short presentation that will preface the use cases and the usability of Your Data Connect. And then Alex Straugens, who is our CTO, will run through a demonstration of the solution. We encourage all questions; please ask them or put them into the queue. So Your Data Connect, as Shannon said, is the industry's first cloud-based data monetization platform. Now, think of Your Data Connect as an extension to everything you've already done inside your data organization, through the whole evolution: starting with business intelligence and data warehousing, then master data management, and the advent of data governance. What we found and discovered, as Shannon said earlier, with all the implementations that we've done throughout our careers, is that one thing was always missing and was difficult to obtain, and that was end-user business adoption: giving the business ownership of the data, allowing them to take ownership. So when we created Your Data Connect, don't think of it as reinventing the wheel. This is an extension to everything you've already done within your data ecosystem. So we're going to show you how you can value the data across different lines of business, around creating cost savings and mitigating risk, but actually quantifying that data so you have a continuous ROI. You'll see where you can continually show the business value in the data, taking cost out of the business. We're going to show you how to create new revenue streams, things of that nature.
And then when you see "data marketplace," I know it's nomenclature that a lot of people use today in different industries, but this is more of a unified data marketplace, and we'll show you how to use those internal and external data sharing agreements and how to utilize them throughout the organization. And then the age-old question: is this data regulatory compliant? So everything you've done, whether it be with OneTrust or Immuta, or even utilizing technology platforms like BigID, to show that regulatory-compliant data, we have hooks into all these applications and are able to extrapolate and show the value of the data as well. So what we saw when we created Your Data Connect is that enterprises just don't have the tools to measure the financial benefits. You hear a lot about the new buzzword, data monetization, but Your Data Connect, as you will see, actually puts it into practice in a toolset, in a platform, utilizing what you've done with Informatica, with Alation, or with Collibra as well. We have a business value assessment team, and we took a lot of this information and put it into an actual solution. And what we do with our customers is what's called a business value assessment, or, as we like to say, a business monetization assessment. We can take actual initiatives, put those into the solution, and show you how to quantify the data that you've already utilized in any data platform you were using. I'm just giving you examples, but it could be any type of platform. So given the amount of data, the CDO's role, and the CDO needing a go-to platform, what we're trying to do is give the CDO the ability to show quantifiable, measurable value with that data.
And what you'll see here: the average tenure of a chief data officer is 2.4 years. Why is that? We have some ideas around that, and around some of the challenges that the CDO has in his or her role. Understanding and monetizing that data: we're giving you a platform to do that, so you can actually utilize the data as an asset. Measuring the success of the data, so you can actually quantify it, whether that be in a marketing initiative or a cost savings initiative. And as the CDO is charged with revenue creation and cost savings, coupled with risk mitigation, you'll see how Your Data Connect can encompass all of that. It's not just that we're creating this data dictionary, creating this business glossary, or cataloging all this data for analytics; I understand that, and our customers understand that. But actually quantifying and measuring the financial benefits, that's what Your Data Connect will do for you. So just as there's ServiceNow, the chief revenue officer has Salesforce, and the chief risk officer has MetricStream, data monetization and Your Data Connect will be the go-to platform for the CDO and his or her organization. What we'll show you is a cross-functional discipline across all the technologies and all the compliance applications. And what Alex will show you is how we've integrated with all these different applications to utilize not only what you've done with your data management strategy, but data privacy as well. What you see here is an example from one of our customers. We launched the platform in June, and we have several new customers, and when I say several new customers, most of them are Fortune 500. This is an example from an industrial manufacturer, and Alex will show you a similar example in the presentation.
So this is the value driven: $400 million, right, and a $50 million supply chain savings, based on all the numbers and finance. So we're taking all of that and quantifying all of this information in an actual solution that can be presented with the actual cost savings associated with it, that solution being Your Data Connect. So once again, there's a cross-functional discipline that we take, as I said earlier. First of all, we talked about everything within the data management ecosystem: incorporating the different technologies, incorporating all the regulatory compliance, GDPR, CCPA. You've already done all of that work to show that the data is actually compliant; now we integrate with it and show the financial benefits across it. Taking that cross-functional discipline, as you see, Your Data Connect will allow you, and you'll see this directly in the tool today, to grow your revenues, reduce those costs, which is key today, and of course manage the risk. So when the business is actually utilizing the data for monetization purposes, they're going to understand that there's no risk, that the data they're using is regulatory compliant, and they'll be able to reduce costs. And again, reducing costs will help grow revenue. This will all be shown in today's presentation. If you have any questions, feel free to put them in the chat window, and we'll get those answered as soon as we can. So, much like the flow of Your Data Connect, we're going to start with: what is the value of your data? Internally generated or third party, that's important. And what we've been able to do with our security model is take those data sharing agreements and share those internally and externally.
An example would be a clinical trials process. Imagine all the data that needs to be shared internally and externally; you'll be able to do that with our solution. So when it comes time to get that clinical trial approved by the FDA, you didn't miss any steps, and you understand the cost and the process along the whole way. And that goes with the continuous ROI of curating those data assets. We all want to grow revenue, reduce costs, and manage risk, and once again, that step two within the continuous ROI, which Alex will show in the presentation, is all present in the solution. And then you place those data assets in our data marketplace. Now, let's say you already have a data marketplace. That's fine; we'll incorporate those data assets into our marketplace, once again, for those internal and external sharing agreements. But that's very key, and we've seen it with a lot of our customers: if you're not able to share all the data, how can you actually value it if you don't utilize all of it? And once again, we support all the regulatory compliance based on all those workflows, everything you've already done in your existing data infrastructure solution, whether that be Rochade or Collibra; we work with all of them, Informatica, Alation, and so on. And throughout that whole process, whether that data be for machine learning or artificial intelligence, that automation is all present and inherent in the solution as well. And what we've done on top of it, taking all our experience from doing over 100 implementations at our sister company, Information Asset, is create industry modules. So we have financial services, healthcare, life sciences, utilities, and manufacturing. We've built all of these and they're in the solution, but if you don't see your industry in here, we've also worked with others.
But our ability to build these industry modules comes at no cost to you; we would do that as you become a customer of Your Data Connect. All right, so that is just an overall synopsis of Your Data Connect. I'm going to turn it over to Alex, and he's going to actually run through a demonstration of the solution and expand on everything I just showed you. Hi, everyone. This is Alex. Let me share my screen and we'll get started. Okay. Hi, everyone. So first of all, I want to go through data portfolio management with a demo, then dive deeper into what we mean by data monetization, and also speak about the data marketplace. We'll have demos for each of these. Alex, can you put that into presentation mode? It'll be easier for everyone to see. So, data portfolio management: we refer to this as the process of managing your company's investments in data. That includes your databases, data lakes, business intelligence solutions, and ETL processes. The way that we do that in Your Data Connect, sorry, let me switch into the tool now; we'll do a live demo. So Your Data Connect crawls your infrastructure to identify BI reports, databases, and ETL jobs, as well as files in data lakes. We're trying to identify opportunities for cost savings, such as identifying duplicate reports, tables, and so on, as well as those assets which haven't been used recently. So, tables that maybe are legacy, haven't been used in a while, candidates for retirement, just so that we can reduce the amount of technical or data debt that we have accumulated over the years across various solutions. So this is the homepage of Your Data Connect. What you see first of all is what we call the data portfolio summary. This is just a high-level view of the average usage of the different technologies you have.
So that usage is measured by looking at the operational metadata to figure out, as I said, what reports are unused, meaning they haven't been accessed or run in a while. For databases, it's the tables and views that haven't been queried in a while. For ETL, we're looking at the execution data around your ETL jobs: what jobs look like they're no longer necessary, no longer used. And for files, what is the last-accessed information of various files in your data lake? So that's what this dashboard provides at a high level, and it's grouped by technology. So, for example, this 80% for Domo means that of all the assets we have in Domo, it looks like only 80% of those are actually used on a frequent basis. The rest, the 20%, might be technical debt that we could retire just to reduce our footprint in some of these technologies. Going on to the Data Portfolio tab of our application, we see lists of different assets. For applications, meaning BI tools, we can see various ones we've crawled here: Domo, Business Objects, Tableau. There are others we support; this is just an example. We can connect to other catalogs as well, so other data catalog solutions and metadata management solutions. Our solution can connect directly to many of these sources we're talking about, these BI sources and so on, but if you already have a repository of that information, we can get it from there, as I said, from the data catalog or metadata management tool. Databases as well, and that includes traditional relational databases as well as NoSQL and cloud-based databases in AWS or other clouds, like DynamoDB, MongoDB, and so on. We are looking at those to count the number of tables; we have criteria around identifying unused tables, and that's how we arrive at the overall usage.
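The unused-asset check described above can be sketched in a few lines. This is an illustrative sketch only, not Your Data Connect's actual implementation; the 180-day threshold, the report names, and the dates are assumptions invented for the example:

```python
from datetime import date, timedelta

# Illustrative report metadata: (report name, last-used date).
# In the real product this would come from crawled operational metadata.
reports = [
    ("Q1 Sales Summary", date(2022, 9, 1)),
    ("Legacy Inventory Report", date(2020, 3, 15)),
    ("Daily Orders Dashboard", date(2022, 9, 20)),
    ("Old Vendor Scorecard", date(2019, 11, 2)),
]

UNUSED_AFTER_DAYS = 180          # the configurable "definition of unused"
today = date(2022, 10, 1)
cutoff = today - timedelta(days=UNUSED_AFTER_DAYS)

# A report is a retirement candidate if it was last used before the cutoff.
unused = [name for name, last_used in reports if last_used < cutoff]
usage_pct = 100 * (len(reports) - len(unused)) / len(reports)

print(f"Unused reports: {unused}")
print(f"Usage: {usage_pct:.0f}%")   # 2 of 4 reports still used -> 50%
```

The same pattern applies per technology: for databases the "last used" signal would be the last query date of each table, and for ETL jobs the last execution date.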
ETL as well: we support various ETL solutions, and we're looking at things like, as I said, the execution metadata of these various jobs to figure out which ETL jobs are no longer used. File systems as well, and this includes the category of data lakes; we connect to various data lakes and get metadata around the files in them to try to identify technical debt. So we're drilling in now to one of these solutions. This is Domo, a business intelligence tool. We can see we have this parameter; it's the definition of unused. What is unused for one solution might be different for another, so this is a number in days beyond which we consider something unused. It's configurable, and we can change the number, but this is how we control that criteria, because that definition of unused may differ from organization to organization. So we can change that parameter there, and we see at a high level a total number of reports counted, 100, and then a number of unused reports. Within these BI tools, you typically organize your assets by folders. I've opened up one of the folders here, and I can see various reports as well as a last-used date and a checkbox that will tell me whether individual reports meet that definition of unused we saw at the technology level. We can see there are roughly 20 or so that meet that definition of unused. Now that we have this information, another benefit is that we can calculate things such as the migration cost of moving from one solution to another. So let's say we have various technologies in our enterprise, and we've used Your Data Connect to figure out which ones are candidates for retirement. We can create a migration plan to, maybe, not only retire some tool but migrate any artifacts that are still used in it to another solution. I'll do an example of that now. I'm going to create what we call a technical use case, and we're going to select Migrate BI Tool. When we're defining this, we give it a name; in this case we want to migrate from an older BI tool to a newer one, so let's say Migrate Domo to Tableau. We select our source and target of the migration, and then for the source, in this case Domo, we enter the annual software cost, so I'll say $2 million; for annual hardware cost I'll say $5,000; number of developers, 10; the average salary of those developers, $50,000. The number of assets and the number of unused assets I'll leave blank, because these will be pulled in via automation; we will actually connect to Domo to determine them. For the target, which in this case I've chosen as Tableau, we enter the annual software cost, the hours to migrate each report, 40, and then the hourly rate, $60. I'll submit this. We've taken a look at that operational metadata, and we've counted that there are 100 total assets, but 20 of those are basically unused; they meet our definition of unused, which we configured for Domo. Therefore, as part of this migration, we recommend you only migrate the 80 reports which look like they're still actively used. That's how we arrive at these costs around the migration; these are the costs just to migrate the actual active reports in the tool. We estimate an overall cost of $192,000 to migrate the 80 reports, and then a break-even point: after 9.2 months, we believe we would break even on the costs. We also provide the ability to track the migration of individual reports. This is basically a log of what we would do as part of the migration: for each report there is a task, and I or a developer can come in and leave comments and change status, and once all of the reports have been migrated, we can consider the business case closed. So that is our data portfolio management: we're really just looking at all these technologies, crawling them to find that operational metadata and find opportunities to save costs. Next, I want to dive into a more business-oriented use case for data monetization. What we're going to do here is define a business case; it's a certain scenario for a manufacturing environment around reducing the lead time of purchase orders. As an example, let me make this full screen. This is an example business case for a manufacturer. Due to poor data quality around raw materials data, lead times are being extended on purchase orders, and that's causing revenue losses for each purchase order. The business wants to improve the data quality of raw materials data and therefore improve the lead times for purchase orders and reduce those revenue losses. The business also wants to quantify the value that data quality plays in this process, and the way that will be quantified is as a percentage of the overall cost savings for those purchase orders. So let me exit this. The way we implement this in Your Data Connect: Your Data Connect contains the CDEs, the database columns, and the business rules that are relevant to this scenario. Those business rules get implemented in a third-party data quality tool, and the results of those rules being executed are brought back into Your Data Connect. Then we use Your Data Connect to author a business case that has formulas that say, basically, that 8% (in this case; it's configurable) of the saved value from purchase order lead time reduction is attributed to data quality, as long as the data quality score meets some threshold; in this case, the relevant CDEs have to be greater than 98%.
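The recognized-value logic just described can be sketched roughly as follows. This is an illustrative sketch of the formula, not the product's actual code; the 8% share, the 98% validity threshold, and the $47,000 monthly figure come from the example in this walkthrough, while the function and variable names are assumptions:

```python
# Illustrative parameters from the example business case.
RECOGNIZED_SHARE = 0.08        # share of savings credited to data quality
VALIDITY_THRESHOLD = 98.0      # minimum data quality (validity) score, in percent

def recognized_value(saved_value: float, validity_score: float) -> float:
    """Credit a share of the month's savings to data governance,
    but only if the data quality score meets the threshold."""
    if validity_score > VALIDITY_THRESHOLD:
        return saved_value * RECOGNIZED_SHARE
    return 0.0

# Monthly results: (saved value from lead time reduction, validity score).
# These roll up to the overall recognized value at the business case level.
monthly = [
    (47_000.0, 100.0),   # 8% of $47,000 = $3,760
    (52_000.0, 100.0),   # 8% of $52,000 = $4,160
    (40_000.0, 95.0),    # below the 98% threshold: no credit this month
]
total = sum(recognized_value(value, score) for value, score in monthly)
print(total)   # 3760 + 4160 + 0 = 7920.0
```

The key design point is that the saved value comes from the ERP data itself, while the validity score comes from the third-party data quality tool; the formula only joins the two.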
Once this business case is approved, our platform checks the data quality score of these CDEs to see if they meet the threshold that's configured in the formula. Then we query the data to find out where it's stored, and we look at the data to count the amount that we're saving for these purchase orders. As long as the data quality score is greater than 98%, we give data quality credit for the role it plays in this process. Let me jump into the tool now. The Data Monetization tab is where we can also define business use cases; we saw a technical use case earlier for that migration. Here's one I've already created around purchase order lead time reduction. We organize these by industry and division; in this case it's supply chain. The author of this business case provides this recognized value formula. That's recognized value in the sense of how much value gets recognized for data governance if certain criteria are met. So the recognized value would be the overall saved value from lead time reduction times 8%, and the recognized value criterion is that lead time validity, which is a measure of data quality, has to be greater than 98%. These values in quotes are actually CDEs which have been linked to the business case over here. So that's what we see here: saved value from lead time reduction. That is a CDE inside of this purchase order glossary; it also has a column linked to it, and an approval history. So after somebody defines this formula, they submit it for approval, and after it's approved, we connect to the systems, the data quality system as well as, in this case, an ERP system where the data is stored, to actually calculate these values. So let's look at the CDEs. We mentioned the saved value from lead time reduction; that is a CDE defined in this glossary, and we can see it's linked to a column, this total saved value. It's also, as we saw, part of that business case, and this traceability will basically show us where the data for the CDE is located. Exploring from this business term in the traceability module, I can see the column, and I can continue exploring to figure out precisely where that database is: this column, in this PO savings table, in this ERP app schema. We can continue exploring to get even more details. So this gives us a map, basically, from this business term to the column that contains the data, and I've illustrated that here. As we explored that traceability, we identified a column, and if we query that column, we'll see total saved value has numbers like this, organized by date. Now, talking about the lead time, remember we had a criterion that says lead time validity must be greater than 98%. The validity comes from the third-party data quality tool. We can see that on the traceability tab: if I explore, I can see that the lead time business term is validated by a business rule, and if I explore that business rule, I can trace it all the way down to, in this case, Informatica IDQ, where the rule is implemented as a data quality rule. That rule gets executed on a periodic basis, actually monthly, and we can see the different metric objects. There are two metrics, one for the month of August and one for the month of September, and if we open these, we will see some statistics about that data quality rule from that date. I've opened that metric, and I can see on this date in September there were 32 valid records and zero invalid, so 100 percent validity. The validity score from the most recent data quality metrics is what shows on the business term; that's where the 100 percent is coming from. So, we've defined the business case, we've submitted it for approval, and we've translated the CDEs into actual values: a data quality score for the lead time, and an actual value from the database for the saved value. That's where these values come from. As long as this criterion is true whenever this process executes (it's a monthly process; we can see the dates here), as long as the validity is greater than 98 percent, the recognized value of eight percent is calculated, and eight percent was configured here as part of the recognized value formula. So the total value is $47,000, and eight percent of that is $3,760, because the validity is greater than 98 percent. These numbers continue to come in monthly, and they roll up to provide an overall recognized value of data quality at the business case level; this is just a sum of all the values from the months. Continuing on, I want to get us to the data marketplace. Our platform also provides a data marketplace to let users register sets of data and what elements are within those data sets, and then other users can request access to them in the marketplace. Our platform also allows you to define dashboards to measure certain things. Here's a sample dashboard that we've created showing which division in the company is requesting the most data, which division owns the most data being requested, and then the popularity of the various data sets. If I jump into the next tab, I'll see the Organization module of our application. This is where we can define the structure of our company, and within different divisions we can assign data sets that the division owns. So I have a Data and Analytics division, a Finance division, Marketing, Sales. I've opened up the Sales division here, and we can see it has a data portfolio: it has certain applications it owns and databases it owns, as well as certain business glossaries. The Data Marketplace tab is where we can register actual data sets. This customer data set is one that I've registered previously for this Sales division. If I open that data set, we can see it contains a price; that is the price this data set will appear as in the data marketplace. The price could be zero, but
if you provide a non-zero price, what will happen is that when somebody requests it and is approved to access that data set, we record what we call a cross charge. It's just a record of the work that the provider is having to do on behalf of the requester whenever the requester's request is approved in the marketplace. So, a price of $25,000. The frequency: this is a one-time data set; it's just a snapshot, basically, of customer data. There are other values for frequency as well; it could be recurring on a periodic basis, or even something like real time, like an API. The system of record, that's an application, Customer 360, and the provisioning system is the same application. We have various data elements; in this case, we have the customer's email, first name, last name, and whether or not that customer has opted in for email marketing. Those are mapped to actual columns which contain the data. As well as policies: we can define policies around data sets. In this case, there's a policy that says marketing emails can only be sent to customers who have not opted out of email marketing. Any previous requests that were made for this data set appear in this section here. We can see somebody has previously requested this; it was actually the Marketing division, and they wanted to do a marketing campaign via email. Approval history as well: this data set was previously approved, so this is just a record of that approval history. That was the Sales customer data set. Now we have the Marketing division; they also own some applications, one called Campaign Manager, and a database as well. On their Data Marketplace tab, we can see they started this campaign data set. That campaign data set has various statistics around marketing campaigns, like how many emails were sent, of those how many were opened, how many clicks there were in those emails, and so on. This data set, let's see, has a status of pending. To submit it for approval, I'll click this button, and I can provide comments. Now, there's an admin group who receives emails notifying them of this new data set. They can click the links in those emails to come to this page, where they can approve. Let me refresh the page, and we should see the approval history. I'm set up as an approver, so I'm going to just approve that data set, and once I approve it, it will become visible in the Data Marketplace tab. So this is the Data Marketplace tab; let me refresh, and we should see two data sets. There's the customer data owned by Sales, and here's the campaign data set owned by Marketing. We also have an option for users: of course, they can filter and search by name and other attributes, but sometimes somebody might need data and cannot find it, and in that case they can submit this request over here. We call this an ad hoc data set. The requester selects their division, describes why they are requesting this data set, its purpose and priority, and they can select individual tables or columns here. So this is really just for the case where a user cannot find a data set that they need; that's what this would be used for. In this case, though, I'm going to select an existing data set. I'll select the customer data set; we can see a few other details, and I'm going to choose to request it. The requester now must provide their division, so I'll say Marketing; for purpose, I'll say an October email campaign; priority, medium. Before I can submit this request, I have to agree to follow any policies, and we can see the one defined on the data set around marketing emails. I must agree to follow these policies; I will do that and save. That submits the request. The approvers are notified via email; they can come here and approve or reject. If they reject, the request is closed, and the requester would have to submit a new request. In this case, I'll approve. We can see the first step of approval is the administrator approval; I will approve as the admin. Now we go to the data set owner, so the person
who registered that data set in the marketplace we require them to approve as well so I'm set up as the owner I will approve as that owner and we should see the status change now it's become fully approved that means you know we're complete with the approval process the data can be shared and we've gone ahead and recorded a cross charge just a record of the work that the provider has to do on behalf of the requester in that cross charge in this case it's $25,000 that amount is the price which was specified on the customer data set which we can see here and again that could be zero it's not required it's a specified price so all of these requests they're now affecting what shows up in this dashboard we can refresh and see you should see some updates click the refresh button we can see the overall value of all the data being requested it's $50,000 there have been two total requests for this particular data set and it looks like marketing is the top requester and sales is sharing up as the owner of the data being requested so with that I'm going to turn it back over for questions thank you everyone thank you Alex thank you Greg and thank you Danielle for you guys for this great presentation I am going to dive here into here to answer questions and just the most to answer the most commonly asked questions just a reminder I will send a follow-up email to all registrants by end of day Thursday for this with links to the slides and links to the recording of the session as well as anything else requested so dive in here so guys why wouldn't you just take advantage of Amazon's AWS data exchange so sorry I was I'll go ahead to Neil can you hear me Shannon yeah yeah you sound good yeah so yeah correct the Amazon data exchange does not offer the full-fledged workflow capability and the governance capability that you would get with this platform and I'm assuming it doesn't matter where those assets reside regarding for example on-prem cloud etc etc that is correct I don't know 
I don't know if you want to add anything, Alex, but when they add data assets — the data sets in the data marketplace — they're conceptual, they're virtualized, so it doesn't really matter if they're on-prem or in the cloud. Okay, easy — I love it.

So, does your software require a complete metadata catalog, including for the files in a data lake? I'll let Alex answer that. So, a lot of the things we showed are optional for that marketplace to work — you could just create the data sets in our platform. It's not required to use all three of the pieces: I showed data portfolio management, the various business cases like the data quality example, and then the data marketplace. You could use one, or all of these; there's no requirement that you use all three together. Data portfolio, as I mentioned, can work in two ways: it can connect directly to your data sources and scan them to identify those cost savings, or it can connect to existing catalogs — you might have existing data catalogs that already hold some of that metadata — but we also offer a lightweight data catalog ourselves. Right — and if you've already got an existing data governance tool, metadata management tool, or data catalog, the platform will coexist with it. This is Greg — what we've seen with a lot of customers is they utilize us almost as a catalog of catalogs, because some organizations have Collibra, they also have Alation, and they might even have Informatica's Enterprise Data Catalog too. So they're leveraging what they've already done in those enterprise data catalogs and adding the monetization aspects through integrations between them.

And one question I noticed early on that I do want to answer — I think it was Pam who asked about the CDO's job — this is important: the CDO's job is not to own the data, it's to give the business the ability to take ownership of the data. I think that's important.
And I think that's what Your Data Connect does — at least that's what we've seen early on with our customers.

How do you calculate the eight percent? This came up during your demo, Alex — yes, it's configurable, but how do you come up with eight percent? Why not 25 percent, or one percent? Great question. So typically — I didn't really show it — for these business cases there's a lot of involvement with finance to quantify what percentage is credited to data quality. That's changeable: you would submit it as part of the approval process, and then finance, or whoever you set up as approvers — we see finance involved in these types of workflows a lot — might iterate on the percentage and go back and forth before finally approving the business case. Perfect.

And what if you don't have a data quality tool? I'll answer that question. If you don't have a data quality tool, our platform does offer lightweight data quality — we're not going to say we're a replacement for an Informatica Data Quality or an IBM Information Analyzer, but it does offer lightweight data quality. And if you do have a data quality tool, like Informatica Data Quality or IBM Information Analyzer or others, we'll coexist with those, and we can show how that works through our presentation layer in a demo.

Sure — and if you guys have a link to sign up for a demo, definitely let me know and I'll get that out in the follow-up email as well. So: knowing that nothing works straight out of the box, on average, how much initial configuration time is needed to see results from the tool? This is Greg — we've seen anywhere from two to four months to being completely operational in production. Great. And what if the data source is used by multiple divisions — are you able to tie it to multiple divisions?
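The percentage-credit mechanism from that earlier answer — finance agreeing on what share of a quantified benefit is attributable to data quality — reduces to a simple calculation. A sketch with illustrative figures (the benefit amount and the 8% here are made-up inputs, not platform defaults):

```python
def data_quality_credit(total_benefit_usd: float, credit_pct: float) -> float:
    """Portion of a quantified business benefit credited to the data
    quality / governance team. The percentage is a configurable input,
    typically iterated on with finance during the approval workflow."""
    return total_benefit_usd * credit_pct / 100.0

# e.g. a $2M working-capital improvement with 8% credited to data quality
print(data_quality_credit(2_000_000, 8))  # → 160000.0
```

Finance approvers would adjust `credit_pct` during the back-and-forth before the business case is approved.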
That is possible. The meta model of this tool is highly configurable. In a lot of cases there is a requirement to have that owner, but if you have multiple owners and you want all of those owners to approve, say, a sharing of the data, that can absolutely be configured. There's also a hierarchy you can set up, so one possible solution is that the division I showed earlier could be organized as part of a hierarchy: you could have a parent division which owns multiple subdivisions, and the ownership could be assigned at that parent level.

Does the price of a data source change with time, or when value is added to the data set? Can you repeat that — does the price of a data source vary over time? Correct. So I think what you're really asking about is the value, because there are two ways you can think about a data set in a marketplace: there's the value of the data set, and then there's the price you might charge for it, internally or externally — two different aspects. The price is, of course, totally configurable. The value may or may not grow over time; it may decay. Generally, if the quality of your data set is decaying over time and you're not curating it through data quality efforts, then you'd expect the value of your member or customer data to decay over time — but obviously, if you're doing things to curate your data, then it might improve. Perfect.

Is there any way, out of the box, to sync the data sets with the tool? I'm sorry — what was the question about data sets? Do you have an out-of-the-box way to sync the data sets with the tool? Oh, absolutely. You could leverage this platform as the data marketplace and the workflow engine but synchronize the data sets from somewhere external, definitely. So you could say, Alex, that your customer data is
sitting in Oracle, for example, and then you would basically scan your Oracle database — schemas, tables, columns — and bring all of that into Your Data Connect. That's correct.

What can government agencies do to monetize data, since they cannot make a profit per se? Correct, so that's a good question: what can government agencies do to monetize data? Many government agencies now obviously have the concept of open data; with open data, many agencies have a mandate to provide as many data sets as possible for use by the general populace, and that is oftentimes the remit of the chief data officer. So in the case of a government agency — that's a good point, I'd have to give it some thought. In the commercial space you think about data monetization in terms of growing revenues, reducing cost, and managing risk. In a government agency, you would have to think about ascribing a value to higher availability of data, but also potentially reducing risk — there's a risk-reduction aspect to data that would still exist in a government agency. So for example — I don't know if Alex showed this to you — we have a specific data monetization use case around integrating with ServiceNow. You have your list of applications in ServiceNow, and Your Data Connect can have ServiceNow highlight all the applications that have not been certified in the previous 12 months and send those applications to the application owner for recertification. Presumably, when you recertify an application you reduce the probability of a data breach, which means you capture some monetary value in terms of risk reduction. That specific use case around risk reduction would apply in a government agency as well. Shannon — anything else?
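The ServiceNow recertification use case Sunil describes — flag applications not certified in the previous 12 months, then value the recertifications as risk reduction — can be sketched like this. The application list, dates, breach cost, and probability figures are all illustrative assumptions, not numbers from the platform:

```python
from datetime import date, timedelta

# Hypothetical application inventory: (application, last certification date)
apps = [
    ("crm",       date(2020, 1, 15)),
    ("billing",   date(2021, 3, 1)),
    ("hr-portal", date(2019, 6, 30)),
]

today = date(2021, 6, 1)
# Applications not certified in the previous 12 months -> send for recertification
stale = [name for name, certified in apps if today - certified > timedelta(days=365)]

# Illustrative risk model: each recertification trims an assumed 0.5% off the
# probability of a breach with an assumed $4M expected cost.
risk_reduction_usd = len(stale) * 4_000_000 * 0.005

print(stale, risk_reduction_usd)  # → ['crm', 'hr-portal'] 40000.0
```

The dollar figure is the "monetary value in terms of risk reduction" that would be recorded against the use case.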
Continuing on that — an additional comment was that government agencies value their data in terms of public benefit. Was that a question? It was kind of an add-on, but I think you spoke to it; I didn't know if you wanted to add anything. I think what I said earlier applies — open data, and figuring out a way to ascribe a value to the fact that more citizens are able to access the data: that would be one way of thinking about the problem.

Sure. So, do you work with smaller companies that don't use Informatica or IBM tools? Yes, absolutely — this platform can be used standalone, or in conjunction with Collibra, IBM, Informatica, etc. And does the tool harmonize data across platforms for the same data? When you say harmonize data across platforms, I'm not sure I understand — what platforms are we talking about? I'll have the questioner elaborate a little on that. Sunil, if you have additional comments — that's all there is in the question. I know there were some questions in the chat too, so I'm digging through there as well; please do try to keep them in the Q&A if you have additional questions to submit.

I know you talked about the CDO question already — kind of scrolling through — and I think Tim King had a question too: nobody owns the data in the organization, just like no one owns the cash. What we're saying is to give the business the ability to take ownership of the data, to actually use it for monetization purposes. Because what a lot of companies see is that when they build these monolithic data governance infrastructures across the organization, the business just doesn't use them — there's no reason for them to take ownership of that data and use it for anything, whether that be analytics or otherwise. We're saying, on top of that: take the analytics to completion and actually monetize the data. Perfect.
So we do have an expansion on the question — go ahead. Yes, so the question again was: does the tool harmonize data across platforms for the same data — can this be used with applications such as Tableau, or multiple versions of SAP ERP? I don't think — Alex, you can jump in — our tool is an ETL tool. We're not trying to harmonize data across platforms; we're really focused on quantifying the financial benefits, to support the chief data officer as he or she tries to explain the financial benefits of data management to the enterprise. Right, correct.

And along those lines, the question here is: is this a metadata tool or a data tool? The catalogs are purely metadata. Good question. Some of the business cases, like the one we saw around purchase orders — part of the success criteria of that business case is actually connecting to the system that has the data, summarizing it, and then figuring out what quality we have on file for that data. If that quality meets certain conditions, we give credit to the data governance team in that scenario — so this is touching the data, not just the metadata. Correct, yes. The data portfolio is really just touching metadata — operational metadata and all that — but when we get into business cases where we want to quantify the value of data governance or data quality, that's typically looking at data: executing those formulas, getting those values — typically dollar amounts, as we saw in that purchase order example — and giving credit. So there is definitely some involvement with data.

We love all the questions coming in — if you have additional questions, feel free to put them in the Q&A. Is there a way to tie past business outcomes and future target business outcomes to the data monetization process?
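The "touching the data, not just the metadata" check Alex describes — summarize records from the source system, compute a quality measure, and give credit only when it meets the agreed condition — can be sketched like this. The records, threshold, and figures are all illustrative:

```python
# Illustrative purchase-order records pulled from the source system
purchase_orders = [
    {"po_id": 1, "material_code": "M-100", "amount": 12_500.0},
    {"po_id": 2, "material_code": None,    "amount": 8_000.0},   # missing code
    {"po_id": 3, "material_code": "M-220", "amount": 4_300.0},
]

# Quality measure: completeness of the material code field
complete = [po for po in purchase_orders if po["material_code"]]
quality_pct = 100 * len(complete) / len(purchase_orders)
total_amount = sum(po["amount"] for po in purchase_orders)

THRESHOLD_PCT = 60  # success condition agreed in the business case
credited = quality_pct >= THRESHOLD_PCT

print(round(quality_pct, 1), total_amount, credited)  # → 66.7 24800.0 True
```

Only when `credited` is true would the dollar amounts summarized here be attributed to the governance team.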
Tying past business outcomes and future business outcomes to the data monetization process — the answer is yes. The way you would do that is to take a specific business case — and the business case is configurable, as Alex showed you. You could have a business case around, for example, improving materials data quality so you can get purchase orders processed faster; if you get purchase orders faster, you get paid faster, which means you reduce net working capital. That's the broad brush. That business case is informed by your past experience, because you know what your historical materials data quality has been, and you have a pretty good idea about your working capital velocity. So absolutely, your business cases should be based on what has happened in the past and what you expect to happen in the future. Do you want to add something? Shannon, did we answer the question about Tableau? We did. Okay, good — that's just a lot of questions. Okay, go ahead.

How much of this is dependent on how clean and structured your underlying data is? Actually, the worse the data is, the better it is from a data monetization perspective. Think about it: the worse your data quality is — for example, for materials — the more room there probably is to improve data quality, which means there's more room to get your purchase orders faster and reduce your working capital. So whether your data is good or bad you can still apply this platform, though honestly, if the data is bad, there are higher opportunities for monetization.

And how do you benchmark — what part? Data monetization in general. So, we've done a number of projects on the services side of the house, in different industries, for different use cases, so we have a lot of in-house knowledge around, for example, what the value of customer data is by industry.
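The materials business case Sunil sketches — faster purchase orders, faster payment, lower net working capital — can be put in rough numbers. A back-of-envelope sketch; the revenue, days saved, and cost of capital are illustrative assumptions:

```python
def working_capital_benefit(annual_revenue_usd: float,
                            days_accelerated: float,
                            cost_of_capital: float) -> float:
    """Annual value of collecting cash sooner: the cash freed from
    working capital (days of revenue), valued at the cost of capital."""
    freed_cash = annual_revenue_usd * days_accelerated / 365
    return freed_cash * cost_of_capital

# e.g. $500M revenue, purchase orders cleared 3 days faster, 8% cost of capital
benefit = working_capital_benefit(500_000_000, 3, 0.08)
print(round(benefit))  # → 328767
```

Historical data quality and working capital velocity would supply the inputs — which is how past outcomes inform the future target, per the answer above.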
What is the value of genomic data in healthcare? We've done a lot of work across different industries and different use cases.

When developing a business case, what are some of the early wins that create positive momentum on these projects? I think one is around building a data marketplace to be able to share data. If you think about it, a typical data sharing agreement might take anywhere up to 12 weeks, but using the data marketplace, you can reduce that 12 weeks down to anywhere from four to six weeks. So now what you've done is a cost reduction. We did the analysis for one specific data marketplace use case: you're reducing costs associated with contractors and data scientists spending time getting approvals for data, and we put a number of three million dollars on it, based on 500 data requests a year — and that's just cost reduction. We didn't even talk about revenue enhancements because data scientists have access to data faster, and we didn't talk about risk reduction associated with doing something incorrectly. So that's certainly one tangible example.

And does it connect to Ataccama or Manta? Not today, but if it's got a REST API we can absolutely set it up — so from a data quality perspective, tools like Ataccama, or MDM tools: we've connected to Informatica, so solutions like those. As Sunil said, as long as they have a REST API.

I love it. We'll just give everyone a quick minute to submit any additional questions in the Q&A section. Again, just a reminder: I will send a follow-up email with the recording and the slides by end of day Thursday. While we're waiting for any final questions to come in —
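The marketplace cost-reduction figure Sunil quotes — roughly $3M from 500 requests a year, with approval cycles cut from 12 weeks to four-to-six — can be reproduced as back-of-envelope arithmetic. The per-request weekly cost below is an illustrative assumption chosen to land near that figure, not a number from the actual analysis:

```python
def sharing_cost_savings(requests_per_year: int,
                         weeks_before: float,
                         weeks_after: float,
                         cost_per_request_week_usd: float) -> float:
    """Annual cost removed by shortening the data-sharing approval cycle
    for every request that goes through the marketplace."""
    weeks_saved = weeks_before - weeks_after
    return requests_per_year * weeks_saved * cost_per_request_week_usd

# 500 requests/year, 12 weeks cut to 5, ~$860/week of contractor and
# data scientist time tied up per request (assumed)
savings = sharing_cost_savings(500, 12, 5, 860)
print(savings)  # → 3010000
```

As Sunil notes, this counts only cost reduction — revenue enhancement from faster data access and risk reduction would come on top.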
Sorry — Gary Rector had a question about Doug Laney weighing in on Your Data Connect. I reached out to Doug, and we'll coordinate a time to get together — so no, that hasn't been done yet, but it's going to happen, Gary.

Okay, well, that is all the time and questions that we have for today. Guys, thank you so much for this great, really informative presentation — it's got a lot of interest going here. Again, just a reminder: I will send a follow-up email by end of day Thursday for this webinar, with links to the slides and links to the recording of the session, as well as anything else requested throughout. So thank you, thank you all, and I hope you all have a great day — stay safe out there. Thanks, guys.