And here we go. Hello and welcome. My name is Shannon Kemp and I'm the Chief Digital Manager of Dataversity. We'd like to thank you for joining this Dataversity webinar, The Automated Business Glossary, sponsored today by Octopai. Just a couple of points to get us started. Due to the large number of people who attend these sessions, you will be muted during the webinar. For questions, we will be collecting them via the Q&A in the bottom right-hand corner of your screen. Or, if you'd like to tweet, we encourage you to share highlights or questions via Twitter using hashtag Dataversity. And if you'd like to chat with us or with each other, we certainly encourage you to do so; just click the chat icon in the bottom middle of your screen for that feature. And if you'd like to continue the conversation after the webinar, you can keep networking at community.dataversity.net. As always, we will send a follow-up email within two business days containing links to the slides, the recording of the session, and additional information requested throughout the webinar. And a special note: today's webinar is built on a recent joint survey by Dataversity and Octopai on business glossary automation. The data was also used to generate a new joint white paper, What Happens When You Automate a Business Glossary, and I will include a link for all webinar registrants to receive a first copy of this white paper in the follow-up email. Now, let me introduce our speaker for today, Amnon Dori. Amnon is the CEO and co-founder of Octopai, a leader in metadata management automation for BI, with over 20 years of leadership experience in technology companies. Before co-founding Octopai, Amnon led sales efforts at companies like Zen Technologies, Modus Nova, and Alvarian. Amnon studied management and computer science at the Open University of Tel Aviv. And with that, I will give the floor to Amnon to get today's webinar started. Amnon, hello, and welcome.
Hi, thank you so much for having me today, and thank you for the intro. I'm very, very excited to have everybody who joined and everybody who registered. I know the times are challenging, but most importantly, beyond any topic, I really hope that everybody is safe, everybody is healthy, and everybody's families are okay. That's on the personal side. And just as you mentioned, we have done some work together with Dataversity, talking to the market for the past couple of months, trying to understand a little bit more about the challenges around business intelligence and analytics, specifically around a very hot topic we've seen in the past year that has to do with data catalogs, and specifically with the business glossary. With the very good work that we've done with Dataversity, we thought it would be a good opportunity to share some of the results: answers and very interesting insights that our clients, as well as some business intelligence professionals, were kind enough to share with us, and which we believe could be interesting for everybody. So in today's session, we're going to share some information that was captured through the analysis and the survey we've done together with Dataversity. And we thought it would be a very good idea to add on top of this more information that Octopai has gathered while engaging with the market for the past four and a half years. Collectively, whoever joined the session will hopefully be equipped with more information from what we call crowdsourcing, which might even help a little with daily dilemmas and daily decisions. So together with Dataversity, we conducted a very, very thorough survey over the past two months, in January and February. More than 300 business technology professionals spent their time addressing questions and very, very openly shared with us their insights about our business domain.
It was also really interesting to see that business technology professionals coming from different verticals suffer from very much the same challenges. Whether you're in financial services, computing, telecom, manufacturing, pharma, healthcare, or universities, it seems like it's a very, very small world. At the end of the day, people struggle with the same things all around. The goal of the survey was to understand from the field what business professionals are going through on a daily basis, and, more interestingly, as I'm coming from many, many years of automating old manual processes, what their view is about the role of automation, specifically around business intelligence and analytics operations. To add to that, we also included some insights of our own, as we communicate with the market almost on a daily basis. We thought combining these two data points could be very, very interesting to share with you today. So let's start. The session was mainly about the business glossary. And I think that if I were to ask five of you what you think a business glossary is, we would probably get eight different answers. Now multiply that by the thousands of people who are asked, "Do you want a business glossary?" So we started with a very, very basic question: what do you think a business glossary is? Or, how do you define a business glossary? When we asked the users, we talked to BI managers, we talked to data analysts, and sometimes we talked to CDOs who are flooded with different terms like data catalog, data dictionary, and business glossary. So what is their definition? Among many, many different answers, we captured the top ones. One definition of a business glossary is a place where all the metadata, or the business metadata, that relates to the reporting systems exists. The second answer was that it's very much a data dictionary. The third was that it's the same as a data catalog. And the fourth was, "I'm not really sure."
"I don't really know." So it's kind of hard to have a decent conversation when you're trying to discuss what you expect from a business glossary and what is important to you from a business glossary point of view. If you're a vendor like us, or you're an analyst, and you ask the client what they would want to see from a business glossary, they have different answers. To spice it up, we also asked our users what is important for them to see in a business glossary, or what's important to them in a business glossary specifically. The answers were: I want to have this as a product, preferably out of the box. I don't want to do any professional services, and I don't want to invest any capital to get this up and running. I most probably don't have any documentation of things that were created maybe three, five, or ten years ago; can I use a product that works out of the box? Which also leads to fewer, or even no, resources required from their staff. I don't know if you've experienced this: you want something, and then you understand what it would take to get to that point, and in some cases you give up on what you wanted because the investment and the road to get there is so tiring and so expensive. They also mentioned, and we've seen this trend growing over the past four years, that the talk about automation and doing things automatically has grown rapidly. So automation around creating, managing, and updating the business glossary is something that is really, really important. And fourth, we don't want the business glossary to be just a standalone unit by itself. We want it to live together with the other applications that serve the entire organization.
It doesn't mean that I wouldn't buy or use a dedicated business glossary, but I want it to be integrated with other capabilities that are important for me as an organization, as part of delivering data accurately and quickly to my business users. The last question we asked was: so what's keeping you from having a business glossary? You want it, and you know pretty much what you're looking for, so why aren't you doing anything about it? And the answer, going back to the previous two slides, was that it still requires too many internal resources just to create it, to record it, to inject data elements into the business glossary, and to maintain it at a rapid pace as the environment keeps changing all the time. That makes it very costly, or too costly relative to what they, or the organization, would be willing to invest in such an initiative. The third answer was an unclear expectation of what you should accomplish. Going back to slide number one, where every organization has a different perspective or perceived value around the business glossary, you need to set the right expectations internally to make sure that what you deliver at the end of the day fulfills the promise of what the business users wanted. And the fourth barrier we see is that it's not automated. If there's one thing that we've heard again and again, quite extensively this year compared to previous years, it's that people want more automation embedded, so less manual work happens and the team can focus on the really interesting stuff. If there are things that could be automated for them, that's more than welcome. So then we also asked: could you share what your day-to-day looks like?
Could you share the top use cases where you end up spending a lot of time due to business requests and demands around data? And here are the top three BI and analytics use cases that teams are challenged by. The first one was implementation of a business glossary, data dictionary, or data catalog. I'm not going to argue about the definitions of each; there's a whole world of explanations. But if there's something that is really difficult, it is the implementation of such a solution. The second business intelligence challenge is daily operations: the ability to understand, analyze, and track the entire data behavior, the entire data journey, from the first instance of how business users look at a data set in a report, through the journey that lands the data on the report, until you understand what that data means. And there are four questions that we've learned come up daily within business intelligence and analytics groups. Question number one: where is the data coming from? Question number two: where is the data going to? Question number three: where is the data across my business intelligence tools? And four: what does this data mean? When you ask for a data element called territory, or client, or customer, or state, or commission, or policy, what exactly did you mean by that? And I'm going to show you a use case today involving a seemingly self-explanatory data element called full name. If I were to ask you what you think a column called full name represents, and what you should see when you ask for the full name of a client, what do you think it should include? I'm going to show you an example of that.
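As a hedged illustration, the four daily questions map naturally onto fields of a single column-level metadata record. The class and every name below are my assumptions for the sketch, not Octopai's actual data model.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative only: one column-level metadata record that can answer the
# four daily questions above. All names here are assumptions.

@dataclass
class ColumnMetadata:
    name: str                                         # technical column name
    meaning: str                                      # Q4: what does this data mean?
    sources: List[str] = field(default_factory=list)  # Q1: where is it coming from?
    targets: List[str] = field(default_factory=list)  # Q2: where is it going to?
    found_in: List[str] = field(default_factory=list) # Q3: where does it live across my BI tools?

full_name = ColumnMetadata(
    name="FULL_NAME",
    meaning="Client's first and last name, concatenated",
    sources=["crm.customers.first_name", "crm.customers.last_name"],
    targets=["BusinessObjects: Sales Report / Full Name"],
    found_in=["SQL Server", "Informatica", "BusinessObjects"],
)

# Answering question 1 for this column:
print(full_name.sources)  # ['crm.customers.first_name', 'crm.customers.last_name']
```

The point of the sketch is that all four questions are answered from the same record, which is why a single automated metadata repository can serve lineage, discovery, and glossary at once.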
And the third, a trend we see growing: moving away from old systems to new systems, moving from on-prem infrastructure to the cloud, and understanding how to migrate or upgrade from legacy systems to more modern BI tools. The combination of these three things creates a very, very difficult situation for business intelligence teams to cope with. And you can see that among all of these use cases, almost 70% said that the implementation of a business glossary specifically is one of the top three use cases challenging BI and analytics teams. So let's move on to some of the interesting questions that we asked the users through Dataversity, and I want to share the results with you. As Shannon said, you're going to get a copy of this as well as the results of the survey. So question number one: when you run into a data inaccuracy, what do you do? How long does it take to understand the root cause, to trace the source of an error from the report all the way backwards? About a third of the responses said it takes many hours, and almost a quarter, 25%, said it can take as long as a full day. Now, if that were one instance a week, there would be no issue. But one thing we've learned in the past couple of years is that incidents of this type are happening more and more often. We're seeing that the need to address these use cases at any given point in time is becoming unbearable if you keep doing things manually. Another question: tracing the data flow. How many hours does it take you to trace how the data moves around within your business intelligence tools, for a variety of reasons, like impact analysis to understand the root cause, or impact analysis to understand changes upstream, from the source all the way to the report?
More than a third of the responses said that they spend five to 15 hours on an incident like that, just to understand the impact. Multiply this by dozens of times on a weekly or bi-weekly basis and we're talking about a lot of time and unnecessary effort going down the drain. So, going back to the previous use case and this one: can we find a common denominator to automate all of these use cases and get results with the click of a button? That could increase efficiency, save a lot of time and frustration, and also speed up the business intelligence and analytics team's ability to provide accurate data to their business users. We also asked: if you want to change a field, enhance it, extend it, delete it, mask it, do something to it, how long would it take you? 45% spend anywhere from a few days to sometimes a few weeks just to find the impact of a change to a single field. That leads to frustration, and frustration is not a word that I picked; it is one of the most common words we've seen our respondents using. They said that the level of frustration keeps growing because they don't see the light at the end of the tunnel. They keep doing things manually, and there's no way to add more people to the team in order to become more efficient. As a matter of fact, 86% said they are frustrated on a daily basis. This question was important to us because, at the end of the day, we are professionals, we are people, and we want to be happy doing our jobs. We want to be satisfied doing our jobs, and not be held back by the fact that we're not equipped with modern technologies that would enable us to be better. And 27% of those said they are extremely frustrated. That is not a great place to be when you have to do this every day. We also asked them which two capabilities they would wish to see incorporated together with a business glossary.
If business glossary implementation is number one, which two additional capabilities do you think should live side by side with a business glossary, so you can have a coherent picture of the data movement process and the discovery of data within business intelligence and analytics systems? Over 200 responses said that data lineage and data discovery are very, very important to them. As a matter of fact, if you could have a business glossary, which is very much tailored to the business, incorporated with more professional tools around data lineage and data discovery, you could have a very coherent story, all the way from the requirement of the business, through the analyst, through the data scientist, through the BI developer, all the way to the ETL developer, who are collectively responsible for the data movement process from the data sources to the data consumers, whom we call the business users. So we asked them: if you could have all three of these together, do you think it would give you more capability altogether? Can we have a formula that says one plus one plus one equals nine, not three? And the answer was yes. The key would be not to deal with each of these capabilities separately, with different techniques, but rather to build those capabilities on a platform that is fully automatic. Can you introduce automation to the management of the metadata, which is the base, or I would say the raw material, that we use for business glossary, data discovery, data compare, version management, data lineage, and so forth? Can we bring automation to the entire operation? And the question was: to what extent do you think automation plays an important role for data lineage, data discovery, and business glossary all together, not just for a single specific capability?
As you can see, 83% are convinced that automation is the key to becoming better at any one of these capabilities, or all of them together, and at additional metadata management capabilities going forward. And this is really, really important because, at the end of the day, we're trying not to deal with a single problem. We're trying to look a little at the macro level and ask: is there a technology we can provide that will eliminate manual work across different domains, like data lineage, business glossary, data catalog, version management, and data discovery, so you have a common denominator, a platform that serves many of the organization's needs by automating things? And this is where we present the automation of the business glossary and why it is critical to BI, and these are some of the inputs that we got from the people we ended up talking to. The importance of automation is to overcome the implementation barrier. We've seen a lot of organizations that have a current initiative to implement a business glossary, or to go even further toward a data catalog, but they are stuck. They stop at some point because it's too complicated, it's too costly, and it takes too much of the organization's capital and resources to get to a position where you have a business glossary. So can we remove this barrier, so organizations can enjoy having a business glossary to serve their business? The other point was that the glossary environment must represent the actual environment, not something that captures what existed six months ago.
If we can upload, maintain, and refresh the business glossary through automation, with an automatic refresh of metadata from all of your BI systems, no matter how many you have and no matter which vendors you're using, you let the automation work for you, so you can do your job, or a more interesting job, rather than spending so much time on the plumbing, building the environment at an infrastructure level just to be able to work. This is another place where they see a role for automation. Third is real collaboration: the fact that everything is centralized in a single repository that keeps being edited, keeps being modified, and keeps refreshing itself automatically provides real collaboration between the different stakeholders in the data journey, from the data sources to the data consumers. Different roles, different stakeholders, at any given point in time, from anywhere on the planet, can enter that collaborative product and talk about the same thing without guessing. Which leads to something more like self-service business intelligence. And if this slide is something you recognize, this is the kind of landscape we see in many organizations: data sources contain data, and data runs through different business processes that combine different tools, different instances, and different environments, whose job is to ship data and store it in database tables and views in data warehouses from different vendors, in different entities, in different departments. And there are reporting tools in the organization, some old, some a little newer, whose job is to show data to the business users so they can make decisions. This infrastructure, this landscape, is becoming more and more complicated to manage.
So in that sense, I want to show you a demo of the automation and how our clients use and leverage automation, combining data lineage and, mostly, business glossary, in a collaborative story that consists of two or three use cases, told from the organizational point of view. I'm going to start with the story of an insurance company that had a very, very specific question: can I trust the data in my report? In our language, the BI professional's version is: I understand the business request, but in practice I want to find the exact ETL processes and database tables that are associated with that specific report of that specific business user, in order to understand whether I can answer their question. Or take a company whose business users wanted to enrich their data with additional data, but couldn't find it in the reporting system. So they went to their analysts saying: could you enrich my report with more data that I know was created in the data source? And now you're flying blind. The answer could be, yeah, I can get it in two hours, or maybe two months, because I don't really know whether the data is accessible. Where can I find it? What do I need to do to ship it over? Which ETL processes are related to landing the data on that specific report? What could be the impact analysis of those changes? And a whole bunch of questions which I'm sure are not unfamiliar to you. And the third question, the third use case, has to do with: I need to understand what a specific data element means. Going back to a couple of minutes ago, when I asked: could you think of the exact definition of a data element or column called full name? What would you expect to see? So I'm going to show you now a demo that combines all three of these use cases.
Obviously I could do each one of them separately, but I think tying the use cases together, leveraging lineage together with glossary, can show you the power of automation and the power of collaborative modules working together. So I'm going to move to our demo environment. And Shannon, if you don't see three round circles, please let me know; otherwise I'll continue. What you see in front of you is what I promised in the presentation. This is a typical environment from which we extracted metadata and uploaded it to Octopai. That process takes about 30 minutes to an hour, not more than that, and if you wish to do it more than once, you can automate the entire extraction process. It's a very, very simple process. Once you extract the metadata and allow Octopai to analyze it for you, you get a link and you start working. So what is it that you see in this demo? In this demo we've used different types of metadata. This is a demo environment, not a real customer's environment. We captured metadata from Informatica, from SQL Server, from Power BI, Tableau, and BusinessObjects. So, different systems: ETL, data warehouse, analysis services, and reporting tools, from different vendors. You can see here 386 ETL processes shipping data to 2,500 database tables and views, landing the data on 22 reports. If you do simple math on the possible data junctions between data pipes, you're talking about tens of millions of possible data pipes within this very, very simple BI landscape. In practice, customers range from thousands of ETL processes, to tens of thousands of database tables and views, all the way to thousands and tens of thousands of reports.
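The "tens of millions" figure is just multiplication: treat every (ETL process, table/view, report) combination as a potential point-to-point data pipe. The demo numbers come from the slide; the customer-scale numbers in the second line are illustrative assumptions, not survey data.

```python
# Back-of-envelope for the possible-data-pipes claim: every combination of
# ETL process, table/view, and report is a potential point-to-point pipe.

demo_pipes = 386 * 2_500 * 22                 # figures from the demo landscape
print(f"demo landscape: {demo_pipes:,}")      # demo landscape: 21,230,000

customer_pipes = 2_000 * 20_000 * 2_000       # assumed mid-size customer scale
print(f"customer scale: {customer_pipes:,}")  # customer scale: 80,000,000,000
```

Even the deliberately simple demo landscape lands in the tens of millions, which is why nobody maintains this map by hand.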
The number of possible data pipes, multiplied point to point, in some cases reaches tens of billions, which by no means is manageable manually, and in most cases isn't recorded, captured, or documented in any way; at least from what we've seen, clients don't really invest in documenting things, because things keep changing all the time. So, let's start with the use case. I'm a business user looking at my report, called Sales Report. In my Sales Report there's a column called Full Name, and in that column I see the first name and last name of my clients, which is great. But now I also want to see the middle name, and for some reason I don't see it. So I ask my analyst to enrich my report by adding a very simple data element called middle name. But I'm a business user entering my reporting system, and I cannot find a field called middle name. So, as an analyst, I want to understand where this field exists. But first of all, I need to find the report. I could go to the reporting section right here, because the report could have been generated in one of these four reporting tools, or I can go here, to the main search, and search for a report called Sales Report. As I'm typing, I immediately find the report; it was generated in BusinessObjects, and I want to see the lineage inside the reporting system. I want to map the lineage inside the different layers, in this case, of the BusinessObjects report. Very quickly I can see that indeed the business user is right: he cannot find the middle name, because the report concatenates two technical data elements, merged together into a column called full name. So my question is: where can I find the middle name so I can enrich that? In that case, I would go to the Business Glossary here, which, again, captured all of the data from all of the reporting tools.
So I'm going to move to a Business Glossary view, and I'm going to look for full name, just to understand whether I should expect to see the middle name as an available data element. By clicking here, I can immediately see that full name was initially designed to include first name, middle name, and last name. I can also see the formula that was originally created to enable the full name column to include the middle name. So now that I know that, and I know that Jeff Smith is responsible for it, I need to go back and understand why I did not see the middle name data element. I would go back here, and I'm going to click a different button, called lineage, because I want to understand which ETL processes, out of the 386, and which database tables and views, collectively store and ship the data element that is missing. By clicking a button, within less than a second I see the following: this report basically fetches data from this table, which is in SQL Server, running through this schema in Analysis Services, and the data lands in this table via these ETL processes. So now I'm going to check whether the data element exists or not. If it does, I can enrich the report by just fetching it from the database; or maybe it just doesn't exist, even though it was designed to, and I need to enrich my ETL processes. In this case, this ETL runs on Informatica, while this ETL runs on SSIS. So, if I find that the middle name data element is missing, and I've decided that I need to enrich this ETL, the immediate question is: I need to design a change in this ETL to enrich that business process, but would that impact only this report? With a click of a button, you can run the lineage forward and, within a few seconds, get the following picture.
This ETL right here impacts not only this report, which I want to enrich; it impacts all of these reports right here, and all of these database tables and views, which are collectively dependent on this Load Data Warehouse ETL. If you remember, this report was generated in BusinessObjects, while this one was generated in Power BI. Two different reports, run by two different users, probably in different departments, are going to be impacted by the changes that I'm planning to make in this Load Data Warehouse ETL. But this is lineage between systems: between ETL, data warehouse, and reporting. What if I want to dive in to see the inner lineage, lineage at the column level, into the different layers of that specific ETL? Well, it's quite easy to see that there are four lines coming out of it. From my point of view, this Load Data Warehouse ETL is a collection of maps that ship data to four different database tables. Let's dive into the lineage, all the way down to the column level. I can dive into each one of these lines, or I can dive into the package, which is, again, diving into the process. Here are the four maps. And this is the lineage at the column level: here's the source table from the CRM, which I want to enrich by adding another data element, or where I might want to modify, change, or mask some of the fields. And I can go all the way to the destination table, and to the upper lineage or the inner lineage. So: the ability to navigate within your BI infrastructure, up and down, between systems, between tables, between columns; to integrate data lineage with the business glossary so you get a coherent picture; and, within the business glossary, also to tag things. If you've seen in the lineage that full name is associated with first name and last name, we can immediately link a given data element to its subordinates by automatically creating the link for you.
You can also tag data elements to specific business units, projects, people, or initiatives, as many as you want, so it will be easier for you to find them within your entire landscape, and to filter the different layers from both the business perspective and the technical perspective. And the cool thing about it is that once we've extracted the metadata from your reporting tools, we can also extract the descriptions associated with any specific data element, formula, report, what have you. If you already have them, we will surface them here as the original item descriptions and add those descriptions for you. So from now on, you will have a single repository in which to add more descriptions, rather than separate reporting tools. What we've learned is that the majority of organizations either did not create descriptions or did not maintain them. If they want to, they can upload whatever descriptions they have from any other application, like Excel or reporting tools. But if they want a single repository managing the full descriptions of their entire landscape, they can edit and create descriptions here and share them with their business users. So the next time someone wants to use full name, they will know what to expect, and you can validate that it actually works when you go and examine the lineage. And if it doesn't exist, you can easily, with a few clicks of a button, understand where you should expect that field to exist, and if it doesn't, what you need to map in order to ship more data. With that said, I wanted to share with you how we take individual use cases from different verticals and tell a coherent story of why all these modules, when they are automated and integrated, give you the ability to deal with use cases separately, or with a collection of use cases for the entire organization.
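The forward-lineage step in the demo, finding everything downstream of one ETL before changing it, can be sketched as a breadth-first walk over a dependency graph. The graph below is a toy loosely modeled on the demo; every object name is an assumption, not the demo's real metadata.

```python
from collections import deque

# Toy dependency graph: the Load Data Warehouse ETL feeds several tables,
# which in turn feed reports in different tools. All names are illustrative.
feeds = {
    "etl.load_data_warehouse": ["dw.dim_customer", "dw.fact_sales",
                                "dw.dim_product", "dw.dim_date"],
    "dw.dim_customer": ["businessobjects.sales_report",
                        "powerbi.revenue_dashboard"],
    "dw.fact_sales": ["businessobjects.sales_report"],
}

def downstream_impact(node, graph):
    """Forward lineage: breadth-first walk over everything the node feeds."""
    seen, queue = set(), deque([node])
    while queue:
        for child in graph.get(queue.popleft(), []):
            if child not in seen:
                seen.add(child)
                queue.append(child)
    return sorted(seen)

print(downstream_impact("etl.load_data_warehouse", feeds))
```

Even in this tiny graph, the walk surfaces a Power BI dashboard that a change to the BusinessObjects-focused ETL would also touch, which is exactly the cross-tool impact the demo shows at scale.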
So at this point, even though I could talk for the next two days about Octopai and share more and more information, I think it's a good time, Shannon, to start answering some questions. Amnon, thank you so much for this presentation. We'll move into answering the most commonly asked questions, and if you have questions for Amnon, you may submit them in the Q&A portion in the bottom right-hand corner of your screen. I will send a follow-up email for this webinar with links to the slides and links to the recording by end of Monday, as well as a link to the white paper containing additional information from the survey. So, Amnon, diving into the questions here: does the business glossary include or exclude technical glossaries? It includes the technical glossaries. Very much the customers' preference, I would say, is to have both the business and the technical terms stored, extracted, and managed within our automated business glossary. And especially as one of the founders, can you tell us a little bit about when you founded Octopai, how you've grown, and why you started? Yes. We founded Octopai in 2015, with three founders who have a lot of experience in the data area. Octopai was founded after I spent almost seven years at another company we established, called Panaya. The theme of Panaya was automation of manual processes in ERP upgrades, specifically around Oracle and SAP: taking a year-long project and shrinking it to four months by automating the simulation of the impact analysis of upgrading. And through that theme of automation, I was fascinated by its huge impact on the organization, but also at the individual level, on the people who are doing a tremendous job just to keep the organization up to date.
So we took the concept of automation, shifted it away from the ERP area, and created a new company focused on data management, specifically around business intelligence and analytics, which is as close to the business user as possible. The automation is something we're very proud of, and it is built around a raw material we are fascinated with called metadata, in its different facets. Today you've seen some of the outcomes of a platform that does proper analysis of metadata. We now serve many dozens of clients across different continents, verticals, and sizes, from Fortune 500 companies to a 1,000-person company. The common denominator is that they all have to balance an impossible equation: they need to become faster and more accurate, but they are always short on time and resources. How can you change the physics of becoming better without compromising on quality, or on quality of life at the personal level? Automation is something we see being adopted very strongly.

So what is your definition of a business glossary? There seems to be a lot of opinion out there. Where do you draw the line between a glossary and a dictionary?

Great question. I'm not going to try to educate the market; instead I'll share what the market has told us. We believe the data dictionary and the business glossary should be managed as one. The business glossary is, by and large, the terminology used by the BI developers who create dashboards and reports and look at the business terms: when you say "customer," what does it mean? The data dictionary is much more on the technical side; it lets you understand that DP means policy number, or that CS or cust ID in the business language means customer ID. Many technical terms are used as shortcuts, or even as masks, due to regulations.
So the data dictionary is perceived to be more on the technical side and the business glossary more on the business side, but both need to contain the physical layer, the semantic layer, and the presentation layer, meaning both technical metadata and business metadata and the relationships between them.

Perfect. And we have a lot of great questions still coming in. Does Octopi provide an SIB option, like a trial, to help organizational stakeholders see the value of the product?

The answer is yes.

I love it. Simple question and answer. Are there any tools for unstructured data types?

Not that we are aware of. As a matter of fact, when we established the company about four and a half years ago, it's really interesting: we didn't think we would need to deal with all of these tools. The company was initially born to serve the more modern, cloud-based tools and unstructured types of data, like big data. But when we started to sell, or to share the concept of automation on a central repository, we talked to hundreds of organizations, and they said: it's great to deal with big data and unstructured data, and this is important, but we have something more urgent. We asked, what could be more urgent? And they said: well, the majority of the data we run today, and the majority of tools we use today, are the kinds of tools you see in front of you on this slide, and we're not going to get rid of them so fast. The cost of changing or eliminating them is greater than having them collaborate with big data environments or unstructured data layered on top. So we spent the past three years on extracting, analyzing, and modeling metadata from these tools.
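The idea of keeping the data dictionary and business glossary together, so that the physical, semantic, and presentation layers stay linked, can be illustrated with a toy lookup. The shortcuts ("cust ID", "DP") come from the talk; the structure, table names, and function are invented for illustration and are not the product's format:

```python
# One catalog entry carries both the data-dictionary side (physical column)
# and the business-glossary side (term and definition), so the relationship
# between the layers is never lost.
catalog = {
    "cust_id": {
        "physical": "DW.CUSTOMER.CUST_ID",   # physical layer (hypothetical)
        "business_term": "Customer ID",      # presentation layer
        "definition": "Unique identifier assigned to a customer at onboarding.",
    },
    "dp": {
        "physical": "POLICY.DP",
        "business_term": "Policy Number",
        "definition": "Contract number printed on the policy document.",
    },
}

def explain(shortcut: str) -> str:
    """Resolve a technical shortcut to its business term and physical home."""
    entry = catalog[shortcut.lower()]
    return f"{shortcut} -> {entry['business_term']} ({entry['physical']})"

print(explain("cust_id"))  # cust_id -> Customer ID (DW.CUSTOMER.CUST_ID)
print(explain("DP"))       # DP -> Policy Number (POLICY.DP)
```

Splitting this into two separate stores, one per "side," is exactly what the answer above argues against.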
And as more unstructured data takes its place, we will be able to support it; from our standpoint, this is just a few quarters away. We haven't yet seen unstructured data occupying the majority of the real estate; organizations first want to deal with these kinds of systems. So the answer is that these types of tools still dominate today, but for any clients or prospects who want to focus their efforts on unstructured data, support is coming very soon from Octopi.

Does Octopi integrate with ETL jobs running on a mainframe, or just the SQL Server and Informatica jobs that you presented?

The answer is yes.

Love it. So how do you do the data lineage: automatically or manually? If automatically, how? And from the business glossary to the technical metadata, how does Octopi identify the link?

Well, for that we would need more than a couple of minutes; this is exactly our IP, and the technology we've spent so many years building. It's a combination of machine learning, pattern analysis, and pattern-recognition algorithms that we've put in place. To be more concrete: in some cases, we can understand the relationship from source to target because it is explicitly reflected in the SQL. In other cases, we make the connection implicitly, reasoning from the similarity of different relationships, such as transformations. So we have technology that enables us not only to picture this, but also to make sure that what we show you exactly reflects reality. In some cases, we see a relationship that either is not extracted or cannot be understood from the customer's metadata, and for that we enable a couple of things: implicit links versus explicit ones.
You will see a dotted line, which is Octopi's calculated suggestion that this could be a relationship between data elements; this comes from some of the machine learning capabilities we've implemented. There is also a manual-link capability, for clients who want the freedom to add connections between data elements that either have not been detected or are not analyzed by Octopi at all. So the automation rests on our infrastructure for modeling, indexing, and understanding the relationships between data elements, and so forth.

The last thing I want to share is that the reason you can click here and see a lineage in a few seconds is that our engine has already analyzed billions of permutations of possible data pipes. When you click, the calculation behind what you see happens on the fly; it's not a static picture. Once you click, it is really generated on the fly, with results in seconds. The entire mechanism of capturing, analyzing, creating the relationships, indexing billions of possible lineages, end to end or many to many, and being able to click and visualize them so fast: that is where our technology comes in.

Is there a way in the tool to surface content curated in the automated business glossary directly onto reports in SSRS, Tableau, Power BI, and so on?

The answer is yes.

And does the business glossary include data steward information?

No. We only hold information about the different owners of a data element, not data stewardship. We haven't seen that as a major requirement within the business glossary thus far. Nevertheless, if we see growing demand to enhance the business glossary with that capability, we can definitely add it.
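The explicit-versus-implicit distinction described above can be sketched in a toy example: an explicit link is read straight out of the SQL, while an implicit "dotted line" link is only suggested from the similarity of element names. Everything here, the regex, the similarity threshold, and the sample names, is an illustrative assumption, not Octopi's actual engine:

```python
import difflib
import re

def explicit_links(sql: str):
    """Read a source->target pair directly from an INSERT ... SELECT statement."""
    m = re.search(r"INSERT\s+INTO\s+(\w+).*?FROM\s+(\w+)", sql, re.I | re.S)
    return [(m.group(2), m.group(1))] if m else []

def implicit_links(sources, targets, threshold=0.7):
    """Suggest 'dotted line' links where names are similar but no SQL proves it."""
    suggestions = []
    for s in sources:
        for t in targets:
            if difflib.SequenceMatcher(None, s.lower(), t.lower()).ratio() >= threshold:
                suggestions.append((s, t))
    return suggestions

# Explicit: the relationship is stated in the SQL itself.
print(explicit_links("INSERT INTO sales_fact SELECT * FROM staging_sales"))
# Implicit: 'cust_id' and 'customer_id' are similar enough to suggest a link.
print(implicit_links(["cust_id", "order_dt"], ["customer_id", "region"]))
```

A real engine would of course weigh transformations, types, and usage patterns rather than raw string similarity, and would let a user confirm or reject each dotted-line suggestion, matching the manual-link capability mentioned above.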
The reason I answer the question this way is that you have to draw lines between different capabilities so as not to cross domains, because data stewardship is a very thin line away from data quality, and then data governance, and we haven't seen a single vendor that can provide that whole range of capabilities well. We're trying to focus our efforts on what we do best. Also, this product, if I haven't mentioned it, collaborates with other tools such as data governance and data profiling. All the metadata that has been captured, analyzed, and visualized in Octopi can be injected into other tools, like data governance tools, and so forth.

And I think we have time for a couple more questions. Typically a business glossary does not exist in a database; it exists in Excel spreadsheets or in wikis. Does Octopi tie into tools that are not databases?

Yes. You can upload files and we can recreate them within our application.

Easy. There are also some questions asking for technical information, which we will send out to everyone. And what is required for developing the lineage you described? Plugins or crawlers?

Nothing, from the customer's perspective. If any prospect is interested in trialing the product, or even buying it, the process is very simple. We create an account for the prospect on either an Azure or an AWS instance. You get a link to download the relevant extractors for the systems you want to extract from. You then configure each extractor for the specific system whose metadata you want to extract; this can take from two minutes to maybe two hours, depending on the size of the system, whether it's BO or whatever. The extractor creates an XML file, which you upload to your account. Within one or two days, after we finish the analysis, you are invited to a session with our customer success team, who show you the results of the analysis and guide you in starting to use it.
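The extract-and-upload flow just described (configure an extractor against a system, produce an XML file of metadata, upload it for analysis) can be sketched roughly as below. The element names and file layout are invented for illustration and are not Octopi's actual extract format:

```python
import xml.etree.ElementTree as ET

def write_extract(system_name: str, fields, path: str):
    """Write a metadata-only extract (no data values) for one source system."""
    root = ET.Element("metadata_extract", attrib={"system": system_name})
    for f in fields:
        # Each field entry records metadata about a data element, never its data.
        ET.SubElement(root, "field", attrib={"name": f["name"], "type": f["type"]})
    ET.ElementTree(root).write(path, encoding="utf-8", xml_declaration=True)

# Hypothetical system and fields; the resulting file is what would be uploaded.
write_extract("SampleBI",
              [{"name": "cust_id", "type": "string"},
               {"name": "order_dt", "type": "date"}],
              "extract.xml")

print(ET.parse("extract.xml").getroot().attrib["system"])  # SampleBI
```

The key property the answer emphasizes is that only metadata leaves the customer's system, and the file is small enough that configuration and extraction take minutes to hours, not days.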
So the only thing required from a resources point of view is to let us know whom we need to talk to, so we can guide him or her through extracting the metadata. That's it.

I love it. I'm going to slip one last question in here for you. How does Octopi integrate with existing metadata management repositories, third-party ones or ones embedded in a data integration suite?

We would need to understand which one in particular. There are metadata repositories in different tools, such as Informatica EDC, and some of them store metadata in particular ways. So it very much comes down to understanding where the metadata is stored. We have an open API, so we can either inject data elements into, or accept them from, different sources, but a clearer answer would require understanding which systems or sources a specific customer is referring to.

Well, Amnon, thank you so much, and thanks to all of our attendees for being so engaged with all the great questions, but I'm afraid that is all the time we have for today. Again, just a reminder: I will send a follow-up email to all registrants by end of day Monday with links to the slides and links to the recording, and we'll get you some additional technical information about Octopi as well. I loved all the questions that came in. And you will get a first run at the new white paper we generated as a result of the survey, which complements the webinar. So thank you, everybody. I hope everyone stays safe out there. Amnon, thank you so much, and thanks to Octopi.

Thank you so much. Thank you. Bye-bye.