Let's start with something called the data catalog, or the metadata derivation problem. If you think about it, you might have 100 sources, internal or external, and each source will have its own metadata around how it collects the data, and each source will be feeding its own value types into your system. So you need a catalog of all of these. That is the source catalog perspective: what are the sources you're getting from? That is what I'm referring to as the raw data. Now, once you refine the data, you might add 10 or 11 columns to it, in the sense of: I have refined it, I have enriched it, and these are the additional things I have figured out about the dataset. That creates a whole new set of datasets, which forms the refined data catalog. Then, when you're curating datasets, there may be additional views, inferences, and derivatives on top of this curated data which you serve to various end customers. Suppose you have 10 customers; they may not all be consuming the same data. Each would be consuming their own slice, which has catalog implications, say a tag around who's consuming it and who the source is, right? You may not run into a major catalog issue unless you attempt a complete denormalization, that is, a union all of every dimension of the data you're collecting across all the sources into a single denormalized table for consumption, which is practically not possible. So from the NPD angle, the challenge is that the draft does not clearly state which catalogs I need to make available to my end user: whether I should expose everything from my source catalog through my refined catalogs to my consumption data catalogs, or whether I should invest in creating a new dataset, with its own metadata, exclusively for NPD consumption.
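To make the three layers concrete, here is a minimal sketch of catalog entries flowing from a raw source, through refinement, to a per-customer curated view. All names here (the dataclass, the datasets, the tags) are hypothetical illustrations, not from any specific catalog tool or from the NPD draft itself.

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    name: str
    layer: str        # "raw", "refined", or "curated"
    columns: list
    tags: dict = field(default_factory=dict)  # e.g. source, consumer

# Raw/source layer: one entry per upstream source, with its own schema.
raw = CatalogEntry(
    name="sensor_feed_a",
    layer="raw",
    columns=["device_id", "ts", "reading"],
    tags={"source": "vendor_a"},
)

# Refined layer: same dataset, enriched with derived columns.
refined = CatalogEntry(
    name="sensor_feed_a_refined",
    layer="refined",
    columns=raw.columns + ["quality_score", "geo_region"],
    tags={**raw.tags, "derived_from": raw.name},
)

# Curated layer: a per-consumer view, tagged with who consumes it
# and where it came from, as described above.
curated = CatalogEntry(
    name="sensor_feed_a_for_customer_x",
    layer="curated",
    columns=["ts", "geo_region", "reading"],  # only this consumer's slice
    tags={**refined.tags, "consumer": "customer_x"},
)

for entry in (raw, refined, curated):
    print(entry.layer, entry.name, entry.tags)
```

The point of the sketch is that each layer multiplies the catalog surface: 100 sources times a refined copy each, times however many consumer views, is what makes the "which catalogs must I expose?" question non-trivial.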
So this is, I would say, one gap in the NPD draft: it does not clearly define which catalogs I need to maintain. That becomes a question the data architect or the data engineering team has to answer down the line. And there are other common challenges in data cataloging itself. One is cardinality management: you could have high-cardinality data and low-cardinality data. You will have tags for compliance, for sources, for consumption. A couple of other unique things are the paths where the data resides: there can be logical paths and physical paths, covering where it resides and where it flows through. And then versions: what is my current version of the data, and which versions have I sent out for consumption? These are all pieces of metadata you need to manage around the data, and it all gets compounded if you have to manage them for the NPD scenario as well. Another challenge with cataloging is the tool set. As we speak, it is maturing fast: we have a bunch of cloud-native tools as well as open-source tools available for data cataloging. So tool-set maturity would be a concern, but it is being addressed rapidly. The third thing is the cultural aspect. Cataloging, or creating an asset inventory of all your datasets, typically ends up on the back burner in any organization. Being a tech person, I would draw a parallel to how unit testing takes a back seat during development, so testing is always playing catch-up: if 100% of the code has been developed, test coverage might be at, say, 40%, then it creeps up to 50% or 60%, and so on.
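The metadata dimensions just listed (cardinality, tags, logical and physical paths, versions) can be sketched as a single record per dataset. This is an assumed, illustrative schema; real catalog tools each model these fields their own way.

```python
from dataclasses import dataclass, field

@dataclass
class DatasetMetadata:
    name: str
    cardinality: str                 # "high" or "low", from column profiling
    compliance_tags: list            # compliance / source / consumption tags
    logical_path: str                # e.g. a warehouse namespace
    physical_path: str               # e.g. an object-store location
    current_version: int
    published_versions: list = field(default_factory=list)  # sent to consumers

# Hypothetical entry for one refined dataset.
meta = DatasetMetadata(
    name="customer_transactions",
    cardinality="high",              # e.g. near-unique transaction IDs
    compliance_tags=["pii", "npd-scope"],
    logical_path="analytics.sales.customer_transactions",
    physical_path="s3://datalake/refined/sales/customer_transactions/",
    current_version=7,
    published_versions=[5, 6],
)

# One question this metadata answers: are consumers behind the
# current version of the data they were sent?
is_stale = max(meta.published_versions) < meta.current_version
print(meta.name, "consumers stale:", is_stale)
```

Multiply a record like this across raw, refined, and curated layers, and again for an NPD-specific view, and the management burden described above becomes visible.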
Similar challenges appear in the cataloging scenario as well. So if you have to adhere to drafts like the PDP or the NPD, cataloging has to be given first-class citizenship in your architecture and its treatment, and that has to be in place for you to adhere to, or cater to, the requirements of NPD.