Hey guys, welcome to SSUnitech. Today we are going to start with the Get Metadata activity. In this video we will look at the Get Metadata activity, in the next video we will cover the Filter activity, and after that the ForEach loop. So please watch these videos in sequence and don't skip any of them.

First let's understand the requirement, and then we will look at the Get Metadata activity. As per our requirement we have a storage account, and as we can see, this storage account has multiple files. Under this input location it has the employee data along with the payroll data. For example, the employee data holds information for the employees, as you can see here: Employee India May 2022. If we look at the data under Edit, it has comma-separated values: employee ID, employee name, employee address and department name. That is the information inside the employee file. The payroll file holds the salary information for the employees: as we can see, for three employees there is the payroll month, then the salary amount, then the tax amount. So under this location there are two different sets of files, and the employee files and the payroll files have different columns.

As per the requirement, on a monthly basis we want to process all these files, loading the employee data into the employee table in SQL Server and the payroll data into the payroll table in SQL Server. That is the actual requirement we need to implement using Azure Data Factory. To read the data from this container we need to use the Get Metadata activity. So what is the Get Metadata activity? You can use the Get Metadata activity to retrieve the metadata of any data in Azure Data Factory or Synapse pipelines.
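To make the two file layouts concrete, here is a small sketch of what the employee and payroll files described above might contain. The column names follow the narration; the actual values are made up for illustration.

```python
import csv
import io

# Illustrative sample rows -- column names follow the video's narration,
# the values themselves are invented.
employee_csv = (
    "EmployeeID,EmployeeName,EmployeeAddress,DepartmentName\n"
    "101,Asha,Delhi,HR\n"
)
payroll_csv = (
    "EmployeeID,PayrollMonth,SalaryAmount,TaxAmount\n"
    "101,May-2022,50000,5000\n"
)

# Parse each sample the same way the pipeline will treat the files:
# comma-delimited, first row as header.
employees = list(csv.DictReader(io.StringIO(employee_csv)))
payroll = list(csv.DictReader(io.StringIO(payroll_csv)))
print(employees[0])
print(payroll[0])
```

Because the two sets of files have different columns, they need to land in different SQL Server tables, which is why we will later split them by file name.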
So as I told you, if you want to read the files from Azure Blob Storage one by one, we first need the Get Metadata activity. We can also use the output of the Get Metadata activity in a conditional expression to perform validation. As you have already seen, we have two different sets of files, the employee files and the payroll files. We want to segregate the employee files into one output and the payroll files into another output, and then do further operations on each.

Now, what are the metadata types? In other words, what can the Get Metadata activity return in its output? We can get itemName, itemType, size, created, lastModified, childItems, columnCount and exists. These are the main field types we can use for validation.

So go to Azure Data Factory, and we will create a pipeline; under that pipeline we will try all of these out live. Here I want to create a new pipeline, so let me click on New pipeline and let me call this pipeline GetMetaData. After that, under Activities we can search for Get Metadata, and we can drag and drop the Get Metadata activity. Now we can see the General tab, and under the General tab we can name this activity, so let's call it something like "Getting employee and payroll file info". We cannot use the ampersand character in the name, so let me spell out "and" instead. We can also provide a description if required, then the timeout and retry settings; all of these are the same as we have already seen for the other activities in the earlier videos.

Now go to the Settings tab. Under Settings is the required part: you need to set up the dataset from which we want to get the information. I want to create a new dataset here, and our source is Azure Blob Storage, so let me search for Blob Storage and click on Continue.
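Under the hood, everything configured in the designer serializes to the pipeline's JSON definition. A minimal sketch of how the Get Metadata activity set up above might look, trimmed to the essentials (the activity and dataset names follow the video; the overall shape is the activity's standard JSON form):

```python
import json

# Hypothetical serialized form of the Get Metadata activity from the video.
activity = {
    "name": "Getting employee and payroll file info",
    "type": "GetMetadata",
    "typeProperties": {
        # The dataset the activity reads metadata from.
        "dataset": {"referenceName": "GetMetaData", "type": "DatasetReference"},
        # fieldList controls which metadata types appear in the output;
        # here we ask only for childItems.
        "fieldList": ["childItems"],
    },
}
print(json.dumps(activity, indent=2))
```

The fieldList entry is the JSON counterpart of the Field list we will fill in under the Settings tab.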
Then the type of the file, as we have already seen, is a comma-delimited file, so we can select DelimitedText and click on Continue. Here we can call this dataset GetMetaData. Next we can see the linked service; you need to remember that we have to set up the linked service, and the dataset will point to the input folder only. We can click on New and let me call this linked service GetMetaData as well. We select the subscription, and after that we need to select the storage account. Here the storage account is ssutesting, as you can see, so we select ssutesting. Now everything looks good; we can test the connection, and it should succeed. Connection succeeded, so click on Create. It is creating that, and now it has been created successfully.

So we have created the linked service, and now we can see the file path. We can browse from here: under ssutesting we have these folders, and our files are under the input folder, so we select input. After that we are not required to select any particular file, because we want the information for all of the files, so we can click on OK without selecting a file. Everything looks good, so click on OK. We have successfully set up the dataset.

Once the dataset is set up, we can see the Field list. Under Field list, click on New. Under Argument it has childItems, exists, itemName, itemType and lastModified; these are the field types available for Azure Blob Storage. First, childItems: where can we use it? If you select childItems, it will enumerate whatever is under the folder you selected, like our input folder, and list all the files available under that folder. Under input we have these four files, so it will return the information for all four files. After selecting childItems we can debug this pipeline and see the output.
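For reference, the dataset configured above might serialize roughly like this. This is a sketch, assuming the names used in the video ("GetMetaData" for both the dataset and the linked service, "input" for the location); treat the exact container/folder split as an assumption, since the video only shows it in the browse dialog.

```python
import json

# Hypothetical serialized form of the DelimitedText dataset from the video.
dataset = {
    "name": "GetMetaData",
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {
            "referenceName": "GetMetaData",
            "type": "LinkedServiceReference",
        },
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": "input",  # assumed -- "input" selected in the browse dialog
                # No fileName here: omitting it points the dataset at the
                # whole folder, so Get Metadata can enumerate every file.
            },
            "columnDelimiter": ",",
            "firstRowAsHeader": True,
        },
    },
}
print(json.dumps(dataset, indent=2))
```

Leaving the file name empty is what makes childItems useful: the dataset addresses the folder, not a single file.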
We can wait a little bit while it executes, and here we can see the output. Under childItems, each entry has two properties: first the name and second the type. The name is the file name and the type is File, and similarly for each of the other files; all of them have a name and a type, and we can see all of them in the output.

Now let me select another argument, like exists. If your input folder exists, it will return true; otherwise it will return false. Let me debug and we will see the output. It is in debug mode, it is queued, and now it has succeeded, so let me check the output. Under the output we can see exists is true, because the input folder is available; that is why this is true. If the input folder were not there, it would return false. This is very important when we want to check whether a folder, a file, or even a table inside SQL Server is available.

Let me check another property, itemName. Since our dataset points to the input folder, the item name should be "input", the name of the folder. Let me refresh this and check the output: the item name we can see is input. Now let me select another property, itemType. The item type should be Folder, because we have set up the dataset at the folder level. Let me check the output of this: we can see the item type is Folder, as I told you.

Now another property, lastModified: when was this folder last modified or created? It will return that date. Let me debug this and we will see the output as well. It is queued, in progress, completed. Now let me check the output: here we can see the lastModified value, which is the date on which this folder was last modified.
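Putting the debug runs above together, the activity's output is just a JSON object we can reason about. Here is a sketch of that output (file names are modelled on the ones shown in the video, not copied exactly, and the fields from the separate debug runs are combined into one object for illustration), along with the kind of name-based split we will build with the Filter activity:

```python
# Hypothetical Get Metadata debug output for the "input" folder.
# In ADF expressions this is reached as @activity('...').output.
output = {
    "childItems": [
        {"name": "Employee_India_May2022.csv", "type": "File"},
        {"name": "Employee_US_May2022.csv", "type": "File"},
        {"name": "Payroll_India_May2022.csv", "type": "File"},
        {"name": "Payroll_US_May2022.csv", "type": "File"},
    ],
    "exists": True,
    "itemName": "input",
    "itemType": "Folder",
}

# Split the child items by file name, mirroring what the Filter
# activity will do in the next video.
employee_files = [i["name"] for i in output["childItems"] if "Employee" in i["name"]]
payroll_files = [i["name"] for i in output["childItems"] if "Payroll" in i["name"]]
print(employee_files)
print(payroll_files)
```

Each childItems entry carries only name and type, which is exactly enough to route employee files and payroll files to their own branches.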
So those are all the properties for Azure Blob Storage. Let me go back and select childItems, because childItems is what we need in our case. In the next video we will see that if a file name contains "employee" we want to filter it into one output, and if the file name contains "payroll" it will go into another output. So in the next video we will look at the Filter activity and how we can do the filtering there.

Thank you so much for watching this video. If you really liked it, please subscribe to our channel to get many more videos, and don't forget to press the bell icon to get notified about our newly uploaded videos. See you in the next video.