Hello, welcome to SSUnitech. Today we are going to see how Azure Data Factory can restart a pipeline from the point of failure. What does that mean? In SSIS we have checkpoints: with checkpoints implemented, the package starts executing at the point of failure. Inside Azure Data Factory we do not have a checkpoint option. For example, say we have three activities, activity 1, activity 2 and activity 3, and we want to execute them in sequential order. If activity 2 fails, we do not want to execute activity 1 again, because it already completed successfully in the first run and the error came from the second activity. We just want to restart the pipeline at the point of failure, which is activity 2.

This is very important. Suppose your pipeline has 100 activities in total, 95 of them execute successfully, and we get an error on activity 96. It is not good to execute everything again; we just want the pipeline to resume at activity 96 and execute the remaining activities. How we can do that inside Azure Data Factory is what we will see in this video.

Let me go into the browser. Here we have an input folder in blob storage containing the customer details CSV file. We just want to copy this file from the input container to the output folder; that is the primary thing we need to do. In between we will use Wait activities, because in this video our main focus is starting the execution at the point of failure. Let me implement the copy first: I will add a new pipeline, call it "checkpoints", and then use a Copy Data activity.
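To make the idea concrete before we build it in the UI, here is a minimal Python sketch (not Data Factory code, just a simulation) of what "rerun from the failed activity" means: activities that succeeded in the previous run are skipped, and execution resumes at the first one that did not succeed. The activity names mirror this demo; everything else is illustrative.

```python
def run_pipeline(activities, previous_statuses=None):
    """Run activities in order; on a rerun, skip ones that already succeeded.

    activities: list of (name, callable) pairs; a callable raises on failure.
    previous_statuses: dict of name -> status from the prior run, if any.
    Returns a dict of name -> "Succeeded" / "Failed" / "Skipped".
    """
    previous_statuses = previous_statuses or {}
    statuses = {}
    for name, action in activities:
        if previous_statuses.get(name) == "Succeeded":
            statuses[name] = "Skipped"   # already done in the earlier run
            continue
        try:
            action()
            statuses[name] = "Succeeded"
        except Exception:
            statuses[name] = "Failed"
            break                        # stop the sequential chain here
    return statuses

# First run: the copy step fails because the source file is missing.
file_exists = False

def copy_data():
    if not file_exists:
        raise RuntimeError("the required blob is missing")

pipeline = [("Wait1", lambda: None), ("Copy data1", copy_data), ("Wait2", lambda: None)]
first = run_pipeline(pipeline)
# -> {'Wait1': 'Succeeded', 'Copy data1': 'Failed'}

# Upload the file, then rerun from the failed activity: Wait1 is skipped.
file_exists = True
second = run_pipeline(pipeline, first)
# -> {'Wait1': 'Skipped', 'Copy data1': 'Succeeded', 'Wait2': 'Succeeded'}
```

This is exactly the behaviour we will observe in the Monitor tab later: the already-succeeded Wait activity shows the status Skipped on the rerun.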
This Copy Data activity will copy the file from the input folder to the output folder, and I have already created the source and sink datasets for it. The source dataset is ds customer; let me open it so I can show you that it points to the customer details CSV file in the input folder. Now let me go to the Sink tab and select the ds customer sink dataset, which points to the output folder. That is all we need for the Copy Data activity. Let me also add a Wait activity at the start and another one at the end.

As of now, if we execute this it will succeed, because the file exists at the source location. Let me run it with a trigger. We can go into the Monitor tab, and under the triggered runs, let me refresh: here we can see the latest run succeeded.

Now let me quickly go back into blob storage, open the input container, and delete the file. Once we delete this file, the pipeline execution will fail. Let me trigger the pipeline again; this time it should fail because the source does not have the file. As we can see, the pipeline failed, and if we look at the error it says the blob is not available. Let me maximize this: here we can see "the required blob is missing", which indicates the source file is not there. So let me quickly go back into the input folder and upload the customer file again; click upload, and it is uploaded, as we can see. Now let me go inside this pipeline run. Here we can see that Wait1 completed with success,
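For reference, the pipeline we just built corresponds to JSON along these lines. This is a hand-written sketch, not the exact definition from this demo: the dataset reference names (`ds_customer`, `ds_customer_sink`) and the wait durations are placeholders, and the authoritative schema is whatever the Data Factory code view shows for your pipeline.

```json
{
  "name": "checkpoints",
  "properties": {
    "activities": [
      {
        "name": "Wait1",
        "type": "Wait",
        "typeProperties": { "waitTimeInSeconds": 1 }
      },
      {
        "name": "Copy data1",
        "type": "Copy",
        "dependsOn": [
          { "activity": "Wait1", "dependencyConditions": [ "Succeeded" ] }
        ],
        "inputs":  [ { "referenceName": "ds_customer", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "ds_customer_sink", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "DelimitedTextSource" },
          "sink": { "type": "DelimitedTextSink" }
        }
      },
      {
        "name": "Wait2",
        "type": "Wait",
        "dependsOn": [
          { "activity": "Copy data1", "dependencyConditions": [ "Succeeded" ] }
        ],
        "typeProperties": { "waitTimeInSeconds": 1 }
      }
    ]
  }
}
```

The `dependsOn` entries with the `Succeeded` condition are what make the three activities run in sequential order.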
Copy Data1 failed, and Wait2 still needs to be executed. We do not want to start again from the Wait1 activity. At the top we can see the Rerun option: plain Rerun runs the whole pipeline again. If we open the dropdown, we can also see Rerun from activity, which lets us pick the activity to start from. And then there is the third, most important option: Rerun from failed activity. If we execute it from here, only Copy Data1 and Wait2 will be executed.

This is in progress; let me refresh. Here we can see the Wait1 activity is marked Skipped in the status column below. Why do we have Skipped here? Because it executed successfully in the first run. Let me go into Monitor, open all pipeline runs, and refresh: here we can see the success. Clicking into it, the first activity, Wait1, shows the status Skipped, Copy Data1 executed, and Wait2 executed.

So this is how we can execute a pipeline from the point of failure in Azure Data Factory. It is very useful when you have a lot of activities in your pipeline: instead of executing all the activities again, you can rerun from the failed activity. Thank you so much for watching this video. See you in the next video.
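The same rerun can also be triggered programmatically. The Data Factory REST API's "Pipelines - Create Run" operation accepts, as far as I know, the query parameters `referencePipelineRunId`, `isRecovery`, `startFromFailure` and `startActivityName` for recovery reruns. The sketch below only assembles those parameters, it does not make an HTTP call; the helper name and the run ID are hypothetical.

```python
def build_rerun_params(reference_run_id, from_failure=True, start_activity=None):
    """Assemble query parameters for a Data Factory 'createRun' recovery rerun.

    A sketch of the documented REST query parameters, not an actual request:
    - referencePipelineRunId: the earlier run we are rerunning
    - isRecovery: must be true so succeeded activities can be skipped
    - startFromFailure: the "rerun from failed activity" behaviour
    - startActivityName: the explicit "rerun from activity" alternative
    """
    params = {
        "referencePipelineRunId": reference_run_id,
        "isRecovery": "true",
    }
    if start_activity is not None:
        params["startActivityName"] = start_activity
    elif from_failure:
        params["startFromFailure"] = "true"
    return params

# Hypothetical run ID, for illustration only.
params = build_rerun_params("0f1c2d3e-example-run-id")
```

With these parameters attached to a createRun request, the service does server-side exactly what we did through the Monitor UI above.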