Hello, welcome to SSUnitex. This video is a continuation of the PySpark tutorial. In this video we are going to look at the isNull and isNotNull functions in PySpark. For today's agenda, first we will see the isNull function, then we will see the isNotNull function, and at the end we will see a real-time use of isNull and isNotNull. So let me quickly go to the browser and we will try it out in practice.

Here we have a DataFrame, DF, holding the sales-order data. The requirement is to add one more column to this DataFrame, and the newly added column will indicate whether the item name is null or not. If it is null, we want the new column value to be true; if it is not null, we want the value to be false.

First we need to import the functions, so we can use from pyspark.sql.functions import *, where the asterisk imports all the functions. Now we have the DataFrame DF. First let me select all the columns; for selecting all the columns we can use the asterisk. Second, we need to add one more column indicating whether the item name is null or not. For that we can simply use DF.ItemName and call isNull on it, so we specify the brackets. Let me use the display command here, and this display command will show the output of this query. Here we can see the new column has been added with the value false, indicating that the item name is not null. If we scroll down, we find a row where the item name is null, and there we can see the value as true.

Instead of isNull we can also use isNotNull here. What does isNotNull do? It checks whether the item name is not null: if it is not null, it returns true, and if it is null, it returns false, as we can see.
Next, let me filter this DataFrame for the null values and the not-null values. For that we can use DF.filter, and inside it we specify DF.ItemName.isNull(). Wherever this condition is true, the row is kept. Let me use the display command and we will see the output: this time it filters down to just the rows whose item name is null, and as we can see, only 299 rows are returned. Similarly we can use isNotNull, which returns all the rows that have a value, and the output shows 500 rows. So this is how you can filter.

Now the requirement is: if the item name is null, then instead of null we want to show some text like "Not Available". This is the actual real-time use. How can you do that? It is very straightforward. Let me use DF.select; here I first add all the columns, then a comma, and then one more column, which will be the new item name. We can use DF.ItemName.isNull() to first check whether the value is null. If it is null, we want to return "Not Available", and for that we can simply use the when clause. We saw the when clause in the last video; if you have not seen it, I would strongly recommend watching that video. So: when the value is null, return "Not Available"; if it is not null, we use otherwise, which we also covered in the last video, and inside the otherwise we return the item name from the DataFrame. That's it. Let me put this into one more DataFrame and use the display command on it. So, DF1.
Let me execute it and we will see the output. Here we can see all the values, but the new column's name is not proper, so we can use an alias: .alias, with the alias name ItemNameNew. Now let me execute it again. This time the ItemNameNew column is here, and if we scroll down we can see that wherever the item name was null, the null has been replaced with "Not Available". So this is how you can use the isNull and isNotNull functions in PySpark. Thank you so much for watching this video. If you liked it, please subscribe to our channel for many more videos. See you in the next one.