Hello, welcome to SSUNITEX Succeed, and this is a continuation of the PySpark tutorial. In this video we are going to look at data types: which data types are available inside PySpark, compared side by side with SQL. In SQL Server we declare an integer simply as int. Inside PySpark the data types are very similar, but the representation is a little different: whenever we declare the data type for a variable or a particular column, we have to use IntegerType, so the declaration is the complete word Integer followed by Type. Similarly, for bigint in SQL we simply write bigint, but in PySpark we have to specify LongType; this LongType is used for long integer values. Next, float and double: float holds floating-point values and double holds double-precision floating-point values, and in PySpark we have FloatType and DoubleType. So in most cases we simply add Type at the end of the data type name: SQL double becomes DoubleType, float becomes FloatType, bigint becomes LongType, and int becomes IntegerType. Next, char and varchar: these two are used for string data, and for both of them PySpark has a single data type, StringType, which handles both char and varchar. Next, date and timestamp: the date data type is represented as DateType inside PySpark, as you can see, and timestamp is represented as TimestampType. So we can say we really only need to remember the data types for integer and long.
So except for these two data types, for all the other data types we have seen in SQL you simply add Type at the end and the PySpark data type is ready. But for integer, SQL has int while PySpark has the complete word IntegerType, and for bigint it is LongType; these two you have to remember. The third one is string: char and varchar both map to StringType. Except for these three, for the remaining types you can just add Type. In the next videos we will see how to use these in practice. Thank you so much for watching this video; see you in the next video.