Okay, so I only have 15 minutes, so I'll try to keep this as fast as possible. Today we are going to discuss web scraping and data analysis using Selenium and Python. Before I start, may I know how many of you are working with Selenium and Python? Okay, not many. I'm going to do this entire exercise in an IPython Notebook; I'll explain later what an IPython Notebook is.

So far we have seen Selenium used as a web browser automation tool in the field of automation testing. In this session, we are going to see how Selenium can be used as a web scraping tool and how we can extract data with it. I'm going to do a small exercise: if my time permits, I'll take an IMDb page, extract the data from that page, and we'll see some interesting facts come out of our analysis.

Okay, so data science is one of the areas that has received a lot of attention in the last couple of years, and data scientists use different sets of tools in their day-to-day work to get things done. Some of those tools are listed here; these are basically Python-oriented tools. You'll see there is always a fight between Python and R, but both of them do pretty much the same thing.

A lot of time goes into extracting and cleaning the data. If you ever get a chance to talk to a data scientist, you'll learn that around 60 to 70% of their time is spent extracting or cleaning data. The reason is that the data is collected from different sources, whether the web, a database, a CSV or an Excel file; it is then thoroughly cleaned and later used for exploration, because the main objective of any data science project is to find the hidden trends and patterns within the data.
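As a rough sketch of what that extraction step looks like with Selenium, here is a minimal scraper. The URL and the CSS selectors (`.movie-row`, `.title`, `.rating`) are placeholders, not IMDb's actual markup; you would inspect the real page and substitute its selectors:

```python
def parse_rating(text):
    """Turn a scraped rating string like ' 8.8 ' into a float; None if not numeric."""
    try:
        return float(text.strip())
    except ValueError:
        return None

def scrape_movies(url):
    # Selenium is imported inside the function so parse_rating stays
    # usable (and testable) even where no browser driver is installed.
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    options = webdriver.ChromeOptions()
    options.add_argument("--headless")  # run without opening a browser window
    driver = webdriver.Chrome(options=options)
    try:
        driver.get(url)
        movies = []
        # .movie-row / .title / .rating are placeholder selectors --
        # inspect the actual page and use its real class names.
        for row in driver.find_elements(By.CSS_SELECTOR, ".movie-row"):
            movies.append({
                "title": row.find_element(By.CSS_SELECTOR, ".title").text,
                "rating": parse_rating(
                    row.find_element(By.CSS_SELECTOR, ".rating").text),
            })
        return movies
    finally:
        driver.quit()
```

Calling `scrape_movies("https://...")` against a real listing page returns a list of dicts, which is a convenient shape for handing over to a CSV writer or a DataFrame later.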
Okay, so coming to web data extraction: one of the best ways to extract data from the web is through APIs, because APIs provide the data in a more structured format, such as JSON or XML. For example, if you have to run sentiment analysis on your Twitter data, you can use the Twitter APIs to get the data from Twitter. But what if there is no API for your website? What do we do in that case?

Web scraping is an excellent way to extract the unstructured data from the web, transform it into a structured format such as Excel, CSV, a database or a text file, and then use it for whatever purpose you have, like analysis. So in today's session, we are going to see how we can scrape data with the help of Selenium, convert it into a structured, more meaningful format, and then do some analysis on top of that.

Okay, so let's talk about the IPython Notebook. IPython is basically a Python interpreter, tightly integrated with your operating system shell, and the IPython Notebook provides a web environment where you can execute your Python code.
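The "transform into a structured format" step can be as simple as writing the scraped records out as CSV with the standard library. This is a minimal sketch; the movie titles and ratings below are made-up sample records standing in for real scraped output:

```python
import csv

def save_as_csv(records, path):
    """Write a list of dicts (one per scraped item) into a CSV file."""
    if not records:
        return
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=list(records[0].keys()))
        writer.writeheader()        # dict keys become the header row
        writer.writerows(records)   # one CSV row per scraped record

# Sample records standing in for scraped output.
save_as_csv(
    [{"title": "Movie A", "rating": 8.8}, {"title": "Movie B", "rating": 7.9}],
    "movies.csv",
)
```

Once the data is sitting in a flat file like this, it can be loaded into a spreadsheet, a database, or a pandas DataFrame for the exploration and analysis part.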