My name is Aditi and I am a geospatial data analyst at ILK Labs in Bangalore, India. This is my first useR! conference. I use R to work with air quality data, and today I will present "R in the Air": how I use Shiny to build tools that provide a workflow for analyzing air quality data.

R and Shiny are a powerful couple that can be used to build interactive platforms to manage and work with collected data. With the ever-increasing global measurement of air pollutants through stationary, mobile, low-cost, and satellite monitoring, the amount of data being collected is huge, and it necessitates the use of data-management platforms. In an effort to address this, we developed two Shiny applications to analyze and visualize air quality data. As they say, necessity is the mother of invention: we at ILK Labs needed to work with high-dimensional air quality data, and to make that possible we designed and developed these two applications. Shiny applications are user friendly, so team members across continents can use them without any prior knowledge of programming. Another added benefit is the vibrant and supportive R user community. Both applications come with pre-loaded data sets to explore.

The first Shiny application is called mmaqshiny, and it helps in processing high-resolution air quality data collected on a moving platform, usually called mobile monitoring. We place multiple sensors in a car and take repeated measurements on each road in an area to generate stable, high-resolution air pollutant maps; usually daily maps are generated, and estimates of long-term mean concentrations improve when multiple repeated drives are possible. This is how mmaqshiny looks. It can handle thousands of data points every day from each instrument; the app supports five specific instruments, which were also used in our study.
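The idea of stabilizing maps through repeated drives can be sketched in a few lines of base R: average each road segment within each drive, then average those per-drive means into a long-term estimate. The data and column names (`segment_id`, `drive_date`, `pm25`) below are hypothetical and are not mmaqshiny's actual schema.

```r
# Toy mobile-monitoring data: repeated drives over the same road segments.
# Values and column names are illustrative, not mmaqshiny's actual inputs.
drives <- data.frame(
  segment_id = rep(c("seg_A", "seg_B"), each = 4),
  drive_date = rep(c("2020-01-01", "2020-01-02",
                     "2020-01-03", "2020-01-04"), times = 2),
  pm25       = c(40, 55, 48, 57, 80, 95, 88, 97)  # made-up ug/m3 values
)

# Step 1: one mean per segment per drive (a single drive is noisy).
per_drive <- aggregate(pm25 ~ segment_id + drive_date, data = drives, FUN = mean)

# Step 2: average the per-drive means into a stable long-term estimate.
long_term <- aggregate(pm25 ~ segment_id, data = per_drive, FUN = mean)
# long-term means: seg_A = 50, seg_B = 90
```

Within a single drive, one anomalous plume can dominate a segment's value; averaging across drives damps such events, which is why repeated passes give more stable maps.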
It reduces the time consumed in analyzing each pollutant individually and helps in visualizing the data collected in the field each day. It can also be used to look for pollution hotspots: locations that are relatively more polluted than neighboring areas. Each pollutant or sensor data stream requires a specific kind of pre-processing, which depends on the measurement principle it operates on or its mechanical setup. Eventually, the application joins the data from the different instruments into one single corrected file, which really helps us in achieving a common unit of analysis for the rest of the study. The application has reduced our computational labor, since mobile monitoring produces a huge amount of spatial data. The GPS data file is mandatory, while the files from the other instruments are optional. An alarms tab is also present in the application, giving the user a near-real-time check on the health of all the instruments used. This application is available on CRAN.

The second application, pollucheck, helps in processing open-source air quality data, usually from stationary monitoring. There are several platforms which provide open-source air quality data. We built this application for users of those specific platforms, so that quick analyses and basic plots of air quality can be generated easily, leaving more time to think about the science. This is how pollucheck looks. It can be used with data downloaded from the Central Pollution Control Board (specific to India), OpenAQ, and AirNow. It aims at generating a range of statistical plots and summary statistics, with several data-processing options; there are options for different averaging periods as well. It checks for normality and generates density and Q-Q plots, and along with all these it checks for trends in the time series of the selected parameter. It can fit linear or multiple linear regressions. pollucheck also allows users to upload another data set to compare selected parameters and generate plots, and it implements two plots from the openair package.
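The kinds of checks described above can be sketched in base R as follows. The data are simulated, and this is only a rough illustration of the analyses, not pollucheck's actual implementation or interface.

```r
set.seed(42)
# Simulated hourly PM2.5 series for one week (made-up values, not real data).
obs <- data.frame(
  datetime = seq(as.POSIXct("2020-01-01 00:00", tz = "UTC"),
                 by = "hour", length.out = 24 * 7),
  pm25 = rlnorm(24 * 7, meanlog = 4, sdlog = 0.4)
)

# Averaging period: collapse hourly values to daily means.
daily <- aggregate(pm25 ~ as.Date(datetime), data = obs, FUN = mean)
names(daily) <- c("date", "pm25")

# Normality check (pollutant concentrations are often closer to log-normal).
p_raw <- shapiro.test(obs$pm25)$p.value
p_log <- shapiro.test(log(obs$pm25))$p.value

# Q-Q plot against the normal distribution, plus a density plot.
qqnorm(obs$pm25); qqline(obs$pm25)
plot(density(obs$pm25))

# Simple trend check: regress concentration on time.
trend <- lm(pm25 ~ as.numeric(datetime), data = obs)
summary(trend)$coefficients["as.numeric(datetime)", "Pr(>|t|)"]
</rcode>
```

A second uploaded data set could be compared the same way, e.g. by merging the two series on `datetime` and plotting one parameter against the other.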
pollucheck has been made into a package and will be on its way to CRAN soon. Don't forget to check the air quality of your city using our applications. A huge shout-out to all the packages which made these applications possible. Manipulating different timestamps has been the most challenging part, since there are different timestamp formats and multiple time zones. Integrating a number of files from different instruments has also been a task, but joining all the files together into a single file saves a lot of time in further analysis. And I think we are all fans of the tidyverse. Thank you! To know more or to provide feedback, find me at these places; I am also available in the lounge for further discussion or questions. Enjoy, use R, and have a great day ahead. Thank you.
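The timestamp pain mentioned above is easy to illustrate: the same instant logged by two instruments in different formats and time zones only lines up once everything is parsed into a common representation. The formats and values below are illustrative, not the actual instrument outputs.

```r
# GPS logger writing local time (IST) and a sensor writing UTC in another
# format; both record the same instant (illustrative, not real file formats).
t_gps    <- as.POSIXct("2020-01-15 10:30:00", tz = "Asia/Kolkata")
t_sensor <- as.POSIXct("15/01/2020 05:00:00",
                       format = "%d/%m/%Y %H:%M:%S", tz = "UTC")

# IST is UTC+05:30, so both parse to the same moment.
format(t_gps, "%Y-%m-%d %H:%M:%S", tz = "UTC")  # "2020-01-15 05:00:00"
as.numeric(t_gps) == as.numeric(t_sensor)       # TRUE

# Once timestamps are aligned, instrument files can be joined on them.
gps    <- data.frame(time = t_gps    + 0:2 * 60, lat  = c(12.97, 12.98, 12.99))
sensor <- data.frame(time = t_sensor + 0:2 * 60, pm25 = c(42, 45, 41))
attr(sensor$time, "tzone") <- "Asia/Kolkata"  # display zone only; the
                                              # underlying instants are unchanged
merged <- merge(gps, sensor, by = "time")     # one joined file, 3 rows
```

Parsing everything to POSIXct with an explicit `tz` first, and only then merging, avoids the silent off-by-hours errors that come from comparing naive timestamp strings.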