Welcome, everyone, and thank you for attending this month's Science Seminar presented by the NSF's National Ecological Observatory Network, which is operated by Battelle. As people are trickling in, I'm just going to make some general announcements before we get started. Our goal with this monthly series of talks is to build community among researchers at the intersection of ecology, environmental science, and NEON. We are very excited to have Sarah Shivers and Keely Roth from Planet here to speak with us today.

But before we turn it over to them, a few logistics. We have enabled optional automated closed captioning for today's talk; if you would like to use it, find the CC button in your Zoom menu bar. The webinar will consist of a presentation followed by Q&A. As you think of questions, please add them to the Q&A box. There is also a chat box, which we can use to share links with each other, introduce yourself, and for general networking or comments, but please put questions in the Q&A box. At the end, when the presentation is done, we'll facilitate discussion, and there will also be an opportunity to ask questions over audio if you'd like to do so by raising your hand.

NEON welcomes contributions from everyone who shares our values of unity, creativity, collaboration, excellence, and appreciation. This is outlined in our Code of Conduct, and these guidelines apply to NEON staff as well as participants external to the NEON program. Please do review our Code of Conduct, which can be found on our Science Seminars webpage; I'm just going to scroll down to the bottom here, where you can find it and check it out for yourself.

This talk will be recorded and made available for asynchronous viewing after the fact. We hope to have it posted to our webpage about a week after the talk. And to complement our monthly Science Seminars, we host related Data Skills webinars on how to access and use NEON data. Registration for those is available on our Science Seminars webpage, which I will put in the chat momentarily. If you scroll down to the bottom, here are the Data Skills webinars, and the one coming up later in May is extremely exciting and relevant: it's on how to work with NEON hyperspectral data in Google Earth Engine. We hope to see some of you there; again, I'll throw the link to register in the chat in a moment.

And lastly, we are soliciting nominations for the 2023-2024 round of speakers. So nominate yourself or a colleague today by filling out the form near the top of our webinars webpage. We would love to hear from you and hear about your research if it's relevant to the seminar. All right, now I will turn it over to Shashi Kondori to introduce today's speakers.

Thanks, Samantha. Being a remote sensing scientist myself, I'm pretty excited about today's seminar. On behalf of everyone here at NEON, it is my pleasure to welcome Dr. Keely Roth and Dr. Sarah Shivers. Before we get started with the talk, I would like to quickly introduce our speakers. Our first speaker for today, Dr. Sarah Shivers, is a senior product manager at Planet. With a PhD in remote sensing as well as experience in management consulting, Dr. Shivers enjoys working at the intersection of science and business. Our next speaker, Dr. Keely Roth, is the lead hyperspectral scientist at Planet. Her past research was at the intersection of remote sensing, data science, and plant ecology and ecosystem science.
Both our speakers, Keely and Sarah, received their PhDs in geography from UC Santa Barbara. In fact, I only found this out yesterday: they even both had the same PhD advisor, Dr. Dar Roberts. Without further ado, I'll hand it over to Sarah. You may please share your screen now. Thank you.

Great. Thank you so much, Shashi, for the introduction. Can everyone see my screen okay? Looks great. Perfect. Thank you for confirming that. We're so excited to be here today. Thank you so much to the NEON team for inviting us. Keely and I have been really excited about this for months. We'll be taking the next half hour to 45 minutes to tell you about the mission that we've been working on, and then we look forward to taking some questions and having some conversation.

Before we begin: Shashi already gave us introductions, but just to introduce myself, I'm Sarah Shivers. I oversee our tasking imagery products at Planet, where I've been for a bit over two years. Our tasking imagery products include our high-res imagery products, but also our hyperspectral imagery products from the new mission that we'll be talking about today. And Keely Roth is our lead scientist for our hyperspectral mission, so we've been closely collaborating on this for a couple of years now.

I wanted to start, before I really get into our hyperspectral mission, with a bit of an overview of Planet, because I imagine not everybody on this webinar is deeply familiar with Planet as a company. So who is Planet? What are we all about? Well, at Planet, we're really driven by the idea that no matter what your industry, your role, or your geography, you can't fix what you can't see. That's why our mission is to image the whole world every day and make global change visible, accessible, and actionable.

What I think is really cool about Planet, in addition to this amazing mission, is that we are a public benefit corporation. You can see here that our purpose as a public benefit corporation is to accelerate humanity toward a more sustainable, secure, and prosperous world by illuminating environmental and social change. Our mission was formalized into our PBC purpose when we went public. This really cements how our business and our impact are aligned, and reinforces that as a company we are dedicated to bringing value not just commercially, but also socially and environmentally.

Planet is an integrated aerospace company. We design, build, and operate the largest fleet of Earth observation satellites in history, and we also provide the online software tools and analytics needed to fulfill our mission of making this data actionable and accessible to our users. We do this by providing up-to-date satellite data and agile infrastructure, through insights and intelligence built on top of our data, and through automated delivery into various applications and workflows, to ensure that a variety of users can access and use our data.

Planet's fleet today is approximately 200 satellites, and right now we have two distinct classes of satellites. We have our PlanetScope Dove satellites, on the left; those are the satellites that Planet was founded on. They're about five kilograms, or about 11 pounds, each, and we like to say they're about the size of a loaf of bread; that's a good way to think about their size.
They fly as a large constellation in order to map the Earth's landmasses every day, at just under four-meter resolution. So they are small, but they are able to accomplish a lot. They circle the Earth in about 90 minutes and take pictures multiple times per second. Our Dove satellites have eight spectral bands for a variety of applications, and they are able to image about 85 to 90% of the landmass of the Earth daily.

Then we also have our SkySats; we have 21 in orbit right now. They are much bigger, at 100 kilograms, and they provide responsive tasking at about a 50-centimeter resolution, so much higher spatial resolution. They are designed to be really agile: they can reorient and take a picture of a target in a few seconds. At the resolution they provide, you can start to do things like count cars, identify the make of a ship or aircraft, or do something that we do a lot with these, which is to assess building damage and other types of damage following natural disasters. It's this combination of satellites with different types of sensors that we think is particularly powerful at Planet, and it's a large differentiator from many of our competitors.

To further our ability to empower our customers and get them the information they need, we are currently working on two new missions, for launch within the next year, to add to the suite of capabilities I just talked about. Pelican is one of these constellations: our next-generation high-resolution tasking satellites that will follow on to supplement and complement our SkySat capabilities. And then there's Tanager, which is what we're here to talk to you about today. Tanager will provide hyperspectral data from the visible through the shortwave infrared. For our customers, and I imagine this audience is very familiar with this, that means the ability to detect subtle features, materials, signals, and changes in our land, waters, and atmosphere. We plan to launch two demonstration satellites first and then follow on with more satellites to build out a fuller constellation.

Having different types of sensors allows us to use the right tool for the job, which is what's particularly exciting here, and it allows us to tip and cue between our different systems. For example, because we are looking at all of the Earth's landmasses every day with our PlanetScope data, we're able to detect change in this imagery. We can then use that to cue either a zoomed-in observation with our high-res imagery, or, in the future, the idea is that we can cue our hyperspectral satellites to that area so that we can better characterize the change that we are seeing in our PlanetScope data. The combined potential of all of our satellites is particularly exciting, I think.

I want to highlight, too, that this also represents a large step forward in our capabilities. You can see in this slide how, over time and across the spatial, spectral, and temporal domains, we have been committed to improving our customers' ability to see what's most important to them. Tanager represents a very large step forward spectrally for Planet. And I just realized I said Tanager without having explained it: Tanager is the name of our hyperspectral satellites. We like to name things after birds at Planet.
So we have our Doves, which are our PlanetScope medium-res satellites; we have our Pelicans upcoming; and now we have Tanager as well. Tanager represents a very large step forward spectrally for Planet: moving from eight bands, which is what we have through our SuperDoves today, to over 400 spectral bands with Tanager that span out into the shortwave infrared.

This constellation was designed with the methane use case in mind, and that driving use case drove a lot of our requirements, because one of our goals with this mission is to mitigate methane by detecting point-source methane emissions at a facility level. But in designing the satellite for this purpose, the requirements made it really well suited for a broad range of applications. So we do anticipate having customers across many different verticals and know that this data is really powerful well beyond the methane use case. Some example applications that I'll name here, and Keely will go into a lot more detail about some of these, are things within the ag space like crop identification or looking at crop nutrients; applications like water quality; understanding biodiversity in certain areas and tracking change over time; and materials identification, which is a big one in different ways of assessing environmental sites. So there's a broad range of applications that our data will be very useful for.

Here are some of the key performance specs of our system. We'll be flying in low Earth orbit at around 400 kilometers, in a sun-synchronous orbit; at least for the two initial tech demonstration satellites, that's the plan. We have an anticipated revisit of one to seven days with the full constellation. We don't have a finalized number of satellites that we'll be putting up; we'll put up more satellites as we get more demand for this type of data, but we do have a goal of hitting daily revisit for certain targets. We have a spatial resolution of around 30 meters; that's at nadir, and these are tasking satellites, so they will be looking off-nadir as well. We have a swath width of 18 kilometers, so we'll be able to cover a good amount of area with our collects. The sampling is similar to the imaging spectrometer on NEON's AOP airborne platform: 400 to 2,500 nanometers at five-nanometer spacing (a quick numerical check of that sampling follows below). This will allow for the detection limits that you'll see at the bottom here for methane and CO2, really getting down to the facility level. One thing I skipped over here, but that is really critical to talk about, is our SNR. Our signal-to-noise ratio is really important for this mission; we're going to have really high sensitivity, which is particularly important for the methane use case but really relevant for others as well. We are anticipating an SNR in the methane detection region, so in the shortwave infrared, of around 300 to 600.

The structure of this mission is really unique, and Planet is one of many partners making it happen, so I think it's worth giving a bit of an overview of some of the key partners in this mission, in addition to Planet, and how we all operate. This is a public-private partnership, and it spans regulatory, science, technology, nonprofit, and donor organizations. What this does is create a really strong team that will make this mission a success and have the ability to drive action.
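As a quick sanity check on the spectral sampling just described, here is a short Python sketch (not Planet code; the SWIR methane window shown is an illustrative choice on my part, not a published spec) confirming that 400 to 2,500 nm at 5 nm spacing yields the "over 400 bands" figure quoted above:

```python
import numpy as np

# Back-of-the-envelope check of the Tanager spectral sampling described above:
# 400-2500 nm at ~5 nm spacing gives the "over 400 bands" figure.
band_centers_nm = np.arange(400, 2500 + 5, 5)  # 400, 405, ..., 2500
print(f"number of bands: {band_centers_nm.size}")  # -> 421

# Bands falling in a ~2100-2450 nm window often used for CH4 retrievals in the
# literature (the window choice here is illustrative, not a Planet spec).
ch4_window = band_centers_nm[(band_centers_nm >= 2100) & (band_centers_nm <= 2450)]
print(f"bands in the SWIR methane window: {ch4_window.size}")
```

Running this gives 421 band centers, consistent with the "over 400 spectral bands" figure.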
So, Carbon Mapper is a nonprofit organization led by Riley Duren. They are the program integrator, and they are also leading the research and advocacy work for methane and CO2 mitigation. This includes processing our data to look for methane and launching a global public data portal to show where they find methane emissions, in order to increase transparency about where there are emissions and really drive action.

Another major partner of ours is JPL. JPL is a tech partner for us: they are building the first spectrometer, the one that will go on our first Tanager satellite, and they are actively building it right now. They are also transferring the know-how of how to build spectrometers to the Planet team, and the Planet team is building the second spectrometer and all subsequent spectrometers; that build is actively underway right now as well. This is a really exciting partnership, as JPL is a leading source of knowledge on how to design and build really high-quality imaging spectrometers, so we are really excited about what it means for our capabilities and for the quality of our system.

Our role at Planet is as the commercial data provider and as the manufacturer and operator of the constellation. We will be building and offering various products, and I'll speak to a few of those later in this talk. We're also owning and operating the spacecraft and scaling the constellation; like I said, that means building payloads two through however many, and we're also building the bus and the full spacecraft. There are various other partners listed up here, and I won't go into tons of detail about what each of them does, but it is a really exciting way to set up a mission, and everyone has their own role to play in making this a success. We're very fortunate to work with this group.

I do want to talk a little bit about the imagery products that we're offering at launch; we anticipate that with time we will add additional imagery products. As we're approaching launch, the products that we are planning to offer are our core imagery products: our radiance and reflectance data products. We'll be offering both of these as basic (so, not orthorectified) and ortho products. These are the full data cubes with the 400-plus spectral bands that I mentioned, spanning from the visible through the shortwave infrared, and the ortho products will have a 30-meter spatial resolution.

Something that I didn't mention on the other page, but I think is really interesting to note, is that we will be able to collect imagery and create data products using various imaging methods that can optimize for either coverage or SNR. Essentially, we can decide that we want to cover larger areas faster with a more standard SNR, or we can maximize the SNR and have shorter collects. That flexibility makes it really well suited for a variety of use cases. We're planning to deliver this data within one day after collection, so a pretty fast turnaround time for our users. And as mentioned, we have a swath width of 18 kilometers; the length of a collect will be variable depending on our customers' needs, so the area that they're looking at and the mode in which we run the collect to meet those needs.
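The talk doesn't spell out how these imaging modes work internally, but one common way to trade coverage for SNR is to co-add frames along track: under a simple noise-limited assumption, SNR grows roughly with the square root of the number of co-added frames while the along-track coverage rate drops proportionally. A minimal sketch of that trade, with entirely made-up numbers:

```python
import math

def snr_coverage_trade(base_snr: float, base_km_per_s: float, n_frames: int):
    """Illustrative trade-off only: co-adding n_frames boosts SNR by ~sqrt(n)
    under a noise-limited assumption, while the effective along-track coverage
    rate drops proportionally. All numbers here are hypothetical, not Planet
    specs for Tanager's actual imaging modes."""
    return base_snr * math.sqrt(n_frames), base_km_per_s / n_frames

for n in (1, 2, 4):
    snr, rate = snr_coverage_trade(base_snr=300.0, base_km_per_s=6.5, n_frames=n)
    print(f"{n} frame(s): SNR ~{snr:.0f}, coverage ~{rate:.2f} km/s along track")
```

This is just one plausible mechanism for the coverage-versus-SNR flexibility described above, not a description of the actual system.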
The other data product I want to talk about supports the methane leak detection and repair use case and methane mitigation; this is something I touched on earlier when I talked about our Tanager satellites and the driving use case. For customers that are interested in looking for methane, we will be processing our imagery to look for methane and giving our customers a quick look, as we call it, at whether there is methane in their area of interest within a few days after collection. The idea is that this can really support users that have an area that they want to understand and be able to take quick action on, and we can help our users understand whether there is something there that they need to follow up on. We will be identifying methane plumes above our detection threshold at our 30-meter pixel resolution.

Something to note here, coming back to the idea of the power of Planet's different sources of imagery: something I think is really exciting and valuable is our ability to detect methane plumes in combination with our other data sources. You can see this example workflow on the right. You see an example methane plume from our Tanager satellites, as we anticipate it looking. You can take this and overlay it onto a contextual basemap from PlanetScope; we have PlanetScope imagery daily across the landmasses. Knowing that there is a methane plume is less useful if you don't have the context to understand where that plume is coming from, so you can use that contextual basemap to understand that. And then we have our high-res imagery from our SkySats, and soon from our Pelicans as well. If a customer wants to refine their understanding of where that methane plume is coming from, they can then task our high-res satellites to follow up and get a really detailed look at what's happening on the ground, so that they can send their workforce to the right places to start addressing the leak that they found. So it's really about layering the different types of imagery on top of one another to drive action.

With that, I will hand it over to Keely to talk a bit more about some of the science work we're doing and more about applications.

Thanks, Sarah. Sarah gave us all a really great background on the Tanager satellite and the instrument itself, and I'm going to talk a little bit about some of the work that we've been doing pre-launch to help facilitate our product development, to get early feedback from potential customers, and also to start kicking the tires a little bit on different types of data acquisitions and different potential applications we might be able to support with Tanager data. We've been working pretty heavily in the synthetic data space here with partners at RIT, at Riverside Research, and at a company called Rendered.ai that specializes in synthetic data as a platform. What we've done with those groups is create a very large synthetic scene that uses a land cover basemap and basically gives us the opportunity to fly a faux Tanager satellite and create simulated acquisitions over that synthetic scene, giving us a wide range of data cubes that we can use to explore different questions.
Mainly, we can use them to help finalize the specs of the different core imagery products that Sarah mentioned, as well as to start developing and testing parts of our data processing pipeline. Next slide, please.

We built this very large synthetic scene, as I mentioned, in partnership with RIT, Rendered.ai, and Riverside Research, and we did this using a land cover map that was part of the Denver regional land use and land cover project. That project used aerial imagery as well as LiDAR data to create a really fine spatial resolution land cover map. We decided to use it not to try to accurately and perfectly recreate Denver, but as a very good, realistic synthetic scene where we've got a lot of different types of materials and the opportunity to bring in a lot of spectral signatures and then distribute them across the landscape of our synthetic scene in a realistic way. We sourced spectral signatures from existing DIRSIG scenes and libraries, as well as from publicly available data like the ECOSTRESS spectral library, to which different research groups have uploaded spectra, and built out a scene that has over 1,500 unique spectral signatures in it. Once we've got that scene in place, we can add an atmosphere on top; we're using the FourCurve atmosphere plugin that's part of the DIRSIG radiative transfer model package, and we can then change what the atmosphere looks like over the top of our synthetic scene. Next slide, please.

We can set up all of the other imaging acquisition parameters, and we can put in a platform that accurately represents the hardware that is the Tanager payload. From that we simulate synthetic top-of-atmosphere radiance: we basically go through and get a fully simulated acquisition, and we have a data cube from it that we can work with. Like I said, we can use it to test out certain parts of our pipeline, taking it from radiance back to earlier stages of the pipeline, like digital numbers, for example (a toy sketch of that round trip follows at the end of this section). We use our internal radiometric models to add the proper types of noise to the spectra, and we can even do an atmospheric correction with this cube to get a surface reflectance cube. Next slide, please.

Here are just a few examples of what some of those retrieved surface reflectance signatures look like, coming from our 30-meter Tanager pixels. With this we can also test out our different algorithms; we can really debug and understand what's going on at each step of the way. In using synthetic data this way, we always have some assumptions, and we're relying on a lot of physical radiative transfer models, but at the same time we have a deep understanding of the system and how the signal is propagated through each part of it. Sarah, next slide, please.

I'll spend just a few minutes walking through some of the applications that we see as really good opportunities for using Tanager data. As I'm sure many of you know, one of the exciting things about imaging spectroscopy, or hyperspectral remote sensing, is the sheer number of applications for which the data can be used. I always like to highlight this review article, which I think did a really great job of summarizing all of the different application areas and all the different families of algorithms that have been developed.
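The talk describes walking a simulated radiance cube back to digital numbers, adding sensor noise from an internal radiometric model, and then correcting to surface reflectance. As a toy illustration of just the radiance-to-DN round trip (the gain, offset, and noise values below are invented, and the real pipeline also includes a full atmospheric correction step not shown here), a minimal Python sketch might look like:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical per-band radiometric model (all values invented for illustration).
gain = 0.02           # radiance units per DN
offset = 50.0         # dark-level DN
read_noise_dn = 2.0   # Gaussian read noise, in DN

def radiance_to_dn(radiance: np.ndarray) -> np.ndarray:
    """Forward model: scale simulated TOA radiance to digital numbers,
    add shot-like noise (Poisson) and read noise (Gaussian), then quantize."""
    dn = radiance / gain + offset
    dn = rng.poisson(np.clip(dn, 0, None)).astype(float)
    dn += rng.normal(0.0, read_noise_dn, size=dn.shape)
    return np.round(dn)

def dn_to_radiance(dn: np.ndarray) -> np.ndarray:
    """Inverse (calibration) step used when walking back up the pipeline."""
    return (dn - offset) * gain

toa_radiance = np.full(421, 80.0)  # flat synthetic spectrum, one value per band
recovered = dn_to_radiance(radiance_to_dn(toa_radiance))
print("mean abs radiance error after round trip:", np.abs(recovered - toa_radiance).mean())
```

The point of this kind of round trip, as described in the talk, is that every stage of the signal chain is known exactly, so any error introduced by a pipeline stage can be isolated and debugged.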
And the science here is really quite mature; it's exciting to be part of a mission that will make the data more available and accessible, which I think has been a major challenge to really operationalizing a lot of this good science. Next slide. We can do a quick tour through some of these applications. If you're very familiar with hyperspectral remote sensing and its applications, a lot of this will be review; if you're new to this area, hopefully it just opens your eyes to all of the different places where we think Tanager data can be of value. Next slide, please.

Obviously, one of the big ones is what we've called out as the flagship use case for Tanager: atmospheric products, so methane and CO2 plume detection and delineation (a generic sketch of the kind of retrieval used in the literature follows at the end of this section). I'm showing some examples here from Carbon Mapper, the nonprofit, of how we can take that and build different levels of data products: showing these plumes, showing their change through time, showing concentration, and even source attribution. Next slide.

There's also a wealth of applications in vegetation, specifically natural vegetation and agriculture. We can do things like species detection and classification; we can look at changes in fractional cover and sub-pixel abundance, which are very strongly related to structure (see the unmixing sketch after this section); and we can develop models that allow us to estimate biochemical and physiological plant functional traits more directly. Next slide.

In the agriculture space we have an additional layer of context, so we could be looking at something as simple as what crop type is here. But in cases where you understand what crop has been planted, and maybe you have more information about when it was planted and what kind of management practices have taken place, you can really start to use hyperspectral imagery to define and differentiate a lot of different aspects: that could be growth stage, pathogen presence or disease, or biochemical and environmental stresses. So there's really a wealth of applications in precision agriculture where this data can be applied. Next slide.

One of the areas that we've also looked at a lot, especially with our partners, is aquatic and coastal zones: thinking about using the spectroscopy data to map things like dissolved organic matter, turbidity, and chlorophyll-a; tracking things like harmful algal blooms; and even looking at bathymetry in very shallow areas. Next slide.

In the urban and built environment there are a lot of exciting applications: building materials and urban infrastructure, using the material and mineral signatures that are present. In these places you have a really high diversity of materials and a high diversity of spectral signatures, so there are a lot of different questions you can answer and a lot of different derivative maps you can make that can be used in a number of downstream applications. Next slide.

Minerals is kind of a classic; it was sort of the first use case for a lot of imaging spectroscopy and for a lot of the algorithms that have been developed for using it. This is also an area where we're excited about the potential of Tanager data: to create mineral presence maps, to look at mineral concentrations, to help better characterize areas that may have unexplored mineral resources, or even to track the impact of mining operations through time, so environmental site monitoring. Next slide.
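The talk doesn't describe the plume retrieval algorithm itself, but a common approach in the methane-mapping literature is a matched filter applied to the SWIR radiance against a CH4 absorption target spectrum. Here's a generic, self-contained sketch with entirely synthetic inputs — not Planet's or Carbon Mapper's production code:

```python
import numpy as np

def matched_filter(radiance: np.ndarray, target: np.ndarray) -> np.ndarray:
    """Classic matched filter, as commonly used for CH4 plume detection in the
    literature (a generic sketch, not Planet's production algorithm).

    radiance: (n_pixels, n_bands) SWIR radiance spectra
    target:   (n_bands,) CH4 absorption signature
    Returns a per-pixel enhancement score; large values suggest a plume.
    """
    mu = radiance.mean(axis=0)
    x = radiance - mu                                 # mean-centered spectra
    cov = np.cov(x, rowvar=False) + 1e-6 * np.eye(x.shape[1])  # regularized
    cinv_t = np.linalg.solve(cov, target)
    return (x @ cinv_t) / (target @ cinv_t)

# Synthetic demo: 500 background pixels, 50 bands, one injected "plume" pixel.
rng = np.random.default_rng(0)
bands = 50
bg = rng.normal(100.0, 1.0, size=(500, bands))
t = -np.exp(-0.5 * ((np.arange(bands) - 30) / 4.0) ** 2)  # fake absorption shape
bg[0] += 5.0 * t                                          # inject an enhancement
scores = matched_filter(bg, t)
print("plume pixel score:", scores[0], "| background mean:", scores[1:].mean())
```

The injected pixel scores near 5 while the background scores hover near zero, which is the basic signal a quick-look plume product would threshold on.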
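And for the fractional cover and sub-pixel abundance estimation mentioned above, the standard workhorse is a linear spectral mixing model, often solved with nonnegative least squares. A minimal sketch with fake endmember spectra (the three-endmember setup and all values are illustrative, not Tanager data):

```python
import numpy as np
from scipy.optimize import nnls

def unmix(pixel: np.ndarray, endmembers: np.ndarray) -> np.ndarray:
    """Estimate sub-pixel fractional cover with a linear mixing model, solved
    by nonnegative least squares and renormalized to sum to one.
    (A standard textbook approach; synthetic spectra, not Tanager data.)"""
    fractions, _ = nnls(endmembers.T, pixel)  # endmembers: (k, n_bands)
    return fractions / fractions.sum()

rng = np.random.default_rng(1)
n_bands = 421
E = rng.uniform(0.05, 0.6, size=(3, n_bands))        # fake soil/veg/NPV spectra
true_f = np.array([0.5, 0.3, 0.2])
pixel = true_f @ E + rng.normal(0, 0.002, n_bands)   # mixed pixel plus noise
print("estimated fractions:", np.round(unmix(pixel, E), 3))
```

With the full VSWIR spectrum, this kind of unmixing is what lets a 30-meter pixel be decomposed into cover fractions rather than a single hard class.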
One that I think doesn't get mentioned enough, but that is becoming increasingly important, especially as we think about climate change and ecosystems evolving through time, is the cryosphere: applications in snow and ice. Imaging spectroscopy has been broadly used to map snow grain size, to look at water content in snow, and to look at different types of dust deposition on both ice and snow surfaces. Next slide.

Along the same lines, there's looking at active wildfires and burned-area mapping, as well as ecosystem recovery trajectories. This is an area where there's been a lot of work done with imaging spectroscopy. On the active fire side, we're actually able to model the temperature and the fractional cover of the part of the pixel that is on fire. After an area has been burned, using the signal that shows up out in the shortwave infrared, we're able to look at things like charcoal and ash cover, and we're able to track at a finer-grained scale how well an area is recovering after a wildfire. Next slide.

I wanted to highlight something a little beyond the applications of hyperspectral imagery on their own, and that is that at Planet we really think a lot about data fusion and bringing together a lot of different sensors, whether those be optical, active, or radar. They don't have to be Planet constellations: they can be publicly available data, they can be spaceborne missions, and we can even work with other folks who are building and launching small cubesats. The idea is to bring together all of those sensors into analysis-ready data layers; I'll give a little plug here for the upcoming Analysis Ready Data (ARD23) workshop that's happening next week. It's really important that, as we get more and more satellites up, and we have more and more sensors on the ground, and we continue to evolve payloads and instruments that we can fly on drones or in manned aircraft, we keep thinking about how we can use all of those data sets together and bring them together into a single cube. Really, that's where I think we have the most leverage for answering a lot of these bigger questions. We call it thinking up the stack: getting ourselves to a set of derived products that take advantage of all of the different strengths of those different sensor systems. Next slide.

Another thing I wanted to share is a quick example of something Sarah talked about previously: the concept of using multiple constellations in a tip-and-cue format. Sarah talked about how that might work across Tanager, our Doves, and our SkySats and Pelicans, and I thought a more specific example, invasive species, might paint that picture a little more clearly. You can think of PlanetScope, from our Dove satellites, monitoring at large scale; we're getting broad swaths of the land at three-meter spatial resolution nearly every day. What that lets us do is have a really good sense of what's changing through time, so we're able to monitor that signal and use it to tip and cue our other satellite constellations. So, for example, let's say we're tracking some change in green-up in a particular area.
We might then come in and say, well, at this particular point in the season, we know we'd have high spectral separability between the native species and the particular invasive species that we're trying to monitor and manage. So we're able to say: great, we're at the right time in the season to go get our hyperspectral imagery. We can tip and cue Tanager to go acquire that image, which allows us to create a more accurate map of where we see that invasive species. Then we can take that a step further: now we actually want to help support a land management group to go out and apply treatment to those areas. At that point, we might want our high-spatial-resolution constellations to come in to help understand the safest and best route of access to get to the areas that we were able to map using Tanager data.

I thought I'd just finish by talking a little bit about Planet's science programs more broadly; we're really excited to bring Tanager into the fold of this program as soon as we're able. Right now there are over 2,500 publications out there in the scientific literature that use Planet data in their research, and over 20,000 registered users across all of our science programs, spanning over 100 universities worldwide. So Planet is really interested in and excited about continuing to provide data to the research community. And probably the thing I like most about this slide is the pie chart; usually I don't like pie charts, but I like this one. It shows a diversity of subject matter areas, and I love seeing that there are all of these different disciplines and sub-disciplines where Planet data is being used to advance scientific knowledge. Next slide.

I think this is my last one here, and it's just a summary of our existing science programs. We've got our Education and Research program, which is probably the most well-known; this is typically where a degree-granting institution has a university license, or a lab may have a license. But we also have really strong agreements with the NASA Commercial Smallsat Data Acquisition Program, as well as ESA's Earthnet program, which allow any researcher who has funding through one of those bodies to access parts of our Planet data. And you can see that that's not just PlanetScope; it also extends into the RapidEye archive, as well as SkySat. We have a couple of other programs as well that are gaining popularity. So I really strongly encourage you to go to the website and see how to get access. Usually people are pretty surprised that they most likely have a path to getting access to Planet data, which was not the case when I was in graduate school; it was really difficult to get your hands on any commercial satellite data, and that made it difficult to do any science with it. So it's exciting to see how much effort and support Planet has put into making sure that the scientific community can access imagery from these constellations and use it in their research. And I think that is it. Thank you so much for your time.