Hi, I'm Samantha Weintraub. I'm the terrestrial biogeochemist at NEON, and today I'll be talking to you about foliar chemistry observations, hopefully in the context of remote sensing and the hyperspectral data. So plant canopies contain lots of elements, compounds, and molecules that are relevant to ecosystem processes: chlorophyll, obviously, for carbon uptake; lignin for defense and leaf longevity; phosphorus, nitrogen, and other nutrients that enable plant growth. We're interested in understanding the spatial and temporal distributions of nutrients and molecules like these in the canopy. However, this field is still somewhat young. We are still learning about the fundamental controls on canopy traits, how they vary both within and between ecosystems, and how much of that variation is controlled by phylogeny versus environment and climate. We also have a poor understanding of how canopies and canopy traits are responding, and will respond, to global change, because we don't have that much repeat measurement of these parameters. And this is where hyperspectral remote sensing has really been a boon to the field: we can fly imaging spectrometers over ecosystems, and of the incoming solar energy, some is absorbed depending on the chemical constituents of the canopy, while some is reflected back and can be detected by the spectrometer. That lets us scale up and opens the possibility of much greater spatial and temporal understanding of canopy dynamics. I think you've already been hearing a lot about the Airborne Observation Platform (AOP). There are going to be three payloads under full operations, and the idea is to cover all of the NEON sites in any given field season with these overflights, collecting data from LiDAR instruments and from imaging spectrometers.
You also hopefully heard that the third payload is going to have targets of opportunity for principal investigators, which is very exciting: a chance to take our technology and scale up even beyond the NEON sites. I won't spend too much time on this slide because I assume you've covered it in great depth, but in terms of what's in the payloads: the hyperspectral imaging spectrometer gives us radiance and reflectance, from which we can derive vegetation indices and albedo; the waveform LiDAR gives us point clouds, from which we can build the digital terrain model, the surface model, the canopy height model, and learn about biomass; and the digital camera gives us high-quality images. I'm on TOS, the Terrestrial Observation System, so similar to Katie, I'm working with the ground technicians, thinking about what we need to sample and measure to be most useful and informative for working with the remote sensing data. The idea here is: what observational data and sampling can we do that will inform AOP? I'm specifically going to talk about the hyperspectral data and canopy chemistry measurements, because I think Katie just covered in pretty good depth the biomass, standing stocks of carbon, LAI, and the other kinds of vegetation measurements we are doing. So the idea would be: we do an overflight of a site, we extract the reflectance data from a given pixel on the landscape, and then if we can get a bunch of chemical, molecular, elemental trait data from that tree, that individual, maybe that strip of grass, we can build models with the hyperspectral data and use them to scale up. A lot of people are already doing this, and we hope to really take that field forward. I am the first author on the canopy foliage sampling protocols.
If anyone has questions about the canopy chemistry data, I am a good person to contact. We do this protocol once every five years at a site. What that means is that in any given year we have about nine to ten sites doing canopy foliage sampling, and that includes this year: this is the first year we've got nine sites distributed across the continent doing canopy foliage sampling, and they do it in conjunction with the AOP overflight. So within plus or minus two weeks of when the airplane flies, we are doing canopy foliage sampling. We bring down only sunlit leaves, and that's pretty important, because we don't want within-tree variation in things like allocation to chlorophyll versus rubisco, or other within-tree responses to light. We only want to collect mature, sunlit leaves. We bring down canopy samples, we flash freeze a portion, and we send the frozen samples for chlorophyll and carotenoid analysis. Then we chill the remainder of the vegetation, keeping it cool in a chest with ice packs, and we do leaf mass per area measurements as soon as we can; the maximum holding time is five days, but we try to do them sooner. The rest we dry, homogenize, and send to different analytical facilities to measure various chemical constituents. How we get leaves out of the canopy very much depends on the vegetation. If it's short-statured, say shrubs in arid lands, and also in our grasslands, we can just use clippers, which is nice and easy. If it's medium-statured, we try to use pole pruners, if we can reach sunlit vegetation that way. But in a lot of our forested sites, we do have to use arborist tools such as line launchers and slingshots. No one is using drones right now, but it's been floated as an idea we could explore in the future.
There are some groups we know of doing that, in Berkeley and elsewhere. It also depends on the site: certain of our site hosts are more restrictive, and they do or don't want drones, do or don't want slingshots. So we have this suite of tools, and we do what we can to get sunlit vegetation out of the canopy. When we are sampling woody individuals, which could be trees or shrubs, we do a plot-level assessment of the three most dominant species in that plot, and then we go sample one individual of each of those species. In low-diversity sites, that sometimes means you get replication; maybe you get three blue spruce, if it's a low-diversity site. But we do get the top three canopy-dominant species at the plot level. We try to collect around 30 to 50 grams of fresh material, assuming half of that is water, and then we subsample it for all the different measurements we do. We do measure the height the sample was taken at: even though we try to constrain everything to sunlit vegetation, we still record where in the canopy the sample came from, using a laser rangefinder. And we record the tag ID, because, as Katie mentioned, that's how we geolocate things. That's how you're going to know exactly where in the landscape that woody individual was: by joining the tag ID to the vegetation structure data and using some of our packages, or some simple code, to determine the actual latitude and longitude. We do this whenever we have a woody site with trees or shrubs. We've got 14 plots per site, which yields a target of 42 individuals per site. Honestly, I wish it could be more, but it's what we can do given the constraints on resources and budget. And we do measure a lot of things about those 42 individuals. For sampling herbaceous vegetation, Katie explained that we divide our plots in grasslands into these clipped strips.
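The tag-ID join described above can be sketched roughly as follows. The table and column names here are simplified, hypothetical stand-ins for the real NEON vegetation structure tables (NEON's own packages handle the full details, including reference-point geolocation); the geometry shown, offsetting a mapped reference point by stem distance and azimuth, is the general idea.

```python
# Sketch: joining a foliar sample's individual/tag ID to vegetation-structure
# mapping data to estimate a stem location. Column names and values are
# simplified stand-ins, not the actual NEON schema. Azimuth is degrees
# clockwise from north; distance is in meters; coordinates are UTM meters.
import math
import pandas as pd

foliar = pd.DataFrame({"individualID": ["NEON.TREE.001"],
                       "foliarNitrogen": [2.1]})
mapping = pd.DataFrame({"individualID": ["NEON.TREE.001"],
                        "refEasting": [500000.0],   # mapped reference point
                        "refNorthing": [4400000.0],
                        "stemAzimuth": [90.0],      # degrees from north
                        "stemDistance": [10.0]})    # meters from reference

joined = foliar.merge(mapping, on="individualID")
az = joined["stemAzimuth"].map(math.radians)
joined["easting"] = joined["refEasting"] + joined["stemDistance"] * az.map(math.sin)
joined["northing"] = joined["refNorthing"] + joined["stemDistance"] * az.map(math.cos)
```

With an easting and northing in hand, the sampled individual can be matched to the AOP pixel that contains it.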
So if you're in a grassland site, for canopy foliage you use these clipped strips to cut a bulk sample. We're not sampling at the species level, because it's a grassland; that wouldn't make sense. The technicians do a randomized selection of the clipped strips: they have a randomized way of picking one, they lay out the strip, and they clip all the aboveground herbaceous biomass in it. We're not worrying about woody things that aren't green for this protocol. And the grid of clipped strips provides the geolocation, so again you know where you are on the landscape and can link to the AOP data. For herbaceous sites we do 20 plots per site, and some of those are larger tower plots, where we clip two strips, so you can get up to 24 strips per site. Most of the plots are the small 20 by 20, where we just do one strip; so it's about 20 to 24 clipped strips per site. And then it gets interesting, because we have some sites that are savannas, a mix of grasses and trees. I think you're going to be working with the San Joaquin data we have from 2013, and that is a savanna site; the protocol just wasn't mature enough then, so we didn't do this there. But going forward we will be doing a mix of clipped strips and tree sampling, and you can probably expect the most samples to come from savannas, because we're going to have to cover both the trees and the grasses. As I mentioned, we take a subsample pretty quickly, get it into a foil packet, put it in the dark in a cooler full of dry ice, and flash freeze it as soon as we can. That's for the pigment analysis. Everything else we set aside: we have the technicians select some healthy green leaves in good condition, put them in a bag, and set them aside for leaf mass per area, sealing the bag to minimize water loss from those leaves.
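The frozen subsample goes on to spectrophotometric pigment analysis in a methanol extract. As a rough illustration of what that calculation looks like, here is one commonly used set of Lichtenthaler-style equations for pure methanol extracts; these exact coefficients are an assumption on my part, and NEON's published lab protocol is the authoritative source for the method actually used.

```python
# Sketch: converting spectrophotometer absorbances from a methanol extract
# into chlorophyll a, chlorophyll b, and total carotenoids (ug/mL).
# Coefficients follow a Lichtenthaler-style formulation for pure methanol;
# treat them as illustrative, not as NEON's official method.
def pigments_methanol(a470: float, a652: float, a665: float) -> dict:
    chl_a = 16.72 * a665 - 9.16 * a652
    chl_b = 34.09 * a652 - 15.28 * a665
    carotenoids = (1000.0 * a470 - 1.63 * chl_a - 104.96 * chl_b) / 221.0
    return {"chl_a": chl_a, "chl_b": chl_b, "carotenoids": carotenoids}

result = pigments_methanol(a470=0.4, a652=0.3, a665=0.5)
```

Concentrations in the extract are then scaled by extract volume and leaf mass or area to report pigments per unit leaf.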
And then we have a paper bag full of the rest of the leaves, which we use for the chemistry measurements. We bring all that back to the lab. We measure leaf mass per area probably in the most common way: we use a scanner to take an image of the leaves, use software (we use ImageJ) to calculate the projected area of those leaves, and then dry and weigh the leaves. That gives us a leaf mass per area measurement. I don't know how many other groups are doing this with herbaceous vegetation; it's a little unusual, but we're actually doing a bulk herbaceous leaf mass per area scan, just trying to get a representative subsample, telling the botanists to pick out some leaves that are representative of what was in the strip. For conifers and broadleaf species it's much more straightforward, just a single-species scan. There are really large plant trait databases, and hopefully these measurements can go in there one day. Then we dry and homogenize the rest of the leaf material and send it off to various external facilities. We've got measurements for total carbon and nitrogen as well as δ13C and δ15N in the leaves, all coming from a stable isotope lab. We're measuring chlorophyll and carotenoids spectrophotometrically using a methanol extraction, so we get chlorophyll a, chlorophyll b, and carotenoids from there. We're measuring lignin with the acid detergent lignin method; our current lab uses the ANKOM technique, the ANKOM bags. And we're doing major, minor, and trace element analyses by ICP-OES after a hot nitric acid digestion. As Katie mentioned, fall is a really good estimate for when you can expect to see canopy foliage data hitting our data portal.
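The leaf mass per area calculation itself is simple: oven-dry mass divided by the projected area from the scan. A minimal sketch, with hypothetical numbers:

```python
# Sketch: leaf mass per area (LMA) from a scanned projected area (e.g., as
# measured in ImageJ) and an oven-dry mass. Input area in cm^2, output in
# g/m^2, the units most trait databases use.
def leaf_mass_per_area(dry_mass_g: float, projected_area_cm2: float) -> float:
    area_m2 = projected_area_cm2 / 1e4   # 10,000 cm^2 per m^2
    return dry_mass_g / area_m2

lma = leaf_mass_per_area(dry_mass_g=0.85, projected_area_cm2=100.0)  # about 85 g/m^2
```

For the bulk herbaceous scans, the same arithmetic applies with the pooled mass and pooled area of all scanned leaves.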
We will make sure that all of the lab protocols are also delivered on the data portal, so you can see exactly how we arrived at the numbers we did for the canopy chemistry. And if anyone wants to do NEON-adjacent sampling, we can try to keep the methods as similar as possible. I just wanted to show you a little bit of early data, because I know some people have been hearing about NEON for a long time, and it's fun to actually see data and the protocols in action, and to give you a sense of what you might expect. Last year we did a prototype of the protocol. It wasn't our first; we'd been prototyping at other sites, but this was the first time that the NEON technicians themselves went out, got line launchers and slingshots, and really ran the protocol. We had two forested sites and one grassland site with canopy foliage data, so about 100 data points. To show you what some of that looks like: plotting chlorophyll versus leaf mass per area across the three sites, we see what I think is an expected trend, this trade-off between resource acquisition and storage. If you're a long-lived leaf, you're probably investing more in defense and leaf toughness and not as much in really fast carbon acquisition and allocation, and vice versa: if you're really invested in growth, maybe you don't invest so much in defense, and you're not a long-lived leaf. And you can clearly see (sorry, the colors aren't great here) that there's definite variation between the sites. Our Great Smoky site tends to have the lowest LMA and the highest chlorophyll, and our Woodworth grassland site is at the other end of the spectrum. I think these are expected patterns, but it's interesting to see them show up in our data. And then these are all of the trees that we sampled in that effort.
So the Smithsonian site and the Great Smoky site, all combined into one graph, and we're just plotting lignin-to-nitrogen ratios, which we know are really important for controlling rates of decomposition and leaf longevity. I wanted to point out that there's quite a big variation in lignin-to-nitrogen ratio across these species. Some of the ones with high ratios are conifers, but there are also some broadleaf ectomycorrhizal trees with really high lignin-to-nitrogen ratios. So it's interesting to think that, when this plant litter hits the floor, depending on the mix of species it should promote very different rates of decomposition across the NEON sites and plots. The other thing I thought was interesting was that Great Smokies and SERC had some species that occurred at both sites, and it was interesting to think about how much plasticity there is within a species. So I looked at foliar nitrogen, phosphorus, calcium, and magnesium, and it seemed like nitrogen was higher at the Great Smoky site, while the rock-derived elements tended to be higher at the Smithsonian site. That's kind of interesting to me: it sort of follows our latitudinal expectations, where tropical sites are thought to be more enriched in nitrogen and more depleted in rock-derived elements. And Great Smokies is a more weathered site; it's a little more southern and more of an Ultisol landscape. So I thought those were interesting patterns. Phylogeny is probably important, but there does, at least in Acer rubrum, seem to be some plasticity, probably based on soil resources and environment. There are other species that co-occur at both sites, and there's more we could do with analyses like these. In the last couple of minutes, I'll mention the other thing I do at NEON: I do a lot of the chemistry protocols for plants, but I'm also highly involved in the soil work.
So, I don't know if you're thinking about or using the soil data at all, but I thought I would mention it. We do a lot of microbial and biogeochemical work in our soil plots, and as Katie mentioned, we try to co-locate these as much as possible. So we have a lot of information aboveground about what's going on with the plants, and we're also trying to get information about what's going on belowground with the microbes and biogeochemical cycling. When we do soil sampling, we do it in 10 plots per site. Four of those are tower plots, where we have the most information about what's going on aboveground, so we're trying to mirror that belowground; the other six are distributed plots. We take three soil samples per plot, and if there's an organic horizon, as at some of our forest sites, we sometimes sample both the organic and the mineral horizons, and sometimes only one. It's a little complicated, and I won't get too deep into it, but in general we get 30 samples per soil sampling event per site, sometimes more when we sample two horizons. Our target depth is 30 centimeters, so it is a fairly large, depth-integrated soil sample. Most people don't take them zero to 30, but we do it that way because, unfortunately, we can't afford to do multiple horizons. We have three sampling events per year: we're trying to hit peak greenness at every site, plus the seasonal transitions, so the winter-spring transition and then senescence, or at a wet-dry site, the wet-up and then the dry-down. And here's a long list you don't have to read all of, but a lot of microbial data is going to be available; my colleague Lee Stanish is the best one to talk to about those. There are also lots of biogeochemistry measurements at different temporal resolutions. At the beginning, when a NEON site is established, we have the Natural Resources Conservation Service come in.
Professional pedologists do a really detailed characterization and full geochemistry on the soil, but that is not going to happen again; it's a one-time deal. Then we do periodic carbon and nitrogen measurements, and we measure net inorganic nitrogen transformations, so mineralization and nitrification, plus soil moisture, temperature, and pH. I don't have time to go into too many details, but if anyone's particularly interested in the soil work, I'm around and happy to talk more about it with you. Then I had a couple of slides showing some early soil data. This is mean annual air temperature at a site (not soil temperature, sorry) versus soil carbon. One would expect soil carbon to go down as temperature goes up, because decomposition rates go up and carbon storage in the soil declines; that is a globally expected relationship. But it's interesting here: the colors are the sites, and the facets are the different kinds of forests we have in the NEON plots. The point is that we have variation in the degree of change, and the slope varies with forest type. In the woody wetlands there's a really steep decline with temperature; in the deciduous forests it's much more variable; and the evergreen and mixed forests are somewhere in between. So it's interesting to think about how forest type, and maybe the chemistry of organic matter inputs and other things, are affecting these relationships, and maybe other state factors too, things unaccounted for like the elevation and slope angle of the plots. Some of this noise is probably explainable. And then the other interesting thing that always emerges when you look at soil data is pH. pH is such a good integrator of a lot of soil processes.
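The "slope varies with forest type" observation boils down to fitting a separate soil carbon vs. temperature regression within each facet. A minimal sketch with synthetic data (the forest types echo the talk, but the numbers are invented):

```python
# Sketch: per-group linear fits of soil carbon against mean annual air
# temperature, one slope per forest type, as in the faceted plot described
# above. Data are synthetic; the point is the per-group fit, not the values.
import numpy as np

rng = np.random.default_rng(1)
records = []
for forest, true_slope in [("woody wetland", -3.0), ("deciduous", -1.0),
                           ("evergreen/mixed", -1.8)]:
    mat = rng.uniform(0, 20, 30)                          # degC, 30 plots
    soil_c = 60 + true_slope * mat + rng.normal(0, 2, 30) # carbon stock + noise
    records.append((forest, mat, soil_c))

# Fit a degree-1 polynomial per forest type; index [0] is the slope
slopes = {forest: np.polyfit(mat, soil_c, 1)[0]
          for forest, mat, soil_c in records}
```

With real NEON data you would also want covariates like elevation and slope angle in the model, per the state-factor point above.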
So whether we're looking at soil δ15N, which is an indicator of microbial nitrogen cycling and the forms of nitrogen loss in a system, or the carbon-to-nitrogen ratio, which is a good indicator of the degree of decomposition, or the diversity of the microbial community, pH is a really good indicator and helps us predict soil biogeochemical and microbial dynamics. So yeah, that was pretty much all I had for you all.