So, hello everyone, I'm Sam Elinworth, and I work here at the University of Western Australia. Today I'm very pleased to introduce our speaker, Dr Nick Callow, who joined the University of Western Australia in 2011 and who leads a lot of the drone and remote sensing work across the university, including the processes that govern how drones are operated here. Nick is going to talk us through how drones are being used in research, the regulations that apply to them, and some of the hard-won lessons his group has learned along the way. Please post your questions as we go and we'll work through them at the end. So, Nick, over to you. Thanks very much, Sam, and thanks everyone for joining us. So what I wanted to do today was to talk about drones and remote sensing: where the technology has come from, how it's regulated, how we're using it across the university, and some of the lessons we've learned. And a useful place to start is with the way the technology has evolved, because drones have followed the same sort of trajectory as a lot of other technologies, from the specialised, expensive equipment of earlier decades through to cheap consumer products.
The sector is usually broken down into a few parts. There's the defence side, where a lot of the technology originally developed; there's that commercial and civil application, that's actually the sort of fastest growing part of the sector; and then this sort of consumer use, which is more around the use of drones for photographic sort of areas, and that area sort of grew the fastest. What I wanted to do was to sort of place this in a bit of a visual context. So this slide here, we've got on the top this sort of evolution from the first mobile phones being introduced into America in about 1983, through the consumer handsets like the Nokia 3210, and on to the smartphones that everyone carries today. Drones have followed a very similar sort of path. When the university bought its first octocopter, it cost around $15,000, flew for about six minutes, and took a skilled operator to get anything useful out of it. Today you can go and buy a drone with a stabilised 4K camera, automated flight functions and on-board storage for around $2,000, and it just works.
Drones also flow through into our teaching. There's coursework building on some really amazing stuff: opportunities in this sort of GIS and remote sensing space through things like the Bachelor of Science in geographical sciences and environmental sciences. Sherrine Hickey, a lecturer in that sort of area, does some really cool stuff looking at mangroves and mangrove carbon accounting and remote sensing techniques, through into the Masters of Environmental Science and its specific units, where students get hands-on exposure to drones and the sorts of data they produce. There's also a strong agricultural side to all of this: colleagues like Ken Flower use drones in agricultural research, looking at crops and precision agriculture sorts of questions. So there's real breadth in the ways drones are being used and taught across the university. What I want to do next is move on to the regulatory side of things, because the rules around operating drones are something that everyone who flies them needs to understand.
So, in Australia, drone operations are regulated by CASA, the Civil Aviation Safety Authority. There are two key credentials: the remote pilot licence, which covers the individual flying the aircraft, and the remote operator's certificate, the ReOC, which covers the organisation. Drones themselves are categorised by weight, and the small category, up to 25 kg, covers most of what we fly at the university. But CASA allows for the capability to operate what they call mediums, so up to 150 kilos, where you can actually fly these on land that's owned by you or by the university; and then the large aircraft, which aren't really operating within Australia, but certainly within the United States there are research organisations operating large drones. The standard operating conditions are really a set of rules that are there to protect safety and protect people. The ones that I've lined up here on the top are broadly the non-negotiable ones, the ones that everyone needs to be aware of: obviously avoiding conflict with any other people or property, emergency situations, respecting privacy. The ones on the bottom are ones that under the standard operating conditions you can't really do: operating in any sort of populous areas (and beaches are explicitly listed as those), operating within 30 metres of people, and you have to maintain what's called visual line of sight, so that the aircraft is sufficiently close to you, and these other categories. Within the remote operator certificate we have a series of different permissions and approvals. So, for example, we can have training to mitigate risks to operate in some of these areas. We have approvals to operate within 15 metres of people. We've got CASA instruments that allow us to operate at night. We can apply through CASA to get around some of these other rules.
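As an aside, a few of the standard operating conditions above lend themselves to a simple pre-flight sanity check. A minimal sketch (the helper and its thresholds are illustrative only, drawing on the 30-metre rule here and the roughly 120-metre ceiling mentioned later in the talk; it is no substitute for the actual CASA rules):

```python
def within_standard_conditions(dist_to_people_m, altitude_m, visual_line_of_sight):
    """Rough check against a few of the standard operating conditions:
    stay at least 30 m from people, below ~120 m, and in visual line of sight."""
    return (dist_to_people_m >= 30
            and altitude_m <= 120
            and visual_line_of_sight)
```

A flight that fails any of these checks would need the extra permissions and approvals described above (for example, the approval to operate within 15 metres of people).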
So, there's this opportunity for staff, if you're needing to, and for people who are using them more broadly, to increase the range of activities that they can do by moving from that excluded class of operations to the ones where you actually have the remote pilot's licence. So, I just wanted to sort of have a nice run-through of profiling some of the amazing work that people are doing. Some of it's my group, but some of it's certainly a range of other groups across the university. So, you know, there's huge diversity. You've got a really nice slide here from Jeff Hanson, and word on the street is he might be giving a talk down the track. So, you know, I won't steal too much of Jeff's thunder, but they do some really fantastic work looking at changes in beach profile and beach dynamics. A nice one of them taking off their little fixed-wing eBee there down on the beach, which they've got all the appropriate permissions and approvals to do. Some slides here on the left from PhD student Naima, who's looking at shark aggregation studies, again, you know, with the appropriate animal ethics approvals, and her field sites, which happen to be in a military restricted flying zone, which also requires some work. But yes, super, you know, interesting usage of data and the sort of collection of data that, you know, wouldn't otherwise be possible. Some really interesting work that I've sort of been lucky to be involved with, but also some other groups at the university who are very active. The Centre for Rock Art Research and Management in the archaeology group does some tremendous work as well with imaging archaeological sites. Some really cool sort of things to be able to work on some of the amazing rock art on the Burrup Peninsula. What appear to be foundations of human structures, you know, built tens of thousands of years ago.
And also some work that we're involved with through the Kimberley, trying to reconstruct the paleoclimate by using drones to image lakes and reconstruct the regional climate. So the project there is trying to reconstruct 60,000 years of climate, and using drones as one of the many techniques to try and do that. Here's another sort of example of some of the work, and this time it's using thermal imaging. So it's flying a thermal camera on a drone and processing the data to look at the temperature. In this case, we're looking at monitoring of trees in the Kings Park area, and the point cloud that you can see on the right hand side is showing temperature. So the red is hot, and that's areas of the ground of bare soil, which is, you know, 50, 60 degrees on a warm day in summer. And then the tree canopy is between 20 and 25 degrees. So what you're actually seeing there is the effect of the trees pulling up the water and being able to transpire. In the image on the left, we're looking at, again, tens of thousands, hundreds of thousands of thermal points within the Kings Park area and different sorts of trees, and these amazing relationships that you've got between different types of trees. So being able to do this really intricate scale of research by using the thermal cameras and, you know, looking at different trees that are able to access different water resources to get them through hot, dry summers, and trying to look at the effects of climate change on woodland decline in the Banksia woodlands of Perth. Another little one here, this is some work involved with Snowy Hydro. So they collect data on snow depth at a specific location, but don't really have a really good handle on how the data from one point relates to, you know, the whole broader catchment.
So the opportunity to try and use drones to image areas and to look at snow depth and changes in snow depth has really been working to help improve our understanding of how much water comes off these alpine areas and flows down and supports, you know, a huge amount of the renewable energy that's provided the power generation, and the water that's going to be required for the anticipated four and a half billion dollar investment in pumped hydro to try and secure a means of storing renewable energy supplies. So in this last section of the talk, what I wanted to do was to just share, I guess, seven kind of hard-learned lessons that we've sort of come up with, and some of the findings of some of the research. And the first one would really be the sort of observation of, you know, let your question dictate the approach. So just because you've got drones and the capability to use drones, they're not necessarily always the right solution. And here's a really nice example from PhD student Dan Dixon, who's supported through the CRC for Honey Bee Products, really trying to understand flowering patterns of trees. You can see that photo there of some marri trees that are in flower, with that white sort of tinge to them, surrounded by ones that aren't flowering. So in this case, drone imagery is really, really good for providing a validation of flowering at this amazing scale, but completely useless for monitoring trees, because we'd have to fly huge areas of the whole state or the continent, and we'd need to do it every day, and that's just not practical. So in this case, the drone imagery is being fed into a process to use different sorts of satellite imagery. And in itself, the sorts of things that Dan's trying to do in this project really illustrate how rapidly this whole area is changing. So the Planet Labs system is not a single satellite like a conventional satellite such as Landsat or Sentinel.
But in fact, it's what's called a nanosatellite cluster. So it's hundreds of individual satellites that now pass over every part of the earth's surface every day and provide very high resolution imagery. So the sort of research that we're doing in this area wasn't even possible to try and do two or so years ago. So multiple techniques, multiple methodologies, but using them where they're best and most appropriate. These ones here, I've stolen some slides from Hustina, who's just completing a master's project in the school and has done a superb job, and will be presenting this for her thesis presentation shortly. But here she's really showing the effect of scale. So the drones, as you fly them at different heights, are going to collect different scales of data. So this is exactly the same image here. In this case, you can see one of the pots and what it looks like up close, and then what's happening as you're flying at different heights, and then with a conventional RGB camera and then also a multispectral camera. So the multispectral camera is going to give us much richer spectral data. So it's going to give us other information, things like the near infrared, and we can derive other things like NDVI from that, versus the RGB camera, which is going to have much more spatial information, but less spectral information. And you can see as you fly at different heights, things look very different, but it's very much what we'd call a horses for courses approach. So what we've found in a lot of our research: if you want to identify a specific feature, if you're interested in things that fit into nice specific categories, then some of that work of Hengl's seems to apply really nicely. And basically they suggested that if you want to identify an object, you need about four pixels to do it, and preferably two across the minimum axis.
So if we're doing research and we want to identify a specific object, we really use that rule of thumb of at least a minimum of four, preferably six, and aim for about 10 as best. But we've found in Hustina's work and some of the other work we've done, and there's a whole pile of other published work that really supports this, that if you're more looking at a crop, something that's much more of a homogeneous, continuous surface, actually flying higher tends to work a lot better. And it has significant advantages of covering much larger areas. So typically, if we're trying to do that work, we take a completely different approach. So trying to understand this subtlety, this idea of horses for courses: your question will dictate how you actually operate the drone. The other one that's been a real learning curve for us has been getting our head around what the software that we use to process the data is doing. So people would probably be familiar: you collect a whole pile of different photos, usually with a drone, and then you use this structure from motion or SfM software. People would be familiar with the commercial packages like Agisoft Metashape or Pix4D that are pretty commonly used to do this. But you really need to understand what that software is doing and how it's affecting the results that you're going to get out. And the key thing is understanding that what the software is doing is optimising data to try and create this model. And that can be the locations the photos were taken. Predominantly, it's the ground control points that we use, so that's the second point. And the third one is the camera distortion, or the camera model, that's used. And those can be really important. That's a really important thing that we've sort of come across in trying to publish some of the work and just sort of understand it in any more detail. So it's really important within the software.
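As an aside, the object-identification rule of thumb above (at least four pixels on the object, preferably six, ideally about ten) can be turned into a quick calculator for the largest usable pixel size. A minimal, hypothetical sketch:

```python
def max_gsd_cm(object_size_cm, pixels_across=4):
    """Largest ground sample distance (pixel size, cm) that still puts
    `pixels_across` pixels across an object of the given size.
    Rule of thumb from the talk: at least 4, preferably 6, aim for ~10."""
    if pixels_across < 4:
        raise ValueError("rule of thumb: use at least 4 pixels across the object")
    return object_size_cm / pixels_across
```

For example, resolving a 40 cm object at the minimum of four pixels needs pixels no coarser than 10 cm, and at the preferred ten pixels you would want 4 cm pixels or finer, which in turn constrains how high you can fly.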
And some of the software manuals actually tell you to use specific parameters which are completely inappropriate and incorrect. So you need to set parameters when you try and fit the models which are really accurate. Use all the data available. So when you're using ground control points, if you've got error for each individual point, include that; don't just use a global error. Only add parameters which are necessary. So there's some nice work from Mike James on this, and we've found that camera model C of his works well. And again, this is sort of getting into the nitty gritty detail for people that are, you know, processing in PhotoScan and Metashape. And also, you know, the sort of understanding of what you're doing and using your ground control points for in terms of fitting the model, and then what's an estimate of model error and what's your real error, and making sure that you're using things like ground control points appropriately. Using sufficient overlap is really important. So we've certainly found that 90% front lap and 75% side lap is really a minimum. We tend to use these sorts of approaches at most of our field sites, which are in that sort of couple of hectares range in size. So we use about 20 targets, where we use 14 as our fitting data set and six as our validation, which really allows us, you know, a couple to fall over or blow over or something like that. But it really needs a minimum of 12 ground control points and then four to validate. And that paper from Martínez-Carricondo there, you know, is a good little guide in terms of that sort of 0.5 ground control points per hectare. So we use those as some rules of thumb, you know, obviously not using our fitting data set in the validation processes. For most of the data sets that we've processed, we find that the error is around about four to six times the pixel size. So this gives us a really good guide on what we can expect in terms of error.
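As an aside, the survey-planning rules of thumb above can be collected in one place. A minimal sketch (the helper itself is hypothetical; the numbers are the ones quoted in the talk):

```python
def plan_survey(area_ha, pixel_cm):
    """Rules of thumb from the talk: 90% front lap / 75% side lap minimum,
    ~0.5 ground control points per hectare but never fewer than 12 for fitting,
    a validation set in roughly the 14:6 ratio (at least 4), and expected
    error of about 4-6x the pixel size."""
    gcps = max(12, round(0.5 * area_ha))
    return {
        "front_overlap_pct": 90,
        "side_overlap_pct": 75,
        "fitting_gcps": gcps,
        "validation_gcps": max(4, gcps * 6 // 14),
        "expected_error_cm": (4 * pixel_cm, 6 * pixel_cm),
    }
```

For a couple-of-hectares site with 1 cm pixels this gives the talk's setup: 12-plus fitting targets, a handful held back for validation (never reused from the fitting set), and an expected error of roughly 4 to 6 cm.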
And then if we're doing topographic change work, like, you know, work on sedimentation and changing structures like that, we know from some of the work that was laid down in a lot of the lidar research earlier that you can only detect a change that's twice the size of your error. So effectively what this means is, if you collect one centimetre pixels, we know from our experience that your minimum detectable change is about 10 centimetres. So you need to think very carefully about what scale of data you're going to collect depending on your question, and select that with knowledge of how you're going to use it. The other stuff that's really useful in this is getting your head around some of the things that you can do, like flying single grids or double grids, imaging straight down or slightly obliquely, and how that can help with getting more accurate models. It's really important to understand the difference between structure from motion point clouds and lidar point clouds. So lidar point clouds and structure from motion point clouds, which is what you get from the drone technology, look really similar, but in fact they are not the same. So here's some examples of some of the work in the snow. And you can see here, on the relatively featureless surface of the snow and in the really dark trees, the algorithms that do this key point matching really struggle to identify specific locations. And these are the locations which become the three dimensional points. So how this is going to affect your point cloud is really important for the sorts of ways that you use the data set. So lidar data sets are going to be much more consistent. The structure from motion ones are going to have their own intricacies. So it's really important to get your head around how that works and how you can think about how you image areas. Number six: really understanding how multispectral imaging works.
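As an aside, the detectability arithmetic above can be made explicit: error is typically four to six times the pixel size, and detectable change is about twice the error. A minimal sketch:

```python
def min_detectable_change_cm(pixel_cm, error_factor=5):
    """Minimum detectable topographic change: error is roughly 4-6x the
    pixel size (factor of 5 taken as a middle value here), and you can only
    detect change about twice the size of your error."""
    error_cm = error_factor * pixel_cm
    return 2 * error_cm
```

So 1 cm pixels give roughly 5 cm of error and therefore about a 10 cm minimum detectable change, which is the example quoted in the talk.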
So the fact is that you're collecting an image which is based upon the intensity of the sun at the particular time hitting a surface and reflecting back up into your drone sensor. So if you want to fly an area and you just want to look at the relative differences, and you're going to fly it over a very short period of time and it's not a cloudy day, then it probably doesn't matter too much the way that you process this data. But if you want really accurate data, if you want to calculate various metrics or you want to conduct time series, you really need to start doing things like using calibration surfaces, understanding in this case how the sun's intensity is going to be affected by clouds, and really, in some ways, throwing out data or just not bothering at times if it's a cloudy day. So cloudy days, some of the data we've got is just useless. So, you know, if you're conducting quite quantitative work, you need to bear that in mind. The last one is really nicely illustrated by some of this work from Mary Murphy, another PhD student in our school, who did some of the work on frost detection. So really try to also use some of the other techniques. So we're not just flying a drone around trying to look at what we happen to be able to map with it, but also using some things that we've got through our remotely piloted aerial sensing platform in the Faculty of Science. So we've got some hyperspectral imaging equipment, and this is what Mary's demonstrating in the field. And, you know, there's a really nice figure from one of her papers there, where she's actually used this sort of stuff to try and, you know, conduct experiments and work out, you know, what spectral bands are going to be useful. But also the second important learning: you know, multispectral sensors are really quite different. If you look at the lower parts of that figure there, you can see that, you know, the different sensors there have different bands that they're actually sensing.
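As an aside, one common way to apply the calibration-surface idea above is a single-panel empirical correction: scale the raw digital numbers by a reading taken over a panel of known reflectance imaged under the same illumination. A minimal sketch (the panel value and workflow here are assumptions for illustration, not necessarily the speaker's method):

```python
def calibrate_reflectance(dn, panel_dn, panel_reflectance=0.5):
    """Convert a raw digital number to reflectance using a reference panel
    of known reflectance imaged under the same illumination. Because the
    sun's intensity cancels in the ratio, values from different flights
    become comparable, as long as illumination was stable within each."""
    return dn / panel_dn * panel_reflectance
```

This is also why cloudy days are so damaging: if illumination changes between imaging the panel and imaging the scene, the ratio no longer cancels the sun's intensity and the corrected values are unreliable.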
So one multispectral camera's red is not the same as another's. So again, you know, if you have different people that fly different sites with different cameras, you might be getting completely different results. So again, some other things we've sort of learned along the way. So I'm going to wrap it up there. I think there's a couple of questions, which is really good. And I'll throw it over to Sam. Thanks, Nick. That was a fascinating talk. And we've actually got heaps of questions, ranging from the quite generic to the very, very specific. So we're going to try and work our way through all of them. And if people can continue to post, that would be great. So Maureen Malley's got a couple of questions. Firstly, how mature are the privacy laws around drone use in non-scientific settings? And secondly, where do satellites and drones intersect, or don't they? Privacy law is a challenging one. And so I guess there's the moral obligations and then there's the legal ones. Within Australia, I've had it explained to me that there is actually not a lot of protection available for people. But obviously, good practice and, I think, the moral usage of drones really requires people to use them in sensible and respectful ways. The legal cans and can'ts I'm not particularly familiar with. So obviously we encourage everyone to use them appropriately, and we've got processes through the university to abide by. But in some respects, there are few privacy sort of restrictions within Australian law itself. The question on the sort of drones and satellites really falls into the question of most conventional usage of drones. We're limited to flying to about 120 metres. So we tend to use drones for much more intensive, small areas. Some of the more commercial usage of drones can fly them a bit like aircraft over much larger areas. But really it's a case of using different sorts of satellites. Satellites in themselves have a whole pile of different types of data as well.
Some of it is really quite accurate, so several metres of pixel size; some of it is hundreds of metres, and with different spectral information. So it really comes down to, I think, using an approach where you start with your question and then think about where the technology can best fit. That's great. Thanks Nick. So Dave Gozard's got a few questions. So Dave says that his group would like to hover a drone with a laser reflector at one kilometre altitude at a UWA field station such as Shenton Park or Gingin. Three questions, very specific. Any idea how much it would cost to get a CASA exemption for a flight at one kilometre? Would it be worth someone in Dave's group getting ReOC and RePL certified? And what accuracy do you think a drone can hold its lateral position over the ground when flying at one kilometre altitude? Right. We haven't done it, because we've tended to avoid it. Someone has suggested to me they paid somewhere in the order of about a thousand to fifteen hundred dollars, twelve hundred-ish, to get an exemption. We can certainly apply for them through our remote operator's certificate. And you know, please email me and let's chat to sort that out. Would it be worth someone getting their licence? Yes, if they want to fly it themselves, they'll certainly need that. They'll need to have the RePL, that is, the remote pilot's licence. What you won't be able to do is get your own ReOC, because CASA limits them to one per organisation. So at present, because UWA is one ABN, and that's the way that CASA works, we have one remote operator's certificate for the whole of UWA. It's the same for organisations like CSIRO and Geoscience Australia and government agencies. How accurate? Depends on what you want to do. If you have the drones fitted with real-time differential systems, they will hold their position with an accuracy of probably about two to three centimetres. And yeah, it just depends. So yeah, come and chat to me.
Great, much appreciated. Thanks Nick. A request for recommendations here. So Charles Sun says: we use drones to do traffic surveys but struggle to find a long endurance quadcopter. Do you have any recommendations for that instance? Yeah, that's one of the challenging things with drones: it's just the energy density of batteries. And I think this is an area where things will maybe move in the future. So I know, for example, not that I'm on the payroll of DJI, sadly, but I know they've just released a new drone which is now in that sort of 50 to 55 minute flight time range. That's the DJI M300, which is a replacement for the M200 and 210 series. But most of the drones at the moment are in that half an hour flight time, although that's total flight time, and probably a little bit shorter than that to be safe. But I think, you know, as different alternative energy sources come up, you might see that free up. And a really interesting question here from Linda Jeffrey. So, have you had any drones taken up by eagles or other birds of prey? No, but we've certainly had a lot of interest from Wedge-tailed Eagles. And that's one of the things that we definitely include in our briefings. My cousin, who works as a surveyor for a mining company, has lost about three of their fixed wing planes. And I think he was saying that their insurer isn't very interested in insuring them anymore. So it's certainly something that we've seen; you know, the birds of prey really seem to like the drones. They seem to like the aeroplanes a lot more than the multirotors. And they seem to go for the more Phantom and Mavic sized smaller ones than our big Matrice. But yeah, you notice a distinct change in bird behaviour when you put a drone in the sky. Thanks, Nick. So, loads of comments from people just saying what an interesting and informative talk it's been. And Monica Daniel-Veltz asks here: do you find much difference in the quality of the processed image using Agisoft or Pix4D?
And also, how do you think something like OpenDroneMap, which is an open source software package, compares to other commercial applications? Yeah, we've done a bit of experimentation with this, and it's been more qualitative than actually trying to quantitatively test everything and write it up. We certainly have gone down the path of using Agisoft PhotoScan, which is now called Agisoft Metashape. We find that that is the best balance in terms of being able to control a lot of the parameters. Pix4D is good if you want to go and rapidly process data and you want some really nicely put together, aesthetic maps. And if you're more into qualitative mapping of sites, then Pix4D and Agisoft, just running data through, seem to work in pretty similar ways. They seem to outperform the open source stuff. The open source stuff, we've just found that for the cost, and I'm lucky that we can buy academic licences, which might be an option that's not available to some of the people listening in, but we pay $850 for an academic licence of Agisoft, so for us the time investment in open source software just doesn't make sense. So that's where we've gone with what we do. And Carol Kerr's got a question about wind factor. Have you got any suggestions as to the impact and the best parameters to work within for that? Yeah, I mean, working in winds that are lighter than the maximum flight speed of the drone matters; I've found that one out the hard way. And the emergency switch over to sport mode can be quite advantageous to recover a drone that's slowly moving away from you. Yeah, we've got a little cheat sheet that we use when we're about to set up the drones, and we use, you know, weather apps. So typically we say 30 kilometres an hour is really the sort of limit that we want to work within; anything above that tends to shake the camera around a lot anyway, and you'll end up with a lot of motion blur in your images.
So certainly, you know, using a bit of flight planning software is a really good idea. And our sort of general rule of thumb is, anything over about 30 Ks an hour, we'll tend to not bother going out flying. So we've got a couple of very related questions here from Ysmyana Rahayu and Yusuf Arif Afandi, who ask: is there any experience in using drones for benthic habitat mapping, for example seagrass bed mapping, and also could it be used for coral reef mapping? So, can drones be used for seagrass bed mapping and coral reef mapping? Potentially, yeah. So, the nice little photo I had of Sharon when I was talking about the degree: Sharon's done some stuff, and Carl Boyer, who also holds a remote licence and works in the Indian Ocean Marine Research Group here at UWA, did some stuff out at Scott Reef, operating off boats, doing some coral reef mapping of more exposed stuff, but also some underwater stuff. There's a whole pile of people that we work with in the Australian Institute of Marine Science group, who are again based at this Indian Ocean Marine Research Group at UWA. So they use a whole pile of stuff with underwater autonomous vehicles; in a way, a drone is just a very convenient way of moving a sensor, so in similar ways, when you're moving into the underwater environment, using automated underwater robots makes a lot of sense, but you're moving similar sorts of sensors. There's also another group in the School of Biological Sciences: Renee Hovey does some super work there with some of her collaborators. Again, full three dimensional structural mapping of benthic habitats, coral reefs, kelp beds. Yeah, some amazing science questions that you can answer with some of these technologies. So yeah, again, if you've got those questions, I'm happy to try and answer them or to pass on some contacts of people I know working in these areas. Thanks Nick. So, two final questions, then we'll wrap up.
So Yusuf Arif Afandi again asks: you showed some really nice images with stress level using the thermal indicator for trees; can you still collect that data with a drone that doesn't have such a thermal feature? And then finally, from Brady Johnson, who wants to know: do you know of any citizen science projects that use drones that they could contribute to without directly being part of a particular lab? So, is there a place where people can go and do this in the wild, so to speak? Yeah, the first one. So for the thermal imaging, they're quite specific and quite expensive cameras. I think the camera that we use was around the sort of 20,000 Australian dollar mark, which starts making the drones very expensive. But you can certainly do that sort of tree stress work with more standard multispectral sorts of data. That one there was quite a specific project, and it had a whole pile of other work that we were sort of doing. But yeah, there's a whole pile of different options. There's some really good work that you can find out there on using either thermal or multispectral data for looking at tree stress, tree health sorts of questions. The last question was... basically, is there anywhere people can get involved in it? Yeah. I haven't heard of any here in WA that I know of at the moment. There's certainly a whole pile that I've seen in the eastern states doing some amazing stuff with beach monitoring, beach change monitoring, and using a whole suite of people for doing that. But yeah, I'm not aware of any citizen science work, sadly, in WA. But maybe that's a key area for us to tap into all of those amazing people out there with a great interest. Absolutely. So thanks so much again, Nick, for giving such an amazing and interesting talk, and to all the questioners and attendees for some great questions. Sorry for about a billion questions there, Nick, but you did a great job answering them all.
And look after yourselves everyone and see you in the next session. Thank you very much.