Welcome to today's webinar on the rise of drones in the Australian research space. My name is Narsimha Galapati from the Australian National Data Service and I'm your host for today. My colleague Karen Wisser is behind the scenes co-hosting the webinar with me. This one-hour webinar is the second in the series. Today we will showcase success stories of drone applications in Australian research, for example in smart farming, plant phenomics and data science. We will also provide an overview of recent developments in the research infrastructure and see how we can leverage the benefits of that infrastructure as well as open drone data. I would like to introduce our speakers for today: Dr. Siddeshwar Guru, Director of Data Science, from the University of Queensland; Kim Bryson, Associate Dean of the Science Faculty at the University of Queensland; and Dr. Tim Brown, Director of the Australian Plant Phenomics Facility at the Australian National University. Now I would like to hand over to Dr. Guru for the first presentation.

Thank you, Narsimha, for the introduction and for the opportunity to talk about this topic. As Narsimha said, I am Siddeshwar Guru and I work for TERN. Today I am talking about making drone data open for scientific research. This is work in progress: what we are exploring, and the thought process behind it. Just a quick introduction: what is a drone? It is simply an aerial vehicle, controlled either by an on-ground remote control or by an on-board computer. Drones go by different names, commonly UAVs or RPAS. I like RPAS partly because it is a gender-neutral name, unlike "unmanned". Predominantly, the rotary-wing vehicles are what people call drones, but the big fixed-wing aerial vehicles are also in this category.
Some are pretty big, as big as an aircraft; they fly over the landscape and gather data. The rotary-wing ones are smaller: they operate in a controlled way at very fine-scale resolution, run predominantly on battery, and collect the data. One of the main things about drones is that they have become quite cheap to run, which is why they have become so popular across various applications. The technology started in surveillance, and currently there is a fair bit of application in agriculture and widely in the environmental space as well, which I would call scientific research. Drones are now also used in disaster recovery, especially where it is very difficult to send a human being. In our case, we basically use drones for their remote sensing capability: to derive fractional cover, to measure vegetation species composition, to map vegetation characteristics, to do pastoral stocktakes, and even to do surveys, flora and fauna surveys, kangaroo surveys, et cetera. A lot of data products are derived from the measurements the drone makes. A fair bit of the work is on vegetation, partly because with satellite remote sensing of vegetation composition, something like two-thirds of the time there is cloud cover over the earth. Drones are a really good technology here: they fly underneath the cloud, take good fine-resolution pictures, and then researchers can do the analysis. In this context, the drone is basically a platform, and the sensors are attached to the drone.
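To make the fractional cover product mentioned above concrete, here is a minimal sketch of estimating green fractional cover from co-registered red and near-infrared bands by thresholding NDVI. This is an illustrative example, not the speaker's actual pipeline; the threshold of 0.3 and the pixel values are assumptions.

```python
import numpy as np

def fractional_cover(red, nir, ndvi_threshold=0.3):
    """Estimate green fractional cover from co-registered red/NIR bands.

    Pixels whose NDVI exceeds the threshold are counted as vegetated;
    the returned value is the vegetated fraction of the scene (0..1).
    """
    red = red.astype(float)
    nir = nir.astype(float)
    ndvi = (nir - red) / np.maximum(nir + red, 1e-9)  # avoid divide-by-zero
    return float(np.mean(ndvi > ndvi_threshold))

# Example: a tiny 2x2 scene where half the pixels are vegetated
red = np.array([[0.1, 0.4], [0.1, 0.4]])
nir = np.array([[0.8, 0.5], [0.8, 0.5]])
cover = fractional_cover(red, nir)  # 0.5: two of four pixels above threshold
```

In a real workflow the bands would come from an orthomosaic produced from the drone imagery, and the threshold would be calibrated against field observations.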
In our case, the commonly used sensors are multispectral, hyperspectral, LiDAR, video, and thermal infrared. From a data management perspective this is a fair bit of a challenge, partly because the platform is dynamic: it keeps moving at both spatial and temporal scales. And because of its ability to capture finer-scale information, a massive amount of data is collected, so we need the capability to ingest that data, process it, and make it available to users. Another aspect is that there are quite a few commercial companies working in the drone industry, and even on the research side there is often a partnership with a commercial entity that flies the drone, collects the data, and hands it to the researchers; there are consultants, especially in the mining area, et cetera. You should also be aware that you need a permit from CASA to operate, and the operator needs a license as well. With all this, identifying a data owner is important; I will come back to why later. To look at how to make this data openly accessible, take the FAIR principles. There are four aspects, and we can see how drone data fits into each. The first: data should be adequately described, searchable, and should have an identifier. To give a bit of perspective, drone data is basically the flight plans, the files describing the flight paths, the associated field data (especially in our case), the raw data files from the measurements the drone takes, the logs from the flight, and, once processing is done, all the derived products as well.
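One way to keep those interrelated files together is a simple machine-readable manifest packaged with the dataset. The sketch below is purely illustrative; the field names and file names are invented for the example and are not a TERN or ISO format.

```python
import json

# Hypothetical manifest describing one drone flight's data package.
manifest = {
    "dataset_id": "example-flight-001",   # assumed identifier scheme
    "licence": "CC-BY-4.0",
    "flight": {
        "plan_file": "flight_plan.kml",
        "log_file": "flight_log.csv",
    },
    "raw": ["raw/multispectral_0001.tif"],
    "field_data": ["field/quadrat_survey.csv"],
    "derived": ["derived/orthomosaic.tif", "derived/ndvi.tif"],
    "qaqc": ["qaqc/processing_report.pdf"],
}

# Serialised next to the data, the package becomes self-describing,
# which helps with the provenance and discoverability aspects above.
manifest_json = json.dumps(manifest, indent=2)
restored = json.loads(manifest_json)
```

A catalogue ingest step could then read this manifest to register every component file, rather than relying on directory conventions.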
We should also provide the auxiliary files, such as QA/QC files from the processing. These are the different files that form part of a data publication, and they are all interrelated, so all of them should be made available, especially for the provenance aspects, and all should be made discoverable and accessible from a user perspective. To make the data searchable and findable, there needs to be a metadata standard to describe it. We use ISO standards, though we are not sure whether, for example, ISO 19115 completely describes everything; we may have to provide a customized profile of that ISO standard. Once the data is made available as part of a catalogue it is discoverable, and once it is in a catalogue it has an identifier, so that looks fine. The next principle is data retrieval using open protocols. The instruments and sensors used on drones by default provide raw data in different file formats, and at the publication level open data policy says all file formats should be open. Depending on the instrument, the format may be a GIF, GeoTIFF, KML, Shapefile, or LAS file (especially for point clouds). With such a variety of file formats, tools must also be available to translate, manipulate, and analyze the data. Sometimes it may even be worthwhile to provide a program, an R or Python program, through which users can access the data and run their analyses.
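For illustration, a skeleton of an ISO 19115 record serialised as ISO 19139 (gmd) XML might look like the following. This is a minimal hand-written fragment to show the shape of such a record; a real record requires many more mandatory elements (contact, date stamp, extent, and so on), and the identifier and text values here are invented.

```xml
<gmd:MD_Metadata xmlns:gmd="http://www.isotc211.org/2005/gmd"
                 xmlns:gco="http://www.isotc211.org/2005/gco">
  <gmd:fileIdentifier>
    <gco:CharacterString>example-flight-001</gco:CharacterString>
  </gmd:fileIdentifier>
  <gmd:identificationInfo>
    <gmd:MD_DataIdentification>
      <gmd:citation>
        <gmd:CI_Citation>
          <gmd:title>
            <gco:CharacterString>UAV multispectral survey (example)</gco:CharacterString>
          </gmd:title>
        </gmd:CI_Citation>
      </gmd:citation>
      <gmd:abstract>
        <gco:CharacterString>Fractional cover and vegetation structure
        derived from drone multispectral imagery (illustrative).</gco:CharacterString>
      </gmd:abstract>
    </gmd:MD_DataIdentification>
  </gmd:identificationInfo>
</gmd:MD_Metadata>
```

A customized profile, as discussed above, would add drone-specific elements (flight parameters, sensor details) on top of this base structure.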
If it's too confusing, for example if somebody has no clue how to access a LAS file, it is probably worthwhile to provide a program, embedded in R for instance, so that they can start from working code. The next principles are that data should use vocabularies and qualified references to other metadata. We use a fair bit of domain-specific vocabulary; GCMD is quite popular, especially for keyword search in our case. Because drone and UAV technology has arrived so fast, some new terminology may need to be invented and incorporated into the vocabularies so that the data is accurately represented. And, as I was explaining with the different data types, each of the files should be referenced and linked, hopefully with a persistent identifier, and all of those links should be queryable as well. The final principle is metadata in domain-relevant community standards with a clear data usage license. We use an ISO standard in the domain, 19115 or 19139, which may fit well, though as I said before we may have to create a profile to accurately describe the data. For human-actionable discovery, query, and access, where a human goes and clicks a couple of things, the standard is mostly fine. But for machine-actionable discovery and exploration, machine-to-machine query, there may be issues, partly because of the many interrelationships between the files, and depending on what kind of file it is; if we are looking at the source files, that will definitely be an issue.
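As a concrete example of the kind of access script described above, here is a minimal sketch that reads the public header of a LAS point-cloud file using only the Python standard library; in practice a researcher would more likely use a library such as laspy or PDAL. The byte offsets follow the LAS 1.2 public header layout.

```python
import struct

def las_summary(path):
    """Return basic info from the public header of a LAS 1.x file."""
    with open(path, "rb") as f:
        header = f.read(227)  # LAS 1.2 public header block is 227 bytes
    if header[0:4] != b"LASF":  # file signature
        raise ValueError("not a LAS file")
    return {
        "version": (header[24], header[25]),   # major, minor
        "point_format": header[104],           # point data format ID
        "point_count": struct.unpack_from("<I", header, 107)[0],
    }
```

Shipping even a small helper like this alongside the data lets users sanity-check a download and then move on to the actual point-record analysis without deciphering the binary layout themselves.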
Then there is data usage: the principle says you should provide an open license. Even though we may say we provide all of our data under Creative Commons Attribution, identifying the owner is key, because even under CC BY the copyright subsists in the data and you need to identify who owns it. For example, if you are a researcher using a consultant to collect the data, then technically the person who collects the data is its owner, unless you have a contractual arrangement that makes sure ownership is transferred to the researcher. Why is this important? For attribution: if somebody uses your data, they need to know who to attribute, and as the researcher you want the attribution to go to you, not to the person who merely collected the data. And if somebody else is the owner, they can do whatever they want with the data; the IP sits with the owner, so as principal investigator you want to own the IP in the data. There are still a fair few challenges. One thing we face is just the sheer amount of data collected: we still struggle to ingest it all into one place, process everything, and deliver it. We think a managed cloud platform to do all of this would be useful. There are a few community initiatives looking at exactly these problems: one is an OGC domain working group, the other an RDA interest group. With all that said, we don't want to completely change the TERN data management practice, so we retrofit everything to the TERN data management practices.
Whatever the data, we intend to create metadata in the ISO standards, catalogue it in GeoNetwork, and have that harvested into different repositories and discovery portals so that the data is discoverable, and all the data will have a clear attribution statement under Creative Commons Attribution. Currently we store the data on an FTP server and make the link available. We are still in the process of processing a lot of data, but having said that, all the raw files are there, so if somebody is really interested, they can process the data themselves. We are also still working on an effective, robust data delivery mechanism, so that a user can simply come, access the data, and get it. One work in progress is a portal where users can get the final products. We have run a field data campaign across a number of sites across Australia, and we are working on a portal where all of those site data are accessible as well, especially the LiDAR data. So drones are quite popular in the research community, especially in environmental science; there has been massive uptake of the technology, partly because of the ease of use and the fine resolution of the data it provides. Moving ahead, it is probably worthwhile to build an IoT platform to manage research drone data, not at the institutional level but at the overall research level.
That would enable the interconnection of devices, based on the type of sensor used and the application you are running, and would enable a common data platform so that individuals don't keep grappling with the same problems. If you look at the commercial world, this is already happening: there are a lot of commercial players working in this space, some of it available off the shelf, and a lot of open-source technology has started appearing, especially on the management side. What we want is this: a researcher has a platform, they attach the sensors, they collect the data, and we need a platform where that data is ingested, the processing happens, and the derived products are available for third-party researchers to analyze. At the individual level people are doing this already, but the aim is to make the whole thing a pipeline and provide it as a service to the entire community. It's still a work in progress. Thanks for the opportunity to talk about this topic.

Thank you, Guru; an excellent presentation, very informative. Now I would like to hand over to Kim for the second presentation.

Hi, everyone. My name is Kim Bryson. Thank you very much for inviting me to participate. I'm going to be talking about drones in research in relation to agriculture, and in fact in education as well. I think my talk segues quite nicely from some of the things Guru has been talking about, because the buzz in today's agriculture is around these disruptive technologies: Internet of Things, big data, drone technology, smart agriculture.
They're all buzzwords, commonly tossed around, and there have been a number of big conferences in the agricultural and farm management space about how such technologies can be of value to agriculture and to the whole food science issue of producing good-quality food in a sustainable way. What we've done at UQ is look at how we can incorporate things like an Internet of Things multi-sensor mesh network to collect real-time biophysical data, which we store in the university's cloud. We've done that around the whole thousand-odd hectares of the University of Queensland's Gatton regional campus, which is a multi-enterprise agricultural centre. We've also incorporated a second multi-sensor mesh network around 10 kilometres away from the campus. We've set up what we call a living laboratory there, around how we can use this real-time biophysical data with drone technology to look at the biomass of pasture in particular, though that can obviously extend to various crops and vegetation as well. The network we set up is very flexible. We work with technology from Libelium, and we can use different types of communication protocols: not just Wi-Fi, but other radio interfaces that allow us to transmit data in real time over long distances and through buildings, trees, et cetera. It is a network that is basically self-healing, and we have a number of these nodes set up across the campus: smart agriculture, smart water, smart environment, and smart security. The last one, smart security, is essentially the ability through the network to turn things on and off, or open and shut gates, which of course is really important from an agricultural perspective. The agriculture one... in fact, I think my next slide shows the types of data you can gather from these sensors once installed.
We have all of these nodes, and most of these sensors, around the place, so we're collecting a lot of raw biophysical data in real time, which we store in the cloud and access through a dashboard. The idea is that this is done through open source so that, at the end of the day, people can have access to it; TERN may be interested in it, and other people may be interested in using it, remembering it's really subtropical information for people interested in the agricultural space. At the moment we're making it available through eduroam, to people on eduroam, rather than anything wider: we tried it on an open network but found it created a risk of the university being hacked through our system, so we closed it down from worldwide access to eduroam access. So, lots of biophysical data coming in. Here's a diagram showing data coming from these various nodes into our cloud, and then we're using that data within the educational environment to develop various data uses: research projects, farm operations (because we'd like this to be developed much further into the agricultural farm space), and of course our education component. Here are the types of data you can get; you can view all of this through our data dashboard, or scan a QR code to get through to it. So that's the start. But when we have this large amount of biophysical data, what else do we need, and what for? We want the data for things like pasture monitoring and management, animal monitoring and management, crop monitoring and management, education, and more: we want to get more data. Drones, or satellite and other remote sensing data, provide the capability of layering aerial data on top of our on-ground data collection system.
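As an illustration of the kind of real-time feed such a mesh network produces, here is a minimal sketch of ingesting node readings into a per-sensor time series. The node names, field names, and units are invented for the example and are not the actual Libelium/UQ schema.

```python
import json
from collections import defaultdict

# In-memory store: {(node, sensor): [(timestamp, value), ...]}
store = defaultdict(list)

def ingest(payload_json):
    """Parse one node's JSON payload and append its readings to the store."""
    payload = json.loads(payload_json)
    node = payload["node"]
    ts = payload["timestamp"]
    for sensor, value in payload["readings"].items():
        store[(node, sensor)].append((ts, value))

# Example payloads from a hypothetical paddock node
ingest(json.dumps({"node": "paddock-7", "timestamp": "2017-05-01T09:00:00",
                   "readings": {"soil_moisture_pct": 23.5, "air_temp_c": 18.2}}))
ingest(json.dumps({"node": "paddock-7", "timestamp": "2017-05-01T09:15:00",
                   "readings": {"soil_moisture_pct": 23.1, "air_temp_c": 19.0}}))
```

A dashboard like the one described above would then simply query this store (in practice a cloud time-series database) per node and sensor.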
And we're going back to the future in this, because remote sensing has been around in agriculture since the early 1970s. The problems for ag in relation to remote sensing were mainly twofold. One was the data: the cost of acquisition and processing, and the revisit frequency, the things Guru has talked about. The other was a lack of skills in the agricultural sector for this type of data to be used efficiently and in a cost-effective manner. People in agriculture can be very easily turned off technology: what is the point of technology if it doesn't deliver me a cost benefit in terms of what I'm doing? If it costs me a lot, it goes wrong, and I can't use it when I want to, then it's not worth it. So we needed to do something about this lack of skills, and what we've done is look at developing the spatial skills needed to understand spatial variability in agricultural remote sensing. We've set up an agricultural remote sensing lab here at the campus, and there we're trying to integrate students from different disciplines across the academic world. Engineering students come down and work on agricultural projects: building drones and understanding what drones can be used for, and building sensors and understanding what those sensors can be used for, in agricultural monitoring in particular, though that does of course involve environmental monitoring; our wastewater management site is probably one of our most recorded biophysical data sites. We get them to do hands-on work so they understand the issues and the risks. We want to use drones because they're a cheap platform to carry high-resolution sensors. You can see here the spatial variability in a paddock at ground level: you can see a bit, but when you look over here at an aerial photo of that paddock, the variability is huge, and that means dollars to a farmer.
So we want to collect that data so we can optimize production efficiency and quality, and also minimize risk and environmental impact. Drones, from the perspective of an agricultural person, could enable us to do better in the smart food production game; they almost certainly enable us to do better in the smart environmental management game; and for us as educators, in smart skills development, because there are lots of skills involved and it's fun to do. One of the things we have in the Australian sector, actually worldwide, is difficulty getting students into the agricultural industry sector, and I mean across the board, not just grains or horticulture. At UQ we've got five DJI Phantoms that we use for teaching, four bespoke quadcopters, three bespoke hexacopters, and 10 mini agricultural drones. The 10 mini ag drones are the ones we use nowadays, and I'll talk a little about them in the next few minutes. They're easy to fly, and there is an article out from 2015 about why we're doing this and what we're doing. The design-and-build principles are around this business of getting students of all disciplines to understand what they're doing. We go through a design process, trial and error, using open software to design these things. We purchase the parts; the students build them, test-fly them or learn to fly, learn to solder and the like, undertake the project, and write a report. So this is a classic problem-based or active learning scheme that we get students to do. Here are the types of drones that we've investigated and used over time.
You can see there's certainly a date range here. This small UQ mini ag drone we designed and built because we wanted to fly under CASA's two-kilo limit; if we had to get every student certified, we couldn't do this. This little mini ag drone, which I'll talk about in a bit more detail down the track, is the one we're now flying, and with a camera that captures a normalized difference vegetation index (NDVI) image it is less than one kilo. The first student project was in 2013, where a student literally took a DJI Phantom with a little normal red-green-blue visual camera back to Fiji. This was his school in 2003, from a Google map, and this is his school in 2013, the image you can see outlined here; he literally calculated the change in mango clearing. So that was the first project we looked at. Then we started getting a little bit more creative. We developed a drone that could carry a multispectral camera, and we started looking at prickly acacia, a serious environmental weed up in central Queensland and a big problem from an agricultural perspective because it shuts cattle out of the pasture underneath. We compared some of that data to satellite data and found, of course, that these drones, with their better on-ground resolution, gave us more data. This beneficial-bug drone, a hexacopter weighing around 2 kg, and more with this white thing underneath, was designed by an agricultural science student and has now been invited to be put on display at the Science and Technology Museum in London for the next five years, in their display on innovations in agriculture over time. So it's actually an old innovation now, from about 2013-2014, but it is an instance of when people started looking at using drones.
So this is a box full of beneficial bugs. The farm group we worked with wanted to see if they could drop beneficial bugs, these bits coming out, into the middle of a paddock, instead of, as they customarily do, driving around the outside. This was a very successful project with industry involvement that started really introducing this sort of technology from the classroom into the actual agricultural space. So that was great. The net drone was another one, where we were asked to fly under hail netting for one of the largest seedling producers in Australia. Here is the netting; you can see it's a lovely picture if you appreciate this sort of colour variation, and the drone is flying here; it was called the net drone project. We were looking to see which of these seedlings, among 250,000 small tube plants, were germinating or not. Here you can see the sort of resolution you can pick up from a standard multispectral camera: you can see the two-cotyledon stage here, and you can introduce students to the idea that they are looking at three different bands (so a little bit of physics) and at spatial variation, because you can see, where my cursor is, where things haven't grown, and up here, at the two-cotyledon stage, you get that interest for a student: they can actually see where a plant has germinated and where it has not. Then you get them to calculate what the cost is to both the supplier and the buyer when they get a tray of seedlings and something hasn't germinated; that links it to the economics of the producer. Here is our mini ag drone. We've been flying it now for a couple of years; we used it at the beginning of this year for 35 students to do some projects. It's based on Raspberry Pi technology.
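The cost exercise the students do can be sketched very simply. The germination rate, batch size interpretation, and price per seedling below are invented for illustration; they are not figures from the project.

```python
def ungerminated_cost(total_plants, germination_rate, price_per_seedling):
    """Cost of the seedlings that failed to germinate in a batch."""
    failed = total_plants * (1 - germination_rate)
    return failed * price_per_seedling

# Hypothetical numbers: 250,000 tube plants, 92% germination, $0.10 each.
# 8% of 250,000 = 20,000 failures, so the loss is $2,000.
loss = ungerminated_cost(250_000, 0.92, 0.10)
```

Counting the ungerminated tubes from the drone imagery gives the germination rate directly, which is what turns the aerial survey into a dollar figure for supplier and buyer.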
We've upgraded it because we found the GPS wasn't crash-hot, but with this sort of thing I've got 10 drones that might cost me 100 bucks each to repair, which means students can crash them, and it's good for a student to be able to do that. Over here you can see them flying on campus. The sort of data you can get out of this quite basic NoIR or NDVI camera: you can pick up this information quite nicely at an operational scale. So I guess I'm not trying to get these students to be as knowledgeable as we might expect from a research perspective, but an undergraduate level-three agricultural science student goes out with this knowledge and will then be able to develop their skills in the real world after that. We are developing various sensors that can fit onto this small drone. The most recent is a multispectral sensor, again multiplexed Raspberry Pi cameras, that enables us to choose very high-grade research filters of specific wavelengths when we want to look at something. This particular camera is at the moment being flown on top of one of our IoT multi-sensor mesh networks to collect aerial data on the biomass of that paddock, some 50 hectares I believe, that has the multi-sensor mesh network on it. So this camera is now working. This was when it was being developed: you can see how it's been set up with four little Raspberry Pi cameras, and we use Python to make sure everything goes well.
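The vegetation index those camera bands feed is straightforward to compute. Below is a minimal per-pixel NDVI sketch over near-infrared and red arrays; the pixel values are synthetic, not imagery from the UQ camera.

```python
import numpy as np

def ndvi_image(nir, red):
    """Per-pixel NDVI = (NIR - red) / (NIR + red), clipped to [-1, 1]."""
    nir = nir.astype(float)
    red = red.astype(float)
    index = (nir - red) / np.maximum(nir + red, 1e-9)  # avoid divide-by-zero
    return np.clip(index, -1.0, 1.0)

# Synthetic 8-bit pixels: healthy vegetation reflects strongly in NIR,
# so the first pixel scores high and the second (bare ground) near zero.
nir = np.array([[200, 90]], dtype=np.uint8)
red = np.array([[40, 80]], dtype=np.uint8)
index = ndvi_image(nir, red)
```

With a NoIR camera and a red/blue filter, the "NIR" and "red" channels come from different colour planes of the same sensor; the arithmetic is the same.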
This is a drone that is no longer flying, but it was an attempt at delivering a drone that could pick up the RFID tags in a cow's ear from a distance. It works at about 20 metres, but we're talking a big drone now, with a big antenna, and whilst it worked from an operational perspective, from the point of view of keeping things below CASA's regulations it became essentially unusable. Although I have an engineering student who wishes to try to reduce it in size, it's not something we're moving forward with at the moment. But it is an example of the ideas that come when you start talking to industry players and asking them what they want. They wanted something like this because normally you have to stand beside the cow, or the cow has to go through cattle gates, and literally swipe the tag with a hand reader; with a thousand head of cattle, that's a really difficult, really time-consuming thing to do. So this was about trying to improve the efficiency of looking at these cows in the paddock, and the sort of data we could get from it was really quite good. The sorts of research and learning involved: electronics and avionics, which is not normal in an agricultural sense but really useful if you're going to get into this side of spatial variability analysis; the design-and-build side of things; programming; and physics, maths, and chemistry, of course, because we're looking at the electromagnetic spectrum, things like spectral indices, crop growth indices, and chemicals in plants. If you're trying to engage a student in learning about maths, which they really don't want to do, you bring in something like this and it changes their perception of both learning the maths and how it can be used in real life, when from a classroom perspective it would appear completely dry and equation-driven. Plant physiology, animal welfare, and food traceability are all
issues around management that we can also talk about when we use these tools. Most recently we've had these sorts of projects going on: drone and sensor development for crop monitoring and diseases; computer vision development for monitoring pests, basically, which we could be doing in the piggery or in the environment for, say, wildlife people; and a Master of Engineering student looking at developing a robotic arm for capturing these small drones out of the sky for autonomous recharging. Again, that's an academic project, probably not to be used operationally because of the issue of autonomous flight, but very useful for the engineering student to understand why we want it from an agricultural perspective: we obviously want to increase the reach of our drone flying. We've also had people developing automated underwater cameras to monitor phytoplankton, which we can then link to our IoT and our aerial imagery of the lakes. Obviously the IoT has applicability elsewhere that may not involve drones, like in the equine health area, and in the development of this visualization dashboard for our monitoring purposes and for staff, and as I say anyone on eduroam, to access some of the data we're collecting. And as I say, the RFID monitoring of cattle is something we're still sort of looking at; the idea is to reduce it in size as the electronics becomes smaller, so we don't have to have such a big antenna, which means a bigger drone, more weight, et cetera,
and can still keep within the regulations. So we're looking at a whole range of things from both an education and a research perspective, but mainly it's about interesting students in learning about this stuff, which I think is going to be key down the track in terms of capability development for our industry, both agriculture and environment, or managed landscapes, because without these kids coming in, we're going to run out of people as the current ones get older and can no longer continue. So that's my presentation, and I hope you enjoyed it. Thank you very much.

Thank you, Kim. I will now move on to the next presenter, Tim Brown.

Thank you for having me on; that was a very interesting talk, Kim. I'm going to be talking about our work at the National Arboretum in Canberra, using drones to make 3D models of the Arboretum as it grows. I'm the director of the Australian Plant Phenomics Facility at ANU, and a lot of this work was done when I was working for the ARC Centre Plant Energy Biology as a research fellow. I should first say thanks to all the people listed there who have been essential to this project. I wasn't quite sure what level of understanding people would have on this, but as Kim pointed out, drones are part of a key emerging tool set for next-gen monitoring of the environment. These tools, sensors, high-resolution imaging, LiDAR, and a whole bunch of other things, let us measure the environment in ways we never could before. Since I've been working in the phenomics field, we're really interested in how you monitor the environment along with genetics so you can measure the phenotype, because those are the things that together define how ecosystems develop, and in a crop context the phenotype is essentially what we want for increasing food yields. So, a bit of background on UAVs; I think a bit of this has been covered, so I'll go through it quite quickly. You basically have a
lot of choices if you're starting a drone program, from low-cost multirotors, quads and X-rotors like the DJI Phantom, on up. These things are changing so fast: I used to give this talk and say that drones were expensive and confusing, then I said they were cheap but still a bit confusing, and now they're just cheap and easy to use. The DJI Mavic Pro came out this year; it's a very tiny drone, but it still has a 12-megapixel camera on it, you can throw it in a backpack, and you can still do pretty decent monitoring with it. At the low end, these sorts of drones typically cost in the one to $2,000 range, although the bigger ones, like this one here, cost more, in the $8,000 range, and you get something like 15 to 20 minutes of flight time and up to 10 kilograms of payload for the really big ones. You can also use fixed-wing drones; these give you a longer flight time, something in the 30 to 60 minute range, but you do have takeoff and landing issues, and while they can carry higher payloads, they do tend to fly faster, which impacts how you can monitor with them. A new system being developed by people we work with at Pro UAV is a vertical takeoff and landing system, and we really think this is going to be an amazing tool, because it gives you the benefits of both a fixed wing and a copter: with the helicopter part it can take off straight up, so it doesn't need a landing zone or runway, and if something goes wrong and the gas motors fail, it will catch itself and land, so you can carry quite a heavy payload and put really expensive equipment on it, because it's quite robust. On the camera side, as Kim pointed out, you have a lot of options, ranging from RGB cameras (the onboard cameras on the Phantom work quite well) all the way up to DSLRs, but you do need to watch which cameras you're using and isolate them from vibration, or you get rolling-shutter issues, where the camera is trying to take a photo but the sensor is not
writing all the data at once, so when there's vibration you end up with bad data. The next step up, unless you're building your own like Kim pointed out, is the multispectral cameras that can get you NDVI; the main ones people use now are either the Sequoia or the MicaSense RedEdge, and those are in the $3,000 to $5,000 range. There are a lot of other sensors too: you can put hyperspectral or thermal cameras on your drones, and you can get bands in the 400 to 1,300 nanometre range, but you need a big quadcopter because they're heavy, they're very expensive, and the data is quite hard to process. So there really is a range of options available, from really easy to use up to quite challenging, depending on what output you need. The typical outputs you get from 3D reconstruction software, which is a lot of what I'll be talking about, are ortho-mosaic images, essentially satellite-style layers you can put into Google Maps or Mapbox, plus DEMs and GeoTIFFs; some of the software can give you somewhat classified outputs, or remove the ground so you only get trees; it can provide RGB, multispectral or hyperspectral indices; and it can give you 3D point clouds, so you can make a 3D model of your environment as well, if that's what you're interested in. There are a lot of options on the 3D reconstruction software front, and I'll just go through a few. It's important to know that it requires a pretty beefy PC to run these things, so you're looking at probably 1,500 AUD for a PC, because you need a fair bit of RAM, a pretty good processor and a graphics card. PIX4Dmapper Pro, which is the one we've been using because it was about the only one available when we started, ranges from $2,000 to about $9,000 depending on what licence you get, and you have to pay about $1,000 a year for support. It also has the limitation that it only runs on Windows, and if you want to run it on a server or on the
Amazon Cloud for much bigger data processing and for automating things, you have to get the Pro licence. The other one people use a lot is Agisoft PhotoScan; I haven't used it personally, but it comes well recommended. The cheap version, if you just want 3D models, is about $60 US, an academic licence is about $550, and pricing goes up from there; those will also run on the cloud if you buy the Pro version. MosaicMill is another software package; the last quote I got from them was about 4,500 euros, and it comes in a lot of flavours, but I haven't used it. There's a free package called VisualSFM (SfM for structure from motion) which seems to work pretty well; because it's free and somewhat open-source software, it doesn't have all the bells and whistles the expensive ones do, but if you have more time than money for your project, it's probably a good thing to explore. There are also a lot of online options, and these are great for testing: if you only do occasional flights and don't want to invest in the software and the PC, or you want really fast processing, or you're preparing a grant proposal and just need some initial data, websites like these let you upload your images and give you a point cloud and other data quite quickly. A disclaimer: this is not a complete list, and all of these prices change rapidly and frequently, so check the vendors for pricing and don't take my word for it; there's plenty of other information out there. But you can see from this software that you can take a flight like the one on the left side here and get a 3D reconstruction of the forest, and I'll talk more about that in a minute. The site where we've been flying drones is the National Arboretum in Canberra, and this is a really great site, because it's just five kilometres from ANU and has fast Wi-Fi. The last site I worked at was in southern Utah, and it took about five hours to drive in each direction, so when something broke it was a good day and a half just to get there and
back. Having a field site where you have Wi-Fi access from your desk and can drive out there in five minutes really makes it easy to test new and emerging technologies until they're ready for deploying elsewhere. Also, the forest was only planted about seven years ago, so we have an opportunity to monitor this forest and build a three-dimensional model of the entire arboretum growing into the future, which has basically never been possible before. At the ANU research site we initially installed a 20-node wireless mesh sensor network and weather stations for baseline data, plus some gigapixel cameras; these are cameras that take hundreds or thousands of high-resolution images that you stitch together to get a multi-billion-pixel picture. We've done some LiDAR scans on the ground of various sorts, and then we've done near-monthly UAV flights. And once you sequence the trees out there, because tree sequencing is getting down to the $10 to $20 per tree range, you can really measure phenotype, environment and genetics in a way that gives you an amazing density of data about a space we were never able to access before. So here's Canberra on the southeast side of Australia; the arboretum is over on the west side of Canberra, south of Black Mountain, if you've been there, and this is our main field area. This is a picture of what it looks like from one of the cameras up on the hill; it's a 500-megapixel image that we generate every hour, and you can zoom all the way into that image to see the forest out on the far side there. This is what that forest looks like when you fly over it with a drone, and from the drone data you can bring that into PIX4D, as I'll show in a second, and get the three-dimensional model of the trees. For the drone monitoring program, the goal was to test and develop a time-series monitoring program, so we could get 3D models of the trees growing, time-lapse georectified image layers, a 3D point cloud, and then some phenotypes we can
measure, like tree height, area as measured by top-down pixels, and colour data over time. The forest we're studying is 12 forests of spotted gum and ironbark, planted in about 2012, and it's four hectares, so it's really a perfect size for drone monitoring: you can fly the whole site in about 15 minutes, it's not too hard to process, and it's very amenable to that sort of monitoring. We've really come a long way on this: in April 2013 I flew the first drone over it, which was a cell phone duct-taped to a drone I'd made at home, and now we're using a really nice, solid DJI Matrice Pro, with Darrell from Pro UAV flying it for us, and really good cameras, so we're getting a lot of solid data; just in the last four or five years this technology has changed so much. The typical workflow: PIX4D is the software we use; it pulls in all the images you can see here that were taken from the drone, and on the left, all those green lines going down are key points detected in all of the images; there's really a lot of black magic that happens behind the scenes in these software packages. On the right, those are some little tiny pieces of ground that the software has detected as being the same in about 30 different photos, and it uses those to calculate the actual position of each photo relative to the ground; those are called control points. From that it creates the 3D point cloud, as you can see here, and that gives you this sort of ghostly 3D model of your entire forest. You can see that some data is missing from the bottom, because the drones are just looking down from the top and can't see around the edges of the trees. The total processing time to do this is very hardware dependent, ranging from about 3 to 12 hours, but it lets you go from pictures to point clouds to three-dimensional models of the trees, as you can see here. It's important to realize how
groundbreaking this technology is, because previously, if you wanted to measure the location and height of every tree, it would have taken you an incredibly long time; there are 2,000 trees in this forest, and we can do this now in about a morning. But there are still a lot of challenges, as Guru was pointing out, with serving up the data, so we implemented a point cloud viewer called Potree on our website, traitcapture.org; you can go there and see some of the point clouds we have online. We also wrote a software package called forestutils, which runs on Python, that lets us pull out the location and height of every tree, plus the point cloud data associated with that tree. This assumes an open canopy; if you have a closed canopy, things are a bit harder unless you have GPS locations for your trees, but once we have the tree locations, we can keep tracking them after the canopy closes. It outputs tree height, top-down area, location, RGB colours, and a point count, which is a measure of how many points were generated for that tree, and it also spits out a CSV file, with all of your tree locations, that you can just stick on Google Maps or anywhere you want. So that's the program we ran; we've probably flown thirty flights.
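To make the open-canopy idea concrete, here is a rough Python sketch of that kind of extraction. This is a hypothetical toy, not the actual forestutils code: it just bins a ground-normalised point cloud into grid cells and keeps the highest point per occupied cell as a candidate tree top, where the real package does considerably more.

```python
import numpy as np

def candidate_tree_tops(points, cell=1.0, min_height=1.0):
    """Toy open-canopy tree extraction: `points` is an N x 3 array of
    (x, y, height-above-ground) values. Bin points into a square grid
    of `cell` metres and keep the highest point in each occupied cell
    as a candidate tree top, skipping low returns (grass, shrubs)."""
    tops = {}
    for x, y, z in points:
        if z < min_height:
            continue  # ignore ground clutter below the height cutoff
        key = (int(np.floor(x / cell)), int(np.floor(y / cell)))
        if z > tops.get(key, 0.0):
            tops[key] = float(z)
    return tops  # {grid cell: estimated tree height}

pts = np.array([[0.2, 0.3, 2.1],   # two returns from one small tree
                [0.4, 0.1, 3.0],
                [5.5, 5.2, 4.2],   # a second, taller tree
                [2.0, 2.0, 0.4]])  # low vegetation, filtered out
print(candidate_tree_tops(pts))   # → {(0, 0): 3.0, (5, 5): 4.2}
```

A real tree spans many grid cells, so a real tool also has to merge neighbouring maxima into single crowns; this sketch is only the core idea.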
We've been at it since mid-2015. Now I want to talk a little bit about data management, because this is really crucial: if you plan to do anything more than occasional flights, you really need to come up with a good data management plan, and you want to do that before you start your surveys. It's important to consider the entire workflow, because it's not just who captures the data or what happens to it, it's the entire process. Think of who does the surveys: if you have more than one company or more than one person, that data has to get to you somehow, it has to get processed, and you have to figure out where it goes on your computer and track it; until people like TERN have made us nice tools for having our data go seamlessly online, you have to manage all this yourself, right up to the point where you publish it. You have to figure out smart ways of naming things, and you may get a dataset, then add to it, then add to it again, then process it, and you need some sort of workflow that tracks how that is taking place; doing that across thousands of images or hundreds of flights can really be a challenge. And if people upload data, you need to make sure they've told you how much data they have, so that you actually have all of it, and you don't spend a week processing their data only to find out they hadn't finished uploading it, and you need to add another hundred images and run the entire thing again. You often run into problems like this: we store all of our data on a large data server at the Research School of Biology, but we process it on a local computer that has an SSD, so the data gets uploaded into one folder by whoever took it, gets moved to another folder, which is the storage folder, then gets copied to this computer for processing, and then has to be copied back with the new data as well. That makes it quite hard to track things over time, and you
know, if experiments fail or new data comes in, you need a workflow so that you know where things are and what their status is. So it's really best to enforce rigorous note-taking: even just having whatever tech is running the project write down what they're doing, as they do it, can be really handy, and shared Google Docs and Notepad++ files that you put within each folder can be really useful too. Here's an example of the naming structure we settled on. Basically, the idea was that when someone looks in the folder, it should be reasonably human-usable, so we have the year, the location, the site, who captured it, and the status; in this case it was the National Arboretum, then the forest plot, it was actually Pro UAV that captured that one, and we wrote that it was done and uploaded. This seems like a great idea, but of course, if you have any sort of nested folders, like you might want to name your dataset "National Arboretum", the file names rapidly get too long for Windows and your entire process fails. So this makes it a challenge: you need metadata, but you have to have someone who is actually maintaining it, and as I said, shared Google Docs are good, but this isn't really a solved problem. Also, everyone always ends up with files like this, because before we implemented a data management plan, things were just going onto my laptop and then getting copied into random places. I think a lot of people get flummoxed by data management because it's really hard; it's easy to think that you don't know what you're doing, but it's not easy for anyone, and I'm not sure that anybody really does.
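One way to keep a convention like that honest is to generate and parse the folder names in code rather than by hand. Here's an illustrative Python sketch; the field names and underscore separator are assumptions rather than the exact scheme we used, and it assumes no field contains an underscore itself.

```python
FIELDS = ("year", "location", "site", "capturer", "status")

def folder_name(**meta):
    """Build a folder name like 2016_NatArb_ForestPlot_ProUAV_uploaded,
    truncating each component so deeply nested paths are less likely
    to blow past Windows path-length limits."""
    missing = [f for f in FIELDS if f not in meta]
    if missing:
        raise ValueError(f"missing metadata fields: {missing}")
    return "_".join(str(meta[f])[:12] for f in FIELDS)

def parse_folder_name(name):
    """Recover the metadata dict from a conforming folder name."""
    parts = name.split("_")
    if len(parts) != len(FIELDS):
        raise ValueError(f"expected {len(FIELDS)} fields, got {len(parts)}")
    return dict(zip(FIELDS, parts))

name = folder_name(year=2016, location="NatArb", site="ForestPlot",
                   capturer="ProUAV", status="uploaded")
print(name)  # → 2016_NatArb_ForestPlot_ProUAV_uploaded
```

The point of the round trip is that a script, not a person, decides whether a folder conforms, which catches missing fields at capture time instead of months later.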
You won't get things right the first time. You have to start with a plan and keep working at it, and just acknowledge that it's not going to work the way you expected as soon as you start to implement it, and then you need to go back and change it, because in reality our data management typically looks something like this, and we want to move it towards the vision, but it isn't actually there. You need to address these problems, because they end up making your data unusable when you have these large-scale, huge time-series datasets. Some other challenges with processing drone data come with these new kinds of three-dimensional data. For things like NDVI, some of the metrics Kim was mentioning, there's a lot of known information, because people have been working with that sort of data for a long time; but for data like a three-dimensional point cloud of a tree, it's hard to know how you tie data values like that to biologically meaningful things. It's also challenging, getting back to what Guru was talking about, to handle the provenance of the data and track what's been processed: you can process the same project in three different versions of PIX4D and get three different outputs, and there are also about a hundred different ways you can vary the settings. At one point recently we decided to test every single setting we could think of in PIX4D, and you can see an output from that in the table here: using exactly the same images, you can get a stitching time ranging from five minutes to fifty-five minutes, and point cloud sizes ranging from forty-four million points down to eleven million points. It's easy to think, for example, that more points means more data, but if you happen to be taking pictures on a windy day and your tree is moving around, your data probably doesn't have the resolution of a leaf, it just has the resolution of the height and structure of a tree, so it may
be that fewer points is actually a better measurement of that tree volume, or somewhere in between, but it's really hard to ground-truth these things, because there isn't any way to go out and measure that volume of the tree by other means. It's also important to choose the right tool for the job. Drones are really great, and people are using them for good reason for lots of things, but they're best suited for smaller areas, maybe less than twenty hectares. A good example of this is that, because we've been flying the arboretum so much, we'd always wanted to do a full arboretum survey, so that we could get a 3D point cloud, model the entire arboretum, and measure the height of every tree in it; I think it's about thirty-five thousand trees they have out there. We finally got the funding to do this, we got Darrell to fly it, and it ended up being a huge project: it took many weeks of planning, there were four or five flight days required, and you have to have multiple staff on site, because when you're flying an area that large, you can't fly over people. It's easy to follow CASA regulations when you're just over a remote forest, but when you're trying to fly over a two-hundred-and-fifty-hectare area that's open to the public, it becomes much harder. We ended up with eighty-eight hundred RGB images and more than twelve thousand images from the Sequoia, and it ultimately took about two months of manual processing, because we had to break everything into smaller subsets, since the full cloud couldn't run on any of our machines; none of the online services can handle more than about five hundred images, so we couldn't just throw it at one of the cloud services. We ended up with a point cloud that had five hundred and eighty-four million data points. So it is important to consider what workflow you will use: this is something we could probably do once a year at most, but if you add up the time cost, it becomes prohibitively expensive to do this at this point with
the technology available; for something like that, it might be better to fly a plane over the arboretum, for instance. Also, if you're thinking of setting up a monitoring plan, you need to consider the distance to the site, accessibility and time of day, because if you want to fly all of your plots at the same time of day, say around noon, and you have to drive between sites, you can't do that in the same day. Sometimes it turns out that just putting in a fixed camera, so you can get consistent data even though it's maybe not as high resolution, might be a better option, or using satellite data. Again, for example, for a lot of the local agricultural monitoring, it turns out it's cheaper to do it by helicopter, because you might want to fly five sites in the same day, and with a helicopter you can do that in about an hour; it might cost you a couple of thousand dollars, but that's cheaper than trying to drive around to five sites over the course of a week flying a drone, and you can put a much heavier payload on a helicopter. So this is the point cloud we got out of the arboretum; again, if you go to traitcapture.org you can go around and explore it. It looks pretty good, but you can see there are some artifacts where the stitches didn't line up perfectly, because we had to do it in pieces. So again, choose the right tool for the job.
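The subset-splitting we had to do for the big arboretum survey can be sketched like the following; this is a hypothetical helper, not our actual pipeline. It keeps each batch under a service's roughly 500-image cap, while letting consecutive batches share images so the separately reconstructed blocks have common tie points for merging afterwards.

```python
def overlapping_batches(images, max_size=500, overlap=50):
    """Split a flight's image list into batches of at most `max_size`
    images, with consecutive batches sharing `overlap` images so the
    separately reconstructed blocks can be aligned and merged."""
    if max_size <= overlap:
        raise ValueError("max_size must exceed overlap")
    step = max_size - overlap
    batches = []
    for start in range(0, len(images), step):
        batches.append(images[start:start + max_size])
        if start + max_size >= len(images):
            break  # the final batch already reaches the end of the list
    return batches

flight = [f"img_{i:05d}.jpg" for i in range(1200)]
batches = overlapping_batches(flight)
print([len(b) for b in batches])  # → [500, 500, 300]
```

The `max_size` and `overlap` values here are illustrative; in practice the overlap needs to be large enough that the reconstruction software finds shared tie points across the block boundary.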
Larger-scale surveys are not necessarily best done with a drone, and there was an interesting article in Frontiers in Ecology and the Environment last year where they looked at doing UAV surveys in Alaska, and it turned out it was seventeen hundred dollars per site to use a UAS but only four hundred dollars per site to use a plane, mainly because the sites are quite far apart, so that's an important consideration. Now, in the future, I think what we all want is self-flying small drone swarms that fly over our forest or field daily, have all this data uploaded and processed in the cloud, and deliver near real-time analytics to the farmer or the end user. I've been wanting someone to release something like this for years, and finally this year a company called XAircraft in China and a group in Sydney called Revolution Ag are starting to release such a system. So here's an agricultural UAV: it can carry pesticides or water or fertilizer, and they can fly themselves. Of course there are regulatory issues around this, but we're getting close to the point where we can have a swarm of drones just monitoring our sites continuously, and in the future, as micro-drones get better and higher quality, they may be more feasible and safer, particularly for
field monitoring where maybe people aren't going to be there, because you can imagine a small study paying ten thousand dollars and getting five hundred of these drones; they take off every day, fly around your forest and come back, they're all solar powered, they live on a tower somewhere. That would be a really amazing solution to give you continuous three-dimensional data of your research site. So, some thoughts on developing a drone program. Drones and sensors are a crucial component for field monitoring and phenotyping, but it is hard to do, and there's a lot of technology available, so you want to focus on the deliverables: who are your stakeholders or customers, what do they need to know, and what would be actionable information for them? It's easy to say, oh, let's just get a drone, and then realize afterwards that you haven't really figured out what kind of data you're going to deliver; then, working back from that, you can figure out how big the area is, what the full costing is, and so on; Kim covered a lot of this. You also want to start small and get your pipeline working: you don't want to start off flying the two hundred and fifty hectares of the arboretum, you want to have done the smaller forest first. You want to make sure that the UAV you buy is the correct one, and consider whether it might just be cheaper to subcontract. When you process the data, you need to decide if you're going to outsource it or do it yourself, and if you need to develop novel tools, other groups are doing this stuff too, so think about how you can share those tools with others, so that we all have good tools for processing these sorts of data. And again, don't underestimate how hard data management is, and have a plan in place at the start. OK, we're running out of time, so I'll run through this really quickly.
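On the provenance problem I mentioned earlier, where the same project run through three versions of PIX4D gives three different outputs, one low-tech safeguard is to write a small sidecar record next to every output. The sketch below uses assumed field names; the idea is just to hash the exact input image list and capture the software version and settings that produced the result.

```python
import hashlib
import json

def provenance_record(image_names, software, version, settings):
    """Return a JSON provenance record for one processing run: a
    SHA-256 digest of the sorted input image list (so upload order
    doesn't matter), plus the software name, version and settings.
    Saved next to the output, it ties each point cloud back to
    exactly what produced it."""
    digest = hashlib.sha256("\n".join(sorted(image_names)).encode()).hexdigest()
    return json.dumps({"inputs_sha256": digest,
                       "software": software,
                       "version": version,
                       "settings": settings}, sort_keys=True)

a = provenance_record(["img_002.jpg", "img_001.jpg"], "PIX4D", "3.1",
                      {"keypoint_density": "high"})
b = provenance_record(["img_001.jpg", "img_002.jpg"], "PIX4D", "3.1",
                      {"keypoint_density": "high"})
print(a == b)  # → True: same inputs and settings give the same record
```

When an output looks different from last month's, comparing the two sidecar records tells you immediately whether the inputs, the version or the settings changed.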
I just want to say that all of this data is new and incredibly hard to manage, because it's three-dimensional and it's got multiple layers and time series, and the tools we normally use for data visualization and management are not usable for the tool sets we have now. We need a sort of MATLAB, Excel or GIS for time-series, three-dimensional, hyperspectral data, and this doesn't exist yet. There are a couple of groups at the NCI that are developing point cloud viewers; we're working with the VisLab to make a time-lapse virtual reality and Windows-based point cloud viewer; and if you search for Adam Steer at NCI, he's working on an NCI-backed one, which would be really great, because then groups like TERN and us could dump all of our data in the same spot on NCI, and we'd have these real-time tools for pulling point clouds out on the fly. And the last thing, which I'll go through quickly because I know we're out of time: one thing I wanted to do, once I started getting three-dimensional data, was to be able to visualize all the sensor and point cloud data on the landscape where it was collected, because to me that's what really helps you see the data in the context of the place you're monitoring. So I collaborated with some students in the computer science department, and we made
a virtual reality three-dimensional model of the National Arboretum, using all the drone data as well as our sensor data. Essentially, you can take three-dimensional modelling software developed by Hollywood and use it to reproduce your landscape, but rather than using whatever data they use for a movie, you can use the data from your drone flights to generate the three-dimensional models. So this is an example of the project for the National Arboretum: here is the drone flight data and the three-dimensional forest we get out of it, and this is the virtual reality version, where we've taken the digital elevation model to generate the landscape and then put the trees in at the locations the drone data measured. When you interact with the trees, they show you their metadata, in this case the height and area of each tree, and you can also map onto the landscape, and play back as a time series, the different data types we're collecting with the mesh sensors. So this is a really great tool for pulling all of these dense data layers together and visualizing them in one spot, to help us make sense of this incredibly complex data. And I think we should all consider that this is just the beginning: when I was a kid I used to play Atari, and this is what it looked like, and now these are the same dragon games that my kids play. We're at the Atari stage in VR and in our ability to measure the world continuously in 3D, and in ten years we'll be at a stage indistinguishable from reality. So the question is: what do we do with these tools, how do we create the next generation of interfaces that facilitate ecosystem research, and how do we build monitoring programs that make the best use of all these data types, so we can really model our environment in three dimensions and solve the grand challenges of this century? There are lots of people to thank, but
I think I'm out of time. Excellent presentation, Tim, thank you for joining us, and we look forward to seeing you again in the future.