To kick us off, I have the pleasure of welcoming and introducing the first presenter of the day from Griffith University in Australia, Sebastian Lopez, talking about the app FishID, developed as a smart monitoring tool to support the aims of the Global Wetlands Project, and how FishID can be used for conservation and management action. Welcome again, Sebastian. We will now show Sebastian's presentation, followed by Q&A. Thank you very much for your time.

Today I'm going to give you an overview of FishID, which is a smart monitoring tool for aquatic ecosystem monitoring. At the Global Wetlands Project, we are doing three things: measuring wetland health with new analyses and tools, improving conservation action to protect and restore coastal wetland habitats globally, and engaging communities and managers to help them care for coastal wetlands and learn how to support biodiversity and human livelihoods. The Global Wetlands Project has partner sites across the globe, from India, Portugal, South Africa, China and Vanuatu, and the core team is based in Moreton Bay, Australia.

When measuring wetland health, there's a big push now in science and in research as a whole to use automated and remote techniques. Underwater cameras in particular are becoming a commonly used method, mainly because they can collect data on species counts, behavior and size. Another point that makes underwater cameras useful as a remote technique is that they can be deployed for many hours at many different sites and obtain footage of rare behaviors or rare species.
The example you will see in the video is a juvenile giant grouper. These are a protected species in Australia and it's very rare to see the juvenile form, but most importantly we found this individual in a fish passageway moving from a wetland to a tidal lake, so that individual was using that corridor to move between those two ecosystems.

To measure wetland health we're developing a new platform called FishID, an online platform that automates the analysis of underwater video footage and uses artificial intelligence to provide actionable data for aquatic ecosystem monitoring. The pictures you see there are the core team based in Australia: a mix of PhD students, research assistants, software engineers, directors and science communicators.

FishID at the moment can do four things. The first is that it can detect target species. You will see in the video at the top left that the FishID software is detecting the specific species inside the red polygon, providing data on where it is and where it's going. You can imagine that this sort of data is important for managers who, for example, want to monitor invasive species in a particular ecosystem. We can also detect and count fish communities as a whole; you can see here baited camera footage from Moreton Bay in Australia, and every fish that comes into frame is being detected, with the software telling us what species it is. So there's no need for a human to watch these videos to determine which fish species are there. We can also detect grazing behavior in fish species, as you can see there, where that fish is grazing on the seagrass bed. This might be important for managers who want to understand how fish communities utilize a specific habitat. And just recently we're also doing detection and tracking of fish across different visibilities and ecosystems.
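To give a sense of how per-frame detections become community counts, here is a minimal sketch in Python. It assumes a hypothetical detector output format (lists of species/confidence pairs per frame) and uses MaxN, the standard abundance metric for baited underwater video, as an illustration; the talk does not state which metric FishID actually reports.

```python
from collections import Counter, defaultdict

def maxn_per_species(frame_detections, min_conf=0.5):
    """MaxN: for each species, the maximum number of individuals
    seen together in any single frame. Input is a list of frames,
    each a list of (species, confidence) tuples from a detector."""
    maxn = defaultdict(int)
    for frame in frame_detections:
        # Count detections per species in this frame, above threshold.
        counts = Counter(sp for sp, conf in frame if conf >= min_conf)
        for sp, n in counts.items():
            maxn[sp] = max(maxn[sp], n)
    return dict(maxn)

frames = [
    [("bream", 0.9), ("bream", 0.8)],
    [("bream", 0.95), ("whiting", 0.6)],
    [("whiting", 0.4)],  # below threshold, ignored
]
print(maxn_per_species(frames))  # {'bream': 2, 'whiting': 1}
```

MaxN is conservative (it never double-counts an individual that re-enters the frame), which is why it is widely used for baited camera surveys.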
As you can see, there are different scenarios in which we use that tracking pipeline, and we can use that information to understand the behavior of animals across different seascapes and ecosystems. I've added links to the papers we have released on each of these components; they are the orange links at the bottom of each video.

I also want to give you a bit of an overview of the FishID tracking pipeline, which is our latest output. Basically, it's a pipeline that, in simple terms, uses artificial intelligence to detect a fish and then track it in subsequent frames. Throughout the video there is an interaction between the tracking and detection components, and it all starts, of course, from an underwater video. What you will see in these videos is the detection of the fish and then the tracking output, displayed as the dotted white lines. If you want to learn more about this pipeline, just check out the paper that was recently published.

Another component that we're now exploring in FishID is how it performs in low-visibility conditions. We know that low-visibility conditions are common in most underwater ecosystems, and not all ecosystems have perfect visibility. So we have been training our FishID models to identify and detect species across a wide range of visibilities, as you can see there, and for different species. The graph on your right shows three measurements of accuracy for FishID; the x-axis, labelled Secchi, goes from two to five, meaning two, three, four and five meters of visibility. What you will see there is that the accuracy of FishID across these low-visibility conditions is quite constant, and there's no clear trend of any influence of visibility on its performance. This was done on a fairly small test set compared to the test sets you will see in artificial intelligence papers.
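The detect-then-track interaction described above can be sketched with a simple greedy IoU (intersection-over-union) tracker: each new detection inherits the ID of the best-overlapping box from the previous frame, or starts a new track. This is an illustrative toy, not the published FishID pipeline, which the paper describes in full; all function names here are hypothetical.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def track(frames, iou_thresh=0.3):
    """Greedy IoU association across frames.
    frames: list of per-frame lists of detection boxes.
    Returns per-frame lists of (track_id, box)."""
    next_id, prev, out = 0, [], []
    for boxes in frames:
        assigned, used = [], set()
        for box in boxes:
            # Find the unused previous track with the highest overlap.
            best_id, best_iou = None, iou_thresh
            for tid, pbox in prev:
                if tid in used:
                    continue
                o = iou(box, pbox)
                if o > best_iou:
                    best_id, best_iou = tid, o
            if best_id is None:       # no good match: start a new track
                best_id, next_id = next_id, next_id + 1
            used.add(best_id)
            assigned.append((best_id, box))
        out.append(assigned)
        prev = assigned
    return out
```

A real pipeline would add motion prediction and re-identification to survive occlusions, which matter a lot for fish that dart behind structure.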
So this is a test on 12 hours of footage across different places. This is an area we're also expanding at the moment, and one where we could develop partnerships with other institutes or researchers. I also want to note here that FishID has been tested on images that have not been pre-processed with any software or algorithms that improve image quality; we're using just the raw images for this test.

Another area we're exploring, and this is a place where we are open and available to chat and collaborate, is FishID automated sizing, which will increase our monitoring capabilities even further. At the moment what we're doing is quantifying error and creating baselines, but also working out the interaction between hardware, calibration and software. What you will see in this graph is the error rate of our method, which is still in development, and you will see that we're getting close to 0% error for most of the length estimates we are calculating on fish. This will give us even more automated data about fish across ecosystems and across conditions.

Another point I want to make about FishID is that it's fairly flexible. We're using very cheap underwater cameras as our hardware: not $600 cameras, just US$30 underwater cameras, to collect footage and train our FishID software. And the pipeline is deployed across visibility gradients and many conditions, and it's working. Just to give you an idea of how flexible it is, we are currently processing three terabytes, or around 300 hours, of footage of fish movements across connectivity corridors, just to understand behaviors in those fish passageways. We're also using that data with common ecological statistical methods to explore rare or latent behaviors. So again, just to show you quickly, the tracking pipeline is there.
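Quantifying sizing error, as described above, amounts to comparing automated length estimates against paired manual reference measurements. A minimal sketch, assuming paired lists of lengths in millimetres (the units and function names are illustrative, not from the talk):

```python
def percent_errors(estimated_mm, true_mm):
    """Signed percent error of each automated length estimate
    against its paired manual reference measurement."""
    if len(estimated_mm) != len(true_mm):
        raise ValueError("paired measurements required")
    return [100.0 * (e - t) / t for e, t in zip(estimated_mm, true_mm)]

def mean_abs_percent_error(estimated_mm, true_mm):
    """Mean absolute percent error (MAPE): a single baseline figure
    for how far the automated sizing is from ground truth."""
    errs = percent_errors(estimated_mm, true_mm)
    return sum(abs(e) for e in errs) / len(errs)

est = [100.0, 210.0, 305.0]   # automated estimates
ref = [100.0, 200.0, 300.0]   # manual reference lengths
print(percent_errors(est, ref))          # [0.0, 5.0, 1.666...]
print(mean_abs_percent_error(est, ref))  # ~2.22
```

Keeping the signed errors as well as the mean reveals systematic bias (e.g. consistent over-estimation from camera calibration), which a single aggregate figure would hide.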
And as I said, it works in different ecosystems; this is just an example of how FishID performs across visibility gradients, across different ecosystems, and with different fish as well.

Now, you may be asking when you can access FishID, or when it is going to be available. FishID will very soon be an open-access, open-source platform where you can annotate, train, evaluate and analyze underwater video imagery. It's available now as a partnership with the development team, but it's going to be available to external users from 2022. We have also been making a big push in our field to make our code and datasets available. The two images there show two GitHub links, repositories where you can find our code, datasets, hardware details, and any other software we're using, so you could develop a product similar to FishID. But as I said, for now it is available as a partnership with the development team. If you want to learn more about it, you can scan the QR code, which has been appearing at the bottom right of the screen. You can also contact me through my email, which is now displayed on screen, and I've added my Twitter link, where I post regular updates about the FishID product and all the developments we're currently doing.

Thank you very much, Sebastian. A great kickoff to the three days of sharing, and thanks for sharing your team's work. I have a small question for you. You put quite a lot in your talk about how people outside your team can connect with you, but I'm interested to ask: what type of assistance or potential collaborations would you welcome to advance the uptake and utility of what you're developing?

Sure. So we can get input from ecologists, researchers, or also data scientists. As I said, the two main avenues we're exploring right now are FishID in low-visibility conditions and FishID sizing.
The low-visibility work is more on the data science side, including how we can do some image pre-processing; the other area is FishID sizing. This app was developed by ecologists for ecologists, and what we're trying to build is a community that can use this automated technique to monitor ecosystems a bit better. So in all those areas, as I said, data science and ecology, we can get feedback that also helps us develop this platform for everyone to use.

Matt, I believe you have a question.

Yeah, fantastic presentation, Sebastian. Really interesting work. I particularly like the open aspects of your work, and that you've got a trajectory intended to open it up to everybody. What I'm really interested in is how you intend to make it accessible, or more accessible, for people who, say, don't have a very good internet connection, or don't have a very technologically savvy background, but do work underwater or do fish surveys using underwater cameras, and cheap ones; like you say, $30 is very accessible, which I think is one of the key themes of the forum. So how do you intend to take that trajectory?

Yeah, sure. Thank you. So with the tech, as I said, all the code and software will be made available, so if someone cannot use the online platform, they will be able to interact with the code on their own computer. That's the vision: it's all open access, open source. But our key idea is that we're trying to develop this platform so that it won't require any knowledge of coding or data science; it will be all point and click, and then people can really interact with that AI and exploit all the benefits of automated processing. And as I said, by making it open access, people can use it in any way, and potentially also interact with the open-access datasets that we will see in this forum later on.