I have the pleasure of welcoming and introducing the team from Snap IT and Team Fish: Team Fish CEO Amanda Barney, talking about integrating AI into electronic monitoring systems for more accurate, verifiable data.

My name is Amanda Barney and I'm the CEO of Team Fish. Today I want to talk about the way we've integrated AI into our electronic monitoring (EM) work. Really quickly, I'll introduce myself and Chris Rodley. Then I'll talk about why we're using AI in our EM work, give an overview of some of the work we've done to date, outline our next steps, and finally talk about our vision for the future of AI in EM and the key needs that have to be addressed for real progress to be made in this area. At the end, Chris and I will be available for questions. As I said, I'm Amanda Barney, the CEO of Team Fish. I've been running this company since it launched in 2019, have been working in EM for over nine years, and have been in fisheries in general since the late 90s. Team Fish is a federally designated electronic monitoring service provider. We run regulated and pilot programs around the world and really pride ourselves on the delivery of accurate, verifiable and reliable data. Chris Rodley, who is not presenting but is here to answer questions, is the CEO and co-founder of Snap Information Technology (Snap IT). Chris has over 14 years of business and startup leadership and a lifetime of innovating and developing different technologies. Snap IT was founded in 2007 and is based in New Zealand. It is a really incredible group of highly skilled and dedicated software developers, electronic engineers and mechanical engineers. They build state-of-the-art cameras that go on fishing vessels to collect data, and they also develop, support and build the hardware, firmware and analysis software that we use to collect EM data and to deliver EM programs.
The reason Team Fish and Snap IT decided to look at how AI could be used in electronic monitoring is that, from a logistical point of view, there are two main reasons EM has difficulty scaling, and both come down to the amount of time it can take to review EM data and footage and find the things you're looking for. If you're running a camera 24/7, you're not necessarily fishing 24/7, so you need to find the particular events or data sources you're looking for. AI could really help in isolating the events you may be looking for, and AI could also be used to reduce the overall amount of data you capture. So really: how can we use AI to make EM more efficient, by helping video reviewers do their job more quickly and by potentially reducing the amount of data we actually collect?

Some of the work we've done in AI was with the Hawaii longline fishery. In that project, we collected EM data on five different vessels and used machine learning and AI algorithms inside the video review process. We were testing and training algorithms to identify fishing activity, but also to identify things on deck, to get to the point where it would only be flagging fish on deck for us. Eventually, by having people identify those fish to species, we are helping in a global effort to develop species identification algorithms. So that project was really about how we can use AI in the video analysis process to make it more effective, and also potentially contribute to some global AI development.

The second example I'll give you is from a recreational fishery. In this example, we installed an electronic monitoring system at a fixed point on land and collected footage of a particular pass that had fishing vessels coming and going.
For a particular day of the year when a recreational fishery was open, we trained the AI to count the number of vessels coming and going. This was done as a means to determine the actual effort participating in the fishery: because the camera is in one place, you're just asking the AI to let you know when an object breaks the plane. This was a relatively easy use case for AI, and we're seeing this technology being replicated at other sites around North America.

Finally, a project Snap IT did in 2018 with some trawl and longline fisheries in New Zealand looked at whether you could actually use AI on the vessel to reduce the amount of data being transmitted back to shore, or being put on the hard drive and then collected from the vessel. They were using AI on the vessel to identify fishing activity and flagging only that data as needing to come off the vessel. This was just an experiment, but the lab results showed that in some fisheries you could reduce the amount of data you need to remove from a vessel by up to 98%.

So what are the next steps? These are great projects and we had some excellent results, but one of the things we found is that when we take an algorithm that has been tested and trained on a particular vessel, and has become very good at reducing video review time there, and move it to another vessel, it's not as good, because every vessel looks a little different. So we really need to spend time developing AI that can be used across fleets and across fisheries. We need a production-level pipeline that will allow us to introduce the AI to huge amounts of footage, so that it becomes efficient at finding the things we want it to find across multiple vessels and for different fisheries.
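The fixed-camera vessel count described above is simple enough to sketch in a few lines. This is a hypothetical illustration, not the system's actual code: it assumes an upstream object detector and tracker already produce per-frame centroid positions for each tracked vessel, and it just counts how many tracks cross a virtual plane in the image.

```python
LINE_X = 100  # assumed x-coordinate of the virtual plane in the camera frame

def count_crossings(tracks, line_x=LINE_X):
    """Count how many tracked objects cross the virtual plane.

    `tracks` maps a track id to the ordered x-positions of that object's
    centroid over successive frames (stand-ins for detector/tracker output).
    """
    crossings = 0
    for positions in tracks.values():
        for prev, curr in zip(positions, positions[1:]):
            # A crossing is a sign change of (x - line_x) between frames.
            if (prev - line_x) * (curr - line_x) < 0:
                crossings += 1
    return crossings

# Two vessels pass through the channel; a third never reaches the line.
tracks = {
    1: [40, 80, 120, 160],   # outbound: crosses once
    2: [180, 140, 90, 30],   # inbound: crosses once
    3: [60, 70, 65, 72],     # loiters on one side, never crosses
}
print(count_crossings(tracks))  # -> 2
```

The hard part in practice is the detection and tracking that feeds this counter, which is why a fixed camera and a single plane make it such an easy use case.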
We have every confidence that this technology is applicable and useful, but we want to bring it to scale so that the full potential of AI is tapped. Right now we have a review platform that is also an AI testing and training pipeline. This review platform exists, and it really acts as a super data storage system: we can take in data not just from video cameras but from other sensors. We would really like to build this into a larger, global AI platform and marketplace, because, as we said, the algorithms being developed need to be run on enormous amounts of video so that they get good at doing their job on lots of different vessels, and don't become so highly trained on a small sample size that they are really ineffective at scale.

The key needs, the things we really need to see for this progress to happen, are these. First, we need to make sure there's trust in the models being developed, that people believe the AI is not missing things. That's the top thing: the quality and reliability of the data cannot come into question. Second, we really need scaled government projects that target specific fisheries management outcomes and identify AI as an essential tool for delivery. If everyone is thinking about AI, and it's built into the development plan for a particular fishery and for EM, then we start the project off knowing that EM is going to have to work for a huge number of fisheries, so development starts at the right scale; we're not trying to take a trial and grow it bigger, we're thinking big at the beginning. Third, we really need a thriving climate and blue tech investor market so that we can get funds for this kind of innovation and AI production capacity. And finally, we believe we really need a fisheries and maritime AI marketplace so that this AI can start to be shared globally among companies, governments and NGOs. In order to deliver big, we need to plan big.
Thank you very much for your time today, and as I said, myself and Chris Rodley are available for questions. Thank you.

Thank you for sharing your story, Amanda, and I believe Chris is there with you. My question speaks very much to that top requirement, that it works and we need to get to scale, which came through strongly in your talk. I love the camera angle you had at the side of the boat. What we've found from other groups during the forum is that they had surprises. For example, they found when they were trying to identify fish that working on the tail section and head section helped them understand what the fish were. They also found they could start to get at questions they didn't even know to ask, for example, whether they could record hooks that bring back bait and hooks that come back unbaited but don't have a catch. I just wondered if you could share some of the surprises that happened on the journey for yourself and Chris, the kinds of things that made a difference. We've also heard about lighting being a massive problem: you're talking about transferring between vessels, but even on a similar vessel, under very different lighting conditions, the AI performs differently. So, any stories along your journey which were surprising and offered new opportunities?

Sure, I'll let Chris jump in in a moment.
I think one of the surprising things was how well the AI did right away on our small sample sizes, and for me in particular, I think I thought it was a solution before it was. The AI would do really well on a small sample size, we'd be doing a proof of concept and demonstrating, and we'd be like, wow, it's really finding the fish, this is terrific. Then we'd move to a second vessel with a similar deck setup, but, as you said, with slightly different lighting, or maybe the measuring strip or the background was a little bit different, and suddenly it wasn't doing well at all. To us, I think that was one of the biggest surprises: we were expecting better results out of the gate in some fisheries. Chris, I'll hand it over to you, because you might be able to explain what I'm saying a little better, but that's what I found surprising, just how smart the AI can get quickly, and how you can overtrain something before you even realize you've overtrained it.
Yeah, that's really true. You're able to make incredible progress in a really short period of time, and then it's the diminishing returns that I found really interesting. You assume that as you transfer from one vessel to another, the work that was done on vessel A is going to contribute and produce something better than starting from scratch on vessel B. What we found was that you actually have to work out that baseline for all vessels. So our approach is really focused on designing systems that are easy to train and that can be specifically trained to a gear type and vessel; that's a real key. The other thing we found really interesting is that when you look at the cost of an electronic monitoring program, it's reviewing data, followed by storage, followed by transmission, followed by the camera. So our focus recently has been around how we can have the AI actually run on the vessel, because that reduces the need to transmit as much data, and it reduces the need to store as much data, and therefore the cost. We've been working really hard with my engineers to get hardware that's AI capable, that can run what's called "on the edge", and we found that really interesting as well when you look at the cost of a program. I'm going to hand over to Matt, because Matt is actually working on some of these on-the-edge tools and he's probably got some nice questions.

Thanks Amanda and Chris, great presentation, and lovely to see that timeline of work as well. I do work in some IoT domains, but what I wanted to ask you is really about what lessons you've learned over that period of time about putting IoT devices or edge computing on board stakeholder vessels. What have you learned from working with the stakeholders, and what knowledge have you gained over that period of time? I think that's really
important. It's kind of weird to put something like that on a boat, isn't it?

Yeah, I think we've always found, ever since the early days of EM, since 2002 when we were first putting cameras on some vessels in British Columbia, that regardless of whether that camera has IoT capabilities, you need to have a pretty thoughtful conversation with people about the fact that you're recording video in their home and where they work. I think it's just been an evolution of that original conversation: we're going to be recording you, we're looking for certain information, there are lots of things we're not looking for, and we're not recording sound. That's always a really big deal, making sure people know we're not recording sound. And then it suddenly changes: you're having the same conversation, but you have to let them know there is the potential that there could actually be some processing happening on their vessel, or ask how they feel about imagery being shared outside of just the regulatory program. It always comes down to a conversation, and you have to make sure it's an honest conversation, that you believe there is going to be some value coming back to that fishery or to that individual fisherman, or at the very least that it isn't going to do them some sort of personal or business harm. Chris can talk a little more about how there might be a slightly more technical conversation, but I don't actually think it's a huge change from the conversation we always need to have when we say we're going to record you at work and then watch you work. It's already a tough conversation, and it's got to be really honest and thoughtful. Thank you very much.

Chris, did you want to add anything to that from the technical side, you know, jumping on people's boats and hooking up to their electrical systems, and then going back to change things all the time, or anything like that?
Yeah, I mean, you don't get a harsher environment than a fishing boat; my joke is I'd prefer to install hardware in a volcano, it probably wouldn't get damaged as much. We've had a lot of broken arms and bleeding noses, and a few other people attending this have walked that journey, and I think it's incredibly difficult. We have a tendency now to really over-engineer. We've come up with ways to purge our cameras: we purge them with argon, which is really dry, so that moisture doesn't seep out of the capacitors when a camera is running in Antarctica and condense on the inside of the front glass. We have tested to 2,000 meters underwater now, and we anodize our own aluminium just to make sure the process is strong; we do our own machining and CNC work on all our enclosures. If I ever see water in a camera again, I will curl up in the corner and rock backwards and forwards. It's just really essential that gear is reliable, and that's probably the most essential element, because of the cost of it not being reliable, so we've invested a lot in that and we've had a lot of hard lessons.

It's great to hear your story, because we're not only talking about the human dimension of difficulties but, you know, just the kinds of environments you're working in and, as you pointed out, the costs; there are costs right down the chain, and it's not the case that once you've got the code you're ready to go, sort of thing. You know, this is an ongoing story, so
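As a closing illustration of the on-vessel ("on the edge") idea that came up in both the 2018 New Zealand experiment and Chris's answer, here is a minimal sketch, not Snap IT's actual firmware: it assumes an on-board classifier has already scored each recorded segment for fishing activity, and it selects only high-scoring segments (plus a little context either side) for storage and transmission ashore. The threshold and padding values are illustrative assumptions.

```python
THRESHOLD = 0.5  # assumed activity-confidence cutoff
PAD = 1          # keep one neighbouring segment either side for context

def select_segments(scores, threshold=THRESHOLD, pad=PAD):
    """Return indices of video segments worth sending ashore.

    `scores` is one activity-confidence value per fixed-length segment,
    standing in for the output of an on-board fishing-activity classifier.
    """
    keep = set()
    for i, score in enumerate(scores):
        if score >= threshold:
            # Pad around each detection so reviewers see lead-in/out footage.
            for j in range(max(0, i - pad), min(len(scores), i + pad + 1)):
                keep.add(j)
    return sorted(keep)

# One hour of footage split into 12 five-minute segments; fishing activity
# is detected only around segments 4 and 5.
scores = [0.1, 0.0, 0.2, 0.1, 0.9, 0.8, 0.2, 0.1, 0.0, 0.1, 0.0, 0.1]
kept = select_segments(scores)
print(kept)  # -> [3, 4, 5, 6]
print(f"{100 * (1 - len(kept) / len(scores)):.0f}% of data not transmitted")
```

In a fishery where active fishing is a small fraction of recorded hours, this kind of selection is what makes data reductions like the 98% lab figure plausible, with the caveat from the talk that the classifier behind the scores must be trusted not to miss events.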