Our next presentation is by Bernie Tidman of Bangor University, talking about an automated system for collecting species, size and sex data for crustaceans.

Hi, my name is Natalie Holder, I work at Bangor University and I'm a fisheries scientist. We have a collaborative team across a variety of universities and also throughout the fishing industry, and we have been working on a variety of projects, the first of which I'd like to introduce to you today: our video capture of crustacean data.

This video unit is based on the Raspberry Pi single-board computer with an additional board for GPS, GSM connectivity and an uninterruptible power supply. This is mounted onto fishing vessels, where we collect video of the fishermen unloading their catch as it comes on board. The end point of this will be that the video is processed on board before the data on catch, size and sex of crabs and lobsters is returned to shore via GSM or other data transfer methods.

The first step is the extraction of frames from the video. The camera has motion-triggered capture of 10-second clips, with a circular buffer to allow capture of frames from before the motion trigger. Then there is background subtraction to allow further processing, as shown here. These are the types of pictures we're getting.

The first stage of the AI is to identify whether it's a crab or a lobster or something else in the frame. Aberystwyth University have been leading on this aspect, and they have been experimenting with deep learning frameworks to get a tight bounding box and label on each animal. You can see these are the types of images we're getting from a variety of boats. In preliminary testing, Aberystwyth have found that for the lower-power hardware we're using, the Darknet Tiny model looks like a good candidate, with good precision on both crab and lobster species identification.
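The capture logic just described, a motion trigger plus a circular buffer so the saved clip includes frames from before the motion started, can be sketched roughly as follows. This is a minimal illustration, not the project's actual implementation: the buffer size, the frame-differencing trigger and the threshold are all assumed values for demonstration.

```python
from collections import deque

import numpy as np

PRE_TRIGGER_FRAMES = 25      # ~1 s of history at 25 fps (assumed)
MOTION_THRESHOLD = 10.0      # mean absolute pixel difference (assumed)

def capture_clips(frames):
    """Yield lists of frames (clips) whenever motion is detected.

    A ring buffer keeps the most recent frames; when frame-to-frame
    difference exceeds the threshold, the buffered pre-trigger frames
    are flushed into the clip, which runs until motion stops.
    """
    buffer = deque(maxlen=PRE_TRIGGER_FRAMES)
    prev = None
    clip = None
    for frame in frames:
        if prev is not None:
            motion = np.abs(frame.astype(float) - prev.astype(float)).mean()
            if clip is None and motion > MOTION_THRESHOLD:
                clip = list(buffer)          # include pre-trigger history
            elif clip is not None and motion <= MOTION_THRESHOLD:
                yield clip                   # motion ended, emit the clip
                clip = None
        if clip is not None:
            clip.append(frame)
        buffer.append(frame)
        prev = frame
    if clip is not None:                     # motion still ongoing at end
        yield clip
```

A real deployment would use the camera driver's hardware circular buffer and a proper background-subtraction model rather than raw frame differencing, but the pre-trigger flush is the key idea.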
The next step they've been working on is to create measurements for both crab and lobster. These are feature-point-based measurements, developed for both animals from a small training set, and so far tested only on other images, as we were unable to get out and collect as much video as we would have liked due to COVID-19. So they are waiting on more training and testing data of crabs and lobsters from our team at Bangor.

The idea is then to combine this video data with other data such as landings, BMS or other spatial data, and environmental sensor data. In the short to medium term, we're looking to use that for size-based indicators and data-deficient stock assessment methods, but with a long time series we're looking towards a data-rich stock assessment.

Two other projects are just starting. Mike Kaiser, from our collaborator Heriot-Watt, is the lead on a PhD which we've just recruited into, to start in September. This will use two approaches, both high-definition video and 3D laser scanning, to collect and process data on bycatch in fisheries. The second is being led by Claire Szostek at Bangor University, and this is to use underwater video data to estimate scallop densities and create size measurements as an alternative to dredge-based scallop surveys. A recent paper by a team in Scotland has shown that scallop identification works quite well, and Claire is looking to take this further with stereo video to allow size measurements.

So that's just a really brief overview of us. I'm really interested to see what other people are doing and how we can maybe collaborate and take all of this work forwards. I'll leave you with a clip of the bycatch video that we're hoping our PhD student will analyse. As you can see, there's quite a lot of movement in the video.
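The feature-point measurement idea can be illustrated with a small sketch: given pixel coordinates for two landmarks (say, either side of a crab's carapace) and a millimetres-per-pixel scale from a calibration object in the same plane, the size is just a scaled Euclidean distance. All names and numbers here are assumptions for illustration, not the project's actual method or calibration.

```python
import math

def mm_per_pixel(ref_length_mm, ref_length_px):
    """Scale factor from a calibration reference of known physical size."""
    return ref_length_mm / ref_length_px

def landmark_distance_mm(p1, p2, scale):
    """Euclidean distance between two (x, y) landmarks, converted to mm."""
    return math.dist(p1, p2) * scale

# Hypothetical example: a 100 mm calibration target spans 400 px,
# and the detector places carapace landmarks 300 px apart.
scale = mm_per_pixel(100.0, 400.0)
width = landmark_distance_mm((120, 80), (420, 80), scale)
print(width)  # 300 px * 0.25 mm/px = 75.0 mm
```

In practice a fixed camera-to-surface geometry or a stereo rig (as mentioned for the scallop project) would provide the scale, since animals held at different distances from a single camera would otherwise bias the measurement.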
This is because it's been shaken to go through a riddle, which has its own challenges, but we have a student starting on this in September. Thank you very much, and I look forward to speaking to everybody soon.

Well, thank you very much, Bernie, for coming, and thank you, Natalie, for the video and for automatically capturing the kind of information you're after. It's interesting to me that you've got challenges of pots, you've got challenges of shaking units, and yet you've gone with quite a simplified, pared-down processing unit. Can you give us some of the story of how you arrived at that solution, and some of the challenges you've had dealing with things getting in the way, pots and suchlike, or shaking images? Just some of the stories about your journey and how you collaborate with others. Thank you.

Yeah, sure. I'm working on the first project that Natalie talked about, with the crabs and lobsters, and it's still a work in progress. I'm anticipating all the kinds of problems that we've been hearing about from the other speakers. We're starting to deploy some of these camera units on fishing boats, and I'm sure we'll have lots of issues with water on cameras. Where we place the cameras is always interesting, as is making sure you get a decent view of the animal. So yeah, we're going to hit a lot of these problems, I'm sure, and it's interesting to hear some of the solutions that others have come up with.

The one thing we haven't heard much about is that we've had a lot of constraints on both the cost and the working practices of the fishers. Simple things like, and I don't know what other people here today have done in terms of this, we've seen a lot of tracking of animals, you know, the detection working per frame. But we want to be able to count the animals.
So we don't want, if the tracking or the detection fails on a few frames and then starts again, to count the same animal more than once. So we've gone for a simple system where we say, when there's some movement in front of the camera, we're going to just pick a frame that we think has something interesting in it and apply the deep learning to that. So we've gone for cheaper methods of finding a sequence with an animal in it, separating that out, and then picking a frame and doing some more analysis on it, like the measurements and so on.

So I'd be interested to know how other people are finding individual frames. We've seen things like conveyor belts today, and that wasn't an option for us. I was quite keen on just having a big red button that the fishers could hit when there's an animal underneath, so we could just say, right, take a picture now, but that was too far away from their working practices. We had to work with the fishers to make sure they didn't feel it was taking them excess time and distracting them from doing their job. So those constraints were interesting to work with, and we'll have to see whether the sort of solutions we've come up with are actually going to work in the field, so to speak. We're just getting to that stage now.

You're on silent, I think you're muted. It's a fascinating blend of use-case needs, and we're hearing throughout the presentations about trying to separate this from being a requirement on the crew, so that it's something that will operate by itself. Matt, have you got any input?

Yeah, I just think the funding aspect is something that we should maybe think about networking on as an output of the forum, but also the consideration of stakeholders as well.
I mean, traditional fish surveys are pretty invasive if you compare them to what we're trying to do instead: rather than someone standing there getting the tape measure out, you can just go about your business. So maybe there are some incentives there, to say it's just going to take a second and we'll have everything. So yeah, I think workshopping these in real-world scenarios, in a way that's empathetic to the stakeholders, is super important, as important as the tech.

OK, well, thanks very much, Bernie, for stepping in at the last minute to cope with Natalie's not being able to present. Great to hear from your team.
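The counting approach Bernie described earlier, counting each movement event once even if the per-frame detector drops out briefly, and keeping one representative frame for measurement, could be sketched as below. The gap tolerance and the confidence-based frame choice are assumed details, not the project's actual settings.

```python
MAX_GAP = 3  # frames the detector may miss before we close the event (assumed)

def count_events(detections):
    """Group per-frame detections into events and count each event once.

    detections: one entry per frame, either a confidence score (float)
    or None when nothing was detected. Short gaps of up to MAX_GAP
    frames are bridged so a flickering detector doesn't double count.
    Returns (event_count, best_frame_indices), where each best frame is
    the highest-confidence frame of its event, kept for measurement.
    """
    events = []
    current = []          # (frame_index, confidence) pairs in the open event
    gap = 0
    for i, conf in enumerate(detections):
        if conf is not None:
            current.append((i, conf))
            gap = 0
        elif current:
            gap += 1
            if gap > MAX_GAP:         # detector silent too long: close event
                events.append(current)
                current, gap = [], 0
    if current:                       # event still open at end of clip
        events.append(current)
    best = [max(event, key=lambda fc: fc[1])[0] for event in events]
    return len(events), best
```

For example, a detector that fires on frames 1-2, drops out for two frames, fires again on frame 5, then goes quiet for four frames before a new animal appears would yield two events, not three.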