Good morning everyone, just about. I'm presenting this on behalf of a very broad consortium of people that sits within the Marine Research Development Initiative. That's a collection of academic institutes including my own lab at Plymouth Marine Laboratory, the Marine Biological Association and a number of other academic partners, too many to mention individually at the moment. And as the title tells you, this is very much a team effort that I'm presenting. So the National Centre for Coastal Autonomy launched at the end of 2022. The idea behind the concept is to bring together a number of different initiatives that have been developed over the years and that are particularly focused on coastal oceanographic research. Each of the institutes here has a particular focus on our coastal oceans, as well as researching much of the deep ocean. What we were recognising, particularly within the development of marine autonomous systems, is that a lot of those systems tended to be focused on deep-water and polar applications. We felt there was a real need to find a hub, if you like, and start working towards a centre of excellence with a little bit more of a coastal ocean focus. Some of the obvious reasons: the greater interactivity between terrestrial and ocean systems closer to land, and also the hazards we typically have to deal with, rocky shorelines, shallow water, fast and strong tidal currents. And if we look back at the incentives for us, we go back to our traditional ways of doing things, and this is certainly still very much in use today. This is the Plymouth Quest, PML's small research vessel, which is used to maintain the Western Channel Observatory, a sustained observing system in the waters just off Plymouth. That's been in existence in some form since 1903.
It's one of the longest-running ocean observation time series we have. It's typically made up of a whole host of different observational capabilities. Those of you involved in marine science will recognise these as CTDs, alongside very traditional methods of net hauls, actually dragging things out of the ocean, taking them back to the lab and making measurements there. And if we look at the myriad of different disciplines we have within that, these become very complex projects with a whole host of different data pipelines, so digital technologies have a really, really important role to play in advancing our science capability. I'm going to focus a little bit more on some of the platforms today; there's a talk tomorrow by my colleague Tom Mansfield, who will be digging into some of the data pipelines, and where digital technologies and robotics and autonomous systems can start assisting us in those areas. The hope here is that by bringing together that collective capability under the NCCA banner, we can start making some progress and, as I say, provide a focus for people to gather around. Over the last few years we've already been trying to bring together some of the hardware, the shiny yellow robots, under the Smart Sound initiative. That's a similar collaboration, but also including the city council and the University of Exeter. And this shows some of the types of technologies we're talking about, a slightly broader set of platforms, maybe, than autonomous systems researchers might typically deal with. We're becoming quite familiar with seeing these shiny yellow robots, like the gliders and the AUVs; I think we all passed them coming through the door, the AUVs down in the boat bay, and the gliders that I've personally been working with for a long time.
But we're also starting to see more intelligent smart buoy systems with autonomous capabilities, particularly around the high-density transfer of information over the communications networks we can provide. And uncrewed surface vehicles are a really interesting area for research, both for the potential they have and for the problems we're starting to experience around regulation of those types of platform. The NCCA helps advance some of that development, because essentially the regulation isn't managing to keep up with the technology and the science we're trying to do. And then there are some of these autonomous things that are a little bit more recognisable as the boats we might otherwise see in the nearshore coastal environment; these are also starting to get a lot of interest from industry in terms of what future shipping might look like with the role of autonomy and AI. So the objective of the NCCA is to provide a fully integrated marine monitoring framework: enabling platforms for autonomous surveys, establishing monitoring capabilities, and also providing the underpinning state-of-the-art communications networks that make all of this possible. A key part of what we're working on, and perhaps more relevant to a lot of people in this audience, is curating data and image libraries that have the potential for use as quality-controlled training data for various AI solutions. I'll say a little bit more about that in a moment. There are various other applications that draw on the expertise within the groups. A key part of this, really, is bringing together the regional, national and international partnerships that the NCCA can provide. In terms of access, what I'd really like to see come from advertising this new initiative to an audience like this is that we provide a platform for innovation and also for training.
And again, in order to actually develop this level of capability, we're also able to share the massive amount of infrastructure it demands, opening it up for community access and use, and providing that in a UK coastal waters environment through the collective capability across the partnership, such as the marine centre here at the university. This is one of our workshops, preparing one of our smart buoy systems that's now just about to go back in the water, but also some of the more traditional capabilities and crews; never underestimate the amount of expertise and skill required to actually get things wet. And again, that's available to academia, industry and basically any partner that wants to have a chat. More recently we've also been developing the communications network, so we now have a fully capable 4G and 5G communication system within the Plymouth Sound area. If you're familiar with that part of the UK, here's the Plymouth area, we've got this large sound, and there's a breakwater just there if anyone knows it. This gives around 20 miles of connectivity. And there's capability there also to work with your own remote communication systems, so you can bring your own equipment to test within that system; your own comms networks can be fairly rapidly linked into it, and there's a mobile capability to extend and take advantage of the system.
And in the very near future, we're hoping in September this year, we'll extend that into a subsurface communications network. The equipment is coming in over the summer, and we hope these transponder networks will allow for communications and navigation within that Smart Sound area. Again, this is potentially a mobile system that can be applied in other areas, but we're going to provide focus in the first instance in the area where we all operate. I mentioned earlier the broader network of partnerships; within those partnerships we also have the largest marine autonomy cluster in the UK, a triple-helix cluster as we term it, between industry, government and academia. You can see the vast array on the slide, and I'm sure there are plenty that haven't yet been added to it. And I just wanted to come on and highlight a few of the applications that I thought might be of interest to this audience, the sort of things the initiative can help drive forward. This is some work I was involved in personally, actually before I joined PML, when I was at NOC, and just to flag, there's a paper on this work with the Met Office if you want more information. This was based on the capability of these underwater gliders, on the left there. It was a three-month mission run in that same area, just off Plymouth Sound, with one single glider. Anyone who's worked with these will understand that they're extremely slow systems, typically only moving horizontally at about 0.2 metres per second.
Typical tidal velocities in this area would be more like half to one metre per second, so we've got a system that's not going to get anywhere very fast. It also means they're particularly susceptible to being pushed around by the currents; they might end up where you don't want them to be, and it's extremely tricky to manage them in the coastal environment, which again was partly the incentive behind developing the NCCA. The interesting part, however, is that each time these gliders profile through the ocean they come to the surface on a regular basis, communicate information back to us, give us some data, and allow us then to communicate back to the glider, so there's potential there for adaptive sampling, which was the objective here. In this instance the glider came up every three hours or so and provided data back, and that data was then assimilated into an operational forecast model run by our Met Office partners, which also ingests observation data from satellites to try and improve the local forecast capability. But we then also applied our own stochastic prediction model; essentially we were training that model on the past 72 hours' worth of data from both the model and the observations, to try and provide improved guidance to the glider, feeding back to tell it where it should go next. And in this instance the objective was finding the phytoplankton patches as they develop through the spring bloom period, the transitional period during spring where the ocean is getting warmer and the structure of the ocean starts to change. That's when we get a lot of our biological production, and we see these phytoplankton blooms occurring across our coastal waters.
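The surfacing, retrain, retask loop just described can be sketched in a few lines. This is a minimal stand-in, not the actual system: the stochastic predictor from the mission is replaced here with a simple bias correction learned over a rolling 72-hour window, and all class and variable names are illustrative.

```python
from collections import deque

WINDOW_HOURS = 72      # rolling training window, as in the mission
SURFACING_HOURS = 3    # the glider surfaced roughly every three hours

class AdaptiveSampler:
    """Toy stand-in for the stochastic guidance model: learns a bias
    correction for the operational forecast from the last 72 h of
    co-located glider observations, then retasks the glider toward the
    highest corrected chlorophyll among candidate waypoints."""

    def __init__(self):
        # Each entry is (hour, model_chlorophyll, glider_chlorophyll).
        self.history = deque()

    def on_surfacing(self, hour, model_chl, glider_chl):
        """Record a co-located model/observation pair and drop anything
        older than the 72-hour training window."""
        self.history.append((hour, model_chl, glider_chl))
        while self.history and hour - self.history[0][0] > WINDOW_HOURS:
            self.history.popleft()

    def bias(self):
        """Mean observation-minus-model error over the window."""
        if not self.history:
            return 0.0
        return sum(o - m for _, m, o in self.history) / len(self.history)

    def next_waypoint(self, forecasts):
        """forecasts: {waypoint: model chlorophyll forecast}.
        Pick the waypoint with the highest bias-corrected value."""
        b = self.bias()
        return max(forecasts, key=lambda w: forecasts[w] + b)
```

On each three-hourly surfacing you'd feed in the co-located model and glider chlorophyll values, then retask the glider toward whichever candidate waypoint has the highest bias-corrected forecast.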
And that's when a lot of our carbon is fixed into our oceans, so it's really important to get these forecasts, or at least the modelling capability, as good as they can be. So I'll try and walk you through what we actually did. Hopefully you can see where I'm pointing: the glider data here is in light blue, and this is the real-time data coming in. Through a daily period the chlorophyll is changing continually; we get small patches of phytoplankton moving around, and this glider, which can barely be controlled in a very strongly tidal environment, is drifting in and out of these little patches. We take the daily mean from that data, and that's the dark blue line on the top. We then look at what the operational model was producing, which is the green line. This is the operational physical model run by the Met Office, linked to the ERSEM ecosystem model that we've been developing, and the forecast from that is the green line at the bottom. That's not a great surprise, because the red crosses are the satellite data actually being assimilated into the model: the model is essentially being constrained by the red crosses, so it's no real surprise that it hasn't managed to get up to where the actual observations from the glider are. Remember, of course, that the satellite data is only seeing the top metre or so, so it's missing a lot of what's going on. The stochastic model, because it was ingesting the real-time in situ data from the glider as well as the assimilation information, was actually providing a much better prediction; that's this other line. From this, it was then guiding the glider to where it needed to go to actually measure the patches of phytoplankton.
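As a small aside, the dark blue line, a daily mean taken over irregularly timed glider samples, is straightforward to reproduce. A stdlib-only sketch, with illustrative names:

```python
from collections import defaultdict

def daily_means(samples):
    """samples: iterable of (timestamp_hours, chlorophyll) pairs taken at
    irregular intervals as the glider drifts in and out of patches.
    Returns {day_index: mean chlorophyll}, the smoothed daily signal
    plotted as the dark blue line."""
    buckets = defaultdict(list)
    for t, chl in samples:
        buckets[int(t // 24)].append(chl)
    return {day: sum(v) / len(v) for day, v in sorted(buckets.items())}
```

Timestamps are in hours here, so samples are bucketed by `t // 24`; real glider data would carry proper datetimes, but the averaging is the same.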
And it was then feeding back into our own version of the operational model, because the Met Office can't run its model twice, it's too big. So we were running it offline in our own version, and the orange line here is what that offline alternative was providing. You can see that we managed to constrain it, bringing it towards a more realistic situation. In terms of upscaling this type of approach, though, the bottom plots show the difference between the local forecast and the actual glider observations, relative to what the operational model was seeing. We can see that we're improving the model in the immediate footprint of where we're actually observing, but the broader area isn't being improved, certainly not with the methods we used in this situation. So from this we can start asking what an optimal monitoring framework might look like in order to bring the model up to that observational capability. Now on to a very different project that we're working on within the NCCA at PML. This is the automated in situ plankton imaging and classification system, APICS. This is trying to improve on the traditional method of going out with the ship, collecting water to take back to the lab, and undertaking a whole series of different analysis techniques to understand what the plankton community structure looks like and how it changes over time. The APICS system employs two imaging systems, the Imaging FlowCytobot and the Plankton Analytics Pi-10. The reason is that they capture slightly different parts of the size spectrum: the FlowCytobot reveals the smaller part, and the Pi-10 the larger plankton that we see in our system.
But of course, the equipment is one thing; the challenge really is getting it into the environment and making it useful. That's again where the infrastructure the NCCA provides allows us to integrate this within our already existing buoy systems. The expectation is that towards the end of this year, once we've finished understanding how these bits of kit work on the bench, we'll start getting them integrated within our buoy systems, hopefully generating data by the end of next year. This will provide profiling capability, putting the imaging systems on profiling buoys, integrating the two systems in close proximity to each other, and we'll have a profiling CTD coming out of that L4 buoy as well. So there's a massive amount of infrastructure required to actually make this work. And of course, in order to get to the point where we can communicate this sort of information in near real time, there's a whole host of other technical challenges in front of us, not least the communication of very large amounts of data, and the trade-off between onboard processing of that data and communicating it over what might be an extremely high bandwidth link compared to most ocean applications, but one that's still going to be limiting in terms of the information we can get off this system in real time. So there's a whole host of work that hasn't yet been done, but we'd be really keen to talk to people working on similar image identification and classification systems; I think we've a lot to learn from each other in that space. I hope I haven't overrun and have left time for a question or two.
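To give a feel for that trade-off, here's a back-of-envelope sketch. All figures are illustrative assumptions, not measured numbers from APICS: a continuous imaging instrument shipping raw frames can produce orders of magnitude more data than one that classifies onboard and transmits only per-image summaries.

```python
def daily_volume_mb(images_per_hour, bytes_per_item):
    """Rough daily data volume in megabytes for one instrument."""
    return images_per_hour * 24 * bytes_per_item / 1e6

# Illustrative assumptions only:
RAW_IMAGE_BYTES = 200_000   # one compressed plankton image
SUMMARY_BYTES = 64          # label + confidence + a few size metrics
IMAGES_PER_HOUR = 10_000

raw = daily_volume_mb(IMAGES_PER_HOUR, RAW_IMAGE_BYTES)   # 48000.0 MB, about 48 GB per day
onboard = daily_volume_mb(IMAGES_PER_HOUR, SUMMARY_BYTES) # 15.36 MB per day
```

Under these assumed figures, raw imagery works out to tens of gigabytes per day while onboard-classified summaries come to around fifteen megabytes, which is the kind of gap that makes onboard processing attractive even on a comparatively high-bandwidth coastal link.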
So to summarise, the NCCA is offering a fully integrated marine autonomous systems network, with open access to academia, industry and other partners, and with the intention of establishing a centre of excellence in coastal ocean autonomous systems. I won't read out what I've written there, so hopefully we'll have time for a question.

Question: Given the breadth of the environment you're covering, there are so few examples of adaptive sampling actually seen in practice, so I think that's really interesting. What prospect is there of this being expanded, beyond digital twin territory, into the kind of relay network to the Met Office?

Answer: To be honest, we have the technology and the capability already; it's the funding vehicle for that which needs to be quite large. Autonomous systems are often put up as cheap and available, but that's really a perspective taken from how much a ship costs. It's a lot cheaper than putting a ship in the water, but it doesn't do what a ship does. We don't tend to put 40 or 50 ships in the water, but the opportunity exists to put 40 or 50 of those robotic systems in and actually start getting that whole-system monitoring capability. I think we need to be a little bit careful about how quickly we rush into these large systems; running two would be an advance on one, and actually running those two in a sensible way so that they're complementing each other's measurements. Some of the thinking behind why and how we use those platforms, I think, still needs to be done. I'd see those as the next steps. The Plymouth application here was actually a real challenge because it's a really tricky bit of water. We were extremely constrained, particularly by shipping and transport issues, but that's quite typical of coastal environments; again, the challenges we experience in that space are different from, say, the open ocean.
But in terms of whether we can immediately upscale: that capability exists in the UK.

Question: But you had one glider, right? I was thinking, if you could do adaptive sampling with one glider and not do adaptive sampling with another, would that be a way of showing it genuinely was adaptive sampling?

Answer: I think we can genuinely demonstrate that it's adaptive sampling because these things have near real-time communication; it's a completely closed loop. There are quite a few other people working in this space as well, and we are seeing people using small fleets of autonomous systems.

Question: Is there a primary scientific or societal problem that you're going to address through this initiative, one that focuses your attention on delivering some clear change from what we've been doing before?

Answer: There's a long list of scientific objectives and priorities that the Western Channel Observatory and the whole coastal observing and monitoring community is focused on. I think the key objective here is to start providing the right sort of vehicle and platform for testing the technologies that we ultimately need to bring to bear on our scientific objectives. I don't think the NCCA necessarily needs to come up with a new scientifically or societally important objective; I think there are plenty of them for us to choose from. What's often lacking is a focal point to help coordinate the array of platforms and technologies becoming available to us. We're all very busy; who's got time to invest time and money in assessing systems? The NCCA provides that, in terms of smart sampling opportunities, and more generally the national capability that NERC underpins provides the platform for all those different questions to be addressed. I'm not sure the question needs to come from the NCCA.
I'm happy to talk afterwards about any interactions you'd like to take forward.

Question: I think we have time for one more question, and I'm going to jump in. On that last slide you were talking about using image recognition across different types of platforms. We had a talk through the webinar series about the difficulty of convincing the community to buy in to the merits of that. What's your experience been of trying to sell that, of elevating it to an operational system?

Answer: I'm absolutely not an expert in that space, though I'm certainly talking with the people here who are doing it. It's an active discussion going on in our labs at the moment. There's a whole host of different things going on in terms of the use of imagery and image identification and classification. It doesn't seem at the moment to be a particularly well-coordinated community, gathering around to test things and share experience and expertise.

Questioner: Yeah, I agree. We were talking last night, Steve and I, about what an excellent tool that was. It was fantastic.

Answer: But I know that PML and Cefas aren't necessarily coordinating their efforts in that space when buying similar equipment. There are others around the community talking to each other about the science applications, but I don't think some of the more technical aspects are being discussed quite so broadly. I think that's actually something the NCCA is well placed to help take forward. I do see the need for some bottom-up initiatives in that space, just to get us all talking to each other and understanding where there are common problems.

Questioner: I think it's really encouraging the amount of progress you've been able to make, given that there's that kind of hesitance and that the bottom-up movement needs to come first; you've still been able to make progress with it. So from my perspective that's really, really encouraging.
Just to add, I realise for those of you who didn't see that talk, it's really, really good and it's recorded: AI methods for doing image processing are almost limited by humans at this point. And going back to the previous talk, collecting training data, good-quality training data, is really hard.