I'm going to speak very briefly and introduce the next two sessions: panel two, Detection and Technologies, parts one and two. We have a lot of speakers, a lot of fresh ideas and new perspectives. And I have the pleasure of first introducing Isabelle Panet, an early-career scientist who focuses primarily on satellite gravity analysis and has also worked on tsunami detection. So Isabelle, are you on? Can you hear me?

Yes, we can hear you fairly well. Can you hear me? Okay. Does it work with the PowerPoint? Yes, it's great. Beautiful.

Okay, so hello. I'm presenting our work on the analysis of the Tohoku earthquake from GRACE satellite gravity. We analyze a time series of the Earth's gravity field mapped by the GRACE satellite mission at intermediate temporal and spatial resolution: monthly in time, and down to typically 400 kilometers in space. We turned this resolution into an advantage to bridge the gap between local and global patterns of plate dynamics, because we can tackle regional scales and put a local event into its broader regional context. In this study, we have searched for possible episodic mass transfer at the scale of the Earth's regional structure near plate boundaries. From GRACE, we can detect the coseismic and postseismic signatures of giant earthquakes, as shown here for the Sumatra 2004 earthquake, with great sensitivity to broad-scale mass variations such as crustal dilatation in the coseismic phase, and a broad-scale postseismic signal well explained by mantle relaxation. Here we focused on anomalies before a giant rupture, and we analyzed two sets of GRACE geoid models.

I'm now introducing our method. The particularity is that we did not try to reach the highest possible spatial resolution; instead, we focus on intermediate spatial scales. If we move to smaller scales, the anomalies concentrate near the epicenter; if we move to larger scales, the signals progressively vanish, so there is nothing more to learn there. At these intermediate scales, we search for possible signals correlating with the geometry of the plate boundaries, that is, elongated gravity variations along the orientations of the plate boundaries or the slab. We introduce this directional information by using gravity gradients rather than the geoid or the intensity of the gravity field. These are the second-order derivatives of the gravity potential, and the idea is simple: if you have a mass excess, as shown here, you have an increase in gravity intensity above the source and also a deflection of the gravity vector towards the source. So if we take a direction perpendicular to the axis of the source and map the rate of variation of the component of the gravity vector in this direction, we get a sharper delineation of the mass anomaly than when we look at the geoid. It is useful for extracting an elongated signal along the direction of interest and separating it from the other superimposed signals in the gravity field. So we have computed these gravity gradients, as shown in this example corresponding to a big thrust slip, which is not supposed to be realistic. They are computed in a spherical frame, and we rotate the frame progressively: when the frame vectors are parallel or perpendicular to the source orientation, we have a maximum intensity, slowly decreasing as we keep rotating.
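A minimal numerical sketch of that rotation idea — this is not the authors' code; a line of point masses and flat-Earth geometry are assumed purely for illustration:

```python
# Orientation-sensitive gravity gradients: for an elongated mass anomaly,
# the horizontal gradient components in a rotated frame are extremal when
# the frame axes are parallel/perpendicular to the source strike.
import numpy as np

G = 6.674e-11  # gravitational constant, SI units

def gradient_tensor(obs, src, mass):
    """Second derivatives T_ij = d2V/dx_i dx_j of a point-mass potential
    (flat-Earth approximation, for illustration only)."""
    d = obs - src
    r = np.linalg.norm(d)
    return G * mass * (3.0 * np.outer(d, d) - r**2 * np.eye(3)) / r**5

# Hypothetical elongated source: a line of point masses striking 30 degrees
# east of north, buried 50 km deep (all numbers invented).
strike = np.radians(30.0)
u = np.array([np.sin(strike), np.cos(strike), 0.0])   # along-strike unit vector
sources = [1e5 * s * u + np.array([0.0, 0.0, -50e3])
           for s in np.linspace(-5, 5, 11)]

obs = np.array([20e3, 10e3, 0.0])                     # surface observation point
T = sum(gradient_tensor(obs, s, 1e12) for s in sources)

# Rotate the horizontal frame and watch the along-axis component T_xx:
for az_deg in range(0, 180, 15):
    a = np.radians(az_deg)
    R = np.array([[np.cos(a), -np.sin(a), 0.0],
                  [np.sin(a),  np.cos(a), 0.0],
                  [0.0,        0.0,       1.0]])
    Trot = R @ T @ R.T
    print(az_deg, Trot[0, 0])  # extremal near the strike azimuth (and +90 deg)
```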
This way we obtain time series of gravity gradients at different spatial scales and for different orientations. We study the temporal evolution of these time series and search for abrupt variations near the time of the earthquake, but we do not impose the exact timing of the variation; we leave a little bit of freedom. We have actually carried out two different kinds of analysis: one where we analyze the entire time series, including years of observation after the rupture, which is shown here; and another where we stop the time series in February 2011, so that we do not know a rupture is coming. Near Japan both agree quite well, so I will present the first one. In this approach, we search for step-like changes in the time series without imposing the exact timing of the step; it could start a few months before or after March. It's a little bit long to explain, but this way we have shown that the step-like changes of the time series around Tohoku start a few months before the rupture, in a regionally coherent way over a wide area, as shown on the left map: a gravity gradient increase that is especially coherent over 1,500 km along the northwestern Pacific subduction, and at the same time a gravity gradient decrease south of the triple junction, which is followed by a regional-scale change in March 2011. This pattern of variation contrasts very much with the result of our analysis in the rest of the area delimited by the blue line, where we do not have such consistency. It is really a regional-scale variation.

Now, if we focus closer to the epicenter, we can see a whole dynamic of variation of the gravity field that unfolds over time, with this regional change starting before the rupture. We do not find it at smaller spatial scales; it is a broad-scale pattern, continued in March, and followed by a progressive concentration of the gravity signals, first on the trench and then slower variations on the oceanic side of the subduction. We have carried out many tests to compare these gravity variations before the rupture with water signals in the gravity field, or with GRACE geoid artifacts, and we concluded that they have a highly unusual spatial pattern and amplitude. You can see, for instance, at a point in northern Japan, a large deviation in the annual snow cycle starting in December 2010, which corresponds to the blue point in the time series, while March is in pink. At all the points of these anomalies, we have a progressive increase of the gravity gradient over a few months, instead of a sharp one in March, as we observe if we move to points closer to the epicenter, where we do not have anything in the months before but really a sharp step in March. We give an extensive discussion of why this behavior is extremely unusual: it appears persistent in the gravity field, it has a large amplitude with respect to the usual water signals and noise in this area, and its spatial pattern does not match that of the usual water signals in particular. That is why we have proposed that it could reflect a deformation of the solid Earth before the earthquake. We note first the consistent orientation of both the anomaly before the rupture and the one in March, which is that of the regional subduction orientation — not the local orientation, the regional one — and also the progressive displacement of the gravity signals.
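Before going on, a minimal sketch of the kind of step search just described — my paraphrase under simple assumptions (linear trend plus a single step, fit by least squares), not the published detection method:

```python
# Search a time series for a step-like change without fixing its onset:
# try every candidate onset in a window around the event and keep the
# one that best reduces the residual misfit.
import numpy as np

def best_step(t, y, window):
    """Fit y = a + b*t + c*H(t - t0) for each candidate onset t0 in
    `window`; return the onset and step amplitude with lowest misfit."""
    best = (None, None, np.inf)
    for t0 in window:
        H = (t >= t0).astype(float)
        A = np.column_stack([np.ones_like(t), t, H])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        misfit = np.sum((y - A @ coef) ** 2)
        if misfit < best[2]:
            best = (t0, coef[2], misfit)
    return best[0], best[1]

# Synthetic example: a step of amplitude 2 starting at month 97,
# i.e. 3 months before the "rupture" at month 100.
rng = np.random.default_rng(0)
t = np.arange(120.0)                                   # months
y = 0.01 * t + 2.0 * (t >= 97) + rng.normal(0, 0.5, t.size)
onset, amp = best_step(t, y, window=np.arange(90, 110))
print(onset, amp)   # the recovered onset can precede the event month
```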
Based on these characteristics, we have proposed that this gravity variation could reflect a slow deformation propagating from depth to the surface within the entire subduction system, and the rupture would occur as the extreme event within this broader and slower deformation pattern propagating within the two oceanic plates. The anomaly can be explained by a mass decrease, and it coincides with the Pacific slab at depths around 250 km, typically, where seismicity vanishes. And it precedes a shallower acceleration of the seismicity release that has been reported. Another peculiar point is the March 2011 coseismic signal. If we compare the GRACE-observed coseismic signal with that modeled from coseismic slip models based on surface motion, the amplitudes agree very well, but the orientation and the spatial extent are not the same. The surface-motion-constrained models align with the local strike of the subduction, whereas the GRACE one aligns with the regional strike of the subduction and extends much more on the oceanic side, so it indicates a larger deformation pattern near the rupture than inferred from surface motion.

Okay, you're finishing up? I'm finishing. A last point: when we stop the analysis in February 2011 and simply map anomalously large variations, we can detect around Japan a consistent anomaly structure, among a few other large ones. And I will stop now. We think we have a unique probe of deeper and slower motion near plate boundaries from satellite gravity, and that a systematic investigation needs to be carried out for all the events resolved by GRACE. Okay, and I stop.

Thank you so much. We're going to have a broader discussion at the end, so I hope you can stay on until the end of this session, when we'll have general questions for all speakers. Thank you. Okay. So we'll move on next to optimizing subduction zone monitoring, from Sarah, whose work was just previewed slightly in this talk as well. Go ahead.

So I'm Sarah, and I am going to give a brief talk, and then later I will entertain questions — and by entertain I mean tap dance or shadow puppets. I may not be able to answer everything, because this is a study that involves three of us: me, Eileen Evans, and Ben Brooks. My contribution was to develop the mathematical framework, and because I like you all too much to make you sit through a talk about math, I'm going to skip over that part and only present the part due to Eileen and Ben, i.e. the part I did not work on.

So: subduction zones — interesting things happen in them. For example, if you want to find out about a subduction zone, you need to figure out the tectonic rates in it, what rate the fault is slipping at, and how that slip is distributed on the fault. And when we go out and try to do that, we get many different coupling models, and if you look at the a posteriori uncertainties on those models, they are very large near the trench, which is going to be a hint about where we need additional observations. And because we haven't yet talked about how these things work, let me put in a few slides to explain how seafloor geodesy works. On land, geodesy is very nice: a GPS station gets information from GPS satellites and positions itself, easy peasy.
If you want to figure out what's going on on the seafloor, things get much messier. Typically, you put acoustic transponders on the seafloor, and then you send out, for example, a boat, which pings the transponders and positions itself relative to the transponders, and then uses GPS to talk to the satellites and position itself. So in order to get an observation, we have to send a boat out, which is very expensive. An alternative that people often use is what's called a wave glider: these are unmanned vehicles that convert the differential motion between the sea surface and the water at depth into forward motion to propel the craft, and they can be steered remotely using satellite communications.

Okay, so what can we observe? With seafloor geodesy, GPS-Acoustic (GPS-A) is principally sensitive to horizontal motion, giving a centimeter-level position after three to four days of observation. We also have seafloor pressure gauges, which are sensitive to vertical motion, also at the centimeter level. And there are other kinds of sensors in various stages of development, for example seafloor strainmeters. But the big question is: what mode do you operate these things in? Typically they are operated through episodic occupation — you send out a ship or a wave glider to visit the transponders and figure out their positions, but you only do this in a campaign mode. Alternatively, you could use continuous telemetry. This uses a lot of power, which means you have to change batteries frequently, and sometimes you might need additional infrastructure, like a buoy positioned over the seafloor transponders, so that's even more expensive. Or, if you really want continuous data, you can cable these instruments: that's continuous telemetry and continuous power, and that is very, very expensive. But the mode you choose will determine what kind of transients or precursors, if any, you can observe. Whatever you choose, it's going to be really expensive, and we're not going to get a lot of observations, so it's really important to make these observations in the places where it counts.

So let's figure out the answer to that question: where do we need to make seafloor geodetic observations? Where is the most important place to get new data? To do that, we're going to use a quantity called information entropy, or differential entropy. That is the amount of missing information about the value of a parameter. Fun fact: it can also be defined as the average number of yes-or-no questions you would have to ask to guess the value of that parameter. So you do a bunch of math and you can come up with an answer to: what is the entropy in your solution? How much uncertainty do you have about whatever it is you're trying to observe? And this answer is independent of what that value turns out to be. If I'm interested in coupling, for example, it doesn't matter whether the subduction zone turns out to be highly coupled or completely uncoupled, or whether coupling is homogeneous across the subduction zone or very heterogeneous, looking like a checkerboard. That doesn't matter. What does matter is that you said you wanted to go out and observe coupling, at some particular spatial resolution.
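For a Gaussian posterior this entropy has a closed form, H = 1/2 ln[(2*pi*e)^k det(Sigma)], which depends only on the posterior covariance — the observation geometry and assumed noise — and not on the data values, which is why the answer is data-independent. A toy sketch of that bookkeeping under a linear-Gaussian model (all numbers invented, not from the study):

```python
# Entropy accounting for a linear-Gaussian inverse problem d = G m + noise:
# the posterior covariance, hence the differential entropy, is set by the
# observation geometry G and the noise level alone.
import numpy as np

def posterior_cov(G, prior_cov, noise_var):
    """Gaussian posterior covariance for d = G m + N(0, noise_var * I)."""
    return np.linalg.inv(np.linalg.inv(prior_cov) + G.T @ G / noise_var)

def entropy(cov):
    """Differential entropy of N(mu, cov); independent of mu."""
    k = cov.shape[0]
    _, logdet = np.linalg.slogdet(cov)
    return 0.5 * (k * np.log(2 * np.pi * np.e) + logdet)

# Toy example: two coupling parameters; one existing observation that
# mostly senses parameter 0 ("onshore-style").
prior = np.eye(2)
G_old = np.array([[1.0, 0.1]])
H_old = entropy(posterior_cov(G_old, prior, 0.1))

# Candidate new observation that senses parameter 1 ("offshore-style"):
G_new = np.vstack([G_old, [0.1, 1.0]])
H_new = entropy(posterior_cov(G_new, prior, 0.1))
print(H_old - H_new)   # entropy decrease = information gained
```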
If you decide to care about a completely different physical property, like maybe the total moment rate deficit across the entire subduction zone, then you can get a different answer, because you're saying you want to make inferences about a different quantity. And note: entropy is bad. It is uncertainty. You want to minimize entropy in order to know more about what's going on in the subduction zone.

Okay. So I gave this framework to Eileen, and she said, okay, let's apply this framework to the coupling problem in the subduction zone. What she did is she discretized the subduction zone using these triangular patches that you see here, embedded in this overall block model. And she said, okay, we already have data from these existing 4,213 GNSS sites, and then we're going to consider 4,000 potential new observation locations, look at all of them, and ask which ones minimize information entropy — which give us the most information about coupling. Some more details: we assume uncertainties on the onshore data, on the offshore data we might potentially collect, and also on the block model itself.

Okay. And so this is a map of how much entropy decreases — how much we have learned — as a function of where we make observations. Blue is a big decrease in entropy: good, we've learned a lot. The triangles are the existing onshore GPS network, and what you see in blue is the Juan de Fuca plate: the single most important thing you can do is go out and get a good observation on the Juan de Fuca plate, because that gives you the relative plate rate. And there's a little animation showing where you would put additional stations. Let's run that again: basically, it starts filling in the trench, because that's where all the uncertainty was in the coupling models. So, number one, figure out what your plate rate is; number two, start filling in the trench, where the most uncertainty was in your coupling model. Now, again, this was for coupling, right? If you wanted to do something else, like total moment deficit, maybe you would get a different answer.

What it says is that the first 21 additional places you would want to make observations are all offshore. It's not until the 22nd observation that it goes, you know, there's a little gap in my GPS network onshore; I could use another observation here. And so this is a plot of how much entropy decreases as you add in new additional stations, and again, the first onshore station comes 21 stations in. And you could say, okay, but what if I densified the onshore GPS network instead? How much would I learn if, instead of putting out 30 potential offshore stations, I went out and put out 30 new stations on land, at whatever locations were best to increase our observing capacity using just land-based observations? And that looks like this: it also decreases the entropy, but not nearly as much as putting out stations offshore. In fact, it would take 18 additional GPS stations on land to learn as much as you would learn by taking one offshore measurement.

So, to wrap up: the first 21 observations you would add are offshore, and one single offshore station gives as much information as you get from 18 onshore stations. Now, we didn't go the extra step and ask which makes more sense in dollars and cents, because that depends on exactly how you run the offshore instrument.
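A sketch of the greedy ranking that produces an ordering like the one she describes — my paraphrase, not Eileen's implementation; it reuses posterior_cov() and entropy() from the earlier sketch:

```python
import numpy as np  # plus posterior_cov() and entropy() defined above

def greedy_design(G_existing, candidates, prior_cov, noise_var, n_pick):
    """Rank candidate observation rows (one per potential site): at each
    step add the site whose observation leaves the lowest posterior
    entropy, i.e. gains the most information, then repeat."""
    chosen, G = [], np.asarray(G_existing, dtype=float)
    for _ in range(n_pick):
        scores = [np.inf if i in chosen else
                  entropy(posterior_cov(np.vstack([G, row]), prior_cov, noise_var))
                  for i, row in enumerate(candidates)]
        best = int(np.argmin(scores))       # site leaving the least uncertainty
        chosen.append(best)
        G = np.vstack([G, candidates[best]])
    return chosen   # ordering: e.g. offshore-style rows tend to come first
```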
How expensive it is depends on how often it has to be visited, whether you're cabling it, or whether you're buoying it. And also, the answer for where you want to put your stations depends on the quantity you want to observe. But this is a framework that lets us say: if you tell us what you want to observe, how well you need to observe it, how often you need to observe it, and where you're trying to observe it, we can tell you where you need to make observations in order to learn as much as possible about that quantity. And that's all I have. Thank you.

Thank you, Sarah, for introducing the seafloor geodesy component as well. Our next speaker is Bruce Haines from NASA: sea surface GPS, recent advances.

Okay, thanks for inviting me here today. I'm going to be trying to answer one of the questions that was raised in the briefing book about new instrumentation and techniques to help lend insight on precursory events, and this relates to GPS and sea surface systems. This is a collaborative effort between NASA JPL and NOAA PMEL, who have a lot of experience deploying buoys around the world for various experiments. This project was originally initiated with some seed funding from a NASA ROSES call, and the objectives are to design, build, and test a modular, low-power, robust, high-accuracy GPS system. We now use the words GPS and GNSS more or less interchangeably; GNSS refers to the overall set of satellite constellations, which can include the Chinese, Russian, and European constellations in addition to GPS. We want to focus on continuous and autonomous operations, move away from campaign-style buoy measurements, and also ensure that we can get the data back in near real time. And this is for both ocean and cryosphere applications. You heard from Paul about precise point positioning earlier; there have been a lot of advances in precise point positioning in the last decade that allow us to determine the sea surface height of a buoy that is isolated from reference stations, and we want to take advantage of that. We also want to explore potential scientific benefits, especially in physical oceanography, but also weather: we can measure precipitable water from the buoy very accurately, which has important implications for forecasting atmospheric rivers; also space weather — the ionosphere, obviously, from the GPS measurements — and potentially also traveling ionospheric disturbances, which have been used for seismic purposes.

What I'd like to try to convince you of is this: people have been doing GPS buoy research for a long time now, some 30 years — I think the first paper was published in the late 1980s. So what makes the time unique now? I think the time is ripe for the development of a global GNSS ocean network, and there are several things that have come together recently to make this the right time. One is the emergence of these new satellite systems, which I referred to. This is important for kinematic applications, because a buoy can be pitching and heaving pretty violently in waves, so you want as many observations as possible; if you're tracking more than just GPS — BeiDou and the other constellations — you have a better observing geometry. The second factor is advances in miniature high-accuracy receivers. When we first started doing this a couple of decades ago, the scientific-grade receivers were bigger and drew more power.
Now we have credit-card-sized scientific-grade receivers that draw about a watt. Then there are the innovations in precise point positioning I mentioned earlier, enabling high accuracy without dedicated reference stations, and also a lot of new potential robotic ocean platforms: we have Saildrones, Liquid Robotics has the Wave Glider system, and I'm also showing one of the advanced NOAA buoys — the one we're actually using is from the DART system. So there are broad scientific and societal benefits, as I mentioned earlier: not just for seafloor geodesy, but for studying sea level, for calibrating satellite altimeters, for measuring the properties of both the neutral atmosphere and the charged particles, and also for natural hazards, of course, which is why we're here today.

So here's our prototype buoy. It's got a tiny credit-card-sized GPS receiver, shown in the upper right there, that draws about one watt and tracks all of the satellites I described earlier: not only GPS but also GLONASS, BeiDou, and the new Galileo system. It's got a miniaturized digital compass and accelerometer inside. To get the data back in real time, we use Iridium communications; we can't yet get back the one-hertz data, so we're pulling back lower-rate data — I'll describe that in more detail later. It's adaptable to multiple floating platforms, not just buoys but hopefully also Saildrones and wave gliders, and it enables geodetic-quality solutions without nearby reference stations. I'm not going to go through the development and testing in detail here, but suffice it to say we've taken baby steps: we first put the buoy out in an enclosed lake, then we moved to Puget Sound, and then we did a bunch of open-ocean tests, and we've had close to 500 successful buoy-days of tracking using the system. In the interest of time, I'll compress the talk, skip the next five slides, and move right to the results from the last deployment.
Maybe I'll stop briefly on this slide, which gives you a sense for what the measurements and the solution look like. The raw data come back to us — the buoy is not computing its height; we take the raw data and do precise point positioning — and what you can see here is a time series over a couple of days of the buoy height, represented by these tiny green dots. It's highly scattered, but that's not an error in the GPS solution; that's just because we are sampling individual waves, so the buoy is going up and down with the waves, and it's also going up and down with the tides. And these dots here represent the geocentric sea surface height from the radar altimeter on the satellite as it flies over the buoy. You can see there's very good agreement: you have to do some smoothing of the wave effects, but when we do this, we get agreement at the two-to-three-centimeter level between the geocentric height from the buoy and the geocentric height from the satellite.

Okay, so this is the last test that we just completed, and I want to introduce you to the Harvest platform. This is off the coast of Southern California, right at the tip of Point Conception, right about where the California coast goes from east-west trending to north-south trending. It's a great location for a stress test of the buoy, because there are very active seas out there: it's basically at the intersection where the Santa Barbara Channel enters the Pacific Ocean, and as you can see from this picture, especially in the winter we get some very big storms that are typical of the open ocean. So it's a great test location, because it's close to shore but we have high seas, and it has also served as an altimeter verification site since 1992, so we have multiple GPS receivers and multiple tide gauges which we can use to verify the buoy results. What we did is we deployed a couple of these buoys up by the platform, shown in this picture here. This is the satellite ground track from the Jason satellite; it traces this ground track exactly every 10 days. Here's our oil platform, and the two buoys were moored in about 300 meters of water, about a kilometer and a half apart, so the three form a triangle: we have GPS and tide gauges on the platform, fixed to the ocean floor, and here we have the buoys, so we can make comparisons between the two. We put these out there in August of last year, with the idea of getting through at least some of the winter, so we could see some really high sea states and find out when the buoy performance started to degrade. We also equipped the buoys with load cells, because one of the issues we have is that the waterline changes as the mooring pulls on the buoy: if you have very high tides and active seas and the mooring is on the edge of the watch circle, the buoy is going to get pulled into the water, and if you don't know that, it's going to make it look like the sea level is dropping. So we put a couple of load cells on the buoys, and we also upgraded the telemetry so that we got one-minute snapshots of the GPS tracking data, which allows us to do decimeter-level positioning — but we really need the one-hertz data to get down to the centimeter level. So this is just a plot showing, over a hundred days, the comparison between the two buoys and the tide gauge in the top panel, and you can't tell the difference, essentially; these are peak-to-peak variations of two meters that are due to ocean tides, essentially semidiurnal and diurnal ocean tides.
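A synthetic illustration of the wave-averaging mentioned above — every number here is assumed for illustration, not taken from the deployment: 1 Hz heights scatter at the meter level because they sample individual waves, yet the mean over an hour recovers the underlying level far more precisely.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(0.0, 3600.0)                   # one hour of 1 Hz epochs
sea_level = 0.80                             # "true" height, meters (assumed)
waves = 1.0 * np.sin(2 * np.pi * t / 8.0)    # ~8 s swell, 2 m peak to peak
noise = rng.normal(0.0, 0.03, t.size)        # few-cm kinematic GPS noise
h = sea_level + waves + noise                # what the 1 Hz solution sees

print(h.std())                    # ~0.7 m: scatter dominated by the waves
print(abs(h.mean() - sea_level))  # ~mm level: the waves average out
```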
All systems line up pretty well, and if you look at the differences, what you see is agreement at about the two-centimeter level. These are hourly averages of data, so you can envision taking averages over longer periods — over several days — to further reduce the errors. What you do see, if you look at the bottom panel, is a significant correlation with the significant wave height. The typical wave height at Harvest is a couple of meters, sort of what we see in the open ocean, but as I mentioned earlier, during occasional periods it reaches six meters, and you can see there are some issues with the solution; it starts to break down at that point. We could have sea spray and all kinds of things going over the top of the buoy. This is a good thing — we wanted to see when this would start happening — and we feel very comfortable with the results up to significant wave heights of about four meters; after that it gets a little bit dodgy with this particular platform. The other thing you notice is a systematic error related to wave height that remains in both of the solutions. Some of this could be coming from the tide gauge — it's very hard to measure water level at the platform because of these waves — and some of it is probably due to the buoy. If you look at the differences between the two buoys, you can see we're at the sub-centimeter level: there is some systematic error in there, but in terms of relative positioning in height, it's very accurate, and that has a lot of applications.

So, to summarize, the results are very promising. The comparisons with independent observations — not only tide gauges but satellite radar altimeter measurements from spaceborne systems — suggest the accuracy in height is close to two centimeters. We think the horizontal accuracy is probably similar, though we don't have a means of validating that. For relative height, we're down below a centimeter. And hopefully you'll agree that this might provide a model for an improved GPS-Acoustic observation system for seafloor geodesy: it enables accurate and continuous access to the reference frame, does not require a dedicated reference station on land, and capitalizes on a simple power-conserving design to provide accurate positioning and orientation for a hydrophone, reducing the burden of costly campaign-style ship-based measurements, and it's adaptable to other platforms. We're working on this last part, and hopefully we'll have some data from a Saildrone in the near future. The other thing I want to mention is that it's not just the height and the position: we can also measure total electron content, which enables characterization of ionospheric disturbances from tsunamis and possible precursor events. Then there are some challenges we're looking at — I think they relate mainly to telemetry and power, but advances in these areas are occurring so quickly that I'm confident we'll be fine; the new Iridium NEXT system is supposed to enable ten times the throughput for these transmissions. Most of our deployments so far have had no solar panels, just batteries; we're going to do a one-year deployment with some new batteries, but we also have designs that reflect the addition of solar panels. So I think I'll conclude with that, and take any questions — or not. Okay.

Alright, our next speaker, I think, is actually speaking to us from his car: it's Bud Vincent from URI, we're told. Can anyone hear me? Yes, we can. Okay, great.
Will you be sharing your slides? There we go. Can you see my slides? Yes, we can.

Okay. Sorry I couldn't be there in person; I'm on the road — I'm stopped in a parking lot in New Jersey, I'm sorry — but telecommunications is a wonderful thing. I just left a three-day conference at Pennsylvania State University on underwater acoustic transducers. That's one of my areas of interest, but it's not what I'm here to talk about, although it is somewhat relevant. I'm here to talk about the use of GPS and underwater acoustics to facilitate seafloor geodesy, and specifically some of the work that I've done over the last 20 years on seafloor geodesy. I suspect most people have never heard of me or my work, because most of my work has been done for the Navy. So what I'd like to do is go through that a little bit, and then perhaps leave off with some suggestions for future work. It ties in nicely with some of the slides that I just saw regarding robotic platforms, because I agree 100% that the time is ripe right now to bring these robotic platforms into the picture to facilitate economical data collection in support of seafloor geodesy.

So, my bottom line up front — and I'm sure there'll be more discussion when we have the panel discussion — is that, in my opinion, long-term seafloor geodesy is economically feasible by combining three things: underwater acoustics, GPS, and autonomous surface vehicles. But I will challenge the status quo, which right now always defaults to using transponders: there are other seafloor sensors that are available and are better for long-term seafloor geodesy, specifically beacons and hydrophones, and I'll discuss briefly the differences between them and why you might want to consider those alternatives. I'm going to tell you that using a hydrophone with an acoustic modem offers the opportunity to provide the highest-accuracy geodetic measurements on the seafloor, as well as the lowest power for long-duration deployments — and by long duration I mean at least 25-year deployments of the seafloor sensors, and these are not sensors cabled back to shore.

So what is a transponder? A transponder is a device that transmits a reply pulse after receiving an interrogation pulse. Whether it's a wave glider, a Saildrone, or a manned surface vessel, you transmit from the surface, it gets received on the bottom, and the bottom transmits a reply back — for every single ranging measurement that you want to obtain. A beacon, on the other hand, is a device that sits on the seafloor; you send a signal to it once from the surface, and then it transmits repeatedly for an extended period of time afterwards — typically the observation campaign, while your surface platform maneuvers in a geometry around the seafloor sensor. For the Navy ones we've done, we let these beacons run for, say, four hours, and then we're out of the area and have moved on to the next one. A hydrophone just receives the pulses.
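As a back-of-envelope aside on where this distinction is headed — every number below is my assumption for illustration, not a figure from the talk — the energy arithmetic looks roughly like this:

```python
# Why a receive-only hydrophone plus modem wins on power: high-power
# acoustic replies dominate a transponder's energy budget, while
# listening costs milliwatts. All values are hypothetical.
BATTERY_WH = 2000.0                        # assumed primary battery pack

# Transponder: replies to every interrogation at high source power.
TX_WATTS, PING_S, PINGS_PER_DAY = 200.0, 0.01, 20000
transponder_wh_day = TX_WATTS * PING_S * PINGS_PER_DAY / 3600.0

# Hydrophone + modem: listens continuously at mW, uploads a daily summary.
RX_WATTS, MODEM_WATTS, UPLOAD_S = 0.005, 20.0, 60.0
hydrophone_wh_day = (RX_WATTS * 86400.0 + MODEM_WATTS * UPLOAD_S) / 3600.0

for name, wh in [("transponder", transponder_wh_day),
                 ("hydrophone + modem", hydrophone_wh_day)]:
    print(name, BATTERY_WH / wh / 365.0, "years")  # ~0.5 yr vs ~12 yr here
```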
Why is that important? Acoustic transmissions require high power — hundreds of watts, especially if we're going to talk about capabilities down to full ocean depth, on the order of 6,000 meters — but acoustic reception on the seafloor only requires milliwatts of power. So if we can minimize the transmissions from the seafloor, we can have a very small package that lasts for a very long time, and we can get a very large number of observations that allow us to compute positions. We do have to get the data from the seafloor to the surface, and that's why a hydrophone coupled with an acoustic modem represents the lowest possible power per slant-range measurement — the observable we use to make geodetic position computations.

Typically, what was done in the past, when I initially got into this business back in the 90s, was to drive out with a manned surface ship, with hydrophones down on the seafloor, and the hydrophones could be time-synchronized with the transmissions from the surface ship because of GPS. So we have to solve the problem of the lack of time synchronization if we're going to have an untethered device sitting on the seafloor — and that's why transponders have typically been used in the past. Now, though, as I'll comment on, chip-scale atomic clocks have been developed that give us very precise time synchronization on the seafloor in a very small, battery-powered form factor.

The traditional positioning models that have been used in seafloor positioning and in GPS consist of spherical models — these are long-baseline multilateration models — and hyperbolic models. The reason there are two each for spherical and hyperbolic has to do with the number of transmitters and the number of receivers. Can you see my mouse? Okay. Here, the subscript indicates how many observations I'll be making: this is the time of emission, and if I have a single time of emission and multiple time-of-arrival measurements, I would employ this model; if I have many times of emission and also many times of arrival, I would apply this model. This one here, for spherical, and this one here, for hyperbolic, is the one for GPS at a fundamental level — and of course for GPS with your cell phone or handheld receiver, because you don't carry around an atomic clock synchronized with the clocks in the satellites, you have to solve for the user clock bias in your local GPS receiver. And I'll skip ahead. I just want to remind you that we only have a few more minutes, to save our discussion time. Sure, okay, I'll cut to the chase.

We've developed two new models that allow us to remove the single most important error contribution, which is the effective sound velocity — the sound speed in the ocean. In these two new models, we estimate not only the geocentric position of the acoustic receiver in an Earth-centered, Earth-fixed coordinate system, represented by x, y, z, but we also solve for a timing bias as well as a sound speed bias. We conducted a series of computer simulations and predicted what the bias errors would look like and how we would remove them, and that's indicated here for the various models: the traditional spherical and hyperbolic least squares, and then the spherical and hyperbolic least squares with the sound velocity bias estimation. And this is some data from back in the late 90s, where we went to sea in a depth of about 1,700 meters and put sensors down on the seafloor, cabled back to shore, to validate these models.
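A sketch of the augmented spherical model just described — my reconstruction under simplifying assumptions (straight rays, a single effective sound speed), not the speaker's algorithm: Gauss-Newton for the receiver position plus a clock bias and a sound-speed bias, from one-way travel times.

```python
import numpy as np

def solve_position(src, t_obs, x0, c0=1500.0, iters=10):
    """Estimate p = [x, y, z, clock bias dt, sound-speed bias dc] from
    one-way travel times t = |src - x| / (c0 + dc) + dt."""
    p = np.array([*x0, 0.0, 0.0], dtype=float)
    for _ in range(iters):
        d = src - p[:3]
        r = np.linalg.norm(d, axis=1)
        c = c0 + p[4]
        t_pred = r / c + p[3]
        J = np.zeros((len(src), 5))
        J[:, :3] = -d / (c * r[:, None])   # d(t)/d(position)
        J[:, 3] = 1.0                      # d(t)/d(clock bias)
        J[:, 4] = -r / c**2                # d(t)/d(sound-speed bias)
        p += np.linalg.lstsq(J, t_obs - t_pred, rcond=None)[0]
    return p

# Synthetic test: a spiral ship track above a receiver at ~1700 m depth,
# true sound speed 1505 m/s (dc = +5) and a 0.2 ms receiver clock bias.
ang = np.linspace(0.0, 4 * np.pi, 48)
rad = np.linspace(300.0, 3000.0, 48)
src = np.column_stack([rad * np.cos(ang), rad * np.sin(ang), np.zeros_like(ang)])
truth = np.array([30.0, -40.0, -1700.0])
t_obs = np.linalg.norm(src - truth, axis=1) / 1505.0 + 2.0e-4
print(solve_position(src, t_obs, x0=(0.0, 0.0, -1600.0)))
# With noiseless synthetic data this converges to approximately
# (30, -40, -1700, 2e-4, +5): both biases are separable from position.
```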
This was done down at the US Navy facility in the Tongue of the Ocean in the Bahamas, and we did about a hundred of these seafloor sensors: everywhere you see a crossing pattern of the ship, there's a seafloor sensor, and the bottom line is really the same. This is a comparison of the sound speed bias, and we are getting sub-centimeter positioning using the combined acoustic and GPS data; this is a comparison with carrier phase — post-processed carrier-phase data. Let me jump ahead: this is portable sensors — we extended this technique to portable sensors, again with one-way acoustic transmissions. And the conclusions — and we can have a conversation about them: we have developed low-power acoustic seafloor instruments over about three generations for the Navy. With our models and our algorithms, because of the bias removal in these new models, we believe we are well below a centimeter in terms of geodetic position. The surface system with the autonomous vessels can be used for the transmissions, and if we put hydrophones with an acoustic modem on the seafloor, they are economically feasible and they can last an extremely long time. On the left here is an example of a seafloor sensor; this one could easily run for ten years. And just in case you're not aware, Saildrone is not the only option for unmanned wind-powered vessels — these are datamarans, and they can stay out at sea transmitting acoustic ranging signals to the seafloor, picked up by this seafloor sensor on the left, which then sends the data by acoustic telemetry back to the datamaran and back to shore. And with that, I will answer any questions now, or defer until we have the panel discussion.

So Isabelle and Bud are on the line, and Sarah and Bruce are here at the table, and we're opening the floor to questions for any speaker in this first part of panel two. Questions?

I have a question for either Bruce or Isabelle: the gravity changes related to this precursor candidate before Tohoku — would they produce sea level changes large enough to be measured by Bruce's system? I don't know how big that is, so I'll defer. The coseismic change is at about the millimeter level on the geoid at GRACE resolution, so if this is a large-scale anomaly, it should be below the millimeter level, I think; otherwise we would see it more easily in the geoid data directly, without performing a gradient analysis.

Another question for Dr. Panet: in terms of the precursory signal, I suppose nothing was seen for the large Indonesia earthquake, where the coseismic part has been discussed for GRACE for a while — is that correct? I'm aware only of this study on the Tohoku earthquake, and we are studying the Maule earthquake, but for Indonesia I don't know.

I have a question for Bud — two quick questions. You mentioned a horizontal accuracy of about 0.1 centimeters; what's your vertical accuracy, that's the first question. And the second is, what's the cost of the system you showed the picture of at the end — the seafloor hydrophone and modem, and then the surface sail system, whatever you use? Can you hear me okay?
Vertical accuracy is determined by the vertical accuracy of the GPS. For prior work that I did, due to uncertainty in the geoid height model as well as tidal corrections, we felt we were at about 10 centimeters vertically. I think that can be improved now, but it's directly related to the GPS and not so much to the acoustic ranging measurements per se. For the horizontal, we actually think we can do better than a millimeter with enough observations, with the right distribution around the seafloor sensor. As for the cost: the expensive piece right now is the chip-scale atomic clock, and those prices have just gone up recently, from about $1,500 to $4,500, so the seafloor sensor is going to be about a $30,000 instrument. We are looking at some recently developed clocks that compete with the chip-scale atomic clocks, and if we can model the drift on those sufficiently and remove it, we may be able to bring the price down by another few thousand dollars. The surface systems I don't have a cost on, because most of the wind-powered surface vessel providers will not sell you a vessel; they want to sell you services. Thank you.

I'll just reinforce that, with our system, we do think now that the state of the art for GPS vertical in the water, with a properly designed system, is a couple of centimeters "instantaneous" — and by that I mean averaging high-rate data over about an hour. If you average over multiple days, we think it's probably closer to one centimeter, but that's something we still have to demonstrate, and it depends on the platform too: we understand our system very well now, but we need to deploy it on gliders and Saildrones and see what we get.

Yeah, I wanted to follow up with Isabelle. You showed that the coseismic signal is sort of smeared along strike, much farther than the predicted signal, and I'm wondering about the time domain: is your filtering in the time domain fully causal, or is it possible that some of the coseismic signal is being aliased into the pre-seismic period? You're speaking of the coseismic? Well, I said the coseismic was clearly smeared out laterally — it extends much farther than the prediction — and I'm wondering whether the temporal filtering could project some of the coseismic into the pre-seismic period. I think there are two levels: one is in our analysis, and the second is before our analysis, in the way the geoid models are built month by month. The fact that we detect an anomaly in time series truncated in February 2011 — so before the step — means that there is no filtering effect in the analysis we have run itself; that's why we carried out two kinds of analysis. Then there is the way the geoid models are built, before we start anything: each month they are computed independently from the next month, and the orbits are also computed day by day, so we do not expect any kind of propagation — but then, there could be something we would not think about.

It's Ben Phillips. Ben, can you hear me?
Yes, this is Ben Phillips at NASA headquarters, with a question for Bruce and Bud. I'm interested in the error budgets for the seafloor and sea surface problem, including the sea-surface-to-satellite component, and where you see — in addition to these great technology developments that seem to improve the logistics and cost case — opportunities for improving precision, bringing down integration time, those sorts of things.

I think one of the biggest challenges is monitoring how much the water level is changing relative to the phase center of the GPS antenna; also multipath, and the separation of the tropospheric signal from vertical height for an isolated buoy. But I would say the water-level measurement might be the biggest component of the error budget, and that's why we're spending effort putting these load cells and other systems on there. We also use the digital compass so we can precisely model the orientation of the buoy, in much the same way we do with a satellite in space: we're always monitoring the position of the antenna phase center relative to the reference system of the buoy. Those are the areas we're focusing on.

Do you have to worry about monument settling? Do these things sink into the mud? Well, the buoy is just floating on the water; for the underwater part I'd have to defer to Bud Vincent. It's certainly possible that the sensor could settle into the sediment, and that would have to be distinguished somehow from the seafloor changing its height. You can put pads down on the feet of the sensors, and you might be able to do something with a very short-distance laser to measure local settling of the device into the sediment on the seafloor. It's difficult. There has been some discussion about setting up a benchmark by driving a long pipe down into the seafloor so that it doesn't settle vertically, and I believe that's being done in the oil and gas industry, in fact, as they set up production areas and want to measure local deformation as they produce an area.

Other questions? If I may, I'd mention one other thing about our error budget. As far as acoustic ranging is concerned, just from an acoustic signaling standpoint, we can get better than a centimeter of resolution with the instrument. It's the bias errors that kill you, and that's why it requires modeling of the sound speed to remove that component — and I suspect something similar must happen in GPS with the propagation of the electromagnetic signal from the satellite to the receivers.

So I have a question for Sarah.
I really like this approach of asking, station by station, where the optimal place to put it is. But here's the bit I can't quite get my head around: so, wonderful, the plate rate comes first; then the next 20-ish were all right along the trench. Does the analysis really mean right along the trench, as opposed to some sort of distribution to capture the deformation? Because there's presumably some deformation within the overriding plate, and so the idea that the next 20 would all be right along the trench surprises me a little. Can you talk us through whether that's really one of the conclusions to draw?

Well, of course, you're talking about Eileen's work, so mostly I'd like to answer that question with a shadow puppet. But other than that — I mean, that is genuinely what the entropy says, right? We start with this very dense grid; it absolutely has the opportunity to put those stations farther down. You can see those color maps of the relative entropy over the whole region — all the places you could put a station — and the entropy really does grade up towards the trench, which kind of makes sense. Again, it is kind of shocking how much of it is near the trench, but at the same time, if you turn around and look at those maps of where the uncertainty was in the coupling models, they grade up to the trench too: the biggest uncertainties are near the trench, and they decrease as you get closer and closer to land. So if you ask where to put your stations, it turns out the entropy thinks it's along the trench, where those uncertainties are largest. I presume if you were to subdivide the overriding plate, you would get a different answer.

It's tessellated even in terms of the parameterization of the model, right? So there's a distribution of where you could put the station, and then there's an assumption in your model. Are you asking what the entropy is for a specific model — what happens if you change the parameterization of that model? What is it as a function of parameterization? You know, presumably if I made these patches really, really big, then you would only need a few stations along the trench in order to say, I know everything that's going on at the trench; and if I had made those patches really, really tiny, maybe it would ask for 50 stations near the trench, because, boy, there's a lot of patches out there we don't know anything about. That is the argument.

One part of the argument, though, is that a dense cabled array is not in any way the geometry informed by your analysis, right? Yes. Of course, what they're saying is that you have epistemic uncertainty — these are all model-dependent, the forward model being a particular elastic model; if you start allowing inelastic deformation at the trench, then you may get a different answer. Oh, absolutely, this is absolutely model-dependent. And again, this isn't even just a matter of elastic versus inelastic; it's also a question of the fact that, in that particular example, we said, I want to know about coupling, right? If you had said something else — I want to know about slip rate, or total slip rate, or something else — you could have gotten yet a different answer, because that requires a different set of observations, and you have different interests in different variances and covariances.

If there's one more short question — otherwise we'll wrap up now and have a 10-minute coffee break.