I'm Thorsten Becker, I'm a committee member, and I'm here to help convene the continuation of the previous panel. We are again discussing techniques and technologies related to tectonic precursors. We have the same format, four 10-minute talks, which I may have to cut off to stay on time, and we would like to save the questions until the end as a joint Q&A. We're going to start off with Spahr Webb from Lamont-Doherty, who is presenting remotely and will talk about seafloor pressure sensors. Hi, how are you doing? I'm doing well. We can hear you. Shall I share a screen, or do you want to do it? Yeah, we see your slide. All right.

I'll be talking about seafloor pressure sensors with respect to possible precursors of large subduction events. Can I advance the slides, or will you? I'll do that. All right. Let me start off by talking about the different kinds of pressure sensors. Say again? Somebody was blowing their nose. We're going to talk about the pressure sensors that are routinely used on the seafloor, mostly in OBS fleets. The first two here are primarily used for seismology; they have no long-period response, so you can't look at long-term geodetic signals. First we have the hydrophone, useful from about 10 seconds to quite high frequency, and we use it for looking at earthquakes and tremor. It is accurate and can reach the seafloor noise floor over its band. The differential pressure gauge became the first broadband sensor widely used in the OBS fleet. Its useful band runs from perhaps 1000 seconds to 50 hertz, and we use it for general seismology: Rayleigh waves, P waves. One of its most important uses is removing wave-loading noise from vertical-component seismic data from the seafloor, which can greatly improve the signal-to-noise ratio for long-period signals like Rayleigh waves. Its known problems: its calibration is variable and often not very well known, and of course it only sees fluctuations in pressure, so you can't measure geodetic signals or total depth. Next slide.

These next two sensors are really the same sensor. They're based on the Paroscientific pressure gauge, the most stable sensor out there currently. It measures absolute pressure, or depth, so it is useful from essentially zero frequency up to higher frequencies. The bottom pressure recorder is the way it has been implemented for years, with a sampling rate up to perhaps one hertz, more typically one sample every 10 seconds. The APG is just the term used to describe a much higher-resolution version of the BPR. It allows much higher sampling rates, so we're using it in our fleet up to about a 50 hertz Nyquist. Its sensor noise level is somewhat higher than a DPG's. It's used for the same sorts of things in seismology as a DPG, but it's much more accurate and stable. The main reason we've been working so hard on the absolute pressure gauge is that it has essentially infinite dynamic range, so if you want to look at signals within the source region of a large earthquake, or at a tsunami within the source region, the APG can do that. It has a very accurate, stable, and well-known response, and because it has response down to DC, you can use it for geodetic observations of vertical uplift. Next slide.

The type of precursors we're trying to look at would be anything that produces a vertical excursion of the seafloor and thus causes a change in the seafloor pressure.
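As a rough aside, not from the talk: the link between seafloor uplift and bottom pressure is simply the hydrostatic relation delta-p = rho g delta-h, so roughly one centimeter of seawater corresponds to one hectopascal. A minimal sketch, with nominal values assumed for seawater density and gravity:

```python
# Rough illustration (not from the talk): convert a bottom-pressure change
# into an equivalent vertical motion of the seafloor using the hydrostatic
# relation delta_p = rho * g * delta_h.  rho and g are nominal assumptions.

RHO_SEAWATER = 1025.0  # kg/m^3, nominal seawater density
G = 9.81               # m/s^2

def pressure_to_uplift(delta_p_pa: float) -> float:
    """Seafloor depth change (m, uplift positive) implied by a pressure change (Pa).

    A drop in bottom pressure corresponds to uplift, hence the minus sign.
    """
    return -delta_p_pa / (RHO_SEAWATER * G)

# Example: a 1 hPa (100 Pa) drop in bottom pressure is roughly 1 cm of uplift.
print(pressure_to_uplift(-100.0))  # ~0.0099 m
```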
The difficulties with using pressure gauges for this purpose: the drift of a pressure gauge can be quite large, several centimeters per year. We seem to have a solution now, called A0A, which I'll describe briefly. The much bigger problem is that oceanographic signals are large. If you're trying to see a change in the depth of the seafloor and the sea surface is moving up and down, that gives you a signal that obscures the changes in depth. The big oceanographic signals are tides and waves, mesoscale eddies, and, if you're worried about things like secular uplift rates or secular strain rates, long-term changes in ocean currents. We're looking at some solutions: I'll describe the reference-site approach as one, and we're also looking at ocean modeling and at whether improved observations within the ocean can be used to remove this oceanographic noise from geodetic observations. Next slide.

The first system developed that could accurately remove drift from pressure gauges was built by Glenn Sasagawa and Mark Zumberge. It uses a dead-weight tester within the pressure case to produce an accurately known pressure signal. Your pressure gauges normally look outside the case, measuring seawater pressure. Periodically, perhaps once a month, you spin up this system within the pressure case, switch a valve so that your gauges measure the pressure produced by the system inside the case, and use that to measure the drift of your sensors, which you can then subtract from your gauge record. Next slide.

People then realized that it wasn't necessary to compare your pressure gauges against a pressure comparable to seawater pressure: the drift in these gauges tracks the drift at low pressure pretty well. So your pressure sensors measure the pressure outside the case most of the time, and you periodically switch a valve so that they measure the pressure within the pressure case, which is typically on the order of an atmosphere. You measure the pressure within the case with a barometer, measure the drift at low pressure, and use that to correct the drift at high pressure. One of these systems has been running on the MARS cable in about 900 meters of water; it was built by Wilcock and others at UW. The blue curve on the lower left shows the difference between the pressures measured by the two gauges over a period of about two years. You see that the two gauges drift apart by about four centimeters (3.6 hectopascals) over those two years. But if you apply the drift correction measured at low pressure, the two then track each other to better than a millimeter per year, which is really quite spectacular. Next slide.

Here's an example of using pressure gauges to look at a possible precursory event, in this case a slow slip event off New Zealand. This is a project led by Laura Wallace. Next slide. The situation in the North Island of New Zealand is that the Pacific Plate is subducting beneath the North Island at about five centimeters per year. If you look at a GPS site at Gisborne over a period of time, you see it moving westward at about five centimeters per year for most of a year, and then you see it jump back eastward over a period of about two weeks. This is one of these slow slip events.
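To make the low-pressure (A0A) drift correction described above concrete, here is a minimal sketch under stated assumptions; the function and variable names are illustrative and this is not the actual UW implementation. It assumes the gauge is periodically valved to the case interior, whose pressure a barometer measures independently, and that the drift estimated there can simply be interpolated and subtracted from the seawater record:

```python
# Minimal sketch of an A0A-style drift correction (illustrative only, not the
# actual implementation).  During each calibration the gauge is valved to the
# ~1 atm interior of the case, which a barometer measures independently; the
# gauge-minus-barometer difference is the drift at low pressure, assumed to
# track the drift at full seawater pressure.
import numpy as np

def a0a_correct(t, p_seawater, t_cal, gauge_at_cal, barometer_at_cal):
    """Subtract the low-pressure drift estimate from the seawater record.

    t, p_seawater     : times (s) and gauge pressures (Pa) while open to the ocean
    t_cal             : times (s) of the periodic calibrations
    gauge_at_cal      : gauge readings (Pa) while valved to the case interior
    barometer_at_cal  : barometer readings (Pa) of the case interior
    """
    drift_at_cal = np.asarray(gauge_at_cal) - np.asarray(barometer_at_cal)
    # The drift varies slowly, so interpolate it onto the full time axis.
    drift = np.interp(t, t_cal, drift_at_cal)
    return np.asarray(p_seawater) - drift
```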
If you look at the site at Gisborne over long periods of time, it ends up pretty much where it was at the beginning of the record. What that suggests is that, despite the subducting Pacific Plate, the coupling between the downgoing plate and the overriding plate is essentially zero over long periods of time. However, the GPS data on land provide essentially no constraint on what's happening far offshore. The question remained: was the plate locked near the trench? To look at this problem (next slide), we put out a whole bunch of pressure gauges and some seismometers offshore of Gisborne and left them on the seafloor long enough to observe one of these slow slip events, which happened in late 2014. Next slide.

What you see on the lower left in blue is one of the GPS sites showing the slow slip event happening. The curves above it are the pressure data from a number of the pressure sensors, inverted, so they're plotted as a change in elevation. What you see is the seafloor going upward during the slow slip event, as the overlying plate moves back up the sloping interface. What we discovered is that the slip does indeed extend quite close to the trench, and we're seeing more than about 25 centimeters of slip on parts of the interface. The pressure data, as you can see, are quite noisy; the signal-to-noise ratio of these observations is really quite low. Next slide.

The problem is oceanography. This shows some of the raw pressure records. What you see first are tides; in this case they're quite small, only about 20 centimeters peak to peak. You can simply low-pass filter these records with a cutoff of about two days or longer, and when you do that, you remove most of the tides. Next slide. What you're left with is a fairly large signal, about 10 centimeters or more, associated especially with mesoscale eddies. These signals are coherent across most of the array, and if you were looking for the slow slip event in there, it would be fairly hard to pick out. To get around this problem, we use what are called reference sites. Next slide. Two of the sites were on the incoming plate; those sites should not be affected by the slow slip event. So we took those two stations, averaged them, and subtracted them from the rest of the sites. At that point you can see the uplift associated with the slow slip event in the data, but, as I noted, the signal-to-noise is still quite small. These events in New Zealand are really quite large; that's what makes it possible to see them despite the large signals from the oceanography. Next slide.

We've looked at other options for removing oceanographic noise. The first is to use predictions from oceanographic models, or models driven by the local wind. The other thing we've been looking at is adding additional measurements of the oceanography, in particular current meters near the seafloor, to try to remove these oceanographic signals so we can see the geodetic signals. Next slide. The oceanographic models actually do quite a good job of predicting the seafloor pressure data. This is from Muramoto et al. The red curve here is the prediction from the model, and the black curve is an example of one of the seafloor pressure records. You see that the oceanographic model predicts something like 50 to 70 percent of the variance in the seafloor pressure data.
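A minimal sketch of the two processing steps just described, assuming equally sampled, equal-length pressure records and a simple Butterworth filter (the details are illustrative, not the project's actual code): low-pass filter with a roughly two-day cutoff to suppress tides, then subtract the average of the incoming-plate reference sites:

```python
# Minimal sketch (illustrative, not the project's code) of the two steps just
# described: low-pass filter each record with a ~2 day cutoff to suppress
# tides, then subtract the mean of the incoming-plate reference sites, which
# should carry the oceanographic signal but not the slow-slip signal.
import numpy as np
from scipy.signal import butter, filtfilt

def lowpass_two_days(p, dt_seconds):
    """Zero-phase low-pass filter with a cutoff period of two days."""
    nyquist_hz = 0.5 / dt_seconds
    cutoff_hz = 1.0 / (2.0 * 86400.0)
    b, a = butter(4, cutoff_hz / nyquist_hz)
    return filtfilt(b, a, p)

def remove_ocean_signal(records, reference_stations, dt_seconds):
    """records: dict of station -> pressure series (same length and sampling)."""
    smoothed = {k: lowpass_two_days(v, dt_seconds) for k, v in records.items()}
    reference = np.mean([smoothed[k] for k in reference_stations], axis=0)
    return {k: v - reference for k, v in smoothed.items()}
```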
Those are the boxes here on the right. The one labeled ocean model is the fraction of variance reduced, and it runs from about 50 to 75 percent. However, that's not enough to clearly see the slow slip event, as shown on the lower left, where we're trying to model the slow slip event. We know when it occurs, but in this case the signal-to-noise is such that I don't think I'd try to publish it. The reason the reference sites work is, again, looking at the lower right, that the reference site explains up to 95 percent of the variance, particularly for the near-trench sites. So we're able to remove most of this oceanographic noise and see the slow slip event. (That's fine, but we need to move toward wrapping it up.) Yeah, fine, one more slide. Next. So the question is, can we use additional observations of the oceanography, like current meters? There is some hope that additional observations can be used to remove this oceanographic signal. Geostrophy says that the pressure gradient between two sites should be proportional to the current velocity perpendicular to the line between the two sites. A small test experiment has been done suggesting that with additional data you could see a slow slip event on the order of two centimeters, and perhaps with improved data a slow slip event on the order of one centimeter could be detected. Last slide. Conclusions: pressure sensors can see vertical excursions, and new methods are solving the drift problem, but the oceanographic noise remains a large and difficult problem to overcome. That's it.

Thanks a lot, Spahr. We're going to hold questions until the end of the talks, so I'd like to move on. The next presentation is another remote one, by William Wilcock at the University of Washington, and he'll tell us about the use of seafloor cabled observatories to study precursory phenomena. Are you there? Yes, I'm here. Great, we can hear you. Okay, I'm trying to figure out how to share my screen. Do you see my screen? Not yet. Yep. Do you see my slideshow? Okay.

So I'm going to be talking about the use of seafloor observatories. If you want to study precursory phenomena, you've got to be able to make sustained and reliable observations, to characterize the baseline and to be there to observe the events themselves. And if you're looking for very subtle signals, I think it behooves you to be able to support multiple methods of making those observations. While real-time observations are only really required if you want to either densify your observations or issue some kind of notification, I think they're beneficial because they enable you to identify broken sensors and fix them quickly, and they enable more efficient, steady processing of data as it comes in. Now, cabled seafloor observatories satisfy many of those requirements. They're built on telecommunications technology, which is extremely reliable and can support long-term observations. They can provide more power and bandwidth than most people would know what to do with, in real time. As people have mentioned earlier in the proceedings, their big drawback is that they're expensive to install, but once that capital has been spent, they have relatively low costs to operate compared with many other methods of obtaining observations. Now, the Japanese lead the world in terms of using cabled seafloor observatories to study geodynamics.
And their two major observatories illustrate the different approaches you can take. The DONET system, which extends into the Nankai Trough, uses a connectorized approach, whereby you deploy the trunk cable and various spurs with a cable ship, and then use a research vessel and remotely operated vehicles to attach junction boxes, shorter cable segments, and the instrument nodes that house the sensors. The DONET system, I think, has about 50 nodes now supporting seismic and geodetic observations. The big advantage of this is that you can extend it and add new sensors as they develop, and indeed that's happening. One of the big things that's been done with DONET is that they've linked several of the nodes to very sophisticated borehole observatories, which provide the quietest setting for seismic observations and the setting for the most sensitive geodetic observations. The other approach is to integrate the sensors into the cable itself, all laid by a commercial cable ship, and that's the approach that's been taken in the Japan Trench with the S-net system. There they have 150 nodes, each one in a pressure case roughly eight feet long and about a foot in diameter, housing redundant pressure and seismic sensors, and it all gets laid in one go. It's certainly cheaper per observational site and cheaper to operate, and if it's done well it should be very reliable, but you lack flexibility: you can't enhance the system, and you can't fix broken sensors very easily without going out with a cable ship and recovering the cable.

Now, there are two observatories operated off the Pacific Northwest, one by the Canadians and one by the U.S. Since time is short, I'll just talk about the U.S. one, the National Science Foundation's Ocean Observatories Initiative Cabled Array. It has two spurs that extend from Pacific City, Oregon: one goes out to Axial Seamount, and the other goes offshore and then loops back up onto the slope and shelf off Newport, Oregon. The installation comprises 900 kilometers of cable and seven primary nodes, all installed commercially by a cable ship. These nodes handle 10-gigabit Ethernet and eight kilowatts, and when you open the doors on the trawl-protected frame, you can access a series of ports with wet-mateable connectors. That allows you to install secondary infrastructure, and that was all done with an academic research vessel and academic ROVs. Essentially you can lay short segments of cable, up to about five kilometers, that go to 18 secondary junction boxes. These are also very capable: they can support up to eight instruments, with 200 watts and one gigabit per second of Ethernet each. So the observatory is very sophisticated in its capabilities.

In terms of geophysical observations, the densest are being obtained at Axial Seamount. This is a schematic figure of the observatory, which basically covers the southern half of an eight-by-three-kilometer caldera at the summit of the seamount. At present there are five short-period seismometers, two broadband seismometers, and four bottom-pressure-and-tilt geodetic sensors. The observatory became operational right at the end of 2014, and within a few months the volcano erupted, in April of 2015.
And there's a pretty incredible data set of earthquakes, at rates that were high initially and then increased even more leading up to the seismic crisis that initiated the eruption. It's a pretty unique data set of earthquakes, and it reveals that the inflation and deflation of the caldera was accommodated by motion on an outward-dipping caldera ring fault. The observations also supported a model being developed by Bill Chadwick and Scott Nooner that predicted the volcano would erupt at a specific level of inflation. And there are interesting signals that people have seen: changes in seismic velocity from noise cross-correlation, changes in shear-wave splitting, and some evidence that b-values changed in association with the eruption. Now the observatory is moving toward the next eruption, which is predicted to occur in a couple of years, and it's collecting observations over a full cycle, which will allow us to look for subtle signals that might be precursors other than the obvious ones of elevated seismicity and the inflation level.

The Ocean Observatories Initiative Cabled Array also supports a pretty extensive set of observations extending up the slope and onto the shelf off Newport, Oregon, but most of those are focused on understanding water-column processes, with some very sophisticated profiling moorings, gliders, and various other instrumentation deployed. In terms of geophysical observations, it's limited to seismic observations at the slope-base site on the incoming plate and on top of Hydrate Ridge, plus bottom pressure measurements at those same locations. Three of the primary junction boxes are not instrumented at all for seafloor geophysical observations, and I think there's a good reason why they should be. If you look at the geodetic models of the Cascadia Subduction Zone, the central Oregon region is one place where it appears that locking is not complete offshore. So it appears to be slipping, and it may be doing that through slow slip events. Most of the Cascadia Subduction Zone is seismically very quiet, but off central Oregon there are several clusters of earthquakes, which Anne Tréhu has looked at, that appear to be associated with subducting seamounts, and they lie reasonably close to the cabled observatory. I also think the observatory's outer nodes, the nodes nearer the deformation front, are very well positioned for cabled borehole observatories. So drilling here and doing something similar to what's being done in Nankai could potentially be very productive.

If you want to get even more ambitious, there's been a group of scientists and engineers at the University of Washington looking into how you might develop a subduction-zone-wide observatory for earthquake and tsunami early warning. We've looked at two potential designs for a cable system, one using the DONET style, in this case with 30 primary nodes, each supporting four secondary nodes. That costs about 500 million dollars, so it's extremely expensive. If you go to the S-net style, you can get the price down to about half that, but it's obviously still really expensive. The disadvantage of the S-net style, if you want to study precursors, is that you're limited to the instrumentation installed at the time it's put in place, probably just seismic and pressure sensors, whereas the other design would provide a lot of opportunities to enhance it.
I think if this is going to happen, it's going to be motivated by earthquake and tsunami early warning, not by precursory studies. So, in conclusion, I think cabled observatories do provide powerful tools to support studies of precursors. There's no doubt that they're very expensive, but there are some already in place, and more may come, driven by the desire for early warning and hazard assessment. And I think studies of precursors can provide additional motivation for deploying cabled observatories. Finally, I'd just point out again that the Ocean Observatories Initiative Cabled Array has proven itself for geophysical studies at Axial Seamount, but I don't think it's being fully exploited for studies of the Cascadia subduction zone, which it should be, given the expense of putting it there. Thank you.

Thanks a lot. Thanks. So, I'm a PhD student here at UC Berkeley, and I'm an affiliate up at Lawrence Berkeley National Lab. For the last five years I've been working on a new technology called distributed acoustic sensing, a fiber-optic sensing method. I think this is maybe one of the least mature technologies discussed this morning, so I want to summarize the opportunities with this figure on the left, which shows the Alaska magnitude 7.9 earthquake from last year. In the background of the image are 500 traces from a telecommunications cable deployed in the Central Valley; these traces are the DAS records of that earthquake, spanning one kilometer. In the center is the trace from a broadband seismometer. The units are horizontal velocity. What I take away from this are these large surface waves: this is a broadband recording, and it's an array measurement. If there's something you walk away with today, those are the two nuggets from this talk: this emerging fiber-optic tool, which can leverage telecommunications cables for seismology, is both an array and broadband, giving us broadband sensitivity.

So I'm going to describe what distributed acoustic sensing is, some of how we make these measurements, a few examples, and then opportunities for research. As I said, we're making these measurements on single-mode fiber-optic cables coupled to the ground. We send in a chirped laser pulse and then make measurements on the backscattered light. Rayleigh scattering is a common phenomenon in fiber optics, well studied by the telecommunications industry: you can send a laser pulse out tens of kilometers and get light back from every bit of the fiber, just from the natural scattering profile of the fiber. We then use optical interferometry on that returned light as a function of time, looking deeper and deeper into the fiber, to get a one-component array of strain sensors. Each measurement is the strain on the cable acting over about a 10-meter length of the cable, and we can shift that measurement along the fiber by as little as a meter. This gives us a massive one-component seismic array.

So how do we make these measurements? We can lay the cables in the ground ourselves. We started doing this back in 2015 down at the Richmond Field Station and in a couple of other areas in Alaska. This just requires digging a trench, in whatever geometry you want to design yourself.
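As an illustration of the gauge-length measurement just described (my own sketch, not from the talk), a DAS channel can be thought of as the along-fiber displacement difference between points about 10 meters apart, divided by that gauge length, reported on channels spaced as finely as a meter:

```python
# Illustration (not from the talk) of what a DAS channel represents: the
# average axial strain over a gauge length L, i.e. the difference in
# along-fiber displacement between points L apart divided by L, reported on
# channels spaced much more finely than L.
import numpy as np

def das_like_strain(displacement, dx, gauge_length=10.0, channel_spacing=1.0):
    """displacement: along-fiber displacement (m) sampled every dx (m)."""
    n_gauge = int(round(gauge_length / dx))
    step = int(round(channel_spacing / dx))
    starts = np.arange(0, len(displacement) - n_gauge, step)
    return np.array([
        (displacement[i + n_gauge] - displacement[i]) / gauge_length
        for i in starts
    ])

# Example: a sinusoidal displacement field sampled every meter along the fiber.
x = np.arange(0.0, 1000.0, 1.0)
u = 1e-4 * np.sin(2 * np.pi * x / 200.0)  # 200 m wavelength, 0.1 mm amplitude
print(das_like_strain(u, dx=1.0)[:5])
```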
So this is one opportunity: lay cables in the ground at a depth of less than a meter and record this data in a small field experiment. This is great because you know exactly how the cable is coupled to the soil, and you also know exactly where the cable is. The problem with this kind of direct burial is that you can't actually go to tens of kilometers; trenching that far, for example in Alaska, is quite difficult. So a different opportunity is to go out to a small telecommunications point of presence, a server room, install the instrument in a rack mount, plug into the single-mode patch panel they have there, and use the entire telecommunications infrastructure that's already available. I don't have a map today of the global distribution of fiber on Earth, but it's large, and it's estimated to be about a gigameter offshore. We've begun making some measurements offshore, which are quite interesting, but here I'm just going to take the pulse of the community and describe what we can do with some of these measurements.

Okay, so in an urban environment, this is a data set recorded in Sacramento: 20 kilometers of cable sampled every two meters, with the cable in a horizontal geometry. This shows seven minutes of ambient recording; interestingly, this time is actually the middle of the night here in California. This section is pretty far from a road. Here we're in Sacramento; this is a bridge, suspended above the Sacramento River, so the cable isn't even coupled to the ground at that point. But this area on the far right side of the plot is, and it lies along a railway and a road, and we can track cars. The numbers here are the speeds of those vehicles in miles per hour, so you can see that in this area the speed limit is 65. The train, which is monitored, has to go five miles per hour near Sacramento. This is kind of interesting: we can take any one channel in this array and cross-correlate it against a line of sensors, a hundred meters of sensors, from one virtual source, and then do ambient-noise imaging, where we have a one-sided noise-correlation gather, many channels all stitched together in one image. You can see it's one-sided, or causal, just because the energy is coming from the road on one side. Then we can do basic dispersion imaging to image the near surface over the first tens of meters.

We can also record earthquakes. A few years ago we took many different arrays and compiled the earthquake observations that we had made, and from an observational standpoint it's kind of interesting: the noise floor is low enough, and the sensitivity of the instrument is good enough, over this frequency band. That is to say, we don't actually get information about the instrument response, self-noise, or dynamic range from the companies, so we're trying to figure that out right now, but we can record earthquakes. So what can we do with this? This is what an earthquake looks like over that same section of cable. I showed this map earlier; Sacramento is down here, and the cable runs something like this for 20 kilometers.
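A minimal sketch of the virtual-source correlation step described above, assuming a DAS section stored as a channels-by-time array (names and windowing choices are illustrative; spectral whitening and other standard preprocessing are omitted):

```python
# Minimal sketch (illustrative, not the speaker's code) of the virtual-source
# step just described: cross-correlate one DAS channel against a line of
# neighboring channels, stacking over time windows, to build a one-sided
# noise-correlation gather whose moveout can feed dispersion imaging.
import numpy as np

def noise_correlation_gather(data, src_channel, n_receivers, win_len):
    """data: 2-D array (channels x time samples) of DAS strain rate."""
    n_ch, n_t = data.shape
    assert src_channel + n_receivers < n_ch
    n_win = n_t // win_len
    gather = np.zeros((n_receivers, 2 * win_len - 1))
    for w in range(n_win):
        window = slice(w * win_len, (w + 1) * win_len)
        src = data[src_channel, window]
        for r in range(n_receivers):
            rec = data[src_channel + 1 + r, window]
            gather[r] += np.correlate(rec, src, mode="full")
    return gather / max(n_win, 1)
```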
And so we're recording the P wave quite weakly, because it's polarized problematically for a horizontal array, and then the shear wave, the S wave, comes in at a later arrival time. This is from the Berkeley magnitude 4 event from last January, a year ago. This is quite interesting; there are all sorts of things you can do with it. For example, you can take the channels and, with what is basically a beamformer, delay and sum to find the peak in a grid search and locate how the waves are propagating, in both slowness and back azimuth. Before the event arrives, we're actually tracking a vehicle moving over the array; then we see the P wave coming in right underneath us, with surface waves and S waves coming in later. This speaks to the idea that the geometry of the cable gives us an interesting fingerprint of the wavefield as it moves across the array.

Here is an example event from the Caltech group, Zhongwen Zhan and Zefeng Li, showing one of these template events that was originally observed by a Wisconsin group at Brady Hot Springs, using an interestingly shaped fishbone array of DAS cable that they installed themselves. You can see that the wavefield for just this one earthquake over eight kilometers has an interesting pattern, and that's just because of where it occurred. You can take that one event, scan it against the continuous record, and pull out many matches that are actually quite far below the noise level. In this case that actually changed the interpretation of the science from the shut-in of a geothermal well: after template matching with the DAS array, they saw many more events immediately after shut-in, whereas the original templates were out here, later in the data set.

I'm going to breeze through this, but I think industry is leading us by about five years in this technology. They're doing a lot of monitoring in wells, recording perforation shots and hydraulic fracturing. This is one image from a recent paper in The Leading Edge. There are no axes on this, but we have distance, 500 meters along the monitoring well, next to a series of hydraulic fracturing stages, with time along the x-axis, and perforation stages marked with pressure and temperature gauges here. The color image in the background is strain information: if you natively record strain rate at every point along the fiber, integrating in time gives strain. And what you're seeing is that there's actually a pressure front (these are the actual micro-earthquakes that were identified), a pressure front that leads the thermal pulse. So during fracturing you're actually seeing this kind of multi-physics at long periods, much lower frequency than the earthquake records I was showing before.

I'd like to conclude with the different opportunities. In terms of complementing existing networks and filling network gaps, this is the idea of arrays of opportunity for DAS. Fiber stays in the ground for 25 years, typically, so the opportunity to use an existing cable for a long time is there with this technology. Both in urban areas and offshore it has traditionally been hard to put out sensors, so using the existing cables is quite straightforward and, I think, a good idea. For earthquake early warning, we have essentially zero latency.
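A minimal sketch of the delay-and-sum idea mentioned above, simplified to a straight cable so that only the apparent slowness along the fiber is scanned (the full grid search over slowness and back azimuth works the same way; names and values are illustrative):

```python
# Minimal sketch (illustrative) of delay-and-sum beamforming on a straight
# DAS cable: for each trial apparent slowness, shift every channel by its
# predicted delay and stack; the slowness giving the most coherent stack
# shows how the wavefield sweeps along the array.  np.roll wraps the trace
# ends, which is an acceptable shortcut for a short sketch.
import numpy as np

def delay_and_sum(data, x, dt, slowness_grid):
    """data: channels x samples; x: channel positions (m); dt: s per sample."""
    n_ch, n_t = data.shape
    power = np.zeros(len(slowness_grid))
    for i, s in enumerate(slowness_grid):          # s in seconds per meter
        shifts = np.round(s * (x - x[0]) / dt).astype(int)
        stack = np.zeros(n_t)
        for ch in range(n_ch):
            stack += np.roll(data[ch], -shifts[ch])
        power[i] = np.sum(stack ** 2)
    best = slowness_grid[np.argmax(power)]
    return best, power

# Usage: scan apparent slownesses of +/- 1 s/km in small steps, e.g.
# best_s, p = delay_and_sum(section, channel_x, dt=0.002,
#                           slowness_grid=np.linspace(-1e-3, 1e-3, 201))
```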
We're recording at the speed of light on shore, and there are all sorts of interesting research questions, from the photonics to how to actually interpret the data we're recording and figure out something about the instrument response function. There are a lot of opportunities to explore the instrumentation further in this space, and I think it's continuing to develop as the technology advances: they're engineering the fiber and quieting down the laser. This is an area where the technology is actually still improving. So again, the take-home message: DAS turns the fiber-optic cable into a massive one-component seismic array recording strain rate at every point. This is the image of that Alaska magnitude 7.9 earthquake with the seismometer trace. Here's a spectrogram of that energy; you can see, from 10 to 100 seconds, the dispersive Rayleigh wave, the Airy phase, and later-arriving scattered energy, and here's what the DAS is actually showing. You can see there's an amazing similarity in these two records. There's actually much stronger scattering at high frequencies, which speaks to the instrument response of the DAS, and what I've done is basically a deconvolution to figure out the instrument response here. I think that's an interesting piece of this, but the point for this session is really that you're seeing both broadband and array measurements with this tool. Thanks.

Well, I'll start off with one for Nate. You mentioned that there were some offshore experiments that were conducted and some initial results; I was wondering if you could comment on those. That's all right. Yeah, so we've recorded some data in Monterey Bay on one of the cabled observatories, the MARS cable. For a period of four days during maintenance we were able to use that fiber, because there's no dark fiber in the cable itself. We were able to record the first 20 kilometers of that cable before running out of photons; the full cable length is about 52 kilometers. We recorded both ambient signals, which look like seafloor currents but are also broadband, and a small earthquake from Gilroy, which is nearby. So that provides a wavefield image of the seafloor. Those results will be coming out soon. Cool.

I have a question. This is a question for Spahr Webb. Do you know what the pressure sensor design is in the S-net network off northeast Japan? And might you know what the problems are with those data? Because we haven't really seen any results yet. Are you still on? I was trying to unmute myself, and now I'm unmuted. Yes, I have some knowledge; I'm not sure how much I should say. The problem seems to be that in the way they did the digitization, they tried to send the signal all the way up the cable rather than doing the digitization at the node. But this is basically hearsay; I probably shouldn't say any more. And what kind of design is this? Say again? What kind of design? It was done by the Japanese; I don't know much about it, but it's much higher noise than it should be. Okay.

I have a question for Jeff. When you were showing these tripods that are moving, is there a way to make them more bulky or something? I mean, some versions of them have pretty heavy anchors on the bottom, not all. But I think most things in seafloor geodesy need somebody to go down with a drill, drill a hole, and put something in, not unlike a GPS station on land.
I mean, I think seafloor geodesy will eventually be doing that. You can do that with an ROV; ROVs are expensive, but if you're going to bother, why not? You seemed to be surprised that that line across the Japan Trench didn't show the plate motion, but if the fault's locked, and this is after the earthquake, you'd only be seeing strain. So three millimeters over seven kilometers is something like half a microstrain. Well, I mean, right, so it's locked; there's no updip creep at all, right?

So with the internet traffic: there's technology for optical switching, to basically occupy just a narrow wavelength of light. We operate at 1550 nanometers, so we just send pulses at that wavelength, and there's photonic technology to split out and switch so that you could pass the internet traffic through and make these geophysics measurements in the same cable. But that hasn't been demonstrated with this fiber-sensing method yet.

Yeah, one of your examples was a fracking experiment. So, is industry, separately or collaboratively, developing fiber-optic or similar techniques, either as strainmeters or as seismometers? Yeah, a lot of the instruments are being developed first in industry and are now being developed at the national labs and different campuses around the country in kind of an open-source way. But industry is still, in terms of both the hardware and the software, a step ahead, in my opinion, based on the data that I've seen. There are lots of different manufacturers of these interrogators, coherent optical Rayleigh scattering; if you Google that, you'll find 15 companies. It's kind of a large emerging market.

So who operates these networks, the existing fiber networks for internet traffic? And is there a chance to get access to those, assuming that the problems with the interference are solved? So in the academic cases of cabled observatories, they're academic groups. In the case of the cable in Sacramento, for a brief period it was actually owned by Lawrence Berkeley National Lab, but now it's owned by CenturyLink. So it's a utility that different companies buy and manage, and it's kind of on an ad hoc basis that we go out and try to find Google or CenturyLink to partner with. Have you found it easy to find these partnerships? It seems like it's only a couple of you working on it. So I think it's early days still, but it's not impossible; there's a lot of dark fiber in the ground that's unused. So it's a definite opportunity.

To follow that thread: could you say a little bit about the observations that have been made offshore, and also a little bit about how much dark fiber there might be offshore that, if we had the right partnerships with the right industry, we'd be able to make use of? Yeah, so beyond the experiment that I described in Monterey Bay, I've seen only brief one-hour snippets of data recordings on posters, mostly in industry, so I don't have a lot of information about those other offshore recordings with this method. But there's a lot of fiber offshore, and when they lay a new cable, the practice is often just to cut the old cable. So there's a lot of cable, even at the switching hubs onshore, that is unlit and unused. There's also potential to go and lay an engineered fiber of higher quality offshore in the deep ocean.
That's going to require putting this instrument in a pressure vessel, and there are power and data management considerations for that. Something I completely glossed over is the amount of data that we're recording: 10,000 channels at 1,000 hertz. We've recorded about 0.6 petabytes in the last five years, just our group, across a number of different experiments. So those data management issues are there too; bringing that data back from the seafloor is hard.

What kind of distance down these fibers can you see? Yeah, so the limit is really about 70 kilometers with an engineered cable. But there are ways around that: Lawrence Livermore National Laboratory has a draw tower and manufactures fiber, and they're working on reducing the number of scatterers and putting in specific points, weak fiber Bragg gratings as scatterers, to extend the number of returned photons and get a longer range, for exactly that reason.

I'm just going to ask and clarify that right now, all the work that you've done has been on dark fiber; you have not done any of the wavelength-division multiplexing to actually work on active fiber. And you talked about the advances in cable manufacturing that may let you extend things; what about changes in terms of lasers to allow you to extend your range? Where do you see that going? Yeah, so a lot of why this technology is coming about now is that the phase noise of lasers has come down. The quieter the laser, the less noise in the seismic data set, because we're making this interferometric measurement and we need that stable phase. And there's a new instrument coming out from the company Silixa, which has shown another order of magnitude reduction in the self-noise of the instrument, and that's kind of not surprising. So I think there's plenty of room to lower the noise floor of the instrument.

Also, are you using 1550 nanometers because that's where telecommunications traffic is not present? Or, my question is really related to how much of the loss is due to scattering and how much is due to absorption from small amounts of water in the glass. Yeah, so that near-infrared band is where a lot of telecommunications works, and I think it's the water absorption peaks that they typically try to avoid. But I don't know how much wiggle room there is to tune that, actually; you can tune the laser wavelength.

So I just want to check: you said the maximum length at the moment is about 70 kilometers, but you said for the MARS cable you could only get 20 kilometers. Why is that? So 70 kilometers is with that engineered fiber that I described, where they're reducing the number of Rayleigh scattering points along the cable. The telecommunication companies are not doing that; they're not putting in this expensive fiber. So, from my experience, it's been 20 kilometers on telecom fiber. So for any existing cable that we would take advantage of, 20 kilometers would be a safe assumption for how far out you could go? If there's anything in the way that's going to reduce the amount of light, like a repeater at five kilometers, then that's going to scramble the phase and you're not going to get a measurement past that point. So that's important to understand too. Could you put this measurement device on each repeater? Potentially.

I was wondering if you could comment on the relative advantages and disadvantages of horizontal DAS cables versus, say, a vertical cable in a borehole.
Yeah, so with the horizontal trenched fiber, even if we trench a 2D array at the surface, we're always largely missing that P wave; we get weak P arrivals. So that's one consideration. There are other opportunities: for example, if you have a vertical fiber, you can actually separate the up-going and down-going waves to get local velocity, which is kind of interesting. It's an array geometry problem. It's hard to drill wells, though, so it's a lot easier to go plug into the telecom fiber.

Any other questions for any of our four speakers? I've got a comment. Robin knows I'm as old as the hills. Since we mentioned fiber optics: the discovery of hydrolytic weakening and of the presence of hydrogen in SiO2 glass in particular goes back to the early days of fiber optics at Bell Labs, and to Orson Anderson, whose work at the time was not appreciated until later work on fiber optics uncovered the role of hydrogen in silica. That led from one thing to another, but it was in the early days of understanding fiber optics for communication, back in the fifties.

Okay, we apparently have no food yet, so we can take a few extra questions. If there are none, then let's thank our speakers. Thank you.