Today I will focus mostly on our experiments, particularly at the European X-ray free-electron laser, rather than on free-electron lasers in general. But I think that will give you a pretty good view of the typical kind of experiment we do when attempting single particle imaging. First I would like to talk a little about what we are trying to achieve and how that can be achieved in multiple ways. One long-standing goal of scientists has been to image single proteins, and there was a seminal paper from Richard Henderson in 1995 where he compared different ways to probe single proteins. He came to the conclusion that, given that the elastic cross section for electrons is so much higher than for X-rays and also for neutrons, electrons are the best way to image proteins. Furthermore, electrons cause less damage. So there is this huge gap of about 10^5 in the signal that one can extract from a single electron compared to a single X-ray photon; it depends on the exact energies and so on, but it is roughly that order of magnitude. This gave rise to the cryo-EM "resolution revolution", which led to a Nobel Prize a couple of years ago, and the field is exploding all around the world, with EM machines popping up everywhere and spectacular work being done. But we are here, I guess, to talk about X-rays, so where do X-rays come in? Cryo-EM can do, like I said, spectacular work, but it is difficult to reach very short time resolutions: there is the time it takes to freeze the sample, and you also cannot really bunch many electrons close together due to space-charge effects. So you are typically limited to time resolutions on the order of several milliseconds, like tens of milliseconds.
In principle, we would also like to look at things that are faster, because many biological processes are faster than that: domain motions take place on the microsecond level or within very few milliseconds, and loop motions and other kinds of motions happen at even faster time scales. So how can we investigate this? X-ray free-electron lasers might provide an answer using the concept of diffraction before destruction, popularized by a paper from 2000 (Neutze et al.), where one uses a short pulse on the femtosecond scale to outrun most of the radiation damage and obtain a diffraction pattern from a relatively undisturbed sample, even though the sample will inevitably explode given the amount of energy deposited into it. You are taking advantage of this time difference, this inertia that exists, to obtain information about the sample: you are making use of the difference between the speed of light and the speed of the shock wave. The idea is then that you take whatever sample you want, put it into the X-ray beam, capture a diffraction pattern from it, and repeat this again and again. You obtain thousands or millions of diffraction patterns so that you can improve your signal to noise by combining them. You furthermore have the challenge that you do not know the orientation of each individual particle, so, very much like in single-particle cryo-EM, you have to figure out the relative orientation of every single particle. Then you assemble the measured intensities into 3D Fourier space, phase them, and obtain the structure of interest. And because you use ultrafast pulses, in principle you are able to achieve ultrafast time resolution.
The first experiments were done on quite large samples, at least when one thinks about proteins. They were done on Mimivirus, around 400 to 450 nanometers in diameter, which is extremely giant; typical viruses are much smaller than that. They gave beautiful diffraction at the LCLS and could be reconstructed. This all worked very well, surprisingly well given that these were among the first experiments done at a real X-ray free-electron laser. But then we wanted to progress to smaller and smaller particles. One of the first things we tried were these 100 nanometer carboxysomes, which are organelles from certain photosynthetic cells. They are supposed to be icosahedral; they have a certain amount of heterogeneity, but they are roughly icosahedral. When we tried to deliver them to the X-ray beam and reconstruct the resulting diffraction patterns, we got a bunch of things that look like this: not only do they look mostly like spheres, they also have a very large spread of sizes, much larger than the natural variation of the carboxysomes. Maybe something could have been wrong with the sample preparation, but we actually had quite a lot of experience preparing these samples, so something else was going on. The most likely explanation lies in how we deliver the particles, in this case the carboxysomes, to the X-ray interaction region: we start with a solution, create an aerosol of small droplets, and those droplets dry out, leaving the sample. The problem is that if there is some contamination in the original solution, and when working with biological samples it is hard to avoid some contamination, the non-volatile contaminants will also deposit somewhere.
They will either form small clusters of contaminants or deposit on top of the sample of interest, and this is what we thought had happened. To investigate this, we developed a laser-scattering setup to measure the sizes of the droplets that we generate. We did this first using the traditional method we had been using for several years, gas dynamic virtual nozzles. This gave a quite broad droplet size range, everywhere from 500 nanometers to one and a half microns, which is relatively large; we would like smaller droplets. To achieve that, we changed to electrospray, which gives much more monodisperse droplets of around 150 nanometers. That was much better. We then repeated the experiment using the electrospray to generate the droplets, with the same preparation of the carboxysomes, and got these much better looking diffraction patterns; as you can see, the reconstructions look much more like icosahedra. That seems to have solved one of the main difficulties in going for smaller and smaller particles, namely getting a sample delivery method that works with small particles. We actually even managed, at the LCLS, to look at some relatively small viruses. This is the 35 nanometer tomato bushy stunt virus; this is what we were expecting, these are the measurements we got, and they match pretty well. Here we have, for example, the case where two viruses end up in the same droplet: when it dries out, they get stuck together and you see this kind of interference between the two small viruses. Before, we always got balls larger than the size of the virus we were expecting, so this worked fine.
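As an aside, the reason smaller droplets help can be seen with a one-line estimate: if everything non-volatile in a droplet collapses into a single dense sphere as the droplet dries, the residue diameter is the droplet diameter times the cube root of the non-volatile volume fraction. A minimal sketch, where the 0.1% contamination level is an assumed, purely illustrative number:

```python
def residue_diameter(droplet_diameter_nm, nonvolatile_volume_fraction):
    """Diameter of the dry residue left after a droplet evaporates, assuming
    all non-volatile content collapses into one dense sphere."""
    return droplet_diameter_nm * nonvolatile_volume_fraction ** (1.0 / 3.0)

# Hypothetical 0.1% non-volatile contamination (illustrative value only)
print(residue_diameter(1500.0, 1e-3))  # GDVN-sized droplet -> ~150 nm residue
print(residue_diameter(150.0, 1e-3))   # electrospray droplet -> ~15 nm residue
```

With the larger droplets the contaminant residue alone is comparable to a carboxysome, which is consistent with the sphere-like blobs seen in the reconstructions.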
Now that we could see this, we wanted to be a bit more ambitious: if we can see 35 nanometer viruses, can we go for single proteins? The first experiment we did was to look at RuBisCO at the LCLS. This was at the AMO beamline, a soft X-ray beamline, and this is the kind of pattern we were getting. This module was kind of bad, so you could not really read any data from it, but the main issue was that the signal we expected was very weak, and it was very hard to distinguish the natural beamline background from any coherent scattering signal. So the result was inconclusive: we do not know whether we saw any protein scattering or not. At the same time, we also started to develop ways to quantify the fluence on each of the particles that was hit. For example, this is from an experiment in 2012 at AMO. By looking at the intensity of the central speckle, modelling the particles as spheroids, and knowing the density of the particles, we can estimate how many X-rays are shining on the particle. This was about 10^11 photons per square micron. Even going to the nanofocus at CXI, with harder X-rays, we still got below one millijoule per square micron, which one might think is relatively low, given that free-electron lasers produce multiple millijoules of pulse energy, and here we were supposed to be at a nanofocus beamline with a focal spot definitely below a micron. So why do we get so few photons on the sample? There are many losses; that seems to be the main explanation, but it was always a little bit frustrating.
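The fluence estimate from the central speckle can be sketched as follows: for coherent Thomson scattering at zero angle, a particle with N_e electrons scatters roughly fluence × r_e² × N_e² photons into a detector pixel of solid angle dΩ, so the incident fluence can be read off the central-speckle count. All the concrete numbers below (electron count, detector geometry, measured photon count) are assumptions for illustration, not values from the experiment:

```python
R_E = 2.8179403262e-15  # classical electron radius, m

def fluence_from_central_speckle(photons_in_pixel, n_electrons, pixel_solid_angle):
    """Incident fluence (photons per m^2) inferred from the photon count in a
    central-speckle pixel: photons per pixel at zero angle is approximately
    fluence * r_e^2 * N_e^2 * dOmega (coherent Thomson scattering,
    polarization factor ~1 at small angles)."""
    return photons_in_pixel / (R_E**2 * n_electrons**2 * pixel_solid_angle)

# Purely illustrative, assumed numbers: ~1.6e8 electrons in a ~100 nm protein
# sphere, 75 um pixels 0.5 m downstream, 1e5 photons in the central pixel.
d_omega = (75e-6 / 0.5) ** 2                       # pixel solid angle, sr
fluence_m2 = fluence_from_central_speckle(1e5, 1.6e8, d_omega)
print(fluence_m2 * 1e-12, "photons per square micron")
```

In practice one would also fold in the spheroid form factor and detector efficiency, but the scaling with N_e² is what makes the central speckle a usable fluence monitor.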
But when simulating diffraction patterns with these kinds of parameters, this is, for example, what we were expecting to get from a single RuBisCO at AMO. You can basically only see the central speckle, which can be very hard to distinguish from beamline background, and you cannot really see anything else, so it is hard to know for sure what you are actually seeing. But using parameters from the SQS beamline at the European XFEL, or rather an educated guess of the realistic parameters one would get, which was about 300 microjoules per square micron at 1 keV, we expected to be able to see some fringes from RuBisCO and so to positively identify that we had a RuBisCO in the beam. Then the European XFEL started and we were able to actually do experiments, but at first we could only run at SPB. One of the important things to check was whether one could actually use the high repetition rate, this very special pulse structure of the European XFEL, which has pulses that come at up to one megahertz repetition rate within a train, and then a very long time without any X-rays between different pulse trains. Could this kind of pulse structure be used for single particle imaging experiments? The expectation was yes, but we had to confirm it experimentally. In the first experiment we had, as you can see, a very bad focal spot of 15 microns, so we had to use a rather large sample; again we used the Mimivirus that was used for the first LCLS experiments. We were only interested in checking whether the separation between consecutive hits in a train followed what you would expect for completely independent hits, or whether one hit had some influence on the following hits.
For example, it could be that hitting one sample produced debris that stayed around and prevented a second sample from being hit by the following pulse. But that was not the case, or at least we could not see it in the hit rates we measured; they matched perfectly what you would expect from ideally independent hits. That was good, but also roughly expected. With that out of the way, we could start running experiments at the soft X-ray beamline with really small samples. The first one was the E. coli 70S ribosome, about 20 nanometers in size. In previous experiments it was, like I said, hard to see such very small samples, not only because the signal was low but also because of these depositions. But here we managed to obtain quite a large number of hits, given that in this first experiment we were limited by the detector: it was a pnCCD that could only acquire one pulse per train, so we were limited to at most 10 hertz, even though in theory the European XFEL can go all the way to 27,000 pulses per second. Still, we obtained this nice looking histogram that matched perfectly the size you would expect for the ribosome, which was quite encouraging. Later in the same experiment we replaced the ribosome with GroEL, a chaperone with an interesting shape: it is cylinder-like, while many proteins are quite globular and can therefore be hard to distinguish from, for example, a droplet. This one, viewed from a certain direction, looks more like a rectangle, and that makes it more distinguishable from impurities, which tend to form round-shaped droplets.
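The independence check described above can be sketched with a toy simulation: if hits within a train are independent with a fixed per-pulse probability p, the gaps between consecutive hits follow a geometric distribution with mean 1/p, and a deficit of short gaps in the measured histogram would betray one hit shadowing the next. The hit probability here is an assumed illustrative value:

```python
import numpy as np

rng = np.random.default_rng(0)

p_hit = 0.01          # assumed per-pulse hit probability (illustrative)
n_pulses = 1_000_000  # pulses from many trains, concatenated

# Independent Bernoulli hits: the null hypothesis being tested
hits = rng.random(n_pulses) < p_hit
gaps = np.diff(np.flatnonzero(hits))  # pulses between consecutive hits

# For independent hits the gaps are geometric, P(gap = k) = p * (1 - p)**(k - 1),
# with mean 1/p; debris from one hit suppressing the following pulse would
# show up as a deficit of gap = 1 events relative to this distribution.
print(gaps.mean(), "measured mean gap vs expected", 1.0 / p_hit)
```

Comparing the full measured gap histogram (not just the mean) against the geometric prediction is the more sensitive version of the same test.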
When we started to inject it, we obtained the diffraction pattern you can see here in the middle. First, it is clearly very different from the background we were getting (the background is on the right-hand side), so there is clearly something in the beam. Furthermore, with some more detailed analysis, you might notice that this thing is not completely round: it looks like it has somewhat square-ish edges in some directions. But this image was also quite contaminated by fluorescence from the carbon dioxide and nitrogen that we use to deliver the particles to the interaction region. If we make a histogram of the values we get on the pnCCD, you can see the zero-photon peak corresponding to pixels without any X-rays, then a bump corresponding to fluorescence photons, and only here the bump corresponding to elastically scattered photons. Given that this detector has such low noise, such a nice energy resolution, we were able to simply discard all the pixels containing fluorescence photons and keep only the elastically scattered photons. But we still had a large amount of background coming from elastic scattering off the gas used to deliver the sample to the interaction region. The left-hand side here shows an image of the detector when we were not delivering sample, meaning no CO2 and no nitrogen; this is purely scattering from elements within the beamline, and it was only 65 photons per shot. When we turn on the gas, we get 17,000 photons per shot, and that is only the elastically scattered photons; there are even more fluorescence photons.
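The photon-energy discrimination described here can be sketched on synthetic data: because the detector noise is small compared to the separation between the fluorescence and elastic photon energies, a simple window around the elastic peak in the per-pixel value histogram isolates the elastically scattered photons. All ADU values and noise levels below are made-up, illustrative numbers, not pnCCD calibration values:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic single-photon frame in ADU; every number is assumed purely to
# illustrate the method: readout noise sigma = 20 ADU, gas fluorescence
# photons deposit ~400 ADU, elastically scattered photons ~1000 ADU.
frame = rng.normal(0.0, 20.0, 10_000)
frame[:300] += 400.0      # fluorescence photons
frame[300:350] += 1000.0  # elastically scattered photons

# Keep only pixels whose value is consistent with the elastic peak
elastic_mask = np.abs(frame - 1000.0) < 5 * 20.0
print(elastic_mask.sum(), "pixels kept as elastic")
```

The same windowing fails on a noisier detector, which is why the low noise of the pnCCD mattered here.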
In the end this means you only get an increase of about a factor of two when you actually have a hit on the single protein, the GroEL. So that is one area where we are definitely working to reduce the background, and there are a bunch of ideas for how to do it. But we still had to know whether what we hit was actually GroEL. To check, we simply compared the diffraction data we got to the well-known X-ray crystal structure, and also the cryo-EM structure, of GroEL, and they actually matched remarkably well. Here on the left-hand side you have the experimental data; on the right-hand side, the crystal structure of GroEL; and in C, what you get if you take the crystal structure and calculate the diffraction pattern in the same geometry as the experiment. If on top of that you add the experimental gas background that we measured, you end up with things that are quite similar. But when we dug a little deeper, we saw that the match was actually not perfect, or not even that good, regardless of the orientation of the GroEL. You can see it in the small-angle scattering region, just by azimuthal integration of the signal: if we take the data and compare it to the PDB model, the PDB model always has a much larger bump in this Q range. That bump comes from the hole that runs through GroEL. We thought about this, and it is very likely that when you start with GroEL in solution and it dries out, it does not dry out completely: some water stays stuck within the hole through the cylinder. Besides water, there might also be protein, which is well known to sit there, as seen for example in cryo-EM reconstructions.
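The azimuthal integration used for the comparison with the PDB model is, in essence, an average of the 2D pattern over rings of constant scattering angle, reducing it to a 1D curve. A minimal sketch with integer-radius bins:

```python
import numpy as np

def azimuthal_average(image, center):
    """Average intensity in integer-radius bins around `center`, i.e. the
    azimuthal integration used to reduce a 2D pattern to a 1D curve that
    can be compared against a pattern simulated from a PDB model."""
    y, x = np.indices(image.shape)
    r = np.hypot(y - center[0], x - center[1]).astype(int)
    sums = np.bincount(r.ravel(), weights=image.ravel())
    counts = np.bincount(r.ravel())
    return sums / np.maximum(counts, 1)

# Sanity check on a synthetic, radially symmetric image whose value equals
# the distance from the center, so the profile should grow linearly with radius.
img = np.fromfunction(lambda i, j: np.hypot(i - 32, j - 32), (65, 65))
profile = azimuthal_average(img, (32, 32))
print(profile[:3])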
So we tried to add density there, and when we added some water molecules within the hole we got much better agreement between the experimental data, here in blue, and the model, in green; now they agree very nicely. So we are quite convinced that what we have is a GroEL with a filled cylinder in the middle, filled probably with a mixture of water, protein and maybe some salt. That was unexpected but interesting. This is our best model fit, and actually just a week or two ago we posted a manuscript to bioRxiv showing all of these results, together with cryo-EM reconstructions of the same sample confirming that what we were injecting was GroEL. I would encourage you to take a look at the paper if you are interested. So now that we have shown we can get a glimpse of a single protein, how many photons do we really need to obtain a useful structure, with a resolution useful for doing biology? If you look at papers from cryo-EM, people have come up with different minimal doses, but one of them corresponds to about 1.5 electrons per square ångström to reach around 3 ångström resolution in the limiting case. If we take into account the large discrepancy in elastic scattering cross section between electrons and X-rays, that corresponds to about 1.5 × 10^13 photons per square micron. If you look at the paper I showed in the beginning proposing diffraction before destruction, Neutze et al. from 2000, that paper had simulations with on the order of 4 × 10^14 photons per square micron, quite a bit above this minimal cryo-EM dose. So it is not strange that it was possible to get nice results with that.
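The arithmetic behind that equivalent dose is worth spelling out: convert the cryo-EM dose from per square ångström to per square micron, then multiply by the roughly 10^5 elastic cross-section advantage of electrons over X-rays quoted at the start of the talk:

```python
# All figures as quoted in the talk: the cryo-EM limiting dose and the
# rough 1e5 elastic cross-section advantage of electrons over X-rays.
electrons_per_A2 = 1.5        # cryo-EM minimal dose, electrons / Angstrom^2
A2_per_um2 = 1e8              # 1 um^2 = 1e8 Angstrom^2
cross_section_ratio = 1e5     # elastic cross section, electrons vs X-rays

photons_per_um2 = electrons_per_A2 * A2_per_um2 * cross_section_ratio
print(photons_per_um2)        # 1.5e13 photons per square micron
```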
If we now compare this to the experimentally measured fluences: at the nanofocus of CXI at the LCLS, we are still about 15 times below this minimal cryo-EM dose, or, if you compare to Neutze et al., almost 400 times below that estimate of 4 × 10^14. In a very real sense, XFELs are still weak; they have not really reached the power that was proposed or expected when that paper was written in 2000. But things are improving. Recently, at the SPB/SFX beamline of the European XFEL, we managed for the first time to measure fluences above the one millijoule per square micron barrier that we had never been able to cross before. These are just a bunch of hits on sucrose balls, but you can see that some of them are not only above one millijoule but above 10 millijoules per square micron, which is encouraging. It seems to show that SPB/SFX, at least at the time of this experiment, had a true nanofocus and that the losses through the beamline were relatively low. This puts us in a more interesting position: the first experiment where I showed the GroEL was only about a factor of three below the minimal cryo-EM dose, if you take into account that soft X-rays have a higher scattering cross section, and this nanofocus measurement at SPB/SFX is very close to the minimal dose. So things really are improving. Actually, just recently at the European XFEL users' meeting, it was announced that they obtained 17 millijoules per pulse at soft X-rays, about a factor of three higher than what we had when we did this experiment. So they are pushing not only the pulse energy but also, probably more importantly, the focusing conditions and the transmission through the beamline.
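Comparing pulse-energy fluences to the cryo-EM-equivalent dose involves one more conversion, from millijoules per square micron to photons per square micron at a given photon energy. With an assumed hard-X-ray photon energy of about 6 keV (an illustrative choice, the talk does not state one), 1 mJ/µm² comes out around 10^12 photons/µm², which reproduces the quoted shortfall of roughly 15 times:

```python
EV_TO_J = 1.602176634e-19
photon_energy_eV = 6000.0   # assumed photon energy; the talk does not give one

def photons_per_um2(mJ_per_um2):
    """Photon fluence corresponding to a pulse-energy fluence in mJ/um^2."""
    return mJ_per_um2 * 1e-3 / (photon_energy_eV * EV_TO_J)

f = photons_per_um2(1.0)   # the ~1 mJ/um^2 ceiling mentioned in the talk
print(f)                   # ~1e12 photons per square micron
print(1.5e13 / f)          # ~15x below the cryo-EM-equivalent dose
```

At softer X-ray energies the same pulse energy carries correspondingly more photons, which is part of why the soft X-ray GroEL experiment came closer to the minimal dose.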
And also the background level, which is very important. So things are improving significantly, and I think that in the future, also using high repetition rate detectors so that we can acquire much more data, we might really be able to get high resolution, biologically significant data on single proteins. Just to give you an idea of the path we have already covered: we started with the Mimivirus, on the order of 450 nanometers like I said, and we are now looking at things like this GroEL, which are definitely tiny, especially if you consider that what matters is the volume, not just the projected area, of the object in question. Finally, I would just like to acknowledge the many people who have worked together to make these experiments a success. Thank you very much for the opportunity, and I am happy to answer your questions. Thank you very much, Filipe, and I have to admit that you are the first person in this series who actually stuck to the time, which is of note, so congratulations, you did it, and thanks for this talk. Now it is open for questions and comments from the audience; you can either raise your hand and directly unmute yourself, or write questions in the chat if you prefer. I do not see anything yet, so I will start by asking some questions myself. I have no experience whatsoever in this type of physics or this area of science. You are saying that there is a minimum flux that you need to retrieve information. So tell me, what is actually the information you are looking for when you look at this protein? What is the quantitative information that you would like to obtain with X-rays?
Well, I think there are a couple of points there. First, maybe it is worth mentioning why a minimal dose is required at all. You need enough signal to tell whether you even have something in the beam; when you talk to people in electron microscopy, this is called the particle picking problem: whether there is a particle there or not can sometimes be hard to tell. So that is the first limitation: you need enough signal to say whether there is something in the beam. You also need enough signal to determine the orientation, because, as I mentioned, the orientation of the particles is unknown, and we have to orient them relative to each other to recover the three dimensional intensities in the end. If you have no photons, there is no way you can work out the orientation. Then, as to what kind of information we want: the kind of information we got in this experiment from the GroEL only gives you very low resolution features. We can say what the overall size of the particle is, which matches the size of GroEL. We can also see that it seems to be a filled particle rather than a hollow one, although the cylinder interpretation relies on us already having a model for the particle. But these are all features on the order of 10 nanometers, or maybe a little below, not much more, and people would be interested in features that are definitely sub-nanometer. For that, we are still far away.
We definitely cannot do it from a single picture; we need to average multiple diffraction patterns to increase the signal to noise and extend the resolution to something that can then be used to obtain biological insights about what the protein is doing and how it does what it does. Could you please tell me how this compares to data taken with cryo-EM? What are the difficulties of analyzing such small particles with cryo-EM? Well, this kind of sample, the GroEL, is more or less a standard for cryo-EM, I would say. Some time ago it was more of a standard; nowadays smaller ones, like apoferritin, are much more commonly used as standards. But it is relatively straightforward to analyze it, obtain thousands of projection images of GroEL, and use them to recover a high resolution structure, so that is not a big challenge. That is one of the reasons we picked this particle: not only are there good crystallography and cryo-EM studies, people have also studied it with mass spectrometry, using electrospray ionisation as the sample delivery method, which is what we also use in our experiment. So we knew it would be compatible and that the sample would behave; it was well characterized, and we felt safe using it. Any other questions or comments from the audience? Filipe, it seems to me you went quite quickly, and in a fascinating way, through the data sorting at the beginning of your talk, but it looks like data sorting is solved; basically you have it in hand? There have actually been some improvements in data sorting. In the ideal case where all the particles are the same, I would say it is almost solved; it depends on the kind of signal to noise you have.
But it is a well understood problem. In reality, though, the particles are not all the same. There is variation within your sample, and there are also variations in the experimental conditions, which might not be perfectly stable; particularly at free-electron lasers, things are typically not very stable. So there will be variation, and how you handle and sort this variation is a more open question, and not just for single particle CDI. I think it is a general issue in structural biology: people first study the well behaved, well ordered systems, which are easier, and then one wants to go to less well behaved systems and things start to get complicated. But people are beginning to deal with some of this complication. There was a recent paper by Kartik Ayyer on melting gold nanoparticles, where the idea was to image gold nanoparticles with different shapes. It just so happened that during the experiment, the tail of the preceding pulse, arriving before the particle reached the interaction region, sometimes melted the particles, so you ended up with some diffraction patterns of unmelted particles and some of melted ones. The particles were definitely not uniform, some of them half melted, and he was able to properly separate and classify them. So that is an initial step toward handling the heterogeneity of the sample, which is something we are going to have to deal with in the future. I can see Oscar raised his hand; can you please unmute yourself, Oscar? Oh, yes, thank you, and thank you so much for the talk.
It is hard not to get excited about how much has happened since the first hard X-ray free-electron lasers came online, especially in these kinds of fields. I have a question a bit related to what we just discussed about resolution and heterogeneity. How similar is this to the cryo-EM single particle field? I am thinking of when you go to higher resolutions and have more heterogeneity between molecules. Is it a fully equivalent problem, or are there differences given how the sample is treated, the sample handling up until the X-ray interaction, in terms of temperatures and so on? Well, the details of the sample delivery are clearly very different, but the fundamental problem is exactly the same: you will not have data coming from one consistent particle, but from a particle in different states. So you have to do something to make use of data that does not match the simple initial model, in which you are always looking at one particle just in different orientations. It is not just the orientation that changes, it is a bunch of other things, but they are still related, so it is not as if you end up with random stuff in the beam that you could do nothing with. There is a continuum between having a perfectly homogeneous sample and a completely heterogeneous one; you are somewhere in the middle, and you have to make the best use of the data you can. Thank you. And just to add quickly: we will probably be able to make use of some of the algorithms that people are developing for cryo-EM, and anything we develop can probably also be used for cryo-EM, so it is a two-way situation where both sides can benefit.
I see Karina is ready with a question; Karina, go ahead. Hi, I will leave my video off, I am doing this on my mobile so I do not want the connection to collapse. Thanks for the talk, I loved it a lot. My question follows on a little from Oscar's, I guess. With the last sample, this GroEL that you said was in all cases filled, and also with the other sample where extra particles agglomerated on the outside: how much of a problem is this when you want to move away from what you already know? As you said, these structures are in principle already known. How much can you then rely on your sample preparation, or at least on the delivery method? Was that electrospray as well? Yes, the GroEL was electrosprayed as well. And that is a good question. The fact that there is material within the hollow cylinder of the GroEL structure is not necessarily a problem if it is water and protein; in a sense it is actually good, because we want there to be water. Unlike mass spectrometry, that kind of experiment where you rely on the mass, here we do not really rely on the mass of the sample; we can look at it and draw conclusions from that. And we would like to have a little water around the sample, because we want it to be in as native a state as possible, given that it is in the gas phase. So the presence of water is not necessarily a problem. If there are other things there, for example bound protein, that is also plausible: when we did some cryo-EM on the sample we have, we saw that many of the particles did have some protein within the cylinder.
So it's possible that this one also has that, which might be more problematic, even though, again, in vivo GroEL will very often have protein inside; that's what it does, so it's not strange that it has proteins within the cylinder. Other things, like salts that accumulate when the water evaporates, might be more problematic, because high concentrations of salt are not something you naturally have in vivo. So that would be a problem, and we continue to work on improving our sample delivery conditions and on understanding them better. One interesting thing is that it is known from mass spec that GroEL often collapses when it is delivered by electrospray. In this case we did not see that happening, which is positive, I would say: it looks more like the solution structure, although we are clearly not in solution. We will have to work more on this problem, and, as you say, we have to be aware of what happens to our sample when we deliver it to the beam. Thanks. Oscar?

On the infrastructure side, away from the science, a bit on the topic of dealing with the data sorting. You mentioned the European XFEL and eventually making use of all those pulses at the 27-kilohertz rate. How do you see the computing infrastructure dealing with these kinds of molecular movies on the storage and computing side? Is it already a growing concern, or is there still some buffer until the data rates coming out of these facilities become a major problem to deal with?

I think for many people this is already a real issue. For our particular experiment it hasn't yet become one, because our hit rate, the percentage of X-ray pulses that actually hit sample, is relatively small for us.
So we can throw away a lot of the recorded data and concentrate only on the good part. But there are ideas, for example, to use the vetoing capabilities of the detectors, if you are familiar with those. The detectors are only able to record a fraction of the pulses that the European XFEL can produce, but in principle they have the capability that you can tell them which pulses to record. You can do that, for example, by looking at the ions with an ion time-of-flight sensor; that measurement is relatively fast, so you can tell the detector that this time we actually hit something, so keep this frame. That would dramatically increase how much useful data we get. And as we improve the sample delivery methods, increasing our hit rates, and improve our focusing methods for the particles, this will probably at some point become a serious problem, yes. But we're just hoping the crystallographers will fix it before we get there.

Any other questions or comments from the audience?

I think this was lightly touched on before: what is the hit rate? You didn't say what it is.

It varies a lot. For example, in the case of GroEL it's a little hard to say because many of the hits were weak, but it's on the order of about 1%. For the sucrose particles, where we measured those intense hits, it can sometimes reach almost 100% depending on the focusing conditions. For most biological particles I would say around 1% is a more realistic value, meaning that you hit one out of every 100 pulses.

I see a raised hand; please go ahead. Excuse this very naive question; I'm actually an astronomer, and I'm wondering if there might be anything in common, since this is an inversion problem.
And the question is: do you always have to have a model for the structure that you're probing, or is it possible to go directly from the observable, by some inversion method, to work out the structure?

It is definitely possible to do a direct inversion of the data. The reason we did not do it in this case is that we knew what sample we were delivering, and high-quality structures are available, so we just wanted to see whether the result matched our expectation. But in the other examples I showed, the mimivirus, a large virus of about 450 nanometres, done at the LCLS, there we did the full inversion without any assumptions about the data or the sample. You can also do inversions with partial assumptions, for example symmetry assumptions and other kinds of assumptions on your sample. So yes, you're absolutely right that you could do this; in this particular case, given the low resolution, we could still do it, but it would mostly give us a blob as the result. Thank you.

Thank you. OK, I don't see any other raised hands, so I think we can conclude. There may be many other questions that come to mind; we still have a few more minutes if anyone wants. Otherwise we can conclude. Thank you so much for your talk and for participating in this series, and I hope to see everyone at the next appointment, on the 7th of April, with Ivan Vartaniants. Thank you again.
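The direct inversion discussed in the last exchange is typically done with iterative phase retrieval: the detector records only Fourier magnitudes, and the phases are recovered by alternating between the measured magnitudes and real-space constraints such as a known support. A minimal sketch of Fienup-style error reduction on a simulated 2D object follows; it is an illustration under toy assumptions, not the analysis code used for the experiments in the talk.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy object: a positive-valued blob inside a known circular support.
n = 64
y, x = np.mgrid[:n, :n]
support = (x - n // 2) ** 2 + (y - n // 2) ** 2 < (n // 6) ** 2
obj = support * (1.0 + rng.random((n, n)))

# "Measured" data: Fourier magnitudes only (phases are lost at the detector).
magnitudes = np.abs(np.fft.fft2(obj))

def fourier_error(g):
    """Relative mismatch between a guess's magnitudes and the data."""
    return (np.linalg.norm(np.abs(np.fft.fft2(g)) - magnitudes)
            / np.linalg.norm(magnitudes))

# Error reduction: alternate Fourier-magnitude and support/positivity projections.
guess = rng.random((n, n)) * support
err0 = fourier_error(guess)
for _ in range(500):
    F = np.fft.fft2(guess)
    F = magnitudes * np.exp(1j * np.angle(F))    # enforce measured magnitudes
    g = np.real(np.fft.ifft2(F))
    guess = np.where(support & (g > 0), g, 0.0)  # enforce support and positivity

err = fourier_error(guess)
print(f"Fourier-magnitude error: {err0:.3f} -> {err:.3f}")
```

With a centrosymmetric support like this circle, the result may be the twin image (the object rotated by 180 degrees), which is one reason practical reconstructions add further constraints or use variants such as hybrid input-output and shrink-wrap support refinement.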