Thank you for this kind of invitation to this very relevant webinar seminar. Unfortunately, the pandemic is forcing us into a remote presentation; it would have been lovely to be there in person and to visit your beautiful surroundings. But anyway, I am here today to introduce and talk about coherence in soft X-ray scattering as we do it at NSLS-II. I represent the coherent soft X-ray scattering beamline, CSX, at 23-ID-1, and I hope to bring you along on a little spectroscopic and diffractive journey around some electrons in correlated materials, which is the subtitle, exactly for the reason you said. The outline of my seminar is very simple: some scientific interest and the need for advanced tools, in particular soft resonant elastic X-ray scattering and the addition of coherence; then some cases out of the beamline, so what can we do and what can we learn; and in the meantime some new perspectives, because new tools have been developed as a result of our interests and scientific investigations. Then we conclude quickly with challenges, opportunities, some caveats, and a wish list of things that we would love to see realized in the hopefully near future. First of all, I would like to acknowledge the many people who have collaborated or are collaborating with us at various levels. First, the beamline team, Andy, Wen and Stuart, and, not attached to the beamline but to the synchrotron, Oleg, whose software helped us simulate the experiment from the source onward, propagating the full wavefront down to the sample and to the detector; it is extremely powerful. Then our internal and external collaborators. I have turned on the laser pointer so that it is easier for you to see my mouse. Internal to BNL, the Condensed Matter Physics and Materials Science department, with Mark, John and Ian and the many postdocs who have worked and are working with them.
Then, external to our lab, at MIT there are Riccardo Comin's and Geoffrey Beach's groups, whose many active young people produce beautiful investigations of the various materials we will see at the end of the seminar, and another university group we are actively collaborating with; and then, across the ocean, our friends and collaborators in Germany. What unites all these people is an interest in electronic behavior in condensed matter physics, and in particular in strongly correlated electron systems, at various levels. And here comes some general scientific motivation. For most of the people around here I probably do not need to say anything, but just to be sure that we develop a common language, I put in a general introductory slide with a little overview of the science. Essentially, we want to span and understand what is happening from the nanoscale to the macroscale, and to define what the mesoscale is and how to deal with it; there are enormous opportunities there. We need to span various orders of magnitude in size, in space and in time, and to try to understand what the phenomena we see are and how to exploit them eventually, if we can. Some of them are quantum phenomena that can emerge, at the limit, up to our landscape, up to macroscopic scales, sometimes because of their intrinsic characteristics, sometimes because we have some protection in terms of topology, phase coherence or the like. And we all know, as beautifully summarized in the famous article by Tokura, how the history went, where we are and where we want to go.
In this perspective, we essentially start with the famous case of high-temperature superconductors; in particular, mostly for historical reasons, we have a fantastic group in condensed matter physics, especially for growing samples and applying many different scientific investigation techniques to them. The evolution of the critical superconducting temperature versus year is well known, and the part that pertains to the big jump in the mid-80s was induced by the discovery of the cuprates and the strongly correlated, let's say, contribution to their superconductivity. Out of the number of structures available, the common theme is the copper oxide plane, and the fact that they have similar phase diagrams, sometimes even with shared peculiarities at specific dopings. For example, there is a well-known suppression of superconductivity that we will analyze a little better, because we made a direct contribution there. The crucial point here is to understand why the electrons do what they do, and what the underlying mechanism is, which is still a matter of hot discussion, between lattice coupling, other mechanisms involving electronic degrees of freedom, and so on and so forth. We will clearly not give a final answer on anything, but I would love to show you the small contribution that we think we were able to make along this line. Obviously, to investigate at such a fine level, meaning specific electrons in the material with specific characteristics, where out of the many electrons available only a few essentially contribute and we have to discover which ones and how, we need an advanced technique with the capability of spanning orders of magnitude in time and space, with the aim of extracting charge-specific and chemically specific information from our materials.
And so we decided to go for X-ray scattering, because it is well known that everything at the microscopic level relies on the atomic scattering factors: we have chemical sensitivity, and we can add site sensitivity. It is certainly a very photon-hungry technique, because it competes with other prevalent phenomena, at least in the energy range where we want to work if we use soft X-rays, but essentially over the whole spectrum available at synchrotrons; there is no way scattering is the prevalent process. However, it is enough to extract quite a bit of information, as we will show. Here is one case taken as an example, although it is pretty much the same for everything: the case of iron. You see very clearly that, just using photons, scattering is several orders of magnitude smaller than other phenomena, as reported here in this log-log plot of absorption and scattering cross sections versus photon energy. What we want to do in particular is to use a photon-in, photon-out technique that relies on the photo-promotion of a core electron to inspect the empty states above the Fermi energy; the information on what this electron experiences, by projecting its own state onto the available states during this virtual process, is encoded in the emitted photons, in their intensity, momentum and polarization. Given that we want to deal with scattering, and diffraction in particular, and so in general to stay in the elastic regime, the energies of the incoming and outgoing photons have to be considered identical; we will see that this is sometimes a little bit of an issue.
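For reference, since the next part of the talk walks through exactly these two expressions, here is the standard textbook form they correspond to. This is my transcription with conventional symbols (initial state |i>, intermediate states |n>, final state <f|, core-hole width Γ_n, dipole operator D, polarizations ε and ε'), not the speaker's slide:

```latex
% Diffracted intensity: the familiar structure-factor form, but with a
% resonant, tensorial atomic scattering factor f_j:
I(\mathbf{Q}) \;\propto\; \Big|\sum_{j} f_j(\omega,\hat{\epsilon},\hat{\epsilon}')\,
  e^{\,i\mathbf{Q}\cdot\mathbf{r}_j}\Big|^{2}

% Second-order (Kramers-Heisenberg) resonant term: photo-promotion of a
% core electron into an intermediate state, decay back to the final state,
% with the divergence at resonance regularized by the core-hole lifetime:
f_j \;\sim\; \sum_{n}
  \frac{\langle f|\,\hat{\epsilon}'^{*}\!\cdot\hat{\mathbf{D}}^{\dagger}\,|n\rangle\,
        \langle n|\,\hat{\epsilon}\cdot\hat{\mathbf{D}}\,|i\rangle}
       {E_i - E_n + \hbar\omega + i\,\Gamma_n/2}
```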
So these are essentially the only two equations I have in my talk: we measure an intensity at the detector that reads exactly as you are used to in standard diffraction, but now the atomic scattering factor does not depend only on the modulus of Q. This process, projecting one wave function onto another, is intrinsically tensorial, so the factor depends on the incoming and outgoing wave vectors of the light and on their polarizations. Second-order perturbation theory helps us understand how to handle it: essentially, as you can imagine, it is the standard Kramers-Heisenberg formula, where, over a resonant denominator kept under control only by the lifetime of the core hole, you take a certain state, suppose it is the ground state, project it with an operator into an excited state, and photo-project it back into a final state which, if we are discussing something elastic, is supposed to be identical to the starting one. In reality this is not necessarily the case: quantum mechanics tells us that everything can happen with some probability, maybe zero, but in principle some probability. So what actually comes out of your sample in terms of energy loss, when you impinge on it with photons, apart from electrons and other things we are not considering, is a bunch of photons, some of which have lost energy because they talked with the available degrees of freedom in the matter; that part is what inelastic techniques address. We instead deal only with the elastic part, the photons that come out at exactly, or very nearly, the same energy as the impinging ones. The problem is that it is sometimes difficult to separate this part from everything else. If you are lucky, everything else is just a background, and the variation of the signal across the reciprocal lattice, due to the fact that it obeys Bragg's law, is enough to separate out what is elastic from what is not; but that is not always the case, especially when dealing with signals that are short-range correlated and weak, and sometimes this poses problems, as we will see in some of the cases.

So what can we do if we add coherence to the game? Quite a lot. What is coherence? Let me try to say a word about it, not because it is really necessary in this environment, this being a seminar series on coherence, but just, as I said, to put everybody on the same page for what I will use later in the talk; please bear with me if the formulation is not very precise. This is essentially what is needed to understand everything else that follows. Suppose you have poor control of your wavefront, where you can control only the average distance between the waves in the train impinging on an idealized sample, as described by van der Veen in these beautiful pictures from quite a while ago by now: say a collection of powders, for example, or in general scattering points in your sample. If you have only poor control of the wavefront, and the average parameter lambda, the wavelength, is the only thing you can really control on average, we know perfectly well what the result of this experiment is: a diffuse ring, which is nothing else but powder diffraction. The geometrically relevant parameter you can measure on your detector is essentially related to the characteristics of the light and the intrinsic lattice spacing characterizing the average description of your sample. This is very important, as we all know, and it is even your first microscopic information out of your sample, really relevant because it gives you the average structure. But you can do better if you are able to achieve full control of your wavefront, where
the idealized limit here is the case of a perfect plane wave, although much less than that is actually needed, and we will see a little later how to relax this condition. If, under certain conditions anyway, you have wavefront control, meaning that k depends on position in a definite way, then you can get much more on your detector: you get in focus, let's say, within your diffraction ring, and you see structure. Local intensity fluctuations appear, pixel by pixel, and if the illumination function is good enough that the spacing between these peaks and valleys, which are indeed called speckles, and whose difference constitutes the contrast of the speckle pattern appearing on the detector, is suitable, then you can essentially get Fourier-transform information on the distribution of scattering centers in your sample. That opens up a lot more information in your experiments. This obviously comes with some caveats, because anything that perturbs this situation may wash out your results: not only on the wavefront side, but mostly in the relative positions of the sample and the detector, and anywhere in between; any kind of motion, for example an inadvertent motion of the sample, will bring you back to the previous condition of a diffuse ring. So we have to be attentive. The other question that obviously arises is: are we able to produce and control the kind of wavefronts needed to get this useful information out of the samples we want? Yes, we can. In reality, any source is just an overlap of mutually incoherent photons, and even the characteristics of the propagation of the light can be described as an overlap of different energies and different directions of propagation. So it is sufficient to constrain the spatial coherence by collimation, and then to filter, to discriminate the energies, to obtain exactly what we need. The problem is how many photons are left, because one photon is perfectly coherent but not very useful, unfortunately, or at least not always useful per se. The treatment is in reality quite simple: it relies completely on the undulatory nature of the light itself, and it is natural to split the problem into projections along the two relevant directions, the one of propagation of the light and the transverse one. For the former we have the temporal, or longitudinal, coherence length, which is obviously related to the monochromaticity of the light; this is, for example, its expression in terms of the resolving power of the monochromator or equivalent device. Then you have the spatial, or transverse, coherence length, which is essentially a geometric characteristic, set by the solid angle under which you look at the source. Equivalently, let me say, both of them, as you see, depend linearly on lambda: once those two parameters are singled out, and you can assume that under reasonable conditions, over a relatively extended range of energies or wavelengths, each parameter is essentially constant, as is typically the case, both coherence lengths are essentially linear in lambda, and this has some important consequences that we will see in a second. Indeed, why do we want to go soft? We all know from the expression we gave before for the intensity at the detector that we are expecting to pay an enormous price in terms of the Ewald sphere. Again using iron as a case, if you move in resonance from the K edge, here at around seven kiloelectronvolts, to the L edges, which are one order of magnitude
smaller in energy, as is typically the case for the transition metals, and you go to the L3 at 700 eV, you see that the diffraction from a typical structure, roughly a perovskite, say, changes dramatically: in one case you populate the Ewald sphere with many reachable points, and so you can perform many kinds of experiments, while in soft X-rays you can simply count them. Is it worth it, then? There are two considerations to make from the purely technical point of view. One is that having several diffraction points is not always an advantage; for example, it exposes you to limitations due to multiple scattering, which is certainly not a problem in soft X-rays. The other is that, yes, you cannot study the structure, or at least not directly, but you can study any kind of order that is susceptible to frustration, and such orders typically have a propagation vector that is small, so a structure that is large in real space; and this is exactly what typically happens, luckily, to electrons in complex systems, where frustration is one of the ingredients of the complexity. Very likely you will have some propagation vector that sits in the surroundings of the zero propagation vector itself, so it can likely be studied by soft X-rays, and this is exactly what we are interested in. The other advantage is what we already showed before: apart from the fact that the scattering cross section is much smaller than the absorption one, as I was trying to show before, the difference between the K resonances and the L resonances for the transition metals is massive, at least one order of magnitude or more, so there is an obvious advantage in terms of the strength of the signal, along with other characteristics that I will show in a moment, in moving to soft X-rays.

Indeed, why coherence, and why soft, or better, why soft in particular? Let's recap. In terms of scientific motivation, there is the relevance of the target electrons we are aiming at: the L edges of the transition metals are the way to tune to the electrons that actually carry the information about the electronic order, the correlations and so on that develop in the relevant cases. The other reason is that, luckily, if you take most of the interesting materials available in nature, and here we depict some cases taken from one of the university groups, through STM images, you will see that there is all kind of rearrangement and inhomogeneity, and so, potentially, propagation vectors, even if the structures are irregular. So we have a plethora of cases that are both suitable and very interesting to study, and inhomogeneities and textures in particular are very relevant; here again coherence will play a major role. Technologically there is another good reason, because it is difficult to produce, as we said, a lot of coherence, or better a high coherent flux, so good coherence and good intensity at the same time. In soft X-rays, one of the key parameters at play is exactly the longitudinal one: we can relax a lot the requests in terms of longitudinal coherence length, because the penetration depth into the material is anyway very small, so even if we do not monochromatize our radiation too much, it is sufficient, from the coherence point of view, to form beautiful speckles. Long story short, this means that soft X-rays, by chance, essentially because of the average density of the available materials and so on, fall into the happy spot: strongly interacting enough to bring information, very strong information, out of materials; compare with neutrons, for example, which are the opposite
extreme, if possible; and yet not interacting so strongly as to be confined to the top layer of the material, but bulk-sensitive enough that oxidation or other processes happening at the surface can impact the signal, but not necessarily and not too much. So we are in an ideal position to study, for example, buried interfaces. As we heard yesterday in the beautiful talk by Harald Reichert from the ESRF, one of the directors of the ESRF, in mature third-generation synchrotron sources, as he very correctly called our synchrotrons, given the intrinsic anisotropy of the electron beam populating our machines, the coherence is essentially limited by the horizontal emittance. So here is a plot, at the relevant energies in our case, of the photon diffraction limit and of the horizontal emittances as they evolve, and we see very clearly where our beautiful source, perfectly tuned for soft X-rays, is sitting, where your source is sitting, which I think is slightly better than ours, and where in the future the new fourth-generation soft X-ray sources are supposed to come. So we are getting better and better at covering the crucial part of the spectrum, which in our case essentially sits between 500 and 1000 electronvolts: we spend ninety-something percent of our time there, and although our beamline can go from 250 eV, the carbon edge, up to 2000 eV, most of our experiments are performed in this range. We are not perfectly coherent as a result; we start to get extra modes from 350 to 400 electronvolts upwards, but we still have a relevant coherent fraction in our beam, and I will show better numbers later, just to give you an idea of where we are, what we can do, and where we potentially want to go, or where one can already go at other sources. So how does our floor look? Just to give you an idea: we are at the 23-ID beam port, and in reality we are two beamlines; that is the reason why CSX is called 23-ID-1, because we have a sister beamline, number two, inboard of us, called IOS, for in operando spectroscopy. We could not be more different as beamlines: they are chemistry, spectroscopy and catalysis oriented, and we are hard condensed matter, diffraction and coherent beam, so really, really different. For historical reasons the two beamlines were canted, but by a very small amount, and so, although they are supposed to be completely independent, because two independent EPUs provide light to the two branches, in reality we have crosstalk problems, and we can effectively control the energy of the photons descending our beamline only fifty percent of the time. Long story short, this is a cartoon picture of our beamline. The EPU is sitting way upstream, out of the picture on the right; we have a first mirror, then a monochromator, then a second mirror that is the only horizontally focusing element of the beamline, while the VLS law of the monochromator grating provides the vertical focusing, before the light is sent through the optical aperture sitting roughly 50 meters downstream of the source. This aperture is large enough, and that is the reason why the design is so simple, to preserve the coherence as much as possible and to propagate a lot of photons. As I said, the resolution function, or better the resolving power, of our monochromator is in reality quite limited: we typically live between one and two thousand, probably closer to one thousand, for most of the energy range of interest, but it is exactly matched to what we need to do, as I will detail. So we shine light into our end station, shown here as a large vessel, in ultra-high vacuum for reasons related to the
techniques we use. It hosts either a simple pinhole, or a zone plate and OSA, on a reduced diffractometer, schematized very basically here: one rotational degree of freedom, and a detector that goes around covering essentially two pi of space. It is a fast CCD detector, developed together with collaborators, and it provides very good results. We also have another station at the end of the beamline that I may have time to talk about a little. So let's jump back to the science and try to find the drive toward where we want to land. As you may remember, quite a while ago John Tranquada here at BNL came up with a very nice microscopic electronic model to explain the scientific information that was emerging at that moment. It was known that the superconducting dome of the 214 family of cuprates, and it was not exactly this compound that was studied, but that is irrelevant, it is the same family, so the superconducting dome here, depicted versus doping, had a depression at exactly one-eighth doping, and it was not clear why this was happening. By neutron experiments he measured what looked to his eyes like intertwined periodic orders, characterized by propagation vectors differing by a factor of two between charge and magnetism, and a temperature evolution that clearly went together with the structural information he had available. In the end he had an explanation pulling together in one shot the microscopic and macroscopic properties of the system, if the electrons were thought to condense into an order having at the same time a charge part and a spin part, intertwined together on the copper oxide plane: the so-called stripes. Obviously neutrons are only indirectly sensitive to charge and directly sensitive to magnetism, and so for a long time it was difficult to see whether this order was only related to that specific family, or whether it was explaining other properties that were seen similarly in all the families of the cuprates. I had the luck of passing through the university in Milan when resonant elastic X-ray scattering data in the soft regime were emerging, and we were able to show that at least one other family was behaving exactly the same way; in this case we were sensitive to the charge part, through X-rays, and we could prove that this was indeed the case in the YBCO family. Since then, a lot of effort has been put in, with many techniques, most of them soft X-ray based by the way, to prove that this is actually a ubiquitous property. So the charge part at least, because X-rays are more sensitive to charge than to magnetism of course, behaves ubiquitously, and the various families behave in an equivalent way, although there are important differences, for example in the way the propagation vector evolves versus doping, that still have to be resolved. And this is not so surprising, let me say, because there are differences in structure and so on; it is actually surprising that the charge density wave is so common to all these structures. Long story short, there are a number of questions that are still open and that we need to understand: what is the relation to the crystallographic structure, what is the relation to superconductivity, what is the role of domains; and also why these orders look so different sometimes, with either strong correlations and a strong signal, or a very broad, weakly correlated signal, depending on which of these families is looked at. This is one of the reasons why it took so long to discover that there is a certain universality in these ordered structures. So, having a coherent beam, we started easy: we went to look at LBCO at the magic
composition of one-eighth doping, put the diffractometer at the suitable propagation vector, and got a beautiful signal full of speckles. Here we see a region of our detector, with practically a raw image, maybe corrected for background but nothing else. You see very well that, to a first approximation, we have a kind of Gaussian envelope, but the intensity is modulated essentially pixel by pixel. If we make a longitudinal cut, as depicted here by the dashed line, you will see that, over a background that you are always fighting against, because fluorescence, for example, is everywhere, plus inelastic signal in general, and so on, as we saw before, we have indeed a Gaussian-like peak, with oscillations due to the speckles that are forming. If you take only the smoothed peak, the average description if you prefer, that is if you integrate the whole area of the detector and forget about the speckles, and you change the temperature, you see clearly that the order parameter behaves as expected and reported with other techniques, and the full width at half maximum as well. By the way, sorry, I forgot to say that in this specific experiment, performed at the copper L3 edge obviously, with our standard 10-micron pinhole spatial filter in front of the sample, we get longitudinal and transverse coherence lengths exceeding what we need, and still 10^13 photons per second of coherent flux on the sample. This is what is needed to achieve the results I am going to show, because everything anyway has a very low intensity at the detector, as we are only dealing with a fraction of an electron wandering across the sample, so obviously the signal is not very strong.
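Numbers like the coherence lengths just quoted follow from the two textbook expressions discussed earlier. As a rough sketch, with formulas and parameter values that are my own illustrative assumptions rather than the beamline's actual figures:

```python
# Back-of-envelope coherence lengths, assuming the common textbook forms:
#   xi_long  = lambda^2 / (2 * d_lambda) = (lambda / 2) * resolving_power
#   xi_trans ~ lambda * L / (2 * s)   (source size s viewed from distance L)
# Both are linear in lambda once resolving power and geometry are fixed.
HC_EV_NM = 1239.84  # h*c in eV*nm

def coherence_lengths(energy_ev, resolving_power, source_size_um, distance_m):
    lam_nm = HC_EV_NM / energy_ev
    xi_long_nm = 0.5 * lam_nm * resolving_power
    xi_trans_um = (lam_nm * 1e-9) * distance_m / (2 * source_size_um * 1e-6) * 1e6
    return lam_nm, xi_long_nm, xi_trans_um

# Illustrative values only: Cu L3 edge (~930 eV), resolving power 1500,
# a 30 um effective source viewed from 50 m.
lam, xi_l, xi_t = coherence_lengths(930.0, 1500, 30.0, 50.0)
print(f"lambda = {lam:.2f} nm, xi_long ~ {xi_l:.0f} nm, xi_trans ~ {xi_t:.0f} um")
```

Plugging in a K edge at ~7 keV instead shrinks both lengths by an order of magnitude, which is the linear-in-lambda point made above: a modest monochromator and a simple pinhole already give soft X-ray coherence lengths larger than typical probed volumes.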
So we have speckles, and we have the temperature variation of the average peak, and we wanted to learn more. As we said, speckles come from the distribution of scattering centers in the sample, call them domains, call them as you want. So how are those domains evolving versus time, and versus temperature? These are the first two crucial questions you may ask yourself, and this is exactly what we analyzed. We were able to show that those speckles do not move at all for hours: everything looks extremely static. If nothing is changing in the beam, if the source and the photon delivery system are under control, the detector is under control, and the temperature and the conditions of the sample are under control, everything is static for hours and hours. That also tells you the quality of our beamline, being able to stay stable for hours and hours. The really interesting parameter, obviously, is the temperature: change it and see the evolution of the domains, of the scattering-center distribution if you prefer, in the sample. The problem is that as soon as you change the temperature, the sample will move in the beam; there is no cryostat that stays stable at the level required in a coherent experiment, which is essentially sub-micrometric control. So the CMP people came up with the beautiful idea of producing pinholes: you see the array of pinholes here, not very contrasted, but you can guess that there is a four-by-four array of pinholes in this very thin gold lamina directly attached against the surface of the sample. They are far enough apart that only one can be illuminated at a time, but small enough that they can be overfilled by our beam at the sample.
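Before going on to the temperature cycling, the connection between wavefront control and speckle, described earlier with van der Veen's pictures, can be made concrete in a small numerical toy. This is entirely my own construction, not the beamline's analysis code: the far-field intensity from a fixed random arrangement of point scatterers under a fully coherent plane wave is |FFT(density)|^2, a speckle pattern with contrast near one, while averaging over many independent arrangements, mimicking incoherent illumination, washes it out into the smooth ring-like average:

```python
import numpy as np

N = 256  # detector grid, arbitrary units

def far_field(seed):
    # Sparse random point scatterers; coherent far field = |FFT|^2.
    rng = np.random.default_rng(seed)
    rho = (rng.random((N, N)) < 0.01).astype(float)
    rho -= rho.mean()  # drop the trivial DC term so statistics are clean
    return np.abs(np.fft.fft2(rho)) ** 2

def contrast(intensity):
    # Speckle contrast std/mean: ~1 for fully developed speckle.
    return intensity.std() / intensity.mean()

coherent = far_field(0)
incoherent = np.mean([far_field(s) for s in range(200)], axis=0)

print(f"coherent contrast   ~ {contrast(coherent):.2f}")    # close to 1
print(f"incoherent contrast ~ {contrast(incoherent):.2f}")  # ~1/sqrt(200)
```

The same toy also shows why stability matters so much: any change in the scatterer arrangement (a different seed) produces a completely different speckle pattern, even though the smooth envelope stays the same.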
Long story short, at the pain, or the price, of realigning the sample and the specific pinhole at every single temperature, you can get the same illumination condition on your sample at any temperature. So here we report a small part of the detected signal on our detector versus temperature, as indicated in the title of each subplot, and your eyes perfectly correlate what is happening where there is a strong or a weak signal: only the average order parameter attenuates, and you cannot see any change in the spatial distribution of the speckles on the detector. If you look here, the signal is almost gone; so let's go a little colder in temperature, say here, already quite close to the transition temperature of the charge density wave, where the order parameter fades away, and you clearly see that where the signal survives is exactly where the signal was stronger at the beginning. It means there is just an average attenuation, but nothing else. We can do better: we can obviously do cross-correlation of images, so that we get a quantitative value of how similar the images are, and indeed we get very high values. At that point we got very interested, and we said, wait a second: now let's take the maximum signal available, warm up and cool down the sample, and take a second image. So we always started at 25 kelvin: we took images at 25 kelvin, made the sample go up in temperature, came back to 25 kelvin, and took another image. This is what is called "before" and "after" each of the temperature cycles reported on top here. Then you compare the two images by cross-correlation, plot that value versus the cycle temperature, and you discover something quite magic. We always measure here, inside the phase; this is the order parameter that fades away at one of the structural transitions of the compound, by the way. So we measure where the order is well formed and strong, and we find a certain correlation. This correlation is kept at whatever temperature you make the sample inspect, in the way I was describing before, up to really high temperatures, way higher than anything you might imagine being correlated or connected with the charge density wave. It changes only after having crossed a certain transition temperature, which is related, again, to a structural transformation of the material. Depicted in a time-wise way: you have a certain speckle pattern, you warm up the sample, you cool down to 25 kelvin again, you take another image, and the two are identical. You warm up a little more, closer to the transition, and come back: there is still some memory, but some details have changed. You cross well into the new phase and come back: there is no relation anymore between what you had before and what you have now. If you then stay below any of these temperatures, you will keep a new memory of a new state that has been written inside the sample; it changes only if you cross this transition temperature again. Please note that when I say speckles are correlated, meaning the cross-correlation gives a high value, it requires the same distribution in terms of position, intensity, and phase of the scatterers in your sample; otherwise the speckles on your detector would not match. So it is really saying that there is something in the system that encodes how the domains have to go back, and they go back exactly the same under the same conditions. So we came up with a model, sketched in this cartoon picture here, and we have a fundamental puzzle: we have something that is exactly static in time and in temperature, but that is evolving in terms of its average description.
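The before-and-after comparison relies on a normalized image cross-correlation. Here is a minimal sketch of one common choice, the Pearson coefficient; the array sizes and synthetic images are invented, and this is not the beamline's actual analysis code:

```python
import numpy as np

def speckle_cross_correlation(img_a, img_b):
    """Pearson cross-correlation of two equally sized detector images:
    1.0 for identical speckle patterns, near 0 for uncorrelated ones.
    Insensitive to a uniform intensity scale, so a plain attenuation
    of the order parameter still scores 1."""
    a = img_a.astype(float).ravel() - img_a.mean()
    b = img_b.astype(float).ravel() - img_b.mean()
    return float(a @ b / np.sqrt((a @ a) * (b @ b)))

rng = np.random.default_rng(0)
before = rng.exponential(100.0, size=(64, 64))    # stand-in base-T image
after_same = 0.4 * before                         # attenuated, same domains
after_new = rng.exponential(40.0, size=(64, 64))  # domains rearranged

print(speckle_cross_correlation(before, after_same))  # ~1.0
print(speckle_cross_correlation(before, after_new))   # ~0
```

A cycle that preserves the domain arrangement scores near one even though the average intensity changes; a cycle that crosses the structural transition scrambles the speckle positions, and the score collapses.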
The order parameter even goes through a maximum. How can we reconcile these two apparently opposite pieces of information? We need to remember what the speckles mean: the speckle pattern in reciprocal space is the Fourier transform of the real-space distribution of your scattering centers. If you assume the scattering centers are dense, this doesn't make sense, but that is not necessarily the case; it is sufficient to understand that they can be sparse. Then everything is clear, as depicted in this simulation. As long as you keep your scattering centers, the correlated part of the signal coming out of your sample, sparse, their Fourier transform mimics the speckles we detect on the detector, and as long as they do not touch each other the speckles stay essentially the same. Only when some of them start to touch each other do you see little differences develop; as soon as they merge together, you get a completely different pattern. This means that it is sufficient to have only a few sparse scattering centers dominating our signal to allow for a complete evolution of the average order parameter, even a maximum in the correlation (in terms of the full width at half maximum of the peak, if you prefer), with no contradiction: as long as this happens on a scale much smaller than the distance to the next scattering center in the real space of the sample, they can do what they want. The important thing is that they do not touch each other while evolving. That told us immediately that the domain distribution has to have certain characteristics.
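The sparse-scatterer argument can be checked numerically. In this toy sketch (grid size, number of centers, and amplitudes are all invented), a fixed set of sparse point scatterers weakens in amplitude, mimicking an order parameter that fades with temperature while the centers themselves do not move; the far-field speckle keeps exactly the same shape and only its average intensity drops:

```python
import numpy as np

N = 256
rng = np.random.default_rng(1)

def speckle(amplitudes, positions):
    """Far-field intensity: the squared modulus of the Fourier
    transform of a sparse real-space distribution of scatterers."""
    density = np.zeros((N, N))
    density[positions[:, 0], positions[:, 1]] = amplitudes
    return np.abs(np.fft.fft2(density)) ** 2

# the same sparse centers at two temperatures: the scattering
# amplitude (order parameter) weakens, but the positions do not move
pos = rng.integers(0, N, size=(15, 2))
cold = speckle(np.full(15, 1.0), pos)
warm = speckle(np.full(15, 0.3), pos)

# the speckle pattern is unchanged, only its average intensity drops
corr = np.corrcoef(cold.ravel(), warm.ravel())[0, 1]
print(round(corr, 6), round(warm.sum() / cold.sum(), 6))  # 1.0 0.09
```

Letting the centers wander, or grow until they touch, is what eventually decorrelates the pattern; a uniform change of their strength does not.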
This is the cartoon picture I use to explain why we see this kind of temperature evolution: there is an underlying order parameter, written in the lattice, that dictates how the charge density wave domains condense in our material at low temperature, and it stays written in spite of cycling across this first structural transition. If you go up to here and come back, the domains come back identical; if you go high enough, you change what is actually imprinted in this intermediate state, that is, how the charge density wave then has to condense, and it condenses in a different way. So you have perfect reproducibility as long as you stay in this part of the temperature-correlation plot, and then a different situation, because you have changed the underlying structure of the material across this transition, and you end up here. Now, this is a very special case, because everything looks static. Is that usually the case? Not at all: normally you can have dynamics, and that can give you much more information. So how do you obtain the dynamic information? You have a detector acquiring images at a certain pace, so that time and image number are equivalent, and you have speckles, for example from a real signal on our beamline, for the same order parameter as before in the case of the cuprates. You can do two things. Either you single out a line, at your choice, vertical or horizontal, and juxtapose the lines coming from the different images, so that you obtain a matrix of pixel versus time (or image number, which is equivalent); this is called a waterfall plot, because if you do not have any dynamics it looks like water falling from the sky to the ground. Or you single out an area and calculate what is called the one-time correlation function.
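A minimal estimator for this one-time correlation function, g2(tau) = <I(t) I(t+tau)> / (<I(t)> <I(t+tau)>), averaged pixel by pixel over a region of interest and over all starting times t, might look like this; the frame stack is invented and this is a sketch, not the production analysis:

```python
import numpy as np

def one_time_g2(frames):
    """g2(tau) over a stack of detector frames, shape (time, ny, nx):
    correlate the intensity pixel by pixel between frames separated
    by a lag tau, average over pixels and over all starting times,
    and normalize by the mean intensities."""
    T = frames.shape[0]
    flat = frames.reshape(T, -1).astype(float)
    g2 = np.empty(T - 1)
    for tau in range(1, T):
        num = (flat[:-tau] * flat[tau:]).mean()
        g2[tau - 1] = num / (flat[:-tau].mean() * flat[tau:].mean())
    return g2

# static speckle: every frame identical, so g2 is a flat line
pattern = np.arange(1.0, 26.0).reshape(5, 5)
static = np.repeat(pattern[None, :, :], 50, axis=0)
curve = one_time_g2(static)
print(np.allclose(curve, curve[0]))  # True: no decay, constant baseline
```

For a dynamic sample the curve instead relaxes from 1 + beta (beta being the speckle contrast) down to 1 with some law whose timescale is the quantity of interest; a static stack gives the flat line.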
"One time" means the lag between the images: as the expression shows, you take the intensity pixel by pixel in one image and correlate it, pixel by pixel, with the intensity in an image taken at a later time, so that the only parameter is indeed the lag, and you average over all initial conditions. Again, if the system is static you expect a flat, straight line; if the situation is dynamic, you expect something completely different. The waterfall will wiggle left and right instead of falling straight from top to bottom, and g2 will relax down with a certain law, which we will discuss, instead of being a straight line versus time. So we have some mathematical tools, and we can see whether applying them to other cases helps us. Just fresh out of the beamline, we have this recent paper in PRL, again in collaboration with the CMP group, on the structure equivalent to the cuprates: the nickelates. Why is this so interesting? Because nickelates have some differences. First of all, from the basic point of view of the technique, charge order and spin order are physically separated in reciprocal space and both accessible, so you can look selectively, with the same technique, at charge and spin in the same sample and at the same position, a position once again constrained by a deposited pinhole, in a slightly different way this time: it is fabricated on the sample itself. So you get both the charge and the spin order; we again have speckles, the average description and the speckle description, and we see that they behave very differently. In terms of correlation length, the spin order is much more correlated than the charge order, while the dynamic information is also very different, and opposite: the one that is more correlated changes its time dependence, so its dynamics, much more drastically than the one that is less correlated. So once again it seems that the charge order has a memory of what is happening, while the spin order can do more or less what it wants; under certain conditions it is more free to evolve, whereas the charge order has to come back pretty much in
the same condition. And again we can make a speckle cross-correlation, as we did before, and indeed we see that the charge order stays constant, for example, while the spin order evolves in a completely different way. We didn't go far enough to see whether the charge order can eventually change too, but there is more: please look at the article, because it is very rich in information. Let me just say that this is the intermediate scattering function, and this is the reason why earlier I plotted g2 minus one: what remains, apart from the speckle contrast, is indeed the dynamical evolution of the order contained in this quantity. This was proved also in other relevant cases, like magnetite, with the electronic ordering across, or around I should say, the Verwey transition, where we clearly saw that at lower temperature we have a more static order; then you warm up and the dynamics accelerate, but finally they slow down again before the transition happens, which has a certain relevance for some electronic models that have been proposed. You can also move to engineered structures, not only bulk or thin-film real materials: you can go for artificial spin ices, for example, and check their magnetic configuration, in this case in a square lattice, as this was the one chosen. Here you work at a much more comfortable temperature, because, being engineered, you can do almost anything you want in terms of materials, symmetry, topology, interaction distance, and so on and so forth. You can tune the sample almost as you want, and you can have a transition temperature tuned around ambient or even above, as was the case here. You see that you get a magnetic signal that, going closer to this transition temperature, develops defects in the order, and indeed you see speckles
that are not present here and start to be present here, due to the fact that you are visualizing the defects in the order. This can be done over time, as I showed before, with a waterfall plot, and you see that if the temperature is small enough compared to the transition temperature, you have some kind of defect but then an essentially static situation all the time. If you warm up, these defects also have dynamics, because they bounce back and forth between multiple states, as is very evident in the part that I zoomed in on here, where there is clearly a kind of instability between a few states. Obviously this is known from other techniques, like PEEM for example, which can access a direct, full-field image over a large area with very high resolution, much higher than what we can do. But when it comes to dynamics, then we can probably say something more: it is very difficult in PEEM to take a very high number of images like this and then correlate everything to understand the small details of the movement of the domains. Here instead we get averaged information, but it is very powerful, because the Fourier transform works for us. I use the same argument to show you that, you see, this is the lattice cell, and given that it is an antiferromagnetic array, in the center of the reciprocal cell you get the purely magnetic information; so you know by localization in q exactly what you are looking at, and then you can check polarization, you can check energy, you can check other things, obviously. And this is the same thing that I showed before, with a little more analysis in terms of the intermediate scattering function, in the derivation I was proposing. You see that once again you have an evolution of the average parameters: the order parameter goes away at the expected temperature; you have a
second-order phase transition, so the order parameter goes away; fluctuations, or better, correlations, go through a maximum; the peak width, call it as you want, goes up; and everything works exactly as we saw in the cuprates. But we discovered that there is much more underneath, and here we have the impression it is the same as well, because if you fit the intermediate scattering function versus time, the initial part can give you the activation energy of what you are looking at, which is a collective motion, a domain motion of the magnetic order in your sample. But if you look a little more attentively, you will see that there is a kind of oscillation everywhere, and so we have the impression that this comes exactly because there is some kind of repetitive (not periodic) wiggling of the magnetic domains in and out. So we were wondering whether this can tell us something more, and we are actively working on it with some groups, in particular in Switzerland, coming from PSI, Laura Heyderman and Valerio Scagnoli, and, now at Columbia University, Sarah Skelbo. We need, however, to make another step: the one-time correlation function is not sufficient anymore, because you average over all the initial conditions, as we said, and the only parameter of relevance is the time lag between configurations. If instead you forget about that averaging and go to the two-time correlation, you say: I do not care about averaging over the initial state; I want to keep track of how this initial state corresponds, after a certain time, to this condition, that condition, and so on. So this time you populate a matrix instead of a line plot, and you will see indeed that there is some kind of, not periodicity, but recurrence, let's say, in the correlation of the images, and it is quite peculiar. As I said, this is all work in progress, so I cannot talk too much about it.
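The two-time matrix can be sketched with a toy model: a system hopping back and forth between two fixed speckle configurations, which is one way to produce block-like recurrences in C(t1, t2). All sizes and the hopping pattern here are invented:

```python
import numpy as np

def two_time_correlation(frames):
    """C(t1, t2): normalized intensity correlation between every pair
    of frames, keeping track of the initial time instead of averaging
    it away as the one-time g2 does."""
    T = frames.shape[0]
    flat = frames.reshape(T, -1).astype(float)
    means = flat.mean(axis=1)
    C = (flat @ flat.T) / flat.shape[1]   # <I(t1) I(t2)> over pixels
    return C / np.outer(means, means)     # normalize by <I(t1)><I(t2)>

rng = np.random.default_rng(1)
a = rng.exponential(1.0, size=(32, 32))
b = rng.exponential(1.0, size=(32, 32))
# hop between configuration "a" and configuration "b" every 3 frames
frames = np.stack([a if (t // 3) % 2 == 0 else b for t in range(12)])

C = two_time_correlation(frames)
# frames in the same configuration correlate strongly even at large
# time separation: those are the off-diagonal blocks of recurrence
print(np.allclose(C, C.T))  # True: the matrix is symmetric
```

Reading the matrix along diagonals parallel to the main one recovers a one-time g2 at a given age; the off-diagonal blocks of high correlation are the recurrences mentioned above.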
It will have relevance, though, for what I am going to show you in a second, in a couple of slides. So what are we doing with these speckles? For what is allowed on our scattering vectors, we are trying to extend our domain of interest in terms of time and energy, obviously over very small energy scales: collective dynamics, as I said. It is very interesting because we are limited on one side by the detector and the flux, on the other side by the stability of the beam, but we are effectively extending this area of interest very much, and it is very relevant because it cuts across, as I showed, fundamental phenomena of interest that span various opportunities. We already cover around five orders of magnitude, between 10 milliseconds, which is our detector limitation at the moment, and the several hours of stability of our beam. But this is not the whole of the richness that is allowed by coherence, because there is also the imaging part. We all know that seeing is so important for us humans, with the visual neural network we have; and also, it is extremely difficult otherwise: if we put a very good bunch of scientists, as depicted here, blindfolded in front of a big problem, it is so easy to get confused. We already have many, many imaging techniques available, providing beautiful results, but every single one of them projects reality in a specific direction due to technical constraints. So it is always good to have more techniques, because shedding new light on the problem can give us new insight, and keep us from being confused. So we tried our best to move in this direction, obviously learning from the very rich and important work done by the existing imaging community, and by adding the resonant part we are trying to make our own contribution. So, starting easy: imaging by
focusing. As I said, we have a Fresnel zone plate setup in the chamber, together with an order-sorting aperture, so we can make the illumination function scan the sample, which stays fixed in space; it is already difficult enough to keep everything fixed at the level required, so we prefer to scan the illumination function rather than the sample, and maybe I will comment a little more on this later. We used this neodymium nickel oxide thin film, across its metal-to-insulator and antiferromagnetic transition, to check one family of the domains, the one-quarter one-quarter one-quarter reflection available inside our Ewald sphere at the nickel L3 energy, and to see how they evolve, what their statistical parameters are, and how they are distributed versus temperature and versus cycling. This was done in collaboration, a collaboration that is continuing, with the Ricardo Comin group. You can see that if you consider some of the statistical parameters of the distribution, you will discover that they are distributed along power laws, and that they follow, or not, a specific cyclicity compared to the thermal cycle you apply. So you can get quite an interesting insight into the whole distribution of the magnetic domains, or of a family of the magnetic domains, in your sample. The same or similar kind of information can come from direct reconstruction, using Fourier inversion, of the coherent images that are formed through speckles at the detector. Apart from a nice test case, just to prove that you have two-micron resolution, you can actually detect structurally and magnetically relevant information across your sample, even when it is partially covered by a gold slab that is used as a reference, just to have a sharp corner or things like that; it is only partially transparent, and waiting
for enough statistics, you can even see that you are able to reconstruct what lies underneath it, and so to check that the presence of a metal close to the surface does not perturb, in a specific way, the ordering you are considering. This was an early effort of the so-called recon team that has catalyzed around our beamline effort, with participation from Los Alamos, MIT, and, at the beginning, the University of Marseille as well. This is still work in progress, so I don't want to show too much, because it is not published. What is instead published is that, out of an experiment we did with a group from Rutgers University, we noticed that, apart from the very strong signal coming from the center of our detector, hidden by our beam stop, the white field hitting our sample through the pinhole (in this case we used the pinhole) can actually bring relevant information. This is a specific sample kept below its magnetic transition temperature, on the magnetic propagation vector, and you can see very well that if you cycle the temperature and come back to the same temperature, as we did before (always the same recipe), there are specific structural defects, marked here with a blue arrow, that are static in the image: they do not move. And you have wiggling things going around that instead are rearranged each time you go back into the magnetic phase. Evidently, and we proved it by calculations and simulations, those are magnetic boundaries, antiferromagnetic domain boundaries, that can be visualized directly, in full field, through the coherence of our beam. This is essentially a complete premiere: there are not so many techniques around that can visualize antiferromagnetic domains, let alone their borders, so clearly and so nicely, and this brought the ability of looking
at the arrangement of antiferromagnetic domains in systems with a great level of detail. Now there are two things to say: why this is interesting, and which limitations we have. Let me start from the limitations, because that will lead immediately into another discussion; I see that I am running a little long, but I will try to be compact. Short of time, the typical experimental dilemma: signal is never enough, and we try to blindly average to increase statistics. This is perfect if everything is static, because you are sure you are adding apples to apples, increasing the signal-to-noise by averaging. But if that is not the case, so the signal is not static, you incur the problem that the average you are doing on your data only saves the part of your signal that is static, and you lose everything else. Ideally, we would love an instantaneous knowledge of the states that our system is occupying, and for that we should be able to do a selective average that says: now I am dealing with apples, so we put them in the apple basket; now the system has transformed to strawberries, so those data go into the strawberry basket; and so on and so forth. The problem is how to do it, obviously. And why is this relevant? It is relevant because there are a number of scientific cases and applications that depend on knowing how the system is evolving, how the phase transformation is happening, and how the system is dealing with its own local excitations: domain walls, magnetic structures, local structures, bubbles, skyrmions, or whatever you want to call them or think about. There are a number of publications showing that we can actually use these kinds of topological excitations, if you want to call them that, to either encode information or even process data, so making calculations happen, or
simulate classical or quantum systems, even with artificial systems for example. So, to move in that direction, we again started easy, because there are enough difficulties already and it is a good idea to keep your life as easy as possible, and we went straight to adding a new endstation at the very end of our beamline, the one I showed you before, dedicated to holography. In collaboration with the MIT group of Professor Geoffrey Beach, we used a perpendicular-magnetization multilayer film, and, exploiting the knowledge of our German collaborators, we produced masks for holography, with holes of controlled dimension exposing the sample, plus a couple of references, all of fixed and very well controlled diameter. Because we have perpendicular magnetization in the film, and the holography setup as I said, using our coherent beam we can get beautiful images that can be inverted, as a holographic image can be, given that you propagate the information correctly. It is relevant also because holography can extract smaller signals out of the background, and it helps you understand whether the contrast you have is enough to explain a static situation, or whether there is a partial dynamic component that sits outside your detection window, too fast for your detector to be seen; as I was saying, the blind average is then essentially done on the sample instead of on your data. And then, use better what is already available: we proved that already-existing tools, projected in a new way, can give new information. This depicts the case of understanding how the relevant charge density wave is distributed inside some samples; then there is the connectivity diagram that I showed before. And let me stress, last, the importance of simulations.
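Fourier-transform holography can be sketched in a toy calculation (object shape, reference position, and all dimensions are invented). Because the reference pinhole acts as a near point source, a single inverse Fourier transform of the measured intensity already contains a displaced copy of the object, namely its cross-correlation with the reference:

```python
import numpy as np

N = 256
plane = np.zeros((N, N))

# hypothetical object: a small domain pattern near the mask center
plane[118:138, 118:138] = 1.0
plane[124:132, 124:132] = 0.3   # weaker contrast inside

# reference: a pinhole displaced well away from the object
plane[128, 30] = 1.0

# the detector records only the far-field intensity: phase is lost
hologram = np.abs(np.fft.fft2(plane)) ** 2

# one inverse FFT gives the autocorrelation of the exit wave; the
# object reappears as a sideband shifted by the object-reference
# separation, well clear of the central autocorrelation blob
recon = np.abs(np.fft.fftshift(np.fft.ifft2(hologram)))

c = N // 2
sideband = recon[c - 12:c + 12, c + 86:c + 112]       # expected sideband
print(sideband.max() > 10 * recon[0:40, 0:40].max())  # sideband stands out
```

The central blob is the autocorrelation of the whole exit wave; the two conjugate sidebands, displaced by the object-reference separation, are the reconstructed object, which is why the reference must sit far enough from the object for them not to overlap.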
We are doing them with Oleg and his very powerful SRW program, which is available for everybody, and that is the link. And the conclusions of the conclusions: coherence is fantastic, but it comes at some price, and in particular some attention has to be paid; new techniques, some of which are under development, bring new perspectives; detectors and photon birth-to-death simulations are very important; and stability is on the map. With this I finish. Thank you very much for your attention, and apologies if I ran a little long.

Thank you so much for this very rich and informative talk. The session is open for questions; Ian Robinson has asked to start. Please, Ian.

Hello everybody. I'm sorry I missed the beginning of the talk, because we have beam time in Grenoble and we were going through a training session, but I caught some of the second half. Claudio, you raised the interesting possibility of doing XPCS in real space, the idea that you could see the domains with holography and then attempt to do correlations of those domains in real space. I'm wondering if that ever works for real, genuine fluctuations, such as the LBCO you also talked about: would you think it is still appropriate to use the idea of a g2 function, or a two-time correlation function? And the second half of the question: do you think the decays would still be exponential for the fluctuations in real space?

Hi, thank you very much for the question; it is quite an extensive one. I think the reply to the first part is yes: we have started to work on specific cases, not only based on holography, and we know it works. The g2 is pretty much the instrument we have at the moment, so it is the one we applied, and it seems to work and give some results. And it is quite interesting indeed, once you have the full, let's say, two-time
correlation function, to revert back to the one-time correlation function and understand how good you are in the approximation of extracting, for example, exponents. Sometimes it works, sometimes not, and that is exactly what I was trying to relate, in a quick and dirty way I agree, with the ergodic approximation. Sometimes your system, given all the boundary conditions you are using (temperature, the speed at which you can acquire the signal, and so on and so forth), is indeed, for the extent of time you can grant to your measurement, in a good approximation ergodic, so you can extract some relevant exponents; sometimes it is not at all. That is also the reason why the two-time correlation analysis is always advised: just to understand whether what you want to extract out of the one-time analysis has a meaning, or whether you are over-interpreting the data and stretching the conclusions.

Karina Thånell, please.

Hi Claudio, very nice talk. We are from the SoftiMAX beamline, so a couple of my colleagues are also here, I saw; we are basically your equivalent here at MAX IV, though not as far ahead as you are in terms of the scattering, so very impressive. A couple of practical questions, maybe. You mentioned wanting to speed up the detector, and obviously in the soft X-ray regime there is not a big choice of fast detectors. Is there anything you know of that is ongoing towards faster detectors, and if so, how fast would you like it?

I don't know a lot. I know that there is a change in technology that is supposed to give some benefits, let's say at the price of some other problems, but it is just becoming available now. There are several producers, several vendors, working on this, but you are absolutely right that our field is such a niche that there is essentially no, or very limited,
investment, unfortunately, in this direction. What I would love to see, obviously, is higher energy resolution in the detectors, even polarization sensitivity of the detector; that would be fantastic. These are all dreams, because there is essentially no investment, no possibility that I know of, in this direction, unfortunately. So we are at the window as much as you are, hoping for somebody to come up with good ideas, and in the meantime we are doing our best to do what we can, at the price of photons, as always.

And regarding the stability that you mentioned: that is obviously also linked to the time resolution, I would imagine. For instance, how is that at your beamline? Are there any specific improvements you had to make in order to get it working, or was that not an issue at this speed?

Can I ask you to detail a little what you meant by implying time resolution and so on? Maybe it is something I did not understand exactly.

I was thinking, because the detectors, compared to hard X-rays, are relatively slow: is the time resolution of the detector somehow linked to the stability required at the beamline, or are there other things at the beamline, sort of random hops on a very slow time scale?

I see; now, that is a very relevant question indeed, and you are right. Let me rephrase, maybe then answer, and then you will tell me. If going fast means that you are losing control of the stability of the detector, in whatever sense we can think about, then you are right that this is a very big constraint, and we should think about it very carefully. But what I call stability concerns the longer timescales, the opposite end of the spectrum if you prefer, and it is the integral of the properties from the source through the photon delivery system to the detector. The sample is a little bit
special, because it contains extrinsic and intrinsic parts, so the extrinsic parts of the sample also enter into the equation. If everything there is set correctly and under control, you can hopefully get some information out of the intrinsic part of the sample; this is what I tried to explain and meant in my talk. So you are absolutely right that if going faster with the detector compromises the stability, it affects the slow part that you are going to check; so yes, it can in principle, and we have to be very careful about it. Let's say that we are in a bit of a sweet spot, where we have a very fast detector, one of the fastest available (not the fastest on the market), and we are somehow able to manage everything such that we also get stability over hours, on good days. Don't get me wrong, it is not always the case; as experimentalists, we know that perfectly well. But if the planets are aligned, let's say, we can get up to hours of stability on the beam, and that is good news.

Exactly. And a very small question: the spot size on the sample, what kind of sizes do you use?

The endstation was designed, when NSLS was still the source on site at BNL, by John in a very clever way, and one of its characteristics was to be able to change the illumination function on the sample a lot. The whole set of optics is available, and it is just a matter of introducing them into the propagation path or removing them: we can essentially span from a fraction of a millimeter down to 100 nanometers. So depending on whether you use a far pinhole, a close pinhole, or the zone plate setup, you can change the illumination function on your sample. Obviously you always pay in terms of flux and other things, but essentially we have this flexibility, which has come in extremely handy, because sometimes
you characterize your sample first in one condition, where you have a lot of signal and it is very easy to navigate, so you know exactly where to go; and then you start to cut down, and you hunt exactly in the direction that is allowed by the setups, in terms of flux, intensity, stability, coherence, and the crossover of all these parameters.