So it's a pleasure for me to introduce this webinar series, which is called CoWorks, from Coherence Workshop. This is an idea that started about a year ago with Edwin Foden, when he was a guest at Lund University. We thought it would be very interesting to organize a workshop on the possibilities of using coherence; then, because of COVID, the workshop was no longer possible, so we decided to transform it into a webinar series instead. So what is this webinar series? Well, as you can see in the title, it is dedicated to all researchers intrigued by the exceptional properties of the new sources, of which MAX IV is one example. Coherence is the keyword of essentially all upgrades of modern-generation sources and of the free-electron lasers. You must have seen these images a few times: on the right we see how the brilliance, which is the number of photons per second per solid angle and per bandwidth that sources have been able to produce, has evolved over the last few decades, and how it is now increasing dramatically with the new-generation sources and the free-electron lasers. On the left we see an image of the economic investment all around the world to either build new facilities or expand existing ones towards this capability. We decided to create this seminar and webinar series really to attract more people towards the use of these techniques. Coherence-based techniques have been developing over the last 10 to 20 years, so the community around them is still quite small, but now, with this increase in coherent sources, there is a need for more brains to think about what science is possible at these sources and facilities. We hope that we can contribute with this to increasing the user base for these techniques.
We have decided to focus this series on coherent imaging, because coherence can also be used for other techniques, but we want to focus on imaging, or microscopy. These techniques, coherent diffraction imaging in particular, have already been used for a wide range of applications. One of the first examples of 3D imaging was on bone, in 2010, but the methods have also been applied to crystalline biomaterials. Here, for example, is a study in which coherence helped reveal that microcrystals in shells are actually made of very, very small nanostructured building blocks, about 200 nanometers in size, so it was quite a revealing study. Coherent imaging has also been used for microscopy of isolated cells; this is a work using holography. And it gives access not only to the electron-density morphology but also to the magnetic morphology, so to speak: this is an example of how coherent imaging has been used to visualize a spin vortex within a cobalt-based material. It is also possible to use coherent diffraction imaging on highly crystalline samples.
This is an example of an analysis of this kind on a nanowire: an extremely small object, extended a few microns in one direction but confined within less than 200 nanometers in the others, so this is also quite a spectacular result. The techniques are more and more being extended to realistic applications: this is an example of how the morphology of a solid oxide fuel cell is affected by operation, all the way to the analysis of how crystal defects can develop during the operation of batteries. So, as you can see, there is a wide range of applications. We have organized a program that starts with an introduction to coherence, which is what we will hear today from Pablo, and in the next lectures we will also have introductory lectures on the techniques themselves. We really want to give an idea of what information can be extracted from such data sets and analyses, what the requirements on the sample are, and what the limitations are. All speakers, especially in the first few seminars, will take care to explain in detail the possibilities and the limitations of the techniques, so don't hesitate to ask questions directly during the seminars, or to contact the speakers after the seminar; the organizers will also be happy to reply. But now I give the word to Pablo. Pablo Villanueva-Perez joined Lund University about a year ago, and he is an excellent example of how one can get fascinated by coherent properties: he was studying high-energy physics, and during a collaboration with a free-electron laser he decided that he was really excited about the possibilities of coherence. For his next postdoc he went to work at PSI with Marco Stampanoni, where he was introduced to tomography, and later on he went to European XFEL, where he continued to develop his passion for coherence. Today Pablo is going to give us an introduction to all the concepts behind coherent imaging. I also know that Pablo likes
to be interrupted during his lectures, so if you have questions, don't hesitate to raise your hand: if you click on the participant list you can raise your hand, I will see it, I will interrupt Pablo and give you the word. So thank you for coming, thank you for joining, and I really hope that this is going to be useful and interesting for you.

Thank you, Dina, for the introduction. I will try to share my screen; I think you should be able to see it now. Can you see the presentation? Yes, go ahead. Okay, great. So thank you, Dina, for those kind words. As Dina said, I am going to present in this webinar an introduction to coherent imaging, so I hope I can give you an overview of all the other talks that are going to happen later and that will develop the details further. Here I give you my email address, so let me change to the laser pointer: here is my email address, so if you have any doubts after the talk, please don't hesitate to contact me, or you can also contact me via the organizers. Today the talk is going to be divided into four topics. First I will start from the very basics: I will describe how x-rays interact with matter, and, as we are going to talk about coherent x-ray imaging techniques, I will introduce the concept of coherence. Then I will move on to the sources: I will try to motivate why we are so excited nowadays and why coherent x-ray imaging techniques are becoming so popular, and for that I will make a brief review of brightness, or brilliance. Then I will talk about coherent and phase-imaging techniques, and I will focus only on the propagation-based kind, but in two regimes that I will explain later: near-field and far-field imaging techniques. To conclude, I will talk about 3D, and how we can reconstruct objects in 3D by using x-rays and coherent imaging techniques. So with that, let's start. Ah, sorry, I forgot: the references for this talk are the following ones. I will mainly use this book, An Introduction to
Synchrotron Radiation by Philip Willmott, because it is a good introductory book for people from different disciplines, like biology, chemistry, and physics. These two other books, Elements of Modern X-ray Physics and Coherent X-ray Optics, are more oriented to physicists, but I also extracted some material from them. So let's start: what is imaging? Well, imaging is nothing more than mapping interactions with a sample, and we do this because seeing is believing: by reconstructing an object in 2D or 3D and seeing its structure, it is easier to make physical interpretations than by having just a physical model and interpreting it. X-ray imaging decomposes into three stages. The x-rays, which are our probe, must interact with our sample; the photons that have interacted are collected by an optical setup and brought onto a detector; the detected signal is then processed in order to reconstruct a map of those interactions within the sample volume. As you see, there are these three different steps, interaction, optical setup, and processing, and they must all be optimized in order to get the maximum out of the measurement. For example, to optimize the interaction we can tweak the energy of the x-rays for a given sample in order to minimize the dose, or to increase or decrease the acquisition times. We can also have different optical setups that collect the photons in different ways, and for that we have imaging criteria that establish which optimal optical setup we should use. Finally, in the last step, the processing, we can optimize by adding prior knowledge about our sample; nowadays we can think about using compressive-sensing or machine-learning algorithms that include further constraints on our sample. Let's go to the main topic: why coherent or phase-contrast imaging? Well, absorption-based radiography and tomography yield little contrast for light materials and for materials with similar atomic numbers. Here you have a slice of a brain from a mouse, and the most important
thing here is that the absorption contrast scales with the atomic number to the fourth power and with the energy to the minus three. That means that high-Z materials, where Z is the atomic number, let's say gold, will have higher absorption contrast than light materials like carbon, which is the basis of life and of biology. The contrast also depends on the energy: for example, if we increase the energy by 10 times, we reduce the absorption contrast by a factor of 10 to the 3, so 1000 times less contrast. Therefore we play with these two parameters, atomic number and energy. To understand the interactions and how we characterize them with x-rays, we have this example. Imagine that you have two waves; these represent two x-ray waves, or two photons, and one of them goes through a material with an index of refraction. This is the way we characterize the material, by its index of refraction, which is equal to one minus delta plus i beta, where delta and beta are real and positive quantities. Therefore the real part of the refractive index is one minus delta, which is smaller than one, where one is the index of refraction of vacuum; beta is again a positive quantity, associated with the imaginary part. So what happens to our initial photon as it goes through the material? First, we will see that the amplitude is reduced; that means that part of the energy of the wave is deposited in the sample. This is regulated by beta, which is associated with all the inelastic interactions that deposit energy in the sample. Then there is another effect we will observe: a phase retardation. If we count the number of periods of the wave that did not interact with the sample and of the one that did, we will see that there is a delay in the number of periods. This is a phase shift that originates from delta, and it is related to the elastic interactions
that do not deposit dose in our sample. Let's see an example. Imagine that we have, let's say, 20 keV x-ray photons and they go through this material, a slab of an organic sample, like a polymer or a biological material, basically made of low-Z elements, and imagine that this slab is around 50 micrometers thick. When we look at the absorption, we get a contrast of only two per mil, pretty low and difficult to detect. However, when we look at the phase, we get a phase shift of about pi, which is the maximum contrast you can get. That means that by exploiting the phase and not the absorption, we can have higher contrast at lower dose, and this is the dream of coherent and phase-imaging techniques. Let's see a little bit how delta and beta behave. In this graph on the right side we study the delta over beta ratio, which is basically related to the cross section of elastic versus inelastic interactions, as a function of the photon energy for different materials. This red curve on the very top is carbon, which is a low-Z material; carbon has Z equal to six. As you see, the ratio delta over beta increases with the energy and reaches a maximum enhancement above three orders of magnitude, so by exploiting that we can have much better contrast than with absorption. However, as you see in the second part of the curve, when we further increase the energy, delta over beta decreases again, and this is because new interactions, such as Compton scattering, which I will not discuss in this talk, start to play a role. If you go to higher-Z materials, the delta over beta enhancement is smaller: as you remember from my previous slide, the absorption contrast scales with Z to the fourth, so for a higher-Z material the delta over beta enhancement is reduced. However, for silicon, which is the basis of the semiconductor industry, we can still get a delta over beta enhancement of around three orders of magnitude.
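As an aside for readers following along, the slab example above can be checked numerically. This is a minimal sketch: the delta and beta values are illustrative stand-ins for a polymer at roughly 20 keV, not tabulated optical constants.

```python
import math

def slab_contrast(delta, beta, thickness_m, wavelength_m):
    """Phase shift and absorption contrast of a uniform slab.

    For n = 1 - delta + i*beta, the phase shift is 2*pi*delta*t/lambda
    and the intensity transmission is exp(-4*pi*beta*t/lambda).
    """
    phase_shift = 2 * math.pi * delta * thickness_m / wavelength_m
    transmission = math.exp(-4 * math.pi * beta * thickness_m / wavelength_m)
    absorption_contrast = 1 - transmission
    return phase_shift, absorption_contrast

# Illustrative (hypothetical) values for a polymer at ~20 keV:
wavelength = 0.62e-10          # metres, roughly 20 keV photons
delta, beta = 5.7e-7, 2.4e-10  # assumed, not tabulated data
phi, contrast = slab_contrast(delta, beta, 50e-6, wavelength)
print(f"phase shift ~ {phi:.2f} rad, absorption contrast ~ {contrast * 100:.2f} %")
```

With these assumed values the script reproduces the orders of magnitude quoted in the talk: a phase shift near pi but an absorption contrast of only a few per mil.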
If we go to even higher-Z materials, things get more interesting. For example, in the case of gold, the delta over beta enhancement reaches a maximum close to 10 to the two, but you can see there are these peaks, and these peaks happen because we have absorption edges in the electronic structure, which further decrease the delta over beta enhancement. So by exploiting delta over beta, that is, by exploiting phase-contrast and coherent techniques, we aim at two dreams. The first one is the dream of zero dose: we increase the energy of the x-ray photons to reduce the absorption and use the elastic contrast, especially for low-Z materials. The other dream is to improve the sensitivity for those low-Z materials, like carbon and silicon, in order to have higher spatial resolution and higher contrast by exploiting elastic interactions rather than absorption. So we want to measure phase: how do we do it? For that we require a quantity that is called coherence. We describe coherence in two terms; there are two kinds of coherence: temporal coherence and spatial coherence. The temporal component is the ability of the light beam to form fringes with a delayed copy of itself. In general, a wavefront does not have only one single wavelength; it has a certain bandwidth, delta lambda. So imagine that we have two waves that start exactly from the same point, one with wavelength lambda and one with lambda plus delta lambda. As they oscillate, they will gradually get out of phase, up to a point where they are in opposite phase; the distance at which these two waves are in opposite phase is what we call the longitudinal coherence length. Then there is the spatial coherence, which is related to the ability of spatially separated points in a wavefront to form fringes. The best way to understand this is with Young's double-slit experiment. Imagine that we have only the purple point as a source, and that it is a delta function, a perfect point source; when the wave arrives at the
screen where the two slits are, it will produce secondary waves, as Huygens' principle dictates, and we will get the purple interference pattern. I will not enter into the details, but you can imagine that we have this interference between these two secondary sources. Now take a second source, the blue one, displaced from the first, and imagine it is again a perfect point source: it will produce exactly the same pattern, but slightly displaced with respect to the purple one. You can imagine that if we had another point even farther away, we could reach the case where the maxima of its interference pattern fall on the minima of the original purple pattern; therefore, by adding points farther and farther away, we lose the capability to resolve the interference pattern. Another way to see this, in a simpler picture, is to think about the old-time light bulbs, the incandescent bulbs with tungsten filaments. These bulbs emit all the visible wavelengths and beyond, and they have a really large size, so they are not spatially coherent either. If we want them to be spatially coherent, we can use a pinhole to create a really small point source. As you can see, this is still not totally coherent, because we have the different wavelengths, like the black and the red ones, which prevent us from performing coherent experiments. A way to filter the different wavelengths is to use a spectral filter, for example here the red spectral filter, which allows only a certain narrow bandwidth to go through; but again, this alone is not a perfectly coherent case, because we do not have the spatial coherence. So, in order to make a perfectly coherent source out of this light bulb, we need to filter for both temporal and spatial coherence, by using a spectral filter and a pinhole, and this will allow us to do coherent experiments. The simplest scalar way to describe coherence is by doing Young's double-slit experiment,
and here I present the image and the two main formulas that dictate the transverse and temporal coherence. The transverse coherence is given by the transverse coherence length, which is basically lambda, the mean wavelength of our radiation, over theta, where theta is the angular source width, the angle under which we see the source from our sample; in this case it is the source size D over the distance z. Then we have the temporal, or longitudinal, coherence length, which is basically lambda squared over delta lambda; this is nothing more than lambda over the relative bandwidth, delta lambda over lambda. When these two quantities are relatively large, so that we have high coherence, we can perform phase-contrast and coherent techniques. How does this relate to the sources, and when do we have coherent sources? The way to measure this is via brightness, or brilliance. Brilliance, or brightness, is nothing more than the number of photons per unit time, per milliradian squared, per millimeter squared, per 0.1 percent bandwidth, as Dina introduced before. The 0.1 percent bandwidth is related to the temporal coherence; the milliradian squared times millimeter squared is actually the phase space in the two dimensions perpendicular to the beam propagation, and this phase space is related to the spatial coherence. So we have one term related to the spatial coherence and one term related to the temporal coherence, and we can interpret the brilliance, or brightness, as a measurement of the coherent flux. On the right side we have a plot of the evolution of the brightness. In order to do coherent experiments and coherent imaging, we want as high a brilliance or brightness as possible, and we are really lucky, because this is exactly the figure of merit that has been used to evaluate and improve the different x-ray sources.
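To make the two coherence formulas concrete, here is a small sketch that evaluates them for a hypothetical source. Note that textbook conventions differ by factors of two; I use the expressions exactly as stated in the talk, and all numerical values below are invented for illustration.

```python
def transverse_coherence_length(wavelength, source_size, distance):
    # l_t = lambda / theta, with theta = D / z the angular source size
    return wavelength * distance / source_size

def longitudinal_coherence_length(wavelength, bandwidth):
    # l_l = lambda^2 / dlambda = lambda / (dlambda / lambda)
    return wavelength**2 / bandwidth

lam = 1e-10                                           # 1 Angstrom x-rays
lt = transverse_coherence_length(lam, 50e-6, 30.0)    # 50 um source seen from 30 m
ll = longitudinal_coherence_length(lam, 1e-14)        # dlambda/lambda = 1e-4
print(f"transverse ~ {lt * 1e6:.1f} um, longitudinal ~ {ll * 1e6:.2f} um")
```

The point of the exercise is the scaling: a smaller source or a longer distance grows the transverse coherence length, and a narrower bandwidth grows the longitudinal one.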
When x-rays were discovered by Röntgen, x-ray tubes had a low brilliance, around 10 to the 7, but after several years, let's say up to the 60s, there was an improvement, because x-ray tubes with a rotating anode allowed a higher heat load and therefore higher fluxes. But the real breakthrough for x-ray imaging and x-ray brightness came when we started using storage rings, which were facilities for high-energy physics: as a by-product of the circular acceleration of the stored particles, they emit x-ray photons that can be used for x-ray experiments. Nowadays we are in a really excellent situation, because new sources like MAX IV are appearing. These sources are what we call diffraction-limited storage rings, where the electron beams are pushed towards their diffraction limit. Furthermore, new facilities have appeared, the x-ray free-electron lasers, which further enhance the brilliance compared to storage rings by 9 to 10 orders of magnitude. These facilities produce laser-like light, almost like what you would obtain from an optical laser; I say almost because the temporal coherence is still not fully there. Well, let's go to the main topic: coherent imaging techniques. We have described when we get coherent sources, and nowadays we are living in a real golden age for coherence, because we have these diffraction-limited sources, and this is the requirement for doing coherent imaging. There are many x-ray phase-contrast and coherent imaging techniques; here I give you a whole spectrum of them, with the main references about them. The simplest way to measure phase is to use an interferometer: you have a beam, imagine an x-ray beam, then we split it into two identical copies; one does not go through the sample and the other one does, so when these two waves interfere we are directly sensitive to the phase. We can also use another setup that is sensitive to the phase, using a lens and a phase plate; this is known as a Zernike microscope, but I
don't have time in this talk to describe it. There are also other setups that are sensitive to the first derivative of the phase, like the analyzer-based techniques, which use a crystal, or the grating interferometers, based on a self-imaging setup; these two techniques are sensitive to the first derivative of the phase. But in the context of this presentation we are only going to talk about propagation-based techniques. These are lensless techniques, where no lenses are used to retrieve the object. I will talk about two regimes: one is the near field, where the signal is proportional to the second derivative of the phase, and the other is the far field, where we will be sensitive to the phase itself, and I will explain how to reconstruct the phase. So let's focus on lensless, propagation-based techniques. To understand these techniques we are going to focus on this sample here in the corner: this sample is basically a set of circles. The red ones represent highly absorbing parts, so you can imagine a high-Z material like gold, and the blue dots do not absorb; they have only elastic, or phase, interactions. The main number for understanding the different propagation regimes is the Fresnel number, which is nothing more than d squared, where d is the typical length scale of our problem, in this case the size of the frame where our circles are, over lambda times z, where lambda is the wavelength and z is the propagation distance from the sample to our detector. By looking at different Fresnel numbers we get different images. For example, in contact, where z is equal to zero, we see that all the phase dots are missing and we are only sensitive to the absorption part. If we move a little bit farther away from the sample, meaning that we have a Fresnel number of order one or larger, we will start to see these rings appear;
therefore, by propagating, we start being sensitive to the phase contrast, to the fringes from phase objects, because propagation produces an interference pattern. If we move farther and farther away, the rings become larger in size, and they appear more or less with maxima and minima. If we move much farther away, with much larger z, we get to the regime called the far field, which is characterized by a Fresnel number much smaller than one. In this regime you can see that the rings start interfering across almost the whole object, and the image even becomes blurrier; in this regime we have imaging techniques like coherent diffraction imaging, as I will discuss later. So the main message of this talk is that by propagating we can build up phase sensitivity, and we have a mathematical way to describe this propagation, via the propagator; I will use the notation h to indicate the propagator. This propagator acts on the waves, not on the measured intensities, and I say this because there is a fundamental problem: our detectors only record intensities. Imagine that on our detector we have a wave, characterized by an amplitude and a complex exponential, a phase, given by the propagation and by the object. If we could measure this complex field on the detector plane, in principle we could get the exit wave right after the sample by applying the inverse propagator, that is, by back-propagating. Unfortunately, our detectors can only measure the intensity, and if we compute the intensity, which is nothing more than the modulus squared of the wave on the detector plane, we see that we lose all the phase information and we are only sensitive to the square of the amplitude A(x, y); the phase is lost and cannot be directly, or simply, calculated from the intensity I.
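Both points above, the Fresnel number defining the regime and the loss of phase in the intensity, can be sketched in a few lines. The feature size, wavelength, and distances below are arbitrary example values, and the regime thresholds are rough rules of thumb rather than sharp boundaries.

```python
import cmath

def fresnel_number(feature_size, wavelength, distance):
    # N_F = d^2 / (lambda * z): large N_F means contact, N_F ~ 1 near field,
    # N_F << 1 far field
    return feature_size**2 / (wavelength * distance)

def regime(nf):
    if nf >= 10:
        return "contact / projection"
    if nf > 0.1:
        return "near field (holographic)"
    return "far field (diffraction)"

lam = 1e-10                       # 0.1 nm x-rays
for z in (1e-4, 1e-2, 1.0):       # propagation distances in metres
    nf = fresnel_number(1e-6, lam, z)   # 1 micrometre feature size
    print(f"z = {z:g} m -> N_F = {nf:g} -> {regime(nf)}")

# The phase problem in one line: the detector records |psi|^2,
# so any phase phi drops out of the measurement entirely.
a, phi = 2.0, 1.234
intensity = abs(a * cmath.exp(1j * phi))**2   # equals a**2 for any phi
```

Walking the same feature through increasing distances moves it from the contact regime, through the holographic near field, into the far field, exactly as the ring patterns in the slides do.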
This is what we know as the phase problem, and phase retrieval tries to recover the phase from the intensity measurement I. In this talk I will review some approximations for retrieving it, for the near-field and far-field techniques. Let's focus on the first technique: near-field, or inline, holography. These are some experiments we recently made at MAX IV, specifically at NanoMAX, together with Sebastian Kalbfleisch and Maik Kahnt. We exploited the unique capabilities of MAX IV: NanoMAX has special optics called KB mirrors, or Kirkpatrick-Baez mirrors, which are able to focus the x-rays down to the nanometer scale. After the focal spot there is a set of positions where we can place our sample, and farther away, around one meter from the sample, we can place the detector. Here I show you several positions of the sample with respect to the focal spot, which I call z1 to z5; z1 is the position closest to the focus, and therefore farthest from the detector, while z5 is the closest to the detector and farthest from the focal spot. You can see that by moving from close to the focal spot to farther away we decrease the magnification, but furthermore we get different propagation artifacts, because we have different propagation distances, and you can see that the shape of the stars, of this pattern, changes with the position of the object with respect to the focus. These fringes carry physical information, and we can try to solve for and retrieve the phase by linearizing the problem. This is possible in near-field techniques, where we have an interferometric pattern, as we do in holography. There are two main scenarios where we can easily linearize and solve: the first one is the contrast transfer function, where we assume a weakly scattering and weakly absorbing object. The other scenario is
the transport of intensity equation, where we linearize the propagator by assuming small propagation distances. Here, in the red areas, I represent the phase contributions that should be inverted in order to retrieve the phase. Okay, let's see the results. Here is the hologram that we measured at NanoMAX, and this is the reconstruction using the CTF approach, by linearizing the object. We obtained these reconstructions using a package that is going to be released soon, open source and in Python; it is called PyPhase, and we are developing it together with Max Langer. By using this NanoMAX full-field microscope we could measure the test stars with 17 nanometer resolution, evaluated by Fourier ring correlation; in fact this resolution is the maximum we could achieve, because it was the diffraction limit given by the focal spot of the KB mirrors. One can also aim at 3D reconstructions by rotating the sample, and as I will explain later, we can do that. We did those experiments at NanoMAX, obtaining 3D x-ray microscopy of cellulose fibers: we managed to reconstruct cellulose fibers at 124 nanometers resolution, estimated again by Fourier shell correlation, where the diameter of the cellulose fiber is 10 micrometers. So let's go to the far field and look at CDI and ptychography. In the far field we can also perform phase-imaging experiments, but in that case the propagator becomes the Fourier transform: if we Fourier transform our sample and calculate the modulus squared, that gives us the intensity we measure in our detector. As I said, we only measure the intensity and we have lost all the phases. How do we retrieve the phases from this intensity measurement? In the far field there are several algorithms, mainly iterative ones, that are capable of reconstructing the phase, and they operate in the following way. Let's imagine that there are two spaces: one is the Fourier space, where our detector is
measuring, where we have a diffraction pattern, an intensity measurement; and then there is the other space, the real space, where our object sits. To move between one space and the other we can propagate the wave by Fourier transforming and inverse Fourier transforming. Let's imagine that we start with the intensities we have measured and we attach random phases to them. Then we can inverse Fourier transform to go to the object domain, and in the object domain we can apply constraints and prior knowledge we have about our sample. We can imagine, for example, that we have a finite support, so our object is constrained to a certain area of the image and the rest we can set to zero; we can also have other constraints, like positivity constraints, or histograms, because we know our phase should be constrained within certain values, etc. Once we apply these constraints, we Fourier transform to go to the detector space; but when we look at the intensities we get after one loop of this iteration, they are not at all the intensities we measured. So what we do is reapply the measured intensities on our detector, then inverse Fourier transform, apply again the same constraints in the object domain, Fourier transform, and keep doing this loop until we reach convergence. There are many algorithms based on this; the first and most famous one is error reduction, by Gerchberg and Saxton, but one of the most popular ones, which avoids stagnation, is by Fienup and is called hybrid input-output, from the end of the 70s and beginning of the 80s. The first demonstration of coherent diffraction imaging with x-rays was done by Miao in 1999. Here I show you an example of how it works: imagine that you have a phase object, this is our reference, and when we propagate to the detector we measure some intensities; the amplitude is basically the square root of the intensities we have measured.
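The iterative loop just described can be sketched in a few lines of NumPy. This is a toy error-reduction demonstration on a synthetic object, with a support and positivity constraint; it is a sketch of the idea, not the production code of any beamline, and all sizes and iteration counts are arbitrary.

```python
import numpy as np

def error_reduction(measured_amplitude, support, n_iter=200, seed=0):
    """Gerchberg-Saxton-style error reduction with support and positivity."""
    rng = np.random.default_rng(seed)
    # Start from the measured Fourier moduli with random phases attached.
    g = measured_amplitude * np.exp(1j * rng.uniform(0, 2 * np.pi,
                                                     measured_amplitude.shape))
    for _ in range(n_iter):
        obj = np.fft.ifft2(g)                              # to object space
        obj = np.where(support, obj.real.clip(min=0), 0)   # apply constraints
        G = np.fft.fft2(obj)                               # back to detector space
        g = measured_amplitude * np.exp(1j * np.angle(G))  # reimpose measured moduli
    return obj

# Synthetic test: a small bright square inside a known support region.
truth = np.zeros((32, 32))
truth[12:20, 12:20] = 1.0
support = np.zeros((32, 32), bool)
support[10:22, 10:22] = True
measured = np.abs(np.fft.fft2(truth))

rec = error_reduction(measured, support)
err = (np.linalg.norm(np.abs(np.fft.fft2(rec)) - measured)
       / np.linalg.norm(measured))
print(f"relative Fourier-modulus error after 200 iterations: {err:.3f}")
```

Note that the recovered object may appear translated or inverted inside the support: the Fourier modulus does not distinguish these "trivial" ambiguities, which is one reason hybrid input-output and good supports matter in practice.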
I wanted to show how the reconstructed amplitude, the reconstructed phase, and the current reconstruction appear, but, oops, sorry, it seems the video doesn't want to work. I see that I have a problem with the video, but essentially what we do is iterate over this cycle, and eventually, in the reconstructed amplitude, you get the sample we simulated. Hello? Yeah, I think you have to go out of laser-pointer mode and then you can click on the video; try that. Yes, but I actually made it automatic within the presentation, and I see that I get an error saying the media cannot be played, so I don't know. It's something I detected this morning: sometimes when I was reloading it failed; I tried to reload it, but I'm sorry, it was working when we tried this just before. So, let's say we have seen the basic idea; what can we do with coherent diffraction imaging? As it is a lensless technique, we can go towards diffraction-limited resolutions, which are given by the wavelength. For example, here is an experiment with CDI where they had a beam of a size around one micron and an object which is a gold nanostructure of 100 nanometers. Here you have the diffraction pattern and some of the reconstructions; from studies of the achieved half-pitch resolution, they managed to evaluate the resolution to be about three nanometers. This is still far from the wavelength, but we can get really high resolutions, not limited by the optics, because it is already really difficult to produce x-ray optics capable of resolving three nanometers. CDI is a really powerful technique, and nowadays it is one of the main imaging techniques at free-electron lasers, in the diffraction-before-destruction scheme. By using CDI we can image particles and nanoparticles in 3D with a resolution eventually limited by the wavelength. I will not start discussing this, because this is the topic that
will be covered in the next presentation, by Tomas Ekeberg, where you will see that from single diffraction patterns we can reconstruct different objects.

The last technique I'm going to talk about today is ptychography. Ptychography was developed for transmission electron microscopy by Hoppe and Hegerl in the 1970s. Most of the techniques we are now using for coherent X-ray imaging come from optics and from electron microscopy, because those sources had high coherence and had to solve these problems before us, so we just adapt or reinterpret them in the X-ray domain. X-ray ptychography is nothing more than a combination of the coherent diffraction imaging I presented before and scanning transmission microscopy. If you look at this image, these orange circles represent the illumination function, that is, where the X-rays impinge on the sample for the different illuminations. As you illuminate here, for each position you record a diffraction pattern, so our dataset is basically a collection of the positions of the probe with respect to the object, (x1, y1) and so on, together with the diffraction pattern for each of these positions. One of the requirements of ptychography, as it is a pseudo-interferometric method, is that there is an overlap between the different illuminations, so there are areas where the sample is illuminated by at least two different probes. The good thing about ptychography is that we can reconstruct the sample and the illumination simultaneously; in coherent diffraction imaging, as we don't have this redundancy, we only reconstruct the object.

We can do that iteratively, as I described before. You can imagine that we have the exit wave, which is now the product of the object O and the probe P at a position that changes for every acquisition; then with a Fourier transform we go to the detector, where we constrain with the measured diffraction pattern; we inverse Fourier transform to go back to our sample plane, and there we can update the object and the probe simultaneously.
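As an illustration of this update loop, here is a toy NumPy sketch of a PIE-type reconstruction. To keep it short, the probe is assumed known and only the object is updated, and the object, probe, scan grid and sizes are all invented:

```python
import numpy as np

rng = np.random.default_rng(0)
N, n, step = 64, 32, 8                  # object size, probe size, scan step (pixels)
obj = np.exp(1j * 0.5 * rng.random((N, N)))     # invented pure-phase object

# invented probe: a soft Gaussian illumination spot (assumed known here)
yy, xx = np.mgrid[0:n, 0:n] - n / 2
probe = np.exp(-(xx ** 2 + yy ** 2) / (2 * (n / 4) ** 2)).astype(complex)

# overlapping scan positions and the "measured" far-field intensities
pos = [(y, x) for y in range(0, N - n + 1, step) for x in range(0, N - n + 1, step)]
data = [np.abs(np.fft.fft2(probe * obj[y:y + n, x:x + n])) ** 2 for y, x in pos]

O = np.ones((N, N), complex)            # flat starting guess for the object
errs = []
for _ in range(20):
    err = 0.0
    for (y, x), I in zip(pos, data):
        psi = probe * O[y:y + n, x:x + n]            # exit wave at this position
        Psi = np.fft.fft2(psi)                       # propagate to the detector
        err += np.sum((np.abs(Psi) - np.sqrt(I)) ** 2)
        Psi = np.sqrt(I) * np.exp(1j * np.angle(Psi))   # keep phase, fix modulus
        dpsi = np.fft.ifft2(Psi) - psi               # back-propagated correction
        O[y:y + n, x:x + n] += np.conj(probe) / np.abs(probe).max() ** 2 * dpsi
    errs.append(err)
```

A full ePIE additionally updates the probe with the symmetric formula, swapping the roles of object patch and probe.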
This is what you are seeing on the right-hand side: the object, in amplitude and phase, is being updated, and at the same time the illumination is updated in phase and amplitude. Oh, this video works. So here you are seeing the scanning process of ptychography: the Lund logo is being scanned, and the probe position stays fixed while the object moves with respect to the probe, and for each probe position we record a diffraction pattern. With that we can get the reconstruction: you can see the reconstruction of the Lund logo appearing position by position, as the algorithm works through the different positions in different steps, so in the end we have reconstructed the phase and amplitude of the object, but also of the wave.

We can use ptychography and the previous techniques also to get 3D information, and ptychography is nowadays one of the state-of-the-art and most popular techniques for nano-tomography. Here you see a setup built at PSI by Mirko Holler, called OMNY, which was used to reconstruct in 3D an entire circuit of a computer CPU chip; with that they could study all the contacts and map them in 3D without, in principle, destroying the sample, although the sample preparation itself was destructive.

With that I would like to go to the last part of my talk: X-rays have a high penetration power and allow us to reconstruct objects in 3D, so let's go to the last topic, which is tomography. In general, normal X-ray images are 2D: the X-rays penetrate through an object and we get a 2D image. But how do we measure in 3D? The idea is that we measure at many different view angles theta and reconstruct in 3D from all these views, so we can reconstruct this function mu, which is the attenuation; I explain it here for absorption, but with some limitations it can also be done for phase. Now, the
technique to reconstruct from different views, by rotating the sample with respect to the source and detector, is called computed tomography. But how is this reconstruction actually done? That is what I will try to explain, treating the problem as the reconstruction of a 2D image from a set of 1D projections. I will simplify the formalism a little, because in principle we reconstruct a 3D object, but I will follow a salami approach: we will study how to reconstruct one slice of the salami instead of the whole salami. Here you are seeing one slice of the salami with these two dots, and for each angle we acquire a 1D profile; therefore, if we want to reconstruct the whole object, we just need to stack many of these slices in the right order.

A word about the formalism used in these slides: we use y as the optical axis, so y is the direction along which the X-rays travel, producing the attenuation profile shown here. Let's assume we have a function f(x, y), which could be our attenuation map mu(x, y), integrated over the X-ray path; in our case, as I said, y is the optical path, so we integrate f(x, y) over the direction called y, which gives the projection p(x). We can compute the Fourier transform of this projection p; the notation is the following: capital P is the Fourier transform of p, the Fourier-conjugate quantity of x is qx, and the Fourier transform takes its typical form, as written here. We can also take the Fourier transform of the whole object f(x, y) directly; for that we do a 2D Fourier transform with the two components qx and qy, integrating over x and y, and with that we have the 2D Fourier transform of our object. Now let's assume that we study only one slice, the slice where qy is equal to zero.
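In the notation just introduced, with y the optical axis, the quantities on the slide can be written as:

```latex
\begin{align*}
p(x) &= \int f(x,y)\,\mathrm{d}y
      && \text{projection of the object along the optical axis } y\\
P(q_x) &= \int p(x)\,e^{-2\pi i q_x x}\,\mathrm{d}x
      && \text{1D Fourier transform of the projection}\\
F(q_x,q_y) &= \iint f(x,y)\,e^{-2\pi i (q_x x + q_y y)}\,\mathrm{d}x\,\mathrm{d}y
      && \text{2D Fourier transform of the whole object}
\end{align*}
```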
With qy equal to zero, F becomes this function here, and inside it we find the square-bracket quantity, which corresponds exactly to the definition of p(x). Therefore what we are computing in the slice qy = 0 of our 2D Fourier transform is the Fourier transform of p, namely P(qx). So the conclusion is: the Fourier transform of a projection along a line, which we call P(qx), is equal to a slice of the Fourier transform of the full function, the slice of F at qy = 0. You can imagine that by taking projections along different directions we build up all the slices to complete the whole 2D Fourier transform of our object, and by inverse Fourier transforming it we get the reconstruction of our object. This is the idea of tomography, and I will try to explain it further with more examples.

Imagine that we have this square object, and our X-rays move along this green line through the red object. This is the transmission, and if we Fourier transform it we get this; from left to right we have the Fourier transform. But we can also directly Fourier transform the whole object, and by integrating in this direction we are studying only the semi-transparent red area which intersects our Fourier transform, and therefore we get this. By rotating the object we can fill this whole 2D volume, in this case, with the Fourier transform, and therefore we can reconstruct our object in 3D.

The real situation is that we don't know the object beforehand, so we have to measure a set of projections. For each projection angle theta we get a 1D function of the detector position, and this path integral of the X-rays through the sample, collected for each angle, forms a 2D dataset called the Radon transform of the object. If we take the 1D Fourier transform of this function for each angle, and then apply the inverse Radon transform, which is a kind of inverse Fourier transform, we get the 3D reconstruction of our object.
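This projection-slice statement is easy to check numerically with plain NumPy FFTs on an invented binary object:

```python
import numpy as np

# invented object f[y, x]; y (axis 0) is the optical axis
f = np.zeros((64, 64))
f[20:40, 25:35] = 1.0

p = f.sum(axis=0)          # projection: integrate along y
P = np.fft.fft(p)          # 1D Fourier transform of the projection

F = np.fft.fft2(f)         # 2D Fourier transform of the whole object

# the q_y = 0 row of the 2D transform equals the transform of the projection
print(np.allclose(P, F[0, :]))   # True
```

The identity is exact for the discrete transform as well, which is what makes the direct Fourier reconstruction possible in the first place.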
I said 'a kind' because there are some things I have not explained: the real algorithm has other ingredients, because we are describing the problem in polar coordinates, and therefore we have to make the transformation between polar and Cartesian coordinates. Here I have a video to try to explain how it works. On the left is the sample, and the X-rays come from left to right; this is the 1D function we measure for a single slice, and this is the sinogram, the Radon transform, that we build up over all the angles. You can see how the sample rotates, and for each rotation we get a transmission profile that we put into our sinogram. Once we rotate from 0 to 180 degrees we have a full dataset; sometimes we will also require 360 degrees, depending on the sample characteristics, for an overcomplete dataset. So once we have rotated over 180 degrees and have a complete dataset, we compute the Fourier transform of each of these lines and position them in Cartesian coordinates by the polar-to-Cartesian transformation. Here you can see the video: we collect the transmissivity profile for each angle, take its Fourier transform, and place it in the Fourier transform of the object along the red line for that angle. Once we fill the whole Fourier domain we can inverse Fourier transform and get the final reconstruction of our object, shown here compared to the original image.

This is the way CT scans work, for example in hospitals. Here is a typical scanner you can find in a hospital, with an X-ray tube and an array of detectors inside; and this is how it works if you take the cover off and see how fast it moves, which also tells you why they are so noisy. Here is the source, here is the array of detectors, and then they start acquiring.
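For completeness, here is a toy NumPy sketch of the whole chain on a two-dot 'salami slice' phantom. Instead of the polar-to-Cartesian gridding described above, it uses the equivalent filtered back-projection form of the inverse Radon transform; the nearest-neighbour rotation, the phantom and all sizes are invented for the illustration:

```python
import numpy as np

def rotate_nn(img, theta):
    """Nearest-neighbour rotation of a square image about its centre."""
    n = img.shape[0]
    c = (n - 1) / 2.0
    yy, xx = np.mgrid[0:n, 0:n]
    x, y = xx - c, yy - c
    xs = np.round(np.cos(theta) * x + np.sin(theta) * y + c).astype(int)
    ys = np.round(-np.sin(theta) * x + np.cos(theta) * y + c).astype(int)
    ok = (xs >= 0) & (xs < n) & (ys >= 0) & (ys < n)
    out = np.zeros_like(img)
    out[yy[ok], xx[ok]] = img[ys[ok], xs[ok]]
    return out

n = 64
phantom = np.zeros((n, n))
phantom[20:26, 20:26] = 1.0        # first dot of the salami slice
phantom[40:46, 38:44] = 1.0        # second dot

# Radon transform: rotate the sample and integrate along the optical axis y
angles = np.linspace(0.0, np.pi, 90, endpoint=False)
sinogram = np.array([rotate_nn(phantom, a).sum(axis=0) for a in angles])

# filtered back-projection: ramp-filter each 1D profile, then smear it back
ramp = np.abs(np.fft.fftfreq(n))
recon = np.zeros((n, n))
for a, prof in zip(angles, sinogram):
    filtered = np.real(np.fft.ifft(np.fft.fft(prof) * ramp))
    recon += rotate_nn(np.tile(filtered, (n, 1)), -a)
recon *= np.pi / len(angles)
```

Real CT codes add proper interpolation, apodized filters and fan- or cone-beam geometry, but the structure, acquire a sinogram and invert it, is the same.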
Well, you can hear that some scientists are having fun in the meantime; yes, it can rotate quite fast. With that I come to the summary of my talk. Imaging is nothing more than mapping interactions: we have discussed the effects of the interaction of X-rays with matter, and I tried to motivate why it is important to optimize the dose and how we can do it by exploiting the phase instead of the attenuation contrast. We have introduced the concept of coherence, which is required for coherent and phase-contrast techniques; we have discussed lensless coherent techniques in the near-field and far-field regimes; we have introduced the phase problem and phase retrieval, and shown a little how to solve it for near-field and far-field techniques; and finally we have discussed how to retrieve 3D information from 2D measurements. With that I would like to thank you for your attention.

Pablo, thank you very much for this very useful overview of coherence and its possibilities. I invite everyone to ask questions; I think the audience is a little bit shy, so there is one thing I would like to ask you, which is to talk a little about electron microscopy: why should this inverse imaging be done with X-rays, and how does it compare with electron microscopy?

That's a really good question. Each probe has its advantages and disadvantages. One of the main advantages of electron microscopy is its wavelength: because electrons have mass, they reach much smaller wavelengths at lower energies, so with an electron microscope, having a smaller wavelength, we can probe smaller features. Typical wavelengths of hard X-rays are in the angstrom range, which is the atomic scale, while electron microscopy reaches such wavelengths at much lower energies. That means that, in terms of deposited dose per contrast, electron microscopy can have higher contrast and be able to explore smaller
resolutions; that is why electron microscopy is now really more successful than X-rays at getting single-shot atomic resolution, because of the stronger interaction. On the other hand, having a much larger interaction cross-section with matter than X-rays means less penetration power, so we cannot penetrate very deep: typical penetration lengths in electron microscopy are around hundreds of nanometres. That restricts the samples we can use: samples larger than about 100 nanometres must be pre-processed, destructively, to be studied in transmission electron microscopy. So, as I said, there is this balance: electron microscopy can reach higher resolution and higher contrast than X-rays for a given dose, but it cannot penetrate as far, so we have to cut the sample; X-rays also have the potential to reach atomic resolution, and we do so in crystallography and in some other cases, but above all they have higher penetration power, so they allow non-destructive imaging, which is not possible for large samples with electron microscopy.

Thank you. It is also important to mention compatibility with sample environments: hard X-rays in particular can penetrate a sample environment, so samples can be studied inside it.

Exactly, exactly. One of the main areas where people should focus on X-rays rather than electron microscopy is in-situ experiments and operando conditions; these kinds of things are in general not possible with electron microscopy, with a few exceptions.

Okay. So, Pablo, may I ask whether anyone in the audience who is interested in contacting you privately with more questions about this talk may do so?

Yes, I would be happy to answer.

And the last thing: during your talk you gave a nice introduction to CDI and ptychography, but I would just like to remind everyone that there will be two seminars dedicated to CDI
for crystalline and non-crystalline samples, and one dedicated to ptychography, so the topics that Pablo touched on lightly during this introductory talk will be developed in much more depth, and I really invite everyone who is interested to join us again. The full program is on the LINXS web page; for now we have five seminars planned, one per week, and I'm very happy that we could put this program together. There is one last thing I would like to ask everyone before leaving this chat: if you could just send me a short note about what your interest is, what the topic of your research is, or why you are interested in this coherence seminar series, because the program can in theory still be tweaked to the preferences of the audience. If I get many requests for imaging applications, for example in biology or in technologically relevant samples, maybe we could invite somebody who can answer those questions more specifically. I'm also happy to see that there are lots of people still online, 17 at the moment, with guests from the USA, Brazil, the UK and several Swedish universities, so we really hope this is a good way to gather more interest around these techniques, and maybe to see you as users in the future. Thank you very much for this first day, and I give you an appointment for the next seminar, which I think is the 22nd of... thank you very much.