"There is no great chance of observing this phenomenon": that was Albert Einstein's conclusion with regard to gravitational lensing. That is true in a sense, because it is a rare phenomenon, but today collaborations such as OGLE are discovering hundreds and hundreds of these microlensing events. I am doing my PhD under Dr. Nicholas Rattenbury, and I'll be discussing GPU-accelerated modeling of microlensing events using open source technologies. I will revisit some of the concepts relevant to gravitational microlensing, so I apologize if it sounds repetitive; please don't fall asleep. Then I'll go through the modeling process that I use to model these microlensing events using open source resources, and finally present the modeling results that I obtained.

So here we have a light curve for a single-lens event. In gravitational microlensing, as Martin elaborated, light from a distant source is bent by the gravitational potential of a foreground lens object. This is the corresponding magnification map, which I will talk about more later. What we see here is that as the source star approaches the lens star, there is a corresponding increase in brightness. The impact parameter determines what the peak magnification is going to be: the red source track passes quite far from the lens relative to the yellow one, so it gives a tiny peak compared to the yellow one over there.

If instead we have a multiple-lens system, for example a planet orbiting a host star, then we get deviations from the bell-shaped curve: bumps or dips, and this is what we look for. This is a typical planetary signal; it is what tells us we perhaps have a planet orbiting the host lens star. Considering the simple geometry, and skipping all the mathematical derivations, we have a simple lens equation, where theta_E is the angular radius of the Einstein ring.
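As a minimal sketch of the single-lens light curve described above (my own illustration, not the production code from the talk), the standard point-source point-lens magnification and a straight source track can be written in a few lines of Python; the function names and the sample impact parameters are illustrative:

```python
import math

def magnification(u):
    """Point-source point-lens magnification for a source-lens
    separation u, in units of the Einstein radius."""
    return (u * u + 2.0) / (u * math.sqrt(u * u + 4.0))

def u_of_t(t, t0, u0, tE):
    """Source-lens separation along a straight track:
    t0 = time of closest approach, u0 = impact parameter,
    tE = Einstein-radius crossing time."""
    return math.hypot(u0, (t - t0) / tE)

# A distant track (large u0) gives a tiny peak; a close track a tall one,
# mirroring the red and yellow tracks on the slide.
for u0 in (1.0, 0.1):
    peak = magnification(u_of_t(0.0, 0.0, u0, 20.0))
    print(f"u0 = {u0}: peak magnification = {peak:.2f}")
```

A larger impact parameter means the source never comes close to the lens, so the bell-shaped peak stays small.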
The Einstein ring is the ring Martin showed you a Hubble image of; it is what we get when we have perfect alignment of the source, the observer, and the lens. Over here we have the deflection angle, which is the only input from the theory of general relativity; other than that, we don't care. For a multiple-lens geometry, for example a planetary companion to the lens star, we generalize the lens equation from the previous slide to this form in a complex coordinate system, where w and z are the source and image positions respectively; that's what you need to know. We'll be using this equation later on in the modeling.

So the microlensing modeling problem is essentially: find the set of model parameters that best fits the observations from collaborations such as OGLE. What we suffer from is the curse of dimensionality. Why? Because for a single-lens event we have only three parameters, and that is enough to model it. For a multiple-lens system we have three additional parameters, and adding more lenses gives even more parameters. On top of that there are further parameters from higher-order effects: in the single-lens case we just assume a point source and a point lens, but in practice we take into account the finite size of the source star, the orbital motion of the planet around the host star, and so on. So for multiple-lens systems we have a higher-dimensional problem, and that is where the challenge is.
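For reference, the complex-coordinate form of the generalized lens equation mentioned here is usually written as follows (this is the standard textbook form, with all positions in units of the Einstein radius, not a quote from the slide):

```latex
w \;=\; z \;-\; \sum_{j=1}^{N} \frac{m_j}{\bar{z} - \bar{z}_j}
```

where $w$ is the source position, $z$ an image position, and $m_j$, $z_j$ the mass fractions and positions of the $N$ lenses. For a single lens at the origin ($N = 1$, $m_1 = 1$) this reduces to $w = z - 1/\bar{z}$, which inverts to a quadratic in $z$.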
Now, microlensing modeling is an inverse problem. Why? Because we know the position of the source and we need to find the positions of the images, so we need to invert the lens equation. For a single-lens event, inverting the lens equation gives us an analytical expression: it reduces to a quadratic. For a multiple-lens system it gets harder and harder: we get a polynomial of degree n squared plus one, where n is the number of lenses. So we use a numerical technique, inverse ray shooting.

This is a magnification map. Earlier I showed you the magnification map of a single-lens event, which had a nice circular pattern with the lens in the center; this one is for a binary lens. This is not what we see in the sky; it is a mathematical representation of the lens plane. Inverse ray shooting is where billions of rays are shot backwards from the observer plane, through the lens plane, and onto the source plane. On this representation of the lens plane we have these closed curves of very high magnification: the concave curves are called folds, and the points where they meet are called cusps. Each point on the map represents an amplification value, for this configuration where q is the mass ratio and s is the separation of the planet from the host star.

We then project a source track onto the lens plane, at an angle alpha to the observer-lens line of sight and at a minimum distance u0 from the center of the lens. That represents a model path of the source star. Over here we can see it grazing a cusp and cutting the curves at two points, crossing two folds. Then we read the values on the magnification map along the source track and we get a unique light curve. Over here, corresponding to where the source track grazes the cusp, we can see a
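A toy CPU version of the inverse ray shooting just described can illustrate the idea; the real code shoots billions of rays on the GPU, while this sketch of mine shoots a small grid on the CPU, and the grid sizes, mass fractions, and function names are illustrative assumptions:

```python
def binary_lens_map(z, m1, m2, z1, z2):
    """Map an image-plane ray z back to the source plane using the
    binary lens equation  w = z - m1/conj(z - z1) - m2/conj(z - z2)."""
    return z - m1 / (z - z1).conjugate() - m2 / (z - z2).conjugate()

def shoot_rays(m1, m2, z1, z2, half_width=2.0, n=400, bins=80):
    """Shoot a uniform grid of rays through the image plane, map each
    to the source plane, and count hits per source-plane pixel.
    The hit count per pixel is proportional to the magnification."""
    counts = [[0] * bins for _ in range(bins)]
    step = 2.0 * half_width / n
    for i in range(n):
        for j in range(n):
            z = complex(-half_width + (i + 0.5) * step,
                        -half_width + (j + 0.5) * step)
            w = binary_lens_map(z, m1, m2, z1, z2)
            # Bin the landing point if it falls inside the map window.
            bx = int((w.real + half_width) / (2 * half_width) * bins)
            by = int((w.imag + half_width) / (2 * half_width) * bins)
            if 0 <= bx < bins and 0 <= by < bins:
                counts[by][bx] += 1
    return counts

# Equal-mass binary with unit separation (illustrative values only).
mag_map = shoot_rays(0.5, 0.5, complex(-0.5, 0), complex(0.5, 0))
```

This brute-force double loop is exactly the kind of embarrassingly parallel workload that maps well onto a GPU: every ray is independent.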
bump in the amplification (the y-axis) arise and then fall; and where it cuts the folds we get two spikes, which are very high-magnification features. Searching for the best-fit light curve then involves searching thousands of these magnification maps, which means creating hundreds of thousands of light curves, and that makes the process very time-consuming and computationally expensive. Over here you can see just three light curves with all the other parameters the same except for u0 and alpha; imagine creating hundreds of thousands of these.

Now, the code that I'm using has been developed by Joe Ling, who did his PhD at Massey University under the supervision of Dr. Ian Bond, and it is to become open source; he is doing his postdoc at the moment. What is so great about his code? What I like about it is that it is GPU-accelerated, it is executed on a Linux system, and it has pretty Python packaging. I'll elaborate on each of these.

GPU-accelerated code: GPUs are massively parallel processing units, comparable to high-performance cluster computing. For my purposes, for example, the CPU version of the code I'm using would take days, weeks, or months on a desktop, depending on the complexity of the microlensing event being modeled, but the GPU version takes hours, so it is super speedy. The technology we are using is the NVIDIA Tesla K20, with its impressive specifications: a crazy large number of processor cores, crazy speed, and the same for its memory.

The other thing is that I can run it on a Linux system remotely, which means I can work from home, running code that traditionally would need access to cluster computing facilities. Now, I am a Linux newbie; I started using Linux six months ago. I always do things the hard way. I jumped in at the deep end: I
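Reading a light curve off a precomputed magnification map along a projected source track, as described above, might be sketched like this; the parameter names u0 and alpha follow the talk, but the nearest-pixel lookup and everything else here is my own simplification (real codes interpolate and handle finite source sizes):

```python
import math

def light_curve(mag_map, half_width, u0, alpha, times, tE, t0=0.0):
    """Sample a magnification map along a straight source track.
    u0: impact parameter, alpha: track angle in radians,
    tE: Einstein crossing time, t0: time of closest approach."""
    bins = len(mag_map)
    curve = []
    for t in times:
        # Position along the track, in Einstein radii.
        s = (t - t0) / tE
        x = s * math.cos(alpha) - u0 * math.sin(alpha)
        y = s * math.sin(alpha) + u0 * math.cos(alpha)
        bx = int((x + half_width) / (2 * half_width) * bins)
        by = int((y + half_width) / (2 * half_width) * bins)
        if 0 <= bx < bins and 0 <= by < bins:
            curve.append(mag_map[by][bx])
        else:
            curve.append(1.0)  # far from the lens: no magnification
    return curve

# Flat toy map: the track simply reads back 1.0 everywhere.
flat = [[1.0] * 50 for _ in range(50)]
lc = light_curve(flat, 2.0, 0.1, 0.0, [-10.0, 0.0, 10.0], 20.0)
```

Because each (u0, alpha) pair reuses the same map, many candidate light curves can be read from one expensive ray-shooting run, which is why the maps are precomputed.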
learned how to access a server before I learned how to make a directory or how to copy and paste things: simple tasks that you take for granted when you're using Windows. So if I had to give a small piece of advice to any Linux beginner: take the time out to learn the basics from a book. This is a good book, which I have still not read; the first few pages suggest it's a good book. Practice what you preach, which I don't, yeah.

Python with CUDA: CUDA C is an extension to the C programming language which requires specialist GPU programming knowledge, which I don't have. Again, I always do things the hard way. The good thing is that Joe's code is packaged into an executable callable from Python, which I also only started learning when I started my PhD six months ago. But I really, really like Python, mostly because of its visually appealing syntax; it is easy to learn and handle, and of course it has extensive standard libraries.

This is an example of a model that I obtained from the code I just discussed. It's a binary-lens event with all those parameters over here; the x-axis is the time in Julian days. We care about such events because the model suggests there is a planet with a mass ratio of 0.01 and a separation of 0.77; that is, the planet is one-hundredth the mass of the host star.

Ongoing work, or rather future work: the coming observation season starts in March, and I'll be feverishly modeling these events. I am also currently exploring the potential of intelligent search methods. The code I just discussed uses search methods such as grid search, downhill simplex, and Markov chain Monte Carlo optimization. Currently I have been writing a genetic algorithm code, and I have yet to see what advantages it offers when it comes to microlensing events. Okay, thank you for listening.

Thank you, Ashna, that was
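As a hedged illustration of the genetic-algorithm idea mentioned above (my own toy sketch, not the speaker's code): evolve a small population of single-lens parameter sets (t0, u0, tE) against a simple unweighted chi-squared fitness on synthetic data:

```python
import math, random

def paczynski(t, t0, u0, tE):
    """Single-lens (Paczynski) magnification at time t."""
    u = math.hypot(u0, (t - t0) / tE)
    return (u * u + 2.0) / (u * math.sqrt(u * u + 4.0))

def chi2(params, data):
    """Unweighted sum of squared residuals against (time, A) pairs."""
    t0, u0, tE = params
    return sum((A - paczynski(t, t0, u0, tE)) ** 2 for t, A in data)

def genetic_fit(data, pop_size=60, generations=80, seed=1):
    """Toy GA: elitism, blend crossover, multiplicative mutation."""
    rng = random.Random(seed)
    def random_ind():
        return [rng.uniform(-20, 20),      # t0
                rng.uniform(0.01, 1.5),    # u0 (kept positive)
                rng.uniform(5, 60)]        # tE
    pop = [random_ind() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: chi2(p, data))
        elite = pop[: pop_size // 4]       # keep the best quarter
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]  # blend crossover
            k = rng.randrange(3)                         # mutate one gene
            child[k] *= rng.uniform(0.8, 1.2)
            children.append(child)
        pop = elite + children
    return min(pop, key=lambda p: chi2(p, data))

# Synthetic noiseless event with t0 = 2, u0 = 0.3, tE = 25.
data = [(t, paczynski(t, 2.0, 0.3, 25.0)) for t in range(-40, 41, 2)]
best = genetic_fit(data)
```

Real microlensing fits use many more parameters and proper photometric uncertainties; the point of the sketch is only the population-based search loop itself.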
fantastic. Does anyone have any questions they'd like to ask of Ashna? No? Yes, hang on.

Just a suggestion for Nicholas: beyond the K20 there are now K40s and K80s. It's the NVIDIA graphics card, and it's "just" five thousand dollars, but "just five thousand dollars" still has to get past a research grant committee; this is the problem you deal with constantly. So one of the things I did on purpose was to buy a Tesla K20, because I knew the NeSI guys here at Auckland have a cluster of these as well. The idea was that we do our prototyping on a machine in my office, and then if we need to, roll out to identical hardware on the NeSI cluster. This was before I realized just how good the local support here at Auckland is for HPC; I was thinking I would have to do most things myself, like it was ten years ago. Because we have such good support here, that step turned out to be largely unnecessary, and delightfully so. I'm pleased to once again plug the resources and the help from this University for HPC.

Sorry, can I respond? It's for the good, but it's still polite to ask. I have nothing to do with NVIDIA apart from them being a sponsor of Multicore World, but there is something called an academic hardware program where they give you one of these cards for free. The only thing you need to do is ask, and they give you one per year.

Right of reply, version two: once again, when I say "we", myself and Ashna, we are not GPU code specialists; the specialist is Joe Ling at Massey. If his code works on the new cards, then yes, we would. Thank you.

Do we have any other questions? I promise this isn't about GPUs. I'm just wondering: you started off with a maths background, then you went into astrophysics, and now you're doing a computational science approach to looking at microlensing. How do you manage the multidisciplinary aspect of your work, particularly
when you have to do a presentation first at a Linux conference and then later on amongst your own peers at the university? Do you have any difficulty communicating all aspects of your research and the methodology you use to different audiences?

Do I have difficulty? If you ask me to go into much more detail, then yes, I would have difficulty off the top of my head. Yep, sure.

Okay, do we have any others? Yes? No? Okay, I think I'm close this time; just keep your hand a little bit raised and we'll pop around to you.

This is a slightly more general question about this microlensing thing. You know how the light bends to make a bright spot: the place where the light was going to go, must it be like a darker shadow? Can you detect that? Does that make sense? Well, if the light is being bent to come to a point, a brightest point, then where the light was going to go, isn't there a darkening, like the light doesn't reach it?

There are occasions where there are de-magnification regions in which we detect less light. Nicholas, I think you want to respond?

Yes, the answer is yes, but the question is more complicated. As Ashna mentioned, there are regions of de-amplification; we didn't see any in the magnification maps that Ashna showed, but you are quite right: the light that was going to go to one place now goes somewhere else. My former research group boss at Manchester did consider this question, although I don't know whether he actually got anywhere with it, so I can look into that myself and answer your question as to whether he got anywhere with it.

I'm curious where you're going to go next with your research, and I want your answer, not your supervisor's, for the next three years.

In the next three years, or after that? I have been developing a genetic algorithm, which is going well so far, and I'll see where the research takes me. I'm just six months in, and I'm enjoying myself so far.

You mentioned that the code that
you're using at the moment is going to be released as open source, presumably when he's finished. He is, I think, finishing in six months or so, although it takes more time to get your results published and so on after that. Any code that you work on, will we all have to wait until you've finished before we see it?

Yes. Isn't academic life grand? Well, to release it you have to repackage it so that other people can understand your code; before that, it's just you who can understand the code, mostly. But yeah.