Okay. Hello everyone. Welcome back. So we are here for the second keynote of the day, and I want to say welcome to Kimberly. Hello. It's really nice to be here. How are you? I'm really good. Thank you. And what about you? Wonderful. It's a beautiful sunny morning here in Rhode Island. Perfect. Perfect. It's also sunny here in Amsterdam, and you don't find that every day. You don't find that in New England either. Okay. So Kimberly is a leading expert in astronomy and visualization. She has been a pioneer in 3D printing and extended reality applications with astrophysics data. I'm really happy you are the second keynote related to astrophysics. I really like how that science is slowly taking over Python. It is. Yeah. Or maybe vice versa. Perfect. So I would say let's put your screen on the stage and see if that works. Okay. Great. Cool. So thank you very much for presenting here at EuroPython, and I will leave the stage to you. Great. Well, thanks everyone. It's really nice to be here. I'm going to talk a little bit about our high energy universe and how we can explore it and understand it through not only sight but also sound and touch. So just a little bit about the observatory that I work for. I work for NASA's Chandra X-ray Observatory, which is a sister telescope to the Hubble Space Telescope. Chandra was launched back in 1999 on the Space Shuttle Columbia. It was actually the largest and most massive payload to ever launch on the Space Shuttle, which made it a rather dangerous payload for the astronauts involved to bring up to space. But everything proceeded very smoothly, and Chandra has been operating in space really beautifully ever since. So Chandra, because it studies X-rays, has to operate above our atmosphere, because one of our atmosphere's superpowers is essentially to protect us from things like X-rays. So Chandra goes about a third of the way to the Moon at its farthest distance from Earth, so that it can get a really clear and uninterrupted view of that high energy universe. So what does that high energy universe look like? It can be things like exploded stars. It can be things like black holes. It can be things like colliding galaxies. And ever so much more; really, most objects in the universe can emit X-rays. So there is an awful lot to study, considering how very big the universe is. So Chandra is really a technological marvel and a true feat of engineering as well as computer science, because we can't ever visit Chandra again with astronauts. We have to take care of Chandra and take Chandra to the doctor, so to speak, by using code from here on Earth. And that code is an interesting mix, because Chandra's development dates back to the 70s and 80s. So there is a bit of a smorgasbord of platforms and languages that were created, and utilized rather, to make Chandra what it is today. So everything from Fortran and C++ to Python. So Chandra is part of this really lovely suite of observatories that go across different kinds of light: telescopes and other detectors, observatories both on the ground and in space. And it's really important to be able to view our universe in all kinds of light. You can liken it to if you only had a piano with four keys on the keyboard, and that's all the sound you had access to. If you wanted to play a piece by Mozart, it would be really hard to play something that sounded interesting or terribly good.
So if we only had four keys, that is essentially what our optical regime of light equates to. So we really need the entire breadth of the electromagnetic spectrum, from gamma rays to radio waves, in order to really fully understand the universe that we all live in. It's essentially filling in all of those keys on the piano keyboard so that we can really experience that full symphony or orchestra that is our universe. So I thought I would take you on a brief tour of some of my favorite sights to see in the universe. I'm going to go really quickly and not spend too much detail on any one of these objects. But starting with this very first image that we released from NASA's Chandra X-ray Observatory back in August of 1999: this is Cassiopeia A, an exploded star. This star was much more massive than our sun; as it grew old it started to run out of fuel, it collapsed, and then it exploded itself to smithereens. And this was one of the first objects that Chandra looked at because it's in our Milky Way, we have a really great view of it, and it was very scientifically interesting. And within the first hour of looking at this one single object, Chandra had already made a discovery, or rather the scientists using Chandra had made a discovery, that you could already detect the neutron star, the leftover core of the star that exploded, right in that central region of the image. But since then, Chandra has gone on to look at so much more: from very famous objects like the Pillars of Creation, made famous by the Hubble Space Telescope, to other stellar nurseries and clusters of stars, all really fascinating things to look at in X-ray light; to older and more mature stars, stars that might explode in our lifetimes or might not; stars that are like our sun, that eventually start to change and evolve and turn into these beautiful planetary nebulas; things like exploded stars, again one of my favorite topics and something I'll keep coming back to over and over (you can see they're my favorite because I put a lot of them in here), and other kinds of exploded stars that leave behind things called pulsars. There are areas around black holes, such as the supermassive black hole at the center of our Milky Way galaxy, which is another favorite image that I'll be talking about later; to galaxies of all shapes and sizes, from pinwheels and exclamation marks to cartwheels and rings and tiny whirlpools and so much more; active galaxies with massive black holes with jets streaming out of them; clusters of galaxies, the largest gravitationally bound structures in the universe; and even those that look like they're smiling back at us thanks to gravitational lensing. So Chandra has been a veritable workhorse for us. It's traveled well over two and a half billion kilometers, about 3,000 trips around Earth. About 25 trillion bytes of data have been collected, and it's still going. And about four million lines of code have been written to operate it, to collect the data, and to analyze the data ever since it launched. So essentially my job is to play in the sandbox of data. It's a really fun job, most of the time. All of the data starts encoded in its suitcase of binary code; that is essentially how we're collecting and transporting the data, if you will. Once we collect it from the object, it gets sent down through NASA's Deep Space Network and over to my laptop here in Rhode Island, which is in New England in the US.
We unpack that suitcase, that binary code, and we have essentially a table of data with the time, the pixel values, the energy of each photon, each packet of energy that struck the detector during the observation. And then from there, depending on what we're trying to do with the data, what kind of analysis has already happened, and what the ultimate product is intended to be, we start working on some other output, which is often an image. So, you know, our images of the universe don't start out looking like the famous Hubble images do. They start out looking a little bit more like this: raw data, black and white data. And then we start to add color. That brings us back to our friend Cassiopeia A, our exploded star, which is about 10,000 light years away from us. A light year is the distance that light travels in a year, which is about 10 trillion kilometers. So 10,000 times 10 trillion kilometers tells you how far away it is, which is not actually that far at all in the universe. So this was our first image, about an hour's worth of observing time. Now, about 20 years later, we have about 2 million seconds' worth of observing time. And you can really see the detail that comes alive when you have a really rich treasure trove of an archive to compile into an extended observation, like we now have on Cassiopeia A. What that additional information tells us are things like the chemical mapping of this exploded star. Exploded stars are really great ways of understanding stars during their lifetimes, because we can't see inside a star when it's alive. So an exploded star is kind of like reverse engineering, or like a CSI crime scene, where by understanding that star and its sort of outputs, if you will, we can actually learn about how that star lived. So in this image you're now seeing these chemical elements color-coded. The iron is in purple, for example, and the oxygen is in yellow. The calcium, the silicon, the sulfur, etc. It's all been color-coded into a map, like a weather map that tells you where the highest wind speeds are, right? And that's really useful information for us. When we have a really nice sandbox of data, we can also look at that data as it changes over time. So for example, here we're looking at observations of the same exploded star from 2004 and 2007. And what we're doing is seeing that supernova remnant as it expands. And we know now that that speed is about 11 million miles per hour, which is an amazing thing to be able to measure. So again, we can understand and learn a lot more about these objects when we have a really rich data archive to work with. And then if we have really, really good data, we can also start to model it in three dimensions. And this version that we're looking at here was one of the first times that we had ever been able to take NASA data of this exploded star, of any exploded star, and model it in 3D so that we could understand which of the light was moving away from us and which of the light was moving towards us. We worked with Dr. Tracy Delaney, who was at MIT at the time, to use some medical software that had been adapted by Dr. Alyssa Goodman's group, also at the Center for Astrophysics here with me, and to take it and adapt it from medical imaging of, say, the brain to 3D imaging of an exploded star. And that's the result that you're seeing now.
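To make that unpacking step a little more concrete, here is a minimal Python sketch of the idea: an event table with one row per detected photon (position and energy) gets binned into a counts image, and separate energy bands get mapped to color channels. This is not the actual Chandra pipeline (the real processing is done with dedicated tools such as CIAO), and the file name, column names, and energy cuts below are illustrative assumptions only.

```python
# Hypothetical sketch: turn an X-ray "event list" (one row per detected photon)
# into an image, and color-code it by photon energy. File and column names are
# illustrative assumptions, not real Chandra data products.
import numpy as np
import matplotlib.pyplot as plt
from astropy.io import fits

with fits.open("events.fits") as hdul:            # assumed event-list file
    events = hdul[1].data                         # assumed columns: x, y, energy (eV)

def band_image(events, emin, emax, bins=512):
    """Bin photons within an energy band into a 2D counts image."""
    sel = (events["energy"] >= emin) & (events["energy"] < emax)
    img, _, _ = np.histogram2d(events["y"][sel], events["x"][sel], bins=bins)
    return img

# Three energy bands mapped to red, green, and blue channels (cuts are made up)
soft = band_image(events, 500, 1200)
medium = band_image(events, 1200, 2000)
hard = band_image(events, 2000, 7000)

rgb = np.dstack([b / b.max() for b in (soft, medium, hard)]) ** 0.5  # sqrt stretch
plt.imshow(rgb, origin="lower")
plt.axis("off")
plt.show()
```

The same kind of per-band (or per-element) image is what gets assigned a color in the composites she describes, and, later on, a sound.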
So again, by understanding what those chemical elements are and where they are dissipating out to, you can really see in this image, or this model, that the iron is actually colored green and it's all along the perimeter now. Before a star explodes, the iron is actually all bundled up in the very core of the star. So a star like Cassiopeia A can quite literally turn itself inside out when it explodes. And it's really nice to be able to visualize that in three dimensions through this kind of modeling. When we have that kind of model, though, we can then do a lot of different things with it. We can apply different textures and colors in additional 3D software to make it look a little bit less biological and brain-like and a little bit more like, I don't know, a celestial object. And it's interesting because when you do these types of models, you can learn, for example with Cassiopeia A, that this type of star kind of comes apart in pieces. It has that very spherical expanding cloud, but then it also has these really strong jets that come out after the explosion and then overtake, or pass by, that cloud of gas. And we can also take it and do 3D printing. And this was a really important step for my own work because it opened up a whole area of accessibility that I hadn't really been attuned to prior to this. So with 3D printing, this model is interesting because you can hold a dead star in your hand, if you will, albeit scaled down greatly, because this object is something like four million billion times the surface area of our sun and planets, and here we're looking at it as a four-inch print that you can hold in your hand. But what this really did show was how this material can be improved and made more accessible, particularly for people who are blind or low vision. And we worked with students here in the U.S. at the National Federation of the Blind to improve this model, and other models as well, to make sure that that value was being added and that the resolution, the size, all of the details of this 3D print were optimized for people who are blind or low vision. We also started working at this time with virtual reality. This is one of my students, who is walking around inside this exploded star, this beautiful Cassiopeia A supernova remnant 10,000 light years away from us. She's walking around inside of it. The camera person is outside, which is why it looks like she's not inside, but she is indeed inside it. And she's able to build that exploded star up by chemical element. And that just adds a new dimension, I guess literally, to the way that we can not only learn about these objects, but also help teach about them to other people. And you can bring that into an office, for example, or a classroom. This is using the Microsoft HoloLens to bring that into augmented reality so that you can, as you're working with the data, build up each layer piece by piece: bring in the iron, bring in the jets, bring in the argon, the silicon, the calcium, change the scale, change the orientation, walk around it, you know, explore it with your other colleagues that are in the room, for example. And you can see in the demo, they're turning pieces on and off. So we haven't explored too much yet about what the value of this is for research, but we have started to explore what the value of this is for communications, particularly with non-experts.
And it's been really interesting to see how providing it in augmented or virtual reality can help people understand the concept of scale a little better. Again, these scales are monstrous, so we can never really experience the scale as it actually is. But it is interesting to be able to break it outside of a small computer screen or phone screen and into a much larger thing. We've also experimented with bringing it into holograms, for example, another way just to sort of bring that dimensionality, which helps us talk to people about how these objects are dynamic, how these are not flat objects out in the universe, the way everything appears to us down here on Earth on our giant TV screen that is the universe. After we started working with all of these sort of 3D components, however, the pandemic hit, and there was definitely a point where we weren't sure how to really continue the research. We had been working on attaching sound to those geospatial plots, those virtual reality models of things like exploded stars or stellar nurseries or what have you. But then during the pandemic, you know, the labs closed, students went home, and we were no longer able to work in person with our colleagues and community members. So at that point, we actually started switching to a form of data visualization that is not visualizing it at all, but rather sonifying it, taking that data and bringing it into the world of sound. I don't believe the sound is going to work here; I'm pretty sure no one can hear it as I'm playing it. So I will go out of the PowerPoint later and play it in the YouTube browser so you can all get to listen to it. But essentially, it's this idea of taking the pixel values that we had in the data, the image, and translating them into sound. So translating that information into sound, not capturing sound from the universe, but just translating the data. And I worked on this project with a group called SYSTEM Sounds, very dear colleagues of mine, Matt Russo and Andrew Santaguida. They are located in Canada, and they had been working in this data sonification area for a couple of years before I started dabbling. And together we've done a few of these projects where we've brought different kinds of objects to life through sound. And it was a bit of an experiment, I would say. I wasn't really sure what the reaction would be to it, but the reaction was incredible. It went viral. We had tested it with people who were blind or low vision to make sure the meaning-making was there and that value was being added. We had learned a lot from colleagues like Wanda Diaz, who is an astronomer and computer scientist who is also blind, and from other colleagues like Garry Foran in Australia and Nic Bonne in the UK, who are also blind scientists in astronomy, and kind of learned lessons from them. And honestly, when the project came out, it was just three sonifications. And within about a week and a half, we had like a million and a half listens on NASA's SoundCloud. And I was really shocked. I think the fact that it came out during the pandemic was one thing; I think there was a need at that time to be able to connect to things in a different way. But it was a really, really pleasant surprise that, again, playing in this sandbox of data led us to a result that was so particularly pleasing, and created with Python. We did the work in Python and then brought it into Logic Pro for that sort of audio finessing.
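As a rough illustration of that translation step, here is a minimal sketch of one common mapping, assuming we already have a 2D image array like the ones above: columns become time, rows become pitch, and pixel brightness becomes loudness. This is not the SYSTEM Sounds code, just the general principle; as noted, the real pieces were then finessed in Logic Pro.

```python
# Hypothetical sketch of a simple left-to-right image sonification:
# each column of the image is a moment in time, each row a pitch,
# and pixel brightness sets how loud that pitch is.
import numpy as np
from scipy.io import wavfile

def sonify(image, duration=30.0, rate=44100, fmin=220.0, fmax=1760.0):
    ny, nx = image.shape
    img = image / image.max()                                   # brightness in [0, 1]
    freqs = fmin * (fmax / fmin) ** (np.arange(ny) / (ny - 1))  # row index -> pitch
    samples_per_col = int(duration * rate / nx)
    t = np.arange(samples_per_col) / rate
    tones = np.sin(2 * np.pi * freqs[:, None] * t)              # one sine per image row
    # Scan left to right: weight each row's tone by that column's brightness
    audio = np.concatenate([(col[:, None] * tones).sum(axis=0) for col in img.T])
    audio /= np.abs(audio).max()                                 # normalize to [-1, 1]
    wavfile.write("sonification.wav", rate, (audio * 32767).astype(np.int16))

# e.g. sonify(band_image(events, 500, 7000))  # reusing the earlier image sketch
```

Different choices, such as which instrument to use, which direction to scan, and how pitch is assigned, are exactly the kinds of decisions she goes on to describe for the individual pieces.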
And there were a number of really specific choices made in that process of sonification, which again I'll talk about a little once we get to hear the sound. But the idea is that all the versions I've shown you, the virtual reality, the augmented reality, the 3D models, the 3D prints, the sonifications, they all started from those same batches of ones and zeros. The snippet on my screen is actually a tiny bit of the binary code from that actual data set of Cassiopeia A, our first light image. So I thought I would show just a few more examples of the types of work that we've been doing with this multimodality. Again, I think there are really interesting things here to learn on the expert side, but also on the non-expert side. And it's something that I'm constantly learning about. I've learned so much about 3D prints and 3D modeling and sound, none of which I had any formal education in whatsoever, but fortunately I have really amazing colleagues to help bring all of that knowledge together. So the Crab Nebula is another object. This is again the result of a different kind of exploded star that created a pulsar, which is a really dense core that's spinning around, kind of like a lighthouse effect. And we're able to look at a two-dimensional image and then look at that data in 3D as well, and again to play sound, which I'll play for you in a bit. And other kinds of exploded stars as well, exploded stars that are much farther away. This is Supernova 1987A, which is in another small galaxy sort of next door. And when you look at this object as a two-dimensional image, it looks kind of like a donut, as you see in that first little bit. But then once you turn that 3D model on its side so that you can see it better and understand it better, you can see that it's actually this material that's sweeping out that is then lighting up this pre-existing ring that was around that exploded star, likely from a previous outburst, and turning it into these lit-up fingers of really high energy material, almost like a crown. And it's an amazing difference to see that in 3D and to explore that through a 3D print than it is to just see it as its donut-like shape when you only have access to the two-dimensional information. And again, we brought that into sound as well, sonified in a radial progression around that bright lit-up ring. Other researchers have done similar work with 3D modeling on other objects. For example, this is the famous Pillars of Creation from NASA's Hubble Space Telescope. There were researchers who created this first 3D print; we then took our data from the Chandra X-ray Observatory, added it on top of the Hubble Space Telescope data, and created a sonification from that as well. So you can hear all of those really bright, high energy newborn and young stars, and then also sort of hear the texture of those pillars of gas and dust that reach up, you know, for light years or even more. So it is really interesting to be able to hear those changes in the data versus just see them. And as researchers like Dr. Wanda Diaz and others have shown, sound is actually really useful in scientific analysis. This is not just for the gee-whiz effect, though I personally have nothing against having an output that has that sort of emotional impact on people, for the pure enjoyment of it. Any way to be able to bring data to people, to be able to share in it, I think is a worthwhile experiment or endeavor. But there is also stuff to learn. And so I believe it was Dr.
Wanda Diaz's paper that showed that you can learn to be a better listener as a scientist, as an astrophysicist specifically. But this also applies to other kinds of science as well. And there are very specific areas of scientific research in astrophysics, for example, where learning to be a better listener can really help you, such as with what are called Cepheid variable stars, where these stars are changing. And the stars change at such a rate that when you listen to the data, you can pick up on that nuance a little easier than if you're just looking at the data. You can think of it like that cocktail party effect, back when we used to be able to go to cocktail parties: if you're in a room with a lot of people and a lot of noise and a lot of activity, you can clearly hear what the person sitting next to you is saying, but you can also hear conversations across the room. You can make out someone washing dishes in the kitchen next door. And you can still hear the dog who is getting excited by being petted on the head in the dining room, and what have you. You're able to map all of these different sounds and understand what's happening in that kind of context. So I think there is a lot to learn from this type of work in the future. We've really just dipped our toe in it. Another example of this type of work: this is a virtual reality piece from Christopher Russell and his colleagues in 2020 of the inner region of our galactic center. So in this virtual reality piece, you are essentially at the position, if you will, of the supermassive black hole at the very center of the Milky Way. And you're looking at around 40 very massive stars around you and kind of understanding their behaviors, and the hot gas that's streaming out in that area, and the orbits. And then we can take that same central region of our Milky Way galaxy and map it to sound as well, so that you have these sort of multimodal experiences. And one of the things that we were doing just before the pandemic was trying to experiment with taking that geospatial information and attaching sound to it, so that you have a sonification, a sound-based experience, as you're walking through a virtual reality piece, kind of trying to combine those modalities. So we have done, as I hinted at, a small research study so far on the sonification project. We did this with about 4,500 responses coming in, and I was quite surprised that so many people were happy to take a survey, including one person who said it was the most fun survey they ever took. Amazing. I don't think surveys are fun, so that was great. This analysis is meant to really study both the expert versus non-expert spectrum and also the BVI, or blind and visually impaired, versus non-BVI spectrum as well. So we had kind of a two-by-two analysis of expert, non-expert, BVI, non-BVI, and we were just trying to understand some of those differences. Overall, there were just really high learning values for everybody, and the enjoyment values were really high as well. But one thing that I thought was really quite interesting, and was unexpected in the results, was that learning about how others access information was an unintended outcome of this project. So people who thought about how people access the universe, and what being able to access our universe through sound can do, became aware that other people did indeed access information differently. And I thought that was really quite lovely. So we are working on further analysis now, on those 4,500 responses.
We had a lot of open-ended questions, because I really did not think we'd get 4,500 responses. So it's a lot of coding that kind of data by hand and trying to get that done in time to get a paper out. But we will be writing and finishing up that paper hopefully soon. And again, just to take it back to the original idea: these are all meant to be different parts of a data pipeline. My sort of hope and idea is that eventually this work in 3D modeling, as well as data sonification, will be part of the research pipeline from the beginning, versus at this point being a piece farther down the data pipeline, as we're getting towards the production end of things. I think there is a lot of value to explore in being able to really assess all of that sciencey goodness from those different kinds of data products much earlier on in the process, rather than just towards the end. But as I've shown here on the website, we've got a growing library of these 3D prints. We have a growing library of the sonifications as well, which I'm going to play for you in just a second as soon as we get out of this PowerPoint. But I just want to say that I really do hope you've enjoyed time traveling with me, this morning for me, this afternoon I suppose for you. It really is a privilege to be able to stare out at this universe every day and take a trip, you know, thousands, millions, billions of light years across the galaxy, across the universe, to be able to understand what's happening. Because all of the stuff that I showed you, those are all sort of baby pictures. Nothing looks anything like that now, of course, because it takes so long for the information to reach us. But it is a worthy endeavor nonetheless. So yeah, these are the types of things that I like to think about. I get to write books about this stuff for fun, and I do have some references on this slide if you would like to follow up on any of these topics. Again, I think I'm going to stop sharing for a second so that I can reshare with sound, if I can manage to do this successfully. So share, share screen, Chrome tab, share audio, data sonification, share. Oh, I think I did it. All right. So starting with the first sonification, hopefully you can hear that. I don't have any feedback to be able to tell me if you can, so I'll just keep going. Yeah, it's working. Great. Thank you. The audio feedback, very helpful. So this is the sonification of that supernova remnant, our good friend Cassiopeia A, 10,000 light years away. We're looking at all of those chemical elements in different colors. The blue, by the way, is that sort of high energy material and that shock wave that's actually pushing out really, really quickly. And so in this case, because it's such a spherical object and it is expanding, the intent of this sonification was to help sort of describe that information. So it is different kinds of angles going out from the very central dot, from that neutron star, that leftover core. And you're hearing that, and also sort of, I hope, feeling it, as it expands. So I will stop talking and press play. OK. And then we also have individual elements sonified as well. I'll just play a quick example so that you can hear it. So this is the silicon that's been sonified. And this is the data sonification of the galactic center. This is one of my absolute favorites. In this case, we've got three different kinds of light. We've got the highest energy material, from the Chandra X-ray Observatory, mapped to a xylophone type of sound.
We have the mid-range energies from the Hubble Space Telescope mapped to a plucked violin. That's all the sort of filamentary textures and structures you're seeing, as well as some young stars. The highest energy material, that xylophone sound, that is stuff like really compact, bright, energetic sources: neutron stars, exploded stars, black holes, of course, and also much more. And then the lowest energy material, from the Spitzer Space Telescope, that's the red type of gas and dust that you're seeing. And you can hear all of those in the symphony when they play together. And as we approach the right side of the screen, you'll hear a crescendo with a lot of activity at that higher energy level. That is the supermassive black hole at the center of our Milky Way galaxy. So I hope that crescendo will be notable for you. So this is a left-to-right scan of the data, because of that sort of horizontal format of the information. But I hope you could clearly hear that crescendo of all that high energy activity that happens around Sagittarius A*, that supermassive black hole, all of those really high-pitched beeps and boops. That is like the downtown of the Milky Way galaxy, where just a lot of stuff is happening. And let me go back to the sonification library. If you're interested in other sonifications, we have a whole curated list of them here on YouTube, slash CXPub. This is the Pillars of Creation one that I mentioned. So in that one, as I said earlier, Matt and Andrew and I were really trying to emphasize the difference between those really tall pillars of gas and dust where the baby stars are forming, and then also, around them, that sort of larger stellar nursery where the high energy young stars are still in the process of developing as well. And so that difference in textures, between the really compact sources from Chandra and that more texturized gas and dust from Hubble, was meant to be very distinct. So that is another favorite of mine that I really enjoy. And let's see if I can get back to the list. I'd like to actually play one last one that I think is really cool. This is the Chandra Deep Field South. And what this is, is a really dark patch of the sky where Chandra looked really deeply for X-rays, for I think it was like 11 days total. So a very, very deep observation, the deepest X-ray observation ever made so far. And in this case, what was found was a massive field of black holes and galaxies. So mostly what you're going to hear is thousands of black holes. And what you're hearing is the difference in the energy levels, from the low energy X-rays as the lower sounds to the high energy X-rays as the higher sounds. And that sort of white-noise-ish sound is what you're hearing when all of that data is stacked up into those really bright white patches. So this is a field of thousands of black holes. And there we go. I hope that was a soothing way to end this very full talk that I gave. Hopefully that was all useful to everyone. In that piece, we used the idea of stereo sound and went from the bottom to the top of the image, so that we could make use of that stereo component for the sound; for a relatively rectangular type of field of view, that really did seem to work. So I'm going to stop sharing there and pop back over to you. There we go. So I guess at this point, do we have follow-up questions, or do we not do questions? I can't recall.
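For readers curious about the stereo choice just mentioned, here is one hedged way such a scan-and-pan rendering might be sketched, reusing the toy sonification idea from earlier: the image is scanned row by row, and each pixel's horizontal position sets its left/right stereo placement. The pitch range, pan mapping, and output format are illustrative assumptions, not the settings behind the actual Chandra Deep Field South piece.

```python
# Hypothetical sketch: scan an image row by row (e.g. bottom to top) and pan
# each column's tone between the left and right channels by its x position.
import numpy as np
from scipy.io import wavfile

def sonify_stereo(image, duration=30.0, rate=44100, fmin=220.0, fmax=1760.0):
    ny, nx = image.shape
    img = image / image.max()
    freqs = fmin * (fmax / fmin) ** (np.arange(nx) / (nx - 1))  # column -> pitch
    pans = np.arange(nx) / (nx - 1)                  # 0 = hard left, 1 = hard right
    samples_per_row = int(duration * rate / ny)
    t = np.arange(samples_per_row) / rate
    tones = np.sin(2 * np.pi * freqs[:, None] * t)   # one sine per image column
    left = np.concatenate([(((1 - pans) * row)[:, None] * tones).sum(axis=0) for row in img])
    right = np.concatenate([((pans * row)[:, None] * tones).sum(axis=0) for row in img])
    stereo = np.stack([left, right], axis=1)
    stereo /= np.abs(stereo).max()                   # normalize to [-1, 1]
    wavfile.write("stereo_scan.wav", rate, (stereo * 32767).astype(np.int16))
```

As with the earlier examples, a raw rendering like this would still need the kind of audio finessing she describes doing in Logic Pro.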
Okay, so we need a minute to recover, because that was amazing. The music was super, super nice. People were saying they got goosebumps. Oh, I love it. That's great. That was what we aimed for. We definitely aimed for goosebumps. That's great. Yeah, so someone was asking if the sounds are available on some online service, so they could listen to them before going to sleep. Yeah, so we actually had someone ask us for permission to use them in a sleep app. I'm forgetting the name of the app, but it's all NASA data, and so we said, yes, use them for whatever you want. So they very well could be in a sleep app near you. I don't know if that ever happened or not. Cool, cool. So let's go for one other question. So yeah, are those sonifications generated or composed? Yeah, so this is a great question, because I do kind of think of it as a hybridization between those two concepts. So they are generated from the data. We are using Python and we are extracting that data, right? And then we are taking it into Logic Pro to do the sort of, I don't know, aesthetic making, if you will, to make sure that all the choices that were made through the scripting process make sense when you listen to it. So it is a sort of combined proposition. We go into the sonification process understanding the scientific story that's there, just like we would for an image, right? So when you're creating an image visualization, you process it with an understanding of the science, and you're making very specific choices of color, of smoothing, of field of view, of how much of different kinds of light to include, all of that in order to tell that science story really properly and also, hopefully, beautifully. So there's always going to be a bias cooked into any type of data visualization or data sonification when you're taking that kind of approach of both a generated and a composed experience. But it is the data that is driving that experience. This is not just made-up sound, you know, played off of instruments. We're taking those instruments and attaching them to the different energy cuts or the different chemical elements, for example, to be able to represent that data accurately, as well as, hopefully, beautifully. Nice. Yeah, I wonder if you're going to have someone making music using the galaxy as their instrument, right? Like they say, yeah, I'm playing this with a guitar. I would love it. I would love it if there was a band out there who actually played some of these pieces. That would be like a dream come true. Yeah. So someone asks, can I print a star at home? Like download the 3D file and print it? Absolutely. All of the 3D files that we've worked on are publicly available. Everything that we do from NASA is public domain, essentially. So if you go to chandra.si.edu/3dprint, that will provide you all of the 3D printing files that you might want. OK, so I'm going to go for the last one. There are a lot of questions, so I'm going to copy them and paste them in the chat. If you want, you can stay in the chat and reply, and people can also maybe ask more. So I need to pick one; sorry if I'm not picking yours. This one, I think, is similar to the one about the sound, and I think it's interesting. So, the pictures are amazing, but the question is, how much of this is for real? And how much is some color tuning for something that we are used to seeing? Yeah, a very valid question. All of the images that I presented today are all actual data. But I guess it comes down to what you mean by an artistic touch.
Because I think there is this idea that these are scientific images only, that it's just the snapshot the telescope took and that's it. But there are people behind that process. There are people, obviously, who made the telescope, who made the software, who are making those images, who are doing those analyses. And all of that personal perspective, of course, is going to be reflected in that data. We are aligning those data visualizations to the scientific story. We're never adding anything. We're never subtracting anything, unless it's an artifact that we know scientifically does not belong there. And I think the biggest point of all of this is that whenever we do processing on our data, be it image, model, or sound, we describe what we do, especially if we're doing anything that we think might sound strange. Because the last thing we want to do is create an environment where people don't trust that information. But that said, I think we do have to hit home that at the end of the day, we are people making these things. So I am bringing my own artistic sense to a project, even though I'm not an artist, by the way, not an artist. But I choose specific colors because I think they work for the data. That is my choice. There is no hard and fast rule that you must use blue for high energy data. That is just a choice that I often make because it really suits the data. So I never tend to want to make an image that looks ugly, and I hope I'm not doing that. But we are definitely, like I said, following the science and then just adding our own perspectives to that output. So I hope that helps. It is very much not a created thing, in the sense of adding or subtracting information. We are hopefully just adding value. Cool. Great. So thank you very much. That was a really nice answer. Thank you. Yeah, I'd be happy to answer questions. I have to pop to another talk, but I can come back and answer more questions in the chat. Yeah. So everyone can find you in the chat. You can maybe say hello and they will find you. So thank you. Thank you very much. It was a really inspiring talk. So good luck. Thank you.