No, it's shy, I get it, it's shy. I know, yeah. Okay, and then I'm gonna start. I've been told I have to start. I can understand the distraction. Okay, so I'll just get started. So it's cool to be here. I'm gonna tell you a little bit about how astrophysicists are starting to use Blender, and a little bit about a package that I've been developing. So I figured not everybody knows what astrophysicists do, and it's an excuse to show some cool pictures, so I'll just explain a little bit about what we do. So here's a cool image. Basically this is the group that I'm working with. We do big large-scale structure simulations from the early universe till now, and we try and figure out how galaxies form, what those galaxies look like, how many Milky Ways we get, what happens to metals when they're created, that sort of thing. And so these simulations track both dark matter, which, if you haven't heard of it, makes up about 85% of the matter in our universe, but you can't see it, so since we're very clever, we called it dark matter. And then they also track gas, and gas has a lot more properties, like temperature, composition, things like that. And so you have to include some other physics besides gravity if you wanna be realistic, so you have to have some prescription for how stars form, the effects of magnetic fields, how elements are created and distributed through the universe, these types of things. And so these things can get big. What I mean by big: for example, the one that we're planning, hopefully to start in the near future, has about 100 billion particles or cells, and they each have their own physics, their own composition, things like that. And so we're planning on running this on something like 90,000 cores for several months, hopefully only several months. My postdoc is only so long.
And the snapshot files, which basically give you a snapshot of how the universe looks at a particular time, those things can get up to several terabytes, right? So that's a lot of polygons. I don't know a lot about computer animation, but I do know that that's a lot of polygons. So how the heck do we understand what's going on in our data? And even for my own simulations, for just smaller projects, these snapshot files can get up to a few gigabytes. So even the small things are kind of big. And this is not just a problem for people that are doing simulations; observations now are getting to be bigger and bigger data, too. So I'll just put up two examples. One is the Dark Energy Survey, which is going on right now, and they already have about a petabyte of data. That's looking at the expansion of the universe. Something that's coming online pretty soon is the LSST. That's gonna have probably hundreds of petabytes of data in a decade. So those are big things. And this is not just a problem for people that study big universe stuff. Even people that study planets have this problem. So this is just a cool example of all the exoplanets that we've now discovered. The Earth is the little star there. And so you can see we have this wide range of dynamics. Also we're gonna start getting information, hopefully, about their atmospheres, and so we have to model those things too. I'm very biased towards the fluid simulation side, so that's what I'm gonna talk about, but I just wanted to give you an idea of the scales of data that we're gonna have to deal with in the near future. So here's the workflow of a typical computational astrophysicist. You have to pick a code. Two approaches that are predominantly used: one is adaptive mesh refinement.
So basically you take your domain, you break it up into a bunch of cubes, and you follow how gas flows across those cubes, and then you can adaptively create higher resolution around interesting areas. The other is smoothed particle hydrodynamics, which is basically what the fluid simulations in Blender use. So instead of breaking up your domain, you break up your fluid into particles. You can also do both. I'm actually working on a code that's a hybrid: it's a moving-mesh code that uses Voronoi tessellations to follow things, and you get these cool mesh structures, so I thought I'd show that. So here's just a subset of the different codes that people use, either particle or AMR or somewhere in between. And they usually have their own individual data formats. So that's always exciting if you're trying to compare codes. Okay, so you've picked your physics code. Now you have to actually add physics to it: how stars form, whether there's heat being added to your simulation from supernova feedback. So when massive stars die, they explode as supernovae; you have to model that somehow, all that stuff. And then you send it to a supercomputer, and then you wait for a bit. And then you do your visualization and analysis. And again, all of the different codes have their own data formats, and so usually they have their own specific visualization code. Or there's yt, which is the thing I'll talk a little bit about, and that tries to combine all these data formats. And so I just put up two examples in super weird units, so you can ignore the units. One is more of a visualization plot. Basically I have this cube and I'm modeling how gas forms into a galaxy, and then I just take a slice on the Z-axis and plot it. And so I can ask questions like: where are the spiral arms in my galaxy? So this is a spatial plot, coloring things by density. So I can say, okay, well, probably the higher density regions are where my spiral arms are.
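To make the smoothed-particle-hydrodynamics idea from a moment ago concrete: the fluid is represented by particles carrying mass, and a field like density at any point is a kernel-weighted sum over nearby particles. Here's a minimal, illustrative sketch in plain Python. The cubic-spline kernel is the standard textbook choice; real codes vectorize this and use neighbor lists instead of looping over everything.

```python
import math

def w_cubic_spline(r, h):
    """Standard 3D cubic-spline SPH kernel with support radius h
    (normalization 8 / (pi h^3)); zero beyond r = h."""
    q = r / h
    sigma = 8.0 / (math.pi * h ** 3)
    if q < 0.5:
        return sigma * (6.0 * q ** 3 - 6.0 * q ** 2 + 1.0)
    elif q < 1.0:
        return sigma * 2.0 * (1.0 - q) ** 3
    return 0.0

def sph_density(x, particles, h):
    """Estimate gas density at point x by summing kernel-weighted
    masses of all particles; particles is a list of (mass, position)."""
    rho = 0.0
    for mass, pos in particles:
        rho += mass * w_cubic_spline(math.dist(x, pos), h)
    return rho
```

So where AMR refines the mesh around interesting regions, SPH resolution follows the particles: wherever the fluid piles up, the density estimate automatically gets better.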
So I can look at that. I've also overplotted the grid, so I can say, okay, where are things being refined in my simulation? And then this is an example of an analysis plot. It's a phase space plot. Basically it's telling you how much mass you have in each density and temperature bin. So I can say, for example, where do I expect stars to form? I expect them to form where there's cold, dense stuff, so I can just go to this plot and, in the lower right-hand corner, say, okay, I've got this much mass forming stars. And then you usually just rinse and repeat, right? You're not gonna get it right the first time. So hopefully you don't have to go pick a different code, but you probably do have to augment the physics and just do this over and over again. And then hopefully at some point you get to make a super cool movie. And so AstroBlend sits at this end right here, trying to make these things a little bit easier. And I just wanted to mention that I'm not the only astrophysicist that's working in Blender. Here are a few examples. Miguel Aragon has some really cool interactive pseudo volume renderings using some Voronoi tessellation or Delaunay tessellation, some sort of tessellation. Art Taylor, I can't pronounce his first name so I'm calling him Art Taylor, just came out with a cool volume rendering package in Blender. Also Brian Kent has a nice website and also a great book. So if you have a computational scientist in your life that's really wanting to learn Blender, especially if they do astrophysics, I'd highly recommend his book. There are also other scientific visualization tools, but they all have a variety of problems. Some of them are proprietary, some of them don't have easily accessible data formats, some have user interfaces that leave a little bit to be desired, and none of them really allow for a lot of artistic input.
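Backing up for a second: the phase-space plot from a moment ago is conceptually simple. It's a 2D histogram where each simulation cell deposits its mass into a (density, temperature) bin, usually in log space. A toy version, with made-up bin edges, might look like this:

```python
import math
from bisect import bisect_right
from collections import defaultdict

def phase_histogram(cells, dens_edges, temp_edges):
    """Accumulate mass in (log10 density, log10 temperature) bins.

    cells: iterable of (mass, density, temperature) tuples.
    dens_edges, temp_edges: ascending bin edges in log10 units.
    Returns {(i, j): total_mass}, keeping only in-range cells.
    """
    hist = defaultdict(float)
    for mass, dens, temp in cells:
        i = bisect_right(dens_edges, math.log10(dens)) - 1
        j = bisect_right(temp_edges, math.log10(temp)) - 1
        if 0 <= i < len(dens_edges) - 1 and 0 <= j < len(temp_edges) - 1:
            hist[(i, j)] += mass
    return dict(hist)
```

Cold, dense, star-forming gas then shows up as mass piling into one corner of the histogram, which is exactly the read-off described above.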
So the idea is basically to have this package where you can easily access data from multiple types of scientific simulations, and then be able to interact with the data and do analysis all in 3D, and also be able to include artistic models. So here's an example script. Since I do computational astrophysics, I'm not as friendly with buttons, so I'll just show what I started with, which is an example script, and you can also find this on the website. And so here is the AstroBlend library. I call it science because I can and I think it's funny. So you import science, and then the next step is you load an object. In this case these are some pre-generated surfaces, some isodensity surfaces, meaning they all have the same density. Then you set up your lighting; you say I want it to make a glow. Position your camera. You say I want to render to this location, and then you render. Right, pretty easy. So you get this image. And this image is actually the gas around that little slice plot that I was showing before of a galaxy. So this is the gas falling into the galaxy. And there's two isodensity surfaces; you can really see the outer one, and the colors are temperature, so the hotter things are more yellow, and the glowiness is based on physical parameters, Bremsstrahlung emission, if there are any scientists in here. So you get this cool image. Okay, so you can also do this in a different way, by directly interacting with the data. And this data is actually available; you can download it if you would like from the yt project. They've got a bunch of different data sets. Some are more or less documented, but it's all there. And then you say, okay, I want to extract surfaces with these densities, again in weird units; they're grams per centimeter cubed. Don't ask me why we use those units. And then you say, okay, I want my emissivity to be based on some physical phenomenon. Again, I'm using Bremsstrahlung emission.
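For reference, the scripted workflow just described looks roughly like the sketch below. This is reconstructed from memory of the example on the AstroBlend website, so treat the names and signatures as approximate, and the file paths as placeholders; it also only runs inside Blender's bundled Python, so it's shown here as pseudocode rather than something you can paste into a plain interpreter:

```
import science  # the AstroBlend library

# load pre-generated isodensity surfaces (e.g. an exported OBJ file)
galaxy = science.Load('/path/to/isodensity_surfaces.obj')

# lighting: make the surfaces glow rather than lighting them externally
lights = science.Lighting('EMISSION')

# position the camera and point it at the loaded object
camera = science.Camera()
camera.location = (15.0, 15.0, 15.0)

# choose an output directory and name, then render the frame
render = science.Render('/path/to/renders/', 'galaxy_test')
render.render()
```

Five steps: import, load, light, camera, render, which is the whole pitch — a scientist can get a first image without touching the interface.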
And then you actually load things directly using yt, this other data analysis software that I'll talk about in the next slide. You can just use that as a backend to load things. And so basically what yt is doing is actually querying the data format. Again, it has frontends for a bunch of different astrophysical codes. So it says, oh, I'm using this kind of code; it knows how to query the data file. It can also generate a surface, so it does that. And then it also figures out what colors you want based on the different values on your surface, and also the emissivity. And then it exports those vertices to Blender, and then Blender says, oh, okay, I have these meshes with these colors, and then it generates the mesh. So this is all done in memory. And again, load is able to figure out what you're trying to do, whether you're trying to load an OBJ file or work directly with the data. Okay, so I've been talking about yt. What is yt, actually? This is from their website: yt is a Python package for analyzing and visualizing volumetric, multi-resolution data from astrophysical simulations, radio telescopes, and a burgeoning interdisciplinary community. Woo, okay. Basically I'm just gonna show some cool images that you can make with yt. So this is an example of gas in the early universe collapsing and forming stars, which are the egg-shaped things there, those two eggs. And so you can do some visualization and some analysis. This is one of their volume renderings. This is a big plot. And then there's that phase space diagram that I was talking about before, telling you how much mass you have at a given temperature and density. And then I just thought I'd show some gratuitous images because they're pretty. I think this is radio data from ALMA, and I think it's of a giant molecular cloud. Here, I think this is supernova explosions going off in a small box. Here, hopefully this is self-explanatory.
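Putting the yt-backed path together: driving the surface extraction from yt itself looks roughly like the sketch below. This follows yt's documented surface-extraction API, but the dataset path is yt's sample data, the emissivity field is a placeholder, and I haven't pinned down the exact keyword names, so read it as pseudocode rather than a guaranteed-runnable snippet:

```
import yt

# yt's frontends figure out the on-disk format for you
ds = yt.load("IsolatedGalaxy/galaxy0030/galaxy0030")
sphere = ds.sphere("max", (1.0, "Mpc"))

# vertices of a constant-density surface, value in g/cm^3
surface = ds.surface(sphere, ("gas", "density"), 1e-27)

# write an OBJ/MTL pair that Blender can import, coloring the
# vertices by temperature and emitting by a (placeholder) field
surface.export_obj("galaxy_surface",
                   color_field=("gas", "temperature"),
                   emit_field=("gas", "emissivity"))
```

The in-memory path AstroBlend uses skips the OBJ file on disk and hands the vertices, faces, and colors straight to Blender.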
So yeah, you can use it for multiple things, not just astrophysical simulations, but it was developed for astrophysical simulations. Okay, but there are no interactions in 3D with the data. So I'm gonna show some movies, because the loading is not optimized yet, and I didn't want everyone to just wait during a demo for 20 seconds as something loaded. So basically I've added two panels to the GUI so I can load the data directly within Blender. This is again that isolated galaxy simulation. So it loads it; that's the domain box. Everything is scaled to some units that you get to choose. So now I'm gonna create, with the GUI, one of these isodensity contours. I'm just choosing the color scheme and then the density value I want in grams per centimeter cubed. I'm gonna make it a little see-through. You can also put in a string for your emissivity, but I'm not gonna do that here. And boom, you get this little isodensity contour in your box. That's the same one we had before, it's just not glowy this time. On top of that you can also do analysis plots. So I can do one of those slice plots that I was showing, sort of like the initial visualization you might wanna do, and it will do it in the 3D space. So here I'm again slicing along Z, and you generate it. And so you get your analysis plot in your UV editor, and then you get your slice plot plotted in the 3D space, right? And so you can see where your isodensity contour lives in your slice plot. You can also interact with these things in the 3D space. So here, first I'm making it a little bit bigger, right? Maybe I want a super cool big slice plot. And then you can also do off-axis slices, right? So I can regenerate that. And then I'm just zooming out so you can see the whole slice plot. And then if I go back to the solid view, you can see that there's this extra little empty up there, and I just call it 'empty my slice' because, you know, we called dark matter 'dark matter', so what do you want?
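Mechanically, the two spatial plots here are very simple operations on the gridded data: a slice plot keeps a single plane of cells, while a projection plot sums the field through the box along an axis. A toy version on a nested-list grid indexed [x][y][z]:

```python
def slice_at_z(grid, k):
    """Slice plot: keep the single plane of cells at z-index k."""
    return [[column[k] for column in plane] for plane in grid]

def project_along_z(grid):
    """Projection plot: sum the field through the box along z,
    so every pixel is the integral of the field along the axis."""
    return [[sum(column) for column in plane] for plane in grid]
```

In AstroBlend the resulting 2D image is then texture-mapped onto a plane that you can grab, move, and re-orient in the 3D viewport alongside the contours.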
Anyway, so you can then interact and change the orientation of your slice plot; you can also pick up your actual slice plot and move it around, and it'll change the center. And then you can regenerate your slice, and now you're taking an off-axis version of your slice, and you can see the gas around it on the outside. You can also do this with another kind of plot called a projection plot. It's basically the same formulation as a slice plot; you're just projecting everything along a certain axis. So I'm not gonna do that. Another cool thing you can do is make a phase plot in 3D. So here I'm just gonna generate a phase plot, and again, you can do whatever variables you want on your X and Y axes; I'm just doing temperature and density because that's what I've been showing. Excuse me. And so then you can generate your phase plot and then go back to the solid view, and lo and behold, you're actually taking all of the data within that sphere, and that's what you're using to make your analysis plot. And so that means that you can now interactively move around what data you're actually looking at in your phase plot. So this will allow you to figure out what regions in your phase plot, or in your domain, are actually interesting. I think I'm gonna show one more thing with this simulation. Yeah, so you can also, so I'm just cleaning up things here, you can also plot the grid. And that's another way to look at where the most interesting things are. So this is an easy way to say, oh, well, clearly the most interesting things are gonna be where it's refined; you know, maybe I should make an isocontour there or something. So the other thing that you can do with this, that is very hard to do with other astrophysical analysis tools, is you can then load data from a completely different simulation. And so this is again made with a completely different kind of code, but it scales your domain box into the same scale as your previous simulation.
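That rescaling is just a plain affine map from one simulation's domain box into another's coordinate system. A minimal sketch, with made-up box corners and positions purely for illustration:

```python
def rescale_positions(positions, src_box, dst_box):
    """Map 3D positions from one domain box into another.

    src_box and dst_box are ((xmin, ymin, zmin), (xmax, ymax, zmax)).
    This is the kind of transform needed to drop data from one
    simulation into the coordinate system of a previously loaded one.
    """
    (s0, s1), (d0, d1) = src_box, dst_box
    out = []
    for p in positions:
        out.append(tuple(
            d0[i] + (p[i] - s0[i]) / (s1[i] - s0[i]) * (d1[i] - d0[i])
            for i in range(3)))
    return out
```

Because every loaded dataset lands in the same scaled box, objects from totally different codes can sit side by side in one Blender scene.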
And so in this way, I'm actually loading some SPH data, so a particle-based code. And in this case, I can make a particle cloud out of that. So I'm just gonna do that, and boom, it generates a point cloud. Again, the box size is pretty big, so we have to zoom in a little bit, and I'm just hiding the old domain box. And so then when I render, potentially, I can have the results from multiple simulations in there. So in this case, this is a simulation of a galaxy as well, but it's a point cloud galaxy. So I zoom in a little bit and you can kind of see there's sort of spiral arm structure and stuff there. Okay, so even more on the dev level, we're also able to do volume rendering in memory. So I'll just show a quick little demo of that. So that's that isolated galaxy again, same isolated galaxy; I think I just turned the background to black. Notice that there are no texture panels there. Just keep that in mind for a second. So this again is another way that you can interact with your data in a more intuitive way than making a bunch of volume render plots where you have to figure out where you should put the camera, right? You can just do the volume rendering. And so for example, I could say, hey, there's something that's very interesting in the center, maybe I should check that out. So I can do that while also keeping in mind the larger scale gas structure that I have, right? And then I can also just interactively change what the color scheme is, and that will bring out different features. Okay. So again, everything is done directly in memory. So yt is calling the data, storing it in memory, and then pushing it to the voxel data. However, the way we're doing that is a little sketchy: we're actually overriding the voxel data structure with a ctypes array, which seems a little bit wrong. I mean, look at that face on that corgi.
So I was hoping to maybe talk to a developer about being able to pass something like a pointer to the voxel data structure. So if anybody has any ideas on how to do that, that would be great. Yeah, so that's the reason why the texture doesn't show up. So you know, if the variable that you're adding has 'hack' in it, it's probably a bad sign. So basically, Blender then tries to access memory that's not there, right? It's trying to preview these textures, but there's no actual texture there, so that causes it to crash. The other problem is that we're not supporting the adaptive part of the adaptive mesh refinement. This is just a uniform grid that we have, and so we can't actually get up to the highest level of refinement, so that's another thing we've gotta work on. Also, the transfer functions are a little bit limited. I haven't played around with it that much, so that might be a false statement. But basically, for example, in yt, what you can do is say I want specific levels to be emitting fully at some temperature, just at those specific levels, or you can do some gradients where, you know, your cool stuff fades in blue and then your hot stuff fades in red, and that's how you get that kind of layered structure that yt had. So it'd be cool to do that kind of interaction with that. And then I just thought I'd show some gratuitous movies. So here, this was actually in a paper; not the movie, but the plots from it. And so this is two galaxies interacting. They also have black holes at the center, but you can't really see anything. And the different particle colors are different things, so, I'll play it again: the red are old stars, the yellow is gas, and then the white things are new stars that have formed. And this was, I think, four million particles. So you can get up pretty big. And like the last plots, the data files for those were a gigabyte or something. So it's not impossible. It took a little bit of time, but it wasn't impossible.
So I don't think I'll show this whole one; people have shown way cooler movies. And this was just sort of a first test to see if I could combine a model with some observational data, that's the galaxies in the background, and then also some actual simulation data. So I'll just zoom through that. So I'm just showing the positions of dwarf galaxies. I mean, these are all scaled to the proper proportions, except for the background galaxies, right? And those are annotations with information about the orbits; that's why they're all named. And then I put in some of my own simulation data. And again, this is scaled to fit the space that it's in. So this is a disk of gas that's forming because you've got a rotating dwarf galaxy potential, and it's moving through the background medium of the larger galaxy, of our Milky Way, and that's why you get this bow shock forming around the disk. And then you can also add annotations onto this very easily. So, I think I do that, yeah. So for example, I can show the motion of the dwarf galaxy, how the gas is moving, and then annotate it a little bit. Okay. So there's a lot of stuff still to be done. So again, volume rendering is still pretty hacktacular; it'd be great if it wasn't. And also, being able to maybe switch between the yt rendering and our rendering would be great. Everything so far is done with the Blender Internal renderer, but of course Cycles is awesome, so we should use Cycles. And again, the loading of data isn't optimized right now. So yt can do things in parallel; I just haven't put that in because, one crisis at a time. So right now AstroBlend is a package you can download from Bitbucket. But it'd be great to also make it a Python package, since that's what computational physicists are used to, and also make it an add-on so people like you guys can use it. Also, keyframing hasn't really been implemented yet. So we've got, again, snapshot files.
And so the grid could potentially be changing a lot in between those snapshot files. So the way that we tend to make movies is just to make pictures of each of those snapshot files and then put them together as a movie. But if you wanna use keyframes, then we're gonna have to figure out how to morph between surfaces and things like that. So that would be cool. Again, I have access to supercomputers, and I know people have rendered with Blender on supercomputers, so it'd be great to do that. And also we're gonna, at some point, have to do remote access of data for the things that are really big, because you don't wanna generate a surface from something where you have to transfer over a terabyte of data. That would be a little bit of a hassle. Okay, so in the last few minutes, what do we actually get from all of this? I think a great example is from a Python boot camp that I taught. This was aimed at young scientists, so they didn't really have any coding experience, and they hadn't ever done research before. And so I gave them a pared-down data set from the galaxy merger simulation. I said, here, this is what all the particles mean, figure something out. And so they made these two movies, for example. The blue is basically showing where the black holes are, and then in the other movie they've highlighted the new stars that are forming. And they also had to make a poster and give a little presentation. And in their poster, what they said was: as the two black holes merge, there are a lot of stars that are formed. And that's a big thing that, observationally, people have been really interested in, and also people have been trying to show with computational simulations. So the fact that they got that on their own, using purely visualization, is really cool. And so that's an example of what you can learn as astrophysicists by doing visualization.
So I'm just gonna guess what you guys can gain from scientific visualization, because I'm not an artist. So you'd have easier access to scientists, which you can debate if that's good; we're kind of odd, so I don't know. But you do have easier access, or you will soon have easier access, to scientific data. So the National Center for Supercomputing Applications is putting together the National Data Service. And this is basically a way that scientists, when you publish a paper, can also publish access to your data. This is supposed to be all open source, so you guys can also check this out. And so if you query some information in the paper, it will tell you where you can go get that data, and that data will all be localized. So then if you're reading several different papers on the same subject, you'll have access to all of the data, and you can hopefully compare those things easily. You can also print cool things. So here is an isodensity surface from a simulation I did of stars in a cluster with their winds. So their winds are colliding and you get this cool structure. And so last Christmas my mom was like, hey, you wanna learn how to 3D print? And I was like, yes. And so we 3D printed this. It only kind of crapped out at the last little bit there, but you can still see the cool structure. And so if anybody has any ideas, I would love to hear them too. So let me know if you have any better ideas. And so I just want to finish with a grandiose thought, right? So if we go back to this picture of how we're currently doing science: right now this usually applies to a few astrophysicists in a collaboration, or maybe one lone computational astrophysicist.
But as we're getting more and more data, both computational and observational, if we create this open-source method of access, and if we can also develop easy ways to communicate across disciplines and to the public, I'm just excited to see how science is going to change, and particularly how the way we do science is gonna change, when we are able to more easily communicate across disciplines. So thank you very much. Thank you. Now, time for questions. It's very interesting to see this. Thank you for the talk. If you are here, we can connect you with several developers you could talk to. But did you try to reach out to the Blender.org community, or did you do everything yourself, independently? I definitely posted a little bit on Blender Artists, so they answered a few questions. So, not yet really; I mean, now that we're overriding things in the C code, I feel like maybe now is a good time, because that seems wrong. So. I'm always impressed that people are using Blender without contacting us. It's amazing. So I'm very, very happy that you guys are using this; you got so far. Another one: do you know BioBlender? Yeah, I saw a little bit of their stuff. Yeah, it seems like they're doing models of molecules. Yeah, they're doing molecular stuff, but at that level it's almost the same as the galaxy, you know? I don't know how many cells and stuff they have, but as far as their computational volume, like how much stuff. It might be slightly different problems, but I think, for example, trying to represent planets and atmospheres on planets, I think there's probably gonna be some overlap. So yeah. One more question, because we have to keep our schedule, right? Yeah. Yes, I was interested: can you use the tools you developed now to render fractal data, or the other way around? Can you use the fractal rendering things, because there are some amazing ways to render fractals, for rendering galaxies?
Yeah, so admittedly I haven't looked into that very much. I've definitely looked a little bit at what people are doing as far as how you have to interpolate your fractals, but yeah, I started looking at that and was like, that's really cool, and then didn't go back to it. But it might be useful; it might be a useful thing to look at. Okay, so maybe I'll ask the last question. What is the feedback from the scientific world about Blender? Do you have some positive comments about this work, or something like that? Yeah, mostly I get, well, 'that's cool'. So I think the fact that you can move your plots around in 3D, that's something that, I don't wanna get in trouble with anyone, but I don't think I've seen in any other visualization package. And then the fact that it supports a lot of data formats through yt as a backend is super helpful. So yeah, so far no one's thrown anything at me, so it seems good. That's super awesome. Thank you for that. Thank you. No, I'm just saying that I'm very interested in what you do, because I work with a company that does simulation engineering. It's not something like this, but they export data and I can't import the volumetric animation into Blender, and I was wondering, how do you do it? And is there any resource, any book, any website where I can understand how to import volumetric data? Yeah, so I could show you the code. I initially started by reformatting our data to the voxel data structure, and there's one website that kind of explained how to do it, so I can definitely point you at that; I can't remember it off the top of my head. But yeah, basically we just used things that were in yt to go from an AMR grid, an adaptive mesh refinement grid, to a uniform grid, and then used that as the voxel data. But yeah, I think you just have to reformat: if you wanna do it by uploading voxel data, then you have to reformat your output to that.
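The AMR-to-uniform-grid step mentioned in that answer can be illustrated with a toy flattener: each coarse cell is replicated over the fine cells it covers, at the resolution of the finest level. This is only a sketch of the idea, assuming cells indexed per level on a domain of 2^max_level cells per side; yt's covering grids do the real version of this, with proper handling of overlaps and interpolation:

```python
def amr_to_uniform(cells, max_level):
    """Flatten AMR cells onto a uniform grid at the finest level.

    cells: list of (level, (i, j, k), value); a cell at `level` covers
    a cube of 2^(max_level - level) fine cells per edge. Cells listed
    later overwrite earlier ones, so list fine levels after coarse.
    Returns a nested-list grid indexed [x][y][z].
    """
    n = 2 ** max_level
    grid = [[[0.0] * n for _ in range(n)] for _ in range(n)]
    for level, (i, j, k), value in cells:
        f = 2 ** (max_level - level)  # fine cells per coarse cell edge
        for di in range(f):
            for dj in range(f):
                for dk in range(f):
                    grid[i * f + di][j * f + dj][k * f + dk] = value
    return grid
```

The resulting uniform array is exactly the shape of data a voxel texture expects, which is why it's the natural hand-off point into Blender.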
Okay, thank you. Now there is time for Blend4Web.