Okay. Good afternoon. Thank you for coming. So, next slide.

So, first of all, this is my first time attending the Blender Conference. I've already met a bunch of great people, caught up with old friends, and made new ones. I had a great year last year, a year off from my teaching duties at Mesa Community College in Arizona in the U.S., learning Blender, learning more about Blender, developing skills. Most of the time I am a full-time instructor of astronomy at a two-year college, Mesa Community College. I also run our 52-seat digital planetarium, and part of my goal with the year off from teaching was to develop new content for our planetarium. Part of that was building a pipeline around the fisheye projection that the camera system now has, and learning how to move the camera with that fisheye projection, because there is no "behind the camera" with fisheye. I can't hide assets or lights or the edges of the scene behind the camera anymore, because the dome is a hemisphere; you can see all the way around you. I was also learning how to do astronomical models, both mesh models for things like planets and volumetric materials for nebulas, galaxies, things like that. So, I learned a lot, but I'm finding out that there is still more to learn. So, next slide.

My background is in physics, and in particular astrophysics, so I approached Blender initially as a physicist: I wanted to model things exactly the way they happen at the physical level, at the atomic level. What I've learned over the last few years is that that really isn't appropriate. I shouldn't be using Blender as a computational tool; it's a visualization tool. So I've had to change the way that I think about doing these things, and in particular, cheat a lot. For example, if I try to model a solar system, the planets are very far from the star, and that's a lot of space where nothing is happening. Instead, scaling the objects appropriately and putting them next to each other makes for a much better scene setup. In terms of appearance, instead of trying to get all of the physical properties correct, simply make things look correct. That's also been a change.

So, the outline here is: first I want to introduce you to supernova explosions. This is about week eight in my introductory course, so I'm not going to cover everything, just a very short introduction. Then I'll talk about the data set that I was given to visualize, some of the obstacles to getting that data into Blender, and then visualizing it in Blender. One of the fantastic things about Blender is the community, and this is where the community stepped up and actually improved Blender so that this project could move forward; I'll talk about that. I'll also very briefly outline the process that I created to produce this visualization, show you the visualization that I generated, and then very briefly recap the improvements that have been made to Blender's point density texture in 2.78.

I'm obviously not the first to use Blender for astronomy. Brian Kent at the National Radio Astronomy Observatory just recently published a book on using Blender to visualize astronomical objects and astronomical data. And last year, right here, Jill Naiman from the Harvard-Smithsonian Center for Astrophysics presented a talk on her project, AstroBlend. I haven't contacted her yet, but that's certainly a place for collaboration in the future.
So, one of the differences with a physics simulation in Blender is that what's being generated is the physical appearance of something: the deformation of a soft body, the way that particles move under forces of various kinds. A fluid simulation or a smoke simulation generates the appearance of something directly; the equations that drive it relate directly to the appearance. On the other hand, a physics simulation, as physicists talk about them, typically generates an output that is not visible: quantities that describe the material, such as temperature or density, or in this case, the abundances of chemical elements within the star. That means we are free to choose how we want to visualize the information: how we want to choose a color palette to display temperature or density, or perhaps which elements to convey within the visualization.

So, Patrick Young is an astronomer at Arizona State University, just a few miles away from my home institution, Mesa Community College. His work is on numerical simulations of supernova explosions. He's a colleague of mine and a friend of mine, he knew that I worked with Blender, and he thought that Blender would be a way of visualizing the numerical output from his simulations. Very briefly, this simulation calculates the motion of the material within the supernova explosion using about 1 million tracer particles. The motions and the changes in temperature, density, and chemical composition within the supernova explosion are traced using the appropriate physical equations. The data that I'm going to show here today comes from a time about half a year, 178 days, after the initial explosion. Patrick tells me that running this simulation to the very end took about 30,000 CPU hours, and he actually runs his own Linux cluster to do these calculations.

So why bother with these objects? They're far away. They don't affect us, at least not at first glance. But at least to a scientist, these are some of the most phenomenally cool objects in the universe. They involve dramatically extreme temperatures and densities and a whole variety of nuclear reactions. Neutrinos, those elusive particles that physicists are studying here in Europe and in the United States and in fact around the world, are extremely important to supernova explosions. And ultimately, some supernovas give birth to black holes. So in a way, supernovas are the most dramatic physics laboratories in the universe. These objects are also fabulously bright when they explode. A single supernova can be more than a billion times brighter than our own sun, a typical star. These objects outshine pretty much everything else in the universe and can rival the light of an entire galaxy like our Milky Way. In addition, supernovas, and the stars that lead to supernova explosions, produced the chemical elements that we're made out of. They produced the elements that formed the Earth; they produced the elements that formed us. All the carbon, all the nitrogen, all the oxygen, the calcium, the iron in the universe was produced in stars and in supernova explosions. So in some sense, we're genetically linked to these objects, distant in time and distant in space, that occur here in our universe.

So just to set the stage: just before a supernova explosion, and here we're talking about stars that are many, many times the mass of our sun, the star has swollen up to an enormous size.
Here the scale is five times bigger than our planet's orbit, or about a billion kilometers across. The core, on the other hand, is about the size of the Earth, and that's where all the action is happening. All the nuclear reactions that power the star are occurring in the core, and layer upon layer of heavy elements builds up there. Ultimately, that's what kills the star; that's what triggers the supernova explosion. Without getting into the details of the physics, the short story is that the core becomes unstable and collapses. The outer layers bounce off of the core and blast outward at somewhere between 10 and 20 percent of the speed of light. Again, we're talking many times the mass of the sun being thrown out at incredible velocities; enormous amounts of energy are being released. And we can see that from great distances across the universe. That's the astronomy part; the rest of it is the physics.

So just to give you an example: the image on the left is a Hubble Space Telescope image of a nearby galaxy. To astronomers, "nearby" for galaxies is about 20 million light years away, so we're actually seeing this galaxy as it was 20 million, 21 million years ago. On the right is an image that was taken in 2011, when this particular supernova, SN 2011fe, appeared in this galaxy. We now know that this was a star that had ended its life. And I like to show this image because it was taken from Kitt Peak Observatory in Arizona, so there's a close personal connection to my home state. So, next slide.

Now, this is an explosion, so the material in the star rushes outward, again at a good fraction of the speed of light, for many thousands of years. In particular, in the year 1054, about a thousand years ago, Chinese court records record that a guest star appeared in the sky, in the daytime sky. So this was a very nearby event: it happened only 6,500 light years away, within our own home Milky Way galaxy. This object initially was visible in the daytime sky, so bright enough to be seen against the bright blue sky. Months later, as the seasons changed, this object continued to be visible in the nighttime sky for many months. Today, we see this expanding cloud of debris, which has been dubbed the Crab supernova remnant, or the Crab Nebula. We now know that this, in fact, is the leftovers, the guts of the star, that have now been thrown out over a distance of about 10 light years, a substantial distance. About 15 times the mass of the sun has been thrown out as the star exploded. And we know that this material is enriched in all of the heavy elements that we would expect from simulations, from calculations.

So why use Blender for this project? There are certainly scientists who write their own code to do this kind of thing. But for my part, I was already familiar with Blender: I discovered it about 10 years ago, and I'd been dabbling in it until very recently, when I got very serious about it. It was an opportunity to learn more about how to use Blender and how to use Python within Blender. It has built-in support in the Cycles engine to do volumetrics. And again, Python was already built in, so we could use that capability to pull in the data, to load it, to manipulate it, and to store it within the .blend file. We could also write a custom shader to choose how the data was visualized within the data volume. So, in a word, what would you choose? Blender is awesome.
Yeah, that was the word I was thinking of too. It already has all of these capabilities built into it. There was no need to write new software, except for those custom pieces in Python to import the data from the data file.

So here is the file that Patrick gave me. Again, I can't really tell what's going on here; it's a string of numbers. What's the shape of this supernova explosion? How fast is it moving? What is it made out of? You can't tell by simply looking at the data file. That's where the role of visualization comes in. Yes, you can look at one particle and say, oh, it's at this position and has this particular density. But without really knowing the entire file, you don't know where that particle is within the entire cloud of debris; you don't know how fast it's moving. That's where visualization comes in. So this string of numbers is a list of records: the first three values are the XYZ position, the next value is the density, after that is the temperature, and then the rest of the numbers, the last 20, are abundances of chemical elements from hydrogen to nickel.

My original plan, having played with particle systems within Blender and having used particle lifetimes with a color ramp to shade particles, was this: I could create a particle system and set the number of particles to the number of particles in the numerical simulation that Patrick had given me. I could move the particles around using Python. I could insert lifetime values into the particle system with Python. And then I could use a point density texture to control the way that those particles appeared within the data volume. I figured it would be a couple of weeks and I'd have something I could show Patrick. It turned out that none of this was possible, except for the last step, and in a different way. So this project evolved into, in fact, a great learning opportunity for me.

The first part was how to get the data into Blender to actually use it. Again, my thought was that I could create a particle system and then move the particles around. That's not possible: the particle positions are set by the physics engine within Blender. I could write to those values, but the new values instantly disappeared as soon as the frame was updated, so the particles would move right back to where they started from. I tried a number of different experiments; none of them worked. About that time, Gottfried Hofmann (I don't know if he's here. He's upstairs in the Blender meeting. Okay.) had just released a point density tutorial. So I contacted him, and he wrote back and said no, he's never tried to move particles with Python, and he doesn't think it will work. So at that point I gave up. But, next slide: he suggested using object vertices. Except that, at the time, object vertices only had one way of providing information as a color source in the point density texture node, and that was vertex normals. So there wasn't any way of accessing the rest of the data to do the visualization, to form the volume domain.

Fortunately, Gottfried had already thought about this. He's a whiz with the point density texture, and he had a plan. His plan was to ask Lukas to add vertex weights and vertex colors to the point density texture. So, with Patrick's interest and my backing, Gottfried asked Lukas again, and Lukas got to work. It took him a couple of days to add the code to support both vertex weights and vertex colors and to get it into the codebase. Sergey approved it, and it was inserted into a build back in March.
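To show what that change looks like from scripting, here is a minimal sketch, assuming Blender 2.78's Python API; the material name and the "density" weight group name are illustrative choices of mine, not details from the talk.

```python
import bpy

# A minimal sketch (assuming Blender 2.78's Python API) of the capability
# Lukas added: the Cycles point density node can now take its color from
# vertex weights or vertex colors, not just vertex normals.
mat = bpy.data.materials.new("PointDensityDemo")  # illustrative name
mat.use_nodes = True
node = mat.node_tree.nodes.new("ShaderNodeTexPointDensity")

node.point_source = 'OBJECT'                # sample a mesh object's vertices
node.vertex_color_source = 'VERTEX_WEIGHT'  # new option; 'VERTEX_COLOR' is also
                                            # new, 'VERTEX_NORMAL' is the old one
node.vertex_attribute_name = "density"      # weight group to read (hypothetical)
```

The node's Color output then carries that per-vertex data into the rest of the shader.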
It took a couple of weeks of bug fixes and things like that, and then a stable, useful version of the new point density texture was available. So then I got started.

After all of this, a new method for importing and visualizing this data was developed. I use Python to read the data file: for each of the 945,000 particles in the output of the simulation, I read its XYZ position, its density, its temperature, and its chemical abundances. I create a mesh object and then populate that empty mesh object with vertices, one for each of the tracer particles from the simulation. Then I create vertex weight groups for each of those physical properties: density, temperature, and so on. Those vertex weights can now be read by the new point density texture to control the material.

And here is just the output: 22 vertex weight groups, density, temperature, and 20 abundances, everything from neutrons and protons all the way up to nickel-56. Everything is loaded into a .blend file. That file is 213 megabytes in size: kind of large, but not unwieldy. But one question I wanted to ask of you is: is there a better way to do this? Okay. If you have ideas, I'd love to hear them. Let's talk after the talk, or I'm around today and tomorrow. This is one way of doing it, but it's certainly not the only way.

So, here is the mess of particles. I just wanted to show off what it looks like in the viewport. Again, not a whole lot of information: 945,000 vertices, no edges, no faces. So, this is a pretty simple vertex mesh object. Excuse me. Here's the shader: three point density textures. For this particular example that I'll show you, oxygen, silicon, and iron were chosen, with the physical density of the points used to control the strength of the emission. And here's a single still render of that data. Superficially, it looks like the Crab Nebula, with this sort of filamentary cage of material, but one of the problems is that this is a three-dimensional transparent structure that we're looking through, so it's a bit hard to tell what's going on. So, I rendered out a quarter of the domain so that we can see through into the center and see the surface of this expanding cloud.

So, if you want to hit escape, okay. After yesterday, I was sure to come in here early this morning and make sure that this worked, and it did this morning. Yeah. Yeah, it's not feeding to the monitors. Well, while we're waiting, why don't we go to the other slide. I'll try and figure out how to get the video to play, but in the meantime, if you want to see it, I can play it on my laptop, I guarantee you that.

So, I'll just wrap up in the interest of time. One of the things that we'd like to do is generate a VR representation of this data set so that you can fly through it. It wouldn't necessarily be interactive, but you could then actually look around and see different parts of the volume, see this filamentary cage of material at the edge of the explosion. Another thing that we want to develop in the future: again, this was the very last time step in this simulation, and there are 4,000 time steps from the beginning of the explosion to this time, 178 days later. We'd like to be able to load and visualize all of those steps to animate the explosion. Patrick would very much like to show that at conferences, and it would also be a fantastic teaching tool. This is a generalized process for importing the data; a sketch of it follows.
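As a rough sketch of that pipeline, assuming Blender 2.78's Python API and a whitespace-separated text file laid out as described earlier (x, y, z, density, temperature, then 20 abundances per particle); the filename, object names, and the simple normalization are my own illustrative choices, not details from the talk:

```python
import bpy

# Read the tracer-particle file (filename and column layout assumed).
positions, densities, temperatures = [], [], []
with open("supernova_tracers.dat") as f:
    for line in f:
        cols = [float(v) for v in line.split()]
        positions.append(tuple(cols[0:3]))   # x, y, z
        densities.append(cols[3])            # density
        temperatures.append(cols[4])         # temperature
        # cols[5:25] hold the 20 elemental abundances (omitted here)

# Build a mesh with one vertex per tracer particle: no edges, no faces.
mesh = bpy.data.meshes.new("SupernovaTracers")
mesh.from_pydata(positions, [], [])
obj = bpy.data.objects.new("SupernovaTracers", mesh)
bpy.context.scene.objects.link(obj)   # 2.7x-style scene linking

# Store each physical quantity as a vertex weight group. Weights must lie
# in [0, 1], so each quantity is normalized to its maximum here; the real
# pipeline may well use a different mapping.
def add_weight_group(name, values):
    group = obj.vertex_groups.new(name=name)
    top = max(values) or 1.0
    for index, value in enumerate(values):
        group.add([index], value / top, 'REPLACE')

add_weight_group("density", densities)
add_weight_group("temperature", temperatures)
# ...plus one group per elemental abundance, 22 groups in all.
```

A point density texture node pointed at this object, with its vertex color source set to 'VERTEX_WEIGHT' and its attribute name set to one of these groups (as in the earlier snippet), can then drive the strength of a volume emission shader.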
Basically, any text file could be read, vertex weight groups could be generated, vertices could be populated into a mesh object, and then that object could be used to render. So, the last question, and again, come and talk to me if you have ideas: is this generally useful? There are perhaps some data scientists in the room who could use this for their own point-like data sets; this kind of technique is used in a variety of sciences.

So, here is the new point density texture as of 2.78. This is in the official build now. The color source now has three options, not just one: vertex weight, vertex color, and vertex normal can each be chosen to provide the output.

And so, thank you. I just want to remind you that supernova explosions produced the chemical elements that the Earth is made out of and that you yourself are made out of, and now I can say that supernovas are also responsible, in a small part, for improving Blender as well. So, maybe, maybe. Hey! Can we have the lights down? There we go. Hey, thank you. Thank you. Okay.