Dr. Travis Rector is a professor of astrophysics at the University of Alaska, Anchorage. He earned a PhD from the University of Colorado in Boulder. For over 20 years, he's been creating astronomical images using some of the largest telescopes in the world, including those at Kitt Peak National Observatory, Cerro Tololo Inter-American Observatory, and the giant eight-meter telescopes of the Gemini Observatory. His work has been featured in such venues as National Geographic, The New York Times, Sky & Telescope, and Astronomy magazines. Please welcome Dr. Travis Rector. Okay, hi, everyone. It's a pleasure to meet you all virtually. So as Brian mentioned, what I'm gonna do is talk for about 45 minutes. And if you have any questions you'd like to ask me, I'll have plenty of time at the end, and you can just post those in the Zoom Q&A window. So let's go ahead and get started. What I'm gonna do is tell you a little bit about myself and what it is that I do. As Brian mentioned, I'm a professor of physics and astronomy at the University of Alaska in Anchorage. And you may not know me by name, but hopefully you've seen some of the images I've made over the years, including this one, as well as this one. You may recognize this from Neil deGrasse Tyson's reboot of the Cosmos series. And I don't know why they decided to modify it to make it look like an eyeball, because to me it already does look like an eyeball, but it was a great honor that he chose to use this image for his show. So as Brian mentioned, I earned my PhD from the University of Colorado back in 1998. And after that, I went to this place, which is Kitt Peak National Observatory, located about 40 miles west of Tucson, Arizona. This is the home of the original National Observatory and has over 20 telescopes on the summit, most of which are still in use.
You'll notice in this picture, off on the right side in the distance, is the largest telescope, which is the Kitt Peak Mayall four-meter telescope. Now, when I arrived at Kitt Peak, they had just commissioned a new wide-field imaging camera called Mosaic, shown here. Mosaic is a 64-megapixel camera: eight 2K-by-4K detectors, mosaicked together like tiles in your bathroom. And even by today's standards, this is still considered a large camera. If you go to buy a digital camera, a 64-megapixel camera is pretty good. Back in 1998, it was amazing. It was actually very hard to have the computing power to be able to process the data that came from this camera. In fact, when this camera was built, my largest hard drive was a whopping 10 gigabytes and could only hold about 30 images from this camera at a time. So what you're seeing here is the Mosaic camera placed upon one of the other telescopes on Kitt Peak, which is the 0.9-meter. What this camera excels at is giving us a wide field of view: about one square degree, which is basically the size of your thumbnail if you hold it at arm's length. So if you can go outside tonight and see the beautiful full moon out there, go ahead and hold up your thumbnail next to it, and you'll see that the moon is about half the size of your thumbnail. So when this camera was created, we wanted to show people what the field of view of the camera was. And so they gave me some telescope time and asked me to make some images. This is actually the first image I ever made with this camera. And you can see that the entire image is one square degree in size. Again, about the size of your thumbnail, which in astronomical terms is a giant field of view. So to make this image, what I did is I took a picture of the moon and then I waited two nights for the moon to move out of the way.
And then I took a picture of the same location in the sky, superimposed the two images, and created this. The director at Kitt Peak was happy with the image that I made, and he asked me if I'd be willing to make some more, which is when I first learned an important lesson in life: if you do something once, it's a favor; if you do something twice, it's your job. And that's how I got started making images. Another image that I made at that same time was this one right here, which is of the famous Rosette Nebula. And one thing I'd like to point out is that these two images are at the same scale. So you'll notice that the center of the Rosette Nebula is about the same size as the moon. The Rosette is actually quite a bit bigger than the moon. And if the Rosette was bright enough, you would actually be able to see it when you looked up in the sky, and it would look much bigger than the moon. So an important thing I hope you take away from tonight's webinar is that when people think of telescopes, they usually imagine us looking at things that are incredibly small and hard to see without a telescope. And while it is true telescopes magnify, another important thing they do is make things brighter. So this is actually a nebula that's plenty big enough for you to see. It's just too faint. Since then, I've started making images for other observatories, including this one right here. This is Cerro Tololo Inter-American Observatory, which is Kitt Peak's sister observatory in the Southern Hemisphere. And I also continue to make images for the giant Gemini eight-meter telescopes located in Hawaii and also in Chile. So when people see the images I make, they often have questions about why we make them. What is the point of these images? And of course, the first and primary reason we make these color astronomical images is to help us visualize scientific results.
So for example, this is a picture of a supernova that went off in the nearby galaxy M101 a couple of years ago. Another thing that we do with these images is demonstrate new technologies that we have. So for example, this picture right here shows what are called the bullets of Orion, which are stars that are flying through the Orion molecular cloud at supersonic speeds and leaving giant wakes behind them. This picture was taken with the adaptive optics system on the Gemini North telescope. And this adaptive optics system actually gives us better resolution than the Hubble Space Telescope, at least at infrared wavelengths. And then finally, we also make color images simply to share with people some of the spectacular views we can see with our telescopes. And so every year I'm given a couple of hours to a couple of nights on each of the telescopes I use just for the purpose of making pictures to share with people. Now, another thing that often happens when people see these images is they have questions about whether or not these images are real. This is a Calvin and Hobbes comic strip that I found a couple of years ago. And I really love it, because not only is it a great comic strip, it actually came out before Photoshop. Even back then people questioned the reality of photos. And so when people see my astronomical images, nine times out of 10, the first question they have for me is something along the lines of this: Is this what it really looks like? Or, are the colors real? Or sometimes I get asked, if I were standing right next to this, is this what it would look like? I don't really know what it means to stand right next to a galaxy or a nebula, but I think you get the idea. The key idea here is: if you can imagine that you were much closer to these objects, is this what you would actually see? So to illustrate this concept, I like to show this image right here.
This is a picture of the iconic Horsehead Nebula that we made many years ago using that Mosaic camera on the Kitt Peak 0.9-meter telescope. And again, this is a nebula that's actually much bigger than the moon. So this is a picture of what it looks like to our telescopes. Now let's imagine you got into a spaceship and you were to fly a thousand or so light-years out to the Horsehead Nebula. And when you got there, let's imagine you looked out the window of your spaceship. This next picture is basically what you would see. You would see some of the stars, but you actually wouldn't see any of the nebulosity. And people are often surprised to learn that. The reason why has to do with an effect called surface brightness. Surface brightness is the ratio of how much light is coming from an object to how big it appears. And surface brightness is actually what determines whether or not we can see large extended objects like galaxies and nebulae. What happens is that as you get closer to the nebula, you get more light from it; that's simply the inverse square law. But the nebula also appears bigger at exactly the same rate, so the ratio cancels out and stays the same. And this is actually a trick you can try at home. When we're done with the webinar, pick a wall nearby and walk towards it. As you get closer to the wall, you'll notice that the wall gets bigger, but it doesn't actually get any brighter. That's because as you get closer to the wall, the extra light from the wall is spread out over a larger area of your vision. The same thing happens with the Horsehead Nebula, which means that if you can't see the Horsehead Nebula from here on Earth, even if you got much closer to it with the spaceship, you still wouldn't be able to see it. So sometimes people ask, do these pictures really show what it looks like? And the answer is no.
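The surface-brightness argument can be sketched numerically: the flux you receive falls off with the inverse square law, but the solid angle an extended object covers grows by exactly the same factor as you approach, so their ratio never changes. Here is a minimal sketch in Python, with made-up numbers standing in for the nebula's luminosity, size, and distance:

```python
import math

def surface_brightness(luminosity, distance, radius):
    """Flux per unit solid angle for an extended source (small-angle approximation)."""
    flux = luminosity / (4 * math.pi * distance**2)   # inverse square law
    solid_angle = math.pi * (radius / distance)**2    # apparent (angular) area
    return flux / solid_angle

# The same hypothetical nebula viewed from far away and from 100x closer:
far = surface_brightness(luminosity=1.0, distance=1000.0, radius=3.0)
near = surface_brightness(luminosity=1.0, distance=10.0, radius=3.0)
print(math.isclose(far, near))  # True: flying closer doesn't help
```

Algebraically the distance cancels entirely, leaving L / (4&pi;&sup2;r&sup2;), which is why the Horsehead stays invisible even from the spaceship window.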
And that's because what we're doing with our telescopes is showing you things that you may not be able to see with your eyes. With the exception of things like the planets and moons inside our solar system, most everything else out in space is too faint for you to see with your eyes. So if you wanna see what the Horsehead actually looks like, this picture right here shows it to you. So when people ask these questions, the answer is almost always no. And again, the first point I wanted to make was about surface brightness, which doesn't change no matter how far away you are. Another factor is that our eyes are really bad at seeing color in faint light. If you look up at the stars in the night sky, you'll notice that only a handful of them are bright enough for you to actually see their color. So for example, the bright red star Betelgeuse in the constellation of Orion is bright enough for you to see that it's red, but most other stars are too faint for you to see color in, and they just look white. In particular, we have really bad sensitivity to red light. So for example, if you look at the Orion Nebula through a pair of binoculars or a telescope, it'll actually look blue-green even though most of the light coming from it is red. So when people ask, is this what the color really looks like, you have to ask yourself: based upon whose sensitivity? The CCD's sensitivity or your eye's sensitivity? And finally, there are kinds of light that our eyes can't see. We're all familiar with visible light, the colors of the rainbow that we can see with our eyes, but there are other kinds of light as well: for example, infrared light, X-rays and radio waves. These are kinds of light that our eyes simply can't see.
So for example, one of my co-authors, Kim Arcand, who helped me write the book, has a saying that I really like: imagine you were sitting at a piano with 88 keys in front of you, and imagine that your ears could only hear one of those 88 keys. Imagine that you could only hear middle C. Now imagine that a beautiful piano piece is being played and you can only hear that one key. That sure would make things really boring. Well, that's kind of the situation with our eyes. Our eyes can only see a tiny portion of the electromagnetic spectrum, but fortunately we can build telescopes that can see other kinds of light. And then what we do is we make images to translate what the telescope can see into what your eyes can see. So when you think about a telescope, there are actually at least three things that a telescope does that are really important. The first, and the most obvious, is that it magnifies things and makes something that's really small look bigger. This is what we usually think of when we think of a telescope. We imagine it being like a giant pair of binoculars to make something very far away look closer. So you can imagine, if you're at a basketball game and you're up in the nosebleeds and you wanna see the players down close, you get out a pair of binoculars; that's what telescopes do for us. And that is true. That is one of the things that telescopes do for us. Another thing they do, which is just as important, is that not only do they magnify images, they amplify them. They're able to collect large amounts of light and make faint things appear much brighter. And so that's why we're able to see things like the Horsehead Nebula, even though it's too faint for our eyes. And then finally, as I've already explained, they expand our vision. They allow us to see kinds of light that our eyes can't see. So these are three important things that a telescope does when we look at the night sky.
One way I think it helps to illustrate this is to imagine your eye as essentially being a telescope. A not very good telescope, but it is like a telescope. Think about the fact that all the light that you're seeing right now is coming through a tiny hole in the middle of your eye called the pupil. And even when you're looking out at things in the night sky and you're dark-adapted, that pupil only gets about a quarter inch in size. Now compare this to the size of the eight-meter telescope at Gemini. This is a picture of the primary mirror of the Gemini telescope. And to give you a sense of scale, that's a person standing in the middle, wearing a clean suit as he's inspecting the surface of the mirror. The surface of the Gemini mirror is about eight meters across, or almost 30 feet. And so at any given moment, this mirror is collecting over a million times more light than your eye can. Now, another factor is that our eyes essentially work like a movie camera. Our eyes are taking pictures 30 times a second, which means that each image we see only collects light for one thirtieth of a second. Instead of doing that, our telescopes are designed to collect lots of light for long periods of time. So what you're seeing right here is a time-lapse movie of the Gemini eight-meter telescope in action. And you'll notice that the telescope is designed to track the night sky as you observe an object. So rather than looking at an object for just a thirtieth of a second like our eyes do, we can use these telescopes to look at objects for many hours, or even longer if we collect data over the span of many nights or weeks or months. And this is something that astronomers do. So what you're seeing right here is a picture of a field of view that's one square degree in size. And again, just to point this out, this is the same field of view as the Mosaic camera on Kitt Peak.
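The "over a million times more light" figure is just a ratio of collecting areas, which scales as diameter squared. A quick back-of-the-envelope check, taking the quarter-inch dark-adapted pupil from the talk:

```python
mirror_diameter = 8.0            # Gemini primary mirror, in meters
pupil_diameter = 0.25 * 0.0254   # a quarter inch, converted to meters (~6.4 mm)

# Light-gathering power scales with collecting area, i.e. with diameter squared.
ratio = (mirror_diameter / pupil_diameter) ** 2
print(f"{ratio:,.0f}x more light")  # roughly 1.6 million times
```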
At the middle of this image, you'll see a little square that's turned on its side, and next to it it says XDF. This little square shows you the location of what's called the Hubble Extreme Deep Field. This is a small portion of the sky that the Hubble Space Telescope stared at unblinkingly for about 30 days, collecting light through many different kinds of filters. And this right here shows the final color image that was made from the data. You'll notice that this picture is full of thousands of galaxies. Most of these galaxies are billions of light-years away. And the faintest objects in this picture are about 10 billion times fainter than what our eyes can see. So by using telescopes with large mirrors, and by staring at locations for many hours to many days, we can see objects that are much, much fainter than what the human eye can see. So the challenge in the work that I do is to translate what the telescope can see into something that you can see. That is, to take the telescopic data and turn it into a color image that you can look at on your computer screen and learn something from about what we're seeing. So people often wanna know how we make these images, and they're often surprised to learn that our cameras don't see color. They actually only see black and white. And if you've ever used a CCD camera, you know this yourself from experience. A CCD camera is an electronic camera that simply measures the amount of light that's hitting each pixel on the camera. It doesn't actually know the color of the light that's hitting it. So to create a color image, what we have to do is look at an object through many different filters. So for example, this is a picture of a famous nebula called the Crescent Nebula. And this is what it looks like if you just look at it through a single filter and you project the image on the computer screen.
Now to make a color image of this object, or any other object, what we do is look at it through multiple filters. What you're seeing right here is part of the filter system for the Mosaic camera. Each of the filters is five inches on a side and costs about $2,000. And what we do is we look at an object through multiple filters. So we might take a picture through a red filter, and then through a green filter, and then through a blue filter, and then we'll combine them together to produce a color image. Now the electronic camera inside your phone actually works very similarly to this. But what it does is use multiple filters at the same time, so you can take a color image all at once. So you may not realize that when you take a picture with an electronic camera like your phone, there's all sorts of things going on inside the camera and the electronics of the phone before the image pops up on your screen. It's basically a fast-tracked version of what we do. So here's the final color image of the Crescent Nebula that we made after we looked at it through multiple filters. Now when we make images of different objects, we can use filters that closely match the sensitivity of the human eye. On the right here you will see a picture that shows the sensitivity of the rods and cones inside the human eye. This is how we see light and how we see color. The rods do not see color, but they're more sensitive to faint light, and the cones are what allow us to see color. We actually have three different types of cones in our eyes. We have what are called S cones, M cones and L cones, and these cones allow us to see short, medium and long wavelength light respectively. So the S cone is mostly sensitive to blue light, the M cone is mostly sensitive to green light, and the L cone is mostly sensitive to yellow and red light.
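The red/green/blue combination step described above amounts to stacking three monochrome exposures as the channels of one color image. A minimal sketch using NumPy, with random arrays standing in for real calibrated exposures (real data would come from FITS files, and real processing involves much more careful scaling):

```python
import numpy as np

# Hypothetical monochrome exposures (2D arrays of measured counts), one per filter.
rng = np.random.default_rng(0)
red_frame = rng.random((4, 4))    # taken through a red filter
green_frame = rng.random((4, 4))  # taken through a green filter
blue_frame = rng.random((4, 4))   # taken through a blue filter

def normalize(frame):
    """Scale counts into the 0..1 display range."""
    lo, hi = frame.min(), frame.max()
    return (frame - lo) / (hi - lo)

# Stack the three filtered exposures as the R, G, B channels of one image.
rgb = np.dstack([normalize(red_frame),
                 normalize(green_frame),
                 normalize(blue_frame)])
print(rgb.shape)  # (4, 4, 3): one color image from three monochrome frames
```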
The way our eyes produce a color image is that the S, M and L cones inside our eyes measure how much of these different kinds of light is coming to our eyes, and then our brain calculates what the color is. That's how we see color. So if you look at something that's red, it looks red because you're seeing more light from it with your L cones than with your S and M cones. We can simulate the same thing using filters on our telescopes, and we often use what are called broadband filters, which can see blue, green and red wavelengths similar to the way the S, M and L cones do. So the picture on the left shows the filter system known as the Johnson-Bessell UBVRI filters, and the B, V and R filters are reasonably close to the blue, green and red that our eyes see with the S, M and L cones. But you'll also notice there are two other filters there: the U for ultraviolet and the I for infrared. These filters show us kinds of light that our eyes can't see; they're outside the spectrum that our eye is sensitive to. So sometimes you hear astronomical images called true-color images, and often that refers to images that are made with blue, green and red filters similar to the B, V and R Johnson-Bessell filters. But many of the astronomical images you see use other filters, including the U and I filters as well. There are also what are called narrowband filters, and these are filters that we use to see specific colors of light produced by specific gases. So this plot right here shows the wavelengths for different kinds of light, including hydrogen beta, which is a blue kind of light produced by hydrogen gas; oxygen III, which is a green light produced by hot oxygen atoms; and then on the right side, you'll see hydrogen alpha and sulfur II. The hydrogen alpha is produced by warm hydrogen gas and the sulfur II is produced by hot sulfur atoms.
Now we often use these filters to differentiate between the light that's coming from the hydrogen, the oxygen and the sulfur. And one thing I'll mention is that hydrogen alpha and sulfur II are so close to each other in color that our eyes can't tell them apart. But by using these narrowband filters, we can differentiate between the light coming from the hydrogen and the sulfur. So many astronomical images you see use these narrowband filters as well. One of the techniques that my collaborators and I have developed over the years is the ability to make color images using more than three filters. In this example here, you're seeing a picture of a galaxy called NGC 6822, otherwise known as Barnard's Galaxy. The picture on the left shows what this galaxy looks like if you use just blue, green and red filters. The picture on the right shows what the same galaxy looks like if we use eight filters. And you'll notice that in the image on the right you can see more detail: the stars are more colorful, and it's easier to see the clouds of warm hydrogen gas inside the galaxy. So by making images in this way, using more filters, we're able to bring out more detail in the stars and gas inside the galaxy. So here's how we do it. And I'm gonna show you an example using the nearby spiral galaxy called M33, the Triangulum Galaxy. This is a picture of what the galaxy looks like if we look at it through a B, or blue, filter. And you'll notice in this image that you'll see lots of stars. That's because this galaxy is full of hot blue stars throughout. Now if we look at the same galaxy through a hydrogen alpha filter, you'll notice that the galaxy looks very different. That's because the hydrogen alpha filter is designed to let only that specific red light produced by warm hydrogen gas pass through. And so when we look at it through this filter, you'll notice we don't see the stars as much, and now we can see more of the hydrogen gas.
So again, just look how different it is going from the blue filter to the hydrogen alpha filter. To make the color image, what we do is assign a color to each of the filters. Since hydrogen alpha is a specific color of red light, in this example we make it red. The next filter we looked through was the I filter, which is an infrared filter. Now a good question to ask is, what color should we use for infrared, since infrared is a kind of light our eyes can't see? Well, in this example, I used orange. And the reason why is that I'm already using red for hydrogen alpha, and I want to differentiate between the hydrogen alpha gas and the infrared light. Next, we used an R-band filter, which we made yellow. The V-band, which we made green. The B-band, which we made blue. And then the ultraviolet, which is again a kind of light our eyes can't see, I made violet, because that's the closest color that we can see. And then when we finally combine them all together, this is the image that we get. So this is an image that uses six filters: five broadband filters and one narrowband filter, the hydrogen alpha. In this image, you can see the colors in the stars very well, and you can see the clouds of hydrogen gas in which new stars are forming. If we zoom into a portion of the galaxy, you'll notice that the stars are very colorful, and we can see very easily these clouds of hydrogen gas where new stars are being formed inside this galaxy. Now some images that we make use only narrowband filters. And when we use narrowband filters, we often use color in a way different from the broadband filters. That's because narrowband filters only show us very specific colors of light. So they show us things in a way that's very different from the way our eyes would. And there are some famous examples of these. So I'm gonna show you one of them.
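The six-filter technique just described generalizes the three-channel case: each filter's monochrome frame is tinted with its assigned display hue, and the tinted layers are summed. A sketch, using the hue assignments from the M33 example (red for hydrogen alpha, orange for I, yellow for R, green for V, blue for B, violet for U) and random arrays as hypothetical stand-ins for the calibrated exposures:

```python
import numpy as np

# Assigned display hues (RGB triplets) for each filter, following the talk's example.
hues = {
    "H-alpha": (1.0, 0.0, 0.0),  # red
    "I":       (1.0, 0.5, 0.0),  # orange
    "R":       (1.0, 1.0, 0.0),  # yellow
    "V":       (0.0, 1.0, 0.0),  # green
    "B":       (0.0, 0.0, 1.0),  # blue
    "U":       (0.5, 0.0, 1.0),  # violet
}

rng = np.random.default_rng(1)
frames = {name: rng.random((4, 4)) for name in hues}  # hypothetical monochrome layers

# Tint each monochrome frame with its hue, then sum the colored layers.
color = np.zeros((4, 4, 3))
for name, frame in frames.items():
    color += frame[:, :, None] * np.array(hues[name])
color /= color.max()  # rescale into the displayable 0..1 range
print(color.shape)  # (4, 4, 3)
```

The same scheme handles any number of filters; only the hue table changes from image to image.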
So this is an image that I made many years ago of the Eagle Nebula, M16, which is famous because of the Pillars of Creation image from Hubble. This is a wide-field view of the same object. When I made this image, we used three narrowband filters. The longest wavelength one was sulfur II, which we made red, as shown here. The next filter we used was hydrogen alpha. Now in this image, we made hydrogen alpha green, even though hydrogen alpha is a red color. We did that because we want to differentiate between the light coming from the sulfur II filter and the light coming from the hydrogen alpha filter. And then finally, the oxygen III filter, which is actually a green filter, we made blue, so we can make a red, green and blue image. And when they're finally combined together, this is the image that is produced. So this filter and color combination is informally known as the Hubble palette. It got that name because this is the same filter and color combination that was used to produce the famous Pillars of Creation image many years ago. But one thing I'd like to point out is that the Hubble Space Telescope and the telescopes on Kitt Peak and elsewhere use many different kinds of filters, and we use many different filter combinations. So this is just one example of how we make color images. The next example I'd like to show you is an infrared image. This is from the Gemini eight-meter telescope, and it's looking at a portion of the Omega Nebula, M17. In this example, we used four filters, but we used infrared light. To make this image, we use a system called chromatic ordering, in which you make the longest wavelength light red, the shortest wavelength light blue, and the intermediate wavelengths other colors in between. So this is the longest wavelength infrared light, which we made red.
The next one we made green, and the one after that cyan, which is a mix of green and blue. And then the shortest wavelength infrared light we made blue. When you combine them all together, here's the final color image you get. Now in this image, you're again looking at infrared light, but you'll notice that the stars have colors here too. And even though it's infrared light, the colors here work the same way as colors in optical images. That is, the redder stars are emitting more long-wavelength light and the bluer stars are emitting more short-wavelength light. So just by looking at the colors, you can get an idea of which stars are hotter. People often ask me, do we use the same color scheme for all the images we make? And the answer is no. The reason why is that each time we make an image, we're looking at a different kind of object, using different telescopes and different filter combinations. So each image we make is different from the ones we made before. Now that's not to say that we go all crazy and use color in all sorts of different ways. We have some standard rules that we apply. And what I'd like to do now is describe how we think about making images. This brings us to a concept called visual grammar. If you've ever taken an art class, you may know a little bit about this. Visual grammar is basically the way that our eyes interpret an image. Now, usually when I give this talk, I have people in front of me so I can ask some questions. And I know you can't answer me, but I'd like for you to think for yourself about these two questions. First, when you look at this image in front of you, which half of the image is closer to you? The top half or the bottom half? That's an easy question to answer. Now the next part isn't so easy, and that is asking yourself: how do you know that? How do you know that the top half is further away than the bottom half?
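The chromatic-ordering rule can be written down directly: sort the filters by wavelength, render the longest as red and the shortest as blue, and space the intermediate hues evenly in between. A sketch using the standard library's `colorsys`, with a hypothetical near-infrared filter set (wavelengths in microns; the talk doesn't name the actual filters used):

```python
import colorsys

def chromatic_ordering(wavelengths_um):
    """Assign display hues: longest wavelength -> red (hue 0), shortest -> blue
    (hue 2/3), intermediates spaced evenly in between."""
    ordered = sorted(wavelengths_um, reverse=True)  # longest first
    n = len(ordered)
    assignments = {}
    for i, wl in enumerate(ordered):
        hue = 0.0 if n == 1 else (2 / 3) * i / (n - 1)
        assignments[wl] = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
    return assignments

# Hypothetical four-filter infrared set, as in the M17 example.
colors = chromatic_ordering([2.2, 2.12, 1.64, 1.25])
print(colors[2.2])   # (1.0, 0.0, 0.0): longest wavelength rendered red
print(colors[1.25])  # essentially pure blue: shortest wavelength
```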
Well, the answer is that there are several visual cues that we use to interpret this image. One thing to know is that when you look at an image, it's just a flat image on the screen, or a painting on a wall, or a picture in a book, but your brain is already interpreting the image and creating a three-dimensional map of what it sees. This helps us to understand and interpret the world we live in. So in this image, there are several visual cues that tell us the top half is further away. One of these visual cues is that we're familiar with this sort of landscape and we recognize that there are trees in the bottom half of the image. And we notice that the trees at the very bottom look bigger than the trees that are just a little bit above them. This is a psychological effect: when our brains see objects that look similar but are of different sizes, they interpret the big ones as being closer than the small ones. Now, another important effect that's going on here is the colors. The top half of the image looks bluer than the bottom half. That's because as we look at the mountains and hills off in the distance, we're looking through the Earth's atmosphere, and the Earth's atmosphere, of course, is blue. It makes things that we look at through the atmosphere look blue as well. And the further away something is, the bluer it looks. So that's a way that our brains can naturally figure out where things are here on Earth. Now it turns out that if you go to the moon, you lack this visual cue, and it's hard to get around. What you're seeing here is a picture of one of the Apollo spacecraft that landed on the moon. And of course, the moon has no atmosphere. When you look at this picture, you'll notice that there is something behind the rover, and it may be a big mountain off in the distance or a small hill just behind it.
And if you think about it, you can convince yourself of either possible scenario. Many of the Apollo astronauts reported that it was difficult to navigate around on the moon, because they had difficulty gauging the distances and sizes of objects off in the distance. So without the atmosphere, our brains have a harder time working. When you look at any image, your brain naturally interprets the image and uses the colors to develop a three-dimensional model. So for example, this picture right here is from Hubble, and it is a portion of the Lagoon Nebula. Our brain naturally interprets the blue parts as being further away and the yellow and red parts as being closer. So it's using the same process that it does when it looks at an image here on Earth to understand the depth. People often ask me, how does Hubble see things in 3D? And they imagine maybe Hubble's out there with a giant pair of 3D glasses on it. But Hubble doesn't actually see things three-dimensionally. No telescopes do, with the exception of the STEREO telescopes that are orbiting the sun. The reason why this image looks three-dimensional is simply because of the use of color. And so it's a way that we can use color to help illustrate things and help people understand them. This right here shows a side-by-side comparison of another famous Hubble image, of what's called the Keyhole Nebula. You'll see the image on the left, which is colorful, looks very different from the black and white. The color image looks three-dimensional and vibrant, and it looks like a real world that you could explore, whereas the image on the right looks flat and abstract and hard to understand. So this is a good illustration of how we use color to help people see the things that we study. Now another way in which we use color is to convey physical characteristics. So for example, our brains naturally interpret red things as being hotter than blue things.
And that's because we usually think of flames as red and ice as blue. And so when we look at images, we use this technique to help convey some of the science in the image. So whenever we create an astronomical image, we think about what is the story? What is the science that we want to convey to people? So this right here shows an example of an image I made many years ago of, again, the Triangulum Galaxy M33. The picture on the left shows what it looks like if you just look at it through an optical telescope. And the picture on the right adds to that image radio data from the Very Large Array radio telescope. What you're seeing in particular, the purple light there, is what's called 21-centimeter radio waves, which are produced by cold hydrogen gas inside the galaxy. So when we were making this image, we had to ask ourselves, well, what color should we use for those radio waves so that it looks natural and people can understand what it is? Well, if you zoom into a portion of the galaxy, you'll notice that the cold hydrogen gas seen in radio waves is connected to the warm hydrogen gas inside the star-forming regions that I described before. So we chose violet, or purple, as the color to use for the radio waves. And the reason why is because violet is a mix of red and blue. So when you overlay the radio data onto the optical image, the violet naturally blends with the red light inside the warm hydrogen clouds. And this is physically what's happening: the cold hydrogen gas is connected to the warm hydrogen gas in the star-forming regions that are producing the hydrogen-alpha light. The reason why we made it purple is that not only does it have red, so it blends in, but it also has blue, so it naturally conveys that what you're seeing is colder.
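The violet-overlay idea described above can be illustrated with a short sketch. This is a minimal, hypothetical example, not the actual pipeline used for the M33 image: the function names are made up, and real composites are built in Photoshop with more careful blending. It simply shows how a radio-intensity channel mapped to violet (equal parts red and blue) brightens both the red and blue channels of an optical pixel, so it blends into red hydrogen-alpha regions while reading as cooler.

```python
# Hypothetical sketch: overlay radio data as violet onto an optical RGB pixel.
# All names here are illustrative, not from any real image-processing pipeline.

def blend_screen(base, overlay):
    """Screen-blend two channel values in [0, 1]; light adds, never clips."""
    return 1.0 - (1.0 - base) * (1.0 - overlay)

def add_radio_as_violet(optical_rgb, radio_intensity):
    """Overlay a 21 cm radio intensity (0..1) onto an optical (r, g, b) pixel.

    Violet = red + blue, so the radio signal blends into the red H-alpha
    regions while its blue component reads as "colder" to the eye.
    """
    r, g, b = optical_rgb
    return (
        blend_screen(r, radio_intensity),  # red component of violet
        g,                                 # green left untouched
        blend_screen(b, radio_intensity),  # blue component of violet
    )

# A pixel inside a warm (red) H-alpha region with a strong radio signal:
pixel = add_radio_as_violet((0.6, 0.1, 0.1), 0.5)
print(tuple(round(c, 2) for c in pixel))  # red and blue are both boosted
```

Screen blending is used here (rather than plain addition) so that stacking a bright overlay onto an already-bright channel cannot push a pixel past full intensity.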
So my hope is, and I've probably overthought this, but my hope is that when someone looks at this image without knowing anything about it or without reading a caption, they'll naturally interpret the blue portions as being cooler but connected to the red portions that they see in the image. And that's physically what's going on. So when we make these images, we make them to illustrate the science. And usually these images come at the end of the science. That is, after we've done the science, we'll make a color image from the data to share with people some of the discoveries that we've made. But one of my favorite things to do is tell people about stories where it's gone the other way around, where we've made a beautiful color image simply for the sake of doing it, but we've gotten some good science out of it as well. So this is an example. This is a picture of a nebula called the Iris Nebula, which I made some years ago with the Kitt Peak 4-meter telescope. And after I made the color image, I looked at it closely and I noticed that inside the image were these little faint red blobs. These objects right here are what are called Herbig-Haro outflows. These are jets of gas that are being shot out of young stars that are forming inside this cloud of gas. And I noticed that no one had seen these before. So we wrote a science paper about it. Here you can see them more up close using black-and-white images. And we looked at them with the Spitzer infrared telescope and were able to see these jets of gas as well. So it's a nice illustration of how science and art work together. People often think of science as being different than art. And here what we're doing is using artistic principles to illustrate scientific concepts in a way that is natural and enjoyable for people who may not be scientists themselves.
So if you're interested in seeing more of the images I've made over the years, I've made more than 200 astronomical images and I have a website. The URL is listed here. Or the easiest thing to do is just Google my name, Travis Rector, which fortunately is unique enough that you'll either find me or a convicted killer on death row in the state of Texas. And I'm the astrophotographer. So if you look for my name, you'll find my website. And I have many images that I've produced: deep-sky images, and also some wide-field images from some of the different observatories that I've gotten to use over the years, as well as some panoramas showing what many of these observatories look like. So if you'd like to see these images, please feel free to check out my website. And as Brian mentioned at the start of my talk, my co-authors Kim Arcand, Megan Watzke, and myself have just recently written a book called Coloring the Universe. It's all about the topics that I've been talking about tonight and many other things as well. The book was published by the University of Chicago Press and was released last November. It contains 250 pages and over 300 beautiful astronomical images of space. And to finish up before I open it up for questions, the last thing I'd like to share with you is a short video we made, what's called a book trailer. I didn't know what a book trailer was until I had written a book. A book trailer is like a movie trailer, but it's for a book. So I recently went up to Kitt Peak and made a short video just to share with you what Kitt Peak is like. And that's how I'll finish my talk, with this video. Here we go. Have you ever looked at an image of space, say a galaxy or nebula, and wondered, is this real? Is this what it really looks like? Or maybe you even asked yourself, if I were standing right next to this object, is this what I would see? Hi, I'm Travis Rector. I'm a professional astronomer.
For the last 20 years, I've been coming here to Kitt Peak National Observatory in Arizona to make beautiful images of space using the giant telescopes. Along with my collaborators, Kim Arcand and Megan Watzke of the Chandra X-ray Observatory, we have written a book called Coloring the Universe, which gives you a behind-the-scenes look at how these beautiful images of space are made with professional telescopes. We're now inside the dome of the Mayall 4-meter telescope. This is the largest telescope on Kitt Peak and is one of my favorites to use. In our book, we have over 300 color images of space, many of which were taken with this telescope. With this, we can see objects that are over 100 million times fainter than what the human eye can see. We're now inside the control room of the 4-meter telescope. This is where the astronomers sit when they're observing at night. When we're observing, we observe from dusk to dawn because telescope time is precious. For every five astronomers who ask for time, only one will get it. So this computer here is what's used to control the camera that takes the pictures. Behind me is where the telescope operator sits. The telescope operator is the celestial taxi driver. He or she is the one who points the telescope to the object that we're looking at. Professional telescopes like these give us superhuman vision. They literally allow us to make the invisible visible. In Coloring the Universe, we explain how we use these giant telescopes to make color images of space. We hope you enjoyed this behind-the-scenes look at how professional astronomers see the universe. Okay, well, I hope you enjoyed the video, and what I'm gonna do now is just go back to the camera here so we can see each other. And if you have any questions you'd like to ask me, I'd be more than happy to answer them for you. All right. So if you could maybe stop sharing your screen, I think that the button is up at the top. That way we can see you. Okay, let's see here.
Let's see how to do that. I don't... click that? I think if you just click that green bar up there at the top, I think that'll do it. That's not working. I'm clicking on share a new window, but that's just... Yeah. I think it's on the bottom of the screen where it says share screen, in between Q&A and chat. Okay, so it might be a little bit different on this. Yeah, that's fine. When I click here, it just gives me the Zoom sign-in window. Yeah. Okay, well, we've got the little tiny window-in-window view, so that will have to do for now. Except then they can't see us. So... I think I might be able to do it in one second. Let's see, there we go. Start screen share. Yes. I'll just do my... Stop share. Okay, does that work? There we go. There we got it. Thank you, David. All right, okay. Well, we do have some questions that have come in, Travis. Geoffrey asks, how do you balance the intensity of color when you combine images from different filters? Is it mostly an aesthetic process or more quantitative? Okay, it's mostly an aesthetic process. There are ways to do it quantitatively, but in general, what we want to do is to have a good color balance. That is, we want to use the full spectrum of color. So if you look at a nebula that's emitting a lot more hydrogen alpha than the other colors, if we balance it based simply upon the flux intensity coming from the nebula, everything would look red. And it would just be a monochromatic image, and that's not very interesting. So instead what we do is we scale each filter independently to use the full range of intensity in each filter. And that way we get a full range of colors. Okay. Well, kind of going along with that, John asks this, and I think that this is a great follow-up to that. He says, wow, in addition to astrophysics, did you have any training in art or art history that helps you to create these amazing images and helps you with the aesthetic?
I haven't received any formal training, but it reminds me of a funny story. I've been making images for many years, and when an image is successful, my collaborators and I would often take the time to think about, well, gee, why is this image successful? And we started to develop some ideas of what made for a good image. And then one day I was talking with a friend of mine who is an artist, and I was telling her about the procedures we use for making an image. And she said, oh, well, if you had just taken an Art 101 class, you would have learned all that. So the techniques we use have been known by artists for hundreds of years. And it makes sense if you think about it. If you go to a museum or an art gallery and there's a picture hanging on the wall, when you engage that picture, when you start looking at it, you may not realize that the artist created that image in a way to get you to look at it in a particular way and to engage you. So we try to use many of those principles. And so I do have several art books that I've used to learn about principles of composition and color usage, but I haven't had any formal training. Okay, great. So Dan asks, and I've got kind of a follow-up with this too: do you use the same image processing software that's available to amateurs? And kind of as an addition to that, do you have any recommendations of software if you don't? The answer is yes. The actual data processing will depend on what telescope we're using. For most of the optical and infrared data processing, I use a program called IRAF, which is best described as user-hostile. It was written back in the 80s, when computers were a lot simpler than they are now. So we use that for the actual data processing. And then to create the color image, I use just two programs. I use a program called FITS Liberator, which converts the FITS files into grayscale TIFF images. And then everything else is done with Photoshop. So it's actually fairly simple.
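The per-filter scaling described in the color-balance answer above can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the actual FITS Liberator or Photoshop workflow: the function name is made up, the data are invented, and real pipelines also apply nonlinear stretches before compositing.

```python
# Hypothetical sketch of independent per-filter scaling: each filter's data
# is stretched on its own to span the full intensity range, so an image of a
# nebula dominated by H-alpha flux doesn't come out monochromatic red.

def rescale_filter(values, low_clip=0.0, high_clip=1.0):
    """Linearly map one filter's pixel values onto [low_clip, high_clip]."""
    lo, hi = min(values), max(values)
    if hi == lo:  # flat frame: nothing to stretch
        return [low_clip for _ in values]
    span = high_clip - low_clip
    return [low_clip + span * (v - lo) / (hi - lo) for v in values]

# Three filters with very different raw flux levels (H-alpha dominates):
h_alpha = [120.0, 900.0, 4000.0]
oiii    = [5.0, 12.0, 40.0]
sii     = [2.0, 3.0, 9.0]

# After independent scaling, every filter spans the full 0..1 range, so the
# composite uses the full palette instead of being set by raw flux alone.
channels = [rescale_filter(f) for f in (h_alpha, oiii, sii)]
print(channels[0])  # spans 0.0 to 1.0 despite the much larger raw flux
```

If instead all three filters were divided by one global maximum, the OIII and SII channels here would sit near zero and the composite would look almost entirely red, which is the monochromatic result described above.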
Now, one thing I will mention is that when we make our color images, we actually do less processing than many amateur astronomers do. The reason why is because we want the image to accurately represent what's coming off the telescope. So we make minimal use of techniques like sharpening and the like. Other than cleaning out cosmetic defects, we only use Photoshop to do the color composite, that is, to take the different filters and combine them to produce the final color image. Okay, great. John asks, were all the images taken with the same camera? And was the field of view the same for all images? I know you discussed the field of view for many of the images, but maybe some of them were a little bit different. The answer is no. We use different filters on different cameras and different telescopes. So for example, the Mosaic camera has a different field of view on the Kitt Peak 0.9-meter telescope than on the 4-meter telescope. And I also use other telescopes and cameras, like those on Gemini. So each image has its own characteristics and field of view. For all the images we make, on the website we list all of the details about the image: the size of the field of view, the orientation, what filters we used, and what color assignments. The idea is just to be as transparent as possible. So if people are interested in seeing how the image was made, they could do it themselves if they went and downloaded the data. Okay. And on the outreach resource page for this webinar, we also put a couple of tutorials. There's a big section on the Hubble site about this; I think it probably emphasizes the Hubble palette, and that actually might be an interesting question here. So you talked about these standard palettes that you use. Do you use the Hubble palette or a different one? And you might explain what that is. So the Hubble palette is specific to three filters: the sulfur II, the hydrogen alpha, and the oxygen III.
If you use only those three filters, you make them red, green, and blue respectively. But most images we make use more than just those filters, or use other filters as well. So the Hubble palette refers specifically to those three filters and those three colors. But most of the images you see from Hubble and other observatories use other filters as well. And so the term Hubble palette is a little bit misleading, because I don't want people to think that all images that come from Hubble use just those filters and just those colors. Okay. So Jay came in a little bit late, and he asks, and I think that you discussed this, but maybe it would be okay to revisit it: what filters do you use? LRGB, Johnson-Cousins, Sloan? Are there any other specific ones besides the hydrogen alpha and sodium ones that you use? Yeah. We tend not to use LRGB. For those who are not familiar with what that is, LRGB is a system where you take a luminosity image, or an unfiltered image, to just get as much light as possible, and then use red, green, and blue filters to combine with that. The advantage of LRGB is that it allows you to see fainter objects. The disadvantage is you lose color information. And so LRGB images tend to be not as vivid or as colorful as those made using other filter systems. So it's generally a system that we don't use. We do use the Johnson-Cousins filters, we use Sloan filters, and we use narrowband filters. And so it really depends on what we're trying to do. Most of the images I make use filters chosen for scientific purposes, and then we take that data to make a color image. So we don't always have a choice as to what filters we have. If I get to choose the filters, then I will choose them based upon what the object is. So if it's a galaxy, I may have some broadband filters and then add hydrogen alpha. If it's a planetary nebula, I would probably just use narrowband filters.
It really all just depends on where the data comes from. Okay, great. And we have time for one more question here. Bill, and this is a good one to end with, I think. Bill says, we started with black and white on glass plates; now giant CCDs with multiple filters. What's next? Where are we going with this technology? Well, the future for optical astronomy will hopefully be where radio and X-ray astronomy already are. When we use filters on an optical telescope, we're only letting a certain amount of light go through, and then we're throwing the rest of it away. But when you use a radio or X-ray telescope, these telescopes are capable of measuring not only the light coming in, but also the wavelength, or the color, of that light. And so there are technologies being developed that will hopefully allow us someday to do that with optical as well. Then you wouldn't need to use filters, and you would basically be collecting all of the light your camera can see all of the time. And it would already know what the color of that light is, so that you're not throwing light away as you're looking at things. It's kind of sad to think about this light traveling from a nebula thousands of light-years away, and then it gets to your camera and just bounces off the filter. So hopefully the future will be that we won't need filters at all, that our cameras will naturally measure color. Okay, well, that's certainly something to look forward to; if we have more data, hopefully it will translate into a better understanding of the phenomena that we're examining. Well, this is absolutely fantastic. This was great. Thank you so much, Travis. I learned a great deal here this evening, and it inspires me to think about maybe giving this a try myself; there's a lot of really great equipment out there. And that's all for tonight.
And so you can find this webinar, along with many others, on the Night Sky Network under the Outreach Resources section. Just search for webinar. We will post tonight's presentation on the Night Sky Network YouTube page by the end of the week. You can also find other resources and activities on this webinar's dedicated resource page. And now for our drawing.