Coming to my talk: I work on a project called OPUS. It's a search interface that people use to find data and images from NASA space probe missions to the outer solar system. It's built on Django, and I'm going to talk a little bit about why we built it and some of the ways people are using it, and I'll demo it as well. I work at the SETI Institute in Mountain View, California. SETI stands for Search for Extraterrestrial Intelligence, and it started in 1984 as a NASA project to use radio observatories to listen for signals from other star systems that might indicate signs of intelligent life. Today, 30 years later, SETI runs its own radio observatory and the SETI project is still going strong, but it's also a larger research center: it's the base of operations for over 60 scientists and their students and staff who study many areas of planetary science, such as planetary geology, asteroid research, planetary dynamics, and exoplanets. So basically anything related to the study of life and habitability in the universe. Our group at SETI is a NASA-funded data archive, and our scientific leadership are experts in planetary ring-moon systems. Many people are aware that Saturn has these epic, beautiful rings, but a lot of people don't realize that Jupiter, Neptune, and Uranus actually have rings as well, and these are some pictures of them. We at the Rings Node archive data from missions to these planets, and we support researchers in this field. So we built OPUS. In fine NASA tradition we gave it an acronym: OPUS stands for Outer Planets Unified Search, and the idea is that we've brought several missions into one single interface, so we've unified them. I'll jump right into demoing it for you. This is the landing page, and ironically it's sort of my least favorite page of the interface, because it's just a little bit complicated when you first land on it.
We serve primarily the scientific research community, and the interface has a lot of features and options, including a lot of specialized features, so it can be rather complicated. My sort of bad joke is that it looks like a DMV form. But the idea is this: you start with access to a lot of results, our entire database. In that blue bar at the top there's a number, if you can see it, telling you that there are 1.2 million results you can access right now. That is a count of everything in the database. For this tool we archive the complete set of remote sensing observations for several NASA missions: both Voyager probes, Cassini, Galileo, New Horizons, and there's some Hubble data in here as well. The idea is that you want to narrow down your result count to find the data you're looking for, or whatever you want to learn about, and the interface gives you feedback as you drill down into the data. On the left side there's this monster list of parameters, and if you click one of those, it brings in a search widget; two search widgets are shown by default in the center. So it'll bring in a search widget for any number of parameters, and you can use those to constrain your search. I'll show you what I mean. I'll go ahead and click Jupiter, and the first thing you might notice is that a list opens below where I clicked. That is actually a list of Jupiter's moons. When you click one of the planets, a list of its targets, its moons or ring features, will open, and it lets you drill down further. The other thing that happened is that the number at the top got a lot smaller, because you're basically saying, show me only results tagged Jupiter. So I click Browse Results to see what we get, and you get this gallery of space images that scrolls infinitely. The default ordering of this interface is ascending time.
So what we're actually seeing here are the earliest observations in our database. This is Voyager 1 as it first opens its camera shutter and images the planet Jupiter on its approach. The Voyager mission is actually two identical spacecraft. They were launched a few months apart in the late 70s, and they both had an original mission of just going to Jupiter and Saturn, but they've both extended their missions far beyond that. Voyager 2 has actually gone on to Uranus and Neptune, and Voyager 1, the spacecraft that took these images, has actually left the solar system, becoming humanity's first interstellar probe. So you can definitely infinitely scroll here, but you can also jump to pages; there's a little input at the top. I jumped here about 125 pages forward, and also forward in time, so now Voyager 1 is much closer to Jupiter, and it's taking these close-up images of Jupiter's atmosphere and the very strange textures of its storms and its turbulent atmosphere. I should say a little bit about these gallery thumbnails, because if you can pick up the color here, they're actually green and purple and blue. Space imaging cameras are most often black-and-white cameras; you can collect a lot more information at every pixel with a black-and-white camera. Each of these cameras has color filters: they have filter wheels, so they can drop a filter in front of the camera, and that way they pick up light at different wavelengths, different colors. But when we get the images back on the ground, without any special processing, they are actually grayscale, black-and-white images. So for this gallery, we added a false-color tinting just as an indicator to the user of what filter was used in that observation, as a quick way to see it, and I'll tell you a little bit later about why that's helpful. Now, this is a different search. I've done nothing but click Saturn instead of Jupiter.
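As an aside, that false-color filter tinting amounts to something like the following minimal Python sketch. The filter names and the filter-to-tint palette here are mine, purely for illustration; they are not OPUS's actual code or palette.

```python
# Illustrative filter-to-tint palette, as per-channel RGB multipliers.
# These names and values are made up for this sketch, not OPUS's own.
FILTER_TINTS = {
    "CLEAR":  (1.0, 1.0, 1.0),  # no tint
    "GREEN":  (0.6, 1.0, 0.6),
    "VIOLET": (0.8, 0.6, 1.0),
    "BLUE":   (0.6, 0.8, 1.0),
}

def tint_pixel(gray, filter_name):
    """Scale a 0-255 grayscale value into a tinted (r, g, b) triple."""
    r, g, b = FILTER_TINTS.get(filter_name, (1.0, 1.0, 1.0))
    return (round(gray * r), round(gray * g), round(gray * b))

def tint_image(pixels, filter_name):
    """Tint a whole image, given as a list of rows of gray values."""
    return [[tint_pixel(p, filter_name) for p in row] for row in pixels]
```

The point is just that the gray values are preserved up to a per-channel scale factor, so the thumbnail still shows the scene while the color cast encodes the filter.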
And I skipped to page 700, just to get forward in time a little bit, and the thing you might notice is that the result count is much larger than it was for Jupiter. That's because most of the images in our database are actually of Saturn, and that's because of the Cassini mission. The Cassini spacecraft launched 20 years after the Voyagers left Earth, and unlike the Voyagers, which are flyby missions, Cassini is actually an orbiter. It went into orbit around Saturn, and today it's completed over 220 orbits. And it does these crazy orbits. This is a diagram of the orbital path of Cassini, and you can see it swings way close to the planet and way far out. What it's doing is trying to image the ring system, because it's a very vast system, and also Saturn has over 60 moons. So they plan these orbits to fly very close to some of the moons and get close-up images of them, and then come in very close to the planet to get images of the atmosphere and close-up views of the rings. One thing I love about Cassini, in terms of this interface, is that it's a very stable platform for imaging cameras. When Cassini focuses on a thing, it's very good at keeping that thing in the center of the frame, and so as you browse Cassini images, you can see that many of them look like frames of a movie. This is a sequence where Cassini is looking at the tiny moon Atlas in the center there. In this interface, you can click and see a larger image, and if you have a good internet connection and you hold down the right arrow key, you can get this frame-animation movie effect. There are many, many sequences like this, interesting movie sequences that you can check out. And people are using this: they're making frame-animation movies, they're adding music beds, and they've gotten really creative.
They're gluing them together and making these videos without having to do any extra processing work to stabilize the scene in the frame. And I should mention this is a jQuery plugin called ColorBox that does a really nice job of pre-loading the next image. So I'm going to go back to the interface and show you a really common use case, which is to search by instrument. That just means you want to choose a particular camera on a particular spacecraft. What I'm going to do here is bring in the Instrument Name widget, and I've picked New Horizons there, because there's something interesting I can show you. New Horizons went to Jupiter, and I'm going to pick the moon Io from the list, and here's a bunch of pictures of Io. And, sorry, I got a little lost here. This interface is actually a shopping cart, so you can pick images you want and then do things with them: you can download a CSV, you can create a zip file. This is the view of what's in your cart, and you can interact with the images here as you would in the gallery. So, one thing about this sequence of images. A lot of people are aware of New Horizons because it just went to Pluto, and we got our first close-up images of Pluto; we never knew what Pluto looked like before. But it also had a really interesting encounter at Jupiter. Lots of things happened. We have all the Jupiter images, and when the Pluto images come back down and are released to the public, we'll have all of those too. So if you can see this moon here, there's a little bump on the top; it looks kind of like a fountain, it's kind of bright. That's actually a volcano erupting just as New Horizons was taking pictures of it. Nobody knew this would be happening. They had planned this sequence to take pictures of Io, and here was a volcano erupting at that moment. So from that sequence, we get our first and only existing frame-animation movie of a volcano erupting on another world.
So, I know, right? That's a little bit closer. That is a 200-mile-high plume. The volcano's name is Tvashtar. It's actually over the horizon a little bit, so you're just seeing the edge of it, or the top of it. So why did we do this? Why did we build OPUS? The challenge was that people researching the outer planets, and especially people studying ring systems, were having to do a lot of extra work in order to get the data that they needed. This relates to the fact that space missions are enormous collaborations. If you think about a spacecraft, we attach a lot of instruments to every spacecraft that goes into space. Here's an example, the Cassini spacecraft. There are very specialized instruments: cameras, telescopes, spectrometers. And each instrument has its own team of people who work just on that instrument. They build the instrument, they shepherd the data back to Earth, and they release it on their own websites. NASA has a standard for how the data is formatted, and the teams mostly follow that. But what we were seeing is that people had to do a lot of searching around different websites, and had to know a lot about the instruments, in order to work with the data. And this caused frustration. So we wanted to address this pain point. We wanted to create a single interface that would serve a larger swath of researchers, and we tried to make it easy to use. So who uses it? Scientists use it. There are many ways in which it saves them time, and I'll give you just one example of a parameter that people who study ring-moon systems are particularly interested in, and which OPUS makes easier to explore. That is phase angle. Phase angle is just a way of quantifying the angle of light in a photograph. If you're taking a picture and the light is coming from behind you, that is a low phase angle.
And if you're taking a picture and the light is coming from in front of you, so that the thing you're taking a picture of is backlit, that is a high phase angle. Right here you see two pictures of Saturn. The top image is at high phase angle: Saturn is backlit. The bottom image is a low phase angle image, so the light is coming from behind the camera. The bottom view is more what we're used to seeing from ground-based telescopes, where the sun is behind the camera. As you can see, the rings look very, very different at different lighting angles. Just by changing the lighting angle, you change what you pick up in the rings, and this sort of research allows people to learn a lot more about the rings by looking at them at different lighting angles. I'm not sure if you can tell, but in the top image there's a super wide, really faint, cloudy, diffuse ring that doesn't appear at all in the bottom image. That is actually the E ring. Cassini discovered this enormous ring that's not part of the main rings. Here's a close-up of the brightest part of that ring, and there's a tiny moon there, Enceladus. What Cassini found is that Enceladus is actually venting water ice from its south polar region and feeding this giant diffuse ring. So how do you do a phase angle search? I'm going to do one against Saturn. I pick Saturn, and then go into the monster menu, and under Lighting Geometry there's a phase angle widget. Each of the widgets gives you a hint about what the valid values are, so you don't have to know them in advance. Right here I'm just going to pick some high numbers, 165 to 180, and the result is some very pretty, glowy images of Saturn with rings that are backlit. In several of these, you can actually see the E ring, whereas you couldn't see it in other images.
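To make the geometry concrete: phase angle is the Sun-target-observer angle, measured at the target. Here's a minimal sketch of computing it from positions; the function and its names are my own illustration, not anything from OPUS.

```python
import math

def phase_angle(sun_pos, target_pos, observer_pos):
    """Return the Sun-target-observer angle in degrees.

    ~0 deg   -> Sun behind the camera, target fully lit (low phase)
    ~180 deg -> camera looking toward the Sun, target backlit (high phase)
    Positions are (x, y, z) tuples in any common unit.
    """
    to_sun = [s - t for s, t in zip(sun_pos, target_pos)]
    to_obs = [o - t for o, t in zip(observer_pos, target_pos)]
    dot = sum(a * b for a, b in zip(to_sun, to_obs))
    norm = math.hypot(*to_sun) * math.hypot(*to_obs)
    # Clamp for floating-point safety before taking the arccosine.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
```

So the glowy, backlit Saturn images from the demo are the ones where this angle is up near 165 to 180 degrees.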
If you scroll down a little here, there are some very bright, oops, sorry, some very bright pictures of the E ring, and the tiny dot in there is Enceladus. So not only do scientists use it; science journalists occasionally use it too. At the website planetary.org, you can find collections of composite images created by the Planetary Society's senior editor, Emily Lakdawalla. She's actually a scientist turned journalist, and she creates these composite images that show various planetary bodies in the same image, to show scale comparisons, to teach, and to raise awareness about our solar system. She uses OPUS when looking for images of specific bodies, at specific resolutions and specific phase angles, and she tells me that making composite images like this would be ten times harder without this tool. Another community that uses it: there's a thriving community of people who create color versions of space probe images. Space probe cameras are black-and-white cameras with different color filters, so if you can find three images of the same scene taken with the red, green, and blue filters, then with some Photoshop skills and some artistry you can make color versions of these spacecraft images. And these are much like the colors your eyes might pick up if you were viewing the scene directly, if you were out there with the spacecraft. People like OPUS because our gallery thumbnails are tinted, making it easier to find those RGB sets. Related to image processing and frame animations, there's a large-format IMAX movie coming out; it's in production right now. It's called In Saturn's Rings, and it's made exclusively from real photographs taken by spacecraft and telescopes. They're not using any CGI for this movie; there's no 3D rendering at all. Instead, they're processing over 7.5 million photographs taken by various spacecraft and telescopes.
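The final step of that RGB workflow, once three filtered frames of the same scene are in hand, can be sketched as a toy example, with images as plain lists of rows of 0-255 values. This is my own illustration, not how any actual tool does it; real spacecraft frames also need registration (alignment) and per-channel balancing first, which is where the artistry comes in.

```python
def compose_rgb(red_frame, green_frame, blue_frame):
    """Stack three co-registered grayscale frames into one RGB image.

    Each frame is a list of rows of 0-255 gray values; the result is a
    list of rows of (r, g, b) tuples.
    """
    return [
        [pixel for pixel in zip(r_row, g_row, b_row)]
        for r_row, g_row, b_row in zip(red_frame, green_frame, blue_frame)
    ]
```

The tinted thumbnails help precisely with the step before this one: spotting, at a glance, three frames of the same scene shot through the red, green, and blue filters.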
So I'll play you a clip from the late-summer teaser they released last year. During production, they used OPUS to find suitable image sets to stitch together into large-scale panoramas, which they then animate to create the effect of flying through space around Saturn. The movie is due out later this year. So now I'm going to switch gears a little and talk about my favorite part of the interface, which is the API. OPUS has a completely open JSON API: you can make requests over HTTP and get a JSON response back. Any search that can be performed in the OPUS interface can also be performed via an API call; the URL structures are the same. What this means is that all the data and all the images you can access using the interface, you can also access from your own scripts, from the shell, from Python, anything you want, via URL endpoints. I'll show you an example. Here's one of our endpoints. For this one, I am asking for data, and I want the format JSON, and I want to run this query: planet equals Saturn, target equals Hyperion; that's a moon of Saturn. You get this nice, well-formatted JSON response, and in this case you can also change the .json in the URL to .csv if you wanted. Here's another one. Let's say you want images. Here I'm asking for images, size medium, and I want an HTML response this time, just so I can show you the pretty pictures in the browser. It's the same query, except I've added an imaging camera just to make sure we get actual images back. So you get a big list of Saturn's weird, spongy moon Hyperion. If you go to the URL and change that .html to .json, you get the same list in JSON, only now it's a list of direct links to all those images. To find the API, you go to the main search interface; there's a link at the bottom that gets you to our API documentation, where we've documented how to use it and shown several examples.
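A call like the ones in the demo might look like this from Python. The base URL and endpoint paths below are assumptions pieced together from the demo, not taken from the official docs; check the API documentation linked from the search interface for the real details.

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

# Assumed base URL for the OPUS API; verify against the API docs.
BASE = "https://opus.pds-rings.seti.org/opus/api"

def build_url(endpoint, fmt="json", **query):
    """Build a URL like <BASE>/data.json?planet=Saturn&target=Hyperion."""
    return f"{BASE}/{endpoint}.{fmt}?{urlencode(query)}"

def fetch(endpoint, **query):
    """Fetch and decode a JSON response (requires network access)."""
    with urlopen(build_url(endpoint, **query)) as resp:
        return json.load(resp)
```

Swapping `fmt="json"` for `"csv"` or `"html"` mirrors changing the URL extension in the browser, exactly as in the demo.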
So, we are a tiny, tiny team on a tiny budget. And I know, I had to, okay. Django is making it possible for us to accomplish a lot with very limited resources. And I have to say the awesome icing on the cake is that I've experienced the Django community to be such a supportive community, and welcoming to beginners, because not so long ago I was also a beginner. So thank you, awesome Django community, for being who you are. And thank you to the Django maintainers, because we're making space images more accessible to scientists, journalists, artists, developers, and everyone, and you make this possible. Thank you. I can take questions if there are any.

All right, cool. So I noticed on the main page you talked about how you didn't like that page very much, because it's a bit of a monster. In designing this site, were you designing it specifically with the scientist in mind, or were you trying to design it as an open public interface? Or was that something that came along later, where you realized that more people might want to use this than just researchers?

So, we're funded to serve the scientific research community, but my secret agenda the whole entire time is that I think anybody should be able to use it. So I actually do work to make it easy to use, although I have to weigh different issues. But yeah, it's always been open and public, and anybody can use it. I've also always thought that scientists deserve the same quality of tools as everyone else; they're not better at figuring out bad interactions than other people. So why not try? So we try.

Thanks for the talk. So you have a lot of images, and I was curious how you handle the serving and storage of that scale of image content with Django. Because I've often found that when you get to many hundreds of thousands of images, Django, especially the built-in media handling, starts falling down.
So is there anything special you're doing there? No. Oh, just normal? No, the database is a little over a million rows, so no. Okay, fantastic. Thank you.

I was curious: you were talking about how there's a community of people who put the R, G, and B together. Is there any way you could automate that, since you're already doing the tinting? Or is it just a matter of getting the proper registration marks, so that you have control points and you're actually layering the images right? Maybe they're not referenced in such a way that you can do that.

I don't know, I think there's some of that. I think there are also some judgment calls in how much of each color you bring in. Sometimes you see colorized images and they're kind of false color; they're not really what your eye would see, technically, and that's subjective anyway. So yeah, as I understand it, there's a lot of judgment involved. But I think a lot of things can be automated, so who knows? Also, I thought of an answer to your question over there: we actually do use a memcache layer for these queries, but the images, yeah, they're just coming from our server.

Do you guys self-host, or do you use some kind of cloud provider to host all the content? We self-host. Yeah, it's Django, MySQL, jQuery on the front end, Apache, and a memcache layer. OK. All right, thank you very much. Thank you.