All right. So we have one more speaker and then a couple of announcements to wrap up our morning. One of the local users here that has long run an OpenStack deployment is MIT CSAIL, the Computer Science and Artificial Intelligence Laboratory. And Jon Proulx from that environment is on our user committee. Today, we're actually going to be hearing from the director of CSAIL, who is one of the world's experts on robotics. She's going to be talking to us about some innovations in robotics that are kind of surprising to me. So let's welcome Daniela Rus.

OpenStack! Are you having fun? I would like to tell you how OpenStack can power a new generation of services and enable people to make their own robots. Because right now, it takes too much expertise and time to make a robot. But imagine a world where anyone can make a custom robot. Let's say we have a user, and let's call her Alice. We want to give Alice the power to automate many tasks in her home. Say she wants a playmate for her cat for when she's at work. To do so, Alice goes to 24-hour robot manufacturing, powered by OpenStack, with a rough idea of what she wants. At 24-hour robot manufacturing, she can explore the design space for her device using an intuitive interface. When she settles on a design, the store will print the final design, generate a programming environment for the robot, and augment its database with the new system. The cost will be affordable, and the robot will be made right away. And the cat will have a playmate for when Alice is at work.

So how feasible is this idea? Well, here's an example. Say I want to type, in my favorite editor, that I want the robot to play chess with me. Can you imagine being able to type "make", and in response, having a parser process that specification to determine behaviors like pick up piece, move piece from here to there, drop piece, et cetera?
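To make the idea concrete, here is a minimal sketch of turning a task specification into a list of robot behaviors. The spec format, behavior names, and function are hypothetical illustrations, not the actual CSAIL toolchain.

```python
# Hypothetical sketch: parse a plain-text task spec into robot behaviors.
# The verb "move" and the behavior names are illustrative assumptions.

def parse_spec(spec: str) -> list[tuple[str, ...]]:
    """Turn lines like 'move e2 e4' into a sequence of behavior tuples."""
    behaviors = []
    for line in spec.strip().splitlines():
        verb, *args = line.split()
        if verb == "move":
            src, dst = args
            # One high-level command expands into several primitive behaviors.
            behaviors += [("pickup_piece", src),
                          ("move_piece", src, dst),
                          ("drop_piece", dst)]
        else:
            raise ValueError(f"unknown verb: {verb}")
    return behaviors

print(parse_spec("move e2 e4"))
# [('pickup_piece', 'e2'), ('move_piece', 'e2', 'e4'), ('drop_piece', 'e4')]
```

Once behaviors exist as data like this, downstream stages can map each one to a mechanism.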
And once you have behaviors, mechanisms could be generated to implement those behaviors, the designs could be fabricated, and the robot could be made. This chess robot was created in a few hours for a bill of materials on the order of a few tens of dollars. The robot was not created entirely from that very simple Emacs interface, but many aspects of making it were automated.

So we are building this robot compiler for making robots by printing flat structures and folding them into robot bodies. The compiler has a database, and the database has designs. The designs can be composed and segmented. For instance, if you want to make a six-legged robot, you might compose it out of pairs of legs, single legs, and robot bodies. All the user has to specify is roughly what the body looks like; the system will figure out exactly what the components are and how they can be parameterized and instantiated using different materials. The output of the system is a set of design files that can be sent to something like a laser cutter or vinyl cutter, and that can be produced even with low-tech tools like paper and scissors. Once the designs are printed, the user, in our case, manually folds them into robots. Here we're demonstrating a six-legged robot, a wheeled robot, and an arm-shaped robot.

So the body is now done. What is needed is the special programming environment for the body and a user interface that allows any casual user to work with the robot. The whole design process takes about two minutes, and the interface is simple enough that anybody who has a smartphone can use it. So let's zoom in. First, all the designs are parameterized and stored in a database. This enables the user to do interactive modeling, and once the user settles on a design, it can be fabricated. The user designs 3D objects that get unfolded for easy fabrication, but the conceptualization, the view the user works with, is in 3D space.
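The composition step described above can be sketched as data: a database of parameterized components, plus a rule that assembles a six-legged robot from a body and leg pairs. The component names, parameters, and default values are assumptions for illustration, not the real compiler's schema.

```python
# Illustrative sketch of composing a robot from a design database.
# Component names and parameters are hypothetical.

DATABASE = {
    "body":     {"length_mm": 120, "width_mm": 60},
    "leg_pair": {"leg_length_mm": 40},
}

def compose_hexapod(leg_length_mm: float = 40) -> list[tuple[str, dict]]:
    """A six-legged robot = one body + three leg pairs.

    The user supplies only a rough parameter (leg length); the system
    instantiates the concrete components from the database."""
    design = [("body", dict(DATABASE["body"]))]
    for _ in range(3):
        pair = dict(DATABASE["leg_pair"], leg_length_mm=leg_length_mm)
        design.append(("leg_pair", pair))
    return design

robot = compose_hexapod(leg_length_mm=35)
print(len(robot))  # 4 components: 1 body + 3 leg pairs
```

In a real pipeline, each instantiated component would then be unfolded into 2D cut files for the laser or vinyl cutter.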
The user does not see the unfolded designs. The designs are composed hierarchically from parameterized components, so the user can change them and reason about what is most useful for the task. Here we see the user dragging legs to create a four-legged system, and on the side you see how the system automatically unfolds the 3D design to create the desired robot. Once a design is chosen, a simulation engine can be used to make sure the design meets the specifications. If something goes wrong, the user can change the design; in this case, the legs were too long, and the robot was not stable enough. Once the design is established to meet specifications in simulation, it can be 3D printed.

The circuits for the design, the brain of the robot, the electronics that support its behavior, can be created using a very similar data-driven approach. In this case, for the crawling ant, the user specifies only two sensors and two UI sliders, and the system generates all the missing components, such as the microprocessor and the connecting wires. Ultimately, the system produces all the wiring diagrams and instructions for creating the electronic substrate for the robot. The system can also create the app for controlling the robot, in this case through the sliders the user specified.

Now, the most time-consuming part of making these robots is the manual folding; much of the roughly two hours it takes to make a robot goes into turning the flat design into its 3D shape. So we asked: can we automate folding? And our insight is, yes, by using heat, by giving users the ability to bake their own robots. Here's how it works. The secret sauce is to compose the body of the robot as a sandwich, a three-layer structure where the top and bottom layers are structural and the middle layer is a contracting layer that, in our case, reacts to heat.
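The design check described above, where the simulation catches legs that are too long for a stable robot, can be caricatured as a single rule. A real simulation engine would evaluate dynamics; this toy threshold and its geometry are purely illustrative assumptions.

```python
# Toy stand-in for the kind of stability check a simulation engine
# might run before fabrication. The 0.8 ratio is an invented threshold.

def is_stable(body_width_mm: float, leg_length_mm: float,
              max_ratio: float = 0.8) -> bool:
    """Flag designs whose legs are too long relative to the body,
    mirroring the 'legs were too long, not stable enough' failure."""
    return leg_length_mm / body_width_mm <= max_ratio

print(is_stable(60, 40))  # True: legs OK, fabricate
print(is_stable(60, 70))  # False: legs too long, revise the design
```

The point is the workflow: the check runs automatically, and a failing design is sent back to the interactive editor rather than to the printer.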
The middle layer, if you have kids, is made out of the material used in Shrinky Dinks toys. By cutting gaps in the top and bottom layers of the robot, we can control the angle at which the robot body folds up or down when exposed to heat. Here's an example: you can see this robot on the heating plate, and it rises from a flat square shape into a runner. This robot can in fact run at four body lengths per second. And the robot can do interesting things. It can navigate. It can crawl up your body. It can go around obstacles. It can float on water. It can carry things. It can climb. It's kind of an exciting robot. The way we control this robot is with an external magnetic field: the robot has a tiny magnet embedded in its body, and using a programmable magnetic field, we can control its trajectory. And when we're done with the robot, we can send it to a recycling bin.

OK, so these are small robots, but in fact the same method can be used to create large, full-sized robots. I wanted to show you one interesting example of why you might want such a small robot, and the application here is surgery inside the body. The idea is to create a small robot, we call it the origami microsurgeon, that could be almost like a pill you could swallow, and it could do things to help you recover from various troubles. For instance, if you swallow a button battery, which 35,000 people in the US do every year, it's actually quite dangerous, because within half an hour that battery will become fully embedded in the tissue of your stomach and make a hole in it. At the moment, fixing this issue, removing the battery, is done by surgery. But with the origami pill, with the origami microsurgeon, you could swallow the robot. The robot would arrive in the stomach, and once there, it would deploy, go to the location of the battery, pull it out, and eliminate it.
Now, in order for this to work, the robot capsule is actually made out of ice. You swallow the compressed robot in this ice capsule; when it gets to the stomach, the ice melts and the robot deploys. Here is an example of the robot operating in an artificial stomach. The robot was swallowed, and now it is directed, using the magnet, to the battery. The magnet is used to pull the battery out of the stomach lining, and then the battery gets eliminated through the digestive system. Later on, you can send a second robot to patch the wound and deliver medicine at that location. And here is the robot delivering on that task. All right, it's going to make it. I promise. Here it is.

OK, so how far in the future is this 24-hour robot manufacturing idea? Well, consider that 20 years ago people started dreaming about pervasive computing, and look at where we are today: computing is so normal that we don't even notice all the different ways in which we compute every day. I would say that the smartphone and the app store have democratized computation. And just as the app store has democratized computation, the potential of democratizing and customizing physical tasks through easy-to-make, easy-to-use robots can be equally profound. If we can do this, we will have a world with a lot of robots helping us with physical tasks. And what's not to love? I hope OpenStack helps us. Thank you all.