I have a loud voice and I hate standing behind podiums. I like walking around, moving around, so I'm sorry if I look like a crazy chicken going back and forth. So my name is Hal Harris. I'm a professor at Carnegie Mellon University. I'm the director of the computer science program there and also the founder and director of the Networking Systems Lab. Let me share with you, just for a couple of minutes, some context as well, like Maj did. By the way, that was extremely interesting, so it's actually going to make it a lot harder for me to follow. I just wanted to keep hearing you for a while. So Carnegie Mellon was invited by Qatar Foundation, the same general entity that Maj was talking about, to open a campus here. And we offer a couple of programs as part of our presence here, namely business administration, computer science, information systems, and biological sciences. We are fortunate enough to be part of this experiment here in Qatar, helping grow the educational background and the research awareness going on over here. I'm extremely proud to be part of this university. We're number one in the world in computer science, along with MIT, so that they don't feel bad. But we actually have double the faculty of MIT in computer science alone. We're really happy to be here. We've been here now since 2004. I've been present in Doha for the past eight years. We've become stable enough, we've graduated several batches of students, and we've been reaching out more and more to the community in many different ways, some of which have been different collaborations with QCRI, for instance. I'm here to talk to you now about robotics, and UAVs in particular. So before I get into the gist of this topic, let me tell you a little bit about the talk format, and then I'll be talking about two big things: robotics in our life, and then one example that I'll dig a little deeper into, particularly UAVs. What do I mean by a talk format?
Well, has anybody seen this before? Yeah? Anybody get bogged down by PowerPoint slides? Heavy ones? Ones you can't really parse? This was a real slide that circulated the net a while back; it was used in one of the presentations by the US military in Afghanistan as an example of how we should not be using PowerPoint. Has everybody here heard the term "death by PowerPoint"? This would kill you if you really tried to understand what's going on there. My point in putting this up is to tell you I really, really prefer interactive discussions rather than a one-way talk. If you have questions, do stop me. Don't feel weird or anything. I actually prefer that. I can adapt my time, and it's much better to have more people engaged than me ranting on for a while and having you disconnect. So the point of this is to say, I hope my slides will not look like this. Ask me if something is missing. I like just using them as a way to guide the discussion, to a large extent. Thank you. This is what robotics is. Exactly what Doha just said. This is the current definition on Wikipedia. I think it's very appropriate. And the key I want you to catch here is mechanical engineering, electrical engineering, computer science. And within that, you can find so many more details. The point is, as I was just sharing over here, robotics is one of those fields that happens to be the result of the marriage of many subfields. It's extremely rare to say, I am focused 100% on robotics. Usually, if you're really focused, you're focused on one portion that contributes to building robotics or doing robotics. I, for instance, if you saw the title slide, come from a networking and communications background. That's my main area of expertise in depth. I happen to be involved in more and more robotics projects, because my area of expertise, or at least my preference in my research, is building real systems. People in academia, or in computer science and theory, hate me when I say this.
I don't like to just think of a problem theoretically and imagine a solution and simulate it. I like to try to build it along with my research team. So robotics happens to be one of the most lucrative domains where we can apply the work that we're doing. Now, having said that, I was wondering, what do you guys think are examples of robotics? Drones, what else? What are robots you're aware of? You can think of it either way. You can either mention a few robots you're aware of or think are robots, or even predict what kind of robots might be out there that can improve our lives or make our lives better. What are some robots? What are robots? Mars rover. Mars rover, UAVs. Good. Roombas, the automatic cleaners that vacuum your rug when you're not around and hopefully can avoid obstacles. OK. So a script. Does that script have something tangible to it? ATM machines. OK, interesting. I wouldn't have thought about that. Interesting. The bomb-dismantling thing. Bomb-dismantling robots, right? We don't want people to be hurt. You want to look for mines, you want to send in a robot. Great, these are some very good examples. Let me tell you how DARPA, as one agency that thinks about robotics, looks at robots. From their point of view, for instance—and this is just one example—the point here is that it depends on the application domain that you're talking about. They start out with something funny, saying: sci-fi stuff, sorry, not real—although the Japanese are slowly breaking that assumption. But they look at industrial robots. They try to categorize the current robotics areas into those three categories. Things that are industrial: they're real, they're pre-programmed, they're pretty strong, extremely precise, extremely tailored for doing one or two specific tasks. That's about it. Not that those are easy to build, but that's the kind of category that we'd be looking at.
So think automotive industries, for example. All those robotic machines that now build your cars fall in that category. You'll find a robotic arm that does just a few things. Things that build your chips in manufacturing—same thing. 3D printers? Not there yet. 3D printers, at this point, are 3D printers. They have their own life and technology and materials, and that's a different world in itself. Tele-operated: things now that can manipulate objects, require clear communication, are real, still require a lot of precision, but need some human operation. Can you think of examples? Thank you. So the Mars rovers actually are not tele-operated, but we all know the unfortunate usage of US drones, for example. A lot of those are actually piloted. You actually need a pilot who controls those, and there's a lot of delay, and you need communication to make real-time decisions. Other, nicer examples in life are robotic surgeries. Actually, there is a robotic surgery center here at QSTP. We went and we played with that. It's really interesting. You can sit there at the actual robot, or if you have a decent connection, you can control it remotely. And then this is the thing they want to get to in the future. I know it looks like an evil soldier that's trying to do bad things, but you can ignore the actual application and think about the requirements to get there. You want something that's versatile, highly capable, with great communication, flexible enough, maybe with some artificial intelligence so that if you disconnect from it, it would know how to act on its own—you name it. Another interesting thing I'd like to share is another video that sums up some examples of good robots, just to get us a little more excited, if you will. Here are 10 great robots, some of which are even available to the public. Number 10, Honda's ASIMO. The humanoid robot stands a bit over four feet and weighs in at 119 pounds.
But its developers have packed a whole lot of functionality into its modest frame. ASIMO can do sign language, play soccer, and even serve drinks. Number nine, Kuratas. If you ever dreamed of being the pilot of a great, big robot, this is your chance. Located within its 13-foot-tall frame is an operating room for an onboard commander. Oh, by the way, it can be controlled with an iPhone. Number eight, Atlas. Commissioned by the Pentagon and engineered by Boston Dynamics, this disaster-rescue dynamo is built to perform the heroic task of saving lives without risking its own. Number seven, termite robots. These little guys are designed to do building work, and their way of working was inspired by termites. The bots assess their environment, take cues from one another, and get to work even though none of them has a clue of the bigger plan. Number six, Cheetah-Cub. Wanting to know how to make robots take on tough terrain with the grace of a feline, scientists at the Swiss Federal Institute of Technology made a robotic cat. Using it, they can assess joint force and agility without having to harm an actual animal. Number five, WildCat. Not to be confused with the aforementioned Cheetah-Cub, the WildCat is the newest generation of Boston Dynamics' Cheetah robot line. These machines are made for speed, traveling up to 29 miles per hour. While previous versions were indoor runners, this one is suited to run free. Number four, VGo virtual student. Stuck at home with a severe immune disorder, a student in Texas is now able to attend classes thanks to a robot. She can power the virtual version of herself via the internet and participate in what's going on thanks to a webcam. Miranda suffers from a weak immune system that doesn't allow her to attend class in person, but thanks to this robot pilot program, she's back in the classroom. Number three, RoboFly. It's tiny and fast, weighing only 80 milligrams and flapping its wings up to 120 times per second.
Once technology advances enough to make a battery small enough, the robotic insect is predicted to be a helpful tool in search and rescue missions. Number two, the Mab. It flies around your house, senses where it's dirty, and then deploys a bunch of tiny little cleaning bots to make the area sparkle. It's just a concept, but it was enough of one to win the Electrolux design competition. Number one, Curiosity. NASA's Mars rover has accomplished far more than any other robot could imagine, so to speak. It's only been on the red planet for about a year and a half, but it has already determined that there could once have been life there. If you could design a robot, what would you make it do? So I'm sure some of you have seen snippets of these videos before, especially the Boston Dynamics related ones. And the only purpose here is really to share with you the wide range of potential robots that you can think of. The thing I like to share with my students when we start talking about these things, or when we go and give talks to get them excited about the future of computer science, is something that may be a lot simpler and smaller, that we use in our everyday lives, where we forget the involvement of robotics or computer science. So for instance, what's the first thing that you do when you wake up in the morning? Drink water, coffee. I brush my teeth before coffee. Yeah? So I try to do that first. Well, I'm sorry, I just had to use the example I have, but I could just as well have picked something for coffee. But it's something simple and pervasive, and we forget about it. We take it so much for granted in our life. I walk them through the details of every single piece of material—the glasses you wear, the pipes you're using, the tap—everything you are using or doing has been manufactured today mostly by robots. And I like to share one of these just to show: oh my God, the toothbrush. Who cares about a toothbrush? We don't think about it anymore. And, well.
These toothbrushes should make you want to flash your pearly whites in appreciation if you consider that thousands of years ago, people used twigs to clean their teeth. Today's toothbrush begins as little plastic pellets. A vacuum sucks them up into an injection-molding machine. The machine melts the pellets into a kind of plastic dough, then injects it into a stainless steel mold, forming 10 toothbrush handles at once. The head of each handle has up to 56 holes for bristles. This machine generates 10 handles every 35 seconds. That works out to 27,000 handles in 24 hours. And now they melt blue rubber pellets. They pipe the liquid rubber into the mold with the white toothbrush handles, then press the rubber onto the handles to form a grip. Let's go a little faster. It looks like gummy candy. Next come the bristles. A robotic arm pats them down so that they sit very evenly. Then, faster than you can blink an eye, the machine feeds the bristles into the holes in the head of the toothbrush handle. This machine works at a blurring speed, filling 900 holes per minute. And it operates with incredible precision. So you got the idea. That's why you only pay a couple of riyals for your toothbrush, because this is what's happening at that rate. A huge upfront investment, and to put it together, this needs mechanical engineering, electrical engineering, computer science, programming, et cetera. So let's talk a little bit about robotics growth and advances—what's going on now around the world. So it turns out more and more jobs are now being lost, right? One of the most sensitive issues whenever I talk about robotics anywhere. I'll get to that in the end again. And one of the biggest domains, as you may tell, in the growth of robotics is industrial robots. You can see over here a bunch of numbers talking about market value around the world, et cetera.
Look at the different countries and how they've been growing in terms of the number of sales of industrial robots. I know you'll see something like Germany being a little bit constant, but it's really constant these past few years because they had already invested a lot earlier in industrial robotics, and it'll be a while before they get into the newer generation. One of the scary trends—and if you read more about it, you'll see things now related to China and other places in Asia—because I'm sure you all know a lot of the Chinese goods that you get have for quite a long time been hand manufactured. And if they wanna grow that, they're gonna import more of these things and flood the markets even more and more. This is another interesting, potentially scary stat on the changes in human workers versus robots just since 2009 in some of the major industrial countries. So you can see most of them losing manufacturing jobs and sharply increasing robot deployments. This is not a new trend. This is something we've seen throughout history, but now there's the whole fear of robots taking over. This is one initial trend from an economic perspective—from the perspective of how we're moving forward, what we're gonna be facing as realities that we would probably need to be at the forefront of rather than react to. I wanna take that and show you another example of something that we did at CMU in particular, that we're all very proud of. And the example is showing you an advancement over time. So one of the biggest dreams that a lot of people have, including Google right now—and I'm sure if you've lived long enough in Doha or anywhere, really, you have the same dream—is that you don't have to drive. So a few years ago, let me share with you this. Imagine a car that can drive itself, that knows the rules of the road. Sound far-fetched? Not for the robot wizards at Carnegie Mellon University. They call this car with a brain "Boss." Now—"I don't like it, I'm gonna drive it myself."
That's scary. Why? You wanna be in control. But Boss is probably a better driver than you are. Students built it for a Pentagon competition called the DARPA Urban Challenge. It navigated its way through a simulated urban environment without so much as a dinged fender. It was the undisputed champion in 2007. This is a GM vehicle, all right, a Chevy Tahoe. Most of the back end of this is packed with computers. All that you're seeing here are different kinds of sensors, from infrared to laser to video—you name it. The car was indeed driving by itself. We only had people sitting in there, and we actually tested a big part of this around CMU, actually a little bit outside of CMU's premises. Now, I'm not showing you this just to show you what was possible; I'm showing you this to see what happened in a couple of years. This was around 2007, as you heard. One of the main students who was working on this project at the time and one of the main advisors ended up going to Google. Actually, they went to Stanford first and then went on after that to Google. And give it, what, about five years or so, and now you get something like this. This car can now do the same stuff that you saw on Boss. Google built this prototype with the same team members and tested it on a bunch of people—as you heard in the previous video, people who are scared, who don't trust those kinds of cars—and went and started building a program. Really, they're prototyping this. It was a chance for them to explore what it really means. Chris was the grad student on the Boss project. What's the point of this again? Research, innovation: you think of something, you dream something. It seems extremely hard, extremely expensive. You work hard at it, you work with teams, you invest—it's a typical cycle. You take it to some entity that will believe in it. You guys would know this more than me. They stand behind you, try to fund it, and you get a better prototype.
I don't know if you've heard, but now Google has licenses for self-driving cars, I think in two or three states, if I'm not mistaken. So these cars have been heavily tested on roads. This is the future, and you've seen the steps that we're taking. Okay, so all this stuff is really to entice you about the area of robotics, show you some differences, show you the range of different robots that are out there. I wanna zoom in on one of the projects that I'm involved in here at Carnegie Mellon in Qatar, related to UAVs. We have a bunch of robotics-related projects; I can give you just a quick synopsis of a couple of them. One of them has to do with oil pipeline inspection. We had this in collaboration, I think, with RasGas, and we actually got a bunch of the real pipes that they use in the oil pipelines into Carnegie Mellon. We never knew how insanely heavy those were. We had to get a forklift to lift them and put them inside. And the whole idea is we built robots that would navigate the inside of those pipes—think about applications that are relevant here—and through stereo vision detect potential corrosion, because you would really want to tell early on if a pipe is about to explode or not. And if it does, I think many of us here in Qatar would not be paid for a while. This is one example. Another example that I'm actually involved in is building a bunch of boats on which you can mount different kinds of sensors. Right now we're mounting different cell phones, because they are actually very high-end platforms today with decent battery power. And we go now and experiment with them and deploy them in the waters around Qatar. This has been a three-year project to gather lots of sensory information and map the bottom of the ocean, or measure the depth across different places.
These become agile platforms that you can use for whatever purpose you want that relates to water areas where you don't want to send people, or where you don't want to spend a lot of money and time on humans or more complex entities. UAVs: unmanned aerial vehicles. So my interest in UAVs started about a couple of years ago, and interestingly enough, it was never, oh, I want to work on drones. It actually came from a networking project where we work on something called visual sensor networks. We are working on how you make sure that if I'm covering an event, I can deploy my cameras in optimal positions so that I have maximal coverage. Coverage in this case means: how can I see as many people as possible? So if I'm in this room, where's the best place to put a camera to see everybody? Maybe give it this distribution: I need one camera here with this FOV and I get this part, and another camera there with another FOV. FOV being field of view—how wide or narrow a space you can actually see with a lens. And there's a lot of research out there on this. By the way, this is an extremely important application for so many reasons: border control, security, surveillance, crowd tracking—you name it. Think of what will happen at Qatar 2022. So many different applications. But we wanted to look at it from the perspective of: what would you do if you had mobile cameras? And we thought, mobile cameras—what's the best mobile camera I can use? Well, get a bunch of drones. And let's see if we can hack them and program them. And then this whole thing took on a life of its own, because it turned out to be a lot harder than we thought. But let me walk you through it. And the reason I'm walking you through this example is that, as entrepreneurs, if you're interested in robotics-related stuff, I think it helps to have some awareness that there will be extreme challenges when you come to the technical things.
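The camera-placement problem described above—pick positions and orientations for a handful of cameras so that, given their field of view, they cover as many targets as possible—is a set-cover-style optimization, and a common first attack on it is a greedy heuristic. Here is a minimal sketch under assumed 2D geometry (point targets, a fixed FOV angle and range); the function names and numbers are illustrative assumptions, not the actual system described in the talk.

```python
import math

def covered(cam, target, fov_deg=60.0, max_range=10.0):
    """True if `target` lies inside the camera's field of view.

    cam is (x, y, heading_deg); target is (x, y). The FOV angle and
    range are illustrative parameters, not values from the talk.
    """
    dx, dy = target[0] - cam[0], target[1] - cam[1]
    dist = math.hypot(dx, dy)
    if dist == 0 or dist > max_range:
        return dist == 0  # a camera trivially covers its own spot
    bearing = math.degrees(math.atan2(dy, dx))
    # Smallest signed difference between bearing and camera heading.
    diff = abs((bearing - cam[2] + 180) % 360 - 180)
    return diff <= fov_deg / 2

def greedy_placement(candidates, targets, k):
    """Pick up to k camera poses, each maximizing newly covered targets."""
    chosen, uncovered = [], set(range(len(targets)))
    for _ in range(k):
        best, best_gain = None, set()
        for cam in candidates:
            gain = {i for i in uncovered if covered(cam, targets[i])}
            if len(gain) > len(best_gain):
                best, best_gain = cam, gain
        if best is None:
            break  # no candidate covers anything new
        chosen.append(best)
        uncovered -= best_gain
    return chosen, uncovered
```

Greedy selection like this is popular here because the coverage objective is submodular, so the greedy answer comes with a constant-factor approximation guarantee.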
And it's nice to be prepared to bump into these blocks that will probably, or very likely, hold you back, but that are always exciting to handle. And that's what makes whatever you build or create really unique. So why UAVs in general? Well, I'm sure you can think of a bunch of applications. This has become one of the hottest areas in robotics today. Has everybody here seen this video? It relates to Amazon now talking about delivering packages to your home using drones. Did you see what the UAE has been trying to do recently? They want government entities to deliver renewed IDs or licenses all the way to your home. I can put one on this drone, give it a GPS coordinate, and it will fly to your house, drop it, and you pick it up. I'm sorry? Well, it's easy—they can ring your phone: I'm outside. You really don't want that old technique of letting you know that someone has arrived. I don't think we even knock on doors today anymore. I think when we visit somebody, we call them, like, I'm at your door, right? This will tap you on the head. Right. Agriculture: you wanna spray a field with pesticides, you wanna monitor what's going on with your crops—there you go. You've seen the fires in California. You wanna know what's going on, you wanna help put a fire out or at least confine it. That's another potential application. Disaster rescue, one of the biggest ones. You saw in one of the videos that robot that would walk in and do disaster rescue. Where do I send that robot? Maybe you send a couple of these first, they take images or video and send them back, and you know where to focus your search and rescue efforts. Traffic control, monitoring, management—that kind of stuff again. Again, another video that came out recently. This, I forget—I think this was in the Netherlands, if I'm not mistaken.
And they built this prototype so that if someone has a heart attack or a severe problem, much faster than an ambulance that might be stuck in traffic, they can send this small defibrillator there—it's enough to give you one or two shocks, I think—and maybe save a life. Everybody know this? Ras Laffan. How do you monitor what's going on in all these pipes? I know for a fact that Qatar is really interested in drones for such applications—whether it's for safety or security or all the previous things—because this is one of the major areas where problems can happen. So anything that has to do with cyber-physical systems or drone applications in Qatar would involve the major trends that we have been talking about and that you hear about in Qatar: smart cities, surveillance, crowd monitoring, et cetera. So what's the problem that we have been looking at? Let's dig down a little bit deeper. An example of an opportunity. Turns out, let's say you had an idea today and you want to do something that is drone-based, and you have a very smart idea on, I don't know—let's pick as our application using drones to monitor people, or just to monitor traffic. I wanna tell where the cars are, where the best place is so that I can see the largest number of cars. That's the problem I would like to solve. You're a smart guy, you come up with a great solution. How do you try it? This is a problem. We don't have real platforms or something to try with or play with to make sure that whatever smart solution you came up with actually works. The intermediate step is to do what we call simulations. So you write some software, and you try your best to model what the real world will look like. Oh, cars are gonna be running this way.
Cars will enter at this rate, and I'll have coordinates for my drones, I'll do some smart math and computer science stuff, and then this is where my drones are and that's what they will look at so I can see the largest number of cars. You measure it, you plot a couple of graphs, you publish a paper—that's half of what we do, I'll admit. But if you wanna take it a step further and try it out on something, you need to test it. Hopefully you're not gonna jump all the way to: all right, I'll get the drones, I'll program them, and I'll fly them out in front of real traffic for the first time. You want something in the middle. And whether it's in the research world or even the industry world, you want a way to prototype, some platform on which you can test what you're doing. So, big problem: testbeds are needed. If you've been interested in these kinds of technologies, I'm sure you've seen some of the famous videos of amazing drones from some of the famous places, like the University of Pennsylvania or ETH in Zurich, and there are a couple of decent ones. Some people were showing drones that can play musical notes—really small ones. Others were showing drones that can collaborate together and build stuff. Others have shown drones that can self-coordinate in swarms, create formations, and do different things. Others have shown drones that can coordinate, catch balls, throw them back to you, do a whole bunch of crazy things. There are very interesting videos out there that you can check out. Okay, so I just told you we need testbeds. Aren't these testbeds? Yes, they are. What do you think is the problem with these testbeds? An extremely controlled environment—true, it would be hard to break.
I will still need to maintain some control, but there are bigger problems that would stop you or me—or anybody else who wants to do research on this—from doing some useful work, and a bunch of them are the following. First of all, they're extremely tailored and specialized. They're built to prove a specific concept or a specific idea that's extremely complex, and so the whole thing is built and tailored for that. Second, they're really expensive, okay? Just to give you an example: to do these, you need localization devices. Everybody know what localization is? Localization is the ability to tell where you exist in some space. GPS—that's localization. It tells you where you are in some form of space, with a specific error range. For any of those drones to do anything useful, they need to know where they are, and potentially where other entities in the environment are. Just to do localization in those kinds of examples, with that kind of precision, the device or the system that they use costs $40,000. And that's something you can just buy—I haven't even included the price that goes into building and customizing these drones, the effort of manufacturing them, which again is extremely tailored and specific. And in the end, the applications for which you can use these drones are very inflexible. You can only use them for a few things. So we thought we had something interesting that we could work on, and we built this thing called Up and Away. By the way, we ended up calling it that for a very interesting reason. As we got the drones and started programming them, we made mistakes, of course, but mistakes with drones can either be very funny or extremely dangerous. If you come, I'll show you some pictures at the end, at our CMU building. We have a beautiful building with a big indoor space, because we're not allowed to fly these outside yet. Qatari law does not allow it.
There are people who are negotiating with them to make that happen, because if you really want to get into these things, we need to change those rules. And in one of our incidents, a drone just kept going up, up, up, up. We were all trained so that when they drift and make those mistakes, we know how to run after them, catch them, and flip them to stop them. But that thing went all the way up, and we came up with this name, Up and Away. It ended up landing after a while, when the battery ran out, and we were just stopping people from walking under it. A funny day. What was our goal? Can we build a testbed that enables low-cost, easy-to-deploy cyber-physical applications? "Cyber-physical applications" is just a fancy term for applications that mix the cyber and the physical—software and hardware. And again, go back to the example that I gave: you came up with an idea and you want to try it out. I would love to offer you this testbed so that you can come and test things on it, or even give you the know-how to create it yourself, because it's hopefully not that expensive and good enough for whatever it is you want to do. So we went ahead and tried to build the testbed. And I'm not going to spend a lot of time on the details of these slides, because I know they might bore the audience over here. So I'll just give you the gist of it. If you have technical questions, I'm happy to take them either now or later—your call. All you want to see here is that we ended up building an arm that we put at the top of our building, and we used a PTZ (pan-tilt-zoom) camera on top to provide simple localization. So instead of paying $40,000, we paid 500 bucks and started doing vision-based localization for those entities. We have what we call UAV node modules that reside on the drones and do a bunch of things like localization, processing, and control-related stuff.
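The vision-based localization step described above can be sketched in a few lines: find the pixels belonging to a drone's colored tag in the top camera's image, take their centroid, and map that pixel through a calibrated homography to floor coordinates. This is an illustrative sketch, not the actual system—the color test, the image format, and the example homography are assumptions (in practice you would calibrate H from known floor points, for example with OpenCV's findHomography).

```python
def tag_centroid(image, is_tag):
    """Centroid (px, py) of pixels where is_tag(r, g, b) is True, or None.

    `image` is assumed to be a list of rows of (r, g, b) tuples; a real
    system would use an OpenCV image and color-space thresholding.
    """
    xs, ys, n = 0.0, 0.0, 0
    for py, row in enumerate(image):
        for px, (r, g, b) in enumerate(row):
            if is_tag(r, g, b):
                xs, ys, n = xs + px, ys + py, n + 1
    return (xs / n, ys / n) if n else None

def apply_homography(H, px, py):
    """Map image pixel (px, py) to ground-plane (X, Y) via a 3x3 homography H."""
    x = H[0][0] * px + H[0][1] * py + H[0][2]
    y = H[1][0] * px + H[1][1] * py + H[1][2]
    w = H[2][0] * px + H[2][1] * py + H[2][2]
    return x / w, y / w  # perspective division
```

With a calibrated H, localizing a drone reduces to `apply_homography(H, *tag_centroid(frame, is_red))`, which is the sense in which a $500 camera can replace a $40,000 localization rig, at lower precision.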
And the central node, a server that we have, would essentially be the heart of the system. That's the node that knows what the whole environment looks like. So I know where the drones are from my localization system. I know where the targets are—we were using it for one example application, which is again target monitoring and coverage—and then we can build a system around that, an automated system; that's the key. What do we have out there? I'm sure a lot of you have seen the different drones that are being sold today, right? Yes? Hopefully you're interested enough to look at these things. Most of them come in different sizes—they're this big now, or this big. A lot of them, if not all of them, come with either an iPhone app or a special controller that you use. And all you get to do is videotape whatever the drone sees; you get to control it and navigate it, and you have a few accidents at the beginning until you learn how to do it. What we wanted to do is give ourselves the freedom of programming these drones. It turns out only one or two of them—the Parrot AR.Drone—have an API, and we had to hack the system to a large extent. Without getting into details, we managed to do it. We got to a point where we can control these drones through programs. And that was the first good step. These drones, compared to the others—it's a $300 toy. It turns out that it's not just cheap; cheapness comes at a cost, where the quality of the sensors and all the systems that exist on these drones is not designed for use in this way. They're designed for a human to control them. When you actually want an automated system to navigate them, you run into many more barriers that I can talk about later offline. One example of the bumps—I'm gonna share with you three or four examples. How do I tell—so, okay, I told you we're using a top camera to localize the drones, right? How can I tell drone one from drone two?
When you're localizing with vision, you take pictures, and then you process those pictures somehow. How can you tell, after whatever filtering or analysis you do, that this blob is a drone, and that this is drone one and that is drone two? I want to identify that the thing in the picture is actually a drone, and that this is one drone and that is another. I want to be able to look at a picture, estimate an X, Y coordinate in a certain plane, and say that's where the drone is right now. To do that, I need to see it and distinguish it from the sea of pixels. So we came up with what we thought at first was a simple solution: you get some colored tags and put them on the drones. I want to tag this as the red drone, that as the green drone. This is one seemingly simple thing to do, but it actually took us a while to decide which colors to use, and we had to run completely separate experiments in our environment, because with different lighting conditions, time of day, shade, you name it, a bunch of these colors look alike. Can you tell the difference between these two? Very hard. That's just one example of a bump we had to solve.

Another example has to do with control challenges. Your drone is facing this way, and you want it to look over there. Give it a command: 90 degrees left. Should be easy, right? You give it that command, and it turns out the drone turns, and then inertia pushes it all the way past. What do you do? Well, let it go 15 degrees right. And again, and again. Then you start appreciating why mechanical engineers work on control theory, where you have a feedback loop you have to take into account and estimate, so that you eventually reach stability. The same goes for movement.
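That loop of command, observe the overshoot, correct again is exactly what a closed-loop controller automates. Here is a toy sketch of the idea for the 90-degree turn; the drone dynamics, the drag factor, and the gains are invented for illustration and are not the actual controller from our system:

```python
# Toy closed-loop heading controller for the "turn 90 degrees" problem.
# The physics model (inertia, drag) and the gains kp/kd are made up for
# illustration; a real drone needs a tuned PID running on real telemetry.

def simulate(target_deg=90.0, kp=0.35, kd=0.9, steps=200):
    heading = 0.0   # current yaw estimate (degrees)
    velocity = 0.0  # angular velocity (degrees per tick)
    for _ in range(steps):
        error = target_deg - heading
        # PD control: push toward the target, and damp the rotation so
        # inertia does not carry the drone past it.
        command = kp * error - kd * velocity
        velocity += command   # the command changes angular velocity
        velocity *= 0.95      # drag slows the rotation a little
        heading += velocity   # inertia: the drone keeps rotating
    return heading
```

With the damping term `kd` set to zero, this toy model reproduces exactly the overshoot described above: the drone sails past 90 degrees and oscillates back. The velocity term is what kills the inertia before that happens.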
I want to go from this point to that point, maintaining orientation. Two meters left. Great. You would think that mechanically, with the aerodynamics, when you say two meters you get two-meter precision. This is reality now; this is what you would never see in a simulator, unless it's an extremely good one, and even then you would still miss a bunch of things. So you go two meters, and suddenly the rotors push a bit harder, and by the time it stops, it's actually half a meter or a meter past. What do you do? We had to come up with a whole set of algorithms to solve these problems.

Then there's communication with the drones. How do you talk to them? Each one of these came with a Wi-Fi interface, and because they're built as toys, they work by becoming an access point: you connect to them with your iPhone and then you control them. We had to reverse that whole thing. We want the drone to be a client, and we need to be able to issue commands; we couldn't hack the kernel to that extent, so we ended up bringing in our own Wi-Fi drivers and talking to the drones directly.

So, lots of problems we went through to build the testbed, but we finally managed to do it. Let me just show an example of this target coverage and surveillance. You can skip the details there, but here's all we were trying to do: we started off with a very small testbed in our lab, got a bunch of toys that we treated as targets, and started off working with one drone. You can see the snapshots. In those images you can see two things, actually three things, so I'll walk you through them and then show you the video, if you bear with me. The full view is us taking a video of what's going on. The bottom-left image in each snapshot is what the top camera used for localization sees, so you get a real-time top view. For instance here, you see some targets, all of these are targets, and there is your drone.
And then at some point you activate this top-left corner view, and that's what the drone sees. We're trying to do this coverage solution; this is an example of what we had in this setup. Let me tell you what's going on here. This does not look as exciting as you might imagine, but let me tell you why it is exciting: we are not doing a thing while running this. We press the button, sit back, and pray to God that it works. These are the targets. The system works by detecting everything going on over here; it's supposed to detect that these are the people we want to cover, or the cars we would like to look at. It has to compute the best position for the drone to go to, with orientation and path, and how it should be placed so that it can see all these targets. And it does all of that in real time. So the drone is deployed; it's supposed to go, and from what the drone sees, nothing yet. You can see how it's moving, going to its proper position, then descending to a different altitude, and you start seeing that the drone captures all the targets. We get our remote controls and start moving the cars; that's the only thing we're doing, emulating targets really moving. The system adapts: the drone goes, rotates, and now can see all of them in the same image. Looks simple? Six months of work. Not that easy. But the platform works, and it's flexible enough that you can change the application it runs.

Time to take things a bit further. This is the inside of our building at Carnegie Mellon University, three floors high. We had to build this huge arm with a PTZ camera at the top. You get a big field-of-view area on which we can run experiments, and we ended up scaling this up a bit more, so now you can look at a multi-drone scenario. The whole idea here is that you start off with a set of targets.
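Under the hood, "do I need a second drone?" is a geometric check against the camera's field of view. Here is a toy version of that decision; the straight-down camera model, the 60-degree FOV, and the 3-meter altitude ceiling are my own simplifying assumptions for illustration, not the system's real algorithm:

```python
# Toy version of the "can one drone cover all targets?" decision.
# Model: the drone looks straight down with a square field of view of
# fov_deg degrees; at altitude h it sees a square of side 2*h*tan(fov/2).
# Geometry, FOV, and the altitude ceiling are invented for illustration.
import math

def altitude_to_cover(targets, fov_deg=60.0):
    """Altitude needed to see every (x, y) target from above their center."""
    xs = [x for x, _ in targets]
    ys = [y for _, y in targets]
    cx, cy = (min(xs) + max(xs)) / 2, (min(ys) + max(ys)) / 2
    # The farthest target from the center, along either axis, sets the
    # half-width of the footprint the camera must see.
    half_width = max(max(abs(x - cx) for x in xs),
                     max(abs(y - cy) for y in ys))
    return half_width / math.tan(math.radians(fov_deg) / 2)

def drones_needed(targets, max_alt=3.0, fov_deg=60.0):
    """1 if a single drone can cover the targets under the ceiling, else 2."""
    return 1 if altitude_to_cover(targets, fov_deg) <= max_alt else 2
```

Two nearby targets stay within one drone's reach, while widely spread targets push the required altitude above the ceiling, and the answer becomes two, which is exactly the moment in the demo where the second drone gets engaged.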
Your system decides that you really need only one drone to cover that set of targets, and you send it in. This again is the top camera view, and this is what the drone sees; it goes to the point where it starts seeing both targets in its view. Our mighty engineer goes and adds targets to the scene. The system does its computations and realizes: wait a minute, I cannot look at all four targets with a single drone; I need extra resources. Therefore we engage another drone that happens to be at our disposal in our fleet and ask it to start moving. I have no idea who this person is, by the way. So drone two slowly goes over there until it gets those targets in its view, and ta-da, we're happy. Our algorithms work, our systems work. Is it perfect? Absolutely not. Is it as good as those other testbeds I showed you pictures of? No, but it's good enough. It does a whole bunch of things close enough, and now we can test a whole bunch of different applications. There are lots of things we still need to work on, and we have been working on them since those videos were taken. Now we're starting to bring in Internet of Things hardware, things like Raspberry Pis or Intel Edisons, which we want to mount on the drones so that you have distributed processing platforms and things happen in a more distributed manner; all of this so far is centralized.

So that was a quick journey through one example. Back to general robotics, and a couple of closing notes. The trend in the usage of robots is really picking up, right? These are some stats on robot density, the ratio of robots to people in different countries in the world. You thought the Japanese were unusual about this? Well, yeah, and you can see the other top countries. Most of the robots today are used in what? Automotive, and electrical and electronics.
How do you think Korea is suddenly building all those cars and every single device? With all due respect, it's not the people alone; it's that they built or imported all these robots and are using them at that scale. And then a bunch of other things; there is still a long way to go, and that's only in these industries.

This is the last note, and the slide I'll close with: jobs. This hits exactly on the things you will be thinking about moving forward. Do I need people or do I need robots? You don't even have to restrict it to robots; I'm here to tell you about the robotics domain, but you can apply this to any area you are planning to work in. In the automotive industry alone, 13 million jobs were lost. 22 million jobs were lost in manufacturing. Food services, and this is just the beginning; all of this is becoming more and more automated right now. Sorry? Is this per year? No, this is the estimated amount lost up to this point in time, only to robots. So are things that grim? And note, this is cumulative, up to now, not a specific year. It will get worse: this is today, and by 2025 nearly half of the jobs in the United States will be taken over.

On the other hand, doesn't it create jobs? That's the excellent point I want to close with, on a more positive note. This is not something new in history. It turns out that as we moved from the agricultural era to the industrial era, and now at the dawn of the digital era, societies have eventually adapted and evolved by creating new jobs and industries. The note, though, is that we need to be ready for those jobs. We shouldn't cling to the jobs that will disappear; we need to prepare people to be adept in the new roles, not cling to the old models of running things.
Vint Cerf, one of the very famous scientists and one of the fathers of the internet, has a very nice quote. He's basically saying that, historically, technology has created more jobs than it destroys, and there is no reason to think otherwise in this case. A research study also argued that, hopefully, all these technologies are not destroying our lives but freeing them up for more meaningful, useful, and exciting things. In the end, a lot of people will suffer; there is no escaping it. But I think the smart ones, the agile ones, places like this country, small and agile enough to adapt, can move forward if we remember and take these things into account. And on this positive note, thank you very much. I'm happy to take any questions.