Hi everybody, we're here to present some nice stuff with a quadrotor and a hacked digital camera. We are from France, so if our English is a bit poor, excuse us. So I'm Antoine, I'm a PhD student. I work on aerial imagery and crisis management, and I whip all the grad students that are under my... And here is Corentin, he's a grad student. He's looking for an internship soon, so if anybody... He's a very good electronics geek and he's the pilot of this thing. We're here in the name of a whole team of grad school students and PhD students. They are back in France now, and we think of them. We are here in Vegas and we are glad that everybody at DEF CON made it possible for us to come. We are part of an engineering school in computer science and electronics, and one of the laboratories of this school is the image processing department. So why is an imagery lab building a UAV? First, because when you do image processing it is very important to control the images: if you can take your own images, you have much better control over what you can do with them. And we had the opportunity in France to be funded by the French Air Force and the French Army. So we will present the results of a three-year project we have done back in France. Corentin will present the history of the project, then we will present the hack of the digital camera, because we wanted something to fly, but we also wanted to do something useful with flying. So from the beginning, this project was thought of as aerial imagery on a quadrotor. Then we will finish with applications and perspectives, and we will do a flight demo. Okay, so I'm going to present the history of the project: the different steps we went through to arrive at the machine we have here. So first, what's a UAV? It stands for unmanned aerial vehicle. It's a flying machine, a flying robot, that has no human on board and that is expected to fly semi-automatically. What we chose here is a quadrotor. That's because we are not an aeronautics school.
We have no skills in aerodynamics, and in this configuration all the mechanics are very simple; it's more about electronics and software. You just start with a rigid frame. You build it with whatever you want: aluminum, carbon, anything you can buy somewhere. And you add four motors with propellers. What's special is that you have two motors rotating in one direction and two others rotating in the other direction. That's very important, because a spinning rotor creates thrust, of course, pushing air downwards, but it also creates torque, and this torque tends to rotate the quadrotor around the yaw axis. If you have two rotors spinning in one direction and two others in the other direction, the torques compensate and it can fly. We use variations in the speed of the different motors to recreate the same controls you have on a helicopter: roll, pitch and yaw, and also throttle. So here is a video of our first flight. We had just mounted the motors, with nearly no electronics, in fact directly controlling the thing with our standard RC transmitter. So you will see. As you can see, it's not very stable: it's very difficult to fly, you need a lot of experience, or maybe you just can't fly it at all. That's why we started to put real electronics inside. We wanted to create a flight assistant to do the hard work and react much faster than the pilot. For this, we added an orientation sensor, called an IMU, to get information about the three angles. With this information, we can run a control loop that reacts much faster than a human pilot. So here you can see a small control loop diagram. You have information coming from the PCM receiver, which is standard on RC planes and other RC flying machines; it gives a set point to the control loop. Then we get information from our IMU: angular rate and angle. We take the difference.
And from this error value, we apply actions directly to the motors. Another important point is to have the whole loop running very fast, so we chose high-speed I2C brushless controllers. They are very cool because they all sit on the same bus, so adding four of them is very simple: just give each one its own address, and you can update them at 300 Hz. That's the rate it takes to stabilize correctly. Just to present the technology we used for our main microcontroller: it's dsPIC technology from Microchip. We are not part of Microchip, but I think it's worth saying a little about them, because we were able to get free samples, and that's why we chose them at the beginning of the project: we could try different microcontrollers and different things, which was really cool. It also has lots of peripherals, and lots of peripherals with interrupts, which lets you do many things in parallel with good real-time response. That's very important for the PID loop to work correctly. It's also great because we wanted to do everything on one chip, which reduces weight and size. So here is a video, now with the stabilization algorithm working. You can see I won't be touching the RC transmitter much. Here it is flying, and even if I push on it, it comes back to its original orientation. But note that only orientation is controlled. Now I will briefly present the specifications. For the size, in our challenge we had to stay under 70 cm. It weighs about 1.6 kg. We are using LiPo batteries, and what's great is that with our combination of batteries, controllers and motors we can get about 1 kW of maximum power. You will see in the flight that it's very dynamic. And with no payload, we can fly up to 20 minutes. Now a video showing some payload tests. Here we are taking off with bottles hanging under it in a pendular way, attached with rope. Here with only one bottle, half a liter of water, and now with two.
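To make the control loop described earlier more concrete, here is a minimal sketch in C of a per-axis PD step (angle error plus angular-rate damping) and a standard quadrotor motor mix. The gains, sign conventions and "+" motor layout are invented for the example; the team's actual firmware is not shown in the talk.

```c
/* Illustrative only: gains and motor ordering are assumptions. */

typedef struct { float kp, kd; } pd_gains_t;

/* One control step for a single axis: the set point comes from the
   PCM receiver, the angle and angular rate from the IMU. */
static float pd_step(pd_gains_t g, float setpoint, float angle, float rate)
{
    return g.kp * (setpoint - angle) - g.kd * rate;
}

/* Mix throttle and roll/pitch/yaw commands onto the four motors.
   Motors 0/2 spin one way and 1/3 the other, so the yaw command
   works by unbalancing the torques described above. */
static void mix(float throttle, float roll, float pitch, float yaw,
                float out[4])
{
    out[0] = throttle + pitch + yaw;  /* front */
    out[1] = throttle + roll  - yaw;  /* left  */
    out[2] = throttle - pitch + yaw;  /* rear  */
    out[3] = throttle - roll  - yaw;  /* right */
}
```

At each 300 Hz tick, the three `pd_step` outputs would be fed to `mix()` and the four results written to the I2C brushless controllers.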
Thank you. Now, you will see that even though it is very stabilized, it's not like the mechanical stabilization you may have tried on small RC helicopters that are easy to fly because they are mechanically stabilized: with those, you can't have dynamic flight or go to high speed. With electronic control it's much better; you can do whatever you want. This video was taken during the final step of the challenge we entered in France. At the end of the talk, during the Q&A, you will be able to see it in flight in the parking lot just outside the pool; we will bring you there. During this contest, we had to be able to get video or photos of targets behind windows. Okay, so it's nice and all, but there are limits. The limitation is that we only control orientation, and of course there is drift on all the other axes in 3D. So you need a minimum of experience to fly it, controlling throttle and all the other axes together. At that point we were thinking it would be great if anyone could fly it, if it were easier to fly. So we decided to work on complete XYZ control. For this, we added different sensors: a GPS receiver, an ultrasonic rangefinder and a pressure sensor. Why so many sensors? Because none of them is perfect, and we wanted to be able to take off automatically and then fly at any height we want. We need a precise sensor close to the ground, then a sensor that works higher in the air, and so on. So what we did with these sensors, and what we are still working on, is mixing all the data together, because each sensor has a situation in which it works best. Here you will see the first step we did: Z control only. Now, if there is no wind, it will just stay at the same altitude and stay nearly motionless. Here I'm not touching the controller, and even if there is some wind, it's not going up or down.
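The sensor-mixing idea just described could be sketched roughly like this: trust the ultrasonic rangefinder close to the ground, where it is precise, and hand over to the pressure sensor (noisier, but unbounded in range) higher up. The thresholds and the linear cross-fade are invented assumptions for the example; the GPS term and the real filtering are not shown.

```c
/* Illustrative sketch: thresholds and blend scheme are assumptions. */

#define ULTRA_MAX_M  4.0f   /* assumed usable range of the sonar */
#define BLEND_BAND_M 1.0f   /* assumed width of the hand-over band */

static float fuse_altitude(float ultrasonic_m, float baro_m)
{
    if (ultrasonic_m < ULTRA_MAX_M - BLEND_BAND_M)
        return ultrasonic_m;              /* near ground: sonar only */
    if (ultrasonic_m > ULTRA_MAX_M)
        return baro_m;                    /* sonar out of range: baro */
    /* inside the band: fade linearly from sonar to baro */
    float w = (ULTRA_MAX_M - ultrasonic_m) / BLEND_BAND_M;
    return w * ultrasonic_m + (1.0f - w) * baro_m;
}
```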
It's just moving around a little, but up and down is automatically controlled. And all this time I'm not touching the controller, because it's on the ground, as you can see. So this is the best step we have reached for the moment, but we will soon add all the other things, like GPS, and the algorithms on top of it. Now I will talk about something everybody knows: everything that goes up will sometimes reach the ground not so smoothly. That's a problem everybody has; even people at Airbus, who have been doing things very seriously for a long time, still have problems sometimes. So it's something we have to live with, but we have to take measures to improve it. Over the last three years we had three crashes, and here is a video of one of the first ones. Something went wrong. This was a great lesson, because before that we thought we would never crash, that everything would go well and smoothly, and at that point we understood we needed to be very careful with everything we were doing. It was not the only one. For the second one, we were able to track the GPS trajectory. We have no video of it, but it was nearly the same thing: going high, taking pictures, doing what we had to do, and then our transmitter went out of range. Most of the crashes were human error, very often because when you are doing several things you think about taking pictures and not about flying the UAV. And here is another video of what it looks like when it's down after a crash. That was only three weeks ago, so this one is completely new: we had three weeks to rebuild it for DEF CON, and we did it. That one was caused by mechanical parts that broke on board, so it was not a human error. What we decided after that is that we needed a ground station, and to implement things to prevent crashes from happening again. Here is our ground station. We have data communication using a modem, and we also get a video downlink.
Before the flight we can run manual and automatic check-ups: a checklist of everything, to ensure that even the mechanical parts are correctly fastened. What's great is that the automatic checklists verify that every sensor gives good readings; lots of things are tested automatically. And during the flight we get data back about position, status, battery remaining and everything else. So now Antoine is going to tell you how we hacked the digital camera. Yes, because as I said previously, from the start we designed the thing to be able to lift things up: one kilogram of anything we wanted. What we wanted was to lift several cameras and have simultaneous acquisition with each of them. We wanted it low cost, and we had to keep size and weight in mind, because you can't put just anything on it. Our first try was with some very basic material: a webcam and a mini PC running Linux. It takes some good photos, as you can see. It was a high-resolution webcam, it was low cost, and it was very light: when we stripped everything off the webcam, we were down to 15 grams. It was very easy to develop, as it's USB and Linux, and it did not require, at that time, any electronics knowledge, meaning that at that time I was able to do it. But the problems were that you have no real-time control of anything; USB is fine for one or two webcams, but beyond that the data rate is too low; and webcams usually have CMOS sensors, which give a lot of image deformation with vibration and quick movements. So that was not the solution. We understood that avoiding electronics was not workable and that we needed to use embedded systems. So we decided to use an off-the-shelf camera. The advantages are that you have good optics and internal storage, and you don't need a PC to back up the data.
The problem is that they are made to be manipulated by humans, not by electronic boards, so you have to tweak them a little, and you need to be very careful about weight and size, because they are often oversized and overweight for their capability. Our objective was to reduce the weight of the camera to the minimum, and since there is no built-in way to control it (no serial access or anything on those cameras), we needed to do this ourselves in order to get full control of it. Also, and this was very important, we needed a precise timestamp for the pictures; I will talk about that later. Out of the box, the camera has plastic protection all around it, so we decided to remove it. On some of them we also removed the LCD screen and the flash, but here it still has the same functionality; we only removed the covering layers. The first step was to control the trigger, and that was not that difficult. We started exploring the different buttons, and we found that the internal processor was working at 3.3 volts. Then we tried to determine what the circuit inside looked like, and it's roughly this: you have the button connected to the trigger line, which is pulled down with a resistor, and when you press it, it brings the trigger line to 3.3 volts. Here is a screenshot of what you get on the oscilloscope when you switch it on or when you take a picture, because it's the same schematic. With this, we were able to control it directly with our microcontroller: just drive the line high for about 300 milliseconds and it takes a picture. Now about picture timing. We wanted precise picture timing because, when we want to geo-reference our pictures, it's important to know the exact orientation at the moment a picture was taken, so that we can match it onto Google Earth or anything else.
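The trigger control described above (the pulled-down shutter line driven to 3.3 V for about 300 ms) might look like this on the microcontroller side. The `gpio_set`/`gpio_clear`/`delay_ms` names are placeholders; on the real board they would hit the port latch and a hardware timer, and here they are simulation stubs so the sequence can be checked on a PC.

```c
/* Illustrative sketch; helper names are invented placeholders. */

#define TRIGGER_PULSE_MS 300

static int line_high = 0;  /* simulated level of the shutter line */
static int held_ms   = 0;  /* how long the line stayed high */

static void gpio_set(void)   { line_high = 1; }
static void gpio_clear(void) { line_high = 0; }
static void delay_ms(int ms) { if (line_high) held_ms += ms; }

static void camera_trigger(void)
{
    gpio_set();                   /* "press": drive the line to 3.3 V */
    delay_ms(TRIGGER_PULSE_MS);   /* hold long enough to register */
    gpio_clear();                 /* "release": the pull-down takes over */
}
```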
On board a UAV, the camera is sometimes rotating very fast, and the information you get with the picture, the EXIF metadata in the JPEG, gives you lots of things, but the time is only down to one second, and that is not enough. So we tried to find a signal, a side effect, anything on the camera that would give us a more precise time than that. We opened it up a little more, and then we found a set of test pads on the camera. Here you can see them: very, very small test pads. The whole game now was to see whether one of them had a side effect that would give us the timing of the camera. We figured that when a picture is taken, lots of things happen, so there should be something we could detect and use to timestamp the picture very precisely. And on the bottom-right pad, just here, we found this kind of signal. When the camera is up and running, there is a pulse every 20 milliseconds; it just keeps pulsing. And when we take a picture, here is what happens: the pulses stop for a long time. So the only thing we have to detect is a pulse timeout of more than 20 milliseconds; if there is one, then a picture was taken within the last 20 milliseconds. We were able to verify that with this system we get a picture timestamp accurate to less than 20 milliseconds. To do this automatically, not by eye, we implemented it on our microcontroller: we used an edge detector that raises an interrupt on every rising edge, and at that moment we capture the value of a free-running timer. If we get a timeout on this value, we know a picture was taken. And one of our objectives was to control several cameras with one board.
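The timeout detection just described could be sketched as follows. The function and variable names, the millisecond timer, and the jitter margin are assumptions for the example, not the team's actual firmware; each rising edge on the test pad is assumed to call the handler with the current timer value.

```c
#include <stdint.h>

/* Illustrative sketch: while the camera idles, edges arrive every
   ~20 ms; when a picture is taken the pulses stop, so a gap longer
   than the nominal period marks the shot to within ~20 ms. */

#define PULSE_PERIOD_MS 20
#define GAP_MARGIN_MS    5   /* assumed tolerance for jitter */

static uint32_t last_edge_ms = 0;  /* 0 = no edge seen yet */
static uint32_t picture_ms   = 0;  /* timer value of the last shot */
static int      picture_seen = 0;

static void on_rising_edge(uint32_t now_ms)
{
    if (last_edge_ms != 0 &&
        now_ms - last_edge_ms > PULSE_PERIOD_MS + GAP_MARGIN_MS) {
        /* The pulses stopped: the picture happened within the first
           PULSE_PERIOD_MS of the gap, so timestamp it at the last edge. */
        picture_ms   = last_edge_ms;
        picture_seen = 1;
    }
    last_edge_ms = now_ms;
}
```

On the dsPIC, `on_rising_edge` would run inside the input-capture interrupt, so the timestamp costs almost nothing in the main loop.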
So we created a small board, and with it we can control up to three digital cameras at the same time. Here you can see them in action. Okay, so that was great: we were able to do what we wanted. Now Antoine is going to present all the things we can do with this working. Thank you. So we do this project with a lot of pleasure, for fun, but we think there is a lot to do with it for profit. Since the Google Earth revolution, you could almost say that very high resolution imagery is a fundamental human right, alongside mobile phones and internet access. There are obviously military applications in intelligence and surveillance, but the technology is now sufficiently low cost to think of civil applications, and there are loads and loads of them. What we are into is more humanitarian crisis management, which I will talk about later, and very high resolution mapping for environmental applications. When you are in a crisis situation, you need a quick assessment of the situation, because everything has changed. You need maps and images of the area. You need vertical images; you need the sides of the buildings to know whether they will collapse; you need relevant tools to navigate the image database. So you need lots of data. We participated in a French contest called Humanitech, organized by the Red Helmets Foundation, which tries to be the Blue Helmets' counterpart for humanitarians. They were very interested in quick-response mapping and population movement surveys to help decision making. And we are currently looking for a non-governmental organization to assess our products on humanitarian missions. We also had a partnership with the Natural History Museum of Paris, which was very interested in following the evolution of biodiversity on their field test sites. We went out with them a few weeks ago to do some acquisitions, and here are the results.
You can see the complete three-camera hack under the quadrotor, and the three images taken simultaneously. We use a homemade open source image mosaicer, named Jim, to put all the images in the same plane and merge all the data. In the end we have an image that covers 60 meters by 80 meters. The pictures were taken at about 75 meters above ground, and with the 12-megapixel camera you get about one and a half centimeters of resolution on the ground. So you can cover quite a scene and still have sufficient resolution to study biodiversity or whatever you want. Thank you very much. To end this presentation, we will open up some perspectives. As Corentin said earlier, we want full control on all the axes. We want to go higher and further. We want to know exactly where it is and in which direction it is heading. We want to see what it sees in real time, and we have quite promising results on that. We will briefly present another project that is currently happening in our school, with undergraduate students. They have developed a head-up display: they take the video we broadcast from the UAV, they take all the data we send to the ground station, and they overlay all this information on the screen in real time, so we can have a first-person view to pilot the UAV. You can see the artificial horizon following the real horizon in real time. This is on the grounds of our school in Paris. I think they have done a really great job for undergraduate students. We are now working to get good video goggles so we can do a real immersion flight. This is only the first test; we hope we'll have better results soon, because we still need good video retransmission: as we go further, we need the video data link to be very reliable, especially if the video is the only source of information we have. So we are still working on it.
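As a side note, the ground-resolution figure quoted above can be sanity-checked with a quick calculation, assuming the 60 m dimension of the mosaic maps onto the long axis of a typical 12-megapixel sensor (assumed 4000x3000 pixels; the real sensor geometry was not given in the talk).

```c
/* Hypothetical sanity check: ground sampling distance is simply the
   covered width divided by the pixel count along that axis. */

static double ground_resolution_m(double swath_m, int pixels_across)
{
    return swath_m / pixels_across;
}
```

With 60 m over 4000 pixels this gives 0.015 m, consistent with the one-and-a-half-centimeter figure mentioned above under those assumptions.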
Now, when we were first asked to come to DEF CON, we thought it would be great to do a flight demonstration, because something that flies is only worth seeing in real flight. And it was not that easy: we started making these slides thinking it would be too difficult to get it through customs. Both France and the US consider this quite sensitive, but finally we were able to do it and to bring it to the Riviera. So we'll do the Q&A not in the room next door, but just after the flight, in the parking lot just outside the hotel. We are here at the conference center; you will just have to follow us: go to the main exit, continue on the road, turn right, and it's just past the pool. There is a parking lot with a small entrance on the side, and there we'll be able to do the flight. It's not a real venue, but we can't do the flight demo on the grounds of the hotel. It's a bit adventurous, but I think it's worth it if you come. We would like to thank all the DEF CON staff for allowing us to come here, especially Nikita, who helped us with the paperwork and everything. And we would like to thank you all for your attention, and thank you for coming.