Good morning everyone. I'm Julien from Parrot Drones, and I'm going to talk about what we've been doing with open source for approximately a year and a half now, because there are so many things going on around open source and drones at Parrot. I'll basically cover what I did on the Parrot Bebop 2, which was about a year ago; the Parrot Disco, our new flying wing that we just released, and what we've done with it and open source software; adding video features to it, which is ongoing work; how to build the code and what's left to do; and also a thing called the Parrot Slam Dunk, a small module with a stereo vision camera and a Jetson TK1.

First of all, the Parrot Bebop 2. I have one, but I broke it while doing stupid things with it. I always do that when I play with open source software: I sometimes crash the drones, because it doesn't just work the way it does when you buy one. It weighs approximately 500 grams and runs a dual Cortex-A9 SoC that we designed ourselves. We love to design and build things on our own, which is sometimes relevant and sometimes not, but it was at least a lot of fun to do our own SoC, develop all the drivers, port the Linux kernel and so on. It has lots of sensors: an IMU, a barometer, a compass, a vertical camera for optical flow, a sonar, and a GPS. It's very small, but it embeds a lot of sensors. The kernel is very old now; it wasn't two years ago, but it's three or four years old at this point. Every time I try to push for us to spend time mainlining it, or to pay someone to do so, people tend to say it's not relevant because we're going to drop support for the SoC, yet we keep making products on it every time. I guess at some point it will be true that we won't release any more products with this SoC, but anyway.

It has a front camera with a fisheye lens, which it uses to do video stabilization: it captures a very wide angle and, with some sensor data, it can generate a smaller but stabilized image.

I've met lots of people at these conferences, and some of them convinced me that it was a very good idea to port an open source autopilot called ArduPilot to the Bebop 2. It sounded like a lot of fun, so I asked my boss and he basically let me do it. I worked a few months on the port; let me explain what it consists of.

This is the hardware architecture of the Bebop 2 around the main SoC: the Parrot P7, a dual Cortex-A9 running a Linux kernel, and all the peripherals you can see. The two cameras are connected on camera interfaces. There are lots of I2C buses (we love that): on the same I2C bus there is the compass, the barometer, and the ESC, the motor controller that drives the brushless motors, which is kind of tricky. Then you have the IMU, an MPU-6050, with a heating resistor to control its temperature: you need the IMU temperature to be as stable as possible, so it's just PWM on a resistor, but it works very well. We have an SPI bus connected to a microphone and an ADC; this is for the sonar, which is a lot of fun, because the SPI bus is used to send the pulses, which is quite unusual, and then we receive the echoes with a microphone connected to an ADC, accessed through IIO drivers and then libiio. And we have a GPS on a UART, which is pretty standard.

ArduPilot is open source, GPLv3. It was originally developed to run on an Arduino, but it's not Arduino-based anymore. We're having lots of internal debate right now about
what its next name should be, but it's difficult to drop a name and choose another one; the problem is people think it's Arduino-based because it's called ArduPilot. If you have ideas, I'll try to submit them. It's in C++; it was originally in the Arduino language, in .pde files, but that was fortunately dropped two years ago. Some Linux boards were already supported before the Bebop: there was a BeagleBone Black with an extension called the PixHawk Fire Cape, or something like that, and there are now pretty many boards running Linux: Intel boards, Raspberry Pis with capes and so on.

It has vehicle-specific flight code: Copter, Plane, and Rover, meaning you can basically fly all of them with ArduPilot. You can adapt it to fly a plane, but you can also fly a quadcopter with any number of propellers: three, four... two, no, two doesn't work. Then you have shared libraries that include the sensor drivers, and by drivers understand userland drivers. This is bad, and we are all working to make it better; at some point I hope we can port everything to what already exists in Linux, IIO drivers for the sensors, and even for the motors I think it's possible, because we've done it internally. There are hardware abstraction layers called AP_HAL-something, so you have AP_HAL_Linux giving access to spidev and the like to do those userland drivers.

I have a few videos just to show that it works. Randy Mackay is the maintainer of Copter; we can't hear it very well, but here he's just flying the drone in his garden, showing how it works, controlling it with a USB gamepad, and that's ArduPilot running on it. It took me a few months to port, because it's pretty straightforward to bring up on our drone, and it was also very interesting because I learned a lot about drones themselves; I wasn't a drone developer before that. It does missions: you can do mission planning with ground control stations. Then you can have some more interesting stuff going on, like this: people implemented fun things like throw launch, where you throw the drone in the air and it stabilizes itself. Some will also try to implement funny things like, yeah, dropping it out of the window; sometimes it doesn't work, sometimes it almost works. That's the kind of thing I really want people to do with our drones, because it's a lot of fun. You obviously won't ask customers to do this kind of thing at home, because if they break the drone they'll send it back to you, but open source is something that allows people to play with drones in this way. And the Bebop is kind of convenient for that, because it's small and difficult to break. You can break it, though: here I played too much and broke a motor, but I can buy a new one; even if I weren't working for Parrot I could buy a new motor and put it back, so it's not a big concern. It still costs money, 500 bucks, but you can make drones for 5000 if you want, so it's a relatively small amount.

Then this year we released what's called the Disco. I haven't brought the wings with me; the Disco is a flying wing, and this is just the body without the wings. I wasn't going to fly it anyway, since it requires more space than this. The architecture is very close to the Bebop: you can notice the same camera here, the same camera underneath, the same microphone. It has a few additional sensors, and the motors obviously are not the same, because there's only one propeller here plus two servo outputs. When we created the main board, the goal was to be able to use it on another vehicle; we wanted to target hobbyists, research, and education. The people who designed the product originally weren't thinking about ArduPilot at all, but we discussed with them and convinced them.
We argued that ArduPlane was the perfect candidate for education and research, and that it would be much easier than doing anything else, because we already had the code ported to the Bebop and it wasn't a big amount of work. Through the community I met while porting ArduPilot to the Bebop, I knew who the maintainer was: Tridge, Andrew Tridgell, a widely known open source developer who loves planes and fixed wings. So we contracted him to do the port. I could have done it myself, but I knew it was going to be better if he did it, especially if you want to sell the product. So we paid him to port ArduPilot to the Disco and settle all the details properly, so that people can really fly it easily.

There are a few additional differences from the Bebop 2 on this one. There is an airspeed sensor, a pitot tube; I'm sure some of you have heard of that. There's a small hole on the bottom, and it compares the pressure of the air that enters against the static pressure, which gives you the speed of the air. That's very important on a plane, because it basically cannot fly if the airspeed drops below some threshold, and it's not the same as the GPS speed: if there's a lot of wind, the plane can basically stay in place; it's the airspeed that matters. You also have RC input, which you don't have on the Bebop, so you can plug your hobby RC receiver into the Disco and then pilot it with your regular remote. You can do that with the regular software in manual mode, but ArduPilot has different flight modes, which makes it a bit more fun for hobbyists. There are differences with the ESC too, obviously, because there's one motor instead of four, and it's a bit different: it can run backwards, which is very useful when you try to land.
You basically approach, and at some point you have to brake; if the motor runs backwards, it brakes very fast and the plane can land very smoothly. We originally had compass calibration issues, because the first hardware had problems, so we had to work a lot on that; in the end the final hardware was all right, but we struggled with the compass for a while. Tridge made a wiki for users, so now any user can run a Disco with ArduPlane. It's online and people are already playing with it. It can be tricky at some points, so if you buy a Disco, just make sure you're prepared to spend time on it. This kind of software is a lot of fun for hobbyists, but for regular people who just say "I'm going to try to fly something else", it's a bit complicated right now; it will get better.

So what was missing? Video. The Bebop streams video and is able to stabilize it on three axes, the Disco has the same, and we really wanted to give the users in the open source community the same possibility, so we wanted to give access to video.

What's funny, and I'm going to turn this on right now, is that we come from the RTOS world. We moved to Linux because it offered lots of possibilities, but when our developers started working on the drone, about five years ago, they were still RTOS developers at heart. So what they did was one big process which handles, well, everything. If you connect to the drone, and I'll show that right now if I can, it always takes a bit of time to start... okay, I'm connected. You can connect to it via adb: after the RTOS era we also went to Android in the car; we've been doing many different things at Parrot, and Android is one of them, so we just took the adb daemon, which seemed very convenient and lightweight for connecting.
Well, the Wi-Fi doesn't seem to want to connect, but what you would see if I could connect is that there is just one big process handling everything, with lots of threads and priorities, like on an RTOS; we basically ported everything that was in the RTOS into one big process. This is the official software, and it has little reusability and high maintenance overhead. So internally we have already switched to a new architecture, split into several processes, or at least a few, so that when the video crashes it doesn't crash the drone: if you have a bug in video processing, it kills that process instead of crashing the drone. We now have three main processes: the autopilot, the control part, and video processing. We use an IPC to exchange data, because when you want to stabilize video you need sensor data, you need some data from the autopilot, and then you can mix all that with the video and stabilize it: if the drone tells you it's tilted like this, you can basically rotate the image to put it back in place. The process in charge of that is called the Parrot imaging process, pimp, and it relies on libraries that we've open sourced, because we want to open source as much as we can. We have our secrets like any company, but anything that's not secret, if it can be open sourced, we'll open source it. The libraries are called libshdata and libtelemetry; they're based on shared memory, I'll come back to that later, and they're on our GitHub.

The imaging process looks like this: you have an application based on GStreamer, with a v4l2 source, then the stabilization, which is a library rather than a separate process, and on one path what's sent over the network as RTP using GStreamer components; on other paths you sometimes have photos, and sometimes what's recorded to the flash memory.

libtelemetry is the library we use to exchange data at high rates between processes on Linux. It's built on top of another library we wrote, libshdata, which is based on shared memory, and the goal is to have something non-blocking. One of the biggest issues we had with the one-process design is this: you want to access data from the autopilot, say the attitude or the position of the drone, so you say okay, I'll put a mutex there, then I'll access the data and do something with it. You end up with priority inversions everywhere: the autopilot gets blocked by something that's writing to flash memory at very low priority, and you end up with a drone that doesn't fly well. You can see it very easily in LTTng traces: my very high priority thread is blocked by something else. You can then go for priority inheritance, but that poses other issues, and in the end you're doomed. What you need is a non-blocking mechanism to exchange data. So the writer pushes timestamped data into shared memory, which is very convenient: you timestamp the data very accurately when you push it, and when you need to stabilize video, you just get the data you need from the shared memory according to the timestamp of the frame you want to stabilize. You can find both libraries on our GitHub. Let me try again to connect to the drone, to see if it cooperates this time.

So, the video implementation. We ported our imaging process to a new architecture where the autopilot isn't our regular autopilot but the open source ArduPilot. What we needed was to export data from the autopilot to the outside, so Tridge implemented a plugin system with hooks at some points in the ArduPilot code to export some data. This is what one of the hooks looks like.
Basically, in the end you push the sample to telemetry. The hook fires when there is an AHRS update, the estimator update, and it gives you the quaternion. A quaternion, for those who don't know, and probably lots of you do, is another way to express an attitude than roll, pitch, and yaw; it just encodes that the drone is oriented like this or like that, and you need that to stabilize video. So the hook gives you the quaternion, and you export the data with a timestamp; here I just took this timestamp, but there are lots of different ways to get one, and then you just push it if it works.

Okay, now it's connected; does it want to connect over adb this time? Yes. So this is the regular firmware. You can see the big process here, it's called dragon-prog; it takes 49% of one CPU, and it's worse when it flies. You can see there are different threads going on in it; anyone can look at this at home if you really want to hack drones. There's vision, which is optical flow; this one is for auto exposure and auto white balance; "calibre" is our autopilot; you have angle calculations for the camera angles, reprojection, and so on.

So what you can do now, and what we implemented, is: you press the button three times, and the big process disappears; instead you have ArduPlane and pimp, the imaging process. Then you can just start streaming by giving an IP and a port. I just have to check that my IP hasn't changed; it hasn't. So you basically start streaming, and on the other side you can use GStreamer, when it cooperates... yeah, the problem is on my side. There. You can see that the image is stabilized, approximately, because it's still work in progress. In fact the video is stabilized on roll: we can roll the drone and the image stays level. It is stabilized on pitch too.
So it's stabilized on two axes, but locked on the yaw axis. What you can also notice is that if I hold it like this and kill the process that runs the autopilot, assuming my mouse cooperates... yeah, like I said, I do stupid things with my drones, and this one has probably taken a few shocks. There's a fan inside, and this one has crashed too many times, so the fan isn't working very well. I also use prototypes that aren't quite at mass-production level, because I know I'm going to break them; I prefer to break prototypes that are already bad. So you can see this is stabilized on roll, and if I stop the autopilot, it isn't anymore; and if I restart it... it takes time... okay, this time it has started; the autopilot just took a bit of time to start.

Why don't we stabilize on yaw? Stabilizing on yaw would mean: I stabilize looking in this direction, but if I turn and look in that direction, you get a black image. So you can stabilize the high frequencies on yaw, but at low frequencies, at some point, you have to come back to where the drone is actually pointing. You can of course stabilize on any axis you want. The video is played with GStreamer, and there is a GStreamer application embedded on the drone too. I'm going to stop it because it makes too much noise; that small noise is the servos. This autopilot, unlike ours, stabilizes even when it's not flying, so the servos try to compensate for the position of the drone.

If you want to build the code for the Parrot Disco, you can check one of my colleagues' GitHub.
It will go back to Parrot's GitHub eventually, but we always work on our own GitHub accounts first and then push back to Parrot's. The version that's already on the drone, where you press the button three times and get the open source software running, is not in production yet, but it will be in a few weeks. If you want to replace it, you just rebuild it from the ArduPilot GitHub. It uses waf, a build system, for those who don't know: you configure with `--board=disco`, then you build it, and then you connect via adb, remount, and push the binary to the target.

What's left to do? There are lots of image quality improvements to be done: the video quality is not the same as what you get with the regular software. We implemented this new architecture as a proof of concept for future drones, and if we want to reach the same level of quality, we need to do a lot of work to improve the image quality: the stabilization, the auto white balance and so on. The former software may be one big process, and bad, and all that, but right now it still works better, which is often the case, so we have to reach the level of functionality it has. We have to add MAVLink support to start and stop streaming. MAVLink is a protocol for drones to exchange with ground control stations in particular; almost all the ground control stations that exist for open source drones use it, and you can use it to receive data and to send commands to the drone. We need to implement something like that to start the streaming. Maybe RTSP at some point too: expose the streams via RTSP, so people can stream to their RTSP receivers. We can implement, and we will, piloting from the SkyController. The SkyController is our remote; it works over Wi-Fi but has very long range, up to four kilometers I think, in the best case. It runs Linux and it's very easy to hack; we have people doing that internally, I've seen people outside wanting to do it too, and we're going to help them. It runs the same kernel that's used on the drone, on the same SoC, our Parrot P7; you have USB, Wi-Fi and everything, so you can basically do whatever you want with it, and a good thing would be to pilot the regular autopilot, and also the open source one, with it.

I would really love to let people develop video plugins, GStreamer plugins, so you can grab the image at some point, do some processing, export the data, do something else; I don't know what, but there are probably many things you can do. We're already selling the Disco with something that looks like the Gear VR, where you put your phone in and get an immersive image; that's done with the regular software, and it could also be done with ArduPilot once we get the video right. A lot of people could analyze the video, stabilize it in another way, do something else, maybe even learn how to do auto exposure and auto white balance if that interests them. There are lots of things you can do, so I would really love that. I would also love to write a fully open source version of the video pipeline, because with what I've done here I cannot open source everything: video stabilization is one of the things we do internally and don't want to expose. But I could give access to the regular stream without any stabilization and let people handle everything themselves; that's something I think I'm going to do in the next months.

That's all for the Disco. I'll just talk briefly about something called the Parrot Slam Dunk, which is a development kit for autonomous flight. It's built on a Tegra K1, it has two stereo cameras here, and two microphones to do sonar.
It has lots of sensors and runs Ubuntu desktop. It's meant for research and education; it's not meant to be embedded in a drone for a final product. It supports ROS, the Robot Operating System, which is not actually an operating system but a framework that lets you exchange data between lots of robotic components and develop robotics algorithms; it's widely used in the robotics community, especially in open source, and it's run by the Open Source Robotics Foundation. You can plug in a screen and a keyboard and use it as a PC, then download the packages you want and develop your algorithms, use the stereo cameras, use ROS, whatever you want, and do something with it. What we're doing internally is putting it on top of the Bebop: we do SLAM and obstacle avoidance and so on, and we pilot the Bebop through its USB port, because you can connect to the USB port and pilot the Bebop with it. You can also do that with the open source software; it demands a bit of work, I must admit, but it's feasible.

That's it for the presentation. If you have questions, maybe you can grab the microphone that's just there.

[Audience: how large is the Slam Dunk?] Oh, it's about this size, with the two stereo cameras spread in this direction. I should have brought one; I forgot. It's very small.

[Audience question about real-time guarantees] No, that's something I haven't talked about: we don't provide real-time guarantees. In fact, the most important thing is that you don't miss samples from the IMU. The IMU sensors have a FIFO, so it's not very problematic to be a bit late, as long as you're well synchronized and capable of not missing any data from the sensor. You don't need hard real-time to run an autopilot; what you need is to never accumulate lateness. You must always catch up, but a small jitter is not a big problem if your IMU has a FIFO, because you can read whatever data is there and then process it. So no, we're not running any real-time kernel, no PREEMPT_RT; we run CONFIG_PREEMPT, but not PREEMPT_RT. Any more questions?

[Audience question about mainlining the kernel] No, that's what I said: it's on 3.4 and it's not mainline. I've been trying to push us to do something in this direction, to work with the people who can do that or even do it ourselves, but I think my company hasn't yet seen the possibilities it could offer, so we haven't. Everything is of course released and published, on GitHub and so on, but it's not mainline, so we keep backporting things, especially for video: we needed the latest things from Video4Linux, and we've spent a lot of time backporting them to our 3.4 kernel, which is sometimes very difficult. No more questions?

[Audience question about video latency] You mean with the regular firmware? What we measured is 220 milliseconds, I think. It's not at the level of, say, racing drones, but that's not the point: it's something you can pilot, with immersive flight, which is very interesting, and the goal is not to have the very low latency you'd want for racing. Of course, the shorter the latency the better, and I think some limitations of our current software can be overcome by switching to the new architecture, because GStreamer gives us more flexibility to implement different pipelines for streaming and recording, which is not the case today. There are lots of ways to improve it; we're just at the beginning.

[Audience question about encoding] No, on the Bebop we have hardware encoders, and the stabilization is done on the GPU; it uses GStreamer plugins using OpenGL. So we're very far behind on the kernel, and very up to date on GStreamer. And no, the GStreamer pipeline I showed was on the PC side, to decode the stream; our pipeline on the drone uses the stabilization on the GPU
over OpenGL, and it uses the hardware encoder for the encoding. On this dual Cortex-A9 you couldn't possibly stream at 720p and record at 1080p at the same time without a hardware encoder; it's not possible, and even on higher-end chips it's complicated.

[Audience question about why there's no PREEMPT_RT] There are some reasons for that. The first one is that when they started to work on it, they managed to make it fly without it, and it's hard to convince people that you need real time when it works without it. At some point we studied what would be necessary to switch to the PREEMPT_RT patch and concluded it wasn't mandatory, so we decided not to do it. I think at some point it could be a good thing, but we're starting to look at other platforms, including ARM64, and at the moment we switched there was no PREEMPT_RT patch for ARM64, so we're still not going to use it. I think it's all right to work without the PREEMPT_RT patch; for very hard real-time things you should have something dedicated: you can use a separate microcontroller, sometimes you have microcontrollers embedded inside the big SoCs that can handle the real-time parts, or even hardware blocks.

The difference between IMUs and cameras is that IMUs are slaves and cameras are masters on the bus. Cameras move big amounts of data, and the main CPU is the slave, which is very convenient: the camera keeps sending data, you have hardware blocks that keep receiving it, and the hardware can timestamp it and do that kind of thing. If we had the same mechanisms for IMUs, barometers and so on, it would be very easy: hardware blocks with a very standard way to get the data, which can timestamp everything. Once you've timestamped everything, you've won. You don't need microsecond latency to pilot; you need microsecond accuracy on the timestamps. It's all about timestamps, not about reactivity, because if you look at what the drone does, the autopilot runs at 400 Hz for copters, but for a plane it can run at 50 Hz, and it works perfectly well at 50 Hz; it just needs correct timestamps.

Any other questions? Okay, then thank you very much.