 So I'm Tim Panton. My current business card says Protocol Droid, which is what I actually like doing: implementing protocols, but also explaining the kind of human-cyborg-relations thing of translating from geek to real life and back again. I'm Steely Glint on Twitter. Feel free to tweet at me if you've got questions and stuff. But I wanted to kick off with some recent news. Fiat Chrysler got their Jeep hacked because they thought they had a VPN when they kind of didn't. They didn't authenticate who was talking to it correctly, and they didn't have the right level of firewall. These guys have got a sniper rifle which, for reasons I'm too British to understand, has a Wi-Fi-enabled gun sight. And it can be hacked over the Wi-Fi, so that, bizarrely, if you change the gravity constant, you can make the thing aim wrong and hit the wrong target. And this guy, this is brilliant, right? This guy has a hacked version of the AR drone which, when it sees another AR drone, co-opts it and makes it follow it. So it's basically a viral set of AR drones — again, because this is open Wi-Fi with no authentication. Nest was down earlier this week along with Dropbox, so even if your baby cam was in the same room on the same Wi-Fi as your handset, you still couldn't talk to it, because it went through a server in Utah, or wherever they host their servers. This one was a big thing in Britain: the Information Commissioner, which is a branch of government in the UK, issued a letter to all webcam manufacturers telling them not to release webcams that ship with default passwords. And if they must use passwords, then they must disable the webcam until the password has been changed to a strong one. So that's the regulator starting to get involved in internet security, which is kind of fun. And the FDA, same thing here: the FDA have withdrawn a pump, an insulin pump I think, because it could be hacked over the whole hospital network.
It wasn't properly firewalled or authenticated. So my message here is that security isn't what it was. The old thing of knocking on the postern gate of the castle and being challenged for a password doesn't work in the new world. We've been stuck in that security model for too long and it doesn't work anymore. Now, the ideal protocol for the internet of everything would be a standard protocol, obviously. It would be secure. It would have wide deployment. It would be peer-to-peer with NAT traversal, so that it didn't matter whether you were on 3G or 4G or on your Wi-Fi. It would be real time, because for things like baby monitors and drug pumps you don't want to delay this stuff. You would want strong identity management, so you can control who's accessing the brakes in your car. And you want it to be mobile capable, because most of the time at least one endpoint is going to be a smartphone; the other one's probably going to be a device. The bottom line, and the summary of all this, is that it's got to be focused on the user; it's got to be built around what the user needs. So you look at that list and you think, well, the RTCWeb protocol that goes over the wire has the following attributes. It's standardized. It's secure. It's widely deployed. It's peer-to-peer. It's obviously real time. The identity thing, not so much, but people are working on it; it's getting there. It's mobile capable, and possibly runs on smaller things. And it's user-centric. So maybe there's something we could do with that. So what I decided to do for this demo is to build something that does real-time, authenticated, peer-to-peer communication between a small device and a WebRTC browser, obviously using the data channel. And, obviously, not using any passwords anywhere. Because let's face it, we all hate passwords.
So the components we're going to need for this are: a WebRTC data channel app on the smartphone, a WebRTC data channel app on an embedded device, a WebRTC service so that they can meet up, and some sort of pairing so that they know who they're talking to and that relationship is well established. What we're going to use is Chrome for Android. Well, actually, because of the AV thing, I'm going to do it on the Mac, but it could perfectly well be on Android, or indeed it could be a CocoaPod or whatever; it's standard WebRTC, basically. I'm going to run a lightweight stack on the device. I've written a very simple WebSocket message hub, just because I can, and it's on GitHub there if you want it. Actually, all of the HTML5 for the examples is in that Git repository as well. I'm going to use QR code pairing, which is actually kind of fun because it uses getUserMedia, so it's even more WebRTC-ish. And I'm going to use the Duckling protocol, as described by Ross Anderson in the 90s; he's a professor of computer science at Cambridge, Cambridge UK. Basically, what it models is the idea that a device trusts the first thing it sees. I mean, this is supposedly true of ducklings. I don't know if it actually is, but a duckling recognizes its mother because that's the first thing it saw when it came out of the shell. And actually we flip this relationship around, because most small devices don't have cameras and most smartphones do. So we turn it around: we have the device showing the QR code and the smartphone seeing it, scanning it, and making a call. And that only happens once, at the kind of hatching stage, when you first switch a device on, when you unpack it from the Amazon package or whatever. At that point, it becomes your device and you claim ownership. So now, I'm going to attempt to demo what that looks like for users. Now, this screen has more resolution than I expected, so let's make it a bit bigger. OK, that'll do.
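That first-sight trust model can be sketched in a few lines of JavaScript. This is a minimal illustration, assuming an in-memory store; the names (`Duckling`, `imprint`) are mine, not from the actual demo code.

```javascript
// Sketch of "duckling" imprinting: the first identity a device sees at
// hatching is trusted for life; every other identity is ignored.
class Duckling {
  constructor() {
    this.mother = null; // set exactly once, at hatching
  }
  // Called whenever an identity is presented (e.g. via the QR exchange).
  // Returns true if this identity is (or becomes) the trusted one.
  imprint(identity) {
    if (this.mother === null) {
      this.mother = identity; // hatching: first contact wins
      return true;
    }
    return this.mother === identity; // later: only mother is accepted
  }
}
```

After the one-off pairing, every later call attempt is just `imprint(caller)` again: the paired identity gets `true`, and any other duck gets `false`.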
Right, so not quite the effect I had in mind, but it'll do. So, to riff off the original remark about YoPet, this is the app I wrote for last year, but with added pairing. Because basically, last year I totally avoided the whole issue of how you pair up two devices, so I figured this year I'd probably better fix that. The idea is your pet has an app on a tablet, which you put in front of their cage or wherever. And the model is: when you first get this app, you start it up and it shows a QR code. It says, please show this to your handset, or in this case your browser on your desktop. You show the two to each other, a few seconds later it sets up a quick test call, and then you're paired. From that point on, these two devices will not speak to anybody else; they know about their pairing relationship from then on. The net result of that is that I can put this down over here and place a video call to it, with any luck. And yes, we have a video call. So that doesn't sound like a big deal, but actually it's nice that I didn't do an answer on that. I didn't accept it, I didn't have to log in or check in, and no passwords were exchanged. Switch that off. And there are some other features, like you need to be able to forget the tablet, and whatever else. Right. So you've seen what the QR code exchange looks like; now I need to get my browser back. That QR code exchange is actually a pretty simple experience for the user. So the next thing, yes, is the code. I'm just going to show you a tiny bit of code for the exchange — there's a mode for this, presenter mode. There we go. That's what we're doing. Right, so this is a little WebRTC rendezvous service that runs on a server. It's actually written in Scala, which I really like; it's great for these kinds of central services.
And basically, all it does is take a WebSocket. When you connect to a WebSocket, you assert an identity — in this case, whatever was actually in your QR code on one end, and the equivalent on the other. And then when a message comes in, the service just passes it out to the correct device within the mesh. So it's basically a message-passing thing: it looks at the to address and sends it on. It doesn't try to validate anything; it just passes messages around. There's actually no management in there; it's just a message bus. How do I get out of this mode? It should be Escape, but it isn't — I fell for that earlier when I tried it. Right, so that's Fingersmith. Again, that's in that GitHub repository. You're welcome to improve it, please. So the interesting question is, what do we use as an address? What was in that QR code? WebRTC has no intrinsic identity built in. That was a design decision, and one I wholly approve of, but it doesn't have an identity in there. So the obvious thing to do is to use a random key. And if you look at something like XMPP in anonymous mode, the server generates a random token, which you could then use. Or you can do what's available in Twilio and Respoke, where the client side can generate an identity and assert it and just say, I am Fred. That's fine, and that's what we're exchanging in the QR code. We've taken the second approach: the endpoint claims an identity, a random identity in this case, but persistent over the life of the app. It's exchanged over the QR code at hatching, stored locally, and reused for life. And by life, I mean in this case it might be the life of the, well, the tablet actually — or at any rate, however long you've got the app installed on the tablet. So now, full disclosure: this is such a good idea, I filed a patent on it.
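The relay behavior just described — assert an identity, then forward messages by their to address with no validation — can be sketched like this. It's a toy stand-in for illustration, not Fingersmith's actual Scala code, and the API names are my own assumptions.

```javascript
// Minimal message hub: clients register under an asserted identity
// (whatever was in the QR code), and messages are forwarded by their
// "to" field, unexamined. No validation, no management -- just a bus.
class MessageHub {
  constructor() {
    this.clients = new Map(); // identity -> delivery callback
  }
  connect(identity, onMessage) {
    this.clients.set(identity, onMessage); // identity is simply asserted
  }
  route(message) {
    const deliver = this.clients.get(message.to);
    if (deliver) deliver(message); // pass it through untouched
    return Boolean(deliver); // false if nobody claimed that identity
  }
}
```

An SDP offer from phone to device is then just `hub.route({to: deviceId, from: phoneId, sdp: offer})`; the hub never looks inside the message.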
So if anybody wants to blank their eyes and ears at this point, feel free. So there's this fingerprint thing loafing around in the SDP. It looks kind of random; maybe we could do something with it. It's actually the hash of the X.509 certificate used in the DTLS exchange. Can you use it as an address? Well, actually, yes. And if you do, then all sorts of fun crypto properties drop out for free. Basically, you don't have to trust the signaling server anymore. I'm going to give a talk at the Illinois Institute of Technology in Chicago in about three weeks' time; the whole talk is about that. They wouldn't let me do that here. But it basically lets the duckling tell that mommy is calling, and ignore everybody else, all the other ducks. So, another little bit of JavaScript walkthrough here, because I'm supposed to do code, you see. So, localhost, right. This is what we're going to see in a minute, but for the moment let's just look at the source. OK, so here we go. Classic JavaScript: you end up reading it backwards. So at the end, we have this on-document-ready function. Basically, what it does is use a little library I've written called Ipsurama, which goes off and tries to find out what the fingerprint is for the current device, so it can then use it and assert it. It does this — the indenting's gone wrong somewhere — it does this by using another library. What the Ipsurama library does is ask: who am I? It generates a fake offer, which it's then going to throw away. And it passes that offer to a library I actually wrote with a friend of mine, Neil Stratford, whilst we were working at Tropo, called Phono SDP, which parses the SDP into a decently shaped JSON object.
And then you can go and dig into the JSON object and pull the fingerprint out. Now, with ORTC, I would no longer need to do that; I could just go and ask the object. But for the moment, you see here, this is where we call the Phono library with the description, and we pull out a fingerprint by digging down into the SDP contents' fingerprint. It's not horrendously ugly, but actually the Phono.SDP library is not beautiful. Anyway, having done that, you have the fingerprint. You establish your WebSocket connection and you say: right, I'm connected, and I am this identity — in this case, the fingerprint, those two not-exactly-random bytes. So, back to the presentation, hopefully. Generate a dummy offer, use Phono.SDP.js to parse that dummy offer, extract the fingerprint. Thanks to Tropo for making that open source. Ipsurama sets up a data channel via Fingersmith. And we have to use generateCertificate and IndexedDB to make sure that the certificate is used consistently throughout the life of — throughout my use of — this app. In Chrome, that's the default behavior at the moment, but I understand from the conversation earlier that that will soon cease to be the case, so we'll have to use generateCertificate and IndexedDB on both platforms, which is fine, because that's the standard. So on the device side, I have a set of choices about what platform to use. An obvious choice would be JavaScript: I'd probably take the WebRTC code dump, cross-compile it onto my little Linux platform, and then wrap it in Node and use that. That's possible, but there's an awful lot of stuff in there that I don't need, like video codecs and things. The other approach would be to take an existing C or C++ library, like Janus from the Meetecho guys, and again cut out the stuff I don't need — because they're very server-based, and I actually want to run this on a tiny endpoint.
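The dummy-offer trick boils down to finding the a=fingerprint line in the generated SDP. Here is a regex version of that extraction (rather than the full Phono.SDP parser), plus the duckling-style check against a stored, paired fingerprint. The function names and the sample SDP fragment are illustrative, not the demo's actual code.

```javascript
// Pull the DTLS certificate fingerprint out of an SDP blob.
// Returns { hash, print } or null if no fingerprint line is present.
function extractFingerprint(sdp) {
  const m = sdp.match(/^a=fingerprint:(\S+)\s+([0-9a-fA-F:]+)/m);
  return m ? { hash: m[1], print: m[2].toUpperCase() } : null;
}

// Duckling check: does the caller's fingerprint match the one we paired with?
function isMommy(sdp, pairedPrint) {
  const fp = extractFingerprint(sdp);
  return fp !== null && fp.print === pairedPrint.toUpperCase();
}

// Fragment in the shape a browser-generated dummy offer would contain.
const sampleSdp = [
  'v=0',
  'a=fingerprint:sha-256 7B:8B:F0:65:5F:78:E2:51:3B:AC:6F:F3:3F:46:1B:35:' +
    'DC:B8:5F:64:1A:24:C2:43:F0:A1:58:D0:A1:2C:19:08',
  'a=setup:actpass',
].join('\r\n');
```

Because the fingerprint is the hash of the certificate actually used in the DTLS handshake, a match here means the signaling server couldn't have substituted a different caller.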
Or I can do something foolish and do it in Java, DIY. So those of you who know me will guess exactly what I did, which is the Java way. Fortunately, there's lots of helpful stuff out there. My friend Emil wrote a very nice ICE stack, Ice4j. So that's the ICE stuff sorted out — it does TURN and STUN, and I'm happy with that. The Bouncy Castle guys wrote a DTLS stack, which Tropo actually funded, so that's great, that's out there. Tropo have now been bought by Cisco, so not too bad. Unfortunately, there isn't an open source SCTP stack in Java, so I had to write one. So now, I'm going to demo on the BeagleBone. You can think of the BeagleBone as the American version of the Raspberry Pi. It's a little ARM device — I think it's 600 megahertz, half a gig of memory, and a little bit of flash. It's actually a slightly nicer platform than the Pi, and it runs a reasonable Linux. So now, back to this web page. Oh, no — first we have to set the BeagleBone running (it isn't a Pi). On the BeagleBone, we're going to start a Java library. My big Java library. Let me try and make that a bit bigger for you; you need to see this, really. So this is the BeagleBone, and it's now running. It's drawn a QR code, and if I hold it close enough, we'll recognize it, and then we can press Connect. And now we see my totally unoptimized Java hopefully get an offer any second now. There we go. And hopefully it sends back an answer, with any luck. Yeah, looking good. And now it's done the ICE STUN/TURN transaction, it's starting SCTP, and look, it's running — it's echoing my packets, my DTLS packets. Now, there are a couple of interesting things to say about why you would bother to do that when you've got a perfectly good WebSocket server. There are two really good reasons. One of them I'll show you now, if I can remember how. There we go. That's one, which is that even on a relatively slow device —
and bear in mind it's writing to this over a serial port, so it's doing stuff slowly — we've got a round trip time of like 30 milliseconds. It'd be hard to do that via a server out in the cloud and back to a small device. So the round trip time is short, and there's a stack of optimization I could do to make it shorter. And then the other thing is, with any luck — well, if I could find the window it's running in, I could kill the WebSocket server, but I can't actually find it. And it would still work, the point being that the traffic isn't going through the WebSocket server, so the data channel would continue to work. So that's the data channel running on a small device. Right, back to the story. Now, I know there's an argument which says: actually, you're cheating. A BeagleBone is credit card sized and it's got half a gig of RAM; that's not a real device. Real devices are ARM9s with 300 megahertz and 64 meg of memory. And I would counter that by pointing out that what Intel think an IoT device looks like is this. This is the Intel Edison, and it's a dual-core x86 architecture, dual 500 megahertz with a gig of memory. So actually, if you look not too far into the future, you could expect to find this in your thermostat. But I understand that there will be platforms where that's not appropriate. So, I give in. I went out and bought myself a toy: a Lego EV3. Now, the EV3 is exactly that spec — a 300 megahertz ARM9 running Linux. And some clown — it's open source, right? — some clown rebuilt the OS so that it takes all the Lego stuff out and you run a plain Linux, and then some other clown (and I just admire the open source community for this) went and put a Java API on it. So all of these devices on here, these sensors and these motors, actually have Java APIs to them. And I'm afraid, to me, that's utterly irresistible. So now, if I put him on there, he'll run away and fall off.
So I'm going to put him there, and nothing will happen for several minutes, because I have some typing to do. Right. So now let's stop the BeagleBone, because that's a distraction. And — I mean, this is just bizarre, right? I'm logging in to my electronic dog here. And this is absolutely typical: no password, right? Anyway, fortunately it's on a closed Wi-Fi. Right, so actually, speaking of closed Wi-Fi, I think we might need to be a bit closer. Now I need to kill the current menuing system that's running on there, because we don't want any menus. We don't do menus. Right, so with any luck, the screen's gone? Excellent, right. Now, this is the bit where I asked Billy to stand around and fill in for me, because this takes ages, but he's not here — so it's fine, you stay there and I'll just ad lib. Starting Java on a 300 megahertz ARM and pulling in the DTLS crypto library takes a few seconds. Then pulling in the WebSocket library takes a little longer. Then running the hash over the keys, so that it knows what the certificate is, takes a little longer still. And now it's going to go and try to talk to the WebSocket server, and it will get there. I mean, this is all available for optimization. I first got this working earlier this week, and I first got it reliably working yesterday — and I say reliably, but it may not actually turn out that way. So it should now be just about to talk to the WebSocket server. There we go. We're now on the WebSocket. So now we need to talk to it. And unsurprisingly, we have a very similar user interface. Oh, I just made a tactical error. So you've got a QR code down there. Now, I'm going to cheat, for two reasons: one of which is that it's over there and this is here, and the other is that the camera on here doesn't like the lighting. I took a picture of the QR code earlier.
It's fine — I took a picture of the QR code earlier because I knew the problem is that that screen isn't backlit, and there's actually a lot of reflection off it, as you will see from the picture. So with any luck, if I hold this in the right place, it will — come on, you can do it — there we go. Got it. Right. So we now have the QR code exchanged, and we now start talking to it. Now, this actually also takes a little while. There we go. Offer gone. Answer coming back. Gathering candidates. It goes out through a TURN server, which I should be annoyed about — but hey, it'll work, which is the important thing. So DTLS packets are going. We have sent the certificate, but we haven't got one back yet. And that's timed out. Oh, that's looking good. Yes, we now have a data channel running. Which means that, at least in theory, I should be able to drive this thing. I can't actually see where I'm going; I need to put a camera on it. All right, turn right a bit. There we go. We've gone too far. Turn left a bit. Oh — anyway. So my LEGO building skills aren't quite what they should be. But actually, there's a whole API for how fast you want it to accelerate; I just didn't have time. Anyway, the point being that I have a DTLS exchange with it, and no passwords were harmed in this exchange. So — oh yeah, the demo. Right: by using the WebRTC data channel, we have a standardized, secure, widely deployed, peer-to-peer, real-time connection with strong identity management to a mobile, small device. And it's, wow, fairly user-centric. So the big takeaway message is: WebRTC isn't just for video calls. There are other things you can do with it, and particularly the data channel can solve interesting problems in the internet of everything space. So that's me.