Yep. Hello. Hi. Okay. So, I'm Libby. I'm going to talk to you about a very cheap, quite cheap, remotely posable presence robot using a Raspberry Pi, WebRTC and a few servos. Because I'm Libby, I call it Libbybot, but that was just a kind of default. Anyway, it currently lives in an IKEA Espressivo lamp, which is just here to show you. So, hopefully, there's someone in the lamp. Richard, are you there? Or Dirk? Oh. Can you say hello on the robot? Can you hear it? I can see the monitor. Can you move around a little bit for us, if you can? So, this is what I mean by remotely posable: it's got a few different ways it can move in order to see better or express itself. Thanks, Richard. Can you move the top a bit just to demonstrate? Maybe? Yes. There we go. Thanks very much. I just wanted to show it off before it all goes wrong. So, I'm Libby. I live quite a long way away from where I work, but I have an amazing job at BBC Research and Development. Because I live a long way away, I used to spend a lot of time on the phone, and I was very easy to ignore. I couldn't see very well, I couldn't hear, and it was really annoying for everyone else. If I was in something like Skype instead, I'd be borrowing someone's laptop, and I'd be in the laptop, and people find that very irritating. I also have another hat: I go to Bristol Hackspace quite a lot. There, me and Barney made a hat that you could wear to show people around hackspaces, the Hackspace Hat. It streams audio and video from the hat; you can see Barney wearing it there. That's the Bristol Brunel version, which is one of the stovepipe hats. It's got GStreamer, it's got Janus to turn the stream into WebRTC, and you access it via the browser. So, my question to myself was: could I adapt a Hackspace Hat to be a standalone presence robot for work, and would that make things better? So, yes, I could.
But it's taken me several years to figure out what the essentials are for effective communication through a robot. So, are you ready for a tale of oddly shaped robot avatars? I hope so. And a lot of mistakes and failure and learnings. So, this is kind of what I was aiming for to start with. You sometimes see these in very posh offices: some of them are a bit like Segways, so they wheel around, and they're quite tall, and you've got a kind of iPad thing at the top. It's a bit like talking to a person who hasn't got a body, who's just got a head on wheels. So, that's the kind of thing I was thinking of. And obviously, a robot, to me, needs wheels and a face. So, here's the first one. This is 2014. It's got wheels, because it's built from a robot kit. And it can express basic emotions, but only as emoticons, not even proper emojis. It only does one-way video, which is MJPEG, no audio, and it only works on the same network. So, although I was very pleased, and you can see me testing it there, and although I could move it a bit, it wasn't really what I was after, because I could only go to a meeting in a different part of the same building, which didn't seem that helpful. One big problem with giving it wheels is that it has a tendency to fall off tables, because it can't really tell where it's going. And also, the office we were in at the time had a lot of barrier doors that you needed a pass to get through. Obviously, it couldn't get through those by itself, especially because it was only tiny. And so, I thought, well, perhaps people could just carry it for me. I could just ask them to take it to meetings. So, this one is a sort of handbag type model. You can see it's had an upgrade. It's got a lovely Pi touchscreen, and a nice Jabra speaker-microphone on top.
It's still using GStreamer and Janus for WebRTC, so it can use any network. But it's a really, really badly designed shape. All the Pi and stuff is in the middle, so it's not very portable. And the Pi can't handle streaming video and displaying it at the same time. So, it was streaming fine to me, but it couldn't display me on the screen, which made it completely pointless having a screen. So, different shape. This is the tube. I've gone back to emoticons for expressiveness, so the little screen on the front isn't going to display me; it's just going to display my sort of interpretation of my emotions. It's got a slightly more sensible shape, in that the speaker fits on the top, but that's made it really top-heavy. It has a tendency to fall over, and it just sort of looks like an Alexa. I don't like it. So, then things start to get a little bit weirder. I'm thinking, well, maybe I don't need a screen, because the screen is really just about saying "I'm here", or expressing things, and okay, I've got my voice, I can express things with that. Perhaps I don't need a screen at all. So, here we are. These are the bins. I quite like these, actually. It's not really expressing an emotion, just "I'm here", but it has some expressiveness, because when I'm there, the lid pops off the bin. I'm here. You can sort of see someone in the bin, and then it can rotate to see better. So, it was cute, but basically everyone laughed at it. So, I thought, this is not enhancing my professionalism, and I moved away from that design, tragically. At this point, I read a really interesting paper that Richard Tull suggested to me. I think it was you, Richard. Maybe. It was all about just having a single limb. The paper is called Exploring the Affect of Abstract Motion in Social Human-Robot Interaction. Basically, they had a stick on three servos, so it had three ways of moving. They studied people interacting with it, and their hypothesis was that you can express emotion using a stick. Just one stick.
And I thought it was brilliant. I figured I needed two sticks, because if you've got a camera on a stick, you can't really move that around; it would make you seasick. So I was like, well, I'll have two sticks. But this is where my lack of abilities with servos and things came to the fore. It's got lots of movement, but it has a tendency to fling bits of itself off, and it looks like it's saluting. (Video plays.) You can see its movement is very, very jerky. And it kind of goes, yeah. It's not great. Saluting is not an emotion. It wasn't very good. And again, people didn't really like it. They thought, I don't really know what this stick is doing in this meeting. And then I had a brainwave. My friend Dan had moved away from Bristol, and he'd left us two little lamps. And I was just looking at them one day, thinking, what can I do? And, yeah: I'm a lamp. Here I'm thinking much more of things like Pixar's little lamp, which has got huge amounts of expressivity, despite only having a few angles; it's kind of got a head. And this is the Espressivo IKEA lamp. I think they were about seven quid. They are discontinued. Weirdly, the electricity runs up the two poles, so I don't know what they were thinking there, but that's what they did. There are loads of people who have got one at home. So, I've made three of these, I've got another spare two, and I've given one away. Two I got from Dan, one I found on the street outside a charity shop, and Richard gave me two. But it's great. You can more or less fit a Pi 3 in the base, and the construction is obviously made for nodding and rotating. It's just perfect. So, this isn't the full thing, but you can sort of see how easy it is to make it nod like this: I'm here, I'm kind of waving at you, that sort of thing. So, this is how it works in use. I've been using it at work for more than a year, and it goes like this. Someone at work switches it on.
I say, can you switch on Libbybot? They say, yes. I get a notification in Slack with a unique URL. I click to join it. I can see, I can hear and move, and the remote side can hear me. There's no reason why it has to be one person in the lamp; lots of people can be in the lamp if you want. We once held a meeting with five of us in the lamp and no one else there, which was a bit bizarre, but it's completely possible. I'll tell you about the technology in a few minutes, but I wondered if you wanted to have a go. I've set up a specific instance at that URL. It should work on most devices. You might just make it keel over, I don't know. And I've turned off the audio: normally it requests access to the audio on whatever device you're using, but here I've turned that off, because no one wants to grant that on a device they don't know. So, anyway. See if you can make it move, perhaps? No? Oh. Oh, God. Hang on. Well done. I don't know if anyone really wants one, but if you do. So, the technology behind it is just that we're running a headless browser on a Raspberry Pi in the lamp. That browser is running a web page, and then I'm connecting to it with WebRTC via a server. WebRTC, I don't know how familiar you are with it, is a secure, open, web-based protocol for real-time audio and video communication. I'm using Node and JavaScript on my server, and Let's Encrypt, because it has to be HTTPS. Then a Pi 3 with a camera, a USB speaker-microphone (this rather fancy one), two 9g servos (quite tiny ones), an Arduino and a bit of wire. It does take about a day to make one. On the server side, I've spent quite a lot of time trying to find a good, easy-to-use WebRTC library. RTCMultiConnection, if you ever want to try one, is the best that I've found. There are others out there, but this one abstracts away a lot of the more annoying, complicated bits of WebRTC.
It does video, audio and data in any combination. I didn't always want to have video both ways, because the Pi is quite resource-constrained, and I don't necessarily want to show my face. So, I wanted to be able to do data for movement, audio both ways, and video one way, and that's what it does. It seems to be one guy who works on it, and he's brilliant; there's tons and tons of documentation. And it comes with a small Node server, so you just run that, or modify it however you want, and off you go. On the headless side: you can run Chromium headlessly, there's a headless flag for it, but for this it doesn't actually work. It doesn't matter, though. If you haven't got a screen attached, it still works fine, and that's what I've done here. Because it's a web page, it's really, really easy to develop for and tweak. I'm just reading this bit. The only bit that was hard, seriously, was getting the Video4Linux (V4L2) drivers to work, because at the moment you have to set this amazing module option that literally ends in is_broken=1 in order to make it work. So that was the only bit that was hard to find. Then there's a Chromium flag called use-fake-ui-for-media-streams. Normally, if a page requests video and audio, you have to say okay; you have to give consent. Because it's in the lamp, with no controls, I can't do that, so that flag grants the permission automatically. And there's a little server controlling the servos, a little Python server, and I'm calling that from inside the web page when I want to move it. It's just running locally, and you have to allow running insecure content to do that. So, the cool things. It's got an embodied presence, and I find this is really, really great when you're in a meeting, especially a small meeting. You're a thing there, and people address you and speak to you as a person, kind of, even though you're a lamp. They get used to it really, really quickly.
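The servo path described here, web page to little local Python server to Arduino, can be sketched roughly like this. This is just an illustrative sketch, not the actual Libbybot code: the command format, port number and parameter names are my own assumptions, and the real build may differ.

```python
# Sketch of a tiny local servo-control server: the lamp's web page calls
# an HTTP endpoint like /move?servo=pan&angle=120, and the server forwards
# a text command to the Arduino over serial. All names and the command
# format are illustrative assumptions, not the real Libbybot protocol.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

def format_command(servo, angle):
    """Build a serial command line, clamping the angle to a safe 0-180 range."""
    angle = max(0, min(180, int(angle)))
    return f"{servo}:{angle}\n".encode("ascii")

class ServoHandler(BaseHTTPRequestHandler):
    # In real use this would be something like serial.Serial("/dev/ttyUSB0", 9600)
    # from pyserial; left as None so the sketch runs without hardware.
    serial_port = None

    def do_GET(self):
        query = parse_qs(urlparse(self.path).query)
        servo = query.get("servo", ["pan"])[0]
        angle = query.get("angle", ["90"])[0]
        if self.serial_port is not None:
            self.serial_port.write(format_command(servo, angle))
        self.send_response(200)
        self.end_headers()

# To run it locally:
# HTTPServer(("127.0.0.1", 8000), ServoHandler).serve_forever()
```

Because the page itself is HTTPS but an endpoint like this is plain local HTTP, this is exactly the mixed-content situation that allowing insecure content in the browser works around.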
I also use it in larger meetings, but it works less well there, because you're a bit hidden, because you're small. It's mostly web technology, and that means that anyone with HTML and JavaScript can have a go at making one. You might think that the way I've gone about this is completely nuts, because it's layer upon layer of things, right? I could have just done this with GStreamer, and I have done, but it's much, much harder to tweak, and it's much less clear to me what's gone wrong when it goes wrong. Making it web technology means I can change it really quickly, and I can prototype things on my laptop, which is really, really useful; I don't need to take the lamp around with me all the time. And this is just a case of waiting till the technology catches up with you, because when we started this, there was a Pi B+, or something like that, and there was no way it would run Chromium, except on a good day, a very good day. It certainly wouldn't do WebRTC. Now, WebRTC works with most modern phones, and it works with all browsers; all the browsers on this laptop, anyway. But the best thing about this is that you've got a web-based platform for prototyping connected objects. Yeah, it is pretty nuts, but it means that you can very quickly make objects that can then do things, because you've got the ability to run things via serial, and you've got the ability to play audio and video, which is amazing. We have used this kind of technique at work to prototype radio-like objects, and we did it amazingly fast: we made five ourselves in about a month, and then we made four more in a two-day hack day. So it's incredibly quick to just get things going. There's a lot of work behind that as well, but it's just having this platform to play with that's incredible.
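The boot-up flow mentioned earlier, where someone switches the lamp on and a Slack notification arrives with a unique URL to join, could be sketched like this. The hostname, URL scheme and webhook address are all made up for illustration; the real Libbybot setup may do this quite differently.

```python
# Sketch of a boot-time notification: make a fresh, hard-to-guess room URL
# and post it to a Slack incoming webhook. The hostname and webhook URL
# are hypothetical placeholders, not Libbybot's real ones.
import json
import uuid
from urllib import request

def make_room_url(base="https://example.org/libbybot"):
    """Generate a unique, hard-to-guess room URL for this session."""
    return f"{base}/{uuid.uuid4().hex}"

def notify_slack(webhook_url, room_url):
    """Post the join link to a Slack incoming webhook."""
    payload = json.dumps({"text": f"Libbybot is on: {room_url}"}).encode("utf-8")
    req = request.Request(webhook_url, data=payload,
                          headers={"Content-Type": "application/json"})
    return request.urlopen(req)

# Example (needs a real webhook URL to actually send):
# notify_slack("https://hooks.slack.com/services/...", make_room_url())
```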
And my colleague Andrew Nicolaou has written a really, really lovely article about how we did that, and why it's a good idea. So, just to finish up, the less cool things. If you want to do this, this is really the only expensive bit that you need: a really, really good speaker-microphone, and they're about 100 quid, so that adds to the cost. It's amazing for the Pi 3 to have Wi-Fi, but in practice it's quite a small antenna, and in corporate environments you often have loads of networks, so it can be problematic; actually, I was having problems with it earlier, so I'm sharing a network with it here. And finally, it's just not very expressive yet. I had a lot of ambitions for this: I had a version that shrugged its shoulders as well as nodding and turning, but really I'd love to do something much more like the original stick, because that could express, I don't know, ten emotions or something like that, which would be a big step up. So that's it. Thanks for listening. Thanks to Dan for this amazing picture he took earlier of me trying to test it. The code is all up there, and the instructions are up there. You could make one if you wanted, or put it in something else; it doesn't have to be a lamp. I'll put the references up there, and I'd just like to say thank you to Barney, Richard and the Bristol Hackspace people, and my tolerant, helpful and inspiring BBC colleagues who carry Libbybot around when I ask them to. Any questions? Yeah? If anyone wants to ask me questions, or ask me online if you'd rather, later. Oh, we've got a question here. Can we make a really big one? I'd love to. I saw this amazing lamp in a charity shop, and it's about this big, and I just thought that would really give me some real presence at work if I was kind of towering over people. But at the moment, because it just runs off one plug, I'm kind of at the maximum number of servos. Well, I don't know, maybe.
You can help me, Richard, yes. Hi. Have you tried using it in other social situations? Like, leaving one at your mum's house or something? No, but that's a really interesting idea. Would that be really weird, or might it work? I don't know. I mean, I don't know if you can think of me as a lamp. I've not tried it, but maybe I ought to, yes. And also, this is possibly not safe for work: an extension question. You know that teledildonics patent has expired? I did see that, yeah. So you mentioned having multiple people using the robot at the same time. If you have one person agreeing and one person disagreeing, does it sort of nod and shake at the same time? So I think it's conceptually quite complicated if more than one person is in it, because it makes much more sense if one person embodies themselves in a thing. And actually, my colleagues are now quite keen to make their own ones, so what we might end up doing is really saving on corporate infrastructure by being in a cupboard together, or something, I don't know. From a software point of view, it makes sense to all be in there, but yes, you're completely right. It could be very controversial, as you saw; you couldn't tell. How do your colleagues actually react to being in a meeting with it? Particularly, I guess, if there's something a bit more complicated on the agenda, not routine. So I tend to use it for three specific meetings. One is a kind of show-and-tell. What we have to do there is just warn external speakers that it might move, because you don't want to freak people out when they're talking, and it's just a bit rude. But it's really good to have it there, and it's a talking point as well. The best meetings are where we're around a relatively circular table and I'm a participant in the meeting, as the lamp.
And then it's really interesting how quickly people start behaving like it's a person, you know, turning to it, expressing themselves to it. It's really, really interesting. So, yeah, like that. No more questions? So, that's it. Thank you very much, Libby, for explaining about the new Libbybot. Thank you.