Welcome back, everybody. We're going to get going. Next up we have Mike Ries, who is one of the locals here and the organizer of the Utah Valley meetup, because after Brandon bailed and went to Austin, I didn't want to do it again, so we made Michael do it. But Michael is a good friend of mine. He lives in my neighborhood. We go to church together. We're home teaching companions. It's pretty awesome. It's pretty epic. So there's no nepotism involved at all in Michael speaking. Actually, Michael was our alternate at first, and then it was quickly, oh, can you do the full 45 minutes, give a whole session? And he's like, sure, I have no stress at work, it's totally fine. So, Mike, I'm really looking forward to this. This is going to be fun. It's all about robots and junk like that.

All right. It's always good to be known as the bee sting. I'll try and carry that tradition on. So as Mike talked about, today I really just want to talk about robots and fun stuff. I really hope that no one gets anything useful out of this presentation at all. You shouldn't be able to tell your boss about anything you learn during the next 45 minutes, and I'll make that as true as I can. So, yeah, stuff like this. There's going to be lots of this.

Quick introduction. Like Mike said, I'm Mike Ries. I go by mmmries pretty much everywhere that I happen to be on the internet. I work at MX. But my real calling in life is making babies that have ridiculously good hair. I don't have any stickers, but if you want to come swap some pictures of babies, just be prepared to be embarrassed, because my babies are really good looking. So, this is Winston. He takes after me. Thank goodness one of them did.

So, I'm really happy to be here at the last Mountain West RubyConf. I've gotten to speak here once before. And it is a bittersweet thing for me, as it is for many of us here. I was remembering the other day the first time I attended a URUG meetup.
I was still a student at BYU at the time. And I remember, like the week before, I had just gotten Ubuntu running on my ThinkPad laptop. And I thought that I would be one of the cool kids when I showed up at a URUG event with the Ubuntu laptop. And instead, I got there and everyone had a Mac, and I felt so insecure about my hardware. I didn't know it, but I was just a hipster ahead of my time. Now it's all about overthrowing our corporate overlords, and using Ubuntu would be cool again. But I was too far ahead.

And specifically within this URUG universe, I really want to call out Mike. As he said, we have had plenty of time together. I've never gotten to cry into his shoulder like Brandon mentioned, but I'm actively looking for a reason to start crying, if anyone wants to suggest one. And some of you may not know, but Mike is a secret millennial. I suspect that he's actually made up of a group of Little Rascals hiding underneath the trench coat. Or perhaps Mike would prefer the metaphor of a group of ponies hiding underneath the trench coat. If you don't know, this is actually why Mike loves to troll everyone so much: because he's actually just a kid, and so he likes to make fun of all the rest of us adults for the adult things that we do. So that's pretty much all you need to know about him.

And also I wanted to call out really quick, I don't know if anyone else noticed this, but Tenderlove at this conference has been looking really buttoned up. He's looking almost enterprise-y, dare I say. Like, wearing a jacket and a button-up shirt. I was thinking that maybe we'd get the weird-hat Tenderlove. Or perhaps the wizard-in-a-bathrobe Tenderlove. Or maybe colonial Tenderlove. Or I was really, really hoping to get the Unabomber Tenderlove. And none of those things really panned out. And so it made me think: he sort of looks kind of like a professor.
And so I'm going to propose that for the remainder of this last Mountain West RubyConf, he should be known as Professorlove. And that was my best attempt at an emoji representation of that name.

All right. So as I talked about at the beginning, when I was thinking about this talk, and when Mike was like, hey, can you do a 20 minute talk? Hey, just kidding, can you do a 30 minute talk? Hey, just kidding, can you do a 45 minute talk? I was thinking about the useful and the fun aspects of what I wanted to talk about. And then I just threw away the useful part and decided this should just be about fun. And this mattered to me because, similar to one of our earlier talks from Jamison, I've gone through some ups and downs in my career: times when I get home at night and I play hard with my kids, but as soon as they're asleep, I'm just like, boom, got to get on that GitHub, got to get some PRs going, pad those stats, and I'll just be really into it. And then a month will go by and really all I want to do is watch reruns of Psych. I've been on that roller coaster. And recently, one of the things that has been a really good way for me to reconnect with the fun of programming is robots.

A lot of people I know got into programming because they wanted to make video games. And that's a really fun experiment. But for me, I've always wanted to make robots. That's always what it was about for me. I wanted to make robots that did cool things like bust out some sweet dance moves. I wanted to make robots that would fight each other and stand up really awkwardly. I wanted to make robots that could make some sweet jams. Actually, I have to just comment, like, this robot in the top right here: what? I don't understand his job in this robot band, because he's not playing any instruments. I thought maybe he's a backup dancer, but it doesn't move. It just has giant saw blades on its head. Maybe that's all you need to do to be in a robot band.
It seems like this might be the Nickelback of robot bands. It'd also be cool if I could make a robot that would just raise its arms but somehow look really scary and intimidating in the process of doing that. And finally, if one day I could make a chef bot, and it would make some sweet farm-to-table health Oreos, very finely diced, and really be dedicated to that job, it would just bring a lot of happiness into my life. So I'm not there yet. Chef Oreo is not quite ready for demo day.

But also, as I was preparing this talk, and actually just earlier today, I heard from one of the other attendees about a place nearby here. There's a little museum just down the street called God Hates Robots. A little intimidating. Maybe I should have thought of this before I decided to give my talk on robots. For anyone who's from out of town, though, actually this other place right here, Tin Angel: super good food. Pro tip, go eat dinner there tonight and save room for panna cotta at the end. But yeah, I didn't realize this was like a thing. I really didn't know that. It really throws a wrench into my understanding of things, and I'm feeling pretty intimidated. So hopefully not very many of you have similar feelings about robots in this crowd.

So, the robot that I set out to build. The main goal that I had for my robot was I wanted it to be a robot that was built to interact with people. A lot of the robots that I had experimented with in the past tried to accomplish some sort of practical job, like vacuuming a floor or carrying something from one place to another. And they tried to do that with the minimal amount of interference from humans possible. This time, that's not the robot I wanted. I wanted a robot that would feel like R2-D2. When it ran into a good guy, it would make happy beeps at them and maybe throw a lightsaber out of its head.
And if it met a bad person, it would somehow know that, and kind of back up and make little nervous beeps or something like that. This is also not totally in the realm of feasibility yet, along with Oreo Bot. But this was what I was setting out to do.

So I began by asking myself the question: as a Rubyist, what are the skills that I can fall back on? What are the skills that we as a community have acquired over our history? Right off the bat, Rubyists are really good at taking things and gluing them together. We may not understand how all the pieces actually work. We may not be worrying about the failure scenarios that they introduce. We just glue them together and make mashup websites. This is something we're really good at, so that seemed like a good place to start.

And for me, what this meant is I didn't want to get bogged down in the details of, like, how many things am I going to have to order off of SparkFun and Adafruit before I actually have something that moves? I didn't want to spend a whole lot of time buying resistors and capacitors and doing soldering. So I found this amazing platform, and I highly recommend it if you are someone who wants to play with robots. The company that makes Roombas sells this product called the iRobot Create. They're on the second version of it. As you can see, it comes with batteries included and motors already set up. It has infrared sensors, bump sensors, cliff sensors, wheel encoders, buttons. It can even tell you the charging state of the battery and how full it is. This is a whole bunch of stuff that, if you tried to build it from scratch, would just take several months of learning. And that wasn't the part of the project I wanted to work on. I wanted to work on its behavior. So this was a good place to get started.

And also, I didn't really want to go learn more C. C is a great language, and I'm sure it would do a magnificent job of running this robot.
But it would just be a lot of new tools and a lot of new gotchas. A great thing about projects like the Raspberry Pi is that it has an SSH daemon that runs when you first boot it up. I know how to do that. I know how to SSH to things. I know how Linux processes work and how they can communicate. This just does all of those things. It's sort of like having your own mini cloud on a card. So that is loads and loads of fun and saves an enormous amount of time.

And then I thought some more about what it is that Rubyists are really good at. This is probably a good time to note that I decided to open a GIF consultancy. If you need the next great GIF for your presentation, I'm available for hire immediately. And since we're very good at grabbing snippets of code off of Stack Overflow and mindlessly copying and pasting them into our projects, that seemed like a good place to go as well.

So I went to RubyGems. One of the really hard things about a robot like this is having the concept of people. Like, how does a robot know if there's a person around? That's a hard problem to solve. There's a great set of libraries, collectively known as OpenCV, that do a whole bunch of computer vision sorts of things. But it's written in C, for performance reasons and for historical reasons. And I don't want to go and learn all of computer vision theory. And I don't have to: I can use spyglass. Thank goodness for André Medeiros and the amazing Ruby community that we have.

As an example here, this is pulled from the examples directory on the GitHub page for this project. You can see that basically we create a new video capture, and then we use the reverse shovel operator (I don't know if there's a proper name for this, a reverse chevron or something) to pull frames out of the video capture device into a variable. And then you can pass them into a cascade classifier. Don't know what that thing is. Don't understand how it works. I'm a Rubyist.
And so I just steal this code. And it does work, and it does things, and that's good.

So, another thing that Rubyists are good at: apparently, sitting awkwardly on a cloud. I really don't understand what this little blue person is doing, but it seems somehow inappropriate. Anyway, I thought, okay, this will help me to look for faces. But I have no way of knowing which faces are the Jedi and which faces are the Empire, and I needed some way to differentiate the faces that I was seeing. And luckily, the Google Cloud Vision API got announced during the prep for this talk. And thank goodness that it did; that's why I was able to add this feature. Good job. Thank you very much, Mike Moore. I will steal more of your code shamelessly on stage.

This is a really fun API to play around with, if you haven't played around with it yet. It'll do OCR to pull text out of images for you. It'll also look for taggable entities, so it'll notice sailboats and dogs and kittens. Probably dinosaurs, but I haven't tested it yet. Aja probably already knows. It does do dinosaurs, according to Aja. That's good. And it also does facial analysis, like a sentiment analysis of the faces it sees. So I still can't tell the difference between a stormtrooper who's really happy that they're about to destroy my robot and a Jedi who's really happy because they understand the Force. But as long as you're happy, I know generally some way that I can respond to that.

So, the next thing that Rubyists are very good at, another skill that I can leverage: having public opinions. Just as a hypothetical example, some people in this room might have opinions about the difference between minitest and RSpec. Hypothetically speaking. And some of us might also have opinions about things like RVM versus rbenv, or Vim versus Emacs versus other editors. This seems to be something that we're very good at as a community.
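Since the Cloud Vision piece is the one part of this that people can poke at from home, here is a rough sketch of what one of those calls looks like on the wire. In the talk this goes through Mike Moore's gcloud gem; the hand-rolled version below just builds the JSON body for the `images:annotate` REST endpoint. The feature type names are the API's; the helper name and the placeholder bytes are mine.

```ruby
require "json"
require "base64"

# Build the JSON body for a Cloud Vision images:annotate request that asks
# for face (sentiment) analysis plus entity labels on a single image.
def annotate_request(image_bytes)
  {
    requests: [{
      image:    { content: Base64.strict_encode64(image_bytes) },
      features: [
        { type: "FACE_DETECTION",  maxResults: 4 },
        { type: "LABEL_DETECTION", maxResults: 10 }
      ]
    }]
  }.to_json
end

# The interesting bits come back under faceAnnotations: likelihood fields
# such as joyLikelihood and sorrowLikelihood, which is what the robot keys on.
body = annotate_request("pretend these are JPEG bytes from the webcam")
```

POSTing that body to the `v1/images:annotate` endpoint with an API key is roughly all there is to it; the gem mostly saves you the auth and retry plumbing.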
So I thought, you know, beeping and sort of moving around, those are really great, intuitive ways to have an interaction with the robot. But it would be even better if my robot was more like a Rubyist. So it is. It has a Twitter account, and it will tweet its somewhat trolling opinions of the people that it sees. And just a minor note here: I do have a good friend named Chris in the audience, and I'm pretty sure that if I ever posted a picture of him on any sort of social media, it would have to be signed with a GPG key first. And so it will only tweet pictures of faces that are very close to the camera. If it scans over the room, it's not going to be magically getting all your faces. The webcam that it uses is terrible and can't actually see almost anything beyond a few feet. But if you get real close, like Tyler did, then you get a pretty sweet picture with some sort of opinion from a robot posted about you publicly. And that makes you even more of a Rubyist.

So the current version of FriendlyBot looks something like this. Unsurprisingly, it looks like a bunch of stuff that got glued and taped together by an amateur, because it is. The basics here: you can see the Raspberry Pi is riding on the back, sitting on top of an external battery. It's totally possible to wire the Pi into the robot's battery, but then you have to do all sorts of level shifting with your power supply, and you have to worry about power surges. I don't want to worry about all that stuff, so I just buy another battery instead. It has a really bad webcam, like I mentioned, on the front of it.

And at this point it's probably worth just pausing for a second, before my inevitable failure of a demo, and pointing out that I just talked about, like, six small things, and none of them were super complex, and none of them were super in depth. And all of them were a complete blast.
The first time that I actually sent a binary command to this Roomba was at about two in the morning, when all of my kids were asleep. And I didn't know what the unit of measure was for the speed parameter, and so I accidentally told it to drive half a meter per second. And it goes tearing across the kitchen floor and rips itself out of the plugs that I had wired into the back of it. So I can't send it the stop command, and it's running into things, and I'm positive that it's going to wake up everyone in my whole family. And when I finally wrestled it to the ground and unscrewed its battery to turn it off, I had this feeling of pure joy. A little bit different than debugging a production system, but a similar feeling nonetheless.

So, since this presentation at this point really can't get any worse, I figure it's time for the demo. Let me just do a quick recap of what needs to work in order for this demo to go correctly. A Raspberry Pi has to boot without its SD card randomly deciding to be weird and not booting anymore, so like a 10% chance that that goes right. It needs to be able to talk over a serial interface at 115,200 baud to a robot which is hopefully charged, and it needs to be able to connect to the Wi-Fi. Thanks, guys. And luckily we're using a hotspot. And then, if the Cloud Vision API is also up and relatively responsive, then it should be able to move and take pictures, and if the lighting is okay, then it will even get a decent picture and it will then tweet about it. So yes, many, many things can go wrong, and it will probably go something like this.

All right, so here is its Twitter account. I highly encourage all of you to go follow this Twitter account. Hopefully it will be tweeting here in just a minute. Yes, grounding, grounding, grounding. And, dear robot, please work, for the love. Like that. All right, I'm going to have it come this way so the light's on my face, because low light is the hardest thing for the camera. MWRC bot.
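As an aside, for anyone curious what that runaway binary command actually was: per the iRobot Open Interface PDF, Drive is opcode 137 followed by velocity and radius as two-byte signed big-endian integers, and the velocity unit turns out to be millimeters per second. The talk's real implementation is in Elixir (more on that in a minute); this is the same idea sketched in plain Ruby, with constant names of my own choosing.

```ruby
# Drive = opcode 137, then velocity (mm/s) and radius (mm) as signed
# big-endian 16-bit integers. The magic radius 0x8000 means "drive straight".
DRIVE    = 137
STRAIGHT = -32_768 # 0x8000 read as a signed 16-bit value

def drive_command(velocity_mm_s, radius_mm = STRAIGHT)
  [DRIVE].pack("C") + [velocity_mm_s, radius_mm].pack("s>s>")
end

# The late-night accident: a velocity of 500 is half a meter per second.
drive_command(500).bytes # => [137, 1, 244, 128, 0]
drive_command(0)         # velocity zero halts the wheels: the "please stop"

# Sensor data comes back the same way. The buttons packet is a single byte,
# one bit per button, most significant bit first.
def parse_buttons(byte)
  %i[clock schedule day hour minute dock spot clean]
    .zip(format("%08b", byte).chars.map(&:to_i))
    .to_h
end
```

Those bytes would go straight out over the serial port at 115,200 baud; nothing here is Roomba-specific beyond the opcode table in the PDF.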
Thank you, that extra light will actually help a lot. So it takes about a minute here for it to get all started up. It has to start a Ruby VM, an Elixir virtual machine, and a few other things, and boot up the camera. And there it goes. And so if I get close, it might play a sound. Try again. Try again, little robot. You can do it. Did the sound. Random shouting. This is not a Republican debate. What? Someone please help me find my robot later. All right, all right, Mike Moore is going to go rescue FriendlyBot. All right, that went way less terrible than I expected.

And as a result, I have a little bit of time where I can talk through the code of how this thing actually works. I promise not to go into too much detail, but we'll just kind of roll through this stuff. One more time. There you go.

All right, so the first thing, like I mentioned, that I had to do to get this thing to start moving around was send binary commands. There's a really great PDF that the iRobot folks have put out that shows the exact shape of all these commands. It uses standard encodings, like two-byte signed integers and things like that. And it turns out that Elixir and Erlang have a really great syntax for both parsing and generating that kind of data. And so I decided, again, because it's fun, to just write in another language, and Roombex was born. You can use that if you want to.

Here's an example of what the code looks like. This is parsing: I get back a single byte that represents the state of all the buttons on the top of the Roomba. There happen to be eight of them, and so I can just say, hey, bring in this one byte and pattern match it, and turn bit number zero into clock and bit number one into schedule. Just by setting the sizes, it'll figure out how that lines up, and it'll end up giving me back the state of all my buttons as zeros and ones. So that was a lot of fun.

Then I needed something that was kind of stateful, that could keep track of the state of the robot and remember: am I supposed to be driving right now, or am I supposed to be stopping so I can hopefully get a half-decent, focused picture? I named it DJ, of course, after our dear departed friend DJ Roomba. And so DJ contains the majority of the code in this project. An example of what DJ does: it receives updates about the sensors. It's checking the sensors 30 times a second, and as the sensor data comes in, it just does pattern matching to look for common scenarios, like this bump sensor being pressed or released and things like that, and then changes its driving command. In that way it'll just kind of shuffle itself slowly around. It's somewhat painfully slow, but hopefully that makes it easy for it to find faces in the crowd after my talk.

Then, as I mentioned, all the OpenCV and computer vision stuff was going to be done in Ruby, so I did all of that using Ruby. But I needed the two to talk to each other, because when the Ruby program sees a face, it needs to say, whoa, whoa, stop for a second, because this webcam is terrible and I need to be able to get a half-decent frame of someone's face so I can try to analyze it. I did this with just UDP, and so here I'm just listening for some little UDP messages like stop, go, seeing, and cancel.

Then I needed the code that would look for stuff and decide when to move. So here's a great piece of code that you can share with your team: global variables all over the place. And also, in true Rubyist fashion, I figured now's a good time to declare that local variables are dead. It's not something we should be doing anymore. Long-lived globals: if no one ever needs to read or understand your code, global variables are a lot of fun. So here's the main loop of all the Ruby code. Over and over, it just basically tells the wanderer to go (that's publishing the little UDP packet), then it looks for a face. This is a blocking call that's just
sitting and sampling frames. It's able to process about 10 frames a second, using only about half of one of the cores on the Raspberry Pi, which is better than I expected. Then, if it finds a face, or thinks it sees a good face in the image, it'll stop, and then it will try to pick a good frame. This step is just being more selective: looking for a somewhat more focused, and a little bit bigger, face in the frame. And if it finds one that matches those criteria, then it sings a little song and kicks off some asynchronous processing to do the analysis, because I don't want to wait for the internet to be available before I continue moving on and looking for new people. If it doesn't find something, then it plays a sad song. Some sad beats. You guys will probably hear those later in the lobby. And an example: once again, this is pretty much copied and pasted, as I mentioned, from the spyglass project. Thank goodness for the good work of André Medeiros. And then the analysis basically looks like this: I pass it to this analyzer object, the analyzer sends it to the Google cloud and gets back a result. It also checks for some additional blurriness, because my local check for blurriness is not very good. And if that all passes, then it decides to actually send it to Twitter, and it picks a pseudorandom sort of troll to send along with the image.

So, we talked about, and I've spent most of this presentation sort of trolling, our community. The things that we're good at are stealing people's code, using things without understanding them, et cetera, et cetera. But there are a few things that I actually think our community is totally outstanding at. A few examples of those are things like the Friday Hug. Celebrating our heroes: why the lucky stiff has already been mentioned here, but these are people who emphasized whimsy and playfulness and joy, rather than things like technical excellence and being smarter than the other people in the room. Things like Ruby Friends, an entire project and website whose only goal is to try to make groups of Rubyists a little less socially awkward, so we can find a way to take pictures with one another. And hopefully FriendlyBot can be another project in that vein, one that helps Rubyists have a reason to talk to one another at a conference.

And finally, this idea of MINASWAN. Tenderlove did not give us enough puns, so: this is a mini swan. But that's not what MINASWAN is about. It's "Matz is nice and so we are nice." And this is one of the things that drew me to the Ruby community very early on. This is a community of people who believe in being welcoming, to people who have been neckbearding for 30 years and to people who are in the middle of a boot camp and just learning to program. It's a community of people who believe that it's important to be nice and to be kind. And to me, that is why I'm here. That's why I continue to be a part of this community. I think Mountain West has done an amazing job of embodying those values over the last 10 years, and certainly in my own experience. You know, I went from being the person who was so intimidated to be bringing a PC to a Mac party at my first URUG, to being someone who was essentially coerced into giving a talk at a URUG, and then coerced some more times, and then eventually coerced into giving a talk at Mountain West RubyConf. And all along the way, I've been surrounded by good friends and good people who were kind and forgiving of me and of the times that I made mistakes. And I hope that, whatever is in our future as a community, whatever meetups and conferences we do this next year and in the years after, I hope that that remains at the core of what this community is and what we choose to do with our time and effort. And that's all I have. Thank you very much.

I shouldn't repeat it, because it's a troll, but I should, because it needs to be repeated for future humanity. The question was: did I get a lot of design input on the robot, or was it designed in a vacuum? Thank you for
bringing it in stronger. All right, any other questions? Yeah. So the question was, did I look at things like the Artoo framework. There's a really good robot-related framework for Ruby called Artoo. It's made by the same people who make Cylon.js and, I believe, Gobot for Go. It's a really cool framework. I did look at it, but, despite the fact that I poked a lot of fun at not wanting to get into the resistors-and-capacitors part of the whole problem, I did like the idea of getting into what these bits and bytes look like as they fly over the wire. So I was kind of excited about writing that part of the code, and since it fit my fancy, I just figured I'd write it myself.

Oh, thank you. Oh, there should have been another one earlier, too. Here we go, here we go. Look at its legs. It's still trying. All right, fall.

All right, so the question is: there is limited time, we have families, we have other things. How do I try to time-box or find time for doing these kinds of projects? I'd say that I'm in a big way a victim of my own interests. I'm somewhat like Aja's talk, about feeling like a ferret. This just happened to hit me at a time when I looked around and I had always wanted to build a robot, but in previous times when I had looked around, the raw materials that I had to start from were two layers of abstraction lower, and when I looked at it, I thought, oh, that's a thousand-hour project, and there's no way I'm going to get through all of it. This was a time when I looked at it and thought, I can start from this Roomba thing, I can start from a Raspberry Pi. I think, all said and done, this is probably about 90 to 100 hours of coding time and debugging and playing around with it. It's still a significant effort. The only real way that I organize that is, once a week, me and my wife plan out our week, and she tells me which nights she's going to go to dinner with a friend or go to a pottery class, and I tell her which nights are programming nights. And that just sets the expectation that, once the kids are in bed, I'm probably going to be a terrible husband until the morning. So we just plan it out, and that way we know what the expectations are. That's all I got.

Oh, here we go. This one's really good. You've got to wait for this one; there's a little bit of a buildup. Clear shot. Samba, samba, samba. And then... so good. The little hip shake is what gets me.

So the question was about the image quality. It's just because I'm a total cheapskate, so I bought like an $18 webcam off of Amazon. If it was using something even halfway decent, I wouldn't have nearly as much of a problem with white balance and being able to focus quickly and things like that. No, yeah, no, it wasn't an issue at the operating system level or the Raspberry Pi level. It was just the hardware of the camera that I was using. Yeah, so it actually works a lot better if you stick the laptop on top of it; it just looks a lot less friendly.

Oh: gif, gif. Thank you. Oh man, this one's good. Simple, but it's good. I wish we could see the buildup to this, because I don't know what this challenge is, but clearly it failed. Brandon, did you have something? I would take a strong bet that on this Wi-Fi it would be limited by your bandwidth. When I tested it from a good internet location, it was still taking around a second and a half. You are still sending an entire image, which is a lot bigger than a normal JSON payload, and the analysis it's doing is much more complicated than just checking equality or things like that. So, I mean, it doesn't take a long time, but I knew that I was going to be in a somewhat questionable Wi-Fi environment when I demoed, so I planned on the fact that sometimes those asynchronous jobs just fail. And that's fine: just let them fail and leave an image sitting around on the Raspberry Pi.

So the question is, did I look
into running the Ruby code sort of bare metal, without an operating system. So that'd be something like a microkernel, or something like that, is what you're imagining. I didn't really look into that. Like I said, this was definitely just a case of: I know I'm not going to get into all the details, so I'll just pick and choose which details I find interesting. So I just didn't really play around with any sort of tuning, or with running microkernels. No.

Oh, do I have another gif? You guys are asking a lot of questions. Oh man, I'm out. Hold on, we'll just go back to this one. This one is so good. All right, enjoy. I'll just keep answering questions, but you guys can just enjoy that.

So, what are my plans for the future with FriendlyBot? A few things. I definitely am going to let it have some time just roaming around my house and scaring the heck out of my little baby, like little baby Winston you saw at the beginning. He's not a big fan of FriendlyBot. But this is how he gets to know it; until they acclimate, they're just going to feel that way. So I'm probably going to scare my children a little bit. I would actually really like to add on the ability to recognize faces, when it sees the same face many, many times. And I would also really like to get to the point where I have some sort of locomotion that can get up and down stairs. And I thought it'd be cool if it sort of logged activity, like who spends the most amount of time in various parts of the house. I don't know why that metric appeals to me. I'm a random human being, but for some reason that metric does. Like, I'm curious: do we even use that one bedroom in the corner of the upstairs? And if it noticed when it saw faces heading in and out of that area, then maybe I would have a metric for it, instead of just telling my wife, no, we shouldn't buy a chair for that bedroom.

My favorite show featuring a giant robot? I don't know. I feel like I'm really failing you by not having an answer to that question, but I don't really watch any shows with giant robots. I don't know, ask Mike Moore. I know he watches really terrible movies, like Transformers and Pacific Rim, so I'm sure he'll have an opinion. Does anyone else want to ask a question where I can make fun of Mike Moore as part of my answer?

All right, well, FriendlyBot will be hanging out over here by the table in this little lounge area. Like I said, it's not going to creep on people or try to stalk you. It's just going to be randomly roaming around, and if you happen to crouch down by it, it'll take a picture of you and tweet about it, and then you can retweet it to all of your friends. Or maybe get another Rubyist to take a picture with you. Maybe we can even get some synergy with Aaron Patterson's selfie-stick technology and put it on top of the Roomba. The possibilities are endless. But it will be roaming around out there, and if you're interested in robots, come and talk to me. Thanks, everyone.