One other footnote — can you hear me okay in the back? One other kind of footnote to history: there was a good couple of days there where we were trying to figure out what to name the project. It started out as iOS Auto, so this could have been the iOS Auto conference. I figured that was too limiting — that was the original name for the thing. Then we went through a week or two where it was AppleCart, so this almost could have been the AppleCart conference. We actually read through the trademark guidance Apple puts out, and basically, if you put the word Apple anywhere in your product, we'll see you in court — so yeah, let's not do that. So we renamed it to Appium, and the rest is history. Welcome to the not-AppleCart conference.

There's a running joke in pretty much all of my demos — I'll probably be doing this to my dying day — which is that Angry Birds makes an appearance. I call it my Hello World. If you're gonna make a robot interact with a mobile device, you have to do a whole bunch of stuff, and Angry Birds is a proper video game — it's hard to test. It's actually a relatively simple game, it's just the first level, but it flushes out a whole bunch of technical issues.

Anyway, I'll come back to this at the end, but there's this cool little feature in iOS 13 that just came out like last week, so I had to rework part of my presentation because I wanted to show you some cool stuff I did with it. This is a USB mouse, and this is a new feature: this is running iOS 13, so it's only available as a beta, and it'll be available in the fall. I know — you want me to save the punchline for the end. There was also a lightning talk yesterday, building up the suspense. Manual testing is really important. Even with my demos, even though I'm trying to trick them out with crazy automation, there's a certain argument that if you can't even test it manually, then why even bother with the robot? So once I proved it could work as a manual mouse, then you figure out how to replace the hand with a robot or something like that. Anyway, if lightning strikes and nothing happens, you can say, hey, I showed you something kind of cool here. Ooh, one star — I need to get better at that. So if the demo gods are not kind... yikes! Let's see, how do I get out of here? That's super annoying. Let's kill that. Here we go.

So, on to the main attraction. I changed the talk title — it used to be World Domination Plans, version two or something, but I modified it, for reasons, to Don't Fear the Robot. This is my argument to you: don't worry about robots taking your jobs; this stuff is super complex. So thanks for having me. This is me outside my office. I took this picture around January 15th in Chicago. It gets really cold in the winter — I think we actually had the coldest day ever on record this year. One cool thing: I forget what it is in Celsius, but there's a point where it gets so cold that Celsius and Fahrenheit converge — once it's around negative 40 or so, it's the same number on both scales, whatever. But the cool thing is, when it's this cold you can walk around with swimming goggles and no one will even notice, right? Because I did.
Anyway, I only live a block and a half away from my office, but you should come visit us — we're in Oak Park, just west of the city. Luckily I don't have to wear this setup to the office anymore; it's nice and warm right now. Also, I'm @hugs on Twitter, so follow me, heckle me throughout the talk or afterwards. It makes sense because my last name is Huggins, so: hugs. Although I'm actually more of a fist-bump person, not much of a hugger. So, anyway.

There's this thing that I've kind of noticed. Does anyone know about the curse of Cassandra, from Greek mythology? All right, two people. So, the Greek curse of Cassandra: Cassandra was given the ability to predict the future, but the curse — it's there on the bottom — is that Cassandra was cursed to utter prophecies that were true but that no one believed. Sometimes I feel like I'm Cassandra. So I'll go through my greatest hits of future predictions that no one believed at the time.

Selenium, 2004: everyone thought actually using JavaScript in your applications was a career-limiting move. I looked this up — if you go look at the first release date, everything changed about six months later when Google Maps first came out, and that was crazy JavaScript. It actually came out in 2005; Google acquired the company that made Google Maps in 2004, but 2005 was when they first shipped it as a Google release. After that, all of a sudden everyone was trying to make crazy web applications with tons and tons of JavaScript. So Selenium was just a couple of months early to ride that Ajax, Web 2.0, whatever-you-want-to-call-it wave. The big question went from "you shouldn't do that" to "oh hey, let's do that — now how do we test all this JavaScript?"

Fast forward to 2008. The big idea behind founding Sauce Labs was: we're gonna do everything in the cloud — take everything from the desktop and put it in the cloud. We founded it in the summer of 2008 and we didn't get any funding; we didn't get any traction with investors until a full year later. I was about a week away from quitting and trying to get my old job back, because I'd gone a year without salary and we weren't getting interest from any investor anywhere. We were meeting with all the usual suspects on Sand Hill Road in the Bay Area and no one was interested — especially in developer tools. Who's interested in developer tools? There's no market there. So that was another time where I really thought there was something to this cloud thing, but no one believed me.

Going to 2012, things were getting a little easier. At this point the iPhone and Android were obviously a thing. The particular flavor of "no one believing us" was: there were tons of other tools out there, so why Appium, right? I'll come back to this at the end, but there's a core philosophy in Appium that was — is — different from all the other tools out there. At the time, though, how would anyone know to believe us versus anybody else about whether you should use this tool over all the others?

And then fast forward a couple of years to 2015. That's when I formally incorporated Tapster — that's my current gig. And I would argue that even — especially, actually — with the committers on this project and the Selenium committers, this robot thing is a tough sell.
I'm totally having fun geeking out and everyone kind of appreciates that, but it's like, okay, that's great — and then back to work, right? No robots. I would argue that, just like all those other things, robots are definitely in our future, and your future — or could be. But right now I'm still going through that phase where everyone just thinks I'm crazy. At least now Cassandra and I can hang out and commiserate over our beers.

Anyway, all those years ago, the way I would describe Selenium is that it's like a robot — a virtual robot. You tell it to go open a URL, type text into fields, click buttons, things like that. I would often use this image, almost as a joke, as a metaphor for what Selenium is. I can't find the actual original source of this robot; when I do a Google image search for it now, all I find are my own presentations about it. I think it was a research project in Japan, I think it was called F-O-X-X, but I can't find anything. So if you know somebody who actually worked on this project, let me know. It's got some cameras and robot arms up here — like, oh yeah, that's kind of what testing is gonna be.

But now I would say it actually is a robot, and part of the reason I've got all these things is that at some point I want to test everything. I even wanted to test my metaphors and see if "it's like a robot" is actually a thing. So this is the latest version of my robot that we've got cooking in the kitchen. And of course there's a theme as far as what it's automating. The funny thing is, don't ask me to play the next level — I only ever need to play the first level. The other thing about this is that there's a frame the phone can get locked into. I've got other little actuators that can hit volume up and volume down, and it's got these little movable pads so it can support pretty much any phone. Anyway, I've been working on this; this is the latest one, all these years into it.

The other metaphor — I don't know where that interference is coming from — the other metaphor I'd use for what I'm doing, both with my art projects and my personal life and also professionally, is that I'm doing the reverse Tron. If you're not familiar with the movie Tron, it was about people getting sucked into the computer and getting trapped in a video game. What I call the reverse Tron is taking people or things — robots, virtual robots inside the computer — and bringing them out into the real world. I've got various art projects; Tapster is actually a spin-off of one where I was trying to take an animated display and make a physical 3D version of it. And I've noticed a similar thing with Tapster: it's basically the reverse-Tron version of Selenium.

This was the very first robot that I made. Oddly, I showed it off at the first Jenkins conference in 2011. Somebody called this embarrassment-driven development. I actually put it in the talk proposal: I'm going to make a robot play Angry Birds. I hadn't made a robot play Angry Birds at that point — I'd been wanting to do it, so I used the conference as a way to force myself to do it. And of course, if I didn't get it done, I'd be highly embarrassed. Anyway, embarrassment-driven development — it's fun. I still don't know what it actually had to do with Jenkins, but it was unveiled there in October 2011. And this is the actual demo.
I apologize for the blurriness of this — I was recording it about 10 or 15 minutes before I actually got on stage, and it had only been working for a minute or so at that point. Again, one star — it's not a very good robot. Some folks saw the YouTube version of this and said, actually, your robot's kind of lame and it's really slow. So I took that as useful feedback, and over the years I've been trying to improve it, again using Angry Birds as my muse, if you will, for how to improve things.

This is a couple of generations later. This was also my side thing at Sauce — I was still full time at Sauce at this point, and this was just an interesting thing to work on on the side. Again, still Angry Birds; it's getting better at it, and it's a lot faster. This style of robot is called a delta robot. It was invented by a Swiss engineer in the 80s, and — no joke — it was invented to pick chocolates up off an assembly line and put them in boxes. Everything about making the chocolate was automated except that last part, putting the chocolates in the boxes, so he came up with this design to do it as the conveyor belt comes by. A delta-style robot is arguably the fastest kind of design. You can go read the now-expired patent about it. The math is crazy, ridiculous — I barely understand it — but luckily people have implemented math library support so you don't have to worry about it. Anyway, delta robots are kind of cool, and that explains the huge jump in how the robot looks; mostly I was just going for a faster design.

Something interesting happened after I gave that presentation — that first robot went by a different name at the time. It got picked up in Popular Science, it led to some interviews at other places, and then things got really weird. What I didn't realize is that there are secret robot labs literally everywhere. Anybody making a phone — all the usual suspects, Samsung, Apple, you name it — anybody making a physical product has a secret robot lab. Telecoms too, everybody. Keyword: secret. We don't know about them, and if you work at one you're not allowed to talk about it, right? Occasionally there are some headlines — you can probably go Google for them. There's one that hit the US press: a lawsuit, and actually criminal charges, where T-Mobile was the aggrieved party and another company was stealing their designs. Their robot was called Tappy, the tapping robot, and they used it for testing phones. Anyway, secret robot labs are actually a big thing, and I only know this because after I put my first robot out there, they came to find me. I don't know how to find the secret robot labs; they have to, you know, self-identify.

So I had lots of interesting conversations, and this whole thing went from a silly hobby — just getting robots out of my system while I kept working on the software side — to, oh, maybe there's something to this robot thing. Because I'm actually the first skeptic here: could playing with robots really be your day job? I don't know — especially around 2012, I didn't think so.

Let's back up. So, why robots? What's my argument for it, and especially why now, and why should you care? The big thing I believe is that testing is getting really weird.
We already kind of know that. I think only in retrospect do we realize we had it really great for a couple of decades there: desktop testing was just keyboard, video, mouse. That's all you needed and you were good to go. It's amazing how complex automation is even given just that, but now it's even harder when you have testing in your refrigerator and medical devices and cars and all kinds of stuff. So over the years I've been collecting examples. Some of them are the secret-robot-lab stuff I can't talk about publicly; others are just floating out there.

This is arguably my favorite video of a test automation scenario ever, ever. It solves the problem brilliantly, and in a super creative way. This was made by Tyro — they do software development consulting — and they're testing mobile NFC payments. So the phone just goes around on the track, they had three different kinds of payments, and there's a little smart card along for the ride; they were doing some kind of implementation work and they just had this running on a loop. That's how they do their test automation. I figure this is brilliant, right? Especially for me — I've almost meditated on this, because I'm so prone to overcomplicating things, and this is just beautiful in its simplicity. One trip to the toy store for a train set and, boom: mobile payment test automation, done and done.

A long time ago my coworkers asked me, hey, what are you going to do with this robot thing? I didn't know, but I figured, let's just keep asking people and getting more people involved. We held a workshop in New York City to build these Tapster robots, and there were some folks there, I think from RGA at the time, who were working on the iPhone version of the Nike FuelBand software. They were there to build robots, and the intention was — you can see on the robot itself that there's an outline for where a phone goes, right? So I figured, that's how you do this; this is how you do robot mobile testing: put the phone there. The problem they had, and this blew my mind: when you're testing an accelerometer — a pedometer, sorry — for counting the number of steps you take during the day, how do you know your software is actually working unless you actually shake the FuelBand or the Fitbit or whatever? So they used Tapster to literally shake the thing, as an end-to-end integration test. I felt like, okay, this is one of the early examples of testing getting weird.

This example is from Mercedes. It started in Sunnyvale; the project is actually now based here in Bangalore — they moved it here. This was the project where I decided to go from hobby to full time, actually incorporating as Tapster Robotics, Incorporated, because they asked for ten robots. At that point I realized, okay, if a big car company is asking for this stuff, I need to treat it as seriously as they are. I mean, I was treating it seriously, but okay — maybe this is the thing. Anyway, this is their fancy, glossy demo of the interaction they're testing. You can see in a couple of seconds here what this is: this person has a phone, so they need a robot to test the phone. But the end-to-end interaction is a self-parking feature controlled from the phone.
If you notice, right there on the screen, the person's just drawing a circle. That's all they're doing. It's kind of like a dead man's switch. You maybe get a better view here — unfortunately they've got a fancy car but apparently couldn't get a wider parking spot. Again, just drawing a circle. What it ends up being is: you're not actually driving the car, but you're telling the car that you're still there, still paying attention, within range, things like that. And they didn't have a way to automate that. The other thing, given the car industry specifically: there's a requirement, either legally or policy-wise, to test everything literally as exactly and closely as a human would interact with the system. You can't run it through a simulator; you can't even use the USB cable. It literally has to be the standard App Store app, and you have to interact with it in a very non-intrusive way. That's a keyword. Sometimes, if you have requirements, legal or policy-wise, to not go through the instrumentation, you have to do it exactly the way a user would. Robots are sometimes the answer.

And this is the Tapster version of it. Okay, let's try this out. I randomly post videos to YouTube — usually there's context, like the Mercedes thing, but I couldn't talk about that at the time. I think right around the one-minute mark I actually do my demo of a circle. Everybody's favorite: a circle. So that was the test automation thing they couldn't do otherwise. And counter-clockwise. The requirements were ridiculous: clockwise, counter-clockwise, and also starting anywhere on the circle. And implementing it was not so easy, because I had to plot all the points in a circle — you can't just tell a screen "draw a circle"; I actually had to animate through all the points (there's a rough sketch of that idea below). I also had to make a couple of customized versions of this, for the large-type versions and so on.

Okay, let's... And then for a different client, they were more focused on the server side of things, but they needed two phones texting each other to generate traffic — Facebook chatting or text messaging or something like that. If you're the server and you want to see what happens when two things are talking to each other, well, you need two phones. And they also had the requirement that you can't modify the phone. In this particular demo I made for them, I thought seeing two robots text back and forth to each other saying "yo", "hey", "what's up" was kind of boring, so I figured the moral equivalent of that is GarageBand, because why not, right? Keyboards — robots hitting the piano keys. The other requirement was that they wanted one computer to orchestrate multiple robots; orchestration was the other big question they had. So I figured if I could get a small little duet going, that would win the day. Everyone usually hates this song; I really hate this song now. This was like 3 a.m. in my basement, trying to keep it quiet so I wouldn't wake anybody else, before I had to leave for the airport like two hours later. So, fun. Other versions came later — I think it's actually the same client for this one — and I upped my audio game: no more Heart and Soul.
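As a rough illustration of that circle-drawing point — this is not the actual Tapster or Mercedes code; the function name and parameters here are made up for the sketch — plotting the points of a circle so a robot (or a fake mouse) can step through them might look something like this in Python:

```python
import math

def circle_path(cx, cy, radius, start_deg=0.0, clockwise=True, steps=120):
    """Generate (x, y) screen points approximating a circle.

    The robot can't "draw a circle" in one move; it has to be stepped
    through each point along the arc, starting at start_deg and going
    clockwise or counter-clockwise.
    """
    direction = -1.0 if clockwise else 1.0
    points = []
    for i in range(steps + 1):
        # Angle for this step, offset by the requested starting position.
        angle = math.radians(start_deg) + direction * (2 * math.pi * i / steps)
        x = cx + radius * math.cos(angle)
        y = cy + radius * math.sin(angle)
        points.append((x, y))
    return points

# Example: a clockwise circle centered at (200, 300), starting at the 90-degree mark.
for x, y in circle_path(200, 300, 80, start_deg=90, clockwise=True, steps=36):
    print(round(x, 1), round(y, 1))
```

Each (x, y) pair then becomes one small move command for the stylus or pointer, which is why the direction and the starting point on the circle matter so much in the requirements.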
The particular requirement was to get to the Android bootloader screen, again without doing it through a USB cable. So this demo is starting a phone from a cold start, getting to the Android bootloader screen, and turning the phone on. If you go back and watch 2001 — the landing-on-the-moon sequence — I was kind of going for that vibe: nice, graceful robots moving through space. But the feedback I got on this one was, again, too slow.

Side note to the side note: there are these famous interviews with Jeff Bezos from Amazon, and he says, don't ask me what I think is going to change in the future; the thing he likes to think about is what's not going to change. He says people will always want things to be faster, and always want things to be cheaper — and there's a third one, I forgot it, I'll come back to it. But faster was the key thing. Oh, right: more selection, more availability — you don't want to be limited in your choices. So the feedback I got from this was that it's too slow, and that was a common theme. That was a problem with Selenium, and even with Appium in the early days — everyone always complains about the speed. Oh, other side note: there's something I've observed in my career. Given a choice — go ask your developers — have them choose between slow-and-correct or fast-and-wrong, and 100% of the time they'll choose fast-and-wrong. That's just an observation. I would argue — I try to argue, but I always fail — that Selenium actually goes at the speed of your users, and it's slow and it's correct, but no one cares; they just want fast.

So with the latest version of this thing, I figured I'd murder the "it's slow" argument. I clocked this at about 800 taps per minute. So I figured, is this good enough for you? And they're like, yeah, okay. It was also nice that I did some research on what would happen if I had this thing play music. I think if you look at the entire catalogue of human music, the fastest songs ever — in beats per minute, before it just sounds like a solid tone — are somewhere under 800 beats per minute. So theoretically I've got a good base for playing the fastest song ever. It's probably gonna be some kind of speed-metal song, maybe coming at a future Appium conference.

So there are various kinds of "so what?" factors to this. That was one of the things I picked up when I worked at ThoughtWorks, where the Selenium project came from — there was a person in sales who was famous for always saying, that's a great demo, but so what? Why is anyone gonna care about this, or buy it, or whatever? Another aspect of this is where to start. I'm self-taught in this stuff — I studied software in college, but I did not study robotics. I mostly started from an art project that I'm still working on, where I picked up the electronics and mechanical sides of things. So all these years later, you're looking at where I am after 20 years of stumbling through learning this stuff on my own. It would have been nice to have gone through school and learned it much faster. What I would hope you take from this is that you might be inspired to go start playing around with toys and robots and start tackling those weird testing scenarios. So where would you start today?
I've got a couple of these boards up here, different versions of them, and it's actually not too expensive to just start with an Arduino Uno — it's about $22, about 1,500 rupees. Something very closely related is the Raspberry Pi, which was shown off in Jonathan's keynote yesterday. The Raspberry Pi is a lot more capable and you can do more with it, but there are some nice things about the Arduino, especially if you want very precise timing, like driving stepper motors — for that, a small microcontroller is better than the big operating system you have on a Raspberry Pi. Anyway, pros and cons, but not too expensive.

The other thing to think about with robotics is that there are three things you need to know. There's the software side — and arguably, if you're here, you either know software or you're on the path and have access to resources to learn it. Then there are the other two: there's learning electronics, for triggering things and driving motors and all kinds of stuff, and the third part is mechanical — literally building the structure to hold these things. If you're working on some kind of end-to-end test scenario, poking buttons on things, you're gonna need to put it in a physical skeleton, for lack of a better phrase.

So what I would argue is, if you're trying to find a place to start, Lego Mindstorms is a great place, for a couple of reasons. When I asked the folks at Mercedes, well, if you hadn't talked to me or found me, what would you have done? — this was literally their answer: they would have dedicated two engineers, given them a couple of months and a Lego Mindstorms kit, and seen what they came up with. So arguably Lego is my competition, right? We also observed this with the Selenium project back in the day: when people are duct-taping and chicken-wiring their own solutions in-house, that's usually a good sign that there's a desperate need for some kind of product out there. And I'm not arguing that you need my robots — go get some stuff from Lego and cobble things together. The thing about Lego is that you don't have to start from scratch; you have some of the building blocks there, so you can get going faster.

The other thing you might observe — and this blew my mind as a kid who played with Lego; I was always frustrated that I couldn't design my own Lego pieces — is that with Lego Mindstorms, these are all the pieces you have, and you have just enough to get started: motors and sensors and all kinds of things. So if, from a testing point of view, you need to push a button or move something around, you have just enough of what you need, and you have a supplier you can go back to when you need more of a particular piece. For the pieces you don't have, though: 3D printers. They've gone through a huge hype wave, but there's still real utility there. This particular printer I recommend because it's only $150. Back when 3D printers were invented they were tens of thousands of dollars; now they're basically at super-cheap, impulse-buy prices. I'm playing with this one — it's my summer project to see how good it is.
So if you're trying to get started: get an Arduino, a Raspberry Pi, a Lego Mindstorms kit, maybe a 3D printer. And the reason I keep coming back to the 3D printer — here's an example of something that actually looks similar to my other robots. This is called the EV3 printer bot. If you notice these two pieces right here, those came off the 3D printer, because Lego doesn't make a piece that holds a pen. So you can print the small parts for just what you need — and holding a pen is going to be very common if you're making a robot hold a stylus and touch a screen, something like that. And this is that robot in action. The plans for it are online; I'll link to them off the YouTube page. This video is sped up a bit. There's a built version of this on my desk now — another summer project to play around with. The quality is actually pretty good as far as precision goes. And the key thing to know is that, other than those two white pieces that were printed, everything else comes in the standard Lego Mindstorms kit. So arguably you could put a phone or an iPad underneath this and start doing this today: get the kit, download the plans, and you're off to the races.

Great, cool story. But what does this have to do with Appium, or the Appium conference? It's kind of like showing up with a robot at the Jenkins conference — like, okay. Well, one thing I would point out: because I'm so focused on the robots, I haven't been day-to-day in the trenches with Appium in a long time. But trying to get back into it, I've noticed some things are still pretty gnarly and really complex. Specifically, if you're doing iPhone automation, the state of the art in the project right now is to use XCUITest to get to all the stuff you need to get to. But the key thing — and I call this out meaning no offense to the folks here, or watching this, who've worked on it; you're doing amazing work, and I love you — is something I pulled straight off that documentation page: "While this is simple in theory, the hoops of code signing and provisioning applications for development and testing can make this a bit of a headache." It's the story of our lives. It was true then, it's true now.

So I would like to remind you of some stuff I just copied off the appium.io web page. It was true at the beginning of the projects — of the Selenium project, actually, and of Appium. This is right off the Appium about page. One: you shouldn't have to recompile your app or modify it in any way in order to automate it. Two: you shouldn't be locked into a specific language or framework to write and run your tests. Three: a mobile automation framework shouldn't reinvent the wheel when it comes to automation APIs — and that's actually why we put WebDriver in front of all this stuff, because that was a good automation idea. Four: a mobile automation framework should be open source, in spirit and practice as well as in name. That's the Appium philosophy. That was what was different; those four things separated Appium from all the other tools out there, proprietary or open source. And it's amazing — no one copied it. They didn't believe this was a good idea. Now, I think, it's accepted as a good idea. Anyway, keeping that philosophy in mind, something came up at Apple's WWDC conference a week or two ago.
They unveiled some new accessibility goodies in iOS 13, specifically being able to control an iPhone with a mouse. Now, granted, if you're an Android person this is old news — on Android you can plug in a keyboard or mouse, no big deal. But the iPhone has always been this annoying thing to automate. So I had kind of a hunch. Anyway, I got a phone and downloaded the developer beta — it's not publicly available, but if you have a developer account you can go in and install it — and I have found a couple of little quirks. But this was just playing around with it manually; again, that other theme of manual testing before automation, just to see if it's even possible. And this is a USB mouse that I've been kicking around with.

The reason I brought up the phone app, or messaging, things like that, is that at the beginning of the Appium project it was definitely true that you could only automate your own application. You couldn't automate Settings or any other application you don't control. I've also had recent conversations with people who can't get the developer version of the app onto their phone — sadly, for political reasons, or technical, or money reasons, whatever, they have to use the same version of the app everyone else gets. So it would be nice to not have to modify the app and still be able to test it. And the idea with the mouse is that, even though you don't have access to the tree of elements, you could at least get something tested.

So I had a hunch: well, if you can do it with a manual mouse, potentially you can make a fake mouse and control that fake mouse from a script. Going down this hunch — there's a version of the Arduino that can act like a USB mouse. It echoes back to the earlier point: if you play around with Arduino you might not know in advance where it's going to apply to your work, but if you have some of these skill sets, you've got a bigger toolbox you can instantly apply to something. So this little demo is sending mouse commands authentically, just like the manual mouse was. I had it running on a loop because I had no way to control it yet — it's half the equation of what you need to drive it from Appium. So I have it going up, down, left, right, and click.

What it's actually automating is pinthing.com. This is my art project, the one Tapster is a spin-off of. I made the software version because I didn't know how to make the electronic version, but the idea is to make thousands or millions of linear actuators so you get this cool 3D display or something like that — or maybe make it like a water bed, I don't know, still working on it. But pinthing.com is where, besides Angry Birds, I often test the whole thing. So: hey, success — I can move around and I can click a button on the website. That's cool. But I still didn't have a way to automate it. Except that, miraculously, 24 hours ago Jonathan solved the problem. Yay. And the cool thing is, under the hood, driving the Raspberry Pi is very similar to driving an Arduino. He also did all the hard work of putting the Appium front end on it. So it's up there — I will post the links later — and you can also just find Jonathan and ask him more about it.
My version, which I'll maybe post later, will be a fork of that project — an Appium-Arduino driver. So in a 24-hour period we've instantly gone from a cobbled-together mess across all the mobile platforms to — well, it's still the iPhone, but doing it in crazy ways. So this is the next step. And if I can show my mouse here — I wonder if I can zoom in. Enhance. What's going on here? I don't have a schematic, sorry; I'll do that maybe later.

On the right side you have the board you saw before in the video — the thing on the right can act like a mouse. The thing on the left talks to the laptop: it listens for commands and does stuff. And the wires between them are an implementation of the I2C protocol. Just like you have USB or Ethernet, or CAN bus in a car, I2C is a way for smart devices on different boards or computers to talk to each other. So I'm using I2C. There's another weird thing: another name for the I2C protocol is the "two wire" protocol, which ironically is actually three wires, because you always add a common ground — so they could have called it the three wire protocol. Anyway, of the two wires, the yellow wire is the data line, and then the clock line — I've got little inside jokes in my various things, so it's a Clockwork Orange movie reference — in all of my designs the clock line is always orange.

Anyway, cool story. So this is the demo, and this is being driven by Appium. That's the electronics version of it. And here's the view of what I saw on my laptop. I'm streaming the video over AirPlay to my screen. On the bottom left you have the Appium server — that'll hopefully be on GitHub soon — and on the right is my test script, written in Python, that's doing the throw. There are a couple of things it does to play the game: I just hit that button on the bottom right. No verification, sorry, and also a lot of sleep statements, sorry. But you can see — if you zoom in, maybe later — it's an authentic Appium script; as far as your test is concerned, it doesn't know that it's talking to random things (there's a rough sketch of what such a script could look like below).

This is also kind of interesting, because sometimes my argument for working with robots runs into: I'm not gonna expect everyone on the Appium project to have one of these on their desk just to keep support for the robot stuff. But now I have this small thing that's the moral equivalent of the robot. So hopefully in the future, when you're testing future versions of Appium, besides all the phones on your desk you might have a couple of these little circuit boards around that represent the electronic-circuit-board version of some bigger robot. I could do this as a live demo — kind of tempted to. Maybe I'll do that. Here's the code — I'll probably show it later, you can't really see it — but at a certain level I've overridden the perform-touch handling. I got some really awesome guidance from Jonathan on how to do this; this is code I wrote very recently, in the last 24 hours. Right before I give the demo — fun fact — I flew all the way from Chicago and stopped in London a couple of days ago. I had just enough of a layover to leave Heathrow, go have coffee with Simon, show him my demo, then get back to Heathrow and get on the plane to Bangalore.
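To make that concrete — this is not the actual script from the demo, and the capability values (especially the driver name) are assumptions about how such a fork might be wired up — a minimal Appium test script in that style, written in Python against the older TouchAction API, might look roughly like this:

```python
import time
from appium import webdriver
from appium.webdriver.common.touch_action import TouchAction

# Hypothetical desired capabilities -- the real names depend on the
# unpublished Appium-Arduino driver fork, so treat these as placeholders.
caps = {
    "platformName": "iOS",
    "automationName": "arduino",            # assumed custom driver name
    "deviceName": "Jason's Terminator T-799",
}

driver = webdriver.Remote("http://localhost:4723/wd/hub", caps)
time.sleep(5)  # no verification, just sleeps -- same spirit as the demo

# The "throw": press on the bird, drag back, release.
# Coordinates are illustrative; they'd come from eyeballing the screen.
TouchAction(driver) \
    .press(x=300, y=600) \
    .wait(500) \
    .move_to(x=150, y=750) \
    .release() \
    .perform()

time.sleep(10)  # wait for the level to play out
driver.quit()
```

The point is that the test reads like any other Appium test: it sends a press-drag-release gesture and waits. Whether a phone driver, a fake mouse, or a full robot carries that gesture out is hidden behind the server.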
So that's like insurance before my live demo. All right — three, two, one, go. "Hi, I'm Simon Stewart, current lead of the Selenium project, talking to Jason Huggins, who created Selenium. I've seen the live demo — he can make it work. I've seen it, it works." Thank you. So if you don't believe it works, here's some proof — at least a testimonial from Simon. Thanks. I'm very, very tired at this point, right? So on that note, let's see if I can actually do a live demo very quickly. It's not really a proper technical conference unless we have a failed demo, so I'm just gonna say, if it fails, that means we're on the bleeding edge.

First thing, I'm going to start my Appium server. The next thing I'm going to do is... not this. I probably want to bring this up so you can see the screen — I'm gonna turn on screen mirroring. So pardon the awkward pause here. Doing it live. I had those video backups and the testimonial because live demos are precarious. And of course, I apologize if you're sick of Angry Birds, but that's what you're gonna get. Oh man, I crashed it — I didn't see that before. That's fine, welcome back. Okay. Ooh, there's an ad — unexpected pop-ups. Okay, darn it, sorry, I need to turn on full-screen mode. Again, the name of my phone is Jason's Terminator T-799; there are reasons for that. I wonder what the next version will be. Okay, let's see if I can make this smaller, so I can see the... oh, that's great. So the server is running. I really don't know if this is gonna work — but remember, Simon saw it. Gonna connect... hey. All right, I'm gonna play Angry Birds. Come on, don't let me down. We're doing it live, folks. And... we've got the three stars. All right, I am dropping the mic. All right, thank you. Hopefully the code will be up later; if not, you can also check out Jonathan's code that he literally just put out yesterday. And also, yeah, find me on Twitter, send me an email, or just find me at the conference — I'll probably be rocking the hallway track today with some version of these demos running at the tables. All right, thank you.

All right, cool. Oh, I thought I was right at time. Am I done? Okay, cool. So how does this work, then? So I'm gonna have a mic. Copying my slides is gonna be really hard because they're several gigabytes in size — it's all these video files. No, I'll copy it — how much space do you have? Maybe it'll work, sorry. Someone has a question, please.

Hey, nice job. So in my mind, that demo is one half of a full-fledged interactive testing thing. Is there any way to, say, retrieve screenshots from the device in line with this sort of thing? Because then you could use Appium's image testing to get full-blown automation on this. Right, so one thing I discovered on the path to this: I was still able to get the image of the screen, and once you can get the image of the screen, you can do the image recognition stuff. There was something you had in your demo that I couldn't do in mine — I think this is a bug in iOS 13 right now — but in the previous version, your phone can act like a webcam: when you bring up QuickTime Player you can select the video source, like selecting another camera, and pick the phone's screen. And there's a whole bunch of open source tools that can grab frames, including OpenCV — all of the usual suspects for image recognition, open source stuff.
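As a rough sketch of that approach — this isn't from the talk; the capture device index, template filename, and confidence threshold are placeholders — grabbing a frame from the phone-as-webcam and finding a button with OpenCV template matching could look like this:

```python
import cv2

# Treat the phone's mirrored screen as a capture device. The device index
# is an assumption; it depends on how the screen shows up on your machine.
cap = cv2.VideoCapture(0)
template = cv2.imread("play_button.png", cv2.IMREAD_GRAYSCALE)  # thing to look for

ok, frame = cap.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    result = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    if max_val > 0.8:  # arbitrary confidence threshold
        h, w = template.shape
        center = (max_loc[0] + w // 2, max_loc[1] + h // 2)
        print("Found it at", center, "-- tell the fake mouse or robot to click there")
    else:
        print("Not on screen")
cap.release()
```

Once you have the match location, you can close the loop the question was asking about: drive the click with the fake mouse or robot, then grab another frame to verify the result.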
Because the iPhone can act like a webcam, there are tons of tools for finding objects that just assume a webcam. However, because I had to use this fancy dongle to plug the USB mouse into the phone, it broke that whole showing-up-as-a-webcam thing. So I'm hoping Apple fixes that sometime in the next six months. I was still able to get the image over — I'm actually streaming it over Wi-Fi and whatever other random radio waves you can hear going through the room; my actual phone is the hotspot between this phone and my laptop. That's how I got it working a couple of hours ago. And all of that AirPlay stuff is proprietary and expensive, so I wouldn't build a solution off of that. I'm hoping that if Apple fixes the iOS screen-as-webcam thing, then you can bring in the image recognition stuff. So, just like in your demo, part of it is an exercise for the reader: the image recognition, where you close the loop and make it a full-fledged thing — not only doing things, but doing verification. Yeah — as an exercise to you.

There's one other aspect of it, though. There's another argument for robotics from a performance-testing point of view. Again, there's this argument for non-intrusive testing, and what I noticed by streaming — and I think it's also true if you pull the frames directly off the device over USB — is that you're modifying the performance. Pushing all those frames slows down the device, no matter how fast the chip is, because you're doing something a user wouldn't normally be doing. There are some secret robot labs that are focused purely on performance testing, and what they do is not stream it over AirPlay or a USB cable; they have their own imaging camera literally sitting a foot or two above the phone, and then you put the whole thing in a black box so ambient light doesn't mess everything up. So the really fancy stuff does it, again, in a non-intrusive way — that's where you'd have the robot and the camera and not use the USB thing. It's all a sliding scale of capabilities. For this demo, it was fine that I was slowing the phone down a bit; but if you really wanted a true test, you would use a camera. Does that make sense? Totally over-answered that one.

We can take one more quick question. Sorry. All right. Hello, Jonathan. When are we bringing robot support back to Appium? Oh yeah, funny story. There was robot support in one of the first versions of Appium — Dan wrote that code — and then around version 1.3, Jonathan deleted it all. But actually for good reasons, because it was this monolithic hairball of a project, and sometimes the way to clean your house is to take everything out of the room and only bring back in what you need. Robots didn't make the cut. That's actually kind of why I'm giving this talk right now: I'm arguing for robots to get back in there. But the way the architecture is now, it doesn't need to be in the project. Either when plugins come in you can pull it in on the fly, or — with these projects — there's an ecosystem now of stuff where I don't necessarily need Appium to support it, because there are very cleanly implemented interfaces for this stuff.
So as long as I implement the interface, you can use it in your projects, because Appium has done the work it needed to do, which was to define the interface; I implement the interface, and that's how we all get along. I don't know if it'll ever be in the project itself — it'd be nice, whatever, it's not necessary. I'll let you add other things after that. All right, thank you. So thanks, Jason. Let's have a big round of applause for Jason. Thank you so much.