This is Kinectasploit. This is a talk about joining the Microsoft Kinect to Metasploit via a 3D first-person shooter game environment, in real time, and we're going to use that to own a box. This is all demo, there are no slides, and it only takes about ten minutes or so, so let me give you a brief intro for some context about what's going on.

I got a Kinect about a year ago, whenever it came out, and the first couple of projects I did you can see on YouTube. The first was a drum machine using Pygame: it's just three cubes, you're a point cloud, and if you swat at them vigorously enough you can make a beat. The second was a David Lee Roth soundboard. I'm sure you've all seen the original on the internet a long time ago: somebody took David Lee Roth's vocal from "Runnin' with the Devil," chopped it up, and put it on a soundboard, so you click it and get all those crazy whoos and all that stuff. Same thing here: cubes, you're a point cloud, and you swat at them.

The reason I bring those up is what they taught me. I'm a Python programmer, so I'm not into writing mounds and mounds of OpenGL code, which is what you end up doing there, and physics code as well. And the third problem is that there wasn't, and as far as I know still isn't, skeleton tracking for the Kinect with a Python wrapper. Those are the problems that taught me.

Then I stumbled upon Blender. Christopher Webber did a talk at PyCon 2011 about Blender 2.57. Blender is a 3D environment, let me show you, that you can use to do CGI animation, and it also has a game engine. That solves the OpenGL and physics issues and gives you a platform you can use to create all kinds of stuff. I'm assuming you guys haven't seen Blender, so I'll give you a brief little quick tour.
It's a 3D animation environment, and the coolest thing, I don't know if you can see the counter at the top there, is that it's telling you it's running at 60 frames a second. So it's a real, real-time game engine, all driven on the back end by Python, Python 3 no less. You should check it out; blender.org is the site. They fostered the development of Blender by putting CGI animators and developers together and making movies with it. There's a movie on YouTube you can watch called Sintel, S-I-N-T-E-L. It's a fifteen-minute animation, it's great, and that combination of folks is how they actually made it.

So, that's Blender. We still have the problem, though, of how to get skeleton data into Python; like I mentioned, there's no easy way to do that. The answer is OSCeleton. OSCeleton is a program that uses OSC, the Open Sound Control protocol, to make a UDP stream of joint data. You get this over your local connection, or you can send it over the internet, so you essentially have a stream of all the joints the Kinect is seeing.

So let me set this up. Here's a quick little Python debug script, about ten lines or so, that just blasts out everything we see. So, if this goes well... it's initialized, looking for me, found me. If I do the calibration pose, now we get joint data. Just like that, I've got joint data in Python. Let me show you what we get, because this is what goes into Blender. You get a variety of joints, the usual stuff: elbow, hand, shoulder, knees. Each message gives you the joint name, the player number, and then X, Y, and Z coordinates.

That's what we're going to stream into Blender. Let me start this up again, let it find me, and then I'll show you the first part of the... oh, I think it's done. Yeah, it is done. So now we've got that into Blender. The first thing I like in these displays is when there's actually a person on screen that maps to what you're doing.
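As a sketch of what that ten-line debug script has to do, here's a minimal decoder for an OSCeleton-style `/joint` message, which carries (joint name, user id, x, y, z) in standard OSC 1.0 encoding. The joint name and coordinate values below are made up for illustration; this isn't the talk's actual script.

```python
import struct

def _pad(b: bytes) -> bytes:
    # OSC strings are NUL-terminated and padded to a multiple of 4 bytes
    return b + b"\x00" * (4 - len(b) % 4)

def _read_string(buf: bytes, off: int):
    end = buf.index(b"\x00", off)
    # the next field starts at the following 4-byte boundary
    return buf[off:end].decode(), end + (4 - end % 4)

def parse_joint(packet: bytes):
    """Decode an OSCeleton /joint message: address "/joint",
    type tags ",sifff", args (joint name, user id, x, y, z)."""
    addr, off = _read_string(packet, 0)
    tags, off = _read_string(packet, off)
    if addr != "/joint" or tags != ",sifff":
        raise ValueError("not a /joint message")
    name, off = _read_string(packet, off)
    user, x, y, z = struct.unpack_from(">ifff", packet, off)
    return name, user, x, y, z

# Build a sample packet the way OSCeleton would send it over UDP
# (joint name and coordinates are invented for the demo)
sample = (_pad(b"/joint") + _pad(b",sifff") + _pad(b"r_hand")
          + struct.pack(">ifff", 1, 0.5, 0.25, 2.0))
print(parse_joint(sample))  # ('r_hand', 1, 0.5, 0.25, 2.0)
```

In the real setup you'd `recvfrom` these packets on OSCeleton's UDP port and feed the decoded joints into Blender.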
I built this using some open source mesh stuff that's out there on the internet; it's just a human male figure. The important thing I want to show you, let me make that bigger, is that it's complete down to what Blender calls an armature: a series of bones put together so that when you move them, usually in a render engine, you can make walk cycles, poses, stuff like that. In our case, we're going to link the armature up to the OSC data coming in to actually drive the poses. So when we pull this back and demo, this guy, he's called Super Armature. It's running a little slow; it gets better. Let me do this again. Still a little buggy, Blender. Okay, that's better.

So this is real-time joint data. As you can see, the arms track best; the legs are a little twitchy, so I don't really have them doing too much, because I'm not a dancer. You can move your arms around; the head is also a little twitchy. The reason I wanted to show you the bone stuff is that it becomes the basis for gestures. There's going to be a flex gesture later, when we get into the game environment, and that's just this bone at 90 degrees to the X axis. From there you can start to recognize all kinds of gestures.

So let's jump into the Kinectasploit environment. This is Kinectasploit: a first-person shooter game environment. It's kind of minimalist; there are only a couple of rooms in it, and we'll only use one. A guy on the Blender forums built it as a first-person shooter game he was messing around with, and he gave me permission to use it, the background anyway, for this demo. So let's jump in and I'll show you the gestures. The first thing it does when we start up is make a connection to Metasploit; I've got Metasploit running over here, and a victim box. It announces it, and then we'll start going through the owning of a computer. "Kinectasploit initiated."
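To make the "bone at 90 degrees to X" idea concrete, here's a sketch of how a flex-style gesture could be recognized from two joint positions: measure the forearm bone's angle against the X axis and call it a flex when it's near vertical. The joint tuples, the 15-degree tolerance, and the function names are my assumptions for illustration, not the talk's actual code.

```python
import math

def bone_angle_to_x(joint_a, joint_b):
    """Angle in degrees between the bone joint_a -> joint_b and the
    X axis, using only the X/Y components of the joint positions."""
    dx = abs(joint_b[0] - joint_a[0])
    dy = abs(joint_b[1] - joint_a[1])
    return math.degrees(math.atan2(dy, dx))

def is_flex(elbow, hand, tolerance=15.0):
    # Hypothetical threshold: forearm roughly vertical (90 degrees to X)
    return abs(bone_angle_to_x(elbow, hand) - 90.0) <= tolerance

# Joints as (x, y, z) tuples, like the OSCeleton stream provides
print(is_flex((0.0, 0.0, 2.0), (0.05, 0.6, 2.0)))  # True: forearm near vertical
print(is_flex((0.0, 0.0, 2.0), (0.6, 0.05, 2.0)))  # False: arm out sideways
```

The same angle test against different bones and axes is how you'd build up a vocabulary of gestures.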
So I'm already recognized from OSCeleton, and there's a little Super Armature in what's called a heads-up display in Blender; it's just mapping two scenes on top of each other. And you can see we've got our real-time Metasploit console output running up along the top. Some of the initial gestures: lean forward, you go forward; lean backward, you go backward; lean to the left, you go left; lean to the right, you go right. And if you want to get a look around the room, just rotate your hips.

Like I said, this is a training environment, so we've got some inspirational posters on the wall here to get us a little in the mood to do some pwning. "The more you pwn, the more you know." And of course the last poster, ICBM. There's also a door here; you could move yourself through it, but I won't go through all the rooms, because this is the main room we want to deal with. The idea is that a room equals an IP range, just a local IP range on my box in this case.

To actually use this to drive tools, of course, you've got to scan to find targets first. The scanning gesture is both hands up to your forehead, like you're scanning the horizon. So we'll do that. Come on. It gives you a countdown, because scanning announces your presence. When it's done... "Nmap is coming in. Launching Nmap." Yeah. You can see the output from Nmap up there, straight out of the Metasploit console. That was my idea of what Nmap sounds like, anyway. It was a weird thing trying to come up with sounds for this, because what does Nmap sound like? It sounds like that now.

So these are two computers we found, victims we can attack. And I was talking to my daughter about this, because in a first-person shooter you're usually just a shooter: you have a gun that you use to target things. And she said, well, maybe you could just use the flex, like a cock-your-gun gesture.
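The lean-to-move mapping can be sketched the same way as the gestures: compare the torso joint against a calibrated neutral position and emit movement commands once the displacement clears a dead zone. The axis conventions (x is left/right, z is depth from the camera) and the threshold value here are my assumptions, not the talk's actual numbers.

```python
DEAD_ZONE = 0.15  # metres of lean required before moving (made-up value)

def movement(torso, neutral, dead_zone=DEAD_ZONE):
    """Map torso displacement from a calibrated neutral pose to
    movement commands for the game engine."""
    moves = []
    dx = torso[0] - neutral[0]  # left/right lean
    dz = torso[2] - neutral[2]  # toward/away from the sensor
    if dz < -dead_zone:
        moves.append("forward")   # leaned toward the Kinect
    elif dz > dead_zone:
        moves.append("backward")
    if dx > dead_zone:
        moves.append("right")
    elif dx < -dead_zone:
        moves.append("left")
    return moves

# Leaning forward and to the right relative to the neutral pose
print(movement((0.3, 1.0, 1.8), (0.0, 1.0, 2.0)))  # ['forward', 'right']
```

In the game loop you'd run this every frame on the latest torso joint and feed the result to the player object's motion logic.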
So that's the gesture to target one of these guys: you do a flex, touching toward your shoulder and elbow in a cocking motion, and you get a targeting ring. "Target acquired." There we go. Thank you.

If you liked that, you're really going to love this. The last gesture in this thing is the only one I could come up with for owning a box. The French have a word for it that I can't pronounce; "arm of honor," if my translation is right, the bras d'honneur. And it's this. "db_autopwn initiated." So this guy is having db_autopwn run against him, and this is my idea of what db_autopwn sounds like: a mixture of bats and the Skype feedback sound. I'd love to find someone with a better idea of what that should actually sound like. Keep your eye on the Metasploit console interface up there; it'll announce the modules it's launching once it gets done looking at what it could possibly use against this thing. The animation happening is the machine kind of blowing its brains out as db_autopwn gets going. If all goes well, we'll get sessions. And that's the end of what we can do so far. Five modules left. Two. "Sessions. Sessions." So there you go. Thank you.

I'll get out of the game and show you the actual real live sessions in Metasploit. But the next idea I've got for this, and it doesn't work yet, is that you march up to the machine and, through some gesture, maybe hopping or something, you launch yourself into the computer, and that becomes a different scene that you could use to drive one of the Meterpreter sessions you've got in there. So, like I said, those are real owned sessions, two of them in there, and that is Kinectasploit. Thank you very much.
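Tying the gestures back to the tooling: once a gesture is recognized, the game just has to send text down its Metasploit console connection. A sketch of the kind of gesture-to-command table that implies is below; the IP range, the exact `db_autopwn` flags, and the dictionary itself are my illustration rather than the talk's code (and `db_autopwn` has long since been removed from Metasploit).

```python
# Hypothetical gesture -> Metasploit console command table
GESTURE_COMMANDS = {
    "scan": "db_nmap {net}",        # hands to forehead: scan the room's IP range
    "pwn":  "db_autopwn -p -t -e",  # bras d'honneur: exploit by open ports
}

def command_for(gesture, **fields):
    """Render the console command for a recognized gesture."""
    return GESTURE_COMMANDS[gesture].format(**fields)

print(command_for("scan", net="192.168.1.0/24"))  # db_nmap 192.168.1.0/24
print(command_for("pwn"))                         # db_autopwn -p -t -e
```

The rendered strings would then be written to the console session whose output is mirrored in the heads-up display.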