Alright, good morning everybody, and thanks for coming out. We're a group of students from the University of Delaware, and over the last few months we've been working on hacking and modifying Wiimotes and the Wii Fit board to help people with disabilities. Basically, our plan is to give people with disabilities more ways to interface with the Windows operating system. There may be future plans to port to other operating systems, but for now we're coding it for Windows. What we're doing is infrared head tracking, so it tracks the movement of a person's head, and we also have weight-shift tracking with the Wii Fit board to add a little more mobility.

We chose the Wii platform mostly because it's inexpensive: it's only about $40 per controller, and for that $40 you get a great infrared tracking camera. It tracks up to four sources; right now we're only using one source for head tracking, but if in the future we do gesture recognition, or maybe even sign recognition, with the Wiimote, that gives us room for more sources. The Wiimote and the Wii Fit board both have Bluetooth, which makes for a really easy connection, so that information comes straight to our computer. Then there's a library we're using, WiiUse, which takes all that Bluetooth information, all those hex values, and converts it into functions we can easily read and work with.

At this point I'm going to pass this on to my friend Cuddles, who's going to talk a little bit about the hardware side of the Wiimote. Alright, so the Wiimote: everyone's seen it.
Most of you have probably even used it. It's the controller for Nintendo's Wii, the current system they have out now, and what we're working with primarily is the infrared camera in the front. That camera detects infrared light from the sensor bar, and the sensor bar doesn't actually have any sensors in it at all; it's just two clusters of infrared LEDs, one on each side, with five LEDs per cluster. To the camera it's just two infrared blobs, one at each end of the sensor bar, and that lets it triangulate your position relative to the screen. It actually tracks up to four sources, so you could have two sensor bars and it would detect all four infrared blobs.

Now for the hardware. Here are actual pictures of the chip inside the Wiimote, and here are the buttons. This is the adapter where you connect the Nunchuk or the Guitar Hero controller, the Bluetooth chip, and the infrared camera, which is what we're primarily working with. The infrared camera supports a resolution of 1024x768 pixels, which is pretty good compared to most $40 webcams out there, which only give you about 640x480. So not only are you getting a good controller, you're also getting a pretty good infrared micro camera. The Bluetooth chip is a Broadcom 2042, and it's not 100% compliant with HID, the standard Bluetooth Human Interface Device profile; basically, it's made to connect to the Wii. You can connect it to pretty much any Bluetooth receiver, and it will work, but you'll get weird, unforeseen errors and finicky controls that you have to work through, so we had to deal with that as well.

And here are our glasses. We made them out of basic safety goggles and built a circuit of infrared transmitters onto them.
They provide about a 180-degree spread of infrared, so you can turn your head a bit and the Wiimote will still detect you. It's pretty energy efficient too: only 1.5 volts at 50 milliamps. Here's the actual circuit: a 1.5-volt battery in series with two infrared LEDs that are in parallel with each other, so they both see the same voltage, and a series resistor that limits them to the 50 milliamps they need. And here's our little lab setup. It's pretty much exactly what you see here; it's pretty rough, but it gets the job done. Now I'm going to pass it back to Robert, who's going to talk about the software end of our project.

Alright, here's a little bit about the software. As I mentioned earlier, we're using the WiiUse library, which is written in C. It's a single-threaded, non-blocking library, and it's pretty simply coded, like I said before. Most functions are self-explanatory: there's a set-LED call to turn an LED on or off, and things like that. If you do have any questions about it, the website we got it from, WiiUse.net, has tremendous documentation, and it's all really easy to understand. I mean, we understood it, and we didn't know much about the Wiimote coming into this.

After that, we realized we needed C code for our Wiimotes and C code for the Wii Fit board, but how would we run them at the same time? Our idea, after much thinking, was to use multiple threads. So right now we have one main program, written in C++, that creates two sub-threads, one running the Wiimotes and one running the Wii Fit board, to collect the information and relay it back to our C++ program.
The reason we have a main C++ program is so we can access other code, such as gesture recognition code being worked on at our school; since that's written in C++, we needed something that could communicate with it easily.

One of the main features of our project is head tracking. We have an infrared source, like Larry talked about, mounted on a pair of safety glasses that the user wears. Then there are two Wiimotes, one positioned to your left and one to your right; they pick up your position as X-Y coordinates, and with a little geometry we convert that into a three-dimensional position: X, Y, and Z. Right now we use the X and Y coordinates for mouse movement and positioning. Our ideas for the Z coordinate are things like magnifying the screen when the user leans in close to the computer, or making the mouse cursor disappear, so if you're leaning in to read something you won't have a cursor in your way.

At this time I'm going to demonstrate the three-dimensional positioning with a little demo. Alright, this is my main screen. This demo is something simple we made: it's just a couple of images, a rocket ship in space, and it shows how we can move it left and right and up and down, and it also demonstrates the Z axis by getting bigger and smaller. The first thing I need to do is tell the program the distance between our Wiimotes. Currently they're 36 inches apart. It's whatever the user decides, but we found that 36 inches creates a decent viewable range from both Wiimotes. I also have to tell it which Wiimote is on the left and which is on the right. So first I'll say three, because Wiimote three is on the right, and we confirm that by lighting up its LEDs to tell which Wiimote is which. Then Wiimote two is on the left.
The reason it says there are three sources is that we also have the Wii Fit board connected, which acts as Wiimote one. Now that everything's connected and ready to go, I'll put on my glasses, very fashionable, as Ranji says. As soon as I push the up button, that triggers the infrared detection to start. Or it should. It's on. Alright, you've got to tell me what I'm looking at; it's in presentation mode, so I can't really see the screen. So I'm going to move left and right, and it should be moving left and right, I hope. It's actually moving in circles. Wait, I'm not done, guys. Alright, now I'm moving up and down, and it should be moving up and down, I hope. Bear with me if it's not. And lastly, I'll try to zoom in and out. Is it getting bigger and smaller? Alright, cool. So that's the three-dimensional positioning. Sorry about that; it's more convenient when you can see the screen in front of you. I'll close out of that, and you guys can connect back to the computer. [A short pause while the presenters switch the display back to the slides.] Alright, we're all good. All systems go.

Okay, back to the presentation. I hope you enjoyed that. So how does it work? That's probably what's on your mind now: how exactly are we doing it? As you can see up here, we have our screen, then a Wiimote to the left and a Wiimote to the right. They're both pointed about 22.5 degrees inward, towards the user, and tilted up vertically. That creates a kind of baseline. Let's see if I can use this pointer.
It creates a zero-degree baseline right here, and the same idea works for the Y and Z directions, so all the positioning starts there and builds out from it.

As for finding the angles: since the camera resolution is 1024 by 768, we figured that if your X coordinate runs from 0 to 1024, why not scale it down to 0 to 45? Then you have a 0-to-45-degree angle, which matches the field of view the Wiimote is set up to cover. We did that for both X and Y to get our horizontal and vertical angles. After we had the angles, it was just a little geometry from there. We derived a formula: first we split the setup into two triangles, found the X component of this triangle here, then the X component of the other one, combined them, and got the X coordinate, which is basically your distance from the left Wiimote. With one of the triangles solved, we could derive the Z and Y coordinates with a little more geometry using the tangent function: since we had the adjacent side, we used the tangent to get the opposite side for both Z and Y. As I said before, they're measured from this little zero-degree mark here and up. Right now, as you can see, we also have a vertical displacement of about 14 inches, because we don't want the user to have to lay their head on the table to use the Wiimotes; you bring the zero point up a bit so he or she can use it comfortably.

Alright, at this point I'm going to do another demonstration, this time of the mouse control. One sec, Mike seems to have done it. Oh, now it's good. Okay, never mind. Alright, it's good, sorry about that.
Okay, again bear with me, because I don't have a screen in front of me, but I'm going to try to demonstrate moving the mouse with the X and Y coordinates. Again, I set the distance, 36 inches, and tell it that Wiimote three is on the right. Remember, folks, he's doing this blind. Ready for the flip. Yeah, it's like I'm from the circus, blindfolded. Alright, and I hit up to activate the control. That was the scroll-wheel function. Now I'm moving the mouse around, or I hope I am, moving my head accordingly. Go to the next one. Okay, usually we demonstrate moving a box around; if you stop by the Hardware Hacking Village or Q&A, we'll try to demonstrate it there, but I can't really see a screen, so sorry about that.

Alright, so I'm Josh. Grungy, whatever you want to call me, it's cool. I'm going to talk about how we use the Wii Fit board in our system. We were kicking ideas around one day: we had the tracking working, and you can move the mouse around like you just saw, and that's pretty nifty, but you can't really do anything once you get to a spot on the screen, and actually interacting with the computer is kind of the point of a mouse. So to add that extra functionality, we brainstormed and asked our professor, who had some cool ideas. For right-click, left-click, and the normal Windows functions, we figured: maybe you can't use your hands well, maybe you have multiple sclerosis or something like that, but it doesn't take too much to shift your weight around, and most people can do that with enough accuracy. So what we did was take advantage of the fact that the Wii Fit board has a pressure sensor in each of its feet.
So you can tell where a person is leaning; Nintendo uses that in its yoga and aerobics programs and so on. The features we added were right-click, left-click, scrolling, and even opening the gesture program within our system. Right-click, left-click, and scrolling just use a Windows function called SendInput; you can look that up on MSDN or check out our code when we release it. Launching the gesture program, which I thought was a really cool part of this, requires installing what's called a Windows hook. The hook sits in the Windows message queue and watches all the messages going by to all the windows, saying "that's not the one I want" until it finds the one it's looking for, and then it processes that message. That's what tells the gesture program to open up and let the user, say, draw a circle to open Microsoft Word or the speech recognition software, whatever they want to open, and drive all the menu-based stuff. I think that added a whole new level of functionality. Also, bonus points if you know what that error code on the slide means; the Microsoft compiler gives us that one a lot.

So, do you want to do that? Yeah, let's see. Alright, we're going to attempt the last demo; we'll see how good Rob is. There you go. Originally we were going to do Duck Hunt, to demonstrate playing the old NES game and shooting the ducks by moving your head and clicking, but I just realized I can't see the screen, so it wouldn't be too effective. You can stop by later if you want to try it. So instead, an idea I think I heard from a friend at one point: how about Paint? We'll try to use Paint to show that we're clicking and moving around. Alright, it's calibrating now.
Let's see if we can do this. This is without hands. Yeah. I just want to thank everybody who made working with the Wiimote possible, the WiiUse folks, and all the independent hackers out there. Thanks. Keep an eye on this website; we should be posting our source and everything in the next couple of months. Just keep an eye out, and thank you all for coming.