Great, great. I am Mark, yes. All right. Well, hello there, Mark "Rizzn" Hopkins here, coming at you once again live from Google I/O 2013. I am here with Jay Kim, who is the director of R&D for APX Labs, and you guys have done something that caught my eye. I'm a big fan of augmented reality; I've been writing about it for quite a while, since it was little more than a dream on an iPhone. So when I saw what you guys had over here, I wanted to get a deeper dive. You've got some glasses, an immersive display, and all kinds of interesting applications. So let's dive right in. What do you guys do, and what's it for?

So, APX Labs is a smart glasses software company that built something we call Terminator Vision for the Army. That means we had a military pair of glasses with an embedded camera, and we were able to pick off faces in a crowd. If I'm seeing a bad guy, it just shows the results for that bad guy: his identity, who he is, what he does, where he lives, delivered to you on your heads-up display. So that's where we actually got started, by delivering the software stack for military smart glasses systems. And we started transitioning some of that technology over to the commercial world, targeting large businesses whose workforce is largely mobile and largely desk-less, people who don't traditionally have access to computers. So the idea is that we're going to use our software development platform, link up with the smart glasses hardware makers and the software integrators, and work with them to provide a platform that delivers real-time information to smart glasses users. Say, for example, someone in healthcare: an EMT who wants to get help from a doctor who's remote is able to grab his camera feed and push it out there.
And at the same time, the doctor is able to give him step-by-step instructions on how to treat a trauma patient. Or logistics: being able to use glasses to navigate through all of the different warehouses when I don't really know where the next item is. Well, guess what? Smart glasses tell me where it is. So that's what we do as a company. But here at I/O, we were invited by YouTube to implement our hands-free computing paradigm, our user interface, to interact with YouTube videos completely hands-free. So that's what we've got to show here.

So, can you just walk me through what we have in this setup? You obviously won't be able to see what I'm seeing, but you'll be able to... not directly. I've tried this once before; it's very interesting. It's hard to describe to someone who hasn't put something like this on, but you'll be able to see what I'm doing as I'm controlling the interface, right?

So, this is actually my cheat screen; it shows exactly what you're going to see on the glasses. As far as the headset itself goes, we took the Epson Moverio BT-100 transparent displays and we added sensors to make these glasses smart. We added a five-megapixel camera, we added a microphone, and we added a nine-axis motion sensor. What that allows my software to do is understand what you're looking at, understand what you're hearing, and understand your orientation.

So, let's get in tight on this, Kenny, if you can, because it really is an interesting device. It's bigger than Glass, obviously, but it does a lot more than Glass. So it kind of shows the thickness of the... it's not that thick, really, the plastic portion. And, you know, it's stereoscopic, right?
So, the exciting thing about this display as opposed to Glass is that it gives me a canvas that's in your line of sight, and I'm able to draw into both of your eyes, which means I can do stereoscopic 3D content, for example, and I can do 3D overlays on top of the real world. So, you know, fundamentally, it's a different way of visualizing content than Glass.

Right. So, let's change devices. You hold my microphone for a second and you can kind of walk me through it, because the hands-free interface is something that's interesting, too: you don't speak out to it like you do with some of the... oops, lost the nose piece. Get it back for you? Yes, thank you. But you don't speak out to it, you move your head.

It's had a lot of use. Yeah, it's been used quite a bit today. You got it? Oh, I got it backwards, don't I? You got it backwards. You have to hold it just so as you're putting it on, and then it'll stay in place. All right, there you go. All right, there we go.

All right. So, I'm watching a video here... thinking I have one. All right, so let's actually have you tap on the right side of the sensor twice. Okay. Yeah, right here? Yeah, double-tap it. All right. All right, you're back on the video wall, and as you're moving your head across, sweep your head, you're getting a wall of video 360 degrees around you. Oh, wow. If you see a video that you like, you're able to tilt your head up or down and just stay there for a little bit. Okay. So, yeah, there you go. Wait for that red thing to... Okay, here, here you go. Okay, so now you've selected the video, and you can do playback controls also using your head. If you nudge your head up like this, it pauses. Right. So, let's do that again to start the video playback. And if you tilt your head a little to the left, it rewinds. Yeah. And then a little to the right, it fast-forwards. So, you're able to control all of your YouTube video completely hands-free. Yeah, it's very cool.
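The head-gesture controls described above (nudge up to pause, tilt left to rewind, tilt right to fast-forward) amount to a threshold classifier over head-orientation samples. Below is a minimal illustrative sketch of that idea; the threshold values, function name, and command strings are assumptions for the example, not APX Labs' actual implementation.

```python
# Hypothetical mapping from head orientation to playback commands, as in
# the hands-free YouTube demo. Thresholds are invented for illustration.
PITCH_UP_DEG = 15.0    # nudging the head up past this toggles pause/play
ROLL_LEFT_DEG = -20.0  # tilting left past this rewinds
ROLL_RIGHT_DEG = 20.0  # tilting right past this fast-forwards

def gesture_from_orientation(pitch_deg, roll_deg):
    """Map one head-orientation sample to a playback command, or None
    when the head is inside the neutral dead zone."""
    if pitch_deg > PITCH_UP_DEG:
        return "toggle_pause"
    if roll_deg < ROLL_LEFT_DEG:
        return "rewind"
    if roll_deg > ROLL_RIGHT_DEG:
        return "fast_forward"
    return None

# Head tilted well to the right -> fast-forward:
# gesture_from_orientation(0.0, 25.0) -> "fast_forward"
```

A real system would also debounce (require the pose to be held briefly, like the dwell-to-select behavior described for the video wall) rather than firing on a single noisy sample.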
And then to get out, again, double-tap it. It's very responsive. I mean, you don't really have to... you only have to hear it once and you understand exactly how to use it. Yeah, it's an intuitive way of interacting with a transparent screen where you don't have access to a touchpad.

So, yeah, we caught all that, Kenny, I hope. It's a fantastic little device. It's a lot of fun. So, as you said, you're targeting the commercial workspace. What is the kind of... is it just a softer target? What made you decide to do that instead of going after the consumer space?

Well, we think there's just a tremendous amount of value proposition that we're putting out to all of our big clients, where the return on investment for all of these people wearing glasses is an easier sell, frankly, than to consumers, who tend to be a little less forgiving. If I'm able to prove that using smart glasses is going to cut down on the number of accidents, improve situational awareness, increase worker safety, and also improve worker efficiency across all of the industries that I mentioned, logistics, field service, healthcare, then it's an easier sell for them to take the leap into the next wave of mobile technology.

So, I can also envision kind of a big data angle to this as well, because right now in a lot of industrial applications, RFID is kind of the name of the game, or sometimes it's QR tagging or things like this. But we're talking about a system where you could easily scan across a warehouse floor and acquire the data with the sensors, and not only acquire it and store it, but see it in real time: what's available, what your inventory is, things like that.

Yeah, something that smart glasses bring that no other mobile device brings today is access to real-time data, heads-up and hands-free.
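The real-time inventory idea sketched above, where every tag the glasses pick up (a QR code, an RFID read) becomes a scan event feeding a heads-up tally, can be illustrated with a few lines of aggregation logic. The event format, location names, and SKU identifiers here are invented for the example, not part of any APX Labs product.

```python
from collections import Counter

def tally_scans(scan_events):
    """Aggregate a stream of (location, item_id) scan events into
    per-location inventory counts, queryable at any moment."""
    inventory = {}
    for location, item_id in scan_events:
        inventory.setdefault(location, Counter())[item_id] += 1
    return inventory

# Hypothetical scan events from a sweep across a warehouse floor:
scans = [("aisle-3", "SKU-100"), ("aisle-3", "SKU-100"), ("aisle-7", "SKU-204")]
# tally_scans(scans)["aisle-3"]["SKU-100"] -> 2
```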
And that's the crux of our software: so much of our algorithm and underlying research is about smartly detecting the context of the smart glasses user. RFID is one example, but take QR codes, which are becoming more and more ubiquitous. The camera that's embedded in these smart glasses is able to pick off QR codes discreetly in space, and just through that, without the user having moved a finger, it knows exactly where you are and exactly what you're looking at. And I can tell the user, based on those actions and that contextual knowledge, whatever it is the user needs to know right then and there.

So data entry becomes a much less arduous task, potentially. So what are the sensors? Let's go down the list of all the sensors you have available on the platform, so we can get a sense of what's possible, or what may be possible, with the system.

Right, so our definition of smart glasses requires at least three sensors to be worn on the head. It has to have a camera, a five-megapixel camera in our case, though it can be any resolution; it has a microphone; and then it has a motion tracker. It has an accelerometer, a gyroscope, and a magnetometer for compass. So it's able to tell me the pitch and roll of your head motion, and it's also able to tell me through the camera what it is that you're looking at.

Okay. And I guess you've got it hooked into an Android device. Are you able to use any of the Android sensors that are available on devices, to couple that with the data you're acquiring through the goggles?

Right, of course. So our software is smart enough to realize that Android devices themselves can also be used as a control device if we so choose. So we get access to the host device's sensor data, GPS for example, coupling to the phone so that we're not carrying a GPS radio twice.

Okay. And so I wanted to ask, because we talked earlier and you mentioned, it's a platform.
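The sensor list above implies the standard orientation math: pitch and roll come from the accelerometer's gravity vector, and a compass heading comes from the magnetometer. A minimal sketch follows, simplified by assuming the head is level for the heading calculation; a real system would tilt-compensate the magnetometer and fuse in the gyroscope for smooth tracking. The function names and axis conventions are assumptions for illustration.

```python
import math

def pitch_roll(ax, ay, az):
    """Pitch and roll in degrees from accelerometer readings (in g),
    using the gravity vector as the vertical reference."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

def compass_heading(mx, my):
    """Heading in degrees from magnetic north, assuming the sensor is
    level (no tilt compensation in this sketch)."""
    return math.degrees(math.atan2(my, mx)) % 360.0

# Head level, gravity straight down the z-axis:
# pitch_roll(0.0, 0.0, 1.0) -> (0.0, 0.0)
```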
It's not necessarily a package, and you're not interested in selling the hardware. And even though you're targeting the commercial sector, you definitely don't want to wall this off from folks who want to hardware-hack it, add to it, or use it for personal reasons. So I guess the most interesting thing, for the hackers and developers in the audience, is the release date. When can they play with what you've built?

So our platform, which includes the client-side user interface and user experience, alongside some of the logic required for augmented reality and smart glasses, alongside the server-side logic that manages all of it and pushes real-time data, is called Skylight. And Skylight is going to be available at the end of August, running on a set of these BT-100s, and all of the sensors that power these glasses we're also going to make available at that time as well.

Fantastic. Well, I appreciate you taking the time today. This is a really cool thing. We're going to be watching you guys as you progress forward. As you say, you bet the company on it, so we're going to see if that bet pays off. Mark "Rizzn" Hopkins, Founding Editor at SiliconANGLE. We'll be back shortly with more Google I/O 2013 coverage, so stay tuned, right here.