Well, we were walking by this booth and we saw a brain, and so we definitely had to stop by and talk to Lucas Steuber from Cognixion. I'll get it yet, Cognixion, spelled with an X, which is why I stumbled over the word. Hi, Lucas, how are you doing? I'm good. How about yourself? I'm doing good. So you've got a brain here, so that's nice, and you've got some sort of headset. What are we looking at here? We do. So, well, the brain, of course, is a brain, which is, it's not a real brain, I'm afraid, for those listening. What we've done with the brain is label different parts that are relevant to our work, because what we make is a brain-computer interface for people who are profoundly impacted by a physical disability, like a brainstem stroke, for example, or spinal cord injury. So even people that don't have the ability to move their eyes at all can use the brain-computer interface to communicate. Seriously, can't even move their eyes and you can tap into their brain? Okay. Yes. It's kind of, the funny example we have of this is if you hold up one hand and say this is the yes hand, and one hand and say this is the no hand, and then look at me straight on, you can see what's happening in your peripheral vision, and that's kind of the way that this works: even if you can't move your eyes, you can sort of attend to a signal. The way that that works is that if you think of, like, an animated GIF sort of thing, we have images that are flashing at a certain frequency, like nine times a second or 18 times a second, and so it creates this nice little wave in the back of the brain, the occipital lobe, which is processing visual information. So when we see that curve change, basically we can say, oh, they're looking at this button or this button. Oh, no way. The future is cool, and we beat Elon Musk to getting it done, so we're proud of that. So what we're looking at here is a headset that does not look bigger than a VR headset. This is pretty small.
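The flashing-button trick described above is what's known as a steady-state visual evoked potential (SSVEP): each button flickers at its own rate, and attending to one boosts occipital power at that frequency. A minimal sketch of how decoding could work, on synthetic data; the 9 Hz and 18 Hz rates come from the conversation, but the sample rate, window length, and simple FFT-power decoder are all assumptions, not Cognixion's actual method.

```python
import numpy as np

FS = 250             # EEG sample rate in Hz (assumed)
FREQS = [9.0, 18.0]  # flicker rates of the two on-screen buttons

def band_power(signal: np.ndarray, freq: float, fs: int) -> float:
    """Spectral power in the FFT bin nearest `freq`."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    bins = np.fft.rfftfreq(len(signal), d=1 / fs)
    return float(spectrum[np.argmin(np.abs(bins - freq))])

def decode_attended_button(signal: np.ndarray) -> int:
    """Index of the flicker frequency with the most occipital power."""
    return int(np.argmax([band_power(signal, f, FS) for f in FREQS]))

# Simulate 2 s of noisy occipital EEG while the user attends the 18 Hz button:
rng = np.random.default_rng(0)
t = np.arange(0, 2, 1 / FS)
eeg = np.sin(2 * np.pi * 18.0 * t) + 0.5 * rng.standard_normal(len(t))
print(decode_attended_button(eeg))  # 1, i.e. the 18 Hz button
```

Real systems use more robust detectors (e.g. canonical correlation over harmonics), but the principle, looking for a frequency-locked "wave" in the occipital signal, is the same one described here.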
Great big sunglasses on the front for some reason, and it's white, and it straps around the head. Does this, do you have to drill into your head to do this? No, this is non-invasive, right? So we have, it does, it looks a lot like an Oculus, like a Quest type thing. Yeah, exactly. It looks like a VR headset. The difference being that it's an augmented reality display, so it's not totally opaque. You can see through it and see your environment, and it sort of plops the keyboard down into the world, along with various other functions. So you can control Alexa through this, turn on fans, change the channel on the TV, that sort of thing. No, seriously? Yeah, oh yeah. We can show you. Can I put it on? Yes, absolutely. The comedy of that is always worthwhile for the audience. Yes, we can absolutely show you the way that it works. We're still probably nine months away from having these available to the public, so we're here largely for feedback, which I say so you don't judge me too harshly when you see it. So I'll just pretend I'm at CES, where everybody says it's coming in three months. Yeah, right, and then it never does. Yes. No, but this is a real thing. It works, and we're really proud to be bringing it out. I mean, for a very long time, the world of speech-generating devices, and I would say disability generally, has been relying on sort of the scraps of consumer technology, right? Yeah, let's take an Android phone and put something on top of it, which is cool in a lot of cases. And, you know, I sort of have this hypothetical goal that in five years personalization should be synonymous with accessibility, and in that world, then, I'm so happy the Googles and stuff are doing it. What we wanted to do with the brain-computer interface is build essentially for what they call the hardest use case first. Oh, right, you can back into other stuff, but if you start where you've got no ability to move anything.
Right, that's like, you can always move up from there, you know, in terms of the technology, but, you know, if we build it so it works for everyone. You know, everyone has a different physical profile, right? But there's no such thing as accessibility needs, except as a failure of product design. So I truly believe that everything should be accessible to everybody, and, you know, this is just a step along the way towards that. Yeah, boy, you're being awfully profound. I'm not used to that. No, no, I'm sorry. I'm kind of rocked back, thinking about what you're saying. I mean, I've heard a lot of people say blind, but blind means, like, a thousand different things. The profile of low vision is completely continuous, but you're talking about all of these accessibility needs being continuous. Yes, absolutely. So maybe one guy can move his eyebrow, and a woman can wiggle her ear, but if you can't do either one, then that's another level. Yep. I'd say, you know, if you've met one person, you've met one person, right, in terms of their needs. He's going profound again. Yeah, really. No, that's awesome. But it's totally true, right? I mean, we have to build to the specific needs of the folks that we're trying to address. And over time, I hope that all of this will be integrated into just general consumer technology, but, you know, for now, we'll sort of try to advance things as we can. So you must have brain-scientist people at Cognixion. Are you one of them? I'm a clinical speech-language pathologist, and I've been making these for a decade or so, previously with a company called Tobii Dynavox, the eye-tracking company, where I was the head of product. But we, yes, we have biosignals engineers, essentially what you'd call applied neuroscientists, among our engineers in Toronto. So that's where the smart people are. So this whole looking-at-the-wave thing, how are you tapping into what the brain is doing if it's not invasive?
So on the back of the device, pressing against the skin, there are six dry electrodes, so there's no gel or anything that's necessary. And it can go through hair? Yes. And now, with a caveat of thickness of hair, because it does, we can get about half an inch in or so. You just crank it down on the back of my head? Is that what we're going to do here? Kind of, yeah. So there's, like, a ratchet system that disconnects these. So he's taking the helmet off of the, am I allowed to say dummy? Yes. The inanimate head. Our person, should we name them? Yes. Name this one Allison. Okay. Allison. Thank you. Our CEO's assistant is handing us one here. Yes. He was going to correct me a little bit with that, right? So just to give a sense of how this functions, there's a reflective lens. It's magnetically attached there? Yep. You got it. And so these are replaceable. That's kind of the part that we think might be easier to damage. Yes, exactly. But then inside there's the screen, and it's just going to look blank right now, which is kind of funny, because black is transparent in augmented reality. So it doesn't look like there's anything going on, but if I put the lens back on, what we can do is get it set up and show you how it works. Part of the reason we use the lens, too, is when somebody composes something to speak, it also will throw it up and display it on the front. So if it is a loud environment or people just want privacy. So I'm going to have this on, and I'm going to figure out how to write hello, and it'll be displayed on the front of the glasses. Yes. Is this going to take a while? No. Okay, we have time to do it. Okay. Let's do it. So what I will do is I'm going to crank this down on my head. Yes, indeed. I don't think my hair is too thick. And we are, this one actually, I'll show you, the mechanism of interaction with this guy is head pointing. So I'll show you how that works just a little bit. Okay. Okay. All right. We're putting it on my head here. Okay.
And I'm going to just crank her down. Okay, I see, I see some words, green on the top, pink on the bottom, my brain is getting squished. No, you can go tighter than that. Yeah, yeah. Keep going. Come on. Yeah, you got it. Okay, let's see. I want to make sure it's balanced. Yeah, that's pretty good. Okay. Now both words are backwards. So I'm just going to hit a button here. It's going to do a three-second countdown. Okay. It's writing something there. I can't read everything that's on screen. Okay. So if you look down, you'll see a bunch of sort of status indicators. I see two arcs of white. And they have a little dash on either side. Yes, perfect. Okay. So try actually looking down, like with your whole head. Oh, oh, tilting the head down. Oh, there you go. That's the right way to put it. Okay. All right. So you can see sort of it's giving you status. There's the brain-computer interface connected. Okay. Oh, I see. Okay. At the very top, there's a little house. Everything's in reverse. This is supposed to be Alexa; it's backwards. History's backwards. My favorites is backwards. Oh, maybe he's pressing some buttons on my head. Cut right to that. All right. We're going to do this re-centering. Okay. It's still... Oh, now it says re-centering in two. Now it's forward. Perfect. Okay. All right. Oh, now I've got all kinds of data at the bottom. Yeah. This is way more interesting than it was before. We were doing demonstrations and it evidently was the wrong setting, but... Okay. So I've seen Bluetooth and Wi-Fi and an audio signal here? Perfect. So, yeah, that also gives people the ability, for privacy, to turn that on and off a lot of the time. All right. And then if you look sort of the same amount up. Okay. Now I see the house. When you're looking at the house, maybe a little bit higher. Okay. I'm looking at the house. Oh, it just turned yellow. There we go. Oh, I just went to home. Now I've got Alexa. I've got history. My favorites. Okay. I want to type something. No. No.
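The "re-centering" countdown described above is, in head-pointing interfaces generally, usually nothing more than storing the wearer's current head pose as the new neutral origin, so the cursor sits in the middle of the view from wherever they happen to be pointed. A hypothetical sketch; the class, the yaw/pitch model, and the field names are all illustrative assumptions, not Cognixion's implementation:

```python
from dataclasses import dataclass

@dataclass
class HeadPointer:
    """Maps absolute head orientation to a cursor offset from a stored origin."""
    ref_yaw: float = 0.0    # neutral yaw in degrees
    ref_pitch: float = 0.0  # neutral pitch in degrees

    def recenter(self, yaw: float, pitch: float) -> None:
        """Store the current head pose as the neutral (cursor-centered) pose."""
        self.ref_yaw, self.ref_pitch = yaw, pitch

    def cursor(self, yaw: float, pitch: float) -> tuple:
        """Cursor offset, in degrees, relative to the neutral pose."""
        return (yaw - self.ref_yaw, pitch - self.ref_pitch)

# After the 3-2-1 countdown, recenter on whatever pose the wearer holds:
hp = HeadPointer()
hp.recenter(10.0, -5.0)
print(hp.cursor(10.0, -5.0))  # (0.0, 0.0): cursor back in the middle
```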
Okay. I'm trying to get to the H. Holding that steady. H. There we go. Okay. I'm going for the E. I've got the E, it's turning yellow. So what I'm seeing is a little yellow line on the things I'm looking at. If I hold on it long enough, I get the full box around it. Oh, there I go. And is anything showing on screen, Steve? Not yet. So if we go down, there's a blue button with a play. Okay. I would have called that purple, but I'll do it. Does it say hello? Yes. All right. I just, uh-oh, I just typed, hello, you know, I need to get to the delete key to get rid of that, because I'm a perfectionist here. Nope. I'm not getting any keys right now. Oh, there's the space. Oh, there we go. And now it wants me to decide which word I want. Hello. There we go. All right. Yeah. Look at me go. I know. That is really interesting. Yeah. And that's one of the really nice things that we're trying to do, is, like, for folks who are using, like, eye-gaze communication systems, there's normally about a four-hour setup and calibration process, and you can see this just kind of takes right off. Yeah. Yeah. I could see you could get into that. So the person who's locked in has to be able to move their head to be able to use this. So this one, it's funny, because we actually weren't originally planning on making one that was just a head-pointing or switch interface, but what we've had is a lot of folks with cerebral palsy that have come to us and wanted this solution, because they have a little bit of head movement or can use head movement with a switch. So a brain-computer interface is a slightly more limited interface, because there's about eight-ish interactive elements that we're able to support at once. Okay. So it's almost like eight-bit. Yeah. So basically it'd be the idea of sort of looking at a quadrant of the screen and then kind of zooming in and then making a selection from there. But one of the big pieces of my work is trying to get that to be as efficient as possible.
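The two-stage highlight walked through above, a yellow line when the pointer first rests on a key, then a full box and a selection after holding, is classic dwell selection. A minimal sketch; the timing thresholds and names are assumptions for illustration, not the product's actual values:

```python
HIGHLIGHT_AFTER = 0.3  # seconds of steady pointing before the yellow line (assumed)
SELECT_AFTER = 1.0     # seconds before the full box fires the key (assumed)

class DwellSelector:
    """Tracks how long the pointer has rested on one target."""

    def __init__(self):
        self.target = None
        self.since = 0.0

    def update(self, target, now: float) -> str:
        """Feed the currently pointed-at target each frame; returns the UI state."""
        if target != self.target:            # pointer moved: restart the timer
            self.target, self.since = target, now
            return "idle"
        held = now - self.since
        if held >= SELECT_AFTER:
            return "select"                  # draw the full box, activate the key
        if held >= HIGHLIGHT_AFTER:
            return "highlight"               # draw the yellow line
        return "idle"

d = DwellSelector()
print(d.update("H", 0.0))  # idle: just arrived on H
print(d.update("H", 0.5))  # highlight: yellow line appears
print(d.update("H", 1.2))  # select: full box, the H is typed
```

Dwell times like these are exactly why setup matters for eye-gaze users; head pointing, as noted above, largely skips the long calibration step.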
So you can actually say 57,000-ish things within three hits, three activations. Wow. That is crazy. It's cool because right now, with brain-computer interfaces in the laboratory environment, folks are only composing about half a word a minute. Oh, wow. We're trying to get to 30. That's the goal. We're getting there. Yeah. Yeah. This is fantastic. Well, Lucas, this is really cool. If people want to learn more about Cognixion, where would they go? Yes. So Cognixion, which is C-O-G-N-I-X-I-O-N, Cognixion, and, you know. Dot com? Yes, dot com. And the idea is that there's a person, the I is a person, and it's talking to the other I. So it kind of looks like a little person. Oh, it's very cute. Very cute. All right. This is fascinating. I think people are definitely going to want to come take a look at this. And I'm sure they're going to want to see how awesome I look wearing this helmet as well. Thanks a lot. Absolutely. Thank you for your time. This is great. Oh, wait, wait, wait. One more question. Before Steve cuts it off, you said you have a tech podcast as well. What is your podcast? Yes, Talking With Tech, it's called. Talking With Tech with Lucas Steuber. Very good. Give you a little plug there. Thank you.
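The "57,000-ish things within three activations" figure can be sanity-checked as a selection tree: with k interactive elements per screen and d activations, you can reach k^d distinct items. The only number taken from the conversation is the ~8 elements the BCI supports per screen; the rest is back-of-the-envelope arithmetic, not a description of Cognixion's actual layout.

```python
import math

def reachable_items(k: int, depth: int) -> int:
    """Leaves of a selection tree with k choices per screen and `depth` picks."""
    return k ** depth

# With the ~8 interactive elements the BCI supports at once:
print(reachable_items(8, 3))  # 512 items in three activations

# Reaching ~57,000 items in three activations needs ~39 choices per level,
# since 38**3 = 54,872 and 39**3 = 59,319:
k_needed = math.ceil(57000 ** (1 / 3))
print(k_needed)  # 39
```

So the 57,000 figure implies richer screens (e.g. a full keyboard plus prediction) than the 8-element BCI view, which fits the head-pointing keyboard demoed earlier.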