So, my name is Federthans, I'm the Herald, and I will give you a little introduction to Kai Kunze, who is currently working in Japan at Keio. He is going to give a talk about eyewear computing, and this is a really interesting topic, because our human brain is not limited to specific visual processing; you could say it is doing things on its own, it is processing things on its own. The aim, in the end, is to enhance the computing we do with specific devices such as, for example, eyewear. He's going to talk about the beginnings of that, so give a big round of applause for Kai Kunze.

Thank you, thank you very much. Yeah, the mic is on. Can I get my slides? I give a lot of talks, but for me, being here at the Chaos Communication Congress is always special, so I'm also a little bit nervous, and I hope I won't be wasting your time for the next 25 minutes or so. The topic I want to talk to you about today is eyewear computing: augmenting the human mind. As mentioned, I'm working at Keio Media Design, Keio University, in Yokohama, Japan.

Before I give you an overview, I'll show you a quick demo, because I had some problems with Bluetooth in this room earlier, and I think the demo works now. So, what do you see here? I'm currently wearing a prototype from JINS, a Japanese eyewear maker, and it uses electrodes to measure my eye movements; you see my live blinks. It also has an accelerometer inside, so you can also get the motion of the head. I'll go into detail on how this thing works and what you can do with it, but anyway, the demo worked right now.
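The live-blink display in this demo boils down to spotting large, short spikes in a vertical EOG channel. A minimal sketch of that idea, assuming a baseline-removed signal and purely illustrative threshold values (this is not JINS's actual firmware):

```python
import numpy as np

def detect_blinks(veog, fs=100.0, threshold=150.0, min_gap=0.2):
    """Detect blinks as large positive spikes in a vertical EOG channel.

    veog      -- 1-D array of vertical EOG samples (baseline-removed)
    fs        -- sampling rate in Hz
    threshold -- amplitude above which a spike counts as a blink (illustrative)
    min_gap   -- minimum spacing between two blinks, in seconds
    Returns a list of blink timestamps in seconds.
    """
    above = veog > threshold
    # Rising edges of the above-threshold mask mark candidate blink onsets.
    onsets = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    blinks, last = [], -np.inf
    for i in onsets:
        t = i / fs
        if t - last >= min_gap:   # debounce: ignore ripples within one blink
            blinks.append(float(t))
            last = t
    return blinks

# Synthetic demo: flat signal with three spikes at 1 s, 2 s, and 3 s.
sig = np.zeros(400)
sig[[100, 200, 300]] = 300.0
print(detect_blinks(sig))  # -> [1.0, 2.0, 3.0]
```

Real EOG needs drift removal and per-user thresholds, but the structure of the detection is this simple.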
I thought it was good to start with it. So let me give you an overview of the talk. First, I'll give a little background about myself. Then I'll go into why I think eyewear computing is interesting and worth exploring. Then I'll talk about how we can make it accessible, so that normal people, not only geeks, can use it every day, especially through enabling technologies like electrooculography. And then I'll give a bit of an outlook on how we might measure other cognitive states, something like cognitive workload, in real life; that part is more of an outlook, since we just started that work.

As background: I worked in wearable computing, which means I mostly used motion sensors and other sensors, put them on users, and tried to figure out what they were doing. My former supervisor Paul, whom you see in this picture, called this the Christmas-tree setup, because we put a lot of sensors on people, tried to figure out what they were doing, and tried to support them during everyday tasks: maintenance work, but also sports. A lot of my work was about getting from these dedicated sensors towards something you already carry with you, like your smartphone, or sensors embedded in everyday garments. Over the last two years I got more and more interested in what's going on in our minds: can I also detect cognitive states or cognitive activities? Here the eyes became more and more interesting. I started with EEG, but I'm a little bit lazy, and I found it too difficult to make sense of EEG signals. The eyes are another way to observe the mind, because we use them every day.
We use them even while we're sleeping: even with our eyes closed, our eyes are still moving. There is a lot of work in psychology, cognitive science, and related fields that links eye movement to things like attention, concentration, intention, and higher cognitive tasks. Unfortunately, most eye gaze today is used for things like advertisement and marketing, and now that we have mobile eye trackers, you can also do this in supermarkets and figure out how your product must look so that people will pick it up. I think that's a waste, and a little bit sad.

Last year I gave a talk about tracking reading habits. In that case we used a mobile eye tracker from SMI. This is an optical system using infrared light: you see on top the view of the eyes, and infrared cameras underneath your eyes detect your pupils. With this you can recognize your gaze, and you get saccades (fast eye movements) and fixations out of it. With this we can recognize reading: we can tell how many words you read, and not only that, we can also figure out what kind of document you're reading, just by looking at the visual behavior. That's important for Japanese students: we can distinguish science books from mangas. We are also slowly trying to assess how much you understand, comprehension, but that's trickier. That was last year. You could already build some kind of Fitbit for the mind, though I'm not really sure you'd want one: your reading count during the day, how many words at what speed, how many mangas versus science papers. But I'm wondering myself: what are healthy reading habits? How can I improve my reading habits? What happens if I copy somebody's reading over two or three years; will this make me an expert in a certain field?

The biggest problem I see with the approach I introduced last year is, first of all, that it only works for reading. I would like something more general, focusing on attention, concentration, and focus in real-life settings, and it should work throughout the whole day. The biggest problem there is the mobile eye tracker you see over here: it's not something you would like to wear during everyday tasks (well, maybe some people would), and the next problem is battery power; with that device you can maybe record four or five hours in one go. So we also looked at tablets and smartphones, but if you turn the tablet off, it's not working anymore, so you don't get coverage of the whole day; and implementing eye tracking on these devices with the front-facing camera doesn't work so well either, because the computer vision is quite complicated. We also did some work on Google Glass, although it's maybe not socially acceptable everywhere: there are more problems in Germany if you walk around with Google Glass than there are in Japan, so it also depends a bit on the cultural background. But you can do some things with it.

Now I want to show a demo, a little extension of last year's work, on Glass. Basically, Glass doesn't have an eye tracker, but it has a small infrared sensor over here that's used to detect whether you're wearing the device, and you can also do a long blink to take pictures. What you see right now is the picture I see on the head-mounted display in Glass, and with this touchpad on the side I can see all the pictures I took. I taped over the camera.
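Back to the optical tracker for a moment: the saccade-and-fixation segmentation such trackers provide can be sketched with the classic dispersion-threshold algorithm (I-DT). This is a generic sketch, not SMI's implementation; the dispersion and duration thresholds are illustrative assumptions:

```python
import numpy as np

def idt_fixations(gaze, fs=60.0, max_dispersion=30.0, min_duration=0.1):
    """Dispersion-threshold (I-DT) fixation detection: grow a window while
    the gaze points stay close together; everything between the resulting
    fixations is treated as saccades.

    gaze           -- (N, 2) array of gaze points (e.g. screen pixels)
    max_dispersion -- max (x-range + y-range) inside one fixation
    min_duration   -- minimum fixation length in seconds
    Returns (start, end) sample-index pairs, end exclusive.
    """
    def dispersion(a, b):
        seg = gaze[a:b]
        return np.ptp(seg[:, 0]) + np.ptp(seg[:, 1])

    win = int(min_duration * fs)
    fixations, i, n = [], 0, len(gaze)
    while i + win <= n:
        if dispersion(i, i + win) <= max_dispersion:
            j = i + win
            while j < n and dispersion(i, j + 1) <= max_dispersion:
                j += 1
            fixations.append((i, j))
            i = j
        else:
            i += 1
    return fixations

# Demo: 20 samples fixating one point, then 20 samples on another.
gaze = np.concatenate([np.full((20, 2), 100.0), np.full((20, 2), 400.0)])
print(idt_fixations(gaze))  # -> [(0, 20), (20, 40)]
```

Reading then shows up as a characteristic pattern: short rightward saccades between fixations, with a long leftward sweep at each line break.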
So if I take a picture right now, you should see it: it just shows black and a little white dot, because of the light sensor; otherwise I can't see anything on the display. What we did with it: we have an open-source Glass logger that logs all of the sensors on the device, and we can also monitor the infrared sensor. This is something I showed last year already; this is a slightly advanced version. You can also recognize eye movement: if I move my pupil and don't blink, okay, some of it is detected as eye blinks, but you can see the infrared value change, so you can detect something like left and right movement of the eye from the change of the distance to Glass.

Now a slightly more advanced demo: what can you do with it? For one, I can do some very limited activity recognition. After maybe two or three seconds it should show what I'm currently doing. This is just a two-class classification problem: currently it says talking, and now, if I focus here and read, it uses my head motion as well as the eye blinks to distinguish talking from reading. Talking works relatively stably during the day; reading is a little troublesome with the infrared sensor, because if you're watching a movie or so on, it might also say reading, so that's not working so well.

There are other things you can do with eye blinks. One, as I mentioned, is recognizing talking; this is simple, because your blink frequency will increase, usually roughly double. There are also correlations with focus: usually, if you focus on something, your blink frequency will decrease. And also with content.
What I observed is that if you look at people reading from an e-ink display, you recognize that they blink when they turn the page. I would also like to use change blindness during blinks. I haven't really found a good idea for what to do with it yet, but you probably know that if somebody changes something while you blink, maybe the color of the background, or swaps some people in here, you wouldn't notice, because you blinked in between. You could use this, for example, for a display that changes just when you blink, so you wouldn't notice that something in your view changed, but when you want the new information, it's already there. I also think it would be cool for horror games or interactive movies; I think you can really scare somebody with that.

Another idea, also fairly simple, is fatigue detection: just from the duration of blinks and the blink frequency; if it gets higher, you can detect how tired somebody is and how tired somebody will become. I had the idea of equipping a batch of students with simple blink detectors for classes and then having a ranking system of lectures, so you get the most boring lecture versus the second most boring lecture and so on. I discussed this with some professors at Osaka Prefecture University, where I was beforehand, and they didn't really like this idea too much. I wonder why. Maybe we can do something at KMD with this.

Another thing: blinks together with head motion help you distinguish closely related activities. In this case we recorded reading, watching a video, solving a Sudoku puzzle, sewing (to have some physical activity), and talking. Just with blinking frequency you already get quite far, up to 70%, and if you include the head motions you get up to 82%.

However, there are some issues with Glass, and even more with the eye-tracking hardware. First of all power, but also, they don't really look appealing, so it's not something somebody will wear during everyday tasks. There are newer systems from SMI and also Tobii, and they look nicer, but you can still recognize that these are things built by engineers and researchers for engineers and researchers; for everyday people, for normal people, maybe it's not suitable. So I was quite happy when Inami-sensei asked me to join a project with JINS, a Japanese eyewear company. They make prototypes of the JINS MEME, the glasses I showed you in the beginning, and they have a very different idea from Google Glass: this is not a full-fledged computer on your head. It has no display and no camera, and the device is just connected to your phone; currently it just streams data to your phone. One part is the electrooculography, the other is motion sensing: with the one you can detect eye movements (left, right, up, down) and eye blinks, and the other is an accelerometer and a gyroscope. We're working directly with their research department, and one other thing made me happy: on one side I can work on smart glasses, and on the other side, for a promotion video, I could wear the optical camouflage from Inami-sensei. Win-win situation. You already saw the demo, so now I'll go into the principles of how it works.
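Before the electrode details: the blink-based cues above (blink rate roughly doubling while talking, dropping during focused reading, blink duration growing with fatigue) can be condensed into a tiny windowed feature extractor. This is a generic sketch with made-up thresholds, not the classifier from the study:

```python
import numpy as np

def blink_features(blink_times, blink_durations, t_start, t_end):
    """Summarize blinks in one time window into the two cues from the talk:
    blink frequency (talking roughly doubles it, focusing lowers it) and
    mean blink duration (which grows with fatigue)."""
    times = np.asarray(blink_times)
    durs = np.asarray(blink_durations)
    mask = (times >= t_start) & (times < t_end)
    n = int(mask.sum())
    minutes = (t_end - t_start) / 60.0
    return {
        "blinks_per_min": n / minutes if minutes > 0 else 0.0,
        "mean_blink_duration": float(durs[mask].mean()) if n else 0.0,
    }

def label_window(features, baseline_rate=15.0):
    """Toy rule of thumb, NOT the trained classifier from the study:
    a rate near double the personal baseline suggests talking, a clearly
    lowered rate suggests focused reading or viewing."""
    rate = features["blinks_per_min"]
    if rate >= 1.8 * baseline_rate:
        return "talking"
    if rate <= 0.6 * baseline_rate:
        return "focused"
    return "neutral"

# Demo: 30 blinks in a 60-second window, i.e. double a 15/min baseline.
times = np.linspace(0.0, 59.0, 30)
durs = np.full(30, 0.15)
f = blink_features(times, durs, 0.0, 60.0)
print(label_window(f))  # -> talking
```

A real system would calibrate `baseline_rate` per user and feed such features, together with head motion, into a trained classifier rather than fixed thresholds.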
We use electrooculography. Basically, your eye is a dipole: it has a positive charge in the front and a negative one in the back, and if you place electrodes around your eye, you can measure eye movement. The regular setup looks like this: one electrode above your eye and one below it to get the vertical movement, one left and one right to get the horizontal movement, and usually one reference. The advantage compared to optical eye tracking is that you can run very high sampling rates, and you have no problems, or not so many problems, with battery power, because you don't need so much processing. Some of the disadvantages: you only get relative eye movement, and it can be noisy; it depends on the skin and on the electrode placement, of course. Other people have worked on this, and you don't really have to use goggles: Manabe-san from Docomo used headphones for this, and Andreas Bulling used the regular setup, with top and bottom electrodes for the vertical movement and left and right ones for the horizontal potential measurement.

What JINS MEME did is integrate it into normal eyeglasses. They use a three-point EOG; here you see it in detail, and if we can go to the GoPro, I can also show it to you on the video. You see these three electrodes: the left and the right one, if I'm not shaking too much you can actually see something, are for the horizontal axis, and these two are for the vertical axis. Okay, back to the slides. Right now these are just first prototypes, and the battery runtime is around eight hours while streaming data; if we do onboard processing it will get even better.

Now you might wonder what you can do with it. For one, we can recognize the eye movements, as you see here: blinks, and left and right. This was one of the earlier prototypes; that's, by the way, Shoya, who was here last year but unfortunately couldn't join this time. And what is the first thing you try to implement if you have a binary control system like eye blinks? Yeah, Flappy Bird. This is really, really hard to play; you can also see it in Shoya's face. I can try it later, and maybe you can try it too, but getting even a score of one or two is hard.

What you can also do, of course, is more interesting activity recognition. In this case we can detect reading, and we currently assign a word count just by time; this gives around 15 to 20 percent error, but of course we can analyze the data further and get better results. So reading, and also talking: the head-motion change as well as the blinking-frequency change are easy to detect. So you can get an overview of your day: how much physical activity you did, because it also has the accelerometer inside, but also social interaction, and how much reading you did. For the product itself, I think they mostly focus on fatigue detection, and they want to release sometime next year, in September. But if you're not into consumer products, or don't want to wait until September, and want to build it yourself, it's not so difficult: there are a couple of instructions online. I was most astonished by the EyeBoard, built by a Honduran teenager; his setup really works. A master's student of mine also built one: he took several of the DIY instruction sets and uses just two electrodes, left and right, to detect horizontal movement, and as you can see, it works relatively well; even small eye movements can be detected with a DIY set.

Oh, we have to hurry, just five minutes left before questions and answers. We are now also trying out other EOG electrode setups, to see what works best for reading or for other cognitive tasks. And you might wonder: can we use EOG for this tracking of reading habits too?
And how does it compare to the optical eye tracker? In this case we used a medical EOG with active electrodes, the one I have equipped there, and again just looked at the horizontal component of the potential. That's the graph you see here, of me reading, and I just use a simple peak-detection algorithm to detect the line breaks: these long peaks are the line breaks. Interestingly enough, you can also detect very short lines, two- or three-word lines; these are the not-so-deep peaks. That's not possible with the optical eye tracker, because its sampling rate is not high enough, so that's quite nice. By the way, this is not published yet, but as soon as it (hopefully) gets published, I'll also share the code and how you can filter the EOG signal.

Then, for the last five minutes, I want to talk a little bit about how we can recognize more general cognitive states. In this case we tried to track cognitive load via brain sensing. We used fNIRS, near-infrared spectroscopy, the LABNIRS device from Shimadzu, which estimates the oxyhemoglobin change in the blood, in this case in the prefrontal cortex. We used the LABNIRS, the JINS MEME, and a stationary optical eye tracker. What you get from the fNIRS is brain activation: red means high activation, blue or green lower activation. A strange interface, but yeah. And this is the eye gaze synced with the brain activation, in this case for a reading task. But what I'm really interested in: we didn't only record reading, but also calculation and some memory games.
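Back to the line-break detection for a moment: the exact (unpublished) pipeline isn't available, so this is a generic numpy sketch of the same idea — classify EOG peaks into long return sweeps and the smaller peaks of short lines — with illustrative amplitude thresholds:

```python
import numpy as np

def line_breaks(heog, fs=500.0, long_height=200.0, short_height=80.0):
    """Classify peaks in a horizontal EOG trace into long line breaks
    (the big return sweep at the end of a full line) and short ones
    (two- or three-word lines). Amplitudes are in arbitrary units; the
    thresholds are illustrative assumptions, not the unpublished
    algorithm from the talk.
    """
    x = np.asarray(heog, dtype=float)
    # Local maxima: samples strictly higher than both neighbours.
    is_peak = (x[1:-1] > x[:-2]) & (x[1:-1] > x[2:])
    idx = np.flatnonzero(is_peak) + 1
    heights = x[idx]
    long_t = idx[heights >= long_height] / fs
    short_t = idx[(heights >= short_height) & (heights < long_height)] / fs
    return long_t, short_t

# Demo: spikes at 1 s and 5 s (full lines) and a smaller one at 3 s.
sig = np.zeros(3000)
sig[500], sig[1500], sig[2500] = 300.0, 100.0, 350.0
long_t, short_t = line_breaks(sig)
print(long_t.tolist(), short_t.tolist())  # -> [1.0, 5.0] [3.0]
```

The 500 Hz figure matches the medical EOG mentioned in the Q&A; at low sampling rates the short, shallow peaks would be smeared out, which is why the optical tracker misses them.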
On the memory side: here you see an n-back task, a common task to assess memory. One-back is relatively easy; two-back gets quite difficult if you're not trained; three-back is more difficult; and four-back is nearly impossible if you haven't trained. This is a recording from one user, with the workload: as I said, one was easy for him, looking at the fNIRS; two was a little more difficult; three was the most difficult; and at four the person gave up. I'm most interested in that last state: whether it's possible to detect it not with a LABNIRS device, but with something like blinks, eye-gaze features, or other cheap sensors. Then we could implement something that keeps us challenged while learning, and not frustrated, and that's somewhere I would really like to go. We just recorded the data, and I'm not sure yet if there's something there. I saw some correlations: pupil diameter is something, but that's not something I can get over the EOG; blink frequency is also interesting, but it's not good enough to detect or predict that state yet. Because what I really want is to predict that if I increase the difficulty of a task, the person will give up.

That brings me more or less to the summary. I believe that if you look at the last centuries, the biggest scientific breakthroughs were about our physical limitations: we can now travel faster, build higher, live more comfortable, longer lives. And I believe the future, the biggest scientific breakthroughs, will be about overcoming our cognitive limitations, so let's work together, with open tools and open code, to achieve this. That brings me to a quick thank-you slide. I just want to name two people, especially Oliver Amft and Christoph Schuber, who enabled part of the talk with some hardware I got from them, and a special thanks to the people who actually did the work, Katsuma and the others. And now, if you have questions, remarks, or violent dissent, please go ahead; and if there are people interested in an open eyewear platform, please come and talk to me later.

Thank you for the nice talk. We have five minutes for questions. You can line up at the microphones. We will start with microphone four.

Could you explain the blink again? How would you capture that? Because the eye, you said it's a dipole, so does it move down when you blink, or what?

(Sorry for the interruption: please move out quietly, no talking.)

A blink is actually muscle activity. What you see here: a blink is a relatively clean spike, this signal in the EOG, and you get it over the muscle movement. So this is not the dipole; the eye does not move in this case, you move your muscles, and that's also something you can register over the electrodes. It's actually a noise signal if you want to recognize eye movement from the EOG.

So you have one kind of sensor capturing both muscle activity and the dipole movement of the eyeball?

Yeah, and you also have to do some filtering: if I move my head and so on, of course you will get that in the EOG as well.

Okay, thanks. One question from the signal angel: what sampling rates does the EOG operate at?
For the sampling rates, this depends. The JINS MEME currently samples at around 100 Hz; I think it can go up to 200 Hz. That has nothing to do with the electrodes but with the chip inside. For the line-length data you saw beforehand, that's a medical EOG at 500 Hz; there you need higher sampling rates, so for that you need 500 Hz, but I think with 200, or maybe 150, you would also get reasonable results for line-break detection.

Thank you. Microphone number three.

I was wondering if there's any ethical reflection included in your studies and your work, because you don't need too much fantasy to imagine how this thing, which is funny here, can be used as a perfect tool of control.

Yeah, I agree, and I think there's already a lot of work; that's why I had this advertisement slide. I think we are already behind some companies that work in this field and have more knowledge about the relation between the brain and eye movement, and there are whole studies on how to place products or how to design a supermarket that are not public. For me the important part is that the data stays with you. I don't want Google or Apple or somebody like that developing glasses and then getting all of our data, because I'm already scared of the data, just the location data and the other data, that I'm giving away for free. So there's definitely an issue. One thing we can do is try to open this research up and show people what is possible, to hopefully prevent somebody from misusing it. But yeah, that's definitely an issue.

Thanks. Microphone number two.

My question is: are you doing any research to connect your technology with virtual reality, for example in the field of architecture or the arts?

I'm personally not so much, but there are some people at KMD, especially Inami-sensei, who worked in virtual reality and also thinks about this. For virtual reality, I think optical eye tracking is nicer, because you can get the relation to where you're looking, and if you're already wearing an Oculus Rift or a similar device, you don't have the problem of needing something lightweight to carry. With EOG I'm not so sure. I think this blink thing, the change blindness during blinks, could be interesting there. But yeah, thanks.

One last question from number two.

Hi, I wanted to ask whether you have thought about capturing brain waves from the frontal cortex, because it seems like something that is right there anyway.

So, I tried EEG beforehand: I played with the Emotiv, and I also have the OpenBCI, the .com one, not the .org one. I was interested in the talks yesterday, but unfortunately I couldn't get in, and I haven't met the people. From my experience, as I said, EEG is relatively noisy, and it depends highly on the user: surprisingly, it works very well for a lot of my students, but for me it doesn't work well, because of the placement of the electrodes and so on. So I'm interested and I'll look into it, but for me, being lazy, eye gaze seems to be easier to come by and to use. Also, I find fNIRS interesting to some extent, because we saw that maybe two or three channels could be enough to detect something, so I'll definitely look into it. But so far, for me, it seemed to be more difficult than using eye gaze. Thanks.

One last round of applause for Kai.