Today I'm going to talk about brain-computer interfaces: the systems, and how they connect the brain to technology. I have a lot of cameras here, so I'm just going to focus on this one. I would like to start with a definition of what brain-computer interfaces are. These are systems that collect brain recordings from brain-imaging tools and use those signals to understand what the user is experiencing. That includes what the user's intentions are, what their emotional state is, and whether they are in certain cognitive states such as fatigue or high attention. We use these state classifications to make a connection between the human and technology, some external machine that could be a computer or a robotic platform. Once we have this connection between the human and the machine, there is also a feedback element. It can happen actively, when the user actually sees how their intentions have been translated into commands for the machine, or passively, when the machine in the background changes its behavior based on the state of the human. A typical example I give here is a robot that stops talking and no longer asks questions when the user is tired, which is relevant for a robot like Pepper, which talks all the time. The image most people have of brain-computer interfaces, the one they get from the media, is of invasive systems that translate a human's motor intentions using brain implants. What we usually have with these systems is that the person is connected to some neural prosthetic or an exoskeleton, and surgically implanted electrodes are used to understand how the brain reacts whenever the user imagines or intends a movement. In my research, however, I do not work with patients. I work with healthy users, and the idea is to record brain activity noninvasively from the surface of the scalp.
We call this method electroencephalography, or EEG for short. With the advancement of the hardware, we now have EEG headsets and caps that can look at electrical activation while people are engaged in an activity, completely noninvasively, so there is no intrusion on the user. At the same time, the hardware has improved to the point that we can actually collect this data wirelessly. You can wear these caps, these headsets, and walk around, and we will still have access to your brain activity. In my research, I have been using EEG to connect the human brain to multiple forms of technology. In my PhD, at the Ishiguro lab, I connected humans to human-like robotic platforms and investigated the sense of embodiment and telepresence that they experienced. In my postdoc, I shifted toward social robotics and tried to develop systems that could monitor the brain activity of a user in a therapeutic setting with a social robot. Right now, I'm focusing on two interesting projects that use neuroscience tools to measure human perception during engagement with digital humans, and on learning, which is the project I'm going to explain in more detail today. I also have some projects that recently kicked off, or are still awaiting funding, that focus more on consumer psychology and elderly health care. Because I don't have time to talk about all these projects, I have selected neurofeedback and augmented learning, and I would like to walk you through the steps we take to develop these systems. Before that, I would like to emphasize how important these neurotechnologies are for learning. Why do we need to look at brain activity to measure a user's learning process?
What is currently happening with most educational technology is very similar to traditional education: there is a learning phase and then an evaluation phase. You interact with technology in the form of a robot, a virtual reality environment, or maybe a computer tutoring system. After that, we ask the subject how they felt and what their subjective workload was, or we give them a task and collect their task performance or response time, and that's basically it. We have no information about the learning process itself, or about the dynamics of the learner's experience during that process. What I suggest here, and what many researchers in this field suggest, is to use neurophysiological measures, which can give us an ambient tool that does not interfere with the learning process but at the same time lets us understand how the trainee is experiencing the training. The cognitive theory behind this connects performance with cognitive workload. We know that in many training settings there is a certain level of cognitive workload, called the optimal workload, at which the trainee shows the highest performance and the highest learning gain. If the workload is too high, the trainee becomes anxious and distressed; if the workload is too low, they get bored and can no longer learn. So if we can maintain that optimal level of cognitive workload, we can keep performance efficient throughout the learning phase. That is the idea of this project. With that, I would like to move to the MasterMinds project, which I introduced briefly last time. It is a collaboration between the Royal Netherlands Air Force, Tilburg University, and MindLabs, and it aims to integrate neurotechnology and virtual reality (VR) environments into aviation training.
For those of you who were not here last week: the context of this project is that right now, most Air Force aviation training happens either in actual flying hours or in cockpit simulators, and cockpit simulators are resource-intensive and expensive. Particularly with the COVID-19 pandemic, we saw that you can suddenly lose your access point, and then the training simply stops. The Air Force wants to use new technology, such as virtual reality, to make training more accessible: a pilot can be at home, connect to a VR simulation, and get training, even interacting with their instructor inside that simulation. And since they are wearing a VR headset anyway, the idea is to combine it with EEG electrodes to scan their brain activity and monitor their workload, so that we can make the training more efficient for them. We have three research questions in this project. First, we would like to validate the VR simulation for this kind of training: we want to look at the effectiveness of the simulations and compare them to traditional cockpit simulators. Second, we want to know the neural indicators of long-term training; aviation training in particular is not something that happens overnight, it takes longitudinal training sessions, and we want to know how the learning curve emerges for each pilot. Third, we want a neurofeedback model that can adapt the training within the VR simulation to each individual pilot. Here is an outline of our work packages: there is the part where we develop the system, and the part where we evaluate pilots interacting with that system. I think the most interesting part for the viewers today is the development part, so I want to say a little about what is happening under the hood.
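The neurofeedback model in that third research question is, at its core, a closed loop: estimate workload from EEG, compare it to a target band, and nudge the training difficulty back toward the optimal zone. Here is a minimal sketch of that loop; the thresholds, the step size, and the function name are all hypothetical illustrations, not the project's actual model.

```python
def adapt_difficulty(difficulty, workload, low=0.4, high=0.7, step=0.1):
    """One step of a hypothetical closed-loop trainer.

    Both difficulty and workload are normalized to [0, 1]; the
    thresholds defining the 'optimal workload' band are made up
    for illustration.
    """
    if workload > high:        # overload: risk of anxiety, ease off
        return max(0.0, difficulty - step)
    if workload < low:         # underload: risk of boredom, push harder
        return min(1.0, difficulty + step)
    return difficulty          # inside the optimal band: hold steady

# Example run: simulated workload readings drive the difficulty
d = 0.8
for w in [0.9, 0.8, 0.6, 0.3]:
    d = adapt_difficulty(d, w)
```

The point of the sketch is only the control structure: the difficulty drops while the trainee is overloaded, stays put inside the optimal band, and rises again once the workload signal indicates boredom.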
What we usually do in these experiments is collect a lot of EEG data and then start processing it with AI techniques. We did a recent systematic review using keywords such as EEG, learning, and training, particularly in the context of aviation training. What we found is that the reports are quite a mess: learning happens everywhere in the brain, and there are many components reported in relation to learning, even within the single context of aviation training. But one indicator seemed more promising and has been used in contexts other than aviation training: the EEG engagement index, which uses spectral features such as beta, alpha, and theta band power to estimate how much cognitive workload the user is experiencing. The idea behind these band powers is very simple if you are not familiar with them: they look at how fast or slow the brain activity is. The faster the activity, the higher the activation, and perhaps the more involved and engaged you are in a task. For our project, the first experiments we are planning will validate this EEG indicator in two simulation settings: a virtual reality simulation provided to us by the Air Force, and a desktop simulation. We are going to recruit subjects as soon as the labs reopen after the pandemic. Together with EEG, we will of course collect subjective measurements, eye tracking, and behavioral measures; it is always the relationship between these measures that ensures the EEG indicator is actually showing us what we want to find in the data. Now, I want to close this talk with some remarks on BCI ethics. There is no talk about BCI in which I don't get questions about brain hacking, and ethics is certainly an important issue in BCIs.
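The engagement index mentioned above is commonly computed as the ratio of beta power to the sum of alpha and theta power, following Pope and colleagues' classic formulation. A minimal sketch, assuming the band powers have already been extracted from the EEG spectrum (the example numbers are made up):

```python
def engagement_index(beta, alpha, theta):
    """EEG engagement index in the common beta / (alpha + theta) form.

    Inputs are band powers (e.g. in uV^2) averaged over a time window;
    more fast beta activity relative to the slower alpha and theta
    rhythms is read as higher engagement/workload.
    """
    return beta / (alpha + theta)

# Fast, beta-dominated activity scores higher than slow activity:
focused = engagement_index(beta=6.0, alpha=3.0, theta=2.0)
drowsy = engagement_index(beta=1.0, alpha=6.0, theta=4.0)
```

This matches the intuition in the talk: the index rises when fast rhythms dominate and falls when the slower, "disengaged" rhythms take over, which is what makes it a candidate workload indicator to validate against the subjective, eye-tracking, and behavioral measures.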
A recent review collected different categories of ethical issues around BCI technology. There are physical factors, such as how safe the technology is; psychological factors, such as whether the technology changes the user's sense of self and agency; and, importantly, social factors. I think the physical and psychological factors have already been investigated somewhat with patients, but the social factors require expertise from other fields such as law, regulation, and ethics. To that end, I am planning to write a proposal together with a law school student on the ethics of BCI and on whether there should be regulation of these systems, so that they are accessible to everyone and can support human flourishing. Thank you so much.