Hi, everyone, my name is Sudarshan. I'm one of the HACQ organizers. So today I'm here to talk about this thing called OpenBCI, and I'm going to give a brief introduction to brain-computer interfaces. I already know what you guys are thinking, right? When you hear "brain-computer interface", you expect super futuristic kind of stuff. But no, brain-computer interfaces actually look really ugly right now. What they are is a super complex array of electrodes that requires very specific placement around your head to pick up electrical signals from some part of your brain, depending on where those electrodes are placed. And this produces what's called an EEG, or electroencephalogram. But you know, this guy's ugly, so let's just replace him with a baby. Now it looks significantly cooler. So you're probably wondering what an electroencephalogram is. EEGs are basically like ECGs. ECGs are the thing you do in the hospital, right? They take the electrical pulses of your heart and they appear as lines on the screen. Well, EEGs are the same as ECGs, except you replace "cardio" with "encephalo", which means brain or head. EEGs are described by how many channels they have. For a typical brain-computer interface, that can range from around eight to 32 channels, and usually the more channels you have, the more data you can get from your EEG scan, which means the more things you can infer from that data. And this is what EEGs look like. If you look really, really closely at the patterns, you realize it's absolute garbage, right? Like really, what can you tell from this? Absolutely nothing. Well, that's not completely true. A person can roughly differentiate about five main things: delta waves, theta waves, alpha waves, beta waves and gamma waves. They're arranged so that delta is the lowest frequency and gamma is the highest, and as the frequency increases, what you get is a state of more excitability. So delta is when you're sleeping, and gamma is when you're really, really thinking hard about something. And you can actually look at what these waves represent. Gamma waves are typically seen when you're problem solving, or specifically when the area of your brain that the electrode is reading is undergoing some sort of problem-solving behavior. Next is beta, which is slightly active; you're not really concentrating, but it's more active than normal. Alpha is when you're resting that part of your brain, so if you're not moving your arm, you'll typically see alpha waves from that part of your brain. Theta is drowsiness, when you're about to fall asleep. And delta is when you're actually in proper, deep sleep.

So what is OpenBCI then? Well, OpenBCI comes from "open source" plus "brain-computer interface". It's basically a way to make medical-grade EEGs more accessible for hobbyists to play around with, because a typical medical-grade EEG can cost upwards of $10,000 or somewhere around there; I don't know the exact numbers. It's completely hobbyist-powered and fully open source, which reduces the cost of this kit over here to about $1,000 or so and lets basic hobbyists experiment with this kind of technology.
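[Editor's note: not from the talk. A minimal sketch of how you might separate one raw EEG channel into those five bands, assuming NumPy and SciPy; the signal below is fake, and the 250 Hz sampling rate is an assumption, not a number quoted in the talk.]

```python
# Rough sketch: estimate average power in each classic EEG band for one channel.
# The 250 Hz sampling rate and the fake signal are assumptions for illustration.
import numpy as np
from scipy.signal import welch

BANDS = {
    "delta": (0.5, 4),   # deep sleep
    "theta": (4, 8),     # drowsiness
    "alpha": (8, 13),    # resting that part of the brain
    "beta":  (13, 30),   # active, not concentrating hard
    "gamma": (30, 45),   # problem solving
}

def band_powers(samples, fs=250.0):
    """Estimate power per band from one channel using Welch's method."""
    freqs, psd = welch(samples, fs=fs, nperseg=min(len(samples), int(2 * fs)))
    step = freqs[1] - freqs[0]
    return {
        name: float(psd[(freqs >= lo) & (freqs < hi)].sum() * step)
        for name, (lo, hi) in BANDS.items()
    }

if __name__ == "__main__":
    fs = 250.0
    t = np.arange(0, 2.0, 1.0 / fs)
    # Fake "EEG": a 10 Hz alpha-band tone buried in noise.
    fake = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(len(t))
    print(band_powers(fake, fs))  # alpha should dominate
```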
This specific brand of open-source brain-computer interface comes in an eight and a 16 channel version, called the Cyton and the Ganglion boards if you look them up on the website. It can actually expand up to about 32 channels, which is the maximum it can go to, and that's like a good medical-grade EEG, even though it's not as good as Elon Musk's. Elon Musk is up to 3,000 electrodes, and that thing is surgically implanted into your brain, so that's a different beast. Applications for OpenBCI include things like prosthetic arms, mind-controlled drones, which I'm currently working on (I'm not really sure how that's going to go, but it might be cool), and also emotion recognition: you can use the sensors to figure out what a person is feeling. That said, I'm a poor NSF and I earn private-rank pay, and the kit starts at about $1,000, so I didn't buy it; a rich person bought it for me.

There are a few interesting things about OpenBCI. Firstly, if you want to make add-ons to the board, things that add hardware capabilities, say an FPGA that does some sort of fast Fourier transform in hardware, you can do that, because the code and the electronics board are completely open source. Secondly, the frame itself is completely 3D printed, so you just print it out at home and assemble it yourself, which reduces the cost for them by a significant amount. Unfortunately, the frame consists of two hemispheres, and printing one hemisphere takes about a solid day and a half. Basically, I didn't sleep for three days while my printer was in my room printing away at these headsets. The electrodes themselves are injection-molded and come with a spring; those are really the most proprietary parts of the system, but they provide a few extra in the box, so that's a good thing. Another thing: these hemispheres are really, really hard to print because they need a lot of support material. So about three to four days of failed prints later, I got the frame out. It took about two tries for each hemisphere because the supports kept failing.

Then you add in the Cyton board; that's what it looks like. In this hardware setup right now, it's under this casing, so you can't see it. It's running an STM32, I think, one of the ARM chipsets, and it essentially just reads the electrode data. The interesting thing about the electrodes is that they're completely remountable. Each mount comes with a set of screws, so you basically just screw on your electrodes, and you can move them around because the wires aren't really attached to anything; they're just loosely hanging off the frame. Where you actually place them determines what kind of data you can get from your headset. For example, if you want a lot of motor-related data, you put them near the motor cortex; if you want more high-level, emotion-related data, you might put them over the prefrontal cortex. Your capabilities really depend on your electrode arrangement. Also, a quick note: they really shouldn't have used jumper cables for this part of the electrode wiring; I feel like it will break very easily, just my personal opinion.
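[Editor's note: not demonstrated in the talk. If you want raw samples from a Cyton board in your own script instead of the official GUI, one route OpenBCI supports is the BrainFlow library. This is a minimal sketch under that assumption; the serial port name is a placeholder for wherever the dongle shows up on your machine.]

```python
# Minimal sketch of streaming raw data from a Cyton over the provided USB dongle
# using BrainFlow. "/dev/ttyUSB0" is a placeholder; on Windows it would be e.g. "COM3".
import time

from brainflow.board_shim import BoardShim, BrainFlowInputParams, BoardIds

params = BrainFlowInputParams()
params.serial_port = "/dev/ttyUSB0"  # the dongle appears as a serial port

board_id = BoardIds.CYTON_BOARD.value
board = BoardShim(board_id, params)

board.prepare_session()
board.start_stream()
time.sleep(5)                        # collect roughly five seconds of data
data = board.get_board_data()        # 2-D array: rows are channels, columns are samples
board.stop_stream()
board.release_session()

eeg_rows = BoardShim.get_eeg_channels(board_id)   # which rows hold the 8 EEG channels
fs = BoardShim.get_sampling_rate(board_id)        # Cyton's sampling rate
print(f"Got {data.shape[1]} samples at {fs} Hz on channels {eeg_rows}")
```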
So this is the Ultracortex Mark IV; it comes with 35 total electrode positions, and these highlighted ones are the ones you can place electrodes at. I have mine in a standard configuration, which is these three, these two, these two and the back two. But you can look at it later here; it's more obvious in person. After installing and wiring them up, you get something like this, and you can turn it on. I'm going to leave it in a corner after the talk, so if you want to come play with it, you can come play with it afterwards.

For connecting to the PC, unlike a lot of other traditional EEGs, it's actually completely wireless, which is really useful if you want to walk around. It does this using Bluetooth; they provide their own nifty little Bluetooth dongle that you have to plug in. For some reason it doesn't work with your own computer's Bluetooth adapter; I think there's something in the hardware of the dongle. I don't know, the protocol is completely proprietary. I'm pretty sure there's a way to make it work with your normal Bluetooth adapter, but I don't know; it was just way easier to plug theirs in and have it work. What you get is this sort of UI here. You can see the eight channels and the EEG traces themselves, and they actually provide a few other plots they can draw, like for example the head plot, and you can also detect how elevated your state is, basically measuring, yeah.

Okay, so that's about it. Just a few things: that's my email if you want to ask me questions, and this is my handle everywhere. And please come to GeekCamp SG, because I'm giving a really, really in-depth talk on brain-computer interfaces and how human evolution is going to merge with machines. It's actually called "Love at First Byte: A Romantic Journey to the Future of Us". I'll be wearing this thing during the entire talk and having the slides change depending on what I'm feeling, so there are a lot of cool demos to come. If you have any questions, you can ask me now. Questions? Yeah.

Yeah, so the typical research right now does a lot of clustering. Usually they take a bunch of EEG readings from a bunch of people, for example a whole set of data for each emotion, and then they cluster your current EEG data into those groups. That's most of the data analysis they do. What I'm planning to do is take more of a machine learning approach by using neural networks instead. I'm thinking of using an LSTM, because this seems like very sequential data; there's a lot of time-series stuff going on. So I thought maybe an LSTM would work, but I don't know if anyone else has done that before. I'll figure it out, I guess. Yeah.

That's the standard method of paring down the data. Yeah, yeah, yeah. That's why I said the data is absolute garbage; I have no idea how anyone is supposed to get anything out of it. Anything else? No? Yeah, okay. Next one.
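[Editor's note: a very rough sketch of the LSTM idea mentioned in the Q&A above, assuming PyTorch. Every shape, class count, and name here is made up for illustration; this is not something shown in the talk.]

```python
# Sketch: treat each EEG window as a sequence of per-timestep channel readings
# and classify it into one of a few hypothetical emotion labels.
import torch
import torch.nn as nn

class EEGLSTMClassifier(nn.Module):
    def __init__(self, n_channels: int = 8, hidden: int = 64, n_classes: int = 4):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_channels, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time_steps, n_channels)
        _, (h_n, _) = self.lstm(x)      # h_n: (1, batch, hidden)
        return self.head(h_n[-1])       # logits: (batch, n_classes)

if __name__ == "__main__":
    model = EEGLSTMClassifier()
    # Two fake one-second windows at an assumed 250 Hz, 8 channels.
    window = torch.randn(2, 250, 8)
    print(model(window).shape)          # torch.Size([2, 4])
```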