How's everyone doing? I'm Alex. Thank you for coming. I build software to help understand human behavior, and one of the sources I base my research on is electricity. You see, the body is an ocean of electricity. Everything we do is facilitated by electrical signals flowing through our bodies: when we speak, when we move, even when we're simply breathing. The way we recharge ourselves is, of course, by resting and feeding. Well, today we're going to use JavaScript to witness some of that electricity, specifically the electricity coming from our brains.

So before we get started, I'm going to need a volunteer to join me on stage. Hopefully someone without a dirty mind and without a huge head, so we can fit the headset. So who's up for the challenge? Is someone there? Someone here? Your head is not too big. You there. Yes, please. How are you, sir? What is your name? Mike. Nice to meet you. Where do you come from? Holland. Awesome, man. Well, thank you. You look like you have a decent kind of head, so thank you for coming.

Oh, OK, so we'll see if that works. From here it looked like you had a head that would fit this, but we'll give it a try. You can grab it real quick, just so you can see what's going to go on your head. Yeah, yeah. Right, so before we get started, let's try to put it on and see if it fits. OK, wait, not too fast. Let me try to adjust it — you do have a big head. It's fine. This headset in particular is medium-sized, so it fits me, but it doesn't fit a lot of people. So let's see, I'm adjusting it here. OK, wait a little more. OK, maybe it's not too small. Is it comfortable? Yeah. You think so? But it doesn't go quite all the way down. All right, can we try someone else? I'm so sorry, man. That's fine. Yeah. Hey, please give it up for Mike.

Sorry, I promise I won't embarrass you. You think this is going to fit? OK, looks pretty big. What is your name, sir? Please come. What is your name? David, thank you. All right, let's give this a try. Oh, yeah. OK, let's see. Oh, perfect. OK, but don't tell Mike, OK? OK, do you feel this spiky here? A little spiky. OK, do you feel it here? Is it too tight on the sides? Oops, wait. Don't die on me. What about back here? Do you feel it? OK, I think it's pretty much all set.

So the next thing I'm going to do — can you hold this for a second? — I'm going to bring down these little clips for the earlobes. Not a big deal, it's just so your head doesn't explode. All right, so I'm going to turn it on back here. If you don't mind standing up and just turning around so they can see the blue light — or not. OK, so something that I didn't tell you, my bad: I'm going to have sudo access to your brain, if that's OK. And you have nothing to hide, right? Nothing to hide. So if you're nervous right now, just calm down, focus, and think of something appropriate for the situation.

Awesome. OK, so let's just do this, right? What I'm going to do is launch my demo here. The first thing that flashes is kind of like the demo booting up, right? So relax, it's going to be OK. Ready? All right, one, two — and I'm just kidding, that's not it. Yeah, but now seriously, I'm going to go to my node terminal here, and I'm just going to launch my Node app, which is the entry point for this application. I'm going to do an npm run visualize. And what's going to happen is that Node is going to crash. Oh, typos, of course.
If there's anything that's going to go wrong during a talk, it's your demo — always. OK, Node is going to open the app, it's going to start communicating with the headset, and we should start seeing some activity, some data samples. And it's probably going to go pretty fast. It's the Angular CLI, don't even tell me. So actually, give me one second. Let me make sure it's properly turned on here. Give this another try. OK, last resort: just unplug it and plug it back in. All right, I think it's going to be good. Awesome.

So right now, what we're seeing is we're getting a lot of data from the headset. And if you start looking here, there are around three channels that are not changing values, which are four, five, and six. So I'm going to make sure they're making contact. Channel one is fine. Two, three is this one — it's fine. Four is here. OK, we're almost there. Then we have this one. No, I'm not going to kill you. This one is a little tight here, right? If you're uncomfortable at any point, please let me know. What about there? You're OK. Oh, I see, this one isn't making a lot of contact. So I'm going to just — OK, we're almost there. There you go. OK, one more. Right here — it's fine. One, two, three, four, five. I think this is good for now. There you go. So it's all starting to make sense.

So what I'm going to do now is launch the real app. We have some activity here. All right, so you're very relaxed. That's good, OK? Now what I want you to do — let's start with the muscular part first — is just try to clench your jaw. OK, good. Now try to blink a few times, a little harder there. Are you really blinking? OK, there you go. So we're getting some electricity. Awesome.

Great, so what we're seeing right here is this headset, this device, capturing the electrical impulses from the brain in real time. I'm receiving them, and at the end they basically appear in the browser. Another way of visualizing brain waves is by actual frequencies, right? What I showed you before was a time series, and now we're seeing it as frequencies. Great, so far so good. By the way, I hope you read the fine print.

OK, so what just happened? This is what I call neural JavaScript. And neural JavaScript is a combination of a few things that you might be familiar with, put together: isomorphic JavaScript, both on the server and the client; neurotechnology, as you can see here with the headset; some data science; data visualization; and, of course, open source.

So let me break down the way this actually communicates. I don't know if you can see that black part there — there's not a lot of contrast. OK, starting at the top left, you have the headset. It's the OpenBCI headset. Via Bluetooth, it sends a signal to this USB dongle that you can probably see right here, which is part of the kit. And that USB dongle — I hate that word — connects to the computer, of course, and communicates with my Node application via the serial port, which pretty much reads the information the headset is sending, serialized, directly from the USB, ultimately getting to Node. And that's when the actual work starts happening on the software side. Then via WebSockets, we send this data to the browser. Of course, there's a lot of manipulation done to this data for the browser to be able to interpret it in a friendly way.
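(As a side note: the npm run visualize command from the demo is just an npm script pointing at the Node entry point. A minimal sketch of that wiring — the entry file path here is an assumption, not the project's actual file name:)

```json
{
  "scripts": {
    "visualize": "node index.js"
  }
}
```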
As you can imagine — we've seen the data as frequencies and as a time series — there's a lot of parsing being done on the data side. Some algorithms, for example the FFT, the Fast Fourier Transform, for the frequencies. But also, the amount of data being sent to the client is specifically trimmed down to something that won't overwhelm the browser.

So again, we go from the headset via Bluetooth, and I want to break down each one of these pieces. The first one, the headset, is OpenBCI, which stands for open source brain-computer interface, and it was created and manufactured in New York. The great thing about it is that both the hardware and the software are open source. This one in particular was 3D printed right there at their office, and you can see that most of it, the plastic, is that 3D print. And then, ultimately — if you don't mind standing up and turning around so they can see the board — we have the OpenBCI board, which is where all these sensors connect in order to process the signal. Great, and the cable. So putting this together, you can do it yourself: buy the board, buy the sensors, and put it together at home for an affordable price. A few years ago, when the technology was mostly for medical research, something like this would be over $30,000. Now it's a few hundred bucks, right? You can see that there. And that's the OpenBCI headset in a nutshell. If you want to find out more, you can go to OpenBCI.org — but later.

This is an illustration of how it all comes together, all the little pieces. It's actually not too bad. There's some complexity, but everything is well documented and on GitHub as well, including all the 3D designs. The USB dongle, which is of course part of the kit, is based on Arduino — if you're familiar with the Arduino microcontroller, that family — and it uses radio frequency, Bluetooth, to communicate with the board. But it's custom-made to work with that headset specifically.

And then we get to the serial port part, and we have Node.js. I don't know how many of you have worked with the serial port. I have some friends who were the ones that developed this whole SDK. But the idea is that the connector, in this case the serial port, could be swapped for other things, like Bluetooth directly to Node, or Wi-Fi. So right now we're doing serial port; in the future we're going to have the ability to use more connectors, and that's being developed right now.

Great, so on the Node side of things, it's on the npm ecosystem, and you can simply do an npm install of the OpenBCI SDK. That gets you the package with the class that allows you to listen to these events. At the end of the day, everything coming from the headset and received in Node is converted into an event that can be accessed easily with the SDK. If you're interested in knowing more about the Node side of things, you can definitely go to the repo — I'm going to have some links up here later for you to see. But as you can see, at the end of the day, we get a data sample.

So what is this data sample? This headset in particular has eight channels, and you can see them mostly here. Not all of these nodes are being used, but eight of them are in very strategic places. It uses a grid system called the 10-20 system. And it's actually interesting, because the odd numbers are on the left and the even numbers are on the right.
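(Going back to the Node side for a second — here's a minimal sketch of what listening for those data samples looks like with the OpenBCI Node SDK. It follows the SDK's documented event API; exact package and method names have shifted between versions, so treat this as a sketch rather than the project's actual code:)

```ts
// Minimal sketch: connect to the board over the serial port and listen
// for 'sample' events. Assumes the `openbci` npm package (npm install openbci).
const { OpenBCIBoard } = require('openbci');

const board = new OpenBCIBoard();

board.autoFindOpenBCIBoard()                          // locate the dongle's serial port
  .then((portName: string) => board.connect(portName))
  .then(() => {
    board.on('ready', () => board.streamStart());     // board initialized: start streaming

    board.on('sample', (sample: any) => {
      // One of these arrives roughly every 4 ms (250 Hz):
      // sample.sampleNumber, sample.channelData (8 values), sample.auxData, ...
      console.log(sample.sampleNumber, sample.channelData);
    });
  })
  .catch((err: Error) => console.error('Could not connect:', err));
```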
And when you're doing research with those 10-20 positions, you can see interesting activity happening more on one side of the brain than the other, which is definitely valuable.

Great. So we get channel data, which is eight channels, each one coming from the headset in order: channel one, two, three, four, five, six, seven, and eight. And you have two on the back, right? We have some auxiliary data that we're not going to go into in detail today, but it's pretty much data coming from the accelerometer, because the headset also features an accelerometer. Then we have some other data, like the start byte, the sample number, the stop byte, and all that fun stuff.

Now, we're getting one of these samples every four milliseconds — that's 250 samples per second. That's pretty fast; it's a very good sample rate, and it provides very decent, high-quality data from the headset. So when you start working at the level of something happening every four milliseconds, and you have to plan how that data is going to be manipulated and channeled through your app, it becomes a very interesting challenge. I definitely had a lot of fun on the Node side as well as the front end.

Ultimately, we get this to the browser via WebSockets. How many of you have used WebSockets before? Very cool. Yeah, lots of fun. The interesting thing about WebSockets is that, depending on how many events you're sending to the client, it can really affect performance. So limiting the number of WebSocket events is something we need to take into consideration while developing a UI for this.

Now, the UI. I'm using some technologies that you might have heard of. One of them is Angular 2, which controls, let's say, the DOM and the change detection. And then I'm using some data visualization libraries based on different technologies: Smoothie is based on Canvas, Chart.js is based on Canvas, Plotly has Canvas and WebGL, Three.js definitely has WebGL. I'm mixing these technologies because, for the types of visualization we need here, no single library satisfies all of the use cases.

Awesome. OK, so clear your mind one more time. You'll be awesome. Just keep any dirty thoughts away, because I'm going to show you again and start explaining the implementation of the real-time time series. Great, so our app is still running. And here, when we see the time series — did you blink just now? Did you burp? I noticed something there, something happened. There you go. OK, so the numbers here on the right are actually the amplitude, right? And you're going to see positive as well as negative numbers. You'll see a lot of negative numbers, and that's neither good nor bad — the energy just flows up and down. These are just changes in amplitude; whether they're negative or positive doesn't mean anything in particular. Just something to keep in mind.

Great. So what I'm going to do now is show you how this is built. And you can take a break from that, so your head doesn't explode — thank you so much. There you go. Are you OK? Good, awesome. Please give it up for him. Right, I unplugged it and turned it off, and I'm going to stop my app real quick. It already stopped. And let's talk a little bit about Angular 2 and some data visualization.
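(First, though, here's what that WebSocket throttling can look like on the Node side: buffer the 250 Hz samples and flush them to the browser as one aggregated event. A sketch assuming socket.io; the 'sample-buffer' event name and the one-second window are illustrative, not the app's actual values:)

```ts
// Sketch: aggregate ~250 samples/second into one WebSocket event per second,
// instead of flooding the browser with an event every 4 ms.
// Assumes socket.io and the `board` from the previous sketch.
const httpServer = require('http').createServer();
const io = require('socket.io')(httpServer);

let buffer: number[][] = []; // each entry: the 8 channel amplitudes of one sample

board.on('sample', (sample: any) => {
  buffer.push(sample.channelData); // ~250 entries accumulate per second
});

setInterval(() => {
  if (buffer.length === 0) return;
  io.emit('sample-buffer', buffer); // one aggregated event (~250 samples)
  buffer = [];
}, 1000);

httpServer.listen(8080);
```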
How many of you have used Angular 2? OK, good amount. So this is one of my first projects using Angular 2. I've been using it for over a year now, since it was in beta and through the release candidates. As you might know, one of the options right now is to use TypeScript, which is a very friendly way of developing Angular 2 in particular; it's what I use when I do Angular 2. So the code I'm going to show you here is based on TypeScript, but it's going to look familiar if you have used, let's say, Babel or other transpilers. I tried to keep the TypeScript-specific code out of this presentation, so it's a little easier to understand for everyone.

So here we have some module imports. From the Angular API, we're getting some of the things we have available, like Component and ElementRef, and then we have some hooks — I'm going to explain the Angular component lifecycle so you know how this is all connected. This time series is based on Smoothie, which is the best library I've found for streaming data, especially as a time series. From it we're getting some classes, SmoothieChart and TimeSeries. And then we have some app-specific code and, of course, sockets.

This is the component metadata that I have created for the time series component. As you can see, we have a selector — if you're familiar with Angular 1, please raise your hands. Same as before, all right. You're going to see the selector, which is not that different from Angular 1; you just pass it as a string. Then you have a template URL, some style URLs, and then we have some providers, which is kind of like a way of doing dependency injection. In this syntax here, the at sign is just metadata — these are called annotations, and they're definitely available in TypeScript.

After that, let's talk a little bit about what's going on in the view. The most important part here is that we have a canvas element, and that canvas element has an ID, right? You can optionally pass some width and some height. What's going to happen is that from the Angular world we're going to target that element, and then we're going to stream the data to it via Smoothie. So let's see what that looks like.

This is my class definition — this is kind of like the top part, and it continues; I'm going to show you everything. If we explore the class, the time series component, we're implementing some of the hooks here, for example ngOnInit and ngOnDestroy; they're part of the Angular 2 lifecycle, and we need them, as I'm going to show you right now. We have some dependency injection here: I'm injecting a view, which is basically a reference to the element itself — the view we're referencing from the component metadata. We have some constants, we have sockets, we have a service with some utilities. And then, most importantly, we're creating a SmoothieChart here. And we have some data containers, in this case arrays, for the amplitudes and the timeline.

So I'm going to run the application in simulation mode so I can walk you through what these things mean. Great, this is the simulated data — it looks way better than the real brain waves. At the bottom here we do have a timeline in seconds, then we have the amplitudes, and then we have the channels here. All that data is being passed from the Node side via sockets. So when I initiate the component, I start by basically initializing the Smoothie library.
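(For reference, here's a stripped-down sketch of roughly that component shape — the annotation, the view with the canvas, and the class fields. The selector, file names, and IDs are illustrative, not the project's actual code:)

```ts
import { Component } from '@angular/core';
import { SmoothieChart, TimeSeries } from 'smoothie';

// Sketch of the component metadata and class fields; names are illustrative.
@Component({
  selector: 'time-series',
  // The view's key element is the canvas that Smoothie will stream to:
  //   <canvas id="timeSeries" width="800" height="300"></canvas>
  templateUrl: 'time-series.component.html',
  styleUrls: ['time-series.component.css']
})
export class TimeSeriesComponent {
  chart = new SmoothieChart();   // the streaming chart instance
  channels: TimeSeries[] = [];   // one TimeSeries per EEG channel
  amplitudes: number[] = [];     // data containers bound in the view
  timeline: number[] = [];
}
```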
And then I'm listening to the socket events. In this case, it's just a string that matches the event being emitted from the Node side of things. As I receive this — I believe it's an event every 32 milliseconds or so... no, actually, I'm buffering a lot on the backend, sending it, and then reusing that buffer, so I think it's more like one second of aggregated data. But it's still real time, and it looks completely in sync. What we get back is a data object that contains the amplitudes, the timeline, everything I need in order to fuel this chart. And ultimately, I'm using a method that I created called appendTimeSeriesLines, which is basically what makes those lines you see move up and down for each one of the channels. And it's important to think about the channels here, right? Be mindful that everything you're doing here happens at the channel level.

So ngAfterViewInit is another lifecycle hook; it fires basically when the view is ready. Think of it kind of like jQuery's document-ready function. The view has a property called nativeElement, which is basically a reference to the DOM itself. We're querying the DOM and looking for the time series ID, which matches our canvas element. That's how the connection is made between the DOM and JavaScript, basically.

Also, for each one of the lines — basically each one of the channels, so eight iterations — we're adding a TimeSeries, and each one gets a stroke, which in this case is mapped to colors I have saved on the front end. So all the different colors are being referenced here in this iterator.

These two functions look very similar, but one is adding — addTimeSeries basically initializes the data stream for each channel — and the other is appending. The first one, addTimeSeries, is called once; appendTimeSeriesLines is called each time we receive a socket event, and it's basically an append to the Smoothie TimeSeries object with the current time. So far so good.

Ultimately, for this type of data visualization, it's very important that when the component is destroyed — meaning it was removed from the DOM — we remove the socket event listener. Otherwise it can just linger, build up, and use some of your memory. So it's definitely recommended to remove the socket event listeners as the component is destroyed. Say you're switching tabs: you don't need to keep receiving events for the other components.

Great. These are just other parts of the same chart, including some of the, let's say, the channels. Actually, this one is for the amplitudes: I'm iterating over an array of eight items, and for each one I'm using data binding in Angular 2 to display the proper value. Angular 2 is great for this, just like Angular 1 or any other framework. And here, as you can see, we basically have a for loop in the DOM; the directive is just ngFor. From the previous version of Angular to this one, it's a little closer to what JavaScript really is. And this is basically the time value at the bottom.

Great. So let me take a pause here and show you a little bit of the other types of data visualization that we have. Other than the frequency line, we also have a frequency radar, which is basically the same frequency data, just displayed radially.
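(Before moving on, here's a compact sketch pulling those lifecycle pieces together — ngAfterViewInit querying the canvas, the one-time addTimeSeries, the per-event appendTimeSeriesLines, and the ngOnDestroy cleanup. The colors, event name, socket URL, and the 4 ms spacing are illustrative assumptions:)

```ts
import { Component, AfterViewInit, OnDestroy } from '@angular/core';
import { SmoothieChart, TimeSeries } from 'smoothie';
import * as io from 'socket.io-client';

// Eight stroke colors, one per channel (illustrative values).
const COLORS = ['#c0392b', '#2980b9', '#27ae60', '#8e44ad',
                '#f39c12', '#16a085', '#d35400', '#7f8c8d'];

@Component({
  selector: 'time-series',
  template: '<canvas id="timeSeries" width="800" height="300"></canvas>'
})
export class TimeSeriesComponent implements AfterViewInit, OnDestroy {
  chart = new SmoothieChart();
  channels: TimeSeries[] = [];
  socket = io('http://localhost:8080');

  ngAfterViewInit() {
    // The view is ready here, so the canvas can be queried and streamed to.
    const canvas = document.getElementById('timeSeries') as HTMLCanvasElement;
    this.chart.streamTo(canvas, 1000); // 1 s delay absorbs the server-side buffering
    this.addTimeSeries();
    this.socket.on('sample-buffer', (buffer: number[][]) =>
      this.appendTimeSeriesLines(buffer));
  }

  // Called once: initialize one data stream per channel with its stroke color.
  addTimeSeries() {
    for (let i = 0; i < 8; i++) {
      const series = new TimeSeries();
      this.chart.addTimeSeries(series, { strokeStyle: COLORS[i], lineWidth: 2 });
      this.channels.push(series);
    }
  }

  // Called per socket event: append each buffered sample, spread ~4 ms apart.
  appendTimeSeriesLines(buffer: number[][]) {
    const now = Date.now();
    buffer.forEach((sample, s) => {
      const t = now - (buffer.length - 1 - s) * 4;
      sample.forEach((amplitude, c) => this.channels[c].append(t, amplitude));
    });
  }

  ngOnDestroy() {
    // Remove the listener so it doesn't linger and leak memory.
    this.socket.off('sample-buffer');
  }
}
```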
And we also have some bands, which is very interesting, because in this type of data visualization we're seeing the electrical amplitude for each one of the frequency bands. Who here is familiar with the different frequency bands, like delta, theta, alpha? So this is when it gets interesting. This is when you can actually start doing something with the information you're getting, because right now you're seeing data being displayed, and that's nice. But according to research, you're going to get more alpha, let's say, as you start to get focused and you start to meditate. You can actually train your brain and see some results from this.

So, funny story: at the beginning, I wanted to start by trying to measure something in our bodies. I was like, I want to see if I can detect with JavaScript, with this application, whether someone is hungry. I wanted to put the headset on and see: yes, you're hungry; you're not that hungry. And I was reading tons of medical papers, research about the different bands and what they mean, trying to get somewhere where I could understand how hunger is reflected when it comes to electricity, because it's all in here. And I read: OK, so alpha should go up, delta should go down. And I'm like, that's not too hard. If I fast, let's say for half a day or a day, I should be able to see those changes. In reality, I just ended up being super hungry. So I said, I'm not doing this again; this is not fun. But this is definitely the part where things start getting interesting and start making sense.

Also, the different channels, depending on where they're positioned, are going to tell you a different story. For example, back here we have our visual cortex, which is what processes vision. So if you want to conduct some experiments based on vision, or maybe on other parts of the brain with some cognitive data, you can definitely isolate the different channels and base your experiment on different parts of the brain, which is very interesting to me.

Awesome. So I want to show you this real quick on the data visualization part for the frequency bands. This one was slightly easier, because I'm using a third-party library called ng2-charts, which is based on Chart.js. At the end of the day, if you include the directive from the library, you can use the component the library provides, and with their API you bind the properties to your data. If you follow the documentation for that library and parse your data the way it expects, it pretty much just works, which is not a lot of work on the Angular side. Of course, if you want it to be really performant, you need to know exactly what each one of these libraries is doing — not only the Angular 2 wrapper, but also the underlying data visualization library — or probably just roll your own, which is one of the things I might be doing soon.

Yeah, so we've talked about all of this in Angular. I'm very excited — but what else, right? What are we doing here? So I broke down this project into three different phases. The first one is visualization, which is what I've been talking about here today. The second one is more about experimentation. But in order to start working on and building experiments, you need to have some visibility. That's why I took a step back — I had actually started with experimentation first — because I wanted to see what this data looks like.
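(Back to the frequency bands for a moment — to make that idea concrete, here's a sketch of how you could average FFT magnitudes into those classic bands before handing the values to a chart. The band edges are the commonly cited ones, and the function and parameters are illustrative:)

```ts
// Sketch: average FFT magnitudes into the classic EEG frequency bands.
// Band boundaries are the commonly cited ones; exact edges vary by source.
const BANDS: { [name: string]: [number, number] } = {
  delta: [0.5, 4],   // deep sleep
  theta: [4, 8],     // drowsiness, meditation
  alpha: [8, 12],    // relaxed focus
  beta:  [12, 30],   // active thinking
  gamma: [30, 45],
};

// `magnitudes[i]` is the FFT magnitude of bin i for one channel,
// where bin i corresponds to i * (sampleRate / fftSize) Hz.
function bandAmplitudes(
  magnitudes: number[],
  sampleRate = 250, // one sample every 4 ms
  fftSize = 256     // illustrative window size
): { [name: string]: number } {
  const hzPerBin = sampleRate / fftSize;
  const result: { [name: string]: number } = {};
  for (const name of Object.keys(BANDS)) {
    const [lo, hi] = BANDS[name];
    const first = Math.ceil(lo / hzPerBin);   // lowest bin inside the band
    const last = Math.floor(hi / hzPerBin);   // highest bin inside the band
    const bins = magnitudes.slice(first, last + 1);
    result[name] = bins.reduce((sum, m) => sum + m, 0) / (bins.length || 1);
  }
  return result;
}
```

On the Angular side, you would then bind those per-band values to the chart library's inputs, per its documentation.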
Beyond seeing the data, I wanted to learn and understand it at the amplitude level. And lastly, hopefully, some interpretation, right? Meaning that you can make sense of what's going on in here. And this is actually what everyone wants to know and wants to do.

So a lot of things are possible, right? Some of them involve controlling IoT objects. There are a lot of possibilities. There's research and there are projects that involve moving a wheelchair with your mind, for disabled people. The thing is, this technology was just made accessible to the open source community, without having to pay for a license. It's all open source, and everyone is actually encouraged to give it a try. All of the code for the project I showed you here today is on GitHub, and you can definitely set it up locally. If you don't have a headset yet, you can just run the simulation part. If you're passionate about it, you can get a headset — definitely talk to me afterwards if you're interested, and I can put you in contact with OpenBCI.

And get involved: there are neurotech meetups all over the world, so please check out the meetup page and try to find your local meetup if you're interested. Go to the GitHub organization, where we have a lot of open source work. And you can also go to the OpenBCI community, where you can read about what other people — people like me, right? — are doing. And thank you.

What do you use that for at Netflix? Ooh, OK. So I just started at Netflix like a month ago, so, so far, I haven't used it for Netflix. But that's definitely part of the plan. I want to apply it in every area of my life that I possibly can, and Netflix is just an amazing ecosystem of technology and APIs, so I can see a lot of opportunity there. So, TBD.

What, turning on the TV with your mind? Oh, well, the first thing that comes to mind is: I just want to wear the headset and have it tell me what I actually want to watch, right? Or control the UI. This is just me being a dreamer. But you have to dream big, right? You do. And actually, if you've been dreaming about it, maybe you can answer — a lot of the audience asked: can you give us some more real-world things this could be useful for? We've seen that person controlling the prosthetic arm and controlling a wheelchair. Yeah. So these sensors that you saw on the headset don't only measure brainwaves; they also measure some muscular electricity and even heart rate, right? So you can wear these sensors in different places over your body, and there's definitely a lot of opportunity on the muscular side of things. It has been proven that just by thinking that you're moving your arm, you actually trigger those nerves, and they actually behave as if you were moving it. That's how you see some people — amputees — actually controlling prosthetic devices created by humans.

Funny story: I was at a conference and I was letting other people try the headset. This person came to me and said, I want to try it, but I actually have a big headache right now. I said, OK, let's give it a try, and put it on. And we could see that channels 7 and 8, back here, were just derailed — a massive amount of electricity compared to the rest of the head. And she said, yeah, but the pain is like right here. But the electricity was actually coming from back here. So she put some ice on it and said it made her feel better.
I would have never guessed that it was actually back here. But for real-world examples, you know, this is very early stages. I had to take a massive step back, because it's not like, OK, there's an API, I can ask it what I want, I can Google it, I can talk to it, all that stuff. We're waiting for you guys to help me with that. So that's why I'm here today.

Can you put it on my head? Yes, I can put the headset on your head. But don't look at what I'm thinking about. Oh, there you go. It fits perfectly. Thank you. How is it? Cool? Awesome. Can I take it home? No. That's how I have fun, man.

Last question: how's the hardware built — and extra bonus points if it's 3D printed? Can you say that again? How is this built? Built. 3D printed. Oh, yeah, it's 3D printed. Basically, it's printed in parts: you have all the little nodes in one plate, and then you have the headset itself, which actually splits in two. Four printers could be printing this at the same time, because it's split into different files that fit a normal, affordable, print-at-home 3D printer. It takes a while — it might take, I don't know, probably up to 40 hours, depending. The next generation of headsets is just going to get simpler and easier. Again, it's tons of work — you have to glue, you have to sand. There's a very nice GitHub tutorial about all of this, and I can send it. I also have some videos of the 3D printer printing it. Excellent. Thank you, man. Yeah.