We've never been equipped with so much technology. First we all got smartphones, then smartwatches, then earphones. And soon we will all be equipped with AR and VR headsets. And yet the way we interact with these devices remains extremely limited. Human-computer interfaces as they exist today fail us multiple times a day. Honestly, they should not.

What if you could control your device without even touching it? Almost magically. And what if such technology could be integrated into those small, ever-present devices that we wear on us all day long?

Hi. My name is Yacine, and I'm the co-founder and CEO of Wisear. We are a bunch of data science, hardware, and product experts who love to tackle big challenges to make your life easier. And we are quite convinced that you shouldn't be enslaved to the technology on you. You shouldn't struggle to find that tiny little button on those 200-euro earphones of yours. And you should never, ever have to miss a call because Siri didn't quite understand what you said while you were riding your bike.

What we believe is that the way we interact with devices should be seamless, voiceless, and touchless. And that's why we're building what I think is the very first neural interface that can sit in your everyday devices.

So what's a neural interface? It's a big word, right? A big scientific word. Neural interfaces are the most direct path between our intents and the technology around us. Neural interfaces empower us. They give us back control over the machine. They are the most natural and intuitive way to control the devices around us.

If you think about it, the way we act today is quite crazy. We have to monopolize the most important parts of our body, our hands, or even our voice, to press a key on a keyboard, to click a mouse, to move a controller, or even to click a clicker. And then these devices are in charge of translating our intentions to the gazillions of other digital devices around us.
They shouldn't have to. Let's take a step back here. Your body is like a power source. Whenever you think, or whenever your brain sends information to your eyes or your muscles, an electrochemical message is sent. Now what's interesting is that if you place the right sensors at the right locations, you can capture that electrochemical message. And what's even more interesting is that if you add the right mix of neuroscience, signal processing, and machine learning, you can not only clean that signal, but also analyze it and find patterns in it.

What this means is that your body now becomes the computer interface. You don't need to press a key, click a mouse, or even punch a button. You can simply peer, blink, or clench. Actually, maybe I don't even need that clicker anymore. I can put it in my pocket, and I can go to the next slide, or to the previous one. We don't even need to press a button.

What you see on me right now is the very first demo kit that we've built this way: the very first wireless earphones equipped with a neural interface. This means that these earphones can be controlled using my facial muscle activity and my brain activity. And I'm quite happy to say that we have a limited quantity available for sale at the moment. These are the very first earphones that can decode real neurons using digital neurons.

I want to be honest: this is much better than what I had to wear for the past two years. If you look at those prototypes, they were a bit more involved, not yet to the point where you would want to wear them yourself, but much better. I'm also not ashamed to say that we have one of the most dedicated teams of experts, who don't hesitate to shave their heads to try a new prototype, or to wear very bulky and uncomfortable headsets for multiple hours, to bring the technology to where it is today.
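The capture-clean-classify pipeline described above can be sketched in a few lines. This is a minimal illustration, not our actual algorithm: it assumes a synthetic single-channel recording at 250 Hz, a band-pass filter to isolate the 20-100 Hz band where facial-muscle (EMG) activity dominates, and a simple energy threshold to flag a clench-like burst. All sample rates, thresholds, and window sizes here are hypothetical.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250  # assumed sampling rate in Hz for a wearable sensor

def bandpass(signal, lo=20.0, hi=100.0, fs=FS, order=4):
    """Keep the 20-100 Hz band where muscle (EMG) activity dominates."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, signal)

def detect_clench(signal, fs=FS, ratio=3.0):
    """Flag 200 ms windows whose RMS energy exceeds `ratio` x a baseline RMS."""
    filtered = bandpass(signal)
    win = fs // 5  # 200 ms analysis windows
    n = len(filtered) // win
    rms = np.sqrt(np.mean(filtered[: n * win].reshape(n, win) ** 2, axis=1))
    baseline = np.percentile(rms, 25)  # assume the user is quiet most of the time
    return rms > ratio * baseline

# Synthetic two-second recording: one quiet second, then one second of
# strong 60 Hz activity standing in for a jaw clench.
rng = np.random.default_rng(0)
t = np.arange(FS) / FS
quiet = 0.1 * rng.standard_normal(FS)
burst = 2.0 * np.sin(2 * np.pi * 60 * t) + 0.1 * rng.standard_normal(FS)
events = detect_clench(np.concatenate([quiet, burst]))
```

`events` is one boolean per 200 ms window: the quiet windows stay below threshold and the burst windows trip it, which is the "find patterns in the signal" step in miniature.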
But what I'm really the most proud to tell you is that we've built the most private, high-speed, and inclusive human-computer interface on the market today. Private, because we have optimized our AI algorithms so that everything runs on the edge, directly in my earphones: no data needs to leave the device for our technology to run. High-speed, because we have improved our data processing so much that reading my intents and transforming them into actions is faster than clicking a button. And inclusive, because our technology works no matter if you have a very strong French accent like me and Siri doesn't understand you, and no matter if you have limited hand mobility.

So I've shown you what this technology can do on, let's say, a very practical use case: changing the slides. But now let's project ourselves a bit further into the future, and imagine what it would look like for a VR use case, for instance. What you see right here is my colleague wearing an Oculus headset. At the moment, if he wants to play or pause the video he's watching, or go to the next video, he has to use the controllers. His hands are busy, so he cannot eat popcorn or do anything else he would like to do to stay immersed in that reality. With our technology, my colleague can now play and pause this immersive video, and he can do whatever he wants with his hands. And the same goes if he wants to skip to the next video.

So our technology applies today, and will apply to any future use cases that arise. We've already developed two very robust muscle-based controls, and our roadmap is pretty clear: over the next two years, we're going to be adding eye and brain controls to move from two controls today to twelve controls in the coming years.

Neural interfaces are the most intuitive and inclusive human-computer interface yet. They work for everyone. They work everywhere. And they work for anything. Neural interfaces are coming.
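To make the "intents to actions" step concrete, here is a hypothetical sketch of how decoded events could be routed to media-player actions in a headset UI. The event labels (`"clench"`, `"double_blink"`) and the player API are invented for illustration; they are not our product's actual interface.

```python
class MediaPlayer:
    """Toy stand-in for a headset's media player."""
    def __init__(self):
        self.playing = False
        self.track = 0
    def toggle_play(self):
        self.playing = not self.playing
    def next_track(self):
        self.track += 1

# Hypothetical mapping from decoded neural events to player methods.
ACTIONS = {
    "clench": "toggle_play",       # a jaw clench toggles play/pause
    "double_blink": "next_track",  # a double blink skips to the next video
}

def dispatch(player, event):
    """Route a decoded event label to a player method; ignore unknown labels."""
    method = ACTIONS.get(event)
    if method:
        getattr(player, method)()

player = MediaPlayer()
for event in ["clench", "double_blink", "noise", "clench"]:
    dispatch(player, event)
```

The dispatch table is the whole point: the decoder only ever emits a small vocabulary of events, and each device decides what those events mean, which is how the same two controls can drive slides today and a VR player tomorrow.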
And with the advent of AR and VR, they are inevitable if we want to enjoy these devices to their full potential. I would like to thank you very much for listening to me today. And I really hope that you will join us on this journey to make people's lives easier while bringing about the next generation of human-computer interfaces. Thank you.