Tim is wearing sensors: an in-ear sensor recording my brain activity, and a chest sensor recording heart rate and movement. We're streaming all of this to the cloud, where it's translated into measures of state: my relaxation level, my heart rate, heart rate variability, and a variety of other states that can be derived from the signals coming off the in-ear sensor and the heart rate monitor. All of that happens in the cloud and comes back to my mobile device.

So on screen we've got the EEG, the ECG (my heart activity), my relaxation score, and actigraphy. If I move around a little bit, the actigraphy goes up and turns red; now I'll be still. And then my heart rate. Because all of my data is going through Neuroscale, we can do something kind of cool with it.

"Hey Alexa, ask Neuroscale for my current relaxation level."

"Your current relaxation level is 20%. You're pretty pumped up. If you feel like relaxing a bit, we could do some meditation exercises together."

That's an example of how we can integrate something like Amazon Alexa through the Neuroscale platform: connect it to my brain, my heart, and my body, and do something useful, like asking Alexa, "What's my state? How am I feeling right now?" But it could also be your light, your car, or any other internet-connected device. You can build that kind of application on the Neuroscale platform.
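To make the pipeline described above concrete, here is a minimal sketch of the kind of cloud-side logic involved: deriving a relaxation-style metric from heart data (RMSSD, a standard heart rate variability measure, computed from intervals between beats) and formatting it as a spoken reply for a voice assistant. This is purely illustrative; the function names, the normalization of RMSSD to a percentage, and the reply thresholds are assumptions, not the actual Neuroscale API or algorithm.

```python
# Illustrative sketch only. The names, thresholds, and the RMSSD-to-score
# mapping are assumptions for demonstration, not the real Neuroscale pipeline.

def rmssd(rr_intervals_ms: list[float]) -> float:
    """Root mean square of successive differences between heartbeats (ms).

    A common time-domain heart rate variability measure: higher values
    generally correspond to a more relaxed, parasympathetic state.
    """
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5


def relaxation_score(rr_intervals_ms: list[float]) -> float:
    """Map RMSSD onto a 0.0-1.0 relaxation score.

    The divisor of 100 ms is an assumed normalization constant chosen
    only so the demo numbers land in a plausible range.
    """
    return min(rmssd(rr_intervals_ms) / 100.0, 1.0)


def relaxation_reply(score: float) -> str:
    """Format the score as a spoken voice-assistant response."""
    pct = round(score * 100)
    if pct < 40:
        advice = ("You're pretty pumped up. If you feel like relaxing a bit, "
                  "we could do some meditation exercises together.")
    elif pct < 70:
        advice = "You're moderately relaxed."
    else:
        advice = "You're nicely relaxed. Keep it up."
    return f"Your current relaxation level is {pct}%. {advice}"


if __name__ == "__main__":
    # Short, low-variability interval stream: a tense state in this sketch.
    rr = [800.0, 810.0, 790.0, 805.0, 798.0]
    print(relaxation_reply(relaxation_score(rr)))
```

In a real deployment the score would be computed continuously in the cloud from the streamed EEG and ECG, and the voice skill would fetch the latest value over the platform's API rather than computing it in the handler.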