All right, once again we are blessed with an opportunity to speak with someone behind the scenes working on the hardware and the software that we all love in the world of Android. Joining Michelle and me is Bjorn Kilburn, who is the VP of Wear OS at Google. Bjorn, it's great to meet you and great to talk to you today.

It's really great to meet both of you as well. I'm really excited to be here. Thanks for having me.

Yeah, absolutely. And I can speak for myself, Michelle, I know you've got the hardware as well. Just before we walked into the studio here, I was literally doing setup on the OnePlus Watch 2 that's on my wrist. So I haven't dove into the nuts and bolts of everything we're going to talk about today, but this is going to be a really good way of laying the groundwork for what my experience is probably going to be. Why don't we start with Wear OS 4 and the state of where we're at with wearables on Android? If you don't mind, a good starting point would be some of the highlights of the new Wear OS 4 that your team has been working on.

Yeah, sure. We're focused on streamlining the core smartwatch experience, and in Wear OS 4 we added some key, frankly, missing features like watch transfer, meaning taking your watch, getting a new phone, and being able to transfer that watch to the new phone; adding app splash screens; and also bringing Google Calendar, which is one of the Google apps that we really felt was needed and needed to be added into the Wear OS ecosystem. And then one of the biggest things we've still got to solve, and I think the OnePlus Watch 2 announcement is a great step in that direction, is better battery life. It's one of the key user needs that we still need to make progress on in order for the watch to be the trusted device that it really needs to be.
And so a number of the things that we've done in Wear OS 4 to set the conditions for that are, for example, bringing the Watch Face Format, which is a declarative format. That sets us up to enable watch faces to be much more power efficient. We've also introduced system-managed data bindings for tiles for certain data types, like heart rate or steps, and that allows tiles to be updated without necessarily having to wake up the app in order to update those data types. And we generally made improvements to power management within Wear OS itself, to take continuous steps toward better battery life for users.

Yeah, no question, battery life is at the top, at least in my experience with smartwatches. And I haven't had a huge amount of experience with watches that are more devoted to a single thing, versus the wild, wild West of smartwatches that can do everything, which is amazing as a user. I want my devices to do everything. But as you say, as that feature set expands, the battery life, as one key example, really begins to take a hit. And I have to imagine, you can tell me your perspective on this, but it's hard to measure up against the expectation of the user, which is "I want a smartwatch that can do all these things and last forever," versus what's actually possible. I know that the architecture here is a really big part of that story, this hybrid interface architecture. And I'm curious, what exactly is different about the hybrid interface approach when you compare it to previous Wear OS generations and Wear OS efforts?

Oh, sure. I mean, that's an excellent question.
We've actually been working on the hybrid interface in past versions of Wear OS, but we haven't really talked much about it because it's something that we provide to the OEMs to implement based on their particular power management strategy. For example, in Wear 3 we added the health hybrid interface, which enables Wear health services to be periodically updated by the MCU on the system. Also in Wear 3 we have a display hybrid interface, which some OEMs have been using. And then in Wear 4, specifically working with OnePlus, we added the notification hybrid interface, which enabled them to handle a lot of the notification use cases on the watch in the MCU whilst leaving the AP asleep. So we've been building that interface to enable OEMs, and what's really new in the OnePlus Watch 2 is the addition of viewing and dismissing bridged notifications on the MCU, and then seamlessly transitioning to the AP if a more bespoke reply is needed. For example, if the user wants to type out a response on the keyboard, we seamlessly transition to the AP, and then the user is able to respond to that notification. But part of the reason we haven't really talked much about the hybrid interface is that what's going on really shouldn't be visible to the user. It's really a set of tools that we provide OEMs to enable them to get to better battery life.

Yeah, that's something I've really noticed using the OnePlus Watch 2 over the past couple of days. I would get a notification, and as you mentioned, this is supposedly running on the microcontroller unit, or MCU. But I don't actually notice whether it's running on the completely custom RTOS that's on this smartwatch versus Wear OS powered by the more powerful application processor, or AP. I'm really impressed by how that's been done.
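To make the handoff Bjorn describes concrete, here is a toy sketch in Python (the names and routing rule are hypothetical illustrations, not a real Wear OS API): lightweight notification actions stay on the MCU, and only a rich action like a typed reply wakes the application processor.

```python
# Illustrative sketch, not real Wear OS code: cheap notification actions
# are handled on the always-on MCU; rich actions wake the AP.

MCU_ACTIONS = {"view", "dismiss", "scroll"}   # cheap, always-on core
AP_ACTIONS = {"reply", "open_app"}            # needs the full Android stack

def route(action, ap_awake=False):
    """Return (processor, ap_awake) after handling one user action."""
    if action in MCU_ACTIONS and not ap_awake:
        return "MCU", False                   # AP stays asleep
    if action in AP_ACTIONS:
        return "AP", True                     # seamless handoff: wake the AP
    # Once the AP is already awake, it may as well handle everything.
    return ("AP" if ap_awake else "MCU"), ap_awake

# A typical stretch of the day: mostly glance-and-dismiss, one typed reply.
ap_awake = False
log = []
for action in ["view", "dismiss", "view", "reply", "dismiss"]:
    proc, ap_awake = route(action, ap_awake)
    log.append(proc)

print(log)  # ['MCU', 'MCU', 'MCU', 'AP', 'AP']
```

The point of the split is that the common glance-and-dismiss path never pays the cost of waking the application processor at all.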
And I'm curious to hear, for these APIs that are currently provided by the hybrid interface, the display, health services, and notifications, whether there will be others coming in the future, or if that's what we're working with for the foreseeable future.

Oh, it's definitely an area that we're continuing to iterate and learn on. There are capabilities that our OEMs are asking us to add to the hybrid interface. In particular, as you get into these transitions in display control, where the MCU is sometimes controlling the display and the AP is sometimes controlling the display, that's where it gets really tricky, because you've got to make sure that the transition is seamless. You don't want the user to notice it. And there are definitely some improvements in that area that we think we can still make, to enable even more seamless transitions and also to enable more capabilities on the MCU, so that more of the use cases that our OEMs want to power from the MCU can actually be driven there, and driven with similar fidelity. And one of the things that's really important to us: the way I like to think about these watches is that they're 50% functional, they need to do helpful things for people, but they're also a piece of jewelry. So they need to be beautiful. Being able to continue to make the watches have this premium, beautiful, jewelry-like feeling whilst also optimizing battery life is actually a pretty fun challenge. It's pretty complicated to do, and we'll definitely be investing more in the hybrid interface to enable more of that beauty to come through in a way that also optimizes for battery life.

I wanted to follow up a bit about the exact capabilities of this hybrid interface.
So, in the blog post it's mentioned that this enables more power-efficient experiences, such as sensor data processing on the MCU while the application processor is kept asleep. I wanted to ask, is there a specific kind of polling limit on the sensor data collection, or can it be effectively continuous? I was wondering if it would be possible to build more complex features running entirely on the MCU, such as sensor data collection for car crash detection. That requires continuous monitoring of things like microphone, accelerometer, and gyroscope data, and being able to have all that running on a super power-efficient MCU would be much more beneficial than having it run on something more power hungry like the main AP.

Yeah, it's a great question, Michelle, and there's a lot to unpack there. For sure, on the one hand, these microcontrollers are great for always-on, low-compute-complexity tasks. But there is actually a threshold beyond which, if the compute complexity gets too high, the AP is actually both more performant and more power efficient. So making sure that we're running the right compute jobs in the right place is really the key thing here. And certainly, in pretty much every watch architecture that I'm familiar with, the MCU is generally running all the time, usually at some lower clock speed. But it's doing things like collecting your heart rate on a continuous basis, or steps on a continuous basis. Obviously with the user's permission and at the user's request, but most users like to have that feature enabled, and once they do, the MCU is doing that job all the time. And to your point, even just the normal tilt-to-wake gesture that you make with the watch needs to have the IMU, the gyro and the accelerometer, running all the time, and those are usually connected to the MCU.
So usually the MCU is on even for those basic types of gesture detection. And having other tasks run on the MCU when the MCU is on makes sense. Certainly, since it's already running, taking care of more tasks there, without turning the MCU effectively into an AP, is a way to approach that problem. These MCUs are actually pretty capable, and it is definitely possible to run some pretty advanced machine learning algorithms on low-power MCUs; models built with TensorFlow Lite Micro, for example, can be run on an MCU. So that can enable some of these types of scenarios, whether it's an improved heart rate algo, or an improved distance-tracking algo, which generally involves not only fusing GNSS but also an inertial motion algo, so that you can dead reckon when you go out of GPS coverage, and things like that. Those types of algos can be improved with AI, and that AI, if it's a small enough model, can run on the MCU. When it comes to something like car crash detection, it all depends on whether or not the model can be made accurate enough and small enough, and the compute complexity light enough, that it makes sense to run it on an MCU. We'll have to see how folks do with getting a model like that onto a watch. But in theory, those types of things are possible, and in practice, those types of algos are already used on an MCU on an always-on basis for tasks like detecting whether or not you've tilted your wrist, and doing that very accurately, or improving the quality of your heart rate or distance or step calculation, or, like, fall detection on the Pixel Watch.

Yes, fall detection is another one that would fuse a few sensors together, and that type of algorithm can run on an MCU.
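As a rough illustration of the kind of small, always-on algorithm described above, here is a toy dead-reckoning step (purely illustrative, not production code; the numbers are invented): while GNSS fixes are unavailable, integrate speed and heading derived from the IMU to keep estimating position.

```python
# Toy dead-reckoning sketch: in a GPS coverage gap, advance the position
# estimate by integrating speed along the current heading. A step this
# small is, in principle, the kind of job an MCU could run continuously.

import math

def dead_reckon(pos, speed_mps, heading_deg, dt_s):
    """Advance an (x, y) position by integrating speed along heading.

    Heading is in compass degrees: 0 = north (+y), 90 = east (+x).
    """
    h = math.radians(heading_deg)
    return (pos[0] + speed_mps * math.sin(h) * dt_s,
            pos[1] + speed_mps * math.cos(h) * dt_s)

pos = (0.0, 0.0)
# A runner heads due north at 3 m/s; GPS drops out for ten one-second steps.
for _ in range(10):
    pos = dead_reckon(pos, speed_mps=3.0, heading_deg=0.0, dt_s=1.0)

print(pos)  # ~(0.0, 30.0): 30 m north, estimated without any GNSS fix
```

A real inertial algorithm would fuse accelerometer and gyro data to estimate the speed and heading themselves; this sketch only shows the integration step that keeps the track going between fixes.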
Yeah, what I'm realizing as I listen to you talk about all this, touching back on where we started, is that at the end of the day the general user of these devices just wants better, longer-lasting performance, and what's going on under the hood seems to tackle that. What I also realize is that our fans, the people who watch and listen to Android Faithful, are a mixture. There are the developer people, the ones who really understand the under-the-hood mechanics of how all this works, and then there's the user, the person who just really enjoys Android and wants a better device. So if you had to break this down and spell it out for just the basic user of the watch: which of the activities they're doing are powered by the MCU compared to the application processor, and how does that improve their experience, if that makes sense?

Well, I think the first thing I'd say is that the fundamental thing here is that we can put all kinds of fancy capabilities into a watch, but if we run out of power before the user gets through the day, they're not going to rely on the watch. It's effectively not going to be a useful device, because you're just creating way too much anxiety for people. So it's really, really important that we get to trusted battery life. And trusted battery life, for me, is: is your watch there for you on your worst day? If you think of a friend who's trusted, they're going to be there on your worst day. That's where we need to get to, in my mind, with smartwatches. Your smartwatch has got to be there for you on your worst day, your worst battery day, in fact, the day that's going to drain the most battery.
And if it can't do that, then any other bit of functionality it does that's kind of cool is actually kind of a parlor trick. It's not super helpful at the end of the day. So that's ultimately where we need to get to. As to your question about what runs where, it really varies by OEM and what they implement where, but I will say that pretty much everybody makes use of, for example, the health hybrid interface. It just makes a lot of sense to have things like your step count and your heart rate running off a microcontroller and not on the application processor. Some OEMs use the microcontroller to run certain parts of the display, typically for long-running use cases, because it's expensive to run the AP, and it's expensive to start it up, not because it's inefficient, but because it's designed for really complex computations. It's sort of like a big engine in a car: if you can run on the little electric engine for a little while, because the job isn't really that hard, then it makes more sense to use the electric engine, until you've really got to get up to speed and pass a car, and then you turn on the big V8 and go really fast. It's using the right engine, the right CPU, for the right job at the end of the day. You can imagine, for example, that a stopwatch isn't really very computationally complex, right? It really probably shouldn't be running on the AP. Now, in most cases, I think in pretty much every case on a Wear OS watch that I can think of, the stopwatch runs on the AP today.
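The engine analogy can be put in back-of-envelope numbers. In this sketch, the power draws, op rates, and wake-up cost are all invented for illustration (not measured values from any real watch), but they show the shape of the tradeoff: a trivial job like a stopwatch tick is cheaper on the always-on MCU, while a heavy job favors the fast AP even after paying its wake-up cost.

```python
# Back-of-envelope model of "the right engine for the right job".
# All figures are made-up illustrations, not real hardware numbers.

MCU = {"power_mw": 5,   "ops_per_s": 1e7,  "wake_mj": 0}    # always on anyway
AP  = {"power_mw": 300, "ops_per_s": 1e10, "wake_mj": 50}   # must spin up first

def energy_mj(core, ops):
    """Millijoules to wake `core` (if needed) and run `ops` operations."""
    return core["wake_mj"] + core["power_mw"] * (ops / core["ops_per_s"])

stopwatch_tick = 1e5   # trivial bookkeeping: the wake-up cost dominates
map_render     = 1e11  # heavy frame: the AP's speed wins on total energy

print(energy_mj(MCU, stopwatch_tick) < energy_mj(AP, stopwatch_tick))  # True
print(energy_mj(MCU, map_render) > energy_mj(AP, map_render))          # True
```

The crossover is driven by the AP's fixed start-up cost versus the MCU's much lower throughput, which is exactly why routing each job to the right core matters more than any single optimization.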
A use case like that, where you just set a timer, maybe doesn't need to be kept track of on an application processor, which is a little bit overspecced for the job, if you will. That's something we should get to at some point. Whereas opening the Play Store, updating apps, maybe navigating in Maps, some of those really complex tasks, you're actually going to get better battery life running them on an application processor, because they're inherently complicated, and the AP is more power efficient for those complex tasks.

You're talking about the whole race-to-idle approach, where you get the complex tasks done as quickly as possible and the processor goes back to sleep, right?

Yeah, exactly. And there are tasks that we wake up the AP for today that are not super complicated, and those are on our list of things to move off the AP. But they're knotty. It's complicated. Most of the challenge of getting to better battery life is not one big silver bullet. It actually takes a tremendous amount of work up and down, from silicon to glass, from the work we do with our silicon providers, to how our OEMs assemble the watches, to what we do in Wear OS. Together we'll get there, but it definitely takes a high level of collaboration across the ecosystem.

I wanted to ask you about one of the capabilities mentioned for the hybrid interface, the ability to support certain watch faces. I believe the blog post specifically said only some watch faces are supported by the hybrid interface. Could you elaborate a bit on whether this applies only to watch faces built in the Watch Face Format, or if all watch faces could be capable of being upgraded to support the hybrid interface?

That's a great question, Michelle.
The legacy watch faces, the ones based on AndroidX, are basically like mini applications, and so they need all the APIs that are available in, effectively, the Android system. If we were to take those APIs and have them run on the MCU, then the MCU would be a mini AP, which would kind of defeat the purpose. The advantage of the Watch Face Format is that it's declarative, which means the watch face itself doesn't need to run developer-specific code. That allows the watch face to be what we call system managed, meaning the system takes care of running it. And that opens the door for the watch face to be rendered either on the AP or potentially on the MCU. Now, the next limiting factor you run into is that some of those watch faces can be quite large in their assets. Depending on how big the assets, the images, in the underlying watch face are, there has to be space for them, and in most cases the MCUs inside these watches have fairly limited storage. That's a problem we're working on with our OEM partners, to come up with more flexible memory structures that enable future watches to take advantage of more of the watch faces that are out there. But in the near term, the storage directly available to the MCU is often quite limited. So the point we were trying to make with the blog post was that the Watch Face Format does open up the possibility of running more of these watch faces on more watches on the MCU, but it very much depends on how much memory is available to the MCU as to which watch faces will be able to run where. Of course, the user will get the benefit of better battery life when we are able to run one of those watch faces on the MCU.
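The placement logic described above can be sketched as a simple decision function. This is a hedged illustration, not real Wear OS code: the storage budget and field names are hypothetical, but the rule follows what Bjorn describes, in that only a declarative, system-managed face that fits the MCU's storage can be rendered there.

```python
# Illustrative sketch of watch face placement (hypothetical values/names):
# legacy AndroidX faces run developer code, so they can only run on the AP;
# declarative faces are system-managed and may run on the MCU if they fit.

MCU_STORAGE_KB = 512  # hypothetical MCU flash budget, varies by watch

def render_target(watch_face):
    if not watch_face["declarative"]:
        return "AP"       # legacy AndroidX face: needs the full Android stack
    if watch_face["asset_kb"] <= MCU_STORAGE_KB:
        return "MCU"      # system-managed and small enough: best battery life
    return "AP"           # still system-managed, just rendered on the AP

faces = [
    {"name": "legacy_analog",  "declarative": False, "asset_kb": 200},
    {"name": "simple_digital", "declarative": True,  "asset_kb": 300},
    {"name": "photo_rich",     "declarative": True,  "asset_kb": 4096},
]
print({f["name"]: render_target(f) for f in faces})
# {'legacy_analog': 'AP', 'simple_digital': 'MCU', 'photo_rich': 'AP'}
```

Note that in every branch the face still renders somewhere; the user never has to know which core is doing the work, which matches the "invisible to the user" goal of the hybrid interface.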
Otherwise, the developer and the user shouldn't know, or have to do anything different, to get that capability. It's something we'll work on together with OEMs, to make sure the watch face can at least run on the AP, and if we're able to move it to the MCU because there's enough storage available there, then that's something we would obviously want to do. And clearly, like we talked about, these watches are jewelry, so we don't want to just make the watch faces really simple and cut the assets out. We want them to be beautiful. So there is a real balance to get right there, to make sure that we continue to enable users to have the most beautiful possible watch and watch face, whilst also meeting this fundamental need of trusted battery life at the end of the day.

Yeah, indeed. Now, to kind of wrap things up, a big question a lot of people are going to have as they watch the announcements and read and listen to this news today: so much of it, at least from OnePlus's perspective, is tied to the OnePlus Watch 2. What about other Wear OS users? Is any of this hardware dependent in ways that might actually prevent some of these improvements from hitting other watches that are already out there, watches that could potentially be upgraded to enjoy some of these improvements? What's the view there?

Well, so much of what's in Wear OS 4 is generally available for watch users who will either upgrade from Wear 3 or buy a new watch that's based on Wear 4. There are, for example, power improvements that we haven't talked about in this launch that are in Wear OS 4 and do lead to better battery life, as well as the other capabilities we talked about, like transferring your watch to a new phone, those types of capabilities.
We spent a lot of time talking about the hybrid interface here because the notification hybrid interface specifically is something we worked very closely on with OnePlus for their watch announcement. And when it comes to the hybrid interface, it is something that depends very much on each individual OEM's underlying watch architecture and their power strategy. So unfortunately, I guess in some senses, it depends on what those underlying watch architectures are as to how much of the hybrid interface might be adopted in Wear OS 4 by each OEM. On the other hand, like we've talked about, we really don't want this to be something that users need to unduly worry about, and in some cases OEMs will have other ways of achieving better battery life that may not make full use of one sub-portion of the hybrid interface. I would not be surprised, though, if we see more OEMs adopting, for example, the notification hybrid interface, because generally we're working closely with OEMs, and anyone who comes up with a great idea for how to improve battery life, that's really at the top of our list to collaborate with them on. We were really thrilled to work with OnePlus on it. They were very excited about it, and we welcome that kind of innovation. So when it comes to battery life, we're excited to take on the best ideas from across the ecosystem. And I think that's one of the beautiful things about being part of an ecosystem, right? We have a number of really competent partners out there who have great ideas, and we can collaborate to bring those ideas to fruition, all in the service of more trusted battery life for users. I think that's a great thing at the end of the day.

Yeah, at the end of the day, that's a huge win for everyone, and that's what we all want. Bjorn Kilburn is VP of Wear OS at Google. Bjorn, it was a pleasure meeting you today.
Thank you for carving out some time on a Monday to talk to us about Wear OS and everything you guys are doing. We'd love to have you back sometime.

Absolutely my pleasure, Jason and Michelle. Thank you very much. It's great to see you both.

Absolutely. We will talk to you soon. Keep up the great work.