Today, I'm talking about the Android sensor HAL, and this is where I was about two years ago. I had an off-the-shelf board, actually a BeagleBoard, not a BeagleBone, that's it pictured there, and a sensor board, and I wanted to put the two of them together. It seems like that should be easy, right? All the source code is out there, all the drivers are there, Android is completely open source. But there was no documentation, and a common thread I keep hearing at this conference is that we're all combing through the source code and reverse engineering the big picture out of all these details. So that's what I want to share today.

Another reason you might want to do this is for off-the-shelf devices. As I was showing before, traditionally you have sensors talking to the application processor over a chip-to-chip interconnect, I2C or SPI. But I think there are some really interesting possibilities in connecting up, say, an off-the-shelf, no-name Android TV to a Bluetooth temperature sensor. Or you look over there and say, well, that looks like another one of those boards, but what's interesting about that board from Freescale, and if you download these slides you can click through the links to actually find it, is that it's a USB sensor hub. So you can put it in your own little box and connect it up over USB, no soldering, no header pins required.

Or maybe you just want to get more functionality out of the phone you have, because, like so many things, the hardware provides a lot more functionality than the software makes available. In particular, the Galaxy S3 has a barometer, and the barometer needs a very good temperature sensor next to it, but the sensor HAL that ships with that device doesn't bring the temperature sensor out. Why not? It's probably the best guess at ambient room temperature you have. So I hope that with what I'm talking about today, you can grab your device, start poking around, and see if you can unlock a little hidden functionality. You should be able to start experimenting, integrate some new sensors, and get an idea, when things aren't working, of where you need to start debugging.

Here's the problem. It's actually better than it was a couple of years ago, but if you go Google this, this is the page that comes up, and it basically points you toward the header and says: this is the HAL, go implement it. You open up that header and it's a whole bunch of object-oriented-style C with function pointers, and at first glance you go, what the hell is this? How do I do that? Like I said before, the source code is there, but not much of the big picture.

All right, so here is the big picture. If you went to the Camera 2.0 talk yesterday, this is a similar kind of diagram, because Android uses the same HAL mechanism for sensors, GPS, and camera. They're trying to insulate the Android internals from device-to-device changes, since every device that ships has its own slightly different sensors.so, or libsensors as it's sometimes called. I'll go back and forth between those names and "HAL", but they're all the same thing: that blue box. If you're making a device, or you want to change how the sensors work, that's where you want to put your code.
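For a feel of those function pointers before moving on, here is a trimmed paraphrase of the two key structs from the ICS-era hardware/libhardware/include/hardware/sensors.h. Check the copy in your own tree, since older revisions used a sensors_data_t event struct where this one has sensors_event_t:

    // Paraphrased from hardware/sensors.h (ICS era); hw_module_t,
    // hw_device_t, sensor_t, and sensors_event_t come from the same tree.

    // The module "vtable": Android dlopen()s sensors.so and asks it
    // which sensors exist.
    struct sensors_module_t {
        struct hw_module_t common;
        int (*get_sensors_list)(struct sensors_module_t *module,
                                struct sensor_t const **list);
    };

    // The device "vtable": enable/disable a sensor, set its rate, and
    // the poll() call the sensor manager keeps hitting for data.
    struct sensors_poll_device_t {
        struct hw_device_t common;
        int (*activate)(struct sensors_poll_device_t *dev,
                        int handle, int enabled);
        int (*setDelay)(struct sensors_poll_device_t *dev,
                        int handle, int64_t ns);
        int (*poll)(struct sensors_poll_device_t *dev,
                    sensors_event_t *data, int count);
    };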
So let's walk through the data flow; it's not such a different slide. All the way at the top we have applications; all the way at the bottom we have silicon, these great MEMS devices. And what's great about doing this, if you replace the HAL, is that all those apps you can download from the Google Play Store and elsewhere use the standard Android APIs, so they can now talk to the sensors on your device. No changes, no effort on your part on the app side. I think that's one of the great parts about Android.

So just walking through how this goes: let's say you fire up an application that requests the accelerometer. That comes down to the sensor manager inside Android, and the sensor manager is what takes care of things when you have three applications all asking for accelerometer data; it keeps track of who's listening, at what rates, and who wants the data. That comes down as a request to the sensor HAL. If the accelerometer isn't turned on, the sensor manager knows that and tells the HAL: hey, go enable the accelerometer, and once you've enabled it, please set it to this rate. And you'll notice, if you're familiar with the Android SensorManager APIs where you request the fastest rate or a slower rate, that translates pretty directly into the sensor HAL call.

This is where it starts to get different on each device: depending on the sensors you have connected and what drivers they have, the HAL is going to have some intimate knowledge, like, okay, for the accelerometer, I need to use sysfs to enable it, and there's a sysfs entry for that device. That goes through the Linux infrastructure, standard drivers, all the way down to the silicon. So that's the enable, and then what happens is the sensor manager keeps a timer and starts asking the HAL, hey, do you have any data, do you have any data, whether that's at five hertz or 200 hertz or whatever the system can do.

All right. Talking a little bit more about the top of the stack here, I'm going to keep a little map up to hold that bigger picture in mind. This is one of my favorite sensor apps. It's nothing fancy, but it's fancier than anything I would have built myself, which is usually just text streaming on a console. It tells you all the sensors that the HAL says are there. You can actually have more sensors on the device, but if the HAL doesn't bring them up, the sensor manager isn't going to know about them, and neither will any of the apps. At the keynote yesterday, the guy from NASA was talking about the CellBots data logger, which is another great sensor application. I think this is part of the joy of Android: you don't have to write those.

So, jumping around a little bit, I'm not going to talk about sensor drivers. This is actually a real key sticking point when you're trying to hack on devices: are the drivers available for your sensors? Do they have any incompatibilities with what you want to do? I've talked for a long time about this, and, a little shameless plug, I'm going to be giving a talk at Design West about reusing sensor drivers that are already out there and porting them to different systems. So I won't talk about sensor drivers today. I'm going to talk a little bit about the Linux infrastructure that the HAL really relies on, or that you're going to rely on if you're building the HAL.
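To make that enable step concrete: on many devices it boils down to writing a "1" into a driver-specific sysfs attribute, which is exactly what we'll do by hand with echo in a minute. Here's a minimal sketch of what a HAL-side helper might look like; the path is hypothetical, since every driver picks its own spot, so you have to dig the real one out of your driver's source:

    #include <cstdio>

    // Hypothetical sysfs enable attribute; read your driver's source to
    // find the real location on your device.
    static const char *kAccelEnable =
        "/sys/class/sensors/accelerometer_sensor/enable";

    // What activate(handle, enabled) might do for the accelerometer.
    int accel_activate(int enabled) {
        FILE *f = std::fopen(kAccelEnable, "w");
        if (!f)
            return -1;                        // no driver, or wrong path
        std::fprintf(f, "%d\n", enabled ? 1 : 0);
        std::fclose(f);
        return 0;
    }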
You can think of those drivers as having a bottom API they use to access SPI or I2C or USB. But the top-level interface those drivers present is where the HAL gets involved. Two years ago, a year ago, maybe even today, most sensor drivers had input event interfaces, and that's what I'm going to go into detail on. Sensor manufacturers are moving away from this toward IIO drivers and some different methodologies, but what's great about input event drivers is that they're easy to write, and they're also very easy to work with from the command line, which is why I'm talking about them today. The big key with the Linux infrastructure is that there are so many tools out there; don't reinvent the wheel, use something that's there. The input event framework hails from way back; it's really rooted in mice, keyboards, and joysticks. A lot of people look sideways at it when they first see it, but it really is handy to work with.

Here's the demo part. The thing I want you to remember from this demo is that the input event framework presents each sensor as a name you can understand, like "MPU-6050 accelerometer", plus a stream of events. Backing up a bit, what I want to show is that it's hard to write the HAL if you don't know how to read the data. So in this demo I'll walk through finding out what sensors are there and reading their data; the HAL has to do the same things we're about to do at the command line.

So here we go. We start up adb shell. Drivers present themselves as things that look like files, so you should be able to cat them, right? If you ls, all the input event drivers show up in /dev/input. But which of those 14 devices is the one we're interested in? For that, you just have to remember the magic incantation: /proc/bus/input/devices in the proc filesystem lists the names and gory details of every input device. Here you can see there's a light sensor, there's a jack of some kind, and a lot of buttons and other peripherals that also show up as input event devices. The ones we care about for the sensor HAL are obviously the sensors. What you want to look for is the line that says "Handlers"; that tells you the device file you can read. The accelerometer I want to look at is on event6.

So let's cat /dev/input/event6. This is a bit of a setup, because catting it is a bad idea. It won't hurt anything, but out comes all that garbage. Why? Because it's a binary file: a whole series of input_event structs, each with a timestamp, a code, and some other fields I'll talk about. At the command line, though, there's something built in: every Android distribution, every Android device I've tried, has getevent. I'll use the -t flag to get timestamps, because timing is a key property of input devices. There it goes, all the data comes out. The format is timestamp, event type, event code, and then the value. The event type tells you whether it's a sync, a relative event, or an absolute event (EV_SYN, EV_REL, EV_ABS).
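By the way, getevent isn't doing anything magic; it just reads fixed-size input_event records off the device node. A minimal sketch that prints the same timestamp, type, code, and value fields, assuming the event6 node from this demo:

    #include <fcntl.h>
    #include <linux/input.h>
    #include <unistd.h>
    #include <cstdio>

    int main() {
        // The node from this demo; find yours in /proc/bus/input/devices.
        int fd = open("/dev/input/event6", O_RDONLY);
        if (fd < 0)
            return 1;
        struct input_event ev;
        while (read(fd, &ev, sizeof(ev)) == (ssize_t)sizeof(ev)) {
            // The same four fields that getevent -t prints.
            std::printf("%ld.%06ld type=%d code=%d value=%d\n",
                        (long)ev.time.tv_sec, (long)ev.time.tv_usec,
                        ev.type, ev.code, ev.value);
        }
        close(fd);
        return 0;
    }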
You don't have to pay too much attention to see that this is real data, and this line is saying, hey, these three records go together; that's a sync. Then the codes zero, one, two are x, y, z. If you look at the values, that actually makes sense for an accelerometer, because what do accelerometers mostly measure? Gravity coming down. You can tell it's sitting flat on a table: this one is actually a negative number really close to zero, so x and y are both pretty close to zero, and this right here, 0x2000, well, because I've spent too much time doing this, I know that's what gravity shows up as on this sensor.

Now, that doesn't look like an Android sensor reading, which would say gravity is coming down at 9.8 meters per second squared. A big job of a sensor HAL is converting those driver values into floating point, into real engineering units. And what can mess you up is the range: right now this part is in its 4g mode, but it also supports 8g and 2g modes, and if somebody has changed the mode on you, or you've changed it and forgotten, that conversion factor changes. So that's something the HAL has to keep track of if you're going to switch ranges or do anything strange.

All right. So yes, use getevent; it's very handy. In fact, I've qualified how well the sensors on a platform are working just by piping this into a file, pulling it out, and parsing it up. There's a whole lot you can do with this before the rest of the Android system is brought up, if you're building your own board and just want to know, hey, how well are the sensors working?

So input events are a data pipe. If you look in the header, there is technically a back channel for force-feedback joysticks that I've been tempted to piggyback on, but the common way to do control is sysfs entries. Here this little demo will start up and go read a bunch of data, I think. Oh, first I'm going to turn it off. There we go. This is our good friend echo. How do you find this path? Where does that magic come from? You have to go read the driver and see where it puts things. Some drivers put the enables in /sys/class/sensors, and if you dig around in there you might even find ways to read the raw data or the calibration values. Samsung devices tend to put them here, for the drivers on Samsung devices, I should say. Okay, so, oh yeah, got to redirect into the file. Now I try getevent again and no luck, right? Nothing happening there, as we expect. I go back, enable the sensor, and events show up. Those actions right there are most of what your HAL needs to do. There are other details, but that's the key thing: the HAL operates the sensor drivers to enable them and get the data out of them.
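And here's that conversion job written down, a minimal sketch assuming this demo's scale of 0x2000 (8192) counts per g in the 4g range. Switch the range and this constant has to change with it, which is exactly what the HAL must track:

    // Driver counts to Android's m/s^2, for this part's 4g mode.
    static const float kGravityMs2   = 9.80665f;
    static const float kCountsPerG4g = 8192.0f;   // 0x2000 = 1 g, per the demo

    float accel_counts_to_ms2(int counts) {
        return (counts / kCountsPerG4g) * kGravityMs2;
    }
    // accel_counts_to_ms2(0x2000) is about 9.81, the flat-on-the-table
    // Z value Android expects.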
So, here is the header; go open it up and take a look. Here are the important bits: input event devices produce input_event structs, the same thing we were printing with getevent, with a timestamp, a type, a code, and a value. You'll see some of the types over there, like EV_KEY, that show the keyboard legacy of this. The difference between relative and absolute you don't have to worry about too much: a mouse is a relative device, where every time you move it, it just spits out a delta, while a touchpad is an absolute device, where hitting the top left or the bottom right of the pad always gives the same values, always 0,0 or 128,128. Why that matters is a subtle detail about the input event system filtering out repeated values, but I won't go into that here.

The event codes: x, y, z is pretty simple. But take a light sensor; that's where things start to break down. It's a single-axis sensor, so what event code should it output? I think on this device it shows up as something like ABS_GAS, the gas pedal; there's a lot of gaming heritage here. I've seen magnetometers that come back on the hat-switch codes. What is that? The standard starts to break down. But regular three-axis sensors usually show up like this. What bugs a lot of people about the input event system is that it takes four of these structs to represent what is really three, probably 16-bit, values. So there can be real overhead, especially with these high-rate gyroscopes doing a kilohertz of data; that's a lot of extra baggage to bring along, which is why people like IIO. But it's useful. Okay, here are some links you can click on, some more reading about input events; I'll let you check those out later.

Okay, libsensors. Its primary job: at startup, it tells Android, hey, here are the sensors I have. And then, just as we did at the command line, it needs to control the sensors and read the data. Remember I said ugly callbacks? You open up sensors.h, and this is the function pointer you need to implement, get_sensors_list. Ugly, but the best part is that you can find someone else's implementation and just tweak it. So the HAL makes sensors available, and it has more methods: activate, which is enable/disable, and setDelay, which is how fast you want the data; be wary of milliseconds versus nanoseconds and all that. And when you poll, see that struct, sensors_data_t, this one right here (newer versions of the header call it sensors_event_t)? Our input events carried integer values; inside that sensors_data_t, Android wants floating-point values. So that's really the job of libsensors: convert the driver values into Android's preferred engineering units. And there we go, that's what I was just talking about. This struct has a nice union, so you can get at the reading as a vector or as x, y, z, and there are a few more things in there.

Another job, besides units, is the orientation of the sensors. This drives everybody crazy. I've spent more hours of my life on this than I care to admit, because someone laid out one sensor on the board facing this way and the rest of them all facing that way, when really +x is supposed to go here, +y there, and +z out that way. It's a real time-waster to get them all oriented in the same direction, and that's another big job of libsensors.

So there's the header link; take a look. There are some decent comments in the header about Android's conventions for how it wants sensors oriented. As I said, adapt an existing implementation. If you are cooking your own, I recommend one from rowboat. A lot of the libsensors you'll find in the AOSP tree for Nexus devices link against proprietary sensor-fusion bits from the sensor manufacturers, and that really gets in the way of what you want to do; rowboat is a simpler place to start.
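Sketched against the ICS-era sensor_t layout, that startup job is a static table plus the get_sensors_list hook. Every value in this table is illustrative, so pull the real ones from your part's datasheet:

    #include <hardware/sensors.h>

    // One entry per sensor the HAL exposes; illustrative values only.
    static const struct sensor_t kSensorList[] = {
        { "MPU-6050 Accelerometer",        // name apps will see
          "InvenSense",                    // vendor
          1,                               // version
          0,                               // handle, ours to choose
          SENSOR_TYPE_ACCELEROMETER,
          4.0f * 9.80665f,                 // maxRange: 4g in m/s^2
          9.80665f / 8192.0f,              // resolution: one count
          0.5f,                            // power draw in mA
          10000,                           // minDelay in microseconds
          {} },                            // reserved
    };

    static int my_get_sensors_list(struct sensors_module_t * /*module*/,
                                   struct sensor_t const **list) {
        *list = kSensorList;
        return sizeof(kSensorList) / sizeof(kSensorList[0]);
    }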
I don't really like this next part. SensorBase is a class in that canonical libsensors implementation, and I think it represents a lot of what people don't like about C++; it's kind of thrown together, but it works. You derive your own class from SensorBase, an accelerometer sensor or a gyro sensor, or a generic three-axis sensor, however you like, and you utilize its input event reader. Someone else has already done the ugly bits of polling the input events, so you can reuse that; it takes care of issues you don't want to deal with.

All right, so go ahead and follow these links. Here's one from the Samsung Tuna. What is that, the Galaxy Nexus? I always get the Nexus names confused. If you look in there, you'll see that the light and pressure sensors are brought out in the same way they are in the TI rowboat libsensors, but the inertial sensors, the magnetometer, accelerometer, and gyro, are all handled by the InvenSense MPL layer, and that can be problematic if you're trying to re-implement the HAL.

So, back to... oh, my box is off over there. Some more stuff to read about the Linux infrastructure. Like I said, IIO is definitely where things are headed if you're looking at sensors: the latest InvenSense drivers are all IIO, and the latest ST drivers are all IIO. And there are lots of legacy... well, legacy has a bad connotation; there are lots of input event drivers out there too. And some more reading on the lower-level sensor device standards as well.

All right, things to remember. You want to implement the HAL as the glue in between. Reuse what you can, and try to find drivers that you can use; nobody likes debugging drivers.

So what's next? The HAL hasn't changed in the last three Android releases, but the implementation of the sensor manager has. It used to be that you could only have one accelerometer in a system; now, in ICS, if your HAL says there are two accelerometers, both of them will show up. Sensor fusion daemons: that's the binary bit I was talking about that InvenSense puts out, and Sensor Platforms ships one too. They make things a little harder for the hackers among us. But if you present just the raw sensors, Android will compute some sensor fusion for you. It's not great; I wouldn't steer a spacecraft with it, or little robots, but it gets the job done.

These dedicated sensor processors are an interesting idea. I sort of see them as the flip side of graphics processing: there's a surprising amount of math and physics in this stuff, a lot of computation, and anything you can do to keep the application processor from waking up helps. So people are adding dedicated sensor hubs, or combining the touch controller with a sensor hub, or the GPS with the sensor hub, and it's an interesting architecture choice. Along with that, in the future I really see something like an OpenGL for sensor processing emerging, especially as we start pushing the bounds of what these virtual sensors are and what you can do with them. Like I said, we think those virtual sensors are pretty interesting.
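Before the questions, to round out that SensorBase pattern from a few minutes ago: here's the rough shape of a derived sensor class, following the style of the tuna and rowboat libsensors trees. SensorBase, InputEventCircularReader, data_fd, and timevalToNano are helpers from those sources; the "mpu6050" device name, the handle, and the 8192-counts-per-g scale are placeholders:

    #include <linux/input.h>
    #include "SensorBase.h"          // from the stock libsensors sources
    #include "InputEventReader.h"

    class AccelSensor : public SensorBase {
        InputEventCircularReader mReader;
        sensors_event_t mPending;
    public:
        AccelSensor()
            : SensorBase(NULL, "mpu6050"),   // input device name to match
              mReader(32) {
            mPending.version = sizeof(sensors_event_t);
            mPending.sensor  = 0;            // handle from our sensor list
            mPending.type    = SENSOR_TYPE_ACCELEROMETER;
        }

        virtual int readEvents(sensors_event_t *data, int count) {
            if (count < 1 || mReader.fill(data_fd) < 0)
                return 0;
            input_event const *ev;
            int emitted = 0;
            while (count && mReader.readEvent(&ev)) {
                if (ev->type == EV_ABS) {
                    // Counts to m/s^2; placeholder scale, see above.
                    float v = ev->value * (9.80665f / 8192.0f);
                    if      (ev->code == ABS_X) mPending.acceleration.x = v;
                    else if (ev->code == ABS_Y) mPending.acceleration.y = v;
                    else if (ev->code == ABS_Z) mPending.acceleration.z = v;
                } else if (ev->type == EV_SYN) {
                    // A sync closes out one x/y/z sample; emit it.
                    mPending.timestamp = timevalToNano(ev->time);
                    *data++ = mPending;
                    emitted++;
                    count--;
                }
                mReader.next();
            }
            return emitted;
        }
    };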
So that's all I have; I'm happy to take any questions. Yeah. All right, the question is: I mentioned sensor fusion but didn't talk about it much; where does it show up? Let's go back here. So sensor fusion: there's nothing to say it can't happen right here in the HAL. Typically, if it's not straight in the HAL, it's loaded from it; the InvenSense MPL, for instance, is a shared object library the HAL loads, and that shared object does the fusion. AKM is a magnetometer manufacturer, a compass manufacturer; you can go to AKM's website and download their AKM daemon, which used to be on the Crespo device, I believe. And since it's a daemon, you can run top and see that akmd takes five or six percent of the CPU when you're doing a full device orientation, which is the tough thing people want sensors to do. With these dedicated sensor processors, the always-on kind of thing, calibration is always running in the background even when nobody's listening to the sensors. As I was saying, the dedicated sensor processors are starting to push that calibration, and even some fusion, down into the hardware, which has some interesting battery trade-offs, with the requisite increase in bill-of-materials cost.

Yeah, so if you don't provide a rotation vector, the rotation vector and orientation sensors, the sensor service will actually step in and do some fusion for you. You know, I'll get on my soapbox about virtual sensors again: you could also think of screen orientation as sensor fusion. It's taking the accelerometer and trying to figure out, are you holding the device this way or that way? A lot of times I just turn it off; if I'm lying on the couch and it's got it wrong, I want to see the screen this way. And I think that's part of the problem: they're doing some sensor fusion to make this virtual screen-orientation sensor, but they're doing it up here in the framework. If they had made it a virtual sensor down in the HAL, somebody else could come along with a better idea. Maybe they want to use the camera, track my eyes, correlate that with gravity, and say, hey, gravity says the device is facing this way, but he's really looking at it like that, so let's not rotate the screen. Unfortunately, that's not something you can do, because there's a little bit of fusion happening up there.

So I guess there's one other question. Yeah: IIO stands for Industrial I/O, I believe. You know, "sensor" has such a broad connotation, right? You can think of a camera as a light sensor; why doesn't that show up here? If you come from the PC world on Linux, there's hwmon for monitoring fan speed and CPU temperature; those are sensors too, but they're low-rate, the information that comes out isn't very fast. On the other end, you have people like CERN, who have sensors in the colliders producing gigabits of data a second, and input event definitely won't work for them; you can actually see in some of the complaints about IIO that they say even it doesn't work for that. IIO comes from Analog Devices. Another way to look at a sensor is that the silicon, this MEMS stuff, is registering very minute changes in electrical properties, and there are really good analog-to-digital converters in there. So these parts are really fancy, device-specific analog-to-digital converters.
And at the kind of data rates Analog Devices was seeing, they wanted to buffer a lot of data, and the input event system wasn't working for that; they wanted one-kilohertz data rates. So IIO has support for buffered data and multiple channels of data. And remember how I talked about a magnetometer coming back on the hat-switch codes, and what's that about? In IIO you can actually name each of the data channels coming out. The downside to IIO, in my opinion, is that it's overly flexible. With an input event device, I can read x, y, z and have 90% of what I want; with IIO, just because your libsensors targets one IIO driver doesn't mean you can instantly move to another one. It may look quite different.

That's a good question. So the question is: if you have multiple accelerometers, which one takes precedence? Actually, I haven't looked at the SensorManager API for that. But I've brought two of them up, I think, and they do show up. So, good question; I'll follow up and look into that. Okay. Well, if that's all the questions, please feel free to come on up if there's anything else you want to ask. Thank you.
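One concrete footnote on that IIO answer: named channels mean a HAL reads attributes like in_accel_x_raw and multiplies by the driver-supplied in_accel_scale, rather than decoding anonymous event codes. A minimal sketch; the iio:device0 number and the presence of these particular files vary by driver:

    #include <cstdio>

    static float read_iio_float(const char *path) {
        float v = 0.0f;
        FILE *f = std::fopen(path, "r");
        if (f) {
            std::fscanf(f, "%f", &v);
            std::fclose(f);
        }
        return v;
    }

    int main() {
        const char *base = "/sys/bus/iio/devices/iio:device0";
        char path[128];

        std::snprintf(path, sizeof(path), "%s/in_accel_x_raw", base);
        float raw = read_iio_float(path);

        // Per the IIO sysfs ABI, raw * scale yields m/s^2 for accelerometers.
        std::snprintf(path, sizeof(path), "%s/in_accel_scale", base);
        float scale = read_iio_float(path);

        std::printf("x = %f m/s^2\n", raw * scale);
        return 0;
    }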