This project actually has a different objective from the NEA's collection and reporting of data. Contrary to the suggestion that Dr. Thich Nguyen made, it was never my intention to correct, replace, audit, or even question the NEA's data. In fact, I depended upon their data. My objective was quite different.

My problem arose because of my running: I exercise over distance. If you exercise in one place — on a tennis court or a football field — then if you can see or smell haze, you go inside. But if you run, as I do, a 10 km loop, or cycle a 30 km loop, or paddle a 15 km loop, you face a very real possibility, in borderline haze conditions, of going out, running for an hour in no haze, then turning for home and being confronted by a wall of haze and an hour of ground to cover to get back. The question for me was: wouldn't it be nice if we had closer to real-time data, rather than three-hour data, at a much finer grain than just five regions of Singapore? Perhaps 50 or 60 sensors.

I looked into it a bit, and there are a few ways of measuring haze, or measuring particulate matter specifically. The amazingly expensive way — Dr. Alvish showed a picture of a station running more or less these sensors — captures particulate matter on a filter for 24 hours; you then weigh it, in micrograms, and work out the concentration in the air. It's very expensive, labour-intensive, and slow, but it's very accurate. Somewhat cheaper is the beta attenuation system. This runs air through a strip of filter tape, then puts the tape into a chamber with a beta radiation source and measures how much beta radiation is absorbed, from which you can infer how much matter was collected on the tape during, say, five minutes. Both of these are great. They are a solid basis for publishing data for medical, legal, and diplomatic purposes, and the NEA uses principally the one on the right and, I suspect, some of the one on the left.
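The gravimetric method above boils down to simple arithmetic: mass gained by the filter divided by the volume of air pumped through it. Here is a minimal sketch; the figures (a 16.7 L/min sampler run for 24 hours) are illustrative assumptions, not measurements from the talk.

```python
# Worked example of the gravimetric method: a pump draws air through a
# filter at a known flow rate for 24 hours; the filter's mass gain over
# the sampled air volume gives the PM concentration.

def gravimetric_concentration(mass_gain_ug, flow_lpm, hours):
    """Return PM concentration in micrograms per cubic metre."""
    volume_m3 = flow_lpm * 60 * hours / 1000.0  # L/min -> m^3 over the run
    return mass_gain_ug / volume_m3

# Illustrative: 2400 ug gained at 16.7 L/min over 24 h -> ~99.8 ug/m^3
print(round(gravimetric_concentration(2400, 16.7, 24), 1))
```

The slowness the talk mentions is visible here: one number per filter per 24-hour run.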
However, these are thousands, tens of thousands, or hundreds of thousands of dollars apiece, which, needless to say, I can't afford one of, let alone 50. So I took a different tack. There are these really cheap dust sensors, about 12 US dollars, used typically in air purifiers. They use an infrared beam to count particles floating through the beam. They don't measure the whole of the particulate matter — in fact, they're completely blind to most of the haze particles. They can only see things one micron or bigger. But my guess was that the haze has a fairly consistent composition, so if the one-micron-and-larger fraction is going up, it's probably indicative of the haze as a whole going up at the same time.

So I put together a prototype that works this way: take the dust sensor and a Spark Core, put them in a box, find a bunch of volunteers to stick it on their balconies and use their home Wi-Fi to send me a sample every 30 seconds, and make the map that you saw at the beginning. The device itself looks something like that: just a black box the size of your phone, but thicker, with some holes for air to come in and out. In this case it's sitting in a sheltered part of the balcony of a condo.

The device inside is minimalism writ large. There's an infrared LED on the right, which emits the beam. There's a photodiode up here at the left, whose current changes as the amount of light changes. Then there are baffles with a lens, and a resistor acting as a heater, creating a convection current to keep air moving through. As particles of dust, one micron or larger, enter the sensor, the amount of current flowing through the photodiode dips. To get a sense of the optics, the guy who did the teardown put an ordinary visible-light LED in place of the infrared one, behind the lens, and you can see the focus is just above the resistor.
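On the receiving side, each deployed box publishes a sample roughly every 30 seconds through the Spark cloud, and a server collects them to build the map. A minimal sketch of parsing one such published event is below; the field names and the payload shape are assumptions for illustration, not the project's actual schema.

```python
import json

# Hypothetical per-device sample event, as JSON. The "coreid" field
# identifies the device; "data" carries the sensor reading as a string.
def parse_event(raw):
    """Parse one published event into (device_id, reading)."""
    event = json.loads(raw)
    return event["coreid"], float(event["data"])

raw = '{"coreid": "53ff6f065067544840551187", "data": "3.75"}'
device, reading = parse_event(raw)
print(device, reading)
```

Keeping the payload this small matters when dozens of devices are each sending a sample every 30 seconds over residential Wi-Fi.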
So that's how it really works: it's focused on the air rising above the resistor. The same guy reverse-engineered the schematic, because the manufacturer doesn't publish one. The electronics geeks in the room might notice the part at the bottom. The important fact is that there's no feedback. So unlike reading a temperature sensor with an Arduino, where you read an analog value, the output of this one is either on or off — it's pulse-width modulated. You're actually measuring how much of the time the output spends low. The more time it spends low, the more particles there are breaking the beam and reducing the current through the photodiode.

The manufacturer doesn't publish a formula, just this chart. So I found somebody else using the sensor who had done a fitted curve — that polynomial there — which answers: given the fraction of the time that the output of the sensor spends low, what does that mean in terms of concentration? It comes out in particles per hundredth of a cubic foot, which is an amazingly convenient measure.

What the system delivers is a series of events via the Spark API. The key numbers are in the middle, the ones you can see. Notice they do jump around a bit, so you do have to average more than one sample — but whether you average two minutes or five minutes is a choice you can make after the data has been gathered, rather than prejudging it in the device, and it's the basis for deciding how close to real time the data is. I used a very crude linear approximation. The measurements from my sensor do move with the NEA's data; that was my first indication that I was onto something that might actually work. But I do need to keep reminding people that this is an approximation, and all of the data is extrapolated. It's not something that replaces the expensive instruments.
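The measurement described above — integrate the time the output line spends low over a window, then convert that "low ratio" to a concentration with a fitted curve — can be sketched as follows. The cubic coefficients are one published third-order fit for this class of sensor, not the manufacturer's; treat them as an approximation, as the talk stresses.

```python
# Low-pulse-occupancy measurement for a PWM-output dust sensor:
# the line pulses low while particles break the beam, so the fraction
# of a window spent low tracks particle count.

def low_ratio_percent(low_us, window_us):
    """Fraction of the window the output spent low, as a percentage."""
    return 100.0 * low_us / window_us

def concentration(ratio_pct):
    """Particles per 0.01 cubic foot, via a third-party fitted curve."""
    r = ratio_pct
    return 1.1 * r**3 - 3.8 * r**2 + 520 * r + 0.62

# A 30-second window in which the line spent 1.2 s low:
r = low_ratio_percent(1_200_000, 30_000_000)   # microseconds
print(round(r, 1), round(concentration(r), 1))
```

Note that the raw ratio is what gets shipped to the server; averaging over two or five minutes, and applying the curve, can then happen after the fact rather than in the device.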
As for the output: the raw numbers are in the data streams available to developers, but in the map that I publish it's just five colours, the same colours that the NEA uses for the five bands of the PSI, and they correspond approximately.

So this is great — we have a measurement, we have a device, it all works. There's one rather serious problem left, and it's humidity. It turns out that this method of measurement has a fundamental problem, and the data on this curve has been known about for a rather long time: as the humidity in the air goes up, the amount of scattering that occurs goes up drastically. Between 0% and about 60% relative humidity — it depends on the aerosol in detail — the sensor reads accurately. Once you get above 60%, the reading for the same concentration gets higher and higher, in this case up to ten-fold between 80% and 95%. And of course, in this country, the humidity spends most of its time in that range. So there's a serious problem with the data being more affected by how humid the air is than by how much haze there is. This is an ongoing piece of work; I won't go through it in detail at this time.

So the big things that I learned out of this were, one, that the NEA is really quite open. When my project came to the attention of the newspapers, they approached me — and for sure they then spoke to the NEA as well — and there was no interference of any kind. In fact, when I happened to meet one of their people later at an event and asked a lot of questions, he connected me with the Chief Science Officer for pollution control. She is supportive, but she's a scientist, not an IT person, and the need to deal with IT departments gets in the way of even the best of intentions, as I'm sure half of you will know. Most of us have been there.
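The humidity problem above can be sketched as a growth-factor correction: above some threshold, hygroscopic particles swell and scatter far more light for the same dry mass, so you divide the raw reading by an estimated enhancement factor. The functional form and coefficient below are illustrative assumptions only, not a calibrated model from the project.

```python
# Illustrative humidity correction for a light-scattering dust reading.
# Below ~60% RH the sensor is taken at face value; above it, an assumed
# enhancement curve f(RH) is divided out to approximate the dry reading.

def growth_factor(rh_pct, a=0.25):
    """Assumed scattering enhancement f(RH); ~1 below 60% RH."""
    if rh_pct <= 60:
        return 1.0
    rh = rh_pct / 100.0
    return 1.0 + a * (rh * rh) / (1.0 - rh)  # grows steeply toward 100% RH

def humidity_corrected(raw_reading, rh_pct):
    """Divide out the estimated humidity enhancement."""
    return raw_reading / growth_factor(rh_pct)
```

With this shape, the same raw reading at 95% RH is discounted several times more heavily than at 80% RH, which is the qualitative behaviour the curve in the talk shows — a real correction would need a co-located humidity sensor, which is exactly what is proposed later.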
Another important lesson was that the Spark Core was too immature. If you're working with a device on your desk that you're learning with, then a recently Kickstarted or crowdfunded component is fine. But when you're deploying dozens of them to run unsupervised, in places you can't easily get to, for months or years, then little problems become major headaches. I like the device and I like the people, but that was a choice I wish I hadn't made, because a couple of issues with the maturity of the devices got in the way, and I couldn't fix them because of the way they were deployed.

Finding appropriate volunteers was difficult. 80% of people in Singapore live in places that do not have a balcony with power on it, and those that do are all in one part of the island. So I can't simply take whichever people want to help: the correlation between where they are and where sensors are needed turned out to be really poor.

But the really big discovery was that the one-micron proxy — the fraction the infrared beam can see — is actually a reasonable proxy for measuring the intensity of the haze in Singapore. I do get enquiries about other places, and the answer is: I don't know. Beijing, for example, has a whole lot of stuff in its air that isn't haze, and the correlation work that I've done is critically dependent on what the air is made of. I have yet to have answers on whether it works elsewhere.

I'm starting to work on expanding this into places where there are no apartments. I've been offered sites where there's no infrastructure, and I object strenuously to paying a telco $220 a month to send a very small amount of sensor data. So we're looking at these Norwegian radio modules; I'll talk about them in a conference session on the weekend. The major number is the receive sensitivity in dBm — about 1,000 times as sensitive as the Wi-Fi receiver in my Lenovo.
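"1,000 times as sensitive" can be put in link-budget terms: a power ratio expressed in dB, and, under a free-space assumption, a range multiplier, since received power falls with the square of distance. This is a back-of-envelope sketch, not a prediction for any particular module.

```python
import math

def ratio_to_db(power_ratio):
    """Express a power ratio in decibels."""
    return 10 * math.log10(power_ratio)

def free_space_range_multiplier(extra_db):
    """Free-space path loss grows as 20*log10(distance),
    so extra link margin in dB scales range by 10^(dB/20)."""
    return 10 ** (extra_db / 20)

db = ratio_to_db(1000)                       # 1,000x power ratio in dB
print(db, round(free_space_range_multiplier(db), 1))
```

So a 1,000-fold sensitivity advantage is 30 dB of extra margin, which in free space is roughly a 30-fold range multiplier — consistent with multi-kilometre reach from a low-power transmitter, terrain permitting.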
Meaning that, on paper, these things have a multi-kilometre range despite operating entirely within the local low-power device limits. You can't get much data through — you can't web-browse with it — but to move a few samples from the sensors we're talking about, it's fine. I think these are about $50, so they're not expensive at all, and they're about an inch long — very small. So I'm working towards putting those in, and I'll talk more about that on the weekend if you wish to hear about it.

I've got 10 to 12 sensors operating now; I'd like to get to 50 or 60. (How much time? Down to the minute. Okay.) I have still failed to get the archival and live data feeds up and running, but I'll do those fairly soon, and I'll probably open-source the back end. What I particularly want to do is add humidity sensors and mesh modules, and also batteries and chargers for solar power in a couple of places. In light of the time, I'll cover a lot of the small detail in our conference session on the weekend, which I've already plugged. I will also be doing a session on something a bit preposterous: later this year I intend to bounce radio signals off the Moon. So I'll do a talk on that on the weekend too — anyone who's into radio electronics or long-distance communications, this should be of interest. Thank you.