So I'm Greg Conti, and the reason I'm here is to present some research that's relevant to all of us: the sensors in our lives, the instrumentation, how our lives are becoming more and more instrumented, and where that could go. And I take it we're really at a point where it's no longer a question of whether we can lead a private life; it's a matter of what percentage of our life is private and where that percentage will be in the future. So this talk is going to cover this breadth. Hopefully there should be some things in here that are a little surprising for everybody, but the idea is to think across the breadth of the problem and see just how much of our lives is instrumented. I'm not going to dwell on the usual suspects — the things that are pretty much common knowledge to people interested in privacy. We'll touch on those, but I'm not going to dig deeply into things that are well trodden; I'm focusing on the breadth. We're also going to look at some countermeasures, and there's a wide range of things we can do, but I argue there's really no silver-bullet technology or policy countermeasure. This is almost like the tide: we can't stop the tide, but maybe we can divert the water in a more positive direction. Also, this is probably one of the largest gatherings of privacy-aware folks in the world right now in this room, and it's a good opportunity to throw some ideas around, so I'm interested in your questions and dialogue in Q&A Room 3 afterwards as well. I thought this slide was an appropriate way to start. This is Ron Galella, the godfather of the American paparazzi, and here you can see him with his camera stalking Jackie Onassis — before the restraining order, of course.
No one actually reads disclaimers, so I thought it would be interesting to translate mine into different languages, back and forth, over and over, and see what popped out. So I'll read that one: "The viewpoints of views expressed by the author of the statement, Costumes, and do not reflect the official policy of the University of Bow and the United States Army, Pentagon, US government sector employment." That's about as legible — or understandable — as the original, so I thought it'd be fun. As you go through this talk, think about the sensors in your lives. Think about the sensors you've invited in. Think about the ones you don't know are there, and what you've traded for them — sometimes you have an opt-in option, sometimes an opt-out option, sometimes no option at all. Just think that through as you see the different ideas we throw forward. First off, it's surprisingly difficult to find pictures of people wearing ninja masks taken by ATM cameras — so I apologize if this is someone in the room — but the Nashville Police Department had a set online, so I took advantage of it. Joking aside, we don't want to live in a world where we have to wear a ninja mask to preserve some element of our privacy. And when you think about how information is being gathered, sometimes it's relentless. Has anyone tried to avoid taking the census? Yeah, how did that go? Right — they're pretty relentless. There are robo-callers calling your homes trying to make contact, satellites flying overhead, cars driving around the streets with cameras on top — really, who could have thunk that 20 years ago? Mapping the whole country. And sometimes we go to them. There were people who went to Black Hat with a barcode on their badge, and if you went up to a vendor booth, they'd scan your badge and you got some trinket in return — probably at the cost of a lifetime of spam from that company.
People were going up to get a button with a blue blinky light on it from one vendor. So they traded a lifetime of spam from a company for a blue blinky light. And these are security experts. But really, it's not the people in this room — everyone in this room is aware, right? You're able to defend yourselves to the degree you wish to be defended. We have to think in terms of everybody else: the people walking around pushing strollers with six kids through Caesars and up and down the Strip, the people who buy bottles of Dr Pepper and go to the website and type in the code, the people who respond to surveys printed on their restaurant receipt. Does anyone want to admit to having done that? Did anyone win? Someone won? Ah, very cool. I always thought the odds were like one in 10 billion to actually win, so very cool. And no talk on privacy is complete without a discussion of the Panopticon, which is a design for a prison. What it gets at is that people behave differently when they're being surveilled — especially when they can't tell whether anyone is watching at a given moment. That was part of the prison design. So it's something to think about: how this instrumentation influences our society. And then, what insights will this data provide? We're really hitting a point now where we can be predictive. We can predict what masses of people will do, and I believe we can largely predict what individuals will do based on the sums of the data being collected. Now, of course, this instrumentation serves dual purposes — I'm not saying it's all bad, nor that it's all good. What we really need to do is think about the right balance. It's kind of like the Wild West out there.
People are running amok, and we need to find the right limits on behavior by those collecting the data. So this talk starts by walking through data collection in your personal life, a little bit online — though I'm really staying in the analog space, because the real world impacts us and people think about it a little less, perhaps — then your home and community, some countermeasures, and then where the technology is going. As you see the different sensors and platforms I'll present, it's useful to think through various facets of them. What's the sensor? What are its capabilities? Is it analog or digital? Does it have a power source? If you're going to hack this thing or try to influence it in some way, it's good to know how it works. What are its strengths and weaknesses? What's the input — what is it sensing, and who or what is the subject? What environment is it operating in? Is there a public warning — do people know about it, warning or not? Is there a privacy policy, and is it an understandable one? Can you opt in? Can you opt out — or is there no realistic opt-out, the kind where you can opt out only if you stop breathing air? And are you complicit — have you made a decision to be engaged by this in return for something? Where is the data being retained? How much of it is retained? Is it ever destroyed? That's something we need to think a lot more about, because it's often in the best interest of those collecting the data not to destroy it. It's all about incentives: is there an incentive for the data to be destroyed? If a company is worried about being subpoenaed, perhaps they'll delete their email over time. Data destruction is a potential solution that can be valuable, and I think one that's actually workable on a large scale is pushing forward on the idea of giving data a lifespan.
The idea of teaching ourselves to forget. Humans forget, and it works pretty well. But with the cost of hard drives, it's trivial to retain data essentially forever. What type of processing is taking place, either on the device or afterwards? Are they processing for uniqueness? Maybe they don't know the identity of an individual, but if that individual pops up again, can they tell it's the same person? Is the resolution such that they can tie it to a database of individuals in some way? Is any other data mining taking place? How is it being communicated — in real time, near real time, in a batch? If you've got a given device, are you syncing it with something else with network connectivity? What are the odds of it leaking? And finally, who's consuming that information? Is it just being stored, or is it being consumed in some fashion — by those you know about and those you don't? How did they get their hands on it? Thinking about processing: we'll cover some automated means later in the talk, but there are also interesting innovations in enlisting crowdsourced surveillance. This is the Internet Eyes website, out of the EU. It allows anonymous viewers to earn points by monitoring video cameras and reporting things they see. The winner earns a 1,000-pound monthly prize; the runners-up get included in a thank-you list. At the bottom you can see an example of the view they demonstrate. And what boat is this? The Exxon Valdez. I collected this more than 100 days ago, so there are probably other examples that could be used. But it should come as no surprise to anyone in this room that information is slippery. It has a way of legally and illegally spilling.
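The "teaching ourselves to forget" idea can be sketched as a simple retention policy: every record carries a collection timestamp, and anything older than a chosen lifespan gets destroyed on purge. This is a toy illustration, not anything from the talk — the record fields and the 30-day lifespan are invented for the example.

```python
from datetime import datetime, timedelta

# Toy retention policy: each record carries a collection timestamp,
# and anything older than the chosen lifespan is destroyed on purge.
LIFESPAN = timedelta(days=30)  # assumed value, purely illustrative

def purge_expired(records, now):
    """Return only the records still within their lifespan."""
    return [r for r in records if now - r["collected_at"] <= LIFESPAN]

now = datetime(2010, 7, 31)
records = [
    {"subject": "cam-12", "collected_at": datetime(2010, 7, 30)},  # fresh
    {"subject": "cam-12", "collected_at": datetime(2010, 5, 1)},   # stale
]
kept = purge_expired(records, now)
print(len(kept))  # 1 — the stale record is forgotten
```

The point of the sketch is the incentive question from the talk: nothing technical stops a collector from simply never calling the purge step.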
And again, it comes back to the idea of destroying the data at some point — learning to forget may be valuable. There's also this trend toward uniqueness. As people analyze this data, they're trying to find ways to uniquely identify individuals, devices, and the like. An interesting example is firearm microstamping, where the firing pin has a small number on it: you fire the gun, the casing pops out, and there's a serial number stamped on the casing. With digital cameras, the pixel noise in the sensor can potentially be used to uniquely identify a camera, and that can tell which photo was taken by which camera. There's a wide range of these — the EFF has a very nice list on their website, among other places. Think about your Social Security number, your DNA. There's talk now that if one person is arrested for a crime, their DNA could match closely enough to implicate a family member in another crime. So if they had a DNA sample my brother left at a scene but didn't know whose it was, there's ongoing legal debate over whether a close familial match to my sample is enough probable cause to bring in my brother. It seems like people are trying to find ways to make every facet of our lives unique so it can be tracked. Biometrics is certainly all about that. Ham radio operators know that experienced operators can tell a person by their fist — how they send Morse code. Gait recognition. Printers and copiers putting microdots on pages so they can be traced. Anyway, what I wanted to get at here is that uniqueness is an important facet of what we're talking about.
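The camera sensor-noise idea boils down to fingerprint matching by correlation: a stored noise pattern for a camera is compared against the noise residual extracted from a new photo, and a high correlation suggests the same device. Here's a minimal sketch of that decision step — the vectors and the 0.5 threshold are invented; a real camera fingerprint would come from averaging denoising residuals over many images.

```python
import math

# Toy sketch of fingerprint matching by correlation. A camera's stored
# noise "fingerprint" is compared against the noise residual from a new
# photo; high correlation suggests the same sensor. All values below
# are invented for illustration.

def correlation(a, b):
    """Plain Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

fingerprint = [0.9, -0.2, 0.4, -0.7, 0.1, 0.6]      # stored camera pattern
residual_same = [0.8, -0.1, 0.5, -0.6, 0.2, 0.5]    # photo from that camera
residual_other = [-0.3, 0.7, -0.5, 0.2, -0.8, 0.1]  # photo from another one

THRESHOLD = 0.5  # assumed decision threshold
print(correlation(fingerprint, residual_same) > THRESHOLD)   # True
print(correlation(fingerprint, residual_other) > THRESHOLD)  # False
```

The same correlate-against-a-stored-template shape applies to most of the uniqueness examples above, from Morse-code fists to gait signatures — only the feature extraction changes.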
And there are other emerging technologies — and this I didn't make up, this is from Chatroulette; there's apparently a problem there. It's all about incentives, right? People are finding motivations to collect and perform ongoing analysis of data. And sensors are getting cheaper; they're becoming embedded in toys, in our phones, and in millions of other places, and the cost keeps dropping. So let's look at things you might carry on your person. Think of RFID tags in clothing — you may get custom clothing made online; there's a kids' website now where kids can design custom clothes and three weeks later they show up at the house. As an opt-in option, you may have chosen to use the Nike+iPod system and put a sensor in your shoe. And again, this is today — you have to look into the future and ask where this is going. One of the biggest culprits is your phone. More and more sensors are becoming embedded. There are applications, some of them arguably not all that trustworthy, performing activity. Phones may be automatically updated, and we don't know what type of remote control the service provider may have. And there's often location — GPS — activity as well. As I go through this, I'll throw out a couple of topics I think would make interesting research. I'm interested in the idea, and I'll get to this later, of software phoning home. People perform packet capture on their home network, and it's relatively straightforward to do that. I'm concerned that not enough people are looking at what is actually phoning home from your phone. What are these applications doing? It's a higher bar to entry, so I think it would be interesting to continue to watch that space — and not just the third-party applications, but the actual providers themselves.
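The "what is phoning home?" experiment described above reduces to capturing your device's traffic and tallying where the bytes go. A minimal sketch of the tallying step, assuming you already have flow records from a capture tool — the flows below are hand-invented, and the hostnames are placeholders, not real trackers:

```python
from collections import Counter

# Toy sketch of summarizing "phone home" traffic: given flow records
# captured from a phone's network activity (invented here — real data
# would come from tcpdump/Wireshark on your own network), tally bytes
# sent per destination so the big talkers stand out.

flows = [
    {"app": "flashlight", "dst": "ads.example-tracker.com", "bytes": 48_211},
    {"app": "flashlight", "dst": "api.example-vendor.com",  "bytes": 1_024},
    {"app": "weather",    "dst": "api.example-vendor.com",  "bytes": 9_740},
    {"app": "flashlight", "dst": "ads.example-tracker.com", "bytes": 52_003},
]

bytes_by_dst = Counter()
for f in flows:
    bytes_by_dst[f["dst"]] += f["bytes"]

for dst, total in bytes_by_dst.most_common():
    print(f"{dst}: {total} bytes")
```

Run per-application, one install at a time as the talk suggests, this kind of tally is enough to spot an app whose traffic is dominated by a destination unrelated to its function.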
And then things like next-generation watches. The Garmin FR80 automatically transfers data to your PC or Mac; it monitors your heart rate and other things, and you can share it through the Garmin Connect online community. Think about what watches will look like in the future. Of course, there are books. When they're not deleting your copy of 1984, we have to assume the companies certainly have the technical ability — and probably the financial incentive — to track what you're reading, when you're reading it, how fast you read, all of that. That's just another input. Online, we could go on and on — we could spend a whole day just discussing online — but I've focused on a couple of things here. Think about eHarmony: people are disclosing enough information about themselves in a survey to find a mate for life. So think about what we've covered — the different facets, the different types of things we're disclosing — and then ask: are these being combined in some way? Clearly social networking. I tend to believe that each time we get a connection from someone else, a little bit of our privacy dies along the way. I kind of like keeping people from high school, or from some other part of my life, separate — and no one knows of that connection until they pop up and join a social network. There's a website called Hunch, which is looking at the idea of social search — personalizing the internet. They ask people to answer questions, and once you've answered, they'll provide personalized search. So there's this tension between personalization and privacy: you potentially get more personalized service the more you disclose about yourself. And then there's the idea of location-aware games and other activities. But there's a general theme of finding ways to get the people pushing strollers up and down Caesars Palace to disclose information about themselves.
Finding some fun way. We give them a badge. We make them a mayor. We give them another star on a web account or whatever, to trade for personal information. And is our software phoning home? It absolutely is. So I would suggest this: if I had an infinite amount of time, I'd spend a year taking all the different types of applications, loading them one at a time, and seeing what they're phoning home, because they've got to be doing it. Sometimes they ask. Sometimes they have it buried in defaults. And other times they're certainly not asking. And these are legitimate companies. You also have to thank our government — there are plenty of photos available of people hanging smoke detectors in their homes. So let's take a look at some of the things in your home, and also think about where this is going. For example, cable boxes. There's experimentation ongoing — this isn't fielded or anything — on putting camera technologies in your cable box to tell if people are in the room, so they can tell whether people are watching the advertising, or to allow targeted advertising. I'm not going to go into the computer; just think of it as this multifaceted sensor that you're typing information into, that may have cameras hooked up, microphones, fingerprint readers, and the like — and it's also a facilitator of communications. This being DEF CON: who has modified a Teddy Ruxpin? Who has tortured a Teddy Ruxpin in their past? Do we have one? Do we have a few Furbys? Anyone with a Furby? Yes, a couple — sometimes the same people. Toys are including more and more sensors, right? They're probably syncing with PCs, and eventually they will have — or may already have — wireless connectivity. So what are they collecting? And think about our games. This is Project Natal, now called Microsoft Kinect. It has a depth-sensing camera. It can ID and track body parts.
I believe it goes on sale in November. They're talking about it being useful for a home security system — but they say it's just a question of whether it's socially acceptable to have a camera looking in on people that way. Our appliances — well, let me take a step back. Isn't it a great world we live in, if you can sell a refrigerator just by putting a pretty girl next to it? It's kind of shameless. But the idea of internet toasters — everything is becoming internet-aware, and it will include sensors: food consumption, for example. Geo-tagging capabilities are being built into cameras, with GPS automatically tagging the photos. And more. Smart power meters are on the horizon. Home automation is on the horizon. Each of these things is a sensor with the ability to collect information — the older technology not so much, but as we move forward, certainly more and more. When I was showing this to a friend, he asked about the smart urinal. I haven't done too much research on smart urinals, but I recall seeing the idea of embedding urinalysis technology in a urinal, so you can tell if your troubled youth is using drugs or something like that. And there's a spoof video we'll get to later in the talk. People are wiring up their house plants to say "thank you for watering me" or "urgent: please water me." Best men have wired up the bride and groom's honeymoon bed to report: they're on the job, they're off the job, action concluded — yes, 12 minutes. It gets better. And of course there are pleasure-model robots on the horizon. These are about $5,000 and available in Japan. I won't ask if anyone owns one, but God knows what they'll be collecting. I also think — kind of shamelessly — this would be an interesting research topic for someone in the room. I don't know if the National Science Foundation will go for it, but perhaps.
Okay, thank you. So that was our person, and that was our home. Now, our community. There's the idea of surveillance cameras in virtually every small store we go into. The fact that blood samples are taken from babies almost as soon as they're born. I could have gone on and on about the data being collected in a hospital — where is that being kept? Obviously there are benefits, but is it being protected? Is it being destroyed when appropriate? It raises those questions. At the gym, we're getting smarter and smarter technologies — you can put a wireless belt around your chest and it'll collect telemetry off the body: heart rate, that type of thing. And you have to think into the future: where is this going? Gunshot detectors instrumenting our neighborhoods — or not? Again, I have to thank the United States government for making pictures of urinalysis tests available. You have to think about how your life is being sampled — and this is literally a sample in a little plastic bottle. In England, they were very big for a while on keystroke logging to make sure their workers were working, although workers tend to find a way around such things. Your employer may require a polygraph. Where is this going? What are the incentives for the employers? Anyone gone to Disney World or Disneyland and had to get your thumbprint or fingerprints read? And the Super Bowl illustrates the showcasing of emerging technologies: using facial recognition at public forums to identify people. Clearly there are also the various frequent-shopper cards for grocery stores — you get a pretty significant discount, or rather a penalty if you do not participate. But think about it.
Think of all the things we've looked at already, and more and more: our space to operate privately is becoming more and more constrained. There are incentives, sometimes small, sometimes large, to opt into this stuff. And it's evolving from the grocery store's little plastic tag to your cell phone becoming a general-purpose loyalty card. As one industry quote puts it, no one in advertising has been able to figure out how to do one-to-one real-time marketing; the mobile phone is where that will probably happen — it's the only thing connected and always with you. Think about that. Clearly, finance is becoming more and more electronic. At the same time, sadly, anonymous phone calls are becoming more and more difficult, and there have been various movements to limit or prevent the purchase of anonymous, throwaway cell phones. And in our cars: how far can you drive anonymously? Maybe if you've got a 1965 or '66 Mustang you can — until you hit the first red-light camera or the first electronic toll collection. There are license plate readers — there are some great videos of that technology online — there's LoJack, radar and laser speed sensing, and of course the black box. Where is that going? Where are the incentives? Insurance companies have pilot programs out there to give a discount if someone allows them to instrument their car and tell whether there's good behavior or not. And of course travel by air is becoming less and less fun, and travel by train too. Think about it: how can you get around right now? How can you get from point A to point B without someone knowing about it? Okay, so now countermeasures. Like I said, there isn't a silver bullet in here, but there are some things that can be used to influence the flow of the debate, perhaps. Some of these things are illegal, okay?
Please talk to a lawyer if you are uncertain — I'm not a lawyer. So the first countermeasure is living in the 19th century. Does anyone live off the grid? We probably have a few people in the room who more or less live off the grid, or attempt to. Oh, and whose house is this? That's Ted Kaczynski's house, before he went away. But living in the 19th century is not altogether satisfactory, and neither is living in the 20th — really we want to be citizens of the 21st century with some checks and balances in place. Using paper money, the phone book, our library card, and paper mail isn't really the right answer either. One approach is disclosing that surveillance is taking place: technologies to detect that sensing is happening — obviously in certain communities there are professionals who sweep rooms for exactly that. Another is community monitoring — collective monitoring of the sensors in a neighborhood. The New York City Surveillance Camera Project is one way to provide a little pushback and a little transparency on what's going on. And this has to be the best security picture of all time — you'll see it in one out of every other talk here; it's kind of like the Panopticon. It's awesome, and if anyone has the original high-resolution version, please sell posters. It's really a keepsake. Then there's bypassing what you don't want. Anybody work in a Faraday room, or have worked in one? Yeah. Anyone have license plate covers for red-light cameras? They make special license plate covers. Does anyone know how well they work? Ah, they don't work. Dang, okay. Well, they're available online. And there are other technologies: RF shielding, RFID-blocking wallets, radar camouflage paint, and the like. Again, none of these are entirely a good fit.
Do you want to live your life in a Faraday cage? No. I heard a yes, but okay — most people probably don't. You can jam technologies. This is running water — the idea that the white noise helps prevent audio surveillance; I saw it in a spy movie. Anyway, you can find a picture of a running faucet online from the Maine state government — apparently they're trying to conserve water. And there's the idea of vibrating windows for protection, because your voice vibrates windows, and that can be sensed using a laser interferometer. And this one in particular — please don't do this, certainly not on equipment you don't own — is disabling sensors. Black tape can be one of the best sensor disablers out there, particularly on things you do own: you don't want the camera on your laptop to always be staring at you. You can take out the batteries. People have microwaved RFIDs. And if you've been to some of the security shows, they'll give out little plugs — oh, here's one — the little things that plug into your microphone jack to hardware-disable your microphone. And this I thought was very, very cool: you can express your displeasure in a public way. This individual — I don't have his name handy — was disappointed in the use of speed cameras in his community. So when the Bluff City, Tennessee police department let their domain name expire and it went up for sale, he purchased it, took it out from underneath them, and put up his own website about speed cameras. I thought that was pretty good. Art can also be a very effective tool — if you're trying to communicate to the masses, art is a mechanism to do that, and if there are any art students in the room, I think it'd be a really interesting area to explore.
You can certainly find other means to rally community support, because that's what it takes to force governments to make a major muscle movement. You may have heard stories about English villagers expressing displeasure — I don't know if they had pitchforks and torches, but they definitely made it known that they did not like Street View cars in their community. Embarrassment is another tool that will perhaps change behavior. And sensors can be spoofed — this researcher is holding up fake fingers. There are many other ways awareness can be raised. Cory Doctorow's book Little Brother is very good; it reaches young audiences and helps change perceptions. There's a YouTube video — ironically hosted by Google — that's a parody of where Google could go: you get the free Google toilet and it analyzes your activities. It gets the message out there. And movies like Minority Report really saw some things coming and reached a broad audience. We also need to get our arms around anonymization a little better — ideally transparent anonymization. If it's not done correctly, people have shown you can work backwards, and each time you anonymize the data you perhaps dilute its value to the holder. This, and the idea of aging data and destroying data, merits future work. But again, there has to be an incentive: whoever holds the data has to be incentivized to do these things. One way to do that is by voting with your wallet. Another is engaging policy makers — and again, the governments have lots of good pictures of people engaging policy makers, because they pretend to like that sort of thing. There are policy solutions. This one is out of Database Nation: the Canadian Standards Association code for protecting personal information. It was pretty thoughtful. So policy is part of this.
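The "work backwards" failure mode above has a concrete, checkable formalization: k-anonymity, where every combination of quasi-identifiers (ZIP, birth year, and so on) in a released dataset must be shared by at least k records, or individuals can be re-identified by joining against outside data. The talk doesn't name k-anonymity specifically; this is one standard way to make the point. Field names, sample data, and k=2 are all invented:

```python
from collections import Counter

# Minimal k-anonymity check: every combination of quasi-identifiers
# must be shared by at least k records, otherwise the "anonymized"
# release can be worked backwards to individuals.

def is_k_anonymous(records, quasi_ids, k):
    groups = Counter(tuple(r[q] for q in quasi_ids) for r in records)
    return all(count >= k for count in groups.values())

released = [
    {"zip": "891**", "birth_year": 1970, "diagnosis": "flu"},
    {"zip": "891**", "birth_year": 1970, "diagnosis": "cold"},
    {"zip": "891**", "birth_year": 1985, "diagnosis": "flu"},  # unique combo
]
print(is_k_anonymous(released, ["zip", "birth_year"], 2))  # False
```

This also illustrates the dilution trade-off from the talk: generalizing the 1985 record's fields until it matches others would restore anonymity, but at the cost of detail, and so of value to the data holder.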
Helping inform decision makers is part of this. And of course, one way to head toward this is voting with your vote — although it's really the many we need; not onesies and twosies, but the many, to help influence these things. I like the idea of supporting opt-in in general, because it allows data collection from those willing to participate, but they have to make a conscious choice. I thought a great example was the HOPE badge. In '08 they gave out an active RFID badge and handed you the battery separately, and they said: there's an active RFID in here if you'd like to be part of the attendee metadata project — it's cool, it's anonymous, and you opt in by putting the battery in your badge. When you want to opt out, you take the battery out. Then there's conducting interesting research and sharing it — I had put this in here before, and I think this is Peter Eckersley; actually, he's speaking next hour on this project. The idea of doing interesting, relevant research in the space, sharing it, and finding other like-minded individuals. Besides DEF CON, some of the places this research appears: the Computers, Freedom and Privacy conference, the Workshop on Privacy in the Electronic Society, and the Privacy Enhancing Technologies Symposium. And there's more out there. Of course, support your privacy champion of choice — the EFF and EPIC being two examples. And I think it's important that we think about privacy by design. Does your laptop have a little flap you can slide over the camera to make sure it's not collecting anything? This is a coffee table — a coffee table and shredder. Okay, thank you.
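As an example of the kind of shareable research just mentioned: measuring how identifying a device's observable attributes are comes down to an entropy calculation — the surprisal of an attribute value is the negative log of how often it occurs, and surprisals of independent attributes add. The attribute names and frequencies below are assumptions invented for the sketch, not measured data:

```python
import math

# Sketch of quantifying "uniqueness": the surprisal of an attribute
# value is -log2 of its frequency. A device whose combined attributes
# are rarer than 1 in N is effectively unique among N users.

def surprisal_bits(frequency):
    """Self-information of an attribute value seen with this frequency."""
    return -math.log2(frequency)

# Assumed observed frequencies for three attribute values:
attribute_freqs = {
    "user_agent": 1 / 2000,  # this exact user-agent string
    "fonts": 1 / 500,        # this exact font list
    "timezone": 1 / 8,       # this timezone
}

total_bits = sum(surprisal_bits(f) for f in attribute_freqs.values())
print(round(total_bits, 1))  # → 22.9 bits of identifying information
```

Roughly 23 bits would suffice to single out one device among about eight million — which is why a handful of individually innocuous attributes can combine into a tracking identifier.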
Now, I showed this to a friend and he said, are you mentioning overthrow of the government here? No, I'm not talking about overthrow of the government — I was going with the top line. And I think common sense is the most important part here. So let's take a quick look at the future. Emerging technologies like through-the-wall radar are on the horizon. There's internet-scale experimentation: Google, for example, tried a Bing-like background image, and it didn't go over well — it was planned as a 24-hour experiment, and people let them know we don't like this. But they were able to find things out; when you get numbers like that, you get statistical significance in the results, and it can be subtle experimentation. Large online companies can do that type of thing. I wouldn't be surprised to see electronic license plates. And there's this site, KidzEyes, which convinces parents to sign their kids up to take surveys from ages six to 12 — and teenagers too — and provides some nominal trinket or funding in return. Think about a future where we have wearable computing. People are working now on experiments using technologies such as MRI to see how people think. In one experiment, people were asked to predict what they were going to do based on a certain set of information, and they predicted their own behavior correctly 50% of the time — while the MRI experiment was able to predict with 75% accuracy. More accurate than the human. Then there's network science and mining user data. This was just an example of a recent AT&T project that mined cell phone records and could tell travel habits — how far New Yorkers traveled on weekends — based on their cell phone activity.
And there's an interesting group at MIT exploring this idea of reality mining — exploring themes like organizational dynamics and complex social systems from this data, this sampling of the real world. There are people performing interesting research, trying to dig out and analyze the most they can from the data. And it's important to realize that this data is probably not being thrown away, so advanced future techniques can be played back against historical data. The future of augmented reality is the same type of thing. And the emerging Internet of Things — where virtually every item will be connected to the internet and tracked in some way — just opens up ample opportunities. So what is this a map of? Besides the United States, I know that. It's radiation fallout from probable targets — a Cold War-era thing showing where you should live in the event of fallout. I showed this to a friend who happens to be Canadian, and he said obviously you should live in Canada, because there's nothing up there. It was a good point. The reason I included it here is that our space to operate anonymously, or privately, is dwindling rapidly. I really think we're at the point where it's not a matter of whether we can do it at all — it's what percentage of our life is being instrumented, and we should think that through. There's probably an interesting model there that could be built. So with that — again, the government is great; they have freely available pictures of people hanging sensors. With that, are there any questions? Okay, we'll take it to the Q&A room, which is Room 3: go out to the main hallway, turn right, and it's immediately on your left. Thank you very much.