I had a lot of fun troubles with infrared cameras in preparation for this talk. I had one that was about ten years old that I was going to use, and it turned out they don't make the software for it anymore, and you need this weird piece of hardware — which they also don't make anymore — to actually get it to work. So that didn't work. Then we went to these other guys to try to rent one, because apparently you can rent these things, and that one is being quite silly right now. So I'm going to start. We were originally going to open with the Minority Report trailer, but that didn't work out, because we called Fox, and Fox was like: no, that's a bad thing to do. Fun fact I discovered, though: if you ever want to know every single advertisement in a movie trailer, just ask the studio to let you use it in a presentation. They'll send you back a list saying these are the clips from the movie you can never use, because they're advertisements. We wanted to use the Minority Report trailer because it parallels very closely the program we did research on and are going to present to you: a program called FAST — Future Attribute Screening Technology — because everything has to have a clever acronym. And like Minority Report, the program pretty much has precogs: it's supposed to stop murder by looking into the future, and it's supposed to be correct. As we did research into this program, we came up with a few questions, because, as in Minority Report, doubts eventually arose. I'm Samon Restchikov, and we're going to present to you about FAST. We'd like to thank Morgan Wang, our collaborator, who helped us do a lot of the research for this. It turns out you can find all sorts of very interesting things on the internet by using Google. Who knew? Unfortunately, she can't be here right now. We'd also like to thank the U.S.
Department of Homeland Security, Draper Laboratory, Paul Ekman, the people at SPOT, and everyone involved in FAST who developed these programs — because without you guys, we really wouldn't have a presentation to present, or, you know, methodologies to dissect. So: we heart the USG. Thank you very much. A little disclaimer before we start: I'm not part of Draper Labs, and I'm not a fed. We're out of the loop on all this stuff. Everything we found, we found by Googling, so if any of our information is inaccurate, that's because you guys don't publish anything, which is kind of unfortunate. So, FAST. The way we found out about FAST was that we were just Googling around, browsing the internet, because we do that sometimes. We went on the DHS website and saw two very interesting projects: the FAST project and the Hostile Intent project. We read through them, and our eyes focused on one very interesting little sentence: "The system will measure both physiological and behavioral signals to make probabilistic assessments of malintent." Hmm. When we saw this, we were like: whoa, that sounds really cool. Biometrics to fight terrorism? Curing terrorism with science? Sounds awesome. The premise of FAST is that you can tell people's malintent — people's intentions — by scanning them with very advanced biometrics. So we were like: precogs. They're real now, apparently. We started doing research, and we found that the ideological thread behind the program started sometime around the late '90s — ancient history, really old — with Paul Ekman. You see, he looks a little bit like Napoleon. So Dr.
Paul Ekman is famous for his research on deception theory — figuring out how to tell who's lying and who's not. Have any of you read Malcolm Gladwell's book, Blink? I'm so sorry. He's more famously known as the inspiration for Dr. Cal Lightman in the now-canceled Fox show Lie to Me. His entire thing is this theory of microexpressions: when you try to hide feelings, there are certain involuntary expressions — little twitches at the side of your mouth, things like that — that are the same across every human being. To find out more about this, he ran something called the Wizards Project. He took people from all kinds of walks of life — cops, lawyers, high school teachers — and tested their ability to reliably detect who was lying and who wasn't. (High school teachers can't tell.) He found that basically the only people significantly better than everyone else were Secret Service agents — either because they get some kind of training, or because the group is just self-selecting. We don't know which, because he stopped publishing when, he says, he realized that other countries were watching his work very closely and he didn't want it used against the United States. This also means that no one else reviews his research, so we don't know whether he's just making it up. Security, woohoo. So Paul Ekman's research on microexpressions was the genesis of SPOT — Screening of Passengers by Observation Techniques — which has nothing to do with dogs and is not very cute. In SPOT, you have TSA agents trained in the analysis of microexpressions. So say you're going through the airport and you set off some sort of flag — you're a young Muslim guy reading the Quran who's interested in Middle Eastern politics, or you book a one-way flight ticket and check no luggage. You might get referred to — I think it's called a behavioral analysis officer.
Yeah — to a behavioral analysis officer, a BAO, who will question you in the nice little room. And while he questions you, he'll pay very close attention to your face and try to detect any microexpressions — you know, to tell if you're evil. At the beginning this sounds like a really cool idea; if the research actually works, it's very interesting. So at first people were kind of excited about it, saying we should use it everywhere, it's going to be awesome. But over time, confidence in the program started to wane a little. That's not actually my idea — that's an actual headline. I'm not that mean, although I did put in the dead dog. The thing with SPOT is that in the preliminary trials — it started right around 9/11, so it's been running for about five years — they referred, I think, 40,000 people to further screening, and 300 of them were arrested. Which sounds awesome, because: dude, 300 terrorists caught. That's not bad for a few guys who can read faces. But no actual enemy combatants were arrested — the 300 were, I think, entirely drug dealers and people lacking documentation. So you've arrested 300 people by sending 40,000 extra through questioning, and at that point it's not very surprising. So the program has not been incredibly successful, and the big problem seems to be that human beings, even when trained, are just not good at detecting deception. Maybe some people are, per Paul Ekman's research, but it's not something you can really do very well. This next slide isn't mine — it's from one of Judee Burgoon's presentations. Judee Burgoon is someone we think is very closely tied to FAST.
And so out of the relative semi-failure of SPOT, you have people deciding that human beings are sucky at this and that we should get robots to do it instead. Which is always a good idea. That coin is also from her presentation — as you can see, government presentations are classy. SPOT was also very expensive: $212 million a year, and I think the Obama administration just increased its allotment to something like $286 million. In donuts, because that's a great currency, that's 160 donuts per federal employee per year — so if there are any feds in here and you want your 160 donuts, just talk to some people — or one donut for everyone in the United States on Christmas. So it's not incredibly expensive, but it's not a great deal for 300 illegal immigrants. So they had this program, and it wasn't very successful. And if at first you don't succeed, you try and try again — because we believe in FAST. FAST is this new program that's going to stop the terrorists from destroying America. This is also a slide from Judee Burgoon's presentations, and I really like it because two of those pictures are from conspiracy theory books — one of them about FDR. I'm just a big fan of this thing. So what is FAST? We haven't really explained exactly what it's supposed to be. Well, it's a trailer. Literally — it's supposed to come in the form factor of a trailer. That's something they've emphasized many times, because it's supposed to be portable, so you could put it in lots of places: not just airports, but Congress — you know, to check whether any terrorists are getting in — or your local sports stadium. In various presentations around the internet, we saw suggestions of using it for border patrol.
Or, I think, even public transportation, which would be interesting — get your mind right every time you board a bus. So what is FAST? FAST is a suite of pretty modern sensors. First on the list is this thing called BioLiDAR. LiDAR is basically a fancy laser range finder, and someone over at the defense department has figured out how to use it to measure heart rate. Because you can totally do that. The next part — the one we'll be demonstrating — is a thermal camera, which you use to measure stress responses. After that there's a video camera at eye level, which looks at your pupils, how red your eyes are, agitation — maybe even microexpressions, if the algorithms get good enough. It could also just be used as a really good hangover detector. They've also mentioned in a few presentations something to measure gait, which is unique to each person. We haven't actually seen any data on that — to be honest, we haven't seen much data on anything, but we've seen no gait videos at all — so we assume they're working on it somehow. Two of the pushes behind FAST are not just actually getting this detection thing working, but also making sure it's fast. Because we have polygraph machines. Polygraphs are great — not incredibly accurate, but they do sort of work — and it takes a trained professional an hour to run an exam. This thing is supposed to take about five minutes. It's supposed to be high throughput, so they could actually use it in portal settings. So you go into this trailer — in the middle of an unmarked van or something — and those sensors, I mean, HAL, look at you, and this guy asks you a few questions. And it looks a little like this. Try and refrain from moving. Now they're going to start asking her questions. Do you see that? I think she's blushing.
This is from — we think; we're not sure, because again, none of this is published well — one of the only field studies of the test they've done. That's the thermal camera video, and I think that's heart rate. And as you can see, she failed the exam. So that's pretty much it. Contrary to that video, from everything we've seen, it seems that if HAL doesn't like you, you get to go through it again — because we don't really trust computers all that much, right? You guys know all about this. So if HAL decides to have a glitch, you get a second chance. But if HAL doesn't like you the second time, you get a meeting with this guy. Do you see the glove? Notice the glove — it's important. You get to go into this room and have a nice little conversation with him. And if HAL does like you, you get off scot-free, and it's wonderful. So we thought all this stuff was really, really cool. Um — did it get working? You don't know? Oh, brilliant. Theoretically we should have a demo around here, because we thought that FLIRs — forward-looking infrared cameras — are cool. We're going to show you how little of a difference there is between someone who is lying and someone who isn't, and how much of a difference there is between someone who ran around the room multiple times and someone who didn't. ...What? All right, we'll get back to this in a little bit. Hopefully. So, that was the system, and the system sounds pretty great. If you do get something like this working — if you can in fact tell people's internal mental states by biometrics — oh, brilliant. Wonderful. Thank you very much, sir. Oh. Or not. Huh. Okay. So, if you get this thing called FAST to work, it's pretty awesome, because you do in fact have a very effective tool for deterring terrorist attacks.
Because, well, you can maybe invent new ways to make bombs, but it's very hard to control the way your biometrics work — it's very hard to control these very fine things. This would, theoretically, be a foolproof system if it worked. Now, there's not really much published on this, so we don't know. Maybe the guys working on FAST at Draper really do have it all working, and it's awesome, and they're going to deploy it tomorrow and there will be no more terrorism. But as we looked through, we found a few methodologically shady points, and we'd like to point them out. One of the things we found was that the theoretical basis of FAST is interpersonal deception theory. That's supposed to say "deception." It was developed by Buller and Burgoon — Burgoon being the person whose slides I've been showing you. The theory is fairly complex; I can't really summarize all of it here, because psych theory — it's like 18 different assumptions about how we work when we lie. It's interesting. But there are a few sketchy things about the papers in which it was presented. All the papers are called Interpersonal Deception Theory plus a Roman numeral, and most of them refer to a study in which you have two people: an interrogator and a suspect. The interrogator first asks questions for which the answer is known, and the suspect answers truthfully. Then the interrogator starts asking questions to which the suspect may or may not lie. The interrogator knows which answers are definitely truthful and which aren't — and the truthful ones and the non-truthful ones are always in the same order.
So this study may have trained the interrogators to see cues about lying that weren't actually there — they may have started guessing based on the order of the questions. A better criticism of it is in that paper, which, if you're interested, is quite nice. It's civil, but the tone is something like: Burgoon, you did really awesome work back in the day — what the heck? It's a little sad. So, another assumption the FAST guys have been making is illustrated in this slide, which is also from Burgoon's presentations: you have people who are benign and people who are hostile, and the benign people are going to be in one set of internal states while the hostile people are in another. Which is fine. But the benign people are going to be something like calm, or excited, or maybe at worst agitated, while the hostiles are going to be fearful and deceptive and angry and tense — because good people are never tense. And this seems sort of reasonable: people who are going to be deceptive are going to be in a slightly different mental state. It might extend to lots of things, like drug dealing or people trying to smuggle stuff through. But it won't necessarily extend to someone who is doing something for the glory of their country, or their ideology, or their God. Because, you see, if you're about to do something that is almost your purpose in life — you're about to really fulfill yourself — there's no reason for you to be tense or fearful. You're going to be fairly calm. If you've ever seen people who are about to do something really important, they're quite happy. They can be calm even while doing something incredibly dangerous.
There are various studies supporting this, if you read into it. So you have these terrorists: they're about to blow themselves up and make themselves happy. They're about to get the ultimate prize. So that's a problem. Here's another problem. This is another slide, and it's interesting because it shows the way they've been constructing some of their studies. The mock theft experiment is one of their experiments — we found some information on it, though no specific methodology — in which some people steal something and some don't, then both try to claim they're innocent while running through the sets of detectors, and you see what happens. But one of the strange things is that you see these words: "two states, innocent and deceptive," and "actors in screening location scenarios." That's weird, because it sounds like they're getting people to pretend to be innocent or guilty — to pretend to be relaxed, or agitated, or overcontrolled. And we thought that would be really strange, because if you calibrate your instruments against something like "go act suspicious," you're not going to get anything that works too well in the field. But in fact, it seems this is actually what they've been doing, as you can see. It's a little troublesome. Yes, it's very early in the research, but it "looks very promising." I tried to contact John Verrico, but he never answered any of my emails, which was kind of unfortunate. We were trying really hard to believe that they can't be doing this — that they've got to be doing tons of field tests, designing their experiments so that people are actually lying and actually care about it. But apparently, in many experiments, that has not been the case. This next slide is, I think, from the mock theft experiment.
You have the same guy in three different states. The only way you'd get something like that is either if you ask the same guy three different questions and he naturally tenses up like that, or if you tell the guy to pretend to be relaxed versus pretend to be controlled. Frankly, given the exaggeratedness of those poses, I think it's the latter. So that's another problem. Even Paul Ekman — the guy who originally started SPOT and has been pushing a lot of these ideas — thinks this is just a bad idea. Which makes sense: if you're going to calibrate your instruments to detect something as fine as the difference between someone who's angry at their spouse and someone who's angry at America, you need to be doing really, really, really good research and calibrating your instruments against things actually relevant to that sort of question. But we don't want to only criticize — we want to be a little constructive. There are two things their studies should be doing to actually be meaningful. First, the subject must actually be lying. They can't pretend to be someone evil; they have to actually lie about something. Second, the subject must actually care about getting through — there must be some sort of stake in the situation. Because no stakes at all is a very different situation from a serious stake in getting through or not getting through. These conditions are non-trivial to replicate in a psychology lab, but it is possible. You could get someone who strongly cares about some political or religious view to try to get through this kind of screening while pretending to believe the opposite.
So, I don't know — get an atheist to pretend to be religious, stick them through this thing, and see whether it can tell they're lying. You could get people with issues they personally care about — things they're embarrassed by, like medical conditions — to try to pretend they don't have them. That's not exactly the same situation, but it's closer than getting actors to pretend to be evil. You could also do field tests — they should be doing lots of field tests on all this. With a field test, you have a lot of people going through the airport anyway; just have someone flash an FBI badge at one of them and say: hi, would you like to participate in a study? Have a knife. You have to go through that door and say you don't have a knife. It's okay, you won't get in trouble — I'm a real fed. And just go through. That would be a very realistic situation, because not only would the person be agitated, but they'd be trying to smuggle through something they think is okay but aren't really sure about — because, you know, you're still smuggling something through. If they did lots of tests like that — and there's enough space in airports to stick a little trailer on the side — they might actually start getting some pretty good data. But those are all methodological criticisms, all things you can fix with a little tweaking and a little money. There are other things, fundamental to the problem of portal research and biometric state analysis, that you can't really get around. So: about 200 years ago, there was this guy named Thomas Bayes. He was a priest — essentially the patron saint of statisticians.
He came up with this theorem — which you're probably seeing right now in some university math department — where, given a few probabilities, you can figure out: if one thing has happened, what's the chance of something else having happened? Because this is all math and doesn't mean anything by itself, we thought we'd put it in more concrete terms. Say, worst case scenario, there are 100 terrorists in an airport of a million people — five times the number involved in the 9/11 operation. So 0.01 percent of the total, one ten-thousandth, are in fact terrorists. And say this is several years in the future, and we have something called SuperFAST, which can tell the difference between terrorists and non-terrorists 99 percent of the time. Sounds pretty good, right? Sounds like we'd catch almost all the terrorists. Unfortunately, not quite. We would catch 99 of the terrorists, because it's 99 percent accurate on that side — but 9,999 normal people, non-terrorists, would also be flagged as terrorists. So we'd flag over 10,000 people, only 99 of whom would be terrorists. If the system says someone is a terrorist, there's actually a really low chance that it's true — a much, much greater chance that they're not. And mind you, this is at 99 percent; we're nowhere near that right now. The issue is that you can't really rely on the data this thing gives you until it has an incredibly low false positive rate — low relative to the tiny base rate of terrorists. Sure, it would help narrow things down, but the flood of false alarms decreases its effectiveness compared to, say, normal humans or just lots of secondary screening. Furthermore, we don't even know the actual accuracy rate of this thing, because in one paper we saw 78 percent.
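The base-rate arithmetic above is just Bayes' theorem. A quick sketch, using the talk's deliberately made-up numbers (100 terrorists, a million travelers, 99 percent accuracy in both directions — hypotheticals, not measured figures):

```python
# Base-rate sketch with the talk's hypothetical numbers: 100 terrorists
# hiding among 1,000,000 airport travelers, and a "SuperFAST" detector
# that is right 99% of the time in both directions.
population = 1_000_000
terrorists = 100
sensitivity = 0.99   # P(flagged | terrorist)
specificity = 0.99   # P(cleared | innocent)

innocents = population - terrorists
true_positives = sensitivity * terrorists        # 99 terrorists flagged
false_positives = (1 - specificity) * innocents  # 9,999 innocents flagged

flagged = true_positives + false_positives       # 10,098 people flagged
# Bayes' theorem: P(terrorist | flagged)
posterior = true_positives / flagged

print(f"people flagged:        {flagged:,.0f}")
print(f"P(terrorist | flagged): {posterior:.2%}")  # about 0.98%
```

So even a 99-percent-accurate scanner hauls in roughly a hundred innocents for every terrorist; the posterior only becomes useful once the false-positive rate drops to somewhere near the (tiny) base rate.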
We've seen everything up to 99 percent — although I actually spent all morning looking for where that 99 percent came from, and I have no idea where I saw it; I just remember it. The thing is, they really haven't published anything at all, and that's kind of a problem. So, here's a study they did — I think the only field test of this stuff we've heard of. They had 144 people, who had no idea what was happening, go through a FAST scanner on the way to some sort of electronics expo. I'm not sure whether the expo was real or staged; I don't remember. Of those 144 people, 23 were given "a disruptive device." That's all we know. And yeah, that was the study. That was the field test. Does anyone see anything interesting about this? No? Yeah — him, that guy. Yeah, that's one thing. But another, slightly different issue is just the sample size. If you're going to deploy this sort of biometric scanning system industrially and scan everyone with it, you need to be doing very large field tests on very large numbers of people to make sure it actually works reliably. This sort of thing is not incredibly good for that, because every human being is special in their own little way — I told you guys, grade school was right. Some people, for instance, have heart arrhythmias: they have these little attacks where, all of a sudden, without even getting short of breath — they're working perfectly fine — their heart beats that many times faster. Suppose someone like that goes through a FAST scanner and that happens; that might register as something interesting. And there are lots of other things that produce massive changes in biometrics — fevers, hangovers, what else — seriously, almost any disease is going to change a lot of these readings.
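The sample-size complaint above can be made concrete. Here's a rough sketch of the error bar a 144-person field test actually buys you; the "8 innocents flagged" figure is invented for illustration, since no per-subject results were ever published:

```python
import math

# Hypothetical: of the 144 subjects, 121 carried nothing (144 - 23),
# and suppose 8 of those innocents were wrongly flagged.  (The "8" is
# an invented number; no actual results were published.)
n = 121
flagged = 8
p_hat = flagged / n  # observed false-positive rate, about 6.6%

# 95% confidence interval, normal approximation
half_width = 1.96 * math.sqrt(p_hat * (1 - p_hat) / n)
lo, hi = p_hat - half_width, p_hat + half_width
print(f"observed {p_hat:.1%}, 95% CI {lo:.1%} .. {hi:.1%}")  # roughly 2.2% .. 11.0%
```

With only ~121 innocent subjects, the measured false-positive rate could plausibly be anywhere from about 2 to 11 percent — a study this size can't even distinguish the 4-percent and 8-percent figures floating around, let alone certify a rate low enough to matter against the base-rate problem.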
It might not change things like pupil dilation, but — what? Possibly. Yes, indeed. Yeah, actually, three cups of coffee might do that to you too. If you ever go to a French café, they serve you cappuccinos in bowls, so if you have three of those — yeah. So that's a problem, and it's not something you can really get around. Well, you sort of can: you could collect and store the biometrics of every single human being in the United States and then individually calibrate everything to everyone. Which would be nice. And you're probably going to have to do a big chunk of that anyway to get enough training data to make the system work reliably. And then, you know, you'd have a nice little ID system for every person in the United States. Unfortunately, the government having biometrics on all of you — especially very detailed ones, like your heart rate and how much you sweat — is not an incredibly popular proposition at this time. So here's another thing. Do you know what these are? Any ideas? Yeah — Valium. Valium is a sedative: it makes you calmer and pulls your excited states closer to a basal state. Well, that might help some people, wouldn't it? Lots of other drugs have these sorts of properties, and they're not even illegal — it's not actually a problem for someone to take some Valium before they go through. There have been cases of people who use Valium regularly to calm themselves down and make themselves less suspicious in other contexts; this is one where they could definitely use it. So, one question you're probably having: how can this much collection of information be legal? There's this thing called the Fourth Amendment — no unreasonable search and seizure. Normally they need to be able to say: oh, I saw blood coming out of that guy's trunk.
We found this paper, the Gill paper, by Lindsay Gill, who was a law student at BU at the time and wrote about this entire program; a lot of the analysis we did appears there, in rather better language. The Fourth Amendment actually has a few exceptions. One is called the plain view doctrine, which covers things you can plainly see. Another is the administrative search exception, which applies to everyone. The plain view doctrine basically covers everything you can smell, see, or touch without any special equipment — anything anyone could do, given the training; even sniffer dogs count this way. However, most people, unlike us, don't have thermal cameras; using one is thus still, technically, like breaking into someone's house without a warrant. There was an interesting case in California where the cops made a huge drug bust of a giant pot farm, and the way they found it was by getting very expensive infrared cameras and seeing the heat of the hydroponics through the concrete. The case went to court, and the court threw everything out, because: this is not cool — these guys couldn't have bought that camera on the consumer market. So you'd think TSA searches would also be illegal, right? Not everyone can buy metal detectors, and not everyone can buy giant X-ray machines. But there's this administrative search exception, which was originally invented, frankly, for searching people going into courthouses. It pretty much says that if you know about the search beforehand and it's not too, too bad, it's all right. Let me explain. Imagine you're a bunny, and you go into a park. Yeah, it's nice. And then this guy just runs at you: Stop! You terrorist! I'm going to search you. That's not cool. That's not a thing you're allowed to do.
But if you stick a metal detector at the front of the park and say all bunnies must remove their shoes before they go in, because otherwise the park may blow up — that's all right. That's perfectly fine. The thing is, you start getting into sketchy territory as the searches get more and more intrusive. And it gets a little difficult here, because you're collecting lots and lots of biometric information, and people care about that. People care about their health privacy; there are fairly strong privacy laws here, and this thing is essentially a medical exam. We're not even sure exactly how they're going to store the data they collect off you. Various spokespeople have claimed they're going to erase all the data, but that can't be entirely true: obviously, if you get arrested, they're going to keep your data, and at the very beginning, while they're ramping up the project, they'll have to collect a huge amount of data to make sure it works reliably on most people. And yeah, that's a little iffy. So this is a privacy issue, right? Pretty serious. But on the other hand, it could also be a solution to our health care problems. Because, you see, it's essentially a medical exam every time you go through the airport. That's not too bad. You go through, HAL scans you, and he's like: hey, you have cancer — go see this doctor. That'd be pretty nice, I think. I wouldn't mind getting a throat checkup every time I went through. And if you're pushing this thing to incredible reliability, at that point you're probably doing enough analysis to be able to catch things like that. So, in the end, in our incredibly unprofessional opinion, we think they could probably push FAST through.
Because most people are entirely okay with giving up security for, oh, sorry, giving up privacy for at least a semblance of security. But there are lots of interesting and fairly serious potential legal challenges that anyone with a crack team of lawyers could put up against the implementation of FAST. So, there's another interesting thing we found while we were looking, which is that the Israelis are doing the same thing, but privately. There's a company called Suspect Detection Systems, or SDS, which is essentially made entirely of ex-Mossad agents. And they're making this booth, which is called Cogito. It's like FAST, but smaller and more portable. You go in, it asks you some questions through some headphones, it listens to your voice, and your hand is in a scanner the whole time taking your pulse and temperature. It's entirely automated. It's another product trying to do the same thing. They claim on their website that they have a 4% false positive rate; we've seen everything between 4 and 8. And we still haven't seen any actual, you know, peer-reviewed studies for any of this, which is kind of unfortunate, because, god damn it, guys, can't you please share your awesome research with us all? This whole thing, all this reading people's minds by biometrics and all this incredibly interesting analysis, could very well be valuable research to the scientific community. I mean, it's interesting. I'm interested in having a scanner that tells what people think. That's pretty cool. I'm interested in how the heck they pull that off. And I want to know that the research behind it is reliable before we stick one into every airport. And yet there really hasn't been very much published about all this. Kind of unfortunate. It would be nice if the guys from Draper published all this.
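A quick aside on that claimed 4% false positive rate: when the thing you're screening for is vanishingly rare, even a low false positive rate means almost everyone flagged is innocent. Here's a back-of-the-envelope sketch; the passenger count, attacker count, and detection rate are our own illustrative guesses, not numbers from DHS or SDS:

```python
# Base-rate sanity check for a mal-intent screener.
# Only the 4% false positive figure comes from SDS's own claim;
# everything else below is an assumption for illustration.

annual_passengers = 700_000_000   # rough US enplanements per year (assumption)
true_attackers = 10               # generous guess at real mal-intent cases
false_positive_rate = 0.04        # SDS's claimed figure (low end of 4-8%)
true_positive_rate = 0.70         # hypothetical detection rate

false_alarms = (annual_passengers - true_attackers) * false_positive_rate
true_hits = true_attackers * true_positive_rate

# Probability that a flagged person actually has mal intent:
precision = true_hits / (true_hits + false_alarms)

print(f"{false_alarms:,.0f} innocent people flagged per year")
print(f"P(mal intent | flagged) = {precision:.2e}")
```

Under these assumptions you get on the order of 28 million innocent people flagged a year, and the chance that any given flagged person is actually dangerous is roughly one in four million. This is exactly the kind of arithmetic a published study would have to confront.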
They could publish just a paper or two saying: hey, we ran this study, this is how we ran it; this is not exactly how our algorithms work, but these are at least the things we look for. Then other people could test it out and see, hey, yes, indeed, this program works. And that would be fine, because at the moment, if you're an outsider doing research on all this, it seems a little sketchy. That's a problem. So, FAST. It was fun doing research on everything. It took a while. It was an exciting time. We discovered a lot of stuff in the end. Honestly, we just really want to know what the heck they're doing with this thing, because they're not being very public about it. And before we ended, we wanted to end the presentation on a lighter note, because there was this trailer they had for the FAST program. Not a trailer like the FAST trailer itself, the mobile screening unit, but a literal, almost movie-style trailer. And it was a little weird, because, well, how do I say this? We know they're doing this at Draper, but it also seemed like they might be doing it at another research laboratory that we follow very closely. So here it is. [Trailer audio:] "Inside the screening mobile module, a suite of real-time non-invasive sensors measures the individual's physiological indications of malintent, the intent or desire to cause harm. Fact: the key to any successful cooperative test is trust. And as our data clearly shows, humans cannot be trusted. The track on the walls, ceiling and floors provides a secure and easily reconfigurable means of installing equipment throughout the screening unit. I give you panels, the planks of tomorrow. Only configure... The unique low-floor trailer design ensures accessibility for all." They're equal opportunity. So now, pay attention to this part, you might find this one interesting: "It can be wirelessly networked to a single command module."
That might be an interesting thing to look into. "Then we box them up and ship them straight to your doorstep, so you can protect the things that matter most. Good night. Hey, Johnson, we're done here." Thank you. What? Oh, wonderful. So our demo is actually going to work now, we think. This is going to be cool. We'd like three volunteers, one of whom is willing to run around the room one or two times. No one? Yes, sir. Yes, sir. Yes, ma'am. Come up. So let me find, Josh, where do you have the water bottle? So what I'm going to do is give one of them an object. We were going to give them a water bottle and pretend it's nitroglycerin, but unfortunately we can't find it. So I'm going to give one of them a pen. Pretend this pen is dangerous. All right, this pen is a dangerous weapon; I can stab your eye out with it. What? So first of all, I'd like one of you, whichever one is willing, to just run around a few times. All right, yeah, sure. Just run around a few times. That's all right. This is fine, guys. Now these two are going to come up, and I'm going to give one of them this pen. None of you guys are going to know which one it is, but they will. And then we're going to film them, and I'm going to be the TSA agent asking them questions. That infrared camera, which apparently is going to work, is going to show you heat maps of their faces. You're going to see just how little difference there is between the one who's lying about having the pen and the one who isn't. So can you guys come over here? We're going to do this behind the table, because the table is black and you can't see through black. Well, I can with my camera. I was asking you to run a few laps, not one. I mean, it's healthy. Come on, guys. This is DEF CON; we're not really being healthy at the moment. Health is good. So one of these guys, or girls, has the pen. Sorry, did I ask you to come up?
I only needed two volunteers, not three. I apologize, I may have made myself a little unclear. I'm sorry, I feel really bad. All right, you can sit down too. One of these volunteers has a pen; the other one does not. That's more realistic, because there are only a few terrorists. So can you guys all come up? Actually, can you guys go somewhere over there in front of the screen? Yeah. Go up there. Is it? No? Come on, Josh. Seriously? I hate technology. What's up? All right, I'm going to distract you guys for a bit and tell you some fun things I found out about FAST while this whole thing was going on. Let's see. So, hmm. There were definitely... Jesus, Josh, what are you doing? Oh, really. All right. So there are a few things that were amusing. Like the fact that Fox News was one of the outlets that did the best coverage of FAST. It was really weird. They were one of the few that mentioned the actual field study. They're one of the few that... dude, I don't have the software for this. Fine. I blame everything on him. All right. What's up? I'm going to keep talking. So Fox News did this really awesome coverage. That was where I think all the copies of the trailer got up online, and that was where you had the study. Wow, this is interesting. Holy crap, it goes around the whole thing. That's really cool. I like this. So what else was interesting? I think there were at least 20 videos on YouTube comparing it to Minority Report, but we didn't realize that until we had the idea ourselves. Very original with that. Also, yeah, I told you. All right. So another thing that was interesting.
Fox asked us to pay them $1,000 a minute of trailer to show you guys the trailer. I mean, it was probably fair use, but we were like, screw it. Fox is much easier to contact for copyright claims than Valve is, because Valve doesn't respond to anything. With Fox, you just call up News Corp's communications guy and he calls you back in five minutes saying, yeah, sure, send an email to these guys. You send an email to them and they're like, yeah, sure, pay us $1,000 a minute; no, you can't use it, even though you're not making any money. What else was there? I was going to use a camera I borrowed from MIT, but it was from, like, 1999, and that turned out to be a problem. It was also much bigger than this one. It was like this big. What's up? Oh, yeah. So this is intense hacking right here. If this doesn't work, anyone who wants to can come see us in the Q&A room; we'll show you guys the camera, at least you'll see how cool it is, and we'll actually do a demo with a screen this big. That's the big problem with screens: they're just not big enough. Oh, man. Keep talking. All right. Has anyone thought of anything else? Oh, yeah, questions. Why should you talk to them? Well, I mean, you don't have to talk to them. This thing doesn't have to talk to you either; it just scans you and says you're evil. Are you really actually not legally allowed to talk to TSA? That's really cool. Not required, I mean. Yes, not required. That is a difference indeed. Yeah. So that's the whole thing: the administrative search exception pretty much means they're allowed to bar you from doing things, and the government can force you to do things if you know about it ahead of time, even if it would otherwise violate things like the Fourth Amendment, as long as the searches aren't too horrible. Anyone else? Yes. Oh, man. Actually, I don't think they'd be very... oh, it depends on how big the beard is. Like, if it covers this part, if it covers your cheeks, at that point I think you'd need to shave.
It might be pretty effective if it's just... yes? Can they go sit down? Hey Josh, can they go sit down? You guys can go sit down, yeah, as long as you come to the speaker room afterwards so we can show some people pretty pictures of infrared things. All right. Is there one more question that I can answer here? Yes. Die. All right. Thank you. Thank you, everybody.
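A footnote on the demo, for anyone following along at home: the "heat map" an infrared camera draws is just each pixel's temperature reading normalized into the frame's own range and mapped to a color ramp, which is why a degree or two of difference between the lying volunteer and the honest one barely registers. Here's a minimal sketch of that rendering step in NumPy; it's our own illustration of the general idea, not the actual camera's processing:

```python
import numpy as np

def to_heatmap(frame_celsius: np.ndarray) -> np.ndarray:
    """Map a 2-D array of temperatures to an 8-bit red/blue image.

    Hottest pixel in the frame becomes pure red, coldest pure blue;
    everything else is interpolated between them.
    """
    t_min, t_max = frame_celsius.min(), frame_celsius.max()
    norm = (frame_celsius - t_min) / max(t_max - t_min, 1e-9)  # 0.0 .. 1.0
    img = np.zeros((*frame_celsius.shape, 3), dtype=np.uint8)
    img[..., 0] = (norm * 255).astype(np.uint8)          # red channel = hot
    img[..., 2] = ((1.0 - norm) * 255).astype(np.uint8)  # blue channel = cold
    return img

# Fake "face" frame: 33 C skin with one 36 C hot spot.
frame = np.full((4, 4), 33.0)
frame[1, 1] = 36.0
heat = to_heatmap(frame)
```

Note that because the scale is relative to whatever is in frame, a one-degree stress response shifts a pixel's color only a fraction of the ramp, and the whole reading changes if the subject just ran a few laps, which is exactly what the demo was meant to show.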