 What's the magic word this morning? It's Steve Larson. What's the magic question? Is it safe? Is it safe? So Steve teaches computer science in Pennsylvania, and a lot of things on hacking and cybersecurity and the like, and he's here for a few weeks, and we catch up with him, and we love to learn what he's been thinking about and teaching about, because this is a very important topic. It's not quite as existential as climate change, but it's right up there, right up there, and, you know, it's like Mueller said, we'd better attend to it. Is it safe? No. Thanks for having me on the show. You know, we have an election coming up, and everybody in the world knows that the Russians, and for that matter, copying the Russians, Trump has his own organization which engages in this. I don't have to tell you that Parscale, who was the guy who was the social media manager in the last election, is now his campaign manager overall, so we know what kind of campaign he's going to be running. It's totally digital. How should we be worried? What are the things we should be worried about? We should be very worried if you're concerned with keeping democracy in America. The thing we should be worried about is how they're going about influencing the voters. If you recall, Trinidad and Tobago had some elections recently, and they used a lot of data and data mining to influence the voters. I'm not sure how familiar you are with that. But wasn't that a Cambridge Analytica job? It was a Cambridge Analytica job, yes, and Cambridge Analytica was very happy to report that it was their job. In fact, they bragged at the time that they were influencing about 10 presidential or prime ministerial elections every year. And it's not that they came out and said, politically you should vote for this guy or you should vote for that guy. 
In Trinidad and Tobago there are basically two cultural groups, the Afro-Trinidadians and the Indo-Trinidadians, and what they did was help start a movement called "Do So!" I don't speak the language, but what I understood was that it meant don't vote, and they put that out there, and all of the young people who were just new voters were persuaded not to vote. However, the Indo-Trinidadian youth followed their parents more than they followed social media, and even though social media was saying don't vote, their parents were saying do vote, so among the young people, more Indian voters voted than Afro-Trinidadians. Which was the desired result, and they elected who they wanted to elect. And so in effect Cambridge Analytica controlled that election. Exactly. I remember that was in the movie The Great Hack, and a fellow named Dave Carroll, I don't know if you know him, but he's in your kind of business, somewhere on the East Coast, he teaches this stuff. And then there was a woman who I thought was really interesting, a whistleblower from Cambridge Analytica, relatively young, early 30s maybe. Her name was Brittany Kaiser. And this movie, which is on Netflix right now, is a documentary and tells you all about how Cambridge did it. What's interesting, what you and I were talking about before the show, is: okay, so Cambridge Analytica, under pressure, goes bankrupt and they disappear themselves. Every table and chair, every computer, every algorithm, every disk, it's all gone missing. Nobody knows where it is. Correct. What happened to that? What happened to the people? The people are now working for the company that took its place. You know, that company just didn't disappear into thin air. Everybody still has those skills and talents, and somebody wants to use those. Somebody else hired them, just under a different name. Yeah. And they did Brexit too. Yes. Which, you know, that's more than the Caribbean for sure. Oh yeah. 
And Brexit has had, and will have, and is having a huge effect on Europe, and on Britain for that matter, and they influenced it big time. Oh yeah. For example, an investigative reporter went to Wales, where she was from, and she was talking to people, and they were saying, oh no, we don't want to be in the EU because of immigration problems and a bunch of different problems, and she never saw that on any public media. It was all on social media. And she went there and she looked at the demographics, and that area had the lowest incidence of immigration, of foreigners coming in. They were worried about people from Turkey coming over, and they had the lowest rate of immigration from Turkey in the whole European Union. But yet the people following social media were influenced to think a certain way. And she talked to a couple of the young voters, and they said, oh, we want to get out of the European Union, because what has it done for us? What benefit have we had? And she was showing pictures: well, there's this university that was started because the coal and the auto industry went away. The university was started because the European Union funded it. And a bunch of different organizations that were used to better the lives of the people were funded by the European Union. Yet social media influenced them to think the European Union hasn't done anything for them. Oh, it's deception. It's disinformation, misinformation. Yes. And it sounds right out of the Russian playbook in 2016, with the Internet Research Agency in Moscow, which spends a lot of time doing it. You know, one thing that strikes me is that Cambridge Analytica... We're going to take a short break. We'll be right back with Steve Larson. Aloha. I'm Marcia Joyner, inviting you to join us on Wednesdays at one o'clock for Cannabis Chronicles, the 10,000-year odyssey, where we take a look at cannabis as food, cannabis as medicine, cannabis and religion, cannabis and the ohana. 
So please join us to learn all about cannabis. Again, Wednesdays at one o'clock. Thank you. Aloha. I'm Stan Osterman, Stan the Energy Man, every Friday here on Think Tech Hawaii. If you're really interested in finding out what's going on in energy, especially here in Hawaii, but also all the way around the world, and especially if it has to do with hydrogen, look into Stan the Energy Man, every Friday, 12 o'clock, Think Tech Hawaii. Aloha. Okay, we're back. We're live with cybersecurity expert Steve Larson, visiting us from Pennsylvania, where he teaches at Slippery Rock. Anyway, so Steve, you were talking during the break about this investigative journalist who sort of opened our eyes a little bit about the motivations. Can you repeat that for me? Yeah, so she has a couple of TED talks. And in those talks, she's talking about how we are all worried about our privacy. And I understand why, because, you know, all of your data is out there. But she was saying that the corporations do the data mining on all the data that we share on all the social media platforms, on Google searches, and in our email. Those are all mined, and the corporations are using it as a form of power. They are influencing elections, like we said, but they're also using their power to influence the making of laws, and other things like that. So we should be very afraid that these corporations are using this data. Now, you remember the book 1984 by George Orwell? Absolutely. Did you follow what they wanted you to do? Were you worried that Big Brother's watching? Yeah, well, they are. Quite literally now, but it's not the Big Brother we thought it was in the book, the government. Now, it's all the corporations. But we are the ones giving them the data. George Orwell said, you know, everybody was afraid that Big Brother was watching. Nowadays, people are afraid that nobody is watching them. For example, teenage kids, they're all over Instagram and Snapchat. 
And they want more and more followers, because they want to be seen; they become famous and stuff like that. Oh yeah, they want to have their own YouTube channel. But we are giving them information that they are using against us. Like information warfare, they're using that data to influence us. For example, when I was active in the Boy Scouts about five years ago, a lot of my emails dealt with Boy Scouts and scouting and camping and hiking, and a lot of the pop-up ads that I used to get were all about scouting, or all about camping information, or stuff like that. So they're mining this. Now, a human being might not be mining it, but they have algorithms that are mining it to find out what I should be fed, so to speak. So if you look at it, in that case, they're tracking everything we do on the internet, every click. Go to CNN.com: the last time I checked, which was about six months ago, there were 38 non-CNN companies tracking your activity on CNN.com. And they are tracking what links you're clicking on there, how much time you spend reading each article, what type of articles. So if you click on mostly Democrat-leaning articles, they know you're a Democrat. Conversely, they know whether you're a Republican, and then they will have pop-up ads, or they will say things like, and this is what Netflix is famous for: based on your preferences and what you've done in the past, here's what we think you would like to watch. So they're trying to influence what kind of information you are consuming, based on what you've already consumed. Yeah, and, you know, so you get a little note from, I don't know, Netflix or CNN that says, Jay, how would you like to see a documentary on this subject, on Trump? Well, they know I'm interested in Trump. And I say to myself, oh, that's good, they know I'm interested in Trump. But I don't think we realize how nefarious this is. 
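The "based on your preferences, here's what we think you would like" mechanism described here can be illustrated with a toy sketch. This is my own minimal example, not Netflix's or any real platform's system; the articles, topic tags, and scoring rule are invented for illustration: build a profile from what a user clicked, then rank unseen items by topic overlap.

```python
# Toy content-based recommender: feed users more of what they already consume.
from collections import Counter

def build_profile(clicked_articles):
    """Count how often each topic tag appears in the user's click history."""
    profile = Counter()
    for article in clicked_articles:
        profile.update(article["topics"])
    return profile

def recommend(profile, candidates, k=2):
    """Rank candidate articles by how well their tags match the profile."""
    def score(article):
        return sum(profile[t] for t in article["topics"])
    return sorted(candidates, key=score, reverse=True)[:k]

clicks = [
    {"title": "Primary results", "topics": ["politics", "election"]},
    {"title": "Debate recap",    "topics": ["politics", "tv"]},
]
candidates = [
    {"title": "Poll analysis", "topics": ["politics", "election"]},
    {"title": "Gadget review", "topics": ["tech"]},
    {"title": "Campaign ads",  "topics": ["politics"]},
]
profile = build_profile(clicks)
for a in recommend(profile, candidates):
    print(a["title"])  # the political articles outrank everything else
```

Even this crude version shows the feedback loop the conversation warns about: the more political articles you click, the more political articles you are shown, and the tech story never surfaces.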
Because they know a lot more than my interest in some political aspect; they know everything about me. It sounds like there's a consolidation of data. It's not just the television or the cable. It's where I spend my money, what I read, where I go, who I talk to. And they've got a profile on me. And what struck me out of the Cambridge Analytica movie, The Great Hack, we should talk more about it, which I wanted to bounce off you, is: okay, so they got this data from Facebook, primarily, paid a lot of money for it. Somebody gave them the money to pay Facebook. That's a lot of money. Facebook got what, hundreds of millions out of that deal? Somebody had to write a check to Cambridge Analytica, given them that money. Okay, so now they have the data from Facebook and other places, consolidated data, and they create, or maybe they got it already, maybe somebody gave them, profiles on you and me, Steve, and they know where we go, what we do, what we think. Right. Now they want to affect us, change our thinking. So they use an algorithm to go through that data, you know, the metadata kind of analysis. And it's like terrorist kind of analysis: everybody we ever talked to, everyone who got our emails, my goodness. And then they take that and they put us in multiple categories. I'm guessing on this. And the categories are: okay, he's a Democrat, and he likes this, and he likes that, and we know his tastes, we know his inclinations. So we can formulate messages to him that will appeal to a person with those tastes and inclinations. Very accurate. It's more than: why don't you buy a widget, because we know you like widgets. It's: why don't you buy a bunch of ideas and change your thinking, because we know your soft points, we know what your vulnerabilities are. That's what it sounds like to me. Cambridge was bragging that they had 5,000 data points about each person. 
I don't know 5,000 data points about myself, you know. But they can track how fast you're typing in a web browser. Like when I turned on my web browser this morning, immediately it said, hey, you're in Waimalu, Hawaii, here's the temperature. So they know your location based on your IP address. Right. They know, like you said, just about everything about us. They know that I buy glasses. Of course, I get ads for glasses about once a year, when you renew your glasses. It wouldn't be so bad if they only knew your purchasing habits and your pants size. But they know so much more than that. They know how you think. And they don't tell us exactly how much they know. In this movie, I think it was Dave Carroll, a real person, who went to them and wanted to get the information they had on him. Right. He never got it. No, his problem was he didn't live in Europe. Have you heard of the GDPR, the General Data Protection Regulation in the EU? Yeah. Where, when you want the information, they have to give it to you. If you want it deleted, they have to delete it and prove that it's been deleted. Otherwise, they can get fined 4% of their receipts for the year. So we don't really know the full extent of those 5,000 points of data. No. And we don't know, I mean, it would be even more difficult to figure out, what kind of algorithms, because they've got really smart people, people maybe who trained in Russia, American kids too, smart, really savvy on computers, creative and innovative. And they are given the problem of trying to change our minds: what do you put out, and then how do you target us and our little profile? It doesn't have to be, you know, 5 million people. It could be a few hundred people. But it will have great effect, because these algorithms they write are so effective. And there's nothing to lose. Right. If they are unable to change our minds with one approach, so what? 
They switch and try another approach. Yes, that's right. You know, this is very scary. You're right, it's way worse than 1984. And let me tell you, it's partly our fault. And I say that because when you log into a website, or the first time you go to a website, you want to join social media or something like that, one of the options it gives you is: log in with Facebook, log in with Google, log in with, you know, whatever else it is, Pinterest or something like that. Once you log in with that, now those two sites are connected. And Facebook can get everything about you on this other site, and this site can get everything about you from Facebook. So we're enabling the connecting of the dots of all the 5,000 data points that make up our character. If we thought about it... You know, I'm always troubled when it says that you can log in three or four different ways through other portal sites. Well, the portal site is then watching what you do, every keystroke, everything. Exactly. It's not just a matter of exposing your passwords, it's a matter of exposing your whole life. Right. Oh, I'm very concerned about it, in the sense that our democracy really is based on a secret ballot. It's also based on a basic assumption that goes back to the Revolution: that the electorate is well informed. Right. The electorate has conversations, develops, you know, sound opinions. The electorate votes in accordance with its best judgment. Every man, and ultimately woman (they didn't allow women to vote until much later), but, you know, everybody who votes is a valid, contemplative, thoughtful voter. We don't really have that anymore. We lost that somewhere along the way. We're being fed misinformation, so we think we're making an informed decision when we vote, based on what we've been fed by the media. Yeah. And somebody can control us. Right. 
And so Facebook gives Cambridge Analytica, Facebook and others, I'm sure; to a moral certainty, it wasn't just Facebook. Oh no. It was that whole corporate world you described. Correct, correct. Cambridge Analytica, in a space that, from the photographs in the movie, wasn't that big, 2,000 square feet or so, got all this data, and they got these young, creative people working out algorithms. And the idea is they're going to change your mind. Yes. And they know you well enough to know what appeals to you. So, for example, I don't know if I have this right, but if you were on the left side of the political equation, they wanted to make you hate the right side. Easy enough: stir up the divisiveness, which we have plenty of in this country today. And it's not only the acts of the president, it's all that information that's thrown at us. And if you're on the right side, well, they'll help you hate the left side. Divisiveness. This is not good for democracy, because then you don't talk to each other and you don't make compromises and all that. And then there's the middle, and that's the most interesting part of all. That's where the algorithms come in. They're trying to influence you one way or the other. And these days, what's happened is they've influenced us to the right. And so the moderate middle becomes less moderate and more to the right, I mean, conservative. Yeah, they're more conservative, and ultra right. And so this is very troubling, because, A, it happened; we know now that it happened; I mean, Mueller didn't fool around, he found out. B, it's happening now. And C, it's going to happen in this coming election, so critical to the future of the country. What do we do? What do we do? There's a couple of things. You can try to keep your own data safe. That would skew the information that they send you. 
I would be wary of anything that says: based on your preferences, based on your past activity, here's the information we think you would like. Because that's them telling you what you should do, what you should read. You should decide for yourself what you should read, not based on their algorithm. If you're not going to think on your own, there's an algorithm that'll think for you. And that's what they're hoping will happen: we're so lazy, we like being spoon-fed. Our algorithm will tell you what you should be reading. And that way you're making the decision that they want you to make, based on the information they give you. But that's not the way we should be making our decisions. So what am I looking for? I mean, what are the telltale signs of an email, for example, or social media, that's trying to change my mind in an inappropriate way? What distinguishes it from other things? There's so much, hundreds of emails every day. There are tons of emails every day. I would not open any email where I didn't know who the sender was. And even so, you can spoof the sender. So, are you expecting an email about this or that, yes or no? If you're not expecting the email, then it would be suspicious to open that email. It may also put malware on your machine, which then, of course, increases the chance of you getting misinformation. Yeah, right. The malware, they can sort of bounce it off your own machine that way. Exactly. And they can also use your machine to send it along to others. So you become part of their network, right? Well, okay, you know, if you get an email, there was some of this in the movie, if you get an email that's bashing Hillary Clinton, saying things about her that you haven't heard before, right, then you really have to question that, even if it seems to come from a valid source. How do you trust that? Right? I mean, I like to call it digital mudslinging. 
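One concrete check for the sender spoofing mentioned above can be sketched with Python's standard library. This is my own simple heuristic, not a full SPF/DKIM validation: it flags a message whose visible From domain differs from the domain in its Return-Path header, one common red flag. The example message and domains are invented.

```python
# Flag mail whose claimed sender domain doesn't match where it really came from.
from email import message_from_string
from email.utils import parseaddr

def domain_of(addr):
    """Extract the lowercase domain part of an address header value."""
    return parseaddr(addr)[1].rpartition("@")[2].lower()

def looks_spoofed(raw_message):
    """True when the From domain and Return-Path domain disagree."""
    msg = message_from_string(raw_message)
    from_dom = domain_of(msg.get("From", ""))
    path_dom = domain_of(msg.get("Return-Path", ""))
    return bool(from_dom and path_dom) and from_dom != path_dom

raw = (
    "Return-Path: <bounce@sketchy-ads.example>\n"
    "From: Your Bank <alerts@bank.example>\n"
    "Subject: Urgent: verify your account\n"
    "\n"
    "Click here..."
)
print(looks_spoofed(raw))  # True: claims bank.example, sent via sketchy-ads.example
```

Real mail servers record the same kind of verdict in an Authentication-Results header, which is worth glancing at before trusting an unexpected message.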
In the past, the candidates would just basically say bad things about their opponent, whether it was true or not. Now it's all digital. We get it in the form of email, we get it on our websites, we get it in the pop-up ads, things like that. It's even on TV. Sure. They're just trying to get you to distrust the candidate that you would prefer and trust their candidate, the candidate they want. Right. Now, you know, there's a history of this sort of thing in American politics. Even in Hawaii, there was a thing called a hit piece. And the hit piece just, you know, really exaggerates, and it just spends all its time hitting on the adversary, the opponent. And I think, you know, in Hawaii, most people don't like it. Right. They see it for what it is. They know that it's not kosher. It's not pono. Right. And so you can see that one coming. I suggest to you, Steve, that going forward, these kids, you know, the graduates of Cambridge Analytica, wherever they may be in the world, could be anywhere, they're more sophisticated. They're more nuanced. They know we are looking, waiting, watching for them. So they're not going to be obvious. It's going to be very subtle. Very subtle. And you're not going to be able to know what's fake news and what's real news. Exactly. Exactly. So I want to make you Congress for a minute. Okay. If Congress, you know, became functional somehow, and became rational, concerned about the welfare of the country, concerned about the protection of democracy, which is in great jeopardy, in my opinion, and many people feel that way, what could, what should Congress do? You know, Mueller said, you guys have got to do something. Yeah, he's not going to do it. He's only going to suggest that we should do it. Oh, yeah. What should Congress do now? 
I would definitely encourage them to take a good look at the GDPR, the General Data Protection Regulation that the EU has, and implement something like that in the US. So we could actually take control of our own data and say, you know what, I don't want Google to know all this stuff. Google, have a way to verify that it's me requesting that my information be deleted, and then you've got to get rid of it. And there's a hefty fine if you don't. They've got to act. It's funny that now we have, you know, another issue, which has been emphasized only in the past week, about hate groups in the country, hate groups that live on the internet, that have special sites, special social media arrangements, to stir them up and make them hate more. This has got to be part of that same legislation, don't you think? How do we deal with them now? It's not only divisive, it's hate. It's the extreme of divisiveness. Right. And then they say, well, gun control has to be in place. Well, it's not just the hate groups going on online; they're becoming extreme in their violence offline, and that's where the guns come in. So what's the root cause? What is the root cause of all of this going on? Part of it, I think, is that the internet is not regulated. Anybody can say anything they want on the internet. We have to be very careful about what we consume. Well, yeah, that's a very interesting statement from a guy like you. And it would be from me too, because I have always believed in the internet as freedom, right, as global freedom of speech and expression. And yet, it has taken a turn. It has become negative. It's been weaponized. And where before I would have said, no, it's got to be free as a bird, right, I don't say that anymore. I think there must be control, because it's a balancing of considerations. Right. One of the considerations is hate. Another is inappropriate influence on voters. So we have to control the internet. That's what you're saying. Yeah, I would. 
I'm not sure I would want to control the internet. I would like to regulate it a little bit. Like electricity, it's becoming a public utility. Yes. Somebody said that; I'm sure, I think it was Warren who said that. I think the internet is a great source of information, but you need to validate that the information is correct, which means you have to do the work to go to the source of what the internet is telling you. I tell my students, you can write me any paper you want, but don't cite Wikipedia as your source. Instead of citing Wikipedia, go and look at the sources that Wikipedia used; go to the original source. That's what I want to happen: I want to be able to go on the internet and get to the original source, no matter what, so I can verify that the information is correct. But doesn't it require nonprofits and sort of NGOs, cause organizations in the country, who combat those corporations you talked about, and government, and, you know, hate groups that are weaponizing the internet? It needs a countervailing force in the conversation. Does that force exist now? Or do we want to encourage that force to grow organically? I think we need to encourage that force to grow organically. Who is it going to be done by? I don't know, somebody with very deep pockets, though; it's going to take a lot of money to combat what's going on. That would be, you know, a great result, if you had a cohesive, organized force. On the other hand, I would say that part of the negative contribution to the internet is the power to divide the opposition. So if there was, you know, a major effort to develop sort of self-regulation, or public regulation if you will, of the internet, there would be an attack on it. There would be attempts by those who believe in dealing with your opponents by dividing your opponents, as Russia does, not only in this country but wherever it goes. So that group would have to be very determined. You think it should come from academia? 
Speaking as an academic, no. Speaking as somebody who was in industry for 15 years before joining academia, you could possibly do that. Academics are kind of like Mueller: you should do something, but they will study the problem. Coming up with a solution is kind of tough in academia. It is a good place to start, though, because that's where the young people are, and we have their ideas. And they're the future of the country. So I would like to get their input on it. And they're actually quite smart. And they know about this. They know all about it. They know more about it than a lot of older people. Yeah, me, for example. They're on the internet all the time. You're also, you know, suggesting that the government is not the right one to do this, that regulation by the government... I mean, we saw the questions that those congressional committees were asking Zuckerberg, showing that they knew very little about it. Right, they're not asking the correct questions. We can't trust the government. We don't want the government to regulate it. No, we don't. In a true democracy, I believe the government would not regulate it. They can encourage, and they can set up fines for companies that break the law. But currently, there aren't very many laws or regulations in place. You know, if we could get the GDPR, or something like it, here, with hefty fines. But, you know, Facebook got fined $5 billion; for them, that's like taking 20 bucks out of my pocket. It means inconvenience for a week, but then it's okay. I started out today by asking you whether it was safe, and you said it wasn't safe. No. I think we totally agree on that. But now I want to ask you the closing question. All the things we talked about, all the things that we touched on: are you optimistic? Are you pessimistic? Is this going to work? Or are we going to be in an existential crisis? Frankly, I think we're going to be in a crisis. We're not moving fast enough to combat what's going on. 
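The GDPR-style verify-then-delete obligation the conversation keeps returning to could be sketched as follows. This is hypothetical: the class and method names are invented, and a real controller would verify identity out of band (for example, a token emailed to the account owner) and erase data from backups as well.

```python
# Sketch of a verify-then-delete flow: issue a token only the real user can
# echo back, and on a match, erase the records and return a receipt as proof.
import hashlib
import time

class UserStore:
    def __init__(self):
        self.records = {}  # user_id -> personal data
        self.tokens = {}   # user_id -> pending verification token

    def request_deletion(self, user_id):
        """Step 1: issue a token the real user must echo back (e.g. via email)."""
        token = hashlib.sha256(f"{user_id}:{time.time()}".encode()).hexdigest()[:12]
        self.tokens[user_id] = token
        return token

    def confirm_deletion(self, user_id, token):
        """Step 2: on a matching token, erase the data and issue a receipt."""
        if self.tokens.get(user_id) != token:
            return None  # verification failed; nothing deleted
        self.records.pop(user_id, None)
        self.tokens.pop(user_id, None)
        return {"user_id": user_id, "deleted": user_id not in self.records}

store = UserStore()
store.records["jay"] = {"clicks": 5000}
receipt = store.confirm_deletion("jay", store.request_deletion("jay"))
print(receipt)  # {'user_id': 'jay', 'deleted': True}
```

The receipt stands in for the "prove that it's been deleted" requirement; the deterrent in the actual regulation is the fine, up to 4% of annual worldwide turnover, not the receipt itself.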
People are still sharing too much on the internet. You should only share what you believe would be okay to make public. I mean, you don't put your bank account information on the back of a postcard and send it through the mail. But people will do that in email. Email is not any safer than the postal mail. So I think we're going to be hit with an existential crisis before we get this thing figured out and resolved. Unfortunately, we're all in it, and we have to live with it, and we'd better attend to it. Right. Thank you, Steve, Steve Larson. Great to talk to you again. Really enjoyed that. Thank you so much.