is Think Tech Hawaii. Community matters here. Welcome to the Cyber Underground. I'm Dave Stevens, your host. I'm the Cyber Guy, and today we're going to change it up just a little bit. We're still talking about cybersecurity, and we're also going to talk about hacktivists and how they actually get made. Now, that ballistic missile alert we got that turned out to be a false alarm could have changed our psyche a little bit, and it occurred to me this has been happening in history over and over. To discuss this with me, Tom Moore, adjunct faculty at Kapiʻolani Community College. Hi, Tom. Welcome to the show. Thank you so much for coming. How are you, sir? I'm in pretty good shape for the shape I'm in. I'm so happy to be here. Well, I love having you on the show. Tell us a little bit about yourself. Just give us a... Well, for the purposes of today, I'm adjunct faculty at Kapiʻolani Community College, just like you. I just teach on a different floor in a different building. It's basically the same audience. So, you're a local boy, depending on how far back you go. I sort of grew up in Hilo, and I've been living on Oahu for 30 years or so. Just hanging out? Yeah, just waiting for this moment. This is my big break. I'm expecting everything to be different after today. Everything will be different after today. Okay. Well, let's talk about the short-term mentality that's developed in our culture. And this has happened in other cultures in the past. And you had a great note in here. If you could read us right here, this one that you came up with. I love this. Oh, that one? Yes, that one. Let's just eat the seeds. Why bother to plant them? Right. Since I have no future, there's no future. Yeah, right. Exactly. Yeah. So, in a lot of cultures this has come up before. We've seen it in the Greeks. We've seen it in the Romans, where it's not only the stress of living, but also when, say, Rome and Greece started out, they were highly industrious societies.
And that's why they flourished, especially Rome. And they had what's called deferred gratification. So, their investment in time and effort would pay off at a later time. And we see that in agrarian communities. When we went from the hunter-gatherer to the agricultural communities, we had to plant a seed. And then we've got to sit back and wait and nurture the seed, because the benefit comes months later. And then you get to harvest. And then there's some gratification. So, run through this with deferred gratification versus its opposite a little bit. Well, the opposite's pretty simple. Instant gratification, or the ever-popular extra-instant gratification. Now, describe that, extra-instant. I just made it up. I couldn't possibly describe it. But the trend is people are demanding. And when I say people, I mean all kinds of people, infants even. They've done studies that itty-bitty babies, given a choice between a cookie now and two cookies later, prefer a cookie now. And it's scary, because all kinds of good things come from deferred gratification. Success comes from deferred gratification. Civilization comes from deferred gratification. Tools and investment come from deferred gratification. And absent a sense of, a value for, deferred gratification, and given the growth, the decline, if you will, into our want, our perceived need, for instant gratification, we're seeing all kinds of unpleasant consequences. It's scary. Let's talk about the unpleasant consequences now. So, we talked about how Rome and Greece came to power because they practiced deferred gratification. They studied mathematics and engineering and agriculture and waste management and water management. And they came up with irrigation techniques that we still use today, and the bridge building and the arches that we still use today. The aqueducts in Rome are still standing. It's amazing architecture.
So, at the end of those societies, though, when they started to decline, the resources became so bountiful, so plentiful, that they actually had a class of citizenry that no longer had to be part of the industriousness of the society. They could just benefit as consumers. They were the patricians, of course, in Rome. And we have that class of society now here in our culture. And I would say it's almost everybody. Pretty much. There's hardly anybody that gets into the nitty-gritty and works, you know, 12 hours a day building a bridge or mining coal. And everybody else just gets to walk into their house and turn on the light and, hey, you have light. Turn on the heater. You can heat your house. Turn on the air conditioner, massively, tremendously important out here in the islands. You know, I have it in my car. If I didn't have air conditioning in my car, it's 1960 again and I'm suffering. People don't like suffering. No, we don't like suffering. They don't think they can endure it, really. Right. Which is a little, a little tragic. There is a great deal of tragedy in that statement, I think, because sometimes suffering builds the character that could improve you in the long run. Right. I would say that almost all suffering does improve you somehow in the long run, because it makes you tougher, makes you a better problem solver, and it challenges you to be, I think, a better thinker. All that's true, and perhaps another consequence, or advantage, is that it helps you discern the difference between what you want and what you need. That is a great one. I love that. Yeah, let's connect this now to why we're doing this on the Cyber Underground. I see right now in the cybersecurity field, there's a tremendous number of patches and fixes coming out, Spectre and Meltdown are some of them, and these have to do with engineers not thinking about security risks; they're thinking only about whether it works.
You and I have worked in industry for decades, between us half a century probably, and many times when I was working in the industry, say I was a programmer, I was making an application, the owners of the business would tell me, hey, just make it work, we've got to sell this thing right now. Hurry up. Yeah, hurry up. And I'd say, well, it's working, we did a good demo, but I've got to go back and clean it up. I've got to make it secure, I've got to make it stable, I've got to make it extensible, and they'd say, no, no, we're done. Do that later. And it never came. No, there is no later. There is no later. There's never a later. So we end up with unsecure, unstable scaffolding that could fall down at, you know, the littlest shake of the earth. And we have tons of that software right now, and that's a short-term mentality. So let's talk about the United States. How did we come to this point? Well, maybe we should go back to the beginning of the United States versus now. Basically, you know, we both, as it happens, at least for the moment, work in a bureaucracy. And I've heard that. Did you not get the memo? I did. It must not have made it to my office. Yes. Well, you know, it's interesting. There's never enough time to do it right, but there's always enough time to do it over. Always. And so what you had alluded to is, you know, with your management, hey, it looks finished. It works in one case under one limited set of circumstances. Right. But as a system thinker, you know, as a professional technologist, you want it to work as much as possible in all cases. And, you know, perhaps that's not economically feasible, all cases. But one case isn't enough. Now, here's a perfect example of this. This just occurred: Intel did release some patches for Spectre. Or, I'm sorry, they're mitigating Meltdown.
But they initiated patches, and that patch, when people implemented it, caused computers to randomly reboot. Yes. Well, all they needed was to get a patch out the door. They needed a press release. They didn't need a solution. Yeah, that's a good point. Yeah, they're looking for some PR. Yeah. Exactly. And now their statement is, we'll have a patch out at the end of this year. Which, if nobody's really paying attention, it's only January. Yeah. That's quite a while away. Well, there's your deferred gratification. Well, maybe they'll make it right. But in the meantime, we're all unsecure. This is a bad problem. We are having to mitigate this problem by adjusting other software applications that run on your computer so they are not susceptible to these attacks. Spectre, Meltdown... Spectre variant 1 deals with JavaScript and a shared array buffer. And so you have to upgrade your browser or change settings in the browser to compensate for this. And people love doing that. Oh, people love doing that. Except when it slows down the computer. Oh, by 30% or so. Yeah. And so the older processors are really struggling with this. And it all comes down to people wanting to make things faster for more instant gratification. For instance, in the browser, when they used a shared array, which is a storage area for anything you want to put in memory, that was shared across applications. And sometimes the application using it wouldn't go in and check the size of the array before it started jamming stuff in there. Jamming stuff in there without checking it first, of course, doubles the speed. So, more instant gratification, you can see your browser's faster. But now with this new flaw, if you don't check that first, another application can share that and speculate what's in it by some random method. So that's what Spectre's all about. It's speculation about what's in that shared array buffer.
Now, when you configure that not to work, or upgrade your browser, in Chrome, Firefox, and so forth, it's slowing down your browser. Now, for faster processors, you don't feel a thing. So if you're out there buying new equipment all the time, the instant gratification of better stuff, you're not going to feel the impact. But people who are practicing deferred gratification, you know, my Mac, I want it around for six, seven, eight years, as long as I can get the new OS, my Mac is slowing down. The longer you can postpone your new purchase, the more bang for your buck you get when you replace it. And some people are tied into legacy applications that they can't, you know, they're stuck with. You know, their organization doesn't want to replace it, is unwilling and/or unable to replace it, State Tax Department. And they won't retool it to work with newer systems. And I know there are organizations that have to use, say, Internet Explorer 8, because their web application was written for IE 8 and no other browser; it uses features only in IE 8, and even IE 9 would break it. The national health provider in London, in England, they're stuck with XP because their apps only work in that environment and they didn't know that they could virtualize it. And so, you know, their entire national health care system got ransomware. And this is kind of strange and sad at the same time, because you know that's not fixed. Oh, hell no. Yeah, you can't go through the entire NHS and say, oh, by the way, you've got to upgrade everything. Yeah. And rewrite your entire application. And could you have that done by this afternoon? Yeah. Yeah, it's not done. I do not think so. That is probably one of the biggest projects they're ever going to have to face, and it's a budget buster, which, it comes down to money.
And let's talk about the management aspect of building this kind of stuff, and some of the technology companies that pioneered this activity, putting a piece of software out there and letting the consumer deal with it, but making money in the short term. So that company's share price goes up and all the executives get a big bonus. They save a lot of money if the customer becomes the beta tester. That's right. Or in the worst case, the alpha tester. Yeah, let's describe this. So when we're releasing a product, there's usually a very small batch of testers, the alpha testers, and they test the very loose edges of something that's nowhere near complete. But they map out the tremendously horrible bugs that might be there. Then they go to a beta test after they've mapped out those big bugs, and the beta testers are a little larger group, but they tend to find a lot more of the nitty-gritty bugs, and they test more scenarios. Especially gamers now. This is a big one, beta testing games. That would drive me nuts, but some people make a career out of it. I like that. And a good living. And a good living, I might add. So, then we go to your first release, and that release candidate is what gets pushed out to all the people who are on the early release list. And again, you're using consumers to test your product. So you've gone from the engineers to the consumers in four steps, and then you do the real release. Now, Microsoft's gone to this, but they didn't implement that until about the mid-90s. And I remember the first Windows that kept coming out and coming out and coming out, and it was a tragedy. But they made billions upon billions of dollars. Well, there's this economic concept of internalizing externalities. It used to be that the manufacturers paid for their mistakes, because they provided support, free support.
And when they paid for their mistakes, then they had some kind of an incentive to release a quality product. But the support model and the funding for support are completely different now. Yeah, completely different. Yeah, it's virtually non-existent. Support is virtually non-existent. Well, it puts it back on the customer. Exactly. You've got to pay for it now. Exactly. Well, we're going to come back and talk about that. We're taking a one-minute break. We'll be back. We've got to pay some bills. Until then, stay safe. Welcome back to the Cyber Underground. By the way, if you're going to comment on this episode on YouTube, please give a comment about my glasses. I've been hearing a lot about them today. Apparently, when I wear glasses with these frames, I look smart... er. I actually heard "you look smart" first, and then someone said "smarter," which I appreciate a lot. Smarter might be more gentle. Well, we're talking about the short-term mentality of manufacturing software. And let's talk about how it's developed in other products too, in engineering circles.
And I'm going to talk about cars, because the latest cars coming out now have online capabilities that have already been hacked. People are hacking Jeeps. Sorry, I have to pick on Jeeps. That was the first, right? And you can hack into a car and make it do things. Slam on the brakes, take it off the road, shut off the engine, whatever. Or even kill people. For law enforcement, that's a good thing. For everybody else, that's not a good thing. So let's talk about the short-term mentality. Now, if I bought a 1955 Ford F-100. A red one? Of course, you got red and black or something like that. Let me tell you, you want one that's red. Right. That would probably still be parked outside my house and probably be my go-to vehicle for heavy lifting. It might be your retirement plan. It could. Well, yeah, it was a really nice one, right? It's a classic design at this point. But it was a good vehicle. I mean, a solid V8 engine, made of solid steel. If you cared for that thing, it'd be working today. And you'd be a loyal customer into the future. Well, I don't know. I bought Fords for many years, many decades, and I saw the quality decline. Well, but based on that first one, you know. Well, I stayed with Ford based on that first one. So I had a Ford Falcon. Great, solid six-cylinder car. Got me around. Good gas mileage. And I could just beat it to death. And it was a wonderful car. And it was built in the 60s. By the 70s, when I bought my Mustang, things were changing. There was designed obsolescence, a short-term mentality. People started to finance their vehicles for three, four, five years, right, and longer periods. And so by the 80s, what I saw was the cars I was buying in the 80s were only made to last five years. You finish your payments, and maybe one or two years later you've got to buy a new car, because that thing's just falling apart. The Dodge Aries K-car was a perfect example. That was a disposable can opener that you weren't supposed to fix.
And mechanics would actually turn you away. I can't get parts for it. This thing's all one big piece, so I can't... Unibody. It was unibody, that's right. One of the first ones. Well, we keep doing that. You know, the last Ford, I'm going to pick on Ford now, the Ford Explorer. I had the 2004 Ford Explorer. And you don't have it anymore? I know. I tricked somebody into buying it. Actually, I sold it and they were very happy to get it. But I did warn them: everything plastic on the vehicle was beginning to become so brittle, it would just disintegrate. So the secondary controls on the roof and the dashboard and the center console, I had to keep replacing those, and it was hundreds and hundreds of dollars. Of course. And every four or five years, those things would just disintegrate. The rubber parts were the same way. There was a hole where you pulled out the seat belt. It actually wore a groove in the hole every time I pulled the seat belt, because the seat belt was tougher than the plastic. So I don't think we're actually designing for longevity anymore. We're designing for profitability, and it's all short-term. But there's a trick to it. You know, these aren't just random things. You've probably heard of big data, right? Sure, yeah, they're analyzing their data. Yeah, they're analyzing their data, and so they work these things out. It's very coordinated. You know, they ask the engineers, when's this going to break? Engineers give them a number. Oh, I don't know, 53 months. Are you sure? Yeah, we're pretty sure it's going to break. MTBF, mean time between failure, it's going to be 53 months. Call in accounting: we want a four-year warranty. Yeah, a four-year warranty. That's it. So, 48 months. Yeah, it's not random. The perception is it's profitability-oriented. And that's short-term again. Exactly. We're looking, as Americans especially, I think, we look six months ahead at the most.
I think it's highly unusual for a company to come out and say, this is where we're going to be next year, and actually be there next year. Because they change their plans so often because of small shifts in the market. So people have become short-term thinkers. So when they're writing software, short-term. Internet of Things, perfect example. I just want this webcam to work. Stick it on the wall, hook it up to your Wi-Fi. Security be damned. Oh, yeah. I just want it to work. And now those webcams can be used for DDoS attacks and breaking into your Wi-Fi. We have refrigerators that are Wi-Fi enabled with absolutely no security. There's a username and a password: password. Yes. Didn't you send me the picture of the Hawaii emergency management system, the screenshot from the interview with this person posing in front of one of the computers inside the control center? Yes, I do recall having sent you that screenshot. And tell me, what was so unusual about it? The post-it note sticking to the screen had the username and password. In the background, right? It had the administrator password. Password. Yeah. That's right, password. Didn't they actually say administrator? Could be. Maybe it was, yeah. It's scary, but when I go to audit companies, as part of the business I operate, the cybersecurity business, we do audits and vulnerability assessments. And you'd be surprised how many places someone can put a written password on a post-it. It could be under the keyboard. It could be hanging on the screen. It could be up above. But again, I think this is short-term mentality. No one's taking the time to memorize the password or put it in a place that's actually secure. It's just a quickie: write it down and stick it on the monitor. Well, once again, especially in a bureaucracy, people are essentially trying to cover their anatomy. You know, they make rules. You know, they check off boxes on forms that go in files that people don't read.
And so they have to have a password policy that creates stupid passwords that are hard for people to remember. And they furthermore make them change every 30 days. So people, you know, they rebel in their own sort of silent little way. And they put them on post-it notes and stick them on the screen. That's a rebellion. And, you know, legions of people aren't going to do things the hard way unless you really make them. You've got to force them. Yeah. And that's too expensive. Let's connect that now to things like hacktivism. And there's a connection here. Let's talk about where activism meets hacktivism. Now, if you don't know hacktivists, hacktivists are people with a political cause or an agenda. And they're not really out for the profitability. They want to make a point. Or philosophical. Or philosophical. Yeah, it could be ideological. Yeah, yeah, yeah, yeah. They want to push their agenda somehow, so they use hacking techniques to get it done. They hack a website. They put up their message or their manifesto or something. Denial of service attack, you know. They can bring down a company, right? But they can also cause harm. Now, hacking doesn't have to just affect computer systems in ways that merely inconvenience people. I'll give you an example. Last time we went to Black Hat, we saw a demonstration of somebody that hacked into a car wash and controlled the industrial control system via a web app that came with the industrial control system and wasn't secure. Because they didn't bother. And they were able to close the car wash door on top of a car over and over and over until they really damaged the car. It must have been fun, because they obviously had a webcam so they could see it happening. And this probably had a... There was a webcam in the car wash that was connected to the application. So, yeah, they could watch and laugh.
And that could be connected to hacktivism if somebody was environmentally conscious and they said the soaps you're using are getting oil into the gutter system, which goes right to the ocean. There's a lot of hacktivism that could occur, and that's one of the connections we can make. Hacktivists can hurt people. But their mentality is, I don't have to pay for that, because there is no tomorrow. I'm not going to get caught, because in a couple of years when they figure it out, I'll be dead or the country will be... There's no relationship. People live between their thumbs. There's a lot more self-absorption than there used to be. Selflessness is... Some people go, what? Selflessness. Is that a word? What is that? Selflessness. But what if you've got a pacemaker and someone, some pimply-faced teenager, goes, hey, let's hack pacemakers. What could possibly go wrong? Oh, yeah. And pacemakers have already been hacked. Exactly. The latest thing is now you can get a contact lens that will read your diabetes level, your blood sugar level, which very soon, if not already, will be connected to a pump that pumps insulin into your body. Oh, so if it's hacked, you get the wrong dosage and you could die. Yeah, right. Better living through... Yeah, chemicals was the old one. Well, yeah, better living through technology. And if people don't give a darn, if people don't have empathy for other human beings... They want to be first to market. That's the important part. Of course. Yeah, and I think our medical providers are right at the forefront of putting IoT devices on wireless that aren't secure. Pacemakers, even morphine pumps, can be on Wi-Fi. So... They've got intracranial pumps now. You can put just a tiny little bit of stuff in, you know, for mental illness and... Yeah, it's crazy. It's getting scary. Well, it's the double-edged sword thing, right? I mean, for every cool thing that's powerful, it becomes more...
To the extent that it has more and more power for good, automatically it has more and more power for evil. It's true. There's a balance to that. Yeah. Well, let's talk about how people in the United States, in our culture, might have developed this kind of mentality, say, over the last 70 years. I'm thinking post-World War II; the 1950s were our heyday. And then we go into something called the Cold War, when we realized Russia has the atomic bomb. Then we have intercontinental ballistic missiles. And all the way up until the 70s, the 80s, when I'm growing up, we're still having the duck-and-cover drills, but then they became earthquake drills in California. It's the same drill. Get under the table and pray. Did you get the mutually assured destruction memo? Yes. Yeah, that was the plan. That's wonderful. We don't care if we survive so long as they're all dead. Right. And both sides have that mentality, and it's like, why should I clean the bathroom? Well, now we have North Korea, and we have the ballistic missile alert, and it just promotes a philosophy of why should I really care? Never mind. It just doesn't matter, because we're all going to die tomorrow. Well, unless the ice caps all melt, then we're, you know... There are lots of things to worry about. Oh, and climate change. Yeah, sure. Well, with our last 30 seconds, give me your opinion. Did the short-term mentality contribute to things like the cultural revolution in the 60s and the weirdness in the 80s, and now the apathy of the new millennium? I don't remember. The 60s were good to you? Yeah, yeah. All right. I had the same thoughts about the 70s. Yeah. Well, I think science denial goes along with it, too. People don't want to make the effort to learn the science, so they can't argue science, so they just say, it's not real. Well, I can't be wrong if everything is untrue. That's right. Okay, we're going to have to wrap it up. This is a great topic.
We should do this again really soon. I agree. Would you come back? Oh, absolutely. Let's do this again. I'll come back today. This afternoon. Yeah. Well, we'll be doing a promo, too, in a couple of minutes, they tell me. Thanks for joining us on the Cyber Underground. Until next week, stay safe.