I want to thank you all virtually, and I really, really appreciate the opportunity to talk with you about a subject I'm pretty passionate about. I know you see slides up on the screen, and I'm going to do my best to go through them as fast as possible. If I could see a show of hands, I'd ask: does anybody here love PowerPoint? Surprising. So what I want to talk to you about today is something that's going to apply as you step into your new roles just as much as it applies today and tomorrow, and as it has in the past. And it's this: right now, we are at war. I know that in the headlines we're seeing a lot of what Russia is doing, and as of 10 minutes ago, we're not sure what their next action is going to be. Are they pulling people back? Are things changing in the region? Are they going to attack Ukraine? Any and all of that. The point to take away is that this is a different kind of war. It's a war that has been going on in a formalized fashion for about 100 years now. And I want you to look to your left, look to your right, look at the people next to you. The thing about this war is that it's global. Everybody's impacted by it. And the person to your left or right? You don't know if they're going to be an ally or an adversary from one moment to the next. That's the thing about war in a digital environment, an information war: it's based on ideology rather than nation-state boundaries. And Russia is very, very versed in information war. So much so that they've been doing it to other countries, us included, for the better part of 100 years. It started around 1918, when they changed their disinformation strategy from targeting their own people to targeting the rest of the world. We are still seeing the impacts of the disinformation campaigns they've run over the decades since. And I mentioned looking to your left and right.
The thing to take away from that is that you all, and your airmen especially, are all icebergs floating around in this information environment, this sea of information that's out there. To put it in context, the average person consumes about 500 to 1,000 posts on social media every day, consuming meaning scrolling past and things like that. A lot of us know airmen who are in the 2,000 to 3,000 range, but every single one of us, even if we're not, is an iceberg. Above the surface, at the tip of the iceberg, is the kinetic action we take: the out-loud things we say, the behaviors we exhibit in the real world, our actions. All of that is being shaped by the information we are provided. Before the digital environment and social media, it was a much slower pace. It was much more thought out; there was more time built in. Now it's information at speed and at scale. So you're getting those 500 to 1,000 posts per day, and because of the way social media works, that information is being micro-targeted. I want you to paint a mental picture here of an airman standing in the commander's office, talking about something they posted, shared, commented on, or generated on social media that runs contrary to doctrine, or to anything that aligns with being an airman. We often talk about the tip of the iceberg, what they did, but we don't explore what shaped that: how their environment was influenced, what information they were exposed to, how that information had a compounding effect. Because if you look at one thing online, if you look at one thing on Facebook, you're sending a signal to that company that you like it. And if you engage with it, you're sending another, stronger signal that you want to see more of it. You and me and your airmen are all these icebergs floating around in what's called the gray zone.
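To make that view-versus-engage distinction concrete, here is a minimal sketch in Python. This is not any platform's real algorithm; the class name, weights, and topic labels are all illustrative assumptions. It just shows the mechanism described above: a passive view nudges a topic's score slightly, while an engagement boosts it hard, so the feed drifts toward whatever you interact with.

```python
# Toy model (illustrative only) of how engagement signals compound in a feed.
from collections import defaultdict

VIEW_WEIGHT = 0.1    # passive scroll-by: a weak "I saw this" signal
ENGAGE_WEIGHT = 1.0  # like/share/comment: a strong "show me more" signal

class ToyFeed:
    def __init__(self):
        self.interest = defaultdict(float)  # topic -> learned preference

    def record(self, topic, engaged=False):
        """Log one impression; engagement compounds far faster than viewing."""
        self.interest[topic] += ENGAGE_WEIGHT if engaged else VIEW_WEIGHT

    def rank(self, topics):
        """Order candidate posts by learned preference, highest first."""
        return sorted(topics, key=lambda t: self.interest[t], reverse=True)

feed = ToyFeed()
feed.record("hiking")                  # just scrolled past it
feed.record("outrage", engaged=True)   # commented on it
feed.record("outrage", engaged=True)   # shared it

print(feed.rank(["hiking", "outrage", "news"]))
# The engaged-with topic now outranks everything the user merely scrolled past.
```

The point of the sketch is the asymmetry of the two weights: a couple of engagements outweigh dozens of passive views, which is why one heated comment reshapes a feed faster than a week of quiet scrolling.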
For those not familiar, the gray zone is warfare that doesn't rise to the conventional level of what we understand war to be. So it's not Russia invading Ukraine. It's not China attacking something in the South China Sea. It's not a firefight or a bombing run or anything like that. It's influence operations. It's cyber attacks. It's sentiment manipulation. It's countering narratives in the information space. And the problem is, we're learning to ride this bicycle of gray-zone warfare while we're riding it, and our adversaries are taking advantage of the space in between. So as we're trying to figure out whether Russia hacking a satellite can even be attributed to a country, or whether it's an act of war, our airmen are getting hit, not directly, but with the second-, third-, and fourth-order effects. And that leads to them getting one narrative or another from foreign adversaries, domestic trolls, or any number of sites they can reach on the device they carry around in their pockets, rather than actually getting after understanding what our commander's intent is. Typically, if I were in person, I'd ask you about that, and the answers get thinner the higher up the chain I go. The problem is not that airmen are disinterested, or that they don't care about knowing what their boss wants or how to succeed at their job. The problem is that we're talking about airmen who have grown up with the sum total of human knowledge in the palm of their hand, carried around in their pocket, and they're used to that information being delivered to them. They don't have to go search for it. The example I use, for people who do have social media: when you wake up and log in, how many of you navigate to specific Facebook pages or Instagram accounts? Or do you just open the app and scroll until you see something that interests you?
80% of the time it's the latter, and that's what our airmen are doing. And with the information we put out there, we're not understanding that in a digital environment, commander's intent and audience intent are on an equal playing field. Social media is like a commander's call where airmen can get up and walk out. They can jump up on stage. They can invite their friends to join in. They can offer commentary. It's participatory. And because of that, our messaging, constructed the way it traditionally is, is not breaking through to them. So they're seeing alternatives out there, things that are not rooted in fact, things that carry a foreign adversary's agenda. And digital media does this even if you're not on social media; Google does it as well. Google has an algorithm assigned to each of us, so my results will look different from yours. If we typed "Russia" into our search bars right now, I would get news results, yes, based on immediacy, but also based on preference. So I might see the Washington Post. Somebody else might see the Independent. Somebody else might see CNN or Fox News. That sorting happens because those algorithms recognize that if they want to keep somebody online longer, they have to deliver information that person wants to see. And unfortunately, studies have shown that if you leave these algorithms unchecked and unverified, just streaming content and micro-targeting, they always, always, always drive toward the extremes. So the more you look at, say, a certain group on Facebook, the more you're going to be exposed to content and friends who align with that narrative. Sometimes that's fine if it's, say, hiking. But it's not fine when it gears toward extremism, misogyny, or racism, any of those things that are terrible and incompatible with service.
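That "drives toward the extremes" dynamic can be sketched as a simple feedback loop. This is a toy simulation with assumed dynamics, not a reproduction of any of the studies mentioned: if the recommender always serves content slightly more extreme than the user's current position, and engagement pulls the user partway toward that content, the position only ever ratchets outward instead of settling.

```python
# Toy feedback loop (assumed dynamics) showing drift toward extremes.

def recommend(position, nudge=0.1):
    """Serve content a bit more extreme than the user's current position."""
    return position + nudge

def engage(position, content, pull=0.5):
    """Engagement moves the user's position partway toward the content."""
    return position + pull * (content - position)

position = 0.0  # 0.0 = moderate; larger = more extreme
history = [position]
for _ in range(50):
    content = recommend(position)
    position = engage(position, content)
    history.append(position)

print(f"position after 50 recommend/engage cycles: {position:.2f}")
```

With these made-up parameters the position never stops climbing, because each cycle moves the baseline that the next recommendation is measured against. There is no equilibrium in the loop unless something outside it, moderation, friction, or the user's own critical thinking, breaks the cycle.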
What all of that is doing is constructing echo chambers, which, in the digital sense, are basically how you think the world works based on the information you receive. So picture that airman in the commander's office again: if all they were exposed to were pages on social media that aligned with racist or misogynistic content, how long before that under-the-surface part of the iceberg builds up and crests into sexually harassing someone, making racist comments at work, or committing domestic abuse? Social media is designed to create potential energy under the surface, your thoughts, your beliefs, your knowledge, and then translate it into physical energy, what you do about it. And no one understands this better on the world stage than this guy. If you don't know him, you should, because that's Vladimir Putin. The word underneath him is the Russian spelling of the word "disinformation." And it's kind of funny, because one of the earliest disinformation campaigns Russia launched, about 100 years ago, was to try to convince the rest of the world that disinformation was a French invention. That's critical, because a key element of disinformation is that it has to have a kernel of truth behind it. So yes, "disinformation" exists as a French word, but Russia had it in their vocabulary first. They tried to convince the world otherwise so that they could continue to conduct these disinformation operations. And you see that over the course of history. Disinformation actually goes in waves. We're now in the information-age wave, which is drastically different from all the previous ones, because the earlier waves focused on disinformation at scope. They had a specific objective. They targeted a narrative, saying, for instance, that the CIA engineered the AIDS virus, or that the U.S. should not develop or field the neutron bomb, which we didn't, in part because of the disinformation campaign. They were targeting it.
In the information age, the digital age, they realized they can do it at scale and at low cost. For that neutron bomb disinformation campaign, they spent roughly $685 million in today's dollars to change the course of our developing an enhanced radiation weapon. It worked. Now, in the 2014 to 2016 run-up, they spent on the order of $52 million in total on disinformation targeted at the United States. So from $685 million down to $52 million, and the effects were much wider. They're learning from this, but at the core of it, their model of disinformation has remained unchanged, because we keep falling for it, and because we fall for it, we amplify it. So it's worked for Putin. And meanwhile, while Vladimir Putin is doing more targeted disinformation, even at scope, still getting after his objectives of disruption, distraction, and division, you have China. And China is taking what Russia is doing and learning from it. They have their hundred-year marathon. They've said that by 2049 they're going to take their rightful place as a world power, ending what they call a century of humiliation. And they are willing, they are wanting, to do that economically, socially, militarily, you name it. But when it comes to disinformation, they are still targeting their own people. They've thrown up the Great Firewall, limiting access to the internet. They have apps out there that the rest of the world is using; TikTok just beat out Facebook as the most downloaded social media app. And yet you can't use TikTok in China. Inside the People's Republic of China they have a separate, far stricter version. So the app that all of your kids are on, and that my wife is on a lot, and that's amnesty hour because she's not on this call, that app your kids and friends might be on is not allowed in one of our pacing threats. And beyond that, TikTok is a social media app that does not share the data it gathers on its users.
Who knows what it's collecting on your friends, your kids, your neighbors, your spouses? We don't know what it's collecting, and they don't share it. They just hoard it for whatever they want to do with it. And if you think 100 years of Russian disinformation operations is long, a lot of the stratagems China uses in their hundred-year marathon to assert dominance in the world date back to the doctrine of the Warring States period, and they haven't changed much over that time, because they keep working, because they're so effective. So we're coming into this struggle late, and then we're left wondering why and how our airmen are getting influenced, how our friends are getting influenced, what we can do about it, how we separate fact from fiction, how we think critically in this environment. It really comes down to that, and to building the future of our enlisted force. In your role as inspirations to these airmen who are on their phones constantly, we have to build airmen who can think critically faster and earlier in their careers, because again, they're exposed to a thousand posts, all of this information, every day. If a supervisor is having a thousand conversations with an individual airman every day, they're doing something horribly wrong. So we can't compete on scale. We have to compete on scope. We have to pull their attention in different ways. And unfortunately, the answer is not to use social media as a magic bullet to kill the problems social media generates. It's to understand the environment, take some of that discussion offline, and use these platforms deliberately, because it is an information war. And the airmen who are just coming into your formations are the ones who are going to inherit a digital war, a world war web. So what does the future look like? In a lot of ways, we're at a crossroads.
If we keep doing things the way we're doing them and just say, okay, social media is a nice little afterthought, the digital environment is a nice-to-have, we're going to let our adversaries vastly outpace us on that battlefield, because they understand what it can do. They've seen evidence of what it can do. Regardless of where you sit politically, what happened on January 6th is an example of how potential energy on social media can lead to kinetic action. To contrast that, the earliest time Russia ever tested this against the United States in the digital environment, they advertised that free donuts were available in New York, hacked a camera, and watched people line up around the block to wait for free donuts. So they knew they could do it, and they started doing it in stages, building up and building up. We are still susceptible to being targeted every day. And with that, I tried to go through as fast as possible so we can get to what's on your mind. What can I answer? Because the worst thing that could happen to this discussion is for it to be just "here are the facts," cut and paste. It has to be about what is on your mind. What is concerning you? What can I help you with? We're still going to use the mics on the side. Chief Washington, Luke Air Force Base: So, are there any known or recommended settings that we could share in the discussions we have with our airmen, that they can put on their phones, to eliminate some of the targeting or the filters that identify them, the statistics that are collected on them? Sergeant Denton, did you hear the question? Oh, I guess I have to unmute; I broke it. Sergeant Denton, can you hear us? Yes. Sir, can you cut your video for bandwidth purposes, and your slides as well? Appreciate it, sir. No problem. Is that better? I think so, so far. Thanks, sir. We have a question. Can you hear me? Yes, sir. All right.
The question I had: are there any settings or recommendations we can give airmen who may be targeted when they're out on social media, that could prevent some of the statistical data being collected on them? If I heard that right; it came through a little broken. You were asking if there are any settings we could adjust to mitigate some of these issues I just talked about? Correct. There are, but there are caveats. There are things you can adjust. You can restrict apps, but a lot of social media companies are smart about that. When you log into Facebook for the first time on a new phone and it asks, do you allow access to your camera and microphone? Do you allow cross-sharing across apps? A lot of people accept without thinking, because if you don't allow it, you don't get to upload photos to Facebook. You don't get to upload video, things like that. You have to enable these things in order to access the full experience of the app. The problem is that if you enable that, it's a two-way street. Facebook, as part of their terms of use, says: because you are uploading this content, we reserve the right to monitor it and use it to enhance, they say, your experience. So what they do is take your Facebook usage, what you scroll past, what you engage with, what you upload, and plug it into their algorithm. It basically trains an AI to recognize what content they think you will want to see more of. So yes, you can limit your permissions, and I do encourage everybody to go check their permissions, especially on third-party apps, because with a lot of those apps you don't know where the data goes. They might be gathering data you are not comfortable sharing. You don't want Candy Crush to have cross-app access and access to your bank account, things like that, unless you're really into Candy Crush. That's kind of the thing.
So check your permissions, but also understand that we're in an age where those terms of use determine how we can use these apps. Can you hear me? I can now, yes. Senior Master Sergeant Josh Ziriak from Hill. A quick question: where can we go, what resources are available, to continually update our perspective so we can be more in touch with the issue and understand some of the risks that are perpetual to us and our airmen? Sure. What resources can you use? There are a couple of sites out there that cover things like disinformation. There's a website that is the European Union's attempt to combat Russian disinformation through the stories they write and the information they put out. One of the biggest things I encourage, though, and this is definitely a plug for my boss's reading list, is books: LikeWar and The Hundred-Year Marathon are both on there, and I encourage people to look at those. One, because the past is prologue. Earlier I mentioned that Russia created a disinformation campaign claiming AIDS was manufactured in the U.S. That is exactly the playbook they recreated when they pushed narratives about the coronavirus being manufactured in a lab in China. So history gives us a lot, because if something worked once, they think it can work again. The other thing is encouraging your airmen, and encouraging each other, to pick up a hard-copy book, or Audible, or Kindle, because it takes you away from those platforms. I prefer a hard-copy book because it takes me away from that digital environment, which is just like a dopamine addiction. So I encourage reading more and checking out senior leader reading lists, because more and more you're seeing information warfare books show up on them. There's also training out there; there are organizations that run digital conferences on disinformation, because they're in it a lot more directly than us.
So there are a couple of webinars out there, and there have been a few very good series; the Washington Post did a very good one on disinformation, and Russian disinformation in particular. There are resources out there to read, and they give you the no-kidding, unvarnished history and truth behind it a little better than the digital environment does. Hello, this is Senior Elmore from DIA. My question is, how can we better approach and recognize psychological warfare in our day-to-day fight with our workforce? I apologize, you came through broken on that one. Do you mind repeating it? No problem. With the current threat we have and see, we always hear about China, about Ukraine, and about our other adversaries, but with our day-to-day workforce, and how social media can influence everybody, how can we better adapt and recognize the effects of psychological warfare on our junior folks and across the workforce? I caught the general tone of it; sorry about the sound issues. I'm going to try to paraphrase and repeat the question: in this current threat environment, how do we recognize the psychological warfare, or the psychological effects, of social media and digital platforms? I'm paraphrasing, but I think that's the question. Got it. First and foremost, it's not always easy, and that's the nature of disinformation. Sometimes you don't recognize it until you're too deep into it. It gets after that critical-thinking piece I mentioned. But one of the first things, and this has to be done at an individual level, is to get really personal, and encourage your airmen to do the same: what are your trigger points? If you are scrolling online, what is going to make you stop scrolling in a way that compels you to engage without thinking?
We all have those trigger points where, if we're scrolling, we're suddenly mad and have to say something. Rational thought and critical thinking go out the window. And that is a core element of Russian disinformation especially. What they do is create a narrative built around a kernel of truth; it's a falsehood wrapped around that kernel. Then they launch it and saturate the information space. For example, with COVID, they programmed bots to go across Twitter and repeat the message "just the flu," because that was a narrative they saw picking up steam. And then they look for what translates as a "useful idiot": someone who picks up that message, carries it back to their own platform, and adds their credibility to it, without actually thinking it through. So the first step, and one of the most important, is to recognize what triggers you. If the first response you have is "I need to share this" or "I need to do something," that's the wrong response. Then you create a mental pathway that says: I'm triggered, this is hitting my trigger points, so I need to learn more about it, or I need to understand why it's triggering me. Because that's the component that's missing from the ready-fire-aim behavior social media encourages. Social media is so rapid: we're going to be the first to share this, we're going to break the story, my friends need to know about this. And we do it without thinking, without fully examining what is going on or why it's making us act that way. And the thing is, Russia plays in that playground. They love it. They love hijacking narratives and comments. If you go to Fox News or the New York Times on Facebook, or any of the major media pages, and you look at a very, very controversial story, you could probably scroll about 10 comments in and find a Russian troll.
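The bot-saturation tactic described above can be illustrated with a tiny simulation. The account counts and post volumes here are invented for illustration; the point is the mechanism: a handful of coordinated accounts repeating one phrase can dominate the apparent conversation, because what a casual scroller sees is volume, not headcount.

```python
# Toy simulation (made-up numbers) of narrative saturation by bot accounts.
import random

random.seed(42)  # fixed seed so the shuffle is repeatable

organic_users = 100   # real accounts, one post each on varied topics
bots = 5              # coordinated accounts
posts_per_bot = 40    # each bot repeats the narrative relentlessly

feed = [f"organic-post-{i}" for i in range(organic_users)]
feed += ["just the flu"] * (bots * posts_per_bot)
random.shuffle(feed)  # interleave bot posts among organic ones, like a timeline

narrative_share = feed.count("just the flu") / len(feed)
account_share = bots / (organic_users + bots)
print(f"{bots} bots are {account_share:.0%} of accounts "
      f"but {narrative_share:.0%} of the feed")
```

Run it and five percent of the accounts produce two thirds of the posts, which is exactly the effect that makes a fringe narrative look like consensus and tees up the "useful idiot" to carry it further.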
They seed comments to upend or disrupt the narrative, and then they move on to their next target. These trolls are literally sitting at the Internet Research Agency in Russia, typing away and trolling pages. They get paid to do this, to disrupt narratives. So from our end, we have to understand what triggers us individually, and then build in that pause to prevent us from amplifying these false narratives. I think we have one more question, sir. Good afternoon, Senior Blaine from Offutt Air Force Base. My question is, what is public affairs, or possibly big Air Force, doing to implement information like this early on in an airman's career? And if you're not the right person, that's cool, just let me know. But this sounds like something that could be involved in basic training, and definitely F-TAC and Airman Leadership School. So do you know of any initiatives that are moving forward on how to get this information to airmen directly, versus me trying to explain it? Because that doesn't always work out well. Over. No, absolutely. This is the question I get most often: what are we doing about it? From a public affairs standpoint, in the job I had previously, I was working with a team to revamp and modernize public affairs training, and I can tell you that the CDCs going forward, the ones that are going to train three- and five-level airmen to do their job in public affairs, have a huge section on disinformation. They have a huge section on information warfare from a public affairs lens. And really and truly, the ability to do this came in 2017, when then-Secretary Mattis drafted the seventh joint function and added information to our joint functions.
That, along with its subsequent document, the Joint Concept for Operating in the Information Environment, really changed the matrix of what public affairs is. And I'll say this about my own career field: anyone who says we don't deal in influence is wrong, because we work for the commander. We work for the commander. We work for the Air Force. Information as a joint function comes with influence baked into it. That is the name of the game right now. So my career field has started changing its training, has started changing how it educates airmen at our tech school. Our tech school, the Defense Information School, released a public resource called Pavilion about a year or so ago. It's spelled P-A-V-I-L-I-O-N. Anyone can access it, and it has training about disinformation and more advanced social media training. What we're trying to do is teach this to the fleet and field, and not just Air Force, but the joint fleet and field. So when you have your public affairs airmen standing up at F-TAC and briefing, they're not just saying, hey, don't do this on social media, don't do that, or hey, come tell public affairs your story. They are talking about Russia. They're talking intelligently about disinformation operations and how airmen can be targeted by them. So that's what's happening in the PA realm. There are more discussions at the Pentagon than ever before on information warfare and what it means. I know my boss is incredibly attuned to it, because information warfare is such a cheap way of influencing people, and it has been for a number of years. So there are more discussions happening about it. There are initiatives out there that certain major commands are running to make sure airmen are getting ops intel briefs so that they understand the strategic environment. So things are being done, and conversations like this are key as well, because they reinforce it.
My objective in talking with anybody about this is to build force multipliers, so that when you go back to your formations and talk to your airmen, you might find one airman who says, "oh yeah, I know about this," who is better than me at it, who can give this briefing ten times better, and then we're just creating something that builds upon itself.