Okay, so just a little bit of additional information about me: as you said, I'm an entrepreneur, and hopefully a soon-to-be social entrepreneur. My next enterprise is going to focus on culture design through platforms for personal growth. I specialize in machine learning, I'm a published scientific author, and I'm also an avid practitioner of lateral thinking and interdisciplinary analysis, and this is basically the angle I want to take in this talk. I want to traverse multiple domains to hopefully create a more holistic understanding of the problem of misinformation and of the influence of adversarial social network interactions on society and individuals. So, in this talk I would like to challenge some in-the-box thinking that is popular in the subject of fake news and misinformation. I'd like to make a case for complex systems modes of thinking: that they're really important and actually quite indispensable tools in the effort of web immunization, and in general in efforts of positive social change. Having introduced that notion of complex systems, I would like to advertise a couple of useful frameworks, both established and bleeding edge, that explore this modality of complex systems for applicable understanding and for the design of effective interventions in complex systems. Then I'm going to go a little bit interdisciplinary and introduce a modern definition of trauma as a useful lens through which to view the human condition: to understand behavior, to understand the sources of harmful behavior, and in general to understand possible interventions that are more than firefighting, that are more long-term. And then, drawing on these examples from complex systems, I would like to make a case for investigating the techniques employed in therapy and personal growth as a possible cyber vaccine.
So, trying to convince the listener that these could quite effectively reduce harm from misinformation. And lastly, I would like to provide some inspiration on using software and AI to make those treatments scalable and cost-effective, and to sketch a field study that could be done within the web immunization grant, this general endeavor we have committed to here. Just a disclaimer: I might be wrong about all of this. I'm taking the liberty, the courage, to be wrong. But this panel is about idea generation, so I hope that's okay. Time is short and the subject is broad, so I'll have to keep things limited. Feel invited to reach out to me for additional information on the things touched on during this talk, and there's also going to be a bibliography at the end, where you can reach the sources that have inspired some of the thoughts in this presentation. So, let's start with the challenging part, let's start with being outrageous and probably not being liked by the other panelists. I think it's going to be fun. I have identified three very important box types that often inhibit effective work with complex systems, and society and humans are complex systems. I want to give some examples for each of those. We have a focus on symptom versus cause. We have a focus on pathology and firefighting. And we have reductionism, where, out of fear of complexity, we limit the field of study to a very narrow part of the system, which oftentimes results in reduced predictive capability and reduced quality of interventions. First, focus on symptom versus cause. Let's look at examples. The first box we're going to talk about is anger. We can, for example, create an assumption box: anger in social media should be suppressed because it threatens civil society and the stability of our institutions.
If we go deeper and unpack anger a little more, we see that this is a primary emotion of protecting oneself, and psychologists say there are multiple modes of anger, some of them very healthy. Anger, according to Jordan B. Peterson, can be an immature reaction to an overwhelming situation, but it can also be a necessary reaction to tyranny. So it's a very adaptive, very important emotion. Now we can look at what kind of prison bars this box creates. We can see that by suppressing anger we risk censorship, or an authoritarian regime where grassroots change is impossible. We have created a dystopia by trying to help and alleviate an existing problem. And how do we step outside the box? In this format, these are usually going to be questions we can ask ourselves to broaden our understanding of the issue. We can ask, for example, how we can help ourselves mature, so we are less overwhelmed, thus reducing those immature reactions; or how we can help ourselves be aware of our anger and see its real drivers, so our anger cannot be hijacked by an outside agency. Or we can ask what tyranny we, or the people, oppose, and whether we can find ways to constructively enlist that energy of anger into meaningful change. And similarly, distrust. We can say: okay, there's a lot of mistrust caused by social media, there are all these conspiracy theories and all this misinformation circling around, so we need to teach people to trust in the authority of science and institutions, and how do we best do that? That's the box. But we can unpack this trust, going deeper into how it functions in the complex system. It can be a result of a cognitive bias, of unresolved developmental issues, or of traumatic events, and then it becomes not distrust but mistrust: we just place our trust improperly.
It can also be wrongly generalized from experience, like: I have been mistreated once, so everyone must be evil. But it can be an adaptive reaction to an entity or system that has violated us repeatedly: I don't trust this person because he cheated me twice, or I don't trust this institution because it has provided improper information on multiple occasions. So distrust is an essential component of creative and balanced cooperation. Now that we've unpacked it, we can see the prison bars. We can see the risk of a sunk cost fallacy, where propagating a falsified narrative through a trusted source becomes an imperative, an end in itself, out of fear of compromising the believability of the source. Once a public agency tweets a particular science fact that is later debunked, it might be prone to continue that line of reasoning out of fear of losing its credibility, this currency of the information economy, and this is very dangerous. We can see that this leads to censorship or to alienation of the affected groups: because of their mistrust, we close them in their own segment and just push them away. But it can also disable the skeptics and non-conformists who are essential agents of societal evolution; most innovation comes from those who disagree with, who mistrust, the status quo. So what do we do to step outside? What is the reality of the people who mistrust? Is it generated by another non-adaptive process? Could it be caused by anxiety or hypervigilance resulting from trauma? How can this generating process, not the symptom, be amended? We can also ask what systems have perpetually violated trust, and how coercion has become a societal norm because of marketing and post-truth politics.
You know, mistrust may be a natural reaction to a culture of coercion. How can those drivers be amended and put in check, so we can restore healthy trust? How can we educate ourselves about healthy trust and distrust, and how can we have distributed agency? These are some of the questions for stepping outside the box. Now we go into pathology and firefighting. An example of how this modality of thinking can put us in the box: we should find ways to spot super-trolls and super-spreaders and disable them. So we find those who, you know, share misinformation in a direct message to 100,000 people, or to 50 other friends. If we look deeper, if we try to unpack this: super-spreaders are usually a minority, given the hypothesis that they follow a power law. And their impact is amplified by the fact that sharing, and giving the positive feedback that affects the algorithms, is free in social networks; their activity is basically unbounded, free, and can be easily gamed. Their impact is also amplified by regular spreaders, and it affects the unconscious bystanders as well. From that modality we can see the prison bars. This can be taken to an extreme that really results in censorship, where we take away the freedom to share, or we just silence particular groups that are not congruent with a general agenda. But also, and this is to loosely quote Sharon, we have this problem that science has always focused on people and conditions that are pathological, disturbed, or at best normal. You can see it in the past 30 years: there have been about 46,000 scientific studies on depression and an underwhelming 400 on joy. So with that mode of thinking we risk regulating social networks into rigidity.
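The power-law hypothesis about super-spreaders can be sketched numerically. This is a toy simulation, not data from any real network: the population size and the Pareto exponent are illustrative assumptions, chosen only to show how a heavy tail concentrates activity in a tiny minority.

```python
import random

random.seed(0)

# Toy model: per-account weekly share counts drawn from a heavy-tailed
# Pareto distribution. alpha is an assumed, illustrative exponent.
N = 100_000
alpha = 1.2
shares = sorted((random.paretovariate(alpha) for _ in range(N)), reverse=True)

# What fraction of all sharing activity comes from the top 1% of accounts?
top_share = sum(shares[: N // 100]) / sum(shares)
print(f"Top 1% of accounts produce {top_share:.0%} of all shares")
```

For an exponent this close to 1, a small minority carries a large fraction of total activity, which is the intuition both behind targeting super-spreaders and behind why unbounded, free sharing amplifies them.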
Instead of stimulating them into flourishing, we can throw out the baby with the bathwater: take away the good that social networks have brought us because we over-regulated. And this turns into a whack-a-mole game where the trolls become smarter and we hunt them better, an arms race, with lots of resources wasted. So how do we step outside the box? We can ask ourselves what contributes to digital hygiene, and whether some of those behaviors can be automated or encoded into the platforms themselves. We can also look at people who do not spread misinformation, or for whom misinformation is harmless in that it does not change their behavior in harmful ways, look at their features, and see if we can amplify them, if we can teach them to other people. We can also increase the stakes of the game of sharing: we can make the sharers stakeholders in the effects of their sharing, and that includes transitive trust networks, for example. And we can introduce limiters into the network, creating scarcity that could inhibit mindless sharing, for example a limit of weekly shares per account. These are different ways of looking at the problem from a different perspective. Then we have reductionism: an increased focus on a particular facet of a problem that closes us off to the underlying complex system that generates the problem. An example of thinking like this could be: we must focus on the availability and quality of information, on fact-checking provided by reputable sources, so that people can make more informed decisions. This is the box. But when we unpack it, when we look a little deeper at how people make decisions, we see that this box relies on the assumption that we can overrule misinformation with better information by providing fuel.
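The weekly share limiter mentioned above can be sketched as a simple per-account budget. This is a hypothetical mechanism, not any platform's actual API; the limit value and account names are made up for illustration.

```python
from collections import defaultdict

class ShareLimiter:
    """Scarcity mechanism: each account gets a fixed weekly budget of shares."""

    def __init__(self, weekly_limit: int):
        self.weekly_limit = weekly_limit
        self.used = defaultdict(int)  # account id -> shares used this week

    def try_share(self, account: str) -> bool:
        """Record the share and return True if the budget allows it."""
        if self.used[account] >= self.weekly_limit:
            return False
        self.used[account] += 1
        return True

    def reset_week(self) -> None:
        """Run by a weekly scheduler to restore everyone's budget."""
        self.used.clear()

limiter = ShareLimiter(weekly_limit=2)
print(limiter.try_share("alice"))  # True
print(limiter.try_share("alice"))  # True
print(limiter.try_share("alice"))  # False: budget spent, sharing refused
```

The point of the design is that scarcity makes each share a small decision rather than a free reflex, which is exactly the "increase the stakes" idea.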
That is, information for System 2, our rational thinking system, to follow Kahneman's logic. And to show you what System 2 is: according to Kahneman, we have two systems. We have intuition and instinct, which take about 95% of our decisions, and we have rational thinking, which takes 5%. Rational thinking is also informed by heuristics from intuition and instinct; that is where the data being fed into the rational process originates. We can see that our decision making, on average, is seldom rational, as this process of rational thinking is slow, expensive, often unpleasant, and requires focus and attention. In fact, a decision is a multifaceted problem invoking multiple systems. Besides rational thinking, we have emotions and intuitions that are strongly tied to our biology: our cortisol levels will change our intuition, our oxytocin levels will change our intuition. And we have relations: we use empathy and social predictions in our decision making. Additionally, what we can see is that misinformation or false beliefs are not harmful unless they elicit suffering, unless they elicit some response that is harmful. Unless there is violence, risky behavior, self-harm, or transitive harm, where the information hurts somebody we spread it to, they are not harmful. In that sense, misinformation has to out-compete the other decision drivers in the way we make decisions to become harmful. So we can look at the things that make for good decision making, like values, awareness, mindfulness, as inhibitors of the bad decision making that could be caused by misinformation. It's a very fluid game between all those driving factors. And also, because of this, many people act contrary to their currently declared beliefs, and this has to be understood when we talk about the effects of misinformation.
So now we can see the prison bars: we are mistaking a human for their System 2, their rational ego. In fact, many contemporary social engineering hacks are designed to bypass rationality and instead work through emotion and relational pressure, so fighting them on rationality is a lost cause. Again we can have the sunk cost fallacy, and the notion of credible sources is prone to corruption. So we can see these prison bars, and now we can ask how to step outside. We can ask, for example, how we foster mental postures that reduce suffering and decrease the chance of inflicting suffering on others, regardless of the information currently held in our working memory, and how we help ourselves achieve better emotion regulation, so our emotions cannot be hijacked. And how do neurobiological aspects of the human experience, like diet, respiration, posture, or hormonal profile, become drivers that affect emotion regulation and social behavior? We can even ask whether, maybe, good diet and respiration techniques are more important than media education, because they change our hormonal profiles and thus our social behavior. And we can think about how to promote those positive regulatory behaviors as a countermeasure against the negative drivers. So that's the box, and those are some ways of thinking to spring the discussion, to help us step outside. But you can also see that outside of the box things are getting very complex: there are many interactions, many additional inquiries, many additional lines of thought. So we can now ask: how do we navigate outside the box? Some really useful tools for that are complex systems and complexity theory.
Basically, complex systems are systems composed of many diverse parts that are highly interconnected and capable of adaptation, and that perform some collective function. The key features in the way we view complex systems are the network perspective, so again, things are highly interconnected, and feedback loops that are non-linear. So we have butterfly effects, we have threshold effects; the proportionality of input to output is very hard to predict, and simple causality won't tell you what's going to result. Yes, you wanted to say something?

I hate to do this, really, because I feel this is the beginning of a fascinating dialogue and discussion, but unfortunately we have to stick to the schedule, because there are next panels and the schedule is very packed, and unfortunately you have just reached your time.

As I was a little bit afraid of.

We still have five minutes, so maybe I will give back the mic to Isabella for the questions, and we'll definitely schedule a much longer time to discuss everything that you didn't have time to present.

Sure. Maybe I can jump to the conclusions for two minutes.

Make it two minutes, please.

Okay. Just feel free to stop me. So, basically, we need a well-rounded approach that optimizes the mind, the embodied brain, and our relationships, and this approach has to be informed by complex systems. We need to look for ways to create a mentally thriving society, and it's a matter of global security. Preventative mental health care is a must; lifelong education and practice is a prerequisite for a happy and safe society. And we can look at how software and AI can contribute. They can contribute by scaling those efforts, and by making the judging of the progress of those efforts more data-driven. They're a must for reverse engineering complex systems.
And AI is good at monitoring complex systems, because it can take in a multitude of signals that humans have a hard time following. So our proposed objective for web immunization is a field study with four groups. The first is unconditioned. The second is educated about misinformation and ways of reasoning. The third is enlisted into a personal and relational growth program where we teach mindfulness, positive psychology, physical exercise, group support, and rational practices. The fourth is a combination of the second and the third. And we test the hypothesis that a developed, well-rounded human, who has the tools for positive engagement in life, is actually resilient to the threats of misinformation. And that's it. Sorry.
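The four-arm design above could begin with a balanced random assignment like the following sketch. Everything here is a placeholder: the arm labels follow the talk, but the participant pool, sample size, and any outcome measures would come from the actual study protocol.

```python
import random
from collections import Counter

random.seed(42)

# Arms from the talk; labels are shorthand, not official study names.
ARMS = [
    "control",           # unconditioned
    "media_education",   # educated about misinformation and reasoning
    "growth_program",    # mindfulness, positive psychology, exercise, support
    "combined",          # media education + growth program
]

# Hypothetical participant pool of 200 people.
participants = [f"p{i:03d}" for i in range(200)]
random.shuffle(participants)

# Round-robin over a shuffled list gives a balanced random assignment.
assignment = {p: ARMS[i % len(ARMS)] for i, p in enumerate(participants)}

counts = Counter(assignment.values())
print(counts)  # 50 participants in each of the four arms
```

Comparing the growth-program and combined arms against the education-only arm is what would test the hypothesis that personal development, not just media literacy, confers resilience to misinformation.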