Okay, so hello everyone and welcome to EGU 2020: Sharing Geoscience Online. My name is Hazel Gibson and I'm the EGU's communications officer, and I'm really, really excited to welcome you today to one of our special live sessions that we are streaming throughout the week of the conference. This is Great Debate number five: values versus facts, should geoscience get personal? And we're very lucky to have two amazing speakers with us today, so I'm going to briefly introduce them and then let them get started on their presentations. First up, I'd like to introduce Laura Smillie. Laura currently works at the European Commission. Prior to joining the European Commission in 2017, Laura was the director of external relations at an internationally renowned research institute, after completing a master's in European communications specialising in the psychology of cultural diversity. She has worked in communications for several years for organisations like Deloitte, Ogilvy Public Relations Worldwide, the European Food Information Council and the European Food Safety Authority. She's also the former chair of the crisis and risk communications working group of the European Association of Communication Directors. In her current role at the Joint Research Centre, Laura is tasked with heading up the Enlightenment 2.0 project for the European Commission, which explores the extent to which facts, values and social relations affect political behaviour and decision making. Throughout her career Laura has been working at the science-policy interface, and in addition to her practical expertise in the fields of global food chain policy, risk communications and extensive stakeholder management, Laura has developed and published a model for optimising the communication of scientific risk and uncertainty. 
And we're also very lucky to have with us Professor Stephan Lewandowsky, who is a cognitive scientist at the University of Bristol. He was an Australian Professorial Fellow from 2007 to 2012 and was awarded a Discovery Outstanding Researcher Award from the Australian Research Council in 2011. He was a visiting professor at the University of Amsterdam before moving to the UK, and in 2016 he was appointed a fellow of the Committee for Skeptical Inquiry for his commitment to science, rational inquiry and public education. Stephan's research examines people's memory, decision making and knowledge structures, with a particular emphasis on how people update information in memory. His research interests also examine the potential conflict between human cognition and the physics of the global climate, which has led him into research on climate science and climate modelling, and also took him back to Australia, where he was appointed a visiting scientist at the CSIRO Oceans and Atmosphere laboratory in Hobart, Tasmania. After the talks we'll open the floor for questions, and we'll also try to have a little bit of discussion around today's topic: the role of values in communicating science, particularly in relation to climate science. So, starting there, I would like to hand over to Stephan, if you wouldn't mind taking over. Very good. Okay. Well, I want to keep my remarks brief to leave lots of time for discussion. Basically what I want to do is provide a bit of context about the situation that we're living in, and how we as scientists have to navigate this increasingly confusing space out there. 
Now, what you can see on the slide is my very brief history of lies, and I use American presidents to illustrate it simply because it's fairly easy to do: they are being examined all the time for their statements. I would argue that there has been a shift in the prevalence of misstatements and misleading information across presidencies, and here we see Donald Trump's record to date. Well, not quite to date, but the last update I could find. So there seems to be a lot more happening out there in terms of misinformation, misstatements, misleading statements and so on, and of course, for us as scientists, we have to be concerned about that. But it's not just that; there is also something else that I think is very important to understand, and that is that there has been a shift in the type of misinformation that we have to deal with in society. It's always been the case that some politicians are challenged by the truth, at least on certain occasions, but they used to create what I would call carefully curated deceptions. Just to give one example, take the weapons of mass destruction in Iraq, where we were supposed to believe that Iraq had those weapons, and in fact it then turned out that they didn't. Now, what has changed is that we have a situation of what I call epistemic insouciance and "shock and chaos" disinformation, and it appears that when politicians are now saying things that are false, it is very often difficult to even discern a purpose in those falsehoods, because they can be operatic in their claims, and we're confronted more with a spectacle than with an attempt to actually mislead us into believing a specific falsehood. I think that's very important to understand, because that shock-and-chaos blizzard of disinformation has consequences, not just for science but for society in general. 
And one of the consequences is that in the current environment, where anything inconvenient is labelled as fake news, for example by Donald Trump, talking about facts and evidence becomes very difficult: if everybody has their own alternative facts, they can escape accountability or an evidence-based discussion. Now, what this kind of public display of competing epistemologies does, of course, is destabilise institutions that are based on the idea of evidence, such as health organisations and, for that matter, scientific organisations dedicated to climate change. And we can see the consequences of that in a variety of ways. Here are data from a recent paper that looked at the share of votes for populist parties and vaccine hesitancy across European countries, and what you can see quite clearly is that as the vote share for populist parties increases, so does vaccine hesitancy and the belief that vaccines are not effective. And of course, the essence of populism is to replace evidence-based discourse with an emotionally driven, supposed distinction between the elites and the "real people", and so this is one consequence of that effort. Now, why do people accept this? Why are there people in society who are happy for Donald Trump to utter so many falsehoods? Well, the easiest way to answer that question in the present context is to look at what drives people's attitudes towards climate science or climate change. And it turns out that the biggest, most important driver of attitudes is people's political worldviews. I have done research in this arena many times, and I can ask people four or five questions about their attitudes towards the free market, and the moment I've done that, I can predict their attitudes towards the laws of physics that are governing climate change. The analysis I'm showing on the screen right now spans 24 different countries with more than 5000 participants. 
What you're seeing here are effect sizes, which express the strength of the association between conservative, free-market, libertarian worldviews and denial of climate change. You can see that there is a strong association that is greatest in the US and bigger in English-speaking countries than elsewhere, but the pattern is consistent globally: political views determine people's denial of science, be it vaccination or climate science. So that's the situation we're in. How can we counter it? Well, there are a number of things we can do. Here's a list, and just very briefly: we do have tools as communicators to get information across, even in today's climate. Reinforcing the scientific consensus on climate change, for example, or indeed on vaccinations, has been shown to be effective. But it's not just a matter of communication. It is also a matter for the public to do their part and learn about misleading argumentation, and indeed for scientists not to be unnecessarily hesitant to go out and state the scientific evidence, and to be advocates for climate mitigation. But I think there's a lot to discuss, so I'll leave my remarks there, and we can return to that list later and deal with it in the Q&A. Thank you. Thank you very much, Stephan. So now we're going to move over to Laura Smillie. I'm going to start sharing my screen, Laura, and then you can start your talk. Is that okay? So that seems to be working. That's fantastic. Thank you so much. So first of all, thank you so much for this invitation. I would have obviously preferred to be sharing this and having the opportunity to meet many of you in Vienna, but under these exceptional circumstances I very simply hope that everyone who's listening, and your families, are all safe and well. And I'm glad that you've been able to take this initiative, and that in these difficult times we can hopefully nevertheless have a really good interactive discussion about this. 
As Hazel's kind introduction mentioned, I am leading the project, rather pretentiously called Enlightenment 2.0, at the European Commission. It's a multi-annual research programme which is seeking to understand the drivers of political decision making. Next slide, please. So clearly complexity is the meta-problem of our age, and I want to make it very clear, and for everyone to understand, that science can help to unpick this complexity. I am working in the Joint Research Centre of the European Commission, where we have over 2000 scientists working in all types of regulatory science underpinning very important areas of EU legislation. So we are extraordinarily pro-science, and yet some of the evidence-based statements I'm going to be making today might be slightly confrontational for some of the people listening. So, in terms of framing, and I have a nice frame behind my head, it's really important to remember up front that we are coming at this from a very scientific perspective and always striving for that optimal uptake of science in the decision making process. Next slide, please. And why is this so important? Very simply because evidence use is being challenged in policy with the rise of populism, polarisation and disinformation, as Stephan has been so eloquently explaining. And so we believe that an understanding of human behaviour can actually help to put knowledge and reason at the heart of political decision making. Powerful new insights from the behavioural and social sciences, in addition to the humanities, can give us a better understanding of human political behaviour. And so this is why we decided to look at emotions, values, identity and reason: how these different elements fit together and how they can influence, and help us better understand, the political decision making process. Next slide, please. 
So the first output from the Enlightenment 2.0 project was a report called Understanding Our Political Nature. I had the privilege of working together with 60 experts from all around the world, working in lots of different disciplines: evolutionary biologists, anthropologists, cognitive psychologists, behavioural economists, historians, philosophers. It was just the most remarkable group of people, and they answered two main questions. The first question was: what are the drivers of political decision making? And the second: what are the optimal strategies for the uptake of evidence, from your specific discipline's perspective? So everyone went away and worked extremely hard, coming back with state-of-the-art reviews, and then within the JRC, because we are knowledge brokers, if you like, working at this interface between science and policymaking, we translated much of what they had written into a form that our policymaking colleagues could understand. And subsequently we have been championing all of these insights and findings in order to help change and adapt processes internally to optimise the uptake of evidence in political decision making. Next slide, please. There were seven key chapters in the report, looking at misperception and disinformation, collective intelligence, emotions, values and identities, framing, metaphor and narrative, trust and openness, as well as evidence-informed policymaking. But for the purposes of today, and not to go into too much depth, I'm just going to very quickly run through a couple of the top findings from the values and identities, framing, metaphor and narrative, and trust and openness chapters, so as to make sure that we've got lots of time for discussion and difficult questions. Next slide, please. So if we kick off with values. Next slide, thanks. 
Really importantly, what we're seeing is that values and identities are driving political behaviour but are not properly understood or debated. So what does this actually mean? Well, political decisions are strongly influenced by group identities, values, worldviews, ideologies and personality traits. Historically there have traditionally been three main types of contradiction, and what we're seeing, with political polarisation on the rise, is that a new form of cultural rather than economic polarisation has started to emerge within the past 20 years, often associated with far-right opinions, with immigration and with multiculturalism. So we need to understand this. And very importantly, values are strongly influencing not only our political behaviour but also our perceptions of facts, and I'm sure that's something that Stephan and I will come back to later on, because that's quite a difficult one to get people's heads around. Can I have the next slide, please, Hazel? So next we're going to talk about framing, metaphor and narrative. Next slide, thanks. And very simply: the facts don't speak for themselves. Framing, metaphor and narrative need to be used responsibly if evidence is to be heard and understood. Now personally, I adore this slide. It's a compilation of different images from the G7 meeting that took place in 2018. These were all official photographs shared, clockwise, by Germany, Italy, Canada and France, where each country has tried to show its own leader, its figurehead, at the centre of the debate. And that, to me, in exactly the same room in the same location, just summarises the importance of framing. Because we must understand that there's no such thing as a neutral frame: something is included at the expense of something else being excluded. 
And this is important as we're talking about communication, and responsible communication, and what that means as we go forward. It's also important to realise that the way in which policy problems are framed can significantly influence beliefs. And so we need to realise that it's not the side that has the most or the best facts that necessarily wins an argument, but the one that can provide the most plausible scenario, one that feels intuitively reliable and that is being communicated by a perceived credible source; not necessarily a credible source, but certainly a perceived credible source. And this links on to the next topic, please Hazel, which is trust and openness, and it's particularly key for today's discussion. Next slide, thanks. Because really we're looking at how the erosion of trust in experts and in government can only be addressed by greater honesty and public deliberation about interests and values. Here you see, for example, the Irish Taoiseach, Leo Varadkar, speaking following the referendum liberalising abortion laws in Ireland, which followed an intense year of dialogue with a representative citizens' assembly. And so as we're talking about what we can do in terms of opening up communication, and what more we can do, it's important to understand this environment in which we are working. And I think one really key takeaway point, perhaps for the members of the EGU, is that many people talk about trust and the importance of trust, but in our work we were actually digging deeper into the concept of trustworthiness. And it's really important to pull that apart and have a good understanding of it, because trustworthiness is not just about scientific excellence or thorough understanding of a particular competence. 
What we are saying is that trustworthiness depends, yes, of course, on expertise, but also on perceived honesty, which comes from shared interests and values. And that's incredibly important for the scientific community to understand, in terms of what it means for their scientific communications. Another difficult point, but one I'm going to put on the table, is the fact that science is not value free; the ideal of value-free science is actually more complex in reality, because values can enter into the scientific process at many, many different stages. Very importantly, and going back to the organisation I'm coming from and representing, this does not mean that science cannot be trusted, but there is a desperate need to be more transparent about the role of values in science. So much of this is to do with increased transparency: opening up evidence to public scrutiny, helping to maintain that critically important scientific authority, and seeing how these initiatives can go hand in hand with institutional initiatives, such as deliberative democracy initiatives, etc. Next slide, please. So those were the main points I wanted to raise from the report. Just to give you a little bit of context, we're also doing some important additional projects under the Enlightenment 2.0 programme. It was very clear that we didn't understand enough about values and identities, so we're doing a deep dive, once again with experts from around the world, to understand more about this. It's quite a nascent area of science, and we're working with lots of different disciplines to try and see how much of a consensual understanding we can achieve. We're also looking at the influence of the online environment, of digital influence, on political behaviour, and this is a report that's due to come out really soon. 
And then, very importantly, we're going to be looking at things like meaningful and ethical communication, which I'm sure will be of interest for listeners today, because there's a certain chronology to this: once we understand the role of values and identities, and once we've understood the degree of causality in terms of changing behaviour, how can we then ensure that our communications are at the same time meaningful but, very importantly, ethical? So these are the things that we have on our plate and are going to be looking at next. I think I'll just leave it there. Thanks very much. Okay, thank you both very much for those introductory talks; that was really fascinating. I want to kick us off with a general question while I go through some of the questions that we've been sent, and it's a question for both of you. Given what you both mentioned about the way that values, identity and emotions can very easily influence the political process, doesn't science have a responsibility to be objective and just present the facts? I don't know which of you would like to start. Yes, I mean, you know, we have a responsibility as scientists to be bound by evidence and good practice and ethical conduct and all the other norms of science that we all subscribe to, undoubtedly. I think the question we have to look at is: to what extent is that effort of being objective and talking about just the facts successful? And that's where things get to be a little complicated, because sometimes facts are not totally straightforward. There are some cases when it is very clear, one example being Donald Trump claiming that his inauguration crowd was the largest ever. Well, no, it wasn't, and we don't have to have a debate about that. Let's just move on to something more meaningful. 
But on the other hand, when you're talking about, say, the economic consequences of Brexit, things can be a little more complicated and nuanced. And so one of the things we've got to do is differentiate between things that are facts and are consensually, scientifically established, such as the greenhouse effect, and other things where it is indeed a matter of values or perspective. For example, should we care if some Pacific island nations go underwater? That's a value judgment; it's not something where there's only one answer. Well, unless you live on that island, but, you know, other people might differ. Laura, do you have anything to add to that? I do. I mean, I guess where we're coming at this from is that we don't just want science to be heard; we want it to be understood and we want it to be incorporated into the decision making process. And what our learning through this report, and all this work from all these disciplines, is actually showing is that, to ensure the fullest understanding, the deficit model is just not enough. Just adding more facts to convince somebody of your argument is just not enough. We need to understand, in a very human way, and have a very human approach to ensuring that we can responsibly incorporate our science into the political decision making process. Thank you. I have so many questions in the Q&A, which is fantastic. Please keep them coming; I'm working through them and we're going to try and incorporate a few of them. So I'd like to start off with a question that Francis has asked in the Q&A, specifically, I think, for Stephan, but either of you can answer if you choose to. Francis asks: if we accept that the different sides of the left-right political divide take on information in different ways, should scientists adapt their messages to fit this? Yeah, very interesting question. 
I mean, first of all, yes, there is a lot of evidence, without question, that people on the political left and the political right assimilate evidence differently. And yes, from a communications perspective, of course there are ways in which you can frame a message so that it is more congenial to one audience or the other. I don't see anything particularly wrong with that, if you know what you're doing and if you're careful about it. So, for example, there's evidence, at least in some laboratory experiments, that you can talk to conservatives about climate change by considering the security implications of it. I mean, climate change, according to the Pentagon, is a national security issue for the United States, so why shouldn't you be talking about national security to that audience? I don't see anything wrong with that. And conversely, you can talk about, say, the health co-benefits to a more left-wing audience, because that's perhaps more important to them than national security. So I don't see a problem with that. But let me add two things, really. One, very briefly: when it comes to accepting science, my colleagues and I have been doing research for 10 years now or so, and we have yet to find a topic or domain where people on the left are rejecting well-established science more than people on the right. We just haven't seen that; it is pervasively a right-wing phenomenon. I'm saying that without value judgment; I'm just saying that's what the data tell us, and as it turns out there are some strong reasons why you would expect that. So that's one thing to keep in mind. The second thing, having just said that it's okay to frame things differently, is that there's a difference between framing and spinning and distorting, and you have to be very careful that you stay on the framing side of things rather than going further down that road. 
Laura, I think this is feeding into something that we've been talking about throughout, which is the role of personalising science communication with values and political opinions. So I was really interested to see if you could talk to that a little: what is the role of adding your own personal opinions, in that sense? Which also links to a question that we've been asked, to do with scientists engaging with populist movements, particularly around climate science and things like Extinction Rebellion. Should scientists be bringing those opinions into those scenarios? So I think, very specifically on the climate question, Stephan's done a lot of fantastic research on that, so Stephan is probably better placed to answer that one. But more generally, I'll happily take the point about the validity of incorporating values or emotions into the communication process. What we're really saying, going back to the point I highlighted during my very brief introductory presentation, concerns this concept of trustworthiness. Now, we know that many of your audience today are going to be natural scientists, for the most part, who are striving indeed for this value-free ideal of science being completely objective, disinterested, asocial, and all of these ideals that one would perhaps strive to have. But of course, this is not the case; we know that science is actually value laden. Otherwise, it would just be a question of science speaking truth to power and power implementing accordingly, and it's not as simple as that. 
And if you then dig a little bit deeper and look at the concept of trustworthiness, in order to build that desperately needed societal trust in scientific outputs, then what we are seeing is that there's this real need for connection. It's not just scientific excellence. I'm sure many of the people listening will be assuming: if I'm just really fantastic at what I do, and I'm publishing in the journals with the highest possible impact factors, and working in teams with the most renowned professors, come on guys, surely that's enough. And, you know, I'm completely sympathetic to that; you would think that it would be. But if you dig in and understand the concept of trustworthiness, there's this honesty which is based on a shared understanding of interests and beliefs, and without that combination of shared interests, honesty and scientific excellence, your messages are just not going to be as successfully communicated as they otherwise deserve to be. Great. Thank you. Stephan, did you want to jump in on the idea of engaging with more populist movements to communicate your science? Well, I think we have an ethical responsibility to address the public; we have an ethical responsibility to correct misperceptions, even if we know that that's not always successful. So I would never say let's just walk away from it. However, I acknowledge that it's extremely difficult to engage with people who are, excuse me, primarily driven by emotion rather than evidence, and who are sometimes quite explicit about that. Now, the other question that Laura raised, and then said maybe I can speak to it, is about climate-specific engagement and advocacy and all that. 
That is an issue where I think we now have some data to suggest that it is okay for scientists to take a position of advocacy, provided they do so from a position of competence and expertise. So, clearly, a climate scientist should not comment on, say, public health responses to tobacco and secondhand smoke; that's not going to go over too well. That's something that deniers can do. But if experts stick to their domain, then if they have something to say that affects politics or policy, that does not undercut their credibility. We have experimental data coming in on that now, and I find it sufficiently plausible to think, well, actually, that makes a lot of sense: why shouldn't we speak to policy when we have expertise that's relevant? Great. And so, following on from that, I've had a couple of questions that are specifically related to credibility and reputation. So I'm going to pick one, and then I may come back and ask the other, because I actually really like both of these questions. The first one is from Fanny, who asks: is it a mistake or not for a scientist to engage in an ideological debate, rather than keeping the conversation on science, or is there a good way to do it? Isn't the fear of losing your credibility legitimate? It follows on from what you were just saying, I think, Stephan, so maybe you first, and then we can hear from Laura. I wouldn't advise a scientist to engage in an ideological debate as a scientist, somehow bringing expertise to a debate that is really ideological. If I understand the question correctly, I would advise against that. Now, when it comes to credibility, of course that's crucial, and this is where we have to be educated about the situation we're in, certainly as climate scientists, or people working in the climate arena. 
We are not operating in a communication environment that is universally receptive; quite the contrary, there are hostile elements out there, and we have to face and acknowledge that, elements who will do anything they can to undermine our credibility in whichever way they can. I mean, we've had 20 or 30 years of climate denial, and it's far from over. So sometimes we're communicating in a hostile environment, and we have to acknowledge that. One of the ways in which we have to respond, of course, is by being impeccably honest and transparent, as Laura was saying; I totally agree with that. But we shouldn't be naive and think, oh, if we're just really good boys then everybody's going to love us. No, that'll never happen. There are some people out there who are going to attack us for the work we do, because it is affecting their worldviews, and sometimes we have to respond accordingly. We cannot bring a feather duster to a machine gun fight; we have to be a little bit more aware of what's going on out there. Yeah. So, taking that, and incorporating this other question that I really like, for Laura: it's that idea of risking your reputation to engage in these debates, especially in this really hostile environment. Anna's question specifically was: do you think researchers need to be worried about reputational risks when they become more active advocates of value-based topics that are associated with their research? Thank you, Anna, for the question, because it actually allows me to open up a slightly different subject that is extremely relevant to your particular question. In my introduction I made reference to the fact that the European Commission's Joint Research Centre is a knowledge brokerage organisation, so we're working at the interface between science and policy. And it's very clear that knowledge brokerage is a shared responsibility. 
It's the idea that the knowledge and the science are indeed being brokered between different parties, and this requires an entirely different sort of skill set. And it's a shared responsibility. So rather than individual experts feeling that all of the onus and responsibility lies on their shoulders, the question becomes: how can we actually form new types of teams with these desired skills, work more closely at the policy interface, and ensure greater understanding at earlier stages of the process? That is certainly something that I would very strongly advocate for and encourage people to look into, that's for sure. We have published some important work on skills, for people who might be interested in looking into that further, and we'll be sure to share those links with everyone at the end of the session. So I hope that partially answers Anna's question with regards to finding ways of not shouldering all of the reputational burden yourself, and instead looking more broadly and understanding what skills you can bring to a knowledge brokerage table. Perhaps it is in synthesizing research; perhaps you're fantastic at managing expert communities; or, in the role you have, you're engaging with stakeholders. Finding colleagues who can bring other complementary skills to the table and working together collectively on this is going to be better than placing all the onus on single individuals.

Great. Thank you. I'm kind of interested, because I'm thinking about how that diversity of teams approaching these issues and addressing these kinds of communication challenges would interact with something that a lot of people are asking about in the questions, which is the role of social media.
Obviously we're all very familiar with the fact that social media can be very influential, particularly in a policy environment, nowadays even more than usual. So I'm going to pick up on a question that Claudia asks, which may be quite broad here: thinking about how you incorporate that kind of professionalism and recognition of diverse teams into addressing these communication challenges, how do you address the role that social media plays, and is there any possibility that some kind of regulation might be applied to social media to assist this process?

Okay, well, those are very good questions. I mean, first of all, social media is very much a double-edged sword, and yes, there's a lot of horrific stuff out there on Twitter and Facebook and you name it. Climate scientists have been under attack, and there's misogyny everywhere and extremism and all of these horrible things; that is certainly true and it's happening out there. But equally, I think social media has also given a voice to scientists, specifically climate scientists, who are active on Twitter, for example, or on blogs, and who are actually able to communicate to a very large number of people with instant updates on new studies. I mean, I've seen some fantastic information on Twitter, not in the 280 characters themselves but via the link to the peer-reviewed paper that I can then download and read. So it's a very double-edged kind of thing. What I find very interesting is how social media, in particular Facebook and YouTube, have unfolded over the last month or so, because all of a sudden we have this thing called COVID. And, oh boy, it's serious. And not only is it serious, but we also need evidence and experts and science, and actually we can't really handle these conspiracy theorists. And you know what happened?
Well, Facebook is flagging misinformation as being misinformation, and according to Mark Zuckerberg that's cut down access by 95% through the overlay messages that they put up. Facebook has removed videos that were producing this nonsense about 5G being a cause of COVID, and all of a sudden, bang, we learn that the social media platforms actually have control over what's going on, and they can get rid of misinformation and conspiracy theories about COVID. I'm saying that without drawing too much of a conclusion from it, other than to say: well, they apparently can do stuff. We now know that, so the cat is out of the bag. And now we've got to have a conversation about what that means for the broader context: what does it mean for climate science, what does it mean for vaccination and other issues where the science is totally clear and there is organized, politically motivated denial? I think that's going to be a very interesting conversation to have over the next little while, and my best guess is that it will lead to consequences in terms of regulation. Yes, I think there is political space for that, but it is also a double-edged sword, because if you look at Hungary and what's happening there now, where they say, oh, you know, we just ban this information and you go to jail if you spread it, well, whoa, wait a minute, who's determining that? The Hungarian government? That scares me more than climate deniers who are out there on Twitter spouting conspiracy theories. So we have to be super careful about how that's being done. And the trick, the intellectual challenge, is to find ways in which we can do that without impinging on freedom of speech.

Great. Thank you. Laura, did you want to jump in?

No, I think that covered it really well.
So I do have a follow-up question that links to that, for Laura. One of the things that people are asking a lot, in addition to that, is this idea that the public is not necessarily good at recognizing false information. Are there ways that we can support the public in recognizing what is false information, particularly when it's in a political environment? That's coming from Guerneth.

That's a really great question, so thanks to Guerneth for that. There's obviously a political dimension to the work that we are currently doing: we're wanting to understand how all these different factors have the potential to influence the political decision-making process. But of course that can also be influencing, you know, the voters on the street as well. So obviously we're also interested to know whether there's any potential misinformation directed at citizens too, and how that could potentially influence their political views. So with that in mind, we're looking at different strategies for how we could help people. There are lots of fantastic initiatives going on. Steve has got an amazing conspiracy theory handbook. There is lots of ongoing work, perhaps not conclusive to date, on the importance of things like inoculation: giving people pre-bunking techniques, which means actually telling people that it is likely that X, Y, and Z will be saying this within the next couple of weeks, so that people can expect it to happen. So these are useful techniques that we're seeing being used, but you're far better off getting the details from Steve on that than from me.
But I think what is going to be important is this future work that we'll be doing on meaningful and ethical communication, which is really going to say: we understand the role of values and identities in in-group/out-group behavior from an identity perspective; we understand the extent of causality in the digital and online environment and how that's affecting behavior; we take all of that extraordinarily valuable learning with us; and we go, okay, so now how do we communicate with people, targeting our messages so that they understand in the best possible way, while at the same time ensuring that we're doing it ethically? Which goes back to the previous questions about, you know, should we be framing for the left or for the right, and is this appropriate? These are all fantastic questions which we are going to be looking at more and more, because sadly there's actually very little science out there on science communication and ethics. So we're really going to bring that to the fore, and we'll come back and share results with you.

Great. Thank you. So I'm going to pick up on an aspect of that and come back to Steven again, because a lot of the questions that we've been getting in the Q&A have been around this idea of uncertainty. I'm particularly picking up on the aspect that Laura mentioned: that you want to provide quite targeted and quite defined statements, so that you'd be able to say, you know, this is happening, and when. But as a scientist you're often dealing with quite substantial uncertainty in your data. So how do scientists communicate those uncertainties in a way that isn't going to undercut their own communications?

Yeah, great question, and I think there are two answers. The first answer is scientific, and I mean climate-scientific, and that is that in climate change, uncertainty isn't your friend.
If you're worried about uncertainty, then the greater the uncertainty, the more you should worry about the future of the planet, because uncertainty is always likely to make things worse rather than better. I wrote two papers, as it turns out, on uncertainty in climate change that explain that, and I think as scientists we should agree on that. Now, in terms of communicating uncertainty, of course we can't deny the uncertainty; that would be dishonest. But equally, we don't have to foreground it either. I have seen climate scientists walk up to journalists and open the conversation by saying there's uncertainty, and then, as an afterthought, kind of like, yeah, okay, but we've known about the greenhouse effect for 130 years, or whatever, 150, I keep losing track. Now, we're not obliged to do that. No medical doctor would walk up to a journalist saying, oh, we know very little about lung cancer and tobacco. No, they wouldn't say that. They would say the opposite: you start with the knowledge. We do know there's a relationship between smoking and lung cancer. We do know that CO2 emissions cause climate change. And then you qualify that as necessary with the uncertainty. I don't think there's anything unethical about a scientist communicating their knowledge, starting with that, and then saying, you know, by the way, that knowledge comes with error bars, and they're about this big or that big, whatever it is, but we do have knowledge and I'm here to communicate that knowledge to you. I don't find anything unethical about that.

Great. Thank you.

Can I just add something to that?

Absolutely, yes, please.

So in addition to that, if the media don't like uncertainty, then I think it's fair to say that policymakers like uncertainty even less. They basically want a very clear-cut answer: is it a yes or a no, please?
I'm perhaps discrediting some of my policymaking colleagues just a little bit with that, but it is a reasonable generalization, and we struggle with managing and dealing with uncertainty. Which is why, for example, I think it's important to also put on the table that, as policymakers within institutions such as the European Commission, we have a responsibility to ensure that our policymaking colleagues are fully trained and actually understand the subtleties of uncertainty. So we are, for example, training our own JRC scientists in-house to be able to explain some of these things better. We're also starting to provide our policymaking colleagues with initiatives and trainings on what they should expect in terms of uncertainties and such like. And as this is a very interesting area for many member states, we're starting to grow what we're calling the evidence-informed policymaking ecosystem: we are branching out into the member states, trying to create networks amongst all the different member states and encouraging people to do similar things, to generally try to increase and heighten the level of scientific literacy, which is so incredibly important.

Great, thank you. We're coming into the last few minutes of this session now. So what I'd really like to do, if possible, is ask each of you a question centered around advice that you could give, because you both have so much wonderful experience that we can benefit from. So I'm going to ask you these questions, and then if there's anything you want to say to finish us off, we'll come to that at the end of the session. Firstly, I'd like to start with Laura, actually, if that's okay.
So Helena has asked a question that kind of relates to just what you were saying, in terms of the role that scientists take in communicating with and educating non-scientists and other scientists, and the question is about where and how scientists can find the time to understand how to maneuver in this environment. So what advice would you have for a scientist who does want to engage in this, and how do they get involved in these kinds of communications, whilst also recognising the fact that they have to spend 110% of their time on research?

Yep, I'm extraordinarily sympathetic to that question. Was it Hannah, did you say, who asked that question?

Helena, yes.

Helena, sorry. Helena, thank you for that. So I'm going to go back to the point I made earlier on knowledge brokerage, and the fact that we cannot actually create superhumans. I think it's extremely unfair to expect someone to be publishing in high-impact-factor journals while also having editorials, you know, a nice piece written with the science editor in the Financial Times this week and another one in National Geographic next week, whilst at the same time having the equivalent of a direct line where you can pick up and speak to any minister you want. I mean, that is just not reality. But there are a lot of people who are extremely talented at doing parts of this. And that's why I think there's a bit of stick and carrot required in all of this, in terms of science funding, in terms of looking at how we change the nature of our relationships, and how we look at the co-creation of projects together with local communities and together with policymakers. I'm not suggesting that this is policy informing science, by any stretch of the imagination; scientific integrity always has to be the cornerstone of everything.
But what we need to do is look at how we can share the burden of this and ensure that the best-placed people are making an optimal contribution. Perhaps, if you are able to get wonderful editorial pieces into National Geographic, you shouldn't also be expected to be publishing in high-impact-factor journals. But equally, that then demands a degree of respect from colleagues: because you're not publishing in those high-impact journals, the fact that you're disseminating in National Geographic is not lesser, it's just different. So this is going to require quite a shake-up in how we're thinking about and approaching these things, holistically and together.

Brilliant. Thank you. And then, Steven, really quickly, right at the end, I just had a question to do with addressing misinformation and public myths in climate science, particularly from Simon. So do you have a couple of really quick tips for people who do want to engage in these kinds of debates, and how they can do that effectively?

First of all, it depends; you've got to talk to your local colleagues. You've got to make sure that you're in an environment where that is being encouraged and where people are sympathetic to it. So that's the first thing: talk to your colleagues, your head of school, your dean, your vice chancellor. In fact, that's what I did when I started out, and I was very lucky in that, because I inoculated them against some of the backlash that I've experienced from climate deniers. And that is generally a good thing, not just within your institution but in public as well. Inoculate people, inoculate the public against what they might experience in the future, because if you get to them first, then that is half the battle. You're much better off saying ahead of time: oh, temperatures haven't gone up in two years, guess what?
Somebody's going to say, oh, there's a pause, global warming has stopped. Now, if you can say that ahead of time and say, you know, this is ridiculous, there's random variation, there's natural variability, blah, blah, then you can get on the front foot and meet these challenges before they get blown out of proportion.

Great. So I'll close today by saying thank you both so much. It's been really, really wonderful to speak to you, and we hope to see you all again very soon. Thank you.