everybody. It's finally Friday. It's been a long week. Welcome back to OpenShift Commons. Today, as we do on Fridays, we have talks about transformation, so today is the hashtag Transformation Friday conversation. We have with us Fabio Pereira, who is the author of Digital Nudge and one of our colleagues at Red Hat. This is really quite an interesting concept. I'm going to let Fabio introduce himself and introduce the topic, and then we're going to have a conversation with myself, Andrew Clay Shafer, otherwise known as Little Idea on Twitter, and anyone else who's joining in from the stratosphere can join us as well and ask questions. So we can get an introduction to this topic and get nudged. Fabio, take it away. Thank you so much, Diane. And thanks, Andrew, for the invitation to speak to this group. I'm really flattered to be here. I would really like to understand what your take is on the Digital Nudge concept by the end of this, so let's definitely have a conversation about it at the end. I'm a Red Hatter. I work in Latin America, where I lead the Open Innovation Labs area, and now we are rebranding ourselves to what we are calling Transformation Services, which is the area inside the services organization that deals with transformation. So we'll talk a lot about transformation here and what we mean by transformation. There is one word that I really like about transformation, which is behavior, because transformation has a lot to do with changing behavior, right? Changing the way we behave, changing habits, and changing the way we make decisions. And that's why I am connecting this with the Digital Nudge movement and book, because this is all about influencing how people make decisions. We get influenced all the time to make decisions and to do things, and sometimes we don't even know that we are doing them. And I actually want to start with an experiment.
I'm going to ask everyone who joined us to respond very quickly in chat. This is an experiment called the bat and ball experiment. If you have already done it before, please do not answer, because there is something about it that you will already know. We really want you to think quickly and respond with the first thing that comes to your mind. So here is the question: a bat and a ball together cost $1.10. The bat is $1 more expensive than the ball. How much is the ball? Please type in chat how much you think the ball is. Because I'm sharing my full screen I can't see the chat, so Diane could probably read out some of the responses as they come in. The quick response is $0.10. The lizard brain says $0.10. Exactly. Usually the first thing that comes to our minds is $0.10. But if you think a bit more carefully: if the ball were $0.10 and the bat is $1 more expensive, the bat would be $1.10 and the total would actually be $1.20. So how come we make this mistake? We are rational people. We are smart people. And don't worry if you made that mistake; about half of MIT students make the same mistake, and so do Harvard students. The right response is $0.05. When I ask this question, I usually get 10% or 20% of people saying $0.05, and I like to talk to the people who responded $0.05, because that's either someone who does that calculation very quickly in their brain or someone who has seen it before. So the ball is $0.05, the bat is $1.05, and the total is $1.10. We make about 35,000 decisions every single day. That's an average number, of course; it's a bit more for some people and a bit less for others. And it turns out that over 90% of those decisions, we don't make them consciously.
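The arithmetic behind the correct answer can be sketched in a few lines (a minimal illustration, not part of the talk itself):

```python
# Bat-and-ball puzzle: ball + bat = 1.10 and bat = ball + 1.00.
# Substituting: ball + (ball + 1.00) = 1.10  =>  2 * ball = 0.10.
total = 1.10
difference = 1.00

ball = (total - difference) / 2   # 0.05 -- the slow-brain answer
bat = ball + difference           # 1.05

# The fast-brain answer of 0.10 fails the check: 0.10 + 1.10 = 1.20, not 1.10.
```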
We make them using a separate part of our brain, the fast brain. There is a Nobel Prize winner called Daniel Kahneman, who came up with the concept of thinking fast and slow, described in his book Thinking, Fast and Slow. It's one of the most advanced and most recent ways of explaining our thinking, and neuroscience actually uses it a lot. It's almost as if the brain were hardware with two applications running on it. One application runs very fast, but it makes more mistakes: the problem with the fast brain is that it creates shortcuts, and those shortcuts lead to a lot of mistakes. The slow brain spends more cognitive energy, so we spend more brain energy with it, but it's more accurate and makes better decisions. In the 10-cents experiment, we used our fast brain and made a very quick mistake because of those shortcuts. To think about another shortcut: I've lived in Australia for eight years. I'm originally Brazilian, but I wanted you to type in chat the first thing that comes to your mind when you think about Australia. Let's give it 30 seconds or so to see what comes up. And if you could also help me there, Diane, with what people are typing in. Okay, we got a lot of koalas. I don't know what it is this morning, but koalas. So there you go: definitely the shortcut we use when we think about Australia. I get a lot of koalas and kangaroos, and when I moved there in 2008, that's the first thing that came to my mind as well. But other than koalas and kangaroos, there's a lot more to Australia. If we have any Australians here (probably not, because of the time zone, and I don't even know if this community is global or local). What happens is that a country is a complex system, right? So in order to simplify a complex system, we create shortcuts in our brain, and those shortcuts are stereotypes.
And a stereotype is a cognitive bias: we oversimplify a complex system into something very simple. We say Australia, a country, and think koalas. That's very common, right? But being conscious of the fact that we do that allows us to do it less and to actually be aware of it. When you are doing this, be conscious of it and be aware that you are oversimplifying a complex system. The same thing happens with a company. If you think Red Hat, you are sometimes oversimplifying it, and Red Hat is a complex system; a group of 14,000 people is definitely a complex system. The same thing happened when I talked about Brazil. When I was in Australia and said that I was Brazilian, the first thing that came to people's minds was soccer and beaches. People thought that I played soccer, and I'm actually very bad at soccer. People would invite me to go and play soccer with them, and I'm terrible at it. Another example is a name. If you think about the name Pelé, the first thing that comes to mind is the soccer player, but it's not necessarily him; someone else could be named Pelé. In fact, my name is Fabio, and in Australia I had a lot of challenges with that name, actually fun challenges, because in Australia nobody names their kids Fabio. There is one guy who is very famous, on the covers of magazines and books and everything, and every time people think about the name Fabio, that's who comes to mind. So I had a lot of trouble, because when I said my name is Fabio, people thought that I was joking. I tried to buy a wig, and that's the closest I got to being that Fabio. But that's also a stereotype, a connection that we make in our brains. And I want to mention another book that was really inspirational to me: Predictably Irrational by Dan Ariely.
So Dan is a scientist, and he showed scientifically that we are irrational and that we make irrational decisions in a predictable way. The book describes the experiments he ran to prove that. I started studying this about 10 years ago, in 2010, and I got really excited about the concept of being predictably irrational, because I thought that I was rational. I thought I made rational decisions and that my decisions were always the best decisions, and I started realizing that I wasn't. So I started studying cognitive psychology, a subarea of psychology that, together with behavioral economics, helps us understand choices and decisions in general. Those are the areas that I really study. One way that Dan Ariely explains our irrationality is by comparing our cognitive biases with optical illusions. When you look at the blue circle on the left, it looks slightly bigger than the blue circle on the right, because what's around it influences the way we see things. We all know that we have optical illusions and that we don't see things exactly as they are; we understand that the image that gets to our brain has some distortions. Square A and square B here are actually the same shade of gray. When I first looked at that, I thought, oh no, my brain is definitely broken. I actually cut out parts of A and B to check, and it really is the same shade of gray. It turns out that what's around A is light, so A looks darker, and what's around B is in shadow, so B looks lighter. That is one of the most famous optical illusions we have. And the way Dan Ariely explains our cognitive biases is by showing some decision making. There's a number that in Denmark is 4% and in Austria is 99%.
I asked about that in a talk once: what do you think this number is? And people said it's the number of Austrians, which could be very clever. But it's actually the percentage of organ donors. It turns out that in Austria the default is that you are an organ donor and you have to opt out, while Denmark has an opt-in system: you have to go and say, I want to be an organ donor. This is a cognitive bias called the status quo bias. We as humans tend to stick with whatever we have instead of making an effort to do something else. So a decision that's already made for you is very powerful. If you want to influence someone to make a decision, make it a zero-effort decision, or what I call a do-nothing decision: the decision that you make by doing nothing. And this is just one of the cognitive biases that has been mapped. This codex is actually a map of the over 180 cognitive biases that have been mapped by behavioral economics and cognitive psychology. I got really interested in that and started studying: what is the status quo bias? What is scarcity? What is loss aversion? What are all these things in my brain? I kind of became that little boy from The Sixth Sense who says he can see dead people: I started seeing biases everywhere. I would go to Google and type "where to eat in San Francisco," and there are 342 million results calculated in 0.62 seconds. And I started thinking, where do I click? It turns out that 91.5% of people click on a result from the first page of Google. There's a joke online that says that if you kill someone and you want to hide the body, you should put it on page two of the Google results, because no one goes there.
So I was thinking: if I click on a result from the first page, did I make that decision, or did Google make that decision for me? Putting a result on the first page of Google is like a default; it's using the status quo bias, and it's nudging me, influencing me to make that call. There are a few other biases I want to highlight. One is the Dunning-Kruger effect: our level of confidence about some knowledge is much higher when we know very little about it. When someone knows almost nothing about a topic, they actually think they know a lot, and when we start to learn something about the topic, we realize, now I know that I don't know anything about this, and the level of confidence drops a lot. That's very interesting, right? When we know that, we can deal with people in a better way, because if we're dealing with someone who thinks they are an expert in something, it could be that they are at the top left of this graph and actually know nothing, or almost nothing, about it. The IKEA effect is another bias that I like and use a lot. The IKEA effect says that we exaggerate the value of, and get attached to, something we helped to build. If you build something, you get attached to it and feel like it's yours, just like when you go to IKEA, buy some furniture, and assemble it yourself. So if you want someone to get engaged with an idea or with something you're doing, get them to build it with you, because if you come with the thing already pre-built, they will not feel as much of that IKEA effect. The IKEA effect was covered by the Harvard Business Review, and Dan Ariely is one of the people who officially documented the effect.
We also see this when we're buying things. If you're buying an iPhone 11, for example, and you look at the ad one way, you see: iPhone 11, $699, the most expensive iPhone I could buy. But if you look at it next to the rest of the lineup, you feel like it's actually mid-priced: $699 is not the most expensive one, because there is an iPhone 11 Pro Max for $1,099. That is a way to influence people to look at a price and not think it's the most expensive one. This bias is called framing: you frame a decision in a certain way so that people see it differently. There's an amazing show on Netflix that I watched and was surprised by. It's called The Push, from Derren Brown. The challenge is: do you think we can manipulate and influence someone up to the point where they would push someone off a building? I won't spoil what happens at the end, but I would really recommend watching it, because you also learn a lot about how people get influenced. And to bring a bit of my own context: I helped the Australian government with what was called the Digital Transformation Agency, an agency set up by the Prime Minister, where we had a whole innovation hub to help build simple, clear, and fast public services using digital. We used a lot of these concepts at the Digital Transformation Agency. That was back in 2015. Then in 2018 I joined Red Hat to launch the Open Innovation Labs area in Latin America. But one thing I want to bring back from the DTA work is something called the Digital Service Standard. It was a set of 13 principles that everyone in government was using to build digital services. Standard number one said we have to understand user needs, and for us here at Red Hat, I would say we have to understand our clients' needs, because we need to build something.
We need to bring clients things that are related to their needs. I believe that anything we build should be solving someone's needs. During that time, I also wrote an article called "Five cognitive biases to avoid during discovery." I'll explain just one of them instead of going through all five: confirmation bias. Confirmation bias means that during research, when we're trying to understand what someone needs, we might ignore their pain points because they don't fit with existing assumptions that we have. I'll give you an example. If we have OpenShift and we go to a client, the only needs we'll think about are needs for OpenShift. We'll think, oh yeah, this client needs OpenShift. We might be confirming a bias we already have, a preexisting assumption, as opposed to going to a client with a blank mindset and thinking, let me understand this client's needs regardless of what I'm trying to bring to them. One of the things I was thinking about: among the 35,000 decisions we make every day, have you ever wondered how many are digital decisions? How many are things that you delegate to an application, or to Google, or to Netflix, or to anything digital? And how many are augmented decisions that you make using the digital world? I was signing up for Netflix a while ago, and there was a checkbox that said, "Please do not email me Netflix special offers." And I was thinking, hang on: if I don't do anything, I will receive the offers. This is the default bias, the do-nothing decision applied to a form. By not checking the box, the do-nothing decision, I am actually deciding to receive the offers; in order to opt out, I have to check the box.
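That flipped default can be sketched in a few lines (a hypothetical model for illustration, not Netflix's actual code; the function names are made up):

```python
# Opt-out form (like the Netflix example): the box means "do NOT email me".
# Leaving it unchecked, the do-nothing decision, means you receive offers.
def receives_offers_opt_out(box_checked: bool) -> bool:
    return not box_checked

# Opt-in form (GDPR-style explicit consent): the box means "please email me".
# Doing nothing now means you do NOT receive offers.
def receives_offers_opt_in(box_checked: bool) -> bool:
    return box_checked

# Same user behavior (do nothing, box stays unchecked), opposite outcomes:
print(receives_offers_opt_out(False))  # True  -> offers arrive
print(receives_offers_opt_in(False))   # False -> no offers
```

The user's effort is identical in both forms; only the decision architect's choice of default changes the outcome.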
And I was watching Netflix and thinking: when we watch a series with the autoplay functionality, say episode one of Black Mirror, which is a TV show I love, you don't have to do anything to watch episode two. The autoplay feature has something embedded in the do-nothing: if you do nothing, you keep watching. So think about all the decisions you make without doing anything. We make a lot of decisions because others make those decisions for us. Amazon recently patented something called anticipatory shipping, which is an algorithm and a whole business model around shipping things to us before we buy them. They understand our behavior online, and the algorithms, machine learning, artificial intelligence, decide that we want to buy a certain product, and that product arrives. That is designing a digital customer experience that sends people something without them even clicking the button to buy. Back in the day, when I was going through this about 10 years ago, I used to work with UX people, and I still do; we still have user research and user experience people. And I thought, look, UX people are very powerful, because they create the environments where we make decisions online. Then I realized that we are all digital decision architects. We are all creating environments for people to make decisions. If you send an email, sometimes you want someone to make a decision. So at the end of the day, we are all digital decision architects. And to bring some examples of companies that use this concept: Lemonade is an insurance startup in San Francisco that was given $13 million to reinvent P2P insurance. That was back in 2015.
The latest I heard about Lemonade is that they were valued at $300 million, so they went from $13 million in funding to a $300 million valuation. And one of the things they did, if you want to know, is create the shortest time in which someone could get a claim approved: there was a claim approved in three seconds. Sorry, did anyone say something? Can we go back to that slide, the Lemonade one? Yeah, so that's their funding amount. They're worth way more than that now; they should be worth way more than that, so I should correct that, it's way more than $300 million. But even then, that's the funding amount, so the valuation would be some multiple of it. I don't know the exact details, but it's a big number. Well, my message here is that it's been working out: since 2015 they've been creating fast claims, three-second claims. But one interesting thing I want to highlight is that Lemonade has inside of it what's called a behavioral lab, a way to understand people's behaviors. And that's why I want to go back to the word behavior. Remember Dan Ariely, the guy I mentioned whose book had impacted me? Dan was hired by Lemonade as the CBO, the chief behavioral officer. When I saw that, I started looking around, and it turns out this is a bit of a theme in the market: companies have been hiring people with the title of chief behavioral officer. I became a huge fan of Dan Ariely. I actually approached him, and we've been communicating since 2015. I met him at a conference, and he's been giving me tips every now and then to go in one direction or another. The book, actually, has been one of the tips that he gave me.
I was going in one direction, and he nudged me in another direction that turned out to be the book. I also want to bring up another very inspirational name: Richard Thaler. Thaler and Cass Sunstein wrote a book called Nudge, and Thaler won a Nobel Prize in 2017, three years ago, for his contributions bridging economics and psychology. He merged these two worlds: he created a way for us to better understand economics and decision making through psychology, and that is what earned him the Nobel Prize. And what is a nudge, his concept? Nudges are small and powerful interventions in the environments where we make decisions. A digital nudge is making that intervention using the digital world: a digital device, an application, anything digital. That's when the digital nudge concept came up, a few years ago, and it became the book. People ask me, who is the audience? It has two audiences. One is the digital decision architects, anyone who uses digital environments to nudge people into making better decisions. The other is the digital citizens, which is all of us: we all live in the digital world, we all have access to the internet, because here we are on BlueJeans having this community conversation, and pretty much half of the world's population has access to the digital world. They are digital citizens. Then the book came out, Digital Nudge. I've had a few events around the book already. I officially launched it last year in Denmark, at the GOTO conference in Copenhagen. It was amazing. I actually managed to put the book in the hands of Steve Wozniak, Apple's co-founder, and I was so amazed when I gave him the book.
He looked at the book and said, wow, I should read this, and I hope he read it. If you want to think about which other books have something to do with this one, here is a list: Homo Deus, The Fourth Industrial Revolution, Hooked, The Power of Habit. All these books are correlated, and the Digital Nudge book has a lot to do with them as well. Some people ask me, Fabio, if I know about these things, can I manipulate people? And I say yes, because nudging is a tool, and it's a very powerful tool. Like a knife: if you have a very sharp knife, you can use it to harm someone, or you can use it to cut, let's say, sashimi in a very nice way. It is a powerful tool, and it can go bad. Nudges can definitely go bad if you use them in the wrong way, because you can influence people in the wrong direction. I like to say that there is a Spider-Man effect with nudges: with great power comes great responsibility. You have the responsibility to nudge people in the right direction. But how do I know what the right direction is? I like Nir Eyal's definition of the two types of influence. One of them is persuasion: influencing people to do what they want and what they need. And coercion is influencing people to do what they don't want and don't need. There is a good website called Dark Patterns that has mapped a lot of these dark nudges, things people do that we don't really like. One of them, for instance, is called "sneak into basket": let's say you buy a tablet, and automatically, by default, there is a case for your tablet in your basket. Why is it that we accept that online?
If you were shopping at a supermarket with your trolley and someone stuck something into it, would we accept that? No, we wouldn't. Unless it's a bottle of vodka, because I like vodka, and that's actually a good vodka. But you wouldn't want that: you would definitely notice that someone had put it in your basket, and you would not accept it. So I decided to start a movement that I call the digital nudge for good movement, which is a mapping of all the ways that we influence people for good digitally. I started asking questions: during conferences I asked people, can you tweet, can you post on social media with the hashtag digital nudge, anything that you feel is good? So I started keeping track of those things. One of them is a device that a friend of mine actually had. He had diabetes, and he had a device that helped him decide when to take his insulin, automatically, through an application on his phone connected to a device that was constantly measuring his glucose levels. He was nudged to make a decision, and he was offloading his brain. Diabetes is a huge issue right now; one in 12 people has diabetes. There's also a company called Ayuda Heuristics, recently renamed Quintac, that uses artificial intelligence to help people make better decisions about diabetes. Another digital nudge for good is a cap developed by Ford called SafeCap. It has an accelerometer and the intelligence to understand, from the head movement of a truck driver, when the driver is about to fall asleep: they worked out the head movement pattern in order to predict sleep, which is very interesting. They nudge the driver, and the outcome is fewer accidents.
So I've been thinking, at Red Hat, about the sentence "our technology helps people with..." If we were to complete that sentence, we should definitely think in those terms: what is it that we help people with through our technology? Because we are definitely nudging people to do things in a certain way. And to finish, I want to talk about a framework published by the University of Liechtenstein, by Markus, Christoph, and Jan. They published an article called "Digital Nudging," which came out after the book did. I got in touch with them, we are well connected, and it's a five-step framework for creating a nudge to influence someone. I'm not going to go through all of the steps now, but if you want, I can share it with you. It's a very well-defined framework: five steps to influence people to make decisions using digital nudging. Thank you to the three of them for creating it, because it's amazing. And you might be thinking, okay, Fabio, what do we expect from this now? I think we can see ourselves in two of these boxes, or roles, in this world. We are decision architects: we're helping people make decisions, so we should be nudging for good. And we are also digital citizens, so we should be raising our digital consciousness in order to understand how we get influenced, so that we don't get influenced as much. For reflection, one question: what are we doing to influence the irrational brain, both internally and at our clients at Red Hat?
How might we find ways to influence our clients, or anyone we interact with, to drive success through behavioral change? I want to leave you with one call to action: think about one thing that you will do differently from now on, because learning is acting. I don't think learning is just consuming more and more content; content is like drinking water. If you just drink this content and don't do anything about it, I don't think you've learned anything, because learning is about changing behavior. So to finish: I don't think learning is about drinking more water; learning is about taking a piss. Thank you. Well, thank you, because I think that's a really nice synopsis of the book and of your point of view on all of this. One of the things you talked about was persuasion versus coercion, and I was thinking about this when you showed the example of the opt-in and opt-out, where you had to opt out in order to actually opt in, or whatever that little thing was. Right now, everybody who's signing up for virtual events and things of that nature: we put up registration pages, and there's a lot of government regulation around them, like GDPR, where you actually have to explicitly opt in. So besides deciding to do good, there's also a lot of regulation now being imposed, maybe not on everybody, but on a lot of people, about that opt-in and opt-out stuff. I wonder if you could talk a little bit about that, maybe how, because you worked with the Australian government, that came into play at all. I'm very curious, because it's affecting how we do things in this new virtual world. Yes, definitely. So GDPR and all the data regulation rules, in fact in Brazil there is one now called LGPD, which is pretty much GDPR for Brazil. It has
something called explicit consent. Explicit consent means that you cannot assume that someone is giving you consent to share their data or to receive things, so there is a lot of regulation around consent. The opt-out version is an implicit-consent version of that. What came out of GDPR and these other rules is that the default bias cannot be exploited anymore for certain things; it is very much regulated now, especially anything around who can have access to your data, which now needs explicit consent. I see that some governments, like the UK government and the American government, work with behavioral units, and they all give the units different names. The UK one is called the Nudge Unit, and governments are very much influenced by these concepts of behavioral economics and nudging. The Nudge Unit in the UK is very powerful in helping the government create legislation and understand what is and isn't possible from a behavioral economics perspective. Obama, in the US, had a behavioral economics unit that helped him a lot; there are many articles published by that unit, which did a lot of advisory work for Obama at that time. The Australian government also has one. These units are usually referred to as Nudge units because the UK one is a bit of a reference, and there are definitely units of government that inform and advise the government and change policy in ways that nudge people in a direction. And that's where morals and ethics come into place: it has to be based on the morals and ethics of where governments think the population should be going, and that's very controversial, because ethics and morals are not black and white; we can't just say it's zero or one. Someone could think that being an organ donor by default is good or bad. So it's very controversial where we should be nudging people towards, but yes,
there are Nudge Units, and GDPR is very much inspired by these concepts.

The other thing you talked about that I thought was quite interesting was the confidence level, the Dunning-Kruger example, and I love the chart with the hundreds of different biases. I've got to frame that and put it on the wall, huge poster size, because that was really an eye-opener for me, since we usually only think about the top five you were talking about. I can remember, a long time ago (and I apologize, I don't know who to attribute it to), a quote that went around and still gets heard: it's not what you don't know, it's what you don't know you don't know. That really is the scary thing when we're making decisions. I was totally confident I could fix the leaking water main in my backyard, until I dug up the pipe, looked at it, and went, holy crap. That was this past weekend, so I called in an expert; it was not my area of expertise, but I'd thought, how hard can this be? Duct tape. When we're doing things like usability, user design, and interfaces, especially in tech, we have to take those biases into account and make sure we're not perverting them for our own ends.

The other one that I really liked, and then I'll stop and let Andrew weigh in, was the IKEA effect, and not for the reason you'd think: putting my furniture together makes me like my furniture more, because I'm the builder of it, so I have an investment in it and I've got confidence. I think another example you gave in another talk was around origami. But I was thinking about that in terms of talks I've been giving about end users getting more involved in contributing to open source projects. In some ways we are encouraging that: look at the CNCF and all the projects, people donating code, like Lyft donating Envoy, Jaeger coming from Uber, Spotify giving Backstage, all these wonderful projects, plus people from enterprises like Amadeus contributing back. The effect that's having may actually be the IKEA effect: people are becoming more attached to Kubernetes and the cloud-native ecosystem, and more invested in it, even if maybe it's not perfect anymore, like some of those pieces of furniture I've put together with a drill instead of the pre-made IKEA holes. That gave me real pause, because I have to give a talk on that in a couple of days, and I want to see how I can incorporate it. I think that's actually what's happening: it's not that the software is getting worse, it's that we're seeing more end users collaborating with vendors like Red Hat and IBM and VMware and Dell in the upstream, which means they're getting more confidence in using the product and more understanding, but they also like it better. I love that insight.

Yeah, I had never had that insight about open source: if I wrote one line of code and my pull request got accepted, I feel like that product is a bit mine, like there's a piece of me inside of it, so I feel like I own it, I love it.

Yeah, that's amazing. It's like a dopamine hit for open source contribution, and if you get a streak, you're hooked. So, Andrew, what have you got to add here?

I'm still reacting to the last analogy you used in your call to action; I think you could do a lot better than that, but no need to dwell on it right now. I think this is super common, and in the history of
DevOps there have been a number of interesting talks where people cataloged cognitive biases, particularly around incident management and some of these other areas where it gets interesting. This IKEA effect is also, the way I always say it, that we attach our ego to our artwork: we made a thing, so the thing is us too. Another thing I keep thinking about, watching the actual evolution of this industry: you mentioned a bunch of these books, and I haven't read all of them, but there's a documentary out on Netflix about the social media stuff that's been going on, and we've created huge fortunes focused specifically on exploiting these anti-patterns, or dark nudges, or however you want to name the dark side of this. At least for the moment, I'm not sure we stop anytime soon, because there's just such a high reward in exploiting those behaviors.

I think in that movie you're talking about, there's a scene with three clones of the same guy, and every time someone does something on the internet they go, let's push this to him, let's push that to them. You have that image of what's going on when you're on Facebook or Amazon or Netflix, with what Netflix suggests to me. These nudges are very lucrative, and then there's the question of how you weigh in with companies to do what's morally right.

Influencing people for bad has always been a problem, regardless of which channel was used. I was talking to a friend the other day who was studying the Second World War, and he said that most of the influencing in the Second World War was done by radio. Radio is a technology, and there is a way to influence people to do something by radio. What scares us right now is that the digital world has given us access to anyone: any of the seven billion people, or even half of that, the 3.2 or 3.5 billion who have access to the internet right now. That is what has made the situation exponential; that is the big difference. But you're right: influencing people for bad has been a problem since we've existed.

Well, the example you just gave about radio is, in some real sense, an extension of media going back to the beginning of time, and there's the pretty famous idiom that the pen is mightier than the sword. Being able to communicate ideas, to get people to take actions, is pretty core to the social animal that humans are. What we have now, in this fabric we've created with the technology, is just a way to magnify and multiply that through every sense. We're able to exploit vision and sound (we're literally talking through the internet to each other right now, exchanging ideas constantly), and we're pretty close to being able to do smell too. There's another thing you touch on in the book, Fabio, about VR and augmented reality, and how these technologies (we go from the pen to radio to all of these) influence what we see and how we interact with the world. It's only going to get more exponential, I think, until we do things like turn Facebook off, log out, take the time to read a hard book and be influenced in a way that's not a nudge but a significant, life-changing thing, like reading. I don't see us doing it; we like the dopamine too much. I'm not going to stop.

So another thing you said early in the talk, which I got fixated on and have been thinking about a lot in other contexts as well, is this idea that you want to become
conscious of this, right? You want to become conscious, and that's sort of a paradox, because some of the things that create these biases and this irrationality, or these shortcuts, however you want to frame it, are deeply subconscious. There are definitely things we can do, there are mindfulness practices, and we can be thoughtful, but at some point you literally have to exploit, in both senses of that word, those shortcuts to be able to participate in reality.

Yeah, you're right. In fact, Richard Thaler mentions in his book Nudge that there is no way for us to be fully conscious all the time, because we do not have enough cognitive capacity to make all those decisions consciously. When I talk about expanding our consciousness (in fact, the book in English is called Digital Nudge, but in Portuguese it's called Digital Consciousness, because in Portuguese I focus more on that side of what I want to do, which is raising people's consciousness), what we can do is minimize the negative impact of being manipulated by the nudges we get, by being conscious of them. I'll give you a real example. The other day I was on booking.com, booking a room at a hotel, and I saw a message saying this was the last room available at this hotel, and another message saying there were 15 other people looking at this room at that very moment. So in my mind I hit my consciousness button and went: hang on, they are trying to use my scarcity bias to influence me to click the buy button right now. That's what I mean about being more conscious: knowing and realizing when people are pushing those buttons in us, and actually triggering the slow brain to make a better decision instead of making all those decisions unconsciously.

That's interesting too, because when I see that on booking.com or anyplace else, I have a trust issue: do I trust that vendor, do I trust that website? I totally agree, because if we didn't raise our consciousness and awareness of these biases and how we're being triggered, the t-shirt would read "resistance is futile, join the Borg," right? But the other piece of it is: do you trust the people you're interacting with online? I know from Red Hat's side, we always tout ourselves as a trusted vendor for our partners and end users and customers, but how does trust work into this dynamic, from your perspective?

The way I see it: I like people nudging me if I trust them.

A really quick comment on this particular example, and I won't name any names, and it's been a while since this happened. This little thing trying to create scarcity: I was looking at the source code for a website like this, and there was actually nothing but a random number generated in the client-side JavaScript to say that this many people were looking at it.

Oh, my trust level went down really fast right there.

Yeah, I love that, because the same way GDPR came up with explicit consent, I wish we came up with some sort of stamp, the same way we have one for food. I don't know about other countries, but we have an organic food stamp, which is a regulatory thing to say that food is organic. I wish that one day we had some sort of stamp for organic content, or an organic digital platform, to say that this thing has been verified and no one is trying to do those dark
patterns, and they're not trying to use a nudge that is not real, because that fake JavaScript number is a fake nudge. It is definitely a nudge for bad.

Which then brings you to the whole concept of fake news. I don't know how many of you watching along saw the debates last night, but one of the things Dan Ariely talked about in some book or somewhere was swearing in when you go to court: you say "I swear to tell the truth, the whole truth, and nothing but the truth" at the beginning of the trial. His example was that taxes flip it: when you do your taxes, you sign off at the end, verifying at the end, so you have this propensity to make little lies all the way through, whereas if you make the commitment up front that you're not going to lie, you're less likely to. So maybe in the next round of presidential debates, every candidate has to swear up front before they go on stage: not just a COVID test, but swearing on a stack of whatever book means something to you. This whole idea of setting up the interaction from the start is maybe where we go back to UX design, and to an organic certification or trust certification. Those are really hard things to do; even for software to get certified, we have to set up all this testing. But I think it's an interesting concept.

The same way we have penetration tests, and things like OWASP, the OWASP Top Ten list of vulnerabilities we can look for, one of the dreams I have is a global organization that thinks about this in that way and actually prevents it from happening: not only a penetration test for security, but also a test for these kinds of bad nudges and fake nudges and coercions.

So maybe not a World Health Organization, but a world behavioral or trust organization, a WTO or whatever (not WTF), something of that nature out there in the universe would be a great thing. Aspirationally I agree, but I don't see it developing anytime soon.

Yeah, I don't see it either, but I feel like I've been planting small seeds that can grow at a time I probably will not even see.

And I think that's the thing we try to do: we walk companies and communities through digital transformations, making codes of conduct, planting the seeds of awareness, helping people figure out how to raise their consciousness and awareness of their own biases. We see that reflected in the politics of the day globally. And I go back a lot to building trust across the ecosystems, with vendors and partners and end users, and the importance of trust in people allowing themselves to be digitally nudged. Personally, I read a number of the important, Nobel-Prize-winning books you had up there, and I think about health and fitness: getting a nudge on your watch to stand up or to breathe. I have a nudge that reminds me to breathe: take a deep breath, Diane, don't get so stressed out. I have to do that digitally so that I remember. I don't have diabetes; I have, like, stress. The key thing is that I trust Apple and the app to do that with me.

So, we probably don't have time to go all the way through this, but I have this gamified life insurance policy that pays for me to have an Apple Watch, and I get points for doing exercise, I get points for meditation, I get all these
things where, I mean, there's some quantitative model somewhere that says if I do these types of behaviors, they're less likely to have to pay out my life insurance. But I signed up for these nudges, and I get a watch out of it. Pretty cool.

That's amazing. I'm sure that's definitely inspired by some of the behavioral economics in nudge theory.

Well, I feel we're winding down a little bit, which is great. I love the Christmas list of books you put up there earlier. One of the things we've been teasing out with Andrew and the other folks from the GTO office is creating this COVID reading list, or holiday reading list, our wish list of books, and I think you've added at least seven or eight mentions of things people can now read and append to it, so I'm really grateful you shared this with us. I love the concept of nudges and digital nudges. We at Red Hat have so many different products and projects with user interfaces that do these kinds of digital nudges, and it's so important for us to be aware and to continue to build trust with our end users, whether it's the OpenShift console (and I'll put in a plug: there's an OpenShift console customization contest going on now; if you follow me on Twitter, there are a couple of tweets about it), or VR, or CLIs, or augmented realities. How do you create user experiences where people trust you? As we go about our day-to-day jobs and work with our end users and on our products, let's make sure we hold onto these things and raise some consciousness and awareness about all of these biases. Thank you very much. Any last words, Fabio and Andrew?

Let Andrew go last, because there is actually a bias where however something ends is the thing we remember the most. It's called the peak-end rule: that's why, when we see a movie and the ending is not good, we feel like the whole movie sucked. So I'll let Andrew finish, and he can take the blame for it.

Okay, go for it, Andrew.

Memory tests show you remember things at the end and at the beginning better, and you get fuzzier in the middle. Anyway, it's a pleasure to have Fabio come talk to us about this. I really like the thought you put into this, and the work you do as part of the team at Red Hat, so it's fun to get you on and have you explore some of these topics. I haven't read all those books, but now I've got some more things to buy that I probably won't read soon. We'll keep on learning.

All right, thanks very much, Fabio. We'll have you back, definitely.

Thank you.