Hello, hello, hello. Good evening, good afternoon, good morning, wherever you are joining us from. My name is Abhijit Bhaduri and I'm the host of this show called Dreamers and Unicorns. We are into our second season, and I work primarily as a leadership transitions coach. I also help organizations work through their social media strategy. But today I have a fabulous guest here with me. So, you know what makes her so special? Besides, of course, the fact that she's a very close friend of mine. That's always reason number one, but there is more to it. She was actually the first, you know... she was in a content management role at Netflix, and one of the earliest people to do that. And before that she helped Toshiba America build its presence on the internet. And she's written for and appeared in multiple shows for the BBC and NPR, written for Wired. A really accomplished person. So I'm sure you are going to enjoy this conversation. We are going to talk about Kate O'Neill's latest book called A Future So Bright. So bright, so bright. And without further ado... All bright. Yeah, I know. I thought you were going to keep those on for a little bit more, but that's fine. I can't see if I have them on. So that's why it is so bright. But I wanted to just get you to describe your work and, you know, how you got to this place, and talk to me a little bit about your previous books and, you know, the stuff that you've written. Your previous book was Tech Humanist. That's right. Yeah, that became your name. It works out nicely because there are a lot of Kate O'Neills out there. So if people know Tech Humanist, it's a great way to find me on Google. But hey, I want to back up and just say thank you so much for having me on your show and for hosting this conversation, because, you know, yes, we're friends, but I am also a fan of you and your work and I love and respect everything that you do.
It's just a thrill to be here connecting with you and talking to your people too: your followers, your fans, your audience, you know, the folks that are out there. So very happy to see you and talk to the folks out there. So to get to your question, I guess the thing that's interesting about the work that I've been doing, for me, is that it's been following this trajectory that gets increasingly focused on how to solve problems at scale. And for the last several years, the last decade or so, the way I've framed that question in my mind is: how can we help humanity prepare for what by all indications looks to be an increasingly data- and tech-driven future? And I've been doing a lot of keynote speaking around that; I've written books around that, as you know. And what I have found is that increasingly that conversation leads us to, you know, we want to talk about the future of work, we want to talk about intelligent automation and what that's doing to the future of jobs and the workplace. And we want to talk about, you know, AI and algorithmic bias and the potential harms to humanity. We want to have that conversation, but we also end up talking about things like misinformation and disinformation and how that plays into geopolitics, and we get talking about privacy, and we get talking about climate change and the ways that AI and emerging technology can play into resilience and efforts to mitigate climate change. So it's been an interesting trajectory to get to this place where the work is so broad, covering such a wide range of topics, and yet fundamentally it's still about, you know, this very human-centric approach to using technology and using innovation and trying to be resilient and bring the best of what we have to bear on solving problems for the best futures for the most people. And I know that's where, you know, you and I dovetail, because your work is so human-centric, right?
Because I believe that, you know, when we talk about anything to do with technology, as long as we keep the human being at the center, I think we get it right. The moment we leave that out and really focus on just the technology, that's where it goes wrong. So yeah, I mean, I agree that that's one of the things that, you know, connects both of us, because both of us believe that it's really all about serving humanity in some shape or form. And for all the listeners who joined in, thank you very much. And you'll be delighted to know that Kate O'Neill is one of the top 30 marketing influencers you must absolutely follow this year and beyond. I completely follow her, you know. And today we want to talk about your new book called A Future So Bright. Do you have your book handy? Yes. Thank you so much. Yeah, yeah. It's very funny that everyone sort of assumes that the glasses on the cover were inspired by these glasses, and the truth of it is I actually bought these glasses after my designer did this cover. And I was like, oh, I've got to find yellow glasses now. I love it. I would have imagined that, you know, it was these that became your signature. Yeah, it's like an autographed book. Yeah, yeah. But what's really great about these is, here, I'm going to do a little commercial for this brand, because this is a brand called Parafina. And they use recycled materials like rubber and PET bottles and other kinds of plastics, including ocean plastics, and they make these glasses out of those materials. And then they donate 5% of their proceeds to school supplies for kids in Paraguay.
So I just love that they're an example. Like, you know, even wearing these sunglasses in public appearances is kind of bringing the discussion into a meta place, because it's using a prop that's part of a company that's taking a very strategically optimistic approach to aligning the work that they're doing with solving problems at scale. And I just love that. Absolutely. And, you know, more power to companies like that, which are trying to make the world more sustainable for future generations. So I wanted to take a step back and talk to you about your time at Netflix in one of the first content management roles there. Now, which year was this? And, you know, what were you doing there? So this was in the '99, 2000, 2001 sort of timeframe. I wasn't born then. I know. I was a baby when I was working there. It was weird that they hired an infant for the role. I don't know why. No, no, it was actually, you know, several years into my time in Silicon Valley, after I had done my undergrad in Chicago and moved to Silicon Valley to work for Toshiba, as you correctly pronounced it. No one else does. Of course, Americans always pronounce it their own way, but the Japanese and the company themselves pronounce it the way you did. So you got it right. Yeah. And I worked for them for a while, and then a series of startups after that. And I was doing a lot of technical publications and content strategy sort of work, thinking about how that translates between printed documents like user manuals and then online help, but also website content. And a lot of what that was leading me into was thinking about the user experience, even though we didn't really have that terminology as much at that time; it wasn't as common a phrase.
So I ended up being an advocate, in every one of those roles, for the customer or the user, you know, whoever was going to be the person using this software or this tool or whatever the hardware was. And that ended up leading me to Netflix. When I saw Netflix come out, you know, we bought a DVD player and it had an insert in it for the service. So it already existed as a company, and we were using their service. Most people don't even remember it as a DVD service; most people know it as a subscription service, right? You would subscribe, you would get a set number, and they would send you the next thing in your queue. But there was a period before they figured that program out when they were doing a la carte rental, just like a Blockbuster sort of arrangement, where you rent a video, they send it to you, then you send it back, and that's it. So we were using it at that point, when they were still doing that a la carte rental, and I got invited as a customer to be part of an alpha test of this new subscription program. And I felt so lucky, because I was like, this is genius. I think they've just totally cracked something that's really interesting. And once I'd used it for a month or two, I sent them my resume. I was just like, I don't know what you need, but here's what I can do, and I would love to be part of helping you scale this, because I think what you're doing is brilliant. And they created a role for me. So I was the first content manager. I had a team of content producers. We sort of slotted in between the editorial team, which did a lot of reviews and writing content for the site, and the database team, because what we were doing was very structured data. So one of the first projects that I took part in was to recognize that we had the genre allocation all wrong. Like, we were doing one-to-one genres, which any movie fan knows is going to break as soon as you get to romantic comedies, right? You can't have just romance or just comedy.
It can't be either/or; you have to be able to do both. And with any movie, you can imagine there are a lot of situations where you might want to have multiple. Action comedies, right? Like, that's another one. That's more like, you know, one of those oxymorons. They're one of my favorite genres. They've got Ocean's Eleven. It's one of the best action comedies out there. But what was interesting about doing that: we worked together with the database team. My team worked very closely with them to redesign the entire entity relationship, the data structure behind the content, to make sure that it could have multiple genre and subgenre allocations. And when you think about what that has meant for Netflix today, the way that they've been able to be very loose and, you know, the metadata is very tied to these kinds of nuanced algorithmic ideas. Like, you have watched this movie, therefore you might like these others, because it looks like these are based on literature, or because they have a strong female lead, or something like that. And that's a really interesting thing that they've now been able to do, and it builds off of some of that early work. So it's always interesting, with these companies that you watch scale over 20 years, to think about what were the early decisions that were able to be built out, that were able to, you know, expand out into smarter and bigger decisions over time. And that, go ahead, go ahead. No, no, no, I was just saying that it's just such an incredible idea to move from, let's say, single genres to multiple genres, and then move into something which is a lot more intangible. I mean, you know, say "strong female lead" or "romantic comedy set in China." The more descriptive you get, the more likely you will get a really sharp, focused recommendation. Yeah, yeah.
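For readers who want to see the shape of the change Kate is describing, here is a minimal sketch, in Python, of moving from a one-to-one genre field to a many-to-many mapping of titles to genre and descriptor tags. This is not Netflix's actual schema or algorithm; all titles, tags, and function names are illustrative, and the overlap-counting "recommendation" is just the simplest possible demonstration of why richer metadata enables sharper suggestions.

```python
# One-to-one allocation: each title holds exactly one genre.
# This breaks for romantic comedies and action comedies,
# which legitimately belong to two genres at once.
old_catalog = {"Ocean's Eleven": "action"}

# Many-to-many allocation: each title maps to a set of genre
# and descriptor tags, so nuance can accumulate over time.
catalog = {
    "Ocean's Eleven": {"action", "comedy", "heist"},
    "When Harry Met Sally": {"romance", "comedy"},
    "Die Hard": {"action", "thriller"},
}

def recommend(liked_title, catalog):
    """Rank other titles by how many tags they share with a liked title."""
    liked_tags = catalog[liked_title]
    scored = []
    for title, tags in catalog.items():
        if title == liked_title:
            continue
        overlap = len(liked_tags & tags)  # set intersection size
        if overlap:
            scored.append((overlap, title))
    # Highest overlap first; more shared descriptors = sharper match.
    return [title for _, title in sorted(scored, reverse=True)]

print(recommend("Ocean's Eleven", catalog))
```

The more descriptive tags a title carries ("heist", "strong female lead", "set in China"), the more ways titles can overlap, which is the intuition behind the metadata-driven recommendations discussed above.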
And that's what really drove me there: thinking about, you know, it's not like helping people watch better movies, movies that are more suited to them, is solving some serious world problem. But it is something that matters to people. It helps them have a better quality of life and enjoy themselves more. And I love movies, so it was a really great fit for me. And I got to really play with and understand the relationship of metadata, and just data structures in general, to the customer experience, and how you could think about how those things are related, and how, when you make the right data decisions, it can actually change the way people experience the site and then the product itself, the service itself. And that, I think, was a really fundamental realization, you know, back in 2000 or so, for the next 20 years of work for me: that there's this solid correlation between data and the relationships that people have with that data, whether they know it or not, and then what that's going to mean for the experiences that they have in the world, you know, with a brand and outside of the brand. So that's so much a part of the tech humanist work: recognizing that we create data trails around us with everything that we do. And a lot of the time, companies are picking up those data trails and monetizing them and optimizing them in ways that can be manipulative and can be, you know, kind of predatory even. And so it's really important for us as people to understand those relationships and understand that that's what's happening in many cases. And it's also important for companies, for business leaders and for experience designers and creators themselves, to be thinking responsibly and ethically about how to use the data that they're collecting and amassing about people, to create more meaningful experiences for people as opposed to more manipulative experiences.
Isn't there a very fine line to tread there? Because, you know, at one level I need to know exactly what kind of movies you watch at some stage. And I'm taking Netflix because you have worked there, but the same thing could hold true for, let's say, a food delivery company: that typically on Saturday evenings you like this particular flavor of ice cream. Because I see Tangelo Ice Creams and Desserts is also present here. So, okay, just imagine that you like a certain flavor of ice cream, and you take a larger serving of the same ice cream, and on a different day of the week and a different time of the week you would choose a different flavor. The more you are able to do that, the more you are able to then triangulate and create patterns and sort of understand the consumer, so to say. But it's also therefore possible for companies today... you know, I was just reading that the latest iPhone is now going to be able, over time, to predict that you're potentially suffering from depression. Which is powerful. But at the same time, you know, I don't know whether I want the phone company to know about that, or whether I want any other company to know about that. So it's a very fine line. How do we resolve that? So there's a couple of ways I think we need to resolve it, and one is that we have to recognize that it is our responsibility as experience creators to think about what makes experiences meaningful. And in some cases we are going to be collecting data; we want to know about ice cream flavors that people enjoy, we want to know about movies that people enjoy, because there's an understood, trusted relationship that the person has entered into with us: they've signed up for the service, they've bought from us, and they've said, hey, look, I'm interested in your products.
Give me the most relevant version of these products, within reason. Right? Like, don't creep around, don't follow me around the web, don't send somebody out in a van to follow me walking down the street. You know, that's the equivalent of what digital tracking looks like sometimes. But within reason, give me some idea about which of these flavors, which of these movies, which of these services are going to be most relevant for me, and there's a trade-off there of privacy for a trusted-advisor sort of relationship with the brand. But the moment that tips over into the area where the brand is taking advantage, now the brand is using more data than they need to, the brand is in a power position. Like, there's the example of a test that Uber had done a few years ago. They were on record, they were found out as having run a test to see if people, when their batteries are low and they're calling for a ride from Uber, would be more inclined to pay a surcharge than people whose battery levels were higher. So you imagine yourself in that scenario: you're stranded if you don't get this ride, right? Like, maybe you are somewhere out and you just need to get home. And now your battery level is getting super low, your phone's about to die, and you just want to book that ride so that you can get home. If Uber says, hey, we're in surcharge time, like, it's going to be an extra charge, are you willing to say, yes, let me go ahead and pay that extra money to book this ride? You don't have any opportunity right now to price-shop. You can't go over to Lyft; you can't, you know, figure out what your alternatives are, because your phone's about to die. So they confirmed it. They know that that's something they can do. They swear that they're not using that data in production. But let's be real: once you have knowledge that you can manipulate the user that way, what are the chances that you're going to be able to stifle that knowledge within your organization?
So it's really the sense of the cultural mindset and the leadership of the company saying, we need to have a trusted relationship with our customers. An experiment like that probably should never have even been done. Like, why are we even collecting the battery level in the data set in the first place? And even if we can't help but collect the battery level, because it's something that the phone just passes us, there's still no reason why we would have ever legitimately run that test, because there's nothing good that can ever come from offering someone a surcharge just because their battery is low. So I'm saying, I think it's a whole cultural mindset, and that's what, really, when we look at digital transformation and a lot of technology programs anyway, so much of them are about culture. They're about people. They're about: how are people actually leading this change through the organization? How are the leaders of companies setting the values and setting the roles within the company and saying, this is how we're going to model the effective and responsible and ethical use of data; this is how we're going to treat our customers with respect, and responsibly. We're going to tell them, this is going to be your favorite ice cream, we can just tell, and if it's not, you know, the deal's on us, and you get your favorite next time. But I think that kind of mindset, the difference between, you know, that Uber mindset that would run that test... And, you know, I like to think that Netflix was run in a pretty responsible way. I think they've made a few missteps here and there over the years, but for the most part I have a great deal of admiration for how that company is run. And there are a lot of other companies doing very good, very responsible work around data, minimizing the amount of data they're collecting and being very responsible with it.
That's what I think we all have to be doing right now: thinking about responsible, ethical data use and how to create the most meaningful experiences for the people who trust us with that data. And you know, there are a number of examples of how something completely unrelated can also be predictive of a different kind of behavior. For example, the other day I was talking to somebody about credit card companies. They look at the fact that people who charge their phone before the battery dies down, you know, at night, when they do that, these are the people most likely to repay their loans. So, you know, they are good to lend to. And ever since I heard that, I kind of say, gosh, what do I do, I wonder. And as luck would have it, the day I discovered this huge piece of insight, I was out somewhere and I forgot to carry my battery charger, and of course my phone. Yeah, the scary thing is, I think that, you know, people who are really looking at this in terms of scale, in terms of how this can play out in different countries and with different policies, are looking at examples where, like, China has, you know, the kind of credit score, the social score, and things like that could actually interrelate in that kind of scenario. Right? Like, if you are a responsible citizen, you won't let your phone run down, and so it would actually count against you to make what would seemingly be a trivial mistake or shortfall, because someone has made the correlation that letting your phone run down means that you're a less responsible person in general who is likely to default on your bank loan. And even though you haven't defaulted on your bank loan, you've only let your phone run down, now that's like an indicator; it's predictive.
And so I think we have to be careful about the spiral that that leads us into, and be sure that we're thinking about policy in a way that is protective of human rights and basic civility, like that we still have the right to let our phones run down without that adversely affecting our credit score, right? Right. You know, one of our viewers, Shushan, says that the ethical use of data is most important, value-driven, essential for any company. And sometimes I wonder whether, along with all the tech courses that are being taught in all these engineering colleges, there should be mandatory courses on ethics and, you know, on philosophy and things like that. Without that perspective, I think it becomes harder for people to make the right choices. Yeah, I think you're right. I also will point out that my friend David Ryan Polgar has an organization called All Tech Is Human, and they're doing a lot of work there. It's an association of sorts, a loose conglomeration of responsible tech practitioners and academics, and just people working around ethical AI. So I think what's happening within that organization, and in a lot of spaces, is that people are advising on policy and advising on company practice and making sure that these things are understood and are going to be in the curriculum. Sorry, I meant to mention that. So you're talking about, you know, should ethics and philosophy be bundled into some of these programs, and I think, yeah, that's probably a good thing for schools to consider. And also I think it's important that company leadership be constantly refreshing their understanding of what it means to be ethical and responsible in the context of emerging technologies and data, because it's presenting us with so many new kinds of choices.
Most people who went through business school, you know, 20 or 30 years ago weren't necessarily pondering what it meant to use data in these ways that have so much scale and impact now. I do sometimes think that, you know, the people who are most powerful, the ones who can make those kinds of decisions, very often are the ones who, as you said, went to these schools 20 or 30 years back, without an understanding of what the technology could potentially do. And we've seen that, you know, sometimes when legislatures in different countries talk to the tech leaders, it's a laughably unequal conversation. People don't quite understand what it is that the technology should be doing, what they should be asking. And in fact, very often, you know, I remember, this is not my phrase, but one of the newspapers said it reminds you of the grandparent having a conversation with the grandchild: you know, why is it that my email is down when the light is still red? Those kinds of conversations. Yeah. Yeah, I was going to say, I think we're going to get to that. I wanted to just tell our listeners, you know, some of them are asking questions, so for those who joined us late: we are talking to Kate O'Neill, and she is the author of this fabulous new book which I've just finished reading. It's called A Future So Bright, and she clarified that she bought those glasses after the cover design was created. Is that right? Yes. Yeah. Yeah, once the designer put this together and I saw the yellow, I was like, all right, now I have to go buy yellow sunglasses. Actually, the ones that I had before, my first pair of Parafinas that I'm telling you about, I got in London years ago, and they're all banged up now. I don't know if you can see, the lenses are all cracked and everything. So it was time for a new pair anyway, but it's great that they both actually tie in a little to the color of the cover.
Every time you go to talk about the book and you're doing one of those book signings, make sure you're wearing one of those, right? Yeah, or both. All right, that's fine too. The first one is for your first book, then this one for the second. Three books. Yeah, you'd need to wear three of them. Gosh, that's a lot. Yeah. And so now, Tangelo Ice Creams, who says that digesting data and concentrating it into action is a big challenge for strategists. Yeah, absolutely. Yes. But you know what I think is interesting about that: one of the things that I advise clients is to think about what's meaningful about the company when you're designing data modeling around the company. Like, what is it meaningful for you to collect in the operations part of your business? What is it meaningful for you to collect in the customer support part of your business? And how do those two bits of data from those two areas support one another and interrelate with one another? What do they tell you about your business? But I think in order to do that really effectively, you have to start way back further, at the top of that discussion, and I always have people start with purpose. And in that sense, what I'm talking about is a three-to-five-word articulation. No more than that; we just want to keep it super simple and essential. What is it that your company exists to do and is trying to do at scale? And if you can make that articulation that succinct, then you can actually go through and think about the goals that you've articulated for the year and the quarter; the mission that you've stated, you know, what your company is about, what you're trying to achieve in the longer run; what your priorities are; what resources you've allocated to those priorities.
What you've said the brand should look like, what you've said the experiences should be. You go through and you cascade through all of these decisions that happen throughout the organization, throughout the company leadership, and you try to tie those together. And we believe that if you think about data modeling once you've gone through that process, you get to the point where you say, you know, what's going to actually mean something and be insightful for us about customer data is when it tells us that we've actually achieved what we set out to do. Like, we want to create meaningful relationships with people around the ice cream that they enjoy, right? We want people to be able to have delightful experiences with their dessert, and so how can we be sure that that's happening? So, you know, desserts are a really fun example because it's not anything insidious, like so many of the others. And very clearly Tangelo Ice Creams is really loving this conversation, because the person says, I must read this book and see how I can use this for my company and for a large number of farmers: how to make sure that the data is useful and translates to implementation. They have a number of challenges. So yes, you know. There is a chapter on a brighter future for how and what we eat in this book, and it gets a little bit into the agricultural side and how AI and emerging technology are being used in agriculture and then the food chain in general. So I think actually the folks from that ice cream shop are going to enjoy tying that together, because it is an interesting time when it comes to food, food supply, and agriculture, and there are a lot of bright, future-forward, ethical, responsible decisions that we can all make about what we're eating and what we're sourcing to eat, making sure that we're making
low-carbon decisions about the food that we eat, and making sure that we're responsible about where we're consuming it from. But also, for companies that are in the food space, there's a ton of opportunity to improve and really try to bring their carbon footprint down and make sure they're making responsible decisions too, and data and technology will certainly help with that. So it's exciting to see, Tangelo, that you guys are thinking about that. I'm glad to hear that's something you're thinking about; that's so progressive. Yep. And one of the hypotheses I have is that at some point in time, you know, food tech will overlap with health tech, because what you eat is what shapes you, in some very literal way. And that's where I think it's going to go. So, you know, you would start to see more of everyday food, which you in any case would have had, get supplemented at some point in time. I used to work with food companies, and they were already thinking about how you add a little, kind of, you know, protein supplement or whatever, depending on whatever was being sold, so that simply by shaking up the container, you could add that protein supplement before you consume it. So there were just innovative ways of packaging that would make that happen, so that the packaging would open up with that shaking and release the stuff into the container. So there are many different ways in which it can be done. Yes. Those are fascinating examples, Kate. I want to actually dig a little more into the book. I really enjoyed reading the book because, you know, the first thing that connected with me was, you were saying that you first need to understand meaning, a little bit of which you referred to earlier: that, you know, meaning is about what really matters to you.
So, you know, is this something that applies to individuals just as well as organizations? Because I'm always interested in seeing what it is that I can do as an individual, even though my company may or may not choose to do that, or I may or may not have the opportunity to do it. What can I do to learn from what you found out? You know, what would that be? Yeah, so that's a very good observation and a very good question, because I feel like one of the things that was really important to me with this book was to write it to an archetype of leader. You know, I had some people in mind, one person in particular, that I was writing to, in a sense, and that person would be a corporate leader. But I also recognize that, you know, city leaders and institutional leaders, and really just community and activist leaders, are important to have in the conversation as well. And then also, just as individuals, when we think about our future, you know, how do we think about making the right choices, or the best choices, that empower us, that set us up for better outcomes in the future?
It's less about that personal, you know, kind of how-to approach, and more, I think, that you can take away from this a lot of philosophy that may help ground that decision-making. But yes, meaning is one of the most fundamental pieces of this philosophy and this playbook, you know, this approach to how to build a better future. The understanding is that humans crave meaning. To me, that's the most uniquely human characteristic there is: we are meaning seekers and meaning makers. Like, everything that we encounter, everything we do, we always want to understand something about it. What is essential about it? What does it mean? What's significant about it? Why is this happening? All of those things are a form of meaning, and they're fundamentally about what matters about this thing, right? And then what I point out in the book is that if you accept that meaning is this fundamental human characteristic, and it's about what matters, what's interesting is that you can then push that through to innovation and say that innovation is about what is going to matter. And in that way, you can always make sure that innovation is going to be human-centric, and it's going to be solving for the right things, the most meaningful things, the things that are going to have the biggest impact for us. Yeah, and you know, one of our listeners, Nana Prabhu, says: humans crave meaning, we always go back to it. Yes, absolutely. You know, I really enjoyed this whole perspective that, even in the book, meaning, in some sense (these are not the words you use, but that's the conclusion I drew), is when I connect the dots, whatever dots I've figured out as of now. That's the meaning that I hold today. Tomorrow, as I add more information, you know, that meaning evolves. Yeah. So in some sense I kind of thought my takeaway was that you add more meaning as you connect more dots.
It's OK to change your mind. It's OK to have a different point of view. It's OK to evolve, because for many of us, changing our mind seems to be something that's embarrassing, something that you should feel bad about: oh my God, yesterday you were saying this and today you've done that. For me, it's a great degree of freedom that I can give to myself. I knew I had only two pieces of information yesterday, so I drew a line. Today, when there's a third and a fourth piece, it no longer is a line, it's a box. So why would I not change my perspective on it? Innovation is the meaning for tomorrow. Tell me more about that.

Right. You know what I think was so interesting about what you just said, too, is that that's a very data-driven way to make decisions. Right? You're allowing the data, the two data points that you're talking about, to be what helps you triangulate. You can't triangulate with two pieces of data, but you get the idea, to help you understand the world around you. And once you have more data, you're allowing that to become insights that you can use to make better predictions, to have better foresight, and to make plans that are more in line. And one of the things that I talk about in the book is that hope is actually a really useful tool in helping us, in a sense, change our mind, because hope tells us what it is that we want. And it's a great tool of focus and refocus, because hope can always change, and we can always decide, from day to day or moment to moment, that what we were hoping for is not what we're hoping for anymore. And that is our mind changing. But once that's true, we know that our strategy has to change too. Right? We know that if we're now hoping for something else, something different from what we were hoping for before, there has to be an accompanying strategy, a different plan, that's going to get us toward that new articulation of hope.
And that is fundamentally what this book is really about: saying, what do we hope for? And I think that collectively we hope that we're going to be able to have a better future than the one we think we're facing right now. I think collectively we believe we're facing a very dismal future. And as you read, you know that I talk about this false dichotomy: the way we talk about the future is so often dystopia versus utopia. And we know we don't have utopia, because no one's ever going to make all the perfect decisions, and we're never going to have all the perfect things go right to be in a perfect world. So that's out of the question; now all we're talking about is varying shades of dystopia. I think that's an incredibly unhealthy and unhelpful way to think about the future. And so instead, I think we need to embrace our power and agency, and the fact that we make the decisions that cause the future to be whatever it is. So we can make better decisions to create better outcomes, and that whole empowered way of thinking about the future we're going to create and build together, that's the way that I think we need to take our hope about the brighter future we think we can have, and build collective strategies to get there, and make sure we're solving for the best futures for the most people.

And when you look at it that way, as hope evolves, our view of the world also changes. So you're right, and this is so powerful, especially not only because of the times of pandemic, but also because of this entire conversation about how data has been used to manipulate us, and all of that. And you do talk about a whole lot of things around workplace automation, and I'm really keen to learn from you how you view this entire thing. So my opening question for you is: in the case of workplace automation, do you think employees trust their employers more than they trust, let's say, the government or some other institution?
Well, I think we have some data to suggest that people in general right now are trusting companies more than they're trusting NGOs or governments or other institutions like media. The Edelman Trust Barometer has run for 20, 21 years, I think, and this year's Edelman Trust Barometer shows that corporations are actually the most trusted institution out of those four institution types. So whether that means that employees trust their employers per se is, I think, a subtly different question, but I think it's a really important lens on that question: to understand that companies enjoy this privilege of trust, and so companies need to use it really well, and make healthy and responsible choices for the teams that power the companies, and for the people outside of those companies who are looking to them for trusted leadership. So I think, yeah, the future of workplace automation, and what that's going to mean for individuals and the jobs that they do, that is going to be a malleable, shifting conversation, and it's a very interesting one. But it's one that I think leaders need to be thinking about in a very human-centric way, over the long term, making sure they're making choices that can empower people to continue adding meaning and nuance to the experiences that are being created for people outside of the company. Like, we need for there to be some kind of human-to-human connection somewhere in the process, whether that human-to-human connection is automated and amplified by chatbots or conversational AI or any other kind of technology. It still needs to come from a very human-centric, meaning-driven place, and that, I think, is a really fundamental thing for companies to think about.

You know, building on what you said: organizations do employee engagement surveys and all of that.
But I think a more powerful survey that I would like to see organizations use more frequently is: do I trust my leaders? Do I trust the organization? Because if I trust the organization, I know they're doing what is right for me, and I'm going to give my best. So I think, because of the pandemic, the entire employer-employee equation has been reset. And therefore this is an opportunity to focus on building trust before we start thinking of employee engagement, productivity, and all of that. For the benefit of our listeners, we are talking to Kate O'Neill about her latest book. It's called A Future So Bright, so if you can just... Yes, absolutely. I think we need to have a program which says that every time I mention the book, you should hold it up. Because I've read it. I've read the PDF version, and it's on Kindle, so I can't show that, but it's easier to read. That's how I read all my books too, on Kindle. I love reading on Kindle. It's such a different experience. I joke in the bio on my website that there was a point at which I read War and Peace on a second-generation iPhone, just to say I'd done it, and just to see what the experience was like. And the experience was actually kind of crappy. So it is amazing to be at a place now where, on this beautiful, pixel-rich iPad, I can read a book and it's a good experience. It's just amazing to be at such a different place with it.

And right now I also want to give a little plug to my friend Dorie Clark, who just launched her book The Long Game yesterday. That's the one that I'm into right now. I saw some of her examples in Fast Company; I was just reading that, and I thought it was incredible, and I must have Dorie talk about her book on the show as well. Yeah, absolutely, for sure. Yeah. You know, I wanted to talk a little bit about your process, of how this book evolved in your head.
You know, I always find that a very fascinating question. What is it like when you start? How do you write? Do you get a really clear perspective of the book and then simply transcribe it? Or does it evolve? Do you kind of have a framework and then fill it in? How does it work for you? Because I imagine fiction and nonfiction work in completely different ways altogether.

Oh, I bet that's true. Yeah. And, you know, I've been a writer all my life, in various forms, from writing poetry and plays to, well, I actually lived in Nashville for a while and was a songwriter. So I've done a lot of different forms. But the work I've been doing the last five, six years has been around consolidating these insights from the last 25 years in technology, to package up the value of them for clients, for audiences, and for readers, and make sure that there's something tangible. There's philosophy, you know; I'm trying to bring my own observations about the world and what's meaningful into the playbook, the guidebook, that I'm presenting. So the process has been, actually, that speaking is what gets me to clarity about what the idea needs to be, because it's a great way to run an idea by an audience and get several hundred reactions all at once, and kind of get a feel for whether it resonates with people, whether people kind of click and go, oh, that makes a lot of sense, I love that idea. Or, if it kind of lands with a thud, then you know you've got some iterating to do to get to a better articulation of it. For example, when I had written Pixels and Place back in 2016, I was speaking about it at the keynotes I was doing. And then eventually, over time, what I found was that more and more of the discussion I was having on stages was about the ethics and responsibility of companies in building experiences.
Because Pixels and Place was about interconnected experiences and the human data that connects them. But the more of that conversation I was having, I was like, there is a bigger discussion here about this idea that companies use this data and build the experiences that we have. So that became Tech Humanist. And then when I was speaking about Tech Humanist over the years after 2018, I was finding that more and more of that conversation was starting to move toward the bigger implications of the big existential and exponential change that we're all facing: things like climate change, and, like I mentioned earlier, misinformation and disinformation, things like privacy and truth and trust, and how those different factors play into how companies are actually rolling out their own strategy, and how leaders can think about all of those ideas in a hopeful way. So that was what led into A Future So Bright. And it is an interesting process. Now, for the next couple of years, I'll be speaking about this, but it'll evolve into whatever the next book is, I'm sure.

The book triggers questions, and those questions, consolidated, become your next book. That's right. Yeah, I just discovered such a fabulous way to write my next book. Yeah, because I think what happens is, I'm presenting about the concept, and then people will have questions during Q&A sessions after a talk. Or, like, I have an example that I love to use in my keynotes about Amazon Go, the retail store that has the "just walk out" concept, and the fact that when you start up that Amazon Go app for the first time, there's an onboarding wizard that says, you know, because all of your purchases are logged the moment you take them off the shelf, don't reach for anything for anyone else. Don't help anyone in the grocery store, basically.
And I've asked in every audience around the world, India, China, Australia, Europe, everywhere, I've asked people: how many of you have ever helped someone else in a grocery store, or been helped in a grocery store? And literally every hand goes up. This is a universal human experience. And I'm like, how is it possible that the designers of this Amazon Go app didn't think that this was going to be a problematic experience to restrict people from? Because it means that we can't help each other in grocery stores. I know it sounds like such a trivial thing, but it's an experience that reaches scale, and when that happens, it actually changes culture, because experience at scale is culture. And sharing that insight, every single time, phones come out, people are taking pictures of the screen, and you can just feel that it resonated so much with people. And I'm like, there's got to be something to this idea: something about this relationship between the experiences that we're building and creating that are data-powered, and the scale that they're afforded by emerging technology, by AI, by algorithmic amplification, and what that means for us as humans in our culture. And people, I think, can immediately understand on some level, when that insight is revealed, that this is a big deal. Like, we're experiencing this left and right, with companies making these decisions, with these algorithmically enhanced experiences for us. So I think that's what drives these books: seeing people react so viscerally and strongly to certain kinds of insights, and they just build into more and more work.

Yeah, I also think that, you know, technology, in its quest to create a much more personalized experience...
Yeah, in some sense, I think it's turning the world, and that's one of my hypotheses, it's turning the world into far more individualistic people than collective ones, which is why collaboration is such a big deal; companies are constantly talking about it: you should collaborate, you should do this. But, for example, music has become completely individualized. You don't know what kind of music your partner likes, because you're each sort of listening to it separately all the time; you don't know what your kids like. This is really creating compartments, and I would ideally like to go back to a time like where I grew up, where the radio used to be played so loudly at home that everybody knew the same songs, and even your neighbors sometimes knew the same songs, because we were listening to the same programs, we heard the same music. It creates new kinds of games; there's a game we play in India called Antakshari, which is really all about shared common experiences of music. And that game is becoming increasingly harder and harder to play today with the newer songs, because most people don't share those songs in common. So I think that's something. And it's such a brilliant example, Amazon Go, that it actually encourages you not to help somebody else, because you will be charged for that particular product.

I think I just lost the audio for a second. And we have a while, you know, while Kate goes back and fixes that audio. Nana, thanks for that comment. You said that experience at scale is culture; didn't you like that? Yeah, I just thought it was such a brilliant statement to make. And, you know, Anish says it'll be great to do a manager NPS. Totally agree; I think that would be a good thing to happen. Can you hear me now, by the way? Loud and clear. All right, I just had to replace the battery in my microphone. Isn't that annoying? This is the technology that runs out on you.
This is the other thing you should write about: how technology will always let you down.

Well, you know, it's funny. One of the things that's been so weird about this time is that all of my keynotes over the last year and a half have primarily been through a virtual format. And obviously, I hate that I'm not in person with people and can't have that great experience of bonding with people and everything. But I love one thing about it, which is the chat function in Zoom and other platforms. And you're getting at it because, you know, you're putting some people's comments up on the screen and stuff like that. I love that in live streams, but in virtual keynotes, I love using those opportunities to have people respond with just one letter or one word. And it's just this flurry of stuff up the screen. And you get, I think as an attendee, and as a speaker as well, this kind of joy, the feeling that everybody's playing along and participating. And you can see a lot of interaction that, if you were in person, you might get with laughter or applause, but you could never really get the individual nuances. I ask people to say one word that, for them, really encapsulates what it means to be human. And you see this flurry of words, and it's really fun to just see these different ideas that people have about humanity, like emotions, or creativity, or whatever. You can't get that in a live environment. So I think we have to look at the opportunities that technology presents us, for what's good about them, and then try to mitigate the risks of what's not good about them. But yeah, that's me taking a little bit of liberty with your joke about technology always failing you.
But I think that we really do have some opportunity to embrace what's good about technology, and try to downplay a little bit what's bad about it, and make sure that we're keeping it in check and not allowing it to take over our lives.

Let me then ask you this, Kate. Imagine this conversation that you and I are having; now, of course, we are having it online, and people can join in, and all of that. But imagine if you and I were talking like this on stage, with the audience around, and the audience could also respond, because there are so many apps that you can use to actually project their thoughts. That would be the best of both worlds, ideally. I must say, I totally enjoy talking to the audience in person. I am a sucker for that, no two ways about it. Given a choice, I would always do that. Because, you know, when you meet friends through FaceTime or any of these Zoom calls, it's great, it's a good substitute, but it's not the same as sharing a meal together in the same restaurant. So it's brilliant. I wanted to ask: when you look at your book, what do you think it will do for people in the workplace? People who are employees, people who are managers, leaders running organizations: do you really think that they will stop, pause, and wonder about all the things that you've talked about, all the intangible things? Because I loved them, whether it is about finding meaning, whether it's about hope, whether it's about the way you can shape the culture of innovation. There are so many things I enjoyed about your book. What do you personally hope to achieve with this?

Well, thank you. That's a big and great question. I think, for me, I hope that people are reading it, and maybe they're feeling like, well, this is kind of a lot, I can't do all of this, but they do find one or two things that they think about and go, you know what?
I can probably easily take that back into my organization and try to figure it out. Like, for example, maybe my organization is going to decide which of the 17 Sustainable Development Goals from the United Nations we most align with. And we're going to pick that SDG, and we're going to make it our designated SDG that we're trying to align with. Maybe it's quality education, maybe it's no poverty, whatever it is. And maybe we're even going to put it in the Twitter bio of our company's Twitter profile, or on our LinkedIn, or whatever. We're going to publicly declare it, let people hold us accountable for it, and start figuring out all of the ways that our charitable work, that our actual work, can align more with this SDG. That one action, I think, would go a long way toward creating some good change momentum in this world. But I think there are so many other opportunities. And I just hope that people find one or two that they can begin to act on, to move themselves into a more empowered place about the future, and a more hopeful place about the future, one that's actually putting strategy behind it, and actually working to build those best futures for the most people.

Gosh, that's lovely. If I had to pick one particular goal for myself, most certainly I would pick making a difference to education, something that I really care a lot about, because I think many of the other things can get solved if people are aware of what they can do and they feel more empowered. But I just want to take time to say thank you ever so very much, because it's just so fascinating to hear about the lovely book that you've written, Kate. Showing it one last time? Let's do it, let's do it. Yeah, let's do it. A Future So Bright. I love the book, and do check it out, because it talks about all the things that we could be and should be doing. So thanks once again for being part of it. Thank you. Thank you so very much.