Thanks for joining us at today's Public Good App House. Our sponsor for this event is UiPath. They make software robots so people don't have to be robots. Today's special event opens with a fireside chat with Beth Kanter, nonprofit tech expert and co-author of the upcoming book The Smart Nonprofit: Staying Human-Centered in an Automated World.

Beth Kanter is an internationally recognized thought leader and trainer in digital transformation in the nonprofit workplace. She's the co-author of the award-winning The Happy, Healthy Nonprofit: Strategies for Impact without Burnout, and co-author, with Allison Fine, of the best-selling The Networked Nonprofit. Named one of the most influential women in technology by Fast Company and a recipient of the NTEN Lifetime Achievement Award, she has over three decades of experience designing and delivering training programs for nonprofits and foundations. You can learn more about Beth at www.bethkanter.org.

In this fireside chat, Margarita Mucci-Vabic, Public Affairs Manager at UiPath, will interview Beth Kanter about her newest book. I'll hand it back to Margarita. Thanks again, and let's all welcome Beth to this great event.

Really excited to have you, Beth. I'm looking forward to the fireside chat.

Me too. I'm so pleased to be here. And I really enjoyed your presentation, Margarita. I think we're speaking the same language.

That's great to hear. Thanks so much. Coming from you, that means a lot. So I really want to learn more about this. Why did you write a book about smart tech? What motivated you to do it?

You know, I've had a front-row seat at the creation of a field, the nonprofit tech field, for many decades, and I was there alongside many of my colleagues from TechSoup. So I'll give a shout-out to Susan Tenby, and to Marnie Webb if she's here.
And I've always worked at the intersection of emerging tech and nonprofit mission-driven work, mostly as a trainer and facilitator, collaborating with technologists to help nonprofit leaders understand the relevance, adopt the technology strategically and reflectively, and ready their organizations and their own leadership.

So smart tech, and I'll explain what we mean by that in a moment, is at the inflection point that's common to technologies reaching everyday use. There's been an enormous increase in computing power, met with a dramatic decrease in the cost of the technology. That means it's no longer just for the NASAs and complicated moon launches, if you will; everyday people and nonprofits can start to use it for fundraising, accounting, human resources, service delivery and more.

The Smart Nonprofit is my fourth book, and my second with my wonderful collaborator, Allison Fine. We wrote The Networked Nonprofit about a decade and a half ago. And we wanted to write a guide that was not technical, aimed specifically at senior nonprofit leaders, to help them understand how to leverage the benefits and also navigate the challenges.

Can you give us some examples of the benefits for organizations adopting smart tech, and maybe a bit about what smart tech generally includes? Are we talking about efficiency? Are we talking about more? What is in there for us?

So we decided to coin the term smart tech, like smartphones and smart houses. It's an umbrella term we use to describe a whole range of advanced digital technologies that basically make decisions for people, instead of people making them. This includes artificial intelligence, machine learning and all its various subsets and cousins, such as natural language processing, chatbots, robots and other automated technologies.
And the reason we use that term: Allison and I have been writing about artificial intelligence for social good, AI for good, AI for nonprofits, for a couple of years now. And we noticed that when we used the term artificial intelligence at a leadership level, people would kind of lean back and say, whoops, that's not for me, that's a technical issue. But we strongly believe, given the enormous benefits and given that we're at this inflection point, that the use of this technology is a leadership issue. It's a leadership challenge, not purely a technical one. Nonprofits really need to be working in collaboration with the technical experts, and also make sure they're including their end users, whether that's internal end users such as staff or external folks such as clients and donors, in the design and delivery of this tech.

That makes a lot of sense. I feel the same way about how you talked about AI versus smart tech; people tend to think about them differently, right? They become more open to hearing what you have to show about the technology beyond just AI itself.

And I think there's too much negative, popular science-fiction narrative around the term artificial intelligence. For nonprofits, at least, the concepts of automation and the dividend of time, as you talked about, are more useful. I love how you quantified it. It's not just the return on investment, which is significant, but you mentioned 16 hours of saved time. One of the reasons we wrote the book is to ask how nonprofits are going to reinvest that time to improve what they're doing, to improve relationships with donors, for example, and maybe make a dent in the abysmal donor retention rate by actually connecting with donors and asking how they're doing.
On the other end, as we all know, and certainly as I know from a lot of firsthand experience, which is why I wrote my earlier book on workplace wellbeing, the nonprofit sector has a burnout problem. People are overworked and work long hours, a lot of us are passion-driven, and there's been an increase in demand for services, yet we're often spending that time doing low-level administrative work that can be exhausting, and that's been exacerbated by the pandemic. So if we can free up this time, think about what we could do: maybe a four-day work week, maybe giving staff the rest they need, or a sabbatical, and still getting things done.

We want to caution people that automation isn't about just doing the same stuff more efficiently and working those long hours because you can get more done now that you've saved those 16 hours. And it's not about laying off staff because, oh wow, we eliminated 16 hours. We have to be human-centered about it. The technology, the robots, are good at doing one thing, but humans are good at human-centered stuff like relationships and empathy.

Could not agree more, Beth. So since we were talking about work-life balance and wellbeing, your earlier book The Happy, Healthy Nonprofit was about that, but given the Great Resignation, is there a connection? How do you feel about these things coming together?

I'm really happy, because these are two of my passion topics that I've been thinking about for the last 20, 30 years. When I started down this path, I had actually been writing about this before the pandemic, and I saw the potential benefit for workplace wellbeing from the saved time, but also the potential for certain apps, mainly apps that can streamline workflow, to help with that.
So in my work I facilitate a lot of wellbeing retreats for staff, or else I'm teaching workshops, and the biggest complaint I hear, aside from burnout and overwork, is unrealistic workloads. And if you peek under the hood, a lot of that has to do with unautomated, manual types of systems supporting the work, systems that take a lot of our brain bandwidth and make us physically exhausted. Then put all this on Zoom and stack it with back-to-back online screen time, and we're exhausted. So I think if anything the pandemic has really taught us the importance of wellbeing and the importance of digital transformation, and now it's time to marry the two.

Awesome, thanks so much, Beth, for the great insights. I have a question around use cases and examples, because I think when we talk about use cases, people have this aha moment and figure out, okay, I can do it as well. So in your book The Smart Nonprofit you share a lot of great use cases and examples for nonprofits. Can you share a few of your favorites, so that we can learn and take them home?

Sure, that's a great question. In the book, aside from talking about what it actually means to be human-centered, and helping organizations prepare their data and use it ethically and responsibly, we interviewed scores of nonprofits about how they were beginning to use these tools for program delivery, fundraising and back office. Things like screening resumes based on criteria that organizations set, determining eligibility for a host of social services, identifying prospective donors from your fundraising data, delivering medicine and food to hard-to-reach places, or directing refugees to available beds.
And certainly there are the examples we're going to hear in the second half of this session. One of my favorite, maybe more specific, examples is from The Trevor Project, which provides crisis counseling to LGBTQ+ young people. They created a chatbot. And by the way, chatbots seem to be the most common technology that nonprofits are using right now; we came across many examples. They created a chatbot named Riley, not to replace the counselors on the front line, but to help train the counselors by providing real-life simulations of conversations with potentially suicidal teens. Riley is always available for a training session with volunteers, and that helps the staff scale the number of trained counselors without adding more resources. And of course this is a big problem right now, especially given that the pandemic has put a lot more teens in crisis.

The Trevor Project sees the role of the technology as being really human-centered, and really understood the potential harm of putting an automated system on the front line as a counselor. And there are many bad examples of that, right? One of the most famous is a Twitter chatbot named Tay. Like Riley, Tay was very smart and self-learning. It was intended to learn how to converse with young people, but the trolls got hold of it in less than 24 hours and turned it into a racist, misogynist, swearing, insulting, harmful bot, and it had to be taken down.

But let's go back to Riley. Riley is, again, one of these really smart natural language processing bots that learns from interaction, but it's never exposed to just anybody. It's only exposed to a very controlled environment, to learn from counseling approaches, which are very sensitive. They would never let it interact with the general public.
Some other examples, just really quickly: some of my favorites came about because of the pandemic. I think the pandemic inspired, or we might say forced, a lot of nonprofits to go through a decade's worth of digital transformation in two years. Some of my favorite examples are in the food banks, because there was of course a huge increase in demand from people facing food insecurity. One of my favorites is from Pittsburgh. During the shutdown, kids in marginalized neighborhoods were not able to get their lunches from school because they weren't at school. So they used AI and machine learning to efficiently re-engineer the bus routes, so that the buses could bring the lunches to the kids instead of bringing the kids to the lunches. I just really love that example. And there are many others.

These are great examples, Beth. That's just so inspiring, and it gives you a lot to think about. But I really liked the angle you touched on earlier about the controlled environment in which Riley can be safely used. I wanted to ask you: what are the challenges around adopting smart tech for nonprofits, especially around ethical and responsible use, such as in the example you mentioned earlier? What could you share about that?

Thanks for that question. I'm sure many of you in the chat, all of you on the call here, and of course you, Margarita, understand these challenges intimately. In the book, which we wrote really for leaders to understand these challenges, they don't need to know how to code, but they do need to understand how code is built and how it may be biased. So making strategic decisions about when and how to use smart tech really is a leadership challenge, not purely a technical problem.
And as we know, there are consequences to automating systems and processes, ranging from losing the ability to make judgment calls, that human-centered piece, like giving the unusual job candidate a chance, to introducing flat-out bias against people of color that blocks them from receiving certain programs and services. We include many examples of these in the book, so we can learn from them and maybe mitigate the potential for problems.

Bias can happen in different ways. First, the data the software is trained on can be biased or incomplete; there can be issues around how the data was labeled, or data hygiene issues, dirty data, I like that term, data that's really incomplete or inaccurate. And the technology needs enormous data sets in order for the algorithms to work; my co-author Allison Fine likes to say Library of Congress-sized data sets. Bias can also happen due to the assumptions under which the algorithm, the mathematical code that actually makes the decision, is constructed. There's a wonderful quote I like to repeat: algorithms are opinions encased in code. So we need to understand what assumptions the developer made in creating this tech. Are they research-based? Is it just their view of the world? Did they do ethnographic research with end users? Where does it all come from?

There are also data stewardship and privacy issues that senior leaders must understand, like having protections on that data; we're seeing more of those scary stories about privacy being violated. And I think nonprofits really need to think through a do-no-harm pledge in using the technology: not wait for something bad to happen at scale, but be able to test and iterate their way forward and understand what the potential for harm is. I'll talk about readiness in a moment, but things like having an advisory group with expertise in ethics, data privacy and AI can really help guide the organization.
And I think this is a really great opportunity for similar types of organizations, different food banks, say, to maybe share an advisory group.

I've noticed that in the last few years there's been more and more thought leadership, more recommendations and more energy in this whole conversation around ethics and the responsible use of emerging technologies. And since we're also here to talk about readiness, and to get your thoughts on how nonprofits can start their journey with smart tech, I think it's really important to highlight that there is also a non-technical component to readiness, like the one you mentioned before.

Sure, and we do focus on the non-technical, the human readiness, the organizational culture readiness, in the book. In fact, we have a whole chapter called Ready, Set, Go. It's sort of a blueprint for nonprofit leaders to ask the right questions and set up the right design-thinking methodologies, so that they're doing this in a reflective, knowledgeable, strategic and human-centered way. That really begins with getting feedback from end users, whether that's staff or clients or donors. You might start with a particular idea for a use case, and in doing this research discover, oh no, that could have potential harms, or that's not solving the right problem. There's also really understanding, and working in partnership with, the makers of the particular tools that are out there: asking the right questions, reading their white papers, asking the tool makers what their assumptions were, and understanding what's under the hood, if you will, without having to technically reconstruct it. And then finally, I think the last step is that in the nonprofit sector we just want to get things done, right? We don't want to test and learn, find that something doesn't work, and have to correct it.
But I really think that approach is necessary when we're talking about this technology, to avoid some of the problems.

This is wonderful. Thank you so much. I definitely learned a lot, and I hope everyone enjoyed this fireside chat: really great energy, really great insights. So over to you, Beth and Nina. Thank you.