Well, thank you all for coming to our panel. I'm very excited about this topic, which is near and dear to my heart. My name is Ann Mei Chang. I spent most of my career in Silicon Valley in the tech sector, working at big and small tech companies, including places like Apple and Google. About six years ago, I decided to make a transition and spend the second half of my career in the public and social sector. One of the biggest surprises for me was the difference in the use of data. At Google, we had data on everything, and we used data every day to make decisions, to make our services and our products better. In the tech sector, I think data has driven the accelerated pace of innovation that's the envy of the world. When I got to the social sector, what I found is that mostly what we had was data on activities, on what we do: we know how many farmers were trained, how many loans we gave out, and so forth. But very little data on whether what we're doing works, whether people like what we're doing, whether it's helping them. In my mind, one of the big challenges in the social sector is that the social impact we're trying to achieve can often take years to get to, so measurement seems very daunting. The best practice is to run randomized controlled trials, which can take millions of dollars and many years to do. And so we just don't do it. What we end up doing instead is focusing on the solution, continuing to optimize the solution, really falling in love with our solution. We forget about our goals; we forget about the problem we were trying to solve in the first place. And I think that as a result, we're far less effective in the social sector than we could be. In Silicon Valley, we have Moore's law: every two years, the speed of processors doubles. That means my iPhone now has the processing power of a computer that used to fill this room.
If we could have that same kind of accelerated innovation in the social sector, just imagine the kinds of problems we could solve. I think data is a really key part of being able to tackle those problems. So I'm delighted that we have a diverse and exciting panel here today to talk about these issues, ranging from a foundation to an impact investor to an actual social enterprise, so we're going to get a diverse set of opinions on how data operates in this world. To my left here is Maryana Iskander, who's the CEO of the Harambee Youth Employment Accelerator; she's here from Johannesburg in South Africa. This is Lindsay Louie, who is a program officer at the Hewlett Foundation, just down the peninsula. And Sasha Dichter, who's the chief innovation officer at Acumen, from New York. I'm going to start by asking each of them to give a quick overview of their organization and the work that they do, just for some context, and then we'll dive into the meat of the topic. So Sasha, would you mind starting?

Is this mic on? Is this mic on now? OK, great. Thank you. Well, that's a really perfect framing, Ann Mei. You really framed the aspiration we all have, which is to do better. And somehow I think we've let ourselves off the hook and said, well, the thing we ultimately want to do is hard to assess, so we'll use all these proxies and that'll be good enough. Acumen came into the field in 2001. We've been an early-stage patient capital investor in businesses that serve the poor, mostly in the developing world, and more recently in the US as well. We have always been very, very committed to understanding our impact, and hopefully to creating it.
And about four years ago, we created something called Lean Data, based on our ongoing frustration: we came to the conclusion that the data we were collecting from the enterprises we were working with was not going to help us allocate more funding to more things that made more impact. We did not have the data we wanted in order to improve, to your framing. That really was what was driving us, and I'm happy to share more details on how we got from there to here. But our main hypothesis was, one, our companies have lots of touch points with customers, and if we could use those touch points to talk to them, we could learn things. Secondly, all of our customers, even the poorest people in the developing world, had access to mobile phones, so we should probably use those in some way, shape, or form. And we were of the opinion that the main problem with impact measurement was that it was extremely expensive. So if we could use technology, talk to customers, and decrease the cost, we could shift this from something you do sometimes to something you do maybe even daily. Since then, we have been doing Lean Data for the Acumen portfolio and the Omidyar portfolio, and we've done work for Gates, for DFID, CDC, and JARO, Dune Foundation, et cetera. We've spoken to more than 40,000 customers, directly asking them what matters to them, and I'm happy to share what we're learning from that. But I think the headline is, we need to shake ourselves out of this slumber of letting ourselves off the hook and saying this must be hard and it must be left to the experts. Because, as you say, everywhere else in the world, talking to customers lets you know if you're meeting their needs. And somehow or other, we're in the needs-meeting business, and we just don't talk to customers. It's a little harder because some of our customers are more remote or harder to get to.
But I actually think this is going to shift from an assessment mindset to a competitive-advantage mindset very, very quickly, and I think that's the opportunity that people who are moving earlier are beginning to see. So we can talk more about that.

Great. Lindsay, can you share some of the work you're doing through the Hewlett Foundation?

Yeah, I just want to cheer for everything you just said. It's really nice to see all of you this morning. The Hewlett Foundation obviously has a broad range of things that it works on all over the globe. My particular focus there is grant-making strategies that try to make the philanthropic sector itself more effective, to your point. One of the core strategies that we helped to launch about four years ago is a donor collaborative called Fund for Shared Insight. The goal of Fund for Shared Insight is that funders and nonprofits would be more meaningfully connected to the people we're ultimately seeking to help and serve and benefit in this work. So often that link is missing or broken or dormant, not necessarily because people don't think it's important, although some may not, but because it's been, I think, underattended to over time. Fund for Shared Insight now has 13 core funders from all over the country who pool money through Rockefeller Philanthropy Advisors. We meet three times a year, set our strategy together, co-create that work, and do all of the grant-making. In our first year, we put out an open national RFP for nonprofits that wanted to collect feedback, ongoing high-quality feedback loops, from their clients. They could take our money and develop any feedback system they wanted, which is pretty expensive if each organization is building up its own system. I mean, not millions of dollars, to your point.
It doesn't have to be that scary, but it's still a lot to have everybody inventing a feedback wheel at the same time. We got 120 proposals from nonprofits across America, and we had the money to fund seven or eight of them. That didn't feel good at all. So we regrouped and thought about how we could offer something that could meet this demand, and we created an initiative called Listen for Good. Listen for Good is a simple six-question survey that you can use with clients. It's offered now in five or six languages, and we're happy to have organizations translate it into any other languages they want. The data is collected in SurveyMonkey, and organizations can analyze their data over time and look at a comparative benchmark of other organizations that work in an issue area like theirs. Right now, organizations need to get a Listen for Good grant from Fund for Shared Insight. They get $45,000, a technical assistance coach, and their account with SurveyMonkey, because we consider it to be in something like a beta. We didn't want to just burst onto the scene and say, guess what, 13 funders came up with six questions and you should all use them. How would that end? Probably badly. We've been at it for two years now. We have 112 organizations and probably close to 40,000 responses as well. We've just finished a data-mining project on those responses, so I can talk more about that. Our goal is that, while we'll keep offering the capacity building, about a year from now the tool will open up in SurveyMonkey so that anybody can use it.

Great. And Maryana, tell us all about Harambee and what you do there.

Sure. There you are.
Years ago, and in the first three years of Harambee, we had set a target of placing 10,000 young people into their first job, and achieved that. In the second three years we have now tripled that, to another 30,000, so a total of 45,000 young people in their first job. In doing that, we have reached roughly 350,000 young people who have benefited from the employability services we provide, and have had a million young people interact with the organization. I would say that the drivers for scale were a combination of awesome people and a relentless pursuit of how to use data, and how to use data to drive innovation very rapidly. In our organization, you really can't propose an activity if you don't know how you're going to measure it and how you're going to use the data. A lot of the innovation that I hope we can talk about has demonstrated that, in a matter of weeks, we are often able to find solutions for getting young people into jobs that, if we didn't use the data in that way, could take years and feel slow. I don't think we've figured it all out, and certainly a lot of the value of data comes from failing and then figuring out what to do, but we're trying very hard to implement what these guys are talking about, and that's what we're sharing here.

Great. Well, Lindsay, let me turn to you and ask, at the 10,000-foot level, why should people care? Why is it important to get data from customers in the work that you're doing?

Yeah, I think there are different arguments to make to different audiences.
I think sometimes we talk about this as being both the right thing to do and the smart thing to do for your work, to your point. One of the ways we've started to think about it, especially in the foundation world, is that there are two forms of knowing that foundations most often focus on. One is evaluation. To your point, that's often third party, there's a fairly long time between opportunities to measure, it can be expensive and difficult to do, and it can be important in the right circumstance to ask: are we achieving the outcome? Do people have jobs five years later, is recidivism lower, has the government saved money, or whatever the longer-term outcome is? That's really important, but it is only one way of knowing something. Then there's monitoring, which is a little bit of what you described earlier: what are the indicators that we can probably count ourselves, without a third party, and more often than every few years, to know if we're on track, to try to predict the evaluation that's coming, or as a proxy for that evaluation? How many people got jobs, how many people kept those jobs, et cetera. So we've started to think of it, and I know some stools in the audience may have four legs, but we think of it for now as a three-legged stool. You have the evaluation leg of how you know. You have the monitoring leg of how you know. And the leg that I think has been given too little attention is: how are people experiencing this? What is the experience they are having in the program? There are a lot of reasons that's important. We've invested in several research projects to try to understand whether, in certain fields of work, there are actually links between the perceptual data and the long-term outcome data. That would be remarkable if and where it were true, because you could get a really early signal. So to your point, it could be much cheaper and much faster if you knew.
Look, if people are telling you at eight weeks that they're not satisfied with your program, they are also highly unlikely to be successful at five years, and you could intervene a lot sooner. So that's one thing. But I also think we have sometimes undervalued the perspective that the people in the programs have about their own lives and their own experiences. We have undervalued experiential data and perhaps overvalued those other legs of the stool, like the evidence data or the expert data. I think that strengthening and building up that leg can bring a piece to the puzzle that, to your point, can really accelerate what we're able to achieve. And the cost of not investing in that leg of the stool is kind of invisible to us, because there's nobody to stop us, right? Who's going to come make us invest in that leg of the stool? If you're in a position to be doling out resources, you can keep doing it the way you've been doing it. So it's a little bit invisible to us what we could unlock if we pay more attention to the perceptions of customers. But I completely agree that if we do, I think we'll see growth and change that we haven't yet seen.

And I think one of the big distinctions that I've come to learn in the social sector is that in the private sector, you're really driven to look at data because your customer is also your funder, right? You're trying to sell something to someone, so you care whether they're going to buy your thing.
In the social sector, it's more complicated, because you have two different customers: you have your funder, which is usually a foundation or some sort of donor, and then you have your customers, and your customers are not the ones who are paying you. What I find is that a lot of organizations are spending a lot of time on evaluations and monitoring that produce data the funders want, where the audience for that data is funder compliance requirements or grant requirements, not data that actually helps you run your business better. It's data for compliance, not data for decision making for the business itself, where the audience is your own internal staff who are trying to deliver better services or products. So Sasha, you mentioned the power of technology; I'm wondering if you could talk a little bit about what tools you use to help social sector organizations collect data.

Sure, and I'll just go back half a step. I do think the distinction you made is the right one: it's the distinction between your auditors and your market intelligence function, and the idea that you'd run anything, anywhere other than in the social sector, with just your annual audit is so fundamentally problematic. Sometimes, especially when we talk to a more expert community, an evaluation community, they will say, well, we're worried about bias, we're worried about this or about that. But every single day in a nonprofit or a social enterprise, the people running that organization are making decisions in the absence of any of this data. So unless our data is worse than nothing, it is helping to make better decisions.
I mean, the how of what we do: we are heavily focused on using mobile phones. People have phones in their pockets, and a lot of the people that Acumen companies and our peers are serving are customers of modern goods and services for the first time, or for the first time recently. One thing we found is that responsiveness is unbelievably high. I do think over time this might evolve. In the US, we've done less of this work; I think if you call somebody on the phone in the US and say, I'd like to talk to you for fifteen minutes, or seven minutes, about your product or service, it might be harder to get through. But we find that whether it's in Kenya or India or Pakistan or Uganda or Ghana or wherever, there's an incredible openness to do this. Now, we do give airtime minutes and other bonuses for people to do this. We started out mostly using SMS, but we have found that voice is actually the best tool and gives us the best balance of value and quality of data, mostly because the cost is really low. Maybe just to frame this: we're typically surveying about 300 or so customers. So even though an SMS might cost a few cents and a voice interview might cost up to a dollar, which feels like a big multiple, your total data collection cost is on the order of $500 in direct costs. It's just not a lot of money. To get a little more specific, we found that what works really well is to have our own trained contractors, usually college students, make those calls. Again, it's not rocket science, but you can do it poorly. When we have trained people, there is a quality to the conversations they're having, and a synthesis they can do at the end of having had a hundred conversations, that is beyond each individual data point.
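To make the cost comparison concrete, here is a back-of-envelope sketch. The sample size and unit costs are illustrative assumptions drawn loosely from the figures mentioned in the discussion, not actual Lean Data pricing.

```python
# Back-of-envelope survey cost comparison.
# Unit costs and sample size are illustrative assumptions, not
# actual Lean Data figures.
SAMPLE_SIZE = 300  # typical number of customers surveyed

unit_cost_usd = {
    "sms": 0.05,    # "a few cents" per SMS exchange
    "voice": 1.00,  # "up to a dollar" per phone interview
}

for channel, cost in unit_cost_usd.items():
    total = SAMPLE_SIZE * cost
    print(f"{channel:>5}: ~${total:,.0f} total direct cost")
```

Even for the more expensive channel, the direct cost stays in the hundreds of dollars, which is the point being made: a twenty-fold difference in unit cost barely matters at this scale.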
And so it's this funny mix of qualitative and quantitative. Our starting point was actually very structured, multiple-choice kinds of questions, because we thought that was the way to get objective data. Then one of our partners in Kenya told us to try some open-ended questions. So in one survey we just added a question: is there anything else you'd like to tell us? Which has become by far our best question, because wow, do you learn a lot. In that particular case, 40% of the customers complained about the same thing, and the CEO, who had his own direct sales force, had never heard even an inkling of that. So now what we do, and we'll get to this, because I love the six questions and I think that's where this all ends up, is ask: what are the really, really great questions that empower customers to tell us accurately about their experiences? The way we've developed our questions is to start with open-ended ones. I'll stick with energy, because we've done the most work there and it's the easiest to describe. We went to the customers of our companies and asked: what is the greatest impact that having a solar panel on your roof has had on your life? And they would respond free-form. Then you take all their free-form responses, code them, and group them in various ways, and you go, okay, this is what they said; this is the most important thing, and this is the next, and the next. Then for each of those categories, let's develop better questions. We really are starting from the perspective of: you tell us what matters to you, and we will figure out how to ask questions that capture that in a more and more refined way. So that's a little bit of the feel for the how. But it's been an incredible thing, once we started walking down this journey of talking to customers. To your point about the title of the panel, flying blind, that's really what it's felt like.
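The code-and-group step described above can be sketched in a few lines of Python. The responses and the theme keywords here are invented for illustration; in practice, coding free-form answers is usually done by hand or with more careful text processing.

```python
from collections import Counter

# Invented free-form answers to "is there anything else you'd like
# to tell us?" -- illustrative only, not real survey data.
responses = [
    "the panel stops charging when it rains",
    "charging is unreliable in cloudy weather",
    "my children can study at night now",
    "the battery stops charging too soon",
    "we saved money on kerosene",
]

# Hand-built codebook: theme -> keywords that signal it.
themes = {
    "charging problems": ("charging", "battery"),
    "studying at night": ("study", "night"),
    "cost savings": ("saved", "kerosene"),
}

def code_response(text):
    """Return the set of themes a free-form response touches on."""
    lower = text.lower()
    return {name for name, keywords in themes.items()
            if any(kw in lower for kw in keywords)}

# Tally how many responses mention each theme.
counts = Counter(t for r in responses for t in code_response(r))
for theme, n in counts.most_common():
    print(f"{theme}: {n}/{len(responses)} responses ({n / len(responses):.0%})")
```

Once themes like these stabilize, each one can be turned into a structured question, which is the progression from open-ended to refined questions described above.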
It's just, how could you not do this? How could you not? And I think, at least for the companies we're working with, if you're starting a social enterprise you've got a million things to worry about. It's not that you don't want to; it's just that there are 4,000 things on your list, and if it feels like this is for the funder, it's going to be a have-to. So we find that we're showing up and saying, we're giving you business intelligence, basically, and people are really excited about that.

Yeah, I can't tell you the number of evaluations that are pored over and written and then just filed on the shelf and sent to the funder, right? People don't use them for their businesses. So Maryana, you're sort of our actual person on the ground doing the work and using data. I was wondering if you could share a little bit about what that looks like at Harambee.

I mean, maybe building off of some of these comments and trying to think about practical things, so it's not just what we do. This question of how to navigate what funders need and what you need has frankly been a lot about bringing our funders along. We report, and I mean, a year is like a lifetime, so we report on a quarterly basis to everybody about everything, because if the units of time are smaller, everybody has to focus; you have to actually achieve those targets. We've gone to funders to say, honestly, can we just have one report format that we do for everybody, so that we can share more with you? Customizing every report for everybody is a poor use of our time if we can give everybody the same information. People are happy to do that, and then it makes it feel like these are the metrics that drive the business. The second point, which you made, is that we have an SMS survey that goes to every young person every four months and collects information about their employment journey.
What that means is we have more current data than our own government about what's happening, because Stats SA, the South African statistics bureau, issues a quarterly report that looks at 18-month trailing data. So we now publish alongside them to say, why don't we look at what's happening this quarter in terms of employment outcomes? The dilemma is that you change the questions as you get smarter, and then people want to come do evaluations, and you say, but we haven't been asking that question for the last however much time, so you have to negotiate again. If your questions improve and change, you can't evaluate them over a five-year period. And we have definitely chosen not to go the evaluation route where it doesn't make sense, and to just say: we're actually putting kids in jobs, running a model that requires employers to pay, so we have the pressure of a customer on that side. The last point, which is really that your customers drive what you do every day, is that those metrics have to be visible to your teams and have to matter. Young people evaluate our teams; we get net promoter scores; the teams see that on a weekly and monthly basis. Just because somebody doesn't pay does not mean they are not a customer. That customer mindset means the person you serve is not a beneficiary, not a marginalized person, but a customer, and so the question is: are you delivering what they actually need?
My last observation is about the discipline that has evolved in our organization: you can't collect data if you can't explain how you're going to use it. There's this myth where you have lots of data, and it's actually not obvious to anybody what you're doing with it. You have long spreadsheets, you've got what you think is data, but nobody can articulate how it's being used to make decisions. It's not good enough to say, we are collecting this information. You have to be able to articulate: I will therefore make what decisions based on it? And build that as a ritual within the organization.

If I could add, and if you wouldn't mind, I'd love to ask you a question. I think it's so crucial to say that the first problem we're solving is between an organization or a company and the customer. To your point, we have tried to have that discipline, and at times we fall short. It's like, okay, well, question 13, why? Oh, we really want to know that. No, that's not good enough. Who's "we"? But I'm curious, because we've also seen this happen, about managing back up to the funder. Our working thesis is that if the data is valuable to the organization and they're using it, you can convince the funders that this is more important, because you can say, this is what matters to our customers. I'm just wondering how that's gone for you, because it's such a perfect example of standing up and saying, we also care about all the things you care about, but actually you're asking us slightly the wrong question. I'm curious how that's played out.

I mean, I think it's been two things. One is that you have to succeed in order to have that conversation, and so we've had the benefit of being able to say to funders, this is working, so please don't mess it up; let's just keep trying to get smarter about it.
The second thing, and I say this with all humility, is that sometimes you just say, this isn't going to work; I can't take your money and then actually move away from delivering for the customer. So, you don't really need question 13, and maybe we can ask question 67 and get you what you're trying to look for. We do a lot of negotiating: what is it you're ultimately trying to know, and can we find other ways to help you achieve that? I also wanted to pick up on the phrase you've used in other settings, vanity measures: things that are kind of sexy to report on, but if you don't understand how you get to that number, you're spending a lot more money than you need to, or people are chasing the wrong things. Understanding people's incentives is the other part of the question: what behavior does a metric drive? Are people working as teams, or elbowing each other in an unhelpful way, or making us look uncoordinated in front of the customer because they're each trying to hit an individual target? So we really shy away from individual targets; it's team targets for everything. And we use this idea of throughput targets: in order to get a young person into a job, there are 15 steps before that, and if you don't understand what you're measuring at each of those 15 steps, everybody chases the vanity target, maybe in a pretty inefficient and unhelpful way, hitting the top-line number while screwing up all of the stuff underneath it. So we make throughput targets very visible, measure them on a weekly basis, and then ask: are those driving the top-line measures correctly or incorrectly?
All right, just as an example, to be more concrete: a vanity metric might be the number of youth that we placed in jobs, because you got a youth into a job. But the more substantive metrics, in terms of the impact you want to have, are things like retention: are they staying in those jobs? How many youth did you have to send to the employer to find one they're happy with? Interview conversion rates. The geographies young people are from, because if you have jobs but you don't have pools that are big enough in the right geographies, you have a problem. Actually managing those kinds of things on a daily basis proves to be much more difficult than the top-line measure.

Yeah. Lindsay, I wonder if you could share an example or two of how this kind of data, through your program, has actually been helpful to organizations.

Yeah, definitely. One of our goals is that the data would matter meaningfully to funders, and there are all of these dynamics that we're talking about; I hope that in our lifetime we get to a day where we don't have to talk about them on a panel. Organizations are experts in the work that they do and experts in their clients, and they can and should position themselves that way to the funder. It's powerful to say, this is what we know about our clients and about what they need, and I'll tell a story about that in a minute. To your point, what we've really emphasized with organizations, and GuideStar is working, for example, on putting something in their profiles now about whether the organization collects feedback from clients, is a high-quality process for that feedback collection, not the scores. Because the minute you start getting into, well, what was the score? I mean, you've all hung up the phone after someone said, at the end of this call you're going to get a survey; could you please give me a 10?
That's not what we're up here to talk about, but it happens; people do what is incentivized. So we've tried very, very hard to incentivize a high-quality feedback loop, where you're collecting data using good questions; you're trying for a high response rate, so that people can't just say, oh, you have 5%, it's not representative, or it's just cranky people, and can't dismiss the data if they don't like what it says; you're using the data to make changes; and you're closing the loop with clients, telling them what you heard and what you are and are not doing. On that last piece, almost every nonprofit we work with tells us they've always fallen down there, because you're doing those hundred other things on your list. And yet it can be really important for building trust with clients, so that they know you were serious when you said: this is data for improvement, we're listening to you, we're not going to retaliate against you, you're not going to stop getting services from us. All of those concerns and questions come up in this work. If there's one theme to the examples I'll share of what we're learning, it's: prepare to be surprised at best, and sometimes to be wrong in your assumptions. We had an organization that was texting out the survey to its clients, and the CEO said, you know, I'm totally down to try this, but I don't think it's going to work. I think clients won't use their minutes to text us back; this is all domestic; I don't think they care enough. And they had a thousand responses in 30 minutes from their clients, and she just shook her head, right? She was pretty sure it wouldn't matter. In terms of examples, we've had both kinds; I don't want to frame this all in the negative. There are cases where they wouldn't have known except for the survey, but there are positive cases too. The questions we're using are a variant of Net Promoter.
So the first three are: how likely would you be to recommend this organization to a friend or family member, zero to 10? You can say "who needed it" if you want to qualify it. That's your Net Promoter Score question. Then the modification is two open-ended questions: what does this organization do well, and how could this organization improve? The final three are: how much of a positive difference has the organization made in your life, which is a scale question; to what extent has it met your needs; and how often did the staff treat you with respect? Questions four and five are so highly correlated that we're going to drop one and let organizations try out some other questions. To your point: if you can ask less and learn the same thing. One organization said that one of the best things they got out of their first survey was that their staff felt incredible when they found out that 98% of clients felt treated with respect. That was incredibly meaningful and validating to staff. On the other side of the equation, we have had experiences where management or staff learned things they wouldn't otherwise have known, or, I would say, sometimes it's more that the data elevates it. You've heard it, or somebody mentioned it, but you could kind of dismiss it, because, oh, Mary's always going on about that site, or, oh, that guy always complains. But when you see that 40% of clients wrote the same thing in that box, it gets harder to say, we're not going to fix that. We had an organization that found that all of the Net Promoter scores were pretty high except for their clients from the Caribbean; all the scores were much, much lower for that group once they were able to segment it.
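The zero-to-ten recommendation question is scored the standard Net Promoter way: respondents answering 9 or 10 are promoters, 0 through 6 are detractors, and the score is the percentage of promoters minus the percentage of detractors. A minimal sketch, with invented per-segment scores, of computing it overall and by client segment:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

# Invented 0-10 answers for two client segments -- illustrative only.
by_segment = {
    "segment A": [10, 9, 9, 8, 10, 7, 9],
    "segment B": [6, 9, 4, 7, 8, 5, 10],
}

for segment, scores in by_segment.items():
    print(f"{segment}: NPS {nps(scores):+.0f}")

# Pool all segments for the overall score.
all_scores = [s for scores in by_segment.values() for s in scores]
print(f"overall: NPS {nps(all_scores):+.0f}")
```

Segmenting the same scores, as in the Caribbean finding just mentioned, is often where the actionable signal shows up: a healthy overall number can hide a group whose experience is much worse.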
And so they were able to bring in a consultant to help their staff look at why that was happening and do training, and they also added a question to their survey, to what extent does your therapist take time to understand your family's background, beliefs, and values, so they could measure that over time.

We had another example where a food bank asked clients what else it could provide. Often with food banks, if you go visit, they'll say they have a new menu program or they're running a cooking class; when people who aren't clients of the food bank sit in a room with a PowerPoint, they come up with things like that, right? Things that relate to food. And when they asked clients, what else do you need the food bank to provide, the overwhelming thing clients wrote in that box was dental care. We could PowerPoint that all day long, and probably none of us in this room, unless you're a food bank client, would have thought of dental care. Now they have dental partnerships at all of their food bank sites, and clients can get the access they need. So it was about saying: this isn't necessarily about us, or our theory, or what we think is the right thing to do; it's about what these people we're hoping to make a difference for actually need.

Yeah, so often we're trying to project what we think people need onto them, and it's not the same as what they actually need. Yeah, go ahead, Sasha.

I mean, to your credit, Ann Mae, for pulling together a panel where we all violently agree; there's such alignment in what we're doing, coming from different places. But I just want to take a step back and say: this is not the way we're behaving. It's just not. Every time you see an impact investor say, no, no, we do all this work up front and then we're done, or, we know that every time X happens we can just impute all these other things.
The prevailing practice, 98% of the time, is not to. Every time, it's just, well, what we need is another framework, we need another this. And so somehow or other, I think what we need, and what we need to enlist your help in, you who have come and said this is important, is to make that unacceptable. Because when I'm listening to all this, I'm like, yes: Net Promoter Score every single day, talking to customers all the time, it creates value. It's almost obvious. But at the same time, it is so far away from current practice. And current practice is to ignore all of this, or to say, we'll do it over here, but this is not the way we will understand whether we're succeeding. So somehow or other, there's a breakthrough that needs to happen. And I don't know if it's just about acceptance, but I want this group to recognize how out of the ordinary and how exceptional this is. Every time you see someone saying, look, after we do the deal we don't go back, really, it's too hard, we don't know what it is: that's the normal way to do it. And what we actually see is such incredible divergence of customer experience. Do solar panels save people money or not? It really depends. Do solar panels meet the needs of rich people or poor people? It totally depends. And on and on and on. So I'm happy to share more of the data we're getting back. But there's this general assumption of, well, it's all sort of similar: when someone goes through job training, X will happen, so it's okay. And what we found is that context matters so incredibly much, and people's experiences are so divergent, and if you learn about them, you can meet their needs better. And again, I know that all sounds like it makes sense, but for some reason, this is not the way our sector is going yet.
And I can't quite figure out why, because, to your point, once you've started, you're like, how in the world could we not do this?

She knows why. You're raising your hand, so tell us the answer.

Yeah. I'm a consultant, and I've worked with a lot of organizations where I have basically pitched this, and I get one of two responses. One is: we have a grant from the Hewlett Foundation to deliver, and these are the things we need to deliver on. We need, for example, to target employment. We were discovering, actually, that most of the young women we work with are doing pretty well, but the men, surprisingly, are not. We wanted to make a pivot based on the data and the experience, but we are terrified, or afraid, to go back and renegotiate that. And most people, on the investor side and the entrepreneur side, feel like they cannot renegotiate, or are afraid to not hit their targets, whether it's a deal, a foundation, USAID, et cetera. So the feedback I get is basically: yes, we'd love to do what you say, but we don't feel like we can. That's the first point. And closely related: often organizations have people with technical expertise, PhDs in employment or health or education, and they don't want to hear the customer, because they're the expert, okay? So I think it's those two issues: I'm the expert, I know; or I have a...

But if you were to say that at Google, you would be fired, right? I don't want to go on; I want to hear from you guys.

Yeah, maybe to build off of that: my call to action is for funders to push themselves, because it is absolutely true that we're all driven by what we're incentivized by, and we have to educate our funders, bring them along, and say that what we collect has to be relevant for what we have to do.
I also think about units of time. None of us have PhDs in our organization, but if you have to tell me on a weekly basis what's happening, there is no time for big thoughts; on a weekly basis, it's: what is it that we're seeing? What is being reported? And what is visible to the entire organization? Now, you can go wrong with that, but this question of units of time on data definitely forces a sharper focus on what we're doing, whether it's working, and whether it's trending in the right direction.

I just wanted to tell one of our own stories, because I think the question is how to push on customer data deeper and deeper and deeper until you know what really is going on. We had a situation where we partnered with a local municipal government. They asked us to find jobs for young people who live in one of the most dislocated communities, really a legacy of apartheid: it is impossible to live in that community and afford the transport to get a job in town, and there's very little economic activity in the community itself. So if you got a job, you wouldn't actually be able to afford going to work, because of the cost of the taxi fare. When the government came and said, we really want you to find a solution for that community, the initial impulse was: we can't; there are no jobs in this community. So we really had to force ourselves to get creative, because we had an opportunity to work with a cruise liner company to put kids on cruise ships. And somebody on our team said, instead of running that project in Cape Town or Durban, by the sea and obviously connected to the maritime industry, why don't we pick this very dislocated community in the middle of Johannesburg? Because nobody would have to pay a taxi fare if they actually had a job on a cruise liner. So we went down that road, and it took a lot of innovation to get the kids floating and passing all of these tests. The very final step was that they had to take a medical assessment.
We had 45 young people in this first pilot, and 26 of them failed the medical assessment. And again, if you aren't really pushing yourselves, you'd be like, oh, that didn't work; let's find the next 26, or whatever it was. Instead, we went and really probed and pushed and asked: why did these 26 young people fail the medical assessment? And it was for two reasons: they had bad teeth, and their iron levels were low. It doesn't take much to understand that those are poverty-related health conditions that are fixable and treatable. So we said to the employer, how about we work with you to understand how to get the public health clinics in this community to give out iron pills, because there are jobs if you can get it together on these other dimensions. And what does it take to get somebody to address and fix dental issues? Because in that job, you've got to have nice teeth, and there's no hospital nearby, so you've got to be able to sort yourselves out. And I feel like sometimes quantitative data, surveys, big data, is where we're really focused, but actually having the tenacity to keep pushing and pushing and pushing on where the answers in the data are can deliver really unusual solutions. So now, in this community where there are no jobs, every kid is dreaming about getting on an international cruise liner, because it's possible, and they can imagine a kind of global experience that middle-class kids all over the world get but that had been out of reach. And so I think in some of these customer engagements, it's not just the quantitative survey data, which is easy to collect; it's the equivalent of the open-ended questions. If you are just relentlessly asking why, why, why until you get to something you can use, some of the biggest breakthroughs we've had have come from that as well. Great.
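Both the food bank's write-in box and this kind of why-probing come down to the same mechanical step once answers are in hand: tally the themes and look at what share of respondents raised each one. A small sketch, assuming free-text answers have already been hand-coded into short theme labels; the function name is made up for illustration:

```python
from collections import Counter

def top_themes(coded_answers, n=5):
    """Rank coded open-text answers (e.g. 'dental care') by how many
    respondents raised them. Returns (theme, count, percent-of-all-answers)
    tuples; blank answers stay in the denominator but aren't counted."""
    counts = Counter(a.strip().lower() for a in coded_answers if a.strip())
    total = len(coded_answers)
    return [(theme, count, round(100 * count / total))
            for theme, count in counts.most_common(n)]
```

A theme that 40% of clients wrote in unprompted is much harder to dismiss than one complaint in a meeting, which is the shift the panelists describe.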
Well, I'm going to open it up for questions from the audience. First, just give us your name and organization, and please ask a question rather than making a statement, so we can get through some of these.

My name is Saju Jain; I'm from India. To Sasha's point, and maybe Lindsay's as well, as to why these surveys or evaluations don't happen: could you almost make it compulsory for companies that you fund to have a line item for measurement and metrics? Then, A, it would force them to think about how they're going to measure before they get the money; B, there's money allocated for it; and C, it's baked into the true cost of delivering the product or service. So it would give you those three advantages, and because they've got to present their financials every year, that line item would be compulsory every year. Is that something that you've tried, and would you consider it?

I'm curious, Lindsay, for your take. I find myself torn about whether this is a make-sure-it's-funded question or not. Of course people need to have the resources to do it. But I do worry that that alone will be insufficient, because it reinforces the notion that it is for the funder. And I just fundamentally believe that somehow or other we need to make sure there's funding available for it, and be clear that this is about value creation for you, in service of your customers; I worry that simply line-iteming it won't get us there just because those line items exist. So I think there's a bigger piece. Part of what we're doing, which I'm happy to share more about later, is taking everything that we're doing, both ourselves and for other organizations, and trying to open-source our content, our questions.
So if anybody wants to go do this, we're building a website, lean data dot... I can't remember, .co I think it is. We have courses on +Acumen that 45,000 people have taken. We want to make it really, really possible for anybody who wants to do this to go do it. Because I do think that if you just have a line item that says evaluation, it's not going to lead to everything we're hearing about at Harambe. That is: how do we run the organization, how do we gear the organization to use data to serve our customers better? That mentality shift is happening in pockets, but when people see that evaluation line item, they don't end up at this level of inquisitiveness about how to better serve their customers. So a bigger mindset shift, in addition to funding, would be my answer.

Yeah, how can we shift the framing to align both funders and social enterprises or nonprofits around the question of: is this working? Both should care about this, but instead we end up micromanaging line items of activities.

That's exactly the word: it's activity-based. We're going to make sure you engage in the activities you said you would, and you have these really messed-up incentives: I promise I did what I said I would do. And again, anywhere else, you would go out of business if you did what you said you would do and it's not working. And if you said, gosh, I'm not sure what to do, because I want to change it, but I'm worried that my source of revenue will go away if I shift to meet customer needs better: that's where things get really, really broken.

Yeah, I'd love to see funders just give unrestricted grants and ask the question: is this working? Right, at the end of the day.

But not even just funders. To give a super practical example: if you look at the workforce development and training space, it is certification- and accreditation-crazy.
And we have chosen intentionally not to accredit, because if we can get that young person into a job in three weeks, why would we pay for 12 months of training? But then you're not accredited, and you don't get the benefits in that system. And so then you sit down, not just with funders but with government partners, and they say, well, how do you know if it's working? And you say: did they get the job? So why don't we actually start with what the ultimate metric is, and then say that certification systems and quality have their value, but they have got to be nimble and flexible and reward the right incentives. Funders trying to really change that at a systems level is going to be necessary.

And get kids into jobs, right? It's not about how many certifications you take.

Yeah. What we're funding, what we value, and the power dynamic: it's been a conversation for a long time, and I hope it isn't a conversation for a long time to come, but that's all very real and live. To the question of funding, I think we always think purpose first, so it's better to give flexible funding. We pride ourselves on doing that; not everybody does. The percentages overall are still sort of shockingly low, like a quarter of grants or so are general support. But it's about the leadership commitment, whatever the line items are. If we mandate a line item in the budget for measurement, then you're going to start in with the please-give-me-a-10 spirit. The spirit of please give me a 10 at the end of this call is very different from the spirit of: what do clients know about this that we don't, that might help us?
So the only thing we really screen for with Listen for Good grants is: does the leadership really want this data to learn and improve? Are they open to what they might hear from clients, and are they open to dealing with whatever hard things might come up, whether that's staff being upset or something difficult, like one client group being treated unfairly?

But it's so amazing: Listen for Good and Feedback Labs, there's just this tiny little micro-community around feedback, and it feels like it should be 10 times as big.

Yeah. So the reason we set Listen for Good up as co-funding is that we now have 70 funders. Organizations have to be nominated by a funder, so that funder has some skin in the game, although they get the grant from us, so it's a little bit third-party; it separates things a little. But the other thing I would say about these power dynamics is that I started to think, on a really cynical day, that we were all in a Stanford prison experiment, you know what I mean? We're all colleagues; we all have the same goal. I come to work every day, and it's my job to do the best I can to give away the money and meet the Hewlett Foundation's criteria, but the idea that I would somehow be that scary to someone is horrifying, you know? I had an organization last week where we were a little heavy-handed; we said, we kind of want you to do the project this way. And they called me back and said, we can't do it that way; do you still want to give us the grant? And we did give them the grant, and thanked them for the courage it took to say that.
So we do have work to do on that front. Melinda Tuan, who runs Fund for Shared Insight, she's our managing director, wrote an article like 15 years ago in SSIR called The Dance of Deceit, all about this dance of what things really cost, with everybody caught in their own pairing of that dance. I think we're only going to break it if everybody is braver. There's a lot of conversation in foundations right now about diversity, equity, and inclusion, both in our own work and staffing and in how we make decisions and what we fund, and I think listening to clients and valuing their experience fits very much into that wave that's coming. Human-centered design is another wave. So there are a bunch of waves that may help shift this. But organizations, you need to try to have that conversation with the funder, because if you don't even try, then we just continue in the dance of deceit, and we don't know it. I mean, we should know, I understand, but we don't know that that person was sitting in their office all twisted up, that we're funding the wrong thing, if they came to us with a beautiful, perfect report about how perfect it all was. Unfortunately, we're human, and we're going to be like, oh, you know. I hope we probe more, but it really does take both sides, and nonprofits are going to have to push as well.

So I got the five-minute signal, so I'm just going to take two or three questions. One, two, three. Yes, you have the mic.

I don't want to take up too much time, so please keep it short and make it a question, and I'll have the panelists wrap up by trying to answer.

So my question is to the funders, the people with the money: why isn't this part of the conversation early on with the
organizations, like in the due diligence process? If I were a funder investing money in her organization, for example, I would ask her first: do you know how satisfied your clients are? Then I'd have a conversation about whether I'll give money to you. At the moment, what I'm hearing from the panel is: I'm an entrepreneur, I wait for the funder or the donor to give me money, and then I learn whether my clients are satisfied. That's back to front, from my point of view.

Yeah, and I think... sorry, let me just take a few questions; we only have five minutes to wrap up. Up here.

Hi, sort of piggybacking off the back of both of those questions. My name is Tara; I'm a co-founder of a very small organization called Strength India. We work with women from violent backgrounds in northern India. There's not a lot of infrastructure; we're a very small team, totally bootstrapped as far as funding goes. So as we're trying to take our work to scale, and figure out whether we want to go completely philanthropic or straddle more of the social enterprise route: what kind of data questions are you looking for? I know that's a specific question, but you've got the three general legs you were talking about earlier, and if we're going to have to make choices, with just me, my co-founder, and a project manager, what kinds of choices should we be making beyond customer satisfaction, especially as we want to grow and scale?

Right, and then back there, and after that the gentleman on the side here.

Great. Hi, I'm Zach from Vera Solutions. We build data systems for nonprofits and social enterprises to help them better track impact. My question is actually around systems. We've been talking a lot about content and questions, and what we've seen over the last seven years, working with about 200 organizations, is that there are really three keys to good data: you need good content, meaning you're asking the right questions in the first place; you need capacity within
your organization, so capable people with the right time available; and then you need good systems for managing data and giving feedback loops to the people who need them. We see a lot of people making bad decisions about what systems to use, whether it's I-know-a-guy, or trying to do everything with Excel, or this-looks-really-cool. Curious to know what best practices you've seen in terms of system and technology selection.

One last question there, and then I'll let the panelists each wrap up. Sorry, it was the woman up here who had her hand up earlier.

Yeah, I'm just wondering if any of you have thoughts on the future use of mobile phones for gathering data: more, better, faster, while being respectful of people's privacy. Kind of the reverse of what Sasha was saying about using voice versus SMS, but in terms of locational capacity and the ability to get lots of real-time data, what do you hope you'll be doing with those in five years?

Right, okay. So here's your challenge: I'm going to give you one minute to answer as many of those four questions as you can. Go.

I have four quick answers. I think you have to have the courage to know what you want to measure, and then you tell the funders; it's not the other way around, because you know what you're trying to achieve and change. We have had a very similar experience using voice; it's actually very effective, and if you do outbound calling, you can use IVR, you can use a lot of technology to get things done at large scale at very low cost. And to your question: we didn't wait until we were big and resourced.

Yeah, right. I think that sometimes the question of the system is like a technical problem to solve, and the first question, around what we should measure and why the customer isn't the first priority, that's the adaptive problem to solve. We don't totally know; we can't system our way out of
that; it's like we're in a rut in the road, maybe. And I think that whatever work you're doing, you need to be the expert in that work and do the highest-quality work that you can do, however you define that, and then have courageous conversations. If from the beginning you commit to yourself that you're going to do really high-quality work, that you're going to talk about it honestly and candidly with funders and see what happens, even when that feels hard, and that you're going to build in strong data systems, all three legs of the stool, in whatever order is important to you, then you set yourself up for a lot of opportunity. We have a combination of organizations. Some are building this customer-feedback leg late: they have RCTs and other things, and they're sort of shocked to say, wow, we work with our clients every day, we have an RCT that shows we make a difference, and we've never asked them how they're experiencing this; and now they're totally bought in, so they're building that leg much later. We have other groups, like a transportation program for moms in Buffalo, New York, where the program was developed through a human-centered design process: what do women need to get good prenatal care? Where they landed through that process was: they need transportation, they need bus passes, they need bus stops where they can push a stroller through the snow in the middle of a Buffalo winter. And now, to your point, it doesn't stop there. Now that they've designed a great program, they need an ongoing feedback loop, and they're able to build that in, and build the data system in, from the beginning. Any time you can do that, I think that's a win, but it's fine to retrofit it too. Sasha?

I agree on all those things. Quickly: on the last question, I'm hopeful that sensors and passive data collection will become part of what we do very soon, and then we'll only ask
people things that you can't collect in other ways, though there will be some privacy questions; I think it's interesting. Triple underline: you need to know, it's your customer, not the other way around. And just a few other things. It's funny, because we've done like 150 of these projects, and we only just got a database recently; that has not been the constraint so far, and we have like a quarter million data points, probably more. I would love a funder who just said, I will not give a nickel to anybody who doesn't talk to their customers, period. That would be neat. And just forward-looking, on orders of magnitude, because we've only talked a little bit about cost: from our perspective, doing straight standardized surveys, totally standard surveys where the questions exist and you're just executing, is a few thousand dollars. A big soup-to-nuts customized project, where you go from nothing to amazing data in like four weeks, is, at the super high end, like twenty-five, thirty thousand dollars. Super high end, right? So compare that to an RCT. I must admit, I shudder at how easily and frequently people say, we will spend six figures, maybe even seven figures. You literally could do 50 of these, 60 of these, 70 of these, and change the culture of 50, 60, 70 organizations. And if I had my wish between moving an organization that doesn't use data but wants to, to one that does, 50 times over, versus knowing one supposed truth when in reality context matters so much, I am really, really nervous about the second one. But that is what people are used to funding. So I just think there's wide-open space here for people to fill. And again, we're really, really committed to making everything we're learning publicly available, and we've just gotten some funding to build that out, so that there can be 1,000, 2,000, 3,000 lean-data practitioners, because we need this, we need this to make it
happen. And somehow, I think, if we can just get over this rut of it's-hard, or maybe it's not hard but I-want-to-do-it-and-don't-quite-know-how, we're trying to help people leap that. We would love for people to raise their hands and do it, help support it, make it happen. Because hopefully, a year or two from now, it'll be: no, no, my customer feedback is better; no, my customer feedback is better; I'm winning. I don't want to wait for the funder, because this is how I win in the marketplace.

Yeah, to Sasha's point: if we account for the money we spend on gathering data only within the frame of running a grant, then it seems expensive. But if we look at the value for money we're getting, in terms of the impact and the improvement we can drive, it's probably the best money you can spend to get the greatest impact.

So thank you all; this has been a fantastic panel. As Sasha said earlier, I think we're all violently agreeing on this stage, but this is still very much not common practice. So I hope you've learned something here and will take it to the organizations you work in, and really push, whether you're a funder or an implementer or somewhere in between, to make data part of the equation to drive much greater impact in the work that we're doing. So thank you all, and thank you to the panelists.