In this episode, you're going to learn what to do when you run up against the person holding the Excel sheet in your organization. You're going to learn how to show the impact of your work as a service designer. Here's the guest for this episode. Let the show begin. Hello, my name is Joyce. This is the Service Design Show, episode 125. Hi, I'm Marc and welcome to the Service Design Show. On this show, we try to look at what's beneath the surface of service design, at the things that make the difference between success and failure, all to help you design services that make a positive impact on people and business. The guest in this episode is Joyce Yee. She's a professor of design and social innovation at the School of Design at Northumbria University. If you've ever struggled to explain the value of your work, you're not alone. This is one of the biggest challenges in the service design community at this moment. Joyce is one of the people who has developed a framework that helps designers show the impact of their work, all while staying true to the emergent nature of design, where value is created at times, in places, and in forms you didn't expect upfront. If you stick around till the end of the episode, you'll learn how this framework works and how you can use it in your work to show the value of design, even if you're currently in a context driven by short-term, quantifiable results. Now, while you're here, make sure to click that subscribe button if you haven't done so already, and click the bell icon to be notified when new conversations like this come out. And that happens every two weeks. Now it's time to sit back, relax and enjoy the conversation with Joyce Yee. Welcome to the show, Joyce. Hi Marc, how are you? I'm doing very well. It's a lovely day in the Netherlands, so I'm really looking forward to having this chat with you. For the people who don't know who you are, could you give a 30-second introduction?
So my name is Joyce, I'm a professor of design and social innovation in the School of Design at Northumbria University. We're based up in the North East of England, in the UK, and I've been working as a design educator and researcher for the past 15 years, mainly teaching interaction design, service design and social innovation. And I think you have a special relationship with a guest who was on the show not so long ago, Emma Jeffries. Can you share a little bit about that? Yeah, Emma and I go back a long way. We've been friends since she started doing her PhD at the School of Design. We got to know each other and basically ended up writing two books together. So yeah, she's been my kind of partner in crime in a lot of the work that we've done so far. In the episode with Emma, we talked about empathy at work. We're going to discuss a different topic today, I think just as interesting. Before we dive into the conversation, I want to do a 60-second rapid-fire question round with you. The idea is that you answer these five questions as quickly as possible. Don't overthink them. Ready? Yep, ready. Question number one: what's always in your fridge? Coconut juice. Question number two: which book are you reading at this moment? I've started a lot of books and haven't finished them, but at the moment I'm reading one on reading outdoor clues and signs. I'm a keen hiker, so it's interesting to understand how trees grow and how to read the landscape. I will add a link in the show notes to that book. Yes. Which superpower would you like to have? I would love to fly. I love to watch birds of prey, and I'm just amazed at how they can soar without hardly any required energy, and how they can see the world very, very quickly. The next question is: what did you want to become when you were a kid? Definitely not a doctor, because I think I faint at the sight of blood. I don't quite remember.
I think actually very early on I wanted to be a designer, oddly enough, so maybe I've just been obsessed with it from young. Oh, good for you. Maybe quite sad. Well, I wouldn't say that. Finally, the final question is: when did you first hear about service design? Actually, when it really first came out, I think probably 10 years ago or more than that. I remember the Design Council had a festival here, Designs of the Time, in 2007, so actually more than 10 years ago. And they were bringing together, I guess, what you might call very early service design agencies and having them work with the community. And I also have another good friend, Lauren Tan, who worked on the first book that Emma and I did. She did probably one of the first PhDs on that, looking at the design methodologies used in the projects that were part of Designs of the Time. So actually very early on. And I was working with people like Professor Robert Young, who was quite early in theorizing the topic area. So yes, I was aware of it quite early on. Cool. Joyce, let's touch upon a topic. We did a sort of preparation call, as I do with all my guests, and we went over a list of what might be interesting topics to discuss. And we had quite a big list. And I said, I think there's one topic that would really be interesting for our audience: we're going to talk about how to evaluate social impact, right? That's one of the areas that you're passionate about. I think there's a lot to explore. One of the most common questions I get is: how do I prove the value of my work? How do I measure the value of my work? How do I convince other people that it's actually working? So I think there are still a lot of mysteries that need to be unraveled. You've done a lot of research around this, so that's the topic for this episode. Before we dive into that, I'm really curious: how did you get on this journey?
And how did you get interested in figuring out how to evaluate social impact? Yes, so I guess from a personal practice point of view, I was involved in delivering some service design projects quite early on. One of the projects we were involved in was turning a traditional face-to-face, work-based training program for a company into a completely online digital service. And this was very early, in 2010. And the project was hugely successful. By the end of the project, they had managed to convert all their face-to-face learners onto this digital platform, which not only enhanced the way they were delivering the program but also brought a lot of cost efficiency, with the advisors and trainers not having to travel to meet face-to-face and all that. I mean, they're still doing face-to-face; it wasn't replacing it, but it was much more efficient. And they were able to manage all the documentation online. So we knew it was successful from, I suppose, the business point of view: cost savings, efficiency. But we also knew that it helped improve the learners' experience. So we ended up wanting to do an evaluation of the social benefits of this service. And at that time we asked: what are the evaluation methods out there? Because design is really hard to quantify in terms of benefits and outcomes. So we ended up finding this method called social return on investment, SROI. It's like return on investment, but with an S at the front, adding the social. So we thought, oh, that's perfect. It's an established way of evaluating. The business people will get it. They'll understand it. And we tried to use this methodology. I mean, it's quite an involved methodology; actually, you have to be officially trained to use it. We almost did it guerrilla style: learned about it ourselves and ended up just trying it out as a test, really.
And we got the organization on board, and they were really happy to do this. But in the end it just didn't quite work out. I think it was because of the fundamental way SROI works: it assigns a financial proxy to an outcome. So it tries to quantify, for example, the cost savings of journeys: how much does that save? Yes, you can save on fuel, but what about reduced pollution, better wellbeing, for example? It's very hard. You can find some proxies, but there are also a lot of overlapping proxies. So how do you avoid double counting, in the accounting speak? That's the fundamental issue we ran into. We found it very challenging, very hard to really pinpoint exact benefits. So we did it, and it did tell us certain things. But actually a lot of those insights really came from interviews and long-form narrative, from understanding what had changed, rather than from trying to quantify that into a financial proxy. So that was the start. I learned a lot from that and realized it just wasn't doing what we needed it to do. And how I ended up focusing much more in this space now is because my research is really around working with designers who support social innovation initiatives, mainly around the Asia-Pacific region, because I'm originally Malaysian and I still have lots of links there, and I'm really interested in how design is being used in the region now, like in the West, to deliver innovation. And when we spoke to a lot of practitioners there, just to find out what they do and what their practices are, one of the challenges they kept talking about was social impact, particularly around social innovation, where the main outcome is about improving people's lives, delivering social benefit.
But of course they come across the same kind of issue that I personally encountered, which is: how do you meaningfully evaluate social impact? Yeah, and I think it's good to address that even though we're talking about social impact and maybe social innovation, a lot of the same challenges apply in the world of service design. So I think in a lot of cases it's almost interchangeable. One thing I noticed from reading your stories and your research is the word evaluating, rather than something like measuring the impact. I think it's a subtle difference, but it makes a lot of difference, right? Can you elaborate on that a little bit, evaluating versus measuring? Yeah, we actually use the term evaluation because it's quite common in the parlance of the sector. In international development programs or social programs, for example, you often hear the term monitoring and evaluation, M and E. M and E is a field in itself: they often have an external partner do an M and E on a program, so it's usually done externally. So M and E is a well-used term. We don't like the idea of monitoring, so we thought, okay, evaluation is actually what they know, and it's the term used by the practitioners, so we went with that. But I also think that even the term evaluation carries a lot of baggage, because it is linked to this idea of measurement, to this idea of monitoring: making sure that the money you've given to the communities is used properly and not being wasted. There is a lot of baggage simply because historically that's the term and that's the function, because obviously a lot of this funding comes either from government or from private philanthropy, and they have a board of directors who want to know where the money goes and whether it is put to good use or not.
So that's why we've used evaluation. But we've also used it to move the idea away from monitoring and assessment. Okay, I'm learning with each and every episode of the show that there are new areas I know nothing about, which is awesome. One of the questions I had was: what kind of limitations are there to existing ways of evaluating? Why did you set out to find a better framework? So a lot of the M and E, monitoring and evaluation, models being used are very similar to the SROI I mentioned, which is pretty much set up as an assessment tool. It's also based on accounting models, so putting financial proxies on outcomes. What is a financial proxy? So for example, say one of the outcomes is to increase the number of children going to school and finishing school. You would then put a financial proxy on that: if they are kept in school, when they leave they will get better jobs, and the jobs will pay them 200% more than what they would earn if they hadn't finished school. So it tries to attach those proxies to those outcomes. When actually what is more meaningful is that if they stay in school longer, yes, they might have an opportunity to get a better job, but it might also mean that they have children at a later age, that they're not pushed into forced marriages, and so on. All of these are really hard; you can't quantify them. So I think that's the challenge with a lot of what we call the traditional or dominant ways of evaluating. I wouldn't say all evaluation is like that. There is a subset of evaluation methods that does take into account these longer-term social outcomes. But the majority, and probably the kind that is used because it's quite easy, is largely quantitative: measuring the number of people in the program, increased income, participation, all that sort of thing.
So things that we know are easy to measure, and easy to measure at the end of a program. But unless you have a lot of budget, these evaluations can't go back six months, a year, two years later to look at the outcomes. And we know social outcomes actually take a lot longer to develop; some could take five years, some could take ten. So those are some of the challenges. Another reason we're very keen to change the conversation is what we hear from the practitioners: the challenge of evaluating, and the stress as well, because they have been told by the funders that they have to evaluate, and they may not have the capacity or the capability to do the kind of evaluation expected for reporting at board or funder level. So it also puts pressure on the practitioners. And sometimes the criteria they're being asked to evaluate against are set by the funders, miles away in some boardroom, or by practitioners running cross-national, international programs who want to do cross-comparisons, so they set a baseline. But it doesn't quite work for the specific community. So they also find that the criteria they're asked to evaluate don't quite match what they see coming out of the work. They get frustrated: well, why aren't they looking at this? Why aren't they looking at that? So the model isn't working well, especially if you're using a more design-led approach, which is very emergent: you don't know what you're going to find out. You don't have a fixed outcome. You believe in the process; you can estimate, you can hypothesize outcomes, but they might not come out that way. So it doesn't fit the model, first of all. Secondly, it's not equitable. It's someone on the outside telling us what we've done well and what we haven't done well.
And that's very disempowering for the practitioners and the communities. And it quite often doesn't capture what needs to be captured, because they're probably measuring the wrong things. I so feel the pain. I've been in so many situations where this was the case, and the challenge with services is that they are systems, and systems are really hard to measure at an individual touchpoint level. You sort of have to observe the system, and then it becomes really complex to measure all the side effects and side benefits. Like you said, measurements are set out, standards are set from outside. It sounds like a lot of challenges. And if we go back, what have you found to be some of the root causes? Is it because it's easy to measure? Are people at the top trained this way? What's preventing us from adopting a different way of understanding success? I think it's the same kind of challenge that a lot of designers face when they're trying to help an organisation be more people-centred or more design-led. There's this one way of doing business, which is very bottom-line, seeing processes and outcomes but not seeing the human dimension of it. It's the same thing here. International development, certainly at the top-down level, just sees programmes that need to be defined, with all these processes and methods in place. They don't quite think beyond that. I mean, it's a mindset, a way of thinking, I think, but it's also very hard: it takes a lot of time and money to really evaluate and really look at the outcomes. You do need the resources to do that. But there's also the framing, this idea that we do need to measure and know where the money is going. I guess what we're trying to do is not to replace this existing model, because we understand there's a need for it, a need for reporting and a need for accountability.
Certainly with public money you need some sort of accountability, but what we are trying to do is change the conversation from this language of measurement to using evaluation as learning. It feels like a missed opportunity, because this idea of evaluation is something that designers do all the time. We don't call it evaluation, we just call it prototyping. What struck us was that the practitioners were doing this all the time. They're reflecting, they're co-designing, they're prototyping with the community, but they're not calling it or framing it as evaluation, so it's not recognized as such by people who aren't involved at the delivery level, even though it's part of learning about the project and producing better outcomes. So that's one of the challenges of this topic, really. There is a gap between how we evaluate and learn versus the way people expect us to report success. Exactly. Have you found ways to bridge the gap or bring these two worlds closer together? Because that seems like a really tough challenge. Yeah, we're doing it in a number of ways. We ran a dedicated event through this network that I lead with my colleague Yoko Akama at RMIT. We set up this network called Designing Social Innovation in Asia-Pacific, DESIAP for short. And we gathered the practitioners together at a number of events in Asia-Pacific to, A, understand how they evaluate their own work and learn from that, and also to have conversations about how they see the role of evaluation and what it does. A lot of our work has then come from that kind of discussion. So we derived this series of principles and framings. And then we also felt it was really important to talk to funders, to analyze the gap. The practitioners told us the challenges, but the funders, how would they respond?
So we ended up having a workshop with the funders. We publicized it as: are you interested in different ways of evaluating? So obviously they self-selected. But the ones who came ranged from quite large international organizations to small philanthropic organizations. And they were certainly aware that there is a gap between how they normally do things and what actually happens on the ground. But they also face challenges, the same challenges that we encountered: legacy. There's a set way that the directors want them to report. They don't have the money to look at impact over the long term. And some of them have the agency to change, like a smaller philanthropic organization that we've worked with. They're really keen, and they work on what they call a trust-based relationship, which basically means they don't ask for project proposals. They work with partners long-term. So if the partners say they need some money to do a project, they say, right, let's have a conversation, rather than submit a proposal that gets evaluated. So they work in a very different way, but they can because they're smaller. Larger organizations have all these structures and ways of doing things in place. So we know there's recognition of the challenge, but what we found out is that they didn't really know how to resolve it either. So we're having ongoing conversations. And then we've also been starting to do specific, targeted work with organizations. For example, I've been invited in by a team at the Young Foundation in the UK. The Young Foundation is one of the longest-running not-for-profit organizations working on delivering social innovation programs in the community, going back to the 1950s. They were invited to be part of this Communities Driving Change program that's funded by Tower Hamlets Council in London.
And the program is really about working with residents to help them understand the things that can improve their well-being, and actually helping them design interventions, you know, service design type interventions, to deliver into the community. And the Young Foundation is one of many providers, so they work with other partners as well, but that team is really interested in using evaluation to help them learn about the program. So I did a series of workshops and coaching with them, and helped them use an existing platform and map onto what they already do anyway, like team stand-ups on a weekly basis, but getting them to reflect more formally on this platform, note things down, and answer very specific questions about the project, using that reflection to learn and to refer back to on a weekly and monthly basis. Now, it may not sound like much. It sounds like a really simple solution, but it taps into what they're already doing and makes it a bit more thoughtful and a bit more, you know, recorded. And that helped them within a short period of time: they can see the benefit, they can see what they reflected on and what action they need to take in the short term and perhaps the longer term. And pleasingly, the program manager said that he's able to go into these reflections and pull out some key, unanticipated insights that he can put into the quarterly reporting they have to do. So pleasingly, it's not just helping them learn about the project and improve as they go along; it's also feeding into the more formal type of reporting. So those have been a few of the ways we're trying to approach this huge challenge. So what I hear you saying is that, one, maybe we as designers need to take more time to capture our learnings, formalize them and make them tangible.
I think we like to think on our feet and really quickly incorporate our learnings into the next action we take. So we like to do things fast-paced, and formalizing and capturing things feels like it's slowing us down, so we sort of overlook the benefits, the value, of making things tangible. I think documentation is a big challenge in service design for sure. There's so little documentation; there are so few known and well-documented case studies, for example. I'm curious, can you give a few more tangible examples of things you've encountered that service designers are perhaps already doing without realizing they could use those actions for reporting? Yeah. So, one of the exercises we gave the practitioners in the workshop was to ask them to reflect on the way they evaluate their work, but on three different scales. The first is the personal scale: how do you reflect on what you've done, what has worked and what hasn't worked? Then you have the team scale: how do you as a team catch up on things, learn, and work out, okay, we've done this, but this hasn't quite worked. So that's the middle scale. And then the largest scale is evaluating with the stakeholders and all the other partners and the community; that's the more formal evaluation that our practitioners are used to being asked to do. What was really interesting, and what didn't surprise us, was that there were lots of things they were already doing at the personal and team-based level. So at the personal level, things like keeping notes, keeping diaries, keeping sketches, keeping audio recordings; when you're driving back from a workshop, you might just chat and say, this is my quick reflection.
And then at the team level, if you work in an agile way, you have stand-ups, daily stand-ups, weekly stand-ups, and then you might have team meetings. You're not capturing everything at once, but you might have some notes that you make. You might have photographs, team documentation, et cetera. So all these things are already part of a designer's process. What perhaps isn't so obvious is how you use that, maybe by assigning some reflection time. So we use what's called a reflection, or action learning, cycle. You don't just describe what happened; you try to understand why it happened the way it did, how you feel about it, and then what you actually want to change about it. So it's very action-oriented, which fits design to a T anyway, but it's much more explicit. And when you have those little action points, the things you want to do, they can be logged and taken into the team meeting. So: I've had this issue with so-and-so, and the community wasn't quite responding the way I wanted, so I just tweaked the methods and tools that I used and tried again. Then you can see and track these action points as you go along. And that's what I think is hopefully working with the Young Foundation team, the CDC team, because we were aware we didn't want to ask the team to do any additional work on top of what they already had to do. They're already time-poor, so the reflections, the diaries, that we asked them to do are not long, just four questions. And sometimes they were also interested in things like: who did you speak to? How many people have you spoken to in the community? What were the reactions? So it was also a way to log the range of people they've spoken to, and who they perhaps might have missed and still need to talk to. So it was helpful for a number of things, revealing things that are often very implicit for a practitioner but not logged.
And if you think about someone who then has to write the report at the end of the project, they need access to these things to be able to tell the story. If all they have is some service blueprints or personas, and then the beautifully designed product or service, that shows artifacts; it doesn't tell stories. And I think that's the important thing about understanding social impact. What I've been hearing in your story, and what I recognize from my own practice, is that a lot of the value is in the process. The big mistake we tend to make is to only document the outcome, like you said, the artifact. But a lot of the value is created in the process, in the learning, and you need different ways to capture that. You do not capture that in a single drawing or a single blueprint. And that's what I'm hearing you describe: capturing the value that's created throughout the story, throughout the process. Is that correct? Yeah, I think so. And it's not just for the benefit of capturing it and then being able to celebrate the success with the funder; it's more important to celebrate success with the community. It's also about understanding what the community values. So you're working with, sorry, the service users, for example. If it's a service design project, what do they value? What do they value as impact from the work that you've done? The client may have a very specific outcome they want to achieve, but the value for someone whose life has just been made a lot easier, accessing tax credit, for example, could be enormous, you know. But we just see: oh, look how many people have accessed the service, 20% up from last year, great.
So I think it's also about celebrating success for the community and for the practitioners, because it's about changing the conversation: knowing that what they're doing is evaluation, and that they're not being asked to do something additional that they're not familiar with. They always tend to think: oh, I have to do evaluation, okay, I have to do additional work. But actually this isn't additional, it's part of what they do, and we do it all the time. And if you get them to think about impact right from the start of the project, it's really helpful, because you're constantly thinking: well, how would this impact the criteria I'm aiming for? So it's actually quite helpful to think in that way, to change the practice, to enrich the practice in this way. So when designers think, okay, it's additional work, I have to evaluate, I have to formalize it, there's probably a reason why this feels like a burden. Why aren't they feeling or experiencing the value of their efforts in capturing this? I think because it's often an add-on, right? We only think about the value of reporting when we're asked to do it by our client, and then we think: okay, well, we need to put in additional resources. If we think evaluation, it's: oh, we need to do some surveys, or we need to test for these things. And so it does feel like an addition to what we already do, when we're very much about, you know, delivering or designing a service and an offering.
But I go back to this thing about changing the framing, not just from the funder's point of view but from the practitioner's: this idea that evaluation is not measurement, it's learning, and we do that all the time. So if we can just find something in what we already do and maximize it, leverage it. For example, the notes that you take, and the team meetings that you have: if you just have one agenda item around "what have we learned about the project that might help improve its outcome", that little line could then be used to help build the story of the evaluation, you see. They're just small things, but surfacing them and noting them down means that whoever is responsible for bringing the story together doesn't feel like they have to recreate it; it's just there. And I think you touched upon a very important point here: you have to do something with the things that you formalize. If there isn't a moment where you come together with a group, reflect on your learnings and fold them back into your practice, then it does feel like a burden, because you're putting time and effort into documenting something that's just there for the history books rather than something you actually benefit from. So that additional step of using it is, I think, the crucial key here. Yeah, and I think that's why this thing about learning matters: using those notes to say, well, that meeting didn't go very well, we presented the ideas and the stakeholders, the users, didn't really respond well. Why didn't they?
Did they simply not like what we presented, or did we present it in the wrong framing, or were we completely off the mark? You use that to say, okay, what's the next step, what are we going to move forward with? So it's very much reflective practice, and it's action oriented. You're not just pontificating, saying, well, why didn't that work, the world doesn't understand us; you need to turn that into actions you can test, try out a prototype, and move on. So it's not an additional thing, and it shouldn't feel like one. But because of the history behind it, the legacy, this idea of design always having to prove its value, there's always a negative perception to it. It used to be externally driven: you're doing it for somebody else rather than for yourself. And I also think, if you look at the double diamond, which I have some issues with, evaluation, or that meta level of learning from your actual process, isn't visualized in the way design processes are often represented. That misguides people: we're just moving forward, moving forward, without incorporating evaluation, reflective thinking or critical thinking into our approach in order to actually move faster and make more progress. So it's also just a lack in how we have communicated about the design process so far. Most definitely. One of the things I'm trying to imbue in our students at undergraduate and master's level is: how are you going to evaluate your work? How do you know it's worked? And they're like, I'll just do a survey, I might observe, I'll get a few friends to test out the system. Sometimes that's as much as they can do. But evaluation doesn't just happen at the end, it happens at every stage, right? And it's also our fault: as a practice, we just never really talked about it. It's not the sexy end, is it? And sometimes the
ego gets in the way, because you think it's going to work, because I know it's going to work. Trust the process, yeah, that's what we say. So I think it's partly on the discipline: practitioners need to pick up on this. As educators we're trying to pick up on it, but it's not easy, because students are still learning how to be designers before they even want to evaluate. But we do evaluate as students, we evaluate them all the time, so we want to emphasize that, as well as changing the language and the models that funders are using. So it's a many-pronged issue. Yeah. I'm not a trained designer by any means, but from what I understood, in a field like graphic design, design critique is an integrated part of the process, or maybe it used to be, and I hear almost no conversation about design critiques within a field like service design. I'm curious why we've never adopted a practice like that, which I think would be extremely beneficial for everybody. Yeah, that's interesting. My background is graphic design, and I moved into digital design and then interaction design and service design, almost because I could see the benefit of applying design thinking across different ways of doing things. I don't really know, actually. It probably still does happen, certainly in education; on service design courses I know that when students present their ideas there is a level of critique and discussion. But whether that happens in practice, I can't say. I haven't worked professionally as a service designer in an agency; I've always done it in a research, academic context, in which you're always trying to work out whether what you're doing is actually helpful or useful, and there's an added incentive to write about it, because we need to publish. So with this idea of going back and reporting on what's worked, to a certain extent we
have an added incentive. But I have worked as a professional designer, and I know from that experience you don't have time. You go from one project to another; you don't reflect and ask, how did that project go? Not even just reporting externally, but within the team: you just get on with it, you don't have time to sit and reflect. So I think it's also about practitioners talking about it, talking about it with the client, and planning it into their costings, very simple things. And I think it has to be mentioned, because more and more, with these services, you need designers to understand the business value, the value to businesses, so they need to be able to defend that. So let's go back to that conversation, because we still need to address it: at some point you are going to run into the person who's managing the Excel sheet. I'm curious, have you found ways to open the conversation with people who have that kind of mindset? We're all about how success will change, it's emergent, we don't know what will happen, and then there's somebody who needs to manage an Excel sheet. How do we successfully open a conversation with somebody in that position? I think you try to sell it to them in the sense that ultimately, if we adopt evaluation as learning, it will lead to better outcomes, and that could be better financial outcomes or better social outcomes, and they're linked. It's almost like, why wouldn't you do it? And you can say, look, it's not going to take a huge amount of additional time; we're already tapping into what designers are already doing, we're just formalizing it a little. If we can use, for example, these standard meetings, and we record them, or do a quick recording at the end covering four points: what happened, how we felt, what we've
learned, and what we're going to change. We can take that forward to the next meeting and argue from the point that it's already there; it's not an additional thing we have to do. We might have to enhance it, capture it, be more mindful of talking about it in our day-to-day practice and keep an eye on it, but ultimately it will lead to a better outcome. And when we then say to our client, we are able to see the value, not just the value that you define but value in this other sense, it makes for a better story, better outcome reporting that you can feed back to the client. The challenge still remains, though: if we are creating value that's not being appreciated because it's not being captured in the Excel sheet, it might be value that we find valuable, but if the people in the decision-making positions don't, we will always keep this challenge, right? If we're saying, okay, the community is happier and healthier, but there isn't a cell in the Excel sheet to actually express that, that's the bridge we somehow need to build: to show, to educate, or to co-create new ways of capturing value. Yeah. I think the key, and I use this in my own reflection, is not to dance to someone else's tune but to create your own tune and invite others to join in. So I think if you advocate, and if you can show that some things just don't fit in the Excel sheet, we should recognize there isn't just one way of doing things in this world. And we know that doing it that one way has led to what we have now: environmental disaster, a global pandemic, failing economies. There's not one way, and we know this one way of working doesn't work, so we have to acknowledge that yes, you can view the world through an Excel sheet, but it's limited, and if you want to do other things with it, to look at other kinds of outcomes and impact, you need
other ways of doing it. And I think they can coexist. I'm not saying we should replace one with the other; the key is this idea that you can hold multiple views in your hands and not be completely confused or have a meltdown, right? But not many people can; we take it for granted that you can see multiple approaches. I wouldn't say it's about replacing; it's about bringing them closer to each other and saying they complement and support each other. I still don't think the Excel sheet should dominate. I know it does; that's the reality. At some point, in 99% of organizations, there is the Excel sheet with a three-month or six-month horizon, and that's the struggle we face: the impact we create is hard to quantify, and it's long term, and those are two things that clash really hard with the way impact is represented, at least within for-profit organizations, and maybe non-profits too. Yeah, we're limited to this construct of yearly budgets, reporting, the length of a project with a start and end point. I'd argue that the way design is set up, with so much focus on the project, is problematic, because a lot of the work done in social innovation is so long term. When you intervene, you're just part of that timeline at a certain point, and when you extract yourself, they're still there. So you need to make sure that wherever you intervene, you're leaving behind capacity, tools and resilience so that they can carry on. Certainly the models in which we practice don't quite fit, but that's also because we're having to fit in with the way the world is set up. The business world, and even non-profits, have their own way of doing things: it's a yearly cycle, you've got a budget, you spend it, and then we don't know whether that's going to mean progress. I mean, there's no...
I'm not offering a silver bullet here, but I think what's actually important is to recognize there is a problem and to have that conversation; as practitioners, first to think about how they're evaluating their own practice, so they can see the benefits for themselves, and secondly, how to communicate and convince the people holding the money that it is a benefit. I think there is hope, because we're having these conversations: people are talking to us, they want us to support them, they're reading about the results that we're offering, and it chimes with them. They recognize the challenge; they just don't know how to solve it, like many of us, because we're so tied into the systems we're in. Even in the work I'm doing with the Young Foundation, they're taking small steps, but that's better than nothing, and they're seeing improvement at the practice level, where they can also see better reporting to the funders. That was a surprising thing: the programme manager said, oh yeah, I can read through all the diaries and really gain some insights, rather than having a long meeting at the end of the project to work out what came out of it. One key point here is that we also have to accept, and be proud of, the fact that expressing impact through stories, through narratives, is a valid way to show evidence. In a lot of cases we're intimidated by quantitative data and back down from our own ways of showing impact, so step one would be to agree among ourselves that we shouldn't marginalize the ways of presenting impact that we have. Yeah, definitely. I think we obviously need to get smarter at understanding the data that comes out of quantitative work, but also call out its limitations, not be afraid to say, well, it doesn't tell us this, it doesn't tell us that. But we know, on the ground, because we speak to the community and work with them, we can see these changes, and it feels great and I want to share
it. And how best to share it? There are methods; I'm not saying there aren't. We're using an approach called developmental evaluation, which is quite well established in Canada and Australia, and essentially it's this idea of using evaluation as learning. It's an approach, not a toolkit; you can use anything, you can combine them, but it's about using evaluation for learning, iteratively, as a cycle, so it marries really well with the design process. And there are already tools out there, even methods like most significant change stories: you ask the community, what's the most significant change? You collect that, and it comes through as quotes or as stories. I think we should really give prominence to those, rather than the kind of Excel data where we go, oh yeah, that's great, I know it's easy to report on. It's a bit lazy on our part if we don't maximize the other side, the kind of messiness of it. And again, you mentioned something really important: a way to present the value of how we show impact is by showing the limitations of the traditional ways. There are countless examples where things were measured correctly and showed a positive ROI, but in the end the effect was disastrous. Tricia Wang, who was recently on the show, talks about thick data versus big data: you can have all the big data you want, but if it doesn't give you the insights to act upon, it's useless. So those are the kinds of narratives we also need to develop, being smarter about showing the contrast and saying, okay, if you want big data then I'm probably not the right person, but if you want thick data, qualitative insights that will lead you somewhere, I'm your man. I think we need more convincing and smarter examples there. Yeah, most definitely. As you say, big data
can be dumb data if it's interpreted the wrong way; it's all about the interpretation. You do need data, it has to be data driven, but you need to be smart about how the data is interpreted and, importantly, about what data you're collecting. Sometimes you spend a lot of money and resource collecting the wrong data, and that's just wasted resources. So we definitely need to be smart about that and call it out. But as designers, I think you need to take ownership of it; that's the key as well. In traditional design education you're not really taught to think about impact or to evaluate the work, apart from some user testing, and you don't really think about how this might actually have impact. Here we move on to the discussion around the ethics of design, bigger topical issues, and responsibility as a designer. I still don't think we as a discipline take ownership of these aspects; we just don't consider them part of our toolkits. We're good at idea generation, and only recently we've said, okay, we can do some design research, and we can use that to come up with ideas, and we can prototype it and present it, but the evaluation bit is like, oh, that's for someone else to do, someone on the business side; I don't want to be involved. So I think it's also about taking ownership of this topic. What do you hope is the one thing people will remember from the conversation we just had? To use evaluation as learning, simple as that. To not be afraid of it, to not discount it, to take responsibility for it, and to recognize that they're already doing it, so it's nothing new; they simply have to formalize it and think of it as evaluating the outcomes of the project itself. Yeah, doing it consciously, deliberately and formally, that
already would be a huge, huge win. I know you have a lot of material out there for people who want to dive deeper into this. What are some good resources to follow up on? To read up on the approach we've been referring to, developmental evaluation, there are some easy quick guides that we can share in the links. We've also written a number of reports and papers about the work we're doing with practitioners, in which we introduce what we call the designing social impact evaluation framework: a series of principles for how you might think about evaluation as learning, and how you might do that in a more practical way. And we have videos of the gathering we held with practitioners about impact evaluation, in which they share their methods, the way they do things, and some of the challenges they have. I think that would be quite a rich resource for at least hearing from the practitioners themselves, and hopefully it chimes with a lot of your listeners. I'll make sure to include all the links in the show notes so people can follow up. If people have ideas or questions about this topic, is there a way they can reach out to you? Yeah, feel free to get in touch with me via LinkedIn or email, whichever you prefer to share; I'm happy to have conversations and share ideas. Awesome, Joyce. I think we've touched on a super interesting topic, one that definitely needs more maturity in the coming years, so I hope this is a call to action for everybody who's listening to think about it, work with it, design with it. And I want to thank you for bringing it to our attention in this episode. That's great, Mark. Thank you for the opportunity to talk about this work to a wider audience, because it's been mainly residing in the academic world, and although we've been talking to practitioners, I think it's really important, especially for designers who are moving into the space of
social design and social innovation, changing systems, processes and people. We are making and delivering social impact; we just need to make sure it's the right type, and that it actually delivers benefits to the community you're trying to serve. It's definitely a topic I'm passionate about, and I would love to hear from any of your listeners about their experiences and anything else we can discuss. So thank you. What's your biggest takeaway from this conversation with Joyce? Leave a comment down below and let us know. If you made it all the way here, you're apparently enjoying these conversations, so make sure to click that subscribe button so you won't miss any new episodes in the future. Thanks a lot for watching, and I look forward to seeing you in the next episode.