Dan Corry is from NPC and Karl Wilding is from NCVO. There's a bit of confusion on the programme: it refers to Paul Montgomery. There was a mix-up, I failed to communicate with someone in my own department and, to come to the point, I forgot to ask him to come in today. I apologise, because he would have had some very interesting things to say. The way we're going to do this is just five minutes each of introductory remarks, and then we'll open things up for general discussion. I'm just going to touch on a few things, mostly about impact measurement. We've been talking this morning about all sorts of aspects of data; this will be mainly about impact measurement. For those of you who haven't come across us, NPC is a consultancy and think tank dedicated to trying to improve the impact of the sector, both funders and charities. The problem is that charities, and actually lots of others, I worked in central government for many years, often don't really know whether we're making any difference to anything. We're not sure what to measure, and in the charity world philanthropists and foundations don't quite know what to fund, what it is that's working. The result is that we're all doing loads of things, 160,000 registered charities, hundreds of thousands of community groups, and we're not really learning anything about what works and what doesn't, or learning from each other, so that we can all do what we all claim to want to do, which is to help more people. This is of course difficult stuff. The sector wouldn't exist, though, without the passion.
That's why people get involved, but we've really got to get them thinking harder about impact. Certainly we've found that the key thing, and it's already been mentioned by Neil actually, is getting charities clear about what it is they're trying to achieve, and that's the theory of change language that I suspect most of you are familiar with, which we're a bit guilty of bringing into this country and pushing hard. That's the number one thing before you get into the ethics of data, measurement and everything else. If you don't know what you're trying to achieve, and that is often very, very disputed within charities, it's remarkable how tense the theory of change work we do can get, the arguments within charities about what's going on, then you'll never get anywhere. Once you've got that, you can start to ask the question about impact and take into account the things the organisation thinks matter, and I think that's slowly happening in the sector. I think it's very important at the minute that charities are trying to do this. It's always important, just because we should be trying to do the best we can for our cause with the resources we've got, but also because the sector is under a fair amount of attack these days for all sorts of things, and the public are a bit sceptical. I'm not saying that proving our impact means all that will disappear; the Olive Cooke affair would have caused problems for the way charities go about fundraising whatever we did, but it would certainly help. Then you get into the issue of levels of evidence, and it's a very simple diagram. On the right, if you like, is the academics' dream: randomised controlled trials of everything. And over here is something that none of us, I'm sure, would agree with: a few anecdotes that don't really prove anything.
We've slightly pushed back against the way people sometimes present this. You might be aware of the ladder of evidence, which puts this as a ladder and suggests that what everyone should be trying to do is climb up it to the great randomised controlled trial in the sky. I think that's a danger, because for a lot of charities it's much better to do something in the middle and do it really well than to push on to something that's not proportionate for you, where you haven't got the data or the numbers. If the best you can do is case study work, do it very well. If you're not up to collecting massive data sets, don't do it. So I think that is a problem, and we've pushed quite hard on this, strangely enough for an organisation that pushes data and evidence. We've recently been arguing that some charities are collecting data they don't really need to collect, and that this is all madness. There's a recent blog by one of my colleagues called Five Types of Data for Assessing Your Work, and one of the things it argues is that if you're a small charity doing, say, mentoring for young kids, for God's sake don't follow them over 20 years with a control group to find out whether it worked. Find out whether there's good academic evidence that that kind of mentoring really works, and then prove that you're doing the mentoring very well. There's a lot of slight nonsense that's come through into the sector. Of course there are lots of hard issues in impact, and I'm sure people in this room will be well aware of them. There are attribution issues; there's whether our data is any good; it can be expensive; there are the timescales for evaluation, particularly if you're doing preventative work, which a lot of charities do and which is very hard to prove; and there are cultural issues in the sector.
On this second point, I'm afraid, and there are a lot of academics in the room, there's a lot of absolute rubbish coming out of academia and from some consultants. Social return on investment, which got kicked off when I was still in government, with a paper I think people still use, is a very, very good framework for thinking about value. But to translate everything into '£1 in, £15 back' when you have no serious control group is absolute madness, and there's some terrible stuff out there. Some of John's colleagues, I think, have written papers about SROI inflation: you're not allowed to say '£1 in, £2 back' any more, funders say that's not very good, so it becomes '£1 in, £10 back'. Whether the work was actually any good isn't looked at. So there are all sorts of issues here. Charities wonder whether funders really care about this stuff. Interestingly, in some work we did a while ago we asked charities that were doing more impact measurement why they started, and they said it was because funders all wanted a glimpse of their impact. Then we asked: having done it, what did you find, did the money come pouring in? And they said no. But they said what was good was that they could now be a better organisation and allocate their own resources better. And there's an issue with a lot of organisations now: any self-respecting charity that's big enough has got to have a measurement team and all the rest of it, but is anyone actually making resource allocation decisions based on it? Are we actually learning anything, or is it just something to shove in our brochures? One of the things we've published about all this is called our Four Pillar Approach. All of this stuff is difficult. We published a paper recently, and I just want to touch on a few new things knocking around.
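The SROI inflation point is easy to see in toy arithmetic (all figures invented for illustration): the headline ratio is just attributed value over cost, and it collapses once you net off 'deadweight', the outcomes a serious control group would show happening anyway.

```python
# Toy SROI arithmetic with invented figures -- not any real charity's numbers.
cost = 100_000                    # programme cost in pounds
gross_value = 1_000_000           # value attributed to all observed outcomes

# Headline claim: "GBP 1 in, GBP 10 back".
naive_sroi = gross_value / cost

# Suppose a serious comparison group shows 80% of the outcomes would have
# happened anyway ("deadweight"); only the remainder is the programme's doing.
deadweight = 0.80
net_value = gross_value * (1 - deadweight)
adjusted_sroi = net_value / cost  # collapses to "GBP 1 in, GBP 2 back"

print(f"naive SROI {naive_sroi:.0f}:1, deadweight-adjusted {adjusted_sroi:.0f}:1")
```

The point is not that 80% is the right deadweight figure; it's that without a control group the deadweight term is simply unknown, so the headline ratio is unanchored.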
Some of these have already been mentioned by other speakers, but we glossily called it eight innovations in measurement and evaluation, and they were globally sourced. I'll just mention a couple, to give a feel that there are new things going on in the measurement world which are useful for the sector. One of them is shared measurement, and essentially that's just saying: can we get organisations doing similar things to use similar metrics, and then learn? Because then we can see why some people on that metric seem to be doing better and some worse, and try to understand what's going on. You can do that collectively if you're a bit nervous, as in the HLF example, about exposing who is doing this well. A good example here is SafeLives, which some of you might know, where a lot of the domestic abuse charities work together. Using shared metrics, with data collected in the same way, they uncovered various things, including that those charities who had somebody located in A&E departments picked things up much quicker, and a lot of other charities have followed that. So that's a good example of shared metrics. It happens amazingly rarely in the sector: charities doing exactly the same thing all have different metrics. I had a vague attempt at one point to persuade the big children's charities that they should collect their data in the same way, since they do the same things, and they all thought it was a good idea, but nothing happened. More exciting: remote sensing. Charities are starting to use this, and the idea is that you can try to pick up impact without actually having to be there asking people questions.
One example we used is clean cookstove programmes in developing countries, which try to get people to use cookstoves that are less polluting than others; you can now install smoke sensors to measure what's actually happening and whether health is improving. Another obvious area where it's being used is the environment, trying to see what's actually happening to forests through satellites, et cetera. I'm not saying that's suddenly going to translate into the bread-and-butter work most charities do, but there are opportunities starting to come here in a way we didn't have in the past, including video diaries, just to see how people are responding to things. The third one, again already mentioned, is less about techniques and more about making this stuff useful: data visualisation, and I think you're going to talk about that a bit later on. There's very interesting stuff there. The Trussell Trust have done some great work trying to bring supply, if you like, and demand together in maps, so you can see where the gaps are very quickly, and you'll find your senior management rather like that kind of thing. The last one I wanted to talk about was data linkage, which again has been mentioned: the ability not only to link different datasets but to link them over time. I want to touch on something we are very keen on at NPC, which is data labs, if any of you have come across them. A lot of us run an intervention with a set of people and think it's going to leave them in a better state, but actually we find it very hard to find out whether that happened. This all started for us in the criminal justice area: there are a lot of charities that work with prisoners, and they believe they reduce re-offending.
And they find it very, very difficult to find out whether people actually re-offended or not. The vulnerable people charities tend to work with are not people you can follow up two years later with a questionnaire; they won't have the same email address or phone number. So it was absolutely hopeless. Meanwhile, who's got all that data? Government. As it happened, they had it all over the place, in local probation services, NOMS, the police and so on. Quite frankly, how exactly we achieved this I don't know, because Chris Grayling was the Secretary of State at the time, but the Ministry of Justice has now created a Justice Data Lab. You can go online, you can use it, you can find out the results. Essentially what you do is say: here are the identifiers of the 100 people we worked with; can you tell us whether they re-offended two years later, and set up a control group using propensity score matching? That's essentially what it does. It's got more sophisticated over time, so it can look at frequency of re-offending, and the control groups have got better because they can take account of things like whether people are dependent on drugs. So it's been very, very powerful, using data the government already has. No one has to collect any new data, and charities don't even really have to understand what's going on under the bonnet: they just say 'these are the people we worked with', and the Justice Data Lab tells them whether they had an impact. Obviously there are issues, as people in this room will understand. Often the answer is 'we can't tell whether this is different from doing nothing', not least because charities don't work with enough clients to get statistical significance. But it's very powerful, and we've been spending a lot of time on it recently. It was an amazing success, and it's still going: despite the cuts at the MoJ, they're still running it. And we've been pushing very hard recently on health.
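To make the matching step concrete, here is a minimal sketch of propensity score matching run on simulated data. This is not the Justice Data Lab's actual pipeline: the offender covariates, the simulated 10-point treatment effect, and the use of scikit-learn's LogisticRegression are all illustrative assumptions.

```python
# Illustrative propensity score matching on simulated offender records.
# NOT the Ministry of Justice's real method or data -- an assumption-laden toy.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Simulated covariates: age, number of prior offences, drug-dependency flag.
X = np.column_stack([
    rng.normal(30, 8, n),
    rng.poisson(3, n),
    rng.integers(0, 2, n),
])

# The charity's clients are a non-random subset (more priors, slightly younger).
logit = -2.0 + 0.08 * X[:, 1] - 0.02 * (X[:, 0] - 30)
treated = rng.random(n) < 1 / (1 + np.exp(-logit))

# Simulated truth: the intervention cuts re-offending by 10 percentage points.
base = 0.30 + 0.03 * X[:, 2] + 0.02 * X[:, 1]
reoffend = rng.random(n) < np.clip(base - 0.10 * treated, 0, 1)

# 1. Model each person's propensity to be in the treated group.
ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]

# 2. Match every treated person to the untreated person with the closest score.
t_idx = np.where(treated)[0]
c_idx = np.where(~treated)[0]
matches = c_idx[np.abs(ps[c_idx][None, :] - ps[t_idx][:, None]).argmin(axis=1)]

# 3. Compare re-offending rates between the treated and their matched controls.
effect = reoffend[t_idx].mean() - reoffend[matches].mean()
print(f"estimated change in re-offending rate: {effect:+.3f}")
```

A naive comparison of clients against all non-clients would be biased here, because the clients start with more prior offences; matching on the propensity score removes most of that. It also illustrates the significance caveat: with 100 clients rather than thousands, the same estimate would usually be too noisy to distinguish from doing nothing.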
We got 24 health charities to write with us to Jeremy Hunt to argue for that, and we're having a lot of discussions with DWP, which look like they may get somewhere, about an employment data lab. That's an example of data linkage using data that is already there. I think it's an outrage if the government sits on this data: they're not using it, and we're not able to use it. A very last thing: among other things, I'm involved in the What Works Centre for Wellbeing, and I just wanted to flag this up. At NPC we work with lots and lots of charities and funders, and I think deep down what we're all trying to do is give people a good life, basically, and wellbeing is as close as you get to that. We end up measuring other things because we've discovered we don't get funded if we just say we increase people's wellbeing, so we say we're going to get NEETs into jobs, or reduce loneliness among older people. But what we really care about is people having a decent life and feeling good about it. And it's interesting that, as you can see, GDP has risen over the years, less so recently, while life satisfaction hasn't changed at all; so it's something different from the economy going well. You can now measure wellbeing: the ONS has four measures in a lot of the surveys that Louise outlined earlier. There's a lot of controversy about whether subjective wellbeing is the right thing to measure, et cetera, but I do think it's something that will come into our sector a lot more, because deep down I think it's what we're all about. So those are some slightly random thoughts about impact measurement, and hopefully they'll provoke a few more. Hello, everyone. I'm Karl Wilding and I'm from NCVO. I run the public policy function at NCVO, so policy, research and so on.
So I'm coming at this not just from the data perspective but from the research perspective. I do have a research team staffed by geeks; they're the sort of people who would look at that flip chart and then have a ten-minute discussion about whether it's 'data are' or 'data is' at the end. I was asked to sort of disagree with Dan. Actually that's quite difficult, because I probably agree with quite a bit of what Dan said. Where I possibly do have a point of disagreement, and I'm going to try to touch on this, is that, if I were going to criticise Dan and NPC, it strikes me that sometimes there's a view of the world that says: if you demonstrate your impact, resources will follow, or public support will follow. And my starting point is that I just don't think that's true. Just to tell you what we do: you might have seen a publication NCVO produces annually, working with the Third Sector Research Centre, called the Civil Society Almanac. In some respects that's a good reflection of the sort of data we have to work with in the sector, in that it says an awful lot about workforce, which we borrowed from Alistair's past work; it says a lot about the resource inputs that come into the sector; it says something about resource outputs; and it says a little bit about impact. Take that slide Dan just had about contributions to GDP: we can use an ONS-approved method to work out the contribution of our sector to GDP, and something like 0.67% of GDP can be attributed to the voluntary sector, which strikes me as a pretty meaningless statistic. It reflects the fact that we've got lots of data about inputs, as Dan was saying, but finding data about outputs is hard enough, never mind data about outcomes, let alone impact.
You only need to look at the published accounts for your own charity: the statement of financial activities will say how much you spent on fundraising, probably how much you spent on support costs, and then there will be a very big amount that says how much you spent on charitable activities. And that's about it. That's not the state of the art in any way, but it's where we are. The questions we get asked about charities at NCVO, from journalists, have traditionally been along the lines of: does this charity make a difference, does Intervention A work? Increasingly we're being asked, especially by government, not so much whether Intervention A works, but whether Intervention A is better than Intervention B. Certainly where lots of commissioners are at the moment is that they are absolutely obsessed by cost, ideally by value for money. And if you're making a pitch to someone who's already commissioning a service, doesn't have time to recommission it, or is very risk-averse and doesn't want to try something different, this question of whether your intervention is better than what already exists is what you'll be asked. So the public want to know whether charities are making a difference, and by and large they can't find any evidence that charities are making a difference, or they can't find evidence in a digestible format that fits their level of understanding of how our sector works. Remember, most people still think charities are staffed entirely by volunteers and survive almost entirely on donated income, which is not true any more. And because they can't find evidence of impact, and this is where I think Dan and I will agree, they start asking other sorts of questions instead.
So the questions the public and the newspapers have been asking recently are: how much is your chief executive paid? How much do you spend on fundraising? How much do you spend on admin costs? Not helped by the fact that charities' accounts don't actually report admin costs. But that's still very much where the public are in terms of their narrative: they want some sort of measure of efficiency, because that's what they've been trained to look for. They want to know that charity A is better than charity B because, largely speaking, people don't give to charities; people give to causes, or support causes, and they want to find an organisation that will take forward the cause they believe in. Because they can't find evidence of what is making a difference, they ask proxy questions instead. We've done focus groups with donors, and they more or less tell us this: because they can't find what they want, they ask these questions about inputs, which are in some respects easy to answer, like how much your CEO is paid, but which arguably take us down the road of focusing on the wrong things. There's lots of evidence, for example, that charities that don't invest very much in their back office, in what you might call their administration costs, generate less impact than those organisations that have good IT and good HR. There's also the problem, and again this is what people tell us in the focus groups we do, that when charities produce quantitative data about their processes or their inputs, people don't believe them. Does anyone here work with a media officer who used to work for the BBC?
When you next see them, ask them about their experience at the BBC journalism school, because one of the very first things it tells its trainee journalists is: don't believe any statistics that are given to you by a charity. Part of the problem here, as I think you were alluding to, is how we use data and the pressure we are put under to use data. Dan used the example just before of the Trussell Trust, where I was slightly shaking my head behind his back. What's that charity called, the fact checkers? Full Fact, yes. The Trussell Trust was taken to task by Full Fact, because their lovely infographics, the ones Dan was referring to, look absolutely fantastic, but the data underpinning them was not, in Full Fact's view, telling a true story about the issue of food banks. So on the one hand people say they want more data about impact, and on the other hand they say they don't believe the content we produce, so there is a frustration at times that we can't win. Tying up that little bit of conversation and moving forward: there's a lot about charity that strikes me as actually quite irrational, and there's certainly a lot of donor behaviour, in terms of time and money, that's quite irrational. So if we're going to have a conversation about impact, I think we have to move beyond just quantitative data, and we have to think about how to move beyond measurement as well. We have to start thinking about what it is people are interested in, and what we have evidence of that actually seems to work.
So then you start to, and I don't think it would count as an innovation, but you add to Dan's list of innovations by thinking about things like narrative, like user accounts. The ESRC, for example, have an absolute battery of performance statistics about the difference social science makes to UK society, but they also have a fantastic little story. Those of you who are as old as those of us up front will remember the rollout of 3G for the mobile phone network; we're well past 3G now, but the auction for selling off the 3G licences to the mobile phone companies was designed using game theory, based on a small piece of research the ESRC funded at UCL, and it generated £22 billion. Presumably the ESRC spent a few hundred thousand pounds funding that. Moving on, because I'm probably running out of time: I think we have to be honest and think about who data is for and who impact measurement is for, because not everybody gives because charities make an impact. My wife has just set up a £30-a-month direct debit to a charity that I personally think is awful, but something about that charity, the story it tells, just resonated with her. We have to be clear about why we're measuring, and one would hope we're measuring not because we want to sell our organisation better, and the SROI example Dan used is absolutely top of my list for everything that has gone wrong in terms of data. Measurement should be about improving services, and presumably that leads to different sorts of questions. I also had a bit about snake oil salesmen, but mine were actually consultancies. And, quite seriously, we also promote theory of change.
My concern at the minute is that theory of change is the new social return on investment, because it's not a theory, it's a hypothesis, but we're all starting to believe that just because we say A leads to B, it must be true. It's a hypothesis, not a theory. And the final point: user voice is essential. We know this works, and if you want some really good examples of how to change people's minds about impact using user voice, has anyone come across the RECLAIM project in Manchester, run by Ruth Ibegbuna? It is an absolutely fantastic example that, for me, shows impact by putting young people front and centre of what they do. Also look at Girlguiding, and how they now put forward young girls as their spokespeople, using young girls to talk about their own experience and tell their own narratives, because that is what is making a difference. And then finally, in an attempt to provoke debate with Dan: absence of evidence is not evidence of absence. Just because we're not very good at this stuff doesn't mean we're not making a difference, and that's where at times I feel like pushing back, certainly against people in government who say that because you can't prove it, you're not having an impact. No, that's not true. Okay, thank you very much, that's really helpful.
An awful lot to think about there, so I'll throw it out to the floor. I know Dan's got a train to catch and Karl's got a train to catch, but we've got at least half an hour, depending on how long people want to queue for their lunch, so if I throw it open we'll maybe take two or three comments or questions. What would be really interesting would be to reflect on what you've heard against the lived experience of your own organisation, and some of the trade-offs, dilemmas and challenges that Karl and Dan have very ably raised for us. Who'd like to start? I was interested by Dan's continuum, and I think the challenge I've had within HLF and other organisations is that senior people tend to be on the left: they believe everything based on a hunch, or on what they think might be right. Obviously when you use evidence you're trying to move people along a bit, so I was interested in any ideas or experience you've got of helping them move towards a more evidence-based approach to decision making. Do you want to take that? Anybody else want to follow on from that?
We'll go along the line. OK, Dan. Yeah, I mean, Karl, we're not stupid at NPC, I have to say. We know that when people are in fundraising mode, which in the charity sector we have to be, we need the funds, though the donors shouldn't be our main audience even if it sometimes feels like that, of course you need stories, of course you need narratives. As somebody said, no numbers without stories, no stories without numbers. It's fine to give the anecdotes and the stories and the narratives if your data shows they're more or less representative of what you're doing. If you just pluck the one case, Fred, we went in, we did violin lessons with him in prison, he'd been a serial offender, and now look at him, he's running his own business, and it turns out that Fred was the only one, and actually for various reasons it was all deadweight, to use an economist's term, I'm an economist so I use those terms, then that is a very misleading anecdote. I think one should be trying to be honest with the evidence; when you've got the evidence, of course you give an emotional story which illustrates the thing you're doing. It's the same with qual: people often think qual is bad evidence. No it isn't; qual can be fantastic, it can get you much closer to causation and all sorts of things like that, provided you make sure your sample covers different types of people and all the rest of it, and you don't just go in and cherry-pick, which I'm afraid a lot of charities do. And the public sector is much worse at a lot of this; I worked there for a long time, and public sector evaluation is terrible. It's very, very rare that whoever you hired to do the evaluation doesn't come back with the answer you wanted in the first place; it's many years later, when the academics get to it, that it turns out that, whatever it was, community programmes, didn't really work. So I think that's important. As for how you get your organisation there, I think it's difficult, and we
find, particularly in our work with the big charities, that the power of the fundraising team cannot be underestimated. That's probably not true for HLF, although you don't want to tell people bad stories either: you say the money goes to good causes, and Big Lottery does the same. If you were to reveal that actually half the money you give out doesn't really achieve much, that wouldn't be very good news for selling more lottery tickets. But I think it's slowly changing; it's a cultural thing, and the new leaders coming into charities at all levels do think a bit harder about this. I would be sceptical, though, about saying you've got to go all the way up. I don't share the obsession with randomised controlled trials, which we've imported from health and all those worlds, with the placebo and so forth. In social policy, even if an RCT shows something works somewhere, that does not tell me, because social issues are so complicated, that it will work in another place at another time. We've had the recent case of the Family Nurse Partnership, which had been RCT'd to death in the US, brilliant results; it came over to Britain, it's just been RCT'd here, and it doesn't work, because in the States if you didn't get the nurse partnership you got nothing, whereas in the UK you get a pretty decent service even if you don't get it. So you've got to be very careful on these things. I wouldn't push charities all the way up that ladder, but certainly a bit further. It's a deep cultural thing; it's not easy and it won't come overnight. Can I just add to what Dan said? Dan and I are both of a generation that comes from an era where people ran off reports every time they did a piece of research. Now, given the pressure a lot of us are under at the minute, I almost feel I don't want to read reports any more; I just want to change things. So my advice for getting senior management listening is: use
evidence and use data to help you change things, and then show them the change you've made. Something you're probably familiar with in your teams: we send emails out to all our member organisations, and they have open rates of about 20% or so. We've used data to improve those open rates from about 20% to 30%, doing things like heat maps and A/B testing, that sort of stuff. So change your own service, show that something works, and then people will listen to you. The other obvious things to say are that it's got to be relevant, it's got to be timely, and it's got to be jargon-free; there's no point doing fantastic research if it's out of date, if people don't understand it, or if it's about something that isn't a pressing issue for the organisation. Can we take a question? Please. My name's Gillian Ratton, I'm from Action on Hearing Loss. I work in the policy team, and I also oversee the social research we do. In terms of using data as an organisation, I can see loads of different ways that we use data and evidence to make evidence-based decisions, and we definitely select different types of data based on what we're trying to achieve. We're very good at the influencing thing, getting moving case studies in front of donors or policy makers in government, and we're quite good with the quantitative evidence as well, which we use alongside them; it's that mix of numbers and story, because people respond very well to emotional material. But I'm also quite interested in the digital transformation that we're undergoing as a charity, which I'm working on, and in what we in the third sector can learn from the revolution digital has created in business models, delivery and user journeys, and how we can use better data collection to change, incrementally and in almost real time, the way we act as charities, and not just
think about the recipients or targets of our data as only the people who make decisions, but also use data to improve user journeys for our websites, for our services, maybe even for our staff, and so on. I wonder if there are any charities or any groups you think we could learn from, who are incorporating that really cutting-edge development in terms of feeding data in and then changing what we do. Does that make sense?

Before we get an answer from Carl and Dan, I wonder if there's anyone in the room who feels their own organisation is perhaps doing that sort of thing, or who has other examples they're aware of and would like to share? OK. If not, Carl, would you like to start?

It strikes me that all the big charities are in that space at the minute. Everyone's thinking about transformation, actually on two planes: one is digital and the other is volunteering. We're all trying to think about how we move away from just delivering everything via government contracts and so on, and do things differently. The one that I think is cutting edge at the moment is a volunteering platform for young people, and if you look at how they've redesigned their service, it's all based on, you'll be familiar with the concept of, personas: using personas to identify which client groups they're trying to work for. If you were to google NCVO's volunteering forum, you'll see a presentation from them in Manchester a couple of weeks ago where they talked about the design principles they're using for the new website and how data informs what they do. Which then takes me to CAST, if you've come across CAST. CAST are about digital, and they talk about being agile in digital development, but from our experience of working with them, those principles of agile development aren't really about digital; they're just about service transformation, full stop. Another example of someone trying to do that at the moment is Parkinson's UK, what Julie Dalts is trying to do there. But the final thing I would say is that my sense is it's dozens of organisations that are at that stage now, not hundreds or thousands, because this stuff is difficult, it's expensive, and what we've found is that the skill set required is in very short supply and therefore very, very expensive.

Just to say a bit more, less about digital for service delivery and all that, and more about evaluation, and two of the things up here. Theory-based evaluation is, to some extent, a bit more about real time. What digital and real-time feedback allow you to do is, rather than saying we set up our intervention, we set up the evaluation, and at the end we tell you whether it worked or not, which is a bit hopeless really because people are in need at this minute, you can start seeing whether things are working quite quickly, and you can start changing the way the programme works within the programme. That's a nightmare for evaluation people, because which programme are you evaluating? But in terms of helping real people quickly, it's very important. On that kind of real-time approach: at our annual conference last year, someone from the Government Digital Service talked about the way they first designed Universal Credit, and it was classic government, and I remember a lot of charities thinking, oh god, we do the same thing. What do we do? We get a whole lot of people to design the new strategy, we produce a 500-page project plan, and then we roll it out. They did that for several years and it was a complete disaster, and I know Universal Credit is still a disaster, but then they moved to leaner approaches: a smaller group of people who tried it, piloted it, saw what the feedback was, saw what wasn't working, changed the
programme, and all that kind of thing is coming into evaluation as well. The other thing to point out is user-centric evaluation. Again, you can do much more of it with digital: you can have pretty much real-time feedback from people on how they're feeling, how they're experiencing things, and you can feed that into what you're doing. So I think these things are quite exciting, and they stop us being so distant from the people we're actually trying to help, sitting around waiting for years. A classic government evaluation is one published some years after the government has changed and the programme has already been amended. It's useful, we learn something about certain kinds of approaches that work and don't work, but it does nothing for the people who could have been helped by that programme. And I think in the charity sector we feel awkward about spending money on something which will add to knowledge, if we share failure as well, which we discussed earlier, but won't help the people we care about in the meantime. So I think these things are all quite exciting.

I just want to make a really quick point that relates to your idea. One of the things I like about this real-time feedback is that it has the potential to bring two sides together: the people working at the front line of the charity, who actually have the face-to-face meetings, and the back-room people who keep the whole machine going. We spoke earlier about the cultural barriers between those two sections of a charity, but the way this operates might bring the two sides together; it could be a really interesting mode of conversation between those two sections.

I think there are two questions, perhaps yours first, and then they can be taken together.

I would really like to come back to this idea of using digital, which feels so big. It might be that it's using data in the background to present something in a really tangible way, or we could be talking about service models. Coming from a mental health background: Future in Mind, a national report, was very much focused on online; online is everywhere. And then of course we need commissioners to commission online services, but it doesn't work like that, it's not that simple, and you can have really well-established services, really cracking services, that struggle to be involved. So it's recognising the other end of it, the funding and commissioning end. For our organisation, we actually have some really great commissioners who do understand, but there are pressures on them and their budgets. And very often, particularly in the voluntary sector, the expectations for monitoring and evaluation are set high: you either have to already have that in place, or, very often, there's not huge willingness to fund the tools to do it. The expectation is that it's already in place and that you mirror what local authorities produce or have produced. It isn't that this sector is really bad; it's often about tools.

OK. One thing that has really come through so far is what funders are doing in all this. I just want more discussion about their confidence, what they ask for and what they don't. It's interesting, on funders and digital: a colleague of mine, Tris Lumley, wrote a rather good piece saying that charitable funders are really not equipped to fund digital projects, because digital projects are often innovative and you don't know exactly what's going to happen, whereas funders quite like saying we'll give you this money for
three years, and what are the outcomes you're going to produce? And the answer is: we don't quite know. It's a world funders don't understand, so I think it's made things problematic for charities. It's also very hard to fundraise for digital things, so I think that's a problem. It's a difficult world, and many people need to push themselves into it. We've had an awful lot of charities talking to us recently who all think they've got to have an app, because you've got to have an app these days, and they don't really know why they want one; there's just a sort of feeling. You can see where they're coming from: everyone seems to be using these things, so if you don't, how are you going to link up with the people you want to help? But then try to evaluate whether the app achieved anything. There are other issues too: why does every medical charity want to invent its own app when all the apps are basically the same with a different front end? Can't we all share somehow? But also, how do you evaluate apps? We even had one organisation that wanted to evaluate its app, at which point things seemed to be going a bit strange. So I think we've got to think about that in the measurement community. It's interesting that you say you have quite good relations with commissioners. As we know, lots of charities don't have that relationship, and commissioners are starting to think lots of things should be online and all the rest of it without quite knowing what they're doing. So I think it's a difficult phase, and there's a terrible danger we're getting miles behind the private sector here, and it's not a great market for the innovators. With people talking about AI recently: we had someone from Google speaking at our conference last week, and obviously they're the leaders in all this stuff, but it's moving very, very fast, and the bright PhDs are not leaping into the charitable sector to help us with AI.

Just one thing about commissioners. The converse of your good commissioning story was in the headlines a couple of days ago: the Ministry of Justice and their Transforming Rehabilitation companies, whose performance indicator was all about referrals. So what have the companies done? They've focused on referrals. That's a good example of how the data you collect influences the service you provide. I think there is real diversity, but my thinking about what's needed in the future is that some of the larger funders, BLF, some of those big foundations, are really doing well; the Co-op Foundation are starting to work together more and wanting to work with local authorities, work with health. And where we end up sectioning ourselves off, I think it can become reductive: let's talk about small charities, let's talk about working around themes, because "our organisation is working with our current partners" is not helpful any more; we don't live in that commissioning world. So, please.

Sorry, I'm interested in wellbeing. As an organisation we're actually directly involved, but my question is: how much should wellbeing now form part of the evaluation of these things, and what should we expect from using wellbeing as a measurement in evaluation?

I think wellbeing is fascinating. The whole focus on wellbeing, with all the flaws in the ONS measures, which are about life satisfaction and anxiety and so forth, about subjective wellbeing, some people don't like that, but it's raising some really interesting issues and putting them on policy agendas where they weren't before. Mental health has shot up the agenda because people realise there's nothing worse for wellbeing than poor mental health; relationships equally. It's stuff we all knew, but you put some hard numbers on it and suddenly people take it more seriously. It had an advocate too: the reason the wellbeing programme got set up, among all the things government runs, was that for a period David Cameron was very into this,
and certainly the inside people in government tell us the current Prime Minister thinks it's nonsense, so who knows whether the government is still so interested. But I think if you look at wellbeing, people will probably square the circle in the end. I was saying that perhaps we should get to a world where the ultimate outcome is wellbeing, but I think what people will actually do is say: OK, there's enough evidence that if you improve the wellbeing of older people, say, they're less likely to go into A&E, and therefore we as a health commissioner are prepared to fund something where, if it's a payment-by-results contract, we pay on wellbeing going up, because it's well enough established that that reduces A&E admissions. I don't think that's quite the way round it should go; I think you should be trying to pay for the wellbeing itself.

So you think it's still investment-led?

Ultimately, yes. It's hard to see that changing, but this is quite new stuff. Wellbeing's a funny concept. I remember at the recent Lambeth Country Show there was a big tent called the wellbeing tent, which sounded exciting, but it was all crystals and yoga, and that's not necessarily what I was looking for. I don't know what I was looking for.

Nevertheless, I'm really interested, in the sense that we're thinking about this at the minute. My chair, who Dan knows well, is saying to me: well, we've got to come up with these indicators of the impact of the voluntary sector; go away and come back in six weeks, sort of thing. If we are going to do anything, wellbeing strikes me as the closest thing we've got to a candidate, in the sense that there's been a lot of testing of the subjective wellbeing data and so on. But there's another part of me that thinks it's the next SROI, in the sense that instead of "£15 for every pound you spend" you would just have "we raise wellbeing from 7.2 to 8, and nothing else can do that, so you have to invest in us". But you could point him to some work, not by snake-oil merchants but by academics, some very large-scale and authoritative European studies, which actually conclude that some of the aggregate impacts of the voluntary sector, in terms of employability and in terms of health and wellbeing and so forth, are by no means as great as people like to think they are. And I suppose my question would be: to what extent would you actually stop doing things on the back of evidence like that, some of which is pretty robust by the usual standards?

First of all, I think we at NCVO, and I think NPC are the same, try to be quite qualified when we talk about what volunteering, for example, does in terms of employability, or what voluntary organisations achieve; we always caveat it by saying "in a particular set of circumstances" and so on. But let's be honest here: the single hardest thing to do in the voluntary sector is to stop doing things. Without a shadow of a doubt, stopping doing things is the hardest thing ever, because people start things on the basis of "I have a gut instinct that the healing power of crystals improves people's mental health", and getting someone to disavow that gut instinct is very hard.

Although we're mainly talking here about the impact of particular charities and whether their interventions work or not, I put a massive value on the existence of civil society in creating social capital and cohesive communities. If you imagine a world without civil society, and we can see some countries that have no civil society and are not good countries, how on earth do we put a value on that? We wrote a pamphlet recently where we argue very strongly for this theoretically, but we also get quite a lot of charitable funders coming to us and saying: there's a town, it's really lacking social capital, and we'd like to fund something which somehow boosts social capital; what is it that
we should fund? I think that's a really important question. Civil society does help create that, and that value goes beyond what individual organisations actually achieve.

One of my questions is about what works, and my view is that what doesn't work matters just as much. There are intractable problems that the data throws up year on year, persistent problems in some areas, homelessness for instance, so I think it's worth looking at the spending in those areas relative to that data, and then seeing where things stand and what the intervention actually is. And where I'd love to see that working is in user-centred design. We were talking about the app earlier: one of the things we had to prove was our own assumptions, and committing to that as an impact was powerful, because the assumptions we had made that this would work had to be proven by the very people who were going to tell us whether it worked. I think that was really interesting to work through. So I think asking what's not working is a really good way of thinking about this, because we're seeing a lot of polemics between a couple of models, the voluntary sector model and the university model, when really it's a mix of something else. So I think it's just a good way in.

I think my question does relate to this, although it's a gentler one. It was particularly for Carl, although it relates to Dan's credibility point, in that I was quite interested in the BBC anecdote about not believing statistics, which is useful from a personal perspective; next time it will be someone else's statistics. I guess my question, as a member of the general public as well as someone who works in this sector, is that I never believe any statistics about pretty much anything. So is there something specific about the charitable sector that makes people less
likely to believe the statistics that they're putting out? Do they know that they're rubbish, or is there something more existential about it? And if so, how do we go about rectifying that? Is it, as in the last question, about being more honest about the limitations of what the data looks like and how you got there? Or is it simply a question of getting better methodologies, so that people are more likely to believe it's credible? Or is it what you're intimating, which is a slightly different balance between narrative and quantitative data?

Can I come to the first question first, about user-centred design? I think I have used the expression user-centred design. If NCVO puts a survey out to our members and asks what's your biggest problem, what do you most need support from us on, absolutely 90% every year will come back and say funding. No one asks themselves whether there are issues deeper than the fact that we can't attract funding, issues we need to sort out, such as: we can't demonstrate our impact. I say that because if we just listened to the data, didn't interrogate it, and simply did what users told us to do, I don't think we'd actually address the things that are really the problem.

On whether there is something specific about charities: I think there is. There's a particular problem with August, and the fact that lots of charities publish survey data and opinion polling in periods when the media is generally quiet, to try to get their cause higher up the agenda. I think we're under pressure to articulate issues as black or white when we know, as researchers, that shades of grey are what social reality actually looks like. So for me, getting better is not necessarily a researcher problem or a data problem; it's more of a public policy problem, where we have to make sure that our researchers and our policy and campaigns people are talking to each other all through the
process, and it's not just that you have your research phase, then a policy phase, then a campaign phase, with the message getting changed at each stage. You've got to have people involved from the very beginning. Your predecessor at NPC, Martin Brookes, used a phrase he loved, epistemic communities, where policy makers and researchers understand the field and work with each other all the way through, and there isn't just this ta-da approach to research findings, where you hug them to yourself, reveal them a year later, and suddenly there's pressure to change everything.

The trouble is, the great thing about the voluntary sector is that we're all passionate about our causes, and in a sense the public know that. We're constantly churning out stuff saying there's more need in our sector, because we have this weird thing where we need to prove that our sector has more need than somebody else's, and then we also want to argue that everything we do always works. It's interesting that some charities are better at occasionally saying we did something and it didn't work, and I think that helps their credibility. It tends to be the development charities, and my theory is that we forgive them: they're in difficult countries running a lot of programmes, and of course a difficult programme somewhere in Africa sometimes doesn't work, and when they report that, people think, well, I'm glad they did it, and they're a really honest bunch of people because they told us. So I think there's something in that. But the weird thing is that the sector is very trusted. We wondered, particularly on some of the things we talked about earlier, data sharing and consent and all the rest of it, where the public stand. They give their data to the supermarkets but are deeply suspicious of what the supermarkets are up to with it; we don't trust government, because they're going to try to work out that we didn't pay enough tax or something. And everything I've seen suggests the public trust the charity and voluntary sector more with their data. Their problem with us is that they think we're a bit useless and would probably lose the data, not that we'd use it for bad reasons. So we have this trust thing: we're not motivated by private profit, or by linking up data sets to find out that someone should be, you know, deported or something like that. But they also know that we're so passionate about our cause that to believe everything we say would be a mistake, and to be quite frank, they're probably right. Now, the government uses the ONS, so most people believe the ONS data; they then don't believe the analysis on the back of it, which famously said that if we voted for Brexit the economy would collapse, although it is slowly collapsing. Should we have a sort of ONS for the voluntary sector? Maybe that's what John's mob should be doing.

That's a challenge. Thanks very much. Lunch is waiting for us outside, so let me just thank Dan and Carl for a very engaging discussion.