So, good morning, everyone. Thank you for being here on this preternaturally warm day. It's wonderful to see you. On behalf of the Conflict Prevention and Resolution Forum, I want to welcome all of you here today and to thank USIP for hosting us, Search for Common Ground for sponsoring the forum, and all of the CPR principals for making this possible. So today, we are focusing on impact in peacebuilding and how we learn in our field. But I'd like to start with a story, and then I'll introduce our panelists today. And actually, I don't want to call them panelists, because we really want today to be a conversation. You have a group here with tremendous knowledge. We also have Adrienne joining us from Burundi. And all of you in the room have such great experience yourselves. So we really hope to make this as interactive as possible in the next hour and a half. But pretty early in my career, about 15 years ago, I worked at the Hewlett Foundation, which at the time gave away $24 million to fund what was then called conflict resolution, both domestically and internationally. And just after September 11th, their board chair, Walter Hewlett, of the Hewlett family, who at that time was going through the HP merger with Compaq, said: you know, Melanie, I don't think I can fund programs where I don't know if the dog barks. I look around at the other programs at the foundation, in environment, in education, in health and population, and all of them have beautiful metrics, and they can really tell me the impact of our work. And you have not proven that to me. And they ended up shutting down the program because we couldn't show impact. So that was really a lesson to me. And I made it my mission in my career in peacebuilding to really think with colleagues in the field about how we tell our stories, how we measure our impact, and how we go not only for measuring the impact of our individual programs, but for making the case for peacebuilding as a field. And I can be very transparent in saying we're doing that right now in Congress. We're doing it with State, with AID, with the new administration: how do you convince people of the impact of this work and the impact of our field? So that's what we're talking about today. We've made great strides in the last 15 years. Our methodologies are much crisper. The stories we tell are more compelling. We are much better able to match appropriate methodologies with appropriate interventions. So I think that we're coming from a very different place, and our panel today, and all of you, can really see where we are and where we need to go. I'd like to say a few words about the Peacebuilding Evaluation Consortium, of which we are all members. The Peacebuilding Evaluation Consortium is hosted at the Alliance for Peacebuilding. For any of you who are not members of AFP, I encourage you to join. We are a network of 106 peacebuilding organizations working in 153 countries. And a great part of our mission is, as I said, to prove the impact of peacebuilding. So the Peacebuilding Evaluation Consortium, the PEC, is made up of AFP, Search for Common Ground, CDA Collaborative Learning Projects, and Mercy Corps.
And our goals are to improve the methodological rigor of peacebuilding design, monitoring, and evaluation; to change the culture of evaluation, to make sure that everyone in the field understands the importance not only of good evaluation but of monitoring, learning, and adaptive learning; and to create a safe space where all of us can talk about our successes and also our failures and develop a culture of learning and impact within the peacebuilding field. We'll be happy to answer specific questions, and you'll learn more today about the work of the PEC. But it's my great pleasure to introduce my fellow members of this conversation this morning. Joe Hewitt, sitting on my right, is the Vice President for Policy, Learning, and Strategy here at USIP. He leads the institute's efforts to capture learning from all of the institute's programs and to apply it for more effective policy engagement and strategy formation. He brings more than 20 years of experience working to apply rigorous analyses of conflict dynamics to strengthen our tools for conflict assessment, to improve the design of peacebuilding programs, and to refine systems for program monitoring and evaluation. And he comes to us from USAID, so you can also give us the perspective today of being a government partner and a government donor. Isabella Jean, sitting to the right of Joe, joined CDA, based in Cambridge, in 2007. Her professional expertise is around conflict sensitivity, peacebuilding effectiveness, program design, monitoring and evaluation methods, and accountability and feedback loops. She's led collaborative learning processes and field research in Africa, Asia, the Middle East, and the Caucasus. And she co-authored Time to Listen: Hearing People on the Receiving End of International Aid, a book I highly recommend to all of you. Prior to joining CDA, she conducted policy research on conflict, coexistence, democracy, and education in multi-ethnic societies. Leslie Wingender, on the right, is a peacebuilding advisor on Mercy Corps' Peace and Conflict Team, and she supports their teams on the design, implementation, monitoring, and evaluation of programs in contexts like CAR, Colombia, Guatemala, Jordan, Iraq, Lebanon, Myanmar, and Yemen. As part of her team, she's led work on developing robust M&E tools for conflict-affected areas, and she provides technical support on conflict sensitivity, reconciliation, M&E, and conflict analysis. And we're very fortunate to have Adrienne Lemon from Search for Common Ground joining us from Burundi. And Adrienne, I'm sorry about your setup; I think you can see the audience, but not us, though we can kind of see you sideways. But are you able to hear us okay? Yeah, I hear you fine. Great. So thank you for joining us. Adrienne is the director of design, monitoring, and evaluation at Search for Common Ground, and she oversees the quality of global research and learning within the organization. She's a sociologist by training. She and her team support in-country staff to tailor research methods to fit complex contexts while maintaining the rigor necessary for reflective practice. And her doctoral work focuses on post-conflict reconstruction and political participation in Burundi. So welcome, all of you. What we wanted to start out with today were some of the challenges around design, learning, monitoring, and evaluation in the peacebuilding field. So I wonder, Adrienne, if you could start us off with some of the challenges that you've experienced in your work around DM&E.
At Search, we have a very diverse group of country teams working across 35 different countries. The contexts and the types of conflicts that we address in those countries are quite diverse and require a lot of different types of response. And so in evaluating and capturing some of the impacts or outcomes of those types of programming, we've found several different challenges. One is that you might have the same kind of programming — media programming, say — but the outcomes and the potential outcomes are very different depending on the context. So what your expectations are and how you understand success is very different in CAGR from how you understand it in Madagascar, for example. And so we've had to think really dynamically about how we address that, and how we come up with ideas about success and the impacts that are achieved, while maintaining an understanding of the complexity and the specificity of each context. Another challenge is in capturing behavior change and some of the long-term changes that are really important to peacebuilding processes, when a lot of our projects tend to be one year, two years. And even if you have a three-, four-, or five-year project, some of the longer-term changes that you're looking for might take much longer than that to really show what you're hoping the outcomes to be in the end. So we've also had to do a lot of work in thinking dynamically about how we capture some of those intermediate changes. How do we build that to tell a story? One of the things that our country teams have had a really tough time with sometimes is that they almost forget — because they're so goal-oriented and they know where they want to get to — they forget all the little changes along the way. And so how do we go back and make sure that we're capturing all of that? And then there are logistical and practical challenges of capturing some of the outcomes that we see and know are happening. Just logistically, it might be extremely expensive to compare areas where we're working to areas where we're not working. And in terms of isolating areas and the effects of programming, some of those higher-level designs that you can sometimes use with a food distribution you might not be able to use with a radio program, because you don't control exactly where the messaging is getting to, and all of those things. So we've had a lot of challenges around that as well, which we've worked through and are making progress on continuously. Thank you so much, Adrienne. I think you've flagged some key issues for us around complexity, context, collective impact, and the difficulty in our field of doing the kind of randomized controlled trials that we see in other fields. So thank you so much. Joe, challenges? Great, thank you. And on behalf of USIP, let me thank everybody who's here. We have filled a room to talk about M&E, which is great. It's like I'm living the dream right now. Seriously, thank you. And welcome to USIP. Melanie, thank you. Thanks to everybody for organizing this. This is going to be a good conversation. It was very helpful to hear from Adrienne some of the challenges of working in the field and doing M&E on the implementation side. I want to flag a challenge at the program design stage, and then I also want to talk a little bit about accountability. And it's nice to know that Isabella is an expert on this, so it's going to be great to hear what you have to say too.
Starting with program design, I think one of the big challenges for really good M&E is remembering that in the earliest stages of program design, you're already working on M&E. You're doing that when you're thinking through your theory of change. And program design teams, I think, often overlook that their investment in clarity with their theory of change will pay huge dividends with a good, strong M&E approach. So when we're quick with our theory of change and we specify something simple — if we provide civic education to young males, then they will be less likely to engage in violence — and we let that stand, as a very simple theory of change, to be the foundation for our program, we have forfeited a ton of leverage for a good, rigorous M&E approach. If we push ourselves to have very clear theories of change that are transparent about what's being assumed, we ask questions like: What kind of civic education are we providing? Exactly what does it look like when young males have a stronger appreciation for the dynamics of how their particular communities work and how government relates to society? Why does the education change their attitudes towards that? And then what's the logic that connects that to some proclivity towards violence? When you begin to ask all of those questions and push yourself to think about that, you'll start opening up doors to what you'll be measuring later on, what it is that you'll be looking for: measuring, first of all, what it is that you're doing in terms of your intervention, and then also measuring what it is you expect to change. That specificity comes in the early days of program design, when you're putting together your theory of change. So that's a big challenge. We have to remind ourselves not to skip past that part and not to go quickly through the explication of a theory of change. Second thing: accountability. This is hard. Every peacebuilding organization wants to demonstrate its success. But the way I think about what makes for a good M&E approach — and this gets to the hard edge of accountability — is this. My thinking has a lot to do with the way I was trained. I'm a political scientist; I was a quantitative modeler back in my academic days. And the way I was trained, the best kind of hypothesis to test is one for which it's very, very difficult to pass the test. If you're testing a hypothesis and you've made it very, very hard to find support for that hypothesis, and then you do find support for it, you can be very confident in how the evidence supports that finding. And so what we want to do when we're involved in any kind of social science inquiry is set ourselves up to have very high standards for how evidence will demonstrate whether we're succeeding. So let me convert that thinking to how an M&E approach should work for our programs. The best M&E approach is an approach that maximizes the potential that it will rigorously document our failures. That's the best M&E approach. If we have maximized the potential that it's going to document our failures rigorously, then we have essentially maximized the potential, first, that we can learn. But also, it's going to give us the strongest confidence, when we are successful, that our theories of change do make sense, that they are supported, and that we can feel good about the way we're going.
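To make Joe's two points concrete, here is a minimal sketch — with entirely hypothetical program details, indicators, and failure conditions, not any actual USIP framework — of a theory of change explicated so that every assumed link is paired with a measurement that could visibly fail:

```python
# A minimal sketch of an explicated theory of change: each assumed link is
# paired with a measurable indicator and a condition under which it visibly
# fails. All details here are hypothetical illustrations.

theory_of_change = {
    "intervention": "civic education course for young males, ages 16-24",
    "links": [
        {
            "claim": "participants gain knowledge of how community and government work",
            "indicator": "pre/post civic-knowledge test scores",
            "fails_if": "post-test scores are no higher than pre-test scores",
        },
        {
            "claim": "that knowledge shifts attitudes toward nonviolent participation",
            "indicator": "attitude survey at 0, 6, and 12 months",
            "fails_if": "no attitude shift among participants whose knowledge improved",
        },
        {
            "claim": "attitude change lowers engagement in violence",
            "indicator": "reported violent incidents vs. a comparison group",
            "fails_if": "incident rates match the comparison group",
        },
    ],
}

# Joe's test: the M&E plan is strong when every link can visibly fail.
for link in theory_of_change["links"]:
    print(f"- {link['claim']}\n    measure: {link['indicator']}\n    fails if: {link['fails_if']}")
```

The point is not the data structure but the discipline: each step in the if-then logic gets its own measure and its own failure condition, which is what makes the M&E approach capable of rigorously documenting failure.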
That said, you can see how this bumps up against the hard edge of accountability. It forces us to be committed to the idea that we could be comfortable with failure. And this goes to what Melanie started us off with in her opening remarks. If we're serious about transforming the cultures of our respective organizations into learning cultures, we have to put ourselves to the test and design our M&E approaches to maximize the potential that we will document our work even when it fails. So I'll leave us with those two challenges, and I look forward to hearing the rest. Great, thank you so much, Joe. Yes, accountability, design — huge. Isabella? Sorry. I'm also really excited to be here, and I just wanted to share one tidbit. Eighteen years ago, I spent the fall semester here in DC working with Search for Common Ground, helping them establish this Conflict Prevention and Resolution Forum and helping to bring the very first session of this forum onto the launch pad, so to speak. So it's really exciting to be here many years later, to speak on a panel and engage all of you. I had to pick and choose which challenges to bring forth, so as not to overwhelm you with the thought of so many that we can't even begin, and I'll make some really select choices here, given that my colleagues are already bringing some from the operational, implementing side and Joe mentioned some as well. One thing to start: it is the year 2017, and the peacebuilding sector has no standards — zero standards — which makes the work of evaluators quite difficult when we think about the quality of peacebuilding programming and what constitutes effectiveness. And that extends to M&E standards and how M&E of peacebuilding should be done. So in terms of industry-wide, sector-wide standards, that remains a really important conversation. There are many different opinions on whether or not we should have standards, and on whether or not we can learn from the kinds of challenges that the humanitarian sector went through all through the '90s and 2000s, setting standards and battling it out in a variety of humanitarian fora around Sphere standards and HAP standards and, more recently, CHS standards. What can we learn from other standard-setting bodies and industry-wide standards in other sectors? And I'll also ask the question of who should be setting them: should they be informed by the experiences and concepts and lenses of the Global North, or should they be co-designed jointly with peacebuilding organizations in the Global South? How should that be structured, and what does that conversation need to bring out in terms of all the various power asymmetries and politics in this process as well? So I just wanted to point that out; it's something that the Peacebuilding Evaluation Consortium is actively discussing and has been for quite some time. We've engaged evaluators at the American Evaluation Association conference last year and many other colleagues in some initial conversations on this, and maybe Melanie can also share some thoughts in the discussion if that's of interest to the audience here today. Another thing I wanted to mention: I like to distinguish technical challenges in peacebuilding M&E from political challenges, and by political I mean the institutional, organizational resistances and pushbacks and sensitivities, which at least partially relate to accountability issues and to an organizational culture of reputation protection and so on.
But also just internal institutional behaviors and dynamics that exist in many organizations, which we've been documenting at CDA across several sectors, actually. They truly are organizational behavior elements that can be categorized as such, and you can see patterns across different types of organizations. We continue to see some forms of data privileged over other forms of data: expert opinion privileged, formal evaluation data privileged over participant feedback and over other forms of information that may arrive at the doorstep of an organization to be considered, to be embedded in decision making, to be at least informing some of the programming decisions and so on — and not just programming. And so for me, some of these challenges are not purely technical; they are about what we value, and whether we actually actively solicit different types of information or commission only very particular types of extractive, information-seeking exercises. So the gaps in evidence, let's put it this way, are not just related to the fact that better evaluations could be done — I think we can all agree on that; the quality of evaluations could continue to improve, and there's been a lot of progress on that. It's also what happens to those evaluations when they cross the threshold of the organization and when they appear on important desks, decision-making desks, or in processes that can be quite significant in reorienting the organization's strategies and programming options and decisions. Having watched a lot of these debates in the humanitarian sector for quite some time as well, I know that even when evaluations exist, when there's plenty of them, a critical mass of them, and even in cases where they've been uploaded and shared — which is yet another challenge, of transparency and sharing — even when the evidence is there and it's been synthesized, the use of evidence remains an issue. And I think that's something we need to already be looking at in our sector as well, in terms of what enables it and how decision makers treat different forms of evaluative exercises and the kinds of results that come out of them. Another issue is around institutional setups. I'm always curious where M&E functions sit in the organization: where in the organogram are they located? I've been tracking this question specifically on accountability and feedback as well, and it's really telling how far from decision-making and learning processes these functions sometimes are, or how close and deeply embedded they are. We've seen an entire range, and it'll be great to hear more from Search and Mercy Corps on some of their setups and the different ways that they have thought about it. As some of you who know CDA's work are aware, we're not an operational organization; we are a learning partner. So in a way we serve as a pollinating bee, visiting many, many different organizations and learning from their experiences and their experimentations. But since we're not implementing our own programming, it's more of an observational comment here on the role of thinking through your organizational setup. We continue to see, in many organizations, M&E functions being relegated to a corner and being seen as a technical function, not always engaged in the kind of high-frequency learning that we all aspire to. We all aspire to see these rapid learning cycles and feedback loops and adaptive programming, but it isn't always possible, just from the organizational processes alone. So I'll stop here.
I won't go into my technical issues. I think the methodological pluralism and all of the issues that AFP and the PEC specifically have been tracking are quite interesting, and I'll share some resources that we've put out later in the session. Thank you. Thanks, Isabella. Leslie, challenges. Great. Well, I'm excited to be here and be on this panel. I share many of the challenges that my colleagues have just presented. Mercy Corps is an implementing organization, so similar to what Adrienne was sharing, we are implementing peacebuilding programs across diverse contexts, and so we have different objectives and different approaches. If it's in CAR, if it's in this section of CAR, if it's dealing with this issue about social cohesion or if it's more about grievances around shared resources, your approaches and your programs are different — and that's just in CAR. And what is it like in Iraq or Lebanon and other places? So we need to be contextual and think about the M&E systems for each of those programs, but also not lose sight of the commonalities between these programs, and capture that learning as an organization. One struggle we have at Mercy Corps is how do we capture that, and how do we share the learning that's happening in CAR with our teams in the Middle East? Even though the focus might be different, there are common indicators that they're using, and we can see how something worked in one context versus another. Figuring that out as an organization, and what our structures are, I think continues to be a challenge — but an exciting one as well. I sit on a technical team, so I'm able to see our global peacebuilding work, and part of my job is to make those connections: to share what this interreligious peacebuilding project in Nigeria was doing and what its M&E system was, because Myanmar is starting up, and even though they're going to have a totally different approach, they're able to look at someone else's M&E system and go, oh, I like that and this, or not that — and really foster that engagement. Also, on the one hand, when everything is tailored to the context, it seems like everything will be different. But the challenge is also pulling out the key patterns that we're seeing. Are people responding to certain questions in a way that allows us to measure the impact we're making? Another challenge, I think, is the timelines. Just as Joe was saying about designing a program: there are the realities of what a proposal process is like, and that is the start of the M&E process. But you write that proposal and then you send it off, you see if you get it, and then start-up happens quickly, and you're passing that over to the new program manager and the M&E staff on the program. How do we make that a smoother process? How, as an organization, can we foster that sort of handover, or make sure someone who was involved in the proposal team is involved in the kickoff of that program, so that at least the learning is there — or that it's documented in a really good way, so that if that person can't be there, there's actually a document that says: this is why we thought this was the theory of change at this point in time, seven months ago, or a year ago, or whatever.
But then also, recognizing that our programs don't line up with each other, we get findings at different times. That's the environment we work in, so continuing to create a learning environment at different points in time is something we at Mercy Corps are thinking about. And again, I come back to Mercy Corps because I'm thinking about all of the programs that we have ongoing, but then I think of Search and the many programs that you also have going: how do we continue to share that out with Search for Common Ground and Partners Global and other organizations that are implementing in these contexts? So I'll leave it there for some of the challenges. Well, thanks so much to all of you. And just on that last point, Leslie: part of what the PEC is trying to do — not only with our own organizations, but with the huge cross-section of the field we interact with, both here in the US and abroad — is to share that collective wisdom across the field, not only within implementing organizations and our partners but also with policymakers. And I'd like to raise just a couple of other challenges that have come up very starkly in the last week. We were in Europe last week for meetings in London, Brussels, and Paris, and what came up frequently — and we are seeing it now in our own government — is: what happens when policymakers don't care about evidence? We're doing the best that we can to come up with a really compelling, evidence-based set of arguments for peacebuilding when they are motivated by political arguments. So how do we start to connect the two in these really challenging, populist times? And one other point: we talked about the difficulty of sharing even within a multi-mandate, multi-country organization like Mercy Corps or Search, but how do we have that collective impact for the field? How do we take all of the evaluations that are being done so well at the program level and bring them up to peace writ large? So after naming these challenges, we also wanted to talk about bright spots and the very particular tools that all of our agencies and organizations are working on to meet these challenges. So I wonder, maybe Adrienne and then Leslie could talk kind of operationally, within your organizations, and Isabella and Joe could talk more broadly about the work you're seeing and your reflective work and your policy work. So Leslie, maybe you could start us off with bright spots. Great. So with all the challenges I presented, I can also think about the many Skype conversations, trips, and email exchanges with both program managers and M&E staff about questions about indicators and learning, and trying to connect our different teams. So that is ongoing in our organization. I have exchanged a lot with Search for Common Ground, and they also have a similar dynamic, so it's exciting to know that we're thinking about our organization and how we learn and what are effective ways of facilitating that.
So, as part of our role in the Peacebuilding Evaluation Consortium, we developed a survey tool several years ago to really look at how we measure the impact of an integrated conflict management and economic development program. This was in Nigeria, where we first tested the tool. As part of the PEC, we've been able to reflect on that survey tool and look at what information we got back at the baseline — what the response patterns were to several of these questions for key indicators around trust, around conflict management, around leaders resolving conflict, around economic activity, getting out of the house, being able to move despite security constraints — and then revise those questions, changing them a little bit so that in our follow-on program we can get better responses based on that learning. We did reach the end of that first program just recently, and we were able to use the tool to run an impact evaluation, because we had a comparison group in Nigeria. So we were able to run the changes between baseline and endline, compare them to the comparison sites, and find the impact the program had on perceptions of security in the area and on people's movements: where we were working, people were more able to move and access their resources than in the comparison sites. So it was exciting both to see the tool used and to have it help show impact for one of our particular programs. We were also able to see that we weren't able to get at some of the economic well-being measures — we continue to look at what the impact of conflict management is on economic well-being, on income, on investment in assets — so as part of this one study that we did under the PEC, we've made sure that in another program we've included those measures, both at baseline and endline. So again, it gets back to that timeline issue, where you learn something and it's like, okay, now let's make sure we put it into this next survey tool — but at the same time, we don't necessarily have findings about it yet. Another piece we've been looking at is across our governance and conflict management programs, where we have governance outcomes and engagement with local municipal government, and we've articulated our theories of change. Some of our programs may have had a theory of change and some of them didn't, so we took a step back to think through: why are we doing the objectives? Why do we have these three objectives? What are our theories of change? We articulated those and then laid out the indicators that we had measured them with. And then we took our baseline studies and the data that we had in three cases — in Lebanon, CAR, and Iraq — and looked across that data to see if there were relationships: whether people who were more satisfied with government services were more likely to trust the other group, really testing some of these pieces, which was quite interesting. I think one of the really interesting findings was that in all three of our studies there was a strong relationship between successful conflict management and conflict resolution by leaders and the perception that communities were safer. And this was just a baseline study, set up to look at a cross-sectional analysis of what people said on two questions.
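The kind of cross-sectional association test Leslie is describing can be sketched in a few lines. This is a minimal illustration, not Mercy Corps' actual analysis; the file and column names are hypothetical:

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical baseline survey: one row per respondent, with yes/no answers
# to two questions (file and column names are illustrative only).
df = pd.read_csv("baseline_survey.csv")

# Cross-tabulate "leaders successfully resolve conflict" against
# "my community feels safe", then test whether the two are associated.
table = pd.crosstab(df["leaders_resolve_conflict"], df["community_feels_safe"])
print(table)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.4f}")

# Caveat, as Leslie notes below: this is an association between two answers
# at baseline, not evidence of program impact.
```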
And so there was a strong relationship: they were more likely to perceive peace in their areas. But for those who said there was better conflict resolution in their areas, we didn't find a relationship with their support for violence — the question we had around whether you would support violence for a just cause. And that really reflects back on conflict management programs when they're in isolation and not addressing grievances: they might not get at that aspect of why people might support an armed group. So that's giving us some inkling of how to better analyze our contexts and think through the pieces of our program, or the various programs in an area, against the goal of what we would like to see changed in those areas. So I bring up those two cases of what we've been working on. And in terms of tools, what we've set out is really a reflection on how those questions worked in these three areas. So we have a tool that says: here's the indicator; here are the three questions we used; here's something to note about each of these areas; and we recommend this question versus that question, or this is how we contextualized it in CAR because of X. It's a tool so that someone — an M&E officer or a program manager, or other organizations — can take it and reflect on it and say, well, actually, this question did work well, let's continue to use it. So we can have, if not a standard indicator, then still common questions, so that we can eventually look across programs and across agencies. So those have been some bright spots for us, and I'm excited to see it go down to the very details of a question, because for me it does come down to reviewing a baseline at 10 o'clock at night, for it to be piloted the next day. It's like, which question is good? And it's like, oh, this one worked in CAR, Lebanon, and Iraq — let's put this one in. Maybe no one else in this room has ever had that experience. But it's also to engage with our field teams and our M&E staff, to let them push back or share their reflections, and to capture that. So those are some of the products that we'll be putting out on DM&E for Peace, which Adrienne will talk a little bit more about — having those tools as an example, a tool for you to take and go, yes or no, or yes, this works, let's try it. Let me just follow up with a quick question, Leslie. When we talked about this earlier, you talked about the trust levels, and you mentioned CAR. In response to your survey, there were trust levels of something like 95 percent, which, if you know CAR, might be a surprising finding. So how do you explain it when you get something which seems like an outlier? Yeah, so this is a survey question on trust — how much you trust the other group. And it was an interesting one; it was a question we also had in Lebanon, where it had more like a 40 percent positive response rate, so we didn't think the question was altogether bad. It's just that in CAR, I think, the context meant there was a bias towards responding really positively, and it caused our team to reflect on whether there was a bias to respond very positively, and on how we might change that next time. Is the trust between those groups real? When we say "the other group," is that even understood? Are they just thinking within their own Christian community or their own Muslim community?
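A simple pilot-stage diagnostic for the problem Leslie describes — nearly all respondents choosing the top answer — is to look at the response distribution before fielding the full survey. A minimal sketch, with made-up pilot data and a hypothetical flagging threshold:

```python
import pandas as pd

# Hypothetical pilot answers to the trust question on a 1-5 scale
# (these values are invented for illustration).
pilot = pd.Series([5, 5, 4, 5, 5, 5, 4, 5, 5, 5], name="trust_other_group")

print(pilot.value_counts(normalize=True).sort_index())

# Flag a ceiling effect: if most answers sit at the top of the scale,
# the question leaves no room to register change at endline.
top_share = (pilot == pilot.max()).mean()
if top_share > 0.8:
    print(f"Warning: {top_share:.0%} of pilot answers at the scale maximum; "
          f"consider rewording, e.g. asking about perceptions of the other "
          f"group rather than trust.")
```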
They may not be thinking across lines — even though that was the goal of the program: really to reach, in these areas in Bois, the two conflicting communities, where the divide ran along religious lines. I think there was another question, about perceptions of the other, that worked really well in CAR and had a much lower rate. So somehow that trust word didn't work as well as an indicator. So there are interesting reflections for our practice: working with the enumerators, testing that question again, and seeing that this one didn't work — we got a very, very high positive response rate — so is there another way we can word this to get more variety, so that it's not as biased? And again, how do we at Mercy Corps take that into practice when we run and pilot our surveys, and think about whether, in that context, perceptions was a better question to use? Thank you. So Adrienne, can you talk about some of your bright spots, and also DM&E for Peace, which has emerged as such a critical tool in this area? Sure. So the way that Search has approached getting better at learning, and at sharing lessons learned across all of the countries we work in, was really to prioritize transparency and open discussion around issues of M&E and the way we capture results and understand them. As Leslie said, we also have a team, called the Institutional Learning Team, that's very similar to what Leslie does at Mercy Corps. We support that internal process of reflection within each country and help teams take lessons learned from other countries. We've piloted different initiatives to encourage cross-country discussion and cross-pollination of the lessons learned by each team. And we've got a few different events that take place — for example, What Works, which is an internal webinar that we do within Search, where the country teams come together around a theme and discuss what they have learned and what their experiences are with some of the different approaches that we take within the organization. So it's a really good opportunity to encourage that discussion and encourage understanding of both successes and failures around these different approaches. We also work on the internal reflective practice of country teams: making sure that we find those opportunities where you can do a really great evaluation, capitalize on other people doing interesting studies, or capitalize on a really good opportunity to gather important information — and make sure that makes it back into country-team reflection, so that they're having their own discussions. But then we've also prioritized transparency and discussion in an external format. And that's where DM&E for Peace comes in, among several other approaches that we've taken. We publish all of our evaluations online — the good, the bad, the ugly, everything in between. And I think what that's been able to do is let us have very interesting discussions with other groups, and we've been able to share very openly and honestly what we have learned through our experiences in doing peacebuilding work.
And that's been incredibly useful for discussion with other organizations, with donors, and with people in the political sphere as well, when you can share those openly and say: here are the results of the work that we've done. So that's been important. And then we have DM&E for Peace, which is an open-source knowledge platform. It's for all kinds of practitioners, right? Peacebuilders, development practitioners, evaluators — even academics or donors are welcome. Anyone's welcome. The idea is to create an interactive space where people can share lessons learned, share tools, and talk about the emerging practices in DM&E for these kinds of programs. We piloted this as part of the PEC, the Peacebuilding Evaluation Consortium, in order to make sure that there was a library and a kind of interactive community where people could come, improve their own work, and really engage in thoughtful design around peacebuilding and development. So DM&E for Peace now — I believe Jack, our DM&E for Peace manager, is in the audience — I believe we now have participants from 191 countries who engage on the platform, and almost 5,000 Twitter followers. So it's something that's really managed to create conversation and engage a very diverse group of people. And to add to that, we've at different times brought the DM&E for Peace team out to country teams to help them understand how they can engage better with it. These open platforms are really fantastic for the very diverse group of people who already use these tools, but reaching the partner organizations, or people who don't have experience using these types of platforms, is another level — another challenge. And so we've managed to bring staff out to teach people how they can also use these. And the tools are available in English, French, and some in Arabic, so there's a very real opportunity for diverse engagement. That's been really interesting to see. And then, to build on that, DM&E for Peace also runs the Thursday Talks program, which is designed to connect expert evaluators — people who are experts in their field around different themes — with intermediate-level or emerging evaluators who are very interested to learn about the practical challenges of evaluating peacebuilding and development programming: how do we address those challenges, and how do we learn from each other? I think we've now done about 62 of those Thursday Talks, and on average they're attended by 45 people. So, to give you an idea of the reach of those types of platforms, I think they've been very successful in bringing people together around the practical issues. They give a forum for people to discuss the lessons learned and the tools that they're developing, but also the practical challenges and how they can be addressed, and to ask all of the very detailed questions: all right, I hear you, I hear the theory, but I'm doing this in my country and in this program — how does that relate? Or: I hear that this is a very interesting approach; how is it different from this other approach that I've heard about? — getting into some of the more detailed differences and lessons learned around different approaches to evaluation and peacebuilding. So that's DM&E for Peace and the Thursday Talks.
And then I hope you can see how that's also translating into the culture at Search: our team has been able to make use of these external platforms and also to mirror that and evolve internally, to make sure we're sharing lessons learned internally and across countries, and taking advantage of opportunities where we can innovate, do something really interesting, and then share it with other teams and say: this is how it worked for us — how do you think it will work for you? So those have really been the main approaches. The one thing I would add is that our DM&E for Peace team was also able to co-host the Breaking Barriers Conference in Cape Town, South Africa, last year — which speaks to Isabella's point about inclusion and really getting everyone involved. The Breaking Barriers Conference was a workshop built around the process of co-design, and it had a variety of different actors come together to talk about inclusion in peacebuilding, and about the challenge of making sure that when we're understanding success in programs, or the impacts of programs, we're actually looking dynamically at everyone involved in that system and that ecosystem, and not just looking at our own idea of who and what was supposed to be affected. So in the process of co-design, we managed to engage practitioners, evaluators, academics — again, a wide variety of people — to get into one room together and discuss some of the concrete ways that we can keep moving forward and keep innovating on these issues. So those have been some of the bright spots and successes, I think, in terms of being able to meet these challenges and talk dynamically and openly about the fact that, yes, we don't all have the exact same experience, all the contexts are complex, but we also have shared lessons learned, and we can learn from each other and from new approaches that others are trying. So that's both an internal and an external process at Search. Thank you so much, Adrienne. And I invite all of you to join the Thursday Talks — we would go from 45 to 145 if all of you did that. There's also a mentoring program through DM&E for Peace, so if you know young people, or people earlier in their profession, who need guidance, it's there. So, Isabella, this is a good segue to talk about the online field guide and other bright spots and tools that CDA is working on. Great, thank you. Yeah, and just to respond to some of the institutional challenges I raised earlier: my bright spots are actually Search for Common Ground and Mercy Corps. I won't repeat everything that has already been mentioned, but when you see an organization invest significant resources, including human resources, to really support some of these learning processes, and to think intentionally about placing these processes in a particular way and connecting them to program quality reviews and internal reflection and evaluative processes — it is a really important step that I think distinguishes these organizations, and a few others that we work with, like Saferworld in the UK and a couple of others that are using very innovative methods like outcome harvesting and systems-thinking-informed evaluation to inform both their program design and their policy work on Goal 16 of the SDGs, and so on.
So those are definite bright spots for me. On actual resources — and linking this to my comment on zero standards — in the absence of standards, CDA for many years has been working, as part of our Reflecting on Peace Practice program, with what we used to refer to as criteria of effectiveness: something that came out of CDA's initial set of 20-plus case studies and the work that maybe some of you have seen, called Confronting War. They're not exactly standards; we've never referred to them as such, and we were pleasantly surprised, and a little bit alarmed, when we heard that a lot of people use them in lieu of standards. So what we decided to do a few years ago, in 2013, was this: we took existing methods for program quality assessments and evaluability assessments — something that is not unique to the peacebuilding sector and is in fact better known and applied in other sectors, like education, and other disciplines as well — and we infused them with some of the RPP elements, the Reflecting on Peace Practice elements: the quality of conflict analysis; the frequency with which you update your theories of change, how well articulated they are, and how frequently they're tested; program logic — really testing the logic of how you intend to reach your desired outcomes and desired changes; and also distinguishing changes at the "peace writ little" level, where it's very localized, from larger, societal-level changes, which we often refer to as "peace writ large." We drew a lot of this and other elements together with the program quality assessment knowledge that has already been accumulated by evaluators and program designers elsewhere, and we tested it in the field. We went out to Afghanistan and tested it there with Norwegian Church Aid; we went to the Caucasus and tested it with International Alert, in a long-standing program that they've had in the region; and we tested it in Sri Lanka, as well as in Mali with Interpeace. The findings from that were recently pulled together in a resource guide that is now out, and I would highly recommend that you check it out, for a variety of reasons: it's actually quite flexible and adaptable. My colleagues spoke about it not even a week ago, last Thursday, so there's a Thursday Talk recording on this guide. It's called Thinking Evaluatively in Peacebuilding Design, Implementation and Monitoring, and it offers a couple of options that are quite useful; a lot of organizations reflected on its utility when they tested it with us. Instead of a full-on evaluation, when the timing is not right or the resources are not there, these processes allow you to engage in evaluative, reflective exercises, either within your team or with some external facilitation if you require that, to really test these elements of your program design and the quality of the implementation. And yet it still isn't exactly the kind of standards that we wish to see in the sector. To find this and other resources: out there on a table there is a single-sheet flyer that highlights the online field guide, into which the PEC consortium and my colleagues have put quite a bit of time and resources, pooling foundational resources — some really important intermediate resources, some beginner-level resources — that can be at your fingertips if you are a program designer, or if you are an M&E specialist or evaluator interested to see what tools and frameworks and methodologies have already been adapted for peacebuilding programs, or which ones have been peer-reviewed and advised on by the evaluators on our
advisory group and in our network. You can find the online field guide on DM&E for Peace — there's a link here. And just a couple of other resources that we recently commissioned that I'd like to briefly mention. I think there are a lot of wicked problems out there in the peacebuilding sector — and we use this term because there is a lot yet to be unpacked, uncovered, and truly understood. One briefing paper that you will find on the online field guide is on peacebuilding ethics. Another, which we commissioned recently and which is already out online, is on the role of the supernatural in evaluating interreligious peacemaking action. The reason we commissioned it is that, increasingly, in a lot of interfaith and interreligious peacemaking work, evaluators found themselves facing situations where they were asked to pass a judgment — because, essentially, evaluation is judgmental — on whether or not a program achieved its intended results and has been effective, along with many other questions embedded in the evaluation framework. And they were working with deeply religious people for whom the role of the divine is part of how they explain the causality of reaching certain outcomes, or why you haven't reached them, or the timescales of reaching certain outcomes, or the reason why we stay the course even when we don't see effectiveness — effectiveness defined as reaching your results in a particular, efficient, and effective way. So it was an interesting collaboration between a colleague of ours who is a pastor and a seasoned veteran evaluator, whom many of you know, who is an atheist — the two of them working together to bring out some of this initial thinking. I highly recommend you check that out. And another paper that's coming out soon is on monitoring and evaluation of preventing violent extremism — another wicked problem. I'll stop there. Thank you so much, Isabella. And I'll just mention briefly, in line with the supernatural: a subset of our Peacebuilding Evaluation Consortium is a program called the Effective Interreligious Action program, looking specifically at problems of monitoring, evaluation, and learning in that interfaith peacebuilding space. So, Joe — USIP is one of our founding partners in galvanizing the world of monitoring, evaluation, learning, and design — what are your bright spots? Let me get at that question by starting a little bit big. My remarks will be a bit different from what you just heard about tools for good field programming and tools that help program implementers; I'm going to talk instead about the bigger policy picture. I'm going to take your challenge about thinking about how evidence influences policymakers, and then I'll bring it down to USIP. I think we're at a point where we — as a peacebuilding community of peacebuilders, practitioners, and academic scholars — have remarkable consensus, at a big level, about what drives conflict, about what makes a country vulnerable to violent conflict. We're at a point where I think there really aren't any more surprises when conflict breaks out in a particular country; most people agree that that was a country that was conflict-vulnerable. Within the development world — a world that I was a part of up until just very recently, and I guess I still think of myself as a development professional — the large, I guess it's a theory of change, but the meta-theory about conflict is that
pretty much all internal armed conflict can be traced back to a serious dysfunction that exists between state and society. Wherever there is a broken or frayed social contract between people and their government, the conditions exist that will enable armed violence. That is the big picture, and that picture is fortified with lots of evidence that has been thrown on the table, starting in the late 2000s, culminating with the 2011 WDR, and continuing with more academic research showing that all kinds of dimensions of broken state-society relations have a role in influencing the outbreak of armed conflict. So that's a pretty solid consensus about the structural drivers of armed conflict, and that academic consensus has driven significant policy consensus. Isabella briefly mentioned SDG 16 — Sustainable Development Goal 16, about the importance of promoting peaceful, inclusive states — which represents part of that consensus. Earlier, in late 2011, the New Deal for Engagement in Fragile States represented a consensus between developing countries that self-identified as fragile, the development donors that worked in those countries, and a huge community of civil society actors about how you work in fragile states. That was a major consensus about how to work in these countries and an agreement about what the structural causes of armed violence ultimately are. All of this is at a very high level, a structural level: a characterization of countries that exists beyond the reach of how individual peacebuilding programs operate. Individual peacebuilding programs operate in different small parts of the state-society relationship. And so, while we have a very important bright spot — a point where we have this significant consensus about what structurally is driving conflict — we now have to start filling in more of the puzzle: how do our individual peacebuilding programs fit in, in addressing or providing a remedy for broken state-society relationships? We're getting there, and this is where I'll tell you a little bit about how USIP is trying to do this. USIP refreshed its strategic plan in the middle of last year, and the strategic plan is essentially built around a theory of change: if you address the causes of fragility — the extent to which you've got a broken social contract — you will reduce the conditions that promote armed violence. So how do you do that? USIP specified a number of peacebuilding objectives — the kinds of goals that are within reach of peacebuilding programs — and articulated a logic about how those objectives play into addressing the larger causes of fragility. So, for example: peacebuilding programs that promote inclusion, building institutions that are more inclusive of all major societal groups — that's a particular peacebuilding objective. Peacebuilding programs that work to advance or strengthen systems of justice. Peacebuilding programs that work to strengthen community security. Peacebuilding programs that work on reconciliation, bridging divides in society. These are all peacebuilding objectives that are within reach of our peacebuilding programs and that, we believe, all add up in a way that addresses the larger causes of fragility. The challenge for the peacebuilding community is to start compiling evidence about how all of this works, how all of this work adds up.
And it's something I've heard from all of my colleagues this morning: we've got to get ourselves to a point where the evidence from our particular programs can be aggregated, so that we can learn how it all adds up. And I think what it should add up to is an understanding of how these programs are addressing the broader structural conditions that are contributing to fragility. So I'll end there — I know we're going to get to Q&A from the audience soon, which I can't wait for. There is a bright spot that we should not lose sight of: we're sitting at a remarkable time, with some very important consensus. It's up to the peacebuilding community now to continue working to align our programs and to figure out what we're learning from all of them, to help us better understand how we're addressing fragility. Thanks. Thank you, Joe — that was really powerful. And I might add another consensus that's forming, perhaps a little bit broader: in the past year, around the World Humanitarian Summit and the World Bank Fragility, Conflict and Violence Forum, we saw a real consensus among top policymakers and institutional leaders that conflict prevention lies at the heart of resolving huge issues of humanitarian disaster, refugee flows, pandemics. So I think there's even more focus on us now as a field, as being a driver of change in those areas — and so, again, garnering the evidence for how we fit into that is a huge piece. I'd like to say just one word about the next steps of the PEC — we're getting Adrienne back; good, okay. So, all of the challenges we've talked about today — collective action, how we actually move from better peacebuilding monitoring and evaluation to better peacebuilding itself — will be the next phase of the PEC. We've applied for another grant from the Carnegie Corporation, which has very generously funded this work, and we have funding from the GHR Foundation, which funds the smaller interreligious action program. But we really look to all of you to give us questions, engage with us, go on to DM&E for Peace, as we move to this new phase where, as Joe put it very well, we're all figuring out where our pieces of the puzzle fit into what I think we are starting to accept as a more universal theory of the effectiveness of peacebuilding. So we have about 20 minutes, and I would love to hear from all of you. Why don't we cluster our questions and then take it from the panel. Oh my goodness — okay, so, yes sir, your hand was up first, and then here, and then here, then Lina. Joe, I'm really glad you brought in this policy perspective. On looking for evidence, I would point out that this is nothing new: in 1980 there was a bipartisan commission report under President Carter which said basically that unless we end the worst aspects of hunger by the end of the century, we will have wars or genocide. So you'll excuse my frustration. There's a lot of great stuff going on, but we need to scale it up, and fast. You mentioned Sustainable Development Goal 16, which is a key goal, but we need to do it — there needs to be a comprehensive approach. What do we do to overcome this trust issue with our government, with the media? You mentioned the problem with facts. Thank you so much. Hi, I'm Sharon Kodak.
and I have a question regarding the programs. Have you found that conflict resolution and peacebuilding programs are more successful if they bring together people from warring parties, for instance, programs that involve both Palestinian and Israeli youth, or programs that involve both sides of the CAR conflict? Are those more successful than trying to train people separately? Thank you.
Great question, thank you. Let's see, who else was here in the back with a question? Up here before, okay, Lena.
Hi, Lena from Search for Common Ground. Thanks to all the panelists. I just want to dovetail on what you said, Joe. I think the opportunity is to make monitoring and evaluation serve the peacebuilding goals in terms of quality, but also to be able to speak to policymakers, and we are really on the edge of being able to do that, if we understand that those policymakers also want to make a good investment, one that will lead them to be proud of supporting stable and healthy societies. And just a couple of comments. Search for Common Ground also recently completed a strategic planning process for the next 10 years, and we looked, in a way similar to what you were saying, Joe, at what enables conflict to become violent: that state-society relationship, which is vertical cohesion, and also horizontal cohesion, with fragile and conflict-affected states being a big part of the picture, but also other states that may not be in that state of fragility yet are ripe for conflict. I think that as we go forward, we should be able to orient the distinct tools in our monitoring and evaluation toward how we're strengthening vertical and horizontal cohesion. From a Search for Common Ground perspective, we've found that we do that better when we're able to make collaboration an accepted and expected social norm, and to have institutions reflect that culture of collaboration, what we call the common ground approach. So now, going forward, the challenge is really to do that in a way that speaks to the prevention of violent conflict, in a way that makes sense to policymakers, that they will see in their interest, and that will strengthen our peacebuilding tools, so that we're not working at cross purposes.
Okay, thank you. Why don't we take one more. Yes, sir.
Hi, my name is Ashok Panikar and I'm from Metaculture. I'm a practitioner, and I've been a practitioner for three decades now. Yes, everything that's happening in the world of M&E is fascinating and very necessary. I just have a minor word of caution, and this concerns the technocratic approach that conflict resolution professionals and agencies are sometimes forced to take, because that's what the funders want. I think we should be very clear about why we emphasize M&E at this level. Is it because we want to see better results? Or is it because funders want a certain kind of yardstick by which they can determine how much to fund? Or is it because that's the only language funders can actually understand? The reason I say this is that, as a practitioner, I am conscious of the kinds of skills and values that are not amenable to quantification, let alone measurement. For instance, there has been a tendency in the last 10 years or so to make conflict resolution a science. I think we must recognize that very often it is an art. I had an earlier career as an artist, and I have to say: the ways in which artists take risks, the creativity that is required to do this work, crumbles when measured.
We need to be very, very conscious of the implications of jumping headlong into this entire world of somehow making everything measurable and quantifiable. I don't want to overstate my case; I appreciate this, and in my own work I need it. But I think we should be a little wary of how far we go. Thank you.
Good, thank you, Ashok, and I just want to welcome you back to Washington after moving here from Bangalore. It's wonderful to see you. And Alexa, and then we'll go to the panel.
Hi, Alexa Courtney with Frontier Design Group. Thank you to all of you for sharing your experiences. A quick question about data as a global good, and potentially data sharing as, this is my hypothesis, a catalyst to putting together some of those puzzle pieces, Joe, that you mentioned. I'm relatively new to the DM&E community, so I wonder whether there is a data hub, or data-sharing agreements that are maybe already in place for DM&E, or, if not, where you all collectively turn to actually look at raw data, to reproduce your own results or draw your own insights. I appreciate Search for Common Ground's kind of radical transparency in publishing all of its evaluations online, which is a huge step forward, but it's very different to access a synthesized evaluation than the underlying data. Thank you.
Thank you. So we have some really interesting questions: these questions have been around for a long time, you can even start with the hunger work, where it's been done, and then move out to conflict, so what's new here; whether it's better to bring peacebuilders together across lines or separately by community; the collaboration issues; is this an art or a science, and have we tried to measure the unmeasurable; and then open data and data hubs. So Joe, do you want to start, and we'll move down?
Okay, it's tempting to try to weigh in on all of these, so I'm not going to; I'll pick a few. Sharon, I'll start with your question. I thought I saw Caroline Breerly come in here; oh, there she is, my former, wonderful colleague. USAID's Office of Conflict Management and Mitigation did a lot of work funding people-to-people programming, based on the theory that it's much better to bring together people who historically have been in conflict and work with them together to transform their attitudes and behaviors toward each other. And there is a lot of rich evidence to suggest that that works. How you do it depends on lots of things; these programs take all kinds of different shapes and sizes. One very successful program, since you mentioned Israelis and Palestinians: CMM has for many years been funding a program called PeacePlayers, which brings kids together to play sports and then surrounds that with a lot of other programming, and there's a lot of rich evidence to suggest that their attitudes do in fact change. An impact evaluation was done of that work, funded partly by USIP and partly by USAID, and the evidence was pretty substantial that it worked. There were other projects where you brought people together for natural resource management efforts, people who had historically been in conflict, and the theory was that bringing them together for a cooperative endeavor would transform their attitudes. So there's a lot of rich evidence to suggest that what works is to bring people together rather than to work with them separately. And Lena, I'll pick out your question.
I think you're exactly right: we have to be concerned with both vertical cohesion and horizontal cohesion. The only thing I would add, because I agree with what you said, is that horizontal cohesion, the extent to which groups in society are getting along with each other and processes for reconciliation are moving forward and so forth, doesn't happen independent of the quality of state institutions. Governing institutions have a role to play, I think, in promoting horizontal cohesion. It's very hard to imagine horizontal cohesion existing in an environment where governing institutions are not responsive to all major identity groups, where they are not inclusive of their needs. So I think there's a dependency between horizontal and vertical cohesion that we have to be aware of when we're designing our programs. That's just an addition to your comment; I think you're spot on about the importance of paying very close attention to horizontal cohesion.
I also can't resist weighing in on Ashok's comment about conflict resolution being an art and, I'm not sure if you said this, not necessarily a science. I will say this: no matter how you view the activity of peacebuilding, we have to be faithful to the importance of evidence in our work. If we cannot demonstrate that we are succeeding, whether that demonstration is qualitative or quantitative, then we are in deep trouble. So whether you say it's an art or whether you say it's a science, your commitment has to be to pulling together rigorous evidence about our work, which can be done with qualitative evidence; the point is that we have to be rigorous.
Thank you. I'll also pick a few to respond to. In relation to the first comment, I'm sorry, I forget your name, on how nothing is new: it is a pattern in a lot of the case study work that my colleagues and I do at CDA that we go back to earlier conversations, on participatory development for example, and ask what's new in that, and yet it is still a very active and much needed conversation. It is surprising indeed. Just as an equivalent example, Robert Chambers seems to have already said all the really important things that needed to be said back in the 70s, and yet we went out and did the Listening Project in 28 recipient countries and heard just how minimal participation, inclusion, or ownership of any of these decision-making processes still is. So indeed, lots of deja vu moments, I think, for all of us across different sectors. What I think is really important for us, in our work with donor government agencies and at the policy level, is to infuse some of that thinking with systems thinking, with the complexity-aware approaches that allow you to retrace, retrospectively, some of the barriers and the places where efforts have failed, collective efforts as well as individual agency efforts, but also to plan ahead. So I would say that's a tool we need to be using more often, and collectively, not just in individual organizations.
On intergroup versus single-identity work, I think it's very contextual, and I think the two can be complementary. You mentioned Israel and Palestine; we've captured a number of really powerful examples of single-identity work, which at the time was quite important for Israelis and Palestinians, meeting separately to work on their societal issues, on the various divisions within their own societies. And it's no less challenging to do single-identity work in this country; it's very similar, right?
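As a purely illustrative sketch of how M&E tools might be oriented toward the vertical and horizontal cohesion discussed here, assuming a survey built on 1-to-5 Likert items: the item names and the two-index construction below are hypothetical, not any organization's actual instrument.

```python
# Hypothetical survey items; 1 = strongly disagree, 5 = strongly agree.
VERTICAL_ITEMS = ["trust_local_gov", "gov_responsiveness", "fair_treatment_by_state"]
HORIZONTAL_ITEMS = ["trust_other_groups", "comfort_intermarriage", "shared_activities"]

def cohesion_index(response: dict, items: list) -> float:
    """Average the 1-5 responses for the given items, rescaled to 0-1."""
    scores = [response[item] for item in items]
    return (sum(scores) / len(scores) - 1) / 4  # map 1..5 onto 0..1

respondent = {
    "trust_local_gov": 2, "gov_responsiveness": 3, "fair_treatment_by_state": 2,
    "trust_other_groups": 4, "comfort_intermarriage": 3, "shared_activities": 4,
}
print("vertical:  ", round(cohesion_index(respondent, VERTICAL_ITEMS), 2))    # 0.33
print("horizontal:", round(cohesion_index(respondent, HORIZONTAL_ITEMS), 2))  # 0.67
```

Tracking the two indices side by side across survey rounds would make the dependency Joe describes visible in the data: whether horizontal cohesion ever moves independently of the vertical, institutional side.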
And on vertical and horizontal cohesion, what worries me is that in a lot of societies, civil society space is closing up. We're seeing that quite strongly in the former Soviet republics, the region I am personally from, Armenia and its neighboring countries, where a lot of the peacebuilding partners we cooperate with in Abkhazia, South Ossetia, and a number of other places are having such a hard time actually engaging with government structures. In Ukraine and many other places, state structures are extremely restrictive and place serious barriers and threats on that kind of vertical, state-society relationship work. For external actors to be engaged in those contexts becomes a highly political, sometimes almost toxic, exercise, depending on who you are as an outsider intervening. And taking it away from Eastern Europe: even in Ethiopia, if you want to tackle the citizen-state relationship, you are instantly restricted by the regulations the government currently places on how much democratization and human rights related work you can do, as an outside actor or an inside actor. So thinking about M&E in contexts that are so restricted is quite an interesting area for us, and we should consider what we as outsiders can offer to insiders working on these issues in their own societies, in highly contested and constricted spaces.
And I cannot agree more with you, Ashok, about the pendulum swinging too far toward the technocratic. That's what I was referring to at the beginning of my comments: the technical fixes that do not solve the political and institutional issues. Having visited many, many field offices out there on the front lines of this work, and having seen enumerators sitting in front of computers typing things into SPSS databases without really being engaged as critical thinkers, as people who can think on their feet and do the kind of adaptive learning we need, how we treat M&E is problematic for me personally as well. I now teach this as a course at the university, and I'm highly aware that we're training a generation; it is a professionalization exercise now, in planning tools and in measuring tools. When is it actually much more of an art form, and do we allow for that space? That is a real question for me as well. Thank you.
Thank you. So, Leslie and Adrienne, we have about five minutes left, so let's be fast.
Yeah, I'll address Alexa's question about sharing data. We've discussed this a lot as the PEC, and to be honest, it's very hard to share the data; there are a lot of reasons behind that. Currently USAID does require us, for USAID programs, to upload our data, to data.gov or the development data hub, something like that. But we need to think through beneficiary protection and be really clear, even with that website, about what goes there. And I think it comes back to the question: what do we want to do with all of that data? Yes, statisticians or economists would be excited to have access to it, but we also need to think about what we want to do with it, and start with that question. I'll put in a plug for the USIP impact initiative.
They're looking at seven programs in CAR, across many different systems, to see what data is coming out of that collectively, and that's been a very challenging process. Diego is here, and Ruben was here earlier, and I think what they're coming out with as an initiative will be very interesting to think about, as a process and for what we might learn. It was for USAID programming, specifically on governance and conflict management, and it would be really interesting to see what comes from it. So as a peacebuilding field, there's still that area to explore collectively: if we wanted to take one region, one area, and all the programs going on there, how might we deal with the different data, the fact that the data comes in small sample sizes, all these differences? There's a lot of thinking through still to do there, and it's an opportunity at some point. But to be clear, it's also very tenuous to get people on board for that, not because people don't want it, but because of beneficiary protection, organizational reputation, and so on.
Just very briefly on that note: we held a roundtable last week with EPLO, our sister organization in Brussels, with EU policymakers and evaluators. They are introducing a system called OPSYS, similar, I think, to USAID's, in which all the data would go directly into their system. But when we asked about privacy protection for local partners, and who would actually be able to have access, they were very squishy. It was clear that the technology was ahead of the ethics.
Adrienne? Okay, so, very quickly, I really like this question of whether peacebuilding works better when communities are brought together or when we keep people separate, partly because it touches on a couple of the other questions that were asked. Yes, there's a lot of data showing that bringing people together is what's most effective, and I do think that's the goal. But what we've also seen in our programming is that doing that too early is not necessarily most effective, and this goes to peacebuilding being a science or an art, and to how we collect that data and understand the lessons learned. What does it mean to bring people together well? In CAR, for us, it has meant bringing a couple of key people together and showing the rest of the community that it can be done, before pushing everyone together. In DRC, it has meant bringing communities together around common issues they can deal with, to the point where two communities that hadn't been able to talk for a long time signed a peace agreement. So what that actually looks like, and what it means, are very different things in different contexts, and that goes to this work being somewhat of a science and somewhat of an art.
And in terms of data sharing, I did want to touch on that very quickly, because for Search, what Leslie said is right: we also upload to USAID. But beyond that, we've managed to find little pockets. With the Harvard Humanitarian Initiative, we've managed to share data so that we can build our data on top of theirs, and UNDP in Sudan is producing a data set that they said they'll be able to share out, so that we can build on it rather than recreating some of the same data. So there are opportunities, but I find them to be very relational and very specific at the moment.
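As one hedged sketch of the beneficiary-protection step raised above, assuming the survey data sits in a CSV with direct identifiers: before posting anything to a public portal, a team might drop identifying columns and replace names with salted hashes, so records stay linkable across survey rounds without being readily re-identifiable. The file names, field names, and procedure here are illustrative, not any agency's actual requirement.

```python
import csv
import hashlib

SALT = "project-specific-secret"  # kept private and never published
DIRECT_IDENTIFIERS = {"name", "phone", "village"}  # hypothetical columns to drop

def pseudonym(name: str) -> str:
    """Stable, non-reversible stand-in for a beneficiary name."""
    return hashlib.sha256((SALT + name).encode()).hexdigest()[:12]

with open("survey_raw.csv", newline="") as raw, \
     open("survey_shareable.csv", "w", newline="") as out:
    reader = csv.DictReader(raw)
    fields = ["respondent_id"] + [f for f in reader.fieldnames
                                  if f not in DIRECT_IDENTIFIERS]
    writer = csv.DictWriter(out, fieldnames=fields)
    writer.writeheader()
    for row in reader:
        clean = {k: v for k, v in row.items() if k not in DIRECT_IDENTIFIERS}
        clean["respondent_id"] = pseudonym(row["name"])  # linkable, not readable
        writer.writerow(clean)
```

Salted hashing is only one design choice, and a minimal one; for low-entropy fields like names it still depends on keeping the salt secret, which is part of why, as the panel notes, "here's all the data" remains the hard case.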
And, I think for many of the cautionary reasons that were discussed, there's less of the "okay, here's all the data, and anybody can use it at any time." Then, in terms of speaking to policymakers: we've worked a lot to take monitoring and evaluation, and the data gathered in working with some of these different groups and in data sharing, and use it to show policymakers that here is a robust group of people working together with a lot of different data and a lot of information, and here is the case for the approaches that need to be taken. It's been effective, I think, in the pieces of it I've been involved in. And I've seen policymakers starting to rethink. Just in January I was in The Hague, talking with a very diverse group of people about how we understand the effects of peacebuilding programming and other types of programming, and they were really open to asking what they need to do as policymakers to set different standards, so that we can better understand these issues and encourage learning rather than discourage it. I found that to be an interesting shift from past conversations.
Great. So thank you, Adrienne, for joining us from Burundi, and thank you to the panel. Before we leave, Carolyn, could you raise your hand? Carolyn runs our learning and evaluation work at AFP. If you have questions about the PEC, talk with her, or talk with any of us, and look on the table in the hall for a list of all the resources we discussed today. And thank you all again, to the panel and to all of you, for being here today.