A couple of years ago I came here to TICTeC to describe a project that was then in the field, and now I'm happy to be back at TICTeC to describe the findings of the study. It's a bit funny, because the results are somewhat disappointing. Still, I hope that even if the findings are disappointing to some, there are important lessons here for future work. Another goal I have is for people to see the value of collaboration between practitioners and academics. This is joint work with Melina Platas and Jonathan Rodden at Stanford.

I'll start with the motivation. The project began because we were all observing exponential growth in the penetration of mobile technologies; this figure is from Africa. Seeing figures like these, we asked ourselves: this is a serious revolution, so can we use this penetration of mobile technology to improve service delivery in low-income countries? Most of my work is in sub-Saharan Africa.

To preview the study before we dive in: we collaborated with a local NGO in Uganda, and together we looked at the effect of a community reporting platform on service delivery outcomes in health, education, and water. The platform allowed citizens, villagers living in really remote areas, to report service delivery problems by texting district officials in the district capital. We introduced the platform and ran the study as an experiment: of 200 villages, 100 were selected to receive the platform and 100 served as a control, and we followed them for two years.
And though we found that utilization of the platform was relatively high compared to benchmarks from other studies that tried to crowdsource information with mobile technology in low-income countries, we did not find large effects on service delivery outcomes.

In terms of the starting point, where is the need coming from? The need comes from persistently abysmal public service delivery in many low-income countries. About 30% of teachers and nurses are absent at any given moment. Even when they are present, effort levels are low: a lot of teachers are sitting outside taking tea, and, especially in the health sector, there is a lot of abuse by health providers. On the one hand, the government has the mandate to discipline frontline service providers. But one of the problems is that many local governments don't have reliable, targeted information: when exactly a teacher doesn't show up, when a clinic is closed when it's supposed to be open, or when a nurse is abusive. On the other hand, citizens, the villagers, do have this information, but they might be reluctant to take action because there might be repercussions. That is the status quo before we started the project.

So there are good reasons to think the technology might help solve some of these problems, and here are some reasons why we thought we might see an effect on service delivery if we implemented this platform connecting villagers to the local government with text messages. Text messages are very inexpensive, and we actually set up the platform so that messages were free, which reduces the cost of reporting. Probably the most important aspect is anonymity: citizens in low-income countries are afraid to report on teachers and nurses because of power disparities.
So we made the platform anonymous. You can send a text message, but on the government's side it just opens as a case: with a case ID, never a phone number. It's also immediate; I don't need to wait until I go to the district, I can send a report the moment somebody abuses me or the clinic is closed. It also allows citizens to see that the government is responsive, if it responds, and that can have rippling effects on their willingness to keep engaging with the government. And finally, these platforms take advantage of the comparative advantage of villagers in terms of the information they hold. We're not expecting them to hold service providers directly to account; they are just providers of information, which is crowdsourced from them. The public officials are the ones who need to take the disciplinary actions. So we expected that if we implemented such a platform, we might see an effect on things like monitoring of health providers by district officials, maybe more effort from service providers, and maybe also more inputs. This is what we set out to find.

In terms of our research design, we were working in one district in northern Uganda called Arua; districts in Uganda are responsible for the provision of services like health, education, and water. The implementation was a collaboration with a local NGO program called GAPP. We launched in 2014, and the project ended in 2016. As I said, we introduced the platform in 100 villages, with 100 villages in control. Because the platform was open, the treatment is really just going to villages, introducing the platform, and encouraging people to use it. Our unit is the village cluster: each health clinic in Arua serves four or five villages, so all the villages around a health center received the same treatment.
Yellow is treatment and blue is control. The treatment was quite extensive; it wasn't just the design of the platform. There was also training of district officials on the dashboard. On the district officials' side there is a dashboard: they can log in, see messages coming in, respond, and run queries such as how many messages came in this month, today, or in the last year, and how many about water versus education. So it's a pretty sophisticated dashboard, and we had to train them. When I say we, this is the NGO we were working with; they did the training. There were inception meetings in the villages to introduce the platform, and quarterly meetings in the clusters where government officials came to report: here's what came in, this is what we did, and this is why we didn't do certain things, maybe because we don't have the budget, or because it's the responsibility of the national government. There was also a registration campaign in 91 villages, where we registered phone numbers and sent periodic text messages encouraging people to use the platform. This is what an inception meeting looked like.

The first result I want to show is about take-up. The project was in about 100 villages, but these are really tiny villages in a very rural area of Northern Uganda, in West Nile. The villages have around 200 people, only about a quarter of whom have mobile phones, and almost no one has a smartphone; these are all basic feature phones. At the top you have the cumulative number of messages over the first year. At the bottom, the figure shows the monthly mean, where the dark black bars are relevant messages coming in and the gray bars are actionable messages.
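The dashboard queries mentioned above, counts of messages per month and per sector, amount to simple aggregation over a message log. A minimal sketch, with hypothetical data shapes rather than the real system's schema:

```python
from collections import Counter
from datetime import date

# Hypothetical message log: (date received, sector) pairs.
messages = [
    (date(2014, 9, 3), "water"),
    (date(2014, 9, 17), "education"),
    (date(2014, 10, 2), "education"),
    (date(2014, 10, 5), "health"),
]

def counts_by_month(log):
    """How many messages came in each month."""
    return Counter(d.strftime("%Y-%m") for d, _ in log)

def counts_by_sector(log):
    """How many messages about water, education, health, and so on."""
    return Counter(sector for _, sector in log)

print(dict(counts_by_month(messages)))   # {'2014-09': 2, '2014-10': 2}
print(dict(counts_by_sector(messages)))  # {'water': 1, 'education': 2, 'health': 1}
```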
At the peak, we were getting something like 150 to 200 messages a month from a very small population. What is relevant and what is actionable? This is really important for our analysis of what happened. There was a pretty significant number of messages that were merely relevant, like "I greet you all, but a major problem is sickness." That's clearly relevant to service delivery, but there's not much the district can do with such a message. Actionable messages are like the one at the bottom, where it's very clear exactly where the village is and exactly what the problem is. And to preview what we found: there were just not enough actionable messages.

We looked at health, schools, and water services, and at three outcomes: monitoring, effort, and inputs. Monitoring is by the district: we looked at the calls district officials made, the visits they made, and the reports they wrote about these facilities. For effort, we looked at absenteeism and engagement; we sent people to schools to see whether something was written on the board and whether the teacher was actually in the class or sitting outside taking tea. And a big part of the data we collected was on inputs, like drug stock-outs and supplies. We used a lot of data sources over the two years of the project: unannounced audits in schools and clinics, and tons of administrative data from the district capital. To help interpret the results, we supplemented that with a survey conducted in 16 villages with over 4,000 respondents, and we also conducted focus groups in eight villages.
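The relevant/actionable distinction is essentially a coding rule: a message counts as actionable only if it pins down both a concrete problem and a location. A toy version of such a rule, my own keyword heuristic rather than the coding scheme the study actually used, with made-up village names:

```python
def classify(message: str, known_villages: list) -> str:
    """Toy coding rule: 'actionable' needs a named place AND a concrete problem."""
    text = message.lower()
    problems = ["absent", "closed", "broken", "no drugs", "stock out", "abuse"]
    has_problem = any(p in text for p in problems)
    has_place = any(v.lower() in text for v in known_villages)
    if has_problem and has_place:
        return "actionable"
    if has_problem or "sick" in text or "problem" in text:
        return "relevant"
    return "other"

villages = ["Ombokoro", "Ajia"]  # hypothetical village names
print(classify("I greet you all, but major problem is sickness", villages))  # relevant
print(classify("The borehole in Ajia is broken", villages))                  # actionable
```

A rule like this makes the talk's point concrete: the greeting about sickness is relevant but gives the district nothing to act on, while the borehole message names both the place and the fault.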
In terms of the findings, a little disappointingly, we found no evidence at all of any change in health services: not in monitoring, not in inputs, not in effort. We do find effects in the education sector on all three parameters, actually quite significant and substantively large, but only in the first year of the program; the results did not last beyond the first year. So we're actually happy that we ran this as a two-year project, because if we had stopped after the first year we might have reported results that were inaccurate. We also found suggestive evidence of positive effects in water services. Over the two years we visited the district many, many times, and there was a lot of anecdotal evidence, people saying, "oh, we sent a message about that and then they fixed it," and the district was happy to share these kinds of stories. Our take is: that's great, but what's the counterfactual? These things might have been addressed anyway through alternative means. So even though there is some suggestive evidence of an effect in water, and there were clearly effects in education in the first year, we don't think this platform produced a sea change, shifting from a low equilibrium to a high equilibrium where teachers fear being reported and so all start coming to work.

So why didn't we find effects? We have a paper; it's online, and I'm happy to share it with people who are interested. We go deep into this; I won't be able to go that deep in this presentation, but we try to understand why we see effects in education and not in health, and we go through a list of things that could account for it. It might simply be that people care more about education than health; we show through survey data and message data that this could not be the case.
It might be harder for citizens to identify problems in health. There's a whole list of possibilities; we put out the different explanations and try to find evidence for some and not others. But we also remind people in the paper that our N at the district level is one: we have only one district education office and one district health office, and either might be very idiosyncratic. These are some of the limitations of the research design.

Why didn't the effect in education last beyond the first year? It could be that the first year was a novelty effect and people were excited. It could be that the NGO's mobilization efforts were significant in the first year; they did a lot more work in the first year than in the second, when they were doing other things in different districts. But there are also things related to citizen engagement that are really important. On one hand, there was something like an 84% response rate from the district, which is really high, and that was not our prior when we started, given what we know about the responsiveness of governments in Sub-Saharan Africa. But when you go deeper, and this came up from focus groups, from surveys, and from our analysis of all the local government's responses, here's what we found: because the messages from citizens were relatively vague, the district's responses were not that helpful, so citizens were disappointed. Many of the things they cared about the district couldn't address within the budgetary cycle, and a lot of the time the district government's responses were technical and didn't reflect the reality on the ground.
The classic example: someone says the teacher in my kid's class never shows up, and the local government responds, "oh, this is really important, you should talk to the principal." Well, people know they can talk to the principal; the whole idea of reporting anonymously is that they don't want to confront the teacher. So after receiving such a response they basically disengaged. This can account for some of the problem. At the end of the day, we found that this had an adverse effect on citizen satisfaction.

We also did a survey to explore some of our assumptions about when such a platform should have an effect. First, citizens need to believe that the district can fix a problem; if they don't believe that, they will never engage. We find that this holds here: our survey suggests people clearly think that the district, if it wanted to, has the capacity to make a change. Citizens also need to believe that the government will be responsive: why would I send a message if I think no one will respond to it? We find, and this was also a positive finding for us, that citizens did believe they could prompt the government to take action. Obviously, to use the platform people need to know about it, and we found that only about 30% of people did. I don't know whether that is low or high; many of you can tell me. But as I said, to have an effect, citizens must send actionable messages, and that is not something we found.

I'm going to finish with a few lessons learned on the citizen side, the government side, and the technology side. On the citizen side, based on take-up, the focus groups, and the surveys, I can say it is absolutely clear that there is demand for these types of platforms. It's not as if we tried to solve a problem that doesn't exist.
I don't think that is the case; there is absolutely a need in poor areas for these types of platforms. One thing that came up, which we knew before we started but didn't appreciate the extent of until we did our focus groups, is how much anonymity mattered to these citizens. Here's something we saw very commonly when we started analyzing the data: somebody says a teacher never shows up, the government official responds, "okay, can you give us information about the village and the teacher?", and then the citizen never responds again. When we held the focus groups, it turned out that when people heard back, they thought their anonymity had been compromised, even though that was not the case. Many thought that if the district government could respond through the platform, it knew who they were, and so they disengaged. For us, the lesson was that we need to strengthen anonymity.

We also learned that there is a bit of a tension between all-purpose platforms and specific platforms. We didn't limit people: you could report on education, health, water, whatever. Some studies suggest that platforms focused only on water, or only on health, can elicit more actionable messages, so I want to throw that out there.

As for the district officials, they were really happy: "oh, this is great, we totally love it." Yet it was also clear that without major involvement by the NGO, which had an intern sitting in the district government basically nudging officials to respond to incoming messages, this would not have worked. I want us to be mindful of that. It's also clear that there was a mismatch between what citizens expected and what the government delivered. Citizens expected the government to take action; they weren't expecting to be told, "oh, the right person to go to is the principal or the PTA."
Some additional lessons, and I'm going to end here. The platform was more techy than what the local government actually used. Messages could be tagged as pending or complete, and there was a way of sending messages back to citizens who complained, updating them on the status: we're investigating, it's pending, we checked it and this is what we found. The officials didn't engage with that; the platform wasn't used this way. We think that for these types of platforms to work, you absolutely have to update citizens at every stage on what's been done with their complaints.

This is perhaps a bit specific to Uganda, but it's important to include the lower levels of government too. One of the things we didn't do well, and I regret, is that when we spoke to facilities, they had no idea what messages had come in about them; they got no feedback. Principals didn't know when parents were complaining about teachers, and the people in charge of facilities had no idea when complaints were made, unless the government contacted them, and that is not a good setup.

In conclusion: we don't think these types of platforms should all be shut down. There are some promising things in trying to utilize them; there is clearly demand, and we believe there may be a way of setting them up better than we did. One important point is that most people here had old flip phones, old Nokias. In a world of smartphones, which is where even these areas will be in about five years, I think it's going to make a big difference: you can geotag yourself, you can take pictures; you don't need to describe where you are, you just geotag yourself. That's going to be a sea change.
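The status-tagging flow described above, where a case moves from pending to complete and the citizen hears back at each step, is basically a small state machine over a case. A sketch under assumed names; nothing here reflects the real platform's API:

```python
# Minimal sketch of the case-status flow the talk argues for:
# every transition queues an update back to the citizen.
# All names are illustrative assumptions, not the real system.

VALID_TRANSITIONS = {
    "new": {"pending"},
    "pending": {"investigating", "complete"},
    "investigating": {"complete"},
    "complete": set(),
}

def advance(case: dict, new_status: str, note: str, outbox: list) -> None:
    """Move a case to a new status and queue an SMS update to the reporter."""
    current = case["status"]
    if new_status not in VALID_TRANSITIONS[current]:
        raise ValueError(f"cannot go from {current} to {new_status}")
    case["status"] = new_status
    # The crucial part: the citizen hears back at every stage.
    outbox.append(f"Your report {case['id']} is now {new_status}: {note}")

outbox = []
case = {"id": "CASE-00042", "status": "new"}
advance(case, "pending", "received by the district", outbox)
advance(case, "investigating", "officer visiting the school", outbox)
advance(case, "complete", "teacher attendance now being monitored", outbox)
print(len(outbox))  # 3 updates queued, one per stage
```

The point of making the updates part of the transition itself, rather than a separate step officials can skip, is exactly the lesson above: citizens should never advance through a stage silently.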
The last thing I would say is that here I'm only reporting results in terms of service delivery. We think it's also important to understand take-up, because there was a lot of variation both within and between villages: some villages used the platform and some did not; some had really high uptake and some had none at all. We have another paper that uses network analysis to explore which villages, and which villagers, are more likely to use the platform. If you're interested, you can send me an email and I'm happy to share it with you. Thank you very much.