Kia ora tātou. I'm Jennifer. This is Sam. We're going to be talking about some of the ethical issues which arose in a recent project we undertook. Sam is going to be describing the project to you, telling you a bit about the app we built to collect data, and how we navigated the university's human ethics process to get approval to do this project. And then I'm going to be talking about some of the things that the Human Ethics Committee didn't ask us about but probably should have. So I'll hand over to Sam.

Kia ora tātou. So Red Zone Stories centres on the residential red zone, roughly 630 hectares of earthquake-damaged land in Ōtautahi Christchurch and North Canterbury. In the years following the 2010 and 2011 Canterbury earthquakes, families and communities, including schools, community centres and local businesses, were forced to move from affected areas. In 2012, following the Canterbury Earthquake Recovery Authority's decision not to remediate the land, the government offered to purchase properties from owners in the residential red zone in order to remove buildings deemed to be at significant risk in any future seismic events. The majority of property owners accepted government offers, while a small minority chose to stay. Land clearances took place between 2012 and 2016, creating large gaps in the city's urban landscape and fracturing well-established communities in Ōtautahi Christchurch and the neighbouring Kaiapoi, Kairaki and Pines Beach in North Canterbury. Red Zone Stories was developed to support research that examines how people relate to places from which they have been displaced, and to environments that have been radically altered.
The principal researcher on the project, Associate Professor Donald Matheson, felt that much of the discourse on the residential red zone following the land clearances overemphasised the emptiness of this place, treating it like a blank slate. He argued that, in fact, this was a place full of people's stories and memories, things that needed to be shared and remembered. The Arts Digital Lab's response was to develop an app that would be a cross between a community archive, like Kete or Christchurch City Libraries' Discovery Wall at Tūranga, and a participatory urban planning tool, like Maptionnaire, for example. The idea was that an app would promote engagement with the research aims and empower people to freely share their thoughts, unencumbered by the presence of the researcher. Red Zone Stories is not only a research tool, but a place for communities to share what is meaningful to them in areas that, on the face of it, appear empty. The app utilises a map interface, layers of historical satellite imagery and archival media to help locate people in the landscape and show how red zone suburbs looked before the earthquakes. Participants can use the app to tell their stories as they move through the now largely vacant residential red zone, capturing their thoughts, feelings and memories of its places in the form of photos, videos and markers on the map. As participants tell their stories via the app, their location is recorded and plotted on the map, allowing researchers to see where people tend to go within the red zone and how they move through these spaces in the act of storytelling. We were very conscious of the need to develop something that would be sensitive to the needs of the particular communities in question.
One thing we heard during the early stages of the research and the scoping of the app was that people who had lived in the red zone felt they had been consulted to death: many of their thoughts on what should happen with the red zone hadn't been taken on board, nor had their feelings towards particular places within the red zone been properly acknowledged. We felt there was something of an ethical duty to provide a platform that would allow people to tell their story how they would like it to be told, rather than how government agencies or the media might tell it. As such, we favoured an open-ended approach, one which did away with lengthy questionnaires and endless prompts and instead took cues from the world of thick mapping. We invited participants to simply respond to the place in which they were standing and add their experiences as a layer on an ever-evolving map. Beyond the implementation of ethical standards such as participant safety, security and consent, which I will turn to now and which Jennifer will expand on, we felt that promoting participant agency was a key part of the ethical dimension of the project. The human ethics approval process at UC is overseen by a dedicated Human Ethics Committee, or HEC. This committee comprises institutional as well as community appointees, whose role it is to ensure that research participants are treated fairly and ethically during the course of the research. The HEC looks at factors such as: who are the subjects of the research and how are they being recruited; what information is provided to potential participants and are they able to give informed consent; how is information on participants being gathered, what is it being used for and is it being securely stored; and what mechanisms are in place to ensure that participants can extricate themselves from the research if they no longer wish to take part.
The Human Ethics Committee employs rigorous standards and criteria when assessing research proposals, which can often mean that projects take months to review. They assess hundreds of research projects a year and have encountered a wide range of research methods. However, as far as the current committee members were aware, Red Zone Stories was the first app-based research tool the committee had come across. As such, the HEC's assessment criteria were not always applicable within the context of the app, and in some cases actually fell short of the requirements of an app-based research environment. For example, the mechanisms through which traditional research projects inform potential participants and facilitate informed consent do not necessarily translate directly to a mobile app. In the case of the former, researchers are generally able to communicate with their participants directly, helping to explain anything the participant may be unsure of. This can occur from the recruitment stage right through to the point at which data is collected, collated and analysed. With a mobile app, direct communication with potential participants becomes problematic. The participant in question is essentially left to their own devices, pun absolutely intended, and in all likelihood might never meet or directly talk with the researchers. Therefore, it was key that we effectively communicated with the participant through the app's interface. Participants need to be made aware of, and understand, the implications of taking part in the research, including how and why we were gathering their data. Where paper information sheets and consent forms are employed in traditional research, we had to develop a terms-and-conditions-style consent process that blended paper consent forms with something more closely resembling an end-user licence agreement.
However, the challenge with end-user licence agreements, as you are all probably aware, is that people have a tendency to skim through them or ignore them entirely. With this in mind, we implemented some basic UI and user-validation solutions to encourage the participant to read the terms and conditions and be in a position to give informed consent before contributing to the project. When signing up, the participant is required to enter an email address, a username and a password, select a default level of availability for their content, public or research only, and agree to the project's terms and conditions. They cannot complete the sign-up process until they have tapped on and viewed the terms and conditions, as well as the privacy policy required by Android's developer policy. While not a foolproof method of ensuring participants read what they are signing up to, it does serve to alert them to the fact that there is important information to read before proceeding. While the HEC was satisfied with our technical solution to this problem, they were not so happy with the language we had initially used in our terms and conditions. Our tendency to use overly academic or technical language was picked up by the HEC, and they pushed us to consider who we were addressing and the key information we needed to convey to them. Here is an excerpt of some of the original wording we used versus the HEC's suggestions, and as you can see they had a valid point: we managed to cut quite a bit out of it. In short, the HEC were effective in their capacity to assist with the more standard elements of the research, such as clearly stating the research objectives and the requirements of the participant. However, they did fall short in advising us on the more nitty-gritty aspects of app design, which Jennifer will now discuss, so I'll pass over to her.

As Sam said, UC's ethics process is rigorous and lengthy, but it's designed for traditional lab-based research.
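The sign-up gate Sam described boils down to a single validation rule: account creation stays blocked until every field is filled in and the participant has at least opened both the terms and the privacy policy. Here is a minimal sketch of that rule; the real app is an Android app, and every name below is hypothetical, not the project's actual code.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch of the sign-up validation described above;
# field names are illustrative, not the app's actual implementation.
@dataclass
class SignUpForm:
    email: str = ""
    username: str = ""
    password: str = ""
    availability: Optional[str] = None  # "public" or "research_only"
    viewed_terms: bool = False          # tapped and opened the T&Cs
    viewed_privacy_policy: bool = False
    agreed_to_terms: bool = False

def can_complete_sign_up(form: SignUpForm) -> bool:
    """Sign-up completes only once every field is filled AND the
    participant has at least opened the terms and privacy policy."""
    return (
        "@" in form.email
        and bool(form.username)
        and bool(form.password)
        and form.availability in ("public", "research_only")
        and form.viewed_terms
        and form.viewed_privacy_policy
        and form.agreed_to_terms
    )
```

The point of the design is that agreeing without viewing is not enough: `agreed_to_terms` alone never unlocks sign-up, which is what nudges the participant towards actually opening the documents.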
So that means there were a few questions that the HEC probably should have asked us, and they didn't. As we developed the app, some of those questions started to occur to us, so we made discussion of ethics a regular part of our team meetings. Any time we developed a new feature, we'd think about the worst-case scenarios, asking two key questions: what harm could possibly come to our participants from using this feature, and what's the worst thing that someone might use this data for? For example, the GPS tracking we're doing, where we trace how people move through the red zone, has implications for participants' privacy and their personal safety. Now, this particular participant might not mind that the whole world knows that at some point he walked along the riverbank. But what about this participant? You can see from the accumulated traces that he went back to the same place over and over again. So, applying our worst-case scenario test: what if he had a stalker who used that information to lie in wait for him? To mitigate that risk a bit, we've made sure we don't display any time-based information on the map. We only show who the participant was and where they went, not when they were there. That makes it a little bit harder to predict when they'll next be there. Now, if you're wondering about the strange shape of the trace that he's left on the map, we went through his photos and discovered that he's actually playing frisbee golf, and the target is right in the middle of that weird star shape. Giving our participants agency over their content is a bit of a double-edged sword. It puts the power into the hands of the participant, which is great. But it also means we have very little control over any content that might harm or offend other people. For example, we can't stop our participants from breaching someone else's privacy by taking a photo of them, as Richard has done here. He's taken a photo of another person.
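The mitigation just described, keeping timestamps in the research dataset but never showing them, amounts to a projection step applied before any trace reaches the public map. A minimal sketch, with hypothetical type and function names:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TracePoint:
    """Full trace point as stored for the researchers."""
    lat: float
    lon: float
    timestamp_ms: int  # recorded for research, but never displayed

@dataclass(frozen=True)
class MapPoint:
    """What the public map is allowed to show: where, never when."""
    lat: float
    lon: float

def to_display_trace(trace: list[TracePoint]) -> list[MapPoint]:
    # Strip all time-based information before the map layer sees it,
    # so accumulated traces can't be used to predict visit times.
    return [MapPoint(p.lat, p.lon) for p in trace]
```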
New Zealand law says that you can take photos of people in public places without their consent. But what's legal isn't necessarily the same as what's ethical. It's really easy to imagine a scenario where someone could be harmed by one of our participants uploading their photo. We could also easily imagine situations where a bored user might use their tracing on the map to draw a picture of something offensive. We wanted to test whether that's actually technically possible, so we tried it out for ourselves. We didn't draw rude pictures, but our colleague Jenny actually managed to write her name on the map just by walking, so there's a bit of risk there. We can't control how our participants use the app, and we really don't want to. But what we did do was make sure that those terms and conditions contained really clear statements about respecting other people's privacy and not uploading offensive material. And we established a policy, which we also laid out in the terms and conditions, saying that if anyone breached those terms we could take down their content immediately. If someone contacts us and says, hey, my photo's in your database and I don't want it there, we'll take it down. And the same if they let us know there's something offensive. We're not removing the risk, but we are reducing it as much as possible. Now, there are other ethical considerations that come out of mobile technology itself. The biggest one for us was data security. As part of our ethics application, the HEC asked us how we were going to store participants' data. They were thinking in terms of things like a locked filing cabinet, or maybe keeping the files behind the university's firewall. But we were more worried about how the data gets from the participant's phone to the university's servers in a secure way. Now, that was something we didn't have the skills in-house to deal with ourselves, at least not to a level where we were confident we would keep the data secure.
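The takedown policy laid out in the terms and conditions is essentially remove-first, review-later: any privacy or offensive-content complaint hides the item immediately rather than waiting on an investigation. A hypothetical sketch of that rule (names are illustrative, not the project's code):

```python
from dataclasses import dataclass

@dataclass
class ContentItem:
    item_id: str
    visible: bool = True

def handle_complaint(item: ContentItem) -> None:
    # Remove-first policy: a privacy or offensiveness complaint takes
    # the content down immediately; any review happens afterwards.
    item.visible = False

def visible_content(items: list[ContentItem]) -> list[ContentItem]:
    """Only still-visible items ever reach the public map."""
    return [i for i in items if i.visible]
```

The design choice here is that the cost of briefly hiding legitimate content is far lower than the cost of leaving harmful content up while it is reviewed.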
So we outsourced it. We asked Catalyst IT to design us a method of data transmission that would be as secure as possible. Sometimes, to be ethical, you just need to spend money. Another obvious issue with participants using their own phones is data usage. We didn't want our participants suddenly running out of data in the middle of the big empty red zone just because they'd been using our app. When you're using the app just to walk around and passively explore the area, it uses about the same amount of data as Google Maps would. So we reasoned that most people are familiar with those kinds of mapping apps and would be able to make an informed decision as to whether or not they wanted to use ours. But when it comes to uploading a story, especially if that story involves a lot of video, that could blow out someone's data allowance immediately, and they wouldn't necessarily be aware of that. So we've actively discouraged people from uploading anything while they're actually in the red zone. The instructions that tell you how to use the app say that uploading will use a lot of data, so make sure you've got a Wi-Fi connection before you upload anything; and then when you come to actually do an upload, a pop-up appears which again warns you that it's going to use a lot of data and asks you to please make sure you've got a Wi-Fi connection. Now, all of this might give you the impression that we thought of everything ethical that could be involved with this project. That is not true. As Sam mentioned, there were things the HEC raised with us that we hadn't thought of, and even then, even after all of our discussions about ethics, when it came time to upload our app to the Google Play Store and we got the report card Google gives you on how good your app is, we realised we'd totally failed on accessibility. Because we hadn't thought about things like the fact that to use our app you need to be able to read a whole lot of text that's in quite a small font.
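The two layers of warning described, the instructions plus the pop-up at upload time, reduce to a connectivity check that gates the upload. On Android the real check would go through the platform's connectivity APIs; here the decision logic is sketched abstractly, with hypothetical names:

```python
from enum import Enum

class Connection(Enum):
    WIFI = "wifi"
    CELLULAR = "cellular"
    NONE = "none"

WARNING = ("Uploading will use a lot of data. "
           "Please make sure you have a Wi-Fi connection first.")

def decide_upload(connection: Connection) -> "tuple[bool, str | None]":
    """Return (may_proceed_silently, warning_message).

    Uploads proceed silently on Wi-Fi; on cellular the participant is
    warned that a story, especially one with video, can use a lot of
    data; with no connection the upload is simply unavailable."""
    if connection is Connection.WIFI:
        return True, None
    if connection is Connection.CELLULAR:
        return False, WARNING
    return False, "No connection available."
```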
When you're recording a story in the app, the big green record button turns red, which we thought was a great visual signifier of the fact that it's recording. Not so great if you've got red-green colour blindness. And unfortunately, by the time we got this report card, we were too far through the design process to make major changes. So our app still scores pretty low for accessibility. And there's another accessibility problem that might be occurring to some of you: to use an app, you need a smartphone. One of our key aims with the project was to tell the stories of people who are normally disenfranchised by traditional urban planning consultation methods: people like the elderly and the disadvantaged, people living in poverty, all people who are really unlikely to have smartphones. Our solution there wasn't technological, it was human. We went out to places like rest homes and community centres. We sat one on one with potential participants and guided them through the process of telling their story, using one of our phones to give them access to the project. But we still think that accessibility is something we could have done a lot better. So we didn't solve all of the ethical problems, and I'm really sure there are other things we didn't even think of. But what we did learn from this process is that if you're going to use a new technology for a research methodology that's out of the ordinary, you can't rely on your institution's human ethics procedures alone. You can't just tick the boxes on the form and think, oh well, we're ethical now, it's fine. You really need to go back to first principles. You need to ask yourself: how could this harm my participants, and how can I mitigate that harm? You also need to budget to be ethical. Right from the very start, before you do your funding application, you need to be asking your questions about ethics, so that you can build in any extra costs that might be involved.
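Coming back to the record button problem for a moment: the standard accessibility fix is to never signal state by colour alone, but to pair the colour change with a shape or label change as well. A hypothetical sketch of what a more accessible state model might look like; this is not the app's implementation, just an illustration of the principle:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ButtonAppearance:
    colour: str
    icon: str
    label: str

def record_button_appearance(is_recording: bool) -> ButtonAppearance:
    # Signal the state redundantly (colour + icon + label) so that a
    # red/green colour change alone never carries the meaning.
    if is_recording:
        return ButtonAppearance("red", "stop-square", "Recording... tap to stop")
    return ButtonAppearance("green", "record-circle", "Tap to record")
```

With this approach, a participant with red-green colour blindness still sees the icon change shape and the label change text when recording starts.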
Now, I hope this has given you a bit of an idea of how you can make your digital projects a little bit more ethical. I'd like to acknowledge Donald Matheson, who's the principal investigator on the Understanding Place project, the overarching project for Red Zone Stories; the National Science Challenge Building Better Homes, Towns and Cities, who funded the project; and the Ngāi Tahu Research Centre, who shared with us the stories of tangata whenua in the red zone. We're happy to take some questions now.

Thank you so much to you both. We've probably got time for one or two quick questions before everyone has to move on. And you've got one of those squishy balls. Here we go. I've got one down here. Do you want to turn around and catch it? How humiliating.

This is sort of tangential, but do you think the term red zone is problematic? It's red, it's dead, it's empty.

The official name is the Ōtākaro Avon River Corridor, but everyone in Christchurch knows it as the red zone, so we used that title really just to connect with people, so they knew what we were talking about.

Kia ora. Are you able to give us any more information about Donald Matheson's project? How is this research being used? What is Understanding Place about, I guess, is what I'm asking.

So the idea behind Understanding Place is to find some sort of tool that would gather rich data in the form of text as well as geolocated data, even to the point of having the videos that people submitted to the app, and then to build some sort of analysis around that, transcribing the material manually. So it's looking at those real micro connections between place and storytelling: how people refer to places that are no longer there, how people relate to spaces that they've had to move out of.
It's about using a tool that relies on the person's experience as they're moving through the space, rather than just simple questionnaires and things, and looking at these macro-level relationships to place. Donald is still in the phase of getting transcription done and analysing the data, so we'll see what comes out of that, but hopefully that answers the question. And to be totally honest, we had a smaller take-up on this project than we expected. We had very few participants, and we don't know whether that's because of that thing Sam mentioned, about people being consulted out in Christchurch, or whether there's some fundamental problem with the project that made people not want to take part. So that's kind of the next phase of research: finding out why we didn't get a good take-up.

I'm conscious there are probably more questions to be asked, but I'm going to move us on to the next sessions. Thank you, all speakers.