This is no ordinary thing. It is a new technology being used in some cities in the United States to help solve crimes, and it is called Eye in the Sky. What does it do? A plane flies over a city and records the whole city, taking photos every few seconds. When a crime occurs, the police can go back through the footage and see what happened there. I refer to this as a big data technology.

I'm Rabia Kudapanakal, a PhD candidate at Tilburg University, and I will talk to you about how people perceive new big data technologies.

New technologies that use big data can be beneficial or harmful. One example is social media: it can help you connect with other people, but it also brings harms such as privacy violations. Another, more relevant example is the CoronaMelder app. It is a warning app that informs you if you have been in contact with someone who is infected, but people are afraid that it might track their location and thus violate their privacy. A third example, which may be more familiar to students, is Proctorio, an app that universities use to invigilate online examinations. It records video and audio, can record the screen, can see what the student is doing in other browser tabs, and even performs a scan of the student's room. This clearly helps the university catch cheating, but it also violates the student's privacy.

In our research, we tried to answer the question: what makes these technologies more or less acceptable to people? The first question we asked was what drives people to adopt these technologies, and how morally acceptable people find them. For this question, we looked at three different factors. The first factor was a favorable outcome: does the technology provide a clear benefit? The second factor is data sharing.
Do organizations that use this technology share your data with other organizations? And the third is data protection: is your data protected from data leaks and hacking?

But what happens when these factors oppose each other? As an example, think of a technology that provides you a clear benefit, and your data is not shared with any other organization, but at the same time your data is not secure from leaks or hacks. Would you be willing to accept this technology? Or another example: the technology has a clear favorable outcome, but your data is being shared with other organizations while being protected from hackers. Would you then be willing to accept it?

To study this, we used a method inspired by marketing research. To give you an example of what the method looks like: imagine you are buying a phone and have to choose between two models. One has a lower price, the other a higher one; one has less storage capacity, the other more. Would you choose the cheaper phone, or the one with the larger storage capacity?

Similarly, we varied the different factors. We showed people a version of a technology in which the outcome was either favorable or unfavorable, data was either shared or not shared, and data was either protected or not protected. We then asked people which version they would choose, and also how morally acceptable they found the technology.

To give an example with Eye in the Sky: we showed people a description of the technology I described, and then gave them two versions, with the more negative aspects of each version in red boxes and the more positive aspects in green. So in this case, what did people choose when there was an opposition, a conflict, between these factors?
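The design described here crosses three binary factors, so every participant-facing version of a technology is one cell of a 2 × 2 × 2 factorial. A minimal sketch of generating those versions (the factor labels are my own shorthand, not the exact wording shown to participants):

```python
from itertools import product

# Three binary factors from the study; level labels are illustrative shorthand.
FACTORS = {
    "outcome": ["favorable", "unfavorable"],
    "data sharing": ["not shared", "shared"],
    "data protection": ["protected", "not protected"],
}

def all_versions():
    """Return every combination of factor levels: 2 x 2 x 2 = 8 versions."""
    names = list(FACTORS)
    return [dict(zip(names, levels)) for levels in product(*FACTORS.values())]

versions = all_versions()
print(len(versions))  # 8 distinct technology descriptions
```

In a choice task, a participant would then see a pair of these versions (e.g. one with a favorable outcome but unprotected data versus one with shared but protected data) and pick the one they would rather accept.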
And this is what we found. When the outcome was favorable, there was a 32% increase in willingness to adopt the technology. When data was shared, willingness to adopt actually decreased by 10%. And when data was protected, there was again an increase, of 31%, in adoption of the technology. So we concluded that a favorable outcome and data protection are the driving factors for adoption.

We also found some differences across domains. One exception was healthcare: in the healthcare domain, a favorable outcome led to a 60% increase in adoption of the technology, far higher than in domains like employment or solving crimes.

And when do people find these technologies morally acceptable? Data protection was actually the strongest driver of moral acceptability: if the data was protected, people found these technologies morally acceptable, but if it was not, they did not. We also know from other research that moral values like fairness, harm, and liberty drive these results. If people think a technology is fair, they are more willing to adopt it; if they think it is harmful, or that it violates their liberty or freedom, they are less willing to adopt it.

Using our research, we wondered whether we could say something about why students are so strongly against Proctorio. Let's look at the three factors again. With Proctorio, there does not seem to be a clear favorable outcome for students, and with data sharing and data protection, it is often unclear what is happening with their data. This is probably one of the reasons why students are so strongly against Proctorio.

And what about CoronaMelder? This app clearly has a favorable outcome, for yourself and for society, but it is not clear to people what is happening with the data.
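Effects like "a 32% increase in willingness to adopt" can be read as the difference in adoption rates between the two levels of a factor. A toy calculation with made-up responses (not the study's data) shows the basic arithmetic:

```python
# Hypothetical responses: (saw_favorable_outcome, chose_to_adopt).
responses = [
    (True, True), (True, True), (True, False), (True, True),
    (False, False), (False, True), (False, False), (False, False),
]

def adoption_rate(rows, favorable):
    """Share of participants who adopted, within one level of the factor."""
    chosen = [adopted for fav, adopted in rows if fav == favorable]
    return sum(chosen) / len(chosen)

# Percentage-point change attributable to a favorable outcome.
effect = adoption_rate(responses, True) - adoption_rate(responses, False)
print(f"{effect:+.0%}")  # prints "+50%" for this made-up sample
```

The same comparison, applied to the data-sharing and data-protection factors, would yield the -10% and +31% figures reported above.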
And actually, if you look at the app's privacy policy, you will see that it does not collect any data at all; the technology works over Bluetooth. But this is not clear to everyone. If it were, people would know that data sharing and data protection do not even come into play.

So my take-home message for you would be this: organizations need to protect our data, provide clear benefits, and also communicate both clearly. This would help us reap the benefits of these technologies while at the same time protecting people's privacy. Thank you for your time.