Helena: Hello and a very warm welcome to this Davos Agenda Impact Session on Building a More Connected and Trustworthy Future. My name is Helena. I head a network of organisations in 100 countries fighting for consumer rights, so that all of us as ordinary consumers have access to a fair, safe and sustainable marketplace where we have agency. Today I have the great pleasure of moderating this session, during which we'll focus on how we build trust in the technologies that are reshaping our lives, and a little bit on IoT, the internet of things. Why should we care about that? Well, there are already more connected devices than people in the world, with significant growth predicted. During the pandemic we've seen IoT applications such as connected thermal cameras, contact-tracing devices and health-monitoring wearables providing critical data to help fight the disease, while temperature sensors and parcel tracking help get vaccines distributed safely. The potential, of course, goes beyond the pandemic to the climate crisis ahead of us: water, energy efficiency, air pollution, the sharing economy. Yet the use of IoT in fighting the pandemic has also shed light on concerns about security, privacy, interoperability and equity. And this brings us to explore where the WEF experts working on the future of the connected world have identified that we are lagging in terms of governance, and how we build this future together. I was always taught that trust was about increasing three things, your credibility, your reliability and your intimacy, and reducing the level of self-interest. Does that equation still hold true for the future of a trustworthy connected economy? Now, I'm thrilled to have experts with me on the topic today. Harpreet Rai is the CEO of Oura Ring, a company which was just described by Tatler as the new wellness status symbol. We have Andre Kudelski, the CEO of Kudelski Group, with expertise in digital television, cybersecurity and IoT. And Kirsty Graham, the CEO of Edelman Public Affairs, a firm which annually publishes a Trust Barometer looking at public opinion around trust; I believe she just joined from Pfizer, so she has excellent experience in the health space. Harpreet, perhaps we could start with you. If I'm a consumer and I have an Oura Ring, you have my heart rate, my body temperature, my respiratory rate, my activity levels. How can I trust you? How can I, as a consumer, trust in your product and your company?

Harpreet: Talk about a hard question. Look, first I just want to say thanks to everyone for putting this together at the forum and having us, and to Helena, Kirsty, Andre and everyone else for being here. Helena, you honestly laid it out really well with credibility and reliability. But for us, going through this pandemic, the question became: how could our data help? And a lot of that is doing education. Frankly, one of the things we found over 2020, working with partners like the NBA and others, was the value of putting things into principles and then practices. I can get into the details now if you want, but in general we laid out five areas. The first was a principle: being GDPR compliant. Our company started in Finland, and we believe that keeping data de-identified and aggregated is simply a great principle.
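That first principle, de-identified and aggregated data, is concrete enough to sketch. Below is a minimal Python illustration of the two steps; the field names, salt handling and cohort threshold are invented for this example and are not Oura's actual pipeline.

```python
import hashlib
from statistics import mean

def de_identify(record: dict, salt: str) -> dict:
    """Replace the direct identifier with a salted one-way hash."""
    pseudonym = hashlib.sha256((salt + record["user_id"]).encode()).hexdigest()
    return {"pseudonym": pseudonym, "temp_c": record["temp_c"]}

def cohort_average(records: list, min_cohort: int = 20):
    """Release an average only for groups large enough that no single
    member's value can be inferred; suppress small cohorts entirely."""
    if len(records) < min_cohort:
        return None
    return {"n": len(records),
            "mean_temp_c": round(mean(r["temp_c"] for r in records), 2)}
```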
Harpreet: But in terms of building trust beyond that, the second thing we focus on is independent validation. When the pandemic hit, lots of people thought, hey, we may be able to help this way or that way. But we looked to independent validation as really the second most important thing, and we became the first wearable to partner with an academic institution: we partnered with UCSF. Third, particularly for enterprise applications, we focused on designing the product around how to share data in what I would call a forward-thinking way; we had to innovate on the product to do that, with something called health risk management. Fourth, we put the principles into practice by forming an ethics committee for some of our partnerships. And that eventually led to the last thing: contracts. We made it internal practice that if we found out anyone's data was being used against them, hurting their employment status or anything like that, we had the right, written into our contracts, to pull out and close the contract. And we had to turn away revenue because of that. So those were some of the things we did. I know that covered a lot, but I'm happy to dive into more detail on any one of those areas.

Helena: Brilliant. Thank you, Harpreet. I'd love to ask a follow-up question. I think there was a study last December, I may be wrong, by UC San Diego, taking 65,000 people using your product and seeing whether you could identify fever outbreaks. So I'm really interested: that paints a picture of future potential for social good. What would help and support you in driving towards that broader collective good as well as individual good?

Harpreet: Yeah, look, actually the collective good was that the data came from our community. It was a joint study between UCSF and UCSD, and the data science institute at UCSD is the one computing all the numbers. You're right that they put out early findings already in December, showing that in about three quarters of participants we could see the onset of fever three days in advance. Not everyone, but that's significant. We found that by putting this right in our app, the first thing you see, and educating people about the study and the effort, we built a lot of trust. And frankly, we communicated to our audience that our collective data can help us learn here, so we can prevent the next pandemic. That education, getting everyone to understand that all of our data together could actually be really helpful, was the hard thing to do. And we need more help doing that. We've learned from the pandemic that we need to do more of this, and frankly, with all the things the forum does, and that your organization and the other organizations here work on, we need to work together. We can't do it by ourselves.
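The study design Harpreet describes rests on one recoverable idea: compare each night's reading with the wearer's own personal baseline. A minimal sketch of that idea follows; the window and threshold are invented, and the UCSF/UCSD model itself is not described in this conversation.

```python
from statistics import mean, stdev

def fever_signal(nightly_temps: list, window: int = 30,
                 z_threshold: float = 2.0) -> bool:
    """Flag a night whose skin temperature deviates sharply from the
    wearer's own rolling baseline (the previous `window` nights)."""
    if len(nightly_temps) <= window:
        return False  # not enough history to form a personal baseline
    baseline = nightly_temps[-window - 1:-1]
    latest = nightly_temps[-1]
    mu, sigma = mean(baseline), stdev(baseline)
    return sigma > 0 and (latest - mu) / sigma > z_threshold
```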
Helena: That's a lovely segue to Kirsty, so let me bring you in at this point. The Trust Barometer this year showed a real decline in trust, and tech companies, as I recall, suffered the largest decline but were still the most trusted of all the industry sectors. Is that correct? What are you drawing from the study this year, and why is it important?

Kirsty: Thank you very much, Helena, and thank you for the opportunity to be here. I think today's conversation is an incredibly important one. We have been studying trust in four institutions, government, media, NGOs and business, for the last 20 years, and what was really striking was the sobering nature of this year's findings. I think that's partly because of the pandemic and partly because we have a very significant problem around misinformation. What we saw were record lows of trust in all news sources. We saw doubt in spokespeople. We saw more than half of respondents globally saying that business leaders, government leaders and journalists alike were purposely trying to mislead them. Now, the context for that is that trust, as we've seen across the globe in the pandemic, whether you're in politics, in business or in an NGO, is the most important currency. It is how you bring people with you. It is your licence to operate if you're in business. It drives your commercial success, and it shapes your reputation. So trust has become an incredibly important commodity. When we look at the findings, though, there is an extraordinary moment for business. Business is now the most trusted institution, in fact the only trusted institution of the four, supplanting the role of government. People expect their CEOs to speak up: a full 86% said they want CEOs to speak out on societal issues. "My employer" is the most credible spokesperson to me, and business is seen as providing reliable information; it is judged both ethical and competent, the first time we've seen this for business. Now, if you double-click on this for tech, it's very interesting, because as Harpreet said, we've all relied on tech during the pandemic, for going to the doctor, for school, instead of the office, and some amazingly good things have come out of this. So tech is still the most trusted sector, at 70%, but it did record the biggest decline. On top of that, trust in IoT and in AI dropped in 25 of the 27 countries we surveyed. So there are some very interesting dynamics going on here. People are really looking for credibility, for credible sources of information, and for leadership. And I think the whole context of this conversation is what business, and tech in particular, can do to get ahead of that and show consumers that we understand what's needed.

Helena: Perfect, thank you. Do you want to follow up and start us off thinking about that next phase? What would you advise Harpreet as he thinks about developing that social good and delivering on it?

Kirsty: I think Harpreet said a lot of things that really resonated: this whole idea that you have to have standards and protocols, but also the importance of communicating. Be very clear about your purpose and don't overreach on permissions. People are far more likely to trust you if you tell them what is happening to their data, if you reassure them around accountability, and if you explain, and explain simply, what is happening. The other points I found very interesting were independent validation and doing something for the collective good. That also came through in our survey.
Kirsty: People are far more likely to share information if they believe it is a two-way process: what do I get from this exchange, and how does it benefit society? And we see that as a big issue in contact tracing, and also in the response to vaccines and in addressing vaccine hesitancy.

Helena: Thank you. I'm going to come to Andre next, because there are two things here. One, you have some great examples of using IoT and building trust around it. And you have some interesting views about how we can be very nuanced in our understanding of trust. I'd love you to share a couple of those with us, and then I'm going to ask you to cast forward: where are we going with this, and what might happen in the future?

Andre: So maybe let's first start with an example. We have just heard from Kirsty that there is a drop in confidence in IoT. But it's like insurance: until you have damage, you don't know what you really need. And IoT is one example of a technology that has become plan A because the original plan A has not really worked. Fundamentally, when you have limitations on travelling, meeting and so on, you need new technology to make things happen. One example of an IoT implementation is a new generation of rental car, an electric car that you can rent for just a few hours or a day. Fundamentally, we are able to transfer to a smartphone credentials that allow someone to use the car without any other contact. The car needs to be able to recognize that the person has the right entitlements, and not only that the person is allowed, but also that the person is behaving in a way that entitles them to enter the car, to avoid other people getting in at the same time who may not be authorized. That's one example. Now, more fundamentally, what we see through the pandemic is a need to be able to do things remotely, and to do that you have to rely on security and privacy in ways that have never been seen before. That puts the reliability and security of IoT under a much higher level of stress than before, and that is one of the elements that is fundamentally changing in today's environment.
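Andre's rental-car hand-off comes down to a verifiable, time-boxed entitlement the car can check on its own. The sketch below shows one common way to build such a token, an HMAC-signed claim set; the key handling is deliberately simplified (real deployments would typically use per-car asymmetric keys and revocation), and none of this is Kudelski Group's actual protocol.

```python
import hashlib
import hmac
import json
import time

# Hypothetical shared secret provisioned into the car; illustrative only.
SHARED_KEY = b"provisioned-into-this-car"

def issue_credential(car_id: str, renter_id: str, valid_hours: int) -> dict:
    """Service side: sign a time-boxed entitlement for the renter's phone."""
    claims = {"car": car_id, "renter": renter_id,
              "expires": int(time.time()) + valid_hours * 3600}
    payload = json.dumps(claims, sort_keys=True).encode()
    tag = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "tag": tag}

def car_accepts(credential: dict, car_id: str) -> bool:
    """Car side: verify offline that the entitlement is untampered,
    meant for this car, and still within its validity window."""
    payload = json.dumps(credential["claims"], sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, credential["tag"])
            and credential["claims"]["car"] == car_id
            and credential["claims"]["expires"] > time.time())
```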
Helena: Thank you. And if we cast forward another five years, how do you think things will have changed? Harpreet talked about the need for policy and standards. What is beyond GDPR? Where does the next set of standards come from? How will we evolve the way we communicate about privacy?

Andre: The first element I would say is that trust, and we have seen this during the pandemic, is not just something you address with your brain. Fundamentally, for someone to trust, they need to understand, and they need to know that people they trust are also comfortable. But there is another, more emotional dimension: having the feeling that you can trust. When you have human interaction, you get that as a direct feeling, a direct emotion. When you are in a mode where everything is digital, you need to build new ways to establish trust. And one element that is extremely important, in order to have something people can trust going forward, is to have very simple explanations that are not absolutely global: depending on what you are trying to do, you don't give the same explanation. Imagine, for example, remote medicine. When a doctor is able to evaluate your health condition remotely, you have very high demands in terms of privacy and in terms of being sure the system will not be hacked. Not only because you don't want to share that information with people who are not authorized, but also because if the information is tampered with, you may have fatal consequences. So it's extremely important going forward to have systems that are case-specific, where we build trust for a specific purpose. If you have a wallet on a smartphone, your main concern is: will my money be stolen? If it's a question about health, it's a slightly different question. And for things where you have no skin in the game, it will be less of an issue for you.

Helena: Great. So we should develop a more nuanced, contextual understanding of trust that is more personalized.

Andre: Maybe one element to make precise: what matters in such a situation is not what you say, but what people understand. So it's extremely important, when we bring things into context, that we try to understand what it means for the end user, and there is not one single end user; you may have many different cases. Just keep that in mind.

Helena: Thank you. If I can come back to you, Harpreet: when I asked how I can trust you, the first thing you said was GDPR. There was a basis for trust because of the policy, the playing field you're working on. You're an international organization. Do you see that direction evolving? Do you see other countries picking up the idea of creating privacy by design and developing it for other entrepreneurs to work from?

Harpreet: I think we're going to need to; it has to happen. One thing I want to double-click on from what Andre and Kirsty were saying. Andre, you said people need to understand for themselves; it's not what you're saying, it's what the actual person understands. Our rings were deployed in the bubble for NBA players and staff. Now, you can imagine: if this thing just shows up, is a player going to wear it? What could the implications be? So we did exactly what you're talking about, Andre. We made sure the players and staff understood, and we were very open about how this data could be used for bad and how it could be used for good. For example, if you're Steph Curry and your coach sees what time you fell asleep last night and whether you got four hours or six hours of sleep, they may choose to bench you based on that data. That could obviously affect your playing time, your stats. We were very open that that's how this data could be used for bad. And then we said: but here's how this data could be used for good. We see signals that look like you may be getting sick, so we can prioritize a COVID test, or do a second test, because in the bubble there were false positives and false negatives. And then we said: here's how we designed around the issue you're worried about. We don't share what time you went to bed. Instead of giving the individual stats, when you fell asleep, what your heart rate was, we aggregated it, just for this purpose, into a risk score, zero to 100, on how probable it is that you're getting sick. And then we gave that data to only two people within the whole NBA bubble: a rep from the NBPA, the players' union, and the NBA head of the medical staff who was running all the testing.
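Two design decisions carry the weight in what Harpreet just described: raw metrics are collapsed into a single derived score, and that score goes only to a named, short list of roles. A minimal sketch of both decisions follows; the weights, inputs and role names are invented for illustration, not Oura's actual model.

```python
# The two roles Harpreet mentions; everyone else is shut out by default.
AUTHORIZED_ROLES = {"players_union_rep", "nba_medical_lead"}

def risk_score(temp_dev: float, hr_dev: float, resp_dev: float) -> int:
    """Collapse deviations-from-baseline into one 0-100 score, so no raw
    metric (bedtime, heart rate, sleep duration) is ever exposed."""
    raw = 40.0 * temp_dev + 35.0 * hr_dev + 25.0 * resp_dev
    return max(0, min(100, round(raw)))

def share_score(score: int, recipient_role: str) -> int:
    """Release the derived score only to the authorized roles."""
    if recipient_role not in AUTHORIZED_ROLES:
        raise PermissionError("risk scores are restricted to authorized roles")
    return score
```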
Harpreet: And we onboarded 23 teams in a short amount of time when they got that bubble going, and I can tell you adherence was directly related to how many people showed up to that onboarding session. When people understood what it meant for them and their teams, as Andre was saying, they really understood: okay, I trust you now. You're telling me how this could be used for bad, and you're telling me how it could be used for good. I think you need all these bases, things like GDPR, the ability to pull contracts if you find out data is being misused, an ethics committee, but I really do think that education, and the design of the product, is the hard part. That's hard to put policies in place for; we're still trying to figure it out. So Helena, back to your question, we're going to need the people the World Economic Forum is bringing together, because it's a cross-functional thing. How do you communicate that? How, eventually, can we use this data with local and state governments, which could be really powerful during a pandemic? An open-ended answer, but I definitely feel a lot of what Andre and Kirsty are saying too.

Helena: Thank you. And the question of that upfront conversation is so important: is it opt-in or opt-out? And then, am I continuing the conversation with you, or might you be bought by another organization that changes its approach? We need to keep having the conversation. So this is an evolving conversation as well, right?

Harpreet: Yeah, totally. Look, I do agree GDPR was really, really helpful. Being honestly one of the only wearables to say we're GDPR compliant during the pandemic helped. And to your point, the fact that we're independent, not owned by one of the larger tech companies, I think also helped. So yes, I couldn't agree more, but it has to evolve. Depending on the use case of how this data could be used, those conversations definitely need to evolve.

Helena: If I can come back to you, Kirsty, on this idea of how we maintain an evolving understanding of trust: is there anything in the barometer that helps us, one, understand that better, and two, start to clean up some of the problems that created the mistrust in the first place, including misinformation?

Kirsty: Yes, and I think a lot of the things Harpreet and Andre touched on are really important here, particularly the point Andre made: it's not just what you say, it's what people understand. This is really important. The communications element comes through loudly in our survey: do what you say you're going to do. We have a tendency, when we get these really tough issues, to think: I need more standards. Now, to Harpreet's point, yes, we do need policy surrounding this, but policy can't work in isolation. We think technical, we think more procedures, we need legal review. What we also need to go back to are the fundamentals of communicating. And when we talk to people, and I think this is Andre's point about the human element of tech, when we explain to people, in ways they can understand, what is going to happen and how things get used, you see trust rise.
Kirsty: And there's an irony as well: people have this concern about data and privacy, but very few people know how to change things as basic as the settings on their phone, or what they are giving up in terms of information. So I do think the communications element of this is going to be critical going forward. The other thing that comes through in the data, and it really is about the role for business, is that people expect CEOs to lead on these issues and they expect business to recommend changes. So when I listen to both Andre and Harpreet, this idea of self-regulating, of saying we know we need to do things better in this space, we should embrace this as a moment to look at the ethics of it. And it can be a competitive differentiator, if you start to think about: what are my consumers telling me? What are they concerned about? The last point I'd bring out is one of the most interesting findings: more than seven in 10 people believe that the consumers and employees of a company have the power to change the direction of that company. So this element of the consumer voice in shaping how companies take this work forward is going to be incredibly important.

Helena: Thank you, Kirsty. I'm really excited that you brought in the idea of consumers and employees at the table. The hope would be that we can develop trust by design, with consumers and employees there at the start of the process, rather than consumers and employees having to force change, right? And how we do that. Now, Andre, I would love to come to you. I think you raised your hand, and perhaps you can also talk a bit about this concept of collaboration. I saw that you have started working in Switzerland on the concept of the Trust Valley, collaboration across multiple organizations to build the platforms we need.

Andre: Thank you, Helena. I will first come back to a point Harpreet just made regarding data. One element we need to look at going forward is not the standard type of regulation, but a new type of regulation. To give you an example: attach to a set of data the rules of what can and cannot be done with it, independently of who owns the company that operates the data, and define, for instance, that this health data, like sleep data, can only be used by certain types of people for a very specific purpose. Attach to the data itself the rules of what can and cannot be done. Because if we treat data as just a black-and-white question, we may miss something very important. Imagine you have data related to cancer or to COVID. If aggregating this data can save lives, I know very few people who would refuse. But if you just say, "give me access to your data", the response will simply be no.
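Andre's proposal, rules that travel with the data rather than with whoever owns the database, is sometimes called a "sticky policy". The sketch below shows the shape of the idea; the purposes, roles and record layout are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class UsagePolicy:
    # The rules travel with the record, independent of who holds it.
    allowed_purposes: frozenset = frozenset({"aggregate_research"})
    allowed_roles: frozenset = frozenset({"epidemiologist"})

@dataclass(frozen=True)
class HealthRecord:
    payload: tuple          # the data itself, e.g. de-identified readings
    policy: UsagePolicy = field(default_factory=UsagePolicy)

def access(record: HealthRecord, role: str, purpose: str):
    """Enforce the attached rules at the point of use, not at the
    boundary of whichever company currently operates the database."""
    if (role in record.policy.allowed_roles
            and purpose in record.policy.allowed_purposes):
        return record.payload
    raise PermissionError(f"{purpose!r} by {role!r} not permitted by policy")
```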
Andre: Now, coming back to the question of the Trust Valley initiative: what we have found in Switzerland is that trust is fundamentally a very valuable asset. And the issue is that when you go from the real world to the virtual world, the notion of trust, even the notion of ownership, is very different. I'll give you an example. If you have a car and someone steals it, the key thing is that when you go back to the place where you left the car, the car is no longer there. With data, if someone has stolen your data, it is still there, but now it is also in a million other places, and you have no way to put the genie back in the bottle. So fundamentally, this requires new tools and new approaches to establish trust. And trust is not something you have forever: you can lose it within milliseconds, and building it takes years or decades. The whole story behind the Trust Valley is how to construct real trust in the digital space.

Helena: Well, with that, a perfect ending to this first half hour of the discussion. Thank you so much to all of our panelists.