A lot of people say that data is the new oil. Data has a lot of value in aggregate, and that's what we call big data. On the other extreme, you have personal data: the data that defines you as an individual, the trail you leave as you move through the digital world.

As AI becomes more ubiquitous, we should expect our entire environment to, in some sense, become smart. And if every device is listening to your every command, you'd better be careful before saying something. So learning how to adapt to this, deciding what the appropriate level of smartness in a device is, and making these devices secure enough that you can trust them, that's going to be an enormous challenge.

If you ask people some basic questions about what I call data maturity, how they think about their privacy, how they use apps, how they store data, how they use the cloud, what they share and what they don't, very few people have full maturity about what this means. All of the data being generated can be used in ways that have a great deal of impact on their lives. And given that AI feeds off data, this becomes very important.

AI has to earn our trust, and without transparency about what's going on behind the scenes, it will be more difficult for it to earn that trust and be accepted. For us to see AI widely used, we first have to believe in it. What kind of protection regime do we need so that we can also use this data in the most constructive way, one that in the long run does not end up harming people even if it helps them in the short term?