Okay, hola everyone. My name is Katitza Rodriguez, and today I will be hosting a conversation with the amazing Avi Bar-Zeev. He has been working on AR and VR for over a third of a century. Among other things, he worked for Disney, helped build the digital globe that became Google Earth, worked on Second Life, and created AR systems and experiences at a lot of big companies: Microsoft on HoloLens, Amazon on Echo Frames, and most recently he helped Apple in a serious way. He has also helped clarify the harms of ad-driven business models, and has shown how our right to privacy is often the foundation of other fundamental human rights. I find even the existing generation of AR systems very compelling, and I'm excited about their potential to open up new forms of entertainment, art, and storytelling, and new ways of defending our rights online. But whenever I dig into this topic, I'm concerned to hear researchers talk so happily and enthusiastically about collecting more of what is called egocentric data about users: about our attention, our emotions, even our involuntary movements, and of course the inferences that can be drawn from all of this data. So I have been thinking about how to make sure that these technologies are done right. The question should not only be whether inferences can be drawn accurately; we should also discuss whether we should be making some of these correlations at all. So, thank you so much for making the time to join us today and share your knowledge with us. I have a few questions for you. The first one: there are several different acronyms that get used in this space, VR, AR, MR, or the general term XR. Can you explain what XR means?

Sure. Thanks for having me. You can hear me, right? Yes? Okay, good. So XR is a term that folks in the industry came up with because a lot of the other terms were confusing.
And we have a problem in this field with companies coming in and taking the names that we all use. You may have seen that happen recently. So we use XR as a placeholder term to mean any reality, whether it's AR, VR, or MR. That came after having some of these things happen: mixed reality used to mean the whole continuum, but a certain company started using MR as their name. And even with XR, a certain other company has tried to call it "extended reality," and to me that doesn't mean anything; I don't know what extended reality is. So I just like to use the term XR, and if you want to be really precise about it, think of it like Rx for drugs, where you have the R and the Greek letter chi, just in reverse. That's what we call the whole field. It's not going to be that popular, but it's what we mostly call ourselves.

Okay, let's keep things going. What's the outlook for privacy in XR systems in the near future? Will people be in control of their data? Do you think so?

I think in the short term, yes, but decreasingly so as this technology develops. I hear a lot of people say, you know, who cares what some company knows about us? And they give away their personal information for free, just for a little bit of convenience. That's the norm, and I think it's because people don't see a lot of great alternatives. Privacy feels a little abstract, kind of like saving for retirement. But what matters is whether these companies can use our personal data to manipulate and maybe even control us. I know it's hard to believe, and some people dismiss this as, you know, a mind-control craze. We like to think we're rational, beyond such influence. And the truth is, this stuff never really worked that well in the past, so it's easy to say, oh, it just doesn't work, they're just making it up. But it's going to work really well in the future.
And that's really what I'm here to talk about, because I've already worked on research like this. I've worked in this space for 30 years, and I can see a clear trend of the profit motive and the potential of the technology coming together. That's what I really want to talk about. I want to be clear that some of the stuff I talk about doesn't exist today except in research, but as devices roll out with things like eye tracking, it will become more and more common. But look back at just the history of this. We know TV ads didn't really work that well. I mean, they did somewhat. You can imagine some of these ads being emotionally manipulative; they would play on our emotions and sensitivities. But how well did they really work? How many people really think drinking Pepsi makes them more likeable, or that buying Duracell batteries makes them feel more connected with their kids? That's what the commercial is selling us on, and people don't necessarily believe it. But on the other hand, if those commercials didn't work, companies would be doing something else. So there's something to it; it does have an effect on us. And that's before personalization. Once the system starts to know who we really are, we're going to see a whole new level. You can think of advertisers as kind of like professional gamblers. They're betting on us; they're betting on behavior. They don't control the cards, but they can make small bets, and over time they win money. And the companies that are like the house, the places where the bets are placed, they don't have to win every hand either. They just have to win over time, by percentage. It's going to get more and more extreme over time as personalization goes down to the individual level. Right now it's still bucketing, but we'll talk more about what happens when we get down to the individual.

Thanks, Avi. Continuing with a question on privacy.
There's been a whole discussion of how ads could be targeted based on how we feel, and whether or not this really works yet. I know there are several sensors that could be used for this. To take maybe the best known of them: what is eye tracking technology, and what are the major privacy concerns in this space?

Sure. I'll get to eye tracking in one second. One thing I need to explain first, I think, is that right now ads are very interstitial: they show up on top of web pages, they show up between videos. Ads in the future are going to be part of the world. They will be indistinguishable from the world. It's what we might call experiential advertising, and the manipulations can also become very subtle and woven into the fabric of a reality, so we won't even notice them. Mostly because we have to believe the reality for the manipulation to function; it doesn't work very well if we're questioning the very nature of reality, or what's real, or what's an ad, or what's trying to manipulate us. There's a level of intimacy here that requires some vulnerability, and we're really lacking defenses when it comes to that, especially once our personal emotional triggers become known to the system. The algorithms that can push those buttons are the ones that can get past our rational responses. We've talked a lot about this with social networks; that's one example of our buttons being pushed to keep us engaged. But eye tracking, I think it's important to say, has a lot of positive uses, right? It's not an evil technology that's just showing up to take over, but it can be used in negative ways. Some of the positive uses are increasing the performance of the devices, or helping create the operating systems of the future that know what we're trying to do and can help us actually accomplish our tasks.
But because of the way our eyes work, and I've written a long article describing the details if you're really interested, our eyes become a window to our minds. Our eyes show what in the room we are thinking about, what we notice, how we feel about it, whether we're excited about it. Pupil dilation is an autonomic reflex that shows when we're excited about something, so having cameras capture it tells whoever is watching that we feel excited about something. And basically, the combination of that with forward-facing cameras, like you'll see on the Ray-Ban glasses or on Facebook's Project Aria, or with their Ego4D collection, captures what we're seeing in the world, while the eye tracking cameras capture where we're looking. The combination can tell the computer pretty much what we're thinking about relative to the world we're in, not our abstract thoughts but our immediate ones. So the danger in this data is extremely high. I don't think most people understand how important this data is, and I don't think anybody can really knowingly consent to all the deeply personal insights that can be gleaned from it. Just having the ability to delete your recordings before they go into the cloud is not enough to guarantee safety. It's also possible for this technology to learn how we feel about each other. Just think about the glances people make at each other when they feel certain ways, positively or negatively. If a computer is watching those glances, it could learn pretty well how we feel about each other and build up that model without us ever explicitly expressing our social network and who our friends are. Just to give you an example: we had one person that I put through eye tracking, and we learned, and none of us had realized this after working with the person for a year, that he looks at your mouth when you talk, not your eyes.
Most people will look at your eyes, but he looks at your mouth, which might be the case for someone who is lip reading, but he wasn't hard of hearing. What it might have indicated is that he had some cognitive differences; he was maybe neurodiverse in terms of how he processed speech. That's an insight he never had about himself, and we learned it just by looking at his recordings. And I was very conscious, when I was wearing eye tracking glasses, of being careful what I look at, of not making the wrong look or gesture, because it's all going to get recorded. It has a very chilling effect on how we interact as well. And it's going to be something we all have in our devices. It'll be built into glasses. The first generation of Ray-Bans doesn't have it, but Project Aria already has it, or is looking at having it, and other devices will have eye tracking as well within the next one to two years, you'll see.

The stream cannot hear you.

With the paper, I collaborated with Brittan Heller, who's a very well-known human rights lawyer and author, and the idea was to cover the history of advertising and show how it's going to be projected into the future. One current thing to look at in that regard is Super Bowl ads, right? We probably all know people who only watch the Super Bowl for the ads, because there's a lot of production value and they tend to be more dramatic and more interesting. They're not watching for the sports; they're not even tuning in for the sports at all. I think what we're looking at in the future is advertising that becomes so compelling, and so much a part of the world, that we like it. The question is not whether we hate it or whether it bothers us. It's actually that maybe we like it too much, and maybe it's affecting us a bit too much.
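To make the earlier point about eye tracking concrete: combining a forward-facing camera's object detections with gaze direction and pupil dilation is, in principle, enough to build a crude interest profile of the wearer. The sketch below is purely illustrative; the data layout, field names, and weighting are invented for this example, not any real device's API or pipeline.

```python
# Hypothetical sketch: inferring an "interest profile" from gaze data.
# All names, fields, and thresholds here are invented for illustration.
from collections import defaultdict

def gaze_target(gaze_xy, detections):
    """Return the label of the detected object the gaze point falls inside."""
    gx, gy = gaze_xy
    for label, (x0, y0, x1, y1) in detections:
        if x0 <= gx <= x1 and y0 <= gy <= y1:
            return label
    return None

def interest_profile(frames):
    """Accumulate dwell time per object, weighted by pupil dilation."""
    scores = defaultdict(float)
    for frame in frames:
        label = gaze_target(frame["gaze"], frame["objects"])
        if label is not None:
            # Dilation above the wearer's baseline is treated as arousal.
            arousal = max(frame["pupil_mm"] - frame["baseline_mm"], 0.0)
            scores[label] += frame["dt"] * (1.0 + arousal)
    return dict(scores)

# Two synthetic frames: the wearer fixates an ad (dilated pupils),
# then glances at a friend (baseline pupils).
frames = [
    {"gaze": (120, 80), "pupil_mm": 4.1, "baseline_mm": 3.5, "dt": 0.033,
     "objects": [("soda_ad", (100, 60, 200, 140)), ("friend", (300, 40, 380, 160))]},
    {"gaze": (340, 90), "pupil_mm": 3.5, "baseline_mm": 3.5, "dt": 0.033,
     "objects": [("soda_ad", (100, 60, 200, 140)), ("friend", (300, 40, 380, 160))]},
]
profile = interest_profile(frames)
```

Even this toy version ranks the ad above the friend after two frames, which is the point: the raw signals are trivial to collect, and the sensitive part is entirely in the inference.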
And I think what we're going to be seeing with XR, especially more on the VR side than the AR side, is that once the world can be changed based on our personal graphs, the information that companies collect about us, you could start to see the world tailored to match what we really want and need. That sounds like maybe a good thing, but when it's used against us, it's not so good. So there are a few things about eye tracking that are important to understand. One is, just a moment on the way our eyes work: when you blink, or when you move your eyes rapidly, you're actually blind. You can't see the world, and your brain just fills in the continuity of vision. So you can actually change the world while we blink; you can change the world while our eyes move around, and we won't even notice that the world has been changed. This has been used in research for some really impressive results. One of them is called redirected walking: you are literally walking in a circle, but you perceive yourself walking in a straight line, because the world keeps changing around you as you move. So now imagine that tied to product placement. Elements of the world change every time we blink or look around, and by matching that to our attention and what we're interested in, the world can be optimized to show us what the advertisers want us to see, so that we will look at their content and be more persuaded by it. More importantly, I think, eye tracking allows this experimentation to happen: what things catch our eye, what things make us emotional. And it will be different for everybody. This is where TV and future XR advertising really diverge: the differences between us become super important. So for example, you may have one person whose political hot-button issue is abortion, and they are willing to vote for candidates entirely based on that one issue alone.
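Redirected walking, mentioned above, works by injecting a small extra rotation of the world each frame, kept below the wearer's perceptual threshold, with larger corrections hidden inside blinks and saccades when vision is suppressed. A minimal sketch of the arithmetic; the threshold values here are illustrative placeholders, not figures from any published system:

```python
import math

# Hypothetical redirected-walking gains. A small yaw is injected every
# frame; because it stays below a perceptual threshold the user never
# notices, yet their "straight" path physically curves into a circle.
CURVATURE_GAIN_DEG_S = 2.5   # illustrative imperceptibility threshold
SACCADE_BONUS_DEG = 0.5      # extra rotation hidden inside a saccade/blink

def injected_yaw(dt, in_saccade):
    """Degrees of world rotation to inject this frame."""
    yaw = CURVATURE_GAIN_DEG_S * dt
    if in_saccade:            # vision is suppressed: rotate a bit more
        yaw += SACCADE_BONUS_DEG
    return yaw

def radius_of_walked_circle(walk_speed_m_s):
    """Radius of the physical circle traced by a user who perceives a
    straight line while the world rotates at the constant gain rate."""
    omega = math.radians(CURVATURE_GAIN_DEG_S)  # rad/s
    return walk_speed_m_s / omega

# Walking at 1.4 m/s with 2.5 deg/s of hidden rotation keeps the user
# inside a circle of roughly 32 m radius.
r = radius_of_walked_circle(1.4)
```

The same per-frame "edit the world while they can't see" machinery is what would let a future system swap products or signage during a blink, which is why the advertising tie-in Avi describes is technically plausible.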
By being able to push that button, the advertisers and the system are going to be able to trigger somebody to be emotional for a period of time, and they may not even be trying to advertise about that particular political topic. They may be trying to get somebody worked up so that the next thing they see is something they don't look at with the rational filter they otherwise might. And all of a sudden you wind up seeing people, like we see today, saying "I'm against CRT," but you ask what it means and they can't tell you. Their buttons have been pushed emotionally, mapped to whatever their sensitivities and their issues are. So how far can it go? It's unclear right now; it's only been tested in early phases. We've seen some examples of this working. We've definitely seen eye tracking reveal emotional responses and track what people are interested in. But the projects going on today are the data-collection projects that will enable the experiences of the future. That's what we have to think about. All this stuff happening with Ego4D at Facebook and with Project Aria is being collected so that companies can learn what to do, so that they'll know what to do. And we have a little bit of time to respond to that before it's put into action. So now, I think, is the time to understand all the potentials of where it can go.

How about the metaverse? Facebook got a lot of attention last week for renaming itself with reference to this science fiction concept. What's your understanding of the metaverse? Also, Avi, Rody told me that the public cannot hear my questions. Do you mind repeating my questions for the audience on Twitch?

No problem. So the question was: what is the metaverse? And this is a bit relative to Facebook. I guess the short answer, and I know I'm running over a little bit, is that it means something different to every person.
It's had at least three, five, seven major definitions over the last 30 years. For some people it's about Ready Player One-like worlds, maybe just on tablets and phones right now. For other people it doesn't even need to be 3D or XR at all; they'll still call it the metaverse because it connects up IoT and mirror worlds like Google Earth. And for some people it's more about distributed control; it's more about NFTs and DAOs, things like that. So it really depends on who you ask. For me, I've been focused on AR, so the one definition that concerns me the most is when people start saying AR is the metaverse. I think AR is AR. AR is our way of interacting with the real world in the future. We can save that for another day. But the thing that Facebook, er, Meta, needed to do that they hadn't done before, strangely enough, even though they're a social network, is that they never really had people on the website. Think about it: you saw icons and videos and text and pictures, but the actual people were not in the website. The people were remote. You only saw the artifacts of people. So the thing that best explains, I think, why Facebook is doing this, apart from everything we can say about all the trouble they've gotten themselves into, and branding and all that, the real reason the metaverse matters to Facebook is that they need to get over this problem. Their entire premise is that they're connecting people, so they need people to actually be in the space together, and they haven't done that yet. So all this metaverse stuff is really about bringing people in, and when Zuckerberg talks about the "embodied internet," that's what he's really saying. But honestly, I don't think the future of work is having virtual meetings in boring conference rooms. I mean, that part of it seems crazy. What's the point? The future should be about how to avoid having boring meetings in the first place, not just how to do them virtually.
Okay, Avi, and I think this is our last question. What about the glasses? You have worked on inventing wearable devices for augmented reality. How far has this gotten? Where is it likely to go next? And how will this impact our privacy rights?

Yeah, I'll try to go real fast on this one. So the glasses are in progress. It's a very difficult technical challenge to make glasses people want to wear, first of all. None of us really wants to wear glasses, but we do what we have to. And making glasses that you would be willing to wear outdoors and in social settings is incredibly difficult. Our faces are very sensitive real estate for putting anything on. Even in the future we may have contact lenses, but not everybody wants to wear contacts either; they do because they have to. So these things have to add a lot of value. They have to be extremely good before people are willing to actually adopt them. The challenge with everybody wearing glasses, and it's starting now with glasses that can simply record other people, is that there are a lot of privacy concerns around people being recorded without their permission in public. I think we really need to have a discussion about a permissions framework in which people can actually say whether they want to be recorded or not, out in public and in private. And of course there are important cases: we want to make sure that we can capture wrongdoing, like what happened with George Floyd's murder. We don't want to regulate that out of the toolset we have for combating authoritarian abuse. But we also want to make sure other people have the ability to say "I want to be private right now" when they're not doing anything wrong. Eye tracking is going to be a huge thing, as we said before. The combination of these cameras capturing everything, plus eye tracking, means that the companies building these devices can, if they want to, build a map of the world, a map of all the people, and a map of what we're all doing.
And that has a chilling effect on how we behave. This is where privacy really impacts all of our human rights, because the more we know we're being watched, and the more we know we can be tripped up for anything we did wrong, whether that's being canceled socially or being arrested for not stopping perfectly at a traffic light or a stop sign, the more everything about how we engage with the world comes under extreme scrutiny. That's what happens when we're recording everything and recording each other all the time, and it's one of my biggest concerns, right? There's a reason we have the Third and Fourth Amendments to the Constitution. I'm not a lawyer, but, you know, at the time we had soldiers being quartered in people's houses, and we said, oh, we don't want the government invading us. I think we also need to say we don't want big companies invading our privacy, because it will kill our ability to be ourselves, to have our freedom of thought, our freedom of association, even our freedom of speech. Privacy is the first line of defense to make sure that we can truly be ourselves. The glasses are really starting to impact that, and I think now is the time to set the policy that says what we can and can't do with them.

Thank you, Avi. This was very interesting. I think we are short on time right now and need to go to the next panel, but let me see if we have any questions from the audience. If not, we can move to the next panel. Thank you, Avi.

Thank you.

So, no questions from the audience. We are ready for the next panel with Kurt, whenever you're ready.

Thank you, everybody, and let me invite our panelists up to the stage. We are fortunate to have a great panel with us today, and we're going to talk about surveillance in XR in all its forms: VR, AR, MR. As Avi said, no matter what you call it, XR is the generalized term.
This has come to the fore a lot lately with Facebook's adoption of the metaverse, envisioning a future in which people will be doing a lot of things in virtual reality. And then we also have the possibility of a future with lots of AR, where people will be wearing glasses that potentially have cameras and microphones, and that will see and interact with the world around them. These create a lot of issues. We want to make sure that we go into a future in which we can take advantage of the cool features of these new technologies without giving up the fundamental rights, to privacy and to free expression, that we have in our current world. And a lot of this is challenging, especially surveillance, because in order to be part of a virtual world, or to fully interact with an AR world, a lot of data has to be collected, sent through centralized servers, and transmitted. For all the people who are here in this room, information about what you're doing and saying is going to Altspace and then coming back and being distributed among all the other people here. It's cool that we're meeting and talking this way, but it also creates a lot of possibilities for surveillance. Oh, and I should introduce myself. My name is Kurt Opsahl. I'm the Deputy Executive Director and General Counsel at the Electronic Frontier Foundation, and I've been working on our AR, VR, and XR issues. Let me turn it over to our panelists to give a quick introduction of themselves. Let's start with Daniel.

Great. My name is Daniel Leufer. I'm a Policy Analyst at Access Now's office in Brussels, in Belgium, in Europe. Access Now is a global human rights organization that works to protect the digital rights of people at risk all around the world.
I mostly work on issues around artificial intelligence, but I've also been doing some work on AR, VR, and XR, especially thinking about how this can be a new frontier for some of the troubling things we're seeing today.

Thank you. Kavya?

Hello, everyone. It's nice to be in this virtual space with you all again, and it's great to see that EFF is taking the step to really get into XR. I'm Kavya Pearlman. I'm the CEO and founder of XRSI, the XR Safety Initiative. Our mission is to help build safe and inclusive XR ecosystems. I used to say XR ecosystems, plural, but now it may be one ecosystem of a kind, with the metaverse, which of course we will talk about again. To carry out this mission we identified a few focus areas. We have a dedicated coalition that focuses on inclusion. We have a Medical XR Council, because in these realities our autonomy is at risk, so it focuses on data protection and privacy for medical use cases. Then we have a trusted-media effort: we conduct immersive podcasts, and we are educating more and more journalists on how to use these realities and how to trust them. And finally, we have a child safety initiative; very recently we helped contribute to reform, getting these new technologies addressed in policy and lawmaking globally. So, all of that. This is my passion. By profession I'm a security professional, but XR just happens to be what brings it all together. I'm always here to listen and to share knowledge with all of you. Thank you.

Thank you, Kavya. Micaela, would you like to introduce yourself?

Can you hear me fine?
Yes.

Thank you so much for the invitation. It's an honor to be here, and I'm so happy that we are discussing these very relevant topics. I'm Micaela Mantegna, and I'm an "abogamer." I wear many hats, and the one that is probably most relevant for today is that I'm a fellow at the Berkman Klein Center, researching video game policy, and particularly something that I started to talk a lot about this year, which sounded like something from the future and is now on everyone's lips: the metaverse. Most of my work has been related to artificial intelligence ethics and intellectual property, and one of my concerns is how these regulations interact.

Well, thank you. So let's kick things off with a central question: what are the issues that XR is raising for surveillance which are different from, or go beyond, the surveillance issues we have with traditional audio and video communications? Feel free to jump in.

Yeah, so, you know, we always talk about issues in terms of the panopticon, and I think that is the term we should really anchor our conversation in. I mean, hands up: how many of us really want to live in a reality where we're constantly watched? And it's not just some remote watching; it is exactly what Avi was saying earlier, tracking your thoughts and interactions, if you really think about it. And Kurt, you probably said it best about the panopticon; I don't know if you want to unpack that term. So that's one particular issue, the constant tracking, and then there is the data collection, data collection at all times, to the point of being manipulated at all times. I don't know if we want to have that in our daily lives. Maybe jump in, please.

There are a lot of issues; I think we could stay on this first question for a couple of hours. But coming at it from the perspective of my work: a lot of the work that I do is around biometric
surveillance. We've been campaigning to try to ban the use of biometric surveillance, like facial recognition, in publicly accessible spaces. At the moment that's mostly law enforcement agencies running it retroactively on CCTV footage, but if you look at something like AR glasses, in the future you're going to have the possibility of running all sorts of that on them: facial identification while you're just walking through public space, identifying people, all the way to all sorts of crazy applications that in some cases are pseudo-scientific, like the companies out there who say they can predict whether someone is a criminal or a potential terrorist based on their face, you know, the length of their nose. This is really stuff we don't want people to be doing, and human rights frameworks have established that. So without regulation, or without responsible behavior, the potential that opens up when these cameras on everyone's faces can run all sorts of crazy apps is really frightening.

If I can jump in: I think we still have huge unsolved issues on social media, and we are carrying them over into these immersive worlds. The other layer is that they are going to collect a lot of data that is more intimate to us. As Daniel was saying, algorithms are at play; they can create inferences, and one of the things I always emphasize about this is that those inferences are mostly secret to us. So we carry around invisible layers about ourselves, about what the algorithm thinks we do, and there's no way this cannot be abused. Just think about the promise of social media, and our images and our pictures, and how they have been abused by so many companies, not least Clearview AI, which took this huge trove of information and sold it back to law enforcement. Now we are going to have devices that are really close to our bodies and, as Avi was saying, drawing these kinds of inferences about the things we are
looking at, and even our politics. Because, at that conference, and I always have to say it was Facebook's conference, I can't just let that pass, they were talking about the future and hinting at devices that are going to track our neural pathways just from how we think about moving our hands. And that comes back to mental privacy. One of the fundamental principles of criminal law is that you cannot be charged or prosecuted for what you are thinking, and there is no way we can just believe the promise that this is not going to be abused, particularly when we have these precedents. So I think it's time to get together and start putting some questions to these promises. They are presenting a kind of duplicated version of the world, and they are also trying to sell a forced version of reality: they want to port our persons and the way we see the world. Because from the last conferences, they are not creating a space where you can change your avatar, change yourself, and adopt another skin; they are trying to port reality into a virtual world, which doesn't really match what the concept of the metaverse should be. So it's really time to talk about decentralization, and not least because one of the things that for me is a really big deal is how all the things we are seeing on automated content moderation are going to be ported into this very intimate creation of content. As Avi was saying, we are going to see real-time creation of content thanks to generative artificial intelligence, and it is going to be created from data that was previously taken from us without our noticing. So there are a lot of layers here replicating the issues we already see on social media.

Hey, Micaela, a number of people cannot hear you properly.

I'm sorry. Is my voice okay?

So, a lot of really good issues raised there. We have the massive data
collection, especially biometric data collection, to enable the conversation, and also the potential inferences, the secondary data derived from that: inferences about what you are thinking, what you are in favor of or opposed to, what excites you and what does not. These all create additional privacy issues that are harder to produce in a more traditional video and audio environment. Among the issues we just discussed is the panopticon. This is a term derived from a prison design in which a single guard would be able to view inside every cell, and the notion was that, because that single guard could be looking at you at any moment, this would inspire better behavior: people acted as if they were being watched. And this is one of the concerns that comes up with an extended reality environment, where you are being watched in a way that you're not in the physical world. So this brings us to the important question: how do we move forward? How do we keep the promise of this technology and take advantage of it without creating a panopticon?

I can kick off by saying that for me, decentralization is key, because one of the things that we should have for an accountable metaverse is different companies working on it, and different people working on it, and even governments working on it as well and having a presence there. We have seen this tension between centralization and decentralization, and if a company tries to be the one place that encompasses everything, it is going to be very hard to hold it accountable. For me the keys are accountability and decentralization.

I want to zoom out a bit, because you talk about wanting to solve these issues, and we did task ourselves with solving these issues back in 2018 and 2019. As soon as we began, the first thing that we encountered was: what the F is XR? So it brings us to the first point: are we even speaking the same language? We heard the term biometric data mentioned, we heard inference, we heard a lot of terminology being dropped by Avi Bar-Zeev. The one
thing that we must do is first collaborate, come together, and then create consensus around taxonomy. One of the efforts at XRSI that I'll take as an example is "biometrically inferred data", a term we can potentially use to make better laws, to take data-capture regulations, or privacy regulations, beyond PII, which is personally identifiable information. So I think one of the major issues we're going to confront is just this: Meta is now touting the metaverse, but has anybody taken the time to standardize it? That's the kind of activity XRSI is pursuing with lots of stakeholders and partners. First we need to create consensus on what exactly we are trying to address. What does decentralization mean in terms of extended reality, and what is it going to look like? Are we talking about creating virtual worlds based on blockchain? Blockchain by itself doesn't mean it's secure, so what kind of implementation and standards are we going to adhere to? Once we've created those standards, we go to step two, which is understanding the context of the data, because in certain contexts I would love to share my data and in other contexts I wouldn't. I do like to shop with companies who cater specific shopping to me and whatnot, but in a different context I don't want my medical data in the hands of developers using those inferences and then sharing them with some insurance provider; at least in America, you could potentially be denied coverage for issues you may not even know exist. So I think there is a process we need to follow: we create consensus, we understand the context of the data, and then we create these guardrails. Which is basically why we are coming together, Access Now, EFF, XRSI, and even Meta. I would say that without involving and engaging all of these companies it is impossible, because they hold all of this data
and they hold the rules of the game. We can help them, we can inform them about what they should do in terms of self-governance, because regulation will come in very late; we need to educate them. The third part is that a lot of people are talking about Facebook and Meta, but has anybody asked where the hell Apple is in all this? We have not once heard what is going on at Apple, and I really would like to point out that that black box needs to be unpacked, because they can't just one day get on a stage and unleash all this incredibly sophisticated technology while we are basically just sitting there shouting at them, just like they did with the CSAM stuff. What I'm talking about is how they tried to break encryption, in a way, with the client-side scanning for child sexual abuse material, and that is not acceptable. So everybody starts to demonize Facebook; I'm saying we need to get together with all the stakeholders, regulators, everybody, create consensus, understand the context, and then start to put the guardrails together: not just regulation but self-governance, and also this sort of shared responsibility. Because platforms can give you choice; in these VR apps right now you have a lot of controls, but you have to press the button to mute that harasser, who is still going to come up to you and say horrible things. We have to shift some of this responsibility and share it amongst us.

I would just add that there is actually a huge amount we can do with the existing data protection regulation and the data protection principles we already have, things around data minimization. You see the same problem in the world of "artificial intelligence" (doing scare quotes), where you hear things from companies like, oh, this is so sophisticated, regulation can't keep up, and regulation gets in the way of artificial intelligence. And it's like: data protection regulation protects rights, and artificial intelligence applications threaten them; that's why there's a
conflict. It's not regulation that's the problem, it's some of these applications. And I think if you look at basic data minimization, applying that to XR, applying that to VR, is key. As Kavya said, there are certain contexts in which I will want to give eye-tracking data, in which I will want to give quite intimate data, because maybe I want to have a really expressive avatar, an avatar full of expression, in certain contexts. That's what I want it for, and that's it: it's purpose limitation. The data is just for that purpose, and if that's respected, there's actually no issue. So there are these basic, well-established principles that would solve a lot of these issues, and companies are simply not following them. And I don't agree with this rubbish that this is a completely new sphere and regulation can't keep up; that is simply not true. There are cases, of course, where we really do need to adapt to new developments, but I think a lot of the baseline issues we're encountering are really easy to solve with the tools we've already got.

Totally, totally agree with you. Yeah, absolutely. I think these are, in a way, the good questions, the interesting questions that have been addressed in other contexts: how to extend our privacy protections to new technologies, which may be gathering more information or using it in different ways, and how to do that sensibly. At least my hope is that by giving people more control over their data and how it is transferred, and more transparency, so they understand what they are agreeing to, we can preserve privacy and civil liberties in an XR world. One very challenging group is non-participants, people who are maybe just walking by: they could have their faces analyzed, could have this technology applied to them, and it's very difficult to have a mechanism to get their consent, or to have them even be aware that they are participating. But it's a
very important part of our society that you're able to walk out in the public sphere. We're used to a notion where people will see us but perhaps forget about us in a few minutes, so that we have a kind of practical privacy; ubiquitous AR glasses might change that paradigm. So how do we address the privacy issues of bystanders?

Great question, and I think again I will go back to that same formula: let's start with taxonomy. We said "bystander"; we need to divide that further, based on what type of bystander situation is going on. One is coexisting in a space as a bystander: especially for virtual reality users, we may share the same physical space or virtual space and interact with each other. The second part of that is about demoing. If somebody is just giving you a demo, and this applies especially in the education field, where many times people are demoing to students and researchers put them in VR, we need to create a culture around it, because virtual reality is a real experience for some people, and if somebody falls, the way you touch them could really trigger a sense of harassment, or PTSD. So there is this cultural aspect of bystanders that we need to address. And then there is interruption. Just yesterday, Quest 2 version 34, I believe, addressed bystander interruption. We've all seen this, I don't know if you have, the clips where somebody's kid wanders in, or someone smashes a hand on a table. Quest 2 now gives you Space Sense: while we know roughly what Space Sense does, it now allows you to see if somebody is entering your Guardian. But have we thought about how it operates? What other data is being collected while you look around? We haven't been given the technical specifications and the details of what it does to our privacy and what protocols are being used. So once we create those taxonomies, we move on to: okay, how do we understand the context? Because in the case of AR
glasses, context is going to matter even more. The only way to address this is to have contextualized AI that can tell: hey, I'm okay with being recorded in my living room, within the limits I define in my own home, but the AR glasses shouldn't record when I'm in my bathroom. That is not possible unless we have contextualized AI. So we need to create those kinds of understandings, basically a common understanding of the rules of the game, and then those mechanisms need to exist; only with those in place can bystander privacy be respected. I don't know, what do you think, Daniel? Were you going to speak?

Just a quick thought to add to what you were saying, because I think the problems we are describing already exist, and it's not a different kind of argument just because now we have cameras that can catch bystanders; the thing is that there is now a different scale to the information we can capture about them. So the problems are already there, and we have been working on them, and we should build on that. One of the risks, and this is something I take from the Facebook conference, is framing everything as something new, because that erases the discussions we have already been having about how algorithms process content. I think we should start by taking the previous experience we have with privacy and surveillance and adapting it to this new magnitude, not forgetting that we already have a base to build on.

Just on the question of privacy: we talked about the Ray-Ban Stories, and we published a blog today. Facebook made quite a big deal about the fact that they sought feedback, and we actually participated in a design jam, I think in May 2020, and gave them crystal-clear feedback that they ignored, completely ignored. And to build on Micaela's point, I think it's really important to point out that anything you
can do with a pair of AR glasses, you can do by pointing a smartphone camera at someone. You can run facial recognition; if you have access to a publicly available service, you could run it on your phone or on the glasses. But the problem is that having cameras on the glasses, having the ability to run these apps on the glasses, means there's a lot less friction. There's a basic friction in: I point my phone at you, you see me pointing the phone, and that friction creates the possibility of intervention; it both stops a person from doing it and, if they're brazen, lets the bystander react. On glasses, where you're just looking at someone, that friction is gone. And what we said to Facebook was: you've reduced the friction on the user side by placing this capability in glasses, so you need to increase it in some other way. You need to give a really annoying signal, whether it's a loud noise, a horrible smell, I don't care what it is, but you need to somehow replace that friction, to reintroduce the possibility for the bystander to know this is happening and potentially to intervene. And I think as well, Avi had a fantastic talk about the Ray-Bans where he pointed out that AR glasses do have to have a front-facing sensor, but that sensor doesn't have to be a camera that can record; the idea that these Ray-Ban Stories are an essential step on the way to AR glasses is nonsense. We don't need glasses that look exactly like the Ray-Bans and record people; that's not a thing we need to do, unless, you know, you have some kind of sinister motive, like normalizing the panopticon in everyday life. So I think it's a difficult challenge, and I would never minimize how difficult the problem of alerting bystanders is, but with the Facebook Ray-Bans I think we can say they've shrugged the problem completely off their shoulders.

That's a good example for sure. And Daniel, I totally remember being on that panel, and afterwards there was a discussion, and I remember a very strong recommendation from multiple stakeholders that you
really have to be careful about this. But that's exactly what they did: instead of listening to the recommendation, they shipped a little tiny light, no standards behind that light, not red, nothing, just a white light, which to a nearby person signals nothing more than an ambient glow, and in many cases you could put a piece of tape over it and it won't be noticeable. So yeah, it just goes to show that oftentimes these recommendations are not heard, in which case we need regulators, some heavy hand that will say: hey, you can't do this, you can't surveil human beings on a daily basis.

I'm sorry, but that is an ableist assumption as well, because having a light to indicate that I'm recording is not going to work for a lot of people. And one of the things I'm seeing about the promises of augmented reality, this person who is able not only to see and listen better, is that access to the technology is going to be a huge gap. I come from Argentina, where access to technology is so difficult, and there is going to be a knowledge gap and an access gap, and you don't want that. The thing about smartphones, the thing we already have, is that you can get to almost any site with most kinds of devices; but this is going to be really different in these immersive environments, where the hardware you have access to is going to matter a lot, and there are people who are going to be left out. For me it's like the internet of ableist assumptions, because we have always had to rely on adaptive technologies to deal with this, and how is that going to translate into the metaverse?

All right, well, we're almost out of time; we started a little bit late, so maybe we can go over for a moment. We had a couple of questions come in; I'll put both of them out there so we can try to cover them at the same time. One was how we address metadata privacy in the metaverse, and the other was Neuralink in the metaverse and VR environment. Both of them are very
interesting. For the first one, I'd love to talk about addressing the data part, because we are currently working on this XR data classification working group. It is a public working group: if you are a developer, a company, a civic organization, you can be a part of it, to understand the context and the conversation, because once we understand the context, then we can start to apply the rules. I think that's one of the key early steps; others can jump in on the other question.

Just on Neuralink: it brings up the question of BCIs, brain-computer interfaces, and a big shout-out to Kent Bye, who has done some great episodes on this. The idea that a headset could integrate BCIs and could be mining this incredibly deep information about you is extremely worrying. But one thing I would like to point out: the idea that more data, deeper data, means better inferences is not that simple. It's not that you collect data and then you can know everything about a person. There are a whole load of assumptions, a whole load of contention, about how you construct things like human emotion and what all of these things are, and a lot of the theories being used in Silicon Valley that are informing these technologies are very simplistic; they are actually reductive of our human experience. So I think it's really important not to assume that with deeper, more intimate data we necessarily make better inferences about people. We can really get to a dangerous place where we come up with ideas and conceptions of what people are, and use those to build worlds and shape how people interact with them, which can radically undermine very important things about us. There's a quote, and I'll paraphrase: the danger is not that reductive theories of human behavior are true, but that they become true. And I think, done badly, this could really be the case.

Yes, actually, when you talk about the Valley, this is one thing that I'm seeing: they
are kind of selling everything, like back in the days when cigarette products were sold as entertainment. And now it's the metaverse, combined with brain-computer interfaces, combined with artificial intelligence: this is some monster in the making. We'd better understand it now; otherwise, well, cigarettes can kill you, and it took us X amount of years to understand that. Don't let it come to the same thing.

All right, well, we're... sorry, go ahead.

Just a quick thought about mental inferences, to circle back to the promise that this cannot be abused: please keep Clearview AI in mind when you hear that. Thanks.

Thanks, everyone, for coming, those watching on our live stream or here virtually. Thanks to all of our panelists for joining us as we try to solve some of these issues. XR, VR, the metaverse, all of this will become more prominent, and it's good to get in now to ask these questions, look for answers, and try to find a future that we would want to live in, one that incorporates these technologies but doesn't turn them into a panopticon, doesn't turn them into a surveillance state looking into your mind; a future where you can take advantage of the technology, and it won't take advantage of you. And that's the future we want. Thank you, thanks everyone.