Okay, I'm Privacy Counsel at Common Sense Media, as we were just discussing in the introduction. You may have heard of Common Sense Media; we're a nonprofit dealing with children and technology. If you're an educator, you've heard of us because we provide digital citizenship curriculum and other resources for educators. If you're a parent, you might have heard of us because we do age ratings for television, movies, and video games. We produce a lot of content, so you might think of us as a content company, but we're also a technology company. We also do legislative work, and we have a Latino group. The smallest and newest team at Common Sense Media is the privacy team. I'm here to talk about some of our privacy work, and in particular some of our latest research that's literally in progress; I'm speaking at academic conferences about this in-progress research right now. I have the helpful assistance of research assistants and students, particularly law students and graduate students, working on this research about young women and privacy. As many of you have noticed, young women are becoming really active on social media and in politics. Our research is trying to move away from "let's protect students, they're helpless young individuals." Instead, let's look at teenage girls and young women, see what they're able to do online, and figure out how we can not just protect them but enable their work online. Let's see, how are we moving the slides? OK, so who we are. I went through that really quickly because I want to leave some time for discussion, so I'll go through the slides pretty quickly. The slides will be available afterwards, so you don't have to go crazy screenshotting and taking notes. As a professor, I always make sure you have these visual resources, because some people are visual learners and some are audio learners, and I use a variety of ways of getting this information across.
So we are the nation's leading nonprofit dealing with children, families, education, and technology. What does the privacy evaluation program do? In a nutshell, we do privacy evaluations of educational technology. That's where we started, and we're branching out into some commercial apps, because the category of educational technology has expanded. It's really blown up: hey, Zoom is now educational technology. All of these products that are not just Google Classroom but Amazon Alexa, Google Home, and others out in the world that were not intended for education but are de facto, or actually, being used in education. So how do we decide what to evaluate? We have over 350 schools and districts in our school district consortium. We talk to software developers, we talk to educators, we talk to parents, and we find out: what do you want us to evaluate? What do you care about as far as privacy? And we put out not just individual privacy evaluations, plus, as the slide mentioned, hands-on testing of smart speakers, smart watches, and other smart devices, but also reports, so you can see how the industry is doing as a whole. So why do we protect student privacy? I won't go through all the details of the slide, and I'll make it available to you later. But it's not enough to say we should protect privacy. There has to be a why, a reason that we care about this. Sometimes it seems obvious, but it's worth taking a deeper dive into what's important about privacy. First, content risks: the possibility that someone is exposed to unwelcome and inappropriate content. We focus on this one a lot, protecting students from being exposed to outside content. We talked to Zoom at the beginning of the shutdown about their privacy, and you'll notice additional security within that context, and in other products as well.
I'm sure that's because we've poked them. We keep poking them and getting them to increase their privacy, either by talking to them directly or because they see our evaluation and they call us and say, how can we do better? How can we get a higher rating? All of our ratings are available for free to the public on our website, and I shared our link in the chat for anyone who wants to poke around and see if a product they use, or have used in the classroom or with their children, is currently evaluated by our company. We haven't done everything, but we're out there. Next, contact risks. This is where a young person is actively involved in some risky communication, and in our evaluations we look at how to moderate and filter content so that doesn't happen as often. Then conduct risks: when a young student behaves in a way that contributes to their own risky content. That's a little more difficult to filter, but we also look at those controls. And identity; I added this one to our typical list of risks. I think of privacy as part of the self. That's more of a European way of looking at privacy than a U.S. way. The U.S. way of looking at privacy is a little more about commercial issues, selling data, that sort of thing, as with the new California privacy laws. But it's worth mentioning that privacy is a human right and part of yourself. I'll go really quickly through some of these research slides, and you can look at them in more detail later. We looked at the State of the World's Girls 2020 report about harassment and abuse online: lots of incidents, lots of concern, and even when there hasn't been an incident, young women and girls sometimes feel unsafe as a result. And there is a concern about suppressing speech.
So it's not just whether someone individually had a bad experience. If you hear about a bad experience, or you're aware that this happens online, young women and girls may not even venture online or speak their piece online, out of concern about backlash. These slides are a little hard to read; maybe they should be three slides. What's significant here is the age when online harassment first occurred. It's quite young, definitely under 18, mostly in the 13 or 14 age group, perhaps coinciding in the United States with starting high school, so that might be one of the factors. The other slides list the kinds of harassment, and it's important to notice that abusive and insulting language comes up even before something you hear about in the news, like a photo being shared. There are quite a lot of microaggressions, and abusive and insulting language comes in at 59 percent, more than half. And these experiences do affect young women's and girls' participation online and their feelings about participating. In the last of these three boxes, I'll highlight not the largest item but one of the smaller ones: trouble finding or keeping a job. And you think, wow, it's not just social stuff, not just dating and friendship. It can actually have long-term economic and educational consequences for young women. Here's a little case study I'll throw out there. I hinted at the beginning that I would talk not just about protecting young girls and young women from being harassed online, but also about what happens when female students want to take the lead and be what's called an influencer, or make their living or their economic mark online. One thing I joke about with this slide is that it may be a little outdated. You'll notice a shocking lack of TikTok on it.
Even though I had interns work on this, they may have been law student interns, so they may not actually be up on all the latest social media. I probably should have had my teenage daughter make this slide; she would have made sure to include Snapchat and TikTok and some of the other more avant-garde social media. So let's see. In our research, we'll find out what happens as more young women become active online and how we can protect not just students in the classroom but activists. One of the ways I've used this research was to speak in the context of college admissions. Over the summer, I spoke to the largest national group of volunteer college admissions folks about young women's privacy in the context of college admissions and some of the negative consequences that can happen in that situation. So not just positive branding, going out there and saying good things about themselves, but also the possibility of active negative branding of a student, including things they did voluntarily, like racist and sexually offensive Facebook messages. That's still happening. And of course, the other side of the coin is passive engagement. There are a few students out there who are influencers, but many more students are just talking about social activities, and we want to be kind and gentle to young women and allow them to have a normal social life. Girls are taking initiative. Here we are, including things like TikTok, and they are reaching out to these social media giants and saying, hey, we matter, our privacy matters, and we want to be able to report abuse and harassment without having to drop out entirely. Whoops, too fast. So this is just a list of things I brainstormed about how students can protect themselves online. There are quite a lot of things they're doing.
I'm only half joking when I say masks may actually help prevent both COVID and facial recognition; you can use them for both. And then some of the other typical privacy-protective ideas, like eliminating location tracking and considering what to keep and what to delete. I always like to remind students when I'm speaking in a classroom, particularly undergraduates and high school students: don't forget that even a snap, even though you think it has gone away, can still be screenshotted and saved. There was a case in 2014 where the U.S. government reminded everyone of exactly that about Snapchat. So it's never completely gone from social media. Around the world, other countries have taken a more active role in combating hate speech. Germany is well known for this; they're particularly sensitive to Holocaust deniers, for obvious reasons, and concerned about people spreading fake news and conspiracies. We have a harder time getting to that place in the U.S., partly because the freedom of speech in our Constitution cuts against such laws, and partly because our government is a little disorganized at the moment. Questions, thoughts? We have five or six minutes left; I'm just about to get the big warning, I'm sure. I'd be glad to talk to anyone who wants to raise issues, either in the chat or in the questions, about anything I've discussed regarding young women, privacy, and online behavior. Thank you, Jill. Thank you, Yudita. Wonderful work summarizing this. There are so many things that can be brought up in this conversation about privacy, so kudos to you for making such a great summary. Just as you said, if anybody wants to raise their hand, post a question in the chat, or even turn on your mic to ask the question yourself. Sure. I always like to hear from educators about whether they've had that experience in the classroom.
I know I've had that experience in the classroom as a professor and educator. It particularly comes up with the younger undergraduates and the older high school students, the age group we're focusing on. There is somewhat of a suppression of free speech and a concern that they might suffer online consequences, where they're smacked down because they said something that other people online disagreed with, and maybe even some real-world consequences, where people are recognized in the real world and face consequences there for their online behavior. So that is something to think about. I went to a talk a few years ago about the concept of sock puppets, which sounds really cute, you know, like something you put over your hand with a little face on it. But in the privacy world, a sock puppet is a metaphor for anonymity. And anonymity is a really interesting issue when you talk about young women acting online. With regard to gender and intersectionality issues, some people really need to preserve their anonymity online in order to be able to speak at all, so of course they do that. Sorry, sorry for interrupting. We do have a question from Tannis Morgan: has your organization gotten involved in online proctoring conversations or evaluation of online proctoring software? Yeah, that has really been in the news a lot lately, particularly proctoring software for exams. I know college admissions has generally gone without testing, with the exception of the University of Florida, which is the one holdout that has still required it. Generally, something like 99 percent of colleges have gone test-free, but if we continue to need online testing instead of in-person testing, for the usual ACT and SAT stuff or otherwise, we'll need online proctoring to prevent cheating, you know, the Turnitin of online testing.
We have been evaluating some sectors of the market. We have not specifically done online proctoring software, but we have definitely discussed it. It's definitely up next. Wonderful. And I think we have time for one more question, from Karin, who asks: how often do you evaluate, and do you go back often to check whether a tool has updated its privacy policies? What I didn't get into, but will summarize in a nutshell: yes, we do go back in. As for how often we evaluate, I mentioned that we're kind of a tech company as well, so we have in-house engineers who have built a software tool called the policy annotator. It's all open source: you can look on GitHub if you want to see the nitty gritty of the programming, or you can look on our website if you want to see the checklist of questions we ask. The policy annotator looks at the privacy policies and terms of use for all of these apps and technologies. It scrapes every 24 hours and lets us know when there's a new privacy policy or terms of use. And then we still have a human element. We're working on AI and machine learning, but we still have human evaluators doing the evaluations, which is both good and bad: it means we can't do new evaluations every 24 hours. If we were totally AI, that would be lovely, but we are not. But we do frequently, at least every few months, go in and do a whole new evaluation. And there is one more question, about IP and social media. Yeah, we cover that issue as well in our privacy evaluations, so you're welcome to look at them. It's not a huge issue, but we do give a point to companies that just take a license rather than claiming all of your content and keeping it forever.
Wonderful. Oh, I'm getting some feedback from my mic. Can you hear that? Now you sound good to me. Oh, thank you so much. And the 20 minutes flew by. There's more information, but we're right on time. I want to remind everybody that we can continue the conversation with Jill in OED Connect; I'm sharing the link with you right now. I'm sure there are so many other things you could share with us, Jill. Thank you for this presentation, for sharing this work, and for that other layer of working with students and girls in particular. It's just fascinating. So thank you very much. And everybody, if you have any additional questions for Jill, please post them in OED Connect; I'm sure she will be with us through that platform. Now we need to transition to our next presenters, if that's OK. Thank you, Jill. Thank you very much. Thank you, Jill.