So we are webcasting today, which means that this is being streamed live and also will be available online afterwards. We're very excited to bring Leah Plunkett here today. She is speaking about her new book, Sharenthood: Why We Should Think Before We Talk About Our Kids Online. It's a book for parents and other adults, but it also brings the perspective of a legal scholar with a deep understanding of privacy and equity issues and a strong grounding in technology. Leah is also an academic who has worked for and with youth. She's a faculty associate here at Berkman Klein. She is also an associate dean and associate professor at the University of New Hampshire School of Law. She is a longtime close collaborator of Youth and Media, and Leah was just telling me that she was an RA with Jonathan Zittrain in 2004, which is fabulous. She graduated from Harvard Law School, where she was training director for the Harvard Legal Aid Bureau, and she continued to work as a legal aid lawyer with New Hampshire Legal Assistance. There she founded the Youth Law Project, and she also brings a perspective as a parent to her work. And she's got her son and her family right here, so that's very exciting. So we're very lucky to have Leah with us today, and let's give her a warm welcome.

Thanks so much, Liz. It is just such an honor and a delight to be back with so many longtime collaborators and mentors and experts and old friends and soon-to-be new friends as we all have a lively conversation. So I'm going to talk for about 20 minutes and throw out three major ideas, and then we'll open it up for what I hope will be a dynamic discussion. I wrote a book for the MIT Press Strong Ideas series called Sharenthood: Why We Should Think Before We Talk About Our Kids Online. This book is the direct result both of my background as a legal aid lawyer and a consumer rights lawyer.
And also of my work with the Youth and Media team, where we looked together, and they are still looking, very closely at the ways that youth, so roughly 12- to 18-year-olds, are engaging in our digital world, as well as a number of the adults around them. And I began to be increasingly interested in and concerned about what the adults around them were doing. And so I embarked on a conversation starter. The Strong Ideas series, which David Weinberger, who's also part of the Berkman Klein community, has been stewarding, is designed to throw out provocative ideas about technology and everyday life, by academics and experts, for folks to have general discussions about. And the ideas I'm about to throw out, I first tested at a fellows' lunch, I don't even know how many years ago, so it's wonderful to come full circle. So my ideas today are going to focus on parents, play, and predictions. Before I move into those, a quick question, because I'm a law professor and I have a captive audience, so I'm going to put you all on the spot and ask, by a show of hands: how many people have heard the term sharenting? I'm glad my son has, because I talk about it a lot. So sharenting: I did not invent it, I got asked that recently by a reporter, so I can claim no credit for that. I can claim credit for using it slightly differently. But just so we're all on the same page, the way the term sharenting is typically used is to refer specifically to what parents do on social media. So it is confined to parents and social media. I think that that's a very important part of sharenting, and I will talk about it, but my understanding of sharenting is both broader and deeper.
I think to really capture the full extent of the ways that kids' private information is acted upon digitally by the grown-ups around them, we need to understand sharenting as being carried out not just by parents, but also by grandparents, teachers (we started to talk a little bit about school just now), coaches, in-laws, and other trusted adults. And it needs to be understood as all actions taken with respect to children's personal digital information. So not just social media, but also Fitbits and smart home devices and other digital technologies. So now that I've made sharenting broader and deeper, I am going to come back to my first provocative conversation starter and talk about parents. And I'm going to say that parents in particular pose an underappreciated risk to kids' privacy and their current and future opportunities. And I include myself in the parent category, so a big part of this book was writing about a conversation that I was so engaged in and inspired by professionally, but also one that I was increasingly having personally with my spouse, with my friends, with family members. So I say, as a parent, I think we are a well-intentioned group. At worst, we are a little bit careless just in the course of daily life. It is difficult, verging on impossible, when you are trying to figure out whether this app is safe to put your child's information in, while you're also answering a work email, making dinner, letting the dog out, and doing a million and one other things. So we're trying our best, but we as parents right now are not the best gatekeepers. And that's tricky, because the law in the United States gives parents supercharged, constitutional-level protections around whether, when, with whom, how, and why to have and raise kids. And, as part of raising kids, whether, when, with whom, how, and why to share information about them on social media, through apps, through devices, through smart home affordances.
So the United States Supreme Court has made this very clear. I'm not talking about the digital world right now, but just very broadly, the supercharged protection that parents enjoy has come up time and again in the Court's jurisprudence. And the Court will say things like: the child is not the mere creature of the state; those who nurture him and direct his destiny have the right, coupled with the high duty, to prepare him for additional obligations. That's Pierce v. Society of Sisters, from 1925. So gendered in the use of "his," but that same deeply rooted idea: that there is a high duty on parents to prepare children for their destinies beyond what the government might require as a bare minimum of child welfare and child education. So that deeply rooted idea, that we as parents are entrusted with the high duty of being gatekeepers between our kids and the world, enjoys a lot of support. And I would argue it makes a lot of sense if you think about the different potential gatekeepers for kids' well-being, to paint with a broad brush: you have parents, you have the state, and you can sort of subdivide those into categories, but as between parents and the state, I do think we want to tip toward parents as the source of protecting their children and defining their destinies. Now, that concept has been pushed through into our digital world. And in general, when it comes to whether, when, with whom, how, and why to share our kids' digital data, we parents get to decide all of that. We're subject to the outer boundaries of criminal law and other, what we might call freestanding, laws: other laws on the books that don't have anything to do with privacy in particular. So let me give a few examples of legally permissible sharenting, and examples of where we get to those outer boundaries of illegal or criminal sharenting. So it's perfectly permissible for me to take an ultrasound picture and put it on Facebook.
Perfectly permissible for me to take my child's exact time, date, and location of birth, height and weight, circumstances of birth, and full name, and put it on Instagram with a picture. I can also make a YouTube video of the whole labor and delivery process and put that up. And if I monetize it correctly, I can potentially wind up with millions of followers, start building a business around it, and enter the commercial sharenting space. I get to make those decisions as a parent. The law is not going to step in and regulate my ability to do that. The law would step in if I were, heaven forbid, to make a video of myself doing something to or around my child that was criminal or illegal. So there have been some high-profile examples of parents engaging in this type of behavior, filming it, putting it on YouTube, and then actually having viewers step in and alert child protective and child welfare services. DaddyOFive? Has anyone heard of DaddyOFive? No? Okay. So that was a couple of years ago. That was a YouTube channel that amassed a good half a million, three quarters of a million followers, and it was a family prank channel. And their pranks crossed the line into what the judicial system found to constitute abuse and neglect of a couple of the children in particular. Viewers alerted the authorities. The children were temporarily removed from the home. The YouTube channel came down. But even outside of this commercial sharenting space, where you do cross that line into illegal and criminal acts, parents can push the envelope pretty far. So, DaddyOFive, I'm actually glad no one's really heard of them. Who's heard of Jimmy Kimmel? Who's heard of his Halloween candy challenge? Right. So, we just kind of finished this Halloween period, and every year Jimmy Kimmel, the celebrated late-night host, issues a challenge where he says: parents, this will be really funny.
Take your kid's Halloween candy, hide it, tell them you've eaten all of it or thrown it away, film their reactions when they understandably freak out, and then send them in, put them on my website. And if you do a really good job, then I will select you and maybe even show it on the show. That's legal. Now, keep in mind that if it was an older student doing that to a younger student in a K-12 public school system, we would very likely be thinking about behavior that would meet the legal definition of bullying and actually require the school to take action against the student perpetrating that kind of, you know, ostensibly prank but really behavior designed to intimidate, harass, and cause emotional disturbance. But parents can do that. So the other thing that parents are legally entrusted with doing is stepping in when other institutions or individuals want to share data about their kids, to the extent that there is a legal framework in place. Now, there are many spots right now in our digital world where there's not really a robust legal framework in place. So if I'm, you know, with my kids on the playground and I see another parent taking pictures of all the kids running around together, I can sort of, as a matter of norm or practice, go over and say, could you please not do that? You know, we don't put pictures of our kids online. But there's not, you know, some 800 number I can call where the Sharenting Squad is going to show up and take their phone. But to the extent that there are spaces where there is legal regulation of other people or actors wanting to share private information about kids, its guiding principle is parental consent. So a big example here, and actually where my work with the Youth and Media team started, is schools. Schools are subject to federal and state student privacy laws. The big federal ones really predate the digital era, but there are a lot of state ones.
In fact, I think from 2013 to 2015, there were roughly 300 bills considered around student data privacy, and a number of those were actually passed. So when you're talking about sharenting within the school system, a teacher wants to use an app, a school administrator wants to track attendance data, the legal framework in place, to paint with a broad brush, is that parental consent is needed if it's personally identifiable information from an education record, unless an exception applies. And interestingly, because we parents can be somewhat lousy gatekeepers, because we just don't have the time or the technical ability, mixed with the legal ability, to read all the terms and conditions and the privacy policies, if your child is in a school system where you have a really strong interdisciplinary team making those data privacy decisions, your child might actually be even more protected at school than at home. And I do think that we as parents, and we'll talk more about this in our discussion, there's room for all of us as parents to be better informed and take more responsibility. But it's also the reality that a big part of this is that parents as gatekeepers made a lot more sense in the brick-and-mortar era, where you could very clearly see the boundaries between home and outside, the boundaries between this is our community space, this is our school, this is our playground, and everyone else.
And now, when you have products on your desk, the smartphone in your hand, the Gizmo watch that you give to your child (sorry, Sam, I know I haven't given you one) on their wrist, a fertility-tracking app, a fertility-tracking bracelet, an Alexa, an Echo Dot, a smart fridge, a smart TV, and I haven't even gotten into the stuff in schools, there are a lot of things that are in our homes, or even that we put on our kids' bodies, like a smart diaper (and that's real, I did not make that up), that are not actually keeping the information within the protected sphere of the home. And these decisions can have real consequences for kids' current and future life opportunities. So, in a very concrete example, when parents post on social media and others in the community can see that in real time, that matters. One of the things that has come out as I've done a number of these book talks now in different settings is the problem of parents in parent Facebook groups, for a neighborhood, a school, or a community, going off about issues that their own kids are involved in or that other people's kids are involved in, and creating these digital trails that can actually paint either their own or other people's kids in a pretty negative light. After this book came out, I got a call from a dean at an undergraduate campus saying: I feel like our undergrads are doing a pretty good job with this, but we have a parent Facebook group that's out of control. I really wish we could get the parents to stop saying things in this group like, oh, my child had a run-in with the police, my child had a bad breakup. Could you write something that I could share with the parents? I said, that's a great question, sure.
And then, as we'll talk about a little bit more when it comes to predictions, of course, we as parents really don't have any way of knowing where the data that we're sharing is going in terms of being used, not just by the tech providers that we may think we're giving it to, but by third parties that they may be giving it to, by data brokers who may be aggregating it, analyzing it, and acting upon it, now and in the future, in ways we can't predict, control, or understand. So that's my first provocative point. My second provocative point has to do with play. And I argue that childhood and adolescence, as protected life stages to play, are being threatened in today's digital era at the hands of adults, and that that threat in turn threatens kids' ability to grow up with their own sense of agency and autonomy and become the people that they themselves are meant to be. And when I'm talking about play, I'm not just talking soccer field or playing pretend. I'm talking more broadly about spaces that are experimental, iterative, inclusive and equitable, and protected. And the American legal system has a recent, I'm not going to go back before the 19th century here, but a recent tradition of treating minors as deserving of heightened protection in our justice system. Minors typically can't enter into contracts unless it's for necessities. Minors have child labor laws at the state and federal level that limit their ability to work. We have a juvenile justice system that is designed to be rehabilitative rather than punitive, so that if a child violates a law in a way that would be called a crime if an adult did the similar act, with a child we call it a delinquent act. We go to the court and say, is it true or not true that the child committed this act? And then we put a rehabilitative system in place, although as a former legal aid lawyer who represented kids, I will query how well we actually do that, but that is at least our commitment.
And key to this commitment is a recognition that children are still learning. They are going to make mischief, they are going to make mistakes, and ideally they will grow up better for having made them. But there is limited or no way to learn from the experience of mischief or the experience of mistakes if there isn't any protection from an adult gaze, whether it is adults that know you or adults that don't know you, and an ability to have the fun-but-a-little-misguided, or downright terribly guided, things you do as a child forgotten. And somewhat counterintuitively, the more we surveil, track, monitor, record, and just share and post about what our kids are up to, the more difficult we are making it for our kids to grow up with agency and autonomy and space to develop their own sense of self. Now, is there a direct line between giving a child a smart teddy bear, or having a child's first words be to an AI assistant? How many of you heard 1A last week, the NPR show? They had a great segment with a professor, I think of media studies or linguistics, talking about kids whose first words are now to an Alexa, a Siri, or an Echo Dot, and what that means in the domains of child development and attachment. I'm not going to step out of my legal zone, but I'll just flag that for folks who might want to listen to it. Is there a direct line between that and saying that kids have a compromised ability, or an inability, to have the privacy to play and develop their own sense of self? I'm not going to say there's necessarily a direct line. My argument here is not precisely causal but bigger-picture and conceptual. As we increasingly and casually share intimate information about our kids digitally, with other people or other institutions, we are moving away from protected spaces in childhood, both because other people may see things as they're happening.
There was a Microsoft research team study that came out this fall, in October, that found, in a sample of teens I think it was, that over 40% of them reported having issues with their parents posting about them on social media. They were aware, sort of in real time, of a sense of exposure and even a violation. And also because data about issues that kids have, data that's supposed to be kept private, can get out and may be used against them somehow, now or down the road. A big category here has to do with surveillance and tracking and monitoring. Parents do those things. The Washington Post just had a great article about the Life360 app that a lot of parents are sending their kids to college with, which is really sort of a comprehensive surveillance app, and the Post did a great job of interviewing parents about why they wanted the app, and kids about how much they hated the app, and the parent-child tensions that were arising when kids would disable the app or go out of bounds of the app, as of course they inevitably do. We know that in addition to the devices you can use to track your baby's sleep, or have a digital video camera in their room, or a smart diaper on their tushy, we know that schools are doing quite a lot of this. There was a great article this summer from Education Week looking at the massive use of digital surveillance and tracking that's being done in schools, and that, under my definition, is sharenting.
Those are educators, those are trusted adults, taking action with children's private digital data. And one thing really stuck out to me about that article: there's a product called Gaggle, not Google but Gaggle with an A, that monitors digital content created by nearly 5 million U.S. K-12 students, so all files, messages, and class assignments on school devices or school accounts. And all that data goes through machine learning to start with; it goes through algorithms that automatically scan that information looking, and this is a direct quote, to see "if something bad is about to happen." If Gaggle's machines think that something bad is about to happen, they escalate it for human review, and at that point the humans can decide whether something bad is about to happen, or whether someone might be doing a research project on weapons in the Roman Empire. But that is just one example of the ways, really behind the scenes, that information about our kids is being taken from a space that is supposed to be protected, a school that's supposed to be a source of learning, and fed into a digital product that is designed to make predictions about them: are they about to do something bad? And I'm not trying to downplay the importance of school safety at all, but just to surface that we're moving well away from protected learning environments in schools when we think of protection as more than "is something bad about to happen."

My last point has to do with how kids growing up today are already subject to, and at risk of, far greater data-driven predictions being made about them and their capacities for success and trajectories in major life domains, including education, I just gave one example, and employment. So I'm going to read a hypothetical, because law professors love hypotheticals, and then I will stop talking and open it up to Liz and others. Here's my thought experiment from the book, a near-future hypothetical scenario: not real as of when I checked yesterday, maybe
real today, I mean, things do move that quickly. So you're helping your 17-year-old finish her college applications. The applications require her SAT score, SAT II scores, AP scores, and her Tikebytes personal capital scores. What the heck is Tikebytes? Siri tells you that Tikebytes serves as your child's passport from her past into her future. You ask Siri to stop reading the Tikebytes sound bites and do some digging. The response: Tikebytes is a commercial database that serves as a repository of childhood data and a clearinghouse into adulthood. Tikebytes aggregates as much data about each child in the country as possible and then packages the data for purchase by different types of institutions and individuals. The most popular product is a set of scores that rates children's likelihood of future success in a range of areas, including education, athletics, and employment. Tikebytes will share these personal capital scores with any individual or institution that pays for them, isn't legally prohibited from having them, and demonstrates what is, in Tikebytes' opinion, a legitimate need for them. You and your daughter don't need to do anything to have these scores sent; all colleges that receive applications from her will request and receive these scores, at no cost to individual applicants. Tikebytes does allow parents, and youth age 18 or over, to opt out of having Tikebytes collect and share their information, but the Tikebytes website warns you that opting out risks your child's future. After all, the perky chatbot in the "click here for help" section tells you, an applicant without Tikebytes scores is like a car without airbags: you could take it for a spin, but why risk it? Right. So what do we think? Good, bad, ugly?
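Purely as an illustration of that thought experiment, the mechanics, scattered childhood data collapsed into one opaque score, can be sketched in a few lines of Python. Every field name, weight, and threshold below is invented; no real product is being described.

```python
# Toy illustration of the Tikebytes-style hypothetical: heterogeneous
# childhood data points, collected from different sources, aggregated into
# a single opaque "personal capital" score. All fields and weights invented.

from statistics import mean

child_records = {
    "fitness_tracker_avg_steps": 4200,      # from a wearable
    "school_flags": 3,                      # from a school monitoring product
    "social_posts_tagged_by_parents": 180,  # from family social media
}

def personal_capital_score(records: dict) -> int:
    """Collapse the records into one 0-100 score using arbitrary,
    unexplained weights -- the applicant never sees how it's computed."""
    normalized = [
        min(records["fitness_tracker_avg_steps"] / 10_000, 1.0),
        1.0 - min(records["school_flags"] / 10, 1.0),
        1.0 - min(records["social_posts_tagged_by_parents"] / 500, 1.0),
    ]
    return round(100 * mean(normalized))

print(personal_capital_score(child_records))  # → 59
```

The point of the sketch is that every extra "flag" a monitoring product recorded years earlier mechanically lowers the score, with no visibility into the weighting and no context for what the flag actually was.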
We'll turn it over to you and everyone else for a chat.

Well, I'm thinking then I'll follow up with a school question, please, since we left it with a school topic. So there's a lot of excitement about schools needing to kind of get with the current century and having technologies be part of that. And, you know, in order for kids to have an extended experience with technology, that means they get tracked, right? They create an account, and the account is set up by the teacher, or maybe by the school, probably the teacher. And so how might we think about this data? And how might we say, if we're thinking from a legal perspective, as a law student or someone wanting to get into this area, how might we think about what's in place now and what we might want to have in place in that context?

Wonderful question. So when we're thinking about student data in particular, we do have three big student data privacy laws at the federal level: the Family Educational Rights and Privacy Act, or FERPA; the Children's Online Privacy Protection Act, or COPPA; and the Protection of Pupil Rights Amendment, or PPRA, which is sort of the often forgotten cousin of the other two. But to paint briefly with a broad brush: if you're thinking about a setup where a teacher or a school wants to create an account for a child, that is transmitting personally identifiable information from an education record, and that's a pretty broad category of information, outside the school, and transmitting it through a digital app or software provider is transmitting it outside the school. They're either going to need parental consent or, more likely, they're going to use an exception to the parental consent requirement, such as the legitimate school official exception. So a huge area of need for attorneys who work at the intersection of tech and education is for strong, meaningful, clear contracts to be in place between the education institutions that are using these digital products and services and the product or service provider
because the legitimate school official exception only works lawfully if the school is using that account to do something that would otherwise be done in-house, the third-party provider is under the direct control of the school, and that's the need for a negotiated, clear contract, and the third-party provider is not using that data for anything other than the contracted-for purpose. COPPA would kick in if the device that the account was created for was then being given to a child under 13 who was going to be using a commercial app or software that either was targeted at kids under 13 or knew that it was getting information from kids under 13. That's again an area where you need folks on the education side and on the tech company side, not to mention the regulator side, because we have a bunch of state laws and regulations in place, to make sure there are clear, negotiated contracts. The Protection of Pupil Rights Amendment would start to kick in around certain types of products or services that are really functioning as surveys, that are getting very sensitive information, like relating to religion or similar beliefs, or certain types of apps that are collecting information that may be used later for marketing or advertising. And with the explosion of activity at the state level, where I mentioned earlier we had 300 bills considered on student data privacy among the states from 2013 to 2018, that five-year period really being a time of very intense activity in student data privacy, we started the Student Privacy Initiative right around 2013 to help lead that conversation, 25 states actually passed 59 new laws during that time. And we continue to see growth in state youth privacy laws that don't just apply to kids in their capacity as students, a big one being the California Consumer Privacy Act, which will go into effect in January, that would also bear on the space. So for law students, or, you know, lawyers thinking about spaces to get into, helping to figure out how to
negotiate and draft those contracts, on either the school side or the vendor side, as well as entering into the space of state and federal regulation and law enforcement activity, there's a lot going on.

That's a great answer. Another thing that I've been thinking about, that we've been talking about a bit here, in the spirit of hypotheticals, is one around Bitmoji, which is a little app that allows you to create avatars that you text. And so it's embedded into texting, and it looks like it's just texting back and forth, and it's making a funny face for you that looks like you, or like an alien, or whatever you want to look like. But it's actually owned by Snapchat, and so it is allowed to collect all your keystrokes and your contacts and your location and your accelerometer data. And so it could know, we don't know what it's actually collecting, but it could know an awful lot about you. So this is something that kids and tweens are very excited about using. They can't tell that it's part of this sort of social network ecosystem, because it looks like it's just in your text; you don't see it there, it looks like it's in a different spot. How should we think about this?
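The concern in that question can be sketched concretely: an app embedded in the text stream may have permission to bundle far more than the message itself into a single upload. This is a toy illustration only; every field name and value is invented, and we don't know what any real app actually collects.

```python
# Hypothetical sketch of how an embedded keyboard/avatar app could bundle
# separate permission streams into one telemetry payload. All names and
# values are invented for illustration.

import json
import time

def build_telemetry(keystrokes, contacts, location, accel):
    """Assemble one upload payload from the separate streams the app
    has permission to read."""
    return {
        "ts": time.time(),
        "keystrokes": keystrokes,    # everything typed, not just the avatar
        "contacts": contacts,        # who the user talks to
        "location": location,        # where they are
        "accelerometer": accel,      # how the device is moving
    }

payload = build_telemetry(
    keystrokes=["h", "i", " ", ":"],
    contacts=["mom", "study group"],
    location=(42.37, -71.11),
    accel=[0.01, 0.02, 9.81],
)
print(json.dumps(payload, indent=2))  # what could leave the phone
```

To the user it looks like a message went out; in this sketch, the serialized payload carries the whole bundle.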
That is a perfect example of sharing that happens in a stealthy way, and stealthy by the provider. So as a parent, and I will confess, actually, until we started talking about this, we hadn't realized the extent to which that actually was not just confined to my text stream, so this is education for me as well. But it surfaces a point that in some ways cannot be overstated, which is that it is next to impossible, and I will say potentially impossible, for any parent, even a privacy nerd parent, even a lawyer parent, even sort of the most vigilant parent, to have the time and the expertise to combine all of the domains that would be required. So let's broaden out now and say this isn't something that's being talked about in the news. To try to figure that out for yourself, you would have to figure out where the privacy policies are for Bitmoji, where the terms of use are. That can be hard just to find, as a matter of access, and then once you've seen them, good luck understanding them, and I say this again as a privacy nerd. And even if you do parse the whole way through them, there is inevitably some sort of get-out-of-jail-free language for the company, where it's saying: we're not going to share this except to improve your user experience, we're not going to share this except with our affiliates, we're not going to share this except for research and development of new products. And as soon as you start to see language like that, it's like, well, okay, I've just spent hours or days of my life finding the language, reading the language, attempting to understand it, and it's crystal clear to me that you're reserving the ability to use this data now and in the future in ways that I can't see and I can't really control or predict. So I think that's an example that really encapsulates the weakness of this parent-as-gatekeeper model that we are using right now for youth privacy.

I gave you a hard one, but it's a really interesting both sort of technological and legal question. It's
also a parenting question, like, what do you actually do? So as a parent, and someone who must get asked what should a parent do all day every day, what should a parent do when faced with a very, very complicated situation like this? The child probably just wants to send Bitmoji to their friends. Why can't I do that, Mom?

Well, as my son has heard me say, when it comes to private information that's leaving the house, I think parents can err on the side of caution. And I don't mean become a Luddite; I don't mean, you know, break your phone and turn off the wireless. But I do think there are a couple of parenting hacks, if you will, that we can all keep in mind. One is, I would stay away from any device or service that purports to act as a surveillance or monitoring device. I think that we just don't know enough about the sort of cybersecurity issues around those, in terms of their vulnerability to third parties getting at them. I am particularly sensitive to the idea that any company is going to know, no matter what they promise me, where my child is and potentially what they're doing. And particularly as a former juvenile justice lawyer, where I watched kids get pulled into court, no kidding, once it was because my client had shoved someone else's books on the playground and the books had gone flying, and it was assault. I once actually, as lawyers in the room will appreciate, didn't quite step in between, because I didn't want to have to tell my executive director that I'd been arrested for obstructing justice, but I stood right next to a 16-year-old female client who had slammed the door of a conference room. We'd been in a very difficult IEP, individualized education program, meeting for her special education needs. And she had documented, I'd gotten a Dartmouth psychiatrist to document, the depth of trauma history and emotional disturbance she was experiencing. And she had done a great job. There were like 12 grownups in there at a table talking about really intense stuff. She got up and she
got frustrated and she slammed the door. She didn't break anything; she slammed a door. The school resource officer, the local police officer embedded in the school, showed up and first told her she had to go home. I said, why does she have to go home? We're fine now. I'm here, her mom's here, the special education team is here; this is exactly what we're supposed to do. He said he'd seen her before and she was going to blow a fuse later in the day, but I just said, thank you very much, officer, I have this. He sort of huffed and puffed, and he let her stay with me. Then he said, if you weren't here, I would have arrested her. I said, can I ask you for what? What would the charge have been? He said, disorderly conduct, for slamming a door. I looked at him and just said, very vigilant. He looked at me with this sort of, are you complimenting me or criticizing me? I just smiled and let him take it as he wanted to take it, and he left. But it was just the luck of the draw that I happened to be there that day, and that I had gotten a lot of training from Harvard Law School in how to stand firm and have tough conversations. In an era where, particularly for kids who have disabilities or who come from lower socioeconomic status or who are minorities in their communities, we are already hyper-vigilant about responding to normal things they do as kids and teenagers with this sort of outsized response (slam a door, disorderly conduct), the last thing that I would recommend parents do right now is buy a commercial product that says, we will keep track of where your kids are, and potentially what they're doing in some way, shape, or form, so that if they're outside the boundaries you set for them, we'll let you know. Right? Well, the last thing I want to do is create any sort of digital trail, not just about where my child is, but one that purports to tell me when my child might be out of bounds, because I have no confidence that any commercial provider
would keep that data just for me to use with them, no matter what they say. Would I then be serving up digital data that could go to a data broker now or five years in the future, and from that data broker be aggregated into some sort of predictive software for hiring? Cathy O'Neil does a great job of talking about this in Weapons of Math Destruction: the industry that does sort of pseudo personality tests, the digital assessments designed to see what kind of fit you are, which 60 to 70 percent of Americans applying for jobs are now taking. Well, I don't think we're that far from these kinds of assessments getting data that we as parents have put out there about our kids, thinking it's going to just one company, thinking they're keeping it to themselves, and before we know it, maybe not all of it, maybe not from every product, but enough of it gets aggregated into those kinds of gatekeeping, predictive products. So I've unwittingly served up: oh, this child was always out of bounds when they were 15 years old; okay, well, that's a finger on the scale of not being a good candidate for the job. So I would say, as a parenting hack, don't use surveillance and monitoring products.

I also would not put any pictures online of a child of any age in any state of undress. I think that invites an unwanted gaze, as we saw this summer from research that came out from some in the Berkman Klein and other communities about what the YouTube algorithm was doing: pushing viewers from content that is innocuous or innocent involving kids, the nice summer day at the beach, to content that is predatory. That's just one example of the way that an unwanted gaze, either by an individual trying to find that kind of content or an algorithm trying to captivate interest, can go. And last, and maybe least: if you are going to give updates on social media, maybe this makes me old-fashioned, but
I use a holiday card rule of thumb. My grandma used to put out these long, one-to-two-page typewritten mimeograph newsletters from wherever she was in the world (my brother's here too; our grandfather was in the Foreign Service), so she would have these big updates once a year, and they went to everybody: the second cousin three times removed (my cousin's here too), former colleagues, former bosses. It was really for public consumption. And this is less about keeping things safe from downstream unintended uses and more about responding to the increasingly voiced concerns and emotions and reactions of youth themselves: I don't want people to know that I'm nervous about getting asked to the school dance; I don't want people to know that it really took me a long time to be potty trained. Or go to YouTube and search for "girl gets her first period"; they have those mother-daughter chats on YouTube, and kids don't like to see that stuff. So I would keep that stuff private.

That leads very nicely into my very last question, and then I'm going to open it up. One thing I appreciated about your book is that I could see your experience in legal aid and working with youth, and surely dealing with the juvenile system, the child protection system, and the courts. I wonder what you might say to people who are thinking about going into that space. What are ways in which they can be advocates, or what are ways in which you would want to see policy change, or what can be done in that space?

Thank you for that, because without that I run the risk of sounding very pessimistic, and I'm actually fundamentally an optimist. I am optimistic about this because I think that, on the whole, digital life is a positive thing for people and institutions and for the world, in terms of the ability to build connections, spark innovation, and really transform everything about how we live, work, play, and so much
else. And I think when it comes to kids in particular, there is a truly heightened sense of individual and societal responsibility that we feel. So I'm optimistic now that we're increasingly having these conversations: all of us in this room, everybody who recently read and commented on the New York Times piece on how photos of your kids are powering facial recognition technology. There is increasingly robust public discussion about this. What I would challenge all of us to do, in our personal and professional capacities, is to be increasingly guided by values-based decision making that is conscious in our minds. Are we going to have the ability to peel beneath the layers of Bitmoji, or whatever the next thing is, every time? No. But do we have the ability to say, within our families, we value protected spaces to play, so we are not going to bring digital technologies into those kinds of private spaces unless they serve a really compelling need? Then we can start to broaden out that conversation and say to our regulators and our lawmakers: you're not doing enough to address the issues of downstream uses of children's digital data. California is about to be doing a heck of a lot more, but even California's law suffers from some big holes, in terms of not applying to enough tech companies, and I think still potentially making it difficult for parents and kids to find what a company has about them. And I think we should ask, all the way up to the federal government (when the federal government starts functioning again), for a more comprehensive youth digital privacy law that would regulate at the level not of parents, or even schools, but of tech companies themselves, limiting the uses, particularly around predictive decision making and other gatekeeping functions, that they can make of data they have collected either directly from or about minors. And I think that has to have a robust enforcement mechanism, a private right of action, so you can turn the litigators loose. So that's another
one, Liz, for law students and lawyers looking for a career focus: there is always room for a ferocious, creative litigator in a space, even if there isn't some sort of new comprehensive federal youth data privacy law. So I am excited, now that we're thinking increasingly about these sharenting questions, to see what the next generation of lawyers does with perhaps some existing causes of action constructively applied in new ways. I don't really see a space, and I hope there's not a space, for a lot of kids growing up and suing parents, unless it's been really egregious, but I definitely would look and hope for spaces for some theories of kids growing up and suing the tech companies or the gatekeepers that may have acted upon that digital data.

Great. Well, with that, let's open it up to questions. Anyone? Ruben has a mic and Megan has a mic. I've got one over here by you, Megan.

I'm wondering how, kind of stepping away from the parents, this plays into issues where, say, a parent is following these recommendations that you gave, or say they don't want any information about their children online, like pictures (I'm specifically thinking of pictures on social media). But I hear in conversations a lot, and this might just be anecdotal, that oftentimes aunts, uncles, and grandparents are some of the worst perpetrators of sharenthood. How does that play into this? Do you see any inroads for ways for parents to enforce their subjective values about their child's privacy on other parties?

I'm smiling because I have been getting that question increasingly, and so I feel the need to write something specifically about this before the holidays; hopefully I'll get something up on my Psychology Today blog. I will also just put in a quick plug: I've been talking to a Wall Street Journal reporter who wants to do a story on this and wants to talk to parents or grandparents or other family members who are
willing to go on the record to talk about that. So if any of you are, or know people who are, my contact information is online; please email me and tell me. I get this question a lot now that I'm talking about the book, and between my admittedly anecdotal experience and this Wall Street Journal reporter's question, there's clearly a lot there to unpack. What I have been saying to people is to take a deep breath and have the conversation in a way that depersonalizes it and normalizes it. Whether it's: hey, did you see what the New York Times reported about how photos of toddlers from 2005 are informing what the Times called bleeding-edge surveillance technology? That's creepy and weird. I trust you, you're an amazing aunt, uncle, cousin, brother, whomever, but I really don't trust big tech, so can we just keep it among the family? Or: I heard this crazy law professor speak; she's going on and on about smart diapers and fertility-tracking apps, but you know, she might not have been completely crazy, and maybe we shouldn't do this. I think we just have to take a deep breath, depersonalize it, and try to normalize it. You were talking, Elizabeth, about parenting challenges, and I think those of us who are parents or caretakers of kids at all are really used to, and in fact seek out, conversations like: hey, your kid's coming over for the first time, any food allergies? Or: I want to show the kids a movie tonight, are you comfortable with something rated PG? We're kind of used to those conversations, and I think we all need to, as uncomfortable as it is, make this the same kind of basic conversation you have, either on behalf of your own child or if you're watching someone else's child. Actually, when I was working with the publicity folks around this book, one of the things we briefly talked about this summer was, as sort of a giveaway with the book, should we have a sticker that said something like "no sharing" or "safe sharing" that you would put on your child? And we decided that,
unfortunately, in today's world, that would just make those kids targets for having their pictures taken and then being trolled and doxxed and harassed. So I did not do that, but I think we can all kind of try to verbally put the sticker on. And if you just want to talk to the Wall Street Journal, you can also come see me afterwards.

Hey, Leah, thank you so much for the talk. It's great to see you. Sorry I've popped in a little late, so maybe you touched on this, but I'm curious whether you saw any generational differences between, say, younger mothers who may have grown up with social media in a particular way versus older mothers.

It's okay, I'm a mother, I'm happy to talk about mothers, but to broaden it to parents: I do see some differences. For parents in my generation (I'm 40, the tail end of Gen X), I didn't have email until college; I didn't have a cell phone till after college. I know a lot less than people younger than me when it comes to this. And it kind of gets back to something I shared earlier about a dean from an undergraduate college calling me and saying, you know, my students are doing fine; it's the parents of the students that are getting them into trouble. Kids today are learning a lot about digital citizenship and digital privacy; there are wonderful curricula that youth and media has developed, that Common Sense Media has developed, and that other trusted partners have developed. So when it comes to the kids who will soon be parents, or who are kids of parents my age, they may actually know more and be in a position to teach parents. So in addition to definitely seeing some differences with parents who are younger than I am, in their 20s and 30s and somewhat more sophisticated, no matter what age we are as parents, we have so much that we can be learning about digital citizenship, which includes digital privacy, from our kids.

We've got one over here. Is that a hand over here? While the mic is coming,
I will say, I gave this talk recently, and I had a more senior member of the audience say to me, this is very interesting, can you talk about what the kids and teens are doing in terms of sharing pictures of themselves? And I said, there are some wonderful scholars and experts who talk about that; I'm not talking about that today. I'm talking about you and all the other grownups in this room. So, yes.

Hi, thank you so much for your talk. I just wanted to ask: I feel like a lot of what we're talking about sits between safety and privacy, which we see in a lot of other spaces. Coming from a background doing work in child exploitation, there really is this feeling of, get everything offline, but also this feeling of, I want to know what my kid is doing, I want to know where my kid is, and this feeling that the way to keep them safe is to surveil, which then creates all those other privacy issues. So where do you think that middle ground lies, between parents feeling like things are spinning out of control and wanting more control over their kid, but also wanting to make sure that we're respecting kids' privacy?

That's such an insightful question, and you see it in schools as well, right? They're saying that in order to promote safety, we have to do more and more surveillance, more and more monitoring, and we can query whether that's even addressing the safety concern, and it certainly can be creating other downstream problems with privacy, and potentially safety, if the data being collected is breached. I would come back to this idea of a values-based analysis. Even before you get to the question of whether this particular product, this particular watch, is a good idea, think at the level of values: safety and privacy, but also autonomy and agency. How do we balance those to try to further our goals here? I think one piece that is often left out of the safety-and-privacy discussion is this agency and autonomy piece. Parents are not going to be around all the time;
school personnel are not going to be around all the time. Alright, I'm going to share something about my son while he's in the room. You're there, okay? No talking back, but I'm going to share. You recently tried to get me to let you walk to school by yourself by saying that if I gave you a Gizmo watch, I would know where you were, right? And I told you that I wasn't going to do that, because I don't believe in surveillance and tracking. You also said to me, and if I get into trouble, I can call you. And I said, buddy, you can always call Mom or Dad about anything, but actually, when you are old enough to walk to school and you, heaven forbid, do get into trouble, you can't call me first. You need to find a police officer, a crossing guard, a parent, a teacher; scream, run, make noise, even use your karate, right? We talked about that: come up with a solution space around agency and autonomy, because even if I know where you are, and even if you can call me with the push of a button, I may be 10 minutes away, and that 10 minutes may be too long. So I would say: have a values-based discussion before you get to the level of coming up with a tech solution, and make sure that those values promote agency and autonomy for kids, and also equity, so that, particularly as we move from a home setting into a school setting or a law enforcement setting, we are not unintentionally engaging in inequitable solutions.

Sorry, Sam, there is someone over here, if you want to, and I think this is the last one.

So, on the thought that we may all be getting more comfortable with this idea of having information about us out there: I think one of the ways we reconcile it is recognizing that even as we may, in certain domains, be accepting more and more transmission of data, whether it's about our kids or about ourselves, once people of all ages peel back the layers, they actually become less comfortable. So I certainly accept that, in general, we are seeing a paradigm of letting
more information out; you know, as Cathy O'Neil's book notes, two-thirds of American adults have Facebook accounts, and that's just one of many data points. I'm not going to say that your general characterization of the paradigm is off; I don't think it is. But I think as you peel back the layers, when you see the Cambridge Analytica scandal break, when you see, as a parent, another letter in the mail from the school district saying that Pearson got broken into over the summer (a real-world example), you do start to become less comfortable and ask questions and raise challenges. And I think actually that our kids and teens are often savvier about this than we are. Think about the Finsta, the fake Instagram account so popular among teens, where they're very savvy about privacy, and the youth and media team does a wonderful job unpacking this: they can curate and customize how they present in different spaces. Because they are using digital services and products, they are letting some data out, but they're doing it in a very sophisticated, very curated, very principled way. So I think there's room for that kind of thoughtfulness and a more complex response.

So I think with that, we have to end. Please do come up and ask the rest of your questions, and I would encourage everyone to get Leah's book right outside. Maybe we can all give Leah a thank-you for this wonderful talk.

Thank you. Thank you, Sam.