Hello, everyone. I'm Urs Gasser. I'm with the Berkman Center, and I'm also on the Harvard Law School faculty. I have the great pleasure to welcome you, together with my colleagues Alicia Solow-Niederman, Leah Plunkett, Sandra Cortesi, Dalia Topelson, and Paulina Haduong, all members of the Student Privacy Initiative at the Berkman Center, to this Berkman luncheon, which is being webcast live. We also welcome the participants from afar. Thanks for joining us here in Cambridge. It's a beautiful day, the sun is shining, and I'm really delighted that, nonetheless, you all showed up for a conversation that I hope will be interesting. It's certainly a timely topic. The topic of today, as you know from the agenda, is cloud-based ed tech and student privacy issues. We'll focus particularly on K-12, where lots of things are happening, important things are happening. Today is also a special day for the team, because we're releasing a new report that is available online as of now, I guess. So congratulations to the team. We'll hear more about that report, which is actually the result of a number of months of research and also quite a bit of collaboration, including several collaborators who joined remotely today and hopefully will weigh in later on through a question tool and also via Twitter. Speaking of which, the hashtag is #BerkmanSPI, not for "spy," but for Student Privacy Initiative, just to be clear what this means. So the game plan is the following. We'll start with a brief overview, taking two steps back: what's the larger context of our work? We'll then zoom in on the report and some of the findings, focusing on K-12 cloud-based ed tech and privacy issues. My colleagues will do the heavy lifting; I'm just kind of taking care of the entertaining part. And then the third part is discussion. So we'd love to hear from all of you, as well as from our audience from afar: What's on your mind? What are the big questions that you're wrestling with today?
But first, as I said, let's take two steps back and ask ourselves: what's actually going on in education land? I was recently traveling through a number of countries in Latin America, where I was asked, "OK, you know, we hear there are a lot of things going on when it comes to education and digital technologies. How do we think about that?" And the best picture I could think of, the best analogy, which of course is limited for a number of reasons, is that it's somewhat like 15 years ago, when this young man, Shawn Fanning, started Napster and revolutionized the way we think about music, and about entertainment more broadly online. I truly believe that where we are at this point in history is really the beginning of the next stage in the evolution of education. Of course, the protagonists have changed. Here you see Salman Khan; it's no longer Shawn Fanning. The means by which this revolution is about to happen have changed, too, very different, of course, from the Napster story. But nonetheless, the depth of the transformation we're going through as an education industry, as education institutions, I think is comparable to the fundamental transformations we've seen in digital music, in digital entertainment, and later on in publishing. I think it's fair to argue that similar shifts are happening now in education. Now, one indicator that something really tectonic is going on, and that the use of digital technologies in education is not just hype, is the way existing institutions of education, particularly formal education, are responding to the changing landscape and to new players such as Khan Academy and many others, of course. There is a vibrant discussion going on in many places, in many educational institutions, including this great one here at Harvard.
So what does that mean? What we see is an emerging ecosystem of online education, of online courses, of open educational resources, of alternative ways of teaching and learning, powered, enabled, and facilitated through digital technologies. What does that mean for institutions such as Harvard and for our schools more broadly? The challenges are manifold. There are business model challenges, if you think about MOOCs, right? Do MOOCs challenge the concept of residential education as we've known it? But there are also plenty of opportunities, of course, and Harvard, just as one example, is experimenting with the new opportunities as well, if you think about HarvardX and some of the courses our faculty has been involved in, including CopyrightX, led by Terry Fisher. So you see a lot of activity that already indicates something really fundamental is happening. Now, if we take a little bit of a closer look and focus for a moment on the content layer of these changes (Alicia will then describe a bit more of what's happening underneath the content layer), we see at least four shifts that bear some resemblance to the shifts we've seen in the entertainment revolution and in digital media more broadly. The first one is obvious, and I already mentioned MOOCs: the way in which educational content, whether courses or modules or learning units, tutorials and the like, is delivered. That changes fundamentally. Of course, here digital technology plays a key role as a distribution channel, where you're also able to reach an unprecedented audience, globally, in real time, and the like. So in your head, you can make the analogy to some of the examples we've seen in other areas, including music, which I just used to make the point here. But it's not only about distribution; it's also about access, how we access educational resources. It's very different from the past.
We see, of course, kids in schools using tablets, but even more importantly, looking at my children, for instance, they use platforms such as YouTube for learning at home. It's very different from the time when I went to school, where it was all textbook-based; it was a very different way to access materials. But it's, of course, not only the tablets and the platforms. It's really also the new players that have arrived. If you think about iTunes, just as a placeholder, there's an entire new ecosystem of platforms emerging that traditionally weren't thought of as educational intermediaries. They're new entries and new additions to the ecosystem. So access marks the second structural shift, arguably, in this new digital network environment of education. The third one is around usage. I also believe we see lots of examples, especially looking at young people, of how learning materials are used in much more interactive ways. It's not only about watching videos. It's really about making remixes, about recombining, and about using digital technologies to create your own environments, to customize your learning spaces, for instance. And all that is increasingly supported by a layer of policies. Here, the screenshot shows open educational resources, which is really a movement to think proactively and creatively about enabling copyright policies, in this case, policies that allow for that sort of mixing and mashing and messing around with educational content. So definitely, here again, by analogy, if you think about the free culture movement in the entertainment and cultural space, we see something similar happening when it comes to education and learning. And finally, looking at the creation space: who's producing educational content? Who's producing learning material? Of course, the lines there are blurring, too. It's no longer only the teachers that produce content.
Again, going back to my own kids, some of their favorite episodes on YouTube are actually peer produced, where other kids teach them how to do certain things or how to become better at playing games and things like that. So you see that mechanisms of peer learning and peer teaching kick in. And we also have very different providers of educational content; it's no longer the hierarchical model where the professor stands in front and produces the content. All together, I would argue, and that's the last slide in this two-steps-back, big-picture intro, these four shifts actually accumulate toward a new culture and a new paradigm of learning and teaching. There are different labels to describe what's happening at this moment, where digital technologies become more prevalent and more important in the education sphere. One important concept is connected learning, which is particularly helpful as a frame because it highlights that not only how we learn and what we learn, but also where we learn is affected by the digital evolution, or revolution, to make the point here. So, for instance, much of the social learning, as in the examples of my kids, no longer happens only in formal educational institutions; much of the learning happens at home, or in social spaces, in informal spaces of learning outside of school. And the challenge ahead for many of us, be it practitioners or people on the policy side or the institutional side, is: how can we better connect these different spheres and places where learning is happening, build pathways into the various settings, and, of course, also make sure that academic learning in the traditional sense can capitalize on interest-driven learning and other phenomena, like the peer culture I mentioned before?
So that's the broad intro, just to set the stage for the more specific conversation about ed tech, cloud technology, and K-12, and then delving into the privacy issues. What I just sketched is also very much the background of the Student Privacy Initiative at the Berkman Center, which actually focuses on three things. First, we really try to understand what's happening by looking at kids, at students: how they learn, how they interact with content. That's much of the work that the Youth and Media Lab is doing in the context of the report that we're going to talk about in a minute. Of course, our focus has been on privacy attitudes and expectations of children and students, but more broadly, our interest is really to understand what our kids are actually doing when interacting with digital content. A second focal area is around school practices and policies. So we're collaborating with CoSN and other partners, and with schools nationally and regionally, to listen and learn what is going on at the school level, at the school district level, but then also at the level of school administrators and, in schools, with teachers. And then finally, and that's much of the work that the Cyberlaw Clinic leads at the Berkman Center: what are the law and policy implications of these changes that I roughly described at the content level? Just to give a flavor there, we have over 100 bills that have been introduced at the state level focusing only on the specific issue of student privacy. So you can see how strong the reactions are by legislators confronted with these seismic shifts that I outlined. Here is a link to our website; I encourage you to explore it. We have a number of reports that are already out, but now, more importantly, and very excitingly, I turn over to Alicia to talk more about the most recent addition and most recent research. Thank you, Urs. So what's new today, as Urs just mentioned, is our analysis, Framing the Law and Policy Picture.
It offers a snapshot of K through 12 cloud-based ed tech and student privacy as of early 2014, and it is now live on our website. We encourage you to download it and take a look at the materials, and also at the other great reports previously produced across all of our three clusters. So our goal, in producing this report and also in this presentation, was to build on the great structural changes Professor Gasser just highlighted at the content layer and think about cloud technology in K-12 settings at the data and infrastructure layer. Across each of the four structural changes, there is an undercurrent, a constant trend, which is the fact that data is being shared online in various ways, in fundamentally new and different ways that may be transformative. And the K through 12 ed tech space illustrates some of these trends very nicely. Before we dive in, though, one more step back to define what exactly it means when we say schools are using cloud-based educational technology. It's an excellent question, and there isn't, in fact, one specific, universally accepted definition of cloud technology in general or in educational spaces in particular. We do find it helpful, though, to drill down a bit, and in the context of our work in this paper and in the initiative more broadly, we understand cloud-based ed tech to mean different tools and technologies that allow schools and districts to transition computing resources from localized systems to more remote and shared systems. Now the question becomes, and you'll see much more as we go on in this report, what that means in context. It is something that's very much defined in context, but thinking about it in that way, from the local to the more remote and shared systems, is a helpful background. It's helpful because there is a broad move to the cloud across schools and districts in K through 12.
You may have seen a number of recent headlines about this, and a December 2013 report found, as you see here, that 95% of districts were using at least one form of cloud-based ed tech. The question then becomes: what accounts for such wide use of cloud-based ed tech? And it's transformative. There are, in many cases, opportunities for savings and efficiencies, as you think about schools with limited resources and budgets being crunched in the financial crisis. And it certainly appeals to schools and districts for that reason: it can be cheaper and easier to outsource resources to a shared provider versus having to build the technology in-house. On top of that, and in a more transformative way, there are fundamentally new opportunities for innovation and experimentation, and shifts in how learning takes place, as is underlined when we go outside of the formal classroom silos to more informal learning contexts. There are many, many examples of this, and some of our previous work addresses it. For instance, what you see here on the right side of the page is just a sampling of some of the companies featured on what's called Imagine K12, which is a startup accelerator that focuses exclusively on educational companies working to improve K through 12 educational outcomes. There are so many, it was hard to even pick a single snapshot for this slide. And one of our previously published works offers an inventory of these sorts of tools, grouped by their affordances; you can read more about that on our website. But the key point for today is that cloud tech like this can help schools to both improve existing processes and create new opportunities.
And the uses in context in schools can involve capturing this data for a number of reasons, which can range from targeted advertisements to attempts to improve learning in positive ways, such as by tailoring the effectiveness of different learning materials to a student's unique strengths or weaknesses, or you could even capture data to identify a learning disability. So, for instance, you could imagine a student reading, and you could try to capture the rate at which they turn pages in an e-book, and that could help to identify trouble with reading. And there are certainly very fundamentally transformative uses of data. It also raises a lot of privacy questions as you think about capturing student data in this way. And so far, this still might sound a bit abstract. You might be wondering, OK, what does this mean in practice? And to help us think through some of the law and policy challenges and opportunities that come up in practice, we created a hypothetical situation called Skolair, which my colleague Leah will now discuss and take us through, considering some of the law and policy challenges that emerge in the context of this innovation. Thank you so much, Alicia. So as Alicia mentioned, we have relied, in the paper that's being released today and that we're going to dive more deeply into now, on that favorite device of lawyers and law professors, which I confess I am: the hypothetical situation, designed to help us explore different aspects of a situation. Before we dive into some of the details of Skolair, our fictional, totally fictional, I promise you, cloud ed tech product, just a few thoughts about the approach to the privacy challenges that we've taken in this paper. We really have approached these with a desire to empower, especially right now, policy and decision makers at the school district, local government, state government, and federal government levels.
That is not to say that there aren't a lot of other very important stakeholders in the cloud ed tech space; there are. We have enjoyed being in dialogue with them and look forward to continuing that dialogue, but rather, just to frame for you all, our paper today, and what I'm about to go into in more detail, really is aimed at providing greater clarity to those policy and decision makers about the privacy options that are available to them as they consider the ed tech space. In terms of fostering this sense of empowerment, we seek to do so in a way that is grounded in conversations we have had with a range of stakeholders, at the working meetings that the Student Privacy Initiative has convened as well as in a variety of other forums, both in Cambridge and well beyond. We also very much want to reflect the popular conversation that is taking place on student privacy and cloud ed tech. So what you see up there, for many of you, may be very familiar. I'm sure that a number of our folks participating remotely are intimately familiar with this coverage and with the events it reflects. I won't go into each one in detail, but what we have done, in the spirit of snapshots, our theme today, is take a snapshot of the coverage from one major publication within the last academic year, the one we're finishing up now, that gets at student privacy and cloud ed tech challenges. And what you'll see here reflects what we think of as the privacy and data 1.0, or first generation, challenges.
These questions and concerns, reflected in this New York Times coverage, show that various stakeholders are thinking, talking, and worrying, quite frankly, a lot about: What are cloud-based ed technologies? How do they work? What data is going into them? Who's making that decision? Are parents involved? What's happening to the data once it's in there? And how are we making sure that only the folks who are supposed to have this data, in fact, are getting access to it? And so, as we think about these data 1.0 challenges for cloud-based ed tech, we're focused on issues of legal compliance, making sure that everybody is following, at bare minimum, the existing legal, regulatory, and policy frameworks at all levels, although I will flag that we are going to focus our comments today, as well as the paper, on the federal statutory and regulatory framework, simply because, as Urs mentioned earlier, of the heterogeneity of state laws that are both on the books and pending. And so we are thinking about how to inform policy and decision makers in understanding those basic matters of legal compliance and data security, and also what best practices might be out there to help policy and decision makers go beyond bare-bones understanding and compliance, to engage in practices that will balance both the need for innovation and dynamism in educational spaces and the protection of student privacy. So our paper today, and our Skolair hypothetical, translate these 1.0 challenges into three overarching questions that face policy and decision makers at the school district, local, state, and federal government levels: Who in the educational system should be making cloud-based ed tech decisions? When is parental consent needed for the adoption of these technologies? And how can the data transferred, stored, and analyzed through these technologies be kept secure and, when necessary, de-identified? This brings us to Skolair.
There are many more details of Skolair available in the paper, and we all welcome your questions and comments on those in the discussion portion of today's presentation. But for the purposes of discussion now, again, lawyer and law professor that I am, I'll start with a disclaimer: we are not trying, in Skolair, to cover all cloud ed tech products, and Skolair is not being offered as an exemplary best-practice scenario for a product. In fact, those of you who are lawyers or law students know that a lot of the value in creating hypothetical situations, in the perhaps twisted mind of a lawyer or law professor, is drawing out things that actually are not a very good idea, but are happening or may happen in real life, and trying to unpack why they're really not such a good idea. So you'll see some of that in our hypothetical, too. So in this hypothetical, we have a principal, Principal Smith at Anywhereville Middle School, who partners with Skolair, a cloud ed tech company that provides three main services. First, it stores student data in the cloud, defined broadly, as Alicia explained a few minutes ago. Second, Skolair offers a dashboard that gives authorized users easy ways to input, access, and analyze student data. And third, it provides access to end-user software applications. So in our hypothetical, parents at Anywhereville Middle School get a nice long letter from Principal Smith, set out in full in our report, which, as Urs mentioned, is available now if you're interested, on the Student Privacy Initiative page of the Berkman website. So you'd go to cyber.law.harvard.edu and then go to the Student Privacy Initiative. But here is a brief relevant excerpt of the letter that our parents see. Skolair, says Principal Smith, is totally free for you and your child to use, and it comes at a very reasonable cost to the school.
Skolair also provides customers, which now includes Anywhereville Middle School, with access to some apps developed by third-party app developers, who may charge a small fee. AMS, Anywhereville Middle School, will transfer to Skolair all data about current students that it either has on file, whether in hard copy or in existing databases, or adds in the future. Skolair will safely store all of your child's records on servers managed on behalf of Skolair by its third-party service providers. Skolair will not combine data on AMS students with data it holds for other schools, except as described below, right? Because there's always a little more going on. Each student will have a virtual cubby that authorized users, which might include teachers, the principal, and then the parents and the child, can access through the dashboard provided by Skolair. Skolair software also gives us a unique learning opportunity: we can compare trends in our cubby clusters, so that translates into a classroom of students or a grade level of students as a whole, to those in other schools that also store their data in Skolair. And Skolair also offers a virtual library of educational apps to help your child learn. We will be sending home permission slips to select the apps that will go in your child's cubby. So the teachers, according to these permission slips, will be selecting the apps that then go into the child's cubby. And Principal Smith concludes: we ask you, the parents, to review the details of Skolair's terms of use, privacy, and other policies on the school's website. So again, a hypothetical product, a hypothetical letter, but not at all dissimilar from the types of products and, quite frankly, the types of communications that parents across the country are seeing right now. So very briefly, again, there is much more detail in the paper, and I am very happy to dork out with anyone during the discussion who wants to get into some of the finer points here.
So on the plus side, we have AMS showing great innovative spirit. It's reflecting a passion for technology and innovation and a desire for school-based efficiencies. But in a rush to act without the involvement of the school district, the principal has run into tension with, or in some ways afoul of, key provisions of several federal laws: the Family Educational Rights and Privacy Act, FERPA; the Children's Online Privacy Protection Act, COPPA; and the Protection of Pupil Rights Amendment, PPRA. So with FERPA, it appears that there is no contract between Skolair and AMS. You see in the letter that Principal Smith is asking parents to review privacy policies on Skolair's website. Well, if there were a contract in place, then you would have that contract controlling the relationship between Skolair and AMS and its various constituencies; you wouldn't need to go read some general terms of use or privacy policies. So there doesn't seem to be a contract in place. And without that contract in place, the exception to parental consent that Principal Smith seems to be relying on is a little harder for him to lay claim to. So I'll back up one more second. Under FERPA, in order for schools to disclose personally identifiable information from students' education records, which covers a range of information, including such things as grades or attendance histories, the school would need to get parental consent for that disclosure. Now, Principal Smith hasn't asked for parental consent to give information to Skolair, right? He's asked for permission slips about apps, which we'll get to in a second. But he hasn't said to parents, you can decide whether or not we send your child's data into the cloud to Skolair. So presumably, then, he's relying on an exception to that consent requirement known as the legitimate school official exception. And that exception means that if you are a school and you want to disclose data without parental consent, then you need to meet three criteria.
You need to have it be for a purpose that would otherwise be served by a school official. You need to have that entity, that third party getting the data, be under your direct control. And you need to make sure that that third party is not resharing the data. And without a contract in place, which we don't seem to have here, it seems very unlikely that Principal Smith and Anywhereville Middle School in fact have the type of direct control and the prohibitions on resharing that you would need to avoid parental consent. So that's a long way, for you lawyers and policy wonks in the room, to say that Principal Smith seems to be running afoul here of the requirement under FERPA that he get parental consent before sending this particular student data into the cloud. Moving now to the apps. If you have a for-profit website, which Skolair seems to be because it's charging a fee, that is collecting data directly from children under 13, you need parental consent. That's COPPA, the Children's Online Privacy Protection Act. Now, teachers can substitute for parental consent if the data going into the website is just being used for internal school use. Well, again, think back to the letter we saw. Principal Smith is very excited because Skolair has the ability to do some data analytics that look not just at AMS students, but at students across the country, and come up with patterns and information. So it seems very likely that the data our students at AMS are putting into these apps is not just limited to internal school use. So here again, we have Skolair and its third-party app developers on somewhat shaky ground with COPPA. Last but not least, the Protection of Pupil Rights Amendment generally gives parents the ability to opt out of having certain types of personal information about their children collected, in a survey or other medium, if it is then going to be used for commercial or advertising purposes.
It is unclear from the full letter that Principal Smith sent home whether or not Skolair, and any of the app developers or this mysterious set of third parties that Skolair is subcontracting to, will be using this information for commercial purposes. Therefore, we have a PPRA problem as well here. Oh, and sorry, last but certainly not least: the lack of a contract that I mentioned with respect to FERPA also raises some real concerns about data security issues, because we don't see Skolair and AMS having an agreement in place around such issues as data retention, data destruction, and other types of data minimization principles, to name a few. You may be wondering: so what? Law professors like to make mountains out of molehills. This is not a molehill, nor is it pretend. The study that Alicia referenced earlier, from the folks down at CLIP, the Center on Law and Information Policy at Fordham, noted in their 2013 report that districts are currently giving up a lot of control of student information when they use cloud services. Fewer than 7% of the nationally representative set of contracts that CLIP looked at restricted the sale or marketing of student information. Many agreements allowed vendors to change the terms without notice. And furthermore, these contracts, when they do exist, which, remember, we're saying here we don't think Principal Smith has one, these school district cloud service agreements generally do not provide for data security and data retention or deletion. So I'm going to bring Alicia back up now. Thank you, Leah. As Leah suggests, Skolair was not pulled out of thin air. It reflects very real issues that have been surveyed and reported on and that we've heard in conversation, although it was developed to highlight some of those issues. So the question becomes, out of this reality, what is the clear guidance? What are the paths forward?
How can we be constructive and productive in harnessing the innovative potential of these tools while simultaneously protecting student privacy? The paper aims to offer this sort of guidance and underscores the points we'll raise shortly. One note before diving in: these recommendations very much reflect the state of play right now, as of early 2014. It is quite possible that, as the space evolves, both technologically and with ongoing changes at the state and federal policy level, not to mention at the school level, the right guidance might change and evolve as well. There are also no bright-line rules in many cases. Although there are laws, important laws like FERPA, COPPA, and PPRA, in many cases it's a combination of law and best practices that helps to balance the competing imperatives in this space. With that in mind, it's helpful to turn to something that might be quite familiar to a number of the law and policy people in the room in particular, which is Lessig's regulatory model, to consider the role of law, markets, and norms in this space. This framework offers us a broader paradigm within which our pragmatic responses to today's student privacy questions can be understood. Just as a reminder, the questions within which we situated our entire analysis were: Who in the educational system should make cloud-based ed tech decisions? When is parental consent needed? And how can the data transferred, stored, and analyzed be kept secure and, when necessary, de-identified? And with that, I will turn it back to Leah, actually, to dive into law. Thank you very much. So there, again, we are trying to take advantage of a chance to make more clear to the relevant policy and decision makers, at school district, local, state, and federal government levels, that existing laws are floors and not ceilings, and to examine best practices that could help with the balancing that Alicia just discussed.
We recommend, at this specific point in time, and again, as Alicia just said, subject to evolution as the space evolves, that the best practice in terms of who makes the ed tech decisions would be to have districts making ed tech adoption decisions, because of the complexity of the legal and related regimes. We suggest that districts have a chief information officer, chief privacy officer, or chief technology officer type, either full- or part-time, in order to make sure that legal details of the type we just discussed are fully vetted, in consultation as necessary with legal counsel, security experts, or other relevant experts. This is not to say that educators who are in the classroom or other stakeholders don't have an important role to play. They very much do. And again, our suggestion about who should be making these ed tech adoption decisions is likely to evolve over time as the space changes. But given where we are right now, in this world of 1.0 potential harms and the newness of this space, we recommend that district-level decision making happen here. We also encourage districts to consider, either by themselves or in state or regional collaborations, having pre-vetted app stores or catalogs from which classroom teachers could pull apps, so that you as a teacher (and I'm a teacher myself, so I understand this very well) don't have additional, perhaps unnecessary layers of bureaucracy to go through to respond to the real-time needs and opportunities of your classroom. Having that type of store or catalog might facilitate the centralized decision making, which could come about as a change in policy, regulation, or law at one or more levels depending on the framework that's in place in a given state, while still letting teachers pull the apps directly into their classrooms.
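To make the pre-vetted catalog idea concrete, here is a minimal sketch of how a district might encode its vetting checks; all names and criteria below are hypothetical illustrations, not requirements from the report:

```python
from dataclasses import dataclass

# A minimal, hypothetical sketch of a district-maintained pre-vetted app
# catalog. All field names and vetting criteria here are illustrative only.
@dataclass
class AppEntry:
    name: str
    vendor: str
    has_signed_contract: bool       # written agreement reviewed by counsel
    restricts_commercial_use: bool  # no sale or marketing of student data
    has_retention_policy: bool      # retention/deletion terms in place

def approved(app: AppEntry) -> bool:
    """An app reaches the teacher-facing catalog only if every check passes."""
    return (app.has_signed_contract
            and app.restricts_commercial_use
            and app.has_retention_policy)

district_catalog = [
    AppEntry("MathDrill", "Acme EdTech", True, True, True),
    AppEntry("QuizWiz", "DataCo", True, False, True),  # fails commercial-use check
]

# Teachers pull only from the vetted subset, with no extra bureaucracy per app.
teacher_catalog = [a for a in district_catalog if approved(a)]
```

The point of the sketch is the division of labor: the district (with counsel) fills in the vetting flags once, and teachers then draw freely from the filtered list.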
And just to flag briefly, since I'm covering the law space: we recognize also that going forward we may well see, and may well want to see, other types of new laws, regulations, and policies at all levels. There again, there is a lot of movement in that space, with state laws under consideration as well as a proposal that was recently made to amend FERPA. So stay tuned for more on that going forward. And now back to Alicia. Turning to the next element in the framework, we have norms, which, as opposed to laws, the top-down solutions, reflect the bottom-up solutions and the role of these sorts of interventions in the space. Now, bottom-up solutions offer both unique opportunities and unique challenges. As I hinted at before, and as Leah's presentation underscores, there are limits to what the law can lay out and the advice that statutes can provide here. There are opportunities, however, for schools and districts to band together and use their buying power to begin to collectively consider what sorts of contracts and what sorts of apps and services do reflect best practices and the spirit and letter of the law as we know them. So you see here best practice standards to be adopted by industry: the notion that it could be up to consortia of ed tech stakeholders and groups of individuals to come together and think through, what does reflect best practices? What are the lessons we've each learned, and how can we collectively apply them within this distributed space? Now, one challenge I should highlight is that a lot of cloud products are coming in with what are called click-wrap agreements, as opposed to contracts. And Scholar might fall under this category, in fact.
So if you have a contracted agreement, then presumably counsel at either the school or district level has sat down with the vendor and provided guidance regarding the terms of service. In contrast, with a click-wrap agreement, one might simply click through and agree to the terms of service point blank. So the paper's recommendation, given the snapshot in 2014 (and I do want to be mindful of time here and give us time to actually discuss and dive in, so I'll be brief), is the adoption of fair information practice principles, or FIPPs, and other best practice standards by industry providers, which would help to increase data security and protection. We should note, just quickly, that these practices should supplement and not substitute for legal and policy reform. Again, briefly, to pause for a moment on the role of markets more specifically: there is an additional role that industry, and consortia involving industry as well as nonprofit and academic stakeholders, could play. One recommendation we make in the paper is to promote transparency, and one idea to promote transparency around how data is shared would be a sort of privacy nutrition label. It sounds funny, I know, but it actually could be quite effective. Right now, the current language of many terms of service says things like: the vendor is not responsible for any product that originates from a source other than the vendor. Statements like that are very difficult for a school or district level official to parse and determine what they actually mean on the ground. If you could standardize, simplify, and label more effectively how data is in fact being transferred and shared, with third parties in particular, within the context of a particular service, it would really empower decision makers at all levels in helpful ways.
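One way to picture the nutrition-label idea is as a standardized, machine-readable summary of a vendor's data practices. This small sketch renders such a label; the practice keys and the sample vendor answers are invented for illustration:

```python
# Hypothetical "privacy nutrition label": a standardized yes/no summary of a
# vendor's data practices. The practice keys below are invented examples.
def render_label(practices: dict) -> str:
    rows = [f"{key.replace('_', ' ').title()}: {'Yes' if value else 'No'}"
            for key, value in practices.items()]
    return "\n".join(rows)

vendor_practices = {
    "shares_data_with_third_parties": True,
    "uses_data_for_advertising": False,
    "allows_parental_data_deletion": True,
}

print(render_label(vendor_practices))
# Shares Data With Third Parties: Yes
# Uses Data For Advertising: No
# Allows Parental Data Deletion: Yes
```

The value comes from standardization: if every vendor answered the same fixed set of questions in the same format, a district official could compare services at a glance instead of parsing bespoke terms-of-service language.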
One could also imagine productive competition in the marketplace based on the level of privacy protection that a given service affords, a sort of race-to-the-top model for cloud vendors acting in concert with the users at the school and district level. The bottom line, though, is that markets are a complex space that interacts with all of the other forces in the K-12 ed tech environment in dynamic ways, and it's helpful to think about banding together and promoting productive conversations and transparency across industry and actors in schools. With that, I will turn over to Urs to conclude. Thank you. I think we actually go right to Q&A, because you've been already very patient. You see here the question tool that remote participants can use to submit questions. Of course, looking at the report, you have the advantage that you can multitask, and the team here will monitor any questions you may submit. But we have the first one here. Is there a microphone? That would be great. I'm happy to be the runner. Yes, here we go. Please introduce yourself quickly. Thank you. Hi, my name is Ron. I'm concerned, in an environment like the one you describe here, about what happens to the student when either the student or the parent decides they don't like this agreement. Or even if there was a formal contract, they read the contract and rejected it. They did not like it. Now you have a student who is isolated from the rest of the group because they don't want to participate, and what happens then? Great question. Who would like to answer or reflect on that question? I would like to. I agree that that is a concern.
And I think that having a better understanding at the policy and decision maker level of when parental consent is in fact necessary (parental consent up to age 18; it becomes student consent once the student hits 18) might help with that phenomenon, in that you would have all parents being given the opportunity, again, when the law requires it, to have meaningful notification of what's going to be happening to their student's data and the ability to consent or not consent and opt out. So instead of having a lone student or a lone family feeling upset or marginalized, you can have an opportunity for all parents and children together to be getting this information and making a family decision. Legally, of course, the decision belongs to the parents, but hopefully parents are going to include their kids in it, and then go back to the school or the district. Of course, we don't, I think, want a situation in which schools are forced into a sort of patchwork, where you have Johnny over here and then Timmy over there. So I think the hope would be to have a robust enough shared space for making these decisions that, even if not everyone's gonna be happy all the time, you can at least get to a place where there's a workable situation for almost everyone going forward, if that's helpful. So maybe, Sandra, you may want to briefly comment on this question of how youth could be involved in making some of these decisions, some ideas that we've explored in the context of the Youth and Media lab, just very briefly. So, for example, one of the options: some of the schools here in the area, across the US, and also internationally have student advisory groups or student groups where such questions are being debated and discussed, to really get the youth input, which we believe is very important. Great, we have a question from a remote participant. Polina, would you channel the voice of whoever it is?
So this question is coming from the question tool. To what extent is there a framework guiding the application of cloud-based ed tech technologies, as, for example, the Patient-Centered Health Delivery Framework is guiding the application of information technology in hospitals? Is that a question Dalia would like to answer? Sorry to put you guys on the spot, but since you're here, you know. There really is no framework that I know of that is governing this. Part of the research that's happening here at Berkman through the Student Privacy Initiative is to try to dig into what might be the appropriate framework, especially in light of the 100-plus bills that have been proposed in various states and the potential for extraordinary confusion within the marketplace. So I think this is a great opportunity for industry and policy makers, as well as the educational system, to really band together and figure out the best solutions to provide the richest tools to students and enhance education while also protecting the privacy of our vulnerable populations. Thank you. Do you have any additions, Alicia? Just to note that, as the questioner and Dalia get at, there are certainly interesting comparisons between, say, FERPA and HIPAA, which is the law that governs medical and health information. But to my knowledge, there's no governing framework or clear guidance on that matter. Other question there. All right, please introduce yourself. Thanks. Hi, my name is Kate. I work for the ACLU here in town. We've been paying attention to this issue for some time now. We actually killed the inBloom project, us with some other people. So sorry if you're in the room, inBloom. But my question is more of a, I guess it's a question related to the companies that buy access to the data and then sell educational apps back to the schools. First of all, it seems like it's not a good deal for the schools, just economically.
And second, I'm confused about why that's ever appropriate, or under what circumstances student data should ever be sold to private companies. Like, how is that ever legitimate? Yeah, please. In terms of, I'll just go briefly back to our hypo. I was trying to spare you all too much of the fine print. We had set it up so that, and I think this, again (and Dalia, correct me if I'm wrong), does reflect the way these development spaces tend to work: app developers that want to make apps available through Scholar are not given real data to play with, but an experimental design space with sample data on which to build a product. It would then be, in this particular hypothetical, the teachers selecting apps for the real students to use. But we were not envisioning a wholesale sale of student data to these developers up front. Now, to get to your broader, more normative question, as I take it: are there ever any circumstances in which we should have a for-profit company buying access to student data for any purpose, whether it is app development or something else? I think that's an incredibly difficult and complicated question. When we've convened working meetings here, we have stakeholders with some very different perspectives on that. You'll get folks who feel, as I imagine you and your colleagues might, that it's never okay, and that as a matter of principle we need to keep the educational space free of marketing and advertising influence. And there's certainly an argument for that, in that students are a captive audience. States have mandatory attendance laws, and parents can face real consequences, as can students, for not sending kids to school. The law is requiring you to send your kids to school, and at that point your child's data is being sent to a third-party provider that may be using it for other purposes. That definitely does raise serious concern.
On the other hand, we have not come out yet with any sort of SPI position on that spectrum, from "you can never, ever send it" to "you can send it with nothing but bare-bones legal compliance." We're not taking a position on that at this point, although we are very much enjoying being engaged in that discussion. But just to get to the other end of the spectrum, some reasons it may well be appropriate have to do with, as Alicia was discussing, the school's ability to realize increased efficiencies at a time of government budgetary crunches: to do a lot of the really non-controversial record keeping and administrative functions, which may not even be in the classroom, in a way that's a lot more effective and can also be made secure, as well as the potential for more innovative, iterative, and playful spaces. We do have a lot of cloud ed tech vendors right now who operate in the for-profit realm. I think that in and of itself does not make their motives suspect, any more than operating in the non-profit realm makes motives above suspicion. Figuring out what the motives are has to be much more case-specific than that. Other question? I'm Jeremy, I'm a new intern here at the Berkman Center. My question is about federal data breach notification laws. Some of the laws currently going through Congress try to strike a balance between fair and timely notification in the case of a data breach, while also trying not to over-notify, for fear of turning participants away from cloud-based technologies because they think the technologies are insecure, or because they don't generally understand the importance and necessity of securing the data. So I was wondering where the proposal would take us and how it would take a stance on striking this balance.
So your question raises a good point, which is that privacy and security often get conflated when in fact, though they're deeply related, they are distinct in some ways. Our paper discusses the harms 1.0, as we know them, which have to do with data flowing into and out of the cloud, and security in that context is really key. There are things like fair information practice principles, and there are industry best standards around security and around making sure of very basic things: where is data being outsourced to, where are the servers based, by whom is the data being held? I think that paying attention to, and as much as possible specifying very clearly in contracts, the exact flows of data goes a long way towards helping to strike that balance between appropriate data security, as information flows across different stakeholders, and the innovative potential of some of these tools. Any other questions or comments, or reflections more broadly? Time for two more inputs. Hi, I'm Megan. I just wanted to follow up on the question and your answer. Security is one aspect, of course, but if you encrypted the data and then sent it to the cloud holder, presumably a private cloud-based provider, and if they don't have access to the encryption keys, in theory your problem wouldn't be there, would it? Because they wouldn't have access to the data. The fact that the data is moving somewhere else, if it's encrypted, is almost irrelevant, isn't it? If it's good encryption, you know?
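Megan's point, that a provider holding only ciphertext and no key learns nothing useful, can be sketched in a few lines. This is a toy teaching example only: the hash-based keystream below is not production cryptography, and a real deployment would use a vetted encryption library.

```python
import hashlib
import secrets

# Toy sketch of client-side encryption: the school encrypts records BEFORE
# upload, so the cloud provider stores only ciphertext and never sees the key.
# NOT production crypto -- use a vetted library in any real system.
def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream of the requested length."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, nonce: bytes, plaintext: bytes) -> bytes:
    ks = keystream(key, nonce, len(plaintext))
    return bytes(a ^ b for a, b in zip(plaintext, ks))

decrypt = encrypt  # XOR with the same keystream reverses the operation

key = secrets.token_bytes(32)    # stays with the school district
nonce = secrets.token_bytes(16)
record = b"student_id=123;grade=B+"

ciphertext = encrypt(key, nonce, record)  # this is all the cloud provider sees
```

Even in this toy form it illustrates the trade-off discussed next: the scheme only helps if the school actually holds and manages the key, which assumes technical capacity that many districts lack.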
I think that's the key here, and a collection of experts in the field, more broadly outside of the educational space, was asked to provide a report to the White House on data privacy and big data. They were specifically mandated to look at security protocols and technical solutions to the privacy issues faced in a world where big data exists, and what they came out with is that technical solutions are only one element of this. So with respect to the educational solutions, and particularly the Scholar hypothetical that was provided, part of the problem is that not all vendors see data equally, and the value of data differs depending on the business model of the service provider. It's fine if you have an agreement that everything will be encrypted, but you cannot rely on the schools to have the technology know-how and expertise to necessarily encrypt. Bigger districts or bigger schools may, but the small school in rural, I don't know, western New Hampshire or wherever may have one server, one computer, especially when they're financially strapped, et cetera. So there are a few elements that need to exist. One is a contract, or some instrument, in place that allows the school to control, understand, and know where data is flowing, how it's flowing, and how it's being secured. Then there's the technical solution. There's also the policy level: should the laws dictate what vendors can and cannot do with data? For instance, the PPRA has a restriction against use of this type of data for commercial purposes in the K through 12 context. And there are also normative values, best practices that are simply what companies should do and what the market should drive, which I think goes a little bit to whether companies should ever sell data, which is different from sharing data.
And so I think that's where all of us need to recognize there's a lot of space here for innovation, but in my personal opinion, we need to think outside of the box and outside of what has been done before in this context. Thank you. I think that's actually a terrific point to wrap up on. This is just the beginning of a story, and we've seen on the slides, and you've made references to it, that we strongly believe the topics we discussed today are 1.0 problems when it comes to student privacy in this new tech environment. There are other questions on the horizon. For instance, how can the data that we collect, compile, and aggregate in the cloud and elsewhere ultimately be used, or how will it be used down the road? Consider the new generation of algorithms that allow for predictions that are just mind-blowing in many respects. What happens if we apply the same algorithms that today enable Amazon to predict and ship orders before the order comes in to student data, to the performance data of my children or your children? That's really scary. So I think there is much more to come. At the same time, we've also heard and made the point about the benefits, the innovation potential the new technology brings, and as Dalia just summarized so nicely, where we are at the moment is best described as the middle of a steep learning curve. What we hope the paper contributes, in this sense of the snapshot, is to show what we know already and where the knowledge gaps are. How do the laws from the past apply to the new phenomenon? It turns out, not that well.
But also, what are the alternative paths, the out-of-the-box thinking, that may actually help us manage all these moving pieces, be it the technology piece that is changing rapidly, be it the school practices that are of course evolving, but also the behavior of young people, how they interact with technology and educational content, which is also changing? How can we synchronize all these changes without turning only to the legislature? That's part of the solution, arguably, but not the only one. I think that's the key challenge for the months and years to come: to find a good answer to that. So with thanks to all of you and to the collaborators, have a great sunny afternoon. Thank you. Thank you.