Hello. Hello, everybody in the back. We are going to be starting this panel in just about a minute. So, if you can make your way up, you can sit anywhere around here. And in case you didn't know, if you don't want your photo taken, you should get a red lanyard at the registration desk. Otherwise, we do have our lovely photographer, Michael, right here, and he might be walking around taking your photo. This event is also being streamed online, and it will be recorded. So, if you don't want to be recorded, maybe do not stand near these wonderful panelists. We also have a great Twitter hashtag: if you're following or tweeting about the event online, you can use #FoundPrivacy. Was privacy lost? Did we find it? We'll soon find out. So, thank you for coming, and we'll start in just about a minute. Great. So, I guess I'll start off introducing myself and the Internet Law and Policy Foundry. My name is Tiffany Li, and I am a fellow with the Foundry. I'm also a privacy fellow at the Wikimedia Foundation. This event is co-hosted by both Wikimedia and the Internet Law and Policy Foundry. The Foundry is a fairly new organization. We are a sort of trade association, a nonprofit for early-stage professionals in technology law and policy. It's a foundation that was based out of D.C. but has now expanded to the West Coast. So, thank you, everyone, for joining us at our first West Coast event.
And without further ado, let's start off with today's panel, which will discuss privacy, innovation, and regulation. So, I guess I'd like to start off by asking everyone to just briefly introduce yourself: where do you work, and what do you do, especially involving privacy? Thank you, Tiffany. My name is Elena Alquina. I'm currently in the Global Privacy Office at McKesson. It's a very big healthcare technology company. And I work on various privacy issues, from legal controls and compliance controls, policies, training, and data agreements, to de-identification of health data and other data that can be used for secondary purposes. I used to be a lawyer. I began my legal career in 1995, and I practiced law for about seven years, and later on I realized that I'm more drawn to business decisions and strategy versus practicing law. So, I like to create, I like to build, and I like to work with businesses to find a way they can accomplish what's needed. So, I moved to the compliance function, and right now I'm more on the business side, where I'm our privacy head. And outside of my daily job, I am committed to advancing women. I'm a co-founder of the Women in Security and Privacy group in the Bay Area. If you're interested in joining or learning more about it, please ask me after the panel. And I'm also part of the organization Leading Women in Technology. That's another great organization that helps women in various technology fields advance their careers to the next level. Hi, I'm Gautam Hans. I'm Policy Counsel at the Center for Democracy & Technology. CDT is a nonprofit research and advocacy organization based in D.C., and I am based here on the West Coast working on a range of issues in tech policy, most notably privacy, security, and free speech. Before I was here for CDT, I was based in CDT's headquarters office in D.C. as the Plesser Fellow, focused on privacy and data issues.
So, I've been working on privacy for a while, during law school and then afterwards, and have spent a lot of time thinking about both government regulation and enforcement as well as the intersection of privacy and speech, privacy and surveillance, and issues surrounding government access in all sorts of different venues. Yeah. Hello, Jake Snow. I am a staff attorney at the San Francisco office of the Federal Trade Commission. I've been at the FTC for about three years, and I do primarily consumer protection work, including data security and privacy. Before I joined the FTC, I was an IP litigator. So, everyone knows: my remarks today don't represent the views of the Commission, and I don't speak for any individual Commissioner, in case that wasn't obvious. I'm having the same challenges with the microphone. Hi, I'm Michelle Paulson. I'm Legal Director at the Wikimedia Foundation, the nonprofit that is hosting you today and that also hosts Wikipedia as well as a number of other online collaborative educational projects. I've been with the Foundation for a little over seven years, and in that time I've worked on a variety of things. Now I mostly focus on litigation dealing with freedom of speech issues and IP, and on a lot of our privacy portfolio, which entails figuring out the internal and external policies and procedures that we use to safeguard the data of our staff, members, and our community of editors worldwide. Hi, I'm Shalu Mehra. I'm a partner at Gibson Dunn. I co-chair our technology transactions group, and in addition to being a lawyer, I'm actually a transactional lawyer, which is perhaps a little uncommon for people who focus on data privacy. I come at data privacy primarily from the perspective of someone who does deals, in particular deals which include licensing, acquisition, and disposition of regulated data as an asset. Great, well, thank you, everyone. So, as you can see, we have a wide variety of different experiences here.
We have people in nonprofits, people in law firms, people in government. We have the legal side and the policy side. We'll have a chance for you all to ask questions later. But first I'd like to start off just very briefly: what would be the top issue in privacy in 2016? Let's just go around the table, and afterwards we'll discuss some of those issues more in depth. Before we begin, can you raise your hand? It's just very nice to know who's in the audience. Can you raise your hand if you're an attorney or lawyer? If you come from the software development side or IT? Okay, any students? Privacy? Security workers? Oh, security? Okay, great, thank you. Just always good to know who's in the audience. That's great. For those of you following at home, basically we have a lot of people doing all of those things. So how about, Shalu, what's the number one issue in today's privacy space? That's a really hard question to start with, just because it's hard to single out just one; there are a number of pretty important cases pending. But if I had to pick one, it would be Spokeo, which is before the Supreme Court right now. We have this proliferation of statutes which prescribe statutory damages for various sorts of data privacy and data security breaches. And the question at issue there is whether those are enforceable without proof or evidence that there has actually been tangible damage to the individuals affected, to the litigants. The way that comes down will have a very pronounced effect on the scope of liability under those statutes. And where we go with that is kind of indeterminate right now, particularly based on what's happened with the Supreme Court and the way that those considerations are balancing out right now.
So I believe it's kind of hard to pick one, but at least one that holds interest for me is just generally how organizations are going to be approaching the collection of data through wearables, facial recognition, and biometric data, and what kind of regulatory scheme, if any, is going to come into place to safeguard that information from misuse by outside parties. Yeah, it's a general question, so I'll give a general answer. I think it's the increase in the ability of companies and governments to collect, store, and analyze data, which is sort of the combination of what people call the Internet of Things and Big Data. When it comes to the Internet of Things, the FTC has a real, serious focus in that area. I think it was last week that a consent order against ASUS came out, relating to some vulnerabilities in their routers. So if anybody has an ASUS router, go ahead and install the update. That case had to do with vulnerabilities in the routers, in the software that was installed in the routers, and in the software that was used to configure them. And that's a situation where people are using a device, they're relying on security representations that are made to them in the sales process, and they're, I think, reasonably expecting that a device they buy on the market is going to protect their privacy and their security. And in fact, sometimes those expectations are frustrated. That's something I think we're going to see more of. There are a variety of consumer products, starting with routers, maybe, but also biometric devices and the Internet of Things more broadly, where security problems and vulnerabilities can be discovered. And I think it's a lesson to industry that those things need to be taken care of at design time rather than after release.
I'll definitely second the Spokeo mention, not because I worked on an amicus in this case, although I did. You know, I will preface this by saying that my predictive powers are basically zero, but I'm predicting a 4-4 split on this one, with the passing of Justice Scalia, who I think would probably have been on Spokeo's side. The other issue that I'm thinking a lot about now is data breach. We have seen many, many security incidents over the last couple of years with very, very little action from Congress. So I don't think the effect of this is really regulatory or legislative; I think it's on the PR side. Consumers, I think, are certainly more aware of data breaches, and not just because of the incessant notification notices we all get through email and post, but because the ability of a company to protect your data matters more when the data a company collects is increasingly sensitive, as with the Internet of Things and wearable devices. Companies may not necessarily need to worry about more legislative or regulatory oversight, but in the court of public opinion, I think data breach is going to be much more serious in the coming years than it was in the past. That is a hard question. There is so much going on in the world lately with the advancement of the technologies we use, and the last couple of weeks were particularly interesting. I think what's on my mind lately is the new mechanism to transfer data between the EU and US. As you know, the Safe Harbor was invalidated in October last year, and the new data transfer agreement was reached last month, in February, between the EU and US. And that's quite interesting. So this year will be a year to reflect on and re-architect the global privacy approach for all of us, not only companies but also individuals. And relevant to the EU-US relationship and data transfer is the GDPR, the new law that will be effective sometime in 2018 if everything goes well, and you know how it happens.
They promised that after the Article 29 and Article 31 committees and all the member states of the European Union review everything, which they promised is going to be done by July or August this year. We'll see. So it's going to affect companies, and the new GDPR is much more restrictive than the Privacy Shield and imposes very burdensome and strict requirements on companies, as well as providing additional protection to EU citizens and non-U.S. citizens, which is great. But think about how it can affect us here and other countries that don't have legal systems deemed adequate for transferring data from the EU: the GDPR is going to affect not only data controllers, not only companies who have businesses in the EU and U.S., for example; it's going to affect companies that are mere processors. They may have no presence in the EU, but they might just target EU citizens, or may be profiling, I think that's the term they use in the regulation, profiling or monitoring EU citizens. So it's going to have a significant impact on the U.S. economy and U.S. businesses, especially start-ups. Great, thank you. That's a wide breadth of issues. But if you think about it, there are a lot of issues in privacy today, right? Let's see. We have consumer privacy, we had Spokeo, the new ASUS order. And there's the EU regulation: for those of you who don't know, the EU has a large privacy regulation, formerly a directive, which will now become a new regulation, as Elena mentioned, in 2018. Some of the issues there are how this will affect American companies, and I think that's something we can start off with. It's a consumer-facing issue, but it is international in scope. So actually, Shalu, I was wondering if you could talk to us a little bit. How do you think this regulation will affect companies like your clients or any other tech companies in this space? Potentially it creates huge exposure, right? The statute has extraterritorial reach.
It covers, it governs, the exchange and use of personal data of EU data subjects anywhere in the world, and the penalties are up to 4% of global annual turnover, which is a fairly sizable sum, and that would affect my clients. And there are several ways in which it could affect them. For instance, there's the erasure right. There was a proposal for a right to be forgotten; the case came out in Europe about that, and that's now been baked into the data regulation. And, for the litigators out there, there's an obvious conflict-of-laws issue, right? It can come into conflict with Rule 34, litigation holds, and discovery requests. So that's something that companies are going to have to wrestle with. I think the conflict of laws, the extraterritorial reach, and just the size of the penalties are some big issues that we'll have to deal with. It also may ease certain issues that U.S. enterprises are dealing with now, insofar as there will be one uniform set of regulations. This is a regulation as opposed to a directive, as you mentioned earlier, right? So the member states are not at liberty to interpret the regulation and have conflicting implementations of it. This will be one set of regulations that will govern across the board. So there'll be a measure of clarity there. And then there are some provisions that are just kind of interesting that companies are going to need to learn how to deal with. For instance, the data portability provisions. In data deals amongst enterprises, we've long required that data, at the end of an engagement, for instance at the end of an IT services engagement, be ported over to, for instance, a company's new vendor, along with the schema required in order to interpret that data.
And what the data regulation does is actually give that right to consumers: both to be able to port their data from one company to another, and to actually be provided the means by which that can be done. Great, thank you. And feel free, any of you, to jump in if you have specific opinions. I was just thinking, if we're talking about the EU data regulation and consumer privacy, the first thing that comes to mind, of course, is the EU-US Privacy Shield. And I was wondering if Jake might have any opinions on this. Yeah, I don't think I can comment. Okay, all right. Well, how about Gautam? Anything? Any opinion whatsoever? I'm happy to speak a little bit, although this is far from my area of expertise, only because my boss and Julie Brill did a podcast about this. So I know at least what our perspective is, which is that this is a step in the right direction, although I think the surveillance concerns that the court raised when invalidating Safe Harbor are obviously still concerns. Having not read the document, I think the devil will be in the details. This is a very lawyerly answer, right? I don't know yet. We'll have to see the details, and we'll have to figure it out. I mean, that's sort of the situation. I don't think anyone could have predicted the decision in the Schrems case long in advance, and so it really destabilized a lot of the conventional wisdom. As to the degree to which the Privacy Shield can return us to a state of regulatory stability, I think we'll have to see how it gets implemented and what the document says. I think some of the fundamental political questions underlying the court's decision have yet to be addressed. And so I think that relying upon the Privacy Shield as the final answer is probably not warranted, and I suspect that most of the entities covered by the transfer of data between the U.S. and the E.U. are not expecting it to be the savior of this particular issue.
So, interesting times, I guess, is probably the verdict. One thing from just a practical business perspective: the Privacy Shield is pretty extensive in the obligations that it puts on companies that are planning on transferring data, and what that does to smaller to mid-sized organizations is put a pretty substantial monetary cost on the ability to comply with what is needed, or effectively shut them out of certain markets unless they're able to apply one of the other approved mechanisms. So I think that there will be some long-term impact on the ability of smaller companies to flourish internationally. The only thing I'd add, and the text just came out a couple days ago, so I'm still reading it, is that a couple aspects of it leap out immediately. One is that it provides that E.U. data subjects should have a dispute resolution mechanism available with the U.S. entities under the Privacy Shield in order to resolve disputes regarding use of their data or alleged breaches of their data, and it should be done in a way that doesn't demand expense by that E.U. data subject. One of the interesting things about that, and I imagine this may have been part of the motivation on the U.S. negotiators' side, is that it kind of dovetails with what's become standard practice in privacy policies over the last few years, since the Concepcion cases, which is to include an arbitration mechanism within a privacy policy or the terms of use, under which there's a class action waiver and under which the company, the enterprise, assumes the cost of the arbitration. And so a measure that's intended to protect the E.U. data subject actually also dovetails with a liability mitigation mechanism that we've been building into terms of use for a few years now. One of the important components during the negotiation of the Privacy Shield between the E.U. and the U.S. was the right of private action for non-U.S.
citizens, and as you know, Obama recently signed the new law, the Judicial Redress Act, that will give a right of private action to non-U.S. citizens. And what's interesting about it is that while U.S. citizens have certain rights to sue U.S. government and enforcement authorities, non-U.S. citizens will have slightly narrower rights to sue U.S. enforcement and government authorities. Also, there's going to be a list of those authorities. So not every government and enforcement authority can be sued by non-U.S. citizens, which is quite interesting, and the list will be designated, I think, by the Attorney General. So I'm very interested in the new law and how it can address the concern coming from the European Union that non-U.S. citizens don't have the same rights and don't have a way to seek redress. So it's quite interesting. Let's see how it's going to go. Great. Thank you. I think that's something we're all watching right now. And I think we're seeing that privacy is becoming more of a global issue and not just a country-by-country issue. And we see a lot of changes, as you all were mentioning, just based on the fact that companies take in so much more personal data now. So I was wondering if we could talk a little bit more about that: how is this increase in personal data, whether it be health data, data from your fitness tracker, student data, or just information put into Facebook, how is this increase in big data changing privacy for consumers? Anyone want to start? So I think one of the interesting developments in the past year that's sort of driven by the collection of data by enterprises is reflected in the fact that a lot of big tech companies have started to open-source or distribute their machine learning and big data technology. And a lot of the big data technology that the industry relies upon is open-source. And we can surmise the motivations behind that.
The real value is in the data itself, in the data repositories and what's in them. It's mutually beneficial, then, to allow this code to go out and be developed at large, because when the technology is developed, the companies that will be able to derive probably the most benefit from it are those that are collecting large data repositories just as part of their business. And I'm actually seeing some of this in academia. Some of the most interesting research in data science right now is not being done in the academy, actually. The researchers want to go to the big enterprises, because they're the ones who actually have the most interesting data collections. Can I make another comment on that? I think it's a really important point. Because a lot of the machine learning and data analysis packages are open-source these days, that really reduces the barriers to entry for using those kinds of tools to analyze large sets of data, but also not-so-large sets of data. And also, it's very easy for companies to generate large sets of data. You don't have to be big to have a large amount of data about your customers. You could imagine that if you run a food truck, you might think: I wonder if I should ask my developer to try to de-anonymize the identities of my customers who send orders through an app and identify them on social media, so that you can reach them and make additional contacts and things like that. I don't know this for sure, but I think ten years ago that might not have been doable as a weekend project. I think today it might be. And that's something that makes it much easier for startups, but also individuals, to do this kind of analysis. And I think it's entirely possible that they won't be thinking through the security and privacy implications of doing that analysis, because it's so easy for them to use these packages. I believe that technology will change healthcare. There are so many interesting things going on in healthcare. I'll give you a couple of examples.
For example, I don't know if you've heard about the digital pill. So there is a pill. I'll give you an example. Let's say I have my grandma in New York and I live here, and she thinks she's okay to do whatever she wants to do, but I can't really control how much she drinks, how much water she drinks, if she eats regularly, if she sleeps. Who knows, things can happen. So if she takes a digital pill, it allows me, of course after I set up all the controls as appropriate for me and my grandma, to monitor how many hours she slept, how many liters of water she drank, and how much food she took. And I can do much more, like how many hours of physical activity. Did she move at all? So basically, whatever my fitness device does, the pill can do for my grandma. Those are the benefits, but think about it. It's scary. First of all, if that pill gets into the wrong hands, anything can happen. Also, think about hackers. It depends what the pill includes inside, but if there are technologies that hackers can utilize to hack the pill, it can end up being very bad. So these are the implications: technology can help the human race and at the same time can do quite the opposite. The same goes, I think, for pacemaker devices, right? I think there have been situations where hackers were able to hack the device just for the fun of it. It was pretty scary, because it really affected the person who was wearing the device. And it's interesting how we can mitigate the risks while keeping the benefits that technology can provide to us. The big data era has led to a collect first and ask questions later mentality, and to some degree that ship has sailed: collection of data is the default for many companies and startups and those who are interfacing with consumers.
I think the issues around security are a good reason to think about data collection in a limited way, and the way those systems for collection are set up is also something that deserves a lot of attention. More information doesn't necessarily mean more knowledge, and I think the predicates and the assumptions that companies have when collecting data and analyzing it can often be hidden to everyone, to the internal staff and to the public as well. I don't know if that's nefarious; I think it's just that we don't necessarily know our own biases. So when thinking about big data analytics as a tool for social good and for better understanding of individuals and potentially even society, I think that's all very possible, and I hope it comes to pass. But I also wonder how that collection and those decisions are architected, whether or not they can reify social difference, and how we can work to be inclusive and protective of everyone so that it doesn't just end up reinforcing certain issues that we've seen across the board. It's kind of what Jake was mentioning about how consumers generally expect that if something is on the market, it's going to be safe. I think that, in combination with, frankly, the shininess of all the new technology that's out there: I'm so excited I can wear a Fitbit and know exactly how much I need to go to the gym; I can tell that my dog got his food this morning through an app. It's great, and we don't stop to think about where this information is going, who has control over it, who it's being shared with. And I think because of that, it has fundamentally shifted what was considered a reasonable expectation of privacy before, and that has legal implications, because it's such a subjective standard.
Things that are tied to what we should be expecting as our privacy will start shifting in the law as well, and we will have fewer protections, and there will be fewer reasons for companies to go out of their way to ensure that you have control over your data or that your data is secure, and other things that I think other countries, particularly in Europe, are now thinking about how to address. One other point on that: on the question of whether to retain or to get rid of large amounts of data that are collected through whatever product, I think that currently the balance is heavily on the collect-and-keep-it-around side, figure out what to do with it later, figure out what kind of wisdom can be drawn from that data later on. But I think that startups and corporations and larger businesses who are thinking about adopting best practices and being ideal citizens of security and privacy will take a hard look at the data they're collecting, and they'll think about what they need to serve their customers, and they will think about having policies that destroy data that's not necessary. That's something that the FTC has made a lot of recommendations about in its data security and privacy advice and in the recently released report on big data. And so it's worth being an advocate, maybe to your employers and to others, for not keeping some of this data. That's also something that enterprises are going to have to deal with just from a statutory and regulatory perspective as well. There's a move towards codifying requirements that data be deleted once it's no longer useful for the purpose for which it was originally collected. The State of California is now putting that into the education code, for instance, and it's baked into the data regulation. Interpreting that and dealing with that is going to be an issue that enterprises are going to have to deal with in the next few years.
So does anyone on this panel think we should actually be collecting more data? No, the answer is no. Okay, so everybody out there, companies: just stop collecting data. I'm going to abstain from that. I don't think it's possible. I mean, there's Moore's law, and storage actually grows faster than Moore's law. I think right now the estimates vary pretty widely, but the world has something like 10 to the 21 or 10 to the 22 bytes of data, which, you know... It's all relevant to why we're doing this, why we're collecting data, and for what purpose. If there's a driver in the business decision for the collection of data, if we need this data, then why not? But of course we need to disclose it and be transparent about our practices. I think the main question for me is why we're doing what we're doing. If we come up with the right answer, there might be some situations where, why not? But again, there should be a good reason why you want to collect more, because right now most companies collect so much that they don't know what to do with it. "Just in case," that's the whole thing. But it's also a change in the products themselves, right? Every car is becoming connected now, and the amount of data that's generated by a car is phenomenal if you think about it. Telematics has been touted for a while as being the next big space, and for obvious reasons I think that's the first place where we're going to see the effect of the Internet of Things. How many of us are willing to part with our cell phone because we don't want additional data collected, right? As nervous as we may be, as nervous as one might really be about the idea of being tracked and data being collected, the market shows that we're still ready to buy the products that are collecting more and more information about us.
This is sort of the reason that security was the point I raised earlier. There may be a business purpose to collect a lot of data, and I think there certainly is one, or at least there's a perceived purpose. The only way to limit that, at least at this point, is probably to realize that there's a related business purpose to limiting your data collection, which is where the data security issues come in, because the more you have, the more tempting a target you are. Being a little thoughtful in advance about why and what kinds of data you're collecting, as opposed to a sort of grab-it-all-now-and-figure-it-out-later mentality, means that not only do you probably have a more manageable data set to figure out what to do with, rather than a seemingly infinite amount of information, but also, for those actors who are external to your data for whatever reason, the less you have, probably the less tempting a target you might be. I don't think that's true across the board; obviously some data sets will be more valuable than others, and some companies will be more attractive than others for unauthorized access. But as a general principle, I think that the idea of limiting data breach as a business liability can be very motivating for a lot of firms. Kind of related to that: not all big data is equal. It's not just what you're collecting or how much of it; it's also how long you are keeping it, whether it is encrypted, how it is being stored, how it is being protected in transit, and who you are sharing it with. How the answers to these questions come out really changes how big of a vulnerability you've left for yourself and your consumers. So it's very scary out there, essentially, but there are ways that companies can protect their consumers and their users. So what about outside of the corporate realm? What can consumers or just individuals do
about this? Should they be checking their privacy settings? Should they be going further and talking to company representatives? What do you see as the way to protect privacy, and how can individuals help with that?

Well, in the privacy area, I'd say the FTC's main enforcement focus has been encouraging companies to be transparent and upfront with consumers regarding their privacy practices, and ensuring that when the industry makes representations to consumers, it keeps those promises. I think that's probably a pretty good place to start. Everyone has different preferences when it comes to privacy: some people share everything they think, and some people have an air-gapped Linux box in their closet. Those two people have different preferences with respect to what they will tolerate when it comes to the privacy of the products they use. The purpose of the FTC's enforcement actions in this area is notice and choice: for people to understand what they're getting into and to make educated choices. As consumers, folks can look closely at what the companies offering products tell you about protecting your privacy, and then hold them to it. If they don't abide by those promises, you can go to consumer.ftc.gov and submit a complaint. So educating yourself about privacy, and forcing companies to abide by the policies they make, is a good start. Probably not news to folks.

Going off of what Jake said: educate yourself. While I'm sure all of you don't want to read every page of a privacy policy, having a general idea of the practices of the companies you share your most personally important data with is a good first step. If you can't do that, think about ways you can protect yourself: use two-factor authentication, don't rely on passwords alone. There are a number of steps you can take that fall short of reading every edition of Apple's privacy policy.

I feel like we tend
to complain and blame Big Brother, who's watching us, and the big companies, and I'm guilty of that. But sometimes I think we forget that with our basic cell phones and their video and photo capabilities, we are our own threats. It's not Big Brother who is spying on us; it's us. We take pictures of people who might not want to be in the photo, we videotape, and then we put it on social media for everyone. We've forgotten the basic code of conduct in a society where we should respect each other's rights. Maybe think about it not from the legal perspective but: how do I want to be treated? Some people are more open; their appetite for risk is different from mine, for example. But think about the cultural differences, how other countries think about privacy, and try to respect their rights and their preferences.

Maybe it's just me, because I grew up in the Soviet Union, and Big Brother was watching me at times. I used to live in a secret city; it wasn't even on the map. My parents, computer engineers, software engineers, were designing nuclear bombs. So there was a specific city in Russia no one knew about, not even the Russian population. We were monitored: our phones, everything. The city had everything; you didn't need to leave, and it's not like you could leave easily. Coming to this country showed me how amazing it is to have all these opportunities and how diverse the culture is. I think we tend to forget how culture can affect our preferences. And I think we should be really careful with how we're using basic cell phones and social media, so we don't hurt each other; we're becoming our own threats. So, the next time any of you want to complain about surveillance...

How do you not live in a secret city that you can't leave? Now it's open; it's called Arzamas-16, and the number was assigned to mislead the US, of course, during the Cold War. Now it's on the map, though it's still closed. I
can't go because I have dual citizenship, but my mom usually leaves the city and comes and sees me. You can check it out: now it's called Sarov, or it used to be Arzamas-16. You can find it.

Wow. Okay, so I guess that brings us to another, similar point. When we think about privacy, we also think about how that can affect things like free speech, or how privacy can protect or promote speech. I know Michelle might have some thoughts on this; do you, or anyone else, have anything to add? No thoughts on this whatsoever? No?

So to me, privacy and freedom of expression really go hand in hand. When you are trying to talk about controversial topics, to explore subjects that you might be a little bit uncomfortable with, you need the ability to do it without fear of reprisal, whether that's someone literally standing behind your shoulder, whether it's government surveillance, or whether you just want to be able to speak in a forum online about a politician doing something you don't approve of. Without the ability to do that anonymously, the discussion as a whole is just not as rich. I think that's why, when we talk about anonymity online, while it does have some negative consequences that people like to point out, and they are legitimate problems, such as people being a lot more harsh when they aren't speaking with their name attached, overall I think the benefit is good: it provides for a richer discussion on a wider array of topics.

I definitely want to second that; I think Michelle is absolutely right. There is a really interesting book by Neil Richards, a professor at Washington University School of Law, called Intellectual Privacy, that talks about how privacy enhances the ability to explore ideas and speech, and freedom from intrusion. The flip side of that, though, is that I think there have been some concerns, especially in the wake of the Supreme Court decision in Sorrell v.
IMS Health, about the ability of privacy statutes to survive First Amendment scrutiny. I personally think some of these concerns are overblown, in part because the statute at issue in that case was a particularly badly written one. In terms of these issues, especially post-Snowden, there has been a little less academic and policy attention to the intersection of privacy and speech; some of the other issues that have come to light more recently have been perhaps more pressing. But I don't think the story on that one has been finalized yet. Especially at the federal level, where there are proposals for legislation that would create privacy protections, I think we will see more concerns about free speech, particularly in an era where the courts have been very favorably inclined towards corporate speech. One of the things we talked about earlier was whether or not data retention limitations are beneficial; I could see a business claiming a constitutional, First Amendment right against them. I'm sure some people would make that argument; whether or not it would gain traction is, I think, an open debate. But the questions of how privacy and speech intersect are still up in the air, especially with recording and how that intersects with the private sector and with the government.

I also think there's a tie between freedom of expression issues and both data transfer and encryption. For instance, as you're undoubtedly aware, Russia instituted a law two years ago under which the personal information of Russian citizens, wherever they are in the world, needs to be stored in Russia, essentially in open text, an unencrypted format, which has an obvious effect on compliance for multinationals. You may have seen the coverage about the issues Twitter had with that, but there's a clear freedom of expression issue there as well.

Well, I mean, this is extraterritorial, though. You know, for all of the reputational damage that the United States
has suffered post-Snowden for intercepting communications, or allegations of that, it's astonishing the extent to which the countries making those allegations have these types of localization requirements. Russia's not alone.

It is also worth noting that when a lot of these regulations are put forward, the viewpoint, or at least what's said by the governments trying to push these laws forward, is that it's in the name of privacy: they want to be able to protect their citizens; they want to be able to control the way people handle data. But at the same time, that opens things up for national security and law enforcement agencies, and it leads to a variety of outcomes that are not quite as palatable and are definitely not in the interest of the privacy of the citizens there.

Great, thank you. I think we'd like to switch tracks a little bit right now. The Internet Law and Policy Foundry is an organization that was created to help young, early-stage professionals who are trying to get into technology law and policy; it's an organization to help people who are on their way to careers like the ones all of these panelists have. So I was wondering if all of you could speak a little bit about how you got to where you are, and what advice, if any, you have for students or early-stage professionals in this space. Elena, do you want to start?
I got into privacy by mistake. Well, it wasn't a mistake, but it definitely wasn't planned, and privacy as a field didn't exist at that time the way it exists right now. First, I just developed a passion for technology. I was at WilmerHale in Boston, and I was working with a lot of bright people from all over the world coming to the United States, some of them doing nanotubes, some doing C++, and I was just amazed at how intelligent they were and what amazing things they were doing for the world. So I thought, I want technology; this is amazing, this is interesting. Technology work was coming out of IP, so I got into IP, and I was very interested in the confidentiality aspect of the diligence we performed when we purchased a lot of different companies; that confidentiality and security aspect was very interesting to me. Very slowly, I moved from that piece to privacy.

But again, I was hired by a bank on a contract, literally a two-week contract, and I was thinking to myself, banks are boring, I'm not going there. And my recruiter said, just go there, it's fun and it's interesting, you wanted to explore different industries, so why not? Okay. So I was there for two weeks, and that's why I say: never say no to things, especially if it's only a two-week assignment. I actually didn't regret it. I was really interested in the whole concept of banking; the financial industry was very new to me, and I was learning a lot, and that's my biggest fear, to stop growing. I really enjoyed it, and the people were fascinating. The woman who was my boss, she is my friend and my mentor, and she said, can you stay for another week? I have a really cool privacy project. We do online banking and mobile banking, and there are a lot of issues with privacy and security; you work with tech teams, you love technology, here it is, just stay. I actually had another job at a law firm at that time, and she convinced me. She gave me
whatever I wanted, just stay. And I stayed. I didn't really want to be in the financial industry at all, and I really didn't care much about privacy at that time, but I was really interested in mobile and online banking as a product, because there was a lot of technology, working with security teams and IT. And that's it, I got hooked. I couldn't believe it; it was such a fascinating, amazing environment, with people of diverse backgrounds, where everyone brings something and you can make a difference. And that's it, the rest is history. I'm out of the financial industry now.

I will second the accident, question mark, career path. Part of the motivating reason for me to go to law school, about a decade ago, was a lot of the file-sharing litigation that was happening, and there was also some privacy and security stuff, especially with the EFF suit involving AT&T. Those were the motivating factors, but people were talking much more about copyright than about privacy and speech and security. My entry into this world was truly an accident; it was really based on getting a job at the ACLU that focused much more on privacy and civil liberties than on IP. I thought, the ACLU is great, I would definitely do that, and it worked out really well for me, because it allowed me to build a resume that really focused on privacy and civil liberties work.

In terms of advice, I think one of the worst words I've ever heard is the word "networking," but the part of it that I think is valuable is talking to people as though they're human beings. For those people who are lawyers, and probably also for engineers... I don't want to cast aspersions on a profession, and certainly no one in this room, but many lawyers are very weird people and don't really know how to talk to other people. So if you are a person who happens to be a lawyer who can talk to other people, you will probably be in a very good situation, because it means that you
can talk to someone, here or at another event or through work or at some other random time when you meet a lawyer, which is probably very often, about what they do, how they came into it, why it's interesting, what they're working on, and what they've been really engaged with. And that part of networking, if there is a good part of it, is probably how I managed to build a career in this world. I think it's important to figure out not only what you're substantively interested in, what areas, but also how you want to work in those areas. We're lucky now: technology law and policy is growing pretty rapidly, and people can have careers that are a lot more diverse than they used to be. You can work at a nonprofit, you can work in-house, you can work for the government, you can work internationally; you can work on privacy, free speech, security, international issues. There are a lot more options, and taking the time to figure out what it is that motivates you probably makes you not only a lot happier as a person but also a lot more successful in your career.

So it sounds like the lesson of this panel is: if you want to do privacy work, start in IP. I too started as an IP litigator; I did that for about four years. Before I went to law school, I was a software developer, so I had a sort of long-standing interest in technology and an interest in incorporating a technical aspect into my work as a lawyer. After doing IP litigation for four years, I moved to the FTC, and there it's been a mix of all kinds of stuff the FTC does, including data security and privacy, but also advertising cases and competition work as well. So I think a lucky accident is pretty good advice. The other thing I'll say is that I interned for the FTC in law school, and I think that was pretty instrumental in having people at the FTC not just discard my resume immediately. So interning where you want to work, interning in any area you want to work in, if that's something that's possible for a student,
for example, is a great idea. Other than that, I think there are lots of paths in; I certainly couldn't see mine for a long time, so it's not something that has to be linear.

So mine was not an accident; it was more of an obsessive-personality issue. I started in IP, so I'm keeping with the trend here, but while I was at the Foundation I was asked to write the next privacy policy for Wikipedia and its sister projects, and I took that on and really dove deep into it. It led to me talking to all of the departments at Wikimedia, understanding what their privacy practices were, what their needs were, and what they wanted and needed out of the privacy policy. As a result of that process, I realized all of these things we could do to help better secure our user and donor data, and that turned into thirty more projects, and it became more of a career than I thought it was going to be originally. I realized this was something I was passionate about and wanted to continue to pursue. So part of my advice to you is: if you're already at an organization that doesn't have a big privacy practice, look for ways you can improve it; start taking initiative on those projects and see where it goes. Another thing you can do, similar to the dreaded word networking, is join a national organization that has to do with privacy. Tiffany might laugh at me for saying this, but I recently started with the International Association of Privacy Professionals, and they have certifications, networking opportunities, and other ways to get involved and see the issues.

Well, you may be surprised to hear I'm an IP lawyer, and I'm happy to share my background and how I ended up working in privacy and security, but I'm not sure it's a trend, because it wasn't a discipline when I first came out. I went to graduate school in computer science in the early 90s and worked in data mining, which was what big data was referred to as then, and immediately I
went to law school after that. I remember being asked in an internet policy class, the first internet policy class of its kind, in the mid-90s, to write a paper on the legal implications of data mining, and I found there was actually no substantive material on which I could base a paper. I mean, there was the Fair Credit Reporting Act and FERPA, and no real body of law; the Data Directive was just coming out. But nonetheless, cryptography was becoming an issue, and I spent a lot of time in law school being the person who explained asymmetric encryption keys and how that concept works. Then I came to Silicon Valley in the 90s to do tech deals and gravitated into data privacy and security as part of doing international deals, primarily because the data transfer laws were emerging in Europe, particularly under the Data Directive. In doing multinational IT services agreements, we needed fairly robust clauses to address issues under the EU Data Directive, and I got into data security issues that way as well, developing an understanding of data security standards so that we could build contractual terms consistent with them.

One of the things I want to add: if you are thinking about another degree after law school, don't. I highly recommend that if you have some time to explore a subject, to expand your subject matter expertise, look into de-identification of data and data analytics. This is the next thing; this is something that can help us mitigate risk; this is something that makes you so valuable to the business, and if you're a privacy and security head, oh my god, you're golden. This is something I struggle with, because it's very difficult; it's very complex, and not every type of data is regulated, meaning you don't have very clear guidelines for how you de-identify data. For healthcare the law does provide guidelines, but they're very hard to follow, and there is not much material. So the experts are the people who actually do that,
and if you become one of those experts, you will be all set. Plus it's fun, especially if you love data. Another thing I wanted to mention is cloud. A lot of products are moving to the cloud, and it's so important to understand how it works. If you're a privacy professional, take a cloud certification; it's very helpful, it's very handy, it's going to help you a lot. Even if you don't do cloud work right now, you will be so much more knowledgeable when you work with your businesses. So just for your own sake, understand cloud and what's happening in this field.

Great, thank you, everyone. I think that's a lot of really interesting information for people trying to start out. So: don't go to law school, but if you do go to law school, do IP law to become a privacy professional. This is really helpful, though. I think the message is really just, you know, take chances, go for what you're interested in, join an organization, get certifications, make your own way. That's something we've all done, and something everyone can do really no matter where you are in your career right now. So I'd like to thank everyone on this panel and open this up for questions. Jennifer has a microphone, so if anyone has a question, please raise your hand and she will come to you. If the light is green, it's on; the button is on the bottom of the microphone. Perfect.

Thank you for coming. Earlier you spoke to the shift in the expectation of private data. The question that came to my mind is: why are we still using Social Security numbers as a unique identifier?

So, identity is a really difficult thing to do in the context of technology, and you're right that SSNs were never designed to be one of the main robust identifiers of identity. Why are we still doing it? The lack of will, and the fact that it's a hard problem. I think there have been people trying to address this problem, but this is one of the really thorny issues. As with many issues that we deal with from a policy point of view, everyone agrees
it's not working, but no one has come up with a great solution that I know of, and I'm not that smart. Again, what's the motivating factor for change? Probably some scandal, which we haven't yet seen at a level that would really motivate that kind of change. We'll see.

Thank you for such an enlightening discussion. As far as the destruction of the extraneous data that companies hold goes: a lot of the time, a lot of research resources are based on this so-called unrelated data. Researchers doing their research on such data, especially in the medical field, come up with technological advances based on it, and at the outset they don't know it's going to be relevant to them. So how do you balance this with the privacy perspective, and where do you draw the line?

The statutes have express carve-outs for research and for enabling collaborations. For instance, if you look at the federal statutes governing educational data and their state-level analogs, they typically have carve-outs allowing data to be retained and shared so long as it's done for educational purposes and it's de-identified. In some sense, CISA has that characteristic too, right? That statute was passed last year to enable companies to share information regarding security breaches or potential security glitches without exposure to third-party liability for doing so, and in some sense that's also a carve-out to allow for research. So I see that as a recognized issue; I think healthcare is probably where it is perhaps the most significant outside of the research context.

That's a question whose answer is going to vary significantly with the product or the business for which the data was collected. I mean, I believe, and somebody here may know better than me, that Google retains search history results for a certain period of time, I think it's anonymized after that, and I think there is a roll-off, but I'm not sure about
that. But I think that's sort of an indicator that Google has made a judgment. I'm speculating, not speaking for the Commission, but I think it reflects a judgment that search history data is not particularly valuable in a form in which it's linked to a person after a certain period of time, and that has a certain intuitive appeal. If I'm in the market for some product, that's only going to be true for some period of time, after which I'll presumably have made a purchasing decision and nobody really cares anymore. I was looking to buy a doghouse four years ago; that's not relevant now.

Any other questions? Yes.

I wanted to ask, and this is for anybody on the panel, about the line between us today and a future where, based on what information is available about a particular citizen and their conversations, it's really difficult for people to associate with one another and participate in private communication. Where is the line between today and that future, what's the line for you, and do you think we've crossed that line?
And I think being able to talk to one another and develop ideas and opinions together is essential for democracy. That's my question.

It's a philosophical question. I think we're very close to crossing the line, in my opinion, and I'll go back to my comment about our code of conduct: we should take responsibility for our own actions, not blame companies and the government for what they do and what we expect them to do. We should take charge, and we should voice our opinions. And I don't think we're doing much better with respect to each other's privacy; it's going to get worse. We talked about all the wearable technologies and medical devices; we haven't even addressed self-driving cars, that's another animal entirely, so I don't know if you want to go there. But we forget that technology gives us a lot of new opportunities. Think about those self-driving cars: they give a person who is handicapped the opportunity to drive somewhere they need to go; parents can send their kids to school, if the car is finally safe. There are a lot of opportunities. But at the same time, even though the data collected by auto manufacturers right now is your speed, your speeding, your accidents, et cetera, the cars will probably collect data about when you went to the liquor store: are you taking a sick day, or are you just drinking right now? If you are going through divorce litigation, maybe there will be records of where you spent your Saturday night while you were still married. So I don't know the answer to this question. I think just being aware, continuing to raise these good questions, talking to each other, talking to the government, talking to companies, having these panels: that's something we all can do. And taking responsibility, that's very important.

This relates to what Michelle was saying before about privacy promoting speech and the ability to play around with intellectual ideas. It sort of reminds me of secondhand smoke, where you might make
your own individual choice about smoking, but you also affect other people. I think people are empowered to make their own individual choices about what they do with their data and the privacy they have, but that affects other people as well, and it has yet to be determined what the bounds of that are without coming up against a lot of the other free expression rights that we value within this country and globally. I don't think any of us has an answer to this question; if we did, we would probably have a lot of money and a different job. It's certainly a complicated issue. I do think there is more awareness of it, and, not to be too Pollyannaish, I do think that a better perspective, or at least more individuals understanding how these areas intersect, is more likely to lead to a more tenable solution.

I would also say that the issue of privacy in personal communications and personal associations, and the tension between that and open disclosure, is not new. Think of FOIA: if you work for the government, federal or state, then under FOIA or its state analogs there is a possibility that your communications over email, which may be entirely personal, will be available to someone who puts in a request. That's not new; it predates concern over contemporary privacy issues. I also think that, as was pointed out, we're going to be facing these issues in a much bigger way with the Internet of Things, telematics, driverless cars, et cetera. It's difficult for us to conceive right now what types of information are going to be tracked, how easily they're going to be tracked, and how readily they're going to be available. But I think there is some recognition in the law, sort of a retroactive recognition, that there should be protections for private communications. For instance, it generally takes a warrant to get access to communications on social media. Even though these are communications which are quasi-public, there is legal process, and there
are Fourth Amendment rights in those communications and associations.

One other thing worth noting, and we're already starting to see it a little bit more, is the concept of privacy as a product, as a feature, something you can market on. Think about Signal, think about Wickr, think about other ways that you can send messages and have those communications be private. And think about, I guess, the way Apple is starting to really show exactly how committed they are to privacy. That's important to a lot of people; it could be the difference between whether they decide to get an iPhone or an Android. So I think naturally there will be a mix of these alongside things that are more privacy-invasive.

I completely agree; I think it's great for people to vote with their wallets. There's a sort of dynamic where a technology is new, it's complicated, and it's difficult to execute the first time, but after a while those features become standard. So we have search engines that don't store your search history, and we have messaging products that have strong encryption. Folks who care about privacy can both use those products and pay for them. It probably also makes sense, if it's an open source product, for example, to do something to support it, and maybe recommend it to your friends; if you have the opportunity to do that, you can.

I think we might have time for maybe one more question. Go ahead.

Speaking of democracy, what about political parties using big data? They're not doing it for money, but they're using big data and merging every bit of public data with whatever data they can acquire. Every citizen has a profile, and right now there's no clear data opt-out, and I can only see that getting worse, where lobby groups would actually say, hey, how can we tip the scales in our favor? So how can we protect our privacy in that sense?

I know that the FEC is interested in this, but again, it intersects with other areas
of the law, especially under the First Amendment, regarding how political parties use data and what limitations there are. I think you would certainly see a lot of litigation about this issue if it were actually to be challenged by the FEC, sorry, that's the Federal Election Commission, or another agency. Voter targeting is very vital to both of the major political parties: it's how they get turnout, it's how they get donations, it's how they figure out what races to put a lot of money into and what races to let slide. So I think you're right to point out that it's an area of political concern. In terms of opting out, I don't know of obvious ways in which individuals can do so. A lot of this targeting happens at the cohort level, so it's not necessarily about you individually but about your demographics, where you're located, and what kind of individual you may be based on that. I think voters are going to be increasingly targeted, and probably increasingly cranky about this kind of targeting, which is probably the only way there might be any change. But I even hesitate to say whether individual voter education has an effect on political parties, because anyone who has a thesis about how individual voters relate to political parties is probably having that thesis questioned a lot this year. So we'll have to see what happens.

It's well known, and the parties acknowledge, that they have individual voter records, and data analytics are how campaigns are done now. You'll notice that both at the federal level and at the state level, the data privacy statutes generally include exceptions for political campaigns. The TCPA, for instance, does not apply to political calls, and the Shine the Light law in California has an exception for political solicitations or for sharing political information. Part of that might be because legislators are interested in promoting democracy, and
it might partly be that they're writing the laws that govern themselves.

I wanted to share a case study. I was at PrivacyCon in D.C. recently; if you've never been to PrivacyCon, it's pretty cool. It's a conference that's open to the public, where different researchers from all over the world come and present their studies, open source software, and research and development projects. This wasn't part of a presentation, but I talked to a couple of researchers, some of them from Europe, some of them from Princeton, and they were discussing exactly what you're concerned with. They said, well, you can't really do anything about it, because the data is collected all over the place, on the internet and maybe beyond. What you can do, and they've done some studies showing it really affected people's privacy, is use fake names where you can. Sure, the credit card will be there with your name, but use a fake name where you can; create something. When you buy a coffee, use a different name; use cash; don't shop online. Sure, it's going to affect your convenience, but their studies showed it did slightly change what type of information gets collected; at least it can stop companies from following people online. The moment you go on Amazon and check some vitamins, those vitamins will follow you for the rest of your life, no matter where you go. So they made some basic changes, like using different names in public and with some apps, and limiting their shopping activity online, and they noticed the difference. It's probably going to take some time.

Great, well, thank you, everyone. I think that's about it for our time, but on behalf of the Wikimedia Foundation and the Internet Law and Policy Foundry, I'd like to thank all the panelists, and I'd like to thank all of you for coming out to our event. Now it's time for more free wine and cheese, but thank you again, everyone, for coming to our event and for helping out.