Stacey Snyder wanted to be a teacher. By spring of 2006, she had completed her coursework and was looking forward to receiving her teacher's certificate. Then, from one day to the next, her dream was over. She was summoned by the dean of her university and told that she would not receive her teacher's certificate. She would not be a teacher, although she had earned the credits, passed the exams, and completed the practical training. She would not be given her certificate, she was told, because her behavior was not becoming of a teacher. Her behavior: a photo, showing her wearing a pirate hat and drinking from a plastic cup, captioned "drunken pirate." Stacey Snyder had put this photo on her MySpace page for her friends to see and perhaps to chuckle at. But the university administration found that the photo might induce minors to consume alcohol, and thus was inappropriate for a teacher. When Stacey was confronted by the university administration, she considered taking the photo offline, but it was too late. Her photo had been indexed by search engines and archived by web crawlers. As much as Stacey wanted the photo to be forgotten, the internet would not permit it. Remembering instead of forgetting. Remembering, forgetting. In 2001, Andrew Feldmar, a Canadian psychotherapist living in Vancouver, wrote an article in an academic journal. In the article, he mentioned that he had taken LSD in the 60s. In the summer of 2006, like many times before, Andrew Feldmar wanted to cross into the United States to pick up a friend from Seattle International Airport. The immigration officer Googled Feldmar and discovered the academic article from 2001. Because Feldmar had failed to disclose to the immigration officer (although he never denied it) that he had taken drugs 40 years earlier, he was barred from entry into the United States forever. Remembering instead of forgetting. Of course, you may say now, Stacey's and Andrew's cases are tragic, but at least in part, it's their own fault.
Had they not put information online, Stacey would be a teacher today and Andrew could still travel into the United States. Everybody has to decide for herself or himself what to make available online. What once has been put on the web is no longer forgotten. But not everybody knows what data is being collected about them, stored, and made accessible. In the German city of Eisenach, there is a very large and very popular nightclub with the telling name of Mad. Every visitor of that nightclub must get registered before entering. During registration, passport data is recorded, a photo is taken, and a database entry is created. So far, more than 15,000 individual guests have been registered. In return, guests receive an electronic charge card for the evening. Every transaction that night, every drink ordered, every tip paid is recorded, together with temporal and locational information, for each of the roughly 1,200 guests a night, and kept for years. More than 60 video cameras record every part of the club and store the video streams on 8,000 gigabytes of hard disk space. Since its inception, the club has never erased a single second of its recordings, and has provided local law enforcement access to the information it stores via the internet and without a warrant, just in case. Of course, you'll say, whoever enters Mad implicitly agrees to the information being collected. Really? Do we really know every time information about us is being collected, stored, and made accessible? For most of us, Google is the search engine of choice. Millions of people around the world send more than a billion search queries to Google every day. Google is showing them the way. Google also shows us what is being searched, where, when, and by whom. This, for example, is a picture of the trend line for searches on Iraq over the last couple of years. Google can do this even for events way back, because Google does not forget.
Since Google's humble beginnings more than 10 years ago, Google has stored every single search query it ever received and every search result the user ever clicked on. Through cleverly combining information streams, Google believes it can link queries to people. Google thus knows, for many years back, what each one of us has searched for, and when, and what search results we clicked on. Quite literally, Google knows more about many of us than we can remember ourselves. Remembering, forgetting. For millennia, for us humans, forgetting has been easy, built into us biologically. We forget most of what we experience every day, our feelings, our thoughts. Remembering is the hard part. Since the beginning of time, we humans have tried to overcome that biological forgetting and to hold on to memories that are precious. For thousands of years, we have tried, like this Navajo, to pass on our memories to our children in the hope that they too may thus be able to remember. This is how the great epics of the world emerged thousands of years ago. But human memory is not fixed. It changes as we reconstruct our past. Depending on it alone is not sufficient, especially when we want to capture something precisely or for a long period of time. Painting is one way of encapsulating visual impressions to create an external, more precise and lasting memory, like this beautiful cave painting of Altamira. Script, originally developed (can you believe it?) by accountants searching for a precise method of remembering, has for millennia remained humanity's preferred external memory. Language, painting, and script provided us with the capacity to remember through generations and across time. But these tools have not altered the fundamental fact that for us humans forgetting is easy and remembering is hard, time-consuming and costly. The book did not change this either; neither did the photograph or the phonograph, nor did film.
Remembering remained expensive for most human beings and was thus chosen carefully. In other words, forgetting was the default, remembering the exception. This enabled us to deal with time. Through our physiological capacity to forget, we rid ourselves of excess memory. What has long since passed fades in our mind. Thus, we pay tribute to time and depreciate what is no longer relevant to our present. But because forgetting is biological, we humans never had to cognitively develop a capacity to deliberately forget, to depreciate memories and to make them fade. Today, this is different. Google remembers, Yahoo remembers, Amazon remembers, the Internet Archive remembers, flight reservation systems remember. From biological forgetting, we have moved to permanent remembering. How did this happen? First, through digitization, which, as you know, enables very different types of information to be translated into a universal code and thus enables us to use the same information processing, storage, and distribution infrastructure. That's not only enormously efficient, it's also surprisingly future-proof. Second, through advances in storage capacity. In 1965, Gordon Moore surmised that the density of integrated circuits, quote, might approximate a doubling every two years, unquote. Remarkably, digital storage capacity has tracked the impressive increase in processing power that Gordon Moore first witnessed more than 40 years ago. This has permitted Google to create a global system of server farms holding a capacity of perhaps 100,000 terabytes of data. That's 100 million gigabytes. And growing. But storage alone is not sufficient. The East German Stasi had hundreds of millions of facts on file on almost a million of its citizens. Yet with its elaborate system of pseudonyms and codes and mostly paper-based files, it had difficulties, in the end, retrieving the information it held in time.
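To make the scale of that doubling concrete, here is a back-of-the-envelope sketch. It is a simple compounding calculation; the two-year doubling period is Moore's surmise as quoted above, and the 40-year horizon is the span mentioned in the talk.

```python
# Moore's surmise: density roughly doubles every two years.
# Over the roughly 40 years since 1965, that is 40 / 2 = 20 doublings.
years = 40
doubling_period = 2

doublings = years // doubling_period   # 20 doublings
growth_factor = 2 ** doublings         # 2^20

print(growth_factor)  # 1048576 -> roughly a million-fold increase
```

If storage capacity has indeed tracked that curve, the same budget buys roughly a million times more storage than it did in 1965, which is what makes remembering everything economically trivial. Storage, of course, is only half the story; as the Stasi example shows, retrieval matters just as much.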
This, too, is different today, with full-text indexing that was prohibitively expensive only a few decades ago and today is built into most major file systems. Add to that global access, which enables us to reach information through a global infrastructure. A few minutes are sufficient to disseminate a document globally, even accidentally, and have it distributed, like this page from the operating manual of Air Force One, which was made available online accidentally for a very short period of time. Once the mistake was realized, it was too late. Compare this to a conventional library with limited space for books, which forces libraries to periodically eliminate some books, in essence, to forget. Access there is time-consuming and full of hurdles. You have to first locate the appropriate book, then use the book's index, if it has one, to find the right page, and then read the page or paragraph to find the information. And you have to be at the library during opening times. The Internet is very different. Taken together, this has led to remembering becoming the default and forgetting the exception. It could be a reason for celebration. After all, haven't humans tried for millennia to overcome the straitjacket of our biological forgetting and failing human memories? To be sure, our vast and accessible digital memories offer numerous benefits, from increased accuracy and efficiency to the promise of helping us transcend human mortality. By the same token, undoing forgetting has consequences beyond the narrow confines of information efficiencies. Two terms characterize what's really at stake: power and time. Power is relative and relational. As privacy scholars have long argued, power over information may translate into power over the individual the information refers or pertains to. If others not only have information about us, but the capacity to keep such information accessible for a very long time, this informational power increases.
And it has the potential to influence how we transact and interact. But power is not only dyadic. Jeremy Bentham's panopticon is the concept of a prison in which the guards can constantly watch the prisoners without the prisoners knowing whether they are actually being watched. The aim is behavioral compliance through the permanent threat of invisible surveillance. Others, like Oscar Gandy, have suggested the internet may help create a global panopticon in which everybody has to assume that she is being watched all the time. Such a panopticon may lead people to self-censor, fearing that their utterances might be misconstrued by any one of the hundreds of millions of individuals and thousands of jurisdictions connected to it. But today we face more than a global panopticon, because of comprehensive digital memory. We have to assume that what we say or do on the net today will not only be witnessed now, but will remain accessible for years, perhaps decades, into our future. This creates a temporal panopticon, in which we may self-censor not because we are afraid of how others might interpret our words and deeds today, but because of how people and institutions in the distant future might view them. My second concern is time, or more precisely, how we humans deal with time. As I mentioned, forgetting is biological, so we did not have to develop conscious mechanisms to put different or perhaps contradictory pieces of memory into a temporal perspective. So consider the following hypothetical. Jane and John are old friends. Although they live in different cities now, they try to catch up once in a while. One day Jane receives an email from John telling her he's coming to town and is looking forward to having coffee. Jane is excited. She hasn't seen her old friend John in almost a year and wants to reply right away suggesting a nice place to meet. To remind herself where they met the last time John was in town, she queries her mailbox folder.
Up pop dozens of email messages she had received from John over the years. She starts browsing through them, but then her eye catches a 10-year-old email with a strange subject line. She starts to scan the text and then begins to read. Surprised, perhaps even shocked, she reads about how John deceived her and revisits an angry exchange back and forth between them. Slowly the events and the feelings she had, triggered by this concrete external stimulus, come back to her mind. Her sense of betrayal and deception. She reads on about how, over the following months and years, John and she apparently reconciled, although exactly how and why the emails don't tell. But at the forefront of her mind now is how John, her good friend John, deceived her. And suddenly she's not so sure anymore that she wants to meet John, and whether she really is looking forward to seeing him. As much as her analytical mind wants to disregard the revived memory, the angry words she read have triggered her past feelings. Emails are the external memory that helps us remember things we thought we had forgotten. But they also cloud our ability to devalue those memories and to decide. Put in abstract terms, for us humans it is difficult to realize time as a dimension of change. When we are confronted with two pieces of memory temporally apart, it is cognitively difficult for us to evaluate these two pieces with time in mind and not be captured by the memories coming back. We can try, but frequently the act of remembering something long ago, perhaps facilitated by an external stimulus, will bring it back into our minds with a certain freshness and intensity. That may trigger incorrect decision making. In analog times the dangers were limited. Our biological forgetting obscured our cognitive difficulties with time. But what happens when we are not permitted to forget anymore? We know a little bit about the consequences through studies of less than a handful of human beings who have biological difficulties forgetting.
This is AJ, a woman who has difficulties forgetting. Ask her about a particular day and she's able to tell you when she woke up, who called, and what was running on television, 30 years ago. That's not a blessing. It's a curse. She's haunted by her past, so much so that it limits her ability to decide in the present. As the Argentinian writer Jorge Luis Borges said, perfect memory pushes humans to get lost in detail, with no ability to generalize, to abstract, and to evolve. They lose, Borges writes, what makes us truly human. Tethered to an ever more detailed past rather than living and acting in the present. This is the fate I fear we face with comprehensive digital memory. Through perfect digital memory we also deny each other the capacity to change over time, to evolve, and to grow. Without forgetting it is hard for us to forgive, and so with comprehensive digital remembering we may turn into an unforgiving society. This is the real danger of shifting the default from forgetting to remembering. But there is another wrinkle to the story. What if, frustrated with the shortcomings of our own human memory, we begin to disregard our own recollections of our past and begin to depend on digital memory instead? Does that give those who control digital memory the power to change history? Hmm, so what to do about that? Well, there are conventional responses. The first one is the idea of privacy rights. The basic premise is relatively simple. By giving each and every individual a right to informational privacy, we empower people to fight for their own rights. Enforcement is both decentralized and delegated. It sounds great but comes with a number of inherent weaknesses. For example, in the U.S. the enactment of informational privacy rights is politically not feasible, at least in the present climate. But even if it were feasible, I have my doubts about its effectiveness. The Europeans in fact enacted strong informational privacy rights decades ago, but people have not used them.
Whatever additional incentives frustrated lawmakers later offered the public (no-fault compensation, attorney's fees, a shift in the burden of proof), nothing could change the bitter reality that individuals do not seem to want to enforce their privacy rights. At least not in court. Information ecology, the second conventional approach, is the conscious regulatory restriction of what information can be stored and for how long. Part of the idea of privacy regulation, these are norms that mandate, for example, the deletion of information after a certain period of time. Such norms necessitate government action, and compliance enforcement may be costly. But they have two advantages over individual privacy rights. They do not require individuals to go to court for enforcement, and they protect against an unforeseen future. Unforeseen future. Consider the case of the Dutch citizen register, put in place in the 1930s for a perfectly good reason: to ensure the administration of social security. The register included information on religion and ethnicity. Once the Nazis had occupied the Netherlands, they raided the register, repurposing the information in it to identify Dutch Jews and to send them to concentration camps. It is a horrific lesson. As we cannot foresee the future, and thus how personal information about us will be used, it may be better to store less rather than more. This is the essence of information ecology norms. Unfortunately, since 9/11 we have seen a significant backlash here, together with a wave of information retention laws, in addition to the rather pervasive rhetoric of fear and security that we are exposed to. Information ecology also goes against another emerging dogma, that of transparency. Transparency has become our preferred mechanism for ensuring good governance, both in the private and in the public sector, requiring information retention and thus limiting the political chances for an expansion of information ecology norms to address digital remembering.
Perhaps, therefore, we need to augment our conventional approaches and add some alternative ideas to the mix. Some have argued for digital abstinence, for staying away from the technical tools that enable digital remembering. Not sharing everything on Facebook, as President Obama recently reminded us, may certainly reduce the threat of digital remembering, but is it realistic? Over 200 million people have Facebook pages, with other social networking sites around the world accounting for another 400-plus million registered users. By the end of 2007, two out of three young Americans had put their information online. Two out of three. Did all these millions know that their information would be preserved long into the future? The exact opposite of digital abstinence is the idea of full contextualization, that is, to store as much information as possible. Here is the rather intriguing argument. The difficulties of digital remembering that are associated with what I called the time dimension stem from the lack of our ability to recreate the precise context of a past situation or decision. So if we could only store everything, including the context, we could avoid the negative effects of digital memory. In essence, this full contextualization would help us to regain our ability to think in time. We could revisit our decisions, and at the same time it would equalize information imbalances. This is David Brin's proposal of a transparent society. But will full contextualization ever be technically feasible? And if it were, will it actually address the challenge of digital remembering? Do we really have time to relive all our past again and again, only to grasp what experience is no longer relevant to us today? Another alternative is to hope for a cognitive adjustment in our society, that is, to hope that over time we'll learn to devalue older information and to live in a world with an omnipresent past.
Not society or its laws would have to change, but our individual process of information evaluation and decision making. Sounds right. And that would solve our problem. But is it likely? How long will it take us humans to change the way we assess information, to modify what we've been doing for ages? What is the appropriate mechanism for us to make that change happen? Cognitive psychologists are very critical of our ability to change our way of decision making in the short run. I hope this may work, but I don't hold my breath. A very different idea, then, is not to change humans but to change technology. Some have proposed to use technology to change behavior. We could create, for example, quasi-property rights to personal information and build them into our technology: into PCs, smartphones, and so forth. The technology would then ensure that only those whom I have permitted can access my information. In short, the suggestion is to create a global digital rights management system to protect privacy. But do we really need to create a global technology infrastructure that watches our every move to ensure that nobody abuses somebody else's personal information? Would we thereby not create a perfect surveillance system in order to ensure privacy? For such a system to enhance privacy rather than to undermine it, we would have to ensure privacy is built deep into the infrastructure. And while some engineering progress has been made recently, we don't have that ability yet. I presented six possible approaches to deal with the challenges posed by digital remembering. Three of them, on the left side here, mainly target the power aspect of digital remembering. Three of them, on the right side, aim at addressing the time challenge. None of these offers us a silver bullet, although all help in their very unique way. Hence we may need to mix and combine them, and perhaps even add something else. Something else. I advocate a revival of forgetting.
That is, to establish a mechanism that eases forgetting in a digital age and that makes remembering just a bit more strenuous. Not by much. I don't want to overly burden remembering, just enough to shift the incentives of forgetting and remembering back to what we humans are used to. Some have called it an expiry date for information. That's pretty accurate. I propose that whenever we want to store information, we are prompted to enter not just the name and location of storage but also a date until which we want the information to be stored. Once that date is reached, the information is deleted. Of course, we can choose the expiry dates at will and change them at any time. The core of the proposal is not the automatic deletion. The core of the proposal is the prompting: entering expiry dates will remind us humans, time and again, that most information is not timeless but linked to a specific context and situation, and that it loses its value as time passes. Expiry dates offer us a meaningful way of linking digital memory with time: by choosing an expiry date, we implement the temporal dimension into digital memory. So, for example, search engines could offer us an easy way to tell them whether, and for how long, we want them to add a search query to our search history. If I search for something that will not be relevant for me tomorrow or in a month's time, it's better not just for me but for the search engine as well to know. Ask.com has already implemented such a possibility in rudimentary form, and Google, Microsoft, and Yahoo! over the last year have progressively lowered the length of time they keep non-anonymized digital memory, from forever to 24 months to 18 months to 12 months and now to 9 months. Or take a bit more complex case: expiration dates for photos. Here people could carry a small device on their key rings that would permit them to set a small number of expiration date presets.
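The mechanism just described (prompt for a date on every save, allow it to be changed at will, delete once the date is reached) can be sketched in a few lines. This is a minimal illustration of the idea, not any existing system; the `ExpiringStore` class and all of its method names are hypothetical.

```python
from datetime import date, timedelta

class ExpiringStore:
    """Hypothetical store that requires an expiry date on every save
    and forgets entries once that date has passed."""

    def __init__(self):
        self._items = {}  # name -> (value, expiry_date)

    def save(self, name, value, expires_on):
        # The prompt is the core of the proposal: the user must choose
        # a date, which reminds her that information is tied to a context.
        if expires_on is None:
            raise ValueError("an expiry date must be chosen (it can be changed later)")
        self._items[name] = (value, expires_on)

    def change_expiry(self, name, new_date):
        # Expiry dates can be revisited and changed at any time.
        value, _ = self._items[name]
        self._items[name] = (value, new_date)

    def sweep(self, today=None):
        # Forgetting as the default: purge everything past its date.
        today = today or date.today()
        expired = [n for n, (_, d) in self._items.items() if d <= today]
        for n in expired:
            del self._items[n]
        return expired

    def get(self, name):
        item = self._items.get(name)
        return item[0] if item else None

# A short search-history query kept for a month; an old entry is swept away.
store = ExpiringStore()
store.save("search:cheap flights", "query log entry",
           date.today() + timedelta(days=30))
store.save("search:old query", "stale entry",
           date.today() - timedelta(days=1))
store.sweep()
print(store.get("search:old query"))      # None: past its expiry date
print(store.get("search:cheap flights"))  # still within its expiry
```

Note that the deletion in `sweep` is almost incidental; requiring `expires_on` on every `save` is what keeps the temporal nature of information in front of the user, which is the core of the proposal.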
Digital cameras would query the devices of all people in front of the camera, and the resulting digital image would be tagged with, perhaps, the lowest, the median, or the highest expiration date chosen by those in the picture. These cases are just quick illustrations of how some scenarios of expiration dates could play out. In fact, we can envision a wide variety of differing expiry date flavors to take into account the specific preferences of societies or the particular requirements of information types. I only foresee two common features. The first is that any expiry date system must aim at changing the default from remembering back to forgetting. It must give us choice: forgetting can be slow, or gradual, or reconsidered, and users must be empowered to set expiry dates as they desire. But forgetting must be made the default, and remembering must become the exception. The second core feature of any expiry date system that I envision is that it keeps reminding us of the temporal nature of information: that most information is not timeless, but linked to a specific context, and thus loses its usefulness as time progresses. Expiry dates, make no mistake, come with a bag of weaknesses of their own. They too are no silver bullet, and they are not designed to solve information privacy challenges beyond digital remembering. But expiry dates also offer some unique advantages, and their use, perhaps in combination with some of the other approaches I mentioned, could significantly strengthen the effectiveness of our overall societal response to digital remembering. Forgetting, remembering. Since the beginning of time, forgetting has been easy for us. In the digital age, the relationship has been reversed. Today digital remembering is the default: Google, Total Information Awareness, CCTVs. And it is forgetting that is often forgotten. I urge you to give back to forgetting the role it deserves. Let us remember to forget. Thanks. Let me tackle the second one first.
If you do a little math on a billion queries a day and all that, the amount of information that you have to record every day, if you're Google, in order to keep that history going is actually not that big. It is surprisingly small. So you can do that for a couple of hundred dollars a day. I was shocked, when you actually sit down and do the math, at how relatively limited that is. So I think the capacity problem was not a factor in their decision making. Certainly some regulatory rumblings, particularly in Europe, played a role. But also changing market preferences, or perceived changes in the market space, did play a role. Competitors like Microsoft save information for a shorter period of time, and they made that quite explicit. That put Google in the awkward position of kind of having to follow suit, and so forth. So that is one. Your first point is very well taken. What was interesting to me when I did the research was that Google is wonderful because they have such a huge amount of data, and so they worry a great deal about the capacity to continue to keep that data alive. It turns out that the way they have set up their system is highly efficient and relatively reliable and future-proof. They have a whole setup using off-the-shelf hard disks, for example. A lot of server farms use very high-quality hard drives because they shouldn't fail. Google is using off-the-shelf drives, and they found that off-the-shelf hard drives have a very, very high reliability rate. Much, much higher than we would think they do. Not as high as the manufacturers suggest, but very high indeed. And so Google has come to the conclusion that they have not solved but significantly reduced the problem of keeping data alive. That was a much bigger problem in the past. I still have 8-inch and 5.25-inch and 3.5-inch floppy disks lying around somewhere that I will never be able to look at or access again, because I don't have a floppy disk drive anymore. I should throw them away.
Packrat that I am, I don't. Which actually reminds me of one thing: this session is recorded, speaking of informational privacy. So this is obviously really, really interesting, so thank you. If we were to take the word remembering and divide it, and actually save the word remembering for the mental human process that we're all familiar with, and that some of us have more and more difficulty with over time. That's remembering, and we don't call what we do with digital stuff... Don't call it memory, don't call it remembering. That's sort of an extended use of it. It's storage. Whatever. It's what we do with writing. That's not memory. That's something else. So if we do that, then it looks a little different to me than the way you've been painting it. And the question I want to ask is just about the remembering that we do. How has that changed? Is it changing in response to this new environment that has so many more records that are so much more retrievable globally, as you said? What actually has been the effect on the real and literal human memory, the remembering and the forgetting? Well, I mean, we've had a little bit of that discussion earlier today. And that gives me an opportunity to try another answer. I use the term digital remembering quite deliberately. And I use it because, as I see the challenge, the challenge is not only that we have hugely expanded digital storage capacity. That alone wouldn't satisfy me, if I think of my analog or digital video archive on tapes. If I can't access that quickly and easily and from anywhere in the world, it's pretty useless. It's sitting there, sort of gathering dust. So digital remembering to me is a combination of the binary code, the increased digital storage, the ease of retrieval, and the ubiquity of access to a lot of that information out there. And that's what I call or refer to as digital remembering. Now, how has digital remembering changed human remembering? I don't know. There are a couple of ideas about that.
My argument is not that it has changed human remembering per se. What it has changed is that it has increased the external stimuli that trigger human remembering, or that cloud our ability to decide because some remnants of human memory are being brought back to our consciousness. One of the ways that a culture forgets is that it lets books go out of print, lets them be forgotten on library shelves, and then moves them off of the stacks into the depository, where nobody will find them again unless they're really, really looking for them. Now we have the Google book search project, which aims to reverse this by basically taking everything that's ever been printed and is still around and making it available again. Leaving aside all the copyright issues, do you see some negative effects of this project in the area that you're talking about? I haven't given it enough thought. The idea of expiry dates, to try to limit the amount of digital information that we accumulate and have accessible, would probably have little impact on the Google book project, because authors would likely set their expiry dates very far into the future. But it may have some consequences. I can only hint at something, because our research is not completed. I can only hint at research that we're doing on courts and court decisions in the U.S., where we see an interesting effect of memory. The author of a book is not necessarily the only party concerned with the information in it, because if it's a non-fiction book, it's written about real people, and maybe about events that those people who are written about would rather have forgotten in the culture. That's very true, and of course we have to keep in mind that the author's rights do not necessarily stay with the author all the time; publishers might have some say in that as well. Thank you. This is a bit of a follow-on to that question of sort of cultural memory.
Because it seems to me that culturally significant memory is not only in books; a lot of this stuff that's digitally remembered now has a cultural value as well, in the long term, beyond the span of a lifetime. A lot of the sorts of things that we're recommending we put expiration dates on we would do well to remember, even if doing so is contrary to the interests of the people of the time. Why? Because they yield enormous troves of potential meaning and significance for people in the future. That's just conjecture. Sorry. No proof for that. That's just conjecture. Well, the conjecture is the question of what that meaning will be. That we don't know. We don't know what that means right now. That we would willfully erase it seems to me somewhat dangerous. So should I burn down the house because I'm afraid that my heater might not work 10 years from now? No, I'm saying just the opposite. I'm not saying that at all. I'm saying, aren't there... I mean, what about the question of countless authors, and being able to see what they left behind? Thank you very much. That's a fantastic example, a fantastic example. So suppose you're an author. You wrote love letters, or a terrible short story, whatever. Shouldn't you be able to just throw that into the trash bin? Does society have a right to take that away from you and say, we're expropriating that, because, you know, we have all rights over what you do? You have no control over it anymore. Quite frankly, I would not want to live in such a society. I would want to live in a society in which I can write something down and throw it away in the dust bin, and it actually is forgotten. And if Shakespeare had wanted to forget, and wanted his plays never to be played again, he would never have put them on stage. He would have perhaps written them down and then burned them up. If famous authors wanted their works to be forgotten, some of them perhaps because they didn't like them, shouldn't we permit that to happen?
Should we always paternalistically step in and say, you shouldn't be able to do that because it has cultural value? I am very hesitant to go down that path.

Yeah, I think that's a great point, although we still do struggle with this question. I mean, J.D. Salinger at present is in the midst of litigation about this kind of question. I think that's, and I'm going to give the mic up, but I think that's a really important point. Nonetheless, I would recommend that we give thought to how these things might sensitively be balanced, so that what is lost is the stuff that will have no impact on the legacies that we leave. I think of the Iran election Twitter stream, which is very difficult to mine even now; enormous amounts of fascinating material have been lost in that kind of stream. How do we balance this?

But it seems to me that there actually is almost a solution built into this. The solution is that for certain types of information, for certain contexts, we may be required to mandate a very long expiry date, or an eternal expiry date, and we do that. We do that all the time. We have a whole lot of mandates out there to keep information and to keep it on the record. We should not diminish that; that's not what I'm arguing for. What I'm arguing for is to keep this the exception to the rule, and to have forgetting as the default. I want us to preserve as much as we individually, as well as we as a society, believe is important to preserve, and neither should trump the other.

I accept many parts of your argument. I'm a historian, trained as a medievalist. In other words, my main field of inquiry was originally a field where data scarcity is so fundamental a part of the ecology that anything that was preserved is a kind of miracle, which you pay attention to no matter how trivial it might be.
So maybe I bring a slightly different perspective to these questions, but I think the point that was just discussed really has to do with the question of who decides. I think it's a central piece of your argument as I understood it: who gets to make the decisions about inclusion, preservation, and the like. Now, throughout history, the kinds of institutions of memory, whether they're state bureaucracies or whether it's Google on the contemporary scene, have essentially performed the filtering role. Google is not interested in all of your data. Google is interested in certain data: your choice behaviors, your patterns, your navigational patterns in particular, right? Other institutions are interested in your social security number, your tax records, and so forth. So I think one of the core issues here is who are the gatekeepers of memory versus erasure. And I want to just go back to this earlier argument, to approaches like, you know, Brewster Kahle's approach, which is really to throw everything into his Internet Archive, on the argument that precisely the kinds of categories of materials which a traditional archive, for instance, would not preserve could actually prove to be the most interesting materials for reconstructing features of the daily life of the year 1921, or the way driver education was taught in 1967 in American high schools, right?

My core argument, to restate it one more time, is to give the choice back to the individuals. I feel that the decision power has largely been shifted to very large organizations and institutions like Google, and to preferences and defaults that are implicit in the application software that we use.

So I guess I've got a couple of things.
The first is that I too am a bit of a believer in the cognitive adjustment perspective, just because I see myself, and, you know, many, many people who use these sorts of media, already doing that. Of course there are these problems when you're looking back through your email and you find something that you wish you had forgotten about, but I think that people do learn to take that into account, especially as your email archive goes back further and further. I remember a sense of intense loss when I actually deleted everything before 2004, and that's gone forever, but, you know, I've still got everything back to 2004. Anyway, I adjust.

Do you want me to respond to that turn by turn? Otherwise I might not remember each of the questions. Okay. So, I have to confess, I too lost all of my emails at the end of 1998, and I was horrified, completely horrified. And then I discovered it didn't really matter; I was just perfectly fine. In fact, it was perhaps an early trial run of Larry Lessig's email insolvency, or bankruptcy, that he declared a couple of years ago. I actually was relatively close to the cognitive adjustment argument when I got into this project, but reading the literature of cognitive psychologists on what we know about how the mind works, particularly folks like Daniel Schacter here at Harvard, really made me switch my mind and think somewhat differently about that. He has a wonderful book out that I want to plug (you know, it's my book talk, but I'm plugging another book) called The Seven Sins of Memory, and it's a very good explanation of how we reconstruct our memory all the time. That told me that rational, conscious readjustment is probably hard to actually achieve.

I think there's an essential difference between my power as an individual in deciding what I want to keep or remember about myself and my life, and what others may keep or record about me. And there I would actually argue that European data protection law has deletion as its default. I mean, the data holders, whether public or private, are required to delete data after a certain amount of time. And okay, you could argue that many of the internet service providers, Google, etc., are not European-based, but even with regard to those entities there are a lot of discussions going on between the European Commission and Google, etc., about how to actually enforce that rule when operating under European privacy law.

Okay, that's a very good point, and that gives me a chance to go a little more into detail on this one. We need to differentiate between information privacy rights, that is, individual rights, and general privacy norms, processing norms. What you're referring to is in particular the purpose limitation principle, a core principle of the European directive and of most national data protection laws in Europe, which is supposed to limit the amount of information collected and processed. It's an information ecology rule, one of the many information ecology rules that we find in European data protection laws. There the enforcement is usually through the bureaucratic state, that is, through some administrative procedure or so forth. That enforcement is great on the books, highly efficient on the books, but the reality is slightly different; I think everybody in the field would wish for more enforcement on this one. But that is an important element in the whole picture of how to deal with the time dimension; that's where information ecology norms come in. Information privacy rights, however, have, as I said, mostly failed, not just in the United States but in Europe as well.

What are the technical implementations of expiry dates? What will they look like? Have you looked at the semantic web at all? Do the privacy functions contained in there confront these issues?

So basically expiration dates are like metadata, or meta-information, and the modern file systems that we use are capable of dealing with a
lot of meta-information. So this is not something very new or very different; we just have another type of meta-information associated with a file, and most extensible file systems can deal with that pretty nicely. What is interesting to me is that I see more and more companies offering something like expiry dates in their applications, and I am kind of surprised to see that. Ask.com, about a year and a half ago, came out with a button to delete search query history. There is a company called Drop.io that offers a document and image sharing capability with expiration dates: you can share photos with others for a limited period of time, and then they are automatically taken down. And there are attempts by the University of Washington (you might have heard of the Vanish project) to basically make emails and other documents disappear over time, which is another kind of application of expiry dates.

I also suggest that while expiry dates are really cool, they are very binary: today you remember, tomorrow it's gone. And that gives me great trouble. I would wish for a system that would be closer to human forgetting: rusting, perhaps, or something like that. Urs was also very pointedly asking me earlier today whether we couldn't come up with a system whereby we can adjust the expiry date at a later stage more easily, so that the system has somewhat of a sense of our changing preferences and helps us change the expiry dates as we go along, because we might not know, at the time we save something, the timeline that we want to attach to it. All of this is incredibly helpful, and I hope that technological innovation will push us in that direction. Expiry dates are just a way by which I want to capture the idea that we should give forgetting a chance and revive it. Thank you very much.
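The two ideas at the end, expiry dates as ordinary metadata, and gradual "rusting" with adjustable deadlines rather than a binary remember/forget switch, can be sketched in a few lines. This is purely illustrative: the class and method names below are hypothetical, taken from neither Vanish, Drop.io, nor any real file system; it only shows how a stored expiry date could make a record fade gradually and be extended later as preferences change.

```python
import time

class ExpiringRecord:
    """A piece of data carrying an expiry date as metadata (illustrative only)."""

    def __init__(self, content, lifetime_seconds):
        self.content = content
        self.created = time.time()
        self.expires = self.created + lifetime_seconds

    def extend(self, extra_seconds):
        """Adjust the expiry date after the fact, as preferences change."""
        self.expires += extra_seconds

    def legibility(self, now=None):
        """1.0 when fresh, decaying linearly to 0.0 at expiry ("rusting"),
        rather than an all-or-nothing switch."""
        now = time.time() if now is None else now
        remaining = self.expires - now
        total = self.expires - self.created
        return max(0.0, min(1.0, remaining / total))

    def read(self, now=None):
        """Return the content only while some legibility remains."""
        return self.content if self.legibility(now) > 0.0 else None
```

A record created with a 100-second lifetime is fully legible at first, half "rusted" at 50 seconds, and unreadable past expiry, unless `extend` pushes the deadline out later, which is exactly the kind of revisable forgetting the speaker wishes for.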