So good afternoon. My name is Jonathan Zittrain. I'm one of the founders of the Berkman Center. And we are so pleased to welcome today author James Gleick, Harvard College '76. So welcome home. Great to see you. And somebody who was the first editor of the Best American Science Writing collection, something that all of us in one way or another benefited from, has worked over 10 years at the New York Times, and has written books that have a kind of keystone, landmark feel to them by title, delivered on by the rest of the book, which include, of course, Chaos: Making a New Science; Genius: The Life and Science of Richard Feynman; Isaac Newton; and The Information: A History, a Theory, a Flood, which does make you wonder, if this were like a Fibonacci sequence, what would the next title of the book be? He's playing that close to the vest, but maybe it'll be Everything. In the meantime, his own New York Times called The Information, quote, sexily theoretical, which is a wonderful pair of things to be called. And he's near and dear to the Berkman Center's heart institutionally, because it is James who coined the notion of the World Wide Web as the patent that never was. And what an amazing bumper-sticker-sized observation that contains within it so much information about a choice made on something eminently patentable that wasn't, and an association of that choice with its later success. So without any further introduction, please join me in welcoming James Gleick. Thank you so much for that introduction. I should have asked you beforehand. I should also say that we are being webcast. So anything that you say can, and will, be used against you in your later confirmation hearings. And that includes my mispronunciation of James's name. I presume it's the last name, but we're about to find out. I'm going to say Gleick. Sorry we didn't discuss it earlier. Thank you, Jonathan. Except for that, it was very nice. And the Fibonacci sequence idea is nice, except The Information is the end.
It's absolutely the end. Now we go back or down or something like that. I'm very happy to be here. I just stumbled on a thing from 1997, an essay about the internet that appeared in the Times of London. So this is 15 years ago. And Sir Simon Jenkins wrote: the internet is one more electronic craze that market forces will sooner or later put in its proper context. For the time being, its fanatical proponents need the sympathy and tolerance once extended to Esperantists and radio hams. All this will soon shake down. The internet will strut an hour upon the stage and then take its place in the ranks of lesser media. Well, Sir Simon Jenkins, if it weren't for the internet, I wouldn't have even been able to look this up. My purpose is not to mock the prediction, because in a way, this is exactly how I feel about Facebook. And there's $90 billion, apparently, that says I'm wrong. I guess I would like to make a simple point: the internet is a word, cyberspace is a word, the World Wide Web is a word. As you all know, those are three different things if we are precise. And sometimes none of them is exactly the right word for what it is we want to talk about. I want to say that we're living simultaneously in two worlds, the world of atoms and the world of bits. The world of atoms is substantial, solid, slow. The world of bits is weightless and frictionless, its basic tempo being light speed. And my book is meant to tell the story of this world of bits. In the world of atoms, we go outside to play and we feel the warm sun on our face and we smell bread baking in the oven. Right now I'm smelling burritos. Sometimes we call this the real world, as opposed to the virtual world, where we jack in, where time and space are annihilated, where the words friend and like have been hijacked by Facebook. But real world is a silly kind of shorthand, obviously. Both worlds are real.
And by the way, the annihilation of time and space is a cliché that sounds modern but actually dates back 150 years to the rise of the electric telegraph. People who think that bit world has only just now popped into existence have been misled. What we are learning very quickly is that the world of atoms and the world of bits play by different rules. I take it that the point of the Berkman Center is that traditions of law and commerce don't always apply in cyberspace. It's also true that the laws of physics don't apply. Or, to be more exact, they don't suffice. The thing we call information has its own natural law, its own science, which needed to be discovered and formulated. In physics, a turning point was 1687, the year Isaac Newton published the Principia Mathematica. An equivalent turning point came in 1948, when Claude Shannon, working at Bell Labs, published a 79-page monograph called A Mathematical Theory of Communication. It was a little more obscure. It appeared first in the Bell System Technical Journal. But it was a fulcrum around which the world began to turn. Among other things, Shannon proposed a new unit of measure, which he called the bit. A unit, he said, for measuring information, as though there were such a thing. Information was a word being pulled into service to represent the stuff that electrical engineers were transmitting through copper wires. And lately, of course, not just through wires, but through the ether, thanks to radio and, any day now, television. Of course, we know that information was everywhere, buzzing in the landscape of the early 20th century. Messages and letters, sounds and images, a hodgepodge of related species. But there was no one word for all that stuff, until Shannon declared that, for example, one bit of information could be stored by any device with two stable positions, such as a relay or a flip-flop circuit. And that n such devices can store n bits, since the total number of possible states is 2 to the n.
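The arithmetic in Shannon's claim is easy to check. Here is a minimal sketch in Python (the function name and sample values are my illustration, not Shannon's): one two-state device stores one bit, and n of them distinguish 2 to the n states, which is n bits.

```python
import math

def bits_for_states(num_states):
    """Bits needed to distinguish num_states equally likely states."""
    return math.log2(num_states)

# n devices with two stable positions (relays, flip-flop circuits)
# yield 2 ** n possible states, which is exactly n bits of storage.
for n in (1, 8, 64):
    assert bits_for_states(2 ** n) == n
```

For powers of two the logarithm is exact, so the assertions hold without any rounding.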
And if we all reached into our pockets, how many total devices with two stable positions would we be able to pull out? A lot. To put it another way, before 1948, information was a small and unimportant word, something like news and gossip. It was nothing for scientists to concern themselves with. After that year, it was something new. Mathematicized, reified, measured in bits. In its very first form, Shannon's framework covered coding, error correction, channel capacities, noise. It unified two distinct realms of engineering, analog communication, where messages went via continuous waveforms, as in the telephone system of the day, and digital communication, where messages were strings of discrete symbols. Shannon's theory linked information to uncertainty, to entropy, and to chaos. It made a foundation for our world of information storage and retrieval and information processing. In 1967, Marshall McLuhan remarked, man the food gatherer reappears incongruously as information gatherer. He wrote this an instant too soon. McLuhan considered us citizens of the electric age. He said, we are today as far into the electric age as the Elizabethans had advanced into the typographical and mechanical age, and we are experiencing the same confusions and indecisions which they felt when living simultaneously in two contrasted forms of society and experience. Of course, his electric age had no email, no web surfing, not even cell phones, much less Facebook and Twitter. McLuhan was mainly watching television. We don't call it the electric age anymore. McLuhan's two contrasted forms of society and experience are the world of atoms and the world of bits. I don't want to suggest though that these are somehow separate parallel universes. As far as I'm concerned, a printed book is part of bit world. We should understand, I think we do understand, that they are embedded in each other. There's just one reality after all. Information is its vital principle, and it always has been. 
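The link Shannon drew between information and uncertainty can be made concrete with his entropy formula, H = -sum of p log2 p, the average number of bits per symbol. A small sketch in Python (the coin examples are my illustration, not from the talk):

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero terms."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin is maximally uncertain: each toss carries one full bit.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so each toss carries less
# information, about 0.47 bits.
print(entropy([0.9, 0.1]))
```

The more predictable the source, the fewer bits it yields, which is exactly why meaning is irrelevant to the engineering problem: only the probabilities matter.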
We're finally beginning to understand. Biology is an information science, a subject of messages, instructions, and code. We appreciate that genes encapsulate information and enable procedures for reading it in and writing it out. Life spreads by networking. It was no wonder that modern genetics bloomed along with information theory. DNA is the quintessential information molecule, an alphabet and a code, six billion bits to form a human being. Plenty of physicists go even further and treat the bit as their fundamental particle. John Archibald Wheeler, in the last years of his life, promulgated a slogan: it from bit. He said, information gives rise to every it, every particle, every field of force, even the spacetime continuum itself. The point is that when photons and electrons and other particles interact, what they are really doing is exchanging bits, transmitting quantum states, processing information. The laws of physics are the algorithms. Every burning star, every silent nebula, every particle leaving its ghostly trace in a cloud chamber is an information processor. Wheeler says, what we call reality arises in the last analysis from the posing of yes-no questions. So one thing we know now about information is that we don't own it. It's not ours. It's not about us. It surrounds us and it precedes us. We are adjusting to a shift in perspective that resembles the demise of the geocentric universe. It should be humbling. The biologist Jacques Monod proposed 40 years ago that we think about an abstract kingdom that rises above the biosphere. Its denizens are ideas. He wrote, ideas have retained some of the properties of organisms. Like them, they tend to perpetuate their structure and to breed. They too can fuse, recombine, segregate their content. Indeed, they too can evolve, and in this evolution selection must surely play an important role.
As you know, he was talking about memes, but this didn't become clear until a few years later, when Richard Dawkins invented the word. The meme is the gene's cultural analog. It is a replicator and a propagator, an idea, a fashion, a chain letter, a conspiracy theory. On a bad day, a meme is a virus. Memes emerge in brains and travel outward, establishing beachheads on paper and celluloid and silicon and anywhere else information can go. They are not to be thought of as elementary particles but as organisms. Dawkins would say we are their vehicles and their enablers. For most of our biological history, their main mode of transmission was the one called word of mouth. Lately, however, they've managed to adhere in solid substance: clay tablets, cave walls, paper sheets. They achieve longevity. Now they propagate via broadcast towers and digital networks. They are self-replicating patterns of information. They live in Monod's abstract kingdom, which we may call the infosphere. Most of the biosphere can't see the infosphere, humming with its ghostly inhabitants, but they aren't ghosts to us, not anymore. We humans live in both worlds at once. It's as though, having long coexisted with the unseen, we've begun to develop the needed extrasensory perception. We're aware of the many species of information. We name them sardonically, as though to reassure ourselves that we understand: urban myths and zombie lies. We keep them alive in air-conditioned server farms, but we cannot own them. When a jingle lingers in our ears, or a fad turns fashion upside down, or a hoax dominates the global chatter for months and vanishes as swiftly as it came, we may well ask: who's in charge? Then, as the role of information grows, it grows to be too much. When I started working on the book, TMI was short for Three Mile Island or Texas Military Institute, which by the way I know because I looked it up in Wikipedia. Now TMI means something else.
We have information fatigue and information anxiety. We've met the devil of information overload and his underlings: the computer virus, the busy signal, the dead link, and the PowerPoint presentation. There is a universally recognized metaphor for our current predicament. The word is flood. We feel a sensation of drowning, of information as a rising, churning deluge. One simple statement of the TMI problem is this: the multitude of books, the shortness of time, and the slipperiness of memory do not allow all things which are written to be equally retained in the mind. That was written 800 years ago by the Dominican friar Vincent of Beauvais. It was his justification for his life's great project, the Speculum Maius, a combination of encyclopedia and history of the world. Facts used to be hard to find. People used to turn to the pages of Whitaker's Almanack, published every year in England, or the World Almanac in the US, to find names and dates and sizes and populations. Lacking the almanac, they might call on a man or woman of experience behind a desk at a public library. When George Bernard Shaw's wife was dying, he decided he needed the whereabouts of the nearest crematorium. And he turned to the Almanack, and the Almanack let him down. He wrote to the editor: I have just found an astonishing omission in Whitaker. As the desired information is just what one goes to your invaluable Almanack for, may I suggest that a list of the 58 crematoria now working in the country, and instructions what to do, would be a very desirable addition. Shaw took it for granted that facts were to be found in print, even though by then he had a telegraph address and a telephone. Many other people were using the telephone to reach out for information. People realized that it was in their power to know instantly the scores of sporting events that they hadn't witnessed.
So many came up with the idea of telephoning the newspaper that the New York Times in 1929 felt compelled to print a front-page notice with the headline: Don't ask by telephone for World Series scores. Now information in real time is considered a birthright. So what do we do when we have everything at last? Daniel Dennett imagined in 1990, just before the internet made this dream possible, that electronic networks could upend the economics of publishing poetry. Instead of slim books, elegant specialty items marketed to connoisseurs, what if poets could publish online? They might reach not hundreds but millions of readers, almost instantly, not for tens of dollars but for fractions of pennies. That same year, Sir Charles Chadwyck-Healey, a publisher, conceived of what he called the English Poetry Full-Text Database. And four years later he had produced it, not online but on four compact discs: 165,000 poems by 1,250 poets, spanning 13 centuries, priced at $51,000. Readers and critics had to figure out what to make of this. You wouldn't read it, surely, the way you'd read a book. Maybe you'd read in it, or maybe you'd search in it. Anthony Lane reviewed the database for the New Yorker, and he described it this way: you hunch like a pianist over the keys, knowing what awaits you, thinking, ah, the untold wealth of human literature, what hidden jewels I shall excavate from the deepest mines of human fancy. But it quickly transpires that the jewels are outnumbered by the clunkers. The flood of bombast, of mediocrity, the sheer unordered mass, can wear you down. Not that Lane sounds at all weary. He says: what a steaming heap. Never have I beheld such a magnificent tribute to the powers of human incompetence, and also, by the same token, to the blessings of human forgetfulness. And he cites, by way of example, a funny self-referential couplet by the utterly forgotten Thomas Freeman, circa 1610: Whoop, whoop, methinks I hear my reader cry.
Here is rhyme doggerel, I confess it, I. By the way, since Lane wrote about him, Thomas Freeman did get an entry in Wikipedia. But the CD-ROMs are obviously obsolete. All English poetry is in the network now, or if not all, some approximation thereof, and if not now, then soon. The past folds into the present. Different media have different event horizons. For the written word, say three millennia. For recorded sound, a century and a half. And within these time frames, the old becomes as accessible as the new. Yellowed newspapers come back to life. Record companies rummage through their attics to release every scrap of music: rarities, B-sides, bootlegs. For a certain time, collectors or scholars or fans possessed their books and their records. There was a line between what they owned and what they did not. For some, the music they owned, or the books, or the videos, became part of who they were. But that line is fading away. Most of Sophocles' plays are lost, but those that survive are available at the touch of a button. Most of Bach's music was unknown to Beethoven, but we have it all: partitas, cantatas, and ringtones. It comes to us instantly, or at light speed. It's a symptom of omniscience. It's a mixed blessing, the embarrassment of riches. The information theorists gave us a version of information from which meaning had been excised. They announced that meaning was irrelevant to the engineering problem. For some people, especially social scientists, this mathematical incarnation of information represented a kind of reductionism run wild. So we are constantly reminding ourselves that information is not knowledge. We can pull a device from our pocket and tap into the global collective memory, and somehow we don't necessarily feel any smarter. Fifty years ago, in 1962, the president of the American Historical Association, Carl Bridenbaugh, delivered a polemical address warning that human existence was undergoing what he called a great mutation.
So sudden and so radical that we are now suffering something like historical amnesia. He lamented the decline of reading, the distancing from nature, which he blamed in part on ugly yellow Kodak boxes and the transistor radio everywhere, and the loss of shared culture. Most of all, for the preservers and recorders of the past, he worried about the new tools and techniques available to scholars. He said: that bitch-goddess, quantification; the data-processing machines; and also those frightening projected scanning devices which, we are told, will read documents and books for us. And he added, notwithstanding the incessant chatter about communication that we hear daily, it has not improved. Actually, it has become more difficult. Which I think means that he would not approve of Twitter. I think he was wrong, and I think he would still be wrong. The TMI feeling, the sensation of noise without meaning, is certainly real. We know how hard it remains, Google and Wikipedia notwithstanding, to find needles in haystacks and to distinguish the true from the false. No wonder some people feel we've arrived in a hellish world devoid of grace, a world of information glut and gluttony, of bent mirrors and counterfeit texts, malicious blogs, banal messaging, incessant chatter. But this isn't the world I see. However we choose to define knowledge, it seems clear enough that we've got more of it, and we are able to spread it more widely than ever before. As intensely as we now experience the present, in our faces all the time, I think history is more alive than ever and more with us than ever, if we care. One of the first people on Twitter was Margaret Atwood, a master of a longer form who you'd think wouldn't condescend to 140-character soundbites. She explained it very simply: let's just say it's communication, and communication is something human beings like to do. We're dealing with a tension between individual knowledge and collective knowledge, between creativity and groupthink.
Wikipedia is collective, crowdsourced knowledge at its absolute worst and also at its absolute best. Everyone here knows two things about Wikipedia: that it's utterly unreliable and that it's exceedingly useful. When it's wrong, it may be very wrong. When it's not wrong, it may be insipid. And yet the breadth and the detail are astounding and absolutely unprecedented. The same applies to everything Google does, and it applies in a different way to the apparently formless mass of bloggers and Twitterers. Groupthink gains sway, and yet individuality pokes through again and again. We can accept the mathematical point that information does not imply meaning or knowledge, much less wisdom. And meanwhile, we can find meaning where we can. We're engaging in a project of organizing knowledge, sorting it, filtering it, reviewing it. We need to remind ourselves that this project has been underway for many centuries, and it's never gonna end. It is subjective and imperfect and unstable. Thank you. In a pleasing kind of recursion, I couldn't help but look up the Wikipedia entry for Thomas Freeman. And at the end, of course, you are mentioned. Oh no. Seriously? Oh yes. Freeman was mentioned by Anthony Lane in The New Yorker with the four CD-ROMs, et cetera, et cetera. This incident was described by James Gleick as an example of how unprepared people were for the World Wide Web to bring all of human literature to the tips of their fingers. Gleick mistakenly states that Freeman is not mentioned in Wikipedia, although it's possible that this very page was added as a response to Gleick's anecdote. Fabulous. Clearly, we should edit that page now to say, in fact, Gleick now has changed the anecdote to account for the existence of the page, at which point the page will wink out of existence. So, yes. That blows my mind. So, let's open it up to questions and reactions.
I've got plenty, so I can go first, if no one wants to be first, or I can just call on people, which I've been known to do. Yes, sir, and Dan, am I right? We gotta get mics over to people, and feel free to tell us and the world who you are. Of course, that's a good question, and I don't know if I should have said nobody owns information, because that makes for a nice sound bite. And it felt true when I said it, speaking of information as something that exists apart from us humans. But now, on a more immediate level, I'm, to some extent, a great believer, a passionate believer in copyright. I live by copyright. I'm an author. I earn my living from writing books. And I'm a supporter of the American version of copyright law, which is based on the principle that creators need an incentive, which can be given to them by offering them a partial sort of ownership of the form of their expression. The form of their expression being certainly a subset of information. Now we could sort of devolve into arguments about all the tough questions that are arising about copyright. And I have opinions about all of them, but I don't know if you're asking about all of them. But it may also be that as an author, you may have to make choices, or the choices you make about how to assert your copyright can affect how your memes do or don't travel. For what it's worth, I found that having a book available on the web for free, with a deep link creatable to each individual paragraph, means that all those moments when somebody asks a question, you might say, well, read my book. It says it much more carefully there than I can possibly say now. You could say, here's the link. Click here and you'll see what I think. Whereas the normal corrals, which feel a little bit 20th century, in order to effectuate traditional copyright, make it really hard to do that. It's kind of all or nothing. You buy the book, you read it, or you don't. Okay, let me say two things about that.
One thing is, there are two different issues here, I think. One is, people often say that authors are motivated not just by a desire to make money but by a desire to have people read their stuff. And if they put their stuff online free, more people will read it. And if libraries are able to lend books free, sales will rise, and so forth. There are many, many versions of this. I don't know to what extent those things are true. That is, I don't think anybody at this point has any idea whether, for me, making my e-book available free for download from every library in the world is gonna make me richer or poorer. I'm really, I'm sort of agnostic about that. What I believe, though, is that at least for a while it ought to be up to the author to decide. I mean, I do think that I ought to have some temporary ownership rights. And if I want to be a Luddite about the dissemination of my book, I sort of feel that there's a moral principle there, even apart from the practical principle of encouraging me to write books. Without expressing that at the moment, temporary in American law is 95 years. Okay, that's right. I mean, we can argue endlessly about 95 years. I'm not a fan of 95 years, but don't tell anybody I said that. Just don't tell my fellow authors I said that. But you raised a second point, Jonathan, that I wanna respond to, which is this possibly lovely idea of a hyperlink to every paragraph of my book for easier access or something. And I have to say I'm kind of horrified by that idea. More than a little bit horrified. That would be great for a cookbook. It might be great for the online database of poetry, if each link is a poem. But I wanna emphasize that, for me, the book is a unit. This is like the musician who says the long-playing record is the work; stop with this whole shuffle-play thing. Well, and do you want excerpts of Beethoven's 9th? You know, 20-second excerpts, the best of? It's Twitter for Beethoven. Well, yes and no.
I mean, I would say even Twitter, Twitter, I'll defend Twitter and say that Twitter is meant to be 140 characters. That is the work. That's the work. And it's a restricted form, but there are people who are really becoming masters of it. So if you could have your way, your book would be better as a scroll rather than a codex. No, codex is fine. Because that would compel people to read it straight through. Codex is fine. I'm not trying to compel people, but I'm trying to tell people that in my opinion they will get a better experience if they read it from beginning to end. Somebody actually asked me when I started going around talking about the book, said, is your book intended to be read here and there, or is one supposed to read it from start to finish? And well, there are books, I suppose, that you can dip in and out of, but I really would like to say that a reason I believe in the future of the book as a form is that some authors use the length to create a structure that is not apparent unless you read the whole book. There are things in my book that echo other things and that will not make sense otherwise. I mean, I hope I'm not sounding pretentious here or... You're not sounding inappropriately pretentious. Once you... Once you... I hope I'm not sounding inappropriately snarky. And I'm also not trying to assert control over the reader. I mean, once you buy the book, do what you want with it. Cut it up into little pieces. Theodor Holm Nelson, who is often credited as the inventor of hypertext, although his proposed hypertext system did not end up the precursor to the World Wide Web we have today, because his system has two-way links and the web has one-way links. He recently wrote a book for which the chapters are in an odd order. He then gives you a separate map that says you should start with chapter two, then read chapter five twice, then go back to chapter one. He has a suggestion, but that's his choice to say... That's right.
And the man who invented hyperlinks might well be... I'm sure that new forms of art are gonna emerge that make use of hyperlinks. A lot of you may remember, I'm sure you remember, Jonathan, there was a big flurry of literary interest in the form of hypertext about 10 years ago. Who are some of the writers? Some very good fiction writers got involved with it. And I think it's fair to say, it's premature to say, but in my opinion it's fair to say that that didn't amount to much. Sort of the Choose Your Own Adventure books were the harbinger of that, a form of hypertext in reading a book. Maybe the technology wasn't ready for people to come up with good works of literature in the form of hypertext. Now, I also wanna say, I really think this is an important point. It came up, it comes up, I should say, in connection with programs like Google's Book Search program. Because if you are going to authorize somebody, not Google apparently, but maybe some kind of digital library, to put books online, do you wanna have hyperlinks in them? The way there are hyperlinks in most online newspapers now: you're reading along in the newspaper and suddenly some phrase is blue. And what is that phrase, and why is it blue? I don't want hyperlinks in my books added after the fact, except for the strictly limited case of links to endnotes. Endnotes, sir, that's an anachronistic form. Endnotes, that's something that e-books can do better than physical books. But I don't want, when I make a reference to Claude Shannon and information theory, a link, certainly not to the Wikipedia article about it, but not even to Claude Shannon's book about it. No, I want, again, maybe it's just precious of me, but I would like people to have the experience of reading the thing in order.
But this is very interesting, because part of the point of your book is that information is something that can repose in a medium, like within a book there's information, but it's obviously also a relationship between a brain, or a mind, and that book. And the act of a mind reading something... I think of people encountering Shakespeare in high school, the number of footnotes added after the fact to explain to them that this actually means that, and when it looks like this, we really mean something else, because the language has shifted, the culture has shifted. So you could see, your first objection was to incoming hyperlinks being able to unduly segment the work. This is a very different objection. This is outgoing hyperlinks that let the mind encountering the book choose how much to take a detour and learn more about something. And I just remember, I read The Rise and Fall of the Third Reich not long ago, and it had a kind of throwaway reference to the Night of the Long Knives, which, not having been reading the newspapers during the 40s, or the 30s, I didn't know what the Night of the Long Knives was. And being able to actually look it up, and I think it was Wikipedia that filled me in a little bit. And I'm like, oh, wow, who knew? That's fine, we have to keep, and of course it's very easy to do that now on your, I don't want to name a brand of e-book. And I do that all the time. I mean, I'm not gonna deny that I do it, but we should keep in mind that somebody, whoever it is, if the hyperlinks are embedded in the text, somebody is making choices about where they're going. There's a curator who's doing it, either the author or someone who follows. So you're by default going to Night of the Long Knives on Wikipedia or wherever. Probably on Google, and then it sends me to Wikipedia. That's right, which, as I said, is a mixed blessing, because you're settling for the collective wisdom about the Night of the Long Knives.
But even worse would be if somebody decided that the word knives needed a reference to the history of cutting materials. There are choices made. Now, again, we can imagine a world where authors take the trouble to do that, helpfully. But there is that dilemma as an author where you're figuring out, who is my audience? And for academics, it can be very easy. The audience is the five other people who are going to read this book. And you can just write them a letter, really, in the form of a book. But for somebody writing for a mass audience, there must be trade-offs all the time about how long do I go into this and lose the people that know it, versus... Yeah. Yeah, yeah. I'm just curious, a quick poll. How many people, when watching television, its own form of text, find themselves in the midst of enjoying the program looking something up online to explicate it? How many people? And how many people are slightly guilty about doing it when they do? Oh, it's just me. I'm the only one who feels guilty. Because I don't feel guilty about it. Really? I don't think anybody has to feel guilty about anything. That's how laid-back I am. You know, in anything like this, we consume information the way we learn to consume it, the way we choose to consume it. And I think we all waste time browsing through stuff that's trivial. We waste time maybe playing games. I know I do. And if we start to feel bad about that, then we know, or we should be able to learn, how to adapt and find balance again. And I think that means sitting down with long books or listening to long pieces of music in a quiet room without a magazine in front of you. But not because, not out of a feeling of guilt, I don't think. I try not, in any of my writing, to tell people what they ought to do. And that refines your too-much-information point. Because too much information can mean there's just so much out there, I feel paralyzed.
But to the extent that it's pull, you can go out and graze on it whenever you want. It's great. If you have a question on the RFK assassination, anyone in this room can bring up the actual video of the ballroom in the California Hotel within five seconds and watch it. It's amazing and doesn't feel overwhelming. Whereas it's the push part, your agenda is getting set by this number of Facebook invitations to which you have to respond, this set of reading lists, all that kind of stuff. Which even sounded like the 17th century fellow you mentioned who was complaining there were too many books. Couldn't read them all. Leibniz. Yes. Google him. Yes. And tell us who you are. Thanks, yeah. I'm Steve Chapman. I work in the law library, and a number of us in the room work in digital libraries. And I love hearing this conversation because it helps us in our traditional role as disseminators of information, or any kind of produced works. We librarians are struggling with discriminating between presentations and versions that would display the author's intent, as perhaps differentiated from a reader's experience. And so we're going to have a conference at Schlesinger soon about marginalia, and championing that. And so from the author's perspective, what I would love to have is some way that we would be encoding information in the enterprise of moving things between the author and reader. And we have relationships of author to reader, of reader to reader, of reader to author. And we can make these distinctions more explicit than we've been able to in the past. But it's something that we wrestle with. I mean, on a political level, do we privilege the reader over the author, the author over the reader? And what you're representing so well is what I'd expect you to represent, which is the author's perspective in this interchange. Okay, but let me add this.
And maybe I won't be able to convince you, but I would like to say that as a reader, I want you first and foremost to privilege, as you say, the author's intent. Whatever it is that I want to read, I want to at least have the opportunity of reading it the way it was written, without the distractions of marginalia. Now, marginalia are fine. I don't object to them as an extra vehicle. And if you're in the mood to see what everybody else has written in the margins of some book, or which passages they have highlighted, that's fine. But I have to say that, as a reader, I don't want that experience to be primary. When I pick up a novel, I don't want to see what other people have highlighted. I want to read it absolutely fresh. There was a question here. Hi, my name is Judith Donath. I'm a fellow at the Berkman Center. I think it's also important, and I think we live in a good time for doing this, to separate out the physical form that ideas are being presented in from what is sort of the natural form of the ideas themselves. I think very few people would argue if you had a best-selling fiction writer here who said, of course, my book gets read through. It's a spellbinder and this is how it's supposed to be. When I sat down with your book, I tend to read books in chunks and all over the place, and it was daunting, and I thought, well, I'll dip in here. I did read it through from cover to cover, but I think that has very much to do with your style of writing, which, I think, owes a lot more almost to fictional narrative than to the way a lot of nonfiction books are written today. It was far easier to read straight through than I had expected upon looking at it and its daunting size. And so I think a lot of this argument is really about the emergence of multiple forms of long-form text, because I think there are many books that really, really would be enhanced with this idea of hypertext, of being able to be read that way.
Have you read The Dictionary of the Khazars? No. It's the one fiction book that really was designed to be read with that type of skipping-around narrative, where it's written as what looks like it could be a series of encyclopedia entries. There's no single entry point into it; you have to keep looking things up, and it eventually turns into a novel as you read it, but most fiction books aren't like that. And so I think the importance is for the author to know what kind of book they are writing, because I think there are many authors who've struggled a lot with the linearity of books and who are going to be greatly liberated by the ability to write books in hypertext, to say, here's this idea and it branches in all these different places and I can let you move around in it. That's one piece of it, but it's a separate issue, I think, from that fear of your audience escaping into this infinite world of Google and stuff and maybe coming back a week later, which is what happens when you have links that go outside. But again, that's an interface issue, because what you really want is an e-book that pops up those little pieces and doesn't let you escape. So it's not so much in the writing form, it's in the form of the medium, and I think what we're seeing in the next few years is how these multiple different ways of writing and dealing with hypertext, and of what the interface for long form should be, will emerge. But we're just at the beginning of that. I think that's true. I think that was very nicely put, and thank you. And it's true that people use the form in different ways, and some people write books with which there is no particular advantage to reading them through. Sometimes that's because they just aren't very good, but other times it's intentional, and it's fine. This is maybe a digression, but one of my favorite books of recent years is by Clive James. Somebody help me come up with the title.
It's a collection of biographical sketches of 20th century figures. Okay, Clive James's last book. You hear clacking. Yes, so do I. It's two words. Yes, thank you. Cultural Amnesia. All right, the form of the book is sketches of 20th century figures from politics and the arts, maybe five or six pages each. The length is very consistent, and they are in alphabetical order. Alphabetical order, it's as if he is announcing that this is a work of reference and you're not meant to read it straight through, because those people, I don't need to explain, it's an arbitrary ordering scheme. And yet I did read it from start to finish, and I was left with the sense that despite the arbitrariness, I mean he couldn't have written it in alphabetical order, it made sense that way. It's not just a work of reference, it's not at all a work of reference, you're not meant to dip into it. On the other hand, I won't pretend that I read all of them, and by the way, I should have said, I think it goes without saying that when you read my book, you're allowed to skip the boring parts. Other questions over here? Hi, I'm Espen Andersen, I'm a visiting scholar at CISR at MIT. I'm currently working on a collection of papers from various people discussing the singularity, which is the notion that computers at some point will be smarter than us. And I'm sort of seeing a path forward towards that, where the information we're relating to is so big, there's so much of it, that the only way we can get at it is through some sort of computer-mediated interface. I have a friend in Ireland who walks around and takes 5,000 pictures every day, and that goes into a computer, and he can sort of look up everything he's had for dinner the last six years. But he can't browse those pictures, he can only get at them through search.
And I'm sort of wondering, how do you feel about that, and think about this increasing flood of information and our mediated interface to it? Where do you see that going? And by the way, will computers eventually sort of do that for us? I've decided that I can't address a question about the singularity without quoting William Gibson, who calls it the geek rapture. I kind of feel, yes, I believe in the singularity, and we're already in it, it's already happening, it's already happened. And I don't especially worry about it. What you say is very true. You're describing, I think, a special case of the general fact that the way we interact with information in the world is dramatically mediated by technology. But it has been for a very long time, it's just more and more so. And the worry that we're dependent on the technology and therefore becoming less human, well, that worry doesn't bother me. Yes, we are becoming cyborgs, let's face it. The fact that many people in this room had to tap away to help my memory lapse on the title of my favorite Clive James book, well, that's okay. Without the mediation of technology, we wouldn't have had an answer. It changes things, it's different. Our relationship to our photographs is very different, and I'm at the age where I'm in the middle. I have a lot of old photographs which, because I can't search them, I can only look at the old-fashioned way, by digging through the piles. And the reality is I have not looked at those photographs since they were taken; they're in envelopes someplace, I don't even know where. And the ones that I have digitally are the ones that are accessible to me, even though they're only accessible through search. That's okay, we'll deal. Well, let me try to push you a little more on that question, and it bears a little bit on our being on law school territory.
Often the question asked in a classroom like this is, well, if all you say is true, what then? What do we need to do or not do, or is there some policy that's called for? I think we're doing it, we are doing it. We are finding our way through a very unstable situation, and we are looking for new ways to balance something that has always been out of balance. I mean, I've come to this Pollyanna-ish sounding view of things from writing this book, where I learned so much about times during the history of the technologies of information when people felt comparable forms of being out of balance. It's what McLuhan was expressing in that quote. It's what, I quote Bertolt Brecht somewhere in my book. Don't look it up right this second; wait till later. At the dawn of the age of radio, Brecht was fascinated by radio. You can see why he would be, working in an art form where his audiences were about this size, suddenly being able to think about broadcasting to the millions. But he was also very troubled by it, and he said things about the imbalance between speaker and listener that apply today to Twitter, in the opposite direction. But the kind of concern you could see with a form of mass communication, the concern that consumed the theorists of the electric world, as you put it, in the 20th century, would be its use for demagoguery, for pushing one idea out and squelching others, because of the power of that rare broadcast tower. I think in the event it turned out a little differently, that we ended up with most broadcast towers, in the American context at least, being anodyne. They were not particularly radical, unless you saw the status quo itself as radical, because it was reinforcing of that. But they tended to be on the one hand, on the other hand, and you know, if you just listen to the cadence of a Dan Rather or a Walter Cronkite when they were on camera, it was all a little bit soporific.
Like, you know, there's some concerning news and some human interest. Oh well, I've had my concern. He'll let me know when I should actually run. And a concern you could see coming in this cybernetic present and future is one in which, partially as part of the quest to offer us, that most wonderful word, relevant information to our searches, and as we see on almost every search engine, there's no single answer anymore to what is the response to a Bing or Google search for X. It is, what is your response, tailored to you and your interests. That's not too far removed from, as we use these computer-mediated lenses to view the world, we get answers tailored to us, as if we are just mimetic processing units. And you could see the number of natural interactions dropping, which is to say interactions in which you have a kind of encounter with a piece of information or a person where they're not trying to get you to buy something that even they don't believe in. That becomes rarer and rarer, and more and more often, every time you turn somewhere for advice, what you're getting is something that may have a goal as banal as selling you Life Savers, or a goal of persuading you of something that may or may not have a truth value. And if that's the case, you could see your world being one in which knowing what is true seems much more difficult, doesn't it? Yes, yes it does. And that's our world. My view is there's only one wrong thing to say in all of these areas, and that is that if we work hard enough, if we get the computer science right, if we discover the correct algorithms, we can find an ideal balance. That is, let's say, a perfect search engine. Right now we know that if Google gives everybody the same response to a search, and doesn't make use of any personal information about us, those responses have to be flawed, because sometimes you're looking for the Night of the Long Knives and sometimes you're actually looking for cutlery. Correct, it's the same problem as writing a book for a broad audience.
Right. And so it might be tempting to think that when Google gets better and better about knowing who we are and what we looked for yesterday, then finally the search results are going to be perfect. They are not going to be perfect, because in the next breath you will ask, what about serendipity? What about the happy accidents? What about the times when even I didn't know what I was looking for? Or, more trivially, what about the times when just yesterday I was looking for auto parts, and that does not mean that for the rest of the week Amazon should be trying to sell me books about auto repair. Yes. In the same way, there is no perfect answer to the question of whether it's best to have a few carefully controlled voices broadcasting to the millions or complete freedom for everybody to just broadcast to everybody else. We need a balancing of the scales. I mean, you're absolutely right that in this country, during the era when the electronic broadcast media were a few TV stations, a few big newspapers for that matter, they tended to be anodyne and not as harmful as George Orwell might have thought. But in countries where the media are state-owned and tightly controlled, you know which countries I'm talking about, and where governments are at this moment trying to exert control over cyberspace, I don't think anodyne is going to be the appropriate word. I mean, I think we're right to worry about that. We get a kind of counterbalance to centralized media by doing things like having Twitter, where there are a billion voices. And of course, as soon as you do that, you have a billion voices, the overwhelming majority of which are rubbish and a large portion of which are even offensive. So in what sort of era do you expect madness to spread, hoaxes to spread, misinformation to spread more efficiently?
This new era where everybody can talk very freely, or the old era where you had the New York Times and you had CBS News, or the older era where you had, I don't know, the authority of the academy. I will answer my own question and say both, either way. It's strange to me that in our time, when people have access to facts, there are so many people who believe things that I know to be absolutely nutty, that the president of the United States was not born in the United States. But part of that is, you're right that you could attribute some of it to just a natural phenomenon of people reading whatever they click on. You see a YouTube video that says vaccines will kill you. You're like, well, it has a lot of views. I do like that. Part of it is. But part of it may be, the next stage is, if I want to write a check and in doing so influence belief. Absolutely. The way to do that before was you run commercials or something, and, talk about an out-of-context statistic, but Larry Lessig just last weekend was pointing out, I guess he just crunched the numbers or looked it up, 80% of super PAC spending in this election so far has come from 189 Americans. So fewer than 200 people have been responsible for tens and tens of millions of dollars of spending, which clearly had an impact. Again, that's mostly ads. You could see, if you had $20 million to spend on getting a belief out there, you would be trying to do it in a much more subtle and individualized way, for which I don't know that the gatekeepers who can make that happen have the kind of rectitude and independent identity of a CBS News that believed in the church-state separation between news editorial and publishing. I don't know how much those gatekeepers even subscribe to that. Well, I think you're making the point that even though we've lived through a period of a couple of decades when, before our eyes, we could see the rise of a very democratic and free-flowing form of communication. Yes.
We must not be blind to the fact that there are as many now, perhaps more now than ever, forces that exert control over the flow of information by virtue of state power or economic power. And when you ask why so many people embrace the, in my view, lunatic view that global warming is either not happening or not man-made, they enjoy it. They enjoy it. That seems weird. And there are two things to be said about that. One is there's this kind of dynamic of finding other people who like your conspiracy theories, which you can see by following some of the nuttiness on Twitter. But it is also the case that there are powerful economic interests who encourage that kind of thing and deliberately spread misinformation in that particular area. Yes. I don't know. I mean, there are other kinds of misinformation that spread where I don't have any personal conspiracy theory about economic forces. I don't know. I thought the whole Y2K episode of 12 years ago was an example of a kind of mass hysteria. Unless it was the very hysteria that saved us. And I don't, well, seriously, there are a lot of people who say it was the hysteria that saved us and that I am wrong. A lot of people believe I am wrong. That there was a very real problem, and it was a good thing that hundreds of millions of dollars were spent on it, and that there was a certain amount of panic, because otherwise. I don't believe that's true. But in any case, I haven't been able to figure out who benefited from the spread of that hysteria. I think it just happened. Yes. Yes, it's funny, too, there were often in that era quotes from the BU Center for Millennial Studies. And I just wonder, what are they doing now? Do they just have like the next 950 years off? What do they do anyway? Yes sir. Let's get a mic over to make sure it's on. Thank you.
So I'm Chris Daly, and I'm happy to say that Jim Gleick and I were classmates in the Harvard undergraduate class of 1976, and I'm delighted to see you again. I now teach at Boston University, not in the Millennial Studies Department, I believe. But I do teach. Tenure takes on a new meaning there. I do teach about the history of communications and journalism and this sort of thing. And Jim, I want to thank you for a very elegant and, I'm sure, deliberately low-tech presentation today. It's a real treat to be in the presence of a thinking person thinking. So thank you for that. Thank you, Chris. But I want to ask you about something that you mentioned in your book. You quote approvingly from John Archibald Wheeler his phrase "it from bit", I think indicating that the physical organization of the atomic-based world is a reflection of information about the relationships between things. But what I wondered is, is this a symmetrical kind of statement? I mean, does it work in reverse? It seems that most of the bits, in other words, the knowledge that exists, is knowledge about something, right? So, you know, we have a whole body of knowledge about geology, but it exists only because there are actual rocks, right? So this is a two-way flow. No geology without rocks. Right, in other words. Whereas the rocks presumably predated the geology. Right, and those relationships predated, right? So is this a two-way flow, is what I'm getting at? The relationship between the atoms and the bits. I think it is. I mean, I think I tried to say that the world of atoms and the world of bits are embedded one in the other. It's true, as you say, it's almost true that I quote Wheeler approvingly. I quote him kind of admiringly and hopefully. That is, he's expressing something that seems a little bit mystical to me. And it's above my. That's what caught my eye. It's above my pay grade, you know.
I'm not a theoretical physicist, and at some point I have to just be a science reporter and report on what other people are saying and try to understand it as well as I can. The role of information theory in modern theoretical physics is incredibly interesting, and I try to cover it to some extent in the book, in a tentative way. Quantum theory, even apart from quantum information theory, is, it seems to me, in a kind of state of misunderstanding. I mean, I have a sense that quantum theorists revel in the obscurity of their subject and the mystery of their subject. They do and they don't, simultaneously. Well, yes, they love making that kind of joke. And it's all the more interesting when you add information into the mix, because then, well, I can't answer your question except to say yes, I think so. But there is a real question in there, which is: is one a phenomenon of the other, or are they intertwined phenomena? I exist, therefore I am a piece of data by my mere existence, but you would also think that data could be independent of that. And then there's, like, the number two, which doesn't point to anything except two. So I don't know what to make of that. This lecture brought to you by the number two and the quantum state "strange". You know, it's interesting, just to continue your point, it's interesting to look in Wikipedia and figure out which numbers have their own Wikipedia entries. You'd be surprised. They can't have one for every number. Well, there are a lot, and actually this becomes a real issue for a modern branch of information theory, algorithmic information theory, where you start to worry about the paradoxes. One paradox you can express is: what is the smallest non-interesting number? The smallest number about which there's really nothing that you'd want to say. You can see what the paradox is right away, right? That whatever that number is. The number right below it is going to be.
Well, whatever that number is, it's interesting, because it's the smallest non-interesting number. It is, it's Gödel's theorem applied to Wikipedia. Yeah, it is, okay. Well, last question. We started by asking you somewhat playfully about your next work, your next cluster of thoughts that you're taking on, and you hinted only that it's going to be very different from what has come before, perhaps in format. We've talked a lot about format, but maybe in substance too. Is there anything more you want to say about what you're thinking about? Jonathan? No. There really isn't, and I'm mindful of the fact that not only do we have the people here in the real world listening to me, but there are people in a virtual world. And honestly, I have a kind of fuzzy idea that I'm toying with, but it's not firm enough. If I talk about it, it'll disappear. We look forward to it, and we have every certainty that it is not non-interesting. Please join me in thanking James Gleick. Thank you all very much. Thank you.