All right, I'm Dylan Mulvin. I'm a postdoctoral researcher at Microsoft Research, where I work with Tarleton, and I also collaborate with Merrill. Thank you so much for being here. Thanks for spending your lunchtime with me, and thanks to Kerry and Dan, especially, for helping get this set up. I'm gonna just start by taking you back to 1999, when there was excitement in the air, and people were wondering: would airplanes fall out of the sky? Would bank accounts suddenly dissolve into dust? Would nuclear missiles self-launch? Would Wired Magazine run an image of the Angel of History totally naked on its December cover? The answers, of course, were no, no, no, and yes. So I'm gonna spend about the next 20 minutes or so talking about the Y2K crisis and why I think it's a potentially rich episode in the history of computing and technology, one that possibly gives us some important lessons for thinking about the management of crisis and how we talk to people in the face of vulnerability, especially vulnerability that's a result of technological precarity. So I'm gonna introduce you to some of the ingredients that are a part of this project, but it is a work in progress, and I've tried to make this slide presentation out of only objects and text that were available prior to the start of the year 2000. I didn't use slide software from that period, but you'll have to forgive that. So today I wanna argue that the Y2K bug is a crucial chapter in the history of the public pedagogy surrounding computing. A key moment, in other words, in how we calibrated a way of talking to the public about how computers work. It was also a moment for teaching people how their lives were already captured in data, whether or not they recognized it, and that things like infrastructure and computing could put them and their data at risk. And something as simple, or seemingly simple, as the representation of the calendar in computer code could bring everything to a crashing halt.
So the agenda for today is pretty brief. I'm gonna start with some background on Y2K for those of you who weren't actively working on this remediation, and for any teenagers who are here who have no idea what happened. And here I'll detail some of the technical origins, and it's a pretty broad gloss given the short time period we have, and some of the concerns about computer literacy in the 1990s. Then I'll introduce my Y2K archive, such as it is, and some of the documents and sources that I'm drawing on to try to compile this history. And finally, in the last section, I'm gonna talk briefly about the labor of repair, and how people were talking about the labor of repair in this period and kind of contesting narratives about who was doing the work and what that work looked like. So let's start with some background. As you'll remember, the main problem that people imagined was driving the Y2K crisis was something called the century digits problem: representing a date in six digits instead of eight. And this was an especially dangerous problem because the two-digit year format was used in a bunch of different computer languages, mostly from the 1950s and 1960s, but one language in particular, called COBOL, which we'll come to later, gets singled out as a problematic language, especially because it was used in financial and banking systems and things like waste treatment facilities. So why would you choose to code a date in six digits instead of eight? Anybody who remembers that period, why was that choice made? Yeah? To save space. So the main reason was compression. There was limited computer memory in the late 1950s and early 1960s, so you had to make hard decisions about what kind of information was crucial to include. And actually, excluding the one and the nine is a pretty easy choice, because most of the information that you were coding would eventually be printed out.
If you had to use it, you could print it out, and you could pre-print a template form that had the one and the nine on it. If you had a checkbook in the 1990s, you'll remember it always said 19 so that you wouldn't have to go through the laborious act of writing out those two digits. And there was some thinking, right? Maybe these programs and these applications will last 10, 15, 20, 25 years; really optimistically, maybe they'll last until the 1980s. But very few people, though there were exceptions, very few people were imagining that they would last 40 years. And if we think today about the kinds of temporalities that are imagined for the applications being written now, still hardly anyone is thinking on a 40-year timeline, if they're even thinking one or five years ahead. So those two digits that you erased from the year were pretty minor memory savings if you were just creating a spreadsheet of your friends' birthdays, let's say. But if you were trying to manage a database for a financial firm or an insurance corporation or a government agency like the Social Security Administration, those memory savings were significant, because of hardware costs and the labor costs of keeping up that hardware. So what was COBOL, which I'm gonna focus on as a problematic language? Well, COBOL stands for the Common Business-Oriented Language, and it emerged from something that's referred to as the software turmoil of the 1950s and early 1960s. And here I'm really indebted to Paul Ceruzzi's and Nathan Ensmenger's histories of this period. And this is from Nathan's book: the cover of the Communications of the ACM from January 1961. And you'll notice, oh, it's not that clear, so I'll describe it to you. These are all of the available languages, and there are dozens and dozens and dozens of different programming languages spiraling upwards in the shape of the Tower of Babel.
And this represents the question of that period: what language should we use? Different hardware manufacturers had different proprietary languages, and different applications had different languages. And there wasn't a shared lingua franca for getting things done. So the government and the military stepped in, this is all in the United States, to say: we need a compromise, right? We need a shared language that can work across hardware and across applications. And COBOL was one attempt to create that compromise. So there were a few criteria, right? It had to be readable by novices, in particular a managerial class that wouldn't necessarily be coding-literate but had to be able to read, or roughly grok, what a line of code said and what its purpose was. It had to be free of proprietary control. And it had to be usable for a wide range of applications. All of that meant that it was inelegant, right? It was super ugly, it was super easy to read, which made it less specialized and less interesting as a language to learn. But in 1960, the US government announced that it would not purchase or lease any equipment that couldn't run COBOL, which basically sealed it as a lingua franca going forward. So despite all of that, a tiny number of people learned COBOL. And this is gonna come up again as we talk about the labor of repair. If you can see this picture, this is the quote-unquote 25th anniversary celebration for COBOL, and you can see Grace Hopper in the front there. What you can't see is that that's a cake in the shape of a tombstone that says COBOL on it, because these were some of the people responsible for creating it, imagining that its heyday was fading, and this was of course still 15 years prior to the Y2K crisis. So we have a kind of contradiction here. By the late 1990s, COBOL was by most measures the most widely used programming language in existence.
It was used in 80% of all business operations, but had some of the fewest practitioners. And this has everything to do with cultural distinction and the politics of prestige. COBOL was déclassé; it was designed to be readable, again, by managers, not necessarily the people doing the programming. And it was referred to as a trade-school language. So those are the first ingredients of this project: a problem that was created out of compromise, out of a desire to save on hardware and to make an easily readable language. And already you can see the foreshadowed considerations of literacy and pedagogy and teaching new kinds of people to read code. Sometimes this slide just makes my computer shut down, so I might have to get through it kind of quickly. But there are other aspects of the Y2K crisis and how it was constructed in the late 1990s that are really important. First I have to say there's still, of course, mystery and widespread disagreement over whether the Y2K crisis was real or whether it was a false alarm. But I wanna say that whether or not Y2K was a true threat, it instigated massive programs of remediation and public education. In the lead-up to the year 2000, estimates of the cost of reaching compliance ranged from 300 billion to 600 billion U.S. dollars. And that's a pretty big range, but no matter what, a massive amount of money. In the United States, compliance is estimated to have cost $100 billion. In terms of disasters, that would make it one of the most expensive disasters of all time. And it didn't actually cause a disaster. So in other words, the legitimacy of this threat, historically speaking, is pretty secondary to the effects that the perceived threat had, since the changes that it engendered were real. And this I call a kind of crisis pragmatics: a crisis that is felt to be real is real in its effects, to tweak the Thomas theorem of sociology.
And I think that the pragmatics of this crisis were tied entirely to the structure of the problem: that there was a precise moment you could point to and say, that's when everything's gonna go haywire, that's when planes are gonna fall out of the sky. There were, of course, lags and strange temporalities to the crisis. Some credit cards stopped working in 1998 if their expiration date was in the year 2000. But there was this imagined moment that when we flipped from 1999 to 2000, everything was just going to explode. So unlike a hurricane or an earthquake, or even climate change, which have hazy timelines for unfolding, where you know something's going to happen but you don't know exactly when, Y2K was sort of perfect for creating fear and compelling action. The next thing that's important is competing notions of literacy and numeracy. So first was a concern with how computers would read or count numbers, right? A fear about ambiguous numbers: that 99 would be confused with 19, or the year 2000 would be confused with the year 1900 or some other unknown date. We might call this a kind of machinic numeracy or innumeracy. I'm more concerned with a second kind of literacy, the one we more often think of when we say computer literacy, which is efforts to train non-specialists in understanding how computers work, and specifically what the risks of something like old code or a legacy system would be, or what a calendrical error would mean for something like infrastructure. And there were enormous hurdles to getting people to understand and appreciate these risks, and to balance that out against just straight-up fear mongering. And it wasn't just ordinary folks who had to be convinced. In late 1998, the New York Times was still writing about the Y2K crisis as a problem for quote-unquote computer users, right?
Unable to imagine that in 1998, everyone was a computer user, whether or not they sat at a keyboard or used a gray box, right? If they weren't actively using computers, their lives were being used by computers, structuring the mundane and everyday aspects of their existence. Another aspect of getting over this barrier to understanding is a strong, potent sense of betrayal, right? Ellen Ullman, writing in Wired in April 1999, puts it really well, I think. She said, after being told for years that technology is the path to a highly evolved future, it's come as something of a shock to discover that a computer system is not a shining city on a hill, perfect and ever new, but something more akin to an old farmhouse built bit by bit over decades by non-union carpenters. So in other words, the chances of successful literacy campaigning depended on teaching people that they were already entwined in digital technologies, and that technology always involves constant upkeep and repair, right? That it's not just a finished, perfect product that works or doesn't work; it requires attention, right? And people to work on it and to keep it going, and it might not go the way you expect. And you can see in this quote some of the issues of repair and labor that are gonna come up again. All of this brings me to my archive. So as a historian, the Y2K crisis offers a really challenging problem: how do you write a history of something that may or may not have happened? One way to do that would be to go out and interview people and weigh their opinions against other people's opinions, then compare the evidence and decide whether or not it would have happened. That's not what I'm doing. Instead, I'm trying to focus on what did happen, right? That pragmatic response to the fear that something was going to happen.
This creates another problem, which is that because it didn't happen, because Y2K appears as a kind of punchline to the joke that was the 1990s, its archive is really spotty. Y2K played out in newspapers and network news, but it also played out in pamphlets and paraphernalia and memorabilia. If you were alive then, you'll remember just the amount of stuff that was branded with Y2K. And it played out on the web, on websites that nobody pays the server costs for now in 2017. So some of it's captured by the Internet Archive, but actually very little of it is. There is an archive, however, in the classical sense: the Charles Babbage Institute has the papers related to a couple of organizations. I've been there and I've looked at them, and I'm gonna tell you a little bit more about them. The first organization the CBI has is the Center for Y2K and Society. Does anybody know or remember this organization? No? So the Center for Y2K and Society was an independent organization run by concerned citizens who believed that the government was downplaying the risks of Y2K. They were on the far end of the Chicken Little spectrum. And I'll tell you a little bit more about their collections in a minute. But the other organization the CBI has is the International Y2K Cooperation Center, whose papers it holds, and which was paid for by the World Bank, backed by the UN, and headed up by a guy named Bruce McConnell, a high-ranking bureaucrat within the White House. So looking at their two approaches, we can see differing styles of teaching people and of approaching vulnerable populations in the face of technological precarity. So the Center for Y2K and Society, or CYS, is fascinating, and one of the most fascinating things is their record keeping. As far as I can tell, everything the CYS ever read or looked at is printed out. And it's printed out and kind of thrown into a box in Minnesota. And it makes sense, right?
If you're an organization devoted to the idea that society is about to collapse, and in particular that computer technologies are about to collapse, you can't house your records or your documentation online or on computer hardware. You have to print it out. So the documentary evidence that they leave behind shows the build-up of a kind of ballast of paranoia, right? Everything that they have, the emails, the forwards, the press clippings, is organized and annotated to support the theory that some sort of systemic collapse is imminent. There are just millions of examples, but here's one where somebody else has produced a pretty mild risk assessment, and they've printed it out and the person has just written, nothing new here, right? Unremarkable, because it's a mild risk assessment. And other things that are less mild will have more exclamatory annotations, and then that's been filed away. The most significant thing they did, though, was circulate supposedly 20,000 copies of this thing, called the Y2K Community Report Card. They sent it across the country, trying to get different organizations to respond and speak to their preparedness for Y2K, and then they compiled the results. Here's an image of the questionnaire, which you can't really make out, but the results of their survey confirmed their worst fears. It seemed to show that things like waste treatment facilities weren't ready, that vulnerable institutions like nursing homes were totally, woefully uninformed, and that America was doomed. Their holdings also really capture the long tail of the Y2K crisis and, as I mentioned a minute ago, the many commercial opportunities that it seemed to present to shill survivalist materials. So here are instructions for ordering something called the Y2K Kitchen, which was a set of resources for cooking with canned and dried foods. This is a role-playing game called Countdown to Y2K.
This is a book called Y2K for Women: How to Protect Your Home and Family in the Coming Crisis. And there are just many, many, many more that show that no opportunity was lost to process stereotypes and cultural affinities and cultural norms through the kind of paranoid template that Y2K provided. The second organization: so if the Center for Y2K and Society was devoted to this kind of grassroots fear mongering, then the International Y2K Cooperation Center was devoted to sort of top-down fear assuaging. Everything will be okay. The materials from their archive include professionally organized news clippings, by day and subject, and a script for a 1-800 number that anyone could call, where an automated voice would tell you everything was gonna be okay. And if you could make it through the phone tree, you might actually talk to a human who would tell you everything was going to be okay, especially if you were responsible for an institution with a computer system that you thought needed to be remediated. It also includes details of something called the Y2K Expert Service Corps, which was a quote, international network of Y2K experts who have volunteered their time and expertise to make available advice, intellectual capital and information. And it includes a very extensive project called the Y2K Community Conversations campaign. These conversations were promoted as opportunities for business leaders and institutions to share their Y2K preparations with locals: come and hear how your local business or your state business is actually totally ready for Y2K. So in this case, it wasn't about explaining how things were gonna be okay or helping people prepare for disaster, but just saying, don't worry, we're ready. It was also an opportunity to kind of demystify computer technologies.
So whereas some government agencies in this period were focused on risk assessment and remediation, the Cooperation Center was also trying its hand at community organizing, developing a kind of pedagogy of preparedness. And I think what the collections of both of these organizations show is an attempt in this period, in the late 1990s, to educate diverse populations in the realities of technological precarity, right? In the realities of what it means to use old code and old systems and legacy systems. It was an attempt to collapse any division between computing and everyday life by educating people about the many daily activities that were dependent on computers. And they lodged their explanations at multiple registers. At the local, right? What your pharmacy is doing, what your school is doing. At the mundane, right? How your credit card may or may not be at risk, how your Social Security checks may or may not be at risk. And then also the catastrophic, with one organization devoted more emphatically to the catastrophic and the other to downplaying those risks. The last part of the archive is press coverage. And specifically I'm looking here at how labor was talked about in this period, and who was being described as responsible for repairing Y2K errors. You'll remember from the beginning of this talk that COBOL's widespread popularity was in total contradiction to the number of people who were fluent in COBOL. And out of this you get these stories about old programmers being taken off of golf courses and brought back to work on programs that they had worked on their entire careers, that they hadn't been paid very well for, that no one had appreciated them for, and being paid lots and lots of money to come back and fix them. This is one story from the Washington Post, but there's an LA Times story that I think really captures these dynamics. Here's an excerpt.
Once considered the dinosaurs of the technological revolution (and mind the metaphors), COBOL programmers have become a hot commodity precisely because they stayed in the high-tech Jurassic era. Like priests who dutifully maintain dusty heaps of scrolls, they're being recalled from seclusion to recover the ancient knowledge of COBOL. Okay, so if COBOL is the ancient knowledge kept by priests who weren't appreciated in their time, then Y2K was the sort of secret code that lay dormant in the text, right? And this is funny, except I think it's entirely typical of the active mystification of technology that was happening at this time. So on the one hand you had very active efforts to demystify technology, to explain how things actually fit together and how you are and aren't at risk. And at the same time, very strange ways of talking about the labor of fixing computer code. And we know this because the Y2K crisis also led to one of the first massive waves of hiring of Indian IT workers, and the first wave of body shopping, where hundreds of thousands of workers in Hyderabad and Sydney were employed to fix Y2K errors. And nowhere in the coverage of the remediation effort is this talked about. Of course this wave of body shopping was part of ongoing changes driven by globalization and shifts in labor markets, which Mary Gray can tell you a whole lot more about. But instead you get these stories of underappreciated coders who were ignored in their time being called back for one last job, right? So this is where I'm at. Maybe the threat of Y2K was real, right? Or maybe it was a $600 billion pragmatic response to an imagined disaster. But in either case, it was one of the world's largest attempts at basic technological remediation. There are kind of two stories that can be told here. One is what looks like a punchline, what Louis Menand recently called a nutty cocktail of Luddite millennialism and digital overthink, right?
But from another perspective, this is a hugely significant episode in crisis planning and management, and a population-wide attempt at basic computer literacy campaigning. I think the reality is probably somewhere between these two. And the kind of interest and focus on technological vulnerability and on relationships of interdependency that Y2K surfaced in this period make it a really rich site for talking about literacy building, the acts of care that we undertake at particular moments, and just how we talk to people about how computers work. So thank you for your time, and I'm really looking forward to our discussion. Merrill. I'm wondering how US-centric this was, considering you are from Canada, and if there was a sort of US isolationist response, some kind of response that positioned the US in a particular way in relation to the rest of the world when it came to managing this, but also if there are Canadian archives of this too. Yeah, I guess, two kinds of things. I mean, I can answer a little bit of that from the Canadian perspective, but does anybody else want to comment on what they perceive as the US-centric nature of the crisis? I mean, Sheriff, you were talking yesterday about the UK. So what it was called, the bug, that was one thing I wanted to ask you. Yeah. The terminology. In the UK, definitely, the Y2K bug was something, anyway. To be honest, some of what you were saying would absolutely map over. I don't remember the survivalist instinct. I think North America perhaps has a monopoly on that narrative. But it was very much about finance, because of the City of London. The City of London being seen as this kind of great... we had the Big Bang and the way in which the stock exchange was kind of electrified and stuff. So it was very much around that. So we didn't have as much of the community effort, from what I remember. It was there, but it came out differently. Yeah. This is related to that question.
The US is also different in that you write dates month-day-year instead of day-month-year. And so how did that fit in? Was this code used in other countries where that format already had to be rearranged? That's a really good question. Again, because Merrill outed me as a Canadian: of course, that format ruins my life, especially because my wife's birthday is March 10th and my daughter's birthday is October 3rd, and so it's just impossible. Was it also about the age of the infrastructure, so what different countries had to deal with? Like with COBOL or certain kinds of instantiations, would this problem not have shown up in countries where the financial infrastructure was newer, so a newer language was set up with different choices, or maybe there was more storage, so certain kinds of... I don't know. That's an interesting question. So I can give you a few responses. One is that there's always a center and periphery problem, which is: the way it played out in Canada was as an American problem that we also have to deal with, because our computer software comes from the United States, our banking software comes from the United States, and we're interdependent with the United States, so anything that affects you will affect us. There was no idea that we could somehow throw up a border and be isolated from the Y2K crisis. In the United States, in some of the ways it was talked about, there was also a totally xenophobic discussion, which was an idea that, okay, we're gonna spend $100 billion to reach compliance. We're gonna introduce new regulations that are going to limit our exposure to this problem. What we can't predict is what all of the countries who stole our software are going to do, right? And I'm not gonna name the countries that they named, but they named very particular countries that had stolen our software and used it in their infrastructural projects. And we can't guarantee compliance in that case.
Some of the largest, most populated countries in the world, right? China? I was gonna say, yeah, right? So, but again, that's a question about empire and its stretch, and the ways in which it gets taken up either intentionally or unintentionally. There's another element to this, and were you the one who worked with insurance companies? It was you. So we could talk a lot about that, which is the efforts to limit liability within the United States and exposure to lawsuits if there were disasters. That's a big part of this question, too. And other countries just don't have that structure of litigation. So in Canada, there was no fear that lawsuits as a result of Y2K would bring down the economy, as there was in the United States, in part because you can't sue people that easily in Canada, since you have to pay their legal fees if you try to. So there are all sorts of different ways that this plays out, but certainly it's a US-focused problem, because that's where the hardware and the software came from. Yeah. I definitely agree that litigation risk avoidance was something that certainly drove a lot of the headlines. I think it's also interesting to map this crisis onto the beginning of the boom in PCs being sold, because for the first time... I can't remember when IBM sold its first PC, but what, 95? Oh, much earlier. Oh, much earlier. Okay. What? Was it that much earlier? Okay. But in any event, the boom in PC sales taking off meant that there was a lot more mind-share penetration in the general population, which goes to that Wired quote's sense of betrayal. Yeah. That could be experienced by a lot more people, as opposed to it just being an expensive business problem. Right. And for the most part, PCs weren't at risk. No. But if you had one, you'd probably worry about the unknown. And sure enough, if you bought a computer in 97 through 99, you most likely bought it with a sticker on it that said Y2K compliant. Yeah.
Since you were outed as a Canadian, I'll out myself as somebody who once ran a COBOL program. A high priest. It was the disgusting, dirty work of dealing with money. By the late 90s, if you were doing stuff on the web, I mean, there were no web servers running COBOL. Right. But I mean, this is a really rich talk and there are so many threads to pursue. I want to ask about a different kind of labor, because one of my recollections of this is, call it the performative aspects of the Y2K crisis. At the time, I worked for Tufts University, and it was very important to the university that we hire a famous consultant. It wasn't phrased like that, but that was the signal that we were taking it seriously: that we had a guy who had written columns about Y2K coming and talking to us every week. And it was functionally useless. There were great war stories, great, call it prospective disaster porn, how would society collapse, et cetera. A complete waste of time, but there was just so much of that, which speaks to the crisis mongering, et cetera. I mean, have you followed those threads, the rise of this sort of high-priced consultant fear mongering? Yeah, I haven't followed that thread. I don't know how I would follow that thread, except that in the largest version of this project I would do something that I think is partly happening in this room, which is ask people where they were for Y2K and how they prepared for it, or, if they were responsible for an organization or a part of an organization, what was imagined. Talking to somebody who was a professor in 1999 and happened to be the chair of a department: his university asked him to sit in the server room with all the other departmental chairs on New Year's Eve 1999 to just stare at the computers, and he said no.
But those sorts of... I think oral history is really the only real way of recovering that aspect of it, and it would be really difficult to parse how much of that was theatrical, how much was necessary, and what we really think of as unnecessary. It might actually be important to bring in high-priced consultants if it assuages fears and allows people to just do their job; that might actually be a kind of minor cost, or it's just a total boondoggle. It's really difficult to draw those lines, but it's an interesting historical question, yeah. Are you going to go through all this again in 2038? Well, I think it depends on if the world exists in 2038, but it seems like... do you want to explain to everyone why? There is a counter, I think it's a number of seconds, or is it a number of microseconds? I can't remember. In UNIX, traditional 32-bit UNIX, which will gradually be replaced by 64-bit, the last date that can be represented in that format is somewhere in 2038, unless you change it to an unsigned integer, in which case you get longer, but then you cut off earlier dates. So some version of this problem might exist again in 2038, to the extent that people are still using 32-bit UNIX systems, especially embedded systems that might not be replaced for 30 or 40 years. So it's entirely possible, and it would be a question of whether it plays out the same or differently, because we went through this in 1999 and it seemed like a false alarm, right? And what the social costs are of apparent false alarms for... You could also look forward to 2100 and how the... maybe the moment's going to come around again then, so... Yeah. So Bruce Lewenstein at Cornell had an NSF grant to document public perceptions of the Y2K problem. And I'm curious if you worked with it, and whether the idea of trying to document as you go along is successful or not. Right. I've looked a little bit at his research, and it's actually in the archive as well.
I think possibly both organizations noted it. And what has been important for me about that research so far is the arc of fear, which seemed to peak around... now I can't remember if it was April 1998 or April 1999, and then by December 1999 fear had really dropped off and cynicism had peaked, right? Which is, again, funny, except that when we're talking about the management of crisis it's really important to balance those things out: if you have too much cynicism you can't prompt collective action, and if you have too much fear you can't prompt responsible action.

What are the innovative features of that collection?

They figured out a way to capture television references to Y2K from the major broadcast networks, including Jay Leno and everything else, and you'd be able to track it that way, I think, really strongly. Yeah.

Yeah, it's such a rich project, and I appreciate the archival dimension of it in particular, but I'm wondering a little bit about what the post-mortem of all of this was. I'm thinking in particular about a lot of STS scholarship that's looked at the breakdown of large technical systems, the Challenger disaster being the one that comes most immediately to mind, and the way that these large bureaucracies sort of knit the world back together after this happens; this seems like it would be an interesting comparison with some of those. I think also of the foot-and-mouth outbreaks in the UK and those kinds of protocol breakdowns. So I'm just curious: did the James Gleicks, the public intellectuals and journalists, undertake any post-hoc analysis, or did we just kind of move on?

So it comes up a little bit in STS scholarship. There's a pretty important piece about repair by Thrift and Graham that talks about Y2K, but mostly in the valence of perceived disasters that didn't happen, right? And so it gets totally rewritten as this punchline, as a Leno monologue line, as a kind of joke.
And as a person committed to self-sabotage, I often seek out these jokes and then try to take them seriously and see, oh, what really happened there? What if we did a different kind of post-mortem that wasn't just, oh, okay, we blew it, right? We didn't understand computers well enough to reasonably assess what kind of danger they presented. That being said, I'm working on a kind of long-term project with a colleague who does histories of the AIDS movement in the 1990s and its relationship to technology and community internet provision. And so that project, compared to this one, which we just call Bugs, looks at these two things as kind of mirror images, right? A crisis in which everyone was paying attention but nothing happened, and a crisis in which a lot of people died and no one was paying attention. And when we started on the project we were thinking of them as these mirror images, and each would offer an opportunity to provoke and understand the other from a different angle. But what we also started to find was just tons of references in the Y2K literature to HIV and AIDS, right? About the spread of harm through networks, through vulnerable systems and vulnerable people. And even in Manuel Castells's work on the network society, he often returns to HIV and AIDS as a way of mapping networks, and especially networks of compounded harm. And there's an online resource for people living with HIV and AIDS called The Body that some people might be aware of. And on The Body you get columns in '97, '98, '99 about the other millennial crisis, right? And Tony Kushner's Angels in America is subtitled Millennium Approaches, right? And so you actually have these kind of twin crises of the 90s that play out in different ways but are then refracted in each other.
And so part of this project is looking at the understanding of crisis, not just doing a post-mortem now, but in the 1990s, and the ways these relationships were being thought through. Yeah.

Other than the 2038 problem, have you looked into possibilities for other such looming disasters, like internet addresses, et cetera, which I think are pretty well under control, but I don't know. There might be a number of other things in the next 50 years; it could happen.

I mean, I haven't actively sought to compare this to the internet addresses problem or to the 2038 problem, because there's no shortage of impending disasters, and it's a kind of question of... this goes back to the thing I just said, right? There is no shortage of disasters and crises and people in harm's way and people who need help and who are vulnerable, and what's interesting about this crisis, and this is what I pointed to, is the actual structure: that it was a switch from 9s to 0s, and all the sorts of millennial fear-building that come with the turnover of any century, and I know it's not technically the century, but whatever. It's a question of which crisis actually emerges as the one that needs attention paid to it, right? So it would be remiss of me to look forward and say, well, that crisis is going to play out this way or that way, but we have to look backwards sometimes and see, oh, it's strange that that one is the one that got the attention. Yeah, Tarleton.

I think the point about the shape of the crisis is really interesting, this idea that it hasn't happened yet but we all kind of know the date, right? It's a very unusual one. It was making me think, and this isn't an exact comparison, but part of your story is how people have to learn that infrastructure they count on is vulnerable in ways they have otherwise not wanted to think about. I mean, you know, we're all, like, encouraged not to think about this.
It's perfect, it's seamless, it works great, and then suddenly you have to discover it and then have to act upon it. I was thinking about things like the major commitments to retrofit bridges: sometimes it's because the Tacoma Narrows collapses or there's some striking disaster, but other times it's moments where the science develops an argument that says what we thought was sturdy could be vulnerable in certain circumstances. And when you said this was one of the most expensive kinds of remediation, I was wondering whether we could count those things, like whether the bridge retrofit has a widespread... that's a network that, you know, has certain kinds of invisible vulnerabilities. I don't know how you amass that number, but it's another one where you want to count on the idea that roads are put down and bridges carry cars, and that those things don't twist themselves apart if the wind blows the wrong way.

There was one bridge collapse in Minnesota that triggered a lot of this.

Right, right. So there's one shape where an incident then makes other things vulnerable, not just that bridge but every bridge, but there are also discoveries about infrastructure that then have to raise questions.

Yeah, so I pulled this up quickly. Google Ngrams aren't perfect things, but this is a Google Ngram of the term "infrastructure crisis." And "infrastructure crisis" is really interesting to me because it seems to be one of the only problems we currently have with trans-partisan agreement: that infrastructure is a virtuous thing in need of repair. And, you know, in Trump's acceptance speech on the night of the election, he talked about the infrastructure crisis.
So you can see from this that it took off in the mid-to-late 1980s, that's where the term really started being used, and I have another one that goes to 2007, but it really peaks around the year 2000; its popularity is highest in the 1990s, which is interesting to remember because it seems to also be very current presently. So one answer to that question is that there are these moments in which certain things seem to be in crisis, and now we're talking about infrastructure being in a 30-year period of crisis, and how that actually works as a kind of engine for promoting scientific or social-scientific investigation. And the kinds of questions we have to ask are: okay, so are we concerned about bridges, or roads, or computer infrastructure, or internet security, or nuclear safety? The infrastructure crisis just becomes a kind of pen for holding those things in, right? We can lump those things in there. That being said, in the late 1990s few people were talking about computers as infrastructure, right? Even within science and technology studies, Paul Edwards writes about computers as infrastructure in 1997, I think. So he was still convincing the very field that I think would most embrace that concept, contemporaneously with this crisis. And I'm not sure, if you asked politicians when they think of an infrastructure crisis, whether they're thinking about old computer code and the risks it poses. I don't know if that partly answers your question. Yeah.

Two provocations I'd like you to respond to. If you view this as a sort of calendaring-software-provoked crisis, then you'd think one of the responses would be: okay, time is important, we need to get this right in our systems. Yet it seems like every leap year, every leap second, et cetera, there's a news story about some major system that could not cope and did something wrong.
The other way to look at this crisis is that Y2K was the trigger, but this was really about misunderstanding the longevity of software. I mean, those of us who wrote some of these programs knew we were doing a terrible job and simply assumed they would be thrown away. Right. That would have been the responsible thing to do, and, oh my god, they're running 20 years later; of course they're not Y2K compliant. We were lucky they were 1990 compliant, and there were actually 1990 problems as well. But then you see something like the Amazon S3 problem last week, if you're familiar: a typo in some command line takes out systems, and then it turns out they'd never rebooted this stuff. And that's sort of one of the maintenance lessons out of Y2K: yes, there will come a point where you will have to restart these things, and you really want to know how that stuff works. So we haven't actually learned the longevity lessons either. I guess if I turn that into a question, it's: are we doomed?
No. I mean, I think your provocation captures what was kind of the tension of this history, in the way the problem gets described: is it an easy problem to fix, like, okay, we just need to do a better job of X, in this case representing the calendar, right? And yeah, those problems come up all the time. There was that thing last year where somebody on 4chan made an image that said, oh, set your iPhone's date to January 1st, 1970 and reboot it, and we've introduced a cool thing that will make it seem like Apple in the 1970s, right? It'll be a blast from the past. Of course it bricks your phone, because if your timezone is ahead of GMT, local midnight on that date doesn't exist in UNIX time; it falls before the epoch. And so the Y2K crisis exposes the fact that calendars are infrastructural, and they're actually pretty easy to tweak, and the ways they're integrated with other forms of technology are easy to tweak. So yes, that is certainly an aspect of it: you have to be careful about the representation of time in technology. But it was also a trigger, right? It was a way of saying, yes, we need to be more responsible about the longevity of our software, and if anything we're less responsible about the longevity of our software than we were in 1999. So if there was one lesson that came out of it, it was that very few people will be held responsible or accountable for technical decisions and design choices made at one point and for how they play out in the future. And if there's another lesson, it's that you have to be really careful with the public's trust, right? Either you're saying, I'm sorry, we made a decision in 1960 and now it's having this impact, I'm sorry, but here's what you need to do to prepare yourself; but then you have the after-effect, which is, I'm sorry, we told you that was maybe going to be a disaster, it probably was, but we fixed it, right? Either that's an opportunity to build trust, to say, okay, we're telling you that we screwed up, or it's an
opportunity to lose trust. But it was also, at that point, imagined as a chance to make computer engineers more responsible, right? There's another article from the 1990s about how this is an opportunity to start certifying all computer engineers, right? This is finally the call: we need a professional organization, so that nobody can call themselves a software engineer unless they're a certified software engineer. And there's this quote from 1998 from John R. Speed, the Executive Director of the Texas Board of Professional Engineers, right? He says many who have called themselves software engineers are wrong; they're the local music dropout who chooses to use that title, right? And so this is actually a kind of incredible moment in which this division between kinds of engineers was still animating the way people viewed this problem: this is a result of people who aren't engineers making engineering choices, and now we're having to pay the consequences. "If engineers built buildings the way programmers wrote programs, the first woodpecker would end civilization."

Yeah, I remember that. It's a great line. Thanks, everyone.

What was the 1990 problem? I don't remember hearing about that.

I mean, it wasn't as widespread, but it was the same sort of thing, where people didn't expect it.