Thank you very much for coming. This is, I think, the most wildly speculative talk of FOSDEM. I hope it will provide some entertainment. So, hello. My name's Liam Proven. As of this year, I've been working in IT for about 30 years: for the first 10 years as a support and sysadmin guy, and since then mostly as a technical writer. I'm currently a technical writer on the documentation team at SUSE, but I would like to say — I sort of have to say — that this talk is based on my own research going back about five or six years now. It's not in any way connected with my job at SUSE. Indeed SUSE, who were kind enough to pay for me to be here this weekend, didn't even ask about the content of this talk, so it doesn't represent their views. And forgive me if I am a little rusty. This isn't the first time I've done a talk at a conference, but it is the first time in 22 years, so I'm a little out of practice.

I went through a number of titles for this talk, and one that I discarded was "The Great Lies of Computing", because this is a talk about myths and traditions that many of us take for granted, but which I think are often just not true. And it's a very good sequel, in a way, to Diomidis's talk earlier, because he was talking about the history of Unix, and I'm talking about a war that happened 30-plus years ago, which I stumbled across and which seems to be mostly unknown.

So I'd better talk about how I got here. I was a freelance technical journalist in the UK for many years, and one of the problems of being a freelance writer is finding stuff to write about that your editor wishes to publish. Way back, probably more than 10 years ago, I had a bit of a dry patch when I just couldn't find any tech news that nobody else was writing about. I've written about a lot of stuff over the years. I started out writing about an obscure, quite experimental free operating system called Linux. That one did me pretty well for a long time, but after a few years everybody was writing about it. And about six or seven years ago I wrote about what I thought would be an important up-and-coming technology, containers, which I felt were a very useful tool that needed to be used more. Again, a few years later, everybody was writing about it.

So I came to this period when I couldn't find anything very exciting, and I reached into one of my hobbies, retrocomputing — collecting and using vintage computers — and I wrote an article about a new release of AmigaOS. I submitted it to my editor and he said: no, forget it, Liam, nobody cares any more, it's dead, it's gone, let it go. And I said, well, I haven't got anything else right now. He said: neither have I — okay. So he ran the article, and 48 hours and about a quarter of a million page views later, he came back to me astonished and said: okay, I was wrong, people are really into this stuff — give me more. Since then I've written about Atari operating systems, Sinclair QL operating systems, and the Acorn Archimedes operating system, which was a personal favourite of mine in my 20s. And after a while retrocomputing got really big as well, and everybody was writing about it. So I needed to find something new again. I went digging for a more obscure machine to write about, or a more obscure piece of software. And I found one.
I found it through a slightly indirect route. I was digging away at this story, trying to find something to write about, and I uncovered something a bit bigger than I expected, something I hadn't really heard about before. There's a saying you've probably heard, from a British national hero — though a questionable man these days, maybe. He said: history is written by the victors. When there is a war, when there is a conflict between rival camps, one side doesn't generally get wiped out; one doesn't get completely extinguished or exterminated. But the side that wins sets the agenda, the losers become sidelined, and often some of their culture is incorporated. And that's what happened this time.

So, here's a screenshot. You've probably heard a story about this software and its role in the history of computing. This is the original Smalltalk system, written at Xerox PARC. This is what Steve Jobs and a couple of Apple people went to see in about 1979, and Jobs was so blown away by what he saw that he went back and redirected his team onto making a computer that became the Apple Lisa. But by Jobs's own admission, he was so dazzled by how this looked — by the GUI — that he completely missed the other two things that Xerox were trying to show him. This was the first GUI computer, really. It wasn't the first to use a mouse, it wasn't the first to have hyperlinks and things, but it was the first with an entirely graphical display, where everything is presented in windows on the screen which you can move around and overlap.

But there were two more important things, arguably, that Jobs missed — source: Steve Jobs himself. One is that this system was all built in this one remarkable language, Smalltalk, one of the original object-oriented languages. Now, the stack on the Xerox Alto workstation didn't go all the way down to the metal; the base operating system was written in a language called Mesa, which is vaguely Pascal-like, and Smalltalk sat on top. But the point is that everything you could see and interact with — the file system, the network, the file structures and everything — was managed in Smalltalk. It wasn't a thin veneer on top of a conventional operating system; it was the bulk of the stack, with a conventional operating system underneath. Jobs didn't get that. He also didn't notice the part about pervasive networking: all the Altos were on a network, and they all talked to each other and to a file server, and he totally missed that part as well. Apple then spent 25 years or so putting a lot of this back in. But one of the most important things never got back in: this whole deep, rich stack of a language designed for building interactive graphical applications.

Steve Jobs, of course, was a co-founder of Apple. He hired a man called John Sculley from PepsiCo with the famous line: do you want to sell sugared water to children for the rest of your life, or do you want to change the world? And Sculley came and took over at Apple, and one of his first actions was to fire Steve Jobs. Jobs went off and started his own company, which was NeXT, and that is now the basis of Mac OS X. But Sculley commissioned one of the only non-Macintosh computers ever to come out of Apple. It was a failure, but it was a very interesting and educational failure, and I went digging into that, thinking maybe I could write about it. It's one of the first computers to become famous in a comic strip. Some of you might have seen this: it's from Doonesbury, by Garry Trudeau.
The computer is the Apple Newton, and it was a personal digital assistant. It learned your habits, it learned your favourite places and people and things, and it tried to help you. And one of the things they did is there was no keyboard, not even on screen: you wrote on it, using some pioneering handwriting recognition technology which had to learn your handwriting before it worked. So when you tried it in a shop, it didn't work — which meant they were almost impossible to sell, and that's what got parodied in the comic. He writes "Catching on?" and it comes out as "Egg freckles?"

The Newton was written in — or rather, the applications for the Newton were written in — a language called NewtonScript. It's a vague relative of JavaScript and of AppleScript, which is still used in OS X today. But that wasn't the original plan. It was originally meant to be written in a language called Dylan. Dylan is quite interesting; I suggest reading the Wikipedia page, it's a good overview. Dylan is still around — there's an open source compiler for it now, and development continues. And Apple's plan was quite bold: they planned to write the whole operating system in Dylan, and the applications as well. That's pretty unusual. One language that does the whole stack like that is not the way things work. Unix is arguably the most successful operating system in history; the core of it is written in C, and that's what made it portable. But these days we don't write many applications in C. Dylan, they proposed, would be the whole thing.

And that led me to a discovery. You've probably heard of the discovery: it's Lisp. It's one of the oldest programming languages still in use. It was invented by a guy called John McCarthy, who is sadly no longer with us. There's a lot of truth in jokes sometimes, and I quite like this one: "Programming — you're doing it completely wrong." The thing is, Lisp is a very strange language. It's famously quite hard to read. It's imperative, but it's also functional. But one of the interesting things about it is that it's immensely versatile.

Now, a lot of people are fans of Lisp — people you might not expect; even notable open source luminaries sometimes are fans of Lisp. So, Eric Raymond: "Lisp is worth learning for the profound enlightenment experience you will have when you finally get it. That experience will make you a better programmer for the rest of your days, even if you never actually use Lisp itself a lot." Kent Pitman, head of a company that was involved in the creation of the Common Lisp dialect: "Please don't assume Lisp is only useful for animation and graphics, AI, bioinformatics, B2B and e-commerce, data mining, EDA/semiconductor applications, expert systems, finance, intelligent agents, knowledge management, mechanical CAD, modeling and simulation, natural language, optimization, research, risk analysis, scheduling, telecom, and web authoring just because these are the only things they happened to list."

Dylan was a dialect of Lisp. They planned to do the whole thing in this one language. And I thought: well, does anybody else do that? Did anybody else do that? I went looking for a weird old machine to write about, and I found one. But I also found an awful lot of stuff about Lisp.

Neal Stephenson is an American science fiction writer — I'm a bit of a fan, actually. For one of his books he wrote a little essay to try and explain some context. It's called In the Beginning Was the Command Line, and it's free on the web now.
It's on his website for free as well, in a zip file of various formats, and you can just read it there. I really recommend it. It's 50 pages long, and it's very dated — it talks a lot about BeOS, another favourite of mine, now pretty much dead — but he became a fan too.

This man also wrote about Lisp. I've been talking to people in various tech communities about Lisp and my discoveries and my research, and some of them are getting very annoyed with me. I've been getting a lot of criticism, people telling me that I don't really understand this stuff because I'm not really a programmer myself. For one guy, this quote was what made him reconsider and actually listen to what I said: "Lisp is the Maxwell's equations of software." Maxwell's equations are the fundamental laws of electromagnetism — if you're not a physicist, a bit impenetrable, but if you are, they're as profound in their way as E = mc², and a bit more useful. Alan Kay, of course, is the guy who invented Smalltalk. When somebody who is famous for creating one of the world's greatest and most influential programming languages goes and praises one of the other greatest and most influential programming languages, I thought: that's interesting.

And then this guy — another person I really recommend reading. I don't know if you've heard of Paul Graham. He's a venture capitalist these days; he runs a company called Y Combinator. He has a series of essays on his personal site which are well worth reading — they're very informative. Boiling down one of the essays: basically, he made his money because he wrote the first general-purpose e-commerce system you could sell to customers — they could come to your site, customise it, set up an electronic shop, and sell goods over the web. He wrote it with a team of six people in a year or so. It was called Viaweb. It did extremely well — everybody's forgotten about it now — so well that Yahoo bought them, for several hundred million dollars, which is pretty good for seven guys.

And once Yahoo bought them, apparently Paul Graham and a few of his team were in a meeting room at Yahoo's HQ, and Yahoo said: so, we have whole teams of programmers working in all the major languages — what did you write it in? Is it C++? Is it Python? Is it Perl? And he went: no, it's in Lisp. And they went: oh, we don't use Lisp here; we don't have anybody who does Lisp. And he said: well, you know, it's Lisp that made it possible to write this small but powerful piece of software that you've just paid lots of money for, so I guess you're going to have to start doing Lisp. And Yahoo said: no, we don't do Lisp. So Yahoo set about rewriting it in C. They put, in the end, 300 people on the project. It ran for about four or five years. And the end result famously contains, in the middle of the code, about half of a Lisp interpreter, because that was the only way they could get the logic to actually work.

And enough people know this happens that it became a joke. Philip Greenspun was one of the luminaries of Lisp back at MIT, and he said this: "Any sufficiently complicated C or Fortran program contains an ad hoc, informally-specified, bug-ridden, slow implementation of half of Common Lisp." He called it his tenth rule because he wanted people to think he was the kind of guy who just came out with these wonderful aphorisms. Actually it's the only one, but you know. I began to get curious. There are lots and lots more people writing pithy stuff about Lisp.
There's a page on Graham's website just called Quotes. It takes a minute or two to read, and it's quite interesting. You come away from it and go: all these people think it's so important — why isn't the world using this?

So what I found was that, once upon a time, unlike Smalltalk, there were computers that ran nothing but Lisp. They were called Lisp machines. Several companies made them. Probably the only major one still around is Texas Instruments, which sold them for some years. But the biggest, most important company — to give you an idea of their significance, they registered the first ever .com domain on the whole internet: symbolics.com. That's how big these guys were in their time.

And there's one guy who used to work for Symbolics, Kalman Reti, who is still out there maintaining these machines, because some people still use and work on these 30-plus-year-old computers. He has a couple of presentations on YouTube showing the operating system, which is called Open Genera. It's out there — Symbolics no longer exists, so it's in legally hazy territory — but you can run it on an emulator under 64-bit Linux these days; it's too big and too complex to run on 32-bit machines. And in one of his presentations he made this comment: "We took about 10 to 12 man-years to do the Ivory chip" — that was their first single-chip processor to run Lisp — "The only comparable chip that was contemporaneous to that was the MicroVAX chip over at DEC." You saw some MicroVAXes in Diomidis's presentation; the VAX was the first machine that had virtual memory support on Unix. "I knew some people who worked on that, and their estimates were that it was 70 to 80 man-years to do the MicroVAX." Symbolics, a much smaller company, did it in 10 to 12 man-years. That's a very big delta in productivity. That's substantial. That's the stuff that makes you sit up and pay attention, I think. Man-years, of course, are slippery: 70 to 80 could have been 100 people for a few months, or one person for 70 years. But 10 to 12 against 70 to 80? I thought that was quite impressive. So I got to wondering what it was about these machines, about their architecture. Why were they so much more productive? Why is it all gone?

This led me to discovering what I call one of the first big lies — one of the biggest lies. It's this: computers today are better than they've ever been. They're better in every way. Obviously they are thousands of times faster and hold thousands of times more stuff, but also — the story goes — the operating systems are better, the languages are better, the applications are better. The idea is that early computers were very simple, and they got replaced by slightly better computers, which were replaced by slightly better computers, until we got to the ones today. This gives me an excuse to use my favourite German quote — please forgive my terrible pronunciation: "Das ist nicht nur nicht richtig, das ist nicht einmal falsch." It's not only not right, it's not even wrong. Max Planck. Or similarly, John Glenn, the first American to orbit the Earth, was allegedly asked in an interview: what did it feel like right before launch? And he said, allegedly: I felt about as good as anybody would, sitting in a capsule on top of a rocket that were both built by the lowest bidder. That is where we are today. The story about computer evolution that you have doubtless all just implicitly taken on board is completely wrong. It didn't happen like that. That's one of the really big lies.
Here is how it really happened. The first computers, of course, were mainframes: huge room-sized things that cost millions of dollars. They evolved, they got faster, they got more capable, until they could run tasks so quickly that you would feed a deck of cards into them, they would rip through the cards, print out the result, and be ready for the next one. Decks of cards backed up behind the computer — which was very expensive, broke down all the time, and had a team of people maintaining it — but you ran it as much as you could. Then they got a bit more powerful still, so that several people could run tasks on them at the same time. Multitasking: big deal.

Then they were replaced by minicomputers, principally from DEC, Digital Equipment Corporation. Minicomputers were much smaller — the size of a desk or a filing cabinet, eventually shrinking down to a deskside thing. They were cheap enough for a department to afford, not just a whole company, so a small number of people would share them. But the mainframes by that point had very rich operating systems based on a technology you might have heard of: the hypervisor. They used virtual machines. This is the early 1960s. All of that was thrown away in the minicomputers. It was all forgotten. Because processor time on the mainframe was so precious, the peripherals were highly intelligent: they had their own simpler processors in them, they did the work, and they communicated the results to the mainframe. All of that was thrown away with the minicomputers. All forgotten. They started again. The first minicomputers were again very simple, very stupid; they could do one thing at a time for one person. They gradually got more sophisticated and smaller.

And they were replaced by workstations — initially deskside cabinets, like the Xerox Alto; eventually little pizza-box things on your desk. Very powerful, very capable, dedicated to one person — which usually meant somebody rich and important and senior, because that's the way it goes. The workstations mostly ran Unix; that's one of the places Unix got its first foothold in commerce, in industry. And they threw away a lot of the stuff that had been learned on minicomputers. The DEC VAX minicomputer I learned to program on at university had version control built right into the file system. When you saved a file, it automatically kept every previous version of that file, so you could go back to any arbitrary version. The version was part of the file name — LOGIN.COM;1, LOGIN.COM;2, LOGIN.COM;3 — it was right there. All of that was lost when we went to Unix. All the stuff they could do: gone, thrown away.

Then we got microcomputers, little 8-bit things. The first ones were pathetic: they could take a maximum of 64 kilobytes of memory. So they had nothing. Multitasking? Not enough memory. Hard disks? Too expensive. Choice of programming languages? They're toys, they're for children — give them BASIC in a ROM chip, they'll be fine. All the clever stuff that was in the workstations: thrown away. The first ones could handle floppy disk drives and had basic file systems — too expensive, thrown away; in the next generation, you plugged in a cassette recorder, and that was all you got. And then, of course, famously, in 1981 IBM decided to get into this growth market and launched the IBM PC. Everybody forgets now that the original IBM PC didn't have a floppy controller as standard, didn't have graphics at all, didn't have sound at all.
It had BASIC in ROM and a cassette recorder port. All of that advance, all of that evolution from all the previous generations: thrown away. They started again. And then slowly these things got expanded and enhanced, they got more powerful, and all of the features that people really missed got put back in. But there's a sort of law of nature that says that if you try to put stuff back in that you left out in the first place, it's never quite as good as if it had been designed in from the start.

In the first draft of this talk I had a section about the Plan 9 operating system, which is what Thompson and Ritchie went on to do next after they released Unix. Unix, when it shipped, didn't include any networking; that came in later, with a choice of two APIs, because, you know, open source, academia. Plan 9 is Unix where everything really is a file, including the network. That's where the /proc file system in Linux came from — but the /proc file system on a Plan 9 box shows all the processes on all the machines on the network. It's a deeply network-integrated Unix. It didn't catch on.

What we've got today are the remote descendants of the worst, cheapest computers that ever were. There's a saying: you can have good, fast and cheap — pick any two. We got fast and cheap. That's what you all run today. Hope you enjoy it. We all think computers are great. They're so powerful now, they've got these fantastic graphics, video, stereo sound, quadraphonic sound or whatever. They are — but there was stuff in the past that did some fundamental things better.

So, back to Smalltalk. It was developed on the Xerox Alto, which was the height of a desk and the width of one. It cost $30,000, and you needed a file server and a network. Nobody bought it. Then in 1983 Apple launched the Lisa: $10,000, with a GUI — no networking worth speaking of, but it still had multitasking. But it was a GUI built in Pascal on top of an assembler operating system. Too expensive; nobody bought it. So in 1984 they launched the Mac: $2,500. In one year they managed to chop the price to a quarter. That's pretty serious. The Mac was very limited: 128K of RAM, one floppy drive. The next year came the Amiga and the Atari ST, and a couple of years after that the Acorn Archimedes, with its own bespoke processor, the ARM chip. Everybody in this room probably has 10 or 20 ARM chips on their person right now; I believe this has about eight of them in it. And they're all dead and gone now — their operating systems, their architectures, all gone. The x86 PC caught up and passed them. In fact, all that's left now is x86 and ARM, to a rounding error.

You go and talk to these people and they're really passionate. The Amiga fans especially are super passionate, and you can sort of see why: in 1985 they had a computer with a GUI and proper pre-emptive multitasking in 512KB of RAM — half a megabyte. Of course they loved them; best graphics and sound in the business at that time. But everybody's writing about Amigas, and I thought: maybe I can write about Lisp machines. So I started researching Lisp machines, and I found that, wow, when you go and read the stuff written by these machines' fans, they make Amiga owners look like amateurs. These guys were really passionate.

So what happened? Well, they were big, and they were expensive. The whole operating system, from the metal right up to the GUI, was written in Lisp.
The processors ran a sort of bytecode-like machine language that worked the way Lisp works, so it didn't require a huge amount of compilation; the compiler could be relatively simple. This is unlike Smalltalk. Smalltalk is still famous, people still use it, it runs on top of ordinary operating systems — it's just another choice of programming language now. Smalltalk is one of the original object-oriented languages, and it's objects all the way down. But you can't really run objects on hardware. I mean, it has actually been tried: if you want to look it up, there was a British hi-fi company called Linn, who make very, very expensive music players, and they made a chip called the Rekursiv — R-E-K-U-R-S-I-V — an attempt at a chip that could do object-oriented software in hardware. It bombed horribly, probably deservedly so. But the other of these two great languages that back in the 80s everybody agreed were the ultimate — Lisp — is different: Lisp makes everything lists. And lists you can implement in hardware quite well. It's a basic abstraction.

But instead we've got C machines. We work at the abstraction layer of bits and bytes and words, and all of the bigger structures have to be built in software — which has now built a multi-billion-dollar industry on daily, weekly, monthly patches for stack overflows and buffer overruns and software vulnerabilities. Finding software vulnerabilities is now a full-time job. Why? Because we use descendants of those very cheap computers, built on a very low-level abstraction — one of bits and bytes. You can put whatever you like in it; I'm not going to check; it's quicker that way.

Of course, not everything's written in C any more. These days we have a huge choice of languages up on top. There's stuff like Perl and Python; there's Ruby; there are all sorts of wonderful languages. There are things like Julia, a very interesting language which has some of the important properties of Lisp. But they're all running on top of operating systems built in layers and layers and layers of C. Nobody would write desktop apps in C these days; they'd probably use C++, a vast language with loads and loads of calls and functions. Nobody knows all of it, but that's okay: you learn the subset you need. And if somebody else has to maintain your code and they don't know that bit — well, tough. To use an old joke: it was hard to write, so it should be hard to understand.

But Lisp, it must be said, is very hard to read. Here is a very, very simple program in Lisp, and it shows one of the things many people dislike about it: look at that pile of parentheses closing the code there. So, going to the repository of all knowledge and wisdom: "These are your father's parentheses. Elegant weapons, for a more civilised age." Nobody can read it. Also, the mouseover text: "I've just received word that the Emperor has dissolved the MIT computer science program permanently." That put a certain Richard Stallman on the streets, and we all know where that got us. But there's another one in here, and there's a line in it — I wish I had a cool little laser pointer, but I don't — the speech bubble in the middle: "My God, it's full of cars." CAR is one of the basic instructions of the early Lisps: C-A-R, contents of address register. It means the first item in a list. You get the rest of the list with CDR: contents of decrement register. The point being, these are low-level, machine-level instructions. You can make this stuff go quick if you build hardware to do it. But we didn't — we built hardware to make C go quick instead.
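To make the car/cdr point concrete, here's a trivial sketch in Common Lisp. It's my own illustration, not the code from the slide, and the function name sum-squares is invented for the example:

    ;; Sum the squares of a list of numbers, recursively.
    (defun sum-squares (numbers)
      (if (null numbers)
          0                                   ; empty list: nothing left to add
          (+ (* (car numbers) (car numbers))  ; car = the first item of the list
             (sum-squares (cdr numbers)))))   ; cdr = the rest of the list

    (sum-squares '(1 2 3))   ; => 14

Note the pile of closing parentheses at the end of the function — that's the thing people complain about — and note that the whole algorithm is nothing but car, cdr and list operations: exactly the primitives a Lisp machine implemented directly in hardware. In Dylan, PLOT or CGOL, which come up in a moment, the same logic would be written with infix operators and Algol-style keywords instead.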
And everybody loves it. We all love our computers today; I've certainly got enough of them. I bought this one in January, used. All that progress we've had has led to a machine I bought second-hand: it's seven years old, it cost me 150 euros, 8 gig of RAM, two SSDs. It's lovely, it's quick, I'm very fond of it. But I do have to install hundreds of megabytes of software updates every single week, just to make sure it keeps working and somebody else doesn't make it their computer instead of mine. But that's taken as implicit — it's another of the big lies. When you have vastly complex software systems, people make mistakes; and if people make mistakes, somebody has to go and fix those mistakes; so you need software updates.

Well, this machine runs Linux, naturally. It doesn't actually run SUSE, I'm afraid, but I've only worked for them for a few months. But it also runs WINE, and in WINE it has Word 97, which this talk was written in, because it has an outliner — I love outliners, and LibreOffice still doesn't have one. And by Jove, Word 97 isn't half quick on a computer made 20 years after it came out. But Word 97 had two major service releases. A word processor — but it needed two major fixes. In the first shipped version, when you saved a Word document it actually saved an RTF file, not a Word document, and nobody noticed. We put up with this kind of stuff. It's believed to be inherent in software design — and it isn't.

Now, once, in the 90s, in the era of Windows, there was a language called Delphi that almost everybody used for writing their software. It was based on Pascal. It's quite high-level; it's type-safe — it can't have buffer overflows and things, because it checks for stuff like that — and it's readable. You don't need to be a comp-sci whiz; you don't need to understand this arcane stuff involving lots of parentheses. But actually, the code on this slide is Apple's Dylan, from 1995 or so. It's Lisp. It doesn't look like Lisp; it looks much more readable, really — it's fairly straightforward. Instead of prefix notation — times, one, two — it's got one times two. It's just Lisp, but with the syntax rejigged to look like what we're all used to: Algol style, basically. And Dylan isn't the only time people have done this. PLOT, written by one of the original Lisp gurus, is the Programming Language for Old Timers — you can tell he's a software guru; he doesn't know much about marketing. That's the same function in PLOT; it's even simpler than the Dylan version. And here's a very early one: CGOL. It's got a great Wikipedia page — I wrote it. Again, it's Lisp, but unpacked. This stuff could be readable.

Now, what I did is I went out and looked at the Smalltalk people, at what they thought of their operating systems and what they think of modern ones, and they are all in dismay at where we've come to. And I talked to the Lisp people — or rather, I read what they wrote years ago. There's a wonderful book I highly recommend called The Unix-Haters Handbook. It's a distillation of years of a mailing list, from the time when universities were taking out all of those expensive Lisp machines and replacing them with Unix boxes — mostly Suns running SunOS, because they were much cheaper and faster. And boy, the Lisp guys were unhappy about that, and they catalogued their woes on the mailing list. The Unix-Haters Handbook — ugh.pdf — is widely available for download. It's really funny, and there's a lot of wisdom in it.
A lot of the stuff they bitch about has been fixed now, but it's worth a read; it's very funny. But it's not C I'm complaining about. I personally don't find C very readable — I don't like it; I learned BASIC on my Sinclair Spectrum, which I've still got (it has a 4-gigabyte SSD in it now; it's hilarious). But it's not about the language. It's about the abstraction.

Loper OS, at loper-os.org, is a wonderful website that I would recommend. It's a blog, and it goes on — it goes on a lot — but to be honest he got most of his points over in just a few dozen posts. I rather like this summary; it's quite harsh: "The computers we now use are descended from 1980s children's toys. Their level of bedrock abstraction is an exceedingly low one. This would be acceptable in a micro with 64K of RAM, but when scaled up to present proportions, it's a nightmare of multi-gigabyte bloat and decay. Witness, for instance, the fabled undebuggability of multi-threaded programs on today's architectures. It stems purely from the fact that truly atomic operations can only exist at the bedrock level. To fully comprehend what's going on in the entire machine requires wading through a vast sea of binary soup, boiled and stirred continuously by an asynchronous world. The futility of this task is why programmers usually aren't given even a sporting chance. Observe the lack of a hardware debugger in any modern computer."

This is why I write rather than doing support work any more: the state of modern software support brings me to tears. I used to be able to master it. I did a project in '95 for a computer magazine. They had a very, very early SSD — it was 16 megabytes in size — and they wanted to benchmark how fast Windows 95 would run on it. And I knew what virtually every single file in Windows 95 did. So I cut a copy of Windows 95 down to fit into a 16-megabyte SSD with some free space, because it needed to make temporary files. I did it. I got it down to 14 meg — because I knew which font the little window gadgets were drawn in. That's no longer possible, and that was only 20 years ago, with a product related to one still on sale. Nobody can know it all now.

Why? Because, among other things, we have this layer cake of different languages, each one chosen because it seems suitable for that particular task. So you write kernels in C, but you write speed-critical applications in C++; and if they're not speed-critical, well, you can do it in Python; and if it's going to talk to a web database, well, Ruby on Rails, and blah blah blah. So we have loads and loads of languages now, and that is perceived as a really good thing. That's one of the small lies. I'm not saying it's a bad thing; I'm saying it's a symptom — a symptom of the fact that we've built a stack on languages which are not great for any arbitrary task. And if we started again, well, maybe we could do something else.

Now, people have tried. There are various operating systems out there, and here is a quick description of one of them. This one was written by a very strange guy called Curtis Yarvin, who blogs under the name Mencius Moldbug. He is very smart, but he is one of the people who got the neoreactionary movement off the ground. He has the stated ambition of causing the collapse of the US government and replacing it with a monarchy. He has also written at some length about the theory that different human races have different innate levels of intelligence. You can guess how that one panned out.
He was invited to a Lisp-related conference, LambdaConf, a couple of years ago, and when the organisers wouldn't back down, almost all the other speakers quit and a lot of the attendees didn't turn up. But he wrote a remarkable bit of software. I do not endorse the man — I've never met him or spoken to him, and I don't intend to — but he wrote a sort of virtual machine to run on existing computers, because he decided they were too broken to fix. He wrote an instruction set for it whose basic unit of storage is a list, and an assembly language for it called Nock; and on top of Nock — because it's an assembly language, it's very hard — he wrote a higher-level language called Hoon, which is awfully like Lisp in some ways. It also has its own network protocol, which is sort of encapsulated over TCP/IP — he doesn't even trust DNS, so he replaces that. It's very weird. But he came to the conclusion that modern software is broken and we have to replace it all — and that this can be done by a small number of people in a moderate amount of time.

Here's another one I quite like, from recently. This was going to tie into something I had to drop, about an operating system called Taos from the late 80s — an operating system built so the same binaries could run on every architecture. It ran on about eight different CPUs with binary compatibility across all of them. Unfortunately the company eventually went under. The guy who wrote the core product has now started writing a new one. This is it: ChrysaLisp. This is his early tech demo, from GitHub, and you will not be vastly surprised, from what I've said, that this time the language he's chosen to build it in is Lisp again. There's something to it. If so many people go on about it, it can't be coincidence.

So why would we do this? This is where I get to the second bit of the talk, which is much shorter. It would take a lot of work to throw away our entire software stack — which most of us make a living from, and which we often actually enjoy working with; I do — and start again with a different language, even a more human-readable one like Dylan. It's a big price to pay. Why would we do that?

Why we would do that is to do with what's in this chip. This is the first shipping product to contain a whole new type of logic gate, called a memristor. Hewlett-Packard are very proud of it. It's basically a memory chip — a memory gate that keeps its contents when you turn the power off. They got very excited at Hewlett-Packard and started writing at great length about what they were going to do with these things once they got them working in the lab. They talked about a project called The Machine, which would be a pure solid-state server keeping all of its working data in memristor memory. Well, they found that it's a very long way from the laboratory to the fab — from lab to fab — and after eight years of work, this chip contains eight memristors. Not eight kilobytes: eight.

But there are others. Intel has one: 3D XPoint. It's a bit like flash memory, but a thousand-odd times faster. This is actually a shipping product — there's a version they're selling as an SSD — and you can actually put this stuff in servers now. Here are some DIMMs you can put in your server which contain, as well as a bit of RAM, very fast flash memory, so your server's memory can be non-volatile. We are heading for a brand new type of computer.
That's the first time a shift this big has happened since about the 1960s, because soon enough we will have a type of technology where you can put, say, half a terabyte of memory in your computer. We already have demos of this sort of machine. Here's one: it's an ARM, it's got 3 gig of RAM and 32 gig of flash, but there's still a separation between the RAM and the flash. Put memristor memory, or 3D XPoint, or anything related to that in instead, and you have a computer with, say, half a terabyte of RAM and no disks — no file systems, no drive controller. All of its memory is directly addressable in the processor's memory map, and when you've stopped using it, you turn it off. You don't suspend, you don't shut down; you just turn it off. Turn it back on, and it's exactly where it was, and it starts working again. You never install the operating system — what would you install it onto? There's no disk drive. At the factory it boots over the network from the manufacturer's server; there's the operating system, in memory, and it stays there, getting patched occasionally. Of course you can still reboot it: if your operating system gets corrupted — a Mac can already boot off the internet, off Apple's servers, and load the operating system straight over the web — you could do that with these as well. It's a whole new kind of computer, and it's coming. This is visibly one of the next big things, but nobody's talking about it very much, partly because it's taken quite a long time to come to market.

We will have machines which don't need file systems. Can you imagine what it would do to Unix if you took away the file system? What's left? The file system is everything in Unix; the whole point is that everything's a file. What if there are no files any more? What if it's all in memory, and there's no separation between RAM and non-volatile storage? You don't need Unix any more. I mean, I know file systems are wonderfully powerful tools. Some of my friends say: what have you got against file systems? Why do you want to take them away from us? I don't. File systems are there to manage block-addressable storage: non-volatile storage that you read into your working store, work on, and write back. That's how they all work; it's how they've all worked since the early mainframes. And in a few years that's going to go away.

We still use shells. I have a t-shirt that says "taking the SH out of IT", and my colleagues all go: what do you want to get rid of the shell for? No, that's not the joke. Shells were designed for teletypes and moved to terminals; we still love them, we still keep them. But file systems exist because of this artificial division between live memory and non-volatile memory, and that's going to go away.

And in our history we have two examples of rich operating systems which were not file-system-centric but data-centric: they kept their work in memory, and at the end of the day they saved their variables to disk; you turned them off, and in the morning you turned them on, they loaded those variables from disk, and you were back where you were. Lisp machines and Smalltalk boxes both worked like that. If you run Smalltalk today on a Linux box — IBM VisualAge or something like that — that's still how it works: at the end of the day you don't save some files, you save the state of your Smalltalk VM to disk, and it comes back. We can kludge this with Unix — hibernation dumps the whole of RAM onto the disk, and when you turn it back on it reads it all back — but it takes gigabytes of storage. These things took tens of K.
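You can still get a small taste of that image-based way of working in a modern Common Lisp. A minimal sketch, assuming SBCL — save-lisp-and-die is SBCL-specific, and other implementations have their own equivalents:

    ;; Build up whatever live state you like in the running image...
    (defvar *customers* (list "alice" "bob" "carol"))

    ;; ...then freeze the entire image -- heap, definitions, data -- to disk.
    ;; This call writes the core file and exits the process.
    (sb-ext:save-lisp-and-die "my-image.core")

Start it again later with sbcl --core my-image.core, and *customers* is simply still there. No files were saved and re-parsed; the whole working state came back. On a Lisp machine or a Smalltalk box that wasn't a party trick — it was the normal way of working.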
There's a really, really big shift coming, and nobody much is talking about it, and we are going to reinvent the wheel again — as we did with intelligent peripherals, as we did with hypervisors, as we did with programming languages, as we did with multitasking. We threw it away, and then we started again in the next generation. This is a bigger shift, and I think maybe there are lessons to be learned. I don't say re-implement Lisp machines, but learn a bit about how they worked, because maybe we can build a new and smaller stack. This isn't really relevant just to servers; maybe there are lessons to be learned from the past, from the side that lost the war. And that's it. Thank you very much indeed.

[Host] Thank you very much for your talk. Do we have any time for questions? We have about two minutes. Are there any questions?

[Audience member] Hi. I'm going to be very rude and make two points. I did have a talk entered into the retrocomputing devroom. You said that no modern computers have a hardware debugger. I had a TRS-80, and with all of that generation of 8-bit market computers you could get the schematics of the motherboard and the chips on it, which meant people could build their own hardware to interface to it. A modern computer will not give you that, and I think that's an example of early open source. My second point: you mentioned Meltdown and Spectre. One of the excuses I saw made for Intel was that they put in speculative execution because programmers couldn't be bothered with parallelism, and they have to keep increasing performance because the market demands it — so therefore they put in these features. What is your opinion on that?

[Liam] So — I don't think we've got any more time, but I'm easy enough to find online; I'm the only Liam Proven on the internet. Come find me during the rest of FOSDEM, or email me — Google me, you'll find me. I would be very happy to talk about this and enlarge on the themes if you'd like. Thank you very much.