Hello, everyone. So the talk is on a rather obscure topic: the Unix command line and text-based user interface ecology. If we simplify it a bit, it all really comes down to CLI ecology, because a text-based user interface is basically part CLI and part GUI, and the CLI usually means Unix. So what am I talking about? What is this ecology? What do I mean when I say CLI? Is it defined by a terminal, a specific device, a specific piece of software that manages and processes user input, the tools you use through the CLI? Or is it the process of software development that you are virtually doing in real time while using the interface? Or is it none of these, or all of these? Do we mean something more whole and more general when we advocate the command-line user interface? I'm fascinated by the idea of simple interfaces, but I'm also amazed at how few studies there are about them. If you google for sources, you'll get a few dozen studies at most, and maybe half of those references will be patents. There's actually a patent filed by IBM in 1999, granted a few years later, that essentially patents the command-line interface; they frame it in relation to network devices, but it's pretty much that: user input, processing, batching and so on. So how about a tiny bit of history? The command-line interface, to the extent that we know it, basically starts with this guy, Louis Pouzin. He's French, but in the early 60s he worked at MIT on CTSS. It was a batch system, which obviously dealt with batches, and he wrote a nifty little program for it called RUNCOM, which gave users a shell-like interface in which they could manage their tasks.
And everyone loved this guy and his program, because through the RUNCOM interface they could launch batches without having to stay up overnight at the university waiting for them; they could just go home and the system would work through the queue. Then, in 64, Louis went on to work with the Multics team before going back to France. It was in that year, while toying with the RUNCOM principles and suggesting them to the Multics team, that together they came up with the word "shell" for it. So that's when and where the term was born, and the Multics team loved the concept and adopted it. By the way, this Louis Pouzin is a pretty interesting character. I don't know if Kirk or anyone here knows him personally, but later, in 72, he started working on the French internet, CYCLADES, which involved the invention of datagrams and packets and heavily influenced Vint Cerf's work on the real Internet. Now, if we want to talk about text user interfaces, it comes down to this guy, Bill Joy, who wrote vi in 76. To make vi work on different terminals, Bill had to invent termcap, a sort of software dictionary of terminal capabilities that allowed him, and others, to easily port the program to many different terminals. Text-based user interfaces had existed before that in other areas, like on mainframes, but it was only with termcap and vi that it became possible to code something once for interactive management of a screen terminal and have it work everywhere. That gave the field a real boost. Then there was this guy Ken Arnold, who in the late 70s extracted the screen-handling parts of vi into a separate library called curses. It's actually more correct to note that it is a library.
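As a side note, the capability-lookup idea behind termcap survives today in terminfo, its successor, and you can poke at it from the shell with tput(1). Here is a minimal sketch; the fallback values are just illustrative, not part of any standard:

```shell
#!/bin/sh
# Sketch of the termcap/terminfo idea: instead of hardcoding escape
# sequences for one terminal, a program asks a capability database
# about whatever terminal it finds itself on. tput(1) is the shell
# front end to that lookup (terminfo is termcap's modern successor).

term=${TERM:-dumb}   # the same variable vi consults to pick an entry

# Numeric capability: how many columns does this terminal have?
# Fall back (illustratively, to 80) if the database has no entry.
cols=$(tput -T "$term" cols 2>/dev/null) || cols=80

# String capabilities: the "enter bold" and "reset" sequences.
# Terminals that lack a capability simply yield nothing, which is
# how one binary can drive many different terminals gracefully.
bold=$(tput -T "$term" bold 2>/dev/null) || bold=''
reset=$(tput -T "$term" sgr0 2>/dev/null) || reset=''

printf '%s%s has %s columns%s\n' "$bold" "$term" "$cols" "$reset"
```

curses, mentioned above, automates exactly this kind of capability lookup behind a C API, so application code never has to see the escape sequences at all.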
And then everyone started using it, and of course the first famous use for it was the game Rogue. Different people kept working on curses, adding support for more and more terminals, and the modern version, named ncurses, "new curses", was born as part of the GNU project in 1993. Ever since, it's basically been more terminal support and Unicode support. So what do I mean by ecology? How did that word get into the topic name? Some time ago I stumbled on a very interesting, captivating field named ecological interface design. What is an ecological interface? Well, most interfaces, and much design in general, are designed quite unscientifically. Let's take a quick look at computer interfaces. This is Xerox, some 40 years ago. Then of course Steve Jobs came by with the Apple Lisa, the first Apple computer with a GUI. Then this is Windows 95, about 20 years later. And another 10 or 15 years after that, this came along: Mac OS X, "the world's most advanced operating system", or so the marketing wording goes. But if we look at these in quick succession, we get a chance to see that the progress made is actually quite small. It would be a stretch to suggest that a team of real scientists sat down in a room, looked at all the data they had, and tried to construct a GUI in a scientific and mindful way. The way these interfaces look doesn't suggest that, at least to me. And it's not just computer interfaces; it's any interface out there.
If we're lucky, we get some artistic qualities, like a sports car dashboard, or of course Hollywood movie computer interfaces, like the ones in Minority Report or The Matrix, which are always fascinating and very interesting. But in the real world we can mostly only look at exceptions, and the exceptions are very rare and far between. This is a fighter jet cockpit, the Space Shuttle Enterprise cockpit, and a nuclear power plant control room. So the exceptions are generally military, space, and very critical industries, most obviously nuclear power. And it's obvious why the exceptions are what they are (I'll touch on that later): those are not consumer products, and they cannot afford mistakes, so they virtually have to hire scientists to sit on their designs and interfaces and come up with something foolproof. But usually we get GUIs like this, which is, I think, a graphical front end for GNU Wget, or consumer products like this treadmill, which has not very positive reviews. When it comes to text interfaces, they tend to be even less scientific. At first they were kept quite cool and mindful thanks to severe limitations; historically the limitations were quite severe, starting with manual control rooms, then slow teletype machines, then early terminals, and finally this, which is almost almighty compared to the older devices. Modern terminal emulators and modern computers are so fast that the quality of software and user interfaces is now completely at the mercy of the particular developers working on new and old programs. So most design today, even when it does get some thought, follows the so-called paradigm of user-centered design.
User-centered design is a very simple paradigm, and there is actually an explicit set of norms about it, which says that user-centered design is about a predetermined, fully defined environment. We understand who the user is, we understand what all the tasks are and when they will happen, and there are no exceptions; the workflow is completely bounded by what we know before it starts. The user is involved in the design: each time we design, we look at the user, at what he says, explicitly or not, and we iterate. And it's all about procedures: if we know all the tasks, why would we need separate tools when we can just provide buttons tied to all the procedures we know about? Ecological interface design came about as an upgrade of that mindset. It started with this guy named Kim Vicente, who is still working at the University of Toronto. In the late 80s he joined efforts with researchers from a few other universities across the world: Jens Rasmussen from Denmark, and some researchers from Japan too. What they had to do was invent ways of looking at those critical military, space and other industries and come up with something more foolproof. Why? Because defence agencies had started looking to university researchers for help with exactly that. By the way, the Apollo program was the first that really engaged a lot of scientists to work specifically on interfaces: they looked at how numbers are laid out on a keypad, whether they should start from 7, 8, 9 and go down or the other way around; they experimented, they tested every possibility, and they tried to come up with a scientific way to build a perfect interface. By the late 80s there was a critical mass of researchers willing to upgrade the field.
And so they came up with this new ideology of ecological interface design, which was centered not on users but on workflows. They analysed the work domain, as opposed to analysing separate user tasks and separate users, and they focused on the cognitive part of the control we exert on machines, not on the key presses and button placement that some designers might focus on ("let's group these buttons together"). They looked at how our minds process interfaces: we tend to exert control by building, internally, mental maps of interfaces that are much larger than what we see visually. What this approach allowed for was unanticipated tasks, because when you're in a fighter jet, an Apollo spacecraft or a nuclear power plant, most of the dashboard functions in front of you are not for anticipated tasks; they are for unanticipated ones. If you fly an F-15, you don't know which model of fighter jet the North Koreans or the Russians will fly against you; if you're in a nuclear power plant, you don't know whether it will be a tsunami, a structural failure or something else. These interfaces are for unanticipated tasks. The approach is based on the notion that some systems are so complicated that users, even professionals, can master only a partial understanding of the whole system, of the whole entirety of its constraints and complexity. Users don't fully understand how the systems work, and they have to cope very quickly with very diverse emergency tasks that they cannot anticipate. That's the sort of environment the new ideology was built around. And the fruits of the research weren't specific to these emergency-style interfaces.
In fact, the area is so fascinating that when I first stumbled upon it, I immediately recognized the classic command-line interface as a very good example of ecological interface design. Why? Because, at least when we look at the FreeBSD base system, or any BSD base system, it focuses on instruments, on empowering you with instruments for arbitrary situations, as opposed to the fixed procedures that GUIs are usually biased towards. The classic CLI also implies that we don't understand the system completely; in fact, it does a really good job at progressive exposure of different parts of the system. We may start using a Unix system with just a few basic commands, then learn more and more, feeling more empowered, and at each tier, at each level, we feel in control. And of course it's workflow-based: even though it's not about fixed procedures but about instruments, from the beginning it was built around certain workflows. In Unix's case it was about various filing and administrative workflows, and what the designers did was find the right abstractions in the workflow and materialize them in the form of simple, atomic commands. Speaking of workflow-based design, another fascinating topic is flow-based programming.
Some months ago I had grown quite tired of, well, like most companies we develop software, and I was tired of looking at how programmers cope with their tasks. If you look at current enterprise coding, much of it will rub you the wrong way, because people tend to develop the same stuff over and over again and make the same mistakes. So I was googling for new paradigms in software, and the searches kept redirecting me to flow-based programming and this guy, J. Paul Morrison. In 1971 he wrote up an approach he first called data-responsive programming and later renamed flow-based programming, and from then on he was a huge advocate of it. He is Canadian, and he convinced a major Canadian bank to switch to this paradigm, starting in the early 70s. I keep forgetting the bank's name, and I guess nobody cares to know it, because everywhere flow-based programming is mentioned, the bank is mentioned simply as "a major Canadian bank". Never mind, it's real: they switched to this paradigm back then, some 40 years ago, and they still run the same framework. It's sufficient for all their needs, and the development is really simple. When you look at the principles of flow-based programming, it's all about building little black-box-style pieces of software, very versatile and very atomic in the conceptual sense, and then programming boils down to interconnecting all these black boxes, all these tools, with each other. And if you have enough experience with shell code, you instantly recognize that this is exactly what you're doing when you code Bourne shell.
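The black-box wiring described here maps directly onto a Bourne shell pipeline. A minimal sketch, with hypothetical log data, where each stage is one self-contained "component":

```shell
#!/bin/sh
# Flow-based programming, Bourne shell style: each stage is a black
# box that reads lines on stdin and writes lines on stdout, and the
# pipe operator is the wiring between the components.

# Hypothetical input: a tiny access log, one request per line
# (method, path, status).
printf '%s\n' \
    'GET /index.html 200' \
    'GET /missing 404' \
    'GET /index.html 200' |
awk '$3 == 200 { print $2 }' |   # component 1: keep successful hits
sort |                           # component 2: group identical paths
uniq -c |                        # component 3: count each group
sort -rn                         # component 4: most frequent first
```

Each stage can be developed, tested and swapped out independently, which is exactly the black-box property Morrison describes; the only contract between components is lines of text.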
It feels really natural, because that's what we do in the real world: we don't build everything from scratch, we have a lot of tools and we know how to interconnect them; we have people, and we know which information to pass back and forth. It's really addictive. Most people say that about Lisp too, but with flow-based programming it really does seem to be addictive. So there's a bank using it, the Unix shell is all about it, and I instantly understood that I had been coding in this FBP manner for years; that's why any other style of coding feels a bit unnatural to me.

So, back to the CLI for a while. What, again, is it defined by? The basic objects: the terminal, or whatever media device; the interpreter; and the tools you use through the interpreter. As for terminals, we saw a few pictures a few minutes ago, but their terse history is very simple. From the start of electronic computing in the 40s, people used manual switching, punch cards and basic teletypes. Then in 63 this revolutionary Teletype Model 33 came around. It was really cheap, like ten times cheaper than anything in its class, just about $700 for a keyboard, and it really revolutionized access to computers. Seven years later, DEC came out with a fascinating series of hardware terminals, starting with the VT05, which was already quite capable: it displayed 20 lines of 72 characters. A few years later came the VT100, of course, and then in the 80s more and more computers had bitmapped access to graphics cards and screens, so terminal emulators started springing up here and there.

And a quick look at the parallel universe: what was happening in the IBM mainframe world, which was all about terminals. It was pretty straightforward, and the interfaces, I assure you, haven't changed a bit in 50 years. This is what you see today when you use the time-sharing extensions of the very latest version of the IBM mainframe operating system, z/OS. It's half command-based, half text-based user interface; most interaction happens through these text-based user interface extensions, ISPF on top of TSO. Pretty straightforward.

Back to the real world. With terminals it's simple: emulators started springing up in the 80s, and we still use them every day. With interpreters, we saw that Louis Pouzin came up with RUNCOM in 63, which then evolved into the Multics shell, and from there it was a pretty consistent history of shell evolution, culminating in the Bourne shell and the C shell during the years when BSD was most actively developed. Of course, if you're not using tcsh by this time, you're doing something wrong. But it's not just about shells like these; it's also about simpler shells, like the ones FreeBSD uses for the different boot stages, such as boot0 or the loader. There are also legacy systems: mainframes, but also mainframe-like systems, like IBM's enterprise Power systems, which use the REXX language. It's quite arcane and obsolete, but it's kind of Perl-like, people love it, and it actually has a sizable open-source community around it, dating back to 1979 but still active today. Then there's PowerShell: I think the most important thing Microsoft introduced in PowerShell was structured pipelining, the ability to automatically serialize objects when you build a pipeline of them; otherwise it's mostly based on Unix. It has access to various proprietary APIs, but they took most of their inspiration from Unix. And of course there are Ruby, Python and interpreters for almost any other modern language; I know a lot of coders who actually do most of their tasks in, say, an interactive Ruby shell.

And it's not just about interpreting your input; it's also about how exactly you input. For example, three or four, maybe five years ago I switched to vi mode on all my shells, and yes, when random people come up to my terminal and try to do something with it, they get slightly surprised, but I tend to think it's far more efficient. And it's not just the way your input is edited, but also history management and everything else that helps you along.

The most important and biggest part of it all is, of course, the tools and components of the system, the black boxes that we build connections between. The major players in this area were, of course, Unix, which bootstrapped and ignited the whole process of component development, and then BSD and GNU. Of these I think BSD played, and possibly still plays, the critical role, because at a critical time it was the first to attract a lot of researchers from many universities around the world, and, by virtue of Kirk here and a few of his old pals, to sift through the best innovations in command-line utilities and merge them into something much more usable than the original Unix, and much more usable than the later GNU reincarnations: the very clean and concise base system we still enjoy in current BSD systems. Then there was GNU, which worked mostly from man pages and other documentation to reimplement everything; what is kind of obvious about their utilities is that they were happy to say, okay, the command-line switches are too short, let's make them longer and let's add more of them. But the original concept across all three camps was, of course, "do one thing well" for each utility, and by that virtue we get flexibility and interoperability, because no one steps on anyone else's toes.

And all of the interconnections between all of those tools were text-based, which was also quite an advanced and virtuous decision, because text is very versatile; it allowed so much innovation to be built around the concept. Sooner or later, getopt became the standard way of processing command-line arguments, mostly options and their parameters. There are still two main flavors of it, getopt and getopt_long, and a growing number of other flavors, which unfortunately mostly just focus on one or more of error handling, automatic usage-message generation, and casting parameters to correct values, so they're mostly useless. I did a quick getopt study, just some grepping of the FreeBSD base system: there were a few hundred getopt calls, averaging just about seven options per call, meaning about seven options per tool, and a significant portion of those require parameters. I couldn't get hard data in time, but from subjectively looking at base-system scripts and at scripts and shell histories in the wild, only a fraction of these options is actually used in the real world.

The other way to extend functionality is by using subcommands. Take RCS in the FreeBSD base system, which is composed of several different tools, like ci and co, check-in and check-out. Then CVS, Subversion and Git come along, and instead of finding a modular way to fit themselves into those basic abstractions, what is easier for them, what makes sense for them, is to introduce a whole other namespace: it's now cvs ci, cvs co, which makes sense but actually introduces more complexity. So what I think is happening is a massive specialization of both tools and their options and parameters, which invariably leads to a loss of concepts. In the original and classic BSD Unix, most of the tools are almost purely conceptual: there's no special tool to copy files from hard drive to floppy, there's just cp, which copies anything, so it's very close to the concept. But now, especially with 25,000 or 27,000 ports, we get massive specialization: more and more utilities focused on very narrow tasks, with no concept to describe them, so it gets harder and harder to memorize and to grok the meaning of these overspecialized tools.

I found a few studies comparing CLIs to GUIs and text user interfaces. One of them described quite an interesting experiment: they took two groups of people and tried to teach one of them a command-line interface and the other a GUI, both interfaces doing the very same thing, controlling a very simple process, and both still requiring learning. It turned out that people learned both interfaces equally quickly and remembered them equally well. But another study says that if you ask users afterwards, they will always, or mostly, prefer the GUI, because the perceived ease of use and the perceived ease of learning are much higher for visual interfaces, even though that isn't actually true: the factual qualities are similar, but the perceived qualities are, for many people, much higher for visual interfaces.

Just a couple of other notes you can pick up from command-line interfaces. Think about how they serve you in emergency procedures (backup? what backup?), when you really need to do something to solve a very massive problem really quickly. Subjectively again, I think it's much more efficient to get to the right stuff, whether it's log review, quick fixes or quick scripting, with command-line interfaces than with GUIs. And what's fascinating is that the military solved the same problem: everything they design is about emergency procedures, about what happens when the nuke falls, and seat belts are closely related, because when you have emergency procedures to do something very big very quickly, you must make sure you don't do it by mistake or without meaning to. That's why you'll sometimes find programs whose names could literally have been much more similar, but on purpose their creators chose names more distant from each other, so as not to let people mix them up. The other piece of wisdom is something I would call Lempel-Ziv-style naming. LZ compression is about finding the stuff that recurs most frequently and assigning the shortest code to it, and that's classic Unix all over: whatever you use the most tends to have a very short name. If you marry that to the right names, contractions of real words, you get instantaneous mnemonic retention, because it all adds up: if the contraction is very terse, that means you use it very frequently and it takes no effort to learn; if it's slightly longer, it's that much more similar to the original word for the concept. So it's really easy to learn Unix, for most people. Thanks. This is a kind of experimental, obscure and little-covered topic, so for the three to five minutes we have, you're welcome to discuss any part of it; if not, then off to refreshments.