I have to thank the program committee for selecting this talk, and all of you for coming to see something which is really offbeat, I think it's fair to say. And thank you to Venkat for his wonderful keynote yesterday, and to many others, for talking about the joy of being outside the mainstream. I presume some of you have come to hear a bit about that, and I think I can promise you that the likelihood of APL becoming mainstream within the next decade is quite small. Two decades I can't promise you, but for ten years we're good. I should also apologize to you all in advance: I'm going to run out of time. My brain and my heart are so full of things I want to tell you, and I only have an hour. So I need to justify that APL is a functional language and give you a brief introduction to it. Hopefully enough of you will find it intriguing that maybe I'll get invited back next year to do a much slower talk, maybe even a workshop. Anyway, APL has been outside the mainstream, I think, always. This is perhaps the most famous quote about APL that you'll find if you Google. Dijkstra was not impressed. Actually, if you visit these links (I'll be uploading my slides so you can follow them), he did have a more nuanced view of APL, and you can read about that. But really this is about learning how to read, right? There are parts of this statement that I really like; you may be surprised. I decided to keep this last one as something that I see as positive, and I think we heard yesterday too that laziness is good, the right kind of laziness. APL, I hope you will be convinced, helps you do more with less effort. Now, since it's been such a long time since APL was invented, it's probably worth spending a little time on the history. The inventor passed away back in 2004, and not because he died young: Ken Iverson was 84 when he died.
And as you can see, his path into computing and mathematics was unusual; of course, these were very early days, so everybody's path was a bit strange. He grew up on a farm, of Norwegian descent, in Alberta, Canada. He finished the one-room school and went to work on the farm, but fortunately for him, one could say, he was drafted by the Canadian Army, and he completed high school in the Army. His service mates, in fact, told him that there was this thing called university, after they had seen his love for mathematics; he really enjoyed teaching his fellow soldiers mathematics. One of his mates even told him that if he didn't go to university after he left the service, he was going to come and beat him up. So Iverson went to Queen's University in Canada, and went on to do his doctoral work at Harvard, because he graduated top of his class in Canada. He worked with some fairly high-powered people, later Nobel laureates in economics, and did work on matrix mathematics. He taught mathematics at Harvard, but he found it very frustrating; he found mathematical notation quite frustrating. And when Harvard decided that he wasn't good enough for them, he left and went to IBM, where he continued to work on what was then Iverson notation and called it A Programming Language. At IBM, after he had used it for teaching and for modeling complex systems for many years, an interpreter for the language was finally developed. So the language actually existed on paper and on the blackboard, and was used for teaching, before it became a programming language. The first APL interpreter appeared in 1966, and Ken Iverson received the Turing Award in 1979 for his work on APL and on educational uses of the language. So, a little bit about me: this picture is supposed to indicate that I also have some Norwegian roots, if you recognize the flag, and that I'm a cyclist.
My mother is Norwegian, my father is South African, I live in Denmark and work in the UK, so I'm really a mixed kid. I'm really pleased to be here. I was born in the same year that Iverson's book was published, so I feel some destiny there. I started out doing the usual things: I tried to solder. I soldered all of these components onto a board and nearly got them to work. But fortunately, two years later, I was rescued: I met APL, and since then I've done little else. I did write one program in a whole bunch of different languages, but none of them caught my fancy. So I've been doing APL perhaps since before most of you were born. And I'm a failed academic: I have a first year of math and computer science from three different universities. But working with APL has worked out quite well for me, and for the last ten years I've been the CTO of a company called Dyalog, which is an APL vendor. So be warned that I sell APL; that's my job. Dyalog, the company, is the youngest of the APL vendors; it's only 35 years old. We released version one of our product in 1983, on Unix. Today we deliver it on a variety of platforms, the latest being the Raspberry Pi, where there's a free version if you're interested in that. We're moving on to the Mac and Android, and other platforms will follow, I think, although they're not typical analytical platforms, which is where the sweet spot for APL is. So we definitely haven't crossed the chasm yet. We had slow growth for 25 years; we now have what we call rapid growth, but if you look at Bruce's curve, we're still down here. Although you could argue APL actually had a wave on the mainframe in the 70s and 80s, came down again, and now I think we're seeing changes in the way people want to program, which means APL is becoming more relevant again. So I'm not sure that sine wave only has one peak.
So returning to Iverson's story: what really frustrated him was that when he looked at mathematics, there were all these completely different syntaxes for expressing different things in mathematical notation. Not only were the syntactic forms very varied, so were the precedence rules: if you put a, b, and c next to each other, that might mean different things depending on whether one of them was, say, a trigonometric function, whether you were meant to multiply first, and so on and so forth. And when you get to matrices, which Iverson was involved with, things got a lot worse. So he came up with this new notation, which is very much simpler. I hope you can read that at the back; otherwise, if you're interested, move forward. The syntax of APL has only a very small number of forms. You can have a function used prefix, with one argument: for example iota, the index generator, applied to 6 returns 1 2 3 4 5 6. Or you can have a function with a left and a right argument: 1 2 3 times 10 10 20 gives you a vector back, so when multiplying two lists, a map is implied in APL. APL doesn't have functions as first-class citizens, but it has things called operators, and the slash here is something which takes a function as an operand. The slash is reduce, so this is a times-reduction, and that gives you a derived function which multiplies the elements together. You can also have a dyadic operator, an operator which takes two functions and derives a function taking one or two arguments. In this case it's the vector product: the plus-reduction of a multiplication map. But in APL that's completely general; you can apply any functions there. And then there's indexing, which is seen as a rather non-functional piece of syntax, but it's very useful. One of the important things in APL is that you can index with an array: this is selecting four items out of a list of characters in one go.
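The handful of syntactic forms just described (a monadic function, a dyadic function with an implicit map, the reduce operator, the inner product, and indexing with an array) can be sketched in Python. All the function names here are mine, chosen only to mirror the APL being described:

```python
from functools import reduce
import operator

def iota(n):
    # Monadic iota: the index generator, 1..n (index origin 1)
    return list(range(1, n + 1))

def times(a, b):
    # Dyadic multiply: the map over equal-length vectors is implicit in APL
    return [x * y for x, y in zip(a, b)]

def times_reduce(a):
    # The slash (reduce) operator applied to multiply: a derived function
    # that multiplies all the elements together
    return reduce(operator.mul, a)

def plus_dot_times(a, b):
    # The inner (vector) product: a plus-reduction of an implicit multiply-map
    return sum(x * y for x, y in zip(a, b))

def index(chars, idx):
    # Indexing with an array selects several items in one go (origin 1)
    return [chars[i - 1] for i in idx]
```

In APL all five of these are single glyphs or glyph pairs; the sketch just makes the semantics explicit.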
And of course you can name all the various parts: you can name arrays, you can name derived functions, and in the latest release that we just put out you can even name operators; before that, we couldn't. Here the derived function is a reduction, so it's one times one, plus zero times two, plus two times three: a vector product in one shot. The classical vector multiplication from mathematics is written in APL as plus dot times, and you can put any functions there; if I don't run out of time, we might see some examples of that. Okay, I'm not going to have time to show you this live, but up here this is tryapl.org. If you want to play around, there's an online REPL that you can go to. In this case I've hovered over the question mark with my cursor, and it pops up the help and shows you the monadic form, which is the prefix form (we use the term monadic; sorry, we started using that before monads became popular), and the dyadic form, the infix form where we have two arguments. And you can see there's a whole bunch of mathematical functions. There are structural functions: reshape, catenate, catenate on the first dimension, reverse, transpose, take, drop, enclose, depth. And you'll see some of these do sorting and searching. But this is basically all you need to learn; this is the entire language right here. And when you're working in the REPL, you have those symbols available to hover over. Okay, so what did Iverson do? We don't have time to wait for all these animations. He lined the notations all up, and with his new notation (this is what guided its design) he found that he could say: ab is a times b. e to the x is the monadic power function; if you don't have a left argument, it's e to the power of the argument. That one's obvious. Log to the base b is the exponentiation symbol with a circle around it, and you can see it looks like a log that's been chopped from one end.
There's a lot of that kind of mnemonic symbolism in the choice of the symbols in the language, to make it easier to learn. The nth root is a to the power of the reciprocal of n. Vector product and matrix product are all the same: if it's a matrix, it takes the rows on the left and the columns on the right and repeatedly applies the map-reduce. f g x is f of g of x. f plus g of x is (f plus g) of x in the latest release; that's something that's been added in the last decade or so. Tangent is the third circle function, three circle omega. These two forms: the plus-reduction of four times iota six is the big sigma, and the times-reduction is the big pi. And this one here, I don't have room for on the slide, but it's minus b plus-catenate-minus the square root of b squared minus 4ac, all divided into by 2a; the catenated plus and minus executes both between the two parts. So you can see that for somebody coming from a mathematical background, translating formulas into APL and starting to work with data should be quite easy. This is really the origin of the name of the language: you have to remember that Iverson was a mathematician, and he was looking for something in which to express steps in mathematical reasoning. It's a terrible name today, because people compare it to classical mainstream programming languages and say, no, it isn't one. Okay, so the fundamental rules; there are very, very few. There's only one data type, the array. You could say that actually there are numbers and characters, but for us the numbers are everything from Booleans (true and false are 0 and 1) all the way up to complex numbers; APL will just promote the type as required. Characters are now everything in Unicode, and any item of an array can be another nested array, so you can nest these things. And note that a single number is also an array, just a zero-dimensional array: a vector has one dimension, a single number has no dimension. As we've seen, for most primitive functions, the map is implicit.
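The quadratic-formula example above, where a catenated plus-and-minus applies both signs in one expression, can be sketched in Python like this (the function name is mine):

```python
import math

def roots(a, b, c):
    # (-b (+,-) sqrt(b*b - 4ac)) / 2a : evaluate with both signs at once,
    # mirroring the catenated plus-and-minus in the APL expression
    d = math.sqrt(b * b - 4 * a * c)
    return [(-b + s * d) / (2 * a) for s in (1, -1)]
```

In APL the pair of results falls out of one expression because plus-catenate-minus produces a two-element vector, and the map over it is implicit.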
All functions are either prefix or infix; that's it. We can have one or two arguments. All operators are postfix or infix. And there's a very nice interaction between the fact that functions are prefix and operators are postfix, which is what makes the language work, in my opinion. The order of execution: people say it's right to left, because Iverson selected what is essentially the classical syntax of mathematics, where f g h x is a chain of function applications going from right to left. Okay, there was a little star up there where I said our arrays are immutable. That was true until we added object orientation and you could put object references into an array. So you could have something in the middle of an array which is a reference to a .NET collection, and of course that's not a value type. Once you start playing with objects, the immutability goes away, but that's probably not a big surprise. Okay, enough slides to begin with; let me try and show you some of this in action. Right, so first thing: the lamp. This symbol is the comment symbol; it's a picture of a lamp that illuminates the code. APL is interactive. Basically the way you use APL is that you sit here in this REPL, and that's how it's been since 1966. This is not a recent invention; it was always like that. One two three plus four five six: the map is implicit, and multiplication is the same. Here's a prefix function: the exclamation mark, used prefix, is factorial (the gamma function in general); with two arguments, it becomes the binomial. And here's something I inserted after seeing the keynote yesterday: it's not all numbers. You can have an array of character vectors, and we can find Nemo. I need to enclose this here, because otherwise it would be a four-element vector at too high a level to compare to the list of vectors. I can also reverse it and ask which of the elements of names are a member of Nemo.
Because really, finding Nemo is this function. We saw iota, which generates the numbers from one to n; in the dyadic case, it asks where in the left argument you will find this thing. So Nemo is in the fourth position in this argument. Sorry? Yes; well, actually I didn't want to talk about that, but you can switch the index origin to zero. And of course everybody who's programmed in assembler or C or anything like that wants to do that. But the people who are most successful with APL are any kind of engineer other than a software engineer, and, I'm sorry, the people who became millionaires writing their own software in APL typically didn't have a software engineering background: chemical engineers, financial engineers, and so on. If you look at Excel and a number of tools that are aimed at that audience, they are index origin one. Of course, once you include a .NET object in one of your arrays and you start indexing into it, it does the indexing, and you end up with a horrible combination of index origins; that's what happens when you do interop. So here's the list. You want to double the list now; I guess it's no surprise that you multiply it by two. There's not a lot of rigmarole, procedure, and protocol here. Okay, here's a nasty example, just to highlight the syntax. You probably weren't expecting minus two. APL has to use a high minus, a different symbol for the sign than for the function, because otherwise the syntax would be ambiguous, as I think it is in many other programming languages. Essentially, you would have to put the parentheses in like that: it's ten divided by five minus two, and not what you might expect. Iverson's reasoning for that is this: once you have 30 or 40 functions, the idea of having precedence is just stupid; nobody can remember it. Most functions have, as I said, both prefix and infix forms, and typically there's a relationship between the two.
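The no-precedence, right-to-left rule just described can be made concrete with a toy evaluator. This is a sketch of the rule only, not of real APL (no vectors, no parentheses, names are mine): every dyadic function binds equally, and the rightmost application happens first.

```python
# Toy right-to-left evaluator for flat APL-style expressions:
# tokens alternate number, operator, number, operator, number ...
OPS = {'+': lambda a, b: a + b,
       '-': lambda a, b: a - b,
       '*': lambda a, b: a * b,
       '/': lambda a, b: a / b}

def eval_apl(tokens):
    acc = tokens[-1]                       # start from the right
    for i in range(len(tokens) - 2, 0, -2):
        acc = OPS[tokens[i]](tokens[i - 1], acc)  # fold leftwards
    return acc
```

So `2 * 3 + 4` evaluates as 2 times (3 plus 4), which is 14, where conventional precedence would say 10; with 30 or 40 functions, one uniform rule is the only thing anybody can remember.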
The prefix form often has a fixed left argument, as with exponentiation. So: generally read from left to right, but execute from right to left, and I'll show you an example of that. So, functions and operators: plus-reduce. I'm going to have to go really fast here. Plus-scan is where you get the partial results: one, one plus two is three, one plus two plus three is six, and the final sum is ten. So you can generate all of those in one go, which is quite useful. Really, I only wanted to show you this. The vector product we've already seen, so I'm repeating myself; and of course it works with characters as well, and with other functions. This is or-any: the or-reduction of equals. Are any of the characters here the same? It's the or-reduction of this one zero zero, and yes, one of them is the same. I keep hitting the wrong key here. Okay, we talked about that. So here's a user-defined function: square root. Omega is the right argument; alpha is the left argument if there is one. And because it's using a function which is in fact both shape- and rank-invariant, you could pass it a seven-dimensional array and it would still work. So we're sort of allergic to types, in a way, although I do realize that types have their uses; we do have customers with 50 million lines of code who would really like some static type checking of their application. Okay, so here's Pythagoras: the square root of the plus-reduction of alpha and omega to the power two. So we're doing a map-reduce there. And you can use user-defined functions anywhere that you could use a primitive function, so I can do a plus dot Pythagoras: that does the Pythagoras map and then the plus-reduction, adding all these numbers up and giving us 16 at the end there. Okay. Even something like generating random numbers is a function which takes an array as the right argument.
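Plus-scan and the use of a user-defined function inside an inner product, as described above, can be sketched in Python (`itertools.accumulate` is Python's scan; the names are mine):

```python
from itertools import accumulate
import math

def plus_scan(v):
    # Plus-scan of 1 2 3 4 gives 1 3 6 10: all the partial sums in one go
    return list(accumulate(v))

def pythagoras(a, b):
    # The user-defined dyad: square root of the sum of the squares
    return math.sqrt(a * a + b * b)

def plus_dot(f, a, b):
    # Plus dot f: map any dyad f over the argument pairs, then plus-reduce,
    # just as a user-defined function can replace times in the inner product
    return sum(f(x, y) for x, y in zip(a, b))
```

`plus_dot(pythagoras, ...)` is the "plus dot Pythagoras" from the demo: the Pythagoras map followed by the plus-reduction.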
I'm rolling four dice. Reshape is a very useful function that creates an array of the shape given on the left, so I can roll ten dice: ten six-sided dice. I hope nobody's offended by rolling dice; you have to be careful. And if you don't like the squiggles, then of course you can rename them: you can just do an assignment that makes you feel happy to begin with. You won't feel happy with it after a while, though, because naming these things, in my opinion and in the opinion of most people who become experienced with APL, makes the code much harder to read. One of the great things about APL is that it has no reserved words, so you can generally rely on the fact that anything that has a name like this is your code, and anything that's a squiggle is part of the underlying language. Okay, so we'll make some throws. We changed the random number generator in the last release and I have no fives now in this demo, so I'll show you that. One of the nice things about Booleans being numbers is that I can apply a plus-reduction to a Boolean array, so I can easily say: how many throws had the value five? So Booleans are not outside the domain; well, we don't really have Booleans, we have one-bit integers. Here's an outer product, the jot dot equals: that takes each item on the left, gives you a row for each one, and then you have the comparison. So if I want to count, I name my iota here, so count is iota six, and I can write sum count outer-product-equals throws, and I get the count of ones, twos, and so on. And I can do a million of them. You see, it's pretty fast, because although this is an interpreted language, it's pretty close to being bytecode; there's not much interpretation really going on here. It's bytecode as you write it, despite being, I would claim, a very human-friendly notation. Now, in the last release, that example also became sort of irrelevant, because we introduced a new operator called key.
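The dice demo above (roll some six-sided dice, then exploit Booleans-as-numbers to count a face with a plus-reduction, or tally every face with an outer product) can be sketched like this; the function names are mine:

```python
import random

def roll(n, sides=6):
    # Roll n independent dice (origin 1), like ? applied to n copies of 6
    return [random.randint(1, sides) for _ in range(n)]

def count_eq(throws, value):
    # Comparison yields a Boolean (0/1) vector; summing it counts the hits,
    # which is the plus-reduction of throws = value
    return sum(t == value for t in throws)

def tally(throws, sides=6):
    # The outer-product version: one comparison row per face, each summed
    return [count_eq(throws, face) for face in range(1, sides + 1)]
```

In APL the whole tally is one expression, and because each primitive works on the whole array, doing a million throws stays fast despite the interpreter.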
The symbolism there is that it's like an SQL GROUP BY: you have a table here, and it's grouping things. So if I generate a million random numbers between one and six, and then say apply the function with key, the unique key is given as the left argument and the data items corresponding to that key are given as the right argument. So I say return the key and the count of the items for each distinct key, and I get this. This is the equivalent of SELECT key, COUNT(*) ... GROUP BY key in SQL. Okay, since I have, not unexpectedly, used too much time, I'll move back to the slides for a moment. Right, so I made the claim that once you get some experience with APL, although it executes from right to left, you generally read it from left to right. So you can read this here, which is the transcription of that formula: minus b plus-or-minus the square root of the discriminant, b squared minus 4ac, divided by two a. And if an APL expression can't be read in that way, you should probably break it up, because, I mean, APL was invented as a mechanism for communication between humans which happens to be very nicely executable. So, we're not just trying to jump on the bandwagon here, although this is probably the first appearance of APL at a functional language conference for some time. John Backus recognized in 1977, in his Turing Award lecture, that Ken Iverson had in fact created a lot of the basis for what became functional programming. He did go on to say some things: there are not enough functional forms in APL, and he really wasn't happy about the fact that it wasn't pure; you can very easily step into a procedural mode in APL if you want to. I think today it's recognized that multi-paradigm is really where you need to be. So the way we describe Dyalog APL today, which has moved a very long way since 1977, is that Dyalog is an array-first, multi-paradigm programming language.
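The key operator described above, which hands each distinct key and its matching items to a function, can be sketched in Python (the name `key` and the callback shape are mine; they mirror the description, not Dyalog's exact definition):

```python
def key(f, data):
    # For each distinct value, in order of first appearance, call
    # f(key_value, items_with_that_key) and collect the results;
    # this is the GROUP BY shape described in the talk
    groups = {}
    for x in data:
        groups.setdefault(x, []).append(x)
    return [f(k, v) for k, v in groups.items()]

# Return each key with the count of its items:
counts = key(lambda k, v: (k, len(v)), [6, 1, 6, 2, 6, 1])
```

With a million random numbers between one and six, this produces the same answer as the outer-product tally, which is why that example became sort of irrelevant.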
So we're claiming that array programming is a paradigm, like functional programming. We support functional, object-oriented, and imperative programming on top of an APL kernel. So, sort of to prove my functional credentials to you, I have a bit of Rosetta code here that one of our team who knows Scheme set up for me. So map f a is just f a, because most of the time the map is implicit. For the functions that are not scalar, that don't just map automatically (user-defined functions, for example), there's an operator called each, which is the equivalent of map, so you can map things explicitly. Filter is compress. This symbol is actually overloaded, which is a bit unfortunate, but it was done back in 1966, before the distinction between functions and operators was properly understood. If you have a Boolean array on the left and you compress an array, that selects the items where the left argument is true. And we've seen fold right. The classical staple diet of functional programming. Juxtaposing things creates lists in APL, or there's a catenate function if you want to do it explicitly. Actually, this was extremely controversial in APL, the fact that the space here is essentially a function. So bad, in fact, that Iverson more or less quit his job at IBM when IBM decided to put this into APL2 in 1983. Yes? You have to know, at parse time, what X and Y are. If X was a function, that would be a function application; if they were two functions, it would be a function composition; if they're two arrays, yeah, I mean, it's both beautiful and horrible at the same time. Yes? You need to take care. I think, you know, at heart I'm with Iverson, but I'm now the CTO of a company whose interpreter implements this. It makes code very, very simple to write in a lot of cases, but it makes it more ambiguous. In many ways it's more of a problem for us than for the users, because, sorry? Yes, and writing a compiler for it is a nightmare. Yes, it could, yeah, it could.
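The Rosetta correspondence above (implicit map or each, compress as filter, reduce as fold) can be sketched in Python terms; the variable names are mine:

```python
from functools import reduce

xs = [1, 2, 3, 4, 5]

# map: in APL the map over a scalar function is implicit, or made explicit
# with the each operator
mapped = [x * 2 for x in xs]

# filter: build a Boolean mask, then compress, i.e. keep where the mask is 1
mask = [x % 2 == 1 for x in xs]
filtered = [x for x, keep in zip(xs, mask) if keep]

# fold: the reduction, e.g. plus-reduce for a sum
folded = reduce(lambda a, b: a + b, xs)
```

In APL each of these is a single glyph (or no glyph at all, in the implicit-map case), which is the point of the Rosetta slide.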
But the problem is that to avoid that, you have to introduce a lot of syntax and protocol which I feel distracts: then you no longer have a mathematical notation, you have something more like these other programming languages. Yeah. So: take and drop, vector and matrix products. You see lots of code like this in Scheme and in APL; whether it's the vector case or the matrix case, and whether the functions are plus dot times, or and-dot-equals, or-dot-equals, and-dot-or, or-dot-and, sorry, you have a completely general mechanism. We saw the outer product, the Cartesian product; here I'm doing a jot dot maximum, and this thing is called commute because it reverses the arguments, but when there's only one argument we now call it the selfie operator, because f selfie x is x f x. So if I want to do the jot dot maximum with iota six on both sides, I just say jot-dot-maximum selfie iota six, and I get this table which is symmetrical around the diagonal, right? Other Dyalog operators? Well, we saw scan; we talked about commute, which is selfie when there's no left argument: it copies the right argument to both sides. We have function power: if there's a number as the right operand, it applies the function that number of times. So 0.5-times omega, power three, would halve the argument three times. But if the right operand is match, which tests whether two arrays are identical, then it applies the function repeatedly, passing the result of each invocation and the previous one to this operand function, and if that returns true, that's the end of the iteration. So power match is the fixed point: repeat this until it stops changing. So if you do one plus the reciprocal of omega, power match, applied to 1, it computes the golden ratio by repeatedly applying this function. Rank.
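The power-match (fixed point) behaviour just described, iterating one plus the reciprocal until the result stops changing, converges to the golden ratio. A minimal Python sketch (the function name is mine; the two-step comparison guards against the iteration oscillating between two adjacent floats):

```python
def power_match(f, x):
    # Apply f repeatedly until two successive results match,
    # i.e. until a fixed point is reached: f power match
    prev = None
    while True:
        nxt = f(x)
        if nxt == x or nxt == prev:   # converged (to float precision)
            return nxt
        prev, x = x, nxt

# 1 + reciprocal, iterated to its fixed point, is the golden ratio
phi = power_match(lambda x: 1 + 1 / x, 1.0)
```

The fixed point of x = 1 + 1/x satisfies x squared equals x plus 1, whose positive root is (1 + sqrt 5) / 2.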
So where normally the map is done on scalars, taking one scalar item at a time and applying the function, there's an operator to control that. You can say: I want to take a vector on the left and apply the function to cells of the right. Or, in this case, use ravel, which in its monadic form removes the shape of an array, no matter how many dimensions it has, and makes it a vector. So if you say ravel rank two, it collapses the last two dimensions and makes each of those cells into a vector. Then, an experimental thing in the latest version of Dyalog APL is a parallel operator. That derives a function which puts f inside an isolate, which is a closed-off space, executes it there and returns a future, which you can wait on when you need it; the execution of your main program carries on until you need the result. So that's sort of our concurrency model, if you like. Unfortunately, because the APL community has, at least until recently, been very isolated (we've been our own little trickle, independent from the main stream, for a very long time), I don't know the right terminology to tell you which of the concurrency models you know this corresponds to. I asked that question yesterday, but I couldn't get an answer; maybe somebody here can tell me. Also, in the latest release, we have a number of point-free forms. So we saw that f plus g is the way to express, well, f plus g. So if you have three functions in juxtaposition like this, f g h: if it's applied monadically, the result is f of omega, then g, applied to h of omega. It's called a fork, because essentially the middle function is your root, and the tines of the fork go up to the functions on either side: it applies the left function to the argument, the right function to the argument, and then it applies your root function to those results. You might think, well, why is that useful? Well, it has a lot of useful applications.
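The isolate-and-future model just described maps quite naturally onto executor futures. This Python sketch is a stand-in, not Dyalog's mechanism: the call returns a future immediately, the main program carries on, and you block only when you actually need the result.

```python
from concurrent.futures import ThreadPoolExecutor
import time

def slow_square(x):
    # Pretend this is an expensive computation running in its own isolate
    time.sleep(0.05)
    return x * x

with ThreadPoolExecutor() as pool:
    fut = pool.submit(slow_square, 7)   # returns a future immediately
    other_work = sum(range(10))         # main program keeps going meanwhile
    result = fut.result()               # block here, only when needed
```

Whether Dyalog's isolates correspond more closely to futures, actors, or something else is exactly the terminology question raised in the talk.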
So for example, f plus g is an example of that. If you have two functions, you can just write them like this: it applies f, it applies g, and it combines the results. So you can compute the average: you can define the function mean, or average, by saying it's the sum divided by the count, which is the mathematical definition of the average. This is going to count the number of items in the argument, this is going to sum it, and then the two will be divided. The number of applications for this is surprisingly large once you get used to it. And of course, the interpreter can really optimize that kind of stuff. And then you write a compiler; writing a compiler for these things is really, really easy, because it's all pre-parsed, basically: you just execute. Two functions in juxtaposition form an atop. So floor atop divide is integer division, rounding down. Then we have sort of classical compositions. I can bind one argument: in this case 32 composed with plus gives the function add32; multiplication composed with 1.8 gives scale; so Fahrenheit is add32 composed with scale. We talked about the fact that you can also curry. Well, actually, we didn't talk about curry; we talked about the power operator and fixed points. One thing you can do in recent versions of APL is curry the right operand of an operator, giving you a monadic operator. So fixed-point is power, curried with match, and you get a monadic operator that you can apply to any function. Another really interesting thing, which really colors the way people use APL, is the keyboard. On any keyboard, you can define whether you want to use the Control or the Alt key or the Windows key to map the symbols onto the keyboard. And in the REPL you have them all up here: when you hover over them, it tells you (you can't really see that from back there) where the symbol is on the keyboard, plus its definition. So while you're learning, there's a lot of help available.
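The tacit forms above (the fork whose middle function is the root, the atop, and argument binding) can be sketched as combinators in Python; the helper names are mine:

```python
from functools import partial

def fork(f, g, h):
    # (f g h) x  =  g(f(x), h(x)) : the middle function g is the root,
    # the outer functions are the tines applied to the argument
    return lambda x: g(f(x), h(x))

def atop(f, g):
    # (f g) x  =  f(g(x)) : two functions in juxtaposition
    return lambda x: f(g(x))

# mean is the fork "sum divided by count", the definition of the average
mean = fork(sum, lambda a, b: a / b, len)

add32 = partial(lambda a, b: a + b, 32)   # bind one argument: 32 with plus
scale = lambda c: c * 1.8                 # multiplication bound with 1.8
fahrenheit = atop(add32, scale)           # add32 composed with scale
```

As the talk notes, these forms are easy for an interpreter (or compiler) to optimize, because the whole pipeline is known before any data arrives.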
But it's true, it takes a little bit of getting used to. Then again, anybody who's productive is touch typing anyway; you're not looking at the keyboard. Okay. Because the map is implied, and because indexing can be done with arrays, you can write a lot of logic where you would expect to write an if statement as a parallel expression instead. So for example, if I have this data here, the vector 2 7 15 60, and I want to make sure that none of the items are less than 5, I essentially want to say: if data sub i is less than 5 then give me 5, otherwise keep it. I can just write that as data max 5. So no loop, no conditional. Now it gets maybe a little bit weirder: I want to increment data if it's 3, 7, or 15. I can write that as data plus data member-of 3 7 15. The membership test gives me a Boolean vector, and the ones get added, so I increment exactly those elements; actually, if I wanted to increment by more than one, I could multiply the mask. And then the classical switch statement: I have two arrays, x and y, and a flag vector of the same length as both, and I want to select x where the flag is true and y where the flag is not true. I can write that as x times flags plus y times not flags. And that might seem like we're doing more work this way, multiplying everything by a Boolean and so on, but it's very, very highly optimized, a compiler can optimize it very easily, and these expressions are all very easily parallelizable, so writing a parallel compiler that knows how to do this stuff in parallel is quite easy. So here's an example of doing a sort of bucketing: I have four values and I have my ages, and I want to select one of these values. I do the division by 10, rounding up in this case, then I make sure that it's at least one and no more than four, and I use that to index.
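The loop-free, branch-free idioms above (clamping with max, incrementing via a Boolean membership mask, two-way select with flags, and bucketing) can be sketched like this; the data and the bucket labels are made up for illustration:

```python
data = [2, 7, 15, 60]

# data max 5: clamp every item to at least 5; no loop, no conditional
clamped = [max(d, 5) for d in data]

# data plus (data member-of 3 7 15): booleans are 0/1, so the membership
# mask adds exactly 1 where it holds
bumped = [d + (d in (3, 7, 15)) for d in data]

def select(flags, x, y):
    # x times flags plus y times not-flags: the branchless two-way select
    return [a * f + b * (1 - f) for f, a, b in zip(flags, x, y)]

def bucket(values, ages):
    # Divide by 10 rounding up, then clamp into 1..len(values) (origin 1)
    idx = [min(max(-(-a // 10), 1), len(values)) for a in ages]
    return [values[i - 1] for i in idx]
```

Each of these is a single data-parallel expression in APL, which is what makes them so amenable to optimization and parallel execution.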
I won't talk much about this; we've seen a lot of these forms already. You can write user-defined functions in this functional form; there's also a procedural form, which I won't show you, because this is a functional conference. Alpha is the left argument and omega is the right argument. So yes, you can only have two arguments; functions are either prefix or infix. Of course the arguments can be tuples, they can be nested arrays, but they can't have different names, I admit that. But the elegance of everything being infix or prefix is what allows the construction of these big functional expressions. No, you can, but you don't have to, and if I don't run out of time I'll show you an example where I do exactly that. Yeah. So I think I'd like to push on. Here's a multi-line function with a guard: a recursive Fibonacci definition. By default, if there's no left argument, we start with 0 1. If the right argument is 0, we just return the head of the left argument; otherwise we sum the elements of the left argument, drop one, and make a recursive tail call (tail calls are optimized in this language). And you can see you can have guards in this functional form. Okay, 19 minutes to go. Okay, so this is going to be really fast, because I have too much stuff to show you here, but I'm trying to give you a flavor of what it's like to work in this language. Don't expect to be able to follow these examples; I would be amazed if anybody could, but hopefully the flavor comes across. So here's a little function I've written, Wikipedia, which takes a page name from Wikipedia as its argument. It calls CachedGet, because I didn't know whether I would have an internet connection, but I actually have a variable in my workspace with all the data, and it gets it from there if there's no internet connection. So I got some HTML, which fortunately is XHTML, and then I've written a little function called table1, because I want to generate some names for a demo, and I found this page on Wikipedia, a list of the most common
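The guarded, accumulator-style Fibonacci just described can be sketched in Python. This mirrors the shape of the APL definition as narrated (default left argument 0 1, guard on zero, tail call with the updated pair); Python itself does not optimize tail calls, so this is illustrative only:

```python
def fib(n, acc=(0, 1)):
    # acc is the defaulted left argument, the pair (0, 1) to start with
    a, b = acc
    if n == 0:               # the guard: right argument 0 -> head of the pair
        return a
    # sum the pair and drop one element, then recurse: a tail call,
    # which Dyalog optimizes
    return fib(n - 1, (b, a + b))
```

Each step carries the last two Fibonacci numbers forward, so the recursion is linear and needs no memoization.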
surnames in Asia. I thought I'd try and make it relevant. It's got lots of tables in it, so I'm writing a little DSL here: table1 returns the content of the first table in the page. And now I'll risk blowing your minds with a bit too much code. What I'm doing is taking this XHTML, a long vector of text, and we have a function called quad XML which turns it into a nested array with five columns: the first column is the level of nesting, the second column is the name of the tag, the third column is the data, the fourth column is the attributes, and so on. So I'm going to look in the second column for all the occurrences of "table", and I do a plus scan, which gives 1 when I get to the first table, 2 when I get to the next table, and then I search for the number I'm looking for: what's the location of the nth table in this vector? Then I look for the items following that which are all more deeply nested than the tag I found, and I index them out of the array. So now these are all the HTML tags that correspond to the first table. I then look for all the td's and th's and find their indices, because that's where the data is. I count how many th's I have (this is really screen scraping; it will not always work), and that's how many columns the table has; then I do a division, rounded down, which gives me how many rows I have. And then I discovered, when I inspected this (it took me about an hour to get this function), that some of these th's don't contain any data; they just contain an anchor, and the data is actually in that. So, you know, this is really hacking: I just said, well, add one to the index if the next tag is an "a". And then I've got my matrix. And then I wrote a function called column. This is very typical of how end users work: they're essentially building their own little DSL as they go along, and the fact that the functions are all infix or prefix
really helps; I think it's fundamental, in my opinion, to making these expressions read a little bit like natural language without having to do any magic. So I can get the romanization column out of this; column and columns are two little functions like that. Because I'm running out of time: I get romanization from table one, I get the Bangla romanization from table two, and I've got this vector. Then I discovered there are some names with slashes in them, so I had to write a little function, first word, which drops everything after the first space or comma. And then, just to show you, I did a very unfunctional thing here: I mutated this surname variable. Horrible, bad news. But of course I can write a little function, which is column alpha of table omega, and apply it with those two arguments and get the whole thing back in one shot. So there are all my surnames. Then I got the given names; I don't have time to show you that, but essentially I'm doing the same stuff, and I got a bunch of given names. And then I want to generate lots of data, people's names. So I give them all initials from A to Z, just catenating a full stop to each letter of the alphabet. And now I can take the initials and do an outer product with catenation against the surnames, to get a 26 by 39 array with all the combinations. Then I take all the given names, put a space between the given name and the initials, do another outer product, and I have a 64 by 26 by 39 array with all the combinations of given names, initials and surnames. Then I ravel it and count how many people's names I've generated: about 64,000. I just pick 50,000 of them at random. And then, I don't have time to explain this, but basically the base-2 logarithm of the number of people is about 15, so I take random numbers between 2 and 15 and use those as exponents, 2 to the power of each, and then I generate random numbers from those again, so I get a roughly logarithmic
distribution of the number of friends. So I've got numbers like these, and then for each of those numbers I do a "deal" with it as the left argument, so I create that many distinct random numbers for each one of the 50,000 people. It takes about 5 seconds on my laptop; there's a serious amount of random number generation going on, and I use deal because I have to make sure I'm not getting duplicates. So that's a fair amount of work. Then, I don't want people to be friends with themselves, so I have this "without", removing each person's own index from their friend list. So the first person is this guy, and he has these friends, and I can use that to index: if I index people by the first row of friends, I get the names of those people. To generate a nice application, I count how many friends each person has after removing themselves from the friend list. So I generated a total of 100 million friend connections between 50,000 people, and I can now do something like compute the average number of friends (the mouse is up there); I can compute the maximum, the average and the minimum. I'm playing around with this data; that's the impression I want to give you here. I can find somebody who's lonely by looking for somebody who has only one friend. That's Ahmed Pal, for some reason. And then I've written a function here which I really would have liked more time to talk about, but I have run out of time; hopefully I've given you a flavor. Basically, there's a function here that recursively finds out who this person can reach. It creates a Boolean array, filling in ones where they have friends, and then it recursively looks at their friends again. So "reachable", in each iteration of the recursive call, compresses the friends by the ones which have been reached, which gives you the next group of friends. After I've recursed on this four times, I have the distance to every one of the 50,000 people from that person, and I can use key to
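summarize those distances.

To make that wave-by-wave idea concrete, here is a rough Python sketch: a plain breadth-first search over a tiny invented friendship graph. The five-person graph and the name hops_from are made up for illustration; the APL version handles each wave as one Boolean array operation over the whole friends matrix rather than looping over people.

```python
from collections import Counter, deque

# A tiny invented friendship graph: person i's friends are friends[i]
friends = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}

def hops_from(start, friends):
    """Hop distance from `start` to every reachable person.

    Each wave of the search corresponds to one recursive step in the APL
    function: the set of newly reached people selects the next group of
    friends to visit.
    """
    dist = {start: 0}
    frontier = deque([start])
    while frontier:
        person = frontier.popleft()
        for f in friends[person]:
            if f not in dist:              # not reached in an earlier wave
                dist[f] = dist[person] + 1
                frontier.append(f)
    return dist

d = hops_from(0, friends)                  # {0: 0, 1: 1, 2: 1, 3: 2, 4: 3}

# the "key"-style summary: how many people are 1, 2, 3... hops away
spread = Counter(v for v in d.values() if v > 0)
```

And coming back to the real data, I can use key to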
say, well, what's the distribution of those? Well, I have 49,520 people who are four hops away, 477 who are three hops away, and so on. And I can compute the path (it's a function very similar to that one there) and find out who the people on this path are. Playing around with data: the point here is that if you sat down and thought about this a lot, you would probably just sit down and write the code, but the process I'm trying to give you an idea of is this: you've got some data (in practice probably a more complex example than this), and you mess about with it. I haven't really written much in the way of functions yet, just ten-line functions, but I'm exploring this space. And I'm out of time. So, 10 minutes left. Right, so just to show you that you can build applications with this stuff, I'm starting a web server here, written in APL. One minute for that; it always happens when there's 10 minutes to go. But here's my web application. It's a tag cloud; it just shows 10 friends, using a tag cloud control from a company called Syncfusion, one of the latest and greatest things. A tag cloud is something which shows you words sized in proportion to something, in this case the number of friends that each person has, and I can click on one of these and it takes me to just the person I clicked on. This sort of educational web server that we have here is one where, if you click on the logo at the top (this is our conference logo, from our user conference two weeks ago), you can see the code that we've written for this. So here's our little code that generates the friends matrix; that is really the APL application code down here, and the rest of it is just the code to generate the web page. So you say: give me an H2 which says "friends tags", give me a link, an anchor, to reset it, add a Syncfusion tag cloud control, and then there's a little bit of logic depending on whether you've selected a user or not, and then you basically just set the properties on this tags-out
object, and it renders. Okay, so with eight minutes to go: the sweet spot for this technology is still, as I think it has always been, people doing financial software. So the big users today are companies that make software, typically asset management systems, where there's a lot of numerical data that needs to be crunched, and where it's really important to be able to read legislation in the morning and start simulating new products that you might sell in the afternoon. And of course the key thing about APL is that it allows people to do that. When you build a large application, of course you do want software engineering in the process, but you can leave the nasty business logic to the people who actually understand it, let them code it themselves, and then review what they've done. There are also some non-financial users. They don't dominate, but some of them are very large: the world's largest, I think, electronic patient journal system, in Sweden, and Acton, who use it for optimization in their mobile product. And this is not a large application, this is just something a customer of ours in Finland built; they won a prize last year for this application. I'd like to show you a video now and spend a bit of my precious time on it. Can we turn up the sound? Maybe I can do that. This is an APL application; it's not all written in APL, there's code generation involved, and it's quite different from the other applications. So, where have we been? Well, I can maybe skip over this slide, because I've talked about most of it already, but you know, the people who tend to get attracted to APL are technology-averse, and the APL market really took ten years to discover that the PC wasn't just one of those bad things that was going to go away again. The companies who sold APL were very badly hurt in that process, and then we had the really dark ages for APL, where everybody wanted to move to C++, and the GUI APIs were changing so quickly that we couldn't really provide our users with the kind of cover that they expect as
relatively non-technical people. But we really feel that the focus on arrays and functional programming fits concurrency and big data. APL applications tend to be very lean in terms of the data arrays that we create, so the arrays are very compact and the functions are very highly optimized, and we're somewhere in the middle of this range. So, to successfully use APL, my recommendation is this: you need to find the right mix of domain experts and software engineers for your application. You're going to need both. You might not need the software engineers for the first few weeks, but you will need them eventually. Be pragmatic; you know, use objects if you must. If you saw the source code of that web page, which was only there for six seconds, it was a class: it started with :Class, it had a constructor and so on, because I think for user interfaces, a lot of user interfaces, object orientation remains a good tool of thought, and that's the thing that's important to focus on. In the fishbowl yesterday we were asked what the next big thing is, and I think the next big thing after concurrency is complexity. The data that we're going to be asked to analyze and work on is just going to get really, really hard to look at, and if you don't have tools with the kind of expressive power that APL has... maybe something new that embodies what APL does in a different framework will be more successful, but I really believe this is true. This quote is from a colleague of ours who has an array-based language derived from APL, and I think it's true. There's a list of major language extensions; you've seen some of them, and you can look at the slides, I'll upload them. We still have a whole bunch of work to do. We don't have a compiler or types; those are getting important for large applications. We don't have closures or lazy evaluation; we do recognize that we need those. But after 50 years, where was the automobile after 50 years of invention? I think computing is still roughly in that place:
it's going to be another 50 years, with the right software. If you were slightly intrigued by this (unfortunately I won't have time for questions; I think I have one minute left), you can use TryAPL online. There are some tutorials there and a bunch of videos. If you're a student, it's free, and note that every year we run a competition for students where the prize is thousands of dollars and a trip to our user conference, paid by us. There's also a low-cost non-commercial version, usually 50 pounds or 75 dollars, but if you register for that within the next week and say you were here, then just ignore the payment. I had a lot of help putting this together, and I apologize to these guys for running out of time and not using their material to its fullest extent. Thank you all for listening. I do have a slide that says "any questions?", but there is no time, so we'll have to take the questions in the breaks. And fill in the forms and say if you want me back next year for a slower talk on APL, and I'll be very happy to do that. Thank you for listening.