Throughout the day we've had a number of questions, and I've tried strictly to avoid making statements. Well, I believe that now is the time for everyone present to say whatever they have in mind, and to start thinking about what could be done with all of this, if we're allowed to. So, essentially, the question is: imagine that I'm an entrepreneur or a venture capitalist, and I say, well, parallel computing sounds interesting. Where should I apply it? Which industry? Should I go into gaming? Into genetics? Into financial modelling? Where is the selling point? The aim of the mini-conf has always been putting together different areas, academia, industry, community, but towards what? So, let's start with...

I don't like telling people where to spend their money, but I might make an observation: it was in fact gaming, PC gaming, that drove GPU development, and that is now coming full circle. As someone mentioned before, you can rent EC2 instances with GPUs, or you can buy them for yourself. So, yes, I'm actually going to dodge that question, but I just think it's quite interesting that something you think of as disposable, consumer, light entertainment has actually spawned a revolution, or the makings of a revolution, in high-performance computing. It's a bit of a counter-intuitive outcome, I think.

So, what should I do with my money? I was hoping...

Entertainment has been driving the technology for a long time. In the late 80s people would brag about getting a new machine and their spreadsheet going faster, but mostly it's been the people playing games. Scientific computing is another driver, but a more recent example from entertainment has actually been sports. I'm not sure what they do in this part of the world, but where I come from they put a virtual ten-yard line on the television picture for football, showing how far the team has to advance to get another chance to keep the ball. There's also been some talk about using light-field cameras, where multiple cameras collect enough light from enough angles to reconstruct the direction of light anywhere in the volume, which could then be used to produce a view of a soccer match from any point on the field. That's going to require a huge amount of computation, but machines are getting cheap enough that you can imagine doing it. And sports is a branch of entertainment where, for the professional leagues, the economics are beyond belief.

So it's still all about economics and entertainment?

Well, you said you want money; I suggest you follow the money. Right now in sports they'll sell you a space on the football field and project your ad on top of it after the fact, in post as it were, so there's money in that.

So, Wayne, what can you tell us about this?

I have a question about the future. A lot of people in this room enjoy parallel programming, I suspect precisely because it's hard. But I think the average programmer, the business developer, is going to see parallelism as a burden. It's an extra thing they have to do on top of simply getting the program written and out the door on time. It's something they haven't had to do in the past; they've relied on processors simply getting faster, so their programs ran faster. So I see it as an added burden, and the unknown to me is: what is the mainstream business developer going to have to do in ten years' time? To what extent will they be thinking parallel? There will always be a hardcore group who are happy to do, and need to do, the hardcore parallel development, but to what extent will it impact the mainstream developer? That's a question I'd like to pose rather than answer.

But shouldn't the mainstream programmer be worried right now? Why wait ten years?

The reason I say ten years is that there may be a transition period. Right now they do need to worry, because there's no other feasible alternative; but in ten or twenty years, I'm not sure what the period is, we may look back on this as a transition period, after which the average programmer, for the majority of the programs they develop, may not have to consider parallelism at all.

Hasn't it really already changed? Paul has been talking about the history of the last couple of decades, so why wait?

What I'd suggest, what I'd ask you to do, is go to a medium-sized company and look at the data centre. Count the number of CPUs: they're already parallel. They may be using single-CPU techniques wired together through TCP/IP or any number of things, but except for the very tiniest data centres, they are parallel.

And have the people working on those data centres been learning alongside you? How have they done it?

Oh, they make their own mistakes and learn their own lessons, I'm sure. There are a bunch of different subclasses. Databases have been parallel for decades. Application servers are inherently parallel and nobody thinks twice about it. In fact, all the transaction-based stuff is inherently parallel, and nobody has thought twice about it for a long time. The point is not so much that parallelism is new, it isn't, but that the change in the economics of parallel machines means that things you would never have considered doing in parallel in the past are suddenly, aside from a little problem with software, entirely reasonable to do.

We had a fun morning about quantum computing and a lot of interesting things. I've always thought: fine, we can do a lot of things faster and cheaper, but what about starting to think about new things? Things we've never thought of before that are now feasible, either because they've become less expensive or simply because they've become feasible at all. So, please. Sorry, go ahead.

A quick synopsis: like most things we see in this environment, we saw it in the mainframe world twenty years ago. We ran out of processor in single-processor environments, multi-processors came in, locking structures came in. All these discussions I now see about race conditions and overlays and things like this: the techniques aren't new. It's just a matter of getting people to think about it, rather than thinking, my program works, I'll whack it on a quad core, three instances of it start up, and we get an overlay. I mean, hello, it's not rocket science.
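To make that last point concrete, here is a minimal, hypothetical Java sketch, not taken from the talk (the class name and the counts are invented for illustration), of exactly that failure mode: a counter that is perfectly correct single-threaded silently loses updates once four threads run it, alongside one possible fix.

```java
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical demo: "my program works, I'll whack it on a quad core."
public class RaceDemo {
    static int unsafeCounter = 0;                                 // plain field: ++ is load, add, store
    static final AtomicInteger safeCounter = new AtomicInteger(); // one possible fix

    public static void main(String[] args) throws InterruptedException {
        final int threads = 4, perThread = 100_000;
        Thread[] workers = new Thread[threads];
        for (int i = 0; i < threads; i++) {
            workers[i] = new Thread(() -> {
                for (int j = 0; j < perThread; j++) {
                    unsafeCounter++;               // interleaved read-modify-write loses updates
                    safeCounter.incrementAndGet(); // atomic read-modify-write does not
                }
            });
            workers[i].start();
        }
        for (Thread w : workers) w.join();
        System.out.println("expected: " + threads * perThread);
        System.out.println("unsafe:   " + unsafeCounter);     // typically falls short
        System.out.println("atomic:   " + safeCounter.get()); // always 400000
    }
}
```

The program, the bug, and the fix are all decades old, which is exactly the mainframe point: nothing here is new except how cheap the quad core has become.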
Please, pass the mic to me. So, now it's your turn.

I do small stuff for businesses.

Define small stuff.

Conversions from one type of Excel format to another for inter-business communications. Basically, the garbage-level programming. It's only small stuff, but it has to be written and it's required. And I'll tell you now: a lot of the time, if I end up using Java, I don't have a choice. Try coding in Java without using threads; it will kill you twelve ways from Sunday. It's that simple. Businesses turn around and ask you for a Java program, and at that point, whether you wanted to go single-threaded or not is already out of the question. The world has changed. You don't have businesses walking up and asking for VB programs that you can write as a single-threaded piece of junk anymore. They walk up and say: we want something portable, we want something usable. And as soon as they start asking for languages that have that minimal requirement of threading, it's over for the programmer. To me, the concept of being single-threaded is no more.

Okay. The world changed, but it seems some people still haven't noticed, because we heard today that certain things are not parallelised.

As those languages lose territory and lose jobs, they'll simply be weeded out of the market. That's the simple fact of it. The world is changing, and the ones who haven't caught up are going to be dead.

Okay. We have a mic in the back and a mic in the front. So, your turn.

Yeah, I think learning more than one language is essential, and learning a functional programming language definitely helps you think in a different way. I'm convinced that functional programming languages help us understand programming multicore CPUs differently; they make it so much easier. But I also fear that functional programming languages are probably not mainstream and never will be, even though it would be a nice world if they were. Still, learning those concepts, learning how to write in a functional language, really helps you improve your non-functional programming in a big way, and helps you get around a lot of the gotchas you have in non-functional programming.
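As one illustration of how that carries over into a mainstream language, here is a hypothetical Java sketch (the workload and names are invented, not something shown at the conference): when the work is a pure function over independent inputs, the functional style makes going parallel a one-word change, with no locks and no shared state to race on.

```java
import java.util.stream.IntStream;

// Hypothetical demo: functional discipline makes parallelism trivial.
public class PureParallel {
    // A pure function: the result depends only on the argument, no side effects.
    static long collatzSteps(long n) {
        long steps = 0;
        while (n != 1) {
            n = (n % 2 == 0) ? n / 2 : 3 * n + 1;
            steps++;
        }
        return steps;
    }

    public static void main(String[] args) {
        long seqMax = IntStream.rangeClosed(1, 1_000_000)
                               .mapToLong(PureParallel::collatzSteps)
                               .max().orElse(0);
        // Same pipeline with .parallel(): safe precisely because collatzSteps
        // is pure, so the worker threads never touch shared mutable state.
        long parMax = IntStream.rangeClosed(1, 1_000_000)
                               .parallel()
                               .mapToLong(PureParallel::collatzSteps)
                               .max().orElse(0);
        System.out.println(seqMax == parMax); // true: identical results
    }
}
```

That is the lesson the speaker is pointing at: even if Haskell or Erlang never become mainstream, the habit of writing pure functions over immutable data pays off directly in the languages that are.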
Yeah, you can ask.

So let's say I wanted to learn a functional language. I know you're an Erlang fan, so would you pick Erlang as your first functional language to learn, or would you pick something else? Obviously, sometimes you get on the web and look at some Haskell code and think: what the hell? Who wrote this? I can't even read it; it's just soup. So what would you recommend as a first functional language to learn?

Okay, I'm coming and going with the mic, so please keep the answers short. Okay? Yes.

Yeah, I would totally recommend Erlang. Not because I love the language, but because once you get over the initial hurdle, which you can literally get over in a day or two, you're in a language that is really easy to read and really easy to understand. Erlang has a very, very clean syntax, and a very easy and dense way of expressing problems. It's very easy to read and understand Erlang code, and that ultimately helps you write Erlang code. After about a week of tinkering with Erlang, I promise you'll be able to write an Erlang program that is substantial enough to call a program, and that works well enough to run in production. So it's a language that's easy to get into, and a language that gives you good feedback at an early stage.

Please. Okay. So, yes?

Sorry, in the same vein: it'd be useful to learn another architecture as well, especially for multi-core programming. You've got different data dependencies, different endianness, things like that, but especially the data dependencies. Memory ordering differs between architectures and is quite important, so it matters whether you're programming for ARM instead of Intel.
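The ARM-versus-Intel point can be made concrete without leaving Java. Here is a minimal, hypothetical sketch of the standard safe-publication idiom (class and field names are invented): x86 is strongly ordered, so sloppy publication often appears to work there and then breaks on ARM, where the hardware may reorder plain writes; a volatile flag gives the same guarantee on both.

```java
// Hypothetical demo: why memory ordering matters across architectures.
public class Publish {
    static int payload = 0;                // written first...
    static volatile boolean ready = false; // ...then published via this volatile flag

    public static void main(String[] args) throws InterruptedException {
        Thread writer = new Thread(() -> {
            payload = 42;  // (1) plain write
            ready = true;  // (2) volatile write: (1) cannot be reordered after (2)
        });
        Thread reader = new Thread(() -> {
            while (!ready) { }            // volatile read pairs with (2)
            System.out.println(payload);  // guaranteed 42 by the Java memory model;
                                          // with a plain boolean flag, a weakly
                                          // ordered core could print 0 or spin forever
        });
        writer.start();
        reader.start();
        writer.join();
        reader.join();
    }
}
```

The design point: program against the language's memory model, not against whatever ordering your development machine happens to give you for free.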
And what is IBM doing about Erlang?

What's IBM doing about Erlang? There are over 300,000 people in IBM, and if I were able to tell you what all of them were doing, they would be very lazy. I'm sure there are some people in IBM doing some Erlang; I'm not one of them, and I don't know of them.

Okay. There's a good book for learning Erlang and getting into it. It's a very gentle introduction to the basic concepts, and then it goes on to writing really large-scale applications. I'm not the author and I'm not affiliated: Erlang and OTP in Action.

So, before closing, I want to ask all of you present to join me in saying thank you to all the volunteers who supported us. They have done a fantastic job, from the presentations to the shuttle. Oh, yes. Yes. And if you send your material to me, send everything, I'll put it either with the presentations or on the website. And thank you to Nick Kloss for organising the conference. Thank you very much.