Hi again. For my second talk today, I want to focus more on the art side of things. My last presentation was about curating and copyright, but in my other life I'm an artist: I make artwork, and I perform as well. I'm going to talk a little and demonstrate the tool I use when I'm doing performances, and my general way of programming anyway. So, programming is a thing we all do, and some people like to do it live, as in live coding. Live coding is more than just watching someone code; we can already do that by looking over someone's shoulder. The thing with live coding is that in conventional programming you usually have to wait: you write your lines of code, you wait twenty minutes for it to compile, and only then do you see the results. With live coding, the whole thing is live in front of you. When you write something in these languages, the results come straight back, and you see and/or hear them at the same moment. That's where the live aspect comes in: not just watching someone type, but experiencing the results live, much like watching people play instruments. They hit the drum and it makes a sound. That's the best way to think about it. Live coding has probably been going on for longer than I've been alive, but within the last five years or so I've become involved with a community called the Algorave community. Algoraving is almost exactly what it sounds like: algorithmic raving. People write algorithms in these live coding environments to make rave electronic music, and it is cool. So I'll show you a few videos from a couple of Algoraves, just to give you some context. I should sort out my sound.
This is a video of some of the visuals that sometimes happen at Algoraves; these ones I made myself, actually, from my first ever Algorave, I think back in 2014 or 2015. These are all generated in real time, so it usually starts off quite simple and then builds up to something quite chaotic and noisy. This example is more sound-focused: Alex McLean, who's standing on stage there, has the code he's writing projected behind him, which is one way to reveal what's happening. Again, with music and musical instruments there's an instant feedback loop: a person hits drums or a guitar and you hear the sound. Here, with the code projected behind, if you really wanted to try to understand it, you could see: oh, this has been typed, and now this sound has appeared. I'm sure we've all been to electronic music performances where a person stands behind a laptop nodding their head. Yes. They could be checking their emails. Here, at least, you get to see that there is something happening: someone types something, and something happens. Now I'll move on to the next bit. In this video you're seeing more of the visual side of things. Usually when a performance is happening, one person is live coding music and another is live coding visuals. You may have heard the term VJ, or visual jockey. Depending on the software they're using, some VJs will load video clips, much as a DJ loads samples and chooses different records. But at Algoraves, they live code all of it, more or less. So if there's something on screen, as you see here, it's because they've typed it in, and that's what's appearing. It starts off quite minimal and builds up. And that's my role when I do Algoraves: I don't make music, I make visuals.
I originally intended this presentation to be a counterpoint with a person called Guy John, who also makes visuals. I use visual programming languages, so think of working with nodes in Blender, or Tom Lechler's talk earlier about working with nodes, whereas Guy John uses typed languages, much like every programming language you've ever used. I originally wanted to talk about the benefits and the drawbacks of both, but since he's not here, I'm going to show you a demo of my experiences of live coding using nodes. So, hands up, who has used Blender before? Yeah, so you've probably used Blender before. It's a really good piece of software. In fact, I'm going to mirror my screen, because that will make this easier. Mirror, mirror on my computer. Thank you. Depending on when you've used Blender, you probably know it has a node-based interface, usually for textures. I've set up an incredibly simple scene here, and you can use nodes to generate textures for it in what I think is quite an intuitive way. So I'm going to do a little demo. The result is down here at the bottom, and up at the top are the nodes generating whatever graphics you see. I'm going to do a very simple thing. Right now it's gone black. I'm going to add in a shader which will make it almost shadeless, and then add a texture to it, because right now this is just white, yeah? You might want to change the colour. See? Now the colour is changing; the result is at the bottom and it's changing colour. Hey, so impressive. But Blender has a number of built-in textures which are generative; that is, they use maths and numbers to generate shapes. I'm going to read the names out, because I realise the text is small on there.
I go to texture and I'm going to choose, let's choose the wave texture. No, sorry, not the wave texture, that's a rubbish texture. I'm going to choose the Musgrave texture. Now if I connect that to here, you see we've got what looks like, I don't know how to describe it, spots on a cow almost, an inverted cow. These textures have different attributes you can change, such as the scale: how big is it? If I choose scale 1, it's really big, whereas at scale 10 there's ten times as much of it. And as I drag it up, you see it being updated in real time, et cetera. I'll try a couple more of the textures you've got here. Let's choose the noise texture. That's almost what it says: basically white noise, or here, colourful noise. Useful if you want some form of randomness in your texture. What I can do instead is keep the Musgrave texture, but have it modulated, changed, by what happens in the noise texture. So if I connect that to that, you see the original noise texture is changing the distribution of the Musgrave texture, making it look pretty darn awesome, if I do say so myself. Right, I'm showing you this for a reason. Although this is updated in real time, you see here at the bottom it's using a rendering engine called Cycles, which is really good if you want very realistic images. But as soon as you start adding in twenty more objects and want all of them moving around, it has to re-render every single time while trying to make everything look really realistic, and it's just slow. I won't show you how slow it gets, because you'd be waiting forever, obviously. And this is a very low-resolution preview here at the bottom.
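The trick of feeding one texture into another can be sketched outside Blender too. This is not Blender's actual Musgrave or noise implementation, just a toy illustration of the idea: a deterministic value noise distorts the input coordinates of a simple spot pattern, the same way wiring a Noise Texture into another texture's input warps its distribution. All function names here are my own.

```python
import math
import random

def value_noise(x, y, seed=0):
    # Crude value noise: a deterministic pseudo-random value in [0, 1)
    # at each integer lattice point, bilinearly interpolated in between.
    def lattice(ix, iy):
        rng = random.Random((ix * 73856093) ^ (iy * 19349663) ^ seed)
        return rng.random()
    x0, y0 = math.floor(x), math.floor(y)
    fx, fy = x - x0, y - y0
    top = lattice(x0, y0) * (1 - fx) + lattice(x0 + 1, y0) * fx
    bot = lattice(x0, y0 + 1) * (1 - fx) + lattice(x0 + 1, y0 + 1) * fx
    return top * (1 - fy) + bot * fy

def spots(x, y, scale=10.0):
    # A cheap "spots on a cow" pattern: a thresholded sine grid.
    return 1.0 if math.sin(x * scale) * math.sin(y * scale) > 0.3 else 0.0

def modulated_spots(x, y, scale=10.0, strength=0.5):
    # Distort the coordinates with noise before sampling the pattern --
    # the node-graph equivalent of noise modulating the Musgrave texture.
    n = value_noise(x * 4, y * 4)
    return spots(x + n * strength, y + n * strength, scale)
```

Sampling `modulated_spots` over a grid of (x, y) points would give the warped, organic-looking distribution rather than the regular one.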
When you're doing live graphics, you're of course not going to get Disney-quality output, but you want it to look really good; you want it to be HD, et cetera. Working this way, at this rendering quality, just wouldn't work, and it's also not in any way real time. Actually, if I keep dragging this, can you see how it starts pixelated before it fully becomes clear? That's it trying to render in real time. If you've been following the progress of Blender — ah, you can see my desktop now; doesn't matter — you'll have seen there's been a lot of development on a new renderer called Eevee. They're still coding it. Essentially, and I'm simplifying massively here, it's like having a game engine do your rendering. We've all played computer games, on a PS4 or whatever, and the graphics are good enough, right? Not completely realistic, but good enough. And this here, for example, is a real-time viewport render; there's no post-processing. What I'm looking forward to, once this is stable, is being able to do all of the live noding I just showed you in real time, at really high quality. But it's not stable yet. Why I chose this model to show you, I'm not sure; it's kind of disgusting. Let's try this one instead, which is a nicer example. I'm looking forward to having that, because then there are more possibilities for almost reworking Blender into a live performance tool. Currently I haven't seen much else being done with it. Here's another video I downloaded from a person I follow who writes the Blender Sushi blog and occasionally posts examples of live noding. The example I showed you before worked on just one single texture; this person works with multiple different objects.
So instead of having just one texture that I'm adding all these colours to, what about twenty objects, or a hundred, or ten thousand, et cetera? This is just an example of live noding, and you can see it's quite slow to build up, but that's because programming is quite a slow process, right? I'm showing you this as an example of what I'm really looking forward to: the power of Blender, which has so many great tools, but in a live environment. Hopefully that will come one day. Again, I didn't make this video, but it's a good example of using Blender to create graphics in a live setting. So, I showed you that one. Now, let me un-mirror my screen so you can't see my whole desktop. Okay, fine. The program I use when I'm live coding is a bit different, though it's still node-based. Yeah, errors. It's a node-based program called Pure Data. Has anyone ever heard of Pure Data? Oh, cool. Did you know it has a graphics environment built into it? Oh, brilliant, even better. Okay, let's try mirroring my screen one more time, and if it doesn't work, I'll let it be as it is. Yes, brilliant. Okay, cool. Pure Data is a live performance environment, originally built for music, but lots of people have built libraries for it to work with visuals and other types of data. I'm going to show you a little example of how I create with it. Okay, screen's black, brilliant. And yes, because it's a live demo, of course it's going to start a bit slowly. Okay, here we go. Essentially, I've got my nodes and I've set up a very basic environment, and I'm using OBS Studio just to overlay the code on top of the output. It doesn't usually come like this, but whatever.
That's just a little byproduct there. I'm now going to create my window to render the graphics, and it's really simple in some ways. First, for example, I type the word gemhead. You can't see it at all? Funny. Yeah, you can see I've got this word gemhead, which starts the rendering chain: everything connected underneath gemhead gets rendered. So I have gemhead here, I type the word cube, and I connect them together by clicking on these little spots; you can see a little line going from one to the other. As soon as I drag it to here, there's a cube on screen. It's magic. After that, maybe I want to change its colour. It works with RGB, of course, so I type color, and let's say I want to make it red, that's 1 0 0; connect that back up to there, and that back up to there... oh, it's gone blue. I know why it's gone blue. Yes, there we go. Now we have a red cube here at the bottom. What if I want it to rotate instead? I'm going to put in a box called rotateXYZ. You can see on the object itself these little boxes here, here, and here; each one represents a different property of the function. Here the function is rotateXYZ. The first inlet is the rendering chain itself, so that's what gets the graphics rendered, and the ones after it are the x axis, the y axis, and the z axis. Let me connect these up, to demonstrate better than I can say it. The cube's there, and I'm going to put in a number and connect it to the first axis inlet, the x axis. As I click and drag on it, the numbers go up, and it starts rotating. Hey, it's getting there, right? Yeah. So it starts like that. But now I want to introduce a bit of maths, because the reason we're using a computer is to automate a lot of things, right?
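Pure Data patches are graphical rather than textual, but the way a GEM chain composes can be sketched as ordinary code. This is a toy model of my own, not Pure Data itself: each function stands in for one of the boxes, takes the render state coming down the chain, and passes a modified copy on, which is why everything below gemhead inherits the current colour and rotation.

```python
def gemhead():
    # Top of the render chain: a fresh default render state.
    return {"color": (1.0, 1.0, 1.0), "rotation": (0.0, 0.0, 0.0), "draw": []}

def color(state, r, g, b):
    # Like the [color] box: changes the colour for everything downstream.
    state = dict(state)
    state["color"] = (r, g, b)
    return state

def rotateXYZ(state, x, y, z):
    # Like the [rotateXYZ] box: sets rotation for everything downstream.
    state = dict(state)
    state["rotation"] = (x, y, z)
    return state

def cube(state):
    # Like the [cube] box: draws a cube with the inherited state.
    state = dict(state)
    state["draw"] = state["draw"] + [("cube", state["color"], state["rotation"])]
    return state

# gemhead -> color 1 0 0 -> rotateXYZ -> cube, as patched in the demo:
frame = cube(rotateXYZ(color(gemhead(), 1, 0, 0), 45, 0, 0))
```

The nesting mirrors the patch cords: the output of each box feeds the left inlet of the next one down.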
I don't want to spend a whole performance just clicking and dragging on it to make it move; that would be very inefficient. So I'm going to use what's called a metronome. Musicians will definitely know: it's just a counter, it goes tick, tock. Its rate is measured in milliseconds, so let's say every 100 milliseconds I want to trigger a loop which adds one to itself. So: f for float, then plus one, connected in a loop. Now I plug it into there. Currently nothing's happening, because much like a light switch, you have to turn it on, and I've got a little switch here. As soon as I turn it on, it's moving. I'm basically a wizard. Thank you. As I said, the inlet on the left is usually the data travelling through the object, and the other inlets change the data that's going through it. Right now this metro is set to 100, as in 100 milliseconds. I can change that on the fly by sending a number into the box's right inlet; if I type a number now, it will override where it says 100. If I type 1000, it's now barely even rotating, because every second it adds only one degree of rotation. You can still see the number changing, but if I change it to 10 and press enter, you see it's now going much faster. Okay, making sense? Yay. Incredibly simple stuff — well, I say simple. And again, I can attach this to the x and y axes, so it's on a pivot, somewhat, and so on. This is what I do, generally speaking, when I'm doing live performances and live coding. What I like about this kind of live coding is — I do use typed languages as well, I write lines of code — but I feel that when I'm using these node-based languages, more of my personality comes through. You can see more of how I'm working, or sometimes how I don't work.
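The metro-plus-counter loop can be simulated in a few lines of ordinary code. This is a sketch of the logic, not Pure Data: each metro bang feeds the stored float back through the plus-one box, and the metro interval sets how many bangs, and therefore degrees of rotation, you get per second. The function names are my own.

```python
def run_counter(ticks, step=1, start=0):
    # Each metro "bang" pushes the stored float through [+ 1],
    # so the value driving the rotation grows by `step` per tick.
    angle = start
    history = []
    for _ in range(ticks):
        angle += step
        history.append(angle % 360)  # rotation wraps at 360 degrees
    return history

def degrees_per_second(interval_ms, step=1):
    # A 1000 ms metro fires once a second; 100 ms, ten times; 10 ms, a hundred.
    return 1000.0 / interval_ms * step
```

So overriding the metro's right inlet from 100 to 1000 drops the speed from ten degrees a second to one, and setting it to 10 raises it to a hundred, which is exactly why the cube crawls or spins.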
The diagram, the spider diagram of my visuals, builds up as I go, and it represents how my thinking process is actually working — and you can see it gets to be a bit of a mess. With typed languages there's still structure to it all, and you can still sometimes tell someone's programming style. But I feel this is more representative of how I operate. Alex McLean, who is a very prominent live coder and sort of the founder of the Algorave movement, has talked about how visual programming languages aren't really that visual, because the placement of these boxes doesn't affect the code itself. If this toggle box is here, it means the same whether it's that size or that size, whether it's in this location or this location. So in that way, the visual appearance of the diagram doesn't actually affect how the code operates. Whereas — and I don't want to get into this debate, but you know, tabs or spaces, right? — how typed code looks on the screen can actually affect the code itself; in Python, for instance, the indentation is part of the syntax. So in some ways, typed languages are a bit more visual. Alex can elaborate on that far better than I could. But this is still visual in the sense that it's built up in a way that responds to how I work. So yeah, this is an incredibly short talk, but what I want to do to end is actually a very short live coding performance. I said I wasn't going to perform, but that was a lie. So I'm going to play some music by these really amazing musicians, Shelly Knotts and Joanne Armitage, who perform under the name ALGOBABEZ — check them out. I'm basically going to wing it; I haven't even listened to this track. And it will build up as it goes along.
Okay, so you'll have to excuse me once I get incredibly focused on my computer, but you will be able to see my code, so it won't be too boring for you. Let's press play. Hit it. That gives you an idea of how it begins. Even the music itself they would have live coded from scratch, and it builds up much like the visuals: it was one cube, then many cubes, then rotating, then the colours start changing. Sometimes in a set you introduce audio-reactive elements and so on. But that's basically a demo of how live coders work. And with the nodes as well, you can see in this diagram how my thought processes are working, I think better than you can in typed programming languages. I shall stop there. Thank you very much for listening and watching.