My name is Neil, this is my first LGM, and I'm an artist and technologist from Oxford in the UK. Over the last couple of years I've also been working as part of a group called Digital Prisoners, making interactive spaces and projections. This is the sort of thing we've been doing: a fairly large-scale projection on a building. You might be wondering, possibly, hopefully, what those two shapes are; they're actually people. What we did in this public space is have two infrared cameras picking up people, one person on either side of a square, so that they could interact with an environment projected onto the building. This is a real-time projection. Then, at completely the other end of the scale, the most recent project we did was an interactive exhibit for a museum. We have quite an interest in the history of projection, and this is a contemporary twist on a magic lantern. There are various interactives in there, interactive sensors, and we crowdsourced selfie videos from people and made them into slides, except the slides were RFID tagged and would start off projections. What both of those projects have in common is that they were built with Praxis Live, which is our software: a hybrid visual IDE for live creative coding, which is something I'll get to. So hopefully there's going to be some code in this talk. This is one of the demo projects that ships in the examples project. I'll mute that so I can talk over it. What we have there is some audio going on, which is triggering GPU shaders doing some interesting things, spinning things around in 3D. If I open the two files that relate to this (it's really tiny on this screen), what we basically have is a visual node graph of components which we can edit live to create this: that's the audio patch that's running there, and that's the video patch.
To give you an idea of the environment and how to use it, I'll move to something that's a little bit simpler. I need to restart this; I hope you can see it better than I can from here. This is a very simple example project which just iterates through a folder of images to make a slideshow. These are all built-in components that ship with the software; there's a palette of components on this side here. However, the problem that Praxis Live is, in a way, trying to solve is what happens in any high-level patching environment like this when the built-in components don't do what you want them to do. So you can edit the code of every component here. Praxis Live actually wraps Processing as a library, so you can see that the code for this component is based on Processing code. Rather than edit one of the built-in components, I'll give you a quick overview here: we can bring in a custom component and connect it to the screen. At the moment it does nothing. If you use Processing, you should find this vaguely familiar. The Praxis Live runtime embeds a compiler, so you can write components on the fly. We can do various bits of Processing code in here; as you can see, where we've got two inputs it's doing additive blending. We'll take that out of the way for now and go back to the code. It uses familiar Processing types, such as PImage, so maybe we would draw that image. To link this into the visual programming side, we use annotations. So we'll do that, and we've now got an input port; if I connect those two together, we're drawing our image again. I'll cheat slightly and copy this. The other thing we can do is rotate our image.
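As a rough sketch of what such a rewritable component looks like: this is Processing-style Java that is compiled and run inside the Praxis Live runtime, not standalone, and the exact annotation names here are from memory, so treat them as illustrative.

```java
// Minimal Praxis Live video component (Processing-style Java).
// @In declares a numbered input port that appears on the node in the graph.
@In(1) PImage in;   // image arriving from the connected upstream component

public void draw() {
    // Plain Processing drawing code, recompiled on the fly when edited.
    translate(width / 2, height / 2);         // rotate around the centre
    rotate(radians(45));                      // angle hard-coded for now
    image(in, -in.width / 2, -in.height / 2); // draw the incoming image
}
```

Editing and saving this in the code editor hot-swaps the component while the graph keeps running, which is what makes the live-coding workflow possible.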
At the moment the rotation is obviously baked into the code. So, as well, we can use a property annotation, and we now have a property which we can set here. Once we've exposed the property to the environment, you can connect MIDI, OSC, or GUI controls to it: you can take input from one source and immediately connect it to your code graphically. We can also tell it that it has a range, and then just use it as a slider. Another simple example: we can annotate a method, and we've now set up a simple animation for our image. We've also got a rotate port, so we can connect the timer to that, and we've now got a simple transition. Right, five minutes. It can get more complicated than this, obviously. If you've ever used Processing, you may have seen this example, or a variation of it, in their examples: you can create three-dimensional shapes. Again, we can redefine the code for this; it's simple to change the resolution of the shape on the fly. And you'll see the source of this is actually a GL shader, which is this component, so we can also edit the fragment shaders; there's an editor for those as well. And we have built-in GStreamer, so we can do simple things like play familiar trailers. But of course we could instead decide we want to feed that video into our three-dimensional shape. So we can put you guys on the screen there, mapped onto a three-dimensional shape, if we wish. Let's try that quickly. One thing about using this for live coding: showing your code is difficult enough, but because we have GStreamer we can literally bring our environment into the scene and see the code behind our shape. If you look on the Praxis Live Twitter feed (it was retweeted by LGM last week), there is actually about a ten-minute video of me doing some three-dimensional live coding in the environment. How long have I got? One minute.
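Putting the property and the animated method together, the rotating-image component might look something like this. Again, this is a sketch that only runs inside the Praxis Live runtime, and the annotation and range syntax are from memory, so treat the details as illustrative rather than exact:

```java
@In(1) PImage in;

// @P exposes a property port; giving it a numeric range is what lets the
// editor show it as a slider, and MIDI/OSC/GUI controls can be mapped to it.
@P(1) @Type.Number(min = 0, max = 360)
double rotation;

// Annotating a method creates a trigger port; connecting a timer component
// to it fires the method repeatedly, giving a simple animation.
@T(1) void spin() {
    rotation = (rotation + 5) % 360;   // step the angle on each trigger
}

public void draw() {
    translate(width / 2, height / 2);
    rotate(radians((float) rotation));
    image(in, -in.width / 2, -in.height / 2);
}
```

The point of the annotations is that nothing else is needed: adding `@P` or `@T` to ordinary fields and methods is what makes them appear as connectable ports in the visual graph.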
One other thing: as this is a graphics conference I won't talk about audio, but you can live code audio DSP as well, and lots of other kinds of data processing. And finally we have bindings to a German project called Tinkerforge, if anyone has heard of it. It's an open hardware project with clip-together sensors, motors, controllers, that sort of thing; I like it because I can't solder. Here I've just got a simple infrared distance sensor connected, so I'll quickly build this project. Again you use the graph editor, but you can change the code of these components too: this is the code binding the distance sensor, and I'm sending its value to the video patch. So if I run this, and then we go... this is the crowd-pleasing demo, particularly good with kids. The project and web addresses are there, and there is a fairly usable manual linked from that; it's fairly well documented. If anyone is interested in using it, or in helping to develop plugins (plugins are literally a matter of copying and pasting in a bit of Processing sketch code you've done), or anything else, please come and talk to me.
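For context on what the sensor binding is wrapping: reading that infrared distance sensor directly with the Tinkerforge Java API looks roughly like the sketch below. This is not the Praxis Live binding code itself, it needs a running brickd daemon and real hardware, and the UID `"XYZ"` is a placeholder for your own bricklet's UID, so treat it as an illustration of the underlying API only.

```java
import com.tinkerforge.IPConnection;
import com.tinkerforge.BrickletDistanceIR;

public class DistanceDemo {
    public static void main(String[] args) throws Exception {
        IPConnection ipcon = new IPConnection();
        // "XYZ" is a placeholder UID; find yours in the Brick Viewer.
        BrickletDistanceIR sensor = new BrickletDistanceIR("XYZ", ipcon);
        ipcon.connect("localhost", 4223);   // brickd host and default port

        // Ask for a distance reading (in mm) every 100 ms and react to it;
        // a Praxis Live binding would forward this value to a port instead.
        sensor.addDistanceListener(distance ->
                System.out.println("distance: " + distance + " mm"));
        sensor.setDistanceCallbackPeriod(100);

        Thread.sleep(10_000);               // listen for ten seconds
        ipcon.disconnect();
    }
}
```

The clip-together appeal Neil mentions carries over to the software side: each bricklet gets its own small class with a listener-based API, so wiring a sensor into a patch is a few lines rather than a soldering job.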