Hello. This was going to be a presentation with some demoing and live coding of PraxisLIVE, but for technical reasons I've moved my slides onto a USB stick and unfortunately there won't be any actual live-coding demonstrations. So this is going to be a kind of extended "State of Libre Graphics" about PraxisLIVE, with some slides.

A quick bit of background on me first. I worked for ten years doing web consultancy work for the cultural and charity sector, prior to which I'd worked for five years in culture and local government. So, a slightly eclectic background. I've spent fifteen years making various creative applications in Java, various kinds of interactive projections and performances. I'm now freelancing, doing various things around open-source Java technologies, particularly media-based and graphics-based, and PraxisLIVE is part of that. So I'm doing bits of work around GStreamer bindings for Java, audio utilities, and getting involved with the Apache NetBeans IDE.

What I'm coding for, sometimes, is creating audio-visual performances, generative interactive spaces and projections. So: this was an interactive audio-visual instrument capturing movement in a space. This was projection mapping onto an old chimney. This again was interactive: faces appeared in the smoke, followed people around the room, and whispered Lorem Ipsum at them, which is one of the scariest things you will ever hear. And this is a projection piece for a public square, actually in Oxford itself: two camera booths with infrared cameras, one on either side, where people could interact with each other in scenes projected onto a building. And then, at the other end of the scale, that project and this one were created in collaboration with Naomi Morris, who's a dance and video artist, and this is a quick take on a magic lantern, which actually had a little Intel Compute Stick, an LED projector and RFID tags in the slides. More recently I've got into doing live coding, mostly audio stuff, some graphics stuff, and particularly taking ideas into nightclubs in Athens, or scary mornings in Poland talking to 800 Java developers.

OK, so if you saw me talk about this in London, I'll try not to cover exactly what it is, but rather some of the things that have happened since. PraxisLIVE is a hybrid visual IDE for live creative coding. Obviously I'm using it quite a bit for live coding at the moment, but it's not specifically designed just for live coding, and most of the projection work you saw was created with it. It's as much about having an interactive environment for creative work: installations, graphics, creative flow, that sort of thing. And it's as much a runtime as an IDE; you don't have to use the graphical aspects of it, and you can build self-contained applications with it, which I've talked about before. So this is what it looks like, now actually; it's had a little bit of a revamp in the recent version, and I've added so many features over the last year.
But I added Bezier curves to the graph, which took two hours, and that got the most interest of anything I've done in the last year. So, what you see here is an IDE that mixes a node-based interface and code. Any single one of those nodes is backed by code that you can live edit; the runtime has an embedded Java compiler. So we're talking about something where you can recode anything that it's doing.

Key features: in addition to graphical patching, we can basically extend it as much as we want while it's running. Built-in support for doing things with OpenGL, GStreamer and JACK audio. Built-in support for binding MIDI, OSC, UIs, or physical computing. The magic lantern had Tinkerforge in it, which is open hardware, so there's Tinkerforge support built in, but any sort of open-hardware sensors are easy to integrate and link to anything you're working with. Building standalone, cross-platform projects is also built in. And finally, when I last spoke about this it wasn't running on the Pi; it now does run on the Pi. It's free and open source, built on top of many great projects. As I was saying to someone: yes, in many ways I'm the only developer of it, but it's very much about joining the pieces together. It can work with anything that runs on the JDK, but it particularly comes with built-in support for Processing, so any of those nodes in a visual graph is basically a Processing sketch that you can live edit. There's built-in support for GStreamer for video, and it's built on top of the Apache NetBeans platform and IDE. Some of you may know NetBeans: it was Sun's, then Oracle's, and they donated it to the Apache Foundation, which I've recently got involved with as a committer. It's an interesting process. Very different.

Now, this was going to be the bit where I showed you some live-coding demos. That's less easy now, but, having never done this before, at the last minute I've brought a video of the talk I was doing in Poland. Whether this will work or not, we'll soon find out. I'm just going to skip forward a little bit, to give you some idea of some of the things I'm talking about. You can just about hear me talking in the background there. The bit you just missed was me saying that I'd been looking at various other sorts of live-coding environments and creative environments; that particular example is ported from Pharo, which is a Smalltalk environment, and I thought I'd see how easy it was to recreate it. This is one of the graphics examples, so again something we can play around with. What you may or may not be able to see, it's a bit small, is a very simple Processing sketch, just the draw method, which is one of the nodes within that graph. Every time I hit save it's recompiled immediately, all the state is maintained, and it keeps running (there's a sketch of the shape of that code below). And this one is a little bit simpler, to echo what I just said. Then I started playing around with this in 3D; obviously you've got 3D OpenGL support as well. One of the things that's been added since I last spoke about this is the ability to pass any kind of arbitrary binary data around a graph, and between graphs; you can have multiple graphs running at any point.
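Since that sketch is a bit small on the video, here's the shape of what one of those nodes contains: a minimal draw-method sketch. This is plain Processing code that runs as-is in the Processing PDE; PraxisLIVE wraps something very similar inside each component (its actual component API adds annotations for ports and properties), so take this as an illustration rather than the exact API.

```java
// A minimal draw-method sketch of the kind a node contains. Plain
// Processing code; in PraxisLIVE, hitting save recompiles this in
// memory while fields like 'phase' keep their values.

float phase = 0;  // state that survives a live recompile

void setup() {
  size(400, 400);
}

void draw() {
  background(0);
  phase += 0.02;
  float x = width / 2 + 150 * cos(phase);
  float y = height / 2 + 150 * sin(phase);
  ellipse(x, y, 40, 40);  // edit this, hit save, the motion continues
}
```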
Back to the video: so there's an audio graph and a video graph here, and you're sending FFT data between them in order to drive things. That gives us the start of a building block, and that particular API is a good thing to use generally, but it's not necessarily so quick and easy to put together when you're trying to play something immediately. So in this particular component I just built up some simple forms that use clocks and filters and do various things around indices. And then we can instead do something like this: sending audio data and FFT data into the shape and manipulating that (there's a sketch of this idea at the end of this section). I know this is a graphics conference, but there's also a lot of support for doing audio DSP live coding and sequencing. Cheating time! The good thing about that was I didn't actually have to try and do it in front of you, but that's where the fun is.

So, the history of this project: it actually dates back as far as 2011. It started just as a kind of repository to throw code at that I was reusing all the time. Originally it was not Processing-based; I had my own OpenGL pipeline in there, and there was very limited support for adding custom components and coding. The whole thing was basically extensively rewritten towards the end of 2015, with Processing added and this ability to recode anything; that was roughly the state you saw if you were in London. Version 3 in 2017 added support for Java 8, so there's quite a lot of use of functional programming where possible, which you won't get in standard Processing. Processing 3 support. Support for sharing textures if you're on Mac or Windows, and it's really annoying that there isn't an equivalent on Linux. Support for the Pi. Support for adding third-party libraries, which brings us to the unprepared bit. This is one of the examples I wanted to show you: in terms of third-party libraries, this was bringing everything together. This is HE_Mesh, which is a Processing library, very interesting for doing geometries and things like that. All I was doing there was working with geometries and then bringing video textures and shader textures into that; you can quickly and easily patch between things and work with them. It's also mouse-responsive, which is something else that's been added: keyboard and mouse support.

And then most recently there's the new version, which was meant to be finished by now, but it's almost there; if you're coming tomorrow you get the release candidate to play with, Bezier connections and all. It's much more focused on recoding anything: there were a few components that weren't recodable, so there's been a push to get everything recodable, and also towards being able to recode with anything, meaning generic data ports and pushing any sort of data, vector data, that sort of thing, through the components you build. That's part of trying to build a user base and make it a more sustainable project. The core runtime, which has always been slightly separate from the IDE, is getting more of a focus as a usable thing in itself, and is being re-licensed under the LGPL.
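Going back to that audio-into-visuals patching for a moment, here's a stand-alone sketch of the idea in the same Processing style. Real FFT magnitudes would arrive from the audio graph through a data port; here they're faked with noise() so the sketch runs on its own, and the names are illustrative rather than PraxisLIVE's actual port API.

```java
// FFT magnitudes displacing a ring of vertices. The 'fft' array is
// faked with noise() here; in a patch it would be fed from the
// audio graph's analyser.

int bands = 64;
float[] fft = new float[bands];

void setup() {
  size(400, 400);
}

void draw() {
  background(0);
  noFill();
  stroke(255);
  for (int i = 0; i < bands; i++) {
    fft[i] = noise(i * 0.1, frameCount * 0.02);  // stand-in data
  }
  translate(width / 2, height / 2);
  beginShape();
  for (int i = 0; i < bands; i++) {
    float angle = TWO_PI * i / bands;
    float r = 80 + 80 * fft[i];  // band magnitude pushes the radius out
    vertex(r * cos(angle), r * sin(angle));
  }
  endShape(CLOSE);
}
```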
So PraxisLIVE is kind of built around what Andrew Sorensen, developer of a project called Extempore, has talked about as cyber-physical coding: the idea of real-time programming of real-time systems, which is a big influence on where PraxisLIVE is and where it's going. It's the idea that user code is a first-class citizen. It's turtles all the way down, a phrase you may know: it's not a node system with a lot of native components plus a scripting language, where any custom functionality you create runs slower, or something like that. The idea is that the entire stack of the application is there: you can go as deep or as high-level as you want. You don't have to code at all, or you can code very low-level; it's up to you. On the audio side, that goes down to being able to code sample-level DSP and play with filters at that performance level (there's a sketch of that at the end of this section), right up to patching together video components and filters very easily.

This is getting technical, but it's based around a forest-of-actors architecture, actors as in the programming model. So we have a graph of components running, which you patch together as you wish, but you can have as many graphs as you want: different video patches, audio patches, whatever, with built-in support for communicating between them. There are various sorts of black-box services as part of that, such as live compiling or background loading of images. And by the same mechanism we can split that across different processes or different machines, so it's quite possible to have the IDE and bits of a project running on one laptop and something running on another; I know a few people have used that. Doing real-time stuff on a JVM gets criticised occasionally because of a certain thing called the garbage collector, but actually one way of mitigating that is precisely to run things in multiple processes. And with this running on the Pi, it's possible to have a project running on a Pi, or a group of Pis, and control them all from a laptop.

And if you're coming from a Processing background: what does PraxisLIVE give you above the PDE? Real-time live programming. Multiple sketches: it wraps things so it feels like you're programming a sketch in every component. Lots of pre-built things that you can use as a basis. Built-in support for GStreamer 1, which is coming, I think, in the Processing video library soon. Blending fixed: I extended the Processing pipeline to use pre-multiplied alpha. I don't want to get into the flame war of whether you should or you shouldn't, but Processing mixes the two, so when you start compositing down a whole chain of things it goes wrong; with pre-multiplied alpha the colour channels are stored already multiplied by their alpha, so blends compose consistently down the chain (sketch of that below too). Then there are things around resource management, so loading images and caching things; also, if you have a long chain of operations, it will reuse textures from before and work out what it can and can't copy. Threading and distribution done right: the idea of being able to pass things lock-free between a background process that's talking to the network or audio or sensors or whatever is all built in, so you don't have to think about it. It's built on top of a professional IDE, the NetBeans platform, with lots of things included. And the one thing that niggles me about Processing is its claim of FLOSS while not supporting OpenJDK properly; that's one thing I'm definitely doing, and the Windows and Mac installers generally use OpenJDK.
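Two quick sketches to make a couple of those points concrete. First, what sample-level DSP means here: a one-pole low-pass filter, written as a plain Java class. PraxisLIVE's actual audio API wraps units like this differently, so this is a sketch of the technique under that caveat, not its real API.

```java
// A one-pole low-pass filter, the kind of per-sample DSP you can
// live-code. Plain Java; PraxisLIVE's own audio API differs.
public class OnePoleLowPass {

    private final float coeff;  // smoothing factor from the cutoff
    private float last;         // filter state: one sample of memory

    public OnePoleLowPass(float cutoffHz, float sampleRate) {
        // standard one-pole coefficient: higher cutoff, less smoothing
        coeff = (float) Math.exp(-2.0 * Math.PI * cutoffHz / sampleRate);
    }

    // called once per sample from the audio callback
    public float process(float in) {
        last = (1 - coeff) * in + coeff * last;
        return last;
    }
}
```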
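And second, the pre-multiplied alpha point: with colour channels stored already multiplied by their alpha, Porter-Duff "source over" becomes a single multiply-add per channel, which is why long compositing chains stay consistent. A minimal illustration in plain Java:

```java
// Pre-multiplied alpha in miniature: premultiply once, then
// "source over" is dst' = src + dst * (1 - srcAlpha) per channel.
public final class Premultiply {

    // convert a straight-alpha RGBA pixel to pre-multiplied form
    static float[] premultiply(float r, float g, float b, float a) {
        return new float[] { r * a, g * a, b * a, a };
    }

    // Porter-Duff "source over" for two pre-multiplied pixels
    static float[] over(float[] src, float[] dst) {
        float inv = 1 - src[3];
        return new float[] {
            src[0] + dst[0] * inv,
            src[1] + dst[1] * inv,
            src[2] + dst[2] * inv,
            src[3] + dst[3] * inv
        };
    }
}
```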
What might you build? I thought I'd share a couple of images of things I've found that other people have made. Max, who unfortunately couldn't come, was going to be talking about another project of his called Freeliner, which is mapping software. That's him; I'm not quite sure where that is, but he's using the distributed hub, so he's got the video part of the project actually running on another machine, projecting in a nightclub, I think. Doing things with live graphics, a nice set-up with two screens so he can actually be editing and working with it live. That's one of his things, mixing PraxisLIVE and Freeliner together. Some other images. Hopefully this will work; it's a nice video he sent me last night which I tried to put in. It might be a bit loud. Really nice visuals anyway, even if the sound is a bit odd.

Mathias I met doing a workshop in Tübingen for Generate in October last year. He'd done a lot of work with Processing before. Well, Tinkerforge are German, so it's a patriotic thing to do, but he made this excellent set-up with a load of Tinkerforge components to control his Processing sketches. He's doing really beautiful things, lots of monochrome things. This is using the luminance of a video file to manipulate and mash up lots of still images. I think it's punkamac.de, or his Instagram; there are a lot of videos of his stuff there. It's really nice. Lots of simple 3D, really effective tunnels.

And then I had a great interaction on the Gitter support channel, which I'd just set up a couple of weeks before, with a guy whose name I don't know; I just know his Twitter handle. It turned out later he had a little experience working with Ruby, but he hadn't used any Java before, and hadn't used PraxisLIVE before. In about two weeks he was doing his first VJing set-up, and he built that. That's with the UI builder, controlling and working with shaders. It's probably one of the most impressive UIs I've seen someone build in PraxisLIVE, because I don't tend to use it myself, and definitely impressive for someone who'd only been using it for a couple of weeks. He's on a Mac, so he's able to use Syphon to bring textures in and run preview windows and things, but he built that, and it's on GitHub as well. A few nice things people have said: the guy who made the VJ thing was talking about the speed of developing it and how much he enjoyed it.

OK, so that's really why I'm speaking. Tomorrow morning, if you want to have a go or see some more examples, there's the workshop. Slightly early maybe, 10.30, so not too much beer tonight. Thank you. Everything is on GitHub; I know someone questioned that at some point on the mailing list when I applied to talk. The PraxisLIVE website has all the sources. If you want to come and get involved, have a go, talk to me; that would be great. Any questions?

I have a question about the Syphon stuff. You said that's on Windows and Mac; what does it do exactly, just grabbing textures from another window? So, Spout is the Windows equivalent and Syphon is the Mac one, and both are supported. They allow direct texture sharing between processes. How does the other process share that texture; is it something that depends on the other software as well? Yes, it's basically like a client and server: something that sends textures acts as a server, and any client can connect to that and just gets pumped the texture.
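To make that client and server model concrete, this is roughly what the sending side looks like from a Processing sketch using the Processing Syphon library (Mac only). The class and method names here follow that library's published examples; if they've changed since, treat them as assumptions.

```java
// Sketch of a Syphon server in Processing (Mac only): every frame
// drawn here is published, and any Syphon client can pick it up.
import codeanticode.syphon.*;

SyphonServer server;

void setup() {
  size(400, 400, P3D);  // Syphon needs an OpenGL renderer
  server = new SyphonServer(this, "Demo");  // advertise as "Demo"
}

void draw() {
  background(0);
  ellipse(mouseX, mouseY, 50, 50);
  server.sendScreen();  // push this frame to connected clients
}
```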
And is there really no way on Linux to get at the textures? Not as far as I'm aware. I've seen, or been involved with, loads of discussions about it; there are a number of groups pushing Linux as a libre VJ platform who are really pushing for it. I mean, I'm not a VJ, I just happen to have written something VJs quite like. So you would need a new protocol like that to connect those processes? Potentially there's something coming: PipeWire, I think it's called, from one of the people who originally set up GStreamer. It's a little bit like trying to be the equivalent of JACK, for audio but also for video. And I think there's a possibility it will allow texture sharing. I have a vague feeling, from things I've seen on the GStreamer mailing list, that there's something in there that would allow it now, but it feels like a bit of a black box. So from your perspective, at what layer would that be implemented? What do you mean? To do the texture sharing: should that be in OpenGL, or higher or lower in the stack? I don't know that there's a simple answer to that, and I think it will change depending on which OpenGL you're using, because I think GLES might have some functionality for that anyway. PipeWire is also a lot about these container-based things like Flatpak and Snap and AppImages, where you isolate the applications but then want access to the webcam, or to share media, or to get access to media devices. Normally an application would get direct access to the webcam itself; instead PipeWire provides that access, and maybe that includes requesting texture data from the webcam. OK, thank you.