So, welcome back to the c-base channel. You know the face: it's Riot again, now with his talk about AVIO, audio, video, input, output. Sounds stupid, but it's great. The stage is yours.

Hello. Welcome back, or welcome, to this lecture. This lecture is about a bit of an artsy thing, but it's also a technology demonstrator. If you followed my earlier talk from an hour ago, I already mentioned that this project was built with the Isomer framework, which was initially developed for the Hackerfleet operating system. One thing I wanted to showcase: Python developers always get silly comments like "Python is slow", and I set out to demonstrate that that is just untrue. It depends on how you use it. So I started building a multimedia system, or a solution that is, well, let's see what it is.

I hope you like arts, because this talk introduces you to AVIO, and AVIO was made for arts, arts in the context of many things. What does AVIO stand for? It's a short acronym, and I know it's a little stupid, but it stands for audio, video, input, output. And it doesn't stop there: we take much more than just audio and video data and mangle it, we also take lots of weird inputs and outputs. You can control stuff with your joystick, as we'll see later. The project came up because I had all these formats, and I wanted to mix and mingle them and be creative with them. If you look at what the cool kids in the industry are doing, in the music industry or the VJ industry, they all have cool tools, tools like vvvv or Blender. Everybody has some tool in use to make their performance greater. AVIO aims to be the Swiss army knife of these tools. It has some focus, but then again not; it's complicated. But because of the multitude of inputs and outputs, you can combine it with any other tool. So let's dive into the technical aspects of the software suite.
It's actually just a bunch of imports and some very interesting glue to tie everything together. I'm standing on the shoulders of giants here, because Python has learned so many tricks regarding multimedia and various input and output formats. For some things you just import a library, use it, and you're happy. Yes, please clap now. It's not really that much effort, but I think the collection, and how it's glued together, is what makes AVIO special. Behind the scenes, as I already mentioned, it uses Isomer as its core framework, because Isomer brings facilities that are really useful for building such a tool. There's a web front end, which you can use to configure various parameters of your operation; it has live previews and you can use it as a renderer. But you also get the full power of the Isomer back end, in terms of modularity and components.

Let's see, I think I have a slide about that. Yes, we're getting into the gritty details now. The general idea of AVIO is that everything is a first-class citizen. I'm not focusing on any aspect specifically; every part should get the same attention and be intermixable in any imaginable way. Some of these combinations don't make sense, others make a lot of sense, and the right attitude when working with the suite is to try things out. It's very experimental, and sometimes you stumble upon very interesting things to do. The overall component architecture allows you to combine various pieces of technology in new and sometimes meaningful ways. It's much like Pure Data, where you build graphs of components that communicate with each other to achieve specific goals. Plugins can be developed with the Isomer infrastructure in mind, so you get general-purpose tools for communication and network operations, but also simple components like a Pygame input component that lets you use SDL input devices.
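That graph-of-components idea can be sketched as a tiny event bus in plain Python. To be clear, this is an illustrative toy, not Isomer's or AVIO's actual API; the names `Bus`, `on`, and `emit` are invented for the example.

```python
# Toy event bus illustrating components that communicate by named events.
# NOT the Isomer/AVIO API; all names here are invented for illustration.

from collections import defaultdict

class Bus:
    """Routes named events to whichever components subscribed to them."""
    def __init__(self):
        self.handlers = defaultdict(list)

    def on(self, event, handler):
        self.handlers[event].append(handler)

    def emit(self, event, data):
        # Nobody is required to listen; events without handlers vanish silently.
        for handler in self.handlers[event]:
            handler(data)

bus = Bus()
levels = []

# A "mixer" component picks up joystick axis events and normalizes them.
bus.on("joystick_axis", lambda value: levels.append(value / 255.0))

bus.emit("joystick_axis", 128)   # some input component fires an axis event
bus.emit("midi_clock", None)     # no subscriber: silently ignored
print(levels)
```

The point of the pattern is decoupling: an input component never knows whether a mixer, a recorder, or nothing at all is listening, which is what makes the graphs freely recombinable.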
The components communicate by event-based messaging: you just emit your data, and somebody else might pick it up, or might not, depending on which components you're running; but you can design concise graphs. This allows asynchronous handling, which is very important, because I don't know when media input or joystick input will arrive. Everything happens on the fly, so everything needs to be processed that way. It also allows for very efficient computation if you do it the right way: think in streams, and you're pretty much set. The detailed user interface, which is not really performance-oriented, runs in a web browser served over web servers, so you can fine-tune things or load configuration data, but it's not meant as a live performance interface. I'd like to get the computer out of the way when I'm performing as a musician, so this is just for setting things up.

To actually do something with AVIO, you need a little more than just input and output components. There's a real bunch of components for different kinds of tasks, and I come up with new ideas every few months. Just recently I built a beat counter, which lets you synchronize better to music, and a joystick interface for switching presets, for example. So lots of batteries are already included. We have human interface device support for gamepads and joysticks, including the analog sensors in them. I was astonished to find out that on certain gamepads, what look like buttons are actually analog buttons, so you get 26 analog inputs on one of these controllers. There are also cameras and other OpenCV-based sensors available. We have MIDI input and output via Pygame, but I'm working on other solutions as well, so you can communicate with JACK directly. There's also an OSC library that I integrated, so you can receive data from OSC controls or send it out.
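Turning one of those analog readings into MIDI is genuinely small arithmetic. Here is a sketch that maps a 0..255 axis byte onto a 14-bit MIDI pitch-bend message; the function name and the mapping choice are mine, not AVIO's, and a real setup would ship the resulting bytes out via pygame.midi, a JACK client, or similar.

```python
# Mapping a byte-sized analog axis (0..255) onto a 14-bit MIDI pitch-bend
# message. Pure arithmetic sketch; actually sending it needs a MIDI backend.

def pitch_bend(axis_value, channel=0):
    """Turn a 0..255 axis reading into a 3-byte MIDI pitch-bend message."""
    # Scale 0..255 to the 14-bit range 0..16383 (8192 is the neutral center).
    bend = axis_value * 16383 // 255
    lsb = bend & 0x7F            # low 7 bits, sent first
    msb = (bend >> 7) & 0x7F     # high 7 bits
    # Status byte 0xE0 selects "pitch bend" on the given channel.
    return bytes([0xE0 | channel, lsb, msb])

print(pitch_bend(0).hex())       # e00000: full bend down
print(pitch_bend(255).hex())     # e07f7f: full bend up
```

With a stream of accelerometer bytes fed through a mapping like this, the "shake the gamepad for vibrato" trick described later is just this function called per event.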
You can obviously import and export various file formats. I work a lot with animated image sequences like GIFs, but I've also loaded videos already, and there's all sorts of weird stuff that might come in useful depending on what you're doing and what you already have. It's also easily extendable: you can write a protocol adapter in less than ten minutes and it's good to use.

The other interesting part is the output. We talked about lots of inputs; how do you output the result? With the data buses it's pretty easy: you just send out MIDI clock, data, or some other control information. But very often you want to render video data, for example. I'm working on this with the Phaser library in the JavaScript front end, so you have a rendering head that runs in your browser and can make use of 3D surfaces or 2D art. You can play back sounds and music if you want to, or you could stream audio from the AVIO server itself. Talking about servers: AVIO is obviously very strongly attached to network devices, so you can have multiple machines in your system, one dedicated to this task, one dedicated to that, and they can communicate with each other and exchange meaningful information, like scenery control data. But this mostly needs to be built by hand, because there's not much infrastructure yet to automate these kinds of things.

I was playing around with mixing video sequences for our Mate Light. I don't know if we can sweep the camera over to it; maybe someone can do that quickly. It's a 40 by 16 pixel light display. Some people may have known it for years already; it has been at Congresses. It was one of my prime output examples, because mixing video information for this tiny display is so cheap that you can even do it in PHP, or in BASIC, or in shell script. And people are doing that.
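Driving a display like that really is about as simple as output gets. The sketch below builds one frame for a Mate-Light-style 40x16 RGB display and shows how it would go out as a single UDP datagram; the exact wire format (RGB triplets plus a CRC32 trailer), the port, and the hostname are my assumptions here, not a verified spec.

```python
# Sketch: one frame for a Mate-Light-style 40x16 RGB display over UDP.
# Wire format, port, and host are assumptions for illustration.

import socket
import struct
import zlib

WIDTH, HEIGHT = 40, 16

def build_frame(pixel):
    """Fill the whole display with one (r, g, b) tuple, append a CRC32."""
    payload = bytes(pixel) * (WIDTH * HEIGHT)      # 40 * 16 * 3 = 1920 bytes
    return payload + struct.pack(">I", zlib.crc32(payload))

def send_frame(frame, host="matelight.example", port=1337):
    # Fire-and-forget datagram; ~1920 bytes still fits one UDP packet.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(frame, (host, port))

frame = build_frame((255, 0, 0))                   # all-red test frame
print(len(frame))                                  # 1924: pixels plus checksum
```

At this payload size there is no framing, no handshake, and no state, which is why even shell scripts can drive such a display.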
There are actually really nice applications working with it, so it's a perfect candidate for AVIO test runs. I started mixing video information for it, I think, four or five years ago, and that was actually the foundation of the AVIO framework; it started that way. But at some point, when I was really fed up with all those naysayers claiming Python is too slow, I decided to just increase the frame buffer a little and take bigger input imagery to mix. So I was mixing six to seven full-HD streams in Python in real time, and I think that's pretty impressive, considering there was no optimization going on at all. I was just using NumPy to transform these matrices into each other, and it worked.

Since then I've received many pointers on how to build a blazingly fast system. There are approaches that do this in graphics card RAM, because with textures you can be so much faster; you could animate textures just by rolling them, for example. Many interesting ideas came up from various communities, and I hope to add some of those in the next few months, so we get a fully fledged video mixer capable of full HD or even higher resolutions.

So what did I do with it already? Some things were just too good not to try out, and some of them stuck; other experiments were less successful. But let's check out a few. I already mentioned the Mate Light mixer. I'm getting ahead of myself, but this is really a nice tool, and I hope to have a nice front end for controlling the VJ functionality soon. I've been building something with a web front end that works like a mixer deck: you can see several slots and add more of them, and you can also render text labels. This is all preparation for a larger system that can run on more than just 40 by 16 pixels. Then there was the virtual vibrato with the Sony Sixaxis controller. Hello, Sony.
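Mixing streams with NumPy boils down to a weighted sum over whole frame arrays, so the per-pixel arithmetic runs inside NumPy's C code rather than in a Python loop; that is the whole trick behind mixing several full-HD streams in real time. A minimal two-frame crossfade, assuming frames arrive as HxWx3 uint8 arrays (the real mixer surely does more):

```python
# Crossfading two video frames as NumPy arrays: the per-pixel arithmetic
# runs vectorized in C, which is what keeps real-time mixing feasible.

import numpy as np

def crossfade(frame_a, frame_b, alpha):
    """Blend two HxWx3 uint8 frames; alpha=0 gives frame_a, alpha=1 frame_b."""
    mixed = (1.0 - alpha) * frame_a.astype(np.float32) \
            + alpha * frame_b.astype(np.float32)
    return mixed.astype(np.uint8)

# Two tiny "frames": one black, one white. A full-HD frame would simply be
# a 1080x1920x3 array; the code does not change.
black = np.zeros((16, 40, 3), dtype=np.uint8)
white = np.full((16, 40, 3), 255, dtype=np.uint8)

mid = crossfade(black, white, 0.5)
print(mid[0, 0])   # halfway grey: [127 127 127]
```

Swapping `crossfade` for additive, multiplicative, or masked blends is just a different line of array arithmetic, which is why experimenting with mix modes stays cheap.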
It's really nice, by the way, that you developed a Linux driver for your recent gamepads; they are not completely evil, I love that. I was playing around with that a lot, because it gives you a lot of analog input, and it's conveniently already byte-sized, so you can just take it and translate it into MIDI data. That's what I tried. Then I hooked it up with Bitwig Studio and routed some of those modulators to the pitch frequency of a synthesizer, so by shaking the gamepad I could play a perfect vibrato via the accelerometer. And I could tune it: you can shake slowly, you can shake fast, you can shake heftily or with just small movements. It's very finely tunable, like playing a real vibrato on a real instrument, but you can attach it to any aspect of your synthesis process. You could also convert this into mixing data for VJing, or you could just hot-glue the controller to your instrument and do some movement control on your FX chain, for example. It offers lots of possibilities, and sadly I don't have much time to try out new things; otherwise I'd probably be doing wicked shit with it.

So why am I focusing on this Isomer aspect? Because the Isomer framework is about to get some upgrades in the next few months that are really beneficial for AVIO as well. There will be pipes and buffer tools for more protocols, which are not core to AVIO but share common functionality with other applications built on Isomer. For example, we added MQTT for the sailors, to get sensor data across networks; this may be useful in performance situations as well. There's strong support for command line tools, which may not sound relevant, but I catch myself quite often fine-tuning things with command line tools I wrote. And the comprehensive, configurable web access allows you to work collaboratively on your performance.
Essentially, through the client-server infrastructure, this gives you multi-user access to what you're doing. People can fine-tune their aspects of the show, or completely control everything if you want them to. You can also limit that with permissions, but no, we're artists; we're not limiting ourselves. Then there's the aspect of peer-to-peer, mesh-based networking. This opens whole new areas of performance, for large outdoor sceneries for example, and I'm really wondering what the community may come up with. I hope you have a look at this, maybe adopt it, and try out what you can do.

So I sure hope you like Isomer and AVIO by now, and that pretty much concludes my talk. Maybe we have some community questions now; perhaps, never give up hope. If not, you can always find me online: I'm riot at c-base.org, I'm Riot on Freenode, and there are several other channels you can contact me through. I hope you enjoyed this talk as well. There will be one last talk from me at 8 p.m. this evening. It's a German lecture about the Leerstandsmelder, where I'll be presenting a social tool to get a better handle on illegal activities around Leerstand, vacancies of shared flats. Thank you and have a good rC3. Bye.