All right, it's time for basically our last talk of the day in the main hall. Give it up for Rui, the creator of all the Qstuff. And this talk is also about the Qstuff, an extended version of it. So, give it up.

Thank you. So, this is not quite a talk, nor a workshop. It's the same conversation we have whenever I come to LAC, about the work I do at home, at night, as a hobby, for more than 13 years now. It started with the tools you already know. Who here doesn't know what Qstuff means? It started with QjackCtl, the tool you use to control the JACK Audio Connection Kit, and I think most people are aware of its existence. Am I right or am I wrong? I've seen that everybody in the talks before mine used QjackCtl as a controller for JACK. It's just a graphical interface for JACK, and it's been around since 2003 or so, when it started. The next project I got into was Qsynth, a GUI front end for FluidSynth. It's also aging software; we can probably call it legacy software already, like QjackCtl. Then, also in 2003, came the emergence of the Linux Sampler project. Being a server without a GUI, it deserved a proper GUI of its own, and so Qsampler appeared. Qsampler is again a GUI front end for Linux Sampler: if you know Linux Sampler, it's a client-server design. The client is Qsampler; the server, the real engine, is called Linux Sampler. Nothing new here; this was old Qstuff already ten years ago, when it started. Almost right after that, I started attending the LACs, and I have attended yearly ever since; this is, I think, my 11th attendance. And it was in Berlin, nine years ago, that Qtractor was introduced, as an audio and MIDI sequencer to fill a gap that was missing in Linux audio at the time.
At the time we already had Ardour, we already had Rosegarden, we already had the MusE sequencer. But Ardour was audio-only at the time, and Rosegarden was MIDI-only. MusE already did some audio, but its source code back then was all in German, even the comments and variable names. I could not read the source code of MusE to improve it and get into the project; it was a nightmare, because I didn't understand a word of German. So I started my own, and my own was Qtractor. Qtractor was presented for the first time, already fully functional, in Berlin at LAC 2007, and it was a complete success. I hope so, anyway. A year later, in 2008, came DSSI plugin support, native VST support, and the tempo map. So by this time Qtractor was a general-purpose audio and MIDI sequencer. Why do I keep calling it a sequencer, and not a DAW, as many of you do? Because it is a sequencer with DAW features, not a DAW with sequencer features: it was audio and MIDI from the start. In 2009 LV2 support appeared, just as the LV2 craze, or spec, came about; it was implemented that same year. And in 2011, a few years back, came one of the great feature additions to Qtractor, the acronym on this slide. Do you know what it means? No? Everybody who knows is not in the room. This was the year of Qtractor automation: 2011 was when automation was ready and released in Qtractor. So these were the major milestones. For most of this time Qtractor has stayed a mature application, with mostly the same interface and the same functionality. Now it's just an ongoing project of mine, of course, where I only fix bugs, do some maintenance, and respond to user demand for features that may or may not be possible to do.
In the years after, I found myself with nothing to do at home. I say it like that because this is a hobby; I have a day job. I don't do Linux audio work full-time; it's actually part-time, a hobby at home. Yes, it's hobby software. QXGEdit was an editor for Yamaha XG synthesizers, of which I have plenty of daughterboards back home in my desktop computers; it's something from the 90s, so don't worry if you don't know what that means. Then, when the first smartphones from Nokia appeared, the Internet Tablets of the time, with Maemo, the first Linux-based distribution, or operating system, for them, I also started these toys. QmidiNet is a MIDI gateway over the air: you can send raw MIDI, plug-and-play, over the wire or wireless, as you choose. It's not real-time-stamped, but it's real-time enough; faster than real time, even. It's just a gateway that sits in the system tray of your desktop Linux system and provides MIDI ports that are automatically connected between machines running the same application. If you have any doubts or questions, interrupt me, because this is not a formal talk. If you have something to say, raise your hand and ask me whenever, okay? Years later, I found myself with nothing to do again, and I said, well, let's make some toys. And I made three kinds of toys: a pure synth, a subtractive polyphonic synth; a polyphonic sampler with engines similar to the first one; and a drum-kit sampler, or a rompler, you might say, but with one sampler per key, or per pad. If you remember Fabla from this morning, it assigns a set of samples to a pad or a key. drumkv1 only takes one sample per key, but all the synth lines, the envelope generators, LFOs, effects and so on, are per key.
Each sample has its own line, its own channel. Like I said, this is all the work of one lone-wolf developer. I'm an old-school programmer; I was once a full-time developer in the commercial world, the financial world, back in the last century. Then I changed jobs, or functions: I became a database administrator, specialized in IBM DB2, and I am an IBM employee to this day. So my career is completely different, nothing related to Linux audio or C++ programming. But I have the skills, the know-how, all my baggage as a developer, and so I give away some of my knowledge to the open-source and free-software world. That's why I call myself a gift-economy player. So, what happened to Qtractor? Qtractor goes to 11. Do you know the joke? I think you may not know what this means. It's an old joke about amplifiers; I think it appeared on the BBC, but correct me if I'm wrong. There was a guy interviewing the Spinal Tap guitar player, who was showing off these new Marshall amps: "Oh, you have something special here. Your knobs go to 11, while normal Marshall amplifiers only go to 10. So what's that, does it sound louder?" "Yeah, it goes to 11. Of course it's louder." But here, 11 is for the 11th year of life of Qtractor; that's why it's a joke. So, Qtractor has lots of features, and this slide is repeated year after year, always the same: it's a multi-track audio and MIDI sequencer. It records either audio or MIDI. It's developed now in Qt5, though you can still build it with Qt4 as an option. It uses JACK and ALSA for audio and MIDI respectively as its infrastructure. And why, you may ask, why still the ALSA sequencer?
Yes, when I started developing there was no JACK MIDI yet. But anyway, I chose to stay on the ALSA sequencer basis, or groundwork, or framework, because I want this to stay on Linux. I will never port it to any other operating system. Qtractor is a Linux application. Don't ever ask me, at least me, to make it run on Windows or on Mac OS X or whatever; I will not do it. Of course you can fork the code and try it yourself, but okay, I wish you luck. Sorry. That's why I still have this stance and this attitude: I'll keep it on Linux, and I'll keep it on the ALSA sequencer for MIDI, exclusively. There are shortcomings that we all know, but there are workarounds, and Qtractor provides those workarounds to cover the features that the other MIDI engines provide, like JACK MIDI, which gives sample-accurate MIDI sequencing, or transport of data. I would say that everything inside Qtractor, once MIDI enters the Qtractor model or engine, is sample-accurate, be sure of that. Don't come with the argument that the ALSA sequencer is not sample-accurate. It's not sample-accurate across the whole system, but the workings inside Qtractor are sample-accurate. So that's that. The basic user interface is the multi-track recorder control paradigm with a timeline display, which everyone knows, because from Cubase and Cakewalk, then Sonar, then Logic, then GarageBand, it has always been the same layout: tracks laid over a timeline, which is the horizontal axis. It supports several audio formats, as many as the libsndfile library provides: it takes all its format support from libsndfile, plus MP3, but read-only, and Ogg Vorbis. I don't recall now the name of the company that made Ogg Vorbis.
That is, with the native Xiph encoder for Ogg Vorbis, though you can also read it through libsndfile, no problem; you just have several code paths. So, is it a sequencer or a DAW feature that it is non-destructive? It does non-linear editing, as usual in these tools. You have an unlimited number of tracks, and an unlimited number of overlapping clips per track. Clips are windowed views of content that lives in files: audio sample files or MIDI files. Let's say that Qtractor doesn't introduce any new audio or file format; it uses the standard formats for audio and for MIDI, with no translation in between, so there's no need to back up a proprietary data file either. The session is only a description of the layout, and that session layout is an XML file. You can edit it by hand, but you can also bundle all the material into a zip archive file, just like LibreOffice does with its content. The rest, I think we all know: it's a WIMP interface, windows, icons, menus, pointer interaction, so most of the actions are accessible from the mouse and the keyboard, and later also from MIDI controllers; but that is more or less structurally fundamental, and it's all I want to say about interaction with the user interface. It supports the old LADSPA, the later DSSI, native VST plugins compiled for Linux, and of course LV2 plugins. I must say it has supported these for quite a long time, each with its drawbacks; there were stones in the road and bumps all over the place, but today things are much more stable in all variants: DSSI, VST or LV2. You can put any number of plugins, one after another, on tracks and on buses. I should explain what a track is and what a bus is; a track you already know.
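The "session XML plus zip bundle" idea just described can be sketched in a few lines. This is an illustrative sketch only, not Qtractor's actual schema or archive layout; the element names and file extensions here are hypothetical:

```python
import io
import zipfile
import xml.etree.ElementTree as ET

def bundle_session(session_name, media_files):
    """Bundle an XML session description plus its media files into one
    zip archive, LibreOffice-style (hypothetical layout, for illustration)."""
    # Build a minimal session description referencing each media file.
    root = ET.Element("session", name=session_name)
    for path in media_files:
        ET.SubElement(root, "clip", file=path)
    xml_bytes = ET.tostring(root, encoding="utf-8")

    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as zf:
        zf.writestr(session_name + ".qtr", xml_bytes)   # the session XML
        for path, data in media_files.items():          # the referenced media
            zf.writestr(path, data)
    return buf.getvalue()

archive = bundle_session("demo", {"drums.wav": b"\x00" * 16})
with zipfile.ZipFile(io.BytesIO(archive)) as zf:
    print(zf.namelist())  # ['demo.qtr', 'drums.wav']
```

The point is the same as in the talk: the session file is only a description; the standard-format media files travel alongside it untranslated.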
It's a very common concept, but a bus in Qtractor is a little different from what you find in the other DAWs, or other applications of the same type. In the other applications, a bus is something that emulates a set of strips in a mixer: the audio channels you find on each strip mix down onto a bus. It's almost similar in Qtractor: all tracks converge and mix down to a bus, whether audio or MIDI. It's the same concept, and the same signal flow, as I will show later, happens the same way for audio and for MIDI. But buses have a special meaning in Qtractor, because a bus is also the technical interface: buses are the external ports of the application. One bus maps to one JACK port, or one ALSA MIDI port, inputs or outputs, and the signal always flows from input to output. There are input ports and there are output ports; Qtractor is a black box, if you think about it. The most important thing about the concept of a bus here is that buses are ports, input or output, either audio or MIDI. They share a common concept, as you see. Back to plugins: it supports plugin presets, either the factory ones, or you can create your own in Qtractor. These can be shared with other DAWs, especially the LV2 ones, and the VST ones through the FXP and FXB preset file formats. This is special because you can exchange these preset files, as program presets, across instances of the same VST, whether it runs on Linux, on a Mac, or on Windows. You can exchange presets across architectures; they are not locked down to Linux or to Qtractor. This is only for VST, though. You can put as many plugins as you wish per track or per bus. And one special feature: you can have loop recording, with takes, which can be folded and unfolded as takes, as they are called in Qtractor.
Most of these things may be completely strange to you, but later, when we get to the hands-on demonstration, you will see the way we can work with Qtractor, easily or not, it depends; I can show you all of it. So just make a mark, or a bookmark, on each feature you don't understand; ask me later and I'll show you what it really means on the screen, working in practice. The feature list goes on and on, one item after another; it's a pile. I can stop right here, because this part is very specific and probably boring for most of you, but these are all features that have been there for a long, long time. So I will skip through it quickly, not because I have little time, but because I want to reserve more time for the hands-on demonstration and for talking to you; this presentation is just to put people in the zone. And, you know, this is the official screenshot. The latest version has a teeny new eye-candy feature, which is icons per track: you can put an icon on a track, just to make it look more like Logic or GarageBand, if you know what I mean. And this is the main block diagram representing the workflow inside Qtractor, in the most general form. As you see, buses are the input ports and the output ports. Tracks are stacked here, and they converge to one output or another. It doesn't mean buses are always the same, or always stereo: you can create many more buses, with many more ports, to do real multi-track recording. You can record your whole band simultaneously, microphones and pickups, guitars, amplifiers, whatever, each into its own audio input. You need a multi-channel sound device, of course; out of the box, the default is stereo only. That's why the default session in Qtractor is always stereo in and stereo out.
But that's just the default. You can set up plenty of input buses, or buses in general, each with the number of channels of your choice or design. You set up your studio as you will; you're not locked into stereo in and stereo out. A bus can be input, output, or duplex. When I say duplex, it's an entity that has input ports and output ports, and a switch, which we call monitor, that is a pass-through: what enters the input appears on the output automatically. That's the monitor switch; it's an option on each track and each bus. There are differences depending on whether you are dealing with an audio signal or a MIDI signal. This is about what happens in that yellow triangle over there: that's where the gain and panning controls and the plugin chain of each entity live, whether it's a track, an output bus, or an input bus. If it's audio, the signal enters the plugin chain and then goes to the mixing controls. The plugins are pre-fader: the volume fader and the panning control come after the plugin chain, after the effects. All effects appear as pre-fader effects, or pre-fader inserts, as we can call them. There are no post-fader inserts yet; it was asked before, it's asked every year, and I'm thinking about doing something like the hardware desks have, with post-fader inserts as well. When I say inserts, I mean plugin chains: a chain into which we insert plugins, one after the other. On an audio track, it only makes sense to insert audio effect plugins: plugins that take audio as input and transform it into audio output. No MIDI signal enters there; it's exclusively for audio. That's why MIDI tracks, and MIDI buses too, to a certain extent, might be called super-tracks: they are more specialized, because they can handle MIDI and they carry audio signal as well.
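The pre-fader ordering just described, plugin chain first, then fader and pan, can be sketched as pseudo-DSP. This is a simplified mono-in, stereo-out sketch of the ordering, not Qtractor's actual processing code; the pan law here is a naive linear one chosen for illustration:

```python
def process_track(samples, plugins, gain, pan):
    """Pre-fader routing: the plugin chain runs first, then the volume
    fader and panning are applied (simplified mono-to-stereo sketch)."""
    for plugin in plugins:                  # all inserts are pre-fader
        samples = [plugin(s) for s in samples]
    # Fader and pan come after the effects; pan in [0..1], 0.5 = center.
    left  = [s * gain * (1.0 - pan) for s in samples]
    right = [s * gain * pan for s in samples]
    return left, right

# A plugin that halves the signal, then a fader at 0.8, centered pan.
left, right = process_track([1.0, -1.0], [lambda s: s * 0.5], gain=0.8, pan=0.5)
```

Moving the fader before the loop would model a post-fader insert, which, as noted above, Qtractor does not currently offer.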
A MIDI track can hold pure MIDI processing, MIDI instruments that accept MIDI and output audio, or MIDI transformations, MIDI filters, that accept MIDI and output MIDI. You can interleave several types, any type, of plugin: you can insert MIDI filters first, then a MIDI instrument. A MIDI instrument, a synth or a sampler or whatever, is a plugin that accepts MIDI input and produces audio output, and that audio output can be fed to more audio effects. That's the big difference with MIDI plugin chains. Also, on a MIDI track the volume fader, or slider, and the panning slider don't affect the audio signal directly. They are just specialized, dedicated MIDI controllers: the volume fader emits MIDI CC number 7, channel volume, and panning emits CC number 10, channel pan. These events then go into the plugin chain, and it depends on the implementation of each MIDI plugin whether it reacts to MIDI channel volume, CC 7, or MIDI channel panning, CC 10, if I remember correctly. That's a big difference. And what happens to the audio that comes out of a MIDI track? It can only go to an audio bus. The MIDI output bus doesn't come into it: what the MIDI instrument plugins produce is audio, and this audio is mixed down onto an audio bus of your choice. By default it's the first one, the first output bus that exists; normally, if you have only one, that's the master out. But if you create more, you can give them names and say that this track, this plugin chain, when producing audio, sends it to this or that output bus. More evil things now.
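The "fader as a dedicated MIDI controller" point can be made concrete: a fader move on a MIDI track becomes an ordinary 3-byte Control Change message. This sketch only builds the raw bytes per the MIDI 1.0 spec; it is not Qtractor code:

```python
def cc_message(channel, controller, value):
    """Build a 3-byte MIDI Control Change message: status 0xB0 | channel,
    then controller number and value (both 7-bit)."""
    return bytes([0xB0 | (channel & 0x0F), controller & 0x7F, value & 0x7F])

VOLUME_CC, PAN_CC = 7, 10   # channel volume and channel pan, per the MIDI spec

vol = cc_message(0, VOLUME_CC, 100)   # a fader move on MIDI channel 1
pan = cc_message(0, PAN_CC, 64)       # pan centered (64 = middle of 0..127)
```

Whether the instrument at the end of the chain honors CC 7 and CC 10 is, as the talk says, entirely up to that plugin's implementation.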
Now, you may not have noticed that you don't give a number of audio channels to a track: the number of channels of an audio track in Qtractor is determined by the number of channels of the audio bus that takes its output. So what happens if you put a multi-channel file, or clip, on an audio track connected to an audio bus with more or fewer channels than the audio file? The multiplexing is done automatically at the clip level. For example, you have a track connected to a stereo audio bus, two channels, but the file has four channels. What you see here is that the odd channels get mixed down onto the first bus channel and the even channels onto the second. This is not strictly correct or useful in every case, but that's the way it works. Most importantly, keep this in mind when you are putting mono files on stereo tracks or buses, and vice versa. That's what happens in that column over there; I don't know if this is clear enough, or if you understand what I'm talking about. In other words, it follows that on a track you can put stereo clips, mono clips, or multi-channel clips. There's this new format they call stems, which is just a multi-channel audio file; there's nothing special about it, believe me. If you have four stems, it's four channels, like this one, so it's better to put it on a track connected to a dedicated audio bus with four channels. Otherwise you get this mess, which is probably not desirable. Yeah? Okay. Now about... oh, sorry. Okay. You have plugins. This is about the audio plugin flow, showing what happens when a plugin has a different number of channels than the audio context it's inserted into. Let's take an example so you can follow.
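The clip-level channel folding can be sketched as follows. The exact mapping rule is my guess at what the slide shows (clip channel i summed into bus channel i modulo N); Qtractor's real mapping may differ, so treat this as a model of the idea, not the implementation:

```python
def route_channels(clip_channels, bus_channels):
    """Fold a clip's channels onto a bus with a different channel count:
    clip channel i is summed into bus channel i % N (assumed rule)."""
    length = len(clip_channels[0])
    out = [[0.0] * length for _ in range(bus_channels)]
    for i, chan in enumerate(clip_channels):
        dst = i % bus_channels
        out[dst] = [a + b for a, b in zip(out[dst], chan)]
    return out

# A 4-channel clip on a stereo bus: channels 0 and 2 land on the left,
# channels 1 and 3 on the right.
stereo = route_channels([[1.0], [2.0], [3.0], [4.0]], 2)
```

As the talk warns, this automatic folding is rarely what you want for stems: a four-channel clip really belongs on a track feeding a four-channel bus.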
Each row represents a chain of plugins with a different configuration, a different layout or number of ports. On the first one, do we have a laser pointer? Does it work? Okay. In this first one, you have one audio channel: a track connected to a mono audio bus, into which you insert a plugin that has one input port and one output port. Only one instance of the plugin is created. Then, the other way around, what happens when the plugin has more outputs than the bus has channels? The signal on the second output gets lost; it's simply not used. Sorry, you can't do it the other way. That's the automatic routing you get in Qtractor with plugins. Now, another example: we have a track with two channels, stereo, and we insert a plugin that has only one input and one output. In that case two instances are created and run in parallel. The same happens with this layout; they are connected, chained, in this way. After this explanation, can you understand what is drawn here? No? It's the automatic chaining you get in audio when the input and output configuration differs between the plugins and the track chain we are dealing with. It depends on the plugin: sometimes two instances of a plugin are created and put in parallel; other times you get just one plugin and the signal is forked, or it gets lost and goes nowhere. Okay? That's the basic idea, just to show what happens in these situations. This is for audio; we are only talking about audio ports, because with MIDI things happen a little differently again. As you know, there are at least three types of MIDI-related plugins. There are MIDI filters, which take MIDI input and deliver MIDI output. There are MIDI instruments, synths and samplers and so on, which take MIDI input and render audio output.
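The instance-duplication rule from those diagrams can be condensed into a small planning function. This is a simplification inferred from the examples above (mono plugin on a stereo track gives two parallel instances; surplus plugin outputs are dropped), not Qtractor's actual routing code:

```python
import math

def plan_plugin(track_channels, plugin_ins, plugin_outs):
    """Estimate how many parallel instances of a plugin a track chain
    would create, and how many plugin outputs would go unused
    (a simplification of the automatic routing described above)."""
    if plugin_ins >= track_channels:
        instances = 1                                        # one instance covers it
    else:
        instances = math.ceil(track_channels / plugin_ins)   # copies in parallel
    dropped = max(0, instances * plugin_outs - track_channels)
    return instances, dropped

# Mono plugin on a stereo track: two instances, nothing dropped.
print(plan_plugin(2, 1, 1))   # (2, 0)
# Plugin with 2 outputs on a mono track: one instance, one output lost.
print(plan_plugin(1, 1, 2))   # (1, 1)
```

The `dropped` count corresponds to the "signal gets lost" case in the talk: outputs beyond the bus channel count simply go nowhere.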
Then there are pure audio effects, with only audio inputs and one or more audio outputs. On a MIDI track, and this only happens on MIDI tracks and MIDI buses, the MIDI signal is simply passed through when a plugin is an audio-only plugin; the same happens when a plugin doesn't have MIDI outputs. What does that mean? Time for maths, or tea; I must remind you that I asked for a workshop of at least two hours. And if the plugin presents at least one MIDI output, then there is no automatic pass-through. Now, if you have a mix of MIDI and audio inputs and outputs, this is the way the signal is delivered at the two ends. These drawings, these diagrams, are just for reference, for your future reference when you have doubts about what's happening; you can refer back to them when it happens. It seems obvious and implicit, but you never know when people will run into problems. Now, there are also two kinds of internal pseudo-plugins, which I call inserts, that serve to accept external inputs and deliver external outputs besides the normal Qtractor buses. This one is the send/return insert. The layout of the internal workings is exactly similar whether we are talking about an audio one or a MIDI one, except that on audio these gains, these amplifiers or attenuators, are exactly that: amplifiers or attenuators. On MIDI, they apply only to the velocity of the event; they apply only to note events, so you can attenuate or raise the velocity; it's a velocity multiplier. Any doubts? What's the difference? In audio it applies to the audio signal. In MIDI it applies only to the velocity value of each note-on event, because note-off doesn't carry a velocity, or at least carries velocity zero by convention; it's not a default of the MIDI specification, but a convention.
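The MIDI insert's "gain as velocity multiplier" behaves roughly like this. A hedged sketch, not Qtractor's implementation; the clamping to 1..127 is my assumption about keeping the result a valid, still-audible velocity:

```python
NOTE_ON, NOTE_OFF = 0x90, 0x80

def scale_velocity(event, factor):
    """In a MIDI insert the 'gain' is a velocity multiplier: it rescales
    note-on velocities only and leaves every other event untouched."""
    status, data1, data2 = event
    if status & 0xF0 == NOTE_ON and data2 > 0:      # a real note-on
        data2 = max(1, min(127, round(data2 * factor)))  # clamp to valid range
    return (status, data1, data2)

print(scale_velocity((0x90, 60, 100), 0.5))   # note-on velocity halved
print(scale_velocity((0x80, 60, 0), 0.5))     # note-off passes unchanged
```

Note that a note-on with velocity 0, which by convention means note-off, is also left alone, matching the talk's point that only sounding notes are attenuated.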
This is the other type of these special inserts: the aux send, which lets you fork the signal. Remember, each track can only send to one output bus; each track is assigned to a single output bus. But with an aux send you can fork: you can say, go to the default output bus, or to another one, or to both simultaneously, and mix down on the other side. These are a couple of helper pseudo-plugins that you can insert into any plugin chain, in any place, at the beginning or wherever, and as many as you want. Not advisable unless you really need it, but it's really necessary for special side-chaining configurations and for alternate or aux-bus setups like the ones presented this morning. With these things you are not limited to the left-to-right flow of the original diagram I presented. Let's look at it again: this is the general flow I'm talking about, and you can insert an aux send insert here, and then, instead of going only to that output bus, the signal also goes to another. It's the way to fork the signal from its default path. Okay, now we get to what's wrong, or what's still missing, from the original idea and the future list of Qtractor. Something that is still missing, and will probably be missing forever, is this: we have JACK Transport for syncing the transport, the playback, and the playback speed, but Qtractor doesn't read MIDI Time Code, MTC, or SMPTE linear timecode. Someone could step in and contribute the code; it's not very difficult, but I don't find it very useful, because it was only really needed in the old studios, where you had separate real machines, tape recorders, that had to be kept in sync to a common clock. It's not really needed in a modern sequencer today. Okay. Then JACK MIDI, of course; I already said that Qtractor has stayed on the ALSA sequencer for quite a long time.
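The aux send's fork can be sketched in two lines: the dry signal continues down the chain unchanged, while an attenuated copy goes off to another bus. An illustrative sketch of the concept, not Qtractor code:

```python
def aux_send(buffer, send_gain):
    """An aux send forks the signal: the dry copy continues down the
    plugin chain unchanged, while a gain-scaled copy is delivered to
    another bus to be mixed down on the other side."""
    dry = list(buffer)                       # continues to the assigned output bus
    send = [s * send_gain for s in buffer]   # goes to the aux/alternate bus
    return dry, send

dry, send = aux_send([1.0, 0.5], 0.25)   # e.g. feed a reverb bus at -12 dB-ish
```

This is exactly the escape hatch from the one-track-one-output-bus rule described above: the default path is untouched, and the send adds a second destination.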
There are plans; there has been a branch on Git for a long time that introduces an OSC interface. It's a little specialized for one application, because it was only meant for JP Mercury's FreeWheeling looper. I don't know if you remember what FreeWheeling is, but it's a live looper; it's a lot of fun to work with. It has an OSC interface for recording the loops that are produced live on that looper. Today, for example, SooperLooper already has this kind of feature included; it can record the audio of the loops that are playing at any one time. Integrated scripting, yes. At the time I would have preferred AngelScript, because it's very similar to C and C++, it's object-oriented, it's high-performance, and it can be real-time; of course, as with everything in C or C++, you have to be careful what you're doing, especially with memory allocation in a scripting context. It is one thought, and I'm still thinking of adding it sometime in the future, but I don't feel the urge to do it right now. Okay. Anything I missed that you know of? Frank. Okay. Then there is this list of things that are not missing, but won't get fixed any time soon. This one is very technical, okay, if you're getting into the details: since Qtractor is one single JACK client, connecting its own inputs to its own outputs, like when you want to bounce immediately from an output, recording what it is producing onto another track, doesn't always work. Much of the time, depending on which connection comes first, or on the moment you make it, the next time you load the session back you only get silence. You could record at the time, okay, but then you save, you go have lunch or dinner, and when you get back, it doesn't work anymore; you get silence. You cannot repeat what you did before. It happens. But it's deterministic; it's not a random bug.
Once it doesn't work, it doesn't work anymore; it won't ever work again unless you repeat the same creation steps, making the connections anew. Okay, so much for that. Buses have plugins, as I showed you, but you cannot automate any of the parameters of buses; automation is a track property, available only on tracks. On tracks you can automate plugin parameters and most of the track properties, like mute, solo, volume, gain, monitor, and that's it. On buses you don't get automation: there's no timeline for dynamically varying bus values. That's one shortcoming. It could be added later, but not yet. Like I said, there are no post-fader plugin inserts; the post-fader position effectively only appears on the output buses, if you are thinking in terms of tracks. It's a different model from the old analog mixing desks, but yes, this is the way it was designed. And there is no plugin latency compensation. There is audio recording latency compensation, so things get compensated when you are recording audio, just because JACK can provide that information; but for the latency of the plugin chains you have to compensate manually on the timeline. Okay, and nothing more. Anything more to discuss? No doubts? Then we can pass to whatever you want, and I'll show you how I make a tune from scratch. I'll try; I'm not a musician, but you will see one of the ways we can use Qtractor to work, or to make some sound. Okay, questions, please? And of course this is not restricted to Qtractor: if you have any questions about QjackCtl, Qsynth, Qsampler or any other Qstuff, please ask. I'm ready, I'm here.

Just about the microphone: maybe you could take it a little bit away from your mouth, because it's popping. Yeah, something like that. Thank you.

Yeah, good question. Rui, about your list of shortcomings.
So, in the last two weeks or so, I used it a little more actively than usual, and what I found are shortcomings in usability. For instance, I noticed that I often have to use the scroll bar a lot to find the right position, and I find the playhead hard to pick up and drag around. So this is not about the technical part but rather the usability part of Qtractor; here I think we have some shortcomings, and I've written down some that we can maybe discuss at dinner tonight. I think these should go on the list as well, and others may also have ideas about where it can be better for their workflow. Question accepted. So let's get out of here and try to make something. Oh, I didn't talk about the v1s. Anybody interested in the v1s? For example, synthv1: this is a subtractive synth. It exists as a standalone JACK application or as an LV2 plugin, and so does the sampler, with most of the same controls as the synth. The synth is a subtractive wavetable synthesizer, polyphonic. The sampler, samplv1, takes just one sample, one key sample, but it's polyphonic over the whole range. And drumkv1 is a drum-kit sampler, so you have a key for each drum sample. Let's see if I can map it here. Where do you change octaves, Albert? Octave here. I only want to show it. Let me see if it's connected, because I'm using the standalone application. It's already running. So maybe you have to connect again. It happens. No, no, no problem. Let's get back to it, and then I will see what is wrong. Yes. Yeah.

While you have drumkv1 open: one thing I found a little puzzling is that when you create, when you add a new sample to your mapping list, you get a default envelope, which is a typical decaying curve. Normally, for drum samples, I wouldn't want that. I want full volume from the start, holding on to the end, and then down again, like a square.
I wouldn't want any kind of decaying curve for a sample that I want to play at its own volume, but that's the default. You can change it. Yeah, but it's the default; I would have to drag the ADSR points around myself first. Okay. Can it be stored for each of the following sample entries? You are seeing the DCA there. Yeah. Okay, you can just copy it with Ctrl, or drag and drop. Yeah, I can certainly change it, but I guess the default behavior should be that it's always full volume right from the start to the very end. Why would you want a prerecorded sample to get additional fading? Say I have a crash cymbal playing: why would I want it to be doubly attenuated? Because it's prettier. It's prettier, yes, but it's not logical, right? If you put in a square it's not very pretty, but you can change it anyway. I just had, a few times, loaded samples and wondered why they sounded so short, until I noticed that it's the default of that envelope to always fade immediately, or pretty early, while I want them played at full volume, at their own natural volume, without change. Just food for thought. Yeah, but it just needs some nit-picking, isn't it? Okay, let's connect here. Let's see. Okay, MIDI is coming in, but I don't... oh yeah. Thank you. Very good question. Oh yeah, still nothing. Okay. It has effects, and they already changed the profile of the envelope. So this is without the effects... oh, it's still there. Oh, I know, it could be... no, okay, it is not. These effect chains are in, like, an aux bus: this is the FX send, the send applies to all the channels, and then they mix down at the output. And this FX send is a property of each sample. Okay, so that's what I'm talking about. Okay, let's try to play something. Start playing. As you probably know... I'll make it much bigger for you to see. It starts with a blank slate, right? As everything does.
We have to know where things are, but I think most of it is pretty obvious for you to start playing. I will just show you how to put in a random clip. It usually starts at tempo 120; let's not worry about that default either. Okay, I will start by inserting a drum loop. I still don't know which tempo I want, but I will try to adjust it first. Okay. I have here a copy of a sample library that used to come on the cover discs of the old magazines you could buy in any newspaper shop, like Computer Music and Future Music and others, and we have here a collection of drum loops. Rui, another one. Yeah. I think I asked this once or twice before in the last two years: if I don't use KDE, how do I get your theme running on my system? The nice dark gray, what is it called, the Wonton Soup theme, right? I think it's a KDE theme, but can I do this also with just pure Qt? You go to the options and you can choose the Wonton Soup here. So this is a Qt theme, not a KDE theme, right? No, this is Qtractor's own. If you don't like your default desktop environment's look, you can use mine, or falkTX's. You get the Wonton Soup. For example, I will now put on the KXStudio one; it's a little darker. This only applies when you start again. Next time, yeah. That's another item for usability, right? But I don't like this one, it's too dark for me, so I will put it back where it was. The other is the default for me, but you can use this; it's the same. Let's start again. When you enter a new clip, do you want to start at the beginning or somewhere else? You have a blank sheet, but normally these blue lines, this is what I call the edit cursor: when you are entering or introducing or creating something, it starts over here. I will just import an audio file onto an audio track; it will become an audio clip on the audio track. This is the clip. I can hear what it's doing in the audio, or just play it back here.
You will listen to it; of course, this is obvious. Now, you can see that the clip is not aligned to the current tempo or beat on the timeline. You may have already noticed that this is some sample library where the tempo is in the file name, but let's assume that I don't know it a priori, at first sight. I want to know what tempo this loop is. I don't read that information, because I don't know whether this file comes from the old Sony ACID loops. It's just ACID, not "acid loops". Okay. This was one of the first loops. I want to do some adjustment to the tempo. I don't know it, but I can work it out: with this length, divided by this number of beats, I find that this is 125. Okay, now it's aligned. I will immediately make it loop on the timeline and then start playing. Now, let's see. Next I will import another loop with a different tempo, but I want to adjust it to this tempo. What should I do? In theory, I'm left with time-stretching, isn't it? Okay, I will take another tempo, for example. I created another track, but this is not what I really want. Okay. Now, I would say that the loop will be in this range, but the new clip is not at the right tempo: it's a little faster, and there's a gap in the measure. So I will time-stretch the file. How do you time-stretch a clip? You drag one of the edges until it gets to the end of the measure that you want, while pressing the... no, sorry, it's not Ctrl, Ctrl is another thing. It's Shift, and then it gets stretched over the measure. This clip was originally at 130 BPM and now got stretched to 125. Okay, let's see if it sounds right. More or less. Okay, so much for drums, but now I will replicate something here: I will do copy and paste and put the new clip afterwards. Okay. Now I will extend the loop.
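The tempo arithmetic alluded to there is simple; here is a minimal sketch of it (my own illustration, not Qtractor code; the 8-beat loop at 44100 Hz is an assumed example):

```c
/* Estimate the tempo of a loop: a clip of `frames` samples at `rate` Hz
 * that is known to contain `beats` beats plays at this many BPM. */
double estimate_bpm(double frames, double rate, double beats)
{
    return beats * 60.0 * rate / frames;   /* beats per minute */
}

/* Time-stretch factor to take a clip recorded at `src_bpm` into a session
 * running at `dst_bpm`; greater than 1.0 means the clip gets longer. */
double stretch_ratio(double src_bpm, double dst_bpm)
{
    return src_bpm / dst_bpm;
}
```

So an 8-beat loop of 169344 frames at 44100 Hz comes out at 125 BPM, and the 130 BPM loop from the demo has to be stretched by 130/125 = 1.04 to fit, which is exactly what the Shift-drag to the measure boundary does.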
I pick up one of the loop markers and drag it over to the new end. That's one way to do it. You can also set the blue lines there and press the loop button whenever you want. Okay, again. Now, this is the way for you to put loops one after another on the timeline. It's not a looper, okay? This is a sequencer: you are sequencing clips one after another. Okay. One of the recent features, which was originally an idea from REAPER, is that you can replicate a clip over the timeline, one copy after another. So here, yes, you are dragging the edge of the clip, but with the Ctrl key, and when you let go the clip gets replicated, repeated over that length of the timeline. The loop is still in this range; I will put it on the other side: left blue line here, right blue line here, and then we just change the loop. Okay, that's the other way. Now it's getting boring with only drums, isn't it? Okay, let's add something else. Maybe the next step will be what, a bass line? Okay. Rui? What? A feature request, perhaps. The lights went off? No, I didn't ask for that. No, I was wondering about cue points. If I have a longer song and I want to have certain cue points, the beginning of the first verse, the chorus, and so on, multiple points I can easily jump to within my song, to easily locate back and forth. Is that an idea? Yeah, but this is a feature that already exists; it's called markers. For example, this is, I don't know, a verse or not, and you can choose the color, whatever. Yeah, and it stays there on the timeline. I'm impressed, okay. So that's the whole point of these markers: when you move forward or backward, they stop on the markers. Okay. Feature implemented, thanks. You've got to be more careful when you ask for features. So much to learn.
Now for the bass line, I will make another audio clip. I have this one already done at 125. What key do you want? Okay, let's see. I will move the clip to the beginning. Then I replicate it; I didn't yet, but let's have a surprise now. I want to replicate, using the Ctrl key, all over the range. It got replicated. Okay. So I can put a fade-in on it, and let's hear what the hell this is. This sounds familiar. Okay, maybe I will start a new verse here, as Frank suggested. I'll put here another kind of clip, just for playing. Okay, we'll come in on one. Let's do F, D, F. Let's put an F here. Okay, copy. I can do this always on the fly while it's playing, but I think people are getting fed up with the beat. Okay, I didn't like this, I prefer the other, so: undo. Okay, that's fine. Now for a bass line, a synth line. If you have any idea, please let me know; then I will just show the automation and whatever, and then let's see what may come out of this. What is this? The way I go right back to zero is Shift and backward. Okay, let's repeat this to the end. But I think I had the wrong tempo, so I have to time-stretch also: Shift on the edge until the end of the measure. I didn't copy, I just moved it. So now I will copy, yes, and put it somewhere else, here where it was. Am I copying? No, I'm just moving it. Ctrl+C, Ctrl+V; I'm using the shortcuts. Ctrl+C, Ctrl+V, I think they are standard for clipboard operations. Now let's put a plugin here, a synth for the bass, or better, let's use a MIDI track for some recording. Most people, to insert plugins, use the track dialog, but I find it more convenient to come here to the mixer strip and, with the right button click, get the context menu. And I will ask here for one of my own; of course you can use any other plugin, but I will use this one because it's made at home, homebrew.
samplv1. It starts empty also; you have to choose a sample or another sample at the beginning, but I will use a factory preset. Let's see what it sounds like. It's not connected. I will connect the external keyboard, the MIDI controller, from the input bus. This is the MPK mini 2. This time, if I play, it's already routed to the plugin chain, the sampler plugin is already playing sound, and it goes on to the output bus. Let's get the effects in also. I'm overdoing the effects; let's put in less of them. Okay, yeah, probably that's enough. Now maybe we can record, and we can try the takes recording and whatever. First of all you have to check that the loop-recording mode is right: set it to anything but the None option; this chooses which of the takes becomes current when you stop recording or leave the loop. Loop recording means that while you are recording and the playhead is moving and looping, each turnaround of the loop becomes a new take. Okay, this is the concept of takes in Qtractor. Now I will try recording. I'll arm the track for recording; it will record the MIDI input that I play on the keyboard, and at the same time I'm hearing the plugin rendering. Now, many people complain about this: I'm ready for recording and it asks for something here. You have to give the session a name; at least the files that will appear on disk have to have a name, and it should not be "untitled" or "untitled1". This is the name of the session. You cannot see it here; oh, sorry, but there is a title there somewhere. Okay, I will name it. This is electro style, so I will call it that. But it's 2016. Yeah, we're still in time to change it. All right. Okay, let's start recording. But before I start recording, I want to record only inside the loop, so I will set a punch-in/punch-out region. So why... no, okay, okay, sorry.
I have to put the blue lines in the right places first, and then punch in. Okay, so it's loop, and then set the punch-in; maybe you can set them in any order. Okay, now this is ready for recording. If I start playback, it will record in this region that is brighter; what is shaded is outside of the punch region, or outside the loop. So I will start. We started here, where the playhead was, so at this point it is recording. Let's see what I can play. I probably won't like this take very much, but I will try again. Go easy on me, because I never play in tune. Okay, but we have already made a recording. The clip is over there, and if you look at its name you get two hints: it says take 3. This is the third time it recorded around the loop. To switch between takes, you have to go to the menu, or use a shortcut; a default shortcut exists already, but I have to show you where this Takes menu is. It shows which take is current. You have three takes, because I recorded over the loop three times, and you may choose the first, one in the middle, or the last. That's why "first" and "last" appear when you're setting the mode for loop recording. Okay, but now I will select the first take. You remember that for the first take I started recording here, in the middle of the loop, so its content is what was played the first time around. Let's remember what was there: it's empty for all that time. I will change takes on the fly; let me remember which shortcut it is, because I may have changed it. I'm changing: this is take two, take one, take two, take three. Okay, I'm just wrapping around the takes. This is only to show you what the takes are all about: for me, the recording took place over the loop and over a punch-in/punch-out region or range. Okay, now automation, just to show you automation. What do I want to automate?
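The take bookkeeping described here can be captured in a few lines; this is my own sketch of the idea, not Qtractor source:

```c
/* During loop recording, every pass over the loop region
 * [loop_start, loop_start + loop_length) becomes a new take.
 * Given the current playhead frame, return the zero-based index
 * of the take currently being recorded. */
long current_take(long frame, long loop_start, long loop_length)
{
    if (loop_length <= 0 || frame < loop_start)
        return 0;   /* before the loop: still laying down the first take */
    return (frame - loop_start) / loop_length;
}
```

So starting playback before the loop, as in the demo, still records into the first take until the playhead enters the loop region, which is why that take's start is empty; every wrap-around after that opens a new take.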
Let's see if I can automate the filter of the synth, of these plugin synths. So I will select one; I have to know the names, but it is DCF cutoff by name. Now the automation curve appears, and I can hear the effect while it's playing and I am automating this parameter here, cutoff. Okay, indeed the curve is flat; it's the red shade that you can see over there. But I will edit it here on the fly, by hand, with the mouse. We have to put it into automation edit mode and draw some lines here, just some curve. Don't worry, it doesn't have to make sense; it's just a random entry. It doesn't seem to be working, isn't it? That's because it's not in play mode; now it surely will be working. When it wraps back around you can see more; you can see from here that it's moving, and it's moving according to this envelope, the red envelope. This is basically what automation is: you are automating a parameter of a plugin, in this case the cutoff. You can choose any other one, as long as they are exposed as ports, LV2 ports; this is the LV2 version of samplv1. Okay, you may or may not hear much difference from this curve; it depends on the sound, on the sample, mostly on the low frequencies. Okay, you see that I've drawn it with the mouse on the screen. But if you map a MIDI controller to this parameter of the plugin, you can record the curve while you are tweaking the MIDI controller. So this is the way you do that. Like I said, I will now exemplify: I will map this parameter to a MIDI controller. I have to look at the generic plugin GUI; that is the host-generated GUI, not the custom GUI of the plugin. And I'll look here for the parameter, and it's this one. I call up the context menu and I will try to assign a MIDI controller. It is already expecting MIDI input. If I turn a knob here... no, I don't see anything; probably this one doesn't send anything.
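Between the nodes you draw, a curve like that is evaluated by simple linear interpolation; a sketch of the idea (mine, not the actual Qtractor code):

```c
/* Value of a piecewise-linear automation curve at frame t, given the
 * two neighbouring nodes (t0, v0) and (t1, v1) with t0 <= t <= t1. */
float automation_value(long t, long t0, float v0, long t1, float v1)
{
    if (t1 <= t0)
        return v0;              /* degenerate segment: hold the value */
    return v0 + (v1 - v0) * (float)(t - t0) / (float)(t1 - t0);
}
```

On playback the host calls something like this per cycle and feeds the result to the plugin port, which is why the cutoff keeps moving along the red envelope with no hands on the controls.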
Okay, let me see if this is connected to... no, it isn't. Okay, now we should be seeing something. Okay, this is mapped to CC1, which is modulation. Now I will turn another knob: that's breath. Another knob is CC3, which has no name in the GM standard. I will try to use this one. Okay, these are the numbers that are currently assigned here; I will use this number 5. Okay, now it's number 5: I'm tweaking here and you are seeing it change there. Okay, correct? So now there's a MIDI controller assigned to a plugin parameter. You can also see that movement on the GUI; I'm just changing it via the MIDI controller. And now I can record these tweaks, these movements, as the automation curve, because I'm changing the value of the parameter via the MIDI controller, and all changes can be recorded as an automation curve. So I will now arm the current automation curve for recording. And if I play back, it's just playback: this is not recording media content, this is recording a series of automation data points. Now I will change it here, and I hope that it changes. Okay. I think you are seeing the same as me: I'm just tweaking here and the automation curve is being recorded. Okay. I think I'm also triggering the modulation, because you hear a tremolo. There was a mistake here when I tweaked it: this knob is also assigned to send CC1, and that is a hard-coded response in samplv1; it's not Qtractor's MIDI controller mapping. So, more explanations. Now it's your turn. Albert asked about making shortcuts. New shortcuts are set in this menu, in this window, and you can change all the PC keyboard shortcuts for any command of the main menu, and in the MIDI editor also. And since last year, you can also assign a MIDI controller. Let's see if this will work here. I will now assign a pad, one of these pad buttons, to what? It could be playback; no, that's already assigned. But let's put something.
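What the host does with an assigned controller comes down to rescaling the 7-bit CC value onto the parameter's range; a hedged sketch of that step (my own, assuming a plain linear mapping):

```c
/* Map a 7-bit MIDI controller value (0..127) linearly onto a plugin
 * parameter's [minimum, maximum] range, the way a host applies an
 * assigned CC before feeding it to the port (and, while recording,
 * into the automation curve). */
float cc_to_param(int cc_value, float minimum, float maximum)
{
    if (cc_value < 0)   cc_value = 0;      /* clamp to the 7-bit range */
    if (cc_value > 127) cc_value = 127;
    return minimum + (maximum - minimum) * (float)cc_value / 127.0f;
}
```

Each incoming CC event becomes one such parameter value, and with the curve armed each value lands as a data point on the automation timeline.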
Oh, there's probably one that could be funny. What would that be? Just close the application. Let's see. This pad is sending a note; probably that note was the last one. Okay, let's see. Now it's assigned, and there's a MIDI shortcut for the menu File/Exit. So now, I hope, if I press the pad here... okay, it prompts to exit. Okay, but I will cancel; it's not what I want it to do. Rui, one question here. Something more creative? Sorry. When you try to assign a keyboard shortcut that is already in use, does it warn you that you are reassigning the shortcut? No, you can't do that: if it's already assigned to something else, you cannot put in your own; you have to erase the old one, or reassign it to something else, and then set the new one. For example, take Albert's request, the create-track command: he wanted it to be Ctrl+T. Ctrl+T, he said, yeah. And you just press the Ctrl+T combination here. It says that Ctrl+T is already assigned to something else, and you have to find out to which. Oh, there it is: it marks the track as selected. Okay, so now we can erase that, and you have to go back here and put in Ctrl+T. Not Ctrl+T... yeah. Now. And you can see in the menu that new track is now Ctrl+T. Okay. Of course, right, usability again: if it knows what the shortcut is already assigned to, it should say it, not just say it's in use, but in use by which function. Now I have broken this, because I have applied Albert's suggestion and lost my default one. Okay. But I will also put a MIDI controller here; I will take, for example, this pad. I will do it. Okay; when I release it is when it adds a track. So this pad now also does the same command as Ctrl+T. Let's see. It's not working. I hope this only triggers on note-ons. Note-offs, though... it doesn't; they cannot be assigned. No, I remember.
We just changed it, because it's the same key. Okay, let's see. Okay, that's right. So one thing that I overlooked is that you cannot assign note-offs as MIDI shortcuts. It wouldn't really make any sense anyway; only note-ons are valid. Okay. So I could stop showing them when we are pressing here. Just never select a note-off. Well, okay. Any more things? Whatever. Let me see if I remember anything that could be as useful. Rui, one more thing about keyboard shortcuts: I guess you have no way of filtering out the shortcuts which are already in use by your window manager; say, Alt and cursor keys are normally bound by your window manager or your XFCE, and these, I guess, override whatever the application sets. No, you can't. You can assign them, but the problem is that it depends on your desktop environment. These shortcuts are at application level, not at system level; they are not global for the environment, only for the application, for this window. But if the window manager already eats the event, then you never see it, so you cannot know; it just doesn't arrive. It depends on whether the desktop environment passes the keyboard event on to the applications; if not, it gets swallowed and you don't see it. What can I do? It's just as if I had never typed it. So the system takes precedence. What more? Let me see. About that discussion on Qt4 and Qt5, and whether they are applicable to plugin GUIs or not: I think, in my own opinion, of course, and it's only mine, that that debate is bullshit. Qt is very, very applicable for plugin GUIs. The problem here is about, for example, Ardour and the way it gets plugin GUIs inside its own widgets, embedded widgets. The problem with Qt, especially Qt5, is that they are getting away from the X11 dependence. So you don't get an X11 window; this is very technical, but you don't get the X11 interface anymore with Qt5.
You have to work around it, or arrange some other way to get the X11 window handle, the opaque number of the window, and have it seen by the code of the plugin. Each plugin has to do that, so it doesn't depend on the spec, on LV2, or anything. The other problem is between Qt4 and Qt5 themselves: you cannot mix hosts and plugins of different Qt versions loaded in the same address space. For you to load a Qt plugin into a host, the host must load the same Qt library. Let's see: a Qt5 Qtractor only loads Qt5 plugins, period. A Qt4 one can only load Qt4 plugins, period. You cannot mix and cross over. Sorry? I tend to disagree on both accounts. You can; it's just a pain in the ass. There are two possibilities, one of them being building Qt with a namespace prefix, which avoids symbol clashes. The other challenge that you may have is event loop integration, but I guess that works; it does for VST. They can, if you go into the specifics. The problem is not Qt as a whole, but the specific modules: you cannot mix the QtGui or QtWidgets modules between Qt4 and Qt5. You can load the core libraries of both Qts, you can mix those two; but when you get to the widgets module, you cannot mix the versions. I'd like to talk about that later if you want. And the second part is: if you need a native widget, you can still do that on X11; the problem is that you have to use a Qt-version-dependent API. Yes, you can, but I think that's an approach that is not going to be supported in the future. It will end some day, when Qt4 is again far behind. But again, we can talk about that later if it's feasible for you. Of course you can do it; you can reimplement the old embedding with XCB. It's not really a question, but I know you told me I had to make Carla work on Qt5. It was already working, except for the VST X11 embedding. Yes, you've already confirmed that it's supposed to work; I think now it works anyhow.
Yeah. You change anything and you get... The problem was with the styles in the Carla VST: the styles subdirectory is just a symlink, and if you get rid of that symlink, everything loads well. Yes, I mean about Carla: I made it work with Qt5. Now you can actually embed it in other applications. It's a bit tricky, because there's no official support for it, but I can show you some details. You can actually embed Qt5 widgets inside arbitrary X11 windows, and it actually works. Yes, I didn't say that it wouldn't work, or that it's not possible. But the inverse can give you more trouble: for example, taking a Qt5 window, it will be very difficult to embed it in a Qt4 host. It is possible, it is possible. I don't know if it's possible, but it's a hack, and it will not stand for long. It's very tricky, because it's not official. So, on the same topic: say, from a user's point of view, they've bought a Qt4 plugin, a proprietary plugin with a Qt4 GUI. They also have a Qt5-packaged Qtractor from their distro's repository. They load the plugin and the whole thing crashes. That's right. Okay, so what do you say to that user? I don't know what to tell them. Yeah, sure, sure, we can discuss this. I think there is a solution from some other project, I think. No? I don't know. There is a solution, but it is also a hack: it involves the dlopen call with special flags. My point is: if we have to resort to dlopen hacks in order to support a widget toolkit in an embedded interface, isn't it just an interface, or a toolkit, that's not suitable for that purpose? We cannot possibly embed Qt4 in Qt5 without hacks, some kind of hacks, and similarly we can't embed Qt5 in Qt4; and the same goes for GTK2 and GTK3. So this is a problem that's across the board, a known problem. And why are we still trying to hack-fix this? There are other solutions; we are not saying don't go that way. Okay. Sorry, maybe I misunderstood then.
But I mean, from a user's point of view, stuff is going to start blowing up if people start packaging GTK3 plugins and try to load them in Ardour, which is GTK2. And similarly, the same problem can arise with the Qstuff in general and with other people writing plugins. So, a call to the community: please don't use GTK or Qt to write plugins, because embedding the interfaces is not a viable, or at least not a proper, solution to the problem in the first place. Yeah, but why is embedding interfaces so necessary? That's a very different question. But if we're going to implement plugin interfaces that should be embeddable, please don't use... Why not run the GUIs in their own window, always? Okay, that's a very valid argument, but that's not what we're discussing right now. What we're discussing is the detail of: if we're going to write embeddable plugin UIs, which toolkit is a good one and which is a bad one? But the problem is embeddable plugin GUIs as such; it's not that Qt or GTK are unsuitable for doing plugin GUIs. Okay? Yeah. We discussed this on the LV2 mailing list, I think. Yeah. So one solution is just: get rid of instance-access and run your UIs in a separate process, which I do in Carla. Yeah. So actually I can have GTK2, GTK3, Qt4 and Qt5 running at the same time. Yeah, because they are not in the same address space, so you can load the otherwise incompatible libraries or shared objects. The problem is if you use instance-access... Of course, I know it; you are only targeting my V1s, because I use instance-access. Okay. But do they work? Well, they work if the host doesn't use a different Qt version. Yes, because with instance-access the GUI and the DSP need to be in the same binary. So even if the UI doesn't load, it can still crash, because the UI symbols are in the binary. I know, I know it perfectly well.
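The dlopen trick mentioned a moment ago comes down to loading the plugin with flags that keep its symbols out of the host's global namespace; a glibc-specific sketch of the hack (my own illustration, not anyone's shipping code):

```c
#define _GNU_SOURCE          /* for RTLD_DEEPBIND (glibc extension) */
#include <dlfcn.h>

/* Load a plugin binary without letting its symbols leak into, or
 * resolve against, the host's global symbol table.  RTLD_LOCAL keeps
 * the plugin's symbols private; RTLD_DEEPBIND makes the plugin prefer
 * its own dependencies (e.g. its own Qt) over any the host already
 * mapped.  This mitigates symbol clashes; it does not make mixing
 * Qt4 and Qt5 widget modules in one process actually supported. */
void *load_isolated(const char *path)
{
    return dlopen(path, RTLD_NOW | RTLD_LOCAL | RTLD_DEEPBIND);
}
```

This is exactly the kind of workaround being criticized above: it can paper over duplicate symbols between toolkit versions, but the out-of-process GUI approach sidesteps the problem entirely.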
Of course, that problem is solved if you run the GUI out of process instead of in-process, but that is plenty of boilerplate code that each host developer has to write, and you have already done it. Congratulations. Okay, good for you. I know it's a solution, but I'm not willing to do it just for that, because I don't have the time; I have something else to do. Just to solve that problem, which I find a non-problem, because in a few years Qt4 is history. Then comes Qt6, and there will be that same problem again, just shifted. It happened with Qt4 already, and it will be the same again. But there is time for me to implement out-of-process GUIs too. You've done it already; and also to get rid of instance-access in my plugins, which is a pain in the ass as well: to do it, I'd have to refactor most of the code. It's a price; there's always a trade-off, but that doesn't imply that Qt is not suitable, or GTK is not suitable, for doing plugin GUIs. That was my point when I called it bullshit. Yeah, it's a wrong solution for the wrong problem. Okay, don't take it personally; I already congratulated you for doing that in Carla, and you are doing, of course, a very good job. I'm sorry, I'm sorry, because I have so many things to do and so little time to fix my own things, to clean my own backyard. Well, anything else? All right, is it enough? Remember, Qtractor goes to 11. So, one comment from IRC: Yassin Phillips says he doesn't have any problems with Qtractor right now; he only wants to congratulate you, especially for the awesome track takes feature. Track takes? Track takes, I think, meaning that you can do multiple recordings. Recordings, yes, it does; I just showed you a few moments ago. All right, there's another question. Just about the feature that you can change the takes: you did it with MIDI; can you also do it with audio, doing multiple... Yes, yes.
This loop recording of takes is the same for audio tracks and clips as for MIDI tracks and MIDI clips; it is the same concept and the same workflow, independent of track type. The things that are different between audio and MIDI tracks I explained with those flow diagrams. There are pitfalls in the flow that are not very clear; you cannot see them out of the box. That's why I explain it visually, with flow diagrams, and I can go back there again to stress the differences. Mostly, the rest of working with audio or working with MIDI is the same. Yeah. We didn't even get to the MIDI clip editor; sorry, I'm just messing up here. There's another one: you can edit the MIDI in a piano-roll interface, very well known. So, a question from IRC: is it possible now to use DrumGizmo's outputs in a Qtractor track? No. It's not clear, and it's not very useful, the way that Qtractor matches channels. Yeah, like multiple outputs into a stereo track, as in one of the slides you had there. That multiplexing doesn't work, because there are DrumGizmo output channels that go nowhere. Yes, end of story. You can get the first two, or the first three, or the first four, depending on the output bus that you are dealing with, out of all the outputs it has: 16, for example. No, it depends on the drum kit, perhaps, but it can also be 32-plus. Okay, then you have to create another output bus of 32 channels. Let's see what it looks like here, just for you. No, you have to move the mouse. Okay, let's create another output bus, for DrumGizmo, and you have to set the same number of channels that it works with. You have to manage to show them. And now, this only works if you get a MIDI track with its output set to this bus. Okay, and then you can map, channel by channel, the outputs of DrumGizmo. It can work, but I don't know how useful it is.
I strongly believe that DrumGizmo has to have its own mixdown stage, or effect channels, or an effects bus, but it has to channel it on its own. This is good for Ardour, because in Ardour you can also route each of those independently as well. But after this, after you make this, you cannot rearrange the outputs to different places in Qtractor; that's a standing design limitation. Okay. So, unless there are more questions for Rui, I think maybe we can wrap it up; it's gone past six o'clock, I think, or gone past the time. A few more questions in the room first, perhaps? Nope. Okay, I think, thanks very much Rui for the Qstuff workshop. And thanks very much. I'll be here today and tomorrow, so any other questions, you can talk to me face to face; no problem at all. One final question. Okay, there are no more talks after this, so you can go on. Is there more on the schedule? Okay. Does Qtractor support sidechain ports, like if you have a stereo input plus an extra sidechain port? Yes, we can do it, but it is with several buses and with the aux sends or the send/return inserts; those are the two options that you have. I think there is a demo on YouTube, from Libre Music Production, from not long ago. Jeremy also made one, much older; his solution is much more complex. But Bluebell, Holger, nicknamed Bluebell on the forums, made the demo of how to do sidechaining with Ducca from Harry. Okay, anything else? More, more, more. Anyone want to try MIDI over the air? You would need your laptop with the Qstuff on it too. I didn't explain it, but it's just this here: you can route ALSA MIDI and JACK MIDI over the air, over the network, in pure and direct raw MIDI. It's so useful, so good, this MIDI over the network, that Paul Davis incorporated it in Ardour. Even on Windows. Okay, anything else?