Let me test mine, if you don't mind. Well, Kofi, do you want to do it now? Yeah, I will do it now, just to make sure. It says this meeting is being live streamed, so I think we're live. OK, no problem, should be fine. Right, Kate. I'm sorry, I can't hear you. I don't know where the live stream is yet, so who knows. Hold on, I'm trying to get it going. I don't know what's happening with all my links. Oh, great. Kofi, were you trying to test now? Oh, yeah, Kate, OK, let me get back to that. You know, I'm on ismercuryinretrograde.com right now and it's not, but it does tomorrow. So I don't know if that's anything. Isn't everything Mercury these days? OK, so you see my thing. Let me just make sure... I don't even know what it is. I think I might have deleted this thing. No. And... oh, camera presentation. That's nice. Yeah, OK. OK, so it is streaming, but it's streaming to a different link. So let me get the proper link. I have to update the website. OK, that sounds good. OK, so I can start here. Cool. Are you done checking, Kofi? Yeah. Oh, sweet. Awesome. All right, I'm going to go back to the sheet. Ed, are you ready for a quick check? OK. Ed, are you there? OK, Jack, are you ready? OK, I think I'm ready. Oh, wait, never mind. You can hear me? Yes, thanks. OK, now I've got to share my screen. I've got a video. I don't have a webcam. But let me... can you see that? Yes. Like toasters and PicoVision 2.1. I see a media player. Yeah, a media player. Yeah. OK, now how do I connect the audio? The video has audio about halfway in. Cool. When you shared your screen, did you enable share audio? I don't know. Where's that button? If you stop sharing your screen, when you try to do it again, it'll pop up in the menu. I didn't see it pop up in the menu anywhere. Let me try it again. OK, now how do I share my screen? There should be a pop-up menu. It's usually really long.
And it will say stop sharing, usually towards the center of that. Oh, up at the top. OK, start. That's mute. OK, pause, share, add more. OK, let me tap on more and see if there's a recording of this... computer disabled... start share sound. OK, I see it. OK, now how do I unshare? It'll be easier for me to get out of here. If you don't mind, I'm going to just boot you off. Thank you. And then you can push share again. OK, thanks. I hate these. No worries. Just start thinking of moderating questions. It's going to get weird. It's going to be like, what's your favorite color? You know, you've got to ask the fifth question, not the first. Remember, that's the key. The what? You never ask people their favorite color. Always ask for their fourth or fifth, because it gets them to think longer and more systematically. So you kill like three birds with one stone. Speaking of systems, yeah. Great, I have my question for every session. Hey, Ed, were you done with your tech check? Just checking in on how you're doing. Yeah, I'm done with it. You feel OK? Yeah, I think so. If not, the video only runs five minutes, so I've got plenty of time if it stops. OK, yeah, sweet. You feel good. We'll just keep going then. All right, hi, Jack. It's your turn. Hey, how's it going? OK, so this stream is definitely live. Yeah. It's live. It's not at the right link, but it's live. Can you hear some sounds? I don't yet. Any sounds? No sound? Oh, yes. Oh, you got some sounds? OK, great. And then I just want to quickly... I'm realizing that possibly Zoom is doing something funky with my graphics card, and I'm presenting a very graphics-y thing. So I just want to see if it runs. OK, it does run at a fairly slow frame rate, but that's fine. Oh, cool. So thank you. All right, keep going. Hiya, Lizzie. Hey, how's it going? Good, how are you doing? Good. Can you hear me all right? Yes, mic is good. Sweet. Now I want to test whether this is visible.
Yes. Can you read it, or is it not readable? I can maybe try and change the brightness of it. It's not readable because it's small. Can I make it... like, it's small. There we go. OK, yeah, I'll probably be doing this anyway. That's readable then. Ish. It's really bright. Ish. And then me. Yeah, OK. Is that picked up better or worse? Well, it's blurry now. I can tell the letters better. Medium-ish, maybe. Oh, it's blurry because of the video. Right, it's not getting the right thing. Let me try and do it with the lights. I'm sorry. For this presentation, we can do it like, you know, spotlight. No, isn't that based on... oh, that's good. Oh, yeah, that's good. Is that better? Yeah, yeah. OK, it's got a bit of my reflection from my thing, but I think that should work. Let me try and turn the brightness up some more and see if the reflection goes, maybe. I don't know. OK, your reflection is not bad. OK, yeah, cool. And then I'll switch to this, to the stuff. Cool, I think we're good then. Let's see. Thank you. Stop pinning. Stop pinning. There we go. Nice, OK. All right. Alex and Sam, hello. Hey, everyone. Good morning. Hi there. All right, I'm going to try a little screen share. You want to do stereo? Yeah, let's do stereo. I'm just going to stop us. OK, and then we should only need to share the browser, right? Mm-hmm. OK, so you should see this and hear this. Yes. OK, we're golden. Nice. There you go. OK, thank you. That's everyone on the bill. Yes. You're already streaming. Yeah, we can just start. Oh, can we do one minute of breathing, please? Yeah, and a reminder for everyone: if you're not presenting, please turn off your camera and mute your mic. Then, yeah, a minute of breathing, and we can get started. So, Kate, would you do the timer? I can do the timer, yeah. OK. I'm going to present to you. OK, and sorry, everyone, for my YouTube problems this morning with the links, but I think we've got it sorted out. The proper one is ready.
I'll let her take it away. OK, you can get started. All right, good morning. Hello to you all out there in the world. This is a global presentation conference, so we have people joining us in different time zones. Hello wherever you are, whatever time it is. This is part two of the Hybrid Live Coding Interfaces workshop for today. Earlier, we had presentations on live coding as community and humanity. This session is about live coding as systems and explorations. Our first presentation is going to be by Sejo of compudanzas. So whenever you're ready, please take it away. Thanks. Thank you, Melody. Yeah, I'm very excited to be here. OK, let me start my timer also. Hi, so I'm Sejo, from compudanzas. And I will use my five minutes to talk to you about Jarod Team, a playground for the discovery, exploration, and live coding of Turing machines. The idea of this playground, of this tool, is to allow for the exploration and visualization of these foundational computational machines that are Turing machines, hopefully in a playful and joyful way. Many people might know about Turing machines, but I'd say that the details of how they work tend to be somewhat obscured by very formal language. I know that's necessary in that context, but with this playground I want to present an alternative, so other people can learn and explore these computational machines. And based on the assumption that maybe not everyone is super familiar with what Turing machines are, I will use the playground to talk about what Turing machines are, and at the same time I will be showing you how the playground works. So first of all, in general, a Turing machine consists of an entity, in this case a bird with a hat, standing over a possibly infinite row of tiles. In our case, this bird with a hat can have up to four possible states, or poses.
And then the tiles in the row can also have one out of four possible designs. The bird with a hat can move along the row of tiles, either to the right or to the left. And the bird with a hat also has the ability to change the design of the tile on which they are standing. So we have the bird with a hat, we have the possibly infinite row of tiles, and then what we can call, in some sense, the program or the code of a Turing machine is what I'm showing in most of the screen. That is a table of rules that tells the bird with a hat what to do according to the combination of the current pose that the bird with a hat is striking and the design of the tile that they are standing on. So for example, in this case, the bird with a hat is extending their wings, and they are over a completely white tile. That case is represented here. And so this is a table of the 16 possible combinations of pose plus tile. What we have below each combination is the instruction for what to do once we are in that situation. An instruction, or rule, consists of three different steps. First, we tell the bird with a hat to change the design of the tile they are standing on. For example, here we can say, OK, we want to change this white tile for a tile with lines. Second, we can tell the bird with a hat in which direction to move, either to the right or to the left of the position they were in before. And finally, we can tell the bird with a hat to strike a different pose. So we have the four different poses and an extra halting pose to end the procedure. In this case, I will tell them to stay in the same pose. So here we are saying: once you're in this situation, change the design to the lines, move to the left, and stay in the same pose. So that's basically the behavior of a Turing machine, or of the playground here.
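The rule structure Sejo walks through, a (new design, move direction, next pose) triple looked up by the pair (current pose, current tile), is the standard Turing machine formulation. Here is a minimal sketch in Python; it is illustrative only, not the playground's actual code, and names like `run_tm` are made up:

```python
# Minimal Turing machine: rules map (pose, tile) -> (new_tile, move, new_pose).
# Poses and tile designs are small ints; "H" is the extra halting pose.
from collections import defaultdict

def run_tm(rules, tiles, pose=0, head=0, max_steps=50):
    tape = defaultdict(int, enumerate(tiles))  # possibly infinite row of tiles
    for _ in range(max_steps):
        if pose == "H":                        # halting pose ends the procedure
            break
        new_tile, move, pose = rules[(pose, tape[head])]
        tape[head] = new_tile                  # 1) change the tile's design
        head += 1 if move == "R" else -1       # 2) move right or left
        # 3) the new pose was already taken from the rule above
    return tape, head, pose

# One rule set: "in pose 0 on a white (0) tile: draw lines (1), move right,
# keep pose 0; on a lined (1) tile: move right and halt."
rules = {(0, 0): (1, "R", 0), (0, 1): (1, "R", "H")}
tape, head, pose = run_tm(rules, [0, 0, 1])
# pose is now "H" and tiles 0-2 all carry the lined design (1)
```

The table of 16 combinations in the playground is exactly such a `rules` dictionary with 4 poses × 4 tile designs.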
We can step and see that some things are changing. We can play this. And then we have a timeline view where we can see the effect of all the rules that we are applying. And as my time is running out: what I want to show you here, what I have actually been doing, is changing live either the pose that the bird was striking, the state of the row of tiles, and/or the rules that they are following. So in a sense, because we are changing the inputs, the poses, and the rules, I would claim that we are live coding the Turing machine. And this timeline view might be used for maybe your next algorave or something, to live code visuals. Or you can use the playground to explore different ways of engaging with these machines. Just a last comment: you can get more information and download the tool from our site, compudanzas.net. There are also some ideas there of what kind of things you can do. And finally, as it is programmed on the LÖVE platform, you can run it on different platforms, including your mobile phone, for example. So maybe instead of doomscrolling on social media or something, you can experiment with live coding Turing machines while on the go. And that's it. Thank you. Very cool. Thank you. Thank you for sharing this. So something that struck me in your presentation: I was thinking a bit about the poses. I was curious if you could talk a bit about the inspiration for the poses that you picked. Yeah, that's a good question. And actually, that's some context that I couldn't include in the presentation. This idea came from our work: we have actually performed live Turing machines in person, using blocks of wood and then following rules. And what is shown here as poses, we have done as different types of movements or music. But performing Turing machines live, in person, can be slow.
And maybe you don't really know what is going on. So this tool came after some iterations. First, I had a more boring simulator. But then this one tries to connect with that embodiment: to be able to use a simulator to explore what would happen without having to engage with the whole setup of laying out the floor and setting all the tiles. So the poses come from actually trying to invite movement. And actually, they're already doing some kind of binary code in a sense there, but it's more about keeping it, in a sense, movement-based. That's really interesting. When you started saying that, I imagined, since I'm familiar with your work and how movement figures in it, you all dancing together, dancing with your scenery behind you. It was very exciting. We have a question from Yoni: what do you think is the importance of teaching Turing machines? Yeah, that's a great question. I'm interested in exploring these foundational ideas of computer science: computer science without, or before, electronic computers. To try to imagine, in one sense, what other possible presents or futures there are regarding the technology, or the impact of these digital or computational ideas. And so at this moment I would say it's more about that: to share more about these foundations, and to see whether, by sharing them with people who normally wouldn't be attracted to these very formal languages, maybe these ideas can be reused or redirected towards other possibilities. Something like that. Let's see. Yeah, about one minute left. One other question I had was about the word playground: I couldn't help but think about children. Is that something you ever think about when you're making this work, between the aesthetic and the idea of a playground? Does that come up? That's interesting, because I kind of do.
I mean, I haven't tried it, but yeah, for me, I connect this kind of childlike or playful approach to trying to be more inviting towards other audiences. So yes, that's somewhat the purpose. But actually, I haven't tried these ideas with children. I would like to test that; now that you say it, it's interesting. Cool. Well, thank you. Thank you so much. Thank you. And for reference, at the end of everyone's presentations, we're going to have a group Q&A, so Sejo will be back with us. We're going to move on to our next presenter: Yoni, with Knitted Fabrics as Visual Instruments. Whenever you're ready. Thank you. Let me share my screen. Hi, everyone. My name is Yoni Maltzman, and I'll be presenting on Knitted Fabrics as Visual Instruments. My collaborator, Gabrielle Olson, unfortunately couldn't make it today, but her main contribution to this work was knitting these conductive fabrics and advising on how to use them. The idea of conductive yarn is that the more you stretch it, the more conductive it is. So in this image on the left, this yarn is at rest, so there's no current flowing, and that LED is not turned on; whereas in the image on the right, the yarn is stretched out, so current can flow and the LED is on. And so there are some main questions for this work. The first one is: what kind of electrical signals can we get from knitted fabrics? And in particular, how are the signals we can get from knitted fabrics unique compared to a standard electrical or digital component that we can find in an electronics lab? Then: what kind of tactile experiences and body movements are available to us in order to obtain these signals? And then: how can we take those signals and map them to visuals? So there are a few different electrical signals I found interesting. I'm sure that there are more that I'd like to find out about.
So one of them is just stretching the yarn. That device in the back with the yellow chart is called an oscilloscope, and it's pretty neat: it can show electrical signals in real time. Stretching the yarn just increases the electrical voltage. Another effect is poking the yarn at different distances between the two electrodes. As the electrodes get closer and closer to each other, the voltage gets higher and higher, so you get kind of a step-ladder effect. And then the fabrics can also be sources of noise. So in this video on the left, I'm just crumpling up the yarn, and it's giving this noisy, stochastic signal. And in this video on the right, I am just scratching the yarn with one finger, and it's giving this noisy signal. And I think, going back to that first question that I mentioned, this is kind of unique, because if you're in a standard electronics lab at a college or something, you're not going to find an off-the-shelf electrical component that can generate noise on its own, unless it's some kind of complicated integrated circuit, whereas this is just a piece of cloth. So I think that's pretty cool. Then, in terms of mapping from the electrical signals to visuals, I ended up going with live coding. And the interesting thing, I think, is that you can change both the signal that you're sending, by messing with the yarn, and how that signal is being interpreted by the visuals, by changing the code, and do that in real time. So I'm going to show this demo. Hopefully it'll broadcast OK over Zoom. And while that's going, I'm going to talk a bit about the setup. Basically I have the voltage coming from the yarn, and that's going to an analog-to-digital converter on an Arduino, which can take a continuous analog signal and convert it to a binary digital signal, which a computer can read.
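Yoni's yarn-to-computer path (analog voltage, Arduino ADC, digital value on the computer) can be sketched as a pair of conversion helpers. The 10-bit resolution and 5 V reference below are typical Arduino Uno defaults, assumed here rather than taken from the talk, and the serial transport between the Arduino and the computer is left out:

```python
# Sketch of the yarn -> Arduino -> computer path: the ADC reports a
# 10-bit integer (0-1023); convert it back to volts, then normalize
# to [0, 1] so it can drive a visual parameter.
# (10-bit range and 5 V reference are assumed Arduino Uno values.)

def adc_to_volts(raw, v_ref=5.0, bits=10):
    """Convert a raw ADC reading back to a voltage."""
    return raw * v_ref / (2**bits - 1)

def volts_to_param(volts, v_min=0.0, v_max=5.0):
    """Map a voltage into [0, 1], clamped, e.g. to feed a shader uniform."""
    t = (volts - v_min) / (v_max - v_min)
    return min(1.0, max(0.0, t))

# e.g. a roughly half-scale ADC reading from stretched yarn:
param = volts_to_param(adc_to_volts(511))  # about 0.4995
```

In a real setup the raw readings would arrive continuously over a serial connection (for instance via a library such as pySerial), and `param` would be what the live-coded visuals consume each frame.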
That's being sent to my computer and being read by an application built with openFrameworks, which is an awesome open-source creative coding library. And then that's sending it to a compute shader, which is a really powerful graphics processing program; that's the file on the right that I'm editing in real time. And for the demo, it was only about a minute, so I can't really show too much of both modifying the code and modifying the knitting here. I'm more focused on just trying to get different signals with the knitting. But if it was a longer set, then I think there would be more interaction between changing stuff with the knitting and changing stuff in the code. For further work, I'd like to build an actual physical instrument for the fabric, so that rather than just having alligator clips on an embroidery hoop, there's something made of metal or wood, something that I could actually perform with. And dance would be an interesting application: having a dancer wear some kind of clothing with conductive cloth, so that their movements modulate some sort of signals. Or having a tapestry that's conductive in different places, or a regular string instrument that has conductive yarn for its strings. And then the images on the right are some of the things that Gabrielle knit for her undergraduate thesis, just to give a picture of all the different things you can knit: there are a lot of different structures and colors, and you can stick conductive yarn into a lot of them. So you have a really wide variety of possible instruments. And then, before I conclude, I just wanted to make some acknowledgments.
Gabrielle and the Carnegie Mellon Textiles Lab for knitting the conductive fabrics, Char Stiles for showing me how to live code shaders and openFrameworks, the developers and engineers of openFrameworks, which is open source, and the HLCI organizers. So thank you. Amazing. Thank you. Thank you for sharing that. So we have about one minute left for a quick question. Something that I was wondering about: you're working with electricity and currents, and you're also manipulating stuff a lot with your hands. I was just wondering if you have any interesting stories or moments from working with these live currents that you wanted to share. I think the interesting thing is that it was kind of hard, when I first started experimenting with what signals I could get, to get consistent signals. I might have this knit thing, and it starts at a certain voltage, and then I press it and it goes up to a certain voltage, and then it comes back down, but it doesn't come back down to the same voltage. I think with digital stuff, you have more precision and more of a consistency that you can rely on, whereas with this analog stuff, it was a little bit harder to control and really figure out what's going on. Which is also kind of fun: you have to let go of that control a little bit. And so, looking at the chat for, I guess, one quick question before we move on, from Katarina on YouTube: how many iterations has this project gone through? It's gone through a few iterations. I only started working on it a couple months ago. But I think the main thing that's iterated is that third question that I mentioned, of how to map from the electrical signals to the visuals. I ended up with this live coding framework fairly recently, but I've experimented with a few different methods for how to take the signal and convert it to visuals.
And I guess the neat thing is that now I just have different ways of controlling the visuals, so I don't need to throw out any one of those iterations; I can combine them together. For example, what I'm doing now is just taking the current voltage and having that be the thing that modulates the visuals. But another thing that I was looking at is having the voltages over time load up into a buffer. So then you have a window of voltages over time, kind of like you'd have an audio sample, and then you could do a frequency decomposition and stuff like that. Sorry, hoping that's not too technical. But that's another example, I guess. Interesting. OK. OK, so I got my timing a little wrong, so I'm just sharing a little more. We have one comment from Kate saying how they love the idea of the gesture: poke coding or scratch coding rather than typing. The gesture is very interesting. And then one last follow-up from Katarina: how long does it take to make one piece of conductive yarn fabric of the size shown? And what sort of iterations and prototypes did this project go through when you and Gabrielle made decisions together? Yeah, I can't answer a ton on the knitting side; I think Gabrielle would be better at that. But with a relatively uniform, simple cloth like this, you make it on a knitting machine, which takes a few hours. And in terms of the process and decision making, I think the main questions we were talking about were how to bring out the unique aspects of yarn.
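The "window of voltages over time" idea Yoni mentioned, a rolling buffer of recent samples analyzed like an audio window, can be prototyped in a few lines. NumPy is assumed here, and the window size is an arbitrary choice:

```python
# Rolling buffer of voltage samples plus a frequency decomposition,
# treated the way you would treat an audio analysis window.
import numpy as np
from collections import deque

WINDOW = 256                    # samples per analysis window (arbitrary)
buffer = deque(maxlen=WINDOW)   # old samples fall off automatically

def push_sample(v):
    buffer.append(v)

def spectrum():
    """Magnitude spectrum of the current window (zero-padded if not full)."""
    window = np.zeros(WINDOW)
    window[:len(buffer)] = list(buffer)
    return np.abs(np.fft.rfft(window))

# Feed a synthetic sine that completes 8 cycles per window; its
# frequency should show up as the strongest non-DC bin.
for n in range(WINDOW):
    push_sample(np.sin(2 * np.pi * 8 * n / WINDOW))
peak_bin = int(np.argmax(spectrum()[1:])) + 1   # skip the DC bin -> 8
```

Each bin of the spectrum could then modulate a different visual parameter, which is one way to combine the "current voltage" mapping with the buffered one.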
So, stuff that you can't get from a regular instrument, like a guitar or a drum, hooked up to an electrical signal, or that you can't get from regular electrical components. I think those were the main things we were discussing: how to bring out the unique aspects of yarn, and also the visual aspects of yarn. Something that would be cool is, instead of just having a uniform piece of fabric, having some kind of design and some kind of tabs to go on. That's a future direction, I guess. Thank you so much. Thank you. Right. So thank you, and that was Knitted Fabrics as Visual Instruments. We're going to move on to our next presenter. Next we have Ed Borasky with A Musical Clambake: a five-minute algorithmic microtonal video produced on a Pimoroni PicoVision. And so, Ed, whenever you're ready. Okay. Can you hear me? I can hear you. Thanks. Okay. Let me just play the video first, and then I'll answer questions. Sounds good. Let's see, which media player... Not sure if you're trying to share your screen, but we don't see your screen yet. Thank you. Do you see it now? Yes. Okay. Here it goes. This is not my first studio, but it is my first video. Thank you. Thank you for sharing that. One question that I had, and this could be a little ADHD of me, forgive me, but your title says A Musical Clambake, and I was wondering if you could talk a little bit more about what makes this a clambake to you? Clambake comes from the name of the project, CLAMS. It was just a pun. You know, funny thing: I've never seen a happy clam. Most of them were pretty steamed. CLAMS is an acronym that stands for Command Line Algorithmic Music System, and so I just thought I would call it the clambake. I see. Thank you for clarifying that for me.
So could you talk a little more about where this project has lived, or where else you have used it? Your applications of the project. Yeah. Right now it lives on GitHub. I'm in the middle of finishing up the first release, which should be out by Christmas. I'm mostly an R programmer, scientific computing, and I've done a lot of microtonal computations in R. About a year ago, I decided I was going to start playing with microcontrollers, and so I got a Raspberry Pi Pico, which is a beautiful little thing. They cost about 10 bucks, and the digital-to-analog converter is actually a little bit more. And I said, well, you know, it would be nice if I could sit at a computer and live code this thing. And so that's what I've been building. I've been working on it for about a year, and it's almost ready. I had the audio sort of working, but I just started working on the video a couple of months ago. So I did the demo in MicroPython, because the CLAMS video isn't ready yet. But at any rate, once it's up, I'm basically going to use it for making music. That's great. Thinking about how you've worked on this for so long, it sounds like a really personal system, something you're obviously cultivating, right? Is that accurate? And do you plan on sharing this with other people, or is this more of a personal project or a personal tool for you? Well, it's a personal project, but it's sitting there on GitHub. Anybody can go buy a Pimoroni PicoVision for, I think, about 35 bucks, and the code will run on it. I'm writing a book to go with it so other people can learn how to use it. But, you know, if nobody else uses it, I don't care. If somebody wants to buy the PicoVision boards, I'll build a box and sell it as a product; I'm fine with that.
I'm basically just building it and putting it out there and seeing what happens. But it is designed towards my own personal musical style, which is algorithmic, microtonal, and stochastic. Very cool. Thank you for sharing it. So we're going to move on to our next presenter, and that is Jack Armitage with Tölvera: Live Coding Alife. Thanks very much. So, yeah, hopefully you can see my screen. Yes, screen looks good. Great. So this is a quick presentation about a project that I've been working on over the last 18 months or so called Tölvera. The idea of the library is to aid with the exploration of artificial life and self-organizing systems, both in a live coding context, which I'll show, but also when working with musical instruments, which is what I'm currently doing at the Intelligent Instruments Lab in Iceland. Some of the inspiration for this project has come from the artificial life community. If anyone watching is not familiar with artificial life, that's a less well-known sub-discipline of artificial intelligence where the goal is not to do statistical analysis and optimization, like in machine learning, but to create models based on complex biological and physical phenomena and explore computation in a more open-ended way. I was also really inspired by a YouTube channel called Journey to the Microcosmos; I definitely recommend checking that out. And, obviously, I also have a background in live coding and thought it would be really fun to mess around with this domain using the practice that I've developed in live coding. This library is implemented in Python, and the reason for that is that our research group here in Iceland is focused on AI and machine learning in the context of musical instruments, and most of the software in that domain is centered around the Python ecosystem.
So we've been making different Python packages to make it more attractive for creative coders to work with Python. And this library that I'm showing today depends on two other Python libraries that we've been making in the lab, which enable Open Sound Control and MIDI mapping, and interactive machine learning. So if you're interested in creative coding in Python, you might find those helpful as well. The library itself is broken down into different namespaces: you basically import it like this, then create a class instance like this, and from there you can access these namespaces here. Oh yeah, I haven't explained the name. I live in Iceland and have been engaging with the language in some ways, and the name of this package comes from mashing together the Icelandic words for computer and being: tölva actually translates to "number oracle", and vera is a being or a spirit. So in this library the behavior models are called vera, and then there's a particle system with a sort of n-dimensional state structure. There's a drawing library which is kind of similar to p5.js. There's Open Sound Control and interactive machine learning; the graphics backend is based on a Python library called Taichi; and there's computer vision via OpenCV. So let's look at a very simple example. Here I'm not doing it in live mode; first of all I'll just show a scene run as a script. Just from a few lines of code you get this quite complex scene. So let's break it down a bit. There's a bug there that I actually need to fix. As I said, we just import the library and initialize it, and then, this is a Python decorator, if you've never seen those before: basically it's just one render function that returns some pixels, and within that we can do lots of different stuff. In this case all we're doing is updating the particles and choosing a diffusion amount for the pixels, that is, how much the previous frame decays.
Then we're using a flocking model, and we're passing our particles into that, and finally we're drawing the particles. So that's all you need to do to get this scene. Now let's look at the same scene run in a live format. The live part of this is enabled via Sardine, which is a Python live coding environment created by Raphaël Forment; it's a really great project, and I recommend checking it out. In Sardine, to have a live-editable function you use this, again a Python decorator, swim. So when I run this, we've got a similar scene, but in this case I'm not using the flock algorithm; I'm just using the simplest kind of behavior, which is move. All of these particles have a position and a velocity, so move is just moving them around based on that. But then if I want to switch back to the flocking model that you saw earlier: now they're flocking. So that's kind of the basics of the library. I'm just going to whiz through as many features as I have time to share. How are we doing for time? A couple of minutes left. Okay, so maybe I'll skip the demos and just show a couple of quick video clips. This is a piece that I did with my colleague Nicola Privato called Ferro Neural. In this case it's using the built-in Open Sound Control patch generation: when you write one of these scenes, you can automatically export Pure Data and Max patches without having to do any Max or PD coding. And then I just recently did this installation for the Magnetic Resonator Piano here in Iceland. Here we've got a grand piano being controlled by the OSC coming out of the Tölvera scene. And, yeah, it looks really nice in a space like that. But I think that's all I've got time for, so I think that means we're going over to questions. Thank you for sharing, Jack. Yes. We don't have time for an individual Q&A, but I'm sure people will have questions when we have the group Q&A in a bit.
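The scene Jack builds up (decay the previous frame's pixels, update particle positions from velocities, draw the particles) can be mimicked in plain NumPy. This toy sketch shows the same loop structure only; it is not Tölvera's API, and all names, sizes, and the decay constant here are invented for illustration:

```python
# Toy re-creation of the render-loop structure: particles with position
# and velocity, a per-frame pixel "diffuse"/decay, and a render function
# that returns pixels. Not Tölvera -- just the same shape of loop.
import numpy as np

SIZE, N = 64, 50
rng = np.random.default_rng(0)
pos = rng.uniform(0, SIZE, (N, 2))    # particle positions
vel = rng.uniform(-1, 1, (N, 2))      # particle velocities
pixels = np.zeros((SIZE, SIZE))

def render(diffuse=0.95):
    """One frame: decay last frame's pixels, move particles, draw them."""
    global pixels
    pixels *= diffuse                 # previous frame decays, leaving trails
    pos[:] = (pos + vel) % SIZE       # simplest behavior: "move", with wrap
    for x, y in pos.astype(int):
        pixels[int(y), int(x)] = 1.0  # draw each particle at full brightness
    return pixels

for _ in range(10):
    frame = render()
```

Swapping the `move` step for a flocking rule (steer each velocity toward neighbors) is what turns this minimal behavior into the flocking scene from the talk.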
That was really lovely. We're going to move on to our next presenter: Ulysses Popple, with "Phoning It In" with Nodysseus. Hey. Thanks. I'll switch the camera over. Whoops — I opened up my camera on my phone instead. Okay, this should be good. Let me know if that looks good. So this is Nodysseus. I'm going to change a few things while talking about the challenges of building a live coding system for the phone. One of the first things I'd like to point out is that I have airplane mode on. Everything has to work offline if you're on a phone, because sometimes you won't have internet. So within the thing itself I can change things like you would expect. This is a node. It has a couple of fields: there's some data, you can write a comment, edges — what the edges to the next node are — and then graph, which I'll get into in a bit. If I change the data around, you'll see that the size of the particles changes. That's because this value node is going into the Three.js node. If I take a look at the Three.js node, the graph is three.node — an openable graph that represents a node in Three.js's system. It's a bit messy. And then I'll go back to our main graph. Another thing that is important when working on a mobile system is not losing your place — being able to change things where you are. All of the nodes can have individual GUIs. This one, for example, is a dropdown that lets me change between different colors. And you'll notice — I don't know if you can actually see it — there's a similar thing in the bottom left. I can take any individual GUI from a node and promote it into my toolbar on the bottom left. Another of the challenges: you'll notice that there are a lot of particles moving around, so it's important when thinking about mobile systems to use all the threads you have available.
That's because all of the interaction on the screen runs on a single thread in JavaScript. So I want all of the particle stuff and all the Three.js stuff to go in the background. So I'm going to make a few changes. For example, we have our MXFractalNoiseVec3 and we can make some changes here. When I make these changes, these are all the nodes that Three.js has available to it, and this autocomplete function is essential for finding the node. I don't think that actually changed anything — I'm going to have to refresh, which is fine, because everything is automatically saved. Something else, which you'll notice — the colors are slightly different. Something else that's important is not having to save manually, because sometimes you'll be making changes, then you want to put your phone away and come back to the state where it was, but maybe the browser has shut down or something like that. So autosaving is important. And yeah, I think that's all I have time for. Maybe the only other thing to point out: so now I've made that bigger, but I can undo and redo how I usually would. Having nice big buttons for all the common actions, like restructuring the graph, is very useful. And that is my presentation time, I think, so I'll open it up to questions. Let me change over the screen. Thanks, Ulysses. So I guess something that I was curious about was whether touch or pressure ever come into it as a factor, since you're working with a touchscreen — like how hard you push, or maybe the X, Y or Z. That's a really good question. I honestly hadn't thought about it before, but essentially the system itself is a wrapper around JavaScript — all of this stuff is written in JavaScript, obviously — so anything that JavaScript makes available, you can use. It's just regular pointer events and stuff like that.
In my exploration so far I have not done that, because I hadn't thought of it — so thank you. Oh, I see we have some questions for you. From Sam: Sam is wondering where we can find this to play with it. Yeah — so, nodysseus.io. I should have put that in my project description, but I might add it later on. And if you look up Nodysseus on YouTube, I have some tutorials; I'm looking at adding more tutorials soon. Very cool. I have more questions. From Kate: does live coding on a mobile device become more personal — less performances for an audience and more performances for oneself, or maybe more like a game? Yeah. So this is actually my first performance-performance with it on a mobile device; I did it on my laptop before. So this is something that I've been thinking about. It also syncs: you can have rooms, like you would have a Hydra room or some other sort of thing, except you have synchronized graphs. So if you think about maybe standing around in a circle with everybody on their phones, they have this graph that they can look at, and it's synced with everybody else, to make music with visuals. For me personally, where I've used it most on my phone is building out a standard library on the commute to work, because I can just change things around and it works on the train, all that kind of stuff. I hope that answers the question. I think, yeah, maybe it is more personal — it's more difficult to perform from a mobile device. But you could also sync up and have it projected from a computer, with the performer in the audience making changes to the graph that are reflected on the computer. That's really — I think it's all really interesting.
I just started imagining images of — because you know we bring our phones to the bathroom, into bed — and I'm also imagining when you start talking about groups of people: we interact on apps and talk about them all the time, and I just imagined your work living in that. I don't know, it's really interesting. And I guess you just answered this other question from Leon, who was asking on YouTube: can you output the visuals? You just mentioned, yeah, you can share whatever's on your phone to make it communal. There's something integrated — I need to write up docs for it — but basically there's a graph that I made that can take another graph and show it just as it is. So the idea would be that you could use that. Because it's in JavaScript, you can export a JSON file that you can use in Nodysseus elsewhere, but you can also export to HTML. It will take the latest version of Nodysseus from npm and run the graph without the Nodysseus editor, just as HTML. And that HTML you can put up and host wherever, you know. Yeah, I think that's time. Very cool — super shareable business. Thank you. Thank you, Ulysses. So we're now going to go on to our next presenters: Alex Van Gils and Sam Tarakajian, with "Rhythms and Other Things: Getting Narrow to Go Broad". Cool. Thanks, Melody. And thanks, Ulysses — that was incredibly cool work. So, yeah, we want to talk to you a little bit today about Nestup. Hello, everyone. I'm Alex Van Gils, this is my collaborator Sam Tarakajian, and we wanted to talk to you about something fun we worked on called Nestup, and some new explorations in that space. So the story goes: we were really interested in these musical structures called nested tuplets.
The basic idea with one of these is you have some unit of time, you divide it up evenly, and then you go into that and subdivide further units of time to make these kind of complex rhythms. These rhythms sound really cool, and they are a feature of a lot of the music that we really love. The challenge is that they're not always easy to program in a digital audio workstation. Here's a demonstrative rhythm from one of our favorite rhythm influencers, WTF Groove. Okay — it goes on, but you get the idea. We think these rhythms are cool, and we wanted a system that people could use to work with them more easily. So we made Nestup, short for nested tuplets. The language looks like this. Very basically, it lets you make containers of different sizes, adjust the relative size of those containers according to a certain proportion, or divide them up evenly into, yeah, evenly spaced pieces of a different size. So here I am doing some live coding with Nestup in the DAW. I want to particularly point your attention to this colorful little UI at the bottom of the Max for Live device, right under the words "by Cutelab". You'll see that UI change as the rhythm that we're coding changes, and that's going to be important later. Cool. And here's that same WTF Groove rhythm from before as a Nestup expression. But this presentation isn't about music. If you think about it, the two basic elements that we manipulate in Nestup — proportion and subdivision — appear in other domains as well. And that's really what this presentation is about: our exploration of Nestup in the visual domain, which started with that graphical visualization at the bottom of the Max for Live UI. Before Nestup actually generates any MIDI or makes any sound, everything is represented in this really abstract way — you can sort of see the onsets here and get a sense of both their hierarchy and their relative proportionality.
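The two operations the speakers describe — dividing a span of time evenly, then recursing into one of the pieces — can be sketched as plain arithmetic. This is only an illustration of the idea; it is not Nestup's syntax or implementation.

```python
# Compute the onset times of a nested tuplet by recursive subdivision.
def subdivide(start, duration, divisions, nested=None):
    """Split [start, start+duration) into `divisions` equal cells.
    `nested` maps a cell index to a further (divisions, nested) spec."""
    nested = nested or {}
    cell = duration / divisions
    onsets = []
    for i in range(divisions):
        t = start + i * cell
        if i in nested:
            onsets.extend(subdivide(t, cell, *nested[i]))
        else:
            onsets.append(t)
    return onsets

# One bar divided into three; the third cell divided into three; and the
# third cell of that divided into three again, like the demo expression.
onsets = subdivide(0.0, 1.0, 3, {2: (3, {2: (3, None)})})
print([round(t, 3) for t in onsets])
```

The hierarchy and the relative proportionality of the onsets both fall out of the recursion, which is exactly the abstract representation the visualization draws from.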
So here's a Nestup expression where the outermost container is divided into three, then the third division of that is divided into three, and the third division of that subdivision is further divided into three. And as an image, that could look something like this. So for this presentation we made a playground using p5 and Nestup, where the rule is that every time a container gets divided, the canvas rotates 90 degrees before actually drawing. And now we can do a quick demo of — oh, God, where is the demo? Oh, this is the classic thing where Zoom is in front of the thing we're trying to get to. So anyway, here's that playground. Let's just try something quickly here: maybe I'll make a container of four, then take the second and divide that into three, and maybe the fourth and divide that into five, and then the fourth of that and divide it into three. Well — don't do that, don't have runtime errors. I probably made a syntax error, so I'll show you what the WTF Groove rhythm from before looks like. Here's that same rhythm that we were listening to, rendered as an image. I'd say it looks kind of nice. And because this is a playground, we're just experimenting with different stuff. Some other things you can do here: you can come in and just change one of these numbers into a variable, and then change that variable. So we thought: okay, what if you take this thing and just go deeper, and fill each cell with the same image? So there's a depth control here. You can't crank it too high, because it'll, you know, break my computer, but if I make a simpler expression maybe we can see what the depth can do.
So if you crank up the depth you get this sort of recursive drawing, which is kind of nice. And just for fun you can also crank up the rotation at each step — I don't really know what this is supposed to do, but it looks kind of fun. And you can have some fun with your drawing — oh yeah, how could I forget the most important part: you can change the color. Look at that. Okay, so you got a little demo of that playground. So, boundless thinking — the theme of the conference — what's interesting for us here is that we were able to explore new boundaries, or explore beyond boundaries, by thinking in a very bounded way. We were interested in this super specific problem, which was musical rhythm and making it easier to program in a DAW, but by focusing on just that one specific thing, we ended up exploring much broader concepts, as universal as proportionality and dimension. So we were able to expand outward by looking inward, and frankly it's just kind of fun to take a project that was meant to be about one thing and use it for something else. So thank you very much for your time and attention. Yes, thanks everyone. Thank you. Thank you for sharing all that. There's something about — when you started talking about the visuals, and seeing this relationship between the syntax and the visual representation — it all felt very synesthetic to me. I'm just wondering if that's something, like synesthesia, that you all have thought about when working on this.
I mean, I sometimes wish I experienced synesthesia, but I definitely don't. I do think there is something interesting there, though — one of the first things that really got me hooked on computer music, something that blew my mind at the time, was that you could take a rhythm, say at 10 hertz or something, just a steady beat, and then increase that, and there's a threshold where it crosses from being something you experience as rhythm to something you experience as pitch. And I think, if anything, there's this quality where there's an underlying order to the world in terms of proportionality and repetition, something that can express itself in different sensory modalities. And it's fun, I think, to get to build something that takes those universal properties into account and then expresses them in the visual domain or the musical domain. Maybe you have a similar experience. Yeah, I think that's well said. I also don't experience synesthesia, necessarily, as a sensory phenomenon, but I think what tickled us was the sort of symmetries across senses: because we like music that included these particular sorts of rhythms, and we were making music that used these particular sorts of rhythms, it was then surprising — maybe it shouldn't have been, but it was — that we also liked the artwork, the visual artwork, that came out of a similar process. So yeah, I think that's how I would describe it. Oh, I really like what you just said about symmetry across senses — I really like that phrasing, that's really nice. So, looking into the chat we have some questions, or some comments. From Kate: love the multimodal experience as well; also wondering how to embody this with gesture or movement — have you thought about that?
I just wanted to say that, you know, this is something that we put together for this presentation, but it's built on — as we were alluding to in the presentation — the Nestup language itself. Once you get this kind of structure out of it, you can apply it to visuals or music. I don't know what the application to movement would be, but I'm very curious to find out. Like a choreography interpretation, for sure — that would be very interesting. Okay, and so we are at time. Thank you, Sam and Alex — wonderful, wonderful. And so, the last presentation of the day before Q&As: we have Kofi, with "CodeFitLife, Dine in Code, and Hair and Coils: multi-sensorial excursions". Thank you. And as the title says: let's go, systems and senses. As we were just talking about synesthesia — I have a more direct approach about translating senses into other senses, so let's say from food to taste, color to sound, all these different things. But also, throughout my process I have realized that there is fun in live coding the human, because humans are built of multiple systems. Outside of the sensory system we also have the respiratory system, the endocrine system, the nervous system, the cardiovascular system — and how can we implement these things through live coding? Through different works and different design stages I have been able to prototype and demo some of these concepts. So first off I would like to ask everyone — you don't have to write it in the chat, you can just remember: what is your third favorite color? Second favorite animal? And seventh favorite TV show? You don't have to write it down, just keep it in mind while we go through this. So, as those questions suggest, this is all about senses through code and improvisation.
So one of the things I've been doing is asking how we get multi-sensory dining that includes the five senses plus the systems, and I realized that we can get data-driven experiences that incorporate the senses in unusual ways. Because, just as we've seen throughout every presentation today, there are familiar but unfamiliar traits when using different devices: it could be coding on a phone, coding on a computer, coding while you walk — and all of these integrate different senses, because it's not only what you're doing but also how you do it. If you're coding on a bus in the middle of traffic, or coding on a plane, there are different sensory things, and there's research to back up that these things do make us think differently. Through all of this, it's really about taking code from an experiential place-making and a sensory place-making, and the goal is to have a divergent narrative that each user can connect with while having different experiences. When I ask you questions like I did in the beginning, there's almost no way that we will all have the same answers, which then creates this narrative of how we're designing a new system within our old system. So this is just one of the design principles of the immersive dining, which is Dine in Code. As you can see, it takes in sound design, language design, the psychology of nutrition, case studies, multi-sensory research, and puts it together in a way that lets us approach topics that are typically hard to address. But since we're doing improvisational performance, in an art-meets-STEAM type of way, we're able to get answers and conversations that dig into this. So these are just some of the translations: on the top left we have a picture where there's a video playing, and I'm trying to translate the food and the drinks into color, trying to represent the same amount of data that you see.
And then one time, at a German grocery while I was on my phone, I decided to try a thing: what if I datafied the food — how can the food turn into code? What you're seeing is Hydra, live coded in YouTube, where I take previous food concepts and try to alter them for a different dimension. And here are some sounds to go with it. So the sound you just heard is actually two different rice dishes: I was able to quantify each ingredient by a color and then turn that into sound. That's why you hear this confusion, but maybe familiarity, because I am playing with that notion. As you can see in the two other images: this one I have spoken about a lot in the past few years, where I turn color into sound through the various RGB and CMYK values. And in this one I took Hydra, again live coded in YouTube, and tried to transform chocolate from something that is sweet into something that just looks obtuse. And in this piece I datafied the top 10 toppings of pizza and brought them into their sound. One of the highlights is that back at ICLC 2023 in Utrecht, at a restaurant, together with Renee Jordan, we actually performed Dine in Code, where we were able to take participants in a restaurant and bring them into this multi-sensory dining thing, where whatever they would say, whatever they were eating, and how they were interacting did affect the music. So it not only touched on the senses, it touched on the other systems, and also the communal aspect. Because, yes, it was a performance for all, but also individual, because depending on what you were eating and how you replied to the questions, you were probably in a different mind state. And this is just some of the process of sonifying the food, where I turned the food into RGB and CMYK values.
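Kofi doesn't show his mapping code, but the step he describes — quantifying an ingredient by a color and turning that into sound — could look something like this hypothetical sketch, which scales each 8-bit RGB channel into an audible frequency band. The frequency range and the linear mapping are assumptions for illustration, not his actual method.

```python
# Hypothetical color-to-sound mapping: each RGB channel (0-255) is scaled
# linearly into an audible band, giving a three-note "chord" per color.
def rgb_to_freqs(r, g, b, low=110.0, high=880.0):
    """Map each 8-bit channel linearly into [low, high] Hz."""
    span = high - low
    return tuple(round(low + (c / 255.0) * span, 1) for c in (r, g, b))

# e.g. a saffron-yellow rice dish vs. a tomato-red one yield different chords
print(rgb_to_freqs(244, 196, 48))   # saffron
print(rgb_to_freqs(220, 60, 40))    # tomato
```

Two dishes with similar ingredient colors land on nearby frequencies, which is one way the "confusion but maybe familiarity" effect he mentions could arise.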
And this is one of the other design processes, from data sonification work by Sarah, where when you're doing some of these performances you can lay out the design elements: what modalities are you using? What behaviors? What sound? What is the context? Even though we are improvising, it is good to have a design system in place, because whenever things do choke — you know, we embrace failure — failure is just a part of the system that you didn't account for, and that's how live coding usually is. So it's one of those things: how do we integrate stuff so that even when there's failure, or things don't go well, it's still part of the whole process. And why a reactive, divergent narrative through the senses? Because it allows things to be open source, open knowledge, freestyling, observation of the mundane, finding ways to be random. And it's not necessarily result-driven, like the questions I asked you before, but process-driven. And then also the adaptive narrative: book versus movie. In a movie they choose the characters for you. When you're reading the book, you can decide that Batman is 7 feet tall; and if it says snow, if you're in Montreal that's like 40 inches, and if you're in Florida that's 2 inches. So it allows for a personal, unique take, while being sustainable for both the performer and the audience to take in. It takes technical concepts and allows them to be diverse in how people approach them — whatever way you learn best or interact best, it allows that to shine. And then, of course, I think as adults we need to play, play, play. A lot of this design is designed by playing, because when we're playing we allow our senses and systems to engage with things we might normally not engage with. As explained before, the design process really is a lot of rapid improv design — in the case of Dine in Code, that's design, feedback, and combinations of all of it. And: what if our algorithms were told internally?
So, once again, another project I worked on with Renee Jordan recently, for the Apaka salon, was the algorithms of hair. Whereas Dine in Code was more about external things, this dealt with more internal things, because we all have hair, and hair coils. As you can see, this is a piece of tessellations done in Hydra, to represent the tessellations that can be seen in the hair and head. And while we were diving into this, a lot of styles — twists, braids, and cornrows — actually do have a mathematical structure, so that if you have musical entities you could probably attribute some of this to notes. How thick a strand is could be pitch; the length of the hair could be note length or rests — all of these things can be incorporated into something like hair. So I've talked about systems such as hair, but what about the other systems — your cardiovascular system, your brain, your muscles, your skeletal system? Well, this is something I've been playing around with recently. There's the datafication process where you can play with fitness data: I have some works where I take information from my watch and let it play out in Sonic Pi, and then also digital therapy sensory works — code diaries, meditative coding, street code, live yoga, dancing, code walks, urban snowshoeing. Once again, all of this is based on a simple thing: when we take a multi-sensory approach, we're allowing ourselves to explore, experiment, and play. And then out of that comes reflection, collaboration, misdirection, interaction, and randomness. Thank you. Thank you. One question, from Ulysses: have you found there's a limit to the number — maybe specific combinations — of senses that don't make sense together? What happens when it gets chaotic? Well, I think with the combinations — we're human, so we have a finite amount of information we can take in, so I think it just depends on how much information you want to incur.
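Kofi mentions taking information from his watch and letting it play out in Sonic Pi. A hedged sketch of the datafication half of that pipeline, in Python: turning heart-rate readings into MIDI note numbers that Sonic Pi, or anything else, could then play. The BPM and note ranges, and the linear mapping, are invented for illustration, not his actual setup.

```python
# Map heart-rate samples (beats per minute) onto a MIDI note range, so a
# faster pulse plays higher notes. Ranges here are illustrative choices.
def heart_rate_to_midi(bpm_samples, lo_bpm=50, hi_bpm=180, lo_note=40, hi_note=88):
    """Linearly map clamped BPM readings onto [lo_note, hi_note]."""
    notes = []
    for bpm in bpm_samples:
        clamped = max(lo_bpm, min(hi_bpm, bpm))
        frac = (clamped - lo_bpm) / (hi_bpm - lo_bpm)
        notes.append(round(lo_note + frac * (hi_note - lo_note)))
    return notes

walk = [62, 75, 91, 104, 98, 80]    # made-up readings from a short walk
print(heart_rate_to_midi(walk))
```

The resulting note list could be pasted into (or streamed to) a Sonic Pi loop, which is one simple way fitness data becomes playable material.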
So one way I'd answer that: we know that a one-hour movie is something like 2.4 gigabytes, so we can say that a human, seeing all day, takes in maybe 40 gigabytes. What about hearing — do you hear in a good MP3, or do you hear in WAV? That can add more. So it's one of those things: we naturally do have a limit, and when we reach our limit sometimes we get tired or we get overloaded. I think that's where the process of doomscrolling comes in. Yeah — interesting connection to doomscrolling. Thank you. Thank you. So that concludes the individual questions portion of today's session. We are now going to begin our group Q&A, so if the rest of the presenters can join back into the Zoom room, please do. What we're going to do is have a group chat — let's see if there are any questions from the audience. All right. Not yet. Not yet. I just want to take a moment to acknowledge all the various types of systems that have been discussed in this session, even just across the mediums — from very physical tools that could include gesture, to not. It's hard for me to know where to begin, if I'm honest. Maybe something that we could start off with is the role of play for everyone. Because even though only some of you explicitly mentioned play, there's just a lot of playfulness that feels really tied to exploration, and I'm wondering if people can share moments that stood out to them, or feelings, or anything like that. And as Kate just echoed in the chat: play versus rigid systems, maybe. Yeah, Jack. Yeah, I think that's really essential to me. You know, I've been developing this library through making performances and through playing in front of audiences, rather than just focusing on making very clean and professional code.
And that's a very explicit choice, to be able to fully experience what the library can do when I'm in that playful zone, and also in a live situation with other people — seeing what they're responding to, and using that to drive the decisions that I'm making. Because I think, in the absence of that, it's very tempting to just sit there coding extra functions or features that might not actually be inspiring when you finally come to hit the stage. So I think it's a good way to keep your wits about you when you're surrounded by so many choices. Yeah, I would follow that up — and I think this actually answers both questions that were asked by Ulysses and Yoni — that my system is based on flexibility, because the one thing that I really wanted to note was that the human mind, and even how you live code, changes dramatically. Right? Like, if it's snowing, you're not going to wear the same thing as if it's raining. So that's where the chaoticness comes in — like, oh, we're going back from food to food: burnt mac and cheese is a thing that happens; you see the pictures all the time on Twitter or whatever social media we use now. So it's one of those things: failure is part of life, so chaoticness and failure — having things not match — are part of the system. I think I'm kind of responding to this question. In terms of play, it also addresses something that Yoni asked in the chat about Nestup — responding sort of automatically versus the human gesture.
And I think one of the things that was really important to us in the design of Nestup — maybe Sam can speak to this a little more — is that every keystroke could be an expressive decision, a musical decision, an artistic decision. Like, how simple could the actual expression of the language be, so that you can be playing as you're live coding? I mean, it makes me think — the first thing I think of when I think of play is how play is something that needs to be sustained in order to exist. Play is something that can come together and can fall apart: if you're with people and, I don't know, not enjoying the game, it kind of falls apart. And there's a degree to which I feel like play can actually take a lot of energy beforehand, to establish the rules of the game, so that when you get there, all the things that you do are interesting and have weight. That's what what you were saying makes me think of, and what I see in a lot of the tools that people have made here: there's a lot of intentionality behind deciding — once you're in the system and using the thing, whether it's Turing machines or graph-based graphics editing on a phone — that the kinds of things in front of you, that you get to work with, are things that speak to the system you're dealing with, but also are fun, characterize the system well, and let you be really expressive. So when I think of play, I actually think of work — how much work it takes to get to a place where you can play, as broken as that maybe sounds. And then, watching Kofi's presentation, I think about playing with your food, you know — then you have this additional element. Maybe it's lunchtime and I'm hungry. Yeah, that's probably what's happening.
And maybe a final thing on this: creating Nodysseus, I'm obviously using it for live coding, but I also use it to prototype stuff at work, which is just JavaScript development. So I've had to take a look at IDEs and the kinds of tools you get in integrated development environments, like VS Code or whatever else. And, you know, most people look at a code editor and they don't think, oh, that's fun, that's playful. But then we have this whole lovely community of live coders who do look at a code editor and say, oh, that's fun, that's playful. So I think it's also in the eye of the beholder somewhat: you can make systems, and they can be more or less geared towards play, and if they're simple enough, people will play with them and have fun. I think, to what Sam and Alex were saying — yeah, it was interesting to think about the work that comes before, in order to enable play. In my case, with the tool that I shared: when you normally look up these computational abstract machines, you find the notation — mathematical notation, the letters and numbers — and it's very unapproachable. But then — at least this is what happened in my case — once you get the time and energy to try to decode it, you see that it's actually not that difficult; it's just that, for the discipline or whatever, it has to be expressed in that way. And I wanted to make this in order to share, in order to invite others, as I was telling you before, into this other kind of approach or perspective, and to try to bring more playfulness into something that is actually not very difficult, but for historical reasons is seen as very difficult, or as only for someone super into formal things. So, yeah, I also liked all the ways that play was manifested in the different projects that were presented.
Anyone want to add anything before we move on to another question? All right. So, looking at the chat. Let's see. What is your favorite story of overcoming a live performance system failure using your systems? That is a question.

So, the first time, well, it wasn't really during a performance, but the first time that I was going to perform. Basically, I was working with, and I still work with, a local rock band. I'd gone up to them after one of their shows and was like, hey, do you guys want visuals? And I was lucky that they agreed to have me. I was showing them stuff in the weeks before the show, and everything was working great. And then I practiced with them the night before the show, and everything just started breaking. And whereas, you know, they're all super skilled with their guitars and their drums and all that, it showed me a little bit that they're practicing a ton with their instruments, and so I need to step up and also make sure to be practicing, to get up to that sort of level with my instruments and treat my instruments more seriously. And I did get everything fixed before the actual show, but that was kind of a mental experience for me.

Well, I'd like to introduce another question then. Here's another question from YouTube: do you feel you're working against affordances set by conventional programming languages? What are they, and what do you get by breaking them?

We have actually been thinking a lot about this recently, because when you move to a visual programming language, it's very different from a text-based programming language, and there are a lot of things that don't really move across very easily. So, for example, variables, even something as simple as variables.
In a text-based language, you have this text, you call it by a name, and whenever you use that name it has a specific value; change it, and the value changes, and elsewhere in your system is affected. But there isn't really an equivalent of that in a visual programming language, necessarily. In a visual programming language, you kind of want everything to be visual, so you want to be able to click on something and know what the value of that thing is just by looking at it. You want some way of being able to debug, not by printing out lines of text, but maybe by putting a node down to ask: what is this value here, what am I looking at? So, yeah, I think there are a lot of things we take for granted when learning programming languages and when using programming languages. And I think that thinking about inspectability in live systems, in a visual way, is an interesting way to look at the structure of code and how it's actually performing the things it's doing. Does that make any sense?

I can also comment on the visual aspect. I was thinking about that because, at least in this tool, I was also trying to not show words, partly because I wanted this to be used, and it actually has been used, by Spanish-speaking people, for example. And I like thinking about the affordances, or maybe limitations, of programming languages, given that most of them are based on English. I mean, there are people already working a lot from that kind of perspective, you know; even if you go all the way down, you still find English in the components, and that's interesting too. And that maybe connects to what Lisa was saying about the visual.
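(Editor's note: the contrast described above, a variable referenced invisibly by name in text code versus a value that a visual editor can display and propagate live, might be sketched like this. The `ValueNode` class here is purely hypothetical, not the API of any tool discussed on the panel.)

```javascript
// In a text language, a variable is referenced by name; its current
// value is invisible unless you print it:
let threshold = 0.5;
function isLoud(level) {
  return level > threshold;
}

// A hypothetical visual-language node instead carries its value with it,
// so the editor can show it on the patch and push changes downstream:
class ValueNode {
  constructor(label, value) {
    this.label = label;   // what the performer sees on the node
    this.value = value;   // displayed live; no print statements needed
    this.listeners = [];  // downstream nodes re-render on change
  }
  set(v) {
    this.value = v;
    this.listeners.forEach((fn) => fn(v)); // propagate like a patch cable
  }
}

const gain = new ValueNode("gain", 0.5);
gain.listeners.push((v) => console.log(`gain node now shows ${v}`));
gain.set(0.8); // every place the node appears updates at once
```

The point of the sketch is the inspectability the speaker describes: the value lives on the node itself rather than behind a name, so "debugging" is just looking at the patch.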
I mean, I haven't explored it further, but I'm interested in what happens when, in my specific case, instead of using letters and numbers and so on, you just have the graphics, which, because of the way these abstract machines work, are actually equivalent; it's just a different representation of the same thing. And yeah, I'm interested in exploring that further. What does that imply, or where does that get us?

Okay, just checking in: do we have time for Sam and Alex to chime in? It looked like they had one last thought. Yeah, one last thought, please, and then we'll wrap up for the day. Thank you.

What I was just saying to Alex is not important. The last thought was just that I feel like this is the kind of thing that comes up all the time in interface design, this fundamental question of: out of the choices you make in your design, which things do you make easy and which things hard? It reminds me of the one definition of propaganda as something that redirects your attention. I feel like there can be a lot of contention around questions like: in this system, microtones are difficult to express; in this system, you can't easily break out of 4/4 to explore different rhythms or meters or tempi or whatever. Sometimes the emotion behind those kinds of concerns surprises me, and sometimes I think it's totally justified, because there is a degree to which these kinds of decisions do have a kind of propaganda-like quality to them, in the way that they say certain things are important and certain other things are unimportant. All this just to say that it really makes me happy to see the kind of work that we saw here today, because all of it pushes back against a certain kind of decision that someone else made in designing their interface, and proposes a different set of values. So anyway, just really happy to see all this work. All right, let's wrap it up.
That's a great way to end today. This concludes our second session, but first day, of HLCI 2023. Thank you to everyone for your presentations. Anyway, just a big thank you to all the presenters, really exciting work, and apologies for the confusion around the YouTube links. I'm not sure what's going to happen tomorrow, but stay tuned; we'll post them once we figure them out. So yeah, thank you all.