Welcome to Papers G. First we have Nikolai Suslov, with "From Live Coding to Virtual Being."

So, I am just introducing this talk. You can find the paper and read it, but I will try to concentrate on a few concepts which made this transition possible, because it is, I think, a dream from a long time ago. Everybody in this auditorium, I think, knows the failure of the virtual worlds of the 90s, when computers couldn't run them. Then Facebook, Twitter and the other social media stuff came out, and again all of us, or most of us, are sitting in text editors and can't move forward. So, it's just an image. We all know self-exploratory environments; one of the best known is Squeak, and now I'm doing this in Squeak. Let me close my full screen. This is a 3D space of Croquet, which runs on a Squeak image. What is a self-exploratory environment, for me? It means that everything is in the image: all source code, all methods and all your stuff are saved in one binary image. Now I will show you a simple object in this environment, for example a keyboard, which all of you who work in Squeak already know. I put it down, and yes, it plays, but with latency and so on, because it is implemented in Smalltalk. So what I do is simply connect this Smalltalk keyboard to SuperCollider and run it. Here it is. SuperCollider itself actually inherits a lot from Smalltalk, and I use its properties: just one line of code makes this connection. On the Smalltalk side I just enter the objects you want to see, and I open a SuperCollider keyboard. So this is now running, sending messages, and I have a browser inside this image.
In this browser I can change things, not by sending commands as in live coding, but by changing a method body; when I save it, it really changes. I'll show you how it's done. This is the keyboard; it's not a good view, but I choose another instrument for my mouse and just play. This is another synth. And now I come back to my world. So that is what self-exploratory environments are about: everything is in one image, and you don't need to work with source code as files and so on. The same kind of environment has appeared more recently in the browser; you could name the Lively Kernel, and you can try it. But I am going forward, to virtual worlds: collaborative self-exploratory environments, where several users can interact. These are Open Croquet and OpenQwaq; we use these. I will also show you that it works in the browser, with the most recent technology: the Virtual World Framework and Sandbox projects from the ADL laboratory in the United States. So these are the concepts which inspired this work, this paper. In 2006 there was the Sophie project; somebody may know it. It is a project from the Institute for the Future of the Book. They observed that all our programs really work in the way that we have one app and a lot of media that come into this app, so the app is just a single thing. The Sophie idea was that there should be no distinction between app and media, so that with every new media you could redefine the app. When, for example, I give you my notes, I distribute those notes together with the app. It's like everybody having a different SuperCollider, each with a different score inside. That's the first inspiration. The second inspiration is OMeta, the new object-oriented language for pattern matching. Imagine what it can do. Normally we work with one virtual machine, one language, and some object space which is programmed in this language.
OMeta makes the following possible: every object, with its own virtual machine and its own language, can coexist in a united environment. That is the second factor for travelling toward virtual being. Then there is methodology and technology from the 80s, from Henry Lieberman; we know, maybe, about software agents. Right now we live coders communicate with the computer directly, just as user and app, with no intermediary. But Lieberman in the 80s had this concept of software agents, and now we can project it onto avatars: every interaction with the computer would be done not by you, the user, but by your representation, an avatar. So you are not coding; your avatar is. You work on the keyboard only through your avatar. Another assumption is virtual time. There have been a lot of talks here, and also at programming and computer science conferences, about virtual time. The problem to be solved is all about synchronization, and virtual time is really not implemented yet, but this technology comes maybe closer to a solution. So we introduce this integration: the language OMeta, which can define a language or virtual machine for each component object, in the Virtual World Framework collaborative environment, with partial virtual time and avatars. It means, as I will show you, that we can open a browser and, for an object, add some grammars: for example a grammar for Lisp, or JavaScript, or Smalltalk, or maybe Logo. Let me show a demonstration. You can see that here I am travelling with my avatar. Now I will continue with SuperCollider: I close this keyboard and try to open such a keyboard in this world. Here is the avatar; I am walking around. I run all this support through a server which will receive messages. I have created a tangible table, and I run the reacTIVision application.
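Both the Smalltalk keyboard earlier and the reacTIVision table talk to SuperCollider over OSC. As a rough sketch of what such a message looks like on the wire, assuming Node.js; the address `/keyboard/noteOn` and its single argument are invented for illustration, not the actual messages of the demo:

```javascript
// Minimal OSC message encoder (no external libraries).
// Address and argument names here are illustrative, not the
// actual protocol used in the demo.

function oscPad(str) {
  // OSC strings are null-terminated and padded to a 4-byte boundary.
  const len = Buffer.byteLength(str) + 1;            // +1 for the null
  const padded = len + ((4 - (len % 4)) % 4);
  const buf = Buffer.alloc(padded);                   // zero-filled
  buf.write(str, 0, 'ascii');
  return buf;
}

function oscMessage(address, ...floats) {
  // Type tag string: a comma, then one 'f' per float32 argument.
  const typeTags = ',' + 'f'.repeat(floats.length);
  const args = Buffer.alloc(4 * floats.length);
  floats.forEach((v, i) => args.writeFloatBE(v, 4 * i));
  return Buffer.concat([oscPad(address), oscPad(typeTags), args]);
}

// A hypothetical "key pressed" message for a SuperCollider synth:
const msg = oscMessage('/keyboard/noteOn', 440.0);

// To actually send it to sclang (default UDP port 57120):
// const dgram = require('dgram');
// dgram.createSocket('udp4').send(msg, 57120, '127.0.0.1');
```

This is the entire "one line" of glue, conceptually: any environment that can write a UDP packet in this format can drive SuperCollider.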
And in this interaction, you see, my two markers are sending these messages, encoded as OSC. Everybody with a notebook could connect to this distributed application and also send. But it supposes that every client (and we are talking not about a client-server architecture, but about a really pure distributed one) also has the whole SuperCollider. Now I close this reacTIVision framework and show you what is done in the browser. I run a local server on my machine. This is one person who comes into the space, and I log in as the first one. Then I open an incognito window to show a second person, who goes to the same server. Sorry, it was working, but OK. So I create a world for you. Actually, this technology brings what I said about the Open Croquet architecture to the browser, so that we have everything in JavaScript. Here the replicated model, with some notion of virtual time, is realized. In particular, it means that every message goes through the reflector, which distributes these messages to all instances of the same island, and so all participants have the same model of the space. It distinguishes messages from inside the space and external ones: all hardware controllers, the mouse for example, are external, and they are what really move this time forward and start the simulation. So, for example, I create an object. The framework is built using no class system in JavaScript; it uses pure prototype-oriented programming. It was done by the Sandbox project, and it is available, but we connected OMeta to this project. So I create an instance, and now I open the script editor and define a simple grammar in OMeta to parse strings of my new language.
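The reflector model described here can be sketched in a few lines. This is a toy reconstruction of the idea (stamp external events with virtual time, broadcast them, apply them deterministically), not code from the Virtual World Framework or Sandbox:

```javascript
// Croquet-style replication, as a toy model: external events are not
// applied locally but sent to a reflector, which stamps them with
// virtual time and broadcasts them to every replica ("island").
// Identical event streams + deterministic logic => identical state.

class Reflector {
  constructor() { this.islands = []; this.time = 0; }
  join(island) { this.islands.push(island); }
  // Every external event (mouse, keyboard...) passes through here.
  submit(event) {
    const stamped = { time: ++this.time, ...event };
    this.islands.forEach(island => island.receive(stamped));
  }
}

class Island {
  constructor() { this.objects = {}; }
  // Deterministic: the same stamped events in, the same state out.
  receive({ time, id, dx }) {
    const obj = this.objects[id] || (this.objects[id] = { x: 0 });
    obj.x += dx;
    obj.lastUpdated = time;
  }
}

const reflector = new Reflector();
const a = new Island(), b = new Island();   // two participants
reflector.join(a); reflector.join(b);

reflector.submit({ id: 'avatar', dx: 3 }); // e.g. a mouse move
reflector.submit({ id: 'avatar', dx: 2 });

// Both replicas end up with the same model of the space:
console.log(a.objects.avatar.x, b.objects.avatar.x); // 5 5
```

The design point is that only external events travel over the network; the simulation itself runs redundantly on every participant's machine.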
It is the simple language from the tutorial by OMeta's creator, Alessandro Warth. When I define the grammar, OMeta actually makes these functions available for parsing this grammar inside the space. So now I create a new method to test our new language; I have something to copy and paste here. Here you see a primitive expression which will be parsed inside. I save the method, and I open the developer tools to show you the result. I call the method now, and you see it is actually 100: it was parsed from a string. Now, a more complex example that I have already added here is an L-system generator. We could implement this L-system generator just in JavaScript functions, but actually it is better to parse its strings in a more functional way, so we have an OMeta grammar for that. For example, I made a Sierpinski triangle, and if I select this triangle I can see all the grammars for it: one for analyzing the initial string, then one building the structure, then one parsing this structure, and also one simple parser like a trivial Logo, with forward and turn. And when I have a GUI, a user interface, for this object, I can actually test it: I change the string, and it goes through these parsing generators. So it all coexists inside one environment. For example, if some other person in the left corner comes into this space and programs his own language, he will program in his own language using his own grammars, and I will program in mine, and they coexist. Of course, this is not a real solution for interaction between these languages; maybe that is a problem which should be solved by the teams who do a lot of work on this. But for now, not to be kept waiting, we can experiment with this. Being in a complex environment where you can use some predefined things, I can also experiment with language.
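The L-system part of the demo can be approximated in plain JavaScript. The rewrite rules below are the standard Sierpinski arrowhead curve, not necessarily the grammar used in the talk:

```javascript
// String-rewriting L-system generator, similar in spirit to the
// Sierpinski demo (which the talk implements with OMeta grammars;
// here it is plain JavaScript).

function lsystem(axiom, rules, generations) {
  let s = axiom;
  for (let i = 0; i < generations; i++) {
    // Rewrite every symbol in parallel; unknown symbols pass through.
    s = [...s].map(c => rules[c] ?? c).join('');
  }
  return s;
}

const sierpinski = {
  axiom: 'A',
  rules: { A: 'B-A-B', B: 'A+B+A' },  // '+'/'-' = turn, letters = forward
};

const gen2 = lsystem(sierpinski.axiom, sierpinski.rules, 2);
console.log(gen2); // A+B+A-B-A-B-A+B+A

// A trivial Logo-like interpreter would then walk this string:
// on 'A' or 'B' move the turtle forward, on '+'/'-' turn by 60 degrees.
```

Generation 1 is `B-A-B`; generation 2 rewrites each of those symbols again, which is what produces the self-similar structure the turtle then draws.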
Because if you come to this environment, you need to program in JavaScript; but if I want to program in SuperCollider or Lisp, I can implement it, and in a better way: not hard-coded, but at a meta level. So, that is this, and I return to my presentation. Sorry. Now, about the projects in which we have tried to experiment. Here is Tatiana Soshenina, an architect and designer, who wrote the paper with me. We experimented with some musical ideas, where we wanted to make an interaction between musicians who sit on chairs with sensors: through SuperCollider, a chair that moves near other people produces a sound, and there could be not one but several chairs, interconnecting with each other. We tried to implement this in the virtual world. Of course it is an experiment; it is not really finished, but it is one of the experiments we could do. Another experiment we are doing is a CAVE system. A CAVE is actually four walls on which we have a lot of visuals. The problem is that it is a really complex system, and what these open-source tools give us is that, I think, any pupil could build one, and extend it too. The other problem with such a system is that it is difficult to reprogram: you need script editors or scripting languages. These virtual worlds easily solve this problem of extending such environments and reprogramming them in real time. This is actually a screenshot of it. On the multi-touch table you also see an avatar, which I have shown you. We made something like a piano keyboard and experimented with the markers. So, I just wanted to experiment with this. And maybe there are questions.

Yeah, maybe we have time for the traditional one question. No questions? I'm sure people will talk to you more later.

Yeah, I understand. The people are standing around the chairs, while you're capturing the movement into the virtual reality?
I mean, you saw the picture of people standing around you, and the picture of people in virtual reality. They're being sensored; well, sensored, I mean, sensorized. So, I mean, could that be working in your virtual reality, in reality?

You should imagine it as an artist would: it is a tool, really, for artists. It's really experimental; it just falls down every time, and you just experiment and dig into it. Because it is open source, this environment is for experimenting with new languages, with new paradigms. So, yesterday there was the discussion about live programming. In Smalltalk, for example, you can modify a method body and it just does it, and in JavaScript you have this too. And when I showed you the OSC part: you could use this environment as a spider which produces SuperCollider code, because it really does produce this code; and when you look at a live coder's program, you could actually program in some other language and show, in a screencast, SuperCollider code. So it's really new. But the main feature, maybe, of what I want to say to you, is that everything you see here is done through an intermediate object, an avatar. If I want to write text, I have to bring up the text window and then program this environment through my avatar. And this changes the game in live coding, because you have a lot of virtual stuff which will move things forward to another level. Because now we are like puppeteers, just keyboards with arms on this hardware, which is not good enough for now.

OK, thanks very much. So next up we have Talia, presenting on sociability in live coding.

And with the avatars' messages to the artworks, our aim is to merge real and virtual museums in an architectural environment, as code for all the world.
And it must be in real time, in live coding, and we could change it from all over the world and be together there.

OK. Well, today I am presenting a short paper that proposes an approach to the activity of live coding as an artistic configuration, constituted as a creative practice of improvisation, openness and constant exploration. I just want to share some thoughts about sociability in live coding, from an anthropological approach whose method is ethnography. I won't go into detail about ethnography, because Giovanni already did it, and he made a very detailed presentation of ethnography. Well, the live coding activity arises from the start as a collective activity, keeping interaction with each other in mailing lists, through the publication of their programming languages, during performances by opening a connection with the audience, and in many, many other ways of interaction. For example, a new one is extramuros, which we saw in the last session: another example of interaction. In the live coding activity there is a current intention to explore the capability to skillfully improvise with code in a challenging way. But there is also an intention of further developing a kind of participatory community, in which everyone can participate without being required to have programming mastery. These are some quotations from the fieldwork. In order to examine the participatory intention, I would like to refer to sociability in terms of imagined community. According to Benedict Anderson's concept of imagined community, communities are to be distinguished not by their falsity or genuineness, but by the style in which they are imagined. In this sense, there would be an idea, and a collective construction from that idea. On the other side, there is music as mediation. For the anthropologist Georgina Born, music has a plural and distributed materiality.
Its multiple simultaneous forms of existence, as sonic trace, as notated score, as technological processes, as social and embodied performance, indicate the necessity of conceiving the musical object as a constellation of mediations. Music, she says, requires and stimulates associations between a diverse range of subjects and objects: between musician and instrument, composer and score, listener and sound system, music programmer and digital code. Music appears to be an extraordinarily diffuse kind of cultural object: an aggregation of sonic, social, corporeal, discursive, visual, technological and temporal mediations; a musical assemblage, where this is understood as a characteristic constellation of such heterogeneous mediations. Live coding has been constituted as a collective artistic expression that mediates and builds sociabilities and subjectivities in a socio-technical context; there could then be socio-technical mediations in the case of the live coding scene, because when technology is not only appropriated but is being experienced by people, it is in turn built, and one's own experience with technological devices makes sense. There are some authors on socio-technical interaction, but the main points are that the social and the technological are not meaningfully separable things; that social behavior influences technical choices; and that the system participants are embedded in multiple overlapping, and also non-technologically mediated, social relationships, and therefore may have multiple commitments. Well, new projects and proposals aim to demystify the relationship with technology, making code a craft or artistic material, but more than anything there is the construction of a participatory community: open spaces, spaces to express and to build transformations, not only in the artistic or cultural field, but also institutional. The live coding scene involves building an entire world, an art world in the terms of Howard Becker.
According to the author, those who cooperate in producing a work of art do not do it from nothing, but rest on past agreements, conventions, which usually cover the decisions to be taken; and this makes things simpler. However, Becker explains that people can always do things differently, if they are prepared to pay the price. There is a quotation that talks about what it is to pay the price in order to make things in a different way, in new ways. Becker says: if that is true, we can understand any work as the product of a choice between conventional ease and success, and unconventional trouble and lack of recognition. A kind of way out that live coders found from this difficulty in building an art world was, I think, to place the art in the process, more than in a finished product. So the emphasis on process, in which materials, digital and analog, are more important than materiality or a final product, allows live coders to advance in the construction of their activity and of an art world that is always changing, exploring the role of technology in art and of art in its technological forms. It is there, in the construction of those environments in process, where live coders feel creative and create from improvising in the space of active materials. And as a conclusion, regarding the social aspect of the social settings arising from artistic practices, art worlds: if, as Howard Becker explains, conventions make it easier and less costly to build an art world, but more expensive and difficult to make deep changes, then the case of live coding probably contributes to the acceptance of change as a constant. Within a framework in which artistic expression is a process rather than a finished product, the live coding scene makes change, openness and constant exploration the practices that constitute a creative activity. And this is one last quotation: people's expectations, anxieties and concerns represent a formidable social power, the one that brings a group into existence. Thank you.
Live coding is participatory in terms of letting the audience in on the process. But I'm a bit concerned, after being here for three days, about the actual people who are coding, and whether that is actually opening up enough to be enough of a participatory community. Because the majority of the performers we've seen are white men, right? So is this where we're failing at building a participatory community: within the coders themselves?

You mean that there is a problem of not-so-openness, maybe?

Is that something you've come across?

Yeah. I remember once talking to Alex when he was arranging the people that would participate, on algoraves and all that: if there are women to make a performance, they are in, and the people that drop out of that algorithm, in that case, are men. But that is the experience that I have with this group of live coders; I don't know. I don't know if that answers your question.

I was just wondering if that was something you considered in this idea, before you came in.

Yeah, yeah. I suppose it's between ideals and actual realities. It's kind of the same thing inherited, maybe, from the free software community: very strong ideals that people very forcefully defend, but if you actually end up participating, you do tend to be a certain personality type, and a certain gender. The ideals of sharing and collaboration are undermined by the way they are enforced, which stops certain kinds of people from participating. I don't have any answers for how to improve it.

About the Mexican live coding scene: I think it's also a matter of, well, yesterday evening I was having a very interesting discussion about performing for all different audiences, and what it is to be inside this whole academic world and the research world, versus what it is to be actually in the wild, playing in places where nobody cares about what is going on in the academic world.
And not only the Mexican live coding scene; I think it is something that is happening pretty much outside the academic world. There are some people involved in the academic side, but it's not that strong in that way; the academic part is really not important in that sense. And I think this same phenomenon might be happening in many places in the world: many people who are not around here that it would actually be interesting to have around to discuss these kinds of things. But this is going to happen with every topic that you discuss inside the academic realm, because we are always dismissing everything that is not in this room, and I think that includes gender and racial things. I don't know if there are ways to open these spaces, to merge the academic and the wild in a better way. I think this is a challenge for every performer and for each of the academies.

We should move on to our next talk, the last of the session. So we have Charles, with the talk titled Live Patch and Live Code.

Let me check my settings. It's turned off. That's good. Great. OK. This talk is called Live Patch and Live Code, and it explores live approaches to non-digital computing systems. So I want to start with historical live systems: the MONIAC. How do we go to the next one? I'm bringing this up partially because it's kind of cool, and also because it's got a connection to Leeds. This is the MONIAC, the Monetary National Income Analogue Computer. This is the Mark II, the one in New Zealand; there's one somewhere in the UK that actually works. But the Mark I, of which there's only one, is actually in the lobby of the business school, about five minutes' walk from here. During your break, the receptionist will happily point it out to you. It's really cool. I didn't get any good pictures of it, so you get to look at the New Zealand one here.
It models Keynesian economic theory. It's a little bit hard to see with the lighting in here, but what you have is a bunch of little tanks and hoses: it's a water computer, with red-dyed water in it. You can turn up a pump that says "tax rate", and it pumps a lot of water into the treasury; then you can turn on government spending, and investment, income and savings, and water moves around the thing. And in 1949 there was the idea of trying to create a stable economy by getting the knobs exactly right. So people used these for years; they didn't fall out of fashion until the 70s, with the advent of monetarism and the ditching of the idea that having a stable economy is key and benefits people. If you get the knobs wrong with this, a buffer overflow is a very literal event. So this is absolutely a live system. It was also used in research: the guy who invented it was named Phillips, and the Phillips curve, which relates unemployment and inflation, was partially worked out on this. More general-purpose, you've got analog computers. This one isn't special, aside from the fact that it's really cool. These are also operated very live, and they're much more general-purpose: you plug in a bunch of patch cables, set initial conditions, and run simulations from an initial state. And then if you don't get what you want, you have to reset it to the initial state, which is slightly a pain, but it is absolutely live. All of these modules you see are function generators; that's function in the sense that, I don't have a whiteboard, but for every x value you get exactly one y value. So every physical patch cable carries a varying voltage that corresponds to a function. You could have something that's generating an upwardly moving voltage, and then something else that's taking a sine function of that, and then running it through an integrator, which is an idea that might be slightly familiar. These are also functions.
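The patch just described, an upwardly moving voltage fed through a sine shaper and then an integrator, can be simulated numerically to show the "every module is a function" point. The step size and the integration limits below are arbitrary choices for the sketch:

```javascript
// Numerical sketch of the analog patch described above: a ramp
// voltage, fed through a sine function, fed through an integrator.
// Each module is literally a function, which is the point being made
// about analog computers.

const ramp = t => t;                    // upwardly moving voltage
const sine = f => t => Math.sin(f(t));  // sine shaper on any input

function integrate(f, t0, t1, dt = 1e-4) {
  // Simple Euler integration, like an analog integrator module
  // accumulating charge on a capacitor.
  let sum = 0;
  for (let t = t0; t < t1; t += dt) sum += f(t) * dt;
  return sum;
}

// Integrate sin(t) from 0 to pi: analytically 1 - cos(pi) = 2.
const out = integrate(sine(ramp), 0, Math.PI);
console.log(out.toFixed(3)); // ≈ 2.000
```

Re-running `integrate` from the same initial conditions corresponds to resetting the analog machine to its initial state before another run.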
Unit generators are also function generators, in the same way that analog computers are function generators, so writing a UGen graph is creating an algorithm. And so, this thing that came up yesterday about the difference between live programming and live coding: while not wishing to court controversy, something that is considered an important part of live coding, notably in one of Nick Collins' papers from 2011, is that live coding is perturbing an algorithm. It's not just establishing an algorithm; it's making a change to it. So this is part of the reason that I want to talk about analog systems: partially because I do it, but also because sometimes looking at weird boundary cases gives you an insight into the thing as a whole. When I write proposals saying "I'm going to come to your live coding conference and I'm going to patch cables", but the algorithm doesn't change, it's clear from the reviewer feedback I've gotten, not just from here but from every other place I've proposed this, that perturbing an algorithm is an important part of live coding; and that, maybe, is the difference between live coding and live programming, if we're going to have that difference. The other thing about perturbations is that we tend to think of them in terms of control systems (this is my synthesizer in a church): control systems, not just the UGen graph but more the control-rate stuff, seem to be an important part of live coding.
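The establishing-versus-perturbing distinction can be made concrete with a toy signal graph whose nodes are swapped while it keeps running. This mimics re-plugging a patch cable or redefining a running function; it is not any particular synth's or SuperCollider's API, and all names are invented:

```javascript
// Toy "signal graph" whose nodes can be hot-swapped while it runs,
// illustrating perturbing (not just establishing) an algorithm.

const graph = {
  osc:    t => Math.sin(t),   // oscillator
  shaper: x => x,             // pass-through waveshaper to start
  gain:   x => 0.5 * x,       // output level
};

// One output "sample", always read through the current graph:
const tick = t => graph.gain(graph.shaper(graph.osc(t)));

const before = tick(-Math.PI / 2);  // sin = -1, gain 0.5 -> -0.5

// Perturb the running algorithm, like re-plugging a patch cable:
graph.shaper = x => Math.abs(x);    // now a full-wave rectifier
const after = tick(-Math.PI / 2);   // |-1| = 1, gain 0.5 -> 0.5

graph.gain = x => 2 * x;            // turn it up, too
const louder = tick(-Math.PI / 2);  // -> 2
```

Establishing the algorithm is building `graph`; perturbing it is every assignment after `tick` has already been running.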
Synthesizers obviously have the capacity for that kind of liveness, and also for complexity, which seems to be less important now than it was in, say, 2007, when the late and sadly missed Click Nilson wrote a paper describing an email exchange he had with Julian Rohrhuber about systems growing in complexity until the programmer can no longer tell what's going on anymore. I think probably all of us have experienced this at some point, like "I have no idea why sound is coming out, or is not coming out", which is, again, a thing that can happen with a synthesizer. I started doing live patching in about 2004-ish, and I didn't know anyone else doing it. The way I started, sort of intuitively, was from a blank surface: I would plug in cables until I had no idea what was going on anymore, and either the sound ran out of control or it just stopped, which is an approach that live coders use. But since I heard of TOPLAP, I have thought of ways that I can integrate live-code performance strategies into synthesizer patching, because, as we all know, just having the capacity for liveness doesn't make what you do with a system live coding per se. So I've tried to apply these to synthesizer patching. For "show us your cables", as it were: it's hard to see here because of the angle the picture is taken at, but the synth is at an angle to the audience, rather like the piano there, off to the one side and angled, which means that if I were patching, most of the people in the room would be able to see what was going on from a distance.
I experimented at the Live Code Festival in Karlsruhe with adding video projection to this, with a webcam. And for that, I followed the convention for cable colours from the Royal Conservatoire of The Hague: obviously the signals in the cables themselves aren't different, but just to help keep track, blue cables for control-rate signals, red cables for triggers, and black cables for audio-rate signals. So I tried doing that, but what I found out is that even here you cannot see the black cables at all on this nice high-res picture, and on a webcam, forget it; so that didn't work out so well. And these cables are expensive. Then, for a further pedagogical thing, because I was running a webcam anyway, I had the idea of a little USB MIDI device, so that when I plugged in an LFO it would bring up a slide that said "LFO". That was sort of over the top, so people could see what was going on; nobody else does this for live coding, and it was kind of a pain, so I've stopped doing that. But those are the experiments that I've done. And then another thing that seems to be important for live coding is not just building up a very complex graph, but switching between graphs. You do this on a synthesizer much the same way you do it in Max or Pd: if you've got two graphs, they're both always running at the same time, but you can change which one of them, or both, is going to the output. It's also possible to make decisions with analog synthesizers. This particular one has a binary decision module, where if a voltage goes over a threshold it changes the routing. But actually, even an envelope generator, if you're using it to control the amplitude of something, is a way of making a decision: when the voltage is high the envelope is going, and when it's low it's not, so you're getting sound or you're not. Some people also feel that scheduling is an issue with live patching. It's not something I've actually ever really worried about, but if you are into scheduling, of course,
analog sequencers are a thing that has existed for a long time. But also, if you're planning on triggering an envelope and you've got any sort of stochastic triggering mechanism, you don't know when your envelope is going to fire, so the same sort of uncertainty in scheduling future events is possible. And my talk has gone really short, so does anyone have any questions?

How fast can you patch?

Isn't that important! I have all the cables around my neck, like a sort of shocking scarf, so I remember where they've gone, fast. I skipped a section anyway. Yeah, you plug them in, but you have to look at what you're doing. I think it's not a physical speed problem as much as a cognitive problem, which is also the issue of losing track of where things are going. You have all the cables in front of you, and there's colour coding: I have three colours of cables, and they each have six colours of ends, so I should be able to look; but you end up running your fingers along them to see what's going on. So the speed problem is here, not here, probably.

I was having a conversation just before this session about what might lie in the territory between the symbolic and the sub-symbolic, it being perhaps something distinctive about live coding that the symbolic structure matters quite a lot, whereas this seems to be a sub-symbolic thing. I was wondering if you have managed to theorize any of that territory between the two.

I mean, this is part of the reason that I was talking about analog computers, because this has grown out of the symbolic and it has become more practical. I don't do patcher languages anymore, because I find them annoying when you're not actually plugging things, but I feel that patcher languages are actually operating on a very similar level, and so I'm not sure; I think the creation of a binary opposition there is problematic. Yeah, I don't actually know the answer to your question, but it's really interesting. Because I think, if you're
using a laptop and a patcher language and you're making continuous textural sound with it, then that's a very analog experience, I think. But if you're making lots of discrete note events, and you can do that with an analog machine like this, then that's a digital interface. And just plugging in and out, that's digital; you produce discontinuities, and that's another digital layer. So I think everything is inherently hybrid, and also always layered up: you have this analog system, and you sort of simulate a digital system in that, which is the laptop, and then use that to simulate another analog system, which might be a 3D world or something. So I think people get a bit hung up on this, but analog and digital, while clearly different domains, are always both present and always intertwined.

This division between digital and analog is very much something that has arisen more recently. Like the legendary laptop band: they were using analog signal generators, especially in the early days, especially when they were the League of Automatic Music Composers. Automatic or algorithmic? Anyway, those were analog signal sources that happened to have a digital control mechanism. And a lot of the people, especially in the sort of US West Coast scene, who were doing home-built electronic systems, as soon as they got KIM computers or other digital boards, the first thing they thought was: great, I can use this as a module to control my analog sound generation system. Because obviously there was no digital synthesis that anyone could afford unless you were at a massive institution, so the two things very much went together in the early days.

Have you ever used a computer to control your synthesis?

I think I did something like this once when I was at The Hague. You can send out numbers to, what do you call it, a digital-to-control-voltage converter, and then you're just running a sort of number generator. But I mean, at that point,
what do you do with the number generator? You have to plug it in, and it's very much a divided-attention problem; I find it too hard to concentrate on both things, and at the end all I'm getting out is a stream of discrete numbers, which is not that useful for the amount of work. One thing I have done is play in a duo with a guy who turned out to be a live coder, although I didn't know what he was doing at the time, where I was sending him a signal and he was processing it live. So I've done that kind of thing.

How do you move radically from one state to another? For example, if you're patching in Max or Pd you can copy, paste, make the changes, and then repatch.

I guess you could just buy another set of synthesizers and prepare your next graph in advance. Yeah, I mean, this is very much a limited system: I've got three audio-rate oscillators and three filters that I can make oscillate if I need to, and a low-frequency oscillator; if I want more oscillation than that, I'm out. But if I were running Pd on a Raspberry Pi, especially a first-generation one, I would also very quickly run into CPU limits. And to some extent people are doing this where they actually breadboard small-scale synthesizers and build the logic up from the chips as they are playing, and some are building explicitly breadboard interfaces into the synthesizer modules, so people can actually make these live connections and change the circuitry as they are playing. So I think that's interesting. And maybe, as Philip Stearns explained, his was music that was coded; you know how he was working with this, and for me it was definitely coding, even though it was with chips and hardware, putting wires into breadboards.

Cool, so your title, live patch, live code: you had me at the title. The thing I'm thinking about is the interface, what's different between this and the environments we often use when coding on computers. The thing that struck me right away was the fixed interface, whereas in our
coding environments we often have these interfaces where we can move things around, juggle them, reconfigure them for particular moments, whereas the setup you are showing us in the photo is kind of immobile. Of course there are other modular systems, other artists with smaller modules, like littleBits, that they move around. So I'm just wondering if you have any thoughts about that.

There is sort of a continuum, to use a high-level framing, between a chainsaw and an idea, where a tool is a chainsaw and an idea is an idea, and this does have something of both about it. It very much has tool aspects, interfaces for using it that are fixed, and if you patch something, especially if you don't grow to outrageous complexity, you can start making predictions about what's going to happen when you turn the knobs, and it becomes very much an instrument. Speaking of the experience of a player: I also play tuba, and that is a really embodied experience; it doesn't even feel like using a tool, it's almost like an extension of myself. And then there's live coding, where it feels very much like a cognitive activity. Honestly, I'm not very aware of being embodied when I program, so when people talk about programming gesture, it's not something I actually relate to in that way. But this can go either way, it can be in between, depending on how it's approached. And in the future we're maybe supposed to have better interfaces for computers, like people doing weird 3D things, so you can imagine maybe some live coding interfaces will be more like that. Maybe not, I don't know.
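To make the graph-switching idea from the talk concrete (two signal graphs both always running, with a binary decision module deciding which one reaches the output), here is a minimal sketch. Everything in it is an illustrative assumption, not anything from the actual instrument: the names `graph_a`/`graph_b`, the 220 Hz sine and 110 Hz square test signals, and the slow LFO used as a control voltage.

```python
import math

# Two signal graphs, both "always running": each yields one sample per tick.
def graph_a(t):
    return math.sin(2 * math.pi * 220 * t)             # 220 Hz sine

def graph_b(t):
    return 1.0 if (t * 110) % 1.0 < 0.5 else -1.0      # 110 Hz square

def decision(control_voltage, threshold=0.5):
    """Binary decision module: route graph A while the CV is over threshold."""
    return control_voltage > threshold

sample_rate = 44100
out = []
for n in range(sample_rate):                           # one second of "audio"
    t = n / sample_rate
    cv = 0.5 + 0.5 * math.sin(2 * math.pi * 0.5 * t)   # slow LFO as the control
    a, b = graph_a(t), graph_b(t)                      # both graphs run every tick
    out.append(a if decision(cv) else b)               # only the routing changes
```

The point mirrored from the talk is that both graphs are computed on every tick; the comparator only changes which result reaches the output, just like threshold-based routing on the synthesizer.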
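The scheduling uncertainty discussed in the talk (with a stochastic trigger source, you never know when your envelope will fire) can also be simulated. The exponentially distributed inter-trigger times below are an assumed stand-in for any random trigger module; only the average rate is known in advance.

```python
import random

random.seed(1)  # fixed seed so the sketch is repeatable

def stochastic_triggers(duration_s, mean_interval_s=0.5):
    """Trigger times from a random (Poisson-like) source: the average rate
    is known, but when the next trigger (and envelope) fires is not."""
    t, times = 0.0, []
    while True:
        t += random.expovariate(1.0 / mean_interval_s)
        if t >= duration_s:
            return times
        times.append(t)

triggers = stochastic_triggers(10.0)               # ten seconds of triggers
gaps = [b - a for a, b in zip(triggers, triggers[1:])]  # unpredictable gaps
```

Each entry in `gaps` is a different, unplanned interval, which is exactly the uncertainty about future events described for live patching.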
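Finally, sending numbers to a digital-to-control-voltage converter, as discussed in the Q&A, really does reduce to emitting a stream of discrete values. In this sketch the 12-bit resolution, the 0-5 V range, and the `write_dac` stub are all hypothetical placeholders for whatever converter hardware is actually attached.

```python
import random

DAC_BITS = 12                      # hypothetical converter resolution
DAC_MAX = (1 << DAC_BITS) - 1      # 4095 for a 12-bit DAC

def volts_to_code(v, v_max=5.0):
    """Quantize a 0..v_max control voltage to a discrete DAC code."""
    v = max(0.0, min(v_max, v))    # clamp to the converter's range
    return round(v / v_max * DAC_MAX)

def write_dac(code):
    """Stub: a real version would ship the code to the DAC over USB/serial."""
    assert 0 <= code <= DAC_MAX

# "At that point you're just running a number generator":
for _ in range(100):
    write_dac(volts_to_code(random.uniform(0.0, 5.0)))
```

The divided-attention problem from the talk sits outside this code: someone still has to patch the resulting control voltage into the analog system by hand.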