Hello everyone, thank you for hanging out, or for showing up if you just got here. For those of you who were in the room at the end of the Temporary Distortion presentation and conversation, we're going to continue discussing AI and performance with the artists here on stage. And sorry, I need my glasses nowadays. So where I'd like to begin, by way of letting people introduce themselves, is not only with their work and what they do and who they are (I'm sure most of you know a lot of the people up here), but also with what first led each artist to bring AI into their work: a little about what that project was and what year it was created. Some of us on this stage have been doing this much longer than me, for example. So I figure we just start there. Then I have a few questions, but I also wanted to open this up a little differently than usual, because I'd like the people on the panel to moderate this as much as I'm moderating it. I feel like there's a lot of expertise in the room, and I'm sure each artist up here has incredible questions to pose both to themselves and to the others. So if we could spread it around that way, and then hopefully at the end have some time for questions from the audience. Yeah. So why don't we just go across, and we'll start next to me. Is this supposed to be on? Yeah.

Hi, I'm Marianne Weems. I'm the artistic director of the Builders Association, a cross-media performance company that's been around forever, by which I mean since 1994. And yeah, we've done a lot of projects using various kinds of technology, mostly to talk about technology and how it's framed in terms of labor and the transactional aspects of our culture. And that doesn't sound very interesting, did it? That's so boring.
So for instance, we did a show recently about microworkers and online labor, the Mechanical Turk community that serves Amazon, and we worked with them for a couple of years; that's part of our process, it's kind of labor-intensive. They created a project with us, a performance and online show, a kind of second-screen event where you as the viewer had to perform tasks based on what the microworkers assigned you. You were scored and paid in Builders Bucks. And I can't even remember, but okay, let me just quickly say: the next show we're working on is going to launch in an election year. It's an election-year show about a post-truth candidate that we're creating with AI, embodied by Moe Angelos, who is in the audience tonight. So a lot of what we're working on now is how that specific candidate is going to be generated. I'll stop there for now.

Hi, I'm Andrew Scoville. I'm a director and a creator of original work. A lot of my work is about bringing science concepts into theater; that's some of my more recent stuff. But in 2013 and 2015 I worked with a robot called Bina48. Bina48 is essentially an animatronic bust mixed with a chatbot, and with some collaborators who are up here, and Lynn Rosenberg, we made two projects where Bina was essentially an actor in the show, interacting with the other actors.
We didn't, at that time, set out with the intention of working with AI; we were mostly exploring content. It was one of those situations where you're making a piece that you think is science fiction and then you stumble upon something that is sort of a proof of concept of the idea you thought wasn't real. We stumbled upon Bina48 as a research project, and the more we got to know Bina, and the more we got to know Bina's handler, Bruce, the more Bina became part of the project, and we incorporated Bina into a couple of pieces as an actor.

Thank you. Annie? I made my first piece using a very rudimentary, old-fashioned form of natural language processing, a kind of AI, in 2009. That piece did not use big data or statistical analysis; it was an old-fashioned AI from before the statistical turn in the development of generative AI. I just made a piece that is my response to generative AI, so we can talk about that if you're interested. But I guess what I mostly want to propose, after almost fifteen years of working with these things, is that working with AI is political, that there's a political dimension to the use of these tools. It's not at all neutral, and it's not really like previous technological products that have been introduced at various times. I'll be happy to say more about that. I'd also mention that I'm in law school right now and have been studying tech policy, so I'm happy to take questions about some of the lawsuits that artists are bringing against the tech companies. I can talk about this on both the input side, the training on artists' works, and the output side, the status of the AI-generated material.

We just talked a lot. So, Sally's part of Temporary Distortion, but some people just walked into the room, so we'll keep it short: we just showed some work in progress, and it's our first piece working with AI.
So we don't really have the history that I feel like most people here have, but it's what made me curious about putting this panel together, asking these questions, and starting this kind of conversation in a venue like Prelude. So I guess one of the questions I have off the bat is: is there anything that integrating AI into your work has allowed you to do that you wouldn't be able to do otherwise, or is that not true?

Well, it's definitely allowing us to play with live text generation, specifically working with this candidate, because we want to try to make a model that could be electable. That's our idea of this post-truth candidate, but we're using Ayn Rand's work as the point of departure. So it's a tough sell, but not impossible if you tweak it with a little Dolly Parton, a little Roald Dahl, you know. So the whole idea of generating it in real time and finding what that voice could be, you know, as a jam, right, that's something we couldn't do otherwise, and we're very much depending on the tool.

Yeah, I feel like... is this on? Okay. What it allowed us to do, in the particular pieces where we used it, was to give people a chance to observe that kind of AI that's embodied, attempting to mimic human expression and hold a conversation with another person, and to create a different sense of awe that I feel like we're always looking for in theater, that sense of wonder. There was something about Bina's presence in the room that does that automatically. How long that lasts is a good question, and I feel like it's determined by the actors who are on stage with Bina and their interest in actually staying with it. But there was a sort of shortcut to this new sense of awe at that time that was really exciting.
All my work is really about investigating these technologies: natural language processing, what computer-generated text does, how it functions, some of the ELIZA-effect stuff that you're talking about. So none of my works would have been made unless the technology was something I wanted to be looking at. Yeah.

I would just add, for me, I'm kind of an analog advocate. So I like to juxtapose the new technologies that come in, because it kind of highlights the uniqueness, the magic, of the moment when analog had its peak in history and was then replaced by digital. Something about that, I think, is a human story told through our machines. So no, I probably wouldn't be able to do the work without whatever the next thing is that comes down the road. That's what I find really interesting about it.

I'm curious, for Annie: Andrew mentioned the sense of awe and wonder, and I read the piece that you wrote recently about AI. For those of you who haven't read it, I wonder if you could speak a little bit about it. My short version would be that you're kind of stepping back a little; is that the sense I got from reading it, or is that a false impression?

I mean, I've only made the one piece with generative AI, and I would not do it again, and I only made that piece in order to criticize generative AI. So I'm not stepping back from algorithmic work, but I am strongly advocating that people think very hard about what they want to do with this tool, how they want to use it, and what the meaning of using it is. In that essay, would you want me to talk about the ELIZA effect, or...?

I'm happy to let you talk about whatever you'd like, but I think it would be nice for people to have a little kernel of what that essay was about, and then maybe that could bleed into us asking questions of one another.

I will try to be super succinct.
So the basic thesis of that piece, which doesn't really have a thesis, it was more of a thinking-through kind of text in American Theatre (yeah, no one saw it), is that artists are playing a very precise role in the dissemination of these tools, and the role we're being asked to play is something like being propagandists for the tech companies: to make these tools seem cool, sexy, interesting, available to everyone, democratic. We're going to make these beautiful things at MoMA that look like screensavers. And I'm not the first to say this, but we're going to make these funny things where we say, "ChatGPT, write me a funny scene," and then somehow it's interesting just because ChatGPT wrote it, or the special voice these texts seem to have, something like a default style, is charming because it's kind of dumb, kind of banal. There are a lot of things we're doing with these tools that are thoughtless. There's a writer I recommend to people; I could give you a syllabus, but I'll just say Dan McQuillan, who wrote a book on resisting AI. He was the first one to make me think about this question of thoughtlessness: that using generative AI allows us to skip all the hard work and just get some results, and then we also don't have responsibility for those results, because they were made by the AI. That's been the case with algorithmic decision systems, which are used in many different social contexts; it's a problem that a lot of AI critics have been involved with for many years.
And I'm talking about the ways AI gets used to make determinations for various public services and government functions: incarceration and recidivism rates, some educational uses, public benefits. These tools are getting embedded in all kinds of places in our society, and when government agencies or private companies use them, they get to say, "Oh, well, we don't know, it's just what the thing said," because the statistics are so complicated, how could we possibly know why it's making that determination? Therefore they disavow responsibility for having made the decision. When you start looking at how these tools are trained and how they're deployed, you find all kinds of what the AI researcher Kate Crawford calls a parade of horrors in the training data. So the notion that they're value-neutral in some way because the information has been run through a high-level, complicated system ("you don't want to know, it's too complicated for our small brains"), the way these systems allow us to get results without having done any work for them, and without having any responsibility for the harms those results might cause, has led me to... I mean, this is just one corner of my problem with generative AI, but it's a pretty important corner. So in that piece I basically say: we know the harms are happening in terms of algorithmic bias, in terms of incarceration, in terms of the military (I'm not sure I mentioned the military, but that one too), in terms of all kinds of pretty important things being delegated to these tools without very much oversight. These are private companies; they don't have to show their work. There's an incredible lack of transparency about what goes into the tools, how they function, and even what confidence the companies have in the results they give.
So, given all that, in the piece I say: you might think, who cares about artists in this, because it's just fun. But I think what artists are doing is super important, because we are the ones making it seem cool and interesting and fun, making it seem like, "all you other people, don't be afraid." Somebody already said something, Frank did, about being comfortable. Yeah, we're making people very comfortable with these things, and I hope that we are on guard against getting too comfortable. That's my spiel.

I know you're dying to ask a question. I don't know anything, but could the whole problem also possibly lean less on how the technology is made, and more on the inhumanity of the institutions that it's revealing?

Yeah, all of that. And you know, the short story writer Ted Chiang has talked about these tools as being a kind of new form of McKinsey: the company that advises corporations on how to streamline, to make their operations more efficient, which usually means cutting the labor force, weakening unions, going for cheaper labor elsewhere, lots of arbitrage. The basic function so far that everyone can agree is an actual use case for these tools is to eliminate jobs and weaken bargaining power on the part of workers, even just by the threat of using these tools. So that's why the WGA strike was so exciting: they really stuck to their guns, they really pushed back, and they got an enormously, I don't know, encouraging contract, let's say. So we're not just at the mercy of these things; it's not all inevitable, and it's not that there's nothing we can do. So yes, I think you're right about the institutions, the corporate structures that are dealing with them. All the money these companies are making is not from users.
It's from enterprise contracts. The only way they're ever going to make money back for their venture capital investors is by selling contracts to embed these systems in other companies. Anyway, sorry, I'm totally derailing the whole thing.

No, no, it was a question from the audience. Is that at all relevant to your question?

Yeah. I mean, another one, and I can't remember who said this, it could have been David Gerard, I'm not sure, or David Golumbia, is that generative AI isn't actually going to eliminate all our jobs; it's going to eliminate some of them, and the rest it's going to make much more alienated.

Yeah. I mean, it's not a pretty thing, what's happening here. And anyway, I think it's interesting to think about the role that artists play in all that.

Is there a question you'd like to pose to someone else on the panel, to the rest of the panel?

So, I mean, yeah, we could talk about AI in the larger sphere for five days, but let's talk about theater, because that might be sort of what we're supposed to be doing. To me, one of the interesting things you did in your latest piece, which I thought was very effective, is, you know how with Halloween masks, if you're wearing half a mask it's much more terrifying than if you see the whole face or the whole mask? That moment, that gray area where there's still a person there, and there's this friction between you and the AI, and you get to see both in dialogue with each other: being able to stage that, I thought, was really great. Congratulations. Do you know what I mean? I think there's a lot going on there, and it's not like AI is totally winning at this point, at least in this little corner of the world. So yeah, would you like to push back already?

I was going to add, to the very real problems that are coming up, that it's interesting to think about...
This panel already seems like something that is pushing back on the thoughtless aspect, on people saying it's value-neutral. And perhaps it's generative, actually, because what's interesting about the work I know from the five of you, and about everything you were explicitly talking about making, is that you seem to be pointing out how artists reveal that this is a very human tool, that there is a human being behind it. Now, I'm absolutely terrified of the embedding of AI, completely unregulated, driven only by the interests of private corporations; that's dehumanizing, perhaps. However, the art made with AI, like what I just saw in your work, slowly, actually points out that this isn't without an author. Even when you say "this image was created by AI," I just don't see it that way: it was edited, it was prompted, whatever. So I understand the AI aspect is something, but in fact you artists seem to be pointing out to us that there is a face, a thinking, non-algorithmic being, behind it. Right? That's not really pushing back; I just wanted to add that I think your art seems to be doing that.

Yeah, that's what I mean by serving a little bit as propaganda. Right.

Well, but just not necessarily making us comfortable, that's all I mean. I don't see it that way. Even the MoMA stuff: well, that's just a bad piece of art, perhaps. It doesn't necessarily make me comfortable, or make me think AI is cool. That sounds like I'm pushing back on you, but I'm just saying it could be used that way, though I don't think you all are talking about using it that way. So, yeah.
I was very comfortable until I failed the Turing test, or won the Turing test, with your piece, when I thought, oh no, this is a human, these are two guys, they're brilliant. It's the second time for me that the Turing test has been failed. It happened in March at the Whitney, with the cashier. I bought a little pin, and there was this Nazi chatbot that some artists had resurrected; did anyone see this? It was something that happened a few years back: they had this chatbot, but they fed it the whole internet so it became fascist, and Microsoft had to cut it off. It had a female name, I don't remember. These artists had resurrected it. And then I talked to the cashier, and she was like, "Yeah, it's so crazy, I wouldn't even know if you were a robot." I'm uncomfortable. No, even though you're very optimistic, it's not like denial to me. I don't know.

I agree one hundred percent with everything she's saying. But like I said, and as someone who's going to be sixty years old, I'm trying to stand next to it, so you can see me as the artist, even if it's all going down in flames. That's my optimism: I'm going to be an artist to the end. You can watch me go down, and you can choose the AI; it's up to the audience. That's my optimism, that I'm going to stand up as the artist. You have to juxtapose it. You've got to put good next to evil.

Yeah, you have to put good next to evil.

I had a question, actually, for the whole panel. I think a lot of the points in your projects are really quite interesting, and I've learned a lot, not only about the systems but about the thought that surrounds working with these tools.
I wonder, especially now that AI is becoming something many artists are being asked to engage with at various levels of their process: could you put together, say, three questions that an artist should ask when thinking about AI, or when being asked to speak about it in some context, whether that's at the bar, or in a contract, or in a new pitch or proposal for a piece? That's for the whole panel.

Like ethical questions? Three ethical questions?

I actually don't think it's necessarily only ethical. I think it could also be aesthetic or practical or philosophical. Thank you.

Well, this isn't quite formulating a question, but one thing I'm noticing is that sometimes the AI is mysterious in its role in the piece. For example, in what we just watched, do we know what part of it is AI? Are we as an audience intended to know, and to be able to distinguish? That's a question that I have. In my example, there was no question who the AI was and who the real person was, and in your piece, there was no question who the human up there was and what the role of the AI was. So one question that comes to my mind is: will it be clear? What's the distinction? What's the audience's experience of the thing?

I mean, I think it's an excellent question, and I guess I would want to know what's what. And my law brain kind of flipped on: in terms of contracts, for anything we do as work-for-hire, or anything that goes onto the internet in any sense these days, you've got to ask whether the material is going to be licensed to third parties, whether you can opt out of that, and get that in writing. Licensing to third parties means it's going to be used as training data.
If you do any recording of your own voice, any biometric information, photographs of your face, anything that is biological information about you: that is a big red flag. And then I also have some copyright advice, but maybe I'll just start a legal clinic. We can talk after; I'm happy to look at your contracts, but I can't give legal advice, because I am not a lawyer yet. See where this is going.

Thank you for putting together this panel. I'm Saviana Stanescu, I'm a playwright, and I actually just completed two productions with AI characters that I wrote. While I did the research with generative AI, ChatGPT and all that, I wrote the characters' lines myself, so I kind of outsourced to the AI only the research, and then I created the character; the dramatic journey is everything. I think 2.0 was presented in the Ice Factory festival, and the other is a dance-theater piece that was at the Cherry Arts and is going to come to New York. But what I'd like to ask is this. In my experience, one piece is a dystopia: the AI rebels against its creator, the neuroscientist. The other one is a utopian version: the AI and the human get along very well. So I put both things out there. But aren't these AIs, in a way, just another medium in the conversation that is theater and the arts? Couldn't we, instead of resisting it or questioning it so much, just use it as a tool for us humans, the creators, and maybe treat it as just another medium in a multimedia performance? I personally don't see it as such a scary tool, although I wrote a dystopian version of it. The AI can be a character, my character that I write; it can be an actor, it can be a designer. We work with AIs as we work with any other human or any other entity, and maybe this is the future: that we learn to work with different entities that are not human. So why not?
So my question is: do you see any utopian version of this collaboration with AI? Thank you.

I guess we might know Annie's answer.

I'm totally talking way too much, and I really apologize, but I just want to say that one of the interesting, and maybe a little dystopian, things about the big AI companies and a lot of the big Silicon Valley companies is that they're run by people who actually do see this choice as dystopia versus utopia. I don't know if you've heard of TESCREAL, or effective altruism, longtermism; Sam Bankman-Fried is one, though he doesn't do AI, he does crypto, and that didn't work out very well for him, but he's an effective altruist, right? There's an enormous, weird ideological component to this, which I would recommend everyone Google, or really, DuckDuckGo, and check out. And I agree with you completely: the most likely scenario is neither utopia nor dystopia; it's that we have this new thing, which has been more or less imposed on us, we learn to live with it, but it makes everything kind of shittier. And we can learn to live with things being shittier; things have gotten shittier and we've been dealing with it. But we could also decide we don't want life to get shittier. In that sense it's not necessarily a question of dystopia, but a question of saying: no, that's a road we don't like, that doesn't lead to anything great for us. To me that's such a key moment. We're on the road; in Silicon Valley they talk about acceleration risk, which is like, we don't know what's going to happen, but we've got to get there first, China's coming right after us. The whole "move fast and break things" thing is there. Right.
So that is how these things come into being without our being able to regulate them, and then by the time, you know, we're like, what the fuck, Congress throws up its hands and says, well, it's all in the system and we're all using it all the time. So I feel like we're at that moment, and that artists do have a chance to draw a tiny line in the sand and say, in this moment, pre-regulation, just point to it, because in another couple of months, a couple of years, you won't be able to point to it. So that part of it, to me, is where the dystopian part wins, because it's infiltrating everything; it's going to be like electricity or the steam engine or anything like that, except evil. Exactly.

What do you think?

No, no, go ahead.

The thing that comes to mind for me is the theater of it: that the utopia is somewhere in the collaboration that theater requires, everybody showing up, even if you've made it in a dark room by yourself, everyone showing up to witness it together, people continuing to come together. That, to me, is meaningful. And also the idea that there are collaborators here who worked on those projects with the robots, and our bond from that collaboration is the thing that is lasting. It's not actually the collaboration with the robot that was meaningful; the robot was also expensive, and needed its own plane seat, and, like, what are we doing, what are we paying for? But I wouldn't have done the second piece with Bina if we didn't get along so well with Bruce, and Bruce is the human person in charge of that robot. So there's something relational that theater requires, and in that I see some light, some version of utopia, as these tools go.
So against the isolation part of it, the things that push us toward isolation, we're the part that has to persevere and bring people together, you know.

Is there a question that you'd like to pose?

Yeah. I'm curious, because in my situation I was dealing with a thing that was already a character, and we were sort of trying to put a fence around it and ask: what is this thing capable of, how is it theatrical? So I'm curious, particularly from the folks creating characters using AI, as opposed to inheriting a character that I had to deal with and figure out a context for: how is AI used to create original characters?

I mean, I'll say more. Please ask things, but I would say that anything ChatGPT created in Prometheus is totally uninteresting to me. It was there as an example. And prompting, I think, is not interesting. I don't think it's an art form; I don't think it's even really a skill. I think they're trying to convince us it is, partly because they're challenging the Copyright Office's guidance on outputs, and I'm serious. The Copyright Office has issued guidance that any output of a generative AI is public domain; it can't be copyrighted, because it wasn't made by a person. Tech companies, obviously, would rather that not be the case, because they want to be able to sell their products to movie studios and have the studios make stuff with AI that they own. So one of the things they've been doing in terms of their marketing and their hype is really pumping up the discussion of prompting as an art form, something you can get better at, a skill, and we're going to have new jobs called prompt engineers. And yeah, this is already happening.
Yeah, right, so the new thing is, instead of a bunch of artists working in a studio drawing the backgrounds, you're going to have two or three people prompting the AI to do the backgrounds. So that's partly what the labor issues are about. The other part of it is convincing us that prompting is part of your creative process, and I want to push back on that. I've played with these things for six years, I think, since they were first being developed, and I still can't really tell you what happens with, say, Midjourney if I use this word versus that word, if I change this phrasing in my prompt or use some other phrasing. If you want a picture of a banana, it's going to give you a banana. Is that art? I don't know, even if you nail it to the wall. You know, I consider it like playing the slots: you put a little nickel in, that's your prompt, and you see what you get. If you don't like the output, you change your word, you pull the lever again, and you see what you get now. But the relationship between your prompt and your output is completely obscure. And, you know, I have more to say about this, obviously; I'm sorry, I think about this basically seven or eight hours a day.

I think that's why prompt engineering is becoming a thing: because they're going to push back on uncopyrightability.

Yeah, and this is an open question; it hasn't been decided yet. If you write your own model and train it yourself, and you've written the code, you own the copyright in the code, for sure. The training data is a whole other copyright issue, which is a real quagmire. So this has been the hypothetical: you use only your own work to train a model that you have written yourself, so you own the software and you own the training data.
And the output? We don't know. The Copyright Office hasn't said, and there hasn't been a court case to decide it. So that's what they call an unresolved issue. But that's where the tech companies are going to start putting pressure on those open questions, to try to get favorable answers and start eating away at the guidance that none of the outputs are copyrightable. And this is how you shift the law: you do test cases where you push on all the little weak spots and all the little hypotheticals, until you've made the original guidance look like Swiss cheese. It's unclear how much human intervention post-generation is required before it becomes something that can be copyrighted. You know, what if you take an output from DALL-E and you completely change it in all kinds of ways? We don't know yet. So these are all live issues, but that's my piece on prompt engineering; that's what I wanted to say.

Getting back to Andrew's question about original characters versus inherited ones. Did you want to dodge it? I don't know, do you have anything to say as the character?

Well, it's funny, because it's kind of reverse engineering in a way, because I'm the human, but I'm saying the AI's text, right? And at this stage of the game, anyway, there's some kind of neural net, a crazy neural net that I'm wearing on my head, which is hilarious. But the idea is that it's trying to pump this stuff into my brain, and then I'm spitting it out, and I have a video mask, I guess. So as the actor, as the human, I'm like, I don't want to say this stuff, why am I saying this stuff, you know? So I'm thinking a lot about what my part in it is, how I resist this thing.
And, you know, Annie, I saw your show, and it will generate stuff, right? It just makes words: what's the next word, what's the next word, what's the next word. But they're not that interesting. That's the thing. It's dull. It's the idea, but with zero humanity behind it, I guess. So the meaning is incidental. What? The meaning is incidental. Yeah. Well, it doesn't understand meaning. It's not understanding anything. So.

But here's a great example of character: there's this little delicious moment when you see Mo not saying what it's telling her to say, and she breaks out for a second, you know, out of the Halloween mask. I think staging that is very delightful, because you keep looking for the human to peep out for a second. And it's that duet, or the tension there, that I think is worth staging as a character.

Well, maybe my question follows on very nicely from that, because we've been thinking about the use of AI for people making theater and doing theater, but what about for the spectators? I'm wondering what the consequences of AI are for spectatorship. At the moment, I'm at a point where once I've realized something is AI-generated, my brain switches off. That's it. It just clocks it. Next. I don't have the next set, the next level of questions, feelings, ideas that come from that. It's purely registration, and then I go on. So I'm wondering how each of you is thinking about spectatorship, and what's good, because there's the borderline moment, which you're describing very nicely, going in and out, but there are also moments when you can't tell at all.
And then it's just, you know, the deflation, or the explanation for you afterwards, read retroactively: this is why this was quite dull, and now I know. Next. Yeah, I'm wittering now.

Well, I think what we're doing in this show is we're also trying to deploy a lot of swarming technology, so that the audience can essentially affect which way the performance is going. Hopefully they can steer it in and out of being Mo, you know; the idea is for them to also have some agency in it, which is complex with a big audience, who knows, the jury's out, but that's one idea that at least doesn't put you in that passive position. Because I think that's ethically one of the issues: when you have no idea and you have no tools to know, then why should you care?

I think about this: the second piece we made was called An Evening with Bina48, and it was Lynn at a table at the Frying Pan, like in a restaurant, and people were watching her have a blind date with this robot. And Lynn was the most interesting part. You quickly see what this thing is, and sometimes you're like, what did they just say? It's so nonsensical that your human sense-making brain goes, oh, that's interesting, I guess. But watching Lynn deal with it, having to improvise and keep it going, was the hook. So that's, I guess, why the question came to my mind: can we tell if it is AI or not? Because to me, if you can tell what is and is not, you can hook into the thing that is not and still find a lot of value in the performance.
I just want to bring up real quick what this reminds me of, in a strange way, and I could be completely wrong, or it might just be relative to my experience. I'm a musician. I put years of my life into learning how to play instruments, putting bands together, going up on stage and playing, going into recording studios to multitrack and mix it all together. Ray Kurzweil, if anyone knows who he is, who is leading research at Google now and wrote a book called The Age of Spiritual Machines, created a thing called a sampler, and overnight the music industry flipped upside down. All these great records that I loved and listened to, funk, George Clinton, James Brown, all this great music, instantly got recycled. All these musicians, artists, producers, all the art forms it takes to make a George Clinton record, could now be had with one button on a keyboard. You just take a few seconds of it, and now you go make hit records, and they're left in the dust. So that happened, but no one's sitting there cursing hip hop, you know what I mean? Hip hop happened. But I get a little bit of the same feeling from what I'm hearing. It's more of a feeling thing, not so intellectualized, but as the conversation keeps going on, that's kind of what it seems like.

Well, the thing I keep wanting to say, and Naomi Klein wrote a brilliant piece about this, is that essentially what's happening is that Google and Microsoft are walling off that information to make it proprietary and to sell it back to you. But the information is scraped from things that you made, things that we all wrote and made, right? Faces and tweets, et cetera. So that's the part that sticks so badly: it's being proprietized. That's not a unique point, but it's part of enclosure. You know what I mean.
There was a moment, the one you're describing, when sampling started to become popular and people really weren't sure what it would mean for the original musicians. But these days, I dare you to try using a sample and not crediting or paying the person who made it, right?

Well, but it shifted underneath you. What they did next is they created DAWs, digital audio workstations, that have all the samples built in, and that doesn't go back to the musician. They're all in there: buy a DAW, set it up, use it, and it's all in there. That's the difference I'm trying to point to.

I mean, yes, there was a moment of uncertainty. Yeah, and it resolved itself mostly to the benefit of the big record companies and big labels. So, you know, it's true that we love sampling, that we don't have a problem with sampling, but one of the things that has happened over many years is that more and more power has accrued to these giant corporations, and less and less power has been held by the artists and the creators. So I think maybe your story actually has the opposite moral from what you intended: yes, we're in a similar situation now, where we're not sure how this technology is going to be incorporated into our lives, what it's going to mean for all of us, what it means for artists, what it means for the culture. But if history is a guide, we probably should not be fully embracing it without doing some careful thinking about how we want to use it, if at all. That's all I wanted to say: there's another way of thinking about that.

And also, you might have read me wrong. No, I think it is shittier; I actually think it was better before. Like I said, I'm an analog advocate. And I think now they'll just find another way to flip it. Once they did legalize it, George Clinton finally got all his money when he was old, and then it was over, and he basically retired.
But then they figured out another way to do it. For one, Spotify removed the power of the musician anyway, which in a way is kind of an AI. And obviously the digital audio workstations are full of AI. If you go into music programs, things running Max, and obviously GarageBand, AI has been running through all of that for years now. You can literally prompt it. That's how I made a lot of the music that you hear in there. I know music, I know composition, so I know how to speak the prompt language for music; I have to talk to arrangers, I have to talk to musicians, I have to talk to producers. So there's a musical language there which I'm able to utilize, and the AI has been taught that music, that language, so it pulls it back and does it, and I can take anything that's out there. They already have that built into the digital audio workstation. So there are people now doing something worse than sampling, because with sampling at least you heard that cool music and thought, that's James Brown, let me go check out James Brown. But now, with the digital audio workstation, there are hit records being made where you don't even realize they're not making it at all; the software is doing it 100%.

Well, I mean, sorry, sorry, I apologize to everybody, really. The one class of generative AI products that is trained only on public domain work is the music generators. And that's entirely because the music industry is incredibly litigious and super, super powerful. So the authors get nothing, the painters get nothing, the digital artists get nothing, none of us get anything. But the record companies have made sure that OpenAI's music-generating products, all these different music-generating products, are trained only on work that is public domain or licensed to the gills. And it's amazing: OpenAI will say the reason we do that is because we have copyright concerns, in the same breath that they defend as fair use what they're doing to visual artists and what they're doing to authors.
I think the point that I also hear Sully making is that even if the music generators, or even if the DAWs, are working with stuff that's free and clear in terms of copyright, you're still replacing the drummer, you're still replacing the chorus, you're still replacing the keyboard player. What would normally take a full band can maybe now be done by one person. And I think the other point I heard in the mix, which I think you both agree on, is that the artist is getting the short end of the stick at the end of the day. But the question I also heard Sully asking was: do audiences care?

That's what I was asking. Oh, yes, do audiences care? We as artists care, but, you know, again, that's why I say it's a totally different pool. And I agree with Claire, big surprise, shocker everyone, that when I find out something is made with generative AI, I'm out. But that's partly because you're not making anything cool with it; it's doing the thing, it's making all the important decisions itself. The results are shitty. So that's not the case with a lot of hip hop.

Well, I am getting the signal from Anne of Prelude that our time has ended. I'm sure there are opportunities for people to talk to each other on the way out of the room, if you have questions you might want to ask a panelist that you didn't get to talk to. Or not, people may be running. But I think we have to continue in the lobby and turn over the room. Thank you.