and I use she/her pronouns. It's my pleasure to welcome you to the Level Up Symposium, presented by the Associated Designers of Canada with support from Toaster Lab's Mixed Reality Performance Atelier. I'm a member of the board of directors of the ADC and I'm really excited to be your host for today's panel discussion, Beyond Reality. To begin our session today, I would like to acknowledge that I am currently located on Treaty Six territory, the traditional land of First Nations and Métis people. Edmonton, as it is known colonially, is and has been home to a diverse range of Indigenous nations and peoples, including the Cree, Blackfoot, Métis, Nakota Sioux, Iroquois, Dene, Ojibwe, Saulteaux, Anishinaabe, Tsuut'ina, Inuit, and many others. Since time immemorial, this land has been a meeting place for this diverse range of Indigenous peoples who enrich this place with their histories, languages, and cultures. As a settler, I have benefited from Indigenous generosity, hospitality, and knowledge, and for that I wish to express my sincere gratitude. In this spirit of gratitude, I would also like to acknowledge the support of the Canada Council for the Arts, our primary funder for this symposium as a whole, as well as our dedicated member volunteers and volunteers on the board of the ADC who have made this symposium possible. Thank you so much. We're equally grateful to these additional sponsors: IACI, University of British Columbia, Theatre Alberta, CITT Alberta chapter, Concordia University, Ryerson University, and York University. Thank you. For your information, all symposium events will be recorded and presented in a freely available archive. Check back a few days after any event you've missed to see the recording at levelup.designers.ca. Thank you for joining us today.
You are watching this Level Up livestream either on the Level Up website, levelup.designers.ca, or thanks to our partners at Toaster Lab on HowlRound at howlround.com, or on the respective Facebook pages for the ADC or Toaster Lab. Regardless of your viewing platform, embedded on the same page as this video is the chat function, in the top right-hand corner of your screen. Questions can be asked in the chat at any time and will be read out to the presenter during the Q&A portion of this panel. If you have any technical difficulties at any point in the session, please send an email to levelup@designers.ca for immediate support. This event can be enjoyed through auditory or visual access or a combination of both. I will read aloud all questions we address from the chat, and this information will also appear visually at the bottom of your screen. Visual access is also supported with captioning for many of our speakers. Captioning will appear directly below the active speaker. If you require technical assistance to support your access, please email levelup@designers.ca for immediate support, or to provide feedback, which is greatly appreciated following any of our events. If you enjoy this session, please consider donating any amount to the Associated Designers of Canada to support our National Arts Service Organization in achieving its goals in the areas of advocacy, mentorship and industry promotion. Donation links are available on screen on all viewing platforms and I hope you'll consider donating. Thank you so much for your patience with our announcements. Today's panel discussion, Beyond Reality, features the following artists: Ian Garrett, Espii Proctor, Beth Kates and Paul Cegys. With that, I'm gonna hand it over to Ian Garrett. He'll be today's panel moderator. Welcome Ian, and thank you so much for leading us through this conversation today.

Thanks Erin. I think that my connection might not be the best right now, which keeps happening to me.
I did a presentation yesterday where it slowed up, so I might change that in a moment. Can you give me a thumbs up if you're hearing me okay? Or getting my video okay? Okay. It looks like my video might be jammed up a little bit; I look very contemplative in my setting there. It doesn't look like my captions are necessarily coming across, but hopefully it will resolve itself. As we get started, I'll give a little bit of context for where we're coming from, and then come back, hopefully with better video, for the conversation. I'll get it sorted; it was working just before this. So Toaster Lab is a small mixed reality performance production company. We're based in Toronto, where we're seated on the land of the Haudenosaunee, the Wendat, the Métis and the Anishinaabe. And we've been working through this two-year deep dive, the Toaster Lab Mixed Reality Performance Atelier, where we've been looking at performance practices that integrate mixed reality with live performance. This started before we got into our current pandemic situation; a lot of it has become much more relevant in that case. And the artists that we're talking to today have all started to approach this topic of where we sit at this intersection of a scenographic, theatrical design practice and the integration of mixed reality, each in a couple of different ways. Everybody here is also part of the advisory board for that project, for that atelier. And through that, we've been boosting up a number of different projects. We're looking at having done 18 to a couple of dozen by the time we get to the end of it in June, and have been having regular convenings. And it just so happened that, along with my being a member of the ADC board with Erin, we were having a convening that was going to be scheduled about the same time, through our series of six convenings that we're having through that project.
So we're really grateful to be here and really happy to be supporting Level Up and all of the different work on the back end. It looks like my connection might be starting to catch up; maybe it'll catch up by the time that I'm presenting as well. So I want to introduce a number of illustrious people that I've had the great pleasure of talking to quite a bit about these topics, and working with over the last couple of years as well through various networks. We're gonna do some short presentations before we get into a conversation, which we hope that people will participate in through the chat, and which we will start with some leading questions as well. There we go. There it is. It caught up eventually. It's important to understand what exactly it is we're even talking about. I find that this idea of mixed reality performance really puts us in a position where it could be any number of things. And I think that one of the reasons it's important to see what each of us is working on is actually to see that we're coming at it from slightly different ways — where there's some idea of how we work from one project to the next, but everybody is bringing in different skills and different technologies, working together with different aspects of performance and supporting different assets of a performance. So it's really a convergent field, with many different things coming together, and then a divergent field in the number of opportunities. It's not like lighting design or sound design, which a lot of us have had some experience with, where we're dealing with a preset notion of how we're trying to express dramaturgical information through design. We still haven't got there yet. All right, so here's the order that we're gonna go in — I'll introduce our panelists up front, in the order in which we've agreed ahead of time. First, we're going to hear from Beth Kates.
Beth is a self-taught and award-winning lighting — hi Beth, nice to see you. There you are. So I'll introduce everybody and then we'll come up. So Beth is a self-taught, award-winning lighting, projection, set and costume designer, and she's been working in rock and roll since she was 14. She's the creative director of Playground Studios, which has done a number of interactive installations, and she recently completed her master's at the University of Calgary in drama, and has been working on performance and performance creation at the intersection with digital technologies: virtual reality and augmented reality. And that has spanned from an in-person performance called Bury the Wren to fully immersive VR social experiences that I think we'll hear about today, and the development of projects that are integrating some of those technologies into what we expect out of a stage performance. And so there's a whole hybrid range of ways in which these VR technologies have been integrated into Beth's work, with different levels of engagement. We're also going to hear from Sadah Espii Proctor, who I'll be referring to as Espii most of the time. Espii's background is as a sound designer, new media artist and dramaturg; she's based in Brooklyn, and she works across live, digital and virtual forms of storytelling through multimedia theater, audio dramas, virtual and augmented reality and immersive experiences. She and Beth were collaborators on a fully immersive VR production, but she's also a really well-regarded VR filmmaker, and has been working from a filmic perspective with VR and different types of immersive technologies.
And then we're gonna be hearing from another practitioner who crosses over into a lot of the filmed VR realm as well, Paul Cegys, whose work merges multiple practices of performance creation and design, from theater and opera to site-specific installation and intermedial mixed reality scenographies, which you've been completing, or have completed, a PhD on — you'll update me, I can't remember where you are within doing it. We also have a common background in sustainability; he has an MSc in Sustainability and Environmental Studies as well. He's the co-creator of Blue Hour VR, which I know we're gonna see a little bit of today, a site-responsive mixed reality performance that premiered in 2019 at the Prague Quadrennial. So we'll hear from each of you today. Beth, I'm gonna turn it over to you to start things off and get us rolling.

Absolutely, thank you. We were all remarking as we arrived into the web browser how nice it was to see everyone, and I think we're all really missing seeing each other. So I'm dialing in across the airwaves from Mohkinstsis, from Calgary, and I have a beautiful view of a beautiful sunny day and I'm really, really grateful to be here. I'm also a member of the Associated Designers of Canada and heavily involved with a whole bunch of different committees, and I'm so thrilled that this is happening. I think this kind of knowledge exchange is really critical. So I'm going to talk a bit about the work that I've been doing most recently. I came to the University of Calgary, which is where I did my MFA in Drama and Computer Science, to look at the intersection of VR and AR and live performance. And I started this almost four years ago — so before the pandemic, before we entered into this hyper-digital age that we now sit in. And it's elicited a whole bunch of really interesting connections across disciplines, which I think is something that all of us can speak to in this panel.
One of the pieces that is critical for this kind of work is reaching across the disciplines and outside the silos, and really looking at ways that we restructure, redo or dismantle entirely the hierarchy of how we've traditionally made theater, which is something that sits in a really interesting place in the digital realm. I also — and I can't remember because things went by too quickly — I'm also a video projection designer. So all of this digital work comes from that, and I've been doing digital projection since we were using video on stage long before video had the capability to be easily brought onto the stage. So all of this contemporary work comes from all of that work over the last 30 years. I am going to do a little walkthrough, because pictures are better than words sometimes. I'm gonna share my screen and share a couple pieces of work with you. So you've now all disappeared for me — just so that the StreamYard folks know. So this is me. This is a traditional piece of extended reality work. This is Trina Davies' Silence. It was directed by Peter Hinton, Michael Gianfrancesco did the set design, and it was, for the performers, a fully immersive experience. Among all the other things in this, here's something I find really important — and we are gonna try to have some time to talk about real life — I am also a mom. This is me nursing while focusing lights and sitting at the production table with my son Aaron, who's now eight and a half years old and is with me all the time in theater. And I present that just because I've had conversations with a lot of very young artists and people who are starting in this challenging industry, just to say: you can have kids, it's possible and it's awesome. And then you can teach them how to do stuff, and they can help you edit video and things like that. And that's really great. So, digital extended reality work.
I'm gonna touch on three of the most recent projects. This was a piece that was done for Randolph College in Toronto. Alistair Newton had adapted a very old play; then the pandemic hit, and he decided that he would re-adapt it for Zoom, because he was never going to be in the physical space with the performers. Randolph set all the performers up with the right kind of equipment, and we then basically did live-off-the-floor — not live-to-air, but live-off-the-floor — recordings that were then either augmented in the moment or augmented in post, to really flesh out what story we were telling and to really try to break the rectangle of Zoom. And we stuck with just Zoom, we didn't go to OBS; we needed to stay simple, because this was early on — this was Randolph's first show of the season. And so what was beautiful about this was that integrating Zoom into the dramaturgy of the performance helped us to bridge the gap of those rectangles, and to give it a high aesthetic — to allow the aesthetic and the extended reality parts of the design to leap through the screen as best as we could, given some very, very short time frames and a lot of great support. So that was what we did, and it was great, because Alistair and I had worked on that as he was adapting it. So we were able to have really good, thorough discussions about what was possible, what would have impact, how we could work with the costume designer and so on, and then how we could work with the actors too — what did they need from their backgrounds — and it was a really fruitful collaboration. And it was very, very colorful, which was thrilling. So the piece that Ian mentioned in my little introduction there is this piece called Bury the Wren.
So this was the central project for my thesis, and it was an active way to explore how we could bring together virtual reality, augmented reality, live performance, and what I call carbon reality — so, the stuff made out of carbon. And I created this with Neil Christensen. We worked in a devised method with the technology, which meant that the actors, Val Ponch and Val Campbell, who helped to devise the script and the storytelling, were working with the technology as we developed the show. It was a one-to-one performance, and as the audience member, you were within another reality that we crafted, while the performer was with you in body only, in the physical space. She never appeared as an avatar. She did have control over the story that she was telling. We used a process called photogrammetry to capture real-world objects — the real-world objects that we had actually used as the devising prompts — and that was how she told the story. So it was object-driven, design-driven, performance-driven. We were playing with the nature of reality, with the nature of liveness and presence, and really trying to experiment with how we could tell it. Our ultimate goal was not to make a big flashy super-high-tech show; our ultimate goal was to make an affective journey, an emotional journey through the story. And the story we were telling — some of you in the audience might know about the Donnellys, which is an old piece of history from Ontario, of a family who was murdered. And most of the women have been erased from the retellings of that story. This was an effort to also reclaim that feminine voice. And then we used the AR, the augmented reality capacity, of the headset that we chose, which was a Vive Pro, to introduce the human performer to the audience member, mediated through the headset, until the final moment of engagement, where the headset was removed by her.
You were given an apple to eat, and you had this final moment of pure human-to-human exchange once she was ready to reveal herself to you. And so all of that work was really pushing the boundaries of what was possible with the technology at that time; we had things develop as we went along. This is one of our dramaturgs. We invented ways of trying to dramaturg from within a headset, since you couldn't write anything down. We had to rehearse in computer science rooms, because we needed the access to the technology, which was key to our process. So there was a lot of learning and a lot of exploration that happened. From all of that VR work, one of the things that I have been doing in the last year in particular started with Laval Virtual. So this is me as an avatar — because I once colored myself purple and got in trouble for it, so now I can be purple without getting in trouble. Starting with Laval Virtual, which was a conference that was very quickly shifted into virtual reality, with other members of a working group, which Paul is part of, I led these performance experiments and had people engage with their avatars and create miniature performances using all the limitations that we had within this particular platform — a platform that was geared specifically towards conferences. So you could only dance if you hit F1, and it would make you do Gangnam Style. And then there were different ways you could move your avatar's body, but it wasn't intuitive, and it was all through the keyboard. But the groups had about 20 minutes, and what they ended up creating were works of physical performance in this virtual space. And we ended up with what Kent Bye called a happening. We had a happening in VR, unintentionally, that was performative and performance-driven, and it was a great deal of fun. And it has led to — there's, yeah, you can see how the avatars move there. It was all very awkward and kind of fantastic.
This has led to other experiments, starting at PXR, which is a performance and XR symposium. And this is an embodied exercise: you actually are wearing a headset, you are these avatars, and you're physically able to pick up and move objects. My goal with these was to experiment with how and what we could create in VR, in real time, with each other — so using collaborative devising methods and training people to be what I'm calling scenographer-performers in VR. They're often called terraformers, but these are people able to actually create space within the virtual space and then to create performance. Two of these experiments have happened now. I now have a collection of all of these different scenographies in this world, and each time we've managed, in like 30 minutes, to create full scenographies and full performances. It's really exciting, and the capabilities of VR are improving all the time to allow for this. And then, in a much more focused way, this is the piece that Espii and I collaborated on, created by Kiira Benzing of Double Eye Studios, along with all of these other amazing, incredibly talented, visionary people — a piece called Finding Pandora X. This is me lighting in VR. This is a piece of VR theater: live performers in VR, you share the space with them in real time, and they tell you a story, just like when we used to be able to go to the theater. In this case, we get to be in multiple worlds and we get to experience multiple storylines — so there's a threaded narrative. My work on this was both as world lighting designer for Mount Olympus and as live lighting designer. We created a system with the developers to allow us to control lights in real time in the virtual space — there's lots of questions to answer about that — but it was a pretty extraordinary team effort, and a really interesting exploration into what exists right now and what really doesn't.
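The kind of real-time bridge Beth describes — console-style channel values driving the parameters of a light inside a virtual world — might be sketched roughly like this. Every name here is hypothetical; this is a minimal illustration of the general technique, not the production's actual system:

```python
# Hypothetical sketch, NOT the Finding Pandora X code: a DMX-style
# lighting console sends channel values 0-255; the game-engine side
# expects normalized light parameters.

from dataclasses import dataclass

@dataclass
class VirtualLight:
    intensity: float = 0.0  # normalized engine units, 0.0-1.0
    r: float = 1.0
    g: float = 1.0
    b: float = 1.0

def apply_channels(light: VirtualLight, channels) -> VirtualLight:
    """Map four channels (dimmer, red, green, blue) onto a virtual light."""
    # Clamp each channel to 0-255 and normalize to 0.0-1.0.
    dimmer, r, g, b = (max(0, min(255, c)) / 255.0 for c in channels)
    light.intensity = dimmer
    light.r, light.g, light.b = r, g, b
    return light

# e.g. dimmer at half, full red, no green, some blue
light = apply_channels(VirtualLight(), [128, 255, 0, 64])
```

In a real setup the mapped values would then be pushed to the engine every frame, over a network protocol such as OSC; the mapping step itself stays about this simple.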
We did a lot of hacking of systems, building on top of other systems to make it do what we wanted it to do, and with great complexity. And in the live lighting system, I was able to light with three lights at the same time — so it also felt like being back in an old studio where I could only plug in three lights at once. And ultimately the piece went on to win a prestigious award at the Venice Biennale. I've also since been the stage manager on it, we've done some private presentations, and this group of people is continuing to explore where we can go and what we need to go forward. So there you go — I will pass the baton back to Ian.

Let's continue on that trajectory, because we're talking about Pandora X, and quickly move us over to Espii, who may be able to provide us another view on that, as well as her other work, especially the film work. Take it away, Espii.

Thank you. Glad to be presenting in front of you all today. I'm going to be talking about previous projects that I've done in cinematic VR as well as theatrical projects, and then what I'm currently working on in terms of more hybrid productions. So actually, Ian, if we can just give it a moment, because my browser is actually not allowing me to share my screen, I believe.

Okay — should we just hop to Paul for a second and come back to you? Yeah, let's do that. Easily done. Paul.

Hello. Hi. Hi. There's something that came out of Beth's presentation that this seems like a good opportunity to mention: with all this new technology, we're devising a lot. I have a background in lighting, and I know you do too, and I remember getting trained — there are various conventions that you just accept, like, okay, 45 degrees, and at some point you interrogate it. But when we start to deal with these new technologies, it's like: oh, I don't know, let's just try something.
It all requires that level of experimentation to figure out how to put all these things together. That being said, you've done a lot of that. I'm gonna turn it over to you.

Yeah, that's nice of you. It makes me think a lot about what the transferable skills are that we have from one industry to another as a designer — certainly we're both also teachers in a lot of the work we do. So, thinking a little bit about how you can hop to different technologies with the same skills: do the same design principles apply? That's a question that I'm thinking a lot about, especially when I'm lighting in VR. I love Beth's limitation of three lights — it's so true. And it asks us to go back to basics once again, or to think a little bit about what the principles at play are. So yeah, maybe we can just start my presentation, my screen share. Thanks, yeah. I was thinking a lot about this in preparation for the panel today; I was taking a journey back through my experience, thinking a little bit about mixed reality, about beyond reality, these different words. And I was certainly thinking through my journey through technology for most of it, thinking about technologies and how they give us ways of engaging with the world around us. This is part of the new age of engaging scenographically, or with design, and certainly how we think about technology is something that's really fundamental. I've always been interested in technology as I've journeyed through. I think back to my work as a lighting designer and as a projection designer, and I've continuously been reliant on technology — of course, usually someone else's technology, but technology nonetheless — as a means of artistic expression. So this reliance on technology is not new. What is new is the accessibility of virtual technology, and that's what I think is bringing us all together today.
And this development towards greater accessibility changes the way in which designers can engage artistically and autonomously, outside the system that we have traditionally been dependent on for so long. And so my brief talk today is really about accessible technologies. This is especially pertinent today, when we have lost our sense of personal agency in many ways, and we've lost a sense of what might bring us a tremendous amount of joy, like creating beautiful theater. For me, designing has always been about two things: the exploration of my own artistic potential on one hand, and certainly leveraging design for social justice and environmental issues on the other. When I go back to old projects, I remember playing with live feed on stage and how it radically changed the performance landscape for all of us — how it gave us new proximities to performance. Performance spaces that could usually only be witnessed at a distance became completely augmented; all of a sudden, we could come in very close to the performance subject. In 2013, I began working on an intermedial opera that brought quantum physicists, musicians, actors, and visual designers together to create an intermedial performance that explored human relationships with technology and the increasing domination of algorithms in our daily lives. This project was really important to me on many different levels. It foregrounded design, certainly — it was almost a completely design-led process — and it explored the idea of being in multiple realities simultaneously. Since then, I've met a couple of different technologies along the way that have furthered this preoccupation I have with multiple realities: 360-degree video, which was groundbreaking, and virtual reality headsets such as the Oculus Rift. This opened up compelling new experiences of space between the real and the virtual, but it also fundamentally changed the means of producing.
I was able to have a greater autonomy as a designer and as a researcher. In fact, I now consider 360-degree video my primary medium of choice, for its almost limitless potential. And at the last Venice Biennale, I presented a 360-degree work inspired by the Spheres trilogy by the philosopher Peter Sloterdijk. Anyone who came to this Finnish Research Pavilion at the Venice Biennale had the experience of being immersed in virtual foam. So that was a new experience spatially, as well as tactilely and sensorially. In this case of the experiment and the presentation, the use of head-mounted display technology to watch that 360-degree video offered up a proximal experience that was only possible with the technology, and placed the experiencer of the performance in a performative space. Just before the Foamings project, I was working with a bunch of colleagues of mine at the University of Waterloo on a digital set design — a digital scenography — for an exclusively virtual reality project called The Home, which was trying to leverage virtual reality as a way to present the oral histories and reconciliation efforts of former residents of the Nova Scotia Home for Colored Children, which was just outside of Halifax, in Dartmouth. This project is now going to be deployed into classrooms all across the province as a way to teach students about the histories of systemic racism in the province, and as a way to have some reconciliation from that. It teaches about the past harms as well as the present harms, and how we can move forward from that. In The Home, the students are immersed in a virtual scenography — a scenography of spaces where they navigate and encounter the oral histories of the survivors — which blends computer-generated 3D environments with 360 video shot on location, and then in studio with maquettes, or dioramas, of different spaces in The Home.
This project really made me think a lot about how these different spaces, the real and the virtual, blend together, and how these digital spaces have their own agency, if you will, in retelling the stories. It seems only natural that the project I co-created at the last Prague Quadrennial, called Blue Hour VR, combined all of these technologies together, and uniquely introduced real-time computer graphics, or real-time CG, and tactile environments into the design mix. For me, this project was a completely design-led experimentation: experimenting with form, with aesthetics, with the body in space as a sensorium, individually, and exploring our relationship, fundamentally, to the Anthropocene and the challenges of a changing environment. It was a way to radically reconceive the relationship between the spectator and the performer, placing them at the center of the performance event. So something that I'm focused on right now is exactly this kind of exploration of in-betweenness. This is what I'm thinking a lot about in my doctoral work at Aalto University in Helsinki: how mixed reality performance design can navigate us to new ways of being and experiencing the world, and the spaces that exist in the margins, or in between, real and virtual places. So in conclusion, I can only say that it is, within a very depressing time, one of the most exciting times for designers to be exploring new technologies and to be investigating a relationship with these mediums — they're accessible, they're autonomous — and to really extend our perception of what design and performance can offer, and change the way in which we think and know the world that we live in. Thanks very much. That's what I've got on offer.

Thank you so much, Paul. You're welcome. It's always great to catch up.
You know, one of the takeaways that someone could have from this is that if you start doing VR work, you're going to end up in the Venice Biennale — because I didn't realize that Beth had collaborated on Pandora X, and through Daniela Bartolini, Toaster Lab had contributed to a piece that was there in the theater Biennale. So there's an interesting exploration there. And it's also interesting because, with the current pandemic situation, they've changed their programming for the VR section, the subdivision of the film festival: they're no longer going to be holding it in person, and are looking at hybrid ways of distribution as a way of adapting the festival moving forward. So there's something interesting in how, as these festivals start to present this way, it unmoors them from a specific festival location — which is striking, considering it's Venice.

Yeah, I was thinking about the liminality of the place, that it's continually transitory, and of course, as we all know, with sea level rise there's an enormous change about to take place in kind of the seat of, at least, Western artistic culture. So yeah, I don't know — Venice, it's a strange place.

It's interesting — we had a piece in the Future of Storytelling Festival in 2017, which was one of our first large pieces as Toaster Lab, and they didn't know how to tell people about it. It was site-specific and geolocated, so you had to go find the immersive media, and they just couldn't figure out how to get the docents to direct people towards it, because it's this ephemeral thing you sort of have to do, and then you can experience it. But there's not a thing to go seek out that brings us to it. Yeah, that's very interesting, yeah. No.
I'm going to bring us back to Espii, who's ready to go again. Welcome back, Espii, I'll turn it over to you. All right, smooth segues, who needs them? I'm really excited to have you presenting. So I'm going to turn it over to you. Thank you. So, as I mentioned before, I'm going to be going through the different foundations that have brought me into working in virtual and mixed reality, recent projects, as well as current explorations that I'm doing. As was mentioned before, I am a sound designer and also a media designer, so I design visuals for live performance and the in-between of sound and visuals, and currently I have been working in virtual reality. In terms of virtual reality, I've been working in both cinematic and theatrical VR, in a range from documentary to regular film to theater plays, and coming up this year, moving more into music, as some of my original visual work space. My early foundations were as a performer: I did devised theater, devised performance and physical theater, singing, mask work. And in my time in undergrad at Virginia Tech, I was part of the Linux Laptop Orchestra. We used very small Linux laptops and hemispherical speakers, as well as Wiimotes with Nunchuks attached to them, to be able to trigger sound and create gesture with that sound, to shape that sound in a physical form so the audience could also have a visual and a feeling attached to those sounds. So for me, sound, visuals, and movement have all been intricately linked from the beginning. My first major VR project was Girl Icon, which was a documentary done in partnership with Oculus, the Malala Fund, and the Milaan Foundation in India. We went for two weeks with 360 cameras and focused on then-17-year-old Rani Kanauja as she was getting her education, looking at the different communities that were supporting her in that journey.
So that ranged from the Girl Icon fellowship that she was part of, to her school, her family, as well as any other outside mentorship activities. In this experience, we positioned the audience as a companion on the journey. We wanted Rani to be able to have her own space, and to move away from a first-person experience. In being able to be a witness and a companion to Rani's journey, we used a combination of short visuals back-to-back, as well as montages of various moments of Rani in her activities. In these moments, we used the full 360 sphere and cut it up into pieces, so that people could watch the images float in and out, with sound, to evoke a kind of memory. We also had a physical exhibition at the National Civil Rights Museum in Atlanta, which provided a more physical entrance into the experience. While working on Girl Icon, I also had the opportunity to take part in Garage Stories, which is an ongoing workshop that introduces people to VR and different ways to bring audiences into immersive storytelling. It was a comedy, and we worked with the actors in a setting split into two separate rooms, with the camera on the cusp where the two rooms met. The comedy was about a couple that was about to get engaged, a severe misunderstanding, and a proposal gone wrong. And based on which view you were looking into, you could pick up what was happening in the story and put together, through context clues and physical cues, a slightly different narrative as to what the truth of the scenario was.
We sat and continued to do script work, as we do in regular theater and film, really allowing people to express nuance: not only expressing themselves physically so that people can understand the larger gestures at hand, but also focusing on particular nuances, like the way a hand curves while putting on lipstick, or searching for a ring. These are very small details that people pick up, from which they create an additional narrative in their mind and then piece together what was going on. With/Without You is an ongoing, now multi-year project — thank you, COVID — that is partially VR and partially AR. The AR audiovisual installation focuses on a young girl who is separated from her mother and caught up in the spiritual realm, and the VR experience focuses on both the daughter's and the mother's stories as they find their way back to each other. Part of the intrigue in this experience is the order in which these pieces are meant to be experienced: looking at whether, once things reopen, the VR experience can happen inside the installation at large, to serve as additional sculpture, as well as what understanding people have when the two are experienced separately. Beth previously mentioned Finding Pandora X, so I won't get too much into that, but I was the sound designer for that experience, and that entailed me going into the world and doing a site visit.
That is my avatar of choice; that is actually me waving to Kira as she took a photo. This was my first time working in a theatrical environment in VR, which still required physical presence, still required an understanding of scale, still required understanding how all of the design could work together: what people were perceiving, what people assigned reality to, and what other things they could suspend disbelief about and just go along with. As Beth mentioned before, there was a limitation on cues. I believe I had fewer than nine to work with for the entire play, so the focus was on the environment at large and then any prominent objects in the space. So, for example, a torch that someone would be carrying would have a sound, but the trees or footsteps that you're walking past might not. If there's a city and there's water flowing, heighten the sounds of the city and maybe have water baked in underneath that, but not as a separate cue. And with other sounds, as I mentioned, like in a theater, it's not about recreating everything to be hyper-realistic, but finding more metaphorical and poetic ways to express sound. This picture just gives a sense of the scale of the world we were working with. So again, finding the balance between intimacy and grandeur. Black Imagination was the first VR play that I directed. That is me — my avatar of choice, more customized to fit myself, but still always staying true to the purple and pink combination. I worked with Crux XR, and for this they gathered three writers and five actors, and we all collaborated together to create a series of three plays.
This involved not only rehearsing the plays and then actually performing them, but also considering what the onboarding process was like: for instance, being able to understand the physics of the world by interacting with objects, and also finding ways to bring people into immersive spaces. And I believe this might be the last thing that I'll present. I did cinematography for that project, not in a virtual reality environment, but coming back to thinking about accessibility and the resources that people have. This was at the top of the pandemic last year. So we looked at what kinds of angles people were able to do in their homes, what kind of lighting they could do with their lamps or with more of a ring light, as well as, for setting, looking into more traditional practices, like the models that set designers build, and incorporating that into the film: what is it like to film the set, to have a DIY feel to it, but still have something feel very immersive. So in conclusion — oh, I didn't hear a sound, but there are subtitles. Yes, there were. So in conclusion, the design and directing process that I've been looking at has been, first, what it means to combine sound, visuals, and movement together, partially based on my backgrounds, but also just looking at how to create a complete and immersive experience in VR. And also, in terms of the creative process, regardless of all the pieces — whether it's something prerecorded, or a documentary, or a film — continuing to have a sense of presence, and having that presence be highlighted and magnified. Thank you. That's great work. I remember the first time reaching out to you, and I think that I put this in the email that I sent you too.
I got so excited when I saw somebody else taking pictures with the Insta360 Pro, because I like to pose with it. It's like having a little robot alien with you all the time. It's really hard sometimes to film with it in public spaces, because people will just walk up to it and sort of greet it. Yeah, it's just a fun factor. I give cameras nicknames during filming, partially to cut down on the technical jargon and just make it more friendly, instead of, okay, now we're going to use the Insta360 Pro with the tripod. We'd say, we're going to use CL — CL being the sticker that the studio bought. I love that. And then our 180 camera was Walla, named after Wall-E. So even just things like that, giving them quirky names to use, also helps. Yeah, I love that too, because then I think about it, especially on the film side, especially 3D: you end up with lenses that have the parallax effect, so up close you end up with eye contact, so they become the thing that you're sort of performing to. So giving them a personification makes a lot of sense from a performance standpoint as well, bringing in the sense that this is ultimately your scene partner when you're staging things. Thank you. I'm going to pick it up before we get to questions — and we'll bring everybody back in a moment — to offer a little bit about our Toasterlab work. I've got some slides too. Nope, come back, thing. This is only going to work if it allows me to come back. There we go. All right, now I can actually share my screen. Here we go. All right. So, right, this is a book. This is the basis for part of the way Toasterlab came into doing mixed reality work. I'd say that the basis for it is a little over a decade ago, when we were having a conversation about a site-specific dance piece.
And when we were talking about the site-specific dance piece — there we go, let my captions catch up — we were talking about archive, and site-specific archive, and wouldn't it be great if you could have an immersive record of dance happening in a place, where you could put the dancer back into the space in which it happened, if it otherwise wasn't accessible, like a historical marker. And that went down this road of, well, could you do that with augmented reality, so you're overlaying it with the environment? And that sort of started us down this road. But from a dramaturgical standpoint, I always point back to The Invention of Morel, which is about a fugitive who escapes the law by going to an island. He's from Venezuela, and he ends up finding himself in this place where some sort of event happened, and suddenly he's not alone anymore: there's a woman there that he quote-unquote falls in love with. Obviously he's not interacting with her, but he's enamored with her. What he finds out is that there's actually this machine that is recreating the space, recreating this recording of people it has captured — an immersive projection of them — and he inserts himself into it over time. So there's this idea that virtual and augmented reality is really not just a set of technologies but an achieved state. It's not even called that at first. I mean, here, this is just Christopher Walken in Brainstorm, one of the first cinematic representations of a VR experience, which involves very few cameras; it's mainly about a mental state. But now we stare into these things — our VR cameras or lenses — ranging from a standalone headset to a cardboard box we put our phone into, with a lot of different options. We do a lot of stuff filmically.
These are some old slides with a few different ways of approaching cameras, but they all essentially do the same thing. In this case, the Ricoh Theta will take two wide fisheye shots, and then when you render them together — however many lenses you have — you get this equirectangular image, like when you project the globe onto a flat world map and it stretches, and so on. And then you can play it as a photosphere. So the experience, when you're looking at this — if you put it onto something like YouTube — is a bit like this, where you can drag it around and look all around you. But one of the places we started was: is there an accessible way for us to deal with this sort of content, with mixed reality content? And it was like, well, can we get video? The YouTube app plays it, or we can get a player on the phone to play it. So we started doing — it won't let me advance — this sort of experiment where we had pre-recorded video and would lock it in, oriented to a space, so you could come back later. And this is the idea of technological haunting, right? The idea that we could look at a place that has two very different realities, and look at it at various points in time. Now, we have an interest in environmental work, so we look at, as an example, watching glaciers recede. So there's this picture of Bear Glacier in Alaska over a period of 85 years, seeing it recede. And the way that these experiences change — this is actually from a film called Taneo, about a mother from Kiribati whose only record of her experience on the island where she was born is these VHS tapes. So some of our projects have included different sorts of community and remote work.
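The equirectangular mapping described here — the sphere unwrapped onto a flat 2:1 rectangle, which a photosphere player then wraps back around the viewer — comes down to a small trigonometric conversion. A minimal sketch in plain Python; the function name and coordinate conventions are mine, not any particular player's:

```python
import math

def equirect_to_direction(u, v):
    """Map normalized equirectangular coordinates (u, v in [0, 1])
    to a unit view direction (x, y, z).
    u spans longitude -pi..pi; v spans latitude pi/2 (top) .. -pi/2 (bottom)."""
    lon = (u - 0.5) * 2.0 * math.pi   # left edge .. right edge
    lat = (0.5 - v) * math.pi         # top .. bottom
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    return (x, y, z)

# The center of the frame looks straight ahead (+z):
print(equirect_to_direction(0.5, 0.5))  # -> (0.0, 0.0, 1.0)
```

Dragging the view around in a player is just the inverse of this mapping: the viewer's look direction selects which patch of the flat frame is sampled.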
So we've been working for a number of years, as part of our Groundworks project, with a number of collaborators in Northern California. As you can see, that's Raskadee in his regalia during our youth workshop. Here we're talking about acorn harvesting with Bernadette Smith, an artist from the Manchester-Point Arena Band of Pomo Indians. And then here we're on top of Kanimoto, a sacred mountain, going around and looking at site-specific ways of creating immersive experiences. We also brought this home as well — we're based here in Toronto. So we've done this ongoing project in Parkway Forest Park in North York, where we're working with youth to create VR stories around the park. A lot of this is doing theater workshops with kids and then giving them cameras, and they're running around the park with them. And then we've previously done, and planned to do last summer before COVID got in the way, pop-up cinemas where we share their VR films. Once those have been created, we place them into this web app where someone can explore the park through geolocation and find these immersive experiences — all about different accessible ways of using immersive tech to understand place. Our projects vary a lot, and we've been going through this Atelier project as well. We held a hackathon in June where people were looking at remote performance using VR spaces and augmented reality. We did a project in Philadelphia called Trail Off, which looked at user-instigated, geolocated interactive storytelling. We contributed a VR film piece that serves as the introduction, the rabbit hole, to Daniele Bartolini's The Right Way at Venice this last year. We just did a workshop with Theatre Passe Muraille, in partnership with Cohort, around exploring augmented reality for captioning.
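The geolocated web-app behaviour described here — walk the park, and nearby youth-made experiences become available — reduces to a distance check between the visitor and each story's anchor point. A rough sketch, with hypothetical story data and a haversine great-circle distance:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6_371_000  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def stories_in_range(user_lat, user_lon, stories, radius_m=25):
    """Return the geolocated stories whose anchor is within radius_m of the user."""
    return [s for s in stories
            if haversine_m(user_lat, user_lon, s["lat"], s["lon"]) <= radius_m]

# Hypothetical anchors in a park:
stories = [
    {"title": "The Old Oak", "lat": 43.7731, "lon": -79.3332},
    {"title": "Playground Ghosts", "lat": 43.7760, "lon": -79.3300},
]
print([s["title"] for s in stories_in_range(43.7732, -79.3331, stories)])
# -> ['The Old Oak']
```

In a browser app the user position would come from the Geolocation API's `watchPosition`; the trigger radius has to be generous, since consumer GPS accuracy under tree cover can easily be tens of metres.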
And Beth had pointed out PXR before; we helped support it as part of our series of events to bring people into VR space. So we've got this team of people. Toasterlab is, at its core, three people working together — myself, I'm in blue there — to find all these different ways of making the tech accessible: what are the ways that people can get into it, and solving not the question of what technology to use, but what is the right technology to get the experience that we want. I'll go back to that idea we had: we were trying to create an experience of place, and this is what led to the question of, what is the type of experience we're trying to communicate, and can someone actually experience it? Because when we started out, we had very severe technical limitations; augmented reality, even in the way we experienced it through something like Pokémon Go, was very difficult, and we worked on it for a long time, trying to get dance content into it. And then it just got a lot easier. But the way that people are using the devices they have changes the way that you approach doing things. So it becomes less about whether I'm going to build it in one way or another, and more about what people have, how we can make it accessible to them, and what type of experience I'm trying to communicate. That's one of the bases of our work, rapid fire. I'd like to invite all of our speakers and panelists back onto the screen. That was a lot of different types of projects to throw at people, so I can only imagine that someone is watching and thinking, I have no idea where to start. I have even more. My thought is always to start with whatever you have, and then start seeing what you can create from there.
You can try — if you get a new iPhone now, not that I'm suggesting anybody drop the money on an iPhone, but people have phones, and now you can do pretty good photogrammetry on a phone. Since the iPhone X, and Android phones of that same era, you can do motion capture with the front-facing camera that you used to need to rent a studio for. So much of this is being built into devices that we already have — or the Oculus headset, which is now cheap. We're going to get into the ethics of the Oculus headset in a moment too. Everybody's smiling because everybody has a knowing look of what exactly that means. I had sent out a couple of early questions to get us started, and one of them was: what began your interest in extended reality work? I think we got to a lot of that, and I'm sure people in the chat have specific questions about essential skills or technology. So I want to go right to the big question, because I alluded to it there: big tech. Right now, if you're experiencing VR, very likely you are interacting with Facebook on some level. And there are various things — I think we've all touched on this — where big tech is willing to accept some collateral damage for pushing the technology ahead. Which is also a challenge for us working together, right? We saw Timnit Gebru, who was just dismissed from AI ethics at Google, speaking about how this might move. In this pause, we've been dealing with a lot of issues, post Me Too, and in our space here in Canada a lot of people are integrating the Ad Hoc Assembly's voluntary addendum around creating a more anti-racist space. And we've all talked about accessibility. We've touched on sustainability and environmental issues.
And Beth, you even got to work-life balance. I'm surprised my kids haven't popped in here to say hello as well, because we're all working from home here. So, what are the concerns that you have, and are there any ways that you would like to address them — these ethical questions that we may have backed ourselves into a little bit by being on the front wave of a new technology that depends on big tech for advancement? It's a big question. I know you all have opinions about it, because we've all spoken about it. Well, I can go. As we move into using immersive technology more — VR and AR and all the different components of it — I think one of the dangers is to have all of that bundled into one company, even if it wasn't Facebook. Just look at the very nature of where VR is: we had 3DoF experiences, which are more mobile — you can't walk through them, but you can still look around and be immersed; standalone, where you have that immersive, interactive combination; and then the PC version, which is just, you go all the way. Over time, there's been a consolidation of what types of headsets and experiences people want to have, or that the industry believes people want to have. And while I appreciate that there is honesty in saying, this is actually where we want to go as a company, I think it's important for everyone — artists, creatives, developers — to understand that at no point should the entire field be hinged on whether one company succeeds or not. I know that people have their issues with Facebook and privacy, and I have a lot of concerns as well. But part of that goes to this question of, well, what happens if these lawsuits pass through? What if they break up Facebook? All of a sudden, a breakup of Facebook, for people, can mean the end of XR — and that's a huge jump that honestly shouldn't even exist.
Even just things like Google going down means you can't access all your emails and documents, and your whole business is shut down for however many hours. These are concerns that I have, in addition to the privacy and ethics of each company: this move towards monopolizing, I think, isn't great. In theater we have — just speaking for New York — Broadway, Off-Broadway, Off-Off-Broadway. We have regional theater, community theater, dinner theater, digital theater. There are all different kinds of theater. And I think, in the same spirit, when everything closed with COVID, there was this understanding that regardless of what Broadway decides to do, or what New York theater decides to do, funding aside, it shouldn't stop people from being able to create. We can talk about what's paid, and how much, and funding, and all of those things, but just the ability to continue to create shouldn't be determined by one unit of a whole, even if it is the unit that may have more power and resources in the structure. So as we move forward in emerging tech, I want to encourage people not only to look at high-fidelity experiences like the Oculus Quest or the Rift or the other PC VR headsets; if we can have more 3DoF headsets, that'd be great. If people are able to have more experimental DIY stuff — similar to how artists used the Kinect for sensors until they were more commodified later — those are the things that are great. Even for AR, there are many different startups and apps, and now you're able to scan with your phone or device. It may not be a completely hyper-realistic representation, but with all the media that we consume, not everybody wants that, and story-wise it doesn't always work. So these are things that are on my mind. Thank you. Beth or Paul? Do Beth or Paul want to go first? Because I know that you both have an answer to this question.
Yeah, I can go. It's so complex. Very recently, we're seeing the impact of these mega-corps and what they're doing to the progress of this virtual, extended reality world. The Wave had a VR presence and was doing a lot of really interesting work in terms of live performance and creating immersive work in live performance, but it was all built off of Google Poly, and with Google removing support and essentially dissolving Google Poly, The Wave is no longer able to continue, because they are a for-profit corporation and it doesn't work with their business model to have to build a whole substructure to support their continued development. And this brings up some really interesting questions — I now have five million thoughts — around open-source platforms and that being part of how we look at accessibility going forward, because so much is proprietary. I think headsets and such will get there. We'll either see a resurgence of the cardboard versions of things, or we'll be able to start hacking some of that stuff, and maybe Facebook won't take over, and not everyone's going to have a Quest, and things will come. I'm hopeful about that. But it's these platforms that are all built on other structures. Is there a way to create open-source hardware? Is there a way to create open-source software? So that we — the greater we of the globe — continue to be able to access these things without it being driven by profit margins. And I have ideas about how we can do that. For artists, too, one of the things we talked about at PXR was that Facebook runs Oculus, and there are all these issues with it, and we sort of have to look at getting at the system from within.
So how can we use some of these platforms now to learn what we need to learn in order to create our own systems — particularly as artists who aren't driven by the profit model of a gaming company or a film studio, and who may be looking at it in a slightly different way? So I think incorporating that too — and I know it's part of projects that I'm working on — that development of open source, this way of sharing, building another approach to this that prioritizes accessibility, right? Because that's not prioritized in any of those companies in any way, shape, or form. No, and in multiple ways too. One of my small sticking points that I use as an example is the way they determine the limited range of interpupillary distance that can be accommodated by headsets: they went from a fixed one on the Go, to a slider within a range on the Quest, to three presets with an acceptable tolerance between them. And things like the various photo sensors that are tracking lots and lots of our biometric data, whose ability to recognize different skin tones and know that somebody actually has a headset on is highly problematic. There's a lot that gets into that sort of collateral damage I've talked about; they're like, well, the headset's profitable enough. And as you mentioned about fidelity, there's no support anymore for the Oculus Go, which was a 3DoF headset with one controller that had limited use. Instead of advancing that as a way that people could get introduced to things, they just killed it — and some others out there that are really useful for things like VR cinemas.
Yeah. With everyone riffing on that concept of accessibility, we had a question from one of our listeners: I'd love to hear thoughts on how mixed reality artists or organizations are or are not serving accessibility, for both audiences and artists, to participate in these technologies, especially in rural communities. I'm really curious who asked the question, because there's one person I might end up misquoting who could very well have been the person to ask it. Oh, no, it's not the person I thought it was. Anyway. And often I'm the one to ask that question too. Well, I'm thinking of the Toasterlab hackathon that we did in June, which was based on this low-bandwidth-areas question. The thing that comes to mind is data infrastructure. We were working with Nakai Theatre and thinking about data infrastructure in the Yukon, and the ability to get high speed anywhere, even in the middle of Whitehorse, is difficult and limited. And having these plans for 5G drones feels very cute — 5G drones flying so you can get the data coverage; they actually do propose those sorts of things. Yeah, Paul, you're actually tethered right now. Yeah, I'm tethered. Yeah, and I seem to have the best bandwidth, so, I don't know. I mean, as everyone said, it's a really complicated question, isn't it, about accessibility. We're already speaking at a rather privileged level in so many aspects, whether it's geographically, locationally, or just with our pocketbook — these technologies cost a lot. And I think one interesting component, maybe, is that these large platform models, these Facebooks, are doing two things at the same time: they're making it profitable, they're pushing out the technology, and they're making it, therefore, more democratic in terms of accessibility.
This is a slow process, and I wouldn't say it's their highest priority, of course, but we can look historically at where we started in terms of the accessibility of these technologies, and it's a completely different landscape. A lot of what I wanted to focus on in my talk was really about saying that there is a stratum of accessibility with the technology now that didn't exist five years ago. Technology moves so quickly that it's almost impossible to imagine where we're going to be in five years, but we know that all of us are using things that we never used to use before. Even in terms of 360 camera capture, we're using gear that isn't prosumer, that isn't pro gear, but that does the function we need it to do and is accessible to us in a way it wasn't even five years ago. So that's already a shift in market dynamics, letting small mixed reality companies, small companies, enter the playing field — on one level. But as with all good things, there's the converse, and we see the same thing: a lot of people are left out of the playing field, certainly in VR, and I think there's a different kind of discussion we could have about which specific technologies we're talking about too, because it's very different for VR. I mean, you're right: I'm rural, there's not much support, nobody really cares out here, and there's no way I can run a VR system out here or create with it. I have to be in an urban environment that has high-speed internet access. So there are some major barriers to creation as well, depending on the platform. Yeah.
And I want to throw out an idea to help people think about this technically: one of the reasons there's been the recent explosion in available headsets is that the positional sensing, the locational sensing, the high-density displays all came out of smartphone development. Before the Go and the recent explosion of headsets, that was smartphone technology; essentially the Oculus devices, the standalones, are Android phones modded to be what they are. So there should be a capability of running this on mobile infrastructure, but the ecosystem also has to be open enough to allow people to figure those problems out. On some of our projects at Toaster Lab, we've had to come up with a packaging system so that people can download things before they go onsite, in a way that conserves their data in the background, because the last thing you want is for someone to be doing a remote streaming performance through 360 and not have the data. It's data intensive, because it has to be fairly high resolution to work, right? Yeah, absolutely. Yeah. On that tech end, there's another question from the chat here: if I wanted to live stream a theater performance using a 360-degree camera, what kind of technical equipment and considerations should I factor into a project budget? Hey, Beth, do you want to talk about Prague? Beth and I failed miserably to live stream an event. We tried, though; it wasn't for lack of trying. So a lot of the dedicated 360 cameras do have the ability to live stream, but, and I'll grab a visual reference, if somebody else wants to jump in here, I've got one nearby. One of the limitations is that you just don't have a screen to see what you're doing, right? So the interface is limited. Yeah, Beth, what happened when we were in Prague? 
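The pre-download packaging idea described above, letting audiences fetch heavy 360 media before they travel onsite so the experience doesn't depend on mobile data, reduces in essence to a manifest plus a local cache check. Here is a minimal sketch of that idea; the manifest shape and the `plan_downloads` helper are illustrative assumptions, not Toaster Lab's actual system:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash a cached file so it can be verified against the manifest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def plan_downloads(manifest: dict, cache_dir: str) -> list:
    """Return the names of manifest entries that still need fetching.

    `manifest` maps asset name -> {"sha256": ..., "url": ...} (a
    hypothetical shape). Anything already present in `cache_dir` with a
    matching hash is skipped, so a visitor on a metered connection only
    pays for what they are actually missing.
    """
    cache = Path(cache_dir)
    todo = []
    for name, meta in manifest.items():
        cached = cache / name
        if not cached.exists() or sha256_of(cached) != meta["sha256"]:
            todo.append(name)
    return todo
```

A companion step would fetch only the returned names while the visitor is still on Wi-Fi, verifying each download against its hash, so nothing is re-downloaded on cellular data at the performance site.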
I'm going to get a device to share. Sure, yeah, what happened? I mean, we had set ourselves up with lots of time. It was Ian and I working on it, so we had the knowledge in the room. We had a pretty robust internet connection, so that was checked. We had the right equipment, so that was also checked. We had all the pieces there. And when we ran into it, and Ian will remember specifically what it was, we couldn't get YouTube to... like, we couldn't actually get the stream out in a way that anyone could see it. So we had all the pieces, but YouTube had updated something; it had worked, and then it didn't work. And we tried a whole bunch of different end runs around it and spent a fair bit of time, several hours, trying to make it actually work, and just couldn't. And that was ultimately nothing on our end, because we had all the pieces and the know-how, but it was the greater gods of reality. We had enough bandwidth in both places. So this is, I brought it down, this is actually both a 360 and a 180 camera, the Vuze XR camera. It folds together, and, making sure my camera's not mirrored, it's got two lenses so you can do monoscopic 360, and then it flips out like that so you can do stereoscopic 180. But you might notice there's no screen or anything on it. You can tether it to your phone to preview, but even while you're recording, you can't really see what you're getting. With a lot of these devices, one of the technical limitations is actually being able to see it or interface with it. That also means it's hard to bring it into other devices; they're meant to be in your pocket. So one of the limitations we ran into, and Beth, you mentioned it, was that YouTube had changed something and just wasn't happy with the data stream the camera was sending on the other side. 
It was on a university Wi-Fi network, so it was running into firewall issues, because you had to connect to the streaming service directly from the camera, and you had, like, three buttons by which to select what network you were on and to enter a standard Wi-Fi code, and at the end of it, after about four hours, we were just like, this might not work. And if you're used to the current Zoom performance delay of 20 seconds for live streaming, you get 30 seconds and up. Even on a high-speed connection, you need to send at least 4K, which is high for streaming as it is, and you need really high bandwidth both on the up and on the down for the viewer, too. Sounds like the moral of the story is that you have to test, prototype, and test, and still it might not work. And we were aiming for interactivity too, right? It was a dialogue back and forth between Prague and Kingston, Ontario, and it just couldn't do it. Yeah, but yeah, test, test, and then test again, and then be prepared when something changes somewhere and the whole thing goes. But also know that if you get it working, the stakes are pretty high, and you'll be one of the first to really make it happen. So it might be worth putting that investment in, yeah. You know, it's also a question of what you need in terms of the project. Sometimes we think, oh, we need the latest whatever, but does it actually serve the project? Is it dramaturgically relevant, or can you do it in another way that's still very similar, just because perhaps the technology or the budget isn't quite there? And so, SV, I loved your presentation, going back to the roots of analog paper and pen and stop-motion animation effects. I think all of us as creators, as mixed reality creators, are trying to think about what we actually need to convey the story, or convey the dramaturgy, at the time. 
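To put rough numbers on the bandwidth point above, at least 4K on the way up and comparable bandwidth on the way down for each viewer, here is a back-of-envelope helper. The bitrates are ballpark rule-of-thumb figures for monoscopic 360 video, assumed for illustration rather than taken from any platform's spec:

```python
# Rule-of-thumb video bitrates (Mbps) for monoscopic 360 streaming.
# Illustrative ballpark figures, not platform requirements.
BITRATE_MBPS = {"1080p": 8, "4k": 30, "8k": 80}

def required_upload_mbps(resolution: str, headroom: float = 1.5) -> float:
    """Estimate the sustained upload to budget for a live 360 stream.

    Streaming falls apart when the encoder has no headroom, so budget
    the nominal bitrate times a safety factor (default 50% extra).
    """
    return BITRATE_MBPS[resolution] * headroom

def viewers_supported(downlink_mbps: float, resolution: str) -> int:
    """How many direct (non-CDN) viewers a given downlink could feed."""
    return int(downlink_mbps // BITRATE_MBPS[resolution])
```

On these assumptions, a 4K 360 stream wants roughly 45 Mbps of sustained upload, which helps explain why tethered phones and firewalled campus Wi-Fi, as in the Prague story, so often sink the attempt.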
So that's just something to think about. And one short note on these efforts at live streaming: Ethernet cable is your friend. We all talk about having Wi-Fi, Wi-Fi is the thing to do, but during this pandemic time we've realized that being hardwired in is a good, solid way to go. Yeah, hardwire wherever you can. SV, I have a follow-up question for you, because this is somewhat related, thinking about the limits of the camera interface, and thinking about your film work as well. Paul, you might want to hop in on this at some point too. In recording things in 360, especially with some cameras, there are limitations on preview and on how close or how far you can actually get from the camera, because essentially the only way to see it is to use a very small Wi-Fi network with your phone, and you might as well be standing next to the camera when you do it. One of the limitations we've experienced is that you sort of have to trust your understanding of the equipment to get the footage that you want, because you're never going to see it until you've had a chance to go back to your computer and spend a lot of time actually rendering it. What has been your experience, or best practices, with using the camera to shoot things? What level of trust do you put in the technology? So, a few things. When we filmed Girl Icon, not only did we have the Insta360 Pro camera, but we also had one of those clip-on 360 cameras for your phone. Being that we were filming in India, in Varanasi, a very, very busy city, with a lot of different things happening around us, even moving to film at Ronnie's home, we didn't want the disturbance of setting the camera up, shuffling it around, no, that doesn't work, let's carry it, let's go somewhere else. 
So what we did was attach a small camera to my phone, and that was how we scouted for a spot. If I can see it on my phone's camera, which is not as wonderful quality as the larger camera, I can definitely see it with six lenses. So we would just take a scan, just kind of walk around: okay, I can see everything at eye level, I want to have it at this particular point, okay, now we're going to set the camera up here. And then, between the heat and the fan that would interfere with sound, we were only able to film 15 minutes at a time, and that's pushing it; it really was eight to ten minutes at a time. So when we were filming, it was a very strategic process. Little baby 360 camera: find a shot, set up the camera, briefly look at the interface. And while you're looking at the mobile, see what's around and where there's a good place to hide, and then go hide there when we're actually running the larger one. And film in short spurts so as not to cause too much of a disturbance. We would film extended scenes over short spurts, and people would just get accustomed to this object being there and continue about their day. But that was an important part of the process, to be able to look at it in that way. And I know for our projects, for our crew, it's essentially like playing Where's Waldo: if you look around, you can start to see the same people's jackets turned away from the camera when we're out in a space. There's nowhere else to go, so you burn a minute of footage while everybody plays hide and seek outside. There's actually one more thing that I forgot. Oftentimes in VR, we talk about field of view like it's a bad thing: oh, this headset only has 90 degrees of field of view, or this one has 60, or this one has whatever. 
Due to particular vision limitations that I have, my field of view is actually not as wide as other people's. So when I'm filming in 360, I rely on that field of view to actually set a frame. And I understand, as I'm setting this frame, what the composition is within it, and then, if I turn, what new composition is before me. So it's also about understanding that we're filming in 360, but that as people moving through the everyday, there are times we're constantly looking around and seeing what's there; once you get a sense of your environment, though, there tend to be particular points of focus, and everything else around you informs where you are and just reaffirms that you're in that space. So it's about picking what that initial frame of focus is, then figuring out what the other frames of focus are with the very small camera, and then knowing, when we're filming, that these are the particular points people will look at. Yeah. I might actually hand it to Paul for a quick... Oh, I was just, yeah, very brief. I was just going to say everything you said is brilliant. I think it's really about knowing your technology. Like in any kind of design field, you need to become so familiar with that piece of technology that you know how it's going to respond in whatever condition. And one thing that we often leave out, and this goes into knowing your technology, is knowing the timelines of the technology. Because although it's true you might know a particular camera well enough to say, I can do this kind of thing, and I can trust that without the view it will do what I want it to do, sometimes you just need to be able to do a lot of prototyping and test shots and take them back. 
And so I capture a lot of nature imagery, especially for the projects that I was telling you about, and that tends to require long sequences, because I'm interested in duration, the time duration. And then, rendering and processing that just to see if the test footage is worthy takes a long time. So I'll end up capturing small segments just to see, going away to render it, taking a look, and then trying to come back to that same thing. It just requires a significantly different kind of time, you know. I often say that one of the things that helped me most in preparing to work with the cinematic side of VR was having been trained in traditional film photography: not being able to see the image until you've processed it, and learning how to frame a shot and trust your equipment, knowing it well enough from experience, with limited resources and, in this case, limited space and time. I did want to use that as a moment, because we're coming up on time. I feel like Erin's here for that, and we came right past it. But there's a really good point here that I want to pull out of what you were both saying, especially what you were just saying, Espii. We're talking about something that is very cinematic and screen based and technology based, and we've also talked about a wide variety of technologies: you could do it with a little thing that you clip onto your phone, or a giant camera, or you're going to render it in a VR social space, or build a whole VR video game. All of those span the gamut of the type of work that we're doing, sometimes overlapping in a number of different ways. And we could talk about all the different approaches ad nauseam. 
In fact, we are all part of a two-year project of having those discussions in an ongoing way, so we definitely could talk about it. But what I wanted to do as we're closing out is pull it back to this: it is a design practice, too. And there's something, Espii, in how you were describing it. One of the things I felt early on in approaching this was that all of these 360 camera makers are marketing to people who use cameras, who are used to framing, and so many of those pieces are unsatisfying because the makers aren't adjusting to the full field of view: they frame here, and then other stuff is happening elsewhere. Ultimately this practice, even when it's recorded performance, has to think about what the entire environment is. It is a scenographic practice to think about everything that's around you, whether it's virtually created or cinematically recorded, whether it's 3D or 2D, whether you're looking at it in a headset alone or on a phone or on whatever device. It's ultimately about organizing space, and then using technologies to collapse different times into one place together. I think that's exactly why this has been my argument for designers coming into this world: because we understand, aurally and visually, from a scenographic point of view, from a people-movement point of view, that organization of space, whether it's physical or sound. It's in our training, it's in our craft, it's in what we do to direct attention and to really, really think about the 360. Even if we've ultimately framed it in a proscenium, we still get it. We have spent our lives developing those dimensions and an understanding of those spaces. 
And so it's why this place, certainly for me, has always felt really familiar, from starting with projection and moving into VR, and it's why I try to bring people in, because then they come in and they go, oh, yeah, I get it, it makes sense: I'm standing in space, I'm crafting space in all the different ways we do it, with light, with sound, with shape. And so I think that's a really critical and key and super exciting place to be. Well, we're over. We are. I think that's an excellent place for us to wrap up. We're all around, everywhere. I'm going to turn it over to you, Erin, to cut us off, just cut us off, because we'll just keep going. This is what we do when we get together: we just talk about this, and that's it. Well, I'm so grateful to all of you for your energy and for this discussion and for this ongoing conversation. And I like the spirit that the conversation continues, that we don't feel like we want to stop, because in fact, this afternoon we are hosting a roundtable discussion about this topic. So if you are watching now and you have more questions, or you want to engage more with this subject matter because of its broad nature, we are doing another event, a roundtable to discuss VR, AR, and Beyond Reality topics, where you'll get to engage and speak at length with your fellow attendees about the concepts we discussed here today. So a big thank you to all of our panelists. Emily, maybe we can show the four of them again. Thank you so much to Beth Kates, Paul Cegys, SV Proctor, and Ian Garrett. And thank you especially, Ian, for moderating this panel. I'm so grateful to have the four of you here, real forerunners not only in the Canadian ecology of Beyond Reality but also internationally. So we're really honored and grateful to have you here and to hear your wisdom and your thoughts on this. 
Particularly significant to me, and hopefully to the people attending, is that, as this is the Dramaturgy of Digital Performance symposium, we're hearing from you not just from the technology perspective but as artists, from your perspective as creators. So thank you very much. I'll take a few more moments to wrap things up, and then we'll call it a day on this event. I just want to encourage everyone attending not to miss out on the brand new works that have just been posted to the Digital Art Gallery. They've just gone live, and I hope you'll check them out, as well as the most recent postings in our live event listings. If you have digital art or live events happening in the digital sphere during the next three weeks that you'd like to share, please contribute: our call for submissions is still open, and both the new works and the call for submissions are available on our website at levelup.designers.ca. And if you enjoyed this session, or you're looking forward to future sessions, we hope you'll consider donating to the ADC. It makes it possible for us to have programming like this and to support our community of freelance artists. Please check out upcoming events, including the roundtable this afternoon that I already mentioned, where you'll have the opportunity to discuss the ideas presented in today's panel. And please, please submit your feedback on this or any of our other events via email at levelup@designers.ca. We love hearing from you. We have three more weeks of events, and all the feedback we're getting we're able to integrate to make the rest of the symposium even stronger. Thanks again for tuning in through all the different platforms, and we hope to see you in a few hours' time at the roundtable.