everybody for coming today. My name is Patrick Rosati, and I would like to begin by acknowledging that the land on which we gather is the traditional, ancestral, and unceded territory of the Musqueam people. We're grateful for the opportunity to learn, work, and play here. I also want to quickly bring your attention to a couple of other events happening in the area. One is a play by Savage Society called White Noise, happening April 19th to May 8th here in Vancouver, Canada, at the Firehall Arts Centre. Another is the First Nations Talking Stick Festival, again here in Vancouver, at the Roundhouse Community Arts and Recreation Centre, February 18th to the 29th. And then here on the University of British Columbia's campus, we are lucky enough to have the Museum of Anthropology up on the north end of campus. If you have not made your way out to campus before, or haven't visited the Museum of Anthropology, there's an amazing exhibit going on right now called In a Different Light. It presents more than 110 historical Indigenous artworks and marks the return of many important works to British Columbia. These objects are amazing artistic achievements, yet they also transcend the idea of art or artifact. I've been to this exhibit twice, and it really is worth taking the ten-minute trip north on campus to see it if you haven't. It runs through summer 2020, so please do see it. I'd like to introduce Justine Garrett. Thanks so much, Patrick. And welcome, everybody. We are grateful to HELL Round for helping us with our live stream today, so hello to everyone tuning in that way as well. I'd just like to give a brief introduction of ToasterLab: why we're here and what we're doing. ToasterLab is a producer of mixed reality. We collapse space and time to produce original narratives. I'm one third of ToasterLab.
I'm joined by my partner, Ian Garrett, who will come up here in a moment, and Andrew Sempere, who's actually going to be joining us from Switzerland in a few minutes. ToasterLab combines expertise in storytelling and in original media design and programming to produce original work and stories for our partners. One main element ties our work together: a sense of place, of emplaced, immersive media. ToasterLab's signature focus is on in-place media experiences, and our apps guide users to specific locations to provide them with a new window into another point in time or another reality. These images are from Transmission, which premiered as part of the Future Play Festival at the Edinburgh Fringe Festival and the Future of Storytelling Festival in New York. Basically, as you all probably well know, two events line up together to provide different experiences that bring different types of understanding to spaces, and this augmented reality ties two points in time to one experience. One project we were really excited about was in Toronto's Parkway Forest Park. This was a really special project that we did with the Toronto Arts Council. We worked with the local community who live around the park, which is like a giant courtyard among many different apartment buildings, to create VR stories about that community. We did a VR workshop for youth, and they were definitely not easy on our 360 cameras. We got some amazing shots here. They were highly enthusiastic about telling stories that were important to them, and they quickly adapted to staging and shooting in 360. At the end of the two-weekend workshop, we held a pop-up VR cinema in the park where parents, friends, and family members got to experience the movies the students had created. And the students were very excited to host and tell their stories.
And then at the end of the summer, we did a mobile application launch, which emplaced the students' stories within Parkway Forest Park. My partner Ian is going to talk to you a little bit about one of our next projects, Groundworks. All right. So Groundworks is a project that we've been working on more or less continuously for a couple of years in Northern California, together with a number of Indigenous collaborators there, around a sense of restoring land. Groundworks combines elements of traditional knowledge. One of our core collaborators is here on the board, Ras K'Dee, with his cousin Paul, in their traditional regalia. Ras also works with lots of youth across a number of the different tribes with which he is affiliated. And we were combining that with his current practice, where he tours as a musician and hip-hop artist, as one of the leaders of Audiopharmacy. So he's been combining all of these things together. A number of the collaborators we've been working with there have also been looking at their contemporary practice through this Indigenous lens, with the idea of looking at issues of food security, water rights, cultural transmission, et cetera. So we partnered up on this around the idea of location-based storytelling. We've been creating 360 immersive audio and are now working on some more specifically AR projects with a number of partners around the Bay Area. Similarly to Parkway Forest, we've been using what Andrew will explain in a little bit: the basis for a lot of how we control these things, what we call the map tool. So we've been working with emplaced mixed reality, creating immersive content that is then located in space.
And so you can see a little bit of the map tool that we work with here, and some of the 360 that came out of a live performance that went along with this, which gets us to this idea of working in this combined space between mixed reality and live performance. This project specifically, after a year of work with a number of different collaborators from different communities within about a 100-mile radius of the San Francisco Bay Area, culminated in a live performance on the first official Indigenous Peoples' Day for the City of San Francisco. On that day, which had previously been designated as Columbus Day, there had been a sunrise ceremony on Alcatraz, which is partially preserved for its use as a ceremony site. A live performance at the end of the sunrise ceremony featured Dancing Earth, a multi-community Indigenous dance group that had created a site-specific dance work, and through whom we were working on a number of these different partnerships. So you can see a little bit here of this idea: over the course of creating what now totals about three dozen different experiences, you could be guided around the San Francisco Bay Area, or really Northern California, because you got pretty far from the bay, combining these topics. Dancing Earth also turned a lot of the thinking that came out of this collaboration into another project in the Southwest, based around Phoenix and Santa Fe, where they're co-located with a number of different communities, to create a piece called Between Underground and Sky World. And we have created an experience, some of whose content we'll be able to share later today; it's easy to access on mobile devices, and we were able to share it along with a show.
So right now we're working on creating a combination of the emplaced app, using what we've done before with creating this content, with something that ties directly into the performance itself, so that when people congregate around coming to one of their performances, like here you can see in the lobby, there is also a mixed reality experience that allows somebody to interact with content that was made in specific places which might otherwise be harder to get to. A couple of other projects that we've worked on focused on this. We did a project in Kansas City, with a choreographer based between Kansas City and Toronto, called Public Squared, which took place along the Kansas City streetcar line: three site-specific dance pieces, with companion VR films that went with them, which one would be able to access through a mobile app, a mobile web app specifically. One of the things we're dealing with in a number of these projects is access: trying to make things that are usable on the devices people already have, as opposed to making things for the most advanced devices, so it's about interoperability in that regard. Another of our ongoing projects has been TrailOff. Did you want to talk a little bit about TrailOff? Yeah, you've been involved in the narrative more than I have. So TrailOff is a transmedial walking performance, you know, that sort of normal thing, in collaboration with Swim Pony Performing Arts in Philadelphia and the Pennsylvania Environmental Council. The focus is creating original audio dramas that are emplaced on trails around Philadelphia's Circuit Trails network. Central to the project are ten original audio stories that are inspired by the trails and line up with them in a triggered way.
And here you can see my ponytail, it's a braid today, during one of our three development workshops on site in Philadelphia, which has been really rewarding, and a preview of the app. Each trail, as I said, will feature an original drama that unfolds over roughly a two-mile route. They're written by local authors who demonstrate rigorous artistic practice and also a connection to the communities they're writing about. This is writer Aaron McMillan at one of the workshops for writers, along with the bio of Affaq Mahmoud. We're very excited that this project will be launching in June of 2020 in Philadelphia. Audiences are going to experience these intimate journeys through an original app produced by ToasterLab. Each author has the opportunity to select where the gates trigger the next element of their story, so we're working really closely with them to come up with these elaborate triggering spreadsheets that show where, and for how long, each element of the story will unfold. And there are other elements to add to that, too: because there's a lot of data that comes into somebody's mobile device, there are also things that are triggered based off of current weather conditions or the time of year. So we're moving a little bit beyond the GPS-triggered sound walk to something that's more dynamic and reactive to the way people are moving through space. We hope it works. So yeah, we have a few different project updates; part of the purpose of the symposia and the Digital Strategy Fund grant that we've been working under has been to share out, as we've been partnering with companies, the type of development that's been happening across different ways of interacting with different types of mixed reality technology.
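As a concrete illustration of what one row of such a triggering spreadsheet boils down to at runtime, here is a minimal geofence check in Python. The gate ids, coordinates, and radii are invented for illustration; this is not ToasterLab's actual code, just the standard haversine-distance test a GPS-triggered audio app typically performs on each location update.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def fired_gates(gates, lat, lon):
    """Return the ids of every gate whose radius contains the listener."""
    return [g["id"] for g in gates
            if haversine_m(g["lat"], g["lon"], lat, lon) <= g["radius_m"]]

# Two hypothetical rows of a triggering spreadsheet (made-up Philadelphia points)
gates = [
    {"id": "scene-1", "lat": 39.9526, "lon": -75.1652, "radius_m": 100},
    {"id": "scene-2", "lat": 39.9600, "lon": -75.1700, "radius_m": 100},
]
```

A real app would run this check on every GPS fix and remember which gates have already fired, so a story element plays only once as the walker crosses into it.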
A lot of our core work as ToasterLab coming into this had been around recorded content that was matched with place, but we've been expanding that as we've been partnering on a number of different projects as well. So this is sort of our point to give an update in the sequence of those. Yeah, and today's focus really, as you may know, is about ways of working within mixed reality and performance. We're also going to talk a lot about mistakes that people have made and things that don't go so well. So please feel free to ask questions and interact with folks as they're presenting. We hope to give you some more information about starting to work in this way on your own, or, you know, to share successes that you have as well. Yeah, and I'll point out that we've been lucky to have the general support of the Canada Council through this Digital Strategy Fund. It's a fund that is sort of winding down in this current stretch of the strategic period, but it is still available out there. There are challenges about it, but one of the things that's really nice about it is that failure is an option: there are a number of projects where people have been attempting things and they don't know exactly what they're going to make, as opposed to a number of different creation grants that you might approach. If you're looking at something that you just don't know how to do, it's a good fit. They have very large grants, though I think we might be out of cycle with those, but the grants of $50,000 and under, which is not nothing, have an ongoing rolling application. So that's my plug for it. Some of the projects that we've been working on: we've been working with DLT Experience in Toronto on a project called The Stranger 2.0, which was actually completed in the fall.
We've been working with them over the course of the last year, sort of out of a shared interest. DLT Experience is known for creating immersive live site-specific performances for small audiences, typically averaging less than two. So sometimes there's more than one person involved, but not very frequently, and so the shows are hard to scale. So we've been having this conversation about VR, and we did play a lot with VR cameras. This is Daniele Bartolini, who's the artistic director of DLT Experience, with one of the Vuze cameras we were doing some workshops with, in the kitchen in the Italian Cultural Centre. Yes. Wonderful. So he's been in residence at the Columbus Centre in Toronto. One of the more notable projects that DLT has done in Toronto over the last few years has been called The Stranger, which takes one individual at a time through various downtown performance locations to lead them through a different experience of what it's like to be someone landing in a city for the first time; they're focused on the immigrant experience. In The Stranger 2.0, the idea was to allow more people, so that you went as pairs, and there were two different tracks that you could follow around the urban space around the Columbus Centre. It was good. It worked out really well. What we ended up contributing is this: how can you integrate something filmic in VR to transport somebody out of these experiences, to create something in contrast? So there are two tracks, above and below. We created two short VR films, about four minutes long, that you would interact with over the course of your track, inserted into the narrative structure of both of those directions. So you come into those, and that's a still from the VR here. The above one is a much happier one, playing with a baby; and then below... the baby's fine. The baby's fine.
But it is a bit of a darker narrative that you're following with that one. One of the projects that we were able to talk a little bit about at our previous symposium in November was the Albion Library VR project. Yeah, we did this in partnership with Kitchen Band Theatre, which is also in Toronto. They had received funding through a program the city funds around artists in the library, and, similarly to the Parkway work, we worked with community members and different community groups, different people who intersect in this public space. Just as an aside, we're fans of libraries, as they're like the last remaining business-like establishment that has a space where you're not expected to buy something; so, how do people use this space? There was some narrative work in here. We combined it with narrative work looking at the history of what that site was. You can see that in the lower left-hand corner here, where we were also making companion films, looking at local park space to see what the natural landscape would have been like before. There were community dance groups; this is Navron dance, in North Etobicoke. Elder Philip Cote introduced the area from an Indigenous perspective, and then we also had some fun. Carolina, who is our contact person, is the branch manager. We did one piece where we just strapped a 360 camera onto a book that was being returned to the book drop and then got reshelved, so it could go through that process. Of the other projects that we've been working on, there are two others that we've been able to complete in this time. One has been partnering with a new festival in southern Oregon called Live Culture Coast, which is decentralized; it goes up and down the coastal areas.
Amber Peoples, who is the artistic director of this new festival, and I had met and talked about what the infrastructure of this would be like. The purpose of this project was really to have people on site to talk about how emplacement and emplaced artwork would work with this festival. It's a combination agriculture and arts festival; it's culture, like agriculture, food, and culture. Yes. So there were various stands going for it, and there would be installations on farms. Yeah. And so guiding people through the space is a challenge, based off of other festivals that had happened before. And a project that we're working on currently is a production moving back into the theater, because we've been doing a lot of things outside, a lot of it because of emplacement and the ubiquity of GPS. Part of it's been convenient, and working in various types of public spaces lets us talk about what types of stories exist there; a lot of time taking immersive audio recording and 360 cameras into very hot forests. Yes. It's like, if anybody is familiar with geocaching, it's not entirely unlike that, but with media. The best way that I've heard geocaching described is: using a system of multi-billion-dollar satellites to find junk in the woods. So we make the junk that you put into the woods. We are also working on projects that are coming into theatrical spaces as well. So this is a project that we've been working on. We have been working with motion capture in various projects before; this is our, at the time, three-year-old son testing out an older Organic Motion stage that we had. I also teach at York, and so we were doing some experiments here. A lot of this technology, interestingly, has become much more accessible, so on this production we've been using a much smaller area to do digitization.
So this is the lighting designer for this production standing in to do some lighting tests for work we were doing with the actors and green screen for various parts of the video design, but then also creating these augmented reality characters. Some of the characters are created through three-dimensional puppets that we're building. For the main one, and you may never look at me the same again after this, we've been using facial motion capture to animate one of the characters, in sort of a Wizard of Oz look. We've got the chroma backing here for the purpose of keying that out later on when we put it on the stage. For one of the characters we've gone through an entire motion capture process of capturing the performer's face and altering it, all of this using the recent iPhone that has the depth sensor on the front of it. So we've gone from the full stage that we were working with before to something we always have with us in our pockets, and that's been a period of five years. It's not actually puppeteering me talking right now. And then the last project we'll mention in our update is that we've just gotten a second round of funding to work on a new version of Parkway, which will allow us to expand elements of it, to bring in the storytelling element that we've been doing down in Philadelphia and bring that back home, and also to start looking at other ways of integrating other types of AR content that are community-driven as well. So this will be another app that we'll build, a dedicated app, still using the same map tool, which itself creates a web app in what it's doing. So you guys are super special, but you're not the only symposium we're hosting this year. We're really excited because, as part of our grant, we get to host six symposia. Today, we're here in Vancouver. We had one at York University to kick it off in November.
And then in Kingston we'll be at FOLDA, the Festival of Live Digital Art, which you all should know about, or think about attending, in June 2020. Then we're going to head down to Bloomington, to Indiana University, this fall, and then we'll be at the CRCI Conference at Brown University; CRCI stands for the Conference on Research into Choreographic Interfaces. So yeah. Rolls right off the tongue. Yeah. That's in winter 2021, and then we'll wrap up our final symposium at FOLDA again in June. That should say 2021. Yeah. Trust us, we'll be there again. And then you should also all know about World Stage Design in Calgary, which is happening in August, where we'll report back on basically everything that has happened. Yeah, there will be a final summary and sharing of everything that came out of this project. So I just want to deeply thank Patrick Rosati for being our partner in producing this event today and for bringing so many of you here. Also, thank you to Laura Isabel for doing some amazing prep work and production design and all kinds of organizing, and Gabriel Garland, too; they're in the back over there. Thanks to those guys for making it look so nice today. Aisha Bentham is here from York University helping; you saw her at check-in, so we're grateful for her support as well. And we're also super excited for all of our panelists today. As you can see from the schedule, here's how today is going to work. Oh, yeah, Andrew. Andrew is going to pop in from the ether in just a moment, and yes, Andrew will be talking in a second. We're going to have some presentations today. Please feel free to pop in with questions. We'll have time for Q&A at the end of each panel. We'll have a lunch break and then come back for two more panels.
And then we're going to do some hands-on demonstrations of some of the presentations you're going to see today, which will be over in the other room, we think, very probably, or this room. It's going to be a real fun surprise, but we'll tell you where it will be at the time. Are there any questions right now? No, and Andrew is here. What? Technology. Andrew, can you hear us OK? No. And he can see your chest. I know. We are technology professionals, so you know. It always works. Hello, Andrew. Hello, am I live? Yeah, you are. You're here. OK, great. So I'm watching the stream, so there's obviously a bit of a lag; in my land, I'm not live yet, but that's OK. Good morning, everyone. I'm going to try to hang out and lurk a bit on the feed, and I'm always excited to be there and to see another room full of awesome people doing cool work. I'm also at the tail end of what I hope is the end of a kind of cold-bronchitis thing, so I apologize for my voice, and if I start coughing, I might stop, but I'm not going to talk that long, so it should be OK. Ian asked me to give you a really quick update from software land. About two months ago, at the previous symposium, we talked a little bit about this tool that we're working on called the MAP tool. It's not an exciting name, but it's stuck so far. It's basically a kind of Swiss army knife for making locative projects. It's very bare-bones, but you can imagine it's essentially just a way to affiliate information and media with a location on a map, which can subsequently be used by whatever you like: by a mobile app, by a web-based app, or by itself, because the authoring tool itself can also display this information. So in some cases, we've used it as the actual presentation layer. In any case, what we're trying to do strategically, in the context of this grant, is to drive software development of a multi-purpose tool while using specific projects as the drivers.
And this is pretty neat, but it's a balancing act, trying to keep an eye on the specific requirements of a project, which are often more pressing, or seem to be, than making the piece general-purpose. This is a really normal thing that happens. But because of that, we now have, I think, close to a dozen projects running on top of the software platform, which is awesome. But all of them have maybe a slightly different approach, so until about two months ago, they were each running the same code but with a little bit of a tweak. This is sort of beta mode. Over the last two months, I've put a large effort into streamlining that and making sure everything is on exactly the same code base, with only a configuration file change per project. This is not super exciting to show you, it's a very nerdy update, but it's been a lot of work and I'm really excited to say that we're done with it. Now every single one of our projects is on a single shared code base, which makes my life a lot easier and makes maintenance going forward a lot easier as well. Two last points with regard to that. One is that the goal is also to get this project open source. It is, in fact, open source; but as I mentioned at the last symposium, there's a difference between just publishing code in public, which is fine, and a real open source project. An open source project is one that can be, and is, used by other people, and for that to happen, the code should be maintained and structured in such a way that other people can actually use it. So that's been another part of this big two-month push: I'm really driving us towards a 1.0 release of this, getting out of the beta that works in Andrew's head and into something that other people can actually use. And I'm really excited. We're not quite there, but we're very, very close to having the 1.0 release of this code base, which is great.
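A minimal sketch of what "one code base, per-project configuration file" can look like in practice: shared defaults, with each project supplying a small file that overrides only what differs. The keys and values here are invented, not the MAP tool's real configuration schema.

```python
import json

# Hypothetical defaults shared by every project on the common code base
DEFAULTS = {
    "title": "Untitled",
    "basemap": "osm",
    "media_base_url": None,
    "offline_bundle": False,
}

def load_project(config_json):
    """Overlay one project's JSON config file onto the shared defaults."""
    cfg = dict(DEFAULTS)
    cfg.update(json.loads(config_json))
    return cfg

# One project differs from the defaults only where its config says so
trailoff = load_project('{"title": "TrailOff", "offline_bundle": true}')
```

The point of the pattern is that a bug fix in the shared code reaches every project at once, because the only thing that varies per deployment is the small override file.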
And all of that, of course, has lately been in the context of the TrailOff project. So we've also been doing a lot of work on infrastructure, figuring out ways to distribute large amounts of media cost-effectively: how we can host the several-gigabyte files that represent a walking tour and get them onto people's phones without paying through the nose for large amounts of data and streaming services. That has been an interesting challenge, but I'm pretty excited about the solution that we've come up with there. So that is an update from engineering land. Thank you. Um, so that's our ToasterLab update. That's what we've been up to. I'm not sure about your audio. Oh, there are questions. Can you hear me now? Yeah, I can hear now. Thank you. That's an update from ToasterLab land, giving you the grounding and context for why we're bringing these convenings together: so that we're able to give an update on all of the different strands of work that we've been doing, and also to host these conversations with people who are also doing interesting work in this sort of hybrid area. There's not necessarily a huge amount of direction; there's a lot of overlapping, and people coming at it from different angles, whether they're coming from a narrative end, whether they've been working in digital media and types of distribution or immersive content for a while, or whether they're coming at it from a theater and performance end of things; we're trying to gather everybody together. We have a moment for a question or two, if there are any here. Otherwise... So the question was: does our map tool have directionality, in terms of orienting audience members to a performance? Cardinal directions. Yeah. And you can talk more technically on that. That has also been part of the narrative, especially within TrailOff; sorry, Story Trails is its old name.
With the TrailOff project, we're coming at that specific issue from a few different directions, because some of those trails are very linear and that line is obvious, but many of them are non-linear and many of them are non-obvious. So how do you get all three of the elements, the production elements, the technical elements, and the narrative elements, to line up together and combine in the app? Andrew, can you comment a little more specifically on what that means in terms of how the map tool works? Yeah, I can. So, okay, sorry for the splitting of hairs here, but it's actually necessary to explain. The thing that we're calling the map tool is basically the authoring and the infrastructure for hosting information about the experience. It's not the experience itself. So in that respect, you can put any information you like, including orientation information, into that tool; it will host it and have it available. So that's one thing. But there's a more important part of that question, which is, I think, what you actually want to accomplish: understanding where a person is actually standing, how they're oriented, and then feeding them the media for that. That's less about the map tool and more about the actual thing that you're using for the experience, in this case probably, I'm assuming, a phone app. So the thing that understands the orientation is going to be the device, not the database, if that makes sense. Then the challenge there, and Ian mentioned this in the introduction, is that we've lately been working a lot on the idea of creating experiences that are low-threshold. The thought is that anyone who has a mobile device can access our experiences, and because of that, we've been deliberately limiting ourselves in what we access on the devices. The latest iPhone, for example, can actually handle the orientation information relatively well, but not so much, say, an early Android device.
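To illustrate the device-side computation Andrew is describing, here is a hedged sketch: given a GPS fix and a compass heading reported by the phone, compute the bearing to a target point and decide whether the listener is facing it. The coordinates and the tolerance are invented; a real implementation would also have to deal with sensor noise and per-device compass quirks.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing (0-360, clockwise from north) from point 1 to point 2."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    x = math.sin(dl) * math.cos(p2)
    y = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return (math.degrees(math.atan2(x, y)) + 360.0) % 360.0

def facing_target(heading_deg, lat, lon, tlat, tlon, tolerance_deg=20.0):
    """True if the device heading points at the target, within a tolerance."""
    # Wrap the difference into [-180, 180] before comparing, so 359 vs 1 counts as 2 apart
    diff = abs((bearing_deg(lat, lon, tlat, tlon) - heading_deg + 180.0) % 360.0 - 180.0)
    return diff <= tolerance_deg
```

Note that nothing here touches the hosted data: the database only stores the target coordinates, while the heading comparison runs entirely on the device.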
So we haven't put a lot of work into that, specifically because we haven't had a project that has articulated a hardware target that makes this sufficiently easy for us to do. That's where I would draw the line. Sorry again, this is a bit of a technical thing, but basically, if we were to do an orientation-based project seriously, I would first ask: what set of devices are we using? Pick that, and we'd have to build for that, because, to be completely blunt, there is no hope of coming up with a solution that will work for all possible devices throughout history. We have to draw a line somewhere, and orientation is one of those things that's highly device-dependent and relatively new. It's really cool, but not everybody has it. So yeah, that's a start there. I'll stop. Any other questions? Does that help you? Right over there. Yes. You guys are great. Ha ha ha. Can I start? Yeah, yeah. There's lots of... So cathartic; I love answering that question. Our first project was a 30-plus-site, site-specific, GPS-triggered app in Edinburgh, where we did not live. It was extremely ambitious and extremely expensive, and it was so painful; it really was not critically well received, because it was extremely complicated to experience. We began working on it in 2015 and then launched it in 2017 at the Fringe Festival and at the Future of Storytelling. But we learned so much from that, in terms of the number of mistakes we had the opportunity to make. So, yeah, I mean, it really laid the groundwork. It wasn't a colossal failure, I won't say it that way; it was a lot of work, and there were some amazing parts of it too. But all the things we did wrong in there were really valuable. Like, we had a stage show portion that we put in the middle of the story, so you would have had to experience a certain number of GPS-triggered things at the Edinburgh Fringe Festival first.
It should have been at the beginning; just simple stuff like that. Yeah, throughout, a number of things in each project have highlighted where we need to go next. There have been the technical issues that we've been able to solve with each of them, because we're trying to do something new, or because there's something new that we want to do with a particular device, or a new feature becomes available or more accessible to more people. More so, and looking back specifically at Transmission: even a year before that, when we were just trying to describe what exactly we were trying to do, it would take a page or two whenever we were writing a narrative or writing a proposal. And then in the summer of 2016, Pokemon Go was released, and all of a sudden we could have a one-sentence description, because the show that we were working on was based around extraterrestrial communication. So it's Pokemon Go meets Arrival, and people would get it like that. So it's learning what the limits are to what we're doing, and having people come at it from the technology side and from the immersive site-specific side. There are people who have had long theater careers, long technology careers, people who have been part of Third Rail Projects, a notable immersive company, and been part of those shows for long periods of time. Even with all of those different backgrounds, there was still a lot to figure out in each of those projects at the speed we needed to work, so it ended up highlighting a lot of those things. We'll be able to talk much more about failures throughout the day. We can highlight a lot of things.
It's a very long list of things, which is, again, why we're in the position of being able to go through this atelier project of throwing things at different questions. Not all of the projects that we're doing are going to production; a lot of them are also experiments to see, what if we added this onto something that was already happening? I'm gonna just put a pin in that for right now, because I know we'll be talking about it more, but I want us to switch over to our keynote. We'll just kind of cut off our litany of failures for now. And I'd like to happily introduce Patrick Pennefather, who's an assistant professor here at UBC; his office is right upstairs. He's part of the Master of Digital Media program. He's a designer of interactive experiences, an award-winning sound designer and composer, and an improv performer and deviser. We are so delighted that he's gonna give our keynote address today, and I will turn it over to him now. Thank you. And we're gonna do a computer swap, so give us one moment. Thank you, Andrew. Yes, thank you, Andrew. Mm-hmm. Thank you.