I feel good in front of the PXR 2020 sign. I'm just gonna, I have muted you all, sorry, just temporarily. Yeah, hey, thank you for the applause. Does everyone know how to do that right now? Show us your applause if you know how to do it, if you know how to clap your hands in Altspace. Nice. So if you're on a headset, look on your left there; there's a little smiley face. Just open up the smiley face and, nice. Amazing. Thank you. So we're going to get started. Something I've noticed about virtual reality is you should never ramble on too long, because people will maybe find a firework or a snowball and that will become more interesting. But let's try to actively listen just a bit. I really wanted to welcome you and also to acknowledge the land that we're on today. If I invited you to help me with this land acknowledgement, would you please join me up on stage right now? Just come on up here. So that's Clayton, Talia, Jacob. Come on up, if you're in here. Everyone's looking around; we'll give them a second. So I'm gonna get started. My name is Alex Doe and I am the symposium director for PXR 2020, and on behalf of the team I want to welcome you all to our very first day. I am so incredibly excited to see so many of you here. I've spent a lot of time in this place, but it's been kind of like an empty amusement park, and now suddenly there are people here, so I feel a great joy about that. I also want to thank you for all the work that you have put in to being here: for attending our orientation, for joining Discord, for all the RSVP notes we were sending out frantically to you. We appreciate that, because this is a big jump to make, and you are in some ways taking a risk in setting aside time to be here. But now we're here in Altspace at PXR.
We're standing side by side, and now we can talk about how we are going to use this incredible tool to create performance. I want to promise you that you're going to leave here with tremendous insight into how XR tools can be used to create art and performance, and I guarantee that at the end of this you will know whether this is something you want to use in your own work or something you want to totally ignore; you will be able to decide that. You're also going to make important connections with artists across the country. Yesterday I was speaking with someone who's here from Yellowknife, talking about projects together and kind of dreaming up ideas, and that was incredibly exciting for me, and I hope that you all get that opportunity. We have people here from all over Canada who are excited about working with this tool, so I really encourage you to make connections. One thing that's fun is that everyone's got a username, so to help you correlate usernames to real names and real affiliations, we're going to try to put together a sheet at the end of this that links each username to a real name, so that you're able to leave with that info. One second, I'm going to unmute. I want to know if, hold on, lost my hands. Have I got my other folks who are going to help with our land acknowledgement? Are they here? My big plan was, since we have people here from across Turtle Island, to have each of those people speak about the peoples from those particular places, but they don't appear to be here, so I'm going to go ahead with what I had. Yes, did you have a thought? Well, I'm one of the fellows at the symposium and I'm also Indigenous, so I could at least speak from the Vancouver territory, which I know a few of us are from. Amazing, thank you. Do we have anyone here who is from the Yukon? Okay, how about, is there anyone here who is from the Northwest Territories? Okay, Alberta. Yeah, I live in Banff, Alberta.
Okay, I will put you on the spot with that. I just don't know what you're asking me to do. Yeah, I know, I know. So, okay, and this is a learning moment for us. What I've realized is that in order to have this many people in a space, there are people in a parallel version of this theater. They're in another shard, and some of those representatives are in that shard. So what I'm going to do is give the short intro land acknowledgement that I had, and then I'm going to jump over to them, and what I'd like you to do is explore, meet each other, and have a look around. Okay, how does that sound? Give me a smiley or applause if that sounds okay as a temporary fix. Awesome. And on that note, I definitely want to say: things are going to go wrong over the next two weekends and we're going to have to roll with it, and being patient and understanding is definitely a good approach. So I want to take a moment to acknowledge that wherever we are coming from today, we are situated on lands and waterways that have known human activity for thousands of years, and we are privileged today to be able to enjoy the benefits of this land that supports all that we do, including being in virtual reality. The original caretakers of this territory have always known how to benefit from land and water and to ensure that future generations will have what they need to live. And now I'm going to try to jump to another shard and say this in another place, but I encourage you all to have fun. Our first talks are going to begin at one o'clock. We have A. Thomas Goldberg; that's off of VR, and there's a link to a Zoom presentation on the Discord that you can go to. I highly recommend this presentation. It's circus: he's at the studio The 7 Fingers, and what they're going to show us is going to be spectacular. We also have an amazing panel discussion on spectatorship.
Right in front of me here is Milton, who's going to be moderating that; maybe Milton can tell you more about that panel. And then after those, we have Dan Miller from Unity Technologies, who will be over in one of the presentation rooms, I think it's A; we'll have volunteers to tell you where. He'll be talking about Unity, which is the tool that I use to do this worldbuilding, and there's so much possible with it, so I also strongly recommend that you check that out. Thank you all so much, and enjoy PXR 2020. Oh, yes, you had a question over here. I'm going to unamplify myself. Yeah. Raven, thank you for your offer. I'm one of the PXR 2020 hosts. I'm very excited to see everyone here. This is my first time doing this, so excuse any little technical difficulties; hopefully there won't be any. There hasn't actually been a full rehearsal, so bear with me. So yeah, I'm just going to do a quick little intro. Now, I guess I do have the option to mute most of you, but I think I'm going to trust you for this. If you want to mute yourselves during the presentations, it's totally up to you, and make sure that when the presenters are speaking, you give them the space and time to speak. Thank you very much. Okay. And I'm going to read something in front of me, so if my head looks weird, again, bear with me. Okay. Can everyone hear me? If you can hear me, just gesture, just move your hands, you know. Okay. Wow. Great. That's surprising. No audio issues. Fantastic. Okay. PXR 2020 would like to begin by acknowledging the support of the Canada Council for the Arts Digital Strategy Fund. We would like to acknowledge that we have organized this event primarily on the traditional, ancestral, and unceded territory of the Coast Salish peoples: the Squamish, Tsleil-Waututh, and Musqueam nations. Video games, Twitch, YouTube, and algorithmic culture more broadly continue to transform the context for spectatorship of performance.
Audiences, users, players, and performers alike are developing new ways of active participation in digital and virtual experiences. As we come together to discuss the emerging modes of spectatorship, we will explore and challenge the status quo in each medium's presentation and performance, and imagine how things will continue to change. Now, before I bring on our moderator: Davis, are you here, Davis? Hi. Do you want to come on stage real quick? Hi. So I was told you also wanted to do an acknowledgement yourself, yes? Davis? Great. Let me just amplify your voice before I do that, one second. Okay, you're amplified. Cool. How's it going? My name is Davis Hesslip, and I'm contacting you from Sǫǫ̀mbak'è, which is also known as Yellowknife, in Chief Drygeese Territory, and also the traditional lands of the North Slave Métis. I know we're gathering here virtually, but I think it's really important that we also acknowledge the lands that we're standing on. So yeah, I thought I'd just do a land acknowledgement of where I'm from. Thank you so much. Fantastic. Thank you so much. And now I'm going to introduce our moderator for this evening, or afternoon, wherever you are. Please welcome to the stage Milton Lim. Hi, everyone. My name is Milton Lim, and I will be the moderator for this event. I'd like to welcome you to Identifying and Strategizing Around Emerging Modes of Spectatorship and Attention, which truly is just a fancy way to say that we're talking about how to deal with the changing nature of the way we watch performance. Just at the top, I want to say that since the notion of spectatorship is necessarily broad, I'd like to frame this discussion within the more specific contexts that each speaker will bring to the table, and furthermore, the spaces between and the relationships that arise from them: across VR, across theater performance, across different parts of the attention economy, and so on.
So we have asked each speaker to bring examples of how, in their perspectives, these new models and platforms challenge the status quo in performance, and importantly, to forecast what these might look like as they continue to change. So by way of introductions, I'd like to call upon our four panelists. I will read a little bit about them and remind them to please amplify their voices so that we can all hear them. First, Raven John is a two-spirit Coast Salish, Stó:lō artist. Raven is a visual artist, cultural consultant, mediator, photographer, and sculptor who graduated from Emily Carr University of Art and Design with a major in visual art and a minor in social practice and community engagement. Raven also completed the Northwest Coast jewelry arts program at the Native Education College. So Raven, welcome to the stage. Next up we have Melissa Dex. Melissa is the director of the Vancouver VR community, aka VanVR. She is an accomplished filmmaker and photographer, recently releasing Snash Ford, a documentary covering Vancouver esports on the cusp of growth, and she lends her media production skillset to various studios focused on the immersive space. Melissa, welcome to the stage. Thank you so much for having me. Of course. Next we have Liam Karry. Liam is the founding artistic director of Single Thread Theatre Company, a founding member of the Co-laboratory Theatre Workshop Initiative and the Constant Theatre Alliance, and is a co-founder and current artistic director of the Kick and Push Festival. He directed UNLESS, a site-specific immersive adaptation of Walker Percy's The Moviegoer. Liam, welcome. Thank you, Milton. It's great to be here. And last but certainly not least, we have Jacob Niedzwiecki, a queer artist who works in code, media, and movement. After retiring as a professional dancer, he has choreographed and directed dance films, directed multi-cam dance live streams, and created several live performance works.
In addition to his work as a tech dramaturg and performance creator, he's also the instigator of the Cohort Initiative, a program that makes it easier for artists to integrate mobile devices into their work. So welcome, Jacob. Jacob, are you muted? Oh, yeah, there it is. So welcome to our four panelists. I've given them the instruction to use the space very informally if it helps to illustrate the things that they're talking about; these are the kinds of capabilities that hopefully we can continue to explore in virtual space. So first, I'd like to open with a rapid-fire prompt question for each of you, and then I'll switch to talking about the things that you may or may not have brought in for the discussion. So really, I just want a quick, like, 30-second thought in response: what's an influential piece of performance that has transformed your conception of spectatorship or participation? So, Jacob, can I throw it to you first? Sure. I would say, folks, lovely to be here and to be able to talk with my hands; on Zoom that's not so effective. Yeah, I would have to say that, I don't know, a big awakening point for me, having grown up in the world of classical ballet and with some exposure to contemporary dance: I think it was actually when I first saw Sleep No More that something clicked for me in terms of the possibilities for an audience experience. Yeah. Great. Sleep No More. Thank you very much. Is audio okay, by the way? Can I get hands thrown up if it's okay? Perfect. Thank you so much. All right. Liam, can I throw it over to you? What's an influential piece of media or performance that you've seen? It's hard to choose, but I think I'm thinking of this one. One, because it was incredible, but two, because I think it just closed: Then She Fell, by Third Rail Projects, in New York.
Being able to experience that immersive journey, based off something I knew, but by myself, I found that incredibly powerful. I think I'll just leave it there for right now. Thank you so much. Yeah. Melissa. So I don't really come from performance, like your traditional performing arts, dance, classical music, all that stuff. I come from more of the gaming industry, but in the gaming industry itself, I feel that there are performances on Twitch, because those influencers are acting a certain way. It's an interesting thing, though, because I think there's a fight for attention and money and sponsorship. So the kind of content that comes from that can be a little bit questionable, because in the current turmoil of polarization in society, coming from politics, even our hobbies have become this way; a lot of people are more outspoken and extreme. So I'm a little bit concerned that some of the content we're going to be seeing in performance arts in a virtual space will be more extreme, and mostly for attention. But yeah, that is just what I'm witnessing from the gaming perspective. I really love learning about ballet and these things. And even this event, hosted in VR, is incredible, because it's modeled after a real place that you could be in. It's gorgeous. I love this. Amazing. Thank you, Melissa. We'll definitely dive back into some of those thoughts that you brought up. Raven, can I throw it over to you? Oh, yeah. I'm going to out myself as a dirty, dirty nerd. For me, something that was super transformative was actually larping. So, I can't speak for everyone, but I assume most of us know about role-playing games in general, like Skyrim and similar, where you are immersed and you play a character within a video game space.
Dive further into that idea and you get tabletop role-playing games, where you might act things out, but generally you describe what your character does in a world-building interaction with other people, performers, game masters. And then I started live-action role-playing. As an artist, I really enjoyed this because I was part of really telling a story with other people, but also I'm a performer at heart; that is part of my spiritual and Indigenous practice. We are performers, but as we all know, you generally don't see people like me in any form of media, and if you do, it's very limited and generally heartbreaking. So being able to participate in live-action role-playing games was a great way to use some of that much-needed performance activity that I love, as well as creating spaces with other nerds that are very inclusive, very progressive, and just very wild, being able to create a world of your own. And I have a lot of friends who've turned their collaborative storytelling into books, into video games, into web comics, that sort of thing. Amazing. Thank you so much, Raven. So we will have time for Q&A, so if you do have any questions, please hold on to them. Later on, when I open it up to everyone else, we'll probably just do hands up and kind of popcorn-style it, and both Jesse and I will go around and make sure we amplify those people's voices. So please hold on to those questions. As for the panelists, I did say that I would give you approximately five minutes to talk about your thing, based on my gauge of the timer, and now I might ask for three minutes or less, if that's okay. So can I throw it over to you first, Melissa, about some particular, specific idea you might have around how spectatorship is changing in your area of expertise? I actually got really sidetracked just looking at the room around us and listening to what Two-Spirit Trickster said about inclusion and diversity.
And even just looking around this room, we could be purple and blue and it doesn't matter. I think that's really special about this virtual space, because it literally doesn't matter where we are, you know, where we're from. I feel like this is a really uniting platform. And yeah, again, I'm just surprised that I'm here, and it feels really special; I'm kind of in awe right now. In regards to what I was saying about games, I don't know, I want to stay positive for this talk, so I'm a little bit afraid of what I'll be saying, because I've been watching a lot of videos on propaganda and social media. I mean, these are tools that are used for good and bad. So again, I just wish that there was more education around, you know, the kind of manipulation and content that's going around. I'm so sorry, that's a pessimistic thing to say. But yeah, I really like this space. This space is where it's at. Amazing. I think we'll definitely come back to that as we consider how the attention economy and the gamification of behavior have really shaped how experiences are being structured or laid out, especially in things like games, as Raven brought up. Liam, can I send it over to you to talk about one specific example of how spectatorship is changing? One of the things that's taken up a lot of my time, which I'd offer to the group: I'm someone who has been doing site-specific immersive theater for longer than I care to admit. I mean, as Melissa said, all these new possibilities are a dream. But as a curator of and an advocate for site-specific immersive content and online content, I find the game is changing really, really quickly. It's always been a struggle. Well, not a struggle; it's always been a wonderful challenge to advocate to non-theater people about the viability and the magic that's possible creating immersive environments using theater. That's always been challenging.
Now that virtual reality is upon us, and thank goodness, I feel like there's more understanding about what an immersive reality is. I don't have to explain what it is as much, but it's not necessarily easier just because people understand what the environment is like. There's a different type of advocacy; there's a different type of resources that you need to get people into this space. Everyone needs technology. So as a curator and an advocate for this type of work, things are accelerating faster in the last three years than they ever have before, and it's an interesting, challenging, beautiful time. I would just like to offer that, if I could. Absolutely, especially from a curatorial and festival-based model, seeing how different people are coming up with different experiences. We'll definitely dive more into that as well. Raven, can I throw it over to you? Oh, sorry. I wanted to say one last thing. Sorry, Raven. One thing I wanted to add: in addition to virtual spaces, I also think about Zoom and conference calls. I've been watching a show on Apple TV Plus called Mythic Quest, and they did an entire storytelling episode just through Zoom calls, text messages, and conference calls. And because our society is so based on these kinds of forms of communication, as a viewer, you can understand it completely. So I think that's a really interesting way of portraying art and storytelling as well. Thank you. Absolutely. Raven. It's hard for me not to get distracted. For myself, I've been really interested in the different ways that visual languages change when engaging audiences through a virtual format. Like, for instance, here we have the telltale sign of the user hand on an Oculus, and then I can see in the audience here that we have a bunch of 2D PC users whose automatic hand placement is at their side, as well as maybe a couple of Go users where only one hand is capable.
So I'm really interested as an artist in some of the ways that we are able to communicate, but also the things that we're able to learn through the way that we represent ourselves, be they purposeful or not. Absolutely. Thank you. Thank you. We'll come back to a lot of the performance of self and identity and hardware constraints. Jacob, can I throw it to you? Yeah, sure. I think one of the perspectives that I've found really useful in trying to understand what's going on right now is actually a historical perspective. It helps me because it's hard to separate my personal discovery of technologies and my personal discovery of new modes from the wider stream of how those show up in culture. The writer Douglas Adams once said that any technology that existed before you were born is a totally natural and normal part of how the world works; any technology that's invented between when you're born and when you're about 30 is tremendously new and exciting, and you can probably get a career in it; and any technology that's invented after you turn 30-ish is dangerous and frightening and will probably be the end of the world or the end of culture. So one of my grounding precepts is not to look at new things and treat them simply as new things. Early in my career as a choreographer, I spent time studying Loie Fuller, an American dancer who became most famous in Paris. She actually came of age as a performer during the introduction of electric lighting and the switch away from gas lighting. If there are academics in the crowd, I apologize; it's entirely possible I am fudging dates here accidentally. So she was an example of somebody who developed her performances based on ideas of how to use new technology. And that, I feel, is the spirit that I'm carrying forward. But for me, it always helps to look at those changes in retrospect and to take a historical perspective.
And I think that can help destabilize the perspective enough that you can then find useful things to do, useful toys to interrogate, and useful gaps to work within. Absolutely. That's a brilliant segue, actually, Jacob, to the second part of that question. Are there any forecasts that you'd like to make, whether they turn out correct or not, about how spectatorship will change? We've heard a lot about how venues and festivals and expectations around performance are changing. We've heard how the attention economy is being built on top of gaming, as Raven brought up, and from you about the histories of the performing arts as well. So what do you think? Well, I mean, the first thing that seems pretty clear, and has for a couple of years, is that the disciplines of video game creation and theater creation are merging in a substantial way. That's usually where I try to do a Venn diagram with my hands, and I can't quite make that happen. But yeah, I think those disciplines are starting to cross over, and have been for a little while. If you look at the work that immersive theater makers do, we make levels. We're level designers. We can translate those roles across disciplines quite easily. I think that there's going to be crossover in terms of tools and in terms of practitioners. And to be honest, I think that the indie gaming community and the immersive theater community actually have a lot to talk about and a lot to learn from each other, without even going to higher production values or higher levels of that creative hierarchy. Amazing. Thank you. Liam, can I throw it to you? Yeah, I totally agree. I mean, I'm a dirty little thief with my artistic practice. You know, when I was 22, I was like, there are video game design textbooks; let's get some and start reading them, because those people know what they're doing. They have money and resources, and people have written it down, yeah.
So I think that convergence is definitely something that's happening. One of the things that I'm very interested in artistically, with the way things are going, is that I would posit we're increasingly hungry as audience members for customized experiences: things that are about us, things that revolve around us. We want Wonderland to see us and look back at us. And that's great; it's fascinating and I love that stuff too. However, given the increasing fragmentation of our society, politically and artistically, and the customized content that we all want, I'm really interested to see, forgive me if I'm being critical of society, everyone, what happens when that narcissism smashes into the community that's at the heart of live performance and theater. They're two diametrically opposed forces hitting each other, and all of us are going to end up in the middle of that, and I don't know what's going to happen, and I'm excited, because I feel like they're two strong opposing forces. So I would offer that. Thank you. Thank you. Moving over to Melissa. Hey, yeah, I agree a lot with what Liam said about the increasing narcissism in our society. We're seeing it from the top down; I'm not even engaged in US politics, but I can't avoid it at this point. I know I keep talking about this show I've been watching, Mythic Quest, but it's got really great insights on what's going on in entertainment and communities. For example, there was one episode where there was a hate group online, and, you know, how do you get rid of them? And they said, well, we can't get rid of them, so we're just going to put them on their own server. And it seems like this is actually taken from real cases. I think it happened with Grand Theft Auto, and my brother recently told me that there were a bunch of cheaters in Fall Guys.
So they put all the cheaters in one server too. With the fragmentation that Liam is talking about, is that what we're headed toward? Like-minded people being in one space, other people in their own space, and they don't conflict? Like Liam said, he's excited, but I have to say I'm also a little bit nervous. But in this space, I think I can say that we're all like-minded people. We're very open, we're very artistic, we like expressing ourselves in positive, meaningful ways, with purpose. So I guess all we can do is keep going, keep each other informed, and keep supporting one another. Totally. And for a little bit of context, if anyone doesn't know Fall Guys, it's a battle-royale physics-based game that you should definitely check out, just because it was, and maybe still is, very popular. Raven, can I throw it over to you? Posit for the future? Yeah, I'm generally pretty hyped about the future, and part of that is gathering with my community, at least digitally, and focusing on those beautiful things that we all have within our practice. I'm really interested to see the way this space develops. I've been heavily immersed in it as I familiarized myself coming up to the symposium. Right now we're looking at the way the software of this kind of space develops, but the hardware competition between different headset types is really interesting as well. Altspace is focused, from my understanding and from talking with a lot of people, on unifying everyone's experience, which has meant basically nerfing a lot of the functions that we used to have. For instance, we used to have elbows, and we used to have legs.
The graphics were not as good, but apparently because of those changes, as well as some of the changes to make this more accessible on Mac, a lot of the functionality has been pared back to make sure that everyone has exactly the same kind of experience; but that is reduced functionality. So I understand wanting everyone to have the same experience, but then we're also kind of not making space for the further development of the technologies that are influenced by the use of programs like this. Really, really interesting. I think I'm picking up on a few parallel trains of thought here. Coming back, again, to the more theater-performance-heavy side, Liam with the Kick and Push Festival: the way that The Audience, capital T, capital A, has often been talked about is as one kind of hegemonic group that we can tailor things for. But I think, especially with a lot of the gaming cultures that you bring up, Melissa, and with tailoring, algorithmic cultures, we are going to start seeing more and more individualized experiences within the audience, all held within that container. That is a really interesting but also scary prospect to consider: that these algorithms are going to be making decisions for us about what we see. But perhaps it's just automating a curatorial process that we would otherwise normally see. Just before I forget, there's an excellent talk by Jesse Schell, part of the Game Developers Conference, that's on YouTube, where he talks about how our games are going to start listening to us, and the game is going to become more about our relationship to it as a virtual companion rather than something we play. So those are all exciting propositions for what's going to happen in the future. I was wondering, to open it up, if there are any thoughts or comments you'd like to make about each other's propositions. Yes, Raven?
Oh, I was just going to offer up, in that reference, that we've already seen a lot of that within the gaming community, specifically dating simulators, where the choices that you make affect what opportunities you have in the game, be they romantic or in furthering any kind of ongoing storyline. Definitely. And the data that's collected could be really interesting for anyone who's been watching Westworld season three. No spoilers. Yeah, any other thoughts that came up for you? I guess I have something to say. One thing that I was thinking about: you're saying that the algorithms are curating content for us. It is a good and a bad thing, because I feel like we're seeing a saturation of content; there's just too much content out there. Words, images, videos, streams are happening constantly on the internet. It's kind of exponential, really. But at the same time, I remember when I was younger, growing up, I'd go to a library, and there'd be a whole bunch of books, and I could choose whatever world I wanted to explore. So I kind of worry that we're not going to have that same kind of library and choice to see what's out there. Instead, it's the algorithms choosing what we see, and I don't get to see the full array of choices in content. That's a, oh, go ahead, Raven. Oh, yeah. And that definitely ties into who holds those actual libraries, speaking directly to book banning, as well as books that are only available to people above a certain age. And then, of course, we can reference what's going on in China and the hoops people have to jump through to be able to access Facebook, Instagram, et cetera, because their digital libraries and servers prevent access to those things. Jacob, did you want to jump in? Yeah, I guess for me, I don't know if any of you feel this way, but I still have a bit of an allergic reaction, like I should be putting a quarter in a jar every time I use the word content.
And I think having sort of sat with that and thought about that and talked about it with people, we didn't really need that word before the internet. But in the same way that the fundamental ethic of computer hardware is that it is general purpose and can do all the different jobs, when that hardware started to get caught up with software and networking, suddenly we actually had a series of tubes through which anything could go. We used to have disciplines. We used to have, I'm a writer, I'm a photographer, I'm a photojournalist. But because we had pipes that could accommodate anything, we needed a word for anything, and so we picked content. So I think that there's a bit of a historical collapse of disciplines within that word, and of media within that word, but I think also those sorts of collapses tend to be temporary, because we tend to come up with new vocabulary to reflect sort of the world as we experience it. So totally. I'd love to throw it over to Liam and then quickly after that throw it open to the Q&A. One of the things that I'm really interested in right now is the responsibilities of care. As people creating immersive projects in this new world, what are our responsibilities in terms of bringing people into virtual environments? And as we live in a world that's more run by algorithms and everyone's getting more of what they want, like, Facebook, for instance, is designed to give you what you want. It's designed to feed you what you've already decided is delicious. What does that mean for us as creators of environments? Is the public less prepared for being brought into a space where there's something that they don't agree with? Because when they spend all their time on social media, they're getting fed with what they like. What are those mechanics? 
And I'm sure this has been talked about elsewhere, but the big players, as someone who's involved with the conference, I was involved with securing the hardware. And there's very few players. And there seems to be one player that's kind of taking over. So yeah, it'll be interesting to see how that plays out. Anyway, that's it. Thank you very much. Yeah, of course. And I guess by players, you mean like different companies? Yeah. It feels like I'm dating myself here, but, oh, let's go for it, it feels sort of like VHS and Beta. You can watch the giant companies making moves to corner the market. It's happening right now. It's so obvious. So whoever the winner ends up being, you know, it's interesting. Right. On that note, I'm going to stand over beside Jesse right now, and I'd like to open up to Q&A. So if you do have a burning question or something that you'd like to offer up, please just raise both your hands or send up one of these emojis or a happy face, one of those two. Just let us know. Yeah, opening up for any questions you might have. Yes, Frank, right at the front. Let me just amplify your voice real quick. All right, Jesse is on it. And I'll just dance while this is happening. All right, Frank, the floor is yours. Oh, thank you so much. First off, let me thank everyone here. It was a really great panel. I had a question for Raven, actually. So I'm also an uber nerd. I love D&D. You know, I haven't quite gotten into LARPing just yet, but I'm interested in your perspective on using virtual reality for something like live action role play. Do you think, or even just regular tabletop D&D, do you think it takes away from kind of like the imagination part of it? Like, I'm trying to get my group to go into Tabletop Simulator right now and they're having a hard time because they're saying, well, I like to imagine things. How do you see that? 
Do you think it's improving it, or is it actually taking it back a little bit? Well, it's all about language, right? So I have an indie group where we utilize Zoom and a program similar to Roll20. So we're all in a chat and over audio. One of our players is someone who has a lot of tech work during the day, so they're constantly looking at screens and then have a hard time with that. And the system that we're using is similar to Pathfinder, so it's very interpretive: we have our own stats, and the storyteller makes the world collaboratively, and it's more done over audio. We're talking about stuff, so the visualization of maps is rare. Sometimes we'll throw up an image of a map and say, you're in this approximate location and your goal is to the north or south. So I'm interested to explore a couple of different worlds here, and a whole lot of them are spaces that are much larger than anything I've made. I've only made one world here. But I know that editing while there are people in the room can really slow down your experience. But what I'm wanting to do is load up most of the world and edit it while people are immersed in it, so that I can create collidable walls so that they're limited in their movement. There are virtual dice, which is exciting, but also being able to bring in assets live would be a lot of fun. But again, you're still coordinating people to be there, and you have to deal with the technology overlap. Most of these people don't have headsets, so a few of them will be in 2D, which is going to definitely change their experience. Right. Thank you. Thank you, Frank. Thank you, Raven. Going back to the audience, are there any questions or comments that we'd like to put forward? Raising hands or emojis. Yeah, was that a hand? No, it was not. You were just stretching. Okay. Okay. TPM Eric, do you have a comment or question? And I think Jesse is on the amplification. All right. Thank you. 
So something that I'm familiar with a lot in regular theater is the idea of masking and kind of hiding the behind-the-scenes. And I kind of peeked around a little bit in the main PXR hub and I already fell off the world once. Like, how does that, do you have opinions or observations about sort of the masking and seeing behind the world or outside the games? Like, there's lots of reaction videos of people breaking Skyrim. Is there a perspective on that in VR as well? Yeah. Is there anyone in particular you'd like to direct that to first, or just open it up for anyone? Open question. I have something to say on that. I think with VR in particular, one of the funnest things about it is you're exploring this kind of weird world that's not yours. And I feel like people like playing in the sandbox. So I have a feeling that, even though you can fall off the world, I kind of liked seeing behind the scenes, because I almost felt like, oh, this is an experience that I discovered on my own. And even though it's a little silly falling off the world, and some people would describe it as a bug, I still thought it was fun. I liked it. That's all. I think I have a lot of the same, oh, sorry, I think I have a lot of the same natural responses. Like I was saying to my colleague Amanda, who's here with me, it probably took me like 15 years to just learn and understand how to behave and act like an adult in public. And in my first five minutes of this conference, I managed to climb the grid and fall off it, taking attention away from the introductory speech. And I really didn't notice. Like, there's something that, you know, I enjoy, as you say, testing the edges, figuring out what the rules are that you can dance between. So I think there's a liberating effect in new spaces that makes that more likely. Thank you, Liam. Any other thoughts from the panel? 
I just, if I may, just because Jacob referenced Sleep No More. I remember the first time I saw it, I saw the crowds going one way and I deliberately ran the other way, and I was just opening every drawer, like, just how deep does the rabbit hole go? How deep does the rabbit hole go? Just looking for that. And I found that impulse that I have, judge as you will: wow, this is paradise. Like, just where does this go? What's going on? Just push, push, push, push, push on the edges of the world and so forth. There's a YouTuber I really like. I think his name is Curio. And he was talking about immersion in games and whether games owe us immersion. And yes, it's something that you already do when you go to theater. You don't go, oh, I saw them enter from off stage right, and then there was a shadow and I could tell that they were moving on the stage to go to the other side and then come out there in another place. Like, you just go, and you expect that you are going to be in that kind of new visual world and landscape and that there could be a story written in it for you. And I think it's the same in VR. Since you can just drop in and out of spaces very easily, I'm still learning how specifically events like this are run and what you can and cannot control. I made some artist friends here and saw, oh, they're online, and dropped into the same world that they were in, and it just happened to be their own rehearsal for a performance that they're planning to do in VR. So that was like an automatic backstage moment. Yeah. Thank you, Raven. Jesse, how are we doing on time? Do we have enough time for like one more short question? Or do we have to wrap up? Well, we're actually bang on two right now, but I would say we have time for one more question. Okay. Is there one more burning question in the room? Otherwise, I do have a few closing things I could definitely say. I'm going once, going twice. Okay. 
In the back, in purple, we have one of our PXR volunteers. Come in with your question. I have a question. So first of all, thank you guys for doing this panel today. My question is that, so we know that, before you go on here, can we just make sure that you're amplified? Jesse, you're good to go? Okay, cool. Hey guys, thank you so much for the panel today. My question is about VR and medical training. What do you think, like, how effective do you think VR would be for medical training, or any sort of training in general which in real life has real stakes, for example, pilot training, medical training, soldiers, for example? Just, anybody have any thoughts on that? I might, maybe just a quick one, is that if we think about VR and theater, which we're kind of doing here together, VR is a way to simulate and a way to role play. And to me, it's just interesting to note that that already is a big part of medical training. I know actors in the States who, you know, pay their rent by pretending to be patients for doctors as part of their medical education. And so I think that, again, that's part of a continuum, and it feels like the technology is just a new extension of, or a new way to realize, that. But that kind of activity is already a part of how medicine is taught. I would agree. Anyone else from our panel? It's funny that the initial question is about the medical world and using this kind of space as a simulator. Also, one problem within an online space: I think it's more cultural than it is the interaction with the technology in itself that keeps our worlds humanized. One of the main things I've noticed is, in most of these VR apps, one of the warnings or initial messages that you get, or on the loading screens between virtual spaces, is that behind every headset there is a person, and we should be thoughtful of those people. And I think that's because it's not that we lack a technological understanding that another person is behind an avatar. 
But rather, culturally, we're taught to dehumanize people within virtual spaces. And I think that's a big problem within the gaming industry overall, and I think that's kind of where it started. But as well, we see that online, even just on Facebook, et cetera, people are willing to troll and dehumanize because of that screen. Yeah. Those are my thoughts. Thank you. We're definitely at time. So I just wanted to wrap up by saying a few things. First and foremost, thank you to our wonderful panelists, Raven, Jacob, Liam and Melissa. Please give them all the emojis and thumbs ups and waving of hands. This is the first of four panels that we have for the whole conference. Please stay in touch. We'll definitely be around. So don't be afraid to reach out. And I would implore anyone who found anything interesting or has anything interesting to add to the discussion to just add it to the Discord. I'll certainly put up the link to that Jesse Schell talk that I was talking about. There were definitely calls for being more critical about the kind of spectatorship within the performing arts sector, like we're seeing in the gaming sector, and for being more thoughtful about the kind of algorithmic cultures coming out. I think this is a very ripe time to be having these kinds of discussions. But we are unfortunately out of time. And of course, that's not enough time to delve deep into the large discussion that is, what is spectatorship right now? But this was just the tip of the iceberg. So please have a great rest of the conference. We'll be around to talk more. Make sure to take breaks and keep your sugar levels high and drink lots of water, please, everyone. Otherwise, thank you so much, and we'll see you at the next event. A little applause emoji. Yay. Emoji, emoticon, what's the difference? I guess Apple owns one. I don't know. If you could see my shoulders, they're shrugging. All right, great. Welcome to our Unity Technologies presentation. 
As a quick technical front-of-house speech, I'm just going to say that PXR 2020 would like to begin by acknowledging the support of the Canada Council and the Digital Arts Strategy Fund. I'd like to acknowledge that I am speaking to you from the stolen and unceded territories of the Musqueam, Squamish, and Tsleil-Waututh nations here in East Vancouver, B.C. If you're starting out as an XR creator, you should be familiar with Unity. And as most of us were laid off this summer, you've probably already downloaded it and started playing with it, in which case, thumbs up to you. It's great. Smoldering his way in from San Francisco, we have Dan Miller here to my right. Dan Miller is from Unity Technologies, and he'll show you how Unity can be leveraged by creators to create interactive experiences for new audiences. Now I'd like to hand it over. Dan, take it away. Thanks so much. Wonderful introduction. Go ahead and amplify my voice there. So yeah, I'm really excited to be here. Thank you so much for the invite, and I hope everyone takes away a little bit, learns a little bit from my presentation. There is going to be some time for Q&A after, so you can hold some questions there. I will be around for a little bit, but I have a lot to cover, kind of doing a bit of a broad approach, diving in a little bit on how to get started. So let's go ahead and do this. Yeah, my name is Dan Miller. I do post a lot of content. I'm fairly active on Twitter. So if you're interested in some AR and VR content in Unity, you can go ahead and follow me there. But let's just dive right in. So the agenda for today is we're just going to be giving a high level overview of Unity's XR platform, some of the updates and things that we've noticed and really what that enables, which are things like the XR Interaction Toolkit. This is a toolkit for getting started with AR and VR. We're going to be covering a little bit on the VR side today. Next, I'll dive into some XR creation tools. 
So some tools actually built with Unity for creating, as well as some other ones. I'll talk a little bit about where to get those, how to get started, and some of the capabilities. From there, I'll give a bit of an overview of 360 video, how that can be used, and then I'll dive into spatialized audio. And again, we'll have some time at the end for Q&A if you have any questions. So really, Unity as a whole, we are a platform for creating experiences. And we've had a lot of success on the gaming side, with things like Job Simulator, Beat Saber, and Pokemon Go. And more recently, we've been really expanding outside of gaming. So we're looking at things like industrial applications, whether that's visualizing something in AR on the job site, or looking at a piece of furniture with IKEA Place before you buy it. And on the creation side, there's a lot of different applications in VR that can actually be used for just creating in general. A lot of these are in the 3D space, and you can use VR to then create VR. So I'll be talking a little bit about how you can export content, some of the features and capabilities of some of these apps that you see here, like Tilt Brush and Gravity Sketch. So one thing we've noticed at Unity with the augmented and virtual reality platforms is that there's a lot of shared features between them, when you look at things like input, device tracking, and different rendering and things like that. And we've also noticed that they continue to advance and release new features. I'm sure a lot of you are familiar: in the AR and VR space, you're constantly seeing new platforms, new features, new capabilities. And so it's been a challenge in the past to keep up with some of that. So what we've done on the Unity side is we've broken this down into three categories. First, we have the actual providers themselves. So these are the specific APIs for the specific platforms. 
On top of that, we have what we call the Unity developer layer. And this is an abstraction that allows you to build once and deploy anywhere. So you're not looking at things like the Oculus-specific controllers; instead, you can look at VR controllers in general. And lastly is packages. So the way that all this content is actually delivered to you is through a modularized system within the Unity editor itself. This allows us to release updates, new features and new versions at a much quicker rate than we could in the past. So you don't have to wait for an entire update of the Unity editor; you can just grab that latest package update, which we can ship a lot quicker. And so at a high level, it looks a little bit like this: we have what we call this XR SDK that sits under the core engine, we have those platform providers, and then we expose that development layer that you can then build against. But that's not really a core part of this talk. I just wanted to set the stage a little bit, talk about how we're keeping up to date with some of the platforms themselves, and really what this offers or what this enables. So a big part of that is the XR Interaction Toolkit. This is a cross-platform toolkit that we're developing here at Unity. It is available now in a preview state, and it covers both the AR and the VR side. Right now they're kind of separate entities, but there is work to merge those together and make things a little bit more seamless between platforms, as well, as we start to see different platforms cover similar features. So as mentioned, this is available in preview as a package. So if you enable preview packages in the Unity editor, you can find the XR Interaction Toolkit. And really, when we think of the different toolkits out there, as well as a lot of the different platforms, we're trying to provide some of this base level functionality. 
And what do I mean by that? Well, these are things like physics interactions: being able to grab and throw objects, having them track to your controllers, and interacting with and manipulating them at a distance. We're also looking at things like native UI support. This has been a challenge in the past; there are lots of different systems for how you interact with the UI, using things like rays, and you see a lot of the platforms themselves do this at the OS level, if you're looking at the Oculus shell, for example. Well, because we also develop the UI system, we're able to create a system that allows you to interact with just world-space UI. These are things like buttons, text input, toggles, et cetera, through this toolkit at a distance there, using what we call ray interactors. And last but not least is locomotion. This is the ability to teleport around, just like the features we have available here within Altspace, where we can move around, and you can do things like snap to turn, et cetera. If you're interested in the XR Interaction Toolkit, I highly recommend checking out this escape room project. This is a project developed by our Learn team; that's the education team that does a lot of this tutorial content, as well as builds up some of these sample projects. So this is actually available on our Learn platform. It walks through the different steps of how this project was put together. And at its core, it's a kind of stylized escape room where you're in a little environment, you're solving puzzles, and then you can escape the room. But the cool thing is, as you can see here on the right hand side, there's also an additional scene here of some more realistic props. So with these props, you can do things like turn the hourglass over or manipulate a drill. 
And really, this serves as a learning base, or a learning project, for how you can start to understand how to create some of these more complex interactions using the XR Interaction Toolkit. So I highly recommend checking this out. You can search "escape room VR Unity" or "XRI escape room" and this will come up through our Learn platform. There's also a blog post that dives into some of the content there as well. So now I just wanted to give a quick overview of what it looks like to get started with XRI. This will give a high level understanding of some of the different pieces and how they fit together. So here I am in an empty Unity scene. And the next thing we're doing is enabling these preview packages. I talked a bit about what packages are at the platform level with the different VR supported platforms, but really, this is where a lot of our functionality is moving towards. And one thing we've noticed is that some things aren't necessarily, let's say, fully production ready. So the XR Interaction Toolkit is still what we call a preview package. There's a caveat there: consider that there might be some breaking changes, there might be some updates to the API as it continues to get developed. But right now, you can get started, and we are also looking for lots of feedback on it. So once you enable preview packages, then you can just search "XR" in the package manager itself, and you'll find the XR Interaction Toolkit there. So you can go ahead and install that. The next thing you're going to do, and this is also an inherent part of the package management on the XR platform side, is the XR Plugin Management. This is a package in itself, and this is how you enable and tell the Unity application, as well as your builds, what platforms you're building for. 
So once you've enabled that, then you're going to get a list of these included Unity packages, or Unity platforms. So you can click something like Oculus, and that's actually going to auto-install the Oculus package for you. One example here with the Oculus package: at one point, we had a separate package for the Oculus Quest as well as the Oculus desktop. Now it's unified into a single package here. And so once you've enabled XRI and you've installed that package, as well as enabled the plugin management, then you're going to get this additional menu in what we call the create menu. So when you're inside Unity, you can right-click within the scene hierarchy, or you can hit that little plus button up there at the top, and that's going to give you a menu of things like primitive objects, and also this XR tab, which is now available once the package has been installed. So here's where you're going to get things like your, what we call a room-scale XR rig, or a stationary XR rig. And the difference there is, how is your content tracked? Are you making a seated experience? Are you making one that has full, what we call, six-DoF tracking? And here you also get things like the teleportation area and the different ray interactors there. So if you notice up here in the upper left, this is the structure of what the XR rig looks like. It has the camera offset, where all the content lives underneath it; you have the main camera, which is what's driving your display there; and then you have the two hand controllers as well. We also have these, what we call, generic controller models. These are great for building applications for different platforms, where they can work for things like the Index controllers, or the Oculus controllers, or Windows Mixed Reality. 
And really what you're doing here is, I've adjusted these, and you're going to basically put them as child objects under these controllers. And you can assign them as prefabs as well, to dynamically spawn there. One thing to keep in mind with controllers: you've always got to understand where their pivot point is; things like that don't necessarily line up between platforms out of the box. The other small note I want to mention is, right now when you create an XR rig, it's going to have a near clipping plane on the camera itself. That's basically the near plane where content gets clipped. And you should adjust this down to the lowest value here, so 0.01. That's what you typically see in a lot of VR games and applications, where you can put your hands fairly close to the camera before objects get clipped. So the next thing, once we've set up these controllers and assigned this, now we can basically hop in. You'll have visuals for the controllers there, and you just have what we call these ray interactors associated with your controllers. From there, I've just scaled out the floor; we're making just a base environment to move around within. And then we can spawn the locomotion system and the teleportation area. Two things to keep in mind with the locomotion system: it's basically going to spawn as a large area here. This is where you have things like the mesh collider, which determines which area you can actually locomote or teleport to. So if you had some kind of restriction zones, you could also make just a smaller zone for that as well. And one thing to keep in mind with the locomotion system is you do need to assign the XR rig. That's a field within the system that you spawn itself, so you can just drag in that XR rig that you've already set up. 
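To make the teleportation-area idea concrete, here is a conceptual sketch in Python. This is not Unity's actual implementation (which uses mesh colliders and the toolkit's C# components); it just illustrates the underlying check: a requested teleport destination is valid only if it falls inside the allowed floor region, modelled here as a 2D polygon.

```python
# Conceptual sketch: is a requested teleport destination inside the
# allowed floor region? Uses the classic ray-casting point-in-polygon
# test on (x, z) coordinates. Illustrative only, not Unity's code.

def in_teleport_area(point, polygon):
    """Return True if the (x, z) point lies inside the polygon."""
    x, z = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, z1 = polygon[i]
        x2, z2 = polygon[(i + 1) % n]
        # Count crossings of a horizontal ray extending to the right.
        if (z1 > z) != (z2 > z):
            x_cross = x1 + (z - z1) * (x2 - x1) / (z2 - z1)
            if x < x_cross:
                inside = not inside
    return inside

# A 10 x 10 square floor with one corner at the origin.
floor = [(0, 0), (10, 0), (10, 10), (0, 10)]
print(in_teleport_area((5, 5), floor))   # destination inside the area
print(in_teleport_area((12, 5), floor))  # destination outside the area
```

A restriction zone, as mentioned above, would just be a second, smaller polygon checked the same way.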
And then here it also comes with what we call a snap to turn provider. So this is a way where you can use the joystick, like what's available here right now, and we can just continually rotate around by just moving that. All right. And this is that base XR ray interactor that I was talking about. So one of the interesting things here is we have what we call this line type. By default, it's just going to be a straight linear line, but we can also change this to a Bezier curve. And what this allows us to do is what you see typically used in a lot of these teleportation or locomotion systems: it's this curved arc, which makes it a little bit easier to aim and manipulate as you move around. So that's an adjustable parameter on these ray interactors that come with the locomotion system as well as the XR rig itself. And here's just a little scene I put together, just various primitives that you can move throughout. At one point, this was a GIF or a video of me just teleporting around, but this is what you can see here. It's a little bit subtle, but the green ray there on the left I have set to a curve, and you can adjust some of the parameters there and basically manipulate it and move throughout the space. All right, so that was just a high level overview of the XR Interaction Toolkit. Hopefully a couple of takeaways to think about: how you set up your scene, you get the XR rig, you get the locomotion system, and how easily you can get up and running with just moving throughout your space and stuff like that. Now what I want to do is just shift focus a little bit and talk about some VR creation tools. So obviously Unity itself as a platform can be used to create VR experiences; Altspace here is created in Unity as well. 
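The curved arc on the ray interactor's Bezier line type can be sketched like this. This is a hypothetical Python illustration of the concept (the control-point choice is my own, not Unity's exact parameterization): sample points along a quadratic Bezier from the controller toward a spot on the floor.

```python
# Conceptual sketch of a teleport arc: a quadratic Bezier from the
# controller position, over a raised control point, down to a floor
# target. Illustrative only, not Unity's actual parameterization.

def bezier_arc(start, control, end, samples=16):
    """Sample 3D points along a quadratic Bezier curve."""
    points = []
    for i in range(samples + 1):
        t = i / samples
        # Standard quadratic Bezier: (1-t)^2 P0 + 2(1-t)t P1 + t^2 P2
        point = tuple(
            (1 - t) ** 2 * s + 2 * (1 - t) * t * c + t ** 2 * e
            for s, c, e in zip(start, control, end)
        )
        points.append(point)
    return points

controller = (0.0, 1.2, 0.0)  # roughly hand height
target = (0.0, 0.0, 4.0)      # spot on the floor a few meters ahead
control = (0.0, 2.0, 2.0)     # raised midpoint produces the downward arc
arc = bezier_arc(controller, control, target)
print(arc[0])   # (0.0, 1.2, 0.0) -- starts at the controller
print(arc[-1])  # (0.0, 0.0, 4.0) -- ends at the teleport target
```

Rendering those sampled points as a line strip is essentially what produces the familiar curved teleport beam.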
But I wanted to talk specifically about some tools for more kind of art or content creation. The first one is Tilt Brush. This is really one of the most common ones, or really one of the early creation tools. It's actually developed in Unity as well. And it allows you to draw in a 3D space with a lot of these more 2D-like brush strokes that you can then start to layer up. Now, there's a lot of really cool additional tools and functionality for how you make your content, scaling around, making these really large scale scenes. But one of the things that I think is most interesting is there's actually a toolkit specifically built by the Tilt Brush team, who were acquired by Google, and it allows you to export the content you've created in Tilt Brush and directly import it into Unity. Because the app was made in Unity, they're able to create a bridge there. And it carries across a lot of these things like the special shaders and the special rendering of how the content looks in Tilt Brush when bringing that directly into Unity. So I've seen some people create entire short films, or do some early iteration or prototyping, using Tilt Brush, bringing that into Unity and then adding some additional functionality. There are a lot of tutorials on YouTube available for this, so I definitely recommend checking those out on just how to get started, how to start creating with Tilt Brush, some pro tips and things like that. The next one is Gravity Sketch. This is also created in Unity. This is available on some PC platforms as well as on the Oculus Quest. I should note that Tilt Brush is available on the Quest as well. And this is more of what we call NURBS or subdivision modeling. 
So here, rather than doing certain strokes, you can create more generalized geometry, although it's using NURBS, not vertices. This has actually been used heavily in things like automotive or product design, where they're able to iterate on different ideas. There are also some interesting things with co-creation. They currently have a closed beta where you can have multiple people within the space, all creating at once. So that's pretty interesting. And they have a full product around this VR creation tool itself, and they also have some support for 2D creation with things like Wacom tablets or iPads. So next is Microsoft Maquette. This is again built in Unity, still in beta. What it is, is it's more of a 3D or spatial layout tool. It allows you to create a lot of these primitives, build up certain content, change colors, change scale, and really mock up these different scenes within VR. You can see here it's used in both an AR and a VR context. And you can again import the content you've created in Maquette directly into Unity, so you can really streamline that process when you're ready to do things like adding additional functionality. So the next one is Medium by Adobe. This was actually originally created at Oculus; Adobe acquired the team that develops it, and now they've recently re-released it as Medium by Adobe. It is only supported on the PC Oculus platforms. But this is really looking at what we call digital sculpting. So where before you were thinking about different paint brushes or strokes, this is really thinking about manipulating something more like clay. You can add, adjust and make different geometry here. And they do also have some export functionality to pull out, in this case, OBJs. 
From Medium itself, you've seen some really high-fidelity sculpts; you can see one in just the screenshot I have here. And a lot of people have actually sculpted something within Medium and then gone and gotten it 3D printed, really turning the digital into the physical. There are also things like vertex coloring, so you can color your creations as well. All right. So that was a little bit on the tool side, on how to start creating content in this 3D spatial space. Now I just want to talk a little bit about 360 video. If you're not familiar, 360 video is a format of video that allows you to be completely immersed within the video itself. There's both stereoscopic and flat 360 video. And what I've seen a lot, especially in the performance space or things like film festivals, is people starting to use 360 video as a layering technique, where you might have a really high-fidelity, really nice video of, let's say, a sunset, and then start to layer different 3D content into that. You use the video as what we usually refer to as a skybox, which is the edge of what's rendering around your space. So in Unity, we've supported 360 video, or just native video, for a while now, ever since 5.6, which was back in 2016. We support 4K and 8K video, which is what you need when you want that higher fidelity, especially in things like VR, and it is hardware accelerated on platforms. The codec right now is H.264, which offers the best compatibility between platforms. And the video can be played back in different ways. When you're talking about just normal video playback, you can do it on something like a camera plane, so it's a full-screen video for the user.
You can also do it on a render texture, a material texture, or really any texture within a material or a shader. In the 360 context, you're usually looking at something like a material texture or a render texture that you then apply to a shader for the skybox itself. And when thinking about these videos, especially VR 360 videos, they're going to get really big. So there are different ways to handle your video sources within the Unity application. You can embed the video within your game, which compiles it into the binary itself. You can also load it remotely from a URL, so you can stream it in from an externally hosted server. And you can stream it in within Unity itself: there's something called the StreamingAssets folder, which does get shipped alongside your application but doesn't have to be fully initialized or loaded at the start of your application. And then last but not least are asset bundles, or Addressables. This is a way to dynamically deliver content to your runtime application, heavily used on mobile devices, where they ship you a very small binary and build it up over time with different levels and things like that. All right. So the last thing I want to briefly mention is spatialized audio. This is really important for a lot of immersive applications: thinking about where the audio is coming from and how it sounds really ups the immersion. Within Unity, we do have generic 3D audio, and there are settings within the audio source itself. Now, if you don't include what we call a spatializer plugin, it will basically just take the position of the audio: if it's on my right-hand side here, I'll hear it out of my right ear, and if it's over here close to my left, I'll hear it out of my left ear.
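Pulling together the 360 playback pieces above (a StreamingAssets source, a render texture, and the skybox), here is a minimal sketch of what that might look like in Unity C#. The file name `sunset_360.mp4`, the RenderTexture asset, and a material using Unity's built-in `Skybox/Panoramic` shader are assumptions for illustration; this runs inside a Unity project, not as a standalone program.

```csharp
using UnityEngine;
using UnityEngine.Video;

// Sketch: stream a 360 video from StreamingAssets into a RenderTexture,
// then show it on the scene's skybox via a Skybox/Panoramic material.
public class Video360Player : MonoBehaviour
{
    // Assumed assets, assigned in the Inspector:
    public RenderTexture targetTexture;   // e.g. a 4096x2048 RenderTexture
    public Material skyboxMaterial;       // uses the "Skybox/Panoramic" shader

    void Start()
    {
        var player = gameObject.AddComponent<VideoPlayer>();
        player.source = VideoSource.Url;  // streamed, not compiled into the binary
        player.url = System.IO.Path.Combine(
            Application.streamingAssetsPath, "sunset_360.mp4");
        player.renderMode = VideoRenderMode.RenderTexture;
        player.targetTexture = targetTexture;
        player.isLooping = true;
        player.Play();

        // Route the video frames to the skybox so it surrounds the viewer.
        skyboxMaterial.SetTexture("_MainTex", targetTexture);
        RenderSettings.skybox = skyboxMaterial;
    }
}
```

From here you can layer 3D content in front of the skybox, as Dan describes for the performance and film-festival use cases.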
If you do include a spatializer plugin, on the other hand, you'll hear it out of both ears; you'll just hear it much louder out of whichever ear it's closest to. So without the spatializer you just get simple panning from one side, but with the spatializer, if the source is over here on my right-hand side, I'll hear it mostly on the right, but a little bit on my left as well. And that really allows for full spatialization. So off the bat, we ship with Microsoft's HRTF spatializer. This works on the Mixed Reality platforms, allowing you to go in and lay out your spatialized audio. There's also the Oculus Audio spatializer, which works on the Oculus-supported platforms; again, you're laying out different settings and conditions for your audio, like what size and type of source it's coming from. And then last but not least is Resonance Audio. This is developed by Google. Originally it was targeting just their Daydream platforms, but they've opened it up and made it really cross-platform, so it's actually probably the most widely supported spatializer plugin. It has things like setting material properties, so you can define, say, a wood floor or a concrete wall, adjusting how the audio moves throughout the space. And the last thing I want to mention here, one second, is something that just got released very recently by Anastasia. The Apple AirPods Pro actually have an API, released really recently in the iOS 14 version of their operating system, that allows you to track the head position of the AirPods in what we call three degrees of freedom, so you can do rotational tracking of the head itself. This was a little video of her moving back and forth, and you can see that cube moving back and forth as well. So she actually created a Unity plugin, a wrapper, for this API itself.
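Before moving on, Dan's 3D audio setup can be sketched in a few lines of Unity C#. This is a minimal illustration, not his code; the clip, the distance values, and the assumption that a spatializer plugin (such as Resonance Audio) has been selected under Project Settings > Audio are all mine.

```csharp
using UnityEngine;

// Sketch: configure an AudioSource for fully 3D, spatializer-driven playback.
// Without a spatializer selected in Project Settings > Audio, Unity falls
// back to the simple per-ear panning Dan describes.
public class SpatialAudioSetup : MonoBehaviour
{
    public AudioClip clip;  // a mono clip spatializes most cleanly

    void Start()
    {
        var source = gameObject.AddComponent<AudioSource>();
        source.clip = clip;
        source.spatialBlend = 1f;   // 0 = 2D, 1 = fully 3D positioning
        source.spatialize = true;   // route through the spatializer plugin
        source.rolloffMode = AudioRolloffMode.Logarithmic;
        source.minDistance = 1f;    // full volume inside this radius
        source.maxDistance = 25f;   // attenuated to near-silence by here
        source.loop = true;
        source.Play();
    }
}
```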
That plugin wraps a native iOS API, letting you link in and then start to create interesting things by understanding the head rotation and position of a user wearing AirPods. So that's available there; you can find her on Twitter, and she posted the plugin on GitHub. And that's pretty much it for the presentation. We definitely have a bit of time for some Q&A, so if anyone has questions, I'm happy to answer those. So thanks. Yes, thank you so much, Dan. That was amazing. Coursera is stuck in the floor. Hopefully you're doing okay down there. If anyone does have a question, please check to see if you are muted first off. Let's go a little bit anarchy: if you have a question, please unmute yourself and scream it into the void. Yes. My question is, first off, let me say thank you for the talk and the community. I've been using Unity for a long, long time and it's really come a long way since 5.4. So really good job. My question is, with all these emerging new devices and ways that the devices are used, things like hand tracking and eye tracking, a lot of different companies are using these different technologies. How closely are you working with the Facebooks and the Microsofts and that kind of stuff? Yeah, good question. Sorry, I was grabbing a drink in real life there. So the question was basically, with a lot of different new technologies and new platforms, how closely are we working with companies like Facebook or Microsoft? And the answer is, Facebook and Oculus and Microsoft are actually some of our biggest partners. We work really closely with them. They use us a lot, both internally as they develop their own tools, and when you think of the platform providers themselves alongside Unity, they really value our partnership and we really value theirs. It's about bringing more developers into the ecosystem and really enabling these different features.
So as I'm sure you're familiar, hand tracking shipped on the Oculus Quest, and early on there was a developer preview to enable hand tracking when developing within Unity and within the editor, and to actually enable it through Oculus Link. So yeah, we work really closely with them, and what we try to do is understand the partners, the platforms, and their roadmaps, and align those with the tools and things we're doing. So understanding that, hey, tracked hands, which is a perfect example, is becoming a really important feature on a lot of these different platforms: how can we at Unity enable that and allow people to create with it more easily while hiding some of the complexity? If you want to create with hand tracking, you shouldn't have to worry about whether you're targeting this platform or that platform. If we can create a bit of an abstraction on top of that, or enable it in a similar way across platforms, that's really what we're trying to do there. I hope that answers the question a bit. Great. Anyone else? If you have gestural capability, raise your hand or pick up your hands. Anyone? Yes, Peter, hold on one second. I'm going to amplify your voice too. Or start speaking right now. Let's see what we can do here. Yes, I was wondering, is there a site or anything that you would suggest to do some training with? Yeah, are you referring to just general Unity training or something more specific? Okay, yeah. So I usually recommend our Learn platform; I think it's learn.unity.com. We've also started to incorporate what we call in-editor tutorials. So if you download the Unity Hub, there's some additional content that we call Microgames. You can search Microgames; there's a good one that's a karting-style Microgame, and that actually walks you through some of the fundamental concepts and terminology within Unity. So I always really recommend that.
I mentioned the XR Interaction Toolkit has a piece of Learn content as well. And really what I've found is, if you can have an idea in mind, some small-scope project, once you understand some of the terminology, like how do I get a rigid body to roll down a ramp, once you know some of those fundamentals, then you can really start to Google or search for the correct answers. But I definitely recommend the Learn platform and those in-editor tutorials through the Unity Hub as a getting-started point. That's great, Dan. Thanks. Yeah, no problem. All right. Toggling between all the things. Great, great. Anyone else? Any other questions? Yes, Jason? Can you hear me okay? Yep. Okay, great. I just wanted to echo what you just said about the learning platforms. Peter, because you were asking: the tutorials and whatnot that Unity has put out there are really, really good. I got my start with their famous Roll-a-Ball tutorial, and they've recently reissued that on their YouTube channel. So it is very learnable and all the answers are there. I'm using YouTube all the time as my learning platform; there are a lot of users out there who are sharing. So don't be intimidated. But the one thing you mentioned, Dan, about finding a project that you think is something small you can start with: the emphasis is on the word small. I can't tell you how many things I've decided to just try doing, only to find out, nope, way too big, the scope is just huge. Pick something really small to start with, and the feeling of accomplishment is greater if you actually finish it. That is all. For sure. Thanks. Frank, did you have another question? I did. Yeah. So a few years ago, Community West actually went on stage and showed off, I think this was back in the Unity 5 days when it was first out, and she unveiled that Unity had been creating an editor in VR.
So I'm curious, is that still a thing that might come eventually? I got to try it, and it was pretty cool. Is that still happening, or is that still being talked about? Yeah, it's actually really funny you brought this up. That was actually re-released and is now available as a package, and I want to say that was last week. To be perfectly frank, there were some resource allocations towards other platforms and other tools, but it has still been continually worked on. It was re-released and recently updated. If you search, it's called EditorXR. They added some new paradigms for different UI and interaction, and there is now a forum post, I believe, that announces all of it. But yeah, it was really recently re-released and put into a package so that it's a little bit easier to get access to and enable. So yeah, EditorXR is the name there. Thank you. Anyone else, as we come near the end of this day of PXR? Anyone else? Questions, comments, experiences? Bueller? Is that right? All right. Do I see it? Frank? Frank again? Great. Let's bring it back up again. So in the VR space, I see there are two major engines that have been chosen, Unity being one of them and Unreal Engine being the other. How much do you look at what is being done over at Unreal and apply it to what you do? Or do you do that at all? Yeah, I mean, I like to think as a whole we're always doing some competitive analysis at some level, paying attention to what's happening in the space. I would say on the VR and AR side, we're looking more towards community platforms, things like VRTK, as well as other toolkits. But really, we're always trying to build a platform where people can create more easily, and to enable more creators.
So yeah, I think it's something we're always keeping an eye on, trying to listen to feedback, iterate, and improve the existing product that we have. All right. Well, we technically have five minutes left. So, Dan, are there any closing words you'd like to leave us with? Did you want to ask a question here? I did. Thank you, Dan. First of all, thank you for coming and doing this presentation. One of the things I've heard about Unity, and I haven't dabbled with it, is that there are a lot of possibilities of what you can download. For guidance on how to do that, on picking the right package, what is it that you suggest? Yeah, that's a good question. So the question was, I guess, where to get started, since there are lots of different versions of Unity and things like that. Typically, depending on what application you're building or where you are in the production stage, we have what we call the LTS, or long-term support, version of Unity. That's supported for two years, and we backport fixes and things like that. The LTS version that's out today is 2019.4, so we're at the end of that release cycle. Once we get into 2021, we'll have released a 2020 LTS; it'll probably be 2020.3 LTS. But yeah, we usually recommend building on top of the LTS versions, as that's going to be the most stable. That's also where you'll find the difference, as I mentioned, with preview packages. We have both preview and verified packages, and verified packages are versions of a package that have gone through the entire QA cycle. So at a high level, the official recommendation would be: download the LTS version of Unity and build on top of verified packages. Now, the Dan caveat here is that in the AR and VR space, there's lots of stuff that's continuing to advance and continuing to be released, and some of that is only available in betas or what we call tech streams within the editor itself.
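As an editor's aside, the preview-versus-verified distinction shows up concretely in a project's `Packages/manifest.json`. The package names below are real Unity packages from this era, but the version numbers are illustrative assumptions; preview packages carry a `-preview` suffix, while verified ones do not.

```json
{
  "dependencies": {
    "com.unity.xr.interaction.toolkit": "0.9.4-preview",
    "com.unity.xr.management": "3.2.13"
  }
}
```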
So I find myself working within the tech stream, or in a beta, or even an alpha sometimes, and certainly I've been using a lot of preview packages. But as far as stability goes, we recommend verified packages. So I hope that answers that a little bit. And for the secret sauce coming down the turnpike, anything that you are allowed to reveal that we can look forward to? I mean, Unreal just came out with that amazing video of their new possibilities, and I'm just wondering what you are doing in conjunction with that? Yeah, I mean, I guess I would say just stay tuned. We're constantly trying to improve our technology and release interesting new features, and we're certainly working with a lot of our partners on additional platforms and things like that. So yeah. And then, did you have a question back there? More of a, sorry, can you hear me now? Yep. Yeah. More of a, I guess, call to arms or advocacy point: while we're developing these sorts of technologies, try to be as inclusive as possible. I have friends creating and learning in spaces that are for dubbing and including content in multiple languages, and while they were using images to lip sync, they had to do a lot of modifications to be able to include Black bodies in that lip-sync process. A lot of the programs are written without Black bodies and brown bodies in mind. We've seen this a lot in facial recognition technology, where facial recognition does not work on those bodies, and those people are not included within the development of those technologies, which in some cases might be good, considering the militarization of facial recognition technology. But in general, when we're developing work, try to include people who have different abilities. For instance, I was just in a talk where they were advocating that social VR be usable while in bed, so that people who are bedridden are able to use the program without having to sit up.
So I would just say that, as you're looking to develop and create these new technologies and add to Unity as well, make sure you include Black and brown bodies in this work, as well as people who are differently abled. Yeah, for sure. Excellent, excellent points. And on that amazing note, I just need to note that we are theoretically at time. We've wrapped; it is three o'clock in the afternoon Pacific time. That concludes this session with Dan Miller. Thank you again so much for coming out and speaking with us today. And that is also technically the end of day one of PXR 2020. So we made it, folks. We're all in the same shard. We're doing it together. We can hear each other. Amazing. Thank you again all so much for your patience and playfulness as we experiment and experience this together. For those of you who are interested, there is a snowball fight that's going to be going on back in PXR Central, so if you'd like to make your way over there, we can do a bit more jamming and see what the possibilities are. Otherwise, I shall see you all tomorrow. Again, we start our presentation day at 12:15 Pacific. We have two presentations running: one is Toaster Labs' Toaster World, and the other is Nick Foxgig showcasing his lightning toolkit. So please enjoy your evenings, enjoy your afternoons, wherever you might be coming from today. Thank you again so much for being out with us today. You've all been great. Go forth, be free. Go jump some worlds and play.