So, PXR friends, how are we doing today? Can I get a round of happy emojis if you're with us? Yeah, we've got one corner. We've got the whole spread. Perfect. Welcome to this fabulous day, Saturday, October 3rd. My name is Aiden. I am your project manager here with PXR, the Performance and XR Symposium 2020. I am speaking to you from the stolen and unceded territories in East Vancouver of the Musqueam, Squamish, and Tsleil-Waututh Nations. I'd just like to take a moment to thank our wonderful Canada Council Digital Strategy Fund for helping us pivot what would have been an in-person conference into a full VR experience, to talk about what we can do with digital technology nowadays. Beaming in from Toronto, I would like to introduce Nick Fox-Gieg, who has created this wonderful Lightning Artist Toolkit. And to speak more about it, the creator himself.

All right, so hello everyone. This is my first time presenting in VR, and I feel reasonably articulated. I'm going to talk about the Lightning Artist Toolkit, which I've been working on for the past four years or so as a way to draw in VR and then use what I make in animated short films. I've been amazed by six-degree-of-freedom controllers since they came out. The idea of being able to create in 3D as expressively and intuitively as in 2D has always been just amazing to me. My background is in traditional animation, drawing frame by frame in Flash, and being able to bring that kind of work into Blender grease pencil specifically has been the taking-off point for all of the short film work that I hope to make. The host has called on you and made you louder. Okay, cool. Good, can everyone hear me? So the ability to create using all sorts of different platforms, not being locked into one, bringing all of that into Blender grease pencil and making a short film just like I used to in Flash, was incredibly compelling. This is one of my later experiments here.
I am using this in a Mirage with the mixed reality mode turned on. Here I am moving around the space where I am right now and doing a little six-degree-of-freedom drawing. Now, as far as I know, the Mirage, which from what I gather is pretty much canceled, is one of the only headsets where you can actually do this pass-through while also holding two controllers. The Focus might be able to do it, and the Quest might be coming out with that at some point. The point is being able to work independently of any one hardware platform or application. So it's a bunch of open-source tools for Blender, for Unity, and for Three.js. What I can do is take frame-by-frame drawing, which is in a file format that I created, and move it between any platform I want. We're going to pop into a world at the end of this presentation and see some animation that I created and was able to bring into AltSpace.

Here, even earlier, I started off working with the Razer Hydra, if anyone remembers that, just before the Vive, and the Leap Motion controller, which used to have a stylus mode. All of this passes into Unity, and it can all be rendered however you like; in this case, just with the basic Unity line renderer. And as we got into Tango and ARKit- and ARCore-enabled phones, I was able to bring some of my experiments over to those. So here we are drawing frame by frame over live video footage using the same framework. And here is a group of experiments I've been posting on Instagram for the past couple of years, not limited to that Unity line renderer. Because you have all the stroke data preserved, you're not just dealing with a mesh and you're done. You can take all of this information and render it however you like. You can mesh it in a number of different ways. You can bring it into a number of different platforms: Unreal, Three.js, HoloLens, Magic Leap.
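The portability described here comes down to keeping strokes as data rather than baked meshes. As a rough illustration, and not the toolkit's actual schema (these field names are mine), a layers/frames/strokes/points structure round-trips through JSON like this:

```python
import json

# Illustrative stand-in for a portable stroke format (field names are mine,
# not the toolkit's actual schema): layers hold frames, frames hold strokes,
# strokes hold raw 3D points plus a color.
def make_stroke(points, color=(1.0, 1.0, 1.0)):
    return {"color": list(color), "points": [list(p) for p in points]}

def make_document(layers):
    # layers: {layer_name: [frame, frame, ...]}, each frame a list of strokes
    return {"layers": [{"name": n, "frames": f} for n, f in layers.items()]}

doc = make_document({
    "sketch": [  # one layer, two frames of animation
        [make_stroke([(0, 0, 0), (1, 0, 0)])],
        [make_stroke([(0, 0, 0), (1, 1, 0)])],
    ]
})

# Because strokes stay as data, any platform that parses this JSON can
# re-render or re-mesh them however it likes.
restored = json.loads(json.dumps(doc))
```

Keeping the raw points, rather than a finished mesh, is what makes the downstream re-rendering and re-meshing he mentions possible on every platform.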
Just trying to find a way to a perfectly portable and universal drawing toolkit. So this is the process. Here I am drawing in Tilt Brush. Tilt Brush can't do frame-by-frame animation, but it does have a completely open file format, so I'm able to integrate that into my toolkit; it reads Tilt Brush files at the stroke level. Here I've brought that into Blender, and I'm able to animate frame by frame in Blender grease pencil and then render out the result. This is Blender 2.79; more recent versions of Blender actually add the ability to view in real time in VR in Blender itself, which I find really interesting, and also to render in real time with Blender Eevee.

But the really interesting thing that I'm heading into now, and I've actually just started a PhD at York University working with machine learning and volumetric drawing, is that because I can access all of the stroke data myself, I'm working on a way to convert volumetric video into brushstrokes and work with it like I would composite traditional flat video and 2D drawing, because my animation practice has always been heavily based on collage and compositing, and on the idea of bringing all of that into 3D. So here I am working in a Vive, having brought in just contour-based volumetric video, and here I am painting it, and here are some renderings out of Blender. It's still a little crude, but I really am looking forward to being able to work exactly like I do in 2D, but in 3D.

And here is a Three.js piece; this is live on my website now. I hooked this up to Magenta to play notes. Here is a machine learning model of my own piano playing from high school, using the points in the stroke data to generate random music. Again, this is only possible because I've got all of the stroke data accessible. I'm not just making a flat mesh; I've got all of the brushstroke information.
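The Magenta hookup turns stroke geometry into notes. As a hedged sketch of that idea (the mapping and pitch range here are my own choices, not his implementation), one simple approach maps each point's height onto a MIDI pitch:

```python
# Sketch: map each point's vertical (y) coordinate onto a MIDI pitch range.
# The range and the linear mapping are illustrative choices.
def points_to_pitches(points, low=48, high=84):
    ys = [p[1] for p in points]
    y_min, y_max = min(ys), max(ys)
    span = (y_max - y_min) or 1.0  # avoid dividing by zero on flat strokes
    return [round(low + (y - y_min) / span * (high - low)) for y in ys]

stroke = [(0.0, 0.0, 0.0), (0.1, 0.5, 0.0), (0.2, 1.0, 0.0)]
pitches = points_to_pitches(stroke)  # one pitch per stroke point
```

Anything like this is only possible because the per-point stroke data survives; a flat mesh would have nothing left to sonify.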
So it's sort of become a platform for all sorts of interesting research besides short film production, which was my original idea. Where I'm trying to go with this now is to be able to record RGBD video. I'm really interested in depth capture strategies that I can feed through the analysis tool I'm working on to generate brushstrokes.

I've prepared a little world to demonstrate the flexibility of this. AltSpace doesn't natively support any frame-by-frame format that I'm aware of; it won't let you import Alembic files, for instance, or volumetric video from movie clips. We tried, when I was working with Alex and Aiden, to get some of this stuff in there for this presentation, and we were able to figure out how to generate, with my toolkit, some frame-by-frame animation that we can go see. So before we head into the portal, and we have actually tested this just now for Quest people as well, which is awesome (I'm on a Vive), just any Q&A. Also, my little amplified voice does not work in the other world. Someone has raised a hand. Hi. So yeah, any questions about this?

Yeah, I was wondering, since you're just testing out this animation to be seen on the Quest here, if your program is usable with the Quest.

Yep. In the end, it is just a Unity package, and yes, it does work on the Quest, and I have tested it there.

Awesome. Any other questions? Nope? All right, well, shall we all head in then? Further questions can be asked in the space. All right, let us head over. Will we still be on air in the world?

In the new world, we are not going to have megaphone capabilities, so if we can all huddle around. Don't worry, social distancing doesn't count in VR; we'll be safe.
And if you'd just like to head up the stairs into our Lightning Artist Gallery, we will meet you in the next world.

Okay, so this is the basic trick of this. I've managed to mesh one of these drawings, and I'm actually just using scale keyframes to show and hide each frame of the animation in turn. And I'm using a vertex shader to handle the colors, the same as you would for Tilt Brush or for Quill. Do any folks here do VR drawing, or no?

I just started using Tilt Brush. I'm only a week into my VR headset.

Cool, nice. That is one of the best ways to actually use this; Tilt Brush files are able to be imported directly into Blender with this toolkit.

That's awesome. Can someone say the name of that program again? I didn't catch it.

So Blender grease pencil is where I've actually created all this animation, and I call my plugin the Lightning Artist Toolkit. I use it to convert as many different kinds of drawing as I can get hold of into Blender grease pencil, where all this stuff actually gets created. Another thing is that this allows me to do point cloud editing. This is a really interesting aspect: Blender grease pencil works with points as well, so I'm able to erase, draw on, and recolor point clouds. So I really am hoping that this ends up being a frame-by-frame animation situation that goes beyond just working with hand-drawn cartoons, which I'm primarily interested in, and also incorporates live-action footage into a drawn environment. This sort of hack is going to be just temporary, because this is already seven megabytes, I believe, and the limit on the worlds that Alex had was 25. This world's gallery is going to be up throughout the festival, and I'll add a few more pieces to it as I finish them, but we would quickly run up against the size limits.
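The scale-keyframe trick is easy to state as logic: at any playback time, exactly one frame-mesh is scaled up and the rest are scaled to zero. A minimal sketch, where the frame rate and looping behavior are my assumptions:

```python
# At playback time t, show exactly one frame-mesh (scale 1.0) and hide the
# rest (scale 0.0). The 12 fps default and the looping are illustrative.
def frame_scales(num_frames, t, fps=12):
    current = int(t * fps) % num_frames  # loop the animation
    return [1.0 if i == current else 0.0 for i in range(num_frames)]
```

In an engine these scales would drive the transform of each baked frame-mesh, while a vertex shader carries the per-point colors, as described above.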
So one thing I'm very curious about: is anyone here working with volumetric video, or do you have any ideas about formats and how to get essentially more space? I'm super interested in talking. I'm actually talking to some of the folks at the MPEG group; they're working on a volumetric video format that they and Khronos want to push as a standard, which would let us get a lot more in, because you can compress a mesh the same as you can compress an image.

I think for AltSpace the public spaces are more limited in size, and in bandwidth in general. But they control that on their servers, so because you're developing something, they'd probably be very interested, if they haven't already contacted you, in letting you access larger spaces. I made some friends on here who have been here for quite a while, and the Common Space Campground is on a larger, better-quality server, because it's an invitation for everyone, so it has to have the capacity for a lot of people. When we were testing stuff out, they were like, oh yeah, this is not as laggy no matter what you do, because it's one of their prioritized servers. So you might be able to speak with them to develop in a space with a bit more priority service.

That is really cool, okay, thanks so much.

They also do, I think, monthly meetings with the owners, the main creators.

Right, yeah. Neat. Yeah, I mean, all this stuff, bandwidth is going to be such a huge problem. Even getting this ready for Quest folks was a very last-minute affair. They didn't have documentation on their site; they only had how to prepare the world for desktop. So we winged it and we got it working, which is great, and which is probably why a lot of you can even be here. But yeah, it is such an interesting challenge to get this working for everybody.
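On squeezing more into a size budget: one generic trick (my illustration, not the MPEG/Khronos work mentioned above) is to quantize float coordinates to 16-bit integers inside a known bounding box, roughly halving the payload versus 32-bit floats at sub-millimeter cost:

```python
# Quantize 3D points from floats to 16-bit ints inside a known bounding
# box, and back. The bounds and bit depth here are illustrative.
def quantize(points, lo=-10.0, hi=10.0, bits=16):
    levels = (1 << bits) - 1
    return [tuple(round((c - lo) / (hi - lo) * levels) for c in p) for p in points]

def dequantize(qpoints, lo=-10.0, hi=10.0, bits=16):
    levels = (1 << bits) - 1
    return [tuple(lo + q / levels * (hi - lo) for q in p) for p in qpoints]
```

Over a 20-unit bounding box, the worst-case round-trip error is half a quantization step, about 1.5e-4 units.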
Any other questions? Got some thoughts? Is anybody working with machine learning and point-cloud-based solutions, or 3D graphics in general? So I hear a ding, people. What was that? Did anyone else hear that?

I think you're hearing things, Nick.

Right, I'm wondering if that's my time. I think that might actually be my notice that we're 15 minutes from the hour. Okay, cool. So thanks so much for coming, and I hope to see you all around the conference; any more questions can go on the Discord.

Just so everyone knows, thank you so much, Nick. Thanks so much for coming, for speaking. It is 12:44 right now. Our next presentations won't be starting until 1:15, so you have half an hour. Feel free to hang out in this space; there will be a portal created so you can access this space for the rest of the week. There are also other surprise portals popping up in PXR Central. But I also encourage you to go hop around, enjoy some worlds across AltSpace, and see if you can start building something in your own custom world utilizing some of these tools. Otherwise, we'll meet most of you back in presentation room A for Renaissance Opera's presentation of their VR opera, the next generation of opera. Thanks so much, folks, and go have a good time.

Are you going to be stopping people from portaling in during the show? You could also move the spawn point so it's not in the middle of where people have built.

I know, I know. It's interesting, because I'm using an AltSpace template and it's an event, so it doesn't let me change the spawn point in the site or the interface.

Welcome, everyone. Can I get a round of applause? Some smiley applause. Awesome. So, PXR would like to begin by acknowledging the support of the Canada Council for the Arts Digital Strategy Fund in helping us get this whole event organized.
We acknowledge that this event has been primarily organized on the traditional, ancestral, and unceded territory of the Coast Salish peoples, the Tsleil-Waututh and Musqueam Nations. And now I would like to introduce our presenters for today. Oh, they're doing some checking. Great, nice. This is probably one of the most ambitious projects I've ever seen, and they're way ahead of the game in terms of what they're doing. I also find their presentation very ambitious; they have really set themselves a high task, and it shows that they are folks who like to shoot for the stars. So I present to you Debbie Wong, Conrad Sly, Yuhan Guan, Neil Nair, and Brian Topp of Renaissance Opera Company, here to tell us about Orpheus VR. Let's give them a round of applause.

Thank you so much, Alex. And thank you to everyone at the PXR Festival who's been helping us put this together. This is no small feat, and you have all done an incredible, mind-blowing job. Thank you so much for having us. My name's Debbie Wong, and I am the creative director of Orpheus VR, a choose-your-own-adventure opera that immerses audiences in the mythological world of Orpheus and Eurydice. I'm very happy to be joined by my core creative team; they're to my right over here. They are Yuhan Guan, our developer and interactive director. Yuhan, can you wave or emoji? Brian Topp, our composer and interactive audio director. Conrad Sly, our 3D artist and art director. Thanks, Conrad. And Neil Nair, our animator. Orpheus VR is currently in development for Oculus Quest, and today we're going to tell you a bit about our process and the tools we've used to translate a historical art form and a kind of wild idea into a high-fidelity prototype.
450 years ago in Florence, an interdisciplinary group of philosophers, poets, singer-songwriters, artists, and performers started gathering regularly to bemoan the current state of the arts and to ideate on how they might reclaim dramatic texts and bring them to life in a way that would immerse their audiences and affect their emotions. They looked to reclaim the practices of ancient Greek theater. They dissected the art of rhetoric, and they wondered what kind of role music played in Greek theater, and whether the culmination of music, poetry, and rhetoric might not be the key to recreating the powerful storytelling experiences they were seeking. With these foundations in mind, a new dramatic style was born, one that embodied the rhetoric of Greek drama but had music underlying all of the theatrical action and text, resulting in the birth of what we now know as Western opera. This was all very much in line with the humanist movement, which was at its peak in Europe in the 16th century and sought a revival of classical Greek and Roman culture.

I'm going to play an excerpt for you now of one of the first and most popular operas to be created, called Orpheus. The original story of Orpheus, as it was presented in the operas and is often told in the myth, centers around a musician whose voice is so sweet that even the gods of the underworld let him pass through the realm of the dead unscathed to reclaim the soul of his dead wife, Eurydice. I like to tell the story of the birth of opera and introduce its roots because it was very much a process of ideating, iterating, and prototyping, all in the name of reconnecting with powerful modes of immersive storytelling. And even though the intent was to reclaim historical art forms, the process of doing so laid a foundation for something unprecedented and entirely new to emerge.
So two years ago, another group of artists sat around at a local bar on Commercial Drive in Vancouver, BC, not necessarily bemoaning the state of the arts, but certainly questioning what kind of artistic experiences might be possible given the boom in access to emerging technologies. Orpheus VR was born out of that conversation and a series of questions. What would opera look and feel like if it were invented today? What is essential to what we understand as opera, and how can technology innovate the audience's experience of those essentials? And what can historical opera offer to teach emerging technologies and artistic forms?

To answer the first question, I will now play you a small excerpt of our version of Orpheus. What you saw there is our design of Orpheus singing something similar to the first excerpt: Orpheus singing to their lover, Eurydice.

The second question is: what is essential to our understanding of opera, and how can technology innovate the audience's experience of those essentials? There were two main aspects of opera that we really wanted to explore within the framework of VR storytelling. The first is what you saw showcased in the video clip: the unique ways that opera singers bring their characters to life, both physically and vocally. That led to the use of motion capture to preserve the physical and facial expressions of our performers, which Neil will tell you about later when he speaks to animation. The second aspect of opera that we wanted to expand on is the use of music and sound. Because opera is a musically driven narrative form, the text and the music are inextricably bound and equally responsible for depicting the narrative and moving it forward. This presented all kinds of opportunities in a 360 story world, where we have to think about how the entire world sounds, how users move through the world, and how it sounds as they're moving through it.
In opera, what that actually translates to is something like an orchestral score. So when a user moves into portion A as opposed to portion B, what does that sound like, and how does it all come together to form a cohesive narrative? Yuhan Guan and Brian Topp will speak to those things a little bit more. But it also introduced the idea of interactive audio to us. As users move through our environment, there are things they can interact with, and instead of those things sounding like they normally would in the real world, we thought: what if they could change the orchestration and allow users to co-create their very own operatic experience with us by playing through the narrative?

Last, we wanted to showcase what historical opera can offer emergent narrative forms. Again, this is where the music comes in. It's a powerful narrative force: it can draw our attention, it can drive our attention, and it can impact our emotional responses. And the user testing for these ideas has been happening for hundreds of years. So in Orpheus VR, we are essentially exploring how these tried-and-true musical narrative devices can help overcome the challenges unique to VR storytelling. I'm going to turn the presentation over now to our developer, Yuhan Guan, who will tell you more about how we've designed our narrative and how we've dealt with storytelling challenges in VR.

Okay, hi, I'm Yuhan. It's such a pleasure to be here to talk about the implementation of Orpheus VR so far. Next slide, please. So first off, I want to talk about why we chose VR. Virtual reality applications are a perfect medium for providing full immersion to the audience. Unlike the traditional theater setup for operas, where the opera usually happens on a stage in front of all the audience, in VR the audience can be surrounded by the environment and the characters; they can be in the center of the opera and even be a part of it.
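The portion-A-versus-portion-B idea above can be sketched as a position-driven stem mix; the stem names, regions, and linear falloff here are my illustrative choices, not the production's actual audio system:

```python
# The user's position selects which orchestral stems are audible and how
# loudly, so moving through the space re-orchestrates the score.
def stem_mix(user_pos, regions):
    """regions: {stem_name: (center, radius)} -> per-stem gain in [0, 1]."""
    gains = {}
    for stem, (center, radius) in regions.items():
        d = sum((u - c) ** 2 for u, c in zip(user_pos, center)) ** 0.5
        gains[stem] = max(0.0, 1.0 - d / radius)  # linear falloff to silence
    return gains
```

Standing at the center of the "voice" region brings that stem to full volume while the strings fade with distance, which is one simple way a user's path could shape the orchestration.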
So we chose the Oculus Quest as our target device, which is fully standalone and cordless, and which also gives us the opportunity to provide interactivity to the user through the Touch controllers. Because we want the audience to be a part of the opera, interactivity became a key element in our branching storytelling. At the early stage of designing the experience, we chose branching storytelling techniques to structure our story, so that users feel more engaged, feel their own influence on the paths of the characters, and see their actions actually change the path of the storyline.

To implement all those interactions and the branching storytelling logic, we use the Unity engine as our development platform. Because we're developing for the Oculus Quest, we're using the Oculus software development kit to handle tracking and user input, but we customized it into our own version to fit our visual and functionality needs. We developed a few event systems and action managers to hook the user's input up to our sound and animation, which Neil built using the Timeline and Brian built using the sound integration in Unity. Next slide, please.

There were a few challenges encountered during our design and development process. First of all is directing the user's attention in a 360 environment, which is a tricky and common challenge in today's 360 content, whether it's film or interactive experiences. Because the user has the ability to look around in an environment that can be very big, or even look infinite, we need to make sure the user doesn't miss too much of our storyline. They could miss a small part of the visual expression of a character, or even miss a major interaction between multiple characters. So we went through quite a lot of great 360 experiences and films to learn a few tricks.
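The "event systems and action managers" pattern described above is essentially a publish/subscribe hub between input and the sound and animation layers. A minimal sketch, with names that are mine rather than the project's code:

```python
# One input event fans out to every subscribed system (sound, animation),
# so gameplay code never calls those systems directly.
class EventBus:
    def __init__(self):
        self.handlers = {}

    def subscribe(self, event, handler):
        self.handlers.setdefault(event, []).append(handler)

    def raise_event(self, event, **payload):
        for handler in self.handlers.get(event, []):
            handler(**payload)

bus = EventBus()
log = []
# Hypothetical event name: grabbing the lyre plays a sound and an animation.
bus.subscribe("grab_lyre", lambda **kw: log.append(("play_sound", kw["hand"])))
bus.subscribe("grab_lyre", lambda **kw: log.append(("trigger_anim", kw["hand"])))
bus.raise_event("grab_lyre", hand="right")
```

Decoupling input from the responding systems this way is what lets one team wire Timeline animation and another wire audio to the same user actions.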
For example, we can narrow down the visible environment where the major events are happening and use highlighting to get the user's attention. We also use spatial sound and visual cues to attract the user's attention, and we plan the intensity of the storyline with peak moments and relaxed moments. We make sure to give the audience some exploration stages, with fun little details in the environment, so that they can direct their own attention and enjoy the environment, and then we lead them into a few minutes of intense storytelling events. Next slide, please.

Also, teaching the user how to interact with the objects in the experience naturally is very difficult without breaking the immersion. We chose not to use any text or user-interface guidance; instead, we use voice guidance and animation triggers to guide the user in what they can do and when they should move or interact with certain objects. For example, we use voice guidance to tell the user to follow Orpheus, and Orpheus then emerges and walks along, so the user knows to follow the character. Next slide, please.

Another thing we do to try to ensure immersion is to use enough feedback. For example, we use haptic feedback, mainly vibration on the controller, to simulate the touch of objects. Since the virtual hand maps one-to-one with the physical hand, we want the interaction to be as natural as possible; however, because the virtual objects don't actually exist, we try to utilize the haptic feedback we have with the controllers as much as possible. We also use visual and audio feedback, like dynamic sounds triggered by the user's actions, to make sure the user knows their actions actually matter to the experience, which Brian will talk more about later in his section. Now I will hand the presentation over to Conrad to talk more about the art and design of the story. Thank you.

All right, can you guys hear me? All right, okay, great.
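One concrete way to gate attention cues like the highlighting and spatial sound described above (a sketch of the general technique, not their code) is a view-cone test: measure the angle between the user's forward vector and the direction to the point of interest, and fire the cue depending on whether the target is in view:

```python
import math

# Returns True when the target direction lies within the user's view cone.
# The 90-degree field of view is an illustrative threshold.
def in_view(forward, to_target, fov_degrees=90):
    dot = sum(f * t for f, t in zip(forward, to_target))
    norm = math.sqrt(sum(f * f for f in forward)) * math.sqrt(sum(t * t for t in to_target))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle <= fov_degrees / 2
```

A cue system might, for instance, raise the volume of a spatialized sound only while `in_view` is False, nudging the user to turn toward the action.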
So when Debbie and I initially started talking about wanting to make a VR opera experience, we wanted to explore using VR tools to do so, such as Quill and Medium. Here you can see an early sketch I did in Quill, which is a VR painting tool. After doing a bunch of research into these tools, I thought that the painterly style of Quill could work and perform well on the Oculus Quest. Prior to this, my background was in photorealistic architectural visualization, which we knew we were not really going to be able to achieve on the Quest, so we began exploring art styles seen more in games.

After a few weeks of messing around and doing some tests with the features of Quill, we had a nice opportunity to show early development at an opera event in Toronto. The idea we came up with was to showcase some of the storyboards that I painted in VR. I painted a landscape set and a couple of characters, Orpheus and Eurydice. This was then translated into a musical 360-degree fly-through VR experience for the Oculus Go, at the time, and the attendees of the opera event could get an early glimpse of our intentions for this project. We also experimented with having an opera singer inhabit one of these virtual environments during a live performance, where the audience could see what she saw on a big projection behind her. It was really fun to experiment with these ideas, but ultimately it was tangent to our goal of translating the opera into a VR experience.

After that stage, we experimented a lot with what the next steps were going to be. We ran through a basic pipeline of blocking in the environment to scale, doing a sketch over it, and then creating assets in Quill to match the ideas in the sketch. What I discovered about a tool like Quill at this time is that it creates topology, polygons, kind of on the fly, in your brushstroke.
And these can get quite heavy, especially for mobile devices that are limited in the number of polygons they can support. After doing a bunch of tests with these kinds of assets, it became pretty clear that they're not really production quality for our goals and the platforms we had in mind, or they would just require too much cleanup. It was really fun to use Quill for concepting, but we ultimately decided that we needed to find a new pipeline. One of the other issues was that we didn't really have a strategy for creating a production-ready capture with these tools; the technology just isn't quite there yet in that regard.

Then we had to figure out what our strategy for the motion capture was going to be. After some research into many different kinds of software and hardware, one of the connections I had from the Centre for Digital Media in Vancouver, where I did my master's degree, was the wonderful people working out of the Sawmill motion capture studio here in Vancouver. We chatted with them a bunch, and we settled on using Reallusion's iClone and Character Creator tools to create the characters and rigs and to do real-time performance capture. It was a great mutual benefit for both of our teams, because we both wanted to try this pretty new toolset.

Character design is what I tackled first. I drew a lot of inspiration from myths and games, but we wanted to put our own spin on it. Eurydice is typically a damsel in distress, and we wanted to empower her role in our branching storyline, giving her a strong will and drawing features that emphasize her musical and whimsical nature. For Orpheus, because we didn't want to use any props, which we thought would over-complicate the motion capture production, we incorporated elements of his musical instrument into his clothing.
As you can see, the strings of his lyre. Somewhere along the way, it became pretty clear that, with my background not really being in character design, it would be awesome to have some help to create a production-quality concept of the characters. So we hired Dorothy Yang, an amazing character designer here in Vancouver, to help bring our characters to a higher standard. Then it was my job to start using the 3D tools, Character Creator and ZBrush, to design the characters in 3D. The pipeline is really wonderful: you can customize your character a lot, then send it to ZBrush to do some sculpting and modify it, and then quickly send it back to Character Creator, and you retain a completely functional rig for the face and body. This was honestly really game-changing for us, because I didn't have any experience rigging a character before, so to have one of reasonably high quality...

I think we may have lost Conrad. Brian, can you clap if you can still hear Conrad? Did we lose Conrad?

Oh my goodness. Did it mute me?

I think you got muted. But we're here.

Okay, we're here. Okay, sorry about that. If you have any questions about any of that stuff, I'll be happy to answer them later. Sorry about the technical problem there. For the environment, we started out sourcing some assets from the Unity Asset Store to quickly prototype interactions and motion capture sequences; it just saves a bunch of time to get something going early on. We have since started designing our own assets for the project, which is still very much a work in progress. Somewhere along the way, Debbie had written this wonderful backstory for the characters and creatures in the story, including the forces and gods and fates behind the scenes pulling the strings. Here, for example, we have the forces of the fates: chaos, the weaver, prophecy, and destruction.
These are all the prime movers of the events leading up to Orpheus and Eurydice's story. We wanted to tell the backstory through world-building and environmental storytelling, in fun ways and subtle interactions, translating the ideas of the backstory into architecture and illustrations in the world for the players to discover. Once again, we relied on Dorothy to help us realize some of these visions, characterizing the events in beautiful illustrations that appear around the world. Our goal here is, through audiovisual synthesis, to tell an immersive and encompassing story, breathing new life into the ancient myth and musical score.

For early prototyping, it's really important to keep everything modular and non-destructive when you're creating an environment, so that if you need to move things around and redesign the level, it's flexible. For me, Substance Designer is really useful in this regard for the texturing of the world, as it's entirely procedural: I can create a dirt and a grass and blend them together with endless variations to enrich the world. I'm going to hand it off to Neil, who's going to talk a lot more about the animation pipeline.

Can you hear me? Hello, I'm Neil. I want to talk to you today about how we captured performance for Orpheus VR. First, I want to tell you that there are two primary ways that we do motion capture these days. One is optical, where we use a series of cameras and reflective markers that bounce light back, and we capture the data that way. The other, which is a developing form, is inertial tracking, where we use suits with motion sensors that send data to computers. Next slide, please. So our process for this project used an optical capture system to capture the body, and an iPhone X with a plugin to capture the facial animation.
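The procedural blend described above reduces to a per-pixel linear mix of two materials by a mask. With single values standing in for full texture maps (a heavy simplification of what Substance Designer actually does):

```python
# Linear blend of two material colors by a mask value:
# mask 0.0 -> all dirt, mask 1.0 -> all grass, in between -> a mix.
def blend(dirt, grass, mask):
    return tuple(d * (1 - mask) + g * mask for d, g in zip(dirt, grass))

dirt, grass = (0.4, 0.3, 0.2), (0.2, 0.6, 0.1)  # illustrative RGB values
mid = blend(dirt, grass, 0.5)
```

Because the mask itself can be procedural (noise, slope, height), the same two source materials yield endless non-destructive variations, which is the flexibility being described.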
Typically, this process involves... so our director will give us a shot list, and then we will capture body motions and facial capture together, but in our case separately; I'll go into that a little bit later. And the last two steps are cleaning up that data, processing it, and then integrating it finally into the engine. So next, please. So this is typical of what a shot would look like from start to finish. You get your facial and body data, which you can see on the left, and then you see the point cloud in the center, which is the raw data as we get it in. And then we apply that to our 3D digital models. And finally, we integrate it into the game engine and add all the art and assets. So here you can see a little behind-the-scenes video. So how we did the body motion was in that room that you saw there, using a 24-camera OptiTrack system. And what we do is put markers on the actors' bodies, and that forms this green point cloud that you can see, which is a direct representation of them. And then, much like film production, we record take after take. For our facial capture, we use the iPhone X with its front-facing depth camera, and we use Reallusion's plugin called Live Face, part of their suite of plugins for motion capture. You can use this either head-mounted or fixed in place. Initially we started off with it head-mounted, but we found that it was quite heavy to have an iPhone strapped to someone's head for a long time. So we moved to fixed placement and recorded all the facial takes separately. Yeah, so finally, and I think importantly, the data that we collect all needs to have some editing done to it, because the computer will often introduce some glitches, or markers might be moved out of place, and so on. So the first step is cleaning all the data. Second, we solve it, which is just connecting it onto the characters, as I mentioned before. 
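The "cleaning" step Neil describes often means filling gaps where the optical system briefly loses a marker. The real tooling (OptiTrack's software and similar) does this far more robustly; the following is only a minimal Python sketch of what linear gap-filling does to one axis of one marker's track, with `None` standing in for dropped frames.

```python
def fill_gaps(track):
    """Linearly interpolate over None entries (frames where the
    optical system lost the marker). `track` is a list of per-frame
    values for one axis of one marker; gaps at the very start or end
    have no neighbors to interpolate between and are left as-is."""
    out = list(track)
    i = 0
    while i < len(out):
        if out[i] is None:
            start = i - 1                      # last good frame before the gap
            j = i
            while j < len(out) and out[j] is None:
                j += 1                          # j = first good frame after the gap
            if start >= 0 and j < len(out):
                a, b = out[start], out[j]
                span = j - start
                for k in range(i, j):
                    out[k] = a + (b - a) * (k - start) / span
            i = j
        else:
            i += 1
    return out

# A marker lost for two frames between two known positions:
print(fill_gaps([0.0, None, None, 3.0]))  # [0.0, 1.0, 2.0, 3.0]
```

In practice, editors also smooth jitter and reject mislabeled markers, but gap-filling is the part that most directly addresses the "markers might be moved out of place" glitches mentioned above.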
And then we move on to motion editing, where it's likely that we will need to stitch several pieces of motion capture animation together, or make them into loops which can be very easily played in the game engine. Motions are also sometimes exaggerated, in case the animation needs to be made more interesting than real life in certain aspects. And finally, all of this goes into the game engine. In our case, we use Unity, and we use its Timeline tool to sequence out the animations, and also Unity's built-in Mecanim system to trigger specific animations when we need them. Yeah, so thanks, and feel free to ask me any questions later. All right, can everybody hear me? Hopefully. All right, we're running a bit short on time, so I'm gonna be quick. So my name is Brian. I am the composer and sort of music director for Orpheus VR. And the music is a very, very big subject, so mostly what I'm gonna talk about today are some of the considerations and challenges that we faced. So one thing that we've talked about this whole time is that in VR, the role of the audience can actually change: you're no longer just this passive entity watching something unfold in front of you, but you can become something much more engaged with it. You can interact with it. And our goal with this was that you, as the user, as the audience, would be able to interact with the characters and shape the story, but also be able to shape the musical outcomes of things, this being a musically driven art form. Now, one of the major challenges that comes up once you start adding interaction with the music is the simple fact that music exists in time. What I mean by this is that music relies on structure and pacing, direction, carefully planned tension and release. Any moment in a musical composition has the impact that it does largely because of everything that led up to it. 
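The engine-side triggering Neil mentions is done in Unity's C# with Timeline and Mecanim; as a language-neutral illustration of the underlying idea, here is a tiny Python sketch of a trigger-driven state machine, with hypothetical state and trigger names (these are not the project's actual animation states).

```python
class AnimatorSketch:
    """Toy version of Mecanim-style trigger-driven state transitions.
    `transitions` maps (current_state, trigger) -> next_state; firing a
    trigger with no matching transition leaves the state unchanged, and
    looping states simply keep playing until a trigger moves them on."""

    def __init__(self, initial, transitions):
        self.state = initial
        self.transitions = transitions

    def set_trigger(self, trigger):
        self.state = self.transitions.get((self.state, trigger), self.state)
        return self.state

# Hypothetical states for an Orpheus-like scene:
anim = AnimatorSketch("idle_loop", {
    ("idle_loop", "player_approaches"): "greet",
    ("greet", "clip_finished"): "sing_loop",
})
anim.set_trigger("player_approaches")     # now in "greet"
print(anim.set_trigger("clip_finished"))  # sing_loop
```

The point of the pattern, in Unity as here, is that looped clips play indefinitely and gameplay events (a trigger) decide when to move to the next motion, which is why the motion-editing step above turns captured takes into clean loops.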
Now, once you start adding interaction, time becomes very flexible, because musical layers, instruments, melodies and ideas are now dependent on a user doing something, and you can't guarantee when, where, or if they'll ever do that. And this is also being combined with very strictly timed performances and motion capture. So the place where we started conceptualizing the music was less about how you might think of a traditional opera or musical form, and more about what the user is doing, or what the possibilities are for the user at any point, because the musical material is gonna have to adapt to that. Is a section very free, like are we expecting a person to move through a space? Is it user-directed, in that the music is coming out of something they're doing? Or is this moment fixed, and now they're watching part of a captured performance? Next slide. And what this really meant for us is that the music is now intrinsically linked to the design of everything else, because you have to plot the performances in terms of: well, the person is starting in one area, and they're gonna make some type of decisions, so we have to know what all the possibilities are, musically, for them entering any section, making decisions, leaving that section, moving to the next thing. How long will it take for them to do that? And even just going through the choice structure of: in this section the user can interact with five or six different objects. Well, what happens if they only interact with two, or all of them, or none of them, or they decide to just stop and wait there? There are a lot of design choices that have to come from what you are gonna allow the user to do, but also accounting for what anyone might do, because it's entirely possible for someone to just kinda stop and smell the roses, so to speak. Hang on. 
And this is a very, very big topic, this idea of interactive or dynamic music, but essentially what we're talking about is the idea that the music is responding to something that the user is doing. And when you're developing anything interactive, really what you're looking for is usable data: some kind of information, some numbers that you can connect musical parameters to. And VR is very fruitful in that it's a digital world, so literally anything that you can imagine we can use as data. So user actions: they can touch things, pick things up, interact with objects and characters. But we can also use attributes of the characters, so their position or location in the world, relationships to objects, how far away they are, whether they're between things, and even physics parameters such as how fast they're moving, or even where they're looking. And really the only limitation that we found with this is that the more obscure the parameters you're using, the less evident those interactions are to the user. Sometimes that can be very nice, but that's a huge element of it. And so, what Debbie's brought up here: one of the methods that we've been using quite a bit is dynamic layering, which is the idea that a given piece of music actually exists as a number of independent layers. And as the user makes choices, the music actually evolves in the direction of their choices by adding, removing, remixing or changing how all these different layers are recombined for the user. And so this is from the opening scene, where the user is sort of shown their powers and told: okay, you have the ability to breathe life into the world, or you have the ability to take life away. And the actual music that you hear is a mixture of about nine different layers that allow you to hear Orpheus, to experience music, but also to have little UX cues, so they know when an interaction has happened. 
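The dynamic layering Brian describes can be sketched as a gain table over stems: a base layer always plays, and each choice toward "life" or "death" fades in one more layer on that side. The layer names and the one-choice-per-layer rule below are illustrative assumptions, not the project's actual nine-stem mix.

```python
def mix_layers(base, life_layers, death_layers, life_score, death_score):
    """Return per-layer gains (0.0 or 1.0) for a dynamically layered cue.
    `life_score` / `death_score` count the user's choices in each
    direction; each choice enables one more layer on that side, while
    the base layer always plays at full gain."""
    gains = {base: 1.0}
    for i, name in enumerate(life_layers):
        gains[name] = 1.0 if i < life_score else 0.0
    for i, name in enumerate(death_layers):
        gains[name] = 1.0 if i < death_score else 0.0
    return gains

g = mix_layers(
    "strings_base",
    ["harp", "choir", "high_strings"],     # layers added "on top" for life
    ["low_brass", "timpani"],              # layers added "below" for death
    life_score=2, death_score=1,
)
# With two life choices and one death choice: harp and choir are on,
# high_strings is off, low_brass is on, timpani is off.
```

A real implementation would crossfade gains over time rather than switch them, which is also why, as noted above, the individual stems are kept sparse so the combined result doesn't become a wall of sound.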
But the music also evolves in the direction of their choices. So, this is why I'm not the artist for the piece, but here's a little graph I drew. Essentially, the music, when you come into the world, starts with this green layer, this very simple string layer. We have Orpheus singing over top of it, which comes in. And then as you make choices towards breathing life, you add layers on top; as you decide to take life away, you add layers below; and they all serve different functions for the actual composition. And the stars here are... one of the challenges that you run into with dynamic layering is that once you have a piece of music that's made up of about nine or more different layers, it can all of a sudden just turn into this massive wall of sound. And if you want the person to know that they're actually doing something or affecting the world... a lot of these layers aren't immediate, like you touch something and a huge timpani roll comes in. They tend to be very sparse, so that the gestalt of it is still a reasonably constrained piece of music. But we also have these little musical flourishes that happen, so the user knows: okay, I touched something, the music's gonna change a little bit, even if it might not be immediately apparent. So we're just gonna show a little video, just a very accelerated version of that. That's the end of our presentation. Thank you all so much. We've run over time, we were supposed to wrap up at two, but if you do have any questions about any of the things that we're up to, please feel free to come to any of us and ask away. The other thing is that part of our environment is that portal over there. You're welcome to go down and pop in and check it out. Unfortunately, if you're on a standalone Oculus Quest that's not tethered to a computer, you will not be able to get in, but it will be open all throughout the whole conference, and that should be solved in a couple of days. But thank you all so much. 
Thank you, PXR. Thank you to my amazing team for being here and sharing with us. Thank you, everyone. Thank you. Thank you. Thank you. Thanks for coming. It's amazing to see all of you. Thanks so much, Orpheus VR team. That was amazing. That was great. Look, I have gestural capabilities again. This is just a reminder that at 2:15, we will have Yelena Rachitski from Oculus Experiences here to talk with us today, and that will be in PXR Central, the place that you automatically spawn to most of the time. Be sure to take a break. I know we've been going through a lot today, and hopefully we'll see all of you in PXR Central at 2:15. Thanks so much. We're going to get started. We're going to get started. Thank you for your patience. Finish your conversations. We're going to get started. And you have all of 10 seconds, so please feel free to wrap up. We're not monsters here. All right. And there we go. Thank you, everyone. Thank you for your patience. It's wonderful to have you here in the final event of today, our keynote. Before we begin, I would like to acknowledge the support of the Canada Council through the Digital Strategies Fund; without their support, none of this would be possible. And I'd also like to acknowledge where I am. I'm in Kingston, which is the traditional land of the Haudenosaunee and the Anishinaabe Peoples, and I'm honored to be conducting this beautiful journey on the lands that are traditionally theirs. We've talked a little bit about mechanics as far as raising anticipation, because feedback is very important. So thank you all for jumping on board there. And another reminder for those of you on the live stream: please post in the Discord and we will try to get to your questions as time in the panel allows. So let's go to you. Please let me introduce Alex Doe. Alex Doe is a theater creator, director and actor. He's the associate director of Single Thread. 
He is also the former director of Theatre by the Bay, and he's also the project lead for this wonderful conference that we're all participating in. So please give him a warm clickity-click-click-click. Ha, very good. And also a fact that I need to say: he actually made this space that we're all in. So please give him a warm hand, Alex. Please take it away. Thank you. Thank you, everyone. That's very kind of you. But I have the real honor right now of introducing to you a really, really special guest that we have with us, Yelena Rachitski. Yelena is the executive producer of Experiences at Oculus. She has overseen dozens of groundbreaking narrative-driven VR projects that range from Pixar's first VR project to a whole bunch of others, and we're going to hear about those today. Before she was at Oculus, Yelena was creative producer for Future of Storytelling, which aims to change how people communicate and tell stories in the digital age. She has also helped program for the Sundance Film Festival and the Institute's New Frontier program. She spent four years in the documentary division at Participant Media and has worked on films like Food, Inc. and Waiting for Superman. Yelena is passionate about big creative ideas that are going to make this technology, and all technology, meaningful. And let's give her a big round of applause, because this is awesome that she's come. She is speaking to us from Los Angeles today. Yelena, welcome to PXR. Thank you so much for having me. Talks like these are so much easier now that I don't have to travel and have jet lag, although Vancouver is in the same time zone anyway. But I'm really, really happy to be here, and I'm really, really impressed with what you were able to do with this virtual space. It exceeded my expectations. Oh, excellent. Yeah, maybe this is the future of all conferences, Yelena. We don't have to get on an airplane every single time. Yes. 
So maybe we can begin by... we'd love to hear your story. How did you make this journey from documentary filmmaker to VR producer? Yeah. So let's see. I grew up in Los Angeles, and I was a creative in Los Angeles. And generally, when you're a creative in Los Angeles, you go towards the entertainment industry, because that's the place where you can get a job while also trying to be creative. And I first started out in the independent film and documentary world, because I really thought: how do we create change? How do we use creativity? How do I satisfy my insatiable curiosity, and all of that, which is really exciting. And it was very powerful to be part of films like Food, Inc., where you can actually see how it changed people's emotional stance on topics and how they were emotionally affected, and to be able to experience that for something you helped bring to life was really powerful. But what I also started to see was that it just felt kind of like the same cycle over and over again. People make a film. People sit in the audience. They're emotionally affected. But there really isn't that participation, which is funny, because I was working at a place called Participant Media. And at that time, about over a decade ago or so (I was at Participant for about four years), I started seeing really interesting things starting to pop up with how people were using technology to tell stories in different ways. And actually, one of my big inspirations was the National Film Board of Canada and the raw experimentation with the future of storytelling through interactive projects like Welcome to Pine Point and Bear 71, things happening on both the Vancouver side and the Quebec side of NFB Interactive. So that was really exciting, starting to see that pop up, and it introduced me to Sundance New Frontier before I started working there. 
And then I also went to Burning Man for the first time, you know, over a decade ago, where I started to see the possibility of what participatory culture could be, and true interactivity, how engaged people felt when they felt like they were part of something versus just an observer. And that really started my investigation into what that meant for me. And I remember when I left my job at Participant, I really didn't know what I was going to do. There was like a spark of something to explore, that technology and creativity had things to discover that could potentially be really impactful, truly participatory and really engaging. So off I set; my parents were not happy that I just quit a job without having something to move on to. But I had to trust that gut. And then that took me to Sundance New Frontier, which was a really exciting time when I was there, because things were just starting to come up, and I felt like a lot of the artists were still very isolated in the things that they were doing. There were people doing really incredible things like projection mapping, hacking Kinect sensors to incorporate your body, where you could move your body and feel like you were part of the projection. People were doing interesting stuff on the web, web documentaries or some interactive storytelling, especially a lot of the stuff that I was seeing from the NFB and also ARTE in France. But it was all kind of isolated in different parts of the world, and our job was to help bring that together, help inform audiences of what might be happening. So that was fun, and I loved being in that space of just discovery. I think that's what really gets me excited: following curiosity. 
Then there was the Kickstarter campaign for the Oculus Rift, which popped up, and I remember at Sundance I was like: oh, it says that it's gaming, but there's something kind of interesting here. It feels like something in this one device that doesn't have to be location-specific and feels truly immersive. Is there a use case for this beyond gaming, potentially? So I went with another person from Sundance down to one of their very first Irvine offices, and I remember I tried, I think it was called VR Cinema, where you're sitting in a virtual movie theater, basically watching a short film, a movie, and then some game, I think it was EVE: Valkyrie or something, where you're in space and shooting things, very gamey. And even the part of sitting in a movie theater, that embodied aspect of me feeling like I was present in a space, even though there was nothing really truly innovative about sitting in a movie theater watching a movie, there was a spark that there was something there that was unique. So we brought it to Sundance that year, I forget, it was maybe six or seven years ago, and before then I found that New Frontier was just this really cool space that you'd love to go to, but it was always like the weird kids in the corner. 
And then the year that we brought VR there, it was a bit of an explosion. I think it was the first time that a lot of the traditional storytellers and filmmakers tried the technology and recognized: oh, there's something here for me too. It's not just, you know, projection art on the wall or something that's like installation art. There's actually something where my narrative skills can take me, because I feel so present in the space and it affects me in a more embodied, visceral way, versus just vicariously living through a narrative, which is still very, very powerful. So then from there, that journey took me to Future of Storytelling, so I moved over to New York. I don't know how many of you are familiar with the Future of Storytelling Summit, but it was also another really awesome job. It's a summit that happens every year that thinks about how the digital age is changing the way we tell stories, but also brings thinkers from all around the world to discuss and understand what's happening: the psychology of storytelling, the technology of storytelling and so forth. And so that gave me a greater understanding of where things can go. But you know, since that first spark of putting on a headset and knowing that, like, I think this is the thing we are looking for, because you don't have to be in a specific location, and almost any kind of art or creation can fit into the space, and an audience from anywhere in the world can actually experience it, that just kept tugging at me. So from there I moved over to Oculus Story Studio; that was almost five years ago. I don't know how many of you are familiar with Oculus Story Studio, but it was founded with the question of: can you tell a story inside of VR? Because at that time, that wasn't something that we knew. You're present in a space, but you don't have all of the techniques that you have for storytelling. You don't have the cuts, you don't have the close-ups. You know, it's: what is that new language 
that we're kind of creating? And from there started that investigation, which has been really, really exciting. And going back a little bit: when I was living in New York for Future of Storytelling, one of the things that I really got excited about, and that gave me that real sparkly-eye effect, was my deep dive into immersive theater within the New York scene. And that to me was also a drive within VR, because the feeling that I got from immersive theater, of being embodied in a space and feeling like I'm part of the show, an actor just taking me on this whimsical journey and creating these moments of presence and making me feel like I was there, was so powerful. And I was like: can technology replicate that in any way? Can technology at least give some feeling of that strong human feeling that immersive theater gave me? Then She Fell, especially, was one of my big inspirations, because it was just so intimate and it was so wonderful. So when I was at Oculus Story Studio, we thought a lot about that as well in telling the story, because in the immersive theater space they really think about how you guide bodies through spaces, how you tell physical stories, how you use space and the movement through space to guide, how you use lighting and sound to help guide, because that's really the power: how do you find that balance between creating a space and a world for exploration, but also guiding your audience, or your participant, or whatever we decide to call them, on the journey that you've created for them? It's that constant give and take that I think immersive theater has been crafting so, so beautifully. So we actually worked with Third Rail Projects, who made Then She Fell, on one of the VR projects, Wolves in the Walls, which ended up winning an Emmy last year, to really learn how you craft your connection with character and connection to space using the 
techniques that have been learned throughout. So that's a long-winded answer of kind of how I came to the VR space, and then we can chat a little bit about what's happened since then. I think to say that was long-winded doesn't do it justice, because that was a great answer, and it's so cool to hear how you've had a front-row seat to this really incredible moment for VR and how your career has developed alongside it. I want to zero in on something you were talking about: the first time you tried VR, it was like a game. Could you speak to, I don't know, what people are expecting when they put on the VR headset? How is VR not just a platform for games? How is it a platform for storytelling, and how do you communicate that to people? I think games have been a very obvious fit, because in the past, games were basically created in a three-dimensional world; you just experienced it on a flat screen, but the models are 3D, you're going through various worlds. It's really just that transference of bringing it into VR. It's definitely not the same, and it's very hard for game developers to transfer their skills over into VR, but it's an obvious jump. All the other things that we're bringing into VR, we're kind of reinventing, not from scratch, but it's a much bigger reinvention than it has been for games. So I think the question is in two parts. There's the creative aspect of how you make content, non-gaming content, for VR. I think VR has an incredible potential to be for everything. Like right now, we are at a conference. We could easily jump into doing what we're calling Infinite Office, or virtual desktop, creating work and doing something that's productive. Or we can go and watch a story or watch a movie. The idea of a virtual world is that it can, in a sense, recreate reality, but in a virtual space, so the possibilities are limitless. I think the slight gap that we're going to be taking a little time to work through 
is who the audience currently is, because realistically, the people that have been buying headsets early on were generally early tech adopters, and usually those are people who love games and love new technology. So it's important for us to have the things that we know the audience actually purchasing the headsets wants, while at the same time working and experimenting and pushing forward the stuff that we think will be here in the future. The best way of showcasing to people, or convincing people, that VR is more than just games is putting headsets on them and having them try things. VR is such a show-don't-tell type of medium, and I think that's been one of the most challenging things: how do you convey the magic that VR has through a commercial or through a 2D ad? The real power is when someone actually puts on the headset and feels physically embodied in a world, and so many things are opened up to them. We have travel apps, we have storytelling apps, we have education apps. There's just so much that's there, but it's really just getting the headset on someone and having them try it so they can see it for themselves. I think another thing that's been interesting, especially for me, because I have been working a little bit more in the experimental space than the mainstream space of trying to discover what storytelling even is: I helped launch a piece this past year called The Tempest, in The Under Presents, with Tender Claws. I don't know if you've had a chance to try it, but it's been one of my passion projects for the past year, and I've been loving working with Tender Claws to bring something like that to life. And I feel very fortunate that the company's been giving me resources to be in that explorative space. But it's all kind of new, so we have to balance out understanding that the audience is looking for stuff, especially if they're paying for it, at a certain quality level, and it's very hard for a lot of the things that are coming out that are outside that, that are experimental, 
to be presentable enough for a general audience. The things that we have to deal with are like frame rate and motion sickness and so forth. So with all the stuff that we're working on outside of gaming as well, we're slowly getting to that space that is high-quality enough that the mainstream audience, versus the niche audience, starts really getting fascinated by and connected to it. Thank you, again an amazing answer. One of the things that inspired me to organize the conference in this way was that when the pandemic started, I was kind of stuck inside, and I convinced my two friends to get a VR headset, and we were throwing a frisbee around in Rec Room, and just that feeling of tossing the frisbee back and forth made me go: wow, I feel like everyone would benefit from this type of technology. And something that I was really curious about was: what was it like for you at Oculus when the pandemic started? Did you see increased interest in this technology when that happened? What was going on behind the scenes? Let us into the smoke-filled room. Definitely. So I think it would have been perfect if... well, definitely not perfect, we don't want the pandemic to hit at all, but two years from now it would have been awesome timing for VR, because a lot of the things that people really would need, like social collaboration and connection, would be much higher quality and much more useful than they are now, because this is the stuff we're just building right now. But with that said, the headsets were sold out consistently during the pandemic, and usage was definitely much higher, for obvious reasons. Unfortunately, we didn't have all the things that people needed. Like, we're getting there, but VR isn't yet going to be more productive than your computer or your phone, necessarily. But you know, there's a bunch of games that, again, people who love games can go into. But the thing that I felt to be really magical was, for instance: my brother lives in San Francisco, and I hadn't seen him since before 
the pandemic, because they're really hunkering down, and we went into Oculus Venues and watched the SpaceX NASA launch together and watched the countdown with like 100 other people cheering, and he was sitting right next to me, and it felt like we were actually hanging out and having a moment in a much more embodied way, and in a much more connected way, than we've been able to have through our phone calls or through our video chats, which has been really powerful. Then, you know, it's been fun going into Venues; we've also had live webcams, and I remember when I went in there, there was a bunch of teenagers who are homeschooled right now, and they were feeling like they were on a field trip, which is pretty exciting. So you can imagine kids or teenagers stuck in their rooms, really not seeing that many people, and then being able to virtually travel, feel like they're at the Monterey Aquarium, and actually connect and talk to other people. The other thing is live events, of course. I mean, we've been seeing the creativity rise, and I do think oftentimes constraints breed creativity. So it's not only Oculus that's been pushing hard on social features, on making VR all the things that we really, really want it to be. It's also been incredible watching the community come together and figure stuff out, just like you're doing with this conference. I'm sure this is not what you planned last year, but you were able to create this awesome space and have this community here. I probably wouldn't have been able to travel to Canada at this time, but I get to be here, and I get to connect with your community, and I get to connect with you. And so I think it's started to break some barriers that were there before. Same thing with the Burning Man Altspace event that we were just talking about before. I think my Burning Man community, they never talk about VR with me. They just love analog, love being out in a space, love tangible things. But to see the 
community just kind of come together and make incredible digital sculptures and this expansive space, and find creative solutions to bringing people together virtually, I think has just been so mind-blowing, how quickly things are happening in such a short period of time. And I would encourage everyone to check out the Burning Man space; you can see the environment in Altspace, and it's really, really neat. Yelena, we have a lot of people here who are creators themselves, particularly in the field of performance: directors, writers... yeah, I usually just say theater creators. What advice would you have for them as they embark on creating content in virtual reality? There's definitely a gap in terms of technical knowledge. What can they do? What advice do you have for them? Yeah, I'm seeing a lot of really interesting stuff pop up, and specifically I'm thinking about the immersive theater space that's happening within VR. So for instance, with Tender Claws and The Under Presents: Tempest, the Tender Claws team are digital developers. Of the two partners, Danny is a technical genius and Sam is an amazing storyteller. But they partnered with Piehole out of New York, which is an experimental theater group, and so they combined both of those talents, understanding how you create story for a digital medium, with actual experimental actors, to come together and really create something truly magical. And it was just so inspiring to watch it come to life, and to be able to see actors start using the tools: immersive theater actors who weren't very familiar with technology have now been spending like 20, 30 hours in headset, creating intimate performances and magical experiences for people, and learning how to use those tools. The actors didn't have to develop; the build that Tender Claws created for the actors allows them to have god-like features, allows them to write a journey. And they've actually even started creating followings of fans who have loved individual actors 
and have been following them on Instagram. So it's really fascinating to see this new kind of actor, performance, and fan base start coming to life through this. Apart from that, VR can definitely be intimidating. Creating something good within the limitations of the technology is really hard, and that's not something to skirt around; it's just a really hard thing to do well. There are starting to be more developers making good stuff, but it's taken time. We've been at this for, what, five years now, and we're just starting to see awesome work get made; not everything's been figured out yet. With that, some of the other notable things I've been witnessing include the incredible stuff popping out of, for instance, VRChat. There was a project at Venice VR by Kiira Benzin and her team, I'm forgetting what the exact name of the piece was, but they won a prize at the Venice VR festival for their immersive theater show created within VRChat. So it's about partnering: if you have no technical knowledge, it's finding the person who knows how to make stuff and co-developing it together, because I think there are a lot of technical people who are really interested in making something unique and creative. Same thing for Neos VR, which is another metaverse world. There was another project made there called The MetaMovie, I believe, which is also an interactive experience, and they used a lot of the creator tools that Neos made to build the show. The other thing is Horizon, which I know is still in its beta build; I spent a few minutes in there again before I came in here. Once that starts becoming more accessible, the creator tools in there are much easier than coding. They're definitely more limited than what you can do with Unity to create spaces like this, but I can see that really, really growing, and I can see it starting to be a space for more performances to start
taking shape. So you find the partner who helps build the world, if it's not something you're comfortable with. It's not that dissimilar from being a director, where you have to find the camera person who knows all the technical aspects of how to shoot, or being a theater maker, where you have to find your lighting technician and the person who knows the stage and the music. It's just a new type of technical partner you'd have to work with to create the best thing you can. The other thing you can always do in VR, to really learn how to make something, is just try as much stuff as possible. It's really, really hard to think, "oh, this would be really cool in VR," because once you're in it, it's generally about how something feels to you: why something works and why something doesn't, when you find connection with a character, how you create something social where people can understand what's going on and have positive behavior with each other, when you're moved by some design, and what didn't work for you. Through really trying and experimenting, and not just with theater stuff, we have so much to learn from interactivity and all the different social spaces that are out there. It's learning for yourself, because I think everyone's theater work is personal to them, to their story of what they're trying to create with the audience. Really understand why something works for you and why it doesn't, let things start taking shape from there, and then find your technical partners to help start making it happen in some way. Yelena, we started a bit late, so I was wondering if it was all right with you and with our attendees if we extended by 10 minutes. Is that okay? As long as my wifi holds. Do you still hear me okay? I was having sound issues before. I hear you so clearly, it's night and day; I don't think I've ever heard someone speak so clearly in VR. We were cutting it close before, my audio wasn't working at all, so I'm really happy it worked out.
And I was told, while you were speaking, that the title of that piece from Venice is Finding Pandora X. Yes, thank you, thank you. So on that note, I think let's open it up to the crowd for questions. This is a brand new feature that we were figuring out this morning, so let's see how we do. Liam is going to manage our questions, so let's see how we do here. Okay, we have a question from Michael Wheeler. Michael Wheeler, you're live in VR. Hey Michael. I can't hear you, can you hear me? I like your mohawk. Oh, thank you so much, it's a bit aspirational, but it's doing well here, thank you. It was really awesome to hear everything you had to say. I did Tempest; I'm mostly a theater person who's kind of arriving in VR this year, I guess, and I find it a really fundamental shift in terms of what I understood could be possible in a theatrical context. I'm just curious: the main performer, I felt like they were more photorealistic than the avatars that we're wearing here, and I'm wondering what the difference is in how performers in that piece are captured as opposed to the audience. So the question is the performers in Tempest versus the audience in Tempest? Yes, my experience was that the performer in that piece was more, I would call it, photorealistic, and I'm wondering if they're captured in a different way or if they're also just using a Quest. Yeah, honestly, they're just an avatar, just like we are right now. All the actors are wearing are just Quests and controllers, and through IK they're able to predict where the feet are moving and so forth, but there are definitely some limitations. It's interesting that you say that, because I actually find the avatars to be highly, highly stylized: you don't see their mouths moving, you don't really see any facial expressions, versus here, where you see our mouths moving and the eyes kind of drift a little bit. You don't see that. So the audiences in
the Tempest, you know, it takes a bit of a playbook from something like Sleep No More, where everyone's a bit anonymous and you wear a mask, so everyone kind of looks the same. But they're made in exactly the same way; they're made in Unity, they're just styled very differently. What I think is interesting there is that because the actors are so expressive, your imagination fills in the gaps to make it feel more realistic to you. And I think that's one of the things to think about for VR. We're not there yet where we can have fully realistic avatars that capture exactly what your facial expressions are, or where you're looking, or how you smile, and as humans we are so wired to read facial expressions. So I actually find that when you try to put in too much face and you don't get the exact reads, you start getting into uncanny valley territory, but if you leave a bit of a gap and keep it somewhat stylized, audiences start filling in the gaps with their imagination based on what they're hearing and the gestures from the actors. So it's interesting that you say photorealistic, because to me they're just highly, highly stylized, but made in exactly the same way. Cool, thanks for that answer. I would just add that my performer was extremely talented and a very good improviser, and for me that was part of why it worked as well; I can't imagine a subpar performer pulling that role off. Yeah, I've been impressed. The team gave me the actor build, and I remember I tried to go into The Under Presents once as an actor. What's funny is, as an actor, they have a virtual dressing room that you first start in, and in the virtual dressing room you can pick what avatar you look like, where you want to spawn from, how tall you want to be, things like that. So I did, and I went into The Under to do a little one-on-one, and I got super intimidated. I was like, someone please like me! Do you want
me to do a magic trick? For you, that's amazing. So the ability of these immersive theater actors to translate their skills into VR so quickly, and to still create such a magical performance even though you can't hear the audience, is a testament to just how talented they are. And what people have said is that every time they've gone to a different performance of the Tempest, it's always been slightly different. So even though there's a light script, the improvisation they're able to do is pretty incredible, and a lot of them have taken on and expanded their characters and built them up beyond their initial roles. They've kind of taken it into their own hands and created a life of their own with their characters, which has been super fun. On that note, I've been told by someone in the audience that we have one of the actors from The Under here, and maybe Deirdre will join us at the Pennyfather Tavern afterwards, where we can ask questions about The Under, which everyone should download. You can download The Under Presents on your Quest, and then you can get a ticket to see the Tempest. Well, with your timing, actually, sadly... I know, sorry guys, it just ended September 30th. Oh! But The Under Presents, the main experience, has actors in there until the end of the year; you just never know if you'll get an actor experience. The Tempest itself we might bring back, we have to see, but this first run is finished. Our loss if we missed it. And Liam, do we have time for another question? We definitely do. And if I may, everyone, just a reminder: the button to hit if you'd like to ask a question isn't the hand in the normal emoji menu, it's in the bottom right corner, just to clarify. We've actually got a question from the Discord from someone who's live streaming right now, and we're going to go to that question, but before we do, Jenny Herb has a question for us. Jenny, could
you give us a wave? Hey, sorry, I think that's a mistake; I was just using the raise hand back when the presentation started. Oh, no problem, thank you. Okay, so let's pretend this never happened and it's all not real, and let's go to Aiden, who has a question from the Discord. Great. Our question from the Discord is: you mentioned Oculus, as a company, has a role in guiding people. How do you plan to guide for social cohesion and diversity in VR? Yeah, that's a great question. A big part of so many of our conversations, especially in the social VR space right now, is how do you make people feel comfortable, given that VR isn't quite real life, there is sometimes some level of anonymity, and the internet, although it started off as a utopian vision, has a lot of dark spaces. So many of the tools that we think about, and that the social teams really think about, are about how you create a sense of safety, how you make everyone feel welcome, how you make sure people have the tools to report others who make them uncomfortable. We even have live moderators making sure that the spaces have no abuse in them and everyone is completely, completely comfortable. It's definitely a hard problem to solve, but it's something that is thought about, and I think that goes into all parts of the experience. Even the design of an experience shifts people's behavior; in Horizon, I think the avatars are friendly and the space is also a lot brighter. I gave a talk at Oculus Connect a couple of years ago called The Hierarchy of Being: Embodying Your Virtual Self, with my colleague Isabel, about how even when you embody an avatar, it actually shifts the way you behave. The objects you pick up shift the way you behave; the space you're in shifts the way you behave. So it's a combination of both design tools and the ability for all users to protect themselves
in some way. In addition, the conversation of diversity is one that's close to my heart and that I talk a lot about. In order for VR to be successful, it has to cater to and be there for a broad range of people, or else it's just never going to get big. So it has to be a space that's welcoming for people, but also created by a diverse group of people. And, you know, it's hard; I think people generally go to what they're most comfortable with, developers that are very high quality, but we have a bunch of different programs. One of them is called Launch Pad, a program where every year we bring in about 100 or so people to ensure that we're supporting diverse creators and diverse developers. This year they added a section for Horizon world building, for instance, in addition to games and storytelling. On my team, one thing we really look at, and are really trying to figure out the best way to do, is how we ensure that we support and fund a diverse range of people, and keep an eye on that. So I'm not saying we've solved everything in any way, and we have a long way to go, but I know for my team and for many teams it's really top of mind. Yeah, and I've certainly seen that in the VR for Good projects that you've undertaken; some of that stuff in building empathy and allowing people to jump inside particular types of experience has been really cool. We have time for one more question. Liam, do you want to connect us to it?
Yes, Jason has a question. Jason, you're live. Thank you, I'm really enjoying everything I'm hearing today. I wanted to ask, and maybe it's my filmmaker background, but with your theater shows, even immersive theater in VR, has there been thought put toward archiving and recording a particular performance so that it can live on when, say, that actor isn't there anymore? Yeah, that's a great question. I haven't, but for instance with the Tempest in The Under Presents, Samantha Gorman, who's the co-founder of Tender Claws, just got her PhD in, basically, digital storytelling, so she was going in and capturing different things as part of her dissertation. Hopefully she's got a lot of that saved. Something I do think we need better tools for is higher-quality in-headset capture. I don't think our ability to record from inside VR is a super great experience, and it generally doesn't look great when we post it online. We think we have this amazing magical experience, we post it online like, look how cool this was, and all that people who don't have VR see is something weird and jangly-looking. Where are your arms?
But no, it was not like that! So yeah, I think that's a great thing to think about. With things moving so quickly, and thinking about the pivotal things that have shifted the course of a lot of these art forms, sometimes you don't know that something's pivotal until, like, 10 years later, when you remember the thing and you're like, find me video footage so we can use it in our classes. It's a good question, and something I'll take into consideration as we're making more stuff, but specifically with The Under, I think Samantha might have some material; I'll check in with her. Amazing. Yelena, we're at time. I want to thank you so much for taking time out of your schedule to come here and be with us; it's been so great to listen to all your thoughts on this. This was Yelena's idea, to do this sort of format as a Q&A, and I think it worked super well. Yeah, I'm taking a picture. There we go, cool; just took a picture, everyone. Thank you so much. This is a favorite topic of mine. I feel like all these different things are starting to pop up around the theater space, and I do think, like I mentioned, COVID forces a different kind of creativity. Really, I've just been more of a supporter of it; the magic that all of you make in your community is what's going to continue pushing this and making all of this better. So thank you for continuing the art forms, making awesome stuff, and having conversations about it. This was really fun to be part of. We appreciate you saying that; it's definitely a tough time for the live space right now. Yeah, I can imagine. Maybe let's conclude our day. Yes, thank you everyone. Another round of emojis for Alex and Yelena, if you wouldn't mind; I think they deserve it. It's so good to have you here, and thank you, all of you, for joining us in this grand experiment and building this together. Afterwards, I think we are
going to go to the Pennyfather, if we have time, so please feel free to come to the tavern with us. Also, during the next week the Discord will be live. I don't know if this has been universally spread, but there are Easter eggs in hidden worlds all over the space, and it will stay open, so we would encourage you to check those spaces out; it's a lot of fun. Also, as you've probably realized, we have some amazing individuals here exploring this with us this week, and some of them have built some amazing worlds. Two-Spirited Trickster Raven, sorry, I'm just going to call it out right now, was generous enough to share something that she built in the Discord. I would encourage you to check that out, and I would encourage you to build and share on the Discord; let's keep that active. Thank you so much for a great weekend. We, the PXR 2020 team, are so proud of this, so proud to be exploring this with you. Thank you very much; please stay in touch until we see you in Altspace next week. Thanks so much, everyone. Do well, explore.