Hello everyone. My name is Jules Urbach, and I'm the founder of a company called OTOY. Some of you may know us from the software we create: OctaneRender, one of the earliest GPU renderers, which has had a Blender integration for, I think, over 10 years now. We're also known for Light Stage, a digital-double scanning service, and more recently the Render Network, a distributed GPU cloud service that runs on individual users' machines.

When Ton asked me to come speak, I was pretty excited. We're focused on similar goals: we want to democratize content creation. OTOY's focus in particular has been on democratizing holographic and spatial tools and rendering, and that leads pretty nicely into the talk I want to give today, which is called "Towards the Star Trek Holodeck."

Show of hands: how many of you are Star Trek fans? Wow, that's a great amount. Good. So you love Blender, you love Star Trek; this talk is for you.

The Star Trek holodeck, for those who may not be familiar, was a room on the Starship Enterprise. It was introduced in 1987 in the pilot episode of Star Trek: The Next Generation, and interestingly enough it was also featured, almost 20 years later, in the very last episode of Star Trek: Enterprise. The holodeck was a room in which you could experience almost any virtual reality simulation, but it also played an interesting role in the story of Star Trek, because it turns out that the show you had just been watching, at least the last few years of it, may have all been on the holodeck: Will Riker is looking back at the history of that show from a future Enterprise. So the idea that got stuck in fans' heads was: is the show that we're watching, Star Trek, actually just playing out in a simulation on a further Enterprise in the future? Is it Enterprises all the way down? It's one of those mind twisters that's been an interesting part of both Star Trek and even our
literal lives today, when people talk about simulation theory: are we living in a simulation? Is the universe running in a simulation? The most recent episode of Star Trek, believe it or not, was called "Holograms All the Way Down." It was released just a few weeks ago, touching on this theme.

At OTOY we've been working on projects related to Star Trek, which I'll be showing. You can see some of our work here: on one side you'll see the show, and on the other side you'll see our render, and they're pretty close. In fact, it makes you question what you're seeing. These were, of course, shows that were filmed before CG was this good. Is what you're seeing real or rendered? We have this same issue with AI, where we're not sure whether something we're seeing was created by a human or by a machine, and it kind of goes beyond rendering.

But I want to focus, as an aside, on the technology that might make something like the holodeck possible. It looks like it's closer than the 200 or so years out that it was set in the Star Trek timeline. There's a company called Light Field Lab that we partnered with that's actually building the real-life Star Trek holodeck. They were inspired, just like me, to go and see if that's something that could be created, and they're working on the display technology. It's a couple of years out, and it's going to be in location-based entertainment first, theme parks, things like that. But just like the giant 4K TVs that were a hundred and fifty thousand dollars and are now hundreds of dollars, this will go down in cost significantly. The beauty of these holographic panels is that you'll be able to have all these incredible experiences without wearing any VR glasses. We've been building tools to create content for those things; we're using an iPad, and we have projectors with tracking. But if you actually go see the display up in Northern California at their labs, it's incredible.
I mean, I've seen it firsthand. Hopefully the public will be able to see it at large soon, but these panels are tileable, like the Samsung video wall: the larger the surface area, the more tiles you make, the larger the hologram can push in and out. And if you have something like 15 million dollars, you can put this on the ceiling, curve it around you, and actually build the holodeck. It's expensive, but it could be done.

It's hard to capture how good this looks in a 2D video, but here it is. This is basically the holographic panel: as you move your eyes, if you focus on it, it looks real. It's very different from just stereo or autostereo. It's truly this sort of magical endpoint, and it's very expensive to render for. These panels are something like a hundred thousand pixels by a hundred thousand pixels per meter, so it's an insane resolution, and it's challenging to create content that will render on those devices and also run in real time. But we're working on that.

Going back to Star Trek and our work related to it: I was actually thrilled to find out that our renderer and our software were used to re-render my favorite movie, Star Trek: The Motion Picture from '79. Not only that, it was also rendered on the Render Network. It was actually those frames that were put into the remastered version that went out in theaters and on Blu-ray in '21, and it was done on average users' machines, which is pretty cool.

A couple of years after that, we did another project. This was featured in the Apple keynote, twice actually, and I'll be talking about this project a lot more later in the talk. But I want to show you a video that was all done by us in 2022, and I'll play it now.

[video plays]

That was pretty cool, yeah. Thank you. Now, I'm very pleased to say that this wouldn't exist without Blender. In fact, a huge number of the artists working on this project used Blender and have been using it for a while.
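As an aside on the numbers mentioned a moment ago: at roughly 100,000 by 100,000 pixels per meter, the render budget for these panels is easy to underestimate. A back-of-the-envelope sketch (the 60 Hz refresh and 24-bit color below are illustrative assumptions, not Light Field Lab specifications):

```python
# Back-of-the-envelope render budget for a tileable holographic panel,
# using the ~100,000 x 100,000 pixels-per-meter figure from the talk.
# Refresh rate and color depth are illustrative assumptions.

PIXELS_PER_METER = 100_000

def panel_budget(width_m: float, height_m: float,
                 bytes_per_pixel: int = 3, fps: int = 60):
    """Return (total pixels, raw uncompressed bytes per second)."""
    pixels = int(width_m * PIXELS_PER_METER) * int(height_m * PIXELS_PER_METER)
    return pixels, pixels * bytes_per_pixel * fps

pixels, rate = panel_budget(1.0, 1.0)
print(f"1 m x 1 m panel: {pixels / 1e9:.0f} gigapixels")
print(f"uncompressed at 60 fps: {rate / 1e12:.1f} TB/s")
```

Even a single square-meter tile lands at ten gigapixels, terabytes per second uncompressed, which is why rendering for these displays in real time is such a hard problem.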
I actually have a bunch of screenshots that they prepared for the Blender Conference. They're all pretty excited to show how Blender is being used in full productions, and it's being used to great effect. There are I don't know how many years of assets we've been building for this project, and Blender has been really instrumental and essential in making it happen. A lot of Star Trek used to be done in a tool called LightWave; that fell out of favor, and Blender really took over for a lot of these artists, and we've brought a lot of them onto the team. And it's not just Star Trek: we're actually doing other properties and IP in Blender as well, and it looks fantastic. It's kind of incredible to see the trajectory that Blender has taken, and to grow with that as well.

We've been building a lot of tools and technology around Blender. Obviously we have Octane as a renderer in there; Cycles is fantastic, and we love all renderers. But there are things we've been trying to figure out, like: how do you plug Blender into other pipelines and other tools? We do have other artists and other assets in the production pipeline, in Cinema 4D and Unreal Engine, and so we've been building tools, for example like this one, that take everything that's happening in Blender as a scene graph and run it into another process. You can have live linking, and it's super fast. We can take something, bring it into Unreal Engine, and share those materials, share that same pipeline, obviously as we look towards standards like USD and MaterialX.
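As a rough illustration of the live-linking idea just described (the function name and JSON wire format here are hypothetical, not OTOY's actual protocol): the core trick is to flatten the scene graph to a simple structure and ship only the nodes that changed since the last sync, so the other process can mirror the scene cheaply.

```python
# Hypothetical sketch of live-linking a scene graph between two processes:
# flatten the scene to a dict of nodes, then send only what changed since
# the previous sync. Names and format are illustrative, not OTOY's protocol.

import json

def snapshot_diff(prev: dict, current: dict) -> dict:
    """Return only the nodes that were added or changed since `prev`."""
    return {name: node for name, node in current.items()
            if prev.get(name) != node}

# Frame 1: full scene. Frame 2: only the ship moved.
frame1 = {"ship": {"translate": [0, 0, 0]}, "camera": {"translate": [0, 5, 10]}}
frame2 = {"ship": {"translate": [1, 0, 0]}, "camera": {"translate": [0, 5, 10]}}

full_sync = json.dumps(snapshot_diff({}, frame1))      # first sync: everything
delta     = json.dumps(snapshot_diff(frame1, frame2))  # next sync: just the ship
print(delta)
```

On a large production scene, sending deltas instead of full snapshots is what keeps a link like this fast: a single camera tweak costs one node on the wire, not the whole ship.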
USD and MaterialX are being adopted in Blender, and those might help, but in the meantime we've been developing these tools and this set of pipelines, really to allow us to go from modeling in Blender, to rendering in Octane, to doing virtual production on an LED wall in Unreal Engine. It's definitely been very helpful to us.

But we've also been looking towards the future. There are standards emerging that are going to be great for making this pretty much a commodity, and I think the Hydra render delegate system is one such piece. It's being released in Blender 4, which is amazing. We have a render delegate ready to go, and even within Octane we have the ability to load other render delegates, and we've been playing with that: for example, taking two render delegates, Storm and Octane, or Cycles and Octane, doing a shader in one, and compositing them together. Obviously Blender has a great GPU compositor; there are a lot of really cool things you can do with technology like that.

Open standards are really important, not just for materials and meshes and data and APIs, but also just for collecting the works of artists themselves. How do you organize these things?
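To make the two-delegate compositing idea above concrete, here is a minimal sketch of the Porter-Duff "over" operator that such a composite ultimately reduces to. Plain Python lists stand in for GPU image buffers, and the layer names are illustrative; a real pipeline would do this per-AOV on the GPU.

```python
# Conceptual sketch of compositing the output of two render delegates,
# e.g. a beauty pass from one renderer over a viewport pass from another.
# Each "image" is a list of premultiplied RGBA pixels.

def over(fg, bg):
    """Porter-Duff 'over': premultiplied foreground composited onto background."""
    out = []
    for (fr, fg_g, fb, fa), (br, bg_g, bb, ba) in zip(fg, bg):
        k = 1.0 - fa  # how much of the background shows through
        out.append((fr + br * k, fg_g + bg_g * k, fb + bb * k, fa + ba * k))
    return out

# One pixel from each delegate: half-transparent red over opaque blue.
octane_layer = [(0.5, 0.0, 0.0, 0.5)]   # premultiplied RGBA
storm_layer  = [(0.0, 0.0, 1.0, 1.0)]
print(over(octane_layer, storm_layer))
```

The same operator applies whether the layers come from Storm, Octane, Cycles, or Blender's GPU compositor; as long as both delegates agree on premultiplied alpha, their outputs stack cleanly.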
And I think that's becoming especially important when you look at a lot of the major players that are trying to build, call it the metaverse or spatial computing. A lot of people don't necessarily want one company or one format to control something as important as the spatial web, and so I'm a strong believer in having open standards and open systems for this. We've joined a number of organizations, including the Metaverse Standards Forum and the Khronos Group, and even going back as far as MPEG, which I started with back in 2017, to try to foster open standards that would make sure there's no piece of the asset pipeline that isn't open sourced and available, because bit rot and these things are really a problem, and in fact they get in the way of some of the pipeline issues we were trying to solve.

We have in fact created a couple of open standards. One of them is called ITMF, which extends things like USD and MaterialX with the last missing pieces we need to be able to do something like this: to send data between Blender and 26 other DCC tools and render it on the cloud without having to worry about the DCC being there. All of those things become really important at scale, especially on the Render Network, which is built on running pretty much in the background on people's machines. We wouldn't be able to have a million GPUs in the cloud without that capability. So it's not like running on AWS or in a data center: the power, the speed, and the cost reductions you get from having a decentralized GPU network matter, and it does help not to have to pull 30 DCC tools into a render job. But of course, in the case of Blender, it's open source, so we can integrate the Blender pipeline into the mix, and that's been a really strong positive point for our artists and our developers.

There's also the work we're trying to do to organize individual artists and IP and all these things, and if you looked at the last talk, I think the idea of being
able to have provenance over your artwork and your creations, whether you're coming from the digital world or the physical world, is incredibly important. Beeple, who is a well-known artist, digital and physical, has been a friend of mine, and we've been taking his thousands of pieces of artwork and putting them into a system that he can basically have provenance over. Not an NFT, but something that just says: I rendered this, I created this, it was rendered on the Render Network, every asset and every texture is hashed, and I can prove it was mine. And I can prove that if somebody runs an AI learning job on it, this data came from this render.

In the case of Alex Ross, another wonderful artist and a very good friend of mine: he is purely a gouache painter, he uses watercolors, he doesn't do any digital work. But we've been taking his paintings, which are beautiful, and he's probably best known for doing tons of Marvel and DC work over the years, and we've been turning those into 3D assets that have been going into a similar archive. He's actually pretty interested in seeing that digital world being created around his physical pieces. In fact, the amount of work that he's done is so expansive that it really does seem to cover almost most of pop culture from the late 20th and 21st centuries, and if you look at this tapestry, you can almost see the seeds of what could be the metaverse, something that comes out of Snow Crash, Neal Stephenson's book, or Ready Player One. And in fact Neal Stephenson, who I've had the pleasure of getting to know recently, is also pretty intent on having a very open, standards-based metaverse.

Going back to Beeple's work for a minute: a lot of his forward-looking work is really going back to physical pieces, even though he's most well known for being a digital artist, and so some of his newer pieces are really cool. I mean, they're physical pieces: you
install them in a museum, or you buy them in this form, and they're pre-rendered, on LEDs that are meant to fool you into thinking you're looking into the piece. But with the holographic display panels from Light Field Lab that are coming out in a couple of years, you'll be able to have a piece from him that looks like it's been physically created, but it'll be digital and rendered. This is the kind of stuff I find fascinating, as the futures of these technologies all converge in really interesting ways.

The third archive project that I want to talk about, and the one I'll be spending the rest of this talk discussing, is the Gene Roddenberry Archive. This one is of enormous personal interest to me: my best friend's dad was Gene Roddenberry. Rod Roddenberry, his son, invested in OTOY and created an endowment that started this project. Initially the Roddenberry Archive was really just about scanning documents, scanning the millions of pages that Gene Roddenberry had written. Obviously he's best known for creating Star Trek, but there are many other sci-fi stories and things that he put out there over the years. The archive work we've done since 2021 has focused not just on the written word but also on the visual aspect of Star Trek, basically taking everything, because it's a visual medium, right? It's not just the printed page. How do you preserve the work of all the people that created the sets, the designs
for this show? We're fortunate to have a lot of those people still around. Mike and Denise Okuda, who wrote the Star Trek Encyclopedia and who curated the 11-foot Enterprise model that's in the Smithsonian, all of that is part of this archive. And it's more than just scanning in data and models; it's also figuring out how all these pieces connect to tell the story of Star Trek, both in the fictional universe of Star Trek and in the production of its creation. So in that video I just played, you can see an actual piece of the show, and you can then flip into Stage 9 where it was filmed, or the lot at Paramount or Desilu.

One of the other aspects, though, on the flip side of that, is that you want to have the thousand-foot version of the Enterprise, not the 8-foot or the 11-foot model that was filmed, and that was one of the very first pieces we took on in this project. It's been a dream of many people, even before me, to see that Enterprise made life-size. These are some renders from a proposed '90s Vegas hotel that was going to be a life-size Enterprise, but this is our version of it. We're about 30% of the way through it. You can see the inside of the ship; every room is detailed. You see Kirk's coffee mug, it's all there, and it's beautiful. And we're also working on the world around Star Trek. It's not just the ship.
It's what's outside of it: Earth, the solar system, the galaxy as it's been defined in that show. If you're a fan of Star Trek, this is pretty cool. It really is something where this data exists in a form, regardless of how it's rendered and what the medium is, that allows you to see the show and the world in a way I think is pretty novel and unique. It's something that's really been driven out of passion and love, but it's also gotten a lot of attention since we started putting out these renders and these pieces. We've gone as far as to recreate the entire ship as accurately as we can, the Starship Enterprise, even the leather and the '70s shag carpet that was in the recreation deck in '79. All of that's there, and when you take these renders and put them up next to the actual footage or film, it looks pretty good.

You can also create interactive experiences where you can go inside the Enterprise and look at the buttons, push the levers, all of it. So it's not just a static scene. That's an important aspect also of trying to preserve how these sets and how that world worked. And while we have the folks that can help us basically create the provenance for this, we're trying to build all of it, and it's a pretty exciting project.

Again, my favorite Star Trek movie is the '79 film, so this is that bridge, and you can go inside of it; the audio is there, every panel is working. This is the in-universe version, so there's no plywood, there are no cameras, no set lights. This is what it would look like if you were on the ship operating it as a crew member. And there's more than just this first movie, of course.
There are 12 movies that followed it, so Star Trek II, Star Trek III, we've built all of those versions of the Enterprise, and again we've done our best to make sure each is exactly like it was in the film and in the production. In the case of the movie Enterprise, we happen to have the blueprints for the whole ship; it's one of the only versions of the Enterprise in those six movies that has that. And there are a lot of other Enterprises. We're building 3D models of all the concept versions that were created before they went into production. And then you have the timeline of this ship: it's about 40 years, from 2245 when it was launched to 2285 when it blew up in Star Trek III. If you take all these versions and play them back, it's almost like a time-lapse of history; these are all the renders of the ships, sort of overlaid on each other through time. It's pretty cool. And so the experience you can have is almost going through time and going through space to see the world of Star Trek and the story itself play out.

As it turns out, there's not just one Enterprise; there are many ships that preceded it and followed it. The most famous one, of course, is from the TV show, but there are about 13 or 14 others. One of the more famous ones is from the '90s, from Star Trek: The Next Generation: the Enterprise-D, Captain Picard's ship. And again, we've done our best to make it absolutely perfect.
There's a follow-up ship, the Enterprise-E from the 1996 movie, and again, the detail is pretty exquisite. So there's another aspect to this, which is that if you build sets of this quality, sure, you can give people an experience that allows them to go in there and explore, and have it saved for posterity, but you can also film in those sets in virtual production as well. And we've gone to the trouble: even Picard's fish from his office is in this world, recreated perfectly.

But there's a lot more to this than just these fun pieces. There's also documenting the people that worked on the show and that acted in the show. So we had William Shatner come in. He sat in the 1979 Starship Enterprise, the one I just showed you, in a virtual production facility on an LED wall, and he loved it. He was super excited, and his interview was just amazing. He talked about how he wanted Kirk's story to end, and how it didn't go that way. And so we have all this amazing material, these interviews, this behind-the-scenes footage, and of course a lot of beautiful renders, covering effectively 60 years of Star Trek history, 800 hours of the show, everything from the J.J.
Abrams movies to the original 1964 pilot that wasn't aired.

Back in April, we put out an interactive experience on the web. It was online for three weeks, just an experiment, and people loved it. It got a lot of attention; the Smithsonian, of all places, actually wrote an article about it. This was the early interface we had for the web portal, but you can see there's a timeline, and you can look at all the different ships. The Okudas, who wrote the Star Trek Encyclopedia, helped us create the text and the information that's there. Majel Barrett-Roddenberry, my friend's mother, who was the voice of the Starship Enterprise, recorded all of her phonemes, all of her dialogue, right before she passed, so that one day we could bring her voice back. Fifteen years later, we did, and so that's part of the experience as well. And there are about 20 other alternate timelines and things like that that have also been created for this experience. It's pretty amazing.

Back in April we also put out another video. It was again a concept video, no dialogue, just meant to serve as an interstitial after William Shatner's interview, and it was unlisted on YouTube. It got a million-something views, and people really loved it. They went crazy, and it got much more attention than we ever expected. I'm going to play that one next. It's about two minutes. Here it goes.

[video plays]

That was a lot of fun. How am I doing on time? I'm going to make sure I don't go over here. Just briefly, since I know we're already short on time: the work we're doing also involves scanning actual assets, including the set from Star Trek: Picard season 3, the Enterprise-D. That was me on set; our team scanned it in, preserving it. The same team also built these props and these assets. It's an amazing project. You can see this on otoy.com on the blog, and we're hoping to have a lot more put out there in the months to come.
I do want to show a couple of behind-the-scenes pieces from Star Trek: The Cage, the very first pilot episode. We brought in the director, and it was amazing to have him on set and to show him, basically, the world of 1964 brought back to life in this really vivid way. I'm going to skip ahead a little bit toward the end of the presentation here. You can see how we created Spock. Basically, this is the actor Lawrence Selleck playing Spock. There's literally no digital work on him in what you're seeing here; he looks so much like Leonard Nimoy with just prosthetics that it was an incredible match. We did use some digital techniques on his face, which you can see here live as we're filming, to get him to look almost perfectly like Nimoy. We would also scan the actors in with a 4D capture system so we could relight them later, which was pretty useful, and we experimented with NeRFs, green screen, and CG. It turns out that CG scans, and being able to bring the actors in costume through a rendering process, looked to be the highest fidelity.

As we've explored virtual production even further, we've scanned in a lot of physical props, and we've de-aged and aged Lawrence, our actor playing Spock, on set. It was very challenging to do that last scene, because he's transitioning from an older Spock to a younger Spock. This is the virtual capture system we were using, and this is me helping the director set that shot up. And as you look at these beautiful renders, where this is all heading, before you get to the holodeck, is a pair of really beautiful mixed reality glasses, which are being put out by Apple next year. That's something I want to end this talk on: showing our roadmap for the Vision Pro. It's definitely challenging.
We're doing renders at 16K by 16K, or 48K in the case of stereo cubic panoramas, and that's just for a simple panorama. For something you need to actually move through and walk into, you have to render a light field, which is a volumetric render job that is part of our pipeline. You can see here it's running in the simulator. It actually allows you to explore these worlds at final render quality on the Apple headset, and that's pretty remarkable. We've been trying to work towards this technology for a while, and it's finally here.

On the playback side, we explored what works. WebXR does work on the Vision Pro, and HTML5 has a model element, but we are building a native viewer for the light fields, among other things, and we're hoping to have that out sometime next year when the device launches. And in mixed reality, the quality of some of these assets, which you couldn't really appreciate on the web page, is awesome. You can see here the Enterprise-J, the Enterprise hundreds of years in the future, which has a whole world, a whole city, inside of it. In mixed reality, when you take a close look at that asset, you can see all of it, and it's actually incredible. It makes you really reimagine how content can be consumed when you have this kind of fidelity and quality. And so we see the future of rendering and content creation as being pretty awesome and really exciting.

So thank you all. It's been a pleasure presenting this today, and I look forward to talking to some of you after I wrap up here. Thank you all.