Hello! So, after some technical problems, we are finally ready. We are BlendFX: my name is Sebastian, this is Simeon. Briefly about us: we work in Leipzig, in a small office, where we do all kinds of things. We work as a team, also with other people who have their office in the same room, for example Mikawa — you might have heard of them, because together we sponsored the Auto Node Offset feature by the great person in the middle of this image, Julian "Severin" Eisel. Thank you, Julian! And especially since last year we have worked together on a few more projects. What we do in general is everything you can do with Blender: VFX, visualizations, animations, cinema and TV work. And especially since last year, we also do interactive stuff. At the beginning, about a year ago, Mikawa asked me, "Can you do some interactive stuff?", and I said, okay, let's try. Unity is a well-known engine to begin with, and we made two little games, Qtress and Everest, for customers, in the special style they asked for. Getting into interactive work, we also noticed the Oculus Rift campaign, and we ordered a DK2 just so we would have the possibility to try it in a game engine. Then the package arrived and we unpacked it — and no, we are not paid by Oculus, all VR devices are great. That's our story of how we got in touch with VR one year ago. We put it on our heads, everybody in our office was amazed, and we thought we should establish ourselves in the VR world. So, talking about desktop headsets like the Oculus Rift: the performance is great because of the PC behind it, and the positional tracking also feels very good. But after the first weeks we noticed it's also very clunky — there are a lot of cables, and you have to strap the person in next to a PC.
One year ago the consumer version wasn't out yet. It also means that if you want to give an experience to twenty people, maybe at a venue, in a building, you need twenty PCs and twenty VR devices. So it was difficult for us to think of a use case — we are not game designers, we want to deliver an experience — and all in all it's expensive, because every device needs its own PC. So we thought: that's not our VR solution, let's look elsewhere. There are also the mobile headsets, and we bought the Google Cardboard, the Gear VR, the Durovis Dive — we have about ten different mobile devices. Looking at them: they have limited performance and no positional tracking yet, but they are mobile, they are available, and they are not very expensive. Also, with mobile VR it's easier to publish — you can use the Play Store or the iOS App Store. All in all it's a low entry barrier, so as a small studio we said: that's our way, we should do mobile VR. Now the next question: what should we do in there? Considering the hardware limitations of mobile, of smartphones, you could of course do film, because everything is pre-rendered and you just need to play it back — so it's easy, right? Well, the render times for VR film are horrible. No, really, they are horrible: it needs to be 4K to get a detailed experience, it has to be completely noise-free because render noise totally kills the VR experience, and ideally you don't render 24 frames per second but 60, so the experience is fluent and non-stuttering. And then you double all of that, because it needs to be stereoscopic.
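The constraints above multiply quickly. A back-of-the-envelope sketch — the ten-minutes-per-frame render cost is a hypothetical figure for a clean 4K path-traced frame, not a number from the talk:

```python
# Rough numbers for stereoscopic VR film: 60 fps, two eyes.
FPS = 60
EYES = 2
MINUTES_PER_FRAME = 10  # assumed average render time per noise-free 4K frame


def frames_for(seconds):
    """Number of individual images to render for a clip of this length."""
    return seconds * FPS * EYES


def render_days_for(seconds):
    """Total render time in days at the assumed per-frame cost."""
    return frames_for(seconds) * MINUTES_PER_FRAME / 60 / 24


print(frames_for(60))               # 7200 frames for one minute of footage
print(render_days_for(60))          # 50.0 days of rendering for that minute
```

Even if the per-frame cost is off by a factor of two in either direction, a single minute of footage stays in the "weeks of render farm time" range.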
So not only is the production of VR film a bit complicated, the file sizes are also gigantic. If you want it on mobile, you either download gigabytes of data or you compress it so much that it doesn't look good anymore. There is a very nice app in the App Store called VRSE with short, beautiful videos, but a single one is about a gigabyte — that's the file size you have to prepare for, and that's not really mobile, and not really fun anymore. The other thing about film is that it's linear storytelling: it just goes from A to B. Well, you can pause it, but that's weird too. You have no real freedom in time, and we especially like to just look around in a VR world without the time constraints you have in film. And apart from that, the render times are really horrible. So film — nah, we don't do film yet, because of these constraints; instead we do interactive stuff. Let me fix this mic. Yes, and sometimes the first thing you think is: okay, let's make something interactive in 3D — oh, let's take our office! If you remember the photo, that's our table tennis table, a part of our office. But we noticed that it's not easy either. Here is another demo we made — not the office, a virtual environment in a game engine, in real time — and we noticed it's difficult, as non-game-developers, to produce such an interactive real-time environment, especially with the limitation of 60 frames per second on mobile. The visual quality we wanted to achieve is just not possible in real time yet. That's another screenshot. It was a nice experience to walk from one place to another and see this building from the inside, but you see limited geometry, because we couldn't put everything in there. Our solution is pretty simple: interactive, stereoscopic, equirectangular virtual reality panoramas.
Maybe we should explain some more details about it, and how this started. At the beginning I thought: okay, put one equirectangular camera in the scene, render the image as a panorama, put it in Unity, look at it — okay, it's nice, but it's monoscopic. Let's do it in 3D: I press the stereoscopic camera button, render it, and... oh, it looks weird, because it's not correct. You can imagine why: a stereo rig looks toward a convergence point, but behind those two cameras the stereo isn't correct anymore, and to the left of the cameras it isn't correct either. You would need a rotating stereo rig. So the next experiment — this was one year ago, I have to mention — was: I rendered four slices, one toward the front, back, left and right, put them together, and I already got a good stereoscopic feeling, but there were visible seams between the slices. So I thought, let's use more slices, and we wrote a Python script that rotates the rig and renders only a narrow strip of, let's say, four pixels per step. We stitched all these slices together, and that gave a stereoscopic panorama. And then Dalai came into the picture. Right at the time when we were doing these experiments with Python and this horribly convoluted workflow, he just said: well, I have a custom build that does that — you press a button and you're done. Of course that's awesome, and we instantly used his spherical stereoscopic rendering — you need his custom build for it. So Simeon started to work on his first app using that.
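The brute-force slice idea can be sketched in a few lines. This is a rough illustration of the approach, not the actual script from the talk; the "strips" here are just lists of pixel columns standing in for real renders:

```python
import math


def slice_yaws(image_width, strip_px):
    """Yaw angle (radians) for each camera step, one narrow strip per step.
    The rig rotates a full 360 degrees over image_width // strip_px steps."""
    steps = image_width // strip_px
    return [2 * math.pi * i / steps for i in range(steps)]


def stitch(strips):
    """Concatenate the per-step strips left to right into one panorama."""
    panorama = []
    for strip in strips:
        panorama.extend(strip)
    return panorama


# With a 4096-px-wide panorama and 4-px strips you need 1024 renders per eye —
# which is why a native spherical-stereo render button was so welcome.
print(len(slice_yaws(4096, 4)))  # 1024 camera positions
```

Each strip is narrow enough that the stereo offset inside it is approximately correct, so the seams disappear — at the cost of a very convoluted render pipeline.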
Yes, and one idea was to have music and singing all around you. It's called Slicks — that's also the name of the band — and you can download it from the iOS and Play Store. The environment you see in the backdrop is a spherical stereo image, but in front there are six video planes with alpha around you, and it's like standing in the middle of a band while they sing around you. It was not a money project; we just have to make projects to see what the problems in VR are. Here you see the Slicks singing on the Gooseberry island, and you can switch the environment at runtime. The next idea was: okay, let's make a space shooter. Because of course the first thing you do when you do interactive VR is shoot asteroids — that's awesome. Matthias Eimann, a colleague from our studio, modeled a spaceship, and we rendered it stereoscopically as a panorama, so you can sit inside the cockpit. We made the cockpit window transparent, so through it you can still shoot interactive asteroids.
It was an experiment — it's not published yet, but it was a lot of fun. And because we were so enthusiastic about this workflow, we thought: this is awesome, let's give a talk about it at the Blender Conference — hence the name of our talk, "We Are Awesome in Space". But while we were working on that, one of our colleagues got a call asking whether we could deliver interactive 3D content. So we thought, okay, maybe let's postpone the space-shooting business and instead do something more serious, like an architectural demo. We made this VR archviz pavilion thing — I had a file lying around from years ago where I had modeled the Barcelona Pavilion, and we just brought it into VR. So we have this little demo VR app where you can look around the environment in stereoscopic 360 degrees. That worked really well, and through it we got our first commercial job, from a client who wanted exactly that for architectural visualization, and that's what we did. I just want to compare this with the film workflow: this is a single image, a top-bottom JPEG with a file size of, let's say, 8 megabytes. You are standing inside these 8 megabytes and looking around for half a minute, a minute, two minutes, and you decide where to look. That's a really big difference from film — 8 megabytes for minutes of experience. And you can go to 4K or 5K; you are not so limited, because there is no stream, just one image.
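To put the still-image-versus-film comparison into numbers — the video bitrate below is our assumption (20 Mbit/s as a plausible rate for compressed 4K stereo video), not a figure from the talk:

```python
STILL_MB = 8            # one top-bottom stereo JPEG panorama, per the talk
VIDEO_MBIT_PER_S = 20   # assumed bitrate for compressed 4K stereo video


def video_size_mb(seconds):
    """Approximate download size of a video clip of this length, in MB."""
    return VIDEO_MBIT_PER_S * seconds / 8  # megabits -> megabytes


two_minutes = video_size_mb(120)
print(two_minutes)                 # 300.0 MB for a two-minute clip
print(two_minutes / STILL_MB)      # 37.5x the size of the still panorama
```

And unlike the video, the 8 MB panorama gives unlimited viewing time — the viewer, not the clip length, decides when the experience ends.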
The fun thing is to just look around and explore a space with your eyes, without being constrained by time or file sizes. Now, quickly — because we are already running out of time — the workflow in Blender. The first thing you need is the right Blender: make sure the version you work with has this "Spherical Stereo" button. This is the custom build from Dalai Felinto; unfortunately his website is often down, but we will provide the links later on our website. Hopefully this will make it into master soon — Sergey will maybe work on the camera nodes, and then we'll have it. For now there is this custom build you can use. So, quickly, how to set up a scene for VR in Blender. First, of course, you enable Views — this is what you would do for ordinary stereo 3D as well. You can already see the red/green anaglyph visualization in the viewport. In the Stereoscopy panel you can enable the 3D view, and then, as always when working in stereo 3D, one very important thing is to set the convergence plane. That is, you could say, where you focus — the main area the viewer is supposed to look at. Set your convergence plane there, because in front of and behind this plane the left and right images shift apart and your eyes, your brain, have to work harder. So set the convergence plane where the viewer is supposed to look. Then, to render a panorama, set the camera to Panoramic, make sure it's Equirectangular — Off-Axis is a good default for the stereo mode — and, for our case, make sure Spherical Stereo is enabled and the pivot is set to Center. Then set the dimensions to 2K, or put it at 200% for 4K; the important part is the 2:1 aspect ratio. If you go to the rendered viewport you can see the panorama — it looks very different from the OpenGL viewport, but this is the 360-degree panorama. And it's Cycles-only. What else?
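The manual steps above can also be done from a script. This is only a sketch for the Python console of a Blender build that has the spherical-stereo patch; property names such as `use_spherical_stereo` follow the multiview API of those builds and may differ in your version:

```python
import bpy

scene = bpy.context.scene
scene.render.use_multiview = True            # enable stereo "Views"
scene.render.views_format = 'STEREO_3D'

cam = scene.camera.data
cam.type = 'PANO'                            # panoramic camera
cam.cycles.panorama_type = 'EQUIRECTANGULAR' # Cycles-only
cam.stereo.convergence_mode = 'OFFAXIS'      # a good default for stereo
cam.stereo.convergence_distance = 2.0        # where the viewer should focus
cam.stereo.pivot = 'CENTER'
cam.stereo.use_spherical_stereo = True       # only in builds with the patch

# 2:1 aspect ratio: 2K here, or set resolution_percentage to 200 for 4K
scene.render.resolution_x = 2048
scene.render.resolution_y = 1024
scene.render.resolution_percentage = 100
```

The script mirrors the UI walkthrough one-to-one, so it is mainly useful for setting up many shot files consistently.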
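One property of the equirectangular format is worth quantifying: pixel rows away from the equator cover less and less of the sphere, so a renderer that samples every pixel equally wastes work near the poles. A quick calculation, under the simplifying assumption that every pixel costs the same to sample:

```python
import math


def useful_fraction(rows=1024):
    """Average solid-angle weight cos(latitude) over the pixel rows of an
    equirectangular image. Rows near the poles cover almost no solid angle
    but still get full samples."""
    total = 0.0
    for r in range(rows):
        lat = math.pi * (r + 0.5) / rows - math.pi / 2  # latitude of row center
        total += math.cos(lat)
    return total / rows


def cubemap_vs_equirect(width=4096):
    """Pixel-count ratio at comparable angular resolution: equirect W x W/2
    versus six cube faces of (W/4) x (W/4) each."""
    equirect = width * (width // 2)
    cubemap = 6 * (width // 4) ** 2
    return cubemap / equirect


print(round(useful_fraction(), 3))  # ~0.637: roughly a third of the samples are "wasted"
print(cubemap_vs_equirect())        # 0.75: cube maps need 25% fewer pixels
```

The average weight comes out at 2/π ≈ 0.64, which matches the "up to one third less render time" figure for cube maps mentioned later in the talk.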
Yeah — you render it, and it takes a while, so we sped up the playback a little; it would be nice to be able to do that in reality too, but unfortunately that's not the case. Here I'm switching between the left and the right eye — you can do that in the header of the image editor — and you can see how the two images shift. They shift around one point, at the top of Suzanne's head, and that's the convergence plane: this is where the two eyes converge. In front of that point the pixels shift one way and behind it the other way when you switch eyes. And the point is that only with Dalai's build does this also happen when you look at the other side of the panorama — you get the same convergence plane and the same amount of shift in every direction. That's the stereoscopic magic happening here. Eventually the render finishes and you can save the file. The format we use in our workflow is top-bottom: set the output to JPEG and then, under Stereo 3D, choose Top-Bottom. Of course you can also output the left and right eye individually, but we found it better to use top-bottom, with the left eye on top and the right eye on the bottom. It then looks like this: a single file that you can watch with some kind of viewer. If you look at it like this it looks okay, but in VR you really zoom in a lot, so it has to be noise-free. This is the pavilion we rendered. One issue with the equirectangular spheres is that the sampling isn't very smart: in this area Cycles takes the same number of samples as in the center of the image, but because the image gets squeezed onto a sphere, the areas at the top and bottom of the image become very small. So you lose a lot of render time to oversampling. There is another method that is better, and that's cube maps. It looks like this: you have the six
sides of a cube rendered, again for each eye, left and right. This gives you up to one third less render time, so it's more efficient. Unfortunately, when we did our first job and this pavilion, we didn't have that yet — but again, Dalai to the rescue: he said he could do it, and we sponsored the cube map add-on that lets you render this cube map format with Blender. I'm going to skip this video because we are really running out of time — briefly, it sets up six different scenes... forget it: cube maps are awesome, but still a bit in development. If you want to try it, you can find it on Dalai's GitHub page. Now, briefly, the workflow in Unity. I will show the spherical-image workflow rather than the cube maps — both methods, spherical images and cube maps, have advantages and disadvantages, but because cube maps are twelve images, their setup is a little more involved. Before going to Unity, I produce the mesh that will carry the texture, and this mesh is also created in Blender. The base onto which the texture is projected is a sphere. I delete the top and bottom vertices and extrude the upper and lower rings toward the center — but without closing them to zero yet; in my experience this is necessary to unwrap the sphere to correct angles, as you will see in a moment. Unwrapping such a sphere with Sphere Projection, with "Scale to Bounds" enabled so it matches the image borders, leads to a correct spherical unwrap. Now I can close the poles by scaling them to zero. Let me speed this up a little. After that I have a monoscopic sphere. What I do now is flip the normals to point inward, because later we will be inside the sphere, so we have to see it from the inside, and I shade it smooth. To see this in Blender I have turned on
backface culling, which means you only see a face from the side its normal points toward. Now I duplicate the sphere: one sphere is renamed to L for left, the other to R for right. For the left sphere I move the UV coordinates onto the top half of the virtual image, and for the right sphere onto the bottom half, so that it matches our top-bottom image. So what we have is geometry for the left and the right eye, with the correct UV coordinates for a top-bottom picture. The next step happens in Unity. I imported the Google Cardboard SDK, which is freely available on GitHub, plus the 360 sphere — the blend file — and the Gooseberry image, which is a top-bottom image; the mouse-look script is just for the presentation here. I deleted the scene camera and the scene light, because we need nothing but the CardboardMain prefab, which is the stereo rig. This is the geometry we created in Blender a minute ago, and we set its shader to Unlit/Texture, which means shadeless. You saw that I just dropped in the texture, and at that moment the UV coordinates are already correct for the left and right image. Now I also add layers for the left and the right eye, and put the spheres on the right layers, to tell the cameras which layers to see: the left camera may not see the right layer, and the right camera may not see the left layer. And that's it. With some practice you can do it in, let's say, five minutes — the research may take some hours — but now it's a functional VR demo with a Blender-produced image. Thank you. So, we still have a little bit of time, so quickly: our tool set is all free and available, and it's all super cool. The 3D software is Blender; the game engine is Unity, which is freely available — if you want to use it commercially you might have to spend a few euros, but you can test it for free; the VR part comes from the Google Cardboard SDK, which is also freely available; and you all have a VR device. So what's
missing? In our workflow, one thing was missing, and that is a decent viewer. I mean, you render something and you want to view it on your Cardboard — but how do you get it there? You cannot always go to Unity, build an app and so on; that's really annoying. We didn't find a decent viewer, at least not a cross-platform viewer for iOS and Android, so we decided to just do it ourselves: let's build a cross-platform VR image viewer. That was the plan, and we thought, that's awesome — we want a viewer where you just put in your images and immediately have them on your Cardboard, and of course we want to share it with the community. But then you have your Blender beer in the evening, and then another beer, and you get crazy ideas: wouldn't it be cool if we could all share our VR renderings, and have not only a viewer but also a gallery? And that's what we are actually presenting now — we have to rush a little, because today we uploaded it to the Play Store: VRAIS, "VR Awesome In Space". This is an app where you can explore the VR renderings of the community, mark images as your favorites, and of course upload and view your own images in the gallery. There is a website where you can upload — you have to register, of course — and if you want to, you can publish your images. They go through a short review process, because we have to make sure no crap is uploaded, but then you can share and view the images of other people. The website is vrais.io — coincidentally, "vrais" also means "real" in French, if I remember correctly, so that was a funny bonus.
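Coming back to the left/right sphere trick from the Unity section: the UV shift that maps each eye's sphere onto its half of the top-bottom image is just a linear remap of the V coordinate. A minimal sketch — the function name is ours, not from any SDK; V follows Blender's convention of increasing upward, with the left eye on top:

```python
def remap_v(v, eye):
    """Map a sphere's V coordinate (0..1) into a top-bottom stereo image:
    the left eye reads the top half, the right eye the bottom half."""
    if eye == 'L':
        return 0.5 + 0.5 * v  # top half of the image
    if eye == 'R':
        return 0.5 * v        # bottom half
    raise ValueError("eye must be 'L' or 'R'")


# The seam between the two halves sits at v = 0.5:
print(remap_v(0.0, 'L'), remap_v(1.0, 'R'))  # 0.5 0.5
```

Because the shift is baked into the two meshes' UVs, the very same texture can be dropped onto both spheres and each camera automatically sees only its own eye's half.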