My name is Martin Fröhlich, and this presentation is obviously about EEVEE, but I will also give you a bit of an exposé of how this project came about and how I came to work on it. But first I would like to take some stats; I would like to know about my spectators. How many of you are working in the VFX industry and are familiar with green screen VFX? How many of you play around with game engines? How many of you combine the two? Okay, excellent. I will quickly give you an introduction about me and the institution I'm representing here, then I'll show you the setup we used, then some of the recorded footage, and then the future that may come.

Now, more than half a lifetime ago I got myself a degree in mechanical engineering. It was a time when nobody could afford CAD systems, so I was not able to learn one, and I never learned any CAD. When the first 3D printers came about in 2009, I wanted to do some 3D printing, so I looked for a tool, and it was Blender that I chose. After my studies as a mechanical engineer I worked for about ten years as a software developer, and then I got myself another degree, as a media artist. Ever since, I've been wandering between the two fields of technology and the arts, a combination that makes me what I would call an inventor of things humanity hasn't been asking for.

Now you wonder what those inventions might have been, and I'm happy to show you one of them: this is the impersonating overhead display, obviously a device that will never make it to the market. Then there is the moss printer, a water-spray apparatus that sprays water onto house walls. And you wonder: is it actually working, is moss actually growing? This is a process that obviously takes many months before maybe something starts to happen; unfortunately it turned out to be harder than I thought to find a house owner willing to let me test it. And the last project here is a mechanical contraption between time and space: a machine that pulls white rubber bands into the shape of a rotating sphere. It is running, yes. And you might guess it: I actually designed this with Blender, with lots of drivers in the background. The piece is open source, under a Creative Commons license; you can find it on my website, download it, laser cut it and build it yourself.

But the topic that finally brought me to this stage is spatial augmented reality. Is anybody familiar with this term? We see two hands, three hands, four hands. Spatial augmented reality is the little step brother of augmented reality, and the difference between the two is this: with augmented reality you hold a handheld device in your hand and work with that, while spatial augmented reality, or projected augmented reality, uses projectors to project the virtual image onto the real object. A setup like this is quite elaborate. You need two tracking systems: an infrared tracking system to track the object with submillimeter precision, and a 3D camera to track the observer. You need two or three projectors, which have to be calibrated. Then you have to render the image you want to project onto the real object from the point of view of the observer, virtually project this texture onto the virtual representation of the object, record it from the point of view of the projectors, and then throw it back onto the real object in order to get this effect.
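Since that two-pass idea is easy to get lost in, here is a minimal numpy sketch of the transformation chain, assuming simple pinhole models for both the observer and the projector. All poses, field-of-view values and names here are my illustrative assumptions, not SPARCK's actual code:

```python
import numpy as np

def look_at(eye, target, up):
    """Right-handed view matrix (world space to camera space)."""
    f = target - eye; f = f / np.linalg.norm(f)        # forward
    s = np.cross(f, up); s = s / np.linalg.norm(s)     # right
    u = np.cross(s, f)                                 # true up
    m = np.eye(4)
    m[0, :3], m[1, :3], m[2, :3] = s, u, -f
    m[:3, 3] = -m[:3, :3] @ eye
    return m

def perspective(fov_y, aspect, near, far):
    """OpenGL-style pinhole projection matrix."""
    t = 1.0 / np.tan(fov_y / 2.0)
    m = np.zeros((4, 4))
    m[0, 0], m[1, 1] = t / aspect, t
    m[2, 2] = (far + near) / (near - far)
    m[2, 3] = 2.0 * far * near / (near - far)
    m[3, 2] = -1.0
    return m

def to_uv(view_proj, world_pos):
    """Project a world-space point and map it to [0, 1] texture coords."""
    p = view_proj @ np.append(world_pos, 1.0)
    ndc = p[:3] / p[3]
    return (ndc[:2] + 1.0) / 2.0

# Pass 1: render the virtual scene from the observer's point of view.
observer_vp = perspective(np.radians(60), 16 / 9, 0.1, 100) @ look_at(
    np.array([0.0, 1.7, 2.0]), np.array([0.0, 1.0, 0.0]), np.array([0.0, 1.0, 0.0]))

# Pass 2: rasterize the object's mesh from the calibrated projector's point
# of view, coloring each point by sampling the pass-1 image at
# to_uv(observer_vp, point). That second image, sent to the projector,
# lands back on the real object and looks correct to the observer.
projector_vp = perspective(np.radians(35), 4 / 3, 0.1, 100) @ look_at(
    np.array([2.0, 2.5, 1.0]), np.array([0.0, 1.0, 0.0]), np.array([0.0, 1.0, 0.0]))

point = np.array([0.1, 1.0, 0.2])      # a point on the tracked object
print("sample pass-1 image at:", to_uv(observer_vp, point))
print("draw into pass-2 image at:", to_uv(projector_vp, point))
```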
It is a bit mind-bending, and because it is mind-bending and complex, I actually had to write my own software that let me get through all those different transformations and shaders and whatever, in order to make the whole thing real-time. And I haven't even mentioned that it also needs real-time soft edge blending: for each frame I have to figure out which pixel has to be projected by which projector, in order not to get an ugly seam on the object.

Now, I developed this tool for the purpose of museums and trade fairs, but through some circumstances I had more contact with the performing arts, so most of the applications I went public with were in theatre or dance productions. And here I'll quickly make a little advertisement for my software. It's called SPARCK, with a CK at the end, an acronym for Spatial Augmented Reality Construction Kit, and I'm actually going to publish it open source very soon. If any of you have experience with licenses and would like to give some advice, I'm very open to it; most likely it's going to be MIT.

This software, and these applications in dance and the performing arts, got me a job at the Immersive Arts Space. And what are we going to see here? It's a very short video: an animation, made in Unity, of a virtual character designed by Tobias Kremler. We used a live motion capture suit; the screen is tracked as well, so we know the position of the screen; we also animate the virtual camera inside Unity with the position of the screen; and SPARCK is used to project the whole thing back onto the object. The nice thing about this installation was that the stage designer came up with a good idea for the screen: she used rubber bands, which allow the dancers to actually step through the screen, and this way she created a kind of portal where the dancers could step between reality and virtuality.

Now, I mentioned the Immersive Arts Space. It is a huge space, 360 square meters, and we have a large-volume mocap system, obviously, a 3D audio system, and large-venue projectors for these kinds of purposes. But it's not just a space; we are also a research group, and we research multi-user VR experiences, photogrammetry, performance capture, spatial augmented reality, virtual production and low-cost tracking technologies. The Immersive Arts Space is located inside the University of the Arts, which is one monolithic block hosting about 2,100 students who study design, film, fine arts, music, dance, theatre, transdisciplinary studies and art education. One of the promises when the university moved into this huge campus was that more transdisciplinary work and projects could start to happen, and a pilot project before the Immersive Arts Space showed that using technology as a neutral facilitator could actually help achieve that, because all these different disciplines speak a different language and have different creative cultures and discourses, which makes it tricky to find common ground.

Now let's get to the meat: Virtual Real, aesthetics and the perception of virtual spaces in film. This is a project funded by the Swiss National Science Foundation, and the basic idea is fairly simple. We created two short movies, each of them about five or six minutes, and each of them in two different versions, one on location and one in front of a green screen. The idea behind that was to use these movies in the context of a psychological study, to see how the perception of virtual spaces (in this case, virtually created movie backgrounds) influences the spectator.
Now, green screen shots are notoriously expensive, and we usually don't use them in the Swiss film industry; we actually try to avoid them, because our budgets usually don't allow it, and wages are so high that our studios cannot really compete with the VFX studios around us in Europe. But there are still some use cases where we can't shoot on location and might need to scan things, and based on this premise we decided to go through with it, so that we could learn about photogrammetric capturing, previsualization, virtual production and post production.

Just a quick overview of the process: we have a storyboard, location scouting, photogrammetry or scanning, modeling, previsualization, and then it goes to the shooting, with two shoots, one on location and one on green screen. And I wish I could tell you that Blender was used in any of those steps; unfortunately this is not the case, otherwise my title would probably have been a different one. But there was this virtual production part, and the virtual production happens during the green screen shoot. When I learned what they were planning to do, about a year ago, EEVEE had just come out with the 2.80 beta, and I thought this might be the chance to bring Blender into the game.

So what do we have to achieve? We have to do the whole post production part in real time: video tracking, keying, background rendering and compositing. And how does the setup look? We've got a green screen, we've got a camera, we've got actors, and the camera is moving. One thing that is important to see: once you look at the scene from the angle of the director or the cameraman, the actors stand in a void, and this is very, very disorienting. What this virtual production, or real-time background plating, helps with is this: the cameraman understands where the actors are moving and what is happening, the director can give sensible directions to the actors, and the lighting crew gets an idea of how the whole composition might look, so that the lighting of the foreground actually fits. So we would need something like this, and in order to achieve it we need a mocap system.

Is anybody familiar with mocap systems in here? How many of you? That might be five or six hands. How many of you have direct access to a mocap system, a large-volume mocap system? One person, two, kind of three. So it's not very widely available, and they are quite expensive. Just to give you a quick idea of how a mocap system works: each camera sends an infrared signal out, and a reflector reflects it back to the camera. We see this on the right-hand side as these little white points. Each of these points represents a view ray, and if two view rays intersect in space, the system decides this is a marker.
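In practice two view rays almost never intersect exactly, so a system has to look for the closest approach between the rays and accept it as a marker below some tolerance. A minimal numpy sketch of that idea (the standard midpoint method; the tolerance and camera poses are invented, and real tracking systems are certainly more sophisticated):

```python
import numpy as np

def triangulate(o1, d1, o2, d2, tol=0.005):
    """Closest point between two view rays (origin o, unit direction d).
    If the rays pass within `tol` meters of each other, return the midpoint
    of the closest approach as a marker candidate, else None."""
    b = d1 @ d2                          # cosine between the two rays
    w = o1 - o2
    denom = 1.0 - b * b                  # zero when the rays are parallel
    if abs(denom) < 1e-9:
        return None
    t1 = (b * (d2 @ w) - (d1 @ w)) / denom
    t2 = ((d2 @ w) - b * (d1 @ w)) / denom
    p1, p2 = o1 + t1 * d1, o2 + t2 * d2
    if np.linalg.norm(p1 - p2) > tol:    # rays miss each other: no marker
        return None
    return (p1 + p2) / 2.0

# Two cameras a few meters apart, both seeing a reflector near (0, 1, 2):
cam1, cam2 = np.array([-2.0, 2.0, 0.0]), np.array([2.0, 2.0, 0.0])
ray1 = np.array([0.0, 1.0, 2.0]) - cam1; ray1 /= np.linalg.norm(ray1)
ray2 = np.array([0.0, 1.0, 2.0]) - cam2; ray2 /= np.linalg.norm(ray2)
print(triangulate(cam1, ray1, cam2, ray2))   # approximately [0, 1, 2]
```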
In order to do what we wanted, we had to track the camera itself, and we did that through a so-called rigid body. On top of the camera we see a rigid body with four markers, and here in the tracking system we see how it looks: we can see the four markers on top of the camera. This rigid body has a pivot point, and we need to be able to place that pivot point on the axis of the lens; that is what the other three markers are for, so that I can locate the lens axis and orient the pivot accordingly.

A short word about the pipeline we used. We have a camera that sends an SDI signal to a keyer; we used TV-grade real-time hardware for this. We've obviously got the tracking system, and the way I got the tracking data into Blender was via Open Sound Control. Is anybody familiar with Open Sound Control? Maybe five or six hands. It's not a very common protocol in this field, but when you do interactive installations it's actually a very nice protocol: you can very quickly design your own data structures and send them back and forth. There was actually an OSC add-on for Blender, but it was for 2.79, so I had to port it to 2.80. This way I got the camera position into Blender, sent the rendered image out via HDMI converted to SDI, and then composited the whole thing inside the keyer.

A short word about the way we created the models. As I said, we went on site and laser scanned it; there are no fancy science fiction background plates I'm going to show you here, but rather simple living places. We used a laser scan, took photos that we wanted to use for photogrammetry, and projected them back onto the model. As you can see, it presents itself as a box; this is the model of a flat. The tricky thing is the navigation inside this box: you always drop out of the box, which is a bit of a pain. Luckily for me, Blender has this nifty little feature, which you can access with Alt+B, that allows you to crop the view of the object in such a way that it does not affect the final render. So in this viewport I could crop away the ceiling and get a nice overview. Here you see some of the details of the scan. It's actually not very detailed; the detail is more in the texture, but it was quite a cheap model to get. The model was ready just one day before we went on set, so I really had to improvise with the lighting and so on. You see the camera is already moving, so it is already getting the data.

Let's have a quick look at the OSC plug-in. This is the local IP address, this is the port Blender is listening on, and here are the OSC messages that are passed on to a little empty that receives all the transformation data. I'm sending position and a quaternion, and I have to reorder the W value here, because Blender wants it first. Here we see the live data coming in on the empty. On the camera dolly I used some drivers, in case I had to scale the data coming in from the motion capture system, and on the camera level I only had to make some small corrections on the rotations, here on the Y axis. I could also have made a correction for the nodal point, since the pivot should really sit at the nodal point of the lens, but I didn't go into that level of detail.
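To make the receiving side concrete, here is a minimal sketch of what such an add-on does, using the python-osc package; the /camera address, the argument order and the object name are my assumptions for illustration, not the actual add-on's layout. Incoming values are queued and applied on Blender's main thread via an app timer, because touching bpy data from the server thread is unsafe:

```python
import queue
import threading

import bpy
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

inbox = queue.Queue()

def on_camera(address, x, y, z, qx, qy, qz, qw):
    # Runs on the server thread: only store the values.
    inbox.put((x, y, z, qx, qy, qz, qw))

def drain():
    # Runs on Blender's main thread via an app timer.
    target = bpy.data.objects["mocap_empty"]   # hypothetical empty name
    target.rotation_mode = 'QUATERNION'
    while not inbox.empty():
        x, y, z, qx, qy, qz, qw = inbox.get()
        target.location = (x, y, z)
        # The tracker sends the quaternion as (x, y, z, w);
        # Blender's rotation_quaternion wants w first.
        target.rotation_quaternion = (qw, qx, qy, qz)
    return 1 / 60                              # poll again in roughly 16 ms

dispatcher = Dispatcher()
dispatcher.map("/camera", on_camera)           # assumed address layout
server = BlockingOSCUDPServer(("0.0.0.0", 9001), dispatcher)
threading.Thread(target=server.serve_forever, daemon=True).start()
bpy.app.timers.register(drain)
```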
In terms of lighting, nothing special. I used EEVEE with a fairly high exposure in order to brighten the scene up, because the texture obviously already had the lighting baked in; that was kind of unavoidable. I added some ambient occlusion and did some curves, but that was basically it. Inside the scene there are two light rectangles, so I just lit the scene up a little from the direction of the window. (I'll show a minimal script sketch of this setup a bit further on.)

Okay, now this is the scene itself, the same room, but here live. As I mentioned, this is not a Michael Bay movie; the camera movements were quite subtle most of the time. You see the motion capture system actually had a little bit of noise, which is why the background is slightly shaky. And right at the beginning of the shooting I hadn't yet realized that there was something like a focus setting with which I could blur the background a little; you can see that the girl is actually blurrier than the background itself. I certainly could have corrected that, and in later shots I actually did. For me it was really learning by doing, jumping into cold water.

About the noise we had with the motion capture system: this is the same scene from a different angle. As you can see, movie studios are not really the ideal location for motion capture systems, because you have lots of lamps that radiate infrared and shiny fixtures that reflect it as well, which creates fake markers. That's why you get this slight noise in the tracking. But then again, we didn't want a final product here; we only wanted a reference that would be helpful, so this little noise and these little jerks really didn't matter that much.

This one is another scene, again with a very subtle camera movement. In this case the keyer didn't key the green screen in the background properly, but again, as a way for the cameraman to understand where the whole scene is located, and for the lighting crew to understand what is happening in context, it was enough. And here I have a bit more: you can see the real green screen image, which is what the crew would have been exposed to if they didn't have the live composite. On the left-hand side we have the final composition from the VFX, on the right-hand side the original one. Obviously I didn't get the coloring quite right, but from a perspective point of view it's quite okay. And the last scene here is actually a blue screen shot. Here again the keying wasn't perfect: there wasn't blue screen everywhere in the background, so not everything could be keyed. And here is the scene again, with the final one on the left-hand side and the original one on the right-hand side.

So, lessons learned. I will structure the lessons in three parts: what was the impact of the real-time background plate, how does Blender compare, and where do we go from here. Now, the director and the cameraman were very happy about this setup, because they said it really saved their lives on the set: they understood what was going on, and the director could actually give the actors directions for their interactions. At the end the director said she totally trusted the image I produced, which is nice to hear, but it also puts some pressure on you, because if you do something wrong with the lighting of the background, you might cause trouble later in post production, because your front plate will be lit wrong.
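Picking up the lighting setup from above: here is a minimal bpy sketch of that kind of configuration, assuming Blender 2.80; every value (exposure, AO distance, light positions and energies) is invented for illustration:

```python
import bpy

scene = bpy.context.scene
scene.render.engine = 'BLENDER_EEVEE'

# High exposure: the scan's textures already carry the baked-in light,
# so the render mostly needs brightening rather than real lighting.
scene.view_settings.exposure = 2.0            # illustrative value
scene.view_settings.use_curve_mapping = True  # the "some curves" part

# A bit of ambient occlusion to give the flat-lit scan some depth.
scene.eevee.use_gtao = True
scene.eevee.gtao_distance = 0.5

# Two area lights standing in for the light coming from the window.
for name, loc in (("window_l", (1.5, -2.0, 2.2)), ("window_r", (-1.5, -2.0, 2.2))):
    data = bpy.data.lights.new(name, type='AREA')
    data.energy = 80.0                        # watts, tweak to taste
    light = bpy.data.objects.new(name, data)
    light.location = loc
    scene.collection.objects.link(light)
```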
So what I learned here is that in such a setup it would actually make sense to take a lot of care up front to create really high-quality background plates, already lit according to the standards you would expect in the final post production. And since you have to do that work anyway, it makes sense to do it beforehand, because then you have a good reference from the beginning. Now, I can't really say that the lighting benefited in our case, because of the special setup we had: the whole movie was also shot on location, so the crew wrote down all the settings for the camera and the lighting and simply reproduced them in the studio. They cheated a little here. But I think if you do it properly, it can be very helpful. And just to prove it to myself, I got the final model from the VFX team and tried to render it in EEVEE. Unfortunately this projector is not capable of showing it properly, so it comes across badly here, but it looks very convincing. And if you ask whether it can actually play in real time: yes, almost 25 frames per second. So you can actually do quite decent lit scenes in real time, but you need a very beefy machine for that.

Now, how does Blender compare to the game engines that are usually used for this? My colleagues wanted to use Unity, and for the other movie they actually did use Unity, so I can only compare these two tools. Neither Blender nor Unity is actually made for this kind of task, so how do they compare? Obviously there is one important thing, speed: Unity, if you set your scene up properly, is obviously faster than EEVEE, but you can take care of that with a careful setup of EEVEE.

In terms of real-time editing and keeping your values, I have to explain a little. In Unity's editor you have two modes: edit mode and play mode. In edit mode, if you stream information to the engine, nothing moves; things only move in play mode. In play mode you can make fine adjustments to the camera position and so on, but as soon as you stop play mode and drop back to edit mode, you lose all those settings. It's really painful: you would have to remember all the changes you made in play mode, and it's just awful. Here Blender definitely beats Unity, because there is no such distinction; I didn't have to switch between modes, and I could see in real time, in a different viewport, what I was doing.

Then the view crop of the model: I mean, it sounds like a very simple thing, but it's actually crucial to be able to see what's happening inside a scan. This is something you probably cannot do in Unity, but maybe somebody here can enlighten me on that. Then a full-screen window without borders: not doable in Unity, while in Blender you can do it. And setting different render modes in different viewports: you certainly can do that in Blender; I'm not sure whether you can in Unity. It's very useful, because you want a high-quality render for your background plate and only need a low-quality render for seeing what's happening in your scene, and you want to spend all of the graphics card on that one rendering.
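As an aside, both of those Blender-side tricks are scriptable; a small sketch, assuming Blender 2.80 or later, that gives one open 3D viewport the full rendered preview while the rest stay in cheap solid shading:

```python
import bpy

# Give the first open 3D viewport a full 'RENDERED' preview (the background
# plate) and switch every other 3D viewport to cheap 'SOLID' shading, so the
# graphics card is spent on the one render that matters.
rendered_assigned = False
for window in bpy.context.window_manager.windows:
    for area in window.screen.areas:
        if area.type != 'VIEW_3D':
            continue
        area.spaces.active.shading.type = 'SOLID' if rendered_assigned else 'RENDERED'
        rendered_assigned = True

# The borderless full-screen window mentioned above is a single operator call:
# bpy.ops.wm.window_fullscreen_toggle()
```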
So where can we go from here? Let's have a quick look at the whole pipeline again. If you have the modeling and the background rendering in Blender already, it's obviously a no-brainer to do the virtual production in Blender as well. And in terms of the data flow, the keyer is very expensive and the tracking system is very expensive, and that is something that would keep people in the field from using this. So what can we do about it? I have a couple of ideas. One is to capture the camera feed into the computer and then use a color-key shader; I already have a shader that does real-time color keying (the basic idea is sketched below). What you then need to do is send the Blender result over to that keying software, and that is something you can do with Syphon or Spout; unfortunately it's not working on Linux. This is a plugin I also wrote, so you can actually already do that.

In terms of the tracking system, I think I have a solution that might be much, much cheaper than a normal tracking system and could potentially work in more circumstances than a tracking system, but I have to run some more tests. So if you want to stay in contact with me, I certainly can provide you with the details once I have conclusive information about it. What I can promise: if it works, it is certainly going to be open source.

And last but not least, there is another possibility. Instead of compositing the whole thing yourself, you could send it to SPARCK and from there to a projector, which means that instead of working with a green screen you could actually project the background onto your backdrop. This is actually already being done, and you might have seen it in the video at the beginning: that is SPARCK, and if you look at the background of the car, you see the background adapt to the point of view of the camera. You can do this with Blender as well: if you send the background texture to SPARCK, and SPARCK takes care of all the projection for the background, you can do this quite easily. And if you had a nice LED screen, like the ones we have upstairs, you would have a perfect background. There is actually a nice video that shows how this is being done with Unreal, but I would say you can easily do that with Blender and EEVEE as well.

Here are some of the links, for AddOSC and for Spout and Syphon. They are both not really perfect yet. If you are familiar with timers and queues, for AddOSC, or with render-to-texture, for Spout, I would be very glad if you contacted me, because I think some improvements can be made, especially on the Spout side, where the rendering performance of the render-to-texture could be improved: right now I need an active viewport whose result I pass on to the render-to-texture, which means I actually render the same scene twice, and that is not necessary.
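For illustration, here is the core math of such a color keyer, as a CPU-side numpy sketch rather than my actual GLSL shader; the key color, threshold and softness values are arbitrary, and a production keyer would work in a proper chroma space and add spill suppression:

```python
import numpy as np

def chroma_key(rgb, key=(0.1, 0.9, 0.2), threshold=0.4, softness=0.15):
    """Very basic green-screen key on a float RGB image of shape (H, W, 3).
    Pixels close to `key` become transparent; a soft band around the
    threshold avoids hard, aliased matte edges."""
    dist = np.linalg.norm(rgb - np.asarray(key), axis=-1)
    alpha = np.clip((dist - threshold) / softness, 0.0, 1.0)
    return np.dstack([rgb, alpha])          # RGBA with the computed matte

def composite(fg_rgba, bg_rgb):
    """Standard 'over' operator: keyed camera image over the rendered plate."""
    a = fg_rgba[..., 3:4]
    return fg_rgba[..., :3] * a + bg_rgb * (1.0 - a)

# Keyed camera frame over an EEVEE background plate of the same size:
frame = np.random.rand(1080, 1920, 3).astype(np.float32)   # stand-in camera frame
plate = np.random.rand(1080, 1920, 3).astype(np.float32)   # stand-in render
out = composite(chroma_key(frame), plate)
```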
And yes, that's about it. Now I'm open for questions. Are there any questions?

Audience: I don't remember the names, you can Google them, but there are systems based on Unity that do exactly that, especially in terms of the background: the keying in Unity, the tracking in Unity, all in one system. What would be the advantage of this over a closed system that's made for performance? The jittering that comes from latency on the mocap, or that comes from the tracking: you're always going to have that.

Martin: Well, you can improve on it. Instead of sending out infrared, you could have active markers, so you're not flooding the whole stage with infrared, and you don't get all these markers popping up that are actually nothing.

Audience: That's actually my second question, because those systems operate with HTC Vive trackers, which are active infrared, and they also tend to use LED lighting, which is basically infrared-neutral. So I'm thinking of HTC Vive trackers with two base stations; I think you can go up to 16 now.

Martin: Yes, that would certainly make it cheaper. We used the large-volume capture system because we had it, and because it was more convenient: you can hook it up in the periphery of the stage, while with the HTC trackers you would have the base stations inside the stage, and you would probably have to rearrange them all the time when you change the light fixtures and reflectors and whatever.

Audience: But then you need the actual active IR illumination and the retro-reflection from the markers, and the trackers get occluded sometimes.

Martin: Yes, everything gets occluded, but occlusion can happen with the HTC Vive as well.

Audience: True, but you can put two trackers on, and then you're mostly covered.

Martin: Well, we had 25 cameras and we still had occlusion, so you might have this trouble with the HTC Vive as well. But your points are very valid. I just wanted to make a proof of concept for myself, to show that I can actually use EEVEE, and I think if you are in a production pipeline where you use Blender anyway, it could actually make sense, obviously depending on the kind of setup and the kind of model you have in the background. If it is very computationally heavy to render, you would probably have to switch over to a game engine that can render it faster. But then again, it doesn't have to be real time and it doesn't have to be perfect, because it is really only used as a reference.

Any more questions? Okay, then I'll wrap it up. Thanks very much for being here.