Our next speaker is really the perfect mix for this conference. He has extensive experience from academia, and he's an entrepreneur with a lot of industry experience as well. He is an internationally leading researcher in computer graphics, and he has consulted for some of the most famous digital film productions, ranging from Pixar to James Cameron's Avatar and Peter Jackson's Lord of the Rings, for which he received an Academy Award for the technology behind Gollum's skin, something that has given me a lot of sleepless nights. So I'm excited to hear more about how you actually, technically, create that. He has also been appointed honorary professor at Aarhus University, but he lives and works in California, where he is chief scientist at the company Luxion. The company has offices in Aarhus and in California and specializes in computer graphics; among many other things, it is famous for developing the 3D rendering software KeyShot, which is used for visualization of products in design, advertising, and architecture. So it is my pleasure to welcome Henrik Wann Jensen.

Thank you, and thank you for having me here today. It's great to be here, and it's correct: I spent some years in academia, and at a certain point I joined industry. Today I'm going to talk a little bit about some research that I've been doing for many years, how we've applied it in various industries, and how I'm now looking at applying it in the real world to basically replace photography with computer graphics. I always like sharing this slide. When people ask what I really do, I say: I explain why things look the way they do. So I'm really good to have around if you have kids that ask why the sky is blue. I'll sit down and talk to you for an hour and twenty minutes; that's the length of a lecture.
I'll explain to you why the sky is blue, why ice is blue, why this butterfly has these colors, why rainbows look the way they do. I recently met one of my neighbors, who asked, can you do rainbows? He probably regretted asking me that question.

One of the algorithms I worked on early in academia is called photon mapping. Here's an example of an old scene, an architectural scene, where the only light source is the sunlight outside the scene. What photon mapping can do is simulate how light bounces around inside a scene and fills the room with light. If you didn't have photon mapping but only an algorithm called ray tracing, which many people call the state of the art, this is all you would get. So you can see how photon mapping fills the room with light. This algorithm is still in use today. Here's Finding Dory: all the light that's underwater is computed using photon mapping. When you have waves forming the surface, you get these focusing effects of light that are really complicated, and photon mapping is really good at that. So it's used quite a bit for that, in quite a few movie shots.

Another thing I've been looking at is materials: why do materials look the way they do? In particular, I was looking at natural materials that look very soft. When you have light shining through them, you get this translucency effect, and that is caused by a phenomenon called subsurface scattering. If you simulate a marble bust and ignore it, you get a very hard appearance, as you see on the left. If you include the subsurface scattering, you get a softer appearance, as you see on the right. It turns out that for human skin, this can be quite useful. Again, in the left image, where you don't do it, which is what people used to do, you get a very hard appearance; it looks like way too much makeup. This is a very simple model.
With subsurface scattering you get a much softer appearance: light penetrates into the skin and comes out again, and you get a more natural look. This is what I did quite a few years ago, and, as was mentioned, it was used on Gollum. They didn't have this technology for the first Lord of the Rings film. If you compare Gollum across the films, in the first one they used the earlier technique, so he looks very hard, like too much makeup; in the second and third he has this softer look. One thing you notice is that the ears get this translucency effect, so the early characters in movies using this technology all had really big ears. There's also Dobby in Harry Potter, where you can see the same thing.

One thing I was hoping for was to do an actual human character in a movie. That's still something people are working on, because there's this phenomenon called the uncanny valley, which movie studios are very afraid of. As you make things look increasingly realistic, you start out with something that maybe looks like a doll, and then as you get closer to a realistic human character, if it's not quite there, it becomes creepy. The person starts looking sick, like a zombie or a dead person; that's the uncanny valley. You don't want that, but you do want to achieve a human character for movies. So we asked ourselves what we could do to get there, and we started simulating skin as layers. We had the top layer, the epidermis; the dermis; and the blood-rich dermis below. Individually these don't look like skin, but when you combine them, you get something that looks much more skin-like. One of the first movies it was used in was Avatar. Those aren't quite human characters, so instead they had blue skin on the characters here.
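For the technically minded: the subsurface scattering technique mentioned here is, at its core, the dipole diffusion approximation, which computes the diffuse reflectance at a distance r from where light enters the surface using the material's scattering and absorption coefficients. Here is a minimal sketch in Python; the marble-like parameter values are illustrative only, not measured data.

```python
import math

def dipole_reflectance(r, sigma_s_prime, sigma_a, eta=1.3):
    """Diffuse reflectance R_d(r) of the classical dipole diffusion
    approximation for subsurface scattering.

    r             distance (mm) from the point where light enters
    sigma_s_prime reduced scattering coefficient (1/mm)
    sigma_a       absorption coefficient (1/mm)
    eta           relative index of refraction
    """
    sigma_t_prime = sigma_s_prime + sigma_a              # reduced extinction
    alpha_prime = sigma_s_prime / sigma_t_prime          # reduced albedo
    sigma_tr = math.sqrt(3.0 * sigma_a * sigma_t_prime)  # effective transport coeff.

    # Diffuse Fresnel reflectance (empirical fit) and the boundary term
    f_dr = -1.440 / eta**2 + 0.710 / eta + 0.668 + 0.0636 * eta
    a = (1.0 + f_dr) / (1.0 - f_dr)

    z_r = 1.0 / sigma_t_prime            # depth of the real light source
    z_v = z_r * (1.0 + 4.0 * a / 3.0)    # height of the mirrored virtual source
    d_r = math.sqrt(r * r + z_r * z_r)   # distance to real source
    d_v = math.sqrt(r * r + z_v * z_v)   # distance to virtual source

    return alpha_prime / (4.0 * math.pi) * (
        z_r * (1.0 + sigma_tr * d_r) * math.exp(-sigma_tr * d_r) / d_r**3
        + z_v * (1.0 + sigma_tr * d_v) * math.exp(-sigma_tr * d_v) / d_v**3
    )

# Marble-like parameters (illustrative): light diffuses millimeters into the
# material, so the reflectance falls off smoothly with distance, which is
# exactly what produces the soft, translucent look.
for r in (0.5, 1.0, 2.0, 4.0):
    print(r, dipole_reflectance(r, sigma_s_prime=2.6, sigma_a=0.0041))
```

A full renderer integrates this profile over the surface against the incoming illumination; the sketch only evaluates the falloff at a few distances.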
One of the guys who worked on Avatar modeled himself. He's a very talented artist; this is not a 3D scan, he actually modeled his own face. What's interesting about this picture is that it has a left side and a right side: one side is a photograph, and the other side is a rendering he did of himself. I'll just let you look at it for ten seconds and guess which is which. I'm not going to tell you.

We went even further. One of the problems with these complex materials with all their layers is that they're hard to control: what should the scattering parameters of the epidermis be? Artists didn't really know. So we made a model where you specify melanin concentration, blood concentration, and melanin type, just three parameters, and with that you get some pretty good skin. One of the things we did with it was to go a little beyond computer graphics and ask: does this actually match real-world data? What we have here are measurements of different types of human skin, showing how the reflectance varies as a function of the wavelength of light, and we could match the data pretty closely with just this three-parameter model, which I'm pretty excited about.

Another important thing for humans in movies is hair. It turns out that if you look at an individual human hair, there's a lot going on in it. The way we went about it was to actually measure it, look at it under microscopes, and really understand it. It turns out hair is kind of like a glass cylinder, but with angled cuticle scales on it that give rise to lots of interesting effects. And again, as always seems to happen, it wasn't used on humans in the first movie; in this case it was used on a monkey. But it's been used quite a lot since. Later came Tangled, where all the hair in the movie was done using this new hair simulation technique.
One thing you notice in particular is that right next to the highlight, the shiny part on her hair, there is a colored bright region as well. That's something people couldn't really do before this technique came out.

The last research example I'm going to talk about is a simulation where we asked: if we mix molecules together, can we predict what the mixture will look like? What we did here is we took a glass of water; the second glass from the left is a glass of water with vitamins in it; the third glass is a glass of water with proteins in it. The proteins scatter shorter wavelengths more strongly, so it looks blue at the front, where it's illuminated, and red at the back, because the blue light has been scattered away at the front. The glass in the middle is a glass of water with fat globules; these scatter light independently of wavelength, so it looks white. And the three glasses on the right are skim milk, regular milk, and whole milk. So we could adjust the protein and fat concentrations and reproduce the different kinds of milk.

Of course this is useful in computer graphics for rendering, but what's interesting is who's using it now. I went and talked to a dairy company in Denmark, and at the time they were using a kind of spoon-like probe that they dipped into the milk for quality control, to check how much fat and protein is in the milk they sell. They said: hey, we could use this. If we illuminate the milk and you can predict what it looks like based on the fat and protein concentration, we can replace the probe we're putting into the milk, which might contaminate it. And today they've completely changed this. Now they have lasers illuminating the milk, and they predict from the footprint of the laser how much fat and protein is actually in the milk.
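The wavelength dependence driving that blue-front, red-back effect can be sketched with a toy model. This is not the actual computation, which involves proper scattering theory for measured particle sizes; it only illustrates the mechanism: protein particles are much smaller than visible wavelengths, so they scatter roughly like Rayleigh scatterers, proportional to 1/wavelength^4, while the much larger fat globules scatter almost independently of wavelength. All coefficients below are made up for illustration.

```python
import math

# Toy model: why dilute protein in water looks blue from the front and
# reddish seen through the glass, while fat globules look white.

WAVELENGTHS_NM = {"red": 650.0, "green": 550.0, "blue": 450.0}

def protein_sigma_s(wl_nm, strength=1.0):
    """Rayleigh-like scattering coefficient, normalized to 1 at 550 nm."""
    return strength * (550.0 / wl_nm) ** 4

def fat_sigma_s(wl_nm, strength=1.0):
    """Fat globules: nearly wavelength-independent (flat) scattering."""
    return strength

def transmittance(sigma_s, depth_cm):
    """Beer-Lambert attenuation of the unscattered beam through the glass."""
    return math.exp(-sigma_s * depth_cm)

# Front of the glass: scattered light dominates, proportional to sigma_s,
# so blue wins and protein water looks blue. Back of the glass: transmitted
# light dominates, and blue has been scattered away, so it looks reddish.
for name, wl in WAVELENGTHS_NM.items():
    scattered = protein_sigma_s(wl)
    through = transmittance(protein_sigma_s(wl), 2.0)
    print(f"{name:5s} scattered {scattered:.2f}  transmitted {through:.3f}")
```

Running this shows the blue channel with the largest scattering coefficient and the smallest transmittance, while the fat model gives identical values for all three channels, which is why the milk glasses look white.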
So that was pretty cool, and it's now used outside of just the computer graphics area. Sometimes people ask: when you predict materials, how do you actually go about it? How do you come up with these equations? Since this is an innovation festival, I thought I'd talk about that, about how we actually get there. It's not that we just think, oh, here's a big equation, and then it looks nice.

A few years ago we started looking at cloth and wanted to understand it. We took lots of pieces of cloth, and they all looked a little different, and we wanted to see if we could come up with a mathematical model that describes why cloth looks the way it does. A lot of cloth is made of fibers; the fibers are spun together into threads, and the threads are then woven together to form different weave patterns, and we thought the work we had done on hair could be used here. If you look at these weave patterns, there are lots of different complex ones as you zoom in on the cloth.
The first thing we did was build an apparatus where we could illuminate the cloth from one angle and then see how much light is reflected in other angles. Unfortunately, when we did that, we got a graph out with lots of weird peaks, and we started trying to figure out what could cause that: why is it shining in all these different directions? It turned out to be really difficult, and we spent a lot of time on it. The poor student I put on this was like, can you please let me graduate without having to figure this out? Unfortunately for him, I said no, but fortunately for him, after a lot of thinking we did finally figure out some of the reasons it looked like this.

One of the first things we did was to measure individual threads, since these threads really behave like human hairs. We took the threads out of the cloth and put them in an apparatus where a light and a camera could swing around the fiber to measure it. What we observed were different types of light reflection: you see a light source coming in, and some threads would reflect light in a very shiny way, in a focused cone, while other fibers would spread it out very diffusely. That gave us a hint that something different was going on inside the cloth. It turns out there are two types of threads used in most cloth. There are threads that are twisted; when you illuminate those, the light spreads out a lot and you get a very soft appearance. But there are also threads that are flat; they act almost like a mirror, so they're very shiny when you illuminate them. That was our first aha moment. So we developed models for the individual threads in the cloth describing how they reflect light. Here are some photographs of these; you can see some of them have pretty interesting shiny regions
, highlights as we call them in graphics, and these highlights move around in strange ways. Again we had a hard time predicting or understanding exactly why. We looked at this for quite a while, and our next aha moment came from taking the threads out and looking at them. What we saw was that they had these flat regions at various angles, and it turned out that modeling exactly those regions, precisely like little mirrors, was the last key to getting the cloth right. Once we had that, we could start doing cloth: we basically say there's a thread that goes through the cloth like this, and then we can simulate the appearance of lots of different fabrics. Here's linen, and here's some silk, where we now get all these different highlights. This is silk wrapped around a cylinder; if you illuminate it like a photograph with a flash, you get a little light reflection that looks like that, and then even more highlights here. We looked at this for the longest time and thought there's just no way we can do this, but we did, or rather the student did. Here are some more examples; we could even do velvet, and then of course some pictures at the end. The output of all this is being able to render, say, a silk pillow. Instead of having to guess what it looks like, you can say: this is based on the way the silk is woven together, the threads and everything. So it's very nice to have these predictive models of the light coming out. We're now adding more cloth to KeyShot. I don't have any examples yet, but we're adding it so you can actually zoom in and see all the individual threads, getting crazy detail with the models we're doing now.

So that was a lot of academic research. How do you actually put that into an industrial setting, and how did that even come about? Let me talk a little bit about Luxion. Luxion was founded in 2003.
We have two offices today: the headquarters is a kilometer and a half from here, and then we have an office in California. We have most of the development here in Denmark, and when people ask why we have development in Denmark instead of California, I always say it's because the winters are so hard in Denmark that sitting at a computer is a nice place to be; you get the heat from the fan. People in California have too much sun. So we like having development here, and we like working with all the universities as well.

We actually have two products we work on. One really marked the start of Luxion: we work with VELUX on a daylighting simulation tool. And then we have KeyShot, which is our main business. First, the VELUX Daylight Visualizer. What happened in 2003 was that I gave a talk at a conference about lighting, about these lighting simulation techniques, and VELUX approached me and said: we would love software that could simulate that. We don't know anything about software or lighting simulation; we know about windows. You can put windows in a room, and when we go to a house today, our salespeople say, this window will look great, just trust me. And of course you put it in, and if you regret it, that's just too bad; it's an expensive process. So they wanted non-technical software: easy to use, no rendering parameters, no lighting-algorithm knowledge required. It should run on a laptop, be very accurate, and do daylight only. And they wanted to make it a free tool, and it's still a free tool today, by the way; it's called the VELUX Daylight Visualizer. Basically, we simulate lighting in buildings to validate that a skylight would add quality to the interior lighting. We said, sure, we can do that. Sixteen years later we still work on it, but what we have today is a tool that's actually pretty powerful: we can import lots of
geometry for buildings, you can add windows, and you can get pictures out of what it would look like under sunny or overcast daylight conditions. You can get various lighting configurations out, and it's been validated to produce a correct lighting setup, so you can actually use it to validate the lighting in a building before you build it. They've done a lot of testing of this. This is an old example, but one of these images is a rendering of a building before it was built, under daylight, and the other one is a photograph of the actual building. There are some differences; it wasn't modeled super carefully, but it shows how you can use computer graphics and lighting simulation to get a good idea of the kind of lighting you'd have in a building.

In 2006 we were playing around with the ray tracing technology we had been using in the Daylight Visualizer, and we noticed we could make it faster. We just kept saying, we don't want to wait that long for all this lighting, how hard can it be? So we kept making it faster and faster, and we could start doing interactive things. In computer graphics, if you've worked in the area, you know we render a lot of teapots and bunnies for historical reasons, and we could do those pretty fast at that time, using various technical algorithms. So we started putting it all together in a package where we simulated the lighting on the fly, kind of like a camera: you could move the camera around, and the computer would work like a camera. That's really where the name KeyShot came from, the idea that your computer can be the camera and in many ways replace the need for an actual camera. At this time I had been working a lot with different visual effects studios in the movie business, and of course they're very technical. A movie like Avatar had 800
people working on it for four years, very technical people with very detailed knowledge; they know every parameter in the software and can tweak it. Most people don't work like that; they don't necessarily know all these technical algorithms. So we wanted to make something simpler.

One of our first examples was working with Ford. Ford goes to the Detroit Auto Show, and every year they make a car for that show, and they wanted some press photos of the car. One guy they had worked with for a long time was David Burges, a car photographer. At the time, this was late 2006, he had never really used software, maybe Photoshop at most. He would go into the desert with big reflector screens, illuminate the car he had, and photograph it, making beautiful images, because he knew what a quality photograph should look like. But he always worked with the actual car. So Ford said, we need photographs of this Interceptor car, and he said sure, and they said, unfortunately we don't have the car; you'll have to work from the 3D model. He said, how am I going to do that? After some talks with Ford, they told him: you'll have to use this brand new software that is supposed to work like a camera, and you'll have to fly to California to do it. He agreed, he came out, and for the first two days he was like: I miss my camera so much, what is this, sitting in front of a computer? Here's one of the early pictures he did. But then he went back to his hotel room and started playing with the light. He had always been a victim of the conditions: if it was a cloudy day, he couldn't get the great picture of the car he wanted; he had to wait for a sunny day, for the lighting to be perfect. But in his hotel room he made some really custom lighting, got these nice stripes on the car, and he was all excited. He started buying into this idea of using the computer to replace photography
and he gave these to Ford and said, these are the photographs of the car that does not exist. He was even put on the cover of AutoWeek with a photograph of this car that didn't exist at the time. We thought that was pretty cool validation, and this really was the start of KeyShot.

Here's the more technical description of KeyShot, but the idea is really this: it's a photograph of your product, except your product is now all digital; it lives in the computer. Here's what used to happen when companies made products. You'd have a digital description, a CAD model, a 3D model of the product, with materials assigned. You'd send that description to China, wait three or six months, get a container back, send your photographer down to the harbor to grab the first box out of the container, run back to the studio, and start photographing it, dust and all. It's a pretty costly, expensive process, and there aren't that many photographers, so it takes time to get those pictures, and only then can you start your marketing campaign. What we've done now is that when you send your 3D data to China, you can also put it into the software and start making photographs; you can start your marketing campaign before you even have the actual product available.

In KeyShot, the process is that you import your 3D data, basically your CAD description; we can import lots of different models from lots of different software packages and mix and match. Then you can paint it up: we have a very extensive material library, both a library that lives in the cloud and one in the software, and quite a few companies have their own proprietary material libraries, where we work with them to match exactly what they have so they can work with it in an easy way. The reason we set up materials so that
you can basically drag them onto a model is that you shouldn't have to understand, if you're doing milk, what kind of molecules are inside that glass, to get it very accurate. Then you can adjust the lighting, which again is a nice thing to do: you can use natural environments, or you can build your own lighting environment. That's one of the big advantages of the computer, because lighting is so hard to control in the real world. And then you set up the camera and get a final picture out. This is just an illustration.

We've had lots of different releases of KeyShot; the latest is KeyShot 8. One thing we've done since day one, because of our roots in academia, is always work with these very physically accurate materials. So if you have a model you worked on in KeyShot 1, like that Ford model David Burges had, you can still load it in and it looks the same. Some of the other software packages in this industry tend to swap out material models while saying they're physically based; we try to keep it all consistent. And we really focus on this not being movie software. It's not intended for making movies; it's intended to be easy to use and to attract people who come from an industrial design background and so on.

Here's some of the rendering technology, and I can add that Toshiya Hachisuka actually used to be a professor here at Aarhus University, and we work closely with him; I'm trying to see if I can convince him to come back to Aarhus, he's a very smart guy. With the kind of algorithm we use inside KeyShot, you can do very complex lighting effects. If you have light shining on a prism and reflecting on the inside of a cylinder, you get some super complex lighting where the light spreads out into different wavelengths and reflects around in complicated ways, and sometimes we have customers that demand that. Here's
an example from KeyShot 8, actually our latest version, of what we can do. Here we have a spotlight shining light through a fork onto a product about to be revealed. We can do light focusing through glass spheres. We can do creepy things too; if you have a hard time sleeping tonight, it's because you're thinking about this person. Sometimes we have to do complex materials, such as a sponge, for customers that make sponges. What's interesting about this one is that it's entirely a material effect: it's actually a smooth CAD surface. A lot of the companies we work with work in CAD, and CAD is intended to make smooth surfaces, so if they had to model something like this, I think they'd all quit before they started. Here it's all done as a material effect.

And of course you can take it to the next level. We work with Esben Oxholm, a very talented Danish guy who sometimes takes the software and does things where we go, how did he do that? Here's an example of melted foam, where he put together different materials, mixed in a certain way, to create a very complex appearance, and we had to ask him how he did it. That's a really fun part of working with a lot of different industries, when they come back to you with things like that, made in your own software.

Here's another, more recent example. As things get more complicated, you have products with embedded metallic flakes inside them, like a plastic toothbrush with little metal flakes in it. Again, no one wants to model that; you don't want to sit and place little flakes by hand. So we've made a material where you just add flakes and they're distributed inside the material. That sounds very specialized, but when we did it, we had so many people come back to us and say: this is a game changer, I can finally get flakes inside my material. And then again these complex materials: here's glass that's
transmitting only certain wavelengths of light, so you can do this advanced filtering effect; we work with a number of companies where this is very important. We've also worked on layered stacks of thin films, coated glass. We work with some sunglasses manufacturers: if you buy those cool sunglasses that look blue and transmit red light, it all comes from stacks of little thin films on the glass surface, and we can now simulate that very accurately. For the companies who work with us, this is really changing the way they work, because they used to do physical prototypes of all of it. They would make tons of glasses, look at them, see what they looked like, and if they didn't like them, throw them away. By doing it all on the computer, you save a lot of resources and a lot of the physical prototypes you'd normally have to make; you can do an accurate evaluation and assessment on the computer. Here's steel with different finishes, again trying to get very accurate representations of how steel can look as you prepare it in different ways.

We also have technology to actually measure materials. Here's something from a conference we had last year: we made a KeyShot bag for the conference. This is a photograph, and in the front of the photograph is the bag we had for the conference. We took a sample of the bag, put it in the scanner in the left corner of the photograph, scanned it, and applied it inside the software. On the monitor on the right side is a KeyShot rendering of that material applied to another surface. So if you have a material and you say, I just want this material, I don't want to know what it's really made of, I just want to measure it, we can do that as well, with accurate measurements capturing the threads inside that bag material. And of course we also have, in the software, once you have your picture out and
you're happy with it, all the adjustments photographers make: you can adjust the color temperature, the brightness, and so on, and you do it all in real time.

We work with architects as well, where we have set things up to simulate really complex material lighting efficiently. This was a request motivated a lot by VELUX giving us some complex models. VELUX has a product called sun tunnels, also known as light pipes or solar tubes: you have a receiver on your roof and a very reflective pipe that guides the light into the building. It's an innovative way to get daylight into a building where you don't have room for a window; inside a space like this, if you want daylight to come in, these pipes guide the light in. They couldn't simulate it, because it's so complicated, so we worked on algorithms for quite a while to do that. Here's a CAD model of a building where all the little holes on the top are receivers of sunlight that guide light inside the building through these tubes. And here's what we added in the VELUX Daylight Visualizer, where you can simulate sunlight coming in from outside and get a correct measurement inside the building. You can then take that data, put it into KeyShot, paint the model, and simulate what it would look like under various daylight conditions.

Here's another example; I'm going to show some customer examples now. Here's Chrysler with a headlamp simulation, where you see the light pattern coming out of the headlamp. Of course they use it more for actual automotive renderings. Whenever you see a car ad where the car is parked on top of a mountain and you wonder how they got it there: they didn't. That's just the way it is now. If you look at product catalogs today, it's pretty much all computer graphics; there are very few
photographs out there anymore, because photography just takes so much time and this is so much easier. Even something as mundane as the IKEA catalog is like 95% computer graphics. That kitchen with the plates stacked in a slightly skewed way? They have software that does that for them, so it doesn't look like computer graphics. And there are more examples. We always like cars; cars are cool. Here's an Audi website: when you go configure a car to buy now, it's all computer graphics presented to you, even the trees in the background.

Some of the things we do in KeyShot that make it special: we work with customers that have large data, not just physically large, like a Caterpillar truck, but also large in the number of polygons they put into it. Here's an example with half a billion polygons, really heavy data sets that they load, and that scene is from a couple of years ago; today I wouldn't be surprised if it's twice that size. What they like is just being able to load something like that; it just has to work. There are some aircraft here, and of course if you buy a custom aircraft, you want it designed very much to your taste as well. And here are bicycles; we like bikes too. Specialized did their whole marketing campaign in KeyShot. And here's Peloton, if you like spin bikes; sometimes they mix things, so in this image the treadmill is computer graphics and the lady is an actual person. A few years back we worked with Microsoft; they did the whole Surface marketing campaign in KeyShot, and if you buy a new product from them, the picture on the box has been done in KeyShot. Pretty much any electronics you buy nowadays, there's a very good chance the picture on the box was made in KeyShot. Here's Lenovo again, an example you can see on this monitor here. One
thing you may not think about is that if you look at a cell phone, or a watch, or a laptop, they always have this reflection line on the screen. So we made it possible in the software to click on the screen and place that reflection line exactly where you want it, because they use that so much, and it can be hard to do if you have an outside lighting environment, where they always stand and sort of wiggle it around. But in the software we can say: OK, this is the way you want it. The same goes for sunglasses and jewelry as well. And then this one, Coca-Cola. We had Coca-Cola come give a talk at our conference, and even Coca-Cola has turned to computer graphics now; they model everything. One thing that's interesting about this picture is that you might think they put a computer-graphics Coca-Cola bottle into a photograph of a lady on a bench, but everything is 3D, even the lady, because if they want to zoom in on the bottle and change the angle, it doesn't work if you just embed it in a photograph. So now you can see how the focus changes, how her appearance in the bottle changes, and the drops change as well. Everything is just becoming 3D now, even what used to be background material.

Sometimes we have to do illustrations, so we can do these cutaways, where we remove part of the model. We've made software that does this automatically, because it's very difficult to model. And sometimes it has to look a little bit more stylistic, cartoonish. There are lots of products out there, so sometimes they have to look a little different, like a cartoon. And here's character design. You could say we mainly do product design, but in the movie industry they also have products: the characters they put in the movies. So we actually work with quite a few studios that use KeyShot for their product design, or character design, inside KeyShot. This is another example; I don't know if this ended
up in a movie or not. And then the watches. Here's an example from Fossil, the watch manufacturer in the US. They use KeyShot, and we work quite closely with them, because when they make a watch, they have to make it in lots of different colors. So we've made technology to add configurations: when you have one product, you can have it in many different configurations, different colors, different faces, inside the software.

Let's see if I can play this one. This is an example of real estate animation in KeyShot, and here's a product from Fender, where you can notice the detail in the materials, and also how they show all the different color variations. It makes it much easier to add that in software; you can easily animate it, and it's a quickly done animation.

So where we are today with KeyShot: what we're really doing is making images, using it as a camera for your data. That works really well, but nowadays there are just so many demands for getting your data out. You want it out as pictures, like photographs. Where do they go? Do they go on the web, in a marketing campaign, on a poster? Do you want it inside some dynamic environment? Do you want it on Facebook as a 3D experience? There's a new format for that called glTF, and there are new formats and experiences coming out all the time. Do you want a VR experience? We're working on that quite actively right now, and we're about to release a product for doing VR, so you can put on a headset, actually see your product, and experience the scale of it. Or do you just want someone else to view it on their monitor and have a 3D model? We're trying to support all of these things, so we can have KeyShot as a source and send the data out to all these different channels. The viewer, as an example, is used by a company called Under Armour. Here's an example from them where they have a shoe, and they used to make tons of shoe
samples and send the physical prototypes around. Now they can just load it inside KeyShot, and then we have this viewer where they can share it, click around, and see which of the different color variants they want to actually produce before making a physical prototype. This, I think, is really replacing a lot of physical prototypes and reducing cost as a consequence.

VR: we do have some newer things here. Unfortunately I cannot really show that, but with VR you put on a headset; you want to look at that car before you buy it, you want to see it in front of your house and see if it looks nice, and maybe you want to see inside it and see what it feels like. What we've done is, when you have a setup in KeyShot, you can just press one button and then you have it in VR. What's different about VR, where we had to change the technology a little bit, is that you actually need 90 frames a second to get a good, smooth experience, so you can really move around in a very natural way. We've developed all of that, and a large part of the product around it, and if you want to see it, well, Morsigos right here can show you all of that; if you're interested, just go talk to him.

Almost done. I said we started working with the movie industry, but what happened is we don't really work with them directly; they do use KeyShot for some of the character work that they do. In one of the Star Wars movies, where they had Princess Leia and Tarkin, they actually used KeyShot to validate that they could do the characters accurately enough to do a digital actor in the movie, and I'm pretty excited about that. It was modeled by Marco DiLuca for that movie; maybe you can see the face that he did. I'm still not going to say which side is the photograph, I'm just going to leave it at that. So thank you for your attention.

Thank you very much. I basically just want to pull out two chairs and sit for an hour with you to discuss all
these things. Unfortunately we don't have that time, but I am going to take the liberty to go a little bit into the break to ask you a few questions, so I hope you'll bear with me and enjoy them as well. One thing was that I actually watched Avatar with my daughter the other day, a little preparation for this talk, and she asked me the question: what's more expensive and more complicated, making a movie with real people or making one with computer-animated figures?

Computer animation is very expensive. If you look at Avatar, which was a breakthrough in many ways in 3D, four years of work on the effects, I mean, that's very costly. If you just have about 20 people jumping around in a forest, you could probably do that on a much lower budget. So it is expensive, but obviously the technology is moving forward; as with all software, it just gets more and more efficient, and that's what they're looking at now: how do we do more and more with this? Right now they're making Avatar 2 and 3, so of course they can reuse a lot of the technology. And they can create experiences you otherwise couldn't: some of the scenes in Avatar have the real people replaced by computer graphics, so when they're jumping out of helicopters, it's all digital, because of the risk to the real actors.

So does that mean that there is no use for stuntmen in the future?
Probably not. I mean, a lot of it is starting to be replaced by computer graphics, and you can imagine why: stunt acting is dangerous. So again, do you want to jump out of that car driving down the road at 100 kilometers an hour, or do you want to have a computer graphics person do that in the basement and make it all happen? So yeah, I would agree with that.

You've shown us convincing examples of something that looks so close to human beings that we can't even tell the difference. This also raises some ethical questions, because it creates the opportunity of making deepfakes, of not even just using someone else's picture but creating someone else's picture and using it in a compromising situation. What thoughts are you having around these sorts of ethical questions with the work that you're doing?

I mean, it's always complicated, and I think you can do it with anything. You can also make a picture of a product and sell it, and it doesn't even exist. You can exploit all this technology in so many ways, and it's really difficult to control. When you create it, you just want to achieve a goal, and making sure that bad actors don't abuse it is very difficult. You can do deepfakes in so many different ways, not only creating them from scratch; with all the deep learning technology coming out now, you can just manipulate videos and put other people's faces in, and so on. So it's difficult, and I think that as the technology gets better and better, it gets harder and harder to deal with them. This is why I usually tell people who are, you know, shocked at teenagers sharing naked pictures of each other that they can't just tell their teenagers not to share pictures, because you can make them anyway. You can talk to the teenagers about why they should respect privacy, and this is the conversation we need to be having.

So, one final question. It's pretty obvious that we can create digital twins, or avatars of ourselves, in a
virtual-reality space. Are we seeing this happening? As we know, Facebook owns Oculus Rift; will we be wandering around in a VR Facebook looking exactly like ourselves in the future, and how soon?

With a lot of smart people working on that right now, the technology is not quite there; it's still a little ways off. But I'd say there's so much happening in this space that it is going to happen at some point. It's still a very, very complicated thing to do, because making it all feel natural is just difficult, and that's why it's an exciting problem. That's where you land in the uncanny valley, and you have to be careful not to do that. If you look at the experiences they have now, it's all these stick figures they have in VR; rendering the surrounding scenes is difficult enough, and actually seeing someone else in that space is really difficult. But it's certainly one of the things we're looking at, whether we could do that, because it has huge applications. I mean, right now I'm a bit jet-lagged and hazy because I arrived late last night, so if I could just have a digital version of me here, I could sit at home and just have a coffee. Maybe I'm standing in my robe, I just got out of bed, but you see me; all that could be quite nice, to have a feature like that. The problem here is that it is a service that has to run in real time. You can do it today with kind of stick-figure avatars, but making it look really realistic at that speed is difficult, and then having the rest of the environment seem natural is difficult. But a lot of people are working on it, and, say, computer games are kind of pushing the envelope there by really trying to make an engaging experience, and I think we've seen a lot of exciting technology come out of that area. What we're looking at is just the idea of experiencing your products, so you could say the barrier there is a little bit lower, where you can have that static car,
or maybe you can walk around it. We feel like starting there and then taking that to the next level, where you now have, say, architecture companies. Actually, I visited Oculus, and since it's Facebook, it's a social company, so they want multiple people interacting with the product at the same time. We now work on that as the next level: how can we have multiple people, let's say car designers, socialize around this idea of designing a car, so they don't always have to fly to the same location again and again to discuss design details? So that's where I see a lot of innovation coming in the next couple of years.

I guess it's also an ecosystem, where it's not just the software, but also what the hardware can manage and what the network supports. OK, thank you very much, and let's give him a big hand.