I have to make pictures, yeah. Are we done? Monique, we need you. You can touch it. Behind something. More and more developers, good. Okay, so in this session we keep everything very informal. The rule is simply this: you have a question, I'll give you the microphone, and then one of those people will answer the question. But before we do that, I want to go very quickly past everyone so they can say who they are, their name, what they did for Blender, or what they're working on.

Oh, hi. Yeah, I'm Lukas. I've been working on Blender, like, two years ago, I think. Yeah. A tiny little bit here, a little bit off.

Hi, I'm Monique. I'm working on the sample-based compositing. Does anyone know what that is? No. Where is it? Is it public, finished? No, not finished. When is it finished? Oh boy, it's going to take a while. In the future, definitely in the future.

Hi, I'm Mike Erwin. I work on OpenGL and system compatibility. Albeit working now for Epic. And you are working for? And I work for Epic Games.

I am Bastien Montagne. I'm working on Blender development, mostly on asset management, ID overrides, ID management in general these days. And I'm working for the Blender Foundation. And he fixes like half of the bugs in the bug tracker. It keeps Blender stable.

I work for Blender Animation Studio. And what is Blender Animation Studio doing nowadays? I don't know, I think that's a question for you, but I do everything. We have no idea what they're doing.

Hi, I'm Brecht van Lommel and I mostly work on Cycles nowadays and some other parts of Blender.

I'm Stefan Werner and I'm working for Tangent Animation, mostly making Cycles faster, use less memory and not crash.

Yeah, I'm Pascal Schoen. I'm working for adidas and I implemented the Principled shader in Cycles.

Hi, I'm Lukas Stockner. I'm doing Cycles development stuff. I'm working for Theory Studios now, and some stuff that probably some people may know, like the denoiser, and I'm also doing UDIM, network rendering, stuff like that. When do we get UDIM? You can get it, it's public, but... The patch has been submitted for review? Not yet. I'll do it once I'm home. You do it when? When I'm home. I'm home. So that's like tomorrow. Or Tuesday. Okay. You have to keep them sharp.

Hi, I'm Sybren and I'm currently working on Alembic. Ah, keep working. Keep working.

Hi, I'm Joey. I work on VR stuff in Blender at the moment. Probably going to do some GL stuff as well for 2.8, maybe some Vulkan, but I'm not going to talk about that.

Hi, my name is Jeroen and together with Monique, I'm working on the sample-based compositor.

Hi, I'm Keir. I work on the motion tracker for Blender.

Hi, my name is Sebastian and I work on fluid simulations. Oh, come on, come on. You have been working on Blender. Yeah, I have been.

I'm Andrea and I've been working on Blender; 2.5 was the last time I did the big things. A lot of work, a lot of work. The file browser, file system, little stuff. And now I'm looking for an area that's a little bit abandoned, maybe, to come back to. The sequencer. The Blender sequencer. That's really abandoned. Nobody does it. Maybe. We'll see. Okay. I do too.

Hello, I'm Inesh and I was working on Blender a bit of time ago. Now not so much anymore, as I got employed somewhere else. But I hope to get a bit of free time again and go back more into the game engine, the simulation parts and interactivity. I see you're an expert now in game logic and how to design game logic. That's what I heard.
So we'll pick your brain and make sure that all the design comes out. No, I know how difficult it is. Okay. Thank you. Okay, I think I had all the developers. You can manage this. So now, guys, girls, what's the first question? What do you want to know? You know everything already. Come on. Feature requests. Everything is fine. You were first.

Logging. I'd like there to be more logging, and I'm curious to know whether it appears in anybody's plans. In particular, I get very frustrated that Blender's messages from scripts just pop up this thing that says, go look at a console if you have one. And I end up having to go back and restart Blender, this time from the command line, so I can start seeing output better. And I'd like someone to talk about logging. Logging. Logging. Come on. It already exists. Well, it does exist, but it could be made better, but we don't currently have any resources to work on that. It's a nice project for a beginning interface designer, blah, blah, blah. Anyone just around to implement it? What do you mean, not for rendering, but in general? For all the interaction you do in the software? Or do you mean render logging? It's like Python errors where it says, hey, please check the console, but the issue is that on Windows you can toggle a console, but on Linux and on OS X you cannot toggle a console unless you've started Blender from the console. So it's kind of... We can kind of hack something by overriding stdout and stderr and piping them into the Info space and printing errors there, but there are interfaces possible here. I don't know, I hope someone from this audience sees it's not rocket science and they can totally help here and get someone else involved in development. Is that not something for Campbell? What is the Python department? It's not directly Python, it's like some C and some interface. It's mainly a question of how to present this in an interface, I think. It's on the radar now. It has been logged on video, it has been streamed, so we will keep track of it, okay?

What are the chances of getting Natron and Blender to share code or concepts on the compositing side of things? Because both have benefits. I don't know, I know we talked to some Natron devs, but Jeroen... what do you think? Yes, we talked to the Natron devs, and it's also open source, so we looked at the code. They use a totally different system and, based on that, it's not easy to integrate it inside Blender as in sharing code. It's more on the technical level that they make, and we make, different decisions about how to do compositing. There's also a difference between a standalone compositor versus the integrated compositor we have. So it's choices. What we can do is make sure that Blender can export the files, the multilayer EXR that we have as a shared format, multilayer OpenEXR files. Is that working for anyone else? Anyone use Natron? Natron? Sure. Does it work with Blender if you save a multilayer EXR? Can you read the multilayer? No. Some can, I didn't. I saw, maybe, no. He used Natron, that's right. But I mean, with Natron there are two problems. Of course, one is that it's a clone. It's meant to be a Nuke copy, right? That's good. It's really awesome if somebody does it. But that's what the design concept is. They should keep doing that. But that's a goal in itself. And the second is they support plugins, OpenFX. OpenFX plugins. That's a library of plugins which is allowed to be commercial and closed.
For example, in the first Natron demo I had from the maker, I said, okay, let me click around. So I'm going to add a picture and I want to blur it. And then he said, oh, sorry, I don't have blur yet. So you have to install a commercial blur plugin. And I applied it. What happened, I got a watermark on top of my picture. And then if you want to get rid of the watermark, you have to pay the coder of the blur plugin, the license fee, to get your blur working. And I can understand why, but that's really not my thing. That's not how I like to work. So Natron has been evolving, getting better, but they still use OpenFX plugins. And that's an infrastructure that you cannot combine with Blender, because in Blender, you cannot have that kind of plugin. You want to talk? Yeah, yeah, yeah. You're wrong, but maybe their design decisions are more correct than ours. Are you a user? Used to be. Well, it depends on what you think is better. Because the OpenFX format is also a really very old format, and it's also not really up to speed with modern hardware. So although the compositing code might not be shared, the motion tracking code is shared. They are using the Blender code, and they worked with them to get it integrated. So Natron is using the same tracker. Ah, our problem.

Hello. Is it possible to render the alpha channel behind a glass shader material? Is it possible to render the alpha channel behind a glass shader material? I did look into that a while back. There were some requests, like being able to composite refraction on top of something. The problem is, theoretically it's doable, but really, you have the problem that if you have a refracting object in front of a background, you would want the background to be refracted. And for that you would need a separate render pass that stores the outgoing coordinates and everything. And it's not as easy as it sounds. It could possibly be added, but it would not be as useful as you might think. Well, I think, I mean, it's indeed true that you cannot then refract the background properly, but I think in a lot of cases it's okay to ignore it and just assume it. It's basically to treat it as if it's transparent. Indeed, if you have a window, the refraction is very little, because the refraction through a window is very little. So it's certainly something I think we can implement. But is it now using some kind of fake alpha, or not at all? Well, I mean, you can kind of do it manually, right? Right now you can have a refraction or a transparent shader, and if you replace your refraction with a transparent BSDF, you can manually make like a glass shader that kind of does this. Indeed, they're not... I mean, in Blender Internal, we had alpha for glass, lots of code. But it was all fake, right? It's just a fake number. You look at it: okay, this is, I think, the amount of transparency, and we call that an alpha number. A little bit. Cycles is much more complicated, because Cycles is about light transport and stuff. So the transparency is not a value you can take out of the picture and then composite into another picture. Simple. Well, I mean, I know how to do it in Cycles to make it work. I mean, I've implemented it in Arnold, so I mean... How would you do that then? How? No? Well, I mean, you just... If you hit the background, then you make it alpha, and if you hit another object, you don't... I mean, it's... Okay, but it would never look as if it was rendered in one pass.
No, of course not, because there will be no refraction, but if it's glass, then there's not a lot of refraction anyway. I mean, so it's... I mean, it's on my list somewhere to implement. I don't know when, but... At the bottom, what? Well, you know, everyone who asks, you know... The more you talk about it, the higher it will go. So what is at the top of your list? I don't know. That's... I mean, it's not a public list, you know. It's... You will surprise us. Anyone from this side?

I guess my question has to do with rigging. Here, you can answer that. But is there a way, in the future, to have more control over the deformations produced by bendy bones, you know? Maybe like weight-painted deformations, to get just the result you want? Bendy bones. Yeah, bendy bones. I mean... I think animation right now is just not a very active area of work. So, but yes, it shouldn't be that hard to implement. It's always the same question: you need developer power to do it, so... It goes to the bottom of the list, I guess. That is Aligorith, yes. I think Joshua is totally up to look into bendy bones improvements, because he recently had quite some fixes and little improvements in there, so... Maybe Joshua is looking... Joshua is currently a developer from New Zealand. He's also working on a development grant from the Blender Foundation, and we will try to keep him on board for the coming period to sort of work on animation features like the bones and other things. But this is ongoing, so I don't know what the result will be. You?

Hi, I'm from a bit of a CAD background. I was wondering if there is IGES, STEP, 3DM import possibility, maybe with a plugin? .3DM, Rhino. Is that the 3DM or not? Yeah, yeah, that's the 3DM. I know only... I know only Rhino. So, yeah, I'm also using Rhino, so I know this file format, because at adidas we use this extensively, and I haven't really thought about doing some importer for this, but maybe this can be done. I have to have a look into how Rhino is storing the data, how 3DM looks, and maybe this can be done. Maybe at first as a plugin. Is it a text format, 3DM? Yeah, I think it's a text format. But I think, yeah, I have to look into this, because it's normally for NURBS curves and not for meshes. And yeah, they added meshes to it, but yeah, I have to have a look into this. What I understand is that Rhino is more or less parametric surfaces and everything, which we cannot represent in Blender natively, so there are going to be some limitations anyway, no matter how useful it is. Everybody's using subdivision surfaces now, right? What do you need NURBS for? No? And who needs NURBS? Can I see the people who use NURBS? Wow, what? You? Oh. Or the people who use Rhino for modeling. Rhino will use this. But there's no way you can export from Rhino to Blender. Does Rhino have OBJ or something? Yes, always converting first to meshes, yes. But then you lose all the potential of having NURBS. So, yeah. You can export in OBJ or FBX, whatever. But, yeah. An implementation could be nice. I mean, the NURBS code in Blender is at least 25 years old. At least. 30, 35. So, when I was a little kid I wrote the NURBS code. I thought, well, it was very modern then in '92 and stuff, and everybody was doing NURBS. But nobody really picked it up. We had a couple of people who tried. But NURBS is very technical. It's mathematical surfaces and stuff. But NURBS should also be presented in a way that you can use it as a modeler.
It's fun. Rhino is doing a good job making it accessible for artists. And that's not a trivial thing. There are a lot of really cool tools you have for trimming and cutting and melding and blending and those kinds of things. That makes NURBS editing cool. But we don't have all those things in Blender. If you could export the raw NURBS, Blender can't do anything with it, because we don't have all the support. So you have to stick to converting it to meshes. Or we call someone from that side of the camera to help us improve NURBS. What we could do is T-splines one day. T-splines, yes. Is there anyone here who knows T-splines? Who knows what T-splines are? No? It's also a plugin for Rhino. It's about passing seamlessly from a NURBS model to a mesh model, modeling in SubD, and then bringing it back. T-splines is something halfway between SubD mesh modeling and NURBS, but it has been patented and locked in by an evil corporation from the United States. So we don't know about that, of course, but the concept is not that difficult. So the whole idea of making NURBS editing more accessible is something we could work on in Blender. I don't know where I was. Just on the NURBS thing, there's this awesome program called Moment of Inspiration, I think it is, MoI. I think it's by the guy who worked on Rhino. He made a NURBS modeler aimed at artists. It's incredible. So I suggest maybe you buy them out. That'll do it. You can collect the money from me. Crowdfund.

Hello, my name is Serge. I have actually three questions. I'm sorry. Oh yeah? Okay. So this is how it works. Okay. The most important one: Ctrl-Z is sometimes really slow. I mean, for me it's sometimes faster to close Blender and reopen the file than to use Ctrl-Z. So yeah. What are you using? What do you mean? What are you doing to make Ctrl-Z that slow? Are you using like a pile of modifiers, or meshes with billions of vertices? No, but we use lots of maps and terrains and visualization for architecture, and every time, Ctrl-Z is crazy slow. This is the undo system, and basically when you undo it's like opening the file from scratch. So what happens is all the objects need to be evaluated from scratch. There were plans to keep all those objects in the undo state, so we can make undo much quicker. But on the other hand, the question is, do you have a multi-core machine, and did you check if all the cores are 100% busy? Because if they are not, then we can optimize something there. In 2.8 things might change because of collections and things. Currently, if you want to optimize your work, think of using group duplicators, linking in your data from other files. If you have, like, a big terrain and you don't work on it anymore, link it in, because your undo will then skip everything you link in. So your undo is then restricted to what you have in the local file. So you can create really complicated things, and if you link it in, it is not part of the undo system anymore, and then you can keep working very fast. That's a good tip. So my suggestion would actually be, if you can come up with a scenario, with instructions, that actually reproduces it reliably, then you file a bug with it; that would be very helpful. So if you can make it slow on my machine while I'm using it, then I can fix it. That makes it a lot easier. So if you have some sequence of operations that you do, and then you press Ctrl-Z and then it's very slow, that makes it easy for a developer to fix it.
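(To illustrate the linking tip above, here is a minimal sketch using the 2.7x Python API; the file path and the group name "Trees" are hypothetical placeholders, and in 2.8 groups become collections.)

```python
import bpy

# Hypothetical library file and group name -- adjust to your own setup.
filepath = "/path/to/assets.blend"

# link=True keeps the datablocks in the external file, so local undo steps
# don't have to re-evaluate them; link=False would append and make them local.
with bpy.data.libraries.load(filepath, link=True) as (data_from, data_to):
    data_to.groups = [name for name in data_from.groups if name == "Trees"]

# Instance each linked group with an empty, like Add > Group Instance does.
for group in data_to.groups:
    empty = bpy.data.objects.new(group.name, None)
    empty.dupli_type = 'GROUP'
    empty.dupli_group = group
    bpy.context.scene.objects.link(empty)
```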
If there are more than four steps, including the .blend file is much more handy. The art of bug reporting, that's something they can talk about forever. So if you talk after the meeting, you can ask them, hey, how do I make a good bug report, and they know exactly how to do it. But the key thing is always that you don't consider it to be a support thing and start with "I have a problem, help me now"; you have to transport the problem onto somebody else's computer. You have to make sure that the developer gets the problem. So he should be able to redo it and recreate it and then experience it, and then you can get it fixed. But if the problem is only on your system, and you have no idea how or what, then you need help, or you have to find out what's going on, and then you can get it fixed in the end. A suggestion for this in particular: when you are clicking "Report a Bug" it will take you to the developer website to report a bug, so that could be the first line of instruction there. I don't know if they are there. Could be, yeah, they are, and a video even. I've reported twice, but I did the thing without reading, because I was able to do it successfully, but... For our form? Yeah, the form. Okay.

Also a question: I'm doing a particle system with linked dupli-groups, and only the empty is showing in the particle system. When I'm doing particles with linked trees, linked things, the dupli-group only shows the empty. Will it be possible in 2.8, maybe with overrides, to have particles rendering groups, with the dupli-groups linked from a tree library? Yeah, it only shows the empty, and if I'm appending the trees I'm fine; with that I'm using local groups. This is a question for Sergey. Do you want to answer? Is this just a dependency... Correct, a simple .blend file, and I'm looking into this, but... There is already a bug on the website, but it's kind of a dependency or a recursion or something stuck. So maybe after the session, talk to him. But the duplication system and the particle system will be reworked for the new dependency graph and for the new render engines and ownership model, so there's definitely something we can look into and see what we can fix later on. The full particle system will be redone from scratch anyway. Okay.

So, do you have any plans to improve texture painting, so we are able, for example, to paint in separate channels, in the red channel, in the green channel, something like that, like we can do in Substance Painter? Or if you use Substance, then you have another opinion? No. Well, we had the GSoC project this year, but it was more on vertex painting, not on texture painting; similar code, from texture painting to vertex painting. So we know that it might be improved, but we don't currently have any priority on those improvements, because it's kind of good enough, and we have much more important broken stuff that it's better to take care of. But it's also not rocket science, and someone could just help. What is rocket science then? I don't know. Building rockets is. Oh, aha, I see. So when are we going to make rockets? As the Blender community we could finally do rocket science, yes, and then we put Blender on the moon, on Mars. But seriously, I know Campbell, he worked on the texture painter mostly. There is also a developer who is making an add-on. Was he at the conference, Andreas? No, I don't think so. But Andreas is doing fantastic work, it's really good. It's BPainter, I think it's called, you know it, the add-on. So he's really smart in thinking about the features that are useful for an artist to have, so he's doing all the
research and design, and we are in contact with him, and I connected him with Campbell already, and that kind of stuff is fun for him, it's not that much work. So when we have 2.8 a little bit more on track... Campbell is also working on new UI stuff and workflow, those kinds of topics, but as part of the whole process next year I'm sure he will pick up stuff from Andreas and make sure that the painting in Blender is upgraded. Not only that, I'm also looking for people to really look at a better texturing system: procedural textures, combining them with layers and nodes, to make sure that we don't have to have a plugin like Substance Designer, that we don't need it. That's the philosophy of the Blender project, sorry guys, but that kind of workflow is fantastic. And Blender's, our old Blender material system, was actually almost something like that, but then with a really bad interface. So we have to make sure we have good real-time rendering, we have to get a good interface with good layer definitions and node systems to manipulate your procedural coordinates, and then we have a texture system in Blender. It's really not rocket science, right? So who's going to code that? Who thinks that's interesting to code? It's a trick question. Ah, you think it's interesting. When you're done with Alembic you have other stuff to do. I'm looking for people to work on this, and it's one of the things we can do in the 2.8 period.

A question about the new collections in 2.8: will they be linkable like groups, and will they be usable in rigging, for linking rigs? Well, I mean, it's more Dalai's work currently, but he's working on a type of collection which is actually going to replace groups, so you can choose a collection and decide to make it a group collection, and then that would be linkable. And yeah, I mean, it's supposed to replace groups in the end. Wasn't it that a collection links to a group? Yeah, but proxies, we nuke them. Proxies we replace with ID overrides; it's a new system which will be more robust and hopefully much less hacky, so we can maintain it easily and make it more powerful, more easy to extend too. We want to get rid of proxies.

Hello, I have a question concerning masking and rotoscoping. I was using this for masking video, and my problem was that when I applied feathering, the keyframe stored all the information, and when I altered it, the feathering was all over the place. Somehow the keyframe for one mask handle stored all the information, including the feathering, and when you want to alter this one handle it brings it all back. What is the normal feathering for? I don't know, maybe I was doing something wrong, but what I would like to have, and I didn't see it, is an extra keyframe for the feathering. Yeah, that would be great. I think the keyframe is currently stored per control vertex of the spline itself, and then everything else is kind of interpolated based on the distance, but it shouldn't be popping up that much. So if you have such a case, please do a report and assign it to me, and I'll try to see what we can do from Blender's side to support it better. But at least there's one person still using the masking code. One person in the world. Oh, there are two, two people using it, three people using it, and more people using masking in Blender. So we have to keep making sure that it keeps working all the time. It makes the render errors go away, it's really good.

Is there anyone still using Blender Internal for everything? Just to ask, are you very unhappy that we are kicking it out, or do you even know that we are kicking it out?
Yeah, I'm unhappy. So, of course it's not out yet, the code is still there, right? When is it going to be actively kicked out? I can kick it out tomorrow. Yeah, you can, you can do anything you want, but will you? Okay, challenge accepted. So, around lunchtime I was in a meeting upstairs about the game engine, and people said, you said last year that we are going to kick out the whole game engine. I said, I didn't say that; we're going to work on having a better game engine. And the same thing with Blender Internal: we improve real-time rendering, and what Eevee is doing is showing that you can now get better quality rendering in real time than what you can get with Blender Internal. For the game engine we want to have a similar thing: people see now what's possible, and then you have to get a tool that allows you to make interactive content in Blender. That's the game engine, but it should be well integrated. So yes, we will keep working on that. Oh yeah, but in 2.79 you will have Blender Internal forever. Alright, forever and ever. But in the first version of 2.8 I don't think the old material system is still there, so then you have to nodify everything.

I typically use quite a lot of objects in a scene, let's say like 2000, and these objects are typically instanced. So let's say I have 50 or 100 unique objects, each of these objects may be instanced, and I even have LODs to run faster, but still, if I have 2000 or 3000 of them in a scene it may run at low FPS, and I'm wondering if the implementation of the new viewport in 2.8 will bring some speedup to render many more objects. I'm not sure that's exactly the problem, but there are multiple problems when you have many objects. The dependency graph would not be happy when you have many objects, but that's only during construction time. Adding an object could also be an issue, because it does checks with all the other objects to compare the uniqueness of the name. For the drawing, we implemented new batched rendering, which prepares all the instructions and then really easily instances them all over the viewport. But that's something that Mike can explain better, how it works and its limitations. Yeah, it should definitely... If they're, like, exact copies and you're just instancing these objects, the new draw system should handle that pretty well. Getting into different things like LODs, I'm not so sure about that, but basically if you have one object instanced 10 times in the scene, it does the CPU-side computation of that once and then it can draw it multiple times, and it should be much better than 2.7.

If Blender Internal will be replaced by Eevee, will we be able to render in the background? We are looking into that problem, because there are some limitations of OpenGL, which on Linux requires having an X11 session, but with some other stuff like EGL we can avoid this, or some other tricks. This is on the list to look into. It is on the list to look into. This is actually the same issue as currently having the OpenGL playblast: there are lots of requests to be able to do those in the background, and it's going to be solved sooner rather than later. Well, even better, you could also have a little render farm with like 3 or 4 graphics cards, and you have one-second render times per frame or something. That would get some work done. Exactly.

I wanted to know about weighted normals for low-poly models, game models. What's the situation? Because I know there's a Google Summer of Code project; I don't know if it's merged, if everything is merged, and if there's anything that the core developers need to do.
Before, things like the mirror modifier were not working with weighted normals, as it was an add-on. Is there any development going on, and is that Summer of Code project being merged completely, or does it have problems? We need to review the patch. I mean, the guy submitted the patch like one week ago or something like that, so I just need to review it first, and then probably get a second pair of eyes on it, and it should go in, hopefully, like next month or something like that. Yeah, I mean, globally it's okay, probably some details to be checked, but globally it's okay. We're not going to release 2.8 soon anyway, so there's time, right? So when are we going to release 2.8? When we are ready.

Hi, we're, I think, all really excited about Eevee and kind of excited to try that out on the Mac side, but from what I can tell Eevee is still broken on the Mac side, and I was just wondering how that's going. And I also wanted to say thank you for the shadow catcher. I get the idea that Clément is not here. Clément is hiding. Yeah, I think it was broken like 2 days ago and then it was fixed, but I mean, it might depend on the specific graphics card combination; in general it shouldn't be broken. So if it's broken, report it. But yeah, it might just break once in a while while things are developing. We don't accept bug reports on 2.8.

I'm wondering if there's any way to stream Eevee into a window in my application that I'm developing, so that I just have it as a display, using Blender kind of as a back end. Maybe it's an OpenGL window in my app that it is sending the display to, I don't know. Just make a bug report, I would guess, with at least some screenshots or so. I think, for example, what Nimble Collective is doing: they have a remote desktop client and they stream all the graphics. Oh, that thing. No, for this kind of stuff a lot of development is going on, making virtual workstations and making sure that all your OpenGL drawing can be streamed or wrapped to other clients and stuff, and that's not depending on Blender. It's just an external program that makes sure your drawing gets streamed somehow to another application. I don't know the details; it just kind of depends how exactly you do this. For example, X11 forwarding through SSH will totally work for this. If you use VNC it will also work. If you use Remote Desktop, then Windows forces you to use a software rasterizer, which is no longer compatible with the OpenGL requirements we need in Blender, and this is something we cannot solve from outside, because Blender just needs OpenGL to work properly. But there are ways around this, and they are all outside of Blender; there's not so much we can do about this. Wasn't X11 also designed to do this? X11 is designed for this; you can totally forward X11 through an SSH tunnel, it's not an issue. It's just the capital -X flag for the ssh command, and Blender just runs.

So this question is specifically for Lukas, I think. First of all, thank you for your work on the denoiser, you literally saved my life like a month ago on an animation project. So yesterday you had a great talk, but I think you didn't mention the denoiser at all for multi-frame animation denoising, so what's the state of that? The state is, I started working on it yesterday, and it's coming. The problem is, it already was working in the Google Summer of Code branch, but the problem with multi-frame denoising is that you want to use neighboring frames; of course, you also want to use the frames that are about to be rendered, and in Blender the render pipeline works like this: a frame is rendered, it's
composited, saved, then Blender completely throws it away and does the next one. So there's no way to do it as part of just clicking "render animation" and having it denoise everything. So what I'm working on right now is: you would enable a denoising pass, that will be in the pass options, you render the animation to EXR, then you press a special button that denoises the entire animation, and then you press another button that runs the compositor on top of the denoised animation. So the workflow is pretty bad, which is why I decided to just drop it for the first release, but I don't see any better way to do it. As far as I know, Pixar's RenderMan does the same, so I guess if they couldn't come up with a better approach... This is also a pipeline thing that the compositor has to work along with, because people who use the compositor also want to have, like, a whole shot rendered, and then have all the raw files around on a fast SSD, and then all the EXRs in the compositor, have a preview of a whole shot and do all that stuff; then you can do some denoising, whatever, compositing. But denoising also works better if you know what color space you have and that kind of stuff. That would actually be pretty interesting for the compositor, because it would mean that you can change your compositing setup and recomposite everything without re-rendering. Yeah, so I guess it's not too complex to do, I have to look into it, but first the denoising part should be done, pretty soon I guess. It's not rocket science, huh?

Hi, I have a question from Twitter coming in. So one common workflow is to have one screen for Cycles rendering and another one for different draw modes; how would that work in 2.8, since the render... They did extend the limit on the message by a factor of 2 recently, so it might very well not be 140 anymore. But the idea is you would probably use two workspaces for that, or Brecht may know better ideas. I think by screen he means... well, he meant like a 3D viewport, because it's not what Blender would call a screen. I suppose it probably means the window, so on the left you have the wireframe... So personally I think it's very important to be able to have, like, one 3D view in solid mode and another in Cycles rendering mode, and so I'll make that work one way or another. That's good news. I think... make it work according to design, yes. I mean, the decisions are in their hands, of course, in their good hands, but I think the strategy more is: let's first finish everything we already know, because I also almost got stuck in talking about how much we do for 2.8, and it's a lot, it's really big, it's very complicated. So first we have to prove that everything we already did and what we are working on is working, sign it off, get a really good interface, make it really usable, and then you can move on and say, okay, so now we have all those engines and all those drawing modes, how are we going to manage that more efficiently? We need users and designers and people to give us feedback for those configurations.

Hi, so, more a question about the way to interact with Blender, with that brilliant Python interface. I think this is really great, because it allows on the one hand to go and look at what's going on inside, at the data, but also to control things. Now there are two ways to do that Python: one is through just the console, the Python console, and you type things and that works fine; but then you can also have this text window there, where you can type a small script and let that thing do things. But it seems to me that, for Python from the text window, once you run it and it's
done, it has displayed things, and you have no way to find out in the regular Python console what it just did. It seems like it's two different Python worlds, so to speak, and at times it can be annoying, because you can do complicated things through a script there, and then you'd like to go and hack into it, but you can't; you have to use the text window. The Python code which is inside it, it's exactly as if you were executing a Python file. So it's exactly the difference you have with the basic Python executable: whether you are launching the interactive command line, or you are just passing it a Python file to execute, and then it executes this Python file and then it just quits. It's exactly the same thing. So usually you use the text editor to do something similar to add-ons or UI, meaning you create something which will be kind of persistent inside Blender, while the Python console is more for a very quick test or a very quick hack on the data structure, but not something persistent; you would write a Python operator inside the text editor and not inside the Python console, usually. Another approach you could use for this: if you call the text block something.py, then in the console you can import something, and that module will reflect whatever you have in the text editor. So if you have your code in the text editor as a function, you can just call it from the console and then hack on whatever that function returns.

Is there a plan to improve the Outliner? Yes. I mean, the Outliner needs a lot of work and a lot of improvement; the only problem is the manpower to do it. But I will need to do it for asset management; we ultimately will have to have some control of the assets in the Outliner. But I would first like to kind of rewrite it from scratch, nearly, because currently the way it's handling its internal events, internal operations and everything is very weird, very complicated and very non-conventional compared to the other parts of Blender. So it needs a bit of work, and I'm not very fond of adding features to the current implementation. So, in the dark corners of the internet I saw the planning from Dalai Felinto, and he had "improve the Outliner" planned for the coming weeks, because we need to improve the Outliner for collections anyway, so you should work together with him. But we're definitely going to improve the Outliner to support all the new features we're going to introduce; just stay tuned.

So recently, like this year — I'm a software engineer — I discovered, oh, you've got a Python API, that's great, and I started writing operators. But I discovered you couldn't write modifiers, and I wondered if there was sort of a conscious design decision behind that, or anything. Sorry, I didn't quite catch the question. So you can write operators in Python, but modifiers? No, you can't, because the problem is that a modifier handles huge amounts of data, meshes, and it's run every time you change a setting, every time it's evaluated, so it's just not possible to do it. You know, I thought Campbell was looking at how to do mesh manipulation faster, to have a couple of functions and have a Python script do massive mesh stuff. No, not with Python, please, no. Python has this stupid idea of a global interpreter lock, which will lock the full execution pipeline; when you have multiple objects using a Python modifier you all of a sudden fall back into single-threaded evaluation, obviously. So please, no. So then what? Then I don't know. C code, Go, whatever you name it. Open Shading Language?
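(To make the text-block-as-module tip above concrete, here is a minimal sketch; the text datablock name "tools.py" and the function in it are hypothetical examples.)

```python
# Contents of a text editor datablock named "tools.py":
import bpy

def list_meshes():
    """Return the names of all mesh objects in the current scene."""
    return [ob.name for ob in bpy.context.scene.objects if ob.type == 'MESH']
```

```python
# In Blender's interactive Python console:
import tools                  # the text block "tools.py" is importable by its name
meshes = tools.list_meshes()  # call into the script and hack on the result

import importlib
importlib.reload(tools)       # re-import after editing the text block
```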
Also, that's a general point; we were actually just discussing this outside. We often get requests for a C++ API, a modifier API, custom closures in Cycles. Blender is open source, so... That's important for commercial software, because, for example, if you're using Maya and you want a modifier you can either wait for them to add it, or... With Blender you can just implement it right inside the software, and that actually works pretty well. That's also what I always say, but I know there's an extra reason why people want it: it's just business, right? So if you make a plugin you can sell it, and have the infrastructure of a marketplace for things to do. On the other hand, we still will work on our modifier design, to have modifier nodes, for example, the node system to work with modifiers, or a node system for constraints and rigging, and at some moment we will have to look at a way to insert Python in this, even when you don't like it. No, I know the big studios have the exact same idea of forbidding Python from this, which I fully support. And at the moment, if it's an add-on, you can hand everything to someone else and do something more fun. But if you have a node API, you can find a way to have things plug in. Yes, but not with Python; something closer to OSL, which just gets just-in-time compiled to native code. That would be the proper approach; OSL, I think, is what you would look at for this. Kind of, maybe not one to one, but it has better potential. Did you hear this? It's on the list. So yes, we are making an open source project, so you can basically implement any modifier you want, but I think if you have modifier nodes you can evaluate how much need there is for custom things. The same goes for the compositor: you have a list of compositor nodes and you think, ah shit, I would like to have this one or that one, or that option, or tweak something. But that's always something we can look at.

In my add-on I use quite a lot of external libraries, let's say like OpenCV, and I want my users to basically not have to care whether they have to download this and that library and its dependencies. So I use the pip which comes with each Python version, but it's not included in the Blender Python. Would that be possible? Because right now I have to have, let's say, Python 3.5 installed to use that pip executable and download the library with all its dependencies and install it. I missed the beginning of the question. Something about the pip installer executable: it's not part of Blender, so I have to have Python 3.5 installed to use it to install the... Would it be possible to have it in Blender? In a limited way, because on Windows the default Python that you download from Python.org is a different build than our own built Python, so any binary extension won't work. As far as I know that changed, but I understand that the question is whether you could include pip, so that package-management kind of thing. And I remember there was some discussion on the mailing list about this recently, but I don't remember the conclusion. I don't think there's any specific reason why we couldn't include it, but bring it up on the mailing list, or look in the archives and reply. It's a pretty technical issue, but I think it's solvable.

So, aside from the new Outliner windows that we're going to have in 2.8, are we going to have something like a schematic view, another way to see all of our scene content, but on a macro level? I come from Softimage; in my past days I used that software, and I really loved the schematic view they had, because everything is so organized, especially for rigging,
you know. So, aside from the Outliner, are we going to have another editor to see our scene, every single thing we have in it, from a macro level? So it's like the OOPS schematic, which we had, like, back in the day. But I would love to have some editor to visualize, like a dependency graph, all the objects and everything, at whatever detail level you need, because that would help a lot, personally, for me, and I believe it's a good tool for riggers. And for riggers we can have even something more fancy, where we can visualize rigs, like properly aligned bones, and visualize all the possible dependencies, hooks and everything. That's just what I understood also about Presto and Premo, the tools they have in the industry: they visualize it in 3D, so you see the whole character, you see all the vertices and bones and everything, rigging, with highlighting to show you conflicts and things. Yes, that definitely sounds like it's really interesting for us, but it's somewhere on the list, and it's not at the very bottom of that list. But did you ever use, in old Blender, the OOPS, the thing that we had? An OOPS, we called that, but it was a way to show all the relationships between data blocks. And that's fun if you have five or six or seven, but suddenly you have like a thousand of them, and then it doesn't make sense, right, to see a diagram that doesn't give you any information or extra control. So that was not really the way to do it. But we can look at how you can solve this from the interface, and there are ways; we just need someone to do this, I guess.

So, I think at last year's conference somebody requested from Sergey, I think, motion paths for animation, like improved ghosting. Is that still on your list, or is it technically impossible to manage? Because for animation, especially character animation, it would be great to have. There's somebody in our studio who's down on his knees every morning, praying to the coders: please give me the motion paths. What do you mean, that's for character animation... I don't know why we don't do it. Why don't we do it? Currently, with the new dependency graph, there are lots of optimizations disabled just because of the new way we organize stuff. With copy-on-write implemented, that will come relatively for free, and we can multi-thread the calculation almost ideally across all the threads. And you can understand why: if you have a character, a little bit of a complicated character, it takes a little while to actually calculate a frame, and for the motion paths you actually want to have the complete animation calculated in real time, every frame, because especially if you edit things you want the motion paths to update immediately. But to update the motion paths it has to calculate the whole animation for the whole system, and that's a problem in the current code, and that's something that will be much more efficient in the future. It's a huge to-do in that area of code, how and what we need to do there. So how high is that on your list, the motion paths? It can be done quite simply after copy-on-write is fully enabled in the 2.8 branch, and then it will enable lots of features and improvements in quite a few areas, like motion paths, dynamic paint and a few others, which I saw around when I was reworking all the workarounds and everything. He talks to animators every day, so he will be reminded, and I will remind him too. You'll get it soon.

Are we going to have a sort of Freestyle real-time render, like OpenGL based?
No. 50% chance. I know Tamito Kajiyama has been contacted for this; he's looking into it, to find out how to re-code it in a way that we can fit it into the new architecture. I haven't read his mails for a little while, I don't know the conclusion, but I heard there was somebody else, from China, who I suddenly noticed put something online last week: fantastic rendering of lines. But so it's currently not on the... We know we want it, but there are so many things to do. There is Freestyle, there's a Grease Pencil project going on; the whole idea of 2D animation, of cartoon-rendered animation, is really hot. We know it's in focus, but I don't know the answers. I mean, I think it will at least keep working at the level it is now; I mean, you'll be able to do it on a Cycles render or on an Eevee render afterwards, and it won't be real time. Whether we will do real time or not is unsure, but I think it's pretty sure that current functionality will at least stay. But did you ever look at Freestyle, the code itself, how it works? Yes. I mean, wasn't it like a really weird port of Python stuff that became C code, and then it started reusing Blender Internal in a weird and convoluted way? It was developed outside Blender first, and so it uses a lot of non-Blender-specific stuff, but yeah. I mean, you use Freestyle for things, okay, some lines. But we definitely want to have a real-time Freestyle in the viewport, of course; we just have to look at how and when. The planning is unclear.

So, a question about the Python API: when we ask for users, data.users, we get a number; is it possible to get pointers also? You actually can, since one or two versions ago. You won't get it from .users, but you can get it from, I think it's bpy.data: you can request a mapping between an ID and the IDs that are using it, or whatever. I don't remember the details exactly, but you can check that in the API documentation. I think it's bpy.data.user_map. Yeah, something like that; I don't remember exactly the path, but something like that.

I don't know if it will be possible to use a panoramic camera in real time, like the Cycles one, but in real time in the viewport, like in Eevee, so you can render equirectangular frames in real time, kind of, or stream real-time frames to an Oculus or something. This is actually what the HMD branch is doing, as far as I know. Is Joey in the room? Yeah. So, actually, for the VR implementation we do it with OpenHMD. I actually thought about creating a method to render in Cycles, in real time, onto an equirectangular sphere and place the user in the middle of it, so you'd just be able to look around as the equirectangular image is wrapped around the sphere. So I guess this could be implemented in Eevee and stuff like that as well, for a real-time type of simulation, but it depends on what you want to do, because I guess if you're using Eevee and VR or something like that, you want to see the entire scene anyway. But ideally what you want is that the viewport is simply working in every normal way, but displaying things like another panorama, or for VR, or whatever, so you don't have to go to a special mode to be able to see VR. That's what you are targeting at, right? Yeah. So yeah, I guess this could be something that could be implemented quite easily, actually.

I was not here from the beginning, so I hope nobody asked this question already. I have a question about the denoiser. This is a very magical thing, you just have to click on this magical switch and everything is okay, but is it possible to imagine having something less magical? I mean,
something you can decouple from the rendering process? Because, in fact, you cannot easily blend things between the original rendering and the result of the denoising process. So can we have something more like a compositing effect? That is indeed possible. The main reason why it is not a compositor node right now is, on the one hand, when I do it in combination with Cycles it's easier to do GPU rendering, and the other part was that it uses a lot of temporary data, and if I do it right during the rendering I don't need to save it, which means that you can save a lot of RAM. But there already is a way, if you enable the experimental mode, to actually get the internal denoising passes in the compositor. So it would technically be possible to implement it as a compositor node; probably you wouldn't want the full algorithm that's used in Cycles, but something a bit less complex and a bit easier. It is certainly possible in the future; I would have to look into it.

I'm gonna ask you directly. I've noticed that when I'm emitting smoke from an object and I enable "emit from texture" and it puts a texture on, I can never see that texture on the object. Is that something that's possible? It seems like that would be simple. It should be possible, but make a bug report. That's not a bug, that's like a feature request or something, I don't know. But it sounds like a bug: if you use a texture, you want to see where it's going to emit from, so you can, like, dial it in. If you're going to emit from, like, a cloud texture, you can dial it in before you keep simming, and watching, simming, changing. Yeah, animated, right, moving and such. Yeah, see, Ton wants it too. I don't know why they didn't code it; it's really stupid, these coders. I would have done it.

Hi, just a silly question: if I want to create an uber-cool PBR shader, like I have noticed in Unreal 4, is that possible in Eevee? Is that on the list? In OpenGL? No, OpenGL doesn't matter; can I create an OpenGL shader in Eevee, yes or no? So you mean you write your own GLSL code, or just combine some existing nodes? Because you can of course create a node group, the way you can with Cycles or whatever. But if you want to write your own GLSL code, that's going to be more complicated, because the way, for example, Eevee's built-in PBR shader works, it relies on all kinds of other algorithms running: we have to generate probes, shadowing algorithms and all those things, and these are very much tied to all the other parts of the pipeline, so it's not as easy to plug in a different PBR shader. So there has been some work on an API to plug in custom GLSL shaders, but in general it's really difficult to make it configurable.

Hello, so I also walked in partway through, so this may have already been asked as well, or it might be too general, in which case Ton can stop me. But as Blender devs you have a lot of insight into how Blender functions; I assume some of you use Blender as well, no? Okay, what are the things that you really dislike about Blender right now? No, no, no. So what's your least favorite feature of Blender, the most hated thing? What do you dislike about whatever? So what do you really dislike?
I don't know... wouldn't even touch it with a stick. Oh, you mean code-wise? Code-wise, nobody touches the sequencer. I think it's a very difficult question. Maybe you first have to ask them something positive, like, what do you like best? It's not fun. Anyone have something you really, really hate? Wow, where do I even start. At least... maybe Cycles. Dependency graph, nodes, sequencer, curves, Outliner. There's one thing that's really, really good, of course: the motion tracker. Really, the only part that has been developed up to the standard of engineering you would like to see in Blender. So you're going to work on upgrading all the parts of Blender to that level? Excellent. Anyone else have a hated thing? Particles. Particles. Particles. Blender Internal, to be honest. Blender Internal, the rendering engine. Tomorrow, it will be nice. This is like saying that you hate your grandmother. What? Why? You don't do that, you say goodbye to her, that's it. Okay. The variable names in the code, yes, and also there's a little bit of formatting I think that could be improved in places. Say it again? Formatting, variable names. Google style, meaningful variable names, comments, links to papers. You know what developers really hate? Do they hate documentation? All of them, except him. Documentation is the most horrible thing. So if there's anything, everybody here, where you say, I would like to help out or do something: I mean, there are all kinds of nice guys and girls coding, working on Blender, and they really like it if they get help presenting their work, whether it's as documentation, as a video, as tutorials, as little demo files or blend files showing off features. It's a tremendously fun and useful thing to do, really. Never hesitate, just connect with a developer, look at the work he's doing, and make mockups or tutorials or whatever that is; it helps a lot. Questions, further? Not a question, a request. What?
I wanted to go back to the topic that we had directly before, which is the shader PyNodes and stuff. So I understood that there would be some GLSL code that you define in the PyNode and that you can simply plug everything in, and stuff. That was one of the original plans. It needs really careful design, because, as Brecht mentioned, there is only so much you can do flexibly in a shader, because the engine needs to know: okay, for this type of closure I need those passes for the OpenGL to work. What we can probably do is enable some procedural textures, similar to what OSL does, with some extra limitations; no GLSL-defined closures and stuff like that. And one of the other ideas was that the render engine itself should be able to provide code for its nodes, so we don't need to implement Cycles shaders in the generic place. But those designs are kind of on hold for now; they are on the list, somewhere on the mid-list.

Me again, yeah. This is my second item; it's from the guys from our studio, the animator, the furious animator. We have lots of camera bindings to the markers. I really looked it up on Google and everywhere, but didn't find a normal solution: we can't import bindings between files. I mean, we use the same terrain, lots of shots, lots of cameras, and we switch between them with markers, and we can't import them between files, and we can't grab markers with keyframes simultaneously, which is weird. And yeah, so this is my second item. In the code it's still called the Durian camera hack. Durian camera hack? Durian camera hack. They have "Durian camera hack" all over the code. Ton, get a microphone, nobody hears you. Get a microphone. But I already said it now anyway. So, we wanted the most easy, quick, dirty hack to switch cameras, and then we thought, the markers are available, and then we made the markers switch cameras, which is actually total abuse of a feature that was never meant to do that. But then you can add features to markers unlimited, right? That was not the purpose. But the feature of how you edit cameras and switch them in a scene is really important. I mean, now for Grease Pencil it's the same thing: you switch cameras all the time, motion graphics, for storyboarding things you want to switch cameras, but I don't think we found a better solution than the marker thing. Game logic? And the viewport? Hmm, a game logic editor for camera switching. The sequencer: that's a typical thing you do with the sequencer, just strips and wipes, and have the cameras... Directly keyframing the active camera property. Directly keyframing the active camera property of the scene? There is no active camera property, it's a pointer. How do you keyframe a pointer? You don't do that. No, no, you don't do that, you know that, you know that. Any more here? That one.

What performance improvements for selecting in the viewport can we have in 2.8? Because it's still not that quick in 2.8. Well, I have about a thousand objects and 20 million polygons; it takes about 5 to 10 seconds to select an object. In 2.8? It does in 2.8, the current version. What graphics card do you use? Sorry? What graphics card, what operating system, which driver do you use? Windows 7, NVIDIA Quadro. I don't know. Quadro? Yes, Quadro. What? I think it's the latest. I made fun of you, but this is of course a serious problem. 2.7 had issues with Intel or some other graphics cards' OpenGL selection. For 2.8 we are using so many new features that some selection might be slow, but it's not meant to be slow. It is not meant to be slow, but there could be buggy drivers and stuff like this. So if you have 2.8
and you find out a way to make selection very slow in an easy way, submit it, I'll talk to the guys. You have your laptop with you? But the builds say, do not report bugs about 2.8, we have enough on our desk; you would kick my ass. But can you try another computer and see if it's slow? If it's slow, then it means that we probably can reproduce it and look into that. But I guess... I didn't hear any developers complaining about this. Anyway, it is supposed to be super, super, super fast. If you have slow selection in Blender it's a bug, and then we have to find out how to recreate it, how to redo it, and we'll look at it.

I have to wrap it up. You still have your 20 questions? One very last one, okay. My last question is about Attract, I think. I tried to use it, but I need to use the sequencer to upload thumbnails and stuff and shots. Will there be an option to do this with After Effects or similar? Because we don't use the sequencer, I'm sorry. One of the things that we're thinking about is to create a new shot directly from the web interface as well. Actually, in the last few weeks I've been working on getting Subversion linked to the Cloud, so that you can click on "I want to have a Subversion repository for this project" and that gets linked to the Cloud. If you also use Attract, it can install a hook for that, so that those things are also linked, and we try to integrate these things more and more. So that, as Attract is familiar with the assets that you have, we might also want to link them to wherever they are stored in the repository, which means that from the web interface we have enough information to actually construct a whole scene for you, a blend file with those assets, that environment, and then maybe even create that file for you. So once that's done, there will probably be some API calls that you can also call from whatever other software that can do HTTP calls. Sybren, is it going to be like an Attract SDK thing, similar to Blender ID, so what applications can use that functionality? Are there plans to have an Attract SDK? It's pretty much using the same style of API as the rest of Pillar, so you should be able to use the Pillar API SDK. Okay, okay, we have to wrap it up. So thank you, Blender developers. You can still have them for an afternoon here: talk to them, try to get them to work and solve more problems. Next talk.