Okay everyone, welcome to the developer ask-me-anything. We have the Blender developers lined up here. We're going to start by asking every developer to briefly say their name and what they do, what we all did, for Blender, and then I, and to assist, Thomas, will walk along the sides, and we're going to bring people microphones. You can ask any question you want, but it has to be something related to software development, okay? Let's start.

Hi, I'm Sybren. I work for the Blender Institute, and I'm the coordinator for the animation module, working on Alembic and USD.

I'm Aaron, and I work mostly on documentation and fixing small annoyances inside Blender.

My name is William Reynish, and I work as a user interface designer.

Hello, I'm Julien Duroure. I'm currently the main developer of the glTF importer and exporter.

Hi, I'm Mike. I did a lot of the OpenGL upgrade for 2.8, and now I'm independently working on game development tools for Blender.

I'm Pablo. I'm a character artist, and I designed and coded the sculpt mode tools.

I'm Jeroen, and I'm the coordinator of sculpting, vertex painting and a lot of other stuff, and I do a lot of drawing code.

Hi, I'm Sebastian Parborg, and I'm more of a jack-of-all-trades guy. I do a bit of everything, I feel, but right now it's mostly the animation system and stuff like that.

I'm Jacques. I'm currently working on the new node-based particle system.

I'm Julian, also working at the Institute. I'm doing UI development, like workspaces and the file browser.

And you are? I'm Stefan, and I'm mostly working on Cycles.

I'm Bastien. I'm mostly working on data management, library overrides and asset management, all that kind of stuff.

I'm Inesh. This year I focused mostly on documentation on the development portal, and also a bit on the user manual.

Hi, I'm Lukas. I'm mostly working on Cycles, but currently also finishing UDIM.

Hi, I'm Howard. I work on modeling tools, mostly bevel, but also Boolean, which will be released very soon.

Hi, I'm Monique.
I've spent, I think, most of my time on bug fixing and patching, and I'm in Blender education.

Hi, my name is Sebastián, and I'm working on fluid simulations.

Hi, my name is Hans. I've been working on bevel recently.

Hi, I'm Dalai. I'm working as one of the core development coordinators, with Nathan Letwory. A big hand for the team here!

Okay, now a show of hands if you have questions. Who has a question? Over there.

Hi, David Wilson from the UK, and I think my question is probably for Jacques; I'm a big animation nodes fan. Any idea when modifiers will become nodes, instead of just things in a sidebar?

So, a couple of months ago we had to decide whether we wanted to do particles or procedural modeling first, and in the end we decided to do particles first, because we really want to get rid of the old particle system, get rid of the code, and have something better. The plan is to go into procedural modeling and modifier nodes after the particle system rewrite.

So I have a question about Alembic: what's kind of the timeline on, say, bringing in animated camera data, or, I guess, just generally non-mesh data?

I'm currently thinking about a new layer inside the animation system. Right now there is a patch available by Kevin Dietrich that loads some camera parameters, but it requires some nasty jumping through hoops that I'd rather avoid. So I can't really give you a timeline, because it's all very much in the design phase yet, but I have some ideas to put it in as an animation layer, in the same way as you have keyframes and drivers, and then you would also get loading of data from a cache there.

Thank you, Sybren. We have a question over there; the guy in the first row, you can go there.
Oh, right there, yeah. A user interface question: currently it's a little awkward to work with multiple windows in Blender on multiple monitors, because you have to click twice, once to give focus, and then to start working. I don't know if that's something that would be easy to modify, or who to ask that question to, but that's kind of my question.

All right, that question is, I guess, Julian's; there are so many Julians this year. Okay: if you go to Phabricator, developer.blender.org, right now there's a task for this in the user interface project, and it's marked high priority. So it's actually something that we really want to get fixed, finally. But it is a bit of a hassle. I had to do quite some work there for the file browser, and you really need to dig into the low-level, operating-system-dependent stuff. So yeah, it is a big hassle, and we probably need to do some trickery, and do things that the operating system APIs aren't written to do. But it's a high priority thing, and we want to get it done as soon as possible.

Hello, I have a question on the particle system. I wonder if you are planning to do a level-of-detail system for the particles? Because, for example, with vegetation, I would like whatever is far away to automatically get, say, a different mesh, or different levels of subdivision.

To me it seems like this is not particle-system specific, but more an instancing-specific question. So it should be part of some instancing system, and I don't see any real reason why we can't have it; it's just not my priority currently.

This is also something that the library override system can help with: you start having different instances and having overrides on them. What's the other one?
Yeah, you could have different revisions, for example, or variants of the same asset with different levels of detail, and you could just switch between them from the outliner, for example, and it would apply to everything automatically.

First of all, I would like to say a big thank you for all your hard work. I admire you; huge respect for that. And now the question: we have something like Cycles' camera visibility checkbox, and I would like to ask if we will have something like that in Eevee also.

You're the draw manager resident developer. Well, if it is possible, we will have it. We do have Eevee as its own standalone engine, but one of the goals is to be totally compatible with the Cycles workflow. Why don't we have it? I don't know; I think it's possible.

Clément is the lead developer on that, and he's not here, but Lukas is going to fill in for him. I have to say that I'm not an expert on this, so I might be wrong, but my assumption would be that, due to how Eevee works, it's hard to implement such overrides. For example, for reflections everything is done in screen space, so if you don't want to reflect an object that's on the screen, you can't reflect anything, because you don't know what's behind it in a rasterization pipeline. So I would assume that the only thing you could do is hide the object completely or not, and we already have viewport visibility for that. I see, thank you. Of course, with light probes you can get a lot of this done. We only have a collection affecting reflections, so you'd have to find your way around that.

A question about the interface: the 101 project. Is that something that we can expect in the future? Ton, do you want to answer this one? It's a secret project.
We don't talk about it. I mean, you know me, I like to dream up futures and plans. We have to make Blender so much more awesome, because it's still not good enough, and the 101 project is about having a 2.8-compatible startup file, a template, which would make it easier for teachers to bring Blender into high schools. It could be a UI with only some monkeys and some textures, and you drop them into the viewport. It's a proof of concept to show trainers and teachers that they can use Blender to configure environments in which they can teach people something. The code for it is almost finished, I think, but we still need people on board to set it up as a project, without adding more burden to them. So I've put this a little bit on the back burner; people told me it's nice if you do this, but not with them. So I'm waiting for the right moment to bring it back.

If I may answer as well: the other part of the answer is that the 101 project as a separate thing is a lot less necessary with a lot of the development that has happened and will happen. You have things like the toolbar and active tools, which make it easier to use Blender more visually; you don't need to know about all the hotkeys. Another thing that will enable a much more immediate and intuitive use of Blender is the asset manager: you could sort of drag things in, drag in materials, that kind of thing. So there are lots of developments. But the asset manager is planned, right? Yeah.
Yeah, exactly. So that should greatly help, just as a basic built-in feature, and now we have things like left-click select by default. So you see Blender moving in that direction, becoming more powerful and easier to use over time.

Okay, do you want to add something very brief about that? Really brief: one of the things I'm always being told when I talk to teachers, or to some teachers... The idea that kids always struggle to use Blender and want a really simple Blender is sometimes not really true. It seems like it's more the teachers who struggle with a complex Blender than the kids, actually. So I'd rather spend my time improving Blender as a whole than focusing on simplifying things for one specific case. Still, 101 is something we see as a project: for 3D printing, for example, you might want a subset of Blender, and you could get it.

Hi, I have a question on the new particle system. Right now we have a start and end for the emission process. Could that be turned into a rate-based process? Because that can be really practical for us.

So I think the question was whether we can have, instead of a start and end frame for the emission, just a rate that you can keyframe. Yes: if you go to the experimental builds on blender.org and test the functions branch, which is where the current state of the particle system lives, you will see that the mesh emitter in fact already has this functionality right now. Okay, great, thank you.

First a question to everybody: who has experienced the viewport in 2.8 being slower than in 2.79? Not many, but it's the same with me. So, is there a good reason for it? Is this something that's being worked on? Yeah, there's a reason for it. I hear it's my fault, so I take all the blame.
So I take all the blame but Well, yeah, we First went to a new OPGL level version what also add a lot of burden into how to Get the GPUs ready and get the GPUs drawing Correct we all know about it and the older developers are Working on it to improve the system and we have some ideas to improve it But it's it's a long run to to get it all in in there So we know about it Maybe Mike wants to add something about it as well Mike did one of early engineering engineering for the new drawing API Yeah, well if you're if you're blaming the new thing on the new open GL, then that is my fault. I'm sorry, but As far as why it's slow I mean one of the things was that I noticed was slow to resize windows and things like that and that's about reallocation of Buffers because we're drawing into buffers instead of drawing to the directly to the screen And so that's a thing but in normal usage You don't have to do that and what makes it slow is that I guess I mean I can think of one specific thing That could be improved and I might pitch in but it's like yeah Can I just ask a follow-up question because are you sure you don't you're not you don't mean that Transforming things in edit mode is slow. Is that that could be what? What you mean as well because that's a little bit not so much the viewport related but the 12 to 2.8 we try to The main goal was to have a playback animation as fast as we can at the even if had to sacrifice performance elsewhere So it really depends on where we're facing it. I'm not surprised. It might happen. It's more Way more workload to the shader to the graphic card with 2.80, but I expect in production computers to be Decent results people working in the studio are simply fine We're gonna get improved I hope Not asking for more Complexitations, but is there any perspective for programmable open GL or shaders? The program open gel the The one that's only I think it's not compatible with Mac, right? 
No, it's just that you can create your own shader code in Eevee. Oh, right, right. Not the first one to ask this at the conference: it's about how I can have my own material in Eevee, made by myself. Does anyone want to comment on this?

Maybe I can add a little bit. Throughout Eevee and 2.80 we changed the whole drawing API and draw code, and we did a new API to help anyone drawing from add-ons. We had a draft in the beginning for a whole engine that could be a hundred percent OpenGL, or to allow something like LuxRender or Arnold, whatever external engine you plug into Blender, to have its own custom GLSL drawing, but it never made it off paper. We still haven't seen the need from the other engines, which would help maintain it, so that's maybe one of the reasons we didn't pursue it further.

Hi, I had a question about Blender interactive mode: what the timeline is on when it will be released, and whether there would be future plans for supporting web deployment.

I'll answer the one about Blender interactive. Basically, we had a design for how to replace what once was the Blender game engine with full interaction in Blender itself. If you think about the tool project, where everything has a gizmo, and the VR project we're working on, where you are inside the viewport to work on things, it all moves in that direction. However, to really have an interactive mode, we didn't get a developer to do it. We had someone who maybe would have been able to do it, but he didn't have the time, and we kind of had to prioritize everything else, which is a lot. So sorry, but there's no timeline on that.

And there was someone here? I think it was here. I have a question about the new bevel modifier. I saw the presentation; by the way, great.
Thank you for your work. You were using the curve widget to create a new profile to use on a bevel. Have you considered using a Bézier curve instead, a 3D curve? Because that's easier to control, and it might be easier to save profiles to use later.

So first, as background: the question is about the custom bevel profiles, which should be coming in 2.82. I would actually like to add sharing the curve with a Bézier curve object in the viewport, so I'd like to work on that at some point, and it should be fairly doable. Thank you.

This is also a chance for people to ask about the code and the development process, how my patch gets into Blender, that kind of thing; so it's not only about the users. Somebody else had a question here, but I think you were first. Sorry, jumping the line.

First off, the Eevee viewport has been incredible and has revolutionized my workflow. But along those lines, I've noticed that when I try to render out from Eevee, it's significantly slower. Even if I'm just rendering out the equivalent of the OpenGL viewport render in 2.79, just the viewport settings, I get significantly slower render times. And if I'm trying to render out a significant frame count, I can preview it at 60 frames per second in the viewport, but when I render it out, it's something like five frames per second of render speed. Are there any plans to improve that, or is there something I'm doing wrong?

I will first answer the question of what happens, why it is slower. Basically, when you do it in the viewport, it's rendered by the GPU and displayed directly from the graphics card, so that's fast, and when you're doing playback in the viewport, it only calculates a single sample. When you render to an image, a lot of other stuff happens: it renders 64 samples. But that's not the main cost.
What happens next is that the GPU image is copied back to the CPU, and on a modern graphics card that's a really slow process; you can only do it about 20 times per second. That's the main burden of it. Why do we do it, why do we copy it back to the CPU? That's for the color management, that's for the sequencer, that's for the compositor. So basically that's the slowdown: the copying back from the GPU to the CPU.

Texture painting: what would be really cool is to be able to paint on textures that are not the active color map, and still see the rendered view. Say you've got a black-and-white map for blending between two textures: to paint on the black-and-white map, but see the mix between your texture materials. Look at that, our first feature request, sneaked in as a question.

Do you want to answer something about the plans for texture painting? Currently texture paint has an insane amount of issues, starting from how the brush engine is designed. So I want to think through and make a new design for how textures will work, and how we are going to handle textures with channels, to paint PBR materials and to have multi-layer painting. Obviously that is going to come with it, but we need to get the basic stuff right first, like the brush engine, like rotating the 2D view; so I think we should start by getting that right.

What you're asking about, seeing the final result while you paint, you can already do in the viewport. There is already a separation between the active painting channel and whatever you're seeing: you can just look at it in Eevee, where everything is composited, while you're painting on the blend channel. But as Pablo said, until we move to layered texture painting... There are so many things we want to add, but they have to be done properly, so we need to have the people here to do it.
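The render slowdown explained a moment ago, one sample drawn straight to the screen in the viewport versus a 64-sample render followed by a GPU-to-CPU copy, can be sketched as back-of-the-envelope arithmetic. All the timing constants below are invented for illustration; only the structure (samples times per-sample cost, plus readback) reflects what was said.

```python
# Invented numbers; only the cost model mirrors the explanation above.
SAMPLE_MS = 2.0        # assumed GPU cost of one Eevee sample
READBACK_MS = 50.0     # GPU->CPU copy at roughly 20 transfers per second

def frames_per_second(samples, readback_ms=0.0):
    # total frame time: render all samples, then (optionally) copy back
    frame_ms = samples * SAMPLE_MS + readback_ms
    return 1000.0 / frame_ms

viewport = frames_per_second(1)               # one sample, stays on the GPU
offline = frames_per_second(64, READBACK_MS)  # 64 samples, then copied back
print(round(viewport), round(offline, 1))
```

With these made-up numbers the viewport runs far above any display refresh rate, while the offline path lands in the single digits, matching the 60-versus-5 fps experience the questioner described.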
Hello, I would like to ask about the recent AMD sponsorship. There was a mention of Vulkan API integration into Blender. What does it mean, more generally, and what does it bring for GPU rendering in Cycles on Mac?

Okay, so what does it mean, now that we have more incentive to have Vulkan in Blender? OpenGL has not been developed further for a long time now, so we need to go to a newer version, a newer render API, and that's Vulkan. Basically, the step for Blender 2.8 was first to get up to OpenGL 3.3, so that it's easier to switch to another render API, and Vulkan is our choice. So we will go that way. But it's not an easy task: Vulkan is really optimized for gaming and for shipping a gaming application, and for a 3D software application it's quite hard to make it fit Vulkan. But I know that Clément has a plan for how to execute it; it's really about finding the time to do that, and afterwards a lot of stuff will become available to Eevee and everything else.

There's a chance that there are not going to be great performance improvements, though. Vulkan is praised, and it is fantastic because of its architecture, but the approach we're pursuing isn't really designed to work on tiles and that kind of optimization without a huge revamp, which we won't do.

Yeah, I would like to know about the undo system, because it seems really slow. Hello, Bastien! Well, exactly; it's not his fault, he's just helping here. There's a branch, right? There is a branch for the asset engine system, yes, and we have plans to get a first initial version in 2.82 or 2.83. Why is it slow? Because I've been working on a lot of other things. But wait, the real question:
Why is undo slow, in 2.8 and in 2.79? Part of this is that we now have copy-on-write on everything, and Bastien is going to explain. Yeah, the main issue is that after each undo step we have to rebuild the whole depsgraph, so you have to re-evaluate all your modifiers, a lot of things; that's what is making undo slow currently. Globally, the plan is to undo less, if I can say it that way: we will only undo things that changed, and for things that didn't change in an undo step we try to keep the old data, including the old evaluated data, so that you don't have to rebuild it. Ideally we should gain a lot of speed with that system, especially in complex scenes, and it is planned for 2.82. That should put 2.8 on par with 2.79, and from there we can maybe keep improving: for example, caching the modifier stack all the way to the last modifier for the active object while you're tweaking something.

Hello, is there a roadmap, or at least hope, to drop the old internal textures for modifiers, and be able to use texture nodes for the Displace modifier, for example? That one's for Jacques, maybe. We removed most of the Blender Internal textures and kept them only in a few places, and there isn't really a schedule for texture nodes currently. We definitely do plan to implement texture nodes, and once you have texture nodes it should be easy to use them in modifiers, if we even still have modifiers at that point. Right, because you have nodes. So yeah, once we have texture nodes, we can use them, for procedural modeling and everything; I don't see why not. It just takes time.

And for Cycles: why doesn't Cycles have its own texture nodes, let's say, Stefan? Texture nodes as in more procedural textures, more varied procedural textures? Well, to a certain degree, it's about how much code we add to Cycles: traditionally we always had a problem if we made Cycles more and more complex.
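The "undo less" plan described in the undo answer above, snapshotting only what is about to change so that everything else keeps its evaluated data across an undo step, can be sketched as a toy model. This is not Blender's actual undo code; the class and names are invented for illustration.

```python
# Toy model of partial undo: each step stores copies of only the
# datablocks that are about to change; undo restores those copies and
# reports which datablocks need re-evaluation. Everything else keeps
# its evaluated state, so no full depsgraph rebuild is needed.
class PartialUndoStack:
    def __init__(self):
        self._steps = []

    def push(self, scene, changed_names):
        # snapshot only the datablocks this edit will touch
        self._steps.append({name: scene[name] for name in changed_names})

    def undo(self, scene):
        step = self._steps.pop()
        scene.update(step)       # restore only the changed datablocks
        return set(step)         # only these must be re-evaluated

scene = {"Cube": "v1", "Lamp": "v1", "Camera": "v1"}
stack = PartialUndoStack()
stack.push(scene, {"Cube"})      # about to edit the cube
scene["Cube"] = "v2"
dirty = stack.undo(scene)
print(scene["Cube"], sorted(dirty))   # prints: v1 ['Cube']
```

The Lamp and Camera entries are never copied or restored, which is the point: in the planned system their evaluated results would survive the undo step untouched.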
It would get Harder to get it to run on cheap use sometimes it will actually fail to run it all on certain cheap use Or it just would get progressively slower to more code and more complex remade cycles And on if you're leaving cheap use aside you can go crazy with anything you want with open shading language and Maybe one of the long-term goals will actually be using open shading language on the on the GPU as well And then you can actually create your own shader nodes and Go crazy with any texture any way that you like That would in my opinion be an ideal future. There's parts of Open-shading language that are already working with CUDA, I am not aware of anyone that has public support for OpenCL We'll see I mean it would be really cool to have that. I don't think there's anyone actively working right now on on really really adding Like a very complex system and we've had we've had then you few new procedures being added recently It's just a judgment. You have to make like is it really is adding that extra code extra complexity extra maintenance to an already complex engine worth the effort Or are we like adding one feature and blocking all the users from getting more speed That's for my curiosity at the two seven nine and all the texture nodes after on two point eight We got read up, but only for brushes and things They're unusable says Basia It's there, but it's kind of an exactly dormant state Yeah, I was just wondering about motion blur in EV How's it going with that? There's an We do have the camera motion blur and we have a task that that we will get object motion blur It's just one of the many tasks that still needs to be done Hi, I'm a big fan of the adaptive subdivision and micro displacement. Is there any plans to? Develop that further and maybe have support for EV All three Opus update right well Microsoft Microsoft we said micro displacement micro surfacing, right? 
Does anyone know, for the Cycles part? Stefan or Lukas, give us your opinion on the matter. Well, I think there is still some work going on on micro displacement in Cycles, but I'm not really up to date on it right now. And as for Eevee, I don't see a technical reason why it would not work, but as I said before, I'm not an expert in Eevee at all, so there might be something that prevents it; I think it might work, though. Maybe those more advanced programmable shaders could even work in Eevee, but then whenever we switch to Vulkan, and to Metal if we want to support Apple, it might be an issue. That's what you were almost going to say, so there you go. Next.

I hear someone asking about UDIM. Lukas, when is your thing going to happen in Blender? That depends: do we want the full feature set right away? Because right now most things are working. The only thing still blocking it is the Workbench engine, which is kind of important for texture painting, so we would probably want support for that; I'll probably need help from Jeroen, or anyone who knows more about that. Apart from that, it's mostly a case of review and organizing things, and making sure there are no bugs left, because most of the code has been just sitting around for a year and things have changed. So yeah, just some polishing, and Workbench, and review.

Now for the real question: I was trying to modify a stroke in sculpt mode via the Python API, and I got an error saying "incorrect context", which I was told means the wrong mouse input. So what I'm wondering is: is it possible to program a stroke for sculpt mode through the API, just doing that programmatically, and possibly recording and playing back sculpt strokes?

There are several levels to that question. On the generic level of the API, the Python API: operators in Blender
are expected to run with a certain set of data, which is fetched from a context. When you are running an operator from the command line or from a script, you often don't have the same context as the one that is expected, so you have to fill it in: generate a context yourself with the data that is required. That's the technical part of it.

As for the sculpting part, I will let one of the sculpting guys answer, but in general: you can already define curves in the API for painting, and you should be able to do it for sculpting too. I have a prototype, more or less working, for storing the strokes and repeating them, which is what you're trying to do, but it only works in sculpt mode; I'd need to do something similar for texture paint if we want to keep compatibility. We also have something similar for the MPX project: via a Python add-on, a way to control the strokes for sculpting in Grease Pencil through Python. So I think you've got some answers there.

Yeah, hello, I'm fairly new to Blender; I used 3ds Max for 20 years and just made the switch. One of the first things I was looking for is a material library, a material library I could build. Is that something that's planned?
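The "incorrect context" part of the answer above can be made concrete with a minimal mock. This is deliberately not the real bpy API; the function and exception names are invented, and only the pattern is real: an operator refuses to run unless the caller supplies the data it expects, for example through an explicitly built context.

```python
# Invented names throughout; this only mirrors the pattern described above.
class IncorrectContext(Exception):
    pass

def sculpt_stroke(context, points):
    # a hypothetical operator's precondition ("poll"): it needs an
    # active object and Sculpt Mode, both supplied via the context
    obj = context.get("active_object")
    if obj is None or context.get("mode") != "SCULPT":
        raise IncorrectContext("sculpt_stroke needs an active object in Sculpt Mode")
    obj.setdefault("strokes", []).append(points)

# fill in the context by hand, as the answer suggests, and it works
ctx = {"mode": "SCULPT", "active_object": {}}
sculpt_stroke(ctx, [(0.0, 0.0), (1.0, 1.0)])

# call it without the expected data and you get the familiar error
try:
    sculpt_stroke({}, [(0.0, 0.0)])
except IncorrectContext as err:
    print("error:", err)

print(len(ctx["active_object"]["strokes"]))   # prints: 1
```

Recording and replaying strokes then amounts to storing the `points` lists and feeding them back through the same call with a valid context.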
What's the What's a timeline for asset manager, but yeah well asset management itself it's for 82 83 the first basic release of it and then we of course keep building on it and for the material library there's also the cloud on blender.org Which features a set of easy to use shaders already And of course we have the blender cloud add-on which allows you to use some of the assets That are in the blender cloud directly into blender and there's a few independent Projects we had the blender kids presenting something last night and all that so asset manager those are going to be the main Of the main project that's going to take care of that Hi, and I'm a hard surface modeler and I hear during the introductions that there's a new Boolean system in the works. I just could you tell us about that? What's going to come from it? And how will it improve modeling workflow a new Boolean system from scratch Yeah, so the a lot of people complained when cork was taken away It was taken away because you know there was nobody really wanted to maintain it It wasn't maintained at source. So I've been writing Boolean from scratch. I'm getting close The main things I'm addressing are You can have coplanar intersections It uses Doubles inside instead of floats of some of the precision errors will probably go away and And you probably won't have to have completely closed things to intersect It'll be allowed to have a little bit of leak there. I just speak closer. It's gonna be fantastic It's already fantastic. When do we have it in bladder in a month or two? I Have a question I For the audience who is still using two point seven nine Aha, wow Now I want to know what is it that we have to do to make you move to two point eight No for me. I just need my time, but I'm moving soon. Okay. Who else is that something we really have to do to get to point eight working for you I Mean I for rendering in EV. 
For rendering in Eevee, I have to use screen capture for rendering out a viewport render, where I could do the OpenGL render in the previous version. And if I'm sending to a render farm, I can't just open up a screen capture tool and capture the viewport. Well, that's for offline rendering; what do you mean by screen capture? My previous question was about how I can view the render at 60 frames per second, but when I'm actually rendering it out to files, it's about five frames per second. If I open up a screen capture, I can quote-unquote render at 60 frames per second; but I could have done that in 2.79 at 60 frames per second or faster, with the playback speed of the viewport. Okay, so we have to make Blender much faster. No, it's only for offline rendering. There are a few people with a similar problem: they could render very fast with Blender Internal, but for Eevee they don't know how to use a render farm with a GPU, and that's part of the problem. But some people are fixing that already.

One more round of 2.79 features, okay? Yeah, what? The compositor. The compositor? It works fine. Is the compositor not working as it did in 2.79? Really, it's not, right? I mean, the 2.79 and 2.80 compositors are basically the same. We'll come back to that at the end, okay? We'll come back to that.

Who else has 2.79 things that we have to fix in 2.80? Over there. You can just shout and we'll repeat it. Thank you. We moved a project from 2.79 to 2.8, and we had a big lack of performance when we had a lot of modifiers, like Array or Subdivision Surface. It was very complicated; we couldn't finish the project because of that. We dropped from 24 frames per second to 5 frames per second, and it was really, really hard. That was arrays and particles, is what was said. No, no: arrays and Subdivision Surface modifiers, subdivision on objects. Does anyone want to take that?
Yeah, okay. Well, for subdivision surfaces we have OpenSubdiv in 2.80, but only the CPU path. So that's one of the problems people are noticing: it's slower now than it was, because now it's on the CPU and it used to be on the GPU. We're going to work on that. Oh, hold on, hold on.

So, one of the reasons that OpenSubdiv is slower than the previous method is that the previous method we had in 2.79, before we deprecated and removed it, took a lot of shortcuts in the subdivision computation. For a lot of people that was a problem, because it produced low-quality results, and because we didn't want to maintain it ourselves we moved to OpenSubdiv, which computes very high quality subdivision surfaces. The issue with that is: because it's high quality, it takes a longer time to compute, especially if you have vertices that don't have four edges. If you have something with three edges, or five edges or more, it becomes a lot slower, because it has to do specific tricks to figure out how the limit surface of the subdivision surface will look. Sergey has looked into this quite a bit, and it's a hard nut to crack, because a lot of the optimizations we thought we could do turned out not to be possible. So now we're kind of stuck, not really knowing if we can speed up OpenSubdiv on the CPU any further. We're still looking into the GPU side, but we're not sure there either, because it seems to be quite a common problem with OpenSubdiv itself: at least in our case, it's slow when you edit the meshes. If you just have a static mesh and pull it through, it caches the result, and then it's fast, because it doesn't have to do these computations again and again and again. But in Blender, because it's a generative system, it's slow, because it has to recalculate all of that every time for the mesh. Am I allowed to add a quick question?
Okay, about OpenSubdiv and why it's not on the GPU: if the topology changes and things like that, then you have to recalculate all these things. But once you have things set up and you're just animating, I mean, that's exactly what it's designed to do. So what's the hold-up?

Well, take the Array modifier: it generates geometry. If the topology is the same and doesn't change per frame, then it's fast. But as soon as you have something generative, like a Boolean modifier or something else that changes the topology of the mesh... Yeah.

So in 2.80 we removed two big things, like Blender Internal. Who's missing Blender Internal? Still a couple of people. I heard from one person that they use an offline rendering service, and Blender Internal was fantastic for that. But we'll be working on it. The Game Engine? Come on, the Game Engine! Wow, how many? A little bit, a little bit. Well, that's not very enthusiastic, right? So this is not a super high priority. How is an interactive mode coming in general? Well, yeah, as was said, we're doing the bits that will allow us to design the whole interactive mode. We answered that question already. It's all good.

Now, 2.7 on the rise. We have a real question here. Ask me anything: more 2.7 stuff. Yep, yep.

I'm still using 2.79 for smoke simulations, because subframe interpolation in 2.8 is broken on smoke sims.

Smoke simulation, yes. I've seen that it's broken: it works in 2.79 and it doesn't in 2.8. But I didn't see a point in putting too much effort into fixing it, because the new system is coming in 2.82, so you just have to wait a bit for the next release.

So we are still not... Get up, get up, stand up.
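The caching behaviour described above (fast while topology is static, a full recompute whenever a generative modifier changes it) can be illustrated with a toy memoization in plain Python. All the names here are invented for the illustration; OpenSubdiv's real equivalent is reusing its topology refiner across frames rather than a dictionary cache:

```python
def make_cached_subdivider(refine_topology, apply_positions):
    """Memoize an expensive topology refinement step.

    The cache is keyed on topology only: moving vertices between
    frames reuses the cached refinement (cheap), while a topology
    change, e.g. from a Boolean modifier upstream, forces a full
    recompute -- the toy version of why generative modifier stacks
    defeat subdivision caching.
    """
    cache = {}
    stats = {"refines": 0}

    def subdivide(topology, positions):
        key = tuple(topology)
        if key not in cache:
            stats["refines"] += 1              # expensive path
            cache[key] = refine_topology(topology)
        return apply_positions(cache[key], positions)  # cheap path

    return subdivide, stats

subdivide, stats = make_cached_subdivider(
    refine_topology=lambda topo: ("refined", tuple(topo)),   # stand-in for the slow step
    apply_positions=lambda refined, pos: (refined, tuple(pos)),
)

quads = [(0, 1, 2, 3)]
subdivide(quads, [(0, 0), (1, 0), (1, 1), (0, 1)])  # frame 1: full refine
subdivide(quads, [(0, 0), (2, 0), (2, 2), (0, 2)])  # frame 2: positions only, cache hit
subdivide([(0, 1, 2), (0, 2, 3)], [(0, 0)] * 4)     # topology changed: refine again
print(stats["refines"])  # 2
```

The design point is the key choice: keying on topology alone is what makes animation cheap, and it is exactly the invariant a generative modifier breaks every frame.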
We are still not using 2.8, because in our company the whole pipeline relied on the GLSL shaders of 2.7. Yeah, Dalai, you know the problem; I talked with you about it. I just wanted to mention it, and I'm happy that you're already on it.

Kind of, yes. They were working with node materials, and they would export the node materials and use them in their engine. But with Eevee there's only a hacky way of doing that. So that's in the works. I think there's a question here. You're next, after we're done with 2.7... 2.80. Yeah. Any more 2.7 people there?

No, I don't use it. And I'm not completely sure, sorry, but I think that you cannot do an OpenGL render in the sequencer. Is that right?

You can, you can. You're wrong. Maybe, maybe, yeah.

Yeah, I can... So I can render Grease Pencil? Grease Pencil specifically, maybe annotations, because it's now annotations in the sequencer? Because I did some tutorials and things, and I used Grease Pencil to mark things.

By "now" you mean 2.80?

I don't know. It is the future for me. So I did one, and it's like, wow.

We are working on the sequencer; we saw a few talks about that. The Grease Pencil team is particularly keen (Daniel Lara should be here, by the way) on having an editor-centric pipeline. We saw that from Tangent as well, doing Grease Pencil. So even if it's not working yet, we're going to get there. I can't see... Okay.

Back to the asset manager. How would the asset manager... If it's not 2.7, we have a line here. Anything else, more 2.7? Anyone else still hanging on to it? We had a whole presentation about the blending image; it was fantastic, Pharaoh 3D.
So good luck with that, buddy. Honestly...

I have a question. I think this was in 2.7 already and it's still in 2.8, and it's a little bit about Python development, especially the console; I think it has been going on for quite a while. You have to toggle the console window to get the actual output from print commands, for example. On Windows that works quite fine, because you have that Toggle System Console command, but on Linux, for example, you actually have to make a shortcut to start Blender via the command line so you can actually see the output. Is there maybe any way whatsoever to get the console output visible inside the Blender window, without having to toggle all of that?

And that's a feature request! Look at them, feature requests coming in... Does anyone want to comment on that? But that's not preventing you from using 2.80, you sneaky bastard. I mean...

We don't have any concrete planning for that, but what we plan is having the Info editor revamped, so we can put all kinds of information in there: detailed statistics, error logs, notification logs, everything. And we could sort of redirect standard output to that and have the prints in there. I guess that could sort of work. Yeah, but I don't think we have any concrete planning for that.

It's, I think, what William said before: edit mode is really slow if you have a lot of modifiers on your object. Which makes me go back to 2.79 if I have a lot of subd modeling to do, because I need to see what my subdivision surface is going to be. But it's really slow: I drag a vertex and I wait half a second to see where it's going, and it ends up somewhere way beyond where it's supposed to be.

What's the scale of the project on your workstation? Sorry, say again: how big is the project you work on?

No, not big. Even just one file, one subd modeling object. Sometimes I just move a vertex.
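The "redirect standard output" idea mentioned above is easy to sketch with plain Python. This is only the generic mechanism, not Blender's actual implementation (which would also have to deal with output from the C side and with threading); the function name is invented for the example:

```python
import io
from contextlib import redirect_stdout

def run_and_capture(fn):
    """Run fn() while capturing everything it prints to stdout.

    This is the stock Python mechanism an application could use to
    collect print() output and show it in a log panel (such as a
    revamped Info editor) instead of only in the terminal.
    """
    buffer = io.StringIO()
    with redirect_stdout(buffer):
        fn()
    return buffer.getvalue()

log = run_and_capture(lambda: print("hello from a script"))
print(repr(log))  # "'hello from a script\n'"
```

Note that `redirect_stdout` only reroutes Python-level writes to `sys.stdout`; output written directly to the process's file descriptor 1 by native code would not be caught, which is part of why this is a sketch rather than the plan.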
It's really slow, and then it draws, and it's just all over the place; it's somewhere... I think William mentioned it before.

We're going to put that on you. Well, I just... Yeah, no, but you're right: that's probably what more people are experiencing as being slow, rather than the viewport drawing. I mean, opening the viewport is quite fast, but manipulating vertices is the slow part, which has to do with the dependency graph and the new OpenSubdiv issue that we already discussed.

Is there still a 2.7 topic? Yeah, a 2.7 topic. I get this question every week, so I want to point people to an official answer regarding baking, something that people are missing: baking displacement. It was possible in Blender Internal, and it's not in 2.8. So what's the official answer?

All right, who likes to talk about baking? We can talk about the other part of baking too, but I think the official answer right now is: there are two tasks on developer.blender.org for displacement and some other types of baking that used to be possible. Both of them were created but have zero comments, and that generally means it is planned and will be coming back. It will be added to Cycles in the future.

Now we're going for the final two questions. Very quick.

So, in the work that I do, we use custom properties on individual objects, and we want to use the same shaders between those. I'm just wondering if there is any technical limitation to being able to access custom properties in Cycles shaders.

There is no technical limitation.
It would just need to be implemented. The main concern, as far as I know, is the user interface, because you would just type the name into the Attribute node, but there should be some kind of visual feedback on whether what you typed is actually going to work in the render. So we will probably need to work with some UI experts, like the UI experts sitting over there, maybe. Okay. So yeah, that's definitely possible, and it is planned.

So, are all the 2.7 topics answered? Yeah, it's all good. Okay, we're happy. We're done with questions... no, we're not done, because we're going to end with a new tradition, which I tried as an experiment last year: the audience will vote for a feature. I know there's manipulation going on here, but we're going to make it a fair vote. So I want to have two, three, four topics where you feel the Blender Foundation should spend more attention, for example by giving a developer a grant to work on that specific area of Blender. So can I hear a suggestion from the audience?

The compositor has been noted; it's there. What else? Just scream, just scream.

Deep compositing! Deep compositing: that's compositing, but deep. But do the compositing people also want deep compositing? Of course.

Okay, other features? Everyone knows you're working on that...

A native voxel object that's addressable; essentially a wrapper for OpenVDB.

Yeah, having access to OpenVDB... That's like an add-on; you can call that in. No, no, no. We even have a task for this: a native volume primitive in Blender. Yes, a volume primitive. Brecht had ideas on a way to do it for Cycles. Okay, so this is a good feature; it must be fantastic. So how many people support this?
Oh, you're voting like this? You're going to have a hard time beating the compositing mafia here. Is there something more popular than compositing? Come on!

NURBS! NURBS, NURBS... I thought we wanted to remove NURBS. Okay, that's the total NURBS fan club shouting.

A separate workspace for lighting. That's also easy; that's a weekend project, very easy. You can do that... someone works a weekend, one of those guys, and they will do it; a few hours of work for you people. Just scream: what, what?

Multi-layer texturing... apparently we don't have that. No? What about color management? I see someone there asking for snapping now. Simulation? Grooming and hair! Yeah, we have grooming... no, no, that has to be... Okay, what? UV editing of what? We have UV editing... Say something, otherwise we forget.

Parametric objects! Wow. But can you summarize in one line what it exactly means for you? So you still have your editing options after you start to modify, say, a cube? More like a real modifier stack? No, it's... You know, it's like you have a cube with six sides, and then you do everything, and then you change the four edges, but it's not a cube any more. A truly procedural object that knows its geometry; procedural only. Okay, but with Booleans and everything. Okay, I think we have enough features now. What? Unless someone's really desperate... something really desperate?

Camera tracking!

And yeah, whenever you try to make a Python macro sequence, it doesn't register which object you click on. It would be very cool if that could be added. No one's going to vote for that.
They want cool features, they want cool features. So, um... let me see if I can remember five of them. Okay, we're first going to do a poll on all five; I want to see hands and see how the division is.

Parametric objects in Blender. Now I see arms... Okay.

Multi-layer texture editing. Multi-layer texturing... that's getting popular.

Compositing is getting there. But there were two more, um...

Volumetrics! And hair editing and grooming. Yeah, grooming and volumetrics.

And color management. Oh, we have that already... attention to color management, okay.

And the Oscar goes to... I think the compositor has lost a bit. What are the top two now? I think it's the compositor and multi-layer. Yeah, multi-layer. Okay, now we're going to vote for only those two. Can I get arms for the multi-layer texture editor? And now the ones for the compositor. It's like the same! It's like the same. We have to do both? You have to do both! Okay, we do both. Thank you, audience.

Okay, we have to wrap it up. A big applause for the developers!