I'm going to show you a little bit of what I've been working on during the last year while doing development for Blender, so let's go. There are a few categories I've mostly worked on: the paint systems; the viewport project, which is a little bit recent, we undertook it pretty recently, but it's been brewing for some years actually; some things on the user interface that have been included in past releases, the pie menus and the widget project, which I'm going to talk about as well; and a little bit on video sequence editing, which was mostly for the needs of the Gooseberry project. We're also going to talk a little about the plans for the next years.

The tools made this year for the paint systems include better masking, with lasso and box masking. We have full-speed matcaps, as opposed to earlier versions of Blender. We have done a number of optimizations on the speed of sculpting with Campbell Barton, who must be somewhere around here, and we've improved the twist tool. We've also added gravity for sculpting, which is a development by Jason Wilkins, and we've added vertex color baking to textures. For the 2.72 release we also have the Google Summer of Code 2013 branch, which is mostly texture-painting related, but there are also features relevant to sculpting, vertex painting, and everything really. So we have curves, palettes, gradients, masks, and, most importantly I think, the slot functionality, which is similar to layers but not exactly like layers. We have blend mode support for the brush system, which makes it more like GIMP, anchored and drag-dot strokes, a unified color system, and a hue/saturation/lightness color wheel. All of these have been merged into master. Perhaps I should do a demonstration of how it used to be and what we managed to do for the latest release, how it works and how much better I think it is. So give me a minute.
So I'm going to open up a version of Blender. This is Blender 2.69. The workflow in Blender 2.69 is that you have to go to edit mode and unwrap your mesh. So we have to unwrap it, go to the image editor here, add a new image and assign it to the UVs, and then go to texture paint mode, and now we can paint. I guess you can imagine that this could be a little bit troublesome, and new users confronted with this many steps usually did not know what to do. Some people skipped steps. If you wanted to change the image, it was pretty hard.

The way it works in recent master: this is Blender 2.72, and the steps are: you open a pie menu, you select texture painting, and here you get a warning which says you have to add the UVs and the texture paint slots. So you basically just add the UV layer and the diffuse color slot, which is assigned to the material itself. If we go up here and select the material and the texture slots, you can see that you have a diffuse color added, and you can go into texture painting and paint immediately. This is not the best example, of course, it's very simplified. But if you have a quite complex model with textures which use UVs, you can immediately get a list of them by going to the slots tab and seeing the list of your images here. What I personally like best about the system is that you can quite easily just add a normal slot like that, go to material shading mode, and paint bump maps, which was pretty hard to set up in the old days.

Anyway, pretty quickly, those are the highlights for texture painting. Some people liked it, some people liked the old workflow. I'd be interested to know if people have used it, what they think of it, and whether you think it could be improved somewhat. We had a few issues right before release where some people complained that they didn't really like the system. So I'm always open to feedback, just hit me up.
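As a rough mental model of the slot system: each enabled image texture on the material doubles as one paint slot, so switching slots just switches which image receives your strokes. This little sketch uses plain dicts and is purely illustrative, not the actual bpy data model:

```python
# Toy model of texture paint slots: every enabled texture slot that
# carries an image becomes a paintable slot. The dict layout below is
# made up for illustration; it is not Blender's real data structures.

def paint_slots(material):
    """Collect the images the user can paint on for this material."""
    return [ts["image"] for ts in material["texture_slots"]
            if ts.get("use") and ts.get("image")]

material = {
    "texture_slots": [
        {"image": "color.png",  "use": True,  "map": "diffuse_color"},
        {"image": "bump.png",   "use": True,  "map": "normal"},
        {"image": "unused.png", "use": False, "map": "specular"},
    ],
}
```

Adding a "normal" slot, as in the bump-map example above, is then just appending another enabled texture slot to the same stack.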
And back to the presentation. For the future, I'd personally like to use the PBVH for vertex painting, which would make it fast and actually work correctly. I don't know how many of you have tried using vertex painting, but if you really zoom out, paint, and then zoom back in, you get some blanks, because it basically depends on the viewport itself to detect which faces to paint on. The correct thing to do is of course to do it computationally and reuse the same acceleration structure that sculpting uses, which will allow us to paint on the mesh directly and correctly.

We have an old Ptex branch by Nicholas Bishop which is somewhere in cyberspace; we'd probably have to look at it. We have a great patch by Fabio Russo, who has coded layer support, like Photoshop's, in the image editor. It's a huge patch, and I'm not sure we can properly review and integrate it soon enough because of its size, but it would be really nice to have. We have to fix the multires modifier; any people who are sculpting here I'm sure will have experienced the "awesomeness", quoted, of the multires modifier. And maybe I'd like to look into GPU texture painting in the future.

All right, let's talk a little bit about the viewport project. We've written a blog post about the viewport project in Blender. Basically we want to improve the performance of the viewport and support full-screen effects, compositing, and of course materials: node-based materials, PBR and things like that. We have two main branches, one by Jason Wilkins, and one where I've started a few experiments myself, still early in development. The next targets are going to be compositing, mainly, and performance. We also wanted to take this opportunity to explore support for embedded devices like cell phones and tablets, which rely on OpenGL ES, a different version from what we use on the desktop.
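The acceleration-structure idea is the key part: instead of asking the viewport which pixels were hit, you query the mesh spatially for everything inside the brush radius. A toy uniform grid, standing in for Blender's PBVH (a bounding-volume hierarchy over the mesh), shows the principle:

```python
from collections import defaultdict

class SpatialGrid:
    """Toy stand-in for the PBVH: buckets vertices into grid cells so a
    brush query only visits nearby vertices instead of the whole mesh."""

    def __init__(self, cell=1.0):
        self.cell = cell
        self.cells = defaultdict(list)

    def _key(self, p):
        return tuple(int(c // self.cell) for c in p)

    def insert(self, index, p):
        self.cells[self._key(p)].append((index, p))

    def query_sphere(self, center, radius):
        """Return indices of all vertices inside the brush sphere."""
        r = int(radius // self.cell) + 1
        cx, cy, cz = self._key(center)
        hits = []
        for x in range(cx - r, cx + r + 1):
            for y in range(cy - r, cy + r + 1):
                for z in range(cz - r, cz + r + 1):
                    for index, p in self.cells.get((x, y, z), ()):
                        d2 = sum((a - b) ** 2 for a, b in zip(p, center))
                        if d2 <= radius ** 2:
                            hits.append(index)
        return hits
```

A real implementation uses a hierarchy rather than a flat grid, but the point is the same: the query is independent of the viewport, so zoom level can no longer cause missed faces.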
This is going to take a while, because it's a really complex technical target and we will have to change our code base quite a lot. We're going to have to raise our OpenGL requirement, probably to 2.1, if we want to support this, because we need shaders for everything. We're also looking into deferred rendering, perhaps, and there is a blog post where you can check out the targets of the project. But I'd like to show you a little bit of what we have already.

So I'm going to open up my viewport branch. This is a build of the work I've been doing on my personal branch; it's not the one by Jason Wilkins. One thing I wanted to check out first is a screen space ambient occlusion technique, which would help sculptors see better what they're working on. Here I'm going to use a project file of my own, which I'm going to mutilate. This is what the default looks like, and we have some settings here where you can enable some effects. I'm working on depth of field these days, but it's still not working as it should, so it's disabled. So basically you can define the radius in world space and you get a darkening effect which you can control with this slider. It's basically a ray tracing effect. It's not very apparent on this projector, but there is some noise, which we can control by adjusting the quality settings here. And of course it's real time, so if I sculpt on this, I get the effect I want. You can also control the color of the effect, so you can get something like, I don't know, a metallic neon light thing if you want to. What else? I've got another scene to show with the occlusion. Sorry, something is wrong... right. So this is a pretty simple scene with screen space ambient occlusion.
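The core of a screen-space AO pass is just comparing each pixel's depth against nearby samples and darkening where many neighbors sit closer to the camera. A deliberately tiny CPU sketch of that idea follows; the real effect runs in a GLSL shader with hemisphere sampling and noise, so this is only the principle:

```python
def ssao_factor(depth, x, y, radius=2, bias=0.02):
    """Toy screen-space ambient occlusion for one pixel.

    `depth` is a 2D buffer where smaller values are closer to the
    camera. Returns the fraction of nearby samples that do NOT
    occlude this pixel: 1.0 means fully lit, lower values darken it.
    The bias avoids self-occlusion from depth precision.
    """
    h, w = len(depth), len(depth[0])
    center = depth[y][x]
    occluded = total = 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dx == 0 and dy == 0:
                continue
            sx, sy = x + dx, y + dy
            if 0 <= sx < w and 0 <= sy < h:
                total += 1
                if depth[sy][sx] < center - bias:
                    occluded += 1
    return 1.0 - occluded / total if total else 1.0
```

Because the pass only reads the depth buffer, its cost scales with screen resolution and the sample radius, never with the vertex count of the sculpt, which is why it stays interactive on heavy meshes.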
This is without screen space ambient occlusion, where you cannot really see what's going on, and this is with screen space ambient occlusion enabled, with an orthographic camera, where you can see what's in the scene. It's not a real ambient occlusion effect, but it helps, I guess, if you want to show something in your viewport. It works in edit mode, it works for matcaps. These are not really important effects in the sense that they don't enable artists to do better work, so you might argue they're not really important, but they make the experience somewhat more pleasant, I guess.

So let's go on to my favorite subject: the user interface. There are two things I've worked on this year. The first is the pie menus, and the second is the widget project. For the pie menus I'd like to thank LiquidApe; Olson, I forget his name right now... Sean Olson, yes. He was very helpful during the development. Pie menus are already committed to master. The initial plan was to support one level of pie menus, and we're slowly working on multiple levels of pie menus, which I'm going to demo in a while. We're still missing support for custom positions: you still have to script your pie menus, and I'm going to show you in a while a little bit of how this is done, but it would be nice to be able to script something like "I want my main item to be at the northeast", et cetera. That's not yet supported. And we still do not have sticky keys, which is something people requested. Initially we had a semi-working solution for that, but it wasn't really optimal. But we have a developer here, Julian Eisel, who has provided a patch, and we're probably going to review it and include it for 2.73. Basically it allows you to tap a key and get the old behavior.
For example, if you wanted to toggle between edit mode and object mode, you can tap the Tab key and get in and out; but if you press the key and hold it, you get a pie menu. So people who want to use the old keymaps can keep them, and they can also use pie menus as they want. Pie authoring is mostly left to add-on authors now. That was a little bit of a difficult decision, and there was actually a bit of an uproar because of it, but I think it's the correct decision, because we had a discussion about what to include in Blender and there were so many different opinions that it basically didn't make any sense.

I'm not sure how many of you know what pie menus are, so... okay. This is a build of Blender 2.72 master again. This is a pie menu; it allows you to quickly select a mode and get into it. And you can select a shading mode for the viewport really quickly. The point here is always to be quick about what you want to do. You can even select the view: left, right, bottom, et cetera. So for people with laptops who miss the numpad keys, this is pretty useful, I guess. For the 2.73 release we're going to include the sticky keys that we talked about, which I cannot show right now. We also support nested menus, so I can just click here and get a second level of pie menu with more options. And we have an interesting new option here which allows you to very quickly swap between menus: once you pass the threshold you can zigzag, do something like that. So if you know the layout of the pie menus, you can pretty much navigate through them and select an option really quickly.

Let me resume my presentation. Right, so we have the widget project, which is a big project. I initially wanted to do a quick demo, but maybe it's not the time yet.
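The "navigate by direction" behavior boils down to mapping the mouse-drag vector to one of eight slices once it leaves a dead zone around the pie's center. One plausible way to do it; the direction order and threshold here are illustrative, not Blender's internals:

```python
import math

# Eight compass directions, one per pie-menu item, counter-clockwise
# from east. The ordering is illustrative only.
DIRECTIONS = ["E", "NE", "N", "NW", "W", "SW", "S", "SE"]

def pick_pie_item(dx, dy, threshold=20.0):
    """Map a mouse drag vector (pixels) to a pie slice.

    Returns None while the drag is still inside the dead-zone
    threshold, so small accidental movements select nothing.
    """
    if math.hypot(dx, dy) < threshold:
        return None
    angle = math.degrees(math.atan2(dy, dx)) % 360.0
    # Each slice spans 45 degrees, centered on its direction.
    index = int((angle + 22.5) // 45) % 8
    return DIRECTIONS[index]
```

Swapping between nested pies is then just re-running this pick against the new pie's center each time the cursor crosses the threshold, which is what makes the zigzag navigation fast.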
The widget project is actually an idea by Ton. We want to allow plug-in authors and Blender developers to easily use widgets in the 3D viewport to tweak properties of their operators, and of their scene. So, say you might want to tweak, not the color maybe, but the distance of a lamp: you could just drag a little widget in the 3D viewport and it would get adjusted. As opposed to looking a property up in some menu and tweaking it there, you would get the interaction in the 3D viewport immediately. The design is to be able to easily add widgets through C and Python APIs, so people who code add-ons should be able to use them too. You could use them to control operators, or even control properties that are in Blender, like a lamp's spotlight settings or whatever. We also plan to make them work with animation: you could basically define a part of the mesh to act as a widget, and when you tweak it, an operator gets fired or an animation keyframe gets set. Unfortunately I can't really show you something right now, which would have been great, but it's not working as we would like yet. It communicates with the viewport through the event system; it's pretty technical, I don't think it's that interesting.

Okay, so we also have the sequencer. We tried to improve the sequencer for the Gooseberry project, because there are some issues which mostly have to do with synchronization, undo, and audio. I've done a little bit of work on this problem; there are still lots of things to do. For 2.73 we're going to include a trim tool; actually, yesterday it was renamed to the slide tool. I'm going to show you in a while what it does. And we still need to make the system much more stable, I guess. But I'm going to show you a little bit of the sequencer branch. This is the end of the presentation actually; after this demo we're through.
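The interaction model described above, where dragging in the viewport updates a property live and releasing commits a keyframe or fires an operator, can be sketched roughly like this. The class and method names are invented for illustration; they are not the planned C/Python widget API:

```python
class DistanceWidget:
    """Toy viewport widget: dragging adjusts a target property,
    releasing records a keyframe for it. All names here are
    illustrative, not Blender's actual widget API."""

    def __init__(self, target, prop, scale=0.01):
        self.target = target   # object whose property we tweak
        self.prop = prop       # property name, e.g. "distance"
        self.scale = scale     # pixels of drag -> property units
        self.keyframes = []    # (frame, value) records

    def on_drag(self, delta_px):
        # Live update while the user drags the widget handle.
        value = getattr(self.target, self.prop) + delta_px * self.scale
        setattr(self.target, self.prop, max(0.0, value))

    def on_release(self, frame):
        # Commit the tweak, e.g. by inserting an animation keyframe.
        self.keyframes.append((frame, getattr(self.target, self.prop)))

class Lamp:
    """Minimal stand-in for a scene object with a tweakable property."""
    def __init__(self, distance=1.0):
        self.distance = distance
```

The interesting design point is that the widget never talks to the UI directly; as mentioned, in Blender it would receive drag and release events through the event system and only write to the property or fire an operator.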
I'm open to questions afterwards, but first I'm going to show you a little bit of what the improvements are. So I'm going to open up a movie file here... ah, I have to mount my disk first. All right, let me refresh that. So if you want to see the undo issues I've been talking about, you really have to enable... sorry, yep, okay. So we have a preview here, we have the movie, and we're starting to look at it. Sorry, it's a bit awkward to work like this. Okay. Say we want to draw the waveform of this strip, which is the audio of the strip. So I'm waiting. I'm going to get a coffee. I'm waiting some more. More, more... and finally, I hope, we're going to get our waveform. I think I didn't? No, I did. Yeah, it took a second click. As you can see, this is pretty much unworkable. Now the interesting thing is that if I try to move this and undo... maybe I shouldn't have done that. But the point is, if you have draw-waveforms on and you try to undo after an operation, it still takes the same time to basically do this whole thing again. I think I will have to kill Blender so that you won't wait forever. Oh, it recovered? Thank you, Blender. Okay, maybe I will not undo; I think you got the message.

Now I'm going to open the new branch so you can see how it works now; that was master. It turns out that, since people are working on video editing a lot, for the Gooseberry project we were also interested in an option to turn all waveforms on and off, which you do through here. And Blender now uses a job, basically, which is a threaded thing, so the interface stays responsive; you can do stuff like that while it computes. Okay, master disappeared, I don't know. But you can undo, and you still get fast undo.
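The job-based waveform drawing boils down to moving the heavy peak computation off the interface thread and letting the UI poll for the result. A minimal sketch of that pattern, not Blender's actual jobs API:

```python
import threading

def compute_peaks(samples, block=256):
    """Reduce raw audio samples to (min, max) pairs per block,
    which is the data a waveform overlay actually draws."""
    return [(min(samples[i:i + block]), max(samples[i:i + block]))
            for i in range(0, len(samples), block)]

class WaveformJob:
    """Toy non-blocking waveform job: the peak computation runs on a
    worker thread while the caller (the UI) stays responsive and
    polls done() each redraw until the peaks are available."""

    def __init__(self, samples):
        self.peaks = None
        self._thread = threading.Thread(
            target=self._run, args=(samples,), daemon=True)
        self._thread.start()

    def _run(self, samples):
        self.peaks = compute_peaks(samples)

    def done(self):
        return not self._thread.is_alive()

    def wait(self):
        self._thread.join()
```

Undo stays fast for the same reason: the expensive waveform rebuild is no longer done synchronously inside the operation, so undoing only restores the edit state and re-queues the job.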
I didn't try this on the old branch, but I advise you not to. And we have the trim tool, which is actually the slip tool. Basically, it allows you to take the contents of the movie or sound inside the strip and slide them around without changing the in and out points of the strip. Also, what the director of Gooseberry requested was the ability to have a backdrop right in the sequencer, so I can click that and have the movie right in the back. And now if I trim, sliding the contents, sorry, you can see the result in the background. So that's all; those are the important things about the sequencer. We're still working on it, because there are some issues with sound that we're still trying to resolve, but basically we're trying to make it more stable and usable, and I hope you get to see the results soon.

So I think we have five minutes for questions.

What kind of impact does the AO have on the viewport performance when sculpting?

Excuse me? The ambient occlusion thing? Yeah. Well, it depends on the quality settings. I was trying to make it as optimized as possible; it's not that much. The good thing is that it's bound by your resolution, so it doesn't really depend on your vertex count; it's always bound by your screen resolution. I still want to improve that, but it's not much. It's interactive, you can sculpt. I think you've tried it? No? Okay. It looks great. Yeah, it works.

I really want to appreciate your work on the user interface, and I want to ask about the pie menus, because I started using them immediately and really loved the work you did. Is there some intent, in the whole Blender Institute, to switch to this as the default system, or at least to make all the menus like this? Because it's an improvement, a huge improvement, but at the same time there are now two systems of menus.
So is there an intent to integrate this, to make it consistent?

Well, the thing, as Ton said from the beginning, is to allow users to switch their defaults easily. And I think having these sorts of discussions about the defaults is a little bit pointless, because each person has his or her own workflow. So what we really need to work on is how to make it easier for people to do this sort of customization. Currently it's not so easy, I must admit, because you have to script the pie menus or rely on the official pie add-on that we shipped with 2.72. But for a first version I guess it's not that bad. We also wanted to see how people would use the pies themselves, you know, which workflows would be most common, and I guess we still have to explore that. As for having something as a default: if we accept the sticky key functionality, we could have some pies by default, because there would be no conflict with the current keymap, so people who are used to it can keep using it. But about what the default pies would be and what they would do, you can't please everyone. When I was first developing the pies there were so many different opinions that it was just impossible to please everyone. So I hope that answers it; we will see. There's also a keymap project where we want to re-explore how our keymap works, and maybe this is a chance to see what functionality we have and how to integrate it better.

Okay, first, congratulations, it looks excellent. Thank you. I think these tweaks to the workflow make a big difference in how you work with Blender, so thank you for that. Thank you. Since you mentioned work on the waveform thing: waveforms are one of the most important things you need when you do lip sync animation. I'm not sure if you could fix it so that the timeline can display some kind of, how is it called?
A preview, maybe? Yeah, like the waveform, but taking into account all the sound files. Oh, like a mixdown of the whole thing. Is it possible? I'm not sure if that's really necessary; I mean, for lip sync you basically have to mix one sound per character, usually. I'm not sure; we'd have to do a bit of research about how people would use it. I guess it might be possible, though I think the audio API does not allow it yet. Of course, we can always try to code it. Actually, one of the artists requested a sort of window which only displays the movie or audio clip that you click on, which would allow you to see the data more easily. I'm not sure if that's the solution; I guess we just need some better feedback there. Okay, thank you.

Thank you very much. Now the next speaker: the next speaker is Andrew Peel. Thank you.