My name is Dalai. Nowadays I work as Development Coordinator for Blender, at Blender in Amsterdam. I started coding for Blender 11 years ago; that was my first patch into Blender. And although it would be fascinating to talk about the Blender project's plans for 2020, you can all find those in our official communication channels, which a lot of you don't follow. So I recommend going to code.blender.org. This is our main communication page, from developers to developers. We just recently posted something about the main ten projects for this year. But for this moment, I think it's more interesting to take all those projects, where you can see how they are going to affect end users and what they will bring to Blender, leave them aside for a little bit, and talk about one particular project, which for me is a pet project. It's an area of technology in Blender which I'm quite passionate about: virtual reality, or augmented reality, or XR, which is an umbrella term for all of those. And through this project I'll try to illustrate: how do we get new developers into Blender? How do we get new features into Blender? How does our communication work, and where does it happen? What is it like to be transparent and open and big at the same time?

So let's start our journey. In 2014 we were starting to play with VR in Blender. This was during the Gooseberry project, one of the open movies we had at Blender, which was just starting at the time. Who here knows about the open movies? Okay, good answer. They are open projects made by the Blender Institute, the Blender Animation Studio. I was playing with one of their files and started rendering stereoscopic VR. Basically, you render two panoramas, one per eye, you put on an Oculus, and you can look around. That was 2014, like six years ago. It was actually really well received. At the time, the mother of the director of Gooseberry, of Cosmos Laundromat, was there, and she was like: fascinating, oh my God.
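To make the "two panoramas" idea concrete: in omnidirectional-stereo rendering, every column of the equirectangular image corresponds to a viewing longitude, and for each longitude the two eyes are offset sideways along the interocular baseline, perpendicular to the view direction. This is a minimal illustrative sketch, not Blender code; the function name and return convention are hypothetical.

```python
import math

def ods_eye_positions(longitude, interocular=0.065):
    """Hypothetical helper: eye positions for one column of an
    omnidirectional-stereo panorama. The eyes sit on a circle of
    diameter `interocular` (metres), offset perpendicular to the
    view direction for that longitude, so every column gets
    correct horizontal parallax. Returns ((lx, ly), (rx, ry))."""
    r = interocular / 2.0
    # View direction for this column of the panorama.
    dx, dy = math.cos(longitude), math.sin(longitude)
    # The eye baseline is perpendicular to the view direction.
    px, py = -dy, dx
    left = (-r * px, -r * py)
    right = (r * px, r * py)
    return left, right
```

In Blender itself this corresponds to enabling multi-view (stereo) rendering together with a panoramic equirectangular camera in Cycles; the renderer applies an offset like the one above per column.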
And for me it was easy, because that did not start as a feature for VR. It started as something for full domes, a different kind of technology, and I started working on it as a contributor, as a volunteer, and at some point it made it into mainstream Blender. In 2016 we said: okay, you can render in VR, but what if you could also author in VR? What if you could experience and see how the project is going before you go all the way to the final rendering? So this is where we started to experiment with storyboarding in Blender, where you could put on one of those headsets and look around. At that time it was a plugin using the Oculus SDK: totally non-GPL-compatible, totally experimental, non-official. But it got things going; the technology is very interesting and we started to get some traction. In 2016 the Blender Animation Studio had Caminandes, one of their open productions, and they partnered up with Google. Google at the time was promoting their Cardboard headsets a lot, and they wanted content, and they wanted authoring tools, DCCs, digital content creation tools as they call them. And Blender, as a big umbrella, was never afraid of partnering up with someone as big as Google if it was of common interest. There was nothing there violating the openness, the freedom; it was the other way around. It allowed Blender to actually move forward a little with the VR agenda and produce some really interesting content. At the time they were using OpenHMD, a Linux-based, reverse-engineered HMD driver that gave VR support to Blender on Linux. I mention that because, at the time, one of the experiments I was showing used the official Oculus SDK and the other used a reverse-engineered Linux stack. But how does it become mainstream?
How can we take something like this and bring it into Blender? The interesting thing is, there are a lot of people interested in that. The project I was showing here was something I was developing personally, in a research facility in Brazil, because people from Oculus Story Studio, a small group inside Oculus, the company that was bought by Facebook, wanted to experiment with storyboarding for VR experiences. The alternative was to draw something, animate it, compile the game, see how it looks, and then do it over and over again. If the same authoring tool used to make the experience can also be used to preview it, that's actually very powerful. Then there were people like the MPX project, people who were already involved in Blender. Who here knows what grease pencil is in Blender? Grease pencil is what allows people to do those kinds of 2D drawings in a 3D environment. The whole grease pencil team is composed of people who are not on the payroll of the Blender Foundation; they are contributors, and they've been involved for a few years already. And they went to the next level and thought: what if we could also draw in VR? You could be working in Blender, put on an Oculus, work a little, and take it off. That was the whole idea. What they wanted was seamless integration: not having to learn a new UI, not having to learn a new way to interact. So they wanted, for example, the whole Blender interface inside the VR space. And they made it: they had someone hack their way around Blender, and it was a really nice prototype. Then there were people like Blender XR, from a company called Marui. They developed a VR plugin for Maya, basically a plugin where you can do all sorts of operations immersed in VR. So it's all UI, all UX. And they wanted to start supporting Blender.
But they had the same problem we had in the other projects: how can we do something that's compatible with the GPL, with Blender's license, and at the same time compatible with the industry standards for, you know, the Oculus or the Vive or the Microsoft HoloLens? An interesting problem. Blender FX is a group in Germany that has also been involved with Blender development as contributors: users, artists that use Blender, giving us feedback, sitting together with developers to think about the features. They were taking some of the open movies, re-rendering them as panoramas, and placing a small virtual screen that you could watch while immersed in the scene. They're also interested in using Blender in VR for scene inspection; they do architectural reconstruction, where you want to be able to look around before rendering the final thing, and they're using Blender for that. And, surprisingly, Ubisoft. More recently they joined the Blender Development Fund. If you watched Ton's talk, he probably covered most of how the funding for Blender got to where it is today. But basically, Ubisoft not only joined the fund, giving Blender money, but also promised to allocate development time from their own team at Ubisoft France to implement features in Blender. And they're particularly interested in VR because they want tools for set dressing: you have your set, and you want to place different furniture, or pebbles, or stones, for a director. You want to see how a shot is going to look before you render it out. Since everything nowadays goes through the computer, through 3D, they also want authoring tools that let the director be immersed. So how do we reconcile all those different tools, and what's the role of the foundation in all of that? Because, if you think about it, the foundation can only grow to a certain point.
But the fundamental role of the Blender Foundation, I would say, is to make sure the collaboration can happen: to provide the infrastructure, to provide the onboarding, and to make sure everyone can work together. Luckily for everyone involved, last year saw the first release of OpenXR 1.0. OpenXR is a standard by Khronos, the same group behind OpenGL and all those other acronyms we got used to. They try to unify the whole VR ecosystem. Before, every time a piece of software wanted to support the Oculus, or the Vive, or the Microsoft HoloLens, it needed to support that vendor's SDK, be compatible with their SDK, and so on. OpenXR creates a whole abstraction layer, so Blender only needs to worry about OpenXR. And in the same way that OpenGL is integrated at a low level in the operating system, it just works as long as the runtime is compatible; I don't know all the details, and they shouldn't matter. But it allowed the Blender Foundation to say: you know what, we can officially help bring VR into Blender. So last year we had Julian Eisel participate in the Google Summer of Code; we still use the Google Summer of Code to bring new developers on board. And he got, I'd say, quite far: a whole scene-inspection workflow. This is again one of the open movies from the Blender Animation Studio, and he was using OpenXR to run the whole thing, on Windows, I believe, because on Linux we still don't have head tracking. So we said: you know what, as the Blender project we can support the basics of VR. VR is a very niche area, and it would be a bit overkill to dedicate core money from the Development Fund to feature creep and try to get a really smooth VR experience. Who here has a VR headset? Oh my God, there are actually a lot; I see a lot of hands. We are a very biased sub-sample. But it's a very interesting topic, and we developers, geeks, tinkerers, we like to think about that.
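The value of the abstraction layer is the shape of the dependency: the application codes against one neutral interface, and each vendor ships a runtime that implements it. This is a minimal illustrative sketch of that pattern; the class and method names below are hypothetical, not the real OpenXR API.

```python
class XRRuntime:
    """What the application sees: one vendor-neutral interface
    (illustrative stand-in for what OpenXR provides)."""
    def head_pose(self):
        raise NotImplementedError

class OculusRuntime(XRRuntime):
    def head_pose(self):
        # Stub: in reality the vendor's runtime answers here.
        return (0.0, 1.7, 0.0)

class OpenHMDRuntime(XRRuntime):
    def head_pose(self):
        # Stub: a reverse-engineered driver could answer instead.
        return (0.0, 1.6, 0.0)

def render_viewpoint(runtime: XRRuntime):
    # Application code never branches on the vendor: it asks the
    # abstract interface for the head pose and renders from there.
    x, y, z = runtime.head_pose()
    return f"rendering from ({x}, {y}, {z})"
```

In the real API the application talks to an OpenXR loader (calls like `xrCreateInstance`), and whichever conformant runtime is installed, Oculus, SteamVR, Monado, answers; Blender's code stays the same either way.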
So the idea was to at least get the basics running, the fundamentals. Julian had actually been a contributor to Blender for years already: six years on and off on Blender, two Google Summer of Codes, first commit in 2014. And after the Summer of Code, since it was a nice project and Julian was available, we said: come on board. We had enough funding at the time to have him working full time. And he's not there only to do VR; he's working on the user interface, on the basics, on fixing everything else too. But that's also part of the process: okay, this is working well, let's get him involved, let's commit on both sides to continue this relationship. Of course, again, we don't do anything only by ourselves and for ourselves. So at the Blender Conference this year we actually got together a small representation of those groups I was telling you about: people from Blender FX, Daniel from MPX, people from Ubisoft. We could sit together and say: okay, what can the foundation do, and what can everyone else do? So we agreed that the first two milestones are for the foundation: basic OpenXR support, basic scene inspection, and an API for drawing in VR. But what's the right usability for VR? No one knows; it's going to be a few years until that's consolidated. So Daniel is going to lead a third milestone, toward drawing and sculpting interaction. Marui, which is not represented here, will make sure their plugin can run on top of whatever API we come up with. And then you have Ubisoft keeping us in check, because at some point they're going to say: hey, the basics are there, go have fun, and give code back, give code back. Well, we cannot have people gathering physically at all times; it's not practical. And at the same time, Blender as a project is, what, 20, 25 years old? Wait, how many years? 18. 18 years. That short?
Well, it's open source, and 1998 is when it first went online. And the whole infrastructure of Blender, the communication, was built on top of, you know, mailing lists and, from around 2002, IRC, which in a way didn't age so well when competing with Twitter, with Facebook; people are going to use the channels they're used to. So we're trying to modernize it a little. We now have a Rocket.Chat-based online chat; blender.chat is the website. Anyone can go there. It's where developers now talk among themselves; it's where we work. We even have, as an example, a channel dedicated only to the VR topic. It's really the place where people are supposed to work. We have DevTalk, which is a Discourse-based website, where again we try to separate user feedback and usability from general development. The idea is that the cycles module, the grease pencil module, the VR group should be able to use it among themselves, and everyone can read, can follow, can in a way interact. We keep everything open and transparent. Every Monday we have a development meeting, and we keep everything posted there: what happened, what didn't happen, the progress reports of every single developer. Jesus Christ, shame on me; it's only one more slide. And, as mentioned before, we have a nice blog where we encourage people to share development; it's not a place for PR, we're not here to talk about marketing. Take the Grease Pencil team, which is working from Spain and Argentina: we keep tabs on their work, we have some collaboration, and they can go there and own that space. The Blender infrastructure is a place for everyone that contributes to the Blender project to really take ownership of. And of course we also use this for outreach to the community: we're using YouTube, using Twitter.
So, overall, we are in a bit of a transition: from tradition, IRC and mailing lists and lonely developers working on their own little projects, toward trying to be bigger and get more people to collaborate. So I hope this gave you some insight into the project, and that more people can join us. Thank you. I believe we might have five minutes for questions. Yes? If anyone has any questions or comments. [Audience question about the projects mentioned in the beginning.] More or less. If you go to code.blender.org, we posted this last Monday: there's an overview of the ten big core projects. There's even more than that, but those are the projects that are everyone's responsibility; every single developer in the core team is responsible for them. One criterion we didn't put there, but it applies: if a developer who is leading one of those projects goes away, someone will fill in for them. Whereas if the developer on VR goes away, maybe we won't have a VR developer for another three years. Yeah, that's the reality, right? Those are the core projects we're going to make sure get delivered. I see a hand here. [Audience question:] When you see the same thing on your screen and then in VR, it actually feels really different; the feeling of seeing the models is really different. So how does VR help productivity? Is it just for making VR content? I'm curious how VR helps productivity, because I would assume that if I make something on the screen, people will look at it on the screen, but if I make it in VR, the screen version is going to look different. [Dalai:] Could anyone hear the question? No? The question was: what's the point of VR if you're not making VR content? If you're making VR content, it's very obvious, of course.
But again, take Ubisoft, for example: while immersed, they'd still like to have a traditional virtual camera there, so you preview the shot inside the virtual set. If you're doing architectural modeling (at some point that's more AR), you are in Blender, you put your glasses on, and if you're modeling this lecture room you can see the whole lecture room and go back. If you're doing character modeling, you've probably seen those making-ofs from Disney with physically sculpted characters; you could use VR to inspect your models the same way. I'm totally biased, but for sculpting you can actually sculpt in a more natural medium, right? You actually see and move around and touch. But the real answer is: no one knows. We're willing to give the technology a try as long as the other part, the community, is embracing it as well. That's kind of the process; we need people to come back to us. It's a good middle ground, a good compromise between digital and analog. You have a question? No. All right. [Audience question about using Blender for 2D.] Oh, to use Blender for 2D? Yeah, I see. For 2D people, well, there's a whole roadmap now for storyboarding with tools like Blender. Even for pure 2D it's so handy to have a camera you can actually pan around, and to be able to reuse assets between shots. That's a whole discussion. I think we might have time for one last question? Yeah. [Audience question about the video sequence editor.] We do have a developer, Richard Antalik, who's been hired to work full-time on triaging, to help the infrastructure as a whole, but also to try to tackle the video sequence editor. But it's one of those projects where, if he for whatever reason decided to walk away, we wouldn't be able to prioritize it. However, it's on the agenda. Vulkan, for instance, is on the agenda as well. More storyboarding for 2D is on the agenda, more sculpting tools, texture painting tools. There's a lot of work being done.
We just had to draw the line at some point on what we can promise; everything else is circumstantial. And again, anyone is welcome to add to Blender, ideally looking at the roadmap of Blender and trying to help on top of that, otherwise it might get too complicated. But everyone's welcome. So it's a bit of an open end: what's going to be there? We can only tell when it's ready. Well, thanks everyone for your time. We have stickers here. And that's all the time we have. Thank you.