Ja, well, let's get started. First off, HDR might not be what you think it is. It's not what your mobile phone does when it merges several exposures into one image. If you're doing tone mapping, you're not doing HDR. And when you're doing HDR, any kind of ICC-based color management is completely irrelevant. HDR means having a huge dynamic range, depending on the hardware between 0 and 1000 nits. If you're doing real HDR, which has only become possible in the past year, you can show effects like sun shining through a window. You can actually play with light. Instead of imitating paper, screens start imitating life.

And you get way more colors. An HDR image has 10 to 16 bits per channel, sometimes even 32, but that's not that relevant. The point is there's a lot of extra capacity for storing information, to go beyond basic sRGB. The HDR color space is Rec. 2020, or BT.2020, and it has a big gamut: more greens, more blues, more reds. If you start painting in HDR, it's a bit like it's the 19th century again, when Phthalo Blue and Cadmium Red were invented and Viridian Green got discovered: you've got way more colors in your paintbox.

That said, it's very early days. We started the HDR project together with Intel. Intel started bothering me about HDR in Krita a couple of years ago. The hardware didn't exist back then, the specifications were very weird, and operating system support didn't exist. In late 2017, Windows started supporting HDR displays. Unfortunately, no other operating system supports HDR displays yet. The number of HDR displays is growing quite quickly, but when we started the project around April 2018, the display hardware I ordered couldn't be delivered until September, which put a bit of a delay in our schedule. HDR monitors come in three classes. There are actually two competing standards, Dolby Vision and VESA DisplayHDR, but only VESA DisplayHDR is relevant, because that's what's supported by the OS.
So don't get taken in by Lenovo: the ThinkPad X1 Yoga with an HDR screen is Dolby Vision, which is useless for us. Intel is working on Linux drivers, but that's apparently a very low priority, so we had to use Windows for this. If you want to know which displays can show HDR content, go to the displayhdr.org website and get a supported display.

I will try to get that animation to run. The problem, of course, is that I can't show you any HDR content without an HDR display. A projector like this one will never be able to show HDR content in HDR. Where did my slide notes go? It's only when you're sitting at an HDR-capable system that you will discover how wide your new capabilities are. Traditional displays are calibrated as if they were a sheet of white paper, illuminated by a source of light matching the D65 standard, like average midday light in Western Europe.

So what do you really need to get started? An HDR-capable monitor; we just covered that. You need the latest version of Windows 10. You need an HDMI cable or a DisplayPort cable, although in my experience an HDMI cable doesn't work, because you will get red ghosting around every black pixel. Early days. NVIDIA works fine; I have no idea about AMD. You really need to check the compatibility of your hardware, because some monitors only work with Intel. Then you can get started painting. It's a bit of a hassle, so these are the instructions to actually get set up, and then you can start painting.

The act of painting in HDR is really fun. It's like you're painting with light. You can use it for really subtle effects, like a single HDR image of a winter landscape where all that HDR did was show the snowflakes reflecting light as real light. On a theoretical level, your HDR image should be considered scene-referred instead of display-referred: you're working with the light as it is in the scene, instead of working towards the display.
You're not painting the reflection of light anymore; you're trying to paint the light itself. All this would be pretty useless if you couldn't save it, and that's another area where it's early days. Krita's native format can, of course, save anything you can do in Krita. If you want to interact with other applications, there's the OpenEXR format, which has existed for a very long time. Krita has actually supported creating HDR content using OpenEXR since 2005. It just turned out that it was impossible to see your HDR content in one go: you would have to fiddle with the exposure slider and check your image at different exposures. People have actually painted HDR images using that method. So when we finally got Krita working and showing HDR content in HDR, we could check those images as they were intended. That was a really cool moment.

Then there's the extended PNG format, which can be used to share your images. It's not a finished standard yet, but it's a bit more finished than extended JPEG, which is not implementable at all at this moment. So Krita supports that. And the latest builds of FFmpeg can take those extended PNG images and create an HDR animation from them. You can upload that to YouTube, and YouTube handles HDR content, so you can watch it in HDR, provided your phone or your television set supports HDR.

So, how did we do this? Hacking is fun. Well, this was quite tough; there were lots of components that we really had to hack on. In theory, it's pretty simple. You use DirectX, because that's the only API that supports getting the metadata from the monitor and sending information about each image to the monitor. You create a swap chain with a 10- or 16-bit pixel format. Then you set the color space of the swap chain to Rec. 2020 PQ if you're working in 10-bit mode, or scRGB for 16-bit mode.
You have to make sure that all your textures and surfaces, all your image data, are rendered in 10- or 16-bit mode, because you don't want to go from 32 bits down to 8 bits and then scale back up from 8 bits to 10 bits per channel. Now, there's another wrinkle. Whether you're doing DirectX or OpenGL, your GUI elements, the buttons and menus and things like that, are rendered into the same surface as your image, as your canvas. So whatever your GUI painting code produces has to be converted to the same color space you're using for your canvas. All in all, we had to do a lot of work.

You might be lucky: if you're using OpenGL and there's another HDR-rendering application running on the same monitor, your application might accidentally be able to render HDR content. But you're missing out on all the metadata, so you can't actually figure out what the screen is capable of. So you have to use DirectX. Now, Krita doesn't use DirectX, just like we don't use Metal. We use OpenGL, and maybe in the future we'll use Vulkan, but we are not going to do all those proprietary things. We use a layer between OpenGL and DirectX, so we can use those APIs after all. That layer is called ANGLE. It was developed by Google for Chromium, because OpenGL drivers on Windows are shit. They are really horrible. They have given me hours, days, months of bugs. At one point, with some Intel chips, users would get a black canvas if they used OpenGL instead of ANGLE. ANGLE is a godsend. We use ANGLE as our intermediate API, and we had to extend ANGLE to get at the HDR API: we needed a way to tell it which color space we want to use. As I said, we can choose either Rec. 2020 PQ or scRGB. scRGB with a linear tone response curve is easier for artists to work in, so we chose to let people work in that format, and then we convert it for the display.
We had to hack Qt, because Qt will externally tell you that it supports, for instance, 10-bit integer OpenGL buffers, while internally it converts everything to 8-bit. We had to hack that out and make sure that whatever we feed Qt's OpenGL API stays in the format we chose for it. Then we had to make sure we could use Qt to tell the system what we were going to render: we had to extend Qt's Windows surface handling so it could support color space selection, and then we could pass that as the default surface format.

We also had to make sure that everything rendered with QPainter was converted from sRGB to the display color space at the right brightness level. QPainter is Qt's equivalent of Cairo, a 2D painting library with 16-bit integer channels. Brightness is another issue: if you have this range between 0 and 1000 nits, between really black and blindingly bright, brighter even than this lamp here, then you have to fiddle with your GUI brightness, which is in effect SDR content, not HDR content, so that it stays comfortable. There's a setting for that in Windows; you have to fetch that setting and then convert all your widgets to the right brightness level. That's actually a bit of a struggle right now, because Windows doesn't remember those settings correctly and doesn't apply them correctly to individual applications. So sometimes Krita's GUI gets a bit too dark, sometimes a bit too bright. Like I said, these are early days: Krita right now is the only content creation application that does HDR with HDR displays, and we're still figuring this out.

And of course we had to do color selectors. Color selectors are mostly sRGB-based, at least in Krita, so we had to create a color selector with an exposure slider, from dark to light. The color selectors still have a few bugs, but they're useful; people are using them.
We extended Qt's Windows native interface API to fetch all the information from Windows I was just talking about, and tests show that we do indeed get the right data.

So, what's up next? I'm pushing Intel to support HDR on Linux; it shouldn't be too hard for them, and they're working on it. macOS? I doubt that will happen anytime soon. macOS likes its walled garden, and there are no Macs with HDR-capable screens at this moment. I don't see it happening, and I'm not too interested anyway. I do see rapid growth in hardware. I've seen the first laptop with a true DisplayHDR panel announced: it's going to be a Dell XPS 13, which is a pity, because that's a model without a pen, and it would be nice to be able to take your HDR-capable painting computer outside in the sun and start painting. But I'm sure it will come. The list of available monitors is getting too long to fit into one screenful of links.

What's turning out to be quite a bit of work is getting our patches merged upstream into Qt and ANGLE, but that's quite normal. We fixed QPainter to support the OpenGL 3 core profile two, three years ago already.
It took about a year to get that merged, so I'm confident it will happen, and then whoever develops with a future Qt can create a true HDR-capable application.

There are some bugs, of course. Right now, as far as I know, there are only two people in the whole universe who have created HDR content in Krita (they are sitting over there), and we will have to make more features compatible with HDR. Right now our gradients are 8-bit integer per channel, which is not good enough; that's something we need to work on. Some of our filters work, but most don't really. Brushes work fine. There are some bugs with blending modes, things like overflow. All of that is planned to happen after the release of Krita 4.3. Krita 4.3 is due September or October, so we still have a couple of months of fixing bugs. We just released Krita 4.2, which means that people are finally testing our new version. I don't know whether it's the same for other projects, but even though we make nightly builds, they hardly get tested; then there's a release, and suddenly everyone finds bugs.

I kept the introduction short because I was expecting questions, and I see that I still have quite a bit of time for them. So: questions?

Right, so the question is: what color management engine do we use for HDR content? We use OpenColorIO. Let's take a step back. We use LittleCMS for our ICC-based workflow, the original workflow meant for publication on paper. Years ago we added OpenColorIO. OpenColorIO comes from the movie industry; it's an advanced system that lets you emulate the looks of different kinds of projection technologies, and it supports floating point and HDR by default. The thing is that color management in HDR is a bit of a grey area. What OpenColorIO is actually meant for is to take HDR content and show it at a certain exposure, at a certain level. So I've been wondering whether it's not actually more accurate to say that HDR content is not really color managed at all:
You've got the Rec. 2020 PQ color space, and we know how to convert from scRGB to Rec. 2020; that's the job of OpenColorIO. But there's no way to accurately profile an HDR display, so you just have to hope that it's going to be okay. I don't know what's going to happen in a couple of years when our displays degrade and start showing colors in a weird way, because these displays use a lot of energy, and that puts a lot of wear on the panels.

[Audience] I think there is some discussion going on about HDR content.

Where is that discussion going on? Right, I will subscribe to it; I will ask Dmitry to subscribe as well. Go ahead.

[Audience] On the pixls.us forums there is a discussion about Wayland and color profiling. They are currently discussing what a protocol for profiling a display would look like, and they took HDR into account, because it's whatever they need to send to the compositor; they probably wouldn't use the same protocol. That's where the discussion is going on. Were you not aware of that?

I was not aware of that.

[Audience] It's still ongoing; it's a huge thread.

Let me repeat that for the video: on the pixls.us forum, people are discussing Wayland and color management, and they are taking HDR into account. That, of course, presupposes that color management is something that should be done in the display compositor. I'm a bit torn about that, because we've tried that before with window managers on X11, and the results were not always that good. The problem is that if you are writing a color-managed application, you don't want the compositor or the window manager to mess with your pixels.

Anyone else? Go ahead. OK, so the question is: is HDR interesting for vector art? I would say HDR is interesting for every sort of art, because it gives you all these new colors to play with; you can go way outside of what used to be possible. Why wouldn't it be interesting? Looking at this from an artist's point of view, you get a whole new box of toys to play with.
Go ahead. The question is: do you need HDR to use those larger chromaticities in BT.2020? Let me go back to the slide; now I'm going too far. Wonderful. So theoretically, of course, you could have the Rec. 2020 color space without the extra dynamic range, but I'm not aware of any displays that give you the Rec. 2020 gamut without also giving you a larger dynamic range.

Go ahead. So the question is: do display manufacturers cheat? Do they really implement the full Rec. 2020 gamut, or are they secretly converting back to whatever gamut their panel actually has? Honestly, I don't know; I haven't measured it, but I wouldn't be surprised at all. There's a huge variety in price points at this point. There are three levels of DisplayHDR certification (400 nits, 600 nits, 1000 nits), and then there are really professional monitors that go up to 30,000, but I haven't seen those. I would expect cheaper monitors to fudge it.

Last question. Well, you, go ahead. Do we have contact with Netflix? Because there has been a blog post about how they create HDR content for their menus. No, I'm not in touch with Netflix, except that Intel gave me their white papers and research papers to read. I haven't got contacts there.

That was the last question, so thank you very much. Thank you.