Hi, I'm David Glenn with the HDMI Forum. We're a standards organization that develops all of the future versions of the HDMI specification, starting with 2.0 and then 2.1, which we're very excited about. It's a standards organization that's open to any company that wants to join. We currently have 84 members spread across the industry, so we work not just with device manufacturers who make sources and sinks. We also have a lot of members making components like silicon and cables, and members making test equipment. We also have members who are content developers: streaming companies, content media companies, studios. And we all get together and decide what we want in future versions of HDMI. So in the last few years, we've been developing the 2.1 specification. The focus there was on 8K and 4K 120, as well as gaming features like ALLM and VRR, which we're all very excited about coming out in products in the near future.

Because device makers want their devices to be future proof, right? So they're looking far ahead into the future and saying, we want this. How do they agree on what needs to be in there?

So in the forum itself, we have regular meetings pretty much every week by phone. We also get together many times a year in face-to-face meetings, and this is where we make decisions about what kinds of features we want and have technical discussions about exactly how to accomplish those things. And it's a democratic standards organization, so all the active participants get together and make technical proposals. Sometimes there are several technical proposals, and we debate the merits of each and often come up with a combined solution. Sometimes there's only one proposal, but a bunch of other companies will join in and help make it better. And when it's ready, we include it in a specification, like we did with the 2.1 spec.

So can you tell a bit of a story about how you came to the specs for HDMI 2.1? Is it possible that one of the members wanted 8K 120 to be in there, and other people said it's too difficult or something? But you can do it, 8K 120, with compression, right?

Yes, we support it with compression.

So there's all this back and forth, and you pretty much have to judge what becomes possible in the future with all these chips. Something like that?

Sure. So I think the question really is about how we decide what to put in, in terms of future capabilities like 8K 60 and 8K 120. The 2.1 spec does support 8K 120 using DSC compression. But really it comes down to making sure that the products for the next, let's say, three to five year timeframe are going to be affordable. In other words, we don't want to build an infrastructure that has, say, 100 gigabits of bandwidth and force that onto consumers when it's not going to be economical in that timeframe. So we agree as a standards organization on the right bandwidth for the next cycle of standards, and then we build specifications to that, to ensure that all the products can be made at an affordable price point for our consumers.

It's already a big jump. 48 gigabits per second is pretty big.

Yeah, and 4K 120 doesn't use all of the 48 gigabits. Even 8K 60 doesn't use all of the 48 gigabits in many configurations; 8K 60 with HDR10 uses about 40 gigabits. So there is a big jump there, and there is bandwidth for some future expansion. Like we said, 8K 120 with compression is available within that 48 gigabit envelope as well.
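As a rough sense of where those numbers come from, here is a back-of-envelope sketch of uncompressed video data rates. It counts active pixels only; real HDMI 2.1 links also carry blanking intervals and FRL encoding overhead, so actual link usage runs somewhat higher than these raw figures:

```python
# Back-of-envelope uncompressed video bandwidth, active pixels only.
# Real links also carry blanking and encoding overhead, so these raw
# figures understate actual link usage somewhat.

def raw_gbps(width, height, fps, bits_per_pixel):
    """Raw pixel data rate in gigabits per second."""
    return width * height * fps * bits_per_pixel / 1e9

# RGB / 4:4:4 at 10 bits per component = 30 bits per pixel
print(f"4K 120, 10-bit RGB:   {raw_gbps(3840, 2160, 120, 30):.1f} Gbps")
# 4:2:0 chroma subsampling averages 1.5 components per pixel = 15 bits
print(f"8K 60, 10-bit 4:2:0:  {raw_gbps(7680, 4320, 60, 15):.1f} Gbps")
print(f"8K 120, 10-bit 4:2:0: {raw_gbps(7680, 4320, 120, 15):.1f} Gbps")
```

The first two modes come out near 30 gigabits raw; with blanking and encoding overhead added, 8K 60 at 10-bit 4:2:0 lands near the roughly 40 gigabits quoted above, while raw 8K 120 overflows the 48 gigabit envelope, which is why it relies on DSC.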
It's a pretty amazing amount of data. It's big, and all these chips have to know how to handle it. I mean, for the next three to five years they have to handle all this data.

Not every chip has to handle 48 gigabits. If you're building a product that is a 1080p gaming monitor, then you don't have to support 48 gigabits, because you're not going to put that high-bandwidth silicon in a 1080p monitor. It's only really if you're building 8K 60 or 4K 120 TVs or source devices that you have to enable that higher data rate. The HDMI 2.1 spec doesn't always have to run at 48 gigabits. It has lower speeds, at 40 gigabits and 24 gigabits and various other rates, so it will use the right data rate for the capabilities your mix of products wants to have. On the other hand, the Ultra High Speed cable that we've defined is future proof. Every Ultra High Speed cable has to be capable of supporting the full 48 gigabits. That way, when you buy a UHS cable, you're assured that it's future proof and you can keep using that cable when you do decide to upgrade to 4K 120 or 8K 60 or beyond.

It's just an awesome industry, you know. When people talk about what to look for at this CES, most of the articles are mentioning HDMI 2.1, they're all mentioning 8K. And you're pretty much organizing the industry around that, right? You're presiding over the forum as a whole, everything from American companies to Chinese companies to everybody, right?

Yeah, we're very excited that HDMI is such a widely adopted standard throughout the world, in both the consumer electronics space and other spaces like the IT and PC ecosystem. And yeah, it's great that everyone is adopting 2.1 on these advanced devices. HDMI has been very successful: there are now over nine billion devices in the world that have been HDMI certified, in terms of sources and sinks. And that's certainly going to continue to grow at a very good rate.

So more than one per person.

That's right. I certainly have more than one.

That's a lot. One thing that I noticed is that there are a lot of different types of HDR. Is it a challenge to specify what goes in, or do you just support all the different ones in HDMI?

Yeah, so with the 2.1 specification, on HDR we took a framework approach where we defined a very flexible mechanism for transporting HDR with wide data rates, so 10 bit, even 12 bit per component if you want, and very flexible ways of transporting metadata. In the HDMI specification itself we don't really mandate the different kinds of HDR. We really enable the transport system for different mechanisms, so that we can work with all of the different HDR standards around the world. In fact, all of the existing ones are supported by HDMI, and it's certainly possible to write new ones using the HDMI 2.1 infrastructure. Because it is so flexible, future HDR standards could work over it as well.

So it doesn't mandate what type of HDR?

No, it does not mandate a type of HDR. It just gives the infrastructure for transporting very extended pixel data and metadata.

Because HDR is awesome, right?

Absolutely, HDR is awesome. HDR is something I've been working towards my whole career.

Oh yeah, so a long time?

Yes.

So it's even better than the resolution sometimes. People say that it's more impressive.

Yes. Well, I've been around long enough that my eyes are no longer that appreciative of very high resolution. But with my eyesight I'm certainly very appreciative of high dynamic range and wide color gamut. So those are things that I think even more people can really benefit from than higher resolution. Everyone can benefit from higher resolution unless they have very low vision. But I think it's more obvious to people when they see high dynamic range and wide color gamut.
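To make the "transport infrastructure, not a mandated HDR format" point above concrete, here is an illustrative sketch of the static metadata an HDR10-style stream carries. The field list follows SMPTE ST 2086 plus the content light levels; the actual byte-level packet layout lives in CTA-861 and the HDMI spec, so treat this as a model of the payload, not the wire format:

```python
from dataclasses import dataclass

@dataclass
class StaticHdrMetadata:
    """Illustrative HDR10-style static metadata (SMPTE ST 2086 plus
    content light levels). HDMI transports a payload like this in a
    metadata packet; it does not interpret or mandate the HDR system."""
    # Mastering display color primaries as (x, y) chromaticity coordinates
    red_primary: tuple[float, float]
    green_primary: tuple[float, float]
    blue_primary: tuple[float, float]
    white_point: tuple[float, float]
    max_luminance_nits: float   # mastering display peak luminance
    min_luminance_nits: float   # mastering display black level
    max_cll_nits: int           # maximum content light level
    max_fall_nits: int          # maximum frame-average light level

# Example: content mastered on a 1000-nit display with BT.2020 primaries
meta = StaticHdrMetadata(
    red_primary=(0.708, 0.292),
    green_primary=(0.170, 0.797),
    blue_primary=(0.131, 0.046),
    white_point=(0.3127, 0.3290),
    max_luminance_nits=1000.0,
    min_luminance_nits=0.0001,
    max_cll_nits=1000,
    max_fall_nits=400,
)
```

Dynamic HDR systems send per-scene or per-frame metadata through the same kind of transport, which is why new HDR standards can be layered on without changing HDMI itself.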
So there are more things that HDMI 2.1 kind of solves. There are all these sound bars that are sometimes difficult to configure and a little bit weird. And now it's making that easier?

Yeah, we have the enhanced audio return channel, which really tries to take that problem and solve it with sound bars and AV receivers. It's much more flexible in terms of negotiating capabilities, and it supports a lot more advanced audio codecs.

Wireless speakers, all this kind of stuff, and it needs to be automatic.

Well, HDMI is not wireless.

It's not wireless, okay. I meant all these different kinds of speakers and so on, because TVs sometimes have good sound, but it's often better to have external sound.

Certainly, I think a lot of people prefer external sound. It is challenging to build very good speakers into small-bezel TVs, although some of the TV vendors are doing very impressive things, I have to say, with their narrow bezels.

And then you also satisfy the gamers. How challenging is it to make a TV that has variable frame rates and so on? Is that a piece of cake, or is it really complicated? And how do you make everybody agree on how that should work?

Well, making TVs support variable refresh rate is certainly more challenging than it is with monitors. I can't really get into the technical details of how it's done, but we're very happy that VRR is now part of the HDMI standard, and that's encouraging more and more TV manufacturers to recognize that gaming is such an important aspect of their market. The game console vendors of course appreciate that too.

Because as far as I understand, the chip, the GPU and the CPU, sometimes can't quite do the full frame rate; how many frames it outputs depends on how powerful it is. So it's really nice for a gamer to have the same thing rendered on the screen as what the chip can do. That's the point, right?

Yeah, the point of variable refresh rate is that with games you often get into scenes that are just a little bit too taxing for your rendering system, your CPU and GPU. So you can't always achieve, say, 120 frames per second or 144 frames per second. In those more complicated scenes you can fall down to a lower refresh rate, and then when you get back to a more typical scene you can go back up to a higher refresh rate. There are also gamers, in PC gaming, who just prefer to set all their quality settings to extreme and ultra. They're not really trying to get to 144 hertz; they prefer a very beautiful looking image at, say, around 50 or 60 hertz or FPS. VRR works for that as well, without you having to go and change your display to 60 FPS and back to 120; you can just stay at the one timing.

And if the GPU goes down to like 71 hertz or something, it'll just work. It'll match exactly that in real time.

Yeah, that's right.
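As a toy illustration of that "display matches the GPU in real time" behavior, here is a minimal frame-pacing model, assuming a hypothetical panel with a 40 to 120 Hz VRR window (the real HDMI mechanism works by varying the vertical blanking interval, which this sketch glosses over):

```python
# Toy model of VRR frame pacing: the display presents each frame when the
# GPU finishes it, as long as the implied refresh rate stays inside the
# panel's supported VRR window (hypothetical 40-120 Hz range assumed here).

VRR_MIN_HZ = 40.0
VRR_MAX_HZ = 120.0

def present_interval(gpu_frame_time_s: float) -> float:
    """Return the interval the display actually waits before presenting."""
    fastest = 1.0 / VRR_MAX_HZ  # cannot present sooner than the max refresh
    slowest = 1.0 / VRR_MIN_HZ  # below the min rate, real panels repeat frames
    return min(max(gpu_frame_time_s, fastest), slowest)

# A GPU rendering at 71 FPS (about 14.1 ms per frame) is matched exactly:
for fps in (144, 120, 71, 50, 30):
    shown = 1.0 / present_interval(1.0 / fps)
    print(f"GPU at {fps:>3} FPS -> display refresh {shown:.1f} Hz")
```

Inside the window, the refresh rate simply tracks the GPU; only at the edges (144 FPS here, or a dip below 40) does the display clamp or repeat frames.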
And it also has to do with lag somehow? That's also a big deal; there are different standards, with people doing it different ways, for how the GPU is synchronized with the display.

Yeah, the HDMI specification doesn't really get into the lag elements of that. That's sort of a separate issue that the TV vendors, and some of the other variable refresh rate ecosystems like G-Sync and FreeSync, get more into.

So that's not part of the spec?

No.

So again, it's compatible with different approaches?

Yeah, the HDMI spec is absolutely compatible with having reduced lag, but the spec itself doesn't really talk about that explicitly.

When you talk about quick media switching, that's for when you watch a movie at 24 frames per second or something like that?

Yeah. Often when you're streaming, you're going to be flipping between different clips, on say YouTube or wherever, that are all at natively different frame rates: 24 FPS, 30 FPS, 60 FPS, 50 FPS. With quick media switching, QMS, you're able to switch between those very quickly. It's basically an extension of VRR, another mode of VRR within the HDMI specification. It allows the transport across the HDMI cable to always be exactly the frame rate of that media. And that allows your display to go to that native frame rate; TVs often have very good quality frame rate conversion. Or some TVs can simply display it at the intended rate or some multiple of it. Some TVs will multiply it by two or by three or by five. So with 24, they'll multiply it by five and display it at 120 (see the sketch below).

Does it do anything about switching modes? TVs sometimes have a movie mode and a game mode, different ways of showing things.

That's more what we call ALLM. Auto low latency mode is just a signal with which the source device can tell the TV whether you're playing game content that wants low latency, or, say, video content. Some TVs can have options to move between their game modes in terms of image quality as well; that's not mandated by the spec. What's mandated by the spec is that when the source device says it's gaming, the TV should move into its game mode, which is lower latency than the movie mode. Often those different latencies are associated with the image processing within the TV itself. When it's showing movies, the TV spends a little bit more time on image processing, improving the quality of the picture, whereas in game mode it spends less time on image processing, because the focus is on getting the image onto the screen as fast as possible for the gamer.

So a lot of this stuff has to do with metadata, right? It's not the 48 gigabit stuff; it's small data. It's small, but the industry has to agree on how to format this metadata.

Yeah, and that's definitely within the HDMI specification. We have a bunch of metadata specifications. Some of them are very specific, like ALLM and VRR and QMS: this is exactly how you transport that metadata. Some of them are a little more flexible, like I mentioned around HDR. There we really provide an infrastructure, and then within that infrastructure different HDR standards can define how to transport their metadata packets.
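Here is the QMS-style sketch referenced above: a hypothetical TV picking a native panel rate that is an exact multiple of the content rate. The panel rate list and the "prefer the highest multiple" policy are assumptions for illustration; real TVs apply their own vendor logic:

```python
# Hypothetical TV-side logic: given the content frame rate signalled over
# the link, pick a supported panel rate that is an exact multiple of it.

PANEL_RATES_HZ = [24, 30, 48, 50, 60, 100, 120]  # assumed panel capabilities

def pick_panel_rate(content_fps: float) -> int | None:
    # Prefer the highest exact multiple, e.g. 24 fps -> 120 Hz (5x).
    for rate in sorted(PANEL_RATES_HZ, reverse=True):
        multiple = rate / content_fps
        if multiple >= 1 and abs(multiple - round(multiple)) < 1e-9:
            return rate
    return None  # no exact multiple: fall back to frame rate conversion

for fps in (24, 25, 30, 50, 60):
    print(f"{fps} fps content -> panel at {pick_panel_rate(fps)} Hz")
```

Because QMS keeps the link at the media's native rate, the TV gets to make this choice without the blanking-and-resync delay of a full mode switch.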
And part of CES 2020 has to do with a new cable. How big a role did the forum play in this?

Well, this is the first cable that the HDMI Forum has defined since the forum was created in 2011. The first version of HDMI that we worked on was 2.0, and in that case we used the existing High Speed cable that had been defined back in the earlier HDMI specifications; we increased what we ran over the High Speed cable from 10.2 gigabits per second up to 18 gigabits per second. And that's what we've been running with 2.0 for the last five years. But with the demands we had for 4K 120 and 8K and beyond, we needed to move up to 48 gigabits, and that was not possible with the High Speed cable. So we put a lot of effort over the last few years into defining the Ultra High Speed cable, which we're very proud of. It's a cable that has been designed from the ground up to not only provide very high bandwidth but also lower EMI, so we've worked carefully with the international EMI standards. EMI is electromagnetic interference: basically, making sure the cable doesn't radiate energy that can interfere with your Wi-Fi or your Bluetooth or your cellular network signals. All of the cables being released as Ultra High Speed, every single cable design, will have to go through a mandatory certification program at one of our authorized test centers. Then it'll have a logo and label put on it that lets the consumer see very clearly that this is a certified cable. Anyone can download an app onto their Android or iOS phone that lets them very easily scan the label on the product in the store and confirm that it is, in fact, a certified cable. And part of that certification is the EMI testing itself: the cable goes into a very specialized test chamber where we verify that the radiation coming out of it is well within the international standards.

Because you don't want to have the new 8K cables interfering with your cell phone, and Wi-Fi and everything.

Yeah. And in fact, if you don't already have a premium certified cable, it may be worthwhile to get the new cable if you're having wireless interference problems, because not all of the older HDMI cables on the market went through EMI testing. So even if you're playing games at 1080p, or watching 4K 60 on sources and sinks that are HDMI 2.0, the new cable may help you solve some of those EMI issues, because it has been EMI certified.

So a bad cable can interfere with all that stuff?

Well, sometimes. A bad cable can be the issue, yes.

Is there a way to measure that somehow?

Measuring EMI is not easy; it takes expensive equipment. For users at home, I think the easiest way is to sort of mess around with putting your cell phone near the cable and seeing how many bars you get, things like that. But it's not a very exact science.

There's not an app for that?

I'm not aware of an app for it, although maybe there's an app for everything and I could be wrong.

So, is there a price to be a member of the forum, or a price to get this certification?

Well, certainly for forum members there is a price. That's all public information on the HDMI Forum website. We're always looking for new member companies to join. Basically, you have to agree to work in an open standards organization under very typical standards organization rules, and there's an annual fee of 15,000 US dollars to be a member of the forum.

So it's often mainly the big companies, but they could be smaller companies. And if a company is not part of the forum, they just follow everything you do?

Yeah, so there's also a thing called an HDMI Adopter. Most companies in the world are HDMI Adopters.
They work directly with the HDMI Licensing Authority to become an Adopter. An Adopter is a company that is making an HDMI-related product. To be part of the forum, though, is to be part of defining the future standards rather than just building products. So we have bigger and smaller companies within the forum, but they're all focused on the development of future standards, whereas companies that are really just focused on building products with the existing standards will become just HDMI Adopters.

And that helps fund the whole organization and the way you do things, right?

Yeah, we're a self-funded organization, through our membership fees and other product fees.

And you have a forum where people actually meet face to face?

Yeah, we do meet face to face several times a year.

It's not just emails.

No, there's a lot of email, there are a lot of teleconferences, and there are face-to-face meetings throughout the year.

And I would guess that every company has its top experts in this field there. They really know what they want and how to organize the whole thing.

Yeah, we do have a lot of experts in the forum. Many of them have over 20 years of experience with HDMI, including myself. Some have been there since the very first days of HDMI being defined.

The chipset companies, the TV companies.

Yeah, as I said, we have TV companies, we have monitor companies, we have source device companies, we have chipset vendors, we have cable vendors, we have test equipment manufacturers, as well as studios and content distributors.

All these streaming sites and so on, they probably want to be in there too, maybe.

Well, we have a few streaming companies, yes. And a few studio companies.

And the Hollywoods and all this stuff.

We have a few Hollywood companies, yes.

And also the Bollywoods and the Chinese companies and everything.

Yeah. We're certainly interested in having active membership from throughout the industry, around the world.

All right. So do you feel like HDMI, which is so huge now, is going to be much bigger?

Oh yeah, there's no slowing down of HDMI at this point. Certainly with the 2.1 spec we've got everything the industry needs for the next several years at least. We have 8K 60, as we talked about. We even have 8K 120 with compression. And that's certainly going to take a while for the industry to catch up with.

And this gives the chip designers a nice job, to fill up this 8K 60 and 8K 120 with amazing content.

Yes. Content, you know, 8K content on the video side, is taking a bit of time to develop. But we're having the Olympics this year, which will be done in 8K, and that's going to create a lot of ecosystem infrastructure for creating video content.

There are going to be cameras coming.

Yeah, there are going to be cameras, and there's going to be all of the studio and truck infrastructure associated with that. But it's not just about video, it's also about gaming. Both Microsoft and Sony have announced that they're going to have 8K support in their next-gen consoles. So that'll also be interesting to see.

And I think YouTube has supported 8K for a while. But when you look at the new codecs, we can compress 8K into something like 100 megabits. That's much less than 48 gigabits. So all this compression doesn't really matter, right? The video needs to be uncompressed, and there's a reason why it's 48 gigabits. Or does that make any sense, what I'm asking?
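For a rough sense of the scale gap in that question, using the round figures from the conversation (actual streaming bitrates vary widely by codec, content, and service):

```python
# Rough scale of codec compression vs. the uncompressed link, using the
# round numbers from the conversation; real bitrates vary considerably.
streamed_bps = 100e6      # ~100 Mbps compressed 8K stream (HEVC/AV1-class)
uncompressed_bps = 40e9   # ~40 Gbps for 8K 60 HDR10 over the link

print(f"Compression ratio: about {uncompressed_bps / streamed_bps:.0f}:1")
# -> about 400:1 for the streaming codec, versus roughly 3:1 for DSC,
#    which is designed to be visually lossless and cheap to decode.
```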
Well, typically what happens with video streaming, for example, is that it gets decompressed in a source device, like a game console or a media box, a Fire Stick or so forth. And then it often gets mixed with a menu, so you'll have menu items for choosing what your next stream is and all that sort of stuff, and the video often gets put into a window. Sometimes you can have something like 20 different video clips together in your menu, right? All of those get decoded separately on the source device, scaled and composed, and then the final image is shipped over the HDMI interface to the display.

And that makes it sensible to have huge bandwidth in that last part.

Yes, because of the uncompressed nature. It doesn't really make sense to recompress it again and then decompress it again in the TV. Although, as we said, we do support the compression standard DSC in HDMI. And that actually enables you to work even at 4K, for example: you could use DSC with a lower quality cable, without having to actually upgrade your cable if you don't have the ability to do so.

So if the 48 gigabits is something like 4X more bandwidth, does that mean 4X more power consumption too?

Well, 48 gigabits is about 2.6X the 18 gigabits we had in 2.0. In terms of power consumption, that's not really determined by the specification; that's really up to the silicon vendors.

But it does mean more power, even if it's 2.6X rather than 4X the bandwidth?

Yeah, so in terms of power consumption, that's really not part of our standard. Going to 2X or 2.6X the bandwidth will take a little bit more power, but exactly how much more power is really up to the silicon designers to optimize for.

Because we also want to have a green planet, without too much power consumption and stuff.

Absolutely. But it's also important to remember that we don't always run at 48 gigabits. We'll run the link at the rate that's suitable for the mode you're trying to do. So if you have a source and a sink that don't need the 48 gigabits, if they only need, say, 20 gigabits, then we're going to train the link to 20 gigabits, or maybe even less. And that's going to help reduce the power consumption as well, because we're not making the link run at full speed all the time. It also means you could potentially operate even with a legacy cable that's not a UHS cable, because an older cable that doesn't support the 48 gigabits could be used with a lower link rate.

And without getting too much into detail, there are many different kinds of chips that know how to encode and decode this stuff, right? Maybe some of the chips are including this on the SoC more and more, and all kinds of architectures can do it: x86, ARM, DSP, FPGA, ASICs? How does it work?

We're seeing HDMI adopted into silicon in many different types of devices, right through the ecosystem: handheld devices, camera devices, tablets, PCs, media sticks, automotive. So it's a widely adopted standard, and it's a technology that can be integrated into a lot of different silicon technologies. You don't need a very, very advanced, say seven nanometer, process to implement HDMI. You can do it in the technologies that are typical of lower cost, more consumer focused devices, rather than the bleeding edge, highest end, say, gaming device.
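A minimal sketch of the "train the link to the rate you need" idea from that answer, assuming the six FRL link rates HDMI 2.1 defines (3 or 4 lanes at several per-lane speeds, giving 9, 18, 24, 32, 40, and 48 gigabits per second). Real link training also probes the actual cable and steps down if a rate proves unreliable, which this toy version ignores:

```python
# Toy version of link-rate selection: pick the lowest FRL rate that covers
# the bandwidth a given video mode needs. Real HDMI 2.1 link training also
# tests the actual cable and falls back if a rate is not reliable.

FRL_RATES_GBPS = [9, 18, 24, 32, 40, 48]  # lanes x per-lane speed in 2.1

def train_link(required_gbps: float) -> int | None:
    for rate in FRL_RATES_GBPS:
        if rate >= required_gbps:
            return rate  # lowest sufficient rate saves power
    return None  # mode does not fit: needs DSC or a lesser format

print(train_link(20))  # -> 24: the ~20 Gbps example from the conversation
print(train_link(40))  # -> 40: e.g. 8K 60 HDR10
print(train_link(60))  # -> None: e.g. raw 8K 120 needs DSC first
```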
And can they do it on the SoC, or do they often need a dedicated chip next to the connector?

That's really up to the system designer and the silicon designer, but yes, it can be integrated into SoCs, absolutely.

All right, thanks a lot. So you're going to be busy here at CES 2020, right?

CES is always busy, yeah.

So what's going to keep you busy? Are you going to talk with all 84 members, one by one, going to all the booths?

I'm going to go through all the booths today. Some of them are forum members, some are not, but I'm always interested to see the products that are coming out, and I think there are going to be a lot of 2.1 products in the booths this year. The show hasn't quite started yet, but I'm looking forward to checking it out today.

How big a part of the CES story do you think 2.1 is going to be? Because everybody who wrote articles before the show is mentioning it, right?

Yeah, that's been great. The reaction we've had has been great. We hope it continues to be a good reaction, and we're very excited about the feedback and the support we're getting for HDMI 2.1 at CES.