Hi. Please introduce yourself. I'm Olivier Crête, the multimedia lead here at Collabora. I can show you one product that we've contributed to. It's a MediaTek-based Chromebook, and we've done quite a bit of work around bringing upstream Linux support for the various IP blocks found in ARM devices.

Is it one of the latest Kompanio chipsets? I think so, yeah. One of the latest MediaTek ones, yes. And is it just a regular Chromebook? This is a regular Chromebook, off the shelf; we just went to a store and bought it. But we've been working on it. The ones we actually work on we obviously cannot bring.

Is it running Linux? It's ChromeOS, right? Just ChromeOS, yes. This is ChromeOS.

Can you explain which part of it you work on? We've done work on, for example, the codec hardware, but also other bits of hardware that we've upstreamed to the Linux kernel. And we work with Google on having these devices in the KernelCI project, which does continuous integration on the Linux kernel itself. Whenever someone submits a patch to the upstream Linux kernel, we build it, run it on hundreds of devices, and report to the maintainers what works and what doesn't. Especially what doesn't.

This work is useful when people want to run Linux on these too, right? Not just ChromeOS? Not just ChromeOS, right. A lot of what we do is bring the drivers to upstream Linux, so you don't need a special kernel; you can just use a standard Linux kernel. Which is useful for someone like Google, because they have eight-year support on these things, so they want to upgrade the kernel. They just don't want to ship an eight-year-old kernel.

What's happening behind you there on the screen? Well, this is our LAVA lab. What we're showing is all these boards. Yes, we have continuous integration across all these boards.
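The KernelCI-style flow described here, building each submitted patch, booting it on many boards, and reporting per-board results to the maintainers, can be sketched as a toy model. All the board names, function names, and pass/fail logic below are invented for illustration; the real system builds actual kernels and boots physical hardware in labs like the one shown.

```python
# Illustrative toy model of a KernelCI-style loop. Board names and the
# build/boot steps are invented; the real system builds real kernels
# and boots physical hardware over the network.

def build_kernel(revision, arch):
    """Pretend to build a kernel image for one architecture."""
    return f"Image-{revision}-{arch}"

def boot_and_test(board, image):
    """Pretend to boot a board with the image and run a test suite."""
    # A real lab boots the device and watches its serial console.
    passed = "broken" not in board
    return {"board": board, "image": image, "pass": passed}

def report(results):
    """Summarise for the maintainers: especially what doesn't work."""
    return [r["board"] for r in results if not r["pass"]]

# Hypothetical device list: board name -> architecture.
boards = {"rk3588-rock5b": "arm64", "mt8195-cherry": "arm64",
          "broken-board": "arm"}

image_cache = {}
results = []
for board, arch in boards.items():
    # Build once per architecture, reuse the image across boards.
    image = image_cache.setdefault(arch, build_kernel("v6.9-rc1", arch))
    results.append(boot_and_test(board, image))

failures = report(results)  # -> ["broken-board"]
```

The point of the shared image cache is the economy Olivier mentions: one build per architecture can serve hundreds of boards, so the expensive part is the boot-and-test fan-out, not the compilation.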
So the developer writes some code, and then our system builds it for all of these boards, creates a system image, boots each board with the image, and gathers the results of all the tests it runs into a UI where the developer can see them. That makes things easy if you have a complex system that you might run on many generations of boards and many different types of hardware. The developer probably doesn't have ten boards on their desk, so this gives you more confidence in what you build.

LAVA has been developed for years, right? Yes. It's open, and everybody can deploy it the way they want. And you have a special way to take it to the next level? We actually contribute to the open-source projects directly, so it's not like we have our own special flavor; we're part of the community. But one of the services we offer is hosting a LAVA lab where we can take people's devices, put them there, and run both KernelCI and Mesa CI on them.

All right. What else can we film here at the booth? The next demo my co-worker Marcus will show. It's an AI demo.

What's the latest happening with Panfrost? What are you doing here? What we show here is a machine-learning demo that runs on a completely open-source stack, including the operating system. The important part is that we use an open-source graphics driver. The demo is about video compression, focused on web video conferencing. At the beginning of the call, we take one image of the sender and send it over to the receiving side. After that, we just extract key points from every frame, and we use those key points together with that first image to reconstruct the face. That way we are actually able to reduce the bandwidth ten times compared to H.264, which makes sense because we only transfer the key points for each frame instead of the whole video stream.

Is this working, or just a prototype?
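The compression scheme just described, one full reference image followed by per-frame key points, can be sketched as below. The extractor and reconstructor are stand-ins; the real demo uses learned neural-network models for both, in the spirit of first-order motion models.

```python
# Toy sketch of keypoint-based video compression. The extractor and
# reconstructor are stand-ins for learned models; the frame format and
# landmark values are invented for illustration.

def extract_keypoints(frame):
    # Stand-in: a real model predicts a small set of 2-D facial
    # keypoints per frame; only these go over the network.
    return list(frame["landmarks"])

def reconstruct(reference_frame, keypoints):
    # Stand-in: a real generator warps the reference image so it
    # matches the motion implied by the keypoints.
    return {"base": reference_frame["id"], "pose": keypoints}

# Sender side: one full reference image, then only keypoints per frame.
reference = {"id": "frame-0", "landmarks": [(10, 20), (30, 40)]}
stream = [{"landmarks": [(11, 21), (31, 41)]},
          {"landmarks": [(12, 22), (32, 42)]}]
packets = [extract_keypoints(f) for f in stream]

# Receiver side: rebuild each frame from the reference plus keypoints.
decoded = [reconstruct(reference, kp) for kp in packets]
```

Each packet carries only a handful of coordinate pairs instead of an encoded video frame, which is where the claimed order-of-magnitude bandwidth saving comes from.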
It's more of a prototype than a full-featured product, but it would be easy to take it and integrate it into existing video conferencing software.

Because in the last three years, I don't know what happened, there was something weird where people had to stay at home, and everyone was doing video calls. Exactly. There's been a lot of talk about it; there have been enormous amounts of video conferencing bandwidth out there, and you could save 90% of it. This complete demo is already open source: you can just download it, flash it to your embedded board or PC, and use it. The next step would be to actually integrate it into existing video conferencing software like Zoom or Microsoft Teams.

You could potentially have a Rockchip-powered 4K webcam, and if the other person also has one, or even if just their software supports it, they could suddenly see your 4K image with ten times less bitrate. Yes, up to ten times lower bandwidth. So you could do 4K video conferencing with about one megabit? Yes, you could. You would probably need slightly more capable hardware, because this one is very resource-constrained, but it would be possible to run it in 4K.

I did a video with NVIDIA once, and we talked about all kinds of optimizations they were thinking of doing; I forgot what it's called. But I guess you like to do everything in open source so everybody can use it. The question is, how soon are people going to use this in products? We hope pretty soon. As I said, the next step would be to actually integrate it, and since this is open source we can optimize every single bit. It runs on an open-source graphics driver, which allows us to optimize the whole machine-learning pipeline to run in real time and also support 4K, for example. That is often not possible even on NVIDIA hardware like the NVIDIA Jetson.

How good is Panfrost?
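The bitrate claim is easy to sanity-check. Assuming a typical H.264 4K conferencing bitrate of around 10 Mbit/s (an assumed baseline; real figures vary by encoder, frame rate, and content), a tenfold reduction lands at roughly one megabit per second:

```python
# Back-of-the-envelope check of the "4K with about one megabit" claim.
# The 10 Mbit/s H.264 baseline is an assumption for illustration.

h264_4k_mbps = 10.0     # assumed typical H.264 4K conferencing bitrate
reduction_factor = 10   # "up to 10 times lower bandwidth"

keypoint_mbps = h264_4k_mbps / reduction_factor
assert keypoint_mbps == 1.0  # ~1 Mbit/s, matching the claim in the interview
```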
Is it fully working? Everything is perfect? It's really, really good and works out of the box. Really good out of the box? Yes. All the Mali GPUs? Not all the Mali GPUs; some are better supported than others. Do the newest ones get great Panfrost support, or is it the oldest ones that have problems? It's more that the older chips are a bit better supported and more optimized, I would say. It's always a combination of software and hardware, right? The latest GPU, they call it Immortalis, and it will take a while before Panfrost is good for that? So ARM hasn't open-sourced the GPU driver yet? No. They don't want to? I don't know if I can share too much about this, but there are definitely discussions in that direction. Awesome.

What's happening behind you? What else can we talk about here? Maybe we can switch over to them.

All right. Hi. Hello. Introduce yourself. My name is Laurent Pinchart. I'm the CEO of Ideas on Board. We are a Linux consulting company specialized in multimedia and camera support. So what does Ideas on Board do? On which board? Lots of boards, actually. Our goal is to enable camera support for all of our customers: we make your cameras work for you. Anyone who needs to capture images to create a product, we enable that. We don't care about the end-user application, so we support customers in the automotive, industrial, and medical markets; anywhere you need to support cameras, we can be there. We work with SoC vendors, and we support SoCs from Rockchip, NXP, ST, TI, Xilinx, all across the board.
And we also work with the OEMs who integrate those SoCs in their own products, to enable them to use them as efficiently as possible.

When an SoC has camera support built in, isn't it just there? What else do you need to do to make it great? Unfortunately, many things, because vendors usually ship a BSP with a custom solution. They have a camera stack that is vendor-specific, so for OEMs, switching from one vendor to another is very difficult. Even integrating support for another camera sensor on the same SoC is very difficult: you need to go through a whole tuning process that involves closed-source tools and usually means working with selected key partners of the vendor, which can be a long and costly process. So we are creating a completely open ecosystem, with an open-source camera stack, that solves those problems. We focus on interoperability between different SoCs and different camera sensors, so we can make any camera sensor work on any platform. We also work on creating tuning tools that enable our customers to do the tuning themselves at a lower cost, leveraging all the shared knowledge.

Just behind you, I saw Libre Computer. Are you working with these guys? Are they your partners? We are partners of Collabora; they are hosting us on their booth. And Libre Computer is indeed a company that Collabora works with; we have a working camera solution on some of the Libre Computer boards. And we partner with lots of SBC vendors.

It's nice when I see the AI demo that makes the video compression amazing. Are you also a part of doing that? That's a Collabora project; that's a really cool one. That's actually a demo you can see right there. Yes, it's detecting the face; that's the same one. So you have face detection, reconstruct the face, and send just the face over the network, reducing the bandwidth.
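As one concrete, simplified example of the kind of parameter the tuning process produces: auto white balance gains. The gray-world method below is a textbook baseline, not the algorithm any particular vendor tool or open camera stack actually uses; it only illustrates what "tuning the sensor for a platform" computes.

```python
# Gray-world auto white balance: scale the red and blue channels so all
# three channel means match the green mean. A textbook baseline for
# illustration, not any specific vendor's tuning algorithm.

def gray_world_gains(pixels):
    """pixels: list of (r, g, b) tuples; returns (red_gain, blue_gain)."""
    n = len(pixels)
    r_mean = sum(p[0] for p in pixels) / n
    g_mean = sum(p[1] for p in pixels) / n
    b_mean = sum(p[2] for p in pixels) / n
    return g_mean / r_mean, g_mean / b_mean

# A toy image with a reddish cast: red reads high, blue reads low.
pixels = [(200, 100, 50), (180, 90, 45)]
r_gain, b_gain = gray_world_gains(pixels)
# r_gain < 1 pulls red down; b_gain > 1 lifts blue.
```

In a real stack these gains would be computed per frame (or looked up from per-sensor tuning data) and applied by the ISP hardware, which is exactly the per-sensor work that is expensive when the tools are closed.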
So they created this AI demo, and we worked with them to replace the USB webcam they were using with a RAW sensor. Now you can use the same demo on any kind of SoC with any kind of RAW sensor.

Can you describe one of the really cool SoCs? They are great, but the imaging chip is always kind of separate, right, and it has to talk with the SoC. What's the language? There are SoCs that lack camera support inside, so you have to use an external ISP, which is usually a chip that sits between your RAW camera sensor and your SoC. That makes a full solution more costly to integrate, but we do support some of them. We have worked, for instance, with onsemi on some of their external ISPs, developing drivers for them. Those are solutions we support as well. So whether the SoC you use integrates an ISP or relies on an external one, that is something we can work with.

Many of the cool ones have it integrated directly. But not necessarily only the cool ones; there are lots of interesting SoCs that don't necessarily focus on camera support and have lots of other interesting features without an integrated ISP. But otherwise, yes, we're more interested in the ones that do integrate an ISP, because that usually gives a more integrated user experience.

Is this that stuff here? Yes, those are samples of boards and systems that we support, just a small subset. What are we looking at here? The DEBIX board is based on an NXP i.MX 8M Plus; that's actually the first SoC in the NXP product line that integrates an ISP. It was a bit of a game-changer for them. We also have camera modules from Arducam, OKdo, and Raspberry Pi. This is a cool device, actually: it's a Raspberry Pi Zero under the hood, with a camera sensor from Raspberry Pi as well, the latest camera module, and it's used as a USB webcam. So it runs Linux inside, with full open-source support. In there? Yes, in there.
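To illustrate one of the jobs an ISP, integrated or external, performs on RAW sensor data: demosaicing the Bayer pattern into full-colour pixels. The nearest-neighbour sketch below, over an assumed RGGB layout, is the simplest possible illustration; real ISPs interpolate per pixel with far more sophisticated filters.

```python
# Nearest-neighbour demosaic of an RGGB Bayer mosaic: each 2x2 block of
# raw samples (R, G / G, B) becomes one full-colour pixel, halving the
# resolution. The simplest possible illustration of what an ISP does;
# real hardware interpolates per pixel.

def demosaic_rggb(raw):
    """raw: 2-D list of sensor values with even width/height, RGGB layout."""
    out = []
    for y in range(0, len(raw), 2):
        row = []
        for x in range(0, len(raw[0]), 2):
            r = raw[y][x]
            g = (raw[y][x + 1] + raw[y + 1][x]) / 2  # average both greens
            b = raw[y + 1][x + 1]
            row.append((r, g, b))
        out.append(row)
    return out

# A single RGGB tile: R=100, G=80, G=60, B=40 -> one (R, G, B) pixel.
pixel = demosaic_rggb([[100, 80], [60, 40]])
```

This is why a RAW sensor needs an ISP (or a software equivalent) before any application can use the image: the sensor itself only delivers one colour sample per photosite.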
That's the same one you have here. There. So that's the Raspberry Pi Zero inside. You run Linux on it, you run libcamera on it, you connect it through USB to your computer, and it's recognized as a webcam.

What's the best platform? If I want to do 4K video conferencing, what should I use? If you go for higher resolutions and higher performance, we are looking at the moment at new SoCs, such as on the ROCK 5 board, for instance, distributed by OKdo. That's based on a new Rockchip SoC that's very interesting. We're just starting development on it, so it's not something we fully support yet. But among the existing supported platforms, the i.MX 8M Plus is actually an interesting one; it's very versatile. I wouldn't say it's necessarily the best one for video conferencing as such, but for lots of industrial use cases, including higher resolutions, we can support lots of applications on it.

What's cool about the embedded world is that when you support a platform very well, it's going to stay for a decade. It stays for years and years, and it's going to be useful for potentially millions of products. Yes, it does. And that's also somewhere I think we have added value. If you look over here, this is a tablet developed by Google; it's a ChromeOS product. They offer seven years of support and software updates on all ChromeOS devices, and this one is reaching end of life very soon. Because we have an open-source camera stack on it, users can extend its lifetime even beyond that: you can use the mainline kernel, you can use libcamera, you can use open-source software, and you have support even beyond what the vendor provides. Seven years is already quite long in that kind of market, so that's good support from ChromeOS, but you can go even longer than that because of the open-source ecosystem.
Nice, that means these Chromebooks get a bunch of updates somehow. Yes: seven years from Google, and after that it depends on what you use.

Cool, that's awesome. Okay, so that's a good booth here; lots of action happening. Where are you based? I'm based in Finland personally, originally from Belgium; I've been based in Helsinki. And here at Collabora, a lot of French people? They have a big office in Montreal, so certainly French-speaking, but not only. At Ideas on Board we have seven people, already distributed over five different countries. Collabora is much bigger, with around 140 developers, so they definitely span the whole globe.

And working in open source, is it possible to make money? Absolutely, yes. We're not running out of work; we have lots of requests from customers. We do not sell the software itself, but we sell consulting services. We work with SoC vendors to enable their products in this whole ecosystem, and we also work with OEMs who want to integrate them in their products. Any kind of support they need, that's where we come in. And all the cameras, all those drivers, are indeed released as open source, so everybody can benefit from that. The other way around, we benefit from the work of other developers; that's also how we can be competitive.

Is it possible that somebody will make really cool devices? I don't know what kind of sensor size is in here, but could you have, say, a one-inch sensor, Micro Four Thirds, or APS-C, and just have amazing video conferencing or something like that? Yes, absolutely, you could go for really, really large sensors. If we're looking at extremes, sensors that have resolutions of, I don't know, 400 megapixels for instance, you will need custom hardware to go with that, because it's such a niche use case that you won't find off-the-shelf SoCs that offer all the features you need.
But apart from that, you can go high-resolution. This one is relatively large, but you can certainly go larger than that. And you support it? Yes, we can definitely support it. You can work with anything people want? Yes. The biggest problem we actually have, from an integration point of view, is connecting the sensor to the hardware. There's no standardization when it comes to those connections: all those ribbon cables you see have different pin-outs and different numbers of pins, and that limits our customers when they want to connect a camera that wasn't designed for a given board. So there's always some adaptation you have to do, and having standardization in that field would be great; the lack of it is hindering development a little bit. But from a software point of view, we can really support anything. As long as we can establish a relationship with the vendors and get documentation to be able to develop drivers, we can certainly support everything.

All right. Thanks a lot. You're welcome. Thank you.