Let's start again with our next presentation, the last one on GStreamer, this time about WPE WebKit. Welcome, Philippe Normand. Thanks for coming.

So this is once more about integrating WebKit and GStreamer, but this time in the other direction. Who am I? I'm involved in GStreamer and WebKit; I've been contributing to both projects for a few years now. I work for a company called Igalia, and we're around 80 people working together at this point.

WPE, the basics. I won't go into the details, but what I can say is that it's a web engine based on WebKit, and it was really designed for embedded use cases, to be used on embedded devices. It has a six-month release cycle, like WebKitGTK, which is a real advantage because you get regular security updates from the upstream WebKit project. And it has no dependency on any UI toolkit: it's really a bare web engine meant for embedded use.

So if there is no UI toolkit, how do we do the rendering? We have a sort of plugin system that we call view backends. There are backends for various graphics drivers: one for Wayland, an experimental one for Android, and backends for vendor drivers, Broadcom for example.

I'm going to talk about one specific backend, called FDO. Why is it called FDO? Because it relies on several libraries from the freedesktop.org project. It depends on the Wayland and EGL APIs, and it supports cross-process buffer sharing. There's an API for EGL images and Wayland resources, and also Linux DMA-BUF support; for now that part is internal-only, but we plan to expose it at the API level. It's used in combination with Mesa, and it works both on desktop and embedded. That's what I did, in fact:
I developed mostly on desktop and validated on embedded afterwards.

If you attended the previous talks, I think you already know about GStreamer, so I won't spend much time on this part. It's a multimedia framework for developing multimedia applications. I took this image from the documentation: an example of a pipeline representing a video player, as you will also see in this talk.

So this talk is about HTML overlays. What are the use cases for that? There are quite a few, but I'm just going to mention one: for instance, when you broadcast a live stream, like the one for this conference, maybe you want notifications or background overlays. You could do those in HTML. You can also display banners and animations using CSS; there's a lot you can do with that.

So the project I developed is basically GstWPE, a plugin providing a source element. The requirement is that you need GL support in the pipeline. The FDO backend is used internally by the source element. A WebView is created by the source element, and it loads the page that you configure through the location property of the source element. Internally you get EGL images from the view backend, and those EGL images are wrapped in GstEGLImages, so there's no copy there. It's quite nice because everything happens in the GPU, so it performs quite well. Those GstEGLImages are then pushed downstream towards the sink: the video sink, or other elements.

So that's the most basic example here: basically a web browser using a GstWPE pipeline. The video frames generated by the source element are pushed to the video sink.
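The basic pipeline just described can be sketched as a gst-launch-style description. The element and property names (`wpesrc`, `location`, `glimagesink`) are the ones from the GstWPE plugin discussed in the talk; the helper below only assembles the launch string, so it runs anywhere, but actually launching the printed command requires GStreamer with gst-plugins-bad and working GL/Wayland support.

```python
def wpe_pipeline(url, sink="glimagesink"):
    """Build a gst-launch-1.0 description for the basic GstWPE case.

    wpesrc renders the web page into EGL images that stay in the GPU
    all the way down to the GL-capable video sink: no copy, no
    download to system memory.
    """
    return f'wpesrc location="{url}" ! queue ! {sink}'


# The "web browser as a pipeline" example from the talk:
print("gst-launch-1.0 -v " + wpe_pipeline("https://gstreamer.freedesktop.org"))
```

Note that the sink must be GL-capable (glimagesink here), since the frames are GstEGLImages rather than system-memory buffers.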
And the input events coming from the video sink, like when you scroll or use the mouse, are forwarded upstream to the source element, and from there back to WebKit, so you can scroll the page and it will have an effect on the video frames.

There's one limitation there: right now the source element pushes only video frames, so there's no audio support yet. I'll come back to that later.

A more complex example involves video mixing. On the left you have the source elements: two sources, a media source that can use any kind of media you want to provide to the pipeline, and the WPE source. The media source gets decoded and goes to a video mixer, glvideomixer, so everything happens in the GPU; there's no download to system memory. The video frames are composed together in glvideomixer and output to the video sink.

I have a demo here, but I don't know if I have time. OK. If you want to see the demo, you can scan that code; it's also available on YouTube. Basically, what you see in that demo is first a pipeline showing a web page side by side with a normal video, the Sintel video I think. So they're composed together but shown side by side. The second part is notifications provided by an HTML page with a transparent background, and you can see the notifications being overlaid on top of the video.

So there are advantages and disadvantages to using GstWPE compared to other approaches, which could be based on Chromium, for instance. The advantages are that WPE is really designed for application embedding and use in embedded devices, and it does everything in the GPU, so performance is really, really good there, if you have good graphics drivers of course. And WPE is known to work on really small devices, down to 256 megabytes of RAM. But there are some disadvantages, as I said before.
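The mixing example can be sketched the same way. Again the helper only builds the launch string; `glvideomixer`, `glupload`, and `glcolorconvert` are standard GStreamer GL elements, and `draw-background=0` assumes the transparent-background support for the WebView that the talk mentions, so treat the exact property as illustrative.

```python
def wpe_overlay_pipeline(page, media="videotestsrc"):
    """Build a gst-launch-1.0 description composing an HTML overlay
    over another video source.

    Both branches feed glvideomixer, so the composition happens
    entirely in the GPU. draw-background=0 asks wpesrc to render the
    page with a transparent background, so only the HTML content
    (notifications, banners) is overlaid on the video.
    """
    return (
        "glvideomixer name=mix ! glimagesink "
        f'wpesrc location="{page}" draw-background=0 ! mix. '
        f"{media} ! glupload ! glcolorconvert ! mix."
    )


print("gst-launch-1.0 " + wpe_overlay_pipeline("file:///tmp/notifications.html"))
```

The file path here is just a placeholder; any URL serving an HTML page with a transparent background would do.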
The audio buffers are not rendered yet, so I plan to work on that at some point, maybe. And input-event support is limited: the input-event API was designed mainly for DVD at the beginning, so that API would need to be redesigned a bit, I think. Then there's the dependency on Wayland and EGL because of the FDO backend, so right now this works only on Linux.

What I've been working on, after upstreaming that source element, is transparent background support for the WebView. It's not merged yet, it's still in review, but I used it to produce the demo that you can see on YouTube. I also plan to work on audio support; that will take a few steps, as you can see. Maybe also, in WPE itself, try to improve the input-event API. And there's an idea about communicating between the pipeline and the WPE source element, so that you could have communication there for specific applications, maybe for video players that want to do some custom input-event handling. And perhaps also support more platforms.

So this is upstream in gst-plugins-bad. It's not bad; it's just the repository where new plugins usually go in GStreamer. You can see another demo there if you want. And yeah, that's it. Thank you.

Thank you. We have time for one question.