Okay, I think I know quite a few of you. I'm Astor Nummelin Carlberg, the Executive Director of OpenForum Europe, which has been active in Brussels for 20 years working with open technologies and public policy in their different forms; wherever they intersect, we try to be there. Today we're going to have a discussion on different perspectives on the Cyber Resilience Act and the progression of the negotiations: its potential effects on free and open source software, but perhaps more importantly, what those effects could mean for the European economy and society. Since we're already five minutes past nine, let me give the podium to José Carlos, the cyber attaché of Spain, to open this event, and then we'll smoothly transition into the panel. Thank you for coming, José Carlos.

Good morning everyone. Thank you for inviting me to this event, and for chasing me to be present today, as I had some agenda problems from the beginning. In fact, I apologize beforehand, because after this intervention I will have to run to another meeting. I'm happy to see so many nice people I already know working on this. I won't talk about the importance of open source software; that's your side of it. I'm convinced of its importance, and I don't care much about the numbers, whether it is 80% or whatever. I'm sure that we need to keep it as safe as possible from side effects of legislation; that's my personal view. From an official view, you invited me to talk about the side effects of the CRA on open source. I just came to send a message that we are working in the Council with all the stakeholders that have been involved in this proposal from the beginning, the Commission also. And I know very well that the Parliament is on that as well.
And you should rest assured that no harmful side effects will come out of legislation like the CRA. Benjamin also knows very well that we have had some fruitful debates on this. And I always say the same: your input as stakeholders is crucial. You have to let us know, because we are dealing with such an amount of papers and lines and paragraphs and words that at some point it is very easy for all of us to miss something. We probably don't all have the same angle, and if the people working on the text don't have the right angle on a specific word, then the legislation won't be as good as we want it to be. So this is a collaborative effort from everybody. I cannot give details, only reassure you that the wording is going in the right direction. But you have my word: things are going well, the legislation is going well, and as for side effects on open source, I don't expect any. So thank you for inviting me.

I talked to Astor yesterday, and there is time for one or two questions, a direct channel to the Council, within the limitations of what he can say. Any questions, concerns, views?

Maybe not a question, but just something to add. This is Jochen from IBM. Thank you very much for your comforting words about open source and that it should be fine. I also have the impression that the message we sent was heard. The open source topic, and I like to stress this, is not a topic of individual lobbying or anything like that. It is a topic of significant substance for the European innovation ecosystem. Clearly the Commission had good intentions when they crafted Recital 10, but it went wrong. And it is very important to fix this, to make clear that with open source, the source code must always be available. Otherwise it's not open source. That's critical.
Secondly, individual developers who contribute cannot be made liable, because otherwise companies won't send anybody to contribute anymore. If there is a risk that you send someone and they immediately carry liability, it doesn't work; the open source innovation power in Europe will stop. And thirdly, all those hosting the source code, the Eclipse Foundation, the Linux Foundation, GitHub, cannot be considered manufacturers or distributors, because they are not, and they are not meant to be. Even if something carries the name, like the Eclipse Dataspace Connector: it has the Eclipse name on it, but Eclipse only provides the framework for doing the open source development, the governance framework. They are not a manufacturer. It must be those who actually take the code, put it into a commercial activity and make money with it who are. And they are ready, I can say for IBM that we are ready, to take that liability. But up to the step when something is brought into a commercial activity, open source must be exempted. So thank you very much for raising this in the Council as well. Happy to hear that the messages from our negotiations resonate, and that they are the same ones being presented to you. Happy to know that we didn't miss anything.

Thank you, Jochen. James?

Yes, James here, just to echo Jochen's point about how welcome it is to hear that reassurance. My question is about commerciality, which was his third point, and the reference to guidance that could potentially accompany the final regulation, to better understand how the New Legislative Framework's terms, conditions, notions and definitions fit with software, in particular open source. Could you shed some light on that? Is there a discussion now within the Council on how that guidance could, after adoption, further help iron out some of the question marks we have around what is commercial and what is not, et cetera?
I think that would help a lot. Thank you.

That's a tough question, because what is commercial is one of the definitions that generates the most debate. I cannot say much about what is at stake, because it is still being considered; the final wording is not agreed in the Council. But if I were forced to choose a magic word, I would say: profit.

Okay, very good. I think this sets up the panel quite well. Thank you so much for joining, especially in these busy days. I'm going to pick up the microphone. As you can tell, we don't have too many microphones, so you'll have to share one. The style I usually use when I moderate a panel is a bit informal, so I'm not going to go for a long formal introduction, even though you both deserve one; I'll leave it to you to introduce yourselves. I'll ask a question, and you can work in who you are, what you do, and why you think we've invited you to talk here today. One thing we really want to underscore is, of course, the potential implications of the CRA for free and open source software, but I would like to start by making the link between free and open source software and the European economy and society: the EU IT industry, but also broader than that, what open source software is in the economy and, let's say, the innovation ecosystem today. Detlef, I'd like to start with you. In your job at ETAS, within Bosch, how do you perceive the impact of the CRA on the practices of your company, on large-scale industrial software development and engagement?

Yes, thanks first of all for the invitation. To give you a little background: I've been with the Bosch Group for more than 25 years, always in the software area; I come from computer science.
We are, of course, in the automotive industry, but also in home appliances and other areas: gardening tools, power tools and so on. And everywhere, open source is inside. For us it is clear that we carry full liability for our products, independent of what is in them. If it's a screw from a supplier and we use it in a safety-critical application, we are liable for it. The same if there is open source inside: we are liable. So it is not a question of, and we do not expect, an exemption for us when using open source; we take the liability for the built-in software. That is why we got a bit scared when we heard about the first draft of the CRA, and we raised our heads and explained to the European Commission that we have a big, big issue here if it goes in the wrong direction. Because if we are not allowed to use open source, perhaps because the open source community says "you can use it everywhere, but not in Europe, because of the CRA", then the entire industry, not only automotive, is in deep, deep trouble. We cannot develop these things on our own. So we are not lobbying for ourselves; we are lobbying to be able to use high-quality open source in our products. And in the automotive area we have a special challenge compared to most other industries: our product life cycles are about 20 years. Commodity products, like the camera in a mobile phone, have three or four years, but we need to maintain the software in a vehicle for over 20 years. Vehicles in Germany are on the road for around 10 years on average, and elsewhere in the European Union they run even much longer. This means we need to maintain this software over a long, long period of time, and that can only be done with open source approaches. And there you see the impact.
If we are not allowed to use open source, we are done. So we got really scared when we read the statement by the Python Software Foundation: if the CRA comes like this, we might have to think about not allowing Python to be used in Europe anymore. That would be a nightmare for us, because our entire tooling environment, all our software scripts and so on, use Python. That is the impact. So now you understand why we are quite scared.

Now, we don't have anyone from Python here, but we have somebody from Apache. As a founder of the Apache Software Foundation, can you talk about the possible implications of the CRA from your point of view and that of the FOSS ecosystem, this open sharing and collaboration? Why is there a worry in the ecosystem?

Well, thanks for the question. I'll introduce myself as well. I'm one of the founders of the Apache Software Foundation. Once upon a time I was still a respectable physicist and worked with the European Commission, and back then I was already at work on this thing that was later to be called the Internet and the World Wide Web. That led to Apache, and that ultimately led to the Apache Software Foundation, which is now one of the larger open source organizations in the world. What we do there is, we have lots of professionals, lots of people who work full time for companies all over the world, in Europe and elsewhere, working on open source software. If you take a typical European SME, or even someone like Bosch, 80 to 90% of their software stack will be open source. And if you look at SaaS services, that number will be much, much higher. So when we talk about the CRA: the CRA is not just regulating some proprietary bits of software at the top; 80 to 90% of its impact is on open source.
Since that's our core business, we worry about that. And if you look very closely at how open source comes to be, the distinction between what is commercial and what is open source isn't easy to make, precisely because 80 to 90% of every commercial software stack is open source. We have many tens of thousands of developers who are full-time employed by companies, here in Europe and in the rest of the world, and who work for a very large part of their time on open source. Not necessarily because they want to make open source better as such, but because their employers want certain things in that software to be better: more fit for Europe, more fit for whatever they are doing, or some other arcane business reason. And the industry as a whole has found out over the years that contributing back some of the things you are doing for your customer, basically to your competitors and your peers, simply makes business sense. It makes your maintenance easier; together you can build something a little bit better and serve your customers better. At the same time, the companies who place the product on the market, who sell that service or that product, are usually extremely proud of their software and stand behind it. That is not an exception: even where the work was basically done by a bunch of volunteers, companies are fiercely proud of their software and absolutely stand behind the entire stack they are delivering. Even though 80 to 90% of it wasn't written by them, when they sell it to their customers, or provide the product or service, they feel responsible, and they are responsible, for the whole thing, and they keep it up, running and alive.
Now, what happens with the CRA, or what happens in general, is this. In that very commercial setting, at the coalface, you make an improvement or a security fix. Of course you first give it to your customers, because that's your first priority. But you will probably also supply it back to the Apache Software Foundation, so that the software base you build on becomes safer for everyone. That's a win-win for you. There is a certain amount of altruism in it, but it's also because, long term, you benefit from a safer ecosystem. Now, that fix was developed in a very commercial setting. It goes back to Apache, and from Apache it goes out to all your competitors, your peers and everyone else. That fix was written by someone full-time employed at your company. You are not going to tolerate any risk or liability associated with giving it back to the community, because your customers are not paying for that, and you have no idea how that fix is going to be used by others. So there is a lot of fear, uncertainty and doubt in that process. One of the secrets of open source, and it's a bit of a dirty secret, is that this win-win is very, very fragile. It is usually much easier to simply hoard all your changes. The fact that we contribute back is because we have learned that, long term, there is a win-win. But 99 out of 100 companies never contribute anything back, even though they may have very valuable things. It's a very fragile process. So the primary worry at the Apache Foundation, among its volunteers, and with that all those SMEs and companies in Europe and around the world, is that we break that very fragile win-win: that we will see companies at the edge still making security fixes, but no longer contributing them back.
And that's a concern, because while at the Apache Foundation we typically fix things fast, within hours and days, the industry as a whole isn't so good. In many ways the CRA is something this industry badly needs to get security better. But meddling with that process, with an imperfect abstraction, an imperfect definition of where open source stops and where commercial begins, is risky. And it's a hard problem, right, because again, 80 to 90% of any software stack in Europe, in an SME or a larger company, is open source. So that is basically where our real concern is.

Okay, perfect. Now, this is a bit of a tricky question, because there is essentially one big study of the economic impact of open source that has been made in Europe. It was published by the Commission, and we were involved in writing it. But in the process of writing it we also learned how understudied this area of economic impact is. So instead of just regurgitating the numbers that we and the research team came up with, I'll give you a bit of freedom to reflect on the societal impact of, let's say, a chilling effect on open source. Without being overly dramatic or doomsday about it: everything from washing machines to cars to power tools, and Apache is right there in the middle. How big is it, and how important is it?

Right. So, nobody should just look at the billions of euros associated with this. Given that 80 to 90% of a software stack, and for a SaaS service the high 90s, is open source, you could say the value of open source is essentially the value of the IT industry scaled down to that 90%. But I think that in a way does a bit of a disservice to what open source is, right?
Because to some extent, the reason companies collaborate around open source, why they find a win-win, is that we have a lot of things in society that we somehow need to do together: we need to exchange data, we need to make systems talk to each other. And as things get more complex, we are finding that it is often easier to simply use the same piece of software when we want to talk to each other, to have a lot of commonality in the things where we don't do anything special. Take metric screws: a company will use size three or size five all over the place, just like every other company in Europe that needs screws and bolts and nuts and washers. In a way, that's what open source really is. If you go to a builders' market to buy stuff for your house, everyone recognizes the bolts and nuts. If you buy a tire, you know it's a tire, right? There is nothing special about it. That is what open source is: the entire foundation of the IT industry. And the very point is that it is the same basis everywhere, because we want all these things to work together. So I would say the economic impact is simply how we use IT to run a modern society, especially the bits where we all need to be the same because we want to trade, travel, supply information and so on.

Yeah, perhaps to add a little to what you just said, I would bring another analogy. In Europe we have a power supply of 220 or 230 volts. It's open to everybody. If somebody now said, let's go to 165 volts, we would have a severe problem, because the complete infrastructure would break. And the same would happen with open source.
If open source were pulled out of the industry, the infrastructure would be completely down. That is one societal impact. The other, looking more in the forward direction: we urgently need software specialists in the European Union for our industries, and there is a pure fight for talent, not only within Europe but across the world; a big shift of IT competences is happening globally. So if we want a really secure labor market for IT talent, we need to be active in this area of open source, because it is a very important attraction criterion for computer scientists. I am a computer scientist; I want to work in a computer scientist's environment, and open source is one of the critical resources of that environment. If we restrict its usage, we restrict our economy's ability to develop software. That is a very severe impact on our society and our economy.

Yeah, thanks for this. That talking point sounds like a good tweet: the CRA is about open source first. I think that is at least a good starting point for thinking about these questions, because of course the goal of making software more secure is shared by the open source community. But just in terms of understanding impact, what we hope to do with an event like this is to elevate this part of the conversation, because there are no spokespeople for open source; it is a very different kind of ecosystem. I want to get to the more hopeful end of building bridges later, but first I'd like you to really dig into some details of the specific challenges the CRA could pose to the FOSS community. And I should say here that we are referring back quite a lot to earlier iterations of the text, not specifically to the language right now; we'll have other sessions to discuss that.
So help us connect the dots, because it can get quite abstract if you're not right there in software development: from what you see in the CRA to the risks it would raise for, say, a developer at Bosch. Could you connect those dots, how it would affect the day-to-day work in practice?

Yes, I can give concrete examples. In the automotive industry we have a tremendous transformation towards a software-driven industry. This is happening as we get more and more high-performance computers inside the vehicles, and software features connected with services in the cloud. So we are now using more and more software coming from the open source area. Earlier, in the deeply embedded area, this was not the case; to be honest, there is no open source software at all in an ESP controller and the like. But now, with the vehicle computers, we have features like autonomous driving that no longer live in one single computer box but are spread across the entire vehicle, or even across fleets and the cloud. There you need a different type of software, which is based on open source. And the developers are looking at which kind of software can be introduced into a product depending on the functional safety level they need to address. In highly safety-relevant functions you will hardly use it, but elsewhere, what we call quality-managed software is heavily used; in the entire infotainment area, for example, it's in. And when you use that software, you need to monitor whether there is any vulnerability, so that you can fix it quickly and deploy the fix quickly, via the OEMs, into the vehicles. For that you need strong connectivity to the open source communities.
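The monitoring Detlef describes, checking shipped components against newly published vulnerabilities, can be sketched roughly as follows. This is only an illustrative toy: the component names, versions and advisories are invented, and a real pipeline would match an SBOM against a live feed such as the OSV or NVD databases rather than a hard-coded list.

```python
# Toy sketch: match a product's component inventory against known advisories.
# All names, versions and advisory data below are invented for illustration.

# (name, version) pairs shipped in the product
inventory = [
    ("libexample-tls", "1.2.3"),
    ("infotain-ui", "4.0.1"),
]

# Advisories: affected package, affected versions, version containing the fix
advisories = [
    {"package": "libexample-tls", "affected": {"1.2.2", "1.2.3"}, "fixed_in": "1.2.4"},
    {"package": "some-other-lib", "affected": {"0.9"}, "fixed_in": "1.0"},
]

def find_actionable(inventory, advisories):
    """Return (package, shipped_version, fixed_in) for every shipped
    component that an advisory lists as affected."""
    hits = []
    for name, version in inventory:
        for adv in advisories:
            if adv["package"] == name and version in adv["affected"]:
                hits.append((name, version, adv["fixed_in"]))
    return hits

for name, version, fixed_in in find_actionable(inventory, advisories):
    print(f"{name} {version}: vulnerable, update to {fixed_in}")
```

The point of the sketch is the shape of the loop: the faster this check runs against fresh advisory data, the faster a fix can be deployed into the fleet.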
And if you see a bug, as you mentioned, you fix it and you bring it back to the open source community. We are doing this, but of course it is a learning process. The industry, especially the automotive industry, is in a learning phase there, but on the Bosch side we are already doing this heavily. Without it, we cannot deliver our products anymore. And of course we also make sure that we are not using software in an illegal manner. All our products are tracked and scanned for open source licenses; we have to fulfill specific requirements from our customers to provide all the licenses; and we check whether the licenses are compatible with each other, so that we are not building non-compliant products. So we have a lot of competence in using open source in a legal manner, and there are real technical details involved. For example, if you link a library statically, that can hit you badly, because you might end up with a non-compliant product, whereas linking the same library dynamically would be fine. You need that experience, and therefore we invest millions year on year in scanning all our products, and also in contributing back. These are real developer tasks.

And I think here, when discussing these things and talking about implications, there is a bit of an elephant in the room. What are we actually talking about when we talk about a FOSS exemption? It's 80 to 90 percent of all software, even more in some stacks. That sounds like we would create a massive loophole of simply unregulated software. What is actually being called for here?

Right. So first of all, the word exemption makes me a bit uncomfortable, because really: having 80 to 90 percent, or in the SaaS case up to 99 percent, of your stack exempt from a regulation we badly need? The industry has made a mess of security, and we do need security to get better across the board, absolutely everywhere. So we really shouldn't be looking at exempting open source in that fashion. Now, go back to those open source developers, and contrast them with the developer at the coalface, who actually delivers to the customer, knows what is going to be used in the car, and stands behind that car with a nice brand on it. The developer in a company knows exactly which customer they are working for and how the software will be used. From the purely open source side, you are much more like a nut-and-bolt manufacturer. The nut you make may be used for a picture frame, may be used for a bike, may be used in Ikea furniture that collapses, or maybe in something highly critical in a car. You don't really know, as a developer. It's like selling fertilizer: it may be used for crops or for explosives. There is a lot you simply don't know as an open source developer. On the other hand, just as with that nut or that bolt, there are a lot of quality control systems, marks of confidence and things you can do to make sure that what goes out of open source into the industry is good. At Apache, for example, we have very stringent rules that no releases go out with known security issues, and fixes are always mentioned in the release notes.
Things have to be fixed in a timely manner, we do responsible disclosure, and so on. We also have processes on top of that for when these things are not happening in one of our open source communities, because in the end these are all volunteers: it gets escalated, we stop distributing that software, or in the worst case the software gets moved into what we call the attic, where it ceases to exist as a live project, or we pull a release. Likewise, whenever you push out a release, you go through a whole list of checks to make sure that release is tip-top. Have I fixed all the security holes? Are the release notes there, is the license there, are the dependencies there? Am I referencing something that has a known issue? Have I run all the basic hygiene checks on that patch? I'm being a little vague here, because at Apache we have everything from a graphics library to something that runs on an industrial PLC, so each community differs in what its industry needs in terms of compliance. We generally, as an organization, wrap that up in what we call a release vote: no product goes out without a few of the core developers explicitly and personally having said, yes, this release is ready and meets all the requirements. And if one of them vetoes it, the release won't go out. That pairs with reporting to the board: every project reports several times a year, and a project in dire straits reports every month. So there is a compliance aspect to make sure that what goes out is essentially clean. But it's a little bit like the shipyard, right? When the Titanic left the yard, she was fine. We don't know how the software gets used in the field. And that's why it is so important that, rather than focusing on exceptions for open source, we put the attention where it really matters: at the coalface, where the product goes to the customer, where you know exactly what it is doing, what it is supposed to do, how to support it, and what requirements it needs to meet. And then you are very careful about anything going in the opposite direction: when a company makes a security fix for its customers, it should also upstream it so it goes to everyone else, because if you impair that process, we are making the world less secure rather than more secure.

And maybe just one quick thing, because I often realize this is something that should be explained: can you describe what upstream and downstream mean? This is one of those things software people should sometimes take a step back and explain.

Right, yes. So, whenever you write a piece of software, let's say it goes to, well, there's almost only IBM in the room, it goes to IBM, or it goes to Oracle, and they make a nice package out of it. That's what we call downstream. Then, for example, that IBM package may be bought by some finance company to make a nice finance product. That's again further downstream. So as you get closer to the consumer, you are going downstream; fixes, improvements and things like that go upstream. Whenever someone in the field finds a security hole and makes a fix or reports it, that patch goes upstream, for example to the Apache Software Foundation.
And sometimes it will go upstream even further, say to the Linux Foundation or to a smaller package. Then, when the industry as a whole has made a fix, that fix goes downstream again. It's a fairly messy process, because it is quite common for a fix made somewhere downstream to go upstream and then be improved: it is found that the security hole actually exists in more places, so the fix gets bigger. So what you actually see is that a company initially sends a small fix to the open source world, and three months later a whole set of changes comes back that fixes a much broader problem than the one you found. That is where you win, and why you upstreamed it in the first place. It is a wave going backwards and forwards: downstream towards the customer, upstream towards the source of the software.

Okay. To explain also from our perspective: even within our industry, we take care about which developers are allowed to contribute to which open source projects. There is a process inside our company to vet people to contribute to dedicated open source projects. Not everybody can contribute to all projects; it is a controlled and managed process. And consider this: if our product contains, for example, version 1.0 of a software stack and we identify a defect, we could fix the defect in our own environment, and it would work. But when version 2.0 of the stack comes, we would need to build our fix into that version again. And typically there is not only one bug; there might be 10 or 20 bugs. Managing these fixes locally, within our own versions, is not suitable; it's not manageable.
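The version-porting problem just described can be sketched in miniature. The fix names and versions below are invented for illustration; the point is simply that a fix kept only in a local tree must be re-applied to every new upstream version, while a fix merged upstream ships in every later release for free.

```python
# Toy model of the local-patch burden: when adopting a new upstream version,
# a vendor must hand-port every fix it needs that upstream does not already
# include. All fix names and versions here are invented for illustration.

def fixes_to_reapply(needed_fixes: set, upstream_included: set) -> set:
    """Fixes the vendor must port by hand when moving to a new upstream
    version: those it relies on that the upstream release does not ship."""
    return needed_fixes - upstream_included

# Moving to version 2.0, which already ships "fix-overflow" because it was
# contributed upstream; "fix-parser" was hoarded locally and must be ported:
todo = fixes_to_reapply({"fix-overflow", "fix-parser"}, {"fix-overflow"})
print(sorted(todo))  # ['fix-parser']
```

With 10 or 20 hoarded fixes and a new stack version every year, the porting set never shrinks; upstreaming empties it.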
Therefore, contributing the fixes back upstream ensures that with the next downstream release we get the fix back automatically, integrated within the open source project. And this is the big benefit, because otherwise the quality would drop. But I have another point I would like to mention. We were always talking about the software stacks; that's only one half of the story. The entire open source world heavily influences the software development processes and methodologies used in software engineering. All these DevOps approaches come more or less from open source. The first configuration management systems, RCS and so on, were all open source. These processes came from that community and got improved over time; of course companies then built their products on top. I remember the early MKS system: it was purely based on these standard configuration management systems from open source. So the tooling aspect of open source, which is why for example the Eclipse Foundation is so important for us, and the process impact on development in industry, are tremendously important.

So DevOps, tooling, etc. Let's stay there a little bit, because I think there's an interesting question. You talked about the need for more investment into security generally, not just in the IT industry but in industry as a whole. But if we stick with the open source ecosystem, this broad term that means so many different things: what is being done right now to improve security? If I talk to an open source professional developer, security is an obsession. It's not only being talked about; what is actually being done?

Yeah. So if you have open source, everybody can look into the source and everybody can check whether there's a vulnerability.
If I get only the binary, the effort to check for vulnerabilities is much, much higher. So when someone delivers only binaries and claims that's more secure, I say: this sounds to me like security by obscurity. If I open up the source, everybody can look in and see whether it really is secure.

There's a question. Let's take a question. Is it time for questions already? What time is it, actually? Fifty minutes. Let's get a question in.

I'm Torval Terakalyo, the Estonian cyber attaché. This is my first exchange with the open source community regarding the CRA, and it's very educational. For Estonia, the open source question in the CRA is one of the key priorities; I would even say the first one to get right, because, as you might know, the digital society of Estonia also depends on open source solutions. I wish we would spend still more time in the Council talking about that issue. Some have said it's been enough, but in my opinion, listening to all of this, there are new elements that come in. So I have two questions. One is about monetization and the profit part; I want to understand a bit better how these open source organizations work. Which part of their activity can be monetized? For example technical support, or is there anything else that is monetized? How do the people who contribute earn their income? And the second relates to what you just said, about vulnerability reporting. Is there any kind of interaction between the open source community and the CSIRTs or the authorities now? Do they regularly do voluntary reporting? And if the whole open source area were excluded from the scope of the CRA, would this kind of voluntary reporting exist in the future?
How would other users of the open source component learn that there is a vulnerability in their element? Is there a need for the CSIRTs to act as a CVD-style coordinator in that case? I hope I was clear.

Just to start on the business model side: there are many open source organizations, with all sorts of variations. The Apache Software Foundation is basically one of the oldest and largest ones, and it's very simple in that case. We're a volunteer organization. Our developers are paid by their employers; or perhaps they have a pension, or are very rich, but that's actually very rare. Nine out of ten of our developers are paid full-time by companies to work on software, a large part of which happens to be open source. At Apache, the only people we pay out of donations are really on the administrative side of the organization: our lawyers, our accountants. We also pay one person so that we have a very rapid security response to the initial emails coming in. But there is basically no business model whatsoever around it. It's much more like a professional organization of engineers, where essentially it's all volunteers working on their specific topic. No money changes hands; there's no business model. If you want support on Apache products, that's perfectly fine, there are many companies which supply it, but that's entirely outside the Apache Software Foundation. We just supply the software for free; we don't pay for it, no one pays for it. Basically no commercial considerations enter into it. All the donations we do get from companies are specifically spent on lawyers first and on small administrative things next. As to your...

Sorry, jumping in there. That's great for Apache.
But could you maybe describe a little bit the diversity of the many different models?

So Apache is the simplest and one of the oldest ones. The second model you sometimes see in the industry is a bit of a pay-to-play model, where you pretty much have the same setup as Apache, everyone is a volunteer and so on, but a bunch of companies each pay a certain amount. They make a contribution, a donation, and in return they get board seats; and board seats of course mean a certain amount of control over those foundations. In Apache that's absolutely not the case: the board never interferes with what people are doing. The only time the board pulls the plug is when, for example, there's a security issue; but it will never tell anyone what to work on. In the pay-to-play foundations it gets a little different. Once you're beyond that point it gets quite complex, and you get all sorts of models where, for example, groups of people or small companies come together and collectively pay a few developers to work on open source. Then all of a sudden it looks a lot more commercial. Or you have situations where they come together and sell support, and with that support they pay themselves. Even though the source code may be open and meet the open source definition, you then quite quickly slip into a commercial model, until you're all the way at the other end of the spectrum. Not to pick on anyone, but Red Hat is a perfect example: a completely commercial company that supports lots of open source projects. So they're not an open source foundation or anything like that at all, but their company is entirely about open source.
So that's at the far extreme of that range, but basically the whole range exists.

And then there are maintainers, there are code repositories; it's a very large landscape, and no one size fits all, I think.

If you look for example at the Eclipse Foundation, this is a completely different approach than Apache. Bosch, for example, is a member of it, but we do not own it and we do not financially benefit from it. We support these foundations because we are convinced they are doing the right thing. For example, I think in June or July this year there will be an Eclipse Foundation day close to Munich. This event needs to be paid for somehow, because there are costs involved; they use our membership fees, like a club, to cover those costs. It's not a financial vehicle; you could call it an NGO, like an organization of engineers or medical professionals. They are much more like professional organizations.

But you asked another question, with regard to vulnerabilities. You can even set up an open source project without being in such a foundation: I can set up a project and put my code somewhere. If a lot of people jump in, it could become a success; if nobody jumps in, it might fail very fast. We as a company will never go for such a tiny project, because you cannot be sure that it will be maintained over time and that it will provide proper tracking of vulnerabilities. So the open source projects which we are using are fully under vulnerability checks, and if there is a vulnerability it is immediately reported into our product development life cycle. And to be honest, the open source projects are typically faster, which is a benefit of using open source for this kind of thing.
They can react much, much faster, and this is beneficial for the security of our products as well.

So in general, if you go to the more established open source foundations, which have proven longevity and reliability, industry will build upon them, whether you're a small SME or a large company. Pretty much all of them take in security reports and are quite good at funneling and triaging them quickly. At Apache we get reports coming in, and at our scale that means roughly one or two a day are actually serious and need to be addressed. A report then goes, typically confidentially, to the developers of that particular code base to further triage: figure out the bug, figure out the fix, or work with the submitter in private to get it resolved. In the meantime you always ask for a CVE number, and you start using that CVE number to make sure that everyone has the same tracking of the issue. Then, depending on the situation, you go through the process of making a release. If the issue is particularly sensitive, the release being prepared is kept out of public view, or you obfuscate how you describe it, so it isn't clear to the community what is being fixed until it's ready and can go to everyone; then you publish the CVE number along with the description, the mitigations, and measures like that. In some cases you decide that the fix is going to be complex but there's an easy mitigation; in those cases you of course report earlier and do that outreach much earlier. And in all of those cases the CSIRTs and so on are basically kept informed: you don't really report to them formally, but you send out the email messages to them, with your advisories and vulnerability information. That process isn't always perfect, and one of the things we see at Apache is that some of the CSIRTs actually complain to us that we're sending them too many, that the volume is too high. But basically that is a fairly structured process, and regardless of whether the CRA comes to pass or not, that process will be followed; it's pretty much what the CRA expects us to do. The one big problem with the CRA, and something we absolutely do not want, is that in its current wording it expects unfixed vulnerabilities to be reported. That is something we are completely opposed to, because we're incredibly worried about a central stash of exploits that aren't yet fixed. Even in our own organization a lot of things are on a need-to-know basis, and we carefully keep them close to our chest until they have really been fixed. These vulnerabilities have all sorts of value to all sorts of parties internationally, and you don't want to get into a situation where you have to report them in one country and are not allowed to disclose them in another.

One additional hint, further downstream towards the product: even when the patch for a bug is available, it has not at that moment reached the final product. This is a much bigger challenge for industry, for example in the automotive area. Millions of vehicles are running on the road, and you need to get that patch deployed into the vehicles, or into the washing machines, or whatever; and these are not centrally coordinated devices anymore. So in that sense open source gives us the opportunity to address this topic.
But open source can never fix the products in the field. Therefore the liability for this cannot lie with the open source projects, because they do not even have access to the products or components in the field. If I have the liability, I am in charge of addressing the issue in the field. This, I think, is important to understand.

And there's also something else happening here which really helps security in open source. For those open source developers it's much like when you're making a bolt: there are only a few things you can really focus on, like strength. So open source developers are typically very fanatical about getting their product absolutely right, absolutely secure, as good as they can make it, and they can afford to do that because it's really their only focus. Whereas if you're that washing machine manufacturer or that car manufacturer, there are a lot of other concerns and trade-offs to be made. For us a patch is easy: we fix it and we do a new release, what's the problem? We're proud of that, and we actually enjoy doing it. Whereas for that poor car manufacturer it's yet another fix coming in when they just rolled out the last one; it feels different. So in many ways the open source world has it easy in doing this work, securing the holes and fixing them well, because the hard problems start downstream from us.

Let me open up for some more questions. Anyone? Yeah, James.

Yeah, I had a question for both of you. We don't keep it a secret: Red Hat is an open-source-first company, 100% open source. Half our engineers are upstream all the time, so we believe passionately in this approach to fixing problems and unlocking potential.
So could the CRA help drive more contribution upstream, and by the same measure help mature open source? That leads to the liability question. We haven't mentioned the Product Liability Directive and the notion of defectiveness and so on. Is there a risk that strict liability comes in for software, and that, through the inadvertent or unintended consequences we were talking about, it brings that liability back upstream? There is still this utopian notion in this city about having a bug-free solution, and those developers upstream do not want to carry that.

Okay. Bug-free software is not possible. Full stop. Everybody claiming something different should go to a university, pursue a research project, and collect a Fields Medal or whatever, because a formal proof can be done for some small parts of the software, but not for the entire software, and especially not for entire systems of software. No chance. I can talk about this because in the automotive industry, for hardware components, we talk in so-called PPM rates: how many parts per million get defective in the field, and the target is always a one-digit number. Out of one million parts, at most nine are allowed to fail, to have a defect. This would never work with software. About 15 years ago we had a big discussion inside the automotive industry: do we have to use the same PPM rates for software? The problem is that if you have one bug in a piece of software, the PPM rate is immediately one million, because the bug is in all components. And this is the difference to a screw: if a screw has a defect, it can be restricted to a certain batch of screws, and then you can say only one thousand cars are impacted by that batch. But with a "batch" of software, all the cars are impacted immediately. Therefore this is not a point of discussion. No chance.
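The PPM comparison above comes down to simple arithmetic; a short sketch, using the speaker's own illustrative figures rather than real automotive data:

```python
# Parts-per-million defect rate: defective units per million shipped.

def ppm(defective_units, total_units):
    return defective_units / total_units * 1_000_000

# Hardware: a defective screw batch is contained. Of 1,000,000 cars,
# only the 1,000 built with the bad batch are affected.
print(ppm(1_000, 1_000_000))      # 1000.0 PPM, limited to one batch

# Software: one bug ships identically in every unit, so every car
# is affected at once.
print(ppm(1_000_000, 1_000_000))  # 1000000.0 PPM
```

This is why a hardware-style single-digit PPM target cannot be transplanted onto software: a single software defect puts the rate at the theoretical maximum.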
Every person who addresses this and wants to push it in the other direction should go to university and take some courses in computer science. Sorry.

Just a small reflection. We talked a bit before the event about this question of metaphors, how to describe this ecosystem, and the challenge that comes with that, because some metaphors work well in some situations. We've come back to the one with the screws here, and it works to explain certain dynamics. But let's also question that metaphor a little, because, as Carlos mentioned, there's the question of profit streams: the screw designer in this story is not selling screws, not making money from screws. That's where the metaphor fails. Could you reflect a bit on these metaphors and the challenges of conveying these messages?

So, as I already mentioned before this talk, we have a severe issue, from my perspective, in the computer and software engineering community. We are keen to talk about software, we live purely in software, we have our own language, and sometimes we fail to build the verbal bridge to the non-software people. It's not that we think we are the best; but we think somehow differently, and explaining what we want to convey is difficult, because we don't really look into the context of an audience that doesn't come from computer science. Building analogies and trying to explain things is so important, and we somehow miss that in the computer science area. The other problem is that the non-computer-science people are often not so keen to go a little deeper into the software area and try to understand the thinking process of the software people. I always give this example, which I mentioned this morning: most people can easily imagine a three-dimensional cube. For me as a computer scientist it's not an issue to think about a 27-dimensional cube; it could just as well be 30-dimensional.
It doesn't matter; for me it's a data structure in my brain. But if I want to explain what a 27-dimensional cube looks like to a person who can only picture the three-dimensional cube, it's very difficult. Therefore, as computer scientists, we should not talk about 27-dimensional cubes to non-computer-scientists, because we will simply lose them. We need to find a common understanding, and this is also my perception in the CRA discussions: that we have not spoken to each other early enough. That's my perception; I don't know whether it's right or wrong, but it's my perception.

And I have to say, one way of really simplifying the CRA is that it's extending the legal frameworks of the three-dimensional world into this 27- or 30-dimensional world. In the discussions about getting the language right, part of it is finding shapes both sides recognize, squaring the circle, trying to get these worlds to meet. I think it will be possible, but we really need to work on it. And at this point, what I really would like to stress from OFE's side: we're happy to provide this forum, but the point is to really sit down and talk this through, because everyone who's engaged directly with the CRA knows that there's some time pressure; things are moving fast. And of course I would be the first one to say the CRA is not firstly about open source. Let's sit down and have these conversations, and I think we'll get there.

Yeah, just to add: I'm not sure whether we have not spoken early enough, but what I sense is that at the level of policymakers there is still a lack of understanding in Europe about open source. There is still the perception that open source is something that some freaks do: they get together in a room and do something, and it's just research and innovation.
What is not clear is how much business in Europe depends on open source, and how much the innovation cycle for business depends on it.

I totally agree, but it's on both sides. And it's not only talking, because talking is very often one-sided communication. It should be questioning: ask people, "I do not understand what you just said, can you explain it in a different way?" or "Did I understand it right?", and this should be done in both directions. Not only conveying messages, but asking and really trying to understand the other side; and it's not a confrontation. It's really about understanding the intention and then finding a common solution.

And I think it's also important to stress that it's not all doom and gloom, to come back to James's question of whether it can also help improve things. We know that as an industry we need to improve security; there's no doubt about that. Open source first brought source code to the industry, and then, as was just alluded to, processes: version control systems, release processes, build systems, and so on. So if we get the CRA right, it's very likely that we will see the downstream entities, the companies who place the product on the market and are responsible to their customers, get much better at reporting problems back upstream. And I think it's also very likely that open source collectively realizes that all these people making products have the same problem of implementing the CRA and becoming CE compliant. So why don't we, together, already include in the open source releases, just as we now package release notes and packaging notes and SBOMs and all these other things, the bits you need to comply with the CRA at that CE phase?
Of course that makes it absolutely crucial to make sure that the liability and the risks are put in the right places, because otherwise, if any of that would increase the risk for you, the opposite is going to happen and things are going to get less secure. But if you do this right, there's absolutely a chance that in the next phase open source will be helping not just with packages and SBOMs and things like that.

You should explain what an SBOM is, for me.

Oh yeah, sorry. Basically: software bills of materials. It's about labeling really well what is in your software package and what it depends on, because, just like you need a list of ingredients, or need to know where your raw materials came from, whenever you're assessing something for a security or reliability issue you need to know what's in it in the first place. That's what these SBOMs, these bills of materials, are. And as we get better at that, it's very easy to imagine the open source world going the next step and actually helping people at the CE phase, at the end where they supply the customer, with the raw materials, if you want with the templates, to make their CE certification, maybe even self-certification, much easier. Right now, in the current fabric of the CRA, that's actually risky and harder than it needs to be. But of course, things can improve.

I think it would be great also to get The Document Foundation angle, because that would be interesting, and I might pull out some people that I see in the audience to get a sense of this diversity. Benjamin, you have a question? Yeah, let's start with you, Benjamin, and then, Peter, I'm thinking you could talk a bit about software repositories and package managers as well.
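The "list of ingredients" idea behind an SBOM can be sketched as a minimal CycloneDX-style document. The component names and versions here are hypothetical, and this is only a skeletal subset of the real format:

```python
import json

# A minimal CycloneDX-style software bill of materials: the product's
# "list of ingredients". Names and versions are invented for illustration.
sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.4",
    "components": [
        {"type": "library", "name": "libexample-ssl", "version": "3.0.1"},
        {"type": "library", "name": "example-httpd", "version": "2.4.57"},
    ],
}

def affected_components(sbom, name, bad_versions):
    """When an advisory lands, the SBOM tells you instantly whether you
    actually ship the vulnerable ingredient."""
    return [c for c in sbom["components"]
            if c["name"] == name and c["version"] in bad_versions]

# A hypothetical advisory affecting libexample-ssl versions before 3.0.2:
hits = affected_components(sbom, "libexample-ssl", {"3.0.0", "3.0.1"})
print(json.dumps(hits, indent=2))
```

The value is exactly what the speaker describes: without the ingredient list, assessing a product against a new vulnerability report means reverse-engineering what is inside it first.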
Let's start with you, talking from the point of view of a larger open source project related to a product like LibreOffice, which is productivity software, so it's in front of every user, or can be.

The mental distance between developers and users is incredible. I'm not a technical person; let's say I'm a sophisticated user now, but I was not a sophisticated user when I entered open source. When I entered, and it was on the OpenOffice side, I started to say, 17 years ago, that we should document security, and the developers told me: no way. Because security is a given, it's a fact; why should we document it for users? And I said: because a user like me, and I'm not an idiot, will not believe it if they don't see the process documented. And believe me, on the other side people document things that are not security as security. One month ago I had a discussion with a user who told me, "we moved our email system to Outlook for security reasons," and I said: okay, you are not right for open source. You should never use open source, because if you believe that Outlook, the most insecure software ever written by a human being, is the secure option, then you cannot understand open source. But this is the average person we have on the other side. So now people are starting to believe me that we should document security, and I've started to write a document that says: this is what we do in terms of security. It may not be the best solution, but at least it is what we are doing, and we document it because we have the numbers. We have 15 tinderboxes compiling LibreOffice every day. Does anyone know this? No. Users do not understand tinderboxes, but if you explain it in a, let's say, human-readable way, they start to understand: we compile every version, and for every patch there is a compilation. If it fails, it goes back to the people who submitted the patch, and if it passes, it goes on to the tests.
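That gating step can be sketched as a toy model. This is not LibreOffice's actual tinderbox code; the function names and patch structure are invented for illustration.

```python
# Toy model of build gating: every submitted patch is compiled on a
# build machine ("tinderbox"); failures go back to the submitter,
# successes go on to the test stage.

def gate_patch(patch, compiles):
    """Returns where the patch goes next: back to its author on a
    build failure, or onward to testing on success."""
    if not compiles(patch):
        return ("returned-to-author", patch["author"])
    return ("sent-to-tests", patch["id"])

# Toy compile check: a patch "builds" unless it is marked broken.
compiles = lambda p: not p.get("broken", False)

print(gate_patch({"id": 101, "author": "dev-a"}, compiles))
# ('sent-to-tests', 101)
print(gate_patch({"id": 102, "author": "dev-b", "broken": True}, compiles))
# ('returned-to-author', 'dev-b')
```

Documenting even this simple gate in plain language is exactly the kind of "human-readable" security evidence the speaker is arguing for.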
I think we should have done a better job of documenting security on our side, not taking for granted that "I am a senior developer, you should trust me," because unfortunately the people outside, the non-technical people, by the way 99% of the world population, do not understand technical stuff. Of course it's difficult.

Okay, I can immediately add to your statement: security, at the end-user front, is perceived as a burden. Look at all the users who use the same password for every online shop and so on, and try to convince them not to. It's really tough. I can even provide an analogy from your home, from your flat: convincing people to exchange their door locks to be more secure against burglars. Most people think, okay, I need a really solid lock at the front door. But the most important door is the back door, because burglars typically come through the back door, or through the window. And here again we have professionals: at least in Germany you even get support from the government, somebody from the police acting as a kind of consultant who advises you which parts of your building you should make more secure, and you get financial support for this. And I'm totally with you, we are not doing this sufficiently. Who in this room is using encrypted email at a private level? A few; but it's very difficult to find somebody on the other side who can receive it. That is the indication of what I meant with the back door: everybody has the back door open.

Yeah, Benjamin. Yeah, thank you so much. Some of you may know me by now; I work for the Commission. I just wanted to reassure everyone that as a Commission we are working really hard to get this right. It is certainly not our intention to harm the open source community, but at the same time we feel that responsibility needs to be placed where it's due, meaning that those that are best placed to fix a vulnerability should also be the ones responsible for fixing it.
Just as a general comment, I wanted to take the floor because I think there are some misconceptions, maybe. First of all, I keep hearing the word liability. The Cyber Resilience Act is actually not about establishing liability; it's merely about placing responsibility, and that's a completely different issue. The CRA only assigns responsibility to certain entities to take care of security; it does not deal with liability. Other legislation may do that, but the CRA does not.

Then may I ask a question, for my understanding, it's really a question: if I have the responsibility but I'm not fulfilling the responsibility, am I not liable?

Not under the Cyber Resilience Act. Of course there can be consequences if you do not comply with the CRA, but the CRA is not the basis for, let's say, a consumer to sue you.

But if I'm intentionally breaking the responsibility, what is the impact on me? This is a question where we in the software community automatically think: okay, if we are not fulfilling the responsibility, we could be punished, or should be punished. And this leads, on our side, to the perception of liability. So I think it's important.

And I think it's important for everyone in the discussion to realize that the open source world, and the copyright systems and everything else that governs our systems, are basically built on American words and assumptions. So in general, when we say liability, what we mean is: when we don't do that, bad things happen to us; and bad things are a problem because it's something we're not being paid for and not being asked to do.

I know you were kind of cut off; let's finish off with Benjamin. We should talk about this.

Yeah, but this is the context of the wording again: perhaps we have different semantics for the word liability.
And there it would be good to understand what is meant on both sides, so that we are not talking at cross purposes.

Then another point: contributions. We fully agree that contributions are extremely important, and I think we support that. That becomes clear from the text: in the original Commission proposal there is a requirement to report security issues back upstream. I think you can go further and say that this was strengthened in the Council; the Council has even proposed, or is discussing, the possibility of requiring that you provide the source code of your own fixes back upstream. So we fully support that. And under no circumstance would an individual upstream contribution ever be considered a product being placed on the market. A contribution would never be in the scope of the Cyber Resilience Act; using your language, it could never be.

And what about the company where the contributor is employed?

Same thing, same thing, unless it integrates that component into its own products. The maintainer of the repository may also be responsible for the security of the repository, but an individual upstream contribution would never be considered a product placed on the market.

This is important. Let me ask you another question. I gave this example earlier: if I were to start an open source project on my own, not within Apache or a big foundation, just started by somebody at some point, how does this impact me as a person, as the founder of a new open source project? Am I automatically liable, or responsible, whatever it means? What kind of obligations do I have to fulfill? If I'm not sure I can fulfill them, I will never do it, because I'm getting scared; even if I'm not in a company, I may have a great idea.
I want to bring it to open source, but this could be a barrier to that innovation.

Yeah. So, not every single open source project would be a commercial project, so you're not automatically in scope.

Yeah. And this needs to be clearly understood by all the people who might not be, I would say, so aware of these topics, because they are coming from pure, really nerdy software development, and they are not such experts. So at the very least we need a corresponding explanation or interpretation, easy to understand for software engineers who are total newbies in the area of legislation.

Yeah. I very much appreciate the clarifications. But what I'm also seeing in the wider industry is that, as big companies and big organizations have lawyers and experts look at the various drafts, the implications become a lot clearer to them, and they become a lot more concerned. So I think there's still work to do to basically make sure that everyone reads those documents the same way. Right now there is a lot of fear, uncertainty and doubt, which influences behavior around innovation, and taking that away would be a very, very valuable thing.

We take the point, of course, on reaching out to the community, making everyone understand and having the same interpretation. We're really making a lot of effort. I mean, we're talking to everyone who reaches out to us; we're going to events and speaking there. We're really doing it. We will also provide all the clarifications necessary at the end of the process, in guidance documents. And already during the process, in the compromise texts, you can see a very clear effort to improve the clarity of the text, so that when you read the text you can already get the right understanding.

And I just want to make one last point I had, on the reporting.
Just also to reassure everyone: we do not at all have the intention to create a stash of valuable vulnerabilities. The sort of information that will be reported would be rather high level. It would be information useful to the people that mitigate risk, such as the CSIRTs. A CSIRT does not need to receive the proof of concept of a vulnerability in order to understand whether it can have a negative impact on critical infrastructure. You need much, much less information than that. So what we want is only the kind of information that can help CSIRTs make risk assessments and potentially prepare measures, give advice to critical infrastructure, and so forth. And the vulnerabilities that we're looking for are a very, very narrow subset of all the vulnerabilities out there. It's not even all unpatched vulnerabilities, and it really shouldn't be all unpatched vulnerabilities. It's really only those that are already being exploited by malicious actors. We're only talking about vulnerabilities that are already known to criminals. So all we want is to put the CSIRTs on an equal footing with the criminals. That's all we want. And so I think it's really limited and nothing to be afraid of in terms of cybersecurity, because the information would not be valuable even if it fell into the wrong hands.

I actually agree with the sentiment; that's going in the right direction. But I think the experience of the last 20 years of doing this has taught us that even the very fact that a security hole may exist in a certain package, or a version number, or the fact that it was repaired at a certain date, or something like that, because the source code is open, has often led to inadvertent leaks, or basically reconstruction of what the issue was, by a much wider set of criminals than the useful set you were alluding to.
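The reporting scope described here, only actively exploited vulnerabilities, and only high-level information without exploit details, can be sketched as a simple filter. This is a minimal illustration with hypothetical data shapes (the field names and CVE-2099-0001 are made up; it is not the CRA's actual reporting format):

```python
from dataclasses import dataclass

@dataclass
class Vulnerability:
    cve_id: str
    actively_exploited: bool  # already seen in use by malicious actors
    summary: str              # high-level description only
    proof_of_concept: str     # exploit details; never part of the report

def csirt_report(vulns):
    """Keep only actively exploited vulnerabilities, stripped down to the
    high-level fields a CSIRT would need for a risk assessment."""
    return [
        {"cve_id": v.cve_id, "summary": v.summary}
        for v in vulns
        if v.actively_exploited
    ]

vulns = [
    Vulnerability("CVE-2021-44228", True,
                  "RCE in a widely used logging library", "<exploit>"),
    Vulnerability("CVE-2099-0001", False,
                  "Theoretical flaw, no known exploitation", "<exploit>"),
]
print(csirt_report(vulns))
# [{'cve_id': 'CVE-2021-44228', 'summary': 'RCE in a widely used logging library'}]
```

The point of the sketch is that the proof of concept never leaves the reporter, and unexploited vulnerabilities are out of scope entirely.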
So I think that no matter how we spin this, it will basically remain incredibly sensitive information, where it's very easy to reconstruct things even if the version numbers are hidden.

Yeah, that's true. I mean, I don't deny that. But at the same time, why would we allow state-sponsored actors to have access to information that our own systems should not have access to?

I feel like there's a full-day workshop that should be planned, especially on this. There's this interesting point of the Commission, of course, being open to having the meetings. I've said this several times, Benjamin: I saw you at FOSDEM, which is a great sign, here in Brussels, with the Commission reaching out to the community. But there is still this element of building these bridges and meetings, because, of course, the open source ecosystem is not represented by a trade association. That's just not how it works in, let's say, the multi-stakeholder model of the EU. And I think, just like Denlyf said, there's a lot in the diverse open source ecosystem to reach out to and have those conversations with. The European Commission has built an open source program office, an interface to engage with the open source ecosystem. But ENISA does not have one. Should it have one? Probably. There are all these other institutional interfaces that, I think, the European governments and institutions need to build to be able to engage. Engaging with an ecosystem is simply different. Because how else could the Commission reach out to all of these different stakeholders? I think there's going to be a massive discussion in the coming years about how to communicate the point that you're making, Benjamin, out to this ecosystem. Because it seems to me that we found ourselves in this situation because the approaches that were in place didn't work. And I think we need to get good at that.
And I think equally challenging, and almost more urgent, is to make sure that all those standards which have to be written, which the CRA basically postulates into existence, are written well and informed by what the IT industry needs. Because the IT industry as a whole, and especially the part which deals with the internet and web services and SaaS and things like that, has generally never participated in the typical standards organizations like ISO and ETSI and so on. They basically organize around the IETF and the W3C and the like. So that community is not only hard to reach, it's also completely unaccustomed to the type of standards which would be the natural ones for Europe to use; they're entirely different worlds. So that, I think, makes the challenge even harder: to not only connect that world, but also connect them to the right standards organizations. And we have a few months to fix this.

Before we get to that, because I threw Peter under the bus to say something, I think it's worth noting there are some groups in this ecosystem that haven't been mentioned. Then we'll have some time; I think we'll go a bit over time. But Peter, just to describe these two points: what does a code repository do? And what does a package manager do? What is it, and why does it matter? I think you're going to introduce some helpful points here on this first question.

So GitHub is a code repository that enables developers to collaborate on and host code. And we also manage the npm package manager, which enables folks to access the compiled code once you've collaborated on it. I would carefully not use the word distribute, and it's important to emphasize that we're not an app store. We're not delivering a final product, but instead enabling collaboration, both at the source level and on the compiled binaries. And then I do want to ask a question.
In your remarks, you conveyed this distinction between an individual or an entity that takes a component and integrates it into a product, and the understanding that the responsibility, not liability, sits with that product. I'm not sure how intentional that distinction was; it's reassuring on my point, but I'm a little concerned that a lot of the discussion is focused on requiring the components themselves to be compliant as well. And in particular, from GitHub's and npm's side, that's concerning, because we would be worried if we needed to monitor or vet every individual piece of software to prevent non-compliant code from being shared. So I'd like to hear how the Commission plans to address components.

Yes, I mean we definitely do want to cover components with the Cyber Resilience Act. The idea is that many components, and open source is of course a bit special here, are black boxes, right? It's extremely hard for an integrator to evaluate their security, sometimes even impossible if you think of microchips. That's basically impossible to evaluate for someone that integrates them. But even for software components that may be open source, where at least the source code is openly available, it is of course much easier for the person that wrote the original code to understand the security properties of that code than for an integrator. If I'm an SME, a very small company, and I integrate lots of components into my product, it's of course extremely difficult to go through each and every one of these components and understand their security properties. So this is the rationale for covering components. We understand, of course, that the security posture of a component changes depending on the environment into which it is integrated.
That's clear to us, of course. But at the same time, there are certain vulnerabilities that are vulnerabilities regardless of where the component is integrated, right? So there is a case to be made for fixing those vulnerabilities already at component level.

And just a quick question, maybe you could answer this; this is just me responding to that. You said, of course, there are many black-box proprietary components. That's fine. But in terms of components in general, if you think in percentages here, this comes back to the question of this area being about open source first.

You were talking about screws, about the components. So let's talk about components. Everybody has seen this topic with Log4j; I'm pretty sure everybody is aware. I'm not blaming anybody, but it's well known. Almost nobody, I would say 99.99%, developed the code themselves; they used it. It's a component, a part of a library. So to get the fix, you had to wait for the fix to this component and then integrate the fixed component into your own solutions. When this vulnerability came up, at Bosch there was a complete scan across the entire organization. And to be honest, it took a certain amount of time, more than it should have. But we checked all the products this component is in, and then took corresponding measures to fix it in the final product. This is a real task, and therefore the components are definitely relevant. But that is also the reason why we typically use open source which is well maintained. If somebody comes up with some open source component and simply wants to build it into a product, that doesn't work in our company. But, to be honest, ours is quite a big company and can afford to do all these analyses and everything. We have all these stable processes in place.
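The organization-wide scan described here is, at its core, a lookup over software bills of materials (SBOMs). A minimal sketch, assuming simplified component entries with just name and version fields; the product names, the naive version parsing, and the "2.x before 2.15" cutoff for CVE-2021-44228 are illustrative simplifications, not a complete vulnerability check:

```python
# Versions are parsed naively: "2.14.1" -> (2, 14, 1); pre-release tags
# such as "-beta9" are reduced to their digits. Good enough for a sketch.
def parse_version(v):
    parts = []
    for p in v.split("."):
        digits = "".join(ch for ch in p if ch.isdigit())
        parts.append(int(digits) if digits else 0)
    return tuple(parts)

def affected_by_log4shell(component):
    """Simplified check: log4j-core 2.x before 2.15, which contained the
    first fix for CVE-2021-44228."""
    if component.get("name") != "log4j-core":
        return False
    ver = parse_version(component.get("version", "0"))
    return (2,) <= ver < (2, 15)

def scan_sboms(sboms):
    """Return the products whose bill of materials lists a vulnerable
    log4j-core component."""
    return [
        product
        for product, components in sboms.items()
        if any(affected_by_log4shell(c) for c in components)
    ]

sboms = {
    "gateway":    [{"name": "log4j-core", "version": "2.14.1"}],
    "dashboard":  [{"name": "log4j-core", "version": "2.17.1"}],
    "sensor-hub": [{"name": "zlib", "version": "1.2.11"}],
}
print(scan_sboms(sboms))  # ['gateway']
```

The hard part in practice is not this lookup but having accurate, complete SBOMs for every shipped product in the first place, which is exactly why the scan "took more time than it should have".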
But for smaller companies, I would say KMUs (SMEs) or something like that, this could become a real, real big challenge. But again, I emphasize: our company is fully liable for the product, including the screws, including the software components. Therefore, if we have an issue with a software component coming from somewhere else, either open source or from a third party or whatsoever, in the end we are liable. And if we got the code or the software from a third party, then we have to negotiate with that third party. For example, if a recall happens based on this, we need to check with our supplier how he will compensate us, to some amount, with regard to the recall. But we cannot put the entire burden of a legal recall onto a screw manufacturer.

Especially if the screw manufacturer gives the screws away for free.

No, that doesn't happen, because there is material involved, and material costs are high.

So I think it's important to stress that in all those cases, it is the final entity that places the product on the market and sells it that will be liable. And it is fair to say that if you're a very small company and you've used loads and loads of open source and you haven't really bothered to check or do anything about it, you just use random stuff, then yes, under the CRA, or even today, you should be having a problem, because you're basically, I don't know, a company making toys which are a little poisonous, because you just take any random things. Of course, as a society, we don't want that. So yes, one of the hard bits for the whole IT industry is that you need to get your act together, and that isn't going to be free. And that also holds if you are a small company which is selling that whole stack. And of course, there is a whole ecosystem of companies that support open source or that provide packages of it; there's a whole industry which caters to that.
So I just want to say, because for some reason I don't have a clock, but I've noticed some people leaving, so I bet we're solidly over time. I would propose, and I've already told some people this, that those who still want to discuss these things eat up the rest of the breakfast we have here, because otherwise it's going to end up on my desk and then I'll just eat all day. But okay, Ian, last point, and then if it's a really tricky question, we'll have to move it over there.

Yes, I just wanted to mention that many of the issues you have talked about today also come up with the AI Act. Although that was much more focused on, for example, placing on the market, the Council and especially the Parliament, especially since ChatGPT got so much attention in the media, have also been debating what needs to be regulated more upstream, in effect, in terms of what we're using, and might that affect open source software? I don't know how closely you've been following that debate, but maybe it's something to discuss over coffee as well.

Or maybe it's yet another breakfast event that we'll have to have in the coming weeks or months. Well, so just to round this off, thank you so much for coming. We'll continue the discussions like we've said. There's quite some time pressure. There was no question. Oh, there was a question. Okay.

Yeah, no, it was a slight objection. When it comes to liability versus responsibility: in the end it's a regulation, so there are legal obligations. It's a product safety regulation, a classic product safety regulation. So all those economic operators, developers, and I've been working for developers, spending the last year thinking about software, all software developers in the scope of the regulation have legal obligations. On top of that, there's the Product Liability Directive, which is imposing strict liability on all types of software, embedded and non-embedded.
It's quite worrisome from our point of view. But it's irrespective of the CRA.

Yeah, but the CRA provides a little bit of relief there. If you have a product with a piece of software in it that you see is certified, then the product takes that into account, because it's a presumption of conformity. Is that it?

So... it's market access. It's market access. We are not supposed to put anything onto the market that is not compliant with the rules.

Exactly. So the product always takes that into account. It's a presumption of conformity. And that's a good thing, right? It's not a bad thing. It's a good thing.

Yeah, it's a good thing. But there's a connection. I see it as a connection. And eventually there will be a ripple effect; in the end it would land on the price of software in Europe, and that would land on consumer products.

And I think, just to kind of tie it all together: like Benjamin said, there's the CRA, but there are others, separating it from, let's say, the PLD, but also, like Ian referred to, the AI Act. In my view, essentially, what we're talking about here, and it's not that obvious, I think, for everyone involved, is that the software market is being regulated. Elements of it are starting in Europe; other things we'll see elsewhere. But we don't yet have those bridges and conversations properly in place to have the open source conversation. That hasn't been sorted out. And I think it takes two to tango; I think we've concluded that. The general open source ecosystem, the many diverse organizations, need to step up and mature their thinking and approaches towards, let's say, the European Commission, national governments, regulators, et cetera. But I also think there are some early steps from the Commission: building the open source program office, but also building the institutional capacity to take in the massive diversity, because when we talk about regulating software, it's about open source first.
I think that's an important thing to conclude. And it might require more attention than a recital. Thank you very much. Let's go out and get some coffee. I really implore you to get some coffee. I need some more croissants. Thank you very much for coming, and we'll continue the conversation.