Welcome, and thanks everybody for joining today. This is a highly relevant and pertinent topic and a recent development as well. So I'm glad that we have the opportunity to discuss the subject in some depth today. One second. First of all, can you all hear and see me well? Yes. I think people cannot respond in chat. So I assume that yes. Who am I? I've been working in open source for about 25 years as a contributor and community manager in various areas. My role at the Linux Foundation is senior director for community development at Linux Foundation Europe, and I also teach open source intellectual property at the Technical University in Berlin, where I'm also from. So welcome. What's Linux Foundation Europe? Linux Foundation Europe is the European chapter of the Linux Foundation. We build opportunities in the region for our participants to contribute to the Linux Foundation's global projects. We amplify the reach of Linux Foundation projects in Europe, and we represent the Linux Foundation and the open source ecosystem as much as we can in European policymaking, science development and research development funding programs like Horizon Europe. And as such, I've spent quite some time in Brussels to work on the Cyber Resilience Act as it came up. And what you will hear today is the current version of the text, which is based on the parliamentary vote just three weeks ago, on 12 March. So this text is not 100% final, but most of what you hear today will already be what you will have to later implement. As for the competition policy notice here, please feel free to look at our website, and please make sure that we don't say anything here that is, for example, private to your company; and with that I've kind of also already mentioned the code of conduct. So let's jump right in. I would like to first provide an overview of the Cyber Resilience Act, what it is. And yeah, as I said, please keep in mind that we're waiting for this to become law.
It has been approved by the European Parliament, but some things may still change. But I think that the basics are relatively stable and will not change anymore. So the Cyber Resilience Act is a horizontal regulation for the European Union internal market that lays out essential cybersecurity requirements that all businesses, and also those communities that operate in Europe, will have to implement. And it's the first regulation of this kind and this scope. It covers all digital products, meaning hardware and software and combinations of them. And this is a novelty: it defines separate roles for manufacturers and for open source software stewards, in how they should work together and who carries what obligations in bringing digital products into the market. We will cover today basically who is responsible for what in that concert between manufacturers and stewards, what the essential requirements and obligations are, and what we as an institution can do to work with you to implement these obligations. So the Cyber Resilience Act's main goal is to strengthen cybersecurity in Europe, first of all, but also for the whole world. It has three detailed policy goals. It's supposed to reduce vulnerabilities in digital products, to ensure cybersecurity is maintained throughout the product's lifecycle, and to make sure that users can make informed decisions when they select products and when they operate them. So these three goals are kind of the pillars, the design principles behind the Cyber Resilience Act. And I think we can all sign up for them. And that, I think, is the big contribution that the Cyber Resilience Act will make to the ecosystem: it will enshrine rules in European law, for example that security updates need to be provided for the lifecycle of a product. One thing that is very important to understand from the start is that the Cyber Resilience Act does not say a European business has to do this or that.
It says: if you want to introduce products into the European market, if you want to sell them, if you want to do business in the European Union internal market, then you have to fulfill the requirements of the Cyber Resilience Act. That's what is called a horizontal regulation, meaning it applies to everybody who operates in Europe. And it doesn't matter if your business is headquartered in Europe or on a different continent. If you want to sell products in Europe, then the CRA is relevant to you. And this is one of the key reasons why we say we need to start looking into this today, no matter where we are, because Europe is quite a big market. So it will be relevant for pretty much all manufacturers. And there's also one thing to keep in mind here: it says in the text that the EU intends to play an internationally leading role with this regulation. We've seen this in other EU regulations in the past, for example in the GDPR, where maybe the international effect was still kind of accidental, but the GDPR did change how we handle user data and privacy in digital products all across the globe. And the US learned from that. And with the Cyber Resilience Act, the EU intends to play this role from the start. So I think the key takeaway at this point is: it doesn't matter where your business is located, and it also doesn't matter where your open source community is located. If you make software or other digital products available in Europe, then the CRA is relevant to you. So let's go a little bit into the details of how the CRA goes about implementing the policy goals that I've mentioned. First, it classifies products into three groups based on their criticality for cybersecurity. Every digital product is supposed to be developed according to the essential cybersecurity requirements. That's one annex, and I will give you a few examples of it later.
It's an annex to the Cyber Resilience Act that basically just lists what you should keep in mind when you build products. And this applies to all products, because there's an assumption that if, for example, a product that is not very critical, like a small device that you may use, has security vulnerabilities, these security vulnerabilities can also proliferate and open vulnerabilities in other devices that are networked with it. So that's why all products need to be developed according to the essential criteria. And since the CRA speaks of digital products and not just software or hardware, it basically says every device, every piece of software that ends up in the hands of the consumer is covered by this. So we are saying all the Linux Foundation projects that release open source software are covered by the CRA, all digital consumer devices, and all software installed on consumer devices. There's a question of what software in the cloud is covered by the CRA. And that's described as the software that is basically essential to operate the device in the hands of the consumer. So for example, if the device that you use in fact operates with a cloud service that stores the configuration and the account that you log in with, and you have to do that to use the device, then even that is covered by the CRA. Pure cloud services that don't end up in the hands of consumers are regulated by different regulations, not by the CRA. The next class of products is products that have the potential to have a bigger impact if vulnerabilities are identified in them. For example, web browsers or children's toys, devices for vulnerable consumers. Here, compliance with the CRA needs to be either demonstrated against harmonized standards or audited by a third party.
And for that, there will be a network of accredited assessment bodies, essentially consulting companies and service providers, that will be able to certify compliance with the CRA for your business when you introduce products into the market. And then there's a third class, and those are products where there's a real possibility that problems with those products proliferate into the whole network, like operating systems, networking devices, routers, VPN gateways, things like that. This is the most critical class of products under the CRA. Here, the assessment always needs to be done by a third party. Now, keep in mind that bringing products into the market is something manufacturers do, not open source communities. So these requirements for products essentially land on the party that brings the product into the European market. So as I said, manufacturers should develop all digital products according to the essential requirements of the CRA, since less critical devices may serve as a springboard for security attacks that can then target other devices. As for the stricter requirements, there's basically a cascade of requirements, and the stricter requirements start with the second group, for example products targeted at vulnerable consumers, and then you have the strictest where exploiting devices can cause wide damage. So this applies to all products, and it just gets tougher if the products are more critical. So here are some examples of the essential cybersecurity requirements, and they're only examples because the list is much longer. These are the basics that every manufacturer has to keep in mind. The products should all be designed and developed in accordance with these essential requirements. Designed and developed is important here, because it is important to be able to demonstrate that even during the development phase of the product, the essential cybersecurity requirements have been kept in mind.
When products are introduced into the market, they should be introduced without known exploitable vulnerabilities. So at the time you ship a product, you should have fixed all known exploitable vulnerabilities that you're aware of. And of course, things can happen at a later time. You can identify new vulnerabilities at a later time, but at the time you ship the product, you should have them fixed. Products should be made available with a secure-by-default configuration, meaning if you have a choice between, for example, being potentially easier to use but less secure, versus being more secure, then you should err towards being more secure by default. As for vulnerabilities, your devices should be able to fix them through security updates, ideally in an automated way and ideally over the air. So products need to be built in a way that vulnerabilities can be identified. They should also be fixable, ideally in an automated way and remotely. And that of course needs to be safe from unauthorized access. So in short, many great practices that we have applied in the past to the development of digital products, and also to free and open source software, are now enshrined in law: secure-by-default configuration, the regular provision of security updates, et cetera. So these things will become requirements where in the past you may have done this anyway, if you maintained your products in a great way, but now there will be a requirement for it. One thing that is important is that we're talking about the European Union here. The European Union consists of many member states, and the idea is that the approach to cybersecurity is harmonized in the European Union market. Because of that, the Cyber Resilience Act contains clauses that say that member states cannot impose additional cybersecurity requirements on market access on top of the CRA.
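To make the secure-by-default idea concrete, here is a minimal sketch in Python of what it can look like in practice. This is an illustration, not anything prescribed by the CRA text: the configuration fields and their names are invented, and the point is only the pattern that every default errs toward security and risky features are opt-in.

```python
from dataclasses import dataclass, field
import secrets

@dataclass(frozen=True)
class DeviceConfig:
    """Hypothetical device configuration where every default errs toward security."""
    require_tls: bool = True            # encrypted transport is on by default
    remote_admin_enabled: bool = False  # risky features are opt-in, not opt-out
    auto_update_enabled: bool = True    # security updates are applied automatically
    # No shared factory password: each device instance gets a unique random credential.
    admin_token: str = field(default_factory=lambda: secrets.token_urlsafe(32))

# A freshly created configuration is already in its most secure state.
cfg = DeviceConfig()
assert cfg.require_tls and not cfg.remote_admin_enabled and cfg.auto_update_enabled
assert cfg.admin_token != DeviceConfig().admin_token  # unique per instance
```

The design choice worth noting is the per-instance random credential: a shared default password across all devices is exactly the kind of insecure default configuration the essential requirements target.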
So the idea is that, yes, the Cyber Resilience Act is difficult to implement, but if you do it, you have access to the whole European market. There are some caveats here, because there are some fields that the European Union cannot regulate for the member states. For example, defense or national security stays within the realm of the member states. So if they want to impose additional rules for products that are used in defense, that is possible. But for general consumer products, the CRA is the authoritative regulation. There are also more specific regulations on top, for example for medical devices, for radio equipment. And these currently all stay in place even with the CRA. However, there's an encouragement in the text that says they should be harmonized with the CRA as they get updated over time. So basically, the takeaway here is: the Cyber Resilience Act applies to all the European member states, and it unifies access to that market with regards to cybersecurity, except for some specific areas like defense and national security. So now, we all know that today most digital products consist of a large share of free and open source software and then a smaller share of proprietary software that puts this all together, integrates it and develops the functionality that is then provided to consumers. We assume that in most devices, about 80% of the software integrated is open source software and 20% is developed on top by the device manufacturers to integrate the open source components and to build the consumer functionality. And that creates a relationship, when it comes to bringing digital products into the market, between upstream open source communities and manufacturers, where we have to ask: at what point in time is a product being made available in the market in a commercial fashion, so that the obligations on manufacturers apply? And the Cyber Resilience Act regulates it this way.
It says that the provision of open source software products that are not monetized is not considered a commercial activity. So it's not considered the activity of a manufacturer. Similarly, development by non-profit organizations is not considered a commercial activity. The numbers behind the bullets are the recitals in the CRA text as it stands today. Manufacturers that take open source components and integrate them into products should exercise due diligence when they do so. That means the responsibility for cybersecurity and for vulnerabilities in the product that a manufacturer introduces into the market lies with the manufacturer. And that means manufacturers need to, for example, assess open source components that they integrate. They need to understand what these components are, document that, and make a decision whether it's safe to introduce that product, with the open source components integrated, into the market. And to facilitate this relationship between upstream and manufacturers, the CRA suggests that we apply voluntary security attestation programs to support the due diligence process. So the suggestion is that upstream communities voluntarily provide security-related product documentation, essentially, for open source components that manufacturers can then use. But it's very clear in the text that the work of the upstream open source communities is voluntary support, and that the responsibility and the due diligence lie with the manufacturers. And the result of this delineation of responsibilities is a novelty in EU law, which is a separation of two economic operator roles: that of manufacturers and that of open source software stewards. Manufacturers are companies that bring products into the market. They market them, they have the products carry their trademark, they monetize the products; basically, they act as manufacturers, as businesses. And manufacturers under the CRA have the full range of obligations.
And I will go into more detail here on the support period, on how to provide fixes for vulnerabilities, how to disclose them, et cetera. So all these obligations are on the manufacturers. And in contrast to that, in the law, we have open source software stewards, who are supposed to be regulated with a light-touch regime. That's what the law says. An open source software steward is a legal person other than a manufacturer, so not the same, which has the purpose to systematically provide support on a sustained basis for the development of specific free and open source software products and ensures their viability. So this very clearly says that an open source foundation, like the Linux Foundation, acts as an open source software steward for the projects and foundations that it hosts. So one key takeaway here is that a steward is a legal person other than a manufacturer. A manufacturer cannot, at the same time, be a steward. If you, as a manufacturer, maintain an open source project, then you still have the responsibilities of a manufacturer. And that's one impact that we expect from the Cyber Resilience Act: it will require us to make the distinction between who's a manufacturer and who's an open source software steward. It requires us to make that more clear. So, I'll get to the questions at the end. Thank you for raising them. Let's get through the basics of the CRA first. I would like to summarize a couple of takeaways for manufacturers. The first one is that open source projects, even the ones that later become commercial products, change their nature over time. So I tried to visualize that here as, like, the tree of life of free and open source software projects. Many projects start as a pet project that a talented developer creates on the weekend as a hobby. And initially, they're the only contributor. Technically it's a single-vendor project, in a way.
And very early on, often the decision is made: will this be a small open source community where we jointly provide stewardship for the project over time? Or will this form into a small startup company that will go to market as a commercial manufacturer? If you decide to develop this project as an open source community, then it might grow further, and then maybe you establish a nonprofit organization that provides the fiscal home for this project. And we know of many projects that went this route and are very well-established open source projects today. Or you may join one of the open source foundations to have your project under a neutral home and be able to invite any interested stakeholder or contributor to this project. And this is all on the side of stewardship. So at the point where the project changes from, yeah, a hobby project, or a single developer who works on it on the weekend, to even a small community that jointly cares for it, we go the route towards stewardship. If the decision is to make this into a startup, then very early on, as soon as there's commercialization or monetization of the software, we go down the route of manufacturing. So it doesn't matter if the software is released in this context as open source software, if the activities of the entity that develops it are clearly commercial. So if there's monetization, if there is a commercial version, like an enterprise version of the software, and maybe the core of the software is released as open source by the same company, that's clearly a manufacturer. And from here, if the startup is successful and it grows, it may grow to be a small or medium-sized company that still contributes to open source software but is still considered a manufacturer, or it can grow to be very big. But basically, there are two things that we need to keep in mind. One is that projects change their nature over time.
And I drew two lines here in this visualization, basically showing where you cross these lines: from "this is a hobbyist project" to "this is a small community or a startup". This is where you go from an individual developer to a group, either a nonprofit steward or a company. And then as you grow, you're basically either a steward or a manufacturer at different sizes. So, this is something to keep in mind: everybody should be aware of the role they play at the time. Am I still an individual developer, or am I already a small company? Is this a small community, or do we have a legal entity? And this should be made very explicit, because your obligations under the CRA change over time with that. So what are the obligations of individuals? Individuals, meaning hobbyists, occasional contributors, as long as their participation remains non-commercial, are exempt from the CRA. So in the example I gave before, the hobbyist developer who comes up with a great idea on the weekend, as long as that's an individual project, even if it's published on GitHub, is still exempt from the CRA. However, when a community or a business then forms around this, this may change. The second item, and this is very important, is that contributing to projects that you are not responsible for is also exempt from the CRA. Meaning, if somebody contributes to a project that is, for example, hosted at a foundation, the responsibility for that project lies with the foundation, with the steward. Similarly, if a company acts as the party responsible for the project and you contribute to it, it's the company that is responsible for the release, not you. So contributing to other legal entities' projects is not covered by the CRA. But now the important caveat is that individual developers may also be manufacturers, if you think for example of a single-person business, and they can also be stewards, if you think of long-term maintenance of projects.
So the takeaway here is: projects grow from ideas to large communities or businesses. And in that process, they change their nature from being projects by individuals to either manufacturers or stewards. And we need to understand which they are at what time. So, I mentioned the obligations of manufacturers, and I would like to go a little bit into the definition of the support period and how software updates are supposed to be provided. The first takeaway here is that manufacturers must supply vulnerability fixes throughout the support period of the product. That means if you become aware of a vulnerability, you need to fix it, and you need to deploy that fix to your consumers. And products need to be designed to support software updates, especially consumer products, and ideally in an automated way. So it should become a habit that if a vulnerability is identified, you provide a fix, and you have a way, a mechanism, to roll this out to all the users of the product, ideally overnight. If you communicate that the support period for a device ends, this must be done without restricting the functionality available to the user. So you cannot say: sorry, your support period has ended, and because of that we're disabling these ten different functions on your device until you buy an upgrade. The functionality that the consumer has paid for essentially needs to be maintained, and security updates need to be provided. And that support period should be no less than five years. There's one exemption, and that is if products have a shorter kind of natural lifecycle. So for example, if you have a medical product that can only be used for three months and then needs to be refurbished because it stops working, then the support period doesn't have to be longer than that. On the other hand, if a product can reasonably be expected to be used for longer than five years, then the support period automatically also extends.
And this is of course relevant for manufacturers of, say, network equipment that is supposed to run for a very long period of time, or of vehicles that have a road life of 15 years. If consumers can expect that the product can be used for such a long time, then software updates need to be provided for the expected lifetime of the product as well. So, takeaway: vulnerability fixes must be provided throughout the support period. They should be provided automatically and for as long as the support period lasts, and that's at least five years, and it can be longer if your product can reasonably be expected to work longer than that. As a manufacturer, you are also obliged to disclose vulnerabilities when you identify them. So if you get a report from a consumer that they identified a vulnerability in your product, and that vulnerability is actively exploited, then you need to notify the computer security incident response teams, the CSIRTs, and ENISA in Europe about this vulnerability. So basically, it's not just about the consumer; you will have to disclose the vulnerability to the authorities. There will be a European vulnerability database where critical information about vulnerabilities is shared, so that the damage caused by the vulnerabilities is restricted or limited. Importantly, vulnerabilities that you discover in good faith, for example through penetration testing or by properly testing your devices, don't need to be reported, but you need to provide the fixes. There is a provision to apply for brief delays. So for example, if you're saying, it's Friday today and we'll work over the weekend and we're sure we'll have a fix for this vulnerability on Monday, can we please not disclose this to the rest of the world until we've got the fix? That's possible. How exactly this works will be defined in standards that are still being developed. There's a provision for that.
Manufacturers need to provide a vulnerability disclosure policy, which is essentially a point of contact for consumers of their devices to report vulnerabilities or to inquire about them. So ideally there should be a website or something where you can ask: if I use this device, what are the vulnerabilities that are currently known about it? An interesting caveat is that manufacturers are required to draw up SBOMs, but they're not required to make them available to the general public; they are, however, required to provide them to the authorities on request. Which means, I'm assuming, that manufacturers need to have SBOMs for their products, because otherwise you can't provide them on request. So now, this is all just what needs to be done. What happens if you're not compliant, and how do you demonstrate conformity? Conformity is signified by the CE marking, which you may know already from the various products that are shipped in Europe. So far it's primarily on physical devices, and the CRA extends the CE marking requirements to digital products of all kinds. So basically you're saying, I assure my products are compliant, and then I show the CE mark on the product, or for software, somewhere in a visible way. One thing that's important here is that the way software is built and shipped, compiled and packaged, is actually part of the production process. And so essentially your CI and the way it's run is part of the conformity assessment. There will be a regulatory sandbox for being able to dry-run conformity assessments for manufacturers, and that will be provided by the EU. And, as I already said, there will be an accreditation process for conformity assessment bodies. Yeah, and if you fail at that, then there will be effective, proportionate and dissuasive penalties. The penalties are scary if you look at them: five to 15 million euro, or one to 2.5% of global turnover in the prior fiscal year.
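To give an idea of what such an SBOM can look like, here is a minimal sketch in the CycloneDX JSON format, one of the widely used SBOM formats, assembled in plain Python. The component entries are invented examples, not a real product inventory, and real SBOMs are normally generated by tooling rather than written by hand.

```python
import json

# Minimal SBOM sketch in CycloneDX-style JSON; the components listed here
# are made-up examples of third-party dependencies a product might ship.
sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.5",
    "version": 1,
    "components": [
        {
            "type": "library",
            "name": "openssl",
            "version": "3.0.13",
            "purl": "pkg:generic/openssl@3.0.13",
        },
        {
            "type": "library",
            "name": "zlib",
            "version": "1.3.1",
            "purl": "pkg:generic/zlib@1.3.1",
        },
    ],
}

document = json.dumps(sbom, indent=2)
assert json.loads(document)["components"][0]["name"] == "openssl"
```

The point of the `purl` (package URL) field is that each component is identified unambiguously, which is exactly what an authority would need to check a product's dependencies against known vulnerabilities on request.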
So in short, they can be quite substantial. They should take circumstances into account and be proportionate. And one thing that's important, because I think it's relatively new as an approach in Europe, is that it's possible to enforce the requirements of the CRA through representative actions in the collective interests of consumers, which basically means consumer groups can join in enforcing compliance with the CRA. So, this was an overview of what manufacturers need to know. And I'm sure there are also people here who see themselves more in the open source ecosystem. So now I would like to switch a little bit to what the role of open source software stewards is in this context. The key point to take away here is that the upstream project is hosted under neutral governance. That's the assumption, if the project is independent from a manufacturer, from a business. So an open source foundation like the Linux Foundation provides a neutral home for a project, and manufacturers take the project from there and integrate it into their products. We have to keep in mind, of course, that the contributors to the projects are usually not employed by the foundations. The foundations host the projects, but the developers and the maintainers are somewhere downstream, or with service companies, where the software is being developed. That means there's an interesting dynamic here, where the stewards of a project can help, for example, to integrate the fixes for vulnerabilities, but they can't necessarily develop them themselves, because they don't have the developers. So both the maintainers and the contributors usually work downstream, in the businesses that use the software, or in open source communities and other setups.
And I provide a link here to an interesting blog post that basically says open source maintainers owe you nothing, which is a bit harsh, but you don't have a way to say: you must fix this bug for me within three days because I need to ship a product update. So what we have to develop here, in the relationship between the upstream open source ecosystem and manufacturers, is a way to work together to be able to implement the requirements of the CRA. And this is why we titled this presentation as a new era, in a way, for the interaction between open source communities and manufacturers. Okay, so who is the open source software steward? First of all, it's the legal entity. So in the case of the Linux Foundation, it's, for example, the Linux Foundation or Linux Foundation Europe, and the projects that are hosted within the foundation, as long as they don't have their own legal entity, are under the umbrella of the open source foundation. And as such, the foundation, or the steward, has to provide a single point of contact for reporting and inquiring about vulnerabilities. The organization has that as a point of entry for security reports. So if vulnerabilities are reported to the upstream community, then we'll make them available to the development community to be able to fix them, essentially. Stewards need to implement a cybersecurity policy and make it widely known, basically: for example, how vulnerabilities should be reported and how you can inquire about them. A key responsibility is to cooperate with market surveillance authorities on their requests, for example on known vulnerabilities and the distribution of the software and its releases. And then open source software stewards will have to notify widely about reported vulnerabilities. Now, this is already, for the most part, good practice in open source communities.
Of course, if a vulnerability is reported, then at the same time we'll work on a fix and we'll work on a responsible disclosure, ideally when the fix is available, so that the fix can be integrated into products. A more organizational takeaway for software stewards is that they need to document the non-profit character of the organization, because there are some details in the text that say it's okay for stewards to have income, revenue, but the non-profit character of the organization needs to be maintained. So this is roughly what stewards need to do. And it's important to understand what is not in this list. There's no requirement in this list that if a manufacturer reports a security vulnerability, the open source software steward has to fix it. Because that's not how open source works. And I think this is also why the European Union here separates these two roles: where the project is hosted in a neutral environment as open source software, and where the manufacturers who use it then need to contribute to, for example, fixing security vulnerabilities. And this leads to a practical requirement, I would say, of collaboration to implement the lifecycle support for digital products that use open source. The best way to ensure the viability of an open source dependency of a digital product is to participate in the governance of that project. Think of an open source project being discontinued: maintainers go away, security vulnerabilities don't get fixed. That's a bad situation if you're required to provide security updates through the lifecycle of the product. So for key dependencies, it's going to be important to engage in those projects and to ensure that the viability of the project is there for as long as the project is supposed to be used.
And if it becomes obvious that a project may be discontinued at some point, then we will collectively have to work on a replacement, so that from the consumer's perspective, vulnerabilities can still be fixed and software updates provided. By participating in the governance of projects at the steward, member companies can gain influence on the project roadmap and of course on the contribution process, basically ensuring the viability of the project. And so the homework for the manufacturers is to identify the key dependencies that they use from the open source ecosystem, to identify who the stewards are, and then make sure that you work together to ensure the viability of those projects. So this is why we think that the Cyber Resilience Act is triggering a more, let's say, elaborate collaboration between open source foundations and manufacturers, with a much more long-term and sustainable relationship. And it enables that by defining these separate roles. Okay, so this was the overview of the responsibilities of stewards. We talked about manufacturers before. I will provide a little bit of an outlook on where we stand with this and what comes next, and then we can go to the Q&A. One thing that's important to know: as I said, the text we're talking about today is from 12 March, not quite four weeks ago, and it's not yet in effect. We don't expect the text to change much from everything that you've heard today. It's probably going to come into effect as we discussed it, but there are a lot of details missing. To fill in these details, there will be a standards development request for 44 different standards that are related to the Cyber Resilience Act. And many of the details will be implemented in those standards, defined in those standards. There are two main groups of standards. The first group is general approaches or requirements, for example, how do we deliver a product with digital elements without any known exploitable vulnerabilities.
So a baseline requirement that should apply to all products. And there are a couple of these, about 10: how do you build products with limited attack surfaces, and things like that. And then the larger group, about 33 of them, are requirements for specific product types. So for example, the essential cybersecurity requirements for browsers or for microprocessors. And here, as an organization, we're currently looking into participating in those standards development groups with our projects and foundations that are working in the specific field. But essentially this means that in the next three years, in the implementation period of the Cyber Resilience Act, these standards will be defined. And then many of the details, for example how you document compliance, because that's based on the requirements that are listed there, will become clear. So the basic requirements are known today. Many of the implementation details will come from the standards development process. And the Linux Foundation Europe is engaged there in various ways in European standards development. And then, in terms of outlook: as I said, the text that we discussed today was approved on 12 March. It's still subject to a lawyer-linguistic review and the translations are ongoing, but it's mostly final. This is the first union-level law in Europe that models open source software stewards separately from manufacturers and actually speaks of free and open source software. So I think this is really a novelty, and I do expect that this approach will proliferate into other EU regulations as well. The CRA should come into effect probably in the middle or the second half of this year. Then we have 21 months of a transition period before the vulnerability reporting obligations begin. And then all the obligations come into effect three years later, so in early 2027, we assume. So thank you. This was my overview of the Cyber Resilience Act and why it really changes the game.
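As a concrete illustration of the "homework" mentioned earlier, identifying the key open source dependencies of a product and who their stewards are, here is a minimal sketch in Python. The manifest format, the hand-maintained steward registry, and all names in it are illustrative assumptions for this example, not anything the CRA itself prescribes.

```python
# Hypothetical sketch: map a product's open source dependencies to the
# organizations (stewards) behind them, and flag the ones where no
# steward is known -- candidates for closer due diligence.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Dependency:
    name: str
    version: str
    steward: Optional[str]  # e.g. a foundation, or None if unknown

def parse_manifest(text: str) -> list:
    """Parse a simple 'name==version' manifest (requirements.txt style)."""
    deps = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        name, _, version = line.partition("==")
        deps.append(Dependency(name, version or "unknown", None))
    return deps

def annotate_stewards(deps, registry):
    """Attach steward info from a hand-maintained registry mapping."""
    for dep in deps:
        dep.steward = registry.get(dep.name)
    return deps

def unstewarded(deps):
    """Names of dependencies with no known steward."""
    return [d.name for d in deps if d.steward is None]

# All names below are made up for illustration.
manifest = """
# example manifest
libfoo==1.4.2
libbar==0.9.0
libbaz==2.0.1
"""
registry = {"libfoo": "Example Foundation", "libbaz": "Example Foundation"}

deps = annotate_stewards(parse_manifest(manifest), registry)
print(unstewarded(deps))  # prints ['libbar']
```

In practice the registry side would come from the projects themselves, for example their published security contacts and governance pages, rather than a hard-coded dictionary.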
I would now look at the questions we have and see if I can answer them. Okay, let's start with the first one. Is the software behind web services, websites, e-commerce, et cetera, operating in the EU market affected by the CRA? This is what I tried to lay out earlier. If the software is required for a device to work, then it's covered by the CRA. If it's only a web service on its own, then it's not. I'm not a lawyer; this is my very simplified way to explain it. But basically, if the user operates a device and as part of that interacts with a cloud service, without possibly even knowing about it because that happens behind the scenes, then that cloud service is covered by the CRA. If it's only e-commerce, et cetera, then there are other regulations that cover that. So I think we answered the first question. The next one would be: how can developers know upstream products are at risk? Does this mean the developer is liable for all upstream issues? What's important to know here is that every product is seen as a whole. So you integrate various open source components, you add your own proprietary software, and then you release a product into the market. Which vulnerabilities are exposed by this product depends very much on the configuration of the product: hardware, software, attack surface, et cetera. So it's not one-to-one the sum of all vulnerabilities of the open source components, because some of those might not be visible to the user or exposed. So how can the developer know about vulnerabilities from upstream? That's where the disclosure and handling policies that the stewards are supposed to implement come in. There's supposed to be a point of contact where you can ask what the currently known vulnerabilities are. And then you have to make a decision as a business to release that product into the market or not. And that's your responsibility. Will this inhibit the use of open source software?
I don't think so, because you would just as much have to do the same kind of due diligence and assessment if you replaced your open source dependencies with proprietary dependencies. And the question is how you can then do that. So the setup, especially the two operator roles, steward and manufacturer, is designed to enable collaboration between the two groups. But the responsibility for the cybersecurity of a product introduced into the market is clearly with the manufacturer. So I hope I've answered this question. The next question is: what defines a subject as an open source steward? That's actually in the law. An open source steward is a legal entity that provides long-term sustained support for specific free and open source software products and ensures their viability. So there are definitions in the text. The text is public, you can already look at it. And there's a paragraph that says what we understand as a manufacturer and what we understand as a steward. I think the key message, and I mentioned that in the presentation, is: a steward is a legal entity other than a manufacturer. So an entity cannot be a manufacturer and also be a steward at the same time. Next question: would a company like Red Hat be a manufacturer? Yes, I think so, because the product is monetized. So Red Hat, in this specific case, sells support contracts for the software, sells enterprise versions, et cetera. So from our perspective, that is clearly a manufacturer setup. Red Hat contributes to many open source projects, and I know that Red Hat is looking into how this can be done in a transparent way by a manufacturer. Here, of course, the rule that if you contribute to a project that is under somebody else's responsibility, you're not liable under the CRA, it's the other party, is also helpful for Red Hat, because you can contribute to many open source projects that are not under your own responsibility.
I think the takeaway for manufacturers is: if you are responsible for open source projects, then you need to think about how to do this in the long term. Would you like to continue being responsible for them, with manufacturer obligations? Or do you move those projects under neutral governance, at a steward in a non-profit environment, so that you're not responsible for them? So I hope that answered the question about Red Hat. The next question is: within a manufacturer or a steward, the developer that writes the code or fixes the bug, is that person the manufacturer or a steward? Well, that really goes back to the slide about individual developers contributing to other people's projects, et cetera. I don't think the CRA says much about who writes the code. It says who is the steward of the software and who is the manufacturer that releases a product into the market. And that means a contributor to an upstream project is not covered by the CRA; that person is not a manufacturer or a steward, just a contributor to the project. The upstream project, when it makes the release, acts as the steward; the manufacturer that integrates the software is then the manufacturer. So that's a bit of a complicated answer, but I think it was mostly in the presentation. I hope that's clear. And then: what will happen if a project cannot find a steward, one who will take on the project, I assume? Well, that's a very good question. I think the assumption here is mostly that projects that are viable, that are important upstream dependencies for downstream products, will not have difficulty finding, let's say, a community of manufacturers that would jointly want to maintain them. But there are of course many small projects that have a very small contributor base.
I think to the extent that they can be considered volunteer projects, they will probably not be covered by the CRA, which again puts the due diligence with the manufacturer. And there are, especially in dynamic programming languages, many software components that are very small, some of them not maintained at all. They were just written once, they mostly work, and then they stay where they are. In this case, the responsibility will be mostly with the manufacturers. If a project wants to find a steward and can't find one, that's an interesting question. I actually haven't thought about that question yet, so I have to think about this a bit more. Next question: how would the CRA and BSI KRITIS work together? So BSI KRITIS is a German cybersecurity framework that is legally required. I can't say for sure; I'm assuming that because the CRA is a European Union level law, once promulgated and in effect across the EU it takes precedence, and the German regulations will have to be adapted to it. I am not a lawyer, so I can't really answer this question authoritatively. Next question: what happens if I make an SBOM for my project public? Will the SBOM be under a publication license? And will FOSS stewards have to make the SBOM public even if the manufacturer is not required to? So first of all, open source software stewards don't have to provide SBOMs at all; that's not in the law. Practically, I think if manufacturers are supposed to integrate software from upstream, then there needs to be a way for them to aggregate information about what dependencies they're integrating, to then provide the documentation when their products are introduced to the market. So there may be a practical need for SBOMs, but it's not explicit in the law. Under what license the SBOMs are distributed is not mentioned in the law at all.
And so, to the last part of the question, will open source software stewards have to make SBOMs available and manufacturers not? That's not the case. As I said, stewards are not required to make SBOMs available, and manufacturers need to have them in case the authorities ask for them. So for the whole question of at what point, in what format, and by whom SBOMs should be made available, I expect that there will be more clarity from the standards development process, and for the moment, again, most of the responsibility is with the manufacturer, because the responsibilities come from introducing products into the market, and that's not what open source communities or stewards do. Next question: who has project ownership under stewardship? That's a very good question. The CRA does not fully describe this. It describes what an open source software steward is; as I said, an entity that takes the responsibility to maintain a project over the long term, to make it sustainable, to make it viable, to be responsible for it, usually non-profit. So basically it circumscribes open source foundations. And with most open source foundations, and most definitely with this foundation, projects are community owned. They're basically owned by the community that develops them under the umbrella of the Linux Foundation, meaning the trademarks of the project are held for the project under the umbrella of the Linux Foundation, and we use open source licenses for all the code, meaning that everybody has access to the source code under the license, everybody can participate in the project and contribute to it, and all the assets of the project are community owned. And I think that's the assumption under the CRA: if you want to be a steward, I think the assumption is that the project is owned under the legal entity that is the steward, which operates in a non-profit way.
So for project ownership, I think the expectation is that it sits with the steward, the steward being the legal entity. The next question: if my business does not manufacture products but is for-profit and hosts some projects on GitHub, do we become a manufacturer or a steward? That's a very good question, and I've had a couple of discussions in this direction. I think it really depends on what your business does. You wrote that it sells development and consulting services. But the question is: if you consult your clients on using open source software and integrating it into their products, and you work with the clients but they get the software from upstream, then you don't distribute the software, you don't make it available. I would assume that then you're just a service provider, and since you're not providing the software, you're not the manufacturer of the software. If you, for example, take an upstream project and you make a, let's say, well maintained, well documented, well working version of it, and make that available with commercial support licenses, I would assume that you would be the manufacturer of that version of the software. So I think the takeaway from this question is: businesses that work in this way, like selling development and consulting services, need to learn what their obligations under the CRA are and can avoid being the manufacturer of the software. I think that should be easy with open source software. Okay, there are a couple more questions, and I think we're running out of time, so maybe I will collect some of the questions and turn them into a blog post. Let's see if I can pick one or two. One question here is: what comes into effect in the middle or second half of this year, which obligations? The answer is: none.
So the law as such comes into effect, we assume, in the second half of this year, and then we have a three-year implementation period, and in that period, at some point, obligations kick in, but nothing kicks in immediately when the law goes into effect. The next question would be, let's see: is a system-on-a-chip development or reference platform, running Linux or Android, considered a product, and is the silicon vendor therefore considered to be a manufacturer that has to comply with the CRA? First of all, I cannot imagine that you can sell a physical product, even if it's a reference platform, and not be a manufacturer. The only exception is open source software, which is provided upstream by a steward. I think every physical product will automatically be a product, and the party that sells it is a manufacturer. There's potentially a gray area for, I don't know, testing devices that are not marketed, but that's going to be narrow, because monetization can be indirect. For example, if you want to build a market by providing something, to then monetize it later, that's still monetization. So I think, yes, a manufacturer of a reference platform for development that sells that device is a manufacturer. Okay, so I have more questions, and I will work with our team to collect them all and try to answer them, maybe in a blog post or in a follow-up. I think we're out of time for today. Thank you for attending. I hope this was informative, I hope this will help you move forward, and if you need more support, especially if you're a Linux Foundation member, feel free to reach out to us and we'll try to work with you to implement the Cyber Resilience Act. Thank you. Thank you so much, Mirko, for your time today, and thank you everyone for joining us. Just a quick reminder that this recording will be up on the Linux Foundation's YouTube page later today. We hope you join us for future webinars. Have a wonderful day.