We are here at Open Source Summit in Dublin, and we have with us, once again, Tim Serowitz, director of the training program at the Linux Foundation. First of all, it's great to have you here and to see you in person. We have done a lot of interviews remotely, but it's a different experience to meet folks in person.

It is. Thanks for having me. It's nice to be here.

Now, one of the biggest announcements at the event was PyTorch. So talk a bit about the project itself.

PyTorch was a Meta, or Facebook, project, and there are three main languages or frameworks that are really essential for any AI project to operate. So think of it like the nervous system for AI. By becoming part of the Linux Foundation, this project will attract a lot more people who may not have participated if they felt there was a particular corporate owner. So it's a great project. It has all of the things necessary for the AI world to really come to the forefront. Having the right tooling, tooling that you can count on, tooling that you know will maintain openness and neutrality, is essential. So if you're an independent, a small organization, or an individual contributor, by working with PyTorch, AI projects in general will gain access to your work, and you know that it's going into a healthy overall community. And with it being a Linux Foundation project, you know that neutrality will remain. I think that's a reason people have been hesitant to join: we want neural networks, we want AI to be a widely usable tool, but which one do we choose? And I think this is going to be a defining moment where the AI world will turn towards it and say, we can participate.

Because the Linux Foundation is a foundation of foundations, which foundation is it going to?

Its own foundation. We have an existing LF AI foundation with 40 or so projects, but in this case PyTorch is its own unique foundation, and that was part of the deal with becoming part of the Linux Foundation.
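For readers who haven't used a framework like PyTorch, the core thing such a framework automates is gradient-based training. Here is a toy sketch in plain Python, deliberately avoiding any PyTorch dependency and purely illustrative: it fits a straight line by gradient descent, writing out by hand the gradients that PyTorch's autograd would compute automatically.

```python
# Toy sketch of what an ML framework like PyTorch automates: fitting
# y = 2x + 1 by gradient descent on a mean-squared-error loss.
# PyTorch's autograd would derive these gradients for you; here they
# are written out by hand to show the idea.

data = [(x, 2.0 * x + 1.0) for x in range(10)]  # points on the line y = 2x + 1

w, b = 0.0, 0.0   # model parameters, starting from zero
lr = 0.01         # learning rate

for _ in range(2000):                       # training loop
    grad_w = grad_b = 0.0
    for x, y in data:
        err = (w * x + b) - y               # prediction error
        grad_w += 2 * err * x / len(data)   # d(loss)/dw
        grad_b += 2 * err / len(data)       # d(loss)/db
    w -= lr * grad_w                        # gradient descent step
    b -= lr * grad_b

print(f"w={w:.2f} b={b:.2f}")  # parameters approach 2.0 and 1.0
```

In PyTorch itself, the two gradient lines disappear: you call `loss.backward()` and the framework fills in `grad_w` and `grad_b` for any model, however complex, which is what makes it usable as shared infrastructure across AI projects.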
So that was an intentional choice, because we really wanted it to stand on its own and to be seen as an important and independent project.

If you look at the history of machine learning and artificial intelligence projects within the Linux Foundation, there have been different foundations, and some have merged over time. Do you think that process is still going on, given that this is becoming an anchor project for this new foundation? Where do you see all the AI-related projects in the Linux Foundation?

Well, the nice thing about independence and working with the foundation is that we're not choosing winners; the community really is what we follow, to a large extent. The point of what we do is to enable technology and communities to advance. By having that neutral and independent voice, whatever technology the community decides is best is what grows. We have 40 or so projects in LF AI, and some of them overlap in what they're doing. So if one technology is better, or its community has a feature, then that's what's going to lead the charge. With PyTorch being independent, I think what we're going to find is a lot of projects getting a lot more activity than they used to have. And we might see things move. That's the nice thing: it has autonomy as a community.

You already touched on the fact that coming to a neutral foundation does breathe some confidence into folks; they know they can rely on it. What kind of community do you see being built around this project? It's too early, but you still have a lot of insight.

It is too early, I agree. But Meta has done a great job in getting this framework going. Just as with CNCF, where Kubernetes was a 15-year project with a lot of great people working on it, I think we're going to see something very similar to that.
Where it is strong enough, it's mature enough, and there's enough working tooling that people can say, you know what, I was working on something and struggling; I can join it, I can bring my stuff over. And I hope it's something just as big, if not bigger, than CNCF.

When we look at AI and ML, and all these technologies, AI, ML, deep learning, there is so much jargon. Depending on who you talk to, they'll say, no, this is different from that. But I look at it holistically. We are moving towards a world which is driven by it; of course, open source is the foundation of most things now. Even these cameras have some kind of machine learning where they track your face. And we're not even talking about autonomous cars; we are talking about small sensors. If you look at the edge, AI and ML play a big role because you cannot send the data back and forth. So would it be wrong to say the following? Remember, in the early days we used to say, you should have a software strategy, otherwise you will not succeed. Then we started talking about having a cloud strategy. Now, should you have some kind of AI/ML strategy as well? What role do you see for AI/ML in the modern world?

Well, I would actually say that if you're not already embracing AI and ML, you're probably behind. A lot of people don't realize that when they use handwriting on their laptop or their phone, AI/ML is the back end figuring out what you meant to write, or speech recognition; you're probably using it already. With many different projects and many different people moving into it and trying to monetize it, there has been some fracturing of the tooling. So I think what's going to happen, as there's a little bit of cohesion behind the PyTorch project, is that we're going to see more embracing of it.
And then the advantage is that all the hiccups that one small group may have had, we're now going to share that information and get stronger. So we're already using AI and ML, even if you don't realize it. I think a lot of people think it's too much when they hear "neural network," so I usually try not to use overly complex terms. I mention things like handwriting recognition and voice recognition. When you say, "OK Siri, turn on the lights," how does it know what you mean? So you're actually using it to some extent. The hard part is taking the laser focus that is AI/ML and allowing it to grow, so that it will interpret a little bit more for you instead of doing one task really well.

On the market point: you mentioned Kubernetes and CNCF. A lot of those projects and technologies are in a maturation phase where we see a big ecosystem of vendors. But with AI/ML, in most cases, we still talk to researchers. A lot of applications are happening in different industries; they're automating desk jobs, support, virtual assistants. Of course, a lot of big companies are doing exciting things, but we are not seeing the same kind of ecosystem around AI/ML that we have seen in, let's say, the Kubernetes or Linux space. Is that correct?

Not yet, is what I would say. Part of that was that we didn't have that nervous system to tie everything together. And historically, when large companies have had ownership of a project, people were hesitant; they didn't want to get involved because they didn't know where it was going and who was going to be in charge. So I think the floodgates are open and the work will be done. When Kubernetes was first talked about, a lot of people wondered, well, why do I care about containers and orchestration?
I really think the exact same thing is going to happen with AI/ML: now that people are turning towards it, they can embrace it, and we'll find the reasons to use it. I think it's going to be one of those situations where, for example, when you drive your car, it's watching whether you're looking at the road or at the stereo and telling whether you're still paying attention. Those kinds of everyday-life advantages will start showing up, to the extent that you won't even realize how much the AI is working. And all these systems, does my car integrate with my house, does the house integrate with the elevator in the building, right now are disjointed and not connected. With PyTorch, I think we're going to see them come together and communicate in a common language, so to speak, or at least with a common framework. You're going to have the advantage that my phone talks to my car, which talks to my house and, like I said, the elevator, so it knows what floor I'm going to: am I going to the basement to do laundry, or am I going up to the apartment to sleep, based on what I'm holding in my hands? Those are the kinds of choices that will make our everyday lives easier. And also things like health: wouldn't it be great if we could check your temperature, your pupil dilation, all of these things automatically, and determine, based on your history and on all the people just like you, a predictive analysis, that chances are you might be having a heart issue? Most people don't know until it's too late, really. So with AI/ML and automation, we can solve these everyday problems before they actually become a problem.

Right. And once again, the market and the use cases are so diverse, so messy. I mean, you talk about diseases.
If you just look at detection imaging, Philips and all those companies are working on a lot of things, amplifying the images they take to detect cancer at an early stage; that's a very good example. Or Alexa, where natural language processing is happening. Look at health care: voice is becoming a medium people talk to, and there's a lot of AI/ML at the back end deciding, based on who is asking the question, what the tone should be, whether it's bank support or you're a patient. But the thing is, as you said, this is all too fragmented, too fractured, so we are still not seeing the same kind of community. I think with these projects moving under the umbrella of the Linux Foundation, things will slowly start to take shape, and we'll see the communities leveraging each other. But that market is still largely proprietary at this stage.

It is, it is. And that's what keeps people, especially small players, from participating: they don't want to give anything away, or have it taken over. That's still not in the immediate vicinity, but I can see the market evolving.

But that will create a new challenge. We already talk about the talent shortage and the skills gap. With the arrival of this project at the Linux Foundation, what are your plans to prepare skills, so that folks are actually well versed in these technologies and can actually use them?

It is a challenge to do this, mainly because it's evolving so fast and there isn't a lot of history that we can call on. Linux is 30 years old, so it's a lot easier to write material and train people on something that has 30 years of time behind it. As far as PyTorch is concerned, one of the first things is that we actually have a course coming out.
It's open for enrollment right now, and it's for decision makers to understand why PyTorch is important and how it compares to competing frameworks like TensorFlow. You can start taking it; it's a free class you can take now. Past that, we already have plans for a course for more technical people to get started with it. And this isn't mastery; it's what I call zero to 20, not zero to 60: getting moving in that direction and making sure it works. If you can get it to work, then you'll be able to do a lot more with it. So we have courses in process now: one you can enroll in, one you'll be able to enroll in soon. And I think we're going to end up with more, including, I think, a certification where you can have a stamp of approval that you've done practical work with PyTorch. Right now, I couldn't tell whether you know PyTorch or not. But judging by the activity I'm seeing right now, I think within a year we'll be looking at a certification.

Tim, once again, thank you so much for taking the time today to sit down with me here and talk about PyTorch. The thing with the Linux Foundation is that you folks help build community, but the community is built by the community, right? You folks provide the infrastructure to build the whole thing, so they don't have to worry, because most developers are not good at all those things; you make it easier. And that's why we are seeing all these booming communities. So I hope that next time we sit down, we will be much further along in discussing all the ecosystem players out there. But once again, thank you for your time, and I look forward to the next discussion.

Thanks very much. It's great to be here as well.