This is your host, Swapnil Bhartiya, and welcome to our yearly predictions video series. Today we have with us once again Mark Collier, Chief Operating Officer at the Open Infra Foundation. Mark, it's great to have you on the show.

Thank you. It's great to be here.

Before I ask you to grab your crystal ball and share your predictions, quickly tell us: if you look at the Open Infra Foundation, you folks have gone through a lot of evolution. Talk about the foundation as it stands today.

Certainly. So it's the Open Infrastructure Foundation, or OpenInfra for short, and we're really focused on building communities that write software that runs in production, but specifically for infrastructure. So things like cloud computing, obviously, with projects like OpenStack, probably the most well-known project that we host at the foundation. We are a non-profit, and our mission is really to help build communities around these pieces of critical infrastructure technology. Other examples would be things like StarlingX for edge computing, Kata Containers, which is in the secure container space, a very hot project in a hot space right now, and Zuul for CI/CD. We have over 100,000 members across 187 countries. So we're very, very global, all about collaboration, helping people write this open source software. And ultimately, we want it to run in production all over the world. That's kind of the ultimate measure of the progress of what we work on.

Now it's time for you to grab your crystal ball and share your predictions with us.

So I think the focus we were aiming for was AI, a hot topic on everyone's mind, and certainly on my mind all year. So I took a look at what's coming down the pike in AI and jotted down a few predictions. The funny thing happening in AI right now is that it's moving so quickly that some of the things I wrote down two weeks ago, when we first connected, are already starting to come true. So it's kind of a crazy space.
But the first one was that AI models will become smaller and more efficient, especially when it comes to inference. We're already seeing this happen, and different techniques for how to structure large language models mean that they can run on smaller and smaller hardware, even all the way down to an iPhone or an Android. We're seeing models that can do that, and at the laptop level. So I think we'll continue to see new techniques come through that make it possible to take these technologies and make them far more accessible. That's the first one. And one little pro tip on that: there's a great Reddit forum called LocalLLaMA. That's a great place to follow this. It's very addictive, so fair warning, but there's a new model coming out almost every day. I try to stay current and test them out, and it's pretty wild what's happening. But this trend toward more and more efficient models that are almost as good as the much, much bigger models you think of, like ChatGPT, that's a big area of progress. I think in 2024 we're going to see a lot more of that.

Number two: open source AI will catch up to and surpass GPT-4 and GPT-4V, which are really the state of the art, again from the ChatGPT or OpenAI folks. Which, by the way, is not actually open. Not everybody realizes that, because the name is OpenAI, but they've really become more and more closed over time. But there's so much investment going into open source AI models, and you see all the different releases coming out every day and people testing them and learning from each other. Researchers from around the world are constantly learning from each other. And you have companies like Mistral, a company in France that's raised four hundred million dollars, and they're really pushing the envelope in what they're releasing publicly. One funny anecdote on this: last week was one of the biggest AI conferences in the world.
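To make the shrinking-models point concrete, one of the techniques behind smaller, more efficient inference is quantization: storing weights in fewer bits. Here is a minimal toy sketch in plain Python (not any real library's API, and the weight values are made up for illustration):

```python
# Toy post-training quantization: map float32 weights to int8 plus a scale.
# Real systems (GGUF, bitsandbytes, etc.) are far more sophisticated, but the
# core idea is the same: 1 byte per weight instead of 4, a ~4x memory saving.

def quantize_int8(weights):
    """Map float weights to int8 values in [-127, 127] plus a scale factor."""
    scale = max(abs(w) for w in weights) / 127 or 1.0  # guard all-zero input
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 values."""
    return [x * scale for x in q]

weights = [0.42, -1.27, 0.05, 0.9]       # hypothetical weight values
q, scale = quantize_int8(weights)        # q = [42, -127, 5, 90]
restored = dequantize(q, scale)          # close to the originals,
                                         # up to small rounding error
```

The trade-off is a small rounding error in the restored weights in exchange for a large memory and bandwidth reduction, which is what lets big models fit on a laptop or a phone.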
And Mistral announced this new model called Mixtral that uses this mixture-of-experts approach: more efficient, kind of pushing the state of the art. But they had some restrictions on how you could use it, so there was some question as to whether it was really open source, and some people complained. And the CEO came out and said, "Oh, I hear you, it's fixed, I'm going to remove this restriction." So you see people that are investing hundreds of millions of dollars in AI and pushing the state of the art, but they're doing it in the open. And if there are restrictions, they're responding to feedback from people saying that's not open enough, open it further. The last point I'll mention on this is just that there's a really important effort that the Open Source Initiative, the OSI, has started up, which I'm participating in with a bunch of other people, to actually write a definition of what open source AI is. It doesn't fit neatly into what we traditionally think of as open source, because there's a lot more than just source code involved. So this is all part of the nuance of this world and this prediction. But I do think we'll see open source AI really catch up to the state of the art. And of course, the proprietary models will keep getting better too, but this is where the trend is headed, and I'm a believer in that.

Number three: there are a lot of new ways you can host these open source models, a lot of different startups popping up, driving down the cost, finding more and more efficient ways to host them. So you can run them locally, as I said, in many cases on a laptop or your desktop if you have enough GPUs, but the server is where you're going to really see a lot of the big production use. And we're going to just see more and more competition there, driving down the cost per token to process these things.

Number four: AI hardware will get much more competitive.
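Sticking with the hosting-cost point for a moment before the hardware discussion, the cost-per-token framing is easy to sketch. The prices and volumes below are made-up placeholders, not real vendor quotes; the point is only the shape of the arithmetic:

```python
# Back-of-the-envelope comparison of monthly serving costs at different
# per-token rates. All numbers are hypothetical, for illustration only.

def monthly_cost(tokens_per_month, usd_per_million_tokens):
    """Cost of serving a monthly token volume at a given per-million-token rate."""
    return tokens_per_month / 1_000_000 * usd_per_million_tokens

tokens = 500_000_000  # a hypothetical workload: 500M tokens per month

big_proprietary_api = monthly_cost(tokens, 10.00)  # $10.00 per 1M tokens
hosted_open_model = monthly_cost(tokens, 0.50)     # $0.50 per 1M tokens

# At these illustrative rates that's $5,000/month vs $250/month, a 20x gap.
# This is why competition driving down cost per token matters so much for
# production use of open models.
```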
If you've been following AI in the news, you probably know the most coveted prize under the Christmas tree this year is an NVIDIA GPU. Everybody wants one, and there aren't enough to go around. So it's effectively an NVIDIA monopoly on the most critical component for training and running AI models. But AMD and others do have some impressive hardware coming out, and the key enabler that's been holding them back, where I think we're going to see a massive improvement in 2024, is actually the software that enables the hardware. NVIDIA, with their CUDA architecture and their software stack, that's really what's given them a lot of this monopoly power, above and beyond just being very, very fast at floating point math, which it is, very impressive. But the software layers have been lagging behind for the competitors. You see AMD investing in PyTorch support and in their alternatives to CUDA. So I think we'll see kind of a shake-up. No doubt NVIDIA will still be riding high all throughout 2024. I'm not going to be so contrarian as to suggest they'll be knocked down, but I think we'll see more competition.

Number five, my last prediction, is that we'll see something really impactful happen in the world of entertainment. The example I chose was that we'll see an AI-generated short film that's actually screened in theaters. Maybe it'll even get an Oscar nomination, who knows? Maybe it'll just be something silly that catches the imagination. But I think what we saw with ChatGPT was that some of these natural language processing and large language model technologies already existed, but nobody in the mainstream world was really paying attention to them until something really easy to use, a layer on top, the chat interface of ChatGPT, came along.
Now they have 100 million weekly active users, the fastest consumer web product ever to reach 100 million users. And I think something like that will happen in the entertainment world. It's really going to shake up how we think about how entertainment is produced. Maybe it'll be a high school kid who produces it with these tools. It's putting the power to do something as big as even a full-length feature film in the hands of, you know, everyone. I think there'll be some kind of breakout moment in 2024 with somebody's movie or TV series; we'll see what form it takes. So those are my five predictions, and I'm super excited to see how it all unfolds in 2024. We're seeing such an incredibly rapid rate of change, and there are a few days left in the year, so maybe it'll even happen before 2024. But I'm looking forward to it.

Can you also share what kind of challenges you see in 2024, not only for the larger cloud ecosystem, but maybe even for an organization like the Open Infra Foundation?

The first thing, since we've been talking about AI, is just the pace of change. I think this is going to affect everybody, as individuals as well as organizations and foundations. Things are happening at this rising exponential rate where it's actually not possible to really understand or follow every single thing. So your podcast plays an important role in this. But I think one of the challenges is going to simply be: how do we all keep up as the systems get more intelligent? You know, I don't want to paint some kind of dystopian sci-fi picture about runaway AI. I don't think that's going to be the case in 2024. But simply with the pace of change, it gets harder for us as individuals to stay on top of it.
But also, if you're a company and you're trying to figure out how to productize or incorporate these new technologies into your business, you're going to be facing more and more time pressure. So I think that's going to be one thing that's kind of looming over us. Another, related to this: I saw a stat recently that was very interesting to me, which is that 90 percent of all new data being generated and stored is now unstructured data. However, only 18 percent of enterprises and organizations are actually doing anything with that data. Maybe they're just storing it away somewhere in a data lake. And I think this is directly related to AI: people are looking at these new techniques to get some signal from all the noise in this massive amount of data. So data generation is growing, and that means more infrastructure. The Open Infra Foundation has to help build software that can manage all this incredible growth of data, and not just store it, but actually do something meaningful with it. Unstructured data also pairs well with things like object storage, which is OpenStack Swift, for example. So you can connect these rising patterns in how people manage and store data with the software that our communities write. That always keeps us on our toes, keeps us fresh. We have to support new architectures, GPUs, Arm, and that's been going on for several years, but I think we'll just see more of that. So I don't know if that answers your question, but those are a few of the challenges I see ahead.

And if you look at these challenges, there are also a lot of opportunities there. What is going to be the focus of the Open Infra Foundation, or your focus, in 2024?

You know, we have a number of priorities that we've been laying out for the year ahead.
One is actually being very deliberate about protecting our vision of open source. Going back to some of the challenges we see in the regulatory environment, or kind of nationalism, things that are headwinds to our global model, we really have to invest in educating people, working with the OSPOs around the world, the open source program offices, but also increasingly governments and other organizations that are becoming aware of open source and maybe don't fully understand it, or are trying to regulate it without understanding what kind of costs that may put on innovation. So that's number one: protecting and promoting what we believe are the core purposes and values of open source and open infrastructure.

Number two is, first and foremost, the projects that we host. We're continuing to add new projects, but for these projects we have to really help those communities keep improving the software so it meets the needs of customers. So for that 90 percent of data that's unstructured and that people are not really taking advantage of, what does the software need to do differently to help people derive more value out of these systems? So really supporting our project communities.

And third is just looking at the overall impact of open infrastructure. We have a lot of events next year where this happens in person, where we talk about this global nature and the unique requirements coming out of Asia and Europe. So, for example, earlier this year we launched OpenInfra Europe and OpenInfra Asia, which are regional hubs to help strengthen those voices and raise up the takeaways of what's maybe happening uniquely in Europe versus Asia, so that we can then take a global perspective on it. So we're going to have many events, including an OpenInfra Summit in Asia next year in Korea, just outside of Seoul. That's one concrete example.
And we're also going to have a road show of different OpenInfra Days events throughout Europe in the May-June timeframe. So these are all tactics, in a sense, but important investments we're making in strengthening the community around the world.

Mark, thank you so much for taking time out to share these predictions with us. I would love to have you back on the show again next year, not only to check how many of your predictions turned out to be true, but also to get your next set of predictions. Thank you so much, and I look forward to talking to you again soon.

Thank you. Sounds great. Thank you.