Hi, this is Swapnil Bhartiya, and welcome to our yearly prediction series. Today we have with us, once again, Arun Gupta, Vice President and General Manager for Open Ecosystem at Intel. Arun, it's great to have you on the show.

Thank you. I'm very excited to be here.

Of course, I'm going to ask you to pick up your crystal ball and share your predictions. But before that, let's just talk quickly about Intel in the modern world.

Yeah, if you think about it, Intel has a legacy of over 55 years. We put the silicon in Silicon Valley all these years, and our people have had a profound influence on the world, driving business and society forward by creating radical innovation that revolutionizes the way we live. Whether it's silicon in the data center, client devices, the edge, or the network, with software-defined, silicon-enabled services and offerings, we contribute to hundreds of software projects and optimize them to make sure our silicon can be leveraged appropriately. And most importantly, we help customers solve the real-world problems they're facing in the modern world.

Now it's time for you to grab your crystal ball. I'm pretty sure you have a hardware crystal ball there as well, not just a virtual one. Share what predictions you have for us for this year.

Yeah, I've been thinking about this for a while, actually. So here is my first prediction. My hope is that in 2024 there is transparency in AI. Think about the regulatory landscape, IP ownership, and ethical considerations. They should drive new approaches to gen AI models with a focus on openness and transparency. That's essential so that a broad spectrum of AI researchers, builders, and adopters have the right tools, because we fundamentally believe that an open ecosystem commoditizes the technology and creates an equitable playing field so that everybody can participate.
As a matter of fact, a few weeks ago Intel, along with Meta and IBM, launched the AI Alliance, which is basically fostering an open community and enabling developers and researchers to accelerate responsible innovation in AI. So that's an element we are excited about, and hopefully a lot more wonderful things will be done around transparency in AI.

Second prediction?

The second prediction is really to take gen AI from the experimental stage to the production stage. If you think about it, there are several surveys that talk about how 10% of respondents have already launched gen AI solutions. In 2024, my hope is that a lot more organizations will be able to get real benefits out of gen AI. BusinessWire had one particular survey talking about how 25% of organizations are looking to deploy more and more, heavily US-centric to begin with, but then expanding more globally. Why would customers move gen AI into production? Better customer experience, improved efficiency, enhanced product capabilities, and cost savings. So again, taking the technology as it is, but moving it into production to really get some useful benefits.

Third prediction?

LLMs and RAG, Retrieval-Augmented Generation, have seen some really good plays since the launch of ChatGPT back in November 2022. RAG is starting to get traction, but my hope is that in 2024 there is an explosion to make business information more easily accessible. Whether it's digging into your embeddings or creating a vector database, there are new purpose-built vector databases coming up, like Pinecone, that can create and store those embeddings.
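To make the embedding-and-retrieval flow Arun describes concrete, here is a minimal, self-contained sketch. It is purely illustrative: the bag-of-words "embedding", the vocabulary, and the sample documents are all hypothetical stand-ins, where a real RAG pipeline would use a trained embedding model and a vector database such as Pinecone.

```python
import math

# Hypothetical toy vocabulary; a real system would use a learned
# embedding model rather than word counts.
VOCAB = ["refund", "invoice", "shipping", "password", "reset"]

def embed(text: str) -> list[float]:
    """Map text to a vector: count of each vocabulary word (toy embedding)."""
    words = text.lower().split()
    return [float(words.count(w)) for w in VOCAB]

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity, the usual distance metric in vector databases."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

# "Index" business documents as embeddings, as a vector database would.
docs = [
    "How to request a refund for an invoice",
    "Shipping times and tracking",
    "Password reset instructions",
]
index = [(d, embed(d)) for d in docs]

def retrieve(query: str, k: int = 1) -> list[str]:
    """Return the k documents most similar to the query. In RAG, these
    retrieved passages are appended to the LLM prompt as grounding context."""
    qv = embed(query)
    ranked = sorted(index, key=lambda pair: cosine(qv, pair[1]), reverse=True)
    return [d for d, _ in ranked[:k]]

print(retrieve("I forgot my password"))  # → ['Password reset instructions']
```

The point of the sketch is the shape of the pipeline, embed, store, retrieve by similarity, then feed the hits to the LLM, not the toy similarity function itself.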
And then existing databases like MongoDB and Cassandra are adding embedding support so that you can store vector information in them. So I think that's gonna be a big one: not just LLMs like ChatGPT, but how can I attach my RAG pipeline to them? And as a matter of fact, we are again heavily into the open LLMs element of it, so Intel continues to play a key role in that explosion. We have launched several models. Over the last few weeks, one we launched is LDM3D, a latent diffusion model for 3D, which basically generates 360-degree images from text prompts. We released fastRAG, a research framework for efficient and optimized RAG pipelines, and BridgeTower, a vision-language model. Intel will do several such things, but as an industry, hopefully there's a lot more explosion around that element.

Fourth prediction: deeper cybersecurity efforts to improve response to Log4Shell-like incidents. Most important are response times. Cyber threats are only gonna grow; it's not if, it's when. We will have more and more such incidents. So how do we improve the response times? What are organizations doing to identify that, oh, a Log4Shell has happened? How quickly can I remediate? How quickly can I patch? How quickly can I remove the threat altogether? That's gonna be a really, really big element. And an important part of that is not just doing it in a conventional way, but leveraging AI to make security a lot better. There is Forrester research which talks about 80% of organizations being extremely concerned about their organization's AI model security. So as you are creating all of these LLMs, what information is being exposed? How is it being exposed? That's gonna be a key element as well. So that's the fourth one.

And the fifth one really is a significant PC refresh across the industry because of a new market segment, the AI PC.
You know, if you think about the classical PC industry, new chips come out, new operating systems come out, but the AI PC will fundamentally reshape connectivity in 2024. Computers infused with AI will change the PC market and completely alter how we interact with our computers. With the CPU, the GPU, and the new NPU capability, it's gonna be amazing how much processing can be done on the laptop. It's like the impact Centrino had with Wi-Fi, which really enabled users to access the internet wherever they were. That development changed user behaviors and expectations: people began seeking out hotels, planes, and coffee shops with Wi-Fi availability, for example. I believe 2024 will see a similar shift with the AI PC: how are you making me better? So those are my five predictions, essentially.

Excellent, thanks for sharing these predictions with us. Can you also talk a bit about what kind of challenges you see in 2024, not only for the larger ecosystem and the market, but also for Intel and open source?

There are known challenges that we can possibly tackle, but then there are unknown challenges that we may not be ready for. Nobody was ready to tackle the pandemic; nobody is ready to tackle the wars happening all around the world. Those are the harder ones, so that's something we have to adapt to. But the known challenges are things like what just happened: The New York Times just sued Microsoft and OpenAI for using their paywalled data and feeding it into the OpenAI models. Things like that are exactly where the interesting discussion has to happen around openness. The New York Times, for example, is a paid subscription. Can you scan their data, put it into your LLM, and expose it to your users?
For example, I could go into OpenAI and say, hey, what does The New York Times think about this article? Show me the first paragraph and the second paragraph. Can I start getting all of that without any attribution? I think that's where the open element is gonna be very critical, because as you are training your LLM, we want to know: where is the data coming from? What are the data sources? What is the attribution? So that's gonna be a very interesting challenge that we will have to solve in 2024.

What do you think is going to be the focus of Intel this year?

Yeah, the focus is very clear for 2024. The first one really is bringing AI everywhere. We launched the fifth-gen Xeon, Emerald Rapids, right before Christmas, so that's super exciting. How do we ramp it up in the data center? We launched the AI PC; how do we ramp that up? So both the data center side and the client side, those are two important elements. Then comes the developer cloud: how are you getting access to the latest hardware for your developer environment? So when we think about bringing AI everywhere, it could be in the data center, it could be on the client, or it could be having access to the latest hardware so that you can run your latest LLM and optimize it with the latest upstream software in Intel Developer Cloud. That's what we have already launched. Going forward, of course, there are Granite Rapids and Sierra Forest, the next editions of the Xeon chips. That's gonna continue to be the focus from a broader company perspective.
Last year we talked about Intel Foundry Services, and that's again gonna continue to be a big focus: how the company is getting restructured, both from the BU perspective and from the finance perspective, and how we make that work for us so that it's a similar model for our external-facing customers versus internal-facing customers. And last but not least, there is gonna be a very clear focus on delighting developers so that Intel is the platform of choice. Because at the end of the day, the developers are the decision makers. If we're not gonna keep them happy, if we're not gonna tell them what is in it for them to keep coming back to the Intel platform, it's not gonna be a good battle. So we wanna make sure that we continue to delight developers and continue to contribute to those hundreds of projects. Tell us more projects where we should be contributing, and how do we optimize them to make them work for you?

Thank you so much for sharing these predictions with us. Of course, I would love to have you again next year, not only to get the next set of predictions, but also to check how many of these predictions turn out to be true. So no pressure there, but thank you so much, and I look forward to our next discussion.

Thank you. Thank you so much, Swapnil. I really enjoyed it.