Hello, everyone. Welcome to theCUBE's coverage of the AWS Startup Showcase. This is the closing panel session on AI and machine learning: the top startups building generative AI on AWS. It's a great panel. This is going to be the experts talking about riding the wave in generative AI. We've got Ankur Mehrotra, who's the Director and General Manager of AI and Machine Learning at AWS, Clem Delangue, Co-Founder and CEO of Hugging Face, and Ori Goshen, who's the Co-Founder and CEO of AI21 Labs. Ori is dialing in from Tel Aviv, and the rest are coming in here on theCUBE. Appreciate you coming on for this closing session of the Startup Showcase. Thanks for having us. Thank you for having us. I'm super excited to have you all on. Hugging Face was recently in the news with the AWS relationship. So congratulations: open source, open science, really driving machine learning. And we've got AI21 Labs with access to LLMs, generating huge-scale live applications, commercial applications coming to the market, all powered by AWS. So everyone, congratulations on all your success, and thank you for headlining this panel. Let's get right into it. AWS is powering this wave here. We're seeing a lot of push here from applications. Ankur, set the table for us on AI and machine learning. It's not new; it's been going on for a while. In the past three years there have been significant advancements, but there was a lot of work done in AI and machine learning before that. Now it's released to the public, everybody's super excited, and now it's, oh, the future's here. It's kind of been going on for a while; it's been baking, and now it's coming out. What's your view here? Let's get it started. Yes, thank you. So yeah, as you may be aware, Amazon has been investing in machine learning research and development for quite some time now. And we've used machine learning to innovate and improve user experiences across different Amazon products, whether it's Alexa or Amazon.com. 
But we've also brought in our expertise to extend what we're doing in the space and add more generative AI technology to our AWS products and services, starting with CodeWhisperer, which is an AWS service that we announced a few months ago. You can think of it as a coding companion as a service, which uses generative AI models underneath. And so this is a service that customers who have no machine learning expertise can just use. We're also talking to customers, and we see a lot of excitement about generative AI from customers who want to build these models themselves, who have the talent and the expertise and resources. For them, AWS has a number of different options and capabilities they can leverage, such as our custom silicon, Trainium and Inferentia, as well as distributed machine learning capabilities that we offer as part of SageMaker, which is an end-to-end machine learning development service. At the same time, many of our customers tell us that they're interested in not training and building these generative AI models from scratch, given they can be expensive and can require specialized talent and skills to build. And so for those customers, we're also making it super easy to bring existing generative AI models into their machine learning development environment within SageMaker for them to use. So we recently announced our partnership with Hugging Face, where we're making it super easy for customers to bring those models into their SageMaker development environment for fine-tuning and deployment. And then we're also partnering with other proprietary model providers, such as AI21 and others, where we're making these generative AI models available within SageMaker for our customers to use. So our approach here is to really provide customers options and choices and help them accelerate their generative AI journey. 
Thank you for setting the table there. Clem and Ori, I want to get your take, because riding the wave is the theme of this session. And to me, you know, being in California, I imagine the big surf, the big waves, the big talent out there. These are alpha geeks, alpha coders; developers are really leaning into this. You're seeing massive uptake from the smartest people, whether they're young or have been around, and they're coming in with their surfboards, if you will. These early adopters have been on this for a while, and now the waves are hitting. This is a big wave, everyone sees it. What are some of those early adopter devs doing? What are some of the use cases you're seeing right out of the gate? And what does this mean for the folks that are going to come in and get on this wave? Can you guys share your perspective on this? Because you're seeing the best talent now leaning into this. Yeah, absolutely. I mean, from the Hugging Face vantage point, it's not even a wave, it's a tidal wave, or maybe even the tide itself, right? Because actually what we're seeing is that AI and machine learning is not something that you add to your products. It's very much a new paradigm for building all technology, right? For the past 15, 20 years we had one way to build software and technology, which was writing millions of lines of code, very rule-based, and then you get your product. Now what we're seeing is that every single product, every single feature, every single company is starting to adopt AI to build the next generation of technology, right? And that works both to make the existing use cases better, if you think of search, if you think of social networks, if you think of SaaS, but it's also creating completely new capabilities that weren't possible with the previous paradigm, right? Now AI can generate text, it can generate images, it can describe your image, it can do so many new things that weren't possible before. 
And it's going to make developers really productive, right? I mean, you're seeing the developer uptake strong, right? Yes, we have over 15,000 companies using Hugging Face now, and it keeps accelerating. I really think that in maybe three to five years, there's not going to be any company not using AI. It's going to be really the default way to build all technology. Ori, weigh in on this. APIs, the cloud. Now I'm a developer, I want to have live applications, I want the commercial applications on this. What's your take? Weigh in here. Yeah, first, I absolutely agree. I mean, we're in the midst of a technology shift here. I think a lot of people realize how big this is going to be; the number of possibilities is just endless and, I think, hard to imagine. And I don't think it's just the use cases. I think we can think of it as two separate categories. We'll see companies and products enhancing their offerings with these new AI capabilities, and we'll also see new companies that are AI-first, that kind of reimagine certain experiences, that build something that wasn't possible before. And that's why I think these are actually extremely exciting times. And maybe more philosophically, I think these large language models and large transformer-based models are now helping us people express our thoughts, making the bridge from our thinking to a creative digital asset at a speed we've never imagined before. Right, I can write something down and get a piece of text or an image or code. So, I started by saying it's hard to imagine all the possibilities right now, but it's certainly big. And if I had to bet, I would say it's probably at least as big as the mobile revolution we've seen in the last ten years. Yeah, I mean, this is the biggest. I mean, it's been compared to the Enlightenment age. I saw the Wall Street Journal had a recent story on this. 
We've been saying that this is probably going to be bigger than all inflection points combined in the tech industry, given the transformation that's coming. I guess I want to ask you guys about the early adopters. We've been hearing in these interviews, and throughout the industry, that there's already a set of big companies out there that have a lot of data, and they're already there, kind of tinkering. It reminds me of the old hyperscaler days, where they were building their own scale, and they're eating glass, spitting nails out, they're hardcore. Then you've got everybody else kind of saying at the board level, hey team, how do I leverage this? How do you see those two things coming together? You've got the fast followers coming in behind the early adopters. What's it like for the second wave coming in? What are those conversations for those developers like? I mean, I think for me, the important switch for companies is to change their mindset from being kind of like a traditional software company to being an AI or machine learning company. And that means investing in hiring machine learning engineers, machine learning scientists, infrastructure team members who are working on how to put these models in production, team members who are able to optimize models, specialize models, customize models for the company's specific use cases. So it's really changing this mindset of how you build technology, and optimizing your company building around that. Things are moving so fast that I think it's now kind of too late for low-hanging fruit or small adjustments. I think it's important to realize that if you want to be good at this, and if you really want to surf this wave, you need massive investments, right? If there are any surfers listening, with this analogy of the wave, right? When there are waves, it's not enough to just stand there and make a little bit of adjustments. 
You need to position yourself aggressively, paddle like crazy, and that's how you get into the wave. So that's what companies, in my opinion, need to do right now. Ori, what's your take on the generative models out there? We hear a lot about foundation models. What's your experience running end-to-end applications on large foundation models? Any insights you can share with the app developers out there who are looking to get in? Yeah, I think first of all, it's starting to create an economy where it probably doesn't make sense for every company to create their own foundation models. I mean, you can basically start by using an existing foundation model, either an open source or a proprietary one, and start deploying it for your needs. And then comes the second round, when you start the optimization process, right? You bootstrap, whether it's a demo or a small feature or introducing a new capability within your product, and then start collecting data. That data, particularly the human feedback data, helps you constantly improve the model. So you create this data flywheel. And I think we're now entering an era where customers have a lot of different choices for how they want to start their generative AI endeavor. And it's a good thing that there's a variety of choices. And the really amazing thing here is that in every industry, any company we speak with, and it could be something very traditional, like industrial or financial or medical, really any company, people are now starting to imagine the possibilities and seriously think about their strategy for adopting this generative AI technology. And I think in that sense, the foundation models actually enabled this to become scalable. The barrier to entry became lower, so now the adoption can actually accelerate. There are a lot of integration aspects here in this new way that are a little bit different than before, which was very monolithic, hardcore, very brittle. Now there's a lot more integration; you see a lot more data coming together. 
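The data flywheel Ori describes — bootstrap with an existing model, serve it, log human feedback, and fold the best-rated interactions back in as improvement data — can be sketched in miniature. Everything below is an illustrative stand-in (a trivial fake model and an in-memory log), not any specific vendor's API:

```python
# Illustrative sketch of a "data flywheel": serve a model, log human
# feedback, and turn well-rated interactions into candidate training data.

def base_model(prompt: str) -> str:
    """Stand-in for a call to an existing foundation model."""
    return f"draft answer for: {prompt}"

feedback_log = []  # accumulates prompt/completion/rating records

def serve(prompt: str, rating: int) -> str:
    """Serve a completion and record the human rating (1-5)."""
    completion = base_model(prompt)
    feedback_log.append({"prompt": prompt,
                         "completion": completion,
                         "rating": rating})
    return completion

def build_finetune_set(min_rating: int = 4):
    """Keep only well-rated pairs as candidate fine-tuning examples."""
    return [(r["prompt"], r["completion"])
            for r in feedback_log if r["rating"] >= min_rating]

# Each serving cycle grows the feedback log; each fine-tuning round
# consumes the best of it, closing the loop.
serve("summarize this contract", rating=5)
serve("translate to French", rating=2)
dataset = build_finetune_set()
```

In a real system the log would live in a feature store and the filtered pairs would feed a fine-tuning or RLHF job, but the loop structure is the same.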
I have to ask you guys, as developers come in and grow. I mean, when I went to college, you were a software engineer. I got a degree in computer science, and as a software engineer, that's all you did: code, right? You just did code. Now, isn't it like everyone's a machine learning engineer at this point? Because that will ultimately be the science. So, you've got open source, you've got open software, you've got the communities. Swami called you guys the GitHub of machine learning, Hugging Face is the GitHub of machine learning, mainly because that's where people are going to code. So, essentially, machine learning is computer science. What's your reaction to that? Yes, my co-founder Julien at Hugging Face and I have been saying this for quite a while now, for over three years: that software engineering as we know it today is actually a subset of machine learning, instead of the other way around, right? People would call us kind of crazy a few years ago when we were saying that, but now we're realizing that you can actually code with machine learning, right? Machine learning is generating code. And we're starting to see that every software engineer can leverage machine learning through open models, through APIs, through different technology stacks. So it's not crazy anymore to think that maybe in a few years there are going to be more people doing AI and machine learning, however you call it, right? Maybe you'll still call them software engineers, maybe you'll call them machine learning engineers. But there might be more of these people in a couple of years than there are software engineers today. I bring this up somewhat tongue-in-cheek as well, because infrastructure as code is what made cloud great, right? That's kind of the DevOps movement. But here, the shift is so massive that there will be a game-changing philosophy around coding. Machine learning is code. You're starting to see CodeWhisperer. 
You guys have had coding companions for a while on AWS. So this is a paradigm shift. How is the cloud playing into this for you guys? Because to me, I've been riffing on some interviews where it's like, okay, you've got the cloud going next level. This is an example of that, where there's a DevOps-like moment happening with machine learning, whether you call it coding or whatever; it's writing code on its own. Can you guys comment on what this means from the top of the cloud? What comes out of the scale? What comes out of the benefit here? Absolutely. Yeah, so as far as scale is concerned, I think customers are really relying on the cloud to make sure that the applications they build can scale along with the needs of their business. But there's another aspect to it, which is that until a few years ago, John, what we saw was that machine learning was a data-scientist-heavy activity. There were data scientists who were taking the data and training models. Then, as machine learning found its way more and more into production and actual usage, we saw MLOps become a thing, and MLOps engineers become more involved in the process. And now, as machine learning is being used to solve more business-critical problems, we're seeing even legal and compliance teams get involved. We're seeing business stakeholders more engaged. So more and more, machine learning is becoming an activity that's not just performed by data scientists, but by a team, a group of people with different skills. And for them, we as AWS are focused on providing the best tools and services for these different personas to be able to do their jobs and really complete that end-to-end machine learning story. So whether it's tools related to MLOps, or tools for folks who cannot code or don't know any machine learning. For example, we launched SageMaker Canvas as a tool last year, which you can just use. 
It's a UI-based tool, which data analysts and business analysts can use to build machine learning models. So overall, the spectrum of personas who can get involved in the machine learning process is expanding, and the cloud is playing a big role in that process. Ori, Clem, can you guys weigh in too? Because this is just another abstraction layer of scale. What does it mean for you guys as you look forward to your customers and the use cases that you're enabling? Yes, I think what's important is that AI companies and providers and the cloud can work together, right? That's how you make a seamless experience, and you actually reduce the barrier to entry for this technology. So that's what we've been super happy to do with AWS for the past few years. We actually announced not too long ago that we're doubling down on our partnership with AWS. We're excited to have many, many customers on our shared product, the Hugging Face Deep Learning Containers on SageMaker. And we're working really closely with the Inferentia team and the Trainium team to release some more exciting stuff in the coming weeks and coming months. So I think when you have an ecosystem where AWS and the AI providers, the AI startups, can work hand in hand, it's to the benefit of the customers and the companies, because it makes it orders of magnitude easier for them to adopt this new paradigm of building technology with AI. Ori, this is scale on reasoning too. The data is out there, and making sense of it, making it reason, getting comprehension, having it make decisions, is next, isn't it? And you need scale for that. Yes. Just a comment about the infrastructure side. I think really the purpose is to streamline and make these technologies much more accessible. And I predict that we'll see, in the next few years, more and more tooling that makes this technology much simpler to consume. And I think it plays a very important role. 
There are so many aspects, like monitoring the models and the outputs they produce, and containing and running them in a production environment. There's so much to build there; the infrastructure side will play a very significant role. All right, that's awesome stuff. I'd love to change gears a little bit and get a little philosophical here around AI and how it's going to transform things, if you guys don't mind. There have been a lot of conversations on theCUBE here, as well as in some industry areas, where it's like, okay, all the heavy lifting is automated away with machine learning and AI, the complexity, there are some efficiencies; it's horizontal and scalable across all industries, and that's a good point. Everyone's going to use it for something, and a lot of stuff gets brought to the table with large language models and other things. But the key ingredient will be proprietary data, or human input, or some sort of AI-whisperer kind of role, or prompt engineering, as people are saying. So with that being said, some are saying it's automating intelligence, and that creativity will be unleashed from this. If the heavy lifting goes away and AI can fill the void, that shifts the value to the intellect, or the input. And so that means data is going to come together, interact, fuse, and understand each other. This is kind of new. I mean, old-school AI was, okay, I've got a big model, I provisioned it a long time ago, very expensive; now it's all free-flowing. Can you guys comment on where you see this going, with this free-form data flowing everywhere, the heavy lifting, and then specialization? Yeah, so what we're seeing with these large language models, or generative models, is that they're really good at creative stuff. But I think it's also important to recognize their limitations. They're not as good at reasoning and logic. And I think now we're seeing great enthusiasm, which I think is justified. 
And the next phase will be how to make these systems more reliable, how to inject more reasoning capabilities into these models, or augment them with other mechanisms that actually perform more reasoning, so we can achieve more reliable results and count on these models to perform critical tasks, whether it's medical tasks or legal tasks. We really want to offload a lot of the intelligence to these systems. And then we'll have to make sure these are reliable, and we'll have to make sure we get some sort of explainability, so we can understand the process behind the generated results we've received. So, I think this is the next phase of systems based on these generative models. Clem, what's your view on this? Obviously you're an open community; open source has been around, it's a proven model with a great track record. I'm assuming creativity is going to come out of the woodwork, and if it can automate open source contribution and relationships and onboarding more developers, there's going to be an unleashing of creativity. Yes, it's been so exciting on the open source front, right? We all know BERT, you know, BLOOM, GPT-J, T5, Stable Diffusion, the current generation of open source models that are on Hugging Face. It has been accelerating in the past few months, right? I'm super excited about ControlNet right now, which is really having a lot of impact; it's a way to control the generation of images. I'm super excited about Flan-UL2, which is a new model that has recently been released and is open source. So, yeah, it's really fun to see the ecosystem coming together. Open source has been the basis for traditional software, right? With open source programming languages, of course, but also all the great open source that we've gotten over the years. 
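Ori's point earlier about augmenting generative models with mechanisms that improve reliability can be illustrated with a toy sketch: ground the generator in a retrieval step so answers come from known facts. The knowledge base and "model" below are trivial stand-ins, not any real retrieval system or LLM:

```python
# Toy sketch: answer questions by retrieving a supporting fact first,
# and refuse when nothing is retrieved, rather than letting a generator
# guess. The knowledge base and generator are illustrative stand-ins.
from typing import Optional

KNOWLEDGE = {
    "capital of france": "Paris",
    "boiling point of water": "100 degrees Celsius at sea level",
}

def retrieve(question: str) -> Optional[str]:
    """Look up a supporting fact for the question, if one exists."""
    for key, fact in KNOWLEDGE.items():
        if key in question.lower():
            return fact
    return None

def generate(question: str, context: Optional[str]) -> str:
    """Stand-in generator: only answers when grounded in context."""
    if context is None:
        return "I don't have a reliable answer for that."
    return f"Based on the retrieved fact: {context}"

def answer(question: str) -> str:
    return generate(question, retrieve(question))
```

The design choice this illustrates is the one Ori describes: the generator's creativity is kept, but a separate, inspectable mechanism supplies the facts, which also gives a degree of explainability (you can show which fact the answer was based on).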
So we're happy to see that the same thing is happening for machine learning and AI, and hopefully it can help a lot of companies reduce the barrier to entry a little bit. So, yeah, it's going to be exciting to see how it evolves in the next few years in that respect. I think the developer productivity angle that's been talked about a lot in the industry will be accelerated significantly. I think security will be enhanced by this. I think applications in general are going to transform at a radical rate, accelerating at incredible rates. So it's not just a big wave, it's the water, right? I mean, it's the new thing. My final question for you guys, if you don't mind; I'd love to get each of you to answer it. There are a lot of conversations around data, and data infrastructure is also involved in this. And the common thread I'm hearing is that every company that looks at this is asking itself: if we don't rebuild our company, start thinking about rebuilding our business model around AI, we might be dinosaurs, we might be extinct. And it reminds me of that scene in Moneyball at the end: if we're not rebuilding around your model, every company will be out of business. What's your advice to companies out there having those kinds of moments, where it's like, okay, this is real, this is next-gen, this is happening, I'd better start thinking and putting into motion plans to refactor my business? Because it's happening; business transformation is happening on the cloud. This kind of puts an exclamation point on it, with AI as the next step function, a big increase in value. So it's an opportunity for leaders. Ankur, we'll start with you. What's your advice for folks out there thinking about this? Do they put their toe in the water? Do they jump right into the deep end? What's your advice? Yeah, John. 
So we talk to a lot of customers, and customers are excited about what's happening in the space, but they often ask us, hey, where do we start? So we always advise our customers to do a lot of proofs of concept, understand where they can drive the biggest ROI, and then also leverage existing tools and services to move fast and scale, and try not to reinvent the wheel where it doesn't need to be. So that's basically our advice to customers. Okay. Ori, what's your advice to folks who are scratching their heads going, I'd better jump in here, how do I get started? So I actually think you need to be thinking about it really economically: about both the opportunity side and the challenges. There are a lot of opportunities for many companies to actually gain revenue upside by building these new generative features and capabilities. On the other hand, of course, incorporating these capabilities will probably affect the COGS. So I think we really need to think carefully about both of these sides, and also understand clearly whether this is a project or an effort toward cost reduction, where the ROI is pretty clear, or a revenue amplifier, where there are, again, a lot of different opportunities. So I think once you think about this in a structured way and map the different initiatives, that's probably a good way to start, and a good way to start thinking about these endeavors. Awesome. Clem, what's your take on this? What's your advice for folks out there? Yes, all of these are very, very good pieces of advice already. There is something that you said before, John, that I disagree with. A lot of people are talking about the data moat and proprietary data. Actually, when you look at some of the organizations that are building the best models, they don't have specialized or unique access to data. So I'm not sure that's so important today. 
I think what's important for companies, and it's been the same for the previous generation of technology, is their ability to build better technology faster than others, right? In this new paradigm, that means being able to build machine learning faster and better than others. So that's how, in my opinion, you should approach this, right? How can you evolve your company, your teams, your products, so that in the long run you're able to build machine learning better and faster than your competitors? And if you manage to put yourself in that situation, then that's when you'll be able to differentiate yourself, to really be impactful and get results. That's really hard to do, right? It's something really different, because machine learning and AI is a different paradigm than traditional software. So this is going to be challenging, but I think if you manage to nail it, then the future is going to be very interesting for your company. That's a great point. Thanks for calling that out. It all reminds me of the early cloud days. If you went to the cloud early, you took advantage when the pandemic hit; if you weren't native in the cloud, you got hamstrung by that, you were flat-footed. So just get in there, get in the cloud, get into AI, and you're going to be good. Thanks for calling that out. Final parting comments. What's the most exciting thing going on right now for you guys? Ori, Clem, what's the most exciting thing on your plate right now that you'd like to share with folks? I mean, for me, it's just the diversity of use cases and the really creative ways companies are leveraging this technology. Every day I speak with about two or three customers, and I'm continuously surprised by the creative ideas, and what can be achieved here is really exciting. I'm also amazed by the pace at which things move in this industry. There's just not a dull moment. So definitely exciting times. Clem, what are you most excited about right now? 
For me, it's all the new open source models that have been released in the past few weeks, and that will keep being released in the next few weeks. I'm also super excited about more and more companies getting into this capability of chaining different models and different APIs. I think that's a very, very interesting development, because it creates new capabilities, new functionalities that weren't possible before. You can plug an API into an open source embedding model, into an audio transcription model. So that's also very exciting. This capability of having more interoperable machine learning will also, I think, open up a lot of interesting things in the future. Clem, congratulations on your success at Hugging Face. Please pass our congratulations on to your team, and continue; it's still day one. I mean, it's just the beginning, not even scratching the surface. Ankur, I'll give you the last word. What are you excited about there at AWS? More cloud goodness coming here with AI. I'll give you the final word. Yeah, so, you know, as both Clem and Ori said, the research in this space is moving really fast, and we are excited about that. But we're also excited to see the speed at which enterprises and other AWS customers are applying machine learning to solve real business problems, and the kind of results they're seeing. When they come back to us and tell us about the improvements in their business metrics and overall customer experience that they're driving, and they're seeing real business results, that's what keeps us going and inspires us to continue inventing on their behalf. Gentlemen, thank you so much for this awesome, high-impact panel. Ankur, Clem, Ori, congratulations on all your success. We'll see you around. Thanks for coming on. Generative AI, riding the wave. It's a tidal wave. It's the water. It's all happening. All great stuff. 
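The model chaining Clem describes, plugging a transcription model and an embedding model together to power something like semantic search, can be shown with a toy sketch. Every component below is a trivial stand-in for a real model API (the "transcriber" just normalizes a clip name, and the "embedding" is a crude character-count vector), purely to illustrate the composition pattern:

```python
# Sketch of chaining models: transcription -> embedding -> similarity
# lookup. All components are stand-ins for real model API calls.

def transcribe(audio_clip: str) -> str:
    """Stand-in for a speech-to-text model call."""
    return audio_clip.replace("_", " ")  # pretend the clip name is its text

def embed(text: str) -> list:
    """Stand-in embedding: crude bag-of-characters vector."""
    return [text.lower().count(c) for c in "abcdefghijklmnopqrstuvwxyz"]

def similarity(a: list, b: list) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = (sum(x * x for x in a) ** 0.5) * (sum(x * x for x in b) ** 0.5)
    return dot / norm if norm else 0.0

# Chain the stages: spoken query -> text -> vector -> nearest document.
query_vec = embed(transcribe("machine_learning_is_the_default"))
docs = ["machine learning everywhere", "gardening tips for spring"]
best = max(docs, key=lambda d: similarity(query_vec, embed(d)))
```

The interoperability point is in the shape of the pipeline: because each stage takes and returns plain data, any stage can be swapped for a different open source model or hosted API without touching the others.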
This is season three, episode one of the AWS Startup Showcase closing panel. This is the AI/ML episode: the top startups building generative AI on AWS. I'm John Furrier, your host. Thanks for watching.