Hello, welcome to this special CUBE conversation. I'm John Furrier, host of theCUBE here in the Palo Alto studio. It's a preview and feature of the upcoming trends with AI, but also a feature of Kong, a successful startup, with Augusto "Aghi" Marietti here, CEO and co-founder of Kong. Very successful entrepreneur, serial entrepreneur. Great story, great to see you. And also a legacy SiliconANGLE contributor from 2010, at a previous company, talking about APIs back then. Welcome back to theCUBE. Yeah, thanks John for having me and having Kong. Great to see you. So I was saying in the intro there that, you know, one of the things is we've been at it for 13 years, and 2010 was our big kind of first real year. 2009 we were kind of setting up. It was a group blog. We went to cloud, cloud was early. You know, we were playing with cloud in the 2008 timeframe and we all knew APIs were there. And you wrote a guest post on SiliconANGLE in 2010. Yes, yeah. Talking about APIs, and all of it came true. And all of it came true. At the end it was like, your toaster will have an API. And they do now. And yeah, we had these visions that, you know, APIs would be a multi-decade transition and they were going to take cloud and enterprise by storm. And we thought, and thank you for giving us the opportunity. At that point we were immigrants, a three-person startup with a big vision and a big dream. And I think that was one of my first ever guest blog posts on SiliconANGLE. Well, I really appreciate it. And it was really good timing too, because that was the beginning of another wave coming in. But it was a turbulent time. If you remember 2010, it was a lot of on-prem activity, a lot of anti-cloud sentiment at that time. Even Amazon's first re:Invent conference wasn't until 2012. So cloud really didn't pick up until around then. Then, you know, 2013 through the rest is just history. At that time, OpenStack in 2010 was the hot thing. Remember OpenStack? It's true, it's true. Hadoop. Big data.
MapReduce. Now you got Spark and then you got Databricks. So the world's changed. But I got to ask you, now that we're at the dawn of this next era. You know, we're looking at next-gen cloud. So you got a lot of multiple clouds. People inherited cloud. Some have all Amazon, some have Azure with Teams or whatever. Google's got some new cool stuff. Multi-cloud's on the horizon. AI just dropped in everyone's lap. So there's a lot of people talking about that next 20-mile stare, that next journey: it's going to be a lot of machine data, human-generated data, data in people's minds. And, you know, just the human capital equation. So there's a whole nother era coming. Yeah, yes. It's going to go to the next level that APIs created. This is something that's going to be exciting. What's your vision? Yeah, so if you think back to when we wrote that blog post, we didn't know exactly the tailwinds, but we knew that APIs were going to be the way software talks, basically. So I think there are three major pillars. The number one is the cloud, which started, to your point, '13, '14, '15. And what happened in that era? Containerization technologies took off, serverless as well, and multi-cloud. Cloud finally went mainstream. So those three kind of made cloud workloads mainstream. But what that means is, when you move workloads to the cloud, you're really API-fying your infrastructure and creating an API sprawl. The second pillar is mobile, and all the applications that came with it. Every app, somebody driving a Tesla, a Samsung fridge, the toaster: it's all APIs behind the digital experience, right? You might join a Zoom call, it's all APIs. So behind every consumer experience, there is always an API. That creates even more API sprawl. And the third one, I would say the third one is kind of a blend.
You have the blockchain, which some people might say is a fad in terms of crypto, but still, the blockchain, if you think about it, is a dynamic open API. So that also creates more API sprawl. And AI, I think, at the end is an infrastructure problem, because fundamentally it's two main things: it's data and compute. And how that data moves and gets computed is through APIs. So at the end of the day, we have a saying: you can't do AI without APIs. Because I think APIs are going to be the primary interface for how AI talks. Because in apps, you have a keyboard and a human. In the AI world, it really queries APIs to get its answers. I think that will be the third tailwind that kind of creates the API tailwinds. I was just talking with a founder of one of the big LLM companies this morning, AI21 Labs. And we were riffing on this idea of a next-generation operating system. And I think you're kind of connecting to it: there's going to be an AI system, and it's not going to be like yesterday's architecture, how neural nets are designed, trade-offs on compute, where it sits, what gets computed, and the math. So there's a lot of new scenarios coming with the connective tissue. And so I guess I want to get your perspective on this as we continue that riff forward. APIs are the connective tissue, in a way, of connection points. Then there's other connective tissue around APIs. So now you have, whether it's graphs or whatever, things like computation and data. Data changes a lot. We expect data to move around. What's your vision on this? Is there going to be, I won't say a refactoring, but even in open source: if software is free, is data open source? Do we open source the data? A proprietary LLM coexisting with open LLMs? So again, it brings up a construct where, you know, a proprietary LLM might not be bad if it's just doing something functional.
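A hedged sketch of the "AI queries APIs to get its answers" idea above: in tool-calling systems, the model emits a structured call, and a runtime dispatches it to a real API. Everything here is illustrative; the tool names, the registry, and the canned responses are hypothetical, and a real system would route the call to a live HTTP endpoint and feed the result back to the model.

```python
import json
from typing import Callable

# Hypothetical "tools": in production these would be real API endpoints.
def get_order_status(order_id: str) -> dict:
    return {"order_id": order_id, "status": "shipped"}

def get_inventory(sku: str) -> dict:
    return {"sku": sku, "in_stock": 42}

TOOLS: dict[str, Callable[..., dict]] = {
    "get_order_status": get_order_status,
    "get_inventory": get_inventory,
}

def dispatch(tool_call_json: str) -> dict:
    """Take a model-emitted tool call such as
    {"name": "get_order_status", "arguments": {"order_id": "A-17"}}
    and route it to the matching API function."""
    call = json.loads(tool_call_json)
    fn = TOOLS[call["name"]]        # look up the API the model asked for
    return fn(**call["arguments"])  # invoke it with the model's arguments

result = dispatch('{"name": "get_order_status", "arguments": {"order_id": "A-17"}}')
print(result)  # {'order_id': 'A-17', 'status': 'shipped'}
```

The point of the sketch is the shape, not the functions: the model never touches a keyboard or a UI, it only emits API calls, which is exactly the "APIs as the primary interface for AI" claim in the conversation.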
Okay, I guess. Yeah. Well, you know, there are like two kinds of LLMs, right? There's the LLM, the large model. That's where, you know, you want to create this Google-style ask-me-anything, like OpenAI: you crawl everything, and the data is not usually refreshed, but they're trying to be this mega monster brain, right? And I think that's one dimension. The other one is the TLM, the tiny language model. Those are small and very specific. And I think for enterprise, for the API use case, the TLM will go mainstream. Because every enterprise has its own data. They don't care about asking questions about, you know, politics or whatever. They want to be very specific about their business. They're domain specific. Very domain specific. Let's say an oil and gas company: they plug a tiny language model into the whole company, and the customers of the company are only going to ask those specific questions about oil and gas, for example. No need to ask about things that are not related to your domain-specific data. So I think on the consumer side, yes, we might have these big brains run by a few players, because it requires, you know, billions of dollars of investment in NVIDIA and so on. But on the enterprise side, we're going to have hundreds of open source TLMs that companies are going to adopt and kind of train on their data for their own customer needs. And I think this is where APIs are going to play a huge role, because, just to your point, it's kind of the nervous system of the cloud. The API works on request-response, which is exactly how the nervous system works. APIs are also peripheral, like our nervous system. And what is missing is really a central nervous system, which is the brain and the spine. And the way we see it is that Kong eventually could be the spine in this world, by managing and securing all these peripheral APIs.
It's interesting, when you talk about software and you combine data into it, you're talking about something like an organism, not a mechanism. So you start to think about it that way, because things will change shape based on the conditions. So that's kind of an interesting theory. Okay, play that forward. What does that look like? Is it cloud, on-prem? Do you care? Is it abstracted away? I mean, you mentioned blockchain. Everyone's down on crypto, but crypto's not blockchain. Blockchain's an infrastructure. Exactly. So where do we go next? What's this telling us, if you had to kind of squint through it and take the 20-mile stare out at the industry? We should do another blog post. I have no time. I mean, APIs, of course, are the lingua franca of the underpinnings. But now new abstractions are going to emerge on top: bolted on, and decentralized. Yes, there is this thing in software where everything is an abstraction layer on top of an abstraction layer. In a way, everything is an API client, right? You're consuming APIs from OpenAI, or on-prem workloads, or cloud workloads. But I do think this API wave, even if it looks like it's been here for a decade, I think we're just at the beginning. Every time we ask our customers how many APIs they have, they may be at a different point in their journey: 10, 100, sometimes 10,000. But everybody's consistent: in three to five years, they usually have three times, five times, 10 times more. That's the connective tissue that requires security, management, observability. So TLMs, tiny language models. I like how you put that out there, because I think every company will have AI, obviously, like a website. Yeah, it's a feature, right? It's like the web early on. Oh yeah, the web's here. Oh, what's that? It's a toy for kids. No, everyone will have a website. Ah, that's never going to happen. And it did happen. Every company has a website, pretty much. So AI is having that same moment.
Okay, so play that forward. As more people onboarded, like with the web, the online population increased and everything got better. As more AI comes online, that's going to tell us something. So as you look at that, what does that app framework look like? Because you're going to have a power law of language models. You're going to have the big ones at the top, and then you're going to have a long tail. Every company will have an LLM somewhere, a TLM or some language model, or a foundation model. Like us: we were talking about it before we came on, we have all these transcripts. Every company will have data. What do they do with it? Do they just interact with other data? Are there connection points? I think data is actually going to be the secret sauce. Because you can see in repos like Hugging Face, you're going to have hundreds of LLMs, thousands of TLMs that are open source. Those are the engines, the open-sourced architecture. But then you've got to fuel them. I think the quality of the output will be based on the fuel. And companies are going to own, directly or indirectly, specific domain data, and in a way they're going to be very powerful because they will have an advantage. So the model is not going to be the advantage anymore. Or even the machines, like H100s: eventually there are going to be all these big GPUs available; there's not going to be a supply constraint anymore. So what's really left is data. Companies that are able to retain specific data are always going to have an advantage. I think they're going to become very conservative about unleashing the data, and to whom. And they're going to make sure they build the right model on top of that data. What's interesting is, when I hear generative AI: generative means it's generating something.
I think of seeds, like growing something. Things just happen, something generates. And that seems to be it: if you have high-quality data and you have the right environment, things can be generated. So you can use that. How is that going to translate into the infrastructure? Because everyone talks about the application part of AI, but there's a lot going on under the hood beyond the app. The classic DevSecOps layers are being disrupted and rethought. What's your view on that, real quick? Yeah, I think AI is an infrastructure problem. And we're talking about generative AI for now, right? If you look at what you're going to see in all these companies, there are kind of three things showing up. Every product now is going to have a bot. Okay, that's one dimension: let's just have a bot that answers you, whatever. Another one is having a copilot. So there are all these copilot companions that people add to help you be more productive. Obviously GitHub Copilot is probably the biggest one, but every kind of SaaS is going to have its mini copilot. And the third is predictions: let me predict some kind of future for you. So if you look at this generative AI wave, what's really happening is you're getting a bot, or you're getting a copilot, or you're getting some prediction charts. So it's not nuclear physics. The application level today is getting applied along these three dimensions. The question would be: okay, if this is like the Internet in 1999, these are early primitives, right? It looks so primitive. Even the way you train the models is so primitive. With these primitive APIs underneath, what is going to be really game-changing in the future on the consumer side? I think a lot depends on how much progress we make on the infrastructure side. And that specifically means data and compute, and how we use those models.
And of course, yeah, that's why I asked the question, because the shift has happened where everyone sees what AI is, everyone's aware of it. Yeah, there are some gloom-and-doom people saying put the guardrails up, but I think now everyone's like, okay, that's definitely happening. So let's get in and clean up the infrastructure. Let's make it scale. We're seeing a lot of customers at Kong that are adding some sort of machine learning capability. And as a side effect, there is API sprawl behind the scenes to power those data movements. Okay, that's a great preview. We just gave a nice preview of what's coming for the next few years. Certainly we're going to be going to KubeCon, CNCF, and re:Invent coming up. So I'm sure the customers will be there. We'll be heavy on that. Let's talk about Kong, because I think this is one of the best success stories I've seen in a long, long time. Great entrepreneurial story. You had a previous company, a great idea, the timing might not have been great at that time, and then Kong came out of it. Take us through the story of Kong. Yeah, so, have you watched the movie The Pursuit of Happyness? Yeah. It's 10 times worse. That's how it started. It was a grind. It was a grind. So we arrived here as immigrants, right? First illegally, and then legally. Yes. We won't tell anyone. We won't tell anyone. He just told everybody. Yes. And we came with this idea, and that's when we actually had the guest blog post on APIs eating the world. The idea was to have a gigantic API marketplace. Think about an API repo, like a GitHub-slash-eBay where you can go and find APIs, consume APIs. So we built on that idea, which was maybe 20 years ahead of its time, right? And we raised capital here from some great investors, and we eventually got our visas and green cards and so on. So we built the company here for about half a decade. And we went through a lot of low lows, a lot of near-death experiences.
The company grew to about 300,000 developers and 20,000 APIs. So it was a big community engine. Revenue was a million, a million and a half. So it wasn't really growing that big. It was emerging: APIs were emerging, it was very early. We were taking a 20% revenue share, so we were only monetizing the monetizable APIs, which at that point was a very small subset. In doing that, though, we built a lot of internal technology and internal infrastructure. And one of those was the API engine, which was called Ape Node because it was built in Node.js first. Then we rebuilt it in Java, and finally we built it in C. And we realized that this internal API gateway we had built to power the whole community, with rate limiting, smart routing, caching, authentication, we could probably give to every company in the world, because we looked around and thought there wasn't really a cloud-native, developer-friendly, fast, efficient API gateway out there. So we open sourced it and it took off right away. And it became this kind of side project. And six months after, we got a phone call from the Obamacare folks, from healthcare.gov in Baltimore, saying, hey, we're using this thing over all our APIs that programmatically talk with TurboTax. But there's a big problem: tax season is coming up and we don't have a commercial relationship with you. And this is mission critical. If it goes down, the whole thing goes down. And they had a lot of traffic on that. And they were rewriting the whole system; healthcare.gov was rewritten, and Kong was part of it. So we sold our first commercial subscription with them over the phone. And we went back to the board with a big check 30 days later and said, folks, this is the business, not Mashape anymore. So we ended up selling, in what became a year-long transition, the Mashape asset to another company. We raised our B round and we focused everything on Kong.
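The gateway features named here (rate limiting, smart routing, caching, authentication) all come down to intercepting a request before it reaches the upstream service. A toy sliding-window rate limiter sketches the idea; this is illustrative only, not Kong's actual rate-limiting plugin, and the class name and parameters are made up for the example.

```python
import time
from collections import defaultdict, deque
from typing import Optional

class SlidingWindowRateLimiter:
    """Allow at most `limit` requests per `window` seconds, per client key."""

    def __init__(self, limit: int, window: float):
        self.limit = limit
        self.window = window
        self.hits = defaultdict(deque)  # client key -> recent request timestamps

    def allow(self, key: str, now: Optional[float] = None) -> bool:
        now = time.monotonic() if now is None else now
        q = self.hits[key]
        while q and now - q[0] >= self.window:  # evict timestamps outside the window
            q.popleft()
        if len(q) < self.limit:
            q.append(now)
            return True
        return False  # over the limit: a gateway would answer 429 Too Many Requests

limiter = SlidingWindowRateLimiter(limit=3, window=60.0)
print([limiter.allow("client-a", now=t) for t in (0, 1, 2, 3)])  # [True, True, True, False]
```

A production gateway does this per route and per consumer, shares counters across nodes, and layers authentication and caching in front of the same interception point, but the core control-flow is the one above.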
We cleaned up the cap table a little bit, and then Kong spun off, reborn like this baby gorilla out of Ape Node, which, funny enough, was called Lucy. But 24 hours before the PR, we changed it to King Kong. But King Kong had too much ego, so it became Kong. They're all ape names. And Lucy died in space. It was bad karma. And so that's where Kong was born, in August 2017. So it's now really six years. So this is a classic entrepreneurial tale: don't give up, and be open to changing your plan if it doesn't work out. That's being in the arena. That's the classic lesson. Yeah, you have to be, I think, strong on the vision, but loose on the details. Yeah, and be ready to pivot or not pivot. I hate the word pivot. It's jumping onto the right tailwind, in this case. You kind of, by accident on purpose, got the right answer because you were in the right market. It was just early. Yeah. Do you surf? Have you ever surfed? I haven't surfed yet. So there's this analogy of the wave and surfing. You have to pick the right wave, because you cannot go faster and higher than the wave. So the API wave was right. What we did wrong for half a decade was the surfboard: it was the wrong technology to ride that wave. And then Kong became the right surfboard to ride the wave. And then you caught the great wave. And then we got bigger and bigger and bigger. And it's not stopping. So let's get into what you guys are doing today. You just had your announcements coming off the API Summit, one of the biggest ones globally. Yeah, it was the largest API event in the world. We moved it from Kong Summit to API Summit, just for the bigger API community. We had 7,200 registered attendees, which made it the largest API conference to date. Take me through, in your mind's eye: describe the API community out there. What's it like? Who are the people? What are their jobs? Because it's changing and growing fast, and it's super important.
What's the community like? And who are the people? It depends a lot on company size. But if you had to divide it, you'd divide API producers and API consumers, right? Take just those two big ones. So millions of developers consume APIs. They need a set of tools for that. We have things like Kong Insomnia that help you consume and test APIs. On the producer side, you go from creating APIs to running them in production: are they reliable, are they secure? Those are more like vice presidents of platform, DevOps, chief architects, enterprise architects. They kind of architect the nervous system of their enterprises. And they're going to be the key players in what we talked about earlier, this next generation. They're going to be the leaders of how we architect an enterprise that looks a lot like a nervous system, through APIs, basically, right? Take me through, in your mind's eye, the kinds of personas that are transitioning from the old way to the new way. I see a lot of VMware operators turning into, like, super cloud, multi-cloud operators. In your world, do you see jobs changing over and people refactoring their careers? Yeah, there's a new job, or I would say a new role, that has grown in the last few years. It didn't exist even maybe five years ago. And it's the head of platform. The VP of platform, platform engineering. Platform engineering, yeah. This is a whole new role. They build the internal platform of the company with a bunch of APIs, which the company's business units can then use to get that developer speed and build that innovation. So there's this platform layer.
It will be interesting to see how that changes with AI, but at the end of the day, what's changing in the market is we see this new role, which is becoming more and more strategic, more and more powerful, making sure there's an internal platform, because basically every company wants to become an API company. And if they don't do that, there will be no company in 10 years. And essentially, that means they're a cloud ops company, basically. They're cloud ops. They're DevSecOps, multi-cloud, hybrid cloud, on-premise: they don't really care. They don't really care. We see multi-cloud, a lot of multi-cloud; usually you have two, one primary and one secondary. But that's the way we do it: we make every company become an API company. Okay, so just on the news: Insomnia 8.0 is out, with auto-generating tests, AI there. You've got the dedicated cloud gateways, the core product, API management across multiple clouds. And you've got Kong Mesh. Yes. The API lifecycle management platform. Single pane of glass: pane, not pain, that hurts; pane as in glass, for service mesh. Yes, yes. How is that playing out? How is service mesh going? We're going to be at KubeCon coming up, and I see super cloud is all the same areas kind of coming together. Of those three areas, one's more mature than the others. Give us a report card on those areas. Yeah, so starting from the first one you mentioned, Insomnia, that's on the API consumer side. We launched a new version which allows collaboration, so developers can collaborate and generate tests. That's really making developers more productive, helping consumers build on top of their APIs. On the other side is how you proxy those APIs, right? You have a gateway. And now we have dedicated cloud gateways. So you can click a button, one click, and we spin up a fleet of gateways around the world on whatever cloud you love, and we run it for you. So you don't have to care about SLAs, availability, and so on.
And the last one is the mesh. We brought mesh into Konnect, which is our cloud SaaS management plane, to provide a unified API platform, a unified experience, where whenever you have an edge use case for your APIs, you can spin up a cloud gateway. But as you're also decoupling internal service-to-service connectivity into tens, hundreds, thousands of services, you can deploy mesh inside that same uber management plane. I think it's unique: it's the only product in the market that provides a unified experience end to end, from mesh all the way to gateways, in one single platform that provides management, control, observability, security, and publishing of APIs. And that's kind of the vision of helping you build the nervous system of the cloud. And that's targeted at these platform engineering teams. Yes, 100%. You know, you can go mix and match, but why not keep everything together? That's your pitch. Right, whenever you need it, right? You can start with the gateway, and then two years from now you might turn on mesh. It's already there: click a button, we deploy your mesh. You don't have to change platforms, change control planes, change anything. Everything is one click away, to provide a seamless experience. We call that a super cloud operator, someone who's dealing with the new infrastructure, like IT used to deal with IT. Now I'm dealing with cloud and all kinds of AI. And there are two questions that come up from this platform, actually three. One, what's going to be enabled on top of it? Obviously these platforms enable things. Yes. Successful apps and whatnot; you mentioned business units. And then two and three, AI and security. So what apps are going to be deployed? Where are they going to land on the platform? What enables that? And security and AI, what about those? Those are three areas of discussion when you go to the next level.
Yeah, Konnect itself is a platform, so we can add any kind of app that covers additional use cases: think about API security, API CDN. So we can provide a lot of use cases over time. Internally, those platform builders really have two kinds of objectives, depending on the business. One is internal: making your own company more productive with a single, you know, API inventory repository that you can provide across BUs, across strategic partners. And that enables that kind of, you know, Amazon 2002 API mandate, where you get this developer acceleration of building apps by reusing and finding APIs. So that's the platform layer for internal use. But it could also be external: how we can help distribute our product, our business, to even more use cases, and you can do it through a public API, right? Look at OpenAI's public APIs; that's the use case. So those are the internal and external platform use cases. That's what happens inside. On the security side, obviously you're creating this API sprawl, and I always say that you have a thousand backdoors if you have a thousand APIs, and you need a whole new set of security to make sure you're in compliance across your whole API stack. And what's the security landscape and posture right now? You have to define that strongly. I would say it's early days of API security, not because of the creation of APIs, but because of how CISOs think about it; that's kind of a different buyer, and they differ in how they think about API security. There are a few companies doing niche products in that segment. I'd say it's going to grow over this decade, the need for API security. And in terms of anomaly detection, enforcement, detections, there's going to be a whole set of new players. Eventually we do believe that API security is a feature of an API platform, not necessarily a standalone entity. So we'll look into it. And AI is a tailwind for you.
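The anomaly-detection side of API security mentioned here can be illustrated with something as simple as flagging a client whose request rate deviates sharply from its own rolling baseline. A toy z-score sketch follows; it is illustrative only, and the class name, window size, and threshold are made-up assumptions, not any vendor's actual method.

```python
import statistics
from collections import deque

class ApiRateAnomalyDetector:
    """Flag a client whose per-minute request count deviates sharply
    from its own rolling baseline (a simple z-score heuristic)."""

    def __init__(self, window: int = 10, threshold: float = 3.0):
        self.threshold = threshold
        self.history: deque[int] = deque(maxlen=window)

    def observe(self, requests_per_minute: int) -> bool:
        """Return True if this observation looks anomalous vs. the baseline."""
        anomalous = False
        if len(self.history) >= 3:  # need a minimal baseline first
            mean = statistics.fmean(self.history)
            stdev = statistics.pstdev(self.history) or 1.0  # avoid divide-by-zero
            z = (requests_per_minute - mean) / stdev
            anomalous = abs(z) > self.threshold
        self.history.append(requests_per_minute)
        return anomalous

detector = ApiRateAnomalyDetector()
flags = [detector.observe(n) for n in (100, 104, 98, 101, 99, 103)]  # steady traffic
spike = detector.observe(900)  # sudden burst, e.g. a scraped API key
print(flags, spike)  # the baseline is not flagged; the spike is
```

Real API security products add per-route and per-payload signals on top of this, but baseline-and-deviate is the core of the "anomaly detection" category the conversation names.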
How would you answer that question? AI, we have a lot in development for next year. We're doing planning, and there are a lot of things coming out next year. Because we see our customers' APIs. Of course, we see how data moves and what events need to be triggered. So adding an AI layer on top, that's one of the things we want to do soon. And how does that change your customers' environments as they pick up and do more initiatives with AI? The beauty of it is we try not to change them. No disruption: once we have the whole API catalog there, we have this kind of system of record, a source of truth for all the APIs. It's about adding a layer that allows you to query those APIs more simply and easily, without disrupting your own CI/CD pipelines, your developer workstream. You don't want to do that, right? If it's cloud native, you just add an AI layer on top, and you don't have to change anything. Because at the end, everything runs on top of an API, and we have the APIs. That's been a great, great masterclass and a great update on Kong. What's going on now? Give a quick plug for Kong. What does the staff look like? Obviously business numbers, can you share any success numbers? Are you hiring? And what's the focus for you as CEO and founder? Yeah, so the mission is the same: powering the API world, making every company an API company, right? We're the leading developer of cloud API technologies. We have about 500 people worldwide. Half of our business is now outside the US, so it's pretty global, with more than a third in EMEA and 15% in APJ. It's a very global business. And we're still in hypergrowth, with more efficiency than last year; interest rates go up and you've got to be a little bit more efficient, but we're still in hypergrowth territory at large-scale numbers.
I think we're still the leader of the space, and there is a roadmap that is like a mile long; at this point it's about how we prioritize it. You having fun? Yes, I think being part of a story that is bigger than yourself, and going through the journey with a lot of smart folks, that is one of those opportunities of a lifetime. Well, I love the origination story. I love how you guys navigated that tough time. That wasn't fun. You know, this builds character, and this is what makes companies have great cultures. Because you can remember that feeling: if we get too lazy and complacent... You've got to keep grinding, got to keep pushing. You never know. Just keep going. You know, now you guys are doing great. So congratulations on your success. And the world's changing too. I think this AI systems concept, a whole nother architecture, is coming. I mean, my prediction is open source has already won. It's not even a question in the software industry. So if software is plentiful and there's a community around it, data is the next open source question. Do you open source the data? What does that look like? Do people want to open source their TLMs? How do TLMs work with proprietary foundation models? So I think open source is going to be tested in AI, in a good way. Because the software is there. Yeah, the models are there. Open sourcing data would be the interesting question, because, you know, there's GDPR compliance and all of that. And it's IP: if that becomes a secret sauce, you keep it close. Yeah. Thanks for coming on. I appreciate it, and good to see you. Yeah, likewise. Thanks for having me. Thanks. theCUBE conversation here with Kong, getting all the latest insight into what's coming: a preview of the big wave coming in next-gen cloud, obviously AI and security.
And of course, the success of Kong as a company is a testament that the cloud native world continues to march along. Obviously compute infrastructure is still exploding and growing. We'll keep it right there for you at thecube.net and siliconangle.com. I'm John Furrier. Thanks for watching.