Welcome back everyone to theCUBE's coverage of re:Invent, 11th year. I'm John Furrier with Dave Vellante. We're here in the MongoDB Emerald Lounge. Emerald Lounge, Emerald City. This is the Sugarcane, this is the sugarcube. Okay, the sugarcube and the Sugarcane, as we say. We've got Dev Ittycheria here, CEO of MongoDB. Thanks for hosting us here with theCUBE. We really appreciate it.

John, Dave, thank you for having me.

Yeah, awesome. Well, we chatted at your event in New York. Gen AI was in the conversation then. Now Amazon's laid their cards down on the table. Data's really important, and they led with storage in the keynote.

Yeah, obviously everyone uses storage. Our relationship with Amazon, though, is really strong. Just to give you some facts: our self-serve business through them in the marketplace has grown 5x over the last 18 months. We're deployed in 29 regions around the world, so we're in every Amazon region. We have thousands of customers of all shapes and sizes using us on top of AWS. We're working closely with them on a bunch of new capabilities, offering new services for our customers. Our teams in the field are working really well together. And so the relationship is going well.

The culture at Mongo is evolving. The presence here at re:Invent is bigger every year. I think it's the second year you have this restaurant kind of bought out. You've got the big booth on stage. Talk about the Amazon and Mongo relationship from a cultural standpoint. It's a faster pace of play now in this tech game with AI, where data is the centerpiece of all the conversations and architecture.

Well, I think in the early days Amazon was very focused on developing its own first-party services. But over time they realized partners like MongoDB can drive a ton of business, and collectively we can solve some very interesting problems for customers. And so that relationship has really blossomed.
We just struck a new strategic relationship with them earlier this year. It spans both go-to-market and product, and general integration of services. So we feel really good about the relationship. We've actually been in this restaurant since 2018. We first did it in a renegade way, where we just bought out the restaurant, and now it's become a core part of the re:Invent show. And it's a great site, right? It's in the middle of the traffic of people going back and forth to the show.

You've had pretty amazing business momentum. I remember when we talked last year at re:Invent, interest rates were rising. You had made the case to Wall Street that, look, we're transactional apps, critical to the business, so we're not as prone to some of the optimizations that are going on. Even though you're optimizing, your customers are optimizing. I remember I talked to some customers in here and they were like, we love Mongo, we're up for renewal, we're going to take it in little bits and pieces. So you guys have managed through that very well, as evidenced by your last quarter. So that premise I think has come true. Increasingly you move into sort of the world of analytics. You're expanding Mongo, and you've basically got a very small share of a very large market. How do you think about sort of your place in the TAM?

Yeah, so our whole focus is enabling developers to do more. So we're not trying to be a data warehouse or some analytics platform for a BI user. We're really focused on enabling developers to do more sophisticated analytics as part of building very intelligent applications. As you know, we introduced Vector Search, and the reason developers are flocking to it is that as AI becomes more democratized, as it moves left, away from the machine learning engineers to developers really embedding AI to make apps smarter, it's having a real profound effect.
And the reason they gravitated to MongoDB is that the usability of all these capabilities is very well crafted, so that the technology gets out of the way and developers can just do their work. They don't have to clunk through and meander through a bunch of point tools. They can use a very seamless platform to be able to move fast. And in a world where customers are being asked to do more with less, the best way to do that is to make your developers more productive, because you're not necessarily going to grow your team by 30%, but if you can make your developers more productive by 30% or 40%, all of a sudden, by definition, you have 30% or 40% more capacity.

So I presume you saw the Retool survey. If you haven't seen it, go to retool.com. Retool is a software development platform for building business apps, and they did a great survey of, I think, 1,400 or 1,500 developers. And I was shocked. They asked, which vector database are you using? You guys haven't even GA'd yet, and you were as popular as Pinecone, but your NPS was off the charts. So obviously the developers are trying it, they're downloading it. We use Mongo, and we actually use Milvus, open source, as a separate vector database, so we want to sort of think about that. Pitch us on why we should be using Vector Search in Mongo. What are the benefits there, and why are developers taking it up so quickly before it's even announced GA?

Well, the analogy I'd use is with products like the iPhone or like Tesla. It's not like there aren't other phones or other cars, but how it's all packaged together, in a way that's so compelling and so user-friendly and enables people to do what they want to do, is essentially what differentiates MongoDB.
So yes, there are other vector point tools out there, but it's very clunky for a developer to connect their vector data to their metadata, to their core data, be able to orchestrate all that, then figure out how to do the embeddings and all that. It just becomes a very convoluted process. We made that so much easier, and you're right, we're not even GA, but the demand from customers has been off the charts, and we're really excited about the opportunity. Now, I do want to say it's early days. I don't want people suddenly to think there's going to be some inflection point in our business, but the fact that people are gravitating to MongoDB makes us feel good about where the market's going, and that they're going to think of MongoDB first when they think about building AI apps.

Well, John, I don't know if you saw it, only 20% of the survey respondents were actually using a vector database. So I was actually proud we were one of the 20%.

Well, there are a lot of things going on in vector data. Is it distributed? Can I use it on my machine? Is it working with my data store? We talked to Sahir, I talked to him at Google Next about this. Even though we use the open source version, we're going to put it with the data store, mainly because of what we saw in the keynote today, when you see things like agents and the Q product that they showed. The dots connect when you want to take advantage of those workflow automations by having the embeddings, which are essentially post-ingest value.

One of the other things that I think differentiates MongoDB is that the usability is very good. Because think about it, your vector data is just one subset of data. So imagine, say, you and your family are looking for a new house and you see a picture of a beautiful house. You can do a search in MongoDB saying, I want to find out, is there any house like this within 10 miles of this zip code?
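For illustration, a query like the one described here could be sketched as a single Atlas aggregation pipeline that runs the vector search and then filters the results with an ordinary geospatial stage. This is a minimal sketch in PyMongo-style syntax under stated assumptions: the index, collection, and field names are hypothetical, and the query vector would come from whatever embedding model produced the stored embeddings.

```python
def build_house_search_pipeline(query_vector, center, max_miles=10):
    """Find listings whose photos resemble the query image, then keep
    only those within max_miles of a [longitude, latitude] point.

    Index name "default", vector field "photo_embedding", and geo field
    "location" are hypothetical stand-ins for this sketch.
    """
    meters = max_miles * 1609.34          # miles -> meters
    earth_radius_m = 6378137.0            # $centerSphere wants radians
    return [
        {
            # Approximate nearest-neighbor search over stored embeddings;
            # $vectorSearch must be the first stage of the pipeline.
            "$vectorSearch": {
                "index": "default",
                "path": "photo_embedding",
                "queryVector": query_vector,
                "numCandidates": 200,
                "limit": 20,
            }
        },
        {
            # Marry the vector results with geospatial data: an ordinary
            # $match narrows the semantic hits to the requested radius.
            "$match": {
                "location": {
                    "$geoWithin": {
                        "$centerSphere": [center, meters / earth_radius_m]
                    }
                }
            }
        },
        {
            # Keep a few fields plus the similarity score.
            "$project": {
                "address": 1,
                "price": 1,
                "score": {"$meta": "vectorSearchScore"},
            }
        },
    ]

pipeline = build_house_search_pipeline([0.12, -0.08, 0.33], [-115.17, 36.12])
# Against a live Atlas cluster this would run as something like:
#   results = client.real_estate.listings.aggregate(pipeline)
```

The point of the sketch is that the vector stage and the geo stage live in one pipeline against one data store, rather than stitching a separate vector database to the system of record.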
Now you're marrying vector data with geospatial data, right? And you're doing it all in one platform. Or you see a wonderful image and you want to say, hey, who is the photographer who took that photograph? And I want to see what kind of other work they do. So you can do very sophisticated analytics with your vector data. Those kinds of sophisticated queries are what developers are really excited about when they use MongoDB.

So if I asked you on an earnings call, they don't really ask these technical questions, but if I said, okay, I get Atlas, what is Atlas Vector Search and what are search nodes? How does it all work together? What's the difference between them? Why should I care?

So vector search essentially enables you to create vectorized data in a very seamless way, and then use things like RAG to marry your private data with your public data, to get more accurate results from a large language model. That's why people are very interested. Search nodes are the ability to scale your search workload up and down depending on the use case of your app. So for a very search-intensive use case, you can scale up your search nodes without having to scale up the rest of your cluster, so you get much better price performance in your architecture. So we offer these kinds of fine-grained tools for customers to figure out how they want to build their app and what kind of performance levels they want to offer to their customers, and we do it in a very cost-effective way.

I think this whole search concept around generative AI is interesting, because it changes the game. It's multi-dimensional, and the apps are now going to take advantage of it. So what's your vision for how you're going to enable this next-gen app market? Because low-code, no-code is coming, we see that, and it's still going to shift left to developers.
What is going to be the lingua franca for developers when it comes down to enabling them to do apps?

Well, I think you're going to see a bunch of threads, right? Obviously, with code generation tools, you're going to see developer productivity increase pretty dramatically. So the scale and ambition of the kinds of workloads they want to take on will just go to the next level. I think you're going to see them embed more intelligence into these applications, whether it's driving automation or productivity, et cetera. And then I think what you're also going to see is people looking at their legacy platforms and saying, I need to migrate my legacy platforms to an architecture that can position me and future-proof me for AI. So you're going to see a lot of customers, and we're investing in a lot of tooling to enable customers to more easily migrate from their legacy relational platforms to a more modern platform, to position themselves for an AI future.

Does that imply the cloud, or not necessarily? Because you play in both places. Are you seeing demand from customers to apply Gen AI to their data that's on-prem?

Yes. I mean, obviously we have nearly 50,000 customers, from the largest companies in the world to two guys in a garage. What we're seeing is a lot of customers, especially large customers, who have regulatory constraints on what they can do in the cloud. Or they have sunk infrastructure costs, so for them it makes sense to keep using that infrastructure because it's already paid for. So what we've heard loudly from customers is that they love the optionality. They love the optionality of using MongoDB on-prem and then future-proofing it for the cloud, or starting on the cloud and going multi-cloud. Either staying on one platform, or using multiple platforms for different apps, or in some cases using the same app on different clouds, because they need the geo-diversity of not being in only one region in some part of the world.
If that region goes down, they're in trouble. So they can basically have diversity in a certain part of the world across two different cloud providers. That gives them such a range of optionality that they really value doing that on MongoDB.

So my final question for you is, what's the strategy going forward? I see you have a good bet: developer productivity is your target. You're not trying to be some sort of data thing all over the place. You're a developer-first platform with Atlas. I see the dots connect with Gen AI, and what's possible with agents and other kinds of copilot things. What's your strategy to run the business going forward?

Our strategy, said very simply, is to enable a broad set of use cases across a broad set of deployment models, right? From on-prem to the edge to the cloud, and also multi-cloud, and a broader set of use cases. So we're just increasing the aperture of the use cases being addressed. We talk about search, vector search, time series. There's a whole host of use cases that we're supporting, because we want to make it so easy for developers to think of MongoDB to address any problem they want to solve, across any deployment model.

So those are basically workloads, right, that you're attacking. You don't necessarily have to, who knows, you may go out and buy some companies for technology, but it's not like you have to buy companies to increase your TAM.

No, we're going after one of the largest markets in enterprise software.

There's no lack of TAM for you. And it's interesting to see, because you see the analytics and the BI guys trying to bring in the transaction data. You're on the transaction side, and as you've described before, you just want to make it easier for your developers to do analytics with the data that you have today. So your TAM is, I don't know, 100 billion. I mean, it's enormous.
Yeah. Obviously, I wouldn't say we'd never do M&A if we see an opportunity where buying is faster than building, but we're always looking to innovate quickly. And obviously in this market there are now more opportunities to look at some technologies and teams that may not have been possible a couple of years ago.

But you don't have to pay 28 billion for Splunk, as an example.

Well, that's why I asked the culture question. I want to come back to that, because I want to circle around. If you look at the history of Mongo, it's always been a developer culture. Okay, but now you're becoming so big and growing so fast. How is that being managed internally with the team as you add more people and you're going faster? I mean, you see Amazon's announcements, you're being pulled into that vortex of this new wave. So again, TAM's expanding, that's great, check, but now you've got your teams, a global team. How is the culture?

Our biggest challenge, I would say, is that there's a big difference between being well known and known well. A lot of people think they know MongoDB because they tried MongoDB three, four, five, six years ago, but many of them have no idea what we're doing today. And especially at the senior levels, maybe because they weren't developers using document databases, they still have a very relational mindset. So we know that we have to constantly educate them on why a modern platform like MongoDB is so much more compelling in terms of scale, flexibility, performance, agility, security, and simplicity versus the existing platforms.

I guess I have one more final question, since we've got one more minute left. The theme here is price performance. We had shifted from speeds and feeds to solutions. Now we're back to how much it's going to cost me over time in the systematic rollout of a platform, and what's the performance of it, down to the chip level?
So what's your reaction to that? How do you see that going forward? Is this going to be more of the same? Will it continue to be a price performance conversation?

We hear a couple of things from customers. One, they still need to innovate. Two, AI is at the center of everything they're thinking about. And three, they need to do it in a very cost-effective way. So the ROI bar for making investments has definitely gone up. That's actually why, in some ways, we think this helps us longer term, because people are tired of using these single point solutions for every single use case and then looking at their data architecture and seeing a complete mess. It's very costly, very complex, and very cumbersome. So being able to do vendor consolidation on a more modern platform is something that I think is going to be good for us.

When you have these big inflection points, it kind of flips the script. Data management, how you do data, how you develop with data, is in your wheelhouse, and you've got a great focus. Thanks for coming on theCUBE, Dev. It's great to see you.

It's great to see you guys. And thank you for hosting us today.

It's our pleasure. Excellent space and primo real estate.

Our pleasure. Thanks for having me.

Dave Vellante is here with me. We've got Kelly Kramer here, Rob Hof, and more. Team coverage on the ground here with theCUBE and SiliconANGLE, bringing you the best editorial content as part of our re:Invent stories. Go to SiliconANGLE.com. We've got a big feature story out there, Breaking Analysis going on, videos hitting, and a live stream out of our Palo Alto office, Supercloud 5, special edition. We'll be right back after this short break.