Welcome back everyone to theCUBE's coverage here on location in Las Vegas for re:Invent, AWS's annual conference. I'm John Furrier, your host, with Dave Vellante, Shelly Kramer, Don Klein, Rob Hofer. Our whole team coverage is here. We are in the MongoDB Emerald Lounge Club at the Sugar Cane bar in Vegas. This is the Sugar Cube action day. We called it the Sugar theCUBE here. Our next guest is the CPO, Chief Product Officer, of MongoDB. Thanks for having us here. Welcome back to theCUBE. Good to see you both. Good to see you guys. So obviously the keynote, chock full of products. Yes. Alright, generative AI, a lot of data discussions. I thought it was a really good keynote. They had to come out, Adam had to deliver; they were getting hammered in the press. He hit a home run. But you know, you guys have a great relationship. The booth gets bigger every year. The partnership gets bigger. What's the status of your relationship with Amazon? Let's start with the partnership. You guys have a relationship with these guys, you have an ecosystem. What's the update on the partnership? Yeah, I would say, as you can tell from the investment and the energy we've put here, our partnership is really strong, and frankly, I think it's getting stronger. Just this week we announced integrations of MongoDB with some of their AI technologies, with Bedrock, as we work with them on training their large language model code assistant, CodeWhisperer, on MongoDB best practices. So I would say the partnership is at an all-time high. We're seeing a lot of areas of collaboration, and we're fortunate to be a premier partner and value the relationship with AWS. What does it mean to integrate with Bedrock? What do you have to do to do that? You have to accommodate all this different optionality. How difficult or easy is that? Absolutely.
So, you know, we love what they're doing with Bedrock because it creates a really nice abstraction and API for any development team to start to embed generative AI capabilities more easily as part of their application. And part of that is obviously the large language models, whether that's integrating the open source models or the various other available proprietary models. That's one part of the equation. But as you guys know, a lot of what organizations are trying to do is add context with their own domain-specific proprietary data, the unique information their organization has. And a lot of that gets stored in their own data sources and structures. One of the most popular for that, of course, is MongoDB and our vector search engine. So as Bedrock looks to combine public LLM workflows with proprietary data, with custom embeddings of whatever unique information an enterprise has, joining those together in an integrated developer experience that you can serve simply through an API becomes a really interesting combination. And so we're working with them to make sure we're a first-class citizen as a backing vector data store within the Bedrock experience, and we've started work on that integration. It's interesting because they talk about Bedrock as layer two of the stack, which is essentially the middleware now. I call it middleware to oversimplify it, but the point is that's where the action is. Data's got to be key; you've got a power law, a long tail of open source alongside the proprietary models. So, you know, this is a big data game. Big data, and a whole new stack emerging for how people are building these generative AI powered applications.
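The retrieval side of that Bedrock-plus-MongoDB pattern can be sketched concretely. A minimal example of building an Atlas Vector Search aggregation pipeline, where the index name `vector_index`, the `embedding` field, and the query vector are illustrative assumptions rather than anything named in the conversation; in practice the pipeline would be passed to `collection.aggregate()` via a driver such as PyMongo:

```python
def build_vector_search_pipeline(query_vector, k=5, num_candidates=100):
    """Return an Atlas $vectorSearch aggregation pipeline for top-k retrieval.

    Sketch only: index and field names are assumptions for illustration.
    """
    return [
        {
            "$vectorSearch": {
                "index": "vector_index",      # assumed Atlas Search index name
                "path": "embedding",          # assumed field holding the vectors
                "queryVector": query_vector,  # embedding of the user's question
                "numCandidates": num_candidates,
                "limit": k,
            }
        },
        # Keep only the text to stuff into the LLM prompt, plus the
        # similarity score Atlas computes for each match.
        {"$project": {"text": 1, "score": {"$meta": "vectorSearchScore"}}},
    ]

pipeline = build_vector_search_pipeline([0.12, -0.03, 0.98], k=3)
```

The retrieved documents would then be appended as context to the prompt sent to the Bedrock-hosted model.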
And so we're obviously focused on the data layer: how do we simplify operational data, metadata, search data, and vector data all in a single platform, but then plug into the various frameworks at layer two and make sure we're open and ubiquitous no matter how customers want to build. I was going to ask you, and that's on the record for our AI since it's capturing everything now: what is your product strategy as this new stack emerges? Obviously the legacy of Mongo, a huge install base. You don't have a lot of customers that leave you; they grow with you. What's the current product strategy on your plate right now? Yeah, as it relates to AI, there are really three pillars to what we're focused on. The first is, as I mentioned, making sure MongoDB as a data platform is suited for this next generation of AI applications. So that's obviously our core database, the scalability, the performance, the ability to join massive datasets together, combined with newer features we've added to the platform like vector search, bringing that to the developer experience of people who love MongoDB without having to introduce another technology component, more sprawl in their environment. And that's both development we've done on our own and also partnerships. So we mentioned Bedrock, but there's also an emerging ecosystem of startups and open frameworks: LlamaIndex, LangChain. We've worked proactively over the last year to make sure MongoDB is well integrated into an open stack of technologies and tools and can be part of the foundational data layer for these modern applications. That's pillar one of our strategy, and I know we're hearing a lot of focus on it this week. The second is how do we make the developer experience of working with MongoDB even easier, even more seamless? This is where we've taken AI and embedded it in our products, and even beyond the products into our developer experience.
So everything from making our documentation smarter, with the ability to use semantic search to find answers more quickly, to leveraging code assistants in our IDEs so a developer can build complex queries much more easily, even using natural language, or even serving business users who may want to build a dashboard or report on their operational data in Mongo. They may not be technical or know the MongoDB query language, so now you can just ask a question of the system and it'll generate a chart. So really, how do we make developers more productive using our platform? That also ties into taking our knowledge and feeding it into things like CodeWhisperer and the other AI code assistants, because we have a massive developer community increasingly using code assistants. We want to make sure the answers coming back adhere to our best practices and produce efficient, accurate code. So that's where we're working with our partners to train those models on our unique information. You're going to be input into Bedrock, as you mentioned before. Exactly. Your core competency is your community; you're harvesting that data. And feeding those best practices into those models. And then the third, which I think is an interesting one, is that there are billions of dollars of legacy applications built on legacy data technologies sitting in the data center. We've been working for years on modernization projects with our services or our partners in the SI space. We really think there's an opportunity for GenAI LLMs to help automate everything from understanding legacy apps where the developers aren't even around anymore, to understanding old legacy stored procedures and converting them into MongoDB queries, or even taking the app code and automating some of the modernization into a microservices architecture. Do we think it's going to be fully automated anytime soon?
Probably not, but even 10, 20, 50% of that has hundreds of millions of dollars of potential cost savings for our customers who want to get out of that legacy and move onto a more modern platform. So your vector search adoption is exploding, which shocked me. I saw the Retool survey, it was like 1,400 survey respondents. Only about 20% of them, so maybe 250 to 300, were actually using vector databases, and Mongo and Pinecone were the most popular, and your NPS was off the charts. I was shocked because you're not even GA yet. So why is that uptake so high? And are there use cases where it doesn't make sense to consolidate on Mongo? Sure. I think we're very fortunate that the reception of our vector search technology, which is just embedded into the core MongoDB experience, has been really exciting. We've got more pull for this product than anything recently, given how exciting everything is in the AI space. And I think the reason why is fundamentally about the developer experience. We like to joke internally that an AI application is still an application. You still need a scalable transactional database. You still need the ability to search your data using keywords and synonyms. You still need stream processing. You need all these other capabilities. It's not just the vector database. So instead of having to bolt on another component that you have to operate, scale, and learn, if that just becomes part of the technology developers are already using, like MongoDB, it allows them to move much faster and prototype much more quickly. And that elegant experience of having it all integrated has been a real benefit. Is there a trade-off, though? You would think that if somebody did nothing but vector databases, putting myself in their shoes, I'd say, well, we're faster, we're more focused, that's where all of our R&D goes. Is there a trade-off there, or is it pretty much a feature?
We certainly have a point of view on this. We think it's frankly a feature. And if you look at where a lot of the investment in the pure-play technologies is going, it's into building a large-scale distributed operational database system. We've been doing that for 15 years. So that's actually where a lot of the dollars are going, versus anything that novel around vector management. Now, that being said, of course there's going to be an ecosystem of specialized tools, and just like there are specialists in every niche of the database market that can serve very esoteric needs, that's great. We're trying to be a general-purpose platform for 80% of the use cases in the average organization, and that's really where we're seeing the uptake. Would you invest in one of those vector database companies? No, it's a feature, not a company. All right, so I want to continue with a little bit on a use case. The other interesting thing is, as you dip your toe into the analytics space, your strategy is essentially to make it easier for your developers to do certain analytics. We don't necessarily need Snowflake to figure out what's going on inside of our RAG, right? So how can a company like ours extend to do some basic analytics on how people are using the system, what the patterns are, and where we should double down? Sure. For us, we're clearly in the operational space, and what we're focused on is how we make sure you can get analytics in real time off your operational data, whether that's a business user looking at a dashboard or, increasingly, an application that automates that decision-making process in software and code. And one of the things we've seen over the last few years is analytics moving from reporting for executives to make decisions, to smarter software that automates that decision-making process.
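The "feature, not a company" argument is easier to see when vector search is stripped down to its core: a similarity ranking over stored embeddings. A toy, in-memory sketch in pure Python, assuming the embeddings and document names are made up for illustration and have already been produced by some embedding model:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def top_k(query, docs, k=2):
    """Rank stored (doc_id, embedding) pairs by similarity to the query."""
    ranked = sorted(docs, key=lambda d: cosine(query, d[1]), reverse=True)
    return [doc_id for doc_id, _ in ranked[:k]]

# Tiny illustrative corpus with 2-dimensional "embeddings".
docs = [("faq", [1.0, 0.0]), ("pricing", [0.0, 1.0]), ("setup", [0.9, 0.1])]
print(top_k([1.0, 0.05], docs))
```

A production system adds approximate-nearest-neighbor indexing, filtering, and scale, which is exactly the distributed-database work the answer says the dollars actually go into.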
Well, that becomes an operational data platform that developers have to care about, not something that just analysts and a BI reporting team care about. And I think, if anything, the boom with GenAI is doing the same thing with data science and machine learning, where instead of a walled-garden, centralized data science and ML team running analyses and insights, now every development team is being asked to be competent in how to apply GenAI to the software experiences they're building every day. So this kind of shift left that we saw with security, or that we saw with operations, is now happening with analytics and with ML, moving toward where we think things really go, which is more sophisticated, smarter software that ultimately automates more and more of that business process. And we think we're well positioned given our strength in the developer community and in operational, scalable systems. You guys are the first ones on theCUBE to actually use the shift-left analogy. We've been saying for a year now that the data developer is emerging. We talked about it in New York. And data is going to be part of their workflow. And so, okay, they're going to shift left. Totally agree. I think that's a home-run winner. Lock that in. Now the next question is, what changes in analytics when it comes to observability, in quotes? Because one of the big conversations with the whole RAG, retrieval-augmented generation, approach is that you don't have metrics on what is good. There's the memory aspect of the results, and we see that today: type in the same query and you get different answers. So as developers try to figure out the equation, the math so to speak, how do you see the product side providing that reporting and verification? Is it working? How do I repeat it? How do I iterate? There are a lot of interesting open questions right now. I mean, there's a vibrant ecosystem of new technologies and new approaches all around generative AI and how we operationalize it.
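The repeatability complaint raised here, the same query returning different answers, can at least be measured crudely. A toy sketch, not any vendor's observability API, that scores how often repeated runs of one query agree with the most common answer:

```python
from collections import Counter

def consistency(answers):
    """Fraction of runs returning the modal answer: a crude
    repeatability metric for a generative pipeline (toy illustration)."""
    if not answers:
        return 0.0
    modal_count = Counter(answers).most_common(1)[0][1]
    return modal_count / len(answers)

# Hypothetical outputs from running the same prompt four times.
runs = ["Paris", "Paris", "paris", "Paris"]

# Normalizing case first avoids penalizing trivial formatting drift.
print(consistency([a.lower() for a in runs]))  # 1.0
```

Real evaluation tooling goes further, using semantic similarity rather than exact matching and scoring groundedness against the retrieved context, which is the model-eval and model-observability space the answer goes on to describe.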
So one of the things we're partnering on, and also very closely focused on, is how we make the experience of tying into a model, whether it's an open source model or a public foundational model, more seamless for developers, so that as things shift left there's less custom development to tie that whole RAG workflow together, and you can stitch it together in a way that's more prescriptive, more abstracted. That's step one. But then we're also working closely with a variety of partners in model evaluation and model observability to ask: which of my models are most accurately producing results that are consistent and repeatable? How am I reducing hallucinations based on the vectorization and the custom context that I have? There's a whole suite of technologies being invested in by venture capital right now around model eval and model observability. And a question for us is really, are we domain specialists in that, or is it an opportunity to partner and create an open ecosystem where we can promote multiple providers with different domain experience across that lifecycle? I think it's early days, and we're being very open-minded. If you're a smaller organization out there working in these areas, we are all ears on innovation labs, on investments and integrations. What's the opportunity for you to invest and partner with the ecosystem? All right, so the next question, which I think teases this out, is open versus closed. I've been comparing the AI wave to the web. My generation, I remember that clearly. But prior to the web, we had online service providers, the proprietary dial-up systems like AOL, and they're no longer around. The web was standard: DNS, HTTP, HTML. Content and service providers moved over to the web. That's the historical perspective. Is there a parallel here? Because there's a big open source conversation on what the models are, but also on what's being built.
Will there be some sort of web- or DNS-like AI model coming out, a new system to do it? Is it the data layer or the control plane? What's your vision on this? I know it's an open, tough question, but... It's a tough question because obviously there's a lot of dynamism right now. It's early days. Actually, one of our product VPs likes to say we're in the AOL era of the Internet of AI. Don't be on the wrong side of that one. Exactly. Netscape had the answer and got beat by Microsoft. I think organizations are under a lot of pressure, on one hand, with the economic environment right now, to save costs and be super efficient, but at the same time to respond to one of the biggest technology platform shifts of our era, the whole shift to AI. There's a tension there. And so I think anytime there are these big waves and a lot of pressure to move fast, getting an output and a solution out there, even if it's closed or a vertically integrated stack, can sometimes, in my opinion, win a first-mover advantage, because it's prescriptive and it's an easy answer to get a result. But as the market matures over time, customers build sophistication. They want more control, they want more flexibility, and that's where open standards and open platforms emerge. So if you ask me whether I bet on things over time becoming less proprietary and more open, I would probably say yes, even though right now there's a lot of traction on these very prescriptive, very opinionated, low-hanging-fruit solutions, because everyone's trying to get their feet wet and experiment, and sometimes something more proprietary may be the fastest way to get there. You mentioned low-hanging fruit. I know you're an investor and advisor to startups and a friend to startups. What's your advice to people out there building? Because when OpenAI had their dev day, everyone was saying, oh my, they've killed five startups.
There's always going to be some carnage when you have these white spaces that aren't legit. What's your advice to startups? There are a lot of young founders out there now super excited about AI. They see the wave, they want to ride this wave. What's your advice to people who are building? Yeah, I think this advice is as much from an investor standpoint as a builder standpoint. Like any big transition you're alluding to, there's a ton of hype, there are going to be a lot of companies that fail and a lot of experiments, but then there are going to be real, sustainable businesses that emerge. The biggest thing I would spend time on as a founder is thinking about what is really your defensibility, your sustainable moat. Because I'm seeing a lot of technologies right now that are just lightweight wrappers, a simple user experience where you create an experience that's amazing and could really disrupt an incumbent, but then when you look at it, it's three developers in a garage and it took them a week. Which means, how defensible is that when the next startup or an incumbent can replicate what you're doing? So really, without trying to be too prescriptive about a particular area, think deeply about what's going to be defensible long term, what your unique IP is that won't just be easily replicated by an incumbent or the next startup that moves a little bit faster and can build it in two weeks. Things are moving so fast, and with startups just chasing these novel use cases, we're seeing a lot of them spike and then flatten really fast. Yeah, and crash. And crash, and that's because there's not a lot of defensible IP in a lot of these. We'd love to talk more with you. Great success at Mongo, congratulations. Final question to end the segment: what's next for you? You have the product roadmap; we get the keys to the kingdom at Mongo. What's on your mind? What's next?
I know this is probably a bit of a boring answer, but it's keep our heads down, work with our customers, make sure we're delivering value, and with this AI wave, stay close to the ecosystem. I mean, this is one of the most fun times I've had in my career. It reminds me of when AWS and cloud were the big platform shift getting started 15 years ago. We're in sort of a V2 of that, at the next level of scale, and we just want to be part of it. You can see the buzz here, and hopefully a lot of white space for partners. Exactly. Open-minded. Absolutely. So here, thank you so much for coming on theCUBE. We really appreciate it. It's great to see you both, as always. Thank you for joining us here. Yeah, thanks for having us. We're here in the Mongo Emerald Club Lounge, as they're calling it, a great venue. Their top customers are here, theCUBE is here, and we appreciate you guys. Back to the studio. We'll be back with more on-location coverage after this short break.