I'm John Furrier, the host of theCUBE in Palo Alto, California, also with SiliconANGLE News. Got two great guests here to talk about AI, the impact on the future of the internet, the applications, the people: Amr Awadallah, the founder and CEO, and Ed Albanese, the COO of Vectara. It's a new startup that emerged out of the original Cloudera, I would say, because Amr is famous for founding Cloudera, which was really the beginning of the big data movement. And now, as AI goes mainstream, there's so much to talk about, so much going on. And plus, the new company is part of what I call the next big wave. I call it the fifth wave in the industry. You know, you had PCs, you had the internet, you had mobile. This generative AI wave, I think, is real, and you're starting to see startups come out in droves. Amr, obviously, was a founder of Cloudera, big data, and now Vectara, and you guys have a new company. Welcome to the show. Thank you, it's great to be here. So great to see you. Now, the story is, theCUBE started in the Cloudera office, thanks to you and the friendly entrepreneurship views that you have. We got to know each other over the years, but Cloudera had Hadoop, which was the beginning of what I call the big data wave, which then became what we now call data lakes, data oceans, and the data infrastructure that developed from that. It's interesting to look back 12-plus years and see that what AI is doing right now is opening up the eyes of the mainstream, and the applications are almost mind-blowing. You know, Satya Nadella called it the Mosaic moment. He didn't say Netscape; he said the Mosaic moment. You're seeing companies and startups, you know, kind of the alpha geeks running here, because this is the new frontier, and there's real meat on the bone in terms of things to do. Why? Why is this happening now? What's the confluence of forces making this happen?
Yeah, I mean, if you go back to the Cloudera days with big data and so on, that was more about data processing. Like, how can we process data so we can extract numbers from it and do reporting, and maybe take some actions, like this is a fraud transaction or this is not. And in the meanwhile, many of the researchers working in the neural network and deep neural network space were trying to focus on data understanding. Like, how can I understand the data and learn from it so I can take actions based on the data directly, just like a human does? And we were only good at doing that at the level of somebody who was five or seven years old, all the way until about 2013. And starting in 2013, which is only 10 years ago, a number of key innovations started taking place, and each one added on. There was no single major innovation that just took place; it was a couple of really incremental ones, but they added on top of each other in a very exponentially additive way that led to, by the end of 2019, us having deep neural network models that can read and understand human text just like we do. And they can reason about it and argue with you and explain it to you. And I think that's what is unlocking this whole new wave of innovation that we're seeing right now. So data understanding would be the essence of it. So it's not a big bang kind of theory; it's been evolving over time. And I think the tipping point has been the advancements and other things. I mean, look at cloud computing and look how fast it just crept up on AWS. I was talking to Swami yesterday, and there's big news about AWS expanding the Hugging Face relationship. And just three to five years ago, there weren't a lot of trained models out there. But as compute comes out and you get more horsepower, these large language models, these foundational models, they're flexible. They're not monolithic silos. They're interacting.
There's a whole new, almost fusion of data happening. Do you see that? I mean, is that part of this? But of course, of course. I mean, this wave is building on all the previous waves. Like, we wouldn't be at this point if we did not have hardware that can scale in a very efficient way. We wouldn't be at this point if we didn't have data that we're collecting about everything we do, that we're able to process in this way. So this movement, this motion, this phase we're in absolutely builds on the shoulders of all the previous phases. For some of the observers from the outside, when they see ChatGPT for the first time, for them it's like, oh my God, this is an inflection that just happened overnight. It didn't happen overnight. GPT itself, like GPT-3, which is what ChatGPT is based on, was released a year ahead of ChatGPT. And many of us were seeing the power it can provide and what it can do. I don't know if I have to agree with that. I mean, I do, although I would acknowledge that the possibilities now, because of what we've hit from a maturity standpoint, have just opened up in an incredible way that just wasn't tenable even three years ago. And that's what makes it. It's true that it developed incrementally, in the same way that, you know, the possibilities of a mobile handheld device in 2006 were there, but when the iPhone came out, the possibilities just exploded. And that's the moment we're in. Well, I think, and I've had many conversations over the past couple of months around this area with ChatGPT. John Markoff told me the other day that he calls it the $5 toy, because it's not that big of a deal in context to what AI is doing behind the scenes, and all the work on ethics that's happened over the years. But it has woken up the mainstream, so everyone immediately jumps to ethics: does it work, is it factual? While everyone who's inside the industry says this is amazing. Because you have two schools of thought there.
One camp is like, hey, this is the beginning of the next gen; now we're here. This ain't your grandfather's chatbot, okay? With NLP, it's got reasoning, it's got other things. So they get to... Yeah, I'm in that camp for sure. Yeah, well, I mean, everyone who knows what's going on is in that camp. And as the naysayers start to work through this, they go, wow, it's not just plagiarizing homework, it's helping me be better. Like, it can rewrite my memo, bring the lede to the top. So the format of the user interface is interesting, but it's still a data-driven app. So where does it go? Because I'm not even calling this the first inning. This is like pregame, in my opinion. Where do you guys see this going, in terms of scratching the surface to what happens next? I mean, I'll start with: I just don't see how an application is going to look the same in the next three years. Who's going to want to input data manually in a form field? Who is going to want, or expect, to have to put in some text in a search box and then read through 15 different possibilities and try to figure out which one of them actually most closely resembles the question they asked? You know, I don't see that happening. Who's going to start with an absolutely blank sheet of paper and expect no help? That is not how an application will work in the next three years. And it's going to fundamentally change how people interact and spend time with any element they open on their mobile phone or their computer to get something done. Yes, I agree with that. Like, every single application over the next five years will be rewritten to fit within this model. So imagine an HR application. I don't want to name companies, but imagine an HR application, and you go into the application and you're clicking on buttons because you want to take two weeks of vacation.
And menus, and clicking here and there, reasons and managers, versus just telling the system: I'm taking two weeks of vacation and going to Las Vegas, book it, done. And the system just does it for you. If it doesn't, if you weren't complete in your input and your description of what you want, then the system asks you back: Did you mean this? Did you mean that? Were you trying to also do this as well? What was the reason? And then it will fill that in for you and just do it. So I think the user interface that we have with apps is going to change to be very similar to the user interface that we have with each other. And that's why all these apps will need to evolve. I know we don't have a lot of time, because you guys are very busy, but I definitely want to have multiple segments with you guys on this topic, because there's so much to talk about. There are a lot of parallels going on here. I was talking again with Swami, who runs all the AI and database services at AWS, and I asked him, I go, this feels a lot like the original AWS. You don't have to provision a data center. A lot of the hard, heavy lifting on the back end is these large language models, these foundational models. So the bottleneck in the past was the energy and cost to actually do it. Now you're seeing it being stood up faster. So there's definitely going to be a tsunami of apps. I can see that clearly. What is it? We don't know yet, but also people are going to leverage the fact that they can get started building value. So I see a startup boom coming, and I see an application tsunami of refactoring things. So the replatforming is already kind of happening. OpenAI, ChatGPT, whatever. So that's going to be a developer environment. I mean, if Amazon turns this into an API, or Microsoft, like what you guys are doing. We're turning it into an API as well. That's part of what we're doing. This is why I want to, this is why this is exciting.
You've lived the big data dream, and we used to say: if you didn't have a big data problem, if you weren't full of data, you weren't really getting it. Now people have all the data, and they've got to stand this up. Yeah, so the analogy is again mobile. I like the mobile movement, using mobile as an analogy. Most companies were not building for a mobile environment, right? They were just building for the web and the legacy way of doing apps. And as soon as the user expectations shifted, that my expectation now is I need to be able to do my job on this small screen on the mobile device with a touch screen, everybody had to invest in re-architecting and re-implementing every single app to fit within that model of interaction. And we're seeing the exact same thing happen now. And one of the core things we're focused on at Vectara is how to simplify that for organizations, because a lot of them are overwhelmed by large language models and ML. They don't have the staff. Yeah, they're understaffed, they don't have the skills. But they've got developers, they've got DevOps. Yes, yes. So they have the DevSecOps going on. Yes. So our goal is to simplify it enough for them that they can start leveraging this technology effectively within their applications. Ed, you're the COO of the company, obviously a startup, you guys are growing, you've got a great background and a good team. You've also done a lot of business development and technical business development in this area. If you look at the landscape right now, and I agree the apps are coming, every company I talk to has had that ChatGPT epiphany. Oh my God, look how cool this is. Like magic. Like, okay, settle down. But everyone I talk to is using it in a very horizontal way. I talked to a very senior person, a very technical alpha geek, very senior in the industry technically. They're using it for log data.
They're using it for configuration of routers, and other areas are using it too; every vertical has a use case. So this is horizontally scalable from a use case standpoint. When you hear horizontally scalable, the first thing on my mind is cloud, right? So cloud and scalability that way. And the data is very specialized. So now you have this vertical specialization, horizontally scalable, and everyone will be refactoring. What are you seeing from the customers and prospects you talk to? Yeah, I mean, put yourself in the shoes of an application developer who is actually trying to make their application a bit more like magic, and to have that soon-to-be, honestly, expected experience. They've got to think about things like performance, and how efficiently they can actually execute a query or a question. They've got to think about cost. Generative AI isn't cheap, like the inference side of it. And so you've got to be thoughtful about how and when you take advantage of it. You can't treat it as, you know, everything looks like a nail and I've got a hammer and I'm going to hit everything with it, because that will be wasteful. Developers also need to think about how they're going to take advantage of, but not lose, their own data. So there have to be some controls around what they feed into the large language model. If anything, like, should they fine-tune a large language model with their own data? Can they keep it logically separated but still take advantage of the powers of a large language model? And they've also got to be aware of the fact that when data is generated, it is a different class of data. It might not fully be their own, and it may not even be fully verified. And so when the logical cycle starts, of someone making a request, the relationship between that request and the output, those things have to be stored safely, logically identified as such, and taken advantage of in an ongoing fashion.
So these are mega problems, each one of them independently, that you can think of as middleware that companies need to take advantage of and think about to help the next wave of application development be logical, sensible, effective. It's not just calling some raw API on the cloud, like OpenAI, and then you get your answer and you're done, because that is a very brute force approach. Well, also I will point out, first of all, I agree with your statement about the app experience that's going to be expected. Form filling, great point. The interesting thing about ChatGPT... Sorry, it's not just form filling, it's any action. It's any action you would like to take. And so instead of taking it by clicking and dragging and dropping, and doing it in a menu or on a touch screen, you just say it and it happens perfectly. It's a different interface, and that's why I love those UI and UX experiences. That's the people-falling-out-of-their-chair moment with ChatGPT. But a lot of the things with ChatGPT, if you feed it right, it works great. If you feed it wrong and it goes off the rails, it goes off the rails big. The Bing catastrophes. And that's an example of garbage in, garbage out. Classic, old-school comp-sci phrase that we all use. This is about data injection, right? It reminds me of the old SQL days. If you could sling some SQL, you were a magician; you'd get the right answer pretty much every time. So you've got to feed the AI. You do. Some people use an early term to describe this: prompt engineering. Old-school search, or engagement with data, would be: I have a question or I have a query. New school is: I have to issue it a prompt, because I'm trying to get an action or a reaction from the system. And in the act of engineering that prompt, there are a lot of different ways you could do it.
All the way from raw, where I'm just going to send you whatever I'm thinking and you get the unintended outcomes, to more constrained, where I'm going to constrain the initial inputs to data I already know is first party and I trust, to hyper-constrained, where the application is actually looking for certain elements to respond to. It's interesting. Amr, this is why I love this, because we're in the media; we're recording this video now, we'll stream it, but we've got all your linguistics. We're talking. This is data. So data quality becomes the new intellectual property, because if you have that prompt source data, it makes the data, or content in our case, the original content, intellectual property. Absolutely. Because that's the value, and that's where you see ChatGPT fall down, because they're trying to crawl the web, and people think it's search. It's not necessarily search; it's giving you something that you wanted. It is a lot of that. I remember at Cloudera, you said ask the right questions. Remember that phrase, that slogan you guys had? And that's prompt engineering. So exactly, the reinvention of ask the right question is prompt engineering. If you don't give these models the question in the right way, and very few people know how to frame it in the right way with the right context, then you will get garbage out, right? That is the garbage in, garbage out. But if you specify the question correctly, and you provide with it the metadata that constrains what that question is going to be acted upon or answered upon, then you'll get much better answers. And that's exactly what we solve at Vectara. Okay, so before we get into the last couple of minutes we have left, I want to make sure we get a plug in for the opportunity and the profile of Vectara, your new company. Can you guys both share with me what you think the current situation is?
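The spectrum just described, from raw prompts to prompts constrained by trusted first-party context, can be sketched roughly as below. The function name and prompt template are illustrative, not any real product's API; the point is only that the same question, grounded in data the caller already trusts, leaves far less room for garbage out.

```python
# Sketch of "constraining the question with context you trust".
def build_prompt(question, context_passages=None):
    """Frame a question for a large language model.

    Raw (no context): the model answers from whatever it absorbed from the
    web, with unintended outcomes possible. Constrained: the answer is
    limited to passages the caller already trusts.
    """
    if not context_passages:
        return question  # raw prompt
    context = "\n".join(f"- {p}" for p in context_passages)
    return (
        "Answer using ONLY the passages below. If they do not contain "
        "the answer, say you don't know.\n\n"
        f"Passages:\n{context}\n\n"
        f"Question: {question}"
    )

# Raw vs. constrained versions of the same HR-style question:
raw = build_prompt("How much vacation do I have left?")
constrained = build_prompt(
    "How much vacation do I have left?",
    ["HR record: employee has 12 vacation days remaining this year"],
)
print(constrained)
```

The "hyper-constrained" end of the spectrum would go further still, templating the prompt so the application only responds to specific, expected elements.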
So for the folks who are now having those moments of, AI is BS, that's not real, it's a lot of hype, or, oh my God, this is magic, this is the future. What would you say to that person if you're at a cocktail party or in the elevator? Say, calm down, this is the first inning. How do you explain the dynamics going on right now to someone who's either in the industry or not in the know? How would you explain what this wave is about? And how would you prepare them to change their life around this? Yeah, so I'll go first, and then I'll let Ed go. Efficiency. Efficiency is the description. So we figured out a way to be a lot more efficient. A way where you can write a lot more emails, create way more content, create way more presentations. Developers can develop 10 times faster than they normally would. And that is very similar to what happened during the Industrial Revolution. I always like to look at examples from the past to read what will happen now and what will happen in the future. So during the Industrial Revolution, it was about efficiency with our hands, right? So to make a piece of cloth, like the cloth in this shirt I'm wearing, our ancestors had to spend a month taking the cotton, making it into threads, taking the threads, making them into pieces of cloth, and then cutting it. And now a machine makes it just like that, right? And our ancestors turned from the people that do the thing into the people who manage the machines that do the thing. And I think the same thing is going to happen now: our efficiency will be multiplied extremely as human beings, and we'll be able to do a lot more. And many of us will be able to do things we couldn't do before. So another great example I always like to use is Google Maps and GPS. Very few of us knew how to drive a car from one location to another, read a map, and get there correctly.
But once that efficiency arrived, and by the way, behind these things is very, very complex AI that figures out how to do that for us, all of us became amazing navigators who can go from any point to any point. So that's kind of how I look at the future. And that's a great real example of impact. Ed, your take on how you would talk to a friend or colleague or anyone who asks, how do I make sense of the current situation? Is it real, what's in it for me, and what do I do? I mean, every company is rethinking their business right now around this. What would you say to them? You know, I usually like to show rather than describe. And so, you know, I've been using an application for a long time called Notion, and it's super popular. There are like 30 or 40 million users, and the new version of Notion came out, which has AI embedded within it. And it's AI that allows you primarily to create. So if you break the world of AI down into find and create for a minute, just to logically separate those two things: find is certainly going to be massively impacted in our experiences as consumers on Google and Bing, and I can't believe I just said the word Bing in the same sentence as Google, but that's what's happening now, which is a good example of change, and also inside the business. But on the create side, Notion is a wiki product where you try to note down things that you are thinking about, or that you want to share and memorialize, but sometimes you do need help to get it down fast. And just in the first day of using this new product, my experience has really fundamentally changed. And I think that anybody who, say for example, is using an existing app, I would show them: open up the app.
Now imagine the possibility of getting a starting point right off the bat, in five seconds. Instead of having to draft this thing from whole cloth, imagine getting a starting point that you can modify and edit, or just dispose of and retry again. And that's the potential for me. I can't imagine a scenario where, a few years from now, I'm going to be satisfied if I don't have a little bit of help, in the same way that I don't manually spell-check every email that I send; I automatically spell-check it. I love when I'm getting type-ahead support inside of Google or anything. It doesn't mean I always take it, or texting. That's the efficiency too. The key word is efficiency, yeah. I mean, the cloud was about developers getting stuff up quick. Exactly. All that heavy lifting is there for you, so you don't have to do it. Right. And you get to the value faster. Exactly. I mean, if history told us one thing, it's that you have to always embrace efficiency, and if you don't do it fast enough, you will fall behind. Again, looking at the Industrial Revolution, the companies that embraced it became the leaders in the world, and the ones who did not all died. Well, the AI thing that we've got to watch out for is how it goes off the rails if it doesn't have the right prompt engineering or data architecture infrastructure. It's a big part. So this comes back down to your startup. Real quick, I know we've got a couple of minutes left. Talk about the company, the motivation, and we'll do a deeper dive on the company later, but what's the motivation? What are you targeting for the market? Business model, the tech, let's go. Actually, I would like Ed to go first. Go ahead.
Sure. I mean, we're a developer-first, API-first platform, so the product is oriented around allowing developers who may not be superstars at leveraging or choosing or selecting their own large language models for appropriate use cases, but who want to be able to instantly add the power of large language models into their application set. We started with search, because we think it's going to be one of the first places that people try to take advantage of large language models, to help find information within an application context. And we built our own large language models, focused on making it very efficient and elegant to find information more quickly. So what a developer can do is, within minutes, go up, register for an account, and get access to a set of APIs that allow them to send data to be converted into a format that's easy for large language models to understand, vectors, and then secondarily they can issue queries, ask questions. And the questions that can be asked are very natural language questions, so we're talking about long-form sentences, drill-down types of questions, and they can get answers that come back, depending upon the form factor of the user interface, in list form or summarized form, where summarized equals the opportunity to see a condensed, singular answer. All right, I have a... Oh, okay, go ahead, you go. I was just going to say, I'm going to be a customer for you, because my dream is to have a hologram of theCUBE hosts, me and Dave, and have questions be generated in the metaverse. So, you know... There'll no longer be any guests here; there'll just be AIs talking to you guys. Give me a couple of bullets, it'll spit out 10 good questions, publish the story. Automation. This brings the automation. I'm sorry to interrupt you. No, no, no, I was just going to follow on the same... So another way to look at exactly what I've described is, we want to offer you ChatGPT for your own data, right?
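The developer flow Ed describes, send data to be converted into vectors, then ask long-form natural-language questions, could look something like the sketch below. The base URL, endpoint paths, and payload field names are hypothetical placeholders for illustration, not Vectara's actual API.

```python
# Hypothetical sketch of an index-then-query flow against a
# vector-search-as-a-service API. Not a real service's endpoints.
import json
from urllib import request

API_BASE = "https://api.example-search.dev"  # placeholder service
API_KEY = "YOUR_KEY"                         # placeholder credential

def index_payload(doc_id, text):
    # The service, not the developer, handles the text-to-vector conversion.
    return {"id": doc_id, "text": text}

def query_payload(question, summarize=True):
    # summarize=True asks for a condensed, singular answer; False, a ranked list.
    return {"query": question, "summarize": summarize}

def post(path, payload):
    req = request.Request(
        API_BASE + path,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json", "x-api-key": API_KEY},
    )
    with request.urlopen(req) as resp:  # network call; needs a live service
        return json.load(resp)

# Intended usage against a real endpoint:
#   post("/v1/index", index_payload("ep1", "interview transcript text ..."))
#   post("/v1/query", query_payload("Which book did we use as an analogy?"))
```

The shape matters more than the names: two calls, one to ingest, one to ask, with the embedding and model work hidden behind the API.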
So imagine taking all of the recordings of all of the interviews you have done, and having all of that content ingested by a system where you can now have a conversation with your own data and say, oh, last time when I met Amr, which video games did we talk about? Which movie or book did we use as an analogy for how we should be embracing data science and big data? Which is Moneyball; I know you use Moneyball all the time. And you start having that conversation. So now the data doesn't stay a passive asset that you just have in your organization. No, it's an active participant that's sitting with you at the table, helping you make decisions. One of my favorite things to do with customers is to go to their site or application and show them me using it. So, for example, one of the customers I talked to is one of the biggest property management companies in the world, which lets people go and rent homes and houses and things like that. And I showed them me searching through reviews, looking for information and trying different words, trying to find out: is this place quiet? Is it comfortable? And then I put all the same data into our platform, and I showed them the world of difference you can have when you start asking that question wholeheartedly and getting real information that doesn't have anything to do with the words you asked, but is really focused on the meaning. When I asked, is it quiet, answers would come back like, the wind whispered through the trees peacefully. It has nothing to do with quiet in the literal word sense, but in the meaning sense, everything to do with it. And that was magical, even for them, to see that. You guys are on the front end of this big wave. Congratulations on the startup, Amr. I know you guys have great pedigree in big data, and you've got a great team. Congratulations. Vectara is the name of the company. Check them out. Again, the startup boom is coming.
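The "is it quiet" example works because semantic search compares meaning, not words: query and reviews are mapped to vectors, and nearby vectors count as matches. A toy illustration follows; the 3-dimensional vectors are made up for the example, whereas a real system uses a learned embedding model with hundreds of dimensions.

```python
# Toy demonstration of meaning-based matching via cosine similarity.
import math

def cosine(a, b):
    """Cosine similarity: closer to 1.0 means closer in meaning (roughly)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Made-up embeddings: the "wind whispered" review shares no words with the
# query, yet sits close to it in vector space; the "loud music" review doesn't.
embeddings = {
    "the wind whispered through the trees peacefully": [0.9, 0.1, 0.1],
    "great nightlife, loud music until 3am":           [0.1, 0.9, 0.2],
}
query = [0.95, 0.05, 0.1]  # embedding of "is this place quiet?"

# Rank reviews by similarity to the query.
for review, vec in sorted(embeddings.items(),
                          key=lambda kv: -cosine(query, kv[1])):
    print(f"{cosine(query, vec):.2f}  {review}")
```

The ranking puts the whispering-trees review first even though it never contains the word "quiet", which is the effect being described.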
This will be one of the major waves. Generative AI is here, and I think we'll look back on this as a major inflection point in the industry. There's not a lot of hype behind that; people are seeing it, experts are. So it's going to be fun. Thanks for watching. Thanks, John.