Welcome to Dell Technologies World. It's the premier technology event of the year. Join John Furrier, Dave Vellante and Lisa Martin as they talk to the trailblazers and trendsetters of future technology. Dell Technologies World 2023 and theCUBE, the leader in live and emerging high tech coverage. Good evening everyone, and welcome back to theCUBE's coverage of day one of Dell Technologies World 2023, coming to you live from the Mandalay Bay in Las Vegas. Lisa Martin here with Dave Vellante. Dave, we've had a great first night so far. We're just scratching the surface. We've been having some great conversations, and a lot of news today. I'm excited about this next segment because you had me at data. Data and AI. One of our alumni is back with us. Jonathan Seckler joins us, Senior Director of Product Marketing at Dell Technologies, and Ariel Pisetzky, VP of IT and Cyber at Taboola. Guys, welcome. Thank you, it's wonderful to be here. Thanks a lot, yeah. Ariel, let's go ahead and start with you. Give the audience an overview of Taboola. You're a Dell customer, but tell us what you do, your vision, your mission, how you're using AI. Give us all that scoop. Okay, sure thing. So Taboola is a content recommendation platform. We serve approximately four billion web pages a day, and we see about 1.5 billion unique users a month. We do all of that powered on Dell servers; we have about 12,000 Dell servers. And for each and every user that comes to our website or to a publisher's website, we provide personalized content. We do that personalization with AI: inferencing at the edge, and really deep learning and machine learning at the back end, to make sure we provide the best user experience for any user out there. What's with the name? What does that signify? So Taboola comes from the Latin phrase tabula rasa, which means blank slate. We don't know who the user is. We're not like any of the other big ad tech companies.
You don't log into our service. You don't tell us anything about your affiliations. We just infer everything in real time. Anything to add? No, no, no. So why didn't you just do this in the cloud? Why are you doing this on-prem? So the cloud, while very trendy, is also very expensive. To operate heavy compute loads, to operate machine learning, to operate AI in the cloud is just so super expensive that our TCO, our total cost of operations, on-prem is that much cheaper using Dell servers, using our own IT systems, and just making it work that much faster and cheaper for the business and for the end customer. So Jonathan, it's not like you just invented this class of servers last week to service a customer like Ariel. You've been at this for a while. What's your AI server journey been like? It's obviously been accelerated with all the hype, just in terms of the awareness, but from a product standpoint, a reality standpoint, take us through that. Yeah, exactly. So we've been working with Taboola for four or five years now, at least, right? And we've been providing the infrastructure for artificial intelligence during all of that time. It started out as just being a great platform for hosting the data, for processing the models, and for delivering the inferencing. But we've since evolved that into a set of validated designs for artificial intelligence: we have natural language processing, we have generative artificial intelligence, and we do some converged-IO-type machine learning ops solutions and things like that. And it's been really a journey to get customers over that hump of going from a pilot or a proof of concept, like I said, in the public cloud, to actually putting something in production.
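Ariel's cloud-versus-on-prem cost argument comes down to back-of-envelope arithmetic: sustained, always-on compute pays the cloud's per-hour premium every hour, while owned servers amortize. A minimal sketch, where every number is purely illustrative (not Taboola's actual figures):

```python
# Back-of-envelope TCO comparison for sustained heavy compute.
# All inputs below are illustrative assumptions, not Taboola's real numbers.

def monthly_core_cost_cloud(cores, price_per_core_hour):
    """Cloud: pay per core-hour, every hour of the month (~730 hours)."""
    return cores * price_per_core_hour * 730

def monthly_core_cost_onprem(cores, server_capex, cores_per_server,
                             amortize_months, opex_per_server_month):
    """On-prem: amortized server purchase plus power/space/support per month."""
    servers = cores / cores_per_server
    return servers * (server_capex / amortize_months + opex_per_server_month)

cloud = monthly_core_cost_cloud(cores=10_000, price_per_core_hour=0.04)
onprem = monthly_core_cost_onprem(cores=10_000, server_capex=20_000,
                                  cores_per_server=64, amortize_months=48,
                                  opex_per_server_month=250)
print(f"cloud:  ${cloud:,.0f}/month")   # ~$292,000
print(f"onprem: ${onprem:,.0f}/month")  # ~$104,000
```

Under these assumed numbers, on-prem runs at roughly a third of the cloud bill at 100% utilization; the crossover Dave alludes to moves with utilization and the core count involved.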
And you have to appreciate that almost everyone who does put something in production is going to end up like Taboola, building their own single-tenant infrastructure because of the cost of operations, right? I mean, artificial intelligence is still very much a high performance computing solution, right? Did it start out as a reference architecture that then evolved into a validated design and then ultimately becomes a SKU, right? Exactly, the reference architecture, and then we had design guidance and testing and validation tools and things like that. Well, it's interesting because, in conjunction with Dell Tech World, we did a bunch of prerecorded interviews for ISC, which is in Hamburg. And it was very interesting to speak to some of the customers who said, look, I'm going to do this stuff on-prem because, first of all, it's too expensive. I don't know exactly where the crossover point is, but at some number of thousands of cores it gets way, way expensive when you go beyond experimentation. And they're worried about IP leakage. And they said they want to show it off, right? They've got this cool data center or even a supercomputer and they're like, hey, check this out. Why should the cloud guys have all the fun? Yeah, I mean, you would know more than I, but artificial intelligence is still very much unique IP. The stuff that you're doing, no one else is doing. They can't do what Taboola does, right? So it requires some investment, yeah. So Ariel, talk to us about why Dell. Obviously, as we always say, customers have a choice. You mentioned working together for quite a few years now. What is it about Dell, its innovation, its technologies, that really led you to make the decision that this is the right one to power your business? So the short answer: it's easy, it works, it's reliable.
And then we can go into the much longer answer of our transition from other vendors into really being a Dell shop through and through, because we wanted that one-stop shop for support, for the management of the servers themselves, for the security, making sure that we're secure from the chipset to the firmware to the software that we place on the servers. The ability to get architecture help from the sales engineering team and the validated solutions, all of that helps us squeeze more performance out of every core that we bought. As you said, from thousands of cores, we're now running over half a million cores. And at that point, you really, really want to get as many performance cycles as you can out of each core. Even a 1% bump is really a lot when you're talking about that many servers. So Dell was a natural choice: easy to work with, secure, reliable, good support. I don't think you can ask for more. Well, energy has to be a huge concern, obviously. The good news is you guys are smart enough to know how to manage your own infrastructure and keep your costs down, et cetera, but now, of course, you inherit all these other data center issues like power and cooling. How are you addressing that? That's a great question, and really, thank you for that. So part of efficiency is, of course, not only squeezing as much as you can in terms of compute, but also really bringing as much power as you can to the IT side in the hosting environment, so you're not wasting that cost and power on cooling and other things that you might have in the data center. With Dell, we are able to control fan speeds. We are able to run the servers a bit hotter on the cold side than with other solutions that we have.
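The "don't waste power on cooling" point Ariel makes is usually expressed as PUE (power usage effectiveness): total facility power divided by IT power, so cooling saved by running hotter shows up directly as a lower PUE. A rough sketch, with the server count from the conversation but wattage and PUE values as illustrative assumptions:

```python
# Rough facility-power model via PUE (total facility power / IT power).
# Server count (12,000) is from the interview; wattage and PUE values
# are illustrative assumptions, not Taboola's measured figures.

def facility_kw(servers, watts_per_server, pue):
    """Total facility power in kW for a given IT load and PUE."""
    it_kw = servers * watts_per_server / 1000
    return it_kw * pue

baseline = facility_kw(servers=12_000, watts_per_server=500, pue=1.5)
warmer = facility_kw(servers=12_000, watts_per_server=500, pue=1.4)
print(f"baseline:     {baseline:,.0f} kW")   # 6,000 kW IT load * 1.5
print(f"warmer aisle: {warmer:,.0f} kW")     # same IT load * 1.4
print(f"saved:        {baseline - warmer:,.0f} kW, around the clock")
```

Even a modest PUE improvement, the kind a couple of degrees of allowed inlet temperature can buy, is hundreds of kilowatts continuously at this scale.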
We are able to really see a full view of the server within our management systems and optimize the airflow, optimize the power draw, optimize the power usage for the different parts of the day where we have more compute coming in or less compute coming in. So the software load balancer would bring in and distribute all of that across the different servers, but when the traffic goes a bit lower in the day, we're able to cycle back with the Dell technology and also cycle back on the power. You mentioned you can run hotter on the cold aisle, or side, you said. That means you're saying you can let the temperature rise? Just a bit, yes. Just a little bit, but even a couple of degrees is going to make a difference, right? Why is that, Jonathan? Well, think about it. Computers, at the end of the day, I like to think of it as you're putting electricity in and you're getting mathematics out. And unfortunately, along with the mathematics, you get heat. One thing that Dell is really good at is getting as much math out as you can while dealing with all of the heat. For example, in our latest PowerEdge servers, we've altered the airflow through the servers so that we can run at higher temperatures, right? And when you're talking about AI, and you're talking about millions of cores and thousands of nodes, you've got to be able to get that air, like you said, from the cold aisle to the hot aisle as efficiently as possible to take that heat with you. So you got the A-plus in thermodynamics. So what does the AI stack look like? Can you describe that? And how is that different from the everyday stack that runs, whatever, SAP or just general purpose applications? Well, I mean, Ariel, you should talk about that. I'd be happy to share, yes.
So when you're talking about SAP, as an example, that's a monolithic application, or usually an application that is very heavily bound to a single server, let's call it. When you're talking about AI, you're talking about grids, you're talking about supercomputers, you're talking about hyperscale computing, depending on the use case. So we have thousands of computers, or thousands of servers, I should say, in the different racks, all connected to each other through a non-blocking network so they can interact. You don't have just one CPU solving something, or one system with a four-way CPU; you have thousands of CPUs and tens of thousands of cores solving problems as they come in, or trying to infer different, I'd say, conclusions for the users that are coming into the system. You said half a million cores, did I hear that correctly? Yes. Okay, that's incredible. And so what's the networking like inside there? Is it Ethernet? Yes, so we are strong believers in Ethernet. We're using 25 gig networking, and really that is our easiest go-to solution, where we have each server hooked up to a 25 gig port and then the top-of-rack hookup at 100 and 400 gigs. Is 100 gig to the server in your future, or is the cost delta too much right now? Because they're coming down, compressing. Absolutely. So 25 gig is our default go-to today, and I would guess for the next approximately two years that's where this is going to stay. In terms of new servers coming in, for at least the next two years we're going to be on 25 gig, and then we're probably going to see 100 gig to the server itself as the CPUs become bigger, hotter, and able to absorb more processes. There will be a crossover, I presume. It's like today you wouldn't do a 10 gig because you're going to get a 25 at the same price; you basically get it for free. Jonathan, obviously what Ariel has described here is a huge deployment, tremendous power going on.
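The 25-gig-per-server, 100/400-gig top-of-rack design Ariel describes is usually sized by its oversubscription ratio: server-facing bandwidth divided by uplink bandwidth at the rack switch. A quick sketch; the rack size and uplink counts below are illustrative assumptions, not Taboola's actual topology:

```python
# Top-of-rack oversubscription for a 25 GbE server design.
# Rack size and uplink counts are illustrative assumptions.

def oversubscription(servers_per_rack, server_gbps, uplinks, uplink_gbps):
    """Ratio of downlink (server-facing) to uplink bandwidth at the ToR."""
    down = servers_per_rack * server_gbps
    up = uplinks * uplink_gbps
    return down / up

# 40 servers x 25 GbE = 1,000 Gb/s down; 4 x 100 GbE = 400 Gb/s up -> 2.5:1
print(oversubscription(40, 25, 4, 100))
# Two 400 GbE uplinks push the same rack much closer to non-blocking:
# 1,000 Gb/s down vs 800 Gb/s up -> 1.25:1
print(oversubscription(40, 25, 2, 400))
```

A true non-blocking fabric, which the transcript mentions, needs this ratio at 1:1 or better; the 400-gig uplinks are what make that affordable per rack.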
Can you comment on the learnings that Dell has gotten from this customer relationship, and as AI is evolving, how are they helping to evolve Dell in that respect? That's a great question. So we've been working with Taboola, like I said, on their deployments in Israel but also all across the world, and it has really spurred the need to understand, from a customer standpoint, from a support standpoint, being able to deliver that global support no matter where Taboola is, not just in terms of break-and-fix, but that AI expertise, that configuration expertise, et cetera. And we're taking that and building it back into the portfolio even as we talk today. So the sexy, cool AI technology out there now is this thing called generative AI, like ChatGPT. And I will say that while that is really an exciting technology to see demonstrated, the application is going to be more like what Taboola is doing, where you build a unique model and a unique set of tools to help your enterprise solve those problems under the covers. So that's where the real value is going to be in the future. I saw, I think it was on Twitter, maybe it was LinkedIn, somebody posted: have you figured out your AI strategy yet? Well, before you figure out your AI strategy, you'd better figure out your data strategy. Right. Do you buy that, and what is your data strategy? Yes. He doesn't care about data at all, as you can tell. So when you think about AI, obviously you need to feed it. It's like a monster that just demands more and more. So the data must be pushed into the algorithms, whether on the training side or eventually when you want to get answers from them. So for us, on the training side, we do NLU, natural language understanding, or NLP, natural language processing, where we would like to understand what an article is about.
So we will ingest a whole lot of articles, a whole lot of publisher web pages online, bring them into our systems, understand what they are, and categorize them so we can provide the best content recommendation for that moment. We'll do that on GPUs. But just think of the sheer amount of data: when we serve four billion web pages a day, how many web pages we need to understand and ingest, and have the computer kind of understand the language and what they are about, not to speak of the at least 30 different languages that we support today. And this is all with our internal AI. So when you first saw ChatGPT, were you like, eh, I got this? What was your reaction? Take us inside, and it's a term of endearment, an alpha geek's brain when you first saw that. So ChatGPT is a revolution, because really, think of what my kids, your kids, will be saying in 20, 30 years: oh, you had computers? Wasn't that like a typewriter? You had to actually type on them? Because ChatGPT is changing the way we will interact. Emails, my English, will suddenly be that much more polished. My spelling will be that much better just because of these solutions out there. And who knows what's coming 10 years down the road. So it's totally a revolution, and it revolutionizes the ability of IT teams to work better. If I'm taking it just back into our tech world for a second: for anyone that is coding, anyone that is writing scripts, the productivity boost there is astounding. We run those 12,000 servers with about 15 SREs, site reliability engineers. And now I envision 20,000 and 30,000 servers with the same amount of people. I don't need to grow anymore, because ChatGPT is like another friend sitting here helping me, whispering in my ear how to code better, how to write better, and how to be better. I think that's the key to all of artificial intelligence.
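The ingest, categorize, recommend loop Ariel describes can be sketched in miniature. Taboola's real pipeline uses deep learning models running on GPUs across thousands of servers; the keyword matcher below is only a toy that shows the shape of the data flow, and every category and keyword in it is a made-up example:

```python
# Toy sketch of an ingest -> categorize -> recommend loop.
# Taboola's production system uses deep learning on GPUs; this keyword
# matcher is purely illustrative of the data flow, not their method.

CATEGORY_KEYWORDS = {
    "sports":  {"match", "league", "season", "coach"},
    "finance": {"market", "stocks", "earnings", "rates"},
    "tech":    {"ai", "server", "cloud", "software"},
}

def categorize(article_text):
    """Score each category by keyword hits; return the best match."""
    words = set(article_text.lower().split())
    scores = {cat: len(words & kws) for cat, kws in CATEGORY_KEYWORDS.items()}
    return max(scores, key=scores.get)

def recommend(user_history, candidate_articles):
    """Recommend candidates in the category the user has read most."""
    top = max(set(user_history), key=user_history.count)
    return [a for a in candidate_articles if categorize(a) == top]

print(categorize("The league announced the season schedule"))  # sports
print(recommend(["tech", "tech", "sports"],
                ["ai server news", "earnings beat the market"]))
```

The point of the sketch is where the compute goes: `categorize` runs at ingest time over billions of pages (Taboola does this step with GPU-backed NLP models), while `recommend` is the low-latency inference served per page view.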
You know, we've been looking at ChatGPT as this cool, sexy thing, but what you don't realize is that it really is a public demonstration of what can be done. The real value is going to be boosting the productivity of employees in all kinds of ways, both predicted and unpredicted, right? You know, IT, coding, those are going to be the low-hanging fruit, right? But then, you know, who knows what else is coming, right? Michael Dell was interviewed a month or two ago, and he said something around, you know, when the cost of cognitive power goes to zero, think of the things that you can do. And that really resonated with me, this idea that in every job that you are in, no matter what you do, you spend a lot of that time just trying to find the information that you need to make a decision. Or you're trying to make sure you've got all your ducks in a row before you pull the trigger, or something like that. And technology like generative AI inside the enterprise is going to do all that kind of grunt work for you. And it's going to make people, I think, more productive. It's going to allow them to be more creative. And so, you know, it's going to create all kinds of unpredictable consequences. Some of them might not be wonderful, and I know everyone's worried about that, but I think that at the end of the day, it's a game changer for the economy. And presumably it's like other mind-blowing innovations: the graphical user interface, when you first saw that, you're like, wow; the web browser, oh my gosh. And then you look back and you're like, wow, that was horrible. I can't imagine what this is going to bring. Exactly. So what it's going to bring, what is your outlook? So I think it will really improve us. It will help us be more productive, find things faster, the ability to really ingest a whole lot more information and make it more user-friendly. Just think of help pages today.
Sometimes when you're looking for a technical solution, you find this help page and it's hard to read; but if you search it on Bard or on ChatGPT, suddenly you get something that is user-friendly. That's for us as users. How will it change publishing? That's a really interesting question. How will it change search? Because suddenly, instead of going into a normal search bar, normal for the last 15 years, where you get all these answers and links to the publishing world, suddenly you're getting the answer. And who's going to pay for that? Because that has scanned content that someone actually worked hard to create. So who's going to pay for that? What's going to happen there? There's a whole lot of questions to be asked. You know, even search, when you think about it, the way you prompted search in the early days, you had to really think about this plus that. And now it's all ads. Ariel, you'll take us out, with obviously a phenomenal use case that Taboola has with Dell with AI. What's next? Wow, what's next? I would guess a whole lot more compute, in terms of the ability to see more compute units within a single rack. If today we're looking at 15 and 20 kilowatt racks, we're going to see much hotter racks with much better cooling. So we will be more environmentally friendly. The whole IT industry needs to be more environmentally friendly, and Dell is really leading the way and helping us at Taboola see the future in really being carbon neutral. So IT will be a better place for the planet, and we will be able to be more user-friendly and provide more, I'd say, intuitive services much more easily. Awesome, guys, what a great use case. Thank you so much for joining Dave and me on the program today, talking about Taboola and Dell, what you're doing, what you're enabling your customers to do. The horizon seems limitless talking to the two of you. We appreciate your enlightening comments. Thank you so much. Thank you. We appreciate it. Our pleasure.
For our guests and for Dave Vellante, this is Lisa Martin signing off from day one of our coverage of Dell Technologies World. Big day tomorrow. Big day tomorrow. We've got Chuck Whitten in the keynotes, I think, right, tomorrow morning. Yes, we do. And then Michael's coming on, Chuck's coming on. We've got a big, big, big day. We do. Wall-to-wall tomorrow and the next day. We hope you have a great night. We'll see you tomorrow.