Daily Tech News Show is made possible by you listening right now, Philip Lass. Hi, how are you? Howard Yermish, John Outwood, you doing OK? We should all welcome in our new patrons, David and Ray DeVila. Welcome. Welcome.

On this episode of DTNS, Nvidia is on top of the world, announcing new AI processors and a new AI platform and getting compared to Taylor Swift. And Facebook has lowered the price EU users would need to pay to avoid ads.

This is the Daily Tech News for Tuesday, March 19th, 2024. In Los Angeles, I'm Tom Merritt. And from Studio Animal House, I'm Sarah Lane. And our show's producer is Roger Chang.

You know, I'd say nine days out of 10, I read "in Los Angeles, I'm Tom Merritt," and I think to myself, well, what are you when you're not in Los Angeles? You're still Tom Merritt? Yeah, the answer is I'm still Tom Merritt. And I still am, for better or worse, you know? It's kind of just your cross to bear.

Happy Nvidia Day, everyone. Oh, happy Nvidia Day to you. It's Nvidia's coming out party. It was, and it was a big one. Yeah, it's like a quinceañera. I don't know if they're 16. They're more than 16 years old as a company. Nvidia, are you wearing a cool dress right now? You look great, Nvidia. Well done. Thanks for inviting all your friends.

All right, let's start with the quick hits. Google's Liz Reid has been in charge of adding AI tools into search, AKA the Search Generative Experience, and has just been promoted to running all of search. Congrats, Liz. Certainly looks like a sign that Google sees its search future as firmly powered by Gemini and the like.

Microsoft is shaking up the people running some of its AI products. The co-founder of Google DeepMind, Mustafa Suleyman, has been named executive vice president and CEO of Microsoft AI, a new group within the company that includes Copilot as well as bringing Bing and Edge into the department.
Suleyman left DeepMind in 2022 to found a company called Inflection AI, and fellow Inflection AI co-founder and chief scientist Karén Simonyan will also join Microsoft to become the chief scientist of Microsoft AI. The person who was in charge of Bing and Edge, Mikhail Parakhin, CEO of Advertising and Web Services, is moving over to this new group as well. Inflection AI will continue to exist and will host its Inflection 2.5 model on Microsoft Azure, so I guess no hard feelings.

Commercial Bank of Ethiopia, Ethiopia's largest bank, suffered a systems glitch over the weekend which let customers withdraw more cash from their accounts than they had in their accounts. Yes, but only temporarily. Much of the $40 million withdrawn was done so by students at on-campus ATMs. Once word about the glitch spread via messaging apps, at least three universities released statements advising students to return any money they might have taken from CBE. Anyone returning money will not be charged with a criminal offense. I mean, how would you get charged with a criminal offense when it's their glitch? But knowingly keeping it, I could see you getting charged. Yeah.

You may remember us mentioning the Pyxis Ocean shipping vessel a while ago, P-Y-X-I-S. To jog your memory, it was using 123-foot-tall wings, standing upright on the deck of the boat, to harness wind power for ship propulsion. So not exactly sailing, although that's the obvious joke, but yeah, using wind power to power the boat. After six months of testing, the results are in, and shipping company Cargill says the Pyxis Ocean saved an average of 3.3 tons of fuel each day, which tells you how many tons of fuel these ships have to carry around, peaking at 12 tons of fuel saved each day under optimal weather conditions, right, when the wind was blowing in the right direction and all that. Greenhouse gas emissions therefore fell 14 percent.
The test vessel was using two of these wings, but they say they could probably fit most ships with three, so that would increase all those numbers. The next challenge is getting ports to be able to accommodate the unusually tall wings. Now, that may seem weird, because you're like, well, they're up in the air, what does the port matter? You just pull into the dock. There are things like clearances and bridges and stuff like that they have to deal with as well. But it looks promising that these would help improve the efficiency of shipping.

Tom, remember when you used to poke someone on Facebook, or get poked by somebody on Facebook? It was a way to try to interact with somebody: you know, you're ignoring me, I miss you, be my friend, all sorts of reasons. Well, the company announced Tuesday, and this is not a joke, that not only is the poking feature not actually dead, just mostly dead, but it has recently improved suggestions on who to poke and made the poking page easier to find through search. Facebook says these small changes have led to a 13x spike in poking in the past month. I'm sorry, I can't say this with a straight face. It's true, though. And this is interesting: Facebook reports that more than 50 percent of recent pokes are coming from users aged 18 to 29. So the young kids are enjoying this old feature. It's like 90s nostalgia. Yeah, early-aughts, I suppose. I was only sad that this wasn't about poke, the delicious Hawaiian food, which has never fallen out of favor. No, absolutely. And Facebook should, you know, increase the amount of poke bowls in the world, too. Indeed.

Well, moving on to more Meta news. In order to comply with the EU's Digital Markets Act, or DMA, which we've talked about quite a bit here on DTNS, Meta offered EU users the option to pay 9.99 euros per month for an ad-free experience.
Now, that complied with the provision that users have to be offered a way to use a product without being tracked. So Meta said, OK, well, we've got the free option where you're getting tracked, or you pay and you don't get tracked. But some people objected, saying a paid option doesn't count as allowing customers to give free, informed consent. At a workshop in Brussels, Meta lawyer Tim Lamb revealed that Meta will reduce the fee down to 5.99 euros per month, and 4 euros for additional accounts. The company is waiting to hear from the Irish Data Protection Commission on whether that lower price indeed addresses the concerns.

Yeah. So Meta's argument is, well, if we can't track users, and again, it's not like you can't serve ads to users if you can't track them directly, but if you can't track them directly, they can't make enough money. They can make some money, but they can't make enough money to justify offering the service. It's a business decision: if we could only make the amount of money you can make without tracking, we wouldn't be offering this service. So the only options it can offer that make business sense are: pay us the fee that would cover what we're not going to get from advertising, and we won't give you any advertising, we won't track you at all; or let us track you. And then it's worth doing.

Max Schrems, who has been leading the charge for privacy in the EU for more than a decade now and is one of the people trying to hold tech companies accountable, is out there talking to the press, and says, look, even at a price of 1.99 euros, which raises the question, what about 99 cents? But he says, even at a price of 1.99 euros, in studies, that has changed consent to be tracked from three to 10 percent. So if it's like, do you consent to be tracked, and there's no cost to it, three to 10 percent say yes. Everybody else says no. But if they're like, hey, do you consent to be tracked, or you'll be charged $1.99 or 1.99 euros?
Suddenly 99.9 percent of people say, fine, track me, I don't want to pay for it. So Schrems' argument is that charging money, even as low as 1.99 euros, is effectively economic coercion, and that violates the rules that say you have to be able to give free consent, hence it fails the standard. Sarah, which side of this war are you on?

Oh, yeah. OK, so, as a Facebook user who doesn't pay anything, and I don't live in the EU, so this does not apply directly to me, I have to preface this whole conversation with: there have been dumb Facebook spam rumors for years about, like, the company's going to start charging you. So that is actually kind of true now, depending on the region that you live in. But it means that you might have an experience that you value more, or that makes you feel safer, than the one that you have for free. So would I do this for even 1.99 euros per month? No, I would not. I think that it's, you know, somewhat negligible for a lot of people who are paying for five-euro coffees, kind of thing. So, OK, the amount that we're talking about right now, which is six euros per month, is doable. You want that kind of experience? Great.

What I think is really interesting, though, is what you talked about, Tom: the idea that if it went down to a certain number, people would go, OK, all right, I feel a little bit differently about that. And that's just kind of economics. But yeah, I think this whole thing is sort of like a furniture store, right? It's like, you want this couch? Nah, too expensive. Well, let me go in the back and talk to my supervisor, come back out and be like, what if the couch was 50 percent of that price? And people go, maybe I want it after all. This feels like sort of more of the same.

Yeah. I mean, clearly it's a negotiation of, like, 9.99, that was too much. What about 5.99?
What if we charge people 5.99? What do you think, Irish Data Protection Commission? Yeah, good enough. And then, you know, you'd be right in saying, well, then why did you charge 9.99 before? You were just being greedy. And to that, I would say, yes, if you are running a business of any kind, let's say you're running a garage sale, you try to charge what you can charge. Yes. And yeah, you might give someone a little bit of a discount because you're like, you know what, you have a kind face, I want you to have something. But in general, you're trying to make money. That's all these companies are doing. And I think it's fair to say you don't have to use Facebook. So their position is: if we're not allowed to make our maximum amount of money on advertising, then we're going to charge for it. What if they didn't give any kind of free access and just charged? That would be fine, right? It's offering access that's tracked that seems to be the problem.

I mean, and I'm not minimizing people who do business on Facebook, because I know there are a lot of you who do. But, you know, if my bank said, oh, your $4 monthly fee is now $14, I would be up in arms. There are certain apps that I feel like I have to use as a person, you know, just making my way in the world. But Facebook is not one of those. But I think some people feel like, well, this actually is kind of where I hang out, where I might do business, where I might make money. And yeah, that makes these numbers actually really relevant.

Yeah. And Stoic Squirrel says, you know, if they didn't give it away and charged everyone, they'd have fewer users. And that's absolutely true. At which point I say, well, what they're trying to do is have a free version that's ad-supported.
If you say it is illegal for them to track people, and they know they won't get enough consent for free, then maybe they would offer something different, like it would be free for a limited period of time, or with limited features. But at that point, just make tracking illegal. If you don't want people to be tracked at all, just make it illegal. To me, this is sort of like saying, what we really want is for Facebook not to track anyone. And it's like, all right, well, then just say that. Just make that the law. If you have feedback about why I'm entirely wrong about that, why not let me know on our social media: DTNSshow on X, Mastodon at DTNSshow at mstdn.social, Daily Tech News Show on TikTok, and DTNSpix, D-T-N-S-P-I-X, on Instagram and on Threads. Say hi.

At an event compared to rock concerts by not one person but quite a few, a few of them even bringing up Taylor Swift as an example, Nvidia, yes, that is the comparison, held its GTC developer conference as it continues to roll tide on the stock market charts and sell hardware in large amounts to companies making AI products. Nvidia seems to be taking the mantle from Apple, Google, Microsoft, Amazon. Nvidia is big. It's, you know, maybe not even part of the big five, but bigger. So Tom, let's break down what they announced today.

Yeah, you might say they have champagne problems. The star of the show at this announcement is the next generation of processors. The next generation is named Blackwell, after the mathematician David Harold Blackwell, who specialized in game theory and statistics, kind of the underpinnings of AI. The Blackwell line is the successor to the Hopper series, Hopper named after Admiral Grace Hopper, and the Hopper flagship you've probably heard bandied about in the headlines as the H100 processor. The first Blackwell processor is going to be called the GB200. It will be made for Nvidia by TSMC, who makes all the big stuff.
And it's actually a combination: it's two B200 Blackwell GPUs combined, which communicate at up to 10 terabytes per second, plus an Arm-based Grace CPU. This thing can do 20 petaflops; that's up from four in the H100. It is seven to 30 times faster than the H100 and uses 25 times less power. There are 208 billion transistors in Blackwell chips, and the only reason I bring that up is so I can compare it to the H100, which has 80 billion transistors. So it's bigger, it's more powerful, it's more power-efficient. And it has its own transformer engine that is specifically meant for transformer models, GPT-style models, not just OpenAI's GPT model. They claim they can deploy a 27-trillion-parameter model; as a comparison, GPT-4, which is one of the highest-parameter models out there, has 1.7 trillion parameters. So there's room to grow. The GB200 NVL72 server will combine 72 Blackwell GPUs and is designed to train models. AWS, Google, Azure and Oracle will be offering the GB200 in their cloud services. The chips will be shipping later this year. This is a big one, Sarah.

It's a big one. I think what's also big is how much it costs. You know, if you're a corporation, if you're AWS, do you buy a thousand or a million, you know, and get some sort of discount? Like, what are we talking here? Because obviously the specs are really, I mean, they're great compared to anything H100-esque.

Yeah. And this is what you would expect; it's probably a bigger jump than you would expect. And if you're like, OK, there's a lot of numbers, and I'm not going to buy one of these things because they're probably $10,000, which, you know, may be on the low side, what are they good for? They are good for companies renting them, essentially. Some companies may buy them and roll them into their own data centers, but most companies are going to rent them from AWS, Google, Azure and Oracle.
And they're going to use them to train, or more likely inference, their own model. So it is a more powerful, faster way of providing the models.

Don't you need the GPU if you train a model successfully? Don't you still need that processing power going forward? You know, like, is it like a lifetime rental type thing?

Well, it's like a cloud rental. It's like paying for your ISP, right? You're going to rent from, let's say, Azure: I want to use this much GPU power per month. And then you pay per month to access that. You can also buy the stuff and use it on your own premises, but then you have to maintain it, and you have to replace it when it gets out of date, and all of that. So this is going to supercharge the training of models. And by the way, in the press release for this thing, everyone recommended it: Satya Nadella, Sam Altman, Sundar Pichai, everybody who's in this space, they are all like, yes, we love the Nvidia GPU. The heads of all the companies that care about AI.

Well, once you do have that more powerful GPU, whether you buy it or you rent it, Nvidia wants to make it easy for you to deploy AI without needing to spend weeks on the code, or even have expert AI folks in-house. Maybe you don't have any; you might not need them, says Nvidia. Hence the Nvidia Inference Microservice, or NIM, coming to the Nvidia Enterprise software subscription. This is going to let companies who have their own trained models save money by using their existing, older Nvidia GPUs for inferencing, which is less intensive than training. That way they don't need to buy new, expensive hardware or outsource to cloud providers. I mean, that is the ideal situation. Anyway, NIM combines your model with an optimized inferencing engine in a container, available as a microservice you can call on. So if you're not a developer, that means weeks to months of work becomes available immediately.
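To give a rough sense of what "a container you can call on" means in practice, here's a minimal sketch of a client talking to a NIM-style inference container. The URL, port and model name are placeholders, not Nvidia's documented values; the payload shape follows the OpenAI-compatible convention many inference servers use:

```python
import json
import urllib.request

# Hypothetical local endpoint; a real NIM container would document its own URL and port.
NIM_URL = "http://localhost:8000/v1/chat/completions"


def build_request(prompt: str, model: str = "example-model") -> dict:
    """Build an OpenAI-style chat payload of the kind inference containers accept."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 64,
    }


def call_nim(prompt: str) -> bytes:
    """POST the payload to the container; requires a container actually running."""
    data = json.dumps(build_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        NIM_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()


if __name__ == "__main__":
    # Without a running container, we can at least show the request being built.
    print(json.dumps(build_request("Translate 'hello' to French."), indent=2))
```

The point is that the application developer only writes a small HTTP client like this; the model, the inferencing engine and the GPU tuning all live inside the container.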
Nvidia says NIM can run on a laptop with the proper GPU, which, yeah, they conveniently provide. And we don't have prices on the Blackwell processors, but we know that the Hopper ones were, you know, ten to twenty-five thousand dollars. The license for this NIM software is $4,500 per GPU per year that you're running it on. And so if you have an in-house set of GPUs, maybe you have Hoppers, maybe you even have some older ones, you can use NIM to get more out of them and improve the performance. It'll support all kinds of models: Hugging Face, Google, Getty Images, Shutterstock, a bunch of stuff. And you're going to be able to rent it as well. If you're like, I want to be able to use this, but I actually need to use it in the cloud, Amazon, Google and Microsoft are all going to offer NIM, because it's a container that you can just call on. So if you're like, you know, I don't have old computer hardware, but I also don't have AI developers, and I need to be able to call on a container that can save me weeks and months of work, you can do that through the cloud as well. This thing will be able to do speech, translation, routing optimizations, weather modeling. Earth 2, which we're going to talk about in a little bit, is available through NIM. And they're going to keep adding stuff to it; they say they're going to add chatbots to it as well.

I mean, it's impressive. You know, $4,500 per GPU per year for a license: there are certainly really small companies who would be like, oh, can't do it. But for companies who are really taking this stuff seriously and need this kind of processing power, this feels affordable to me.

Yeah, if you're going to make that money back, which is the calculation, right? Why else would you even bother?
I mean, if I have a speech and translation model that's going to increase my business by more than $4,500, say by $9,000 per year... and you should be able to increase it by quite a bit more. And it's $4,500 per year, not per month. That's doable for a lot of businesses.

So this is also significant because it's Nvidia pivoting to enterprise software. That was one of the things Jensen Huang kept hammering in this announcement: Nvidia isn't just a hardware company anymore, they're a software company. They're trying to do what Microsoft did when it pivoted from Windows to cloud and Azure, going from GPUs to enterprise AI services. Not trying to replace OpenAI, not trying to replace Anthropic or Hugging Face, but being the software platform that those models can run on. And I think that's incredibly smart, because Nvidia is doing that at a time when there aren't that many other games in town to compete with them, and they're not trying to compete with the established players like Microsoft and Amazon; they're signing them up as clients.

Well, and I was listening to a public radio something-or-other yesterday evening when I was in my car, and there was an Nvidia story, again, ahead of the announcements today. The whole thing was for a general audience, like, what's a GPU? What's Nvidia? But it got me thinking: I think a lot of people still think, oh, Nvidia, that's like a gaming processor company, it's for somebody who wants to build their own PC. Not untrue. But Nvidia has also made moves such that it's going to be a part of your technological experience in a variety of ways, and either already is or will be in the future because of all this stuff.

Yeah. And the other thing to remember is Nvidia makes money. This is not a startup that's trying to get listed on the stock exchange, like, we're going to lose money for a few more years before we make any. No, Nvidia prints cash.
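The break-even math being gestured at a moment ago is straightforward. Only the $4,500-per-GPU-per-year license price comes from the episode; the GPU counts and revenue figures below are made up for illustration:

```python
# Rough break-even sketch for a per-GPU annual software license.
# Only the $4,500/GPU/year figure comes from the episode; the rest is illustrative.

LICENSE_PER_GPU_PER_YEAR = 4500  # USD


def annual_net(gpus: int, added_revenue_per_year: float) -> float:
    """Added revenue from the model, minus license cost for the GPUs running it."""
    return added_revenue_per_year - gpus * LICENSE_PER_GPU_PER_YEAR


# A hypothetical shop running one in-house GPU, with a model adding $9,000/year:
print(annual_net(1, 9000))  # 9000 - 4500 = 4500 net
# With two GPUs at the same revenue, you're exactly at break-even:
print(annual_net(2, 9000))  # 9000 - 9000 = 0
```

Which is why the per-year (not per-month) framing matters: the license pays for itself as soon as the model adds more than $4,500 per GPU annually.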
And so getting into this AI space, they're doing it because they make money selling this stuff, and they make money selling this stuff because there's a demand for it, because it works. There's a lot of hype around AI, and I don't deny that, but there's also a lot of things that actually work.

The other thing that Jensen Huang is doing here, which I think is also very interesting, is he's looking ahead of even the software, like, OK, we'll be the software provider for something like NIM; what else should we be positioning ourselves to provide? And that's where Project Groot, or actually I should probably say Project GR00T, because it's spelled with zeros, a general-purpose foundation model for humanoid robots, was announced. Nvidia says this will make it easier for humanoid robots to take actions based on inputs, with a combination of language, video, human demonstrations and past experiences. So again, a platform it can offer, this time to a very specific industry: robotics. Nvidia also announced Isaac Manipulator and Isaac Perceptor, for use by companies that make robotic arms in the Manipulator case, and autonomous mobile robots in the Perceptor case. Project GR00T runs on a computer called the Jetson Thor, they really love their Marvel allusions, which is optimized to run simulation workflows. A lot of times when you're training these robots, you will simulate what they do as a way to speed up training, and that's what the Jetson Thor hardware is designed to do. Jetson Thor uses Blackwell architecture with a transformer engine that delivers 800 teraflops of 8-bit floating point AI performance.

Well, all right. I mean, if you're like, who's going to buy this? Everybody that's in this business. Probably the one people have heard about the most is Boston Dynamics, but there are a dozen other robotics companies here that are taking advantage of this.

Well, I mean, Amazon, a company, you know, going pretty hard on robotics as well.
Yeah, although I don't know if they're a client. Tesla is not a client either. But Agility, Boston Dynamics, XPeng, all of those.

No, no, I mentioned Amazon just as a company who's, you know, leaning into the humanoid robot thing, to kind of see what sticks. We're still in those early days for warehouses. Yeah, we're in those early days. I don't know, man. Nvidia has been busy as all heck. And the whole sort of robots-that-might-become-adversaries-to-humans thing, I'm not really worried about that. Let's do it. Nvidia, I'm in.

And this follows the pattern that we're noticing at this announcement: yes, we're still a GPU company, and we're mostly an AI hardware company, so we're going to announce that as our big showpiece. But here's what we're looking forward to: we're going to pivot and provide more enterprise software in the AI space, we're going to provide software for a growing industry like robotics, and they're even going to jump ahead and try to provide some stuff that's got an even longer timeline, right?

Yeah. If you were like, well, that seemed like a big announcement today: Nvidia's not done yet. A couple other announcements worth noting. Nvidia's Earth 2 is a digital twin of our planet you can use for climate tracking. It can run on different supercomputers and is claimed to be 1,000 times faster and 2,000 times more energy-efficient than today's numerical weather prediction processes. It can also deliver real-time forecasts and warnings in seconds. You know, if you live in earthquake country, as some of us on the show do, kind of cool. Taiwan's Central Weather Administration is among the first to adopt it, for typhoon forecasting. So there are a lot of climate things that it is designed to take care of, or at least alert you to. The Weather Company plans to leverage Earth 2's APIs.
And probably the longest timeline Nvidia is getting into is Quantum Cloud. Nvidia Quantum Cloud is a data center stacked with AI chips and systems that simulate a quantum computer for research. So they're not providing a quantum computer, but they're getting themselves ready to do that. Nvidia says the service will provide access to third-party quantum computers in the future. In related supercomputer news, Fujitsu will use 2,000 Hopper H100 GPUs in the supercomputer it's building for quantum computing research at Japan's National Institute of Advanced Industrial Science and Technology, and supercomputers at Denmark's Novo Nordisk Foundation and Australia's Pawsey Supercomputing Research Centre will also use Nvidia GPUs.

So Nvidia is really at the top of its game. It feels like this is the apex of its climb. It has graduated to top-tier status. So when you talk about Apple, Microsoft, Amazon, you've got to talk about Nvidia now, too.

Well, Nvidia is one of those companies, and there are many companies like this, that's sort of behind the scenes: this is what powers the thing that you love. And Nvidia has been doing that for a long time. But we're getting to the point where it's a real superstar of a company in its own right. And this was its coming out party. Roger reminded me earlier today that it's been five years since they had an in-person GTC, so that kind of fueled the excitement. But their stock price also probably fueled the excitement. So there you go. Congrats to all of you who made money.

Let's check out the mailbag. Well, if you're tired of lugging your bags everywhere when you travel, because you need bags and, you know, that's not very fun, Chris Christensen found luggage that might be able to move itself, and you along with it. This is Chris Christensen from Amateur Traveler with another tech-in-travel minute.
I was at the Travel Goods Association show last week, seeing all sorts of new and different devices, gadgets, and lots and lots of luggage that are coming out. And one of the most interesting pieces of luggage with a high-tech angle is the Moto bag. This is a bag that has a motor in it. It is self-propelled. It is a carry-on that can carry you. It's fascinating. It is the size of a carry-on, but it is self-propelled and you can ride on it. And you can see a video of this on my Instagram, which is Chris2X. But it's a fascinating, different bag. The idea is that it would give mobility to people who may not be able to make the long walks through airports. But you can check it out: it's the Moto bag. And this is Chris Christensen from Amateur Traveler.

I've seen members of Blackpink riding similar, I don't know if they're Moto bags, but similar things, around in their vlogs. So, you know, this is the future: we'll be riding our suitcases in the airport. I mean, I'm still, you know, rocking that get-my-steps-in type thing. But hey, at the end of a long day, maybe. Once you hit the step count. Yeah, exactly. Then I am chilling on the carry-on. Yeah. You've taken that long walk through Heathrow or JFK; reward yourself by riding your bag.

And patrons, stick around for the extended show, Good Day Internet. Generated images are spamming Facebook. You may have seen some of these, perhaps a very holy-looking shrimp. And they are leading people to realize, huh, there are bots in this world, which is probably a good thing. I don't know. We're going to discuss it. Stick around.

You can catch this show live Monday through Friday at 4 p.m. Eastern, 2000 UTC. Find out more at dailytechnewsshow.com/live. We've got somebody pretty cool coming to the show tomorrow: that is Patrick Beja. We've missed him. Don't miss tomorrow's show. The DTNS family of podcasts.
Helping each other understand.