This 10th year of the Daily Tech News Show is made possible by its listeners. Thanks to you, Larry Bailey, Michelle Serju, Kirk Stephenson, or David Garver, not or, and, and you, right there, listening. Who's like, what about me? Yeah, I mean you too. On this episode of DTNS, AI makes weather forecasts better, and Microsoft is making its own AI chips. That and more from Microsoft Ignite. This is the Daily Tech News for Wednesday, November 15th, 2023, in Los Angeles. I'm Tom Merritt. And from Studio Original Triscuit, I'm Sarah Lane. From Salt Lake City, I'm Scott Johnson. And from slightly showery skies, I'm the show's producer, Roger Chang. What are the other Triscuit flavors besides Original? Oh, Tom. Yeah, there's a sea salt. Are you not aware? You've got sea salt, you've got an herb, tomato and basil. Herb? An herb blend. Sun-dried tomato. Yeah. Sun-dried tomato. There's another one that I forget. Sour cream. I'm just gonna, sour cream, although sour cream and Triscuit is not the worst idea; I've done it before. Chipotle. I'm just gonna go ahead and say, oh no, chipotle Triscuit. Bridge too far. All right. When are we getting the nacho cheese Triscuit? Find out in Good Day Internet. But for now, let's start with the Quick Hits.

OpenAI CEO Sam Altman posted on X that the company is pausing new paid subscriptions for ChatGPT Plus. Altman said that since its DevDay event, so many people signed up that OpenAI had exceeded capacity. For now, you can sign up to be notified when subscriptions reopen. OpenAI services had several outages last week due to a combination of high demand and denial-of-service attacks.

Intel pushed microcode Tuesday to fix a high-severity bug described as a redundant prefix issue and known as Reptar, R-E-P-T-A-R. The flaw resulted in unpredictable system behavior, including hangs, crashes, and, nefariously, in some cases the possibility of privilege escalation. So that was the big concern.
And of particular concern to cloud providers: it could crash the hypervisors they rely on to provide machines in the cloud for clients. Virtually all modern Intel chips are affected. This is pretty wide-ranging, so it's a good thing they found it before the bad guys did. Alder Lake, Raptor Lake, and Sapphire Rapids chips received updated microcode before November and have not experienced any detectable performance issues. So they kind of wanted to try it out first. Now that they have, microcode has been issued for all affected microprocessors, and pretty much assume that if you've got a modern Intel chip, it's affected. Users are advised to update their BIOS, OS, and drivers.

Meta wants adults, specifically parents and guardians, to be required to approve an app before someone under the age of 18 downloads it, and it's calling for national US legislation to enforce it the same way throughout the country. Right now, states all have differing and sometimes conflicting laws on this exact thing. The idea is that parents would verify their teen's age when they set up a phone (it could be someone younger than a teen, but this is usually targeted at teens), so that all downloads after that point assume the teen's age is correct. Everyone has told the truth and we go forward. Meta says that currently parents need to navigate a variety of sign-up methods and provide potentially sensitive identification information for themselves and their teens to apps with inconsistent security and privacy practices. This would be sort of a one-sweep situation, you know, if Meta has its way. Yeah, because I see WeirdDummy in the chat saying, look, Meta, you could just do it yourself, and that was my thought when I first saw this too. But they're not asking for a law to make them do it. They're saying we just need one.
We just need one set of rules, because sometimes the rules conflict depending on which state we're operating in. So I actually think in this case that's kind of reasonable. Also, thank you, Clinton, for listing all the flavors of Triscuits in our YouTube chat.

Google introduced a version of its Titan physical security key that works with passkeys. You can get it with either USB-C or USB-A on the end, and both have NFC if you just want to tap. The keys work as FIDO2 multi-factor keys normally do, but they also have the ability to store up to 250 passkeys. Models are available in the Google Store right now: 30 bucks for the USB-A version and $35 for the USB-C.

Google announced new features for its Maps service. You can now filter navigation options by length of trip, time spent walking, your estimated time of arrival, number of transfers, and even type of transit. Navigation to train stations will now include which side of the street has the best entrance to reach the platform going in the direction you need to go. Oh boy, for people who do a lot of subway travel, this makes a lot of sense. You can also now share lists with other users and vote on them with hearts, or thumbs down if, I don't know, you don't like a list, I suppose. You can also respond to reviews on Google Maps with emojis and create your own emojis with a generative model. So Google Maps is growing up. My next feature request is which entrance has the escalator. What, you don't want to lug your luggage up the stairs? I mean, it depends on the day, right? Yeah. This just in: Clinton didn't list all of the flavors of Triscuits on his own, he asked ChatGPT, so it may not be correct. Oh man, the robots, we need them, don't we? We don't know if they work or not, though.

Well, we have lots of announcements, robots are part of this in some form, at Microsoft Ignite, so let's go over some of the significant notes. Bing Chat has been renamed Microsoft Copilot and will remain free.
It will also still be available in Bing as well as on Windows 11, and at a new URL. You can just go to copilot.microsoft.com. You have to sign in with your Microsoft account, but then you can use it right there. I don't know that it works in all the browsers; I couldn't get it to work in Firefox. They say specifically it works in Chrome and Edge. There was already a paid version called Microsoft 365 Copilot, and in classic Microsoft fashion, that one is now renamed Copilot for Microsoft 365. All right, everybody, keep your bingo cards open, because there's a lot more to come.

Copilot Studio is a no-code tool, so you can customize the aforementioned paid Copilot for Microsoft 365 as well as integrate custom GPT chatbots. Microsoft's Jared Spataro said you can use natural language to describe what you want, and Copilot Studio will help you build and iterate the conversation design, anything from a webpage to quarterly reports.

Windows AI Studio lets you access tools and models from multiple services, not just Azure but also the likes of Hugging Face. You can fine-tune small language models. You hear all about large language models; you don't often hear about the small language models. Those can run locally. And if you want to run large language models, NVIDIA is cooperating with Microsoft and updated TensorRT-LLM to run on GeForce RTX 30 and 40 series GPUs. You just need 8GB of VRAM and, of course, to be running Windows. I like that note, cooperating with Microsoft. We're going to find out why NVIDIA might not like Microsoft in like five stories from now. Hold that thought, everyone. Yeah, hold it, because we've got more.

Azure AI Speech text-to-speech avatar lets you upload an image of a person to be created into an avatar, and then you can make them talk. So let's say I want Scott Johnson to be my avatar; that would be the tool I would use. The speech generator can also be trained on the person's voice, if you so wish.
It's in public preview, and most customers won't be able to do that particular customization at this time, although they can use pre-provided avatars and voices, you know, celebrities and that sort of thing. Customization is also limited to certain use cases. Then there is Personal Voice, which uses one minute of speech to synthesize a voice that can then be used for dubbing, translation, narration, a variety of things. Users must give explicit consent in the form of a recorded statement before their voice can be used for training, however. And customers have to agree not to use the voice for user-generated or open-ended content. How quickly do you think somebody makes the voice, gets approved, and then does it again? I mean, like, immediately. I have so many questions about that, but anyway, go ahead. Yeah, I mean, there's a lot here you're gonna have to enforce after the fact. You do have to make people register for this, so it's not as easily misused as you might think. That's the big reason why, with the video avatar, they're like, no, only particular use cases that we work with closely are going to be able to do that one. And the Personal Voice, I mean, you can do that personal voice stuff right now with Descript and things like that. So I think Microsoft, I don't know, I'll say they're being appropriately cautious on this. That's true.

Microsoft Teams gets generative backgrounds and voice isolation, and Microsoft's Mesh mixed reality platform is going to integrate into Teams in January, so you can do 3D virtual meetings. You won't have to have a headset to be in the meeting, though it works with the Quest 3 if you do want to wear one. If you're not wearing a headset, you just appear as a webcam window in the virtual setting, as if you're on a conference call, and everyone else with headsets appears in the 3D environment.
Microsoft To Do, Microsoft Planner, and Microsoft Project for the web will be combined under one umbrella, the Microsoft Planner name, in Teams starting in spring of 2024. Existing apps and web apps will be renamed Planner.

Now here's why NVIDIA might not be as pleased with Microsoft. Microsoft announced it's making its own chips: the Azure Maia 100 AI Accelerator and the Azure Cobalt 100 CPU. Maia can train models; you use Maia if you want to create a large language model, for example. Cobalt can run them; once you've created the model and you just want to use it, you run it on Cobalt. These will roll out to data centers early next year. The Maia 100 is a 5-nanometer chip with 105 billion transistors and is designed specifically for the Azure stack, so Microsoft's not going to be selling these to other people. It's meant to run Azure. It'll power Microsoft services to begin with and will not be available to public cloud customers for their own uses, at least not at first. Microsoft didn't give a lot of detail on the architecture and is not submitting the chip to public benchmarks. It did say it had to build custom data racks to accommodate cabling and some liquid-based cooling it doesn't normally need, and these chips, the Maias, use a high amount of power. The Cobalt, on the other hand, the one you run the model on, is more energy efficient; they emphasized just how energy efficient it is. It's a 128-core chip built on the ARM Neoverse CSS architecture. Microsoft is following a trend here in creating its own chips for customers to run their models on. Amazon has the Graviton3E, which is also based on ARM, and Google is supposedly developing an ARM chip of its own. Cobalt will be available for new virtual machine customers in the coming year. So, Maia only for Microsoft services, but everybody will be able to rent space in a data center using Cobalt. Oh man, there's so much to say here.
It strikes me, by the way, just as a side note, that the world of ARM chips is not so much taking over as their time has come. The ARM architecture is here, baby, and everybody wants a little piece of it. Apple's got their thing. These guys are now doing this, although they're not selling chips to you and me. But I feel like that old adage that ARM will never catch up with 486-based architecture? Nope, we're here, baby. It's here. It's time. It did an end-around instead of catching up: it just became more power efficient and started to be used for everything. And also, this is a little bit of an aside, but when I heard the name Maia earlier today, I was like, oh, the 3D program that Autodesk used to own? Did Microsoft buy it? No, it's a different Maya, and it's spelled different. It is. But, you know, it also is like, oh, really? I'm slightly bummed that Cortana got kicked to the curb. That happened a long time ago, but if you were going to have a new name for your AI assistant, I was hoping it would make a comeback. Copilot's fine. It's fine. You want Cortana? Oh, you think Copilot should have been called Cortana? I'm with you there. Yeah. I think Copilot works, because it's sort of like, you know, it's your little friend on your shoulder being like, you should do this. I'm your copilot. It's a good name. They've just embraced Clippy. Yeah. That's kind of what, yeah. Or Bob or whatever. Sure. Bring back Clippy. Call it Copilot. Yeah. It's exciting stuff, though. Look at Microsoft go, everyone. Yeah. All right.

There's a little bit of fun around this next story. Tell us about it. All right. Scientists published a study in the journal Psychological Science on Monday focusing on how convincing generated images are. The study used the term hyperrealism to describe a phenomenon where people believed that generated images of human faces were more likely to be real than actual photos of real human faces.
The images were generated by the 2020 NVIDIA StyleGAN2 generator. In order to identify bias, only white adults participated in the study. When shown images of white faces, they identified 66% of generated images as real, compared to 51% of real images. Images of non-white faces were identified as real 51% of the time, whether they were generated or not. A reminder: the participants identifying the images were all white people themselves. The study did not include non-white participants, which I think is an interesting factor in all of this.

Yeah, I think a lot of people might misinterpret why that was. There is a phenomenon with identifying what are called out-groups. You tend to view people who look like you, whatever you look like, it doesn't matter, differently than you view people who don't. And what they wanted to test was: are the training sets biased, so that the images created of white faces are better, higher-quality synthetics than the ones of non-white faces? If they had put non-white people on the panel, that would bring the recognition down because of the out-group thing, since they were specifically testing white faces. So to see if that hyperrealism effect that Sarah mentioned kicked in, they were like, let's not have a confounding factor where participants didn't think an image was as realistic because it didn't look like them. Since we're testing what white faces look like, let's make it an all-white panel, so that part of it isn't playing a part in this. And what they found was, yes, the white faces seemed more realistic to the panel. That, to me, is only the first half of the study. I think the next half is to do other groups and say, okay, do we see the same effect?
If we test an entirely Asian panel, do they rate the Asian images as, say, 66% hyperreal versus 51% for everything else? If that's the case, then maybe the training set isn't as biased. If it's not, if it's only 54% or something like that, then you have a case to say, okay, definitely something is going on in the training set.

This is one of those things where everybody hears about it and then braces, because we're just like, oh, what do we say? How do I dance around this? How do I talk about this? It's a really interesting thing, though, about how we see each other, how we see people who look like us, how we see people who don't, and vice versa. There's a real opportunity here to figure that out, to assess it. And I think Tom is dead right that the study needs more data before anyone jumps to conclusions from what we have. And I think the results might be super interesting, might say a lot about how we think and how we behave. There's a real thing here. I suspect, again, I wasn't part of the study, but I suspect that people will gravitate toward somebody who could be related to them physically, you know, genetically. It doesn't mean that you have anything in common with that person, but somehow you think that person is closer to you than the next person. And a lot of that is, you know, physical stuff. And I don't think I do that myself, but at the same time, that's where bias comes in, right? A lot of this stuff is subconscious. You don't realize you're doing it. And these sorts of research projects can bring things to light that help us, in humanity, be able to deal with each other in a way that's, I guess, more fair. Sure.
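To make the numbers in this story concrete, here is a toy tally in Python. The judgment lists are entirely hypothetical, sized only to reproduce the 66% and 51% rates reported for the study; this is not the study's data or code.

```python
# Toy illustration of the "hyperrealism" metric discussed in this story.
# Each list entry is one hypothetical panel judgment: 1 = "this image
# looks real", 0 = "this image looks fake".

def judged_real_rate(judgments):
    """Fraction of images a panel labeled as real."""
    return sum(judgments) / len(judgments)

# Hypothetical counts chosen to match the reported percentages.
generated_white_faces = [1] * 66 + [0] * 34  # 66% judged real
real_white_faces = [1] * 51 + [0] * 49       # 51% judged real

# Hyperrealism means the AI-generated faces pass as real MORE often
# than actual photographs do.
print(judged_real_rate(generated_white_faces))  # 0.66
print(judged_real_rate(real_white_faces))       # 0.51
```

The follow-up study the hosts describe would simply repeat this comparison with a different panel and a different face group and check whether the gap between the two rates persists.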
I think the other part of this that's really interesting is the idea that these synthetic images, even from an older model, not even the good ones we have now, show that hyperrealism is a thing. Even forgetting the whole ethnicity question, 51% of the time we were looking at non-real images and going, that was real, right? So what's interesting is that there's an ideal in our heads that reality doesn't match, but the AI can. That's really interesting. Well, if you have thoughts about that or anything else we talk about on the show, and you don't know our email address... do you have an email address? If you don't have an email address, get one and send us an email. Or not. But if you do have an email address, you can email us at feedback@dailytechnewsshow.com.

Scientists at DeepMind published a paper in the journal Science on Tuesday showing that one of their models, called GraphCast, outperformed the best existing weather forecasting models in predicting global conditions up to 10 days in advance, and they did it in a minute. One minute. It took one minute to, like, look at all the data. Yes. Got it. Let me know when it's going to rain. Thank you. One minute. As long as you want the entire Gulf of Mexico and not your particular neighborhood; I'll get to that in a minute. GraphCast outperformed the leading system, which is from the ECMWF, the European Centre for Medium-Range Weather Forecasts. It outperformed it on 90% of the 1,380 metrics they used. And on events in the troposphere, which is the lower part of the atmosphere that affects the weather we feel, things like temperature, wind speed, wind direction, humidity, pressure, et cetera, it reached 99%. GraphCast uses a machine learning architecture called a graph neural network that is trained on more than four decades of ECMWF data. But instead of simulating the physics like the ECMWF model does, it just does what machine learning does: it predicts.
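For the curious, the "graph neural network" idea boils down to nodes (think grid points on the globe holding weather readings) repeatedly exchanging information with their neighbors. Here is a minimal, purely illustrative message-passing step in Python. It is not GraphCast's actual architecture; the mixing weights and toy temperature values are made up.

```python
# One toy "message passing" step, the core operation in graph neural
# networks like the one behind GraphCast (illustrative only; real
# models learn the weights from decades of data).

def message_passing_step(node_features, edges, w_self=0.5, w_neighbor=0.5):
    """Update each node by mixing its own value with the mean of the
    values its neighbors send it along the directed edges."""
    neighbor_sums = {n: 0.0 for n in node_features}
    neighbor_counts = {n: 0 for n in node_features}
    for src, dst in edges:  # each edge carries a message from src to dst
        neighbor_sums[dst] += node_features[src]
        neighbor_counts[dst] += 1
    updated = {}
    for n, value in node_features.items():
        if neighbor_counts[n]:
            mean_msg = neighbor_sums[n] / neighbor_counts[n]
        else:
            mean_msg = value  # a node with no incoming edges keeps its value
        updated[n] = w_self * value + w_neighbor * mean_msg
    return updated

# Four grid cells with toy "temperature" readings; edges link neighbors
# in both directions, like adjacent points on a weather grid.
temps = {"a": 10.0, "b": 20.0, "c": 30.0, "d": 40.0}
edges = [("a", "b"), ("b", "a"), ("b", "c"),
         ("c", "b"), ("c", "d"), ("d", "c")]
print(message_passing_step(temps, edges))
# {'a': 15.0, 'b': 20.0, 'c': 30.0, 'd': 35.0}
```

Stacking many such steps lets information propagate across the whole grid, which is how a trained model can relate conditions in one region to the forecast in another without simulating the physics directly.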
GraphCast also uses less energy than the ECMWF system, which has to run on big supercomputers. And it works well globally for things like major storms. It did a great job of predicting Hurricane Lee, a hurricane that hit Nova Scotia; it beat the ECMWF by three days in predicting when it would make landfall, and it was right. Now, it isn't as good with your neighborhood forecast, with smaller-scale questions like how much rain am I going to get in my neighborhood. So it's complementary to current local forecasting methods, but important nonetheless. I think it sounds amazing. Totally. I mean, I am such a weather nerd, even though I'm not a storm chaser. I think about weather all day, every day, partly because I have to walk a dog who is afraid of puddles. So when it rains, we have to go a whole different avenue. But Hurricane Otis, a different, literal avenue... Hurricane Otis, which was a real hurricane that hit the Pacific coast of Mexico, down in Acapulco, that one came across my radar. I'm sorry, I'm not trying to be funny. But you're doing a great job. You don't even have to try. I know. It's just coming to me. I don't know, I'm built for this. But this was a hurricane that a lot of meteorologists were like, we didn't see it coming. The weather patterns kind of confused us. I had a lot of people going, well, wait a second, are we just not going to feel storms coming? Now, that was a bit of an outlier storm, but it's something to think about as we talk about weather in different ways, more and more these days. Unfortunately, Otis in particular is an example of what GraphCast isn't good at, because it was a sudden intensification, and GraphCast did not outperform the other models in the case of Hurricane Otis. So it's not perfect. It's not going to outperform in every single case, but most of the time it does. And the ECMWF is now using it.
If you're like, well, great, why is the ECMWF still using all those power-hungry models that aren't as good as GraphCast? Well, guess what? They're not just doing that. They're using GraphCast, and they're developing their own version of it, which they're going to integrate into the systems they already have. It sure does feel like the future, though, doesn't it? As this gets better and better, we are going to use it more and more. And I hate how I can't rely on my weather forecast very much. It's usually kind of wrong in some way or another. Maybe it's just where I live, maybe it's a regional anomaly, maybe it has to do with climate change. But the more we get our heads around this stuff, the more accurate it's going to be, and I am here for it. My favorite thing is, you know, I've got one of the complications on my Apple Watch set to show me what the weather is going to be for the next five hours. What is it now? What's the high today? What's it going to be for the next five hours? And it says it is not raining, and I look outside, and it is. I'm like, well, okay. We're not wizards here, but that's not helpful. I think we forget how bad weather forecasting used to be. It was worse, you know, 20 years ago, 30 years ago. And I think we got used to the idea that, ah, weather predictions, they're rarely right. A 40% chance means you can say whatever. Well, and it's become a trope, to the point that now if the forecast isn't perfect, we're like, see, the weatherman isn't always right. And it's not even always a weatherman; times have changed. We have gotten very good at predicting the weather. And if I were to try to explain this: GraphCast is going to be really good at telling you, within the next 10 days, is it going to rain? Is it going to rain on Tuesday? GraphCast is going to be really good at telling you that. Is it going to rain at 10 a.m.? At 1 p.m.?
Is it going to rain a lot? Is it going to rain a little? That's the part it's not as good at. Yeah, that makes sense. And progress in this area is always a good thing. Also, I'm sure I'm not the only person listening to the show who, when she said Hurricane Otis, thought about her dog, maybe raising havoc in the house. That's what we all thought. Yeah. So I just wanted to point that out. He can be a hurricane, trust me on that. You can't predict that, though. Luckily, he cannot make it rain outside the home. And he's afraid of puddles. He's very afraid of puddles. It's weird. I'm like, you're a dog.

All right, let's check out the mailbag. Let's do it. This one comes from Bill, who wrote in about our PlayStation Portal discussion from Monday's show. Bill says: I've had my PS5 since launch, and it resides next to my OLED in my basement movie room. The main room that my wife and I live in is our living room upstairs. After work and supper, and maybe a workout, the only time we really have to sit and be around each other is for a couple of hours in the evening. Could I escape to my beautiful OLED downstairs for some gaming, only to come back up when it's time for bed? Sure, but sometimes I'd like to stay upstairs, where we can have conversations and watch EuroTrip for the 20th time while I remote play on my laptop. The laptop works fine, but due to an issue with the Wi-Fi card, I'm stuck at 30 frames per second with some packet drops every five to 10 minutes. I also have to be aware of any kind of pressure being put on the laptop's USB port. So, Bill says, the Portal solves all of this for me. I can easily play something while lying in bed or sitting on the couch. As long as it works like the PlayStation Remote Play app, it'll also work outside of my house, as long as it has an internet connection. Bill says, I'm planning on getting the Portal once it's been out for a while. Bill seems really smart.
He wants to make sure he still has good conversations with his wife. Yeah, Bill has thought about it. He wants, number one, high-quality gaming, and number two, to talk to his wife. And he's also planning on getting the Portal after other people have bug-tested it. Yeah, that's a beautiful way to do it. And he's right about all the use cases; this makes perfect sense. I do this with my Steam Deck while my wife watches her terrible Hallmark Christmas shows. I like to sit and play games and chat with her while we're doing it and make jokes back and forth. So he's got that part right. The only open question right now is: how's the latency in some extended use cases? So far, I've heard from a bunch of people in my community who listen to my gaming shows saying they love it. They're very happy with it. They say it's heavy and feels well built, not heavy in a bad way, but, you know, like it's a nice, solid piece of hardware. Yeah, it doesn't feel janky. As long as you've got a good connection within the house, you've done some decent Wi-Fi work in there, it sounds like it's going to be exactly what he needs it for. And I'm stoked for PlayStation 5 players to have this method of taking their games with them, for these use cases, because they are more common than we think. And I think Sony knew that; that's the reason this is a product. So I say, if it works that well, well done, Sony, and well done, everybody who needs it for what it's good for. Love it. Agreed. Well, Scott Johnson, we knew you were going to have thoughts on this, and thanks for sharing them. And thanks for being on the show in general. We want to know more about what you think about gaming and what else you do. Well, the good news is, I launched, hopefully successfully...
It seems like it has so far: a new monthly discussion show with myself and Greg Street and the people at his new AAA MMO game development company called Fantastic Pixel Castle. If you go over to frogpants.com/street, or their website, which you can find very easily by searching that name, you will find the first episode in both podcast form and archival video form. We livestream it as well. The idea here is to be very transparent, from ground zero all the way up to the top of this thing when they release a game, about how it's made, working with players on what they want and how they want it. It's a very different approach to this sort of stuff. So if that sounds interesting at all and you want to get behind the scenes, come check it out. That's every month, available now at frogpants.com/street. Fantastic. I'm so excited for that. You and Ghostcrawler are podcasting together. That's awesome. We've been thinking this would happen for years, but we didn't know how. And now we found a way. Life finds a way. Patrons, stick around for the extended show, Good Day Internet. The 2023 Game Awards nominations have been announced, and one of the best games of the year, Starfield, got one nomination. People are upset, but it's a great year for games. So what do you do? Well, Scott Johnson will tell us what to do, so stick around. You can also catch the show live Monday through Friday at 4 p.m. Eastern, 2100 UTC. That's when DTNS is live. Find out more at dailytechnewsshow.com/live. We'll be back tomorrow with Will Smith joining us. Talk to you then.