Daily Tech News Show is made possible by you, the listener! Thanks for listening, even if you're watching. Maybe you're listening with your eyes. Thanks to all of you, including Mark Gibson, Reed Fishler, Larry Bailey, and James Irizari.

On this episode of DTNS: the new app model is no ads, monetization with premium features, and no algorithms. Plus, Google Meet lets you use your phone even if you're at the meeting in person, and Nvidia's new AI for your local PC.

This is the Daily Tech News for Tuesday, February 13th, 2024. In Los Angeles, I'm Tom Merritt. And from Studio Animal House, I'm Sarah Lane. And the show is produced by Sarah Rodgerson. And sometimes we all come together to form Daily Tech News Show, like a mecha, right? A Voltron sort of thing. Excellent, I like to say. Like a Triforce. Not to toot our own horn too much, but we're kind of good at this. We should toot our horn a little more. We're a link to the tech news. A link to the tech news, indeed.

All right, let's start with the quick hits. Regulators in the European Union have decided that Apple won't have to make iMessage interoperable with other iMessaging platforms. Messaging platforms, rather. Microsoft also won't face tighter controls on how it can operate its search engine Bing. The European Commission concluded the services aren't popular enough to count as core platform services under the DMA, or Digital Markets Act. The Commission also gave Microsoft's Edge browser and Microsoft Advertising a pass. Apple says it will still support RCS on iMessage later this year. Sick burn, EU. Oh, sorry, you're not popular enough to comply.

The Flipper Zero is a digital multi-tool that's popular with pen testers and other kinds of hackers. A partnership with Raspberry Pi has produced a video game module that can run games programmed in C, C++, and MicroPython. The module also has sensors for hand tracking, a three-axis gyroscope, and a three-axis accelerometer.
Now, yes, you can play the games on the built-in 1.4-inch monochrome display on the Flipper Zero, but you can also output the video to an external display. The video game module costs 49 bucks. The Flipper Zero itself, if you didn't already have one, costs 169 bucks.

And in other small-ways-to-play-games news, the Playdate handheld console is now available for immediate shipment after ordering. If you were waiting for that delay to come down, well, it came down.

Rumor has it that Microsoft plans to take Xbox exclusive titles to competing platforms, including the PlayStation 5 and the Nintendo Switch. The company plans to share its vision for the future of Xbox on a live podcast Thursday, February 15th, at 12 p.m. Pacific time; that's 3 p.m. Eastern. The event comes after weeks and weeks, it seems, of rumors suggesting games like Hi-Fi Rush, Sea of Thieves, Starfield, and Indiana Jones could all appear on non-Xbox platforms for the first time. Microsoft has confirmed gaming CEO Phil Spencer, Xbox President Sarah Bond, and head of Xbox Studios Matt Booty will all share updates on the Xbox business at this Thursday event. Bond, Booty, and Spencer.

For those ChatGPT power users tired of reminding the chatbot how they like their emails formatted every time they ask for help, or other stuff like that, OpenAI is introducing a feature called Memory. The feature lets ChatGPT remember things about you over time. Now, you can do this in a couple of ways. You can tell it specifically, hey, remember this, or you can let it pick the details up itself and remember them as it monitors what you do. Each GPT you use can have its own memory. So if you've been using some of the chatbots from the store, they can have specific memories. You can also ask the bot what it knows about you, if you let it learn on its own, just to make sure it's not keeping some stuff you don't want it to keep.
And you can actually remove that stuff, either by telling it to remove it or by going to the Manage Memory section. Memory is not rolled out for everybody, though. It's available to begin with for a small portion of users.

Along with putting ad-free Amazon Prime Video content behind a paywall that charges users an additional $2.99 per month, Amazon put support for Dolby Vision and Dolby Atmos on movies and TV shows that support it behind that paywall as well. Some folks were not super thrilled about this. The change was first noticed by the German website 4KFilme and later confirmed to Forbes. I call this shrinkflation for digital video: slowly take away features, but don't raise your price.

All right, Nvidia released a demo chatbot called Chat with RTX. It runs locally, not in the cloud, and can answer questions about the data you give it. You'll need Windows. You'll need a machine with an RTX 30 or 40 series GPU and at least 8 gigabytes of VRAM. This is a demo as well, not a finished product. So if you want ease of use and a slick user interface, don't do this. But if you're okay tooling around with stuff a little bit: it installs a web server (don't worry, it's not actually going to connect out to the internet; it's just used to serve you the interface) and an instance of Python, and it can use either the Mistral or Llama 2 models to query the data you feed it. Nvidia's Tensor cores on those RTX GPUs are the things that speed it all up. The app itself is 40 gigabytes. The Python instance takes up three gigabytes of RAM, and it creates a JSON file in every folder you ask it to index. So, you know, don't go crazy and ask it to index your entire drive. Nvidia says you probably need about 50 to 100 gigabytes of space free on your hard drive. And that web server, like I said, does not send your data to the web, but it does mean you use a browser to access the interface. And output is given to the command line, so you'll also need that running. This really is a demo.
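For the tinkerers, that per-folder indexing behavior, dropping a small JSON file into each folder you point the tool at, listing the documents a local model can be queried against, can be sketched in a few lines of Python. To be clear, this is just an illustration of the idea, not Nvidia's actual code: the `index.json` file name, the schema, and the `build_folder_indexes` function are all invented for this example.

```python
import json
import os

# File types the demo is documented to accept (txt, pdf, doc/docx, xml).
SUPPORTED = {".txt", ".pdf", ".doc", ".docx", ".xml"}

def build_folder_indexes(root):
    """Walk `root` and write a small JSON index into every folder,
    listing the documents a local model could be pointed at.
    Returns the paths of the index files it wrote."""
    written = []
    for dirpath, _dirnames, filenames in os.walk(root):
        # Keep only files with a supported extension.
        docs = [name for name in filenames
                if os.path.splitext(name)[1].lower() in SUPPORTED]
        index = {"folder": dirpath, "documents": docs}
        index_path = os.path.join(dirpath, "index.json")
        with open(index_path, "w") as fh:
            json.dump(index, fh, indent=2)
        written.append(index_path)
    return written
```

Note that it writes one index file per folder, including folders with no matching documents, which is why pointing something like this at your entire drive would scatter files everywhere, just as the demo's docs warn.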
You can feed it text documents, PDFs, .doc and .docx files, and XML, as well as YouTube URLs, if you wanted to do, like, a transcript or summary or something. It does not remember context, so each query is fresh. You can't say something that refers to the previous query. And Nvidia says it does better at answering questions than it does at summarizing. It also does better with larger amounts of data than with single documents. Very, very much a demo, Sarah, but kind of a cool demo, in that it's showing that you could have a lot of the features of these large language models locally on your own machine.

Yeah. I guess as you were explaining it, I was thinking, like, yeah, if you're a person who likes to tinker, and I know a lot of you out there are, this sounds just sort of fun to do just to do it, but it does sound pretty rough around the edges. I also, you know, I get asked this question a lot, and some folks out there either are asking it themselves or maybe know somebody who is going to ask them this later: how important is it when something is run locally versus in the cloud, especially when we're talking about, you know, a language model?

Yeah, it's one of those questions that's easy to answer from the trust-no-one side of things. It's a little fuzzy when it's like, but how much do I really need to worry? So if you are someone who's like, hey, I want to make sure I don't accidentally give any kind of information to anybody who might misuse it, well, having it all locally means it's under your control. If you do send things to the cloud, like with ChatGPT or Copilot or Gemini, you are relying on that cloud service provider, OpenAI, Microsoft, or Google, to maintain good stewardship of your data: not only keeping it out of the hands of people who might try to get it from them, but also, you know, not using it themselves for something you don't want it used for.
So I guess it's all an individual choice: how concerned are you about your data being used for things like training models, which OpenAI says ChatGPT does all the time unless you're an enterprise user, and how much do you trust those companies to keep that data to themselves.

Also, the whole not-remembering-context thing. Again, it totally depends on, you know, where you fall on privacy versus... I don't know how much context can help your next query, but it feels so limited in this sense. I feel like that's something the next version of this will include.

They're just trying to keep it simple, because again, you're running it locally; you're not using a big old data center. You know, ChatGPT used to not remember context. It was a big deal when they added context to it. It's just more process-intensive for it to do that, which is, to your point, another reason why this is just a demo, right?

Yeah. Well, if anybody does end up playing around with this, let us know how it works.

And yeah, it's not that tinkery. Like, it's not like you have to go and do registry hacking or anything. It's just not user-friendly out of the box, you know.

Yeah, exactly. When I hear, sort of, web server but not really a web server, command line, you know, I know some people are like, eh, talk to me later.

Web server can sound scary, but you have all kinds of web servers on your machine right now. You don't realize it because nobody called them out. The only reason to know it's a web server is that you're going to have to use a browser to access it. It doesn't have its own independent GUI, which, you know, fine. But there's a lot of stuff like that, mostly websites.

A little further context on this, too: Nvidia CEO Jensen Huang was talking at the World Government Summit and said that he thinks each country should, quote, codify the language, the data of your culture into your own large language model.
Now, obviously, Huang thinks that Nvidia's localized AI processing is the right hardware for the job. But this comes right along with the news this week that Nvidia's stock got to the point where it was slightly ahead of Amazon in market cap, briefly. Then it fell behind, but it's pretty much even with Amazon, which makes it the fourth or fifth largest company in the world, behind Microsoft, Apple, and Alphabet. So we're going to talk with Scott Johnson a little more in part two of this conversation tomorrow about how that GPU maker that only gamers cared about suddenly turned into the fifth largest company in the world and the leader in AI hardware.

Well, another company that some folks around the world are familiar with, Google, has a product called Google Meet that many people who are part of meetings are well aware of. Google Meet introduced a new companion mode, which is designed to help people better participate in video conferences. Companion mode actually already existed, but they've furthered it, I guess, because enough people were using it even when they were in a conference room. So you could be remote, or you could be in the conference room, but companion mode will allow you as a user to check in on a mobile device, share something like an emoji reaction without having to interrupt the speaker, maybe raise your hand virtually to indicate that you'd like to speak, turn on captions, send chat messages, kind of just be in the conference room without being too disruptive to it, and view and zoom in on any presented content to follow along on your own device as well. Companion mode has been available on the mobile web version of Google Meet as well as Google's Nest Hub Max device. Now it's available as its own dedicated app for iOS and Android.

And Tom, before the show started, we were kind of going back and forth, where you were like, so this is for people within the conference room, right? Yes.
But you don't have to actually physically be in the same room as everyone else in the conference room, because, of course, as many of us who are remote workers or do some form of hybrid work know very well, that doesn't happen as much as it used to.

Yeah. So I looked into a little bit of the history of companion mode. If people aren't familiar with it, they launched it, like Sarah said, originally for laptops, the mobile web, and the Nest Hub Max, as a way for people who were in the conference room when part of the meeting was remote. You couldn't do any of the other stuff, right? Because there was one device connecting you to the meeting, and your only other option would be to join the meeting yourself, but then you'd have to turn off your camera, mute yourself, and you'd still see all the tiles and all of that. So what companion mode does is say, hey, if you don't need voice, video, or to see everybody, just turn on companion mode. Then you won't be distracted with all of that stuff. You won't have to remember to mute yourself or anything like that, but you'll still be able to raise your hand and see the chat and all of that sort of thing. So if you're using companion mode, you won't see anyone else in the meeting, you won't hear anyone else in the meeting, and they won't hear you, but you will be able to do all of those interactive things.

And the news here today, like Sarah said, is that instead of having to open a laptop to do that in a meeting, now you can do it on your phone, which is a little less bulky, right? If you're sitting in that conference room, you can just pull out the phone and be like, okay, now I can send smiley emojis to my boss while they're talking. Isn't that great? I mean, I don't know. It depends on how much you like your boss, I guess.

But I do find this... first of all, thank you. That was a good explanation.
But there was a time, not that long ago (well, I guess it was sort of a long time ago), when I was actually physically in a conference room regularly to be part of a meeting. And when I would kind of be looking at my phone, pressing a couple of buttons here and there, people would be like, what is she doing? Super rude. She's obviously distracted. I mean, this is the modern era of meetings. Some people love meetings, some people don't. But if you have to be part of a meeting, you're not always going to be running the meeting or being the main speaker. It's not really that different than just being like, I'm on mute, but I'm listening, kind of thing. If you happen to be in your car, maybe you're coming back from a doctor's appointment, whatever you're doing, a lot of people just aren't in the same room anymore. But if you are, this just allows you to, I guess, be a little less checked out. That's kind of what I see this as.

Well, because I imagine a lot of people have run into this, where they're like, well, I need to put this in the chat, but now I have to reach over to the Nest Hub Max and, like, type into it. No, I'm not going to do that. So I guess I'll open my laptop, but then I have to remember to mute and all of that. So companion mode solved a lot of that. And what the mobile one does, I think, is funny, but I imagine this is something people really run into: hey, if you don't have room to open your laptop, you're in a small conference room, right, and you're, like, shoved in there, it might be a little uncomfortable to pull out that laptop. Now you can do it on your phone. So all of this underlines, right, that hybrid meetings are here to stay. Companies are developing these features because everyone's doing hybrid meetings.

Yeah. I mean, when I was looking at this story earlier today, I was like, is this even really that important?
I mean, for a lot of people, people will be like, companion mode? I don't care about that, you know. Or, I don't have enough meetings to care about that, or, you know, whatever. But it all just illustrates how much these little tweaks, little tweaks to make it easier to be part of a group while not necessarily all being in the same room, or not necessarily all having the same equipment, or having to participate in the same way, is just, you know, an evolution.

Must Be Between Four and 25 Letters in our Twitch chat says: oh, good, I can keep playing Candy Crush but look like I'm paying attention to the meeting chat, just by switching between apps. Thanks, Google.

We see you. Right. Yeah. I'm just... I'm paying attention. If you want to share your favorite meeting hacks, let us know on the socials. You can get in touch with us at @dtnsshow on X, at @dtnsshow@mstdn.social on Mastodon, at @dailytechnewsshow on TikTok, and at @dtnspix on Instagram and Threads. See you there.

So TechCrunch had an article today that caught my eye on an app called Memorizer. Now, not every time TechCrunch or The Verge or anybody does an app review does it make the news, but I thought there was something really interesting about this one. Memorizer is an iOS app that's kind of part Goodreads, part Trakt, and part Yelp or Foursquare. It offers to let you track anything (books, movies, shows, video games, restaurants, recipes, et cetera) as memories. Memories can be things you did, like a movie you saw; things you're doing, so you can check into reading a book and say it's ongoing; or a thing you want to do. It's like, you know what, I want to go to that restaurant. I'm going to put that as a memory so I remember what restaurant I want to go to. Memories can be public or private, although you've got to pay for premium to get the private ones.
And you can follow people's public memories in the Inspirations tab and add some of their memories as recommendations you want to follow up on. It also has groups where you can recommend things to each other, groups around, like, sci-fi books or, you know, action movies. Users can create lists of their favorite things, another premium feature. You could make, like, best movies of 2023, best restaurants in Austin, something like that.

And these are the other interesting aspects of Memorizer that I thought made it worth talking about. There is no recommendation algorithm. There's AI all over this thing, but the AI is in your memories, to help flesh them out. Like, if you add a movie, it'll bring in summaries and directors and, as they say, enrich your memory. But they don't use AI for recommendations. Recommendations are made for you by people you know, people you're following, or groups you're a member of. And there are no ads. The company makes its money on these premium subscriptions I was talking about. That lets you make your memories private, create more custom lists, et cetera. So, Sarah, it's kind of a social network built around interests rather than news or viral hits, but it's still kind of for influencers, right? Because, you know, you're influencing people with your recommendations and stuff.

I guess, yeah. I mean, if you really wanted to go all in on Memorizer, you could influence lots of people. My first reaction was like, didn't we all decide to stop overthinking every little thing we do throughout each day of our life, write it down, send it to somebody, type thing? But I think that in past cases... you know, you mentioned Foursquare. I've got no problem with Foursquare or Yelp or Path or any of the apps that I used to use a lot more back in the day.
But this does feel sort of throwback-y. Still, I downloaded it, installed it earlier today, and played around with it. I added I Think You Should Leave, one of my favorite TV shows. I was like, super easy, comes up, nice graphics, you know. And it lets me quickly add, like, a note of why I like it, type thing. I already know that I like the show, but let's say I just saw it last night, type thing. Sometimes there is a little context, like, oh, Tom and Roger and I were... remember, it was that Monday when we all hung out and had pizza and watched the show that we all liked. Little things like that, kind of breadcrumb-y stuff. My first reaction was, it sort of reminds me of when you wake up from a dream and you're like, wow, that was a crazy dream. But if you don't write it down right away, there's a lot of context that just gets lost. Even if you think, in 10 hours I'll still remember all of this, you just never do. You just don't, or you don't care anymore. But you might wish that you cared later, you know, in a week or a month or a year, when you go back to it. I like this. I like stuff like, you know, the sort of minutiae of things I like and why and where and how. This is a great app.

Yeah, I think it's different than Foursquare and the others, and not just because it combines multiple things, though I think that is key. It's not saying it's a restaurant app. It's for following things, whatever they are. But it also lets you keep those private; it makes that front and center. I'm sure you could do private things before in Foursquare and Yelp and others, but they didn't really want you to. The point of Facebook and Twitter and Foursquare was, share it with the world. What's interesting is this is, share it with the people you want to share it with. You can share it with the public, or you can keep it private.
You can have a private group where you only invite certain people, and then what's shared in that group stays in that group. So there's more of a balance. Yeah, there might be some things you're comfortable letting the world know, but let's not just assume that you always want to let the world know everything.

This doesn't strike me as a place to, like, blast out, hey everybody, here's what I just did, which is why I mentioned Path. Path, for anybody who wasn't familiar... gosh, I don't know, Path's heyday was 2010, 2011, but it was a great app. The whole idea was, you know what, social networks are fine, but what about the stuff you really care about? You only really want to share that stuff with your close-knit friends. So, you know, pick those 10 to 30 people, and that's a different conversation than what you might be having on Twitter. And I don't know, I always liked that idea. I think there's room for both of these things.

And also, for example, okay, I mentioned on the show last week, when we were talking to Scott Johnson, that I watched A Clockwork Orange after many, many years, and, you know, I was horrified all over again. So I looked it up, and it's like, all right, what's the thing that you want to remember for yourself that then I could share with you, Tom? I can rate the memory, you know, between one and 10. Like, is this a really bad memory? Maybe that's important. It doesn't have to be something that you love. It's like, this is a thing. This is an experience. I watched this, and I need to remember I didn't like it.

Yeah, right. Or, you know, if someone asks later, Sarah, Requiem for a Dream: should I watch that with my grandma? I'd say, heck no, don't do it.

Yeah, depends on the grandma, I suppose, but probably not in most cases. Maybe if I didn't have Memorizer, I'd be like, ah, maybe? I don't remember that movie very well. But yeah, I like stuff like this. I think it can be as interesting as you want it to be.
I think the key thing that caught my eye, though, is that it is a different approach to what seems like the same sort of thing. It's saying, we're going to use AI (I mean, it's memorize.ai), but we're not going to use an algorithm to make recommendations. We think those are better coming from people, and you're not going to be subject to weird algorithmic fluctuations that way. And in fact, it's a social network, but it's not like Facebook and Twitter have been in the past, which is, everybody follow everyone, because that leads to chaos. What they want is people following other folks with similar interests, like on Reddit, because that seems to work a little better. So it's combining a lot of these different things that we've started to identify as working better in these spaces.

Not to mention, it's not ad-supported, so it is not selling your privacy. It's asking you to pay for the privacy. Now, that might rub you the wrong way, but then you don't have to use the app at all. I think it is fair for them to say, hey, if you want the best version of the app, you should pay us for it, and we'll give you some of the stuff for free. Subscriptions are $6 a month or $45 for a year. So, you know, it's not outrageous. That might be more than you want to pay for something like this, but it depends on how useful it is. One thing they note is that they have 70,000 monthly active users, and 50% of new users keep using the app after three months, which is an insanely high retention rate for apps. People download apps all the time and never go back to them. So this one is holding onto its users a lot longer than others.

All right, let's check out the mailbag. This one comes from Technomanche. Tom mentioned, on Monday's show, I believe, that he has an external Blu-ray drive that works with his Mac.
Technomanche says: I have yet to find one that does. My old external DVD drive was a micro USB-to-USB-A model that used to work on the Intel Macs, until I upgraded to a newer model it wasn't supported on. It's been gathering dust ever since. Is your drive USB-C, or are you using a hub with USB-A support? What have you found that works?

My drive is USB-C. I still can't tell you what it is, because I can't find it, which is, I think, a revelation as to how often we use it. So I will dig that up as soon as I can. But Roger came up with an OWC link at macsales.com with Blu-ray drives, both internal and external, if you want ones that definitely work with Mac. And if anybody else has recommendations for Technomanche, like, oh yeah, this is my external Blu-ray drive, I use it, I love it (especially, I think, if it works with USB-A, because I think that's Technomanche's situation as well), send them to us at feedback@dailytechnewsshow.com.

Patrons, stick around for the extended show. We've got more good content for you on Good Day Internet. You may have seen the story that the on-demand version of the Super Bowl halftime show fixed a few audio issues, like a cracking note and some low mic volume, so that you wouldn't know they happened live. Is this smart? Is this ethical? Does this bother us? Stick around and find out.

Just a reminder, we do the show live. You can catch it live Monday through Friday at 4 p.m. Eastern, 2100 UTC. Find out more at dailytechnewsshow.com/live. We are back tomorrow talking about how Nvidia pivoted from gaming to AI with Scott Johnson. Talk to you then.