Coming up on DTNS: peace in the smart home wars, Spotify wants you to make some friends with music, and how deepfakes work and how worried you should be. This is the Daily Tech News Show for Wednesday, December 18th, 2019. In Los Angeles, I'm Tom Merritt. And from Studio Redwood, I'm Sarah Lane. In Salt Lake City, I'm Scott Johnson. And I'm Roger Chang, the show's producer. We were just talking about ice cream, a lot about ice cream, also about angry garbage men and much, much more on Good Day Internet. If you want to get that wider conversation, become a member at patreon.com/DTNS.

Let's start with a few tech things you should know. Following a series of leaks from multiple people, Samsung publicly revealed the Galaxy S11 in an application to China's CCC. The company confirmed that the Galaxy S11 will come with 5G but will only support a maximum charging speed of 25 watts. This is after Samsung's Note 10+ introduced 45-watt charging, so a few people are disappointed about that.

Yeah. Share Now, the car-sharing service formerly known as Car2Go and owned by Daimler and BMW, says it's leaving North America to focus on the European market and will cease service on February 29th, 2020. That is right around the corner. Share Now currently operates in New York City, Montreal, Seattle, Washington, D.C. and Vancouver.

Google resumed the rollout of Chrome 79 for Android after issuing a fix for the bug that made data inaccessible for some apps that use the WebView framework. Google says the fix will make its way to devices within the week and, most importantly, make data visible again.

All right, let's talk a little bit more about a ransomware situation. It's a big one. Let's do it. Canada's LifeLabs, have you heard of them? Well, we'll tell you all about it. The provider of laboratory diagnostics and testing services said it paid to retrieve data stolen during a data breach. That's pretty gnarly. According to documents filed with the Office of the Information and Privacy Commissioner of Ontario and the Office of the Information and Privacy Commissioner of British Columbia, the attack occurred around November 1st and included information on more than 15 million customers, such as names, home addresses, email addresses, usernames, passwords, health card numbers, et cetera. Oh, and by the way, 85,000 customers' medical test results. Eesh! LifeLabs says it's working with law enforcement and that it has patched the vulnerability.

So, two significant things about this. One is the size: 15 million is close to half the population of Canada. The second is that an organization this big paid the ransom. With ransomware, the generally accepted practice is not to pay, because you don't want to encourage the practice. You don't want people to think that this will work, and so especially large organizations will often not pay the ransom for that reason. Also, you don't know that the attackers really got rid of the information; they may have kept a copy that they could use against you again down the road. So it's odd that LifeLabs did pay. They say they made that decision in consultation with experts, but they didn't delineate the reasoning for why they would do that in this case. They also didn't say how much they paid. Yeah, and you wouldn't. You wouldn't want to put that out there, especially if the reason you're paying has to do with catching the people.
Maybe, you know, the security experts said, "If you pay this, I think they're doing it in a way that's wrong, and we will be able to find out who they are." Let's do that, right? The other thing is, sometimes insurance will cover it if you pay, and it's faster to just pay and get the information back than not. But it's generally considered a bad idea to do this.

Yeah. Look, here's what I would feel if I were any of these people. I would think that this isn't resolved and that paying them off means nothing, because I guarantee — well, I can't guarantee anything, but I'd be willing to bet a lot of money — that they still have a copy of this, that it's still out there, that it's maybe in other people's hands. I don't think there's any code of ethics that we know of that kicks in here where the ones holding a hostage suddenly go, "Because I swear transparency, we promise on our mother's grave that we've deleted the data. We did this big illegal thing, but listen, all we want is money, then you get all your data back, no problem." With all of these stories — and yes, the LifeLabs story is a big one — if I were a LifeLabs customer, well, I guess you don't necessarily know if you're affected, but if you knew that you were affected in this breach, I would be extremely upset and extremely worried. But that aside, with all of these stories I just sort of shake my head. Whether or not you pay a ransom — and we've probably all gotten the phishing emails where they say, "We've been watching you through your webcam and you should pay us this money" and stuff like that — it's like, the people who pay, I don't know at what point you ever think that any of this is resolved, because once you pay, well, then you're susceptible to more attacks.

Yeah, like a real ransom. Let's just say the Mel Gibson movie Ransom, okay? You're trying to get your kid back because they have your kid. So you pay the money, you meet them at the drop, drop the money, take the kid back. Now you have the kid. They didn't copy the kid. They don't have a backup of the kid somewhere that they can then hold over your head again. But they do in this scenario, right? This stuff is infinitely copyable. There's nothing about this that rings, to me, like they are going to do the right thing, quote unquote, after they did the wrong thing. It's ridiculous, and the fact that it's this many people — in the context of the Canadian population, this is a huge number, like a gigantic number. I would be livid if I were those people.

What if you look at it this way: the problem is the breach, not the ransom, and the breach has happened. The data is out there. If you don't pay the ransom, the data stays out there, right? This isn't ransomware where they locked up something; this is ransomware where they have your data. If you pay them and they delete the data, like they said they would, you win. If you pay them and they don't delete the data, well, you're in the same situation you were in before you paid, minus the money you paid. There may be a situation where it makes sense. Like, you know what? This is going to be covered by insurance anyway. Pay them, and maybe they will delete it, and then we'll at least be in better shape than we would be otherwise. So I could see that being the line of reasoning: yeah, we're not guaranteed to fix this, but it certainly won't get fixed if we don't pay them.
Maybe it's the worst-case scenario, or the best-case scenario, and they went with that, and I guess I can understand that.

Well, this next story might make you feel a little bit better, if everything goes as planned. Amazon, Apple, Google and the Zigbee Alliance are partnering on Project Connected Home over IP, a new standards effort aiming to let smart home products work better together. Right now, when you buy a smart device, you have to check to see if it works with your platform, and smart home device makers have to choose which platforms to support — they don't always support all of them. The project aims to enable communication across smart home devices, mobile apps and cloud services as well. The Zigbee Alliance, if you're curious, includes multiple companies like Samsung, Philips (now Signify), IKEA, NXP and Resideo, so quite a few use Zigbee as well. The project will start with safety devices like smoke alarms, security systems, electrical plugs, window shades and HVAC systems. A draft specification and open-source materials are scheduled to come out late next year.

Man, this — I mean, I talk about this most when it comes to smart assistants, but it is an entire smart home problem, where I have to think about: do I want to buy that device? Will it work with something I have? And if I have a Google Home — or I guess a Nest Home now — and an Echo, which do I want to use to control things? Do I want to try to use both? It's just a mess. This probably won't fix the voice assistant part of that — you'll probably still have to pick one of those — but it may make everything else a lot easier, where you don't have to think, oh wait, should I be in the Apple Home ecosystem or the Echo ecosystem? You can just buy stuff and it'll work together, and that is good for all of us. It's good for us as consumers, and it's good for the companies too, because they'll sell more products.

It's also rare. Usually these companies dig in and think, well, we want to be the ecosystem people most use — Apple in particular — and they don't usually budge from that. But once in a while you'll see something like this happen, and I think this is for the betterment of the future of home connected IP devices and our ability to control them, because right now there's a lot of confusion in the marketplace about who to go with, what to go with. Wouldn't it be cool if I went with the best thing for one job from these guys, and then I had a whole other job I think would be best handled by those guys, and maybe one's opening my garage door and the other one's shutting off my lights at night? If they could in some way cohesively work together — I'm a hundred percent behind this idea. I think it's great.

Yeah, it's unusual. Like you say, usually you have one holdout, and usually it's Apple. So it's good to see Apple, Amazon and Google all in. And the Zigbee Alliance was trying to do this on its own, so it's interesting to see the entire Zigbee Alliance go, fine, we know we won't get you to join us, so we'll join you. But it needs a better name — Project Connected Home over IP is not catchy. I'm sure they're going to come up with some rebranding. Well, you don't like "Pachiqui"? Yeah, but "CHIP" — that's what they'll change it to, I'm sure.

Spotify is testing a way to see what your friends have been listening to, called Tastebuds. Jane Manchun Wong found some code referring to Tastebuds in the web version of Spotify. Jane Manchun Wong is amazing at finding these sorts of things.
She told TechCrunch about it. Tastebuds would show up alongside the Library and Home sections in Spotify, so it would just be one of the main sections. Right now you can see the recent things your friends have played, but you have to dig down to find them, and it's only the most recent stuff. Tastebuds would show you more. You'd go to the Tastebuds section, tap a pen icon, and search the people you follow to see what they've been playing the most. So I guess not everything, but you would see what their favorite things are, rather than just the three or four things they played most recently.

I'm unsure if I would use this, because I already find myself not caring so much about what my friends are listening to as much as I want better recommendations just from algorithms and from what I previously listened to — I'd like it to just make better suggestions. But then I got to thinking: my friend Eric Van Skyhawk, who's a bit of a musician in his own right, is a huge nut about great electronic music. And it occurs to me that this actually would be something I would do with him in particular, because I'm always trying to find good electronic music — I'm of the opinion that it peaked in the '90s, but that's a whole other argument for a different show. My point is, if I want to know what's good and new and the algorithms just aren't telling me, maybe the only way to get it is if I start tapping into other people who are into it. So he's one of your buddies with taste. He's my taste buddy, is what I'm saying.

Yeah, I definitely have friends who I find to have really good taste in music, and not just because it's exactly the same as mine. They turn me on to bands that I end up liking, or a song that I end up liking that I wouldn't know of otherwise — or maybe they were just first. So I get how this would mimic that. Most of my friends are not those people, and it's not because I don't like their music; I just don't care. And I don't use Spotify, but if this were an Apple Music feature, I know I wouldn't use it, because I kind of like the playlists that Apple editors — whoever they are, experts somewhere, you know — put together. I like that kind of stuff because it's a little bit more like radio, where I'm just like, who knows what's next? That kind of speaks to me a little bit more. But I'm sure there are people out there saying, no, no, no, this is great, I'm a Spotify user and I can't wait to learn more about my friends' tastes.

There's just so much out there. I feel like I stumble on things sometimes, and that's great, but it always feels like such a weird accident. I'm kind of angry that it took that long for me to find it — like, I would have been listening to this for five years. Where's this band been? And it's been nowhere, because nobody's recommending it. So maybe this is just another way to do that. I'm all for more features in our music. That's the state of the world today: find something you love and be angry about it. That's right. Well said.

Netflix, the thing we all like and are angry about, released 371 new TV shows and movies in the U.S. this year. Just this year, everybody — that number again, 371. Think about that. According to data from Variety Insight, that's an increase of 54.6 percent over 2018. Variety also points out this is more original series than the entire United States TV industry released in all of 2005. Not that long ago.
The number includes documentaries, adult animation, unscripted TV shows, comedy, drama, news and talk, but not content targeted at kids. That number is crazy to wrap your head around. Yeah, because when you think of the entire TV industry in 2005 — it's not like 2005 was the Dark Ages. You've got cable, you've got broadcast, you've got local. It's just pre-streaming originals. You really didn't have Netflix making originals in 2005, but all the major broadcast networks and cable networks were there. You're right. Yeah.

Yeah, I'd believe it if you said all of these streaming services together made more than 2005 — but Netflix alone? Does it include all the original content from Amazon or anyone else? We talked about this on Good Day Internet — another reason to have Good Day Internet, because the story came across the wire right after DTNS yesterday. And the thing that I pointed out was the total television, counting everything: the original television made in the U.S. in 2018 was 1,233 shows, versus 138 made in 2002. So 16 years before, they made 138 shows — you still couldn't watch everything — versus 1,233 in 2018. In fact, it went down in 2019; according to Variety it went down to 1,178. They didn't make quite as many. So maybe we really have hit peak television. They just can't make any more. This is the maximum capacity of television you can make.

Well, it's definitely not all shovelware or junk, either. It's pretty good, generally. I know tastes are going to vary, but we're not talking about just throwaway, terrible television like the kind I grew up with. I mean, that exists too, but there's just more of everything, good and bad. That's true. There's so much that, just on this one service — if you think about it, if you're really into original Netflix content, you're getting quite the value for whatever you're paying, 14 bucks or whatever it is now. But with so much content, if only there was a way for me to know what my friends were watching — then I could get great recommendations. They hate on Twitter; you'll be good.

The thing, when people complain, "I don't want to pay for all these services" — the thing to remember is you used to pay more money for less. There were only 138 new shows, and you were paying more money for that than you are now for 1,178 shows. So that's why there are so many streaming services and so much stuff being made. It'll shake out; they can't keep this up forever. Yeah, I guess we don't know what everybody together is, but Netflix alone is enough. I would be super curious. No, wait — the one-thousand number is everybody together? Oh, that's everyone together. Yeah, yeah, yeah. That's what I've been saying; sorry if I didn't make that clear. That's the entire television output of all of them.

Hey, folks, if you want to get all the tech headlines each day in about five minutes, be sure to subscribe at dailytechheadlines.com.

This Monday, Timothy B. Lee published a part of an ongoing series on AI and deepfakes and the like on Ars Technica, about his experience creating a deepfake video. He got approval from Ars to spend a little money and a little bit of time learning how this worked.
So he decided he was going to take a video of Facebook CEO Mark Zuckerberg testifying to Congress and replace Zuckerberg's face with Data from Star Trek: The Next Generation. He ended up with a 38-second video that cost him $552 to make. He used deep neural networks — that's generally what's used to make deepfakes — rented a virtual machine with four graphics cards, and it took him about a week to train his model. He said if he'd taken more than a week, it might have gotten better, but after a week he had something at least worth looking at.

He started with 14 video clips from Star Trek and nine videos of Zuckerberg. He used iMovie to edit out any parts of the video that didn't have their faces in them — you want to give the software a pure sample of video with the faces. He ended up with about nine minutes of footage of Data and seven minutes of Zuckerberg. That was then cut up into stills, because that's how the training works: the software doesn't look at the video, it looks at the individual still images that make up the video. So he had 2,598 images of Data and 2,224 faces of Zuckerberg. Now, the reason you need all that is you want a variety of angles and expressions and lighting conditions. It's not the length of the clip, it's the diversity of it that's important. An hour-long video of Zuckerberg talking in the same lighting conditions and in the same position is not as valuable as three three-second clips of him in entirely different places. So that's why he wanted a lot of different clips of the faces.

Then he used some software called Faceswap to do the training. He rented that virtual machine, as I mentioned, a Linux machine. He started with one graphics processor, but he upgraded as he went along and ended up with four Nvidia T4 Tensor Core GPUs with 16 gigabytes of memory each. He says he could have done it cheaper, but it would have taken more time. Apparently an Nvidia GTX 1070 or 1080 card with at least 8 gigabytes of VRAM can do this; it just runs a little slower.

The first phase of Faceswap is extraction. That's where you detect the faces in each frame of video. The extraction itself is fast, but then humans get involved. A human — in this case, Timothy Lee — has to go look at each image to see: was that really a face of Zuckerberg? Was it really a face of Data? We're not even into the training yet; this is just identifying faces. Did it identify the faces he wanted? Did it identify things that weren't faces? He said that took a lot of time, going through and manually kicking out the false positives.

Once you're sure you have a pure data set, you feed that into Faceswap's training algorithm, which takes the extracted images and trains the deep neural network to make convincing versions of the faces. Training ran from December 7th to December 13th, and once he was satisfied that the training looked pretty good, he did the conversion — which is almost instant — taking the trained model and applying it to that video of Zuckerberg speaking to Congress. The output, of course, is the video of Zuckerberg with Data's face speaking to Congress.

The better part of Timothy Lee's story, though, is his explanation of how deepfakes work.
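(A quick aside before that explanation: to make the "video to face stills" step concrete, here's a minimal sketch in Python using OpenCV. This is not Faceswap's actual code — Faceswap uses neural face detectors and aligners, and the Haar cascade, function name, and file names here are just stand-ins — but it shows the shape of the pipeline and why false positives pile up and need a human review pass.)

```python
# Minimal sketch of the "video -> stills -> face crops" step. Hypothetical
# example code, not Faceswap's implementation: the Haar cascade detector here
# is a crude stand-in for the neural detectors a real deepfake pipeline uses.
import os
import cv2

def extract_face_stills(video_path, out_dir, every_nth=5):
    """Cut a clip into frames and save a crop for each detected face."""
    os.makedirs(out_dir, exist_ok=True)
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(video_path)
    frame_idx = saved = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break  # end of clip
        if frame_idx % every_nth == 0:
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            # Every detection, right or wrong, becomes a training still,
            # which is why a human pass to kick out false positives matters.
            for (x, y, w, h) in detector.detectMultiScale(gray, 1.1, 5):
                cv2.imwrite(os.path.join(out_dir, f"face_{saved:05d}.png"),
                            frame[y:y + h, x:x + w])
                saved += 1
        frame_idx += 1
    cap.release()
    return saved

# e.g. extract_face_stills("zuckerberg_clip1.mp4", "faces/zuckerberg")
```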
Scott, Sarah, I know you've both read this article, but I'm going to try to summarize this part, because I think I finally understand how this works, and I think it's helpful in understanding deepfakes to know what's going on underneath. It's a big read, but it is the best explanation I have gotten thus far of what's going on and how the technology might advance. So here's my attempt to summarize, and all apologies to Timothy Lee if I get anything wrong — it's my fault, not his.

The encoder in the deepfake software takes the actual image, let's say of Zuckerberg, and tries to squeeze it down to a small number of variables. The decoder then takes those variables and tries to reconstruct the original. So the encoder squeezes down the original and passes the variables to the decoder; the decoder takes those variables and tries to rebuild the original image. Now, if you hear about this training and think, well, wait, you're just giving it Zuckerberg and it's outputting Zuckerberg, that shouldn't be hard — it's how they do it that's important. It's about how few variables you can squeeze it down to. The process learns what parts don't change and no longer encodes those as variables. Zuckerberg always has blue eyes, so you don't put that in the variables — both sides know he has blue eyes, no need to encode it. Only the transitory variables are encoded, and that's going to become important a little later on.

This is an example of unsupervised learning. You don't have a human grading whether it got it right; the networks can tell how well they're doing without a human, because they just compare: I squeezed this picture down to these variables, you took those variables and made a picture, now let's look at the two pictures and see how close they are. Oversimplifying: it puts in the variables, compares what the decoder creates to the original, and then tweaks the neural network to get closer. That's how it improves — it compares those two and says, oh, that one's a little off, so let's tweak this and see if it gets closer. And that's why it takes a week: it keeps doing that over and over. You do this until the autoencoder can sufficiently reproduce its own input, and that's where Timothy B. Lee looked at it after a week and said, that looks pretty good; if I ran it longer it would probably get better.

Now, for a deepfake it trains two autoencoders side by side: one on the original face — how good can I get at squeezing Zuckerberg down to a small set of variables and then reconstructing a Zuckerberg? — and the same thing for Data. And here's the key: they both use the same encoder, but different decoders. The part that is squeezing down Zuckerberg is also squeezing down Data, but there's a different decoder recreating Zuckerberg than the one recreating Data. So the neurons in the shared encoder change with both faces, but each decoder only changes its neurons for the face it's creating, whether that's Data or Zuckerberg. The encoder works on squeezing variables out of both faces and then hands them to either the Zuck decoder or the Data decoder. In other words, you're training the input side to recognize commonalities between both faces — head position, eye position, angle — but the decoders are being trained to take those variables and output only their own face.
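(To make that structure concrete, here's a toy sketch in PyTorch of the shared-encoder, two-decoder setup, including the swap step Tom describes next. To be clear, this is not the Faceswap code Lee actually ran: real deepfake models are convolutional and far larger, and every layer size and name here is invented just to show the shape of the training loop.)

```python
# Toy sketch of a deepfake autoencoder: one shared encoder, two decoders.
# Hypothetical sizes and names; real models are convolutional and much bigger.
import torch
import torch.nn as nn

IMG = 64 * 64 * 3   # flattened face image
LATENT = 256        # the small set of "variables" the encoder squeezes out

encoder   = nn.Sequential(nn.Linear(IMG, 1024), nn.ReLU(), nn.Linear(1024, LATENT))
decoder_a = nn.Sequential(nn.Linear(LATENT, 1024), nn.ReLU(), nn.Linear(1024, IMG))  # Zuckerberg
decoder_b = nn.Sequential(nn.Linear(LATENT, 1024), nn.ReLU(), nn.Linear(1024, IMG))  # Data

opt = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters()),
    lr=1e-4)
loss_fn = nn.MSELoss()

def train_step(faces_a, faces_b):
    """One unsupervised step: each decoder rebuilds its own face from the
    shared encoder's variables; reconstruction error is the only grade."""
    opt.zero_grad()
    recon_a = decoder_a(encoder(faces_a))   # Zuck in -> Zuck decoder -> Zuck out
    recon_b = decoder_b(encoder(faces_b))   # Data in -> Data decoder -> Data out
    loss = loss_fn(recon_a, faces_a) + loss_fn(recon_b, faces_b)
    loss.backward()                          # tweak the network to get closer
    opt.step()
    return loss.item()

def swap(zuck_face):
    """The deepfake trick: encode a Zuckerberg face with the shared encoder,
    then decode those variables with the Data decoder instead."""
    with torch.no_grad():
        return decoder_b(encoder(zuck_face))
```

The one-encoder, two-decoders split is the whole trick: because the encoder is forced to describe both faces with the same variables, those variables end up capturing things like head angle and expression rather than identity, and identity lives in the decoders.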
Once you've trained it up, that's when you swap: you give the encoder a Zuck face, but instead of having it pass the variables to the Zuck decoder, you have it pass them to the Data decoder, and voila, deepfake. And just to be clear, we're talking about Star Trek Data, not just data in general. I mean, I think everyone's with us, but there's just a lot going on with the word "data."

This is amazing. We were talking before the show about ramifications and stuff, and the thing I thought of first and foremost isn't so much that the deepfakes themselves concern me; it's the false positives that concern me. And Tom says, well, that's no different from what we do with Photoshop today, and I agree with that, but I would say it's a notch further down the road. It's a little more intense of a fake-out, and there's a greater chance of someone seeing something real and saying, well, there's no way that's real, that's one of those deepfake videos — which is sometimes as harmful as fooling somebody the other way. I also know there are a lot of positive ways this tech could be used in film and special effects and some other things, but as it stands right now, my biggest concern is the false positive. Not so much that you can tell something's fake, like we all say we can with Photoshop, but that with a much more sophisticated piece of technology — which, as demonstrated here, we could all do for $600 — we're going to be very skeptical of anything we see that seems just a little unbelievable, and we're going to go that direction by default. And I think that's bad.

I agree, this bothers me a lot more than someone saying "that's probably just Photoshop," and I'm not sure how to summarize it quickly, but everybody who works in the video industry knows that with video there are so many more points of failure, right? You can doctor up an audio file easily — make somebody say something they never said. You can doctor up a still image. And now you can doctor up video, as we've seen here. And again, there are still a lot of limitations. Even Lee himself says, my deepfake wasn't that good, there are some artifacts you can see; if I'd spent more time on this it might have had a different result, but here's how I did it rather cheaply and with equipment that a lot of people could get easily. If you had more money and more time, what are the real implications here? I think that this sort of growing public sentiment that everything's fake — what's real anymore, who's fooling us, what's their motivation — this just brings it to an increasingly frightening level.

Well, to calm you down, the good news is it takes a long time to make a good-looking face swap, and the higher the resolution, the worse it'll be. So if you see a wide-shot video, it's more likely to be deepfaked than a close-up of a face. Also, it only does faces, not whole bodies. There are all kinds of telltale indicators right now. So the thing about false positives, the worry about that sort of disillusionment — that isn't as big a worry until down the road, when this gets so good you can't tell. But Photoshop has gotten so good you can't tell, and in fact, I like the point Lee makes in his article: we can fake an email exactly. We can make it look like it came from Sarah Lane and said horrible things. But emails don't ruin careers, because we know that can happen, and yet we haven't lost all trust
in email, because we learned how to deal with it. We learned to go, okay, there's a chain of custody, as Lee says, that we start to look for. We don't rely on just the email, we don't rely on just the picture that could be Photoshopped, and we're not going to rely on just the video. I don't expect that suddenly all videos will no longer be believed in the future. It's a concern, certainly, but it's something that, historically, we get used to. What's more important is teaching people how this works so they can know what the limits are and what the telltale signs are. And that's why you should all go read Timothy B. Lee's article at arstechnica.com and get up to speed on this. Good stuff. Totally agree, really good stuff.

Ars Technica articles show up on our subreddit all the time, as well as many others. You can submit the ones that you care about and vote on others to get them to the top so we see them: dailytechnewsshow.reddit.com. If you haven't already joined, do so. Also, our Discord is a great place to chat with all of your DTNS peers. Link your Patreon account at patreon.com/DTNS and hop on in.

Let's check out the mailbag. Oh, let's. So I missed the show yesterday, and I was particularly bummed to miss everybody's gift picks, but I know that one of Roger's gift picks was a dash cam. He said it'd be something that I would like — there's just stuff you have to think about ahead of time. So Rob wrote in and said: dash cams are great, but even the good ones are only as strong as the SD card that you put in. Dash cams should have a type of card called "high endurance," a special kind of microSD card made for dash cams that has to survive and record inside a car that might get baking hot in the summer and freezing cold in the winter, and still always work. The normal microSD card that's probably in your phone can't handle those temperature extremes or the constant video recording, which means it might fall flat after a couple of months and not save the video of the guy that pulled out in front of you — the video you need to show your insurance company that it wasn't your fault. Rob says, don't ask me how I know that. Oh no, I think we might be learning at Rob's expense here, but thank you, Rob. The best thing to do is actually just reformat that SD card on a regular basis, maybe every couple of weeks or every month, and then at least twice a month check the video to make sure that it is recording properly. Because he's right — there have been a few times where I had something that I thought happened and it didn't capture it.

Shout-out to our patrons at our Master and Grand Master levels, including Dr.
Carmine and Bailey, Mike McLaughlin, Phillip Lees, Frederick Huebner, James P. Callison, Wandy Hernandez, Jonathan Price and Michael Atkins. Also thanks to Scott Johnson. We saved the best for last on that one. Scott, what's been going on since we saw you last?

Well, lots of stuff going on, but I actually have a reverse thing I'm going to do today, and that is: I'm going to ask the really smart, awesome DTNS audience if they have recommendations for — or have found one that they love — an eGPU, an external GPU accelerator. I'm looking at one, man, and opinions are spread all over the internet and I'm tired of it. I want people out of this rad community to ping me on Twitter or something, at Scott Johnson, and tell me what you ended up with, or the one you'd recommend, because man, it's a sea of opinions. So anyway, that's my request to you this fine holiday season: tell me what eGPU to get — enclosure, card, whatever. I'm open ears. You can find me, again, on Twitter at Scott Johnson, or other ways to contact me.

I will not be here the rest of the year. Actually, I'll be here on the special episodes we already recorded, but this is my last live DTNS of the year, so I want to give you thanks. I will be back for a special New Year's Eve Current Geek with Scott Johnson, as part of Ritual Misery and Diamond Club TV's fifth annual Diamond Club New Year's Eve streamathon. We're ringing in the New Year in every time zone with 27 hours of games, live podcasts, talk shows and more, and Scott and I are doing the London New Year. We're also raising money through Extra Life to benefit the Children's Miracle Network Hospitals, so come on, have some fun, help a great cause. The streamathon starts at 4:30 a.m. Eastern, 9:30 a.m. UTC, on December 31st at twitch.tv/DCstreamathon, and around the London New Year you'll hear Scott and me doing a Current Geek as part of that.

Also, blanket thanks to the patrons who have gotten us through our year of Daily Tech News Show. I never feel like I can thank you enough for supporting the show and allowing us to continue to get bigger and better. So here's to 2020 and an even better Daily Tech News Show, thanks to you, the patrons. Thank you, thank you, thank you. Our email address is feedback@dailytechnewsshow.com; we love to hear from you. We're also live Monday through Friday at 4:30 p.m. Eastern, that's 21:30 UTC; you can find out more at dailytechnewsshow.com/live. Happy new year, buddy. This show is part of the Frog Pants Network. Get more at frogpants.com. Hope you have enjoyed this program.