Coming up on DTNS, Amazon wants you to learn machine learning by playing music. China ups its facial surveillance efforts, and how VFX graphics differ between video games and movies. This is the Daily Tech News Show for Monday, December 2nd, 2019 in Los Angeles. I'm Tom Merritt. And from Studio Redwood, I'm Sarah Lane. I'm Roger Chang, the show's producer. And joining us, the host of Comedy Film Nerds, Chris Mancini, back with us again. Good to have you back, Chris. Great to be here. Thanks for having me. We were just discussing grocery stores and how Trader Joe's can get you dates, or at least dances. All kinds of things on Good Day Internet. You need that wider show that encompasses DTNS. If you want to have fun conversations like that, become a member at patreon.com slash DTNS. Let's start with a few tech things you should know. The FBI Portland field office has posted a warning about the risks of buying a smart TV. The posting warns about surveillance by TV software makers through tracking your viewing habits, but also about attackers gaining access to home networks through unpatched smart TV operating systems. The FBI recommends placing black tape over an unused smart TV camera, keeping your smart TV up to date with the latest patches and fixes, and reading the privacy policy to better understand what your smart TV is capable of. A programmer created an open-source algorithm to randomly generate secure passphrases in Welsh, which has the distinction of only having around 700,000 speakers worldwide. So that lowers the number of people that would guess the password by speaking in their native language. According to howsecureismypassword.net and myonelogit.net, it would take 11 quattuordecillion years, or one trillion trillion trillion years, for a computer to crack various Welsh phrases. The programmer warns, though, that it's probably not a good idea to actually use this, since the word list is freely available, along with the algorithm being used.
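The core idea of a word-list passphrase generator like the one described is easy to sketch: pick several random words with a cryptographically secure RNG, and note that the strength comes from the list size and word count, not the language. This is a minimal illustration, not the programmer's actual code, and the tiny word list here is a made-up stand-in for a real Welsh dictionary.

```python
import math
import secrets

# Hypothetical stand-in for a real Welsh word list; the actual
# generator draws from a much larger published dictionary.
WORDS = ["bore", "nos", "coch", "glas", "mynydd", "afon", "cwm", "draig"]

def passphrase(n_words=4, wordlist=WORDS):
    # secrets.choice uses a cryptographically secure RNG,
    # unlike random.choice, which is not safe for passwords.
    return " ".join(secrets.choice(wordlist) for _ in range(n_words))

def entropy_bits(n_words, wordlist=WORDS):
    # Strength depends only on list size and word count, which is
    # why a public word list undermines any "obscure language" edge.
    return n_words * math.log2(len(wordlist))

print(passphrase())      # e.g. "afon coch nos draig"
print(entropy_bits(4))   # 12.0 bits with this toy 8-word list
```

With a realistic list of tens of thousands of words, four words already gives 60+ bits, regardless of whether an attacker speaks the language.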
But Erhuifi and Bakhandarlin. Ah, of course. One of my favorite holiday phrases. UBS and Fomalhaut Techno Solutions estimate that the Huawei Mate 30 handsets are made without any U.S. parts. Huawei cybersecurity official John Suffolk told the Wall Street Journal that all of Huawei's 5G hardware is now also America-free. EU antitrust regulators are investigating how Facebook's and Google's customer data is, quote, gathered, processed, used and monetized, including for advertising purposes. Questionnaires have gone out to both companies as part of a preliminary investigation. AWS announced that it's expanding its Amazon Transcribe service to include support for medical speech. Amazon Transcribe Medical lets physicians dictate clinical notes in real time without needing humans to intervene. Amazon Transcribe doesn't require prompts like comma or full stop. Notes can be fed into EHR systems or other medical language services as well. Amazon Transcribe Medical is HIPAA-compliant and charges based on usage with no upfront fees. Amazon Transcribe Medical is available in the US East (Northern Virginia) and US West (Oregon) regions. That's coming out of AWS re:Invent happening in Las Vegas right now. Here's another story out of there. Amazon announced DeepComposer, a 32-key, two-octave keyboard for developers to use to learn generative adversarial networks, or GANs. It comes with pre-trained models, or you can develop your own. GANs work by having two different neural networks play off each other in an adversarial way to learn whatever it is you want to learn. In this case, with DeepComposer, they learn to compose new and original digital works based on sample inputs. Developers can create music based on a model, tweak it in the DeepComposer console, which happens in the AWS cloud, then generate music. Compositions can be shared on SoundCloud if you want.
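For the curious, that adversarial loop is simple to sketch in code. The toy example below is my own illustration, not anything from DeepComposer: a one-parameter generator tries to imitate scalar data drawn from a normal distribution, while a logistic discriminator tries to tell real samples from generated ones, and the two alternate updates against each other.

```python
import math
import random

random.seed(0)

def sigmoid(a):
    return 1.0 / (1.0 + math.exp(-a))

# Real data distribution the generator tries to imitate.
REAL_MEAN, REAL_STD = 3.0, 0.5

# Generator: shifts standard-normal noise by a learned offset b.
# Discriminator: logistic classifier D(x) = sigmoid(w*x + c).
b = 0.0          # generator parameter
w, c = 0.1, 0.0  # discriminator parameters
lr = 0.05

for step in range(2000):
    real = random.gauss(REAL_MEAN, REAL_STD)
    fake = random.gauss(0.0, 1.0) + b

    # Discriminator update: push D(real) toward 1, D(fake) toward 0.
    d_real, d_fake = sigmoid(w * real + c), sigmoid(w * fake + c)
    w -= lr * ((d_real - 1.0) * real + d_fake * fake)
    c -= lr * ((d_real - 1.0) + d_fake)

    # Generator update: push D(fake) toward 1, i.e. fool the critic.
    d_fake = sigmoid(w * fake + c)
    b -= lr * (d_fake - 1.0) * w  # chain rule: dL/db = (D(fake)-1)*w

print(f"learned offset b = {b:.2f} (real mean is {REAL_MEAN})")
```

In a real GAN both players are deep networks and the data is high-dimensional audio or imagery, but the fool-me/catch-me training loop is the same.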
This joins DeepLens, which is used for photography, and DeepRacer, which is used to make faster cars, which also teach machine learning. Developers interested in using DeepComposer can sign up for a preview. Whenever it's available, you'll get notified. It's not out there yet, though. I wonder about copyright issues when someone says, well, wait a second, this sounds like a sample from my song. It's like, well, it was the machine. It wasn't really me at all. It was two machines talking to each other and learning from it. We're going to have to build a machine lawyer. Yeah, to say, I know what you did, Deep Thinker machine. You GAN, you. No, that's interesting. The copyright going that way wouldn't be too big of a deal, except for, who do you sue? Do you sue the developer? Because that's where copyright law is going to be a problem as more of these are used. Obviously, this is just used to teach machine learning. It's doubtful anybody comes up with some musical composition that's so great that it's worth enough to go after. But eventually, that will be an issue of, if an algorithm creates a work of art, who owns it? A lot of people are trying to work on the legal theory behind that already, before it becomes a problem. Yeah, the last thing we want are rich computers living in mansions. That's ridiculous. The 1% become entirely virtual. Yeah, become the AI. The AI is going to take over in multiple ways. But I think it's an interesting theory, but it's also things that are going on with all technology, like even self-driving cars. I mean, you guys have talked about that. Who's at fault when there's a crash of a self-driving car? I think it's a similar thing. It's a creation issue of, like, all right, well, who created the machine that created the art? Who's ultimately the owner? Yeah, because even now with technology, music technology, which is, you know, vast, you can still point back to the person or people who used the technology to create the work of art.
At what point do you sort of go, well, okay, I maybe started this whole cycle, but I didn't do any of that. Yeah, I pressed the button, but I was using Amazon's pre-trained model. Well, does Amazon have the copyright on it now then? How does that work? Yeah. Well, if you ask Amazon, I think you'd know the answer. It depends on whether they're being sued for infringement or not. Well, speaking of... not speaking of Amazon at all, I was trying to do a great segue and it just didn't work. Facebook began rolling out a tool to let users in Ireland transfer photos from Facebook to Google Photos. The tool is based on code from the open-source Data Transfer Project, which includes Apple, Google, Microsoft, Twitter, and Facebook. Facebook plans for worldwide availability in the first half of 2020. The company also says it's starting with Google Photos and evaluating other services as well. Why was Ireland the original market? I imagine they started in Ireland because that's where Facebook's European headquarters is, and data transfer is one of those best practices they can show they're being good citizens about, because as we mentioned earlier in the show, Europe's going after them for regulatory concerns. So they want to start in Europe to say, like, no, no, data portability, we get it. It's the law now. We're going to make it happen so you can do this. And like I said, that's where their headquarters is, right there on the Liffey. It's a real shame when you see governments really trying to curtail these small startup companies like Facebook. It's really this old David and Goliath thing. Yeah, I started to get really excited when I saw this story, thinking, like, oh, this is great. I'll be able to pull my Facebook photos out, and then realized I don't have that many photos on Facebook because I don't use it much anyway. And all the ones that are on there are on my camera roll, which is backed up with Apple and Google anyway.
So, you know, the photos that you would want to keep don't live exclusively on Facebook. Yeah, I mean, pretty much everything that goes to Facebook is something that I'm posting on Instagram, where I just tap the "also post to Facebook" option. And I go, okay. Like, even if you deleted your Facebook account, you wouldn't lose any photos. I don't think there's anything on Facebook, except there are photos that I'm tagged in that other people have uploaded. And I'm pretty sure those photos only live on Facebook. So that's something to consider as well. Because, you know, people have all sorts of backup methods, or none at all. And it's a good service. Whether or not the company wanted to do this out of the goodness of their heart or was forced to, it's good to know that you've got options, you know, to take a backup of important family photos, which are usually the ones that end up on Facebook anyway. If you are someone who looked at this and said, I definitely needed this, send us a little email telling us why: feedback at dailytechnewshow.com. As of December 1, customers signing up for new mobile phone plans in China can have their face scanned to match with identity documents instead of having to submit a picture. China's Ministry of Industry and Information Technology announced the change in September, which uses artificial intelligence and other technical methods to verify identity. Previously, a mobile plan required you to show your state ID and have a photo taken. So if you weren't mad at the surveillance of this before, there's not much more to get mad at here. They're just making it easier for them to grab your photo. They have streamlined the effort of getting your photo. Yeah, I'll tell you, it can't be more transparent than that. We're taking a picture and storing it. They're pretty much saying that.
Well, and honestly, it's like you had to have a picture taken before, which was a little bit of a clunky process. Now they're like, hey, you can just be surveilled right there in the app. It's so much easier. Yeah, what do we need security cameras for? You've got your own now. Yes, exactly. Why would we spend money, the state's money, setting up cameras when everyone's carrying one around? In all seriousness, this is getting treated as, like, oh, look at this expansion of surveillance. And I think in this case, it's not an expansion. It's an easing of surveillance, but it's something they were doing already. In other China news, according to leaked documents obtained by the Financial Times, Chinese companies like ZTE, Dahua and China Telecom have proposed new international standards on facial recognition, video monitoring, and city and vehicle surveillance to the UN's International Telecommunication Union. ITU standards are often adopted in developing nations in Africa, the Middle East and Asia, where the Chinese government supplies tech under its Belt and Road Initiative. Critics say that the proposals are more likely policy recommendations than technical standards. The facial recognition proposal would store facial data in a central database and suggests its use in public places by police, for verifying worker attendance, and for comparing the country's fugitive library with the local population library. Engadget notes that in June, a standard was accepted for streetlight design that included an option for video monitoring. ZTE and China Mobile proposed that standard themselves. When they say policy recommendation, it makes it sound so much less nefarious. It's like, oh, no, it's just a policy recommendation. I think what the critics are saying is this doesn't belong in the ITU.
You shouldn't try to take a policy, a government policy, and slip it in as a standard, which is actually more nefarious, because they're trying to say it's just a technical spec. What's the big deal? It's not like we're... but the critics are saying, well, wait a minute. No, this is a policy of surveillance. We also had news today that the United States is going to require facial scans at the border for U.S. citizens. Right now, it only applies to non-U.S. citizens when they travel in and out of the United States. So this is something that is continuing to be an issue, something that governments and companies are trying to use more and more, because it's starting to work better and better. It's not perfect, but it's starting to work. And there's a lot of people concerned about how it's used. I personally don't have a problem with facial recognition being used in certain cases, but it's never about the fact that it works. It's always about, yeah, but what else are you going to do with it? Okay, so you scan my face to make sure that my identity is me when I sign up for a new cell plan. Theoretically, I'm fine with that, unless you're storing it in a central database, for instance, that could then be used for other things later. I may be less all right with that. Right. Now, the problem is you don't always know whether or not that's happening. And in fact, you don't, really. Well, because there's no oversight and safeguards on a lot of these programs, which to me is the real issue. I think people get a little bit their-hair-on-fire about facial recognition, and sometimes for the wrong reasons. But the real reason to light your hair on fire, should you want to do that... don't do that, kids. It's a bad idea. But if you want to have a concern, it's a concern about oversight: who's in charge of this? What safeguards do we have? How do we know that you're not going to use it for something else?
And if we don't, then it shouldn't be put in place until we have systems for that. Yeah, I agree. And then if you really want to mess it up, just actually light your hair on fire. It could lead to facial recognition no longer working on you. Yeah, you wouldn't look like you at all. Not sure it's worth it, though. Here's one of my favorite stories of the day. We haven't mentioned Cyber Monday on this show, because most of the news out there is about deals that, by the time you hear this show, you probably won't be able to get anymore. So there's always a little bit of difficulty on Cyber Monday finding other news out there. But this is a gem. NPR reports that dairy farmers in Massachusetts are using machines from Vanguard Renewables to generate electricity from food waste. It's a little bit of a roundabout process. So these are, you know, farmers who are selling their eggs and their cheese and their milk to grocery stores in some cases. And now the grocery stores are sending the food waste back to them. In Massachusetts, it's Whole Foods. Whole Foods stores are taking food waste that can't be used by food banks. So they try to donate it to somebody who can eat it first. But at a certain point, you just can't. And they put it in an industrial-strength grinder. The resulting slurry is loaded on trucks, which deliver it back to the farmers, who feed it into anaerobic digesters. Farmers also get food waste from sources other than grocery stores: creameries, breweries, and juice plants are some examples, where they take food waste that can't be consumed and send it to the farmers. And then the waste is heated, releasing methane, which is captured and used to run a generator. Farmers say, in general, they use around 10 percent of what is generated to power the farm, and then they feed the rest back into the grid. And byproducts from the process can be used as fertilizer.
Thousands of digesters are in operation in Europe, and Vanguard hopes to expand their program beyond Massachusetts in the United States. Now all anaerobic digesters are on sale for Cyber Monday. Vanguard has a 50 percent off sale. Now's the time to make your own farm, everybody. This is cool, though. I was asking Tom and Roger before the show, okay, well, we're talking about Whole Foods. Amazon owns Whole Foods. Amazon's also expanding into the grocery market pretty substantially outside of the Whole Foods brand. Is this something that we're going to be seeing more and more of? And from what I can understand, it is a great way to stop a lot of the food waste that's happening now. It's cool. It's a great way to recycle, for sure, because all this stuff used to be just thrown away. Yeah, it's a good example of taking the resources we have and, instead of just plowing them into a landfill... I guess we're plowing them into a farm in this case, but using it to generate electricity, which is good. And like you say, Sarah, Amazon... I actually tried out Amazon Fresh for the first time this morning. It's the service that you now get for free with your Prime membership if you have a minimum order of, like, $30, $35. And I accidentally ordered six bunches of bananas instead of just six bananas. So read the interface carefully. You're going to be potassium-high. Amazon Fresh and Amazon's Whole Foods together are going to have a lot of food waste. And I could see Amazon, even if they just use this themselves to generate electricity for their own warehouses, it's a way to engender sustainability. It would be even better if they're able to give this to farmers to use in a Vanguard-like program. You know, after the recent fires in Northern California, which... luckily, no one I know was deeply affected, but a lot of people were displaced. I did a volunteer night with World Central Kitchen, and they were doing a big old feed at the fairgrounds.
It was super fun, and I was happy to be part of it. But I learned a lot from the volunteers who have done a lot more of this than me. The food that was made that night was made by local chefs. I mean, it was good food. It was like a gourmet kind of thing. But maybe a third of it was eaten, because we just made way more food than there were people there, you know, to be safe. And there are very strict rules about where that food can go afterwards, meaning you can't even really drop it off at a shelter, because it's been in a different kitchen, and there are just, you know, health department restrictions. So this seems like it's going in the right direction overall. Well, I think 10 years from now, you know, a choice is going to be made whether, all right, are we going to use Amazon's anaerobic digesters or Soylent Green? Which is a good spoiler. So yeah, no, I won't, I won't. Folks, if you want to get all the tech headlines each day in about five minutes, be sure to subscribe at dailytechheadlines.com. All right, folks, Chris and I were emailing earlier this month about the fact that you saw Midway, and that wasn't the point of the conversation, but it spurred some observations about visual effects in movies and video games. It really did. And as someone who loves movies and video games, you know, I've always been interested in seeing how each medium progresses. But when I see crossover in the wrong way, for movies or video games, it really kind of misses the point. Like, Midway was a big giant spectacle of explosions, and, you know, it's a Roland Emmerich film, so I got that, I expected it. But we got to the point in that film where the actors were secondary, like I was watching one cut scene after another from a video game.
To the point, too, where with the visual effects, you could see the assets being reused. At some point it's like, oh, that's a plane that hit the deck and then went into the ocean from an aircraft carrier. And when you see that same kind of sequence multiple times, you realize, well, these are just art assets being reused to fill time. And one of the things I always notice, and I always make this point when I'm talking about movies or even video games on the podcast, is that each one, a video game or a movie, invokes a different emotion. And that emotion is very critical to the actual medium. When you're playing a video game, it's an immersive experience, whereas when you're watching a film, it's more passive. So if you use techniques incorrectly from each one, it actually takes you out of the experience and makes you more detached from it. Like, if the visual effects were more seamlessly integrated with the performance of the actors in the film, you would be more emotionally attached. And if you have cut scenes in a video game that are designed to keep you as part of the action, then you're more immersed on the video game side. But when you have long, detached cut scenes in video games, it detaches you. And when you have long scenes in a movie that look like cut scenes from a video game, which you're completely detached from, then it removes you as well. So I think sometimes everyone gets drunk on the technology, like, just bigger, bigger, faster, more. And that's not really the case; the medium has to be respected first. So especially with a movie like Midway... and I remember everyone's comparing it to Pearl Harbor. Yes, that's not incorrect. But you see the difference in visual effects, that kind of one-upping of, like, well, you know, now it's Midway, it's Roland Emmerich, it's got to look insane. And it's got to be full of explosions and giant things happening, when that's really not the case.
I mean, if you had a really cool story that was grounded in the characters, and then the visual effects were layered on top of that, it would have been a much more, you know, emotional experience as a film. And it wasn't. Well, two things strike me out of that. One is, with the Battle of Midway, I always learned, from both my father, who was in the Navy, and in history class, which was taught by a junior high teacher who was in the Navy, that the biggest thing about Midway was the ships never saw each other because they were too far away. It was fought entirely in the air. And I feel like they may have missed that. I never would have gotten that from the film. Right. Is that sort of detachment a result of doing that? But the second thing is, I'm curious, because I haven't seen the movie: when you say cut scenes, are the things that look like cut scenes in Midway scenes that just don't have humans, or if they have humans in them, they're not the center, right? They're just sort of being blown up or screaming or something. Absolutely. But it's also more the spectacle of it, where everything is a close-up on an explosion, where it doesn't have that grounded feel to it. A good example is, say, a war game like Call of Duty, you know, where the cut scenes still kind of immerse you in the characters actually in them. Whereas Midway, and this is how much it missed the mark: Midway was actually based on certain real-life people. Like, you know, at the end, they always have the credits, like, this happened to this person, whatever. But the movie was so detached from any kind of emotion or attachment with these characters that when you saw the cut scene, it actually made those characters seem unbelievable. Like, oh, these are made-up characters.
And like, oh no, these are real people. But the way the movie is presented made them actually feel like they weren't even real, when they actually were. The other thing I've noticed, and this isn't even a VFX thing so much, is writing in shows and movies sometimes feeling like a non-player character speaking to you. When you have a cut scene with an NPC that says, we're either going to have to go up the hill, or we're just going to have to bail out, you know... like, oh, that's bad exposition, but it's on purpose, because I have a decision I have to make as a player. But I'm sorry, I see that creeping into movies and TV shows, too. Oh yeah, for sure. And we call those the exposition characters. It's like, well, your sole purpose is to specifically move the plot along in a very inorganic way. It's like, well, we can't go down that road because, you know, there's an ambush. So we'll have to go down this road. And then, you know, I'm detached because I had a terrible childhood, and this is why I can't connect with anyone emotionally. You know, when characters say that, it kind of ruins the point of, I don't know, storytelling. Well, one of the key mantras, at least that I've heard from people working in the industry, is that the point of any effects, whether it's visual effects, practical effects or anything, is that it shouldn't call attention to itself. It shouldn't say, I have a special effect. I'm a visual effect. It should be as if the world was totally organic, to kind of add to that verisimilitude where, oh, I totally believe I'm in the world of Marvel Comics and, you know, aliens are coming down and the Hulk is breaking through stuff. It shouldn't look like, as you were saying, a video game where, well, okay, here comes the cutscene exposition, leading on to the next mission. Okay, where do I press go? Let me play.
That's a great point, because especially with Midway specifically, it felt like you were getting all the visual effects from either a video game or even, like, a Marvel movie with an alien invasion. So nothing looked grounded or realistic or any of it. I mean, it was all just about, how do we make it bigger, and how do we make it more explosionary? I don't know if that's a word. We want explosions, not exposition. Is that what you're trying to say? But stuff like that really struck me, because I felt like I was watching Midway looking for a controller in front of me. Like, right, well, the cutscene is over, and I'm like, okay, well, now what do I do? There's a dialogue tree with these wooden actors that I can't see. Really, as these both have evolved, video games and films, we've seen the way effects can be layered. There are video games that do it absolutely beautifully, everything from, like, The Last of Us to the last Call of Duty, which got mixed reviews, but I felt like that was a really immersive World War II game. But the filmmakers should be learning lessons from film, not from video games, and vice versa, because they're two different experiences and two different emotions. I remember when CD-ROMs first started, and that was one of the things that they did because the technology was new. Myst used it beautifully, like, we're using full-motion video and putting it together. We're using the technology to tell a story specific to this medium; like, no one's going to go to the movies and watch Myst. It was used perfectly, but then there was a rash of CD-ROM video games where all they had was full-motion video with cheap actors and sets and stuff that they could get away with very cheaply. And then you would just go around from room to room to trigger the next scene. I felt like that was, like, well no, you guys have learned the wrong lesson here.
And I think that's kind of what we're seeing with certain filmmaking. And you could really look at the Marvel movies, where there's that great mix of, like, oh, this spectacle belongs here, but it's also grounded with the characters. Indeed. Well, thanks everybody who participates in our subreddit. Lots of stories just like the one we talked about are kicked around in our subreddit every day. You can submit your own and vote on others at dailytechnewshow.reddit.com. Join in on the conversation in our Discord as well, which you can join by linking to a Patreon account at patreon.com slash DTNS. All right, let's check in with Chris Christensen, who has a social tip for travelers to help each other get deals on electronics. This is Chris Christensen from Amateur Traveler with another Tech in Travel Minute. I'm not sure if you've heard of the Grabr app. That's G-R-A-B-R, at grabr.io. But Grabr is an interesting app. You've run into those things where the iPhone goes on sale here, but you're in a different country, so you can't get it, or, gee, I loved those things that I bought when I was in Italy, but I can't get there and I can't find them online. Grabr is an app that connects shoppers with travelers, and the idea is that you can shop for someone, getting the thing you can only get in your country, because you happen to be traveling to their country, and you can bring it to them and they will pay you for it. You register what trips you have, and other people register what stuff they need, and the app tries to connect you. But you have to get over that whole thing of shopping for other people with your money, and I'm not sure I can. I'm Chris Christensen from Amateur Traveler. I'm with Chris there. I would want someone to do this for me. I'm not sure I'd be the person on the other end buying the thing for them. Right, and then they just don't show up at that place that you had predetermined earlier. You'd both be apps.
Like any site, they've probably got systems in place to refund you if that sort of thing happens, right? Otherwise, they'd be gone in a second. But it does make me a little nervous. But if you've got trusted folks around the world, hey, it seems like an interesting way to do stuff. Hey, shout out to our patrons at our Master and Grand Master levels, including Phillip Shane, Jeffrey Zilx, and Paul Reese. Also, thanks to Chris Mancini for being with us today. Chris, I think we decided it'd been about eight months since you were with us. Too long. Thanks for being back. What's been going on with you, and how can people keep up with your work? Well, we have the final Comedy Film Nerds show, which will be December 12th at the Hayworth Theater, and Tom will be a guest on it. I can't wait to be there. But the main thing I have right now that only has a week left, that I really want to tell people about, is my new Kickstarter for my graphic novel, Rise of the Kung Fu Dragon Master. And it is a really fun kind of action comedy, like Big Trouble in Little China, and like 80s buddy comedies, Army of Darkness. And it's kind of a follow-up to my first graphic novel, Long Ago and Far Away, which was actually helped along by Daily Tech News Show viewers and cord cutters. Tom has one right there. And this will be put out by Starburns Press. Starburns does Rick and Morty, Animals, and Anomalisa, so they're an animation house. They have a print division, but we need to get it funded first. So we've got until December 10th, and we need help for sure. And you can go to Kickstarter and type in Rise of the Kung Fu Dragon Master, or you could just go to ComedyFilmNerds.com and click a link there. You could watch a video, check out all the rewards. Maybe you just want to grab the digital book or a physical book, but there are also cool rewards, too. If you're an animation fan, you could get a tour of Starburns and see how they make animation.
You could be on the last Comedy Film Nerds show with Tom. And I will say, you know what? If it's a Daily Tech News Show listener, I will sit them next to you. Fantastic, as it should be, yeah. And so there are a lot of cool rewards, but, you know, Kickstarter is always a ticking clock, so we only have till December 10th, and we've still got a fair amount to go, because it is a big 160-page book, so we'd love to get as much help from the viewers as possible. Yeah, Long Ago and Far Away was really, really cool. If you haven't checked it out, you should take a look for it as well. But first things first is to get Rise of the Kung Fu Dragon Master funded. So head on over to Kickstarter.com, folks, and make that happen. Maybe you'd be sitting next to me at the last Comedy Film Nerds. It could be amazing. Go check that out. We also have new Patreon reward merchandise to celebrate six years of Daily Tech News Show. Len Peralta created a six-year anniversary DTNS logo. And if you back certain levels at Patreon.com slash DTNS for three months, you'll get a sticker, a poster, a mug, or a t-shirt with that logo on it. Get the details at Patreon.com slash DTNS slash merch. And of course, you can always support our show at any level at DailyTechNewsShow.com slash Patreon. You can also send us feedback. In fact, we love it. Feedback at DailyTechNewsShow.com is our email address. We're also live Monday through Friday, 4:30 p.m. Eastern. That's 2130 UTC. And you can find out more, and tell a friend: DailyTechNewsShow.com slash live. Back tomorrow with Justin Robert Young. Talk to you then. This show is part of the Frog Pants Network. Get more at FrogPants.com.