 Coming up on DTNS, Amazon wants to compete with cable TV, the scoop on whether giving up your phone passcode is self-incrimination, and the first mistaken arrest involving facial recognition. This is the Daily Tech News Show for Wednesday, June 24th, 2020. In Los Angeles, I'm Tom Merritt. And from Studio Redwood, I'm Sarah Lane. In Salt Lake City, I'm Scott Johnson. I'm the show's producer, Roger Chang. Folks, we were just talking about old-timey baseball cards, five and dimes, and all kinds of good stuff. Get that wider conversation. Get our expanded show, Good Day Internet. Become a member at patreon.com. Let's start with a few tech things you should know. On Wednesday, Slack announced Slack Connect, so up to 20 organizations, if they're paying anyway, can communicate with each other within a single Slack channel. The company says that Connect will come to unpaid tiers in some form in the future as well. The company also says it's been working on Connect for four years. Enterprise Key Management using Connect will be available this summer. It should have been Connect with a K. Olympus announced it will sell its camera business to Japan Industrial Partners (JIP), which previously bought the VAIO computer business from Sony. They will continue to develop products under Olympus brands such as OM-D and Zuiko. Olympus's main business will remain medical equipment, so they're still really focused on that. The deal is expected to close later this year. Microsoft released a public preview of its Defender antivirus software for Android. Only for commercial customers, though. Microsoft Defender ATP (that's Advanced Threat Protection) for Android will show up in companies' dashboards with the option to deploy it to employee devices. Defender for Android can scan for malicious apps on the phone and detect and block malicious websites. And infected phones can be blocked from accessing company resources until they get fixed. 
Microsoft Defender ATP for Linux also exited preview today and is now generally available. Brazil has suspended WhatsApp's right to operate mobile payments just a week after the feature launched. Mastercard and Visa, which are used by WhatsApp to handle payments, have been asked to suspend money transfers on WhatsApp. The central bank suggests it did not get to analyze the WhatsApp system prior to launch. WhatsApp Pay launched in a limited test in India two years ago and is also available as a test in Mexico. Brazil is WhatsApp Pay's first nationwide rollout. Good news for all the Minecraft players out there: NVIDIA released a driver update to support Microsoft DirectX 12 Ultimate. A key component of DirectX 12 Ultimate is ray tracing, which is supported on NVIDIA's RTX line of cards. So your RTX card now has something to do, if you're a fan, I guess. The new NVIDIA drivers also support hardware-accelerated GPU scheduling in Windows 10, which lets the video card manage its own memory for improved performance and reduced latency. GM launched an OnStar app for Android and iOS in Canada and the United States called Guardian. When an active OnStar plan is available, you can have up to eight people use it, accessing safety features whether or not they're in the car. That includes emergency advisors, location sharing, roadside assistance, and mobile crash response, which can detect a collision using a phone's motion sensors and automatically connect you to OnStar. The latest version of Opera now has Twitter baked into the sidebar of the browser, so you can access your main feed, search, and DM people without opening a website or an app. Opera says it's the first major browser to do this. It also previously added Instagram to the sidebar, and has built-in access to Facebook Messenger, WhatsApp, Telegram, and VKontakte. Google announced new default data practices for new users. 
In a blog post Wednesday, CEO Sundar Pichai wrote that automatically deleting web and app activity after 18 months will now be on by default for new users. Users can also still change it to three months if they want. Google's Location History is off by default, but even if turned on will default to an 18-month deletion schedule. Google is also letting iOS users switch to incognito mode with a long press on their profile picture. That's coming to Android and other platforms soon. Alright, let's talk a little more about that Amazon thing. Alright, let's do it. Protocol's sources say Amazon is "actively pursuing," that's the quote, deals to license live linear TV programming. Think of it as the opposite of on-demand. Job listings indicate Amazon may add live programming to its Prime service. That's Prime Video. One job listing mentions concerts, political debates, and news in addition to existing streams of live sports. Other listings mention live TV broadcasters and cable networks as potential partners. One listing says, quote, "We are building next-gen linear catalog systems to provide a best-in-class linear TV experience to Prime Video customers," unquote. Well, you know, I pay for YouTube TV, which is the cheapest version of cable out there that feels like a true cable package, with all the usual suspects as channels. When YouTube TV launched, I was like, seriously, $50 for all this? That's great. So if Amazon builds something like this and, as a Prime member, they can undercut what I'm already doing with YouTube, then I'd be pretty excited about it. If it was anything more limited, it would be of little use to me. Yeah, I'm wondering what they're going to do. It'll be interesting, because it could be something like that. It could also be something like Pluto TV, where it's just a lot of linear streams of video from big brands, but not necessarily the same thing you would get on cable TV. 
I would feel like if they did that, it would be free with Prime Video. It would just be another way to get you to use Prime Video, because you'd have all this fun stuff. If they're going to do a cable TV replacement service like YouTube TV, that could be a big deal, because we are now years away from when those services first launched. When Sling TV, PlayStation Vue, and YouTube TV all first launched, they had to strike weird deals to get cable networks to agree. Because the cable networks didn't want to undermine the cash cow that was cable television. But cable television is no longer a cash cow, or if it is, it's a very skinny cow compared to what it used to be. So Amazon might be able to get better deals and maybe create a more comprehensive, and therefore competitive, service. I don't know. It's possible. What I hope... I like Pluto TV for a bunch of reasons. Mainly, though, it's my own weird reasons, because I like curated weird channels. I like channels that say, hey, we're called Funky Horror from the '60s, and all you're going to get are 1960s horror movies. And it's just nonstop all day, commercial-supported, but there they are. I like that kind of thing to get sort of lost in. They have animation channels like that and so on, but they don't have any kind of big mainstream stuff. Little channels here and there, food networks, some news channels, but for the most part it's these kind of weird, sort of nichey things. And I like that. I'd like more of the mainstream stuff too. So maybe Amazon will do both of these things. Maybe they can bring these two worlds together and I can get linear TV as an option, hopefully with my exact same Prime subscription that I'm already paying. That would be cool. In fact, that's kind of crucial for me in this case. And I will buy into it, even if it's commercial-supported. I will watch whatever they bring. I just want it to be full, with lots of options, and also just stick a little weird stuff in there if you can. 
Well, this next story is a bit of a head-scratcher, but let's go with it. Three U.S. Senators introduced a bill that would require tech companies to comply with lawful access to encrypted information, even if that information was end-to-end encrypted. End-to-end encryption like that used in WhatsApp makes it impossible for anybody but the users to access data without somehow cracking the encryption. While not mandating how this would actually be accomplished, the bill would require companies to comply with warrants that request user data no matter how it was encrypted. The bill would also create a competition offering a prize to anybody who develops a way to access encrypted data while still maintaining privacy and security. Yeah, so this is another take on making a backdoor without mandating a backdoor. The EARN IT Act also does a similar thing. That one was related to Section 230, but this one's not. This one's just saying, hey, we're going to make it the law that if law enforcement comes to you with a warrant, okay, they've got the right to data, you have to give them that data. No excuses. You don't get to say, I'm sorry, it's end-to-end encrypted, we don't have the key to it. Which is basically saying, don't employ encryption, or employ broken encryption. It's just impossible to offer fully secure encryption and also allow anybody to access it, because as soon as you allow anybody... Yeah, then it's not truly encrypted. I mean, it's easy to just be like, wow, these people just have no idea how encryption works, and that may be true. But it sounds to me more like just saying, listen, we respect encryption and privacy is really important, but we don't want you to use end-to-end encryption anymore. So let's create a fun prize for somebody who can figure this out. Because what it really means is, we just don't want you to use encryption the way that it's being used well. 
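To make that technical objection concrete, here's a toy sketch in plain Python. This is nothing like WhatsApp's actual Signal-protocol implementation; it's just a one-time-pad-style cipher used to illustrate the point that any "lawful access" copy of a key is exactly as powerful as the original, so there's no such thing as a key that only works for the good guys.

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # One-time-pad-style XOR: the same operation encrypts and decrypts.
    return bytes(b ^ k for b, k in zip(data, key))

message = b"meet at noon"
key = os.urandom(len(message))   # secret known only to the two endpoints

ciphertext = xor_cipher(message, key)

# Anyone holding the key recovers the plaintext...
assert xor_cipher(ciphertext, key) == message

# ...so a mandated escrow copy of the key is indistinguishable in power
# from the original. Security rests entirely on who holds the key.
escrow_key = key
assert xor_cipher(ciphertext, escrow_key) == message
```

The whole scheme's security lives in the key, which is why "give law enforcement access without weakening encryption" amounts to asking for a second key holder, which is itself the weakening.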
The other response to this has always been, there just isn't a technical way to do what you're asking. And the response from a lot of politicians has been, well, you're smart, you can figure out a way, you're just not trying. So we'll give you a prize. Look at what you've already built. You're a genius. It's like offering a prize to somebody who can make a ball not fall to the ground when you drop it. You can come up with ways to fake it, but it's no longer just dropping a ball anymore. I mean, you can come up with ways to fake this, but it's no longer encryption anymore. Yeah, part of me wishes they would just say, we want backdoors into all this sort of stuff. We don't like encryption, so stop encrypting things. Just be straight up about it, because that's what they're asking for. They're not asking for somebody to figure out a way to make it encrypted and not. That's the answer they wanted. Right. Exactly. So just either ask for that and be turned down, because the answer is no, or quit asking. Because this isn't very well veiled. It sounds stupid. And when you guys read it earlier, I just kind of had to wrap my head around it and go, really? Are we okay? It's asking for an impossible thing. Possibly. Yeah, this is just a bill. It is not even being voted on at this point. So don't get too concerned yet, but we wanted to put it on your radar. Another interesting thing happening in the world of governments: in the United States, the Indiana Supreme Court ruled that the Fifth Amendment's provision against self-incrimination does in fact protect a woman accused of stalking from having to unlock her iPhone for police. This is not the last word on this, if you're like, oh, good, now I don't have to worry about having to unlock my phone. Courts across the U.S. have ruled differently, and therefore different regions have different protections. And the U.S. Supreme Court hasn't weighed in on one of these cases yet. I think we're waiting for the right one to come along. 
The U.S. Supreme Court precedents pretty much predate smartphones. The short version is, they rely on what the suspect has and what you can prove they know. That's how the courts are determining whether you can force someone to unlock a phone or not. The closest analogy to a phone is a safe with a combination lock, right? Where you know the combination, but it's locked. Unless you tell someone the combination or you open it yourself, it won't be unlocked. And the rough rule is that someone has to prove to the court that the suspect knows the combination and knows what's inside the safe before they can be compelled to open it, because opening it would produce documents. The Fifth Amendment says you can't be compelled to testify against yourself. And so the courts say, look, if you can prove they know the documents are in there, you can force them to give you the combination, because that's not testimony. That's just having them hand over the evidence. But if you can't prove that the suspect knows either the combination or about the documents, then compelling them to open it would be testimony, because you're asking them to say something, or come up with something that they know. You want them to show knowledge of something. So if you can't prove they know it, but you compel them to do it, that would be asking them to prove that they know it. That would be testifying against themselves. Does that make sense? "The contents of a person's mind" is the phrase you hear used to describe that. You can't have them give you the contents of their mind. But with phones, it often comes down to the idea that you can prove they know the way to unlock the phone, because you've seen them unlock the phone. That's pretty easy to prove. But you might not be able to prove they know what documents are on it. You might think there might be something incriminating on it, but can you prove that they know it's on there, or that you know it's on there? 
The way you would with, like, well, we know the documents are in there, we just need the combination. Now, some courts say you only have to prove the suspect knows the code, because producing the documents isn't testimony. Since you're not asking the phone's owner what files are on there, you're saying, we know the files are on there, we just need the code. So they say the Fifth Amendment does not apply. But other courts say that the police must be able to prove they know the documents are on the phone. Not just say they know, but prove they know. Otherwise it's what's called a fishing expedition, where you're like, I want to open this up and see what's inside; maybe I'll find something relevant to the case. And in that case, that would be forcing self-incrimination, because if the person knows there's something on there but you can't prove it, forcing them to open the phone is making them incriminate themselves. Yes, it does make sense. I mean, all of this goes away if, let's just say, a cop says, give me your phone, I want to see what's on there, you need to unlock it. And if I make the decision right then, well, yeah, I have nothing to do with whatever they're looking at, I'm going to unlock it and give it to them. All this goes away, because I've consented for you to be in my phone. It depends. There could be cases where you could say I was forced into consent or intimidated. But yes, by and large, if you say, I will unlock it and show it to you, it probably doesn't apply. Kind of like when they don't have a warrant and they want to look in your car, and you give them permission. Right. That's a whole different situation. But the Indiana court ruled that unless you can prove that they know the files are on the phone, you can't force someone to unlock it. 
Well, in the Indiana case, and some other cases where the courts ruled the same way, I mean, it would be really hard for law enforcement to prove they know what documents are on the locked phone of someone accused of a crime. That's a tough one. Yeah, you'd have to have, like, an email from somebody else that says, here's the files I sent, you know, and you could say, look, we know they're on the phone, because here's the email where he said he sent them. Right. Yeah, like that. Yeah, or like you've been videotaped and we saw the files over your shoulder. It's pretty specific. Very specific instances. Exactly. Well, we've talked a lot about Apple this week on the DTNS show, as you guys know, and the move to ARM designs for its chips. We have not talked a lot about how much better they're going to be. Apple would have you think from their keynote that everything's going to be better, but we just don't know. Apple didn't announce a new chip for Macs, and therefore did not publish its usual charts and benchmarks. They just said Macs would be moving to them. They are demoing macOS on an A12Z Bionic, which you'll find currently in the most recent version of the iPad Pro. The most specific Apple gets is to say that the Mac with ARM will have industry-leading performance per watt. They said that quite a bit. What Apple is not saying outright is that the A12Z Bionic would make a more powerful MacBook than, say, the low-power Intel laptop chips that they currently use. Apple did say ARM will give the Mac higher-performance GPUs, which is interesting. From my point of view, I could not get this out of my head when they started talking about it, because on the one hand I was like, oh, I remember Rosetta. I remember the PowerPC transition. I hated that time, and it sucked. 
And I know you guys have talked about that stuff pretty thoroughly, but I actually got a little excited, because my iPad Pro is a workhorse and performs really well at the tasks I need it to do. Much of mine are art-related, and some other things. But these are the kinds of programs that tend to chug even on high-end PCs and Macs, and they don't chug on my iPad Pro. Again, you're working toward a unified platform, not dealing with a lot of driver issues and that sort of thing. That plays a role, certainly. But if what they're saying is that kind of usability, or, I guess, potential power, in the kinds of apps I care about is moving over to the Mac line, to the desktop, to the laptop, wherever it may be, then I'm more bullish on this than I thought I was. I'm actually kind of excited to see how that translates. We just don't know enough to say how that will impact every other kind of use case you may have for desktop. So, so many questions to answer. Yeah, and like usual, Apple being secretive leads people to try to fill in the gaps. And if you fill in the gaps with what they've given you, performance per watt could mean it's just really battery-efficient, but not necessarily more powerful. And we've seen that before. In fact, the early Intel Core M processors had problems where they were battery-efficient, but they weren't terribly powerful. It took a couple of years for them to milk a lot of the power out of them. That doesn't mean that's what's happening with the A12Z Bionic, but it does sort of leave you thinking, well, if all you're saying is performance per watt, that tells me something. Which it may or may not. Apple may not be willing to say the other part of it, because they're being Apple and they're being secretive. And I would love to know. I would love to know what a Mac Pro, a six-to-seven-thousand-dollar base price Mac Pro, looks like with these chips in it. Yeah. 
I mean, do they even do that? They may stick with Intel for the high-end stuff. I don't know. People have designed ARM to work in a supercomputer, so you could certainly, theoretically, design it to work in a Mac Pro. Yeah, I'm really curious about that. Jess Whittlestone at the Leverhulme Centre for the Future of Intelligence at the University of Cambridge and her colleagues published a comment piece in Nature Machine Intelligence this week arguing for improvements in AI ethics. AI ethics today often responds to problems but doesn't anticipate them, and that can cause delays in adoption. A recent example would be using AI to assist in patient triage, which would affect which patients are highest priority. If you don't have a built-in ethical framework, many institutions say, we don't want to be the first ones to try this. There are just some issues there. People with AI expertise need to think through the implications of their work, say the researchers, and work with people who understand ethics at all levels, right through implementation. Yeah, I thought this was really well put together, because what they're saying is, you need to think of ethics during the design. You can't just put something out there and wait for an oversight board to go, well, hold on, that's not very ethical, or worse, wait for it to cause a problem in the workplace. Which is why your example is really well put, Sarah. If you're triaging patients, you don't want to try out the ethics of it and have someone tell you later that the ethics weren't good. You want the ethics to be hardened. And I think that's what Jess Whittlestone and her colleagues are saying: let's try to make this be something we can put out with the product and say, the ethics are built in. We know that this doesn't have biases, or if it does, we're transparent about them, and we know that it shouldn't be used in these particular use cases because of that. 
Well, this article inspired a thought in my head, which was: if you include a plan for the ethical implementation of your AI tech, if that's part of the process like this is suggesting, won't that improve the rate at which you implement AI systems in the wild, and therefore better iterate on them in the lab, and therefore move them out into the wild again? Because then you won't have institutions that are afraid of it. They'll know that there's a plan. You've come in with this thing saying, we know all of these things are issues or potential issues, so we've done this, this, this, and this. And so implementation will be simple and easy, and you're protected because, blah, blah, blah. Like, all of that seems prudent. They should just do it. Yeah, and that's what Whittlestone is arguing too. She's like, look, you'll be able to adopt things faster, you'll be able to iterate faster. She's absolutely right. Hey, folks, if you want to get all the tech headlines each day in about five minutes, be sure to subscribe at dailytechheadlines.com. There's a technology out there that I bet would have benefited from having ethics built into it from the beginning, rather than putting it into the hands of people and finding out, oh, wait, there are problems with it. And that technology is facial recognition. In fact, Boston has joined cities like San Francisco; Oakland, California; and Cambridge, Massachusetts in prohibiting the use of facial recognition technology for most municipal uses. They made exceptions for things like unlocking your phone, or using facial recognition to preserve privacy, in which case the software is just removing faces. It doesn't have to tell you who they are. Basically, if it's just looking for faces and not identifying you, you can use it. Cities are banning facial recognition over fears of its misuse. All facial recognition technology has some rate of false identification. 
And that rate is higher among people with darker skin, since machine learning is often trained on data sets of predominantly lighter-skinned people. In fact, we now have an example of that in the real world. The ACLU of Michigan lodged a complaint in Detroit Wednesday alleging that a Black man spent 30 hours in custody after facial recognition software mistakenly matched him with a shoplifting suspect. He is no longer suspected of being the shoplifter, but he had to be under arrest for 30 hours before they realized they had the wrong person. The match was made with facial recognition from the Michigan State Police's digital image analysis section, which uses software from Rank One Computing. Not Microsoft, not Amazon, not any of the big ones like IBM, who are all saying, well, we're going to hold off. It's somebody else. Michigan State Police and Rank One guidelines do say that arrests should not be made on the basis of facial recognition. And it's unclear at this point if the police had other evidence. Again, other evidence that was wrong, but other evidence that may have given them reason to suspect this man. But the ACLU says it doesn't matter. They shouldn't have been using facial recognition in this arrest, because it wasn't him. The worry I've always had about this sort of thing is, if they make it too much of a factor in the arrest, too much of the reason why you're arresting them, or too much of the burden of proof is on the match that they got from facial recognition, then what would happen is, I don't know, some actor who looks like me would get in trouble for doing a terrible thing, and then I would suddenly be in a database because I kind of looked like that guy, or they got a false positive because I look like him. I was going to use an exact example, but I'm not going to. I don't want to get myself in trouble, but there's an actor I look like, is all I'm saying. 
Anyway, the point is, yeah, I don't want that to happen. I mean, these sorts of things, as they get developed, make sense as, like, supporting arguments, right? Like, if you've got other evidence that points to the thing, and you say, well, and also there's the facial recognition. It's not the chief evidence, but it's this other thing. We've got to figure out where that line is. Yeah. I mean, at the very least, yeah, supporting something else. The fact that the police in Michigan, and also Rank One, which makes the software, have guidelines saying this shouldn't happen based on facial recognition alone... there may be other, you know, parameters here, and like you said, Tom, that's not necessarily known. But what if it was just facial recognition? I mean, you can't just have a guideline saying, probably shouldn't do that, it's not enough information. It should be: you legally cannot do that, and if you do, you're in trouble. Yeah, I mean, if the case is they just used facial recognition to bring the guy in, then it becomes a question of, why didn't you follow the guidelines? Should the guidelines be law? That is one aspect of this conversation. If they say, look, facial recognition just pointed us to the guy, and then we found out he was in the store that day, he was driving a car that fit the description of the shoplifter's, you know, it was just a case of mistaken identity because he fit the description, could have happened to anybody... that's still a problem, because my guess is, if it weren't for facial recognition, they wouldn't have bothered this person. They wouldn't have made him stay in a jail cell for 30 hours. They wouldn't have arrested him in front of his wife and two little daughters. Right. You can say, well, okay, but how often does that happen? I think we're all very clear that for Black people, it happens all too often, and having another instance of it certainly isn't helping. 
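The base-rate math behind mistaken matches is worth spelling out. The numbers below are purely hypothetical (they are not Rank One's or anyone's published accuracy figures), but they show why even a seemingly tiny per-comparison false-positive rate produces a crowd of innocent "matches" when one probe photo is searched against a large photo gallery:

```python
# Hypothetical figures, chosen only for illustration.
gallery_size = 50_000        # photos searched against one probe image
false_match_rate = 0.001     # 0.1% chance any single comparison wrongly "hits"

# Each of the 50,000 comparisons is a chance for a false hit,
# so the expected number of innocent matches per search is:
expected_false_matches = gallery_size * false_match_rate
print(expected_false_matches)  # dozens of innocent "matches" per search
```

And if the actual shoplifter's photo isn't in the gallery at all, every hit the system returns is, by definition, a false one. That's the scenario a guideline saying "don't arrest on facial recognition alone" is supposed to guard against.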
Yeah, to have been somewhere, you know, at a, I don't know, coincidental time, and maybe driving a similar car, and that sort of thing. It's like, okay, well, suspects get questioned all the time. And, you know, if they're not the right person, or it's deemed that they aren't guilty after all, that's something that we're all very used to; it happens all the time. But yeah, the whole facial recognition thing is like, if that's sort of the, you know, the nail in the coffin of, like, well, we got these three things, you know, where were you at 3 p.m. that day, and, oh, you just happen to look a lot like this other person, it must be you... I mean, that's a real problem. Well, and that's where the guidelines are not being violated, right? But the fact that you had the facial recognition, like you said, now becomes the coup de grace. Like, we might have let the guy go and said, well, don't leave town, you're a person of interest. But instead, we had the facial recognition, and that may have given them the confidence to say, let's just arrest him. Could be other things too, of course. Well, this story and lots of others are always getting batted around like ping-pong balls in our conversation on Discord, which you can join by linking to a Patreon account at patreon.com slash DTNS. Let's check out the mailbag. We got a good one from Alec in warm and sunny Fayetteville, Arkansas, who says the SCADA (that's an acronym for supervisory control and data acquisition) platform that Alec uses at work announced yesterday that they've released a free version for home use. This is a huge deal from my perspective, as it's identical to the package that we use at work, less some of the redundancy and enterprise administration features, but without the $30,000 price tag. So it's definitely not for the faint of heart. 
The platform allows just about anything to be controlled from a central server, with real-time data collection, alarming, and the ability to build custom HTML5 interfaces. In the conversation on GDI yesterday, we were talking about smart bulbs and routines that work most of the time but sometimes get a little wonky. This software has the potential to fine-tune all of that. The Dublin Airport runs all of their baggage claim and tracking systems with Ignition (that's the name of the software). Sierra Nevada, that's a brewery, runs their entire canning and brewing operation with Ignition. The list goes on. I'm very excited to play around with it and figured I would share it with you all. And we'll have the link in our show notes. inductiveautomation.com is the website. We'll have the full link, like Sarah said. Very cool. It's not something that, you know, the average user is going to just throw up to build an IoT setup and solve all their problems. It's a very sophisticated piece of software, but the fact that a very sophisticated $30,000 piece of software is now available for free for home use? That's pretty great. And I can't wait for the people who do want to play around with this to do that and come up with the version that is easy for folks to be able to implement. This could be huge. Very cool. Very cool. Thanks, Alec. Shout out to patrons at our master and grandmaster levels, including Ragnold Vermidal, Reed Fishler, and Paul Reese. Also thanks to the one, the only, Scott Johnson. I may as well tell you who the actor was. It's Judge Reinhold. Okay, everybody, that's who looks like me. Or used to. We used to kind of look alike. We don't so much anymore. I would not have... I've been thinking for the last few minutes, like, who is it? You just look like Scott Johnson to me. Back in my mullet days, many years ago, I looked a lot like Judge Reinhold, but it was a weird reference and I thought I'd clear the air so nobody wondered. 
I want to thank you guys for having me on, but second of all, I wanted to just drop a quick tease. Tom and I have been working diligently on the rebirth of the Current Geek show, which has been around since 2010 in various iterations. It's been going strong that whole time and we did some really cool stuff with it, but we are taking things to a literal new level, and I'm so excited about it. The only tease I'm going to drop is: within the next week or so, check out currentgeek.com. You can go bookmark it now. There's not much changed there; the existing shows are there right now, but there'll be more there soon. We're taking the show in places you will not believe. It's going to be so freaking rad. I can barely stand myself until we are ready to launch this thing. So pay attention, and check out currentgeek.com in the meantime. If you have any questions for me, there are contact links on the site. More details to come, but look forward to what will officially be called Current Geek Chronicles, coming to a podcast player near you. Alright, thanks for having me on. I'm Scott Johnson on Twitter, and you can also find that show and everything else I do over at frogpants.com. Just saying. Justin Robert Young announces me doing a wrestling move on Scott. There you go. There's that. Hey, an episode of Know A Little More is coming tomorrow to the public feed, about ARM. If you're hearing all this talk about Apple and ARM and you're like, wait, but I don't know what ARM actually is, I will explain it to you, and why you might think they're a patent troll, since they don't actually make things, but they're not. Get that episode at knowalittlemore.com, or if you're a patron, it's available to you already at patreon.com slash DTNS. Hey, guess what? We have an email address. Guess what it is? feedback at dailytechnewshow.com. Guess what? We're also live Monday through Friday at 4:30 p.m. Eastern, 2030 UTC, and you can find out more at dailytechnewshow.com slash live. 
We'll start with Justin Robert Young. Talk to you then.