Coming up on DTNS: Twitter's about-face on hacked materials, Sony's $5,000 3D monitor, and why Apple's use of LiDAR in phones may be a big boon to the auto industry. This is the Daily Tech News Show for Friday, October 16th, 2020. In Los Angeles, I'm Tom Merritt. In Studio Colorado, I'm Shannon Morse. I'm the show's producer, Roger Chang. Sarah Lane has the day off, she's getting her HVAC reinstalled, and Len Peralta is not with us today, but he did draw something that we will review later in the show.

We were just talking about our favorite scares on Good Day Internet. If you want to get that wider conversation, become a member at patreon.com slash DTNS.

Let's start with a few tech things you should know. Google announced it will migrate all existing Google Hangouts users, and that means the Hangouts you chat in, not the video one or any of the other ones, just the one you chat in, to the now-free Google Chat starting next year. They're giving you lots of notice. Chat will be available in Gmail and as a standalone app. Users migrating from Hangouts will retain their conversations, contacts and saved history, and Google will give at least four weeks' notice before each phase of the migration to Chat.

Ars Technica's sources say GameStop will receive a percentage of the revenue from Xbox transactions, including game purchases, on any Xbox console it sells. It's unclear what the percentage is and whether it applies to used consoles resold by GameStop.

All right, Jon Prosser is at it again. His sources say Apple may hold another special event, a third one, November 17th. This one would be to introduce the first Macs running on Apple Silicon. Now, Prosser doesn't have a perfect record on these things, but he did accurately forecast the October 13th iPhone event, and Bloomberg also reports new Apple Silicon Macs are expected in November. So triangulate that; we might get a third Apple announcement.

Software security company Tripwire's vulnerability and exposure research team uncovered a stack-based buffer overflow bug inside SonicOS. SonicOS is installed on almost 800,000 VPN appliances from network security company SonicWall. The flaw can be triggered from the WAN side by an unauthenticated HTTP request involving a custom protocol handler, letting an attacker cause a persistent denial-of-service condition.

And Google added a hum-to-search feature to the Google app and Google Assistant, letting users identify a song just by humming, whistling or singing it. Users can select the song search button or just ask Assistant, "what is this song?" and start humming. Google will display results based on the likelihood of a match. It uses some machine learning for this; it's one of a bunch of announcements about machine learning. The feature is available in English on iOS and in more than 20 languages on Android. Another way Google is improving search with machine learning is to let users see how busy a place is on Google Maps without having to search for the place specifically. It'll just show up as a Maps layer.

All right, let's talk a little more about Twitter. What'd they do, Shannon?

I feel like I want to hum a song for you now, Tom. After criticism of how Twitter applied its hacked materials policy to sharing a New York Post article about Hunter Biden, the company announced Thursday evening that it would change the policy. Twitter's trust and safety lead, her name is Vijaya Gadde, wrote it would no longer remove hacked content unless it is directly shared by hackers or those acting in concert with them.
And Twitter will, quote, label tweets to provide context, instead of blocking links from being shared on Twitter. Now, Gadde also wrote that the policy as originally applied could result in unintended consequences for journalists, whistleblowers and others, contrary to serving the public conversation. Friday morning, Twitter CEO Jack Dorsey wrote, quote, straight blocking of URLs was wrong, and we updated our policy and enforcement to fix. Our goal is to attempt to add context, and now we have capabilities to do that. The US Senate Judiciary Committee plans to subpoena Dorsey to appear October 23rd and testify about that removal.

Yeah, so a lot of moving parts on this. It does seem like the policy will be changed so that they wouldn't be blocking URLs to respected outlets in the future. So, you know, the New York Post story would no longer need to be blocked. And in fact, I tested this earlier. I can post that New York Post link, and it doesn't give me the warning of, like, oh, you can't do that here. So they have changed that. I haven't seen the actual text of a new policy yet. I'm very curious how they word that. All we have to go on is Vijaya Gadde's Twitter posts about this. And her wording is close to what they're going to do, but I would not want to parse it too closely, because it's just her saying this is what we're going to do, and I want to see the actual words of the policy.

But it feels like what they're aiming for is: we don't want people coming onto Twitter to post things that dox people, and we don't want you linking to, like, Scribd, you know, to documents that dox people. But a New York Post article about something shouldn't qualify. So they're trying to draw the line there. You know, this is already out there. People are already upset. This is not going to make people less upset. But to my way of thinking, it is the right thing to do. They applied their policy literally, realized that was unreasonable, and they're adjusting the policy. Very curious what that policy ends up being.

It doesn't mean they're not going to run into this again, because of the definition of who's a hacker. Who's someone working with a hacker? Isn't the New York Post working with the hacker in this case? Or because it's not an actual hacker, is that why it doesn't count? So how do you qualify what a hacker is? Is Chelsea Manning a hacker? What about Edward Snowden? Is it that you can only have the New York Times write an article about Edward Snowden's leaks, but he couldn't leak directly to them? What about whistleblowers? We mentioned DDoSecrets had been banned yesterday. They have not been unbanned under this policy. They are still banned. Shannon, how's the hacker community reacting to this?

I love that your reaction to this was, I need to see the policy and how it's written, because that's exactly what I was thinking too. Like, what is their description of a hacker? For example, the infosec industry, kind of as a whole, I watch a lot of what goes on on Twitter, what the trends are within the industry, and a lot of people were retweeting Vijaya and saying, oh, so I guess I can post leaks again now. Great. And it's so funny, because that's just been an ongoing thing. People have always posted leaks on Twitter, whether they're third party or they come from another website. And that kind of makes it very, very pointed, in that Twitter is never the place where people first post leaks, or at least it is very rarely.
Usually leaks are posted somewhere else, and then a bunch of people find out about it and they start posting it on Twitter. So how do they know if it's an original hacker, or somebody working in concert with a hacker, posting links to specific information, as opposed to a journalist or somebody who is trying to get information out for the public interest?

Yeah, there's still a gray area, right? They just moved the gray area. The gray area yesterday was right over the New York Post, and so they removed that link. Now they're saying, all right, if the New York Post or the New York Times or the Wall Street Journal get some leaked information and they vet it and they report on it, we're not going to block those links. We are going to look at links that look like they're directly coming from somebody, or somebody working with them. Again, I'm very curious how they define that. So now that's the gray area. The gray area is, well, how do you define "directly," and how do you define "working with them"?

Exactly. Do you think that down the road, do you think that they're going to have, like, a specific list of outlets that they allow certain publications or certain information to come out of, or certain journalists to be able to post? Maybe this could be a part of their verification for the blue badge. I don't know. I'm trying to figure out how they're going to choose what gets posted and what doesn't.

Right, because I don't think they'll have a list. I think they'll just say, you know, if it seems like it's coming directly from the person who doxed somebody or hacked somebody, that's the problem. And then the fight will no longer be over the New York Post. It will be over One America News Network, or it will be over, maybe not Huffington Post, but I can't think of a good example on the other side, something that's a little more controversial. Or maybe it'll just be over Scribd. Maybe it'll just be over, you know, direct links from non-journalistic entities. But how do you define a journalistic entity? Like, I think the New York Post is now firmly on the OK side. But what about those ones on the edge? What about the upstarts, et cetera?

So exactly. Very curious. All right.

Ars Technica's Timothy B. Lee has an excellent explainer up on the LiDAR sensor in the new Apple iPhone Pros and the effect it might have on the auto industry. First of all, go read Timothy B. Lee's article. It's excellent. It'll give you a much fuller understanding. But I want to hit on some of the high notes here. Apple first introduced LiDAR in the iPad in March, so this is the second implementation of LiDAR in an Apple device. If you don't know what LiDAR is, I'm going to do a Know A Little More episode about it. But for now: it used to stand for "light" and "radar." It's been retconned into a bunch of acronyms, but it's essentially sending laser light out and measuring how long it takes to reflect back, and that lets you estimate the distance. And then you can create a point cloud map of objects in range of the light, and that gives you a 3D representation of what's around you.

The iPad's LiDAR, which we assume is probably the same as the iPhone's, though nobody's gotten hold of an iPhone Pro yet, uses something called a VCSEL, a vertical-cavity surface-emitting laser, made by Lumentum. It's a fancy way of saying that the laser light emits from the top instead of the edge of the wafer. And that makes all the difference.
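To put rough numbers on the time-of-flight idea Tom just described, here is a minimal sketch of the distance math. The pulse timing and geometry here are illustrative assumptions, not Apple's published specs; the only real physics is that distance is half the round trip times the speed of light.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_round_trip(seconds: float) -> float:
    """Distance is half the round trip: the pulse goes out and back."""
    return SPEED_OF_LIGHT * seconds / 2

def point_from_return(azimuth_rad: float, elevation_rad: float,
                      round_trip_s: float):
    """Turn one laser return into an (x, y, z) point; thousands of
    these per frame form the point cloud Tom mentions."""
    d = distance_from_round_trip(round_trip_s)
    x = d * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = d * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = d * math.sin(elevation_rad)
    return (x, y, z)

# A return arriving 33.3 nanoseconds after the pulse left puts the
# target about 5 meters out, roughly the iPhone LiDAR's stated range.
print(distance_from_round_trip(33.3e-9))  # ~4.99 meters
```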
The VCSEL is combined with a single-photon avalanche diode, or SPAD, from Sony that senses the returning light. So the VCSEL sends out the laser light and the SPAD receives it. Lee has been working on a story about how this pairing is being developed by companies for the automotive market, because at scale it becomes a lot cheaper than other LiDAR systems. Apple joining the crowd is going to speed up that process, because now you've got a bunch more orders, right? And that scale, for the people who make these, becomes easier to meet.

The system is cheaper because the parts are cheap. VCSELs are already used in mice, optical networking, et cetera. They only recently became powerful enough to be considered for LiDAR use. VCSELs are cheaper because, as we mentioned, the light comes from the top, not the edge, so you don't have to cut the wafers. That means you can put a lot more VCSELs, up to thousands, on a silicon chip, which brings down the cost per unit. And SPADs are affordable too, since they can be made using conventional semiconductor techniques, and you can pack thousands on a single chip there as well.

Of course, the Apple LiDAR has a range of about five meters. That's all you need in a phone or an iPad, compared to the more expensive automotive LiDAR, which needs a range of at least 100 meters, if not 200 or more. And the more powerful LiDAR systems used in automotive use spinning gimbals. Even if they're using a VCSEL, they're using a spinning gimbal to be able to extend that range. It improves the precision too, but it also raises the expense, even though they're using fewer VCSELs than there are in the iPhone. Car companies like Tesla don't even use LiDAR. Elon Musk is on record very vociferously talking about how it's just too expensive and it'll never work.

Companies like Ouster, Ibeo and Sense Photonics are trying to push the performance of VCSEL and SPAD combinations to rival more expensive versions. If they can get the range high enough, which they think they can, then Apple helping push the price down with all of these high-volume orders could bring low-cost LiDAR to more cars. So not just autonomous cars, but also driver assistance systems like Tesla's Autopilot, and the ones in other cars like BMWs and Audis and Fords and all of that.

I found this fascinating, to know that, OK, the LiDAR in the Apple phone, we still don't know exactly what it's going to be used for, what it's going to be best for. It could definitely improve on Face ID. I don't know if they're going to get rid of the depth sensor; it doesn't look like they are. But just having it in there means that we're going to have more affordable LiDAR, and therefore more affordable 3D detection in our cars. Shannon, I know you've spent a lot of time with LiDAR, the more expensive version.

Yeah, I'm kind of a dork when it comes to LiDAR. And it's because maybe a year ago or so, I got the chance to check out Waymo in Arizona. They invited me, as well as a few other YouTubers, down to actually experience firsthand how these automated cars are using LiDAR, as well as machine learning and AI, to be able to determine what they are seeing with these LiDAR sensors. So they take in all the little laser lights, all the little spots that you get from these VCSELs and all the different technology. They combine it, and then they turn that point cloud into an image, something that gives a definition to each item within the image.
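As a rough illustration of the detect-then-decide loop Shannon is describing, here is a toy sketch. The cluster sizes and thresholds are made up for illustration; it is not Waymo's actual pipeline, which feeds points to trained models rather than hand-written rules like these.

```python
# Toy sketch: group of LiDAR returns summarized as an object's
# bounding size, then a crude "is this a pedestrian?" gate and a
# driving decision. Purely illustrative thresholds.

from dataclasses import dataclass

@dataclass
class Cluster:
    width_m: float   # extent across the ground plane
    height_m: float  # extent above the ground plane

def looks_like_pedestrian(c: Cluster) -> bool:
    """Crude size gate; a real system uses a trained classifier."""
    return 0.3 <= c.width_m <= 1.2 and 1.0 <= c.height_m <= 2.2

def plan(clusters: list) -> str:
    if any(looks_like_pedestrian(c) for c in clusters):
        return "slow down"  # or turn, or stop, per Shannon's example
    return "proceed"

print(plan([Cluster(width_m=0.6, height_m=1.7)]))  # slow down
```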
So if the LiDAR thinks that it sees a person, it'll bring that into machine learning and AI to determine if that's actually a person, and does the car need to slow down, or does it need to turn, or does it need to stop. And it's able to determine this information in order to give you a safer environment within that vehicle, so you feel more secure without having somebody else in the car with you.

But it's not only car technology that we've seen LiDAR in, and after I got to experience it, I kind of went really deep into learning as much as I could about it. You have them in drones. Drones are currently using it for geospatial measurements, to determine, like, architecture and how an environment is mapped out. For example, if you've seen The Curse of Oak Island, which I'm also a nerd for, that entire island has been mapped out with LiDAR, and you can look up the images online, and it is so fascinating. So seeing all these commercial aspects of LiDAR, everything which has cost thousands and thousands of dollars, in fact, it's like $75,000 on average for commercial use, seeing it drop down to where users can have this in an Apple iPhone or an iPad is just so fascinating to me. And I can see many ways that hopefully this will drive down the cost for commercial use cases, because it might make this technology evolve a lot faster than what we're currently seeing.

Yeah, there's a lot more in that Timothy Lee article, so again, it's definitely worth reading. But there's a lot of explanation of, like, why the gimbal uses fewer of those VCSELs, because it rotates and is able to simulate having more points. But in a stationary thing like the iPhone, they just put them all in there, and they got cheap enough that it was affordable for them to be like, well, we don't have to build a gimbal. We can just put a few thousand of these things inside the iPhone, and we don't need the gimbal to put as many points of light out there. We can have as many as we want. In fact, there are more, I feel like George Bush suddenly, but there are more points of light from the LiDAR sensor in the iPhone than there are in the TrueDepth sensor from the iPhone.

So cool.

The TrueDepth sensor is just looking at the deformation of the light, whereas the LiDAR is actually mapping your face and able to say, like, oh, OK, I know exactly how deep your face is. So yeah, there's a lot going on there. And LiDAR getting cheaper is going to have a lot of applications that we're just now beginning to dream of, I think.

I'm so excited for the future of LiDAR technology, because just seeing it in person and being able to experience it is so cool. So I want everybody to be able to experience it. And right now it's on the really expensive iPhone, but eventually it'll come into Samsung phones and mid-range phones. And yeah, we're on our way there. It's one of those situations, like, the more people that get it, the cheaper it's going to be.

If you want to join in the conversation about LiDAR or anything else, join our Discord, linked to a Patreon account at patreon.com slash DTNS.

Speaking of 3D, Sony announced the Spatial Reality Display. And that's not just my accent: S-P-A-T-I-A-L, Spatial Reality Display, a 15.6-inch 3D 4K display. It's being pitched as a way for artists and designers to see creations in 3D without having to put on a headset and do VR. You can just look at it, because it's a 15-inch monitor on your desk.
Similar products are available in higher resolutions, with larger screens, from other companies, and for cheaper. So what's the pitch with paying extra for the Sony? Well, Sony says people want a desktop system. First of all, there are others that are wall-mounted. The Spatial Reality Display is meant for a single user, so it's a creator working at their desk wanting to do 3D models. And the other differentiator, Sony says, is theirs just works better.

Here's what they've got going on there. It uses precisely adjusted lenticular lenses and an eye-tracking sensor. That seems to be their big advantage here. So they use the eye tracking with a real-time algorithm to create the stereoscopic 3D picture and adjust it on the fly, so that it always looks right to you. Now, according to Sean Hollister at The Verge, you can lean too far in or you can lean too far back and it goes away. But if you're in the right space, it will move around. You know, it'll look like a stationary object as you move your head around and look at it. So that's pretty impressive.

Sony recommends that you use it with a computer that has at least an Intel Core i7-9700K processor and an NVIDIA GeForce RTX 2070 GPU, or more powerful. It also works with a Leap Motion controller for 3D gesture control as well. Sony has SDKs for both the Unity and Unreal engines that work with it, if you want to develop for it that way. Sales start in November for five thousand bucks, which I think means most of us are not going to go out and buy this. But if you work in something that needs this, I could see you justifying five thousand dollars for it, right?

Yeah, I think the silver lining of this year is that we're seeing really cool technology being made compact for, like, work-from-home users. And that's what I consider this to be. Even though it is expensive, five thousand dollars, if this is your profession and you're in a work-from-home environment, but you still need to get this kind of work done, then this is something that you're going to seriously consider. Now, again, they do require really heavy-duty processing with your internal rig, the 9700K processor from Intel and the RTX 2070 GPU, so you're going to be spending quite a hefty amount of money there. But if it's a business expense, this is definitely something to consider.

Yeah, people spend that much for reference monitors that don't even do 3D. So this is specialized. This is not something for the mass consumer, I don't think. Well, here's something for the mass consumer: YouTube TV is integrating NFL fantasy football into the app.

Yeah. To set it up, you log into your NFL.com account and it will bring in your fantasy team and matchup. While watching an actual football game, you can select a fantasy tab to pull up your players' scores while the video window shrinks to the top corner of the screen. You can then toggle between multiple accounts as well. And fantasy players in the game you are watching are shown at the top. That adjusts if you change channels to another game as well. Google plans to bring more apps to its platform if they can be incorporated into content.

Yeah, so this is interesting to me, because it's not the first interactivity we've seen. It's not even the first fantasy interactivity. I think Xbox One tried to do this when it was really more focused on the TV aspect of things. But YouTube TV is the most popular right now of the over-the-top streaming, cable-like services.
They have football because they have your local channels. And so they're tying into something popular, and I think that's interesting. This idea of bringing, they're calling it an app, but it's really just a cloud service, into what you're watching, I think is really compelling. And you may not care about fantasy football, which is fine, so this is not the thing that's going to get you into it. But imagine other uses for it. I can imagine something like Amazon's X-Ray being implemented from IMDb. What if you could integrate IMDb, and you could always, like, you know, tap in with your remote or on your tablet or your phone, and be able to just touch the screen and say, who is that? Right? And it'd be able to tell from image recognition or something like that. I don't know. I feel like there's other uses that I'm not thinking of here.

Oh, yeah. I was thinking about, like, political debates. What if you were watching a live political debate, and then an application built right into it was able to give you, like, real-time fact checking? Or maybe you could see real-time opinions from your favorite news sources. Or even if you're watching, like, a competition show, like a singing competition show, and they allow votes from the audience, it would be cool to integrate something like that.

Vote right from the screen.

Not only just see the voting, but yeah, yeah, not just, like, texting to vote, which we've seen on traditional cable television, but, like, in the application. I think that would be really useful and really fun for people, to be able to interact, especially since we're all stuck at home still this year. I mean, I can see all sorts of different integrations that they could do for live TV, and I'm a cord cutter, and I don't even watch live TV. I don't even do any kind of NFL, sports, ball, football stuff. But this is really neat for the future.

Yeah, I think anything that you use on your phone while you watch TV, and have an account for, is a possibility for them to integrate here. So that's something I'm looking forward to, seeing new things come out of.

Weather Channel. Yeah, I was thinking about the Weather Channel, because, like, they can customize the Weather Channel to your location, but they don't always get it right. Or what if you're traveling and you want to be like, oh, I'm going to be traveling to Hawaii, so not only put my local weather up, put my destination weather up.

Before we finish up: robots down on the farm is one of my favorite beats, and we have news today.

We do. Alphabet's X has unveiled a prototype robot as a part of Project Mineral that can inspect plants to help farmers improve yields. The robot buggy rolls around on four pillars, so the rollers are in between rows and the body of the robot can be above the plants. In trials, it was able to count things like number of beans, plant height, leaf area and fruit size. It can also study soil and other factors to combine with weather and other environmental data. A machine learning algorithm identifies patterns and creates predictive models. Project Mineral has analyzed strawberry fields in California and soybean fields in Illinois, as well as melons, berries, lettuce, oil seeds, oats and barley. Project Mineral is working with breeders and farmers in Argentina, Canada, South Africa and the United States. There is no timeline on a commercial product.

Soybean fields in Illinois. That totally hooked me, because that's where I grew up. Everybody talked about corn in Illinois.
I'm like, that's up north. We all do soybeans down here in Bond County. But yeah, the idea that the robot can walk the beans for you and provide way more specific data... There's a lot of efforts in this field. This isn't the only one, and it's certainly not the first. But they all have that holy grail of being able to tell you, like, oh, you need to water more in this exact area. Or there's a fungus coming in on these two plants; if you get to it now, you're going to be able to stop it. Stuff like that is the holy grail of bringing data into the farming life. And this is just one of the projects doing it.

I was thinking it would be so cool if they were able to implement this up in California wine country and use this on grapes, to be able to tell them what kind of volume they need to grow their grapes at and what kind of watering they need to do. Like, oh man, it would be amazing, all the bottles of wine we'd be able to get out of that.

Yeah, because the whole point is to increase the yields. And farming is such a low-margin business that even if this increases the yield by just a couple of percentage points, that's huge for farmers, right?

It would be so big.

All right, let's check out the mailbag. Squinty wrote in. That's how Squinty refers to themselves; it's how they sign their email. It says: "Hi, team. On today's show with Peter Wells, you made reference to 'visually impaired' when talking about the Seeing AI app from Microsoft. I'm legally blind and was MCing an event for Guide Dogs Western Australia when I stated to others that are also blind that I was visually impaired, to which they laughed and asked if I was ugly. Confused, I asked what they meant, to which they said: we are vision impaired, not visually impaired. Ugly people are visually impaired."

And if I may interject, I think what they're saying is, like, "the visual of you is impaired" would apply as in, oh, it's not pleasant to look at you, but their vision is what's impaired. Anyway, Squinty said: "I was just amused listening to the show today, so I thought I would share. When talking about blind or low-vision people, it's vision impaired."

Now, that is not the standard. I went and checked. Most places have not caught on to this yet, Squinty, but I am 100 percent a supporter of that. I'm curious, if you consider yourself either visually or vision impaired, what do you think of this? Because I'm willing to switch. I think that makes sense to me.

Yeah, it makes sense to me too. I should ask my sister-in-law, because she works at a school for the blind. So I should get her opinion.

Yeah, and, like, poll the audience and see if they're into it. All right, folks, keep those emails coming to us: feedback at dailytechnewsshow.com.

I also wanted to give a shout-out to patrons at our master and grandmaster levels, including Irwin Stur, Pat Sheeran and John Atwood. Thank you so much.

Now, Len couldn't be with us today because he's got his secret meeting going on. Good luck with your meeting, Len. But he did draw something for us around today's story, in fact, the past couple of days' stories, the Twitter and Facebook New York Post story. So check it out at LenPeraltaStore.com. You can see his art, and it's another one of those great Len editorial cartoons, where he really captures the spirit and the anguish around this story. So go check it out: LenPeraltaStore.com.
Or if you're a patron, you will get this as part of your patronage at patreon.com slash Len.

Thank you, Shannon Morse. Always a pleasure to have you. Anything new to tell people about?

Yeah, check out my review of the Google Pixel 5 over on my YouTube channel, youtube.com slash Shannon Morse. It's October, so there are a lot of phone reviews coming out right now.

Yeah, that's right. No kidding. Tons of those.

Also, patrons, did you know your ad-free RSS feed can have just DTNS, or just GDI, or both? I ask this because some of you in our survey seemed to be unaware of that. Check your tier on Patreon to see if it says DTNS, GDI or All. And if you want to change, just change tiers to the one that says what you want, either DTNS, GDI or All. That's at dailytechnewsshow.com slash patreon.

We're live Monday through Friday, 4:30 p.m. Eastern, 2030 UTC. You can find out more about that at dailytechnewsshow.com slash live. Back on Monday. Sarah Lane has next week off, but we'll have David Spark, and he's going to play a security game with us. See you then.

This show is part of the Frogpants network. Get more at frogpants.com. Diamond Club hopes you have enjoyed this program.