Daily Tech News Show is made possible by its listeners. Thanks to all of you, including Philip Shane, Paul Boyer, and Brad. Coming up on DTNS: tech comes to board games, but they're still board games. Chris Mancini is here to explain. Plus, the first copyright test of text-to-image generators, and James Earl Jones hands over Darth Vader's voice to a computer. This is the Daily Tech News for Monday, September 26, 2022. In Los Angeles, I'm Tom Merritt. In lovely Cleveland, Ohio, I'm Rich Stroffolino. I'm the show's producer, Roger Chang. And joining us, writer and podcaster Chris Mancini. Welcome back. Thanks for having me. Great to be back. There's always a lot of tech to talk about. Always a lot of tech, yes. We keep thinking the show's going to be over, and then they just make more tech. In fact, let's start with a few tech things you should know. Netflix announced it will establish an internal game studio based in Helsinki, Finland. Now, you may say, hey, Netflix already owns some other game studios. They own three other game studios, but this will be the first one building games from scratch. The studio will be led by Marko Lastikka, who previously worked at Zynga developing FarmVille 3, and before that in EA's mobile division. Apple has confirmed it is assembling iPhone 14 models in India at a Foxconn facility near Chennai. They're expected to go on sale in India later this year. This is part of Apple diversifying where it assembles its phones. Apple began assembling iPhones in India in 2017, but up until now, that was only older generations, so they're now doing the current generation in India as well. According to documents seen by Reuters, the Indian government proposed that smartphone makers make hardware changes to support its regional navigation satellite system, otherwise known as NavIC, in addition to GPS, by the start of 2023. NavIC became operational in 2018.
Uptake has maybe not been where the Indian government wants it, and so they're thinking about mandating it in smartphones. Apple, Xiaomi, and Samsung reportedly sought until 2025 to support NavIC, citing higher research and production costs. Got to build the radios to support all that. The UK's Information Commissioner's Office has been busy lately putting everybody on notice, and the latest to get a notice of intent is TikTok. The ICO said it has reached a provisional view. I want to have more provisional views. I might think this. I'm not sure. Give me a moment. Anyway, the provisional view is that TikTok's app breached UK data laws between May 2018 and July 2020. According to the notice, TikTok may have processed the data of children who were younger than 13 years old without parental consent. TikTok has 30 days to respond to that notice. And the New York Times' sources say TikTok and the U.S. White House have drafted a preliminary agreement to resolve national security concerns, but we don't have any details on that deal. It isn't finalized yet, so we'll keep an eye out for that. Hasn't this been going on for literally years? TikTok harvests the information. Why are there still news stories like people are surprised that this is happening? Well, that's why it's a provisional view. Yeah, because they're not surprised. They're just trying to decide what emotion fits. All right, the International Telecommunication Union is a 157-year-old organization originally formed to coordinate telegraphs across countries, a noble mission indeed. In 1949, it integrated into the United Nations system. It does not govern the internet, despite headlines you may read. That's a common misconception, but it does have 193 member countries and 900 participating organizations, so it can decide things that affect the internet simply because all those countries have agreed to it. So, who runs it? It has a lot of influence over the internet, particularly over standards and interoperability.
And this week in Romania, the ITU is choosing a new head to succeed China's Houlin Zhao, who has led the ITU for the past eight years. It's down to two people: former U.S. Commerce Department telecom expert Doreen Bogdan-Martin and former Russian Deputy Minister of Telecommunications Rashid Ismailov. Hmm, yeah. I mean, we've had a person from China in charge of it for the past eight years, and there's been pressure to do things, and it's not like one person just makes all the rules, but which way they get pushed makes a difference. Who's in charge matters. All right, let's talk about copyright. We're only just starting to get into the intellectual property implications of text-to-image generation, OpenAI, all of the algorithmically generated content out there. But we do have a few precedents. We do have a few rules being made. For one, DALL-E 2 and some of the other text-to-image engines say that you can use generated images for commercial purposes. They just build that in. We're going to avoid that whole issue: if you use it from us, you can do whatever you want, sell it, whatever. That doesn't clear up all the potential issues, though. For instance, if my intellectual property is used in the data set to train the engine, does that give me any say in what it outputs? There are also lots of concerns about whether the person who pressed the button really has the right to sell the image. Those are the kinds of things behind Getty's recent ban on images from text-to-image generators. But we're getting another test, eh, Rich? Yeah, so this kind of comes from the copyright perspective. And we have some precedent for how the US Copyright Office, at least, has dealt with AI. Back in 2019, they ruled that an AI engine cannot be the author of a copyrighted work, something tied to Stephen Thaler and a work by his algorithm, Creativity Machine.
So he tried to say that Creativity Machine was the author of a work, and the ruling was upheld earlier this year, so it took a while for it to hit the appeals board. And they maintained that copyrighted works have to include an element of human authorship. Not that they can't use AI tools, but an AI isn't a human, so you need human authorship to have a copyright in your name. In the United States, that is. Now, Thaler has succeeded with this in other places in the world, just not in the United States. And we're talking about the United States because of what happened with Kris Kashtanova. AI-generated images can be used in a copyrighted work; that's what Rich was pointing out there. New York-based artist Kris Kashtanova received a US copyright registration for an AI-generated, or algorithmically generated, graphic novel. When we say AI, it's almost useless, right? AI can mean anything. This is an algorithm, one of those text-to-image generators. It generated a graphic novel called Zarya of the Dawn, and that copyright is effective as of September 15th. So it has been granted. It is in force. Kashtanova used the Midjourney commercial image synthesis service for the work and was clear in registering the work that there was an assist from an algorithm. Kashtanova wrote the story, created the layout, and chose how to piece the images together. So the human did a lot of work, but didn't alter the generated works in any other way. The actual images you see came out of the computer. Kashtanova's graphic novel is available for free. If you want to take a look at it, you can find it at aicomicbooks.com. This definitely feeds into: is it a tool? Is it okay to use the tool? A lot of people are saying that the main character in this book, Zarya of the Dawn, looks a lot like Zendaya. Are we going to hear Zendaya's people put a claim on this? Chris, you make graphic novels. You sell graphic novels. How does this make you feel?
Well, I mean, this is an interesting thing, because first of all, what do the robots feel about this? I mean, what are they going to say? And once they take over, they're going to remember this, okay? But what's interesting is, when you make graphic novels, obviously, as a writer, Kashtanova still wrote it, but it more affects the artists and the art, and I would probably ask my artist what he thought. But it really is an interesting question: is the art and the style created by the artist, or is it generated by the AI? So many things go into a graphic novel between the art and the layout and the lettering and all of those things. It's a great question: what is the human contribution? What is the AI's? I don't think we're completely there yet, as far as the AI making a complete work of art like a book or a graphic novel. A still image, sure. But this kind of happened before. I don't know if you remember that game, The Last of Us, when it first came out. There was some controversy about whether or not that was Elliot Page's face for Ellie, and that stirred up a little bit of controversy there. So, especially when your AI main character looks like a famous actress, that might also be a problem. I mean, that's the first thing I saw, too, when I saw that image. Kashtanova is obviously making a point here, and very validly has said, I want to test these waters; that's why I did this. I think in practice, you would still have an artist to help with layout, to help with image selection, to refine individual images in order to touch them up and make them look exactly the way you want them to look. You probably also might want to modify them so they don't look so much like Zendaya, so you don't get into that situation. Because it's not just, are you trying to use someone's likeness, the way it was with The Last of Us.
This is also the idea that the algorithm was trained on images of Zendaya, most likely, right? And so there's that other side of the element as well. And one thing it could do is also speed up an artist's work. If the artist does the original drawings and style and compositions, then the AI goes in and takes different angles or positions of what the artist is drawing. So in a way, there's a real positive thing here if it could speed things up, because let me tell you, graphic novels and comics take forever to make. And if there's a computer way to speed it up without losing the human element, I think that could be very valuable. Well, speaking of human elements, as far as iconic performances go, it's hard to name something more distinctive than James Earl Jones' voice of Darth Vader, defining the character since 1977. But Jones is now 91 years old, so many have wondered how the character would be portrayed after Jones can no longer perform the voice. With the continuing Star Wars content planned since Disney bought the IP, it seems like a pressing concern, at least for Disney. But, you know, Tom, it seems like we know what... Yes, we have an answer. Rather than task another actor with stepping into the role, it seems the plan will be to simulate it. With Jones' consent, he has signed off on this. Disney has worked with a Ukrainian company called Respeecher. They used archival recordings and trained a proprietary algorithm to create new dialogue for Vader, focusing their efforts on recreating Jones' voice as it sounded in 1977. My guess is they feel like if they need Vader to sound older, they can do that. But they trained it mostly on his work from 1977. Now, I noticed this too. Oh, sorry, go ahead. Oh, no. Well, that's kind of the litmus test, right? Whether people will notice this, and a lot of people already have. It was featured in the new Obi-Wan Kenobi series. Some of the scenes were there with, I guess, spoiler, Darth Vader.
It was featuring the work by Respeecher. According to Lucasfilm's Matthew Wood, telling Vanity Fair, James did guide the algorithm's performance through many of these scenes, so he still has a hand in forming the character. We've also heard the tech used for young Luke Skywalker in The Book of Boba Fett. But Chris, I'm curious, in terms of hearing it, your reaction: did it pass the James Earl Jones human test? You know, especially if you know that it's not the actor, you have that little subconscious bug in your mind. You're like, well, that's not really going to sound like it. And they do. They sound close, but they're still not the same as hearing an actual human speak. But, you know, when you see how close people are getting, like with Respeecher, especially with the Mark Hamill stuff and with the James Earl Jones, you know that we're getting closer to it being almost indistinguishable. But I'll give you a little bit of trivia about James Earl Jones' voiceover work and what a great guy he is. I would work with audio designers, and they would occasionally have James Earl Jones do voiceovers for different spots and stuff, and he would do them remotely. And of course, all the audio designers would be like, oh, well, this is great. Can you record something for my voicemail or something like that? They would always ask, because they would want that on their voicemails. Like, you see how cool this is? So what James Earl Jones would do is he would say, all right, this is what it would cost me to do it. I'll do it for you for free, but I want you to give that money to charity. Oh, wow. So that's what he would do for all of those audio requests for voicemails. I wonder if he is going to require Respeecher to do the same thing. That would be great. Yeah. Did you, Chris, did you know it was Respeecher doing some of Vader's lines in Obi-Wan? I did not.
But I'm curious if you knew that. I didn't, but I suspected, though I didn't know for sure, because we had kind of gone through this with Mark Hamill, but it was on that bubble of, is it or isn't it? Well, I think that's the key. When you saw Luke Skywalker show up in The Mandalorian, you knew, okay, this is either going to sound like old Mark Hamill or it's been processed, right? And when it didn't sound like old Mark Hamill, you're like, oh, okay, they processed it. So that biases you. You're already thinking, okay, so this was synthesized. Does it really sound like him or not? Whereas in Obi-Wan, I remember thinking, wow, he sounds really good, almost too good. Almost not the way I would think James Earl Jones today would sound. Did they process it? But I also did not know. And so I think that's a truer test of how good it was, when you aren't coming in going, well, I know they did something to it, and whether you notice or not. Here's the other big difference: Mark Hamill as Luke Skywalker is a natural voice, whereas whenever you hear Vader speak, it's through a rebreather. So you already have that layer of distortion, which can also hide any imperfections in the voice. So you've got an image there, too. Yeah, I think... go ahead, Rich. I think, though, in both of these instances, we're still dealing with an actor who is still able to be involved in the creation of a character that they defined in so many ways. And that's why, when reading this, I don't get a weird uncanny valley feeling, because I was like, okay, it's based on their voice, they're involved with it. I think where it gets interesting is where a lot of these companies that are dealing with these long-tail IPs are going to get to at some point.
Disney seems like maybe one of the first to get there: trying to keep these characters around using these types of methods instead of doing recasting, and where it gets to, oh, now we're dealing with the estate of this actor. The legality of that is, I would think, fairly clear; I'm not an inheritance expert or anything like that. But I think the perception of it is definitely different when we're talking about actors who can still be involved with it, and certainly both of these were, by all accounts. Versus, you know, if Grand Moff Tarkin's voice is brought back the same way he was visually. If there's a new Maltese Falcon movie, then, you know, all right, I don't know if that's Humphrey Bogart; I think there's something going on. You know, for a second, I really thought Millennium Falcon, even though you said Maltese Falcon, just because of the conversation. But yeah, in the vein of all of that, I think it is important that this is the precedent: a living actor says, with my blessing, with my guidance, with my wishes, I will hand it over to the computer. It is more difficult when the actor is already gone. But I think this is a good precedent to follow, to say, let's not wait until they're gone and then have a big argument over whether this is okay. Let's line it up. It may feel a little morbid to come to somebody like, hey, so when you die... but you never know. Even if people don't die, their voices could be changed by illness, or they may just not be able to get up and get around and do stuff like that. And it's important to have a succession plan, so to speak. Yeah, audio-wise, for sure. Yeah. All right, let's turn it over to Dan Campos in Mexico City with an update on some tech that's helping players at this year's World Cup. For fans who live the intensity of fútbol, this is NTX with some tech and sports news.
FIFA announced the release of the FIFA Player App, where the actual football players will be able to view data related to their performance during the matches taking place at the Qatar World Cup. The data will be synchronized with videos taken during the games in order to have the most efficient evaluation at key moments. The information collected won't be available to the public, but to the players themselves, their clubs, and some selected third parties. Currently, professional soccer leagues monitor their players using EPTS, Electronic Performance and Tracking Systems, which use wearables to collect information and were introduced in 2017. For more information about this, check NTX's latest episode. Back to you, amigos. Thanks very much, Dan. The thing that's cool about this is, for years, all the sports video games have just been chasing realism, and this almost feels like the continued video-gamification of the actual sport. It's the opposite now. And that, to me, is the most fascinating thing about all of it. It's so cool. Pretty soon, they'll be able to simulate the voices of the soccer players as they pretend they're injured. Folks, if you have thoughts about something on the show but you don't know our email address, let me fix that. Our email address is feedback at dailytechnewshow.com. Recently on an episode of Star Trek: Lower Decks, the main characters, who, if you don't know, live centuries in the future, were playing a board game that included holographic elements along with good old-fashioned game pieces and dice and a board. Now, that may sound silly to you, and Lower Decks is an animated comedic take on Star Trek, so maybe it was, but it's not far off from some of the things we're seeing now. Chris, what kind of tech are we seeing in board games? You know, it's pretty amazing.
Board games have seen a resurgence over the last couple of years, and there are different levels of tech going into board games. There are always the purists who are like, I don't want any tech in my board game, but there are certain ways where it's super beneficial. When you have a super complicated game like Gloomhaven, which we play a lot, it takes a very long time to set up, play, and tear down, even for one adventure, so you can have what are called helper apps. The app will keep track of things like monsters' hit points and the scenario that you're on and the rewards and the player conditions, so it helps you keep track of everything. The other thing it will also do is hide information to make it a little more dynamic. If you're looking at the scenario board, it has everything that you're going to see in the adventure, but the app can hide those things until you reach them. And then the next level up, which is also really fun, is a game like Mansions of Madness or Return to Dark Tower, where the app is actually integrated into the gameplay, where you can't play it without the app. So the way you can look at it is, when we used to play Dungeons and Dragons, there was always a DM, a Dungeon Master. Well, that Dungeon Master role is taken over by the technology, so everyone can play together and nobody has to actually be the Dungeon Master, which is really fun, especially in co-op games. In Return to Dark Tower, you have a giant tower in the middle of the game, which is fun, but the app does all the heavy lifting: it tells you where to put the monsters and when the tower is going to spin and what the victory conditions are, and gives you quests, so it's integrated into the actual gameplay.
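To make that concrete, here's a toy sketch of the kind of bookkeeping a helper app does: tracking monster hit points and conditions, and holding back hidden scenario information until players reach it. This is purely illustrative; the names (`ScenarioTracker`, `Monster`, the room names) are invented for this example and have nothing to do with how Gloomhaven's or any other real companion app is actually implemented.

```python
from dataclasses import dataclass, field

@dataclass
class Monster:
    """One enemy on the board: name, remaining hit points, status conditions."""
    name: str
    hp: int
    conditions: set = field(default_factory=set)

    def take_damage(self, amount: int) -> bool:
        """Apply damage; return True if the monster is defeated."""
        self.hp = max(0, self.hp - amount)
        return self.hp == 0

class ScenarioTracker:
    """Tracks monsters and hidden rooms for one scenario, like a digital DM."""
    def __init__(self, hidden_rooms):
        self.monsters = []
        self.hidden_rooms = list(hidden_rooms)  # not shown to players yet
        self.revealed_rooms = []

    def spawn(self, name, hp):
        m = Monster(name, hp)
        self.monsters.append(m)
        return m

    def reveal_next_room(self):
        """Reveal hidden information only when the players get there."""
        if self.hidden_rooms:
            room = self.hidden_rooms.pop(0)
            self.revealed_rooms.append(room)
            return room
        return None

# A tiny session: the app spawns an enemy, applies damage, reveals a room.
tracker = ScenarioTracker(hidden_rooms=["Crypt", "Treasure Vault"])
bandit = tracker.spawn("Bandit Guard", hp=6)
bandit.conditions.add("poisoned")
defeated = bandit.take_damage(4)   # 2 HP left, not defeated yet
room = tracker.reveal_next_room()  # first hidden room becomes visible
```

The point is just that the app owns the fiddly state (hit points, conditions, what's still hidden), which is exactly the work a human Dungeon Master or a dedicated "enemy player" would otherwise do by hand.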
So I really like that approach, because it simplifies things, and I know purists don't like it, but I do, because it takes away the question of, where does the monster go now? I don't have to have a player who actually has to worry about the enemies. But there's something on the horizon that I have only seen in videos, and it's super expensive, but it's kind of like the next level of board gaming: AR gaming. There's a company called Tilt Five that is actually creating holographic board games. So what happens is you have a board, and then you all wear AR glasses, and the games come alive right in front of you. It'll be a mix of classic games and new games, but what's interesting about it is, where's the line then between a video game and a board game? What tactile things can you have? Can you move pieces around and roll dice, or is everything just going to be a wand and glasses, in which case, okay, well, then it's a video game. So, dynamic for sure, and I'm very anxious to see where it's going. Yeah, I think one of the things that Stoic Squirrel brought up in our chat room is that the tech can be controversial if it's required to play. Now, Dark Tower, the original, had technology in it, so it's kind of just carrying on the legacy there, and it comes with the game. But with the apps, do you run into controversy where people are saying, I don't know, I don't want to have to use the app myself, and if some people are using apps, then maybe that's going to give them an advantage, etc., etc.?
Well, somebody's always angry about something on Reddit, so sure, there's always somebody. But my gaming group, we're a little older, so we're open to different ways of playing, and I like playing it both ways. I like the super low-tech board game where you're turning over cards and moving pieces, but I also like the AI assist that's basically controlling the game, where you're playing against the game in an app. And like I said, the AR, the Tilt Five stuff, I haven't seen in person; I've only seen it in videos, and it looks amazing. I'm just not sure. You can't really tell what the experience is unless you actually do it, so I'm very curious to see. The problem is it's very expensive, so it's a bit of a buy-in to try it out. Well, and to your point about this line of video game versus board game, I remember even back in the PS3 days there was that PlayStation Eye camera, and they had that game The Eye of Judgment, which was kind of the other side of this, which was, hey, we have this incredible graphics processing software, let's point a camera at a bunch of... it was more of a Magic: The Gathering kind of card combat game. That idea of, hey, we can add this element to it, wasn't incredibly successful, but it always fascinated me. If you don't require a set console and a wired camera and all that stuff, and when AR glasses become more of a consumer good, I can see that becoming super exciting. And I really think, at the end of the day, the technology is going to be used as tools, and it's going to be really about the designers and the artists and how they use it. I feel like there's going to be an AR board game that's super boring and just like a video game, or there's going to be one that's absolutely brilliant, where it's going to be the system seller, where there's no experience like it. So I think it's
really going to be up to the creativity of the designers. And that's one of the things that happened with Dark Tower. I really felt like they mixed the technology and the gameplay of the old-school '80s Dark Tower super well. It's not too complicated, it's not Gloomhaven, but it's complicated enough that it's fun to play. And it's just fun to have a tower in the middle of the board that's Bluetooth-connected to your tablet, spitting skulls out at you. Yeah, what's not to love, right? Yeah, good playing fun. Well, Tilt Five, if people are curious: you only have to put down $5 to reserve it, but the actual price will be $359 for one set of glasses, and you'll be very lonely if you only have one set. So you're talking $359 for every person you want to play with, right? Yeah. Well, that's enough of living in a fantasy world on a tabletop. Let's move to a world where we smash things into asteroids. Yes, our glorious present. Because let's face it, if you're an enthusiast for either dinosaurs or disaster cinema from 1998, you know that a massive asteroid hitting the Earth is an existential concern. And to be clear, there's no threat right now or indication that any asteroids are coming toward the Earth. Calm down, don't worry, don't panic, everything's fine. But NASA's Double Asteroid Redirection Test, or DART, is being billed as a planetary defense test mission. This will send a spacecraft traveling over 14,000 miles an hour to crash into the asteroid Dimorphos in an effort to alter its speed and orbit. Dimorphos is about the size of the Great Pyramid of Giza, so a nice size asteroid, all things considered. In case you're worried, this should not knock the asteroid off its trajectory; it wouldn't hit the Earth, nothing like that. Dimorphos is a small asteroid orbiting a larger asteroid called Didymos. The impact should shorten its orbit around Didymos by about ten
minutes, a measurable amount, just enough, first, to check that all our calculations are fine. No need to worry, everything is fine, go about your daily lives. The DART spacecraft will send back imagery and data up until the moment of impact, and its companion craft, the Italian-built LICIACube, will also monitor the crash. The collision is expected at 7:14 p.m. Eastern time on Monday the 26th. It takes about 38 seconds for light to travel from Dimorphos to Earth, so there will be a slight delay. Now, I know a lot of you will be listening to this after the impact, and so we're so sorry if we got it so wrong. Yeah, you know, you look at the timing of this, like, well, why now exactly are we doing these tests? And the best way to tell if it's a real, serious threat or not is to book a tour at JPL, and when you're on the tour, see how quickly people are moving back and forth, and then you'll know what's going on. Or if your tour just gets cancelled. Yeah. All right, let's check out the mailbag. We got a great one from Marty, who wrote: Just finished the episode where you were talking about Nvidia's announcement and their use of the USD format for the metaverse. Wanted to say I work in live events and production, and the USD format has been gaining traction over the last year in our industry for media servers, projection, XR, etc., for the reasons you described. We were just talking a few weeks ago about adding support for it into one of our products, and it looks like it's going to be the new standard moving forward. So we'll see where it goes, but wanted to mention it's already being used and growing in applications outside of the metaverse, too. That's great, Marty. I think it's good to hear from somebody inside the industry saying, yes, the standard Nvidia picked has got some momentum elsewhere as well. Yeah, definitely good to hear from the real world. And then Thor sent us a message in Patreon. He said: Just finished listening to GDI 4361, and since you were
discussing the future of podcasting and the benefits of short-form content, I wanted to share my experience with the Fountain and Podverse podcast apps. Both apps support clips. Users make clips that let you get a taste of a highlight from a longer podcast. On Fountain, users can also like a clip by sending a small amount of Bitcoin, otherwise known as satoshis. The app is also trying to incentivize listening and clip creation by awarding currency when listening. Love to hear your thoughts about this and the Podcasting 2.0 spec, if you check that out. And there are some other cool apps that allow you to stream currency to the podcast while you listen, among other cool features. In the interest of keeping things short, I'll leave the rest for you to explore. Yeah, thank you, Thor. I appreciate you sending these along. We'll have links to both of those, Podverse and Fountain, in the show notes. I see a lot of these kinds of efforts. They're very interesting to look at, and so I always watch to see, are they getting critical mass, are they getting people to sign up? So I'll keep an eye on these as well, definitely. Well, let's say a big thanks to Chris Mancini for being with us. You've always got some cool stuff going on. What's going on with you these days? Absolutely. I've been working on the new company, White Cat Entertainment. We're kind of like a boutique publisher of books, graphic novels, and podcasts. And the two podcasts, of course: one of them you've been on, What Are You Watching, where, well, we're in entertainment, we make entertainment, and we see what people are watching, movies and TV. But the other one that I've been really happy with, and that's been growing, is The Journeys of Professor Atwood, which is a podcast to help people tone down anxiety, help with insomnia, and help them sleep. It's a narrative podcast where you don't have to do anything. I wanted to make something that could actually just help people, where you listen to these stories about a professor
who goes on these journeys, and then there's a bed of sound effects and music that is almost technologically specific to help you relax and go to sleep. So it's a very calculated, funny, weird narrative journey that can help you relax. And who couldn't use a little bit more relaxation right now? They're all free; you can subscribe at White Cat Entertainment. A perfect follow-up as you watch the asteroid get smashed into. They aren't coming near us at all, don't worry, but if you need to relax, we now know where to get some goodness. Well, it's Monday, so we always want to thank our new boss, and we have a new boss who just started backing us on Patreon: Chad! Chad's the best. You could be tomorrow's Chad: patreon.com/DTNS. Patrons, stick around. I can't imagine we're not going to talk a little more about Vader and comic books and all that kind of stuff. Good Day Internet has all that. You can also catch the show live Monday through Friday, 4 p.m. Eastern, 2000 UTC, at dailytechnewshow.com/live. Back tomorrow talking Raptor Lake with Patrick Norton. Talk to you then. This show is part of the Frogpants network. Get more at frogpants.com. Diamond Club hopes you have enjoyed this program.