Coming up on DTNS, Plume Wi-Fi is likely in your ISP's Wi-Fi router right now. Is that a good thing? The latest in the Chinese crackdown on tech platforms, and a detailed look at Apple's plan to scan for child abuse images and the implications of that plan. This is the Daily Tech News for Friday, August 6th, 2021. In Los Angeles, I'm Tom Merritt. Somewhere in Dogtown, STL, I'm Patrick Norton. And I'm Roger Chang. The show got going a little while ago, really. We were just discussing all manner of things, from my online pet refill follies to whether I should wear a hat on the show or not. If you'd like that wider conversation, get Good Day Internet. Become a member at patreon.com slash DTNS, where you can join our top patrons like Jeff Wilkes, Paul Reis, and Dr. X17. Let's start with a few tech things you should know. Thanks to Semi-Declared on our subreddit, who pointed out that today is the anniversary of the publication of the very first World Wide Web page by Tim Berners-Lee on August 6th, 1991. It says on that page, the World Wide Web, or W3, is a wide-area hypermedia information retrieval initiative aiming to give universal access to a large universe of documents. Microsoft has three interesting announcements. A new experimental mode for the Edge browser will turn off just-in-time compilation in its JavaScript engine to improve security. Microsoft's Chief Product Officer, Panos Panay, posted a GIF on Twitter showing Spotify integrated into the alarm clock app on Windows 11, apparently related to the Focus Sessions feature. And over the next 12 months, Microsoft will bring features from the UWP OneNote into the desktop version of OneNote. By the second half of 2022, all users will be encouraged to upgrade to the desktop app. Support for the UWP version of OneNote will continue through October 2025. Did you say GIF? Probably. But I seem to change how I say it every couple of years, depending on... Oh, okay. Yeah, I like that. Equal opportunity. Wednesday, we discussed Facebook's decision to suspend the accounts of researchers from NYU's advertising lab in the name of protecting privacy. The FTC's Acting Director of the Bureau of Consumer Protection, Samuel Levine, wrote a letter to Facebook CEO Mark Zuckerberg on the subject, saying that he was, quote, disappointed by how your company has conducted itself in this matter. The letter also added the following paragraph: The FTC is committed to protecting the privacy of people, and efforts to shield targeted advertising practices from scrutiny run counter to that mission. Had you honored your commitment to contact us in advance, we would have pointed out that the consent decree does not bar Facebook from creating exceptions for good-faith research in the public interest. Indeed, the FTC supports efforts to shed light on opaque business practices, especially around surveillance-based advertising. While it is not our role to resolve individual disputes between Facebook and third parties, we hope that the company is not invoking privacy, much less the FTC consent order, as a pretext to advance other aims. End quote. That's the sternest thing I've heard from the FTC in approximately forever. Good writing. I enjoyed it. Seriously. John Deere announced it will acquire autonomous farming heavy machinery company Bear Flag Robotics to help fill the gap in farm labor. Bear Flag developed autonomous farm robotics and has begun a limited deployment in California.
The acquisition will help Bear Flag avoid what happened to apple-picking robot developer Abundant Robotics, which closed last month when it was unable to raise enough investment. And Thursday, we mentioned the Wyden-Lummis-Toomey amendment from U.S. senators that would tighten up the definition of cryptocurrency brokers in the infrastructure bill. Well, Senators Portman, Warner, and Sinema have introduced their own amendment that would reportedly broaden the definition of broker and apply the increased reporting requirements and higher taxes to more cryptocurrency businesses. And President Biden has expressed support for that amendment. All right, let's talk about Google first, before we get to Apple. Unsealed court documents indicate Google considered buying Epic Games as a way to resolve the two companies' disputes over Fortnite in the Android store. Epic offered Fortnite as a sideloaded app for a while, you may recall. And in Epic's filing in the lawsuit, it said Google developed a series of internal projects to address the contagion, as they called it, that it perceived from efforts by Epic and others to offer consumers and developers competitive alternatives, and has even contemplated buying some or all of Epic to squelch this threat. Now, that's interesting, but the supporting documents that show exactly what Google employees were saying about the idea of acquiring Epic remain sealed. So we don't get to know that. And the filing does not indicate that Google actually ever raised the idea with Epic. It may have just been an internal thing. Epic's filing also shows Google staff members were aware that sideloading was, quote, abysmal, and took 15 steps compared to two for Google Play. In other words, in Epic's opinion, Google knew that there were barriers to an alternative and used them to protect its monopoly. That's why they're bringing it up. There's more along these lines, including a presentation about Amazon's competing app store, where Google employees said, if we were honest, we would admit that most users and developers aren't consciously choosing. They are going with the default, the default being the Play Store. Epic is doing all this in advance of a trial against Google. It's still awaiting a ruling from the court on its lawsuit against Apple. That's yet to come. And the trial with Google has yet to begin. Seems like, Patrick, all the buzz is focusing on that idea of Google saying, well, maybe we should just buy Epic, which, yeah, that's tantalizing. But without really seeing it in context, I can imagine that inside Google, employees would throw that idea around, like, I mean, we've got enough money, maybe we should just buy them, without necessarily seriously thinking it was the thing they should do. What's kind of crazy is, when you think about Google or Apple or Microsoft for that matter, these are companies that are sitting on tens of billions, if not hundreds of billions, of dollars that are just sitting there accruing interest, or whatever you do with $150 billion; you stash it away for a rainy day. So, you know, when you think about what a company like that could buy, they could buy just about anything. Exactly. That's why this doesn't get my dander up so much. The idea that Google could buy Epic isn't news. The idea that someone inside Google, among its thousands of employees, at some point said, I don't know, maybe we should just buy Epic and end this whole thing, isn't surprising. What would be surprising is if it made it to a planning stage and they're like, yeah, let's do that.
Let's shut them up. Let's buy them. That would be interesting. We're not seeing those documents. Or any signs of negotiation. I have a feeling that certain lawyers have a checklist: can we buy our way out of the problem? Can we buy the company to buy our way out of the problem? Honestly, they should. They should go through that checklist. And quite often that checklist should end with "antitrust concerns outweigh the benefits of buying, do not buy," which is probably where this conversation would have ended rather quickly. But yeah, it's getting a lot of attention, so I figured we should acknowledge it. Makes sense. Protocol has a write-up on Plume. That's a Wi-Fi company you may or may not know about. They started out making software that helps Wi-Fi networks allocate bandwidth intelligently by device in order to increase coverage in a home. The idea was you wouldn't have to add hardware or antennas or mesh base stations. Plume sells the systems to ISPs like Comcast, Charter, and Vodafone to make their Wi-Fi routers work better for home users. They also, of course, sell standalone products, Wi-Fi pods, direct to consumers to help them manage their home networks. It's on these products that Plume has been testing Wi-Fi-based motion detection technology from Cognitive Systems as a feature. It analyzes changes in Wi-Fi radio waves to tell where motion occurs, which is kind of a wild thing to think about. Plume is considering using that to offer motion-based features like turning off lights when someone leaves a room or adjusting a thermostat when nobody is home, without having to have an actual camera there or adding multiple sensors. And Plume is, of course, collecting aggregate data, with a billion devices being used in its system, or so they claim. The company says it can report things like a 158% increase in installed Peloton devices, or a 54% increase in Apple Watches, between October 2019 and May 2021. The data can be used for network management and customer service issues. It could also be used to help investors decide what kinds of companies to invest in, or to help device makers decide which types of devices to develop. Oh, my goodness, look at all those devices out there; we should make a competitor. And, of course, it can be used for marketing, though Plume insists it will not sell anything but aggregate data and will not allow individual targeting. Man, I was enjoying the story right up until that last sentence. And the thing is, I don't have any reason to distrust Plume. I don't know enough about them to distrust them or not. But as soon as somebody's aggregating data these days, everybody's dander goes up, right? Everybody's like, wait, hold on. And it doesn't matter that you say, oh, we promise not to sell to third parties. Everybody assumes that you will, because they've been burned before, right? Or it'll only be aggregate data. Don't worry, no one will know. No one will ever find out. Which has me thinking about that stern and well-written letter from the FTC to Zuckerberg. I'm curious. I'm kind of a fan. I had no idea. I knew Plume as a company that makes mesh devices. I didn't realize how deep they were in supplying services to ISPs. I also had a Nest thermostat that shut off the heat to the house while my family was in the room with it. I think Nest has gotten better with their sensors. I find the idea of interpreting Wi-Fi signals to detect motion really kind of fascinating.
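A quick aside for the technically curious: the core idea is that a person moving through a room perturbs the Wi-Fi signals passing between devices, so presence can be inferred from the signal math alone, no camera required. Here is a minimal, hypothetical sketch of that principle, flagging motion when recent signal readings vary more than a quiet-room baseline; the window size and threshold are made-up parameters, and this is not Plume's or Cognitive Systems' actual algorithm, which relies on far richer channel-state data.

```python
from collections import deque
from statistics import pstdev

# Hypothetical sketch only: flag "motion" when recent Wi-Fi signal readings
# (say, per-packet signal strength in dB) vary more than a quiet-room baseline.
# Real products use detailed channel-state information and far better models;
# this just illustrates inferring motion from signal changes, with no camera.

WINDOW = 50                 # number of recent readings to look at (assumed)
MOTION_THRESHOLD_DB = 2.0   # standard deviation that counts as motion (assumed)

readings: deque[float] = deque(maxlen=WINDOW)

def update(signal_db: float) -> bool:
    """Record one reading and report whether motion is suspected."""
    readings.append(signal_db)
    if len(readings) < WINDOW:
        return False  # not enough data yet
    return pstdev(readings) > MOTION_THRESHOLD_DB

# A quiet room, then someone walks through and the readings start to wobble.
samples = [-52.0] * 60 + [-52.0, -49.5, -55.0, -48.0, -56.5] * 12
for s in samples:
    if update(s):
        print("Motion suspected")
        break
```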
And I wait with bated breath to find out how well it works in homes across America. Yeah, I mean, the whole Wi-Fi-in-aggregate, not-targeting-me-specifically thing. Using the change in the delay of the radio waves bouncing around the Wi-Fi network to detect motion, that's great, because you don't have to record video of somebody. You can just look at the math that's there already and say, oh, somebody's in the room. It seems like it tends to be a little more accurate as well, because it's not trying to analyze video; it's actually able to show motion. So I think that's great, and I love that Plume is doing that. This aggregate stuff, I mean, I try to steer against my instincts and say it can be done right. And that is valuable to have, to know, like, listen, we don't need to know who has them, but seeing a lot more Peloton traffic on the networks is important to know and could be beneficial, especially if it's public. In this day and age, you have to work hard as a company to prove that you are not abusing that data. The other thing that always strikes me is, because of the way they sell this as a service, they have continuing revenue, which I think will motivate them to keep their devices secure. That's something you do not see in the vast majority of Wi-Fi devices or mesh devices, less so with mesh, more so with Wi-Fi, where they age and they're not kept secure. But I'm also like, if my router knows whether or not I'm home, and somebody can ever figure out a way to detect that, from a security standpoint that's a little terrifying. And that could be independent of even using Plume, if somebody can just monitor your Wi-Fi radio waves. That's possible. Oh, yeah. Time for another China crackdown update. China's rules that you cannot operate tutoring services for profit or with foreign investment have impacted US-based Duolingo. The app is no longer available for download in major Chinese app stores. Duolingo says it hopes to restore the app in the near term. They have not abandoned all hope. And ByteDance said Thursday that it's going to lay off employees in its education business and close some tutoring operations. China is trying to crack down on over-profiting from parental fears by just saying you can't make money off tutoring anymore, and these are the effects. There's also a crackdown on celebrity culture online going on in China. Popular short messaging platform Weibo removed an online celebrity list. It's kind of like a Twitter follow list, where everybody's like, oh, these are the celebrities, let's look at what they're up to. Weibo removed it due to, quote, irrational support for some celebrities. That was the quote, irrational support. State-owned People's Daily had recently criticized platforms that make stars out of unworthy individuals. The accusations against Kris Wu, a former member of EXO, might be playing a part in this as well; he could be one of those unworthy individuals in the People's Daily's opinion. The Wall Street Journal notes that in the wake of the Chinese crackdown on tech companies, Hong Kong's Hang Seng index is down 3.8% on the year, and down 39% from its peak in February. It started strong, and it's just dropped ever since. However, investors do believe that in the long term more Chinese firms are going to list in Hong Kong, not the U.S., which will ultimately be good for the exchange anyway.
China Telecom was cut off from U.S. markets recently, though it plans to list its shares in Shanghai for now, and China Mobile also plans a domestic listing for the same reason. Holding onto its U.S. listing is proving challenging for Didi Global. Bloomberg says the ride-hailing giant is considering giving up control of its data in China to a third party, though Didi disputes that and says the report is untrue. China is investigating Didi for security concerns related to its U.S. listing, and the punishments being considered reportedly include fines, or Didi being forced to take on state-owned investors to outweigh the foreign ownership from Japan's SoftBank and the U.S.'s Uber. Also in the wake of the tech company crackdown, food delivery platform Meituan may be fined $1 billion over accusations of monopolistic practices. Huawei reported its biggest drop ever in revenue, though that is a result of U.S. sanctions and restrictions, not of the Chinese crackdown. And yet, there's also good news for one Chinese company. Counterpoint Research estimates Xiaomi's sales grew 26% month over month in June, giving it a 17.1% share of the world market for smartphones, pushing it above Samsung's 15.7% and Apple's 14.3%. It may be temporary. Samsung's production in Vietnam was disrupted by COVID-19 in the month of June. But for now, Xiaomi is the world's number one smartphone maker. I was going to say, it's got to be a terrifying time to be anybody that's invested in China or any Chinese tech company that's kind of looking around and going... Yeah, what's going to be the next misstep? And particularly, as we talked about last Friday, it's platforms. China loves its hardware makers, its chip makers. It wants more of those. It wants people that make the parts. So if you're a chip maker in China, you're loving life. It's not all tech companies. There is rational support for the companies that make things, as opposed to the ones that just make money off of the people. Yes, you can have rational support for chip makers, just not for celebrities on Weibo. Hey, folks, we love patrons that stick with us. So many of you have been with us from the very beginning, if not right after. That's why we're happy to offer Patreon loyalty rewards. You can get a unique sticker, mug, t-shirt, or hoodie every three months, as long as you stay a patron at our top four levels. Each one has unique art from Len Peralta, featuring the DTNS seven-year anniversary logo. You can get the details by looking at the tier descriptions at patreon.com slash DTNS. All right. Yesterday, we mentioned quickly at the very top of the show that Johns Hopkins University professor Matthew Green had received confirmation that Apple was planning to do on-device scanning for child abuse materials. Apple has confirmed that plan now and released more details about what it calls expanded protections for children. Let's look at what it actually does and some of the reactions from the security and privacy community. Two main features will become part of iOS 15, macOS Monterey, and the next iPadOS and watchOS this autumn. They are both an attempt to combat child sexual abuse material, which is being referred to in a lot of the documentation and reporting as CSAM, C-S-A-M. Keep in mind, almost all cloud services, Microsoft, Google, and more, scan user uploads in the cloud. Once it's in the cloud and on a Google server or Microsoft server or Amazon server, they will scan for illegal content like CSAM. Apple doesn't do that and isn't going to start doing that, because Apple lets users encrypt files before they get to the cloud.
So to scan for CSAM uploaded to iCloud Photos, Apple is going to implement something called NeuralHash, which will work on your device without decrypting the images. You'll be saying, how does that work? Here's how it works. A hash number is created for every image to be uploaded to iCloud Photos. A lot of you know what a hash is, but if you don't, it's essentially a number that can be derived from the image without anyone viewing the image or decrypting it. It's math. If you're up on cryptographic techniques, Apple is using private set intersection. Security researchers generally agree that this works as advertised. This isn't the controversial part. It uses math so that you can't identify the image from the hash unless you have a matching hash for that same image yourself. Before an image is uploaded to Apple, your device will compare hashes against a database of hashes that have been made from offending images provided by child protection organizations, including the National Center for Missing and Exploited Children, the NCMEC. That database is stored on your device. It's just a list of numbers; you're not going to have the actual images on your device, just the hash numbers. Images that are the same as one in the database, or cropped from the same image, will result in a match. So you don't need to decrypt the user's image. If the hash is the same, the software knows what the image is. If it's not the same, it doesn't. If there's a match, that result is uploaded to Apple's servers, but that process uses something called threshold secret sharing. Basically, that sets a limit: even though the results are on Apple's servers now, they cannot be decrypted until there's a threshold of some number of matching results. Now, Apple would not say what it set the threshold at, because it doesn't want people trying to game that system. But threshold secret sharing is also not controversial in the security community. It works. Apple will not even know you had any matches until you meet the threshold, whatever that is. And if the threshold is met, there are still more steps. Apple will take the decrypted results and manually review the images to make sure there wasn't a false match. Apple thinks there's a one-in-a-trillion chance of a mismatch, but it will still manually review images just to be sure. If, after all of this, the images do match illegal images, Apple will disable the user account and forward the information to the NCMEC. The NCMEC will then engage with law enforcement. Okay, that's NeuralHash. That's one of the features. Here's the other feature: message scanning for children. This only applies to accounts that are set up on a family account and identified as belonging to someone younger than 18. For these accounts, Apple uses on-device machine learning classifiers to detect sexually explicit images in the Messages app. If the user is trying to send a matching image, the system will warn the user: hey, you're about to send this. Are you sure you want to do that? And if the account is listed as being for someone 13 years or younger, the warning will include a parental notification if the image is sent. Now, if a child user receives a matching image, the image will be blurred and the user will be told, hey, you don't have to reveal this image. It's okay. If the user is younger than 13 and chooses to reveal the image, parents will be notified and the image will be saved for review. So that's child protection in Messages.
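For the technically curious, here is a heavily simplified, hypothetical sketch of the matching flow just described: hash each photo headed for upload, compare against a list of known hashes, and only escalate once a threshold of matches is crossed. Every name and number in it is a stand-in; Apple's real system uses its NeuralHash perceptual hash, private set intersection, and threshold secret sharing, so nothing is revealed to anyone below the threshold.

```python
import hashlib

# Hypothetical sketch of the flow described above, NOT Apple's implementation.
# Apple uses a perceptual hash (NeuralHash) so resized or cropped copies still
# match, and the comparison happens under private set intersection so neither
# side learns anything about non-matches. SHA-256 here is just a stand-in.

KNOWN_HASHES: set[str] = set()  # the on-device database: hash numbers only, no images
MATCH_THRESHOLD = 30            # placeholder; Apple has not disclosed its real threshold

def image_hash(image_bytes: bytes) -> str:
    """Stand-in hash; a perceptual hash would go here."""
    return hashlib.sha256(image_bytes).hexdigest()

def should_escalate(uploads: list[bytes]) -> bool:
    """True only once the number of matching images crosses the threshold."""
    matches = sum(1 for img in uploads if image_hash(img) in KNOWN_HASHES)
    # Below the threshold nothing is learned or reported; above it, the matched
    # items would go to human review before any account action is taken.
    return matches >= MATCH_THRESHOLD
```

The property the real cryptography adds, and that this sketch leaves out, is that the device can't tell which of its own images matched and Apple can't decrypt any result until the threshold is met.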
Apple says NeuralHash and the child account measures will roll out first in the US, and it did not commit to other rollouts. So it's coming to the US only for now. It might go elsewhere, it might not. If you do not use iCloud Photos and you're not on a child's account, the system will not be used. So if you want to avoid it, for now, don't use iCloud Photos. So to sum up the story so far: Apple will use on-device scanning to look only for child abuse materials, CSAM, keep matches encrypted until they reach a threshold, and even then manually verify them before disabling an account and reporting it to a non-profit child protection agency that will deal with law enforcement. Apple won't deal with law enforcement directly. And in addition to that, accounts identified as children will be protected from seeing sexually explicit images, and if they're identified as younger than 13, parents will be notified, if the parents want. And yet, security researchers in general see these measures, however well-intentioned, as backdoors to encryption. Because in the end, an encrypted piece of data is being decrypted by a third party: Apple, in the case of NeuralHash, or seen by a third party, the parents, in the case of the child message protection. The EFF also points out that the systems could easily be changed to scan for more than just child abuse, if Apple wanted to do that, or to look for more than just sexual images, if Apple wanted to do that, and could easily be changed to apply to more than just iCloud Photos and Messages on child accounts, if Apple were to want that. Right now it's limited to that, but the infrastructure is there. Apple's defense in the past has been that it could not give access to encrypted data even if it wanted to, because the user holds the keys, and that's the end of that, unless it's iCloud. That defense goes away if this is implemented. Apple could now be pressured to update its software in particular cases to scan more widely. Some government could say to Apple, hey, we know you can do this, just, you know, do a little tweak to this person's operating system so we can see a little more. Apple, of course, will say it will not give in to such pressure or push out such updates. After all, it holds the keys for all iCloud photo libraries already, and it has resisted pressure to scan in the cloud; it's not even doing that now. It could also have updated software to weaken end-to-end encryption in the past, and it didn't. And it bears repeating that Apple's not the first to do this, or even the worst. Most cloud companies scan your photos for CSAM in the cloud after you've uploaded them, and most of the time you don't even realize that's happening. As for the reactions: security researchers don't think this is a good idea for Apple to do. A GitHub petition has been started requesting Apple halt deployment of these systems and reaffirm its commitment to end-to-end encryption. 9to5Mac obtained an internal memo from Apple software VP Sebastien Marineau-Mes saying, we know some people have misunderstandings, and more than a few are worried about the implications, but we will continue to explain and detail the features so people understand what we've built. I feel like, I haven't read the full white paper on this necessarily, Patrick, but I feel like I have a good handle on what they're doing, and I very much support the idea of why they're doing it. I am very uncomfortable with them implementing infrastructure that could be used for something else. Yeah, it's... I have children.
I have a lot of feelings about this, and it is interesting to look at. I think the real challenge here is, again: we're going to save the children, we're going to do the right thing, we just have this one exception, this one little box. And what we've seen, and this coming from Apple is particularly fascinating, because they've held the line so hard on end-to-end encryption. The EFF write-ups on this are particularly compelling, and one of the things in the show notes was David Thiel writing about it, or tweeting about it, I should say; he's written a lot of really compelling stuff. There's also the EFF article by India McKinney and Erica Portnoy, "Apple's Plan to 'Think Different' About Encryption Opens a Backdoor to Your Private Life." And what it comes down to is, this starts out as A, and then, through legal pressure, through "we want to do business in this country" pressure, through "that sounds like a good idea" pressure, it suddenly becomes a tool for kind of keeping an eye on what's going on inside of your phone. And I know people love to say, well, if you're not doing anything wrong, it won't be a problem, but that's a complicated subject, especially if you are, say, oh, I don't know, not particularly happy with the current government in Russia, or any of a number of other countries in the world, or the United States. I think it becomes a very slippery slope very quickly, depending on what else this is used for. My thinking keeps changing on this, so by the time you hear what I'm about to say and respond, I'll probably have changed my mind. But right now, the iCloud Photos part bothers me less, because Apple already holds your keys. Sure. And what bothers me about this is that they're saying, you know what, we want to give in a little. Right now we hold your keys and we don't give them up for any reason unless we absolutely have to, under a warrant, but we're going to give them up in this case, under these conditions. I don't love that, because slippery slope, like you say, right? But, and this is to Thiel's point, maybe it's worth it. Maybe that's worth it, and you already hold my keys anyway, and you're doing a lot to make sure that you don't see anything unless you absolutely have to. Okay. I'm still uncomfortable with that, but I'm starting to see how that is maybe not as bad as I thought originally. The Messages part is where you have put in a backdoor. This is end-to-end encrypted, and end-to-end encrypted means nobody else can see what's inside. And as much as I don't want kids to have to see messages that they didn't want to see and shouldn't see at that age, especially children under 13, and I am not unsympathetic to that, you weaken end-to-end encryption to do it. And that's the debate, because you can't say it's end-to-end encryption when one end can be sent to another person without the choice of the person sending the message. Yeah. I also cannot imagine being a teenager in the age of social media and iPhones. It was complicated enough before that. But I think there are also going to be some pretty complicated unintended effects with the parental notifications. Between 13 and 17, technically they won't get a notification, they'll get a warning about an image. But, man, the whole parental notification thing. For under 13,
I'm less worried about it. Between 13 and 17, they shouldn't happen, but it still seems fraught. Yeah, well, it's never the intended use that's the problem, right? It's always the unintended use. You may have noticed a few times I referred to an account that has been created saying it is a person younger than 13. If I'm an abusive family member who creates the family account for everybody because I'm very controlling, I can decide to mark your account as 13 and younger just so I can make sure I get those messages sent to me, should they happen. So there's a whole conversation to be had about how the system could be abused. That isn't, to me, a torpedo argument of "therefore Apple shouldn't do it," because any system could be abused, and usually it's better to stop abusers than to take away tools from everybody else. But that said, it's not immaterial to the conversation. It's worth balancing and considering whether, in this case, maybe it isn't worth it, because there are all these other implications to it as well. Yeah. And Mike is asking a question in the chat that a lot of people are asking: why now? "Why now" is, to me, a question you could ask about anything, so it's not usually a terribly relevant one. But Apple has answered it. They said the reason we're doing this now is we couldn't guarantee the encryption worked well enough until now. They feel like they got the tech to the point that they could protect it. At least, that's what Apple said. Yeah. I also sense a lot of activity moving to different applications. Yeah. And if you're asking how this impacts end-to-end encryption: one end is no longer protected from being forwarded without the permission of the user. Certain messages, in this case, are forwarded to parents without the permission of the user. You could say, well, that's worth it because it's a kid, but that is no longer end-to-end encryption at that point. All right. Please send us your emails: feedback at dailytechnewsshow.com. There's not an easy answer to this. I'm not coming out with a ban hammer saying Apple's absolutely wrong. I'm just saying, so far, I'm pretty uncomfortable with this. Maybe I can be brought around. Maybe you're the person to do it. I think the answer to that comes in 10 years, when we see what actually happened with the technology and where it ended up. Maybe nothing happens; more often than not, that's the case. Yeah. Or maybe a bunch of nasty people are caught. Yeah, maybe. Which would be good. All right, speaking of email, we got one from Todd, who said there was a discussion about the Olympic Broadcasting Services' tech and the next Olympic Games in 2024. The next Olympics are actually less than six months away, in Beijing, if you count the Winter Olympics. We were talking about the Summer Olympics in 2024, but you're right, the Winter Games are coming in February 2022 in Beijing. And Todd said the entire OBS control center will be disassembled as soon as the games in Tokyo are over and shipped right to Beijing to be reassembled. Todd says, we had the manager of the OBS sports commentator system on Office Hours yesterday and he mentioned this. Thank you, Todd, for passing that along. Good point; there's only a small gap for that technology to get ported over this time. Also, thanks to our brand new boss. Listen, if you back us on Patreon, we appreciate it. Brandon Boyer just started backing
us. Thank you, Brandon, for backing us at patreon.com slash DTNS. Every single new patron is appreciated. And thank you, Patrick Norton, you are also appreciated. Thanks for hanging out with us. Thanks for having me, as always. And as always, if you want to track me down outside of DTNS, please look on Twitter at Patrick Norton, or go to avexcel.com, the home theater and audio podcast I host with Robert Heron. So much conversation. We are live Monday through Friday, 4:30 p.m. Eastern, 20:30 UTC. You can find out more about where, when, and all that at dailytechnewsshow.com slash live. We're back on Monday with White Cat Entertainment's Chris Mancini talking about some TV tech, so we will see you then. This week's episodes of Daily Tech News Show were created by the following people: host, producer, and writer Tom Merritt; host, producer, and writer Sarah Lane; executive producer and booker Roger Chang; producer, writer, and host Rich Stroffolino; video producer and Twitch producer Joe Coons; associate producer Anthony Lambs; Spanish-language host, writer, and producer Dan Compost; news host, writer, and producer Jen Cutter; science correspondent Dr. Nicky Acrimates; social media producer and moderator Zoey Dennerding. Our mods: Beatmaster W, Scottis One, Bio Cow, Captain Kipper, Jack Shid, Steve Guadirama, Paul Reese, Matthew J. Stevens, and JD Galloway. Moderation and video hosting by Dan Christiansen. Video feed by Sean Way. Music and art provided by Martin Bell, Dan Lueders, Mustapha A., Acast Creative, Asterix, and Len Peralta. Acast ad support from Trace Gainer. Patreon support from Stefan Brown. Contributors for this week's show included Scott Johnson, Justin Robert Young, and Patrick Norton, and our guest this week was Seth Rosenblatt. Thanks to all the patrons who make the show possible. This show is part of the Frogpants Network; get more at frogpants.com. The Diamond Club hopes you have enjoyed this program.