and welcome to our Zoom folks who are trickling in. I think we're good to get started. Awesome. Well, welcome everyone to the Berkman Klein Center for Internet and Society and our Institute for Rebooting Social Media at Harvard University. My name is Tony Gardner and I direct operations for our institutes. And today I have the honor of welcoming you all and our folks on Zoom, our esteemed panelists, and introducing our moderator and organizer for today's event, Professor Anupam Chander. Anupam Chander is a visiting scholar with the Institute for Rebooting Social Media and the Scott K. Ginsburg Professor of Law and Technology at Georgetown. A graduate of Harvard College and Yale Law School, he is the author of a number of books, including a new book on data sovereignty just out from Oxford University Press. He has been a visiting law professor at Yale, Chicago, and Stanford, and is a member of the American Law Institute. Over to you, Anupam. Thank you very much, Tony. Thank you. I'm honored to introduce the panel. And before I do that, I want to just bring you up to speed on where we are with TikTok bans past and present. So in 2020, Donald Trump famously and dramatically banned TikTok. But instead of Trump banning TikTok, by the following spring it was TikTok that had banned Trump, removing various videos showing his involvement with January 6th. So how did a Chinese big tech company — or a Chinese-origin big tech company — beat the president of the United States on his own turf? It was the US courts that stepped in to protect TikTok, and TikTok's users in particular. The courts concluded that the president lacked statutory authority to ban this cross-border speech app. There are still a couple of spaces up here if anyone wants to just come to the front. 
The government's efforts to convince the courts that national security required the ban to go into effect failed: two district judges, including a Trump appointee, concluded that the asserted harms were conjectural and could possibly be met with other measures. But TikTok's troubles were far from over. It began negotiating earnestly with CFIUS, a committee in the United States government that reviews foreign investments that present possible national security concerns. TikTok, negotiating with CFIUS, offered to place all of its US user data inside servers held in the US and controlled and managed by Oracle, a company whose leaders had coincidentally supported Donald Trump in the 2020 election. TikTok's algorithm would be monitored and vetted closely by Oracle and others, and the board of directors of the TikTok data security arm — the arm that held all the data and controlled the algorithm — would be approved by the US government. A remarkable landscape for a speech app in the United States. Those efforts continue to this day, and CFIUS hasn't yet ordered a divestiture or accepted TikTok's Project Texas plan to mitigate those risks. Now, where Trump's ban failed for lack of statutory authorization, a bill in Congress today would fix at least that deficit. That bill, introduced by Congressman Mike Gallagher and joined by Congressman Raja Krishnamoorthi, would clearly direct ByteDance to either sell TikTok and its other US-facing apps or face an effective ban in the United States. And while that bill is concerned with both the use of TikTok for surveillance and for propaganda, in conversations with the press Congressman Gallagher has said that he's especially concerned with propaganda from the app. And in November of last year, Congressman Gallagher blamed TikTok for pushing Chinese propaganda and radicalizing Gen Z on the app. A kind of interesting interpretation of recent events. 
The bill, which passed the US House in seemingly record time and with overwhelming margins, is now before the US Senate for consideration. While the bill's proponents say it's not a ban — it only compels a sale — they can't be sure that it won't lead to the shuttering of the app in the United States. This is because in 2020, China modified its export controls to add personalization algorithms — algorithms that make recommendations based on personal information — to the technologies subject to export control, thereby hinting that it would ban the sale of TikTok's algorithm should a sale arise. Now, ByteDance also may itself prefer to shut down the US market — this is not much talked about — rather than create a future competitor at a fire-sale price. ByteDance has a TikTok app that operates across the world. It would now face the difficult situation of not only having already bifurcated out its China-facing app, Douyin, but now bifurcating TikTok into multiple apps, with obviously difficult interoperability questions. Coming to the rescue is Trump's former Treasury Secretary, Steven Mnuchin. Mnuchin is assembling a group to buy TikTok, and he has a solution for the Chinese government's veto: he proposes to buy TikTok without the algorithm. Okay. And so his proposal is to recreate an algorithm in the United States from the ground up. And so now the TikTok bill has passed the House — and I'll also note that this is Congressman Gallagher's valedictory: next week he will resign from the U.S. House to join Palantir, a defense contractor for the United States government that, interestingly, received one of its first investments from the venture capital arm of the CIA. And I kid you not — you would not have expected those words, but that is the reality. So our panel considers what happens if the TikTok bill is passed and signed into law. 
Both TikTok and its users will sue, arguing that the bill violates the First Amendment. And the bill interestingly places original jurisdiction over any challenges to the bill — which will immediately happen, of course — in the DC Circuit Court of Appeals. The only possible appeal from that court will be directly to the U.S. Supreme Court. So this is very much an issue that could well be before the U.S. Supreme Court in the months to come. So the TikTok saga is — and as I wrote this, I realized it — kind of like a Hollywood drama, and its final scene may well lie at the U.S. Supreme Court. And joining me today to explain the issue: what we're trying to do with this particular conversation is focus on what happens in those courts, okay? To ask that particular question: what happens to the First Amendment challenge that will inevitably be filed if the TikTok bill goes through? So we have some leading experts from across the nation joining us here to explain these questions and to think through how the courts will reason about this issue. First, to my immediate left is Jennifer Huddleston, a fellow at the Cato Institute. This happens to be a very nice opportunity to host Jennifer, but I'm also hoping that we can join in supporting Jennifer, because on Monday she'll be running the Boston Marathon. This is going to be her 12th marathon — or half-marathon, as I understand it? 12th full, okay, 12th full marathon. And her average time is an eight-minute mile, which is just shocking to me — I can't do one eight-minute mile, let alone 26. To her left is Ramya Krishnan, senior staff attorney at the Knight First Amendment Institute and a lecturer in law at Columbia Law School. And finally, to my far left is Jenna Leventoff, a senior policy counsel at the ACLU, where she develops and advocates for policies relating to protecting free speech. 
And on screen, joining us from Minnesota, is Alan Rozenshtein, a law professor at the University of Minnesota and a senior editor at Lawfare. Alan is a graduate of this very law school and a former fellow at Berkman. So with that introduction, I'm going to go around and ask them questions, and there will be time for questions from the audience — there will even be time for questions from the Zoom audience. So I want to encourage you to think about your questions. And we're doing this very much for lawyers, and so we are going to focus on the questions that lawyers ask in such a conversation. So I'll begin with a threshold question. And the threshold question that a court will ask is: what is the standard of review? In the Montana case, the district court applied intermediate scrutiny, finding the TikTok ban to be wanting even under that standard. It said it didn't need to decide whether that was the appropriate level of scrutiny, because if the ban failed under intermediate scrutiny, it would surely fail under strict scrutiny. And so I want to ask Jenna first: what is the appropriate standard of review in this case? I think the appropriate standard is actually even more than strict scrutiny. This is a prior restraint. This is stopping the speech of 170 million Americans before they can say anything. And in so many ways that's worse than traditionally chilled speech, because it's not that your view gets out and then you're punished later — in this case, your view never gets out there to begin with. It is the most speech-restrictive thing that you can possibly do. And when the Supreme Court has looked at prior restraints like this — where, again, you're stopping speech before it starts — it has said they're going to presumptively fail a constitutional analysis, right? The government has to go so far above what it normally goes for. So they have to show that the harm is imminent. 
They have to show that the harm is extremely serious. And then not only does it need to be a narrowly tailored solution, it pretty much needs to be necessary — a necessary solution, right? Like, is this the only thing that you can do to actually solve the problem? And in this case, it's going to fail. This is not going to meet that analysis, because I think the government has yet to put forth any public evidence that there's a real harm, let alone an imminent one. But even if we got to that point, right? Even if the government came out — I know in Congress they're talking about doing a public briefing, and many members of Congress have gotten private briefings about what the potential harms are. And if you ask a lot of those members, they'll say, I haven't heard anything that I find particularly convincing about an imminent harm. Though some members do seem to insist that there is a real harm. So even if that comes out, where this is going to fail is on being the least restrictive means. What is this doing? It's shutting down the app, essentially. And there's really no way to do that in a way that isn't speech-restrictive in and of itself. And there are so many other things that we could do to target any of these harms. We could pass a privacy law. Right now, let's say the concern is that China is accessing our data, right? Well, China can still access our data even if TikTok is banned. They can get data from a data broker. They can hack into Facebook's systems, or those of every other app and website that collects the same data. And so we're not doing very much here to actually solve the problem. Okay, so Alan, I'm going to come to you. Jenna says this is even more than strict scrutiny — it's obviously a prior restraint, shutting off access to an incredibly important speech platform for something like 170 million people in the United States. 
But before we come to the substance of that — applying whatever the standard of review is — what is the appropriate standard of review from your perspective? Sure. So, to be perfectly honest, I'm not sure. I feel like I can argue it many different ways. And it's a common problem in First Amendment jurisprudence — these sort of endless arguments about what the appropriate standard of review is. So let me say two things. First, with respect to what the actual standard of review is, I think you can make arguments sort of up and down the spectrum. So we just heard the argument for a prior restraint, and I think that's very plausible. On the other hand, there are lots of potentially analogous cases where we wouldn't apply that kind of standard. For example, an FCC denial of a license, right? I don't think you would necessarily apply a prior restraint standard there, even though it would keep someone, for example, out of the communications market. So again, I think it just depends exactly on how you characterize the issue, and there's a lot of play in the doctrinal joints, as it were. You could potentially characterize it as viewpoint-based — if your concern is that the Chinese Communist Party is pushing a particular viewpoint, then you might characterize this ban, or this law, that way. And that would obviously get a high level of scrutiny. You could just characterize it as a content-based law, which is to say the content is Chinese propaganda. If that's how you characterize it, that would be strict scrutiny. Or you could characterize it the way that the Montana court did, as something more neutral — something analogous to time, place, and manner — in which case you have intermediate scrutiny. And then of course, we haven't even talked about the national security implications, which I think tweak all of these tiers of scrutiny, at least in how the courts analyze them. So I think this foundational question is very much open. 
But the second thing I want to say is — and not to get too legal realist five minutes into the conversation — I'm not sure the fight over the tiers of scrutiny here will ultimately matter in the long term. I think these sorts of distinctions are quite important when lower courts are trying to slot a fact pattern into a well-established body of First Amendment law. But I think there are two reasons why that's probably not the case here. First, although this isn't sui generis — as I'm sure we'll talk about later in the conversation, there's a long history, and we can debate the specific parameters of that history, of restrictions on foreign ownership of platforms — this sort of move is, I think, quite unusual. There's not a massive amount of case law here. In addition, as you pointed out, Anupam, this litigation will start in the DC Circuit, and then it will almost certainly, I would imagine, be reviewed by the Supreme Court. This is such an important issue that it's hard to imagine the Supreme Court — which I think, to its credit, in the last few years has shown a real willingness to engage with a whole host of internet-related platform issues that it generally hasn't in the past — not taking it on. And once you get to the Supreme Court, I don't think the tiers of scrutiny will play any role whatsoever. At the Supreme Court, you have nine policymakers who are going to be balancing the various equities here as they see them. And so, you know, while I think asking the tiers-of-scrutiny question is a good place to start, at the end of the day I don't think that's going to be determinative of how the doctrine ends up playing out in this litigation. I love the idea of the Supreme Court as nine policymakers balancing various interests as they judge this. And it reminds me of Justice Kagan — formerly Dean Kagan — saying that they are hardly the nine greatest experts on the internet. 
So fascinating to imagine. Okay, so let's turn to one of these questions that has already been mentioned. Many of the defenders of the TikTok bill argue that it doesn't actually impose a ban — it simply requires ByteDance to find new owners, ones without ties to a nation that has been identified as a foreign adversary. And this picks up a little of that national security tweak that Alan suggested. Of course, the bill would impose a ban if ByteDance doesn't divest. Does it matter for the First Amendment analysis that the bill isn't an immediate ban order, but rather a divest-or-be-banned order? Jennifer, I'm going to come to you first. Yes, it does matter, in part because it goes back to the question: are there lesser means to achieve this goal? So clearly a forced sale or divestiture — while very concerning for many reasons that I'm guessing we'll have some time to go into from a speech point of view — is a very restrictive means. A full-out ban would be a more restrictive means. But the question is, what else exists on that spectrum before you get to something like a forced sale or divestiture? And I also think it matters how this forced sale or divestiture must occur, because we're not talking about a small company without many users, where it would be a small transaction or something like that. We're talking about a very large transaction in a very short period of time, one that also has to clear some additional regulatory hurdles — not only because of the likely size of the transaction, but because of what else is in the bill. It has to be proven that the sale alleviates concerns about the foreign interest. So are there certain buyers that the government might potentially strike down? There are all sorts of other elements of how this divestiture must occur, which means it's not as simple as advocates of this bill sometimes make it out to be — as if TikTok could just go down to the corner and offer itself up for sale. 
These are complicated business transactions that only certain people can potentially participate in. Now, why this matters for a First Amendment analysis is that a lot of it comes down to asking: what else could be done? If we do say that the government has a national security interest, is this the least restrictive means on speech, or are there other steps that could be taken? We've seen some of these play out in courts at a state level as well — for example, with regard to banning TikTok from government devices or government networks. The idea is that if there is a national security concern, it shouldn't be on government devices, it shouldn't be on government networks. That has generally withstood challenge — in the Texas courts, I know. And in many cases, we haven't seen as much challenge to that kind of ban, which is much more narrowly tailored to a particular situation. On the other hand, you have something like the Trump executive order, which was a much more flat-out ban. But you also have other things in between. You mentioned Project Texas — that would be an example of something that would perhaps be less restrictive. We could think of something where Congress, for example, could mandate a warning label that says: this app is known to have ties to China. Again, there are many First Amendment concerns with such a proposal, but it's probably less restrictive than a forced divestiture. We can think of other steps that could have been taken. So I think it will matter when it comes to identifying whether this is the least restrictive means to achieve Congress's goal — and it's not always clear what even that national security concern is. By the way, the privacy bills that are in Congress do actually have a disclosure requirement for data transfers to China — the latest bipartisan bill that was proposed last year, for instance. So there's an interesting alternative before Congress right now. 
Ramya, any thoughts on the posture, where this is a bill that says divest, and if you can't divest for some reason, then you're banned, as opposed to an outright ban? How does that change or affect the analysis, or does it? So I'm going to get a little bit realist, like Alan here. I'm not sure in practice how much distance there is between an order to divest or be banned and a flat-out ban, because we know that China would have to approve any deal, and it is on record that it will very likely firmly oppose any such deal. That's what its commerce ministry spokesperson said last year, when CFIUS — the Committee on Foreign Investment in the United States — told TikTok to divest or be banned: China's spokesperson came out and said, well, actually, we're going to have a problem with this. And that's when they noted also that they would have to approve the export of ByteDance's algorithm, the algorithm that TikTok runs on. And so that's why analysts have said that it's very, very unlikely that a divestiture deal will be accomplished here. And so what we're staring down the barrel of is almost certainly a ban. But the other thing that I would mention, as a matter of First Amendment doctrine, is that generally speaking, the government is not meant to be able to do indirectly what it can't do directly. And here, the government would be using the threat of a ban in order to accomplish a divestiture. And I think that should matter for the purposes of the analysis — there is this threat of a ban being held over a company in order to achieve the divestiture. If I could just respond, though, to something that Jennifer said that's unrelated to that question, about how we should look at certain less restrictive alternatives. 
Like, for example, the many state laws we've seen that have imposed a ban on state employees accessing TikTok on state-owned or -operated devices — and I'm a little bit self-interested here, because I was one of the litigators who litigated that Texas case, the case challenging the application of Texas's state-employee ban to public university faculty engaged in teaching and research. And so I do want to highlight that at least the application of those kinds of bans to the public university context, I think, does raise serious First Amendment concerns. There are faculty engaged in the study of TikTok — many of them focused on the very privacy and security risks that these states have said they care about and have offered as a reason for passing the ban in the first place. Students' interest in learning about one of the most popular communications platforms is also implicated. And so I would just want to push back against the idea that those bans never raise First Amendment concerns. I think they do. If I can just clarify really quickly: I am not saying that they do not raise First Amendment concerns. It is more that, when we're thinking about what the less restrictive means are, we've already seen some of those less restrictive means play out. There certainly are First Amendment concerns with many of the things that I mentioned — for example, a warning label, or even some of the data localization requirements could raise certain other concerns. But I think it's important to see what a sizable step the divest-or-ban would be. I think we both agree that it is a significantly more restrictive means than what we've seen play out so far. So, Alan, does it matter how the bill is styled — as an outright ban versus a divestment order coupled with a ban if divestiture doesn't occur? I think it does. I agree with both Jennifer and Ramya here, sort of simultaneously, and so I'll try to explain why. 
So I agree with Jennifer that it does matter, right? There's just a difference between saying this thing is banned versus this thing might be banned — but it might also not be banned if there's a divestment. On the other hand, I think Ramya is correct that, based on everything we know about the geopolitics of this, if this law is passed and the president then designates TikTok under the law, it's likely that TikTok gets banned. And I do think that defenders of the law have to be prepared for that. And so I think you do have to accept that possibility. Nevertheless, I actually think the divestment option is clever for another reason, which is that if China refuses to allow ByteDance to divest, I think that actually strengthens the national security case for the law itself, because it shows how valuable the Chinese government perceives TikTok's role in the United States to be. So at the end of the day, I think Ramya is correct that if this is going to be defended, it's going to have to ultimately be defended as a ban. But I also think Jennifer is correct in that there's a lot of cleverness in adding the divestment option. Great, great, thank you guys. So I do think, by the way, there is something that we haven't mentioned yet — just to add a little editorial commentary quickly. When Twitter changed hands, from the public shareholders and its prior leadership to Elon, that made a big difference, and Twitter's content changed. So we should think of the divestiture order itself as having speech implications even before we get to a ban. It's not just the ban — that you can't use this app — but that it has to be run by someone else. That's a pretty substantial First Amendment intrusion, in my personal view. Okay, I'll let anyone else respond to that if you want to — so that I don't get the last word here. 
Well, the only other thing I would point out about this is that we keep talking about it in the context of TikTok, and I think that's because TikTok is named in the bill that has this divestiture-or-ban provision that you mentioned. But if you look at that legislation, it's actually broader than just TikTok. So we have to think not only about what this means for the current debate, but what it means more generally for apps that could be determined by the government to fall under this category — and what that means more broadly for the way we see not only government interaction in this market, but also government intervention into the control of speech apps. Just to clarify what you were saying: what this bill says is that if there is another app that is owned by a foreign adversary, the president can unilaterally decide to ban it. There's no due process. The president has to give notice to Congress and notice to the public, and that is it. There is no... To clarify, when the bill talks about foreign-controlled apps, it means an app with ownership of 20% or more that originates from a foreign country that is labeled an adversary. That doesn't mean that the government of that country owns part of the app; it means that there are, in this case, Chinese citizens who might have 20% ownership of the app. So, in other words, a wide swath of companies might actually come into the scope of being foreign-controlled apps. And as Jenna points out, the president can unilaterally declare those apps to be a threat to the United States, with very limited challenges available, very limited publication of what the rationale is, very little scrutiny of the basis for that claim, and few avenues to challenge that designation. Okay. Let me move on to a conversation between a congressman and PBS News. 
The congressman says: we would never have allowed CBS to be owned by the Soviet Union in the 1960s. And indeed, of course, going back to the Radio Act and the Communications Act — Alan already referenced this in his remarks — we've had restrictions on foreign investment in broadcasting. How would this history of restraints on foreign investment affect the analysis? Alan, I'm going to turn to you first. Sure. So I think it should affect it somewhat, but not too much. And what I mean by that is that just because we've been doing something does not by itself make it constitutional. There are lots of things in American history that were done for a long time until the courts came in and said that's not a constitutional thing to do. So I don't want to overstate the importance of the history here. On the other hand, I don't think it means nothing. I think really good work has been done on this by Ganesh Sitaraman, who's a law professor at Vanderbilt. He wrote a wonderful paper on foreign control of platforms in the Stanford Law Review quite recently, and he goes through this history in a lot of detail. And what's notable, I think, from that history is that U.S. restrictions on foreign control of platforms are pervasive — not just in the communications industry, but also in banking and transportation. And within the communications industry, they go back quite a long way — more than 100 years, back to the original Radio Act of 1912 and then through the various communications revolutions of the 20th century: radio, telegraph, telephone, and so on. Now, again, I don't want to overstate the importance of that history, but I think it's important for at least two reasons — or look at it this way: I think it strengthens the case for this law in two ways. First, I do think that the way constitutional provisions are interpreted by the political branches is an important thing for courts to take into account. 
It is a kind of political-branch precedent that coexists in a certain way with judicial precedent. And I think courts should be appropriately cautious — not overly scared, but appropriately cautious — about interpreting the Constitution in a way that would not only strike down a law passed by the political branches, but would have the effect of declaring potentially 100 years of what I think were not particularly controversial restrictions unconstitutional. It's just something for the courts to think about: these are the considered judgments of the political branches for over 100 years. The second reason I think the history is important is that I do think it shows that restrictions on foreign control or ownership of communications infrastructure in the United States are compatible with a robust communications industry and a robust public sphere in the United States. Now, you could respond to that by saying: yes, but actually it would have been a good thing — a better thing — if, had the Soviet Union wanted to buy CBS in the 1960s or 70s, we had allowed it, because that would have made for a better communicative sphere. I happen to disagree with that, but you could make that argument. Nevertheless, I think the fact that our communications system has been quite vibrant despite a history of foreign ownership restrictions tells you something about this law. Again, I want to emphasize, I don't view the history here as in any way clinching, one way or the other, the constitutionality of a bill like this. Great, thanks so much. Jennifer? I think it's an interesting comparison, because I also think it shows another element of this discussion that's often underappreciated, and that is what this bill would signal for the regulation of the internet and technology more generally. 
Network television has been regulated much more heavily than the internet, to the points that Alan just made, because we saw spectrum as a scarce resource. We saw the airwaves as a scarce resource. So it did not have full First Amendment rights; there were more restrictions, more regulations on certain elements of broadcast television, as well as on certain elements of broadband today. But when it comes to apps, to the internet ecosystem, to the kinds of different platforms that we've had for speech online, we haven't seen those same restrictions. And part of what actually allowed the US to be a leader in the internet revolution was the fact that we didn't put many restrictions on the ability to come up with these creative ideas — if anything, we supported platforms in enabling more opportunities for user speech. And that's why the internet has been such a positive tool for user speech. And my concern, when we start to hear comparisons to broadcast television, is that what that's actually doing is opening the door — not only in this particular case, but more generally — to placing much more heavy-handed regulation on the internet, and particularly on online speech, which has been such a critical tool for so many people who in the broadcast era couldn't have their voices heard. Could I just jump in there? I do think that another relevant point of distinction is that those other frameworks were generally ex ante and sector-wide; they generally involved requiring compliance with sector-wide regulatory standards. And a reason I think that matters is the intent behind these frameworks. I think it's been made clear — and I don't know if you made this point — from statements by the bill's sponsors and supporters that a big motivation for them is that they don't like how TikTok is currently being moderated. 
And they think that an American company would moderate the app differently. They've made specific statements about concerns, not necessarily grounded in evidence, suggesting that the app is artificially amplifying pro-Palestinian content at the expense of pro-Israel content, and they would anticipate that an American company would make a different decision. And that kind of content-based purpose, viewpoint-based purpose, is generally one that we consider anathema to the First Amendment, and I think it distinguishes this case and this bill from some of those other frameworks. Thank you very much, Ramya. Now, let's get to the heart of the matter. Would this bill survive a challenge? Let's imagine that it's tested on intermediate scrutiny. And I recall Alan saying the level of scrutiny doesn't matter in practice, but let's imagine that the form of the opinion will follow the particular standard of scrutiny. In any case, would a TikTok law pass intermediate scrutiny? That is, does it advance important governmental interests unrelated to the suppression of speech, not burden substantially more speech than necessary to further those interests, and leave ample alternative channels of communication? In this context, I think we have to keep in mind the possibility of Project Texas as one of these alternative approaches that might be relevant to that question. So I'm gonna ask Alan to lead off there. Sure. So I think it would, though I don't pretend that this is any sort of obvious call. Let me distinguish between the two grounds on which this law generally is defended. The first being data privacy. The second being, you can call it propaganda, you can call it information control.
I'm not a fan of the data privacy rationale for this law, for reasons that I think have been put very nicely already, which is that the lack of any sort of data privacy protections at the federal level means that if the Chinese Communist Party wants data on US citizens, they will get that data whether or not ByteDance owns TikTok. And I will say, before I was a law professor, I worked in national security at the Department of Justice, and I'm the proud owner of multiple lifelong, paid-for subscriptions to identity protection services, courtesy of the federal government, because China so thoroughly stole my data and that of millions of my colleagues. So I don't find the data protection argument particularly compelling. I think it's a compelling interest, but given the real free speech stakes here, which I certainly don't deny, I'm not convinced that that rationale would work. So I think the law is best defended, and I think that frankly folks in Congress should more explicitly defend it on these grounds, and I think they are increasingly doing so, as avoiding Chinese interference in the information space. I do think that that is a compelling interest. And I do think that a divestment is quite substantially related to that interest. TikTok is owned by ByteDance, which, although it is a private company, given everything we know about the way the Chinese government operates, is ultimately under the control of the Chinese government, in particular Xi Jinping. And TikTok is an enormously important source, not just of fun cat videos, but of information, of news, for millions and millions of Americans, including young Americans. Now, again, that obviously raises profound speech issues, but I think it also very clearly shows the real geostrategic and national security implications. We can talk about Project Texas. I think that's an important issue.
And I do think that that's something that unfortunately policymakers have not given enough discussion to. And I think that if the Senate takes this up, and there's some reporting that they might actually, despite a month of being very quiet on the issue, there needs to be a record explaining in more detail than was the case in the House why Project Texas, or something like it, would be insufficient. I think there are things you can complain about with Project Texas. There are even ways in which Project Texas, given how much US government involvement it would introduce into the day-to-day workings of TikTok, in some sense creates its own First Amendment concerns, in a way that having a TikTok, or a TikTok competitor, cleanly run by a US company rather than by China or other foreign countries of concern, doesn't pose. So I don't wanna concede that Project Texas is unambiguously the right alternative solution here. But I do think that at the end of the day the concern over Chinese control of a profound piece of information infrastructure is certainly compelling. And I do think this is a reasonable way of dealing with that problem. Thanks, Alan. Let me come back to Ramya. Yeah, so I mean, I completely agree with Alan on the data privacy point. I think that the reliance on data privacy as a rationale is very weak. Protecting Americans' privacy is an interest of the highest order, but the way that you protect that interest is by passing a comprehensive data privacy law, not a TikTok ban, which is frankly not just unnecessary but ineffective in actually achieving that interest. And the reason is the one that Alan mentioned: the Chinese government simply doesn't need TikTok in order to be able to purchase or access Americans' sensitive data. It can easily get that data from data brokers and data aggregators on the open market. That's a really big problem.
And I really hope that Congress takes up that problem by passing a privacy law, but a TikTok ban isn't going to do very much there. On disinformation, again, I think that a ban on TikTok is going to be ineffective. The truth is that foreign governments, China included, don't need to own platforms in order to be able to spread disinformation. Many foreign governments have run disinformation campaigns on a variety of platforms, including American-owned ones. Obviously that was the case with the 2016 Russian campaign on Facebook. So I'm not sure that banning TikTok is really going to be effective in addressing that interest, even if it were a permissible one. I think that there's a real question mark over that, for the reason I mentioned before, which is that generally we don't like the government controlling the public's access to ideas and information media from abroad. But the other thing that I would mention is, I guess I have some trouble with this assumption that foreign speech is uniquely manipulative. Domestic speech can be just as manipulative, just as pernicious, but we generally wouldn't accept restricting domestic speech on those grounds, because we would rightly see the potential for government abuse, the potential for the government to use that as a cover to suppress ideas that it doesn't like. And as for the prospect of distortion on one of the major channels of communication that Americans are relying on, I mean, I acknowledge that it's a reason to be concerned, but it's not one that's limited to TikTok. Again, it's not that long ago that people were talking about fears that a company like Facebook could swing an election, right? I mean, there was a study, I think back from 2010, an internal experiment run by Facebook along with researchers at a university. And they ran an experiment on 61 million people. They showed them variations of a clickable "I voted" button.
And they concluded from that study that they had gotten an additional 350,000, I think it was, Americans to the polls, which is a significant margin and could be the margin of victory in the kind of close elections that we're having. So this isn't an issue that is limited to TikTok. I mean, I think this problem flows from having a centralization of power in a handful of for-profit companies. But I don't think we would accept the government imposing a divestiture or flat-out ban on any of these companies simply because they control major channels of communication. I think that there are other, better policy responses that get at this underlying problem of the concentration of power over our public discourse in a handful of platforms. Jen? I agree with most of what you said. I think ultimately, for this bill, whether it fails intermediate scrutiny or strict scrutiny or whatever standard is applied, it's that a ban is just not effective to solve any of the problems that this bill is purporting to solve. And we've talked at length about how these same problems are present on every other social media company. And so banning TikTok and TikTok alone is simply not effective. Can I say something just very quickly? So I agree with Ramya that this would not fully solve the problem. And I agree with Ramya that other platforms have this problem as well. But I do think it's important to distinguish between the scale of the problem on something like TikTok, which, again, is controlled by ByteDance, which could be controlled completely by the Chinese Communist Party if it wanted, and a platform like Facebook or X or YouTube or whatever, which, while it has potential to be a vector for disinformation, is not potentially under the control of a foreign government. And so I think just the scale of what you are potentially looking at is profoundly different.
And I don't think that a law has to completely solve the problem at issue to pass constitutional muster. And so I do think it's important to keep the scale of what we're talking about with TikTok versus other platforms in mind. So let's go to the question about the national security argument. Many courts, when reviewing this question in the TikTok bans that we've seen, and the government in all those cases submitted secret evidence that has not been made public to us. But the courts have repeatedly concluded that the government's claims were hypothetical, conjectural. And often, conjectural claims of harm are not enough to justify a free speech burden. And so I'm just wondering how this will play out. In this case, you've got the government's claims of possibility. Alan just said the scale of possible manipulation of the information environment in the United States by China, through the TikTok app, should potentially justify this. And so I want to come back to Alan. But I think that's the question here. That is, the ban is premised on what could possibly happen, where the government hasn't yet seemed to show that this is, in fact, occurring. Even though, as I did mention, Congressman Gallagher does believe it's actually occurring today. Let's go to Jenna. Yeah, I think we have not seen any evidence of a real threat right now. It is all hypothetical. And that hypothetical is not going to be enough to survive constitutional scrutiny. I think what's really clear to me about this is that the government wants to go after China. I think they see that as being politically popular. They think that going after China is how they are going to win the election. TikTok, of all these assets, is the one most associated with China. Therefore, we will ban TikTok, and that's how we win our election. I think that's going to backfire. I think half of the country uses TikTok.
And I read an article recently that said it's actually the children of members of Congress who have been the best lobbyists for TikTok, because they go and they beg their parents not to ban this app, which they use for so many protected speech activities. But yeah, we just simply don't have evidence that any of these threats are real, let alone rising to that imminent and severe standard that we think it'll need to meet to pass a prior restraint analysis. That strikes me that I'd love to know how many people in the audience have TikTok on their phones. Who has TikTok on their phones? So I'd say a distinct minority, maybe a third of you, have TikTok on your phones. So it's fascinating. So Alan, you posit a hypothetical. Is that enough to get through this substantial intrusion upon free expression? So I think it depends on what you mean by hypothetical. I fully concede that the nightmare scenario that is motivating supporters of this bill does appear to be hypothetical, which is of course what you would expect, right? If your concern is that this is kind of a ticking time bomb that China could use at a moment of high tension, you would expect China to wait to use it. But then you are accepting that that is hypothetical, and that is of course a weakness in the government's case. On the other hand, if you look at the component pieces of that concern, they're anything but hypothetical. So for example, we know that the Chinese government is extremely, extremely prickly, let's say for lack of a better term, in terms of trying to control the communications environment, not just within its own country, but outside, right? Whether this is Hollywood changing how it makes movies so that it can then play them in the Chinese market, or the Houston Rockets getting banned from Chinese television after the general manager tweeted something nice about the Hong Kong protests.
I think what's definitely not speculative is, again, China's willingness to throw its weight around to change how other countries view it. The other thing that's not speculative is the Chinese government's willingness to control its major private companies in an extremely heavy-handed way. So Jack Ma, the Chinese billionaire and head of Alibaba, which is a huge tech company, basically disappeared for a while after saying some not nice things about Chinese government control over the economy. During this disappearance, the Chinese government basically forced the sale of a bunch of Alibaba's assets. Jack Ma has now reappeared, and he seems to be happy with everything. So those things are not speculative. And so the question is, given what's not speculative, what's kind of the margin of additional speculativeness to the nightmare scenario? And again, I don't have an answer here, but I just wanna emphasize that it's not an either-or, that this is or is not a speculative threat. There's nuance there, and it's important. So coming back to the propaganda question, which seems to be the one that I think everyone agrees is the threat most likely to be of great concern here. From the 1960s, we have a Supreme Court precedent, and that is the case of Lamont v. Postmaster General. There, the court ruled unanimously that regulations that restricted the receipt of information from China infringed the recipient's First Amendment rights. That is, Mr. Lamont had the right to receive Chinese Communist propaganda, literally the Peking Review at the time. So do TikTok users have a First Amendment right to receive foreign propaganda in the kind of worst case scenario as you described it? And I'm gonna come back to Ramya to lead us off. Yeah, I mean, I think the answer is obviously, obviously yes, based on Lamont, which is a case from the height of the Cold War.
So in that case, you had a regulation that required Americans who wished to receive information that the government considered to be communist propaganda from abroad to send in an opt-in card to the post office saying, yes, please, I'd like to receive communist propaganda. And the court saw through this registration requirement for what it was, which was a very significant burden on the First Amendment interests of Americans to receive ideas, to engage with ideas from abroad. And even though this registration requirement fell short of a ban, the court understood that the requirement at issue would exert a very powerful chilling effect on Americans in this country and their right to hear. And so it struck down the law. And so, faithfully applying Lamont here, the TikTok ban is actually far more onerous. Obviously, there's the prospect of a ban. It wouldn't just be registering with the government to engage on TikTok, though obviously that would raise very, very serious First Amendment concerns. And the government doesn't argue that everything on TikTok is disinformation. Yes, it argues that China could one day hijack TikTok's algorithm to push disinformation. As Jennifer said, that is an unsubstantiated claim, and if the government has evidence of that, it should share it with the public. And this brings me to, I think, the central point here, which is that, even leaving that to one side, generally speaking, the court has held that the suppression of speech is not a permissible response to the problem of disinformation, and that there are less restrictive means that the government ought to use before it goes there. And Jennifer mentioned one of these before: disclosure. The Foreign Agents Registration Act requires agents of foreign powers to register.
And the court has said that that kind of requirement, requiring certain media to label themselves as propaganda, yes, it does raise First Amendment concerns, but it is a less restrictive alternative to a flat-out ban. And so if the government has evidence that TikTok is being used in the way it says it could be, it should engage in its own counter-speech. Generally speaking, the court has said the best answer to bad speech is good speech; it's not enforced silence. And so it's not at all clear to me why you would throw away that basic principle simply because we're dealing with foreign speech. And I think Lamont stands for the proposition that we shouldn't do that. So I'm just gonna pick up on one part of that, which is, in the 60s with Lamont, and then again in the 80s and 90s with the Berman Amendment in particular, you have this sense that the people who want to allow this speech are very confident in the American people. That is, we can receive foreign propaganda and manage. And this is what freedom means: it includes the freedom to receive foreign propaganda. There is a kind of robustness that is suggested. Now, Ramya, you did mention, well, we can get people out to the polls with these kinds of algorithmic shifts, but that's also different. Getting people to the polls is different than getting them to pull a particular lever for one candidate or another, or to suddenly become communist or anti-communist. There's a kind of sense today that manipulation is pretty easy to do, and that there might not be other means to respond to that manipulation. But let me turn to Jennifer to pick up on this question. So to a lot of what Ramya said, I want to in some ways just say plus one. I want to echo a lot of what she said, but I also think it's important to reframe this conversation around the speech rights of TikTok users.
Oftentimes when we hear this debate about banning TikTok, it's seen as a debate over banning a large social media company. And in fact, I would argue that's one of the problems with our debate over technology and technology policy in general right now: we're thinking about these as large companies and not thinking about the millions of users who have found opportunities to have their voice in this way. And particularly when we're talking about banning a particular platform, users have chosen that platform for a reason. They have many choices and they find that this is the one they like best, whether it's because of how they connect with an audience or because of the nature of how they consume content. There can be any reason that an individual user uses one platform over another, and many, if not most, users are using multiple platforms for their different speech needs. So I think we have to think about how this impacts users' speech, particularly in this context. And traditionally our response as Americans, as was discussed, has been that the answer to speech that we're concerned about, whether it's propaganda or disinformation or anything else, is to engage in more speech, to trust that our fellow Americans will ultimately land on the truth when we have those conversations, and that we don't ban the speech itself. Great, Alan? Yeah, so I'd love to respond to the great points that both Ramya and Jennifer made. So with respect to what Ramya was arguing, I agree that the case stands for the proposition that a blanket ban on foreign propaganda would be unconstitutional. I just wouldn't push that proposition farther than I think the court meant it. This is not a blanket ban on Chinese propaganda. This does not ban the Peking Review. It does not pose any obstacle to the Peking Review, which I think is called the Beijing Review these days.
If there really was a bill that tried to literally ban Chinese propaganda, that would be blatantly unconstitutional and I would very much oppose it. But I just don't think that's what this is. This is a bill that would potentially ban Chinese control of a communications platform. And I think that control is actually much more insidious than straight-up propaganda itself, because the whole point is, and wherever you fall on this debate, I think we all agree at this point that the power of social media to shape how we perceive the very truth itself is profound. And so think about the First Amendment value underlying Lamont, which is that we want more speech because in the marketplace of ideas that presumably will lead to more truth: people might agree with the propaganda and other people might disagree, and that will sharpen their own understanding of what's right. If you take the concern of control over the medium itself, and the ways that social media algorithms and content moderation can manipulate perception without people even realizing it, I don't think that the principle of Lamont gets you anywhere near this case. So it's not obviously relevant. Now, as to the point that Jennifer made about framing this as the speech interest of TikTok users, I completely agree with that, and I do think that's something that gets lost frequently. And here I think you have to make a guess, frankly, about what would happen if TikTok were banned. I do think that a lot of people who are on TikTok would find that extremely disruptive. I think that for the kind of content creators on TikTok who have created a lot of content and have invested a lot in that, it would be a huge blow, and I don't want to minimize that. But I don't think that TikTok is irreplaceable. I think that the idea of short-form video content with algorithmic curation is well understood, very standardized, and you have competitors.
I think Instagram Reels is basically a competitor for TikTok. I just don't see a realistic possibility that the sorts of affordances that TikTok provides would not be fairly quickly replicated. Again, with disruption, I'm not denying that fact. But I do think, when you're thinking about how this would leave the speech interests of US social media users, that within a fairly short time you would have just as much social media content, including TikTok-style content, as before. Okay, so I want to turn now to the audience for your questions. I'll turn to this gentleman up front first. There is a microphone there. Great. One of you mentioned that the law, although in some ways generic, also singles out TikTok explicitly in the text of the law. Does that itself raise a constitutional question, that it is not a generic act but singles out one corporation? So let's take a couple of questions, and we'll go with the woman right here in the blue. Yes. Thank you. I didn't think I would ever ask this question, but I have to. What is the difference between prohibiting us from screaming fire in a crowded theater, which I would think would produce a lot less harm to society, and basically allowing TikTok to produce something of a much bigger nature? Thank you. And let's take another question. Well, thank you. I actually have two questions. The first is, in 2020, after violent clashes between India and China, India immediately banned TikTok. I'm just wondering if our panelists can compare India and the U.S. in terms of their respective bans on TikTok. That's the first question. The second question is, there are speculations that in the aftermath of the TikTok bill, the U.S. might further target some other Chinese apps like Temu and Shein, which are, like Professor Chander mentioned, the two most influential Gen Z-style e-commerce platforms run here in the United States and from China as well.
So if that were the case, what would be the possible rationale for the U.S.? Just want to know our panelists' perspective. Great. Let's take some more and do another round. And let me begin with Alan. Alan, would you like to respond to any of those questions? The law singles out TikTok; this is much worse than yelling fire in a crowded theater (Jeff Kosseff is rolling over right now), so if we can regulate that, we should certainly be able to do this. India banned TikTok, what does that teach us? And is this just the first app, such that other heads will roll as well? Sure. So Jeff Kosseff is a friend of mine, and he would break off his friendship with me if I did not address the fire in a crowded theater point. I mean, the whole point of that issue is that sometimes you can yell fire in a crowded theater, and sometimes you can't. The whole question is, how imminent would the harm be? And I mention this because that's in a sense a lot of what we've been talking about here. I tend to think that the magnitude of that harm would be enormous, potentially, if China wanted to use the control it has over TikTok. But again, as we've talked about, in a sense that is somewhat speculative, right? More speculative, let's say, than what would happen if you literally yelled fire in a crowded theater and caused a stampede. And so the question here, and I have an opinion but I don't pretend to have an answer, is: is this threat too speculative? I don't think it is, because I think when you disaggregate it, it's no longer particularly speculative. But to be honest here, it is in fact the case that the nightmare scenario is still speculative. As to the last question about whether this is the only communications platform that might be affected: no, right? I mean, this could potentially apply to a bunch of other Chinese-owned or Chinese-controlled apps.
I do think that you have to evaluate the merits of each case somewhat separately. You have to ask questions like, what is the potential, whether for data privacy infringements or for misinformation, which may be different for TikTok than for WeChat than for an e-commerce platform? You have to ask questions, to Jennifer's point, about the rights of users. What are the alternatives here, right? Again, I tend to think that there are much richer alternatives to something like TikTok than perhaps to something like WeChat, which really is one of the main ways in which Chinese Americans communicate with, let's say, family members back in China. So I do think you would have to analyze each case a little bit on its own. And so I don't want my argument in defense of this bill as applied to TikTok to necessarily be extended to every other possible Chinese-owned or Chinese-controlled platform. Thanks, Alan. Let me turn to Jenna. I think the question that I want to answer is about banning other apps. And like I said earlier, what this bill does is it says that the president can unilaterally ban other apps that are partially owned by foreign adversaries, right? And that is ripe for abuse. I would imagine President Trump coming in, and if the president of a company says something that he doesn't like, he is going to ban that company in the U.S. And therefore any company that U.S. users rely on could be gone, you know, pretty quickly because it's done something to anger the president. I think this just puts too much control in the power of the president, with no one else involved. Again, there's no due process here. There's like no way to fight this. There's notice and that's it, right? And so I think that's the thing that we should be concerned about.
Those are really broad powers that we're bestowing, you know, just kind of at the end of a bill that's purportedly about TikTok. Ramya? Yeah, you know, I guess the first point I'd make in response to the question about India is I think that a real concern that a lot of us have about this ban on TikTok is that, if it passed, it would be a real gift to authoritarian regimes around the world that will use it as precedent to ban foreign media in their countries. You know, previously the U.S. has been rightly vocal when other countries have banned their citizens' access to foreign media and foreign social media, and I think it would no longer have the credibility to do that going forward. I mean, we've already seen, obviously, India has already banned TikTok, but very recently, I think Israel is planning to ban Al Jazeera on national security grounds. I mean, I think that these examples raise this broader concern about investing far-reaching national security discretion in the executive. Even if you trust the current executive, imagine a future executive that you might not trust. Yeah, Jennifer. I also kind of want to pick up on the two questions about India or other countries having banned this app, as well as what this might mean more broadly. I think in addition to what's already been said about the concerns of what this means for countries that are looking for an opportunity to perhaps force divestiture or a ban on other media apps, including potentially American media apps in some scenarios, you know, how would we feel about an authoritarian regime using this as an excuse to ban American apps? But I also think it's important to recognize that the First Amendment is largely unique.
What we will not tolerate with regards to government intervention into speech is distinct from other countries' views of free speech and free expression, and that matters a lot. I think that's a good thing, but it also means that we can't just say country X did it and it didn't face legal scrutiny there, therefore why is it a problem in the US? Because we do have this standard of the First Amendment that requires different elements when it comes to government intervention in speech. In addition to what's already been said about how this vests broad power in the executive, I think it's also important to note that this isn't just a China bill either. There are other countries that are named in the bill, and other countries could be added to the foreign adversary list. And so this also has to be thought about not only in terms of what it means for some of the apps that were mentioned, but what it means more generally for the way the US may interact with other apps. So one of the apps that I watch, that I think gets very little attention in the United States, is Telegram, which is a Russian-origin app and is very popular in certain parts of the United States. And I'm gonna go further than what Jenna said. I think Telegram often skews to the far right. You can easily imagine a president on the other side saying this is kind of supercharging hate speech, et cetera, in the United States, and is therefore a threat to national security in some way and should be banned. So I think there's that. And I certainly want to plus one the caution about borrowing from India's example. India's response came in the context of a literal battle in the Himalayas where Indian soldiers fell to their deaths. And so I think that was a totally unique environment in which to respond in different ways.
And India chose to respond with what the minister at the time called a digital strike against China, which is much better than a kinetic strike against China. So I want to say that was actually a moderate response in those circumstances. Okay, now I want to turn to Guzo and give him the floor to ask questions from the online audience. Yes, many, many callers with questions here. I'm gonna try to run through a few so you can have an idea of the commentary that is popping up online. First, in regards to the government's homework: any sense of why the Committee on Foreign Investment in the United States never ended up assigning public servants as board members of TikTok's U.S. Data Security entity as part of Project Texas? And in that sense, what is your view on efforts to make the propaganda and manipulation concerns more evidence-based, to actually show that case? Another question in that regard: how is this different from the Committee on Foreign Investment forcing a Chinese company, Kunlun, to divest Grindr in 2020? Other questions go to the idea of theater: can you explain a little better how this is a ban on free speech if you have other options to express your free speech online, using other apps and other mechanisms? Another question here: how do debates and Supreme Court cases like net neutrality and Chevron contribute to the evidence on these arguments, and where are the lines between public safety and health and free expression in this case? Finally, yeah, I think this is a good picture of the entire thing. Thank you for summarizing, very pithily, five big questions. So let me just summarize those questions again very quickly. The first being: well, CFIUS never followed through on certain aspects of Project Texas, under which there might have been public servants assigned to monitor USDS, TikTok's U.S. Data Security arm that controls the data and the algorithm.
And maybe that might have mitigated some of those propaganda concerns that were described earlier. So why didn't that transpire, and essentially largely address, or at least mitigate, those concerns? Second, in 2018 the US government ordered the divestiture by Kunlun of the dating app Grindr, and a year later Grindr was in fact sold to another company. So maybe that serves as a precedent; how is that different from what's going on here? Third, there's the theater question: there being other alternatives to express yourself than TikTok, perhaps a ban is not such a huge burden after all. Fourth, what does the net neutrality debate teach us about this question? And finally, what about the public safety and public health arguments against TikTok? Many of you watched the Shou Chew hearings; many of the members said this is leading our kids to drugs and other ills, that this is kind of a one-way ratchet, that the Chinese government is leading our kids like the Pied Piper to their doom. So I'll open up those many interesting questions to all of you. Let me begin with Jenna. Sure, I will tackle a couple of them quickly. One is, I don't think TikTok is that easily replaceable. I've talked to a lot of small business owners who say that they were never able to get their small business up and running with a different app. It just didn't work the same; they weren't able to reach the same audience. So for so many people, this is their actual livelihood that's at stake. And as Alan was saying earlier, sure, this would be a hit. But for people, when their income is hit, that is not something you can easily recover from, even if it's only a temporary hit, having no income for months. I mean, for some people, that is the difference between having a home and having food on the table, and not, right?
So we really shouldn't underestimate the fact that TikTok is not so easily replaceable, especially for the people who are relying on it for their income. The second question I want to tackle is the net neutrality question, because that's the other thing I've spent my whole week working on; it's also part of my portfolio. You know, what net neutrality does is say that internet service providers cannot treat different internet traffic differently. They can't decide what is sped up and what is slowed down; they can't block some things. And I mean, that's the same here, right? The reason net neutrality exists is to protect the ability of users to go and do what they want to do online. And it's the same right now: half of the country wants to use TikTok. And so a TikTok ban really goes against the core of net neutrality, which is to make sure that users can access the information online that they want to access. The difference from net neutrality is that there it's internet service providers dictating what constituents are accessing; in this case, it's the government, which obviously has even bigger implications for the First Amendment. There's one other question that I wanted to answer. Oh, harm to kids. So, you know, because I also work on a lot of online safety cases, one thing I'd say is that the government is not talking about the actual benefits of social media to kids. For all of the harms that exist, ultimately social media has changed the game for kids in a lot of really positive ways. You know, I'm at the ACLU. We have a big focus on equity. We have a whole team that focuses on LGBTQ issues. One thing that you'll hear a lot is people who say, if the internet had existed when I was a kid, I wouldn't have felt like the only gay kid in the world, right?
Kids who live in places where there aren't others like them have been able to find that information, find resources, explore themselves, find a sense of community, not feel so alone. Kids that are bullied will often say the internet is what they turn to, because that's where they find friends, that's where they find people. In so many cases, the internet is made out to be this horrible risk for kids. And sure, these harms, I will not deny them, they absolutely exist, but there's a lot of good that happens as well. And, you know, I think policymakers jump to regulating the internet because that's easier, right? It's much easier for them to regulate the internet than to invest money in education and digital literacy, than to invest money in law enforcement to go after people that are selling drugs and abusing children. There are so many other things that the government can do that are bigger and harder, and that's why it's easier for them to say, let's hold Facebook, TikTok, Instagram accountable here for every harm that's ever happened to kids. Thank you. Ramya? So I might try and take one of the questions that hasn't already been addressed: the first one, about CFIUS potentially getting the U.S. government more involved in auditing, in having oversight over TikTok's algorithm, as a way to protect against the possibility of Chinese co-optation and disinformation. So I'm not sure why that proposal didn't quite work out, but I'm pretty glad it didn't, because I think the prospect of really close U.S. government entanglement in reviewing, and having veto power over, the content moderation policymaking, but also the algorithmic decisions, of a social media company is a little bit concerning. You know, third-party oversight might ameliorate some of those issues, but having U.S.
public servants directly involved in that way, I think, raises some pretty significant free speech concerns. But this brings me to one of the other less restrictive alternatives that I think the government should very much turn to instead, to address the risk of disinformation not only from the Chinese government but from other foreign adversaries as well: requiring greater transparency from the platforms, and potentially imposing obligations on them to share data with independent researchers who study problems like the spread of disinformation on the platforms. You know, a point that the renowned cybersecurity expert Bruce Schneier made, in his affidavit in support of a lawsuit challenging the application of Texas' TikTok ban to public university faculty, is that researchers, if they had access to this data, would pretty quickly realize if the Chinese government were involved in such a monumental effort to hijack TikTok's algorithm to push disinformation. So that could act as a significant bulwark against that kind of effort. And so transparency would have a lot of, I think, socially invaluable uses, and that is one of them. Great, yeah. So the possibility of researcher access to data that can help us determine whether or not propaganda is being pushed would seem to be one of the measures that might mitigate these risks. I'm always surprised that that isn't one of the things rolled out as the first approach to these questions. Jennifer. So I want to turn to the CFIUS question a little bit, and then also get to the kids question. With CFIUS, and particularly with CFIUS and Grindr, I think this shows another unusual element of this bill: there was already a process in place for considering these questions.
The CFIUS process has been ongoing; it should be weighing the potential risks, considering what the alternatives are, and building a case for any steps that are needed with regard to national security concerns about this foreign investment. That's a little different from what we have here, which is, as we talked about in other parts of this panel, very vague national security concerns. In the case of Grindr, there were some very specific elements with regard to the LGBTQ community that were discussed in that decision. With this, what the concern actually is remains much more amorphous, as we've already said several times on this panel, which in some ways segues to the question about kids online. And I think we have to recognize that the TikTok debate, while it is its own debate, is also part of a broader one. And like Jenna, I'm very concerned with some of what we're hearing in terms of age verification, age-appropriate design codes, and all sorts of calls to ban children from social media, when there are so many beneficial uses and when there's not always clear evidence of the harm, let alone clear definitions of what harm we're trying to solve when we say we want to keep kids safe online. At the end of the day, when it comes to kids' online safety, I think it should be parents, not policymakers, making those decisions, because there are going to be so many different options and so many different households fitting so many different situations, and because we don't necessarily agree on what the problem is. That makes this a bad place to try to have one-size-fits-all rules. Thank you. Alan? I know we're out of time, so I'm happy to cede my time. Okay, thank you. Well, I don't think we managed to cover all of those wonderful questions, but thank you all. Please join me in thanking our group. We will post this on YouTube, and thank you all very much.