Hello and welcome to the first in a series of Gallup and Knight Foundation webinars focusing on the future of technology policy. 2020 has been quite a year. American life was deeply altered by the coronavirus pandemic, forcing us to isolate and change in ways we never thought possible as a society. Most recently, we've seen large numbers of Americans rising up to demand real change to systemic racism and police brutality toward Black Americans and people of color. Both events have had a common factor: technology and the connectivity it provides us, and its ability to disseminate the messages of health officials and racial justice activists faster than ever before. But those aren't the only voices these platforms have amplified. While positive messages of support have been able to spread, so have voices that counter those narratives with misinformation or hate speech, in addition to politically motivated or foreign-influenced fake news campaigns. Just a few years ago, Americans were overwhelmingly optimistic about the power of new technologies to foster an informed and engaged society. More recently, however, that confidence has been challenged by emerging concerns not only over the role of the internet and technology companies, but in particular social media and the role it plays in our democracy. Knight Foundation partnered with Gallup to explore how much this landscape has shifted. Together, we sought to understand key concerns of the American public around technology policy, from the spread of misinformation to election interference and data privacy. Now more than ever, a research-based understanding of how Americans view these issues is needed so that regulations can be designed to serve and address those public aspirations. Today, to get us started, our first speaker will be my colleague, Dr. Priscilla Standridge. 
She's a senior research consultant in Gallup's public sector division, leading both global and domestic research initiatives that focus on human rights and civil liberties. Since Gallup and Knight Foundation partnered in 2017, Priscilla has served as Gallup's lead investigator in our partnership. So without further ado, Dr. Standridge, the floor is yours. Thank you so much for that introduction, Mo, and I'm so excited to present these data and speak with everyone today. As Mo mentioned, I am very proud to lead the work that we do with the Knight Foundation on tech policy and the techlash around the media. It's part of a wider research project, and in fact a two-part series, that you can find on the Knight Foundation website. The webinar that we're hosting next week will continue this conversation, and we are really excited to kick off this week looking at how the American public feels about major technology companies and the way they've shaped their lives. So to get started, I think it's really important to set the context around how people's views of these technology companies have been changing. When we first polled about this topic in 2015, people, as you mentioned, Mo, were pretty optimistic about the role that these companies played in their lives and how they felt about them. But over the last five years, we've seen a pretty sharp decline in overall positive views, and, more strikingly, negative views have basically doubled. And this is really consistent with other data points that you can find in our report: people saying that technology companies do more to divide people than to unite them, that they do more to misinform people than to inform them. So there are definitely a lot of views people hold that support this general trend. And very similarly, we see that Americans feel that these internet and technology companies have too much power. A full 77% of Americans feel this way. 
And this is a very striking, strong majority of Americans that really hold a fairly negative view of these companies. And this will definitely be a topic of conversation we'll want to discuss today. But surprisingly as well, Americans are pretty evenly split on whether the government should intervene to actually break up these companies into smaller ones. This doesn't seem to be something that has large-scale support in the American public. And we definitely wanted to understand better exactly where the concern lies when people think about these internet and technology companies. At the top of the list, not surprisingly, we see that the spread of misinformation on the internet is definitely the largest concern. And it was definitely the largest concern in December, when we polled about these particular questions. Interestingly, this is actually a partisan viewpoint. It's more frequent for Democrats to feel concerned about this. They're also more concerned about hate speech online as well as foreign political interference. There is bipartisan concern, we could say, for the issue of privacy and personal data. That's an area where both conservatives and liberals feel equally concerned about the role that these technology companies play in their lives. So we wanted to ask Americans how they thought this misinformation online should be dealt with. And what we found is that a striking majority of Americans really do not trust social media companies to handle misinformation. A full 40 percent of Americans do not trust these companies at all to handle misinformation and make content decisions. And a further 44 percent do not trust them very much. So this is a striking majority of Americans that really don't have confidence in the way these companies are handling these content decisions. 
But on the flip side of that, Americans don't really see many alternatives, because they're also not very supportive of the idea of the government making these decisions. Overall, we see about 55 percent of Americans supporting these companies making these decisions, and only 44 percent of Americans would support the government getting involved. So where should this regulation of content really happen? According to Democrats, it really should be more up to the government. We do see a difference here in partisan views: more Democrats feel that the government should be involved in this. But it's really important to note that it's only just over half of Democrats who feel that way. So that's really not exactly a strong endorsement, even among Democrats. So where should the government get involved? That's really the question at hand here. According to most Americans, the government should have a major role in the case of online foreign interference in U.S. elections. That's a clear area where Americans think that the government should have a role and get involved. And then secondly on that list, we see yet again privacy of personal data. Again, that was a bipartisan concern, and so it comes in a close second in terms of an area where Americans would really like to see the government have a major role. And then it really tapers off. We have other things like the spread of misinformation, hate speech, or targeted political ads. And these are areas where it's really a little bit more, we could say, gray for most Americans. There aren't strong majorities saying that the government should play a major role, but they definitely think that the government should have at least a minor role in regulating some of this content. And to give you a little bit of a preview of what we have coming up next week as well, we really dug deeper on how companies should handle online content. 
And specifically here, we were asking Americans about different types of content and whether they thought it should never be allowed, allowed in certain cases, or always allowed. And, in a comforting way, child pornography was universally considered to never be allowed on the internet. But a close second — and you'll notice here that we collected this data in late March of this year — was misleading information about health and medical issues. So obviously there was concern about the pandemic and the "infodemic" of false and misleading information around COVID; we also wrote an article about this that you can find on gallup.com. This was clearly starting to get into people's minds and starting to be a major concern for people as something that should never be allowed. And a close third here, we see intentionally misleading information about elections and political issues. And I think that this is a very timely issue in a lot of people's minds given that this is an election year. It's something that a lot of people are starting to be concerned about. But the picture really starts to get a little bit more muddied and confusing for people, I would say, when we start to think about things like false statements about someone's reputation, or content like hate speech. The majority of people say such content should never be allowed, but more people think it should be allowed in certain cases. And then of course, at the bottom of the spectrum, we see content like nudity and foul language that people think should be allowed in certain cases and should be less controlled by these platforms. So hopefully this has piqued your interest for next week. I really invite you to register for that webinar as well. 
We will be digging deeper into some of these issues and also looking at online content moderation, content oversight boards, as well as Americans' views on Section 230. So I'll hand it back to Mo now. Thank you. Thank you, Priscilla. And before I introduce our panelists, I want to encourage those of you tuning in to include your questions in the question and answer box on your screen. We're going to have a few moments here after this discussion where we'd love to field some of your questions. So don't forget to do that. We'll try to get to as many of them as we can. Next, it's my pleasure to introduce two gentlemen who really need very little introduction to this audience. Commissioner Rohit Chopra was sworn in as a Federal Trade Commissioner in May of 2018. He has aggressively advocated to promote a fair and fully functioning marketplace through vigorous agency enforcement that protects families and companies from those that break the law. Commissioner Chopra joined the Treasury to launch the Consumer Financial Protection Bureau, then served as Assistant Director of the CFPB, overseeing the agency's student loan agenda. Sam Gill joined Knight Foundation in 2015 and oversees all of Knight's grant-making programs for community and national initiatives, journalism, and arts. Mr. Gill also oversees Learning and Impact, Knight Foundation's research and assessment program. Gentlemen, the floor is yours. Great. Thank you. Thank you so much, Mohamed and Commissioner. Thanks a lot for taking the time to join us for this conversation. I've got a little bit of background noise here, so I'll mute when I'm not talking. It'd be great, I think, as the audience today reacts to the polling and what Americans think, to help them get their minds around the part of our economy and our society that we're talking about. I think if you add in Amazon to the FAANG companies — Facebook, Apple, Netflix — you're talking about a huge percentage now of the S&P 500 in market capitalization. 
I think I saw somewhere that if you add in Nvidia, you've got a market cap of $6 trillion, which would make it one of the top five economies in the world. In an age of COVID, it seems to have become clear just how fundamental so many of these services are to our lives, to all of our political, social, and commercial transactions. To begin, could you just tell us a little bit about, as a regulator, how are you thinking about this sector as a part of our economy? Well, I think tech has been ingrained in the plumbing of our economy now for quite a bit of time. This is not new. We've seen how this was true in telecommunications generations ago, and a generation ago, obviously, with Microsoft and its operating system. This is not a new phenomenon, but really, this is a unique moment in history where some of the largest firms on the planet appear to be controlling some of the key vehicles through which technological innovation and commerce are occurring, and where they are both platforms and participants on those platforms. And that's something that's new and much more in our face. And the business models of these firms are now raising many more questions about whether their business incentives are aligned fully with a dynamic and innovative economy. So that's what we really have to confront, I think, as the public and as regulators. And to unpack that for us — the way that they're both platforms and also participants on the platform — what is it about the way that these companies operate that, if it's not totally new, is a really important factor for you and for all of us to be thinking about and considering? Well, I think what we saw in the slide deck touches a lot of pieces of that, but just one example is the advent of the surveillance-based advertising model, or behavioral advertising. This is a fundamental shift. Instead of advertising being contextual, or tied to the content, it's now tied to an individual person. 
So the results of that — and we can take the example of YouTube. YouTube has a business incentive for young people to become engrossed in content. And this is why we think the algorithms lead those young people and others to darker, more harmful content. The externalities of that are young people being recruited by terrorists or driven toward hate. So I think there's a lot of focus on what the content is, but really the issues are about the business model. And I think that's ultimately where we, as regulators, need to really focus. Obviously, you've raised now Amazon, of course, which has gained many, many billions of dollars in market capitalization over this pandemic. And there are questions about the ability of the participants on that platform — the merchants, the sellers — to really get a fair shake. I think those are key questions. We saw some policies by Amazon about how they were going to limit third-party sellers' inventory in their warehouses. And naturally, you have sellers asking questions: is that policy also optimizing Amazon's profit model? Are they more likely to substitute for goods that they control? Are they finding which goods and services are being sold by third parties and then figuring out how to take them for themselves? So this is ultimately the kind of conflict that you hear about more and more often, when someone controls the platform but is also a player in the game — how much does that game end up being rigged for them, if that makes sense. And to what extent is — I kind of want to ask two questions about the intrinsic links between the business model but also the size. So on the business model side, how intrinsic is this to the business model? The companies might argue, on the YouTube side, for example: look, low-intent broadcast advertising is an inefficient business. Knight Foundation comes out of journalism money. There's a reason that journalism has diminished so much. 
It's because low-intent, here's-the-Sunday-paper advertising is a lot less efficient for brands than high-intent advertising that knows what you or I might be looking for based on our behavior. So we're offering a more efficient service for the advertiser and, by definition, a more efficient one for the user, who actually wants to get the products that are helpful for them. And on the Amazon side, they might argue: look, we're creating a marketplace for third-party sellers that never existed before. People who maybe couldn't have even dreamed of owning a business can now be shipping products around the country, around the globe, thanks to our platform. And we help them find customers, because it's on the backbone of our technology platform that we can make those connections. So that's not to say — in listening to you, it's clear that these tools can be abused — but it strikes me that some of the companies want to argue: well, look, this is just about our internal policies or how we build our systems of recommendation; this isn't intrinsically linked to our business model. What do you make of that argument? Well, you know, you look at some of the business models, and some of them are really premised on this. Some people talk about the attention economy, or say there's an incentive to addict. I tend not to focus too much on that, but the reality is, as we keep hearing, when companies find out that their own algorithms and products are causing serious harm to their participants, but they turn a blind eye because they know that's what's profitable, that's a market failure. And that's something that we have to deal with. And you raised this question about efficiency of advertising. There are many, many ways that we have seen over the past 50 years that advertising and analytics can drive efficiencies and make people more effective at acquiring customers. 
But there's also a line between advertising and manipulation. And what we have seen in a multitude of contexts is the ability to essentially manipulate through this surveillance-based advertising model. So no one is actually arguing that you can't have advertising. No one's actually arguing that you can't use interesting quantitative approaches to try and make your advertising more efficient. What people are concerned about is, one, the whole basis of this behavioral advertising model, and the fact that only a couple of companies are ever going to be able to be viable competitors in it, because they have literally swept across the internet. Even if you don't know that you are using their platform, their technology is embedded across our society and the internet — whether it's connected devices, whether it's websites, what have you. So we don't necessarily have a choice of whether we are doing business with these firms. That's one. And two, on the whole issue of advertising, I don't necessarily agree with the framing you're putting on it, because there still will be advertising, and it still may be able to be more effective. I also want to raise, just as a separate point, that advertising as a social good is much more in question than it ever has been before. We used to think about advertising as one of the only ways that customers, potential customers, could find out about a product or service. Today, most people find out about products and services in some ways through their own initiative of searching online. And it's not clear that advertising has as much informational value as it once did. So many of the old economic models and theories have really just collapsed around that. And to what extent do you see — you've alluded to this too — to what extent do you see some of these challenges as intrinsically linked to this question of bigness, of how big these companies need to be, of how big their networks need to be? 
You could argue those two things are the same. I think sometimes the companies try to argue they're different. But how do you approach the question of bigness, which has been a really big feature of the discussion about this sector of our economy? Yeah, and it's a really big concern, how much power they have. You know, when you talk to a lot of advertisers, advertisers are really worried that the bigger these companies get, the less accountable they are. And here's one specific example. Many of those advertisers don't even really know, when they are doing a campaign — let's say on brand awareness or what have you — whether they are getting true engagement, or if much of their engagement is through fake accounts, bots, and other things that may look like real engagement. And the truth is that that bigness is really in tension with accountability and transparency. And many of those advertisers are struggling to figure out how they can engage in digital advertising while getting some real, honest metrics about what is going on. Because typically the advertisers were a little bit more on a level playing field with whoever they were advertising through. That was part of the bargain — they even had bargaining power. And it's a big worry. I think, look, it's not just in the advertising space, although that's huge, but also e-commerce, also the sharing and selling of goods and services. Right now during the pandemic, we're obviously seeing a huge reliance on food delivery apps. And of course, this is a scenario where the business incentive of some of these apps is going to be to figure out how to squeeze both sides of the market. And often that means squeezing the restaurants. So we have to look back at what really drove the real innovation in the digital economy. It was an open internet where you have lots of voices and very, very low barriers to entry. 
And you didn't have a company that could essentially copyright the internet or patent it so that they could tax and extract rents out of everybody. But that's, I think, what you're seeing in app stores and other platforms — some real worries about whether there is that level playing field. And I ask myself the question all the time: what technological innovation is being choked off without our knowledge because of this system? And I get really nervous when I hear people in venture capital, people in financing, say: I don't even want to invest in an innovation or a startup unless I can sell it to one of these big incumbents. That is hugely distortionary of the technology and innovation that we could get. And we should want more of it rather than less. To what extent do you think this is, in addition to being a function of the business trajectory of some of these services, a function of the culture? You and I were talking a little bit before we got started. There are some folks I've collaborated with in Silicon Valley who come from the 80s and 90s generation of innovation, when venture capital existed to take a bet on technologies and services that others wouldn't, and on innovation that wasn't going to happen inside an established firm. And they talk about how different the discourse, the ethos, was then from today, when Reid Hoffman, for example, talks about blitzscaling — the necessity of being able to rapidly grow to serve a global market if you want to compete — when the language is about how you capture "network effects" at scale. Have we just entered an era in which the business culture is one where domination is the only theory of how to serve a market and provide profit or return to shareholders? Yeah, I think it's a great insight in some ways, because it goes way beyond the digital economy. 
I think really from around the 90s onward, there's been a change in how the capital markets look at opportunities — the rise of the private equity roll-up, the changes in venture capital. All of these things, I think, are putting more energy into the platform play: how do I engage in what some people think is predatory pricing in order to knock out any competitors, create those network effects, and then monetize that over time? And that's a business model that is becoming commonplace, and ultimately at its core it is about getting advantages that others don't have. So for example, one of the things that is exploited is the ability to get online immunity where your offline competitors have liability. That's basically regulatory arbitrage. That's not really meaningful innovation and competition. So we're going to have to think about the intersection with, frankly, Wall Street and the capital markets — what incentives are driving that — and ultimately what tools we can have so that there is not a motivation for complete dominance but really a motivation to constantly innovate and compete to provide the best product and service in a decentralized industry structure. That's ultimately where in the US we have found ourselves to be most successful. It is why, after the government took action to unlock a lot of patents in the telecommunications area, we saw huge innovations across the sector. Once we took out some of the entrenched monopolies, we saw huge innovation, and that was different. The US took a very different approach from what the Europeans did with their more national-champion strategy, the government cultivating certain companies that were politically connected. 
Ultimately, this was the system that emerged, and I want us to keep embracing that system rather than a proprietary, closed system that allows companies to impose their private regulations and taxes — which is really how many would see app stores and other platforms that are embedded in the digital economy today. What are some of the directions that you as a regulator think we should be considering to get to that world? As you saw in the polling, folks are certainly skeptical. I think increasingly consumers, citizens are raising the questions that you're raising. They're beginning to perceive the effects of some of these structural features, yet they're ambivalent, as you can see in the polling, about blunt-force solutions like breaking up companies. What are some of the directions that you have at least been raising questions about from your seat as a regulator? There are a few things I would raise. One is that there should not be a special set of rules, or a special enforcement approach, or frankly a leniency for the largest, most dominant firms. The FTC did a settlement with Facebook, and it's been widely criticized for not fixing any of the problems. If it was a small tech company, boy, would we have gone after the CEO, and would we have really fundamentally changed that business model. But for Facebook, Mark Zuckerberg didn't even get deposed. They got to pay a fine, but it's not really going to fix the problem. So one is we need to fix the fact that small companies and startups get unbelievably penalized while you have some that kind of skate away with paying a fine. The second thing is that we can't just try to stop the bad behavior; we have to make it either impossible to do or very risky to do. This is about structural reforms in some ways. You raised this question about breaking up companies. 
And the polling I found very interesting — especially the wording of the polling: it said break up the companies into smaller companies. What's interesting about our history is that when there has been antitrust action, the focus isn't necessarily on size, but on structural separation. Where is the inherent conflict of interest that might exist in a business model? In media and telecommunications, we obviously have a strong public policy posture and tradition in America of promoting speech diversity, lots of voices, and broad access. But we also have a policy that, in order to preserve that, we really make sure that there's not a lot of mischief and shenanigans in the ownership of it. So that's why there is a history of certain cross-ownership bans that ultimately cut away some of those conflicts of interest. We see this in banking. We see this in other parts of telecommunications: separating certain types of business activities allows firms to focus on innovation in that line of business rather than trying to hijack other parts of the value chain in order to extract rents. So enforcement is one; thinking hard about the structure of these firms is another; but also thinking about the sources of power that are enabling some of that abuse. Is it about immunities? Is it about intellectual property abuse? Is it about the contract terms that we've all clicked through blindly, that they are able to impose without negotiation? That is really where we have to look very carefully to determine how they might be advancing their dominance to the detriment of startups, of innovation, of consumers, of other small businesses, and really of the social fabric itself. 
So let's talk for a bit about one of the sources of immunity that was inside baseball for a long time and has now entered the spotlight, which is Section 230 of the Communications Decency Act — which of course provides liability protections, with certain exceptions, for companies that provide digital services around third-party content that you or I as users might post. And we've seen, and our polling has shown it, people are increasingly worried about how content online is managed. As Americans have been using social media more, for example during COVID, they've also been really self-aware that there's a lot of misinformation. We've seen in our polling that people are concerned about hate speech online. At the same time, a lot of Americans, we found, really like the openness of the internet, to your point. They like the idea that there's an open, innovative ecosystem here that anyone can participate in, one that lives up to our highest ideals of a marketplace of ideas. And now we've seen during COVID some of these companies taking more aggressive action than they ever have around taking down or labeling certain kinds of health information or misinformation. And they've extended that in some cases to the president, which has led to an executive order that brought this issue out into the open and made some strong claims about the potential for censorship by these platforms — censoring views that they didn't like. The FTC was mentioned in this executive order. And so I'd love to know how you have been thinking about this particular question of the moderation of speech and content online, and what are the really significant consumer issues that you're thinking about in this dimension of the debate? Yeah, so I really tell people: follow the money. 
In some ways, I think the focus on content moderation is a little bit misplaced. We're always going to have bad content. We're always going to have certain things that people don't like. But really, who's profiting when you turn some of that content up in volume, on steroids? And what we actually see is that this content, which may be full of disinformation, misinformation, whatever — the long list — is also some really profitable content, because it deepens engagement. We know this from the psychological research. It gets people to click more. It gets people to consume more advertising. So ultimately, I really feel that this is an issue where people sometimes think, oh, the law is the problem, we need to fix it. Maybe it's the market and the business incentives that are the problem. We can't accept that technological fixes for content problems are impossible just because companies with no interest in fixing those problems haven't come up with them. Sometimes it is about the actual business model. And I've actually raised some questions — and I support the fundamental tenets of what led to the creation of Section 230, which many of you know was really thought up in the Prodigy-CompuServe bulletin board context. I've raised questions about whether the surveillance-based advertising model is really consistent with that Section 230 approach, because it is a totally different animal than the Prodigy-CompuServe framework. And I'm not sure that we want to get into a big, big debate about how these tech companies are going to moderate content, with all of them wanting to create their own mini-governments internally — and who knows who those are accountable to. I think we need to focus much more on the structure of their business models and whether those business models need to change, and whether some should only enjoy that immunity if they adhere to a certain type of business model. 
And others can choose a different one where they may not have all of those immunities. I think another key thing that we think a lot about at the FTC is that commercial speech shouldn't be considered the same as citizen speech when it comes to immunity. They are really different animals, and constitutionally they have different sets of protections. We really value citizen speech in a way that's sort of fundamental to a democratic system. Commercial speech is really a different thing, and it's not clear that it necessarily should enjoy all of those same immunities. So on this topic, but also some other aspects of the way these companies operate, one of the things you mentioned was how we've got a lot of dated concepts that we're trying to use to understand the modern economy, and one of the things we need to do is think about what concepts we need. I'm struck when I think about the speech issue, where you've got on the one hand this incredible amplification and community-building opportunity for the person purveying the hate speech, and on the other hand this incredible opportunity for harm and abuse for the person who might be on the direct or indirect receiving end. I similarly think about, in the Amazon case, the sort of amazing opportunity for the seller of goods, but that seller of goods who's being abused by Amazon may also be selling counterfeit goods, or an Uber driver who might be being abused by Uber but may not be a very safe Uber driver, or on the other hand has this incredible business opportunity. So what I'm building toward is: who do you think of as the consumer when you look at these sort of many-to-many services and, to your point about following the money, how they're structured?
The whole business model is built around these many-to-many connections, where the Uber driver and the rider in some sense are both the customer, and the harasser and the harassed on Facebook are sort of both the customer. How do you think about who the consumer is that you're looking to protect? Well, I really think about it as trying to protect and safeguard everyone who's playing by the rules and who is a participant. And multi-sided markets are not a new thing. You know, in banking, we know that banks take deposits on one side from savers and lend them out to others. And when you think through that, over time, we've really put in some safeguards to make sure it doesn't get totally abused in the ways that we saw 80 to 100 years ago. We don't, for example, allow CEOs of banks to sit on the boards of certain companies, because we want to create some of those separations so that those conflicts of interest don't get completely amplified. So I think, you know, you've raised a lot of examples, from Amazon and commerce to advertising. When a platform becomes a behavioral advertiser, that really speaks to your question. How do you separate commercial and citizen in some ways, and how do you classify some of that content written by a user versus the business incentive to totally boost it and monetize it? And multi-sided markets are really a huge part of the debate. I call some of this the middleman economy: on one side, you have a middleman who has created these network effects, and then they can monetize it in some ways by manipulating the participants to their own benefit, or by imposing taxes and regulations on them that allow them to really seek rents and, you know, take money out of the real economic activity.
So, you know, we know that you can create this multi-sidedness, or you can create these connections, Sam, through open, distributed networks. We've obviously seen that in the core of the internet, and, you know, if you look a few years ago, that was sort of the core of what podcasting looked like. There were a lot of these kinds of open, distributed systems where the middleman did not really have the ability to distort them for their own benefit. And that's why we talk a lot about technology solutions that make things more open and distributed, more open source. We talk obviously about, you know, how things become interoperable so that there is not the ability to choke it in one place. You know, there's a lot of discussion about contract clauses, like most-favored-nation clauses, that lock certain sellers or market participants in with a particular platform. Those all speak to the same thing: we want to enjoy all of the benefits of connecting people, buyers and sellers, people with like interests. That's all possible in a restructured industry that is not so distorted, suppressing market entry or amplifying certain voices over others for its own monetary gain. And we have to really, really reject it when we hear from big tech and their lobbyists that if we don't embrace that system, we're going to lose, you know, some of the technology gains, because that's just not true. We need to push back when we hear Sheryl Sandberg and others saying that America needs to essentially coddle large tech platforms in order to stay competitive with China. That's just not really the case. The reality is that we innovated the most, and our country was the most dynamic, when it was more distributed, when there was more competition, and when there were lower entry barriers. And there are just so many examples of this, from the advent of social media in South Korea to, like I mentioned earlier, podcasting.
This is the vibrancy we want in our economy and our democracy. We don't want a middleman and gatekeeper economy. So I want to ask you one more question before we open it up to the power of the internet to hear from some of those who are listening from all over, which is actually a question about your agency and about the regulatory apparatus in general. If I'm right, the FTC is about 105, 107 years old, right around there, and was itself the product of an innovation, right? Toward the end of the 19th century, you know, we saw the benefits of industrialization, food and medicine prices in particular coming down dramatically, but also the excesses, right? There was much more counterfeit food and medicine, and the kinds of conditions that we saw in the production of these goods were concerning to us. And sort of our governmental innovation was the administrative state, which, you know, starts with regulations over the meatpacking industry, adds agencies like the Food and Drug Administration, and ultimately adds agencies like the FTC, to be able to match the complexity of the industrial world that was precipitating. And I'd love to know, now that you're in this seat and confronting the vicissitudes of a new industrial shift along the lines of what you've just described, what do you think we need to see from either the FTC or our regulatory structure in general in order to keep our values alive in a moment in which the context of those values, the landscape of those values, is different in all the ways that you've described to us? Yeah, I think it's super important to be rigorous and vigorous.
So, one, it's important to actually really investigate and look, you know, behind the veneer of the talking points and the lobbying, and actually see how these business models and algorithms and everything are operating, figure out what the incentives are, and see how that comports with the law. You know, we have laws on the books that forbid certain anti-competitive conduct. We forbid certain abuses of consumers and small businesses. And on the other side of that rigor has to be vigor. It means enforcing the law, not for a headline-driven fine, not to, you know, just try and give some behavioral fix that is really difficult to enforce, but to actually discharge our duties to promote the real concepts of openness, innovation, and prosperity rather than rewarding rent seeking or rewarding those who want to cheat rather than compete. So protecting that competitive process is just so fundamental to making sure that we can keep leapfrogging ahead rather than being stuck in place. And, you know, again, we have this challenge all the time. People do see a lot of change and technological progress. It is very hard to know what we might have missed out on, you know, based on the current industry structure. And look, it's natural. In the capitalist system, through industrialization, changes in agriculture, all of the things you referenced, there's always going to be a sense from some incumbents that, rather than putting in more equity capital and, you know, more elbow grease into coming up with something that's meaningfully new, innovative, and disruptive, you think about how to protect your turf. And sometimes you do that through anti-competitive mergers. Sometimes you do that through imposing certain taxes and regulations to control the platform and the market to benefit yourself.
And that has to be core to what we have to do, as well as our state attorneys general, as well as my colleagues who I keep in close touch with around the world, who are all confronting these similar problems. So we have a big task ahead of ourselves. But, you know, that's what I think the public is counting on us to do: to really look at the data, look at the facts, really see if the law has been broken, and make the changes that are necessary to remedy it. And not just surface fixes, but ones that really kill the incentive to violate the law again. Fantastic. Thank you so much for this conversation. Muhammad, I'll kick it back to you. I'm sure there are some great questions coming from the audience. Absolutely. We have some amazing questions and comments. In the interest of time, I'm going to read them in groups to you all, and anybody feel free to tackle them. These first few are all policy- and regulation-related. Any insights on why Americans are more opposed to enforcing the Sherman Antitrust Act when Europeans seem to be more active in that field? Another question: there are many regulators and a lot of oversight of banks. Shouldn't the same basically be mandated for social media as well? And: what are the strategies for state government agencies who are moving from legacy systems to cloud-based systems? How can they address these aggressive tech business models? We could spend all day on those if we had it. But maybe on the first couple, do you want to comment on the differences between us and continental Europe from your perspective? And you've alluded to some of both the similarities and differences with other sectors, so maybe you could comment on that as well. Yeah, look, the question is about reluctance around government intervention. It's actually a really interesting question.
I think what we've seen over the past 40 years is essentially a narrative that has been created that government absence from a market is what leads it to prosper. And it's so funny, because when you think about the iPhone or you think about the internet, much of this is based on taxpayer-funded technology that was often made available to the public. When you think about how telecommunications prospered, so much of it was about unlocking intellectual property. So I really think a lot of people have naturally lost faith in government to police corporations, to actually hold anyone accountable. But the reality is that antitrust enforcement, I think, was actually a key pillar in making sure the United States was a technology leader, whether it's the Bell Labs consent decree, whether it's the AT&T actions, the Microsoft actions. I wouldn't want to live in the world we'd have today if Microsoft had essentially been able to monopolize the entire personal computer and internet space. I mean, we would not have what we have today. And frankly, there would be no Google or Facebook or Amazon if those actions had not occurred. So we have to think about, in our American tradition, what is it that we want to safeguard? And what do we want to crack down on in order to make sure that barriers to entry stay low, that you still can challenge an incumbent and disrupt them? And so that's why we need good federal and state regulators, and those around the world. I understand that we also need to think carefully and be focused on the right problems. We can't be distracted by sort of silly issues. We have to focus on the core of what we are going to do to make sure we keep our technological leadership. And I don't want to follow a Chinese system either, where there are essentially government-affiliated companies that are essentially arms of the state. So I think looking to our own American tradition and our values is a good approach.
And I'm not sure we have fully executed on or lived up to those values in the recent era. Let me ask you a follow-up question. I think the Microsoft case is sometimes overlooked. But, you know, my understanding from most accounts is that not only did we find the regulatory tools to ask the obvious question about whether an operating system and a set of software should go together, we could actually get our minds around the question. And as you point out, some of these questions are far more intuitive than we treat them. But also, my understanding is that that antitrust action really changed the culture of the place, by some accounts. You know, Brad Smith, who's the president, the number two in the company, his rise is in part explained by the way that he marshaled that company around the idea: hey, we're going to need to work with regulators to be a part of the modern innovation economy, we can't just continue to have an adversarial stance. And that became widely accepted within the company. You know, the same arrogance that some accuse Silicon Valley of today was how Microsoft was widely described in the 90s, and it has a very different orientation today, while, to your point, remaining a highly successful integrated enterprise. We don't see that in the current moment. Even the seemingly more aggressive regulatory action from Europe that this questioner is asking about does not seem, as far as I can tell, to have produced substantial culture change. I guess, one, is that your perception, and two, what's the key to that kind of culture change that moves you from adversarial to, in our American tradition, collaborative, where we actually are trying to build the right structure of sort of commerce and society? Yeah, it's funny you raise that. I don't think about the duality between collaborative and adversarial too much.
I think about incentives. And I think about rule of law and consequences. So we have to accept that there are going to be certain incentives of, you know, dominant firms. And we have laws on the books that relate to anti-competitive conduct and the like. And if they violate those, then they have to face real consequences. This is part of commercial regulation. So, you know, I don't think it's my role to police, you know, internal firm culture, but it is my job to enforce the law appropriately. Because I don't think people think it's right that when a smaller firm gets caught, they get nailed and strung up sometimes, whereas when the big guys get caught, it kind of feels like nothing happens. But you're right. The government actually taking Microsoft to court, deposing Mr. Gates, all of that, I think, was a very powerful action and a meaningful reminder that, you know, the citizens, and the government they elect to administer the country, shouldn't just roll over for a giant firm. Muhammad, do we have time for one more question? Absolutely. Two really great questions and comments. There's clearly no democracy in the information that is allowed on social media. How do we include the voice of users in deciding what is and isn't allowed online? And the last one: you mentioned the need for honest metrics. I think this was about when you were talking about bots driving up metrics for content. Do you have any idea what those honest metrics look like and how to make sure they are implemented for content creators? Well, those barely need interpretation. How do we get the voice of the user in the mix, and how do we measure success? What a great note to end on, Commissioner. Yeah. So look, you know, there's been a lot of talk during this conversation about policy and regulation.
And, you know, to reference the recent discussion we just had, Microsoft went through that process, but Microsoft is still one of the most valuable firms on the planet. And the answer to your question about social media and user participation, I think, is technology and tools: the ability for users to shape their own experience and to shape their own community without necessarily having it be dictated or distorted, you know, by a business model. And I think ultimately, that's really what we want. We want to cultivate more options, make sure that there are plenty of them that users can switch to, that lock-in is low, that there is the ability to have that vibrancy of new tools all the time, and that moving to them ultimately gives users, and sellers and everyone else, more bargaining power and more control to shape their own experience. And that's ultimately, I think, what will power our economy and our society. And, you know, at the end of the day, when it comes to honesty and accountability, you always worry that larger firms can sometimes feel that they're above the law. And when it comes to metrics, you know, we need to make sure that you can't keep succeeding if you're engaged in deception or not providing accurate information. We need to return to a place where companies can actually die when they don't innovate, when they don't provide a good experience or a good service, or when they really break the law. And that's okay. MySpace lost. That's really where we need to get to. Fantastic. Well, thanks a lot for taking the time for this conversation. We really appreciate it. Okay. Thank you so much. And thank you, Sam. Thank you, Commissioner. And thank you all for joining us for this conversation.
The link to the entire report that Priscilla presented is in the resource box on your screen. We also encourage you to sign up for our next webinar in this series where we'll continue to tackle these issues around access to information, First Amendment rights, and the American public. Thank you all for joining us and be well.