People don't like Facebook, but they use it anyway. Facebook users are upset that things they hate remain on the platform and get shoved in their faces every time they log on, and upset that things they love are blocked or downranked by Facebook's algorithm. They hate Facebook's remote, high-handed appeals process, which is either fast, cursory, and arbitrary, or slow, deliberative, and arbitrary. If you don't like that system, you don't get to change it. Facebook's friends and enemies alike say: if you don't like it, just leave.

A lot of people compare this to addiction, because "I hate this, but I just can't give it up" has a familiar ring to it. But we think that's wrong. People aren't addicted to Facebook. They're trapped by it.

That story about Facebook addiction is everywhere, and Facebook itself likes the addiction explanation. The company wants us to believe that we are powerless before its almighty machine-learning tools. Facebook's real business, after all, is selling ads, and those ads make a lot of money: $117,929,000,000 in 2021, up from $85,965,000,000 in 2020. Facebook's incredible ad revenue isn't just driven by volume, either. It's a very expensive place to advertise. Why are advertisers willing to pay more to put their ads on Facebook than on other platforms? In part, it's because they believe that Facebook ads work really well. Why do they believe this? Because Facebook tells them so. Facebook's whole pitch is that with enough data, enough machine learning, and enough secret sauce, it can figure out how to put exactly the right ad in front of exactly the right eyeballs and close the sale. Facebook really wants us to think that it has a mind-control ray, a means of bypassing our critical faculties. The Facebook pitch boils down to this: other ad platforms ask your customers to buy your stuff. We tell them to.
Extraordinary claims require extraordinary proof, and the fanciful claims about Facebook's ability to control our behavior by hijacking our reward systems do not rise to that standard. But if people aren't addicted to Facebook, why do so many of us say, "I hate it, but I can't give it up"?

Economists have an explanation. They blame network effects, a technical term of the trade. A product enjoys network effects if it increases in value as more people use it. Facebook certainly has network effects: nearly everyone who uses Facebook today joined because they wanted to socialize with the people who were already using Facebook. In other words, those people made Facebook more valuable. And when you and I joined Facebook, that increased its value too: we became a reason for other people to sign up. Network effects are real. They're indisputable. They're powerful. Products with network effects can grow fast. But network effects are only how Facebook got big. They're not how it stays big.

Look, Facebook is a digital product. It is made out of computers and software, and computers and software are flexible in a way that physical products just aren't. Connecting a kitchen mixer to a mismatched attachment is hard. You need a machine shop and a skilled machinist to do it right, and when you're done, anyone else stuck with mismatched mixers and attachments has to repeat the whole process. On the other hand, if, hypothetically, you wanted to plug a new service into Facebook (say, by automating a browser that logs in as you, gets all the posts waiting for you there, and puts them in an inbox on a competing service), not only could you do that with code, but you could share that code with everyone. Once one person figures it out, everyone can do it. What's more, Facebook could make this really easy.
All they have to do is supply programmers with a way to plug in new programs, what technologists call an API (application programming interface), and it would be simple to make new services work with Facebook. When a new thing works with an existing thing, that's called interoperability. It's an inescapable fact of computers, and it's why we've found so many ways to use these machines in such a short time.

Facebook got big thanks to network effects. It stays big thanks to switching costs. Switching costs: that's another phrase from our friends in the economics profession. It means everything you have to give up when you stop using a product or a service. If you quit Facebook, you lose the ability to keep up with the people who brought you there in the first place: family, communities, customers.

Where there's interoperability, digital technology users rarely face high switching costs. The ease of making something new that plugs into whatever you're dissatisfied with now means you can move from one product or service to another with hardly a break. Sick of Microsoft Office? No problem. You can get a Mac and read all your old Word, Excel, and PowerPoint documents with Pages, Numbers, and Keynote. Or you can switch to GNU/Linux and use LibreOffice, or get a Chromebook and use Google Docs. All these programs can read and write the same files, so you can go back and forth between them without getting locked in. The same goes for email. Don't like your email provider? Get another one and forward your old email to the new address. Hate your mobile carrier? We all hate our mobile carriers, but thanks to interoperable frequencies, protocols, and phone numbers, you can switch to a new one and your friends won't even know unless you tell them. There's no technical reason that leaving Facebook should involve high switching costs.
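To make the shared-code idea concrete, here is a minimal Python sketch of the kind of bridge one tinkerer could write and everyone could reuse. Everything in it is invented for illustration (the export format, the field names, the competing service); a real bridge would also need the browser-automation step to fetch the posts in the first place, which is omitted here.

```python
def post_to_inbox_item(post: dict) -> dict:
    """Translate one post from a hypothetical Facebook-style export
    into the inbox format of a hypothetical competing service."""
    return {
        "author": post["from"]["name"],
        "body": post["message"],
        "sent_at": post["created_time"],
        "origin": "facebook",     # lets the new service show provenance
        "origin_id": post["id"],  # lets replies be routed back later
    }


def sync_inbox(exported_posts: list[dict]) -> list[dict]:
    # Once one person writes this translation, everyone can run it.
    return [post_to_inbox_item(p) for p in exported_posts]
```

The point isn't the specific fields; it's that the translation layer is ordinary code, written once and shared.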
From a technical perspective, building a tool that lets you see the posts of the people you leave behind when you quit Facebook, and lets them see your posts, is completely possible. It's a little more complicated than sending a text message from Verizon to AT&T, but the principle is the same. Facebook could start making this happen tomorrow if it chose.

The reasons for Facebook's high switching costs, the reasons so many of us want to leave Facebook but feel like we can't, aren't just technical. They're also legal. Facebook has often claimed that creating a service that interoperates with it violates a thicket of laws: copyright laws like Section 1201 of the Digital Millennium Copyright Act, cybersecurity laws like the Computer Fraud and Abuse Act, and unfair, non-negotiable contracts. If that sounds like a lot of legal-technical mumbo jumbo to you, don't worry: it is mumbo jumbo, but it often works. Facebook has managed to scare off nearly anyone who might stand up a service that would let you leave Facebook behind while staying connected to the people who matter to you. When a brave startup, university, or individual tinkerer tries it, Facebook bears down on them with remorseless barrages of legal firepower.

If only there were a way to cut through Facebook's legal thicket, to blast a door in the walled garden and let Facebook's dissatisfied users escape without surrendering their social connections on the way out. If only there were a way to make sure that Facebook had users, not hostages. There is a way; more than one, in fact.

Here's the first way: we could reform all those laws that Facebook and other tech giants use to block interoperability, cleaning up our copyright, patent, contract, and cybersecurity laws to make sure they don't interfere with interoperability and lower switching costs. That's a great idea, but it's a long haul. Changing one law is hard. Changing many laws? That's very hard. But there's another way.
While we're working on getting the cruft out of 40 years of bad tech law, we can pass a new law: an interoperability law that tells big companies like Facebook they have to make it easy for interoperators to plug into their systems, freeing the users they hold hostage. New laws aim to do just that. The ACCESS Act, a proposal in the U.S., would require the largest tech companies to open up an API, one of those application programming interfaces that make life easier for programmers talking to a service, and give access to startups, co-ops, tinkerers, and nonprofits that want to offer alternatives to Big Tech social networks. A law in the EU, the Digital Markets Act, sets up a similar framework. It only covers messaging apps, probably the most difficult place to start, but it could be expanded to include social networks.

It's been so long since the majority of internet services were distributed among small, independently managed, internetworked services that it can be hard to imagine how an interoperable Facebook would work. This video is our design fiction: one hypothetical model of an interoperable Facebook, based on the laws working their way through the EU and U.S. lawmaking processes and on proposals that others have come up with. We'll present four scenarios designed to help you get a feel for what life might be like in a future where people who grow dissatisfied with Facebook can leave without paying a high switching cost.

Scenario 1: Leaving Facebook.

Let's start at the beginning. What would it be like to leave Facebook and go somewhere else? Say someone you trust, your church, your local Kiwanis Club, your hacker space, or a startup in your town, has set up a new service. They downloaded some free, open-source software, got some server space, and now you've got an account with them. Maybe it's free and run by volunteers. Maybe you pay a monthly subscription fee, or maybe it's got ads.
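Before walking through the scenarios, it may help to see how small the mandated API surface could be. This toy Python sketch shows one imaginable shape for it; neither the ACCESS Act nor the DMA specifies endpoints or wire formats, so every name and field here is hypothetical.

```python
class FederationAPI:
    """A toy, in-memory stand-in for the API a dominant platform
    might be required to expose to interoperators."""

    def __init__(self):
        self.linked = {}  # platform user id -> address on an external service
        self.outbox = []  # messages handed off for delivery elsewhere

    def link_account(self, user_id: str, external_address: str) -> None:
        # Scenario 1: a user ties their platform identity to a new home,
        # e.g. link_account("tricia", "tricia@user-republic.example")
        self.linked[user_id] = external_address

    def deliver(self, sender: str, recipient: str, body: str) -> dict:
        # If the recipient has moved, route the message to their new
        # service instead of a walled-garden inbox.
        route = self.linked.get(recipient, f"{recipient}@facebook.example")
        msg = {"from": sender, "to": route, "body": body}
        self.outbox.append(msg)
        return msg
```

A real mandate would add authentication, consent checks, and privacy rules on top; the sketch only shows why the core routing job is mechanically simple.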
We all know how an ad-supported system works, right? Shadowy ad brokers follow you around the web, spy on you through your phone, buy up location data from your weather app and your carmaker, and merge all that with purchase data bought from merchants and credit card companies. All of it goes into a dossier on you that is used to target ads. Under the ACCESS Act and the DMA, these kinds of ads would not be allowed on interoperable systems: the ACCESS Act bans commercializing user data, while the DMA requires companies to comply with the GDPR, Europe's strong privacy law, which requires opt-in consent for these uses of data.

So what kind of ads would these services have? Contextual ads. These are ads based on the content you're looking at, not on who you are. Studies show that contextual ads make a little less money for publishers than "behavioral" ads (the polite euphemism for surveillance ads), but that's only because the surveillance advertising industry is able to push all of its costs, identity theft, the creepiness of being spied on all the time, surveillance-based discrimination, onto society, while pocketing all the profits.

Back to that new service. Maybe it's run by volunteers. Maybe it charges money. Maybe it has contextual ads. Maybe it's a co-op. You get your account, and then you use the service's connection to Facebook to link that account to your Facebook account. "We've had a request to link your Facebook identity to an account named Tricia Thomas on a new service, User Republic. If this is you, click OK. If this is not you, click here to report it." After you link your Facebook account to the new service, Facebook has to find out whether your friends are OK with having their messages to you sent there. "Your friend Tricia Thomas has moved to a new network, User Republic. Are you OK to continue messaging her? Your messages and posts will be sent to the User Republic service for delivery to Tricia's feed and inbox."
"Click here to find out more about User Republic and read its privacy policy." Of course, if all your friends start to leave Facebook, these notices could get annoying. If that's the case, you can set a blanket permission: "When a friend moves to a new service: ask me every time; block all requests; accept requests from friends on services I've already said yes to; accept requests unless they come from one of the following services."

As your Facebook friends give permission to have their messages and posts forwarded to you on User Republic, those messages will show up in your inbox and those posts will show up in your feed. They'll be mixed with messages from other users of User Republic, and users of other services that connect to Facebook can message you too. User Republic will have its own algorithm for deciding what goes where in your feed, and it will have its own house rules about what content is allowed, what content is banned, and what content is subject to being flagged or tagged by moderators. That's why you picked it: User Republic explained its philosophy for moderating content, and it sounded good to you. You might change your mind about how you want your content moderated, or your preferences might evolve. That's okay. You can leave User Republic and move to Facebook, or to any other service that's federated with Facebook through the API.

Scenario 2: Blocking undesirable content from another federated server.

That whole federation of services can contribute to your feed and your inbox, and the people running User Republic can block some of those services based on their own house rules. Say you're part of a community for people with cancer diagnoses that has decided it will not tolerate discussions of "alternative medicine," and there's another service out there, call it Conscious Community, that encourages that kind of talk.
Your community might choose to block messages from that server from appearing in its forums, rather than having to moderate every post from Conscious Community. "Sorry, the members of Cancer Support do not allow participants from Conscious Community to join their discussions. If you still wish to join this forum, you will need to get an account with another service." Conscious Community members who want to join the Cancer Support forums can open a second account with a service the Cancer Support members consider more reliable, or they can switch services altogether, or they can start their own community where discussions of alternative medicine are welcome.

Scenario 3: Blocking objectionable material that Facebook permits.

The whole federation of services doesn't need to agree on a single definition of, say, harassment. If there's a server whose users engage in what they think of as vigorous discussion but you and your friends consider harassment, you can block that server. If you were one of the millions of Facebook users who thought Donald Trump should have been banned for posting "when the looting starts, the shooting starts," you could use a server where he was instantly blocked for that post, rather than waiting months for Facebook to reverse its own decision to leave Trump's account active. "This message from Facebook user Donald Trump has been blocked for violating our community standards, and this user has been blocked from User Republic."

Scenario 4: Posting material that Facebook prohibits.

Federated servers can enforce their own rules, based on filters, user reporting, user voting, or volunteer or paid moderation. That could include allowing links that are banned on Facebook: for example, you might want to host a discussion in which you post and critique Russian war propaganda that Facebook has decided to remove. Those posts might be blocked on Facebook itself, but the other users on your server would see them.
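Mechanically, the blanket permissions from Scenario 1 and the server blocks in Scenarios 2 and 3 are the same thing: a filtering rule applied where federated services hand messages to one another. A minimal Python sketch, with every policy name and field invented for illustration:

```python
# Hypothetical user-level policies, mirroring the blanket-permission
# options described above.
ASK, BLOCK_ALL, ACCEPT_KNOWN, ACCEPT_UNLESS_BLOCKED = range(4)


def accept_message(msg: dict, policy: int,
                   approved: set, blocked: set) -> bool:
    """Decide whether a message from a federated server is delivered
    to a user's feed, given their policy and server lists."""
    origin = msg["origin_server"]
    if policy == BLOCK_ALL:
        return False
    if policy == ACCEPT_KNOWN:
        return origin in approved          # services already said yes to
    if policy == ACCEPT_UNLESS_BLOCKED:
        return origin not in blocked       # per-server house-rule blocks
    return False  # ASK: hold for a manual prompt rather than auto-deliver
```

A service like the hypothetical Cancer Support community would simply run every incoming federated message through a rule like this, with "conscious-community.example" on its block list.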
The same goes for federated servers that also object to that sort of content. "This message from User Republic user Tricia Thomas has been removed for violating our community standards. Click here to learn more." That's the point of devolving moderation to communities themselves: you get to choose which service you use based on what you want to see and what you don't want to see, and you get to leave if you and your service don't see eye to eye.

There's something seriously wrong with a technological world where billions of users feel they must use a service despite not liking or trusting it. The tech giants insist that we're just being dramatic, that we really do like being under their thumbs, and that the only reason they go to such great lengths to keep us from leaving is to keep us safe from our own foolish choices. It's true that there are lots of ways Facebook could be even worse. The company spends a lot of money and devotes a lot of resources to preventing fraud and promoting civil discussion. In other words, what you see represents its best effort at producing a service that you will stick with voluntarily.

Facebook is in a mess of its own making. It was Facebook's idea to try to moderate conversations in more than a thousand languages across more than a hundred countries. The company's grand experiment in moderation at scale has made a lot of people miserable, but its legal ability to inflict high switching costs has kept many of those miserable users stuck inside its walled garden. Interoperability puts decisions about community standards back where they belong: with communities. Interoperability and federation permit users, not corporate executives, to decide who belongs in their groups and what behavior is and is not acceptable. Facebook definitely has plenty of room for improvement, and we hope the company applies itself to that project.
For example, it could adopt the Santa Clara Principles, a document that human rights organizations, advocates, and academic experts developed for fairer moderation. But Facebook doesn't just need to work well; it has to fail well, too. Today, people who aren't served by Facebook's rules and rulings have two choices: like it or lump it. They can stay on Facebook, or they can go into exile from their Facebook communities. Interoperability creates a third choice: go somewhere else, but keep your relationships alive. That's a truly social media design, one oriented around the social lives of real people, not commercial surveillance or the social norms of a small group of executives in a Silicon Valley boardroom.