It's been so long since the majority of internet services were distributed among small, independently managed, internetworked services that it can be hard to imagine how an interoperable Facebook would work. This video is our design-fiction version of one hypothetical model for an interoperable Facebook, based on the laws working their way through the EU and US lawmaking processes and on proposals that others have come up with. We'll present four scenarios designed to help you get a feel for what life might be like in a future where people who grow dissatisfied with Facebook can leave without paying a high switching cost.

Scenario 1. Leaving Facebook.

Let's start at the beginning. What would it be like to leave Facebook and go somewhere else? Say someone you trust (your church, your local Kiwanis club, your hackerspace, or a startup in your town) has set up a new service. They downloaded some free, open source software, got some server space, and now you've got an account with them. Maybe it's free and run by volunteers, maybe you pay a monthly subscription fee, or maybe it's got ads. We all know how an ad-supported system works, right? Shadowy ad brokers follow you around the web, spy on you through your phone, buy up location data from your weather app and your car maker, and merge that with purchase data they buy from merchants and credit card companies. All of this goes into a dossier on you that is used to target ads. Under the ACCESS Act and the DMA, these kinds of ads would not be allowed for interoperable systems. The ACCESS Act bans commercializing user data, while the DMA requires companies to comply with the GDPR, Europe's strong privacy law, which bars these uses of data without opt-in consent. So what kind of ads would these services have? Contextual ads. These are ads based on the content you're looking at, not on who you are. Studies show that contextual ads make a little less money for publishers than "behavioral" ads.
"Behavioral" is the polite euphemism for surveillance ads. But behavioral ads only out-earn contextual ads because the surveillance advertising industry is able to push all of its costs (identity theft, the creepiness of being spied on all the time, surveillance-based discrimination) onto society, while pocketing all the profits.

Back to that service. Maybe it's run by volunteers. Maybe it charges money. Maybe it has contextual ads. Maybe it's a co-op. You get your account and then use its connection to Facebook to connect that account to your Facebook account: "We've had a request to link your Facebook identity to an account named Trisha Thomas on a new service, User Republic. If this is you, click OK. If this is not you, click here to report it."

After you link your Facebook account to the new service, Facebook has to figure out if your friends are OK with having their messages to you sent to the new service: "Your friend, Trisha Thomas, has moved to a new network, the User Republic. Are you OK with continuing to message her? Your messages and posts will be sent to User Republic's service for delivery to Trisha's feed and inbox. Click here to find out more about User Republic and read its privacy policy."

Of course, if all your friends start to leave Facebook, these notices could get annoying. If that's the case, you can just set a blanket permission: when a friend moves to a new service, ask me every time; block all requests; accept requests from friends on services I've already said yes to; or accept requests unless they come from one of the following services.

As your Facebook friends give permission to have their messages and posts forwarded to you on User Republic, those messages will show up in your inbox and those posts will show up in your feed. They'll be mixed with messages from other users of User Republic, and users on other services that connect to Facebook can message you too.
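The blanket-permission settings described above amount to a simple policy check. Here's a minimal sketch of how such a check could work; every name in it (the policy labels, the function, the service names) is hypothetical and illustrative, not drawn from any real platform's API:

```python
from enum import Enum

class Policy(Enum):
    """Hypothetical blanket-permission settings for friend-move requests."""
    ASK_EVERY_TIME = "ask"
    BLOCK_ALL = "block"
    ALLOW_KNOWN_SERVICES = "allow_known"
    ALLOW_UNLESS_DENYLISTED = "denylist"

def handle_request(policy, service, approved, denylist):
    """Decide what to do when a friend asks to keep messaging you
    from a new service: prompt the user, accept, or reject."""
    if policy is Policy.ASK_EVERY_TIME:
        return "prompt"
    if policy is Policy.BLOCK_ALL:
        return "reject"
    if policy is Policy.ALLOW_KNOWN_SERVICES:
        # Accept only services the user has already said yes to.
        return "accept" if service in approved else "prompt"
    # ALLOW_UNLESS_DENYLISTED: accept anything not explicitly blocked.
    return "reject" if service in denylist else "accept"
```

For example, a user who has already approved User Republic would see requests from it accepted silently, while requests from an unknown service would still trigger a prompt.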
User Republic will have its own algorithm for deciding what goes where in your feed, and it will have its own house rules about what content is allowed, what content is banned, and what content is subject to being flagged or tagged by moderators. That's why you picked it: User Republic explained its philosophy for moderating content, and it sounded good to you. You might change your mind about how you want your content moderated, or your preferences might evolve. That's OK. You can leave User Republic and move to Facebook, or to any other service that's federated with Facebook using the API.

That whole federation of services can contribute to your feed and your inbox, and the people running User Republic can block some of those services based on their own house rules. Say you're part of a community for people with cancer diagnoses that has decided it will not tolerate discussions of "alternative medicine." If there is another service out there, let's call it Conscious Community, that encourages that kind of talk, your community might choose to block messages from that server from appearing in its forums rather than having to moderate every post from Conscious Community: "Sorry, the members of Cancer Support do not allow participants from Conscious Community to join their discussions. If you still wish to join this forum, you will need to get an account with another service."

Conscious Community members who want to join the Cancer Support forums can open a second account with a service the Cancer Support members consider more reliable, switch to another service entirely, or start their own community where discussions of alternative medicine are welcome.

Scenario 3. Blocking objectionable material that Facebook permits.

The whole federation of services doesn't need to agree on a single definition of harassment, for example.
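The server-level blocking that Cancer Support uses could be implemented as a filter over incoming federated posts: rather than moderating each message from a blocked server, the community drops everything from that origin. This is a hypothetical sketch; the record layout and server names are made up for illustration:

```python
def filter_feed(posts, blocked_servers):
    """Drop posts originating from servers this community has blocked,
    keeping everything else for normal per-post moderation."""
    return [post for post in posts if post["server"] not in blocked_servers]

# A community that blocks Conscious Community never sees its posts,
# while posts from federated servers it trusts flow through unchanged.
feed = filter_feed(
    [
        {"server": "User Republic", "text": "Meeting notes for Tuesday"},
        {"server": "Conscious Community", "text": "Try this remedy instead"},
    ],
    blocked_servers={"Conscious Community"},
)
```

The design choice here is the one the transcript describes: blocking happens per server, once, instead of per post, over and over.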
If there's a server whose users engage in what they think of as vigorous discussion, but what you and your friends consider harassment, you can block that server. If you were one of the millions of Facebook users who thought Donald Trump should be banned for posting "when the looting starts, the shooting starts," you could use a server where he was instantly blocked for that post, rather than waiting months for Facebook to reverse its own decision to leave Trump's account active: "This message from Facebook user Donald Trump has been blocked for violating our community standards, and this user has been blocked from User Republic."

Scenario 4. Posting material that Facebook prohibits.

Federated servers can enforce their own rules based on filters, user reporting, user voting, or volunteer or paid moderation. That could include allowing links that are banned on Facebook. For example, suppose you want to have a discussion in which you post and critique Russian war propaganda that Facebook has decided to remove. Facebook itself might block those posts, but the other users on your server would see them. The same goes for federated servers that object to that sort of content; they can remove it under their own rules: "This message from User Republic user Trisha Thomas has been removed for violating our community standards. Click here to learn more."

That's the point of devolving moderation to communities themselves. You get to choose which service you use based on what you want to see and what you don't want to see. And you get to leave if you and your service don't see eye to eye. There's something seriously wrong with a technological world where billions of users feel they must use a service despite not liking or trusting it. The tech giants insist that we're just being dramatic, that we really do like being under their thumbs, and that the only reason they go to such great lengths to keep us from leaving is to keep us safe from our own foolish choices.
It's true that there are lots of ways that Facebook could be even worse. The company spends a lot of money and devotes a lot of resources to preventing fraud and promoting civil discussion. In other words, what you see represents their best effort at producing a service that you will stick with voluntarily. But Facebook is in a mess of its own making. It was Facebook's idea to try to moderate conversations in more than a thousand languages across more than a hundred countries. The company's grand experiment in moderation at scale has made a lot of people miserable, but its legal ability to inflict high switching costs has kept many of those miserable users stuck inside its walled garden.

Interoperability puts decisions about community standards back where they belong: with communities. Interoperability and federation permit users, not corporate executives, to decide who belongs in their groups and what behavior is and is not acceptable. Facebook definitely has plenty of room for improvement, and we hope they apply themselves to that project. For example, they could adopt the Santa Clara Principles, a document that human rights organizations, advocates, and academic experts developed for fairer moderation. But Facebook doesn't just need to work well. It has to fail well, too. Today, people who aren't served by Facebook's rules and rulings have two choices: like it or lump it. They can stay on Facebook, or they can go into exile from their Facebook communities. Interoperability creates a third choice: go somewhere else, but keep your relationships alive. That's a truly social media design, one that is oriented around the social lives of real people, not commercial surveillance or the social norms of a small group of executives in a Silicon Valley boardroom.