So, the talk today is loosely adapted from a paper that my colleague at the Electronic Frontier Foundation and I co-wrote called Privacy Without Monopoly, and I'll drop a link to that in the Discord, but if you want to go have a look for it, it's pretty easy to find. My colleague Bennett Cyphers did the heavy lifting on this, and I wanted to acknowledge his contribution to this talk. I also want to note just how totally cool it is that EFF has a technologist whose surname is actually Cyphers, and it was the name he was born with. Talk about nominative determinism. So, that paper, Privacy Without Monopoly, addresses itself to a subject that has come into more focus lately, something that people are paying more attention to, and that's monopoly itself. And monopoly has become endemic cross-sectorally, in every industry. You know, a brief list of some of the industries that are either globally or within North America dominated by five or fewer firms would include pharmaceuticals, pharmaceutical plans, pharmacies, the U.S. health insurance system, major appliances, athletic shoes, defense contractors, film production, cinemas, music production, publishing, bookselling, stationery, eyeglasses, Lasik, enterprise software, car parts, glass bottles, rental cars, hotels, aviation, rail, logistics, accounting, mattresses, oil, beer, spirits, champagne, cowboy boots, candy, professional wrestling, and of course the internet. As Tom Eastman, a software developer from New Zealand, once tweeted: "I'm old enough to remember when the web wasn't just five giant websites filled with screenshots of text from the other four." Now, the critics of technology are apt to lean into tech exceptionalism to explain why it is tech got so monopolized, how it is the web became five giant websites filled with screenshots of text from the other four.
They say that technology is different because it has network effects; because algorithmic persuasion is so powerful that once a customer has had the powerful big-data mind-control rays turned on them, they can no longer even muster the will to change platforms; or that tech bros are somehow a uniquely wicked species of executive relative to all the other industries, and that if it weren't for the absolute evilness of tech bros, tech would be a better place. Now, hilariously, tech exceptionalism is also what big tech leans into when it wants to simp for itself and explain why it shouldn't be held responsible for the fact that it's become so monopolized. It says, again, that network effects are what drove the monopolization of big tech, that first-mover advantage is what caused these firms to become dominant, and of course that there is something different about tech CEOs, that people like Mark Zuckerberg or Steve Jobs or Sergey and Larry were kind of once-in-a-generation towering geniuses who were destined to rule over us all, and it's not their fault that they were born so smart. But we got here not through any kind of exceptional tech monopolization, but rather through a very unexceptional and common form of monopolization. About 40 years ago, we experienced a sea change around the world in how we enforce competition law. Up until about 40 years ago, the dominant form of competition law started from the presumption that monopolies were bad because it was bad to have a monopoly: when there was a monopoly, you ended up gathering too much power into too few hands, and no matter how benevolent that dictator was for some people, for other people, they would not be suitable.
And because they were unaccountable and were not subject to any kind of democratic control, if what the monopolist wanted wasn't what you wanted, then it would be bad for you and bad for society, and over time the interests of monopolists would diverge from the interests of the people they dominated. And so we would have to stop monopolies because they were monopolies. As a result, competition law generally prohibited firms from merging with major competitors. It prohibited firms from buying up small competitors before they could grow to become a threat. And it prevented firms from doing things to corner markets by expanding into vertical lines of business, competing with their supply chain or buying up their supply chain. So for example, rail companies were historically prohibited from buying or starting freight companies, because if you were a freight company and you depended on the rail company and you were competing with them, well, I don't actually have to tell you how that works, because anyone who's ever sold an app in an app store run by Google or Apple that competes with an app that Google and Apple make knows exactly how that works, right? You cannot compete with the platform owner itself in a line of business it wants to enter. But 40 years ago, we had a sea change, and it began with a guy named Robert Bork. Most Canadians, I think, haven't heard of Robert Bork. But if you have, you probably know him as the guy who Ronald Reagan put up for the Supreme Court that the Senate wouldn't confirm, because he had been Richard Nixon's solicitor general and had been directly responsible for Richard Nixon's most egregious crimes. But Robert Bork's career didn't end when he was bounced from the Supreme Court. He became a very important economic theorist. He was part of what was called the Chicago School, based out of the University of Chicago.
And they were really the architects of a lot of business reforms in the Reagan era and since. And his particular contribution was a deep rereading of antitrust statutes from American law and of the theory of antitrust and competition and monopoly. So Bork, he had this very weird idea that if you went and read the four major US antitrust statutes starting with the Sherman Act, and if you read them really closely, like kind of QAnon closely, where you were like really leaning into maybe hidden meanings and stuff, that you would find that despite the fact that the people who wrote these bills told you exactly why they wrote them, and it was because they didn't think monopolies were good, that really secretly, they all kind of liked monopolies because monopolies were efficient and produced productivity gains, and that the only monopolies they didn't like were the monopolies that produced consumer welfare harms. And what that meant was that monopolies that raised prices on consumers were bad, and every other kind of monopoly was basically just fine. Now, this raises a really interesting question. If a firm wants to merge with a major competitor or buy a smaller competitor or expand into an adjacent line of business, how do you know before they apply for the merger whether they'll raise prices? Well, Bork had an answer. He said that you could build these very abstract mathematical models and that the models would tell you whether or not the resulting merger would produce price rises. And if two firms did merge and then down the line prices went up, how would you know whether it was because they had a merger to monopoly and not because energy prices went up or labor prices went up or other key input prices went up or maybe like the moon is in Venus, like how do you know that the price went up because of monopoly? And again, he said you could do this with a model. 
Now, the thing was, only Robert Bork and his friends at the University of Chicago knew how to build these models, and only they knew how to interpret them. And it turned out that if you paid them to make a model to predict whether or not your monopoly should be allowed, the model always said that the monopoly would be fine. And if you paid them to make a model to determine whether your monopoly was doing something no good, every time they could tell you that the monopoly was not the reason that prices had gone up. So they became a kind of priesthood, like sorcerers: whenever the king wanted to do something that the courtiers frowned upon, the king could bring in the sorcerer, and the sorcerer would sacrifice a goat in the middle of the king's court, and they would read the guts of the goat, and they would find in the guts of the goat that the will of the divinity and the spirits was that the king press on with his plans. And if one of the courtiers was bold enough to say, you know, I don't see that in the goat's guts, they would go, look who thinks he can read the guts of a goat. And so for 40 years, we had mergers to monopoly, we had the acquisition of nascent competitors, we had anti-competitive pricing, we had all kinds of things that produced market concentration, not just in tech, but in every other sector as well. You know, if network effects were the reason that we have monopolies, you would expect that monopolies would be confined to industries with strong network effects, but professional wrestling, which went from 20 leagues to one league, does not enjoy network effects. It just has a billionaire who owns it, who had ready access to the capital markets, who bought every other wrestling league until there was just one major league. So all monopolies are created in the same way, from mergers and other predatory conduct, but each sector that becomes monopolized has different ways of demonopolizing based on its technical characteristics.
So tech does in fact have network effects. The more people who use a piece of technology, the more valuable it is. And you may find that you use a piece of technology you don't want to use because of its network effects, because of what you get access to if you join the technology. You may find yourself using Facebook, even though you hate it, because all of your friends or relatives are on Facebook, and especially under lockdown, where all of your communication is virtual, you have to join Facebook because your friends are holding you hostage. And it may be the case that your friends are there because you're there, so you're holding them hostage too. That's what network effects look like. But network effects have an easy remedy, which is lowering switching costs. If you find that you're locked into buying printer ink for your HP printer, because they figured out how to sell printers at a subsidy and then sell ink for more than you would pay for vintage Veuve Clicquot, you can throw away your printer; but if someone can reverse engineer the security chips that they use to stop you from refilling or using third-party ink, then you can escape their ecosystem. You can use your own ink. You can leave Facebook, and if there's an interoperability layer that allows you to continue to talk to your friends on Facebook, well, then the network effect is turned on its head. Because instead of Facebook being the place that you have to go to because all of your friends are there, Facebook becomes the place that people want to compete with, because they can tempt people to leave by promising them they can still talk to their friends without having to live under Mark Zuckerberg's judgment. Interop is not a unique characteristic of digital technology. There are other technologies that are interoperable. You can buy anyone's tennis shoes and you can put anyone's shoelaces in them.
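To make that hostage dynamic concrete, here's a toy model, with every number invented for illustration, of how an interop layer flips the switching calculation: a user's payoff is the friends they can reach minus how much they dislike the platform, and interop changes exactly one term, whether friends on the old platform stay reachable.

```python
# A toy model of network effects vs. switching costs. All numbers invented.
# A user's payoff from a platform: friends reachable there, minus how much
# the user dislikes the platform itself.

def payoff(friends_reachable, dislike):
    return friends_reachable - dislike


def will_switch(friends_on_old, friends_on_new,
                dislike_old, dislike_new, interop):
    # With interop, a switcher can STILL reach friends on the old platform.
    reachable_after = friends_on_new + (friends_on_old if interop else 0)
    return payoff(reachable_after, dislike_new) > payoff(friends_on_old,
                                                         dislike_old)


# You hate the incumbent (dislike 5), but all 100 friends are there;
# the challenger has only 10 of your friends, and you like it (dislike 0).
hostage = will_switch(100, 10, 5, 0, interop=False)  # stays: held hostage
free = will_switch(100, 10, 5, 0, interop=True)      # leaves: interop wins
```

The point of the sketch is only that the network-effect term, not the user's preference, is what pins people to the incumbent; once interop keeps old friends reachable, the preference term decides.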
But the degree of interoperability that is intrinsic to digital technology is really different from any other industry, because of that Turing completeness, that universality, that thing that is our curse when we talk about information security: we don't know how to make a computer that can run all the programs except for the bad ones. Every computer we make is functionally equivalent to the other ones, in that it can run any valid program, any program that can be expressed in symbolic logic. And because of that universality, it means that you can always plug something into something else. So here's an example of what it looks like when an industry does not have the benefits of digital technology for interop. About 150 years ago, Australia, like Canada, was in its early days, and it was fragmented into a bunch of independent colonies, which they called states, unlike our provinces. And each of those early states had its own governance, and each had its own would-be rail baron. And each rail baron worked with the local governments to ensure that their own gauge of rail would be laid at a different width from all the other railroad tracks in Australia. They just never had a Canadian Pacific Railway project. They had their own baby CPRs. 150 years later, no one has figured out how to solve what they call the gauge muddle. More than 300 designs for rail cars that can drop one set of wheels and retract another set of wheels and hop from one set of tracks to the other have been tried. None of them have worked. Instead, the only thing that reliably solves the gauge muddle is tearing up thousands of kilometers of rail track and laying it at a standard gauge. Now, there are only six Australian states, and there are six Australian rail gauges. If I asked you to write an interoperability layer that could parse six file formats, you would not find this very challenging.
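And that really is about all the code it takes. Here's a minimal sketch of such a layer, with six invented formats standing in for the six gauges; every format name and schema here is made up for illustration, and each parser lands on one common representation.

```python
# A toy interoperability layer: six incompatible "formats" (standing in
# for the six Australian rail gauges), all parsed into one common record.
# Every format and field name here is invented for illustration.
import csv
import io
import json


def parse_json(blob):                       # Format 1: JSON object
    rec = json.loads(blob)
    return {"name": rec["name"], "tonnes": float(rec["tonnes"])}

def parse_csv(blob):                        # Format 2: CSV with header row
    row = next(csv.DictReader(io.StringIO(blob)))
    return {"name": row["name"], "tonnes": float(row["tonnes"])}

def parse_pipe(blob):                       # Format 3: "name|tonnes"
    name, tonnes = blob.strip().split("|")
    return {"name": name, "tonnes": float(tonnes)}

def parse_kv(blob):                         # Format 4: "name=...;tonnes=..."
    fields = dict(p.split("=") for p in blob.strip().split(";"))
    return {"name": fields["name"], "tonnes": float(fields["tonnes"])}

def parse_fixed(blob):                      # Format 5: 10-char name, then tonnage
    return {"name": blob[:10].strip(), "tonnes": float(blob[10:])}

def parse_tsv(blob):                        # Format 6: tab-separated
    name, tonnes = blob.strip().split("\t")
    return {"name": name, "tonnes": float(tonnes)}


PARSERS = {"json": parse_json, "csv": parse_csv, "pipe": parse_pipe,
           "kv": parse_kv, "fixed": parse_fixed, "tsv": parse_tsv}


def interop(fmt, blob):
    """Translate any of the six formats into the common representation."""
    return PARSERS[fmt](blob)
```

For example, `interop("pipe", "coal|12.5")` and `interop("kv", "name=coal;tonnes=12.5")` come out identical; that is the whole trick that 300 variable-gauge rail car designs never managed.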
When it comes to digital, we just write a little interpretation layer that sits between one old piece of technology and a new piece of technology, or two mutually incompatible pieces of technology, and we pass data back and forth between them. And that means that interoperability poses a way to escape the monopoly of big tech, to allow people to walk away from big tech without losing the things that network effects make them stay with big tech for: to stay in touch with their friends, to continue to read their old documents, to continue to access the services that they were accessing, but on a take-it-or-leave-it basis or an a la carte basis. You can get access to your friends on Facebook without having to subject yourself to Facebook's terms of service or rules, by finding another service. So let's pause for a minute here as we talk about demonopolizing big tech and ask why we want to demonopolize big tech. Why do we want to nerf the power of big tech? I think the answer is not efficient markets. It's wanting to have self-determination. It's wanting to allow people to make their own choices about how the technology in their life will work. When you have a monopolist, it is able to dominate your relationship with your friends. It's able to dominate what answers you find when you search for answers to questions, what apps you can install, where you can get your devices repaired, when you have to throw them away because the manufacturer has declared them irreparable. And if you want to have self-determination, you have to have privacy. Not because big tech has figured out how to use mind-control rays to convince us that up is down and north is south and so on. Mark Zuckerberg did not create a mind-control ray to sell your nephew fidget spinners, only to have Robert Mercer steal it and use it to convince your uncle to become an anti-masker QAnon racist.
What those companies are able to do is not destroy our judgment, but rather target us at a fine grain and expose us to constant surveillance that makes us self-conscious about who is looking at us and what they're thinking of us, how we're being judged both by algorithms and by human watchers. And when you're watched, you can't be your authentic self. We live in a moment now where, in living memory, people who did things that today are considered not just lawful but just and beneficial were considered criminals: being gay, being in love with someone whose skin is a different color to your own, even partaking of cannabis. These were all felonies within living memory. And the way that we got to a world in which we were able to make social progress was not by stripping people of the right to have a private realm where they could really be themselves. It was by giving them that private realm, so that they could form alliances with other people who felt the same truth that they did, so that they could form alliances with people who were their friends who didn't share that same secret, and so that they could gradually make the social progress that we now see around us. And unless you think that in 20 years your grandkids are going to say, tell me again, grandma, tell me again, grandpa, how it was in 2021 that we got everything right, then you should really believe that there are people you love and people whose happiness really matters to you who have never been their authentic selves with you, and who may never be their authentic selves with you unless they can have a private realm to develop that authentic self. And if they are never their authentic selves with you, they will always have a sorrow in their heart, a sorrow that they'll take to their grave, because you never knew them, and they loved you and you loved them. So we need a private realm, because private realms are key to self-determination.
Now, monopoly allows surveillance, because when firms have tendrils all across the web, it's easy to spy on us, and because when firms are very concentrated and have a lot of money, they can figure out how to lobby to make privacy law weak or nonexistent. And it drives surveillance because it drives other actors in the marketplace to participate in surveillance. You know, how many newspaper editorials have we read railing against online surveillance, on a newspaper's website that has 75 trackers on it? The reason they have those trackers is because the only way they can survive in the market is by conceding to the terms demanded by monopolists that require unchecked data acquisition from their customers. Now, ironically, every time we have a privacy scandal, the platforms assert not less control over our private data, but more. Every time there's a Cambridge Analytica, Facebook doesn't give you more ability to control how your data is used on Facebook. It gives you less ability, by shutting down its APIs. It's already hard enough to leave those big platforms. We are already stuck in these mutual hostage-takings. But when platforms tout themselves as guardians of our privacy and say that's why they can't let us take our data elsewhere or communicate with them from off-platform, they're engaged in what Bruce Schneier calls feudal security. Actually, a historian reader of mine pointed out that this should really be called manorial security, because feudalism is a little different from manorialism. But Schneier is good at crypto, so we'll let him come up with the name: feudal security. And the model of feudal security is pretty straightforward. It starts from the premise that out there, there are bandits, right? Bad guys who want to do terrible things to you and your data. And it's really hard to defend yourself against those bandits.
But there are warlords who have built giant and impenetrable fortresses, warlords like Google, Apple, Facebook, even Salesforce. And if you only surrender your free will and volition to those warlords and move into their fortresses and live by the rules they set, they will hire fearsome mercenaries to defend you against the bandits that are outside those walls. Now, that works well when your interests are the same as the warlord's interests; then the warlord will in fact protect you. Most of us cannot protect ourselves as well as Facebook's data security team or Apple's data security team or Google's data security team. Even the most elite among you watching this talk now have to sleep, while they can run three shifts, so they can always be on duty to defend our rights. But as soon as those firms' interests diverge from ours, as soon as the warlord decides that we are not someone they want to protect, but rather someone that they want to expose to risk, those walls that keep the bandits out keep us in. So after the various election misinformation scandals, for example, Facebook pledged that it would no longer allow paid political disinformation, and that it would label all political ads and put them in an observatory so that third parties could check their homework and see if they were actually living up to their promise. Except it turns out that Facebook doesn't do a very good job of this, and there's tons of paid political disinformation on the platform. So some researchers at NYU, in the engineering school, built a thing called Ad Observer, which is a browser plugin that Facebook users use to voluntarily scrape any ads that they see on Facebook and drop them into Ad Observatory, which is an open repository of Facebook political ads and other Facebook ads that accountability journalists, security researchers, human rights workers and other academics can look at to see whether or not Facebook is living up to its promise.
And Facebook has threatened to sue the New York University engineering team on the grounds that they are exposing Facebook users to privacy risks. And Facebook says: although you say that your code doesn't gather any personally identifying information, and although that code is free and open-source software that can be audited by third parties, we have been deputized to babysit our users and to keep them safe. We are the warlord here, and we don't think that you are doing a good job. And so we are going to kick you off of our platform, using the courts to get rid of you. This happens with the other big tech giants too, right? So Apple, for example, was just the subject of a New York Times exposé that claimed that they had backdoored their servers for the Chinese government, so the Chinese government could spy on human rights activists and other dissidents and incorporate them into its totalitarian program of torture, of forced labor, and of other illegal forms of punishment to maintain the dominance of the Chinese Communist Party. Even if you don't believe that story, the one thing that Apple does admit it has done is remove all the working VPNs from its App Store at the behest of the Chinese government. Not because Apple doesn't want to defend its users, but because, as between defending its users and maintaining access to its factories in China, it's decided that its users and their interests can be jettisoned. And because you're locked into Apple's App Store, you can't install an app that Apple hasn't approved. So you can't install an app that has working crypto that isn't backdoored. You can't install a VPN that works. And Google does this too. Google gathers your data in all kinds of ways.
And one of the things that we discovered last summer, during the Black Lives Matter uprising, is that Google was responding to illegal reverse warrants, where the government would come and say, give me the names of everyone whose device said that they were located in this place. And rather than stick up for their users, they were just dumping dox on everybody who was at or near a demonstration. Thankfully, we're finally seeing through feudal security. We're finally ending the false binary between warlords and bandits. There are lots of people who are saying that maybe what we need to do is not make ourselves beholden to warlords, but rather delegate decisions about what is and isn't acceptable to do with our private data not to corporate boardrooms, but to democratically accountable states, and to enact strong national privacy laws with private rights of action, so you can sue people who abuse your privacy. This movement has been a long time building. Mike Masnick from Techdirt published a very influential paper called Protocols, Not Platforms, which was then picked up by the CEO of Twitter, Jack Dorsey, who used it as the basis for something called Project Bluesky. And all of these envision a future in which companies like Facebook, Apple, and Google have to contend with new market entrants, whether they're startups, other large firms, tinkerers, cooperatives, nonprofits, or other hybrid entities, that can come in and offer users more self-determination. Now, the front line of the interop wars, the place where we see the interop proposals coming in, is proposals for mandates from governments.
So last year in the United States, they had the ACCESS Act, which would demand that the platforms expose APIs to third parties so they could plug into them, with a layer of referees who would mediate between new market entrants and companies like Facebook and Twitter and Google, to decide whether or not those new market entrants could be trusted to handle your private data safely. There's also the DMA and the DSA in the European Union, and the Competition and Markets Authority report from the United Kingdom last year, all of which envision some form of mandated interoperability. And those mandates address three kinds of interop. The first is data portability, and that means that you can take the data that the company has gathered on you and bring it with you somewhere else. And the big platforms object to this by saying, well, it's not really your data: when they tricked you into giving them your address book, which had the phone numbers and addresses and unique identifiers of all of your friends, you were giving them someone else's data, and so they can't let you take it with you when you go somewhere else. Now, this is a very thorny question, right, the question of whose data that is and when and how you should be allowed to take it. And we should resolve that question by having privacy laws that make adjudications about when that data can and can't be moved, not by asking Mark Zuckerberg or Tim Cook or Sundar Pichai whether or not they think it's a good idea for you to do something that would really hurt their bottom line, because we can never know whether their answers are in service to privacy or in service to their own parochial needs. The second kind of interop that's proposed in these rules is back-end interop, where you have to expose an API.
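One reason the "whose data is it" objection doesn't have to block portability entirely: the two categories are mechanically separable. Here's a toy sketch, with an entirely invented export schema, that partitions a platform export into data you created and data about other people, so that a privacy law, rather than the platform, can rule on the second bucket.

```python
# A toy data-portability filter. The export schema is invented; the point
# is that "your data" and data about OTHER people that you uploaded (e.g.
# your address book) can be separated mechanically, so a privacy law can
# adjudicate the second category instead of the platform deciding.

FIRST_PARTY_KEYS = {"posts", "photos", "settings"}      # data you created
THIRD_PARTY_KEYS = {"address_book", "tagged_friends"}   # data about others


def split_export(export):
    """Partition an export into (portable now, held for adjudication)."""
    portable = {k: v for k, v in export.items() if k in FIRST_PARTY_KEYS}
    held_back = {k: v for k, v in export.items() if k in THIRD_PARTY_KEYS}
    return portable, held_back
```

The design choice worth noticing is that the policy lives in two declarative sets that a regulator could publish, while the code that applies the policy is trivial.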
The version that was proposed last year in the US started with the idea that these companies are the product of lots and lots of mergers, and so they already have APIs to tie together the different services that they've bought; you know, Facebook has to have an API to connect WhatsApp and Instagram and Facebook's back ends. So they say, okay, whatever you've got is probably as good as you know how to make it, right, you won't have nerfed it yet, and so we're going to demand that you expose that to third parties, because it's going to be as good as it can get. And they rely on some form of intermediary who sits between the dominant platform and the new market entrants to make judgment calls about whether the privacy rules are going to be respected. And there are lots of other people who have proposed versions of this. Francis Fukuyama is leading a group at Stanford that has proposed something that they very confusingly call middleware to do this. Daphne Keller, an academic also at Stanford, has proposed something she calls magic APIs. And the third form of mandated interop that we've seen in these various proposals is what's called delegability. That's when you delegate choices about your interaction with a service to a third party that interacts with the service on your behalf. So one form of delegability might be to have a service that can access the UI of someone else's platform on your behalf: to autopilot a headless browser that goes in, navigates through the service's UI, gets your waiting messages, gets your waiting interactions or updates, scrapes them out, puts them somewhere else where you can interact with them in another context, and then autopilots your responses back out to the rest of the service.
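Here's a rough sketch of that delegation loop. A real agent would autopilot a headless browser; in this sketch the platform's UI is a hypothetical injected "driver" object (all of its method names are invented) so the control flow, log in, scrape the waiting messages out, push replies back in, is visible without any browser machinery.

```python
# A toy "delegability" agent. The driver interface is hypothetical: a real
# one would drive a headless browser through the platform's UI.

class DelegatedAgent:
    """Reads and replies on a platform on your behalf, from outside it."""

    def __init__(self, driver):
        self.driver = driver  # knows how to navigate the platform's UI

    def fetch_messages(self):
        # Log in, scrape whatever is waiting, always log back out.
        self.driver.log_in()
        try:
            return list(self.driver.scrape_waiting_messages())
        finally:
            self.driver.log_out()

    def send_replies(self, replies):
        # Autopilot each reply back into the platform's own outbox.
        self.driver.log_in()
        try:
            for reply in replies:
                self.driver.autopilot_reply(reply)
        finally:
            self.driver.log_out()


class FakeDriver:
    """A stand-in driver so the sketch runs without a real browser."""

    def __init__(self, waiting):
        self.waiting = list(waiting)
        self.sent = []
        self.logged_in = False

    def log_in(self):
        self.logged_in = True

    def log_out(self):
        self.logged_in = False

    def scrape_waiting_messages(self):
        msgs, self.waiting = self.waiting, []
        return msgs

    def autopilot_reply(self, reply):
        self.sent.append(reply)
```

The structural point is that the platform only ever sees what looks like a logged-in user; everything the user actually touches lives off-platform, in another context.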
But there are other forms of delegability. Right to repair rules are a kind of delegability: you have states like Massachusetts that have passed ballot initiatives that say that manufacturers have to let third parties repair your device. So you delegate to that third party, to an auto mechanic or to a mobile phone repair service, the right to look inside your device and make changes to it. These are good. But the devil is in the details with these kinds of mandates: who gets to decide what is and isn't okay for privacy, and whether or not the mandate is over-specified. We've seen proposals for mandates that say, we're going to tell you what kind of code you have to run to interact with third parties, and you're not allowed to change that code. So imagine it's two in the morning and your pager goes off, and you find out that this code has got a vuln in it and that some bad guy is exfiltrating terabytes of data from your service, and you can't patch that hole unless you can wake up a bureaucrat in Ottawa or Washington, DC and convince them that you're changing that API not for anti-competitive reasons but to protect your users. So what we really want in these mandates, as they mature, is for them to take account of that failure mode, and rather than over-specifying how the interoperability should work by specifying what code you can use, instead to specify outcomes: you have to accept certain API messages, and they have to return a certain result, but how you get there is your own business. Now, mandates are a great start to interoperability, but they're only a start, because they're brittle, and the reason that mandates are brittle is that companies cheat like crazy. So, I mentioned before the Massachusetts right to repair ballot initiative: in 2012, Massachusetts went to the ballot box and voted by an overwhelming majority.
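An outcome-based mandate like that is essentially a conformance suite. As a toy illustration, with every message and result invented, here are two implementations with completely different internals that both satisfy the same mandated outcomes, which is exactly what leaves each of them free to patch a vuln at two in the morning.

```python
# A sketch of an outcome-based interop mandate: the rule pins down which
# API messages must be accepted and what must come back, not what code
# runs in between. All messages and results here are invented.

MANDATED_CASES = [
    ({"op": "get_profile", "user": "alice"}, {"user": "alice", "ok": True}),
    ({"op": "get_profile", "user": "bob"}, {"user": "bob", "ok": True}),
]


def conforms(handler):
    """True if handler yields the mandated result for every mandated message."""
    return all(handler(req) == want for req, want in MANDATED_CASES)


# Implementation A: trivial, stateless.
def handler_a(req):
    return {"user": req["user"], "ok": True}


# Implementation B: different internals (a cache), same outcomes.
_cache = {}

def handler_b(req):
    key = (req["op"], req["user"])
    if key not in _cache:
        _cache[key] = {"user": req["user"], "ok": True}
    return _cache[key]
```

Either handler could be rewritten overnight, for security or anything else, and the mandate would neither notice nor care, so long as `conforms` still passes.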
I think it was 89 percent, to force auto manufacturers to expose the repair codes that traveled around the wired networks in cars, the CAN bus, so that independent mechanics could effect repairs. And immediately after that was turned into law, the major auto manufacturers in the United States started to move their diagnostic data off the CAN bus and into a wireless network within the car. The wireless network was not contemplated in the rule, and so for another eight years the auto manufacturers were allowed to freeze out independent mechanics, until last year, when finally there was another ballot initiative that said, you know, for avoidance of doubt, we meant the wireless network too. And in those eight years, when the auto companies were able to get away with subverting the mandate, they drove a lot of independent mechanics into bankruptcy, and if those mechanics wanted to continue to have their jobs, they had to get jobs working for authorized service depots. They just devastated the sector. And we don't want that. We do not want a future in which a company can cheat, drive all the nascent competitors out of business, scare any kind of venture capitalist or state agency or would-be cooperative founders away from going head to head with them again, because they see how the cheating goes, and then just get to hum along, continuing to dominate things. So there is a way to make these mandates robust, and that's to create some kind of consequences for cheating, some immediate self-help consequences for cheating. Those self-help consequences come in the form of what we call adversarial interoperability, or competitive compatibility. So, interoperability: there's lots of ways to accomplish it, right? You can use standards. You can have a regular gauge of rail that lets everyone run on the same tracks. Everyone can form their HTTP headers in a way that complies with a W3C or an IETF standard. But then there are other ways of doing
interoperability: an improvised, hacker's way of doing interoperability. This is adversarial interoperability, and it has a very long and honorable tradition. You know, when Mark Zuckerberg opened Facebook to all users and expanded it from just academic users to the general public, he had a really big problem, which is that the general public was already using social media. They were using MySpace, which was owned by this rapacious Australian billionaire named Rupert Murdoch, and Rupert Murdoch wasn't going to let those people go easily. Rather than asking MySpace users to come hang out on Facebook and just wait, doing nothing, until their friends noticed that they'd left MySpace as well and came to Facebook to hang out with them, and rather than organizing, like, an international everybody-leaves-MySpace day, Mark Zuckerberg made a little bot, a competitive compatibility bot, that would go to MySpace with your credentials, scrape your waiting messages, put them in your Facebook inbox and allow you to reply to them, and then would autopilot those replies back into your MySpace outbox. And it would have a footer that said: I sent this from Facebook; maybe it's time you left MySpace. It's not just Facebook that did this. Every big tech company has relied on competitive compatibility in its history. You know, there was a moment when Apple was nearly driven out of business by Microsoft, because Microsoft wouldn't update Office for the Mac. And so if you were the lone Mac user in a Windows environment and someone sent you a Word file, you had to open it up in Mac Word, and Mac Word was, like, this absolutely cursed piece of software. Even by the standards of Microsoft, it was a terrible piece of software. And if you were dumb enough to open and save a document in Mac Word, no one would ever be able to read it again, including you. Rather than begging Bill Gates to fix Mac Office, Steve Jobs just hired some
engineers to reverse the file formats in office and they made i work with keynote and with pages and with numbers and that meant that you could read and write all the microsoft office files without having to use microsoft software but as these companies have gotten bigger and bigger they have figured out how to take com com away from their nascent competitors to make sure that nobody does unto them what they did unto others if you were going to try and reverse engineers say the itunes file formats the way that apple reverse engineered the microsoft file formats they would hit you with suits under the anti-circumvention provisions the copyright modernization act in canada and the digital millennium copyright act in the united states they'd hit you with cyber security complaints under the computer fraud and abuse act they probably hit you for trade secrecy interference and so on every one of these companies has contributed to the slow foreclosure of competitive compatibility if we restore competitive compatibility if these bills or if the antitrust settlements that we reach with these companies say that they're not allowed to use copyright patent trade secrecy terms of service and other legal instruments to foreclose on legitimate competitors then they face a very different equilibrium to the one they have now think back to those mechanics in massachusetts if uh if the big three automakers nerfed their mandated interface but competitive compatibility was still a thing a couple of smart mit kids could have just hacked together something that reversed the error codes that went on the wireless network something that had a cost of goods of like five bucks that they could sell for a hundred bucks to not just every mechanic in massachusetts but every mechanic in america every mechanic in the world who would get a tool that was more capable that was cross device diagnostic that would work on every model of car that could have ancillary services that made the company is less 
relevant it would have admired them in a kind of guerrilla war where they would have to update their own messages to freeze out this new interoperable product that would screw up their own mechanics their customers would be angry at them and then the mit kids would just patch the firmware on these devices that they'd sold to everyone and they would be starting all over again businesses would much rather have a manageable quantifiable risk where they know exactly how their competitors are going to compete with them than to be engaged in this chaotic form of guerrilla warfare of technological arms races where the attacker always has the advantage because they only need to find one mistake you've made in your firewall rules in your file formats in your encryption in your uh tpm's uh and you have to make no mistakes at all and where your patch pisses people off and their patch makes people happy so there are some risks to creating interoperability to privacy those risks shouldn't be glossed over there are companies that do a good job of protecting users privacy when their users privacy coincides with customers privacy it's probably true that it would make harder for facebook to fight off the next cambridge analytica or stop people from x-fil trading devices from ios or data from ios devices if apple and facebook had to have a competition be harder for gmail to be secure if you could interoperate with gmail using third party clients that expose all of the functionality and not just the limited functionality of ipop imap or pop but at the same time um you are also going to provide service to people whose interests aren't reflected by the companies and um at the same time um the uh juiciness of those big companies as a target will decline because you'll have the um users spread out across more services so one compromise won't get you 2.6 billion facebook users it also means that companies will have to spend more resources fighting off interoperators you know today the 
bringing a lawsuit against an interoperator the way say facebook did with power ventures where they just sued a company that did to facebook exactly what facebook had done to myspace and prevailed in court and scared off every investor that might have invested in a company like them if facebook could only best a company like that by getting mired in a technological arms race other companies might come along to compete with them and so they wouldn't be able to have these one done victory conditions um but there are also some advantages here you know if you want a warlord to keep uh the your interests at heart one of the things that you can do to influence them is convince them that you might leave their fortress and go somewhere else if they uh mess you over it's to make sure that you have a way to leave if it turns out that your interest in there is diverge that disciplines their decision making privacy is important it's too important to leave up to unilateral corporate decisions it's too important to enforce by kind of trying to figure out how to retrofit copyright or cybersecurity or other laws that were never made with privacy in mind as our front front line for defending privacy we need a federal privacy law in every territory that has a private right of action that enables internet users to seek justice when companies do not respect the law monopoly is what uh is um monopoly is what brought this back right though the way that we ended up with this kind of privacy invasion is because of monopoly the ability of monopolist to gather data cheaply the ability of monopolist to frustrate good privacy law and competition is what will help deminopolize the world interoperability undoes the perverse incentive to over collect data to maintain your monopoly to say well facebook uh has to remain a monopoly because we have so much data that if we were forced to break up then no one could protect all that data because it's too powerful and an interoperable platform will 
always be incentivized to practice minimization and consent because otherwise it could lose users and face lawsuits thank you