Amazing. Okay, well, those were really extraordinary presentations with a lot to think about. Rather than too much preamble, I'd like to dive right into these themes and see if we can start to make some connections between these different speakers. You know, I was thinking, Marika, when you were speaking, and SJ and Roger during yours, about one of the first jobs I had when I moved to Montreal. I found it on Craigslist, back when people found jobs on Craigslist, and the job title was social media manager. So I get hired at this job, which is paying an extraordinary $15 an hour at the time, in this kind of strange, sparse, empty office. This is at the very, very early days of Twitter, before pretty much anyone knew what Twitter was, and the pitch was that we were supposed to be building Twitter accounts for financial companies so that they could connect with consumers. The idea was we'd build up these dummy accounts with tons of followers and then they'd get handed off to clients. It took me a couple of weeks, you know, with my 19-year-old, not-quite-developed prefrontal cortex, but eventually I realized that what was actually going on was that these accounts were being used to promote penny stocks once we lost control over them. It was essentially a very rudimentary pump-and-dump scheme, as far as I could figure. I quit after that. Now, I'm a member of the bar, but that was shady as all get-out. And this would have been 2008, 2009, 2010, so early days. So I guess what I'm trying to say is that these practices are really about as old as the platforms themselves. And as you all recognized, influence campaigns, disinformation, misinformation, propaganda, these are not new things either.
But looking at how the practice has evolved, it seems to me that as actors become more sophisticated, there's a research methodology problem, in the sense that you only catch the people who are really bad at it. And there's also a sort of constitutional or political problem, in the sense that the better they get at it, the more the behavior starts to look like real human speech, legitimate political engagement, right? And there's a cohort, including many politicians and some scholars, that is starting to call on platforms to take down or restrict this kind of activity. I'm really wondering, from all of your different vantage points, what you make of those legal and political demands. Is it the state's role to control this kind of speech? If so, through what frameworks ought we to think about that? How much of this is a real freedom of expression problem? How much of this is a market problem? How much is something in between? Maybe, I don't know, SJ or Roger, do you want to speak first to that point, and we'll circle right through?

Well, we're talking about behavior looking like real human speech and legitimate engagement, but one of the things about disinformation is that it's a structural problem. The falsehood is often in the structure, in things like amplification, and detecting that is different from asking, is this true or false, should we throw this away? So I talk sometimes about how I don't want to restrict free speech, but I do want to restrict artificial megaphones. How do we ensure that you don't have people dumping on top of you? And on catching people, one of the nice things about disinformation is that they have to advertise. You know what they care about, you know what they're trying to do, you can see their activity. Also, a lot of the focus for a long time was on, this thing has become widely visible to the public, now we do something about it.
What we focus on is the early stages, the planning and the setup stages. Can we stop this at those levels? Can we stop it even before campaigns start, by addressing the people who are doing this and why? That's the reason we moved to risk as the thing that we're monitoring, the thing we're looking at. Truth is hard; behavior is difficult; but looking at how bad, how much, how far, seems to be a sensible way to go. I'll stop talking now, because there are many, many other things I want to hear from everybody else.

That's a really great starting point. Marika, do you have thoughts on this? Oh, you're muted. There, you're better.

Okay, yes. Well, that was a really good answer; I don't know what more to add. What I could say is that I think a big problem with disinformation is that, yes, sometimes it's hard for people to find good, true information online, so the first thing they see, or the first thing they saw, is false information, and they believe it's true. Some false information out there has so many likes attached to it that people think it's true because it's popular: so many people liked it, so I guess it's true. And there's so much good scientific literature out there, but often it's behind paywalls, so people can't go and access that information. So I think this is also a problem, more of a social problem, that we should really tackle, because yes, we could focus on deleting all the disinformation online, but people need to be able to find good information too. So yeah, that would be my answer.

Yeah, those are really interesting thoughts. I mean, I have a hard time not thinking about this from a sort of constitutional perspective, right? That's the paradigm through which I work on these issues. But part of it is that the First Amendment, like the Canadian Charter, protects falsehoods as well as truth, and protects foreign speech as well as domestic speech; it doesn't make distinctions between those types of content.
So anyway, I'm interested, Cory, in what your thoughts are on this, because I think some of this circles back to the heart of your argument, which is that maybe there's a market problem here, a question of incentives. We'd love to hear from you.

Yeah, this is a subject that's of course really difficult. As someone who generally favors free expression, but who is also very alive to the way that there are heckler's vetoes, and that marginal voices can be squeezed out of a free expression environment by harassment and other free expression activities that actually end up reducing the amount of discourse, these are really significant questions. So one of the first things that I think we should have clarity on, and you hinted at it a little, Lex, when you talked about constitutional and Charter rights to be wrong, is that we need to be clear about what our goal is when it comes to odious but lawful speech. Do we want to engineer a situation in which views that are lawful but odious are unutterable? Not just in the public square, where, if you stood up in a restaurant and started screaming racist epithets, they might throw you out the door, but just, no one ever says them? I would love to live in a world in which no one ever uttered a racial epithet, but I also don't want to live in a world in which the way we accomplish that is by setting rules about what people can say in private contexts among themselves.
So this gives rise to this question about the speech policies of the big platforms. Leaving aside for a moment the impossibility of having a good speech policy that covers 2.6 billion people speaking hundreds of languages all across the world, with lots of different contexts about when it's okay to swear and when it's not, or to have a sexualized discourse or not, or to use terms that are slurs but use them within an in-group against whom the slurs are normally directed, so Black people using the n-word or gay people using the word queer. All of those things are so contextual and difficult that I'm just going to park them off to one side. We have a story that these are private spaces, and so they get to set their own rules, and when a private space sets its rules, it's not censorship, it's just editorializing: what the government tells you you're allowed to say is censorship, but what the restaurateur says you're allowed to say is not. But imagine an experiment in which one restaurant, called the No Politics at the Dinner Table restaurant, gets a license to operate, then taps vast capital markets and acquires all of the other restaurants, and no government intercedes to stop them from doing so. Then they get all these predatory advantages: they can strike most favored nation deals with all the farmers so no one else can get their food, and if you try to open a restaurant down the street from them, they underprice all of your items and hire your chefs away at double the wage.
You could end up in a situation where no one is allowed to talk about politics at the dinner table, and in which there would be no constitutional look-in, and in which you might make the argument that this is more like the de facto rule we have, where if you stand up and start screaming racial epithets, any restaurateur would throw you out because it's just socially beyond the pale, as opposed to an artificial rule constructed by one billionaire who owns this vast chain of restaurants. So I don't think it solves our problem of people holding odious or wrong views if we shatter the speech monopolists, Facebook and Twitter and the other major platforms for discourse; it might even create new problems. But what it does do is make us confront this question of what we actually want to happen about odious speech. If it turns out that in a big, diverse marketplace, where there are lots of places to speak and lots of web hosts and lots of DNS providers and lots of every other layer of the stack, every platform has the same policy and no one makes them do it in law, then maybe it is like the rule that just says, I'm sorry, if you're going to fart like that, you can't be in my restaurant, and not like the rule that says, here in Zuckerberg's empire there are just some things we don't talk about, and if you don't like it, find somewhere else, except I bought them all. That, I think, is the place where this conversation can at least not be mired in these dumb questions about, is it or isn't it censorship, and instead start to work towards: what is a speech policy that shelters marginalized voices from the heckler's veto, and what is a speech policy that moves questions of what is and isn't lawful speech out of the hands of an autocrat who's accountable to no one but their shareholders, and in some cases not even them, because of the special structure of the shareholdings in Facebook and Google, where the founders own not the majority of shares but the
majority of voting shares, and instead moves it into the demos, where we have an accountable way of deliberating.

Yeah, super thoughtful. I think that's a really helpful way to think about it. You know, in Canada we have a gang of legislators who are, I think it's fair to say, shockingly and dangerously uninformed on very basic questions of technology policy most of the time, and we need look perhaps no further than the current debate going on over Bill C-10. At the same time, I think there's an emerging consensus that leaving these companies to govern themselves isn't enough, and there are two essential narratives about the appropriate role of the state, of governments, in responding. There's one school of thought, the Cory Doctorow or the Elizabeth Warren school of thought, that says it's a regulation problem, but a market regulation problem, something we look at through the lens of antitrust, consumer protection, privacy, human rights law. And then there's another approach that looks more like a remedial approach targeting the harms, focused on creating laws that force platforms to remove certain kinds of content, to deplatform certain speakers, to limit the spread of certain kinds of harmful information, whether that's extreme speech or hate speech or politically problematic speech or misinformation and disinformation. So we have these two points of entry into this conversation about regulation, and I'm wondering how these approaches play out in the way you think about these problems, the problems of how we organize our relationships online. I was thinking a little bit about, for example, I wasn't sure whether it was SJ or Roger talking about the range of actors needed to counter disinformation. Anyway, what are your thoughts on these different approaches, and maybe they're not two completely distinct schools, but what are your thoughts on how to
tackle these big problems? What's the role of the state in all of this? I don't know if one of you wants to start, or I could just choose you like my students, at random. Maybe Roger, we haven't heard from you yet; can I put you in the hot seat?

Yeah, sure. So, a couple of thoughts on that, at least from my entry point. I won't speak too much about markets, because that's not really my job, but I can riff on the technology. I think something that really needs to be considered here is not just how we want to structure our own laws, and what our responsibility should be with these narratives and these ideas, and what we want to allow, but also how we prevent our platforms from being weaponized by people who are going to make it their mission to abuse these things. It's not enough that we just decide to allow free speech, or break up Facebook, or whatever the case is; whatever the alternative is, it still has to be something that can't be abused, and if it can be, we've kind of lost it just as easily. I'm not sure that answers your question, but that's the note I'll end on, I think.

SJ, do you have thoughts on that?

Oh, I have many thoughts. So, you're focusing on the platforms and the government, and there are many, many actors in these spaces, because it isn't just one space: you've got the community space, you've got not just the platform space but the whole internet space. You have so many people who can respond, and it seems strange to me just to ask, hey, what can the government do? We're all part of this; we can all help solve it. It's like having a word with your uncle at the dinner table, it's let's work together, which is why we push so much on how you can work with groups, how you can work with different types of responses. The other part is people talking about harms. Now, harms frameworks are really useful, and we use them, and we look at all the different
types of harm, you know, physical onwards. But risk is different from harm: you need to think about how far this is going, what the likelihoods are, what the targets are. So, who is vulnerable to what? There are vulnerabilities to individuals, to communities, to businesses, to governments, to countries, and they have different needs. The things that a government may do to protect itself and its country may be different from the things a platform will do to protect itself, and quite often you need to build regulation to force the platforms. I am tired of the number of friends of mine who are different who've been deplatformed, just because that's the way the regs work; you trip over them. So I think the government-and-platform angle is interesting, but there is more beyond it. I mean, if we're talking platforms, let's make it hard to micro-target. If we have strong privacy regulations, you lose the ability to target below the demographic level with things like political advertising. Maybe you can still target using group types, but put that friction in the system. Sorry, I get a little bit excitable about responses, because we've spent a couple of years working on this.

Of course, of course. And I think that's a really useful synthesis of what Cory is saying and what you're saying around this issue of disinformation and misinformation. This question of the market: it's really the whole business model, right? Improving privacy protections means that the ability to target individuals, for example on protected grounds, based on really unique, idiosyncratic things about their lives, things that pull on our hearts and our minds, those tools of persuasion become less powerful. And we all benefit, for lots of different reasons, when there are stronger privacy protections in place. So I think that's part of
that reflection too. Marika, do you have something to add?

Yes. Exactly as I said in my talk, there's kind of a problem in the fact that all those disinformation problems and so on are on social media platforms, and the solutions are also expected to come from the social media platforms. I think that's a lot to put on, well, there are many people working at those platforms, but I think the responsibility should be spread wider than that, because, as you said, it's a social problem, and there are multiple people concerned by it. I don't have the perfect answer to this one, but yes, I think we should spread it way wider than this to really find a good solution.

Yeah, super helpful. Maybe we can just build on this a little bit. We have a lot of questions about Gab and Parler; it sounds so weird to say that as somebody who also speaks French, Parler. These entities have a really interesting relationship with the idea that Cory was proposing earlier, the idea that a lot of the problems on these platforms can be solved through more competitive markets. There's also a really close connection with this idea of self-determination and the idea of protected spaces for core individual political expression online, and yet these are also unambiguously awful places on the internet. So how does each of you see these platforms, or these entities, from your respective vantage points? Are they a counterbalance to echo chambers, or do they create them? What kinds of threats and risks are they the source of? How do you fit these emerging entities in, or are they even emerging? You know, I'm from a slightly earlier era of the internet, right, so things look a little different to me. Anyway, maybe Cory, you can start: how do you think about these places online?

I think S.J.
really put her finger on something very important when she talked about friends who've been deplatformed, because the notion of deplatforming has been mostly hijacked by elements of the far right and xenophobic movements. But deplatforming is a concern that dates back to the deplatforming of Indigenous activists, trans activists, sex workers and sex worker advocates, Black Lives Matter advocates. You can see it being weaponized at the state level, for example with what's happening in Cambodia, where the dictator requires everyone to adhere to Facebook's real-names policy so that he can round them up and torture them if they speak out against him, and if they won't comply, he has Facebook deplatform them for violating the real-names policy. And you know, the concern about Gab and Parler is not who will speak for the Nazis; the concern is that we are creating weapons that can be wielded by people we don't trust as much as the people who've got their hands on them today. It's very similar to the concern that you alluded to, Lex, with C-10, where there are a lot of people in my Twitter mentions who are stalwarts of Justin Trudeau, perhaps forgetting that his political legacy includes a father who declared martial law and raided dissident groups' offices, stole their membership lists, and blackmailed them. People who are fans of Justin Trudeau don't think that the CRTC will do anything bad; they trust in the rule of man and not the rule of law, and they seem completely blind to the possibility that Prime Minister Doug Ford or Prime Minister Faith Goldy might not use these powers in ways that they are happy with. I think that it is possible to make good technology rules, notwithstanding that we have seen such bad ones emerge, and you can see that governments can make rules on highly technical questions that the actual parliamentarians are not themselves briefed on, when they're motivated to do so, when
it's important and when the stakeholders have political juice. No one in parliament is a microbiologist, to my knowledge, and yet white Canadians who live in cities have potable water, leaving aside the boil-water advisories in Indigenous communities across the country. People who have political juice can get the government to find out what the science says and then do it as policy. So I think it behooves us to ask: what is it that acts as the countervailing force that stops people, even those with political juice, from getting good policy? And I think it's monopoly. When your industry fits around one table, and everyone used to work with everyone else at one of the other five companies that make up your sector, and you're all friends, and you're godparents to each other's kids and executors of each other's wills, you don't ever have to explicitly arrive at a conspiracy. Although it's easier: we saw that Facebook and Google colluded illegally to fix ad rates; that came out in the Texas antitrust case. But they wouldn't have to, because you have a senior executive from Google who's now the chief operating officer of Facebook, who knows what Google's plans are. She just has to say, this is the agenda of the industry, and she's in a position to make it happen; when there's a duopoly, it just happens. So I think that if we want to have good policy, we can ask the question, what would that good policy be, but we also have to ask the question, what structurally creates the space for good policy? And that is to weaken the power of industry so that it is subject to governance.

Interesting. Yeah, I think that's a really helpful framework for thinking about this. Marika, Roger, SJ, do you have thoughts on this, also going back to this point on Gab and Parler, what we make of these kinds of spaces online, and how they affect your work?

Shall I take it?
So, I've been on Parler since it started, and was crawling around the datasets that we have. Initially it was a set of influencers; you had Epoch Times all the way through it. But as it grew a bit, I started seeing K-pop stans, the K-pop fans who flooded out hashtags and bought tickets and stuff, and I started seeing rainbows, where gay people had come in and put information in too. So again, anyone can be part of the information space. If you think of everyone as having access to the information space, then you have the ability to move in those spaces, and likewise the ability to do interesting things in those spaces. So I don't see them as necessarily dangerous things, any more than meeting in somebody's front room is a dangerous thing. I do find the injection of narratives for state purposes a dangerous thing, but again, this is about understanding and gardening your space, and I really did enjoy all the K-pop stans.

On the real-names deplatforming: for a while we've been thinking about things like, how would you do third-party verification? We have people in our groups who are known only by their handles. Everyone knows the Grugq, everyone trusts the Grugq, nobody knows who he is; it doesn't matter, because he has an identity tied to his handle. So there have to be better ways of doing that. As for the weapon part, I mean, over the last couple of years in the US, SMS broadcasts were big. Even if you take away the platform, you're not taking away the ability to mass-broadcast, but you are taking away the ability to intelligently pull people in, and perhaps the imaging part, which has some value. Those are random thoughts, but I'm going to pass over to someone who's slightly more together on this. Marika, you have the floor.

Yes. I don't really have a set opinion about this. I used to think that they were sort of echo chambers, social media platforms for more close-minded people, but listening
to both your answers, Cory and SJ, I'm a little bit more nuanced now. I don't know very much about them; they're mostly closed now. So yeah, I'm really nuanced about those ones. I do think it's important to look at them, and I'm kind of interested to see how it will go over the next few years. I really don't know whether they're going to get more popular; I think so, but they also have some problems just staying hosted. So yeah, good question.

One of the comments you made, SJ, was, look, this is not just about states; it's not just the relationship between states and individuals. The private sector really has to think about these issues, and in extraordinarily complex ways. We have a question asking about the implications for the not-for-profit sector. How should these entities think about the kind of work you do, about the kinds of risks you study? I mean, we can think about the silly cases, like Wayfair and QAnon, but what does it look like when you're a business or a nonprofit and you find yourself part of one of these stories?

Actually, I've been working with large nonprofits for most of this year, so it's my daily thing. For businesses, there are different ways they're likely to be affected. There might be direct disinformation aimed at a business; there are lots of reasons you might want to do that, dumping their shares is one, or pumping up your own share price. You might want to manipulate the business through disinformation about its principals, board members and so on, but there hasn't been that much of that; there's been a little bit of fisticuffs between a couple of telecoms companies fighting it out. For most businesses, it's going to be a side effect on them. So, for example, British Telecom in the UK: because of the COVID 5G rumors, their engineers started
being attacked, and people were setting light to cell phone towers because of that disinformation. This is why we say, have a plan. Even though you're unlikely to be a direct target yet, we think that at the point where the risks to the principals, to the people who are doing it, plus the gains that come from it, balance out, disinformation is going to start to overtake ransomware. That might be because ransomware is reduced enough, or there may be hybrid ransomware/disinformation campaigns, and we'll probably see much more industry around this. At the moment, industry tends to be people doing influence operations on behalf of another entity, including a couple of smaller groups which used to be marketing agencies and just do it with falsehoods. So most of this is thinking about how you might get caught up in the stream of it, and, if that happens, who you're going to call, who you're going to work with, who actually does this. You can go ask for help before that happens, doing some of that red-teaming, planning, and simulation, so that your team knows the things it's got to worry about before it's actually in the thick of it. So yeah, that's just thinking around some of the things you need to do.

I mean, practically, pre-bunking wins at the moment, so messaging-based responses, especially if you're a nonprofit: getting ahead of those narratives, getting information out into the space, getting trusted information into trusted spaces, and going where the people are. Already being ahead with your branding and your image is helpful before you even get to it. I'm going to leave space for the next person; I think Cory may have some thoughts on this.

Well, I was going to say, you know, I grew up with enough crunchy-granola people that I've heard a lot of these conspiracies for a long time, and one thing that's very striking about conspiracies, say about vaccines, is that the arguments haven't changed materially, right? The things that people
say about vaccines are about the same. So if a view becomes more widespread but the rhetoric of that view hasn't changed, then something else has changed. There are those who say, well, the thing that changed is that Big Tech figured out how to bypass our critical faculties with machine learning, and you know, the biggest proponents of that view are Big Tech. Only they say it not by way of apology but in their sales literature: buy our ads and we'll convince people to buy your fidget spinners. But the other possibility, and something that's well documented within the literature on conspiracism and conspiratorialists, is that people's material conditions make them vulnerable to conspiratorial accounts of events. If people live through a conspiracy, if you have lived through an instance in which you were lied to by powerful people in a way that materially harmed you, then the next time someone says, the reason you've been harmed is that powerful people have conspired to harm you, that explanation has power; it has plausibility. And you know, another word for conspiracy is corruption. When people get together and make a deal to abuse their power and authority and their trust, to harm other people and benefit themselves, we can call that a conspiracy or we can call it corruption, and one of the handmaidens of corruption is monopoly. When you ask an anti-vaxxer today why they don't believe in vaccines, it's rare that you'll get a lazy answer. What you'll often get is an incredibly energetically wrong answer, an answer that is chapter and verse on a bunch of things that are a hundred percent true about the pharmaceutical industry: that it is highly concentrated; that its major named families have done things to suborn their regulators, for example by putting out misinformation about whether opioids were harmful or could be safely prescribed over long
times; that these had real material consequences; that Apotex once told a researcher at Sick Kids hospital that if she warned the subjects in her drug trial that members of their cohort were becoming gravely ill, they would pull funding to Sick Kids for future research. You know, all of those things are true. So if you say to me, why do you trust vaccines, and I say, well, I trust the science and I trust the regulator, I have to admit that although I trust vaccines, neither of those things is true of me. I don't trust the science or the regulator, in the sense that I know that Elsevier spent years publishing lookalike journals that weren't peer-reviewed, in which pharmaceutical companies could publish marketing claims that were indistinguishable from their peer-reviewed journals. And I don't have the statistical background to understand whether the trials have real explanatory power. I'm not privy to what's going on in the halls of power; I don't know if the regulators are or aren't captured. So from moment to moment there's this kind of epistemological chaos and terror, where questions that we can never adjudicate for ourselves, should you get in the 737 MAX, is your kid being turned into a dunce by distance education, should you trust a vaccine or will you get a blood clot, all of those questions are questions that, even if you can answer one of them, you do not have enough time in your life to get enough PhDs to answer all of them. And the mechanism by which we normally resolve them, hoping that we have neutral adjudicators who hear expert evidence and come to a conclusion that reflects the best evidence, that's not there. We're in chaos. So what can a nonprofit do? Well, nonprofits can agitate against corruption; they can agitate for fairness, for transparency, and for good governance, because that is the foundation on which we build resistance to conspiratorial accounts. When you have good
governance, conspiracy is harder to pull off.

Roger, do you have thoughts on that, by any chance? No, I'm going to leave it there. I think that was perfect, actually. Didn't mean to put you on the spot there. Okay, I'm taking a look at the time, and maybe we'll just end with one last big question. I think people in the audience at a conference like NorthSec are in a really interesting position, because there are people here with extraordinary technical skill and talent. There are also people who maybe have the talent to change some of these things, or there are people who are working for the warlords and the bandits, so to speak. They are the architects, but also the plumbers, of surveillance capitalism. So maybe we can go through this for each of you. Maybe we'll start with Masarah. What do you think are the responsibilities of an ethical technologist in this current environment, and what do you think are the most important technical problems that we should be working on solving? It could be related to the research in your talk today, or it could be a more general reflection. What does it mean to be ethical? What is the important work to be done?

Thank you, yes, that's a really good question. I haven't put that much thought into it; it's not my specialty. But obviously, for me, I research political interference and disinformation with social bots. In my case, it was getting harder and harder through the years to research those kinds of social bots online because of privacy, and for really good reasons. Privacy protections make it way harder for researchers to get that information. And I think that practitioners will need to help each other in some sort of way, so we can really share information. Because, for good reasons, it's going to get way harder to research those kinds of things, and to prevent those kinds of things, for privacy reasons. But on the other side, we do need to prevent that disinformation and that political interference online. So this is quite a difficult question for me to answer.

I feel like you took a great stab at it. Roger, do you have thoughts on this question? Sure, yeah. So, like Cory mentioned, about not having enough time in your life to acquire all the PhDs you'd need to really understand these issues, that's a pretty big one. For us in tech, we're building systems, and I think we need to give more thought to how we can build systems that enable people to trust the right things, and how we can make the information that people need available in a way that benefits society and benefits the individual. It's easy to build a recommendation engine that gets your, like, racist so-and-so to go dive into QAnon, and the world's a shittier place for it. That's simple, right? But how do we build that same ecosystem where people are served better and not drawn into these harmful relationships or harmful material? I guess maybe that even circles back to the original question about censorship, or what we want to allow online. If we can build systems like that, that are more equitable, we can maybe disperse those harms. Maybe they don't go away, but they're not amplified, and we're not connecting the whole diaspora of awful people. So I'll leave it there, but that's it.

SJ, do you have thoughts? Always, but I'm not so good at actually finding the mute button. For me, it's: look at the whole system. I mean, Pablo, who works with
us, has a saying that it's a thousand-bullet solution to a thousand-bullet problem. There are many different moving parts in this that you can work on. Simple moves, like fixing things before the disinformation hits the platforms. Doing things like, I have a load of old accounts just lying around that might be zombies; I mean, all of us do, it's just the way the internet's grown up. So think about how you clean the systems back to where you have communities within them. Think about just reducing those amplifications: putting in friction, putting delays in things, making things age out, just slowing the system a little. I played with this slow internet for a while; it was basically Twitter on a typewriter. But if you slow people down a little, they're just generally nicer, because they have to think about what they're doing. Diversity: I mean, a system is people, process, technology, culture. Having a diversity of people, always. One of the best solutions for many tech problems is having diverse teams, so you have all these different angles on what's going on, rather than just, I don't want to say tech bros, because we're not all tech bros, but it's also within your teams. And listening to the people who are most affected by things like disinformation attacks. Black women specifically have been a subject of disinformation for longer than most people online, and they have opinions, they have voices; a lot of them have themselves been bounced off of systems just because other people have manipulated the systems around them. But the last thought, really, is that the people in the platforms, a lot of them know what the solutions could be. They know things they could do to fix some of the issues. They want to do them, but they don't have the top cover to do them. So that's that interaction between platforms and government again: it's having the regulation that gives top cover for the people who have seen things like these friction moves and want to do them, but who are competing against things like shareholder-driven business goals. So just make sure that the health of our system actually has a seat at the table alongside the amount of money it's making.

And next, Cory, I think you have the last word here. Sure. So I think that the kind of expert I want to speak to are people who are working on hacking policy more than hacking code, although I think some of us do both. And hacking policy requires political will. It requires that we build big, broad-based movements to hold people to account, to change the way that politicians think about what they can and can't do, and what they can and can't get away with: the Overton window. And in this I am really informed by the copyright scholar James Boyle, who's at Duke University. Jamie talks about the history of the term "ecology," and he says that before "ecology" was coined, we had a bunch of different issues, but not a movement. If you cared about owls and I cared about the ozone layer, how is it that we would be on the same side, right? You're fighting for charismatic nocturnal birds, and I'm worried about the gaseous composition of the upper atmosphere. Those are not obviously the same issue, right? But the term "ecology" took a thousand issues and made them into one movement. It took a thousand constituencies and made sure that they all had each other's backs. It changed the political calculus, changed the way that we talk about this stuff. And I think we could be on the verge of that for monopoly. There are a bunch of people who are pissed off that all their beer comes from two companies, or all their spirits come from two companies, or that there's only three record labels, or four movie studios, or one theatrical exhibitor, or four giant accounting firms who are implicated in every single horrific corporate collapse that brings down huge swaths of the economy. You know, anyone from Ontario listening here remembers the
Carillion collapse. All four of the Big Four accounting firms had their fingers in that, and guess what? They were the only companies big enough to get the contracts to unwind the bankruptcy, so they got paid again, millions of dollars, for unwinding the company that they had helped fraud its way into a global collapse. All of these people don't know it, but they're worried about the same thing, right? If your glasses went up a thousand percent, it's because one company, Luxottica-Essilor, owns every eyeglass brand you've ever heard of, from Coach to Dolce & Gabbana to Oliver Peoples. They also own every retailer: Sears Optical, Target Optical, Sunglass Hut. They own Bausch & Lomb. They own, what's the other big one, LensCrafters. And they also make more than 50 percent of the lenses in the world, and they've raised prices a thousand percent in a decade. And if you're pissed off because the wrestlers you grew up with are on GoFundMe begging for pennies so they can die with dignity, because Vince McMahon misclassified them as contractors and took away their health insurance, and now they can't treat their work-related injuries, you're pissed off about monopoly. And we have the chance to build a broad-based movement now. This is a turning point, where people are turning around and going: the thing that joins up oil companies lying about roasting the planet, and insurance companies lying about whether the bonds that they floated were any good, and all of these other firms that have gotten away with literal and figurative murder for decades, is monopoly. And once we realize that, once we realize that my owls and your ozone layer are really part of the same picture, then we can build an unstoppable social movement.

And I think that is a beautiful and important place to end. I thank all of you, each of you, for your participation today. I'm really looking forward to seeing the other talks. I've learned a lot, and I know everyone watching has too. Thank you.