So first of all, thank you all for coming out today. When I was stepping outside at my house and seeing that terrible rain, I was thinking to myself, I probably wouldn't have come out to a talk tonight, so I really appreciate all of you coming. For those of you who don't know me, my name is Suzie Dunn. I'm a law professor here at the Schulich School of Law. I teach contracts law, and I also teach upper-year law and technology and intellectual property law courses, but my main area of research is technology-facilitated gender-based violence, which is a mouthful of a term. We'll walk through what I mean by that term and go over some of the different laws and ways that people have been hoping to address tech-facilitated violence as it emerges in its different forms online. As for what we'll cover today, we'll start with criminal law and go over some of the criminal laws that exist. We'll look at historical ones that you can apply to tech-facilitated violence and also some that have been developed specifically for tech violence, and talk about some of the pros and cons that come with using a criminal response to address these types of harms, which we'll also do for the civil and non-legal options. Next, we'll talk about civil responses. There are all sorts of civil responses that you can use to address tech violence. One of the ones that's commonly used for nude images is copyright, because there's a lot of protection for copyright on the internet in comparison to harmful behavior, so it can actually be a more effective tool, compared to using regular civil torts or statutes, for getting an effective remedy. And then we'll talk about some of the non-legal responses as well, because with any sort of harm that we're thinking of in society, law is typically the last place that people go when they want to get support. Often they're going to family and friends first, or they're going to NGOs, or they're going to schools, and the law is the last place they go. So I think it's quite important to think about non-legal options and also the ways that our governments can support those non-legal options. I'm just going to take my mask off because it's a bit breathy. So the first thing we'll touch on is criminal law. Oh, no, I'll talk first about what we mean by tech-facilitated gender-based violence. This is a term that Jane Bailey, Nicola Henry and Asher Flynn have coined, tech-facilitated violence. Some other organizations have used it as well, but essentially it's an umbrella term to describe any sort of digital technology that's used to perpetrate interpersonal harassment, abuse or violence. Generally, when we're talking about gender-based tech-facilitated violence, we're looking at people who've been impacted by this harm who might be marginalized or are part of an equality-seeking group because of their gender identity or their gender expression. A lot of the history of gender-based violence focuses on women and girls, who are one group of people impacted by this type of harm, but of course trans people, agender, non-binary and gender non-conforming people are also impacted by tech-facilitated gender-based violence. And they're usually targeted specifically because of their gender identity and expression.
And what we see specifically when it comes to gender-based harms is that people are often targeted in very sexual ways, so women will be threatened with rape rather than being threatened with violence generally, which is what we see more men being threatened with on the internet. And when it comes to harassing comments, they're often focused on a person's gender: talking about their body, talking about the way they look and how they don't fit within really restrictive, heteronormative, sexist ideals. What the research also shows is that even when men and women experience similar levels of various types of tech-facilitated violence, the impacts on women, trans and gender non-conforming people are more severe. So we see higher rates of mental health issues, higher risks to physical safety, feelings of fear, and higher rates of suicidal ideation. So there are quite serious impacts when it comes to the gendered aspect of these types of harms. And it includes a lot of terms. This is a list that came from a report by Cynthia Khoo, which I'll speak further about. The Women's Legal Education and Action Fund, which is one of the best legal organizations dealing with gender equality in Canada, wrote a report called Deplatforming Misogyny, which takes a feminist approach to how to address tech-facilitated violence, specifically by regulating how social media companies moderate content. And there's been a lot of conversation legally around whether we should actually regulate the behavior of social media companies and the expectations around their content moderation. So there's a variety of terms on here. Some you'll be familiar with: hate speech, threats, impersonation; those are all terms that most people are likely familiar with. And some you might not be as familiar with, things like doxing. Doxing is a term that comes from the early days of the internet when people were "dropping dox." Generally you'd publish documents to expose some sort of issue or expose the identity of a person. Doxing today is mostly used to release someone's private information. So for example, when someone's nude images are posted on a pornography site or a revenge porn site, often what will happen is they'll also post their home address, their full name, their workplace, their social media information, their phone number and their email address. And that causes additional harms to the person, who then has other people contacting them, asking them for sexual contact or harassing them in other ways. So doxing can be a really troubling experience, and some people have actually had to do things like change their legal name. A woman named Holly Johnson was an academic who had a sexual video of her released by her ex-boyfriend, posted with her name attached and claiming she had made it for her students. There was no way she could get it off the internet, and so the solution for her was to legally change her name, so that when you Googled the name she published under as an academic you wouldn't find these sexual videos of her. And other people have had to move addresses.
A lot of trans women have been doxxed and threatened and swatted, which is when someone calls the police with a fake bomb threat or a fake kidnapping, and the police show up at a person's house to arrest them, usually heavily armed. That happened recently in Canada to a trans woman, and she had to flee her home because of doxing. There's also the non-consensual distribution of intimate images, which again is a mouthful of a term; there are a lot of mouthful academic terms that we use here. A lot of people recognize this behavior as revenge porn. That's not a term that most academics use anymore, because it suggests that someone has done something deserving of revenge, and pornography is conceptualized by a lot of people as content that was produced consensually to be consumed for sexual pleasure, whereas with the non-consensual distribution of intimate images, the images have been released in a way that they were not meant to be publicly consumed. So we generally avoid that term. Deepfakes is another one that some people may have heard of but might not be familiar with. It's a use of AI technology that maps out your face and can put it onto another video. The forms of gender-based violence where that's happened often involve women celebrities and politicians: their face will be superimposed onto a pornographic video to make it look as though they're engaging in sexual activity they haven't engaged in. Deepfakes used to be a little more glitchy. They came into open-source availability on Reddit in 2017, and at that time you could tell; they would be fuzzy around the face or fuzzy around the mouth, and if you really looked at it, you could tell it was a fake video. But now, because it's open source and because it uses AI, the advancements in deepfakes are incredible, and with really good deepfakes you wouldn't be able to tell the difference between a real video and a fake video. There was an example of a woman in India, Rana Ayyub, who was reporting on sexual violence that the Indian government hadn't been responding to well, and in response, a sexual deepfake was made of her and spread throughout WhatsApp and Twitter to try to delegitimize her, to say that she was someone engaging in salacious sexual activity who shouldn't be believed in her journalism. So it's been used in quite a few different ways. And then sextortion is sexual extortion. We saw a real rise in this during COVID. Traditionally, most sextortion has been targeted at children. What will happen is someone will befriend a young person on the internet, pretending to be another young person; they'll get into a romantic or sexual relationship, and the young person on the other end will send a flirty or sexy photo. Then that photo will be used to extort them for more content, with threats of posting that content online. We saw that in the case of Amanda Todd, a young BC woman; this happened to her in 2012, I believe. There was a Dutch man who was doing this to dozens of young people across the world, and in her case, he kept releasing the photos and tormenting her over a two-year period. She was also tormented and harassed by her classmates, and eventually died by suicide.
And her death, and the death of Rehtaeh Parsons, which was a similar incident, photos taken without consent, but in an in-person situation with some of her classmates, and then released, with bullying that followed and her death by suicide as well, those two cases were really the first cases in Canada that brought major media attention to this issue, and you started to see a government response. Because at the time, even though there were extortion laws, there weren't laws around the non-consensual distribution of intimate images. So what these men and boys were doing to these girls wasn't actually illegal. It might have fit under other laws, like child pornography laws, but that's not a perfect fit. And so during that time you saw a lot more response from the government, and you saw the introduction of criminal offences and civil torts around the non-consensual distribution of intimate images. But generally, what we see with all of these forms of harm is that they're really just old forms of gender-based violence that we've seen in the past, now using new forms of technology to enact them. And we saw this way back with the advent of everything from faxes to cameras to telephones. We used to have a specific law about harassment over the telephone, because people would call women at night, before there was caller ID, and they'd just breathe into the phone and hang up, or they'd say really creepy things and hang up. So you've seen laws developed every time there's new technology; abusive people find ways to use that technology to continue their abuse. And I think that's a really interesting and important point, because one of the challenges we've seen with tech violence is that a lot of institutions, like the police or law enforcement, want to separate the two. They want to say it's just online, it's not that harmful, just turn your phone off, just get off social media, and that's how you can avoid these things. But as we know from COVID, all of our lives are so deeply intertwined with technology that we can't separate the two. I actually think during COVID a lot of people's perspectives changed, but at the time when Rehtaeh Parsons was being bullied by her classmates, that was a lot of the response she got from the police and schools and social workers: just turn it off, just ignore it, it's not really that harmful because there's not necessarily a physical impact to it. So we need to get rid of that online-offline divide. And I'm sorry, I missed my piece about sextortion. Sextortion during COVID really increased for men, because normally sextortion is targeted at children or women, but what we found was an uptick for men during that time. Men were alone in their houses, people were having a lot of online relationships and connecting with people in digital spaces, and there were actually transnational organizations taking advantage of men by pretending to be women, getting photos and videos from them, and then extorting them, not for more sexual content, but for money. We saw that across the board. One of our members of Parliament, Tony Clement, I believe it was, had this happen to him, and he's someone who had very high security clearance in the government. That didn't happen during COVID, it happened before COVID.
But you see this specific targeting of men, who might be a bit more willing to share sexual images in a digital context because they're less likely to be targeted by things like non-consensual distribution, and then you saw this uptick of people finding a way to financially extort them. If you do this to enough men and you get $1,000 per man, you can actually make quite a bit of money from sexual extortion. So there are a lot of barriers when it comes to accessing the law when we're dealing with tech issues, and this is across the board, not just with tech violence. We know there is a lack of technological knowledge in the legal profession. I tell my students this all the time: you should know about tech, you should know about social media, because it's the future of law no matter what area of law you're in. But right now our generation of lawyers, and even legal academics and judges, really lacks technical expertise and technical knowledge, and that can be a challenge on many fronts. It can be difficult to understand how the technology works in order to collect digital evidence that's admissible in legal cases. It can be difficult to explain that to a court, especially if you're talking to a judge who might not be familiar with the technology; trying to explain TikTok to someone who's never used it before can be tricky. We've seen some movement toward expectations that lawyers and judges increase their technical knowledge, which I think is really positive, but it's a slow process. Law enforcement has the same issue. A lot of the technical knowledge in law enforcement is focused on child abuse materials, because that is one of the most significant harms, children are extraordinarily vulnerable, and there's a lot of desire to put resources into that area. But when it comes to adults, there's a little less drive to have those resources there. So when law enforcement has to decide where to put the few people they have with specialization in tech, it's not always in tech-facilitated violence, and the average police officer might not know exactly how to collect the digital evidence they need. And digital evidence is sensitive: you can drop your phone, someone can go onto the account and delete it, you can accidentally wipe your Dropbox. There are all sorts of ways to lose digital evidence, which creates vulnerabilities for accessing legal responses to this type of harm. Anonymity is an issue too, both for the perpetrator and for the victim. If you don't know who is harassing you, if it's some anonymous account, or if content has been posted on Pornhub and you don't know who posted it, it can be very difficult. And when it comes to criminal law, you need someone to charge; if you don't know who it is, that's a challenge. Unmasking the perpetrator is something that people often rely on the police to do, but we also have to balance that with the need to protect anonymity and privacy more generally on the internet. So we have to balance giving police powers to unmask people with protecting the larger needs of privacy and anonymity, which is a human right in many ways. There's also the desired anonymity of the victim. In the United Kingdom, when they first introduced their non-consensual distribution laws, if your nude image had been released and you went to the police or filed a civil case, your name would be published in the court records.
And then the newspapers would go on and say, Suzie Dunn has had her nude image released, it was released on this site, and she's charging this person. That would be in the newspaper, and then everyone would be able to find and view those images, increasing the harm that the person is experiencing and really discouraging people from reporting. Because if the thing that you want is for people not to see these pictures, and then you're not anonymous, it can cause a lot of problems. In Canada, generally you can apply for anonymity, but if you're an adult it's an extra court process that just costs a bit more money, so again it's another barrier people face. On the internet, of course, you can replicate things, you can copy things, you can save things; it's very difficult to get content off the internet. In one of the first civil cases on nude image sharing in Canada, I've been in contact with the lawyer who worked with the plaintiff, and maybe eight or nine years later the plaintiff still has to pay a reputation management company to go out, try to find those images and take them down, because they're always replicated, they're always put back up. So it's an ongoing process to get content taken down. Expediency, getting material down before it's replicated and spread across the internet, is important, and the law is slow. We all know the law is slow, and it's slow for a lot of good reasons, but that creates barriers to the actual remedies that people are seeking. And then there are jurisdictional challenges. In the case of Amanda Todd, the person targeting her was a Dutch citizen living in the Netherlands. He actually got charged in the Netherlands first, served his sentence there, and then had to be extradited to Canada, where he was just sentenced. She died, I believe, in 2012 or 2013, and he was only sentenced this year in Canada. Even figuring out who he was was a challenge, and then he was in another country; depending on the relationships we have with certain countries, you might not even be able to get an extradition order. That was a really well-known case that people cared a lot about, so there was a lot of incentive for the government to extradite him. But in other cases, sometimes they'll say, we don't know who this person is, their IP address is in Germany, it's too much work. And so that's a real barrier that victims face when they're trying to seek recourse through the legal system. Turning to criminal law: you can apply a lot of existing criminal laws. Often with tech law, people say, let's create new laws, we need new laws, but often you don't need new laws; you can apply existing ones. When I was doing my master's and PhD, I worked with The eQuality Project under Professor Jane Bailey at the University of Ottawa, and I read and summarized around 800 criminal law cases we found that had some element of technology and some element of violence. We looked at people of all genders for this. And we found a lot of laws could actually apply: harassment, extortion, hate propaganda, identity fraud (which is impersonation), intimidation, and uttering threats. We found all of those laws applied equally; it didn't matter if it was online or in person. But then we did find that the government had to create some specific laws. Take voyeurism, which is taking secretive images of people, usually for a sexual purpose.
The laws that existed before that were called "peeping Tom" laws, and the offence was trespassing at night. So if you snuck onto someone's property and looked in their window, you could get charged. But once we had hidden cameras, the person wasn't actually physically there all the time, and so they had to introduce voyeurism laws, which account for recordings. In countries that are a lot more technologically advanced than we are, like South Korea, we've seen epidemics of hidden cameras in change rooms, in bathrooms, in hotel rooms, live-streamed onto pornography sites. There are very few places in South Korea where people feel safe, and they've had protests of hundreds of thousands of women going out into the street to call for action on this type of harm, because it's so common and so accessible. Then there's the non-consensual distribution of intimate images, which is publishing intimate images, often through digital means, though if you printed off a copy and shared it with other people, that would count as well. There was no law that fit perfectly with that offence, so that's a new one created following the Todd and Parsons cases. And then there are things like the unlawful use of a computer; we've had to introduce hacking laws that never existed before computers existed. A lot of the laws that have been created are around children. We have offences around child sexual abuse material, which in the Criminal Code is called child pornography, though again there's been a shift in the language people use to describe it. And there are laws around luring children, showing children sexually explicit material online, or inviting them for sexual contact, which was one of the earliest issues that arose when the internet was invented. A newer issue that's been coming up, and it's debated whether it should be criminalized or not, is unsolicited dick pics. You've seen trends internationally, on the more extreme side of things, where people will be on public transit and any woman who's got AirDrop open on her phone will just get a dick pic sent to her. She doesn't know who it is, but she knows it's someone on the train, and it's very scary for some people, because you don't know if it's the person sitting next to you, you don't know if they're going to follow you off the train. It's a very upsetting experience. But in Canada right now, we don't have any laws around that. And there are also lots of circumstances where an unsolicited image might not necessarily be harmful. Typically on a dating app you should probably ask first; the best practice is to ask first. But there might be some circumstances where an unsolicited image might still be wanted and might not need to be criminalized. Right now the only law we have for that is making sexual material available to a child, and that generally applies where an adult sends material to a child under a particular age, I believe it's under 14, for a sexual purpose. But that's another area of law where we're starting to see people pushing for legislation. Now, pros and cons of criminal law. Criminal law, we all know, is problematic, and what we know in particular about criminal law and gender-based violence and sexual violence is that it has not been the most effective tool for actually addressing sexual violence. There's a real lack of trust in the institution.
A lot of women are reluctant to report sexual offences or gender-based violence offences because they've had negative experiences with the police. I think there have been efforts to change, but that still exists. There's also the historic disparate treatment of Indigenous people, Black people, and the LGBTQ community that might make them hesitant to use the criminal law. And also, when it comes to imprisoning people, it doesn't necessarily stop this type of violence. It might get the person out of the community, but when it comes to rehabilitation and actually changing people's behavior, there are very mixed results on how effective the criminal law system is at changing larger societal norms around sexual violence, and this is inclusive of tech-facilitated violence. So there are a lot of barriers that people experience there. The victim also has no choice in the litigation strategy. The Crown is representing their interest, so they don't have a lot of choice in how the case is guided, compared to civil law. And then there's this term that's used a lot in tech-facilitated violence: "lawful but awful." There's a lot of content that is terrible, that we know hurts people, that we know probably shouldn't be on the internet, but it doesn't reach the thresholds of criminal or civil liability. In those cases it's not necessarily content that we want criminalized, but it is content we probably want addressed for larger societal purposes, and it doesn't fit within the criminal law. The benefits of using the criminal law are that if you report to a police station and they take your case seriously, the investigation is funded by the state, unlike civil law, which is financially inaccessible for many people. You don't have to worry about the costs; there are of course costs associated with it, but mostly it's funded by the state. The police can gather evidence in a way that the average person probably can't: they can get warrants, they can get orders, they can get access to IP addresses that an individual might not be able to get on their own. Of course there are serious incidents that do warrant state intervention; as much criticism as there is of the criminal law system, there are some circumstances where we do want state intervention in harms that occur in society. And for some people, having recognition by the state that what happened to them was wrong is very important. Even though the criminal justice system is difficult to go through as a victim of sexual or gender-based violence, it's still worth it for them to go through and, if they're successful, to have a decision at the end that recognizes what happened to them was wrong. On civil law: a lot of people have been advocating for civil law solutions, and we've seen real movement there. So again, like criminal law, you can apply common law torts to these types of harms: intentional infliction of mental suffering, all sorts of privacy torts that have been introduced depending on the jurisdiction you're in, defamation, and harassment. Harassment is not a tort in all jurisdictions; I believe it's a tort in Alberta. They tried to bring an argument to introduce a tort of harassment in Ontario and it failed, but then a year or so later they actually were able to bring in a tort for online harassment. The threshold for that is extremely high, though. It has to be very malicious, very repeated, very intentional,
with very serious consequences on someone's life. But there have been torts introduced specifically for online harms. We've also seen the introduction of many statutes, particularly around the non-consensual distribution of intimate images. Here in Nova Scotia, I actually think we have one of the most progressive ones on the books; it's really broad, it's really interesting, and we'll talk a bit more about it later. People have also used provincial privacy acts where there's a private right of action; in BC, Alberta and Quebec people have used those statutes. And then PIPEDA, which is our commercial privacy law, has actually been used by some of the individuals who were complaining about their content being posted on Pornhub and not being taken down. They made a complaint to the Privacy Commissioner saying that their personal information was on this commercial website and the website was not taking it down. So there have been complaints against commercial organizations as well. As I mentioned before, copyright is a tool people can use. Most sexual images have been taken by the person themselves, so if you hold the copyright, you can file a copyright complaint. In Canada we have a notice-and-notice regime, which means that if you make a complaint, it has to be delivered to the other person; they get notice and they should take action. That's different from the United States, which has a notice-and-takedown regime: if you make a copyright complaint, the content has to be taken down immediately or else the website itself can be held liable for copyright infringement. So if you know anyone who's ever had their images shared without consent, because most social media companies are in America, you can actually file a complaint through the Digital Millennium Copyright Act and get content taken down, and that's often one of the most effective and efficient ways to do it. But if you are using that tool, you want to make sure that you collect evidence of the content being up before you make a complaint to a social media company or a copyright complaint, because if the content is deleted by the company, you may no longer have evidence if you want to bring legal action.
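To make that evidence-preservation step a bit more concrete, here is a minimal, hypothetical sketch of capturing a copy of a page before filing a takedown or copyright complaint. It assumes Python 3 with the third-party requests library installed; the URL and file names are placeholders, and real evidence collection would usually also include full-page screenshots and, ideally, a third-party or notarized capture rather than relying only on your own machine.

```python
# Minimal sketch: preserve a copy of a page before asking for it to be taken down.
# Assumes Python 3 with the "requests" library; URL and paths are placeholders.
import hashlib
import json
from datetime import datetime, timezone

import requests

url = "https://example.com/offending-post"  # placeholder URL

response = requests.get(url, timeout=30)
captured_at = datetime.now(timezone.utc).isoformat()

# Save the raw bytes exactly as received.
with open("capture.html", "wb") as f:
    f.write(response.content)

# Record a fingerprint and basic metadata so the copy can be verified later.
record = {
    "url": url,
    "captured_at_utc": captured_at,
    "http_status": response.status_code,
    "sha256": hashlib.sha256(response.content).hexdigest(),
}
with open("capture_metadata.json", "w") as f:
    json.dump(record, f, indent=2)

print("Saved capture.html with SHA-256:", record["sha256"])
```

The point of the hash and timestamp is simply that, once the platform removes the post, you can still show what was up, when you saw it, and that your saved copy hasn't changed.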
And this is a very wordy slide, but it's part of our act here in Nova Scotia, the Intimate Images and Cyber-protection Act. Most provinces have a statute by now that allows a civil right of action if your nude images have been released, but here in Nova Scotia we also include cyberbullying, because again, with what happened to Rehtaeh Parsons, the disclosure of her image was only one part; there was all of this other bullying that happened. So the act includes things like impersonating another person, inciting another person to commit suicide, which happened with Amanda Todd, where a lot of her peers were telling her to kill herself, and the disclosure of sensitive information, not just nude images. So there's a broad array of behaviors that fit under our act. And the one thing I think is really good about our act is that we actually have a statutorily empowered body included in it, which is basically a complex way of saying that under that legislation the government has an option, or possibly an obligation, to fund a body that people who have experienced cyberbullying or nude image disclosure can actually go to for support. You can call them, they can help you understand the technical ways you can get content removed through social media companies, and they can help you manage some of the emotional or relationship challenges that you're having. Then, if you do decide to pursue a legal remedy, they can help you understand your rights. I think that's one of the biggest challenges right now: people don't know what their rights are, they don't know where to get help, they don't understand the law. So having these types of organizations, which I think should exist across Canada, should be better funded and should be better known, is extraordinarily helpful for these types of harms. Australia has the eSafety Commissioner, which is the gold standard for this type of organization. They do massive amounts of research, they do public education campaigns, they have a direct place where you can make complaints, and then that organization will take it out of the hands of the victim: if they deem the content illegal or inappropriate, they'll contact the social media company and get it taken down for you. It relieves a lot of the burden on people, and they've had a 90% success rate in getting content taken down. Most of the content they've had trouble getting taken down, content that was deemed harmful, is typically on websites that aren't the major social media sites; it'll be some random defamation site hosted in Lithuania, and it's just impossible to get to them legally. But Alexa Dodge, who is a professor at SMU here in Halifax, did a report on CyberScan, the Nova Scotia body I was just describing, and what she found was that very few people actually want to escalate what's happening to them to the law. Very few people want to report to the police, very few want to go the civil route; most people just want a little bit of help on how to take the content down, and for most less serious cases that's all they need, and you're able to get the content taken down. For more serious incidents, or if people do want a legal response, there are options for that, but in interviewing people who worked for CyberScan she found that very few people wanted to escalate things to a legal response; they really just wanted emotional support and technical support. According to our legislation and what CyberScan offers, they also offer restorative justice options, but what Professor Dodge found is that those haven't really been fully implemented yet, so that's something she recommends CyberScan work on a little more, making sure there's a variety of response options for people. And then the pros and cons of civil law are the typical pros and cons that always come with civil law. The cons: it is expensive. If you want to go through one of these statutes, if you want to go to the civil courts, it's going to cost you five to twenty thousand dollars easily, if not more, so there's a major expense. There's the length of time: if you want a quick response, this litigation is going to take a year or two, and it can take a while before you can get an injunction or an order to get the content taken down. The cost is inaccessible to most people, which also skews our case reporting, so the cases we're seeing reported right now are generally from people who can afford lawyers. And so when we're looking at who is represented as people who've been harmed, often it's wealthier, upper-class women
who fit, and they are real victims, but who also fit within this idealized version of what we think a good victim is, in both these types of crimes and civil cases. And again, there are also very few lawyers who specialize in this area. I get calls all the time saying, this has happened to me, do you know a lawyer I can talk to? And I know very, very few lawyers to even recommend, and all of those lawyers are very busy. And then, even if you are successful, the defendant might not have enough money to pay; if it's your ex-boyfriend and your ex-boyfriend is broke, even if you're successful you might not get the money from them either way. The pros are that, if the defendant is wealthy enough to pay, you are successful, and you can afford the law, you have much more control over the litigation compared to a criminal law response. You can get injunctions and takedown orders, sometimes before the trial is finished, which is ultimately what people are asking for. You can recover costs; as I said before, there are so many costs that come up with this, reputation management, moving, therapy, so you can get some of your costs back. And for those people who don't want to engage with the criminal law system, you can avoid criminalization in that situation. The final legal tool being considered is content moderation, and we've seen content moderation laws passed in other countries. In Australia they do have a civil penalty: if content isn't taken down, the government can penalize the company for not taking it down. Germany actually has fairly strict content moderation laws, particularly around hate speech: if content isn't taken down within 24 or 48 hours, depending on the content, the social media sites can get a fine, so it incentivizes them to moderate content. But it's also a very divisive topic, because it implicates freedom of expression. Whenever you have the government regulate anything where they're going to tell private companies how to moderate the content on their websites, there's always a push and pull between freedom of expression and protection from harm. One thing I think is really important to keep in mind when we're thinking about freedom of expression is that often we're fed this concept of freedom of expression from the United States, and freedom of expression in the United States is very different than in Canada. We have much more balancing of rights here in Canada; we have justifiable reasons to limit certain types of expression, and content moderation, depending on what's being moderated, could fit within those justifiable limits. Because if we don't control some of the content that's on the internet, what basically happens is that certain groups of people are unsafe on the internet, and so they are no longer able to express themselves there. What we've seen from previous research is that female journalists, anyone talking about feminism, anyone talking about critical race theory, anyone talking about transgender issues, face extraordinary amounts of online harassment that is done to drive them off the internet, to drive them to stop talking about what they're talking about. So when we're thinking about freedom of expression, we need to think beyond the rights of the person who is posting harmful content, and also take into consideration the underlying values of freedom of expression that we want to protect, and how we can regulate this type of behavior in a way that allows
for a safe place on the internet while also not over-regulating speech, which is a challenge. I'm sure any of these laws that are created are going to face Charter challenges, unquestionably. And social media companies, I criticize them a lot, but they're also doing a lot of good work. They have content moderation; they allow for blocking and muting of people, which is a tool a lot of people use. Some companies blur images. Anyone who's used Bumble will know it's a dating app where you can exchange photos with each other, and of course sexual content is shared on it. When it comes to unsolicited dick pics, there might be a conversation you're engaged in where you get an unsolicited nude photo and you're interested in it, but the first thing that comes up is blurred. So if you're in the mood for that, if that's what you're looking for, you can click it and it unblurs; but if it's some jerk sending you a dick pic and you really don't want to see it, you can just swipe it away without having to see it. So companies are creating different tools for safety. Hashing has been introduced for child sexual abuse materials and nude images: when images of these types are found, they do what's called hashing, which basically creates a numerical code for the picture, and then Facebook, or institutions that try to delete child sexual abuse material, can use that hash to identify the image elsewhere on the site, or elsewhere on the internet, and find ways to take it down. Facebook also did one thing that was controversial, because it could be helpful but was also a bit weird: you could preemptively send in your photos. So if you broke up with your ex and you were worried they were going to share your nude photos, and you had all of those photos on your phone, you could send them to Facebook and Facebook would hash the pictures, so that if your ex later went to post them online, Facebook would already have a hash and the pictures would be deleted immediately. I think many of us would feel a bit uncomfortable sending our nudies to Facebook, but it was something they tried, and they are trying, right, like some companies are trying. There's also filtering. Algorithms play a big part in what content we see, what we see on Twitter, what we see on YouTube, so companies can work on their algorithms to de-prioritize hateful comments, to de-prioritize certain content. Google can de-index things; some people might be familiar with the right to be forgotten. Often what happens there is that the content still exists on the internet, but if you search for it, say "Suzie Dunn nudes," it won't show that, it'll just show my professional profile or whatever. It's just de-indexed, not actually removed from the internet. And then some content moderation allows for the deletion of content and the actual removal of users.
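To make the blurring and hashing tools just described a little more concrete, here is a minimal, hypothetical sketch assuming Python 3 with the Pillow imaging library installed; the file names are placeholders. Note that a plain cryptographic hash like SHA-256 only matches exact byte-for-byte copies of a known image; production matching systems such as Microsoft's PhotoDNA or Meta's PDQ use perceptual hashes that survive resizing and re-encoding, which this sketch does not attempt.

```python
# Minimal sketch of two moderation tools mentioned above: hashing a known
# harmful image so copies can be recognized, and blurring a preview so the
# recipient chooses whether to view it. Assumes Python 3 + Pillow; the file
# names are placeholders. A SHA-256 hash only flags exact copies; real
# systems (PhotoDNA, PDQ) use perceptual hashes robust to re-encoding.
import hashlib

from PIL import Image, ImageFilter


def fingerprint(path: str) -> str:
    """Return a hex fingerprint of the file's exact bytes."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()


# 1) Hashing: store fingerprints of reported images instead of the images.
known_hashes = {fingerprint("reported_image.jpg")}


def is_known_match(path: str) -> bool:
    """Check a newly uploaded file against the set of known fingerprints."""
    return fingerprint(path) in known_hashes


# 2) Blurring: generate a blurred preview the user can tap to reveal.
def blurred_preview(path: str, out_path: str) -> None:
    Image.open(path).filter(ImageFilter.GaussianBlur(radius=16)).save(out_path)


if __name__ == "__main__":
    print("Matches a known reported image:", is_known_match("new_upload.jpg"))
    blurred_preview("new_upload.jpg", "new_upload_preview.jpg")
```

The design idea is the same one described in the talk: the platform never has to re-circulate or even store the harmful image itself, only a fingerprint it can compare new uploads against, while blurring shifts the choice to view back to the recipient.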
Right now we're going through a really interesting time on Twitter, with Elon Musk purchasing it, and everyone is waiting to see whether people like Trump, who engages in a lot of harmful behavior online, will be allowed back on Twitter and how they'll manage that content moderation. Right now Twitter is relatively unregulated; it's an American company protected by Section 230 of, I'm drawing a blank on the statute's name, but it basically says that internet companies can't be held liable for the content on their sites. There are exceptions to that, copyrighted material being one, so when Elon complains that there are no exceptions, there are: copyright is excepted. And they also passed two laws which were quite controversial, FOSTA and SESTA, which were supposed to prevent trafficking material but also led to a lot of websites just removing all nude content because they were worried it might be classified as trafficking material. So there have been exceptions to this, but as Canada makes choices around content moderation, it'll be interesting to see how enforceable those rules will be. These are some of the recommendations that Cynthia Khoo made in her report Deplatforming Misogyny: requiring platforms to have easy-to-use, clear content moderation policies that people can understand, that are explainable, and that handle complaints expeditiously; and requiring them to publish comprehensive transparency reports, because what we know right now is that content moderation policies exist, but we don't know how companies are making decisions or what percentage of content is actually being dealt with. Content moderation is often farmed out to contractors whose job is to look at terrible content all day long with a few seconds to decide whether it's acceptable or not, and obviously people make mistakes in those jobs, so more comprehensive transparency reports would allow people to examine and critique those companies. She also recommends allowing for the immediate removal of certain extremely harmful forms of content, such as non-consensual distribution and child sexual abuse material, without having to get a court order. So those are some suggestions that LEAF made. Canada did produce a technical paper with an initial suggestion on how to legislate this, and ultimately they were following something similar to the German model: for five types of harm, child sexual abuse material, non-consensual images, terrorist content, content that incites violence, and hate speech, there would be requirements to remove the content within 24 hours. When they proposed that, there was a lot of controversy, because things like child sexual abuse material are very easy to identify and take down, whereas something like hate speech or incitement to violence is much more difficult to analyze, and people were asking whether we should expect social media companies to make decisions about constitutionally protected content where the line isn't super clear. So they went back to the drawing board, they convened an expert advisory committee on online safety that has produced some suggestions, and the government is now thinking about how to regulate this. The committee's suggestion was a risk-based approach: companies would have an obligation to identify the risks on their platforms and how they would mitigate them, and then they would have to report what they've identified and how they're mitigating those risks. If they don't mitigate them well, there might be something like administrative monetary penalties that the government could use to enforce better content moderation. It's a lot broader and allows for a bit more flexibility, and there are pros and cons to all of these. And then, pros and cons of content moderation: the pros are that it's faster and it's accessible; if you can just go to Instagram and have it dealt with, that's much easier than going to court. Encouraging some corporate social responsibility from these companies is, I think, a broader
social good. Again, as I mentioned earlier, it can promote more expression if more people feel safe to communicate their views. It's often the solution people want; they just want the content taken down. And it addresses some of this "awful but lawful" content. The cons are that, to date, it hasn't been very effective. It's very burdensome for governments to enforce: when we think about this, we think of the large players, but these laws could potentially apply to thousands of websites across multiple jurisdictions, which is legally very complex. The business model of a lot of these companies is not built for it; it's built on more content and more advertising, so enforced content moderation goes against the business model of many of these companies, and there will be resistance to these laws. And then there's also the risk of government overreach. The first proposal included suggestions that the government could go and, say, search Twitter's offices to get information in order to check whether they're fulfilling their content moderation obligations, and so there's a lot of push and pull over how much power we want to give the government to access content in order to review whether these companies are meeting the regulations they should be. And that's all; I'll just quickly buzz through the last few. So the final piece is non-legal responses. Right now there are a few organizations in Canada doing really good work. The British Columbia Society of Transition Houses has a tech safety website with tips on cybersecurity, how to protect your privacy, and how to collect digital evidence, and it's written for people who are likely going to be self-represented in court, so it's very accessible. And for lawyers who are unfamiliar with digital evidence collection, or who want to learn a bit more about it, or for law students, it's a really accessible resource. It is limited to British Columbia, but Rhiannon Wong, the person who headed the project, has been hired by Women's Shelters Canada to expand it across the country. It's a fabulous resource that I recommend people go to if you want to learn a bit about tech safety, even if you're not threatened by stalking or harassment. It's just really good on how to make sure you've got good passwords, how to check whether other people are accessing your email or your phone (there are really easy ways to figure that out that we often don't look into), and how to turn off your location tracking. The YWCA and MediaSmarts are also doing a lot of great work in this area. If you work with any young people, or if you have kids, this is an amazing guide; it's a guide for trusted adults. A lot of parents didn't necessarily grow up with the internet in their own lives and are now raising children with the internet, and it's hard to figure out how to do that well, so the YWCA created this guide on how to talk to kids about online safety and privacy, and they've also done a lot of youth-based research that's extraordinarily helpful. And this is just a list of some Canadian and international resources. Generally, as I come across things, I post them on my website, SuzieDunn.com; if you go to the resources section, I've got all these community organizations and cybersecurity organizations with how-to tips. So if you're interested in any of that, I keep an ongoing list. I don't update it all the time, so I can't promise whether it's
totally up to date, but there are so many resources out there that people can access. Okay, so thank you so much. We're almost at seven and I wanted to leave a little bit of time for questions, so thanks for coming. Any questions? Yes? Do you think the fact that the tech field is a male-dominated area, do you see that as impacting this? Yeah, it really contributes to it. What we see is that in most large tech companies it's predominantly men, male engineers, who are developing and producing these products, and there are certain things where you just think, man, if there had been a woman in the room, or a racialized person in the room, we would have known that, say, Apple AirTags could be used to stalk women. So I think there is a bit of an issue within tech companies that they don't have enough people, and not just people represented by their identity; putting women into tech companies doesn't necessarily solve the problem on its own. You need people who are committed to equality issues and committed to thinking about how to make products safe and how to make them good. Some practices companies have adopted include privacy by design, so every time you're creating new tech you should be running through how to ensure privacy is protected, and now there are pushes for safety by design as well, so that there's actually a process where people have to think through those issues. I think there's also a pipeline issue, where a lot of computer science and engineering students don't get this type of knowledge in their education in the way that, as lawyers, we have to take legal education and legal ethics courses; in engineering right now, a lot of the courses that deal specifically with tech safety are still optional. So I think there's also a need to change how we educate computer scientists and engineers, so that when they do get to create technology, they're more thoughtful about it. And for women who have startups, the rate at which venture capitalists fund women's startup ideas is so small, like so, so, so small. If you're a woman heading a company and trying to get money for it, it's very difficult, and especially if it has to do with safety or equality, it's really difficult to get funding. And you'll see men creating apps, and there are lots of great men in tech, I don't want to harp on that too much, but there are certain examples, like I've seen funding for an app where they wanted to, not get around, but digitize consent. Basically, when you're dating someone, before you have any sexual interaction with them, you'd sit down with an app, you'd put what your sexual preferences are on a blockchain or some sort of thing, and then if either person breached the contract there'd be a technical solution for it. But we all know that sexual consent is ongoing, and it doesn't matter what you fill out on an app, this is not going to get around consent, right? And those kinds of things get funded, which blows my mind. So I do think there are some issues with the tech industry, where they need to work a lot more on improving the technology they're creating. Any other questions?
Yeah, I was wondering if you could speak a little bit to deepfakes, because it seems like copyright sort of knocks out the problem for real images, but this is a more recent issue with non-consensual altered material. Yeah, it's really hard. And the thing with copyright is that the people who are probably going to be most successful with a copyright complaint about a deepfake are the people whose bodies are in the videos. This has come up as well for a lot of sex workers and porn performers, who say they're also being harmed by deepfakes, because their content is having someone else's face put on it and being misrepresented. But really, the person in the base video would find it easier to make a copyright complaint, because to make a deepfake you have to have a large collection of someone's images to build a mock-up of their face, and you can't really tell from the deepfake which photos were used. So even though there might be misuse of copyrightable images in that content, it's much harder to make a copyright complaint. A lot of other countries have altered images included in their non-consensual distribution laws, so their civil laws and their criminal laws include things like deepfakes. There was a woman, Noelle Martin, who had a lot of her images photoshopped and then later deepfaked, and she became a law student, became a lawyer, and did a ton of legal advocacy in Australia, so a lot of the laws in Australia include altered images. Here in Canada, I believe the civil statute in New Brunswick does include altered images; I'd have to double-check that, but when they were proposing their statute they had included altered images. So that's one place for it to be captured. And then under civil personality rights torts you can also capture them, but the challenge with personality rights torts is that in certain jurisdictions they lean a little more heavily toward commercial content; they're meant to stop your picture from being used in an ad. So depending on what province you're in, you might have more success if your image has been sold, but a lot of deepfake pornography is just made for fun, and no one's really making money off it, so it doesn't necessarily fit perfectly under those laws. And that's an area where I think there's a gap: what about our faces, our voices, our identities, which need to be protected from deepfake technology? Because voices have been used as well, not so much in this context, but people's voices have been deepfaked and used to call up a company, saying, hey, I'm your boss, the CEO, can you transfer $500,000 into another account, and the voice is so convincing that they do it. So stealing people's voices and images and faces is a new issue around fraud as well, but that would likely fit under fraud laws, yeah. I have a question about when the perpetrators are in Canada but the victims are elsewhere; is there a better avenue for those victims to get justice?
Yeah, I think it would basically be the flip side of what happened with Amanda Todd. And this is the interesting thing about tech, right: where does the crime occur? The crime can occur in the jurisdiction the perpetrator is in, but it also could occur here. There was one case in British Columbia where a woman's husband created a whole fake website about her, it was her first and last name dot com, and published all this harassing and untrue information about her. She initially called the BC police to say, I'm being harassed by my ex, but I live in the United States, and they were like, not our jurisdiction, not our problem. Then she went to the police in the States and said, this issue is happening to me, and they said, no, no, no, that issue is happening in Canada, not our issue, go to Canada. And again, it was one of these difficult situations where she ended up going to the media, and when she went to the media she was able to get the Canadian police to take it seriously. At first, again, they were a bit wishy-washy on the harassment charge, because for harassment you have to have a sense of fear, it has to be repeated harmful behavior that causes you fear, and there was some question around whether she could actually be afraid if he was in Canada and she was in the United States. Eventually they were successful and he did get charged with criminal harassment, but the challenge with him as well is that once he got out of jail, he just put the website back up, and he just doesn't care at all. So there are these issues: even if you have court orders and people are flouting them, how do you get content taken down off the internet? But it is a challenge; jurisdiction is a major issue. Yeah. That actually triggered a question for me, which gets to international law. Do you think, given the ubiquity of the internet and how it can be accessed basically from anywhere, is there any activity at all toward international action on this? Like some kind of international body, or a uniform remedy, whether it's criminal, civil or content moderation, that gets over the jurisdictional challenges? Is there any activity toward international regulation of this?
For child sexual abuse material there unquestionably is. There are international organizations, they share material and information with each other, and you do see a real movement on child sexual abuse material. And with tech-facilitated violence, just in the last four or five years we've seen a lot more attention from the UN; there have been a few initiatives, and just recently I got invited to be on a global committee with a few different countries that are looking to talk about these issues. So I think there's an appetite to think about it and to talk about it, and I think there is a model with child sexual abuse material that's already been implemented. But again, that's the one issue that everybody agrees on; when we talk about that material, there's no question it is wrong. For other things, like even releasing adult images of people, there's not a consistent response across the board, so you have to get buy-in from all these different countries, which I think will be a challenge. But I do think with the internet and these tech-facilitated crimes there is much more of a need to go international; even having consistency in what the laws look like could be helpful, so that people don't have to wonder, do I have to go to Canada to get this dealt with if the person's there, or can I do it within my own jurisdiction? So I think there is an appetite for international conversation, and we'll see where that goes. Okay, well, it's seven now, so unless anyone has one more burning question, we'll let you get back out into the rain. Okay, thank you so much for coming.