Go ahead. All right. Hi, everybody. Wow, I can't believe I'm getting to speak to so many of you at once. It's really exciting. If you didn't know, my name is Katelyn Bowden. I am the CEO and founder of BADASS, which stands for Battling Against Demeaning and Abusive Selfie Sharing. I'm going to do a brief intro of who we are and what we do, and then I'm going to introduce you to my amazing team. So here we go.

BADASS started because I found out that I myself was a victim, and I saw the scope of the issue and how big it is. I was a victim of non-consensual pornography: intimate images of me were shared online when I didn't want them to be. I started researching the websites surrounding this kind of content and saw how big the problem was, and how there were no resources for it. I got really sick of feeling helpless, so I started reaching out to other victims. I kind of wanted to crowdsource justice. I found all these amazing, wonderful human beings who were just as sick of feeling helpless as I was, and we got together and started brainstorming different ways we could fight back.

Now, before all this happened, I was a bartender. I didn't have any experience in any sort of tech field, or really much of anything other than making the best Long Island you'll ever have in your entire life. But I went from being a bartender to helping introduce legislation in the state of Ohio that would criminalize revenge porn, which we did get passed. We were organizing a huge group and helping out thousands of victims, helping them remove their photos, collect evidence, and see some justice. Then I got to go and introduce a federal bill, which is still going through the process, but hopefully pretty soon this might be a criminal act in the United States, which is what we'd like to see.
And now I get to talk to you guys, so really this is the peak of my career thus far. But why am I speaking to you? This is DEF CON; you don't want to hear about legislation right now, you're supposed to be here talking about hacking and tech. I'm here kind of because y'all keep insisting on it. Everyone keeps insisting I'm a hacker. I cannot write a line of code, but I have helped organize 1,000 victims to fight back, and we've gotten to do a lot of really cool stuff: from posting thousands of pictures of Shrek to drown out threads of nudes being shared without consent, to teaching victims tips and tricks on how to be secure and still express their sexuality online. And there are other things we've done that are best left for SkyTalks. My lawyer is here and she will yell at me, so I've got to be quiet on that one.

And here we are. I'm going to introduce you to this amazing and wonderful team. I'm actually just going to be moderating this panel and letting them speak, because I get a lot of chances to speak and it's time for their voices to be heard. So, without further ado, let me introduce you to my badasses.

First off, we have Kate Venable, the aforementioned lawyer. She is our head of legal. She keeps us out of trouble and helps victims with their own specific cases. Next we have Rachel Lamp, my COO. She is amazing and wonderful and has more empathy, I think, than any other human being I've met in my entire life. Then Marley Farlow, our CMO. She is creative, she is fun, and she is organized, which is the opposite of me; I'm very disorganized. Next we have Ali Barnes, our CTO, who explains all the things you guys talk about when I have dumb questions, and teaches me a little bit about tech.
And finally we have Timmy Doomsday, our CISO. He is a blast, he's hilarious to be around, and he is really focused on making sure that we have all of our security locked down tight. I'm going to be asking them some questions, and you can ask them some questions, so let's get this thing started.

So I gave a really brief recap of what we've done thus far and what we've accomplished with BADASS, but I would love to hear from the team. Let's start this out on the positive: I want to hear y'all brag and talk about your favorite thing that has happened thus far with BADASS. What is your favorite accomplishment? Let's start with Rachel.

My favorite accomplishment with BADASS is definitely getting that law passed in Ohio. It absolutely was life-changing to take something forward that would assist so many victims, because a lot of us in the beginning were based in Ohio. Being able to have a course of action for our immediate base was just really life-changing.

Yeah. Kate, I know you've got one.

Yeah, the law was awesome. I never really expected that to happen. In the same vein of things I never expected to happen: my best friend is like, "Let's start a nonprofit," and here we are. When I was in law school I never thought that this was anything remotely close to what I would be doing, and it's so cool, and it's so awesome, and it's so much more interesting than half of what all my friends do. So I really enjoy it.

All right, Marley.

As a team, I'd say every time we help a victim it's a huge accomplishment. When I see one of our victims come into our group and let us know that they actually received justice, that's definitely something to rejoice about. And on a personal level, my favorite professor from my university asked me to come and teach a course with her, to hundreds of students, about this issue.
I did it once and she's asked me to come back, and I'm just super pumped about that.

All right, Ali.

So the favorite accomplishment that I've gotten to be around and witness is definitely Tits for DEF CON; that was great to witness. And on a personal level, I think one of my favorite things has been the Cybersecurity Awareness Month that we did; stuff like that is my favorite.

All right. And finally, Timmy, the newest member of this team.

Right, so I'm like the squid, right? So my biggest accomplishment, honestly, is just to be here with these wonderful people on this panel today. I started out when Rachel recruited me as just a humdrum volunteer working victim intake, and now I get to sit here with these wonderful people and talk to all of you.

All right. So now that we've gone over the past a little bit, let's talk about the present. I know lately we've been a little bit absent on our social media pages; we've been taking just a teensy bit of a hiatus, although we are still reachable in case of emergency if victims need to reach out. So where have we been?

Well, I think we all took a break to reflect on what we wanted BADASS to be going forward, and also we were just really out of spoons. As most of us are victims of some sort of sexual abuse or image abuse, it really takes a toll to continuously reopen those wounds while assisting victims.

Okay. Sorry, I was just going to say: while we do a lot of remote stuff, we're not immune from the effects of quarantine everybody else is feeling. We're all still here trying to figure out what in the world it means to work from home and how we have to change our model; nobody knows what the next day brings in terms of COVID and all that.
So I think a lot of that went into our decision to just take a step back, because we're all trying to figure it out along with everybody else.

Yeah. I think a big thing also is that we started this and it just blew up, and we were learning as fast as we could the best ways to go about doing things and to run an organization at this size. I mean, we went from being a group of ten of us to a global thing, and we were just scrambling to catch up. We didn't have any procedures in place that were long-term sustainable; we were doing whatever we needed to do in the moment. We needed to organize; we needed to figure out the best ways to do things.

Yeah. And I think because a lot of us are victims, we came into the organization crazy passionate and wanting to take on anything and everything. So a lot of us were like, I'm in marketing, but then I'm going out and trying to do law enforcement training, and I'm doing classes at colleges, and taking on so much, because I want to and I'm excited to help and move this forward. And then, of course, burnout comes into play, and we have to look at ourselves as an organization and decide what we want to focus on and do well, and then expand from there.

Exactly. And you mentioned law enforcement training, so let's talk a little bit about the laws and responses from law enforcement. Just a quick thumbs up or thumbs down from the team here: do you think that law enforcement is doing an excellent job of handling reports and complaints when it comes to non-consensual image abuse? What do we think? All right, so let's talk about that. What do you think can be done to change that? Training, for one.
Yeah, well, for me, as a victim going through my own personal case, it took nine attempts to report my own victimization, and a lot of the reason was that nobody knew that they handled cybercrimes. They said they didn't have cybercrime units and told me to call the FBI, and it just blew my mind. I'm looking at a statute that literally says, number four, local law enforcement handles this issue. So why is there this disconnect between you and the understanding that you can do something for victims? So I'm just going to say training is one of the biggest issues here.

Okay. Something else that I noticed, and maybe Kate can speak to this a little more: when I was first brought on, I was working on building resources for victims, and part of that was going through and making the law a little more digestible. It seems like a lot of the laws on the books can't even decide what a nude is, and that seems to be the basis of the entire conversation. If we can't even agree on a vocabulary to talk about these things, how are we going to push effective legislation?

Yeah, I would definitely agree with that. It's not only just what is a nude; I know Mary Anne Franks has talked about this a lot in the work she does with CCRI. Most of the laws have these intent-to-harm clauses: basically, you have to prove that the poster intended to harm the victim in some way by posting it, and that's next to impossible to prove. That vocabulary, "malicious," makes it pretty much impossible to enforce at all. It goes to what you said: we have to agree on something, and then we write these laws and we think they sound all good and great, but then intent to harm makes it, I wouldn't say useless per se, but it makes it hard.
And it complicates things further once you get into crossing state lines or country lines, because the language of each state law is so entirely different. It's hard enough getting police to want to pursue justice for these victims, but once it crosses state lines they pretty much give up altogether.

Yeah, the language has definitely been an issue. Speaking of language: most people know the type of abuse we're fighting against as revenge porn. What do you all think about that wording?

I hate it. I get it: the media kind of jumped on that term and sensationalized it, especially with the Fappening and things like that happening. But when you really break it down, revenge porn specifically deals with intimate partners abusing one another, and that only represents 8% of non-consensual image abuse cases. So I don't like it for that reason, but more than that, the term "revenge" automatically implies that the victim did something wrong, so there's victim blaming built into it. And the term "pornography" has a false sense of consent built into it, because people who are creating pornography content are generally consenting to be taped and shared, which none of these victims are. So personally I don't like the term. I will use it in cases where I'm trying to raise awareness and people aren't understanding what I'm talking about, more as a reference point, or if I'm specifically dealing with somebody whose former or current partner is abusing them.

Yeah. I like the way you just dropped that stat out there: it's only 8%. Are there any more stats that you think are really important to share? Actually, since you are our stats person, you know this stuff better than any of us.

Yeah, it's actually kind of interesting, because there aren't a ton of resources dedicated to researching this topic. CCRI is amazing.
They have so far put out the most research on the topic. I just saw one study that came out in February saying the rate at which this is happening is one in three people being victimized, and that was just two months ago. But that was geared more toward European countries and wasn't a worldwide view. The most recent worldwide stat we have is from 2017, which said it was one in eight people. I'm sure we'll get into lots of reasons why that is rising so quickly, but it's hard to say; I'm going to go with probably something closer to one in five or one in six as of now. It's something that we really need research on.

Another interesting fact: three out of four victims are women, or in most cases female-identifying people. But in cases where the victim does identify as male, 90% of those cases incorporate sextortion as well as the image abuse. So those are just some facts.

Yeah, those are scary figures to hear. That many people this is happening to; this is a huge problem. But the biggest thing we hear when we discuss these sorts of things with the public or the press, immediately, the first thing we hear is: well, this wouldn't be a problem if you just didn't take those pictures. We hear it so much that I know we all have a ton of different responses we go to. So let's share all those responses and get that out of the way. Rachel, I can see it.

I have so many words for people who say such things, but I'm going to keep it professional-ish. It is not the victim's fault. It's 2020. We have cell phones attached to us; they're basically part of us.
It only makes sense that people would be using their cell phones in their sex lives as well. I mean, we use them to record ourselves when we sleep, to tell us when we fart or talk in our sleep; of course they're going to be incorporated into sexual relationships too. The thought of excluding that, or blaming a person for using and expressing their sexuality in a consenting form when that consent is abused, is just senseless.

One thing that bugs me is that people always say that when the fact is you don't even need to take a nude to become a victim anymore. I see deepfakes are already being brought up here in the chat, and that is something we have been monitoring. So what should we do about deepfakes? Do we see that becoming more of a problem?

Yeah, I feel like the media only recently realized that that's a thing, but it's something our organization has been talking about amongst ourselves, and within the community of victims and activists, for over a year now, I would say. And then all of a sudden, within the last six months or so, the media is like, "Oh, deepfakes," and I'm just looking at the headlines like, duh. But yeah, it doesn't necessarily mean you took a nude anymore; it means some jerk put your face on some random naked person and is now exploiting you on the internet. A deepfake victim came to us about a year and a half ago, and since then it's going to continue to rise, between human curiosity and us just being bored in quarantine; people have a lot of free time on their hands. Deepfakes are going to continue to be on the rise.

Yeah, I'm actually interested to find out what the stats are on that, just because, being quarantined, people are stuck on their computers, and there are all these apps coming out all the time that allow people to create deepfakes.
I know there's that one that came out a year ago, or a little less, that was kind of making waves, and I really wish I knew how to stop that or had any advice to prevent it from happening. But it's cropping up everywhere, and so quickly, that I'm at a loss personally as to what to do.

And it doesn't help when you think about the processing power it takes to do some of that stuff. It wasn't just sitting on your desk two or three years ago, whereas now I have a laptop that could probably generate a deepfake in an hour or two, with the right settings. And never underestimate someone who's time-rich and money-poor, which is a lot of us right now.

Yeah, that's exactly it. People have so much more time to write these tools now.

But beyond deepfakes, which are still a growing problem that we're anticipating, the majority of the people coming to us have had images leaked that they took themselves or allowed a trusted partner to take, and that were shared without their consent. So, say someone wants to continue doing that, which we encourage, but we want them to do it safely. What tips do you have for people to sext and take nudes and not become victims?

So there are a few things we tell people, and I feel like this is the perfect moment for it. I love how we do Cybersecurity Awareness Month, because it's a perfect time to flood everyone with all the tips every day, and Rachel did such a good job last year (I almost said yesterday) putting stuff together. But the basics are simple things: turn off the location on your phone and make sure your address is not attached to the data that's coming from your camera roll.
So pay attention to what apps you're using and what data of yours they're storing. Something a lot of people don't think of, that I've seen people get bit by, is Snapchat Memories. You take a picture on Snapchat and you think it goes away, but Snapchat saves it and tries to remind you about it later, and at that point all it takes is someone getting your Snapchat password, if you don't have everything protected properly. So: keep two-factor on for everything you can. Pay attention to what's in the pictures you're sending to people you don't know or trust; make sure there's no random box in the background with your home address on it. Those are the things that come to mind off the top of my head.

Making sure you know what's in your pictures: EXIF data. You want to make sure to wipe that so you're not accidentally sharing your location and things like that. What else?

Oh, I was just... sorry. No, no, I was done, go.

I was going to say, this has actually really helped my practice in helping victims, and in raising awareness in general about how to protect your own privacy: not just in terms of image abuse, but if there's domestic violence or stalking or menacing or whatever crime you can think of that involves somebody being creepy. It's basic privacy: change your passwords, turn on two-factor, turn off your location. Don't give your significant other unfettered access to where you are 24/7, because that's creepy anyway, but that's neither here nor there.
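The EXIF-wiping tip mentioned above can be sketched in code. This is a minimal standard-library Python sketch that drops the APP1 segments (where EXIF, including GPS coordinates, lives) from a JPEG; it is illustrative only, and in practice a maintained tool such as exiftool or an imaging library like Pillow is more robust, since this sketch ignores other metadata formats and unusual JPEG layouts.

```python
import struct


def strip_exif_jpeg(data: bytes) -> bytes:
    """Return a copy of a JPEG byte string with all APP1 (EXIF/XMP) segments removed.

    Minimal sketch: walks the JPEG segment list from the SOI marker,
    copies every segment except APP1, and copies the compressed image
    data verbatim once the Start-of-Scan marker is reached.
    """
    assert data[:2] == b"\xff\xd8", "not a JPEG"
    out = bytearray(b"\xff\xd8")  # keep the SOI marker
    i = 2
    while i < len(data):
        if data[i] != 0xFF:
            raise ValueError("corrupt segment marker")
        marker = data[i + 1]
        if marker == 0xDA:  # Start of Scan: the rest is pixel data, copy it all
            out += data[i:]
            break
        # Segment length is big-endian and includes its own two bytes.
        length = struct.unpack(">H", data[i + 2:i + 4])[0]
        if marker != 0xE1:  # drop APP1 (EXIF lives here); keep everything else
            out += data[i:i + 2 + length]
        i += 2 + length
    return bytes(out)
```

For example, re-saving a photo through this function before sending it means the file no longer carries the GPS tags your camera app may have embedded.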
But basically it's what Ali said: it's stuff people don't think about, and with the way all these companies like Google and Apple and whoever track our data these days, it's important to know what data we're sharing, who is seeing it, and what control we have over who it goes to, and where, and when.

And to add on to that: you can simply watermark your photos with the name of the person you're sending them to. You can also create a kind of contract between each other by asking your partner explicitly not to share these photos, "or it will cause me harm." Those are really important words, because we know the laws frame this around intent to harm, so if you can check that box immediately, screenshot it, and keep it in a file with their name, I think that would do a lot in the way of prevention, and also assist in your own legal case if it ever comes to that point.

Yeah. I always want to remind people: think about the future, and always hope for the best but prepare for the worst. Make that contract and take that screenshot; your lawyer, if you ever need one, will thank you for it.

So, that's really important. I know we had discussed Snapchat and various other apps that are being used to share these images. What responsibility do you think they have to prevent this sort of thing from happening? And where are you seeing platforms and companies falling short and letting victimization happen on their platforms?

I mean, you want the real answer? From my perspective, Facebook could respond to subpoenas that come from people who are not law enforcement, maybe, possibly, at some point in time. That might be helpful.
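The watermarking tip above can be automated. Here is a sketch using the Pillow imaging library (a third-party package, installed with `pip install Pillow`); the tiled layout, spacing, and opacity are arbitrary illustrative choices, not a method the panel prescribes, and the idea is simply that a leaked copy visibly names the person it was sent to.

```python
from PIL import Image, ImageDraw  # third-party: pip install Pillow


def watermark_for_recipient(src: str, dst: str, recipient: str) -> None:
    """Tile the recipient's name across an image so any leaked copy
    points back to the person it was sent to (illustrative sketch)."""
    with Image.open(src).convert("RGBA") as img:
        # Draw onto a transparent overlay so the text can be semi-opaque.
        overlay = Image.new("RGBA", img.size, (0, 0, 0, 0))
        draw = ImageDraw.Draw(overlay)
        label = f"for {recipient} only"
        # Tile the label on a loose grid; spacing and opacity are arbitrary.
        for y in range(0, img.height, 80):
            for x in range(0, img.width, 160):
                draw.text((x, y), label, fill=(255, 255, 255, 96))
        Image.alpha_composite(img, overlay).convert("RGB").save(dst)
```

A faint repeated label like this is easy to crop or blur, so it deters casual resharing rather than a determined abuser, but it pairs well with the explicit-consent screenshot the panel describes.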
If somebody sends stuff using Facebook Messenger, it's next to impossible to get anything useful for court, and since I'm not law enforcement, they don't care. Other people can speak to what they've experienced, but that's the first thing that comes to mind for me.

The first thing that comes to mind for me is hosting companies. There's usually a blanket policy that says, don't post these kinds of things, but image abuse, especially if there aren't minors involved, may or may not be on their abuse priority list. So I think it's important to hold these hosting companies accountable and keep letting them know when they're willingly hosting websites and clients that are committing image abuse. Like Cloudflare.

Oh, wait, what did we have planned for Cloudflare when we said that at some point? I just want to throw that out there.

Yeah, we're just waving, we're saying hi. We've run into these issues with Cloudflare for a very long time. They have repeatedly failed on this, and it's not just them, but they are the most egregious when it comes to it. What other apps, let's call them out: who is doing a really shitty job of letting victimization happen on their platform, versus who do you think is actually trying?

I think that Twitter is quite a shame. They could really respond to reports. I remember there was a Twitter account for one of these hosting sites, and it took two years for it to be removed, and they openly boasted about having specifically leaked images and revenge porn. So, Twitter.

I can speak to a couple of platforms that have at least been trying. Mind you, their methods aren't perfect, and obviously victimization still happens quite a bit on their platforms, but at least they are either trying to address the problem or prevent it.
Facebook came out with a pilot to try to prevent images from being uploaded in the first place. And yes, it's Facebook and nobody's going to trust them, and I believe the press kind of twisted it into "send Facebook your nudes," which is the worst idea ever. But they were trying: they acknowledged there was an issue, and they are trying something. The other one that has actually been good to work with, and I know this is a bit of a controversial opinion, is Pornhub. They have had their moderators involved with us from the very beginning, and given us a direct line to making sure things get taken down really quickly as soon as anyone discovers there's a problem. No one has really tried to make sure the problem never happens in the first place yet, but I'm waiting to see that happen next, and I'm really excited about that.

Anyone else want to call out a platform? Start a fight? Let's do it.

I mean, we've had issues with Instagram forever.

Oh, yeah, like reporting. I personally know someone whose victimization, and it's not necessarily revenge porn or image abuse per se, started with Instagram, and it's been years. It's taken thousands of people reporting, and millions of dollars extorted, for them to even do anything. CNN finally got the story recently, but other than that, it's insane. Everybody I know who has been victimized on Instagram has had the worst time even getting them to listen.

Yeah, and that's a shame, because I feel like a lot of high school leaks end up on Instagram, and the fact that they're not very responsive in a fast way is a shame, because that stuff should be knocked out relatively quickly.

I have one: Reddit. They try, sometimes. My qualm is with what they don't do.
They have moderators who do remove content, which is great, but they don't retain that content so that victims can claim it and use it as evidence, which is kind of huge. I'm in Florida, where it's a stackable offense. In my case, my image was posted on Reddit and also on another site, and the person who did it was only facing a misdemeanor, because Reddit took down the image and I had no proof that it was ever there. So I reached out to them directly and asked what's going on, can you put something in place: if you can flag images as non-consensual image abuse, then you should also be able to keep them in a repository saved for law enforcement to request later. Alas, there's nothing.

Absolutely. And how could we forget Discord? The first time I ever used Discord was because my images were in an entire revenge porn Discord server, which is lovely. They also remove content without preserving any of the data; you just have to really hope that the officer you're working with is able to send the data preservation request quickly enough before that content is gone forever.

And that's the stream getting dropped immediately. And that's not even just in image abuse cases with adults; we've run into that issue as well in a lot of our work that involves CSAM, which is the exploitation of children. It's been really frustrating to see companies drop the ball on that a lot.

So we've discussed the platforms and their responsibility. Just a couple of hours ago I saw a really great talk in Crypto Village regarding Section 230, as well as the upcoming EARN IT Act, which is gaining momentum through the process. What are your thoughts on those?

Yeah, CDA 230 has a lot of nuance to it.
It has some good points, but it also creates loopholes. I don't want to criticize it too much, because I'm not sure the loopholes are necessarily intentional. When we were talking about Cloudflare earlier: web hosts aren't responsible for the content they're hosting because of laws like CDA 230, and while those laws are meant to be helpful, they also create more problems. It needs to be retooled so that it's a better law with less wiggle room.

The biggest argument I hear, and a lot of our work whenever we're trying to talk about legislation regarding this form of abuse runs into it, is that NCP is a free speech thing. People claim that this is free speech. What are your thoughts on that?

Well, I think that nudity and nude photographs can be free speech, but once it enters the realm of harming people, that's when it exits. It's free speech up until a point. But at what point do we consider things to be harmful? I think it comes down to consent, personally.

Yeah, I'd agree with that. I think that's part of the argument. The intent-to-harm clause is what makes it not free speech anymore, but we can't define it that broadly and make it that vague, because it becomes unenforceable. Free speech is complicated; it's way more complicated than the average person would think. And yes, we have free speech, but that doesn't mean you can say what you want and do what you want without repercussions. It also doesn't mean that just because something is free speech, it's the right or moral thing to do. There's a difference between what our country allows us to say and what is morally and ethically correct.
Yeah, legality doesn't always equal morality, nor vice versa. Yep.

So, I've seen quite a few questions in the chat. Also, big props to my amazing husband, TC. If you haven't noticed, he has been helping us field these questions, making sure we see them, sending me messages I might be missing because I'm also moderating the panel and don't always have time to look at the chat, and basically filling the most amazing personal assistant role. So thank you, TC. I would like to start addressing some audience questions.

All right, the first one: given the type of victims that you help, do you find that you as an organization are now targeted by harassers and the like in order to gain access to lists of clients? They aimed that toward Ali, so let's start with you.

Yeah. So I would say it definitely makes us more at risk. I don't know if I would say it's people wanting to get a list of our victims or clients, the people we help, so much as it's people who are pissed off that we're doing what we're doing, and they feel this need to plot revenge against us. We're taking away their control, right? So it's revenge for the revenge porn. I mean, I think Katelyn can tell you, we have a whole lot of trolls.

And speaking of trolls, can we talk real quick about Tits for DEF CON from last year, and how that went? If anyone remembers that.

I remember. Yeah, that was fun. People were mad. They were so mad. "You can only use money you made in infosec to come to DEF CON. That's the only money you're allowed to use." Yeah, whore money should be spent on whore things. Exactly. That's just what we did. Right. I like that. Yeah.
And we've had members of the org be threatened, re-victimized, doxxed, all kinds of things. So I'd say, yeah, 100%, people do target us, or they try to. And, you know, we all support one another and do our best to fight against that too. I just wanted to show off the whore dollars, the stickers we had for DEF CON last year that say "I spent my whore money at DEF CON." And she's got a titty out; that's her favorite one. Just one? I think I have three. My three whore dollars. What are you going to buy? What whore things are you going to buy? Something special. I don't know where that came from. All right, so the next question. So we discussed whether we see ourselves as more of a target. I think so. We have had people try to infiltrate the private spaces that we have for victims. We have a lot of different things we do to make sure that when people contact us, they are who they say they are, because there have been threats made by people trying to gain access to victims. You know, there's a whole separate fetish for people that enjoy seeing victims' reactions when they find out their photos are out there. They like to get the screenshots of notifying victims, "hey, your pictures are out there," and watching them freak out. Somebody actually tried to do that to me on Twitter rather recently, and I just made fun of them. It was great, because I no longer give a shit. But that is a thing: gaining access to victims and watching them in their most vulnerable states. How awful is it that people get off to that? Just saying. The next question is: what are some privacy tips that parents can pass on to their kids or teens? You know, recognition of exploitative behavior, protecting oneself online from a malicious ex, and so forth.
I'm going to repeat what we said a little earlier: in general, have the talk about consent with your kids and your teens, starting at a really, really young age, because it applies to everything. It's never not sexy, and it's never weird, to bring up, "I want you to do this or that with my body, or with images of my body, and only that." So yeah, impress upon them the importance of consent and of having those conversations with their partners. If they are sharing images, which they probably will be, they do need to have those conversations about what is allowed to happen with the images they share. Yeah, I also want to say... Oh, sorry. No, go on, you were clapping. I was going to say, do not force your kids to hug people. I can't agree more. You can start talking about consent at a very early age. And it's not just consent you want to cover. You also want to cover agency: having agency over your own body, how to say no, how to pick out when something is making you uncomfortable. These are conversations that people don't want to have because they're uncomfortable, but they need to, because having them later, after a victimizing incident and after trauma, is going to be way more awkward. I promise. Not to mention traumatic. And I'm just going to pop in real fast, because you're right, consent and agency go hand in hand. Actually, just in the last couple of years there's a new method of looking at consent, called the triangle of consent, and I encourage everybody to look it up. It incorporates power, agency, and communication, and all three of those things come together to create consent. So I think that's a really good starting point for talking to your kids and teens. I also think a really important addition to that is, once your kids reach the teen years and you notice that they start to look at pornography...
I think that talking about consensual pornography, what that looks like, what they should be recognizing, and what they should be consuming is really important as well. Yeah, having healthy talks surrounding sex is really important, but it's also really important to talk to them about security and things like that, because this is a big place where both things intersect. I agree with that. I mean, if your child isn't comfortable doing something with their boyfriend or girlfriend, they should be able to say, "yo, I'm not comfortable with this." Teaching your kids to stand up for themselves will go a long way, and I know it's not necessarily image-abuse related, but it really is all about teaching consent and recognizing threats. Just teach kids to be careful when they're online, talking to people on TikTok and interacting with people who they think are one person but might be somebody else. Don't put your private information or your personal information anywhere. Just general opsec, I think. It's really important to start that conversation as soon as your kids have access to a device. All right. The next question, and I was told to ask this one next: Jordan wanted to know, "I have a famous friend named... you know, I'm going to redact the name, because the Streisand effect is real. She has a very bad stalker who has now been stalking my other friends who have worked for her, provided outfits, business, etc. Should she contact BADASS for this? She and her friends are very lost." Yeah, contact us. Yeah, definitely. Timmy, you're the one that said we need to take this question. Absolutely. I think if there's that public cry for help, that's definitely something you should contact us with.
And that's the whole reason we have this org. Absolutely. I do want to mention that while we do focus squarely on image abuse, which is the non-consensual leaking of nude images, it also has to do with upskirting and any sort of non-consensual behavior online or offline that involves images. We get approached a lot with questions about stalking, about harassment, about abuse, and while we might not have every answer, we do know a lot of people that we can point you to. And we will always do our best to address these issues as much as we can. So yeah, absolutely. I think the best way to sum it up, and I actually think Caitlin's the one that said this to me when I first started, is: if you think you need to contact BADASS, if that's even a question in your mind, reach out. Yeah, even if you don't think it pertains to what we do, odds are somebody in this group will be able to help you. Somebody has experienced what you experienced. Somebody is there. And that's the beauty of our group: you're not alone, and you always have somebody to talk to. I will say that when this happens to you online, especially... in my situation, I didn't understand the internet at all. I barely went outside Google and Facebook. So when it happened to me, I felt really, really helpless and overwhelmed and alone. Above all, I felt isolated and scared, because I didn't know who was doing this to me. I didn't know who had been helping them, or who all these people making horrible comments about my body were. I didn't know who they were, so if I turned to somebody for help, it might be one of those people, just looking to make fun of me more, to kick me when I'm down, when I'm feeling vulnerable.
So, you know, just knowing that you're not alone, and that there is someone you can talk to, someone you can trust who isn't going to judge you and isn't going to re-victimize you, is important. That can make a huge difference. So yeah, absolutely have them contact us. They can do so through any of our Twitter handles, and we'll put those in the chat, or through our website, www.badassarmy.org. I also saw one of the questions was how people can support us, and the best way they can do that is through our website. We have some donation links they can use, as well as just letting people know we're here. So, next question: what about cases outside of the US, like Ireland or the UK? We have a UK group. We know people. Our organization has contacts internationally. We can find someone who can help you, pretty much, is my answer to that. Our net is wide, and we all know enough people who know people who can point you where you need to go. I want to mention that the US is really far behind everybody else. That's what I was going to say. The UK, their laws are worded more strongly. And Australia, oh my gosh, Australia is really ahead on this. They have an entire section of their government, a government-funded organization, to prevent this. And they're really tackling things that most people in the US still don't acknowledge exist, like deepfakes. They're really starting to make things happen in Australia. Yeah, we still have a long way to go. That's literally what I was going to say, so yay.
Um, somebody said it really comes down to this: how can a law or set of laws be created or phrased to fight this issue without being used to clamp down on speech or sex workers, similar to FOSTA-SESTA? Which is, you know, something that by all means was meant to help people that are getting victimized, and to help prevent this sort of victimization by making platforms accountable, but at the same time, it hasn't really been used for that, has it? No, what it's been used for is to attack and harm sex workers. And it has a body count at this point. So how do you think we should address this in a way that isn't going to either (a) be used to harm sex workers or (b) be used to infringe on our free speech? Yeah, that's really hard. But my first thought is to focus the language, the law itself, on proof of consent. So if you're posting content, you're responsible for proving that the subject in that content has consented to being in it. Maybe if that comes into play somewhere, it will help a lot. Like requiring websites that allow pornography to get a release from every person that is actively participating on screen. Yeah. I mean, that wouldn't be any different than other websites that post verified content. Yeah, just verify the user. Speaking of being behind and how other countries deal with it, I think the US over-criminalizes this behavior. That's my personal opinion, not necessarily a legal opinion, but yeah, I think there's a way to write these laws that makes consensual sex work legal, doesn't criminalize it, and lets people do what people are going to do. I mean, we're a country founded on privacy. And a large portion of our victim base are sex workers, because, you know, you want your photos released in a consensual way to a certain market and a certain bubble, and once they're released outside of that...
It can be just as harmful as it is for anyone else, you know. So I think that these sorts of laws help and also harm. It's double-edged. Yeah, Rachel, that's a good point too. Just because you're a sex worker doesn't mean that your copyrighted personal work isn't your intellectual property that you get to keep and that you're entitled to. Consent for one doesn't translate to consent for many, and even transactional consent is consent; that is the thing that matters. And it's just as traumatic to have people you didn't want seeing your images see them, and often with sex workers, we find them doxxed on top of it. Right. So it's double the trauma right there. I mean, I think it's the ultimate irony that we're speaking at a hacker conference where we talk about how broken the copyright laws and the DMCA are, and at the same time, as an organization, that's one of our biggest tools to fight some of this, you know, the copyright laws. Oh yeah. Hey, we have a really important question I do want to address, and this is from a friend of mine. They asked: how do we allow rules that help victims without removing liability protections for platforms, outside of CSAM and CP? They're deeply concerned about the bills that have been proposed, like EARN IT, which is definitely a pro-censorship law, and, even more worrying, the LAED Act, which would outlaw encryption in the US, and they want to know what the panel thinks about this. Anyone else want to take that one first, or is it cool if I jump in? You can do it. All right, cool. Go for it. You know, I kind of started in on this one with FOSTA-SESTA and how these laws work. They're using CSAM as a way to put a friendly spin on the law, like, if your lawmakers aren't passing it, that's because they hate kids. But really, when it comes down to it, it's removing the privacy of every user.
And the fact is, we've seen laws like these put in place, and we see exactly how little they actually help prevent the things they're supposed to be preventing. What these laws are going to do is increase censorship and destroy freedom, and I don't think that's right. I hate these laws. I feel like they've been bought and paid for by certain very specific lobbies, and I think they're bullshit. Everybody needs to get involved. If there's anything BADASS has proved, it's that everybody can get involved when it comes to legislation. If you care about something, you need to start calling and making stuff happen. Call your lawmakers and talk to them about the EARN IT Act, and about how encryption is really important and needs to stay something we have available. Yeah, I mean... oh, sorry, Timmy, go ahead. Real quick, I'll take a line out of a talk Cory Doctorow gave at DEF CON a couple years ago about the war on general-purpose computing. Something these lawmakers forget is that it is not possible to make a general-purpose computer that does everything except the one thing you don't want it to do. That's just not how computers work, that's not how math works, and that is a nuance that is lost on the old white men writing these laws. Yeah. Real quick, we've got like a minute. Oh, I was just going to say, I thought you were telling me to stop. Sorry. That's a really good point, Timmy, but also, to back up what Caitlin was saying, there is a balance between free speech and censorship, and if we're going to start doing things like outlawing encryption and censoring material, we really need to take that into consideration. Without boring y'all with con law jurisprudence, that's not something to be taken lightly, and I think we really need to examine how we do this and what precedent it could set for the future.
I think Kate just really wanted to say "jurisprudence." I did. Okay, all right, as long as we got that cleared up, goal achieved. All right, real quick, we are going to be going into the voice chat, I know I am, because frankly, y'all can't make me shut up. But real quick, I just want to see if anyone has any final messages they'd like to share with the people watching and listening. Okay, any final message? Rachel. I'd say, everyone, please reach out if you have a question, comment, or concern. We would love to hear from you and we would love to integrate you into our group. Sorry, I cut out on the webcam. But, you know, there's a place for everyone, and you do not have to go through image abuse alone. We're here for you. Oh, me? Yeah. Okay, I'll keep it quick. Be awesome to one another, fuck Cloudflare, Black Lives Matter. All right, Ali. Gonna echo Timmy's fuck Cloudflare comment. Thanks for listening to us. Reach out if y'all need anything; we're always around to answer questions, and we definitely are going to keep talking about this stuff. Marley. Ditto on what everybody said, and to add to it: please keep having this conversation about how we can make laws that don't infringe on speech and encryption but that will help victims. That's a really important conversation to have, and your voices are really helpful. Kate, do you just want to say "jurisprudence" again? No. Black Lives Matter, and consent is key. Right. I'm going to wrap this whole thing up by saying thank you to everyone out there watching, and thank you to everyone organizing DEF CON this year. I know it has been crazy and chaotic, having to do a lot of things that are brand new, and I just want to say it's been wonderful thus far. I miss you all, and I can't wait to see you in Vegas next year.
And meanwhile, you know, life is short. Have all the orgasms, consensual, healthy orgasms, or cookies, depending on what you want. And just stay badass.