All right, so you are in room two, and you are about to hear a talk on safer online sex, harm reduction, and queer dating tips. If this was not your expectation, now is a good time to change rooms. The talk will be led by Norman Shamas. Norman Shamas is a security and privacy harm reduction specialist. They work with activists globally, with a particular focus on sex workers and queer, trans, and gender non-conforming communities. Norman works as an independent consultant, and they are a member of the Open Privacy board of directors. You have the floor.

It's really great to be here at NorthSec, and to have a day where the keynote was talking about some of these issues around users, and where the talk right before, with Alyssa, covered some of the ethics involved. I really appreciate that this is here in this track.

Before I actually dive into the topics, I want to start with a content advisory. I know we have the code of conduct, but I want to make sure it's clear that this is a sex-positive talk that focuses on marginalized communities. My intention is for this to be an inclusive space that's free from stigmas around the use of tech for sexuality. Since I'm talking about harm reduction, my work draws on examples of sex work, sex education, drug use, and homelessness in a broad, big-picture sense. There's nothing explicit or graphic in terms of discussions of sex acts or drug use, but I want to give people a heads-up that that's part of what we're going to be talking about.

As Alyssa helpfully framed things, I want to get an idea of who's thinking about the users in tech right now. Is there anyone in the room whose work is actually aimed at users? So I see a couple of people. Let's have maybe one or two people do a quick shout-out of what they're doing. Please raise your hands, because my visual memory is not the best. Oh, okay.
Well, I'll go on your hand.

I'm Florence. I'm working with Courier, basically building a website for queers in a feminist way, to give them a voice online, freely.

Awesome.

I'm Harlan. I work for the United States Digital Service, and we're helping people who live in the U.S. better access government services of all kinds.

Awesome. So, alright. Yes?

Hi, I'm Maggie. I'm a Mozilla fellow this year, I'm also Tor's user advocate, and I'm working on anti-harassment initiatives.

Awesome. Thanks to everyone who shouted out, for sharing. Part of what I was thinking about is that we also have these large categories, and I'm not certain what the exact role is of everyone who spoke up, but we tend to have UI/UX designers; we have people on product teams who are thinking about features; we have policy folks who are thinking about, for example, anti-harassment policies and what they look like. But the security team is generally not thinking about the user, or if they are, they're thinking about the user in a way that might not be the most positive. You might be thinking about the user as a potentially malicious actor on a trusted network, using language like that. Insider threat, maybe, right?
That's the sort of language that security folks use. So the question really comes down to: who's thinking about the user's safety, security, and privacy in tech? This is the question I've been interested in with my work in tech in particular, but it comes from before that, because I've worked within marginalized communities for quite a long time. In those spaces, tech is not necessarily a solution; it's sometimes a new form of oppression, or a new risk that emerges. So how are we actually grappling with this? We see examples where, even though there are people thinking about this, they're not necessarily doing a good job.

We see it, for example, with Uber. They recently sent out this "check your ride every time" email to remind people to make sure you're getting into the correct Uber: they're telling you to check the license plate, the make and model, the driver's photo. It's actually really nice that they made this, but the context of why they made this email is that someone got murdered after getting into a car they thought was an Uber, and it wasn't. And the way they responded only focuses on a very specific scenario, one where the user is inherently at fault. It doesn't look at or talk about some of the other publicly discussed cases that have happened, things like drivers assaulting users, or drivers going out of their way to make a higher fare. I'm sure some of this has to do with the question of liability, which I know again came up a lot during Alyssa's talk, and that's surely in the back of the mind of all of these companies when they decide how to do some of this messaging.
Another example is around what we call cyber-flashing. There have been high-profile cases of AirDrop being used to send people unsolicited images. An example of what this looks like is the middle picture, with the censored phallus; there's a headline from Huffington Post noting that this is a sexual offense, and a recent article from Rob Pegoraro saying that Apple could fix the AirDrop issue with some technical fixes. This has been an issue for four or five years now; it is not a new issue. And it's really interesting, because some of the technical fixes are not necessarily very difficult. One of the things Rob mentions in the article is that you could have a time limit on accepting from everyone. But most people focus on "the user should have known", in terms-of-service style, or "the user has the control and they've just configured it improperly". So, you know, these are some of the people thinking about users and tech and security.

Perhaps another high-profile case is Grindr. Grindr has been getting a lot of flak recently, most recently with this article that talks in part about the current case of Herrick v. Grindr in the US. It's about a Grindr user whose ex created a fake profile and sent a bunch of people to the plaintiff for hookups. This gets into a very complicated case that involves: what's the platform's liability in this?
There has now been a lot of discussion in the US and other places, but it opens up this perception, with these continued issues, that platforms are not thinking about, or don't care about, user safety. So how can tech or security better support user safety? This is where I enter, and I come at it from a harm reduction approach.

I know I'm not the first person to talk about harm reduction within security. I think one of the earliest examples is from 2012, when Violet Blue gave a talk about harm reduction focusing on hackers as a population for harm reduction work. Has anyone seen that talk? Only a couple of people; I just wanted to put that out there, that there are other people to potentially go see, which is really great. But I'm also taking a slightly different focus, because I'm focusing directly on the end users of the apps, and not on the hackers themselves.

To start with harm reduction, I like to think about where it came from. This tweet, very helpfully to me, succinctly notes that harm reduction was started by sex workers; queer and trans people of color; people who use drugs; people in the streets saving their own lives; and all the intersections thereof. Not by public health folks. I note this to respect the origins, because when people talk about harm reduction, they immediately go to talking about public health, because public health programming is where harm reduction has really entered into policy, and those become the easy examples for people to think of. But my own experience and approach come from working with marginalized communities, and being a member of some of these communities, so I have a potentially different perspective than just that policy route. My work is deeply indebted, in that sense, to the communities I've been a part of.

As for the aspects of harm reduction, a quick and easy way of thinking about it is simply that it aims to minimize harm from activities. These activities are usually illegal, but they don't have to be. It's explicitly opposed to stigmatization and judgment around the activities, and this is a really important point, because so often people who, for example, use drugs are stigmatized, and people say they deserve whatever happens because they're using drugs. It's not about ignoring the harm that comes from activities; it's very upfront about that. It's about putting the power in the hands of the communities that face the risks. There's even recognition of new forms of harm that can come with harm reduction approaches. What I mean by that is: for sex workers who carry condoms, if you carry more than about two condoms and you're stopped by police, that can be used as evidence of prostitution. So using harm reduction methods, like using condoms when you're meeting someone to have sex, can also be a way of putting someone at a new risk. Harm reduction is also upfront about that.

The key parts, at a higher level, are that it's opposed to abstinence approaches, this idea that you can only have one or the other, and that it recognizes the larger structures and systems in play. That means you're taking an inherently intersectional approach, looking in a broad way at what's happening. What's harm reduction for a white sex worker might be different than what's harm reduction for a Black sex worker, because of the way that police interact with the sex worker, for example.

Part of this is the fact that harm reduction inherently recognizes the power of pleasure. And when I talk about pleasure here, I don't necessarily mean just sex or drugs. It can be a more abstract idea, like being able to dress as, or be, the person you want to be or feel you are. It can be the ability to take time off work and just have some downtime. So when we put things in this
approach that says you have to either do something that's pleasurable or you have to be safe, most of the time people are going to do what's pleasurable. That creates a danger where people might know the risks but say, essentially, fuck the risks, because what's pleasurable is more important, and no one's trying to tell them there are other options.

So I'm going to run through about four examples of harm reduction programs, to give an idea of what this looks like in practice.

The first is needle or syringe exchanges, one of the classic examples of harm reduction. From a public policy or public health perspective, one of the goals of needle exchange programs is to reduce the spread of HIV, hepatitis, and other blood-borne illnesses. The basic idea with needle exchanges is that you go to a place, or there's potentially a van, where you're able to give a used needle and get a new one, usually for free; there might be some small cost involved. The goal is that you're not sharing needles, so that if someone's infected, they don't spread the infection to a lot of people. It's not trying to get people to stop doing drugs or other uses of needles. Initially, these were actually seen as illegal: people running needle and syringe exchanges could get in trouble with the law, because it was seen as supporting drug use. As things have moved forward, part of the goal of needle exchange programs has been to change the laws, so nowadays most needle exchange programs are legal. Canada has even started piloting needle exchange programs in prisons, which is pretty radical, actually, that they have this recognition and are willing to go that step, with governmental support. So they've moved forward to make what they're doing more acceptable, and oftentimes people are working at de-stigmatizing drugs at large, working on things like decriminalization or legalization.

Another prime example of harm reduction programming is safer sex and reproductive justice. Usually people talk about this as just safer sex, but within the US context of what's going on right now with abortions, I think it's really important to note that part of this includes safe access to abortions and to other things linked to reproductive justice and family planning; that's also a big piece of harm reduction. I'm assuming some people have been exposed to some variation of safer sex education, through knowing that condoms might be a thing, but sometimes these don't go far enough. There are additional types of protective equipment and prophylactics, like dental dams and others, that people use. There's PrEP, pre-exposure prophylaxis, which helps prevent the spread of HIV and has actually been very effective. And there's having access to regular STD/STI testing, to help ensure you're not spreading infections. Those are some examples of harm reduction programs within the safer sex and reproductive justice world.

But there are also what we'll call human, or behavioral, practices that people are able to adopt. This can include negotiating sexual encounters prior to having sex with someone, which is very common within kink communities, to ensure you have things like safe words set up; that way, while you engage in sex that might be more dangerous, or is actually seen as more dangerous, you're able to stop, pause, and cease the activity, to make sure everyone is okay. It can include sex workers charging less for someone to use condoms, because economics is a reality. It can include not brushing your teeth or flossing before or after oral sex: even though that's a relatively minor risk of exposure, usually when you brush or floss you do have a little bit of bleeding in your gums.
So again, there are all of these different types of small practices built into the harm reduction approach. I know I could do a whole talk on the sex ed alone, but we're going to move on; if anyone has questions about any of this, I'm happy to talk afterwards or around the venue.

Another example, moving away from the traditional examples people talk about, is homeless communities; a lot of my experience comes from working with homeless youth. A drop-in center is a place that's not a shelter, but where someone can go during the day to get services. Some of the services they provide: access to food, clothing, and medical supplies; computers, so people can communicate with others and create resumes; a mailing address. In the US, and I'm not sure how accurate this is in Canada, a mailing address is still a necessity to get certain government services, so having one is really important. They can offer skills-building opportunities. One of the places I'm familiar with used to have a screen printing workshop, for example, where homeless youth could gain actual vocational skills by learning how to do screen printing, and if they did it long enough, they could create their own art. So they were given opportunities for expression, and they actually made money off of this: they gained not only skills but a form of economic independence. Some drop-in centers will even offer street outreach after they close, going out to provide access to food, clothing, and medical supplies to people living on the streets who might not come into the drop-in center.

The last example I want to give is humanitarian assistance for border crossings. This example is quite specific to an organization in Arizona called No More Deaths, or No Más Muertes. They've hit the news recently in the US, because the folks working for No More Deaths are being targeted with surveillance, harassment by police, and legal cases for doing this work. The work they primarily do is exactly what you see up here on the screen: they provide food and water for people crossing the desert. Having lived in the desert, I can tell you that even in city centers, where you can easily go and pick up water from a gas station or a corner store, people die every year from dehydration and heat. So imagine you're crossing a desert where you don't have that kind of infrastructure support, and how important that water is for people to be able to stay hydrated and keep walking. At the moment there are a lot of conversations about what's going on. There's a lawsuit about whether this is considered legal or illegal activity; the organization says that what they're doing is legal, and they provide legal support services. But it's an example, right now, in real time, of something being debated that's not explicitly called harm reduction in the same way other public health programs are. So: harm reduction is inherently radical, if you hadn't picked up on that while we were talking about No More Deaths.
There's been this very strong xenophobic, anti-immigrant, anti-migrant rhetoric cropping up in the US and globally; it's not just the US, unfortunately. So while the media and policy folks are trying to dehumanize people who are crossing, and part of this is language like "illegals", the goal of No More Deaths is simply to view them as humans who deserve to live, and to provide water. That's really radical in this context and in this day and age.

Harm reduction actually does exist in the tech space as well, and I want to show at least a couple of examples, so we can see some of the ways this is already being integrated.

The first one is about social media and privacy. I have a tweet here that I wrote a while ago, in response to research that Dragan Akarian did on the surveillance and intimidation of people live-streaming police brutality in the US. The tweet simply notes how important it is to have access to live-streaming and to social media platforms, beyond just connecting people, because this is about preserving evidence, right? And why "delete Facebook", which has been such a common refrain, especially after the Cambridge Analytica conversation, just doesn't work in those situations. But it goes beyond preserving evidence; it links into the way we think about connection as well. I tend to do a lot of work in places where Facebook's Free Basics, which used to be called Internet.org, is active. What that means is that people have free access to Facebook, free access to Instagram, but they don't have free access to email. So things like Facebook become a very important way for people to connect to each other, to family, to family abroad, things like that.
So when we talk about things like "delete Facebook", we're assuming there are alternatives, which we have. Part of this humanization process is noting these other examples and thinking about these people: we're not necessarily part of their group, but we can see them as deserving of connection in the same way that we are.

The other example is about intimate images, what people might talk about as revenge porn. A very common refrain I hear is: if you don't want your nudes online, just don't take them, or don't send them. This, to me, is a perfect example of the abstinence approach within tech. It's saying you can either do something, like send a sext, or not; there's no in-between. And it ignores the way that sending nudes and sexting can be an important piece of someone's self-identity. It can be a way someone comes to feel okay with their body, and has body positivity, in really powerful ways; it's about identity and things that are really important, beyond perhaps just pleasure and an intimate moment between individuals at a distance. There's an organization called Coding Rights that has an awesome zine called Safer Nudes that takes a very sex-positive approach to this. So I highly recommend, if you sext, or you're interested in safer ways of talking to people about sexting, taking a look at that zine.

I also want to highlight Facebook's response to revenge porn, because this was quite controversial, but they were actually trying to do harm reduction at a platform level, whether they talked about it that way or not; that's really what was going on. Facebook's response, if you're not familiar, was to say: send us your nudes, and we'll hash the image, and if that hash appears, we'll stop it from being uploaded or spread. Again, it's really great that they're trying this harm reduction approach. It's unclear if they talked to people and built and designed with them, as opposed to just talking to people and then building on their own. It also ignores trust, and the role that trust plays in this: whether people actually trusted Facebook enough, especially with some of their other issues. But to me, it's a really great example of moving away from this conversation of perfect security or perfect privacy that we tend to have.

So before I jump into the case study, I just want to see if there are any questions about the harm reduction approach. Cool.

If I can have, let's say, a video hashed and kept off of Facebook because it's revenge porn, can I also submit a video documenting police abuse and have that kept off of Facebook as "revenge porn"?

Sorry, I might not be following.

Well, I can rephrase. If we send some content to Facebook, and we determine that this would be objectionable to have shared, because it's an intimate photo, the same technology could be used to censor any other content they would deem unacceptable.

Yeah, yeah. So, yes, this comes down to what we'll perhaps say approaches, right?
And this is why I talk about the Facebook revenge porn effort as, we'll say, an example, not necessarily an example that needs to be followed, or the only way of going about this. Because, as you bring up, there are issues of them being able to potentially censor, or use this as a way of, say, using AI or machine learning to automatically remove content. You actually saw some of this with YouTube taking down videos that were evidence of human rights abuses and war crimes, because of AI moderation and things like that. So again, the same things that are potentially used for some of this harm reduction work can also be, we'll say, misused in other ways. The key thing I would like to promote with harm reduction is how it is aimed at putting the voices of the at-risk communities at the forefront. One of the reasons I noted that it was unclear whether Facebook designed this with people who have had non-consensual spread of intimate images is that, if they didn't really center those voices and design with them, I'm not sure whether it's going to be a long-lasting or good solution. If our goal is to really reduce harm and build something that will last, that's not necessarily the right route. Does that help? Cool, we can talk afterwards.

So, okay, let's jump into the... so, was there another hand that I saw? Oh, shit.
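To make the hash-matching idea from the Facebook example concrete, here is a minimal sketch. It is illustrative only: the names are invented, and production systems reportedly use perceptual hashes (such as PhotoDNA) that survive re-encoding and resizing, whereas the exact SHA-256 digest used here only matches byte-identical files.

```python
import hashlib

def image_fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest identifying this exact image file."""
    return hashlib.sha256(image_bytes).hexdigest()

class UploadFilter:
    """Holds fingerprints of reported images; never stores the images."""

    def __init__(self) -> None:
        self._blocked: set[str] = set()

    def report(self, image_bytes: bytes) -> None:
        # Only the fingerprint is retained after a report.
        self._blocked.add(image_fingerprint(image_bytes))

    def allow_upload(self, image_bytes: bytes) -> bool:
        # Block any upload whose fingerprint matches a reported image.
        return image_fingerprint(image_bytes) not in self._blocked

# Usage: a reported image is blocked on re-upload; other images pass.
filt = UploadFilter()
filt.report(b"reported-image-bytes")
assert not filt.allow_upload(b"reported-image-bytes")
assert filt.allow_upload(b"some-other-image-bytes")
```

Note that whatever matching function is used, the platform, not the user, decides what goes on the blocklist, which is exactly the censorship concern raised in the question.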
Sorry, my computer just went to sleep. So, we'll jump into the case study in just a second. We're talking about queer dating apps here, and this work really comes from a project that was run by Afsaneh Rigot from Article 19, looking at apps and abuses that were going on in Egypt, Lebanon, and Iran. I want to preface this: even though I'm the one here talking about it, there were so many other people involved in this project, whom I'm not necessarily going to be able to name, in part due to safety issues. This is not me doing all of this work; there was an entire coalition, and people who volunteered their time to do this.

The background is that it really started back in 2014, in Egypt, after al-Sisi took power. We had reports of social media and apps like Grindr being used to trap gay people, and we were told this was linked to geolocation. At the time, Include Security had published their research on Tinder and geolocation, and another researcher had created a Grindr map that was able to identify and locate Grindr users. The gay dating apps had actually responded: Scruff, Hornet, and Grindr all did little user safety awareness pop-ups. So we're jumping in and starting slightly from that point, and we started with the users; that was our main goal. We wanted to basically help queer folks connect in these countries, and, if they want to hook up, to do that. But our main goal was to understand what's going on and how they're connecting.

As for our methods, we used a lot of different types: a landscape overview, which is desk research; a tech usage and awareness survey, which again had heavy framing on geolocation features, and is probably the least qualitative part of our methodology; and interviews, including group conversations.
We had legal review of court cases. We did security and privacy analysis, and we had direct collaboration between the security and privacy "experts", and I put experts in quotes because there's lots of questionable advice that gets given, and the queer communities themselves. In particular, the interviews were actually run, and helped designed, by some of the different local groups we worked with. I'm not going to name them right now due to time, but if you are interested, there are a few that are public.

Our initial results showed that it's not only dating apps being used for hookups and dating. This was especially true in Iran, due to censorship: things like Grindr were censored, so people tended to use things like Instagram, because you use what you have available to you. Participants were largely aware of the risks. They were even doing things like GPS spoofing and other methods to try to protect themselves. The GPS spoofing is a little questionable, because sometimes people would root or jailbreak their phone, which could expose them to greater security risks. That's also part of the harm reduction approach: noting that, absent advice, people could unintentionally be making themselves insecure, or putting themselves at greater risk or danger. There was also a tension between anonymity and the lack of trust among users. I highlight this because we didn't really go anywhere with it, but I think it's a common conversation on all of these platforms: how do you think about authentication of a user beyond, say, a password, so you know whether it's actually a person or a bot, and then how do you build that trust in these digital spaces? And, perhaps in a fun way, the participants weren't only interested in digital risks; they wanted other information, about sex ed and legal issues, as well.

So I'm going to go through a couple of key results. The first is that participants wanted to use the apps despite the risks, and this is, I think, probably the most important result when thinking about why to take a harm reduction approach. The typical approaches to security were not addressing the participants' needs. There was a lot of conversation like "why don't you use encryption?" or "why don't you just update your phone?", and the response was basically: that's not what's going on here. In part, that's because the primary risk was other users on the platform. That's one of the reasons traditional security advice was not working: it wasn't thinking about the risks the users were facing, it was thinking about risks to the system, to the app platform. So something like end-to-end encryption, or using Signal or another secure messenger, would have done nothing to mitigate the risk of the other user being malicious.

This one actually really surprised me: people liked the location-based aspect. The reason it surprised me, I think, is that within the larger conversation around privacy in the US, Canada, and Europe, we tend to be so anti-location-based services; we talk about it as creepy, all of that. But what we were learning is that
location is really important for safety decision-making. Some people want to be able to meet up with people in their area, so they can meet in person. For other people, there was a risk that you might be identified from meeting up with people in too close proximity, so you'd want to make sure to meet people further away, and you might not actually communicate with someone who's close to you.

There we go. So, as I said, we also did this legal analysis, and Afsaneh, the principal investigator I mentioned earlier, is the one who really developed this methodology around looking at the evidence. We can think of it as legal forensics: trying to pick up the pieces from the evidence that was being used, to understand what's going on and to identify ways to reduce harm. What we saw was that accounts were created explicitly for entrapment or to identify suspects; that can mean, say, infiltrating group messaging apps. Screenshots and watermarked pictures, and some of the apps used to put watermarks on pictures, were used for blackmail, or for blackmail leading to an arrest. And since the contexts we're talking about include checkpoints, the presence of the app on the phone was a risk in itself: at a checkpoint, someone might take your phone, tell you to open it, flip through it, and say, oh, you have Grindr, or you have Hornet, or you have these images. So that was also a key risk. Perhaps importantly, and similar to conversations about sex work and how sex work is policed, no sexual activity was needed for the arrests. They had other ways of going about this, so even in places where homosexuality wasn't outright outlawed.
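Coming back to the location findings for a moment: one way an app could keep the location feature people valued, while blunting the trilateration risk from the 2014 research, is to snap reported coordinates to a coarse grid. This is a hypothetical sketch, not what any of the apps shipped; the roughly 1 km cell size is an arbitrary assumption, and longitude cells narrowing at high latitudes is ignored here.

```python
import math

# Roughly 1 km of latitude expressed in degrees (assumption for illustration).
CELL_DEG = 1.0 / 111.0

def snap(lat: float, lon: float, cell: float = CELL_DEG) -> tuple[float, float]:
    """Return the center of the fixed grid cell containing (lat, lon)."""
    return (
        (math.floor(lat / cell) + 0.5) * cell,
        (math.floor(lon / cell) + 0.5) * cell,
    )

# Two users tens of meters apart fall in the same cell and report identical
# coordinates, so repeated queries reveal the cell, not the exact position.
a = snap(45.5088, -73.5617)
b = snap(45.5090, -73.5619)
assert a == b

# A user in another part of the city lands in a different cell.
c = snap(45.6000, -73.5000)
assert a != c
```

A fixed grid is deterministic across queries, which matters: per-query random noise can be averaged away by an attacker who is allowed to ask repeatedly.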
They used other other laws and again, there were other ways of Bending the laws or interpreting the evidence to be able to do arrests So we found that understanding the digital evidence was actually an essential Point to basically understanding sort of the adversary methods or what's going on? And so if we hadn't done this we couldn't have made some of the recommendations that we did So in the one of the key recommendations that we had was this app cloaking feature and app cloaking is something that came from Work that was done by the Guardian project and the Guardian project had designed for the Android a way to change the icon of the app So it didn't look like you know that original icon and so again It's basically just changing the app icon and that was a key recommendation Especially for that part of having that presence of the app on your phone to follow best practices for transport security and I sort of chuckle as I say this because if we think about where the ethics involved and like trying to build some of These things in the process. 
Transport security is theoretically something that should have been done from the very beginning. At the time we did the research and had these conversations, none of the apps had implemented SSL pinning, or certificate pinning, and I don't think any of them were even using HTTPS, secure transport, for their images, for their media servers. Which is huge when we know that images are being used as evidence.

Working with the local community is a key recommendation as well, and that goes back to centering those at risk. Increase the safety of the geolocation feature: we provided some potential recommendations, but we also noted that this needs a lot more research. So again, we were not saying to get rid of the location feature, which was part of the initial response in 2014; apps would also, at that point in time, give you the option to turn off location services, so you're not reporting your location to the app.

Within the implementation, what we did as a result of this, other than coming up with that list of recommendations, was to actually work with Grindr to implement some of them. Among the fun things that came out of that is a new method for app cloaking on iOS. As I noted, the Guardian Project had only done this work on Android, but with the work we had done, there was now a new way of doing this for iOS, which was new research, a new feature in that sense. There was also the creation of non-tech resources; again, these are things that were requested by the communities that we talked to: context-specific legal information and localized sexual health fact sheets, in the languages that they spoke.

So the question is, where do we go from here in terms of harm reduction? How do we actually apply this in practice? I think one of the first things is to recognize that security, privacy, and safety are not binary. There's no such thing as perfect security or perfect safety or perfect privacy;
there's a spectrum, and people will do their own analysis around where they sit on it.

Focus on understanding and humanizing the users. So often users are stigmatized, and we don't think of them as humans. I know that this is part of the role of designers, but not everyone has designers, and even when we have designers, they're not necessarily doing the best job on the humanization part, in part because they're building personas and other developmental models that might exclude other users. So there's already a tension within design, and design itself needs to be part of this conversation.

With that: there are no edge cases. I've been in enough conversations with security and privacy folks where I've heard people talk about the most at-risk or marginalized users as "edge cases," and if we're looking for dehumanizing language, that is a perfect example of it.

We need to move away from stigmatizing user activities. I'm not going to ask anyone to raise their hands, but I would not be surprised if there are people in this room who have thought or said things like, "A user clicked on that phishing link? It was so obvious. They're so stupid. Why did they do that?"
And that's an example of stigmatizing user activities, as opposed to trying to understand users and build with them.

This is the part that Alyssa mentioned as well: incorporating user safety, not just system risks, into threat modeling and system design, those traditional security practices. But of course this brings up the question of who owns user-centric security, privacy, and safety within a company, within the structures that we have. Because of where security and privacy teams are usually placed, they're often under things like compliance, which is going to shift the approach. They might be based within organization or company operations, which means they're not thinking about the end user of the app or the platform. And even if they're based within, say, the product team, as in, the product team is the one building this, they might not have the power or ability to implement some of these things, because the security team might need to be the one who signs off on everything. So there are already tensions even within where security and privacy can be placed. The design teams may have a role; the product or app teams could as well. The product team was actually who we were talking to at Grindr; they were the owners internally.

I want to leave you with the question of what it would look like to bring harm reduction to your work. I think this is an important question, in part because I've talked so much about these more social scenarios, but I also strongly believe that harm reduction as an approach can be brought into the work that people do within corporate environments. I would highly encourage people in enterprise or corporate environments to think about ways that they can bring in and incorporate
harm reduction, and what that implies, into the security work that they do. So with that, thank you, and we'll go to questions.