Good morning, good afternoon, and good evening, and welcome to this session on coercive control resistant design. My name is Henry Nash from IBM, and I'm here with my colleague Lesley Nuttall; together we're going to take you through this topic. In a second Lesley is going to deep dive into the specifics, but I just want to introduce the concept, because the terminology may not be familiar to everyone. Coercive control is where one individual looks to modify the behavior of another through nefarious means in order to control them. Probably the most common example you will be aware of is domestic abuse, and I'm not talking only about the physical aspect of that, but also the mental aspect, in terms of the control and monitoring of the other individual. What you might not be familiar with, however, is that technology is playing an increasingly key role in the way that perpetrators control their victims. Whether it be the smartphones we carry, the subscription accounts we have, or the smart IoT devices in the home, all of these are being used by perpetrators to take control of their victims.

What you're going to hear is the output of some research done by IBM in conjunction with industry and academia, as well as support groups for victims of domestic abuse. We're going to lay out some principles that we, as the technology industry, the people who make this stuff, can follow to minimize the opportunity for perpetrators to misuse our technology in order to abuse. We see this as the start of a conversation; there's going to be a live Q&A at the end, and we encourage you to join us and discuss what the next steps will be. But for now, Lesley, over to you.

Thanks, Henry. As Henry said, my name is Lesley Nuttall, and I'm one of the four co-authors of the coercive control resistant design paper. It was born from an IBM initiative focused on determining how we can make technology resistant to being used as a weapon in abusive relationships.
The idea for this initiative came about when some colleagues were openly discussing, and laughing about, how they were remotely controlling the heating and Wi-Fi at home, unbeknownst to their partners who were at home at the time. Now, it's very possible that their partners also found this funny once they found out, but it seemed disrespectful. After some research, it soon became clear that this kind of manipulation of technology is prevalent in domestic abuse, particularly coercive control, where patterns of coercive and controlling behavior create such fear of another person that the victim will do anything the abuser requires. So we started looking into how we could make technology resistant to this kind of manipulation, and eventually we wrote the design principles paper.

The aim of that paper, and of this talk, is not to be prescriptive, but rather to inform the technical community about how technology is being used for coercive control, so that knowledge can be applied at the design stage, potentially making our tech a little bit safer.

Technology is increasingly becoming a part of our daily lives, and new technologies can bring immense benefits into the home: they safeguard us, ease our routines, and enhance our experiences. Unfortunately, those same technologies can also be used as a tactic of domestic abuse, and when used in that way, they make the abuser appear all-seeing, all-knowing, and all-powerful. An Australian survey of domestic violence support workers found an almost complete overlap between technology abuse and domestic abuse, with 98% saying they had clients who had experienced technology-facilitated abuse. The victims feel like they're under constant surveillance; they feel vulnerable, powerless, and like nowhere is safe.

Domestic abuse is a prevalent problem in society, with an estimated one in five people experiencing it in their lifetime. For women,
this rises to one in three. In fact, more than one in ten of all offenses recorded by the UK police are domestic abuse related.

At its heart, domestic abuse is all about control. It's a systematic, patterned, purposeful behavior on the part of the abuser to control the victim. Historically, responses to domestic abuse have been built on a violence model, equating severity with the number of beatings or injuries. However, the form of abuse that drives most victims to seek outside assistance is where they've been subjected to a pattern of domination that includes tactics to intimidate, restrict, and control their behavior. It's this pattern of domination that is coercive control, and it can be just as devastating as physical violence, often causing severe depression and post-traumatic stress disorder. In fact, experts liken coercive control to being taken hostage: it creates invisible chains and a sense of fear that pervades all elements of a victim's life.

In December 2015, England and Wales became the first nations in the world to criminalize coercive control, with the government's legislation highlighting that technology can be used by perpetrators. This is because modern technology gives abusers ever-growing ways to harm their victims using the tools of everyday life. So how is technology being used to abuse?
Well, the methods are diverse: using spyware for surveillance, mirroring devices to track usage, bombarding with abusive messages, harassment via social media, financial monitoring, and location tracking, as well as using smart home devices to observe or cause distress. Abusers gain access to personal and home devices, online accounts, and even children's toys and devices. What's particularly insidious is that applications designed with the best of intentions are being used for malicious purposes.

To give you a couple of examples: at home we have a connected doorbell, and with the app I'm able to remotely see who's at the door, or be informed of any movement outside. It's a great piece of tech, and it was designed with safety in mind. However, I could use it to monitor and entrap my family, because I receive instant notifications the moment any attempt is made to leave the home. I also have a credit card app, which provides me with purchase notifications, including the amount and vendor details. Again, a really useful piece of tech, and it was built to help combat fraud. However, my husband is a joint card holder on that account, and he is unable to access the app. That gives me all the control: I can constantly and instantly monitor all spending, both mine and his, while my husband is unable to see anything.

At this point you may be wondering how big a problem this is. Are we talking about a niche issue?
Whilst there is no central reporting for technology-facilitated abuse, the number of cases appears to be on the rise. You may have seen the various news articles talking about survivors' experiences, many of which are quite disturbing. In fact, a UK news investigation looking into technology abuse found an 1800% increase in alleged cyberstalking offenses between 2014 and 2018. Refuge, the largest UK provider of shelters for domestic abuse victims, who also run the national helpline, found that nearly three-quarters of the people seeking their assistance last year had faced abuse via technology. In the past two years, the charity has also seen a rise in cases involving abusers using IoT devices against their victims, such as smart locks, webcams, and smart heating. It is very possible that such cases are under-reported, because many victims are simply unaware of what is happening to them. In addition, the cybersecurity company Kaspersky reported that stalkerware infections grew by 40% in 2019. So, considering all this, I think we are safe to say that technology-facilitated abuse is a significant issue.

We've talked about how domestic abuse is a prevalent issue in society, but what is deeply worrying is that a recent UN report exploring the impact of COVID-19 on women highlighted a trend of increased abuse as homes are placed under strain from self-isolation and lockdown. This has become so widespread that the UN chief is calling for measures to address, and I quote, a "horrifying global surge in domestic violence". During the lockdown, the UK charity Refuge reported that 66% more people were calling the national domestic abuse helpline, and that there was a 950% increase in traffic to their website. The way that technology is used to enforce control during lockdown may have shifted focus: it's less likely that perpetrators would be tracking location or calendars, but other techniques may be utilized more, and they can be just as damaging. Some examples could be monitoring video chats, or using smart home
devices for manipulation. Victims may have no means of escape, and may have limited support during this time, so it's much easier for abusers to gain that control.

It's important to recognize that we should not build our products in a way that casts us in the role of saviour. It may seem that an action such as automatically deleting an account, blocking a profile, or disabling some functionality is the obvious solution, but this could have very real consequences for the victim, potentially leading to an escalation of abuse. Victims must be given the ability to make their own decisions. We also do not recommend the "big red button" solution, where functionality is added that calls for help on behalf of the victim so they can be rescued. Many victims would be horrified to be thought of as needing rescuing, through a mixture of manipulation, pride, and shame. Not only are there already reporting mechanisms available, but this approach also assumes that leaving is the safest or most desirable choice. If victims aren't allowed to make their own decisions, they're simply being moved from one form of control to another. It's important to find the right level of response: not ignoring, not controlling, simply enabling.

Unfortunately, there is no easy answer to ending technology-facilitated abuse. Coercive control resistant design is about the more subtle decisions that need to be made, balancing intended with unintended consequences. To aid in making these decisions, our researchers identified five key design principles. Most of these topics are probably already familiar to you, but when looked at through the lens of coercive control they can take on additional meaning.

First is diversity. Often when we're designing new technology we have target users in mind. However, they might not be the only type of users that end up using our technology, and unexpected users often utilize our systems in unexpected ways. Having a diverse design team broadens the understanding of user habits, enabling greater exploration of use cases, both the
positive and the negative.

Second, privacy and choice. We need to make it easy for users to make active, informed decisions about their privacy settings. Little red buttons, or phrases like "advanced settings", can intimidate users, causing them to just pick the default settings without necessarily understanding the consequences of that choice.

Third, combating gaslighting. Gaslighting is where a person manipulates someone psychologically into doubting their own memories and judgment. If you're able to remove all evidence of an action taking place, or if there never was any evidence to begin with, this could cause someone to start to question their memory. Permanent, timely notifications, as well as auditing, are essential for making it clear who has done what, and when.

Fourth, security and data. We have GDPR telling us to only collect and share the personal information we actually need, but that consideration could be extended to other types of data. Car apps are a good example here, whereby they share all car journeys with all users of the app, ignoring the fact that some journeys may be considered private. It's also worth noting that most perpetrators aren't hacking into applications to cause harm; they are using standard accounts with standard functionality. This type of user may not be considered as part of your standard security threat model.

Lastly, we have technical ability. Victims of coercive control live in complex, ever-shifting worlds, and may lack the energy or confidence to navigate new technologies. If all users could intuitively use and understand technology, this would help reduce the risk of abusers weaponizing their greater technical confidence, either with threats or by installing applications the victim doesn't understand.

Henry will now explore some example scenarios with you to bring context to these principles. While the examples themselves are fictional, they draw on our research to ensure that they are realistic.

The first of the principles is diversity, and as part of the open-source
community, we probably all think we have pretty good diversity in our projects. Most well-governed open-source communities are a meritocracy, and so we think we have pretty good diversity. But consider this example. Mary has left her abusive husband, taking the two children with her, and they're now in a refuge. She's made sure she's turned off all the location-sharing options in her applications, and she's spoken to the children to say, don't disclose where we are. Unfortunately, in one of the applications they use, a photo-sharing application, the husband asks for a picture of the homework one of the children is doing, and they share it. What they don't realize is that embedded in the photograph is their location. The husband extracts it, works out where they are, and arrives outside the refuge, forcing the mother and children to flee again to another environment.

Now, the creators of the app probably didn't consider children as part of their user base. When we're doing agile design thinking, we often concentrate on the key personas, or we apply the 80-20 rule to the features we want to get into the next release. We don't go on to consider potential users outside of that environment, people using the application who may inadvertently not understand what some of the features really mean. So it's important that, when we're doing our design, we think about diversity in a more extensive way than we may have done already. As well as a diverse development team and user base, what are the profiles outside of our typical target market? Because we typically can't control who's going to use the stuff we build. What are the unhappy paths?
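The location leak in Mary's story usually comes from EXIF metadata: many phone cameras embed GPS coordinates in every photo as degrees/minutes/seconds values plus a hemisphere reference, and unless the sharing app strips them, anyone who receives the file can read them. As a minimal sketch of how trivially those embedded values become a map location (the coordinates below are invented for illustration):

```python
# Sketch: how EXIF GPS metadata in a shared photo becomes a map location.
# EXIF stores latitude/longitude as degrees/minutes/seconds plus a
# hemisphere reference ("N"/"S" for latitude, "E"/"W" for longitude).

def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert EXIF-style degrees/minutes/seconds to signed decimal degrees."""
    value = degrees + minutes / 60.0 + seconds / 3600.0
    return -value if ref in ("S", "W") else value

# Hypothetical GPS tags as they might appear in a photo's EXIF block.
lat = dms_to_decimal(51, 30, 26.0, "N")
lon = dms_to_decimal(0, 7, 39.0, "W")
print(f"Photo was taken at {lat:.5f}, {lon:.5f}")
# A perpetrator only needs to paste these numbers into any map service.
```

A coercive-control-aware photo-sharing service might therefore strip location tags server-side by default, rather than leaving removal to the user.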
Well, what are the consequences of malicious intent? Sometimes those outcomes can be incredibly extreme, even though the probability is small. Typically, if something has a small probability, we discount it; we need to reconsider that type of thinking when it comes to coercive control being applied to the applications we build, and use diverse data in the decisions we make.

Privacy and choice is another key principle. We're probably all pretty familiar with this: every time we sign up to a new social media service, we have to answer questions about what we want done with the data we share. Often it's default settings, or some kind of advanced settings where we have to choose between various attributes. But the consequences of the choices we make are often not obvious. Consider Alex here, who starts using a new social media application and receives a friend request from a female acquaintance at work, which he accepts, not realizing that the default settings for this application mean that accepting the request gets posted to his timeline, where his wife can see it later. His abusive wife takes him to task about this and punishes him because of it. He just didn't realize what the default settings actually meant. When you think about the implications of these things within coercive control, we really need to do a much better job of explaining what the settings are. In fact, we should avoid the "default settings versus advanced settings" way of trying to keep the choice simple; we need to actually explain the choices in much greater detail, and there are some points in the checklist on privacy and choice that you can think about here, about really making that clear.

There are other areas where, I think, we're in the infancy of thinking about how we should do this. Think about the shared smart devices in the home, for example the couple of well-known personal assistants. I won't
use their names, to avoid triggering them, but there's the one beginning with A and the one beginning with S. Often they are controlled by one individual, set up for the whole household. Who has the rights to the data? How are the privacy settings set for the individuals, the different speakers who talk to those devices, which carry out commands on their behalf? This is an area where much more thought needs to be given to how people grant their permission, and can rescind their permission, for these shared household devices, where typically one person, and often it is the perpetrator, controls the account for the device and so can see everything that happens on it.

Preventing gaslighting is another principle. Gaslighting is where one person looks to gain even further control of another by basically making that other person doubt their own sanity. Take the example of Rob here, who works from home most of the time, and has noticed some pretty strange things happening recently: the heating seems to go up and down without him changing it, the stereo suddenly blasts on in the middle of a call, lights go off and on seemingly at random. His partner Angus asked him to send some emails and social media posts, which he did, but then he can't find any evidence of them at all. And so his abusive partner Angus starts to really question Rob, saying: you must be imagining these things; you're clearly going mad; you say you sent this stuff, but it's not there. Look,
there's nothing on these devices. In that way, Angus gains even further control, because Rob really starts to doubt what reality is. In fact, of course, what's happening is that Angus can carry out most of these things remotely, due to the internet-connected nature of the devices around the home, and of our social media accounts and so forth.

So what we really need to do is consider how we enable a single source of truth to be maintained across these interconnected devices, so that someone can't simply erase history: what actually happened is retained and marked, so that any partner can check what really happened. There need to be notifications when things do happen, so that everyone is informed when certain changes are made. If there are local devices that can be controlled remotely, then maybe a local override is important, to make sure that people who are being abused can retain some form of control. We've solved a number of parallel problems in the open-source community and in other software areas, for instance an immutable log of administrator actions, so that administrators can't hide the fact that they did something. These are some of the techniques we need to apply to keep people safe.

By the way, if you're wondering why it's called gaslighting: the term refers to a play in the 1930s, and films in the 1940s, called Gaslight, in which an abusive husband tries to convince his wife that she is going mad when the gas lights dim and brighten, something that is in fact caused by actions he is secretly carrying out. And the term has stuck ever since.

Security and data is clearly another key principle. When applications and systems are designed, there is typically a set of
assumptions made about how users will manage their security credentials: every user will have some kind of unique name and a password, for instance, which they won't share and will keep secret. The reality, though, is that in abusive relationships those norms often don't apply. Take the example of Lynn here, who's in a relationship with Roger. Roger has insisted that she share her password with him; he says it's part of being in a close relationship, so reluctantly she does so. What she doesn't realize is that, because her browser history is stored and he has access to it, he can now try her username and password at all the other accounts and websites she's been to, and there's a good chance they'll work, since she may well have reused those credentials. And so now she's given access to a whole wide variety of her data.

Of course, there are systems already being rolled out to try and combat this kind of abuse. Two-factor authentication is one example: if it were being used on all these accounts, Roger's having the password wouldn't be sufficient. But there are problems even there. Usually the default on your smartphone is that texts, which is often how the one-time code for two-factor authentication arrives, are displayed on the lock screen, so the abuser can see and read them if they're watching the victim's phone. So again, another example of how default options fall down in these kinds of scenarios.

The checklist we have here is really for you to think about the different aspects of security and data in this kind of abusive scenario. What are the threat models that really count? Usually the threat models we think about are along the lines of: what about a hacker, someone who doesn't have the credentials and is trying to gain them? But what about an actor who has been willingly given the credentials, and may want to cause malicious harm to the account holder?
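One mitigation that speaks directly to this insider threat model is making credential use visible: even a correct password, if it arrives from an unfamiliar device, generates an alert the account owner will see. A minimal sketch in Python; the class, device names, and notification mechanism are invented for illustration, not taken from any real product:

```python
# Sketch: treating a credential-holding insider as part of the threat model.
# A correct password from a never-before-seen device still triggers a
# notification, so the owner learns when someone else uses shared credentials.

class Account:
    """Toy account model: logins from unrecognized devices raise an alert."""
    def __init__(self, owner):
        self.owner = owner
        self.known_devices = set()
        self.notifications = []  # stand-in for push/email alerts to the owner

    def login(self, password_ok, device_id):
        if not password_ok:
            return False
        if device_id not in self.known_devices:
            # Correct password, but an unfamiliar device: record it
            # and tell the account owner.
            self.known_devices.add(device_id)
            self.notifications.append(
                f"New device '{device_id}' signed in to {self.owner}'s account"
            )
        return True

acct = Account("lynn")
acct.login(True, "lynns-laptop")   # first device: one alert
acct.login(True, "lynns-laptop")   # familiar device: no new alert
acct.login(True, "rogers-phone")   # shared password, new device: alert
for note in acct.notifications:
    print(note)
```

The design choice here is that the alert goes to every channel the owner has registered, not just the device that logged in, so the person holding the shared password cannot quietly suppress it.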
That's not normally a threat model we consider, but it is the threat model we're talking about in this example. And if there is data that a user has shared into the system, what if a malicious person, an abuser, gets hold of that information? What damage can they do with it? Whether it falls under GDPR or not, significant damage can be caused, and we need to consider that as part of the actual security and data design of our systems.

Technical ability is another principle we've outlined here, and this can relate to a technical deficit between perpetrator and victim, or just to the technical complexity of some of the devices being installed in our homes. Take the example here of Olivia, who's at home looking after a newborn son. Her controlling father has installed a video doorbell on the pretext that it's good for her safety, and the reality, of course, is that he now has video access whenever she leaves the home or whenever anyone visits. She's aware that some of this is going on, but it's not clear, because it's not obvious how the device works. Is it recording all the time, or only when certain people come to the door? Is there facial recognition involved? It's not clear from the way the device is configured and set up what is actually happening, and so she feels that she really is being controlled. And in fact she is, because she is now fearful to leave the house.

Some of the challenges here: clearly, in this example, it's highly unlikely, I would suggest, that usability testing was done across a diverse user base, back to our diversity issue at the start, and the kinds of abusive-relationship opportunities that such a technology might provide certainly weren't considered. Often these devices seem simple on the surface.
It's just a doorbell on the door, but it often requires tinkering to understand how it works. There's an application to install on a smartphone, which needs credentials; who has access to those credentials? Was the target demographic really thought through when these things were designed? These things should be usable by beginners without significant technical knowledge. In fact, this whole area of technical knowledge is now being weaponized: abusers and perpetrators will often make ridiculous technical claims to their victims, like "I can hack into your systems", "I'll know wherever you are", "I can access all your data, I can delete it". Some of the stuff we're used to seeing in security spam emails is being weaponized by abusers to control their victims. So we really have to up our game in terms of how we consider the technical ability requirements for our products, so that abusers cannot misuse them in this way.

So there are the five key principles we've identified, and we would urge you to consider all of them as part of the applications you're building. Some of these are obviously very user-interface specific, but others are part of the system side, the cloud side, of these applications. Very few applications used by abusers are standalone applications these days: the social media applications and the like all have server-side componentry, often complex microservice architectures, delivering those solutions via the cloud. In those areas, the traditional aspects we're all involved with, security and data, audit and logging, all have roles to play in ensuring we can deliver these principles. Looking further into the future,
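On that server side, one building block that supports several of these principles, particularly combating gaslighting, is a tamper-evident audit log, along the lines of the immutable administrator log mentioned earlier. A minimal sketch, assuming a SHA-256 hash chain over JSON-encoded events; the event strings and account name are invented:

```python
# Sketch: a tamper-evident, append-only audit log. Each entry's hash covers
# the previous entry's hash, so silently editing or deleting history breaks
# verification, which is one building block against gaslighting.
import hashlib
import json

def entry_hash(prev_hash, event):
    """Hash an event together with the hash of the entry before it."""
    payload = json.dumps({"prev": prev_hash, "event": event}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def append(log, event):
    """Append an event, chaining it to the current tail of the log."""
    prev = log[-1]["hash"] if log else "genesis"
    log.append({"event": event, "hash": entry_hash(prev, event)})

def verify(log):
    """Recompute the chain; any edited or removed entry makes this fail."""
    prev = "genesis"
    for entry in log:
        if entry["hash"] != entry_hash(prev, entry["event"]):
            return False
        prev = entry["hash"]
    return True

log = []
append(log, "heating set to 30C by account 'angus'")
append(log, "speaker volume set to max by account 'angus'")
print(verify(log))                     # True: history is intact
log[0]["event"] = "no changes today"   # someone rewrites history...
print(verify(log))                     # False: tampering is detectable
```

In a real deployment the chain head would also need to be anchored somewhere the administrator account cannot rewrite, otherwise the whole chain can simply be regenerated.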
we're thinking of things like this: today we have AI toolkits that can look at, for example, gender bias in the data sets used for AI. Can we develop AI toolkits that can look for the potential vulnerabilities that abusers could exploit, against the principles we've outlined today? These are some of the things we'll be looking at in the future, and we welcome the conversation with all of you on how we progress this. And now back to Lesley for a final recap.

For this initiative, our focus has been on the abuse of intimate partners and their immediate family. However, coercive control also happens in other types of relationships, especially where there's a power imbalance. Some examples would be between carers and the vulnerable or elderly, within institutions, and even in the workplace. We've seen that abusers are adept at leveraging technology for their own purposes, and in our constantly changing and growing world of technology, it is projected that there could be 125 billion interconnected devices by 2030. As these devices become more prevalent, and ever more is shared online, this will provide abusers with a wider array of tools to use against their victims.

Few of us in the technical community intentionally create technology to cause harm, but it is possible that we are disconnected from the unintended effects of our creations. So we encourage you to lead the way in adopting these design principles: open up conversations and ask the questions. By building technology with an eye towards making it resistant to abuse, our creations will harness innovation while limiting unforeseen consequences, hopefully shaping lives and society for the better.

Finally, if anyone is affected by anything we've discussed today, there is support available to you, including police response, online support, helplines, refuges, and other services. We've included some organizations here who can help.

Apologies,
I think I had no sound there, so apologies. I was just saying that if there are any questions, Lesley and I would be very happy to take them, and if there's something we weren't clear about, or something we failed to cover, we'd love to hear about it.

Okay, well, if there are no questions, feel free to follow up with either Lesley or myself. We're very happy to continue this conversation about how members of the open-source community can get involved in this initiative, and we're looking to spread that as wide as possible. Lesley, anything from you before we close?

No, just one thing: these principles aren't static. We would very much like to evolve them, so feedback is more than welcome. That's it, really.

Great. Thank you very much. Enjoy the rest of the conference, and we look forward to speaking to you offline. Thank you.