to introduce you today to Eva and Chris. Eva is a senior researcher at Privacy International. She works on gender, economic, and social rights, and how they interplay with the right to privacy, especially in marginalized communities. Chris is the technology lead at Privacy International, and his day-to-day job is to expose companies and how they profit from individuals. And specifically today, they will tell us how these companies can even profit from your menstruation. Thank you. Thank you. Hi, everyone. It's nice to be back at CCC. I was at CCC last year. If you were at my talk from last year, this talk is going to be a slight, vague part two. And if you weren't, I'm just going to give you a very, very brief recap, because there is a relationship between the two. So yeah, I'll give a little bit of background about how this project started. Then we're going to talk a little bit about menstruation apps and what a menstruation app actually is. Then we're going to walk a little bit through some of the data that these apps are collecting. We're going to talk through how we did our research, our research methodology, and then what our findings are and our conclusions. So last year, a colleague and I did a project on how Facebook collects data about users on Android devices using the Android Facebook SDK, whether you have a Facebook account or not. For that project, we really looked at what happens when you first open apps, without much interaction with them, particularly the automatic sending of data in a post-GDPR context. And so we looked at a load of apps for that project, including a couple of period trackers. And that kind of led onto this project because, as I say, we looked at loads of apps across disparate categories.
So we thought we'd hone in a little bit on period trackers to see what kind of data they collect, because it's far more sensitive than that of many of the other apps, although, well, you might consider your music history to be very sensitive too. So just as a quick update on the previous work from last year: we actually followed up with all of the companies from that report. And by the end of going through multiple rounds of right of response, over 60% of them had changed practices, either by disabling the Facebook SDK in their app, or disabling it until you gave consent, or removing it entirely. So I'm going to pass over to Eva Blum-Dumontet, who's going to talk through the menstruation apps. So I just want to make sure that we're all on the same page, although if you didn't know what a menstruation app is and you still bothered coming to this talk, I'm extremely grateful. So how many of you are using a menstruation app, or have a partner who's been using one? Oh, my god. OK, I didn't expect that. I thought it was going to be much less. Well, for the few of you who might not know, I'm still going to go quickly through what a menstruation app is. The idea of a menstruation app, we also call them period trackers, is to have an app that tracks your menstrual cycle. So they tell you which days you're most fertile, and you can plan accordingly, if you're using them to try and get pregnant, or if you have, for example, painful periods. So those are essentially the two main reasons users would be looking into using a menstruation app: pregnancy and period tracking. Now, how did this research start? As Chris said, there was this whole research that had been done by Privacy International last year on various apps. And as Chris also already said, what I was particularly interested in was the kind of data that menstruation apps are collecting.
Because as we'll explain in this talk, it's really not just limited to your menstrual cycle. And so I was interested in seeing what actually happens to the data when it is being shared. I should say we're really standing on the shoulders of giants when it comes to this research. There was previously existing research on menstruation apps done by a partner organization, Coding Rights, in Brazil. They had done research on the kind of data that was collected by menstruation apps and the granularity of this data. And a very interesting thing they were looking at was the gender normativity of those apps. Chris and I have been looking at dozens of those apps, and they have various data sharing practices, as we'll explain in this talk. But one thing that all of them have in common is that they are all pink. And the other thing is that they talk to their users as women. They don't even contemplate the fact that maybe not all their users are women. So there is a very narrow perspective of pregnancy, of female bodies, and of how female sexuality functions. Now, as I was saying, when you're using a menstruation app, it's not just your menstrual cycle that you're entering. These are some of the questions that menstruation apps ask. So sex, there's a lot about sex that they want to know. How often? Is it protected or unprotected? Are you smoking? Are you drinking? Are you partying? How often? We even had one app that was asking about masturbation, your sleeping patterns, your coffee drinking habits. One thing that's really interesting, and we'll talk a little bit more about this later, is that there is a very strong data protection law in Europe called GDPR, as most of you will know. And it says that only data that's really necessary should be collected. So I'm still unclear what masturbation has to do with tracking your menstrual cycle. The other thing that was collected is about your health.
And the reason health is so important is also related to data protection laws, because when you're collecting health data, you need to show that you're taking extra steps to collect this data, because it's considered sensitive personal data. So extra steps in terms of getting explicit consent from the users, but also extra steps on behalf of the data controller, in terms of showing that they're ensuring the security of this data. So this is the type of question that was asked. There's so much asked about vaginal discharge, and the kind of vaginal discharge you get, with all sorts of weird adjectives for this: sticky, creamy. So yeah, they clearly thought a lot about this. And there's a lot about mood as well. Yeah, I didn't know romantic was a mood, but apparently it is. And what's interesting about mood, obviously, in a context where we've seen stories like Cambridge Analytica, for example, is that we know how much companies, how much political parties, are trying to understand how we think, how we feel. So it's actually quite significant that you have an app collecting information about how we feel on a daily basis. And obviously, when people enter all this data, their expectation at that point is that the data stays between them and the app. And actually, there is very little in the privacy policies that would suggest otherwise. So this is the moment where I should say: we're not making this up. Literally everything in this list of questions were literal terms that they were asking. So we set out to look at the most popular menstruation apps. Do you want it? Here you go. I forgot to introduce myself as well. Really, that's a terrible speaking habit. Christopher Weatherhead. Yep, Privacy International's technology lead. So yeah, as I said about my previous research, we have actually looked at most of the very popular menstruation apps, the ones that have hundreds of thousands of downloads.
And these apps, as we were saying, this kind of work has been done on them before, and a lot of these apps have come in for quite a lot of criticism. I'll spare the free advertising about which ones particularly. But most of them don't do anything particularly outrageous, at least between the app and the developer's servers. A lot of them don't share with third parties at that stage, which is where you can look between the app and the server to see whether they're sharing. They might be sharing data from the developer's server to Facebook or to other places, but at least you can't see in between. But we're an international organization and we work around the globe. And most of the apps that get the most downloads are particularly Western, US, European, but they're not necessarily the most popular apps in a lot of contexts like India and the Philippines and Latin America. So we thought we'd have a look and see those apps. They're all available in Europe, but they're not necessarily the most popular in Europe. And this is where things start getting interesting. So what exactly did we do? Well, we started off by triaging through a large number of period trackers. And as Eva said earlier, every logo must be pink. We were just looking through to see how many trackers each one had. This is using Exodus Privacy; we have our own instance at PI. And we just looked through to see how many trackers there were and who the trackers were. So for example, this is Maya, which is exceptionally popular, predominantly in India. It's made by an Indian company. And as you can see, it's got a large number of bundled trackers in it: CleverTap, Facebook, Flurry, Google, and InMobi. So we went through this process, and this allowed us to cut down, because there's hundreds of period trackers. Not all of them are necessarily bad, but it's nice to try and see which ones had the most trackers and where they were used, and to just triage them a little bit.
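The triage step described here, ranking apps by how many trackers are bundled in, can be sketched as a small script. The report shape below is an illustrative assumption loosely modelled on what a tracker-scanning service like Exodus Privacy reports (app handle, name, list of detected trackers); it is not the documented API schema, and the package handle is a placeholder.

```python
import json

# Hypothetical shape of a single tracker report, loosely modelled on
# what an Exodus Privacy style scan yields for one APK. Field names
# and the package handle are illustrative assumptions, not the real
# API schema.
SAMPLE_REPORT = json.loads("""
{
  "handle": "example.period.tracker",
  "app_name": "Maya",
  "trackers": ["CleverTap", "Facebook", "Flurry", "Google", "InMobi"]
}
""")

def triage(reports, threshold=3):
    """Return (app_name, tracker_count) pairs for apps whose number
    of bundled trackers meets the threshold, sorted worst-first."""
    flagged = [
        (r["app_name"], len(r["trackers"]))
        for r in reports
        if len(r["trackers"]) >= threshold
    ]
    return sorted(flagged, key=lambda pair: -pair[1])

print(triage([SAMPLE_REPORT]))  # [('Maya', 5)]
```

With hundreds of period trackers to consider, a cut like this narrows the list to the apps worth putting through manual traffic interception.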
From this, we then ran them through PI's interception environment, which is a VM that I made. I actually made it last year for the talk I gave then, and I said I'd release it after the talk, and it took me like three months to release it. But it's now available; you can go onto PI's website and download it. It's a man-in-the-middle proxy with a few settings, mainly for looking at iOS and Android apps, to do data interception between them. So we ran the apps through that, and we got to have a look at all the data being sent to and from both the app developer and third parties. And here's what we found. Out of the six apps we looked at, five shared data with Facebook. And out of those five, three pinged Facebook to let them know when their users were downloading the app and opening the app. That's already quite significant information, and we'll get to that later. Now, what's actually interesting, and the focus of our report, was the two apps that shared every single piece of information their users entered with Facebook and other third parties. So just to brief you, the two apps we focused on are both called Maya. So that's not very helpful. One is spelled M-A-Y-A. The other one is spelled M-I-A. So yeah, just bear with me, because this is actually quite confusing. Initially we'll focus on Maya, M-A-Y-A, which, as Chris mentioned, is an app that's based in India. They have a user base of several million, mostly in India, and they're also quite popular in the Philippines. So what's interesting with Maya is that they start sharing data with Facebook before you even get to agree to their privacy policy. And I should say already, about the privacy policies of a lot of those apps that we looked at: they're literally the definition of small print. It's very hard to read. It's legalese.
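To give a flavour of what you see in an interception environment like this: the Facebook SDK sends app events to graph.facebook.com as form-encoded POST requests carrying a JSON `custom_events` field. The body below is a simplified mock of such a request for illustration, with an invented advertiser ID; the exact field layout of real requests varies and should be read off your own captures.

```python
import json
from urllib.parse import parse_qs

# Simplified mock of a form-encoded POST body of the kind the Facebook
# SDK sends to graph.facebook.com, as seen through a man-in-the-middle
# proxy. The advertiser_id value and exact layout are assumptions for
# illustration only.
captured_body = (
    "event=CUSTOM_APP_EVENTS"
    "&advertiser_id=aaaa-bbbb-cccc"
    "&custom_events=%5B%7B%22_eventName%22%3A%22Symptom%22%2C"
    "%22value%22%3A%22acne%22%7D%5D"
)

def decode_app_events(body):
    """Unpack the custom_events JSON from a captured request body,
    returning (event_name, value) pairs."""
    fields = parse_qs(body)
    events = json.loads(fields["custom_events"][0])
    return [(e["_eventName"], e.get("value")) for e in events]

print(decode_app_events(captured_body))  # [('Symptom', 'acne')]
```

Decoding the percent-encoded payload is how you get from raw proxy logs to the readable "symptom: acne" style findings shown on the slides.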
It really puts into perspective the whole question of consent in GDPR, because GDPR says that consent must be informed. So you must be able to understand what you're consenting to. When you're reading those extremely long, extremely opaque privacy policies of, well, literally all the menstruation apps we've looked at, excluding one that didn't even bother publishing a privacy policy, actually: it's opaque, it's very hard to understand, and they absolutely, definitely do not say that they're sharing information with Facebook. So as I said, data sharing happens before you get to agree to their privacy policy. The other thing that's worth remembering is that when they share information with Facebook, it doesn't matter if you have a Facebook account or not, the information is still being relayed. The other interesting thing that you'll notice in several of the slides is that the information being shared is tied to your identity through your unique identifiers, and also your email address. Basically, most of the questions we got when we released the research were like: if I use a fake email address, or if I use a fake name, is that okay? Well, it's not, because even then, through your unique identifier, they would definitely be able to trace you back. So there's little way to actually anonymize this process, unless you're deliberately trying to trick it and use a separate phone; basically, for regular users, it's quite difficult. So this is what it looks like when you enter the data. As I said, I didn't lie to you, these are the kinds of questions they're asking you. And this is what it looks like when it's being shared with Facebook. So you see the symptoms being entered, for example, blood pressure, swelling, acne. This is all being shared through graph.facebook.com, through the Facebook SDK. This is what it looks like when they shared your contraceptive practice. So again, we're talking health data here.
We're talking sensitive data. We're talking about data that should normally require extra steps in terms of how it's collected and how it's processed. But nope, in this case, it was shared exactly like the rest. This is what it looks like. Now, for sex life, it was a little bit different. So this is what it looks like when they ask you: you just had sex, was it protected, was it unprotected? The way it was shared with Facebook was a little bit more cryptic, so to speak. If you had protected sex, it was entered as Love 2; unprotected sex was entered as Love 3. I managed to figure that out pretty quickly, so it's not so cryptic. So yeah, that's also quite funny. Maya had a diary section where they encourage people to enter their notes and their personal thoughts. And I mean, it's a menstruation app, so you can sort of get the idea of what people are going to be writing down in there, or are expected to write down. It's not going to be their shopping list, although shopping lists could also be personal, sensitive personal information. So we were wondering what would happen if we were to write in this diary, and how this data would be processed. So literally, we entered something very sensitive here. This is what we wrote. And literally everything we wrote was shared with Facebook. Maya also shared your health data not just with Facebook, but with a company called CleverTap that's based in California. So what's CleverTap? CleverTap is a data broker, basically. It's a company that works similarly to Facebook with the Facebook SDK. They expect app developers to hand over the data, and in exchange, app developers get insights about how people use the app, at what time of the day, the age of their users. They get information and analytics out of the data that they share with this company. It took us some time to figure it out, because it was shared as, a wicked wizard? WizRocket. WizRocket, yeah.
Yeah, but it's exactly the same. Everything that was shared with Facebook was also shared with CleverTap. Again, with the email address that we were using, everything was shared. Now let's look at the other Maya: Mia. It's not just the name that's similar, it's also the data sharing practices. Mia is based in Cyprus, so in the European Union. I should say, in all cases, regardless of where the company is based, the moment they market the product in the European Union, so, i.e., literally every app we looked at, they need to, well, they should respect GDPR, the European data protection law. Now, the first thing that Mia asks when you're starting the app, and again, I'll get to the significance of this later, is why you're using the app. Are you using it to try and get pregnant, or are you just using it to try to track your periods? Now, it's interesting, because it doesn't change at all the way you interact with the app. Eventually the app stays exactly the same, but this is actually the most important kind of data. This is literally the gem of data collection: trying to know whether a woman is trying to get pregnant or not. So the reason this is the first question they ask is, well, my guess is they want to make sure that even if you don't actually use the app, that's at least that much information they can collect about you. And this information was shared immediately with Facebook and with AppsFlyer. AppsFlyer is very similar to CleverTap in the way it works. It's also a company that collects data from those apps and offers services in terms of analytics and insights into user behavior. It's based in Israel. So this is what it looks like when you enter the information: so yeah, masturbation, the pill, what kind of pill you're taking, your lifestyle habits. Now, where it's slightly different is that the information doesn't immediately get shared with Facebook, but based on the information you enter, you get articles that are tailored for you.
So for example, when you select masturbation, you will get "Masturbation: what you want to know but are ashamed to ask". Now, what's eventually shared with Facebook is the kind of article that's being offered to you. So basically, the information is shared indirectly, because Facebook can tell you've just entered masturbation, since you're getting an article about masturbation. This is what happened when you enter alcohol: "expected effects of alcohol on a woman's body". This is what happened when you enter unprotected sex. So effectively, all the information is still shared, just indirectly, through the articles you're getting. And yeah, the last thing I should say about the articles you're getting is that sometimes they were also kind of cross-referencing the data. So an article would be about, oh, you have cramps outside of your period, for example, during your fertile phase, and you'll get an article specifically for this. So the information that's shared with Facebook and with AppsFlyer is that this person is in the fertile phase of their cycle and having cramps. Now, why are menstruation apps obsessed with finding out whether you're trying to get pregnant? This goes back to a lot of the things I mentioned before, about wanting to know in the very first place if you're trying to get pregnant or not. And this is probably why a lot of those apps are trying to really nail down, in their language, in their discourse, what you're using the app for. When a person is pregnant, their purchasing habits, their consumer habits, change. Obviously, you buy not only for yourself, but you start buying for others as well. But also, you're buying new things you've never purchased before. So while for a regular person it would be quite difficult to change their purchasing habits, for a person that's pregnant it's different.
Advertisers will be really keen to target them, because this is a point in their life where their habits change, and where they can be more easily influenced one way or another. So in other words, it's peak advertising time. In figures: there's research done in 2014 in the US that tried to evaluate the value of a person's data. The data of an average American person who's not pregnant was worth 10 cents. A person who's pregnant would be worth $1.50. So you may have noticed we're using the past tense when we talk about the data sharing of these apps, well, I hope I did when I was speaking, definitely in the slides at least. That's because both Maya and Mia, which were the two apps we were really targeting in this report, stopped using the Facebook SDK when we wrote to them about our research, before we published it. So it was quite nice, because they didn't even wait for us to actually publish the report. It was merely at the stage of: hey, this is our right of response, we're going to be publishing this, do you have anything to say about it? And essentially what they had to say was: yep, sorry, apologies, we're stopping this. I think what's really interesting for me about how quick the response was is that it really shows this is not a vital service for them. This is a plus, something that's a useful tool. But the fact that they could immediately just stop using it really shows that it was, I wouldn't say a lazy practice, but it's a case of: as long as no one's complaining, they're going to carry on using it. And I think that was also the case with your research; a lot of companies changed their behavior afterwards. A lot of the developers sometimes don't even realize what data their app is sharing with people like Facebook, with people like CleverTap, or whoever.
They just integrate the SDK and hope for the best. We also got this interesting response from AppsFlyer, which is very hypocritical. Essentially what they were saying is: oh, we specifically ask our customers not to share health data with us. Specifically for the reason I mentioned earlier, which is that, because of GDPR, you're normally expected to take extra steps when you process sensitive health data. So their response is that they ask their customers not to share health data or sensitive personal data with them, so that they don't become liable in terms of the law. So they were like, oh, we're sorry, this is a breach of our contract. Now, the reason it's very hypocritical is that obviously they have contracts with menstruation apps, and Maya was actually not the only menstruation app they were working with. I mean, what can you generally expect in terms of the kind of data you're going to receive? So here's a conclusion for us: this research works. It's fun, it's easy to do. Chris has now published the environment. Once the environment is set up, it doesn't actually require a technical background, as you saw from the slides. It's pretty straightforward to actually understand how the data is being shared. So you should do it too. But more broadly, we think it's really important to do more research, not just at this stage of the process, but generally about the security and the data sharing practices of apps, because more and more people are interacting with technology and using the internet. So we need to think much more carefully about the security implications of the apps we use. And obviously, it works. Thank you. Thank you. So yeah, please line up in front of the microphones. We can start with microphone two. Hi, thank you.
So you mentioned that we can check whether data is being shared with third parties on the path between the user and the developer, but we cannot know whether it's being shared later, from the company to other companies. Have you thought of, have you conceptualized, some ways of testing that? Is it possible? Yeah, so you could do a data subject access request under the GDPR. The problem is it's quite hard to know how the data is processed outside of the app-to-server relationship. It's quite opaque. They might apply a different identifier to it. They might do other manipulation to that data. So trying to track down and prove that a bit of data belongs to you is quite challenging. This is something we're going to be doing in 2020, actually. We're going to be doing data subject access requests for those apps that we've been looking at, to see if we find anything, both under GDPR but also under different data protection laws in different countries, to see basically what we get, how much we can obtain from that. So I'll go to the Signal Angel. What advice can you give us on how we can make people understand that, from a privacy perspective, it's better to use pen and paper instead of entering sensitive data into any of these apps? I definitely wouldn't advise that. I wouldn't advise pen and paper. I think for us, really, the key is that the work we're doing is not actually targeting users, it's targeting companies. We think it's companies that really need to do better. We're often asked for advice to customers, or advice to users and consumers. But what I think, and what we've been telling companies as well, is that their users trust them, and they have the right to trust them. They also have the right to expect that companies are respecting the law. The European Union has a very ambitious legislation when it comes to privacy with GDPR.
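The data subject access requests mentioned here rest on the GDPR right of access (Article 15). For anyone planning to send such requests to many apps at once, a simple letter generator helps keep them consistent; the wording below is a plain-language sketch of such a request, not a legal template, and the company and email address are placeholders.

```python
def dsar_letter(company, name, identifiers):
    """Draft a GDPR Article 15 subject access request. `identifiers`
    maps a label (e.g. 'email used in app') to the value the company
    can use to locate your records. Plain-language sketch, not legal
    advice."""
    id_lines = "\n".join(f"  - {label}: {value}"
                         for label, value in identifiers.items())
    return (
        f"To the data protection officer of {company},\n\n"
        "Under Article 15 GDPR, I request access to all personal data "
        "you process about me, the purposes of the processing, the "
        "recipients or categories of recipients it has been disclosed "
        "to, and the source of any data not collected from me directly.\n\n"
        "Data that may identify my records:\n"
        f"{id_lines}\n\n"
        f"Regards,\n{name}\n"
    )

print(dsar_letter("ExampleApp Ltd", "A. User",
                  {"email used in app": "me@example.org"}))
```

As the speakers note, the hard part is not sending the request but matching what comes back to your own records, since the company may key the data to a different identifier than the one you know.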
And so the least they can expect is that you're respecting the law. And so, no, this is the thing: I think people have the right to use those apps. They have the right to say, well, this is a useful service for me. It's really companies that need to up their game, that need to live up to the expectations of their consumers. Not the other way around. My microphone is on. Hi, so from the talk it seems, and I think that's what you did, you mostly focused on Android-based apps. Can you maybe comment on what the situation is with iOS? Is there any technical difficulty, or is anything completely different with respect to these apps, and apps in general? There's not really a technical difficulty; the setups are a little bit different, but functionally you can look at the same kind of data. The focus here, though, is twofold in some respects. Most of the places where these apps are used are heavily Android-dominated territories, places like India and the Philippines. iOS penetration there, Apple device penetration, is very low. There's no technical reason not to look at Apple devices, but in this particular context, it's not necessarily hugely relevant. Does that answer your question? And technically, with your setup, you could also do the same analysis with an iOS device? Yeah, let's say there's a little bit of a change in how you have to register the device, as an MDM device, like having a mobile profile, but otherwise you can do exactly the same level of interception. Hi, my question is actually related to the last question. It's a little bit technical. I'm also doing some research on apps, and I've noticed with the newest versions of Android that they're making it more difficult to install custom certificates to pass traffic through and check what the apps are actually communicating to their home servers. Have you found a way to make this easier? Yes, well, we actually hit the same issue. You're right, in some respects.
So the installing of custom certificates was not really an obstacle, because if it's a rooted device, you can add them to the system store, and then they are trusted by all the apps on the device. The problem we're now hitting is that Android 9 and 10 have TLS 1.3. And TLS 1.3 detects that there's a man in the middle, or at least it tries to, and might terminate the connection. This is a bit of a problem. So currently, all our research is still running on Android 8.1 devices. This isn't going to be sustainable long term, though. Four? Hi, thank you for the great talk. Your research is obviously targeted in a constructive, critical way towards companies that are making apps around menstruation. Did you learn anything from this context that you would want to pass on to people who research this area more generally? I'm thinking, for example, of Fatima and Corb in the US, who've done microdosing research on LSD and are starting a breakout study on menstrual issues. Well, I think, and this is why I concluded on this, there's still a lot of research that needs to be done on data sharing. And obviously, I think anything that touches on people's health is a key priority, because it's something people relate very strongly to. The consequences, especially in the US, for example, of sharing health data like this, of having data, even like your blood pressure and so on: what are the consequences if that information is going to be shared, for example, with insurance companies and so on? So this is why I think it's absolutely essential to have a better understanding of the data collection and sharing practices of these services the moment health data is involved. Yeah, because we often focus on this being an advertising issue, but in that sense as well, insurance and even credit referencing and all sorts of other things become problematic, especially when it comes to anything pregnancy related.
Yeah, even employers could be after this kind of information. Six. Hi, I'm wondering if there is an easy way or a tool which we can use to detect if apps are using our data or reporting it to Facebook or whatever, or if we can even use those apps but block this data from being reported to Facebook. Yes, so you can firewall off graph.facebook.com and stop sending data. There's a few issues here. Firstly, this audience can do this, but most users don't have the technical nuance to know what needs to be blocked and what doesn't necessarily need to be blocked. It's on the companies to be careful with users' data. It shouldn't be on the user to defend against malicious data sharing. And also, one interesting thing was that Facebook had put something in place where you could opt out from data sharing with the apps you're using. But that only works if you're a Facebook user. And as I said, this data is being collected whether you're a user or not. So in a sense, people who are not Facebook users couldn't opt out of this. In the Facebook SDK that developers are integrating, the default state for sharing of data is on; the flag is true. And although they have a long legal text on the help pages for their developer tools, unless you have a decent understanding of local data protection law, it's not something that most developers are going to be able to understand, why this flag should be set to something different from on. There are loads of flags in the SDK, and which flags should be on and off depends on which jurisdiction you're selling to, or where your users are going to be. Signal Angel again. Do you know any good apps which don't share data and are privacy friendly, probably even one that is open source? So, I mean, the problem, which is why I wouldn't want to vouch for any app...
Is that even with the apps where, in terms of the traffic analysis we've done, we didn't see any data sharing, as Chris was explaining, the data can be shared at a later stage, and it would be impossible for us to really find out. So, no, I can't vouch for any app. I don't know if you have anything? Yeah, the problem is that we can only ever look at one specific moment in time to see whether data is being shared, and what was good today might be bad tomorrow, what was bad yesterday might be good today. Although, I was in Argentina recently, speaking to a group of feminist activists, and they've been developing a menstruation tracking app. And their app was removed from the Google Play Store because it had illustrations that were deemed pornographic, but they were illustrations of medical-related things. So even people who are trying to do the right thing, going through the open source channels, are still fighting a completely different issue when it comes to menstruation tracking. It's a very fine line. Three. Sorry, we can't hear. The mic's not working. Microphone three. First, yes. Thanks for the great talk. Oh yeah, great, perfect. I was wondering if the Graph API endpoint was actually in place to track menstruation data, or is it more like a general-purpose advertisement tracking thing? So my understanding is that there's two broad kinds of data that Facebook gets. There's automated app events that Facebook is aware of: app open, app close, app install, relinking. Relinking is quite an important one for Facebook; that's where it checks to see whether you already have a Facebook account logged in, to link the app to your Facebook account. There's also a load of custom events that the app developers can put in, which is then collated back into a data set on the other side, I would imagine.
So when it comes to things like nausea or some of the other health issues, that's actually being cross-referenced by the developer. Does that answer your question? Yes, thank you. Microphone five. Can you repeat what you said in the beginning about the menstruation apps used in Europe, especially Clue and Period Tracker? Yeah, so those are the most popular apps, actually across the world, not just in Europe and the US. A lot of them have now cleaned up their apps, so in terms of the traffic analysis stage, we can't see any data sharing happening. But as I said, I can't vouch for them and say those are safe and fine to use, because we don't know what's actually happening to the data once it's been collected by the app. What we can say is that, as far as the research we've done goes, we didn't see any data being shared. The apps you mention have been investigated by the Wall Street Journal and the New York Times relatively recently, so they've had quite a spotlight on them and have had to really up their game in a lot of ways, which is what we'd like everyone to do. But as Eva says, we don't know what else they might be doing with that data on their side, not necessarily between the phone and the server, but from their server to another server. Microphone one. Hi, thank you for the insightful talk. I have a question that goes in a similar direction. Do you know whether these apps, even if they adhere to GDPR rules, collect the data to then, at a later point, sell it to the highest bidder? Because a lot of them are free to use, and I wonder what their main goal is. Well, advertisement is how they make profit. And so the whole question about them trying to know whether you're pregnant or not is so that this information can eventually be monetized through targeted advertisement.
When you're actually using those apps, you can see on some of the screens that you're constantly being flashed with all sorts of advertisements in the app. Whether they're selling the data externally or not, I can't tell. But what I can tell is that their business model is advertisement, so they are deriving profit from the data they collect, absolutely. Again, microphone one. Thank you. I was wondering if there is more of a big data aspect to it as well, because this is really interesting medical information on women's cycles in general. Yeah, and the answer is that this is a bit of a black box, especially in the way, for example, that Facebook is using this data: we don't know. We could assume this is part of the profiling that Facebook does of both their users and their non-users. But the way this data is actually processed by those apps, through data brokers and so on, is a bit of a black box. Microphone one. Yeah, thank you a lot for your talk. I have two completely different questions. The first one is: you've been focusing a lot on advertising and how this data is used to sell to advertisers. But whether you aim to be pregnant or not has to be the best kept secret, at least in Switzerland, for any female person, because if you want to get employed, your employer must not know whether or not you want to get pregnant. So I would like to ask: how likely is it that this kind of data is also potentially sold to employers who might want to poke into your health and reproductive situation? And then my other question is entirely different, because we also know that female health is one of the least researched topics around, and that's actually a huge problem.
So little is actually known about female health, and the kind of data these apps collect is actually a goldmine for doing research on health issues that are specific to certain bodies, like female bodies. So I would also like to know: how would it be possible to still gather and collect this kind of data, but use it for a beneficial purpose, to improve knowledge on these issues? Sure. So to answer your first question, the answer will be similar to the previous one I gave: it's a black box problem. It's very difficult to know exactly what's happening to this data. Obviously GDPR is there to prevent some things from happening, but as we've seen with these apps, they were toeing a very blurry line. I can't say this is happening, because I have no evidence that it is, but the risks are multiple: employers, as you say; insurance companies that could get the data; political parties that could get it and target their messages based on the information they have about your mood, or even the fact that you're trying to start a family. So there is a very broad range of risks. The advertisement we know for sure is happening, because that is the basis of their business model. To expand on that: again, as Eva said, we can't point to a specific example of any of this. But look at some of the other data brokers. Experian, for example, in the UK at least, has a statutory job of being a credit reference agency, but they also run what I believe is termed data enrichment. And one of the things employers can do is buy Experian data when hiring staff.
I can't say that this data ever ends up there, but there are people collecting data and using it for some level of vetting. And to answer your second question: I think you point out a very important problem, the question of data inequality, and whose data gets collected for what purpose. I do quite a lot of work on delivery of state services, for example. When there are populations that are isolated, that are not using technology and so on, you might just be missing out on people who are in need of healthcare, of state support and so on, just because you lack data about them. And female health is obviously a very key issue: we literally lack sufficient health data on women's health specifically. Now, in terms of how data is processed in medical research, there are actually protocols in place, normally, to ensure explicit consent and to ensure that the data is properly collected. So I wouldn't want to mix the two, because the way those apps have been collecting data, if there's one thing to take out of this talk, is that it's been nothing short of horrifying, really. The data is being collected and shared before you even get to consent to anything. I wouldn't trust any of those private companies to really be the ones taking part in medical research on this. So I agree with you that there is a need for better and more data on women's health, but I don't think any of those actors have so far proved they can be trusted with this. Microphone two. Yeah, thank you for this great talk. Short question: what do you think is the rationale for these menstruation apps to integrate the Facebook SDK, if they don't get money from Facebook and aren't able to commercialize this data? That's a good question. It could be a mix of things. Sometimes the developers literally just have it as part of their tool chain and their workflow when they're developing apps.
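Even when the SDK ships in a tool chain by default, its verbose default can be turned off. Per Facebook's developer documentation, automatic app-event logging and advertiser-ID collection are controlled by manifest flags; a sketch of disabling them until consent is obtained (the two meta-data keys are the documented ones, everything else about the app is hypothetical):

```xml
<!-- Inside <application> in AndroidManifest.xml: turn off the Facebook SDK's
     automatic app-event logging and advertiser-ID collection until the user
     has actually consented. -->
<meta-data
    android:name="com.facebook.sdk.AutoLogAppEventsEnabled"
    android:value="false" />
<meta-data
    android:name="com.facebook.sdk.AdvertiserIDCollectionEnabled"
    android:value="false" />
```

If I recall the documentation correctly, an app can then re-enable logging at runtime once consent has been given, rather than sending events from the first launch.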
I don't necessarily know about these two period trackers, or which other apps are developed by the same companies, but in our previous work, which I presented last year, you find that some companies just produce loads of apps and use the same tool chain every time, and that tool chain includes the Facebook SDK by default. Some of them include it for what I would regard as genuine purposes: they want their users to share something, or to be able to log in with Facebook. In those cases they include it for what would be regarded as a legitimate reason. But a lot of them, having integrated it, never really use anything of it beyond the app events it sends. And a lot of developers seem to be quite unaware that the default state is verbose, and of how it sends data to Facebook. Yeah, maybe we can close with one last question from me. You tested quite a bunch of apps: how many of them do certificate pinning? Do you see this as a widespread practice? Not really. I did have a problem doing one analysis where the certificate was pinned, but as I say, TLS 1.3 has proven to be more problematic than pinning. Yeah. Okay, well, thank you so much.
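As a closing aside on that last question: stripped of the TLS details, certificate pinning means the client ships a digest of the expected certificate (or its public key) and refuses any connection whose certificate hashes to something else, which is what frustrates interception-based traffic analysis. A minimal, self-contained sketch of that comparison, not tied to any real app or certificate:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.Base64;

// Minimal sketch of the check at the heart of certificate pinning:
// hash the certificate the server presented and compare it against a
// pin value compiled into the app.
public class PinCheck {
    /** Base64-encoded SHA-256 digest of a DER certificate: the "pin". */
    public static String pinFor(byte[] derCert) {
        try {
            return Base64.getEncoder().encodeToString(
                    MessageDigest.getInstance("SHA-256").digest(derCert));
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e); // SHA-256 is always available
        }
    }

    /** True only if the presented certificate hashes to the compiled-in pin. */
    public static boolean pinMatches(byte[] derCert, String pinnedBase64) {
        return pinFor(derCert).equals(pinnedBase64);
    }

    public static void main(String[] args) {
        // Placeholder bytes standing in for a DER-encoded certificate;
        // a real client would take them from the TLS handshake.
        byte[] cert = "placeholder-certificate".getBytes(StandardCharsets.UTF_8);
        String pin = pinFor(cert); // the value the app would ship with

        System.out.println(pinMatches(cert, pin));  // true: certificate matches the pin
        System.out.println(pinMatches(
                "tampered-certificate".getBytes(StandardCharsets.UTF_8), pin)); // false
    }
}
```

In practice, production clients usually pin the hash of the public key (SPKI) rather than the whole certificate, so that certificates can be rotated without an app update.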